Deepfake Audio Is A Real Threat: Beware Of Scam Calls With AI-Generated Voices Of Your Loved Ones Doing The Rounds
The ease with which realistic audio clones can be created poses significant challenges, particularly given the widespread use of messaging apps for audio sharing.
Ever since artificial intelligence (AI) began permeating the mainstream over the past couple of years, bad actors have steadily gained access to a new tool to scam unsuspecting citizens, luring them with false baits and coercing them into harmful actions. While deepfake photos and videos have made the rounds on social media, much to the ire of the Modi government, which is proactively campaigning to warn people against such scams, it now appears that deepfake audio is a very real threat as well.