69 Per Cent Of Indians Can't Tell An AI Voice From A Real One, 47 Per Cent Have Encountered AI Voice Scams
About half (47 per cent) of Indian adults have experienced an AI voice scam or know someone who has, almost double the global average of 25 per cent.
At a time when artificial intelligence (AI) chatbots such as OpenAI's ChatGPT are taking the world by storm, AI is also fueling a rise in online voice scams, with just three seconds of audio required to clone a person's voice, computer security software company McAfee said on Monday. The San Jose, California-headquartered company surveyed 7,054 people across seven countries, including India, and found that more than two-thirds (69 per cent) of Indians say they don't know or cannot tell the difference between an AI-generated voice and a real one.
The survey also found that 47 per cent of Indian adults have experienced an AI voice scam themselves or know someone who has, almost double the global average of 25 per cent. Scammers send fake voice messages that sound like friends and family members pretending to be in distress, and 70 per cent of Indian adults say the rise of deepfake technology has negatively affected their trust in social media.
Security researchers at McAfee Labs, who conducted an in-depth study of AI voice-cloning technology and its use by cybercriminals, note that everybody's voice is unique, the spoken equivalent of a biometric fingerprint, which is why hearing somebody speak is such a widely accepted way of establishing trust. However, with 86 per cent of Indian adults sharing their voice data online or in recorded notes at least once a week (via social media, voice notes and more), cloning how somebody sounds has become a powerful tool in a cybercriminal's arsenal.
With the rise in popularity and adoption of AI tools, it is easier than ever to manipulate images, videos and, perhaps most disturbingly, the voices of friends and family members. McAfee's research shows that scammers are using AI to clone a person's voice and then send a fake voicemail or voice note, or even call the victim's contacts directly, pretending to be in distress. With 69 per cent of Indian adults not confident they could tell a cloned voice from the real thing, it is no surprise that this technique is gaining momentum.
“Artificial Intelligence brings incredible opportunities, but with any technology, there is always the potential for it to be used maliciously in the wrong hands. This is what we’re seeing today with the access and ease of use of AI tools helping cybercriminals to scale their efforts in increasingly convincing ways,” Steve Grobman, CTO, McAfee, said in a statement.