New Delhi: Delhi Police said the person responsible for creating the deepfake video featuring actor Rashmika Mandanna was taken into custody today. The video went viral on social media in November last year, prompting widespread calls for tighter regulation of social media platforms.


In the original video, British-Indian influencer Zara Patel was seen wearing all-black as she entered an elevator. Deepfake technology was then used to replace Patel's face with Mandanna's. After the clip went viral, the Centre issued a warning to social media platforms, stressing the legal provisions that cover deepfakes and the potential penalties for their spread.


After the video went viral, Rashmika Mandanna expressed her shock at the experience, calling it "extremely scary".


"Something like this is honestly, extremely scary not only for me but also for each one of us who today is vulnerable to so much harm because of how technology is being misused," she said.


Union IT Minister Ashwini Vaishnaw said recently that notices have been sent to all social media companies instructing them to take appropriate action to identify and remove misinformation from their platforms.


"Deepfake is a big issue for all of us. We recently issued notices to all the big social media platforms, asking them to take steps to identify deepfakes, for removing those content. The social media platforms have responded. They are taking action. We have told them to be more aggressive in this work," he said. 


Deepfakes are synthetic media produced with artificial intelligence, using complex algorithms to manipulate both visual and audio content. The term gained currency in 2017, after a Reddit user created a forum for sharing manipulated videos. Since then, deepfake technology has given cybercriminals a weapon they can use to damage the reputations of individuals, businesses, or even countries.

