New Delhi: A deeply unsettling and obscene deepfake video of Alia Bhatt recently spread like wildfire on social media. She has sadly fallen victim to deepfake technology after Rashmika Mandanna, Katrina Kaif, Kajol and Sara Tendulkar. This comes after the Prime Minister warned about the dangers of using AI to create deepfakes and urged the media to educate the public about the threat posed by the misuse of AI.


In the viral video, a woman wearing a blue floral dress is seen acting inappropriately and making lewd hand gestures at the camera. However, anybody who pays even a little attention will see that the woman in the video is not Alia Bhatt: the actress's face has been morphed onto another person's body. Internet users are now raising concerns about the misuse of artificial intelligence.


At a time when people are becoming more reliant on technology, this highlights the dangers associated with its misuse and exploitation.


A deepfake video featuring Rashmika Mandanna began circulating online earlier this month. In it, Rashmika's face was morphed onto the body of British-Indian Instagram influencer Zara Patel as she entered a lift. Several celebrities, including megastar Amitabh Bachchan, spoke out against the incident and called for harsh punishment for those behind it.


On the other hand, Union Minister Rajeev Chandrasekhar said that MeitY (the Ministry of Electronics and Information Technology) would create a system through which citizens can report violations of IT regulations committed by social media platforms.


What is DeepFake Technology?


Deepfake technology digitally alters a person's appearance in a video or photograph to make them look like someone else, and is often used for malicious purposes or to spread misinformation. Because it can produce counterfeit visual and audio content, deepfake technology poses a significant risk to civil society.

