Elections Under AI Threat: How Deepfakes Are Testing India's Democratic Integrity
Deepfakes are emerging as a serious threat to elections, risking voter trust, spreading misinformation, and challenging the integrity of democratic processes in the AI era

By Dinesh Jotwani
Deepfake technology is improving rapidly, and that is a problem for democracies, especially at election time. AI can now generate audio, video, and images that look convincingly real, making it increasingly difficult for people to tell what is genuine and what is fabricated. Deepfakes can be used to discredit individuals, put words in politicians' mouths, and damage the reputations of public figures. The technology is, in short, ideally suited to deception.
In India, the electoral process is central to our democracy, and the spread of fabricated videos and audio recordings, known as deepfakes, poses a serious threat to public trust and the integrity of elections.
Integrity Of Public Trust At Risk
A single convincing deepfake can go viral and mislead large numbers of voters, eroding faith in how the country is governed. India's democracy rests on free and fair elections, and deepfakes put both electoral integrity and public trust at risk. Vigilance during election periods is therefore essential.
Recognising these risks, India’s legal framework has taken steps to address the issue. Under the Bharatiya Nyaya Sanhita, 2023, Section 318 on cheating and Section 319 on impersonation are both directly relevant to curbing the misuse of deepfake technology, while Section 356 on defamation offers recourse when reputations are harmed. Legal provisions also exist to deal with public order problems caused by the spread of false content.
During elections, the rules are stricter still. The Election Commission of India enforces the Model Code of Conduct, which requires political parties and candidates to campaign fairly. Using deepfakes to deceive voters or harm opponents can attract penalties, including disqualification from the election. These measures are designed to keep elections free, fair, and credible.
Furthermore, the Digital Personal Data Protection Act, 2023, provides robust safeguards against privacy violations. The unauthorised creation or dissemination of deepfakes involving an individual’s likeness, voice, or personal data is expressly prohibited, with significant penalties for those found in violation.
However, laws and rules are not enough. All stakeholders, including political parties and their digital teams, must use technology responsibly and refrain from spreading AI-generated disinformation. The election system is only as strong as the standards everyone follows; honesty with the public must be the baseline.
As artificial intelligence continues to improve, constant vigilance is needed to keep elections safe and fair. That requires sound rules, effective enforcement, politicians who act with integrity, and citizens who are well informed. India's democratic institutions matter deeply, and if we work together, we can ensure that new technology strengthens them rather than undermines them.
(The author is the Co-Managing Partner at Jotwani Associates)
Disclaimer: The opinions, beliefs, and views expressed by the various authors and forum participants on this website are personal and do not reflect the opinions, beliefs, and views of ABP Network Pvt. Ltd.
Frequently Asked Questions
What is the main concern regarding deepfake technology in democracies?
Deepfake technology can create realistic fake audio, video, and images, making it hard to distinguish truth from falsehood. This poses a significant threat to democratic processes, especially during elections.
How does deepfake technology threaten India's electoral process?
Deepfakes can spread misinformation, confuse voters, and erode public trust in the electoral system. This undermines the integrity of elections, which is crucial for India's democracy.
What legal measures has India implemented to combat deepfakes?
India's legal framework, including the Bharatiya Nyaya Sanhita, 2023, addresses deepfakes through sections on cheating, impersonation, and defamation. The Digital Personal Data Protection Act, 2023, also prohibits unauthorized use of personal likeness and voice.
What role does the Election Commission of India play in managing deepfakes during elections?
The Election Commission of India enforces the Model Code of Conduct, ensuring political parties and candidates use technology responsibly. Violations involving deepfakes can lead to disqualification from elections.
Beyond legal measures, what is needed to address the deepfake issue?
All stakeholders, including political parties and their digital teams, must act responsibly and avoid spreading AI-generated content. Collective effort is needed to ensure technology aids, rather than harms, democracy.