The Central Government took a firm stance against the presence of child sexual abuse material (CSAM) on social media platforms on Friday, issuing notices to major intermediaries such as X, YouTube, and Telegram and directing them to remove such content from their platforms in India as soon as possible.


Rajeev Chandrasekhar, Minister of State for Electronics and Information Technology, stated that if these social media platforms do not act swiftly, their protection under Section 79 of the IT Act, which grants them "safe harbour" status, could be withdrawn. This means they could face prosecution under existing laws and regulations, even if the objectionable content was not uploaded by them, according to a statement issued by the ministry.


The notices emphasise the critical importance of promptly removing or disabling access to any CSAM on these platforms. According to a statement issued by the Ministry of Electronics and Information Technology, the notices also call for proactive measures, such as content moderation algorithms and reporting mechanisms, to prevent the spread of CSAM in the future.


The IT Act, 2000 provides the legal framework for dealing with pornographic content, including CSAM. Sections 66E, 67, 67A, and 67B of the Act, in particular, prescribe stringent penalties and fines for the online transmission of obscene or pornographic material, the statement said.


Rajeev Chandrasekhar emphasised the government's commitment to establishing a safe and trusted online environment governed by IT rules. He added, "The IT rules under the IT Act lay down strict expectations from social media intermediaries that they should not allow criminal or harmful posts on their platforms."


In April this year, India was among the leading countries that asked X (formerly known as Twitter) to remove content related to various forms of abuse, including child sexual exploitation, harassment, hacked materials, hateful conduct, impersonation, non-consensual nudity, perpetrators of violence, private information, promotion of suicide or self-harm, sensitive media, terrorism/violent extremism, and violence. France, Japan, and Germany were among the other countries that made similar requests.