The North News
New Delhi, December 10
The Centre on Wednesday said it is strengthening its digital regulations to ensure that the internet remains “open, safe, trusted and accountable”, particularly for children, amid rising concerns over harmful and inappropriate content. The government added that expanding internet access has increased exposure to risks such as addiction, online abuse, deepfakes and hate speech. In response, it has introduced a series of legal and enforcement measures to make social media platforms more accountable for the content they host. The measures were outlined by Minister of State for Electronics and Information Technology Jitin Prasada in the Lok Sabha on Wednesday.
Tighter legal framework
The Information Technology Act, 2000, together with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, forms the backbone of India’s laws against unlawful online content. These rules require social media intermediaries to remove content that is obscene, harmful to children, invasive of privacy, impersonates others, promotes hate, or threatens national security.
Platforms must inform users of the consequences of sharing unlawful material and are expected to act quickly on court orders and government notices. Complaints involving privacy violations, impersonation or nudity must be resolved within 24 hours. A grievance mechanism overseen by Grievance Appellate Committees allows users to challenge moderation decisions made by platforms. Intermediaries are also required to assist law enforcement agencies by identifying offenders and providing information for investigations.
Extra duties for major platforms
Significant social media intermediaries — those with more than five million users in India — face additional obligations. They must offer voluntary user verification, publish monthly compliance reports, appoint local officers and maintain a physical address in India. Messaging services must enable law enforcement to trace the “first originator” of serious offences. Platforms are also required to deploy automated tools to detect and limit the spread of unlawful content.
Failure to comply can lead to loss of legal immunity under Section 79 of the IT Act, leaving platforms liable for prosecution.
Protecting children’s data and safety
The Digital Personal Data Protection Act, 2023, restricts the processing of children’s personal data and bans targeted advertising or behavioural monitoring of minors without verified parental consent. Other laws, including the Bharatiya Nyaya Sanhita (BNS), 2023, and the Protection of Children from Sexual Offences (POCSO) Act, tighten penalties for online obscenity, misinformation, child sexual abuse material and related offences.
Awareness and reporting mechanisms
Child-protection agencies, including the National Commission for Protection of Child Rights, have issued guidelines for cyber safety, anti-bullying measures and online awareness for parents, schools and children. NCERT has also released a handbook on safe online learning.
India’s broader cyber-response system includes:
- Grievance Appellate Committees for content moderation disputes
- Indian Cyber Crime Coordination Centre (I4C) for nationwide coordination
- SAHYOG portal for automated content-removal requests
- National Cyber Crime Reporting Portal (1930 helpline)
- CERT-In, which issues cyber security advisories
- Public awareness events such as Cyber Security Awareness Month and Safer Internet Day