Digital addiction and the law: What tragedies such as Ghaziabad sisters’ deaths reveal about India’s regulatory blind spot


5 min read | Feb 25, 2026, 12:33 PM IST
Also written by Chanya Jaitly

The recent deaths of three young girls in Ghaziabad have revived memories of the dreaded "Blue Whale Challenge" tragedies. Though multiple factors contributed to the incident, the deaths have been linked to intense immersion in online gaming and digital fan culture.

Excessive digital engagement rarely exists in isolation. It often reflects emotional gaps: a lack of parental attention, academic pressure, social isolation, or the absence of meaningful conversation at home. When real-world validation is limited, the online world offers belonging, identity, and constant engagement. What begins as entertainment can become refuge.

The Ghaziabad incident inevitably recalls the national alarm over the Blue Whale Challenge. At the time, the government issued advisories under the Information Technology Act, 2000, directing intermediaries to remove harmful content and monitor material linked to self-harm. In Sneha Kalita v Union of India, the Supreme Court considered public interest applications and observed that any game that brainwashes children or traps them on paths leading towards their demise would violate the right to life.

Courts took cognisance, and schools were urged to increase awareness among students and parents. Beyond these reactive steps, however, India did not develop a sustained regulatory framework addressing the psychological risks of immersive digital environments.

Today, the legal position remains fragmented. Section 79 of the Information Technology Act, 2000, grants safe harbour protection to online platforms that exercise due diligence.
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, as amended in 2023, impose obligations such as grievance redressal and compliance requirements for online gaming platforms. In August 2025, Parliament enacted the Promotion and Regulation of Online Gaming Act (PROGA), 2025, banning online money games under Section 5. Yet these measures primarily target gambling and financial fraud. Immersive "social" or "role-play" platforms that do not involve monetary transactions fall outside this framework and often escape meaningful scrutiny.

This regulatory gap matters because digital platforms are not passive hosts. Their architecture is designed to maximise engagement. Algorithms track user behaviour and continuously supply personalised content to hold attention. Reward cycles and social validation mechanisms can foster patterns of dependence, particularly among adolescents whose emotional regulation is still developing.

The constitutional framework provides both limits and obligations. Digital platforms enjoy protection under Article 19(1)(a) (freedom of speech and expression) and Article 19(1)(g) (freedom to carry on business), subject to reasonable restrictions under Articles 19(2) and 19(6). At the same time, Article 21 guarantees the right to life and personal liberty, which the Supreme Court has interpreted to include dignity, mental well-being, and protection from preventable harm. When minors are affected, the state's positive obligation to safeguard life becomes especially compelling.

Internationally, regulators have begun to acknowledge that digital risks extend beyond privacy and financial harm. The European Union's General Data Protection Regulation (GDPR) provides enhanced safeguards for children's data and requires parental consent below certain ages.
The Digital Services Act obliges large platforms to assess systemic risks, including those affecting minors, and to implement mitigation measures. The United Kingdom's Age-Appropriate Design Code requires digital services to prioritise the best interests of the child in their design. Australia has introduced legislative measures to restrict or prohibit social media access for children below a specified age, framing digital exposure as a public health and child safety concern.

India's Digital Personal Data Protection Act, 2023, recognises children as a vulnerable category and mandates verifiable parental consent for processing their personal data. However, like the broader IT framework, its focus remains largely on privacy and data processing. It does not yet confront the deeper question of whether platform design itself can create behavioural dependence and psychological harm.

The Ghaziabad tragedy brings this unresolved issue into focus. Should platforms be required to implement default safeguards for minors? Should there be mandatory time limits, behavioural risk alerts, or greater algorithmic accountability when prolonged engagement is detected? These are questions policymakers, and eventually courts, will have to address.

The lesson from Ghaziabad is clear. Digital childhood must be treated as a serious legal and policy concern. The State's obligation under Article 21 cannot end with physical safety; it must extend to the environments in which young minds increasingly live and grow.

Anand is a senior advocate and former Additional Solicitor General of India. Jaitly is an advocate.