Meta, YouTube found liable for ‘addiction’ in the US. Could it have implications in India?
A jury in Los Angeles has found Meta Platforms and YouTube liable in a landmark lawsuit that accused the companies of designing addictive social media products that harmed a young user's mental health. The verdict is being seen as a watershed moment in the legal battle over whether tech platforms can be held accountable for the psychological impact of their products on minors.

This could have wide ramifications globally, as it is the first in a series of test cases involving thousands of plaintiffs accusing social media companies of harming children through addictive product design. The decision could influence future litigation against technology companies and potentially reshape the debate over the limits of liability protections traditionally enjoyed by online platforms.

The ruling also comes at a time when governments across the world are increasingly scrutinising social media's impact on children. In India, policymakers have been exploring measures ranging from age-based restrictions to tighter platform obligations aimed at limiting minors' exposure to addictive features and harmful content. Australia has already set a precedent by banning social media for children under the age of 16.

The landmark LA verdict against Meta, YouTube

The case was brought by a woman identified in court as KGM, who argued that she became compulsively attached to platforms such as Instagram and YouTube from a young age. The jury concluded that the companies were negligent in the design or operation of their platforms and that this negligence was a substantial factor in causing harm. It awarded damages of roughly $6 million, with Meta bearing around 70% of the liability and YouTube the remaining 30%.

The trial centred on claims that features such as infinite scrolling, algorithmic recommendations and constant engagement loops were deliberately engineered to maximise user attention, particularly among young users.
Lawyers for the plaintiff argued that these design choices effectively "engineered addiction", worsening mental health conditions including anxiety and depression. Both companies have said they disagree with the verdict and plan to appeal, pointing to existing safety tools and parental controls on their platforms.

Meta said: "Teen mental health is profoundly complex and cannot be linked to a single app. We will continue to defend ourselves vigorously as every case is different, and we remain confident in our record of protecting teens online."

A spokesperson for Google said: "This case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site."

The Indian Express had earlier reported that the government is learnt to be considering a graded approach to regulating children's access to these platforms. For now, India has developed a framework of regulatory measures, self-regulatory codes, and educational initiatives, though critics argue that enforcement can be lax.

Under the Digital Personal Data Protection Act, 2023, companies that collect the data of children – users under the age of 18 – must obtain their parent or guardian's consent. They also cannot track or monitor a child's behaviour, or serve targeted ads directed at children.
But it is widely believed that children would be able to get around this simply by misrepresenting their age.

According to a report prepared by the think tank Indian Governance and Policy Project in November 2025, the Information Technology Act, 2000, criminalises the creation of child sexual abuse material; the POCSO Act, 2012, defines and penalises online sexual exploitation and grooming; the Bharatiya Nyaya Sanhita, 2023, extends liability to digital or online offences against children, including trafficking and harassment; and the Juvenile Justice (Care and Protection of Children) Act, 2015, addresses the online facilitation of child exploitation.

However, the report noted that "persistent weaknesses in digital forensic capacity, law-enforcement training, and the uneven functioning of Special POCSO Courts continue to limit the effective investigation and prosecution of offences".

Under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, streaming platforms such as Netflix, Disney+ Hotstar, and Apple TV need to classify the content they host into five age-based categories – U (Universal), U/A 7+, U/A 13+, U/A 16+, and A (Adult). These platforms are required to implement parental locks for content classified as U/A 13+ or higher, and reliable age verification mechanisms for content classified as "A".