Landmark lawsuit finds that social media addiction is a feature, not a bug

A Los Angeles jury has delivered a landmark verdict: Meta and YouTube were negligent in the design and operation of their platforms, causing a young woman known in court documents as Kaley, or KGM, to become addicted to social media. The tech giants must now pay her a total of US$6 million in damages – $3 million compensatory and $3 million punitive.

She claimed the platforms’ design features got her addicted to the technology and exacerbated her depression, anxiety, body dysmorphia and suicidal thoughts.

The jury found that Meta bore 70% of the responsibility and YouTube 30%, meaning Meta will pay $4.2 million and Google’s YouTube $1.8 million. Both companies have said they will appeal.

The verdict came a day after a separate New Mexico jury ordered Meta to pay US$375 million for failing to protect children from predators on Instagram and Facebook.

Kaley filed her lawsuit in 2023, when she was 17. She claimed that she began using social media as a young child and alleged that features such as infinite scroll, autoplay, algorithmically timed notifications and beauty filters were addictive.

TikTok and Snap were originally named as defendants but settled before the trial began for undisclosed sums. Meta and YouTube proceeded to a seven-week trial in Los Angeles Superior Court.

The case is the first of three bellwether trials scheduled in the California state proceedings – test cases selected to gauge how juries respond to the core legal arguments – drawn from a pool of more than 1,600 plaintiffs, including over 350 families and 250 school districts. The outcome of this first trial was always likely to have consequences far beyond one young woman’s case.

Bypassing big tech’s legal shield

The legal strategy that made this trial possible was a deliberate departure from previous attempts to sue social media companies.
Historically, platforms have been shielded by Section 230 of the 1996 Communications Decency Act, which protects internet companies from liability for content posted by their users. The plaintiff’s lawyers sidestepped this entirely by arguing that the harm arose not from what users posted, but from how the platforms were engineered – treating Instagram and YouTube as defective products rather than neutral publishers.

The jury heard internal Meta documents that proved damaging. One memo read: “If we wanna win big with teens, we must bring them in as tweens.” Another showed that 11-year-olds were four times as likely as users of competing apps to keep returning to Instagram, despite the platform’s own minimum age requirement of 13. A former Meta engineering director turned whistleblower, Arturo Béjar, testified about how features like infinite scroll exploit the brain’s reward system. Meta CEO Mark Zuckerberg himself took the stand – his first jury testimony on child safety – and was questioned about his decision to retain beauty filters despite internal research flagging their impact on young girls’ body image.

The jury rejected the companies’ central defence: that Kaley’s struggles were primarily the result of a difficult home life and pre-existing conditions rather than platform design. In finding that the companies had acted with “malice, oppression or fraud”, it opened the door to the additional punitive damages that brought the total to US$6 million.

Both companies will appeal, and the process could take years. In the meantime, a second bellwether trial is scheduled for this summer, and a separate federal case in Oakland involving school districts is also advancing. The pressure on platforms to settle the thousands of remaining cases will grow considerably.

Long-term impact?

For users, the immediate practical picture is less clear. Meta and YouTube are unlikely to make significant changes to their platforms while the appeals process plays out.
Any redesign – if it comes – is likely to be incremental and carefully managed to minimise disruption to the engagement model that drives their revenues.

But there is a harder question the verdict does not answer: will it actually change anything? Meta and YouTube are companies worth hundreds of billions of dollars. A US$6 million damages award is not going to restructure the attention- and surveillance-driven economy.

My research on digital overuse – based on in-depth interviews with digital users and studies of online communities discussing overuse and detox – shows that even people who are fully aware of the problem and genuinely want to reduce their screen time find it extraordinarily difficult to do so. This is not because they lack willpower, but because the features driving compulsive use are not bugs in the system. They are the system, built to maximise engagement and advertising revenue.

For years, big tech has placed the burden of managing screen time squarely on individuals and parents – encouraging screen time limits, digital detoxes and parental controls while continuing to engineer products specifically designed to defeat exactly that kind of self-regulation. The jury has pushed back against that logic. Whether courts, regulators and legislators will push hard enough to force genuine structural redesign remains to be seen. However, the European Commission has already made the preliminary finding that TikTok’s addictive design features breach the EU’s Digital Services Act.

What this verdict does, at minimum, is shift the ground. For the first time, a jury has confirmed what researchers have argued for years: this is not a story of weak willpower or bad parenting. It is, at least in part, a story of deliberate product design.
That matters – even if the real fight is still to come.

Quynh Hoang does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.