A 20-year-old woman is suing Meta and Google, claiming their platforms deliberately addicted her starting at age nine, leading to diagnosed body dysmorphia, anxiety, and depression
TikTok and Snapchat settled with the plaintiff hours before trial began, with confidential terms, rather than face jury scrutiny of their product design decisions
Thousands of similar lawsuits have been filed across the United States; with Meta's market capitalisation around £1.2 trillion, even a small shift in liability assumptions could translate to tens of billions in potential exposure
Mark Zuckerberg spent seven hours facing lawyers' questions before a jury last week, the first time the billionaire has appeared in such proceedings
A 20-year-old woman sat in a Los Angeles courtroom this week and told a jury that Instagram consumed her childhood from the moment she woke until she went to sleep. She began using YouTube at six, Instagram at nine, and encountered no barriers to prevent her access. The case represents the first time a jury will decide whether Silicon Valley bears legal responsibility for the mental health crisis afflicting a generation of children raised on social feeds.
What makes this trial particularly significant isn't just the individual tragedy at its centre. Two other defendants, TikTok and Snapchat, settled with the plaintiff hours before proceedings began, with terms kept confidential. That those companies chose to pay an undisclosed sum rather than face a jury suggests something deeper: a recognition that their product design decisions might not withstand public scrutiny, and that systematic vulnerabilities exist across the sector.
The testimony from the plaintiff, identified as Kaley to protect her privacy, painted a stark picture of childhood years dominated by algorithmic feeds. She described Instagram as "the first thing" she engaged with each morning, continuing "all day" until bedtime. YouTube's autoplay feature, she noted, kept her watching videos for hours without interruption. According to her account, the pursuit of likes on posts dictated her sense of self-worth, leaving her feeling "insecure" or "ugly" when engagement fell short.
The causation question that could reshape liability
The legal battleground centres on a deceptively simple question: did these platforms cause Kaley's diagnosed conditions of body dysmorphia, anxiety, and depression, or were they merely present during a troubled adolescence that would have unfolded similarly regardless? When her lawyer asked whether she'd experienced body dysmorphic symptoms before social media use, Kaley replied simply: "No, I didn't."
Meta's defence strategy, however, points to dysfunction in Kaley's family life as the root cause of her mental health struggles. The company's lead lawyer, Paul Schmidt, highlighted statements Kaley made about a difficult relationship with her mother that had led to thoughts of self-harm, attempting to establish an alternative causal pathway. Kaley countered that most conflicts with her mother stemmed from arguments over iPhone use and time spent online, and maintained that their relationship is currently strong.
This defence approach carries uncomfortable echoes of Big Tobacco's historical playbook: individualise the harm, question the victim's circumstances, and avoid scrutiny of the product itself.
The difference here is that medical causation in mental health remains genuinely complex, with contributing factors difficult to isolate. Expert testimony on whether social media use at such young ages can independently trigger these conditions will prove crucial to the jury's determination.
Billions in potential exposure
The financial stakes extend far beyond this single case. Thousands of similar lawsuits have been filed across the United States by families and state governments, all currently watching how this jury responds to evidence about product design choices. Meta's market capitalisation hovers around £1.2 trillion; even a fraction of a percentage point shift in liability assumptions could translate to tens of billions in potential exposure.
The timing is particularly awkward for platforms already navigating regulatory pressure in Europe. The UK's Online Safety Act and the EU's Digital Services Act now impose explicit duties of care on platforms regarding young users, with substantial penalties for failures. Those frameworks emerged from years of parliamentary inquiries and regulatory development. A jury verdict finding platforms liable for harm could accomplish in one trial what a decade of American legislative efforts has failed to achieve: a binding liability framework that forces fundamental changes to product design.
What's interesting here is how little the platforms appear to have learned from adjacent industries. Gaming companies spent years developing robust age verification and parental controls after facing similar criticism. Alcohol and tobacco advertising is heavily restricted around minors, with clear liability for violations. Yet social media companies have operated for nearly two decades on the premise that exponential user growth justifies minimal friction at signup, even when those users are children.
Zuckerberg himself spent seven hours facing questions from lawyers last week, the first time the billionaire has appeared before a jury. His presence signals that Meta understands the magnitude of what's at stake.
The company isn't just defending against one claim; it's attempting to protect a business model predicated on maximising engagement regardless of user age or vulnerability.
The trial is expected to continue until mid-March. Whether the jury finds Meta and Google liable will determine not just damages in this case, but whether Silicon Valley's "growth at all costs" approach to young users remains legally tenable. For investors, the question isn't whether platforms will eventually face stricter obligations around child safety—international regulatory trends make that inevitable—but whether American liability will come through legislation or through jury verdicts, one case at a time.
Pre-trial settlements by TikTok and Snapchat suggest platforms fear their product design decisions cannot withstand jury scrutiny, potentially triggering a wave of similar settlements across thousands of pending cases
The trial outcome will determine whether Silicon Valley faces liability through unpredictable jury verdicts or controlled legislation—the former being far more expensive and precedent-setting for the industry
Watch for how expert testimony on causation holds up: if juries accept that platforms can independently trigger mental health conditions in children, the entire "growth at all costs" business model becomes legally untenable
Multi-award winning serial entrepreneur and founder/CEO of Venntro Media Group, the company behind White Label Dating. Founded his first agency while at university in 1997. Awards include Ernst & Young Entrepreneur of the Year (2013) and IoD Young Director of the Year (2014). Co-founder of Business Fortitude.