Mark Zuckerberg spent seven hours in a Los Angeles courtroom, his first time ever facing a jury
The plaintiff began using YouTube at six years old and Instagram at nine, encountering no age verification barriers
Thousands of similar lawsuits filed by families and state governments across the US now hinge on this trial's outcome
TikTok and Snapchat settled with the plaintiff before trial with undisclosed terms
Mark Zuckerberg spent seven hours in a Los Angeles courtroom last week, the first time the Meta chief executive has ever faced a jury. The case involves Kaley, a 20-year-old woman who began using YouTube at six and Instagram at nine, encountering no age verification barriers on either platform. Her testimony that social media use consumed her childhood and triggered mental health problems has set up a legal battle that could fundamentally reshape how tech giants operate their most lucrative products.
The stakes extend far beyond one plaintiff. Thousands of similar lawsuits filed by families and state governments across the US now hinge on whether a jury accepts that social media platforms bear legal responsibility for harm to underage users. That Zuckerberg chose to appear in person signals that Meta's leadership views this as an existential threat to its business model, not merely another regulatory nuisance to be managed.
The deflection strategy
Meta's legal defence has centred on deflecting liability away from Instagram's product design and towards Kaley's family circumstances. Paul Schmidt, Meta's lead lawyer, pointed to statements Kaley made before filing her lawsuit, in which she described difficulties with her mother and suggested those family tensions contributed to thoughts of self-harm.
This strategy reveals something significant about how the tech industry plans to fight child safety litigation: by making each case about individual family dysfunction rather than systemic product features that affect millions of young users.
If successful, this approach would effectively immunise platforms from liability by suggesting that any child experiencing problems must have pre-existing vulnerabilities unrelated to platform design. Kaley countered in her testimony that arguments with her mother stemmed primarily from her iPhone use and time spent online, and that their relationship is close today.
The he-said-she-said nature of this exchange misses the larger question: whether platforms that deploy autoplay features, algorithmic feeds, and like-based validation systems to users as young as six can claim no responsibility for predictable psychological effects. Equally striking is the absence of two defendants: TikTok and Snapchat settled with Kaley shortly before trial, on undisclosed terms.
Product design on trial
Kaley's account describes behaviour patterns that Meta would likely celebrate in internal metrics meetings. She told the court that checking Instagram was "the first thing" she did upon waking and continued "all day" until bedtime, leading to problems at school, at home, and with her mental health. She watched YouTube videos continuously for hours, aided by the platform's autoplay feature, which queues new content without requiring any user action.
These aren't bugs in the system. They're core features designed to maximise engagement and the advertising revenue that flows from it. Autoplay, infinite scroll, algorithmic recommendations calibrated to individual psychology, and social validation through likes and comments all serve the same commercial purpose: keeping users on the platform for as long as possible.
The legal question is whether deploying these features to children constitutes negligence.
Kaley has been diagnosed with body dysmorphia, anxiety, and depression—conditions she attributes to social media use starting from age nine. She testified that she had no body image concerns before using these platforms and began self-harming at age ten. She has been in therapy since age 13.
Meta contests that Instagram caused these conditions, arguing instead that Kaley's mental health struggles stem from family problems. Google, facing claims about YouTube's role, has remained quieter in public proceedings but faces similar liability questions about offering algorithmically driven content to young children without effective age restrictions.
The business model at risk
The scientific evidence on causation between social media use and specific mental health diagnoses remains contested in academic circles, and Meta's lawyers will certainly emphasise this ambiguity. But juries don't operate like peer review panels. They assess negligence and responsibility based on different standards, including whether companies took reasonable precautions when they could foresee potential harm.
This is where the business implications become acute. If the jury finds that offering addictive design features to young children without meaningful age verification constitutes actionable negligence, the liability exposure extends across platforms' entire underage user base. Implementing robust age verification, removing engagement-maximising features for young users, or both would fundamentally alter the economics of platforms that have built valuations in the hundreds of billions by capturing users as young as possible and optimising for their attention.
The trial is expected to continue until mid-March, with the outcome likely establishing precedent for the thousands of similar cases waiting in the wings. Meta and Google's defensive strategies suggest they recognise the threat but have chosen to fight on the ground of individual causation rather than conceding any systemic design problems. The settlements by TikTok and Snapchat suggest an alternative calculation: that avoiding precedent matters more than contesting individual claims.
For investors and industry observers, the key variables to watch are whether the jury accepts Meta's family dysfunction argument or finds product design itself culpable, and what damages framework emerges if plaintiffs prevail. The broader regulatory environment around child online safety is already tightening in the UK and EU, but US litigation could prove more consequential for business models built on capturing young users early and maximising their engagement regardless of psychological effects.
This article is for informational purposes and does not constitute financial advice or legal analysis.
Watch whether the jury accepts individual family dysfunction arguments or finds systemic product design culpable—this will set precedent for thousands of similar cases
If plaintiffs prevail, implementing robust age verification and removing engagement-maximising features for young users would fundamentally alter social media economics and business models
The strategic settlements by TikTok and Snapchat suggest some platforms believe avoiding legal precedent matters more than contesting individual claims
Multi-award-winning serial entrepreneur and founder/CEO of Venntro Media Group, the company behind White Label Dating. Founded his first agency while at university in 1997. Awards include Ernst & Young Entrepreneur of the Year (2013) and IoD Young Director of the Year (2014). Co-founder of Business Fortitude.