First-of-its-kind lawsuit targets Meta and Google
Los Angeles - A jury is weighing whether social media platforms deliberately designed their apps to addict young users, in a case that could reshape legal protections for tech giants. The plaintiff, identified only as Kaley, testified that she spent up to 16 consecutive hours on Instagram, sacrificing sleep, family time, and offline relationships.
Plaintiff's testimony paints picture of compulsive use
Kaley told the court she began using YouTube at age six and Instagram at nine, despite Meta's stated policy barring users under 13. By her preteens, she had created dozens of accounts to chase likes and validation. She recalled first experiencing anxiety and depression around age 10, later diagnosed as body dysmorphia, a condition she attributes to filters that distorted her self-image.
"I didn't have those feelings before social media," she testified when asked if her struggles predated her use of the platforms.
Parents blame platforms for children's deaths
Lori Schott, whose 18-year-old daughter Annalee died by suicide, attended the trial despite having no direct stake in the case. Schott alleges Instagram exposed her daughter to harmful content while concealing internal research about its risks. "They hid the research. They knew it was addictive," she told the BBC.
Aaron Ping, whose 16-year-old son Avery also took his own life, described a once-active boy who became consumed by YouTube. "We fought constantly over screen time," Ping said, recounting a failed agreement with school counselors to limit usage.
Neither Meta nor Google responded to requests for comment on the families' accounts.
Tech CEOs defend platforms under scrutiny
Meta CEO Mark Zuckerberg appeared in court to defend his company's practices, the first time he has testified in person despite hundreds of prior lawsuits. Surrounded by security, Zuckerberg insisted Meta's goal was merely to create "useful" platforms, not addictive ones. When pressed on internal documents discussing child users, he grew frustrated: "I don't see why this is so complicated. It's our policy they're not allowed, and we try to remove them."
Instagram head Adam Mosseri called Kaley's 16-hour sessions "problematic" but disputed that they showed addiction, testifying that addiction is not an official diagnosis for social media use. Meta's legal team focused on Kaley's home life, citing posts that suggested family instability and arguing other factors contributed to her mental health struggles.
Legal and cultural stakes loom large
Judge Carolyn Kuhl called the case "completely unprecedented," noting that a verdict against Meta or Google could upend decades of legal precedent treating platforms as neutral intermediaries. A plaintiff victory might pave the way for billions in settlements and influence thousands of similar lawsuits nationwide.
Even without a liability finding, public pressure has already prompted some governments to restrict social media access for minors. Critics accuse platforms of exposing children to unrealistic beauty standards, harassment, and predators, harms tech companies have long disclaimed responsibility for.
Plaintiff's future hinges on jury's decision
Now an adult, Kaley testified that she still uses social media and has considered a career in the field. But when asked if her life would be better without platforms like Instagram, she answered unequivocally: "Yes."
The jury's decision will determine whether tech giants bear legal responsibility for the mental health fallout of their design choices, or whether the status quo prevails.