TL;DR
A Los Angeles jury is hearing the first major U.S. trial accusing Meta and Google of designing Instagram and YouTube to addict and harm children – a bellwether case that could shape hundreds of similar lawsuits and future rules for kids’ social media use.
Why This Matters
The trial unfolding in Los Angeles tests a new legal theory: that social media companies can be held liable not just for what users post, but for allegedly addictive design choices built into their products. If jurors agree that features such as endless scrolling and “like” buttons were deliberately engineered to hook young users and worsen their mental health, it could open the door to large damages and sweeping changes in how platforms operate.
The case comes amid deep concern about youth mental health and heavy screen time. A 2023 advisory from the U.S. Surgeon General warned that social media can pose a “profound risk of harm” to children and teens when used without adequate safeguards. At the same time, tech companies say their services help young people connect and learn, and they stress that parents and users also share responsibility.
Globally, governments are moving toward tighter limits on children’s access to social platforms. Lawmakers in France, Australia, and the United Kingdom are advancing or enforcing age-based bans and stricter protections, signaling that what happens in U.S. courts may resonate far beyond one country or one set of companies.
Key Facts & Quotes
The Los Angeles County Superior Court trial centers on a 19-year-old woman identified only as “KGM,” who says she started using social media at a young age and became addicted, worsening her depression and suicidal thoughts. Her lawsuit claims that Meta’s Instagram and Google’s YouTube used design features aimed at maximizing youth engagement to drive advertising revenue.
In opening statements, plaintiffs’ lawyer Mark Lanier told jurors the case is “as easy as ABC,” which he said stands for “addicting the brains of children.” He argued that Meta and Google, “two of the richest corporations in history,” have “engineered addiction in children’s brains.” The complaint alleges the companies “borrowed heavily from the behavioral and neurobiological techniques used by slot machines and exploited by the cigarette industry.”
This is the first jury trial among hundreds of similar cases consolidated in what lawyers describe as “bellwether” proceedings – early test trials meant to guide potential settlements or future litigation. Executives, including Meta CEO Mark Zuckerberg, are expected to testify. The trial is scheduled to last six to eight weeks.
Meta and Google strongly deny the allegations. A Meta spokesperson has said the company “strongly disagrees” with the claims and is “confident the evidence will show our longstanding commitment to supporting young people.” A Google spokesperson, Jose Castaneda, called the allegations against YouTube “simply not true,” saying, “Providing young people with a safer, healthier experience has always been core to our work.” Both companies point to parental controls, age restrictions, and other safeguards added in recent years.
The lawsuit also seeks to test whether the companies’ alleged design choices fall outside protections such as Section 230 of the Communications Decency Act, which generally shields online platforms from liability for user-generated content. Legal experts say a ruling that focuses on product design, rather than content, could narrow that shield.
Parallel actions are building pressure on the industry. In New Mexico, a separate trial is beginning over claims that Meta failed to protect young users from sexual exploitation. A federal bellwether trial set for June in Oakland, California, will be the first involving school districts suing social media platforms over harms to students. More than 40 state attorneys general have filed suits accusing Meta of contributing to a youth mental health crisis by designing features that allegedly keep children online longer. TikTok faces similar lawsuits in more than a dozen states.
Overseas, French lawmakers have approved a bill banning social media for children under 15, while Australia says platforms have deactivated millions of underage accounts under new rules that bar those under 16 from using major services. The British government has also said it is weighing age-based bans as part of wider online safety laws.
What It Means for You
For parents, grandparents, and caregivers, this trial is a high-profile test of who bears responsibility for managing children’s time and behavior online. A verdict in favor of the plaintiffs could push platforms to redesign features that keep users scrolling, strengthen age checks, or further limit what younger teens can see and do. Even if the companies win, the volume of lawsuits and global policy changes suggests more guardrails for kids’ social media use are likely on the horizon.
For everyday users, the case may lead to new prompts, default settings, and safety tools – for example, more aggressive time limits or content filters for minors. Policymakers will be watching closely as they debate national and state-level rules. Appeals are likely, so any legal outcome may take years to fully play out, but this first jury trial offers an early look at how courts are weighing the balance between innovation, free expression, and child protection.
How do you think responsibility for protecting children online should be shared between families, technology companies, and government?
Sources include: Los Angeles County Superior Court case filings and schedules; public statements from Meta, Google and the New Mexico Attorney General’s Office (2023-2026); multi-state attorneys general complaints filed against social media platforms (2023-2024); the U.S. Surgeon General’s Advisory on Social Media and Youth Mental Health (May 2023); and contemporaneous wire service reporting dated Feb. 9, 2026.