
After a month of explosive testimony, damning internal documents, and a courtroom appearance by Mark Zuckerberg that will not soon be forgotten, closing arguments got underway Thursday in a landmark Los Angeles trial that could fundamentally reshape how social media companies build their products, and what happens when those products hurt children.
The case centers on a 20-year-old woman identified only as Kaley, who alleges that Instagram and YouTube hooked her from the time she was a small child. By her account, she was watching YouTube at six and on Instagram by nine; in the years that followed, she developed worsening depression and body dysmorphia, harms her attorneys argue were engineered by two of the wealthiest corporations in American history.
The Case That Could Change Everything
This is not just one lawsuit. It is a bellwether trial, a test case tied to more than 1,600 similar suits filed by families and school districts across the country, and its outcome will signal how those claims are likely to fare. The way this jury rules in Los Angeles will almost certainly shape the trajectory of every one of those cases, and could trigger a wave of settlements costing Meta and Google billions.
TikTok and Snap, originally named as defendants, settled before trial began under terms that were not publicly disclosed. Meta and Google’s YouTube chose to fight. That decision is about to be tested before a jury.
The plaintiffs’ legal theory is as audacious as it is consequential: treat social media apps not as publications shielded by Section 230 of the Communications Decency Act, but as defective products under product liability law. The argument is that companies like Meta deliberately engineered their platforms to be harmful and then dismissed their own internal warnings when those warnings got in the way of growth. Legal experts have compared the scale of what is being attempted here to the tobacco litigation of the 1990s.
Zuckerberg on the Stand
The most electric moment of the trial came when Zuckerberg himself took the witness stand. Plaintiffs’ attorney Mark Lanier, who opened the case by describing it as being “as easy as ABC” (addicting the brains of children), confronted the Meta CEO with a 35-foot-wide collage of hundreds of selfies that Kaley had posted to Instagram as a child, pressing him on whether her account ever drew any scrutiny for such heavy use at so young an age. Zuckerberg did not give a direct answer. Kaley watched from the gallery.
When Meta’s own lawyer took over questioning, the tone shifted markedly. Zuckerberg relaxed. He pushed back on what he called a misconception that the company profits from any attention it can capture, regardless of harm. But that argument ran directly into the evidence the plaintiffs had already introduced: Meta’s internal research, known as Project MYST, found that parental supervision and household controls had little meaningful impact on whether teenagers compulsively used social media. Not because parents weren’t trying, but because the platforms are designed in ways that override those efforts.
Meta’s head of Instagram, Adam Mosseri, also testified. He partly acknowledged the Project MYST findings, suggesting that some teens use the platform to escape difficult realities. He was careful, however, to avoid the word addiction. The company prefers the phrase “problematic use,” a linguistic choice that did not go unnoticed by the jury or the press.
What the Evidence Showed
The trial produced a steady stream of internal documents that made for uncomfortable viewing. Expert witnesses described multiple studies linking regular social media use with worsening depression, anxiety, and body image problems among teenagers. A Stanford psychiatrist testified that Meta’s design features are addictive. Witnesses described features like infinite scroll, auto-play video, push notifications, and algorithmic recommendation systems as tools deliberately engineered to keep young users on the platform as long as possible, maximizing the advertising revenue that flows from their attention.
The defense pushed back consistently. Meta’s lead attorney Paul Schmidt argued that the central question is not whether bad things happen on the internet, but whether the platforms disclosed the risks they knew about. His repeated refrain: “Meta disclosed, it didn’t deceive.” Google, through spokesperson José Castañeda, maintained that the allegations against YouTube are “simply not true” and that providing safer experiences for young users has always been a priority.
But credibility is a jury question, and that jury has now spent a month watching both sides make their case.
A Second Front in New Mexico
The Los Angeles trial is not happening in isolation. A separate case brought by New Mexico Attorney General Raúl Torrez opened in Santa Fe the same week and involves a different but overlapping set of allegations: that Meta failed to protect children from sexual exploitation on its platforms, and that its move to end-to-end encryption on Messenger actively made that problem worse by limiting the ability to detect and report abuse.
Unsealed internal documents in that case included a warning from a Meta safety researcher estimating that roughly half a million cases of child exploitation occur on the platform daily. The New Mexico AG’s office built its case by posing as children online and documenting what followed: sexual solicitations and Meta’s response to them. Prosecutors described the platform as a marketplace for human trafficking. Meta called the investigation ethically compromised and said prosecutors were cherry-picking data. That trial is expected to run approximately eight weeks.
West Virginia filed a similar suit against Apple this week, alleging the company failed to prevent child sexual abuse material from being stored and shared on iOS and iCloud, and that its encryption practices impede law enforcement’s ability to investigate those crimes.
The Stakes Are Existential
Eric Goldman, a professor at Santa Clara University School of Law who has spent years studying internet liability, has been blunt about what a plaintiff victory would mean. “The internet is on trial in these cases,” he said. “If the plaintiffs win, the internet will almost certainly look different than it does today.” The same legal arguments, if they succeed, could apply to video game companies, generative AI products, and virtually any platform that uses algorithmic recommendation to hold user attention.
That is precisely the argument Meta and Google have made throughout the trial. They are not merely fighting this one case. They are fighting to preserve a legal architecture that has protected the internet industry for three decades. The stakes, in every sense, could not be higher.
The jury must now weigh whether two of the most powerful companies on earth knew what their products were doing to children and chose growth anyway. Once closing arguments wrap, deliberations begin. The verdict, when it comes, will read as a judgment on an entire era.
For more on the trial’s background, NPR has a detailed breakdown of how the cases were built and what is at stake legally.
