There is a document that Meta executives would very much prefer you never read. In it, a senior engineer describes the challenge of hooking users ages 11 to 13 with language that reads less like product development and more like a grooming manual. “If we wanna win big with teens, we must bring them in as tweens.” That memo was read aloud in a Los Angeles courtroom this week, and on Wednesday, a jury decided it had heard enough.
The verdict: Meta and YouTube are liable. On all counts. A jury found that both companies negligently designed their platforms, deliberately engineering addiction into the developing brains of children and teenagers. The damages, $3 million compensatory and $3 million punitive, are almost beside the point. What matters is the legal ground that just shifted beneath the entire social media industry.
What The Jury Actually Decided
This was not a case about whether social media is bad for kids in the abstract sense that everyone vaguely agrees it probably is. This was a case about whether Meta and Google’s YouTube built defective products, products deliberately engineered to exploit the neurological vulnerabilities of minors for profit, and whether those companies knew it and kept going anyway.
The jury said yes. To both questions.
The plaintiff, a young woman identified in court as Alexis Spence, began using Instagram at 11 years old. By her early teens she had developed a severe eating disorder and depression that her family attributes directly to the algorithmic content loops Instagram served her with precision. The internal documents showed that Instagram’s recommendation engine registered her vulnerability and kept serving her that content anyway. The jury found Meta bore 70 percent of the liability, with Google’s YouTube picking up the remaining 30 percent.
Both companies said they plan to appeal.
The Internal Documents That Did The Damage
The trial’s most damning moments came not from expert testimony or medical records but from Meta’s own files. Emails and memos introduced as evidence showed that executives at the highest levels, including Mark Zuckerberg himself, were aware of research showing Instagram was harming teenage girls’ mental health and chose engagement metrics over intervention.
The 2021 Wall Street Journal investigation that first surfaced Meta’s internal research, showing that the company knew Instagram was “toxic” for teen girls and buried the findings, was the genesis of the legal avalanche now crashing down. That reporting triggered congressional hearings, state attorney general investigations, and ultimately the consolidation of thousands of lawsuits into federal multidistrict litigation. This trial was the first of those cases to reach a jury.
The companies’ defense, essentially that parents and users bear responsibility for how platforms are used, landed with a thud. Jurors interviewed after the verdict said the internal documents were impossible to explain away. When a company’s own engineers describe teenagers as a “growth vector” and calculate optimal notification timing to keep kids scrolling past midnight, the “we didn’t know” defense evaporates.
Why $6 Million Is Not The Story
Meta generated roughly $160 billion in revenue in 2024. Six million dollars is a rounding error. The significance of this verdict is not the dollar amount but the precedent: for the first time in American legal history, a jury has concluded that social media platforms can be held liable as defective products for how they were designed, not just what content they carried.
Section 230 of the Communications Decency Act has long shielded tech platforms from liability for the content their users post. But the plaintiffs in these cases argued something different: that the harm came not from user content but from the platforms’ own algorithmic architecture, the recommendation engines, the infinite scroll, the notification systems, the features designed from the ground up to maximize time-on-app regardless of psychological cost. That argument just got a jury’s stamp of approval.
There are now approximately 2,000 similar lawsuits pending in federal and state courts across the country. They cover plaintiffs ranging from children who developed eating disorders and anxiety to families of teenagers who died by suicide. Wednesday’s verdict functions as a bellwether: it tells plaintiffs’ attorneys, defendants’ attorneys, and judges everywhere how a jury of regular Americans reacts to this evidence when they see it unfiltered.
What Comes Next
Meta’s and Google’s appeals will likely focus on whether the plaintiffs’ design-defect theory is preempted by federal law or blocked by Section 230. Legal experts are divided, but the trend in the courts has not been favorable to the platforms. Multiple appellate decisions over the past two years have allowed similar claims to proceed, and the Supreme Court has so far declined to definitively resolve the Section 230 question in ways that would shut down these cases.
Congress, as usual, has done little. The Kids Online Safety Act passed the Senate in 2024 with overwhelming bipartisan support and then died quietly in the House, a casualty of tech industry lobbying and election-year paralysis. The current session has seen competing bills but nothing close to passage. The courtroom, for now, is where the accountability is actually happening.
There is a broader reckoning building here that goes beyond litigation. The testimony in this trial put on public record something most parents already suspected: the attention economy was not a side effect of social media. It was the product. The platforms were not accidentally addictive. They were engineered to be. And the engineers knew who their most profitable, most vulnerable users were.
Meta’s stock dropped modestly on the news. YouTube’s parent company Alphabet fell slightly more. Neither move reflected the existential weight of what a Los Angeles jury just decided. The industry’s reckoning has been arriving in slow motion for years. This week, it arrived a little faster.
