
The facts are stark. A bipartisan coalition of 42 state attorneys general has taken Meta to court, alleging the company knowingly engineered addictive features for Instagram and Facebook that harm kids' mental health and collected data from children under 13 without parental consent, in violation of COPPA.
The lead case was filed in federal court in the Northern District of California, with additional state actions running in parallel. It's an extraordinary show of state power aimed squarely at the business model of attention extraction that defines social media. And it's not isolated. Roblox, the gaming platform that is effectively the social network for millions of children, faces a fast-growing cascade of lawsuits alleging it failed to protect kids from predators and built systems that made grooming easier, not harder. If you're looking for a turning point in platform accountability for children, this may be it.
State AGs didn't tiptoe. They allege Meta's "infinite scroll," social reward mechanics (likes, streaks, notifications), and pushy engagement loops are designed to hook kids and correlate with harms like anxiety, insomnia, and school interference, while Meta publicly downplayed the risks. They also say Meta had "actual knowledge" of under-13 users and still gathered their data, triggering COPPA. The coalition spans red and blue states, from Arizona to New York and Virginia to Washington. It's rare, it's muscular, and it specifically targets design, not just content moderation theater. That focus is critical: it reframes the problem from "bad posts" to "predatory mechanics." See the filings and state press materials for details and participating states; the New York, Colorado, and California announcements each describe the same theory of harm and legal posture. Read the NY Attorney General press release.
The Roblox litigation wave is less centralized but no less revealing. Families across the country have filed suits alleging negligent design, inadequate age verification, weak moderation, and features that make it easy for adults to impersonate peers, establish trust via in-game chat or "Robux," and migrate children to less-regulated apps for exploitation. Investigations and suits from state officials and families are piling up, and new cases continue to be filed: an unmistakable sign of mounting systemic risk. Recent reporting underscores the state-level scrutiny, including criminal subpoenas and AG actions that explicitly frame Roblox as a predator-friendly environment and demand accountability for safety failures. Additional litigation filings and coverage detail claims of grooming pipelines and platform design choices under fire.
The Design Choices Under Indictment
This moment is less about a single feature and more about an architecture of engagement that, when pointed at children, becomes a health hazard. On social: endless feeds, intermittent reinforcement, and social comparison loops. On Roblox: open-ended communication, insufficient age assurance, content sprawl with inconsistent guardrails, and seamless migration pathways to off-platform channels. The legal theory converges on a simple assertion: these systems were not accidents; they were business choices. The Meta complaints emphasize manipulative features and COPPA violations as unfair and deceptive practices under state law.
Read the California AG press release.
There's a broader trend worth naming: states filling a federal vacuum. Congress has stalled on comprehensive protections such as KOSA and COPPA 2.0, so AGs are using state UDAP (unfair and deceptive acts and practices) statutes, COPPA enforcement theories, and public nuisance arguments to force changes in product design. It's the same playbook we've seen with opioids and vaping: when Washington dithers, states litigate. That has consequences: settlements can rewrite product defaults faster than rulemaking ever could.
The Stakes for Democracy and the Next Internet
If you care about democratic resilience, this is not an adjacent issue; it's core. Healthy democracies require healthy information environments and citizens capable of navigating them. When platforms normalize surveillance-based engagement for children and outsource risk to families and schools, they externalize costs onto the public realm: classrooms manage the attention fallout; pediatricians triage anxiety and sleep disruption; prosecutors chase predators whose access was enabled by design. This isn't just a consumer dispute; it's a transfer of public health risks from corporations to communities.
There's also a market power dimension. The dominant players set the design standards others follow. If Meta is compelled to harden defaults (true age assurance, teen-safe mechanics by design, data minimization for minors), those choices can cascade across the industry. Likewise, if Roblox is forced to decouple monetization and social affordances from safety-critical features (separating adults and minors by default, narrowing chat to verified connections, enforcing real age verification, and requiring pre-publication review for experiences likely to be accessed by kids), the "acceptable baseline" for child platforms rises. That's how norms shift: not through PR, but through court-enforceable obligations.
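To make the default-separation idea concrete, here is a minimal sketch under stated assumptions: hypothetical Python, with every name invented for illustration, reflecting no platform's actual code. The rule it encodes is that chat between an adult and a minor opens only over a verified connection, and everything else is denied by default.

```python
# Hypothetical deny-by-default gate for adult-minor chat.
# No name here corresponds to any real platform API.

ADULT_AGE = 18

def can_open_chat(sender_age: int, recipient_age: int,
                  verified_connection: bool) -> bool:
    """Cross-age-band chat requires a verified connection."""
    crosses_age_band = (sender_age >= ADULT_AGE) != (recipient_age >= ADULT_AGE)
    if crosses_age_band:
        # A verified connection might mean a parent-approved or
        # identity-verified contact; anything else stays blocked.
        return verified_connection
    return True

# An unknown adult cannot message a 12-year-old...
assert can_open_chat(34, 12, verified_connection=False) is False
# ...but a verified connection can.
assert can_open_chat(34, 12, verified_connection=True) is True
```

The design choice worth noticing is the direction of the default: contact across the age boundary is off until something affirmative turns it on, the opposite of how open-chat platforms have historically shipped.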
What Real Reform Looks Like
- Age assurance with teeth: no more honor-system birthdays; adopt privacy-preserving age estimation backed by verifiable parental consent for under-13 features.
- Safety by default, not by setting: teens and kids start in the safest configuration; friction is added to expand access, not to secure it (see the sketch after this list).
- Anti-grooming architecture: adult-minor separation by default, restricted DMs, hard blocks on off-platform contact sharing, and robust real-time detection tuned for grooming patterns.
- Design detox: end infinite scroll for minors; reduce social comparison metrics; rate-limit notifications; enforce "lights-out" quiet hours.
- Data minimization: no behavioral advertising or monetization of minors' data, period.
- Accountability with logs: auditable safety pipelines, moderator response SLAs for child endangerment, and mandatory transparency to independent monitors.
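To show the list reads as an engineering spec and not just rhetoric, here is a minimal sketch of the "safety by default" and "design detox" items, again in hypothetical Python with invented names and thresholds; no platform's real configuration is represented.

```python
# Hypothetical "safest configuration first" sketch: accounts start at the
# most restrictive settings for their age band, and the friction sits on
# loosening a default. All names and thresholds are invented.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class AccountSettings:
    infinite_scroll: bool         # endless feed on/off
    public_metrics: bool          # visible likes/streaks
    dms_from_strangers: bool
    notifications_per_hour: int   # rate limit on push notifications
    quiet_hours: tuple            # (start_hour, end_hour), or () for none

def default_settings(age: int) -> AccountSettings:
    """Every account begins in the safest configuration for its age band."""
    if age < 13:  # COPPA-covered: no engagement mechanics, no stranger DMs
        return AccountSettings(False, False, False, 2, (20, 8))
    if age < 18:  # teens: no infinite scroll, enforced overnight quiet hours
        return AccountSettings(False, False, False, 5, (22, 7))
    return AccountSettings(True, True, True, 30, ())

def loosen(s: AccountSettings, age: int, field: str, value,
           parental_consent: bool = False) -> AccountSettings:
    """Friction applies to expanding access: minors need verified consent."""
    if age < 18 and not parental_consent:
        raise PermissionError("verified parental consent required")
    return replace(s, **{field: value})

teen = default_settings(15)
assert not teen.infinite_scroll and not teen.dms_from_strangers
# Relaxing a default, not securing the account, is the gated action:
teen = loosen(teen, 15, "notifications_per_hour", 10, parental_consent=True)
```

The asymmetry is the point: securing an account takes zero steps, while expanding access takes a verified one, inverting the growth-era pattern of safe settings buried behind menus.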
This is all feasible; the recent rush of hurried safety improvements proves it. When platforms roll out age estimation, chat restrictions, and new moderation AI after public pressure, they inadvertently make the plaintiffs' case: safer designs were practicable all along.
The Political Signal
The bipartisan AG front against Meta is instructive. It suggests a rare area of cross-party consensus: the status quo for kids online is indefensible. It also hints at the likely endgame: a multistate settlement that imposes design constraints industry-wide. If that happens, watch for a domino effect as AGs turn to other youth-heavy platforms and demand parity with whatever Meta agrees to. Settlements can become the de facto rulebook even in the absence of federal law.
The Roblox docket may get there differently, through a thicket of state actions and private suits, but the gravitational pull is the same: platforms that serve children will be judged by the safest reasonable design standard, not by growth-era excuses.
The throughline is simple: democratic societies don't outsource child safety to engagement metrics. We draw lines. Now the courts are drawing them.