
Here is the clearest possible statement of Meta’s priorities: when a state court ordered the company to actually protect children on its platforms, Meta’s first instinct was to threaten to take its toys and go home.
Phase two of New Mexico’s landmark child safety lawsuit begins today, and the tech giant has warned it could simply withdraw Facebook and Instagram from the entire state rather than comply with court-ordered safety redesigns. New Mexico Attorney General Raúl Torrez called the bluff, telling reporters that Meta “is showing the world how little it cares about child safety.”
He is not wrong. And the implications of this trial extend far beyond one state’s borders.
What the Jury Already Decided
In March, a New Mexico jury found that Meta violated the state’s consumer protection law by misrepresenting the safety of Facebook and Instagram for young users. The jury assessed $375 million in civil penalties, determining that Meta knowingly harmed children’s mental health and concealed what it knew about child sexual exploitation on its platforms. That verdict was not ambiguous. It was a jury of ordinary citizens looking at the evidence and concluding that one of the world’s largest companies systematically lied about how safe its products were for kids.
Phase two, which starts today, is where the remedy gets decided. And what New Mexico is asking for would fundamentally change how Meta operates, at least within state lines.
What New Mexico Wants
The state is demanding a comprehensive overhaul of how Meta’s platforms interact with minors. The list reads like a tech accountability wish list that child safety advocates have been pushing for years: a complete redesign of recommendation algorithms so they no longer prioritize engagement over safety for children, elimination of features linked to compulsive use (infinite scroll, push notifications, and “like” counts displayed by default), mandatory parental or guardian accounts linked to every child’s profile, improved age verification, stronger measures against child sexual exploitation, and a court-supervised child safety monitor to track compliance over time.
Every item on that list targets a specific design choice that Meta made deliberately because it drives engagement metrics. Infinite scroll keeps users on the platform longer. Push notifications pull them back. Like counts create social pressure loops. These are not bugs. They are features, engineered to maximize time-on-app, and they reflect an industry that consistently prioritizes growth over user welfare.
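To see the tension in concrete terms, consider a deliberately simplified sketch of the two ranking philosophies at issue: one that maximizes engagement, and one that applies the kind of child-safety constraints New Mexico is requesting. This is an illustration only; the field names, thresholds, and weights below are hypothetical, and nothing here reflects Meta’s actual systems.

```python
# Toy illustration of engagement-first vs. safety-constrained feed ranking.
# All signals, weights, and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Post:
    predicted_watch_seconds: float  # engagement signal
    predicted_likes: float          # engagement signal
    safety_risk: float              # 0.0 (benign) to 1.0 (high risk)

def engagement_score(post: Post) -> float:
    # Pure engagement ranking: whatever keeps the user scrolling wins.
    return post.predicted_watch_seconds + 5.0 * post.predicted_likes

def child_safe_score(post: Post, is_minor: bool) -> float:
    # Safety-constrained ranking: for minors, risky content is excluded
    # outright and the engagement objective is capped, not maximized.
    if is_minor and post.safety_risk > 0.3:
        return float("-inf")  # never surfaced to a minor
    score = engagement_score(post)
    if is_minor:
        score = min(score, 60.0)  # dampen compulsive-use optimization
    return score

feed = [Post(120.0, 40.0, 0.6), Post(30.0, 5.0, 0.05), Post(90.0, 25.0, 0.1)]
ranked_for_minor = sorted(feed, key=lambda p: child_safe_score(p, True), reverse=True)
```

Even in this cartoon version, the conflict is plain: every safety constraint is a direct subtraction from the engagement objective the business is built to maximize.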
Meta’s Nuclear Option
Rather than engage with these demands on the merits, Meta has floated the possibility of pulling its platforms from New Mexico entirely. The company filed court documents in late April stating it “could withdraw Facebook and Instagram from New Mexico pending the bench trial’s outcome.” The message to the court, and to other states watching closely, is unmistakable: push too hard on child safety and we will simply deny your citizens access to our platforms.
This is the corporate equivalent of a hostage negotiation. Meta is betting that the prospect of two million New Mexicans losing access to Facebook and Instagram will pressure the court into softening its remedies. It is a strategy that reveals more about Meta’s internal calculus than any earnings call ever could. The company apparently believes that the political backlash from pulling out of a state would be less damaging than actually building safer products for children.
Think about what that means. Meta has looked at the math and concluded that redesigning its algorithms to stop harming kids is more expensive than abandoning an entire state’s user base. That calculation tells you everything about how deeply engagement-driven design is embedded in Meta’s business model. It is not a feature that can be toggled off. It is the architecture.
Why This Trial Matters Nationally
New Mexico is not the only state pursuing Meta over child safety. Attorneys general across the country are watching this trial as a test case. If New Mexico succeeds in forcing algorithmic changes, it creates a legal template that other states can replicate. If Meta successfully uses withdrawal threats to water down the remedy, it sets a precedent that tech companies can bully their way out of accountability by threatening to leave markets.
The federal landscape offers no alternative. Congressional efforts to pass comprehensive children’s online safety legislation have stalled repeatedly, caught in the usual crossfire of free speech concerns, industry lobbying, and partisan gridlock. State courts have become the de facto regulators of tech platforms because the federal government has failed to act. That makes this New Mexico courtroom one of the most important venues for tech accountability in the country right now.
The Broader Pattern
Meta’s response follows a familiar playbook. When Australia passed legislation requiring tech platforms to pay news publishers, Meta briefly blocked news content for Australian users before restoring it once the law was amended. When the European Union implemented the Digital Services Act, Meta initially resisted before adapting. The pattern is consistent: threaten withdrawal, generate panic, negotiate from a position of leverage, and ultimately comply with a watered-down version of the original requirement.
The difference here is that the requirement involves protecting children. Public sympathy for Meta’s position evaporates when the alternative is a company choosing profits over kids. Attorney General Torrez understands this dynamic, which is why he has been aggressive in framing Meta’s threat as evidence of corporate indifference to child welfare. “If Meta would rather leave New Mexico than protect children,” Torrez told the Washington Post, “that tells parents everything they need to know about this company.”
What Happens Next
The bench trial, decided by a judge rather than a jury, will determine whether New Mexico’s requested remedies are legally justified and practically enforceable. Meta has vowed to appeal the phase one verdict regardless of how this phase ends, which means this fight is headed to appellate courts and possibly beyond.
But the real battle is not legal. It is structural. Meta built its business on algorithmic engagement. Every dollar of advertising revenue depends on keeping users scrolling, clicking, and returning. Asking Meta to redesign those systems for children is asking it to put a governor on the engine that drives its entire business model. The company’s willingness to threaten state-level withdrawal rather than comply tells you how existential it considers that request.
For parents, for regulators, and for anyone who believes that a company worth hundreds of billions of dollars should be capable of not harming children, the answer from Menlo Park is now on the record: Meta would rather leave than change.
