Brunswick, ME • (207) 245-1010 • contact@johnzblack.com
There’s a legal difference between settling a lawsuit and a jury deciding you’re liable. Settlements can be framed as pragmatic choices. Jury verdicts can’t. Six weeks of trial, and a group of citizens said: yes, Meta is responsible for this.
On March 24, 2026, a New Mexico jury returned a $375 million verdict against Meta. The first jury verdict against the company over harm to children. Not a regulatory fine, not a consent decree, not a confidentiality-covered settlement. A verdict.
New Mexico AG Raul Torrez pursued the case under the state’s Unfair Practices Act, which was smart. It shifted the question from “did Instagram hurt this specific child?” to “did Meta engage in unfair practices toward consumers?” The jury found thousands of violations. The core liability finding: Meta failed to warn users about dangers to children and failed to protect kids from predators on its platforms.
Meta called the case sensationalist and announced it would appeal. It probably will, and the appeal will run for years. But the verdict exists now.
Forty-one states are already in a separate multistate lawsuit against Meta. Torrez didn’t wait for that to resolve. He brought his own case, got to trial faster, got a verdict. That’s a roadmap. State consumer protection statutes sidestep Section 230 arguments entirely by focusing on business practices rather than content. If this theory holds on appeal, other AGs have a working template.
Meanwhile, the UK is piloting social media restrictions for teenagers and consulting on whether to ban under-16s from social media altogether. Two continents, same pressure point.
Platform self-regulation had a long run. It’s running out of runway.