New Mexico jury finds company willfully violated consumer protection law

Meta has suffered one of its most damaging courtroom defeats yet, after a New Mexico state court jury found the company liable for nearly 400 million dollars in civil damages in a case centered on child safety across Facebook and Instagram. The verdict marks a major legal and reputational setback for the social media giant, and it adds momentum to a broader wave of litigation accusing major technology platforms of misleading the public about the safety of their products for young users.

The jury awarded 375 million dollars after concluding that Meta willfully violated New Mexico's Unfair Practices Act. The case was brought by state attorney general Raúl Torrez, whose office argued that the company failed to adequately protect children from predators and misrepresented how safe its platforms really were. The legal fight grew out of a 2023 lawsuit filed after an undercover investigation in which authorities created a fake profile for a 13-year-old girl that, according to the state, quickly drew a flood of explicit images and predatory solicitations.

For Meta, the scale of the verdict is significant not only because of the financial penalty, but because a jury was persuaded that the conduct went beyond isolated content failures and into the territory of deceptive business practices. That distinction matters. It means the case was not framed only as a problem of third-party misuse, but as a question of whether the company’s own design choices, warnings and public statements misled users and regulators about the risks facing children on the platform.

The case focused on design, warnings and what Meta knew internally

New Mexico’s lawyers argued throughout the trial that Meta had internal evidence showing serious risks to minors, yet failed to move with enough urgency or honesty. They pointed to corporate records and internal messages that, in their view, showed executives understood how certain platform features could make it easier for predators to reach children or harder for law enforcement to investigate abuse.

Among the most sensitive issues raised in court were internal discussions surrounding Meta’s push toward end-to-end encryption on Facebook Messenger. According to filings highlighted during the trial, employees discussed how the move could sharply reduce the company’s ability to generate reports involving child sexual abuse material for law enforcement. The state’s attorneys used those materials to argue that Meta understood the trade-offs but continued to present itself publicly as a company acting aggressively to keep minors safe.

Meta rejected that characterization and said the state had cherry-picked documents to construct an unfair narrative. The company maintained that it has invested heavily in youth safety tools and that it cannot realistically prevent every harmful actor from using its services. After the verdict, Meta said it disagreed with the decision and would appeal, while continuing to defend its record on teen safety.

The ruling could shape how social media companies are sued

The case is important beyond New Mexico because it shows a legal strategy that may become more common. For years, internet companies have relied heavily on Section 230 protections, arguing that they cannot be held legally responsible for content posted by users. In this case, however, the focus was not simply on user content. It was on product design, corporate conduct and consumer protection law.

That is a meaningful shift. Instead of asking courts to hold a platform liable for every harmful post or message, states and plaintiffs are increasingly arguing that companies should be held responsible for how they build systems, what they know those systems do and whether they mislead the public about the risks. Torrez has made clear that this approach is central to his office’s strategy, and he has suggested that changes imposed in one state could become a model for broader action elsewhere.

The verdict is also likely to energize similar litigation already underway across the United States. Cases involving Meta, Snap, TikTok, YouTube and other platforms are increasingly being compared to the tobacco lawsuits of past decades because they revolve around claims that companies understood harms internally while projecting reassurance publicly. Whether or not those comparisons ultimately hold in court, the legal pressure on social media firms is clearly intensifying.

A second phase now threatens broader remedies

The New Mexico case is not over. A second phase will begin in May, this time without a jury, and a judge will decide whether Meta created a public nuisance and should be required to fund programs aimed at addressing the alleged harms. State lawyers are also seeking operational changes, including stronger age verification, more aggressive removal of predators and new protections for minors in areas where encryption may shield abusive conduct.

That second phase could prove just as consequential as the damages verdict. A financial penalty is painful, but a court-ordered change to platform design or safety practices could have wider implications for how Meta operates not only in New Mexico, but potentially in other jurisdictions watching closely. Regulators and plaintiffs elsewhere may see the case as evidence that courts are becoming more willing to scrutinize not just what social media companies host, but how they are built.

Meta still plans to appeal, and the company will argue that it has been unfairly blamed for criminal conduct committed by bad actors it does not control. But the jury’s decision has already sent a powerful signal. In one of the most closely watched child safety cases against a major technology company, jurors were persuaded that Meta’s actions deserved punishment on a large scale. That outcome alone is likely to reverberate far beyond New Mexico.