May 11, 2026 6:42 pm

New Mexico Jury Finds Meta Liable for Child Exploitation Harms

A New Mexico jury found Meta harmed children's mental health and hid knowledge of child exploitation on its platforms.

A jury in New Mexico has delivered a significant verdict against Meta, finding the tech giant liable for knowingly harming children's mental health and concealing information about child sexual exploitation on its platforms. The decision marks a pivotal moment in government legal action against technology companies.

The ruling follows a nearly seven-week trial in New Mexico and coincides with jury deliberations in a separate but similar case in California that examines the responsibilities of Meta and YouTube for social media addiction.

Prosecutors argued that Meta, the owner of Instagram, Facebook, and WhatsApp, prioritized profit over user safety, violating the state’s Unfair Practices Act. The jury supported claims that Meta made misleading statements and engaged in exploitative trade practices targeting the vulnerabilities of children.

Financial Implications for Meta

The jury identified thousands of violations, resulting in a $375 million penalty, substantially lower than the amount prosecutors sought. Despite the verdict, Meta's stock rose 5% in after-hours trading, suggesting investors viewed the outcome as manageable for the company.

Juror Linda Payton, 38, said the jury compromised on the number of affected teenagers but opted for the maximum penalty per violation, valuing each child's experience at $5,000.

Potential Changes to Meta’s Operations

The verdict does not immediately impose changes on Meta’s practices. The judge will later decide if Meta’s platforms constitute a public nuisance and whether the company should fund public programs to mitigate these harms. This phase is scheduled for May.

A Meta spokesperson expressed disagreement with the verdict and announced plans to appeal. The company insists on its commitment to safety and transparency in tackling harmful content.

Ongoing Legal Challenges for Meta

This case is among the first in a series of lawsuits targeting social media platforms for their impact on children. Over 40 state attorneys general have filed complaints against Meta, alleging features on Instagram and Facebook contribute to a youth mental health crisis.

Sacha Haworth of The Tech Oversight Project said the verdict weakens Meta's defenses, pointing to the company's failure to prevent online interactions from leading to real-world harm.

The New Mexico lawsuit included an undercover investigation where agents posed as children, documenting Meta’s handling of sexual solicitations. The 2023 lawsuit by Attorney General Raúl Torrez accused Meta of failing to address social media addiction risks.

Meta’s defense cited efforts to eliminate harmful content and asserted that safety investments serve both ethical and business interests. Attorney Kevin Huff emphasized that Meta’s design aims to foster positive connections, not risk exposure.

Legal protections, such as Section 230 of the U.S. Communications Decency Act, have historically shielded tech companies from liability for user-generated content. However, New Mexico prosecutors assert Meta should be accountable for harmful content amplified by its algorithms.

Evidence Reviewed by the Jury

The trial presented Meta’s internal communications and reports on child safety, along with testimonies from executives, engineers, and tech safety experts. Educators testified about social media-related disruptions, including sextortion schemes.

The jury evaluated statements on platform safety by Meta executives, considering the company’s enforcement of age restrictions and algorithmic content prioritization.

ParentsSOS, a coalition of families affected by social media harms, hailed the decision as a significant step in holding Big Tech accountable.
