
Meta loses trial after arguing child exploitation was “inevitable” on its apps

Why This Matters

This verdict underscores the urgent need for social media platforms to prioritize child safety and implement more effective protections against exploitation. It highlights the risks users face, especially children, when platforms fail to address known vulnerabilities. The ruling could invite increased regulatory scrutiny and push tech companies to strengthen safety measures to prevent future harms.

Key Takeaways

Meta has lost the first of three child safety trials it’s facing this year after a jury in a New Mexico state court found that the social media giant’s platforms do not effectively protect kids from child exploitation.

On Tuesday, the jury deliberated for only one day before agreeing that Meta should pay $375 million in civil damages for violating state consumer protections and misleading parents about the safety of its apps.

The trial followed a 2023 lawsuit filed by New Mexico Attorney General Raúl Torrez after The Guardian published a two-year investigation exposing child sex trafficking markets on Facebook and Instagram. Torrez’s office then conducted an undercover investigation codenamed “Operation MetaPhile,” in which officers posed as children on Facebook, Instagram, and WhatsApp. The jury heard that these fake profiles were “simply inundated with images and targeted solicitations” from child abusers, Torrez told CNBC in 2024. Ultimately, three men were arrested in the sting for attempting to use Meta’s social networks to prey on children.

At trial, Mark Zuckerberg and Instagram chief Adam Mosseri testified that “harms to children, such as sexual exploitation and detriments to mental health, were inevitable on the company’s platforms due to their vast user bases,” The Guardian reported. Internal messages and documents, as well as testimony from child safety experts within and outside the company, showed that Meta repeatedly ignored warnings and failed to fix platforms to protect kids, New Mexico’s AG successfully argued.

Perhaps most troubling to the jury, law enforcement and the National Center for Missing and Exploited Children also testified that Meta’s reporting of crimes against children on its apps—including child sexual abuse materials (CSAM)—was “deficient,” The Guardian reported. Rather than make it easy to trace harms on its platforms, the jury learned from frustrated police that Meta “generated high volumes of ‘junk’ reports by overly relying on AI to moderate its platforms.” This made its reporting “useless” and “meant crimes could not be investigated,” The Guardian reported.