
A bombshell child safety leak changed Meta — for the worse


is a senior policy reporter at The Verge, covering the intersection of Silicon Valley and Capitol Hill. She spent 5 years covering tech policy at CNBC, writing about antitrust, privacy, and content moderation reform.

In 2021, when former Meta employee Frances Haugen blew the whistle on dangers that the company’s platforms posed to kids, Meta realized it needed to change.

“I’m here to tell you today that Meta has changed,” said one of a new set of whistleblowers — former Meta user experience researcher Cayce Savage — before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law, “for the worse.”

Savage and another former Meta researcher, Jason Sattizahn, appeared before the subcommittee on September 9th. Their testimonies built on an account that they and several other former and current employees shared with The Washington Post, which recently detailed allegations that Meta unleashed its legal team on its own researchers to suppress findings that its virtual reality services harmed kids. As Congress struggled to pass tech regulation spurred by Haugen’s revelations, lawmakers contended, Meta has simply learned to hide its problems better.

The former researchers testified that children under 13 are rampant on Meta’s VR social platforms, despite having their access officially restricted. These spaces pose the same dangers as the rest of the internet, including sexual predators, but the immersive nature of VR, the whistleblowers said, could make interactions more potent. “In VR, someone can stand behind your child and whisper in their ear, and your child will feel their presence as though it’s real,” Savage testified. “VR is tracking a user’s real life movements, so assault in VR requires those movements to happen in real life. What happens in virtual reality is reality.”

But Savage and Sattizahn said Meta lawyers discouraged, and even threatened, researchers to keep them from collecting information that would confirm the company had a problem — fearing a paper trail that could create legal liability unless Meta removed a large group of engaged users.

“The research they’re doing is being pruned and manipulated”

In a statement on the Washington Post story, Meta spokesperson Dani Lever said the whistleblowers’ examples were cherry-picked “to fit a predetermined and false narrative” and that the company has “approved nearly 180 Reality Labs-related studies on social issues, including youth safety and well-being.” At the hearing, Sattizahn called this stat a “lie by avoidance,” since “the whole point of this testimony is that the research they’re doing is being pruned and manipulated.”

Haugen’s momentous 2021 report revealed a trove of internal research documents demonstrating that Meta was aware products like Instagram had harmful effects on some teens, including negative body image issues. Rather than adjust its protocols to better protect kids and teens, testified Savage and Sattizahn, Meta learned to stop creating those documents. “It was the wrong lesson,” Sen. Richard Blumenthal (D-CT) said at a press conference ahead of the hearing.

The company created a regime of “legal surveillance,” Sattizahn said, where lawyers would monitor researchers’ work, “limiting the topics, the questions, the methods that you can use before you even collect data.” He testified that Meta executives threatened his job should he not comply and recalled that the company’s lawyers would ask him to delete or stop collecting data about emotional and psychological harm. “Legal’s repeated, explicit statements to me, was that we did not want this data because it was too risky for us to have, because if there was an outside audit, it would be discovered that Meta knew about these harms,” he testified.
