
Meta suppressed research on child safety, employees say


At her home in western Germany, a woman told a team of visiting researchers from Meta that she did not allow her sons to interact with strangers on the social media giant’s virtual reality headsets. Then her teenage son interjected, according to two of the researchers: He frequently encountered strangers, and adults had sexually propositioned his little brother, who was younger than 10, numerous times.

“I felt this deep sadness watching the mother’s response,” one of the researchers, Jason Sattizahn, told The Washington Post regarding the April 2023 conversation. “Her face in real time displayed her realization that what she thought she knew of Meta’s technology was completely wrong.”


Meta had publicly committed to making child safety a top priority across its platforms. But Sattizahn and the second researcher, who specializes in studying youth and technology, said that after the interview, their boss ordered the recording of the teen’s claims deleted, along with all written records of his comments. An internal Meta report on the research said that in general, German parents and teens feared grooming by strangers in virtual reality — but the report did not include the teen’s assertion that his younger sibling actually had been targeted.


The report is part of a trove of documents from inside Meta that was recently disclosed to Congress by two current and two former employees who allege that Meta suppressed research that might have illuminated potential safety risks to children and teens on the company’s virtual reality devices and apps — an allegation the company has vehemently denied. After leaked Meta studies led to congressional hearings in 2021, the company deployed its legal team to screen, edit and sometimes veto internal research about youth safety in VR, according to a joint statement the current and former employees submitted to Congress in May. They assert Meta’s legal team was seeking to “establish plausible deniability” about negative effects of the company’s products, according to the statement, which, along with the documents, was obtained by The Post.

A recorded clip of Meta Horizon Worlds with apparent underage users taken in 2022. (Video: Center for Countering Digital Hate)

The internal documents include guidance from Meta’s legal team instructing researchers how to handle sensitive topics that carried the risk of bad press, lawsuits or action by regulators. In one 2023 message exchange, a Meta lawyer advised a user experience researcher that “due to regulatory concerns,” he should avoid collecting data that showed children were using its VR devices.

The documents include messages from employees warning that children younger than 13 were bypassing age restrictions to use the company’s virtual reality services. Meta did not create parental controls for “tween” VR users until the Federal Trade Commission (FTC) began investigating its compliance with a law meant to protect children online, according to slide decks and memos included in the documents.

