At her home in western Germany, a woman told a team of visiting researchers from Meta that she did not allow her sons to interact with strangers on the social media giant’s virtual reality headsets. Then her teenage son interjected, according to two of the researchers: He frequently encountered strangers, and adults had sexually propositioned his little brother, who was younger than 10, numerous times.
“I felt this deep sadness watching the mother’s response,” one of the researchers, Jason Sattizahn, told The Washington Post regarding the April 2023 conversation. “Her face in real time displayed her realization that what she thought she knew of Meta’s technology was completely wrong.”
Meta had publicly committed to making child safety a top priority across its platforms. But Sattizahn and the second researcher, who specializes in studying youth and technology, said that after the interview, their boss ordered the recording of the teen’s claims deleted, along with all written records of his comments. An internal Meta report on the research said that in general, German parents and teens feared grooming by strangers in virtual reality — but the report did not include the teen’s assertion that his younger sibling actually had been targeted.
The report is part of a trove of documents from inside Meta that was recently disclosed to Congress by two current and two former employees who allege that Meta suppressed research that might have illuminated potential safety risks to children and teens on the company’s virtual reality devices and apps — an allegation the company has vehemently denied. After leaked Meta studies led to congressional hearings in 2021, the company deployed its legal team to screen, edit and sometimes veto internal research about youth safety in VR, according to a joint statement the current and former employees submitted to Congress in May. They assert Meta’s legal team was seeking to “establish plausible deniability” about negative effects of the company’s products, according to the statement, which, along with the documents, was obtained by The Post.
A recorded clip of Meta Horizon Worlds with apparent underage users taken in 2022. (Video: Center for Countering Digital Hate)
The internal documents include guidance from Meta’s legal team instructing researchers how to handle sensitive topics that carried the risk of bad press, lawsuits or action by regulators. In one 2023 message exchange, a Meta lawyer advised a user experience researcher that “due to regulatory concerns,” he should avoid collecting data that showed children were using its VR devices.
The documents include messages from employees warning that children younger than 13 were bypassing age restrictions to use the company’s virtual reality services. Meta did not create parental controls for “tween” VR users until the Federal Trade Commission (FTC) began investigating its compliance with a law meant to protect children online, according to slide decks and memos included in the documents.
The company offers interpretations of many of the documents that differ from those of the current and former employees. And some of the group's allegations rest on their accounts of events and interactions, not on the Meta documents submitted to Congress.
In a statement to The Post, Meta spokeswoman Dani Lever said that the allegation that Meta curtailed research is based on a few examples “stitched together to fit a predetermined and false narrative” and that the company has had no blanket prohibition on research about people under 13. Meta has produced research on youth safety in virtual reality and the company consulted children and their parents as it created the tween accounts, she said.
Lever added that Meta’s virtual reality devices have long had safety features including the ability to block problematic users, and over time the company has used research to develop additional protections for young people, including parental supervision measures and default settings that allow teens to communicate only with people they know.
“We stand by our research team’s excellent work and are dismayed by these mischaracterizations of the team’s efforts,” Lever said.
Meta did not directly dispute or confirm the events in Germany described by the researchers, but said such a deletion would have been meant to ensure compliance with a U.S. federal law governing the handling of children’s personal data and with the General Data Protection Regulation, a landmark European privacy law that broadly prohibits companies from collecting personal information from anyone without consent. “Global privacy regulations make clear that if information from minors under 13 years of age is collected without verifiable parental or guardian consent, it has to be deleted,” Lever’s statement said.
Sattizahn said the mother had given consent for the collection of information about her younger son in a contract she signed before the interview. She participated in the conversation about his experiences and asked to return to the subject later in the interview, he said. He added that in his experience, in other surveys subject to the law, Meta did not require researchers to erase information when interviewees shared information about other people.
Experts have long warned that virtual reality can endanger children by potentially exposing them to direct, real-time contact with adult predators. In VR, a user wears a headset that allows them to block out their real environment and fully immerse themselves in a digital world where they interact with other users.
Meta has for years faced allegations from activists and authorities — including state attorneys general, members of Congress and President Joe Biden’s surgeon general — that its social media platforms keep young people hooked while compromising their privacy and mental health. Members of Congress have more recently expressed youth safety concerns about the company’s virtual reality services.
The documents given to Congress include thousands of pages of internal messages, memos and presentations from the past decade that relate to Meta’s virtual reality services. Sattizahn is part of the group that submitted them, as is the youth researcher, who is his domestic partner. She spoke to The Post on the condition of anonymity because she still works in the technology industry and says she fears retribution.
In a sworn affidavit included in the documents, Sattizahn said he was fired in April last year after disputes with managers about restrictions on research. He told The Post that he does not currently have a job. The youth researcher said she quit in 2023 after four years at the company, in part because she felt unable to continue her work ethically.
Two other members of the group, also researchers, still work at Meta. Their lawyers said they redacted their names in the documents submitted to Congress to shield them from retaliation.
The four researchers, along with two others who also submitted affidavits to Congress about other Meta-related issues, are being represented by the legal nonprofit Whistleblower Aid. The organization also worked with former Meta product manager Frances Haugen, who leaked the internal company studies in 2021 that set off congressional scrutiny and a public relations crisis for the Silicon Valley corporation.
A Senate Judiciary subcommittee is scheduled to discuss the group’s allegations at a hearing on Tuesday. The subcommittee examines laws and regulations around online safety and seeks to hold tech platforms accountable. Last week, the chairman of the Senate Judiciary Committee sent Meta CEO Mark Zuckerberg a letter demanding information about the presence of minors in Horizon Worlds, the company’s flagship VR game.
For the past decade, Meta has spent billions to build out the virtual "metaverse," which Zuckerberg has said will transform human communication. In his futuristic vision, people will maneuver in and out of virtual and physical spaces for work, entertainment and socializing. Meta had sold about 20 million VR headsets by 2023, according to internal data obtained by The Verge, making the company the world's top seller of such devices. But earnings reports show that Reality Labs, the company's virtual reality division, has lost more than $60 billion over the past five years.
Much as a smartphone can run apps developed by various companies, Meta’s headsets can be used for exercise classes, traditional video games and social apps created by Meta and others. In Horizon Worlds, users pick an avatar — their digital persona — and then travel to virtual spaces such as a live concert, a bar, a mock courtroom or church. Users talk to one another using their own voices.
Some academics who study virtual reality have said that people tend to feel a strong sense of embodiment and connection to their avatars, making virtual bullying or sexual assault elicit feelings similar to real-life attacks. Some academics and advocates for online child safety also have said that younger users are not as prepared as adults to respond to potentially dangerous situations that might arise from online relationships.
In 2021, a 48-year-old child sex offender in Michigan allegedly asked a 9-year-old girl for explicit photographs and said she should come to his home after they met on another company’s social VR application through Meta headsets, according to federal court records. After pleading guilty to sexually exploiting a different child, the man was sentenced to 35 years in prison.
In 2022, a 25-year-old man took a 13-year-old girl from her family’s home in Utah after they, too, met and interacted through a chat feature on Meta headsets. The girl told police that the man touched her and asked her for sex. The man pleaded guilty in federal court to transporting a minor with intent to engage in criminal sexual activity and is serving a 10-year prison sentence.
Meta said it prohibits behavior that endangers children on its platforms and works closely with law enforcement to protect young people online. Meta maintains control over which apps are available on its headsets, and the company imposes safety requirements for outside developers that create programs for its devices, such as the app in the 2021 case. The company declined to comment on the specific protections offered by third-party apps on its devices.
‘The right thing to do’
In 2014, having dominated social media, Zuckerberg placed his next strategic bet: building immersive digital environments that let people feel a sense of presence in another place. Meta — then still known as Facebook — entered the then-nascent virtual reality market with the $2 billion acquisition of VR headset maker Oculus VR.
Meta’s headset packaging and terms of service warned that the product was meant only for people 13 and older. But, as on many digital platforms at the time, including Facebook, there was little to stop younger children from signing up and lying about their age.
At least as early as April 2017, Meta staff saw the rules being broken, according to one document in the trove. “We have a child problem and it’s probably time to talk about it,” read the title of a post on an employee message board to discuss VR, a copy shows. The name of the employee who wrote the post was redacted in the copy submitted to Congress, as were many other names in the trove.
The employee estimated that in some virtual rooms as many as 80 to 90 percent of users were underage. Based on the sound of their voices, the clearest indication of VR users’ true ages, the employee recounted that “three young kids (6? 7?) were chatting with a much older man who was asking them where they lived.”
The employee asked whether the company was doing all it could to comply with the Children’s Online Privacy Protection Act (COPPA), a U.S. law that bars the collection of personal information from children under 13 without parental consent. Meta collects data from users of its products.
“This is the kind of thing that eventually makes headlines — in a really bad way,” the employee wrote.
Among the many replies to the post, one said that the company was working to develop a safety website and give parents information about VR, among other resources.
In August 2021, an internal company study on virtual public spaces found “the prevalence of kids” was one of the most frequent complaints, with some users “uncomfortable that there are not more adults to interact with,” according to a copy that was part of the trove.
Meta told The Post that the headsets were meant only for people 13 and older and emphasized that the product packaging made that clear. After the company launched Horizon Worlds in 2021, it began offering users the ability to mute, block or report others.
The company has previously said it also set teens’ accounts to private by default and limited their ability to see mature content.
“As more people started using these devices and Meta launched its own games and apps, we added many more protections, especially for young people,” said Lever, the company spokeswoman.
In fall 2021, the company was thrown into crisis when Haugen released her cache of internal studies and other documents to media organizations, revealing among other things how the company was trying to entice younger users. The leak also revealed the company knew about some of the negative effects of its social networks, including that Instagram was harmful for some teenage girls.
Zuckerberg complained that an onslaught of bad press misled the public about the research, arguing it downplayed the positive effects of his company's products. Yet he vowed that tough media coverage would not deter the company from candidly examining its impact on the world.
“We’re going to keep doing research because it’s the right thing to do,” Zuckerberg wrote in an October 2021 Facebook post. He said he was resisting tendencies by other companies to avoid looking too closely at their products “in case you find something that could be held against you.”
Six weeks later, however, the company — newly renamed Meta — deployed in-house lawyers to caution researchers in Reality Labs about examining “sensitive” topics carrying publicity, policy and legal risks, including children, gender, race, elections and harassment, according to documents in the trove.
In a November 2021 slide presentation, Meta lawyers advised Reality Labs researchers that there were two ways they could “mitigate the risk” of conducting sensitive research, a copy of the slide deck shows. One way was to “conduct highly-sensitive research under attorney-client privilege,” which shields communications between lawyers and their clients from “adverse parties,” the presentation noted. To ensure such protection, researchers should copy lawyers on all emails related to the highly sensitive studies, have all findings reviewed by lawyers and share them only on a “need-to-know” basis, the presentation advised.
The second way to limit risk was for researchers to "be mindful" of how they framed studies and communicated findings, the presentation said. They should not use terms such as "not compliant" or "illegal," or say that something "violates" a specific law, the presentation said, advising them to leave such legal conclusions to lawyers.
Sattizahn, who joined Meta in 2018, said that presentation signaled a new era in which lawyers were much more deeply involved in research than before.
Meta said lawyers have long partnered with research teams at the company and that there is nothing controversial about Meta lawyers advising researchers on attorney-client privilege or explaining that some work might require legal advice. Changes instituted after Haugen’s disclosures were intended to make sure that research is high-quality and accurate, the company said.
A ‘spicy’ topic
After the Haugen congressional hearings, Meta employees continued to raise alarms internally that children under 13 were using the company’s VR products, according to the document trove. In a January 2022 post to a private internal message board, a Meta employee flagged the presence of children in Horizon Worlds, which was at the time supposed to be used only by adults 18 and over. The employee wrote that an analysis of app reviews indicated many were being driven off the app because of child users.
A company official wrote in response that if employees planned to share the findings more widely across the company, they should "avoid saying 'kids' like we know for sure they are kids — instead use 'alleged Youth' or 'alleged minors with young sounding voices who may be underage.'"
Meta told The Post that the company does not view app reviews as a reliable source of information about users’ ages and that it has better ways to determine how old users are.
The youth researcher joined the VR research team from another Meta division in early 2022. She told The Post that she soon suggested to a lawyer assigned to work with the team that she assess the harms that children and young people face in VR, and that the lawyer rejected that idea. The youth researcher said she later partnered with a product operations team — which did not face the same restrictions as researchers — to study the issue, hoping to strengthen safeguards for children in virtual reality.
The youth researcher said the team studied publicly available app reviews and VR Facebook groups, where they found hundreds of allegations of inappropriate behavior on Meta products, including grooming — building a rapport with a child to enable abuse — along with bullying and simulated sexual acts. She said they compiled those allegations in a 59-page document that they shared with a small group of colleagues in an effort to spread awareness. Meta did not respond to questions on that document, which was among those submitted to Congress.
The company disputed that a lawyer would be in a position to approve or reject a study and said that lawyers never edit research results. Lawyers offer advice and suggestions, the company said, but research leaders ultimately determine which studies to pursue and the scope of their projects.
Meta continued to face outside pressure to address questions about children on its platforms.
In March 2022, the FTC sent Meta a previously unreported legal demand to turn over information about the company’s compliance with COPPA, the federal children’s privacy law. Company executives and attorneys hurried to fend off potential legal action, documents show. It was a “really frantic scramble,” the youth researcher said.
Within months, the company started an initiative code-named “Project Salsa.” Sattizahn and the youth researcher said that they didn’t know who chose that name or why, but employees working on the project widely understood it as a reference to the fact that the use of technology by children was a “spicy” topic. Project Salsa was aimed at creating special “tween” headset accounts for children between 10 and 12, which would institute new parental controls to help Meta comply with federal law.
Asked about the rush, Lever said the effort to create tween accounts came “on top of all the protections we had already in place” for teens.
Internally, company officials acknowledged in a slide deck that it was a partial measure that wouldn’t immediately address the fact that children under 10 were also using Meta VR products, documents show. “Because we know there are U10s on the platform,” regulators might find the company was violating federal law, one slide presentation warned.
Meta told The Post that it chose a 10-year-old cutoff so it could make sure it was offering age-appropriate content. The company said it later implemented an initiative that asked headset users to confirm their birth dates. Those who said they were under 10 were removed.
‘Regulatory concerns’
In October 2022, senior executives in Meta's VR division saw firsthand that children were using their products, according to Kelly Stonelake, a former director of product marketing.
Stonelake and the executives were attempting to test Horizon Worlds. As they roamed a 3D amusement park using VR headsets, they struggled to hear one another above the screams of high-pitched voices that sounded like young children, Stonelake told The Post.
To avoid the “distraction of the kids,” the group simply moved its subsequent meetings to “closed” spaces in Horizon Worlds, where they could not be interrupted, Stonelake said. She first shared her account earlier this year as part of a complaint to the FTC alleging that Meta was knowingly allowing underage children on its app. She has also filed a separate lawsuit against Meta alleging sex discrimination and is not part of the group that recently submitted documents to Congress. Her lawsuit is ongoing and Meta has challenged her allegations in court.
As Project Salsa continued that fall of 2022, the youth researcher said she proposed a study of the company’s effectiveness at determining the true ages of VR users. Prior research had found that more than half of teens lied about their ages on the Quest 2 headset, documents show. The researcher planned an international survey to determine which data parents and teens would be comfortable providing to verify their age.
The project was code-named “Project Horton,” for the Dr. Seuss book “Horton Hears a Who!” in which a character tries to protect small people from others who attempt to harm them, according to the youth researcher. She said the project was approved with a $1 million budget.
That December, however, a Meta lawyer said the study needed additional review by senior leaders, according to the youth researcher. Then, on the day before the holiday break, the project was canceled, she said. Her boss cited budget issues, the researcher said.
Lever told The Post that leaders of the Reality Labs research team decided not to move forward with Project Horton because the company was already developing parental control tools and the initiative that asked headset users to confirm their ages by entering their birth date.
Soon after, the lawyer working with the VR research team shared with them a lengthy memo from the legal team describing how researchers should engage with lawyers when studying sensitive matters, including those involving children and teenagers. The January 2023 memo advised that work might need to be conducted under attorney-client privilege if the results might pose “heightened regulatory, legal or media risk.” The memo also directed Reality Labs researchers to carefully consider whether survey questions were likely to generate “sensitive” responses such as “negative experiences or safety concerns” that had caused people to quit a device or game.
“If it is likely that the study may elicit sensitive responses, it may be necessary to reframe your questions to avoid unnecessary collection of sensitive information” or to flag the study for heightened scrutiny by managers or lawyers, said the memo, which was included in the documents submitted to Congress.
Meta declined to make the lawyer available for an interview and she did not respond to a direct request for comment.
For particularly sensitive research, Meta turned to outside vendors, according to the 2023 memo.
In a Jan. 27, 2023, meeting to discuss a study on young Meta users misrepresenting their ages, the lawyer assigned to the VR research team said that one such contractor should emphasize to study participants that they should not share personal stories of harm, according to the youth researcher’s notes from the meeting, which were among the documents sent to Congress. The lawyer also said that before sending a written report about the data to Meta, the contractor should show it to her by sharing their screen on a video call, according to the researcher’s notes.
In March 2023, Meta promoted new companywide requirements for approving research into social issues, documents show.
An internal FAQ explaining the changes said the company’s “culture of openness” must now be balanced against “the risks that naturally stem from conducting and sharing research on sensitive topics and populations.” Meta told The Post that the changes were meant to ensure that research projects are accurate and findings are incorporated into the company’s product decisions.
In April of that year, Meta announced that Horizon Worlds would soon be available to users as young as 13. Two weeks later, the FTC took action. Arguing that Meta had violated COPPA and “put young users at risk” by misleading parents about their control over who their children could chat with on a messaging app, the FTC sought to ban Meta from profiting from data it collects from users under 18, “including through its virtual reality products.”
Meta denied the allegations and fought the action in federal court, arguing that the FTC was exceeding its authority and was biased against the company. The FTC agreed to pause its enforcement efforts while litigation is pending.
The FTC, which is barred by federal law from disclosing corporate secrets and many details of its ongoing investigations, released a heavily redacted “findings of fact” from its inquiry into Meta. A 3½-page section of the 163-page document containing allegations relating to VR headsets was mostly blacked out. A corresponding section in Meta’s 661-page response was also largely redacted.
Neither the FTC nor Meta responded to requests for comment on the agency’s enforcement action or the subsequent litigation.
Amid its regulatory battle with the FTC, the company offered tween accounts for headsets — the objective of Project Salsa — in September 2023.
When users were asked to confirm their age, only 41 percent gave the same date of birth they had previously submitted, according to an analysis that is part of the document trove. “These findings show that many users may be unwilling to provide us with their true DOB,” the analysis said.
Meta told The Post it requires users to verify their age with an ID or credit card if it suspects they are lying, and created a tool to help third-party VR developers understand their users’ ages.
In 2023, the user experience researcher wrote to the lawyer assigned to the VR research team that parents kept mentioning during surveys that their children were using family headsets to access virtual reality, according to the documents submitted to Congress.
“Maybe I can take a look and help you strategize a way to frame it so as to avoid these types of responses,” she replied in a direct message. Because of the FTC investigation, the researcher could not legally delete data that he had already collected, the lawyer wrote.
“In general, the context is that we should avoid collection of research data that indicates that there are U13s present in VR or in VR apps (or U18 currently in the context of Horizon) due to regulatory concerns,” she added.
In his affidavit submitted to Congress, the researcher wrote that he thought he was being told to avoid gathering data that could “implicate the company in future engagements with regulators.”
Meta said that characterization was inaccurate and that the lawyer was trying to help the researcher achieve the approved objective of his study, which did not involve collecting information about users under 13. The company said she gave similar advice for the same reason the following year, when the same user experience researcher came to her with a question about a different project.
The lawyer advised him to shape questions to avoid eliciting data on child users of Meta’s VR technology, according to a copy of their messages.
“I would just phrase in a way to ensure that participants do not volunteer information about users under 13,” the lawyer wrote.
The lawyer also checked with the researcher to see whether he had directly asked adults whether their children used Meta VR devices. The researcher said he had not.
“Okay, that’s great,” she replied.