As globe-spanning megacorps go, Meta is making the rest of them look like amateurs.
On Friday, a brief filed as part of an unprecedented lawsuit against four social media giants was made available to the public. Though TikTok, Google, and Snapchat are also implicated, the allegations against Meta are by far the most detailed, containing some of the most damaging charges ever leveled at a social media company.
The filing lists over 1,800 plaintiffs, ranging from parents and school boards to state attorneys general, who all allege that Meta engaged in a “broad pattern of deceit” to hide serious harms the company knowingly unleashed on underage users.
There are seven key allegations, each more heinous than the last. In short, the brief alleges that Meta intentionally designed youth safety features to be ineffective, or else completely ignored underage users’ wellbeing in order to prioritize teen engagement, a key pillar of its outsize profits.
According to Time, which filed a motion to unseal the trial records, Instagram’s AI moderation tool deliberately overlooked child sexual abuse and eating disorder content, and the platform lacked an easy way to manually report such abuse, unlike minor issues such as spam. Particularly damning is the allegation that Instagram provided a ludicrous 17-strike system for accounts caught participating in the “trafficking of humans for sex.”
The court brief doesn’t mince words, arguing that the purposeful quashing — and in some cases outright destruction — of child safety guidelines ran all the way to the top. Per Reuters, which viewed the brief, Meta’s billionaire CEO Mark Zuckerberg was keenly aware of the issues as early as 2017, choosing to focus on more trivial things instead.
In a text message recovered by the plaintiffs’ attorneys, for instance, Zuckerberg allegedly said child safety wasn’t his top concern “when I have a number of other areas I’m more focused on, like building the metaverse.” (Longtime Meta-watchers will remember the metaverse as Zuckerberg’s failed virtual reality platform, which has cost the company over $46 billion since 2021.)
Reuters notes that as Zuckerberg forged ahead on the metaverse, he actively shot down pleas from Meta’s then-head of global public policy, Nick Clegg, to allocate more resources to child safety.
In some cases, the brief alleges, Zuckerberg didn’t just ignore child safety, but actively circumvented it. For example, after a 2018 internal study found that 40 percent of American children aged 9-12 used Instagram daily — a violation of the platform’s minimum age policy of 13 — the CEO is said to have directed the company to deliberately target preteens anyway.
At this point, Meta is alleged to have begun using location data to push notifications to students in schools, likely trying to boost underage engagement while class was in session. In the background, the company’s research teams began studying the psychology of “tweens” and developed proposals for new features aimed at “users as young as 5-10,” according to Time.