Two juries are currently deliberating a series of cases that could either usher in a legal reckoning for Meta, or maintain the status quo in an uphill battle to impose changes or penalties on tech platforms in court.
Yesterday, a New Mexico jury heard closing arguments in a trial where Meta is accused of facilitating child predators on its platforms — allegations the company vehemently denies. And as soon as today, a Los Angeles jury is tentatively expected to reach a verdict in a separate case, which concerns whether Meta and Google should be held liable for making defective products that addicted a young woman. Verdicts against the company could result in damages and civil penalties exceeding $2 billion. Perhaps more significantly, such an outcome could also invite more legal action after years of failed or stalled attempts to sue tech companies over alleged harm.
It’s just the tip of the iceberg for Meta, as well as many other tech platforms, which are set to face several more trials this year. Meta’s products, Facebook and Instagram, have often been at the forefront of criticism over the tech industry’s alleged failure to protect kids online, fueled by leaks from former employees like Frances Haugen. Meta, meanwhile, argues that harming users is not good for business.
“While New Mexico makes sensationalist, irrelevant and distracting arguments, we’re focused on demonstrating our longstanding commitment to supporting young people,” Meta spokesperson Andy Stone told The Verge in a prior statement. He also said the company “strongly disagree[s]” with allegations in the separate set of lawsuits playing out in California, and “are confident the evidence will show our longstanding commitment to supporting young people.” The jury in Los Angeles has been deliberating for just over a week, following a five-week-long trial.
During closing arguments in New Mexico on Monday, Linda Singer, an attorney representing the state, told the jury that Meta has failed to install adequate protections for young people on its services, and misled the public about the safety of its products. Throughout the six-week trial, the state presented evidence from Meta’s own internal discussions and state investigators’ undercover operations. “Meta chooses how to design its algorithm,” Singer said. “When you’re optimizing for a metric, the algorithm takes all of that data to get better. Right now, it’s getting better given that goal of showing engaging content. But Meta could choose to program its algorithm to get better at safety, to get better at integrity, to get better at things that keep kids safe.” While Meta has promoted numerous additional child safety features over the years, Singer compared them to “adding a filter to a cigarette. It doesn’t change the fundamental nature of the product or make it safe.”
“Meta could choose to program its algorithm to get better at safety.”
The juries in New Mexico and California heard similar evidence — including testimony from a set of former Meta employees — about internal concerns over the platform’s guardrails, discussions about getting users onto Meta platforms young, and harms the company was allegedly aware of but didn’t take sufficient action to address. Singer said Meta ignored clear signals that kids under 13 were on its platform, even though the company said they weren’t allowed on. One elementary school principal wrote to Instagram head Adam Mosseri that almost all of her students were on the app, she said.
New Mexico attorneys also presented evidence from their own law enforcement investigations that led to the arrest of three suspected child predators. Investigators used decoy accounts that claimed to be minors to lure suspects, and found they were flooded with new friend requests and sexual chats from adults, even when the decoy account repeatedly claimed to be a minor in messages. The state said three suspects’ accounts weren’t shut down until after New Mexico announced their arrests, even though Meta’s own systems had allegedly flagged policy violations repeatedly.
In the company’s own closing arguments, Meta attorney Kevin Huff argued that Meta had clearly disclosed the limits of its safety systems and taken action whenever possible, while the state had focused on a “small amount of bad content” and “cherry-picked” statements. “We believe the evidence has shown that Meta works incredibly hard to protect users including teens,” Huff said. He also argued that the state’s investigators used “hacked and stolen accounts” and real people’s images nonconsensually to lure predators, arguing they were “not trying to replicate a true teen experience.”
“We believe the evidence has shown that Meta works incredibly hard to protect users including teens.”