Still fresh off its recent $375 million jury verdict against Meta, New Mexico attorney general Raúl Torrez’s office began arguing for even greater remedies in the second phase of a landmark trial. On Monday, an attorney for the state, David Ackerman, pressed the court for a $3.7 billion abatement plan that would require Meta to fund programs for mental health providers, law enforcement, and educators. Other requests include changes to Meta’s services — like age verification, a 99 percent detection rate for new child sexual abuse material (CSAM), and no more late-night or school-day notifications for teens in the state.
During opening statements, the state argued that only this kind of sweeping plan could resolve the safety and public health issues Meta poses to New Mexico minors. The plan “recognizes the scope of the public nuisance that Meta has caused,” Ackerman said. Meta, on the other hand, said the AG’s asks are so far-fetched and infeasible that it would have no choice but to leave the state entirely if Judge Bryan Biedscheid forced it to comply with the plan.
Biedscheid indicated he also has some reservations. While he wants to address any identified harms, he said, he is “not the easiest sell on an idea where I would become a one-person legislature, judge, and executive branch enforcer of administrative code provision.” Although he said he was open to learning more during trial, he expressed concern that some of the state’s requests “could amount to some of that overreach.”
Whatever Biedscheid decides could signal how far a judge is willing to go to address alleged social media harms. There are still thousands of other cases waiting to be tried against social media companies on similar grounds, and such rulings could serve as a reference point during settlement talks.
In March, a Santa Fe jury determined that Meta committed 75,000 violations of the state’s Unfair Practices Act, finding it misled users about the safety of its products for teens and engaged in unconscionable trade practices by facilitating child predators on its services. In the second phase of the trial, Biedscheid will determine whether Meta’s actions went beyond harm to individual users and also created a public nuisance for the broader community. He’ll also decide the appropriate relief. That could range from the state’s dramatic roadmap of changes to the more modest tweaks proposed by Meta — which include funding internet crimes training for law enforcement and committing to improving its age assurance models to detect kids under 13.
Because he presided over the first phase of the trial, Biedscheid warned the parties he’s “not a blank slate.” Even under what he called a “fairly restrictive definition of civil penalty calculation,” the jury found Meta’s thousands of violations warranted the maximum penalty of $5,000 each. But he also noted he’s well aware of the First Amendment concerns that may come with certain proposals, and the issues they may run into with Section 230, the law that shields social media companies from being held liable for their users’ speech.
“It cannot be the case that safety features are only implemented when there are trials or when there are Wall Street Journal articles”
In his opening statement, Ackerman, the state attorney, told the judge that Meta doesn’t take action to address its issues until it’s “forced to do so.” “It cannot be the case that safety features are only implemented when there are trials or when there are Wall Street Journal articles,” Ackerman said. He added that the judge has “broad and flexible powers” to address the mental health crisis he said is “fueled and caused by social media.”
The state addressed some common critiques of its proposed solutions. Ackerman told the judge they’re not asking for the court to “impose a specific age verification scheme,” but rather a “menu” of options that, when layered together, can increase effectiveness. Meta’s concern that a 99 percent standard for detecting new CSAM would be impossible to achieve may be mitigated, according to Ackerman, by a court-assigned child safety monitor lowering the standard if they think Meta has done all it reasonably can. And despite the privacy concerns created by getting rid of encryption, Ackerman said, “the risks to minors of encrypted messages far outweigh the privacy concerns for that population.”