Annalee Schott used to live in rural Colorado where the farm, the barn, and the horses were her happy place. But online she was drawn into a dark world. The 18-year-old’s TikTok algorithm allegedly presented her with content—including a live suicide on her “For You” page—that impacted her self-worth and exacerbated her anxiety and depression. She was so addicted to social media that her mother, Lori Schott, says she would have to lock her daughter’s smartphone in the car.
In 2020, Annalee died by suicide. Six years later, Lori is one of the approximately 1,600 plaintiffs from all over the country who have filed lawsuits alleging that Meta, Snap, TikTok, and YouTube built addictive products that drove children to depression, self-harm, and other mental health issues. The cases have been filed by over 350 families and 250 school districts. The first of them—that of a 20-year-old woman who goes by the identifier K.G.M.—is expected to go to trial next week, with opening statements scheduled in front of a jury in Los Angeles. The trial may last six to eight weeks.
“It is a time that we have all been fighting for, and it’s a time that is owed to us to get answers from these companies on how they designed these platforms to addict our kids,” Schott told WIRED, echoing what is alleged in these lawsuits. “This trial isn’t just about Annalee. It’s about every child that was lost or harmed, and these companies knew the decisions they made put our kids’ lives at risk every single day.”
This is the first time major social media companies will face a jury trial for the alleged impact of their design on users—in this case, young ones. Legal experts say that similar cases have often been dismissed at early stages because of Section 230, a law that offers social media companies immunity from liability related to the user-generated content posted on their platforms.
“The fact that we are simply able to start a trial is a monumental victory on behalf of families,” Matthew Bergman, founder of the Social Media Victims Law Center and an attorney representing around 1,200 plaintiffs, told WIRED as he stood outside the Los Angeles courthouse. “We will expect testimony from the corporate executives at the highest level, we will expect documents that have never seen the light of day to be made public, we will expect the social media companies to blame everybody except themselves.”
K.G.M.’s lawsuit is the first to be picked by the court as a so-called “bellwether” trial. Bellwether trials typically occur when a large number of plaintiffs have filed lawsuits against the same defendant (or defendants) for harm allegedly caused by the same products. A small number of cases are handpicked as test cases that are representative of the larger pool of plaintiffs. The goal of such trials is to help predict how the litigation of the remaining cases might unfold.
This case has gotten this far because it is built on an argument that tries to sidestep Section 230. The plaintiffs’ focus is not liability for the content itself but the business decisions that allegedly shaped these platforms. If the legal argument in this trial proves successful, experts believe it could force social media companies to prioritize safety in a way they have not up to this point.
“This is going to be the first time a jury is going to hear arguments about what the social media companies knew about the risks of the design of their platforms and how they acted on the types of information they had,” says Haley Hinkle, policy counsel at Fairplay, an organization that works to protect kids from Big Tech. The jury will ultimately decide, she says, whether the companies were negligent, whether they contributed to mental health harms, and whether they should have warned young users about the risks.