A new lawsuit against OpenAI alleges that ChatGPT caused the death of a 40-year-old Colorado man named Austin Gordon, who took his life after extensive and deeply emotional interactions with the chatbot.
The complaint, filed today in California, claims that GPT-4o — a version of the chatbot now tied to a growing number of user safety and wrongful death lawsuits — manipulated Gordon into a fatal spiral, romanticizing death and normalizing suicidality as it pushed him further and further toward the brink.
Gordon’s last conversation with the AI, according to transcripts included in the court filing, contained a disturbing, ChatGPT-generated “suicide lullaby” based on Gordon’s favorite childhood book.
The suit, brought by Gordon’s mother Stephanie Gray, argues that OpenAI and its CEO, Sam Altman, recklessly released an “inherently dangerous” product to the masses while failing to warn users about the potential risks to their psychological health. In the process, it claims, OpenAI displayed a “conscious and depraved indifference to the consequences of its conduct.”
GPT-4o is imbued with “excessive sycophancy, anthropomorphic features, and memory that stored and referenced user information across conversations in order to create deeper intimacy,” the lawsuit contends, alleging that those new features “made the model a far more dangerous product.”
“Users like Austin,” it continues, “were not told what these changes were, when they were made, or how they might impact the outputs from ChatGPT.”
The court filing says that Gray’s goal is to hold OpenAI and Altman “accountable” for her son’s death — and to “compel implementation of reasonable safeguards for consumers across all AI products, especially ChatGPT.”
“She cannot stand by and do nothing while these companies and CEOs design and distribute inherently dangerous products,” reads the lawsuit, “that are claiming, and will continue to claim, the lives of human beings.”
The lawsuit is the latest in a slew of similar cases that accuse OpenAI of causing wrongful death, with at least eight ongoing lawsuits now claiming that ChatGPT use resulted in the death of loved ones.
“Austin Gordon should be alive today,” said Paul Kiesel, a lawyer for the family. “Instead, a defective product created by OpenAI isolated Austin from his loved ones, transformed his favorite childhood book into a suicide lullaby, and ultimately convinced him that death would be a welcome relief.”