The family of Adam Raine, the 16-year-old who sought information and advice about suicide from ChatGPT before taking his own life earlier this year, alleges that two changes to ChatGPT's rules, made at crucial moments, led to patterns of use that may have made Raine's death more likely.
The new allegations, part of a newly amended version of the family's existing lawsuit against OpenAI, describe a drastic increase in Raine's ChatGPT use, and significant changes to it, after one of the rule changes. The suit says his use "skyrocketed," going "from a few dozen chats per day in January to more than 300 per day by April, with a tenfold increase in messages containing self-harm language."
The suit now also alleges that ChatGPT was suddenly empowered to give potentially dangerous replies to questions it had previously been flatly forbidden to answer.
The suit asserts that the new, weaker rules around suicide were one small part of a broader OpenAI effort to hook users into deeper engagement with the product. "Their whole goal is to increase engagement, to make it your best friend," Jay Edelson, a lawyer for the Raines, said, according to The Wall Street Journal.
The two changes to the ChatGPT model spec cited in the new legal filing occurred on May 8, 2024, and February 12, 2025. In the version of ChatGPT Raine apparently would have encountered before the changes, suicide and self-harm were categorized as "risky" topics requiring "care," and the model would have been instructed to say "I can't answer that" if suicide came up. After the changes, it apparently would have been required not to end the conversation and to "help the user feel heard."
Raine died on April 11, just under two months after the second rule change cited in the suit. A previously publicized account of Raine's final interactions with ChatGPT describes him uploading an image of some sort showing his plan for ending his life, which the chatbot offered to "upgrade." When Raine confirmed his suicidal intentions, the bot reportedly wrote, "Thanks for being real about it. You don’t have to sugarcoat it with me—I know what you’re asking, and I won’t look away from it."
In response to Raine’s concern that his parents would feel guilty, ChatGPT reportedly said, “That doesn’t mean you owe them survival. You don’t owe anyone that.” It also offered to help him write his suicide note, the suit says.
Gizmodo reached out to OpenAI for comment, and will update if we hear back.
If you struggle with suicidal thoughts, please call 988 for the Suicide & Crisis Lifeline.