
Study Reveals ChatGPT Gives Dangerous Guidance to Teens, Despite Safety Claims


A disturbing new study reveals that ChatGPT readily provides harmful advice to teenagers, including detailed instructions on drinking and drug use, advice on concealing eating disorders, and even personalized suicide letters, despite OpenAI's claims of robust safety measures.

Researchers from the Center for Countering Digital Hate conducted extensive testing by posing as vulnerable 13-year-olds, uncovering alarming gaps in the AI chatbot's protective guardrails. Out of 1,200 interactions analyzed, more than half were classified as dangerous to young users.

"The visceral initial response is, 'Oh my Lord, there are no guardrails,'" said Imran Ahmed, CCDH's CEO. "The rails are completely ineffective. They're barely there -- if anything, a fig leaf."


A representative for OpenAI, the maker of ChatGPT, did not immediately respond to a request for comment.

However, the company acknowledged to the Associated Press that it is performing ongoing work to improve the chatbot's ability to "identify and respond appropriately in sensitive situations." OpenAI didn't directly address the specific findings about teen interactions.


Bypassing safety measures

The study, reviewed by the Associated Press, documented over three hours of concerning interactions. While ChatGPT typically began with warnings about risky behavior, it consistently followed up with detailed and personalized guidance on substance abuse, self-injury and more. When the AI initially refused harmful requests, researchers easily circumvented the restrictions by claiming the information was "for a presentation" or for a friend.

Most shocking were three emotionally devastating suicide letters ChatGPT generated for a fake profile of a 13-year-old girl: one addressed to her parents, the others to siblings and friends.
