
Support Group Launches for People Suffering "AI Psychosis"


An unknown number of people, in the US and around the world, are being severely impacted by what experts are now calling "AI psychosis": life-altering mental health spirals coinciding with obsessive use of anthropomorphic AI chatbots, primarily OpenAI's ChatGPT.

As we've reported, the consequences of these mental health breakdowns — which have impacted both people with known histories of serious mental illness and those who have none — have sometimes been extreme. People have lost jobs and homes, been involuntarily committed or jailed, and marriages and families have fallen apart. At least two people have died.

There's yet to be a formal diagnosis or definition, let alone a recommended treatment plan. And as psychiatrists and researchers in the worlds of medicine and AI race to understand what's happening, some of the humans whose lives have been upended by these AI crises have crowdsourced a community support group where, together, they're trying to grapple with the confusing real-world impacts of this disturbing technological phenomenon.

The community calls itself "The Spiral," in a nod both to the destructive mental rabbit holes that many chatbot users are falling into and to the irony that "spiral" is one of several words that recur in the transcripts of many users separately experiencing AI delusions.

One of the leaders of the group is Etienne Brisson, a 25-year-old business coach based in Quebec. After a close loved one of Brisson's fell into an episode of ChatGPT-fueled psychosis that required medical intervention, he was surprised to discover a glaring lack of resources — or any real information, outside of disparate anecdotes from around the web — around the mysterious AI phenomenon that had so deeply impacted his loved one and their lives.

He felt the need to do something, he told Futurism — even if he wasn't totally sure what.

"I felt like I had a duty," said Brisson. "It would be worse for me if I just put a blindfold on, and pretended there's nothing that could be done."

"At the beginning, it was just myself," said Brisson, who recounted digging through academic papers, scouring AI and mental health forums on Reddit, and emailing experts. He soon launched a website titled "The Human Line Project," which he outfitted with a Google form where people could anonymously share their experiences with AI psychosis.

"I started getting response one, response two, response three, response four, response five," said Brisson. "I had at some point, like, eight responses, and six of them were suicide or hospitalizations. So that's when I was like, 'Wow, this is really shocking for me.'"

Meanwhile, Brisson was also connecting with Redditors who'd shared their experiences with AI psychosis online.
