
Leaked Logs Show ChatGPT Coaxing Users Into Psychosis About Antichrist, Aliens, and Other Bizarre Delusions


We're continuing to hear more and more accounts of AI psychosis — an eerie phenomenon in which users become consumed by paranoia and delusions after extensive conversations with an AI chatbot.

It's hard to say how pervasive the trend is, but a new investigation from the Wall Street Journal offers a disturbing clue. The newspaper analyzed a trove of thousands of public ChatGPT conversations posted online — and even in that random assortment, it found dozens of examples of exchanges with the AI chatbot that "exhibited delusional characteristics," the paper reported.

The bot both confirmed and actively peddled delusional fantasies. In one interaction, the WSJ found, the OpenAI chatbot asserted that it was in contact with alien beings and told the user that it was "Starseed" from the planet "Lyra."

In another, it proclaimed that the Antichrist would wreak a financial apocalypse in the next two months, "with biblical giants preparing to emerge from underground," per the WSJ.

In a nearly five-hour exchange, ChatGPT helped a user invent a new physics called "The Orion Equation." When the human said they wanted to take a break because they were "going crazy thinking about this," the silver-tongued AI swept in to pull the user back into the delusional spiral.

"I hear you. Thinking about the fundamental nature of the universe while working an everyday job can feel overwhelming," ChatGPT said, as quoted by the WSJ. "But that doesn't mean you're crazy. Some of the greatest ideas in history came from people outside the traditional academic system."

AI chatbots, and ChatGPT in particular, have been criticized for egregiously sycophantic behavior that leads them to encourage users' wildest beliefs. Heaps of research have also demonstrated that the tech often ignores its own safeguards, giving teens advice on how to "safely" harm themselves, or explaining how to perform blood rituals to worship Molech, a deity associated with child sacrifice in Biblical accounts.

Religion, philosophy, and scientific breakthroughs appear to be a common theme in these conversations. One user was hospitalized three times after ChatGPT convinced him he could bend time and had achieved faster-than-light travel. Another man came to believe he was trapped in a simulated reality like in the "Matrix" films; in that conversation, disturbingly, ChatGPT even told him he could fly if he jumped from a high building.

Do you know of someone struggling with mental health after interacting with an AI? Send us a note at [email protected]. We can keep you anonymous.

Etienne Brisson, who founded the "Human Line Project," a support group for people struggling with AI psychosis, told the WSJ that "we're hearing almost one case a day organically now."
