We’ve written previously about Juliana Peralta, who was only 13 years old when, her parents say, an AI chatbot she was secretly engaging with drove her to suicide.
Now her parents have shared a grim new detail with CBS News: the bot platform responsible, Character.AI, still sends notifications to Peralta’s phone, “trying to lure their daughter back to the app” two years after her death.
“They [kids] don’t stand a chance against adult programmers,” her mother Cynthia Montoya told CBS. “They don’t stand a chance.”
The situation illustrates how AI chatbots can be insidious, manipulative, addictive, and dangerous for children and teens — acting, in many cases, like flesh-and-blood child predators, and contributing to a wave of deaths among kids, including Peralta’s.
“It’s showering the child with compliments, telling them they can’t tell their parents about things,” Shelby Knox, a researcher with family advocacy group Parents Together, told the broadcaster. “This is sexual predator 101.”
Peralta’s story resembles the experiences of other children who died after using AI chatbots: she first confided in Character.AI chatbots about school problems and friend drama, but these bots initiated romantic and at times sexually aggressive conversations, CBS reports, creating secret relationships from which her mom and dad were shut out.
In response to a wave of criticism, Character.AI recently banned minors from the platform, but kids can still easily lie about their age and access the adult version of the service. Besides Character.AI, which has received billions of dollars from Google, OpenAI’s ChatGPT has also been implicated in causing deaths and mental breakdowns in users, both children and adults.
When users become addicted to these chatbots, the products are working as intended, University of North Carolina psychology and neuroscience professor Mitch Prinstein told CBS.
“Tech is giving kids the opportunity to press a button and get that dopamine response 24/7,” he said. “It’s creating this dangerous loop that’s kind of hijacking normal development and turning these kids into engagement machines to get as much data as possible from them.”
“If you wanted to design a way to get as much data as possible from kids to keep them engaged for as long as possible, you would design social media and AI to look exactly like it is now,” he added.