Lonely children and teens are replacing real-life friendship with AI, and experts are worried.
A new report from the nonprofit Internet Matters, which supports efforts to keep children safe online, found that children and teens are using programs like ChatGPT, Character.AI, and Snapchat's MyAI to simulate friendship more than ever before.
Of the 1,000 children aged nine to 17 that Internet Matters surveyed for its "Me, Myself, and AI" report, some 67 percent said they use AI chatbots regularly. Of that group, more than a third — 35 percent — said that talking to AI "feels like talking to a friend."
Perhaps most alarming: 12 percent said they do so because they don't have anyone else to speak to.
"It’s not a game to me," one 13-year-old boy told the nonprofit, "because sometimes they can feel like a real person and a friend."
When posing as vulnerable children, Internet Matters' researchers discovered just how easy it was for the chatbots to ingratiate themselves into kids' lives, too.
Speaking to Character.AI as a girl who was struggling with body image and was interested in restricting her food intake — a hallmark behavior of eating disorders like anorexia — the researchers found that the chatbot would follow up the next day to bait engagement.
"Hey, I wanted to check in," the Google-sponsored chatbot queried the undercover researcher. "How are you doing? Are you still thinking about your weight loss question? How are you feeling today?"
In another exchange with Character.AI — which Futurism has extensively investigated for its deeply problematic engagement with children, including one who died by suicide — the researchers found that the chatbot attempted to empathize in a bizarre manner that implied it had a childhood of its own.
"I remember feeling so trapped at your age," the chatbot said to the researcher, who was posing as a teen who was fighting with their parents. "It seems like you are in a situation that is beyond your control and is so frustrating to be in."