Last week, the embattled chatbot platform Character.AI said that it would move to ban minors from conversing with its many thousands of AI companion and roleplay bots. Site users, self-avowed minors and adults alike, have a lot of thoughts.
The policy change comes as the controversial AI company continues to battle multiple lawsuits alleging that interactions with its chatbots caused real-world emotional and physical harm to underage users; multiple teen users have died by suicide following extensive conversations with bots hosted on the platform.
Character.AI now says people under 18 will no longer be allowed to engage in what it refers to as “open-ended” chats, which seemingly means the long-form, unstructured conversations on which the service was built, where users can text and voice call back-and-forth with the site’s anthropomorphic AI-powered chatbot “characters.” Minors won’t be kicked off the site entirely; according to Character.AI, it’s working to create a distinct, presumably much more limited “under-18 experience” that offers teens some access to certain AI-generated content, though specifics are pretty vague.
To enforce the shift, Character.AI says it’ll use automated in-house age verification tools as well as third-party tools to determine whether a user is under 18. By November 25, if the site determines that an account belongs to a minor, they’ll no longer be able to engage in unstructured conversations with the platform’s emotive AI chatbots, according to the company.
Given that unstructured chats with platform bots have long been the company’s core offering, the promise to ban minors from such interactions — even if they’ll still have some access to the site — is a huge move. It was also bound to be controversial with the company’s fanbase, as many users have formed close emotional bonds with various AI characters, with some reporting having used the platform for “comfort” or “therapy.” And though the company has consistently declined to share age data about its users with journalists, it’s understood that a huge chunk of the platform’s user base is currently made up of minors.
The details and possible impacts of the promised transformation have been much debated over on the very active r/CharacterAI subreddit, where users have flocked to post statements like “it is officially over” and “this is INSANE” in response to the news; at the same time, other users are admonishing each other for being hypocritical or overdramatic.
Many of those upset with the change say they’re minors, and have expressed an unsurprising blend of concern, sadness, and anger. What’s more surprising is the breadth of who these young people actually blame for the platform policy shift — from parents who have raised safety concerns, to Character.AI developers, to other teens.
“I very much blame my own fellow teenagers over anything,” reads one comment. “If they’d just interacted with AI normally, then this wouldn’t have happened.”
“I genuinely do not understand what this new update is,” another user wrote. “Do the devs seriously not understand that the majority of their users are likely under 18…?”
Other self-reported minors, though, expressed conflicted feelings, in some cases saying that while they believe Character.AI has had a negative impact on their lives, they and their teen peers now rely on it.