Tech News

Character.AI to Teens: Sorry, No More Open-Ended Chats With AI Companions


The AI companion chatbot company Character.AI will soon have an adults-only policy for open-ended conversations with AI characters. Teens who use the app will start facing restrictions: They'll still be able to interact with characters through generated videos and other roleplaying formats, but they won't be able to chat freely with the app's different personalities.

Open-ended chats have been a cornerstone of AI, particularly since ChatGPT launched three years ago. The novelty of having a live back-and-forth with a computer that responds directly to what you say led to the popularity of platforms like Character.AI.

It's also been a driver of concerns, as those conversations can take AI models in unpredictable directions, especially if teens use them to discuss mental health concerns or other sensitive issues. There are also concerns about AI chat addiction and its impact on social behavior.

Character.AI is a bit different from other chatbot apps. Many people use it for interactive storytelling, holding creative conversations with customizable characters, including ones based on real celebrities or historical figures.

Karandeep Anand, Character.AI's CEO, said the company believes it can still provide the interactive fun that teens expect from the platform without the safety hazards of open-ended chats. He said the move is about doing more than the bare minimum to keep users safe.

"There's a better way to serve teen users," Anand told CNET ahead of Wednesday's announcement. "It doesn't have to look like a chatbot."

In addition to prohibiting open-ended conversations for those under 18, Character.AI is adding new age verification measures and creating a nonprofit AI Safety Lab.


What's changing about Character.AI?

AI entertainment has proven to be one of the more fraught uses of large language models. Concerns about how children can be harmed by relationships with AI models have grown significantly this year, with the Federal Trade Commission launching an investigation into several firms, including Character.AI.
