ZDNET's key takeaways
The FTC is investigating seven tech companies building AI companions.
The probe is exploring safety risks posed to kids and teens.
Many tech companies offer AI companions to boost user engagement.
The Federal Trade Commission (FTC) is investigating the safety risks posed by AI companions to kids and teenagers, the agency announced Thursday.
The federal regulator issued orders to seven tech companies building consumer-facing AI companionship tools -- Alphabet, Instagram, Meta, OpenAI, Snap, xAI, and Character Technologies (the company behind chatbot creation platform Character.ai) -- requiring them to detail how their tools are developed and monetized, how those tools generate responses to human users, and what safety-testing measures are in place to protect underage users.
"The FTC inquiry seeks to understand what steps, if any, companies have taken to evaluate the safety of their chatbots when acting as companions, to limit the products' use by and potential negative effects on children and teens, and to apprise users and parents of the risks associated with the products," the agency wrote in the release.