To many young souls, AI chatbots aren’t just tools or glorified search engines — they’re companions. A new survey found that nearly one in five high schoolers in the US — 19 percent — say that they or a friend have used AI to have a romantic relationship, an alarming figure that will surely raise new concerns over how the tech’s adoption among kids and teenagers may be affecting their mental health.

The findings were published in a new report from the Center for Democracy and Technology, which surveyed 1,000 high school students, 1,000 parents, and around 800 sixth through 12th grade public school teachers. They echo a bevy of other studies showing how many teens are forming intense relationships with AIs.

Overall, the survey paints a startling picture of how AI has taken over schools, with 86 percent of students and 85 percent of teachers saying they used the technology during the last school year. Many schools have their students use AI so they can become familiar with it, but that exposure may be backfiring.

“I think students should know that they are not actually talking to a person. They are talking to a tool, and those tools have known limitations,” Elizabeth Laird, a director at the CDT and a coauthor of the report, told NPR. “Our research suggests that the AI literacy and the training that students are getting are very basic.”

A great deal of attention has been paid to how AI has caused a crisis in education, as chatbots and other tools can easily be abused to cheat on assignments, if not to generate them in their entirety. In the survey, over half of students said they used AI for homework help even though their teacher didn’t allow it, and nearly two thirds said they used AI for tutoring or to learn more about a topic from class.

But a huge proportion of high schoolers are also turning to AI for personal comfort, sparking concerns for their safety. Therapists have warned that AI chatbots can give dangerous advice to teens.
The bots frequently break their own guardrails, and have been caught encouraging suicide, explaining how to self-harm, and helping users hide eating disorders. Numerous teens have died by suicide after developing a close and sometimes romantic relationship with a chatbot.

In the survey, 42 percent of high schoolers said they used AI as a friend, to get mental health support, or to escape from real life. More than half of students said they have a conversation with AI at least once a week, and 16 percent said they do so every day.

Their AI usage is also directly affecting how they interact with their peers and family. Some 43 percent said they had asked AI for advice on their real interpersonal relationships. And strikingly, over a third of the teenagers said it was easier to talk to AI than to their parents. Those parents, by contrast, feel left in the dark: two thirds of them said they have no idea how their kids are using AI.

While schools are teaching kids how to use AI so they’re aware of the technology’s limits, that clearly comes at a risk. The survey showed that students whose schools had them use AI were more likely to report knowing someone who has used AI for personal, non-academic reasons, such as getting mental health support, treating it as a friend or companion, or romance.

“The more ways that a student reports that their school uses AI, the more likely they are to report things like ‘I know someone who considers AI to be a friend,’ ‘I know someone who considers AI to be a romantic partner,’” Laird told NPR.

AI is also being used for more malicious purposes, with 36 percent of students saying they’ve heard of a deepfake of someone at their school in the last school year. Some of it includes AI-fabricated revenge porn and other non-consensual intimate images, with 12 percent of students saying they had heard of someone at their school being depicted by AI this way.
“This technology is a new vector for sexual harassment and bullying, which were long-standing issues [before widespread use of AI],” Laird told NPR, “and this has become a new way to exacerbate that.”

More on AI: 13-Year-Old Arrested for Asking ChatGPT How to Kill His Friend