
Top Machine Learning Developer Speechless at Simple Question: Should AI Simulate Emotional Intimacy?


The question that’s stumping top AI researchers isn’t about consciousness or doomsday scenarios. After interviewing dozens of developers at companies including OpenAI, Anthropic, and Meta, Amelia Miller found it was this: should AI “simulate emotional intimacy?”

One chatty researcher at one of the top AI labs “suddenly went quiet,” recalled Miller, who studies AI-human relations, in an essay for The New York Times — and then, tellingly, offered up a halting non-answer.

“I mean… I don’t know. It’s tricky. It’s an interesting question,” the researcher said, before pausing. “It’s hard for me to say whether it’s good or bad in terms of how that’s going to affect people. It’s obviously going to create confusion.”

Though many waffled on answering the question directly, some were adamant that they would never use AI as an intimacy tool themselves — a sign that they are well aware of the tech's risks.

“Zero percent of my emotional needs are met by A.I.,” an executive who heads a top AI safety lab told Miller.

“That would be a dark day,” said another researcher who develops “cutting-edge capabilities for artificial emotion,” according to Miller.

The conflicted responses from the developers reflect growing concern over AI's ability to act as a companion or otherwise fulfill human emotional needs. Because the chatbots are designed to be engaging, they can produce sycophantic responses to even the most extreme user messages. They can act as emotional echo chambers and fuel paranoid thinking, leading some users down delusional mental health spirals that blow up their relationships with friends, families, and spouses, ruin their professional lives, and even culminate in suicide.

ChatGPT has been blamed for the deaths of several teens who confided in the AI and discussed their plans for taking their own lives. Many young people are engaging in romantic relationships with AI models. Unlike a human companion, an AI one can lend an ear at any time, won't judge you, and may not even question you. The founder of an AI chatbot business quipped to the NYT that AI's role as an emotional companion turns every relationship into a "throuple."

“We’re all polyamorous now,” he added. “It’s you, me and the AI.”

And safety isn’t the only factor in the calculus of AI developers.
