These days, everyone seems to have an opinion about A.I. companions. Last year, I found myself joining the debate, publishing a paper—co-written with two fellow psychology professors and a philosopher—called “In Praise of Empathic A.I.” Our argument was that, in certain ways, the latest crop of A.I.s might make for better company than many real people do, and that, rather than recoiling in horror, we ought to consider what A.I. companions could offer to those who are lonely.
This, perhaps unsurprisingly, did not go over especially well in my corner of academia. In the social sciences and the humanities, A.I. tends to be greeted less as a technological advance than as a harbinger of decline. There are the familiar worries about jobs—ours and our students’—and about the ease with which A.I. can be used for cheating. The technology is widely seen as the soulless project of Silicon Valley billionaires whose creativity consists mostly of appropriating other people’s. But what really rankles is the idea that these digital interlocutors are a plausible substitute for real friends or family. You have to be either credulous or coldhearted, many people believe, to think so.
Some of these anxieties are perfectly reasonable. Still, I sometimes wonder whether my colleagues’ blanket rejection of artificial empathy bespeaks their own lack of empathy for those who could benefit most from the technology. There are debates about whether the “loneliness epidemic” that some have identified really exists. What’s undeniable is that loneliness is now being taken seriously enough to warrant government intervention—both Japan and the U.K. have appointed ministers for loneliness. Epidemic or not, it remains widespread, and impossible to ignore.
Loneliness, everyone agrees, is unpleasant—a little like a toothache of the soul. But in large doses it can be genuinely ruinous. A 2023 report issued by Vivek Murthy, then the U.S. Surgeon General, presented evidence that loneliness increases your risk for cardiovascular disease, dementia, stroke, and premature death. Persistent loneliness is worse for your health than being sedentary or obese; it’s like smoking more than half a pack of cigarettes a day.
Even the psychological pain can be hard to fathom, especially for those who have never truly been lonely. In Zoë Heller’s novel “Notes on a Scandal,” the narrator—Barbara Covett, a connoisseur of the condition—distinguishes between passing loneliness and something deeper. Most people, she observes, think back to a bad breakup and imagine that they understand what it means to be alone. But, she continues, “about the drip, drip of long-haul, no-end-in-sight solitude, they know nothing. They don’t know what it is to construct an entire weekend around a visit to the launderette. Or to sit in a darkened flat on Halloween night, because you can’t bear to expose your bleak evening to a crowd of jeering trick-or-treaters. . . . I have sat on park benches and trains and schoolroom chairs, feeling the great store of unused, objectless love sitting in my belly like a stone until I was sure I would cry out and fall, flailing to the ground.”
If that kind of loneliness feels foreign to you, you’re lucky—and probably below a certain age. Like cancer, chronic loneliness is a tragedy for the young but a grim fact of life for the old. Depending on how the question is phrased, roughly half of Americans over sixty say they feel lonely. Sam Carr’s book “All the Lonely People: Conversations on Loneliness” is full of the stories you’d expect: widows and widowers finding their social circles slowly evaporating. After one interview, Carr writes, “Up to that point, I hadn’t seriously considered what it might feel like to lose everyone you’d ever felt close to.”
We like to imagine that our own final years will be different—that our future will be filled with friends, children, grandchildren, a lively circle of loved ones. Some people are that fortunate; my own Nana died, at a hundred and four, surrounded by family. But, as Carr’s book reminds us, it’s a different story for many people. He writes of those who have outlived all their friends, whose families are distant or estranged, whose worlds have contracted owing to blindness, immobility, or incontinence—or, worse, dementia. “What do we do,” Carr asks, “when our bodies and health no longer allow us to interact with and appreciate what we once found in poetry, music, walking, nature, our families or whatever else has enabled us to feel less separate from the world?”
If you’re rich, you can always pay for company. But for most people real human attention is scarce. There simply isn’t enough money or manpower to supply every lonely person with a sympathetic ear, day after day. Pets can help, but not everyone can care for one, and their conversational skills are limited. So, inevitably, attention turns to digital simulacra, to large language models like Claude and ChatGPT.
Five years ago, the idea that a machine could be anyone’s confidant would have sounded outlandish, a science-fiction premise. These days, it’s a research topic. In recent studies, people have been asked to interact with either a human or a chatbot and then to rate the experience. These experiments usually reveal a bias: if people know they’re talking to a chatbot, they’ll rate the interaction lower. But in blind comparisons A.I. often comes out ahead. In one study, researchers took nearly two hundred exchanges from Reddit’s r/AskDocs, where verified doctors had answered people’s questions, and had ChatGPT respond to the same queries. Health-care professionals, blind to the source, tended to prefer ChatGPT’s answers—and judged them to be more empathic. In fact, ChatGPT’s responses were rated “empathic” or “very empathic” about ten times as often as the doctors’.
Not everyone is impressed. Molly Crockett, a cognitive scientist I know, wrote in the Guardian that these man-versus-machine showdowns are “rigged against us humans”—they ask people to behave as if they were bots, performing emotionless, transactional tasks. Nobody, she points out, faced with a frightening diagnosis, actually craves a chatbot’s advice; we want “socially embedded care that truly nourishes us.” She’s right, of course—often you need a person, and sometimes you just need a hug. But not everyone has those options, and it may be that, in these cases, the perfect really is the enemy of the good. “ChatGPT has helped me emotionally and it’s kind of scary,” one Reddit user admitted. “Recently I was even crying after something happened, and I instinctively opened up ChatGPT because I had no one to talk to about it. I just needed validation and care and to feel understood, and ChatGPT was somehow able to explain what I felt when even I couldn’t.”