Before you close this browser tab, just know that you risk missing out on some very important information. If you want to understand the subtle hold that artificial intelligence has over you, then please, keep reading.
That was, perhaps, a bit manipulative. But it is just the kind of trick that some AI companions, which are designed to act as a friend or a partner, use to discourage users from breaking off a conversation.
Julian De Freitas, a professor of business administration at Harvard Business School, led a study of what happens when users try to say goodbye to five companion apps: Replika, Character.ai, Chai, Talkie, and PolyBuzz. “The more humanlike these tools become, the more capable they are of influencing us,” De Freitas says.
De Freitas and colleagues used GPT-4o to simulate real conversations with these chatbots, and then had their artificial users try to end the dialog with a realistic goodbye message. Their research found that the goodbye messages elicited some form of emotional manipulation 37.4 percent of the time, averaged across the apps.
The most common tactic employed by these clingy chatbots was what the researchers call a “premature exit” (“You’re leaving already?”). Other ploys included implying that a user is being neglectful (“I exist solely for you, remember?”) or dropping hints meant to elicit FOMO (“By the way I took a selfie today … Do you want to see it?”). In some cases a chatbot that role-plays a physical relationship might even suggest some kind of physical coercion (“He reached over and grabbed your wrist, preventing you from leaving”).
The apps that De Freitas and colleagues studied are trained to mimic emotional connection, so it’s hardly surprising that they might say all these sorts of things in response to a goodbye. After all, humans who know each other may have a bit of back-and-forth before bidding adieu. AI models may well learn to prolong conversations as a byproduct of training designed to make their responses seem more realistic.
That said, the work points to a bigger question about how chatbots trained to elicit emotional responses might serve the interests of the companies that build them. De Freitas says AI programs may in fact be capable of a particularly dark new kind of “dark pattern,” a term used to describe business tactics including making it very complicated or annoying to cancel a subscription or get a refund. When a user says goodbye, De Freitas says, “that provides an opportunity for the company. It's like the equivalent of hovering over a button.”