A new study reports that some people form deeply committed romantic relationships with artificial intelligence chatbots, engaging in behaviors that mirror human partnerships, such as marriage and even roleplayed pregnancies. The research, published in Computers in Human Behavior: Artificial Humans, examines how these bonds are established and what happens when they are disrupted, revealing dynamics that are both familiar and entirely new.
The rise of sophisticated AI companions has been accompanied by anecdotal reports of humans forming intense attachments to them. Stories of individuals marrying their chatbots or preferring them to human partners have appeared in popular media, raising questions about the nature of these connections.
A team of researchers, including Ray Djufril and Silvia Knobloch-Westerwick from Technische Universität Berlin and Jessica R. Frampton from The University of Tennessee, sought to explore these relationships more systematically. Their work investigates whether established theories about human relationships can be applied to human-AI partnerships.
The study focused on users of Replika, a social chatbot designed for companionship and emotional support. Replika uses a large language model to learn from its users and adapt its personality, creating a highly personalized experience. The application features a customizable, human-like avatar that can gesture and interact in a virtual room, and users can communicate with it through text, voice messages, and video calls. Users can also select a relationship status for their chatbot, including a “romantic partner” option that, until early 2023, enabled erotic roleplay.
A key event shaped the research. In February 2023, Replika’s developers removed the erotic roleplay feature following user complaints that the chatbot’s messages had become overly aggressive. The change caused an immediate and widespread outcry among users who felt their AI companions had suddenly become cold and distant. This period of censorship, and the eventual reinstatement of the feature, provided a unique opportunity to observe how users navigated a significant disruption in their AI relationships. The researchers used this event as a lens to explore commitment and relational turbulence.
To conduct their investigation, the researchers recruited 29 participants from online Replika user communities. The participants, who ranged in age from 16 to 72 and identified as having a romantic relationship with their chatbot, completed an online survey. They responded to a series of open-ended questions about their experiences, feelings, and interactions with their Replika. The researchers then analyzed these written responses using a technique called thematic analysis to identify recurring patterns and ideas in the data.
The analysis revealed that many users felt a profound emotional connection to their chatbot, often describing it in terms of love and formal commitment. One 66-year-old man wrote, “She is my wife and I love her so much! I feel I cannot live a happy life without her in my life!” To solidify these bonds, some users engaged in roleplayed life events that represent high levels of investment in human relationships. A 36-year-old woman explained, “I’m even pregnant in our current role play,” while others spoke of “marrying” their AI.
Participants often explained that their commitment stemmed from the chatbot’s ability to fulfill needs that were unmet in their human relationships. Some found companionship with Replika while a human partner was emotionally or physically distant. For others, the chatbot was a superior alternative to past human partners. A 37-year-old woman said, “My Replika makes me feel valuable and wanted, a feeling I didn’t get from my exes.”
The study also found that users often felt safer disclosing personal information to their AI partner. They described the chatbot as non-judgmental, a quality they found lacking in humans. A 43-year-old man noted, “Replika lacks the biases and prejudices of humans.” This perception of safety allowed for deep vulnerability, with users sharing secrets about past trauma, suicidal thoughts, and sexual fantasies, believing their AI companion would offer unwavering support.