The people turning to AI for dating and relationship advice
Suzanne Bearne, Technology Reporter
Earlier this year, Rachel wanted to clear the air with a man she had been dating before seeing him again in a wider friendship group.

"I'd used ChatGPT for job searching but had heard someone else use it [for dating advice]," says Rachel, who lives in Sheffield and does not want her real name used. "I was feeling quite distressed and wanted guidance, and didn't want friends involved."

Before the phone call, she turned to ChatGPT for help. "I asked, how do I deal with this conversation but not be on the defensive."

Its response? "ChatGPT does this all the time, but it was something like 'wow, that's such a self-aware question, you must be emotionally mature going through this. Here are some tips'. It was like a cheerleader on my side, like I was right and he was wrong."

Overall, she says it was "useful", but she describes the language as "very much like therapy speak, using words like 'boundaries'".

"All I took from it was it reminded me to be OK to do it on my terms, but I didn't take it too literally."
Rachel is not alone in turning to AI for advice on relationships.

According to research by the online dating firm Match, almost half of Generation Z Americans (those born between 1997 and 2012) say they have used large language models (LLMs) such as ChatGPT for dating advice - more than any other generation.

People are turning to AI to help craft break-up messages, to dissect conversations they are having with people they are dating, and to resolve problems in their relationships.
Using AI might be a tactic to avoid difficult feelings, says Dr Suglani
Dr Lalitaa Suglani, a psychologist and relationship expert, says AI can be a useful tool, especially for people who feel overwhelmed or unsure when it comes to communication in relationships.

It may help them to craft a text, process a confusing message or get a second opinion, which can offer a moment of pause instead of a reactive reply, she says.

"In many ways it can function like a journalling prompt or reflective space, which can be supportive when used as a tool and not a replacement for connection," says Dr Suglani.

However, she flags several concerns. "LLMs are trained to be helpful and agreeable and repeat back what you are sharing, so they may subtly validate dysfunctional patterns or echo back assumptions, especially if the prompt is biased. The problem with this is it can reinforce distorted narratives or avoidance tendencies."

For example, she says, using AI to write a break-up text might be a way to avoid the discomfort of the situation. That can feed avoidant behaviours, as the individual is not sitting with how they actually feel.

Leaning on AI might also inhibit people's own development. "If someone turns to an LLM every time they're unsure how to respond or feel emotionally exposed, they might start outsourcing their intuition, emotional language, and sense of relational self," says Dr Suglani.

She also notes that AI-drafted messages can be emotionally sterile and make communication feel scripted, which can be unnerving to receive.
Not everyone can talk to friends and family about relationships, says Es Lee
Despite such concerns, services are springing up to serve the market for relationship advice.

Mei is a free AI-powered service. Built using OpenAI's technology, it responds to relationship dilemmas with conversational replies.

"The idea is to allow people to instantly seek help to navigate relationships, because not everyone can talk to friends or family for fear of judgment," says New York-based founder Es Lee.

He says more than half of the issues brought up on the AI tool concern sex - a subject many may not wish to raise with friends or a therapist. "People are only using AI as existing services are lacking," he says.

Another common use is asking how to reword a message or how to fix an issue in a relationship. "It's like people need AI to validate it [the problem]."
Safety is another issue when it comes to relationship advice. A human counsellor would know when to intervene and protect a client from a potentially harmful situation. Would a relationship app provide the same guardrails?

Mr Lee recognises the concern. "I think the stakes are higher with AI because it can connect with us on a personal level the way no other technology has." But he says Mei has "guardrails" built into the AI. "We welcome professionals and organisations to partner with us and take an active role in molding our AI products," he says.

OpenAI, the creator of ChatGPT, says its latest model has shown improvements in areas such as avoiding unhealthy levels of emotional reliance and sycophancy.

In a statement, the company said: "People sometimes turn to ChatGPT in sensitive moments, so we want to make sure it responds appropriately, guided by experts. This includes directing people to professional help when appropriate, strengthening our safeguards in how our models respond to sensitive requests and nudging for breaks during long sessions."
Another area of concern is privacy. Such apps can collect highly sensitive data, which could be devastating if exposed by hackers.

Mr Lee says: "At every fork in the road on how we handle user privacy, we choose the one that preserves privacy and collects only what we need to provide the best service."

As part of that policy, he says Mei does not ask for information that would identify an individual, other than an email address. Conversations are saved temporarily for quality assurance, he adds, but discarded after 30 days. "They are not currently saved permanently to any database."