More and more, in the midst of a loneliness epidemic and structural barriers to mental health support, people are turning to AI chatbots for everything from career coaching to romance. Anthropic's latest study indicates its chatbot, Claude, is handling that well -- but some experts aren't convinced.
On Thursday, Anthropic published new research on its Claude chatbot's emotional intelligence (EQ) capabilities -- what the company calls affective use, or conversations "where people engage directly with Claude in dynamic, personal exchanges motivated by emotional or psychological needs such as seeking interpersonal advice, coaching, psychotherapy/counseling, companionship, or sexual/romantic roleplay," the company explained.
While Claude is designed primarily for tasks like code generation and problem solving, not emotional support, the research acknowledges that this type of use is still happening and is worth investigating given the risks. The company also noted that studying affective use is relevant to its focus on safety.
The main findings
Anthropic analyzed about 4.5 million conversations from both Free and Pro Claude accounts, ultimately settling on 131,484 that fit the affective use criteria. Using its privacy data tool Clio, Anthropic stripped conversations of personally identifying information (PII).
The study revealed that only 2.9% of Claude interactions were classified as affective conversations, which the company says mirrors previous findings from OpenAI. Examples of "AI-human companionship" and roleplay comprised even less of the dataset, combining to under 0.5% of conversations. Within that 2.9%, conversations about interpersonal issues were most common, followed by coaching and psychotherapy.
Usage patterns show that some people consult Claude to develop mental health skills, while others work through personal challenges like anxiety and workplace stress -- suggesting that mental health professionals themselves may be using Claude as a resource.