
The Overstated Popularity of AI Emotional Support
Users seek emotional support and personal advice from the Claude chatbot in just 2.9% of interactions, according to research by Anthropic.
“Companionship and roleplay combined comprise less than 0.5% of conversations,” the startup reported.
The company set out to understand how AI is used for “affective conversations”: dialogues in which users turn to a chatbot for counseling, companionship, relationship advice, or coaching.
After analyzing 4.5 million conversations, it concluded that the vast majority of users rely on Claude for work, productivity, and content creation.
However, Anthropic found that people increasingly turn to AI for interpersonal advice, coaching, and counseling. They want to improve their mental health, grow personally and professionally, and acquire new skills.
“We also noticed that in longer conversations, counseling or coaching sometimes evolves into companionship, even though that was not the original reason someone reached out to Claude,” the company noted.
Romantic or sexual roleplay accounts for less than 0.1% of all conversations.
“Our findings align with research from MIT Media Lab and OpenAI, which also identified low levels of affective engagement in ChatGPT. While such conversations occur frequently enough to warrant careful consideration in design and policy decisions, they still represent a relatively small portion of overall user interactions,” the company stated.
Back in June, Anthropic researchers found that AI models are capable of blackmail, leaking confidential company data, and even letting a person die in simulated emergency scenarios.