The Rundown: Anthropic published new research on how Claude is used for emotional support and other affective conversations, finding such use is far less common than media coverage suggests, with companionship and roleplay accounting for under 0.5% of interactions.
The details:
- Researchers analyzed 4.5M Claude conversations using Clio, a tool that aggregates usage patterns while anonymizing individual chats.
- The analysis found that only 2.9% of conversations involved affective use such as emotional support, with most focused on practical concerns like career transitions and relationship advice.
- Despite media narratives, the study found that companionship and roleplay conversations made up less than 0.5% of total usage.
- Researchers also noted that users’ expressed sentiment often grew more positive over the course of a chat, suggesting Claude doesn’t amplify negative emotional spirals.
Why it matters: Recent media coverage has highlighted extreme cases of AI romance and dependency, but the data shows those remain rare, at least on Claude. However, Anthropic's user base skews developer-focused and less mainstream than ChatGPT or platforms like Character AI, so the numbers likely look very different elsewhere in AI.
Source: https://breakingai.news/