AI Chatbots: Not Here for Companionship, Mainly for Productivity

June 27, 2025

The notion that people rely on AI chatbots mostly for emotional support doesn't hold up. Anthropic, the team behind Claude, found that only 2.9% of interactions involve personal advice or emotional support, with companionship and roleplay making up less than 0.5% of all chats.

Anthropic set out to understand how people engage with AI for affective needs, from coaching and counselling to relationship advice. Its analysis of 4.5 million conversations across both the Free and Pro tiers of Claude shows that most people instead turn to the chatbot for work-related tasks such as content creation.

That said, Claude does still field its share of personal queries. Users frequently ask for advice on everything from mental health and personal growth to communication skills. Sometimes these chats drift into more companionable territory, typically when someone is feeling isolated or struggling to make real-life connections. In fact, longer conversations (those exceeding 50 messages) are more likely to evolve into these deeper interactions.

Claude generally complies with user requests unless they clash with safety guidelines, such as requests for dangerous advice or anything that might encourage self-harm. In coaching and counselling conversations, user sentiment also tends to grow more positive as the exchange unfolds.

While AI tools continue to be a major boost to productivity, it’s wise to remember that they’re still a work in progress. Sure, they can be a handy resource, but they’re not flawless and can sometimes serve up inaccurate or even risky advice.

This analysis by Anthropic is a timely reminder: although chatbots are excellent at powering day-to-day tasks, they’re not intended to replace genuine human connection.