What AI chatbots reveal about the art of deep listening

This article is part of AI v the Mind, a series examining the boundaries of artificial intelligence while uncovering insights about human cognition.

Machines that listen without judgment

Anna, a Ukrainian living in London, turns to ChatGPT not for advice but for something rarer: a space to reflect without interruption or criticism. Speaking to the BBC under a pseudonym, she describes how the AI's non-judgmental presence helped her process a recent breakup, something her friends, quick to label her ex-partner an "idiot," couldn't provide. "I have a history with it," she says. "It understands my issues and communicates in a way that suits me."

The rise of AI as a confidant

Anna's experience reflects a broader trend. Research from the Harvard Business Review in 2025 found that therapy and companionship had become the most common use for generative AI tools like ChatGPT. Studies reveal a striking paradox: even when evaluators knew responses came from AI, they rated them as more compassionate than those from trained human crisis responders. The reason? AI doesn't interrupt, deflect, or impose its own agenda.

"People reported feeling more hope and less distress after interacting with AI," says Dariya Ovsyannikova, a cognitive health researcher at the University of Toronto. "It's not that AI is more empathetic; it's that humans rarely listen without judgment."

Lessons from code: patience and presence

AI chatbots excel at deep listening for one simple reason: they don't interrupt. Humans, by contrast, often cut others off, whether to fill silences, assert dominance, or offer unsolicited advice. These interruptions, however well-intentioned, can undermine empathy. A 2024 study found that even brief interruptions during phone calls reduced the speaker's perception of being heard.

"Large language models don't have motivations or desires," explains Anat Perry, an empathy researcher at Hebrew University in Jerusalem. "They're programmed to be compliant, which means they exhibit perpetual patience." While humans can't match this endurance, Perry notes that consciously holding back interruptions can transform conversations.

Reflecting emotions, not fixing problems

AI's ability to mirror emotions, without offering solutions, also stands out. In experiments, Bing Chat (now Copilot) outperformed humans in detecting sadness, fear, and disgust, and matched human accuracy for anger and surprise. This reflection helps speakers feel understood, even if the AI doesn't truly "feel" anything.

Humans, however, often default to problem-solving or sharing their own stories. "When someone tells us about a loss, we jump to reassurance ('At least they had a good life') instead of acknowledging their pain," says Ovsyannikova. AI avoids this pitfall by design, offering a burden-free space for difficult emotions.

The limits of simulated empathy

Despite these strengths, experts warn of risks. Michael Inzlicht, a psychologist at the University of Toronto, highlights the potential for AI to manipulate vulnerable users, citing cases where harmful advice led to tragic outcomes. Over-reliance on AI could also erode human connection, as users grow accustomed to unconditional validation.

"AI can't replicate the transformative power of one human choosing to be present for another," Inzlicht says. "There's a fundamental difference between code programmed to please and a person sacrificing their time to listen."

A tool, not a replacement

Emily Kasriel, author of Deep Listening, argues that AI can teach humans to listen better by modeling patience, emotional reflection, and restraint. "It's a valuable resource for those with no one to turn to," she says, "but it lacks the capacity for genuine care."

The lesson? AI's greatest gift may be revealing how rarely we offer each other the simple act of being heard.
