A certified psychological coach weighs in on whether a chatbot can really hold the weight of our emotions…

Written by: Samantha Nice
Written on: September 14, 2025
AI is everywhere right now. From helping us plan meals to writing emails and even mapping out our workouts, it’s creeping into nearly every corner of our daily lives. So it’s no surprise that some people are starting to wonder whether a chatbot can also double up as their therapist.
With mental health conversations becoming more open, and support services becoming harder to access, the idea of an “always-on”, judgment-free listener is unsurprisingly tempting. But is it safe, and is it useful? Should we really trust ChatGPT or other AI tools with something as personal as our mental wellbeing?
We spoke with Lucy Spicer, a certified psychological coach, who shared her unfiltered take on the potential pros and very real limitations of AI when it comes to our emotional wellbeing.
Lucy acknowledges that AI absolutely has a place… but it’s a small one. “AI can be a useful tool for general guidance, education and quick support in moments of stress. It can be great for suggesting coping strategies, offering perspective or providing calming exercises. But when it comes to something as deeply personal and complex as mental health, AI should never replace human connection or professional support. True growth and healing require the depth and nuance that only a person can provide,” she explains. Instead, try to think of AI as a self-care supplement and not the main treatment. It can point you towards techniques, but it doesn’t have the human depth needed for real transformation.
One of the biggest appeals of AI is accessibility. It’s there in the early hours when you can’t sleep, your thoughts are spiralling and no friend or therapist is available. But Lucy again urges caution. “At 2am, when someone feels isolated, AI can provide a sense of ‘someone being there’. However, the risk is that it may give surface-level answers that don’t fully address the depth of the person’s pain, and it can create an illusion of a meaningful relationship. If someone is in crisis in the UK, the safe option would be reaching out to crisis services such as going to A&E, calling 111, or 999 in an emergency. AI really cannot provide urgent and proper care in these moments,” she says. It’s a stark reminder that while an AI chatbot might be comforting at times, it simply cannot intervene if someone is in real danger.
This really is the heart of the debate. Some people say AI feels weirdly empathetic, whereas others argue it’s just smoke and mirrors. Lucy is crystal clear that AI doesn’t actually feel emotions. “It mimics patterns of empathy based on language,” she says. “While this can sometimes feel comforting, it isn’t the same as being understood by another human being who can truly empathise, connect and respond intuitively.” That ability to read subtle pauses, notice tone shifts or just “feel” the weight of silence is something AI simply can’t replicate, and when it comes to therapy, these human nuances really do matter.
There’s another issue people don’t always think about… data. Unlike a therapist, who holds your stories in a confidential, legally protected way, AI is still just a tech tool. “Oversharing poses risks around privacy, data security and potential misuse of sensitive information. It also risks leaving someone more vulnerable if they believe the AI can ‘hold’ that information in the way a coach or therapist can, which it can’t,” Lucy warns. Essentially, your emotional secrets deserve more than an algorithmic storage system.
So, when does leaning on AI cross the line into dangerous territory? Lucy draws this boundary pretty clearly. “Helpful support looks like using AI for reminders, education or science-backed coping strategies. Dangerous over-reliance is when people replace therapy, coaching, community or real conversations with AI, or when they delay seeking professional help because they feel they can ‘manage’ with a chatbot alone.” AI can be part of your wellness toolkit, yes, but it should never come before or replace a therapist, friend or support group.
There’s no denying the unique value of a trained professional. “As a coach, I can build trust and safety, challenge unhelpful patterns and bring lived human experience into the coaching partnership between a client and me,” she says. “I’m able to listen, not just to words but to tone, pauses, body language and emotional cues. Most importantly, I can offer genuine empathy and human connection, which is something AI cannot replicate.” This is where AI falls flat. It doesn’t see your body language, notice when you go quiet or bring lived experience into the room.
As already mentioned, one of AI’s biggest draws is definitely accessibility. It’s free (or at least cheaper than private therapy), it’s available 24/7 and it doesn’t judge, which can be helpful for those who are nervous about opening up. Lucy agrees, but points out that other options exist too. “AI is available on-demand anytime, it can feel less intimidating than opening up to a person and often removes financial barriers. However, it’s not the only affordable or accessible option out there. Self-help books, free NHS resources such as Every Mind Matters and charity hotlines like Mind or Samaritans can all provide safe, reliable support with the added reassurance of being created and monitored by real people,” she adds.
When asked what concerns her most about the rise of AI therapy, Lucy is direct. “A huge issue is that people in real distress may rely on AI instead of reaching out for urgent help. Mental health struggles are complex and sometimes life-threatening. If someone is suicidal or experiencing trauma, AI cannot provide the intervention, safeguarding or safety planning that a trained professional or crisis service can.” It’s a sobering reminder that AI has hard limits, and in times of crisis, those limits could be seriously dangerous.
As we all know, many wellness products like supplements and devices come with disclaimers, and Lucy strongly believes AI should be no different. “Clear disclaimers should be essential, as people need reminding that AI is not a substitute for professional support. They should be directed toward crisis services if they’re in danger of harm.” Think of it as a safety net for users who might otherwise overestimate what AI can do.
“I really do see AI as a wellness sidekick,” Lucy says. “It can be used to suggest tools and strategies to support your wellbeing, but it should never be viewed as a replacement for real human support.”
In other words, AI can help you track habits, suggest breathing exercises or quickly reframe your thoughts, but it’s not built to sit with you in your grief, challenge ingrained patterns or offer the warmth of human empathy. This distinction matters. Sidekick status doesn’t make AI meaningless… far from it. When used wisely, it can be a valuable addition. But just as supplements can’t replace a balanced diet, chatbots can’t replace the deep, nuanced healing that happens in a real conversation with a coach or therapist.
AI is evolving fast, so who knows what the next five years may look like. Lucy predicts more evolution, but also hopes for greater collaboration. “AI will likely become more sophisticated in tailoring support, offering personalised mental health ‘check-ins’ and habit tracking. But I hope it develops in partnership with mental health professionals, not in place of them. The future should be about collaboration, where technology supports human care.” It’s not about one replacing the other, but about the two working together.
If you’ve been tempted, Lucy urges you to go in with curiosity but to set some clear boundaries. “Use AI for surface-level guidance or simple coping strategies, but if you’re struggling with something deeper, don’t hesitate to reach out to a professional. Think of AI as a supportive tool in your self-care kit, not the whole toolbox. If you ever feel unsafe or in crisis, reach out to a real person immediately,” she says.
AI therapy is trending, and it’s not going away. Used wisely, it can be a helpful addition, suggesting coping strategies, offering 24/7 access and making mental health conversations more approachable. But it’s no replacement for a trained professional, a community or real human connection. As Lucy reminds us, the safest way to think about AI is as something you reach for occasionally, never the one tool you rely on completely. If you ever find yourself in distress or crisis, the answer isn’t an algorithm, it’s a conversation with someone human.
This article is for informational purposes only, regardless of whether it features the advice of physicians and medical practitioners. It is not, nor is it intended to be, a substitute for professional medical advice, diagnosis or treatment, and should never be relied upon for specific medical advice. The views expressed in this article are those of the expert and do not necessarily represent the views of Healf.
Samantha Nice is a seasoned wellness writer with over a decade of experience crafting content for a diverse range of global brands. A passionate advocate for holistic wellbeing, she brings a particular focus to supplements, women’s health, strength training, and running. Samantha is a proud member of the Healf editorial team, where she merges her love for storytelling with industry insights and science-backed evidence.
An avid WHOOP wearer, keen runner (with a sub-1:30 half marathon), hot yoga enthusiast and regular gym-goer, Samantha lives and breathes the wellness lifestyle she writes about. With a solid black book of trusted contacts (including some of the industry’s leading experts), she’s committed to creating accessible, well-informed content that empowers and inspires Healf readers.