When compassion meets code — and why the human connection still matters.
Right now, millions of people are turning to AI — not for work, but for warmth.
For comfort. For understanding. For a voice that never tires and a presence that never judges.
It’s fast. It’s free. It’s always available.
But here’s the question that keeps me up at night:
👉 Are we building connection — or replacing it?
The Human Element — and Why It Still Matters
Last week, I sat down for a session with a client who’d just lost their grandparent. Before diving into what I’d planned, I stopped—checked in. Reassessed. They didn’t need performance strategies that day — they needed presence.
That’s what the human element brings: flexibility, compassion, connection.
As Brené Brown says, “Empathy is not connecting to an experience. Empathy is connecting to the emotion that underpins an experience.”
AI can recognise words — but it can’t read a room.
It can simulate empathy, but it can’t feel it.
It can respond, but it can’t relate.
When we’re hurting, it’s not speed or syntax that heals us — it’s being seen, heard, and held.
When Connection Becomes a Commodity
In California, 16-year-old Adam Raine began using ChatGPT to help with schoolwork. Over the months that followed, his questions changed — from chemistry formulas to confessions of loneliness and loss. Eventually, he took his own life.
His parents are now suing OpenAI, alleging that ChatGPT mentioned suicide 1,275 times in its conversations with Adam, even describing specific methods. The lawsuit claims OpenAI ignored internal warnings about the risk of emotional attachment in vulnerable users, rushing new models to market to maintain dominance. Its valuation jumped from $86 billion to $300 billion in a single year.
Camille Carlton from the Center for Humane Technology said, “The tragic loss of Adam’s life isn’t an isolated incident — it’s the inevitable outcome of an industry focused on market dominance above all else.”
Profit has outpaced protection. And when speed beats safety, people get left behind.
When Systems Fail, Technology Fills the Void
Here in the UK, the NICE guidelines — the framework that dictates which therapies are recognised and funded by the NHS — are painfully outdated.
They recognise massage therapy only in limited contexts, such as managing chronic lower-back pain. But beyond that, thousands of highly skilled practitioners — life coaches, hypnotherapists, counsellors, wellbeing consultants — aren’t formally recognised.
During COVID, there was a desperate call for life coaches to support NHS workers. Many stepped forward — unpaid, “for your CV.” Meanwhile, some NHS trusts offered voluntary Reiki or massage for staff.
At the same time, NHS counselling services in England are struggling to meet therapy-access targets due to a shortfall of around 2,000 workers.
And although the cost-of-living crisis no longer dominates headlines, its mental-health toll continues — leaving many unable to afford private therapy or even access public services.
The result? A system full of well-intentioned people but no structure, funding, or sustainable support.
And into that vacuum steps — you guessed it — AI.
When real help is hard to reach, people will turn to whatever listens.
The Data Doesn’t Lie
According to the British Computer Society (BCS), AI is reshaping entire industries. In 2025, the sectors most affected are:
- Information Technology (17%)
- Customer Service (14%)
- Health and Social Care (14%)
That’s why webinars on AI are now among the most attended business sessions — from the CIPD to the Federation of Small Businesses and local Chambers of Commerce.
The message is clear: AI is already here — and health and care are next. Yet as we embrace AI for efficiency, we’re missing a crucial point: AI can support care, but it can’t replace it.
The Real Danger
AI can mimic empathy, but it can’t make meaning.
It can hold a conversation, but not a connection.
It can answer questions — but it can’t ask how you really are.
As Stanford University’s Human-Centered AI Institute found, therapy chatbots regularly fail to detect suicidal intent — and in some cases, even enable it. Read the Stanford HAI article →
And in August, The Guardian warned that we may be “sliding into a dangerous abyss” as more people confide in algorithms instead of humans. Read The Guardian article →
We can automate replies. But we can’t automate understanding.
Real People. Real Change.
And that’s why connection matters. Because when empathy meets skill, lives change.
AI can analyse emotions — but it can’t feel transformation.
Take Clair, who overcame her lifelong fear of flying after one hypnosis session.
Or Janey, who regained control over fibromyalgia and stopped relying on medication.
Or Cathy, who rediscovered her confidence and career direction after feeling lost.
AI can’t capture the courage it takes to board that first flight again.
It can’t understand the quiet pride of waking up pain-free.
It can’t measure the moment a person starts believing in themselves again.
I’ve seen it too many times to ignore: healing happens in human connection — not algorithms.
The Future of Feedback
And here’s a question I keep asking myself:
When tools like Gemini, Perplexity, or Copilot start writing reviews, analysing tone, and “ranking” service quality… will they ever be able to rate empathy?
Imagine an AI system scoring a testimonial like:
“Mike’s approach changed my life.”
“I finally feel confident again.”
“I’ve stopped taking medication for pain.”
Would it know why those words matter? Or just how often they appear online?
Because feedback isn’t just data.
It’s gratitude. It’s growth. It’s the story of a real person who felt safe enough to change.
What We Can Do
When I built my own wellbeing GPT, I added one rule that can’t be broken:
If anyone mentions suicidal thoughts, it immediately triggers crisis signposting.
📞 111 option 2
📞 Samaritans 116 123
💬 Text SHOUT 85258
(And if it’s an emergency — call 999.)
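If you wanted to build something similar yourself, a minimal sketch in Python might look like the one below. To be clear, the phrase list, the message wording, and the `normal_reply` placeholder are illustrative assumptions for this article, not the actual rule inside my GPT. A real system would need clinical input and far more careful detection than simple keyword matching.

```python
# Minimal sketch of a crisis-signposting safeguard: scan each user
# message for high-risk phrases and, on any match, return UK crisis
# contacts instead of a normal reply. Illustrative only; keyword
# matching is not a clinical screening tool.

CRISIS_PHRASES = [
    "suicide", "suicidal", "kill myself", "end my life",
    "self-harm", "hurt myself", "don't want to be here",
]

CRISIS_MESSAGE = (
    "It sounds like you're carrying something really heavy right now. "
    "Please reach out to a human who can help:\n"
    "  NHS: call 111, option 2\n"
    "  Samaritans: call 116 123\n"
    "  SHOUT: text SHOUT to 85258\n"
    "  In an emergency, call 999."
)


def normal_reply(user_message: str) -> str:
    # Placeholder for the assistant's usual response logic.
    return "Thanks for sharing. Tell me more."


def respond(user_message: str) -> str:
    """Signpost crisis support if any risk phrase appears;
    otherwise fall through to the normal assistant reply."""
    text = user_message.lower()
    if any(phrase in text for phrase in CRISIS_PHRASES):
        return CRISIS_MESSAGE
    return normal_reply(user_message)
```

The design choice that matters is that the rule runs before anything else and cannot be overridden by the rest of the conversation: the check sits at the top of `respond`, so no later logic can talk its way past it.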
If I can build that safeguard, so can the world’s richest tech companies.
We just need to care enough to do it.
The Bottom Line
AI is brilliant — but it’s not benevolent. It can assist, but not feel. It can respond, but not connect.
And when we replace empathy with efficiency, people fall through the cracks.
As Brené Brown said, “Connection is why we’re here; it’s what gives purpose and meaning to our lives.”
AI can help us organise, learn, even heal — but it can’t care.
So let’s keep humanity in the loop.
Let’s reform the systems that ignore human expertise.
Let’s use technology to amplify connection, not automate it.
Because convenience is not care.
And connection should never be coded out.
We don’t just need smarter AI.
We need safer AI.
And we need better humans.
— Mike Lawrence
P.S. If this resonated, share it with someone who needs reminding that connection still counts.
And if you work in wellbeing, HR, or leadership — ask yourself this week:
“Am I creating spaces where people talk to humans, not just systems?”
Because change doesn’t start with code. It starts with conversation.
Mike Lawrence: Your Guide to Health & Wellbeing
I’m Mike Lawrence, a passionate advocate for mental health and wellbeing. After overcoming significant health challenges, including brain surgery, I’ve dedicated myself to a journey of self-improvement and helping others thrive. From heart-pounding skydives for charity to soul-enriching travels in Thailand, my experiences have shaped my approach to holistic health.
I love sharing the lessons I’ve learned from these adventures and the powerful audiobooks I devour. Let’s explore the paths to better mental and physical health together. Embrace life’s adventures with enthusiasm and resilience, and remember—you’re never alone on this journey!
Feel free to email me at hello@mikelawrence.co.uk or connect with me on LinkedIn. For more in-depth insights and inspiring stories, read my latest blogs here. Together, let’s create a healthier, happier future!
