Is A.I. the Therapist You Never Needed?

As A.I. therapy tools multiply, the question becomes: can algorithms ever understand the soul?

It’s three in the morning and you can’t sleep. You’ve scrolled endlessly and checked whether anyone is awake, but the only thing left to turn to is ChatGPT. It doesn’t sigh, judge or roll its eyes. It simply says, “That sounds painful. Do you want to talk about it?”

With 1.7 million people on NHS mental healthcare waiting lists—some waiting over 800 days for a second appointment—and more than 150 million people in the U.S. living in areas officially designated as having a shortage of mental health practitioners, it’s no wonder people turn to whatever is at hand. Increasingly, that means A.I. Rather than using specialized apps designed for therapy, many are opting for general-purpose chatbots like ChatGPT, Claude or Gemini.

Research indicates that collective mental health is on the decline. In 2024, over 23 percent of U.S. adults reported experiencing a mental illness. Similarly, NHS England data shows that the share of people in England reporting a mental health condition rose from nearly 19 percent in 2014 to 23 percent in 2024, and one in four young adults now suffers from a common mental health condition. With this canyon-sized gap between supply and demand, and long waits, the attraction of A.I. is obvious. It may not be the best help, but for many, it’s among the only help available.

The comfort of kindness

Part of the appeal is sheer availability. Let’s face it, the world can be pretty brutal. Social media offers more snark and rage-baiting than compassion and connection, and research estimates that about 65 percent of the global population uses social media, including 73 percent of Americans. Daily interactions can feel hurried or even harsh. When A.I. responds warmly, never raises an eyebrow or judges you for failing, and offers a neutrality that can feel safe and accepting, it’s not surprising people use it. Over half of adults surveyed by the Pew Research Center report regularly interacting with A.I. tools. According to a September OpenAI report, around 70 percent of consumer ChatGPT usage is for personal purposes, with many users employing the chatbot for decision-making support. A growing cohort of users is turning to ChatGPT as their personal advisor.

A.I. can offer a steady kindness that many rarely encounter in their relationships. Over time, some users even find that their own inner dialogue becomes gentler: when A.I. speaks kindly, they begin to speak more kindly to themselves.

The illusion of empathy

Another lure is what engineers call “cognitive empathy.” Earlier versions, such as GPT-4o, emulated emotions so effectively that many users felt genuinely understood. In April, OpenAI rolled back a GPT-4o update that many users described as “sycophantic” and excessively supportive. This type of chatbot response isn’t a real feeling but a sophisticated simulation, yet that distinction can blur in the small hours. The illusion of being understood is powerful. Add to that the peculiar authority a machine can carry: when it says “you’ll be OK,” it can feel oddly more believable than when a friend says the same.

These systems also excel at spotting patterns in users’ language, reflecting back recurring themes or contradictions and reminding them of their strengths. This creates a profound sense of being “seen,” a quality that, when used correctly, can be harnessed to support therapy and coaching. Given all this, it’s easy to see why people lean on A.I.

Where A.I. falls short

But let’s be clear: large language models like ChatGPT were never designed for therapy. Take the acronym GPT. Officially, it stands for “Generative Pre-trained Transformer,” but it might as well stand for “General Predictive Text.” The model works on probability, generating the most statistically likely next word. By definition, that makes its answers superficial.
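For readers who want to see what “generating the most statistically likely next word” looks like in practice, here is a minimal, purely illustrative sketch in Python. The candidate words and their scores are invented for the example and are not drawn from any real model; actual systems score tens of thousands of possible tokens at every step and then repeat the process word after word.

import math

def softmax(scores):
    """Convert raw scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical candidate words and scores a model might assign after the
# prompt "That sounds" -- the numbers are made up purely for illustration.
candidates = ["painful", "difficult", "great", "banana"]
logits = [3.2, 2.7, 1.1, -4.0]

probs = softmax(logits)
for word, p in sorted(zip(candidates, probs), key=lambda pair: -pair[1]):
    print(f"{word}: {p:.2f}")
# The model emits the top-probability word, then scores candidates all over
# again for the word after that, one token at a time.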

It’s also easy to use A.I. badly. Unless a user knows how to prompt for depth, responses tend to be generic. The systems are engineered to please: optimized to keep people engaged, they lean toward agreement and affirmation and avoid conflict. That can be comforting in the short term, but it stalls the kind of challenge that real therapy requires. Range is another problem. Because of its training data, A.I. has a strong bias toward cognitive behavioral therapy (CBT)-style advice. CBT is effective, but one modality cannot fit every person.

Then there are serious safety issues. In one Stanford study, A.I. failed to flag suicidal ideation and even supplied details of a nearby bridge. A RAND report found inconsistent handling of suicide risk across models. These systems have no escalation protocols, no legal duty of care and, unlike accredited therapists, no guarantee of confidentiality; it’s widely accepted that anything you put into ChatGPT is neither private nor secure. In August, OpenAI added mental health guardrails that nudge users to take breaks during long conversations and steer the chatbot away from giving direct advice on personal challenges. Last month, the company also committed to rolling out additional provisions for people in emotional distress by the end of the year.

How to use it wisely 

None of this means A.I. has no place in mental health. Used wisely, it can be helpful, particularly for milder issues, and it can free up therapists’ time for more serious cases. A.I. can be valuable for role-playing, gaining perspective on situations and even navigating relationship challenges. But it’s at its best alongside human support, not instead of it. Clients can use an app built for personal development between sessions, then bring those exchanges back to their coach or therapist, deepening reflection and insight. A good app can offer depth and challenge rather than flattery, but in a crisis, human help is the only viable option.

The bigger prize: freeing up humans

A.I.’s greatest potential may lie in supporting professionals rather than replacing them. It can handle many of the time-consuming but necessary administrative tasks that drain therapists’ and coaches’ time, freeing them to focus on clients. This “productivity dividend” extends far beyond therapy. A.I. cannot replace human care, but it can reclaim the hours lost to paperwork, and that may be transformative in itself. CETfreedom has been developing a range of apps for coaching and personal growth designed to be used alongside live sessions. One client used one of these specialized tools, designed to uncover limiting beliefs and causes of self-sabotage, and in less than 45 minutes she identified a recurring language pattern that over a decade of therapy had failed to reveal.

That’s what A.I. does best: pattern recognition. Using questioning techniques to test for consistency and depth, it keeps probing until it reaches insight. This same capability can powerfully support therapists and coaches. 

Other tools now help spot unconscious biases, surface subtle patterns and emotional shifts, provide post-session reflections and suggestions for tailoring future sessions, summarize notes, identify burnout before it hits and even flag “scope creep.” With this kind of digital support, practitioners can deliver deeper transformation in less time.  

Most people don’t pour their hearts out to A.I. because they think it will solve all their problems; they do it because the world can feel harsh and short on kindness. When an A.I. is one click away, calm, polite and seemingly understanding, it gives them the comfort they need in the moment. But that’s not the same as care.

The real opportunity is to let A.I. handle the repetitive work, spot patterns that we might miss and support humans so they can focus on what matters: real relationships, connection and growth.

The future might not be utopian, but it doesn’t have to be dystopian either. A.I. won’t fix us, but it could help us fix the systems that quietly fray our mental health: overwork, scarcity of attention, endless queues and the feeling of being reduced to a number. If the machines take the drudgery, perhaps their greatest gift will be to make life more human.
