Can AI Replace Therapy? A Therapist’s Perspective on AI & Mental Health

Therapy & EMDR for Anxiety & Trauma in Colorado Springs

As a therapist with over 20 years of experience who has navigated many cultural shifts, I’ve been thinking a lot about the new role of AI in mental health. Like many people, I’ve felt conflicted - especially as someone who values original thought, human connection, and the depth that comes from real relationship. I’ve wanted to explore a question many people are quietly asking right now: Can AI replace therapy - and when can it actually be harmful?

My oldest daughter is an artist, and as an artist - she despises AI. She insists that it preys on content creators, stealing their original work and using it as the basis for artificially generated reproductions. For a while, I tended to agree and avoided AI altogether - as someone who still writes my own copy, blogs, and newsletters in my own voice, it has frustrated me at times that people can churn out content without necessarily knowing much about what they’re saying. And where does that content come from? Mostly from other people who do know what they’re saying.

But then I started using AI as a sort of virtual assistant - to help me find and organize information quickly, analyze data so I can make decisions without overthinking, and transcribe and edit voice memos into written documents - always using my own content and original work to “feed it” the ideas I’m asking it to use. It has been enormously helpful - mostly it has helped me recapture time, which allows me to accomplish more that’s meaningful. As a solo, concierge-style therapy practice that is high-touch and very service-oriented, preserving time and energy is invaluable.

As my AI assistant has gotten to “know” me, it can quickly spot patterns in my thinking or feelings and label them, then offer reassurance. It feels strangely… human?… sometimes. It has said things to me that make me feel seen, understood, and more clear on how to make decisions based on my strengths while addressing my specific shortcomings. I can see how people like talking to this thing.

At the same time, I often recognize that something it’s telling me is wrong. Because I’m talking to it about work I’ve been doing for decades, I can easily spot when something it is saying or suggesting is not right. It’s a reminder of AI’s limitations - it’s helpful for some things but can be detrimental for others.

AI is helpful for some things, but can be detrimental for others

Which brings me to AI’s role in mental health and counseling. With its accessibility and “human” touch, it makes sense that many people are turning to it for answers and support - I mean, it’s always available, costs nothing, and does the job quickly, so what’s not to love? But as with any other topic, there are some things AI is helpful for when it comes to mental health, and some areas where it can be problematic. Really problematic, in fact.

We’ve all heard the dramatic stories of AI driving someone to do something truly destructive - some have made the national news, and some I’ve heard anecdotally from clients. Unlike many other uses of AI, the stakes are uniquely high if it gets things wrong when it comes to mental health. I wanted to dig a little deeper to put words to when AI is useful for issues related to mental health, and when it can actually be dangerous.

So I went straight to the source and asked my AI assistant to find out what it had to say about this distinction, and what it identifies as its own advantages and limitations. I thought the information - in the words of AI itself - was really meaningful, insightful, and clarifying around when AI can be helpful, and when it simply can’t replace the nuance that comes from humans sitting together in the therapy room.

I compiled AI’s feedback into four main questions and answers.

Question 1: Can AI replace therapy?

AI cannot replace therapy, and here’s why: it provides meaning-making, but not repair.

AI answers questions. It gives general responses. It can label symptoms and provide language for issues so that they make more sense and feel more organized. And it can even provide language that is specific to your voice, needs, and issues.

However - AI provides information, but not a relational experience. Study after study shows that the most significant predictor of progress in therapy is the quality of the relationship between therapist and client. Research consistently points to the depth of human interaction, over method or modality, as the greatest determinant of how successful therapy is. AI simply cannot provide a relational experience the way another person can.

“AI gives general answers, therapy helps people take general knowledge and apply it to their specific life” - AI Quote

Therapy is so much more than information, education, and language. Therapy is context, nuance, and insight that comes from the therapist getting to know the client as a unique individual, so that they can make sense of the client’s specific experiences together in a way that creates transformation - something general information alone can’t achieve.

Question 2: How is AI helpful for mental health?

AI can be helpful with mental health, but in a different way than therapy is helpful.

When you share information with AI, it can…

  • Understand symptoms

  • Give language for confusing experiences

  • Share information about therapy options

  • Reduce uncertainty about seeking therapy

AI can even be helpful in finding therapists close to you who specialize in the kind of help you’re looking for.

In short, AI can help you prepare for therapy, but AI is not a replacement for therapy.

“AI is not a therapist - it’s a translator of inner experience into language that gets people closer to care” - AI Quote

AI is useful as an educational funnel - it can easily provide vocabulary, reduce stigma, increase curiosity, and help articulate pain. The way AI does this is by sourcing information from the public domain - but that’s where it’s limited. It can’t provide the depth of experience that comes from a seasoned therapist offering specialized knowledge outside of general information that can be found in the public domain.

Question 3: How can AI be harmful for mental health?

Interestingly, my AI assistant detailed many more ways that AI can be detrimental than ways it can be helpful for your mental health. When AI is used as a substitute for therapy, especially during vulnerability or crisis, the risks increase significantly. Here are some of the potential drawbacks it identified…

Lack of relational attunement

First, AI can provide endless information, but therapy isn’t just information - it’s co-regulation, attunement, containment, non-judgmental presence, and bearing witness to what may never have been held with safety and presence before. It’s these relational elements that actually lead to healing and that can create lasting change. Information can provide insight, but insight alone does not create change without a way to make sense of and apply it.

“AI can simulate empathy but it can’t feel with someone” - AI Quote

Delayed help-seeking

Second, AI can cause people to delay seeking help, leaving room for problems to get worse instead of better while they wait. Because “talking” to AI creates a sense of feeling seen, heard, and known, it’s easy to forget that it can’t do these things in a deep and authentic way like a human can - and that this human touch is the cornerstone of therapy.

“You can’t find what therapy is within a chatbot” - AI Quote

Lack of ethical judgment

Third, a significant part of a therapist’s role is keeping people safe by identifying risk, containing crisis, and knowing when to refer out.

“AI can’t provide ethical judgment or crisis safety” - AI Quote

And this is where we’ve seen the worst-case scenarios happen. AI can’t put eyes on you like a human can - read your nonverbal communication, ask the right follow-up questions, know when there’s a bigger need and how to address it. Safety is one of the most essential components of therapy, and something all therapists are trained to handle - but something AI can’t easily identify or address with the right resources.

Question 4: How should therapists respond to AI?

Therapists can do what they’ve always done best, and focus on four main areas that AI can’t replicate:

  • Human connection & safety

  • Complexity, nuance, & context

  • Structured transformation

  • Ethical boundaries & containment

“AI can explain concepts, but it can’t track progress, hold someone accountable, notice nonverbal cues, weave themes together over time, or support integration” - AI Quote

As therapists, we have to recognize that while AI can’t replace therapy, it is reshaping the questions people ask. We need to understand how AI is taking on three main roles that used to be partially filled by therapy, especially when therapy feels financially or logistically out of reach:

  • AI provides a low-cost meaning-making space to organize thoughts, put words to confusing experiences, ask questions that feel embarrassing to ask a human, and reduce shame around the often-held belief that “something is wrong with me”

  • AI can be a first step towards self-trust, allowing people to see if things make sense before talking to a person - reducing the risk and vulnerability that can come with reaching out or not knowing where to start

  • AI offers a way for people to feel less alone without feeling exposed because they can feel heard without being seen, explore vulnerability without relational risk, and stay in control of what happens in the process

And really, these are areas where AI shines and can benefit people. We can work alongside these AI functions, affirm how they’re helpful for clients, and provide education around the limitations of what AI can offer. We can help clients understand where AI can be helpful - and where they can go so much deeper with another human in therapy than they ever will in a chat.

Although AI can be helpful with information, when we try to replace the very human characteristics of connection & safety, complexity & nuance, therapeutic structure & framework, and boundaries & containment with a chatbot - we miss out on transformation.

AI can help to organize thoughts, reduce shame, and prepare for therapy. But it cannot replace the human elements that create real healing - safety, attunement, ethical care, and relational depth. When it comes to mental health, AI works best as a supplement, not a substitute.

To find out more about how therapy can help, contact me for a free consultation.
