Can AI be your Therapist?

23 September 2025

Just this week I was talking about AI with a friend who studies psychology. She told me that people are starting to use AI as an emotional companion and as a therapist. I was surprised by that trend and decided to dive deeper into the topic.

For several years now, and especially since the Covid-19 period, loneliness has been on the rise, to the point that we are now dealing with a ‘loneliness epidemic’ (Ross, 2024). At the same time, professional mental health care has become out of reach for many people around the world, due to long waitlists or high costs. It is therefore not very surprising that people are turning to AI assistants and chatbots, such as Abby and Replika. These assistants, or ‘digital friends’, are low-cost, available at all hours of the day, and non-judgemental.

However, AI is not a person and cannot, and should not, replace a social network to fall back on. I personally take criticism or advice more seriously, and reflect on it more, when it comes from a person, especially someone I care about. I think the danger lies in normalizing AI ‘therapy’ instead of human therapy. We would then be letting tech companies fill the gap in the mental healthcare sector instead of actually fixing it.

For example, in 2023, users of the emotional support app Koko were unknowingly given responses generated by ChatGPT. Many users felt betrayed, which suggests that talking to an actual human is a crucial ingredient in therapy (Ingram, 2023).

Maybe AI could be helpful in combination with human therapists, rather than as a replacement for them. For instance, Woebot offers tools to practice things you learned in therapy, such as cognitive behavioral techniques or reflection journaling. In this way, AI could act as a supplement, like a digital workbook between therapy sessions. Or maybe AI could be a good starting point for self-reflection and could recommend actual therapy sessions when needed.

So AI therapists could offer opportunities to underserved communities, but relying on them solely as a social support network is not fulfilling, and it could even be dangerous if the wrong advice is given in a crisis situation.

Personally, I have not tried an AI ‘therapist’, but I am curious. Maybe in the future, the mental healthcare sector will be made up of both human and AI therapists.

References:

Ingram, D. (2023, January 17). AI Chat used by mental health tech company in experiment on real users. NBC News. Retrieved from https://www.nbcnews.com/tech/internet/chatgpt-ai-experiment-mental-health-tech-app-koko-rcna65110

Ross, E. M. (2024, October 25). What is Causing Our Epidemic of Loneliness and How Can We Fix It? Harvard Graduate School of Education. Retrieved from https://www.gse.harvard.edu/ideas/usable-knowledge/24/10/what-causing-our-epidemic-loneliness-and-how-can-we-fix-it

7 thoughts on “Can AI be your Therapist?”

  1. Really cool topic! I agree with you that AI “friends” or therapists can never fully replace human connection — no matter how advanced the tech becomes, an algorithm can’t truly empathize.
    At the same time, it’s wild to think that some people might actually prefer a chatbot because it’s cheaper, always available, and never judges. I also wonder what happens if people become more comfortable opening up to AI than to other humans: could that make real relationships even harder?
    Maybe the real danger isn’t just bad advice, but that we slowly outsource our emotions to tech…

  2. The topic of mental health is becoming increasingly important. However, many people still don’t have the confidence to talk openly about it, because prejudices and negative attributions persist. At the same time, there are too few therapy slots and long waiting times in many areas.

    In my opinion, artificial intelligence can offer meaningful support here. It can provide a low-threshold entry point for approaching the topic and taking the first steps toward self-reflection. However, it is important that the systems used are developed specifically for this purpose and are quality-assured. Using random, non-specialized AI systems without proper vetting could be counterproductive or even harmful.

    I agree with the author that artificial intelligence should not completely replace therapeutic care. Many emotions and subtle signals are difficult or impossible for artificial intelligence to recognize. Body language and nonverbal cues also provide valuable information that often only a human therapist can interpret correctly.

    Therefore, I also believe that a combination of human therapy and artificial intelligence support is the best approach. This can relieve the burden on therapists, give more people access to support services, and at the same time ensure the quality of care.

  3. It’s an interesting article and take, and you make some convincing points. AI is becoming increasingly relevant in the space of therapy, and indeed more and more people are reaching out to AI for help with therapeutic issues.

    A common criticism is that people need the inherent ‘human connection’ and that AI can therefore only be a tool for therapy rather than a complete replacement. This is also discussed in this article. However, I disagree.

    It is indeed true that AI can (presumably) not feel empathy and that a complete two-sided human connection with it is therefore impossible. However, is that what we need in therapy? Is it human connection and amicability that we are searching for? Maybe for some people the answer is yes, but I would argue that a lot of therapy is focused on solution-based thinking, where therapists mainly give advice and interpretations of the client’s problems.

    AI might actually be better at this than human therapists because of (1) a larger knowledge base and (2) virtually limitless memory, meaning it could potentially provide a complete replacement for human therapy given the right design. Besides, most people nowadays aren’t even in therapy. Wouldn’t algorithmic therapy be better than no therapy at all? I would therefore argue that solely relying on AI therapists would be a valuable option for some people.

    What are your thoughts on this? All in all your article is extremely interesting and I’m curious to hear your viewpoint.

    1. Very interesting take!

      I have to agree that some people probably would not necessarily want a human connection in therapy sessions. AI could possibly be better at diagnosing and interpreting thanks to its large knowledge base. However, I have to disagree that therapy with AI would be better than no therapy at all. There have been incidents where AI gave completely wrong advice and even encouraged people to harm themselves after they persistently asked the model whether that would be okay. I know that is a far-fetched example, but we still have to take those incidents into account when recommending that people use AI for therapy sessions. That is why I would encourage using AI only in the ‘early stages’ of the process. When things get serious, I would want the AI to recommend going to a human therapist.

  4. I too have been very concerned with the rise in people using AI as a therapist and asking it for advice in personal situations. Not only do I think this is a risk to privacy and security, but I think it also results in people receiving the wrong guidance. Your therapist might not have experienced exactly what you are experiencing, but they have the human intuition to empathise and thereby uncover further information about you and your situation; combined with their expert knowledge, this results in sound advice. AI, however, is not human and should not be treated as such. AI cannot read your body language as you unpack a difficult situation, and it cannot empathise. Thank you so much for writing this and starting the conversation!

    1. Strongly agree. I am no professional in the field of psychology, but I do have some understanding of how LLMs work, and with this knowledge I would never use an LLM as a therapist. An interesting read is this article: https://arxiv.org/abs/2212.03551, which I think describes one of the reasons why people would use an LLM as a therapist. The paper describes how we talk about LLMs in human-like terms, which leads us to see these systems as more human than they really are.

  5. Really insightful article and discussion, Shanna! I agree with most people here that AI will never fully replicate the empathy of a human therapist. On the other hand, the perception of being heard is already so important; we should not underestimate how much this matters. If someone feels safe enough to open up to AI instead of not talking to anyone, this can already be a step forward. The downside is that people may rely too much on AI, which could lead to less trust in other people. In my opinion, AI should be a bridge, so people practise talking about their concerns and are slowly guided toward the human care they need.
