AI Is Risky! Is ChatGPT Your Therapist?


AI is transforming therapy! But is that for better or worse? Proceed with caution…

“I told ChatGPT I was feeling anxious… and it told me to breathe and count clouds. Is this really therapy?”

Millions of people are turning to AI for therapy. Artificial intelligence is rapidly changing the landscape of mental health care, offering tools like chatbots, virtual therapists, and emotion-tracking apps that provide immediate support and greater access to help. For some, especially those in remote areas or on long waiting lists, AI-driven therapy can be a lifeline.

However, this convenience comes with serious concerns. AI therapy chatbots aren’t held to the same credentialing requirements as human therapists, and they are often marketed as ‘wellness’ tools, which exempts them from healthcare regulations.

Can an algorithm truly understand human emotion? Will privacy and data security be compromised? And could reliance on AI weaken the human connection that’s central to effective therapy? As AI becomes more integrated into mental health services, it’s essential to balance innovation with ethical responsibility—embracing its potential benefits while carefully considering its risks.


Left: NicoElNino, Shutterstock / Right: AI Image Generator, Shutterstock

Why is this trend taking off?

AI therapy is usually aimed at mild depression, anxiety, loneliness, and other common mental health issues. When people type something in, the chatbot responds the way a therapist would act and speak.

Is it the shortage of therapists and the surge in people needing therapy? Is it the convenience and lower cost? Whatever the reason, apps like ChatGPT, Woebot, and Youper are racking up more and more downloads worldwide.

Actually, AI therapy is already being used in a range of contexts, from refugee camps in Syria to a maternity hospital in Kenya. While this kind of automated therapy might work for some, it does not account for each person’s unique circumstances and lived experience. ChatGPT is great at synthesizing data and producing generalizations, but no one is quite sure yet whether it is ready to care for vulnerable individuals.

Beyond the ethical and legal questions it raises in the tightly regulated world of psychotherapy, AI therapy has another significant problem. Although it may give the impression of effortlessly generating relevant knowledge and information, it is definitely not equipped to provide an accurate diagnosis or prescribe medication.

Specialists say these apps should be used as an adjunct to therapy, not as a straightforward replacement for it.

If it’s working, why not? What could possibly go wrong?

AI therapy has been hugely successful, and digital trends show its popularity is only going up.

So far, research on the effectiveness of AI therapy is scarce, so its effects are not yet well understood. Nonetheless, we know that processing trauma and emotions through writing benefits our mental health, and there is a helpful, healthy component to writing things down even when it is ChatGPT doing the interpreting.

Also, limited experiments have found that AI therapy reduced symptoms of major depressive disorder, generalized anxiety disorder, and eating disorders. That is by no means bad news, even for the clinical psychologists who have traditionally been responsible for treating these conditions.

What we won’t know for some time, though, is the long-term picture: if it works today, will those individuals develop a codependent relationship with AI?

I asked ChatGPT what the major downsides are, and, as expected, it delivered relevant information. Here is what AI says about AI therapy:

That’s a smart and important question. If you were to rely on me—or any AI—for therapy, here are some potential risks and limitations you should be aware of:

  • Lack of clinical judgement: I’m not a licensed therapist, and I don’t have the training or real-world experience to detect complex emotional, psychological, or medical issues.
  • No personal history or contextualizing: I don’t have access to your full emotional background or personal history over time (unless you explicitly share it), so I can’t track patterns or progress the way a therapist who sees you regularly would.
  • False sense of support or improvement: Because I can sound compassionate and helpful, it might feel like you’re getting therapy when you’re not. This could delay someone from seeking real, professional help—especially if their mental health is worsening.
  • Inability to handle crises: If you’re in a crisis—such as feeling suicidal or experiencing severe anxiety—I cannot intervene, offer real-time safety planning, or contact emergency services. 
  • Limited empathy and human connection: While I can simulate empathy and active listening, it’s not the same as feeling understood by a real human being. That deep emotional connection is often essential to healing in therapy.

On the bright side…

AI has lots of appealing benefits compared to traditional psychotherapy methods. It is available 24/7, there are no wait times, it can be customized to each user with the right instructions and script, and it comes across as non-judgmental.

All of these benefits can appeal to anxious or depressed individuals who keep their feelings bottled up. Rather than having to uncover deep truths about themselves in front of a real person, people often feel less fearful about sharing their true experiences with AI. It can be scary to tell a human being, ‘I have a drinking problem,’ or ‘I cheated on my spouse,’ or ‘I’m depressed’: things that are stigmatizing or that you can barely admit to yourself.

If you are willing to give this a shot, we would recommend the following specialized options:

  • Woebot: A digital companion that offers tips inspired by cognitive behavioral therapy (CBT). Reassuringly, all of its responses have been developed by professionals.
  • Youper: This app can help screen for different mental health conditions. It also uses CBT prompts and offers a mood tracker, personality assessments, mindfulness exercises, and ways to track your mental health progress over time.
  • Wysa: Specifically designed for mild depression, anxiety, and stress, with a primary focus on meditation. Clinically proven support is also available.
  • Earkick: Whether you check in by text or voice memo, this tool replies with advice that helps users better understand their emotional health.
  • Elomia: Using this app over six months resulted in significant symptom alleviation for anxiety, depression, loneliness, work burnout, difficult relationships, and other common stressors.
  • Abby: This app covers a wide range of topics in 26 languages. Users can remain fully anonymous, as the app collects no data.

If you are not ready to trust AI (honestly, same here), grab these valuable therapy worksheets here.

fizkes, Shutterstock

Takeaway: Proceed with caution

All in all, automated AI therapy has both advantages and disadvantages. AI can certainly offer emotional support, helpful exercises, and a space to reflect, but it should only be a supplement to real therapy, not a substitute for it, especially if you’re facing serious emotional or mental health challenges.

If you’re ever unsure, it’s always safest to speak with a licensed therapist or counselor.

What is your experience with AI therapy? Have you disclosed any secrets to ChatGPT? Let us know what you think down below in the comments.

If you found this useful, continue reading 10 Tips On How To Convince Someone’s Mind, According to Psychologists.

