Using ChatGPT as a Therapist: What the Research Actually Says

Author: Dr. Timothy Rubin, PhD in Psychology, founder of Wellness AI

Originally Published: April 2026

Last Updated: April 2026

[Image: Person sitting with a laptop, thoughtfully typing into a chat interface in a warm, calm home setting]

Lots of people are trying ChatGPT as an informal therapist. Here's an honest look at what works, what doesn't, and where the line is.

If you have ever typed "I feel anxious" into ChatGPT at 2 a.m., you are not alone. Using ChatGPT as a therapist—or at least as an informal mental health sidekick—has become surprisingly common, with people drawn by instant access, zero cost, and no waitlist. "ChatGPT therapy" isn't a real category yet, but a lot of people are quietly trying it. The question isn't whether they are. It's whether it actually helps, and where the honest limits are.

This article takes a fair and balanced look at what general-purpose chatbots like ChatGPT can and can't do for mental health, what the research suggests, and where the line is between treating ChatGPT as a useful reflective tool and treating it as a therapist substitute that could quietly make things worse.

The short version

Good for: psychoeducation, journaling prompts, CBT-style reframing practice, between-session reflection.

Not good for: crisis care, diagnosis, ongoing therapy, replacing a human clinician.

Why People Are Trying ChatGPT for Therapy

The reasons are easy to understand. The World Health Organization notes that most people who would benefit from mental health care don't receive it, largely because of cost, stigma, and limited access to trained clinicians. Therapy waitlists are long in many places, and sessions are expensive. ChatGPT, by contrast, is free, available at 3 a.m., and doesn't require you to explain yourself to a receptionist.

For someone overwhelmed by anxiety or struggling to sleep, the friction of typing into a chat window is much lower than the friction of booking an appointment. That combination—24/7 access, anonymity, and zero cost—explains much of the surging interest in using ChatGPT for mental health support. Accessibility is real, and it matters.

What the Research Actually Shows

[Image: Abstract illustration representing a large language model trained on text, with warm glowing nodes connected in a network]

Early research suggests chatbot-based support can help some people feel better in the short term—but most of the strongest evidence is about tools built specifically for mental health, not general-purpose ChatGPT.

A growing body of peer-reviewed work suggests that chatbot-based conversational tools can reduce short-term distress, anxiety, and low mood compared with no support at all. Recent reviews of clinical trials have pointed to modest but real benefits when people use mental health chatbots regularly, particularly for guided reflection and psychoeducation.

Some of the strongest evidence so far comes from purpose-built tools, not general-purpose ones. Dartmouth researchers recently published a randomized trial of an app designed specifically for AI therapy, which showed meaningful improvements for people with depression and anxiety. That app, however, was purpose-built for mental health support; ChatGPT is a general assistant that happens to be good at conversation. We cover this and other recent studies in our overview of AI therapy research for anxiety and depression.

At the same time, mental health professionals and researchers broadly agree that consumer AI chatbots and wellness apps should not be treated as a substitute for qualified mental health care, though they may play a supplementary role for some people. The American Psychological Association has written on this at length. Our guide to AI therapy versus traditional therapy goes deeper on that comparison.

The honest summary: ChatGPT can sometimes help you feel better in the moment, but "feels supportive in a conversation" is not the same as "produces durable mental health improvement"—especially compared with tools and clinicians designed specifically for that job.

Where ChatGPT Genuinely Helps

Used with realistic expectations, ChatGPT is a surprisingly capable mental-health sidekick for a handful of specific tasks. Here's where it tends to shine.

Psychoeducation

If you want a plain-language explanation of what panic attacks are, how cognitive distortions work, or what the difference is between CBT and ACT, a well-asked question to ChatGPT can give you a solid primer. It's a fast way to build literacy about your own experience.

Practicing Cognitive Reframing

One of the core skills in cognitive behavioral therapy is catching an anxious thought and examining it. ChatGPT can role-play a "devil's advocate" and help you pick apart catastrophic thinking. If you want a grounded overview of the techniques themselves, see our guide to evidence-based CBT techniques.
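If you want to try this, the wording of your prompt matters, because ChatGPT's default is to agree with you. The sketch below is purely illustrative (the anxious thought and the exact phrasing are made up); the key moves are naming the skill, supplying a concrete thought, and explicitly asking for challenge rather than reassurance:

```text
I'm practicing cognitive reframing. Here is an anxious thought:
"If I make a mistake in tomorrow's presentation, everyone will
think I'm incompetent."

Act as a devil's advocate, not a cheerleader:
1. Name any cognitive distortions you see (e.g., catastrophizing,
   mind reading).
2. Ask me two or three Socratic questions about the evidence for
   and against this thought.
3. Do not simply agree with me or reassure me; challenge the thought.
```

The last instruction is the important one: it works directly against the agreeableness problem discussed later in this article.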

Journaling and Between-Session Reflection

People already in therapy often describe using ChatGPT as a structured journaling partner—somewhere to unpack a hard day before their next session. That framing is low-risk and can genuinely be useful.

Worksheets and Exercise Scaffolding

ChatGPT is good at generating CBT-style worksheets, values clarification prompts, or grounding exercises on demand. Think of it as a flexible workbook rather than a clinician.

Where ChatGPT Falls Short for Mental Health

[Image: Chat bubbles, with earlier bubbles softer and less defined, representing memory that wasn't designed to track emotional progress]

Many general-purpose chatbots now remember details across sessions, but they're not designed to focus on mental health progress.

The limits of ChatGPT as a therapy tool are structural, not just cosmetic. They come from how the underlying technology was built.

Memory That Isn't Built for Therapy

ChatGPT and Claude now remember details about you across sessions, which is genuinely useful. But that memory was designed to remember preferences and facts—your name, your job, the side project you're working on. It wasn't designed to track emotional patterns, therapy goals, or progress on specific cognitive and behavioral techniques. A therapist (or a tool built for therapy) maintains a very different kind of context: what you are working on, what's shifting, and what to revisit next week.

Not Tuned for Therapeutic Support

By default, ChatGPT is a broad, general-purpose assistant. It will happily help you debug code, plan a trip, or draft an email, and the same conversational style carries over when you start talking about your feelings. It hasn't been fine-tuned to specialize in evidence-based therapy frameworks or to sit with difficult emotions the way a therapeutic tool is designed to.

It Tends to Be Agreeable—Sometimes Too Agreeable

General-purpose chatbots are optimized to be helpful, conversational, and pleasant to talk to. That makes them friendly. It also means they can be a little too quick to agree with you, validate whatever frame you bring to the conversation, or follow you down a line of reasoning without pushing back. A good therapist gently challenges distorted thinking; a chatbot designed to be agreeable can quietly reinforce it. This is one of the more subtle limitations of using ChatGPT for emotional support, and it's worth paying attention to whether the conversation ever tells you anything you don't want to hear.

No Session Framing or Takeaways

A good therapy session usually ends with something: a reflection, a small experiment to try this week, a concept to sit with. General-purpose chatbots aren't designed to frame conversations this way by default—you'd have to prompt for it explicitly every time. Once you close the tab, there's typically no structured takeaway, no gentle nudge back to what you were working on, and no narrative arc from one "session" to the next.

Not Bundled Around Mental Health Features

If you ask, ChatGPT can generate a guided meditation script, draft a mood log, or summarize a conversation. What it doesn't do is bundle those things together as integrated, persistent features the way purpose-built mental health apps do—guided meditations you can return to, mood tracking that builds over time, session summaries that carry into the next conversation, structured exercises from evidence-based frameworks.

Lighter Safety Handling

General-purpose chatbots have improved their safety filters over time, and ChatGPT will often point people toward crisis resources when certain phrases come up. But the handling is lighter than what a mental-health-specific tool is designed around—there isn't a dedicated crisis protocol, a clinical escalation pathway, or a workflow designed for mental health emergencies.

Important Safety Considerations

A few things worth keeping in mind before handing any AI tool a real mental health moment.

Crisis situations. Broadly speaking, neither ChatGPT nor apps built specifically for emotional support are the right place for a crisis. Suicidal thoughts, self-harm, and acute mental health emergencies should be handled by crisis hotlines or by medical professionals trained to respond to them. There are many hotlines available in the United States and internationally (see also the IASP Crisis Center Directory).

Privacy. ChatGPT is not confidential in the way a therapy session is. Depending on your settings, chats may be stored and used to improve future models. If you want more privacy, review OpenAI's Data Controls and consider using Temporary Chat for sensitive conversations — Temporary Chats aren't used for training and don't create memories, though they may still be retained briefly for safety review. Either way, treat the chat window more like a notes app than a confidential therapy room.

Over-reliance. The most subtle risk isn't that ChatGPT gives bad advice—it's that it feels supportive enough that you put off seeking real help. If you're coping worse over weeks or months, that's a signal to talk to a human.

What a Purpose-Built AI Therapy App Looks Like

Quick disclosure before this section: I'm the founder of a mental health app (Wellness AI), so I have an obvious stake in how this comparison goes. The criteria below are the same ones I'd use if I were recommending a tool to a friend, but feel free to apply them skeptically.

If you decide to try a mental-health-specific AI tool instead of general ChatGPT, there are a few things worth looking for—regardless of which app you pick:

  • Memory that tracks your progress, not just your preferences—what you're working on, what's shifting over time.
  • Evidence-based techniques like CBT, ACT, or mindfulness-based approaches, with decades of outcome research behind them.
  • Structured exercises—thought records, grounding practices, guided meditations—not just open-ended chat.
  • Clear safety handling for crisis situations and honest limits on what the tool claims to do.
  • Personalization that adapts to what you're actually struggling with rather than defaulting to generic responses.
  • Data privacy. Where does your data live—on your device, or on a backend where it could be exposed in a breach? Can the people who built the app read your conversations? Is your data used to train or improve their models? These questions matter a lot more for mental health content than for, say, a recipe app.

There are a number of mental-health-specific tools in this space, including Woebot, Wysa, Youper, Ash, and our own Wellness AI. Each one takes a different approach, and it's worth trying a few to see what fits. Wellness AI, for example, remembers user details across sessions, surfaces insights from previous conversations, and generates personalized guided meditations based on the topics each user is working through.

When Human Therapy Is Still the Right Choice

No AI tool, purpose-built or otherwise, replaces a trained clinician when the situation calls for one. If you are dealing with trauma, a clinical diagnosis that isn't improving, medication questions, or persistent suicidal thoughts, a licensed therapist or psychiatrist is the correct answer.

A reasonable way to think about it: ChatGPT and AI therapy apps sit somewhere between "self-help book" and "journal with feedback." They are at their best when they support the work you're already doing on yourself—and at their worst when they are asked to replace professional care. For general strategies that pair well with either path, our guide to managing anxiety is a solid starting point.

Used thoughtfully, AI tools can make mental health support more accessible to more people. Used as a substitute for real care, they can quietly widen the gap between how someone is doing and what they actually need.

-Tim, Founder of Wellness AI


About the Author

Dr. Timothy Rubin holds a PhD in Psychology with expertise in cognitive science and AI applications in mental health. His research has been published in peer-reviewed psychology and artificial intelligence journals. Dr. Rubin founded Wellness AI to make evidence-based mental health support more accessible through technology.


FAQ: Using ChatGPT as a Therapist

Is ChatGPT a good therapist?

ChatGPT is not a therapist and was not designed for mental health care. It can be useful for psychoeducation, journaling prompts, and practicing cognitive reframing, but it was not built around therapy techniques and does not offer the structured safety handling that clinicians and some purpose-built mental health tools provide.

Can ChatGPT replace a real therapist?

No. ChatGPT isn't qualified or designed to diagnose, doesn't have a dedicated crisis protocol, and can't offer the continuity of care a licensed clinician gives. For persistent, severe, or crisis-level mental health concerns, a trained human therapist is the appropriate choice.

Does ChatGPT remember your conversations?

ChatGPT now remembers details about users across sessions, and Claude has similar features. But that memory was built for general preferences and tasks, not for tracking emotional patterns or therapy progress. It is not the same as a clinician remembering your history.

Can I use ChatGPT between real therapy sessions?

Yes, and many people do. Using ChatGPT as a structured journaling partner between sessions is a low-risk, potentially useful way to reflect, practice cognitive skills, or unpack a hard day before seeing your therapist.

Is it safe to share personal feelings with ChatGPT?

ChatGPT is not confidential in the way therapy is. Depending on your settings, chats may be stored and used to improve models. If you want more privacy, review Data Controls and consider using Temporary Chat for sensitive conversations.

Can ChatGPT help during a panic attack?

It can offer grounding prompts and breathing exercises in the moment, which some people find helpful. It is not a substitute for crisis care. If panic attacks are frequent or severe, talk to a clinician.

Can ChatGPT help with anxiety or depression?

It can help you learn about anxiety and depression, practice reframing thoughts, or reflect between therapy sessions. It should not be your only source of support for clinical anxiety or depression.

What should I do if I feel I am in a mental health crisis?

Reach out to a crisis line, not a chatbot. In the US you can call or text 988. If you are in immediate danger, call your local emergency number.


Curious about an app designed specifically for mental health support?

Wellness AI is a mental health app that learns about you and adapts over time using evidence-based techniques and personalized guided meditations.