AI in Mental Health: A New Frontier for Therapy and Support (With Author’s Commentary)


Author’s Note: This article was originally published on kevinmd.com. It is being reposted here with additional author’s commentary.


Image showing mental health support through a combination of therapeutic modalities including therapy, meditation and medication

What do recent trends in AI therapy research tell us about the future role of AI in mental health support?


The mental health crisis in America continues to intensify. Therapist shortages, high costs, and long waitlists leave millions of people without access to therapy. As a cognitive scientist who analyzed therapy transcripts in grad school and now develops AI mental health applications, I’ve observed artificial intelligence emerging as a complementary solution to these accessibility challenges.

Historical context and growing evidence

Screenshot of the ELIZA chatbot program from 1966

ELIZA was the first AI therapy chatbot and one of the first natural language processing programs.

The evolution of AI in mental health has been remarkable. From ELIZA in 1966, which merely parroted back user statements as questions, we’ve advanced to sophisticated systems capable of meaningful therapeutic interactions. Recent research demonstrates that these modern AI interventions can produce measurable benefits.

A 2023 meta-analysis found significant reductions in depression symptoms among people using AI-based mental health chatbots. Another study, in the New England Journal of AI, revealed that in couples therapy scenarios, ChatGPT responses compared favorably to human therapists’ responses on several therapeutic metrics. And a number of recent articles have demonstrated the clinical efficacy of these apps, including a randomized controlled trial showing that an AI therapy app effectively reduced symptoms of depression, anxiety, and eating disorders compared to waitlist controls.

These tools are certainly not a cure-all, but mounting evidence indicates that AI can produce clinically significant positive outcomes, especially for individuals who might otherwise receive no help at all.


Author’s commentary: AI therapy has a surprisingly rich history in both the real world and in fiction.

There were several early NLP programs developed in the mental health space besides ELIZA, such as a program developed by John Greist in the 1970s to interview potentially suicidal patients, which reportedly was more accurate than humans at predicting actual suicide attempts.

AI as a potential mental health tool has been extensively explored in the realm of science fiction, of which I’m an avid fan. Perhaps the first example is from Asimov’s short story “Satisfaction Guaranteed,” in which a robot recognizes a person’s low self-esteem and provides emotional support. In the 1971 film THX 1138, there are computer confessional booths where people can emotionally vent. And in Frederik Pohl’s Gateway (a novel I loved when I read it), about half the story involves therapy sessions between the protagonist and the robot therapist Sigfrid Von Shrink.


Understanding the advantages and limitations of AI

A robot scientist analyzing data

While AI chatbots will never be a substitute for authentic human connection, they offer benefits that humans can’t match.

Traditional therapy offers clear benefits: trained professionals with genuine empathy, an understanding of complex human emotions, and the ability to handle crises. AI chatbots do not grasp the nuances of human emotions and experiences (and depending on who you ask, don’t truly “understand” anything at all), and they aren’t equipped to manage severe mental illness the way a professional can.

However, AI mental health tools offer distinct advantages that address current gaps in care:

Accessibility: Available 24/7, often at a fraction of traditional therapy costs. For the millions lacking access to mental health care due to financial constraints, geographic limitations, or provider shortages, AI offers immediate support.

Reduced barriers: Many people find it easier to initially engage with an AI system due to reduced stigma and fear of judgment. This can serve as an entry point to the mental health system for those hesitant to seek traditional care.

Consistent support between sessions: People are already using chatbots or AI coaches as a supplement to therapy—checking in during a stressful night when their therapist isn’t available or using AI-guided exercises between weekly sessions. In fact, supplementary tools such as CBT workbooks and mindfulness exercises aren’t even a new phenomenon; AI is really just the next step in their evolution.


Author’s commentary: It is easy to read past the concept of accessibility in articles about mental health, so I want to emphasize just how serious a mental health crisis we are facing in the United States (and much of this applies worldwide).

About one in five American adults experience a mental health condition each year, and only about a quarter of Americans’ mental health needs are met annually. Just imagine if the same were true for other health issues: if we only had the resources to treat 25% of appendicitis cases each year, there would be riots in the streets. And the barriers preventing people from getting support are truly massive:

  • Traditional therapy is unaffordable for many (perhaps most) people. Most insurance offers little or no mental health coverage, and a single session typically costs $150 to $300. In my personal experience, I could not afford therapy for over a decade of my adult life as a grad student and then a poorly paid postdoc. And even once I was pulling a tech salary, seeing a therapist felt like a pretty extravagant expense to me.

The only logical conclusion is that we need to rethink what mental health care looks like in the future if we’re to provide our citizens with sufficient mental health resources.


Beyond digital replicas: Novel therapeutic approaches

In my opinion, the most significant potential of AI in mental health isn’t simply in mimicking traditional therapy but in creating entirely new supportive experiences. For example, several mental health apps now combine AI therapy with guided meditation, another evidence-based approach to improving mental health. The app I developed takes this further by creating personalized meditation experiences generated specifically for each user’s situation—an integrated approach that very few traditional therapists can provide. Other applications use AI to recommend targeted exercises based on tracking data, deliver interactive cognitive behavioral therapy tools, or provide customized journaling prompts at optimal times.

These integrated approaches combine multiple evidence-based techniques to deliver personalized support that adapts to user needs—something traditional mental health care simply isn’t structured to provide.
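To make the personalized-meditation idea more concrete, here is a minimal sketch of how such a pipeline might assemble a user-specific script request for a language model. All names here (UserContext, build_meditation_prompt) are hypothetical illustrations of the general approach, not Wellness AI’s actual implementation:

```python
from dataclasses import dataclass

@dataclass
class UserContext:
    """Hypothetical snapshot of what an app might know about a user."""
    name: str
    current_concern: str          # e.g., "work stress", "trouble sleeping"
    recent_session_summary: str   # carried over from prior conversations
    preferred_length_min: int

def build_meditation_prompt(ctx: UserContext) -> str:
    """Assemble an LLM prompt for a meditation tailored to this user.

    A real app would send this prompt to a language model and feed the
    resulting script to text-to-speech; here we only build the prompt.
    """
    return (
        f"Write a {ctx.preferred_length_min}-minute guided meditation script "
        f"for {ctx.name}, who is currently dealing with {ctx.current_concern}. "
        f"Context from their last session: {ctx.recent_session_summary}. "
        "Use evidence-based techniques (breath focus, body scan, "
        "cognitive defusion) and a calm, non-clinical tone."
    )

ctx = UserContext(
    name="Alex",
    current_concern="anxiety before a job interview",
    recent_session_summary="practiced reframing catastrophic thoughts",
    preferred_length_min=10,
)
print(build_meditation_prompt(ctx))
```

The same pattern—structured user context feeding a generation step—underlies the other integrations mentioned above, such as targeted exercise recommendations or timed journaling prompts.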


Author’s Note: One of the reasons I made personalized meditation a core feature of Wellness AI is that features which combine (a) evidence-based techniques and (b) experiences that cannot be replicated in traditional settings are what truly set AI mental health support apart, beyond the accessibility and affordability advantages mentioned previously.


The future landscape of AI-assisted mental health

Cartoon of a friendly robot psychologist

In the future, I expect health care systems to favor hybrid treatment approaches combining clinicians and AI.

Looking forward, AI will likely become an increasingly standard component of our mental health care system. I expect health care organizations to favor hybrid treatment approaches in order to address resource constraints: AI providing ongoing support between appointments, with human clinicians supervising and focusing on complex cases.

For people with milder issues, the majority of their support may come through AI-powered tools, with occasional human therapist sessions for deeper work. This approach could be necessary to meet growing demand within existing financial constraints of our health care system.

As technology evolves, AI systems will become more sophisticated at analyzing patterns in user language, voice tone, or interaction styles to detect concerning changes that might indicate worsening conditions. These capabilities will enable earlier intervention and more personalized support. It’s hard to know exactly how good AI will get, or how all of this will play out, but I expect we will see AI tools that provide increasingly personalized support and become ever more integrated into our lives.

Ethical considerations and safeguards

As we integrate AI into mental health care, robust safeguards remain essential:

  • Privacy protection: Mental health data requires stringent security measures and transparent data policies.

  • Crisis protocols: Quality mental health apps implement clear protocols for crisis situations. Most current apps, for example, will refer users to a crisis line if they mention suicidal thoughts—an important safety feature.

  • Professional oversight: Involving mental health professionals in designing and supervising AI tools ensures alignment with therapeutic best practices. I anticipate that industry standards and regulations will continue to develop to certify AI mental health tools, similar to other digital health interventions.


Author’s Note: As someone who takes these issues extremely seriously, I spent a lot of time factoring them into the Wellness AI app:

  • Privacy: To ensure user privacy, we went to great lengths to design Wellness AI from the ground up to store user data only on individuals’ own devices. In order to provide users with a cohesive, personalized experience, memories of their conversations and meditations needed to be stored somewhere. The easiest way to do this from a technical perspective would be to store them in the cloud. But I know that even with the best security practices in place, no system is totally secure, so the best thing for my users’ privacy was for us to simply never store or have access to user data ourselves. This created some technical complexity, but we worked out a system whereby our AI therapist can remember users’ conversations without our ever holding their data or putting their privacy at risk (a minimal sketch of this on-device approach appears after this list).

  • Crisis Protocols: Both Wellness AI and many competitor apps have mechanisms in place for identifying concerning user behavior for which an AI is not appropriate. Most will correctly identify language related to self-harm, but it remains an open question what to do when this behavior is detected. Wellness AI currently surfaces a popup that directs users to professional resources and appropriate hotlines, and ensures that our AI therapist provides appropriate contextual responses in conversation. But we do not immediately discontinue chats, because doing so can lead to terrible user experiences if the crisis detection flags something incorrectly (see the crisis-handling sketch after this list). I’m very curious to see what standards emerge as the field grows.

  • Professional Oversight: As the creator of Wellness AI, I consider it important that I have an appropriate background for building a mental health app. I hold a PhD in psychology, have published multiple papers analyzing real-world therapy transcripts using AI, and have extensive knowledge of the fields of mental health and artificial intelligence. I’m not a therapist myself, nor am I trained as one, but I consulted with multiple therapists in developing the app and have tested it with both mental health and medical professionals.
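To illustrate the on-device design described in the Privacy item above, here is a minimal sketch assuming memories are kept in a local JSON file on the user’s device. The class name and file layout are hypothetical, not Wellness AI’s actual code:

```python
import json
from pathlib import Path

class OnDeviceMemoryStore:
    """Hypothetical sketch: conversation memories live only in a local
    file on the user's device and are never sent to a server. A production
    app would additionally encrypt this file with a key held in the
    platform keystore (e.g., iOS Keychain or Android Keystore)."""

    def __init__(self, app_dir: Path) -> None:
        app_dir.mkdir(parents=True, exist_ok=True)
        self._path = app_dir / "memories.json"

    def load(self) -> list:
        # Read all stored memories; an empty list if none exist yet.
        if not self._path.exists():
            return []
        return json.loads(self._path.read_text())

    def append(self, memory: dict) -> None:
        # Persist a new memory locally; no network call anywhere.
        memories = self.load()
        memories.append(memory)
        self._path.write_text(json.dumps(memories, indent=2))

# Usage: the assistant builds its conversational context from local
# memories, so personalization never requires server-side storage.
store = OnDeviceMemoryStore(Path.home() / ".wellness_demo")
store.append({"session": 1, "note": "user prefers evening meditations"})
print(store.load())
```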
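And here is a similarly hedged sketch of the crisis-protocol flow from the Crisis Protocols item: flag concerning language, surface resources, and keep the conversation going rather than terminating it. The keyword patterns and function names are purely illustrative; a real system would rely on a trained classifier with human-reviewed thresholds rather than a keyword list:

```python
import re

# Illustrative patterns only; not a production-grade detector.
CRISIS_PATTERNS = [
    r"\bkill myself\b",
    r"\bsuicid(e|al)\b",
    r"\bself[- ]harm\b",
]

CRISIS_RESOURCES = (
    "If you are in crisis, please reach out for professional help. "
    "In the US, you can call or text 988 (Suicide & Crisis Lifeline)."
)

def detect_crisis(message: str) -> bool:
    """Flag messages containing crisis-related language."""
    return any(re.search(p, message, re.IGNORECASE) for p in CRISIS_PATTERNS)

def ai_reply(message: str, crisis_context: bool) -> str:
    # Stub for the underlying model call; crisis_context would steer the
    # model toward supportive, safety-aware responses.
    return "(supportive, safety-aware reply)" if crisis_context else "(normal reply)"

def handle_user_message(message: str) -> list:
    """Surface resources on a crisis flag but keep the conversation going,
    since ending the chat on a false positive would abandon the user."""
    if detect_crisis(message):
        return [CRISIS_RESOURCES, ai_reply(message, crisis_context=True)]
    return [ai_reply(message, crisis_context=False)]

print(handle_user_message("I've been thinking about self-harm lately"))
```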


The path forward

Having spent many years at the intersection of psychology and AI, I’m optimistic about technology’s potential to widen access to mental health support. AI won’t replace the human connection at the heart of both therapeutic and personal relationships, but it can significantly expand access to care and create innovative supportive experiences.

The key is to embrace what these tools do well while remaining cognizant of their limitations, and to focus on creating complementary systems where technology and human providers each contribute their unique strengths to addressing the enormous unmet need for mental health support.

-Tim

Creator of Wellness AI

Get Personalized AI Mental Health Support

Try Wellness AI for AI therapy and personalized meditations that draw on evidence-based techniques to help with anxiety and depression symptoms. Created by a PhD in psychology.
