If you’ve ever tried to find a therapist, you probably know this frustrating feeling: You spend your lunch break looking up providers who can meet you in person, only to realize that the few available in your area don’t have appointments after you get off work in the evening. So you turn to telehealth counselors, who are far more plentiful, but soon discover you’ll have to pay out of pocket. Out of options, you resign yourself to thinking mental health treatment is overrated, anyway.
You’re not the only one without access to therapeutic care: According to Mental Health America’s 2024 “State of Mental Health in America” report, there is currently only one mental health provider for every 340 Americans. The report also details that “60 million adults (23.08 percent) experienced a mental illness in the past year,” which means that those millions of people in need of help have had to sift through slim pickings. But some experts say that the solution to this crisis can be found in the tech boom on everyone’s minds: artificial intelligence.
We know it’s smart — it’s made us appear composed and confident in emails to our bosses and magically turned our pets into human-like avatars — but is it emotionally evolved enough to support us when we’re in the throes of grief, seeking support with family issues, struggling in a romantic relationship, or facing other complex problems? We may be on the verge of finding out. As the shortage of therapists deepens, A.I.-powered tools are emerging as a new kind of support system — one that’s always on, always accessible, and increasingly capable.
As bots step in, researchers and medical professionals are still exploring the benefits, risks, and implications. Among them is Michael V. Heinz, who coauthored a breakthrough clinical trial of an A.I. therapy chatbot used to treat adults with depression, anxiety, or a clinically high risk for eating disorders. Meanwhile, over at Abby, an A.I. therapy site, chief of staff Jason Ross is on the cutting edge of this burgeoning industry. And at the American Psychological Association, Vaile Wright, Ph.D., senior director for health care innovation, is weighing the benefits and drawbacks of these current technologies.
Before you spill your guts to a bot, here’s what to know about its possibilities and limitations.
What is A.I. therapy?
A.I. therapy is the use of an A.I. bot for mental health support, either as a supplement to or in place of a traditional therapy journey. Instead of setting up appointments to visit a licensed professional in a real office, people go online to find free or low-cost A.I. tools that will listen to their problems.
Those options run the gamut, and not all of them are actually designed for therapy. Plenty of curious users have begun treating ChatGPT as an ad hoc therapist, while others have been confiding in flashy entertainment bots like Replika or Character.AI. These platforms weren’t developed by clinicians, and though they can respond to emotional, health-related questions, relying on them is risky precisely because they weren’t designed to provide mental or emotional support.
There’s a growing group of bots specifically designed to respond as therapeutic companions, though none of this technology has been approved by the FDA yet. These bots are well versed in familiar methods of treatment, like Cognitive Behavioral Therapy, and they’re available to chat any time of day or night. The vast majority communicate with you on your phone or computer through text, but some platforms, like Clare & Me, can even talk via a phone call.
A.I. therapist Abby speaks 26 languages and can draw on different types of therapy (she’s trained in both Gestalt Therapy and Psychodynamic Therapy, for instance). You can text her on your phone or computer 24/7, and she always maintains a calm, kind tone. “Abby has been built by clinical psychologists that we have on staff,” Ross says. “Its main and only focus is to be your mental health companion, all for the cost of a Netflix subscription.”
“We’re not trying to replace human therapy,” Ross says of Abby. “Our goal is [to help] people who can’t access human therapy or need a supplement to human therapy.”
He acknowledges that it has become a substitute for some, however. “A lot of our users can’t afford traditional therapy. For them, it’s not a decision between Abby and a human therapist. It’s Abby or nothing.”
Assuming our readers would be curious about the experience, I signed up for an account. When I interact with Abby, the first thing I notice is that she’s consistently even-keeled — while a human therapist might have a stomachache or a bad hair day that would leave them in a noticeable funk, Abby reliably arrives at each session with a calm, straightforward tone.
“Since this is our first time chatting,” Abby explains to me, “I’ll ask a few questions to better understand the situation that’s bothering you. If at any point you want advice, a different perspective, or to change topics, just let me know. This is your space and you can tell me what you want from it.”
When I admit to Abby that I’m stressing over the pros and cons of A.I. therapy (it’s true, I am), she swiftly delivers an analysis of my behavior. In seconds, she replies, “Your uncertainty about the article’s conclusion hints at a potential need to build confidence in trusting your own opinions.”
After a few more minutes of dialogue, the A.I. crafts a custom roadmap to help me assuage my stress and build confidence as I forge ahead. Then I hit the limit on how many messages I can send for free; if I want to keep chatting, I have to start a trial.
Dr. Heinz’s trial chatbot, Therabot, operated somewhat similarly to Abby: Participants interacted with Therabot through a smartphone app, typed out responses to prompts about their emotional well-being, and could initiate conversations whenever they needed to chat. But Therabot is the first of its kind to undergo a clinical trial, which means this technology is one step closer to being proven to help users with mental health concerns.
Why is A.I. therapy gaining ground?
“There is a workforce shortage when it comes to the delivery of mental and behavioral healthcare,” says Dr. Wright, “and it’s across all providers. So if you look at psychologists, master’s level trained providers, and psychiatrists, all the projections out of HRSA [the Health Resources and Services Administration] suggest that the demand for services greatly outweighs the supply of providers.”
Aside from the shortage, there are also plenty of people who can’t access therapy because of common coverage issues. Dr. Wright remarks, “It’s not just that we don’t have enough providers in the right places, but also that providers are currently very disincentivized from taking insurance. Reimbursement rates are so low and the administrative burden is very high. So what that means is those with insurance have even less access to mental health care than those who can pay out of pocket. It’s a very challenging problem.” In contrast, A.I. is much cheaper to use; Abby is even free if you forgo the premium features.
Plus, the effects of the loneliness epidemic may also be tied to the rise in A.I. therapy. “We’re in a period where people are lonelier than you can even imagine,” Ross says. “A lot of people [don’t have anyone] to turn to. Their family dynamics might not be great, or they don’t have a ton of friends.” Being able to talk to a kind, receptive listener can make all the difference, even if that listener isn’t human.
“It’s amazing to see when we get feedback,” Ross explains. “We’ll have users email us and just thank us for being the first [resource] that they’ve ever gotten therapeutic benefits from.”
And there is some proof that properly managed therapeutic A.I. can make a positive difference. The 2025 clinical trial of Therabot included 106 people, and those diagnosed with major depressive disorder reported a 51 percent average reduction in symptoms. Those with generalized anxiety disorder reported a 31 percent average reduction in symptoms. Plus, those at clinically high risk for eating disorders also showed improvement — a 19 percent average reduction in concerns about body image and weight.
Strikingly, despite its artificiality, users felt in sync with Therabot. “A secondary result was the therapeutic alliance,” Dr. Heinz explains. “We administered a scale to have participants rate their relationship with Therabot. They rated that very highly. They felt a shared goal with Therabot.”
“That was the reason for running the trial,” Dr. Heinz says. “So we can really make the claim that Therabot treats clinical-level mental health symptoms.”
Eventually, Dr. Heinz and his colleagues want Therabot to be the first A.I. chatbot approved by the FDA: “That’s what ultimately gives you the ability to even market it as something that treats a disorder. Otherwise, you remain in the wellness space. To move into the zone where you can say, ‘Yes, we are evidence-based to treat this particular [concern],’ you have to have the FDA behind that.”
What are the downsides of A.I. therapy?
When you’re dipping your toe into new technologies, maintaining your privacy is a priority. “A lot of people are putting personal health information into these bots,” Dr. Wright points out. “But what if a breach happened? Do you really want your boss, for example, to know that you’re chatting to an A.I. bot about your alcohol use or your infidelity? People are potentially putting themselves at risk.”
But the major downside of using A.I. for mental health treatment is that not all of this technology is created equal. While Therabot has undergone a clinical trial, ChatGPT hasn’t.
“If you’re a companionship app like Character.AI or you’re just ChatGPT, none of that space is regulated at all. Nobody’s ensuring that what they’re producing is safe or effective,” Dr. Wright explains. And since mental health can be so high stakes, a lack of regulation can be very dangerous. Per the APA, “In two cases, parents filed lawsuits against Character.AI after their teenage children interacted with chatbots that claimed to be licensed therapists. After extensive use of the app, one boy attacked his parents and the other boy died by suicide.”
Because of this potential for A.I. to go off the rails, human oversight is crucial; Dr. Wright confirms that A.I. can’t treat people all by itself: “It does require a fair amount of human oversight because of its potential and likelihood to engage in hallucinations or fabricating information in a way that could be unhelpful, but at worst could be harmful.”
Human supervision was used during Therabot’s trial: “We let the A.I. respond, but we reviewed messages. If we needed to intervene, we could contact the participant and do whatever was needed,” says Dr. Heinz. But would that be sustainable for every mental health service, especially if there’s already a shortage of clinicians who have the expertise to oversee this kind of work?
“The goal is to train these A.I. to get good at establishing efficacy and safety benchmarks, and to make sure these models are meeting those benchmarks. Then you probably get to a place where you have gradual reduction in that supervision,” says Dr. Heinz. But there’s no certainty as to when or if that will happen.