Does AI Therapy Have a Future?

For more than a century, therapists have relied on the smallest of gestures—a pause, a nod, or a subtle smile—to build trust and help people heal. In the U.S., about a quarter of the population visits a mental health professional to address everyday concerns, as well as more serious issues like depression and eating disorders.

But a problem persists. Mental health issues are on the rise, and there aren’t enough therapists to treat everyone. One study found that about half of those seeking care in the U.S. are unable to access a mental health professional.

Enter artificial intelligence (AI), which promises to deliver therapy to the masses. “The goal is to introduce high-quality, evidence-based treatments and make them available to people who might not otherwise receive mental health services,” said Michael Heinz, an assistant professor of psychiatry at Dartmouth College in New Hampshire.

Heinz, along with colleague Nicholas Jacobson, an associate professor of biomedical data science, psychiatry, and computer science at Dartmouth, recently pushed the idea one step closer to reality. They, along with a team of psychologists and psychiatrists in the Center for Technology and Behavioral Health in the university’s Geisel School of Medicine, conducted the first-ever clinical trial of a generative AI-powered therapy chatbot.

The system, Therabot, yielded impressive results. Individuals diagnosed with depression experienced a 51% reduction in symptoms, while those suffering from generalized anxiety reported a 31% reduction in symptoms, according to findings published in the New England Journal of Medicine’s NEJM AI. Furthermore, individuals with eating disorders—among the most difficult conditions to treat—experienced a 19% reduction in concerns about body image and weight.

“About one-third of the population experiences a mental health issue in any given year,” said Jacobson, senior author of the study. “Unfortunately, many people fail to receive treatment because caseloads are high and waiting lists are long. Evidence-based AI-powered therapy supervised by humans can help address the gaps.”

Couch in the Cloud

Enter a typical therapist’s office and you’re greeted by a warm and comforting space. However, with an uptick in mental health problems stretching the traditional face-to-face therapy model beyond its limits, code is becoming the new couch.

“There’s enormous potential for therapy bots. AI doesn’t sleep, it’s always attentive, and it’s highly scalable,” said J. Ryan Fuller, a practicing psychologist and executive director of New York Behavioral Health, whose therapists treat patients with cognitive behavioral therapy, a type of psychotherapy that helps people change unhealthy thinking patterns and behaviors to improve their emotional well-being and cope with life’s challenges. “The key is to build in strong safeguards to ensure that the technology is used wisely and responsibly.”

To be sure, generative AI systems can spout errors, hallucinate, and deliver highly inappropriate or even dangerous responses. For example, chatbots have instructed young people to commit crimes and even to harm themselves. In 2023, the National Eating Disorders Association (NEDA) pulled its chatbot from a help hotline after the organization learned the AI system was dispensing dangerous advice about eating disorders.

In addition, researchers at Stanford University warned that AI therapy could contribute to harmful stigmas and perpetuate dangerous biases. There’s also the risk of developing an unhealthy dependency on AI. “We have more people staying at home, not going out to restaurants and avoiding interactions with other people,” Fuller said. “Therapy bots must balance emotional support with best practices in therapy and mental health. The goal isn’t to chain people to a virtual couch; it’s to give them the skills to function well in the real world.”

That philosophy is at the center of Therabot. When Heinz and Jacobson began working on the project in 2019, they recognized that large language models had enormous potential, but tuning them for evidence-based psychology was no simple task. Initially, the pair dropped video from actual therapy sessions into the AI model. “We thought this would be the gold standard for teaching the system, but the model failed pretty miserably,” Jacobson admitted.

That prompted the researchers to take a different tack. They began writing detailed vignettes that simulated high-quality therapy sessions. “We focused on common problems that people have and built in psychological treatment methods, built largely around cognitive behavioral therapy,” Heinz said.

After the researchers fed the vignettes into an open-source model, the result was Therabot, an AI system that converses with patients like a human therapist—with the ability to detect when patients are at immediate risk and direct them to emergency resources. During the study, the group tracked results for 106 people using the chatbot, as well as 104 others diagnosed with mental health conditions but with no access to Therabot.

Those who used the system not only reported feeling better; they also gave the chatbot high marks, rating it on par with an excellent human therapist in areas such as consistency and empathy.

Algorithmic Empathy

Despite the encouraging results, AI-based therapy remains largely in the laboratory. Heinz and Jacobson noted that generative models cannot yet operate autonomously, and there are too many edge cases that could put a patient at risk. They are continuing to add vignettes and engage the model in broader training, so that it can address a wider range of conditions across groups and cultures.

They point out that therapy-based bots are ideal for outpatient care and addressing immediate needs; however, they shouldn’t become a low-cost replacement for in-person sessions. “There is still a need for clinician oversight,” Jacobson said. “We have not yet reached a point where generative AI can operate autonomously in the mental health field,” Heinz added. “There are many scenarios, including when a patient is at risk, that require human expertise and intervention.”

Nevertheless, AI bots eventually will move into the mainstream of psychology, Fuller said. “At some point, chatbots will likely be able to handle even complex cases.”

In addition to conducting ongoing research, Fuller believes that it’s critical to examine regulations and controls over AI, how to keep data private, and what type of human oversight is essential. “The key is to retain accountability.”

While therapy bots may never replace the warmth of a soft sofa and the empathetic smile of a human clinician, it’s increasingly clear that they can help bridge the imbalance between supply and demand for mental health care, Jacobson said. “I think we’ve reached a level where we are close to matching human therapy in terms of both efficacy and safety.”

Samuel Greengard is an author and journalist based in West Linn, OR, USA.