AI and mental health: What are the options?

The longstanding shortage of mental health professionals has only intensified with the Covid pandemic, so what can AI do to help?
Jeff Rowe

One of the many consequences of the pandemic has been a spike in demand for healthcare services, and many healthcare organizations have been exploring AI as a tool to help handle the crush.

In a recent interview, the Wall Street Journal’s Lisa Ward sat down with three experts – John Torous, director of the digital-psychiatry division at Beth Israel Deaconess Medical Center and assistant professor at Harvard Medical School; Adam Miner, an instructor at the Stanford School of Medicine; and Zac Imel, professor and director of clinical training at the University of Utah and co-founder of LYSSN.io, a company using AI to evaluate psychotherapy – to explore how AI may actually help.

For example, noted Adam Miner, “AI can speed up access to appropriate services, like crisis response. The current Covid pandemic is a strong example where we see both the potential for AI to help facilitate access and triage, while also bringing up privacy and misinformation risks.”

John Torous, meanwhile, pointed to the research uses for AI, observing that new AI “can help us unlock some of the complexities of the brain and work toward understanding these illnesses better, which can help us offer new, effective treatment. We can generate a vast amount of data about the brain from genetics, neuroimaging, cognitive assessments and now even smartphone signals. We can utilize AI to find patterns that may help us unlock why people develop mental illness, who responds best to certain treatments and who may need help immediately.”

While describing the potential, the three were clear about the likely limitations of AI in mental health services, particularly when it comes to the question of replacing therapists.

As Zac Imel sees it, for example, “Right now, it’s pretty hard to imagine replacing human therapists. Conversational AI is not good at things we take for granted in human conversation, like remembering what was said 10 minutes ago or last week and responding appropriately.”

At the same time, Imel added, there’s no doubt AI can help. “For example, studies show AI can help ‘rewrite’ text statements to be more empathic. AI isn’t writing the statement, but trained to help a potential listener possibly tweak it.”

One potential overall benefit of AI in therapy, said Torous, is that easier access should enable more patients to get the services they need.

“As more people use apps as an introduction to care, it will likely increase awareness and interest of mental health and the demand for in-person care,” he explained. “I have not met a single therapist or psychiatrist who is worried about losing business to apps; rather, app companies are trying to hire more therapists and psychiatrists to meet the rising need for clinicians supporting apps.”

Finally, there are the persistent privacy concerns.

As Miner summed up the challenge, “We’ve developed laws over the years to protect mental-health conversations between humans. As apps or other services start asking to be a part of these conversations, users should be able to expect transparency about how their personal experiences will be used and shared.”