Prescription Health: New efforts to regulate AI chatbots used in therapy
San Francisco-based psychologist Emily Anhalt has seen firsthand the effects of the country's mental health crisis.
"The rates of things like anxiety and depression and burnout have skyrocketed, there are not enough trained and licensed therapists out there to meet all of the people who want to get support,” Anhalt said. “It doesn't surprise me that all these mental health apps are popping up."
Anhalt is referring to the new breed of wellness apps such as Woebot, Replika, and Earkick that offer support through an AI chatbot. Treatments range from cognitive behavioral exercises to companion bots that engage with users, all for a fraction of what it costs to see an actual therapist.
U.C. Berkeley bioethics professor Jodi Halpern said she is concerned that these apps operate with no oversight. Last year, the co-founder of Koko revealed that the platform had given AI-generated responses to thousands of users who thought they were speaking with a human.
“It's not that I’m against in any way AI developing, but I think we need to think about regulation and doing it safely,” Halpern said.
Halpern is leading the charge to have chatbot apps that offer mental health services regulated by the FDA, citing the limitations of AI therapy.
“If you say that you have any suicidal thoughts or feelings, the bots just say, 'I can't help you with that. Dial 9-1-1,'" Halpern said.
The app Chai made headlines last year when a man in Belgium died by suicide after being encouraged to do so by a rogue bot.
Halpern said she recognizes AI therapy is not going away anytime soon, and suggests using AI as a tool for cognitive behavioral exercises while continuing to see a licensed human therapist.