
Your Health: Concerns surrounding mental health apps

By J. Thompson

Each year, one in five Americans will experience some form of mental illness.

And while the need for help is clear, many don't seek therapy, whether because of cost or stigma.

San Francisco-based psychologist Emily Anhalt has seen first-hand the effects of the country's mental health crisis.

"The rates of things like anxiety and depression and burnout have skyrocketed. There are not enough trained and licensed therapists out there to meet all of the people who want to get support. It doesn't surprise me that all these mental health apps are popping up," said Dr. Emily Anhalt, PsyD, Clinical Psychologist.

Dr. Anhalt is referring to the new breed of wellness apps such as Woebot, Replika and Earkick that offer support through an AI chatbot. Treatments range from cognitive behavioral exercises to companion bots that engage with users, all for a fraction of what it costs to see an actual therapist.

"We just wanna make sure that we're not doing it in a way that actually causes more harm than good," said Anhalt.

UC Berkeley bioethics professor Jodi Halpern, a leading expert in the field, is concerned about the "wild west" aspect of apps that have no oversight.

Last year, the co-founder of the platform Koko revealed that it had provided AI-generated responses to thousands of users who believed they were speaking with a human.

"It's not that I'm against in any way AI developing, but I think we need to think about regulation and doing it safely," said Dr. Jodi Halpern, MD, PhD, Chancellor's Chair & Professor of Bioethics, UC Berkeley.

Dr. Halpern is leading the charge to ensure that chatbot apps offering mental health services are regulated by the FDA. She is concerned about the limitations of AI "therapy."

"If you say that you have any suicidal thoughts or feelings, the bots just say. 'I can't help you with that. Dial 9-1-1,'" Halpern said.

Headlines were made last year when a man in Belgium who had been using the app Chai died by suicide after being encouraged to do so by a rogue bot.

"I have a lot of concerns about this wave of AI therapy," said Dr. Anhalt.

"Technology is a good thing, but everything needs to be used in a way that cares about people. I think we need more enlightened uses," Dr. Halpern said.
