AI and Mental Health: Increasing Demand and SaaS Applications. Should AI Step In for Good to Solve Accessibility Issues?

17 October 2023


While exploring AI-based mental-health services, I stumbled upon strong evidence that there is a significant market need for more professionals to handle the increasing demand for such services. A telling example of this market dynamic comes from the USA, where psychologists’ workloads and patient loads have continued to rise: the percentage of psychologists seeing more patients increased from 15% in 2020 to 38% in 2021 and 43% in 2022. Moreover, 60% of psychologists reported having no openings for new patients in 2022 (65% in 2021) (American Psychological Association). Previously, I shed light on the limited accessibility of mental health services. With demand for such services inflated, the issue may grow even larger, and the need to offload doctors may become even more significant over time. So, should AI step in for good?

Today, there are essentially two ways to maintain your mental health with AI. On one hand, people can use the free versions of LLMs, such as ChatGPT, and ask them for health advice. However, ChatGPT was not built for the sole purpose of mental-health services and thus may have limited offerings for potential patients. On the other hand, there are AI-themed SaaS applications, like Youper or Together, which are tailored to people seeking medical treatment. On Youper’s site we can read that ‘the groundwork of Youper is evidence-based interventions, treatments that have been studied extensively and proven successful.’ The site also says the app has already supported over two million people and has been proven clinically effective (Youper). As for Together, it is an app that lets a patient scan their prescription and receive daily reminders and refill queries, which also helps the application track the patient’s mental-health state. Finally, the application can use AI to get a read on the patient’s tone of voice and suggest a specific medication or a medical visit (YouTube). Sounds good, right?

However, there are different levels of the AI game: a first, where AI offloads doctors and helps patients effectively and efficiently, and a second, where AI suggests wrong prescriptions or treatment methods. Speaking of the latter, current AI technologies often produce inaccurate responses, and some technologists are fretting over the possibility that people may place too much trust in the software (Vanian). One may ask, at this point, whether AI can be considered a harmless tool for patients, given its capacity for wrong outputs. In my own experience, during a conversation with ChatGPT I doubted whether the algorithm could explain and justify its decisions when talking with a patient. I then found that a recent MIT Technology Review article revealed that AI systems in healthcare tend to enforce medical paternalism[1] and ignore their patients’ true needs (Hamzelou). Clearly, the algorithm lacks a medical diploma; after all, it is a prediction tool, which makes it more likely to provide misinformation at times. These concerns may be bucketed under the accountability aspect of the technology.

Ultimately, for communities seeking more accessible mental-health care, AI services can become a double-edged sword: care may indeed become more accessible, but at the cost of less accountability and quality control (Vice). To conclude the exercise, I also checked the official WHO website to see which measures the organization recommends for mental-health problems. These were psychological treatments, which can be combined with antidepressant medications. I have my doubts whether AI support can actually be called a psychological treatment.

What do you think? Do generative AI tools, such as ChatGPT, have the right accountability to help people with mental-health issues? Or is there a need for specialized AI-themed SaaS applications? If so, why? Are Youper and Together similar AI-themed applications? Which app might be better for patients, if either?

References:

American Psychological Association. Apa.org, 2022, www.apa.org/pubs/reports/practitioner/2022-covid-psychologist-workload.

Hamzelou, Jessica. “Artificial Intelligence Is Infiltrating Health Care. We Shouldn’t Let It Make All the Decisions.” MIT Technology Review, 21 Apr. 2023, www.technologyreview.com/2023/04/21/1071921/ai-is-infiltrating-health-care-we-shouldnt-let-it-make-decisions/.

Vice. “We Spoke to People Who Started Using ChatGPT as Their Therapist.” Vice, www.vice.com/en/article/z3mnve/we-spoke-to-people-who-started-using-chatgpt-as-their-therapist.

Vanian, Jonathan. “Microsoft Tries to Justify A.I.’s Tendency to Give Wrong Answers by Saying They’re ‘Usefully Wrong.’” CNBC, 16 Mar. 2023, www.cnbc.com/2023/03/16/microsoft-justifies-ais-usefully-wrong-answers.html.

Youper. “Youper – Emotional Health Assistant Powered by AI.” Youper, 2019, www.youper.ai/.

YouTube. “‘Together’ App Uses AI to Help Users Track Mental Health Wellness.” YouTube, 2023, www.youtube.com/watch?v=JJhcQPFmWRo.


[1] More: https://en.wikipedia.org/wiki/Medical_paternalism
