AI and Mental Health: increasing demand and SaaS applications. Should AI step in for good to solve accessibility issues?

17 October 2023


While exploring the AI theme of mental-health services, I stumbled upon strong evidence that there is a significant market need for more professionals to handle the increasing demand for them. The clearest example of this market dynamic comes from the USA, where psychologists’ workload and patient load have continued to rise: the percentage of psychologists seeing more patients increased from 15% in 2020 to 38% in 2021 and 43% in 2022. Moreover, 60% of psychologists reported having no openings for new patients in 2022 (65% in 2021) (American Psychological Association). In my previous post, I shed light on the limited accessibility of mental health services. With inflated demand for such services, the issue might grow even larger, and the need to offload doctors might become even more pressing over time. So, should AI step in for good?

Today, there are broadly two ways to maintain your mental health with AI. On the one hand, people can use the free versions of AI LLMs, such as ChatGPT, and let them advise on their health. However, ChatGPT was not built for the sole purpose of mental health services and thus may have limited offerings for potential patients. On the other hand, there are AI-themed SaaS applications, like Youper or Together, which are tailored for people seeking treatment – on Youper’s site we can read that ‘the groundwork of Youper is evidence-based interventions, treatments that have been studied extensively and proven successful.’ The site also says the app has already supported over two million people and has been proven clinically effective (Youper). Together, in turn, is an app with which a patient can scan their prescription and get daily reminders and refill queries, which also helps the application track the patient’s mental state. Finally, the app can use AI to get a read on the patient’s tone of voice and suggest a specific medication or a medical visit (YouTube). Sounds good, right?

However, there are two levels of the AI game: one where AI offloads doctors and helps patients effectively and efficiently, and another where AI suggests wrong prescriptions or treatment methods. Speaking of the latter, current AI technologies can often produce inaccurate responses, and some technologists are fretting over the possibility that people may place too much trust in the software (Vanian). One may ask, at this point in time, whether AI can be considered a harmless tool for patients given its capacity to produce wrong outputs. In my own conversation with ChatGPT, I doubted whether the algorithm would be able to explain and justify its decisions when talking with a patient. I then found that a recent MIT Technology Review article revealed that AI systems in healthcare tend to enforce medical paternalism[1] and ignore their patients’ true needs (Hamzelou). Clearly, the algorithm lacks a medical diploma and is, after all, a prediction tool, which makes it more likely to provide misinformation at times. These concerns may be bucketed under the accountability aspect of the technology.

Ultimately, for communities seeking more accessible mental health care, AI services can become a double-edged sword – with AI, care may indeed be more accessible, but at the cost of less accountability and quality control (Vice). To conclude the exercise, I also checked the official WHO website to understand which measures the organization recommends for mental health problems. These were psychological treatments, which can be combined with antidepressant medications. I have my doubts whether AI support can actually be called a psychological treatment.

What do you think? Do generative AI tools, such as ChatGPT, have the right accountability to help people with mental health issues? Or is there a need for specialized AI-themed SaaS applications? If yes, why? Are Youper and Together similar AI-themed applications? Which app might be better for patients, if either?

References:

American Psychological Association. Apa.org, 2022, www.apa.org/pubs/reports/practitioner/2022-covid-psychologist-workload.

Hamzelou, Jessica. “Artificial Intelligence Is Infiltrating Health Care. We Shouldn’t Let It Make All the Decisions.” MIT Technology Review, 21 Apr. 2023, www.technologyreview.com/2023/04/21/1071921/ai-is-infiltrating-health-care-we-shouldnt-let-it-make-decisions/.

Vice. “We Spoke to People Who Started Using ChatGPT as Their Therapist.” Vice.com, www.vice.com/en/article/z3mnve/we-spoke-to-people-who-started-using-chatgpt-as-their-therapist.

Vanian, Jonathan. “Microsoft Tries to Justify A.I.’s Tendency to Give Wrong Answers by Saying They’re ‘Usefully Wrong.’” CNBC, 16 Mar. 2023, www.cnbc.com/2023/03/16/microsoft-justifies-ais-usefully-wrong-answers.html.

Youper. “Youper – Emotional Health Assistant Powered by AI.” Youper, 2019, www.youper.ai/.

YouTube. “‘Together’ App Uses AI to Help Users Track Mental Health Wellness.” YouTube.com, 2023, www.youtube.com/watch?v=JJhcQPFmWRo.


[1] More: https://en.wikipedia.org/wiki/Medical_paternalism


AI and Mental Health: may we and should we?

17 October 2023


Before undertaking the exercise, I thought about the different sorts of meaningful AI-themed features out there. It seemed insightful to ask ChatGPT directly what it offers as a generative LLM. It listed around fifteen answers, which included health and wellness functionality, so I decided to explore the well-being side of the language model.

Before doing so, I checked global statistics on the topic to understand the current well-being situation from a global perspective. I found that around 5% of adults globally suffer from depression, and more than 700,000 people die by suicide every year; suicide is the fourth leading cause of death among 15–29-year-olds (World Health Organization). The weight of these statistics made me realize that communities might need more tools to fight off depression. Most importantly, the low accessibility of mental health services might be the biggest issue, as WHO notes that over 75 percent of people living in low- or middle-income countries never receive treatment for depression due to treatment barriers (Koskie and Raypole).

Taking the above into account, I wanted to see how ChatGPT performs with mental health questions and whether AI chatbots can somewhat substitute for a doctor. For the purpose of the exercise, I stated that ‘I am feeling a little bit down’ and need some advice so I can cope a bit better. ChatGPT provided me with a 10-point bullet list of what I should do – it suggested, for example, talking to somebody, practicing self-care, eating healthy or doing some exercise (screen 1). I played with the tool for a brief time and noted a few observations.

My first thought was that, if I were in such a position, I would be unsure which of the measures ChatGPT provided should be applied first. Moreover, I felt these suggestions would not help me in any way, as they lacked follow-up questions about the patient’s condition – Gillian felt somewhat the same (screen 2) (Vice). My next concern was the safety aspect. If I were sharing sensitive information with an AI, I would be concerned about the privacy of the entered data and possible misinformation. It has recently been pointed out that there are concerns about AI’s potential for data breaches and unauthorized access to personal information (The Economic Times). This made me think it is actually right to question the safety aspect after all.

What do you think? Is it right to substitute AI for a professional at some level? Can AI help people when it comes to mental health? Is it safe to share information about your health with generative AI?

References:

World Health Organization. “Depressive Disorder (Depression).” World Health Organization, 2023, www.who.int/news-room/fact-sheets/detail/depression.

Koskie, Brandi, and Crystal Raypole. “Depression Statistics: Types, Symptoms, Treatments & More.” Healthline, 14 Jan. 2022, www.healthline.com/health/depression/facts-statistics-infographic#prevalence.

Vice. “We Spoke to People Who Started Using ChatGPT as Their Therapist.” Vice.com, www.vice.com/en/article/z3mnve/we-spoke-to-people-who-started-using-chatgpt-as-their-therapist.

Screen 1

Screen 2
