After reading several blog posts on AI tools and their implications for the health sector, specifically the mental health sector, I was intrigued. In my opinion, there is a thin line between doing good and doing harm when leaving issues like these in the hands of technology and experimentation. Do we actually think (generative) AI tools can replace humans in all fields? Do we even really want them to?
Generative AI is increasingly used to transform traditional therapy, for example through chatbots and diagnostic tools. Counselling sessions are reviewed, diagnoses are made more efficiently, and treatment plans are designed, all assisted by generative AI tools (Roth, 2023). These advances have definitely played a huge part in breaking the stigma around therapy, even making it more accessible to those without access to traditional care (Leamey, 2023). One of the latest innovations is an AI system that monitors a patient’s mental health over time, detecting subtle changes in speech or text that could indicate worsening symptoms of mental health disorders (Roth, 2023).
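To make that monitoring idea a bit more concrete, here is a minimal, purely illustrative sketch; it is not Roth's actual system. It assumes some sentiment score has already been extracted from a patient's messages (say, one score per week, higher meaning more positive) and simply flags a sustained downward trend. The function names, window, and threshold are all hypothetical.

```python
# Illustrative only: a toy trend check over pre-computed sentiment scores.
# This is NOT any real clinical system; the threshold and window are made up.

def trend_slope(scores: list[float]) -> float:
    """Least-squares slope of scores over their index (time)."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

def flag_worsening(scores: list[float], threshold: float = -0.05) -> bool:
    """Flag a sustained negative trend for human (clinician) review."""
    return len(scores) >= 4 and trend_slope(scores) < threshold

weekly_scores = [0.61, 0.55, 0.48, 0.40, 0.31]  # hypothetical data
if flag_worsening(weekly_scores):
    print("Downward trend detected; refer to a clinician for review.")
```

Note that even in this toy version, the output is a referral to a human clinician rather than an automated intervention, which is exactly the kind of boundary I argue for below.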
While I admit that technology is the future in many respects, its application to mental health should not be rushed. People need care, empathy and nuance, and generative AI is just not quite there yet (Leamey, 2023). Leaving mental health, which arises from our unique human minds, to some algorithm, however advanced, seems to have dangerous implications.
AI pulls data from many sources, but those sources are not verified beforehand. Research has even shown that generative AI chatbots promoted eating disorders: entering prompts like “anorexia inspiration” returned toxic images and diet plans (Leamey, 2023). Source verification, consent procedures, and of course privacy issues arise when considering AI tools for mental health.
When I ask ChatGPT for mental health tips, it provides a basic list of activities to improve my mental state, such as self-care, exercise, and journaling, but at the top of the list it suggests talking to someone and seeking professional help. When I ask which step on the list is most important, ChatGPT again tells me to reach out to a professional for support. Even when I ask “what if I can’t?”, it lists many other ways to seek help, through helplines, trusted persons, or online support.
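For anyone who wants to reproduce this informal experiment programmatically rather than in the chat interface, a minimal sketch using the OpenAI Python SDK might look like the following. The model name is an assumption on my part, and the replies are non-deterministic, so your results will vary.

```python
# Reproducing the informal experiment via the OpenAI Python SDK.
# Assumes `pip install openai` and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()
history = []  # keep the conversation, since later questions are follow-ups

for prompt in [
    "Can you give me some mental health tips?",
    "Which of those is the most important step?",
    "What if I can't reach out to a professional?",
]:
    history.append({"role": "user", "content": prompt})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical choice; any chat model works
        messages=history,
    )
    answer = response.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    print(f"Q: {prompt}\nA: {answer}\n")
```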
While AI tools may well aid the mental health industry in the future, they are far from ready for wide-scale application, and their implications should be considered carefully. I find it interesting that ChatGPT itself recognizes the need for human interaction, empathy, and nuance. How far along do you think AI tools should be before they are used in sensitive practices like this?
References
Roth, E. (2023). Revolutionizing mental health: Generative AI in therapy. Productive Edge. https://www.productiveedge.com/blog/revolutionizing-mental-health-generative-ai-and-therapy
Leamey, T. (2023). Popular AI tools can hurt your mental health, new study finds. CNET. https://www.cnet.com/health/mental/popular-ai-tools-can-hurt-your-mental-health-new-study-finds/