Finally, action against social media platforms?

6 October 2021


Over the last couple of years, scandal after scandal regarding the blatant disregard for mental health issues by social media platforms like Facebook has been brought to the public’s attention. The 2020 Netflix documentary The Social Dilemma, in which several industry experts explain how social media platforms exploit users by manipulating their mental health for profit, drew great attention to these issues, but over time that attention subsided. This week, Facebook whistleblower Frances Haugen stirred the pot once again by testifying before Congress about the dangers Facebook poses to children, and about how the company put profits before the safety of its users. Finally, it seems that governments are listening to society’s cries for help, but will it be enough to enforce stricter regulations?

It is not “news” that social media heavily manipulates its users to keep them on the app as long as possible, boosting the platforms’ profits. For years it has been publicly known that platforms like Facebook hire people whose main aim is to make the product as addictive as possible, without regard for the implications this might have for users’ mental health. As a result, increased anxiety, depression, and isolation are associated with excessive social media usage. Multiple employees have gone public to try to draw attention to this issue; some successfully, some to no avail. Recently, it was discovered that Facebook tried to cover up an internal report researching how its products affect users. The report found that 32% of young girls who felt bad about their bodies felt worse when they went on Instagram. Furthermore, the report concluded that Instagram negatively affects the mental health of both young boys and young girls. With an app so widely used among children and (young) adults, it is staggering that legislators are not cracking down on this extremely damaging industry. After all, (mental) health should be a main priority for governments.

This latest scandal has once again put the power of social media in the public eye. Finally, legislators in the U.S. have put forward new and expanded regulations that could have some impact on the negative effects of social media, such as an expansion of the Children’s Online Privacy Protection Act, which makes it illegal for platforms to collect data from children under 13 without their parents’ consent. However, I am not very optimistic, as other initiatives have often failed to make it into regulation, or fail to really serve the purpose of the legislation. Furthermore, the power of the platforms is enormous, and they are heavily involved in the funding of political parties, which raises another interesting question: should these gigantic (tech) corporations be allowed to be this involved in politics? However, that is a topic for a different blog.

Sources:
https://www.ft.com/content/e9e25ff3-639a-4cc1-bb81-dedf24d956e3
https://www.ft.com/content/febd8adc-8729-4e50-889d-f22a109fd44e


Artificial Intelligence in Mental Health Care

18 September 2019

According to Mental Health America (2019), 12.63% of children between the ages of twelve and seventeen in the US experienced at least one major depressive episode during 2018. Twenge et al. (2017) searched for major trends that could explain these alarming rates. They found that 48% of adolescents who spent more than five hours a day on their phone were prone to suicidal thoughts. Of their peers who used their phone for just one hour per day, 28% had at some point thought of suicide. Treatments for mental health problems vary, but stigma and neglect prevent people from speaking up (Who.int, 2001). Other reasons why mental health conditions remain untreated are limited insurance coverage, a lack of connection between health care systems, and a scarcity of mental health care providers (Bakker et al., 2016).

Digitalization in the health care industry has not left the mental health segment untouched. Several apps have been developed to improve mental health. Headspace, for example, is my favorite meditation app, but it does not leave much room for user input. The journaling app Stigma applies word cloud technology to look for frequently recurring words and matches them with emotions (a rough sketch of this idea follows below). Other apps provide Cognitive Behavioral Therapy (CBT) exercises and help people with Obsessive Compulsive Disorder (OCD) or addictions (Shelton, n.d.). The real disruption, however, seems to lie in screening, diagnosing, and treating mental illnesses with Artificial Intelligence. Ginger is a workplace chat app that uses an algorithm to analyze messages and give recommendations (Marr, 2019). Quartet Health can be used by general practitioners to screen a patient’s medical history for behavioral patterns that could predict or reveal an undiagnosed mental illness. The more these apps are used, the more data on behavioral patterns they can collect, which could ultimately help health care prevent illnesses rather than just treat them.
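How Stigma actually implements this is not publicly documented, so the snippet below is only a minimal sketch of the general idea: count recurring words across journal entries and match them against a small emotion lexicon. The lexicon, function name, and sample entries are hypothetical placeholders for illustration, not Stigma’s real data or code.

```python
from collections import Counter
import re

# Hypothetical emotion lexicon: each emotion is mapped to a few indicative words.
# A real app would use a validated lexicon or a trained model instead.
EMOTION_LEXICON = {
    "anxiety": {"worried", "nervous", "afraid", "panic", "stressed"},
    "sadness": {"sad", "tired", "alone", "empty", "hopeless"},
    "calm":    {"relaxed", "peaceful", "grateful", "rested", "content"},
}

def analyze_entries(entries):
    """Count recurring words across journal entries and score matching emotions."""
    words = []
    for entry in entries:
        words.extend(re.findall(r"[a-z']+", entry.lower()))
    frequencies = Counter(words)

    # Score each emotion by how often its indicative words occur.
    emotion_scores = {
        emotion: sum(frequencies[word] for word in lexicon)
        for emotion, lexicon in EMOTION_LEXICON.items()
    }
    return frequencies.most_common(10), emotion_scores

if __name__ == "__main__":
    journal = [
        "Felt nervous and worried about the exam, could not sleep.",
        "Still tired and a bit sad today, but a walk outside helped.",
    ]
    top_words, scores = analyze_entries(journal)
    print("Most frequent words:", top_words)  # the raw material for a word cloud
    print("Emotion scores:", scores)          # here: anxiety 2, sadness 2, calm 0
```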

Bakker et al. (2016) found that the mental health apps available to smartphone users are rarely proven to be effective. I also doubt that picking up your phone every five minutes to feed a dozen apps the metrics they need to analyze your mental state would make you feel balanced. On the other hand, I believe the use of Artificial Intelligence in screening, diagnosing, and preventing mental illnesses could disrupt the way care is offered. I think the actual treatment, however, should not be based exclusively on apps, because real connections are made with real humans, not with an app. What are your thoughts?


References:

Bakker, D., Kazantzis, N., Rickwood, D. and Rickard, N. (2016). Mental Health Smartphone Apps: Review and Evidence-Based Recommendations for Future Developments. JMIR Mental Health, [online] 3(1), p.e7. Available at: https://mental.jmir.org/2016/1/e7/

Marr, B. (2019). The Incredible Ways Artificial Intelligence Is Now Used In Mental Health. [online] Forbes.com. Available at: https://www.forbes.com/sites/bernardmarr/2019/05/03/the-incredible-ways-artificial-intelligence-is-now-used-in-mental-health/#5428a74dd02e [Accessed 18 Sep. 2019].

Mental Health America (2019). The state of mental health in America. [online] Mental Health America, p.17. Available at: https://mhanational.org/sites/default/files/2019-09/2019%20MH%20in%20America%20Final.pdf.

Shelton, J. (n.d.). Top 25 Mental Health Apps for 2018: An Alternative to Therapy?. [online] Psycom.net – Mental Health Treatment Resource Since 1986. Available at: https://www.psycom.net/25-best-mental-health-apps [Accessed 18 Sep. 2019].

Twenge, J., Joiner, T., Rogers, M. and Martin, G. (2017). Increases in Depressive Symptoms, Suicide-Related Outcomes, and Suicide Rates Among U.S. Adolescents After 2010 and Links to Increased New Media Screen Time. Clinical Psychological Science, [online] 6(1), pp.3-17. Available at: https://journals.sagepub.com/doi/10.1177/2167702617723376

Who.int (2001). The World Health Report 2001. [online] Available at: https://www.who.int/whr/2001/media_centre/press_release/en/
