Are You Mental?!

11 October 2018


For the past two days, London has been captivated by the first Global Mental Health Summit. Why? Because mental health is becoming one of the biggest health challenges of the 21st century.

A study by Public Health England on cases from 2015 showed that the most common cause of death among both males and females between the ages of 20 and 34 in the UK is suicide. In 2013, depression was the leading cause of years lived with a disability in 26 countries (Ferrari et al., 2013). In 2014, 19.7% of people aged over 16 in the UK showed symptoms of anxiety or depression (Evans et al., 2016). However, these symptoms are often invisible to outsiders and hard to measure. How do you determine when someone needs help, and what help is needed? And why are algorithms important in overcoming one of the biggest challenges of the 21st century?

Now, for a second, think about the people around you. If you take ten people from your environment, then based on the statistics above, two of them are struggling with a mental illness such as an anxiety disorder or depression. Perhaps you are in the know about your friends' and family's mental health, but it is hard to fully understand what is going on in their minds. How can you make sure you notice the little changes in behaviour that occur when someone has a mental illness?

This is where social media and algorithms come into play. Last year Facebook announced it would expand a programme designed to prevent suicide, built on a pattern-matching algorithm. It scans Facebook posts and comments for word combinations signalling potential suicide threats. Once a threat has been identified, it is reviewed by specialists trained in suicide and self-harm, with the most concerning reports flagged to receive priority. In the next step, appropriate institutions are alerted to the person's situation so that a suitable care plan can be created (NBC News, 2018).
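To make the idea of pattern matching concrete, here is a toy sketch of how posts could be scanned for risk phrases and triaged into priority tiers. The phrase lists, function names, and tiers below are invented for illustration; Facebook's actual classifier and its signals are not public.

```python
# Toy illustration of pattern-matching triage for worrying posts.
# All phrase lists and tier names are hypothetical examples.

HIGH_RISK_PHRASES = ["goodbye forever", "no reason to live", "want to end it all"]
CONCERN_PHRASES = ["feel so alone", "nobody would miss me", "can't cope anymore"]

def classify_post(text: str):
    """Return 'priority', 'review', or None depending on matched phrases."""
    lowered = text.lower()
    if any(phrase in lowered for phrase in HIGH_RISK_PHRASES):
        return "priority"   # most concerning: routed to specialists first
    if any(phrase in lowered for phrase in CONCERN_PHRASES):
        return "review"     # queued for trained human reviewers
    return None             # no signal detected by this simple matcher

posts = [
    "Had a great day at the park!",
    "I feel so alone lately",
    "Goodbye forever, everyone.",
]
flagged = [(post, classify_post(post)) for post in posts]
```

A real system would of course be far more sophisticated (weighing context, comments from friends, and many more signals), but the core idea is the same: automated matching surfaces candidate posts, and trained humans make the final call.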

It is still possible to flag posts manually, and the help centres of social media platforms offer extensive guidelines on what to do when you encounter a worrying post. The main difference with the use of algorithms is that they eliminate human unpredictability. With the magnitude of posts we see in a day, are we really able to notice the significance of a single post by someone who is struggling? And if we do notice, are we engaged enough to take appropriate action to support this person? Algorithms give us the assurance that certain posts will be noticed and addressed by specialists.

These systems will not replace current treatment, but they might play an important role in getting the right treatment to everyone who cannot find their own way to the many systems currently in place. As of now, we do not know how this will impact suicide rates. Nonetheless, I like to believe this is a step in the right direction for big firms like Facebook and Snapchat to take responsibility in overcoming one of the main challenges of our century. What do you think of the role of hub-firms in mental illness signalling, prevention, and treatment?

 

In loving memory of Sam, 5 August 1997 – 25 September 2018.

 

Sources:
Ferrari, A.J., Charlson, F.J., Norman, R.E., Patten, S.B., Freedman, G., Murray, C.J.L., … Whiteford, H.A. (2013). Burden of Depressive Disorders by Country, Sex, Age, and Year: Findings from the Global Burden of Disease Study 2010. PLOS Medicine, 10(11).

NBC News. (2018). Can an algorithm help prevent suicide?. [online] Available at: https://www.nbcnews.com/mach/video/a-facebook-algorithm-that-s-designed-to-help-prevent-suicide-1138895939701?v=railb&

Evans, J., Macrory, I., & Randall, C. (2016). Measuring national wellbeing: Life in the UK, 2016. ONS. [online] Available at: https://www.ons.gov.uk/peoplepopulationandcommunity/wellbeing/articles/measuringnationalwellbeing/2016#how-good-is-our-health.


5 thoughts on “Are You Mental?!”

  1. To begin with, Steffie Broere, thank you for this interesting and enlightening blog contribution. I had never heard of this combination of technology and mental health before, and I find it very interesting.

    To answer your question regarding the role of hub-firms such as Twitter and Facebook in mental illness signalling, prevention, and treatment: I think the technology cannot make the people within the mental health industry redundant, but it can make the signalling process more efficient and effective. Thanks to hub-firms, messages indicating possible suicide attempts can be redirected to the right people with the right knowledge and resources.

    So to conclude, hub-firms can indeed help professionals in signalling mental illness, but prevention and treatment should only be assigned to professionals within this field.

    1. Thank you for your supportive comment Dina!

      I fully agree with your point that the human aspect, especially in the treatment of mental illnesses, will remain. In my opinion, part of the effectiveness of current treatments comes from the personal connections you make when presented with, and during, your treatment.

      Therefore, I mainly believe in the power of signalling mental illness, as done for example by Facebook's newly implemented algorithm, as this will allow for earlier implementation of treatment plans as well as potentially better-tailored treatments. Let's hope it will contribute to the solution of the mental health crisis!

  2. Hi Steffie,
    Very interesting article topic. It’s great to see AI being used in such an efficient and life-saving manner, especially when connected with Social Media!

    Before the introduction of these algorithms, other users would have to report or flag content that might indicate an intent to self-harm. These reports were then checked by a person who decided whether the comment or post indicated an intent to self-harm, whether suicide prevention hotline support should be offered, or whether Facebook's law enforcement team should intervene (Novet, 2018).

    This new algorithm will make the evaluation of comments far less biased and reduce the number of people who fall through the cracks when flagged content is evaluated. Several websites claim that this newly launched algorithm helped more than 100 people in the first month after its launch! It's definitely a cause worth investing in.

    Currently, users are unable to opt out of this Facebook AI algorithm. Do you think that, with regard to personal privacy, Facebook should give people the opportunity to opt out of this new feature?

    https://www.cnbc.com/2018/02/21/how-facebook-uses-ai-for-suicide-prevention.html

    1. Thank you for your comment Lena!

      Great to hear the algorithm had such incredible results after its first month of implementation. It really shows how powerful it can be in overcoming the current mental health challenge.

      Nonetheless, I see your concern about personal privacy with the introduced algorithm. This will pose yet another privacy challenge for Facebook, as users' personal data is thoroughly analysed by the system, and also by specialists once concerning content has been posted. If not dealt with in the right manner, personal privacy can easily be violated. A good solution might indeed be to allow people to opt out of this feature, which can often be done in the account settings. Apart from that, I think it is important for companies like Facebook to inform their users about this new feature and be clear about its purpose and intentions. Once users know why this feature matters and how exactly it will be used, the privacy backlash after introduction may be limited.

      Of course, it will be hard to avoid these concerns entirely, but I am glad to see companies like Facebook are willing to take action to solve the mental health challenge.

  3. Hi Steffie,

    Thank you for shedding light on this issue and the potential opportunity for hub-firms/technology to play a role in suicide prevention.

    I'd like to point out that social media firms actually play a role in causing mental health issues, such as addiction, anxiety, depression and negative body image. With this in mind, it seems only natural that they would make an effort to tackle the problem. Besides tackling it after the fact, I believe they should consider the role they play in causing it. For example, they should consider the trade-offs they are making between engagement with their platform and its potentially psychologically damaging consequences.

    https://www.sciencedirect.com/science/article/pii/S0747563217304685?_rdoc=1&_fmt=high&_origin=gateway&_docanchor=&md5=b8429449ccfc9c30159a5f9aeaa92ffb#!
