Dark Side of Algorithms

29 September 2020


While the connected world has shifted towards an attention-based economy, where every additional second spent staring at your phone on a social network earns big tech companies more money, little consideration is given to the ethics behind this mechanism. One very efficient tool for keeping the masses glued to their screens is recommendation-based content. YouTube (owned by Google), for instance, has mastered the art of recommendation through very sophisticated algorithms. To keep consumers on the website, YouTube recommends videos based not only on your interests but also on similar behaviour from other users. The result is an escalation towards ever more extreme videos with every new recommendation, all to keep you staring at your screen.
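To make the mechanism concrete, here is a minimal sketch of user-based collaborative filtering, the family of techniques behind "users like you also watched" recommendations. All names and data are invented for illustration; a real system like YouTube's is vastly more sophisticated, but the core idea is the same: an unwatched video gets boosted simply because similar users watched it.

```python
import math

# Hypothetical watch-history matrix: 1 = user watched the video, 0 = not.
watched = {
    "alice": {"cooking": 1, "news": 1, "conspiracy": 0},
    "bob":   {"cooking": 1, "news": 1, "conspiracy": 1},
    "carol": {"cooking": 0, "news": 1, "conspiracy": 1},
}

def cosine(u, v):
    """Cosine similarity between two users' watch vectors."""
    dot = sum(u[k] * v[k] for k in u)
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

def recommend(user):
    """Score each video the user has NOT watched by the
    similarity-weighted watches of all other users."""
    scores = {}
    for other, history in watched.items():
        if other == user:
            continue
        sim = cosine(watched[user], history)
        for video, seen in history.items():
            if seen and not watched[user][video]:
                scores[video] = scores.get(video, 0.0) + sim
    return max(scores, key=scores.get) if scores else None

print(recommend("alice"))  # -> "conspiracy"
```

Note the outcome: alice only ever watched cooking and news, yet the system's top recommendation for her is the "conspiracy" video, because the users most similar to her watched it. Nothing in the objective asks whether the content is good for her, only whether she is likely to keep watching.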

The straw that broke the camel’s back
Numerous examples illustrate the way algorithms lock users into an infinite loop. It is no surprise that the suicide rate has skyrocketed since 2012, with a 98.5% increase in the UK, when users are shown more and more extreme videos online by biased algorithms with little to no ethical consideration. Molly Russell, a British teenager, took her own life after searching for suicide and self-harm images online. Attention-based algorithms kept showing her content related to these images in order to keep her online. But what would have happened if, in between such horrible content, she had seen inspiring posts, smiling people or suicide prevention ads? Ironically, algorithms intended to capture her attention instead led to no attention at all.

It just needs a little push
In his book “La Civilisation du Poisson Rouge”, Bruno Patino wonders what steps could be taken to prevent such catastrophes from happening. Rewriting algorithms to ensure bias-free AI might be one option. Providing a button to switch off notifications from all social media might be another. Having Facebook and other apps remind you that you have spent too much time on your screen, and that it might be good to take a break, is yet another. In fact, plenty of solutions exist. But none of the big tech companies is willing to shift towards a more human technology and reinvent itself from a necessity into a simple tool.

Reference:

Walsh, M., 2019. When Algorithms Make Managers Worse. [online] Harvard Business Review. Available at: <https://hbr.org/2019/05/when-algorithms-make-managers-worse>

Gerrard, Y. and Gillespie, T., 2019. When Algorithms Think You Want To Die. [online] Wired. Available at: <https://www.wired.com/story/when-algorithms-think-you-want-to-die/>

Samaritans. 2020. Suicide Facts And Figures. [online] Available at: <https://www.samaritans.org/about-samaritans/research-policy/suicide-facts-and-figures/>

Patino, B., 2019. La Civilisation du Poisson Rouge. 1st ed. Grasset.


6 thoughts on “Dark Side of Algorithms”

  1. Hi Simon,

    Interesting topic! I do not agree that it ‘just needs a little push’. These solutions are purely negative for the companies that, in your opinion, need to introduce them. Why would Facebook want people to limit the amount of scrolling through their website? Of course, in the big picture this would be better for certain people, but it is not in favour of those companies. Similarly, people would probably not use a switch-off button; otherwise they would choose to delete the app now. Therefore, I am curious whether you think these simple solutions are feasible, and which one you prefer?

    Nino van de Ven

  2. I completely agree with your post. “The Social Dilemma” (available on Netflix) visualises your point even further, in case you would like more information on the subject. As for how to deal with the current situation regarding these algorithms: as long as the big tech firms do not act ethically, matters should, in my opinion, be taken into our own hands. I strongly believe parents should consider not providing the tools to access social platforms until a certain age (for example 15). This would increase the level of ‘direct’ social activity and create friendships that are not as shallow as they would be if they mainly consisted of sitting next to each other on their phones. Wait until children are more aware of what kind of person they are before they are able to access these platforms. If a majority of parents followed the guidelines you mentioned and considered not providing the tools to access these platforms, then hopefully the suicide rate would go down significantly while children are raised in a healthier way.

  3. Hey Simon!

    I was just about to write my second blogpost. Just as I had typed the preliminary title “The Ethics of Datamining Consumers”, I was like: “You know what, I might as well check out some recent blogposts first.” I feel like you pretty much read my mind.

    I recently watched the documentary “The Social Dilemma” (available on Netflix), which sketches the problem. It also fortified my critical views on the way that algorithms and big data are employed nowadays. In that documentary, they said: “If you did not pay for a product, you are the product”. I guess there’s truth in that.

    Big tech companies do not just sell your data, they sell your future. This does not always have to result in suicide. Rather, it’s in all the little things: the layout of your Instagram feed, your YouTube recommendations. With every interaction, algorithms get better and better at capturing people inside their bubbles while extracting profit. Social media influences their buying decisions, their political views, how they spend their free time.

    But it’s like you said. Big tech has no incentive to reinvent itself. This leaves those of us who are concerned about this with two options.
    1) Advocate for regulation on big data and tech.
    2) Find ways to keep people off their screens profitably.
    The first option is, at this point, almost inevitable. But personally, I prefer the second option.

    Cheers

  4. Very interesting post that describes one of the most important issues relating to the current use of algorithms. I think that each of us has experienced how addictive phone applications can be. The sad truth is that the algorithms will most likely only get better. Therefore, it will be even more difficult to resist the temptation of using your phone in a few years than it is now.

    In my opinion, tech companies should increase their oversight of the content that their algorithms recommend to users. We spend an enormous amount of time in front of our screens, and the content that we see may influence us in a very negative way. You described one example of a tragic story, but as the statistics suggest, there are probably many more.

    Unfortunately, tech companies have done little to control the content recommended by algorithms. It seems that keeping users in front of their screens for longer is higher on their list of priorities than ensuring high quality of the displayed content. We should all be aware of the impact that these technologies have on our everyday life and try to control our screen time.

  5. Hi Simon,

    I really liked your post! You addressed a very relevant topic which impacts not only the time individuals spend on social media but also how people interact with each other. I agree with you that there is no relationship between the algorithms applied and the ethical questions behind them. The algorithms’ main function is to keep people on site and to incentivise them to come back. Your example is an extreme one, but I think it does give a good idea of how algorithms can negatively impact people’s lives. We can already see this happening on all levels; online shopping ads are a less drastic example but are influencing our behaviour as well. We are constantly confronted with our own ‘online profiles’ and predictive analyses about ourselves. Yet people do not act upon it. You say one prevention step is to set a timer for social media. I wonder whether most people are even aware of the consequences of social media, also considering their own health. I feel that companies will not take a step towards more ethical behaviour and people will not oppose it enough. Hence, what is the solution, maybe stricter regulations?

  6. Hey Simon,

    Thanks for the interesting read! It is a commonly discussed topic after all the hype around The Social Dilemma, but you also approach other areas and more general applications of AI. It makes me think of the article we read for IS, in which AI algorithms discriminate by gender based on the wide range of factors they take into account in Facebook ads and recommendation-based systems in general. It is definitely not ethical, but just like so many other things, I don’t think many of the initial developers could grasp the future horrific consequences of their algorithms. However, given the clear evidence of all the negative consequences, it is such an important topic to address, and we must make sure that we improve the situation for the future.

    The topic is also very relevant given the wide adoption of AI to support business and social functions. AIs are fed with such a broad and complex range of data that it’s crucial to have skilled developers who also take socially responsible factors into account. I really like your comment that social media “just needs a little push” by limiting the already existing platforms. However, for more complex systems, which analyse and function more through black-box processes, it is important to get the code right from the beginning.
