YouTube’s algorithms promote fake cancer cures

17 October 2019


After a father claimed that YouTube dance videos had damaged his daughter’s mind, the phenomenon of “falling down the YouTube rabbit hole” became the subject of heavy discussion. The phrase refers to users being redirected to potentially dangerous and controversial content that they would otherwise never have stumbled upon. According to the parent, his 10-year-old daughter innocently browsed for ‘tap dance videos’ on YouTube. At some point, she was redirected to videos giving her advice on self-harm and starvation. Even though the father has put parental controls on all her devices, she keeps finding new ways to watch these harmful videos.

YouTube’s recommendation algorithms have been a topic of discussion many times before. 70% of the total viewing time on YouTube is driven by its recommendation engine. BBC investigations claim that YouTube’s algorithms aid misinformation, such as flat-earth conspiracy theories and videos on fake cancer cures. People who obsessively watch such videos are generally more vulnerable to believing untrustworthy information. Ads from major consumer-goods brands (e.g. Samsung, Clinique and Heinz) often run before these videos. This means that YouTube, the video makers and these big brands are making money out of vulnerable people by promoting misleading videos.

In reaction to this, a YouTube spokesperson responded that at the beginning of 2019 multiple changes were made to its recommendation engine to prevent misleading content from being promoted. This resulted in a 50% drop in watch time of harmful and misleading content in the United States. Although YouTube claims to have made progress in tackling harmful misinformation and conspiracy theories, it refuses to share the logic behind its recommendation algorithms.

To better understand YouTube’s recommendation engine, researchers at Mozilla openly asked people to share personal experiences of being redirected to harmful YouTube recommendations. They successfully promoted this campaign as “Tell us your YouTube Regret”, receiving more than 2,000 responses. The responses show a recognizable pattern: it is hard to fight against YouTube’s recommendation algorithms. Even when loved ones do everything in their power to manually delete YouTube recommendation histories or to control the YouTube account, victims of the YouTube rabbit hole keep finding new ways to obsessively watch harmful videos.

Have you ever experienced a “YouTube regret”? And what is your opinion on the claim that YouTube is misusing its platform to make money out of vulnerable people? Please share your comments below!

 

BBC (2019) YouTube aids flat earth conspiracy theorists, research suggests. Available at: https://www.bbc.com/news/technology-47279253

Carmichael, F. (2019) Available at: https://www.bbc.com/news/technology-50045919

Geurkink, B. (2019) Available at: https://foundation.mozilla.org/en/blog/youtube-regrets/

Gragnani, J. (2019) YouTube advertises big brands alongside fake cancer cure videos. Available at: https://www.bbc.com/news/blogs-trending-49483681

Mozilla (2019) Available at: https://foundation.mozilla.org/en/campaigns/youtube-regrets/


3 thoughts on “YouTube’s algorithms promote fake cancer cures”

  1. The YouTube rabbit hole is a very interesting phenomenon! Things like ElsaGate, which promoted disturbing and sexually violent content to kids, were quite the scandal around 2017.

    I think it shows that there is a trade-off for YouTube. On the one hand, the platform wants its users to be as engaged as possible, click on recommended videos and stay on the platform. However, as shown, this comes with a hefty price. These ‘engaging’ videos are most of the time either controversial or specifically tailored to the algorithm.

    Therefore YouTube has to be decisive in the coming years: are they going to crack down on harmful videos in their algorithm (and stop preying on those most vulnerable), or are they going to look the other way because of the revenue it brings to the platform?

  2. Hi Lina! Very interesting and unique topic right here. I see in this an ongoing debate about the libertarianism of information sharing that is not going to go away anytime soon.

    The topic at the centre of recommendation-algorithm issues like the ones YouTube faces time and time again is the subjectivity of what can be considered a free and open market for information. Even harmful videos, as dangerous as they might be in causing negative externalities for their audiences, might have a right to be seen or to exist in the first place, as they highlight the true and unaltered struggles that many humans face in a society with increasing rates of depression and suicide, especially at juvenile ages.

    As sad as it is that malevolent messages are spread through such content to people who, like the young girl in your example, have no say in what is coming their way and possibly end up traumatised for a long time to come, who is to decide if troubled individuals who publish this content should not be allowed to speak their mind in a sphere that is commonly advertised as an open space for opinions and stories? I know this is a radical thing to say, but net neutrality is a genuine concern for platforms like YouTube that intend to provide freedom to their users.

    The ultimate solution to harmful information will, in my opinion, not be to take it down or forbid it, but to facilitate very sophisticated filtering to make sure that underage and vulnerable individuals are not going to be touched by such darkness, but the publishers of this darkness are still heard and have a chance to make their struggles known and possibly receive help. This, of course, is based on the idea that the majority of dangerous content in the “rabbit hole” is published not to disturb but as an outcry for help, directly or indirectly. I would be interested to hear your opinion on this matter.

  3. Dear Lina, thanks for your interesting article! I really liked your choice of topic, as I had actually never thought or heard about it. I think that, generally, it is very dangerous if these recommendation algorithms are not controlled, but it is also dangerous that YouTube cannot screen its content for such harmful videos. I believe that controlling the content and at least applying an age check for videos with disturbing content would actually tackle the root of this problem. The same applies to content on e.g. Facebook, which is likewise not checked for validity. Cheers, Sophie
