Down the YouTube Rabbit Hole

7 October 2020


Over the past few weeks, a lot has been said (including on this blog) about how social media has been impacting the offline world in negative ways. After watching “The Social Dilemma”, which was released on Netflix in September, we started to think about how these platforms sell our attention as a commodity and feed an increasingly polarized society, harming democracies around the world. Some people decided to take it one step further: they deleted accounts, turned off notifications and stopped clicking on recommended content – just as the whistleblowers who helped create these platforms suggest in the documentary. I was one of those people – until I wasn’t anymore!

Interestingly enough, shortly after watching the documentary I started to receive tons of recommendations for content addressing the same issues, especially on YouTube and Facebook. Isn’t it funny how the algorithm can work against itself? At first, I was determined not to click on any of the suggested videos, even though the content seemed quite interesting. Instead, I decided to do my own research on topics such as data privacy, surveillance capitalism and the ethical concerns involved in designing technology. However, the more research I did, the more recommendations I got – unexpected, huh?

So, one lazy Sunday afternoon I gave in to temptation and clicked on a video that YouTube had recommended to me – a really interesting TED Talk by techno-sociologist Zeynep Tufekci, which dug a little deeper into some of the questions raised in “The Social Dilemma”. Needless to say, one hour later I had already watched five more TED Talks – I admit it, I fell into the YouTube rabbit hole!

However, I cannot say that I regret my decision, as I gained really interesting insights from these recommendations. After all, that’s how this recommendation system is supposed to work, right? In particular, I was able to find some answers to a question that had been on my mind for a while: “What can we do to stop the negative effects of social media while still valuing freedom of speech as a pillar of the internet?”

Even though a lot has been said about the threats arising from the widespread use of social media, I hadn’t come across tangible solutions to the issue. Sure, we can turn notifications off, but that won’t tackle the problem at its core! But in two very enlightening TED Talks by Claire Wardle (a misinformation expert) and Yasmin Green (research director at a unit of Alphabet focused on solving global security challenges through technology), I found some clarity. According to them, there are three areas we can act upon to create a better digital and physical world:

  • Tech Companies – first of all, if any advances are going to be made, we need technology platforms to be on board. As an eternal optimist, I do believe that tech leaders are aware of the challenges they face and are certainly trying to find solutions. As Yasmin Green explains, Google has already successfully developed what it calls the “Redirect Method”, which targeted people who made searches related to joining terrorist groups. For example, when someone ran a Google search for extremist content, the first result would be an ad inviting them to watch more moderate content. Crucially, the targeting was based not on the user’s profile but on the specific query that was made (see the first sketch after this list). What if we could use the “Redirect Method” to stop the spread of conspiracy theories or misinformation about climate change? It would be great for society, although probably not so profitable for the tech giants.
  • Governments – although tech companies have their fair share of responsibility, at the moment they are “grading their own homework” by regulating themselves, which makes it impossible for us to know whether interventions are working. That’s where governments come into play. But a challenge this big cannot be handled by local or even national regulators alone. What we really need is a global response to regulate the information ecosystem. Or, as Brad Smith (Microsoft’s President) puts it, we need a “Digital Geneva Convention” that holds tech platforms accountable and prevents coordinated attacks on democracy.
  • We the People – while we would love to place our hopes in governments to solve this situation for us, it is undeniable that most lawmakers are struggling to keep up with a rapidly changing digital world. Every now and then, a US Senate committee hearing on tech companies spawns a few memes, as we watch lawmakers struggle to understand what they are talking about – I will leave my favorite down below! That’s why we need to take the matter into our own hands, and one way to do it, as Claire Wardle puts it, is to “donate our social data to science”. Millions of data points about us are already collected by social media platforms anyway, but what if we could use them to build a sort of centralized open repository of anonymized data, designed with privacy and ethics at its core (see the second sketch after this list)? This would create transparency and allow technologists, journalists, academics and society as a whole to better understand the implications of our digital lives.
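
To make the query-based idea behind the “Redirect Method” a bit more concrete, here is a minimal sketch in Python. The keyword lists, links and function names are entirely my own illustration, not Google’s actual implementation; the point is simply that the decision to show counter-content depends only on the text of the search query, never on a profile of the person searching.

```python
from typing import Optional

# Hypothetical keyword-to-counter-content mapping (made-up examples,
# not the real Redirect Method data).
COUNTER_CONTENT = {
    "join extremist group": "https://example.org/videos/former-members-speak-out",
    "climate change hoax": "https://example.org/videos/how-climate-science-works",
}

def redirect_suggestion(query: str) -> Optional[str]:
    """Return a counter-narrative link if the query matches a risky topic.

    The match is based only on the words of the query itself, not on who typed it.
    """
    normalized = query.lower()
    for keywords, video_url in COUNTER_CONTENT.items():
        if all(word in normalized for word in keywords.split()):
            return video_url
    return None  # no match: show ordinary results, no redirect ad

if __name__ == "__main__":
    print(redirect_suggestion("how to join an extremist group near me"))  # matches
    print(redirect_suggestion("weather forecast for tomorrow"))           # None
```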
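
And as a rough illustration of what “donating our social data to science” could mean in practice, here is another small Python sketch. The field names, salt and hashing scheme are assumptions for the example only, not a description of any existing repository; a real system would need far more than hashing an ID (re-identification from behavioural data is a well-known risk), which is exactly why privacy and ethics would have to be designed in from the start.

```python
import hashlib

# Hypothetical example: strip direct identifiers and pseudonymize the user ID
# before a record is "donated" to an open research repository.
DIRECT_IDENTIFIERS = {"name", "email", "phone"}  # fields dropped entirely

def anonymize_record(record: dict, salt: str) -> dict:
    """Drop direct identifiers and replace the user ID with a salted hash."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    user_id = str(cleaned.pop("user_id", ""))
    cleaned["user_pseudonym"] = hashlib.sha256((salt + user_id).encode()).hexdigest()[:16]
    return cleaned

if __name__ == "__main__":
    raw = {"user_id": 12345, "name": "Maria", "email": "maria@example.com",
           "post_topic": "climate", "watch_time_s": 312}
    print(anonymize_record(raw, salt="research-repo-2020"))
```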

Overall, I recognize that these solutions are not perfect or complete. But I do believe that they provide a starting point to “build technology as human as the problems we want to solve”.


Sources

Smith, B., 2017. “The need for a Digital Geneva Convention”. [online] Microsoft On the Issues. Available at: www.blogs.microsoft.com [Accessed 6 October 2020].

Shead, S., 2020. “Netflix documentary ‘The Social Dilemma’ prompts social media users to rethink Facebook, Instagram and others”. [online] CNBC. Available at: www.cnbc.com [Accessed 6 October 2020].

Green, Y., 2018. Transcript of “How technology can fight extremism and online harassment”. [online] Ted.com. Available at: www.ted.com [Accessed 6 October 2020].

Wardle, C., 2019. Transcript of “How you can help transform the internet into a place of trust”. [online] Ted.com. Available at: www.ted.com [Accessed 6 October 2020].

Tufekci, Z., 2017. Transcript of “We’re building a dystopia just to make people click on ads”. [online] Ted.com. Available at: www.ted.com [Accessed 6 October 2020].


2 thoughts on “Down the YouTube Rabbit Hole”

  1. Hi Maria, thank you for your insightful post!

    I agree that there should be some form of censoring on the internet and in social media algorithms (for cases like terrorism). However, limiting the spread of certain opinions on the internet can lead to a situation where a tech giant or government essentially controls what we see and think.

    More specifically, you mentioned in your first bullet point that tech giants could limit the spread of conspiracy theories and misinformation using their algorithms. I think this makes the situation even worse, because it means that a company decides what counts as a conspiracy theory and what doesn’t, what is misinformation and what is ‘correct’.

    I’m a strong believer in net neutrality, where users themselves carry the final responsibility for the content they consume on the internet. Of course, this whole discussion is a very complicated one that appears to have no easy solution. Just like you did in your post, we should definitely keep thinking about this subject with a critical eye to create a better ‘future internet’ 🙂

  2. Thank you, Maria, for this very informative and interesting blog post!
    I haven’t seen The Social Dilemma, but I have read a few things about it – mainly on social media, to be honest.
    Most of the time it was something like: “We are watching a documentary about social media and its impact on society, and then discussing this documentary on social media”.

    Even though I haven’t seen it yet – I will definitely watch it in the near future – people are naturally thinking about the influence that social media has on democracies and society in these times. When the President of the United States announces on Twitter, as he did last night, that negotiations on a stimulus package will be halted until after the election, one can only imagine the importance of social media in today’s politics. A good companion to the documentary is perhaps “The Great Hack”, which looks at the influence of the data analytics company Cambridge Analytica on the 2016 US presidential election, as well as many other elections, especially in developing countries.

    In the media, however, we usually read only about the threats and problems that social media creates in politics and society. That makes your article quite interesting and, in some ways, encouraging. In any case, the solutions you mention are interesting. Still, from a government’s perspective, even under a kind of Geneva Convention, regulators walk a very fine line when it comes to freedom of expression. Nevertheless, the platforms could be held responsible for the content that is distributed on them.
    Also very interesting – I had never come across it before – is the ‘donation’ of data as part of the solution.

    What is also encouraging is that you came across this information and these talks after changing your behaviour for only a few days.

    It will certainly be interesting to see how social media is regulated, or regulates itself, to prevent it from destabilizing democracies and from weakening science by spreading misinformation and disbelief in science itself.

    I am an optimist, just like you. I also believe that it is not in the interest of the leaders of these companies to exert this influence on society. Maybe it is a little naive, but let’s hope for the best.

    While I believe that solutions to the problem will emerge, your article is honestly the first piece I have come across that focuses on opportunities.
    So thank you for the influence your article had on me 🙂
