Living in an algorithmic bubble

4 October 2021

Online, many of us are surrounded by views and opinions we agree with. Websites use algorithms that look at signals such as browsing history and age to personalize content, tailoring what is shown to match the visitor’s existing views. These algorithms decide what we see and read online and often exclude opposing perspectives. As a result, we live in so-called ‘filter bubbles’.
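To make the idea concrete, here is a minimal, hypothetical sketch of such a filter: it scores each item by how much its topics overlap with the user’s recorded interests and drops everything below a threshold. Real recommendation systems are far more sophisticated; the names (`Item`, `score`, `personalized_feed`) and the scoring scheme are invented purely for illustration.

```python
# Toy sketch of a "filter bubble" ranking step.
# All names and the scoring scheme are invented for illustration;
# real recommender systems are far more complex.

from dataclasses import dataclass


@dataclass
class Item:
    title: str
    topics: set[str]  # e.g. {"politics", "climate"}


def score(item: Item, user_interests: set[str]) -> float:
    """Fraction of the item's topics the user has already engaged with."""
    if not item.topics:
        return 0.0
    return len(item.topics & user_interests) / len(item.topics)


def personalized_feed(items: list[Item], user_interests: set[str],
                      threshold: float = 0.5) -> list[Item]:
    """Keep only items similar to past interests -- the 'bubble' effect."""
    ranked = sorted(items, key=lambda it: score(it, user_interests), reverse=True)
    return [it for it in ranked if score(it, user_interests) >= threshold]


interests = {"football", "gaming"}
feed = [Item("Match recap", {"football"}), Item("Climate report", {"climate"})]
print([it.title for it in personalized_feed(feed, interests)])  # only 'Match recap'
```

Even in this toy version, the problem is visible: any item whose topics fall outside the user’s history is silently filtered out.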

Initially, an algorithm that ensures we see content we like and agree with does not sound that bad. However, when we no longer encounter opposing views online, these filter bubbles turn into echo chambers, and we forget that what we see is being filtered at all. In my opinion this is a major flaw in these otherwise valuable algorithms, because the resulting filter bubbles distort our ideas of the world. Many people use Facebook as their main news source, for example, and a significant portion of them is probably not mindful of what Facebook’s algorithms do. This lack of awareness amplifies the negative impact of filter bubbles: people consuming the news do not know that what they see is constantly being filtered to match their opinions and perspectives (FS, 2017; Pariser, 2011). Furthermore, by only viewing filtered content we limit our own experiences and opportunities to learn. In my opinion, this extreme content filtering problem is perfectly summed up by Pariser (2011): “A world constructed from the familiar is the world in which there’s nothing to learn.”

The social media platform TikTok is trying to combat this problem by occasionally adding videos to your feed that are not related to your expressed interests. It does this to let users experience new perspectives and ideas and to increase the diversity of the content they see. This is something that other platforms like Facebook, Instagram and YouTube could improve on, as their algorithms still keep users inside their own echo chambers (Perez, 2020).
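This is not TikTok’s actual algorithm (its details are not public); as a rough sketch of the idea, though, diversity injection can be modelled as an ‘exploration rate’: every so often, a personalized pick is swapped for a random item outside the user’s interests. The code below reuses the hypothetical `Item`, `score` and `personalized_feed` from the earlier sketch.

```python
import random


def diversified_feed(items: list[Item], user_interests: set[str],
                     explore_rate: float = 0.1) -> list[Item]:
    """Occasionally swap in an item outside the user's interests,
    so the feed is not built from the familiar alone."""
    familiar = personalized_feed(items, user_interests)
    unfamiliar = [it for it in items if score(it, user_interests) == 0.0]
    feed = []
    for item in familiar:
        if unfamiliar and random.random() < explore_rate:
            # Exploration step: show something the user never engaged with.
            feed.append(unfamiliar.pop(random.randrange(len(unfamiliar))))
        else:
            feed.append(item)
    return feed
```

With an `explore_rate` of 0.1, roughly one in ten slots would show something unfamiliar, which illustrates the trade-off: a small loss of relevance in exchange for a feed that is no longer a perfect mirror of the user’s existing views.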

Can we, the content consumers, pop the bubble ourselves? There are ways to bypass the filter or to find less filtered content. First of all, visiting websites that offer a wide range of content is a good start: sites that show multiple perspectives help you form a more complete view yourself. Content consumers can also use incognito mode and delete cookies; both methods de-personalize your content because they give the algorithms less information to work with. If we become more aware and actively look for unfiltered, more complete content, the filter bubble can be popped (FS, 2017; Pariser, 2011).

References:
FS. (2017, July 3). How Filter Bubbles Distort Reality: Everything You Need to Know. Retrieved 4 October 2021, from https://fs.blog/2017/07/filter-bubbles/

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin UK.

Perez, S. (2020, June 18). TikTok explains how the recommendation system behind its ‘For You’ feed works. Retrieved 4 October 2021, from https://techcrunch.com/2020/06/18/tiktok-explains-how-the-recommendation-system-behind-its-for-you-feed-works/
