Living in your own bubble

21 September 2022


We all know them: the advanced recommendation tools woven into every trace we leave on the internet, continuously feeding us information that aligns with our interests, be it political choices or other personal affairs. Wonderful, right? The internet serves up information that matches your interests; that seems harmless. Or does it? Are these personalized algorithms the cause of individual isolation?


This concept is called the “filter bubble”: the idea that search engines and social media, together with their recommendation and personalization algorithms, are centrally culpable for the societal and ideological polarisation experienced in many countries (Bruns, 2019). Filter bubbles have even been cast as critical contributors to the election of Trump, the Brexit vote, and the rise of Bolsonaro. These algorithms strengthen the ideology of the user, confirming their existing beliefs, attitudes, and vision of the world. For example, if a user searches for or expresses a liking for the Democratic Party, it is probable that they will receive more (positive) information about this party. This phenomenon is not limited to search engines: the Facebook news feed algorithm, for instance, will tend to amplify news that your political companions favour (Pariser, 2015).
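To make that feedback loop concrete, here is a minimal sketch in Python. The catalogue, topic labels, and scoring are entirely made up for illustration; no real platform works exactly like this. It shows the core mechanic, though: rank items by how often the user already engaged with their topic, so every click narrows what surfaces next.

```python
from collections import Counter

# Hypothetical catalogue of (headline, topic) pairs.
ITEMS = [
    ("Democrats unveil climate plan", "democrat"),
    ("Republicans push tax reform", "republican"),
    ("Democrats lead in new poll", "democrat"),
    ("Republican rally draws record crowd", "republican"),
    ("Independent candidate announces run", "independent"),
]

def recommend(history: list[str], k: int = 3) -> list[str]:
    """Rank items by how often the user already engaged with their topic."""
    topic_counts = Counter(history)
    ranked = sorted(ITEMS, key=lambda item: topic_counts[item[1]], reverse=True)
    return [headline for headline, _ in ranked[:k]]

# A user whose click history leans one way...
history = ["democrat", "democrat", "republican"]
print(recommend(history))
# ...is shown mostly like-minded headlines, and each further click appended
# to `history` skews the next ranking even more.
```

Run this repeatedly while appending each clicked topic to `history`, and the self-reinforcing loop becomes visible: minority topics drop out of the top-k entirely. That, in miniature, is the filter bubble.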


However, this raises serious concerns, as social media is acknowledged as a primary source of news and information. Furthermore, these personalized algorithms connect users who share an ideology, creating exclusive communities of like-minded people and raising the barriers towards people who hold different opinions. A study of 10.1 million U.S. Facebook users with self-reported ideological affiliation found that more than 80% of their Facebook friendships shared the same party affiliation (Bakshy et al., 2015). While this homophily can be beneficial, it also poses a threat at the extremes of the ideological spectrum.


My main concern with these personalized algorithms is that people will develop tunnel vision, which will shape their ideology. Users will no longer be challenged by divergent perspectives that could widen their horizons, leaving them preserved in their own “bubble” and, in extreme cases, resulting in radicalization. Consider the Christchurch attack of 2019, a terrorist attack that was live-streamed on Facebook. The perpetrator was inspired by Facebook groups and communities promoting white nationalism and white separatism (Wong, 2019). Unfortunately, I think this phenomenon will only worsen as people rely more on information from the internet and increasingly seclude themselves from contrasting beliefs.


Do you find it far-fetched that filter bubbles can affect people’s ideologies and provoke radicalization?

References:

Bakshy, E., Messing, S., & Adamic, L. A. (2015). Exposure to ideologically diverse news and opinion on Facebook. Science, 348(6239), 1130–1132.
Bruns, A. (2019). Filter bubble. Internet Policy Review, 8(4). https://doi.org/10.14763/2019.4.1426
Pariser, E. (2015, May 7). Did Facebook’s Big Study Kill My Filter Bubble Thesis? Wired. Retrieved 18 September 2022, from https://www.wired.com/2015/05/did-facebooks-big-study-kill-my-filter-bubble-thesis/
Wong, J. C. (2019, March 30). Facebook finally responds to New Zealand on Christchurch attack. The Guardian. Retrieved 18 September 2022, from https://www.theguardian.com/us-news/2019/mar/29/facebook-new-zealand-christchurch-attack-response


1 thought on “Living in your own bubble”

  1. Very interesting! I agree that filter bubbles’ selective exposure can affect people’s ideologies, since everybody’s FYP (For You Page) will only reconfirm their beliefs. Our algorithms surround us with like-minded people and hide what we disagree with. By avoiding disagreeable information, we are ‘protected’ from inconsistent beliefs that could cause confusion.

    You might find this article interesting:
    van Prooijen, J. W., Cohen Rodrigues, T., Bunzel, C., Georgescu, O., Komáromy, D., & Krouwel, A. P. (2022). Populist gullibility: Conspiracy theories, news credibility, bullshit receptivity, and paranormal belief. Political Psychology.

    The study found that people with populist attitudes are more receptive to bullshit statements and supernatural beliefs, and tend to judge politically neutral news items (no matter the source) as biased or fake; the authors also refer to the Trump campaign and Brexit.
