How Facebook is keeping you in your own bubble

14 October 2017


It has been 11 years since Facebook first introduced its news feed. Since its inception, it has become a way for people to see what those around them are doing, liking and sharing. Over the years, Facebook has kept optimizing its algorithm so that your feed is as engaging and relevant to you personally as possible. In essence, this sounds great, but it has serious negative consequences: it perpetuates one’s existing beliefs and convictions.

Facebook’s algorithm takes hundreds of thousands of variables into account to optimize your news feed. To simplify, one can divide them into four core categories: creator, post, type and recency (Constine, 2016). The creator variable reflects your interest in the original content creator. The post variable reflects the traction a post is receiving among your friends and other users. Type concerns the format of the post: whether it is a picture, a video or a piece of text. Lastly, recency refers to the age of the post, since Facebook prioritizes new content.
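To make the four factors concrete, here is a minimal sketch in Python of how such a score might be combined. The field names, the half-life value and the multiplicative formula are illustrative assumptions, not Facebook’s actual model, which weighs hundreds of thousands of signals.

```python
from dataclasses import dataclass
import time

@dataclass
class Post:
    creator_affinity: float  # your historical interest in the creator (0..1)
    traction: float          # engagement among friends and other users (0..1)
    type_weight: float       # format preference, e.g. video vs. photo vs. text
    created_at: float        # unix timestamp of posting

def feed_score(post: Post, now: float | None = None,
               half_life_hours: float = 6.0) -> float:
    """Combine creator, traction, type and recency into a single score."""
    now = time.time() if now is None else now
    age_hours = max(0.0, (now - post.created_at) / 3600)
    # Recency decays exponentially: a post loses half its value
    # every `half_life_hours` (an assumed, illustrative parameter).
    recency = 0.5 ** (age_hours / half_life_hours)
    return post.creator_affinity * post.traction * post.type_weight * recency

# A fresh video from a creator you like outranks an older text post
# with identical affinity and traction.
fresh_video = Post(creator_affinity=0.9, traction=0.8, type_weight=1.0,
                   created_at=time.time() - 1800)    # 30 minutes old
old_text = Post(creator_affinity=0.9, traction=0.8, type_weight=0.5,
                created_at=time.time() - 86400)      # a day old
print(feed_score(fresh_video) > feed_score(old_text))  # True
```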

One might ask oneself: ‘Why would this be negative? I do not want to see content that does not interest me!’ Generally, that is a fair point. However, we are seeing a shift in society. With increased access to information, it is easier than ever for a person to find people and sources that agree with their point of view. This tendency is referred to as selective exposure (Bakshy et al., 2015): people seek out information that is consistent with their existing beliefs. Consequently, people generally see only one side of any given argument. The Facebook news feed reinforces this by filtering out information one does not like or agree with and surfacing content one is likely to interact with.
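As a purely illustrative sketch, not based on any real Facebook code, the short simulation below shows how such a filtering loop narrows what a user sees: posts far from the user’s own stance are dropped, and each round of personalization tightens the filter a little more. The `filtered_feed` helper and all numbers are hypothetical.

```python
import random

def filtered_feed(user_stance: float, posts: list[float],
                  tolerance: float) -> list[float]:
    """Keep only posts whose stance lies within `tolerance` of the user's."""
    return [p for p in posts if abs(p - user_stance) <= tolerance]

random.seed(42)
user_stance = 0.7   # the user's stance on some issue, on a 0..1 axis
tolerance = 0.4     # how dissimilar a post may be and still be shown

for step in range(5):
    posts = [random.random() for _ in range(1000)]  # all available content
    shown = filtered_feed(user_stance, posts, tolerance)
    spread = max(shown) - min(shown)
    print(f"step {step}: shown {len(shown)}/1000 posts, "
          f"opinion spread {spread:.2f}")
    tolerance *= 0.7  # each round of engagement tightens personalization
```

Even with a generous starting tolerance, the range of opinions shown collapses quickly once engagement feedback keeps tightening the filter.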

Facebook is obviously not the only one to blame here, but it is certainly one of the main culprits. Having one’s own point of view constantly reinforced creates a false sense of justification. When one sees only information consistent with one’s pre-existing assumptions, those assumptions start to look like absolute truth. The result is an environment in which every person with a given opinion feels they hold the absolute truth, generating enormous friction between groups with clashing views. People become ignorant of the other side’s arguments.

What can we do to combat this? Firstly, Facebook should be more transparent about how its algorithm works. The explanation given above is simplified and incomplete; the algorithm remains a black box. More importantly, however, people should actively challenge their own beliefs and stay open to discussion: try to exit the bubble.

Sources:
Bakshy, E., Messing, S. & Adamic, L.A., 2015. Exposure to ideologically diverse news and opinion on Facebook. Science. Available at: http://science.sciencemag.org/content/348/6239/1130.full [Accessed October 11, 2017].

Constine, J., 2016. How Facebook News Feed Works. TechCrunch. Available at: https://techcrunch.com/2016/09/06/ultimate-guide-to-the-news-feed/ [Accessed October 11, 2017].

