Data ethics: Facebook’s algorithm

12 October 2022


Every day, we spend hours on our phones scrolling through social media applications like Instagram, Twitter and Facebook. It is hardly surprising that we are exposed to ads while we scroll; nowadays, it is also common knowledge that these ads match our interests. Algorithms analyze our scrolling behavior and mine our personal profiles in order to predict what kind of products we are currently (perhaps unconsciously) craving.

On 1 May 2017, The Australian (2017) reported that Facebook uses emotional targeting on insecure young people. Emotional targeting uses predictions about users' emotions to decide which product recommendations to show. In itself, this is not very different from the way algorithms could already predict our buying behavior, yet Facebook received a great deal of criticism after the report. Critics argue that the problem does not lie with users who experience positive feelings and are shown matching ads that could reinforce their mood, but with users who experience negative feelings: algorithms know how to target this audience precisely when they are at their most vulnerable. In that light, the practice more closely resembles emotional manipulation than emotional targeting (Kulp, 2017).
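
To make the idea concrete, here is a minimal, purely hypothetical sketch of what emotion-based ad selection could look like: a predicted emotional state is mapped to an ad category. The emotion labels, the categories and the mapping are all illustrative assumptions and do not reflect Facebook's actual systems.

```python
# Hypothetical sketch of emotion-based ad selection: a predicted
# emotional state is mapped to an ad category. All names and the
# mapping are illustrative assumptions, not any real platform's system.
from typing import Optional

EMOTION_TO_AD_CATEGORY = {
    "insecure": "beauty_products",  # a vulnerable mood mapped to appearance ads
    "stressed": "wellness_apps",
    "happy": "travel_deals",
}

def pick_ad_category(predicted_emotion: str) -> Optional[str]:
    """Return the ad category mapped to the predicted emotion, if any."""
    return EMOTION_TO_AD_CATEGORY.get(predicted_emotion)

print(pick_ad_category("insecure"))  # -> beauty_products
```

The point of the sketch is that the targeting logic itself is trivial; the ethical weight sits entirely in the prediction step that feeds it.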

Subsequently, Facebook came forward with an official response, stating that among the millions of potential ad opportunities, emotional marketing was not one of them (Kulp, 2017). While this cannot be independently verified, Facebook did admit that it routinely studies user reactions and emotions through its "Compassion Research Team". One such experiment dates back to 2012, when Facebook's algorithm showed its users different types of content designed to trigger positive or negative feelings. The researchers then recorded how each type of content affected people's emotions by analyzing the negative and positive words in their posts (Meyer, 2014).
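
That measurement step can be illustrated with a small sketch: each post is scored by counting positive and negative words. The word lists below are tiny toy stand-ins; the actual 2012 experiment reportedly used the LIWC lexicon (Meyer, 2014).

```python
# Toy sketch of the measurement step: score each post by counting
# positive and negative words. The word lists are illustrative
# stand-ins; the real experiment reportedly used the LIWC lexicon.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "lonely"}

def emotion_score(post: str) -> int:
    """Positive minus negative word count; > 0 suggests a positive post."""
    words = [w.strip(".,!?") for w in post.lower().split()]
    positives = sum(w in POSITIVE_WORDS for w in words)
    negatives = sum(w in NEGATIVE_WORDS for w in words)
    return positives - negatives

posts = ["I love this wonderful day!", "Feeling sad and lonely today."]
print([emotion_score(p) for p in posts])  # -> [2, -2]
```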

This is just one of Facebook's many studies, and such studies often remain under the radar. Criticism of these techniques and algorithms has abounded in recent years, because, after all, where do we draw the line? Facebook thus exemplifies the problems surrounding the use of data in the social media industry. People may be aware that their behavior in apps is used to generate product recommendations, but it becomes problematic when researchers label the use of data as ethical simply because that data is available to them. For this reason, data ethics remains a grey area to this day, yet it is an important area of discussion.

References

Kulp, P. (2017, May 2). Ads will target your emotions and there's nothing you can do about it. [Online]. Retrieved from https://mashable.com/article/facebook-ad-targeting-by-mood#:~:text=Facebook%20issued%20a%20rare%20mea,state%2C%22%20the%20spokesperson%20said.

Meyer, R. (2014, June 28). Everything We Know About Facebook's Secret Mood-Manipulation Experiment. [Online]. Retrieved from https://www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/

The Australian. (2017, May 1). Facebook targets 'insecure' young people to sell ads. [Online]. Retrieved from https://www.theaustralian.com.au/subscribe/news/1/?sourceCode=TAWEB_WRE170_a_GGL&dest=https%3A%2F%2Fwww.theaustralian.com.au%2Fbusiness%2Fmedia%2Fdigital%2Ffacebook-targets-insecure-young-people-to-sell-ads%2Fnews-story%2Fa89949ad016eee7d7a61c3c30c909fa6&memtype=anonymous&mode=premium&v21=dynamic-groupa-test-noscore&V21spcbehaviour=append
