The main engine behind a social media company’s profit is advertising. Facebook, for example, wields a digital weapon in its ad algorithm. Each advertisement is tailored to your preferences, curiosities, interests, and activity. Yet even if users change their online behavior for a month, the data collected is an accumulation of a much longer period. The ad algorithm therefore bases its knowledge on your entire digital life span rather than your most recent choices.
Despite users’ attempts to steer away from personalized advertisements, the algorithm is too powerful. Not only does it draw on signals such as Facebook likes, friend groups, pages, and post engagement, but it also reaches beyond the social media application itself. You are tracked across the websites you visit, the purchases you make, the items you add to a shopping cart, the videos you watch, and the time you spend on an image or a post. The list is endless.
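To make the idea concrete, here is a purely illustrative sketch, in Python, of how such scattered signals might be folded into a single interest profile. It is not Facebook’s actual system; every name, signal type, and weight below is a hypothetical assumption.

# Illustrative only: hypothetical signal types and weights, not a real ad system.
from collections import defaultdict

SIGNAL_WEIGHTS = {
    "like": 1.0,                  # a Facebook like
    "page_follow": 2.0,           # following a page
    "post_comment": 1.5,          # engaging with a post
    "external_site_visit": 0.5,   # tracking beyond the app
    "add_to_cart": 3.0,           # shopping-cart addition
    "purchase": 5.0,              # completed purchase
    "video_watch_seconds": 0.01,  # scaled by watch time
    "dwell_time_seconds": 0.005,  # time spent on an image or post
}

def build_interest_profile(events):
    """Accumulate a topic -> score map from a long history of events."""
    profile = defaultdict(float)
    for event in events:
        weight = SIGNAL_WEIGHTS.get(event["type"], 0.0)
        profile[event["topic"]] += weight * event.get("value", 1.0)
    return profile

# A long accumulation of small signals, not just last month's activity.
events = [
    {"type": "like", "topic": "running"},
    {"type": "add_to_cart", "topic": "running"},
    {"type": "video_watch_seconds", "topic": "cooking", "value": 240},
]
print(dict(build_interest_profile(events)))  # {'running': 4.0, 'cooking': 2.4}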
The algorithm slowly builds a virtual character of each person on the platform and decides what the user wants to see. As long as there is engagement and activity online, it keeps feeding on information and improving. The advertisements soon mirror your interests, and even the smallest interaction can make a difference.
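The feedback loop can be sketched in the same spirit: each interaction nudges the profile, and ads are then ranked against it. Again, the update rule, parameters, and numbers are illustrative assumptions, not a real API.

# Illustrative only: a toy feedback loop, not Facebook's algorithm.
def record_interaction(profile, topic, strength, learning_rate=0.1):
    """Nudge the score for a topic; even a tiny interaction moves it."""
    profile[topic] = profile.get(topic, 0.0) + learning_rate * strength
    return profile

def rank_ads(profile, candidate_ads):
    """Order candidate ads by how well their topic matches the profile."""
    return sorted(candidate_ads,
                  key=lambda ad: profile.get(ad["topic"], 0.0),
                  reverse=True)

profile = {"running": 4.0, "cooking": 2.4}
profile = record_interaction(profile, "travel", strength=0.5)  # one small click
ads = [{"id": 1, "topic": "cooking"},
       {"id": 2, "topic": "running"},
       {"id": 3, "topic": "travel"}]
print([ad["id"] for ad in rank_ads(profile, ads)])  # [2, 1, 3]

Even the single small "travel" click above is enough to pull a new ad topic into the ranking, which is the point the paragraph makes about tiny interactions mattering.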
People try to stop this and slow the constant stream of new ads. However, the only real escape from this digital prison is logging off the internet entirely. Unfortunately, that is quite difficult in today’s world: individuals depend on electronic devices, applications, software, and online socializing. Further, hiding an ad or refusing to interact with it does not stop the algorithm from working; rather, it gives the algorithm yet another signal with which to refine itself and dissect the data further.
The inner workings of the algorithm may be opaque and invisible to both businesses and users. Nevertheless, there must be a way to limit advertisement targeting. Can governments intervene and regulate social media advertising?
The next step is placing boundaries on the extent to which advertisements can draw on an individual’s private life, since an advertisement can sometimes trigger a negative emotion. Can governments and institutions gain control over the ad algorithm and draw a red line so that both parties (businesses and consumers) benefit, but with clear constraints on ad topics and personal data usage?