—
When Charlie Kirk died, my Instagram feed changed quickly.
At first, I saw NOS post on Instagram that Charlie Kirk had been fatally shot. Within an hour, more and more news outlets were reporting the same thing. Then something changed. My Instagram ‘For You’ page shifted from mourning to outrage, then from outrage to ideology. I had liked two posts, not necessarily out of agreement, but as a means of engagement. By then, the algorithm had noticed my attention and begun changing my feed accordingly.
It began showing me tribute posts, old podcast clips of him, and people’s responses to his death. Soon, it showed me content that had nothing to do with Kirk at all: posts about immigration, nationalism and the collapse of Western values. While I scrolled, these were the only posts I’d get, unless I went back to my ‘For You’ page and consciously picked a thumbnail that didn’t look political. As someone who follows both political sides to stay informed, I was shown increasingly extreme content from both left- and right-wing perspectives. The algorithm didn’t know what I believed. It only knew I was paying attention.
Digital disruption has changed how news is consumed (Nawale & Kumar, 2023). Digital disruption refers to changes driven by digital technologies that happen at a speed and scale that transform established ways of value creation (Digital Disruption Research Group, n.d.). Where once we got information at set times, from newspapers or television broadcasts, we now get it constantly, through Instagram reels and other forms of social media. Traditional news companies such as De Telegraaf or The New York Times had to adapt, and no longer necessarily control the narrative. Now, algorithms do.
In my opinion, these consequences are dangerous. Extremist groups exploit trending events to spread ideology under the radar of casual scrolling. This, combined with algorithmic reinforcement, creates a loop where radical content thrives (Akram & Nasar, 2023). According to Ravi and Yuan (2024), platforms like Facebook and TikTok don’t just reflect beliefs; they actively shape them. I fear that as a society, we will become more polarized, pushed to the extreme left or extreme right not by conscious choice, but by the invisible hand of algorithmic design.
—
References
Akram, M., & Nasar, A. (2023). Systematic review of radicalization through social media. Ege Akademik Bakış (Ege Academic Review), 23(2), 279–296. https://doi.org/10.21121/eab.1166627
Digital Disruption Research Group. (n.d.). Digital Disruption Research Group. The University of Sydney. https://www.sydney.edu.au/business/our-research/research-groups/digital-disruption-research-group.html
Nawale, R. D., & Kumar, L. (2023). Exploring the impact of social media on the dynamics of news consumption: A study on its effectiveness. International Journal of Current Science, 13(2), 303–305. https://www.ijcspub.org/papers/IJCSP23B1040.pdf
Ravi, K., & Yuan, J.-S. (2024). Ideological orientation and extremism detection in online social networking sites: A systematic review. Intelligent Systems with Applications, 15, 200456. https://doi.org/10.1016/j.iswa.2024.200456