Instagram’s Invisible Hand: How Algorithms Fuel Online Radicalization

16 September 2025


When Charlie Kirk died, my Instagram feed changed quickly.

At first, I saw a post from NOS reporting that Charlie Kirk had been fatally shot. Within an hour, more and more news outlets were reporting the same thing. Then something shifted: my Instagram ‘For You’ page moved from mourning to outrage, and from outrage to ideology. I had liked two posts, not necessarily out of agreement but simply as a way of engaging. By then, the algorithm had noticed my attention and begun reshaping my feed accordingly.

It began showing me tribute posts, clips from his old podcasts, and people’s reactions to his death. Soon, it was showing me content that had nothing to do with Kirk at all: posts about immigration, nationalism and the collapse of Western values. As I scrolled, these were the only posts I got, unless I went back to my ‘For You’ page and consciously picked a thumbnail that didn’t look political. Even then, my reels drifted back to politics after a while. As someone who follows both political sides to stay informed, I was shown increasingly extreme content from both the left and the right. The algorithm didn’t know what I believed. It only knew I was paying attention.

Digital disruption has changed how news is consumed (Nawale & Kumar, 2023). Digital disruption refers to change driven by digital technologies that happens at a speed and scale that transform established ways of creating value (Digital Disruption Research Group, n.d.). Where we once received information at set times, through newspapers or television broadcasts, we now receive it constantly through Instagram reels and other forms of social media. Traditional news companies such as De Telegraaf or The New York Times had to adapt, and no longer necessarily control the narrative. Now, algorithms do.

In my opinion, this shift is dangerous. Extremist groups exploit trending events to spread ideology under the radar of casual scrolling. Combined with algorithmic reinforcement, this creates a loop in which radical content thrives (Akram & Nasar, 2023). According to Ravi and Yuan (2024), platforms like Facebook and TikTok don’t just reflect beliefs; they actively shape them. I fear that as a society we will become more polarized, pushed to the extreme left or the extreme right not by conscious choice, but by the invisible hand of algorithmic design.
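Instagram’s real ranking system is proprietary, so there is no way to show its actual code. Still, the feedback loop described above is easy to sketch. The toy Python below assumes a simple engagement-weighted scorer; every name in it (Post, build_feed, register_engagement, the boost factor) is invented for illustration and has nothing to do with Instagram’s real system.

```python
# Toy sketch of an engagement-driven feedback loop.
# All names here are hypothetical; this is NOT Instagram's ranking logic.
from dataclasses import dataclass


@dataclass
class Post:
    topic: str
    engagement_weight: float = 1.0  # grows every time the user interacts


def build_feed(posts, topic_affinity, k=5):
    """Rank posts by (topic affinity x engagement weight); the top k become the feed."""
    ranked = sorted(
        posts,
        key=lambda p: topic_affinity.get(p.topic, 0.1) * p.engagement_weight,
        reverse=True,
    )
    return ranked[:k]


def register_engagement(post, topic_affinity, boost=1.5):
    """Any interaction (a like, a long watch, a share) raises the topic's
    affinity, whether or not the user actually agrees with the content."""
    topic_affinity[post.topic] = topic_affinity.get(post.topic, 0.1) * boost
    post.engagement_weight *= boost


# Simulate: two likes on political posts are enough to tilt the whole feed.
pool = [Post("politics") for _ in range(10)] + [Post("sports") for _ in range(10)]
affinity = {"politics": 0.1, "sports": 0.1}

register_engagement(pool[0], affinity)  # first like
register_engagement(pool[1], affinity)  # second like

feed = build_feed(pool, affinity)
print([p.topic for p in feed])  # politics now dominates the top of the feed
```

Even in this toy, two interactions are enough to tip the ranking: the scorer never asks whether the user agrees with a topic, only whether they paid attention to it.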

References

Akram, M., & Nasar, A. (2023). Systematic review of radicalization through social media. Ege Akademik Bakış (Ege Academic Review), 23(2), 279–296. https://doi.org/10.21121/eab.1166627

Digital Disruption Research Group. (n.d.). Digital Disruption Research Group. The University of Sydney. https://www.sydney.edu.au/business/our-research/research-groups/digital-disruption-research-group.html

Nawale, R. D., & Kumar, L. (2023). Exploring the impact of social media on the dynamics of news consumption: A study on its effectiveness. International Journal of Current Science, 13(2), 303–305. https://www.ijcspub.org/papers/IJCSP23B1040.pdf

Ravi, K., & Yuan, J.-S. (2024). Ideological orientation and extremism detection in online social networking sites: A systematic review. Intelligent Systems with Applications, 15, 200456. https://doi.org/10.1016/j.iswa.2024.200456


2 thoughts on “Instagram’s Invisible Hand: How Algorithms Fuel Online Radicalization”

  1. I completely agree: algorithms now dictate what we consume and when we consume it, and they alter our beliefs. This issue is more prevalent than ever, and I first came across it with figures like Andrew Tate a couple of years ago. My algorithm also showed me polarizing views across the political spectrum, regardless of whether I approved of the message or not.

    Engaging with posts, whether by merely watching them, opening the comments, or sending them to friends, shows the algorithm that you are interested, even though you may not agree with the message at all. All of a sudden, my feed was filled with content I didn’t agree with. You get sucked into ‘echo chambers’ and information bubbles that reinforce certain types of thinking (Del Vicario et al., 2016).
    A recent fictional show actually covered how young boys who consume ‘alpha-male’ content online become increasingly radicalized. The show is called ‘Adolescence’ and gained major popularity for shedding light on the societal issues social media raises and the impact it can have on the younger generation. Though the show is fictional, it borders on reality and shows the danger of these online ‘echo chambers’ and their influence on our beliefs. I recommend you check out the show if you haven’t already; just yesterday several of its actors received Emmys for their performances!

    Reference:
    Del Vicario, M., Vivaldo, G., Bessi, A., Zollo, F., Scala, A., Caldarelli, G., & Quattrociocchi, W. (2016). Echo chambers: Emotional contagion and group polarization on Facebook. arXiv. https://arxiv.org/abs/

  2. I like how you related the larger problem of digital disruption to your own feed experience. It shows really well how algorithms now shape stories faster than traditional media ever could, which makes their impact on polarisation as interesting as it is alarming.
