Cooking for a Crowd with ChatGPT

29 September 2025


Last Saturday I experimented with generative AI in the kitchen. Since I didn’t want to order pizza, I used ChatGPT to help me figure out how much food and drink I needed to host a large crowd.

Anyone who has ever tried to cook for 8+ people probably knows the struggle. Recipes are usually written for four, and manually multiplying quantities gets fiddly when the headcount isn’t a neat multiple of that. So, I decided to see whether ChatGPT could help me. I gave it a pasta recipe for four people and asked it to scale it up for 17. It gave me a detailed shopping list with adjusted quantities of pasta, sauce, and vegetables.
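Under the hood, this is just one scale factor applied to every ingredient. Here’s a minimal sketch of that step in Python; the ingredient quantities are made up for illustration, not the recipe I actually used:

```python
# Scaling a 4-serving recipe up to 17 servings: one multiplier for everything.
# These ingredient quantities are hypothetical, not my actual recipe.

BASE_SERVINGS = 4
TARGET_SERVINGS = 17

recipe = {
    "pasta (g)": 400,
    "tomato sauce (ml)": 500,
    "olive oil (tbsp)": 3,
    "zucchini (pieces)": 2,
}

factor = TARGET_SERVINGS / BASE_SERVINGS  # 17 / 4 = 4.25
shopping_list = {item: qty * factor for item, qty in recipe.items()}

for item, qty in shopping_list.items():
    print(f"{item}: {qty:g}")
```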

I was impressed, since it saved me a lot of thinking time. But I noticed a big limitation: ChatGPT assumes you can just multiply everything linearly. For example, it told me to use 15 tablespoons of olive oil in one pan. That doesn’t actually work, because when cooking for large groups you don’t cook everything at once in a single pan. You need to split the recipe into batches, because all of it won’t fit in one pan. Or at least, my pan wasn’t big enough.
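The step ChatGPT skipped is the batch math I ended up doing by hand. A small sketch of that ceiling-division logic, assuming a made-up pan capacity of six servings:

```python
import math

# Hypothetical capacity: assume my pan holds about 6 servings at once.
PAN_CAPACITY_SERVINGS = 6
TARGET_SERVINGS = 17

batches = math.ceil(TARGET_SERVINGS / PAN_CAPACITY_SERVINGS)  # 3 batches
servings_per_batch = TARGET_SERVINGS / batches                # ~5.7 servings each

# Each scaled quantity then gets divided across the batches, e.g. the oil:
oil_total_tbsp = 15                       # ChatGPT's single-pan suggestion
oil_per_batch = oil_total_tbsp / batches  # 5 tbsp per pan, far more workable

print(f"{batches} batches of ~{servings_per_batch:.1f} servings each")
print(f"olive oil per batch: {oil_per_batch:g} tbsp")
```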

I also tried the same experiment with cocktails. This worked surprisingly well: scaling up espresso martinis for 17 people gave me an accurate shopping list for coffee, vodka, coffee liqueur, and edible coffee beans. Still, ChatGPT forgot that the drinks should be made in pitchers or batches, not in one giant shaker. Since I had to make both the food and the cocktails for 17 people, I still had to figure out how big every batch needed to be and what would fit in my pans and cocktail shakers.
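The same ceiling-division trick converts per-drink quantities into purchasable bottles and pitcher batches. A sketch under assumed pour sizes and capacities; none of these numbers come from ChatGPT’s actual output:

```python
import math

# All pour sizes and capacities below are assumptions for illustration.
DRINKS = 17
VODKA_PER_DRINK_ML = 50    # a common espresso martini pour; adjust to taste
BOTTLE_ML = 700            # typical vodka bottle size
DRINK_VOLUME_ML = 120      # rough total volume of one finished drink
PITCHER_ML = 1500          # what my hypothetical pitcher holds

vodka_total_ml = DRINKS * VODKA_PER_DRINK_ML                        # 850 ml
bottles = math.ceil(vodka_total_ml / BOTTLE_ML)                     # 2 bottles
pitcher_batches = math.ceil(DRINKS * DRINK_VOLUME_ML / PITCHER_ML)  # 2 batches

print(f"vodka: {vodka_total_ml} ml -> buy {bottles} bottle(s)")
print(f"mix in {pitcher_batches} pitcher batch(es)")
```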

This Saturday experiment showed me that AI is a great starting point for meal and party planning, but human judgment is still needed to sanity-check the answers ChatGPT generates.


Instagram’s Invisible Hand: How Algorithms Fuel Online Radicalization

16 September 2025


When Charlie Kirk died, my Instagram feed changed quickly.

At first, I saw NOS report on Instagram that Charlie Kirk had been fatally shot. Within an hour, more and more news outlets were reporting the same thing. Then something changed. My Instagram ‘For You’ page shifted from mourning to outrage, then from outrage to ideology. I had liked two posts, not necessarily out of agreement, but as a means of engagement. By then, the algorithm had noticed my attention and begun changing my feed accordingly.

It began showing me tribute posts, old podcast clips of him, and people’s reactions to his death. Soon, it showed me content that had nothing to do with Kirk at all: posts about immigration, nationalism, and the collapse of Western values. While scrolling, these were the only posts I’d get, unless I went back to my ‘For You’ page and consciously picked a thumbnail that didn’t look political. Even then, my reels turned political again after a while. As someone who follows both political sides to stay informed, I was shown increasingly extreme content from both left- and right-wing views. The algorithm didn’t know what I believed. It only knew I was paying attention.

Digital disruption has changed how news is consumed (Nawale & Kumar, 2023). Digital disruption refers to changes driven by digital technologies that happen at a speed and scale that transform established ways of creating value (Digital Disruption Research Group, n.d.). Where we once got information at set times, through newspapers or TV broadcasts, we now get it constantly through Instagram reels and other forms of social media. Traditional news companies such as De Telegraaf or The New York Times had to adapt, and they no longer necessarily control the narrative. Now, algorithms do.

In my opinion, these consequences are dangerous. Extremist groups exploit trending events to spread ideology under the radar of casual scrolling. This, combined with algorithmic reinforcement, creates a loop where radical content thrives (Akram & Nasar, 2023). According to Ravi and Yuan (2024), platforms like Facebook and TikTok don’t just reflect beliefs; they actively shape them. I fear that as a society we will become more polarized, pushed to either the extreme left or the extreme right. Not by conscious choice, but by the invisible hand of algorithmic design.

References

Akram, M., & Nasar, A. (2023). Systematic review of radicalization through social media. Ege Akademik Bakış (Ege Academic Review), 23(2), 279–296. https://doi.org/10.21121/eab.1166627

Digital Disruption Research Group. (n.d.). Digital Disruption Research Group. The University of Sydney. https://www.sydney.edu.au/business/our-research/research-groups/digital-disruption-research-group.html

Nawale, R. D., & Kumar, L. (2023). Exploring the impact of social media on the dynamics of news consumption: A study on its effectiveness. International Journal of Current Science, 13(2), 303–305. https://www.ijcspub.org/papers/IJCSP23B1040.pdf

Ravi, K., & Yuan, J.-S. (2024). Ideological orientation and extremism detection in online social networking sites: A systematic review. Intelligent Systems with Applications, 15, 200456. https://doi.org/10.1016/j.iswa.2024.200456
