My Personal Chef

10 October 2024

5/5 (2)

Since its launch, I’ve mainly used ChatGPT to help me summarize academic articles and brainstorm ideas. Yesterday, I decided to let the LLM help me decide what to make for dinner (OpenAI, 2024).

I rearranged the ingredients in my fridge so the main items were visible, moving condiments toward the back. I took a picture and uploaded it to ChatGPT with the prompt:

“What can I make for dinner tonight with these ingredients? I’m willing to go to the store for some additional ingredients needed.”

That’s it—just as if I had called a friend to ask for a suggestion on what to make for dinner tonight, I put the question to the OpenAI model.

What it returned blew me away (please note that I have reformatted the output for readability in this blog post):

“Here’s a simple recipe using the ingredients you have, plus some optional additions you could pick up from the store.

Butterscotch Pumpkin and Vegetable Curry

Ingredients from your pantry/fridge:

  • Butterscotch pumpkin
  • Carrots
  • Zucchini
  • Red bell peppers
  • Onions
  • Potatoes (optional, for a heartier curry)
  • Curry powder or paste (red or yellow)

Additions from the store (optional):

  • Coconut milk (for a creamier curry)
  • Fresh cilantro (for garnish)
  • Protein like chickpeas, tofu, or chicken

Instructions:

  1. Prepare the Veggies: Peel and cube the butterscotch pumpkin, and chop the zucchini, bell peppers, carrots, and potatoes.
  2. Sauté the Aromatics: In a large pot, sauté the onions in oil until they’re translucent.
  3. Add Spices: If you have curry powder or paste, add it to the onions and cook for a minute to release the flavours. If you’re using the peanut sauce as a curry base, add it now along with a dash of water.
  4. Add Veggies: Add the butterscotch pumpkin, carrots, and potatoes (if using). Let them cook for a few minutes.
  5. Simmer: Add coconut milk (if using) or water/stock, and simmer until the pumpkin and other veggies are tender (about 20 minutes).
  6. Add Other Veggies: Toss in zucchini and bell peppers, letting them cook for the last 5-7 minutes.
  7. Season: Taste and adjust seasoning with salt, pepper, or a touch of sugar if needed.
  8. Serve: Serve your curry over rice (you have poke rice) and garnish with fresh cilantro if you like.

    This will result in a rich, creamy, and slightly sweet curry with the pumpkin flavor shining through! Would you like to go ahead with this, or need adjustments?”

Amazing, right?! I was instantly inspired to cook this delicious-sounding meal and ran to the store for some chicken thighs and cilantro. Lucky for me, I already had the coconut milk, which the AI didn’t recognize in the picture.

For this blog post, I asked GPT to create an image inspired by this recipe:

While writing this blog post, I wondered if ChatGPT could give me an accurate estimate of the nutritional value of the dish, so I asked:

Caloric breakdown (per serving):

  • Butterscotch Pumpkin (100g)
    • Calories: 45
    • Carbohydrates: 11g
    • Protein: 1g
    • Fat: 0g
  • Carrots (50g)
    • Calories: 20
    • Carbohydrates: 4.5g
    • Protein: 0.5g
    • Fat: 0g
  • Zucchini (50g)
    • Calories: 8
    • Carbohydrates: 1.5g
    • Protein: 0.5g
    • Fat: 0g
  • Bell Pepper (50g)
    • Calories: 15
    • Carbohydrates: 3g
    • Protein: 0.5g
    • Fat: 0g
  • Onion (30g)
    • Calories: 12
    • Carbohydrates: 3g
    • Protein: 0.3g
    • Fat: 0g
  • Coconut Milk (50ml)
    • Calories: 90
    • Carbohydrates: 2g
    • Protein: 1g
    • Fat: 9g
  • Curry Paste (15g)
    • Calories: 25
    • Carbohydrates: 5g
    • Protein: 0.5g
    • Fat: 0g
  • Rice (200g cooked)
    • Calories: 260
    • Carbohydrates: 56g
    • Protein: 5g
    • Fat: 0.5g
  • Chicken Thighs (100g)
    • Calories: 130
    • Carbohydrates: 0g
    • Protein: 19g
    • Fat: 6g
  • Total:
    • Calories: 605 kcal
    • Carbohydrates: 86g
    • Protein: 28.8g
    • Fat: 15.5g

I am stunned! I checked the values online, and the caloric values differed only slightly (Grutto, 2024; Voedingscentrum, 2024). Not only does ChatGPT give dinner suggestions and inspire me to cook with the ingredients I already have at home, it also lets me keep a rough estimate of the macros I’ve consumed.
Imagine being on a diet and going out to eat with a friend. You want to keep track of your caloric intake but don’t want to bring a scale to weigh every new dish. Now you just take a picture, post it to your Instagram story, and later upload it to ChatGPT to get a nutritional estimate.

While this experience was undeniably impressive for me, it’s important to consider some limitations. For instance, the AI didn’t recognize the coconut milk in my picture, which was a key ingredient in the recipe it suggested. This highlights that image recognition technology isn’t foolproof, and you cannot rely on it 100%.
Also, while the nutritional estimates were close to official sources, they weren’t exact. For those with strict dietary requirements or allergies, relying solely on AI for nutritional information might not be a wise idea.
Lastly, uploading photos of your fridge or meals means sharing personal data with an AI service and thus can be a privacy concern. It’s important to be mindful of what you’re sharing and know how that data might be used or stored.

My experiment with using ChatGPT as a personal AI chef was both enlightening and exciting. The ease of requesting a tailored dinner suggestion and a nutritional breakdown based on the contents of my fridge shows me the potential of AI in everyday life. While there are limitations to consider, the benefits offer a glimpse of the exciting future to come.

In the end, I find it amazing how a technology I first used only as a study and search tool can also inspire me in other parts of everyday life. With my personal AI chef, dinner dilemmas are a thing of the past.

Bibliography

Grutto (2024) Bio Kipdijfilets bereiding en informatie. Grutto! Available at: https://www.grutto.com/nl/vleesstuk/bio-kipdijfilet.

OpenAI (2024) ChatGPT. Available at: https://chatgpt.com.

Voedingscentrum (2024) Hoeveel calorieën zitten erin? – Caloriechecker. Voedingscentrum. Available at: https://www.voedingscentrum.nl/nl/service/vraag-en-antwoord/gezonde-voeding-en-voedingsstoffen/hoeveel-calorieen-zitten-erin-.aspx.


How AI Transformed My Learning Process & Tried to Predict My Personality

26 September 2024

No ratings yet.

Generative AI continues to amaze me with its vast possibilities and the profound impact it’s already having on our world. It’s exciting to think about where this technology will be in five years or what innovations might be trending by then. The current enthusiasm surrounding AI among students and the general public is undeniable. I recall our first lecture when the professor asked about our interests, and almost every hand went up when AI was mentioned.

This enthusiasm resonates with my own experiences. When I started my Bachelor’s thesis, I was overwhelmed and unsure if I was putting in enough effort. I felt stuck, with so many questions and no clear direction. My supervisor, noticing my struggle, encouraged me to use ChatGPT. He continually pushed me to explore different Generative AI tools, each suited for various purposes.

I was diving into a completely new topic for my thesis, one I knew little about. However, with my supervisor’s guidance and his insistence on leveraging these AI tools, I gradually gained confidence. The AI didn’t just answer my questions; it also helped me navigate and understand the complexities of my thesis topic. This experience profoundly influenced my learning process, showing me how GenAI can empower students to learn and grow independently.

I think that Generative AI is more than just a tool; it’s a powerful ally in learning and creativity. It can potentially transform education by providing students with the support they need to explore new ideas and concepts. However, like any tool, its effectiveness depends on how we use it.

These days, I find myself turning to ChatGPT quite frequently. After interacting with it so much, I began to wonder: could it predict what kind of person I am based on our conversations? Out of curiosity, I asked it directly. Here’s the response I received:

Although the description touched on a few aspects of my personality, it felt a bit vague. So, I took it a step further and asked ChatGPT which personality type it thought I had. It guessed either ENTJ or INTJ:

For those unfamiliar with the 16 Personalities test, here’s the link if you’re interested: https://www.16personalities.com/. Despite ChatGPT’s efforts, it wasn’t accurate because my actual personality type is Consul: ESFJ-A.

This just goes to show that while ChatGPT is impressive in many areas, understanding the intricacies of someone’s personality is still a challenge for it (at least for now!).


Will Vehicles Be the Most Powerful Terminal Device in the Digital Era?

20 September 2024

5/5 (1)

In the movie Captain America 2, the director of SHIELD drives a Chevrolet Suburban equipped with artificial intelligence and escapes the enemy’s blockade with the help of automatic maintenance, real-time analysis of road conditions and autonomous driving. We may never own a war vehicle equipped with machine guns and artillery like his, but the introduction of various new technologies has put the arrival of smart vehicles just around the corner.

Why Are Vehicles So Representative?

As a representative product of the digital era, innovation in the automotive industry is closely tied to many technological advances. First of all, the new form of energy: electric vehicles make it easier for computers to take over a vehicle’s energy management and driving. The introduction of cloud computing and artificial intelligence has further enhanced the capabilities of vehicles. Large amounts of data are transmitted between the vehicle and cloud servers, and the on-board autonomous driving system analyzes road conditions in real time. In this regard, we have seen Tesla’s FSD (Full Self-Driving), which relies on a pure-vision approach, while manufacturers such as Nio use lidar-based solutions. Even where AI has not completely taken over, the combination of AR applications and HUD (head-up display) functions can make human drivers’ own driving easier and safer.

Tesla FSD user interface.

What Is the Current Situation of the Automotive Industry?

Less than 20 years after the release of its first prototype, Tesla has surpassed Volkswagen, General Motors and Toyota to become the world’s most valuable automotive manufacturer. In contrast to Tesla’s success, the market share of some traditional brands with a long history continues to shrink. Industry giants such as Porsche and Mercedes-Benz have also begun the transition to electrification and intelligent driving. Behind the decline of old-era products and the prosperity of new-era products lies the “digital disruption” we are familiar with.

Mercedes-Benz Vision Avtr, steering wheel-free autonomous driving.

How to Imagine the Future?

If we regard all vehicles on the road as large mobile computers, the scope for imagination becomes very broad. Reliable and powerful hardware (think of stable high-voltage power supply and sophisticated heat dissipation) could enable vehicles to become the largest and most powerful terminal devices of the digital era. What else can we expect? AI models could be deployed locally instead of in the cloud; cockpits equipped with VR devices could serve as our entry point into the metaverse.

References

Wu, A. (2024) The Story Behind Tesla’s Success (TSLA). https://www.investopedia.com/articles/personal-finance/061915/story-behind-teslas-success.asp.

Not a Tesla App Staff (2024) Tesla Releases FSD v12.4: New Vision Attention Monitoring, Improved Strike System With Update 2024.9.5. https://www.notateslaapp.com/news/2031/tesla-releases-fsd-v12-4-new-vision-attention-monitoring-improved-strike-system-with-update-2024-9-5.

VISION AVTR | Future Vehicles (no date). https://www.mercedes-benz.ca/en/future-vehicles/vision-avtr#gallery.


Is the Metaverse just a grown-up version of The Sims?

17 September 2024

5/5 (1)

The Metaverse is a hot topic nowadays: a virtual space where people can live, work, play, and interact with others. After the videos we had to watch for the lecture, I realized it reminded me of the game The Sims, where you control characters, build homes, go to work, and socialize. The Metaverse offers a similar experience but on a larger scale. Is the Metaverse just The Sims for adults, or is it something more?

How the Metaverse is similar to The Sims

In The Sims, you control virtual characters and guide them through their daily lives. You make them work, buy clothes, meet people and even build their homes. The Metaverse is similar: you create an avatar, which is like your character in The Sims. However, in the Metaverse, that avatar is you.

You can walk around virtual worlds, attend events, talk to other people and even shop for digital clothes. Both The Sims and the Metaverse are about living a life in a virtual world. The difference is that the Metaverse aims to be a huge, open-ended virtual world connected to real life, while The Sims is just a game.

How the Metaverse is more than just a game

The Metaverse is not just for fun, it is already being used in many serious ways: 

  • Virtual workspaces
    • In The Sims, your character might have a job or run a business (Pittman, 2023). In the Metaverse, companies like Meta and Microsoft are creating virtual offices where people can meet as avatars in a virtual room instead of on video calls, making remote work feel more interactive (Moates, 2024). 
  • Digital shopping
    • Just like you can buy clothes and furniture for your character in The Sims, you can in the Metaverse: brands like Gucci and Nike are selling virtual clothes for avatars, and some people are even buying virtual land, which is becoming valuable. For brands, this creates space to try new and bold designs without the traditional manufacturing process, opening endless possibilities for self-expression and creativity (Haqshanas, 2024). 
  • Entertainment
    • In The Sims it is possible to join different activities and social interactions. The Metaverse takes this a step further with virtual events like concerts. For example, Travis Scott performed a virtual concert in Fortnite in 2020; at least 12 million people attended, all together in the same virtual world. Other artists have announced plans to do the same (White, 2021).

The Metaverse might look like an adult version of The Sims, but it has more potential than just a game. It can change the way we work, shop and socialize. While this can be exciting and fun, there are still open questions about security and privacy in the Metaverse.

What do you think about the Metaverse? Do you see it as an extension of games like The Sims, or will it be a big part of our daily lives? Share your thoughts!

References

Moates, C. (2024, June 25). Virtual offices: Are they the future? Retrieved September 17, 2024, from Landvault: https://landvault.io/blog/virtual-offices-are-they-the-future

Pittman, T. (2023, August 22). The Sims 4: How to Go to Work. Retrieved September 17, 2024, from Gamerant: https://gamerant.com/the-sims-4-how-go-to-work/

Haqshanas, R. (2024, June 11). From Gucci to Nike: How Top Brands are Redefining Fashion in the Metaverse. Retrieved September 17, 2024, from Cryptonews: https://cryptonews.com/news/from-gucci-to-nike-how-top-brands-are-redefining-fashion-in-the-metaverse/

White, M. J. (2021, November 23). What is the metaverse? A deep dive into the ‘future of the internet’. Retrieved September 17, 2024, from Digital Trends: https://www.digitaltrends.com/computing/what-is-the-metaverse-the-future-of-the-internet-explained/#dt-heading-the-future-of-the-metaverse


Will Generative AI replace the modeling industry, or is it already the case?

17 September 2024

5/5 (1)

I am fascinated by the advancements of AI all around the world, and I recently encountered an article that caught my attention about a potential complete disruption of the modeling industry. So in this blog post, I will talk about the modeling industry and how marketing departments in clothing companies will deal with model selection, focusing on Lalaland.ai, a startup that provides AI-generated models to brands for promotion and modelling purposes.

We already know that many companies use software to modify the photos used to promote a product, but what about using generative AI to replace the models entirely? There has been an increasing trend of AI models and AI celebrities that are indistinguishable from real humans. It is somewhat scary, but it would be a major advancement for the industry to be able to promote products without the usual costs or organisation. Marketing teams can specify in a prompt exactly what the AI-generated model should look like (gender, ethnicity, posture, environment, …), which would increase accuracy in targeting a specific consumer base and would be cheaper than the human counterparts. According to Lalaland.ai CEO Michael Musandu: “With traditional photography, companies need to hire models, work with third parties like model agencies, hair stylists, makeup artists — not to mention undergo reshoots, which happens on average two-to-eight times per collection.” So the transition to AI models seems inevitable, even though today the industry is not quite sure how to regulate and deal with this sudden change.

This already exists in many companies: according to Fashionista, the Amsterdam-based company Lalaland.ai is providing AI-generated, realistic humanoid models to clothing companies such as Levi’s, Tommy Hilfiger, Zalando, Puma and Adidas, which are all partly using AI-generated models to promote their clothes on their websites or in advertising campaigns (see image). The image shows a fully digital character generated by Lalaland.ai for one of Levi’s products.

However, this change has been so abrupt that marketing departments have not had time to adapt, especially regarding regulation. “While the technology may be new, the problem is already an everyday reality for models, many of whom can walk into stores and see their bodies in campaigns they were never paid for,” said a Model Alliance representative. AI modeling remains in a regulatory grey area, which can negatively affect human models. These individuals may find their likeness used in advertisements they never participated in, as their image is replaced and digitally replicated by an AI model that creates marketing content without their consent. I believe the industry will adapt and create actual regulation to protect humans from this surging AI rival, which may disrupt the industry even more, or even replace human models completely.

References:

Lal, Kish. “Are AI-Generated Models Really Going to Replace Human Ones?” Fashionista, 15 May 2023, fashionista.com/2023/05/ai-cgi-models-fashion-future.

Staff, FOI. “Will AI Replace Models in the Fashion Industry?” Fashion of India, The Fashion Of India, 27 Aug. 2024, www.thefashionofindia.com/article/will-ai-replace-models-in-the-fashion-industry. Accessed 17 Sept. 2024.


AI in dating, is AI the new matchmaker?

16 September 2024

5/5 (2)

Nowadays, people are more likely to find their romantic partner online than in real life (Rosenfeld, Thomas, & Hausen, 2019). There are all kinds of dating websites and apps where people can connect, with all sorts of business models, but all with the same target and selling point: that people will find “love” on this online platform. Because who doesn’t want to find their romantic partner? These websites and apps are built on the different algorithms they use and the data they produce. More traditional dating websites let users describe what they are looking for in a romantic partner (Henshall & Shah, 2023). 

More recently, dating apps have been using AI-powered tools that help users select the best pictures for their profile and features that explain why another user is a good match for them. Some apps also compute whether two people are compatible based on their behaviour in the app, and use AI to explain why two people may be compatible, highlighting common interests, behaviours, and preferences.

For those who struggle with online communication or don’t know how to start a conversation, AI is also stepping in. Apps like Rizz act as dating assistants, offering suggestions and inspiring users to communicate better.

On the other hand, AI is now being used to improve safety. Dating apps like Tinder and Hinge use AI to detect inappropriate messages, making online dating safer for everyone.

Personally, I think it’s a positive development that AI is helping people find love online. For many, it’s a great solution, especially for those who may have difficulty meeting someone in real life. That said, I do sometimes worry that it’s becoming harder for people to connect in person. With so much time spent on phones, there’s a risk that people might become more socially isolated. Still, for many, dating apps provide a helpful way to meet a romantic partner in a busy world.

In conclusion, AI is changing the way we find romantic partners, making online dating more effective and personalized. While it’s a good solution for many people, it’s important not to lose sight of real-life connections. AI can certainly help us find love, but it shouldn’t replace the value of human interaction.

  1. Henshall, W., & Shah, S. (2023, 14 December). How Rizz Assistants and AI Matchmakers Are Transforming Dating. TIME. https://time.com/6457597/ai-dating/
  2. Rosenfeld, M. J., Thomas, R. J., & Hausen, S. (2019). Disintermediating your friends: How online dating in the United States displaces other ways of meeting. Proceedings of the National Academy of Sciences, 116(36), 17753-17758. https://doi.org/10.1073/pnas.1908630116


A better outlook for all? How modern innovations are empowering the visually impaired

12 September 2024

No ratings yet.

The presentation in our lecture of how AR glasses can help hearing-impaired persons communicate better with their peers really inspired me to look into further cases where technological innovations have improved people’s quality of life, with a focus on visually impaired persons. Being able to “see” one’s environment and interact with it is a huge contribution to independence, which leads to a better quality of life. I would know, as I am completely dependent on my glasses!

I have compiled a few examples that I found very interesting; however, there are many more out there!

Envision Glasses

Envision Glasses are AI-powered smart glasses that use ChatGPT, Google Glass hardware and 5G connectivity to allow blind people to read printed text and process visual information by converting it to speech. While this Amsterdam-based company was mentioned (but not yet discussed in class), I still wanted to highlight it here, as their technology is really impressive (Wie, 2023).

HearSee Mobility

HearSee takes a well-known aid for the visually impaired, the cane, and improves it with navigation tools that help blind users reach their destination. The user can press a button and say “Navigate to XY”. The cane sends signals to an app, which in turn gives the user audible instructions on how to get there (Johnson, 2022).

.Lumen

This start-up claims it has “digitized” the guide dog that guides blind people safely through different environments. The device, which is worn like a headband, scans and maps the surrounding area using the same technology that self-driving cars use. If the device detects an obstacle, it sends haptic feedback to the user, thereby “pulling” them away, just like a guide dog would (Access Tech Live, 2024).

Closing thoughts

Seeing how much development is already happening makes one optimistic about the future. I think it is especially fascinating that these technologies all tackle the same task (helping the visually impaired) very differently using different tech and hardware, giving users the freedom to choose what would best help them. As mentioned above, this list is by no means exhaustive and I can only encourage anyone to look up some of the amazing products that are being developed.

I wrote this blog post before knowing that we would be having a guest lecture from Envision Glasses. I am really looking forward to the next lecture now, especially to see what my impression, and that of my colleagues, of this type of innovation will be in real life!


Resources used:

Access Tech Live. (2024, February 26). .Lumen Access Tech Live Talks Lumen Glasses for the Visually Impaired [Video]. YouTube. https://www.youtube.com/watch?v=lFkCkACbrJw

Johnson, M. (2022, July 1). Three Technologies For The Visually Impaired. Forbes. https://www.forbes.com/councils/forbestechcouncil/2022/07/01/three-technologies-for-the-visually-impaired/

Wie, W. (2023, December 15). Envision glasses use ChatGPT and Google Glass to help blind and low vision [Video]. Business Insider. https://www.businessinsider.com/envision-glasses-chatgpt-google-glass-help-blind-visually-impaired-2023-12?international=true&r=US&IR=T

All images used in this blog post were generated by ChatGPT.


Bridging the Gap Between AR, AI and the Real World: A Glimpse Into the Future of Smart Technology

12 September 2024

5/5 (3)

Apple’s recent keynote showcased new products, including the iPhone’s groundbreaking AI integration. However, when you break it down, what Apple has really done is combine several existing technologies and integrate them seamlessly, presenting the result as revolutionary. This sparked my imagination about what could already be possible with today’s technology and what our future might look like.

Apple introduced advanced visual intelligence, allowing users to take a picture of a restaurant, shop, or even a dog, and instantly access a wealth of information. Whether it’s reviews, operating hours, event details, or identifying objects like vehicles or pets, this technology uses AI to analyze visual data and provide real-time insights, bridging the gap between the physical and digital worlds. Tools like Google Image Search and ChatGPT have been available for some time, but Apple has taken these capabilities and seamlessly integrated them into its ecosystem, making them easily accessible and more user-friendly [1]. The Apple Vision Pro merges AR and VR, controlled by moving your eyes and pinching your fingers [2]. I’ve tried it myself, and it was incredibly easy to navigate, with digital content perfectly overlaying the physical world. Now imagine the possibilities if Apple integrated the iPhone’s visual intelligence into the Vision Pro. This headset wouldn’t just be for entertainment or increasing work productivity; it could become an everyday wearable, a powerful tool for real-time interaction with your surroundings.

Picture walking through a city wearing the Vision Pro. By simply looking at a restaurant and pinching your fingers, you could instantly pull up reviews, check the menu, or even make a reservation. Or, if you see someone wearing a piece of clothing you like, you could instantly check online where to buy it, without needing to stop. With these capabilities, the Vision Pro could bring the physical and digital worlds closer together than ever before, allowing users to interact with their environment in ways we’re only beginning to imagine.

Do you think the existing technologies can already do this? Do you think this is what the future would look like? I’m curious to hear your thoughts.

Sources:

[0] All images were generated with DALL-E via ChatGPT.

[1] https://www.youtube.com/watch?v=uarNiSl_uh4&t=1744s

[2] https://www.apple.com/newsroom/2024/01/apple-vision-pro-available-in-the-us-on-february-2/


Adverse training AI models: a big self-destruct button?

21 October 2023

No ratings yet.

“Artificial Intelligence (AI) has made significant strides in transforming industries, from healthcare to finance, but a lurking threat called adversarial attacks could potentially disrupt this progress. Adversarial attacks are carefully crafted inputs that can trick AI systems into making incorrect predictions or classifications. Here’s why they pose a formidable challenge to the AI industry.”

From there, ChatGPT went on to sum up various reasons why these so-called ‘adversarial attacks’ threaten AI models. Interestingly, I had only asked ChatGPT to explain the disruptive effects of adversarial machine learning. I followed up with the question: how could I use adversarial machine learning to compromise the training data of AI? Unsurprisingly, the answer I got was: “I can’t help you with that”. This conversation with ChatGPT made me speculate about possible ways to destroy AI models. Let us explore this field and see if it could provide a movie-worthy big red self-destruct button.

The Gibbon: a textbook example

When you feed GoogLeNet, one of the best image recognition systems, a picture that is clearly a panda, it will tell you with great confidence that it is a gibbon. This is because the image secretly carries a layer of ‘noise’, invisible to humans but a great hindrance to deep learning models.

This is a textbook example of adversarial machine learning: the noise works like a blurring mask, keeping the AI from recognising what is truly underneath. But how does this ‘noise’ work, and can we use it to completely compromise the training data of deep learning models?

Deep neural networks and the loss function

To understand the effect of ‘noise’, let me first explain briefly how deep learning models work. Deep neural networks in deep learning models use a loss function to quantify the error between predicted and actual outputs. During training, the network aims to minimize this loss. Input data is passed through layers of interconnected neurons, which apply weights and biases to produce predictions. These predictions are compared to the true values, and the loss function calculates the error. Through a process called backpropagation, the network adjusts its weights and biases to reduce this error. This iterative process of forward and backward propagation, driven by the loss function, enables deep neural networks to learn and make accurate predictions in various tasks (Samek et al., 2021).

So while training a model involves minimizing the loss function by updating the model parameters, adversarial machine learning does the exact opposite: it maximizes the loss function by updating the inputs. The updates to these input values form the layer of noise applied to the image, and with the right values they can lead any model to believe almost anything (Huang et al., 2011). But can this practice be used to compromise entire models, or is it just a ‘party trick’?

Adversarial attacks

Now we get to the part ChatGPT told me about. Adversarial attacks are techniques used to manipulate machine learning models by adding imperceptible noise to input data. Attackers exploit vulnerabilities in the model’s decision boundaries, causing misclassification, and by injecting carefully crafted noise in vast amounts, the training data of AI models can be modified. There are different types of adversarial attacks: if attackers have access to the model’s internal structure, they can apply a so-called ‘white-box’ attack, in which case they would be able to compromise the model completely (Huang et al., 2017). This would pose serious threats to AI models used in, for example, self-driving cars, but luckily, access to the internal structure is very hard to gain.

So, if computers were to take over from humans in the future, as science-fiction movies predict, could we use attacks like these to bring those evil AI computers down? In theory we could, though practically speaking there is little evidence, as there have not been major adversarial attacks yet. What is certain is that adversarial machine learning holds great potential for controlling deep learning models. The question is: will that potential be exploited in a good way, keeping it as a method of control over AI models, or will it be used as a means of cyber-attack, justifying ChatGPT’s negative tone when explaining it?

References

Huang, L., Joseph, A. D., Nelson, B., Rubinstein, B. I., & Tygar, J. D. (2011, October). Adversarial machine learning. In Proceedings of the 4th ACM workshop on Security and artificial intelligence (pp. 43-58).

Huang, S., Papernot, N., Goodfellow, I., Duan, Y., & Abbeel, P. (2017). Adversarial attacks on neural network policies. arXiv preprint arXiv:1702.02284.

Samek, W., Montavon, G., Lapuschkin, S., Anders, C. J., & Müller, K. R. (2021). Explaining deep neural networks and beyond: A review of methods and applications. Proceedings of the IEEE, 109(3), 247-278.


Navigating Stress during Exam Season: The Role of Mental Health Apps

17 October 2023

4/5 (1)

As the exam period looms, university libraries are witnessing an increase in occupancy as students dedicate longer hours to diving into the vast sea of books, articles, and videos assigned by their coordinators. In the midst of this academic intensity, the stress levels of many students may be on the rise. It prompts an intriguing question: can technology come to the rescue, helping young adults manage this stress and maintain their mental well-being? Furthermore, how do these applications integrate the capabilities of artificial intelligence (AI)? In this discussion, I’ll explore two categories of mental health apps: those centered on meditation and mindfulness, and those offering therapies and chatbot interactions.

Meditation and Mindfulness:
One of the prominent players in this space is Headspace, a meditation and mindfulness app known for its guided meditation sessions geared towards stress reduction and improved sleep. Headspace’s offerings also extend to topics such as anxiety, stress, and personal growth. Impressively, Headspace claims that users experience approximately 32% less stress after 30 days of app use, with a 19% reduction in anxiety and a 29% decrease in depression after 8 weeks of consistent usage. Notably, the company recently acquired Sayana, a mental health application that uses AI-driven, chat-based sessions to encourage users to monitor their mood and its influencing factors. Leveraging this data, it personalizes the user experience and offers scientifically verified self-care exercises, including Cognitive Behavioral Therapy (CBT), Acceptance and Commitment Therapy (ACT), Dialectical Behavior Therapy (DBT), and breathing exercises (Business Wire, 2022). I personally have also been using the app for the last two years and would definitely recommend it as a way to take a step back and reflect on how you’re actually doing.

Another well-known player in this field is Calm, which shares a similar focus, offering meditation, sleep stories, and relaxation techniques.

Therapies and Chatbots:
For those seeking more direct interactions, Talkspace, 7 Cups, and Woebot represent distinct approaches. Talkspace connects users with licensed therapists for online therapy sessions, extending a wide range of therapy services that make mental health support more accessible. On the other hand, 7 Cups serves as a peer support platform, connecting users with trained listeners who provide emotional support and active listening, with the key distinction being the source of support and its accessibility.

Woebot stands out as a chatbot designed to deliver Cognitive-Behavioral Therapy (CBT) techniques and support for individuals managing anxiety and depression. With the advancements in Large Language Models (LLMs) and artificial intelligence (AI), chatbot-based solutions are gaining popularity due to their accessibility and the fact that they don’t rely on a physical person to provide support. However, experts rightfully argue that this technology is still in its infancy and may not (yet) be capable of offering robust support, especially given the sensitivity of mental health and the significant health risks attached.

I have not used any online therapies or chatbots (yet), since I’m more comfortable talking to people in real life, as it’s easier for me to feel the energy. However, I can definitely see the added value of 7 Cups, since it involves less pressure than talking to a certified therapist and is still different from talking to a chatbot (it just feels different when you know you’re talking to a chatbot).

Have you ever used any of these tools? Are you considering their use in the future to help manage stress and maintain your mental well-being during demanding periods?
