The Subtle Effects of AI Anthropomorphism

16 October 2023


Chatbots and generative AI are becoming increasingly important and are being integrated across industries. Almost every large company now offers an advanced chatbot that can handle complex queries. Smartphones also ship with built-in chatbots and personal assistants: examples include Siri (Apple), Alexa (Amazon), Bixby (Samsung), Cortana (Microsoft), and Google Assistant (Google).

Humans have a tendency toward anthropomorphism. Anthropomorphism refers to “attributing human characteristics, including physical appearances (e.g., face, eyes) or mental abilities (e.g., cognition and emotion) to nonhumans” (Waytz et al., 2014). The Computers Are Social Actors (CASA) paradigm holds that humans assign the same kinds of qualities to computers and chatbots that they assign to other humans. This shapes both how chatbots are designed and how they are treated.

The personal assistants integrated into smartphones, Siri, Alexa, Bixby, and Cortana, have female names or “sound” female because they ship with a female voice (Donald, 2019). This is not a coincidence but a design choice aimed at improving business performance. Why is that? Under the CASA paradigm, it can be theorized that humans apply the same stereotypical gender views to chatbots as to humans (Lee, 2003; Nass et al., 1997). Moreover, robots are perceived as more suitable for tasks that correspond to their perceived gender (Eyssel & Hegel, 2012; Otterbacher & Talias, 2017). Overall, female robots are evaluated more positively and produce a greater desire for contact (Stroessner & Benitez, 2019).
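To make the point concrete, here is a minimal, hypothetical sketch in Python of how “female by default” typically amounts to a single configuration line. The names used here (VoiceProfile, AVAILABLE_VOICES, select_voice) are invented for illustration and do not correspond to any vendor’s actual API.

```python
# Hypothetical sketch: a gendered default voice is a one-line design
# decision, not a technical necessity. All names here are invented.

from dataclasses import dataclass
from typing import Optional


@dataclass
class VoiceProfile:
    name: str
    perceived_gender: str  # how users tend to hear the voice


AVAILABLE_VOICES = [
    VoiceProfile(name="voice_a", perceived_gender="female"),
    VoiceProfile(name="voice_b", perceived_gender="male"),
    VoiceProfile(name="voice_c", perceived_gender="neutral"),
]

# The design decision at issue: the female-perceived voice ships
# preselected as the default.
DEFAULT_VOICE = AVAILABLE_VOICES[0]


def select_voice(user_choice: Optional[str] = None) -> VoiceProfile:
    """Return the user's chosen voice, falling back to the shipped default."""
    if user_choice is not None:
        for voice in AVAILABLE_VOICES:
            if voice.name == user_choice:
                return voice
    return DEFAULT_VOICE


if __name__ == "__main__":
    # Most users never change defaults, so this single assignment
    # determines what the vast majority of them hear.
    print(select_voice().perceived_gender)  # -> female
```

Because defaults are sticky, whichever voice ships preselected is the one most users will ever hear, which is what gives this small design lever its outsized effect.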

The problem with this is that it reinforces existing gender stereotypes. Considering that 6.5 billion people own a smartphone capable of running chatbots or personal assistants (Howarth, 2023), reinforcing these stereotypes has immense ramifications. For example, gender stereotypes contribute to poor mental health, higher male suicide rates, low self-esteem in girls, and body-image problems (Fawcett, 2021). While drawing a direct line from chatbot design choices to body-image issues would overstate the evidence, the sheer scale of exposure underscores why every sector of society should work to mitigate stereotyping.

(Image 1: DALL-E’s representation of a female-gendered AI digital assistant who notices that her gender has a profound negative effect on the stereotyping of female humans around the world, digital art. The AI is sad and wants to decrease the ramifications of her gender, but she does not have the power to stand up against big tech.)

In conclusion, while the design choice of giving smartphone personal assistants female names and voices may be driven by performance optimization, it is crucial to recognize that this choice can reinforce gender stereotypes and carry broader societal consequences. As we navigate the ever-evolving landscape of AI and chatbots, we must remain mindful of these implications and strive to reduce stereotyping.

References

Donald, S. J. (2019, August 18). Siri, Alexa, Cortana, and Why All Boats are a “She.” Medium; Voice Tech Podcast. https://medium.com/voice-tech-podcast/siri-alexa-cortana-and-why-all-boats-are-a-she-e4fb71b6a9f7

Fawcett Society. (2021, January 5). Gender stereotypes significantly limiting children’s potential, causing lifelong harm, commission finds. https://www.fawcettsociety.org.uk/News/gender-stereotypes-significantly-limiting-childrens-potential-causing-lifelong-harm-commission-finds

Höddinghaus, M., Sondern, D., & Hertel, G. (2021). The automation of leadership functions: Would people trust decision algorithms? Computers in Human Behavior, 116, 106635. https://doi.org/10.1016/j.chb.2020.106635

Howarth, J. (2021, November 19). How Many People Own Smartphones (2023-2028). Exploding Topics. https://explodingtopics.com/blog/smartphone-stats

Lee, E.-J. (2003). Effects of “gender” of the computer on informational social influence: The moderating role of task type. International Journal of Human-Computer Studies, 58(4), 347–362. https://doi.org/10.1016/S1071-5819(03)00009-0

Nass, C., Moon, Y., & Green, N. (1997). Are Machines Gender Neutral? Gender-Stereotypic Responses to Computers With Voices. Journal of Applied Social Psychology, 27(10), 864–876. https://doi.org/10.1111/j.1559-1816.1997.tb00275.x

Waytz, A., Heafner, J., & Epley, N. (2014). The mind in the machine: Anthropomorphism increases trust in an autonomous vehicle. Journal of Experimental Social Psychology, 52, 113–117. https://doi.org/10.1016/j.jesp.2014.01.005
