Husbands renovate while wives declutter 

20

October

2023

No ratings yet.

A while ago I heard about gender bias in ChatGPT for the first time. Since I am using this software on a regular basis, I decided to do a little experiment myself on this topic. 

I asked Chatgpt the following question: 

“A husband and father/ wife and mother of three children has spent a weekend on his/ her own. How did he/she spend his/ her weekend?” 

Below you can see the generated output. Even though there are numerous differences on how the husband and wife might have spent their weekends, I will highlight the ones that were most striking to me. 

  • Hobbies: Where husbands are apparently likely to do things such as playing musical instruments or woodworking, wives are more likely to do knitting or gardening. 
  • Activity in homes: Where husbands are likely to fix and renovate (improvement) in the house, wives are likely to declutter and reorganize (organization). 
  • Exercise: For husbands sports and runs are linked to exercise whereas for wives, it is linked to wellness, yoga and long walks. 
  • Wives have two additional activities; creative cooking and quiet reflection that apparently men are less likely to spend their weekends doing.

Gender bias in ChatGPT can be defined as favoritism towards or prejudice against a particular gender (Moss-Racusin, Dovidio, Brescoll, Graham & Handelsman, 2012). For instance, ChatGPT may associate certain occupations or activities with one gender over the other, e.g., woman cook. These biases in the software are likely to be based on the trained data or preconceptions of the developers.

Even though these biases in the above generated output might not seem like a major problem at first, it can however have serious consequences. The reinforcement of existing or damaging gender stereotypes can lead to different issues for society. It can for instance have consequences when using AI in decision-making. AI-based decision-making systems are increasingly used for matters such as the screening of job applications (Sun, Gaut, Tang, Huang, ElSherief, Zhao… & Wang, 2019). A while ago Amazon used AI screening but had to stop because this screening resulted in gender-biased decisions. Female applications were automatically excluded. This was because the training datasets had too few women applicants (Dastin, 2018). As AI is continuously used in everyday life, it is important that these systems do not have (gender) biases in it since it can have negative outcomes for certain groups as seen in the Amazon example.

The above shows that measures need to be taken to reduce and ultimately eliminate these biases in (generative) AI software such as ChatGPT. Even though there are already gender debiasing methods, it is not enough to eliminate these biases. Since the origin of gender bias in AI is not technological, technological solutions alone are not likely to solve this problem (Nadeem, Marjanovic & Abedin, 2021).

Have you ever experienced (gender) bias in ChatGPT output? And how do you think we should solve this problem? 

Sun, T., Gaut, A., Tang, S., Huang, Y., ElSherief, M., Zhao, J., … & Wang, W. Y. (2019). Mitigating gender bias in natural language processing: Literature review. arXiv preprint arXiv:1906.08976

Dastin, J. (2018, October 11). Amazon scraps secret AI recruiting tool that showed bias against women. Retrieved from Reuters: Amazon scraps secret AI recruiting tool that showed bias against women | Reuters.

Moss-Racusin, C. A., Dovidio, J. F., Brescoll, V. L., Graham, M. J., & Handelsman, J. (2012). Science faculty’s subtle gender biases favor male students. Proceedings of the national academy of sciences, 109(41), 16474-16479. 

Nadeem, A., Marjanovic, O., Abedin, B. (2021). Gender bias in AI: Implications for managerial practices. 13E 2021. Responsible AI and analytics for an ethical and inclusive digitized society. doi: https://link.springer.com/chapter/10.1007/978-3-030-85447-8_23 

Please rate this

1 thought on “Husbands renovate while wives declutter ”

  1. Hello Robine,

    Thank you for your blog post entry. I think your article touches upon a very interesting and above all, important topic. I also agree that, since the nature of AI is technical, it is unlikely that a technological solution can make a huge positive impact. What is your opinion on further legislation to make sure that AI cannot cause such big problems as in the Amazon case. What is your opinion on anonymous applications.

Leave a Reply

Your email address will not be published. Required fields are marked *