Husbands renovate while wives declutter 

20 October 2023


A while ago I heard about gender bias in ChatGPT for the first time. Since I use this software on a regular basis, I decided to run a little experiment on this topic myself.

I asked ChatGPT the following question:

“A husband and father / wife and mother of three children has spent a weekend on his/her own. How did he/she spend his/her weekend?”

Below you can see the generated output. Even though there are numerous differences in how the husband and wife might have spent their weekends, I will highlight the ones that struck me most.

  • Hobbies: Husbands are apparently likely to play musical instruments or do woodworking, whereas wives are more likely to knit or garden.
  • Activity at home: Husbands are likely to fix and renovate (improvement), whereas wives are likely to declutter and reorganize (organization).
  • Exercise: For husbands, exercise is linked to sports and runs, whereas for wives it is linked to wellness, yoga, and long walks.
  • Wives have two additional activities, creative cooking and quiet reflection, that men are apparently less likely to spend their weekends doing.

Gender bias in ChatGPT can be defined as favoritism towards or prejudice against a particular gender (Moss-Racusin et al., 2012). For instance, ChatGPT may associate certain occupations or activities with one gender over the other, e.g., cooking with women. These biases in the software likely stem from the training data or from preconceptions of the developers.

Even though the biases in the output above might not seem like a major problem at first, they can have serious consequences. Reinforcing existing or damaging gender stereotypes can lead to various issues for society, for instance when AI is used in decision-making. AI-based decision-making systems are increasingly used for matters such as screening job applications (Sun et al., 2019). A while ago Amazon used AI screening but had to stop because the screening produced gender-biased decisions: applications from women were automatically downgraded, because the training datasets contained too few female applicants (Dastin, 2018). As AI becomes part of everyday life, it is important that these systems are free of (gender) biases, since such biases can have negative outcomes for certain groups, as the Amazon example shows.
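The Amazon case can be illustrated with a toy sketch (all numbers here are hypothetical, not Amazon's actual data): a naive screening "model" that simply reproduces historical hiring rates will inherit any imbalance in its training data.

```python
# Toy illustration of bias inherited from imbalanced training data.
# The numbers are made up; the point is that a model trained on a
# historically skewed applicant pool reproduces that skew.
from collections import Counter

# Hypothetical historical hires: 90 men, 10 women.
training_hires = ["man"] * 90 + ["woman"] * 10

counts = Counter(training_hires)
total = sum(counts.values())

def hire_score(gender: str) -> float:
    # This "model" does nothing but echo historical base rates,
    # so the bias in the data becomes bias in the decisions.
    return counts[gender] / total

print(hire_score("man"))    # 0.9
print(hire_score("woman"))  # 0.1
```

Real screening models are far more complex, but the mechanism is the same: if women are underrepresented among past positive outcomes, the learned scores systematically disadvantage them.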

The above shows that measures need to be taken to reduce and ultimately eliminate these biases in (generative) AI software such as ChatGPT. Although gender-debiasing methods already exist, they are not enough to eliminate these biases. Since the origin of gender bias in AI is not technological, technological solutions alone are unlikely to solve the problem (Nadeem, Marjanovic & Abedin, 2021).

Have you ever experienced (gender) bias in ChatGPT output? And how do you think we should solve this problem? 

Dastin, J. (2018, October 11). Amazon scraps secret AI recruiting tool that showed bias against women. Reuters.

Moss-Racusin, C. A., Dovidio, J. F., Brescoll, V. L., Graham, M. J., & Handelsman, J. (2012). Science faculty’s subtle gender biases favor male students. Proceedings of the National Academy of Sciences, 109(41), 16474-16479.

Nadeem, A., Marjanovic, O., & Abedin, B. (2021). Gender bias in AI: Implications for managerial practices. In Responsible AI and Analytics for an Ethical and Inclusive Digitized Society (I3E 2021). Springer. https://doi.org/10.1007/978-3-030-85447-8_23

Sun, T., Gaut, A., Tang, S., Huang, Y., ElSherief, M., Zhao, J., … & Wang, W. Y. (2019). Mitigating gender bias in natural language processing: Literature review. arXiv preprint arXiv:1906.08976.


Am I a cheating student now?

9 October 2023


Starting this master’s program, I had little experience with ChatGPT. Within a few weeks, however, I began using it regularly. Initially, using it almost felt like cheating. But are you actually cheating when you use this technology?

ChatGPT is an artificial intelligence technology designed to generate natural human language in a conversational way (Cotton, Cotton & Shipway, 2023). This new technology brings various challenges to the educational field, such as plagiarism. Students may submit assignments that are entirely generated by ChatGPT, potentially devaluing degrees over time (Dehouche, 2021). Furthermore, Cotton et al. (2023) state that students who use this technology might gain an unfair advantage over students who do not, especially when using paid versions.

To answer my question, I experimented with the technology myself. I asked ChatGPT to generate a paragraph on information strategy with references (see below, 1). When I checked the reference, it turned out to be nonexistent (see below). I would therefore not have been able to use this text for an assignment. This shows that you cannot simply copy-paste the generated output. You cannot rely on ChatGPT as an independent source; it can merely serve as a source of inspiration and brainstorming.

Since the quality of ChatGPT’s output is not perfect at this moment, you must always check it to make sure you are working with correct information. It therefore forces you to think more critically about the content, including the sources used and possible biases in the output (Anders, 2023).

My conclusion is that using ChatGPT is not the same as cheating or committing plagiarism, provided it is used wisely and according to the regulations. Technologies like ChatGPT are the new reality, so we need to adapt to them instead of prohibiting or ignoring them.

Would you say that we are cheating? And, do our degrees still hold the same value as before?

Anders, B. A. (2023). Is using ChatGPT cheating, plagiarism, both, neither, or forward thinking? Patterns, 4(3).

Cotton, D. R., Cotton, P. A., & Shipway, J. R. (2023). Chatting and cheating: Ensuring academic integrity in the era of ChatGPT. Innovations in Education and Teaching International, 1-12.

Dehouche, N. (2021). Plagiarism in the age of massive Generative Pre-trained Transformers (GPT-3). Ethics in Science and Environmental Politics, 21, 17-23.
