Is machine learning hard-coding discrimination in the world?

18 September 2019


Data-based decision-making is widespread across industries and departments, shaping resource allocation, product development and design choices, human resource management practices, and so forth. So what if the data is only seemingly objective, but actually inherently biased?

The result is that, in spite of best efforts to reduce gender and race inequalities, biases are not only perpetuated but amplified. These biases lead to women and minorities being treated unfairly, and can even put their lives in danger. The issue is that the “default white man” is often treated as the “default human” in data analysis, and therefore in decision making. This happens, for instance, when healthcare devices are designed around male physical characteristics: the electrical-wave threshold below which a pacemaker is fitted is calibrated for men, so the device is unsurprisingly not well suited for women, although it is sold for both genders. Or worse, when clinical trials are not sex-disaggregated, and high blood pressure drugs end up reducing the chance of heart attacks for men but increasing it for women (Gordon, 2019).

These kinds of issues are likely to become more important in the future, when the majority of processes will be automated, and thus unsupervised, and driven by machine learning. A machine learning algorithm needs assumptions, and relies on statistical bias to make predictions (Shadowen, 2018). This becomes dangerous when an algorithm is trained on gender- or race-biased data and faulty assumptions are incorporated into the model, resulting in machine bias. Contrary to statistical bias, machine bias is not necessary to make predictions; it is the result of prejudice being absorbed by the model, from either its creator or its training data (Shadowen, 2018). It is therefore possible to rid systems of machine bias, but doing so requires that developers acknowledge it and account for gender and race differences while developing, testing, and re-evaluating their models, for example by checking error rates per group rather than only in aggregate, as in the sketch below. This means that, counterintuitively, to achieve equality it is actually necessary to treat men, women, and minorities differently: account for physiological and lifestyle differences, and adjust the models accordingly.
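One way to make that re-evaluation concrete is to report error rates per demographic group instead of a single aggregate score. The following is a minimal sketch in Python, assuming a fitted scikit-learn-style binary classifier and a demographic column held out of the feature matrix; every name here (model, X_test, y_test, demographics) is illustrative and not taken from any of the cited sources.

import pandas as pd
from sklearn.metrics import precision_score, recall_score

def groupwise_report(model, X_test, y_test, groups):
    """Per-group recall and precision for a fitted binary classifier.

    groups: a pandas Series (e.g. a 'gender' column kept out of the
    feature matrix) aligned row-for-row with X_test and y_test; it is
    used only to slice the evaluation, never as a model input.
    """
    # Predict once, then slice predictions and labels by group.
    y_pred = pd.Series(model.predict(X_test), index=groups.index)
    rows = []
    for name in groups.unique():
        mask = groups == name
        rows.append({
            "group": name,
            "n": int(mask.sum()),
            "recall": recall_score(y_test[mask], y_pred[mask]),
            "precision": precision_score(y_test[mask], y_pred[mask]),
        })
    return pd.DataFrame(rows)

# Illustrative usage: a large recall gap between groups means the model
# misses positive cases far more often for one group, which is exactly the
# kind of machine bias that a single aggregate accuracy number can hide.
# print(groupwise_report(model, X_test, y_test, demographics["gender"]))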

 

References:

Gordon, S. (2019). It’s a man’s world: how data are rife with insidious sexism. Financial Times. Retrieved 18 September 2019, from https://www.ft.com/content/9e67294a-28a0-11e9-a5ab-ff8ef2b976c7

Marriott, J. (2019). Data Discrimination: Exploring Big Data and Bias. SXSW 2015 Event Schedule. Retrieved 18 September 2019, from https://schedule.sxsw.com/2015/events/event_IAP43058

Shadowen, N. (2018). How to Prevent Bias in Machine Learning. Retrieved 18 September 2019, from https://becominghuman.ai/how-to-prevent-bias-in-machine-learning-fbd9adf1198


4 thoughts on “Is machine learning hard-coding discrimination in the world?”

  1. Hi Elisa, thank you for an insightful article! I would like to provide another example of the phenomenon that you described. One of the areas that can be prone to racial or gender biases is the development of automated recruiting tools. The dataset provided to the algorithm can have an underlying bias towards, for example, white men. That is what happened in the case of Amazon’s recruiting tool: it turned out that the AI algorithm was giving preferential scores to male applicants for technical posts. Even after trying to fix the algorithm to make gender-neutral decisions, it wasn’t possible to prove that the tool was making unbiased choices. With this example, I wanted to show that biased algorithms can influence many different areas of our lives. I’m just wondering: how many biased algorithms will be guiding our lives in the future?

  2. Hi Elisa,

    Thanks for sharing this insightful post; I agree with the challenges that you have mentioned. Indeed, I believe that the very root cause is the fact that developers, whom you refer to in the last few sentences, are mostly young white males. In order to fight potentially biased algorithms, more diversity needs to be established in technical expert positions. I believe that this can be tackled in two ways. First, stereotypical thinking needs to be eliminated in society, so that more women will get involved in “non-typical” hobbies such as engineering. Second, I believe that better integration of STEM courses (Science, Technology, Engineering, Math) into the early school curriculum is key to sparking interest and passion among young children of all genders and ethnicities, which would ultimately lead to greater workforce diversity in technical positions. I would also like to share this TED Talk, in which the researcher describes how biased algorithms are already affecting reality and what can be done about it:
    https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms

  3. Hi Elisa, this is a very interesting topic that you are writing about. I agree with you that bias is and will continue to be a challenge to the successful implementation of AI in every area of our lives. However, I also believe that AI can actually be a good tool for showing us biases that we may not be aware of. What makes AI so powerful is that it is able to learn on its own, without a human having to hard-code every little thing. If it then learns to make biased decisions, we might be able to discover biases that we did not know existed before. This became apparent to me when I read an article on algorithmic bias (Lambrecht & Tucker, 2019). In a nutshell, the authors first thought the algorithm was biased to show STEM job advertisements more to men than to women, but later found out that this was only happening because men were less likely to click on the advertisement. In this way the “bias” of the machine was able to point to a very different characteristic in the data, one that had not been apparent before.

    1. Lambrecht, A., & Tucker, C. (2019). Algorithmic Bias? An Empirical Study of Apparent Gender-Based Discrimination in the Display of STEM Career Ads. Management Science.
