How Algorithms Discriminate Against Women: The Hidden Gender Bias

9 September 2020


In past decades, worries about AI have moved from whether it will take over the world to whether it will take our jobs. Today we face a new, and justifiably serious, concern: AI systems may be perpetuating or accentuating societal biases, and making racist, sexist or otherwise prejudiced decisions.

 

Machine learning technology is inherently biased

Many believe that software and algorithms that rely on data are objective. But machine learning is, at its core, a biasing process: a model learns to weight certain input features in order to map them to output predictions, so whatever skews and prejudices exist in the training data are absorbed along with everything else. One option is to modify the data that is fed in, through techniques such as data augmentation or rebalancing, to produce less skewed training sets. But there is a deeper problem: humans consciously know not to apply certain kinds of bias, yet subconsciously they apply others they cannot control, and those biases end up encoded in the data they produce.
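The rebalancing idea mentioned above can be sketched in a few lines. This is a minimal, hypothetical illustration of oversampling an under-represented group in a training set; the data and group labels are invented for the example and do not come from any real hiring system.

```python
import random

random.seed(0)

# Hypothetical skewed training set: 90 rows from group A, 10 from group B.
# (Illustrative data only -- not from any real system.)
data = [{"group": "A"} for _ in range(90)] + [{"group": "B"} for _ in range(10)]

# Naive rebalancing: oversample each under-represented group until all
# groups appear equally often in the training data.
groups = {}
for row in data:
    groups.setdefault(row["group"], []).append(row)

target = max(len(rows) for rows in groups.values())
balanced = []
for rows in groups.values():
    balanced.extend(rows)
    balanced.extend(random.choices(rows, k=target - len(rows)))

print(len(balanced))                             # 180 rows in total
print(sum(r["group"] == "B" for r in balanced))  # 90 rows from group B
```

Note that oversampling only equalises representation; it does not remove biased labels or proxy features from the data, which is why it is at best a partial fix.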

 

Tech-hiring platform Gild

This being the case, it is not surprising to find hidden biases all around us in the world today. Take, for example, the secretive algorithms that have become increasingly involved in hiring processes. American data scientist Cathy O’Neil explains how the online tech-hiring platform Gild enables employers to go well beyond a job applicant’s CV by combing through the traces they leave behind online. This data is used to rank candidates by ‘social capital’, measured largely by how much time they spend sharing and developing code on platforms such as GitHub or Stack Overflow.

 

This all sounds very promising, but the data Gild sifts through also reveals other patterns. For instance, according to Gild’s data, frequenting a particular Japanese manga site is a ‘solid predictor for strong coding’, so programmers who visit this site receive higher scores. As O’Neil points out, awarding marks for this is a serious problem for diversity: ‘if, like most of techdom, that manga site is dominated by males and has a sexist tone, a good number of the women in the industry will probably avoid it’.
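The mechanism O’Neil describes can be illustrated with a toy scoring rule. Everything below is invented for the example; it is not Gild’s actual model, data or threshold. It shows how a single proxy feature, unevenly distributed between groups, can produce very different selection rates for otherwise identical candidates.

```python
# Hypothetical illustration of how a proxy feature can disadvantage a group.
# The scoring rule, bonus, threshold and visit rates are all made up.

def score(candidate):
    # Base score from coding activity, plus a bonus for visiting the site.
    return candidate["code_hours"] + (20 if candidate["visits_manga_site"] else 0)

# Two groups with identical coding activity; only the proxy feature differs.
men = ([{"code_hours": 50, "visits_manga_site": True} for _ in range(80)]
       + [{"code_hours": 50, "visits_manga_site": False} for _ in range(20)])
women = ([{"code_hours": 50, "visits_manga_site": True} for _ in range(20)]
         + [{"code_hours": 50, "visits_manga_site": False} for _ in range(80)])

threshold = 60  # candidates scoring above this are shortlisted

def selection_rate(group):
    return sum(score(c) > threshold for c in group) / len(group)

print(selection_rate(men))    # 0.8
print(selection_rate(women))  # 0.2
```

With equal coding skill, the women’s selection rate here is a quarter of the men’s (0.2 / 0.8 = 0.25), which would fail the ‘four-fifths rule’ commonly used in US employment law as a rough test for disparate impact.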

 

‘Gild undoubtedly did not intend to create an algorithm that discriminated against women. They were intending to remove human biases’

 

In her book Invisible Women, Caroline Criado Perez notes that ‘Gild undoubtedly did not intend to create an algorithm that discriminated against women. They were intending to remove human biases’. However, if managers are not aware of how those biases operate, do not collect the relevant data, and take little time to produce evidence-based processes, an organisation will continue to blindly perpetuate old injustices. Indeed, by not considering how women’s lives differ from men’s, Gild’s coders accidentally created an algorithm with a hidden data bias against women.

 

But that is not even the worst part. The worst part is that we have no idea how bad the problem really is. Most algorithms of this kind are kept secret and protected as proprietary code, which means we do not know how decisions are being made or what biases they are hiding. As Perez points out, ‘The only reason we know about this potential bias in Gild’s algorithm is because one of its creators happened to tell us’. This is therefore a double data gap: first, in the knowledge of the coders designing the algorithm, and second, in the knowledge of society at large about just how discriminatory these AIs can be (Perez, 2020).

 

‘The only reason we know about this potential bias in Gild’s algorithm is because one of its creators happened to tell us’

 

We need more diversity in tech to reduce the hidden gender bias

Many argue that one way to combat hidden gender bias is to increase diversity of thought by increasing the number of women in tech. According to the World Economic Forum, only 22% of AI professionals globally are female, compared to 78% who are male. Additionally, at Facebook and Google, less than 2% of technical roles are filled by black employees. To reduce hidden bias in algorithms, tech companies should step up their recruiting practices and increase diversity in technical roles.

 

Do you have any other suggestions for managers to reduce hidden bias? Or have you come across a type of hidden bias? Feel free to leave a comment.

 

References:

The Guardian (2016). How algorithms rule our working lives. Retrieved from https://www.theguardian.com/science/2016/sep/01/how-algorithms-rule-our-working-lives

Perez, C. C. (2020). Invisible Women: Data Bias in a World Designed for Men. New York: Abrams Press.

Forbes (2020). AI Bias Could Put Women’s Lives At Risk – A Challenge For Regulators. Retrieved from https://www.forbes.com/sites/carmenniethammer/2020/03/02/ai-bias-could-put-womens-lives-at-riska-challenge-for-regulators/#2201ee44534f

World Economic Forum (2020). Assessing Gender Gaps in Artificial Intelligence. Retrieved from http://reports.weforum.org/global-gender-gap-report-2018/assessing-gender-gaps-in-artificial-intelligence/

Dogtown Media (2019). Can AI’s Racial & Gender Bias Problem Be Solved? Retrieved from https://www.dogtownmedia.com/can-ais-racial-gender-bias-problem-be-solved/
