Companies are tapping into their data with advanced analytics techniques in finance, marketing, and sales to gain a competitive advantage. With the rise of artificial intelligence – machines making decisions instead of humans – it is hard not to wonder whether those decisions are ethical. These techniques can discriminate (against women, for instance), but they can also expose human biases and aid ethical decision-making.
On the shadow side of AI-driven decision making, consider Amazon, for which automation has been key to its industry dominance. In 2015, however, Amazon discovered that its secret AI recruiting tool was favoring men over women for technical positions. This happened because the algorithm had learned from the company's historical hiring data, in which men outnumbered women, and so inherited that bias (Dastin, 2018). In another example, documented by Lambrecht and Tucker (2019), gender-neutral advertisements on Facebook (e.g. an ad for STEM careers) were shown more often to men: the cost-optimizing algorithm disadvantaged women because they were a prized demographic and therefore more expensive to show ads to.
On the other hand, things are not as gloomy as they may seem. There are also examples of data analytics tools being used to close the gender gap. As people become aware that at the current rate it would take 202 years to achieve gender parity (Sanz Sáiz, 2018), they are rolling up their sleeves to quicken the pace with AI algorithms and data science. One such effort is Syndio, a human resources analytics platform. It aims to address gender pay inequity by showing companies where they stand on the journey to pay parity. Syndio helps companies in three ways: encouraging them to commit to ongoing pay analyses, to use a valid methodology to analyze their data, and to ensure transparency. Its clients see dashboards that identify the changes they need to make to compensate women fairly. It is worth noting that Syndio's algorithms were created together with the National Women's Law Center and have been vetted by federal and state agencies (Kasunich, 2019), giving them the potential to become standard practice.
As we are still grasping the potential of AI algorithms, it is inevitable that some solutions will backfire. However, I believe we can treat these mistakes as cautionary tales and become more rigorous in how we build such systems. On the bright side, as the saying goes, "what gets measured gets done": the transparency that data analytics brings to the table can help us make decisions based on reality rather than assumptions, and therefore make those decisions more ethical.
References
Dastin, J. (2018). Amazon scraps secret AI recruiting tool that showed bias against women. Retrieved 16 October 2019, from https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G
Kasunich, C. (2019). Syndio Establishes Pay Equity Standards for Global Corporations Based on Validated Methodology. Retrieved 16 October 2019, from https://www.prnewswire.com/news-releases/syndio-establishes-pay-equity-standards-for-global-corporations-based-on-validated-methodology-300839523.html
Lambrecht, A., & Tucker, C. (2019). Algorithmic bias? An empirical study of apparent gender-based discrimination in the display of STEM career ads. Management Science.
Sanz Sáiz, B. (2018). Five ways data analytics can help close the gender gap. Retrieved 16 October 2019, from https://www.ey.com/en_gl/digital/five-ways-data-analytics-can-help-close-the-gender-gap