Saying Goodbye to Bias?

18 October 2018


Do a quick Google search regarding fallacies in Artificial Intelligence (AI) and Machine Learning and the number one hit will be algorithmic bias. To quickly introduce the concept: “Algorithmic bias occurs when a computer system reflects the implicit values of the humans who are involved in coding, collecting, selecting, or using data to train the algorithm [1].” Algorithmic bias is the outcome of years of prejudice in our society, with racism and discrimination as its darkest results. The industry broadly agrees that this ingrained bias needs to be solved before AI’s potential can be fully exploited.

Companies are struggling to find a solution, but IBM just introduced something very promising: AI Fairness 360 (AIF360), aimed at resolving the bias issue. AIF360 is an open-source toolkit that checks data and machine learning models for unwanted bias. This first release contains nine different algorithms that check data for well-known biases, such as favoring one gender or race. AIF360 lets users compare their original model results with the mitigated results obtained after applying its algorithms to the dataset. It then checks for statistically significant differences and reduces the bias if any was found. IBM made the toolkit open source to grow the number of bias-checking algorithms, in the hope that more biases can be corrected: the more people engage, the higher the chance of success. Although this is just a start, it could be the beginning of a bias-free algorithmic world [2] [3].
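To make this concrete, here is a minimal sketch of what such a check-and-mitigate workflow looks like with the AIF360 Python toolkit. It assumes the raw UCI Adult dataset files are available locally (AIF360’s AdultDataset loader expects them) and uses Reweighing, one of the pre-processing mitigation algorithms shipped in the toolkit; the choice of dataset, protected attribute, and algorithm is illustrative, not the only option.

```python
# Minimal AIF360 sketch: measure bias, mitigate, measure again.
# Assumes `pip install aif360` and that the raw UCI Adult data files
# have been downloaded into AIF360's data directory.
from aif360.datasets import AdultDataset
from aif360.metrics import BinaryLabelDatasetMetric
from aif360.algorithms.preprocessing import Reweighing

# Protected attribute 'sex', where 1 (male) is the privileged group.
privileged = [{'sex': 1}]
unprivileged = [{'sex': 0}]

dataset = AdultDataset()

# Measure bias before mitigation: statistical parity difference should
# be close to 0 and disparate impact close to 1 for an unbiased dataset.
before = BinaryLabelDatasetMetric(dataset,
                                  unprivileged_groups=unprivileged,
                                  privileged_groups=privileged)
print("Statistical parity difference (before):",
      before.statistical_parity_difference())
print("Disparate impact (before):", before.disparate_impact())

# Mitigate: Reweighing assigns instance weights so that favorable
# outcomes become independent of the protected attribute before any
# model is trained on the data.
rw = Reweighing(unprivileged_groups=unprivileged,
                privileged_groups=privileged)
dataset_transf = rw.fit_transform(dataset)

# Measure bias again on the reweighted dataset.
after = BinaryLabelDatasetMetric(dataset_transf,
                                 unprivileged_groups=unprivileged,
                                 privileged_groups=privileged)
print("Statistical parity difference (after):",
      after.statistical_parity_difference())
```

Reweighing works before training; AIF360 also ships in-processing and post-processing algorithms, so the same measure-mitigate-measure loop can be applied at other stages of a model pipeline.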

However, at the end of the day this bias is a human problem, not one of technology. The real long-term solution therefore lies in removing biases from our own society and day-to-day activities. However painful the truth may be, AI and Machine Learning results show us where those biases lie, and we should use this to our advantage instead of closing our eyes to it.
