Decontextualized algorithms

17 October 2018


In Weapons of Math Destruction (2016), Cathy O’Neil describes IMPACT, a system used to evaluate teachers in Washington, D.C. public schools.
The model predicts how well students will perform and compares those expected results with the students’ actual results. IMPACT uses this comparison to monitor how well teachers perform: if a teacher’s students consistently score lower than expected, the teacher is fired on the basis of the algorithm’s output.
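To make the mechanism concrete, here is a minimal sketch (in Python) of the kind of expected-versus-actual comparison described above. It is not the real IMPACT model: the score fields, the averaging, and the dismissal threshold are hypothetical simplifications, meant only to show how a single statistic can drive a high-stakes decision.

```python
# Sketch of the expected-vs-actual comparison described above.
# NOT the actual IMPACT model; the prediction step, field names,
# and the dismissal threshold are hypothetical illustrations.

from dataclasses import dataclass
from statistics import mean


@dataclass
class StudentResult:
    expected_score: float  # what the model predicts for this student
    actual_score: float    # what the student actually scored


def teacher_value_added(results: list[StudentResult]) -> float:
    """Average gap between actual and expected scores for one teacher's class."""
    return mean(r.actual_score - r.expected_score for r in results)


def flag_for_dismissal(results: list[StudentResult], threshold: float = -5.0) -> bool:
    """Flag the teacher if the class scores well below expectation.

    The threshold is arbitrary here; the point is that one noisy
    number can trigger a high-stakes decision about a person's job.
    """
    return teacher_value_added(results) < threshold


# Example: a class that scores somewhat below the model's expectations.
class_results = [
    StudentResult(expected_score=70, actual_score=62),
    StudentResult(expected_score=65, actual_score=60),
    StudentResult(expected_score=80, actual_score=71),
]
print(flag_for_dismissal(class_results))  # True -> this teacher would be flagged
```

Note that nothing in this logic checks whether the expected scores were reasonable in the first place; if the predictions are off, the teacher is penalized for the model’s error.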
O’Neil argues that a model like this is perceived as more efficient and fair than ‘regular’ evaluations filled in by humans on paper. This is because the model “not only saved time but also was marketed as fair and objective. After all, it didn’t involve prejudiced humans digging through reams of paper, just machines processing cold numbers” (O’Neil, 2016, p. 3).
However, there may be problems with IMPACT and algorithms like it.
O’Neil argues that IMPACT is an algorithm that tries to quantify or formalize the behavior of the teacher. According to her, IMPACT may be used inappropriately because it is applied in a context that does not lend itself to formalization. For this reason, O’Neil calls IMPACT a Weapon of Math Destruction (WMD): a system that uses algorithms inappropriately. She calls such algorithms decontextualized.
A decontextualized algorithm introduces a “bias that originates from the use of an algorithm that fails to treat all groups under all significant circumstances” (Friedman & Nissenbaum, 1996, p. 334). The fair treatment of all groups under all circumstances is a high goal to aim for, and no system may be able to fully live up to it. But the massive use of algorithms to decide who loses her job treats false positives as a mere hazard of doing business.
As O’Neil describes, models project an image of mathematical certainty and objectivity; “[n]evertheless, many of these models encoded human prejudice, misunderstanding, and bias into the software systems that managed our lives” (O’Neil, 2016, p. 3). These biases emerge because algorithms are, at their core, statistical or mathematical models. Such models only function well insofar as the data they operate on are complete and objective, which is not always the case. A teacher may, for example, teach a class whose educational level is lower than the model assumes. Every statistical model carries the risk of false positives.
Therefore, O’Neil argues, WMDs should be recognized as such and abolished.
Do you agree with Cathy O’Neil?
Do you know of examples of algorithms that are used in the wrong context and therefore cause problems?
Sources
Friedman, B., & Nissenbaum, H. (1996). Bias in computer systems. ACM Transactions on Information Systems (TOIS), 14(3), 330-347.

O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Broadway Books.
