Tech company IBM has offered the New York Police Department (NYPD) software that can be used to analyze surveillance footage. The software allows the NYPD to search for people by gender, age, hair color, facial features, and skin color.
While this kind of software has great potential, facial recognition systems have been the subject of considerable controversy in the United States. In July 2018, the American Civil Liberties Union (ACLU) stated that Amazon's Rekognition had wrongly identified 28 members of Congress as criminals. Amazon responded in a statement that the ACLU had most likely used a lower confidence threshold than recommended. Either way, it is clear that the software is currently nowhere near perfect.
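To make the threshold dispute concrete, here is a minimal sketch of how a confidence threshold works in Rekognition's face comparison API, using the standard boto3 client. The image paths are placeholders, and the 80 and 99 percent values simply mirror the default and the law-enforcement recommendation discussed in the ACLU/Amazon exchange.

```python
# Minimal sketch: threshold-based face matching with Amazon Rekognition
# via boto3. Image paths and threshold values are illustrative only.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

def match_faces(source_path: str, target_path: str, threshold: float):
    """Return face matches whose similarity clears `threshold` (0-100)."""
    with open(source_path, "rb") as src, open(target_path, "rb") as tgt:
        response = rekognition.compare_faces(
            SourceImage={"Bytes": src.read()},
            TargetImage={"Bytes": tgt.read()},
            # Rekognition drops matches below this similarity score.
            SimilarityThreshold=threshold,
        )
    return response["FaceMatches"]

# A lenient threshold reports far more weak matches as "hits" than a
# strict one; that gap is exactly what the ACLU test exposed.
lenient = match_faces("suspect.jpg", "crowd.jpg", threshold=80.0)
strict = match_faces("suspect.jpg", "crowd.jpg", threshold=99.0)
print(f"matches at 80%: {len(lenient)}, at 99%: {len(strict)}")
```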
Furthermore, research has shown that facial recognition software is often vulnerable to racial bias. For example, a 2011 study found that software developed in Asia had more trouble distinguishing Caucasian faces than East Asian faces. Similarly, a 2012 study showed that facial recognition algorithms perform up to 10 percent worse on African Americans than on Caucasians.
An NYPD spokesperson confirmed that the department opted not to deploy the software and that it openly rejects the idea of searching for people based on ethnicity.
What intrigues me in this matter is the ethical question. Is it 'right' to search for people based on their gender, age, or the color of their skin, especially when the software has proven to be biased and error-prone in the past? The NYPD has every right not to use the software, but was that decision made on logical or ethical grounds?
In my humble opinion, the NYPD should never engage in activities that promote racism in any way. However, this software could make a significant contribution to the ongoing fight against crime. Hence, I believe it could be used as a tool to support the analysis of surveillance footage in investigations. For the software to be used responsibly, there must be strict regulations against any form of racial profiling. It is also crucial that human operators oversee the software and correct obvious errors (such as identifying 28 members of Congress as criminals) as early in the process as possible.
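As a rough sketch of the human-in-the-loop setup I have in mind (the names, threshold, and structure here are hypothetical, not part of any real NYPD or IBM system): the software only pre-filters candidates, nothing is auto-confirmed, and every remaining match requires an operator's sign-off before it can be acted on.

```python
# Hypothetical sketch of a human-in-the-loop review step; not a real
# NYPD/IBM workflow. The threshold and data model are assumptions.
from dataclasses import dataclass

@dataclass
class Candidate:
    person_id: str
    similarity: float  # score from the recognition software, 0-100

def triage(candidates: list[Candidate], threshold: float = 99.0):
    """Split software output into a human-review queue and auto-rejects.

    Note that nothing is auto-confirmed: even high-similarity matches
    still need an operator's decision, so obvious errors are caught
    early rather than propagating into an investigation.
    """
    needs_review = [c for c in candidates if c.similarity >= threshold]
    rejected = [c for c in candidates if c.similarity < threshold]
    return needs_review, rejected

def operator_confirms(candidate: Candidate) -> bool:
    # Placeholder for the actual human decision, e.g. a review UI.
    raise NotImplementedError("A human operator decides here.")
```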
Sources:
IBM collaborated with the NYPD on an AI system that can search for people by race
https://www.nu.nl/internet/5450645/techgigant-ibm-bood-politie-new-york-software-etnisch-profileren-aan.html
https://aws.amazon.com/rekognition/