A report released on Tuesday by the Center on Privacy & Technology at Georgetown University's law school found that law enforcement databases now include the facial recognition information of 117 million Americans, about one in two U.S. adults. Alvaro Bedoya, the executive director of the Center on Privacy & Technology, describes the databases as an unprecedented privacy violation: “a national biometric database that is populated primarily by law-abiding people.” One could argue in favor of biometric tools that they benefit law enforcement by, for example, reducing racially biased policing. After all, a computer does not know the societal meaning of race or gender; it sorts by numerical patterns. However, research has shown that facial recognition software has a built-in racial bias.
The algorithms can be biased because of the way they are trained, says Anil Jain, head of the biometrics research group at Michigan State University. Face matching software must learn to recognize faces from training data, a set of images that teaches the software how faces differ. In 2012, a test in Florida showed that recognition algorithms were consistently less accurate on women, African-Americans, and younger people. According to the researchers, the systems had apparently been trained on data that was not representative enough of those groups. For members of these groups, the result is a higher chance of being falsely linked to a crime.
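To make that mechanism concrete, the toy Python sketch below simulates one way an unrepresentative training set can hurt an under-represented group. Nothing in it comes from the report or any real system: the score distributions, the two groups, and the 95/5 calibration split are invented assumptions. It calibrates a match threshold on comparison scores dominated by group A, then measures how often members of each group are falsely matched at that threshold.

```python
import numpy as np

rng = np.random.default_rng(42)

# Impostor similarity scores: comparisons between two *different* people.
# We assume (purely for illustration) that the model separates group A
# faces well (scores near 0) but, having seen few group B faces during
# training, pushes them closer together (scores near 1).
def impostor_scores(group, n):
    mean = 0.0 if group == "A" else 1.0
    return rng.normal(loc=mean, scale=1.0, size=n)

# Calibration data mirrors the unbalanced training set: 95% group A.
calibration = np.concatenate([impostor_scores("A", 95_000),
                              impostor_scores("B", 5_000)])

# Choose the match threshold for a 1% false-match rate on that data.
threshold = np.quantile(calibration, 0.99)

for group in ("A", "B"):
    scores = impostor_scores(group, 100_000)
    fmr = (scores > threshold).mean()
    print(f"group {group}: false-match rate {fmr:.1%} "
          f"at threshold {threshold:.2f}")
```

In this toy setup, group A's false-match rate stays near the intended 1%, while group B's comes out several times higher: the same qualitative pattern of disproportionate false matches described above.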
According to the report, at least a quarter of all local and state police departments have access to a facial recognition database (either their own or another agency's), but only a few departments have enacted standards for testing the accuracy of their recognition systems. Nor did the report find any attention paid to training employees to visually confirm face matches. Face recognition technology thus allows untrained police officers to identify arbitrary people, even without a legitimate reason. The dangerous combination of human prejudice and biased recognition systems could lead to ethnic profiling, which is an illegal practice.
Perhaps computers do not know the societal meaning of race or gender, but the people who made the algorithms definitely do. Therefore, I believe it is important to bear in mind that such systems and tools are produced by human beings, who are biased by definition. Based on the report, I would encourage you to remain critical of information systems: their results could otherwise be used to construct a partial world-view, doing unforeseeable damage to society.
The FBI declined to specifically comment on the report.