In China, facial recognition has become one of the most debated technologies of the past few years. It is used all around the country: streets, banks, airlines, and even public restrooms rely on it to confirm people’s identities (Borak, 2021). Though the private sector has used it extensively, it is the police and the security state that have embraced this novel technology with the most zeal (Borak, 2021). Supporters of facial recognition around the world frequently argue that the monitoring technology should be reserved for the most serious crimes, such as violent crime or terrorist attacks (Dou, 2021). Elsewhere it is also deployed against minor offences such as shoplifting, but even that use seems mild compared with how extensively it is applied in China.
With a massive network of cameras positioned throughout the nation, China’s facial recognition system records almost every resident. Such technology could be innocuous and helpful, but it can also be turned toward cracking down on behaviors that the typical person would not even consider illegal, as China’s active development and use of facial recognition illustrates. Chinese authorities have gone so far as to use the technology to shame people for wearing pajamas in public, labelling it uncivilized conduct (Qin, 2020). These infractions are penalized deliberately: by threatening public humiliation, the Chinese government can steer over a billion people toward what it considers appropriate behavior, whether that is how they dress or how they cross the street.
The idea of using technology and psychology to influence people’s behavior is known as behavioral engineering (Berndt, 2015), and we observe it every day. However, there is a significant distinction between how behavioral engineering is applied in most of the world and China’s use of facial recognition. In the US, for example, behavioral engineering works by gathering data on individuals and showing or withholding content according to their projected personality traits (Hinds et al., 2020), as the Cambridge Analytica scandal demonstrated. While that kind of behavioral engineering is mostly about pushing products or increasing profits, China’s is about controlling its population through fear (Mozur, 2019). It is therefore important for the rest of the world to learn from the backlash this technology has faced in China before it is widely adopted, and to understand how it can be improved so that its benefits can be delivered safely.
Bibliography
Berndt, C. (2015). Behavioural economics, experimentalism and the marketization of development. Economy and Society, 44(4), 567–591. https://doi.org/10.1080/03085147.2015.1043794
Borak, M. (2021, January 26). Chinese people are concerned about use of facial recognition, survey shows. South China Morning Post. https://www.scmp.com/tech/innovation/article/3119281/facial-recognition-used-china-everything-refuse-collection-toilet
Dou, E. (2021, July 30). China built the world’s largest facial recognition system. Now, it’s getting camera-shy. The Washington Post. https://www.washingtonpost.com/world/facial-recognition-china-tech-data/2021/07/30/404c2e96-f049-11eb-81b2-9b7061a582d8_story.html
Hinds, J., Williams, E. J., & Joinson, A. N. (2020). “It wouldn’t happen to me”: Privacy concerns and perspectives following the Cambridge Analytica scandal. International Journal of Human-Computer Studies, 143, 102498. https://doi.org/10.1016/j.ijhcs.2020.102498
Mozur, P. (2019, April 14). One month, 500,000 face scans: How China is using A.I. to profile a minority. The New York Times. https://www.nytimes.com/2019/04/14/technology/china-surveillance-artificial-intelligence-racial-profiling.html
Qin, A. (2020, January 21). Chinese city uses facial recognition to shame pajama wearers. The New York Times. https://www.nytimes.com/2020/01/21/business/china-pajamas-facial-recognition.html
The Scare Of Facial Recognition Technology
17 October 2022
Facial recognition has proved to have some serious limitations. In the United States, the technology has been linked to the arrests of innocent people, particularly in African American communities. According to the Pittsburgh Post-Gazette, people of color are more likely to be misidentified, and therefore potentially arrested and convicted for crimes they did not commit. Moreover, the lack of legal safeguards around this technology seems to leave room for different actors to abuse it.
An interesting article which touches upon a very salient issue. It is even more dangerous given the rapid rise of sophisticated AI that can detect “undesirable” behavior (which, in the eyes of the CCP, often means any sign of defiance or discontent). An interesting point you have not mentioned is how those troves of data are stored. Since China gathers so much information about its citizens, any cyberattack targeting this database could be devastating. Imagine if foreign intelligence agencies were able to access data on the movements of all Chinese citizens. That is a major danger to China itself, one the CCP seems to have largely overlooked.