Using AI to… shame jaywalkers?

4 October 2017


Companies and police have been working together for years to catch criminals through IT, but now they are going further, developing technology to stop criminals before they have committed a crime. Li Meng, vice-minister of science and technology, said that “If we use our smart systems and smart facilities well, we can know beforehand . . . who might be a terrorist, who might do something bad.” (1)

Previously, China’s network of 176 million surveillance cameras was used much as in the U.S.: cross-referencing surveillance footage with national ID photographs to catch criminals and terrorists. (2) In recent years, Cloud Walk, a company headquartered in Guangzhou, has been training its facial recognition and big-data systems to assess the risk level of people who frequent weapons stores (but also hardware stores) or transportation hubs. If a person buys a kitchen knife now, for example, and a sack and a hammer later, their suspicion rating increases. A high-risk person is flagged and the system notifies the local police. (3) The software can also be applied at train stations, where it analyses the crowd to single out potential pickpockets.
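The scoring logic described above can be sketched as a toy example: individually innocuous purchases each add to a running suspicion score, and crossing a threshold triggers a flag. All item weights, the threshold, and the function names here are invented for illustration; CloudWalk's actual system is not public.

```python
# Toy sketch of cumulative risk scoring, as described in the article.
# Weights and threshold are entirely hypothetical.

FLAG_THRESHOLD = 10

# Invented per-item risk weights; unknown items score zero.
ITEM_WEIGHTS = {
    "kitchen knife": 4,
    "sack": 3,
    "hammer": 4,
    "groceries": 0,
}

def update_score(score: int, item: str) -> int:
    """Add the (hypothetical) weight of a purchased item to a running score."""
    return score + ITEM_WEIGHTS.get(item, 0)

def is_flagged(score: int) -> bool:
    """A person is flagged for review once the cumulative score crosses the threshold."""
    return score >= FLAG_THRESHOLD

score = 0
for item in ["kitchen knife", "sack", "hammer"]:
    score = update_score(score, item)

print(score, is_flagged(score))  # three ordinary purchases combine into a flag
```

The point of the sketch is that no single purchase is suspicious on its own; it is the accumulation over time that pushes someone over the flagging threshold, which is exactly what makes such systems prone to false positives.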

However, this new technology increases the potential for unjustified arrests. Legally, charges cannot be brought against someone for a crime they have yet to commit. Yet Li Xiaolin, a partner at Beijing Weiheng Law Firm, states that in practice a suspect can be charged with attempting to commit a crime even without evidence, and that wrongful verdicts are hard to appeal given how the judicial system operates. (1) Must the judicial system develop more checks and balances before this technology is implemented further?

Additionally, preventing crime is not the only use the Chinese government has found for AI. In dozens of cities, jaywalkers are being shamed. In June, China implemented a new cybersecurity law that protects citizens’ personal information from being collected for commercial use, but this does not apply to local authorities. Chinese law allows facial recognition to be deployed to “name and shame” traffic-light offenders, quite literally: offenders’ personal information, including their names and home addresses, is displayed on screens at the roadside, a local news agency reported. (2) Furthermore, facial recognition is used in schools to counter cheating and, stranger still, in bathrooms to limit toilet-paper waste. (1) Is this going too far?

Still, AI can improve both safety and privacy, online and offline. However many concerns there are about AI as a surveillance tool, it can also be used to keep healthcare records private, secure financial transactions, prevent hacking, and power self-driving cars (possibly eliminating at least 90 percent of traffic fatalities), as well as smart security cameras, robot guards and better military technologies. (3) This last one is debatable, though, as conflicts may escalate more quickly and become more destructive.

In conclusion, AI has great potential for stopping crimes before they happen and improving overall safety, but the flip side is that it also has great potential to be misused. What do you think about AI being implemented further and further into society, not just in China, but everywhere?

1. Yang, Y. (2017, July 23). China seeks glimpse of citizens’ future with crime-predicting AI. Financial Times. Retrieved October 04, 2017, from https://www.ft.com/content/5ec7093c-6e06-11e7-b9c7-15af748b60d0
2. Wang, Y. (2017, July 11). China Is Quickly Embracing Facial Recognition Tech, For Better And Worse. Forbes. Retrieved October 04, 2017, from https://www.forbes.com/sites/ywang/2017/07/11/how-china-is-quickly-embracing-facial-recognition-tech-for-better-and-worse/#b2af19068560
3. Lant, K. (2017, July 25). China’s “Minority Report” Style Plans Will Use AI to Predict Who Will Commit Crimes. Futurism. Retrieved October 04, 2017, from https://futurism.com/chinas-minority-report-style-plans-will-use-ai-to-predict-who-will-commit-crimes/


1 thought on “Using AI to… shame jaywalkers?”

  1. Isn’t our digital existence, in the economic and social sense, nowadays already strongly influenced by surveillance and spying? Just take Facebook: people allow the company to comb through their personal data in exchange for recommendation-based services. This makes it increasingly hard for our society to draw the line between healthy data-tracking and unhealthy surveillance (Funnell, 2014). Even if someone is caught jaywalking, they should still be guaranteed privacy rights. Who decides which ‘crime’ should be made public?

    It is actually planned in the US to use programs to predict the likelihood that someone will commit a crime in a specific neighbourhood based on crime-statistics data, as well as to support the surveillance of crime and military actions (McLaughlin, 2017).

    Awareness of the extent of surveillance and its effects tends to be limited prior to disclosures like WikiLeaks (York, 2014). You could argue that surveillance infringes one’s personal freedom, subjects one to control and prevents progress in society (York, 2014). So there is a risk of becoming a surveillance society. We should have learned from history (Stasi surveillance or the NSA affair) what negative effects spying has on individuals and societies as a whole.

    References:
    Funnell, A. (2014). 1984 and our modern surveillance society. [online] Radio National. Available at: http://www.abc.net.au/radionational/programs/futuretense/1984-and-our-modern-surveillance-society%C2%A0/5631512 [Accessed 5 Oct. 2017].
    McLaughlin, J. (2017). Artificial Intelligence Will Put Spies Out of Work, Too. [online] Foreign Policy. Available at: http://foreignpolicy.com/2017/06/09/artificial-intelligence-will-put-spies-out-of-work-too/ [Accessed 5 Oct. 2017].
    York, Y. (2014). The harms of surveillance to privacy, expression and association | GISWatch. [online] Giswatch.org. Available at: https://giswatch.org/en/communications-surveillance/harms-surveillance-privacy-expression-and-association [Accessed 5 Oct. 2017].
