For several years, Artificial Intelligence (AI) has been a highly discussed topic. On the one hand, many optimistic articles present countless use cases in which AI will improve our everyday life and even save lives. On the other hand, questions are raised as to whether AI will harm humans and eventually extinguish humanity. The latter recently became more relevant when the US military increased investments in technologies that enable a more automated form of war. About $10bn will be invested in the Joint Enterprise Defense Infrastructure (JEDI) project (Williams, 2018). This is a modern cloud platform that will make it possible to weaponize AI by pooling the military's data and using machine-learning techniques to identify enemies and targets around the world. Furthermore, all currently running AI projects are being consolidated and endowed with an additional $1.7bn, and the Pentagon has announced investments of $2bn in AI weapons research (Fryer-Biggs, 2018).
Leaders within the military often describe these investments as important milestones that will improve the precision of attacks and therefore decrease the number of casualties in the long run. Critics, however, point to countless stories of faulty algorithms (Tarnoff, 2018). A recent example is Amazon's recruiting algorithm, which had to be shut down in September after showing a bias against women (Dastin, 2018).
Now imagine that such biases and faults are part of AI algorithms that decide over a person's life and death. Scharre, a former soldier and fellow at the Center for a New American Security, argues that the US military's adoption of AI technology will not slow down anytime soon, and stresses that AI researchers should take a more active part in discussions around the use of AI technology by militaries around the world (Knight, 2018).
Dastin, J. (2018). Amazon scraps secret AI recruiting tool that showed bias against women. [online] Reuters.com. Available at: https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G [Accessed 14 Oct. 2018].
Fryer-Biggs, Z. (2018). The Pentagon plans to spend $2 billion to put more artificial intelligence into its weaponry. [online] The Verge. Available at: https://www.theverge.com/2018/9/8/17833160/pentagon-darpa-artificial-intelligence-ai-investment [Accessed 13 Oct. 2018].
Knight, W. (2018). Why AI researchers shouldn’t turn their backs on the military. [online] MIT Technology Review. Available at: https://www.technologyreview.com/s/611852/why-ai-researchers-shouldnt-turn-their-backs-on-the-military/ [Accessed 14 Oct. 2018].
Tarnoff, B. (2018). Weaponised AI is coming. Are algorithmic forever wars our future?. [online] The Guardian. Available at: https://www.theguardian.com/commentisfree/2018/oct/11/war-jedi-algorithmic-warfare-us-military [Accessed 14 Oct. 2018].
Williams, L. (2018). DOD releases $10 billion JEDI cloud contract. [online] DefenseSystems.com. Available at: https://defensesystems.com/articles/2018/07/26/jedi-hits-the-street.aspx [Accessed 14 Oct. 2018].