Autonomous “killer” drones as war weapons – What about ethics?

23 September 2018

We have all heard the question about one of our newest technologies before: in today’s society, is Artificial Intelligence a blessing or a curse? Many would say it makes life easier and more convenient (think of Siri or Google, for example) and that it will keep improving and taking over difficult tasks more efficiently than human beings ever could. However, some people are skeptical about AI and what it might be able to do in the future. And I do not mean the fact that it will replace millions of people’s jobs, no. I mean the fact that AI will be able to kill people and function as a real weapon of war.

By relying on their own decision-making, autonomous AI-driven “killer” drones will be able to select and attack the humans they decide to target. Would that be a blessing or a curse for us as human beings? The intuitive answer is that this is a horrible scenario. However, it is well known that soldiers often suffer psychological damage after they return from war. One major factor behind this damage is that they have to harm and ultimately kill people. Even when it is not actually them pulling the trigger but a drone acting on their command, the mere knowledge that they authorized a killing causes psychological distress. If autonomous “killer” drones took over the task of killing armed opponents, deciding for themselves whom to target, would that diminish the post-traumatic effects of war on fighters? And even if a drone could distinguish between “enemies” and “non-enemies”, would it be able to know which attack is appropriate and which one is not?
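
To make that “decision-making” concrete: in practice, a drone’s choice to engage would come down to a statistical classifier score compared against a confidence threshold, so misclassification is always possible. Below is a minimal, purely illustrative sketch in Python; the names, scores, and thresholds are hypothetical assumptions for the sake of discussion, not the logic of any real weapon system.

```python
# Toy illustration only: how an autonomous "engage or not" decision
# might reduce to a classifier score plus a confidence threshold.
# All names and thresholds here are hypothetical assumptions.

from dataclasses import dataclass

ENGAGE_THRESHOLD = 0.95   # assumed confidence required before firing
REVIEW_THRESHOLD = 0.60   # below this, treat the person as a non-combatant

@dataclass
class Detection:
    track_id: int
    combatant_score: float  # classifier's estimated P(armed combatant)

def decide(detection: Detection) -> str:
    """Return the drone's action for one detected person."""
    if detection.combatant_score >= ENGAGE_THRESHOLD:
        return "engage"          # the drone "decides" to attack
    if detection.combatant_score >= REVIEW_THRESHOLD:
        return "defer_to_human"  # ambiguous case: hand back to an operator
    return "ignore"              # treated as a non-combatant

# Even at a 95% threshold, some engagements will be misclassifications,
# which is exactly where the question of responsibility begins.
print(decide(Detection(track_id=1, combatant_score=0.97)))  # engage
print(decide(Detection(track_id=2, combatant_score=0.70)))  # defer_to_human
```

The point of the sketch is that “the drone decided” ultimately means “a score crossed a threshold that a human chose”, which leads straight to the responsibility question below.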

One of the most important ethical questions to ask here is: who is responsible for a killing if the execution was carried out by an AI-driven “killer” drone? From my point of view, it does not make sense to put drones behind bars… What do you think?

Autonomous military drones: No longer science fiction. (2017, July 28). NATO Review. Retrieved from https://www.nato.int/docu/review/2017/also-in-2017/autonomous-military-drones-no-longer-science-fiction/EN/index.htm

Stroud, M. (2018, April 12). The Pentagon is getting serious about AI weapons. The Verge. Retrieved September 23, 2018, from https://www.theverge.com/2018/4/12/17229150/pentagon-project-maven-ai-google-war-military

1 thought on “Autonomous “killer” drones as war weapons – What about ethics?”

  1. Hi Judith,

    I think this is a very interesting topic, since it is related to the general question of accountability for AI. It reminds me of similar concerns about autonomous vehicles: who is responsible if someone is involved in an accident with an autonomous car? The car manufacturer, the software designer, the driver?

    I think you can look at killer drones the same way: is the manufacturer responsible, the programmer, and so on? But while autonomous vehicles do not have killing as their primary objective, these killer drones might. How will you make sure that the drones kill the right people, and not, for example, children and women? My first thought would be that the programmer is responsible for any mistakes the killer drones make. However, we are dealing with war zones, where conditions can change very quickly and unexpectedly. Will the drones be able to learn that fast and react appropriately? Can we really hold programmers accountable when they cannot guarantee that? I don’t think so.

    In addition, soldiers are not only responsible for killing people, but also for bringing the local community to safety, reassuring them, supplying them with food, and protecting them. You can’t let drones do this since they don’t have the emotional intelligence necessary to reassure and comfort those people.

    I therefore believe that even though the use of killer drones might indeed spare people from psychological conditions like PTSD, it would be neither ethical nor practical to use them. There are just too many variables, and even though AI can do a lot, I believe that human soldiers will still be needed to handle these situations effectively.

    Kimberly
