With Uber, Tesla, Google, and other mobility companies investing in autonomous cars that need no one behind the wheel from A to B, it is inevitable that accidents will happen with these systems (Interesting Engineering, 2019). But what should we keep in mind when we read the news about (semi-)autonomous vehicles being involved in accidents?
Having someone, or something, literally take the wheel usually leaves us humans feeling helpless. Helplessness is hardly part of feeling safe, and even less part of persuading humans to accept technology, especially technology that can easily kill them or other humans.
So, whenever news breaks that, for instance, Tesla’s Autopilot was involved in an accident, news features, online forums, and many other media outlets light up with questions about autonomous vehicles and whether they should be banned (Forbes, 2019). These articles and forums show that people are scared, maybe even rightly so; people are dying because of autonomous cars… right?
However, we must consider several things before drawing conclusions. First, the bias that can easily arise with news of this nature. Second, whether autonomous cars are by definition ‘at fault’ or ‘unsuited’ when they are involved in accidents.
Bias: One should always be cautious when drawing conclusions from data presented by others, and this is also the case for autonomous cars. Because the safety of autonomous cars is only a topic of conversation when something goes wrong (e.g. a car crash), we are biased towards thinking they are dangerous (Wikipedia, 2019). This is simply how the news works: it is far less visible, and far less newsworthy, when an autonomous car is NOT involved in an accident or narrowly avoids one.
Second, even if autonomous cars are sometimes involved in accidents, this does not immediately mean that they are therefore unsuited to replace human drivers. The reason is that they don’t have to be perfect. They don’t need a 0% crash rate; they only need a lower crash rate than humans in order to be ‘worthy’ to take the wheel.
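The "lower than humans, not zero" criterion can be made concrete with a rough calculation. The rates below are purely hypothetical numbers chosen for illustration; real safety benchmarks involve far more careful statistics (exposure, road type, severity):

```python
# Hypothetical crash rates per million miles driven -- NOT real data,
# just an illustration of the comparison described in the text.
human_rate = 4.2
autonomous_rate = 3.1

def safer_than_humans(av_rate: float, human_rate: float) -> bool:
    """An autonomous car need not be perfect (rate 0.0);
    it only needs a lower crash rate than human drivers."""
    return av_rate < human_rate

# With these made-up numbers, the autonomous car clears the bar
# even though its crash rate is well above zero.
print(safer_than_humans(autonomous_rate, human_rate))  # True
```

The point of the sketch is only that the comparison is relative: a nonzero crash rate can still be an improvement.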
In conclusion, I am definitely not saying that autonomous driving is the answer and safe enough to implement. However, I do recommend that we not take in numbers without thinking and quickly judging, but instead analyze what the numbers mean and what they represent.
Interesting Engineering (2019): https://interestingengineering.com/are-we-programming-killer-cars-the-ethics-of-autonomous-vehicles
Wikipedia (2019): https://en.wikipedia.org/wiki/Availability_heuristic