11: Self-Driving Cars

Self-driving cars were developed to make the roads safer. According to Tesla, “Self-driving vehicles will play a crucial role in improving transportation safety and accelerating the world’s transition to a sustainable future.” Tesla also argues that self-driving vehicles would not only be safer for humans but would also make transportation cheaper for people who do not own cars. Self-driving systems use computing power that lets the car process data from all of its surroundings, going “far beyond the human senses” (Tesla). Because of these advantages in safety and convenience, companies like Tesla, GM, Ford, Uber, and Lyft are investing billions of dollars in the technology. So what are the cons? The biggest concerns are accidents and software flaws that endanger human lives.
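To make the “far beyond the human senses” claim a little more concrete, here is a minimal, purely illustrative sketch of one idea behind it: fusing distance estimates from several sensors into a single, more reliable reading. The sensor names, confidence values, and weighting scheme are my own assumptions for illustration, not how Tesla or any manufacturer actually does it.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    source: str        # e.g. "camera", "radar", "ultrasonic" (hypothetical labels)
    distance_m: float  # estimated distance to an obstacle, in meters
    confidence: float  # self-reported reliability, 0.0 to 1.0

def fuse_distance(readings: list[SensorReading]) -> float:
    """Confidence-weighted average of per-sensor distance estimates."""
    total_weight = sum(r.confidence for r in readings)
    if total_weight == 0:
        raise ValueError("no usable sensor readings")
    return sum(r.distance_m * r.confidence for r in readings) / total_weight

# Three sensors disagree slightly; fusion produces one estimate that
# no single sensor on its own could give.
readings = [
    SensorReading("camera", 42.0, 0.6),
    SensorReading("radar", 40.5, 0.9),
    SensorReading("ultrasonic", 39.8, 0.3),
]
print(f"fused obstacle distance: {fuse_distance(readings):.1f} m")  # ~40.9 m
```

The point of the toy is that redundancy is the safety argument: a camera blinded by glare can be outvoted by radar, which is a kind of cross-checking an unaided human driver cannot do.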

According to George Dvorsky in “Your Self-Driving Car Will Be Programmed to Kill You - Deal With It,” there is an ethical dilemma in how to program a vehicle for a crash it cannot avoid. Surveys show that people want casualties minimized overall, yet they want their own car to protect them even if that means killing more people. This “social dilemma,” in which each consumer acts in their own self-interest and thereby makes the roads less safe for everyone, creates an ethical challenge for self-driving car manufacturers. People tend to support utilitarian-programmed cars as long as other people are buying them; they are far less likely to buy one themselves. Patrick Lin of California Polytechnic State University points out that humans are “notoriously bad at risk assessments: we drink and drive, we text and drive, we go way over the speed limit, and so on.” Should the ethical judgment of individuals even be consulted, then, when we make such poor choices daily? Public opinion still matters, though, because these cars ultimately affect the public, and as people learn more about the technology, they will adjust their opinions of it. I think there are many grey areas in how ethics should be programmed into these systems. If a crash is inevitable, the car should assess the situation and minimize casualties. It is easy to say that, of course, when you are not the one in the car.

So if there is a crash, who is responsible? I think the companies programming these cars are liable. Engineers take an oath to design and build ethically and with careful judgment. If companies like Tesla, Ford, and GM are willing to take on the challenge of building self-driving cars, they become responsible for systems that affect real lives on the road, including the lives of people who never chose to use one. Vehicle transportation is unavoidable in the 21st century, so even pedestrians are at the mercy of cars the moment they step onto the street. Self-driving cars will therefore affect everyone, and companies should answer for the mistakes in their automation and programming. If they are not held responsible, companies and their engineers have less incentive to work toward perfection.
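To show how the “minimize casualties” rule becomes a line of code some engineer must actually write, here is a deliberately toy sketch. The maneuvers, the numbers, and the utilitarian policy itself are all assumptions made for illustration; no manufacturer has published logic like this.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    expected_casualties: float  # probabilistic estimate, counting everyone involved
    occupant_share: float       # fraction of that risk borne by the car's own passengers

def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    """Utilitarian policy: pick the option with the fewest expected casualties."""
    return min(options, key=lambda m: m.expected_casualties)

# Invented numbers for an unavoidable-crash scenario.
options = [
    Maneuver("brake straight", expected_casualties=2.1, occupant_share=0.1),
    Maneuver("swerve left",    expected_casualties=0.8, occupant_share=0.7),
    Maneuver("swerve right",   expected_casualties=1.4, occupant_share=0.2),
]

best = choose_maneuver(options)
print(best.name)  # "swerve left": fewest total casualties, but riskiest for the occupants
```

The one-line `min` is the social dilemma in miniature: in this made-up scenario the utilitarian choice is the one most dangerous to the car's own passengers, which is exactly why survey respondents applaud the policy and then decline to buy it.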

The government should be responsible for setting safety expectations and uniform rules for every company designing and producing self-driving cars. According to the New York Times, in 2016 the government issued a 15-point safety standard to regulate the design and development of self-driving cars. The rules and regulations are not perfect, but the government faces genuinely new territory in policy and standards, and it should play a regulatory role because its job is to keep the public safe. President Obama announced that self-driving cars could eventually save tens of thousands of lives a year. Economically, I think ride sharing will grow as self-driving cars improve, while personal car sales decline. Socially, I think self-driving cars will reduce deaths. However, the technology is still a long way from perfected.

Would I want a self-driving car? For now, no. The technology is so new that I would rather die in a crash caused by my own driving decisions than by a failure or flaw in a computer system. However, in several years, when the systems are perfected and there are more self-driving cars than human-driven cars on the road, I would want one.

