The Liability of Concerns

#22 - LLI Columbus

Over the past few years, the idea of autonomous cars has become hard to miss. Manufacturers from Apple to Ford have been planning to put these cars on the streets, and Tesla is currently one of the big brands with a car out that has an autonomous mode. While the idea seems beneficial, there are still a lot of questions floating around. The most pressing one is who should be liable if or when the car crashes.

Personally, I believe that in most situations the manufacturers should be responsible, but there are some situations where the person in the car is to blame. When researching autonomous cars, I found that the data from the car's surroundings is "processed by an on-board computer that makes decisions about steering, braking and accelerating … the sophisticated programming requires the computer to self-learn" ("How Do Driverless Cars Work?" by Australia's Science Channel).

This raises the possibility of cars shipping with a defect, or of a car not being programmed properly. If these conditions were the cause of an accident, that is when I strongly believe it is the manufacturer's fault.

Agencies such as the National Highway Traffic Safety Administration (NHTSA) have said that the artificial intelligence in the Google car could be considered the driver. Based on this, if the car gets into an accident, it is the car's, and therefore the manufacturer's, fault.

On the other hand, some people believe all these concerns will soon go away. In an interview, James McBride, a technical leader at Ford, said, "When we first introduced things like anti-lock brakes and airbags there was naturally some apprehension … but as soon as the data actually came back and showed the safety benefits … and the drivers were convinced [they] were enormous, [then] everyone wanted them" (Autonomous Driving by Engagement).

Also, the text says, "While liability will always be important with respect to motor vehicle operation, automation will dramatically increase safety on the highways by reducing both the numbers and severity of accidents" (John Villasenor). Villasenor chose to set the controversy aside and go straight to the fact that automation will ultimately increase safety on the roads. While I agree, I believe we need to address this topic now. Time and time again, the world lets something happen first and only then sets a precedent. Since we already know the risks, we should start making the laws and planning for the future.

In conclusion, I believe that who should be liable depends on the situation. In a video by Autoline Network, Gail Gottehrer gave an example of how people can try to blame crashes on the companies. In her example, a Tesla vehicle jumped a curb and crashed into a building, and the driver said it was the vehicle's fault. It turned out Tesla had data showing that the car was not in autonomous mode when the crash happened. In that same interview she proposed a scenario: "One can't expect a vehicle to drive you home if you're drunk because if the alarm goes off and tells you to drive and you can't, you are responsible" (Autonomous Liability by Autoline Network). While autonomous cars are beneficial for safety reasons, they could cause some manufacturers to lose more money. All in all, the specific situation determines who is responsible.