Self-driving vehicles potentially offer a tremendous upside. They could lower the cost of transportation, decrease congestion, reduce emissions and increase mobility for people with disabilities, the elderly and hundreds of thousands of others who are unable to drive for a variety of reasons.
But all sorts of questions remain to be resolved, both technological and legal, before autonomous vehicles become widely used.
In this two-part post, we will tackle two of those issues: How are autonomous vehicles defined? And what types of liability questions are likely to arise as such vehicles become more common?
Five Levels Of Autonomous Driving
Currently, there are five levels of autonomous driving capability (plus those vehicles that have no autonomous ability at all).
Level 1 – At the first level of autonomy, the vehicle can assist with a single function, such as steering or acceleration and braking, but the driver remains responsible for all other aspects of driving.
Level 2 – Partial automation is where many so-called self-driving cars are today. These vehicles can handle steering, accelerating and braking well enough that a driver may disengage from those tasks, but the driver must stay aware of what’s happening on the road and be ready to retake control of the vehicle at any moment.
Last March, a pedestrian in Tempe, Arizona, was struck and killed by a self-driving Volvo operated by Uber. In that case, as well as in two fatal accidents involving Teslas operating in semi-autonomous mode, it has been reported that the people behind the wheel were not paying close enough attention and were not ready to take back control of the vehicle when it became necessary.
Level 3 – Vehicles with this level of autonomy can monitor their environment using technology such as light detection and ranging (Lidar), allowing human operators to disengage from “safety critical” functions under certain conditions, though they must still be prepared to take over when the vehicle requests it.
Level 4 – At this level, autonomous vehicles can perform all core driving functions – steering, braking, accelerating – but they may not be able to handle more complex maneuvers, such as merging onto a highway.
Level 5 – Vehicles at this level are completely automated and require no human interaction. In addition to handling functions such as braking and accelerating, they can also manage more complex driving situations, such as navigating traffic jams and merging onto highways.
When Crashes Occur
The National Highway Traffic Safety Administration (NHTSA), which oversees vehicle safety, has not established detailed regulatory standards for autonomous vehicles. Conceivably, NHTSA could provide a baseline framework of safety regulations and allow states to establish their own rules on top of it.
But self-driving cars – which, as we have discussed, come with varying levels of autonomy – have already caused serious crashes and even fatalities. Which parties should be liable for injuries caused in these crashes?
In part two of this post, we will discuss the range of options. These could include lawsuits against individual drivers of vehicles that are not fully autonomous, as well as products liability litigation against makers of autonomous vehicles.