
Who's at fault when autonomous cars kill?

March 9, 2022

We’ve all seen the commercials – vehicles parallel parking themselves, changing lanes, or braking. Nice features to have.

But when an automated vehicle gets in an accident and kills someone, who’s liable: the human driver or the company that programmed the driving software?

A court case in California may provide the answer. Prosecutors there have filed two counts of vehicular manslaughter against the driver of a Tesla Model S on Autopilot who ran a red light, slammed into another car, and killed two people in 2019.

The families of both victims have sued Tesla, accusing the company of selling vehicles without capable emergency automatic braking systems. A joint trial is set for mid-2023.

The defendant, according to the Associated Press, appears to be the first person in the U.S. to be charged with a felony for a fatal crash involving a partially automated driving system.

Autopilot engaged

The details of the case—whether the driver was paying attention or not—have yet to fully emerge, but the vehicle’s Autopilot feature was engaged. The driver, Kevin George Aziz Riad, 27, has pleaded not guilty and his preliminary hearing is set for Feb. 23.

The criminal charges aren’t the first involving an automated driving system, AP reported, but they are the first to involve a widely used driver technology.

Tesla’s Autopilot system is considered Level 2 vehicle autonomy by the National Highway Traffic Safety Administration (NHTSA). That means that although the system can steer, brake, and accelerate the vehicle, the driver “must continue to pay full attention at all times” and be ready to perform all driving tasks, according to the agency.

Some critics say companies are marketing Level 2 technologies as more capable than they are, and that consumers driving these vehicles may believe they can let the car drive itself.

The Society of Automotive Engineers (SAE International) has a universal classification system that defines automation levels for motor vehicles:

Level 1 - Driver Assistance: Intelligent features add a layer of safety and comfort. A human driver is required for all critical functions.

Level 2 - Partial Automation: At least two automated tasks are managed by the vehicle, but the driver must remain engaged with the driving task.

Level 3 - Conditional Automation: The vehicle becomes a co-pilot. It manages most safety-critical driving functions, but the driver must be ready to take control of the vehicle at all times.

Level 4 - High Automation: The vehicle is capable of performing all driving functions under certain conditions. The driver may have the option to control the vehicle.

Level 5 - Full Automation: The vehicle is capable of being completely driverless. No vehicle of this type is currently available for purchase.

At Levels 2 through 4, where there is still an element of human oversight, accidents can be attributed to human error, such as when a Tesla operating on Autopilot crashed into a police car while the driver was watching a movie. The driver is still at fault because they were supposed to be paying attention to the road. It is only when the driver disappears entirely that liability changes. At Level 5, the vehicle operator is no longer a human driver but the artificial intelligence (AI) making the decisions to steer, slow, accelerate, and stop the vehicle. Driver error disappears.

Ontario launched a 10-year pilot program in 2016 to allow the testing of automated vehicles on its roads, making it the first jurisdiction in Canada to permit on-road testing of automated vehicles.

Existing laws apply

The program was updated in January 2019 to allow members of the public – as opposed to registered pilot program participants – to drive Level 3 conditionally automated vehicles, provided the driver was always ready to take control of the vehicle. At the time, the province said all existing laws (such as distracted, careless, and impaired driving laws) would continue to apply to the driver.

In 2018, the Insurance Bureau of Canada (IBC) released a report on auto insurance and liability for automated vehicles. The report noted that this eventual shift in responsibility for collisions from humans to automated technology means many injured people will have to proceed through product liability litigation to be compensated. Product liability litigation is more complex, IBC said, and takes years longer to resolve than traditional motor vehicle liability claims.

The report called for changes to auto insurance policies and legislation to ensure people injured in accidents involving automated vehicles get compensated fairly and quickly. IBC recommended that:

  • A single insurance policy be established covering both driver negligence and the automated technology, to facilitate liability claims.
  • A legislated data-sharing arrangement with vehicle manufacturers and vehicle owners and/or insurers be established to help determine the cause of a collision.
  • Federal vehicle safety standards be updated with technology and cyber security standards.

One thing is certain: the filing of charges in the California crash serves notice to drivers who use autopilot systems that they cannot rely on them to control their vehicles.

RELATED READING: Autonomous cars not allowed, ICBC says