What lessons can we take from the fatal accident in Arizona in 2018 involving an autonomous vehicle?


On March 18, 2018, in Tempe, Arizona, a vehicle operated by self-driving software under development collided with a pedestrian, causing her death. Following this accident, the U.S. National Transportation Safety Board ("NTSB") conducted an investigation and, on November 19, 2019, issued its preliminary results and recommendations.1

The circumstances of the accident involving an autonomous car from Uber

The autonomous vehicle ("AV"), a 2017 Volvo XC90, was equipped with an automated driving system being developed by Uber Technologies Inc. ("Uber"). At the time of the collision, the vehicle was travelling at a speed of approximately 72 km/h, and was completing the second portion of a predetermined route as part of a driving test. The pedestrian was struck while crossing the street outside the crosswalk.

The NTSB's investigation found that the vehicle's automated driving system had detected the pedestrian, but was unable to classify her as a pedestrian or predict her path.

Further, the automated driving system prevented the activation of the vehicle's emergency braking system, relying instead on the intervention of the human driver on board to regain control of the vehicle in this critical situation.

However, videos from inside the vehicle showed that the driver was not paying attention to the road, but was rather looking at her cell phone lying on the vehicle console.

Since the collision between the pedestrian and the vehicle was imminent, the inattentive driver was unable to take control of the vehicle in time to prevent the accident or mitigate its severity.

What are the causes of the accident?

The NTSB issued several findings, including the following:

  • Neither the driver's experience and knowledge, nor her fatigue or mental faculties, nor the mechanical condition of the vehicle was a factor in the accident;
  • An examination of the pedestrian showed the presence of drugs in her body which may have impaired her perception and judgment;
  • Uber's automated driving system did not adequately anticipate its safety limitations, including its inability to identify the pedestrian and predict her path;
  • The driver of the vehicle was distracted in the moments preceding the accident. Had she been attentive, she would have had enough time to see the pedestrian and take control of the vehicle to avoid the accident or mitigate its impact;
  • Uber did not adequately recognize the risks of distraction of the drivers of its vehicles;
  • Uber had removed the second driver from the vehicle during the tests, which had the effect of giving the sole remaining driver full responsibility for intervening in a critical situation, thereby reducing vehicle safety.

The probable cause of the accident was found to be the driver's distraction and failure to take control of the AV in a critical situation. Additional factors were identified, including insufficient vehicle safety measures and driver monitoring, associated with deficiencies in the safety culture at Uber.

The NTSB issued recommendations, including the following:

  • Arizona should impose obligations on AV project developers to address the risks associated with driver inattentiveness, with the aim of preventing accidents and mitigating their consequences;
  • The National Highway Traffic Safety Administration ("NHTSA") should require entities conducting projects involving AVs to submit a self-assessment report on the safety measures for their vehicles. Additionally, the NHTSA should set up a process for the assessment of these safety measures;
  • Uber should implement a policy on the safety of its automated driving software.

Can an identical tragedy related to autonomous vehicles occur in Quebec and Canada?

Following the update to the Highway Safety Code in April 2018, level 3 AVs are now permitted to be driven in the province of Quebec when their sale is allowed in Canada. Driving of level 4 and 5 automated vehicles is permitted where it is expressly regulated in the context of a pilot project.2

According to SAE International Standard J3016, level 3 AVs are vehicles with so-called conditional automation: active driving is automated, but the human driver must remain attentive so that they can take control of the vehicle in a critical situation.

Thus, the vehicle involved in the Arizona accident, although still in the development phase, corresponded to a level 3 AV.

Level 3 AVs may therefore now be legally driven on Quebec roads.

In Canada, the Motor Vehicle Safety Act3 and its regulations govern "the manufacture and importation of motor vehicles and motor vehicle equipment to reduce the risk of death, injury and damage to property and the environment". However, there is currently no provision specifically regulating automated driving software or the risks associated with the inattention of level 3 AV drivers.

With the arrival of AVs in Canada, and taking into consideration the recommendations of the NTSB, we believe the current framework should be improved to specifically address AV safety measures and thereby ensure the safety of all.

 

  1. National Transportation Safety Board, Public Meeting of November 19, 2019, “Collision Between Vehicle Controlled by Developmental Automated Driving System and Pedestrian”, Tempe, Arizona, March 18, 2018, HWY18MH010.
  2. Highway Safety Code, CQLR c. C-24.2, s. 492.8 and 633.1; the driving of autonomous vehicles in Ontario is regulated by Pilot Project - Automated Vehicles, O Reg 306/15.
  3. Motor Vehicle Safety Act, S.C. 1993, c. 16; see, in particular, the Motor Vehicle Safety Regulations, C.R.C., c. 1038.