Self-Driving Car Driving Toward Pedestrian

Who can be held liable when an autonomous vehicle strikes and kills a pedestrian?

This legal question came to the fore this week when an autonomous Uber car struck and killed a pedestrian in Tempe, Arizona, just outside of Phoenix.

Liability for the woman’s death is at issue.

Tempe Police said the vehicle was in self-driving mode with a driver behind the wheel at around 10 PM when it struck a woman who was crossing the road outside a crosswalk. Elaine Herzberg, 49, of Mesa was transported to a local hospital, where she died of injuries suffered in the crash.

Uber Technologies Inc. said in a statement that “our hearts go out to the victim’s family. We’re fully cooperating with Tempe police and local authorities as they investigate this accident.”

In light of the tragedy, Uber announced that it is suspending autonomous vehicle testing in Tempe, San Francisco, and other cities.

The National Transportation Safety Board is also investigating the accident. The agency said in a statement that a team of investigators will go through Uber’s equipment and video to help determine what caused the crash, in addition to examining “the vehicle’s interaction with the environment, other vehicles and vulnerable road users, such as pedestrians and bicyclists.” Human performance also will be examined, the NTSB said.

Arizona Governor Signed Executive Order Weeks Earlier Expanding Tests

In a somewhat ironic twist, Arizona Governor Doug Ducey recently updated the state’s autonomous vehicle executive order just 18 days before the fatal crash. The governor’s order permits autonomous vehicles to operate on the state’s roads without a human present.

“As technology advances, our policies and priorities must adapt to remain competitive in today’s economy,” said Ducey in a news release announcing the change. “This executive order embraces new technologies by creating an environment that supports autonomous vehicle innovation and maintains a focus on public safety.” The new order mandates that all automated driving systems comply with federal and state safety standards, including that the vehicle be able to come to a stop if the automated system fails, that it obey all state traffic and safety laws, and that it be licensed, registered, and insured in the state.

Who’s at Fault?

Evidence shows that the autonomous vehicle was traveling at 38 mph, though investigators have yet to determine whether that exceeded the speed limit. The vehicle’s back-up driver was 44-year-old Rafaela Vasquez. A back-up driver behind the wheel can take the car out of self-driving mode if need be.

Although the victim was not an Uber passenger at the time, her family may be bound by Uber’s arbitration agreement if she ever downloaded the company’s app. The company’s arbitration clause is extremely broad.

Expert Testimony Likely to be Needed in Cases

The accident in Arizona involving the autonomous Uber car is among the first fatalities related to the driverless car industry. Two years earlier, a man riding in a Tesla operating in Autopilot mode was killed when the vehicle drove under a semi-tractor trailer.

This case in Arizona could be litigated on issues of products liability, tort law, government liability, and traffic engineering and technology.

ForensisGroup has a host of experts who can assist in a case like this concerning autonomous vehicle use, as well as accident reconstruction and safety.