Why there's a need for a protocol on self-driving car accidents
A photo of an Uber self-driving SUV, upended after a high-speed crash in Tempe, Ariz., went viral on Twitter in March.
There were no injuries, and investigators concluded that the other car failed to yield. But that photo is a blunt reminder of the auto industry’s unfinished business.
As fleets of robo-taxis start mixing on public roads with cars, motorcycles and pedestrians, accidents will be inevitable. Each incident will be scrutinized, and one bad crash could rattle public opinion.
What to do? Amnon Shashua has a solution.
In a blog posted Tuesday, Mobileye’s co-founder proposed an industry protocol that would enable investigators to quickly determine which vehicle caused an accident.
Without an industry protocol, “you could have a long investigation, and it might not be so clear whose fault it was,” said Mobileye spokesman Dan Galves. “The automaker would be facing liability, and consumer confidence could go down.”
Here are the key elements:
A self-driving car can operate at normal speeds in mixed traffic with other vehicles.
If the car crashes, investigators should have immediate access to its sensor data to determine the cause.
Standard industry algorithms would prevent the vehicle from making risky decisions that could trigger a collision.
A self-driving car would not be held liable if the other vehicle caused the accident.
Industry algorithms would define the most aggressive evasive action that a vehicle can take to avoid an accident.
That last item is going to be tricky. For example, should a self-driving car be programmed to avoid a pedestrian at all costs — even if it hits another vehicle?
Shashua does not envision a vague “do-no-harm” directive. He is proposing a concrete set of algorithms that regulators and insurers would endorse, and that automakers would adopt.
If those algorithms are adopted — and if a vehicle malfunction did not cause the accident — the automaker would not be held liable.
Shashua outlined his proposal for the industrywide protocol in an academic paper published Tuesday.
He will get a respectful hearing. Shashua’s company commands a 70 percent share of the world market for obstacle detection software. After Intel Corp. finalized the $15 billion purchase of Mobileye this year, it put Shashua in charge of its automated vehicle group.
It seems safe to assume this proposal will trigger a lively debate. But Shashua’s blog noted that automakers will have something to offer in return for legal limits on their liability.
A self-driving car festooned with cameras, radar and lidar will be much safer than conventional vehicles. For example, such a vehicle can calculate precisely how closely it can follow another vehicle without risking a rear-end collision.
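The kind of following-distance calculation described above can be sketched with basic stopping-distance kinematics. The sketch below is an illustrative model, not Mobileye's published algorithm, and every parameter name is an assumption: the rear car takes a reaction time rho before braking (and may accelerate during it), then brakes at only a guaranteed minimum rate, while the lead car may brake at its maximum rate. The safe gap is the worst-case difference between the two stopping distances.

```python
def min_safe_gap(v_rear, v_front, rho=0.5, a_max=3.0, b_min=4.0, b_max=8.0):
    """Worst-case minimum following gap in meters.

    Illustrative kinematic sketch (not Mobileye's actual algorithm):
    the rear car reacts for `rho` seconds, possibly accelerating at
    up to `a_max` m/s^2, then brakes at only `b_min` m/s^2, while the
    front car brakes hard at `b_max` m/s^2. Speeds are in m/s.
    """
    # Speed the rear car could reach by the end of the reaction time
    v_reacted = v_rear + rho * a_max
    # Distance the rear car covers: reaction phase plus braking phase
    d_rear = v_rear * rho + 0.5 * a_max * rho**2 + v_reacted**2 / (2 * b_min)
    # Distance the front car covers while braking at its hardest
    d_front = v_front**2 / (2 * b_max)
    # The gap must absorb the difference; it can never be negative
    return max(d_rear - d_front, 0.0)

# Two cars at highway speed (30 m/s, about 108 km/h): the rear car
# should leave roughly this many meters of headway.
print(round(min_safe_gap(30.0, 30.0), 1))  # prints 83.2
```

The point of a shared protocol is that these parameter values would be fixed industrywide, so a vehicle that respected the resulting gap could be shown, from its sensor logs, not to have caused a rear-end collision.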
If the entire U.S. vehicle fleet were autonomous, Shashua estimates that it would suffer about 40 fatal traffic accidents annually. By comparison, there were nearly 40,000 fatalities last year on U.S. roads.
“Society can accept human error, but what happens on day one when you fully take the driver out of the driver’s seat?” Galves asked. “To create an environment for self-driving vehicles, we need to quickly and definitively assign blame for accidents.”