Tesla criticised for again blaming driver in latest crash

Company said the “only” explanation for the crash was “if Mr Huang was not paying attention to the road, despite the car providing multiple warnings to do so”


Consumer-safety advocates and autonomous-vehicle experts criticised Tesla for issuing another statement that blamed a customer’s death on driver inattentiveness.

Days after publishing a second blog post about the crash involving Walter Huang, a 38-year-old who died last month in his Model X, Tesla issued a statement in response to his family’s interview with San Francisco television station ABC7. The company said the “only” explanation for the crash was “if Mr Huang was not paying attention to the road, despite the car providing multiple warnings to do so”.

“I find it shocking,” Cathy Chase, president of the group Advocates for Highway and Auto Safety, said Wednesday. “They’re claiming that the only way for this accident to have occurred is for Mr Huang to be not paying attention. Where do I start? That’s not the only way.”

Groups including Advocates for Highway and Auto Safety and Consumer Reports have criticised Tesla for years for naming its driver-assistance system Autopilot, with the latter calling on the company to choose a different moniker back in July 2016. The two organisations share the view of the National Transportation Safety Board, which has urged car makers to do more to ensure drivers using partial-autonomy systems like Autopilot remain engaged with the task of driving. The US agency is in the midst of two active investigations into Autopilot-related crashes.

It is Tesla’s responsibility to provide adequate safeguards against driver misuse of Autopilot, including by sending visual and audible warnings when the system needs a human to retake control, Ms Chase said. “If they’re not effective in getting someone to re-engage, as they say that their drivers have to, then they’re not doing their job.”

High Stakes

The stakes for Tesla in defending Autopilot are significant. The NTSB’s investigation of the March 23 crash involving Mr Huang contributed to a major sell-off in the company’s shares late last month. Chief executive Elon Musk claimed almost 18 months ago that the system would eventually render Tesla vehicles capable of full self-driving, and much of the $51 billion company’s value is tied to expectations that it could be an autonomous-car pioneer.

Tesla has declined to say how long drivers can now use Autopilot between visual or audible warnings to keep a hand on the wheel. It has also refused to comment on how many alerts can be ignored before the system disengages, what version of Autopilot software was in Mr Huang’s Model X, or when the car was built.

“Just because a driver does something stupid doesn’t mean they - or others who are truly blameless - should be condemned to an otherwise preventable death,” said Bryant Walker Smith, a professor at the University of South Carolina’s School of Law, who studies driverless-car regulations. “One might consider whether there are better ways to prevent drivers from hurting themselves or, worse, others.”

Investigations


The NTSB is looking into the crash that killed Mr Huang, as well as a collision in January in which a Tesla Model S using Autopilot rear-ended a fire truck parked on a freeway near Los Angeles. The agency said after Tesla’s second blog post about the Huang crash that it was unhappy with the company for disclosing details during its investigation.

In its latest statement, Tesla said it is “extremely clear” that Autopilot requires drivers to be alert and have hands on the steering wheel. The system reminds the driver of this every time it is engaged, according to the company.

“Tesla’s response is reflective of its ongoing strategy of doubling down on the explicit warnings it has given to drivers on how to use, and not use, the system,” said Mike Ramsey, an analyst at Gartner. “It’s not the first time Tesla has taken this stance.”

Minami Tamaki, the San Francisco-based law firm that Mr Huang’s family has hired, said Wednesday that it believes Tesla’s Autopilot is defective and likely caused Mr Huang’s death. The firm declined to comment on Tesla’s statement.

The National Highway Traffic Safety Administration, which has the power to order recalls and fine auto manufacturers, found no defect after investigating the May 2016 crash involving a Tesla Model S driven on Autopilot by Josh Brown. The agency closed its probe in January 2017.

According to data Tesla gave NHTSA investigators before the agency decided against a recall, Autopilot’s steering function may reduce the rate of crashes per million miles driven by about 40 per cent, a figure the company cited in its latest statement.

“We empathise with Mr Huang’s family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road,” Tesla said. “The reason that other families are not on TV is because their loved ones are still alive.”

Neither Tesla nor NHTSA has released the underlying data to support the crash-rate reduction claim.

“Tesla explicitly uses data gathered from its vehicles to protect itself, even if it means going after its own customers,” said Mr Ramsey.