Tesla must bear some blame for self-driving car accident fatality, report says

America’s National Transportation Safety Board ruled that the Autopilot system in the Model S needs to be disabled in certain conditions

This image provided by the National Transportation Safety Board shows the damage to the left front of the Tesla involved in a May 7, 2016, crash in Williston, Fla. Investigators are meeting Sept. 12, 2017, to determine the likely cause of the crash that killed Joshua Brown, 40, of Canton, Ohio, who was using the semiautonomous driving systems of his Tesla Model S sedan. The sedan struck the underside of a semitrailer that was turning onto a divided highway in Williston. The sedan's roof was sheared off before the vehicle emerged on the other side of the trailer. (NTSB via AP)

A report into the first fatality involving a self-driving car concluded on Tuesday that the vehicle’s maker bears some of the responsibility for the death of Joshua Brown, who was killed on a Florida highway in May 2016 while behind the wheel of his Tesla Model S.

America’s National Transportation Safety Board (NTSB) ruled that despite the possibility that Mr Brown had not paid attention to the car’s safety warnings while using Autopilot, Tesla needed to prevent autonomous driving systems from being used on roads for which they are not designed.

“The combined effects of human error and the lack of sufficient system controls resulted in a fatal collision that should not have happened,” said Robert Sumwalt, the chairman of the NTSB.

The accident that killed the 40-year-old Mr Brown, a former Navy Seal, occurred on a divided road with occasional intersections, conditions in which Tesla had warned owners not to use Autopilot. Mr Brown crashed into an articulated lorry at 74mph.

In spite of such warnings, the car’s software allowed drivers to go as fast as 90 miles an hour under automated steering, the NTSB found.

“In this crash, Tesla’s system worked as designed,” Mr Sumwalt said. “But it was designed to perform limited tasks in a limited range of environments. The system gave far too much leeway to the driver to divert his attention to something other than driving.”

Earlier this year, in what was deemed a victory for Tesla, a National Highway Traffic Safety Administration report on the accident said that vehicles with the Autopilot system did not need to be recalled. The company nevertheless showed due contrition yesterday.


“We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology,” a statement from Tesla said. “We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”

During tests after the fatal accident, Mr Brown’s car showed no signs that he had tried to brake or evade the truck, which was making a left turn. The truck driver’s failure to yield as he made the turn and Mr Brown’s over-reliance on Tesla’s automation were the primary causes of the accident, the NTSB found. It also concluded that the automation contributed because it permitted Mr Brown’s “prolonged disengagement from the driving task”.

Mr Brown’s Model S had warned him seven times during the 37 minutes before the crash that his hands weren’t on the steering wheel, but he was able to touch the wheel momentarily and the system continued driving itself. Newer versions of Autopilot stop the car after the third such infraction, but drivers can still go for minutes at a time without steering, or silence the warning quickly, according to the NTSB.