
Investigation Concludes Tesla Not at Fault in Self-Driving Car Crash

January 20, 2017

The push to bring self-driving cars to American roads got a significant boost on Thursday when the nation’s chief auto safety regulator essentially cleared Tesla Motors Inc.’s Autopilot system of fault in a fatal 2016 crash.

The U.S. National Highway Traffic Safety Administration found that the owner of a Tesla Motors Inc. Model S sedan that drove itself into the side of a truck in May had ignored the manufacturer’s warnings to maintain control even while using the driver-assist function. The agency said it found no defect in the vehicle and wouldn’t issue a recall.

“The auto industry just let out a giant sigh of relief,” said Dave Sullivan, an analyst at consultancy AutoPacific Inc. “This could have started a snowball effect of letting automakers and suppliers become liable for human error.”

The finding concludes NHTSA’s first investigation into the role played by automated driving systems in a fatal crash. It was a win not only for Tesla but for companies from General Motors Co. to Alphabet Inc.’s Google that have invested billions of dollars in what they see as the future of personal transportation. Safety regulators, too, have backed the nascent industry, giving it the flexibility to develop products that they think could greatly reduce highway deaths.

‘Very Positive’

Tesla Chief Executive Officer Elon Musk called the NHTSA report “very positive.” In a Twitter message he highlighted data showing the company’s vehicle crash rate dropped by 38 percent after it installed its Autosteer system.

“We appreciate the thoroughness of NHTSA’s report and its conclusion,” Tesla said in an e-mailed statement. Some auto-safety advocates have criticized Tesla for what they said was a premature introduction of its Autopilot system and said NHTSA could have taken stronger action.

“If a vehicle could not distinguish between a white truck and the sky, that to me would seem to be a defect,” said Joan Claybrook, NHTSA administrator under President Jimmy Carter and an auto safety advocate.

Stephanie Brinley, a senior analyst at IHS Markit Ltd.’s automotive group, cautioned that it’s too early to draw too many conclusions about self-driving vehicles from the findings.

“This decision does not in and of itself tell us what will happen down the road,” Brinley said. “It’s really too soon.”

Self-Driving Risks

NHTSA didn’t completely absolve self-driving technologies. The agency made several observations about the limits of automated driver aids and the risks associated with how drivers use them.

Automatic braking systems like the one on the Model S, and those increasingly available on other new vehicles, can’t address all crash scenarios, NHTSA spokesman Bryan Thomas said. The May crash in Florida that killed Joshua Brown, a former Navy SEAL and Tesla enthusiast, is an example of that, Thomas said.

The Model S’s sensors couldn’t distinguish the trailer against a bright sky as the truck made a left turn across the highway. Auto-braking systems are best at preventing rear-end collisions, not the cross-traffic collision that led to Brown’s death, Thomas said.

‘Full Attention’

So-called Level 2 automated driver systems like Tesla’s Autopilot, which provide automated driving functions in limited circumstances, continue to require a driver’s “full attention,” Thomas said.

Carmakers must anticipate that some drivers will fail to do so and design their systems with the “inattentive driver in mind,” he said. He also signaled that automakers will be expected to provide clearer warnings about the limitations of automated driver aids, saying NHTSA believes “strongly” that “it’s not enough to simply put it in an owner’s manual.”

Since the accident, Tesla has added protections to its software that shut off Autopilot if the system detects the driver isn’t paying attention. The software also emphasizes radar over cameras, and Musk has said that change would have made it easier for the car in the crash to detect the truck and might have saved Brown’s life.

NTSB Probe

The California-based group Consumer Watchdog said Tesla should have been held accountable for the accident.

“NHTSA has wrongly accepted Tesla’s line and blamed the human, rather than the Autopilot technology and Tesla’s aggressive marketing,” John Simpson, the group’s privacy project director, said in an e-mailed release. “The very name Autopilot creates the impression that a Tesla can drive itself. It can’t.”

The National Transportation Safety Board, an independent agency that has no regulatory power, is conducting a parallel investigation of the accident. The safety board is planning to issue its conclusions by early summer, spokesman Christopher O’Neil said.

In spite of the crash, the Autopilot system appears to have improved the overall safety of Tesla’s vehicles. Crash rates in Tesla vehicles fell by 38 percent, to 0.8 per million miles, after installation of the Autosteer system, NHTSA said in its report.

Thomas said Tesla was “fully” cooperative and provided data on what he estimated were “dozens” of Tesla Model S and Model X crashes in which Autopilot was active during the crash or 15 seconds prior.

Tesla was able to pull crash data directly from its vehicles, Thomas said, providing the agency access that “would not have been possible just a few years ago or with other automakers.”

Tesla advanced as much as 4.3 percent in U.S. trading and closed up 2.3 percent at $243.76 per share, its highest since April 28. Shares got an early boost from a Morgan Stanley upgrade.
