DETROIT — The fiery crash of a Tesla near Houston with no one behind the wheel is drawing scrutiny from two federal agencies that could bring new regulation of electronic systems that take on some driving tasks. The National Highway Traffic Safety Administration and the National Transportation Safety Board said Monday they would send teams to investigate the crash on a residential road that killed two men in a Tesla Model S.
Local authorities said one man was found in the passenger seat while another was in the back. They’re issuing search warrants in the probe, which will determine whether Tesla’s Autopilot partially automated driving system was in use. Autopilot can keep a car centered in its lane, keep a distance from cars in front of it, and even change lanes automatically in some circumstances. On Twitter Monday, Tesla CEO Elon Musk wrote that data logs “recovered so far” show Autopilot wasn’t turned on, and “Full Self-Driving” was not purchased for the vehicle. He didn’t answer reporters’ questions posed on Twitter.
In the past, NHTSA, which has the authority to regulate automakers and seek recalls for defective vehicles, has taken a hands-off approach to regulating partially automated and fully self-driving systems for fear of hindering the development of promising new features. But the agency has stepped up inquiries into Teslas since March, dispatching teams to three crashes. It has investigated 28 Tesla crashes in the past few years but thus far has relied on voluntary safety compliance from auto and tech companies.
“With a new administration in place, we’re reviewing regulations around autonomous vehicles,” the agency said. Agency critics say regulations, especially of Tesla, are long overdue as the automated systems keep creeping toward being fully autonomous. At present, though, there are no specific regulations and no fully self-driving systems available for sale to consumers in the U.S. At issue is whether Musk has over-sold the capability of his systems by using the name Autopilot or telling customers that “Full Self-Driving” will be available this year.
“Elon’s been irresponsible,” said Alain Kornhauser, faculty chair of autonomous vehicle engineering at Princeton University. Musk, he said, has sold the dream that the cars can drive themselves even though, in the fine print, Tesla says they’re not ready. “It’s not a game. This is serious stuff.” Tesla, which has disbanded its media relations office, did not respond to requests for comment Monday. Its stock fell 3.4% in the face of publicity about the crash. In December, before former President Donald Trump left office, NHTSA sought public comment on regulating automated driving systems. Transportation Secretary Elaine Chao, whose department included NHTSA, said the proposal would address safety “without hampering innovation in the development of automated driving systems.”
But her replacement under President Joe Biden, Pete Buttigieg, indicated before Congress that change might be coming. “I would suggest that the policy framework in the U.S. has not caught up with the technology platforms,” he said last month. “So we intend to pay a lot of attention to that and do everything we can within our authorities,” he said, adding that the agency may work with Congress on the issue. Autopilot, which has been involved in several fatal crashes, has failed to stop for tractor-trailers crossing in front of it, stopped emergency vehicles, or a highway barrier.
The NTSB, which can only issue recommendations, asked that NHTSA and Tesla limit Autopilot’s use to roads on which the system can safely operate, and that Tesla install a more robust system to monitor drivers to ensure they are paying attention. Neither Tesla nor the agency took action, drawing blame for one of the crashes from the NTSB. Missy Cummings, an electrical and computer engineering professor at Duke University who studies automated vehicles, said the Texas crash is a watershed moment for NHTSA. She’s not optimistic the agency will do anything substantial but hopes the impact of this crash will bring change. “Tesla has had such a free pass for so long,” she said.
Frank Borris, a former head of NHTSA’s Office of Defects Investigation who now runs a safety consulting business, said the agency is in a challenging position because of a slow, outdated regulatory process that can’t keep up with fast-developing technology. The technology holds great promise to improve safety, Borris said. But the agency is also working with “what is an antiquated regulatory rule promulgating process which can take years.” Investigators in the Houston-area case haven’t determined how fast the Tesla was traveling at the time of the crash, but Harris County Precinct Four Constable Mark Herman said it was at a high speed. He would not say whether there was evidence that anyone tampered with Tesla’s system to monitor the driver, which detects force from hands on the wheel.
The system will issue warnings and eventually shut the car down if it doesn’t detect hands. But critics say the system is easy to fool and can take as long as a minute to shut down. The company has said that drivers using Autopilot and its “Full Self-Driving Capability” system must be ready to intervene at any time, and that neither system can drive the cars themselves. On Sunday, Musk tweeted that the company had released a safety report from the first quarter showing that a Tesla with Autopilot engaged has nearly a 10 times lower chance of crashing than the average vehicle with a human piloting it.
But Kelly Funkhouser, head of connected and automated vehicle testing for Consumer Reports, said Tesla’s numbers have been inaccurate in the past and are difficult to verify without underlying data. “You just have to take their word for it,” Funkhouser said, adding that Tesla doesn’t say how many times the system failed but didn’t crash, or how often a driver failed to take over. Funkhouser said it’s time for the government to step in, set performance standards, and draw a line between partially automated systems that require drivers to intervene and systems that can drive themselves. “There is no metric, no yes or no, black or white,” she said. She fears that Tesla is putting self-driving cars on the road while “getting away with using the general population of Tesla owners as guinea pigs to test the system.” Hope Yen contributed to this report from Washington.