BERKELEY, Calif. – The fatal crash of a Tesla with apparently no one behind the wheel has shed new light on the safety of semi-autonomous vehicles and the vague US regulatory terrain in which they operate.
Police in Harris County, Texas, said a Tesla Model S crashed into a tree at high speed on Saturday after failing to negotiate a curve, then went up in flames, killing one occupant found in the front passenger seat and the owner, found in the back seat.
Tesla Chief Executive Elon Musk tweeted Monday that preliminary data downloaded by Tesla indicated the vehicle was not operating on Autopilot and was not equipped with the company's SAE Level 2 driver assistance system, confusingly called "Full Self-Driving" (FSD). He also suggested Autopilot could not function without someone behind the wheel, a claim debunked by a Consumer Reports test.
Tesla’s Autopilot and FSD, as well as the growing number of similar semi-autonomous driving features in cars made by other automakers, pose a challenge to officials responsible for motor vehicle and highway safety.
The National Highway Traffic Safety Administration (NHTSA), the US federal road safety regulator, has not yet issued specific regulations or performance standards for semi-autonomous systems such as Autopilot, or for fully autonomous vehicles (AVs).
There are no NHTSA rules requiring carmakers to ensure such systems are used as intended or to prevent drivers from misusing them. The main federal requirement is simply that vehicles have the steering wheels and human controls mandated by federal regulations.
With no performance or technical standards, systems such as Autopilot inhabit a regulatory gray area.
The Texas crash follows a series of accidents involving Tesla cars being driven on Autopilot, the partially automated driving system that performs a range of functions such as keeping drivers in their lane and steering on highways.
Tesla has also rolled out what it describes as a “beta” of its FSD system to approximately 2,000 customers since October, allowing them to effectively test how well it works on public roads.
Harris County police are now seeking a search warrant for the Tesla's data and said witnesses told them the victims planned to test the car's automated driving.
Adding to the confusion in the regulations is that the NHTSA traditionally regulates vehicle safety, while motor vehicle departments (DMVs) in individual states supervise drivers.
When it comes to semi-autonomous features, it may not be clear whether the onboard computer or the driver is controlling the car, or whether supervision is shared, the US National Transportation Safety Board (NTSB) says.
California has introduced AV regulations, but they only apply to cars equipped with technology that can perform the dynamic driving task without the active physical control or monitoring of a human operator, the state DMV told Reuters.
It said Tesla's Full Self-Driving system does not yet meet those standards and is considered an advanced driver assistance system, which it does not regulate.
That leaves Tesla's Autopilot and FSD in regulatory limbo in California, even as the automaker rolls out new versions of the systems for its customers to test.
NHTSA, the federal agency responsible for vehicle safety, said this week that it has opened 28 investigations into crashes involving Tesla vehicles, 24 of which remain active. At least four, including the fatal Texas crash, have occurred since March.
NHTSA has repeatedly argued that its broad authority to require automakers to recall any vehicle that poses an unreasonable safety risk is sufficient to address driver assistance systems.
So far, the NHTSA has not taken any enforcement action against Tesla’s advanced driving systems.
White House spokeswoman Jen Psaki said the NHTSA is “actively involved with Tesla and local law enforcement” about the Texas crash.
The NTSB, a US government agency charged with investigating road accidents, has criticized NHTSA’s hands-off approach to regulating cars with self-driving functions and AVs.
“NHTSA refuses to take action on vehicles that are said to have partial or lower automation, and continues to wait for higher levels of automation before requiring AV systems to meet minimum national standards,” NTSB Chairman Robert Sumwalt wrote in a February 1 letter to NHTSA.
“Because NHTSA has no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the limitations of the AV control system,” the letter said.
NHTSA told Reuters that, under the new administration, it was reviewing the regulations surrounding AVs and welcomed the NTSB's input as it advanced policies on automated driving systems.
It said the most advanced vehicle technologies on the market require a fully attentive human driver at all times.
“Abusing these technologies is, at a minimum, distracted driving. Every state in the country holds the driver responsible for the safe operation of the vehicle,” NHTSA told Reuters.
The NTSB also says NHTSA has no method of verifying whether vehicle manufacturers have implemented system safeguards. For example, there are no federal regulations requiring that drivers touch the steering wheel within a specific time frame.
“NHTSA is in the process of regulating autonomous vehicles, but it has been slow to regulate semi-autonomous vehicles,” said Bryant Walker Smith, a law professor at the University of South Carolina. “There is a growing realization that they deserve more research priority and regulatory action.”
New York has a law requiring drivers to keep at least one hand on the wheel at all times, but no other state has legislation that could prevent the use of semi-autonomous cars.
According to the National Conference of State Legislatures, 35 states have passed legislation or governors have signed executive orders pertaining to AVs.
Such rules allow companies like Alphabet's Waymo and General Motors' Cruise to test their vehicles on public roads.
But the regulations vary by state.
Texas AV regulations state that vehicles must comply with NHTSA processes, although such federal regulations do not exist. The Texas Department of Public Safety, the regulator charged with overseeing AVs, did not respond to a request for comment.
Arizona’s transportation division requires companies to submit regular applications to verify, among other things, that vehicles can operate safely if autonomous technology fails.
While most automakers offer vehicles with various forms of assisted driving, there are no fully autonomous vehicles for sale to customers in the United States.
However, concerns about the safety of autonomous driving technology have grown in recent years, and Tesla has warned of its limitations.
In February 2020, Tesla's director of autonomous driving technology, Andrej Karpathy, identified a challenge for its Autopilot system: recognizing when the emergency flashing lights of a parked police car are on.
“This is an example of a new task we would like to know about,” Karpathy said in a conference talk on Tesla's efforts to deliver FSD technology.
In the little more than a year since then, Tesla vehicles have crashed into police cars parked on the road four times, and since 2016 at least three Teslas operating on Autopilot have been involved in fatal crashes.
US safety regulators, police and local governments have investigated all four incidents, officials told Reuters. At least three of the cars were on Autopilot, police said. In one case, a doctor was watching a movie on a phone when his vehicle rammed into a parked police car in North Carolina.
Tesla did not immediately respond to a request for comment.
The accidents and investigations have not slowed Musk's promotion of Tesla cars as capable of driving themselves.
In a recent tweet, Musk said Tesla is “almost ready with FSD Beta V9.0. Step change improvement is massive, especially for weird corner cases & bad weather. Pure vision, no radar.”
Tesla also says it uses the more than 1 million cars it has on the road to collect image data and improve Autopilot through machine learning and artificial intelligence.
Tesla's Karpathy said his Tesla drove him for 20 minutes in Palo Alto to get coffee without any intervention.
“It’s not a perfect system, but it’s coming,” he said in a “Robot Brains” podcast in March. “I’ll definitely keep my hands on the wheel.”