Federal investigators question Tesla’s Autopilot recall after 20 crashes


Federal highway safety investigators want Tesla to tell them how and why it developed the fix in its recall of more than 2 million vehicles equipped with the company’s partially automated Autopilot driving system.

Investigators at the U.S. National Highway Traffic Safety Administration are concerned that the recall may not have worked, because Tesla has reported 20 crashes since the fix shipped as an over-the-air software update in December.

The recall was also supposed to address whether Autopilot should be allowed to operate on roads other than limited-access highways. Tesla’s fix for that was to increase warnings to the driver on roads with intersections.

But in a letter to Tesla posted on the agency’s website on Tuesday, investigators wrote that they could find no difference between the warnings urging drivers to pay attention before the recall and after the new software was released. The agency said it will evaluate whether the driver warnings are adequate, especially when the driver-monitoring camera is covered.

The agency requested extensive information about how Tesla developed the fix, focusing on how the company used human behavioral research to test the recall’s effectiveness.

‘Insufficient remedies’

Phil Koopman, a professor at Carnegie Mellon University who studies automated driving safety, said the letter shows that the recall did little to fix Autopilot’s problems and was an attempt to appease NHTSA, which ordered the recall after more than two years of investigation.

“It’s pretty clear to anyone watching that Tesla tried to come up with the least possible solution to see what they could get away with,” Koopman said. “And NHTSA must respond strongly, or other auto companies will provide inadequate solutions.”


Safety advocates have long raised concerns that Autopilot, which can keep a vehicle in its lane and away from objects in front of it, is not designed to work on roads other than limited-access highways.

Missy Cummings, a professor of engineering and computer science at George Mason University who studies automated vehicles, said NHTSA is responding to criticism from lawmakers over a perceived lack of action on automated vehicles.

“No matter how inept our government is, the feedback loop works,” Cummings said. “I think NHTSA leadership is now convinced that this is a problem.”

The 18-page NHTSA letter asks how Tesla used human behavioral science in designing Autopilot, and how the company assesses the importance of evaluating human factors.

It also wants Tesla to identify every position involved in evaluating human behavior, along with those employees’ qualifications, and to say whether the positions still exist.

The Associated Press left a message with Tesla early Tuesday seeking comment on the letter.

Tesla is in the process of laying off about 10% of its workforce, about 14,000 people, in an effort to cut costs to cope with declining global sales.

Cummings said she suspects CEO Elon Musk may have laid off anyone with knowledge of human behavior, a key skill needed to deploy partially automated systems like Autopilot, which cannot drive themselves and require humans to be ready to intervene at all times.

“If you want to have a technology that relies on human interaction, you better have someone on your team who knows what he or she is doing in that area,” she said.


Cummings said her research has shown that once a driving system takes over from the human driver, there is little left for the human brain to do, and many drivers tend to over-rely on the system and tune out.

“You can have your head fixed in one position, you can potentially keep your eyes on the road, and your mind can be a million miles away,” she said. “All the driver monitoring technologies in the world still won’t force you to pay attention.”

Is Autopilot on or off?

In its letter, NHTSA also asks Tesla how the recall addresses driver confusion over whether Autopilot shuts off when force is applied to the steering wheel. Previously, if Autopilot turned off, drivers might not quickly notice that they had to take over driving.

The recall added a feature that provides a “more pronounced delay” to alert drivers when Autopilot is disabled. But the recall doesn’t automatically activate the feature; drivers must turn it on themselves. Investigators asked how many drivers have taken that step.

NHTSA is essentially asking Tesla, “What do you mean you have a remedy and it’s not actually turned on?” Koopman said.

The letter shows that NHTSA is investigating whether Tesla ran tests to ensure the fixes actually worked. “When I looked at the remedy, I found it hard to believe that there is a lot of analysis showing that it will improve safety,” Koopman said.

The agency also says Tesla pushed out additional safety updates after the recall was issued, including one intended to reduce crashes caused by hydroplaning and another to reduce collisions in high-speed lanes. NHTSA said it will investigate why Tesla did not include those updates in the original recall.


NHTSA could seek further recalls, limit where Tesla’s Autopilot can operate, or even force the company to disable the system until it is fixed, safety experts said.

NHTSA began its Autopilot investigation in 2021 after receiving 11 reports of Teslas using Autopilot hitting parked emergency vehicles. In documents explaining why the investigation was closed because of the recall, NHTSA said it ultimately found 467 crashes involving Autopilot, resulting in 54 injuries and 14 deaths.
