Tesla’s Autopilot linked to more crashes even after massive recall

The National Highway Traffic Safety Administration said Friday that a two-year investigation into Tesla’s Autopilot had identified hundreds of crashes — 13 of them fatal — and that it is launching a new probe after confirming that collisions continued even after a recall that the company said would fix the system.

In documents dated Thursday, the agency gave the fullest account of findings about the risks of the driver-assistance feature, saying in some circumstances “system controls and warnings were insufficient.”

The agency said it is now reviewing whether Tesla’s fixes during the December recall of 2 million vehicles went far enough. Investigators disclosed 20 crashes involving vehicles that had received the software updates issued as part of the recall.

Tesla agreed to that recall following a NHTSA inquiry into whether Autopilot had enough safeguards to keep drivers alert while the system was engaged. In its recall notice, the agency said it found that Autopilot’s key Autosteer feature may not have sufficient controls to “prevent driver misuse,” such as using the feature outside the controlled-access highways for which it was designed.

Tesla disputed the agency’s criticisms at the time but said it had addressed the issue with software updates that added alerts reminding drivers to pay attention while using the automated driving system. The company did not limit where the system could operate, which experts at the time said would have been a better fix.

NHTSA’s new action comes after testing the cars at its facility in Ohio. It said parts of Tesla’s software solution required drivers to opt in and could be easily reversed. The agency said it plans to examine “the prominence and scope of Autopilot controls to address misuse, mode confusion, or usage in environments the system is not designed for.”

A NHTSA database of crashes involving the system does not include details on whether anyone was killed in any of the 20 new crashes. But in Washington state last week, the driver of a Tesla told police he was using Autopilot when he hit a motorbike, killing its 28-year-old rider, according to local news accounts.

The summary of the initial investigation reflects the intense scrutiny Tesla is under from federal regulators. NHTSA said the very name Autopilot was problematic: “This terminology may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation.”

The agency reviewed a total of 956 crashes in which Autopilot was suspected of being involved and focused on 467 of those, including the 13 fatal incidents and dozens of others involving serious injuries.

NHTSA said the crashes fell into three broad categories: frontal collisions, loss of traction on wet roads and the inadvertent disabling of Autopilot’s steering function. The most severe category comprised hundreds of crashes in which the front of a Tesla struck something, often at high speed. In some of those cases, investigators concluded an attentive driver would have avoided the collision or at least taken action to reduce its severity.

The agency gave as an example a fatal July 2023 crash in Warrenton, Va., in which a Tesla Model Y ran into a turning tractor-trailer, saying the Tesla driver could have seen the truck with enough time to avoid a crash.

But the new probe also underscores NHTSA’s limited authority. The agency can investigate safety problems and order recalls, but it cannot tell vehicle manufacturers how to fix the issues that investigators uncover.

Ann Carlson, who led the agency when the recall was announced, said the new review was evidence of regulators continuing to scrutinize Tesla.

“NHTSA is doing its job as authorized, and indeed required, under the statute,” said Carlson, who is now a law professor at UCLA.

Tesla did not respond to a request for comment Friday.

Sens. Edward J. Markey (D-Mass.) and Richard Blumenthal (D-Conn.) wrote to NHTSA last week, calling on the agency to do more to hold Tesla accountable. They said in a statement Friday they were encouraged by the opening of a new review.

“Within days of Tesla’s December recall of Autopilot, it became clear the company had not done enough to protect drivers and all road users from the dangers of this unsafe technology,” the senators said.

Autopilot is included in nearly every Tesla. The recall review encompasses 2012 to 2024 Model Y, S, X and 3 vehicles and the Cybertruck pickup. When activated by the driver, the system will steer on streets, follow a set course on freeways, and maintain a set speed and distance without human input. But Tesla warns drivers to remain alert and keep their hands on the wheel.

Drivers can also purchase a package called Full Self-Driving that allows vehicles to respond to traffic signals and follow turn-by-turn directions on surface streets.

The agency’s new action comes on the heels of a grim earnings report this week; Tesla reported a steeper-than-expected 55 percent plunge in profit amid lagging sales and increased competition. CEO Elon Musk has staked the company’s future on autonomous driving, recently promising to unveil a fully self-driving robotaxi in August.

At the same time, the company faces lawsuits that allege Tesla exaggerated the true capabilities of its Autopilot technology and created a false sense of security for drivers who died or were seriously injured in crashes.

The company settled a lawsuit this month over the 2018 death of a former Apple engineer, who was killed when his vehicle veered off a highway in Northern California. The lane lines had faded and the car picked up on the wrong line, steering itself into a highway barrier at about 70 mph, according to the company’s lawyers.

In court documents, the company maintains it is not liable for the crashes because it repeatedly warns drivers to remain in control of the vehicle.

December’s recall followed a Washington Post investigation that identified at least eight fatal or serious crashes on roads where Autopilot was not designed to be used.

The National Transportation Safety Board and others have asked federal regulators to force the company to add interventions in the system’s software limiting its use to settings where it was designed to operate. NHTSA has rejected that approach as too complex and resource intensive.

Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies, said Tesla’s recall remedy was inadequate, as it failed to limit Autopilot only to where it is meant to be used. Despite the additional alerts Tesla added after the recall, the feature can still be activated outside of controlled-access highways, including in dense urban areas with cross traffic, frequent stoplights and stop signs, and roads without clear lane markings. NHTSA said it neither negotiates remedies before recalls nor preapproves them.

Wansley said NHTSA is in a tricky political position when it comes to Tesla, because the company is the largest manufacturer of electric vehicles in the United States at a time when the Biden administration is pushing for a transition to battery-powered cars.

“This was always a more high stakes recall, and NHTSA had strong pressure to do something that Tesla would agree with rather than fight about it,” he said. “But even with those constraints, [NHTSA] still could have gone further.”

Tesla critic Dan O’Dowd, who has pushed for the company’s software to be banned through his advocacy group the Dawn Project, said NHTSA “must go further.”

“Only through the threat of a blanket ban will Tesla take action to address critical safety defects in its self-driving software,” O’Dowd said.


