Investigation points to drivers’ responsibility in accidents


The National Highway Traffic Safety Administration (NHTSA) concluded on Friday (the 26th) a lengthy investigation into Tesla’s Autopilot. As reported by Engadget, after analyzing the most serious Autopilot-related accidents, the agency determined that they occurred due to improper use of the system by the driver — but there are caveats.

What do you need to know:

  • Despite this, the agency also found that Tesla did not do enough to keep drivers attentive; in other words, it did not make it clear that they needed to keep paying attention to traffic even with Autopilot engaged;
  • The organization investigated almost a thousand accidents between January 2018 and August 2023, accounting for 29 deaths in total — but most of them either did not have enough data for analysis, or did not have Autopilot engaged;
  • Of those accidents, 211 were classified as the most serious. All had Autopilot or FSD (the more autonomous version of the Autopilot software) engaged;
  • Among the 211, 13 were fatal, with 14 deaths in total and 49 injuries;
  • In the agency’s view, drivers had enough time to react but failed to do so in 78 of the incidents; even so, Tesla would bear primary responsibility for this.


Crashes in which the driver attempted no evasive action, or attempted it too late, were found across all hardware versions and crash circumstances involving the models.

NHTSA, in a statement.

The department noted that, due to driver expectations and the actual operational capabilities of Autopilot, a “critical security gap” was created, leading to “predictable misuse and preventable accidents”.

However, although drivers did not use the system with the required attention, the company also failed to keep them “sufficiently engaged in the driving task”, and for the agency this is the primary failing behind the others.

Tesla has stated on several occasions that it warns customers to pay attention when using Autopilot and FSD, including notifications reminding them to keep their hands on the steering wheel and eyes on the road.

For the agency, the current protocols are not enough. In light of the investigation’s findings, the NHTSA officially opened a second investigation into the automaker, this one to evaluate the corrections made in December, after two million vehicles were recalled. The goal is to determine whether the Autopilot update Tesla has already rolled out is effective enough.

Tesla’s Autopilot, false advertising?

The agency’s investigation also pointed to a problem with how Tesla describes Autopilot in its advertising. While the company’s technical documentation classifies the system as “driver assistance”, its marketing sells it as an autonomous pilot, which would be misleading, according to the NHTSA.

As a result, the California Attorney General’s Office and the state Department of Motor Vehicles will also investigate Tesla over misleading branding and marketing.




