Autopilot and Full Self-Driving linked to hundreds of accidents and dozens of deaths

Tesla remains at the center of scrutiny over accidents and deaths allegedly caused by drivers' overconfidence in its cars. The American regulator NHTSA even states that drivers are led to believe these electric cars are more capable than they actually are. Autopilot and Full Self-Driving, from Elon Musk's company, have been linked to hundreds of accidents and dozens of deaths.

Autopilot and Full Self-Driving in trouble

In March 2023, a North Carolina student was getting off a school bus when he was struck by a Tesla Model Y traveling at “freeway speeds,” according to a federal investigation published April 25. The Tesla driver was using Autopilot, the company's advanced driver assistance feature that Elon Musk insists will eventually lead to fully autonomous cars.

As reported by The Verge, the 17-year-old student who was struck was airlifted to a hospital with serious injuries. But what the investigation found after examining hundreds of similar accidents was a pattern of driver inattention, combined with the shortcomings of Tesla's technology, which resulted in hundreds of injuries and dozens of deaths.

Drivers using Autopilot or the system's more advanced sibling, Full Self-Driving, “were not sufficiently engaged in the driving task” and Tesla's technology “did not adequately ensure that drivers maintained their attention on the driving task.”

NHTSA concluded.

Image: a crashed Tesla Model 3, with Autopilot under investigation

In total, the NHTSA investigated 956 accidents, from January 2018 through August 2023. In these accidents, some of which involved other vehicles striking the Tesla, 29 people died.

There were also 211 accidents in which “the Tesla's frontal plane struck a vehicle or obstacle in its path.” These accidents, which were often the most serious, caused 14 deaths and 49 injuries.

Tesla cars investigated for crashing into parked emergency vehicles

The NHTSA was prompted to launch its investigation after several incidents in which Tesla drivers crashed into emergency vehicles parked on the side of the road. Most of these incidents occurred after dark, with the software ignoring scene control measures, including warning lights, flares, cones and an illuminated arrow board.

In its report, the agency concluded that Autopilot – and in some cases, FSD – was not designed to keep the driver engaged in the task of driving. Tesla says it warns its customers that they need to pay attention while using Autopilot and FSD, which includes keeping their hands on the steering wheel and their eyes on the road.

However, NHTSA states that in many cases, drivers became too complacent and lost concentration. And when the time came to react, it was often too late.

In 59 crashes examined by NHTSA, the agency concluded that Tesla drivers had enough time, “five or more seconds,” to react before colliding with another object. In 19 of these accidents, the hazard was visible for 10 or more seconds before the collision.

Analyzing crash records and data provided by Tesla, NHTSA found that in most of the accidents it examined, drivers did not brake or steer to avoid the hazard.

Accidents without any evasive action or with a delayed attempt at evasive action by the driver were found in all Tesla hardware versions and in all accident circumstances.

NHTSA stated.

Overrated Level 2 “autonomy”

NHTSA also compared Tesla's Level 2 (L2) automation capabilities to products available in other companies' vehicles. Unlike other systems, Autopilot would disengage rather than allow drivers to adjust their steering. According to the regulator, this behavior “discourages” drivers from staying involved in the task of driving.

A comparison between Tesla's design choices and those of its L2 peers identified Tesla as an outlier in its approach to L2 technology, combining a weak driver engagement system with the permissive operational capabilities of Autopilot.

The agency stated.

Even the brand name “Autopilot” is misleading, NHTSA said, evoking the idea that the driver is not in control. While other companies use some version of “assist” or “assistance”, Tesla's branding leads drivers to think its products are more capable than they are. The California Attorney General and the state Department of Motor Vehicles are investigating Tesla for deceptive branding and marketing.

The NHTSA acknowledges that its investigation may be incomplete because of “gaps” in Tesla's telemetry data. That could mean there are many more accidents involving Autopilot and FSD than the NHTSA was able to find.

Tesla issued a voluntary recall late last year in response to the investigation, releasing an over-the-air software update that added more warnings to Autopilot. NHTSA also said it is opening a new investigation into the recall after several safety experts said the update was inadequate and still allowed misuse.

The conclusions contradict Musk's insistence that Tesla is an artificial intelligence company on the verge of launching a fully autonomous vehicle for personal use. The company plans to unveil a robotaxi later this year, which is supposed to usher in this new era for Tesla.

During this week's first-quarter earnings call, Musk reinforced the notion that his vehicles were safer than cars driven by humans.

If you have, at scale, a statistically significant amount of data that shows conclusively that the autonomous car has, say, half the accident rate of a human-driven car, I think it's hard to ignore. Because, at this point, stopping autonomy means killing people.

Elon Musk said.


