Tesla's Autopilot under scrutiny: flawed system linked to fatal crashes
Image source: © Pixabay

1 May 2024 20:58

The US National Highway Traffic Safety Administration (NHTSA) has released a report on its investigation into accidents involving Tesla vehicles. The report finds that the Autopilot feature was defective and contributed to fatal incidents.

Tesla's electric vehicles have become popular for their innovative features. Autopilot, in particular, has played a significant part in changing how we envision the future of travel.

However, the reality may be different. In 2021, the NHTSA began investigating Tesla's Autopilot feature. The findings are worrying and add to the brand's technical troubles, including the recent revelations about faulty accelerator pedals in the Tesla Cybertruck.

Findings from the Traffic Safety Agency

The NHTSA reviewed accidents involving Tesla vehicles from January 2018 to August 2023, gathering data on 956 incidents that resulted in the loss of 29 lives, as reported by The Verge.

Furthermore, there were 211 incidents in which a Tesla collided head-on with another vehicle or object. These often severe collisions resulted in 14 deaths and injured 49 other people. Most of them occurred at night, with the vehicle's software ignoring warnings such as lights, flares, cones, and illuminated signs.

The NHTSA acknowledged that its investigation may have missed some details due to gaps in telemetry data, so the true number of accidents involving Autopilot and the FSD system may be higher than reported.

Autopilot Proven to be Flawed

The report indicates that the Autopilot system is flawed, leading to predictable misuse and avoidable failures. The primary concern is the weak driver engagement system, which is not matched to Autopilot's permissive operating capabilities and therefore creates significant safety risks.

In 59 of the examined crashes, Tesla drivers had enough time (over five seconds) to react. However, after reviewing accident reports and data provided by Tesla, the NHTSA found that in most cases drivers did not make the necessary manoeuvres, such as braking or steering, to avoid a collision.

Unlike rival systems, Tesla's Autopilot disengages when the driver starts to steer manually instead of remaining active while the driver adjusts the car's course. The NHTSA states that this discourages drivers from staying attentively involved in the driving task.

The Term 'Autopilot' is Misleading

Tesla warns its users to stay cautious when using Autopilot and the Full Self-Driving (FSD) system, which means keeping their hands on the wheel and eyes on the road at all times.

However, the NHTSA believes that the term "Autopilot" may mislead users into thinking the car can operate without the driver's full attention. While other manufacturers use terms like "assistance," Tesla's wording could inadvertently tempt drivers with promises of capabilities the system does not have.

The Attorney General of California and the state Department of Motor Vehicles are looking into Tesla following allegations of potentially misleading marketing and branding.
