Tesla's Full Self-Driving Software Is Being Investigated by Federal Safety Regulators

America's top car safety regulator has opened a new investigation into Tesla's so-called "Full Self-Driving (Supervised)" software after four reported low-visibility crashes, one of them fatal: a pedestrian was killed.

The National Highway Traffic Safety Administration's Office of Defects Investigation announced Friday that it is investigating the driver assistance system to determine whether it can "detect and respond appropriately to reduced roadway visibility conditions," such as "sun glare, fog, or airborne dust." It also wants to know whether other crashes have occurred in those conditions beyond those already reported.

The probe comes just a week after Tesla CEO Elon Musk unveiled the prototype of his company's "Cybercab," a two-seater car that he said is meant to serve as a robotaxi, years after the originally promised date. Musk also said at the event that Tesla's Model 3 sedan and Model Y SUV will, at some point in 2025, be able to drive themselves without human supervision anywhere in California and Texas, although he did not explain how that would be accomplished.

In April, NHTSA closed a nearly three-year probe into Autopilot, the less-capable driver assistance software that Tesla offers, after investigating almost 500 crashes where the system was active. The agency said it found 13 of those crashes were fatal. Around the same time that it closed that probe, NHTSA opened a new investigation into the recall fix that Tesla had issued to address problems with Autopilot.

Tesla's software faces other legal controversies as well. The US government is probing claims Tesla has made about its driver-assist capabilities, and the California Department of Motor Vehicles has accused Tesla of overstating its software's abilities.

Autopilot-related crashes have also brought the company a spate of lawsuits; it settled one of the more high-profile cases, which had been set to go to trial, earlier this year. Tesla has said in the past that it makes drivers aware that they must constantly monitor Full Self-Driving and Autopilot and be ready to take over at any moment.

The newly filed complaint lists four crashes in which Full Self-Driving (Supervised) was engaged, all occurring between November 2023 and May 2024.

The first crash took place in November 2023 in Rimrock, Arizona, where a Model Y struck and killed a pedestrian. The second occurred in January 2024 in Nipton, California, when a Model 3 veered into another car on the highway during a dust storm. In March 2024, a Model 3 ran into another vehicle on the highway near Red Mills, Virginia, in cloudy weather. And in May 2024, a Model 3 crashed into a stationary object on a rural road in Collinsville, Ohio, in foggy conditions. According to reports from NHTSA, at least one person was injured in the May 2024 collision.

NHTSA's defects investigations team divides its investigations into four types: Defect Petition, Preliminary Evaluation, Recall Query and Engineering Analysis. The agency classified this new investigation as a Preliminary Evaluation, a type of probe it normally tries to complete within eight months.

Blog | 2024-10-19 17:49:47