Federal investigators have launched a wide-ranging review of nearly 2.9 million Tesla vehicles equipped with the company’s Full Self-Driving (FSD) software. This move comes after several reports that the software caused cars to run red lights, veer into oncoming lanes, and crash.
The National Highway Traffic Safety Administration (NHTSA) said it is examining 58 incidents in which the FSD system allegedly violated traffic laws, including 14 crashes that injured 23 people. In six cases, Teslas reportedly entered intersections against red signals, and four of those crashes caused injuries.
FSD is classified as a Level 2 driver-assistance system, meaning the driver must stay attentive and be ready to take control at all times. Along with reviewing these crashes, regulators are examining how the system performs near railroad crossings, following recent reports of near misses.
This investigation is a preliminary step and could lead to a recall if officials find that the software poses a serious safety risk. Tesla recently released a software update but has not publicly addressed the investigation.
This federal inquiry adds to growing scrutiny of Tesla’s automated driving features. Critics have questioned the company’s use of names like "Full Self-Driving" and "Autopilot," saying they suggest more advanced automation than what the technology actually offers. Tesla insists that drivers must remain alert and supervise the system.
The probe comes at a tense time for Tesla. In August, a Miami jury found Tesla's Autopilot system partly responsible for a fatal 2019 crash, awarding more than $240 million in damages. Tesla plans to appeal the verdict.
Government officials and lawmakers are paying more attention to automated driving tech, questioning whether safety claims keep pace with real-world performance. This scrutiny has increased under the leadership of the new NHTSA administrator.
Tesla faces other investigations as well. Earlier this year, the agency examined 2.6 million cars with a remote "summon" feature that lets owners move their vehicles with no one inside; the feature has been linked to parking-lot accidents. Another probe covers 2.4 million Teslas with FSD after crashes in fog and glare, including a fatal one. Officials are also checking whether Tesla failed to report required accident data.
The outcome of this investigation matters greatly for insurance companies. Until now, drivers have largely borne responsibility for crashes, covered by their personal auto insurance. But if crashes are found to stem from software flaws, the responsibility, and the costs, could shift toward Tesla and its insurers. That would change how coverage is priced and handled, with insurers factoring in risks tied to the software, not just the driver.
Some insurance experts say this might lead to new policy types that mix car insurance with technology-related liability coverage.
Tesla’s stock dropped about 2% after the investigation became public. Investors are growing wary of legal problems and the risks tied to automated systems, which once gave Tesla a big edge.
As the NHTSA investigation continues, regulators, courts, and insurers will watch closely for any recalls or rules that set new standards for who is responsible when software, not a human, is behind the wheel.