Safety Expert Testifies That Tesla Did Not Prevent Autopilot Misuse

A trial is underway in Miami federal court over a deadly Tesla crash that happened in 2019, raising tough questions about the safety of the company’s Autopilot system. At the center of the case is whether Tesla did enough to prevent drivers from misusing its driver-assist technology.

Mary “Missy” Cummings, an engineering professor at George Mason University and a former adviser to the National Highway Traffic Safety Administration, told the jury that Tesla’s owner’s manual, which contains important warnings about Autopilot, is hard for drivers to find and understand. She noted that before the crash, Tesla was already aware that drivers were ignoring the system’s warnings. Unlike other carmakers, Tesla had not implemented geofencing, which blocks the use of driver-assist features on roads they aren’t designed for.

When asked why Tesla hadn’t adopted this safety measure in 2019, Cummings said she believes the company avoided it to help sell more cars. Tesla has not commented on her testimony. The trial is expected to last three weeks, with Tesla’s lawyers set to question Cummings when she returns to the witness stand.

The lawsuit was filed on behalf of Naibel Benavides Leon, who died in the crash, and Dillon Angulo, who was seriously injured. The incident involved a Tesla Model S that ran off the road at a T-intersection in Key Largo, Florida, striking their parked Chevrolet Tahoe while the two stood beside it.

Plaintiffs argue that Tesla’s system is defective and that the company failed to adequately warn drivers about its limits. Tesla’s defense is that the crash was caused by driver error, an argument the company has made successfully in similar cases in California. The Tesla driver, George McGee, had Autopilot engaged but was distracted after dropping his phone and reaching down to pick it up.

The plaintiffs’ lawyers showed the jury video footage from the Tesla’s cameras. The clips reveal that the system detected the end of the road, a stop sign, the parked vehicle, and a pedestrian nearby, yet it did not react to prevent the crash. The lawyers called it a “preventable tragedy.” Tesla says that no technology available in 2019 could have stopped the crash and that McGee bears full responsibility. The company said he pressed the accelerator and turned off the car’s adaptive cruise control right before leaving the road.

Cummings also examined a Tesla letter claiming its Autopilot had the “most robust” warnings against misuse in the auto industry. She told the court she saw no proof supporting that claim. When she joined NHTSA as an adviser in 2021, Tesla CEO Elon Musk accused her of bias against the company, and some Tesla fans even petitioned against her.

This isn’t Cummings’ first time testifying on Autopilot safety. She has been an expert witness in two other lawsuits related to Tesla’s system.

After the crash, McGee said he treated the car like a co-pilot that would stop for obstacles. Cummings explained that many Tesla drivers share this belief, trusting the car to handle the driving long enough to look away briefly, as McGee did when he reached down for his phone.

This trial is closely watched because Tesla continues to push toward a future of self-driving vehicles that could operate as robotaxis. How the court rules could affect not only this case but the company’s claims about the safety of its technology going forward.

Author


Patricia Wells investigates niche and specialty lines—everything from pet insurance to collectibles—so hobbyists know exactly how to protect what they love.