Federal regulators have turned their attention back to Tesla, this time focusing on a new update to its driver-assistance software called “Mad Max.” Early users report that this mode lets Tesla cars drive faster and make sharper maneuvers than before.
The National Highway Traffic Safety Administration (NHTSA) has reached out to Tesla for information about this feature, which is part of the company’s Full Self-Driving (FSD) system. The agency wants to find out whether “Mad Max” encourages risky behavior like speeding or sudden lane changes that break traffic laws.
NHTSA reminded drivers that even with these systems, the person behind the wheel is still responsible for driving safely and following the rules. This inquiry comes as part of a larger investigation into around three million Tesla vehicles with FSD, prompted by numerous complaints. Some reports mention cars running red lights or entering intersections when signals were against them, leading to at least 14 crashes and 23 injuries, according to federal data.
Tesla has not made any public comment on this latest review. The company has said before that its Full Self-Driving system is meant to assist, not replace, an attentive driver. Tesla promotes the system as being able to handle many driving situations — but only with a human supervising closely.
The “Mad Max” mode first appeared back in 2018 during Tesla’s early Autopilot tests. Named after a movie known for wild car chases, it’s meant to let the car make quicker lane changes and drive more aggressively. Tesla quietly brought back this mode in a recent software update called FSD Version 14.1.2, alongside a slower “Sloth Mode.”
Videos on social media show cars in “Mad Max” mode speeding well above local limits and rolling through stop signs. This echoes earlier concerns that led Tesla to recall over 50,000 vehicles in 2022 to remove a feature allowing “rolling stops.”
People in the insurance world are already worried. As driver-assistance tech gets more advanced, figuring out who’s at fault in accidents gets trickier. Tesla’s system is officially “Level 2 automation,” which means a human must always be ready to take control. But calling it “Full Self-Driving” or adding modes like “Mad Max” can confuse drivers and insurance companies about where human error ends and product fault begins.
Some insurers have noticed more claims involving cars with these advanced features. Repairs to their sensors can be expensive, and disputes over how the system acted before a crash add to the headache. Introducing a mode designed to push the car to speed faster only makes these problems worse.
NHTSA has been watching automated driving systems more closely this past year, but so far its actions have mostly come after crashes or complaints. In other parts of the world, like Europe and Asia, regulators have set stricter rules for automated lane changes and speed limits.
If U.S. officials find that “Mad Max” encourages illegal driving, Tesla might face another recall. That could also kick off a wider debate on how automakers name and advertise these systems.
For insurers, the Tesla case is about more than one company’s software. It shows how quickly technology is changing how risks are measured and priced. As one claims manager put it, it’s tough to predict driver behavior when it can shift overnight with a software update.
As regulators continue their review, their decisions could shape how the U.S. insurance industry handles the risks of automation — not just for Tesla, but for every carmaker pushing self-driving technology.