Tesla Under Investigation For Major Traffic Violations

NHTSA launches probe into more than 50 incidents in which Tesla’s $199-per-month FSD system caused injuries and traffic violations

By C. da Costa


Image credit: Wikimedia

Key Takeaways

  • NHTSA investigates Tesla’s FSD software after 50+ incidents involving red light violations
  • Four people injured as FSD caused wrong-way driving and illegal lane changes
  • Investigation could escalate to a recall within eight months, affecting $199-per-month subscribers

Red lights should be pretty straightforward for a system called “Full Self-Driving,” but Tesla’s FSD software apparently missed that memo. Federal safety regulators just opened an investigation into more than 50 incidents where Tesla’s most advanced driver assistance system violated basic traffic laws—including running red lights, driving the wrong way, and making illegal lane changes. Four people got injured along the way.

If you’re one of the hundreds of thousands paying Tesla’s $199 monthly FSD subscription or dropped the full purchase price, this investigation directly affects you. The National Highway Traffic Safety Administration documented everything from vehicles blowing through red lights to cars turning into oncoming traffic despite visible wrong-way signs.

The violations weren’t random glitches: multiple incidents occurred at the same intersection in Joppa, Maryland, suggesting systematic software failures rather than edge cases. NHTSA documented at least 18 complaints of FSD running red lights or failing to stop completely, along with numerous reports of FSD causing cars to:

  • Enter oncoming traffic
  • Cross double-yellow lines
  • Attempt wrong-way turns despite correct signage

Lane discipline failures include:

  • Driving straight through intersections from turn-only lanes
  • Turning from through lanes with FSD engaged

NHTSA’s Preliminary Evaluation represents the first formal step in their defect investigation process. Think of it like the regulatory equivalent of a criminal investigation’s initial evidence gathering.

The agency typically completes these evaluations within eight months, potentially escalating to an Engineering Analysis and ultimately a recall determination if they confirm safety defects. Tesla already pushed a software update to address the Maryland intersection issues, but that reactive approach highlights the broader concern about using public roads as testing grounds.

This investigation targets FSD specifically, unlike NHTSA’s previous scrutiny of Tesla’s basic Autopilot system. The distinction matters because FSD costs significantly more and promises urban navigation capabilities that Autopilot doesn’t attempt.

Tesla’s Autopilot handles mainly highway driving, while FSD is designed for complex urban scenarios like traffic lights and intersections. When NHTSA closed its Autopilot investigation in April 2024 after identifying 13 fatal crashes, Tesla’s stock jumped 3%. This time feels different: the violations involve fundamental traffic safety principles that any driving system should master before deployment.

The timing stings too. NHTSA launched this investigation the same week Elon Musk released FSD’s latest version, which he’d been hyping for months as incorporating data from Tesla’s robotaxi pilots.

That’s like Netflix releasing a highly anticipated series just as reports surface about streaming quality issues.

Your FSD usage decision just got more complicated. The system still requires your full attention and readiness to intervene, but these documented violations suggest even alert drivers might not get sufficient warning when FSD makes dangerous moves.

Tesla relies on camera-based monitoring to check driver attention, but effectiveness is in question according to NHTSA and industry analysts. Until this investigation concludes, you’re essentially beta testing alongside federal regulators—and sharing roads with everyone else who didn’t sign up for that experiment.
