Is it a good sign that Tesla keeps getting probed by regulators because its self-driving cars are horrible drivers?
You’ll never guess the reason we’re asking.
On Thursday, the US National Highway Traffic Safety Administration said it’s launching an investigation into 2.88 million Tesla vehicles equipped with the company’s dubiously labeled “Full Self-Driving” mode, after receiving reports of the cars blowing off traffic laws and getting into crashes, Reuters reports.
It’s another bump in the road for the Elon Musk-led automaker, which has struggled to perfect its self-driving tech for years, putting itself under even more pressure to deliver by launching a limited “Robotaxi” service this summer.
According to Reuters, the regulator is reviewing 58 reports of a Tesla running FSD appearing to violate traffic laws. Of the incidents, 14 involved crashes and 23 involved injuries.
In six of these crashes, FSD “approached an intersection with a red traffic signal, continued to travel into the intersection against the red light and was subsequently involved in a crash with other motor vehicles in the intersection,” the NHTSA said, per Reuters.
In 2024, one Houston driver complained to the agency that FSD “is not recognizing traffic signals,” causing it to run through red lights and stop at green lights.
“Tesla doesn’t want to fix it, or even acknowledge the problem,” the driver vented, “even though they’ve done a test drive with me and seen the issue with their own eyes.”
The agency is also reviewing another shocking habit of the driving software that’s recently gotten more attention: driving straight into oncoming trains, and otherwise going haywire at railroad crossings.
This comes as Tesla is already mired in numerous legal battles and official inquiries. The NHTSA has been investigating Tesla’s less advanced driving system, Autopilot, for over a year. In August, a jury found Tesla partially responsible for a deadly accident in which a car running Autopilot struck and killed a woman, ordering the automaker to pay $329 million in damages.
The agency is also conducting a separate investigation into FSD, launched last October, covering several crashes that occurred in poor visibility conditions. Footage of one of the incidents showed FSD’s camera obscured by sunlight before the car plowed into an elderly woman on the side of the road, killing her.
Meanwhile, its lackluster Robotaxi service, which is currently a limited operation in Austin, almost immediately drew scrutiny from the NHTSA after passenger videos showed the self-driving cabs — which come with a human “safety monitor” sitting shotgun — careening over the speed limit and driving erratically.
Adding to the pile, Tesla is embroiled in legal issues with California regulators. The state DMV sued Tesla for false advertising because of FSD’s misleading branding, since, despite being called “Full Self-Driving,” the software can’t fully drive by itself and requires constant human supervision to avoid getting into accidents. Last year, Tesla added “(Supervised)” to the driving mode’s official name.
The false advertising suit also focuses on statements made by Musk, who has a proclivity for making improbable promises and overstating the capabilities of his products. He has long openly claimed that Tesla cars can “drive themselves,” while promising that fully autonomous driving is right around the corner.
As it stands, this latest investigation from the NHTSA is at a preliminary stage, but a recall could follow if it concludes that the cars are a safety risk. And, like, come on.
More on Tesla: Tesla Tells Sleepy Drivers to Switch to Its Self-Driving Mode That Needs to Be Monitored Constantly So It Doesn’t Cause a Fatal Accident