
Tesla hits Musk’s threshold for ‘safe unsupervised’ driving

Why This Matters

Tesla has surpassed 10 billion miles driven with its Full Self-Driving (Supervised) system, meeting Elon Musk's stated threshold for enabling "safe unsupervised" driving. The system nonetheless remains a Level 2 driver-assist feature that requires human oversight, and no transition to fully autonomous, unsupervised driving has occurred. The milestone raises important questions about liability, safety, and regulatory approval as autonomous vehicle technology evolves.


The author is a transportation editor with 10+ years of experience covering EVs, public transportation, and aviation. His work has appeared in The New York Daily News and City & State.


We’ve crossed yet another one of Elon Musk’s self-driving thresholds. Tesla’s fleet of vehicles using the company’s Full Self-Driving (Supervised) system has driven over 10 billion miles, according to the company’s updated safety page. That means the company has crossed the line Musk set earlier this year for “safe unsupervised” driving.

But Tesla owners did not suddenly wake up today to find their FSD (Supervised) vehicles transformed into FSD (Unsupervised) ones. FSD is still just a Level 2 system that requires a fully attentive human driver behind the wheel to monitor the road and be prepared to take over at any moment.

In January, Musk said on X that “roughly 10 billion miles of training data is needed to achieve safe unsupervised self-driving” — the implication being that once Tesla reached that milestone, the company would flip the switch and all its customers would suddenly have access to an unsupervised driving system.


Of course, that would be an enormously risky move for Tesla, especially given the open questions about the company's willingness to accept legal responsibility for the more than a million vehicles equipped with FSD. When a Waymo vehicle is responsible for a crash, Waymo assumes liability because it owns both the technology and the fleet. Tesla's terms of service, by contrast, put liability on the owner, based largely on its characterization of FSD as a Level 2 supervised system. What happens when FSD goes unsupervised? Who assumes responsibility for a crash then?

It’s not clear that Tesla has figured that out yet. Over the years, there have been hundreds of crashes involving Tesla’s partially autonomous features and dozens of fatalities. But the company has been able to avoid liability, either by settling with victims or convincing courts to dismiss the lawsuits. On its website, Tesla maintains that FSD (Supervised) “requires active driver supervision and does not make the vehicle autonomous.”

Still, it’s worth acknowledging the scale of the accomplishment: 10 billion miles driven with FSD (Supervised). Tesla claims that its FSD-equipped vehicles travel 5.5 million miles on average before a major collision, compared with 660,000 miles for the average US driver, and touts this as evidence that FSD is safer than human driving.

