Tech News

When Tesla's FSD works well, it gets credit. When it doesn't, you get blamed


Photo by Monroe County Sheriff's Department, via New York Times

Tesla has engaged in a pattern of taking credit for the successes of its Full Self-Driving (FSD) software, even though the car still relies on an attentive driver, and yet blaming the driver rather than the software whenever things go badly.

But new moves towards allowing more distracted driving could make it harder for the company to blame drivers when its software fails.

Tesla has been marketing some version of its Autopilot or FSD software since 2013. Ever since, the company has made bold pronouncements about how rapidly the software would improve, stating almost continually that fully autonomous driving would arrive within a year.

The definitions of Autopilot and FSD have shifted over time: basic Autopilot was initially an option and is now included on most vehicles, while FSD has remained an additional cost on top of that, at varying prices (up to $15,000 at one point).


In general, Autopilot has promised to be a driver’s aid, while FSD has promised to allow the car to fully drive itself with no human intervention when the software is finally ready.

That fully autonomous ability has yet to be delivered, but Tesla’s software does continue to improve.

At first, Autopilot was only active on highways, as sort of a “smart cruise control” system. It could hold the car in a lane, track the speed of vehicles ahead, and match them.

Over time, the systems have gained more capabilities, including the ability to follow the car’s navigation system and take highway interchanges on their own. And throughout all this time, Teslas have very often been referred to colloquially as “self-driving cars.”

... continue reading