Elon Musk has long promised fans that Tesla cars can already drive themselves, when the reality is that they still need constant supervision. Now, a woman is suing Tesla after her self-driving Cybertruck allegedly tried to drive her off the side of a bridge, Chron and the Austin American-Statesman report.
While running its “Full Self-Driving” feature, the unorthodox pickup truck “suddenly and without warning” veered toward the edge of a Houston overpass, according to the lawsuit filed by the owner, Justine Saint Amour.
Dashcam footage of the incident, which took place in August 2025, shows the Cybertruck accelerating up the overpass ramp. But as it reaches a curve in the road that connects to a Y-shaped interchange, the vehicle doesn’t slow down in time, barrels through traffic cones that separated the lanes, and slams into a concrete sidewall. The truck violently spins around as pieces of its hood fly onto the road.
The Cybertruck “attempted to drive straight ahead into the concrete barrier and the freeway below,” the lawsuit claims.
A Houston driver was in a Cybertruck in August 2025 when the Autopilot-controlled vehicle drove straight into a concrete barrier on a Y-shaped overpass on 69 Eastex Freeway. The vehicle was expected to follow a curve to the right, but when it failed to do so, she disengaged the… pic.twitter.com/LopI3y5elg — Austin Statesman (@statesman) March 6, 2026
Saint Amour said she disengaged FSD as the truck sped up the ramp and tried to take control herself, but had little time to react. The impact with the concrete barrier caused “substantial” injuries to her neck, shoulders, and back. She was diagnosed with two herniated discs in her lower back and another in her neck, sprained tendons in her wrist, and numbness and weakness in her right hand, the suit states.
The crash is the latest incident that illustrates the dubious capabilities of Tesla’s self-driving systems, which regularly draw the scrutiny of government regulators.
The tech frequently causes mishaps and accidents, some of them deadly. Tesla was found partially responsible for the death of a 22-year-old woman who was struck by a Tesla running Autopilot, FSD’s predecessor, and a judge ordered Tesla to pay $243 million to the woman’s family.
Last year, the National Highway Traffic Safety Administration launched a probe into the automaker after a Tesla running FSD struck and killed an elderly pedestrian on the side of the road, with dashcam footage showing that the vehicle’s front camera had been blinded by sunlight before the collision.