
Why Waymo is having a hard time stopping for school buses


For years, Alphabet-owned Waymo has tried to set itself apart from other self-driving startups by emphasizing a culture of caution and safety. Now, just ahead of major planned rollouts across the country, it is facing a recurring failure in one of the most sensitive places imaginable: school zones.

In December, the National Highway Traffic Safety Administration (NHTSA) opened an investigation into Waymo after Austin’s largest school district reported at least 19 incidents where the company’s robotaxis failed to fully stop for school buses during loading and unloading — an illegal violation in all 50 states. Waymo quickly responded by issuing a voluntary software recall and rolling out updates intended to fix the problem.

But the patch didn’t work. Since the update, Austin Independent School District (ISD) says at least four additional violations have occurred, including one as recently as January 19th, when a Waymo vehicle was filmed breezing through the opposite lane of traffic as children waited to cross the street and board a bus with its stop arm extended. In total, at least 24 safety violations involving Waymo vehicles and school buses have been reported in Austin since the start of the 2025 school year.

Waymo has defended itself, in part, by noting that none of the Austin school bus incidents resulted in a collision or injury. But that’s no longer strictly the case nationwide. Last week, Waymo published a blog post acknowledging that one of its vehicles struck a child outside a Santa Monica elementary school on January 23rd. Although the school district told The Washington Post that the child sustained only minor injuries, the outcome could have been far worse: Waymo says the vehicle slowed from 17 mph to 6 mph in the instant before impact.

Experts specializing in autonomous vehicle safety and pedestrian interaction told The Verge that these incidents were concerning, particularly given the company’s stated goal of making its vehicles drive more “confidently assertive.” In an effort to shed the stereotype of driving like a cautious grandparent, the vehicles have been spotted playing looser with traffic rules. But making robotaxis seem more humanlike also risks having them inadvertently inherit some of our more dangerous driving habits.

“These technologies are still being developed and tested in a real world environment because there’s a lot of things that happen in the real world that’s hard for companies and engineers to anticipate,” Cornell Tech professor and expert in human-robot interaction Wendy Ju tells The Verge. “Unless you have some understanding of all the things that might happen, it’s hard to know what to design around.”

Waymo did not respond to repeated requests for comment. On Wednesday, Waymo’s chief safety officer Mauricio Peña responded to safety concerns raised during a Senate hearing. He said Waymo is evaluating each of the school bus incidents and is developing fixes, some of which have already been incorporated into its software. Peña also said they are working with Austin ISD to “collect data on different lighting patterns and different conditions.” Waymo notably didn’t commit to stop operating around school buses while that data collection and testing occurs.

School bus stops put autonomous vehicle ‘logic’ to the test

Navigating around school buses is one of the more dangerous aspects of driving, for humans and robots alike. NHTSA attributed 61 fatalities to vehicles illegally passing school buses between 2000 and 2023, almost half of them pedestrians under the age of 18. That danger has less to do with the bus drivers themselves, who are typically licensed and careful, and more to do with the chaotic, improvisational nature of the situation. Buses are often double-parked, and kids, being kids, might not wait to cross the street when they are supposed to.

“Waymos are having an issue because every driver has issues around school buses,” Ju said.
