Alphabet’s autonomous-vehicle company Waymo is launching a voluntary software recall after several of its self-driving cars were found to have driven past school buses that were stopped with their stop arms extended, a violation of traffic laws and a major safety concern. The incidents, reported in multiple U.S. cities where Waymo operates its robotaxi service, have sparked renewed scrutiny of autonomous-vehicle safety and prompted questions about how well self-driving technology can recognize and respond to situations involving children.
The issue first gained attention after school transportation officials documented a series of encounters in which Waymo vehicles failed to come to a complete stop behind school buses displaying flashing red lights. In several cases, onboard cameras and eyewitness accounts indicated that the autonomous cars proceeded past buses while children were crossing, or had just finished crossing, the street. Although no injuries were reported, the severity of the hazard immediately raised alarms.
Waymo acknowledged the problem in a statement and confirmed that the behavior was caused by a software flaw affecting how the system interpreted the visual cues associated with stopped school buses. According to the company, the cars correctly detected the bus itself but failed to properly interpret certain stop-arm positions and lighting combinations, particularly when visibility was reduced or when buses were stopped in atypical road configurations.
The company emphasized that its vehicles have logged millions of autonomous miles with a strong safety record, but insisted that “any unexpected behavior around vulnerable road users—especially schoolchildren—is unacceptable.” The recall will push an over-the-air update to the affected software, modifying the perception and decision-making logic that governs school-bus interactions. Waymo says the patch will ensure its vehicles stop consistently, remain stationary, and wait until the bus retracts its stop arm or resumes motion before proceeding.
This incident arrives at a sensitive time for the autonomous-vehicle industry. Public confidence in self-driving technology has fluctuated over the past several years as a series of high-profile crashes and regulatory investigations have made headlines. While companies like Waymo frequently highlight that their vehicles, on average, are involved in fewer serious crashes than human drivers, even a single malfunction involving children can trigger widespread concern.
School transportation scenarios are among the most challenging environments for self-driving systems. Not only do school buses vary widely in size, shape, and design, but their stop signals—fold-out arms, flashing red lights, and crossing gates—must be detected with extremely high reliability. Furthermore, children exiting buses may behave unpredictably, running across the street or returning to the bus unexpectedly. Human drivers are trained to anticipate this unpredictability, and regulators expect autonomous vehicles to meet or exceed that standard.
Waymo says it has been working closely with school districts and transportation safety experts to examine the incidents. In some of the reported cases, the company argues that human drivers on the road reacted more dangerously than its own vehicles, weaving around buses at high speeds or failing to stop. However, company officials also acknowledged that the standard for autonomous systems is—and must remain—significantly higher because machines are expected to operate with a level of consistency that humans often lack.
The recall is expected to roll out in the coming weeks, with engineers conducting extensive internal testing before the update is deployed fleet-wide. Because Waymo vehicles receive software updates over the air, no customer action is required, and no cars will need to be physically returned. The company says the affected vehicles will continue to operate during the update process, but with heightened monitoring.
Regulators at the federal and state level are also reviewing the incidents. While no penalties have been announced, transportation authorities have asked Waymo to provide detailed logs, sensor data, and explanations of the underlying technical issue. Some safety advocates have called on regulators to temporarily restrict autonomous-vehicle operations near school bus routes until the software fix has been validated. Others argue that such restrictions would be impractical and that improving the technology, rather than limiting it, should be the priority.
Community reactions have been mixed. Parents and school officials have expressed concerns, stressing that reliability around schoolchildren must be absolute. Some have called for more public transparency around autonomous-vehicle testing and incident reporting. At the same time, several transportation researchers have noted that human drivers routinely pass school buses illegally—an estimated 40,000 times per day nationwide according to some studies—and that a well-designed autonomous system could ultimately help reduce these violations.
For Waymo, the challenge is not only technical but reputational. The company has scaled its robotaxi service across multiple states, positioning itself as the safest and most mature operator in the autonomous-vehicle market. A software flaw affecting school-bus detection threatens to undermine that narrative, even if the incidents did not lead to injuries. How effectively and transparently Waymo handles this recall may influence public trust in driverless technology for years to come.
As the recall progresses, industry analysts expect increased scrutiny of how autonomous vehicles handle edge cases involving vulnerable road users—not just schoolchildren, but cyclists, construction workers, and pedestrians with disabilities. The Waymo incidents underscore a broader truth: in the transition toward a driverless future, even rare mistakes can shape public opinion, regulatory policy, and the speed at which society is willing to embrace autonomous transportation.
Waymo says the software update will resolve the issue completely. Whether that reassurance satisfies regulators, parents, and the public remains to be seen.