What Happens When Robotaxis Break Down?

A series of incidents involving Waymo’s autonomous vehicles has highlighted what happens when driverless systems fail in complex real-world situations, and how much they still rely on human intervention to recover.

A Technology Built For The Road Meets The Unexpected

Waymo’s robotaxi service has expanded rapidly across multiple US cities, now delivering hundreds of thousands of paid rides each week. The company positions its system as a fully autonomous driving service, designed to operate without a human driver behind the wheel.

However, recent incidents show that when situations fall outside expected conditions, vehicles can struggle to respond. In several reported cases, Waymo vehicles have stopped, hesitated or behaved unpredictably during emergencies, requiring intervention from police officers or other first responders.

One widely reported example from August 2025 involved a highway fire in California, where traffic was redirected in an unusual way. A Waymo vehicle was unable to adapt to the change, eventually stopping and requiring a police officer to manually move it out of the way.

When Autonomous Vehicles Cannot Proceed

The most significant issue here seems to be what happens when the system cannot decide what to do next.

Autonomous vehicles are designed to prioritise safety, which often means stopping when uncertainty is too high. While this reduces the risk of collisions, it can create new problems, particularly in fast-moving or emergency situations where standing still is not a viable option.

In multiple incidents, it seems that autonomous vehicles have effectively become obstacles in live environments, blocking traffic or delaying access for emergency services until human intervention takes place.

Human Support As The Fallback

To manage these situations, Waymo relies on human support systems behind the scenes.

The company uses Remote Assistance teams who provide contextual guidance when the vehicle encounters something it cannot resolve. According to Waymo, these workers do not drive the vehicle. Instead, they support decision-making. As the company explains, Remote Assistance agents “provide advice and support to the [vehicle] but do not directly control, steer, or drive the vehicle.”

This model is designed to ensure that the automated system remains in control at all times. However, it also means that when the system reaches its limits, recovery can depend on how effectively this human support is integrated.
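In principle, this advisory model can be pictured as a simple decision rule: the remote agent suggests an action, but the vehicle’s own safety assessment has the final say. The following Python sketch is purely illustrative; the names and the override rule are assumptions for clarity, not Waymo’s actual logic.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class Action(Enum):
    PROCEED = auto()
    STOP = auto()


@dataclass
class Advice:
    """A suggestion from a remote assistant (hypothetical structure)."""
    suggested: Action
    source: str  # e.g. "remote_assistance"


def resolve(vehicle_assessment: Action, advice: Optional[Advice]) -> Action:
    """The automated system stays in control: remote advice is only a
    suggestion, and the vehicle's own STOP decision is never overridden."""
    if advice is None:
        # No human input requested or received: act on the vehicle's own call.
        return vehicle_assessment
    # Assumed rule: advice cannot override the vehicle's safety stop.
    if vehicle_assessment is Action.STOP and advice.suggested is Action.PROCEED:
        return Action.STOP
    return advice.suggested
```

Under this (assumed) rule, guidance can help the vehicle out of an indecisive state, but a conflicting human suggestion cannot force it past its own safety threshold, which matches Waymo’s statement that the automated system remains the primary decision maker.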

Where Things Can Go Wrong

Even with this support in place, errors can still occur. For example, in one case under investigation in Austin, Texas, in January this year, a Waymo vehicle approached a stopped school bus with its warning lights active. The system requested input from a remote assistant who, it is alleged, incorrectly confirmed that it was safe to proceed. The vehicle then moved past the bus while children were boarding, an action that would normally be illegal for a human driver.

Other reported incidents show a different type of failure, where no safe path is identified at all. In these cases, vehicles have remained stationary until physically moved, sometimes by police or other first responders.

These incidents have led local officials to raise concerns that autonomous vehicles place an unexpected burden on public services. In San Francisco, for example, emergency management leaders warned that responders were becoming a default support function for autonomous vehicles, something they described as unsustainable.

Scaling The Problem Alongside The Technology

It seems that these challenges are becoming more visible as Waymo scales its operations.

The company operates thousands of vehicles and is expanding into new cities, increasing the number of unpredictable environments its systems must handle. It has said that around 70 Remote Assistance agents support a fleet delivering more than 400,000 rides per week.

In its response to US lawmakers, Waymo reiterated that Remote Assistance is limited in scope, stating that agents “provide advice only when requested by the automated driving system on an event-driven basis” and do not take control of the vehicle.

As deployment grows, the question is not whether incidents will occur, but how frequently and how effectively they can be resolved without external intervention.

Balancing Autonomy With Accountability

Waymo maintains that its system is designed to prioritise safety, even if that means stopping when conditions are unclear. The vehicle can also ignore human input if it conflicts with its own assessment, reinforcing that it remains the primary decision maker.

The company also states that “Waymo’s service does not rely on remote drivers,” emphasising that human involvement is limited and controlled.

However, the pattern of real-world incidents suggests that full autonomy still depends on multiple layers of human support. When those layers are not sufficient, responsibility can extend beyond the company itself to public infrastructure and emergency services.

What Does This Mean For Your Business?

For UK businesses, this highlights a critical aspect of automation that is often overlooked, namely what happens when systems fail or reach their limits.

Autonomous technologies are not just defined by how they perform under normal conditions, but by how they behave when they cannot proceed. Stopping safely is one outcome, but in operational environments, recovery is just as important.

It seems that human oversight, fallback processes and clear responsibility models remain essential. Businesses adopting automation will, therefore, need to plan not only for success scenarios, but also for failure scenarios, including how issues are resolved quickly and safely.

There is also a wider accountability question here. When automated systems interact with public environments, any gaps in ownership can become visible very quickly.

The Waymo case shows that the real test of autonomous systems is not when everything works, but how they respond when it doesn’t.


Mike Knight