The Saturday before Christmas in San Francisco — normally a carnival of last‑minute shoppers and packed buses — turned oddly still. Traffic lights went dark across large swaths of the city after a fire at a PG&E electrical substation, and video of driverless Waymo cars sitting at intersections quickly spread online. For a few tense hours the images felt like a test not of software, but of whether autonomous vehicles can navigate real‑world breakdowns in civic infrastructure.
What happened
Utility crews say a fire inside a PG&E substation on Dec. 20 damaged critical equipment and knocked out power for roughly 130,000 customers. The outage hit many busy corridors and neighborhoods — Golden Gate Park, the Presidio and parts of downtown — and snarled traffic when signals stopped working. PG&E and city officials described the damage as "significant and extensive," and restoration took place in stages over the next day or two.
Waymo — Alphabet’s autonomous‑vehicle unit — temporarily paused its driverless ride‑hailing service in the Bay Area on Saturday evening as teams assessed operations. The company later resumed service, saying it was focused on integrating lessons from the event.
Why the cars stopped
Waymo’s system is programmed to treat non‑functioning traffic signals as four‑way stops, a conservative choice designed to prioritize safety. In this outage, however, the scale and simultaneity of dead signals across busy intersections meant some Waymo vehicles paused longer than human drivers might have — producing bottlenecks and, in places, gridlock.
That cautious behavior is intentional. In moments like these, an autonomous stack is built to weigh two questions: what is the lawful action, and what minimizes risk? Sometimes the answers look like indecision to human drivers, especially when lines of frustrated motorists try to squeeze around stopped vehicles.
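That dark-signal rule can be sketched as a tiny decision policy. To be clear, this is an illustrative toy, not Waymo's actual software: the state names, observation fields, and ordering of checks are all assumptions made for the sake of the example.

```python
from dataclasses import dataclass
from enum import Enum, auto

class SignalState(Enum):
    GREEN = auto()
    YELLOW = auto()
    RED = auto()
    DARK = auto()   # signal has lost power

class Action(Enum):
    PROCEED = auto()
    SLOW = auto()
    STOP = auto()
    STOP_THEN_YIELD = auto()  # treat the intersection as an all-way stop

@dataclass
class IntersectionObservation:
    signal: SignalState
    cross_traffic_present: bool
    pedestrians_present: bool

def choose_action(obs: IntersectionObservation) -> Action:
    """Pick a lawful, low-risk action at an intersection.

    A dark signal is handled like an all-way stop: come to a full
    stop, then yield until the intersection is clear. Hypothetical
    names and logic; meant only to mirror the conservative behavior
    described in the article.
    """
    if obs.signal is SignalState.DARK:
        return Action.STOP_THEN_YIELD
    if obs.signal is SignalState.RED:
        return Action.STOP
    if obs.signal is SignalState.YELLOW:
        return Action.SLOW
    # Green light: still yield if the intersection is not clear.
    if obs.cross_traffic_present or obs.pedestrians_present:
        return Action.SLOW
    return Action.PROCEED
```

The sketch also hints at why the bottlenecks formed: when every signal along a corridor goes dark at once, each vehicle independently falls into the stop-then-yield branch, which is safe but far slower than the improvised turn-taking human drivers negotiate.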
Critics seized on the visuals as proof that human intuition still matters on the streets. Editorial writers and some local voices argued that experienced, sometimes improvisational human judgment remains useful when infrastructure and technology both falter.
The public sparring
The outage also produced a public spat on social media. Elon Musk reposted video of Waymo cars holding at dark intersections and wrote that Tesla robotaxis were unaffected, a jab that spilled into a broader debate about which companies’ systems are more resilient. Tesla and Waymo use different philosophies for autonomy — and different mixes of cameras, sensors and fail‑safe behaviors — so comparisons in a single blackout are noisy at best.
Waymo responded that the outage was a widespread municipal event that produced non‑functioning traffic signals and transit disruptions, and that the company is working to adapt its technology to traffic flow during such events.
Bigger picture: infrastructure, trust and edge cases
The incident matters for two overlapping reasons. First, it highlights how tightly tied autonomous vehicles are to public infrastructure. When traffic control fails at scale, even well‑trained models must make conservative choices that can slow traffic. Second, it illustrates a political and PR reality: a handful of viral videos can shape public perception about an entire technology.
Alphabet’s transport ambitions sit beside other AI and infrastructure bets the company is making — from conversational assistants to far‑out ideas for where to put compute — underscoring how a single outage can ripple across many projects and reputations. For context on Google’s broader AI work in maps and navigation, see the company’s recent move to bring a conversational copilot to routing and directions in Google Maps. And for a sense of how big‑picture infrastructure thinking is evolving inside Alphabet, look at projects like Project Suncatcher.
Waymo says it will rapidly fold lessons from the event into its software; PG&E is investigating the cause of the substation fire, with restoration work continuing into Monday and beyond. For commuters, city officials and the companies themselves, the episode is a reminder that robustness isn't only about better models or sensors — it's also about resilient public systems, clear rules for interaction when things fail, and the messy human judgments that tech sometimes struggles to mirror.
Traffic resumed, lights came back on, and holiday shoppers returned to their errands. But the images of driverless cars paused in the dark lingered as a concrete example of the unusual edge cases that autonomous vehicles will keep encountering as they move from pilots to everyday service.