A viral video of a 'reckless' robotaxi caused an uproar in San Francisco. Police say the internet got it wrong.

SAN FRANCISCO — San Francisco police say a viral video that sparked condemnation of the city’s self-driving taxis may have been taken out of context and unfairly ignited a firestorm of controversy around the self-driving vehicles.

The video shows a first responder yelling at a driverless car from the tech startup Cruise to move out of the way because it is blocking emergency vehicles from reaching the scene of a mass shooting. The taxi in the video — with no people inside — defiantly stays put. 

The witness who recorded the video said the robotaxi was being reckless, a characterization that local media outlets and politicians picked up and ran with as another example of Big Tech going too far. 

Aaron Peskin, the president of the San Francisco Board of Supervisors, the equivalent of a city council, said the incident added to “grave concerns” in the city about the robotaxis. Jackie Fielder, a candidate for the board, called the video “Absolutely infuriating!” and said regulators should “get these things off the streets NOW.” 

But it turns out the initial perception created by the video may have been wrong, according to the San Francisco police and fire departments. They now say the robotaxi didn’t block emergency responders or otherwise get in anyone’s way after the shooting, which wounded nine people. 

“The autonomous vehicle did not delay police, fire, or other emergency personnel with our arrival or departure from this scene. Furthermore, it did not interfere with our investigation into the shooting incident,” the police department said in an email. 

The fire department said in a separate email that the vehicle “did not delay” fire personnel or paramedics. It added that the episode “could have been catastrophic but we are lucky that there was another lane we could use.” 

That version of events is consistent with an earlier statement from Cruise, which is majority-owned by General Motors. 

“Our car initially stopped as it was approaching an active emergency scene, then proceeded to perform a U-turn and pull over. Throughout this time, all vehicles, including emergency response vehicles, were able to proceed around our car,” Cruise said in a series of tweets early Saturday, hours after the shooting. 

A spokesperson for Cruise said Wednesday that a further review of data from its vehicle again showed that it didn’t block any emergency personnel. 

The viral video was only 13 seconds long, and it didn't show a wide view of the surrounding area. 

The incident highlights the swirl of confusion around robotaxis, which in San Francisco are a flashpoint for a heated debate. The debate may soon go national as Cruise and its competitors, such as Google’s affiliate Waymo, expand to more places. 

“This is going to be an issue in cities across America. San Francisco just happens to be first,” Peskin, the Board of Supervisors president, said in a phone interview. 

The video from Friday got outsize attention because of its connection to a mass shooting, but Cruise and Waymo have blocked emergency vehicles in other circumstances. Peskin cited an example from last week, when the fire department said it was delayed eight minutes on a medical call because a Waymo vehicle had parked in front of the firehouse. Waymo, which shares a parent company with Google, didn’t immediately respond to a request for comment about that example. 

Peskin said that regardless of what occurred Friday, there are enough similar incidents that the companies shouldn’t be allowed to expand until they improve. 

“These devices are not ready for full implementation in a complex urban environment, and our calls upon Cruise primarily and Waymo secondarily have fallen on deaf ears as these companies are in a race to fully deploy,” he said. 

The companies’ customers and other defenders say that’s a double standard that cities don’t apply to human drivers, who also cause congestion and delays, as well as an increasing number of traffic deaths. 

Fielder, the Board of Supervisors candidate, said the problems are frequent enough that the robotaxis should be banned entirely or have human drivers at all times. She recorded a video of one stopped in a crosswalk in April.

“We as residents never consented to being a part of this experiment by tech companies. It’s been imposed on us,” she said in a phone interview. 

Paul Valdez, the witness who recorded the video, declined an interview request. He said in a direct message that he recorded the incident because the first responder was so upset and yelling. 

San Francisco has no direct control over the future of robotaxis in the city. Under state law, that power resides with the California Public Utilities Commission, which is scheduled to consider an expansion plan for Cruise and Waymo at a meeting June 29. City officials oppose the plan. 

Cruise began operating a late-night taxi service in San Francisco in June 2022, after years of test drives in the city and elsewhere. It costs about as much as a human-driven taxi or a ride-hail service such as Uber or Lyft, and it’s available in selected neighborhoods from 9 p.m. to 5:30 a.m., with plans to expand hours and territory. The vehicles operate during the day in test mode. 

Human drivers sometimes get angry with the robotaxis, because they’re relatively slow and law-abiding, obeying speed limits and making full stops at stop signs. 

The autonomous vehicles also sometimes mess up when they encounter something new to them or confusing. They’ve driven through police tape, entered construction areas, driven over fire hoses and delayed city buses. 

In San Francisco, the robotaxis haven’t caused any traffic deaths, while deaths and serious injuries involving human drivers are on the rise locally and nationally. The robotaxis are sometimes the victims of hit-and-runs by human drivers. 
