San Francisco’s Robotaxi Fleet Halved After Horror Crash
10:45 AM EDT on October 27, 2023
Ten short weeks after the hilariously corrupt California Public Utilities Commission voted to allow two driverless car companies to begin offering rides within the tortuous, craggy seven-by-seven confines of San Francisco, the larger of those two companies has fallen so severely short of its mission to drive normally without hitting people and destroying the traffic grid that the state has abruptly pulled its permit to operate. The California DMV announced on Tuesday that Cruise's approval to operate self-driving cars in San Francisco had been yoinked, citing the company both for operating vehicles "not safe for the public's operation" and for misleading investigators looking into one particularly nasty incident involving a Cruise car.
That Cruise would eventually fail on the merits to offer robotaxi rides to San Franciscans without those robotaxis driving into construction sites, blocking emergency vehicles, or bricking in the middle of intersections is no surprise—well before they were fully legalized, those vehicles had already been causing problems. But the speed with which they were regulated is a real surprise. San Francisco city officials not bought off by GM or Google have long expressed public skepticism about the vehicles' safety, though they have had oddly little power to regulate the fleet. That's a state issue, and the state-level commission that OK'd the initial rush of driverless cars has a former Cruise lawyer on it. Given that California's attitude towards big tech companies using San Francisco as a sandbox has mostly been blank, glassy-eyed approval, it's heartening that Cruise only got to muck up the city for less than three months before they were taken off the road.
This still proved to be plenty of time for Cruise to do a lot of damage, and for its legal team to prove stunningly maladroit in its response. Around 9:30 p.m. on October 2, a human driver ran into a woman at the intersection of Fifth and Market in downtown San Francisco, knocking her into the path of an oncoming Cruise vehicle. The robotaxi completed a "hard-braking maneuver" and came to a complete stop, with the woman trapped underneath it. Apparently unaware that it was now atop an injured person, the autonomous car then executed a "pullover maneuver" and dragged the woman for 20 feet. The San Francisco Fire Department had to use the jaws of life to lift the car off the victim, who remains in serious condition at SF General.
The scandal only began with that dragging. The DMV met with Cruise to review footage of the accident on Oct. 3, and the GM subsidiary only showed it a video of the car rolling over the woman and stopping. In its filing, the agency said that it only saw the full video of the subsequent dragging 10 days later, after being alerted to its existence by the National Highway Traffic Safety Administration, which is also investigating Cruise; once the DMV asked Cruise for the full tape, it got it. A Cruise spokesperson disputed that timeline and claimed the DMV had the footage the whole time, though the video the company showed two reporters the day after the crash also stopped before the pedestrian was dragged. Board of Supervisors President Aaron Peskin told Forbes on Oct. 6 that he had learned of the dragging, and when Forbes asked Cruise for comment, it refused to say anything. "They have known this since moments after it happened," Peskin told Mission Local. "They are really bad actors and I wonder what criminal liability they have."
Cruise has been lighting a bit more than a quarter of a billion dollars on fire each month in its aggressive pursuit of expansion and legitimization; some of that money went to the San Francisco Giants, who were 61-49 when the sponsorship began and 18-34 after. While the company is still rolling out operations and testing in many major cities around the country, losing San Francisco like this is a harsh blow. The city is the ideal testing ground for AVs, as it is topographically adversarial, quite small, and adjacent to the largest node of tech money in the world. Meanwhile, Waymo seems to have largely avoided, say, having its cars brick en masse or drive into wet concrete. This means that, while San Francisco will still be crisscrossed by driverless taxis, their numbers will be smaller and those vehicles' propensity for chaos will, hopefully, be reduced.
Throughout the development life cycle of driverless cars, their makers have pointed out that, when compared to human drivers, their robot cars boast far safer traffic records. While both AV companies operating in San Francisco are opaque with their in-house data, there's reason to believe that this is true, if only when one squints at the claim from a particular angle. The insurmountable problem with AV development is that the street is a far messier and more unpredictable space than the "brains" of driverless cars are capable of problem-solving their way through. If it's impressive that Waymo's cars seem largely able to avoid hitting people, it's also baffling that something as mundane as a falling blossom can scramble their circuits. Human drivers are fallible and maddeningly fond of housing four beers and hitting the road, but the argument for replacing them with unfeeling robots starts to fall apart when you consider the ostensible replacement's inability to do important things like "react to novel stimuli."
In this way, the Oct. 2 crash offered a neat encapsulation of the dynamic at play here. A human driver started it, hitting the pedestrian and "launching" her into the path of the Cruise car, but the driverless car compounded the problem by failing to notice something so basic as the road surface changing from tarmac to human body. The result left what Mission Local reported as a "gruesome trail of blood [...] clearly visible on the pavement." Adding a couple hundred robots into the turbulent swirl of human-incited traffic violence doesn't make the streets safer by thinning out the pack, it turns out. It only makes that uneasy dynamic even more unpredictable and hazardous by adding drivers that can't do the most important thing a driver can do.