A video posted to social media showing a Waymo self-driving taxi being vandalized and engulfed in flames in San Francisco is sparking debate over autonomous vehicles. The incident happened on Saturday, Feb. 10, in the Chinatown area of the city.
The San Francisco Fire Department did not report any injuries from the vandalism, but the vehicle is a total loss. Video posted to X shows the charred remains of the taxi as firefighters douse the twisted and singed metal with water.
The incident has led some experts to scrutinize the artificial intelligence behind Waymo, which is owned by Google’s parent company, Alphabet. Some critics have also suggested that the company’s autonomous vehicles should be able to detect heavy foot traffic or danger with the multiple cameras and sensors mounted on each vehicle.
“Most normal car drivers know that they have to avoid Chinatown during the Lunar New Year holidays,” said Aaron Peskin with the San Francisco Board of Supervisors, who has called for more regulation of self-driving cars. “The computer doesn’t know that.”
However, some came to the company’s defense. San Francisco Mayor London Breed, D, called the Waymo incident a “dangerous and destructive act of vandalism” and praised Waymo’s role in the city.
“We are a city that is home to exciting, emerging technologies, like autonomous vehicles, that are changing the world,” Breed said.
Despite that support, autonomous vehicles still face questions about safety.
The Washington Post reported that Tesla employee Hans von Ohain died in a 2022 crash that his friend, Erik Rossiter, said occurred while the car’s full self-driving feature was engaged. If true, it would mark the first known death of a driver using Tesla’s full self-driving feature.
Ohain’s widow and Rossiter maintain that Ohain used the full self-driving feature every time he drove the car, even with his infant child inside. Ohain’s widow told the Post she was uneasy with the technology, describing the feature as “jerky” and saying the vehicle had unexpectedly swerved while Ohain was using it. Still, he maintained confidence in the technology.
According to the Post, Ohain was selected by Elon Musk to test the technology in the hope that he would log data Tesla could use to perfect the experimental system. The Post reported that only 400,000 Tesla drivers were selected for the fully autonomous feature.
An investigation by the Colorado State Patrol and the National Highway Traffic Safety Administration did find that Ohain had a blood alcohol level of 0.26, well over the legal limit. However, neither agency could determine whether Tesla’s self-driving feature directly contributed to the crash. The fire that followed the crash destroyed the car and any onboard systems that would have preserved data about the self-driving feature.
Ohain’s friend did confirm that the pair had been out drinking that day after finishing 21 holes of golf. However, Rossiter claimed that the self-driving feature caused the car to leave the road and strike a tree. Rossiter was able to escape, but Ohain was trapped in the car, and the resulting fire killed him.
Musk did not respond to a request for comment on The Washington Post article and has not publicly acknowledged Ohain’s death. However, the company has taken steps to fix driver-assist features in its vehicles.
NBC News reported that Tesla voluntarily recalled more than 2 million vehicles to update driver-assist features and implement more safeguards. Tesla also urges drivers to keep their hands on the wheel at all times and to stay attentive. Experts argue that driver-assist features and autonomous vehicles give drivers a “false sense” of security.
Whatever sense of security the technology provides, drivers still appear hesitant about autonomous vehicles.
Forbes found that 93% of drivers surveyed have concerns about self-driving cars. In addition, 62% of Americans are not confident in Tesla’s self-driving technology, and 81% of those surveyed have never been in a self-driving vehicle.
Advocates for safer autonomous vehicles are also pushing for companies to be more accountable.
The Dawn Project released videos that reportedly show Teslas in self-driving mode running stop signs, failing to stop for school buses and hitting mannequins meant to represent pedestrians.
The criticism isn’t directed only at Tesla’s autonomous feature. In October, NPR reported that a woman suffered critical injuries after being struck and dragged by a robotaxi operated by Cruise, which is owned by General Motors. The company is now reportedly facing fines and an investigation that could cost GM millions of dollars.
Another incident involving a Waymo occurred on Feb. 6, when, according to Reuters, one of the company’s autonomous taxis hit a bicyclist.
Officials said the bicyclist was not seriously injured, but the incident still called the safety of self-driving vehicles into question. However, Waymo and Cruise maintain that autonomous systems make roads safer, and both companies claim their vehicles have driven millions of miles without “any human fatalities.”
According to NPR, the rise of self-driving cars is also leading to protests. Activists are reportedly putting traffic cones on the hoods of driverless cars, rendering them immobile.
Robotaxis have also proven to be an obstacle for first responders. Reuters reported that a San Francisco Fire Department spokesperson confirmed that a firetruck on its way to an emergency had been held up by a Waymo that failed to pull over.
As for the robotaxi destroyed over the weekend, Waymo did not respond Monday when asked why the car drove into a crowded public event. Waymo previously described the vandalism as an isolated incident, but a Waymo rider told Reuters that he rode an autonomous taxi through another San Francisco crowd the next day and that fireworks were shot at the car.