Self-driving cars can be found on roads across America. But are they really safe?
It’s easy to be amazed after seeing a driverless car pass by you on the freeway. The technology has progressed significantly in recent years and become markedly safer. According to data from Waymo, its self-driving cars were involved in 91% fewer crashes resulting in serious injury or worse, and 80% fewer crashes causing any injury, than comparable human-driven vehicles.
Of course, this self-reported data comes with plenty of caveats. There are also still numerous issues with self-driving cars, including surveillance concerns and their propensity to park illegally. Not only that, but if a self-driving car does break the law, it’s unclear who should be fined.
Now, a user on TikTok has captured an autonomous vehicle breaking the law. But how did this happen?
Why Did This Self-Driving Car Turn Into Traffic?
In a video with over 13 million views, TikTok user Jessica Pacita (@jessicapacita) shows a Waymo vehicle stopped at an intersection in California’s Bay Area.
While the vehicle has a green turn light, it appears that police officers have stopped traffic and are directing it themselves.
After a few seconds, however, the Waymo car appears to ignore the officers. It pulls into the intersection, slowly approaching a slew of cars speeding in its direction.
By the end of the video, a police officer has approached the Waymo vehicle. At this point, the car backs up toward its original position.
“Just as I was starting to think I could ride in one,” Pacita writes in the caption.
Waymo Cars And Police Officers
According to Waymo, its cars have sensors that detect features like sirens and emergency lights. This, in theory, allows them to identify first responders, emergency vehicles, and temporary traffic control devices and give them priority.
In practice, this is not always the case. There have been numerous documented incidents of Waymo vehicles getting in the way of emergency services. For example, in March 2026, the company faced criticism after one of its self-driving cars prevented an ambulance from quickly reaching the scene of a mass shooting.
For its part, Waymo says it has several procedures in place to stop this from happening. As previously stated, the cars are programmed to prioritize emergency vehicles, and, per the company, the software is “designed to respond to customary first responder instructions.”
However, in cases where this programming fails, police officers and other first responders have been known to take control of the vehicles and move them to other locations.
As for what happened in this specific incident, it’s unclear. It may be that passing cars blocked the autonomous vehicle’s view of the police officers, leading it to prioritize its green arrow instead.
Commenters Are Worried
In the comments section, users were divided. Some noted that the car, technically speaking, had been signaled to turn left.
“To be fair, he had the light,” wrote a user.
“To be fair, there was a green light with a protected left turn arrow. Why are all the cross traffic running the red light?” asked another.
Other users countered that the Waymo vehicle should have been following police directions.
“The robot doesn’t know there’s a cop directing traffic and the lights are irrelevant,” declared a user.
Consequently, some felt that issues like these made self-driving cars dangerous.
“There are too many factors on the road for waymo to work well,” stated a commenter.
“Perfect example why we still need humans. Can’t stand Waymo’s,” offered a second.
BroBible reached out to Waymo via email and Pacita via TikTok direct message and comment.
