San Francisco city officials and residents are divided on whether they should allow robotic taxis to operate during daylight hours. Autonomous car companies Waymo and Cruise insist that their cars are safe enough — but some students and faculty are apprehensive.
The California Public Utilities Commission votes Aug. 10 on a proposal that, if passed, could make hundreds of robotic taxis available for hire across the city.
Waymo and Cruise currently operate self-driving taxis at night through pilot programs launched within the last two years. They also run a handful of cars during the day but cannot charge for those daytime rides. The artificial intelligence underlying the robotic taxis was trained on San Francisco-based data sets for several years beforehand. Both companies believe robotic taxis will increase road safety, decrease traffic and improve quality of life for city residents.
Critics point to a history of malfunctions leading to traffic congestion and minor property damage, though they acknowledge that fatal incidents have been rare. Some experts have expressed concerns about whether these cars are truly safe enough for daylight operations.
Freeman Spogli Institute adjunct lecturer and artificial intelligence expert Jerry Kaplan said he is unable to dismiss the associated safety risks. Without the proper infrastructure, the cars are not safe enough to operate during daylight hours, Kaplan said.
His primary concern is that, without a human driver, cars could struggle to interpret and convey the subtle signals that guide interactions on the road.
“Driving is much more of a social activity than people realize. Particularly when you’re in traffic or you’re around other cars, there are all kinds of subtle cues about who’s going to go first and whose turn it is,” Kaplan said, giving the example of a car encountering a pedestrian about to cross the road.
Adyasha Mohanty, a fifth-year aeronautics and astronautics Ph.D. student who develops algorithms for multi-sensor perception for autonomous vehicles, echoed Kaplan.
For example, Mohanty said a realistic scenario is an autonomous vehicle encountering unexpected construction: “What happens when you have construction on the road that hasn’t been there before? All the datasets that you train, none of them have construction. Even if they do, what if this construction is different?”
“It’s a huge challenge,” Mohanty said.
According to Kaplan, issues like the example above may emerge because autonomous vehicles are “not socialized to deal with the kinds of judgments, particularly in unusual situations, that humans are attuned to being able to deal with.”
Some are more confident in the capabilities of autonomous vehicles, including fifth-year electrical engineering Ph.D. student Shubh Gupta. In San Francisco, autonomous vehicles have been trained over the past few years and have collected enough data to ensure safety during daylight operations, Gupta said.
Gupta said he was less confident in the safety of robotic taxis in other cities, where the cars have yet to undergo the same amount of testing.
San Francisco resident Shobha Dasari ’23 said she frequently saw self-driving taxis operating around the city at night and even rode in one. Based on her experiences, Dasari said she believes the cars are relatively safe: “Outside of just the fact that there’s no driver in the car, [the ride] didn’t feel particularly notable.”
The car stopped pretty smoothly, “which, you know, is better than me as a driver,” Dasari said.