The specific examples listed in the article are egregious and caused real harm. The fire chief is right to be angry; if anything, her response is too measured.<p>From the article:<p>- Running through yellow emergency tape and ignoring warning signs to enter a street strewn with storm-damaged electrical wires, then driving past emergency vehicles with some of those wires snarled around rooftop lidar sensors.<p>- Twice blocking firehouse driveways, requiring another firehouse to dispatch an ambulance to a medical emergency.<p>- Sitting motionless on a one-way street and forcing a firetruck to back up and take another route to a blazing building.<p>- Pulling up behind a firetruck that was flashing its emergency lights and parking there, interfering with firefighters unloading ladders.<p>- Entering an active fire scene, then parking with one of its tires on top of a fire hose.
These articles <i>always</i> conflate all industry participants, but when you dig into it, it's always Cruise causing the problem. SFMTA's complaint to the state about Waymo cites 13 incidents in its Appendix B, and they involve Cruise cars. The very best thing Waymo could do is lobby the state government to establish strict rules so its own reputation doesn't get diluted by Cruise's.<p>There are a quarter million vehicle crashes in the DataSF Fire Department Calls For Service dataset. Self-driving cars will prevent these. It's the systematically better way to go.
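For anyone who wants to check the numbers themselves, the Calls For Service data is exposed through DataSF's Socrata API. A minimal sketch in Python; the dataset ID and the 'Traffic Collision' call type are assumptions, so verify both against data.sfgov.org before trusting the count:

    import json
    import urllib.parse
    import urllib.request

    # Assumed dataset ID for "Fire Department Calls for Service" --
    # confirm on data.sfgov.org before relying on it.
    BASE = "https://data.sfgov.org/resource/nuek-vuh3.json"
    params = urllib.parse.urlencode({
        "$select": "count(*)",
        "$where": "call_type = 'Traffic Collision'",
    })

    with urllib.request.urlopen(f"{BASE}?{params}") as resp:
        print(json.load(resp))  # e.g. [{"count": "..."}]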
It’s absolutely insane that automated driving is allowed on public streets.<p>It should be completely banned until a comprehensive testing regime exists for validating that the automated system functions in all the scenarios it might be expected to face, including emergency vehicles, construction, pedestrians, bad weather, and damaged sensors.
Shipping AI whose training data doesn't include public-safety scenarios is malpractice at minimum. This is nuts.<p>There is sufficient evidence that these cars are not ready for prime time.
A family member was a NY firefighter many years ago. The Soviets had a lot of people at the UN, and they had diplomatic plates. They would park wherever they wanted, often blocking things. There was nothing the police could do.<p>The fire chief decided to have fire drills. Whenever a cop saw a car with diplomatic plates parked in front of a hydrant, they would call the fire department, who would come out and perform a drill. Upon seeing the vehicle blocking the hydrant, they would practice breaching through the vehicle to gain access to the hydrant. The vehicle would not survive in a drivable state.<p>The Soviets complained, but stopped blocking fire accesses.<p>EDIT: The fire chief could solve this by putting truck- or police-cruiser-style bullbars on their vehicles and "carefully nudging" the miscreant vehicles out of the way.
It appears these robotaxis need an emergency button similar to the ones on industrial robots. But instead of completely stopping the vehicle, it would have a remote operator immediately take over and get it out of the way. Meanwhile, the AI could take a back seat and observe how to respond in situations like this, because they obviously don't have enough training for dealing with emergency situations.
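A minimal sketch of what that mode switch could look like; all of the names here are hypothetical, not anyone's actual stack:

    from enum import Enum, auto

    class Mode(Enum):
        AUTONOMOUS = auto()
        REMOTE_OVERRIDE = auto()  # a human teleoperator is driving

    class Robotaxi:
        def __init__(self):
            self.mode = Mode.AUTONOMOUS
            self.shadow_log = []  # (sensors, planner, human) tuples for later training

        def on_emergency_button(self):
            # Like an industrial e-stop, except control passes to a
            # remote human instead of freezing the vehicle in place.
            self.mode = Mode.REMOTE_OVERRIDE

        def tick(self, sensors, planner_cmd, remote_cmd):
            if self.mode is Mode.REMOTE_OVERRIDE:
                # The planner keeps running in shadow mode so its output
                # can later be compared against what the human did.
                self.shadow_log.append((sensors, planner_cmd, remote_cmd))
                return remote_cmd
            return planner_cmd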
This past weekend in SF, I saw a driver slow and stop to reverse parallel park on Bush Street at Fillmore. An automated Cruise car stopped right behind him, blocking the driver from reversing into the parking spot. The guy got out to yell at the driver to back up, but then saw that it was driverless.<p>He drove off, and someone else got the spot.
I'm surprised no one has mentioned the US legal system as part of a solution. If a robo-taxi causes material harm, sue the operator for some multiple of the cost of the harm.<p>I suppose the drawback to this strategy is that real harm has to happen first, and that could easily involve loss of life or limb, but perhaps the threat of that would be enough to motivate the robo-taxi providers to fix the problem.
It's weird: I'm a developer, and I really want AI to work, but it just doesn't, so I've ended up generally against it. And it seems it's far, far from working properly.<p>It's difficult to see domains where AI can really improve productivity without major drawbacks.<p>I'm not against research on AI, but as long as science cannot define what intelligence really is, I guess AI will not make major advances.
This is going to get even more lolsob when Cruise deploys its self-driving car without a steering wheel[1].<p>[1] <a href="https://arstechnica.com/cars/2022/02/gm-seeks-us-approval-to-put-driverless-cruise-origin-into-commercial-service/" rel="nofollow noreferrer">https://arstechnica.com/cars/2022/02/gm-seeks-us-approval-to...</a>
The incidents in the article are inexcusable. But to be fair, I manually reviewed a dozen accident reports on the CA DMV website, and 100% were due either to human error in other vehicles or to issues with an operator driving the AV in manual mode:
<a href="https://www.dmv.ca.gov/portal/vehicle-industry-services/autonomous-vehicles/autonomous-vehicle-collision-reports/" rel="nofollow noreferrer">https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...</a>
Some simple infrastructure could help with this. A no-fly zone for robo-stuff that travels with fire engines, for instance, broadcast over radio or something similar; require all robotaxi companies to respect these zones.
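One way the "radio or something" part could work, as a hedged sketch: the fire engine broadcasts its position plus an exclusion radius, and every AV checks whether it is inside the moving zone. The message format here is invented for illustration:

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        # Great-circle distance between two points, in meters.
        r = 6_371_000
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def must_yield(av_lat, av_lon, zone):
        # zone is a hypothetical broadcast: position plus exclusion radius.
        d = haversine_m(av_lat, av_lon, zone["lat"], zone["lon"])
        return d <= zone["radius_m"]

    # Fire engine broadcasts a 200 m no-go zone that moves with it.
    zone = {"lat": 37.7793, "lon": -122.4193, "radius_m": 200}
    print(must_yield(37.7801, -122.4190, zone))  # True -> pull over and stop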
I do think that, until someone invents the sufficiently smart self-driving car, robo-taxis will be required to have some sort of override mode, either cell- or satellite-based: something where a remote human operator can take over, using the sensor data to drive the car until it gets back to situations it can actually handle.<p>I understand that lag can be an issue, but if speeds are limited to 5-10 mph, that should be less of a problem.
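The arithmetic supports the low-speed idea: at teleoperation speeds the car barely moves during a realistic round-trip delay. The 300 ms latency figure below is illustrative, not a measured number:

    MPH_TO_MS = 0.44704  # meters per second per mph

    def blind_distance_m(speed_mph, round_trip_s):
        # Distance traveled before a remote command takes effect.
        return speed_mph * MPH_TO_MS * round_trip_s

    for mph in (5, 10, 30):
        print(f"{mph} mph -> {blind_distance_m(mph, 0.3):.2f} m at 300 ms RTT")
    # 5 mph -> 0.67 m, 10 mph -> 1.34 m, 30 mph -> 4.02 m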
How do these companies not have a simulator built to train cars on simulated non-standard scenarios? Have dogs, kids, fire trucks, caution tape, etc., and run them with 100k variations.<p>And then do real-world staged validation.<p>Actually, why are extensive real-world mock scenarios not running 24/7? If a car does something bad, build mock scenarios around it.<p>Please tell me they are doing these things, because it’s crazy if they aren’t.
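A minimal sketch of what that scenario fan-out might look like; the axes and counts are made up for illustration:

    import itertools
    import random

    # Hypothetical scenario axes; a real simulator would have many more.
    ACTORS = ["dog", "child", "fire_truck", "fire_hose", "caution_tape"]
    WEATHER = ["clear", "rain", "fog", "night"]
    PLACEMENT = ["blocking_lane", "adjacent_lane", "crosswalk", "driveway"]
    SPEEDS_MPH = [0, 5, 15, 25]

    base = list(itertools.product(ACTORS, WEATHER, PLACEMENT, SPEEDS_MPH))

    def variants(scenario, n, rng):
        # Jitter each base scenario into n randomized variants.
        actor, weather, placement, speed = scenario
        for _ in range(n):
            yield {
                "actor": actor, "weather": weather, "placement": placement,
                "speed_mph": speed + rng.uniform(-2, 2),
                "lateral_offset_m": rng.uniform(-1.5, 1.5),
            }

    rng = random.Random(0)
    total = sum(1 for s in base for _ in variants(s, 300, rng))
    print(len(base), "base scenarios,", total, "variants")  # 320 base, 96000 variants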
Seems to me a simple regulation, like the fire-department access requirements for buildings, could solve this.<p>eg1: What about a direct line to a 24/7 operations center at Cruise?<p>eg2: Or how about a proximal-access console that allows them to take control of the vehicle (with the occupant's consent if occupied)?<p>This needn't be a total ban on AVs.
Why are the people operating these cars not getting ticketed for all this, then? These various programs should be accruing points on their licenses just like a real driver. Cruise would probably be off the road if they did this, though.
>Under the agency’s own rules, issues such as traffic flow and interference with emergency workers can’t be used to deny expansion permits. The resolutions list four “goals” to be considered:<p>- inclusion of people with disabilities;<p>- improved transportation options for the disadvantaged;<p>- reduction of greenhouse gases; and<p>- passenger safety
Most countries have penalty points on driving licenses. If a driver violates traffic rules too often, they lose their license! Self-driving cars should not get an exception!