If self-driving vehicles are tested on public streets, all data generated should be public. Companies should be cooperating, not competing, on "don't hit telephone poles" and "don't run over children".
After some sleuthing and GeoGuessr-ing, here's the alley where it happened: https://maps.app.goo.gl/UNUzQn686ESYWbfo7

It's further south of the pin, by the section of the alley with stripes on the pavement behind the garages for 842 N 6th Ave (not on 6th or 7th, but on an unnamed alley between the two).

Let me repeat: it's an alley. The Google Maps car didn't even go down that road (though it looks to have been under construction when the Maps car went through).

Without excusing Waymo (they had their car do something dangerous and stupid), this is the kind of pseudo-off-road parking lot/driveway/construction zone nav stuff that's really hard to get right, and almost requires AGI.

I think the real error was not the damage score but the planning algorithm that directed it to drive down that alley and to continue through it.

I think we'll soon get to (if we're not there already) a form of level 2 driver aids or level 3 geofenced self-driving (highway only?) that's safer than the average human driver. I think we're a long way from self-driving cars that will assign a low damage score and drive over an empty cardboard box in an unmarked, unmapped private alley, and we may never get there. But that doesn't mean Waymo can't or shouldn't exist; it means they need to shut down the car and delegate to a human when they're stuck and not on public, mapped, confirmed-clear roads. Maybe that means it can't pick you up from the back of the Chick-fil-A parking lot or at the entrance to a mall that's an island in a quarter mile of private parking lots, and you have to go to the nearest parking spot on the actual road, but if the alternative is assigning damage scores to stuff in alleys, that's probably for the best.
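To make the policy I'm suggesting concrete, here's a minimal sketch in Python. Everything in it is hypothetical (the names `RoadContext`, `Obstacle`, the thresholds); it's not how Waymo's planner works, just the shape of the gate I'm arguing for: off public, mapped roads, don't score an obstacle and push through it, stop and hand off to a human.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    PROCEED = auto()
    STOP_AND_ESCALATE = auto()  # park safely, ask a remote human for help


@dataclass
class RoadContext:
    is_public_road: bool   # on a named, public right-of-way
    is_mapped: bool        # present in the prebuilt map


@dataclass
class Obstacle:
    damage_score: float              # estimated cost of driving through it
    classification_confidence: float # how sure the perception stack is


def plan_through_obstacle(ctx: RoadContext, obstacle: Obstacle) -> Action:
    # Off public, mapped roads: never "assign a damage score and proceed".
    if not (ctx.is_public_road and ctx.is_mapped):
        return Action.STOP_AND_ESCALATE

    # Even on mapped roads, only drive over something if the classifier is
    # confident about what it is AND the estimated damage is negligible.
    # (0.95 and 0.1 are made-up thresholds for illustration.)
    if obstacle.classification_confidence < 0.95 or obstacle.damage_score >= 0.1:
        return Action.STOP_AND_ESCALATE

    return Action.PROCEED
```

The point isn't the thresholds; it's that the unmapped-alley case never reaches the damage-score branch at all.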
This is the most interesting part, in my opinion:

> Waymo's recall was deployed by the company's engineers at the central depot where the vehicles return for regular maintenance and testing. It was not through an over-the-air software update, like some of Tesla's recent recalls.

I'd be interested to learn more about why the updates are manual, and also whether the map data is fully local to the vehicle. Tesla obviously does the polar opposite of this, and it seems to have had at least some degree of success, but Tesla's approach has always seemed to me like it would be subject to some bad failure modes.

How much data does this amount to? Gigs? Terabytes?

On the same note, I'm curious about what data gets pulled from the map versus from sensors. The car seems to have used map data instead of sensor data (unless I'm misunderstanding?). Whether there's a curb seems like exactly the sort of thing you could rely on sensors for, mostly because you already need to look for obstructions, which necessarily can't be in map data.
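For what it's worth, here's a toy illustration of that last point in Python. The structures and thresholds are entirely made up (nothing here reflects Waymo's actual stack): it just shows the idea of treating the map entry as a prior and letting confident live sensor evidence override it, since obstructions and construction changes can never be in the map.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class MapCell:
    has_curb: bool  # what the prebuilt map says about this spot


@dataclass
class SensorReading:
    curb_likelihood: float  # e.g. from lidar ground-profile estimation
    confidence: float       # how much to trust this frame's estimate


def curb_present(map_cell: Optional[MapCell],
                 sensor: Optional[SensorReading]) -> bool:
    # Confident live sensor evidence wins over the map prior.
    if sensor is not None and sensor.confidence > 0.8:
        return sensor.curb_likelihood > 0.5

    # Otherwise fall back to the map, if this area is mapped at all.
    if map_cell is not None:
        return map_cell.has_curb

    # Unmapped and unsure: assume the worst.
    return True
```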
"<i>The update corrects an error in the software that “assigned a low damage score” to the telephone pole."</i><p>What did it get classified as? What's a Waymo allowed to hit?
I assume Waymo is continuously improving the safety of their software. It's bizarre to classify two of these updates as "recalls" but not the others.