A pick-up truck being towed crooked and backwards; both Waymo vehicles misread the situation in the same way.<p>Autonomous vehicles have various redundant systems built in that can take priority and override false positives.<p>I was previously under the impression that one of the really important reasons for lidar is that it gets you closer to ground truth about whether something is a solid object, and where that object is relative to the vehicle, regardless of what the classifier thinks it is seeing.<p>So did the lidar fail to detect the solid object, was it de-prioritized, or was it simply not available as a fallback?<p>Presumably radar and proximity sensors were also involved. What were they doing?<p>This is a fascinating edge case, and I hope to hear the real reason for the two incidents.
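To illustrate the kind of redundancy I had in mind, here is a minimal sketch of a geometry-only override check. Everything in it is invented for illustration (the names, the thresholds, the 2-D simplification); it is not Waymo's actual architecture:

```python
from dataclasses import dataclass

@dataclass
class LidarReturn:
    distance_m: float   # range to the reflecting surface
    bearing_deg: float  # angle of the return relative to vehicle heading

def safe_to_proceed(planned_bearings_deg: list[float],
                    returns: list[LidarReturn],
                    min_clearance_m: float = 5.0,
                    corridor_half_width_deg: float = 2.0) -> bool:
    """Geometry-only override: if any solid lidar return sits inside the
    planned driving corridor and closer than the clearance threshold,
    refuse to proceed -- regardless of what the classifier says it is."""
    for r in returns:
        in_corridor = any(abs(r.bearing_deg - b) <= corridor_half_width_deg
                          for b in planned_bearings_deg)
        if in_corridor and r.distance_m < min_clearance_m:
            return False  # something solid is too close: brake / hand off
    return True
```

The point is that a check like this never asks what the object is, only whether lidar says something solid occupies the planned corridor.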
Link should go to the blog post not CNN.<p><a href="https://waymo.com/blog/2024/02/voluntary-recall-of-our-previous-software/" rel="nofollow">https://waymo.com/blog/2024/02/voluntary-recall-of-our-previous-software/</a>
One of the best things I've learnt recently is how to apply to my own teams the zero-blame, process-improvement approach that (many) air safety regulators take.<p>I'd sat through 'five whys'-style postmortems before, but it was reading air safety investigation reports that finally got me to understand the approach and make it a useful part of how we get better at our jobs.<p>By comparison, the way we investigate and respond to self-driving safety incidents still seems very primitive. Why is that?
Sounds like they were relying solely on their neural-network path prediction, which failed when the truck was dragged at an odd angle.<p>A simple lidar moving-object segmentation, which doesn't even know what it's looking at but can always spit out reasonable path predictions, would probably have saved them.<p>I think Mobileye is doing something like this, but they release so little information, and what they do release is so full of marketing bullshit, that it's hard to know exactly what they're working on.
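To sketch what I mean, here's a toy version of class-agnostic segmentation plus path prediction. This is purely illustrative (the parameters and the 2-D framing are made up, and it's certainly not Mobileye's actual pipeline):

```python
import numpy as np
from sklearn.cluster import DBSCAN

def segment_objects(points_xy: np.ndarray) -> list[np.ndarray]:
    """Cluster a 2-D lidar scan (ground plane already removed) into
    objects without any notion of what each object *is*."""
    labels = DBSCAN(eps=0.7, min_samples=8).fit_predict(points_xy)
    return [points_xy[labels == k] for k in set(labels) if k != -1]

def predict_path(centroid_prev: np.ndarray, centroid_now: np.ndarray,
                 dt: float, horizon_s: float = 3.0) -> np.ndarray:
    """Constant-velocity extrapolation from two successive cluster
    centroids: it doesn't matter whether the object is a truck, a tow
    rig, or a truck being dragged backwards at an odd angle."""
    velocity = (centroid_now - centroid_prev) / dt
    steps = np.arange(0.5, horizon_s + 0.5, 0.5)
    return centroid_now + np.outer(steps, velocity)
```

In practice you'd also need to associate clusters between frames (e.g. nearest-centroid matching) and smooth the velocity estimate, but nothing in the loop depends on recognizing the object.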
I wish there were a picture of the strange towing configuration. I wonder if I would be confused as well, although my guess is that I'd read the situation correctly.
This is what people don't appreciate when quoting those statistics about how self-driving cars are safer than humans: when a human driver causes an accident, that particular person did something wrong. When a self-driving car mishandles a situation, it's a much bigger issue, because all the self-driving cars run the same software.
This is my biggest fear with self-driving cars: correlated failures. As a society we are extremely good at dealing with independent accidents. We can calculate very precisely how many people will die in traffic in a given year, we can account for it, we can buy insurance, and we can decide exactly how much we are willing to spend to save a life on the margin.<p>But if everything is fine, everything is fine, everything is fine, and then all hell breaks loose? We are not nearly as good at dealing with that.
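A back-of-the-envelope simulation makes the point; all the numbers here are invented, chosen only so that both models have the same expected accident count:

```python
import numpy as np

rng = np.random.default_rng(0)
N_CARS, YEARS = 1_000_000, 10_000
P_FAIL = 1e-4              # per-car annual accident probability (invented)
P_BUG = 0.01               # annual chance a fleet-wide bug fires (invented)
HIT_FRAC = P_FAIL / P_BUG  # chosen so both models share the same mean

# Human-driver model: a million independent coin flips per year.
independent = rng.binomial(N_CARS, P_FAIL, size=YEARS)

# Shared-software model: nothing most years, a burst when the bug fires.
correlated = np.where(rng.random(YEARS) < P_BUG,
                      int(N_CARS * HIT_FRAC), 0)

print(f"independent: mean={independent.mean():.0f}, std={independent.std():.0f}")
print(f"correlated:  mean={correlated.mean():.0f}, std={correlated.std():.0f}")
```

Same mean (~100 accidents/year), but roughly 100x the standard deviation: insurance and actuarial planning work well for the first distribution and badly for the second.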
Do we have a picture of the truck? I'm having difficulty imagining it given that surely the tow truck would want the towed vehicle in-line to make driving go smoothly?
[Recycled from an older submission] Well, I feel kinda vindicated by this news, after previously noting:<p>> People worry that ways and times [self-driving cars] are unsafe (separate from overall rates) will be unusual, less-predictable, or involve a novel risk-profile.<p>In this example, a secretly cursed vehicle configuration is not something we normally think of as a risk factor with human drivers.<p>_______<p>As an exaggerated thought experiment, imagine that autonomous driving achieves a miraculous reduction in the overall accident/injury rate, down to just 10% of when humans were in charge... However, of the accidents that still happen, half are spooky events where every car on the road targets the same victim for no discernible reason.<p>From the perspective of short-term utilitarianism that's an unqualified success, but it's easy to see why it would be a cause for concern that could block adoption.