<i>> In July, a Waymo in Tempe, Arizona, braked to avoid hitting a downed branch, leading to a three-car pileup.</i><p><i>> In August, a Waymo at an intersection “began to proceed forward” but then “slowed to a stop” and was hit from behind by an SUV.</i><p><i>> In October, a Waymo vehicle in Chandler, Arizona, was traveling in the left lane when it detected another vehicle approaching from behind at high speed. The Waymo tried to accelerate to avoid a collision but got hit from behind.</i><p>It’s worth noting that all 3 of these incidents involve a Waymo getting hit from behind, which is the other driver’s fault even if the Waymo acted “unexpectedly”. This is very very good news for them.
Disclosure: I work at Waymo, but not on the Safety Research team.<p>The Ars article linked to the Waymo blog post [1], but the underlying paper is at [2] via waymo.com/safety . A lot of folks are assuming this wasn't corrected for location or surface streets, but all of the articles do attempt to mention that. (it's easier to miss in the Ars coverage, but it's there). The paper is naturally more thorough on this, but there's a simple diagram in the blog post, too.<p>[1] <a href="https://waymo.com/blog/2023/12/waymo-significantly-outperforms.html" rel="nofollow noreferrer">https://waymo.com/blog/2023/12/waymo-significantly-outperfor...</a><p>[2] <a href="https://assets.ctfassets.net/e6t5diu0txbw/54ngcIlGK4EZnUapYvAjyf/7a5b30a670350cc1d85c9d07ca282b0c/Comparison_of_Waymo_Rider_Only_Crash_Data_to_Human_Benchmarks_at_7_1_Million_Miles_arxiv.pdf" rel="nofollow noreferrer">https://assets.ctfassets.net/e6t5diu0txbw/54ngcIlGK4EZnUapYv...</a>
About ten years ago I made a prediction:<p>Self-driving technology will overtake average human ability with regard to safety within a decade, but the biggest hurdle will be public acceptance. The AI will not make the same kind of mistakes humans make. So while the aggregate number of accidents will be (likely much) lower without a human at the wheel, the AI will make deadly mistakes that no human would make, and this will terrify the public. A crash that makes no sense to our minds will always be scarier than an intuitively predictable one. The only way self-driving tech will ever succeed is if the AI can be limited to the same kinds of mistakes humans make, just fewer of them, and that's a VERY hard technical nut to crack that I do not believe will be solved anytime soon.<p>That said, I still believe that the ubiquity of cars is inherently a problem, human operator or no. If we put more effort into self-driving buses and autonomous trains—which have regular schedules, routes, and predictable speeds—I think we would see much greater dividends on our investment and far fewer "unintuitive" errors. Our collective fixation on cars blinds us as a society to this option, unfortunately. More cars just clog up the road even more, demand more parking, and otherwise monopolize land that could be put to more productive use. More idling/circling driverless cars add to the blight rather than relieving it. We need to transport more people between points at higher density, not lower, and cars are the lowest-density transportation option available.
I know Waymo is investing a lot into PR that makes them seem successful, but they are the only company I actually see on track to deliver autonomous cars (on existing infrastructure).<p>I'm still a bit torn on whether autonomous cars are a good thing once you consider all the second and third order effects (even more cars on the streets, less investment into better modes of transport, and traffic will get a lot worse once people are ok with sitting in bad traffic and watching Netflix). But I have to applaud Waymo for their great execution on a very difficult problem.
I can believe it. I rode in a Waymo for the first time a couple days ago and it was incredible. No problems with the rain or bad San Francisco drivers. It was a really smooth ride and I felt extremely safe.
Like others have said, Waymo One in San Francisco is great. Smooth, confident driving. Good situational awareness (several times when it made an unexpected move, only later did I realize there was a person or a car it was trying to avoid).<p>Looking forward to it expanding its coverage to SFO; that will be a game-changer.<p>Still not sure of its economics, though. Its current price is on par with Uber Comfort / a little over Uber X. How can that support the R&D or future capital-heavy expansion?
> Waymo currently operates commercial robotaxi services in Phoenix, Arizona and San Francisco,<p>Basically straight lines and 365 days of sun, now send them to Europe small towns/mountain roads/&c.<p><a href="https://www.youtube.com/watch?v=RIyEg35Stbo" rel="nofollow noreferrer">https://www.youtube.com/watch?v=RIyEg35Stbo</a><p><a href="https://www.youtube.com/watch?v=P7wphiL3vbo" rel="nofollow noreferrer">https://www.youtube.com/watch?v=P7wphiL3vbo</a><p><a href="https://www.youtube.com/watch?v=O1ZaoRu7okU" rel="nofollow noreferrer">https://www.youtube.com/watch?v=O1ZaoRu7okU</a>
I wish all of these benchmarks pitted autonomous cars against a somewhat comparable user group – say, professional taxi drivers – rather than just a general sampling of the population. The majority of people drive most of their miles during rush hour, when the chances of a crash are the highest, while Waymo cars operate all day/night. Plus I'm sure that first-time drivers, drunk drivers, people out on illegal joyrides and other such extremes drag the numbers down enough that saying "I'm in the top 40%" really isn't all that meaningful anymore.<p>What I'm interested in knowing is how these cars drive compared to the average <i>competent</i> driver in the exact same environment.
The only way this has an impact on road accident/injury/fatality rates at any meaningful scale is if millions of people switch from routine personal-car based transportation to shared transportation, even single-occupancy shared transportation like Waymo.<p>I don't see millions running out and buying a real self-driving car kitted with a spinning lidar "hat" and visible radar transmitters sticking out everywhere, even if doing so meant safer roads for everyone.<p>What people in the market actually want is what Tesla has been selling (however fraudulently): a car that looks/performs very nice and <i>claims</i> full-self-driving capability, not a goofy looking car that self-drives very well in particular locations and use cases. Cars are about personal identity and power at least as much as they are about functional transportation.<p>I'd like to be wrong about all that, and would like a future where swarms of electric self-driving buses that route-optimize based on demand pick up people very close to where they are. But I also realize that the reptilian brains of consumers tend to decide how these things eventually pan out, and not the solutions that are optimized for efficiency and safety.
As someone that has taken quite a few Waymo trips now since October 2023, I am continually impressed with how it handles the crazy here in San Francisco, from odd/narrow streets, to bad drivers doing stupid things, to overall safety with pedestrians doing all sorts of non-standard behaviors like crossing randomly or pausing at odd points in crosswalks. Also, I've been in a bunch of situations in a Waymo where other drivers are messing with position to try and freak the Waymo out, and every time, it did a great job. I've never been in a Cruise, but I can't deny Waymo has been a great experience for me in SF over 20 or so trips.<p>Here is a video of Waymo going through the Broadway Tunnel in SF back in Oct 2023 to give you a sense of it. >> <a href="https://mer.gy/broadwaytunnelwaymo" rel="nofollow noreferrer">https://mer.gy/broadwaytunnelwaymo</a>
>The new data comes at a crucial time for the self-driving industry.<p>Which is, by the way, a good reason to be skeptical of it. I remember talking to someone who worked with BMW on their self driving a long time ago, and their take on Tesla's self driving effort was (a) it's fine for Tesla to have a bad reputation for safety, but BMW simply can't choose to get a bad reputation for safety; they sell far too many non-autonomous cars, and (b) it's actually not fine for Tesla (and others) to be rushing ahead with self-driving, because they <i>will</i> kill people, and they're just as likely to kill the whole self-driving industry at the same time.<p>I have no doubt that even if the data looked terrible, Waymo would find a way to spin it to look safe. I also have no doubt that even if the data is good, it's not indicative of self-driving being safer in the average situation.
As a single anecdote, I’ve taken 12 Waymo rides over the past 3 months, and I’d put them at ~90th percentile with respect to human Uber/Lyft drivers in terms of smoothness/quality of reaction to the various hazards of SF streets.<p>(Over ~4 Cruise rides, I’d put them closer to median)
FWIW, I've used them quite a few times here in Phoenix --- overall a very positive experience. The Waymo car used to be too cautious and would take weird routes, but now they drive appropriately aggressively and the route selection is much better.
To me safety is mostly about the risk of injury and death to me. And mostly about death.<p>Humans die from driving only once every 100 million miles on average.<p>So until there is a comparison at this scale, to me it’s a very incomplete picture of safety to say that you are outperforming humans.<p>If driving my own car means colliding 100x as often but I expect to die 2x less often, I would consider self driving to be much much much more unsafe. There is really no way for me to understand the risk of fatalities from a sample of only 7 million miles, since either humans or a system with 2x the fatality rate of humans would both be expected to have 0 fatalities at this scale.<p>Given we are at 7m miles, hopefully this comparison is coming soon and I will be much more convinced.
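A quick Poisson back-of-envelope illustrates this point (a sketch, assuming the oft-cited ~1 fatality per 100 million human-driven miles; the exact benchmark rate is my assumption, not from the paper): at 7.1 million miles, both a human-level system and one twice as deadly would most likely record zero fatalities, so the sample genuinely cannot distinguish them.

```python
import math

MILES = 7.1e6
HUMAN_RATE = 1 / 100e6  # assumed: ~1 fatality per 100M human-driven miles

for label, rate in [("human-level", HUMAN_RATE), ("2x human", 2 * HUMAN_RATE)]:
    lam = MILES * rate        # expected fatalities over the sample
    p_zero = math.exp(-lam)   # Poisson probability of observing zero
    print(f"{label}: expected {lam:.3f} fatalities, "
          f"P(zero observed) = {p_zero:.1%}")
# human-level: expected 0.071 fatalities, P(zero observed) = 93.1%
# 2x human:    expected 0.142 fatalities, P(zero observed) = 86.8%
```

Either way, zero fatalities is the overwhelmingly likely observation, which is exactly the commenter's point about needing far more miles before fatality rates become informative.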
I think this is a useful and impressive study - I haven't read all 40 pages, neither have you :) I did do some good-faith skimming. Assuming Waymo didn't falsify their data (they didn't), this makes me feel comfortable having Waymo in SF and Phoenix. I think it's clearly safer than an Uber. But some caveats:<p>- The major caveat is that Waymo is not being directly compared against <i>sober</i> humans driving <i>lawfully.</i>[1] The reason why this caveat is so important is that technology which makes it impossible for humans to exceed a posted speed limit might be overall much safer than replacing human drivers with autonomous drivers. Uber isn't more dangerous than Waymo because humans are incompetent, it's because human drivers obey orders from impatient riders and Waymo currently does not. This is a UI choice, not an AI advancement.<p>- More specifically, <i>lawful</i> driving is an important caveat because Tesla Autopilot had <i>two different settings for driving unlawfully</i>, according to the users' own sense of personal risk. An AV manufacturer who advertises "AI-assisted speeding" will almost certainly find a lot of customers, even if it's under the table. People don't speed and run red lights because they're too stupid to understand why it's dangerous: they do it because they're reckless and selfish. AI won't stop that, only regulation will.<p>- Another caveat is that Waymo was trained on human-dominated streets. Waymo being safer in a sea of human vehicles does not actually translate to Waymo being safer in a sea of Waymos. I think this is a low-probability risk but it's hardly a simple question: I believe Waymo has had issues where several AVs occupied the same street after an event and blocked traffic because they couldn't decide what to do - they were waiting on each other to behave like a human.
But again, the risk seems like gridlock, not property damage or injury.<p>- And a minor but still important caveat is that SF and Phoenix have modern linear grids which have been mapped to death by AV manufacturers. As a Boston resident I am still holding my breath about their performance here :)<p>[1] Not because of anything insidious, it's just a granularity that both the analysis and the data struggle to capture.
IMHO we need a standardised benchmark for this stuff.<p>You can't claim better than human when humans are driving under different conditions.<p>I looked around to try to find the actual data but it's just marketing materials. Are these miles on the same roads under the same traffic and weather conditions?
These measurements should be standardized and independent, there is a large incentive for these companies to get creative with their record keeping and methodology.
A year ago, Elon Musk was saying Tesla was going to have a million robotaxis on the road by now. It wasn't even just him; investors were predicting this too.<p>But it looks like Google/Waymo is the only one with reliable self driving technology.<p>If they don't give up like Google typically does, this could actually be a huge business for them.<p>Everyone I know who has used the Waymo service had good things to say about it, and we are just at the beginning.<p>The FSD problem will become easier as you scale up, because these AIs can easily communicate with all the cars around them, unlike human drivers.
> this study includes all Waymo crashes, regardless of the Waymo vehicle’s role in the crash, and with any amount of property damage<p>I really like this because I think this happens a lot with human drivers. There are many instances where a potential crash was avoided by me or the other drivers. I think that's crucial to autonomous driving: not only should they avoid making mistakes themselves, they should also compensate for other drivers' mistakes. It's all very good to say "we did everything we were supposed to" after an incident, but it's even better to never have the incident at all. An AI that can react well to the unexpected would be a huge milestone.
Here in Finland we already have some autonomous public transportation buses running in my neighborhood. They're great! Can't wait for the day where they displace human drivers in all non-sporting arenas.
Except when you consider stopping in the road and preventing emergency vehicles from reaching emergency sites. And failing to respond to other emergency responder commands such as “move your tire off that pedestrian’s head”.
"Comparable" human benchmarks may be a bit misleading. Waymo drives far more safely and less assertively; on turns it does not push itself into traffic, and it waits longer...<p>It is still pretty impressive, but we should compare average speed and such.
Pretty neat! Worth keeping in mind though that a lot of that data came from Phoenix, a city that only gets 8 inches of rain a year and has fairly pristine, well-marked roads to match. In a way, Phoenix is the perfect initial testing ground for self-driving cars before you put them in an environment that more closely resembles the typical US city.<p>But a lot of that data also seems to come from San Francisco, so I have to admit I'm impressed.
I sometimes get annoyed at how we talk about self driving cars; we shouldn't expect them to have a perfect track record. There will always be situations where neither choice is a good one but something has to happen.<p>But to me, I often wonder what happens once this tech comes out and all of the self driving cars can theoretically communicate with each other. So if you need to brake suddenly, instead of just your car, it's a series of cars that can do it together, for example.<p>I have been wondering if there has been any talk about proposing a standard for this, since to me that is the true power of self driving cars, not just how they operate with other unpredictable drivers.
I find it a pretty optimistic take when someone claims success of a safety system after conducting a test that basically equals 600 people using it for an average year's worth of driving…<p>Is it enough to support the pretty extraordinary claim that a fully self-driving system that will then be sold in millions of vehicles is now perfectly safe?
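The "600 people" framing roughly checks out (a sketch; the ~12,000 miles per US driver per year figure is my assumed rough average, not something the commenter cited):

```python
waymo_miles = 7.1e6               # rider-only miles reported in the paper
annual_miles_per_driver = 12_000  # assumed: rough US average per driver

driver_years = waymo_miles / annual_miles_per_driver
print(f"~{driver_years:.0f} driver-years of exposure")  # ~592
```

So the fleet's entire mileage is comparable to what about six hundred typical drivers accumulate in a single year, which is why rare-event rates (like fatalities) are still statistically out of reach.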
The headline does sound impressive, but am I the only one that’s also disappointed? Their website says they’ve logged over 40M miles in total. That is an <i>insane</i> number and many times more than a person would ever drive in their entire lifetime, let alone in just 3 cities.<p>I’m imagining a fleet of cars driving in generally sunny weather cities of SF, LA, and Phoenix, driving every street thousands of times over. After all that, saying they perform better than a human just doesn’t wow me that much.<p>After 40M miles of testing, bring it to Chicago, Boston and NY and let us see how they do in the summer and then in winter. Are they going to do terribly at first because they need another 100M+ more miles of training?
A quote from Tyler Cowen I read this morning...<p>Tyler Cowen: Uncertainty should not paralyse you. Try to do your best, pursue maximum expected value, and just avoid the moral nervousness. Be a little Straussian about it. Like here's a rule, on average it's a good rule, we're all gonna follow it. Bravo, go on to the next thing. Be a builder.<p>Joe Walker: Get on with it?<p>Tyler Cowen: Yes. Because ultimately the nervous Nellies, they're not philosophically sophisticated, they're overindulging in their own neuroticism when you get right down to it. So it's not like there's some brute 'let's be a builder' view and then in contrast there's some deeper wisdom that the real philosophers pursue. It's: you be a builder or you're a nervous Nelly. Take your pick. I say be a builder.
Sadly, if true, it's a low bar. Too many people have horribly unsafe driving habits and contribute to the bulk of carnage on the road. Skill levels vary a <i>lot</i>, even for "professional" drivers. (I'm looking at you, semi driver, who tailgated me at 60 mph and had the gall to honk because I left a reasonable distance between my vehicle and the semi in front of me. This in traffic too heavy to do anything but go with the flow.)<p>Also, I'd find the statistics more believable if they were published by an unbiased third party.
I’ve done two dozen trips and I can vouch for this. It seemed completely safe and in control, with crisp decisive moves throughout all those rides. I’m not sure what they’re doing on the backend but this is it.
As optimistic as I am about self-driving, it seems like the “9s” model of reliability isn’t the right model here, because one “outage” could be someone being killed. The fallout from the Cruise car dragging someone shows how badly one accident can sully a company.<p>The other issue I see is that I believe Waymo doesn’t live-map the terrain in SF and Arizona, right? They are using intricately created maps, so my question is: what’s the leap for Waymo to operate with this success in, say, Pittsburgh?
When autonomous vehicles become super safe, will the size of the vehicles more accurately reflect how many people are in the vehicle? Cars that can carry 5 people when the average occupancy is 1.5 in the US seems wasteful just because people feel safer in big cars.<p>It seems like we should have a lot more small single occupant vehicles that are effectively caged motorcycles, but actually safe.
Comparing autonomous vehicles to the average driver is always going to be misleading.<p>It’s not fair to compare them to drunk drivers, or to the small portion of people (likely causing an outsized portion of accidents) who shouldn’t have a license.<p>I’d like to see these comparisons made against a typical good driver, who never drives impaired and obeys the law.
The folks that get indignant about even the marginal self-driving cars (Cruise, Tesla) vastly overestimate the abilities of human drivers.<p>Spend an hour watching this channel: <a href="https://www.youtube.com/@DashcamLessons" rel="nofollow noreferrer">https://www.youtube.com/@DashcamLessons</a><p>If you have a strong stomach, search youtube for "brutal and fatal".<p>Humans are shit drivers. Remember that when you hand the car keys over to your teenager.
So, how are they doing this when the public results for recognising pedestrians from images seem to be so rubbish? (Or at least, don't seem to have a sufficient number of nines precision for this purpose). Is it easier to recognise a pedestrian from LiDAR, or is everything just very precautionary?
What would happen if the only vehicles on the road were all self-driving? Wouldn't it be easier if all self-driving vehicles only had to deal with other self-driving vehicles, and hence fewer accidents?<p>Of course other objects such as pedestrians etc will always make things more complicated.
Actually, I don’t care about these kinds of “benchmarks”. I’d just ask Waymo to start running commercial trucks with goods only on closed roads first; that should turn a profit if it really works.
Yay! Maybe in a good Five years(TM), I, a blind person, will finally be able to drive! Um, be driven? Ride? Anyway, whatever they'll call it when one isn't actually driving a self-driving car.
Obviously an announcement like this from a company with plenty on the line is going to be as positive a report as is possible. That said, kudos to them for acknowledging several possible sources of bias in the writeup.<p>However, while differing vehicle types were mentioned as a source of variation, there was almost no indication that this factor was applied to the numbers. Also, my understanding is that this service does limit its coverage area, so I’m curious what sort of impact that has on the numbers.<p>One other interesting fact: they claim 7+ million miles on 700,000 trips. So the average trip is over 10 miles, which I found surprising, but perhaps I shouldn’t have, since most of the data is probably from the Phoenix area.
> The ODD [Operational Design Domain] does not include severe weather conditions, such as thick fog, heavy rain, or blowing sand but does include light rain or light fog. [1]<p>What is the crash rate for humans under good to light rain/fog conditions? This doesn’t seem to be comparing apples to apples.<p>[1] <a href="https://assets.ctfassets.net/e6t5diu0txbw/54ngcIlGK4EZnUapYvAjyf/7a5b30a670350cc1d85c9d07ca282b0c/Comparison_of_Waymo_Rider_Only_Crash_Data_to_Human_Benchmarks_at_7_1_Million_Miles_arxiv.pdf" rel="nofollow noreferrer">https://assets.ctfassets.net/e6t5diu0txbw/54ngcIlGK4EZnUapYv...</a>
A more fair comparison would be against human drivers who are constantly texting about where they are and what they are doing and seeing, since that's what the Waymo drivers are doing.<p>:)
I’ve taken a Waymo a half dozen times. It tends to go through residential neighborhoods instead of making a left across traffic. It felt like it went miles through the neighborhood at 15mph instead of on the main street at 45mph. Somehow I expect they failed to mention these types of “optimizations” in their report. Sure these neighborhood routes are safer and 1/3 the speed, but we don’t want to encourage an exponential increase in intra neighborhood traffic. Certainly not when my kids are playing in the road.
And all of those miles are in the arid southwest in places that don't have winter. They don't really extrapolate to the rest of the country or world. SF + Arizona is easy mode for autonomous driving.
How many 9's would that be?
Should it be miles/interventions or miles/incidents?<p>I would think a better scoring system should be based upon human interventions.
but won't be culpable in 99.999999% of fatalities.<p>tell us who on the programming team will act as tribute and you can have your mostly autonomous wealth consumption device.<p>remember, it's driving itself, by itself, that's wasting resources when public transportation would eliminate orders of magnitude more issues.
Do they compare this to the accident and fatality rates specifically where these cars were driving?<p>I think a lot of that was done in a completely flat and mostly empty suburb of Phoenix where the accident and death rates may be much much lower than the national average.
Is this meaningful?<p>OK they have a lower crash rate compared to humans. They also just stop in the middle of the street when they get confused and do nothing until someone remotes in to drive them.<p>I’m sure humans would do a lot better if every time they got unsure they just stopped and never moved again.<p>Waymo is clearly out in front by a few hundred miles, but touting this seems a little disingenuous to me.
Worth noting that this is only in 3 cities that don't really experience "weather" the same way most other cities do.<p>This is of course great news for people living in the desert, but let us know when they achieve this in Toronto or Boston year-round.
That's great. I'm glad they're good enough to drive in the southwest. But Waymo cars' performance in SF cannot be implicitly extrapolated to non-SF environments.<p>Try the same thing in a place with winter, on a normal suburban city road. The road edges and evolving swarm-defined lanes have little to do with the absolute GPS position of the unobscured lanes and edges. And <i>all</i> the road markings are often obscured for weeks at a time (or longer). The road surface snow <i>looks</i> just like the road edge snow. And it's a semi-permanent slippery surface.<p>There are a lot of challenges left in non-cherry-picked regions before autonomous driving can be said to "outperform comparable human benchmarks" without this qualifier.
I remember Cruise showing similar numbers, and then at some point they ran over a woman, lost their right to operate in the state of California, and GM cut 20% of Cruise's workforce.<p>I guess it's pointless to point out the WeWork level of creative reporting because the "if we can just save one life" people don't want to hear it. The reality outside of the engineering bubble is that self-driving is an expensive dud with a smaller potential market than initially hoped (major cities like NYC will not see self-driving anytime soon). The expectation that there just needs to be a little more time for development has run its course, and money for speculative bets is harder to come by (especially for something that's burned through so much cash already).
I've said it before, and I'll say it again: Waymo's approach to self driving is a fool's errand. The only reason it works is that they have mapped out cities with straightforward roads and sunny weather to millimeter precision, and have hardcoded vehicles to operate in those conditions.<p>Compare this to Tesla's AI-first approach, aggregating training data from millions of drivers across the US. There's no overhead to introducing Autopilot in a new city it's never been in before--it just "works," with no expensive 3D mapping required, by extrapolating cheaply acquired vision data from other cities.<p>Elon really got this right from the beginning, but Waymo's aggressive PR and marketing have pitted public opinion against Tesla, as is very clearly evident even in these HN comments.
The problem is that it's not a person driving, it's a vehicle that's property of a company. If that vehicle kills someone, the company is responsible. Who goes to prison? The company owners? CEO? Vehicle engineer(s)?<p>update: I see we have some Waymo engineers in the house!!!!!!! (which is why I'm getting downvoted)
Forgive me for this, but all self-driving cars do is act as the default that postpones and diffuses support for scalable, long-term (also driverless) infrastructure, with the false promise that self-driving cars will somehow solve all our transportation needs.<p>This technology will <i>increase</i> the issues we have as a society around a car-based transportation system, which is simultaneously expensive to maintain for low-density environments and almost impossible to scale for high-density environments.<p>Car transport systems have no true limit on capacity, so as the efficiency of the transportation system increases, so will demand and density.<p>If everyone changes to another transportation system powered by Waymo and others, what’s to stop a large number of people moving to self-driving cars, increasing the number on the road, and causing gridlock again, meaning we have to massively expand car-centric infrastructure all over again? Isn’t that what history has shown us will happen?<p>Why do we keep supporting the most inefficient and expensive form of transport ever?<p>Downvote if you must, but it is a warranted and substantive argument worth considering, even if you think it’s not, or that this isn’t the appropriate post, or whatever criticisms you wish to levy (or, just as likely, you work on self-driving systems). It’s always amazing to me how hard people will defend the status quo while calling themselves “technology innovators.”