I've been in situations where I'm waiting to make a tricky left turn safely while people behind me literally yell at me to go faster. Now those same people have to contend with cars that can't be intimidated into making risky turns by the driver behind them yelling, honking, or tailgating. I hope it plays out like this:<p>1. Driver tailgates at an intersection or stop sign to try to intimidate the driverless car in front of him into going faster.<p>2. Driverless car ignores it and gets rear-ended.<p>3. Google helpfully provides the tailgater's insurance company with a copy of the 360-degree video, conclusively proving his fault.<p>4. Tailgater's insurance rates skyrocket after his insurer is forced to pay the maximum to repair the expensive driverless car.<p>5. Tailgater is forced to stop being a jackass.
I'm really curious whether this frustration is truly because of the limitations of the technology (which is totally possible) or because of some sort of cognitive bias, like confirmation bias. Would they feel the same frustration if the van had no decals or visible sensors? Are they more frustrated at the cars than at other human drivers? Community "anger" can be really perplexing; in one case, a community of people were complaining about ailments from a phone tower when the company had already turned off the tower six weeks prior [1].<p>[1] <a href="https://mybroadband.co.za/news/wireless/11099-massive-revelation-in-iburst-tower-battle.html" rel="nofollow">https://mybroadband.co.za/news/wireless/11099-massive-revela...</a>
> <i>More than a dozen locals told The Information that they hated the cars</i><p>I'm sure you could also find more than a dozen locals who hate other human drivers.
> The anecdotes highlight how challenging it can be for self-driving cars, which are programmed to drive conservatively, to master situations that human drivers can handle with relative ease, like merging or finding a gap in traffic to make a turn.<p>Human drivers are terrible at both of these. I regularly get stuck behind people who won't make a right turn even as a line of cars making the opposing left turn shields the very turn they're trying to make. I see a lot more promise in computer-driven cars, which can at least improve over time as a group.
I don't think I've ever heard anyone say they like conservative driving. "I would have risked that left turn, so you should have too. The person with the green light would have slowed down to avoid an accident, probably, and I would have gotten to the next red light 10 seconds sooner!"<p>You will also hear this when talking about less-capable drivers ("get off the road, grandma!") or less-capable vehicles ("I hate having to slow down for bikes and drive carefully around them").<p>And of course, nobody ever says "you're going 80, the speed limit is 55! Slow down!"<p>As a result, traffic accidents are the leading cause of death for people aged 8 to 24.
The US desperately needs self-driving cars. Desperately. I don't think most people realize how much, because they've never seen a city with good public transport where most people don't own or care about cars. But achieving good public transport in the US is impossible for cultural reasons.<p>So there's no public transport, and the most expensive areas fight to the death to keep population density low, because more people means more cars, which means more traffic, which means the whole place becomes terrible. End result: cost of living is too high and commutes are too long. Not exactly the way I'd like to live.
Waymo has a long-standing problem with being rear-ended at entry to an intersection. Read their California DMV accident reports. This is because their system, quite properly, insists on seeing the absence of cross traffic before entering an intersection. So, when the view to the side is obstructed, the vehicle will advance slowly into the intersection to get a better view, detect cross traffic, and stop.<p>There's one intersection in Mountain View where Google self-driving vehicles have logged two accidents of that type. There's a tree in the median. At human driver height for cars (but not for trucks) the cross street can be seen. At roof height, where the scanner is, the tree blocks a side view. So the vehicle has to advance past it to see cross traffic.<p>This just needs a convention. Perhaps blinking the brake lights rapidly in such situations.
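To make that behavior concrete, here's a minimal sketch of the creep-and-check loop, assuming a hypothetical Car interface; the method names (side_view_clear, creep, flash_brake_lights) and the thresholds are all illustrative inventions, not anything from Waymo's actual software:

    # Hypothetical sketch of the "creep until the view clears" behavior
    # described above; none of these names come from Waymo's software.
    from typing import Protocol

    CREEP_SPEED_MPS = 0.5  # assumed slow advance while the side view is blocked
    MAX_CREEP_M = 2.0      # assumed limit on how far to edge past the stop line

    class Car(Protocol):
        """Stands in for the real vehicle controller."""
        def side_view_clear(self) -> bool: ...
        def safe_gap_in_cross_traffic(self) -> bool: ...
        def creep(self, speed_mps: float) -> float: ...  # returns meters advanced
        def stop(self) -> None: ...
        def flash_brake_lights(self) -> None: ...        # the proposed convention
        def proceed(self) -> None: ...

    def enter_intersection(car: Car) -> None:
        advanced = 0.0
        while True:
            if car.side_view_clear():
                if car.safe_gap_in_cross_traffic():
                    car.proceed()  # clear view and a safe gap: commit
                    return
                # Cross traffic spotted mid-creep: stop. This is the sudden
                # halt that a tailgating human driver does not anticipate.
                car.stop()
                car.flash_brake_lights()
            elif advanced < MAX_CREEP_M:
                # Side view obstructed (e.g. the tree in the median at
                # sensor height): edge forward to see past it.
                advanced += car.creep(CREEP_SPEED_MPS)
            else:
                car.stop()  # can't see and can't creep further: hold

The stop() mid-creep is exactly the moment described in the accident reports, and the flash_brake_lights() call marks where a signalling convention like the one proposed above could slot in.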
I think the degree to which driving is a social behavior is underappreciated.<p>By social behavior, I mean we learn to read the "body language" of the drivers (cars) around us, pedestrians, bikers, etc. Have you ever thought to yourself, "that dumbass is just about to pull into my lane" or something like that? That's you being a social animal, making a social judgment about a fellow animal.<p>This is ancient stuff. Nature is full of animals who have no spoken language, and even no verbalization at all, who nonetheless operate together physically to great effect.<p>Think of a herd of buffalo, or a school of fish, or a murmuration of starlings. Now, you want to put a robot in there and have it keep up with the animals? That's a hard problem.<p>I would argue that we work the same way when driving, so a robot trying to operate in human traffic is going to face the same hard problem.<p>Instead of tackling that problem, self-driving cars fall back on the written rules of the road. But it's crucial to understand that the <i>rules were written by humans for interpretation by humans</i>. They are not a foolproof algorithm for safe robot driving, and they were never intended to be one.<p>Sure, you can argue that conservative driving is safer. I'd argue that that's a cop-out that simply delegates the argument to how we define "conservative." It's conservative to drive 5 mph in a 25 mph zone; it's also totally legal. But is it good driving?<p>We can design a conservative robot to put into a school of fish. The real fish will avoid it and leave it behind. The robot fish is safe, in that there are no collisions. But is it fulfilling the mission of being a good fish?
Once self-driving cars come to Boston, locals will just figure out how to get around them. If you honk and ride close on an AV's tail and it pulls over, everyone will do that. It's a guarantee.
> <i>One woman said that she almost hit one of the company's minivans because it suddenly stopped while trying to make a right turn, while another man said that he gets so frustrated waiting for the cars to cross the intersection that he has illegally driven around them.</i><p>I.e., obnoxiously bad human drivers hate impeccably good drivers that happen to be robots.<p>Suddenly stopped while trying to make a right turn? I'm not convinced by that without evidence that it was not a required stop, like turning onto a road that has right-of-way.<p>Passing cars at intersections is not only illegal but also a dangerous dickhead move. I don't care what a dangerous dickhead thinks about self-driving cars or any other cars; please don't repeat it and convey it as news.
When I honk at another car it is always to correct bad behavior of some sort, usually when someone doesn't notice that a light has turned green. But can these cars learn from being honked at? If they are doing 50 in a 90 [1], clearly because they missed a sign, is there anything we can do to communicate to them that they are making a mistake?<p>I'm still wondering if they are capable of handling a police stop. Can they tell that the cop wants them to pull over? Can they tell that the firetruck wants them to drive through a red light to let it by?<p>[1] Just had to honk at someone for this. The highway drops from 90 to 50 for road work, but he missed the "resume speed" sign.
The worst drivers are the ones who go 20 under the posted limit and are overly cautious.<p>It sounds odd, but driving that is too cautious may feel fine to the driver doing it, while it makes everyone around them go out of their minds.<p>Go with the flow.
> The company has previously said that it plans to launch a commercial self-driving taxi service before the end of the year, but that its service will still include a Waymo employee in each car as a "chaperone."<p>Is this new information?
I think there are still many aspects that automakers need to perfect before these cars can operate freely on public roads. I just read an article at <a href="https://www.lemberglaw.com/self-driving-autonomous-car-accident-injury-lawyers-attorneys/" rel="nofollow">https://www.lemberglaw.com/self-driving-autonomous-car-accid...</a> that talks about this. Hopefully car companies will take it seriously.
A thought just occurred to me: I wonder if self-driving cars should have L plates? They are, after all, learning how to drive. It might change the way other drivers behave around them as well, since people in general give learners more leeway for their errors.
Do these cars ever go at the speed of traffic, even if it's above the speed limit? If everyone is driving 5-10 mph above the speed limit, it actually seems more dangerous for the Waymo car to drive at exactly the speed limit.