"Truck reverses into stationary object" is not really news.<p>I think there are too many possible risks of programming the vehicle to do anything except stop dead in an emergency situation - you rapidly get into very complex programming with all kinds of failure modes - e.g. what if the problem was a faulty sensor? The vehicle might try to avoid a non-existent threat by crashing into something else. Stopping is the safe thing to do. If you want to react, you rapidly have to make lots of moral decisions like (The Trolley Problem).<p>Briefly sounding a horn in an unexpected emergency stop situation is probably a good idea, particularly if it's a white noise type thing, rather than a siren, so that other road users can localise it's source quickly.
It has no horn?!

"What Makes for a Street Legal Vehicle?"

> Horn – It may not seem the most important piece of safety equipment, and many big cities even limit how it can be used, but to be street legal every vehicle must have a horn that is audible for at least 200 feet. The horn can generally be any note or sound (even ones that play musical tunes are usually permitted), so long as the minimum volume requirements are met.

Ref: https://www.hg.org/article.asp?id=31563
Ugh. This really worries me - not because I’m afraid of driverless cars but because this is the kind of “news” headline that will get anti-driverless-car jerks all up in a righteous tizzy.

Title should be NO_TITLE because “doofus driving truck backs into something” isn’t news.
This feels like a *really* tricky question, actually -- what to do when a moving object is headed towards a stopped self-driving car?

It's easy to say the car should be smart enough to move -- but what if, as it moves in one direction, the object (like a truck trying to avoid the car) suddenly swerves in that direction too? Then does the car become responsible for the collision?

And of course, it feels like there could be a real-world version of the trolley problem [1] -- what if there are 5 occupants in the vehicle who will be killed by an oncoming truck, but in the only direction where it can move out of the way, it will have to run over a single pedestrian?

Glad I'm not the one having to make these kinds of programming decisions.

[1] https://en.wikipedia.org/wiki/Trolley_problem
I'd say the shuttle's insurer should be partly responsible here. The Engadget article seems to show the shuttle stopping in the truck's blind spot.[0]

A reasonable truck driver would assume the driver of the other car would back up a little bit. But in this case, the car stopped where the truck couldn't see whether it was backing up or not, and wasn't programmed to understand the truck's movements at this angle.

So, the software was responsible for:
1. Stopping too close.
2. Stopping in the blind spot of a truck.
3. Having no horn.
4. Not backing up, and not understanding a fairly common truck maneuver. (A rough sketch of what such a reaction policy might look like is below.)

[0] https://www.engadget.com/2017/11/09/las-vegas-self-driving-shuttle-bus-crash/
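For point 4, a minimal reaction policy for a stopped shuttle might look something like this - purely a sketch with invented names and thresholds; a real planner would obviously be far more involved:

    # Hypothetical sketch: if something large is slowly closing in on a
    # stopped shuttle, honk first, then back away if the space behind is clear.
    REACT_DISTANCE_M = 5.0  # start reacting when the object is this close
    RETREAT_STEP_M = 3.0    # how far to back up per step

    def react_to_approaching_object(object_distance_m, object_closing,
                                    rear_clearance_m):
        """Return an ordered list of actions for the stopped vehicle."""
        if not object_closing or object_distance_m > REACT_DISTANCE_M:
            return ["hold_position"]
        actions = ["sound_horn"]  # what an attentive human driver would do
        if rear_clearance_m > RETREAT_STEP_M:
            # get out of the blind spot instead of sitting in it
            actions.append(("reverse", RETREAT_STEP_M))
        return actions

    print(react_to_approaching_object(3.0, True, 10.0))
    # ['sound_horn', ('reverse', 3.0)]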
> Now, it must be said that technically the robo-car was not at fault. It was struck by a semi that was backing up, and really just grazed — none of the passengers was hurt.

I get that it wasn't the driverless car's fault, but this brings up an important use case that driverless cars currently don't seem to be able to handle.

In an all-human situation, what would've happened is that the parked car (if anyone were inside it) would honk at the vehicle trying to back into it, or the occupant would open the door and yell at the person backing up, and the accident would be avoided.

The driverless car doesn't (or didn't) honk even if it does detect something backing up into it. Hence the accident.
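The trigger itself is trivial to express. A toy time-to-collision check for a stationary vehicle might look like this - the threshold and names are made up here, not any vendor's actual logic:

    # Toy time-to-collision check: honk when a tracked object is closing
    # and predicted to reach us within a few seconds. Threshold is arbitrary.
    HONK_TTC_S = 4.0

    def should_honk(distance_m, closing_speed_ms):
        """True if the approaching object would reach us within HONK_TTC_S."""
        if closing_speed_ms <= 0:  # not approaching
            return False
        return distance_m / closing_speed_ms < HONK_TTC_S

    # A truck 6 m away reversing at 1 m/s (TTC 6 s): no honk yet.
    assert not should_honk(6.0, 1.0)
    # At 3 m and 1 m/s (TTC 3 s): honk.
    assert should_honk(3.0, 1.0)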
I fully expect every death caused by a self-driving car to make national headlines in the next few years. Hysteria, whether justified or not, will set in and legislation will pass banning autonomous vehicles in the US.

Self-driving cars may have a faster reaction time, but they will never reach the level of human awareness of their surroundings while driving.

Let's see a self-driving car navigate through a construction zone, watch for instructions from a police officer who is directing traffic, or stop when kids are playing baseball in a yard and the ball rolls across the street. Answering "well, they'll have that capability someday" isn't a very compelling answer. Truly self-driving cars depend on technology that simply hasn't been invented yet.

How nice it will be sitting behind a fleet of self-driving cars dragging their asses down the highway at exactly the speed limit, or slamming on the brakes when a leaf flies in front of the sensors.

On top of that, do you think masses of people will stay silent while losing their jobs because these robot overlords are taking the wheel?

Source: I work for a self-driving car startup.
Surely something as basic as typical traffic patterns in a town has been simulated? A vehicle backing toward you is about as basic and mundane an everyday traffic event as you can get.

It's in self-driving proponents' own best interest to have stringent standards, because if the public loses faith, it's going to be an uphill battle.

Simply demonizing human drivers and hand-waving away errors is too self-serving to work.
> A City of Las Vegas representative issued a statement that the shuttle “did what it was supposed to do, in that its sensors registered the truck and the shuttle stopped to avoid the accident.” It also claims, lamely, that “Had the truck had the same sensing equipment that the shuttle has the accident would have been avoided.”

Note a subtle shift, with government now shilling for driverless vehicles. That's the first time I've seen it; I suspect it won't be the last.
These crashes seem to be caused by the "weird" way self-driving machines behave compared to humans.

Perhaps we need some sort of legally mandated, easily visible placard (like the "student driver" placard) to let people around these vehicles know to expect different behavior than they would from ordinary drivers.
So the shuttle stayed still while a truck backed into it.

Apparently, simply having the ability to stop does not make a self-driving car better than a human. It should evade, or at least honk.
A semi truck backed into it.

If anything, this just highlights to me how much commercial vehicles need self-driving tech (I guess I, lamely, agree with the mayor).

My office used to be in an industrial district. The trucks there scared me. They absolutely did not follow the rules of the road, and it was dangerous for everybody around them.

Stuff like this - just backing up and expecting everybody to move out of the way, or taking a turn too tight and expecting everybody at the light to back up - was an almost daily occurrence.

Yeah, self-driving cars need to account for this, but the bigger problem IMHO is getting those trucks and their drivers either off the road or in compliance with driving laws.
Hard to judge this without seeing it. A human driver might have been able to anticipate what the truck was eventually going to do sooner, so as to find a better place to stop. A human driver understands the difference between a truck backing up blind and other vehicles.