My brother lives in the neighborhood where this public beta testing is occurring. Folks, let me tell you from first-hand experience: these cars are terrible drivers. They are extremely hesitant, and because of this they cause traffic backups and delays in situations where safety is compromised. You'd better hope you are not behind one when it performs a left-hand turn onto a busy street, because you are probably going to be waiting a while. They stop in seemingly random situations where human drivers would never stop, like for a bicyclist standing way off the side of the road (talking to someone) with his bike pointing towards traffic. People have told me they've been in the middle of a turn behind a Waymo car when all of a sudden the car just stops for no apparent reason, leaving them stranded in the middle of the intersection as the light turns red. The default reaction for a Waymo car under any sort of ambiguity seems to be to just stop (literally) until the ambiguity resolves itself. Unfortunately this causes considerable confusion and traffic backups for human drivers.<p>I understand why Waymo is proceeding cautiously. But you really get the impression that the technology has a long way to go. People in Chandler are pissed because this fleet of cars is a huge nuisance and is in fact less safe for human drivers who are sharing the road. I have no doubt that autonomous vehicles will be safer than human drivers someday, but that day is definitely not today.
I love this quote:<p><i>“People are lashing out justifiably,” said Douglas Rushkoff, a media theorist at City University of New York and author of the book “Throwing Rocks at the Google Bus.” He likened driverless cars to robotic incarnations of scabs — workers who refuse to join strikes or who take the place of those on strike.<p>“There's a growing sense that the giant corporations honing driverless technologies do not have our best interests at heart,” Mr. Rushkoff said. “Just think about the humans inside these vehicles, who are essentially training the artificial intelligence that will replace them.”</i><p>Why is it that for every act of random vandalism or hooliganism, there's always an academic who will find a way to blame the victim?
> In some of their reports, police officers also said Waymo was often unwilling to provide video of the attacks. In one case, a Waymo employee told the police they would need a warrant to obtain video recorded by the company’s vehicles.<p>Veering off topic a bit here, but this is a very good decision by the company. Taking a hard line removes the subjectivity of deciding when to violate your customers' privacy and puts the onus on the police to choose to make that violation.
It is interesting how Waymo chose not to pursue charges in a few cases where they had footage of the assailant. They thought it would further antagonize people in the community by making it an us-vs-them issue, with court cases, lawyers, and publicity ensuing. But if charges are not pursued, it might also increase the attacks...<p>There are already road-rage issues between human drivers, as cars are perceived to dehumanize the other side ("me vs. a car"). In this case that effect just gets amplified. A few tricks might help here, for example, making the car look less like a "Waymo" car and more like a normal car. It might be hard to hide the cameras and sensors completely; in that case, maybe use a generic logo, say "Acme Surveyors, Inc."<p>Another trick could be to have the test drivers pretend they are driving the car rather than sitting and watching. They could keep their hands on the wheel and pretend to steer, to make it seem like it's just a human driver.
I'm all for driverless cars, but if you put an algorithm in charge, then the creators of that algorithm need to be legally responsible for the outcome. I'm talking engineers going to prison when one of these cars does inevitably hit someone. Every driver is liable for their actions on the road; for some reason SV thinks being "statistically better than humans" means they aren't responsible for whatever small percentage of crashes they do cause. It's like saying I only crashed once in my 50 years of driving, so on average I'm not responsible for killing that kid last Tuesday.
Vandalising cars or making threats of violence should never be condoned, but there still is something very wrong in the way tech companies encroach on people's lives without their consent.<p>>“They didn’t ask us if we wanted to be part of their beta test,” added his wife, who helps run the business.<p>The woman in the article is correct, in my opinion. It ought to be up to the people of Arizona to decide what happens on Arizona's streets, not to a company from California, because it's the people of Arizona who are on the receiving end of all the social, economic, and safety implications of this technology.<p>We have seen this trend of tech companies creating facts on the ground and asking for democratic permission afterwards a few times now in the world of social media apps, but with autonomous driving things are getting a little more physical.
Unfortunate, but I guess with any new tech there will always be luddites.<p>Many HN folks probably already know the etymology, but for those that aren't familiar, the word "luddite" itself comes from "English textile workers in the 19th century [who] destroyed textile machinery as a form of protest" [1]<p><a href="https://en.wikipedia.org/wiki/Luddite" rel="nofollow">https://en.wikipedia.org/wiki/Luddite</a>
> “The behavior is causing the drivers to resume manual mode over the automated mode because of concerns about what the driver of the other vehicle may do,” Officer Johnson wrote.<p>Interesting that the drivers don't trust how the autonomous vehicles will handle these cases. In the case of someone actively being malicious, it seems more manual intervention will be desired. I can't see how this will work out with certain car companies touting that their cars will not be equipped with steering wheels.
To all the Hacker News commenters bringing up Luddites:<p>Why aren't there similar stories of people throwing rocks at Tesla vehicles?<p>Those vehicles are readily identifiable and have had safety issues from drivers using their cruise control <i>as if</i> it were autopilot. Moreover, it's a goddamned battery on wheels, which could put an enormous number of car service shops out of business. There are already laws in various states to keep Tesla out for that reason.<p>There could be a lot of reasons why someone throws rocks at a Waymo vehicle but also thinks Tesla is cool. Luddism is not one of those reasons.
These vehicles will get vandalized even if the public accepts them. Great new graffiti medium owned by “Big Corp” that drives all over metro areas 24/7. Wonder how much companies will need to spend just to keep them from being covered in tags.
Yes, of course violence and vandalism are never justified, but think of it from a bored high schooler's perspective. It would be so much fun in the future to stop a Waymo in the middle of a street, cover its cameras, and watch the aftermath. Pranks involving technology are gonna be way more fun in the future.
Fear of change and new things. People were opposed to cars when they were first introduced. Ironically, self-driving cars will arguably save countless lives. The resistance and fear will pass. But I do love driving...
It's the 'misuse' of tech. Sure, we can optimize for efficiency if efficiency is defined by minimizing cost or maximizing safety. But in a world of humans, efficiency should be defined by 'maximizing human utility'. In other words, use tech to make people better, more employable, happier. And no, I do not mean the Borg. If we continue down this path, we will ultimately marginalize humanity, because our systems are far more efficient than the evolutionary process that spawned and selected us. In the specific example of self-driving vehicles, an analogy might be to have an AI supervisor issuing directives. Much as our GPS issues turn directions, an advanced AI could point out traffic hazards hundreds of feet away that many people who are distracted, texting, talking, or who just have poor vision would miss. OTOH, we have the resilience of millennia on our resume, whereas machines are scarcely 200 years old. If a cataclysm hits the planet, much of the static infrastructure will fail.
<i>The antitechnology Luddite movement will grow increasingly vocal and possibly resort to violence as these people become enraged over the emergence of new technologies that threaten traditional attitudes regarding the nature of human life (radical life extension, genetic engineering, cybernetics) and the supremacy of mankind (artificial intelligence). Though the Luddites might, at best, succeed in delaying the Singularity, the march of technology is irresistible and they will inevitably fail in keeping the world frozen at a fixed level of development.</i><p>- Ray Kurzweil<p><a href="https://en.wikipedia.org/wiki/Predictions_made_by_Ray_Kurzweil" rel="nofollow">https://en.wikipedia.org/wiki/Predictions_made_by_Ray_Kurzwe...</a>
On one hand, I am sympathetic to fears about driverless car safety and job losses. I would not be a big fan of having my (hypothetical) children play on a street that self-driving cars are being tested on.<p>On the other, if you threaten the employee within the car or attempt to drive them off the road, you should be prosecuted with the full force of the law. There are still people within those self-driving cars, and attempted homicide is still a crime.<p>Edit: being opposed to attempted vehicular homicide is an unpopular opinion, apparently. Never saw that coming.
> The emergency drivers in the Waymo vans that were attacked in various cases told the Chandler police that the company preferred not to pursue prosecution of the assailants.<p>This is part of the problem, IMO. Make it known that you will act lawfully to protect the safety of your drivers and will cooperate fully with law enforcement. This pacifist policy does nothing to discourage future attacks.
Are there any other examples from relatively recent history of new technology that people would spontaneously attempt to attack/destroy in an uncoordinated fashion like this?
Probably not a good idea to damage property in protest. Maybe build something cheap and portable that should prevent them from moving. If they do move through the obstacle anyway, you've demonstrated how unsafe they can be.
> "People are lashing out justifiably," said Douglas Rushkoff, a media theorist at City University of New York and author of the book "Throwing Rocks at the Google Bus." He likened driverless cars to robotic incarnations of scabs — workers who refuse to join strikes or who take the place of those on strike.<p>I don't see how violence or vandalism of property is ever justified. Not to mention the fact that pelting self-driving cars with rocks or trying to run them off the road puts others in danger as well.<p>What I don't understand about these modern day Luddites is why what's considered today's technology is okay but any technological advances beyond this point is harmful. Why not attack washing machines and dishwashers? Surely forcing people to get their clothes washed by hands would create hundreds of thousands of jobs.
> Officer William Johnson of the Chandler Police Department described in a June report how the driver of a Chrysler PT Cruiser wove between lanes of traffic while taunting a Waymo van.<p>How do we know that it wasn't just the PT Cruiser driver's normal driving behavior? They did buy a PT Cruiser in the first place.
I wonder what the algorithm says to do when a protestor steps in front of a speeding car that would have to brake hard enough to injure the occupants of the car in order to avoid hitting the protestor. Under a comparative fault regime, Google and the protestor would both be partially at fault for the occupants' injuries. The protestor is broke, so Google will bear the whole burden of being jointly and severally liable. Or should it calculate how badly it's going to injure the occupants and the protestor and minimize the damage to the company, thus possibly choosing to injure the protestor over the occupants to preserve brand value and future revenue?