Uber Not Criminally Liable in Death of Woman Hit by Self-Driving Car

215 points by abhisuri97 about 6 years ago

26 comments

rayiner about 6 years ago
> In the six seconds before impact, the self-driving system classified the pedestrian as an unknown object, then as a vehicle, and then as a bicycle, a preliminary report from the National Transportation Safety Board explained. While the system identified that an emergency braking maneuver was needed to mitigate a collision, the system was set up to not activate emergency braking when under computer control.

The prosecutor made the wrong call here. This part is absolutely criminal negligence. Putting a “self driving” car out there that doesn’t have emergency braking enabled (apparently because it creates too many false positives) is an unjustifiable risk. Working emergency braking should be the first thing perfected, before the computer gets to control the car.
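To make the failure mode in that quote concrete, here is a minimal, purely hypothetical Python sketch of the kind of mode gating the NTSB report describes. None of the names, thresholds, or values come from Uber's system; they are invented for illustration only.

```python
from dataclasses import dataclass

# Hypothetical sketch of the gating described in the NTSB report: the system
# computes that emergency braking is needed, but suppresses the action while
# the vehicle is under computer control and raises no alert. All names and
# thresholds are invented for illustration; this is not Uber's code.

@dataclass
class Perception:
    obstacle_detected: bool
    time_to_collision_s: float

def emergency_brake_needed(p: Perception, threshold_s: float = 1.3) -> bool:
    """True when an emergency braking maneuver is needed to mitigate a collision."""
    return p.obstacle_detected and p.time_to_collision_s < threshold_s

def control_step(p: Perception, under_computer_control: bool) -> str:
    if emergency_brake_needed(p):
        if under_computer_control:
            # The configuration at issue: braking is identified as necessary,
            # but the action is suppressed and the operator is not alerted.
            return "rely_on_human_operator"
        return "apply_emergency_brake"
    return "continue"

# Example: an obstacle about 1 second from impact while the computer is driving.
print(control_step(Perception(True, 1.0), under_computer_control=True))
# -> rely_on_human_operator
```

In this framing, the objection above is that the branch taken under computer control should never be a silent no-op.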
vichu about 6 years ago
This potentially sets a precedent as a legal blocker for public adoption of self-driving cars. If I am to serve a 1-year prison sentence for faulty code or a deprecated LIDAR sensor, I don't see in what scenario I would be willing to leave my fate in the hands of a self-driving car manufacturer. Just as in the case of a chauffeur, if they are driving and commit vehicular manslaughter, I would not expect to be deemed guilty.

It seems to me that it would be to the benefit of self-driving car companies to own up to liability, as it serves their goal of achieving widespread adoption. For example, Volvo is in line with this idea and has publicly stated that they would accept full liability in fully autonomous operation modes [0].

In this case, I do think that some liability lies with the driver, as they were tasked specifically with preventing situations like this. What is not clear is whether this task is even humanly feasible given reaction times, and based on this, whether Uber has been criminally negligent. Given this, I am surprised that the prosecutor seems to have absolved Uber of any blame.

[0] https://www.media.volvocars.com/global/en-gb/media/pressreleases/167975/us-urged-to-establish-nationwide-federal-guidelines-for-autonomous-driving
8bitsrule about 6 years ago
A woman on a road was killed. By a car with no driver.

There was a person sitting in the driver's seat, but that person was in no way engaged in maintaining safe operation. That person was hired by Uber.

As the judge, therefore, I would certainly assign -most- liability to Uber for putting that screw-off behind the wheel in the first place. Uber's driverless car killed a woman.

The charge would be negligent homicide.
kbos87 about 6 years ago
Let's take a look at some of the ingredients that led to this:

1) A pedestrian crossing the street likely expected an approaching driver to see them and slow down. Typical behavior on the part of the average pedestrian and the average driver, whether it's a misdemeanor or not.

2) An engineer at Uber made the decision to put tech on the road without emergency braking capabilities, likely using the justification that a safety driver would be there to intervene when needed.

3) A safety driver in an autonomous vehicle that behaves the right way 99% of the time grows complacent.

The pedestrian may have been jaywalking and the safety driver may have been abdicating their duties. The safety driver especially isn't an innocent actor here. But the parties with by far the most responsibility are the engineers and managers at Uber.

This set of circumstances was completely foreseeable, but they still decided to take the risk and put this technology out on the roads. I for one don't want to be a part of Uber's, or anyone else's, great experiments. Spend the money, spend the time, and figure this out in a controlled environment before subjecting the rest of us to the negative impacts of your selfish ambition. Someone at Uber deserves time in prison for this.
r00fus about 6 years ago
On one hand, I think Uber was reckless and wantonly created the conditions for this incident.

On the other hand, I don't want this to be the death knell for autonomous driving experiments.

On the gripping hand, it's possible that a large settlement might have been the best outcome from this tragedy for the family of the woman in question. They gain nothing from Uber's criminal liability. (edit: clarity)
userbinator about 6 years ago
After all these stories I strongly believe that "self-driving, except when it's not" is even worse than manually driving, because supervising a computer is not too far from a driving instructor supervising a student driver; most of the time it's OK, but you have to be extremely alert to catch the times when it makes a fatal mistake. On the other hand, if you're "manually" driving, you are fully in control of the situation and can anticipate it, thinking ahead to what's next. A self-driving car, like a student driver, won't tell you what it's going to do and ask whether that's OK --- it just does it, and you have to be *very* alert to instantly take control and correct when things go wrong.

Autopilots work in planes because the operators are extremely well-trained, and the reaction times needed are measured in seconds or even minutes. In a car, it's less than a second.
jimktrains2 about 6 years ago
> While the system identified that an emergency braking maneuver was needed to mitigate a collision, the system was set up to not activate emergency braking when under computer control.

> Instead, the car's system relied on the human operator to intervene as needed. "The system is not designed to alert the operator," the report notes.

How are they not negligent for this specifically?
perfunctory about 6 years ago
> The Arizona Republic has reported that the driver, 44-year-old Rafaela Vasquez, was streaming the television show The Voice in the vehicle in the minutes before the crash.

I guess we shouldn't be surprised any more, as even the drivers of non-self-driving cars do the same sometimes. It's scary how often, when I look in the rear-view mirror, I see drivers behind me looking down at their phones. This is one of the reasons I almost stopped driving.
dkarl about 6 years ago
Most driving is done by humans. Humans are terrible at it and kill people unnecessarily all the time. That's the standard to beat. Speaking as someone who rides a bike on city streets, I really don't give a shit how many people Uber kills, as long as it's less than human drivers would. This whole thread smells like Monday morning quarterbacking and people throwing aside reason under stereotype threat. (Techies are in love with technology, techies are in thrall to startup narratives and oblivious to social responsibility. Everyone's anxious to disprove that.)

In fact, the mistake Uber made here was relying on a human being to do a job that is routine and boring the vast bulk of the time but occasionally requires life-saving decisions that depend on attentive awareness of the surroundings. That's the same mistake our entire civilization makes a million times a day. The fact that the operator had fewer responsibilities than a normal driver probably magnified the problem, but it's the same problem that makes driving fundamentally dangerous. She thought she was doing a good enough job and then oops, guess not, somebody's dead.

That happens every day without robot drivers involved. The standards we hold autonomous driving technology to should reflect this insane status quo.
pteredactyl about 6 years ago
My thoughts go out to her family. And to the many, many people who die every day because of reckless drivers and accidents, driverless or not. It really sucks to lose someone due to no fault of their own (other than being there).

To add: driverless or not, most people who kill others using a car are not criminally liable. Often they are not even liable beyond the state minimum, unless the victim sues for personal assets. In California, that's 100k, which is one of the lowest in the USA.
mnm1 about 6 years ago
Unbelievable. I wonder how many other self driving cars are out there designed to not brake even when they clearly detect an object in their path. It's ridiculous to rely on a human to intervene in such situations. This was a completely preventable homicide, yet Uber gets away with it. As I've said before, the easiest way to get away with murder is to commit it under the protection of a corporation. True, this is manslaughter, but the principle still stands. I bet they will go after the driver now and try to place blame on her despite her having an impossible job that should simply not exist because it can't be carried out. So they get away with manslaughter and a precedent is set for other car companies that no matter how negligent your system and operations are, you can kill people and get away with it as long as you find a patsy to sit behind the wheel and take the blame. With this kind of attitude, I hope self driving cars never make it to market. I no longer think they will be safer than human drivers because there is no incentive for these companies to make them safe, let alone safer.
smallgovt about 6 years ago
I wonder why Uber (and other self-driving car companies) don't have remote employees monitoring the road who can take over in case the appointed driver doesn't react quickly enough.

The problem with these driver-assisted self-driving cars is that the driver is unlikely to be paying attention at any given time, since the system is 99% reliable.

By adding remote monitoring, you could even have multiple people monitoring each car. It might be impossible to safely steer the car from a remote location, but they could surely activate the brakes and/or trigger other simple directives to drastically decrease the chance of a fatal collision.

Given that the average driver reaction time in a car collision is 2.3s [1], I doubt network latency would pose much of a problem, and cost surely isn't an issue for these companies. A remote person could also use the car's cameras to gain a superior field of vision (especially at night) when compared to the in-car driver.

[1] https://copradar.com/redlight/factors/IEA2000_ABS51.pdf
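As a rough sanity check of the latency point, here is a back-of-envelope sketch. The 2.3 s reaction time is the figure cited above; the vehicle speed, round-trip latency, and braking deceleration are assumptions made up for the example.

```python
# Back-of-envelope check of the latency argument. The 2.3 s reaction time is
# the figure cited above; the speed, network latency, and braking deceleration
# are illustrative assumptions, not measured values.

speed_mps = 17.0          # assumed vehicle speed (~38 mph)
reaction_s = 2.3          # average driver reaction time, per [1]
network_latency_s = 0.15  # assumed round-trip latency to a remote operator
decel_mps2 = 7.0          # assumed hard-braking deceleration

def stopping_distance(delay_s: float) -> float:
    """Distance covered during the delay plus the braking distance."""
    return speed_mps * delay_s + speed_mps ** 2 / (2 * decel_mps2)

in_car = stopping_distance(reaction_s)
remote = stopping_distance(reaction_s + network_latency_s)
print(f"in-car operator : {in_car:5.1f} m to stop")
print(f"remote operator : {remote:5.1f} m to stop (+{remote - in_car:.1f} m)")
```

Under these assumptions the network hop adds only a couple of metres on top of roughly sixty, which is the commenter's point: human reaction time, not latency, dominates.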
true_tuna about 6 years ago
Those cars were nowhere near ready to be on the roads. And given how fast the program spun up there’s no way the drivers were properly trained and vetted. I personally had several run-ins with the Uber cars in San Francisco before they got their California registrations revoked. I worked on 3rd in soma and saw them out as I would walk to lunch. They didn’t even try to yield to pedestrians in crosswalks during right turn on red (when pedestrians clearly have right of way). I had more than one near miss. I shouted at the driver and he grabbed the wheel in panic. He went around the block to try again and the car did exactly the same thing. It was like they were blind to anything smaller than a car. Uber knew it too, because the failure to yield was reported to Uber in person at their garage on 3rd and Harrison. Uber (and anyone who behaves like that) has no business running a self-driving program. If they start the program back up I guarantee they will continue to kill pedestrians. If you see one coming get away quick. I speak from personal experience and I’m not joking even a little bit.
gudok about 6 years ago
Interaction between humans and AI is far from perfect. It seems that it would be easier for people to adapt to AI than the opposite. I feel that someday every traffic participant (including pedestrians) will be required to carry a tracking device. These devices will communicate with one another and prohibit or allow actions, e.g. making a turn or crossing a road. Eventually they will make traffic rules as we know them today obsolete. No traffic lights, no road signs, and no crossings anymore. Every action will be controlled by the device. And, of course, they will automatically report and fine law-breakers.

Is this the bright future of humankind? Or is this the setting for a new dystopian book?
NoPicklez about 6 years ago
Interesting reading the article. From a risk perspective, I would think Uber would have assessed the risks associated with using an Uber self-driving solution, particularly around the automated controls that can stop the car in the event it detects an emergency, given it has this capability.

But I agree with the verdict; it's just strange that the vehicle has the ability to detect a potential collision but cannot apply emergency braking to prevent it.
Quarrelsome about 6 years ago
And this, ladies and gentlemen, is a major reason why the US is the best place in the world to do business, and specifically bleeding-edge stuff.
xutopia about 6 years ago
Blame the driver... of course, the driver was told that the car drove itself. He was watching a show on his phone when it happened, so Uber is putting the blame on him.

I don't think the driver was trained properly to drive in that semi-automated driving system. Does the fault lie with him, or is Uber just trying to lay the blame on him to avoid paying anything?
rb808 about 6 years ago
The law around this is so important. When your car hurts someone, whose fault is it? What about if it's completely self-driving, like this one? What if there is a bugfix which you didn't apply? What if a software upgrade causes a crash? What if lane assist caused you to swerve into something?

So many grey areas.
helthanatos about 6 years ago
Isn't most of the point of computers driving us so they can react to things we can't? I don't know why that bit would be turned off... Seems seriously problematic. Secondly, that looks like a really bad place to be pushing your bike at night.
edgartaor about 6 years ago
Let's say that in the future every car is a self-driving car, and they are more reliable, so accidents are reduced by 99%. How do we deal with that remaining 1%? Whose fault is it? The manufacturer's? The developer's?
tobyhinloopen about 6 years ago
UBER is not criminally liable, but THE DRIVER might be, since she was watching a TV show. Subtle but important detail. Looking at the comments, I suspect many comment without reading the article.
aasasd about 6 years ago
Funny enough, this is about the fifth attempt to post this story to HN, with different links. It's getting barely any response.
npip99 about 6 years ago
None of the top comments make any sense to me. I simply don't understand them. Anyway, here is my understanding of the subject:

(1) Yes, Uber should have trained the driver better to look at the road. By trained, I mean there should have been a sticky note on the wheel saying "PAY ATTENTION OR YOU WILL KILL SOMEONE".

(2) Yes, the driver absolutely should have to pay some penalty for this, if Uber told him that he should have been paying attention (which they most certainly did). Watching a video while in a self-driving car is IDENTICAL to watching a video in a normal car. Modern self-driving cars are NOT fully autonomous, and they SHOULD be viewed as IDENTICAL to cruise control for all legal considerations and thought experiments. Most of the top comments, which are blindly attacking self-driving cars, are not making this analysis.

However, (1) and (2) do not justify the lack of logic displayed by most of the top comments here, which seem to ignore the following:

(a) There is no logical difference between a person accidentally killing someone and the self-driving car accidentally killing someone. Actually, because the car is already known to not be fully autonomous, this already is a case of a person accidentally killing someone. However, even if the car were fully autonomous, we MUST consider the ODDS of an accident. None of the other comments are doing this. There is always some probability of an accident, so a specific accident means absolutely nothing. Literally nothing. This post doesn't even mean anything. What SHOULD be posted is "Self-driving cars with humans at the wheel kill X people per road-hour. Human-only cars kill Y people per road-hour". If X > Y, then yes, we have a fking problem. But without that information, we literally have nothing to even think about or process.

(b) Disabling the safety feature of the car is not a fk'ing concern, at all. Almost no cars have these features - only expensive ones do. Turning an expensive car into a normal car is NOT something you can be sued for, or even something anyone should care about. Who is to blame for the situation simply does not depend on this fact. I don't understand why people are discussing this aspect.

(c) Do you see the image? Maybe it's not showing it all. But as far as I can see, there isn't a light there. Wtf is she doing crossing the road if there's no light there, without waiting for the cars? As a citydweller who is constantly found in the middle of streets trying to cross parts of the road that don't have stop lights, I just can't fathom ever being in her position. Maybe my city has less considerate drivers than her city, but if I tried to cross roads without a red light blocking cars, and didn't consciously give right of way to the traffic, I would die sometime this week.

(d) Many people seem to quote "self-driving system classified the pedestrian as an unknown object, then as a vehicle, and then as a bicycle" as "Oh, this self-driving system is complete sh*t, it's all Uber's fault". That makes no damn sense. As a side note, if you've ever worked with a neural net, especially with video as opposed to still photos, you already understand that the given sentence means nothing. That's just how they work, and there will always be milliseconds in between frames where it reassigns the object's identification. But, anyway, this is not relevant. The self-driving part could have been completely off, or disabled. The car is SUPPOSED to be driven by the driver, and any deviation from that is the fault of either the driver or Uber's training of the driver. Whether it's the former or the latter is exactly where and how suing should be directed and handled. Nothing else matters, despite most comments putting much emphasis on many other aspects of the situation.
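A tiny illustration of the per-road-hour comparison called for in (a); both sets of figures below are invented placeholders, since the comment's point is precisely that such rates were not provided.

```python
# Illustration of the rate comparison in point (a). Both figures are invented
# placeholders; the argument is that only rates like these, not a single
# incident, can support a safety conclusion either way.

av_deaths, av_road_hours = 1, 2_000_000        # hypothetical autonomous-fleet totals
human_deaths, human_road_hours = 40_000, 9e9   # hypothetical human-driver totals

av_rate = av_deaths / av_road_hours            # X: deaths per road-hour, autonomous
human_rate = human_deaths / human_road_hours   # Y: deaths per road-hour, human-only
print(f"autonomous : {av_rate:.2e} deaths per road-hour")
print(f"human-only : {human_rate:.2e} deaths per road-hour")
print("X > Y: problem" if av_rate > human_rate else "X <= Y: no problem by this metric")
```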
dana321 about 6 years ago
I know it's bad having a machine-controlled car kill someone, but... it will happen, and we must accept it, and my guess is that it will happen far less than with human operators in the long term.
sneak about 6 years ago
I’m not a lawyer, so forgive my ignorance, but how can a corporation be criminally liable for anything ever? It’s not like you can charge a corporation with a crime or put a corporation in prison. Corporations can’t commit crimes because they cannot act; only humans can.
wand3r about 6 years ago
Self-driving vehicle development has so many difficult problems to solve. The technology itself is extremely difficult to create and relies on an extremely complex compromise where vehicles are allowed to be (and are) operated by code but ultimately babysat by drivers.

I think self-driving technology is worth the risk and worth the large amount of property damage and human injury that comes with it. However, this is a societal issue and there needs to be some sort of referendum on it. While many parallels can be drawn with space travel, those projects were tightly contained and included individuals who were specifically trained and informed of the potential dangers. There is no way to really do this at scale for self-driving tech, which necessarily demands an expansive ecosystem that can't be tightly controlled, nor can consent be inferred from the unknown number of people involved in this experiment.