TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.

© 2025 TechEcho. All rights reserved.

Tesla sued in wrongful death lawsuit that alleges Autopilot caused crash

216 points by mindgam3, about 6 years ago

29 comments

sytelus · about 6 years ago

Tesla's blog post: https://www.tesla.com/en_GB/blog/update-last-week’s-accident

"the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider"

It looks like the sensors failed to see a concrete divider in clear, sunny weather and the car slammed into it at 70 mph. The driver was evidently overconfident in the system's ability to self-drive, probably busy looking at his phone, and ignored warnings to put his hands on the steering wheel.

"In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident."

These stats don't help when you read that the guy had two kids who will now grow up fatherless. Machines killing humans is a very different thing from humans killing humans, even if the fatality rate is 10x lower. Companies need to aggressively enforce keeping both hands on the wheel until self-driving is really, really, really good.
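The 3.7x figure Tesla quotes follows directly from the two per-mile rates in its post; a quick arithmetic sanity check (this only verifies the ratio, not the comparison itself, which mixes very different fleets, drivers, and road types):

```python
# Sanity-check the ratio implied by Tesla's quoted figures.
us_miles_per_fatality = 86e6      # all vehicles, all manufacturers
tesla_miles_per_fatality = 320e6  # vehicles equipped with Autopilot hardware

ratio = tesla_miles_per_fatality / us_miles_per_fatality
print(round(ratio, 1))  # → 3.7
```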
areoform · about 6 years ago

I'd like to add here that Tesla does indeed do sensor fusion in its cars: Autopilot combines radar and ultrasound with vision to decide where to drive. Commentators bringing up LIDAR are jumping the gun by assuming the existing sensors, in combination or individually, couldn't have detected this scenario. The problem here is more likely a bug in the software than a simple lack of additional sensors (LIDAR). And this touches on deeper issues that could have profound ramifications for the autonomous driving industry and the broader industry in general.

In my eyes, at least, the big problem with the autonomous car industry isn't the sensor suites being deployed (or not), but the over-reliance on neural networks. They are black boxes with failure modes that can't be adequately mapped. See: http://www.evolvingai.org/fooling

What if the neural net, or the system used to detect obstacles, didn't see the barrier because the precise configuration of the data fooled it? And if that's the case, then what's next? How do we decide when it is okay for safety-critical systems to be opaque? How do we deal with autonomous driving if the answer for this case turns out to be "no"? How should broader society deal with a "yes"? And who decides all of this in the first place?

Possibilities like this scare me far more than the lack of LIDAR, because replicating a bug like this would be next to impossible. We don't know what we don't know, and we can't explore and understand the system to suss out what we don't know.

Edit: fleshed out the idea with more questions.
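The point about software being the weak link can be made concrete with a toy sketch (this is an invented illustration, not Tesla's architecture): even when several independent sensors see the obstacle, all of their votes pass through one shared piece of fusion code, so one bug there defeats every sensor at once.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    source: str        # "radar", "ultrasound", or "vision"
    obstacle: bool     # did this sensor report an obstacle ahead?
    confidence: float  # 0.0 - 1.0

def fuse(readings, threshold=0.5):
    """Naive fusion: weight each sensor's obstacle vote by its confidence.

    The toy's point: a single flaw in this *shared* code path (e.g. a bad
    threshold) can discard a correct detection from any sensor -- the
    failure is in software, not in sensing.
    """
    if not readings:
        return False
    score = sum(r.confidence for r in readings if r.obstacle)
    total = sum(r.confidence for r in readings)
    return score / total >= threshold

readings = [
    SensorReading("radar", True, 0.9),        # radar sees the divider
    SensorReading("ultrasound", False, 0.2),  # out of range
    SensorReading("vision", False, 0.8),      # fooled by faded lane markings
]
print(fuse(readings))  # radar's correct vote alone doesn't clear the threshold
```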
saagarjha · about 6 years ago

The problem with Tesla's "Autopilot" is that it's marketed heavily as just that: autopilot. It's good enough to work that way 99% of the time, too, which is a recipe for disaster: things that work 99% of the time but have the potential to fail catastrophically often get glossed over by humans, because it's hard to keep paying attention to something that rarely fails.
neotek · about 6 years ago

That's what happens when you mislead your customers into believing your self-driving technology is far more advanced than it actually is. I hope Tesla loses this case and is forced to change their bullshit marketing before more people lose their lives.
CoolGuySteve · about 6 years ago

If the Tesla had a proximity sensor that slammed on the brakes in the last 2-3 meters, would he have survived? I ask because it's a common luxury car feature and would be easy for Tesla to implement. IIRC, neither this crash nor the semi-truck crash showed any evidence of Autopilot hitting the brakes, according to the NTSB.

Even if his velocity was only reduced by a fraction, the energy involved in the collision would have been reduced accordingly.

Edit: It turns out the Model X does have automatic emergency braking, but the preliminary report says the Tesla actually increased its speed in the 3 seconds leading up to the crash. Sounds like a major software bug to me.

Here's a review of the Model S AEB compared to other cars; in particular, Tesla's AEB trigger can't handle a lead car moving out of the way, which is what the NTSB report says happened: https://www.caranddriver.com/features/a24511826/safety-features-automatic-braking-system-tested-explained/

Here's the NTSB preliminary report: https://www.ntsb.gov/investigations/AccidentReports/Reports/HWY18FH011-preliminary.pdf
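The intuition about partial braking is right and easy to quantify: kinetic energy scales with the square of speed, so even a modest last-moment speed reduction removes a disproportionate share of crash energy. A quick sketch (illustrative numbers, not from the NTSB report):

```python
def crash_energy_fraction_remaining(speed_reduction_fraction):
    """KE ~ v^2: cutting speed by fraction f leaves (1 - f)^2 of the energy."""
    return (1 - speed_reduction_fraction) ** 2

# Shaving just 20% off the speed (e.g. 70 mph down to 56 mph)
# removes 36% of the crash energy.
print(round(crash_energy_fraction_remaining(0.20), 2))  # → 0.64
```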
helloindia · about 6 years ago

A more nuanced dissection of whose fault it was or could be [0]:

"Huang was apparently fooled many times by Autopilot. In fact, he reportedly experienced the exact same system failure that led to his fatal crash at the same location on at least seven occasions."

"Huang knew that Autopilot could not be relied on in the circumstances where he was commuting along the 101 freeway in Mountain View, California. Yet he persisted in both using the system and ignoring the alerts that the system apparently gave him to put his hands on the wheel and take full control."

"Elon Musk and Tesla should be held to account for the way they have rolled out and promoted Autopilot. But users like Walter Huang are probably not the poster children for this accountability."

[0] https://www.forbes.com/sites/samabuelsamid/2019/05/01/the-problem-with-blaming-tesla-for-walter-huangs-death/#59a9c59a5c88
barrkel · about 6 years ago

I think it's deeply irresponsible to create a device which can drive in 90 or 95% of scenarios. It's just a way to fool humans into killing themselves.
baoha · about 6 years ago

While I'm not quite sure about Tesla's responsibility, I do think the CA DOT played a part in this tragic accident. Had the attenuator been replaced right after the previous accident, it could have saved the driver's life.

Usually I don't complain much about the government, but just look at the construction mess they've created on 101; it's been like that for more than 4 years!
patejam · about 6 years ago

Tesla is playing with lives and should stop offering Autopilot.

Anything less than L4 autonomous driving is completely reckless. Calling it "Autopilot" when it's an L2 system should be criminal.

Yes, it's an extreme stance, but we're going to have a really hard time rolling out true autonomous driving while companies are playing around with people's lives. You can say "well, they know the risks," but it's not a closed situation: others on the road will also die because of Autopilot's mistakes.
rayiner · about 6 years ago

I hear people say "self driving cars don't have to be perfect, they just need to be safer than humans." Here is a good example of why that might be harder to achieve than expected. Apparently the car had gotten confused at that exact location repeatedly before. That's what self driving cars are going to do: if there is something "weird," it's likely that every self driving car (at least, every one from the same manufacturer) that encounters the weird scenario will run into a problem. That could result in catastrophic failure modes at scale.

What would have happened if every car on that road had been an identical Tesla? How many crashes would have happened? How long would it have taken Tesla to issue a fix? How many miles of perfect driving would be required to make up for the cluster of crash events due to that one anomaly?
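The correlated-failure worry can be made vivid with a toy model (all numbers invented purely for illustration): human errors at a bad spot are roughly independent, but an identical software fleet fails identically until a fix ships.

```python
# Toy model: independent human errors vs. a correlated fleet bug
# at one anomalous location. All figures are invented.
fleet_size = 1000        # identical cars passing the anomaly per day
p_human_error = 0.0001   # chance any one human driver crashes there
p_fleet_bug = 1.0        # if the software is fooled, every car is fooled

expected_human_crashes = fleet_size * p_human_error
expected_fleet_crashes = fleet_size * p_fleet_bug  # until a fix ships

print(round(expected_human_crashes, 1), expected_fleet_crashes)
```

The asymmetry, not the particular numbers, is the point: a shared defect turns one "weird" spot into a fleet-wide cluster.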
protomyth · about 6 years ago

So, in a traditional embedded system, discovery can reveal the source code to help determine what went wrong (e.g., a divide-by-zero error in an X-ray machine). What are the lawyers going to get when they start looking into Tesla's Autopilot software?
huhtenberg · about 6 years ago

This person died because of Tesla's cocky marketing, which leads people to believe that "Autopilot" is just that; Tesla does very little to discourage this interpretation even though it is life-threatening. In this context, blaming the accident victim is a 100% asshole move.

I am a lifelong Musk fan, but Tesla trying its hardest to weasel out of any responsibility here is incredibly damaging to its reputation. The future will come; trying to accelerate its arrival at all costs is reckless.
rjdagost · about 6 years ago

Tesla made some very big claims at their recent autonomy day event: basically, that they are years ahead of competitors while operating on "hard" mode (no LIDAR). And yet a number of participants in the short autonomy day demo rides reported that the support driver had to disengage the Autopilot.

Has Tesla provided any evidence that they are in fact far beyond competitors?
kvhdude · about 6 years ago

English is not my first language, and I am confused by the blatant use of the term "Autopilot". Does it not suggest more automation than is currently feasible? Why not "intelligent assist"? Why is Tesla/Musk getting a pass here?
robinduckett · about 6 years ago

Do they not have rumble strips in the US?

https://en.wikipedia.org/wiki/Rumble_strip

I have been in situations in my youth where I was commuting or travelling tired, and nothing alerts you like that loud rumbling sound. I'm sure Autopilot could detect it and bring the car to a stop if the driver hasn't been alerted by the in-car warning system or the rumble paint...
carlivar · about 6 years ago

I know the argument is always that Autopilot is not intended by Tesla to be abused (never mind the CEO abusing it on national television).

However, there's what you tell humans, and then there's pragmatism about human nature. We should consider the latter.

It's possible to be both academically correct in the warnings/instructions given and practically wrong about human psychology.
gooseus · about 6 years ago

It seems like a (relatively) trivial addition to the Autopilot system would be to allow the driver to tell the car when it has made a mistake; if Autopilots consistently make mistakes in the same area, then Autopilot should give a specific warning, or force a disengagement, when it detects that it is approaching that area.

I can't imagine that this isn't already a thing, or that someone at Tesla hasn't come up with it... am I missing something here?
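The suggestion above could be sketched as a geofenced lookup of driver-reported trouble spots (a hypothetical design for illustration, not a real Tesla feature; all names and coordinates are invented):

```python
# Map a coarse (lat, lon) grid cell to the number of driver-reported
# Autopilot mistakes at that spot.
reported_mistakes = {}

def _cell(lat, lon):
    # Rounding to 3 decimal places buckets reports into roughly
    # 100 m cells, so nearby reports land in the same key.
    return (round(lat, 3), round(lon, 3))

def report_mistake(lat, lon):
    key = _cell(lat, lon)
    reported_mistakes[key] = reported_mistakes.get(key, 0) + 1

def should_warn(lat, lon, threshold=3):
    """Warn (or force disengagement) once enough drivers flag this cell."""
    return reported_mistakes.get(_cell(lat, lon), 0) >= threshold

# Several drivers flag the same gore point on their commute:
for _ in range(3):
    report_mistake(37.411, -122.057)

print(should_warn(37.411, -122.057))  # → True
```

A real system would need to weed out spurious reports and handle map updates, but the core lookup really is this simple, which is the commenter's point.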
11thEarlOfMar · about 6 years ago

The crux of the issue is that autonomous capabilities will have, or already have, made cars safer overall. Tesla will be able to cite data and circumstances where the safety features saved lives, likely more often than lives were lost.

The problem is that the lives lost when the safety features fail are different lives than would have been lost without them at all. The families of those killed in this manner will have their day in court.
hackerpacker · about 6 years ago

Can we just make a law that the driver is responsible/liable for the car they drive, regardless of how they drive it? I mean, brakes failing while you are actively driving is one thing, but completely surrendering control is a choice.
sjg007 · about 6 years ago

Tesla is certainly on a knife's edge... they must be extremely confident.
_pmf_ · about 6 years ago

How do you feel as a Tesla owner knowing that, after your death, Tesla will publicly post that it was your fault, based on data from your vehicle?
jordache · about 6 years ago

Why does this TC article need to mention he was an Apple engineer? Just trying to fill space?
intrasight · about 6 years ago

"Move fast and break things."

So it has to be for rapid technological progress to be made.
Jonanin · about 6 years ago

It irks me a bit that news about this case is being spread everywhere, because people aren't considering the actual circumstances of the accident. Excerpt from the article:

"According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so."

This deserves a WTF. He understood that Autopilot makes errors, complained to his wife several times [0] that the car usually made errors in that exact spot, and yet wasn't paying enough attention, on a clear day with ideal driving conditions, to commute safely.

[0] "Family members say he complained about his Tesla veering into the same barrier at the exact location of the crash and that he brought his car into the dealership several times to report a problem with the autopilot function." From https://sanfrancisco.cbslocal.com/2019/05/01/family-driver-died-tesla-autopilot-crash-files-lawsuit/
newnewpdro · about 6 years ago

Tesla should get taken to the cleaners for this. I hate that these cars with "autopilot" operate on the same streets I use.

They should focus on making quality, well-performing electric cars, and stop using us all as beta testers for an autonomous future most of us never asked for.
lunulata · about 6 years ago

Living around Silicon Valley with all the Tesla drivers here, you learn real fast that they make for some of the worst drivers on the road. I regularly see them running red lights and generally not paying attention. They lean too hard on the Autopilot and dick around on their smartphones, which is probably exactly what this guy was doing. It's bad for Tesla drivers and it's bad for the drivers around them. I hope Tesla loses this lawsuit just because of that.
jbritton · about 6 years ago

I think self-driving cars should have both high-dynamic-range cameras and LIDAR, and maybe time-of-flight cameras. Input from a LIDAR system would be much more likely to detect that barrier; computer vision via a camera is much more likely to be fooled. I think an investigation into why the computer vision system failed to detect a barrier under clear daylight conditions will show negligence on the part of Tesla. Lane lines are frequently not well marked, and sunlight glare is a difficult problem for cameras. However, you have to be able to detect a concrete barrier in the worst of conditions. Does Tesla have in place some way of determining its lane-detection accuracy, and does it alert the driver that it is turning off Autopilot when accuracy is low?
cmurf · about 6 years ago

I think it's unacceptable for automation to produce a worse result than a human in the same situation with the same information. That is, it's not acceptable for automation to fail dangerous; it must fail safe, even if all it can do is give up (disconnect, sound a warning tone, hand control over to the human driver).

I think it's reasonable, in the narrow case where primary control is asserted and competency is claimed to be as good as or better than a human's, to hold automation accountable the same as a human. And in this case, if this driver had acted the way Autopilot did based on the same available information, we would say the driver committed suicide or was somehow incompetent.

I see this as possibly a case of automation committing involuntary manslaughter (unintentional homicide from criminally negligent or reckless conduct).
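The fail-safe principle the commenter describes is essentially a supervision policy: when the system can't be confident, it must not guess. A minimal sketch of that policy (a toy illustration of the idea, not any vendor's actual logic):

```python
from enum import Enum, auto

class Action(Enum):
    CONTINUE = auto()
    HANDOVER = auto()  # fail safe: warn and return control to the driver

def supervise(lane_confidence, min_confidence=0.8):
    """Fail-safe policy: below the confidence floor, never 'fail dangerous'.

    Instead of steering on a low-confidence guess, the system gives up:
    warning tone, disengage, hand control back to the human.
    """
    if lane_confidence < min_confidence:
        return Action.HANDOVER
    return Action.CONTINUE

print(supervise(0.55))  # low confidence -> hand control back to the driver
```

The hard part in practice is that the handover itself takes seconds the driver may not have, which is exactly why the commenter frames silent low-confidence operation as negligence.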
CriticalCathed · about 6 years ago

So many people here have hard-ons for Tesla hate.

The guy fucked up, badly. Tesla is not at fault here.

> "According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so."