NTSB: Autopilot steered Tesla car toward traffic barrier before deadly crash

509 points by nwrk, almost 7 years ago

32 comments

Animats, almost 7 years ago
NTSB:

> • At 8 seconds prior to the crash, the Tesla was following a lead vehicle and was traveling about 65 mph.
> • At 7 seconds prior to the crash, the Tesla began a left steering movement while following a lead vehicle.
> • At 4 seconds prior to the crash, the Tesla was no longer following a lead vehicle.
> • At 3 seconds prior to the crash and up to the time of impact with the crash attenuator, the Tesla's speed increased from 62 to 70.8 mph, with no precrash braking or evasive steering movement detected.

This is the Tesla self-crashing car in action. Remember how it works. It visually recognizes rear ends of cars using a BW camera and Mobileye (at least in early models) vision software. It also recognizes lane lines and tries to center between them. It has a low-resolution radar system which ranges moving metallic objects like cars but ignores stationary obstacles. And there are some side-mounted sonars for detecting vehicles a few meters away on the side, which are not relevant here.

The system performed as designed. The white lines of the gore (the painted wedge) leading to this very shallow off-ramp become far enough apart that they look like a lane. [1] If the vehicle ever got into the gore area, it would track as if in a lane, right into the crash barrier. It won't stop for the crash barrier, because *it doesn't detect stationary obstacles.* Here, it sped up, because there was no longer a car ahead. Then it lane-followed right into the crash barrier.

That's the fundamental problem here. These vehicles will run into stationary obstacles at full speed with no warning or emergency braking at all. *That is by design.* This is not an implementation bug or sensor failure. It follows directly from the decision to ship "Autopilot" with that sensor suite and set of capabilities.

This behavior is alien to human expectations. Humans intuitively expect an anti-collision system to avoid collisions with obstacles. This system does not do that. It only avoids rear-end collisions with other cars. The normal vehicle behavior of slowing down when it approaches the rear of another car trains users to expect that it will do that consistently. But it doesn't really work that way. Cars are special to the vision system.

How did the vehicle get into the gore area? We can only speculate at this point. The paint on the right edge of the gore marking, as seen in Google Maps, is worn near the point of the gore. That may have led the vehicle to track on the left edge of the gore marking instead of the right. Then it would start centering normally on the wide gore area as if it were a lane. I expect the NTSB will have more to say about that later. They may re-drive that area in another similarly equipped Tesla, or run tests on a track.

[1] https://goo.gl/maps/bWs6DGsoFmD2
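
To make the failure mode described above concrete, here is a minimal Python sketch of that control loop: lane centering plus a radar filter that discards stationary returns. It is purely illustrative; the names, thresholds, and structure are assumptions, not Tesla's actual code.

    # Minimal sketch of the control logic described above. Purely
    # illustrative: names, thresholds, and structure are invented.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class RadarReturn:
        range_m: float
        closing_speed_mps: float  # ~= ego speed for a stationary object

    def pick_lead_vehicle(returns: List[RadarReturn],
                          ego_speed_mps: float) -> Optional[RadarReturn]:
        # The filter at the heart of the problem: returns whose ground
        # speed is ~0 (closing speed ~= ego speed), such as a crash
        # attenuator, are discarded as stationary clutter.
        moving = [r for r in returns
                  if abs(r.closing_speed_mps - ego_speed_mps) > 2.0]
        return min(moving, key=lambda r: r.range_m, default=None)

    def control_step(left_line_m: float, right_line_m: float,
                     returns: List[RadarReturn],
                     ego_speed_mps: float, set_speed_mps: float):
        # Lane keeping: steer toward the midpoint of whatever pair of
        # lines is currently tracked. A widening gore area looks like a
        # wide lane and gets centered on just the same.
        steer_cmd = 0.1 * (left_line_m - right_line_m)

        lead = pick_lead_vehicle(returns, ego_speed_mps)
        if lead is None:
            # No moving lead vehicle: accelerate back toward set speed,
            # matching the reported 62 -> 70.8 mph speed-up.
            accel_cmd = min(1.0, set_speed_mps - ego_speed_mps)
        else:
            # Simple distance keeping behind the lead vehicle.
            desired_gap_m = 2.0 * ego_speed_mps
            accel_cmd = 0.05 * (lead.range_m - desired_gap_m)
        return steer_cmd, accel_cmd

Note how a stationary barrier never reaches the distance-keeping branch: it is dropped by the radar filter before the controller ever sees it.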

MBCook, almost 7 years ago
“[Driver’s] hands were not detected on the steering wheel for the final six seconds prior to the crash. Tesla has said that Huang received warnings to put his hands on the wheel, but according to the NTSB, these warnings came more than 15 minutes before the crash.”

This kind of stuff is why I've lost all faith in Tesla's public statements. What they said here was, for all intents and purposes, a flat-out lie.

Clearly something went wrong here, but they leapt to blaming everyone else instead of working to find the flaw.

abalone, almost 7 years ago
> During the 18-minute 55-second segment, the vehicle provided two visual alerts and one auditory alert for the driver to place his hands on the steering wheel. These alerts were made more than 15 minutes prior to the crash.

Whoa. So there were NO alerts for the 15 minutes prior to the crash. Compare this with Tesla's earlier statement:

> The driver had received several visual and one audible hands-on warning earlier in the drive and the driver's hands were not detected on the wheel for six seconds prior to the collision. [1]

This gives a very different impression. They omitted the fact that there were no warnings for 15 minutes. Frankly, that appears to be an intentionally misleading omission.

So basically the driver was distracted for six seconds while believing that the car was auto-following the car in front of it.

[1] https://www.tesla.com/blog/update-last-week’s-accident

mymacbook, almost 7 years ago
Reading that initial report is terrifying. I am so glad the NTSB set the record straight that the driver had his hands on the wheel for the majority of the final minute of travel. Really makes me feel like Tesla was out to blame the driver from the get-go. To be clear, the driver is absolutely partially at fault, but my goodness, Autopilot sped up into the barrier in the final seconds — totally unexpected when the car has automatic emergency braking.

Emergency braking feels not ready for prime time. I hope there are improvements there. I don't want to see Autopilot disabled as a result of this; I'd rather Tesla use this to double down and apply new learnings.

Just so sad to hear about this guy's death on his way to work - not the way I want to go. :(

ckastner, almost 7 years ago
> His hands were not detected on the steering wheel for the final six seconds prior to the crash.

> Tesla has said that Huang received warnings to put his hands on the wheel, but according to the NTSB, these warnings came more than 15 minutes before the crash.

> Tesla has emphasized that a damaged crash attenuator had contributed to the severity of the crash.

These may or may not have been factors contributing to the death of the driver, and ultimately may or may not absolve Tesla of legal liability.

However, the key point here is that, without question, *the autopilot failed*.

It is understandable why Tesla is focusing on the liability issue. That is something *they can dispute*. The fact that the autopilot failed is *indisputable*, and it is unsurprising that Tesla is trying to steer the conversation away from it.

The discussion shouldn't be *either* the driver is at fault *or* Tesla screwed up, but two separate discussions: whether the driver is at fault, *and* how Tesla screwed up.

nwrk, almost 7 years ago
The report itself is worth reading:

https://www.ntsb.gov/investigations/AccidentReports/Reports/HWY18FH011-preliminary.pdf

jackson1way, almost 7 years ago
Despite the autopilot failure, I find the battery failure quite remarkable too:

> The car was towed to an impound lot, but the vehicle's batteries weren't finished burning. A few hours after the crash, "the Tesla battery emanated smoke and audible venting." Five days later, the smoldering battery reignited, requiring another visit from the fire department.

Where is your LiPo god now? Batteries have more energy density than 20 years ago, OK. But they are also much more dangerous. Now imagine the same situation with Tesla's huge Semi batteries. They'll have to bury them 6 ft under, like Chernobyl's smoldering fuel rods. Minus the radiation.

netsharc, almost 7 years ago
Dear Elon, want to start a website that rates how fake-newsy government-produced accident reports are? /s

"FDA said my farm is producing salmonella-infected chicken. Downvote their report on this URL!"

gburt, almost 7 years ago
I am generally against what is often called "excessive regulation," but the regulator (perhaps the FTC) should aggressively prohibit the misleading marketing message here.

The entire problem stems from calling this lane-keeping mechanism "Autopilot." Tesla should be prohibited from using that language until they have achieved provably safer Level 3+ self-driving.

The problem is exacerbated by Musk's aggressive marketing-driven language. Saying things like *we're two years out from full self-driving* (first said in 2015) and *the driver was warned to put his hands on the steering wheel* (15 minutes prior to the crash) makes Musk look like he is plainly the bad guy and attempting to be misleading.

"Provably safe" probably means some sort of acceptance testing: a blend of an NTSB-operated obstacle course (with regression tests and the like) and real-world exposure.

dcposch, almost 7 years ago
Tesla Autopilot makes it to HN pretty much every week now, almost never in a good way.

Every time, we have a big discussion about Autopilot safety, AI ethics, etc.

What about *lack of focus*?

Tesla has already reinvented the car in a big way: all-electric, long range, fast charge, with a huge network of "Superchargers." It's taken EVs from a niche environmentalist pursuit to something widely seen as the future of automotive.

Why are they trying to tackle self-driving cars at the same time?

This feels like a classic mistake and a case of scope creep.

Becoming the Toyota of electric is a vast engineering challenge. Level 5 autonomous driving is an equally vast engineering challenge. Both represent once-in-a-generation technological leaps. Trying to tackle both at the same time feels like hubris.

If they just made great human-piloted electric cars and focused on cost, production efficiency, volume, and quality, I think they'd be in a better place as a business. Autopilot seems like an expensive distraction.

menacingly, almost 7 years ago
Tesla has to realize these "shame the dead dude" posts are PR nightmares, right?

They are reason alone for me to never consider one: that a private moment for my family might end up a pawn in some "convince the public we're safe using any weasel stretch of the facts we can" effort.

If this is disruption, I'll wait for the old guard to catch up, lest I be disrupted into a concrete barrier and my grieving widow be fed misleading facts about how it happened.

RcouF1uZ4gsC, almost 7 years ago
After this incident and Tesla's response to it, I hope Tesla is sued and/or fined into bankruptcy. Tesla is normalizing the release of not fully tested software to do safety-critical things, and literally killing people as a result. A message needs to be sent that this is unacceptable. In addition, their first response was a PR-driven response that sought to blame the driver, and it violated NTSB procedures. Safety is probably the most important thing to get right with these types of software, and Tesla is nonchalantly sacrificing safety for marketing.

kevinchen, almost 7 years ago
Tesla Autopilot should be recalled via the next OTA update.

The "Autopilot" branding implies that users need not pay attention, when in reality the system needs interventions at infrequent but hard-to-predict times. If an engineer at Apple can't figure it out, then the average person has no chance. Their software sets users up to fail. (Where failure means permanent disability or death.)

Inevitably, Musk fans will claim that recalling Autopilot actually makes Tesla drivers less safe. But here's the problem with Musk's framing of Autopilot.

Sure, maybe it fails less often than humans. (We don't know whether we can trust his numbers.) But we do know that when it fails, it fails in different ways — Autopilot crashes are noteworthy because they happen in situations where human drivers would have no problem. That's what people can't get over. And it is why Autopilot is such a dangerous feature.

An automaker with more humility would've disabled this feature years ago. (Even Uber suspended testing after the Arizona crash!) With Musk, my fear is that more people will have to die before there is enough pressure from regulators / the public to pull the plug.

MBCook, almost 7 years ago
So people are asking why the barrier wasn't detected, and that's fair.

Here's another question: why wasn't the 'gore' zone detected?

Why did the car think it was safe to drive over an area with striped white lines covering the pavement?

It saw the white line on the *side* of that area and decided that was a lane marker, but ignored the striped area you're not supposed to drive on?

If you're reading the lines on the pavement, you have to try to look at all of them.

I don't know if other cars, like those with Mobileye systems, do that, but given Tesla's safety claims they'd better be trying.
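
To illustrate the kind of check being asked for here: a tracker that fits only the two boundary lines never looks at the paint between them. A gore-area test could be as simple as the sketch below; the detection interface and thresholds are invented for illustration, not any shipping system's logic.

    # Hypothetical gore-zone sanity check; the inputs are assumed to
    # come from a vision front end and are invented for illustration.
    import numpy as np

    def is_plausible_lane(lane_mask: np.ndarray,
                          paint_mask: np.ndarray,
                          width_m: float) -> bool:
        # lane_mask: boolean image region between the two tracked
        # boundary lines; paint_mask: boolean image of detected paint.
        # A real freeway lane is ~3.7 m wide; a gore widens well past that.
        if width_m > 5.0:
            return False
        # Striped chevrons put lots of paint *between* the boundaries,
        # where a genuine lane should be almost entirely unpainted.
        interior_paint_fraction = float((paint_mask & lane_mask).mean())
        return interior_paint_fraction < 0.02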

mcguire, almost 7 years ago
Here's the most interesting quote to me:

> The crash created a big battery fire that destroyed the front of Huang's vehicle. "The Mountain View Fire Department applied approximately 200 gallons of water and foam" over a 10-minute period to put out the fire, the NTSB reported.

> The car was towed to an impound lot, but the vehicle's batteries weren't finished burning. A few hours after the crash, "the Tesla battery emanated smoke and audible venting." Five days later, the smoldering battery reignited, requiring another visit from the fire department.

Shouldn't it be possible to make the battery safe?

userbinator, almost 7 years ago
This just reconfirms my belief about Tesla's "autopilot": most of the time it behaves like an OK driver, but it occasionally makes a fatal mistake if you don't pay attention and correct it. In other words, you have to be *more* attentive to drive safely with it than without it, since a normal car (with suspension and tires in good condition, on a flat road surface) will not decide to change direction unless explicitly directed to; it will continue in a straight line even if you take your hands off the wheel.

Given that, the value of autopilot seems dubious...

walrus01, almost 7 years ago
This guy tested it at the EXACT same location with Tesla Autopilot. The Tesla starts steering directly into the barrier before he corrects it.

https://www.youtube.com/watch?v=VVJSjeHDvfY

LinuxBender, almost 7 years ago
Disclaimer: taboo comment ahead.

Subtle bugs in self-driving cars would be a simple way to assassinate people with low cost overhead. One OTA update to a target, and you could probably even get video footage of the job being completed, sent to the client, all in one API call.

Surely by now someone must have completed a cost analysis of traditional contractors vs. having a plant at a car manufacturer.

Am I the only one thinking about this?

jakelarkin, almost 7 years ago
Self-driving systems can't reason well about untrained scenarios or the intent of other humans on the road. I think people have grossly underestimated how driving in an uncontrolled environment is really a general-AI problem, which we're not even close to solving.

cmurf, almost 7 years ago
*Involuntary manslaughter usually refers to an unintentional killing that results from recklessness or criminal negligence, or from an unlawful act that is a misdemeanor or low-level felony (such as a DUI).* (Wikipedia)

It's rather uncontroversial that this kind of accident falls under civil law, because there is some degree of liability involved in marketing a product as being safer than a human driver when it then fails in an instance where a human driver flat out would not fail, apples to apples. If the human driver is paying attention (which the autonomous system is always doing), they'd never make this mistake. It could only be intentional.

But more controversial, and therefore more interesting to me, is to what degree the system is acting criminally, even if the killing is unintended, let alone if it is intended. Now imagine the insurance implications of such a finding of unintended killing. And even worse, imagine the total lack of even trying to make this argument.

I think a prosecutor must criminally prosecute Tesla, if not for this incident then for one in the near future. It's an area of law that needs to be aggressively pursued, and voters need to be extremely mindful of treating AI of any kind with kid gloves compared to how we've treated humans in the same circumstances.

crazygringo, almost 7 years ago
Wow. I will say that, when you look straight-on in Street View, it does look disturbingly like a valid lane to drive in: same width, same markings at one point [1]:

https://www.google.com/maps/@37.4106804,-122.075111,3a,75y,117.92h,81.35t/data=!3m6!1e1!3m4!1snAoBJlvBLm0NQWYBWKxWGw!2e0!7i16384!8i8192

If it were night, with a car in front blocking the view of the concrete lane divider, it doesn't seem too difficult for a human to change lanes at the last second and collide as well. (And indeed, there was a collision the previous week.)

There's no excuse for not having an emergency collision detection system... but it also reminds me how dangerous driving can be, period, and how we need to hold autonomous cars to a higher standard.

[1] Thanks to comments by Animats and raldi for the location from other angles.

beenBoutIT, almost 7 years ago
Anyone here actually think Elon uses Autopilot?

27182818284, almost 7 years ago
> The NTSB report confirms that. The crash attenuator—an accordion-like barrier that's supposed to cushion a vehicle when it crashes into the lane separator—had been damaged the previous week when a Toyota Prius crashed at the same location. The resulting damage made the attenuator ineffective and likely contributed to Huang's death.

Kinda sounds like maybe that part of the road isn't well designed or marked, too.

tqi, almost 7 years ago
> During the 18-minute 55-second segment, the vehicle provided two visual alerts and one auditory alert for the driver to place his hands on the steering wheel. These alerts were made more than 15 minutes prior to the crash.

If your hands are always supposed to be on the wheel, why does the car not constantly alert you when it detects that your hands are off (similar to how cars beep at you if your seatbelt is unbuckled while driving)?
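
For what it's worth, the seatbelt-style behavior described above takes only a few lines of logic. The sketch below is hypothetical; the torque-sensor interface, thresholds, and escalation policy are assumptions, not any vendor's actual alert design.

    # Hypothetical continuous hands-off alert, seatbelt-chime style.
    # The torque-sensor interface and thresholds are invented.
    import time

    HANDS_OFF_GRACE_S = 5.0   # tolerate brief sensor dropouts
    ALERT_INTERVAL_S = 10.0   # keep re-alerting while hands stay off

    def monitor_hands(read_torque_sensor, chime, period_s=0.1):
        hands_off_since = None
        last_alert = 0.0
        while True:
            now = time.monotonic()
            if read_torque_sensor():   # steering torque => hands on wheel
                hands_off_since = None
            elif hands_off_since is None:
                hands_off_since = now  # start the hands-off clock
            elif (now - hands_off_since > HANDS_OFF_GRACE_S
                  and now - last_alert > ALERT_INTERVAL_S):
                chime()                # nag repeatedly, not once per drive
                last_alert = now
            time.sleep(period_s)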

deaps, almost 7 years ago
I think one of my main concerns with "autopilot" is that for *a lot* of drivers, it will absolutely make the roads safer, both for them and for those who use the roads around them. Conversely, for some safer and more-alert drivers, it has the potential to make driving less safe.

jhanschoo, almost 7 years ago
Here's a relevant video that shows Autopilot directing a Tesla into a lane split:

https://www.youtube.com/watch?v=6QCF8tVqM3I

dre85, almost 7 years ago
I wonder what percentage of owners actually have the courage to turn on Autopilot? How many people here would/do?

ericb, almost 7 years ago
If I were building this, I would upload millions of hours of data from actual Tesla drivers, and I would have Autopilot releases step through the data and flag variances from the behavior of the actual drivers. I'd run this in a massively parallel fashion.

For every release, I'd expect the score to improve. With a system like this, I would think you'd detect the "drive towards traffic barrier" behavior.
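
A toy version of the replay-and-score harness described above might look like the following. The log format, planner interface, and tolerance are assumptions; a real system would shard the replay across a fleet of machines rather than one process pool.

    # Toy replay harness: score a candidate release against recorded
    # human driving. Log format and planner interface are invented.
    from concurrent.futures import ProcessPoolExecutor

    STEER_TOLERANCE_RAD = 0.05  # allowed steering disagreement

    def score_log(args):
        planner, log = args  # log: list of (sensor_frame, human_steer)
        flagged = 0
        for sensor_frame, human_steer in log:
            predicted_steer = planner(sensor_frame)
            if abs(predicted_steer - human_steer) > STEER_TOLERANCE_RAD:
                flagged += 1  # candidate diverges from the human driver
        return flagged / max(len(log), 1)

    def score_release(planner, logs, workers=8):
        # Embarrassingly parallel: each drive log is scored on its own.
        # (planner must be a picklable top-level function.)
        with ProcessPoolExecutor(max_workers=workers) as pool:
            fractions = list(pool.map(score_log,
                                      [(planner, log) for log in logs]))
        return sum(fractions) / max(len(fractions), 1)  # lower is better

A release whose variance score drops between versions is behaving more like real drivers; a spike on one stretch of road (say, a gore point) is exactly the kind of regression this would surface.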

stretchwithme, almost 7 years ago
Job 1: Don't run into things.

newnewpdro, almost 7 years ago
Tesla Autocrash, how much does this option cost again?

myth_drannon, almost 7 years ago
I was listening to a Software Engineering Daily podcast with Lex Fridman about deep learning for self-driving cars. Very interesting discussion of the ethics of self-driving cars. What he was saying is that we need to accept the fact that people are going to die in incidents involving autonomous vehicles. In order for these systems to learn how to drive, people will have to die. It's more of a societal change that is needed: 30,000 people die on US roads every year, and in order to decrease that number we need self-driving cars, even at a price that society as of now can't accept.

manicdee, almost 7 years ago
Short version: due to poor lane markings, Autopilot made the same mistake as many humans in the same situation and collided with the divider. Due to the frequency of this kind of accident, the crash attenuator had been collapsed and not reset, meaning the Tesla hit the concrete divider at full speed, as has happened in the past with humans in control.

But please continue to blame Autopilot for not being smarter than the human operating the vehicle.