From a UX (and potentially legal) standpoint, it matters less how many warnings he ignored in the few minutes prior to the crash than how many warnings he, and others, have ignored on other drives.<p>By way of extreme example, if every time anyone engaged autopilot it issued a warning every minute, you would quickly learn to ignore them. Information fatigue.
I'm not sure I like the focus on the driver here. I do think he was a fool for trusting the device.<p>It genuinely sounds like the autopilot failed to prevent a collision that, rightly or wrongly, we would expect it to prevent before these devices are broadly deployed.<p>A semi is a very large and reasonably well-defined visual mass that intersected with the path of the vehicle.<p>What the autopilot did, and when it did it, seems to be missing from this round of reports on the incident, and that's pretty lame.<p>Edit: Notice there is no mention of a COLLISION warning being issued? Only the nagging "hands on wheel".
Sounds like he either committed suicide or was unconscious/disabled before the accident.<p>An autopilot shouldn't just hand over control to the driver and pray when it needs help and the driver is unresponsive; it should try to come to a stop on the side of the road, then shout for help. Shouting via a cell call to 911 (or country-specific equivalent) wouldn't be a bad idea.<p>"This is a Tesla P85. My driver is unresponsive. The car is stopped on I-95 southbound, 2.1 miles south of exit 4 past I-295. The car is red. Please send help. This message will repeat three times."
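A rough sketch of what that kind of fallback could look like, purely hypothetical; the vehicle-control and calling functions below are made up for illustration, not any real Tesla API:<p>
    # Hypothetical unresponsive-driver fallback, not Tesla's actual behaviour.
    import time

    REPEATS = 3  # assumed number of times to repeat the message

    def handle_unresponsive_driver(car):
        car.turn_on_hazards()                  # assumed control primitives
        car.slow_and_pull_onto_shoulder()
        while not car.is_stopped():
            time.sleep(0.1)
        message = (
            f"This is a {car.model}. My driver is unresponsive. "
            f"The car is stopped at {car.location_description()}. "
            f"The car is {car.color}. Please send help."
        )
        for _ in range(REPEATS):
            car.place_emergency_call(message)  # e.g. an automated 911 call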
The article mentioned how newer Tesla vehicles have a "strikeout" system where the autopilot software disables itself until the next car startup if the driver repeatedly ignores warnings.<p>While this is definitely an improvement, how does it actually work? If the driver isn't paying attention, how do the newer vehicles force them to pay attention again and take back full control of the car? This seems like a really hard problem, and the article doesn't really dig into it.<p>Related: <a href="https://en.wikipedia.org/wiki/Dead_man%27s_switch" rel="nofollow">https://en.wikipedia.org/wiki/Dead_man%27s_switch</a>
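To make the question concrete, here is a minimal sketch of how a strikeout policy could be wired up; the threshold and method names are assumptions, not what Tesla actually ships:<p>
    # Toy "strikeout" counter: after too many ignored hands-on-wheel warnings,
    # Autopilot refuses to engage until the next car startup. The limit is a guess.
    MAX_IGNORED_WARNINGS = 3

    class AutopilotStrikeout:
        def __init__(self):
            self.ignored_warnings = 0
            self.locked_out = False

        def on_warning_ignored(self):
            self.ignored_warnings += 1
            if self.ignored_warnings >= MAX_IGNORED_WARNINGS:
                self.locked_out = True   # stays set for the rest of the drive

        def may_engage(self):
            return not self.locked_out

        def on_car_startup(self):
            self.ignored_warnings = 0
            self.locked_out = False      # the "until next startup" reset
<p>Of course this only answers when the lockout happens, not the harder question the comment raises: how to get an inattentive driver safely back in control once it does.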
While some (elsewhere) are arguing the system could have gracefully handed back control by slowing to an eventual stop, it's pretty clear the driver was at fault. It's also not clear how the autopilot could have done better: is slowing down on a high-speed road any less dangerous? It needed a human in the loop.<p>The evidence was that the driver was wilfully ignoring the safety warnings, just like ignoring a fence or sign above a steep cliff. That increased the fatal risk to him and other road users. Any other (less litigious) country would immediately place the fault where it clearly lies.
I know they changed the software somehow (although I don't know exactly how).<p>If the driver isn't interacting and following the rules, like keeping their hands on the wheel, why not slow the car and have it pull over with hazard lights on?<p>My car's systems (though not as advanced) will simply turn off if you're not doing your part.
I'd love to compare the results of this final analysis with the FUD that I remember coming out after this crash on Reddit et al., with a series of people saying 'I told you so'.<p>Complex systems of risk mitigation tend to get pushed in the media as an ideal solution to these accidents well before the complete facts come in. Common-sense precaution is typically ignored in this culture of risk-averse superiority that floats to the surface after these events, and it would ultimately result in very few of these experimental products, which drive innovation, being tested (as is clearly the case in many risk-averse cultures such as Japan and Switzerland).<p>With any early adopters of automation whose success is partially dependent on human intervention (i.e., flying planes), stories of failed incentives to know when, or whether, to listen to the warnings provided by the interface over your own intuition are helpful not only in training but also in the design of these systems.<p>Not to expect a certain degree of these failure situations, where the system largely acted as intended, is naive and ultimately defeatist.<p>I hope the drivers realize the risk they are taking on here and change their behaviour accordingly, but I can't imagine many formal systems that would be practically effective at deterring these situations, other than maybe adapting educational training and the content of the warning systems.
This was the crash where the Tesla drove at full speed into the back of a trailer, yes?<p>I don't see how the "7 safety warnings" are relevant here; that was just the car's reminder that he should put his hands back on the steering wheel. It does not mean the car detected it was losing vision and would stop steering the vehicle shortly. The safety warnings had nothing to do with the Tesla failing to recognize the obstacle ahead and brake.<p>I don't think the future looks good for autonomous driving if the NTSB is going to accept an explanation that there was no vehicle failure here because it gave some regular reminders to put your hands on the steering wheel.
Until we get to real autonomous cars, we'll probably have more issues with people and their hubris. My car has a bunch of these features, and I treat them only as a backup in case I mess up. They are redundancy. Nothing more.<p>Some people are way overestimating what these systems can do, or are paying less attention because they think the car will correct their mistakes. We're going to be in for a rocky few years where these advanced systems may lead to more crashes from inattentive drivers.<p>Having used these various technologies, I can firmly say that I can't wait for fully autonomous cars.
Maybe the biggest problem is Tesla calling it Autopilot. Just call it lane assist or adaptive cruise control. I know many other cars' systems will stop working if you take your hands off the wheel for too long.
The crash had nothing to do with the driver ignoring autopilot warnings. For some reason the automatic emergency braking feature of the Tesla did not work. This is a serious problem, and AEB should be completely separate from autopilot. What is the point of a safety feature that is supposed to stop you from crashing if it doesn't activate? In this case the driver appears to be an idiot who wasn't paying attention, but imagine if he had had a seizure, fallen asleep, been cut off and brake-checked, or something else.
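A minimal sketch of the separation being argued for, with made-up sensor and brake interfaces: AEB runs on every tick and deliberately ignores whether autopilot is engaged.<p>
    # Sketch only: automatic emergency braking as an independent safety layer.
    # Sensor/actuator calls are hypothetical; the threshold is a guess.
    BRAKE_TTC_THRESHOLD_S = 1.5   # assumed time-to-collision trigger

    def aeb_tick(sensors, brakes, autopilot_engaged):
        obstacle = sensors.nearest_forward_obstacle()
        if obstacle is None:
            return
        closing_speed = max(sensors.closing_speed_mps(obstacle), 0.1)
        ttc = obstacle.distance_m / closing_speed
        # autopilot_engaged is deliberately unused: AEB should fire either way.
        if ttc < BRAKE_TTC_THRESHOLD_S:
            brakes.apply_full()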
Worth noting that the driver seemed particularly enthusiastic about autopilot, to the point of frequently recording his experiences and publishing them on YouTube:<p><a href="https://www.nytimes.com/2016/07/02/business/joshua-brown-technology-enthusiast-tested-the-limits-of-his-tesla.html" rel="nofollow">https://www.nytimes.com/2016/07/02/business/joshua-brown-tec...</a><p>> <i>CANTON, Ohio — Joshua Brown loved his all-electric Tesla Model S so much he nicknamed it Tessy.</i><p>> <i>And he celebrated the Autopilot feature that made it possible for him to cruise the highways, making YouTube videos of himself driving hands-free. In the first nine months he owned it, Mr. Brown put more than 45,000 miles on the car.</i><p>> <i>“I do drive it a LOT,” he wrote in response to one of the hundreds of viewer comments on one of his two dozen Tesla-themed videos. His postings attracted countless other Tesla enthusiasts, who tend to embrace the cars with an almost cultish devotion.</i><p>That he was a Navy SEAL who specialized in electronics and bomb defusal probably also gave him additional confidence/hubris when driving.
Why does autopilot continue to drive the car without someone holding the steering wheel in the first place? If the car can't drive itself without driver input, then why isn't letting go of the steering wheel the equivalent of disengaging autopilot?<p>Tesla's technical hubris could have killed more than just the driver.
<a href="https://www.tesla.com/blog/tragic-loss" rel="nofollow">https://www.tesla.com/blog/tragic-loss</a><p>I think Tesla's original statement is relevant to post here. Although likely somewhat biased, they provided a succinct explanation to the reason why the Tesla did not brake automatically to avoid the collision.<p>From the statement: "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."
Has anyone driven one of these using Autopilot?<p>It strikes me as an eerie feature, in that semi-autonomous means you're not controlling the car, but at any moment you may have to jump in, generally unaware of the forces the car and tires have been experiencing.<p>It just seems like a major mindset shift from mostly idle passenger to active driver, and offhand I feel like until the car can reliably drive itself I'd rather just drive the whole way.
1. I think any autopilot system that requires users to keep their hands on the steering wheel and pay just as much attention is totally useless. In fact it's detrimental, since it gives a false sense of control and naturally leads users to lapse.<p>2. Anyone who has read Don Norman knows this is usually the fault of the system's design, not the user. The system needs to do more to assist users in these edge cases.
For a high-tech vehicle with so many safety features built in, I don't understand why an automatic collision avoidance system didn't override the autopilot and start braking well ahead of this high-speed impact. Maybe this model doesn't have such a system, or the traffic conditions changed so rapidly there was no time left to brake?
This again brings us back to the fundamentals.<p>Do you trust your self-driving car's AI algorithm? Yes? Sure, let's remove the car's controls (accelerator, brake, and steering wheel). Then safety and all other liability is on the car manufacturer.<p>If you don't have this level of trust in your algorithm, you don't really have a self-driving car.
Our family is looking at getting a Tesla and my wife brought up this incident. I told her that Teslas are not fully capable of driving themselves and that the guy was probably not paying attention to what he was doing. I'm sad this guy lost his life, but ignoring safety warnings for whatever reason earned him the Darwin Award.
Ars Technica has an article that's not paywalled:<p><a href="https://arstechnica.com/tech-policy/2017/06/tesla-model-s-warned-driver-in-fatal-crash-to-put-hands-on-steering-wheel/" rel="nofollow">https://arstechnica.com/tech-policy/2017/06/tesla-model-s-wa...</a>