Tesla: <i>"the incident occurred as a result of the driver not being properly attentive to the vehicle's surroundings while using the Summon feature or maintaining responsibility for safely controlling the vehicle at all times."</i><p>That's the "deadly valley" I've written about before - enough automation to almost work, not enough to avoid trouble, and expecting the user to take over when the automation fails. That will not work. Humans need seconds, not milliseconds, to react to complex unexpected events. Google's Urmson, who heads their automatic driving effort, makes this point in talks.<p>There is absolutely no excuse for an autonomous vehicle hitting a stationary obstacle. If that happened, Tesla's sensors are inadequate and/or their vision system sucks.
Read the 'updated' version; it explains that the user was doing nearly everything they shouldn't regarding this feature. <a href="http://www.theverge.com/2016/5/11/11658226/tesla-model-s-summon-autopilot-crash-letter" rel="nofollow">http://www.theverge.com/2016/5/11/11658226/tesla-model-s-sum...</a><p>That said, sure, it should be able to stop on its own, but I think they couldn't have been more clear that this is beta and the driver is still always responsible.<p>In my view the driver is just as liable as someone who puts on cruise control and doesn't pay attention. Is it the manufacturer's fault the car slammed into a vehicle in front of them while cruise control was on? No, I think any reasonable person would say it's the driver's fault.
If Tesla's response to this is actually what the article says, then that's somewhat worrying. It's never a good idea to blame the user for a failing of the product like this, especially on something like a car, beta version or not. If the car can't reliably avoid colliding with obstacles in Summon mode, then the mode shouldn't be available to the public yet.<p>This also points out a failing of Tesla's "we don't need LIDAR" strategy for sensors. Ultrasonic/IR sensors around the body might be reasonable for most driving situations, but clearly there are going to be incidents like this one if the car can't see at the full height of the body at close distance.
I feel that pressing "park" should be idempotent. If I press "park" twice in my car, I don't want to drive away once I get out. Tesla really needs a dedicated "start autopilot" button to make the intention to use the feature explicit.<p>Yes, apparently the fact that this feature was activated was messaged on the instrument cluster, but that shouldn't be sufficient to absolve Tesla from the liability of this poor UI decision.
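As a sketch of the distinction (every name here is hypothetical - this is not Tesla's actual firmware, just an illustration of idempotent "park" plus a dedicated Summon control):

    # Hypothetical sketch: pressing "park" N times leaves the car in
    # exactly the same state as pressing it once, and Summon has its own
    # deliberate entry point. States and names are assumptions, not Tesla's.
    class GearSelector:
        def __init__(self):
            self.state = "DRIVE"

        def press_park(self):
            # Idempotent: a second (or Nth) press is a no-op, never a
            # different command in disguise.
            self.state = "PARK"

        def press_summon(self):
            # Summon requires its own deliberate action, reachable only
            # from PARK and never as a side effect of re-pressing "park".
            if self.state == "PARK":
                self.state = "SUMMON_ARMED"

With that split, a nervous double-tap of "park" is harmless, and the logs would show unambiguously whether the owner invoked Summon on purpose.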
Is it just me, or if you can't detect obstacles within the complete bounding box of the car -- except for perhaps the top and bottom -- can you really even have this feature work reliably?<p>From a technical perspective, you just don't have all the data necessary, and therefore any solution will be guesses, hacks, and "best efforts", and cannot be improved via any manner of software update. This voids the "beta" claim made by the company, as no software update could remedy the situation.<p>Tesla has got to know this, and I think it's negligent for them to release a feature (even in "beta") when they know there are hard technical limitations (sensors, not software) that prohibit it from working properly. It puts property and people's lives at risk unnecessarily.<p>At a minimum, Tesla cars equipped or enabled with these features represent a higher risk to the public, and the owners of these vehicles should be required to carry high-risk insurance.
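To make the coverage gap concrete, a toy calculation - the sensor ceiling here is my own guess; only the car's rough roofline is a real number:

    # Toy illustration: if the sensors top out below the car's own height,
    # there is a blind band that no software update can reason about.
    CAR_HEIGHT_M = 1.44          # roughly a Model S roofline
    SENSOR_MAX_HEIGHT_M = 0.60   # assumed top of the ultrasonic coverage

    blind_band_m = CAR_HEIGHT_M - SENSOR_MAX_HEIGHT_M
    print(f"Obstacles between {SENSOR_MAX_HEIGHT_M} m and {CAR_HEIGHT_M} m "
          f"(a {blind_band_m:.2f} m band) are invisible to the car.")

Anything overhanging in that band - a trailer bed, a garage fixture - is simply outside the data, which is the sense in which no "beta" label fixes it.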
> In a statement to KSL, Tesla says that Summon "may not detect certain obstacles, including those that are very narrow (e.g., bikes), lower than the fascia<p>May as well rewrite that to read "May run over bikers or children". If you can't implement a feature properly, then don't implement it at all. If that means current Teslas can't do it because they lack the proper sensors, then they shouldn't do it.
There is a grave danger that Tesla's premature push of autonomous features could result in a PR disaster for self-driving technology if it actually ends up killing someone.<p>We shouldn't have this in the wild until we're sure it's ready.
How hard is it to disable the “dead man's switch” for this feature? Can it be done without searching the forum for hours? Is it documented in the owner's manual?<p>The direction of my blame here kind of depends on the answer to those questions. Of course, it's technically the owner's fault, but a feature like this really needs to be 100% idiot proof.<p>This is a new feature to many people, and it's exactly the type of feature that people are going to “test” outside of the ideal operating conditions. It's not Tesla's responsibility to account for every stupid decision of its customers, but Tesla should have at least done everything in their power to ensure that critical safety features couldn't be disabled (which they may have done; I don't know).<p>Most critical safety features on cars can't be trivially disabled (ABS, airbags, automatic seatbelt locks, etc...). The only safety feature that I can think of that can be trivially disabled is traction/stability control, but there's a real reason for this (getting out of deep snow/mud). Also, disabling traction/stability control is a multi-stage process on many cars. On late model BMWs at least, pressing the “DTC” button once will partially reduce traction/stability control, but not completely disable it. To the average person, it appears to be completely disabled. However, if you do a little research, you'll find that if you hold it down for another 5 seconds, it disables completely (sort of). Even with it completely disabled, certain aspects of the system remain on. The only way to completely disable those portions would be to flash custom software to the car (which is well beyond the ability of the average person).
I see a lot of comments about safety here. Note that Tesla's Summon mode limits the car to 1MPH and 39ft of movement. It also is very sensitive to resistance, to the point that I actually had to construct ramps for my car to climb the 1-inch lip at the entrance to my garage, otherwise it would stop at that point and refuse to go further. Using this feature to kill somebody is going to take a <i>lot</i> of effort. There are many interesting discussions to be had here about software, UX, corporate responsibility in the face of user error with bad UX, etc., but I don't think there's much room to discuss safety here. This is a risk to property, not life.
The "parked trailer" description is confusing.<p>A picture is worth a thousand words - check out the crash here:<p><a href="https://www.ksl.com/?sid=39727592&nid=148&title=utah-man-says-tesla-car-started-on-its-own-crashed-into-trailer" rel="nofollow">https://www.ksl.com/?sid=39727592&nid=148&title=utah-man-say...</a><p>This is not a sensor failure, a sensor just clearly is NOT on the roof forward to measure object height in close-contact situations.
From an updated version of the story (<a href="http://www.theverge.com/2016/5/11/11658226/tesla-model-s-summon-autopilot-crash-letter" rel="nofollow">http://www.theverge.com/2016/5/11/11658226/tesla-model-s-sum...</a>) it sounds like the driver was standing next to the car when it crashed. Tesla says that Summon mode started operating three seconds after he got out of the car.
<a href="http://kxan.com/2016/05/11/man-says-tesla-started-on-its-own-got-into-crash/" rel="nofollow">http://kxan.com/2016/05/11/man-says-tesla-started-on-its-own...</a><p>"[Tesla] is just assuming that I just sat there and watched it happen and I was okay with that." (in the video)<p>The article notes: "A worker at the business met him at the side of the road, Overton said, and asked him multiple questions about his car."<p>Tesla said 3 seconds passed from the door closing to the car moving.<p>It sounds to me like he was showing off some features to someone. It failed to stop as he expected it to. But he decided to blame the incident on Tesla.
I find it more worrying that Tesla will connect to your car to pull logs to manage their PR.<p>In future incidents, what prevents Tesla from forging logs and who could prove it?
Maybe it's just me but I find the fact that Tesla has ready access to such detailed logs to be extremely creepy and pretty much a showstopper for me ever owning a Tesla.
Tesla needs to have some sympathy for the user. People <i>will</i> accidentally press a button, sometimes more than once. Sometimes you press it just because it's there. This happens. The feature needs to be robust in the face of user error, and it seems like that didn't happen here.
Tesla's response to this issue is staggering to me. Summon mode is not a remote control. The car is controlling itself, and I don't think it's reasonable to expect the driver to be responsible for its actions when that is happening.
The traditional Silicon Valley development culture that developed under web apps isn't up to the complexity that results when software meets the real world in safety-critical contexts. Heck, as formulated, it wasn't ready to deal with smartphone apps!<p>You have a feature where the car navigates itself through parking situations, and no hardware or software developer ever paid attention to overhanging obstacles? That wasn't even a concern!? To me, this smacks of the same kind of arrogance and shortsightedness that caused Nest to release thermostats that deactivated without WiFi signal.
It's not clear to me that 'beta' is, or should be, an allowable thing with regard to vehicles.<p>As far as I'm aware you're not legally allowed to put out-of-spec tyres or brakes on your car, but somehow a beta feature that can autonomously control a vehicle is okay. I'm not convinced.<p>Personally I'm not at all fond of the idea of trialing beta features. If you ride the bleeding edge, expect to get cut.
Update: Driver whose Tesla Model S crashed while using Summon was breaking all the rules<p><a href="http://www.theverge.com/2016/5/11/11658226/tesla-model-s-summon-autopilot-crash-letter" rel="nofollow">http://www.theverge.com/2016/5/11/11658226/tesla-model-s-sum...</a>
Tesla says:<p>> As such, Summon requires that you continually monitor your vehicle's movement and surroundings while it is in progress and that you remain prepared to stop the vehicle at any time using your key fob or mobile app or by pressing any door handle.<p>Let's consider the choices. Key fob? Might be in your pocket. App? Has anyone from Tesla tried to <i>use</i> the Tesla app? It frequently takes literally minutes to respond. What if you try to press the door handle and the [expletive removed] door handle sensor doesn't notice? (The latter happens all the time with my car. It usually works when I press very hard on it, which might be a challenging thing to do when the semi-autonomous car is <i>moving</i>.)<p>This crap makes me glad my Tesla is too old to support their beta autopilot.
Tesla's official statement:<p><i>Safety is a top priority at Tesla, and we remain committed to ensuring our cars are among the absolute safest vehicles on today's roads. It is paramount that our customers also exercise safe behavior when using our vehicles - including remaining alert and present when using the car's autonomous features, which can significantly improve our customers' overall safety as well as enhance their driving experience.
Summon, when used properly, allows Tesla owners to park in narrow spaces that would otherwise have been very difficult or impossible to access. While Summon is currently in beta, each Tesla owner must agree to the following terms on their touch screen before the feature is enabled:
This feature will park Model S while the driver is outside the vehicle. Please note that the vehicle may not detect certain obstacles, including those that are very narrow (e.g., bikes), lower than the fascia, or hanging from the ceiling. As such, Summon requires that you continually monitor your vehicle's movement and surroundings while it is in progress and that you remain prepared to stop the vehicle at any time using your key fob or mobile app or by pressing any door handle. You must maintain control and responsibility for your vehicle when using this feature and should only use it on private property.</i>
Maybe it's early days yet, but I'm not comfortable with the approach taken by Tesla. Either give the car full control, or else the operator must be in full control with the technology playing an assistive role. I cannot be expected to sit doing nothing behind the wheel for hours and then suddenly be called upon to take over the driving in a split second.<p>Obviously for the former option Tesla is not there quite yet (and that is an understatement), but I wonder if better sensor tech will not help here. Sensor tech is reasonably robust; thus, for example, even if the autopilot is no longer able to properly make out the markings on the road, a sensor override should be able to determine (using radar?) the locations of nearby vehicles within, say, a 100-meter radius, thus ensuring collisions are avoided and giving the human driver several seconds or even a minute to take over. A rough sketch of what I mean is below.<p>No wonder this [1] effort from Volvo focuses on, among other things, radar tech. I think that is a core tech for the success of self-driving cars.<p>[1] <a href="http://spectrum.ieee.org/cars-that-think/transportation/self-driving/volvos-selfdriving-program-will-have-redundancy-for-everything" rel="nofollow">http://spectrum.ieee.org/cars-that-think/transportation/self...</a>
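A rough illustration of the radar override described above - the thresholds, the handover window, and the Contact interface are all my own assumptions, not Volvo's or Tesla's design:

    from dataclasses import dataclass

    @dataclass
    class Contact:
        range_m: float            # distance to the radar return
        closing_speed_mps: float  # positive = the gap is shrinking

    SCAN_RADIUS_M = 100.0    # coverage radius suggested in the comment above
    MIN_GAP_M = 5.0          # assumed minimum safe standoff distance
    HANDOVER_WINDOW_S = 5.0  # assumed time a human needs to take over

    def should_force_stop(contacts):
        """Independent radar watchdog: command a controlled stop if any
        contact would be reached before a human could plausibly take over."""
        for c in contacts:
            if c.range_m > SCAN_RADIUS_M or c.closing_speed_mps <= 0:
                continue  # out of range, or the gap is growing
            if (c.range_m - MIN_GAP_M) / c.closing_speed_mps < HANDOVER_WINDOW_S:
                return True
        return False

The point is that a watchdog like this runs regardless of what the lane-keeping or vision stack believes, which is exactly the redundancy argument in the Volvo piece.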
To be clear: this was a UX failure on the part of Tesla, not merely a sensor failure. And they are super wrong to blame their faultless user for their screwup.<p>The UX failure is that this happens if you simply double-tap the park button and exit the vehicle. That's it! It starts moving forward. No fob interaction, no confirmation, nothing. I'm not making this up; it's insanely bad UI. [1]<p>A double tap vs. a single tap is super easy to do mistakenly. And there we get to the really sh*tty thing: Tesla must know this, and they are selling out their customer to cover their ass. The whole "the logs prove the user is in the wrong" line is wrong and disingenuous. All the logs show is that he double-tapped the park button... probably meaning to do a single tap!<p>Shame on Tesla for disingenuously blaming the user for their design error with such a safety-critical feature. I hope they are more careful than this as they move forward. (No pun intended.)<p>[1] video of this "feature" in action (thanks to user schiffern) <a href="https://www.youtube.com/watch?v=t-JoZL9edlA" rel="nofollow">https://www.youtube.com/watch?v=t-JoZL9edlA</a>
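For contrast, a minimal dead-man-style sketch of a safer trigger - the tick rate, the timeout, and the Car hooks are hypothetical, not Tesla's API:

    import time

    HOLD_TIMEOUT_S = 0.5  # assumed staleness limit for the "still holding" signal

    class Car:
        """Stand-in for the drive interface; both methods are hypothetical."""
        def creep_forward(self):
            print("creeping at walking pace")
        def stop(self):
            print("stopped")

    def summon_loop(car, last_hold_time):
        # last_hold_time() returns the monotonic time of the most recent
        # hold signal from the fob or app.
        while time.monotonic() - last_hold_time() <= HOLD_TIMEOUT_S:
            car.creep_forward()
            time.sleep(0.1)
        car.stop()  # the hold signal went stale: halt immediately

Under a scheme like this, an accidental double-tap can never produce motion on its own; the car moves only while the owner actively holds a control, and stops the instant they let go.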
This feature has me wondering what the SAE standards for automatic transmissions have to say on the matter, if anything.<p>I'm not familiar with them, nor do I have access, but I'm hoping someone here might be helpful on either aspect. Specifically, I'm wondering whether the standards for automatic transmission controls recommend against having the Park setting do anything beyond activating the parking pawl and parking brake (when electronically controlled).
Isn't the real issue here that the trailer doesn't appear to have an ICC bar (rear underride guard)? I'd expect the object detection to see that.
Example:
<a href="http://www.morgancorp.com/images/08_options/01_bumpers/options_large/03-optional-bumpers.jpg" rel="nofollow">http://www.morgancorp.com/images/08_options/01_bumpers/optio...</a>
I'm not trying to take this to extremes, but my first thought reading this, especially the part about it not necessarily detecting objects close to the ground, is how long will it be before a Tesla in summon mode runs over a kid playing in the driveway? It doesn't seem inconceivable, and you would think it should be literally inconceivable that something like that could happen before releasing the feature. If it was a critical safety feature that could have a deadly side effect - like an airbag - that's a different thing entirely. But a convenience feature more akin to comfort locks should be held to a higher standard of safety.<p>It's a similar situation to determining what side effects are acceptable in medication. If it cures a deadly disease, serious side effects including chance of death are acceptable. If it's a cure for, say, male pattern baldness, that level of risk would obviously be unacceptable.
They need lidar, and some UI improvements.<p>GM built an ignition switch that killed people; I'm OK with Tesla releasing a semi-autonomous beta that you have to use correctly.
>We can't be required to be smarter than the software<p>Jesus - that's the kind of stuff that makes me afraid for the future of humanity. Yeah I get that this may very well have been caused by human stupidity and also that the software that powered the car should have been smarter than to do that, but that quote just gave me the chills.
"Autonomous Vehicles Could ‘Change Everything,’ But ‘Growing Pains’ Are Likely"<p><a href="http://www.wbur.org/2016/04/29/traffic-future-driverless-cars" rel="nofollow">http://www.wbur.org/2016/04/29/traffic-future-driverless-car...</a>
How come the NHTSA doesn't have rules concerning this scenario? Regardless of how it was operated the car should not run into obstacles when no one is behind the wheel and it is guiding itself.
It's too bad the driver stated that he didn't use the summon feature. It gave Tesla an excuse to shift the conversation away from the real issue.
Well, given the shape of the obstacle, could it be that the Model S doesn't have sensors for that? I.e., there's clear space in front of the car at sensor height, just not enough clearance higher up.
Beta is Beta.<p>Expect not to see any new features being offered before they've been run through the entire gamut at this point.<p>Some might feel that's good.<p>Others may want to see what's going on.<p>The release cycle will change from bimonthly to each new car model.<p>Congratulations, everyone.