> <i>According to the NTSB, a witness had come across the stopped CR-V prior to the collision and noted that neither its taillights or hazard lights were illuminated. She was able to change lanes and avoid hitting the Honda, but after she passed it, she saw the Mach-E strike the stopped crossover in her rearview mirror.</i><p>This scenario sounds like what I refer to as a "peek-a-boo rear-ending". You see it exaggerated in movies with high-speed chase scenes, but its real-world equivalent is a very common scenario on the road, and at 45+ mph it almost certainly leads to a collision if the trailing driver isn't maintaining situational awareness of traffic flow several cars ahead... a scenario that aggressive tailgaters simply don't factor into their risk calculus, and one that BlueCruise clearly isn't capable of handling either.
I see a number of negative stories, so I'll share my positive story with self-driving cars.<p>Two weeks ago, driving home: Tesla Model Y with FSD, FSD on, nighttime, suburban streets. The car phantom-brakes and I can't see a reason why. I glance down and see the UI thinks a human is running into the street.<p>I look back up, canceling FSD and taking over… and a damn invisible guy in a dark brown jacket and black pants races past my headlights.<p>He had stopped before crossing my lane, but when he saw FSD slow down he booked it across. Other cars slowed only after he cut straight into my headlights and they noticed him for the first time. No one else saw him either. I was super impressed with FSD.<p>The promised perfect future isn't here - you can't take a nap while the car drives you - but the current state is a value add.
A car stopped on the highway at night, with no warning lights or anything, in the center lane... I'm not sure I'd be able to spot it and react in time. This is exactly what I would expect radar and the car's other sensors to pick up and brake for... those also failed...
> According to the NTSB, a witness had come across the stopped CR-V prior to the collision and noted that neither its taillights or hazard lights were illuminated<p>I'm shocked by how many people in the US drive at night with only their daytime running lights on. They seem to be completely oblivious to it; even when I stop next to them at a traffic light and try to tell them, they fail to understand that daytime running lights don't include a rear light.
This is great! We're starting to treat car crashes like plane crashes [1].<p>[1] <a href="https://www.nytimes.com/2006/10/31/health/31safe.html" rel="nofollow">https://www.nytimes.com/2006/10/31/health/31safe.html</a>
The CR-V was stopped with its hazards and lights off, at night, in the center lane of a 10-lane freeway. A human driver could certainly have made the same mistake.
I own a Model 3 with FSD. One of the key realizations I had after a few months with it is that self-driving at the moment is like supervising a newbie teenage driver: you have to stay alert and watchful, but you don't have direct control. When you're teaching a new driver, that's worth it.<p>Honestly, it's a lot less work to just do it yourself.<p>This is one of those engineering situations where the 0.0001% edge cases matter and could lead to fatalities. I don't know if any of the implementations are up to the mark. FSD isn't.
I wish we put 10% as much money into public transit as we do for cars and their associated infrastructure.<p>The roads would be more open for people who do enjoy driving. And folks like me could kick back and let the bus driver/train engineer/etc. handle it.<p>Self-driving vehicles just seem like a subpar way of solving the problem of getting someone from A to B without requiring their full attention.
When I'm using BlueCruise, I can't look at a sign beside the road without a warning beep telling me to keep my eyes on the road. There are few roads where BlueCruise even remains active for long without dropping out of "hands-free" mode, and once it does, if my hands aren't positioned properly on the wheel, or even rest too passively on it during turns, there's an alarm.<p>I honestly don't know how people lose track of the road while using BlueCruise. Wearing glasses with eyes painted on them, with rubber hands hooked onto the wheel?
I am so glad that I am not involved with self-driving systems (except as a squishy object on the same road).<p>The systems are so incredibly complicated and convoluted that I just cannot see how they could ever be assumed safe. I've worked on (relatively) large legacy hairball embedded systems where you don't have confidence in the determinism of the behaviour, and it was deeply discomfiting even when the worst price of being wrong was broken hardware or a ruined production run and the customer giving a support engineer an earful.<p>I can't imagine the stress of dealing with a much larger, more complex system that ends up in an open, public, uncontrolled environment where the price of a mistake is smearing a once-living human into paste.<p>I'm also intrigued by how these systems are tested. Presumably, as well as unit testing everything to death, you use some kind of integration test where you replay scenarios and check outcomes, and/or some stochastically generated scenarios. This must take a <i>lot</i> of time and resources.<p>As a shower thought, wouldn't it be interesting if you could get a virtual model of a public-road self-driving car and throw scenarios at it at will, to allow external testing?
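To make that shower thought a bit more concrete, here's a toy sketch of what such a scenario-replay harness might look like. The scenario file format, the <i>run_stack</i> stub, and the <i>expected_action</i> field are all made up for illustration; a real pipeline would be vastly more elaborate. The shape is: load a recorded scenario, feed its frames through the stack, assert on the outcome.

```python
import json
from pathlib import Path


def load_scenario(path):
    """A recorded scenario: timestamped sensor frames plus the outcome
    we expect the driving stack to produce (e.g. "brake", "lane_change")."""
    return json.loads(Path(path).read_text())


def run_stack(frame):
    """Stand-in for the system under test: takes one sensor frame and
    returns the planned action. In a real harness this would call into
    the actual planning/control code or a high-fidelity simulator."""
    raise NotImplementedError("plug the driving stack in here")


def replay(scenario):
    """Feed the recorded frames through the stack and check the outcome."""
    actions = [run_stack(frame) for frame in scenario["frames"]]
    assert scenario["expected_action"] in actions, (
        f"{scenario['name']}: expected {scenario['expected_action']!r}, "
        f"got {actions}"
    )


if __name__ == "__main__":
    # Replay every recorded scenario in a (hypothetical) scenarios/ folder.
    for path in sorted(Path("scenarios").glob("*.json")):
        replay(load_scenario(path))
```

The interesting (and hard) part is everything the stub hides: producing realistic sensor frames and deciding what counts as a correct outcome for each scenario.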
BlueCruise or not, the fact that our society accepts the number of highway fatalities that we do while underinvesting in obvious rail corridor and public transit opportunities is saddening.<p>The number of daily car-crash fatalities is like 20 fully loaded commercial airliners crashing every day. We would never just sit back and accept that in the world of commercial aviation.<p>Philadelphia and San Antonio are two very large cities where many daily car trips could be eliminated (this does not mean I'm telling people to sell their cars and walk everywhere) with even a moderately conservative amount of public transit investment.
I suppose Ford could just do what Tesla does and have the car disable BlueCruise right before the crash, so the car isn't _technically_ 'self-driving' when the crash happens.<p><a href="https://www.motortrend.com/news/nhtsa-tesla-autopilot-investigation-shutoff-crash/" rel="nofollow">https://www.motortrend.com/news/nhtsa-tesla-autopilot-invest...</a>
I almost had an accident like this.<p>I was driving home on the freeway late at night and there was a car on its roof that had been in an accident in the lane I was in. It was at the end of a long sweeping curve and I just barely saw it in time.<p>I was really glad I was in the car I was in and paying enough attention. I was in my fun car going too fast but was also more alert because of that.
Great, let's investigate every single fatal crash and make actual changes based on those. Let's not limit it to crashes involving systems like BlueCruise or Autopilot. If a human driver causes a fatal crash, let's make changes so that doesn't happen again as well.
I believe cars with autodrive etc. often interpret their surroundings by looking at objects moving relative to the background. If an object is stationary when the car first sees it, it is much harder for the software to separate it from the "backdrop".
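For a rough sense of why that is, at least for radar-based trackers: returns are often gated on estimated ground speed, and anything whose ground speed is near zero looks like bridges, signs, and guardrails. The data structure, threshold, and sign convention below are illustrative assumptions, not any manufacturer's actual logic.

```python
from dataclasses import dataclass


@dataclass
class RadarTarget:
    range_m: float        # distance to the return, metres
    closing_mps: float    # speed at which we approach it (Doppler), m/s


def keep_moving_targets(targets, ego_speed_mps, min_ground_speed_mps=2.0):
    """Naive stationary-target filter (illustrative only).

    Estimated ground speed = our own speed minus closing speed. Returns
    whose ground speed is near zero are indistinguishable from roadside
    clutter to a naive tracker, so they get dropped -- which is exactly
    why a stopped car in a live lane is a hard case.
    """
    kept = []
    for t in targets:
        ground_speed = ego_speed_mps - t.closing_mps
        if abs(ground_speed) >= min_ground_speed_mps:
            kept.append(t)
    return kept


# At 30 m/s (~67 mph), a stopped car ahead closes at 30 m/s, so its
# estimated ground speed is ~0 and the naive filter throws it away.
targets = [
    RadarTarget(range_m=80.0, closing_mps=30.0),  # stopped car in the lane
    RadarTarget(range_m=60.0, closing_mps=5.0),   # slower car, still moving
]
print(keep_moving_targets(targets, ego_speed_mps=30.0))  # keeps only the moving car
```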
Is this the first publicized crash of a hands-free system in a non-Tesla consumer vehicle?<p>I'm not counting the Cruise autonomous one, since that wasn't a consumer vehicle. For what it's worth, I own a Mach-E and have used BlueCruise. I like it, but it's just L2.
Would the pre-collision avoidance systems that are now standard in a lot of vehicles have been able to prevent an accident in this kind of "peek-a-boo" scenario?
Anyone who knows anything about AI would only use it to drive their car as a last resort. The driving public's ignorance about AI's capabilities is astonishing.
As expected, half of these comments are victim-blaming and trying to argue that this isn't BlueCruise's fault… and that any human would have done the same. At the same time, everyone keeps ignoring the fact that another car, moments before, drove around the stopped vehicle safely.<p>So many Tesla haters here with vindictive mentalities.
Driving on the roads just gets a little more anxiety-ridden with every douche who overestimates the technology.<p>Making personal vehicles the primary mode of transportation is the greatest failed experiment in this country.