I would find it kind of satisfying if Waymo solidly won the self-driving race. It would be a triumph of careful engineering over unreliable "good-enough" solutions.
All of the current-gen assisted/self-driving tech seems like it's operating in a legal grey area, like Uber. Every car manufacturer assumes people are still behind the wheel, and the government is (naively?) going along with it. "By operating this vehicle you agree to..." seems to absolve them of any responsibility, including for the quality of the software, and it honestly scares the hell out of me.

I don't know what the fix is for this; the genie's out of the bottle here. We won't have reliable self-driving cars for years (decades?), and until then we're stuck in this horrible wild west where an off-by-one error can cause a pileup on the highway.

Let's stop the assisted/self-driving stuff until we have a regulatory framework that can prove the tech works in various conditions, much like seatbelts and collision testing.
This is right about where I expected their real-world capabilities to be today, based on their sensor setup and the state of ML for self-driving: not very good, and nowhere near the Elon hype train or its timetable. The real question is how much of this can actually be fixed by software updates alone. Personally, I don't think they can even get to 99% reliability on these use cases, let alone the number of 9s required to be at least as good as the human drivers it's supposed to replace. My money is still solidly on Waymo being the frontrunner for years.
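For a rough sense of scale, here's a back-of-the-envelope sketch of what "as good as a human" implies, assuming the commonly cited US figure of about one fatal crash per 100 million vehicle miles and a made-up 10-mile average trip (both numbers are illustrative, not authoritative):

    # Back-of-the-envelope: how many 9s does "human-level" driving imply?
    # Assumptions (illustrative only): ~1 fatal crash per 100M vehicle
    # miles, and a hypothetical average trip length of 10 miles.
    fatal_per_mile = 1 / 100_000_000
    miles_per_trip = 10
    p_fail = fatal_per_mile * miles_per_trip           # ~1e-7 per trip
    reliability = 1 - p_fail
    print(f"per-trip reliability: {reliability:.9f}")  # 0.999999900

That's roughly seven 9s per trip just on fatalities, before even counting non-fatal crashes, which are orders of magnitude more common. 99% doesn't begin to cut it.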
Wow, this is pretty damning:

> Traffic Light and Stop Sign Control is designed to come to a complete stop at all stoplights, even when they are green, unless the driver overrides the system. We found several problems with this system, *including the basic idea that it goes against normal driving practice for a car to start slowing to a stop for a green light*. At times, it also drove through stop signs, slammed on the brakes for yield signs even when the merge was clear, and stopped at every exit while going around a traffic circle.

(emphasis mine)

Is that really how this feature is intended to work?
I suspect some aspects of driving a car consume a majority of a person's brain power, even if it doesn't feel like it.

Ever get into a situation where there are on-ramps and off-ramps on both sides, you've never been on that stretch of road before, and there's a lot of traffic? Driver conversation stops. The driver's attention to the audiobook or podcast stops. It takes all your attention.

If that's the case, then maybe we won't get true self-driving until these systems have the processing power of a human brain...
In the summer of 2015, Elon Musk said the software upgrade Tesla was rolling out would allow the Model S to have hands-free autopilot.

Five years later, even for an extra $8k, we still don't have hands-free autopilot.
To be honest, I just don't understand how what is shown could be legal: shipping beta software on something that could easily kill, making drivers, passengers, and other road users into lab rats and beta testers.
I do admire what Musk did with SpaceX, but the 'self-driving' aspect of Tesla is just disgusting.
Would keeping Autopilot 'dumb' be a good way to keep users engaged and continue training the network, while Tesla keeps and improves a far more capable version internally? I think Autopilot is a huge legal liability for Tesla, and I would imagine being careful is a high priority. Once Tesla starts marketing full self-driving capability, it will open a floodgate of legal trouble, because once the system is supposedly fully capable it can no longer hide behind the 'beta' label. They've put out videos showing Autopilot doing complex things, and I think that was a pre-release version they were using. IDK though, just wondering.
Shameless self-promotion: I wrote an article on this a while back: https://jarbus.net/Tesla-and-False-Advertising-In-AI

At this point, Elon Musk is just blatantly falsely advertising FSD. He's trying to sell a technology that a) doesn't exist, and b) has no guarantee of being ready anytime soon. I wouldn't be surprised if people started demanding refunds pretty soon.
Who would've thought that relying on a statistical model, whose many thousands of parameters are not understood, would lead to such an erratic end product?
What does HN think of the Ghost self-driving product? Is it real or a scam? They claim to be able to add limited Level 3 autonomous operation to most late-model cars.

https://gh.st/home