I've had FSD since the very first beta, and honestly even the 13-mile number is generous. Maybe on freeway-only driving it's every 13 miles. On city streets it needs manual intervention more like every 1-2 miles, unless you want to be the biggest nuisance on the road and a total jerk to everyone around you.
I had a really negative view of FSD from just reading and seeing stuff online, until I finally decided to rent a Model Y on Turo with FSD... I was absolutely blown away.

I drove it from Houston to Amarillo (600 miles) and had to touch the wheel only a couple of times. That includes pulling me off the freeway, into the freaking parking spot next to the Supercharger, and finally through my neighborhood to the front of my house.

For the price I don't think the MY or M3 can be beat, and it will surely be very high on the list for my family's next vehicle.
This is true; however, it has been improving quickly. Releases are coming once every couple of months. There is already a newer release than the ones they tested, and each release is noticeably better. There is no indication of a ceiling yet.

I do take issue with the claim that "its seeming infallibility in anyone's first five minutes of FSD operation breeds a sense of awe that unavoidably leads to dangerous complacency". Waymo claims this as well, citing it as the reason they never released a driver-assistance feature and went for full autonomy instead. However, this is essentially speculation that has not been borne out in practice. There has been no epidemic of crashes from people abusing Autopilot or FSD. It's been on the roads for years. If "dangerous complacency" were a real problem, it would be blindingly obvious from the statistics by now. There have been a few well-publicized cases, but statistically there is no evidence that this "dangerous complacency" problem is worse than normal driving.
Like a fool I purchased the FSD feature on a new Tesla in March 2019. All this time later, it still does absolutely nothing in my country. It's actively dangerous to use because it can't even recognize speed limits and will happily drive at 120 km/h in a 100 km/h zone.

I'm going to get rid of the car soon. This feature cost 7,500 euros, but its resale value is essentially zero because everyone knows it's a complete joke.

Obviously my next car won't be from this scam company. Worst purchase I ever made.
Why should I trust Tesla's AI with my life, much less everyone else's? They couldn't even get the Cybertruck's trim right! It's wild that we haven't demanded greater governmental oversight of consumer AI products, but in time it will become inevitable.
"requires human intervention every 13 miles" is a horrible metric, because it makes it sound it's not so bad until you remember that the moments for intervention are unpredictable and are also when you're just about to die.
FSD is getting so good so fast. The difference between a year ago and now is night and day. It's a godsend for road trips and it amazes me with each passing month's improvements.

It's not perfect and people shouldn't expect that. But I don't understand how anyone experiences FSD and isn't amazed. It's not unsafe; if anything, my interventions are because it's being *too* safe/timid.

Weather forecasting isn't perfect. But it's pretty good! And it's getting better! Just because weather forecasting isn't perfect doesn't mean I won't use it, and it doesn't mean we should stop improving it.
How does it respond when there is a police officer directing traffic?

What if the police officer gives a verbal command?

FSD, as Tesla markets it, is hype. They are nowhere close to a marketable solution for the use cases they advertise: robotaxis, the ability to step out of your car and have it park somewhere else by itself, etc.

Yes, they will get it to 99% at some point, but 99% is not good enough for legal liability.

FSD is an ambitious goal and I don't criticize Musk for pursuing it. But I will criticize him for using it as a distraction from the fact that Tesla has a stale and/or uncompetitive product line and is rapidly losing its lead to both conventional OEMs and Chinese EV makers.
I am so, so tired of Tesla's claims that FSD is "multiple times safer than humans" when the data behind those claims come largely from people using FSD in exactly the safe conditions that made them engage it in the first place (mostly long, straight highways).

*Anyone* trying FSD in a crowded city environment would shit their pants. Unprotected lefts are very often a mess, and interventions are legion. It is really a breath of fresh air to finally hear news outlets report on the actual state of the technology.
Like others here, I think 13 miles may be generous for city driving but pretty reasonable for highway.

With that said, it works MUCH better for me than it did a couple of years ago, and I find that most of the time I disengage it is not because it actually needed human intervention but because it wasn't aware of the social norms at certain intersections.

For example, I have an intersection near my house that requires you to inch out and essentially floor it the first chance you get if you want any hope whatsoever of making a left turn, but FSD will (understandably, I think) not do that.
In Germany someone is suing Tesla over "phantom braking", and the judge ordered an independent court-appointed expert to check. After 600 km of driving, the car braked without reason, and the expert wrote in his report that the situation was dangerous enough that he had to stop any further testing. This is now official record and can be referred to in other lawsuits. We'll see what happens...
I use the monthly subscription every couple of months on my Model Y, and FSD has become quite good. For me the two recurring problems are that it does not follow traffic rules when merging, and that navigating to an exit in back-to-back highway traffic simply does not work. In the rest of the cases it's pretty good, or perhaps a better way to put it: the best on the market right now.
Waymo requires interventions about that often in San Francisco too, in my experience over many trips. Their interventions are automatic: the car calls back to home base, and a remote operator makes a determination about how to proceed. That happens about once every half hour when I ride a Waymo in SF.
FSD?! They can't even fix the damn windshield wipers!

The car ahead of me decides to clean their windshield and 3 droplets get on my windshield? Tesla: *WIPER LUDICROUS SPEED ENGAGED*!

I just drove next to a semi and got blasted with a tsunami of water in a rainstorm? Tesla: *...*
It's incredible that Tesla is nearly a $1T corporation because it is about to announce a robotaxi. Meanwhile its actual car sales are shrinking quarter by quarter. And its CEO supports the presidential candidate who wants to do away with carbon credits (~40% of Tesla's net income).

If you want any evidence that the market isn't rigged, just full of gullible idiots, Tesla is it.