I just completed my trial and it was better than I expected, yet still utterly terrifying at points. It still feels _very_ far from being able to safely navigate and negotiate even medium-complexity environments, but maybe the vaunted 12.4/12.5 update will change all of that.

For example:

* It still won't change into or out of a solid-line HOV lane here in Arizona. Feels like an easy fix, but there it is.

* I have concerns about its ability to check oncoming traffic when pulling out of an occluded side street or alley. For example, my alley (where my garage leads) connects to a MAJOR road that is extremely fast, and the sight line is fairly occluded by bushes and a light post. A human will lean forward, crane their neck, and pick up subtle changes in light and shadow through the bush itself to detect _any_ movement and interpret that movement as a potential car, even if they can't positively see one. They can inch forward until they can see that the path is clear. The Tesla's side-facing camera is in the B-pillar, behind the driver's head; at best it can inch forward (and it does), but getting a high-confidence positive reading that the path is clear is... well, nearly impossible in certain cases that aren't impossible for humans, and that's concerning.

* Parking still takes one too many adjustments, and impatient drivers around you definitely notice.

* At one point, the FSD/AP engine itself crashed on me while fully engaged. Unfortunately, this happened on a freeway connector ramp with a pretty steep curve, and when it crashed, it disengaged the steering and sent us careening toward the barrier. It was a single-lane HOV ramp and we were going about 70 mph, so if I hadn't been hover-handing, it would've easily resulted in a bad accident. This wasn't a case of a normal disengagement or AP getting scared or losing confidence: the engine itself suddenly, without warning, and for no discernible reason, crashed entirely. (It immediately threw an error and said "AP System Error / Take Control Immediately.") It then showed the car in a sea of black as the visualization/FSD engine rebooted. This sort of crash is kryptonite. Its randomness, its senselessness, and its opacity about what (if anything) caused it are haunting. Again, a failure like this with no driver would result in catastrophe.

On the flip side, I was fairly surprised at how well it handled a lot of basic driving tasks. Visual-only parking still freaks me out (especially since my model HAS ultrasonics, but they're disabled when you go visual-only, which is absurd), and a couple of turns felt close to the curb, but overall the driving was fairly smooth and decent.

I have the added benefit of living in Phoenix, which is Waymo country. Waymos drive more confidently and, more importantly, are already fully autonomous. They navigate complex environments fairly decently (though, for example, my dad got stuck in one a few weeks ago doing loops of a dealership parking lot that confused it) and they're comfortable to ride in. They're not yet on freeways, though apparently that'll change soon, and they only go the speed limit, which in Phoenix is... a choice.

Elon keeps pushing this dream of a robotaxi fleet of Teslas, but I agree with the OP that it feels a long way off before I'd be comfortable with these things driving fully autonomously, and I say this as someone who sees a half dozen Waymos every single day.
I also wonder more broadly about the core conceit here: not that fractional car ownership doesn't make sense (it absolutely does), but whether Tesla owners are going to be comfortable with their ~$50k-$150k vehicles roaming around picking up strangers who... hopefully don't do things to their car, all while hoping the car comes back home. I don't believe Elon was pitching the robotaxi fleet as wholly Tesla-owned vehicles, but it seems like a big societal shift to get people comfortable with their cars having minds of their own and taking in randos.