> A hit-and-run incident in Melbourne yesterday could set a legal precedent for the use of autonomous driving technologies in Australia.<p>There is no precedent to be set. If you are behind the wheel of a car, you are responsible for its operation, 'Autopilot' or not. And it's hard to argue it was the autopilot's fault for driving off and returning to the police station two hours later.
The run part: I think they're a young overseas resident who panicked. They returned and handed themselves in to police at the scene.<p>A lot of younger foreigners have no idea what to do in an accident, and in their home country can face retribution or extortion, sometimes at the hands of police. In Australia, you stay, you help the victim, you report the accident. That's the law. In theory all drivers should know this, but you get a year on your overseas licence before having to sit a local road rules exam.<p>I have nothing to add regarding self driving. Well, I do: I think it's been misnamed and oversold.
1. A driver who drove off after an accident claims they were using Autopilot at the time.<p>2. The road photographed in the article isn't a highway, and so I'm not sure it would be accepted by regular Autopilot.<p>3. Whilst FSD might accept that road, FSD isn't available in Australia.<p>I think the driver's claim that Autopilot was in use needs to be questioned.
In Melbourne, there are quite often four-lane roads (two lanes in each direction) with tramlines in the inner lane of each direction. At tram stops, passengers need to walk across the outer lane to get to and from the sidewalk, so cars in the outer lane must stop level with the tram, even though they are not directly behind it, to make way for passengers.<p>Having moved here from Sydney a few years ago, where there are no trams in this configuration, this took a lot of getting used to: if I was in the outer lane, my natural instinct was to keep driving, since the road ahead seemed clear at first.<p>I'm not sure how common this is globally, but I wonder if Tesla's Autopilot handles this edge case. I imagine it would be difficult to program, since it would detect the tram as simply another vehicle and the road ahead as clear. Then again, I suppose it should have detected the pedestrian in any case?
The key word here is "blamed", as this likely has nothing to do with Autopilot. The person panicked, fled the scene, and then made up stories to pretend they were not the cause of the accident.
I didn’t realize Autopilot disabled the vehicle’s brakes. This poor driver couldn’t stop even after the collision. Seems like a terrible design choice.