From yesterday: https://news.ycombinator.com/item?id=19536375
While I'm hugely skeptical of the current state of self-driving cars, you could probably get human drivers to make the same mistake if you repainted the lines. However, humans would also notice the oncoming cars (if there were any) and avoid getting into a head-on collision.

The thing missing from this test is that critical practical piece: if there were an oncoming car, would the Tesla do something to avoid the collision? I would assume that not getting into a head-on crash is a higher priority than staying within the lane markings.

Without oncoming traffic, all this tests is what the Tesla considers valid lane markings. I'm sure there's room for improvement here (such as checking where the other lane is, or raising the bar for how well-defined the lines have to be), but those changes also involve trade-offs where legitimate situations that currently work will stop working.

I think you could just as easily title this video "Tesla autopilot follows road markings even if they're really bad".

Edit: The best shot I could get from the video [1] makes me even more upset at this test: these look like the temporary markings often used during construction, just before the normal lines get painted by the big line-painting truck. There aren't even regular lane lines after this. I wouldn't even be surprised if this is something Tesla specifically trained the software to handle.

[1] https://i.imgur.com/aLbhnzQ.jpg
No shit. What about when you erase all the lines on the road? What about when you remove the road? What's next, "Luxury Tesla defeated by a cheap piece of duct tape over the main sensors"? "We removed a wheel from a Tesla and it crashed into a wall!"? These tests are so contrived, honestly.