The real problem here is that Tesla is operating under the assumption that people obey the warnings it gives them and behave rationally around their luxury cars. They do not. In a perfectly rational world, people would read the warnings, understand the risks, and thus not "expect" the Tesla to detect things it explicitly warns it cannot detect.<p>The warnings say you need direct line of sight, that the system isn't perfect, and that it may not detect all obstacles, even ones you'd expect to find in a parking lot. Those warnings all make sense, but the people recording the videos don't care. They just press the button and act shocked when it doesn't work.<p>Ultimately, regulators will step in if people keep getting into crashes despite the warnings. If the warnings don't stop people from doing stupid things, regulators will require more warnings or kill the feature. Unfortunately for Tesla, the regulatory focus is public safety (and the underlying statistics). Once the tables turn, it no longer matters how safe the feature is when used correctly; what matters is how safe it is when used incorrectly.<p>That will only hurt Tesla in the long run, and it's a shame, because it will make true self-driving cars that much harder to bring to market.