> When Rajkumar has raised concerns with those Tesla employees about autopilot's technical limitations, the response is they have to "wash their hands of it" because "it's a business decision."<p>I've wondered about Tesla's engineering management before, and this supports the suspicion that it is inadequate. Engineers cannot "wash their hands" of safety-critical projects. This is potentially a serious ethical lapse: the NSPE Code of Ethics holds the health, safety, and welfare of the public paramount:<p><pre><code> If engineers' judgment is overruled under circumstances
that endanger life or property, they shall notify their
employer or client and such other authority as may be
appropriate.[0]
</code></pre>
> In another anecdote recounted by two sources, Musk was told that the sensors used for Tesla's self-parking feature might have difficulty recognizing something as small as a cat. Musk is said to have responded that given how slow the car moves in this parking mode, it would only be dangerous to "a comatose cat."<p>Outrageous: a baby can be the size of, and move at the speed of, a "comatose cat". Hysterical example aside, even a comatose cat demands no less consideration in a risk analysis. Handwaving that the feature poses no safety concern, rather than submitting it to the rigors of risk management, is an absolutely improper response.<p>[0] <a href="https://www.nspe.org/resources/ethics/code-ethics" rel="nofollow">https://www.nspe.org/resources/ethics/code-ethics</a>
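<p>The risk-management point can be made concrete. Below is a minimal sketch of a conventional severity-by-likelihood hazard matrix (in the spirit of MIL-STD-882-style hazard analysis); the category names and thresholds are illustrative assumptions, not any real company's process. The point it encodes: low speed may reduce a hazard's severity, but the hazard still has to enter the matrix rather than be waved off.<p><pre><code>
```python
# Illustrative hazard matrix: categories and thresholds are assumptions
# for the sake of example, not any actual standard's exact values.
SEVERITY = {"negligible": 1, "marginal": 2, "critical": 3, "catastrophic": 4}
LIKELIHOOD = {"improbable": 1, "remote": 2, "occasional": 3, "probable": 4}

def risk_level(severity: str, likelihood: str) -> str:
    """Classify a hazard by severity x likelihood score."""
    score = SEVERITY[severity] * LIKELIHOOD[likelihood]
    if score >= 9:
        return "unacceptable"   # must be eliminated or redesigned out
    if score >= 4:
        return "mitigate"       # needs a documented mitigation
    return "acceptable"

# A small obstacle the sensor may miss at parking speed: the low speed
# lowers severity, but the hazard still lands in the matrix.
print(risk_level("critical", "remote"))       # -> mitigate
# The same sensor miss, but the obstacle is a child:
print(risk_level("catastrophic", "occasional"))  # -> unacceptable
```
</code></pre><p>Even under a generous reading of "slow parking mode", neither case scores as acceptable, which is exactly why "only dangerous to a comatose cat" is not a risk assessment.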