I don't see why Tesla should do this when plenty of other safety-critical software (cruise control, autopilot on actual planes, etc.) isn't publicly reviewed.

I also think public review of this kind of code is likely to generate a whole lot more noise and PR bullshit than actual actionable feedback.
While I love the trailblazing Tesla has done for EVs and smart cars, it's moments like these that make me cringe.

Obviously, we don't know yet whether the driver was at fault. But I'm a capable coder with some exposure to control theory and microcontrollers, and I would be more than happy to audit Tesla's firmware.

More than that, I think there's a case to be made here for formal verification and high-assurance languages such as SPARK.

It would be nice if you could hand the Tesla codebase to a formal solver, ask it to prove that pressing the brake never accelerates the car, and then make the resulting code and proof public.
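SPARK itself is an Ada subset, but to make the idea concrete, here's a minimal sketch in Lean of what stating and machine-checking that kind of property could look like. The Inputs structure and torqueCommand function are hypothetical stand-ins for a pedal-to-torque stage, not anything from Tesla's firmware.

    -- Toy model only: a hypothetical pedal-to-torque stage, not Tesla's code.
    structure Inputs where
      brakePressed : Bool
      throttlePct  : Nat   -- commanded throttle, 0..100 percent

    -- Illustrative controller logic: the brake always overrides the throttle.
    def torqueCommand (i : Inputs) : Int :=
      if i.brakePressed then 0 else Int.ofNat i.throttlePct

    -- The property you'd ask the solver to discharge: whenever the brake
    -- is pressed, no positive torque is ever commanded.
    theorem brake_never_accelerates (i : Inputs)
        (h : i.brakePressed = true) : torqueCommand i = 0 := by
      simp [torqueCommand, h]

    -- Quick sanity check: brake down plus full throttle still yields zero torque.
    #eval torqueCommand { brakePressed := true, throttlePct := 100 }  -- 0

In SPARK the same idea would be expressed as a postcondition contract on the torque-arbitration routine, with GNATprove attempting to discharge it statically; the shape of the guarantee is the same, and that proof artifact is what I'd want published alongside the code.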