The post pushes the view that CrowdStrike's engineers should be held responsible. That's one way of looking at it.<p>But there is an entire chain of responsibility here. The hospital IT department that chose to use a computer instead of dumber technologies. The IT department that chose to run Windows. The security team that chose to purchase CrowdStrike's software, possibly without vetting them.<p>If a software license clearly states that there is no warranty, and the buyer buys it anyway, why shouldn't this be a caveat emptor situation? If they didn't like it, they could negotiate indemnity clauses, go to a competitor, or not use the software at all.<p>Don't get me wrong, I absolutely think that CrowdStrike did a shitty thing. But maybe they already disclosed that in their license agreement, and the purchasers decided to overlook it at their own peril. After all, running kernel-mode software is equivalent to handing over the keys to your computers. Maybe negotiating/selling software with liability clauses should be more normalized?
The phrase "Responsibility without authority" comes to mind. Who ultimately has the authority to set the target quality and direction of software projects? In my experience this is usually not the programmers...
These fields are compared to programmers:<p><pre><code> Anesthesiologist, Structural Engineer
</code></pre>
This is the example of programmer-caused harm:<p><pre><code> Yesterday a friend of mine was stuck in the Hospital all day. Their computer system went down and that led to a delay of care. Delays in care kill people.
</code></pre>
It seems like the author isn't considering the reality of the professionals they are comparing.<p>Coders at a CrowdStrike-sized company have layers of people-exercising-company-authority above them. Coders are employees who operate within the confines they are given.<p>Those confines get shaped by what filters down from shareholders/investors and executives.<p>Conversely, anesthesiologists and structural engineers are commonly deferred to by the people they interact with. Their expertise carries real weight. In many cases they have a degree of ownership in the biz getting paid.<p>source: 30y supporting the latter
Management is ultimately responsible. Licensed engineers are already held responsible, and we still get chronic institutional malfeasance: Boeing, Enron, Ford, Purdue, etc.
Sorry, wrong. These are the causes of the issues:<p>* Management and marketing insisting on having a product done by a specific date, no matter what.<p>* Testing always being an afterthought.<p>* Not enough hardware available to test all possible configurations.<p>Until companies spend real money on development and testing instead of cutting corners, this will happen again and again.
Programmers are seldom the ones driving the conditions that are the root cause for operational failures, but often the ones that bear the consequences.
Jailing the guard at the gate for not stopping an attack, eh? How about “corporate criminal liability extends to every employee at director and higher pay grades (management and otherwise)”?
Programmers have little to no power in development and release processes. Management has the power.<p>Also, it's the customers' fault for clustering blindly around CrowdStrike. There needs to be diversity and choice.<p>Moreover, if uBlock Origin can be an open source success, then an open alternative to CrowdStrike can in theory be one too.
“Be ruthless with systems and kind to people”<p>In my experience the software engineers that often get promoted crank out as many features as possible with almost no regard to the defect rate. If you do things at a more deliberate pace, you can be regarded by product or management as slow and unproductive. Real incentives exist in software development that push for quantity over quality.<p>This is not analogous to the failure of an anesthesiologist to properly sedate a patient. This is a process failure. Clearly the proper amount of QA was not in place for whatever reason and they need to re-examine their approach. Of course their process needs extra diligence given the cost of failure.<p>It would be a real shame if an individual is punished for this instead of examining the process and system of incentives that led to this failure.
The failure here was one of not properly testing their release, yet nobody seems to be saying that.<p>Even the CrowdStrike blog said it was doing a root cause analysis into how that poor logic was introduced.<p>That isn't the point; all software can have bugs. This is a failure of their QA process. It was clearly quite easy to trigger the problem.
I think the issue was testing. Many organizations don't do enough, or do only the bare minimum, which is already insufficient because it happens in some controlled lab that does not mimic the real world.
A lot of comments here are blaming management and executives, but bear in mind: who was the person who wrote the buggy code? Who was the person who pushed the release button? Sure, broken process, poor release engineering practices, whatever. And who, ultimately, implemented those release engineering processes?<p>The engineer who caused all this has surely already been fired, but that's just scratching the surface; I hope this is brought to court as criminal negligence.
What kind of nonsense are people in the comments suggesting? It shouldn't even be remotely considered.<p>WTF kind of BS is it to allow any semblance of programmers deserving consequences? If programmers received EXECUTIVE-level pay, then yes, they'd deserve the ramifications, but they do not. Executive pay is ridiculous, with already inflated salaries, ridiculous bonuses, and stock buybacks. Whoever gets paid more is the one that deserves the consequences.<p>That's the WHOLE GODDAMN REASON THEY ARE IN THAT POSITION.
Shit happens. What makes this one so interesting is the speed at which the shit tsunami spread. Try rolling out a little more slowly so less of the world gets affected.<p>For those affected, consider it a relatively painless test of your DR systems, and a chance to fix 'em up.
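The "roll out more slowly" idea is just a staged (canary) rollout. A minimal sketch, assuming hypothetical <i>deploy</i> and <i>healthy</i> hooks for your own fleet (none of these names come from CrowdStrike's actual tooling):<p><pre><code>
```python
import math

def staged_rollout(hosts, deploy, healthy, stages=(0.01, 0.10, 0.50, 1.0)):
    """Push an update to progressively larger fractions of the fleet.

    Halts at the first stage where any updated host reports unhealthy,
    so a bad update bricks ~1% of machines instead of all of them.
    Returns (hosts_updated, completed_ok).
    """
    done = 0
    for frac in stages:
        target = math.ceil(len(hosts) * frac)
        for h in hosts[done:target]:   # only the new slice for this stage
            deploy(h)
        done = target
        if not all(healthy(h) for h in hosts[:done]):
            return done, False          # halt: blast radius limited to `done`
    return done, True
```
</code></pre><p>With a 1% first stage, an update that instantly blue-screens its targets never reaches the other 99% of the fleet, which is exactly the difference between an incident and a global outage.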