Perhaps the most favorable indication of the culture of commercial air flight is that the attitude exhibited by this author has been exceedingly rare.<p>First of all, I haven't read or seen any speculation that wasn't qualified as mere speculation. Second, this is the first instance I've read of someone minimizing the accident <i>because</i> of the excellent safety record.<p>Is it too obvious to say that the safety record comes from not minimizing accidents as flukes? The rigorous, tedious checklists, innovations like removing oxygen from near-empty fuel tanks so they're less likely to explode, and flight-attendant training for quick exits all come from experience and the enormous amount of attention directed at every past failure.<p>Such advances come from a combination of the expertise of people with domain knowledge <i>and</i> people outside the domain holding high standards. In 1985, described as an example of a terrible year for air travel, one could still have put the safety record in perspective against other hazards. Surely it was far more dangerous to travel by car. It is easy to imagine pilots and engineers of 1985 wishing that outsiders who knew less than they did would just shut up and let them do their jobs without unnecessary distractions. It's even easy to imagine arguments that the meddling was not merely costly but a dangerous distraction.<p>Some hysteria, and definitely a lot of lawsuits, helped prioritize problems for the people who had the expertise to devise solutions that would minimize fatalities in the future. Outside, unwelcome pressure also disrupts the status quo, which, by definition, is part of every failure.<p>I imagine anyone with domain-specific expertise has at some time been incredibly annoyed to have to deal with and attempt to manage the reactions of nonexperts while in the midst of solving the problem immediately at hand.
And yet such a reaction is cowardly, and enormously unproductive in the long term.<p>I'm not sure whether I'm beating a dead horse over an obvious point or not making the point well at all, but I think it is really important to avoid the pitfall of only talking to experts, or peers, or people within your established hierarchy of influence when attempting to address failures. This article made me recall students of nuclear physics on Reddit talking about Fukushima, who couldn't decide whether it was irrelevant because it involved an older reactor design, or irrelevant because it followed an earthquake and tsunami more severe than the models had forecast as possible, or whether nuclear disasters in general are irrelevant because their understanding of the worst case was the not-that-bad-all-things-considered outcome at Chernobyl, which was achieved only through the heroics of the people there.<p>What made the point for me personally was reading about failures at NASA. I highly recommend the long "Columbia's Last Flight" by William Langewiesche in <i>The Atlantic</i>[1]; it had a great influence on me when it came out in 2003, and helped me learn to value input from people without specialty knowledge. Even when special expertise determines who is most likely to develop fixes, the big-picture view sometimes afforded only to outsiders is important, or sometimes just enough disruption so that different voices within the system are heard.<p>In the case of NASA, things hummed along for the entire edifice because squelching concerns raised by small groups within the enormous operation had worked before. As a result, when it failed, it was the system itself that failed, and it would have been reckless to trust that system to determine what went wrong.
Perhaps even more famously, with Challenger, Richard Feynman was the gadfly in that investigation, and later had plenty to say about the ability of entire cultures to encourage mistakes specifically as a result of their desire to build in-group consensus.<p>So I think the air travel industry deserves an enormous amount of respect for its excellent safety record, and the shortage of voices trying to minimize this failure suggests that it will accurately determine what went wrong. The value of this column is as an example to the rest of us, outside the air travel industry, of what not to do, and of what attitudes stand in the way of improvement and finding solutions.<p>[1] <a href="http://www.theatlantic.com/magazine/archive/2003/11/columbias-last-flight/304204/?single_page=true" rel="nofollow">http://www.theatlantic.com/magazine/archive/2003/11/columbia...</a>