This reminds me very much of Sidney Dekker's work, particularly The Field Guide to Understanding Human Error and Drift Into Failure.<p>The former focuses on evaluating the system as a whole: identifying the state of mind of the participants in an accident and working out what led them to believe they were making the correct decisions, with the understanding that nobody wants to crash a plane.<p>The latter book talks more about how multiple seemingly independent changes to complex, loosely coupled systems can introduce gaps in safety coverage that aren't immediately obvious, and how those gaps could be avoided.<p>I think the CAST approach looks appealing. It does seem to require a lot of analysis of failures and near-misses to be best utilized, and the hardest part of implementing it will undoubtedly be the people, who often take the "there wasn't a failure, why should we spend time and energy investigating a success?" mindset.