Long-time member of BSAC here. The classic example I use when describing this to students is torch batteries. If you're at home and you notice that your dive light's batteries are flat, no biggie, just swap them out. If you notice while on the boat, that's a bit more annoying, but chances are someone will have a spare you can borrow, and if they don't, you'll have to miss the dive, which is unfortunate, but you're not in any danger. If you're in the water, you can still call the dive (abort and return to the boat). But if you're at 50m inside a wreck, your buddy has just swum round a corner taking their light with them, and you switch yours on and nothing happens, then you have the rest of your life to try and find the way out.

Same, seemingly trivial, failure, but very different consequences depending on the point of the dive at which you notice it.

By the way, I mainly dive with GUE now, who mandate a primary light and two backups as part of standard kit!
Related is William Gibson's idea of "the Jackpot", or a "multicausal apocalypse", where a large catastrophe is caused not by a single major factor but by several smaller ones accumulating and interacting over time. Gibson talks about it in https://vimeo.com/116132074.

If this is the kind of thing you find interesting, you should also read How Complex Systems Fail: http://web.mit.edu/2.75/resources/random/How%20Complex%20Systems%20Fail.pdf
There is a specialized application of this concept that is sometimes used in airway management during anesthesia (specifically endotracheal intubation). It is referred to as the "vortex" approach (i.e. you don't want to get pulled into the vortex, as the longer you spend there, the harder it is to get out).

There is a well-produced reenactment of an anesthesia team falling victim to the "vortex" (resulting in fatal injury to their patient): https://vimeo.com/103516601

In fire and EMS we generally refer to the concept of an "accident chain". In any event where rescue personnel are injured or killed, there is a chain of events that had to take place leading up to that accident. Breaking the chain at any point would prevent the accident from occurring, and many of our procedures are built around the idea of breaking accident chains as early as possible. This is a concept that (as far as I know) originated in the aviation industry.

https://en.wikipedia.org/wiki/Chain_of_events_(aeronautics)

It's the same basic idea, though: the further along the chain you allow the event to progress (even if you don't know the end result), the less margin for error you have.
I think this accurately describes American history for the past 15 years and explains why establishment hate is so mainstream.

Iraq war, housing bubble, housing collapse, bailouts, rise of ISIS, student debt explosion, ...

Eventually there is a point where confidence is lost.
For years I've used the term "Gravity Well of Fail" (and I'm most likely not the first to coin it) to describe situations that become increasingly perilous over time due to badly chosen decision paths. I didn't actually know the term "Incident Pit", though perhaps I vaguely remember it, or something similar, from a couple of pals who are scuba divers.
This Quanta article on research into predicting disease outcomes strikes me as related:

https://www.quantamagazine.org/20160830-who-will-die-from-infection/