The right question to ask about a space shuttle failure — or any other horrendously complex catastrophe-in-the-making — is not "What should you do if (or when) you know something has just gone wrong?" That's innately deceptive in its Manichean good-bad world view.
No, the actual challenge is to answer "What should you do when the probability of disaster has just gone from 2% to 20%?" Or even more honestly: "What should you do when your estimate of the probability of disaster has just gone from between 0.5% and 5% to between 5% and 50%?"
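The shift described above can be made concrete with a toy expected-cost calculation. Everything here is a hypothetical illustration, not part of the original post: the costs are made-up numbers, and the point is only that moving the estimated probability range from 0.5%–5% to 5%–50% can flip the decision about whether to act.

```python
# Illustrative sketch with hypothetical numbers: compare the expected
# cost of acting now (pay a fixed price, avert the disaster) versus
# waiting (risk the full disaster cost), when the disaster probability
# is only known to lie within an interval.

def expected_cost(p_disaster, act_now, cost_action=1.0, cost_disaster=100.0):
    """Expected cost: a fixed action cost if we act, otherwise
    the disaster cost weighted by its probability."""
    return cost_action if act_now else p_disaster * cost_disaster

# The two probability ranges from the text: before and after the warning.
for p_low, p_high in [(0.005, 0.05), (0.05, 0.50)]:
    wait_range = (expected_cost(p_low, act_now=False),
                  expected_cost(p_high, act_now=False))
    act = expected_cost(p_high, act_now=True)
    print(f"p in [{p_low:.1%}, {p_high:.1%}]: "
          f"waiting costs {wait_range[0]:.1f}-{wait_range[1]:.1f}, "
          f"acting costs {act:.1f}")
```

In the first range, acting may or may not beat waiting (the intervals overlap); in the second, acting dominates across the whole interval. That is the practical force of putting error bars on the odds: the decision can be clear even when the probability itself is not.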
These questions — full of caveats and impossible to turn into bumper stickers — are the real quandaries that need to be addressed. One (almost) never can know that there's a problem, with certainty, until no time is left to act.
The same applies, in trumps, to geopolitical train wrecks. Is Oceania about to attack us? If so, will Eurasia come to our aid, or join in the onslaught, or simply stand by laughing while we combatants destroy each other? There's literally no way to tell until it's too late to do any good. The best that anyone can do is to make informed guesses as to the odds of various events — and then put error bars on those odds.
False precision is no friend of the military planner, or the civilian policymaker. Early action is essential in a crisis — but constant panic is costly and counterproductive. There are no easy answers, regardless of what op-ed pundits claim ...
(see also InStability (20 Aug 1999), PredictingVersusUnderstanding (27 Aug 1999), NoGrandDesigners (13 Jan 2000), EpistemologicalEnginerooms (10 Aug 2000), ThermodynamicsOfTerrorism (15 Jan 2002), OpaqueJustice (29 Jan 2002), RetrospectiveHistory (7 Mar 2003), ...)
TopicThinking - TopicScience - 2003-03-12
(correlates: CreepingConfidence, RetrospectiveHistory, ObliviousAce, ...)