"The argument that the same risk was flown before without failure is often accepted as an argument for the safety of accepting it again. Because of this, obvious weaknesses are accepted again and again, sometimes without a sufficiently serious attempt to remedy them, or to delay a flight because of their continued presence." {R. P. Feynman, "Personal Observations on the Reliability of the Shuttle"}
Feynman's supplement to the report on the Challenger disaster was featured on Longform.org today, and while I found some of the more technical bits to be dry reading (and others of those technical bits to be extremely interesting), one of the things that most stood out to me was Feynman's commentary on the intentional self-blindness and optimism about known potential defects in the shuttle's components.
In the rush and enthusiasm of the early years of space exploration, it must have been so easy to allow safety concerns--the non-urgent ones, anyway--to fall by the wayside. Another article I came across noted that NASA initially tried everything to obscure the fact that the crew were probably alive for the entire two-and-a-half-minute descent to the ocean below (and has, in subsequent years, done little to alter the prevailing impression of instant death), which raises the issue not only of official secrecy but also of NASA's failure to provide for the case of a survivable catastrophe. Even if they were alive after the explosion, they had no parachutes, no life jackets, just the absolute certainty of death to keep them company for the longest final 150 seconds of their lives.
Certainly, hindsight supplies better insight into possible contingencies than even the best futurist can hope for. But Feynman's point here seems to be that there was not only a lack of foresight--there was also a lack of hindsight. The past was there to be learned from, complete with the evidence of the strains of various forces on the components of the shuttle and the rocket that failed. That those components had not thus far failed is not relevant, he argues. They ought to have been given more attention and more testing before the somber and sacred burden of human lives was entrusted to them, regardless of whether they had survived the trip before.
I'm not really equipped to criticize NASA's actions in any substantive way, since my entire knowledge of the issue arises solely from the two articles I read today. In a more general way, though, I wonder if it's not just a problem for NASA engineers and management. There are so many ways to avoid confronting personal weaknesses, even when we are keenly aware of them. There is so much else to be done, after all. So much life to be lived, so much future to be shaped and pushed toward.
In the short run, we might be able to get somewhere on pure energy and drive. But it's like all the leaders in those books about integrity that I read in high school: you can only get so far before the pressure makes the tiny cracks bigger. Failure to respond to and correct your behaviors when it "doesn't really matter" may lead to enormous problems down the road. And even if they don't, even if you manage to walk a straight line, you'll only be able to manage it by limping with both legs. Which hardly sounds pleasant. Or feasible. But that's a question for another day.