Thursday, December 10, 2009

NATURE CANNOT BE FOOLED

Wretchard has a really excellent post on the AGW issue that discusses in some depth confirmation and bureaucratic bias. What particularly interested me was his and The Atlantic's Megan McArdle's use of the Challenger disaster to explain how such bias works.
Confirmation bias “is a tendency to search for or interpret information in a way that confirms one’s preconceptions, leading to statistical errors.” A similar, but subtly different, kind of problem affected the Space Shuttle program. Let’s call it ‘incentive bias’. NASA grossly underestimated the probability of a launch failure and set it at 1:100,000 because that’s what it was bureaucratically believed to be. What it bureaucratically had to be. Richard Feynman, who was asked to look into the causes of the disaster, knew this number could not possibly be right. But he also knew how powerful an influence a bureaucratic bias could be. There was a consensus among rocket scientists on how safe the vehicle was at launch. There was only one problem: it had to be wrong.
The first thing Feynman found while talking to people at NASA was a startling disconnect between engineers and management. Management claimed the probability of a launch failure was 1 in 100,000, but he knew this couldn’t be. He was, after all, a mathematical genius. Feynman estimated the probability of failure to be more like 1 in 100, and to test his theory, he asked a group of NASA engineers to write down on a piece of paper what they thought it was. The result: most engineers estimated the probability of failure to be very close to his own estimate.
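To get a feel for why the gap between those two numbers matters so much, here is a minimal back-of-the-envelope sketch (not from Feynman's report): assuming each launch is an independent trial, it compares the cumulative chance of at least one failure over roughly a hundred flights under management's estimate versus Feynman's. The function name and the flight count are illustrative assumptions only.

```python
# Illustrative sketch: cumulative risk of at least one failure over many launches,
# treating each launch as an independent trial (an assumption, not NASA's model).
def chance_of_failure(per_launch_risk: float, launches: int) -> float:
    """Probability of at least one failure in `launches` independent launches."""
    return 1.0 - (1.0 - per_launch_risk) ** launches

flights = 100  # roughly the scale of the Shuttle program, for illustration
print(f"Management (1 in 100,000): {chance_of_failure(1 / 100_000, flights):.2%}")  # ~0.10%
print(f"Feynman    (1 in 100):     {chance_of_failure(1 / 100, flights):.2%}")      # ~63.40%
```

Under the bureaucratic figure, a century of flights looks essentially risk-free; under Feynman's, losing a vehicle somewhere along the way becomes more likely than not.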

He was disturbed not only by management’s illusion of safety, but also by how they used these unrealistic estimates to convince a member of the public, teacher Christa McAuliffe, to join the crew, only for her to be killed along with the six other crew members.

Feynman dug deeper and discovered a history of corner-cutting and bad science on the part of management. Management not only misunderstood the science but, as engineers at Morton Thiokol tipped him off, ignored it, most importantly when warned about a possible problem with an o-ring.

Feynman discovered that on the space shuttle’s solid-fuel rocket boosters, an o-ring is used to prevent hot gas from escaping and damaging other parts. Engineers had raised concerns that the o-ring might not expand along with the rest of the hot booster parts, and so might fail to keep its seal, when outside temperatures fell below 32 degrees Fahrenheit. Because temperatures had never been that low, and there had never been a launch failure, management ignored the engineers. The temperature on launch day was below 32 degrees.

Feynman had his answer, he just had to prove it.

The perfect opportunity arrived when he was asked to testify before Congress on his findings. With television cameras rolling, Feynman innocently questioned a NASA manager about the o-ring temperature issue. As the manager insisted that the o-rings would function properly even in extreme cold, Feynman took an o-ring sample he had obtained out of a cup of ice water in front of him. He then removed the clamp that had been squeezing the o-ring flat. The o-ring remained flat, demonstrating that its resilience was in fact lost when the temperature dropped.

In his own report Feynman described the terrible and corrupting influence of incentives and expectation upon science and engineering. Even literal rocket science was not exempt from human pressure. Feynman ended his discussion of the Challenger disaster with an observation that eerily speaks to the subject of “consensus” in scientific matters. Consensus doesn’t matter. Only science and engineering do. “For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.”


I was there, right in the middle of all the events of Challenger. As a lowly flight surgeon (who also had a biomedical engineering degree) I watched and listened to the debate in launch control over the effects of the weather. Like many others in that room, I was a bit skeptical about the decision to launch, especially since we could see icicles on the SRBs and we had all driven to the Cape that early morning in temperatures below 20 degrees. Most of us knew that the o-rings had not been tested at temperatures below freezing. (As confirmation of this, hours after the explosion of the orbiter there were many discussions of the o-rings and the temperature issue as the most likely cause, so I know it was a subject on many people's minds.) Most of us had also heard about the Morton Thiokol engineers' reluctance to okay a "go" for launch, though unless you were in upper management you were not aware of the details.

What I remember most of all was my own sense of trust: trust that the mission managers knew what they were doing, and a calm acceptance of their decision to launch. My own thoughts at the time are still very clear to me: this was NASA, after all. The people here were the "best and brightest" (of course I included myself in this), and our scientific credentials would ensure that we would never ignore objective reality. Though I was young and foolish, I clearly understood that wishing and wanting something to be true did not make it so. I had faith that the system was relatively immune to psychosis (i.e., being out of touch with reality).

Needless to say, it was an extremely painful lesson that nature taught us that day, and I have never forgotten it. Of course, I internalized that lesson in a way that is not always consistent with being a psychiatrist, in that I learned you cannot take the "human" out of "human nature"; and that wishes and hopes are all very nice, but reality is not in the least interested in your wishes and hopes--or any of your feelings, for that matter.

Nature cannot be fooled; but human nature is predisposed to foolishness--and therefore likely to accept and tolerate all sorts of errors and fantasies for a variety of very human reasons--no matter what the tragic consequences might turn out to be.