The U.S. space shuttle program, according to NASA’s website, was a marvel:

NASA’s space shuttle fleet began setting records with its first launch on April 12, 1981 and continued to set high marks of achievement and endurance through 30 years of missions. Starting with Columbia and continuing with Challenger, Discovery, Atlantis and Endeavour, the spacecraft has carried people into orbit repeatedly, launched, recovered and repaired satellites, conducted cutting-edge research and built the largest structure in space, the International Space Station. The final space shuttle mission, STS-135, ended July 21, 2011 when Atlantis rolled to a stop at its home port, NASA’s Kennedy Space Center in Florida.

NASA is not wrong: the shuttle program accomplished astounding feats of engineering, science, and adventure. Unfortunately, the program is best remembered not for its breathtaking successes, but for the loss of the Challenger and its crew of seven on January 28, 1986. In his new book, Challenger: A True Story of Heroism and Disaster on the Edge of Space (2024), Adam Higginbotham tells a detailed and fascinating story of how and why the shuttle and its crew were lost.

It is impossible in a blog post to do justice to a book as detailed and exhaustively researched as this one. But for our purposes, the key lesson is that when huge organizations make thousands and perhaps millions of decisions over a thirty-year period, some may be botched. And some of those managerial and engineering decisions inevitably will carry moral ramifications. We should analyze these decisions, so that we can learn from them.

It is important to remember that what these astronauts, engineers, government contractors, federal bureaucrats and others were attempting had never been done before. The task at hand was as technically challenging as anything mankind had ever attempted. And throughout, the program was plagued by inconsistent government and public support as well as constant budget pressure. That said, NASA always promised that the safety of the astronauts would be its top priority, and the first canon of the National Society of Professional Engineers code of ethics is that engineers shall “Hold paramount the safety, health, and welfare of the public.” And yet, the Challenger blew up.

As you probably know, the precipitating cause of the Challenger explosion was a flaw in the design of the O-rings on the solid rocket boosters (SRBs) designed by Morton Thiokol (Thiokol). The O-rings were rubber seals designed to prevent hot gases from leaking through the joints between the segments of the shuttles’ SRBs. If the O-rings failed, a jet of fiery gas could escape from an SRB and trigger a catastrophic explosion of the shuttle’s huge external fuel tank.

Challenger’s fatal January 28, 1986 flight was the program’s 21st operational flight, after four test flights. While things generally seemed to be going well, many in the know were deeply concerned. The fifth operational flight occurred on November 8, 1983. After the Columbia’s SRBs were retrieved from the ocean and inspected, Thiokol inspectors saw that one of the O-rings had been badly damaged (burned or vaporized) during the flight. They were alarmed and reported it to the Marshall Space Flight Center in Huntsville, Alabama (Marshall), but no action was taken. After all, Thiokol engineers had designed a primary O-ring plus a secondary O-ring in case the first one leaked, and had added flameproof asbestos-filled putty to the joints.

Although the O-ring idea had been borrowed from the very successful Titan booster rockets that preceded these SRBs, the numerous changes just mentioned essentially created a new, untested design. In 1978 and 1979, technical specialists at Marshall argued that the field joints between the four propellant sections of the motor did not perform as expected and should be modified as soon as possible “to prevent hot gas leaks and resulting catastrophic failure.” The Marshall engineers visited the plant of the Parker Seal Company, which made the O-rings, and its managers were dismayed to learn that Thiokol’s tests had demonstrated that the seals did not work as intended. Yet this concern was lost in the ether of the bureaucracy.

When a NASA certification committee was preparing to approve the first test flights in 1981, members expressed concerns about leaking O-rings and recommended a thorough program of further tests, but Marshall and Thiokol managers talked them out of it. The program’s successes, Higginbotham reports, led many at NASA “to think of [themselves] and the agency around [them] as almost infallible.” (p. 215) (See our video on the Overconfidence Bias)

Once the test flights ended and the true operational flights began, it was clear that the O-ring problem had not been solved by the Thiokol engineers’ minor fixes. Post-flight examination of the April 4, 1983 launch of the Challenger showed new O-ring problems, and engineers calculated that if the rocket had burned for just eight or nine seconds longer, it would have exploded, killing everyone aboard. (p. 175)

That was the second of the operational launches. Flights 6 and 8 also had leaks. The engineers did not know what was going on and could not solve the problem. Yet the first three rockets with leaks had not exploded, so they maintained a “go” mindset and approved further launches, though they planned further tests and more minor fixes. These engineers seemed to be in the grip of the optimism bias, which Tali Sharot defines as “the inclination to overestimate the likelihood of encountering positive events in the future and to underestimate the likelihood of experiencing negative events.” In other words: “The last rocket didn’t blow up, despite our fears. Probably the next one won’t either.”

Worse, NASA’s managers’ judgment was warped by incrementalism, the slippery slope that can lead people to walk off a cliff one step at a time. (See our video on Incrementalism.) As Higginbotham writes when describing a meeting between Thiokol engineers and their counterparts at Marshall:

…beneath these crisp certainties, obscured amid the blizzard of charts, data-filled binders, and Viewgraph slides, the rocket engineers failed to realize that they had reached a critical inflection point. Over the course of the years they had been developing and flying the solid rocket motors, the men at Thiokol and Marshall had slowly expanded the parameters of what they regarded as acceptable risk in the joints. Incrementally, they had begun to accept as normal problems that deviated dangerously from the original design standards set for the boosters—and the seals that constrained the seething power of their volatile propellant in flight. (p. 208)

NASA’s and Thiokol’s commitment to human safety slowly eroded as they became comfortable taking a little more risk, then a little more, and then more still. Had the managers making these decisions been in the crew compartment, they might have made different choices. The astronauts themselves were never informed of the O-ring problems.

When the Discovery launched on January 24, 1985, Florida temperatures were at record-breaking lows. When Thiokol engineer Roger Boisjoly examined the boosters after they’d been recovered from the ocean, he found such evidence of hot gas leakage that “he was astonished that Discovery hadn’t been blown to pieces on the launchpad.” (p. 253)

At meetings with Thiokol and Marshall engineers, Boisjoly spread the word about the danger and stated his opinion that the cold temperature was a significant factor. Because this theory had potentially momentous consequences for NASA’s year-round launch schedule, he acknowledged that it would not be well received. And it wasn’t.

But others, including Allan McDonald, Thiokol’s representative at Marshall, agreed with Boisjoly’s theory that cold weather could make the rubbery material composing the O-rings less elastic and thereby exacerbate whatever problems they had in sealing effectively. Boisjoly had experiments performed at Thiokol’s lab in Utah that seemed to confirm his suspicions, but his manager ordered him to keep the results quiet.

When the post-flight inspection of the SRBs from the April 29, 1985 Challenger mission showed the worst seal damage yet and that the astronauts “had been as little as three-tenths of a second away from an explosion in the solid rocket booster that would have torn apart the orbiter and killed everyone on board,” (p. 258) Boisjoly lost confidence in the O-ring system. He worked constantly to find solutions, but received little support from Thiokol. In July of 1985, he wrote an internal memo to his supervisor, Bob Lund, and other Thiokol engineers that ended: “It is my honest and very real fear that if we do not take immediate action…we stand in jeopardy of losing a flight along with all the launch pad facilities.” (p. 262)

Because NASA’s managers did not wish to show vulnerability and thereby lose public and political support, and Thiokol did not wish to lose its lucrative NASA contracts, launches continued. (See our video on the self-serving bias.)

The nation was paying particular attention to the planned late-January 1986 launch of Challenger because the crew would include public school teacher Christa McAuliffe, who’d been chosen for the “Teacher in Space” program designed to bring a bit of positive PR to NASA. But technical problems delayed the launch by a few days, and another cold front stormed into Florida, so that launch-day temperatures were expected to be perhaps 20 degrees colder than they had been for Discovery’s launch a year earlier. Boisjoly gathered his engineering colleagues together. They pored over all the data, went to their Thiokol supervisor, Bob Lund, and told him it would be crazy to launch under these conditions.

Allan McDonald, Thiokol’s man at Marshall, was convinced. He told Lund to continue studying the situation, but insisted, “This has to be an engineering decision, not a program management decision.” (p. 315) In other words, the focus should remain on safety! As Thiokol’s team continued to study the situation from an engineering perspective while the countdown continued, the unanimous recommendation of all fourteen managers and engineers remained: Do not launch. (p. 320)

Thiokol’s engineers made their case at the Flight Readiness Review remotely from Utah. When they were done, Thiokol executive Joe Kilminster stood firmly by the “no launch” recommendation. But Larry Mulloy, NASA’s manager of the solid rocket booster project at Marshall, pushed back forcefully. In so doing, he reframed the debate. (See our video on framing.) As Higginbotham explains:

Although the Thiokol engineers did not fully realize it at the time, Mulloy’s rebuttal marked a subtle shift in the tone and expectations in the meeting—a change that made it different from all previous Flight Readiness Reviews. In the past, if a contractor’s data about the state of flight hardware had been inconclusive, the default position was not to fly: they were expected to prove that their equipment and components constituted an acceptable risk before launch. Now, it seemed, Mulloy was asking them to prove the opposite—to show him the data that proved conclusively it was not safe to launch. (p. 326)

With NASA officials at Marshall plainly displeased with the Thiokol recommendation, Thiokol’s McDonald “knew that his bosses at Thiokol would be in a bind if they hindered NASA’s flight schedule at such a vulnerable time for the company. Already behind schedule in production, they further risked jeopardizing their monopoly on their billion-dollar solid rocket contract.” (p. 328) Still, McDonald was confident that they would stick with the no-launch decision.

However, when Mulloy finished, Jerry Mason, a Thiokol vice president and the highest-ranking executive in the meeting, asked NASA for a five-minute break. He then addressed his group in Utah, saying, “We’ve got to make a management decision.” (p. 328) The ten engineers in the room stood firmly against the launch. But Mason quickly received the assent of two other vice presidents (Wiggins and Kilminster) and then put pressure on Bob Lund: “Now, Bob, you take off your engineering hat and put on your management hat. We’ve done all we can from an engineering point, and now we’ve got to make a tough decision. And as a manager, you’ve got to do that.” (pp. 329-330)

In his role as a manager, Lund saw his obligation as helping Thiokol make money. The other managers (Mason, Wiggins, and Kilminster), all of whom had probably been practicing engineers at some point, changed their positions and supported proceeding with the launch. Everyone in the room who was still a practicing engineer, and who saw their primary role as advancing the cause of safety, stuck with the original recommendation: do not launch. (See our video on Role Morality.)

Lund adopted the role of manager and reframed the issue as one involving dollars and cents rather than safety. Both he and Mulloy’s underlings at Marshall showed themselves to be susceptible to the phenomenon of obedience to authority. (See our video on Obedience to Authority.)

The launch went forward and seven astronauts died. Higginbotham summarizes:

An organization that had, since its inception, boasted of its ability to manage extraordinary risk on the frontiers of technology and learn from its mistakes had instead overlooked a litany of clear warnings; the signals lost in the noise of a complacent can-do culture bred by repeatedly achieving the apparently impossible. Seduced by their own mythos, and blind to the subtleties of engineering complexity that none of them fully understood, the nation’s smartest minds had unwittingly sent seven men and women to their deaths. (p. 409)

 

Sources:

Adam Higginbotham, Challenger: A True Story of Heroism and Disaster on the Edge of Space (2024).

Allan McDonald (with James Hansen), Truth, Lies, and O-Rings: Inside the Space Shuttle Challenger Disaster (2009).

NASA, History: The Space Shuttle, at https://www.nasa.gov/space-shuttle/.

Martin Peterson, Ethics for Engineers (2020).

Tali Sharot, The Optimism Bias: A Tour of the Irrationally Positive Brain (2011).

Behnam Taebi, Ethics and Engineering: An Introduction (2021).

 

Videos:

Framing: https://ethicsunwrapped.utexas.edu/video/framing.

Incrementalism: https://ethicsunwrapped.utexas.edu/video/incrementalism.

Obedience to Authority: https://ethicsunwrapped.utexas.edu/video/obedience-to-authority.

Overconfidence Bias: https://ethicsunwrapped.utexas.edu/video/overconfidence-bias.

Role Morality: https://ethicsunwrapped.utexas.edu/video/role-morality.

Self-serving Bias: https://ethicsunwrapped.utexas.edu/video/self-serving-bias.