When college professors have a bad day, their students don’t learn as much that particular day. When engineers have a bad day, many people can die and significant environmental harm can be done—consider the Volkswagen pollution control device scandal, the Deepwater Horizon fire, the Kansas City Hyatt walkway collapse, the Challenger space shuttle explosion, the Ford Pinto fires, and the General Motors ignition switch debacle.

Many of these disasters are rooted in ethical lapses, so it is well that several engineering ethics textbooks have been published recently, including Engineering Ethics: Contemporary & Enduring Debates (2020) by Deborah G. Johnson, Ethics for Engineers (2020) by Martin Peterson, and Ethics and Engineering: An Introduction (2021) by Behnam Taebi. Although each of these texts lacks a section on behavioral ethics, which we believe is a weakness, all three are substantively excellent. However, the most educational book on engineering ethics published recently is likely Peter Robison’s new book on the Boeing 737 MAX scandal: Flying Blind: The 737 MAX Tragedy and the Fall of Boeing (2021).

When we started reading this book, we were largely sympathetic to Boeing. Designing an airplane requires tens of thousands of decisions, most of which involve cost versus safety considerations. Just one wrong choice can cost lives and, in retrospect, appear unethical. However, by the time we finished reading the book, our sympathy had evaporated.

According to Robison, who tells a convincing story, Boeing was long one of the most admired companies in the world largely because it was run by engineers with an engineering point of view. As all three textbooks mentioned above note, the first of the six fundamental canons of the National Society of Professional Engineers code of ethics is that engineers shall: “Hold paramount the safety, health, and welfare of the public.” As Robison explains it, Boeing’s downfall began when it merged with McDonnell Douglas in 1997 and managerial power over the combined company shifted from the engineers who had run Boeing to the financial engineers who had already run McDonnell Douglas into the ground. Engineers were sidelined in numerous ways, and the new firm’s emphasis shifted decisively from passenger safety to shareholder value.

This change in focus had several unfortunate impacts, but none worse than its influence on the 737 MAX rollout. In 2010, Boeing needed a new plane to meet competition from a new version of Airbus’s popular A320. To avoid losing a huge chunk of market share overnight, Boeing promised airlines that it would produce a new version of its popular 737 outfitted with more powerful and fuel-efficient engines.

Rather than spend the $20 billion needed to produce an entirely new plane, Boeing budgeted just $2.5 billion to remake the 737, touting a “stingy with a purpose” approach. Robison notes that from 2010 to 2014, then-CEO Jim McNerney never once mentioned safety in the company’s annual proxy statements, even though design problems with the new Boeing 787 Dreamliner’s lithium-ion batteries endangered passenger safety in 2013.

Rather than refocus on passenger safety, McNerney doubled down on cost efficiency with a “more for less” emphasis. Managers told engineers that under no circumstances were they to make any changes that would require pilots familiar with the current 737 to undergo flight simulator training in order to fly the MAX. Boeing had promised Southwest Airlines that it would pay the carrier $1 million per plane if such expensive training were required.

The main design change was that the newer, bigger engines would be mounted to the front of the wings rather than beneath them. Wind tunnel testing disclosed a MAX tendency to pitch up in certain situations, creating the danger of an in-flight stall. A hardware fix would have been preferable but more expensive, so Boeing opted for a software fix, borrowing the Maneuvering Characteristics Augmentation System (MCAS) from a refueling tanker it was already selling. Boeing also pretended that this was not a new function but merely a tweak of the original 737 system, so that little new training would be required. Pilots were not adequately trained, not clearly informed of the design flaw, and not told how to respond when the MCAS system activated.

The MAX flight manual was “criminally insufficient” in warning pilots of the MAX’s pitching problem and in explaining how to handle an emergency when the MCAS software activated and repeatedly forced the plane’s nose down shortly after takeoff. As a result, two MAX planes crashed, killing 346 people.
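To make the design flaw concrete, here is a minimal, hypothetical sketch in Python, in the spirit of Gregory Travis’s analysis cited below. It is not Boeing’s actual code, which is not public, and every name and number in it is invented for illustration. It shows the two flaws described in public reporting: trusting a single angle-of-attack sensor, and reactivating without limit, so one stuck sensor can keep commanding the nose down over the pilots’ inputs.

# Hypothetical sketch of MCAS-style logic (illustrative only; Boeing's
# actual code is not public, and every name and number here is invented).

AOA_THRESHOLD_DEG = 15.0  # invented stall-risk threshold
TRIM_STEP_DEG = 2.5       # invented nose-down trim increment per activation

def mcas_step(aoa_sensor_deg, stabilizer_trim_deg):
    """One control cycle: add nose-down trim if AoA looks dangerously high.

    Flaw 1: reads only ONE angle-of-attack (AoA) sensor -- no cross-check
    against the second sensor, so a single failure is indistinguishable
    from a real stall.
    Flaw 2: no cumulative limit -- each cycle can add more nose-down trim.
    """
    if aoa_sensor_deg > AOA_THRESHOLD_DEG:
        stabilizer_trim_deg -= TRIM_STEP_DEG  # push the nose down
    return stabilizer_trim_deg

# A stuck sensor falsely reporting high AoA drives the trim ever lower:
trim = 0.0
for _ in range(5):
    trim = mcas_step(aoa_sensor_deg=22.0, stabilizer_trim_deg=trim)
print(trim)  # -12.5: five unchecked nose-down commands from one bad sensor

The software update Boeing eventually shipped reportedly added exactly these kinds of safeguards: cross-checking both sensors and limiting how much trim the system can command.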

Behavioral ethics principles can help explain this disaster.

Framing. How people frame the choices they face has everything to do with the decisions they make. After the McDonnell Douglas merger, Boeing’s leaders relentlessly focused on cost reduction, earnings, and stock price at the expense of safety. This focus repeatedly manifested itself in safety-endangering, low-cost choices during the design of the 737 MAX. Engineers were forbidden to make design changes that would cost too much money or require significant pilot retraining. Work was often outsourced to less qualified companies. Rather than spending the money that would have enhanced safety, Boeing plowed 80% of its free cash into stock buybacks, making tons of money for shareholders and executives.

Obedience to Authority. People tend to obey authority, especially when disobedience carries adverse consequences. Engineers who asked for more testing or safety improvements were repeatedly warned “very directly and [in] threatening ways” that their pay was at risk if cost-reduction and timing targets were not met. For years, Boeing CEOs, including McNerney, made a habit of emphasizing their power and their ability to make employees cower.

Tangible & Abstract. Several Boeing employees involved in designing and testing the 737 MAX stated that they would not fly in the MAX or allow their families to do so, yet they did not take active steps to stop it from being marketed. This seems to be a manifestation of the tangible and the abstract, whereby people’s decision making is more influenced by factors that are close in time and place (their own safety, their families’ safety, Boeing’s financial situation) than by remote, less tangible factors (the as-yet nameless and faceless passengers who might be injured in a future crash of the plane).

Self-serving Bias. People tend to make choices that advance their own self-interest. Because of the continual pressure from management to emphasize cost reduction and to toe the company line, engineers who should have blown the whistle on safety practices did not do so. They did not want to risk their jobs. Employees of the Federal Aviation Administration (FAA) similarly should have spoken out, but they were often rewarded not for advancing passenger safety, but for helping Boeing get its planes to market on schedule.

Conflicts of Interest. Boeing and other companies in the industry successfully lobbied Congress to pass laws that largely neutered the FAA as an oversight agency. Boeing was given “self-certification” authority and functionally graded its own exams. FAA employees had very little authority to overrule Boeing’s decisions. Boeing’s management did not hesitate to use the company’s substantial political influence to cow the FAA into submission. It also frequently threatened the FAA, snowed it under with huge numbers of documents, and used what it described as “Jedi mind tricks” to deceive the agency. Boeing had a duty to preserve passenger safety that conflicted with its duty to make money for shareholders. Too often, it chose the latter over the former.

Overconfidence Bias. People are subject to both an Overconfidence Bias and an Overoptimism Bias. The former involves undue faith in one’s own skills and judgment; Robison’s book discloses such overconfidence in virtually all of the Boeing CEOs profiled. The latter involves undue faith that things will work out for the best; Boeing’s conclusion that the 737 MAX was safe, even after the two crashes that killed 346 people, is a concrete illustration.

Engineers must frequently make decisions that involve trade-offs between cost and safety. But when they are embedded in an organizational culture that emphasizes profits over safety and punishes those who dissent, bad things often happen. In this case, 346 people died. As pilot and software developer Gregory Travis wrote: “Today, safety doesn’t come first—money comes first, and safety’s only utility in that regard is in helping to keep the money coming.” This prioritization is utterly inconsistent with the first canon of the engineers’ code of ethics. And it turns out to be costly as well. Robison reports that the direct cost of the 737 MAX scandal to Boeing in the form of compensation to customers, aircraft storage, pilot training, and settlements with passengers’ families was $21 billion. And if Boeing’s lapses deter customers from returning, losses could approach $65 billion.

Sources

Elaine Englehardt et al., “Leadership, Engineering and Ethical Clashes at Boeing,” Science and Engineering Ethics 27: __–__ (2021).

Joseph Herkert et al., “The Boeing 737 MAX: Lessons for Engineering Ethics,” Science and Engineering Ethics 26: 2957-2974 (2020).

Deborah G. Johnson, Engineering Ethics: Contemporary & Enduring Debates (2020).

Bess Levin, “Boeing Trained 737 Max Pilots on iPads to Save Cash,” Vanity Fair, March 18, 2019.

Martin Peterson, Ethics for Engineers (2020).

Peter Robison, Flying Blind: The 737 MAX Tragedy and the Fall of Boeing (2021).

Chesley (“Sully”) Sullenberger, “Letter to the Editor,” New York Times, Oct. 12, 2019.

Behnam Taebi, Ethics and Engineering: An Introduction (2021).

Gregory Travis, “How the Boeing 737 MAX Disaster Looks to a Software Developer,” IEEE Spectrum, April 18, 2019, at https://spectrum.ieee.org/how-the-boeing-737-max-disaster-looks-to-a-software-developer.

Jerry Useem, “The Long-Forgotten Flight that Sent Boeing Off Course,” The Atlantic, Nov. 20, 2019.

Videos

Conflicts of Interest: https://ethicsunwrapped.utexas.edu/video/conflict-of-interest.

Framing: https://ethicsunwrapped.utexas.edu/video/framing.

Obedience to Authority: https://ethicsunwrapped.utexas.edu/video/obedience-to-authority.

Overconfidence Bias: https://ethicsunwrapped.utexas.edu/video/overconfidence-bias.

Self-Serving Bias: https://ethicsunwrapped.utexas.edu/video/self-serving-bias.

Tangible & Abstract: https://ethicsunwrapped.utexas.edu/video/tangible-abstract.