In a recent New York Times column, Floyd Norris noted in detail the obvious similarities between the downfall of Arthur Andersen during the Enron debacle and the recent troubles of Standard & Poor’s and other credit rating agencies (CRAs).
Arthur Andersen was in an inherent conflict-of-interest situation. Like all auditors, it was paid by its client, which creates obvious difficulties: how does one truly act as a watchdog or gatekeeper over an entity that pays the freight? The problem was hugely exacerbated by the fact that Andersen also sold massive amounts of consulting services to Enron and other audit clients, making it doubly and triply important to keep those clients happy. Although Andersen had a professional duty to police Enron and other audit clients, casting a skeptical eye on their books and records, it actually sold its services by promising to act as a “partner” that would help clients meet their business goals.
The CRA parallels are striking. S&P was also paid by clients that it wished to keep happy. S&P also sold consulting services to clients to teach them how to structure instruments so as to obtain the rating that they would then pay S&P to give them.
There was obviously some conscious greed involved in both Andersen’s downfall and S&P’s missteps. Enron was Andersen’s largest client, and Andersen was heading toward earning $100 million a year in fees from Enron. S&P and other CRAs seemed intensely focused on expanding the record earnings they posted during the run-up to the subprime debacle that segued into the Great Recession.
But many of the errors in judgment by both Arthur Andersen and S&P can likely be explained in large part by the teachings of behavioral ethics. First, the sharp conflicts of interest that both Andersen and S&P faced prevented Andersen from making the hard calls when auditing Enron and made it difficult for S&P to stick to its standards when rating mortgage-backed securities. The evidence from behavioral research, as reported by Ariely and others, is that the impact conflicts of interest have on others’ judgment is obvious to us, but we tend to be oblivious to the impact they have on our own judgment. In one study, 61 percent of doctors believed that the swag and perks that pharma companies lavish upon doctors affected the prescribing behavior of other physicians, but only 16 percent thought those same benefits affected their own behavior.
When people have much to gain from reaching a certain conclusion, it is very hard for them not to reach that conclusion. Often the self-serving bias causes the ethical issue to fade away as the mind focuses on the benefits to be gained from reaching the conclusion (“It’s ok for Enron to book this transaction as income.” “It’s ok to rate this instrument as AAA.”). Related is the notion of motivated blindness which, Bazerman and Tenbrunsel point out, can cause us to fail to see that others are acting unethically when it suits our purposes. Thus lawyers may fail to see that their clients are guilty, auditors may fail to see that their clients are fudging the numbers, and credit rating agencies may fail to see that the instruments they are rating are ticking time bombs, especially when seeing it would cut off a stream of revenue. Upton Sinclair once said something along the lines of: “It is difficult to get a man to understand something when his salary depends upon his not understanding it.”
Furthermore, as Norris pointed out, Arthur Andersen probably never imagined that Enron could collapse as quickly as it did. And S&P likely did not suspect that cutting corners would carry such severe consequences. But, as David Brooks has pointed out, the human mind is an overconfidence machine. People tend to be overconfident that things will work out the way they hope (“Enron’s stock price will continue to go up.” “These mortgage securitizations will not fail.”). They also tend to be overconfident in their own moral character: confident that they are good people, they just know that the decisions they make will be ethical, and with that confidence they may make those decisions without adequate reflection.
It’s too late now, but employees of Arthur Andersen and Standard & Poor’s would have profited from viewing Ethics Unwrapped’s videos on conflicts of interest, the self-serving bias, ethical fading, and the overconfidence bias. Or from reading the following books and articles:
Dan Ariely, The (Honest) Truth About Dishonesty (2012)
David Brooks, The Social Animal (2012)
Max Bazerman & Ann Tenbrunsel, Blind Spots (2011)
Floyd Norris, “In Actions, S&P Risked Andersen’s Fate,” New York Times, February 8, 2013, p. B1
Robert A. Prentice, “Enron: A Brief Behavioral Autopsy,” 40 American Business Law Journal 417-444 (2003)