Why can’t we see red flags?
No one likes to listen to a pessimist
Nineteen eighty-nine was a pivotal year in the twentieth century. The Berlin Wall crumbled with startling swiftness. The unification of Germany rapidly followed. Eastern European nations – Czechoslovakia, Poland, Hungary and Romania – freed themselves from Communism. The Soviet Baltic republics of Lithuania, Estonia, and Latvia held massive demonstrations that eventually led to their independence in 1991 and contributed to the dissolution of the Soviet Union.
According to the 2017 book, “Gorbachev: His Life and Times,” these events blindsided the Soviet leadership. Gorbachev was asleep the night crowds of East Germans swarmed across the border, joined West Germans atop the wall, and began breaking it down with pickaxes. Gorbachev was preoccupied with internal Soviet economic and political problems: food shortages; stores almost entirely bare of televisions, refrigerators and washing machines; Communist hardliners and radical liberals at each other’s throats. “Gorbachev really did not have time, and so this (the issue of Eastern Europe and the demise of the Soviet Union) was for us of secondary importance,” recalled a Gorbachev aide. “Eastern Europe was on the back burner.”
Most common blinders
Gorbachev also possessed a “considerable capacity for optimism” and an ability to remain hopeful in the midst of crisis. “Don’t worry, it will work out,” he’d tell aides. Distraction and over-optimism are two reasons history moved much faster than Gorbachev expected. There are other causes for missing red flags. The National Transportation Safety Board studied 37 major aircraft accidents from 1978 to 1990 and identified seven basic categories of blinders: 1) conflicting input; 2) preoccupation; 3) not communicating; 4) confusion; 5) violating policy or procedures; 6) failure to meet targets; and 7) not addressing discrepancies.
History is replete with disasters whose warning signs were missed or dismissed. In her 1962 study, “Pearl Harbor: Warning and Decision,” Roberta Wohlstetter analyzed how the Japanese surprise attack should have been less surprising than it appeared at the time, because the U.S. government possessed warning signs that such an attack was possible. Fast-forward 60 years from 1941 to 2001: the CIA had plenty of warnings in the spring and summer that al Qaeda was planning a spectacular attack. But the agency failed to add to the State Department “watch list” two al Qaeda suspects, who then entered the country under their true names with ease, and months later were two of the “muscle” hijackers on the American Airlines jet that plunged into the Pentagon, killing 189 people.
Challenger and Deepwater Horizon
On the evening of January 27th, 1986, five Morton Thiokol engineers engaged in a heated discussion with their leadership team, arguing it was too cold to launch the Space Shuttle Challenger the next day: the solid rocket booster seals might not function properly in the freezing temperatures. NASA decided to go ahead with the launch, and one of the engineers told his wife that night, “It’s going to blow up.” The next morning, 73 seconds after launch, Challenger exploded, killing seven astronauts.
Several equipment readings indicated potential signs of an impending blowout in the 24 hours before the Deepwater Horizon rig exploded on April 20, 2010, taking the lives of 11 workers and creating an environmental catastrophe in the Gulf of Mexico. Yet an Associated Press investigation found that the rig’s “record was so exemplary, according to MMS officials, that the rig was never on inspectors’ informal ‘watch list’ for problem rigs.”
On Wednesday morning, November 9, 2016, Hillary Clinton gave a concession speech she had never prepared for after her shocking loss to Donald Trump. What happened? The national media, Democrats, even Republicans were stunned. Among the many post-mortems, former Democratic National Committee Chairwoman Donna Brazile has said Clinton’s campaign was like a “cult” that became impossible to penetrate. She urged reaching out to Rust Belt states that the campaign team thought were in the bag. They trusted their analytics and didn’t heed reality on the ground. In the end, states taken for granted provided Trump’s margin of victory in the Electoral College.
Too much “noise”
One of the risk factors running through many of these disasters is the presence of “noise” in the system – an overload of data and signals that makes it difficult to separate out the really important red flags. Thanks to ever-more sophisticated software and technology, “noise” today has the potential to flood safety and health departments with indicators of performance, patterns of behavior, and system control errors. This clutter of noise leads to conflicting input, confusion and failure to address discrepancies – three of the NTSB’s classifications of “blinders.”
Another risk factor is bureaucratic “stovepiping.” Agencies and departments keep information to themselves and don’t talk to each other. In many organizations the safety and health department resides in a “silo” separate from mainstream operations. How many times are precursors to accidents identified but not communicated to senior leaders? Or too readily rejected? As a student of the Challenger disaster writes, “Pessimists, typically, are not popular people. The rest of us don’t want to hear about bad things when they might not even happen.”
When seeing a red flag, speaking up is essential, say experts. But the Morton Thiokol engineers argued their case to no avail. NASA leaders proved susceptible to “normalization of deviance.” The same can be said of BP, Mikhail Gorbachev, and the Clinton campaign team. Due to complacency, over-confidence, distraction, lack of communication, or the fact that “nothing bad has happened so far,” organizations fail to recognize deviations or steps away from best practices and smart thinking. Then an accident happens, and root cause analysis reveals a progressive drift away from prudent practices.
Do your safety and health efforts have a sufficient number of realists – or potential pessimists – who know the truth on the ground, and will call it as they see it to more distant and preoccupied higher-ups?