How often do you ignore facts staring you in the face because they conflict with your preconceived worldview?

We tend to spend far more time confirming what we already know than seeking out and paying attention to opposing points of view or facts.

Each of us has blind spots in our brains. Robert Burton, M.D., former chief of neurology at Mount Zion-UCSF Hospital, described this selective filtering as the “feeling of knowing.”1 As we observe the details of what is going on around us, our brains filter out information that we are not familiar with or do not recognize. We develop a craving for certainty at the expense of questioning and inquiry.

Metaphorically, Burton uses the formation of a riverbed over time to describe how we develop the feeling of knowing. The meandering flow of water eventually finds a path of least resistance, where a creek forms and the beginnings of a riverbed are carved. As the creek deepens into a river, its sides grow steeper, and the water, like our thinking, becomes ever less likely to escape its established channel.

Margaret Heffernan points out that this is how Willful Blindness creeps into our lives.2 Every day we make countless decisions based on our innate feeling of knowing and desire for familiarity, which crowd out our peripheral vision of other options. Our blindness grows slowly as we make more and more of these decisions, to the point that “we see less and less and feel more comfort and greater certainty.”3

Willful Blindness came to us through the criminal legal system. The term refers to an individual who could have known, and should have known, the facts of a situation, but deliberately blinded himself to their existence.4

Some historically profound examples of Willful Blindness by individuals and organizations in positions of power include Bernie Madoff of Ponzi scheme fame; the Catholic Church and pedophile priests; and the medical profession, which refused to abandon X-raying pregnant mothers for more than 20 years after Alice Stewart’s research showed the practice doubled cancers in children. As Gayle Greene, Alice Stewart’s biographer, notes, “People are very resistant to changing what they know how to do, what they have expertise in and certainly what they have an economic investment in.”5

In the safety world

A number of cases of Willful Blindness can be found in the safety world: the Texas City Refinery explosion, the Deepwater Horizon blowout preventer failure, the Upper Big Branch mine explosion, the Challenger explosion. In each of these cases, people in leadership roles “should have known, and could have known, but decided to blind themselves from the facts.”

Individuals with the authority to challenge and change the course of events chose to blind themselves to the facts leading up to each disaster. Their comfort with the feeling of knowing, with the feeling of certainty, overwhelmed their ability or desire to accept conflicting data and information.

Some of the most tragic examples of Willful Blindness have grown out of “just following orders.” To remain competitive, companies have undertaken significant cost-cutting initiatives, in both budgets and personnel, with little to no regard for the risks those cuts create. Managers become so fixated on achieving upper management’s order to cut costs that they lose sight of everything around them, which can result in catastrophic consequences.

Breaking through false confidence

Heffernan provides insight into the conditions that allow Willful Blindness to flourish and the means to overcome our feeling of knowing.6 The following is a safety and health perspective on her insights.

• Recognize the homogeneity of our lives, our institutions, our neighborhoods, our colleagues and our friends. Instead of only hanging out with like-minded people, reach out to individuals who seem not to fit in. Challenge the politically correct notion of diversity (e.g., gender, race, age) to include diversity of thought. The next time you form an incident investigation team, ask someone who knows little to nothing about safety incident investigations to participate and add a different perspective.

• Acknowledge the biases we bring to any group and adjust for them. Do you always advocate the Heinrich premise that “unsafe acts of workers are the principal causes of occupational accidents”? Be mindful: this bias may be blinding you to the systemic causes of accidents.

• Know the hard limits of our cognitive capacity. Long hours at all levels of work lead to incompetence, carelessness, and lost productivity. How many consecutive days do your 12-hour shift workers go without a day off?

• Cultivate the ability to welcome debate and conflict. Do you allow employees to question the unquestionable? Who is your safety devil’s advocate?

• Create an environment where employees have room to offer solutions, even solutions contrary to current thinking, without repercussions. Often employees know the answer to a safety matter but fear saying anything. Relieve the pressure and ask for their opinion.

• Establish a small network of people who will bring you the unvarnished truth and with whom you can explore ideas freely. Going into execution mode severely diminishes your peripheral vision, so have a network that watches your back and your front. Remember to include people who bring diversity of thought.

• Develop yourself into a critical thinker with courage. To be a critical thinker you must resist the temptation to be a pleaser. Be a nonconformist. Rather than always having the answer for your boss, ask questions to understand the safety issue. Challenge your boss’s thinking with questions. One indicator of critical thinking is discomfort. Be leery of unanimous decisions; they are intrinsically suspicious.

• Seek out minority opinions. The mere presence of a minority opinion can significantly alter the flow of a safety discussion.

• Study the history of your organization rather than obsessing over the present. Since history tends to repeat itself, great value comes from understanding the systemic causes of past safety events. The challenge with weak signals, such as near misses, is knowing when to take them seriously. Look for what you cannot see.

References

1 Burton, R.A. 2008. On Being Certain: Believing You Are Right Even When You’re Not. St. Martin’s Press. New York, NY.

2 Heffernan, M. 2011. Willful Blindness: Why We Ignore the Obvious at Our Peril. Walker & Company. New York, NY.

3 Ibid., p. 21.

4 Ibid., p. 2.

5 Ibid., p. 51.

6 Heffernan, op. cit., pp. 223-247.