Not all senior executives care about safety with the passion exhibited by CEOs who are great safety leaders. Senior executives who rise through non-operations channels often have little knowledge of, or experience with, even the most basic concepts of leading-edge safety management. More importantly, they may not recognize the disastrous outcomes that are possible, or they may hold homespun theories about why accidents occur, how they are prevented, and their own role in prevention.

Unfortunately, most dismiss safety and delegate it to the people who report to them. This all-too-common practice will not bring about the kind of high-reliability performance needed in high-hazard industries.

Addressing the leadership inadequacies exemplified by the BP Gulf oil spill catastrophe requires a new vision of safety from senior executives in industry and public officials in government, new standards and oversight, and rigorous enforcement. The highest performing organizations in high-hazard industries use existing methodologies from the human sciences to assure high levels of safe and reliable operation.

But these organizations are the exception rather than the rule.
 

Types of leadership

Some senior leaders understand and act on safety management issues; most do not. It is critically important for the organization’s CEO and senior executive leadership team to grasp and accept the necessity of leading-edge safety methodologies. Senior executive leaders exhibit three levels of understanding of the central role played by safety in their businesses:

1) Those who don’t get it at all;

2) Those who get it thoroughly and act accordingly;

3) Those who think they get it but in reality do not, yet nevertheless try to convey that they are among the converted.

The most difficult and damaging leaders fall into this third category, and it is at least as common as the other two. In “get it but don’t” cases the situation looks like this: “Our CEO says the right things about safety, but we aren’t always sure she really means it. Some of our senior leaders don’t really care about safety but give it lip service. Our middle-level leaders are all over the map on safety, and it is exceptional to find a great safety leader.”
 

Where BP and regulators went astray

Root causes relating to unreliable leadership lie at the foundation of the long series of causes contributing to the BP catastrophe. Senior executives within the drilling organizations failed to establish a culture that supports the risk analysis, understanding, communication, and decision processes needed for adequate operational safety and reliability. Government regulators failed to set adequate safety standards and enforce compliance. Both failed to heed warnings of problems, to act on known problems, and to prepare adequate response plans, because they underestimated the worst-case scenario.
 

Cognitive biases

The Gulf disaster brings to light a significant and common barrier to outstanding safety leadership: cognitive biases.

Decision-making processes are subject to biases: identifiable cognitive shortcuts that simplify decision-making and have the benefit of being correct most of the time. “Most of the time” is good enough for most day-to-day decisions, but these shortcuts are reliable only in the sense that they produce predictable errors. Effective management of high-hazard technology requires rigorously high levels of reliability, at both the decision-making and the behavioral levels.

Perhaps the most interesting human science research of the past 20 years has been that of cognitive psychologists Daniel Kahneman and Amos Tversky.(1) Their work was recognized in 2002, when Kahneman received the Nobel Prize in Economics.

Cognitive biases are responsible for all types of errors in judgment, risk assessment and decision-making, wherever the cognitive process requires assessing probabilities. This applies directly to all kinds of risk management, where methodologies assume that risks can be quantified with some accuracy. Specific examples of cognitive bias abound in the analysis of catastrophic events. (2)

More than a dozen specific types of bias have been identified and researched. These include: sunk cost bias - continuing to invest in a project that has not provided adequate return; failure to see the baseline (base rate neglect) - estimating a future probability while overlooking existing baseline data; recency bias - being overly influenced by the most recent events and observations; and confirmation bias - placing undue weight on data that supports the outcome one expects.
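
To make the base rate point concrete, here is a minimal sketch of the arithmetic using Bayes’ theorem. The base rate, sensitivity, and false alarm rate below are hypothetical numbers chosen only for illustration, not drawn from the BP case:

```python
# Hypothetical numbers illustrating base rate neglect (failure to see the
# baseline); they are not drawn from any real incident data.

base_rate = 1 / 10_000        # prior probability that a given well has a serious problem
sensitivity = 0.99            # P(warning sign | problem)
false_alarm_rate = 0.05       # P(warning sign | no problem)

# Bayes' theorem: P(problem | warning sign)
p_sign = sensitivity * base_rate + false_alarm_rate * (1 - base_rate)
posterior = sensitivity * base_rate / p_sign

print(f"P(problem | warning sign) = {posterior:.4f}")  # roughly 0.0020, i.e. about 0.2%
```

The intuitive answer, which ignores the baseline, is close to 99 percent; the actual probability is a fraction of one percent. The same bias works in the other direction when a long record of trouble-free operation is used to dismiss a specific new risk.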

Related to the BP case, an example that illustrates the effects of cognitive bias is the “boiled frog syndrome.” The story goes that if a frog is placed in boiling water it will jump out, but if it is placed in water that gradually warms it will be killed. (This is admittedly a metaphor, not a scientific fact.)

Estimates of the number of offshore oil wells drilled in the Gulf of Mexico in the past 40 years range as high as 50,000. In 2006, 3,858 oil and gas platforms operated in the Gulf, according to the National Oceanic and Atmospheric Administration. Given that there have been no serious explosions in the Gulf since 1979 (off the coast of Mexico), and much financial success, it is not hard to see how the frog got boiled.
 

How to deal with cognitive bias


The easy part is training, which is readily available and worthwhile. The harder part is addressing cultural issues related to the prevalence of cognitive bias. The climate and culture of an organization will cause people to favor or resist the automatic use of unexamined biases. (3)

Take the case of NASA’s safety culture earlier this decade. Our assessment after the Columbia shuttle tragedy in 2003 found the atmosphere of some meetings was anything but conducive to open and productive communication, especially if people disagreed. NASA had a good culture generally, but the way information flowed in all directions did not encourage productive dialogue. Often disagreements were “unsafe” in the sense that someone would win and someone would lose.

Subsequently, leadership at NASA took on the task of creating a more open environment where communication was encouraged and dialogue about disagreement was valued. Progress was measured via surveys and observations, and positive results were evident in the first year.

The Gulf catastrophe illustrates the need for leaders in both government and industry to have a new vision for safety, a new blueprint, one that integrates both process safety and employee safety in a culture of safety. The essence of the new vision is that a condition of doing business is doing it safely. To guard against cognitive biases, this means that oil drilling can proceed only if leading indicator data meet pre-determined levels, monitored both within the organization and by outside regulators. In some cases such leading indicators exist; in most cases they do not and must be developed.
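
As a minimal sketch of what such a leading-indicator gate might look like in practice, with indicator names and threshold values that are purely hypothetical rather than taken from any standard or regulation:

```python
# Hypothetical leading indicators and pre-determined levels; the indicator
# names and threshold values are illustrative only, not taken from any
# actual standard or regulation.
PREDETERMINED_LEVELS = {
    "percent_safety_critical_maintenance_on_time": 95.0,   # minimum acceptable value
    "percent_well_control_drills_passed": 100.0,           # minimum acceptable value
}

def drilling_may_proceed(indicators):
    """Return True only if every leading indicator meets its pre-determined level."""
    return all(
        indicators.get(name, 0.0) >= minimum
        for name, minimum in PREDETERMINED_LEVELS.items()
    )

# Example: one indicator falls short of its level, so the answer is "do not drill."
print(drilling_may_proceed({
    "percent_safety_critical_maintenance_on_time": 97.2,
    "percent_well_control_drills_passed": 92.0,
}))  # False
```

The point of the sketch is the decision rule, not the particular indicators: drilling proceeds only when every pre-determined level is met, and the same data are visible to the regulator.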

These findings and recommendations are not particular to the oil drilling industry. They have application in all industries where cognitive biases can result in low-frequency but high-consequence events.

(1) Kahneman, D., & Tversky, A. (1972). Subjective probability: A judgment of representativeness. Cognitive Psychology, 3, 430–454.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5, 207–232.
Kahneman, D., Slovic, P., & Tversky, A. (1982). Judgment Under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press.
Kahneman, D., & Tversky, A. (1984). Choices, values and frames. American Psychologist, 39, 341–350.
Kahneman, D., & Lovallo, D. (1993). Timid choices and bold forecasts: A cognitive perspective on risk-taking. Management Science, 39, 17–31.
Kahneman, D., & Tversky, A. (1996). On the reality of cognitive illusions. Psychological Review, 103, 582–591.

(2) Zebrowski, D. (1991). “Lessons Learned from Man-Made Catastrophes.”
(3) Krause, T. (2005). Leading with Safety. Wiley-Interscience.