In January 1986, standing with my coworkers outside the Kennedy Space Center headquarters building, I watched the space shuttle Challenger catastrophe unfold and was brought to tears. It is an event I will never forget!

In the aftermath, cues to this national disaster became increasingly evident, with the focus on the faulty O-ring design. Initially, O-ring concerns were not highlighted, and even once they were reported and analyzed, the issues went largely unabated. At one point, engineers pushed back with their concerns, but senior leaders pushed back as well; "let's launch" seemed to be what really mattered. Eventually, an O-ring burn-through occurred and an unforgettable tragedy erupted. At various stages and levels, open communication and relevant action were constrained by fear.

Precursors — observable cues

For more than 15 years, a body of knowledge has been growing regarding serious accidents and their precursors. In essence, precursors are observable cues and manageable signals that something very bad may be about to happen.

In one form or another, precursors seem to accompany almost every serious accident.

Oftentimes, precursors are most readily apparent when dealing with the high-hazard concerns inherent to kinetic, nuclear, thermal, electrical, and explosive energies, and the like. And in nearly every case, when analyzed, various precursors signaled an eventual grim outcome.

Evolution of a serious event

Rarely is only one pathway or root cause associated with a serious accident or fatality. Oftentimes, a significant event involves multiple hazards or activities that are poorly controlled within a complex framework of organizational and support systems.

In most industrial operations, front-line supervisors and workers are closest to the hazards and to any catastrophic event. And it is here that lateral communication between coworkers may first reveal certain cues signaling that something is wrong.

Cues may come in the form of an electrical arc, a flash of volatile solvent vapors, a noisy piece of equipment that has previously failed, or inappropriate and at-risk behaviors; the list is nearly endless. The bottom line: precursors have to be communicated, recorded, and analyzed so that hazardous conditions and activities can be controlled and mitigated.

Organizational defenses and precursors

Well-known organizational psychologist Chris Argyris and others have observed that certain workplace issues are treated as "not discussable" and become part of defensive routines that limit organizational effectiveness.

I will argue that many organizational defense routines are driven by fear. There is fear of individual embarrassment, fear about the perceived value of individuals or groups, or fear that future work may be lost if a job is not completed on time. Ultimately, fear helps produce defensive mechanisms and routines that impede critical forms of communication. Think about the defense routines and behaviors in your own organization that fear has created.

When it comes to safety, fear may keep precursors from being reported beyond one's immediate work area or circle of coworkers. Serious events and near misses often go under-reported or unreported to management. And even when a concern is reported to management, it may never reach a level at which abatement can be initiated.

When a concern is "fed back," communication may become filtered, fractured, and misunderstood in ways that lead management to believe a serious event is unlikely to occur, or that it is already being appropriately abated. Within these same filtered communications, management may manipulate what has been discussed or rationalize the risk as acceptable. Management may even engage in self-serving communication that workers perceive as bullying. As a result, event precursors are not accurately communicated, analyzed, managed, and controlled.

A culture of fear

In every organization, cultural elements of fear produce organizational defenses that can impede the prevention of serious incidents. Even in the recent Penn State debacle, fear was a formidable barrier to appropriate reporting and action.

I'm certain that many of us recognize the effects of fear in the workplace and its impact on productivity, quality, morale, and, of course, safety performance. At times, fear can strongly influence finger-pointing, blaming, mistakes and miscues, job stress and overload, and under-reporting of near misses and accidents; and, in turn, the evolution of serious accidents, even fatalities.

All of this is especially true in times of economic frailty, when job security is uncertain and job loss looms. Fear immobilizes, impairs important communication, and limits critical decision-making.

What good leaders do

In some cultures, fear may be a prominent dimension that will never be well managed. In other organizations, leaders may be encouraged to diminish fear and build trust. It is trust that leads to openness, engagement, creativity, and a desire for safety excellence.

I've highlighted three essential fear-diminishing, trust-building tactics that will help you in your own quest to limit the risk of serious incidents and build a culture of safety excellence.

Lead by listening.  Listening, and being open to what is heard, is required for serious accident prevention. Listening well, without interrupting, manipulating, swaying, or bullying a discussion, is a learned skill that must be monitored and managed.

Honesty is essential.  When it comes to safety-related communication, be open and honest. Honesty and openness foster a reciprocal candor that is a key ingredient of near-miss reporting and hazard abatement.

Be fair.  At times, discipline, dismissal, and other difficult organizational decisions are required, but always strive to be fair. A lack of fairness leads to lost respect, resistance, and limited communication that will not help uncover accident precursors at any level.

Your early warning system

Right now, you just might be thinking, "Sure, everything is clearer after it happens. Hindsight is always 20/20. There's a certain after-the-fact bias to all of this."

In return, I must say: yes, there may be a certain "precursor bias," but you have a built-in early warning system wherever you work. And fear may well be the major barrier to communicating, tracking, analyzing, managing, and abating the hazards, conditions, and at-risk actions that need to be resolved.