People make mistakes; it’s what makes us human. The propensity for human error is practically embedded in our DNA. While the idiom holds that there is no use crying over spilled milk, might there not be some benefit in examining the causes, contributors, and catalysts associated with poor decision-making?

Unless safety professionals stop trying to prevent injuries by reminding people not to make mistakes and instead focus on protecting people from the injuries that result from the mistakes people naturally make, there can never be an appreciable advancement in workplace safety. The secret to prevention is predictability, and the secret to predictability lies in understanding the underlying factors that increase the likelihood of human error and interfere with our ability to make sound choices and good decisions.

A scientific look at where our systems are likely to fail, and at the injuries most likely to result from those failures, is paramount to the successful management of injuries.

A tool for predicting and mitigating risks

The most sophisticated employers use a tool called Failure Mode and Effects Analysis (FMEA) to predict and mitigate the risks associated with process and product failures. Quality FMEAs require skills and discipline that are sorely lacking in most organizations, so even when they are completed, the analysis is often substandard. But even a substandard FMEA is better than being surprised by an injury.

An FMEA isn’t just a tool for predicting hazards; it is a living document, updated as injuries are evaluated and new causes and catalysts are discovered. This newfound data should include information about human error and poor decision-making.
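As a rough sketch (in Python, with hypothetical failure modes and scores of my own invention), here is how such a living worksheet might be kept, ranking failure modes by the conventional Risk Priority Number, the product of severity, occurrence, and detection ratings:

    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        """One row of a living FMEA worksheet; all values here are illustrative."""
        description: str
        severity: int     # 1 (negligible) to 10 (catastrophic)
        occurrence: int   # 1 (rare) to 10 (near-certain)
        detection: int    # 1 (always caught) to 10 (undetectable)

        @property
        def rpn(self) -> int:
            # Conventional Risk Priority Number: severity x occurrence x detection
            return self.severity * self.occurrence * self.detection

    fmea = [
        FailureMode("Guard removed during cleaning", severity=9, occurrence=4, detection=6),
        FailureMode("Operator skips lockout step", severity=10, occurrence=3, detection=7),
    ]

    # When an injury is evaluated and a new cause or catalyst is discovered,
    # the worksheet is updated rather than filed away.
    fmea.append(FailureMode("Rushed decision under schedule pressure",
                            severity=7, occurrence=6, detection=8))

    for mode in sorted(fmea, key=lambda m: m.rpn, reverse=True):
        print(f"RPN {mode.rpn:4d}  {mode.description}")

Re-sorting after each update keeps the highest-risk failure modes, including newly discovered human-error modes, at the top of the list.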

For more than 70 years, too many incident investigations have concluded that the cause of an injury was merely human error. In this culture of blame and shame, once the responsible party was found and blamed, there was no further need for investigation. Why keep looking once the cause has been found?

Most of us understand the need to repeatedly ask why when investigating the causes, contributors, and catalysts associated with an injury.

The wrong “whys”

But the most common mistake in this process is the practice of asking the wrong “whys?”

This is especially common when investigating why someone made a mistake or a poor decision. Let’s say that a person chose to run a red light and was struck by another car. The temptation is to conduct an incident investigation that goes something like this:

Q:   Why did the person run a red light?
A:   Because he was hurrying to get to the job site.

Q:   Why was the person hurrying to get to the job site?
A:   Because he was late for an appointment with the customer.

Q:   Why was he late for an appointment with the customer?
A:   Because he didn’t allow for sufficient time for travel during heavy traffic.

Q:   Why didn’t he allow for sufficient time for travel during heavy traffic?
A:   Because he lost track of time.

Q:   Why did he lose track of time?
A:   Because he was busy.

Q:   Why was he busy?
A:   Because business has increased faster than the company’s ability to hire and onboard new employees.

I could go on, but I think you get the point.

This kind of rigid, linear investigation doesn’t work well for determining the causes, contributors, and catalysts of human error or poor decision-making. In fact, this type of limited root cause analysis often misses key elements of the situation.

Correcting system issues

A better approach to the same poor decision is a process that looks something like this:

Q:   What was the poor decision?
A:   Running a red light.

Q:   What factors contributed to the poor decision?
A:   Running late, increased traffic, driver distraction (referencing a global positioning system), driver fatigue, and an increased tolerance for risk (he had run yellow lights before and suffered no meaningful consequence).

Q:   What was the catalyst for the incident?
A:   The presence of cross traffic.

Q:   What controls could have been put in place that would have prevented this accident or mitigated the risk of injury?
A:   Training, risk awareness, calendar software, policies prohibiting the use of GPS devices in the car.

Q:   Which of these controls can be practically implemented?
A:   Calendar software and training.

Q:   To what extent is the worker effectively managing the things in his life that are likely to undermine his decision-making?
A:   Poorly; this is the fourth case this year in which this employee has taken unreasonable risks that resulted in a near miss or injury.

Q:   What measures have been taken to improve the worker’s management of the factors that contribute to poor decision-making?
A:   Counseling, training, and a verbal warning.

This type of investigation (and there are scores more questions along these lines) provides a far better view of the situation and therefore affords a far better opportunity to correct the system issues that are likely to create or contribute to risk. Under the current system, if safety professionals were coroners, the cause of most deaths would be that the person stopped breathing or that the heart stopped beating. There is far more to understanding safety than determining a fraction of what caused the injury. Safety is detective work, and too many safety professionals get it wrong.
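To make the contrast concrete, here is a minimal sketch in Python (the field names are my own shorthand for the questions above, not a standard taxonomy): the broader inquiry yields a structured record of contributors, catalysts, and controls, while the “five whys” collapses the same incident into a single causal chain:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class InvestigationRecord:
        # Field names are illustrative shorthand for the questions above.
        poor_decision: str
        contributing_factors: List[str] = field(default_factory=list)
        catalyst: str = ""
        candidate_controls: List[str] = field(default_factory=list)
        practical_controls: List[str] = field(default_factory=list)

    record = InvestigationRecord(
        poor_decision="running a red light",
        contributing_factors=["running late", "increased traffic",
                              "driver distraction (GPS)", "driver fatigue",
                              "increased tolerance for risk"],
        catalyst="presence of cross traffic",
        candidate_controls=["training", "risk awareness", "calendar software",
                            "policy prohibiting GPS use in the car"],
        practical_controls=["calendar software", "training"],
    )

    # A linear "five whys" chain, by contrast, reduces the same incident to one path:
    five_whys = ["ran a red light", "hurrying to the job site",
                 "late for a customer appointment", "no allowance for heavy traffic",
                 "lost track of time", "too busy"]

Notice what the flat list discards: the structured record preserves every contributor and control option side by side, so a fix aimed at any one of them can be evaluated against the whole picture.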

A holistic and contextual view of safety

There may well be scant use in crying over spilled milk, but by understanding the context in which an injury or near miss occurred, one can not only prevent similar incidents in the future but also prevent a host of other injuries that result from the same system failure, injuries otherwise hard to imagine or measure. It’s only through this kind of holistic and contextual view of safety that we can ever truly achieve more robust processes that greatly reduce the likelihood of injuries.

You may not prevent every injury, but that doesn’t make the effort futile. FMEAs and other predictive tools should be used to identify the areas of greatest risk, and efforts should be made to reduce the risk of injuries to the lowest practical level. Sometimes the true nature of safety lies less in the prevention of injuries and more in our disappointment when we fail to do so.


Recently, the American Society of Safety Engineers has reprinted two of Phil La Duke's articles: Applying Deming To Safety (http://www.fsmmag.com/Articles/2010/09/Demings%2014%20Points%20for%20Safety.htm) and Why We Violate The Rules (http://www.fabricatingandmetalworking.com/2011/05/why-we-violate-the-rules/).