“When I was a young man I was given a check for a million dollars. I tore it up and went to the top of a mountain to contemplate the mistakes of mankind…one man in particular.”
--Joe Martin, Cartoonist.
When it comes to unsafe behaviors, there is a lot of ground to cover. In some cases, these unsafe behaviors are just human errors: honest mistakes that we all unintentionally make once in a while. And despite what the mouth-breathing, pig-eyed brutes who fanatically claim that behavior-based systems are the answer would have you believe, there is precious little we can do about them. Suffice it to say, we ought to focus less on preventing mistakes and more on protecting people and processes from being harmed by errors.
Misjudgments are a different story, and potentially more lethal. For starters, it is tough to get inside someone's head and figure out exactly what they were thinking. We love to beat up on people for making a bad call, in many cases because we just know that if we were put in the same situation with the same information, we would have decided more prudently. We believe that we are able to make better decisions than our peers and to generally avoid making stupid choices. Unfortunately, we're most often wrong.
The human brain (and perhaps dog brains, and bird brains, for all I know) is designed to see patterns and to sort its input into categories. In broad strokes (if you want the details, read Joseph T. Hallinan's Why We Make Mistakes: How We Look Without Seeing, Forget Things in Seconds, and Are All Pretty Sure We Are Way Above Average), our subconscious minds tend to convince us that things are true when they aren't. In other words, our biases fill in the blanks. These aren't big, ugly biases like racism, but little biases like the tendency of right-handed people to turn right when entering a building even when the shortest route is to the left.
Sometimes, in fact MOST times, our biases are helpful. Perhaps you've seen the email with all the letters jumbled; it expertly demonstrates that our brains can descramble letters simply because our subconscious mind is jumping to (in this case the right) conclusions. In other cases these biases are wrong and have deadly consequences.
I was working with a heavy truck manufacturer when the supplier of the fuel tanks missed a shipment, forcing the manufacturer to run production without these parts. Running production lines during a parts shortage is common, but in this case it created a problem. At one of the stations, the standard operating procedure was for the operator to step out backward onto the fuel tank (the tank was designed with a step for driver entry and exit, built to hold the weight of a burly truck driver) and weld the exterior joints. Even though the operators had been warned of this hazard, and contingencies were in place, an operator stepped backward from the truck and fell to the floor. Miraculously he was uninjured, but had this happened a few seconds later in the process, he would have fallen an additional ten feet into a pit.
This wasn't an at-risk behavior. The operator didn't think, "Well, yes, this is dangerous, but I will only be out there for a second." The operator did his job the way he had been doing it for years, day in and day out. His mental bias made him think something was true when it wasn't, and that bias could have gotten him killed. Should this worker be punished for his miscalculation? Hardly. But overcoming biases, those deep-seated beliefs buried in our subconscious, is tough. Again, this demonstrates the importance of protecting people from their decisions instead of trying to manipulate the decision-making process.
We see what we want to see
“He’s as blind as he can be,
Just sees what he wants to see,”
--John Lennon, "Nowhere Man," The Beatles
Research has shown that our perceptions are intrinsically flawed. In experiments where a stranger approaches a pedestrian for directions and the pedestrian is distracted long enough for researchers to swap in a different person, only about 30% of pedestrians noticed that they were now talking to a completely different stranger. Numerous studies have repeated this experiment with essentially the same results. Researchers concluded that people saw what they were expecting to see and (at least consciously) ignored the things that didn't seem to fit.
Imagine the implications of this research for safety, particularly for non-standard work (situations where things aren't the way they are supposed to be, or where there is no "normal" because of the uniqueness of the work being performed). Our subconscious brains force us to make decisions based on what is supposed to be true, not on what is actually true.
Zachary Shore's book Blunder: Why Smart People Make Bad Decisions examines why otherwise intelligent people make really stupid decisions. It's worth a read, but if you prefer the equivalent of a fourth-grade book report on it, read my post. It's not my best work, but then you get what you pay for, and I haven't seen a bunch of checks rolling in lately.
Shore identifies seven mental states, which he calls "cognitive traps," that can cause people to make misjudgments:
Exposure anxiety. Exposure anxiety is the subconscious fear that our decisions might make us seem weak or somehow tarnish our reputations. In these cases we will actually make a bad decision over a better option simply because we are overly concerned about how others perceive us.
This manifests in safety when people choose to ignore safety protocols because they think coworkers will see them as foolish, as unnecessarily worried about a remote possibility of injury, or even as sucking up to the boss.
A good share of exposure anxiety is influenced by the overall cultural view of safety: if there is peer pressure to take shortcuts or violate safety rules, we will subconsciously make bad decisions. In other words, we will put ourselves in harm's way without even realizing that we are making a decision to do so.
Causefusion. Causefusion (a contraction of the words "cause" and "confusion") is the practice of ascribing cause and effect to mere correlation. The best example is what we as a profession (along with the National Safety Council and a parade of greedy, self-serving snake-oil salesmen) have done with the statistic that 96% of all worker injuries have a behavioral element. That is a correlation. In 96% of cases where a worker was injured, an unsafe behavior was also present, but that is not the same as cause and effect. It doesn't mean that the injuries were caused by unsafe behaviors, only that both factors were present.
We can also see causefusion in a lot of what we do in safety, not the least of which is incident investigation. Far too often our incident investigations establish a correlation to which we mistakenly attribute a causal relationship. Unless we can overcome this cognitive trap, we create organizational superstitions around our hazards. For all the good it will do us, we might as well be shaking a gourd over the heads of workers or sacrificing live chickens at our pre-shift huddles. At least people would feel like we were taking action.
The remaining five traps Shore identifies are Flatview, Cure-Allism, Infomania, Mirror Imaging, and Static Cling. Some are more applicable to safety than others, but all are worth considering. Shore isn't a safety professional; in fact, he is primarily a historian, but his research on why we make bad decisions is compelling. Each of these cognitive traps can be seen at the root of one of the major misconceptions in safety: the claim that 96% of all worker injuries are caused by unsafe behavior.
There is a ton more I could say about the many factors that cause us to make bad decisions, but I am already approaching the maximum that anybody will ever read in a blog. Suffice it to say, human behavior is a lot more variable and complex than any of the BBS yahoos will ever admit. There is precious little we can do to improve our decision making, because so much of it happens on a subconscious level. Once again, we have to stop trying to change human nature, focus instead on the physical environment in which people behave, and stop trying to apply Pavlovian and Skinneresque theory to workplace safety. It doesn't work.