Google the phrase “stupid unsafe acts” and you get about 174,000 results. That’s a bit scary. Included are links to “People Doing Stupid Things,” “Stupid Human Beings Doing Unsafe Things” and “Hilarious pictures from around the world show workmen who somehow cheated death.”

So why is a guy hanging a “Think Safety First!” banner while balancing on staircase railings?

What’s with the worker in a hardhat smoking a cigarette directly in front of a sign reading “Danger – Flammable Gas”?

Or the worker kneeling to cut into a concrete floor with a power saw, wearing a clear bucket over his head instead of a hardhat and face shield or safety specs?

Age-old question

The rationality of human beings has been questioned by philosophers, psychologists, sociologists, economists, bartenders and their patrons – and safety and health professionals – for eons. Humans may be the only rational animal, but only intermittently, as just about any safety and health practitioner will tell you.

To explain our habitual and presumably ineradicable irrationality, let’s turn to Daniel Kahneman, one of the foremost experts on the psychology of judgment and decision-making. Kahneman is professor emeritus of psychology and public affairs at Princeton University’s Woodrow Wilson School of Public and International Affairs. Called by some the world’s most distinguished psychologist, Kahneman was awarded the 2002 Nobel Prize in Economic Sciences, and is the author of the 2011 book, Thinking, Fast and Slow, published by Farrar, Straus and Giroux.

Says Kahneman, “We can be blind to the obvious, and we are also blind to our blindness.” He adds, “We’re all prone to make some very simple errors when we try to work out what to do. And we continue to make those mistakes even when they are pointed out to us (by safety and health pros on a regular basis). I don’t have a recipe for avoiding the errors that we’re all prone to, and I don’t think there is one. It’s the way we are.”

No answers?

Well, that’s a bit pessimistic, eh? Safety and health pros spend whole careers finding and implementing “recipes” to avoid “the errors that we’re all prone to.” Millions and millions of dollars have been spent on behavior-based safety’s observation and feedback processes, OSHA regulatory compliance, checklist manifestos, safety and health management systems, the hierarchy of controls, coaching and training, rulebooks and discipline, root cause analyses, job safety analyses – everything from time-worn toolbox talks to modern predictive analytics.

In recent years human and organizational performance (HOP) has come to the fore. HOP starts by recognizing that human error is part of the human condition. It’s not a matter of stupidity. HOP accepts human error as inevitable, and works around human foibles by reducing risk traps in work processes. Assessment and mitigation are prioritized by the severity and probability of risk.
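To make that prioritization concrete, here is a minimal sketch of a severity-times-probability risk score, the kind of calculation a standard risk matrix encodes. HOP itself prescribes no particular formula; the 1–5 scales, the risk_score function and the example traps below are hypothetical illustrations.

```python
# A hypothetical severity-times-probability risk score, sketched in Python.
# HOP prescribes no particular formula; the 1-5 scales and the example
# risk traps below are illustrative assumptions, not a standard.

def risk_score(severity: int, probability: int) -> int:
    """Score a risk trap on 1-5 scales for severity and probability."""
    assert 1 <= severity <= 5 and 1 <= probability <= 5
    return severity * probability

# Hypothetical risk traps in a work process: (description, severity, probability)
traps = [
    ("cutting concrete without face protection", 4, 3),
    ("smoking near a flammable-gas line", 5, 2),
    ("hanging a banner from stair railings", 3, 4),
]

# Mitigate the highest-scoring traps first.
for name, sev, prob in sorted(traps, key=lambda t: risk_score(t[1], t[2]), reverse=True):
    print(f"{risk_score(sev, prob):>2}  {name}")
```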

Our very own blinders

Kahneman and others point out that cognitive biases play a big role in making us “less rational.” Our actions are not always anchored in reasons, Kahneman says. (You’re likely nodding your head in agreement.) Biases or blinders get in the way. Here are just a few:

• Present bias causes us to pay attention to what is happening now and not worry about the future. “Let’s get this job done now.” This do-it-now, damn-the-consequences thinking leads to texting while driving, overeating, smoking, and having unprotected sex.

• Confirmation bias is the tendency to seek out information and experiences that confirm what we already believe. We don’t wear PPE because we’ve gone without it before and haven’t been hurt.

• Loss aversion means we feel the pain of a loss much more than the pleasure of a gain. Ask any athlete. This can spell trouble for safety and health pros: injuries most often happen to the other guy, so the pain of loss is never personally felt and never deters risk-taking. And the positive gains of working safely – something of a pleasure principle – don’t register with us.

• The negativity bias operates along similar lines. Negative events are far more easily remembered than positive ones. It’s why athletes, fishermen and lovers dwell on “the one that got away” more than a victory or a prize catch.

A different sort of system thinking

Kahneman also employs a type of “system thinking” to explain our irrationality. He describes two modes of thinking – System 1 and System 2. System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. It’s fast thinking. System 2 is slow thinking: it handles the mental activities that demand effort – paying attention, calculating, putting pen to paper. Mindfulness and situational awareness – goals of safety and health pros – require System 2 thinking.

System 2 thinking is the preference in safety: slow down, monitor the appropriateness of your behavior, pay attention, think decisions through, exert self-control. System 1, in contrast, represents the dreaded “working on autopilot.” The problem is that System 1 thinking comes to us automatically, easily and effortlessly, while System 2 thinking demands mental effort and energy – listening and reading carefully, analyzing logically – and slows us down. We usually engage only a fraction of System 2’s capacity.

HOP – human and organizational performance – offers an antidote to our naturally fast, involuntary, biased (and often irrationally unsafe) way of thinking. Create a work environment of checks and balances; design in process controls; emphasize audits and risk assessment. Identify and mitigate the workplace traps and inefficiencies that we’re blinded to by cognitive biases and circumvent with fast, autopilot thinking. HOP sees behavior as a function of the work system. Sure, employee selection, training and personal accountability will always be essential, but simply blaming behaviors or “stupidity” is too easy and too lazy.