For the last couple of weeks I have been teaching Just Culture to a large healthcare organization, and it has triggered new revelations about the concept.

First, a bit about Just Culture: James Reason first used the term “Just Culture” in the late 1990s to describe an environment where workers felt free to admit mistakes. Because people are intrinsically fallible, Reason believed it was unjust to punish people for things that they neither intended nor chose to do. Mistakes are most often important sources of information about flaws in our processes, and Reason believed that punitive responses to mistake-making led to a climate of fear in which mistakes were driven underground.

Most advocates of Just Culture believe that a “culture of safety” cannot be achieved until a Just Culture is firmly entrenched within an organization. Just Culture is characterized by a sense of fairness: people are held accountable for their actions, but accountability doesn’t necessarily mean punishment. When it comes to things going wrong, there are basically three categories: mistakes, risk taking, and recklessness.

Mistakes

Mistakes are those unintended negative outcomes that we categorize as human errors; they are the reason they put erasers on pencils.

Much as we try, we can never eliminate human error (since it is unintended and often subconscious). Mistakes aren’t bad; in fact, most of our learning as people comes from trial and error. As infants we investigate our world, and as we make mistakes we fill our gray-matter database with what works and what doesn’t. This neurological database subconsciously continues testing our environment and leads us to innovations or disasters with cold indifference. We call mistakes that are purely the result of a slip-up “human error,” and in general we hold people accountable for their mistakes only to the extent that we ask them to bring the lessons they have learned to bear, so that we can learn from the mistake and help others avoid similar errors in the future. This is called mistake-proofing.

Mistake-proofing is something of a misnomer, because its intent is not so much to prevent mistakes as to lessen the consequences of the error. While mistakes are difficult to prevent, there are many factors that make them more likely:

Stress. The greater the stress, the more frequent the mistakes one makes, as the subconscious brain rapidly tests the safety of moving from one environment to the next.

Fatigue. Many tasks require us to complete a process precisely or to check details. Fatigue makes it more difficult to perform a task without variation, and variation in a process leads to mistakes.

Repetition. Our brains are designed to look for patterns and to anticipate and continue those patterns. Often, people doing repetitive tasks will unknowingly see something that simply isn’t there, or fail to see something that is there but shouldn’t be.

Distraction. Our brains process stimuli at an astounding rate. The vast majority of these stimuli are interpreted by our subconscious in microseconds. Additionally, recent brain research has shown that the human brain is incapable of consciously “multi-tasking,” which means that distraction plays a major role in mistake-making.

Drift

While to err is human, not all of our snafus are honest mistakes. Often, what looks like a mistake is the result of behavioral drift or inappropriate risk taking.

Behavioral drift is the tendency of a person to gradually move away from a standard until the person has moved out of compliance.

There are two kinds of drift, intentional and subconscious.

Subconscious drift often begins with human error: a person makes a mistake but suffers no meaningful consequences for it, and the deviation gradually becomes the new normal.

Intentional drift typically evolves as a worker becomes more familiar with operational norms, taboos, and mores. As people feel more at home with the culture, they become more comfortable deviating from the standards because they understand how much behavioral variance the culture will tolerate. In other cases, a deviation from the standard is inappropriately believed to be justified, or mistakenly believed to be an innovation.

Taking a shortcut is an example of intentional drift in one’s behavior. Is it wrong to take a shortcut? Perhaps not in itself, but if a shortcut is truly an improvement, the right response is to work to have it included in the work standard rather than to keep operating outside the standard work.

Another source of drift is ineffective training. Research has shown that as much as 80% of the skills taught in training courses are not retained long enough to be applied on the job. If this is the case, workers may be operating at only 20% accuracy in the tasks they are charged with completing, and the remaining 80% represents variation or drift from the standard.

Poor decision-making

Poor decisions can be more dangerous than mistakes or drift, and they are also more difficult to prevent. Poor decisions are typically categorized as “unsafe behavior” or “at-risk behavior” and represent the largest cause of undesired outcomes. Workers are called on to make decisions all day long, and the more bad decisions they make, the greater the risk the organization faces. Poor decisions can be influenced by:

Poor communication. Many catastrophic decisions are made simply because the information on which they are based is incomplete, inaccurate, or just plain wrong. Increasing the effectiveness of communication vehicles is perhaps the single most effective way to reduce bad decisions.

Assumptions. Often disasters happen because someone made a decision based on an assumption: the person who made the decision believed something to be true when it was not, or assumed that all elements of a process were performing to standard when they weren’t.

Miscalculation of risk. We all manage risk, but one person’s assessment of a risk is often considerably different from someone else’s perception of the same situation. Oftentimes a bad decision grows out of a person’s belief that a fairly probable failure is extremely remote.

Discipline doesn’t work

There is a misguided thought process that people use when things go wrong. People tend to seek retribution for even the smallest errors, yet in the case of mistakes, retribution has no chance of preventing further errors. Punishment is almost as ineffective at preventing people from repeating mistakes or from continuing to make bad decisions. When we assign blame, we seldom look for answers beyond the blame.

From Phil LaDuke’s Rockford Greene International blog: rockfordgreeneinternational.wordpress.com