The concepts of Human Performance Improvement (HPI) have been advocated for use in safety programs for a number of years. Human performance is defined as “a series of behaviors carried out to accomplish specific task objectives.” HPI emphasizes principles and activities designed to reduce the potential for human error. Many of these have long been used in some form in safety programs, but it is the underlying research and structure that differentiate an HPI approach from traditional safety efforts.

HPI increases the effectiveness of the safety process if its core principles are taken to heart. When HPI is combined with risk concepts and an improved understanding of job requirements, an organization improves its potential to reach its stated safety goals and objectives.

Human performance principles

The U.S. Department of Energy’s Human Performance Improvement Handbook lists the following five HPI principles:

1. People are fallible, and even the best people make mistakes. The belief still exists that mistakes are 100-percent avoidable. The routine phrase “unsafe act” skews inquiry away from underlying causes because it implies, at the start of any discussion, that the employee was at fault. In our view, HPI’s first principle is fundamentally different from the “unsafe act” point of view.

2. Error-likely situations are predictable, manageable, and preventable. If humans are fallible, then the approach must be to design safety and operational processes that search for and identify error-likely situations.

3. Individual behavior is influenced by organizational processes and values. Peer pressure, values and management style all contribute to behavioral risk taking. Understanding the current “organizational culture” is essential.

4. People achieve high levels of performance because of the encouragement and reinforcement received from leaders, peers and subordinates. The strength and type of feedback, whether positive or negative, can drive employees toward or away from safe behavior.

5. Events can be avoided through an understanding of the reasons mistakes occur and application of the lessons learned from past events (or errors). Put another way, the probability and severity of undesired events can be reduced.

Latent errors

Latent errors are hidden organizational weaknesses or equipment flaws that go largely unnoticed. Because they are dormant, they have no immediate negative impact and can persist for long periods before they contribute to a loss-producing event. Latent errors include a wide range of management, supervisory, engineering and administrative actions, directives or decisions that set up the conditions for error or that fail to prevent the effects of an error.

For example, an excellent employee observation process may be in use. However, it is probably not designed to identify hidden weaknesses woven into the fabric of the organization.

Human error and equipment failure

According to references from the nuclear power industry, loss-producing events are split roughly between 80 percent human error and 20 percent physical equipment failure. Of the 80 percent attributed to human error, 70 percent stems from “organizational weakness” (latent errors) and only 30 percent from individual mistakes. If 70 percent of human-error events are due to organizational weakness, then 56 percent (70 percent times 80 percent) of all loss-producing events are due to these latent, built-in errors, and only 24 percent (30 percent times 80 percent) are due to individual mistakes.
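
Restated as a single breakdown of all loss-producing events, the cited percentages work out as follows:

  • Physical equipment failure: 20 percent
  • Human error rooted in organizational weakness (latent errors): 80 percent × 70 percent = 56 percent
  • Human error rooted in individual mistakes: 80 percent × 30 percent = 24 percent
  • Total: 20 + 56 + 24 = 100 percent of loss-producing events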

Much more emphasis is needed on ensuring that assigned tasks are thoroughly evaluated using a structured process. Because humans are fallible, we cannot assume that a job was initially designed for maximum efficiency or effectiveness. A job may be based on traditional movements and actions that contain hidden loss-producing traps for employees if various job components and hazards come together at some point in the future.

Why do an enhanced Job Hazard Analysis (JHA)?

Few companies make a strong effort to ensure that all tasks with risk potential are reviewed and that controls are in place to limit the damage that can be done. The JHAs that are completed may focus only on directly observable hazards.

The JHA is better used to uncover issues built or designed into the job. That requires identifying and reviewing the basic steps and tasks; the tools, equipment and materials; the work environment; administrative policies and guidelines; and the employees who do the work.

HPI asks us to seek out latent errors that may not manifest themselves for long periods of time. For example, one of the authors used the JHA as part of an incident investigation review. The initial report found the injured employee at fault for an “unsafe act,” and the corrective action was to “retrain the employee in the proper procedure” for the task. When the JHA review was completed and the current procedure was requested, it turned out that no procedure existed. A latent error created the loss event, not just an action error by the employee.

Improving the Job Hazard Analysis

When completing a job hazard analysis, consider the following:

  • Begin by asking about the initial task design. Don’t assume the job is designed correctly; expand into a full review of the steps and tasks currently required.
  • What needs to be avoided, redesigned or modified to reduce the hazards and the scope of risk through improved job design?
  • Can any latent errors built into the job design be identified? These may stem from task demands, individual capabilities, the work environment or human nature (see the illustration following this list).
  • Is this job completed in tandem with other jobs whose hazards and risks can combine, creating a synergistic effect with greater risk and higher loss potential?
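
As a hypothetical illustration only (the categories come from the discussion above, not from a prescribed worksheet format), an entry in an enhanced JHA review might capture:

  • Step or task: the work as it is actually performed, not as it is assumed to be designed
  • Observable hazards and controls: what a traditional JHA would already record
  • Possible latent errors: a missing or outdated written procedure, task demands that exceed individual capabilities, conflicting administrative guidelines, or interactions with other jobs performed at the same time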

Incorporate human performance improvement ideas into your JHA process. The effort might produce additional insights into the scope of risk, the hazards involved and improved controls.

References

U.S. Department of Energy. Human Performance Improvement Handbook, Volume 2. DOE-HDBK-1028-2009, 2009.

Roughton, James, and Nathan Crutchfield. Job Hazard Analysis: A Guide to Compliance and Beyond. Butterworth-Heinemann, 2008.