Last month I introduced you to Scott A. Snook’s term practical drift, which he coined in his root cause analysis of the accidental shootdown of two U.S. Army Black Hawk helicopters over northern Iraq, an incident that killed 26 peacekeepers. Recall that practical drift is the “slow, steady uncoupling of local practice from written procedures.”1

So when does practical drift occur in the context of the practice of safety?

First, a brief overview of Snook’s theoretical matrix model is in order. Three dimensions define the model: Situational Coupling, Logics of Action, and Time. The organizational situation can be either loosely or tightly coupled. In a loosely coupled situation, the subunits of the organization are tied together weakly, infrequently, slowly or with minimal interdependence. Think of an organization that tolerates a high degree of flexibility in making safety decisions at the operations level.

In a tightly coupled situation, there is no buffer or slack between elements within the organization.3 Here there is a great deal of command-and-control decision-making, such that a safety decision in one unit has a direct effect on another unit.

The second dimension is Logics of Action, described as either rule-based or task-based. Logics of action are the scripts, norms and schemas among which people shift: individuals move back and forth between rule-based and task-based logics depending on the context in which they find themselves. Snook contends these shifts have a predictable impact on the smooth functioning of the organization.4

The third dimension is Time. The circle of arrows through the four quadrants implies a cycle of time. Each quadrant represents a different state of stability, determined by how well the current logics of action match the current situational coupling.

Quadrants 1 and 3 indicate a state of stability: the logic of action matches the situation. Quadrants 2 and 4 represent instability, wherein rule-based logics do not match loose coupling and task-based action is no match for tight coupling. An organization moves from 2 to 3, and from 4 to 1, because of an imbalance in the organizational system. Each quadrant carries a label and depicts a point in time where the organization pauses (sometimes for years) in the natural flow of the transition from one quadrant to the next.5

Quadrant 1 is labeled “Designed,” indicating, for example, that the organization is at the planning stage; members follow the rules in a tightly coupled environment. Quadrant 2 depicts an “Engineered” organization, operating as designed (i.e., following the rules) even though the situation is loosely coupled. In Quadrant 3, labeled “Applied,” members of the organization take a pragmatic approach to applying the rules and become more task-based. Finally, in Quadrant 4, labeled “Failed,” task-based operations do not align with the now tightly coupled organization. It is at this point in the cycle that an organization is most likely to experience a significant event; in Snook’s analysis, this is when friendly helicopters get shot down.
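Snook presents these dimensions as a diagram. Purely as a summary device, the 2×2 matrix and its cycle can be sketched in a few lines of code; the pairings below simply restate the quadrant descriptions above, and the names (`QUADRANTS`, `classify`, `next_quadrant`) are our own illustration, not Snook’s.

```python
# Illustrative sketch only: the labels and stability pairings restate the
# quadrant descriptions in the text; the data structures are hypothetical.

QUADRANTS = {
    # (situational coupling, logic of action): (quadrant, label, stable)
    ("tight", "rule"): (1, "Designed", True),
    ("loose", "rule"): (2, "Engineered", False),
    ("loose", "task"): (3, "Applied", True),
    ("tight", "task"): (4, "Failed", False),
}

# The time dimension: the cycle runs 1 -> 2 -> 3 -> 4, then feeds back
# from 4 into a redesigned Quadrant 1'.
CYCLE = [1, 2, 3, 4]

def classify(coupling, logic):
    """Return (quadrant, label, stable) for a coupling/logic pair."""
    return QUADRANTS[(coupling, logic)]

def next_quadrant(q):
    """Return the quadrant that follows q in the cycle; 4 wraps back to 1'."""
    return CYCLE[(CYCLE.index(q) + 1) % len(CYCLE)]
```

Note that the two stable quadrants (1 and 3) are exactly those where the logic matches the coupling, and `next_quadrant(4)` returns 1, mirroring the feedback arrow into Redesign 1' discussed below.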

An example of drift

To understand when practical drift occurs in the cycle, and what a safety professional should consider to avoid a significant event, consider a hypothetical situation.

The VP of safety, health, and environment calls you into her office to discuss your idea regarding the introduction of a new safety management system (SMS). As the senior safety professional on her staff, you have raised the need for an SMS for several years because of the fragmented approach to safety used across the organization. You are assigned the responsibility of designing and implementing a new SMS.

As you begin the process, you define the safety programs and procedures you need to address. You assume the situation is tightly coupled and that a safety activity or decision in one subunit will directly affect what happens in another. You also assume employees will “follow the rules” in your new SMS.

A common mindset at this stage is that, if a significant safety incident occurs, we do not want any blame traced back to us for a poorly written procedure. So we tend to imagine worst-case scenarios and write the procedure to ensure the worst case does not occur; in essence, we “overdesign” the SMS. Thus, the design state of your planning stage is a tightly coupled, rule-based logic of action (Quadrant 1).

Once the SMS is written, it is time to go live at the operations level and move from Quadrant 1 into Quadrant 2 (i.e., Engineered), where the organization operates as designed. Operators follow the rules, but in a looser fashion than you assumed when you designed for a tightly coupled organization. In the early stages of implementation, employees obey the rules to avoid punishment. But over time, operators gain experience with the new SMS procedures and begin to take risks to get the job done. When this shift in behavior becomes commonplace, the organization moves from Quadrant 2 to 3 (i.e., Applied), and Snook’s phenomenon of practical drift sets in.

Seductive persistence of pragmatism

As Snook points out, “when the rules do not match the situation, pragmatic individuals adjust their behavior accordingly; they act in ways that better align with their perceptions of current demands. In short, they break the rules.”6 As time passes, “the seductive persistence of pragmatic practice loosens the grip of even the most rational and well-designed procedures.” Employees’ practical actions gradually drift away from your originally established SMS procedures.7

The drift is reinforced daily as employees turn to their fellow workers for advice on completing job tasks rather than relying on your SMS procedures. With each uneventful day in this loosely coupled environment, it becomes even more difficult to persuade employees of the benefits of following the rules.

In the “Applied” world (i.e., Quadrant 3), the engineered organization governed by SMS rules gives way to locally pragmatic responses by employees to their daily tasks. With time, these locally pragmatic responses become the “new” rules. Interestingly, Snook notes that in the “Applied” stage, employees act on the assumption that people outside their own work group are behaving in accordance with the original set of established rules.8

Think about it. Have you ever interviewed a victim of a serious incident and had them say, “I’m surprised those guys were breaking the rules”? Or heard the age-old axiom, “If I don’t get caught, I’m not breaking the rules”? These statements are commonly heard when practical actions have drifted far from the designed SMS rules.

Systemic nature of catastrophes

Then, at some point in this “Applied” world, a rare stochastic (random) event occurs, causing the system to rapidly contract and become tightly coupled, moving us from Quadrant 3 to 4 (i.e., a “Failed” world). When this random event occurs, we are forced to act based on the behaviors of others (interdependence). We expect others to act in accordance with originally agreed-upon standard procedures, knowing full well we rarely act in accordance with established guidelines ourselves.9

Snook notes it is the “perverse combination of practical drift and tight coupling [which] set the conditions for randomly triggered disaster.” As the organizational system moves from stability to instability, energy is created that leads to change. In the “Failed” world, this instability drives management to save the organization from a recurrence of the disaster, while overlooking the systemic nature of the disaster.10

The dotted arrow from Quadrant 4 feeds back into Quadrant 1 (now 1'), where the cycle starts over and fixes are implemented in Redesign 1'. This often leads to an overcorrection for the task-based actions by overtightening the rules, and that knee-jerk reaction provides the energy to spawn subsequent cycles of disaster.

Snook reminds us that “the tighter the rules, the greater the potential for sizable practical drift to occur as the inevitable influence of local tasks takes hold.”11 So before you write a new procedure, don’t fall prey to imagining the worst case and writing to prevent it. Get out from behind your desk and go into the field. Watch operators do their work. Solicit their opinions on how to satisfy the perceived need for the procedure. Give operators the opportunity to help you design the procedure so that they will own it when you roll it out.

  1. Snook, S.A. Friendly Fire: The Accidental Shootdown of U.S. Black Hawks Over Northern Iraq. Princeton, NJ: Princeton University Press, 2000.
  2. Ibid., p. 186.
  3. Ibid., p. 187.
  4. Ibid., p. 188.
  5. Ibid., p. 189.
  6. Ibid., p. 193.
  7. Ibid., p. 194.
  8. Ibid., p. 198.
  9. Ibid., p. 199.
  10. Ibid., p. 200.
  11. Ibid., p. 201.