ISHN exclusive interview at NSC with Captain Chesley “Sully” Sullenberger III (Part 2)
Monday morning, October 22, at the National Safety Council Congress & Expo in Orlando, we had the opportunity for an exclusive one-on-one, sit-down interview with Chesley Sullenberger III. Captain “Sully” successfully ditched US Airways Flight 1549 in the Hudson River with no loss of life on January 15, 2009, after the aircraft was disabled by a strike with a flock of Canada geese during its initial climb out of New York City.
While you were working with NASA, you co-authored a paper on error-inducing contexts. What are “error-inducing contexts”?
In aviation, it is the operational environment in which we work. It is our operational policies and procedures. It is the training that we get. It’s how we interact. It’s the human skills we use with one another in the cockpit. And it has to take into account the cognitive biases that we all have.
Interestingly, since the 2009 Hudson River landing, I’ve learned that this particular paper, which I co-authored with NASA scientists at the NASA Ames Research Center in Mountain View, California, has been translated into Mandarin. So that’s just one small indication of the global knowledge acquisition that is currently going on in China. It is going on in many industries, in many domains (job functions).
In this paper we studied how pilots sometimes fall into the trap of plan continuation errors. In other words, not changing our course of action when circumstances change or when we get new information.
There are two cognitive biases that are particularly active in these kinds of situations: one is availability bias, and one is confirmation bias. Availability bias, for example, is also a factor in medical diagnostic errors. This is where a physician might have treated many cases of malaria, sees another patient with some of the same symptoms, and quickly reaches the conclusion that this patient also has malaria.
Of course, confirmation bias is sometimes a factor in these cases, too. What we tend to do as humans is look for, give weight to, and pay attention to indications that confirm our preconceived notions. And we tend to discount, as random or as chance, factors that may indicate our diagnoses or conclusions are not correct. What we should be doing, in aviation, in medicine and in other domains, is maintain better overall situational awareness: looking at all the data, and seeing to what extent what we decided on is still correct, or whether, in light of new or better information, we need to change course.
Based on your experience, Captain, what is the most important action occupational safety professionals should take during a crisis?
Well, I’ll tell you about the three things we did on the flight (that resulted in the Hudson River landing).
This challenge that we were suddenly confronted with on January 15, 2009, was a novel and unanticipated circumstance that we were never specifically trained for. Not even in the most extreme, demanding flight training center had we practiced this particular scenario, with all these things going wrong at such a low altitude, with so little time in which to deal with it.
After almost 30 years of more or less routine flying, it was a situation where we were suddenly confronted with the challenge of a lifetime, and one we had never seen before. And ultimately, as it turned out, we had only 208 seconds, just under three and a half minutes, from the time we hit the birds and lost thrust in both engines until we had landed, to solve this problem we had never seen before.
The three things that I did that seem to have made a great deal of difference were, first, to force-calm myself. It’s the kind of professional calm that we learn to summon up from somewhere within us, which isn’t so much calm as having the discipline to compartmentalize our thinking and focus clearly on the task at hand in spite of the stress.
And that was difficult to do; it required a lot of effort, because my body’s initial physiological response to the startle factor was huge. I was aware of it in the first seconds as it happened. My blood pressure and pulse spiked. I got tunnel vision; my perceptual field narrowed because of the stress. It was debilitating at the margins; it really interfered with my mental processes.
The second thing I did was to set priorities. I knew that because of the extreme workload and extreme time pressure, I did not have time to do everything I really needed to do in that short time. And even though this was not something we were specifically trained for, based on my training and experience, I was able to quickly take what I did know and apply it in a new way to solve this problem.
The third thing I did was to workload-shed: to make my workload manageable by shedding the work I couldn’t do. I chose to do only the highest-priority items, but do them very, very well. And then I had the discipline to ignore everything else that I didn’t have time to do, which would only have been a potential distraction.
So: force-calming myself, setting clear priorities even in an unfamiliar situation, and then controlling the workload. Instead of trying to do a lot of things and not doing any of them well, I did only the absolute highest-priority things, like flying the airplane first and quickly making the decisions about where to go and land, but I did each of them very well and did not allow myself to be distracted by anything else.