What is a High Reliability Organization (HRO)? It is one in which the work is highly technical and complex, operators require a high level of technical training and certification (and often regular checking, retraining, and requalification), and the consequences of error can be catastrophic. Hence, “it has to be done right every time.”
My earliest exposure to a true HRO was with nuclear control room operators. Work in the nuclear industry led to similar research and consulting opportunities with commercial aviation. At around the same time I was doing this foundational work with nuclear operations, U.S. airlines were supplementing their “stick and rudder” technical training with crew-coordination training.
One of my mentors, the late Bob Helmreich, was one of the first to see the strong parallels between the cockpit and the medical operating room. He was among the earliest pioneers to extend the logic of our joint work on teamwork training to the surgical team.
Don’t question the boss
In the operating room there is a leader and a supporting crew. Traditionally, the lead surgeon is the “boss.” He/she might be prone to operate in “command and control” mode (like airline captains of old), telling others what to do and when to do it, and not necessarily involving others in the thought process. If someone, including the boss, made an error of commission or omission during the surgery, it might or might not be caught and corrected. There were strong implicit barriers (sometimes explicit ones) to speaking up and “correcting” the boss.
Our research in nuclear operations and in commercial aviation had pointed out clearly that human error was a primary contributing cause of accidents. In the overwhelming majority of cases, a chain of events, each event in and of itself often minor, compounded to lead ultimately to a bad outcome. In the nuclear industry in the U.S., the “events” have not led to fatalities; they might be better classified as near-misses rather than as accidents. Not so in aviation. While fatal accidents are indeed rare in U.S. commercial aviation (but more common in small private planes and in foreign carriers), they do occur, and when they do occur, error is the usual culprit.
Back to the medical environment. In times past, fatalities and less severe adverse outcomes were not very visible. They happened, to be sure. But they were not much discussed. How things have changed. According to statistics from reliable sources such as the National Institutes of Health, the Mayo Clinic, and Johns Hopkins University, medical error is the third-leading cause of death in the U.S., behind only cardiovascular disease and cancer. These sources put the number of fatalities traced to medical error at between 250,000 and 400,000 per year. Compare those figures to automobile fatalities (around 35,000 - 40,000 per year) and domestic airline fatalities (zero since 2009).
For a variety of reasons it is more difficult to assess the frequency of error-related, non-fatal “adverse events” in healthcare, but experts estimate that such events outnumber fatalities by a factor of at least ten.
Medical error is, of course, not limited to the operating room. It can occur at any point in the chain of treatment events, from initial diagnosis, to medication (including prescriptions written and filled correctly), to various forms of treatment, to discharge protocols, and on and on.
Much as the nuclear industry and the airlines became sensitized to the role of error in accidents, and to the need to institute better error-management practices, so too has healthcare begun to do the same. Not long ago I had cataract surgery. As I was being prepped for the procedure, several nurses, the anesthetist, and the ophthalmologist came to me, one at a time, looked at my bracelet, asked me my name and date of birth, then asked me which eye we were doing that day. When I answered, each put a mark with a Sharpie on my brow above the correct eye. This was essentially a five-person quality assurance process in action. They were not going to perform a “wrong site surgery,” which in my case would have been a non-life-threatening error, but in other cases could be (and has been) catastrophic.
When I discuss safety with my classes, I ask, “Who has had surgery in the last two or three years?” Usually one or two have. I ask, “Did anyone hand you a Sharpie and ask you to write on the body part they were going to operate on (or do so themselves)?” Those who have had surgery almost always say yes, and the other students look amazed.
Burnout and errors
I am currently doing research on the topic of “physician burnout.” Burnout is very common among medical doctors, especially those in primary care. Wait… they are highly educated, highly respected professionals who make a nice income, right? What do they have to be stressed over?
Well… a lot, as it turns out. One of the clear consequences of burnout is an increased likelihood of errors. A study in the Annals of Surgery, published a few years back (and since generally confirmed by other researchers), found that nine percent of surgeons who responded to the survey reported concern that they had committed a “major medical error in the last three months.” The majority of those doctors attributed the error to their own mistakes, not to “system level factors.” They just messed up. By their own acknowledgment, burnout was a significant predictor of error.
Healthcare is now addressing such challenges head-on, importing error-management concepts that EHS experts have been developing and using for many years. Error in healthcare, more than in most settings, can really be a killer.