We are a week late with our typical holiday book review! In our module on root cause analysis we spend a good deal of time discussing efficiency-thoroughness trade-offs (ETTOs). That concept, most closely associated with Erik Hollnagel, is well known to everyone in the human factors and safety fields. We've not discussed it in our Patient Safety Tip of the Week columns before because the literature has been somewhat arcane and not easy to wade through. But the concept is very important for those who need to understand how systems work, and Hollnagel has now come out with a new book (The ETTO Principle: Efficiency-Thoroughness Trade-Off. Why Things That Go Right Sometimes Go Wrong) that attempts to explain ETTOs in a simpler way. It's a paperback edition, written without footnotes, so it is a relatively easy read, though it still contains much of the jargon that dominates the human factors literature. As in any good patient safety or human factors presentation, he uses stories to get your attention and bring home his points.
He begins with the simple story of a worker who cuts off his thumb while using a circular saw. The worker knows the dangers and has the safety equipment needed (a push stick that is to be used to push the wood piece through the saw blade). But the worker realizes that he can cut many more pieces of wood without taking the time to set up the safety device (i.e., he is much more efficient), and since he has already done it many times without accident, he perceives it as relatively safe, so he trades off thoroughness for the sake of efficiency. Hollnagel stresses that in most recurrent work situations people will naturally choose the more efficient mode as long as they perceive it to be safe based on their past experience.
Sound familiar? We’ve discussed workarounds in several of our columns. The workaround is the ultimate ETTO. The worker uses a much more efficient means of getting a task done while he perceives little or no safety risk.
When you think about it, almost everything we do in our lives, whether work or play, involves the ETTO principle. We are constantly trying to balance efficiency and thoroughness. In some cases we (sometimes unfortunately) tend to weigh efficiency more heavily, whereas in others we weigh thoroughness more heavily. You've heard us quote James Reason: "correct performance and systematic error are two sides of the same coin". Hollnagel notes that the ETTO principle is really saying the same thing. It is the very things we do regularly to accomplish our goals efficiently that, under slightly different circumstances, come back to bite us. And it is human nature that, for the most part, we tend to favor efficiency over thoroughness.
Hollnagel provides numerous ETTO "rules" that serve as signals that such a trade-off is being made. You'll recognize most of them from our discussions on workarounds and root cause analyses:
“It looks fine”
“It is not really that important”
“It is normally OK, there is no need to check”
“I’ve done this a million times before, so trust me…”
“It normally works”
“It’s been checked earlier by someone else”
“It’ll be checked again later by someone else”
“It’s good enough for now”
“This way is much quicker”
“There’s no time to do it now”
“We must not use too much of X”
“I can’t remember how to do it (and I can’t be bothered to look it up)”
“It looks like a Y, so it probably is a Y”
“We must get this done”
“It must be ready in time”
“If you don’t say anything, I won’t either”
“I’m not the expert on this, so I’ll let you decide”
Hollnagel provides lots of real-life examples of ETTO'ing in multiple industries, all of which have applicability to healthcare. But he also provides some healthcare-specific examples. One of the most compelling stories is that of a patient who received inappropriate chemotherapy. The patient was a farmer who had delayed seeing a physician because he did not want to skip the "calving" season on the farm. He was diagnosed as having gastric carcinoma and scheduled for chemotherapy. He was anxious to begin the chemotherapy so he would be ready for the upcoming hay-drying season on the farm. The oncologist was anxious to start it because he was going to a conference for two weeks. The chemotherapy was begun and no one ever noticed that the final pathology report had shown a lymphoma rather than gastric carcinoma (the treatment, response to treatment, and prognosis for a lymphoma are much different from those for a gastric carcinoma). He died after 5 months, having received the wrong chemotherapy. Hollnagel outlines several of the ETTOs obvious in the case, all of which favored efficiency over thoroughness. The patient was first seen on a day the oncologist was seeing 35 other patients, a practice that oncology clinic had obviously adopted for the sake of efficiency. The chemotherapy was scheduled prior to receipt of the final pathology report, again for efficiency's sake, because both the physician and patient wanted to get started for the above-mentioned time pressures. And the pathology report (which apparently did return prior to onset of the chemotherapy but was never noticed) had the final diagnosis typed in small font on the second page, whereas the oncologist's clinical diagnosis appeared in large font on the front page. Another ETTO favoring efficiency was that the pathologist did not make a separate phone call to the oncologist when a disparity between the clinical and pathological diagnoses was present.
Another example with good lessons learned was a near-miss aviation situation. A vintage plane used by a sight-seeing company had inaccurate fuel gauges, so the crew had a policy of always dipping the fuel tanks to verify fuel before flights. The plane usually held most of the fuel in the main tanks and enough reserve fuel for 45 minutes of flight in the auxiliary tanks. On one particular day, the situation was reversed – the auxiliary tanks held all the fuel needed for that day's planned flight and the main tanks held the reserve fuel. During the briefing, the information about the reversed fuel situation was mentioned but never emphasized. When the pilot asked the technician if the main tanks had been dipped, the answer was "yes" (but, again, there was no discussion about where the main fuel source for that day actually was). Shortly after the flight began, they had engine problems and could not restart the engines despite attempts to switch back and forth between main and auxiliary tanks. They made a successful emergency landing. Lots of ETTOs in this one! The failure to emphasize the reversal of the fuel allocation led the pilots to assume all was as usual. Because they knew the fuel tanks had been dipped by someone else, they did not dip them themselves each time the plane flew that day. Flying with known inaccurate fuel gauges was tolerated, and the workaround was dipping the tanks prior to takeoff to verify the tanks were filled. You may recognize the latter phenomenon as "normalization of deviance", where an organization comes to accept a deviation from best practice as the new standard practice (i.e., a workaround becomes the norm). How many of you know of examples in your organizations where a workaround around a faulty piece of equipment or a faulty procedure has now become accepted "standard" practice? If you start looking today – you'll find them!
Hollnagel even talks about how the ETTO principle applies to root cause analyses. In an RCA we continually ask the question "why" over and over again. He notes that in any investigation you must apply a "stop rule" in which you establish criteria for ending your investigation. That is really the ETTO principle: you are trading off thoroughness against efficiency. You have to expend resources (time, personnel, money) to continue the investigation, but you also have a need to fully understand the causes of the accident in order to prevent future recurrences. He also points out that our bias toward investigating cases with bad outcomes (our "failures") rather than our successes is itself an ETTO.
Hollnagel's book has good discussions about the strengths and flaws of the various theories of accidents. He points out the complexities that make finding simple causes unlikely and the fallacies of approaches that look for them. He also emphasizes some of the inherent biases and faults of root cause analysis. In particular, he talks about how most accident theories apply best to "tractable" systems (meaning those systems that are readily predictable and manageable) and fail to take into account novel factors (which, by definition, could not have had a root cause since they did not exist prior to the event!). And he makes the case that the systems most prone to such unpredictability are those that use RCAs the most: hospitals and healthcare!
He emphasizes that rather than focusing on failures when we review cases, we should look at what normally should take place. Then we should review what ETTOs took place and why things that usually go right went wrong in the particular case. To illustrate his point, Hollnagel describes in detail a trauma case admitted to a hospital in which the multiple physicians caring for the patient all focused on the patient's head injury and assumed that one another were addressing other issues, thereby missing a hip dislocation that led to loss of blood supply and the eventual need for a hip replacement.
But we are still left wondering where ETTO fits into the patient safety world. We liken it to PET scanning. PET scanning was a great technology that spent years looking for a clinically useful application. We think the ETTO principle is the same: a great concept looking for a useful application. While an understanding of the ETTO principle is very useful in helping us to understand behaviors when analyzing events that have already happened, we think that its greatest utility will be in prospectively managing processes – in designing new processes, applying the ETTO principle to predict responses in various scenarios. Hollnagel apparently feels the same way and discusses potential applicability of the ETTO principle in forward-looking methods like Probabilistic Risk Assessment (PRA) or Human Reliability Assessment (HRA). Those of us who have been involved in the design of clinical computing systems readily recognize the importance of understanding how individuals are likely to react in certain situations. We know that ETTOs are likely to occur, and that needs to be an integral part of such planning and design.
Hollnagel E. The ETTO Principle: Efficiency-Thoroughness Trade-Off. Why Things That Go Right Sometimes Go Wrong. Burlington, VT: Ashgate Publishing Company; 2009. http://erik.hollnagel.googlepages.com/theettobook