We’ve spent the last two columns (and several columns over the past three years) discussing valuable lessons from aviation tragedies that can be applied to healthcare. In fact, since the late 1980s we have used the aviation industry as a model for patient safety. Everyone involved in patient safety recognizes the rapidly growing number of effective measures that have been demonstrated to minimize errors and improve outcomes. But we have all been disappointed by our overall inability to significantly reduce the number of patients harmed each year by potentially avoidable adverse events.
John Nance’s new book “Why Hospitals Should Fly” really hits the target. While Nance masterfully weaves many best practices from IHI, NQF, AHRQ, Joint Commission, ISMP and others into his fictional “St. Michael’s”, his theme is that we have fallen short because we have failed to convert to a culture of safety. That theme boils down to the maxim that “culture kills strategy every time”.
Nance describes the sentinel event that really led to cultural change in aviation: the 1977 Tenerife accident. In that incident, a KLM jumbo jet attempting to take off crashed into a Pan Am jumbo jet that had not yet cleared the runway, resulting in 583 deaths. He describes in detail many of the contributing factors, including time and monetary pressures, bad weather, poor visibility, language problems, excess fuel, and communication problems. But most importantly, the pilot of the plane attempting to take off was “the best and brightest”. Captain Jacob van Zanten was the Chief Pilot for KLM, a Vice President, head of their safety program, a veteran pilot with over 30 years of experience, and the poster child in their ad campaign. Two other crew members had concerns about the attempted takeoff but each acquiesced to the captain’s wishes. The fact that this accident could happen to people with outstanding records forced everyone in the industry, over the next decade, to focus on systems of safety, develop crew resource management (CRM), dismantle the powerful hierarchical structure in the cockpit, and make other changes that helped mold a totally new culture of safety and teamwork in which all parties on a flight had an equal voice.
Nance then demonstrates, through the eyes of a physician visitor to St. Michael’s, all the lessons learned from aviation that were applied to healthcare and changed the culture to one of “total, egoless support of the common goal (taking safe, effective care of our patients)” in a system in which the whole team acknowledges that errors will occur and team members will catch each other’s mistakes before harm comes to patients. It is a high reliability organization and a learning organization in which all members take as much pride in learning from their mistakes as in celebrating their successes. It is a culture in which teams are empowered and encouraged to do a root cause analysis on the spot and make changes to the system immediately (à la Toyota/Lean thinking concepts).
It is a culture in which every member approaches each task thinking “could what I am about to do be wrong?”. Essentially, in the fictional hospital a 50-50 rule is applied to everything but medications: expect that what you are about to do has a 50% chance of being in error and harming a patient. He uses a striking “see-saw” analogy: something weighted 90-10 (i.e. a 90% chance of doing it correctly) leads to a perception that nothing is likely to go wrong, whereas the 50-50 rule makes you question virtually every process you are involved in. For medications, he argues, you must assume that every process is going to result in error. He also borrows the “last chance, best chance” concept from law for avoiding accident or injury, using the surgical timeout as an example of how this “last chance” may be the “best chance” to avoid wrong-site surgery.
He provides many examples of good team communication and removal of the hierarchical structure. He discusses how one “bad apple” can spoil the communication culture and notes the studies on disruptive behaviors (among both physicians and nurses). And though he gives some vivid descriptions of such “bad apples”, he makes a great point: it is not just the 5% at the “bad end” of the curve that is the problem. Rather, he stresses that the large, silent “middle of the curve” is what most needs shifting to new attitudes and a new culture of safety. He also uses an amusing “bucket of crabs” analogy, in which a crab cannot escape a bucket because all the other crabs will pull it back in!
He stresses barrierless communication and the fact that physician autonomy must be subjugated in order for barrierless communication to take place. In his discussion of standardization and practice variation, he makes the point that “patients do not grant physicians the right to gamble their welfare just so a physician can demonstrate his autonomy”. And he nicely describes the distinction between being a leader and being a commander (using the Star Trek analogies he employs often in this book!). It is a culture where everyone is encouraged to speak up and is rewarded for speaking up, even if they turn out to be wrong.
He, of course, talks about good patient safety practices and concepts such as use of checklists and bundles, standardization, a fearless error reporting system, structured handoffs, briefings and debriefings, readback, shadow-a-colleague for a day, staff empowerment, normalization of deviance, Just Culture, the James Reason “Swiss cheese” model of error defenses, and many other things we talk about in patient safety circles.
He discusses failures in perception, assumption, and communication as “the three tiers” that a patient safety culture must address to be successful. His discussion of assumptions (see also our discussion of assumptions in last week’s Patient Safety Tip of the Week “Learning from Tragedies. Part II”) is excellent. In a safety culture it boils down to this: if you need to assume something, always assume the negative.
In the communication discussion, he points out studies showing that 12.5% of the time we do not understand what someone is actually saying, even when we speak the same language and dialect and share the same education and profession. As we have advocated in several of our columns, his St. Michael’s videotapes OR cases so that problem areas in communication can be dissected and used constructively to improve teamwork and communication. The “CEO” of St. Michael’s does a nice job demolishing the arguments typically used to avoid videotaping. Videotapes are great for showing what actually went on, not what you think went on, and for demonstrating the old adage that “90% of communication is non-verbal”.
There is, of course, discussion of sterile cockpit analogies in medicine. In addition to some of the examples we have given in the past, St. Michael’s has some offshoots: the “zero-exceptions” rule for bedside barcoding and the “no-interruptions zone” when a nurse is preparing, dispensing, administering, or otherwise handling medications. The latter even includes a signal, a red towel over the nurse’s left shoulder, that tells everyone not to interrupt him or her.
And, speaking of nursing, St. Michael’s has created an atmosphere that promotes nurses as full equals with strong voices, empowered to make changes as needed, and freed up to provide care where it is needed the most – at the bedside.
So do we think Nance’s “St. Michael’s” is achievable? Can it be done overnight? Do we have to wait for a new generation of healthcare workers? The question should not be “can it be done?”; we simply must do it. Nance has nailed it when he says our failure to change the culture is central to our failure to impact patient safety. Will it happen overnight? No. In the airline industry it took a decade or more. Will it require a new generation? Not necessarily, but there will be some “casualties” along the way. At St. Michael’s they had to let several of their top surgeons go because those surgeons could not fit into the new culture.

But we need to be developing that next generation anyway. Our professional school curricula need dramatic changes. While we teach patient safety concepts to our medical students and residents, we do not yet have any significant curricular interaction among medical, nursing, and pharmacy students. Our simulation training usually focuses on interaction directly with patients; we also need simulation training that puts our students together with students in the other healthcare professions they will work alongside for the rest of their careers. Instead of demanding “perfection” of all our students, we need to train them to understand that they will make errors and that the best way to prevent those errors from harming patients is to have teams that work together to recognize and mitigate them. No longer can we focus our teaching efforts solely on helping our students and residents earn top scores on the National Boards or their specialty board exams. Just as importantly, we need to restructure the finance side of healthcare so that incentives are aligned for all healthcare workers to produce better outcomes than we see today and so that the behaviors needed to achieve those outcomes are rewarded.
We probably overuse the “must read” label. But we have no qualms about using that label for this masterful book by John Nance. You won’t be sorry you read it. It is easy reading, thanks to his style and its quasi-fictional nature. But you’ll readily recognize all the characters in the book (they all have a counterpart in your hospital!) and you’ll realize why you will never impact medical error and patient safety without changing the way those characters interact.
Nance JJ. Why Hospitals Should Fly: The Ultimate Flight Plan to Patient Safety and Quality Care. Bozeman, MT: Second River Healthcare Press; 2008.