We had planned this week to move away from our recent theme of the culture of patient safety. However, we came upon a great new paper discussing aviation safety concepts and their current or potential applications in healthcare (Lewis 2011). This follows a recent debate in the British Medical Journal as to whether we have gone too far in applying aviation analogies to healthcare (Rogers 2011; Gaba 2011). With the added weight of the new Lewis paper, we'll declare Gaba the winner of the debate: we still have a lot to learn from the aviation industry that can be applied to healthcare.
And the senior author on the new paper is none other than James Bagian, the former astronaut and NASA engineer turned VA patient safety guru. The title “Counterheroism, Common Knowledge, and Ergonomics: Concepts from Aviation That Could Improve Patient Safety” reveals the three themes that emerge from a group of 15 aviation safety concepts that we can better apply to healthcare.
The first theme, counterheroism, is one that evolved in aviation only over a long period of time. Anyone who has seen the movies "The Right Stuff" or "Top Gun" remembers the daring, swashbuckling style of early aviators, which fostered a culture of "can do" or "change in mid-air" that encouraged workarounds and novel fixes to problems. But that culture gradually gave way in commercial aviation to one in which such ad lib solutions were no longer considered "heroic" and following structured, standardized procedures became the norm. Basically, the new culture of safety emphasizes the role of teams and the system as a whole.
In healthcare, where the medical profession has always glorified innovation and
“problem solving” as highly desirable traits, it’s no surprise that resistance
from physicians has been a barrier to change. Of course, if the focus is really on problem solving, one should readily appreciate that teamwork is more likely to achieve success than solo efforts.
Some of the aviation safety concepts under the
counterheroism theme are use of checklists, alternating roles, the
first-names-only rule, and crew resource management (CRM). Alternation of roles
(e.g., the pilot and first officer take turns flying the aircraft) and the first-names-only rule are concepts that tend to flatten the hierarchy and make it much
more likely that any individual on the team will feel free to speak up when a
potential safety hazard is identified.
The second major theme is common knowledge. Examples in aviation are the joint safety briefings
for pilots and crew before flights, publicizing detailed minimum safety
requirements, and rules such as the “sterile cockpit” rule or the “bottle to
throttle” rule. We’ve emphasized the equivalent of the sterile cockpit rule in
healthcare in multiple columns (see prior aviation-related patient safety
columns listed at the end of this column). Once a rule becomes common knowledge, any team member who sees it being violated is authorized to bring that to the violator's attention.
The third theme is ergonomics or human factors engineering, including designs that mistake-proof a process or employ forcing functions. Use of standardized layouts is important so that pilots moving from plane to plane are not confused by the location of controls and dials. We've previously noted the problems that arise when different ventilators or monitoring equipment are used in different ICUs and staff have to float between those ICUs. Their example of mistake-proofing is the automatic locking of the landing gear lever in the down position whenever the plane is on the ground. In healthcare, we mistake-proof, for example, by preventing connection of catheters to the wrong sites or gas lines to the wrong gas outlets.
Their aviation example of a forcing function is the Traffic Alert and Collision Avoidance System, which identifies a potential collision and orders the pilots to take appropriate action. The healthcare equivalent might be requiring a manual override before a potentially dangerous medication order can be administered.
Another example is flight envelope protection, which we might liken to a form of constraint: the system prevents the pilot from doing something with the plane that is beyond safe limits. The example given in the paper is a computer system that negates a command from the pilot that would pitch the aircraft beyond the stalling angle. Their healthcare example is a system that prevents too high a concentration of oxygen from being delivered to a patient with type 2 respiratory failure (i.e., one prone to hypercarbia).
The paper goes on to describe the cost-benefit analysis of implementing the various safety measures and many of the barriers likely to be encountered from physicians when attempting such changes.
One of the most interesting concepts is the black box. That, of course, is the device that records the flight data, cockpit conversations, and other information needed to investigate crashes. They describe the resistance encountered from pilots when these systems were introduced because the crew knew that everything was being recorded. They note that such recording may lead to more civility between pilots and between pilots and air traffic controllers. But the main utility of such recording systems is the ability to identify anomalies and go back to discuss why they occurred so as to prevent similar occurrences in the future. They suggest a theoretical potential for a healthcare black box: recording telephone conversations. We have also advocated recording the surgical time-out (or other events) in the OR with the intent of using the playback in a constructive manner to improve teamwork. Similarly, we have mentioned in several prior columns the Line Operations Safety Audits (LOSA) done in aviation. In these, an observer sits alongside the crew on a flight, records actions and interactions, and later provides feedback to the crew in an attempt to improve operations.
By the way, the interview with James Bagian (Schulz 2010) is well worth reading. Most of his comments are in keeping with the theme that the biggest difference between aviation and healthcare is cultural. Bagian also points out that the aviation industry spends a much higher percentage of its budget on safety and training than healthcare does.
And what about the debate in the British Medical Journal?
How could you possibly think we have gone too far with the analogies to
aviation? We don’t really think that James Rogers, who takes the “yes we have
gone too far” side of the debate, is discounting the valuable lessons that can
be learned from aviation. Rather, he points out areas in which differences
between healthcare and aviation call for different types of safety
interventions. But one of the biggest differences, he concedes, is that there
are more “authority gradients” in healthcare, the very theme we have been
emphasizing in our recent columns.
Our prior Patient Safety Tips of the Week on aviation safety lessons for healthcare:
May 18, 2010 "Real-Time Random Safety Audits"
June 2, 2009 "Why Hospitals Should Fly…John Nance Nails It!"
May 26, 2009 "Learning from Tragedies. Part II"
May 19, 2009 "Learning from Tragedies"
October 2, 2007 "Taking Off From the Wrong Runway"
August 28, 2007 "Lessons Learned from Transportation Accidents"
May 15, 2007 "Communication, Hearback and Other Lessons from Aviation"
January 2010 "Crew Resource Management Training Produces Sustained Results"
References:
Lewis GH, Vaithianathan R, Hockey PM, Hirst G, Bagian JP. Counterheroism, Common Knowledge, and Ergonomics: Concepts from Aviation That Could Improve Patient Safety. The Milbank Quarterly 2011; 89(1): 4–38
http://www.milbank.org/quarterly/8901feat.html
Schulz K. The Wrong Stuff: What it Means to Make Mistakes. Risky Business: James Bagian—NASA astronaut turned patient safety expert—on Being Wrong. Slate 2010; posted June 28, 2010
Rogers J. Head to Head. Have we gone too far in translating ideas from aviation to patient safety? Yes. BMJ 2011; 342: c7309; published online 14 January 2011
http://www.bmj.com/content/342/bmj.c7309.full
Gaba DM. Head to Head. Have we gone too far in translating ideas from aviation to patient safety? No. BMJ 2011; 342: c7310; published online 14 January 2011
http://www.bmj.com/content/342/bmj.c7310.full