Patient Safety Tip of the Week

 

May 19, 2009        Learning from Tragedies

 

 

Those of you who are regular readers of this column know we often use lessons learned from NTSB (National Transportation Safety Board) accident reports and try to apply them to healthcare. No one wants to see tragedies, whether in aviation (or other transportation) or in healthcare, but the greater tragedy is failing to apply the lessons learned so that similar tragedies can be prevented. So this month we are looking at lessons learned from two major fatal aviation accidents. One was the crash of Continental Flight 3407, which occurred near where we are based in Western New York. The other was the 2006 mid-air collision of two airplanes over Brazil.

 

We’ll start with the latter. The incident is superbly chronicled, and the chain of events recreated with humanization of the key players, by William Langewiesche in the January 2009 issue of Vanity Fair (Langewiesche 2009). The pilots were flying a brand-new luxury Legacy 600 private jet from Brazil back to the United States when it collided at 37,000 feet with a Brazilian Boeing 737. All 154 people aboard the Boeing 737 died, but everyone aboard the smaller Legacy 600 survived. The investigation revealed the typical cascade of errors and circumstances that allowed the fatal collision to occur. Just as in medical sentinel events, breakdowns in communication were significant contributing factors.

 

The key elements of the crash are as follows. The Legacy 600’s transponder was inadvertently turned off. The transponder is the device that allows air traffic controllers to accurately identify planes and get an accurate reading of their altitude. It is also necessary for operation of TCAS (the traffic alert and collision avoidance system), which prevents two planes from colliding. In addition, communication with air traffic control was lost for a considerable period of time. As a result, air traffic control did not realize that two planes were converging at the same altitude, and TCAS never alerted the pilots that they were about to collide.

 

There are many obvious analogies to adverse incidents in medicine. Just as we have noted that a particularly risky time is when equipment comes out of maintenance (see our August 7, 2007 Patient Safety Tip of the Week on the Role of Maintenance in Incidents), new equipment poses especially risky circumstances. Unfamiliarity with the equipment can lead to many sorts of errors. The new Legacy 600 had numerous high-technology features, many of them “nested” in various modes on the dials in the cockpit. Even though the pilots had received training and made several test runs on this plane, they were still relatively new to it and did not have ready knowledge of all its features. At one point they even had to page through multiple computer screens just to find “time to destination”.

 

Sound familiar? How often do we trot out the newest and fanciest equipment in our ORs and ICUs and expect that, with minimal training, staff will be able to perform without glitches? This is another great argument for standardization. Airlines with proven success stories, like Southwest, have used one or just a few plane designs so that their pilots have little difficulty moving from one plane to another.

 

But what about all the high-tech bells and whistles here? Lessons from other industries have provided ample warning that the introduction of new technologies creates opportunities for new types of errors and other unintended consequences. In our December 16, 2008 Patient Safety Tip of the Week “Joint Commission Sentinel Event Alert on Hazards of Healthcare IT” we noted that Charles Perrow, in his classic book “Normal Accidents” (Perrow 1999), describes how new technologies often simply push the envelope, citing as an example how the introduction of maritime radar merely encouraged boats to travel faster and did little to reduce the occurrence of maritime accidents. Aviation is similar. In the not-so-distant past, the standard vertical clearance between planes was 2000 feet. When newer, more sophisticated instruments became available, that vertical clearance was reduced to 1000 feet. Ironically, as Langewiesche points out in the current incident, the new equipment may actually have enabled the midair collision. When equipment was less precise, there was enough variation just by chance that two planes flying at “37,000 feet” were unlikely to actually be at exactly the same altitude. With the newest altimeters, GPS devices, and autopilot systems, the accuracy is so good that both planes can hold that exact altitude, thereby increasing the chance of collision.
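 

To make that point concrete, here is a minimal simulation sketch. The error figures are purely illustrative assumptions on our part, not actual avionics specifications; the point is only how tightening altitude-holding error raises the chance that two aircraft assigned the same flight level actually occupy it.

import random

def near_same_altitude(err_sd_ft, trials=100_000, threshold_ft=50):
    """Estimate how often two aircraft assigned the same flight level
    end up within threshold_ft of each other vertically, given an
    assumed altitude-holding error (standard deviation, in feet)."""
    hits = 0
    for _ in range(trials):
        a = random.gauss(37_000, err_sd_ft)  # plane A's actual altitude
        b = random.gauss(37_000, err_sd_ft)  # plane B's actual altitude
        if abs(a - b) < threshold_ft:
            hits += 1
    return hits / trials

# Illustrative error figures only (not real avionics specs):
print(near_same_altitude(150))  # looser older instruments: ~0.19
print(near_same_altitude(10))   # precise modern autopilot: ~1.0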

 

You are all well aware that about three-quarters of sentinel events reported to the Joint Commission involve problems with communication. This aviation disaster had multiple communication problems. One is the language barrier. Though English is the “universal” language of aviation, problems arise from accents, pronunciation, nuances of words and phrases, and cultural differences. In the Brazil collision, there were several times when the Legacy 600 pilots did not understand what the air traffic controller had said. And, for whatever reason, they did not seek clarification. In healthcare, we emphasize the need for both “readback” and “hearback”. Readback is typically used when someone taking verbal or telephone orders reads the orders back to the person who gave them, often spelling out specific terms. Verbal and telephone orders should be avoided whenever possible, but there are times when they are unavoidable, and that is when readback is critical to minimizing the chance of error. “Hearback”, on the other hand, is a little more complex. While it does involve repeating, to a degree, something that has been said, it is also a statement of understanding about what was said or intended. Particularly in healthcare, where we often use terminologies that are not well standardized, we need to convey back our level of understanding.
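 

For illustration only, here is a minimal sketch of how a readback requirement might be modeled in software. The function and message wording are our own invention, not any actual order-entry system:

def accept_verbal_order(order_as_spoken: str, readback: str) -> bool:
    """Accept a verbal order only after the receiver's readback matches
    what was spoken; any mismatch forces the exchange to be repeated.
    (Real readback practice also spells out drug names and says digits
    individually, e.g. 'one-five' rather than 'fifteen'.)"""
    if readback.strip().lower() != order_as_spoken.strip().lower():
        print("Readback mismatch -- repeat the order before proceeding.")
        return False
    print("Readback confirmed; order may be transcribed.")
    return True

accept_verbal_order("metoprolol 25 mg PO twice daily",
                    "metoprolol 25 mg PO twice daily")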

 

While communication is a problematic area, a more subtle one is recognizing a lack of communication. It is, of course, more difficult to recognize something negative than something positive. In this incident, both the pilots and the air traffic controllers went through long periods with no radio communication. Part of “situational awareness” should include being attuned to what is not happening. Gary Klein (see our May 27, 2008 Patient Safety Tip of the Week “If You Do RCA’s or Design Healthcare Processes…Read Gary Klein’s Work”) graphically describes how fire captains responding to a fire often sense that disaster is imminent not by what they see (or hear or smell or feel) but by what they do not see. Similarly, in healthcare we need to pay constant attention to the variety of cues in our environment and actively seek out those things that should be there but are missing.
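 

In engineering terms, noticing “what is not happening” is a watchdog problem. A minimal sketch, with the names and the five-minute interval as our own illustrative assumptions:

import time

class SilenceWatchdog:
    """Treat silence itself as a signal: raise a flag when an expected,
    recurring communication (a radio call, a monitor reading, a report)
    has been absent for longer than the allowed interval."""

    def __init__(self, max_silence_s: float):
        self.max_silence_s = max_silence_s
        self.last_heard = time.monotonic()

    def heard_from(self) -> None:
        self.last_heard = time.monotonic()  # any contact resets the clock

    def silence_exceeded(self) -> bool:
        return time.monotonic() - self.last_heard > self.max_silence_s

# Usage sketch: poll silence_exceeded() in a monitoring loop and
# escalate the first time it returns True.
watchdog = SilenceWatchdog(max_silence_s=300)  # e.g. five minutes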

 

Just as in healthcare, handoffs and changes of shift occur in the airline industry. During the flights of both the Legacy 600 and the Boeing 737 there were handoffs as the air traffic controllers changed shifts and as the planes passed from one air traffic control territory to another. The article gives only a few details about these handoffs, and it is not clear whether they follow a structured format such as SBAR (see our September 30, 2008 Patient Safety Tip of the Week “Hot Topic: Handoffs”). But clearly some vital information was omitted or overlooked during these handoffs, and some erroneous information (most notably the altitude of the Legacy 600) was passed on. In healthcare, handoffs should not only follow a structure that helps ensure completeness and relevance, but should also be carried out under optimal conditions in an interactive fashion where all parties can, and do, ask questions.
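 

The value of a structured format is essentially what a required-fields check gives you in software. A minimal sketch, assuming an SBAR-style record (the field contents below are illustrative):

from dataclasses import dataclass, fields

@dataclass
class SbarHandoff:
    situation: str        # what is happening right now
    background: str       # relevant history and context
    assessment: str       # what the sender thinks the problem is
    recommendation: str   # what the sender wants done

def missing_elements(h: SbarHandoff) -> list:
    """Return the names of any SBAR elements left empty, so an
    incomplete handoff is caught before it is handed over."""
    return [f.name for f in fields(h) if not getattr(h, f.name).strip()]

h = SbarHandoff("still at 37,000 ft, no descent clearance received",
                "", "possible lost communications", "re-establish contact")
print(missing_elements(h))  # ['background'] -- the omission is visible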

 

Problems with alarms are an issue identified in many of our healthcare root cause analyses. In this aviation incident there was no evidence of alarms being purposefully turned off, as we often see in healthcare. Rather, the problem here seems to lie in the design of the systems. When the transponder in the Legacy 600 turned off (it is not clear whether one of the pilots inadvertently turned it off or the aircraft’s technology somehow did), a small warning appeared on the two radio management units. That warning was simply an abbreviation for “standby”, and it appeared silently, without any audible alert. A small warning also appeared on each pilot’s Primary Flight Display indicating that TCAS was off; TCAS can only be on when the transponder is active. The change in TCAS status likewise appeared without an audible alert. With the transponder and TCAS off, other planes could not sense the presence of the Legacy 600, so they would not get a collision avoidance warning if they neared it, just as the Legacy 600 would get no such warning itself. And with the transponder off, the actual altitude of the Legacy 600 was not accurately conveyed to the air traffic controllers on the ground. We don’t know enough about aviation to understand the design of such systems, but we certainly wonder why you would ever want the transponder and TCAS to be “off” once the plane is airborne and in motion. Even if there is a reason for an “off” option, there certainly should be an audible alert that gets the pilots to focus on the status of the transponder and TCAS. We’ve addressed alarm issues previously in our Patient Safety Tips of the Week for March 5, 2007, March 26, 2007, April 2, 2007, and April 1, 2008.
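 

The design principle suggested here is simple: classify each condition by criticality and never let a safety-critical state change be signaled visually alone. A hedged sketch; the classification and names are illustrative, not drawn from any real avionics or monitoring system:

# Illustrative classification -- which conditions count as critical
# would come from a real hazard analysis, not from us.
CRITICAL_CONDITIONS = {"transponder off", "TCAS off"}

def annunciate(condition: str) -> None:
    """Display every status change, but require an audible alert (and
    an acknowledgment) for safety-critical conditions -- a silent
    abbreviation on a small display does not capture attention."""
    print(f"[display] {condition}")
    if condition in CRITICAL_CONDITIONS:
        print(f"[AUDIBLE ALARM] {condition} -- acknowledgment required")

annunciate("transponder off")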

 

Air safety is also plagued by what are called “automation surprises”: incidents in which computers and other high-tech instruments, working in the background, control or correct various conditions that pilots in the past would have managed, and been aware of, themselves. Autopilots are now so sophisticated that changes are made very subtly, and pilots may be unaware of them until they turn the autopilot off. In the Legacy 600, the autopilot at one point appropriately changed the course of the airplane per the flight plan. The new course should have been accompanied by a change in altitude to 36,000 feet (by convention, planes cruising in an easterly direction are assigned “odd” altitudes and those cruising in a westerly direction are assigned “even” altitudes). But the altitude is not to be changed until directed by the air traffic controller, and in this case communication with air traffic control had broken down, so the Legacy 600 remained at 37,000 feet. In healthcare, the design of systems must take into account the possibility of automation surprises (see our November 6, 2007 Patient Safety Tip of the Week “Don Norman Does It Again!”).
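 

The odd/even convention is a simple parity rule on course. A simplified sketch of the ICAO semicircular rule (real-world rules have regional and national exceptions):

def cruise_level_parity(magnetic_course_deg: float) -> str:
    """Simplified ICAO semicircular rule: courses 000-179 degrees
    (eastbound) get odd flight levels; courses 180-359 degrees
    (westbound) get even flight levels."""
    if magnetic_course_deg % 360 < 180:
        return "odd levels (e.g. 35,000 or 37,000 ft)"
    return "even levels (e.g. 36,000 or 38,000 ft)"

# The Legacy 600's new northwesterly course called for an even level
# (36,000 ft), but the change still required an ATC clearance that,
# with communication lost, never came.
print(cruise_level_parity(310))  # even levels ...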

 

 

Multiple other conditions contributed to the adverse outcome here. Though there do not appear to have been any significant time pressures, there were other types of “pressure”. One of the passengers on the Legacy 600 was a writer for the New York Times, and both the company that bought the jet and the company that made it (which also had a representative onboard) obviously wanted to impress him. So some of the conversations and cockpit intrusions during the flight occurred because of the presence of these passengers. One is reminded of the serious incident in which the US submarine Greeneville, showing off its capabilities to a group of important civilian visitors, accidentally struck the Japanese fisheries training vessel Ehime Maru, killing nine people aboard. Though the exact role of the civilian visitors in that incident is unclear, it was suggested that they might have served as a distraction or may have indirectly influenced some of the actions the captain and crew took. Does this sort of thing happen in healthcare? Yes, there are times when visitors or media attend or witness live medical interventions. One must recognize such events as having inherent dangers and be extremely wary not to cut corners, get distracted, or be overly aggressive in trying to make a point. Similarly, there are times when intrusions into the OR or the pharmacy cause distractions that facilitate errors.

 

Assumptions may also have played a role. It has been speculated that the Legacy 600 pilots assumed the air traffic controllers were “doing them a favor” by allowing them to stay at 37,000 feet. Apparently, the odd-vs-even altitude convention for eastbound and westbound flights is sometimes purposefully set aside by air traffic controllers under certain circumstances, perhaps leading to the false assumption in this case. In healthcare, a cardinal rule is to never assume that someone else has done something without verifying it.

 

So this accident, despite its tragic outcome, offers numerous lessons not only for aviation but also for healthcare.

 

Next week we will discuss some of the lessons learned from the investigation of the crash of Continental Flight 3407 in Western New York. Our regular readers also know we often do a book review around the holidays. We had intended to review “Nudge” by Richard Thaler and Cass Sunstein but, in view of these new columns on lessons learned from aviation disasters, we may instead review John Nance’s new book “Why Hospitals Should Fly”. In any event, we will review both books at some point, because each has important lessons for healthcare and patient safety.

 

 

References:

 

 

Langewiesche W. The Devil at 37,000 Feet. Vanity Fair, January 2009.

http://www.vanityfair.com/magazine/2009/01/air_crash200901

 

 

Perrow C. Normal Accidents: Living with High-Risk Technologies. Princeton, NJ: Princeton University Press, 1999.

 

 

Wikipedia. Ehime Maru and USS Greeneville collision

http://en.wikipedia.org/wiki/Ehime-Maru

 

 

Klein G. Sources of Power: How People Make Decisions. Cambridge, MA: MIT Press, 1999.