In our December 17, 2013 Patient Safety Tip of the Week “The Second Victim” we noted that we’d be looking at the many analogies between the tragic Asiana plane crash in San Francisco this past July and healthcare incidents. Our regular readers know we often use aviation analogies to help us understand root causes that underlie many of the adverse events we see in healthcare. The current NTSB investigation of the Asiana Flight 214 crash is loaded with examples that apply equally to aviation and healthcare, including training issues, learning curves, failure to heed alarms, failure to buck the authority gradient, cultural barriers, automation surprises, safety systems temporarily unavailable while under repair, and many more. Most of the important findings can be found in the Operations Group Chairman Factual Report and Interview Summaries, though important details can be found in the many documents included in the docket of the NTSB investigative hearing.
On July 6, 2013 Asiana Airlines Flight 214 crashed while landing at San Francisco International Airport (SFO), resulting in 3 deaths and over 200 injuries among the 307 passengers and crew aboard the aircraft. Basically, the crash occurred because the aircraft’s speed was far below the recommended landing speed. As a result, the plane descended too rapidly and hit a seawall just short of the runway; the rear of the fuselage broke off and the plane cartwheeled and then burned. It was actually miraculous that only 3 of the passengers died in this horrific crash.
The NTSB investigation, which should still be considered a work in progress at this point, did not uncover any significant mechanical, structural, or maintenance issues with the aircraft. Nor did it find any significant contribution from crew fatigue or distraction by personal computing devices, factors found to be contributory in many recent transportation accidents. Weather did not seem to be a factor either.
But reading through the NTSB investigation documents and the media reports about the crash, one finds multiple root causes identified, and most have analogies to incidents we see in healthcare:
The Learning Curve
The pilot flying the airplane at the time of the crash was an experienced pilot but was actually a trainee pilot on this particular plane. He had less than 45 hours of flight time on this model. In addition, he had not flown to the San Francisco airport since 2004. Note also that the supervising pilot was apparently making his inaugural flight as a trainer.
The healthcare analogy is the experienced surgeon who is just beginning to perform procedures with a new technique. We saw complication rates soar when experienced surgeons first began doing laparoscopic cholecystectomies, bariatric surgery, robotic prostatectomies, etc.
Communication Issues/Lack of Hearback
Apparently there were several verbal warnings that the plane was descending too rapidly, beginning about a minute before the crash. The relief first officer, sitting in the jump seat behind the pilots, gave three or four “sink rate” warnings in succession about 52 seconds before impact (Ahlers 2013).
When we meet with OR staff, nursing staff, medical residents, or others to discuss hierarchical issues we stress the importance of “hearback” and how they can “escalate their concerns”.
In the Asiana crash it is not clear whether hearback was utilized. The cockpit voice recorder did capture the relief first officer calling out “sink rate, sir” and the trainee pilot replying “Yes, sir”. The first officer who called out the descent warning may have tried some form of escalation since, after 2 callouts in English, his third callout was in Korean (Wald 2013).
This is eerily similar to a previous airline crash that we often use in our sessions. In that crash the copilot can be heard multiple times on the cockpit voice recorder saying, in a rather quiet, calm manner, “we’re running out of fuel”. Needless to say, the plane ran out of fuel and crashed.
Hearback must convey not just that something was heard but that what was said was fully understood and that both parties have the same understanding. In healthcare we often have situations where jargon or abbreviations are used and not all parties understand that nomenclature, leading to miscommunication and errors.
There are techniques that can be used to escalate concerns. One is the “Two-Challenge Rule”. In this approach the concern is first raised in a non-confrontational tone. If the concern is not acknowledged, you repeat the concern with more emphasis. If that second “challenge” is not acknowledged, you refer the issue to another person, e.g. a supervisor with authority to intervene or a trusted colleague.
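For illustration only, here is a minimal sketch of that Two-Challenge Rule logic expressed as a tiny Python function. The function name, arguments, and messages are our own hypothetical choices for this example and are not part of any formal TeamSTEPPS™ or CRM tool:

# Minimal, purely illustrative sketch of the Two-Challenge Rule described above.
# All names and messages here are hypothetical, chosen only for this example.
def two_challenge_escalation(acknowledged_first: bool, acknowledged_second: bool) -> str:
    """Return the next action under the Two-Challenge Rule."""
    # First challenge: the concern is raised in a non-confrontational tone.
    if acknowledged_first:
        return "Concern acknowledged on the first challenge."
    # Second challenge: the concern is repeated with more emphasis.
    if acknowledged_second:
        return "Concern acknowledged on the second challenge."
    # Neither challenge acknowledged: refer the issue to another person.
    return "Refer the concern to a supervisor with authority to intervene or a trusted colleague."

# Example: neither challenge is acknowledged, so the concern must be escalated.
print(two_challenge_escalation(acknowledged_first=False, acknowledged_second=False))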
Many of you are more familiar with the CUSS acronym for escalating your concern tactfully:
C: “I’m concerned and need clarification”
U: “I am uncomfortable and don’t understand”
S: “I’m seriously worried here”
S: “Stop”
Unfortunately, some of the hierarchical and cultural issues described below have made it difficult for some to feel comfortable raising issues. That is why it is critical that every organization emphasize the importance of speaking up and help staff understand techniques to escalate their concerns.
The Hierarchical Issues
The pilot apparently harbored fears about landing safely while relying on manual controls and a visual approach but failed to tell anyone because he did not want to fail his training mission and embarrass himself.
While any of the three pilots could have broken off the approach, it was noted that doing so “is very hard” for “low-level people”, a group that apparently included the pilot flying the plane, who was being supervised by an instructor pilot (Wald 2013). That sentiment persisted despite the fact that all had undergone cockpit resource management (CRM) training that encourages subordinates to speak up about safety concerns (Ahlers 2013).
We’ve said it time and time again – culture always trumps policy. How often have you found that your staff, even after they have undergone TeamSTEPPS™ training or CRM training, remain reluctant to speak up and buck the authority gradient even when they know something is wrong? It happens all the time, especially when a hospital administration fails to back up those staff who were willing to speak up and were admonished for doing so.
Failure to Heed Alarms
Aside from failure to heed the alarm about the too-rapid descent (“sink rate”), there weren’t many other alarms here. The audible alarm that signifies the aircraft is too low to the ground and the “stick shaker” that signifies an aerodynamic stall both did occur, but the pilots were not able to react soon enough to increase speed and abort the landing.
But the one other display that should have caused alarm was the air speed indicator. The pilots had planned an approach speed of 137 knots but the actual speed had dropped to 103 knots just before impact. About 11 seconds before impact an electronic voice called out an alert about minimum speed. But the pilots, likely because they were under the misconception that the autothrottle was still engaged and would automatically increase their speed, failed to try to increase speed until just a few seconds before impact.
Failure to heed alarms is, of course, one of our “big three” root causes in almost every root cause analysis we do after a healthcare incident with serious adverse outcome (the other two being failure to buck the authority gradient and failed communications). Our July 2, 2013 Patient Safety Tip of the Week “Issues in Alarm Management” discusses multiple aspects of alarms in healthcare and has links to our many prior columns on alarm issues.
The Other Cultural Issues
The pilot flying the airplane at the time of the landing also noted that he was temporarily blinded by a bright light on the runway, possibly a reflection of the sun. He was not wearing sunglasses because apparently that is considered a sign of disrespect or impoliteness in Korean culture (Wald 2013).
Today’s hospitals have staffs that come from a variety of cultural backgrounds. It’s not enough to just be cognizant of potential language barriers that might impact communication. Other cultural “norms” may also create barriers to communication. We’ve often seen that regarding gender issues.
Lack of Standard Instrumentation/Design Issues
The plane was a Boeing 777. Even though the pilot was very experienced, he had less than 45 hours experience in that model. He described confusing some of the details of the automation system with that of the Airbus A320 on which he had more extensive experience (Wald 2013).
The pilots had left their “flight director” system, which includes the autopilot, partly on (Scott 2013). But on the Boeing 777 the system would not “wake up” from hold mode, meaning that the autothrottle system would not automatically speed up the plane if it fell below the target speed. Boeing said that its design philosophy leaves the pilot in control of the airplane. On some other airplane models the autothrottle system would not be disabled and would automatically correct the speed.
We’ve often discussed lack of standardization as a risk factor for medical errors. One example we frequently point out is ventilators or alarm systems differing from one ICU to another in a hospital. When staff get “floated” from one ICU to another they are often confronted with equipment that has dials and displays which differ considerably from those they have become accustomed to.
The Automation Issues/Overreliance on Technology
One of the major issues contributing to the crash was apparently overreliance on technology. The pilots apparently thought that the automatic throttle system was engaged, which should have increased engine thrust when the airplane speed fell below the recommended speed. However, that automatic throttle system was not engaged. Once the pilots recognized that their speed and altitude were too low and that the autothrottle had not automatically increased speed, they tried to initiate a “go-around” (i.e. to abort the landing and fly around and try again) but it was too late.
It’s pretty clear that sometimes pilots don’t understand what mode the computer systems are in. The FAA has just released a comprehensive study on the overreliance of pilots on automation and loss of situational awareness due to automation surprises (FAA 2013).
Healthcare is no different. We often use computer systems in which multiple “modes” are available and we may not recognize which mode the system is operating in. Also, in all our discussions about alarm issues we note that erroneous assumptions are often made that an alarm will trigger when anything serious happens.
Why Hasn’t This Happened Before?
It has! Because of the design of this system you would anticipate that other pilots have likely experienced situations in which they thought the autothrottle was engaged when it was not. The ground school instructor interviewed by the NTSB stated that he provided specific training on this issue because he had personally experienced, in flight, an unexpected activation of HOLD mode and thus the failure of the autothrottle to re-engage (Operations Group Chairman Factual Report p. 8). He stated that he wanted to warn his students about this aspect of the B777 automation.
Healthcare organizations often fail to remember lessons learned from incidents within their own organizations. In our March 30, 2010 Patient Safety Tip of the Week “Publicly Released RCA’s: Everyone Learns from Them” we noted some cases where incidents occurred following identical prior incidents. Lessons learned and proposed solutions were either not implemented or not widely disseminated within the organization.
Airport Navigation System Out of Order
At the time of the crash in July, the glideslope system at SFO that works with aircraft systems to help guide the landing was out of order. This was one of the factors that necessitated a visual/manual landing.
The pilot flying the airplane, who was highly experienced in a Boeing 747 but was transitioning to flying a Boeing 777, told the National Transportation Safety Board that he found it "very stressful, very difficult" to land without the glideslope indicator that helps pilots determine whether the plane is too high or too low during approach (Ahlers 2013).
Multiple pilots interviewed provided varying responses about their ability to perform visual approaches but the overall theme seems to have been that most were uncomfortable with visual approaches (Operations Group Chairman Factual Report p. 32-33). Though all were required to fly a visual approach in the simulator, some felt “this training was to fill in the square on the simulator check ride rather than to learn something”.
In healthcare we also sometimes attempt to function with equipment or systems we know are suboptimal. Sometimes we’ll see equipment failures midway through surgical procedures and find out that the particular piece of equipment had failed previously and never been fully repaired. That is one reason why the post-op debriefing is so important: that is where faulty equipment issues should always be discussed and referred to the person who can rectify the problem.
The Sterile Cockpit
Interestingly, compared to most of our previous columns on transportation accidents, there did not appear to be any violations of the “sterile cockpit” rule here. That rule requires that all unnecessary “chatter” cease during critical phases of the flight, such as takeoff and landing. On the cockpit voice recording the only possible extraneous bits of conversation were really about visual landmarks (e.g. the Golden Gate Bridge, a golf course, etc.) and did not appear to be distracting in any way.
In healthcare that same “sterile cockpit” concept applies to all key situations where distractions must be avoided, such as during the surgical timeout or any verification scenario, the surgical “count”, and a host of other situations.
The Crash Response
The NTSB investigation looks not only at factors leading to the accident but also at “survival factors” and the response after the crash. While the overall response was credited with saving many lives, review of the response did identify opportunities for improvement.
Crew members did access instructions for evacuation and follow them. One emergency inflatable slide/raft deployed inappropriately, obstructing an exit and actually pinning a flight attendant. Fortunately, another crew member recognized this and was able to deflate the slide enough for the flight attendant to escape.
But some images of the crash showed passengers ignoring official safety procedures and collecting their carry-on items before evacuating the aircraft (Thompson 2013).
Also, a Boeing evacuation engineer testified at the hearing that at least one and possibly two passengers who died did not have their seatbelts on (Braun 2013). One of the 3 fatalities was a girl who had been found on the ground near the left wing of the plane but became covered in firefighting foam and was struck by emergency vehicle(s) on the runway.
An automated system designed to alert key managers at SFO failed during the period immediately following the crash (van Derbeken 2013). There were also questions about the communications between air traffic controllers in the tower, who could see some of the activity at the crash site, and emergency command center managers on the ground.
Healthcare organizations also need to look at their response after an untoward incident, analyzing not only the proximate and root causes of the incident but also how people responded to it, what was done well, and what might have been done better.
Do as I Say, Not as I Do
While many have focused on overreliance on technology as a key root cause in this and many other transportation accidents, we think there is another lesson here that is extremely important to healthcare. That is the disparity between what you say should be done and what is actually done.
There are a couple of examples in this case. One was that both the Boeing (plane manufacturer) representative and the Asiana representatives stated their philosophies that the pilot should be in full control of the aircraft on landing. Yet the interviews with many Asiana pilots showed that landings using instruments and technology were the norm, that most pilots were extremely reluctant to do a “visual” or “manual” landing, and that the simulation training actually paid only cursory attention to visual/manual landings.
Similarly, though all pilots underwent CRM training, which stresses the obligation of everyone to speak up when they see something wrong, the cultural norm in this airline was that they still deferred the important decisions to the “one in command”.
Healthcare organizations are also guilty of this. How many of you have sent your OR teams through TeamSTEPPS™ or CRM training yet failed to follow up, only to find after an incident that your OR culture had reverted to its previous hierarchical morass? And how many hospitals tout their commitment to patient safety on their websites, yet relegate discussion about quality and patient safety at their Board meetings to the end of the agenda or fail to provide the resources needed for that commitment to safety? And how many turn a blind eye to bad behavior by that physician who brings the most “business” to the hospital (see our May 29, 2011 Patient Safety Tip of the Week “The Silent Treatment: A Dose of Reality”)?
It is always agonizing for victims of such transportation accidents and their families and for the second victims as well. But it would be even worse if we failed to learn from the events and apply those lessons learned in a constructive fashion.
See some of our previous columns that use aviation analogies for healthcare:
May 15, 2007 “Communication, Hearback and Other Lessons from Aviation”
August 7, 2007 “Role of Maintenance in Incidents”
August 28, 2007 “Lessons Learned from Transportation Accidents”
October 2, 2007 “Taking Off From the Wrong Runway”
May 19, 2009 “Learning from Tragedies”
May 26, 2009 “Learning from Tragedies. Part II”
January 2010 “Crew Resource Management Training Produces Sustained Results”
May 18, 2010 “Real Time Random Safety Audits”
April 5, 2011 “More Aviation Principles”
April 26, 2011 “Sleeping Air Traffic Controllers: What About Healthcare?”
May 8, 2012 “Importance of Non-Technical Skills in Healthcare”
March 5, 2013 “Underutilized Safety Tools: The Observational Audit”
April 16, 2013 “Distracted While Texting”
May 2013 “BBC Horizon 2013: How to Avoid Mistakes in Surgery”
August 20, 2013 “Lessons from Canadian Analysis of Medical Air Transport Cases”
References:
NTSB. Investigative Hearing. Crash of Asiana Flight 214, San Francisco, CA, 7/6/2013. NTSB; December 2013
http://www.ntsb.gov/news/events/2013/asiana214_hearing/index.html
Operations Group Chairman Factual Report
http://dms.ntsb.gov/public%2F55000-55499%2F55433%2F543236.pdf
Operations Group Chairman Factual Report. Interview Summaries.
http://dms.ntsb.gov/public%2F55000-55499%2F55433%2F543238.pdf
Cockpit Voice Recorder (transcript)
http://dms.ntsb.gov/public%2F55000-55499%2F55433%2F544904.pdf
Ahlers MM. Pilot concerned about landing Asiana jet before crash. CNN.com December 11, 2013
http://www.samachar.com/NTSB-probes-fatal-Asiana-Flight-214-crash-nmltKyajccc.html
Wald ML. Nobody wanted to be rude: good manners helped doom Asiana flight 214. Sydney Morning Herald; December 12, 2013
http://www.news.nom.co/nobody-wanted-to-be-rude-good-manners-7484199-news/
Scott A. Asiana crash pilots knew speed was low, hesitated. Reuters December 11, 2013
FAA. Operational Use of Flight Path Management Systems. FAA September 5, 2013
http://media.nbcbayarea.com/documents/FAA_Final_Report_Recommendations+11-22-13.pdf
Thompson C, Meng M. Did Asiana passengers ignore safety messages? CNN.com July 9, 2013
http://www.cnn.com/2013/07/09/travel/asiana-passenger-safety/index.html?iid=article_sidebar
Braun S, Mendoza M. Probe: Asiana pilot wasn't confident, assertive. Associated Press/Denver Post December 11, 2013
http://www.denverpost.com/breakingnews/ci_24700519/ntsb-review-asiana-crash-at-hearing-wednesday
van Derbeken J. SFO reveals missteps after Asiana crash. SFGate.com November 21, 2013
http://www.sfgate.com/bayarea/article/SFO-reveals-missteps-after-Asiana-crash-4996962.php