In patient safety we frequently use analogies from other industries, particularly aviation. It’s been a couple of years since we’ve done a “lessons learned” column on an aviation or other transportation disaster. This time we can learn a lot from an aviation near-miss.
On July 7, 2017, Air Canada flight 759 overflew multiple planes that were lined up on a taxiway at San Francisco International Airport (SFO), missing them by only feet, in what could have been one of the worst aviation disasters ever (NTSB 2018). Just as we see in healthcare incidents, a cascade of contributing factors led to this near-disaster. And almost every one of those contributing factors has an analogous presence in healthcare incidents.
The flight was from Toronto, Ontario to San Francisco and was scheduled to arrive about 11:30 PM (Pacific Time). It left Toronto about 30 minutes late. Autopilot was engaged shortly after takeoff and remained engaged until just before the final approach to SFO. The departure, climb, cruise, and descent phases of flight were uneventful except for an area of thunderstorms about midway through the flight.
The flight was designated to land on runway 28R (R=right). That same night, runway 28L (L=left) was scheduled to be closed from 2300 to 0800 the next morning. The lights on runway 28L were turned off about 2312. As they approached SFO, the pilot inadvertently lined up for Taxiway C (parallel to and to the right of runway 28R), where four large air carrier airplanes were lined up awaiting clearance to take off from runway 28R. The control tower gave Air Canada flight 759 clearance to land on runway 28R. At 2355:45, the flight crew made the following transmission to the controller: “Just want to confirm, this is Air Canada seven five nine, we see some lights on the runway there, across the runway. Can you confirm we’re cleared to land?” At 2355:52, 1 second after the flight crew completed its transmission, the controller replied, “Air Canada seven five nine confirmed cleared to land runway two eight right. There’s no one on runway two eight right but you.”
The Air Canada plane descended to an altitude of 100 ft above ground level and overflew the first airplane on the taxiway. The flight crew initiated a go-around, and the airplane reached a minimum altitude of about 60 ft as it overflew the second airplane on the taxiway before starting to climb.
Thus, mere feet separated this near-miss from perhaps the worst airline disaster ever.
The NTSB (National Transportation Safety Board) recently issued its investigation report (NTSB 2018). The NTSB, when it investigates, looks at every conceivable contributing factor (equipment, weather, human factors, fatigue, policies and procedures, etc.). Below, we’ve summarized some of the most important points in their investigation and attached analogies that apply in healthcare.
Information about the runway closure not salient enough.
A major factor contributing to the confusion was the closure of the adjacent runway 28L. That runway had been closed around 2300; this was a scheduled closure (from 2300 to 0800 the next morning), and the lights on runway 28L were turned off about 2312. The pilots of Air Canada flight 759 were familiar with landing at SFO, but the lack of lights on 28L likely contributed to their inadvertently aligning with the taxiway. They likely saw two “runways” and thought they were aligning with the one to the right (taxiway C was to the right of runway 28R as they approached SFO).
Note that the crew of a different plane that had landed on runway 28R about 4 minutes before the incident reported that, after visually acquiring the runway environment, they, too, questioned whether their airplane was lined up for runway 28R. But they were able to determine that their airplane was lined up for runway 28R after cross-checking the lateral navigation (LNAV) guidance. They stated that, without lateral guidance, they could understand how the runway 28R and taxiway C surfaces could have been confused because the lights observed on the taxiway were in a straight line and could have been perceived as a centerline.
The information about the planned closure of 28L was available to the pilots both prior to their takeoff and while in flight. Prior to takeoff they had access to the notice to airmen (NOTAM) about the runway 28L closure. However, the first officer stated that he could not recall reviewing the specific NOTAM that addressed the runway closure. The captain stated that he saw the runway closure information, but his actions (as the pilot flying) in aligning the airplane with taxiway C instead of runway 28R demonstrated that he did not recall that information when it was needed. The second opportunity occurred in flight when the crewmembers reviewed the automatic terminal information system (ATIS) broadcast, which also included NOTAM information about the runway 28L closure. Both crewmembers recalled reviewing the ATIS information but could not recall reviewing the specific NOTAM that described the runway closure.
One additional reason the crew may not have paid much attention to the runway closure was that they may have expected to land prior to the closure. Recall, however, that the late departure also resulted in their late arrival at SFO, after the closure of runway 28L.
Analogy: ignoring alerts that do not demand a specific action
Analogy: receiving remote training about an issue without reinforcement at a time when such information is needed
Analogy: ignoring warnings about a defective piece of equipment (“we had a problem with that yesterday”)
We’ve seen this problem before.
Although the NOTAM about the runway 28L closure appeared in the flight release and the in-flight messages that were provided to the flight crew, the presentation of that information did not effectively convey the importance of the runway closure information and promote flight crew review and retention. Multiple events in NASA’s Aviation Safety Reporting System database showed that this issue has affected other pilots, indicating that all pilots could benefit from improved display of flight operations information.
We often fail to learn from prior events. See, for example, our March 30, 2010 Patient Safety Tip of the Week “Publicly Released RCA’s: Everyone Learns from Them”. In that column we describe an adverse event that followed an almost identical adverse event several years earlier. There was a failure to disseminate lessons learned from earlier events and implement solutions to prevent future events.
The first officer failed to tune the instrument landing system to the frequency for runway 28R.
The procedures for the approach to runway 28R required the first officer (as the pilot monitoring) to manually tune the instrument landing system (ILS) frequency for runway 28R, which would provide backup lateral guidance during the approach to supplement the visual approach procedures. However, when the first officer set up the approach, he missed the step to manually tune the ILS frequency. The captain was required to review and verify all programming by the first officer but did not notice that the ILS frequency had not been entered.
There are several issues here. First, the instruction on the approach chart to manually tune the ILS frequency was apparently not conspicuous during the crew’s review of the chart. Second, this was the only approach in Air Canada’s Airbus A320 database that required manual tuning of a navigational aid, so manually tuning the ILS frequency was not a usual procedure for the flight crew. Third, although the captain was required to review and verify all of the first officer’s programming, that double check failed to catch the missing frequency.
Analogy: failure to adhere to all the items/actions in a checklist. For example, in a checklist for a surgical timeout, staff may not pay attention to a step that requires independent confirmation that the imaging materials in the OR actually belong to the patient.
Analogy: lack of standardization. For example, a hospital may have different mechanical ventilators in each of its ICU’s. A nurse temporarily called to work in an ICU other than his/her usual one may therefore be faced with unfamiliar dials and settings.
Analogy: failed double checks. The statistic we often give is that an inspector fails to recognize an error in someone else’s work 10% of the time. That is why truly independent double checks (where each person independently reviews the order or other issue) are necessary when doing things such as administration of a high-alert drug. (See our October 16, 2012 Patient Safety Tip of the Week “What is the Evidence on Double Checks?”).
Planes should be equipped with a system that alerts air crews if the plane is not aligned with their destination runway.
Although the Federal Aviation Administration (FAA) has not mandated the installation of such a system, the results of a simulation showed that such technology, if it had been installed on the incident airplane, could have helped the flight crew identify its surface misalignment error earlier in the landing sequence, which could have resulted in the go-around being performed at a safer altitude.
Analogy: A (theoretical) system that could inactivate electrosurgical devices while there is free flow of oxygen in head/neck surgery might prevent surgical fires.
There was no system in place to alert air traffic controllers that the plane was aligned with a taxiway rather than a runway.
If an airplane were to align with a taxiway, an automated airport surface detection equipment (ASDE) alert could assist controllers in identifying and preventing a potential taxiway landing as well as a potential collision with aircraft, vehicles, or objects that are positioned along taxiways. An FAA demonstration in February 2018 showed the potential effectiveness of such a system.
Analogy: a system of detecting RFID tags in surgical sponges might reduce the likelihood of a retained surgical sponge
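The geometric check at the heart of such an alert can be sketched simply. The following is a hypothetical illustration only: the coordinates, surface names, function names, and tolerance are all invented for this sketch, and real surface-surveillance systems work from radar/ADS-B data and airport surface maps. The idea is just to compute an aircraft's lateral offset from each candidate surface's centerline and warn when it is lined up with something other than its assigned runway.

```python
import math

def lateral_offset_m(pos, line_start, line_end):
    """Perpendicular distance (meters) from a point to an extended
    centerline, using a flat local x/y approximation."""
    (px, py), (ax, ay), (bx, by) = pos, line_start, line_end
    dx, dy = bx - ax, by - ay
    # |cross product| / segment length = perpendicular distance
    return abs((px - ax) * dy - (py - ay) * dx) / math.hypot(dx, dy)

def check_alignment(pos, assigned, surfaces, tolerance_m=30.0):
    """Return which surface the aircraft is actually lined up with;
    warn if it differs from the assigned runway."""
    name = min(surfaces, key=lambda s: lateral_offset_m(pos, *surfaces[s]))
    if name != assigned and lateral_offset_m(pos, *surfaces[name]) < tolerance_m:
        return f"WARNING: aligned with {name}, cleared for {assigned}"
    return f"aligned with {name}"

# Hypothetical local coordinates (meters): runway 28R and a parallel taxiway
surfaces = {
    "28R": ((0, 0), (3000, 0)),
    "TWY C": ((0, 150), (3000, 150)),  # ~150 m to the right of 28R
}
print(check_alignment((1500, 148), "28R", surfaces))
```

An aircraft tracking near the taxiway centerline triggers the warning, while one near the assigned runway does not; the point of the sketch is that the decision is a simple, automatable comparison that does not depend on a human noticing the misalignment.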
Need for a method to more effectively signal a runway closure to pilots when at least one parallel runway remains in use.
A runway closure marker with a lighted flashing white “X” appeared at the approach and departure ends of runway 28L when it was closed. The runway closure marker was not designed to capture the attention of a flight crew on approach to a different runway, and it did not capture the attention of the incident flight crew as the airplane approached the airport while aligned with taxiway C. Increased conspicuity of runway closure markers, especially those used in parallel runway configurations, could help prevent runway misidentification by flight crews while on approach to an airport.
Analogy: Special warnings on drug vials in conspicuous lettering (e.g., large font, special color) indicating that the drug is not to be used in certain circumstances.
Fatigue
Both the pilot and first officer noted feeling tired after they navigated an area of thunderstorms about halfway through the flight. The captain had been awake for more than 19 hours, and the first officer for more than 12 hours, at the time of the landing. Transport Canada had proposed regulations that would better address the challenge of fatigue mitigation for pilots on reserve duty who are called to operate evening flights extending into their window of circadian low. However, Transport Canada has not yet finalized its rulemaking in this area.
It’s also not clear whether fatigue may have impacted the air traffic controller. The air traffic controller began his “midnight” shift at 2230. He had also worked a daytime shift on July 7 from 0530 to 1330. The controller reported that he took a 45-minute nap in between the morning and midnight shifts and that he felt rested for his shifts. The controller also stated that he had “no problems” adjusting to the midnight shift.
Analogy: see our many columns on the impact of fatigue on healthcare workers
Expectation bias is “when someone expects one situation, she or he is less likely to notice cues indicating that the situation is not quite what it seems.”
The first officer, on first looking up toward the landing zone, assumed that the pilot had aligned with the correct runway.
The air traffic controller, because he had never seen a plane align with the taxiway before, assumed that the plane was aligned with the correct runway.
Analogy: hanging the wrong IV bag because another IV bag was expected
A closely related concept is “inattentional blindness,” in which we tend to see what we expect to see (as discussed in our April 21, 2015 Patient Safety Tip of the Week “Slip and Capture Errors”).
Despite the lack of most of the usual visual markers of a runway, the presence of lights in a row (actually from the planes waiting in line on the taxiway) conveyed to the pilot the appearance of a runway, confirming in his mind that he was on the correct path.
Analogy: That vial you picked up looked like the one you always use for flushing an IV line, only it was for a different concentration of heparin or for a totally different medication.
Ignoring disconfirming information
The pilot and first officer questioned the presence of certain light patterns (e.g., some that appeared to cross the “runway”) on what they thought was their designated runway. Yet they continued on that path once the air traffic controller told them they were cleared to land.
Analogy: You ignore a piece of evidence that does not fit with your working diagnosis, leading to a diagnostic error.
Plan continuation bias
Plan continuation bias is an “unconscious cognitive bias to continue an original plan in spite of changing conditions”. The pilots, despite having some uncertainty about visual cues, continued with their plan to land on what they thought was runway 28R.
Analogy: The surgeon ignores a discrepant “count” and proceeds to close a surgical cavity because a previous discrepant “count” had subsequently been corrected.
When we talk about the aviation “error chain” we often note the “queasy feeling”: confusion, an “empty feeling,” or a sense that something is just not right. The flight crew likely had this feeling when they commented on the unusual light pattern on what they thought was their designated runway, and the “queasy feeling” returned just in time to avert the disaster. During a postincident interview, the captain stated that, as the airplane was getting ready to land, “things were not adding up” and it “did not look good,” so he initiated a go-around. The captain reported that he thought that he saw runway lights for runway 28L and believed that runway 28R was runway 28L and that taxiway C was runway 28R. During a postincident interview, the first officer reported that he thought that he saw runway edge lights but that, after the tower controller confirmed that the runway was clear, he then thought that “something was not right”; as a result, the first officer called for a go-around because he could not resolve what he was seeing. The captain further reported that the first officer’s callout occurred simultaneously with the captain’s initiation of the go-around maneuver. (See below what Air Canada subsequently did to train its pilots on the importance of the “gut feeling”.)
Analogy: There are numerous times in healthcare, for example in the OR, when someone suspects that “something does not feel right”. The hierarchical nature of medicine often discourages them from speaking up. We always instruct our medical students, residents, and nurses how to speak up tactfully to convey their concerns. The CUSS acronym is also a good system for bringing your concerns to everyone’s attention (see our Patient Safety Tips of the Week for May 8, 2012 “Importance of Nontechnical Skills in Healthcare” and January 7, 2014 “Lessons from the Asiana Flight 214 Crash”).
Analogy: There is also a role for “gut feelings” in the clinical diagnostic process (see our August 2013 What's New in the Patient Safety World column “Clinical Intuition”).
Was there a “sterile cockpit”?
There are certain times in aviation (and in healthcare) when there needs to be a “sterile cockpit”. That means that no extraneous conversations take place during certain phases (including takeoff and landing) and all attention is focused on the tasks at hand. Unfortunately, in this case the cockpit voice recordings were overwritten because Air Canada did not initially appreciate the severity of the incident, so there is no way of determining whether the “sterile cockpit” rule was adhered to during the landing approach.
Analogy: The surgical timeout demands a “sterile cockpit”, where everyone’s attention is focused on the task at hand and there is no extraneous conversation. The same should apply when the surgical “count” is being performed or when double checks are being done in a variety of circumstances.
Incidents shortly after handoffs
About 2349, all air traffic control positions and frequencies were combined, and one controller worked the positions in the tower cab while the other controller took a recuperative break. This was 7 minutes before the incident occurred.
Analogy: We’ve done several columns on incidents that occurred around the time of change of shift. The period of change of shift should be considered a very vulnerable one.
Automation surprises
In many of our prior columns on transportation accidents we’ve talked about “automation surprises”: situations in which the crew thinks a computer or system is in one mode when it is actually in another, for example, thinking the plane is on autopilot when it is in manual mode. It’s not apparent that any automation surprises occurred in this incident. But one might wonder whether the first officer’s failure to tune the instrument landing system (ILS) frequency for runway 28R represented one (did he think it would be tuned automatically when, in fact, it required manual input?).
Analogy: You assume that alarm parameters have been set at certain thresholds for your patient, not realizing they have been set at default thresholds that may not be optimal for your patient.
Analogy: Preventing people (e.g., firefighters, police) from entering an MRI unit
Should construction on a runway take place while planes are landing on parallel runways?
The purpose of the construction project on runway 10R/28L at the time of the incident was to resurface the runway and replace existing light fixtures with improved lighting. The project started in February 2017 and was expected to last about 10 months. The work required the closure of the runway each night and during some weekends. At the time of the incident, 28 portable light plants were located around the construction zone. A runway closure marker (a white flashing lighted “X”) was placed at the approach and departure ends of runway 28L when the runway was closed, but that system was not really designed to alert incoming aircraft to the closure. There was some evidence that the bright construction lights may have contributed to confusion about which runway the pilots were looking at. The crew of the plane that had landed 4 minutes before Air Canada flight 759’s attempted landing noted that there were “really bright” white lights on the left side of runway 28R (similar to the type used during construction), but they knew that runway 28L was closed.
Analogy: Keeping a piece of equipment in need of repair in close proximity to an active OR.
Things happen when construction is ongoing
Construction projects do not take place in a vacuum. Potential unintended consequences should always be considered, and the impact on operations beyond the area directly under construction must be taken into account.
Analogy: Hospital fires often occur in parts closed and under construction.
Analogy: Wandering patients have been found dead in hospital construction zones.
Too many tasks?
The captain of one of the planes on the taxiway said it was important to note that the tower controller “was performing way too many functions…Ground, Tower, and at times ops vehicles.” The NTSB investigation also suggested the air traffic controller was likely performing too many tasks. After the incident, air traffic control management issued guidance indicating that the ground control and local control positions could not be combined before 0015 (when air and ground traffic would be much diminished).
Analogy: an anesthesiologist preparing medications and responding to questions about a previous patient fails to notice a change in a vital parameter currently being monitored
Technologies available that could prevent a taxiway landing
The NTSB investigation also looked at several existing technological tools that might prevent an inadvertent landing on a taxiway, including an ASSC (airport surface surveillance capability) system enhancement and an EGPWS (enhanced ground proximity warning system) module, and concluded that such tools might prevent taxiway landings.
Analogy: Biometrics could help avoid patient misidentification in patients with communication barriers such as language problems, cognitive impairment, hearing or speech problems, etc.
Analogy: GPS tags could prevent patients prone to wandering from leaving their units
Do anything possible to avert a disaster
The captain of one of the planes on the taxiway stated, over the tower frequency, “where is that guy going?” at 2355:59, but by then there was little time for the controller to respond: the flight crew had already begun the go-around maneuver (at 2356:05), and the airplane was climbing at the time of the controller’s go-around instruction (2356:09). In addition, several planes on the taxiway, seeing the Air Canada plane coming toward them, turned on various lights in an attempt to warn the oncoming plane. While the pilots of Air Canada 759 did not recall seeing these, perhaps they were among the cues that told them “something’s not right”.
Analogy: A nurse shouting out “stop” as a surgeon attempts to use an electrocautery device on the forehead of a patient receiving active flow of oxygen via face mask.
Failure to sequester equipment relevant to investigation
The cockpit voice recordings from the period of the anticipated landing were not available because Air Canada allowed them to be overwritten. Air Canada apparently was not aware of the seriousness of the incident. The captain reported the event to dispatch about 1608 (1908 EDT) on July 8. The dispatcher who spoke with the captain stated that he reported that the airplane was lined up with the wrong runway and that a go-around ensued. The dispatcher also stated that the captain’s report sounded “innocuous” and that, because of the late notification (16 hours after the incident), he did not think that the event was serious. The airplane had also flown about 40 hours before Air Canada senior officials became aware of the severity of the incident and realized that data from the airplane needed to be retrieved.
Analogy: Failure to report near-misses
Analogy: Every organization needs to have a policy for dealing with serious incidents, whether they resulted in patient harm or not, and should have a checklist to guide all in what to do after such events (see our July 24, 2007 Patient Safety Tip of the Week “Serious Incident Response Checklist” and the actual checklist).
Mitigations to Overcome Expectation Bias
About 3 months before this incident, Air Canada implemented training on plan continuation and expectation bias. The training, which was provided to company pilots during annual recurrent training, comprised a video titled “Understanding Gut Feel,” which explained that a gut feeling was a sense of knowing things before a person could consciously know, communicate, or explain them. The video also explained that a gut feeling indicated a potential threat resulting from a situation that was different or strange or had changed.
Analogy: There are many good resources on teamwork training, such as TeamSTEPPS™, and crew resource management (CRM) programs that emphasize situational awareness and how to speak up when that “gut feeling” makes anyone on the team feel unsure or uncomfortable with something they perceive.
You may recall that San Francisco International Airport (SFO) was also the site of another tragic airline accident that we discussed in our January 7, 2014 Patient Safety Tip of the Week “Lessons from the Asiana Flight 214 Crash”. That column and the ones listed below have many good lessons learned that apply equally in healthcare. They emphasize how multiple events and conditions combine at the right time to enable serious incidents. Today’s column shows a striking example of how analysis of near-misses is as important as doing a root cause analysis after an incident in which actual harm resulted. Near-misses are learning opportunities that help us prevent future accidents and incidents from occurring.
See some of our previous columns that use aviation analogies for healthcare:
May 15, 2007 “Communication, Hearback and Other Lessons from Aviation”
August 7, 2007 “Role of Maintenance in Incidents”
August 28, 2007 “Lessons Learned from Transportation Accidents”
October 2, 2007 “ ”
May 19, 2009 “Learning from Tragedies”
May 26, 2009 “Learning from Tragedies. Part II”
January 2010 “ ”
May 18, 2010 “Real Time Random Safety Audits”
April 5, 2011 “More Aviation Principles”
April 26, 2011 “Sleeping Air Traffic Controllers: What About Healthcare?”
May 8, 2012 “Importance of Non-Technical Skills in Healthcare”
March 5, 2013 “Underutilized Safety Tools: The Observational Audit”
April 16, 2013 “Distracted While Texting”
August 20, 2013 “Lessons from Canadian Analysis of Medical Air Transport Cases”
December 17, 2013 “The Second Victim”
January 7, 2014 “Lessons from the Asiana Flight 214 Crash”
January 5, 2016 “”
Some of our prior columns on RCA’s, FMEA’s, response to serious incidents, etc:
July 24, 2007 “Serious Incident Response Checklist”
March 30, 2010 “Publicly Released RCA’s: Everyone Learns from Them”
April 2010 “RCA: Epidural Solution Infused Intravenously”
March 27, 2012 “Action Plan Strength in RCA’s”
March 2014 “FMEA to Avoid Breastmilk Mixups”
July 14, 2015 “NPSF’s RCA2 Guidelines”
July 12, 2016 “”
May 23, 2017 “”
NTSB (National Transportation Safety Board). Taxiway Overflight. Air Canada Flight 759. Airbus A320-211. C-FKCK. San Francisco, California July 7, 2017. NTSB September 25, 2018