The Brexit vote garnered the world’s attention last month, a surprising attack on tradition. But another British attack on tradition targeted the root cause analysis (RCA). Peerally and colleagues (Peerally 2016) did a nice review of many of the pitfalls involved in RCA’s, most of which we agree with. In fact, we’ve addressed most of them in our prior columns on RCA’s, most recently in our July 14, 2015 Patient Safety Tip of the Week “NPSF’s RCA2 Guidelines”.
We’ve always had a strong conviction that the RCA is probably the most important learning tool available to an organization with a good culture of safety. We encourage organizations to do RCA’s not just on events with bad patient outcomes but on any event that had the potential to cause harm (near misses).
We’ll try to comment on each of the points raised in the Peerally paper. One of the criticisms offered by Peerally et al. is that the RCA team is often not expert in doing RCA’s. They point out that in the airline industry the RCA is typically done by a very expert team. When there is an aircraft crash (or other significant transportation incident) in the US, the NTSB (National Transportation Safety Board) dispatches a team to investigate the incident. The NTSB keeps “go teams” that are ready to travel to the site of the incident within hours so the investigation may begin expeditiously.
Given the large number of events that could and should benefit from performance of an RCA (we recommend RCA’s not only for incidents resulting in patient harm but also for most near misses) it is not practical to think an NTSB-like “go team” could be dispatched to investigate every such incident. Perhaps a larger multi-hospital system might be able to do this and it’s even conceivable that a PSO (Patient Safety Organization) might be able to have an “RCA go team” ready to send to contracted healthcare organizations. A hospital can (and should) always develop a “core” RCA group and ensure they get formal training on RCA (and other) techniques and tools. But there is something to be said for having a truly independent body or group investigate incidents without the biases that are inherent in most RCA’s (see section below on biases and the “political hijack”).
But another real need in healthcare is making sure that everyone in the organization understands what an RCA is all about and takes seriously that its main goal is to prevent similar events from occurring in the future. All too often those invited to participate in or provide information to the RCA team are frightened and fearful that the RCA is an “inquisition”. Physicians, in particular, have historically been reluctant to even speak to the RCA team for fear that their “testimony” will be “discoverable” in litigation. That’s especially a problem in the United States compared to other less litigious countries. So it is incumbent upon the organization’s leadership and the RCA “core” team to appropriately educate everyone about the RCA process in general and to orient them to the process if they are called upon to participate in an RCA.
Peerally et al. also criticize the somewhat arbitrary timelines and techniques “mandated” by some regulatory bodies for RCA’s. They note that such requirements often lead to more focus on putting together a document than actually preventing future similar events. That’s probably a legitimate criticism. They note as an example the practice of asking “the 5 why’s”. We also often see organizations struggling to address all the components and questions in the widely used VA Root Cause Analysis Tools (VA 2015), to the point they appear more concerned about their document than their solutions. Don’t get us wrong – the VA RCA Tools are very helpful in getting RCA teams to consider potential contributing factors and root causes related to Rules, Safeguards, Environment, Equipment, Information Technology, Fatigue and Scheduling, Training, and Communication. But organizations should use those categories as reminders of where they may find root causes and contributing factors.
Timelines are a mixed bag. Peerally and colleagues note these often result in “a compromise between depth of data and accuracy of the investigation”. Most regulatory bodies mandate RCA’s be completed within specific arbitrary timelines (typically 30-45 days). Timing of the initial RCA meeting is important. It should be held within 24-72 hours of the event. That helps ensure that events are accurately recalled by witnesses and should allow enough time for scheduling interviews and other activities. In our July 24, 2007 Patient Safety Tip of the Week “Serious Incident Response Checklist” we discussed the many other things that need to be done immediately after serious events. That column included a link to our Serious Event Response Checklist, which includes things like sequestering involved equipment, identification of witnesses, disclosure to patient or family, notification of regulatory bodies and Board if necessary, and others.
Prompt performance of the RCA is critical not only for ensuring accurate recall of events but also for taking those steps needed immediately to prevent similar occurrences. While most thorough RCA’s can and should be completed within 30-45 days, those arbitrary timelines fail to take into account some of the bureaucratic lags in our healthcare organizational hierarchies. Those timelines may be sufficient to have the RCA go through your hospital Quality Improvement/Patient Safety Committee and up to your Board. However, due to the complexities of scheduling, the RCA may not be discussed at clinical and support departmental meetings, committees such as your Pharmacy and Therapeutics Committee or OR Committee, and your Medical Executive Committee within those timelines. Sometimes important considerations are raised at those meetings that lead to changes in the RCA actions. And, as discussed below, it is critical that RCA action plans be disseminated to all those who must take action and that appropriate buy-in from all parties be obtained. Therefore, while an RCA may be submitted to your regulatory body within the 30-45 day timeframe, you must ensure that you review it again after those other committees and departments have discussed it, and revise and amend the RCA as necessary.
Peerally et al. highlight the challenges of getting unbiased information and the “political hijack”. By the latter they mean that areas of focus are often influenced by “interpersonal relationships, hierarchical tensions, and partisan interests” (essentially, some areas are underemphasized to avoid stepping on toes). They also note that some root causes may be edited out when solutions appear to be too difficult to achieve.
The next criticism by Peerally and colleagues, one with which we strongly agree, is that solutions and action plans are poorly designed or implemented. In our March 27, 2012 Patient Safety Tip of the Week “Action Plan Strength in RCA’s” we noted prior studies in the VA system (Hughes 2006) which analyzed action items from RCA’s and found that 30% were not implemented and another 25% were only partially implemented. Stronger action items were more likely to be implemented. Actions that were assigned to specific departments or people were more likely to be implemented than those assigned to general areas. And they found that the patient safety manager plays a critical role in RCA action implementation.
In our March 27, 2012 Patient Safety Tip of the Week “Action Plan Strength in RCA’s” we emphasized the importance of tracking whether recommended action steps were implemented following an RCA, whether they were effective, and whether there were any unintended consequences. All too often action steps never get implemented at all or consist solely of “weak” action steps, and organizations are then surprised when a similar adverse event occurs in the future. Moreover, even the most well-intentioned and well-planned action steps sometimes lead to consequences that were never anticipated. We typically see weak actions like education and training or policy changes as the sole actions undertaken, rather than strong actions like constraints and forcing functions. In that same column we included an analogy to the effectiveness of signs and tools used to try to get drivers to slow down in construction zones on highways. We put them together in pictures with RCA action items and now incorporate them in our webinar presentations on doing good RCA’s. Click here to see them. Remember: images are more likely to be remembered than words!
One of the biggest issues we see in hospitals related to RCA’s is failure to follow up and close the feedback loop. In fact, probably the majority of hospitals lack formal procedures for ensuring the corrective actions recommended in an RCA are actually carried out (or barriers to their implementation identified and alternative steps taken). In our March 30, 2010 Patient Safety Tip of the Week “Publicly Released RCA’s: Everyone Learns from Them” we discussed an incident at a hospital where a similar incident had occurred several years earlier. After the first incident an extensive root cause analysis was done and multiple recommendations were made, including key recommendations that should have prevented the second incident. But all those recommendations had never been fully implemented. Importantly, the recommendations were communicated back to those individuals deemed to be in the “need to know” but not widely disseminated to middle or front line management nor to front line staff.
We recommend you keep a list or table of such identified action items from all your RCA’s to discuss at your monthly patient safety committee or performance improvement committee meetings. Action items should remain on that list until they have been implemented or completed. Only that sort of rigorous discipline will ensure that you did what you said you were going to do, i.e. that you “closed the loop”. And don’t forget you need to monitor your implemented actions for unanticipated and unintended consequences. For example, you might take the strong action of removing a drug from a particular setting, only to realize later that there were circumstances where that drug was needed in that setting.
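The tracking list we recommend above can be very simple. As a purely illustrative sketch (all field names, statuses, and example items below are our own assumptions, not a prescribed format), it might look like this:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch of an RCA action-item tracking list.
# Items stay on the monthly committee agenda until verified closed,
# and monitoring for unintended consequences is itself an action item.

@dataclass
class ActionItem:
    rca_id: str           # which RCA the action came from
    description: str
    owner: str            # specific person or department assigned
    due: date
    status: str = "open"  # open -> implemented -> verified (closed)

def open_items(items):
    """Return items not yet verified closed, i.e. still on the agenda."""
    return [i for i in items if i.status != "verified"]

items = [
    ActionItem("RCA-001", "Remove concentrated KCl from floor stock",
               "Pharmacy", date(2016, 9, 1), status="implemented"),
    ActionItem("RCA-001", "Monitor for unintended consequences of removal",
               "Patient Safety Manager", date(2016, 12, 1)),
]

for i in open_items(items):
    print(f"{i.rca_id}: {i.description} ({i.owner}, status={i.status})")
```

Note that an “implemented” item is deliberately still on the list: it stays until the committee has verified it was effective and checked for unintended consequences, which is what closing the loop means.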
From our perspective the major failure in the patient safety movement in the US has been our failure to share lessons learned. Peerally and colleagues also lament a lack of dissemination of lessons learned and lack of aggregation of similar events. Ironically, the example Peerally and colleagues used to illustrate lessons not learned was implantation of incorrect intraocular lenses. In several of our columns (most recently in our May 17, 2016 Patient Safety Tip of the Week) we’ve described that very issue as one that led us over 20 years ago to develop one of the earliest surgical timeout protocols, which served as a model for subsequent state and national timeout protocols. In those columns we describe how cases of incorrect intraocular lens (IOL) implantation occurred singly (or occasionally multiply) in many hospitals yet those cases and their contributing factors were never shared widely. The same concept, of course, was seen with cases of fatal overdoses from inadvertent injection of concentrated potassium chloride. Those typically occurred as single isolated events in many hospitals and it was only years later that the widespread occurrence of these unfortunate incidents was appreciated and steps taken to remove concentrated potassium chloride from floor stocks.
Even in organizations capable of wider dissemination of lessons learned there is a tendency to wait until several cases have been aggregated before sharing those lessons. But some isolated cases also need to be shared because the circumstances leading to those cases are very likely replicated at multiple other venues. Such an example is another ophthalmological incident, the inadvertent use of methylene blue dye instead of trypan blue (see our prior columns of May 20, 2014 “Ophthalmology: Blue Dye Mixup” and September 2014 “Another Blue Dye Eye Mixup”). When we discussed the first case, we said “we can’t believe this is the first time this has happened”. Then, shortly thereafter a second case was reported. In fact, the second case antedated the first. Perhaps with better dissemination of lessons learned the subsequent case might have been avoided.
In reality, such failure to share is a societal problem. The various legal and public relations consequences of sharing lessons and aggregating similar cases have been among the biggest barriers to implementation of sound patient safety practices.
The “problem of many hands” described by Peerally and colleagues is that incidents with adverse patient outcomes typically have many contributing factors and no one individual or action is entirely responsible for the adverse outcome or the potential solutions. Often those “actors” are even outside the direct control of the organization (for example, manufacturers and suppliers) and hospitals may have little ability to effect changes there.
Perhaps the toughest nut to crack is the complicated issue of blame. The beauty of the RCA is that it stresses identification and remediation of system defects, which are generally more amenable to correction than human behaviors. And it is clear that system defects may put individuals at risk of committing human errors that then result in adverse patient outcomes. One key tenet of the National Patient Safety Foundation’s RCA2 Guidelines is that the process should address only system issues and should not address or focus on individual performance (see our July 14, 2015 Patient Safety Tip of the Week “NPSF’s RCA2 Guidelines”). In fact, NPSF recommends that all organizations define “blameworthy” events and actions that fall outside the purview of the safety system and define how and under what circumstances they will be handled separately. Of course, we would emphasize that system issues that lead to or facilitate improper individual performance must be addressed under the RCA2 process. For example, workarounds are (often) improper individual actions that almost always have a system issue that led to their use. Another example is “normalization of deviance”, where the culture of the system led to acceptance of a certain deviation from proper practice as being “normal” and allowed that deviation to be performed by many individuals.
Peerally and colleagues acknowledge that a “no-blame” approach is not always possible or appropriate and may impede thorough incident investigation, and we often see failure to place blame when placing blame may be appropriate. They note that most of us have adopted the “Just Culture” approach but that tools such as algorithms and decision tools (e.g., the “culpability tree”) have flaws of their own. (Also, don’t forget that in parallel to your RCA process you need to ensure your organization has a means to address the “second victim”, the healthcare workers involved in such incidents. See our December 17, 2013 Patient Safety Tip of the Week “The Second Victim”; we expect to do another column on the “second victim” soon.)
Lastly, they admit we all have problems figuring out how to fit patients and families into the RCA process. Our many columns on critical incident response and disclosure and apology (see list of prior columns below) have emphasized how, after disclosure and apology, we need to keep patients and families in the loop as we complete our RCA’s and implement actions to prevent similar events in the future. But few of us have figured out how to actually include patients or their families in the RCA process itself. Often patients and families have unique perspectives and observations that healthcare workers have not seen (or have been unwilling to admit!). More and more research is demonstrating that patients and families impacted by adverse events are highly motivated to help ensure similar events don’t occur to other patients. While we don’t have the perfect solution to including patients and families on the RCA team, we do recommend that as part of the disclosure and apology process we also appeal to them: “we need your help in determining exactly what happened and how we can prevent similar events”. We’ve also stressed the need to avoid intimidation when such interactions with patients and families occur (see our June 22, 2010 Patient Safety Tip of the Week “Disclosure and Apology: How to Do It”). Don’t hold such meetings in a formal Board Room or have 1-2 family members sitting across a table full of individuals dressed in suits or white coats. You must keep the meeting as cordial as possible, expressing your sincere apology and sincere desire to get their perspectives on the events, and give them plenty of time to ask questions and present their observations and concerns.
See our July 14, 2015 Patient Safety Tip of the Week “NPSF’s RCA2 Guidelines” for many other recommendations to include in your RCA process.
So we’re not really bashing the Brits for bashing the RCA. We’re basically acknowledging problems with the RCA that we’ve been discussing all along and hope that this conversation may lead to improvement in our ability to actually implement useful changes after untoward events or near misses.
Some of our prior columns on RCA’s, FMEA’s, response to serious incidents, etc:
July 24, 2007 “Serious Incident Response Checklist”
March 30, 2010 “Publicly Released RCA’s: Everyone Learns from Them”
March 27, 2012 “Action Plan Strength in RCA’s”
March 2014 “FMEA to Avoid Breastmilk Mixups”
July 14, 2015 “NPSF’s RCA2 Guidelines”
Some of our prior columns on Disclosure & Apology:
July 24, 2007 “Serious Incident Response Checklist”
June 16, 2009 “”
June 22, 2010 “Disclosure and Apology: How to Do It”
September 2010 “Followup to Our Disclosure and Apology Tip of the Week”
November 2010 “ ”
April 2012 “Error Disclosure by Surgeons”
June 2012 “Oregon Adverse Event Disclosure Guide”
Peerally MF, Carr S, Waring J, Dixon-Woods M. The problem with root cause analysis. BMJ Qual Saf 2016; Published Online First 23 June 2016
VA (Veterans Administration). VA National Center for Patient Safety. Root Cause Analysis Tools. REV.02.26.2015
Serious Incident Response Checklist.
Hughes D. Root Cause Analysis: Bridging the Gap Between Ideas and Execution. VA NCPS Topics in Patient Safety TIPS 2006; 6(5): 1,4 Nov/Dec 2006
Weak vs. Strong Responses to an RCA (PowerPoint presentation).
NPSF (National Patient Safety Foundation) RCA2. Improving Root Cause Analyses and Actions to Prevent Harm. NPSF 2015