Patient Safety Tip of the Week

May 23, 2017    Trolling the RCA

 

 

Hard to believe it’s been almost a year since we had to defend the root cause analysis (see our July 12, 2016 Patient Safety Tip of the Week “Forget Brexit – Brits Bash the RCA!”). But now we’re at it again!!

 

The May 2017 issue of BMJ Quality and Safety has several articles questioning whether RCA’s are good or bad for patient safety (Kellogg 2017, Trbovich 2017, Peerally 2017). One of the articles (Peerally 2017) is actually the article published online ahead of print that triggered our 2016 column mentioned above, but together these articles provide the opportunity to again discuss what’s good and what’s bad about RCA’s in healthcare.

 

The RCA is not dead. We’ve always had a strong conviction that the RCA (root cause analysis) is probably the most important learning tool that an organization with a good culture of safety has at its disposal. We encourage organizations to do RCA’s not just on events with bad patient outcomes but on any event that had the potential to induce harm (near-misses). The key problems with RCA’s are:

- failure to include strong actions in the resulting action plans
- failure to follow up to ensure those actions are actually implemented and effective (i.e., failure to close the loop)
- failure to share lessons learned and solutions beyond the individual facility

 

That last problem is a major failure in our healthcare system. How many times do we need to see a problem uncovered in one hospital occur in other hospitals simply because our system does not promote sharing of problems and solutions?

 

The new study was a review of RCA’s from a major academic medical center (Kellogg 2017). Researchers reviewed 302 RCA’s over an 8-year period, 106 of which proposed solutions. The average number of solutions proposed per RCA was 4.7. But a large proportion of the solutions offered were what we would deem “weak” interventions (e.g., training 20%, process change 19.6%, policy reinforcement 15.2%). Moreover, multiple event types recurred during the study period despite repeated RCA’s.

 

Kellogg and colleagues, of course, are correct that many or most RCA’s fail to implement strong interventions as solutions. That echoes a criticism in the second article (Peerally 2017), one with which we strongly agree, that solutions and action plans are poorly designed or implemented. In our March 27, 2012 Patient Safety Tip of the Week “Action Plan Strength in RCA’s” we noted prior studies in the VA system (Hughes 2006) which analyzed action items from RCA’s and found that 30% were not implemented and another 25% were only partially implemented. Stronger action items were more likely to be implemented. Actions that were assigned to specific departments or people were more likely to be implemented than those assigned to general areas. And they found that the patient safety manager plays a critical role in RCA action implementation.

 

In our March 27, 2012 Patient Safety Tip of the Week “Action Plan Strength in RCA’s” we emphasized the importance of tracking whether recommended action steps were implemented following an RCA, whether they were effective, and whether there were any unintended consequences. All too often action steps never get implemented at all, or consist solely of “weak” action steps, and organizations are then surprised when a similar adverse event occurs in the future. Moreover, even the most well-intentioned and well-planned action steps sometimes lead to consequences that were never anticipated. We typically see weak actions like education and training or policy changes as the sole actions undertaken, rather than strong actions like constraints and forcing functions. In that column we included an analogy to the effectiveness of the signs and tools used to try to get drivers to slow down in construction zones on highways. We put those images together in pictures with RCA action items and now incorporate them in our webinar presentations on doing good RCA’s. Click here to see them. Remember: images are more likely to be remembered than words!

 

One of the biggest issues we see in hospitals related to RCA’s is failure to follow up and close the feedback loop. In fact, probably the majority of hospitals lack formal procedures for ensuring the corrective actions recommended in an RCA are actually carried out (or barriers to their implementation identified and alternative steps taken). In our March 30, 2010 Patient Safety Tip of the Week “Publicly Released RCA’s: Everyone Learns from Them” we discussed an incident at a hospital where a similar incident had occurred several years earlier. After the first incident an extensive root cause analysis was done and multiple recommendations were made, including key recommendations that should have prevented the second incident. But those recommendations had never been fully implemented. Importantly, the recommendations were communicated back to those individuals deemed to be in the “need to know” but not widely disseminated to middle or front-line management nor to front-line staff.

 

We recommend you keep a list or table of such identified action items from all your RCA’s to discuss at your monthly patient safety committee or performance improvement committee meetings. Action items should remain on that list until they have been implemented or completed. Only that sort of rigorous discipline will ensure that you did what you said you were going to do, i.e. that you “closed the loop”. And don’t forget you need to monitor your implemented actions for unanticipated and unintended consequences. For example, you might take the strong action of removing a drug from a particular setting, only to realize later that there were circumstances where that drug was needed in that setting.

 

Take a look at the cases discussed in some of our recent columns. Though these were not technically the RCA’s performed by the hospitals, their plans of correction (POC’s) did include the actions taken. And those actions were relatively weak. In our April 25, 2017 Patient Safety Tip of the Week “Dialysis and Alarm Fatigue” the actions taken by the hospital were primarily education and policy changes, both of which are “weak” interventions. They missed the opportunity to implement stronger interventions. One would have been to redesign the alarm system to focus the responder’s attention on the site indicated by the alarm. A second, and the strongest since the best interventions are forcing functions, would have been to program a “hard” stop into the alarm system for this particular alarm, requiring the responder to verify that he or she has inspected the access site. That verification should then become part of the medical record.

 

As above, constraints and forcing functions are the strongest of actions. In our May 20, 2014 Patient Safety Tip of the Week “Ophthalmology: Blue Dye Mixup” and September 2014 What's New in the Patient Safety World column “Another Blue Dye Eye Mixup” we discussed unfortunate cases where methylene blue was used during cataract surgery rather than trypan blue. It is very clear there is a huge system issue here. The system actually put those healthcare workers and the patient in a vulnerable position that allowed the mistake to happen. It is very much akin to the concentrated potassium chloride issue of the past, in which nurses accidentally administered fatal doses of concentrated KCl to patients. There was little reason for nurses to have access to vials of concentrated KCl, yet we placed them on nursing units and it was simply a matter of time until someone unwittingly drew up a syringeful and administered a fatal dose. Our eventual system fix was to remove vials of concentrated KCl from floor stock on nursing units. If you are a facility that only does eye cases, you probably have no need for methylene blue and therefore should not stock it at all. In other facilities that may have a legitimate need for methylene blue (for example, it is used to help identify leaks in some surgeries or to help identify tissue in need of debridement in others) you clearly need to store the two blue dyes separately. If you have a dedicated “eye” room and can store all the medications and materials for eye surgery there (or in an automated dispensing cabinet dedicated to ophthalmology), make sure that methylene blue is not in those areas.

 

But sometimes it is difficult to implement a strong intervention following an RCA. In our May 2, 2017 Patient Safety Tip of the Week “Anatomy of a Wrong Procedure” the major problem was a poor culture of safety. In their recent editorial, Trbovich and Shojania (Trbovich 2017) put “culture change” at the top of their hierarchy of effectiveness as one of the “strong” actions. Of course, it is undoubtedly the strongest of all actions. You’ve often heard us use the phrase “culture trumps ________” (fill in the blank with words like policy, procedure, strategy, tactics, vision, etc.). In fact, “Culture trumps…Everything!”. But, unfortunately, changing the culture is a long-term process, tough to implement and difficult to measure. So while it’s something that desperately needs work, very few RCA’s include it as an action. One possible stronger action to take in that case, and we would consider it an action of only intermediate strength, would have been to cancel any elective case in which copies of the consent and H&P are not available several days prior to the scheduled procedure and then to ensure those documents are available in the OR at the time the procedure is actually done. But even those are relatively weak actions because there is no guarantee that people will follow them. The case also illustrates that even a stronger action (implementation of the WHO Surgical Safety Checklist) can fail because of poor implementation and lack of verification of its use.

 

But that gets us back to our most significant point. Hospitals often fail to include strong actions because they don’t know what those strong actions are. Our inability to disseminate the lessons learned at other facilities, and the solutions adopted there, is, in our minds, the single biggest barrier to improving patient safety. Peerally and colleagues also lament the lack of dissemination of lessons learned and the lack of aggregation of similar events. Ironically, the example Peerally and colleagues used to illustrate lessons not learned was implantation of incorrect intraocular lenses. In several of our columns (most recently in our May 17, 2016 Patient Safety Tip of the Week “Patient Safety Issues in Cataract Surgery”) we’ve described that very issue as the one that led us over 20 years ago to develop one of the earliest surgical timeout protocols, which served as a model for subsequent state and national timeout protocols! In those columns we describe how cases of incorrect intraocular lens (IOL) implantation occurred singly (or occasionally multiply) in many hospitals, yet those cases and their contributing factors were never shared widely. The same concept, of course, was seen with cases of fatal overdoses from inadvertent injection of concentrated potassium chloride. Those typically occurred as single isolated events in many hospitals and it was only years later that the widespread occurrence of this type of incident was appreciated and steps taken to remove concentrated potassium chloride from floor stocks.

 

Even in organizations capable of wider dissemination of lessons learned, there is a tendency to wait until several cases have been aggregated before sharing those lessons. But some isolated cases also need to be shared because the circumstances leading to them are very likely replicated at multiple other venues. Such an example is the ophthalmological incident with the inadvertent use of methylene blue dye instead of trypan blue (see our prior columns of May 20, 2014 “Ophthalmology: Blue Dye Mixup” and September 2014 “Another Blue Dye Eye Mixup”). When we discussed the first case, we said “we can’t believe this is the first time this has happened”. Then, shortly thereafter, a second case was reported. In fact, the second case antedated the first. Perhaps with better dissemination of lessons learned the later case might have been avoided.

 

In reality, such failure to share is a societal problem. The various legal and public relations consequences of sharing lessons and aggregating similar cases have been among the biggest barriers to implementation of sound patient safety practices. PSO’s (Patient Safety Organizations) have the potential to help disseminate lessons learned and solutions but to date have had a limited impact since their work is shared only with their individual member organizations.

 

See our July 12, 2016 Patient Safety Tip of the Week “Forget Brexit – Brits Bash the RCA!” for discussion of several of the other points raised by Peerally et al., such as the problem of “many hands”, political hijacking of the RCA process, challenges of getting unbiased information, timeline issues, conduct of and participation in an RCA, having the right expertise available, figuring out how to fit patients and families into the RCA process, and more.

 

In their recent editorial, Trbovich and Shojania (Trbovich 2017) warn that jumping to corrective actions on the basis of a single case can be problematic. That’s one reason we also recommend periodically reviewing all your RCA’s to cull out recurrent themes. It’s often such collective reviews that make you realize the full importance of root causes that were identified in individual RCA’s.

 

One of the most important things in making your organization’s use of RCA’s successful is having a culture that understands the focus is to uncover system issues that can be fixed to prevent subsequent similar events. That means people should not fear reporting events or speaking openly in RCA interviews. James Bagian does a nice job in an AHRQ interview on RCA’s (AHRQ 2016) explaining that you have to educate people that, when they report in the safety system, they will be held harmless, but that doesn't mean people get a free pass. If there was a “blameworthy act”, the case would be placed on an administrative route where the facts of the event have to be "rediscovered" by the administrative system, which could culminate in punitive action. But if it was not blameworthy, under no circumstances would there ever be punitive action.

 

In our July 12, 2016 Patient Safety Tip of the Week “Forget Brexit – Brits Bash the RCA!” we noted that perhaps the toughest nut to crack is the complicated issue of blame. The beauty of the RCA is that it stresses identification and remediation of system defects, which are generally more amenable to correction than human behaviors. And it is clear that system defects may put individuals at risk of committing human errors that then result in adverse patient outcomes. One key tenet of the National Patient Safety Foundation’s RCA2 Guidelines (NPSF 2015) is that the RCA2 process addresses only system issues and should not address or focus on individual performance (see our July 14, 2015 Patient Safety Tip of the Week “NPSF’s RCA2 Guidelines”). In fact, NPSF recommends that all organizations define “blameworthy” events and actions that fall outside the purview of the safety system and define how and under what circumstances they will be handled separately. Of course, we would emphasize that system issues that lead to or facilitate improper individual performance must be addressed under the RCA2 process. For example, workarounds are (often) improper individual actions that almost always have a system issue that led to their use. Another example is “normalization of deviance”, where the culture of the system led to acceptance of a certain deviation from proper practice as being “normal” and allowed that deviation to be performed by many individuals.

 

Peerally and colleagues acknowledge that a “no-blame” approach is not always possible or appropriate and may impede thorough incident investigation, and we often see failure to place blame when placing blame may be appropriate. They note that most of us have adopted the “Just Culture” approach but that algorithms and decision tools (e.g., the “culpability tree”) have flaws of their own. Also, don’t forget that in parallel to your RCA process you need to ensure your organization has a means to address the “second victim”, i.e. the healthcare workers involved in such incidents (see also our December 17, 2013 Patient Safety Tip of the Week “The Second Victim” and August 9, 2016 “More on the Second Victim”).

 

Alternatives to the RCA are available. Gupta and Lyndon (Gupta 2017) recently reviewed many of the issues raised in the Peerally article and in the RCA2 documents (see our July 14, 2015 Patient Safety Tip of the Week “NPSF’s RCA2 Guidelines”) and note several other review tools and techniques that can be applied effectively in certain situations, often with less time and effort. The FMEA (Failure Mode and Effects Analysis) is also a great tool to identify system issues that may lead to patient safety events. But an FMEA is time-consuming and most organizations can do only one or two in a year. FMEA’s also deal with a lot of theoretical issues or “what ifs” that staff may consider unlikely to occur. On the other hand, the RCA is typically done after an actual event or near miss that grabs everyone’s attention and hammers home that we need to make changes to avoid another event. The best case, however, is seeing the RCA from someone else’s facility, saying “Wow! That could happen here!”, and implementing changes at your facility before you have an untoward event.

 

See our Patient Safety Tips of the Week for July 14, 2015 “NPSF’s RCA2 Guidelines” and July 12, 2016 “Forget Brexit – Brits Bash the RCA!” for many other recommendations to include in your RCA process.

 

 

So our message to all the detractors of RCA’s…stop trolling the RCA! The RCA is a great learning tool. The problem lies in doing it right and in what you do with it afterwards. Instead, help fix the barriers that prevent the sharing of lessons learned and of the solutions that utilize strong actions to help prevent patient safety incidents in all healthcare settings.

 

 

 

Some of our prior columns on RCA’s, FMEA’s, response to serious incidents, etc:

July 24, 2007               Serious Incident Response Checklist

March 30, 2010           Publicly Released RCA’s: Everyone Learns from Them

April 2010                   RCA: Epidural Solution Infused Intravenously

March 27, 2012           Action Plan Strength in RCA’s

March 2014                 FMEA to Avoid Breastmilk Mixups

July 14, 2015               NPSF’s RCA2 Guidelines

July 12, 2016               Forget Brexit – Brits Bash the RCA!

 

 

References:

 

 

Kellogg KM, Hettinger Z, Shah M, et al. Our current approach to root cause analysis: is it contributing to our failure to improve patient safety? BMJ Qual Saf 2017; 26(5): 381-387

http://qualitysafety.bmj.com/content/26/5/381

 

 

Trbovich P, Shojania KG. Root-cause analysis: swatting at mosquitoes versus draining the swamp. BMJ Qual Saf 2017; 26(5): 350-353

http://qualitysafety.bmj.com/content/26/5/350

 

 

Peerally MF, Carr S, Waring J, Dixon-Woods M. The problem with root cause analysis. BMJ Qual Saf 2017; 26(5): 417-422

http://qualitysafety.bmj.com/content/26/5/417

 

 

Hughes D. Root Cause Analysis: Bridging the Gap Between Ideas and Execution. VA NCPS Topics in Patient Safety TIPS 2006; 6(5): 1,4  Nov/Dec 2006

http://www.patientsafety.va.gov/docs/TIPS/TIPS_NovDec06.pdf#page=1

 

 

Weak vs. Strong Responses to an RCA (PowerPoint presentation).

http://patientsafetysolutions.com/docs/RCA_strong_vs_weak_responses.ppt

 

 

AHRQ. Perspectives on Safety. In Conversation With... James P. Bagian, MD, PE. Root Cause Analysis: What Have We Learned? AHRQ PSNet 2016; Published December 2016

https://psnet.ahrq.gov/perspectives/perspective/211/in-conversation-with--james-p-bagian-md-pe

 

 

NPSF (National Patient Safety Foundation) RCA2. Improving Root Cause Analyses and Actions to Prevent Harm. NPSF 2015

http://c.ymcdn.com/sites/www.npsf.org/resource/resmgr/PDF/RCA2_first-online-pub_061615.pdf

 

 

Gupta K, Lyndon A. Perspectives on Safety. Annual Perspective 2016. Rethinking Root Cause Analysis. AHRQ PSNet 2017; January 2017

https://psnet.ahrq.gov/perspectives/perspective/216/rethinking-root-cause-analysis

 

 

 

 

 
