We still have a strong conviction that the RCA (root cause analysis) is probably the most important learning tool an organization with a good culture of safety has at its disposal. We encourage organizations to do RCAs not just on events with bad patient outcomes but on any event that had the potential to cause harm (near misses).
But when you do your RCAs, you need to track the action items you identified for each RCA and determine:
1) were they actually implemented?
2) were they effective?
3) were there any unintended consequences?
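One way to enforce that follow-up is to give every action item a record with those three questions as explicit fields. A minimal sketch (the schema and field names here are our own, hypothetical illustration, not a prescribed format):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ActionItem:
    """One RCA action item, tracked to closure (hypothetical schema)."""
    rca_id: str
    description: str
    owner: str                         # a named person or department, not a vague area
    implemented: bool = False          # 1) was it actually implemented?
    effective: Optional[bool] = None   # 2) was it effective? (None = not yet assessed)
    unintended_consequences: str = ""  # 3) were there any unintended consequences?

def open_items(items: List[ActionItem]) -> List[ActionItem]:
    """Items still needing discussion at the monthly committee meeting:
    anything not yet implemented or not yet assessed for effectiveness."""
    return [i for i in items if not i.implemented or i.effective is None]
```

Reviewing the output of something like `open_items` at each monthly meeting is one concrete way to make sure nothing recommended in an RCA quietly disappears.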
We recommend you keep a running list or table of the action items identified in all your RCAs to discuss at your monthly patient safety committee or performance improvement committee meetings. Only that sort of rigorous discipline will ensure that you did what you said you were going to do, i.e. that you “closed the loop”. It is amazing how often you do an RCA on a new event and realize you had a previous similar event and that the action items you recommended to prevent recurrence were never carried out. Our March 30, 2010 Patient Safety Tip of the Week “Publicly Released RCA’s: Everyone Learns from Them” discussed an RCA done on a case in which enteral feedings were inadvertently given intravenously. At that hospital there had been a similar incident several years earlier. After that incident an extensive root cause analysis was done and multiple recommendations were made, including key recommendations that should have prevented the current incident. But those recommendations had never been fully implemented. More importantly, the recommendations were communicated back to those individuals deemed to “need to know” but were not widely disseminated to middle or frontline management, nor to frontline staff.
Auditing for compliance with the recommendations is a good way to identify whether risky situations persist. Think about this issue at your organization: how many times have you done an RCA and found that your frontline staff were completely unaware of an intervention you thought you had implemented after a previous incident? We will answer that for you: it happens all the time!
A new study (Morse & Pollack 2012) looks at another issue you should be monitoring in your RCAs – the strength of action items. We classify actions in RCAs as weak, intermediate, or strong based upon the likelihood that they will produce the desired effect and prevent similar occurrences in the future. Obviously, we’d like to make sure that our RCA action items don’t all fall in the weak category and that each RCA contains at least one strong action item. Morse and Pollack looked at 20 RCAs that had been done at their children’s hospital and categorized all the action items, using the VA National Center for Patient Safety’s recommended hierarchy of actions. Their 20 RCAs contained 78 action plans, which they categorized as weaker (46.2%), intermediate (43.6%), and stronger (10.3%). Overall, 90% of their RCAs had at least some intermediate or stronger action items, but 10% had only weaker ones.
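The arithmetic behind those percentages is easy to reproduce. A quick sketch (the raw counts of 36, 34, and 8 are our own inference from the reported percentages of 78 items, not figures stated in the paper):

```python
from collections import Counter

# Strength tallies inferred from Morse & Pollack's reported percentages of 78 items.
tally = Counter({"weaker": 36, "intermediate": 34, "stronger": 8})
total = sum(tally.values())  # 78

for strength, n in tally.items():
    print(f"{strength}: {n}/{total} = {100 * n / total:.1f}%")
# weaker: 36/78 = 46.2%
# intermediate: 34/78 = 43.6%
# stronger: 8/78 = 10.3%
```

Keeping a similar tally of your own RCA action items over time makes it easy to see whether your organization is drifting toward the weaker end of the hierarchy.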
One day, while driving on the Pennsylvania Turnpike to a patient safety conference, we passed through numerous road construction zones. We were struck by the myriad signs and tools used to try to get drivers to slow down in these zones. Some resulted in virtually no drivers slowing down, others got them to slow down a little, and some really got them under the speed limit. The analogy to the strength of RCA action items was striking! So we paired those pictures with RCA action items and now incorporate them in our webinar presentations on doing good RCAs. Click here to see them. Remember: images are more likely to be remembered than words!
Morse and Pollack did note that 4 of their 78 recommended action plans (5%) never got implemented. We’d say that is in keeping with our experience at multiple healthcare organizations. There were a variety of reasons for that failure of implementation but most involved lack of leadership support, usually because the resources needed did not seem commensurate with the proposed value of the action item (see our September 15, 2009 Patient Safety Tip of the Week “ETTO’s: Efficiency-Thoroughness Trade-Offs”).
A prior study in the VA system (Hughes 2006) analyzed action items from RCAs and found that 30% were never implemented and another 25% were only partially implemented. Stronger action items were more likely to be implemented, as were actions assigned to specific departments or individuals rather than to general areas. The study also found that the patient safety manager plays a critical role in implementation of RCA actions.
Hopefully your patient safety or performance improvement committees have broad multidisciplinary representation and don’t just consist of department managers. Discussing the strength of action items from RCAs, and ensuring that action items actually got implemented, in such multidisciplinary meetings helps raise the bar for patient safety. Moreover, it gets people to think outside their departmental silos and bring lessons learned to the broader audience. It is amazing how some events get reviewed only at a departmental or unit level and the lessons learned never get shared with others. We once did an RCA on a case where lack of timely response to the temperature sensor alarm in a refrigerator led to loss of some stored products. In doing that RCA we tried to identify other systems using temperature (or other) sensors and alarms that might be similarly vulnerable. In fact, there had been at least three similar prior events in other areas. Yet those events had been dealt with strictly at the departmental level, and the lessons learned were not shared with other departments. No lesson learned is so small that it shouldn’t be shared with others. A high-performing health system has a culture of disseminating lessons learned no matter how “small” those lessons may seem. Moreover, knowing which types of action plans actually work is invaluable.
Morse RB, Pollack MM. Root Cause Analyses Performed in a Children's Hospital: Events, Action Plan Strength, and Implementation Rates. Journal of Healthcare Quality 2012; 34(1): 55–61.
Hughes D. Root Cause Analysis: Bridging the Gap Between Ideas and Execution. VA NCPS Topics in Patient Safety (TIPS) 2006; 6(5): 1, 4.
VA National Center for Patient Safety. RCA Tools: Actions & Outcomes. Recommended Hierarchy of Actions.