Adverse events, whether in healthcare or other industries, typically result from a combination of human errors and system issues. In developing interventions to prevent adverse events, we most often focus on the system issues. The reason? System issues are easier to address than human errors.
But we cannot afford to ignore the human errors. A recent study provided an important contribution to better understanding the role of human errors in surgical adverse events. Suliburk and colleagues (Suliburk 2019) developed a new taxonomy to classify human performance deficiencies (HPD’s) and used this tool in analyzing adverse surgical events that occurred in a level I municipal trauma center, a quaternary care university hospital, and a US Veterans Administration hospital. They found that, of the 188 adverse events reviewed, 106 (56.4%) were associated with HPD’s.
Their new taxonomy classified the HPD's into 5 classes:

Planning or problem solving (Class I)
- Guideline or protocol misapplication
- Knowledge deficit
- Diagnostic cognitive bias
- Treatment cognitive bias
- Latent mistakes

Execution (Class II)
- Lack of recognition
- Lack of attention
- Memory lapse
- Technical error

Rules violation (Class III)
- Ignoring routine or cutting corners
- Optimizing or personal gain
- Situational or time pressure

Communication (Class IV)
- Absent
- Assumed
- Misinterpreted

Teamwork (Class V)
- Ill-defined roles or lack of leadership
- Lack of group expertise
- Failure to evaluate progress
The 192 human performance deficiencies identified were categorized as:
- execution (51.0%)
- planning or problem solving (28.6%)
- communication (12.5%)
- teamwork (4.7%)
- rules violation (3.1%)
Cognitive errors, arising in the execution of care or in case planning or problem solving, accounted for 51.6% of HPD's and were the most common type.
Lack of recognition was the most prevalent cognitive error. The authors gave 2 specific examples of how cognitive errors contributed to adverse outcomes. In one case, a significant technical error in a surgeon’s performance of an operation was likely precipitated by the surgeon’s distraction by an outside telephone call in the operating room, an influence that initially went unappreciated. In a second case, a stylus that was inadvertently retained postoperatively was clearly visible on imaging but repeatedly went unrecognized in the radiologists’ reports.
Sound familiar? Just look back at our August 20, 2019 Patient Safety Tip of the Week “Yet Another (Not So) Unusual RSI”. That column featured a case where telephone calls distracted OR staff, contributing to a retained surgical instrument, and a case where a retained surgical instrument was not initially recognized as such on radiographic images.
Suliburk and colleagues note that, in their second example, the radiologists’ lack of attention was not the only cognitive error. There was also likely confirmation bias: the clinicians probably dismissed their own concerns because those concerns were not validated in the official radiology reports.
The authors suggest that simulation-based training could be used with surgical teams to reinforce systems-based safety constructs. Playbacks of real-life scenarios could be used, akin to training performed in the aviation and aerospace industries. For example, replay of their first example case could provide an opportunity for behavioral training on resetting after intraoperative distractions. Replay of the second example might be used to teach clinicians to avoid losing their situational awareness to the convenience of alternative data (i.e., to avoid confirmation bias).
Interventions to avoid human error are difficult to implement and sustain. Our mantra “stories, not statistics” is especially valid here. Simply making everyone aware of the statistics in the Suliburk study would likely not prevent any adverse events. But tying the stories to simulation training is a good way of getting healthcare professionals to develop better situational awareness and to recognize the factors and situations that make cognitive errors more likely.
The Suliburk study is a timely reminder that we cannot rely purely on system improvements to avoid adverse events. Just as we have a renewed focus on diagnostic error, we need to have a broader focus on cognitive errors that influence performance in multiple facets of healthcare.
That said, another recent study challenges the assertion that communication problems are the leading cause of medical errors. Clapper and Ching (Clapper 2019) performed a systematic review of articles retrieved with the search terms “medical errors, research, and communication”. Among the 42 studies that met their inclusion criteria, three categories of errors were dominant: errors of commission (47.6%), errors of omission (14.2%), and errors through communication (9.5%), though there was some overlap. In 28.5% of the studies, all three categories together contributed significantly to error. It’s just another reminder that, while we always focus on system issues and communication because those are the contributing factors most amenable to solutions, human errors remain important contributors to adverse events.
And, while human error is a major factor in most serious adverse events in healthcare, don’t forget that our systems and poor communication often put healthcare workers in a position where human errors will have devastating consequences. Recall that, in our analysis of a recent neuromuscular blocking agent tragedy in which human error was a proximate cause (see our Patient Safety Tips of the Week for December 11, 2018 “Another NMBA Accident” and February 12, 2019 “From Tragedy to Travesty of Justice”), we identified at least 15 other factors that contributed and, had they been fixed, might have averted the tragedy.
So, while the work of Suliburk et al. reminds us not to underestimate the role of human error in healthcare accidents, lessons learned in well-done root cause analyses (RCA’s) teach us about the complex interaction between humans and the systems we create.
Some of our prior columns on RCA’s, FMEA’s, response to serious incidents, etc.:
July 24, 2007 “Serious Incident Response Checklist”
March 30, 2010 “Publicly Released RCA’s: Everyone Learns from Them”
April 2010 “RCA: Epidural Solution Infused Intravenously”
March 27, 2012 “Action Plan Strength in RCA’s”
March 2014 “FMEA to Avoid Breastmilk Mixups”
July 14, 2015 “NPSF’s RCA2 Guidelines”
July 12, 2016 “Forget Brexit – Brits Bash the RCA!”
May 23, 2017 “Trolling the RCA”
See also our prior columns on team training, huddles, briefings, and debriefings.
References:
Suliburk JW, Buck QM, Pirko CJ, et al. Analysis of Human Performance Deficiencies Associated With Surgical Adverse Events. JAMA Netw Open 2019; 2(7): e198067. Published online July 31, 2019.
https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2740065
Clapper TC, Ching K. Debunking the myth that the majority of medical errors are attributed to communication. Med Educ 2019; 00: 1-8.
https://onlinelibrary.wiley.com/doi/abs/10.1111/medu.13821