Patient Safety Tip of the Week

October 9, 2012

Call for Focus on Diagnostic Errors

 

 

Recently Mark Graber, whose work on diagnostic error we have often cited, along with Bob Wachter and Christine Cassel, issued a call for a new focus on diagnostic error in patient safety efforts (Graber 2012). Diagnostic error has been very much underrepresented in patient safety research and literature. You’ve all undoubtedly heard the oft-quoted statistic that the seminal IOM report “To Err is Human…” cites medication errors some 70 times yet mentions diagnostic errors only twice. Yet diagnostic errors occur frequently in all care settings. In fact, on the ambulatory side they are the leading cause of malpractice claims. And we do very little to educate medical students and residents/fellows about recognizing and preventing diagnostic errors. Graber et al. call for action at the medical school, residency, specialty society, and policy levels to address diagnostic error and to promote research on best practices for avoiding it.

 

We’ve already done several columns on diagnostic error, including our Patient Safety Tips of the Week for September 28, 2010 “Diagnostic Error”, November 29, 2011 “More on Diagnostic Error”, and May 15, 2012 “Diagnostic Error Chapter 3”, plus several others, listed at the end of today’s Tip, on the way(s) we think and the various cognitive biases that affect our clinical decision making.

 

System errors may play an important role in producing errors or delays in diagnosis. Last week’s Patient Safety Tip of the Week “Test Results: Everyone’s Worst Nightmare” had links to many of our prior columns on failures in followup of test results. Such system errors have received attention in the patient safety literature, but errors in the cognitive aspects of the diagnostic process have received far less.

 

It is very clear that diagnostic error occurs at virtually every level of our continuum of care. A recent UK study found that almost a third of preventable hospital inpatient deaths were due to diagnostic error (Hogan 2012), and that those errors occurred at all stages of the diagnostic process.

 

A systematic review of autopsy studies on ICU patients (Winters 2012) found that 28% of autopsies in such patients revealed at least one misdiagnosis and 8% revealed a potentially lethal diagnostic error. The most commonly misdiagnosed conditions were pulmonary embolism, myocardial infarction, pneumonia, and aspergillosis.

 

Gehring and colleagues (Gehring 2012) used a survey methodology to assess the frequency of various patient safety incidents in primary care settings. While diagnostic errors were not the most frequent events in the study, errors or delays in diagnosis were the most common cause of events associated with at least minor harm and the most frequent cause of events leading to severe harm or death.

 

Our May 15, 2012 Patient Safety Tip of the Week “Diagnostic Error Chapter 3” focused on diagnostic error in ambulatory settings. We noted a paper on diagnostic errors in primary care (Ely 2012) highlighting that diagnostic errors were often preceded by common symptoms and by common, relatively benign initial diagnoses. The three most common lessons learned in their review were (1) consider diagnosis X in patients presenting with symptom Y, (2) look beyond the initial, most obvious diagnosis, and (3) be alert to atypical presentations of disease. The authors note how mental shortcuts and cognitive biases such as anchoring, premature closure, and diagnostic momentum frequently lead to diagnostic errors. Broadening the differential diagnosis and always considering the “don’t miss” diagnoses were important themes. They recommend de-biasing strategies such as diagnostic timeouts and use of checklists (Ely 2011), noting that these strategies do not yet have a well-developed evidence base.

 

Another recent study focused on early warning signs for diagnostic errors (Balla 2012). The authors identified the initiation and closure of the cognitive process as the stages most exposed to the risk of diagnostic error. At initiation of the diagnostic process “framing” occurs and sets the frame for the subsequent information search, so biases occurring at initiation, such as framing bias, can ultimately influence errors that occur at the end of the process. The warning signs they identified were:

·        Presenting with a diagnosis label. That could refer either to a patient coming to you already labeled with a specific diagnosis or to the clinician simply jumping to an “obvious” diagnosis without thinking about what else it could be.

·        A psychosocial or behavioral label. We’ve done campaigns at hospitals to avoid the practice of “negative labeling” because all too often we’ve seen patients with serious conditions suffer delayed diagnoses because staff chalked them up to some label applied during prior interactions.

·        Ignoring red flags. Confirmation bias includes not only putting weight on information that confirms your impression but also discounting disconfirming evidence or information. They note many cases of ignoring red flags or critical clues because the approach taken was “ruling in” the early diagnosis rather than “ruling out” the more serious possible condition.

·        Ignoring the possibility of serious disease with low probability.

·        Using the wrong clinical features to rule out a condition. The example given was not considering ectopic pregnancy in a patient presenting with bleeding but lacking pain (pain being a cardinal symptom of ectopic pregnancy in the clinician’s mind).

·        Ignoring gut feelings.

 

While we have talked about the dangers of the many biases that appear early in our diagnostic reasoning, we’ve often stressed the importance of not ignoring the “gut feeling” that something is wrong. The aviation safety literature often talks about the “queasy” feeling that alerts pilots that something is going wrong. Van den Bruel and colleagues, who have done much work on recognizing serious infections in children, recently wrote about the role clinicians’ gut feelings play in such recognition (van den Bruel 2012). They noted that acting on the “gut feeling” had the potential to prevent two serious infections from being missed at the expense of 44 false alarms. Moreover, compared to the clinical impression, the gut feeling was consistently more specific regardless of the child’s age or diagnosis or the seniority of the physician (though more experienced clinicians were less likely to experience gut feelings).
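
Put in perspective, those figures work out to roughly 22 false alarms for each additional serious infection caught. Here is a minimal sketch of that back-of-the-envelope arithmetic (our own illustration of the trade-off, not a calculation from the paper):

```python
# Back-of-the-envelope arithmetic using the figures reported in van den Bruel 2012:
# acting on clinicians' gut feelings could have prevented 2 serious infections
# from being missed, at the cost of 44 false alarms.
prevented_misses = 2   # serious infections no longer missed
false_alarms = 44      # children investigated or referred unnecessarily

# False alarms incurred per additional serious infection caught
ratio = false_alarms / prevented_misses
print(f"About {ratio:.0f} false alarms per serious infection caught")  # About 22
```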

 

Note also that when we teach students, residents, and nurses about ways to get around medical hierarchy barriers, one of the methods we recommend is the simple but very powerful statement “I’ve just got this uneasy feeling that something is not right…”. That will usually get even the most detached attending to refocus.

 

And many of us have learned that the patient with very vague symptoms who, nevertheless, conveys a “feeling of impending doom” must be taken extremely seriously. It’s akin to the “gut feeling” that healthcare workers or families may also convey. These are very important “red flags” that we simply cannot ignore.

 

The Balla article (Balla 2012) provides a nice algorithm that might be used to incorporate these warning signs into reflective reasoning.

 

Another recent study (Ogdie 2012) looked at cognitive biases that may lead to diagnostic errors and also included a great discussion of how contextual factors interact with those biases. Though the study focused on diagnostic errors encountered by internal medicine residents, the findings are applicable to all providers and all settings. Anchoring, availability bias, and the framing effect were the most common biases identified, but all of the biases we’ve discussed in previous columns were seen. The authors broke the contextual factors down into team and provider factors, system and environmental factors, and patient-related factors. For example, context greatly influenced the framing effect, which was often associated with the patient providing a vague history, the resident being too busy or having too many patients, the resident providing only temporary coverage for a patient, or the patient being transferred from one service to another. They did not include premature closure as its own bias (because it is so strongly influenced by the other cognitive biases) but did note that premature closure may be influenced by such contextual factors as blind obedience (to the hierarchy), overreliance on a consultant, lack of interest in a patient’s case, or even lack of confidence.

 

 

Even though diagnostic error remains grossly underemphasized in the patient safety literature, there have been excellent contributions on cognition and the diagnostic process from individuals like Mark Graber, Pat Croskerry, John Ely, Gordon Schiff, Hardeep Singh, Jerry Groopman, Gary Klein, and many others. You can find many of these in our previous columns mentioned below.

 

 

Some of our prior Patient Safety Tips of the Week on diagnostic error:

 

·        September 28, 2010     Diagnostic Error

·        November 29, 2011     More on Diagnostic Error

·        May 15, 2012              Diagnostic Error Chapter 3

·        May 29, 2008             If You Do RCA’s or Design Healthcare Processes…Read Gary Klein’s Work

·        August 12, 2008           Jerome Groopman’s “How Doctors Think”

·        August 10, 2010           It’s Not Always About The Evidence

·        January 24, 2012          Patient Safety in Ambulatory Care

 

 

·        And our review of Malcolm Gladwell’s “Blink” in our Patient Safety Library

 

References:

 

 

Graber ML, Wachter RM, Cassel CK. Bringing Diagnosis Into the Quality and Safety Equations. JAMA 2012; 308(12): 1211-1212

http://jama.jamanetwork.com/article.aspx?articleid=1362034

 

 

Hogan H, Healey F, Neale G, et al. Preventable deaths due to problems in care in English acute hospitals: a retrospective case record review study. BMJ Qual Saf 2012; 21(9): 737-745 doi:10.1136/bmjqs-2011-001159

http://qualitysafety.bmj.com/content/21/9/737.full.pdf+html?sid=f5528442-8374-46c0-9197-08c35b3a20bb

 

 

Winters B, Custer J, Galvagno SM, et al. Diagnostic errors in the intensive care unit: a systematic review of autopsy studies. BMJ Qual Saf 2012; published online first 21 July 2012

http://qualitysafety.bmj.com/content/early/2012/07/23/bmjqs-2012-000803.abstract?sid=1608742a-f3dd-4433-994d-b6207b563b9e

 

 

Gehring K, Schwappach DLB, Battaglia M, et al. Frequency of and Harm Associated With Primary Care Safety Incidents. Am J Manag Care 2012; 18(9): e323-e337

http://www.ajmc.com/articles/Frequency-of-and-Harm-Associated-With-Primary-Care-Safety-Incidents

 

 

Balla J, Heneghan C, Goyder C, Thompson M. Identifying early warning signs for diagnostic errors in primary care: a qualitative study. BMJ Open 2012; 2: e001539

http://bmjopen.bmj.com/content/2/5/e001539.full.pdf+html

 

 

Van den Bruel A, Thompson M, Buntinx F, Mant D. Clinicians’ gut feeling about serious infections in children: observational study. BMJ 2012; 345: e6144 (Published 25 September 2012)

http://www.bmj.com/content/345/bmj.e6144.pdf%2Bhtml

 

 

Ogdie AR, Reilly JB, Pang WG, et al. Seen Through Their Eyes: Residents’ Reflections on the Cognitive and Contextual Components of Diagnostic Errors in Medicine. Academic Medicine 2012; 87(10): 1361-1367

http://journals.lww.com/academicmedicine/Abstract/2012/10000/Seen_Through_Their_Eyes___Residents__Reflections.18.aspx

 

 

Ely JW, Kaldjian LC, D’Alessandro DM. Diagnostic errors in primary care: lessons learned. J Am Board Fam Med 2012; 25(1): 87-97

http://www.jabfm.org/content/25/1/87.full

 

 

Ely JW, Graber M, Croskerry P. Checklists to reduce diagnostic errors. Academic Medicine 2011; 86(3): 307-313

http://journals.lww.com/academicmedicine/Fulltext/2011/03000/Checklists_to_Reduce_Diagnostic_Errors.17.aspx