The Joint Commission has now approved the new National Patient Safety Goal for 2014 on clinical alarm safety and management (NPSG 06.01.01). The content is pretty much as described in our What’s New in the Patient Safety World columns for February 2013 “Joint Commission Proposes New 2014 National Patient Safety Goal” and May 2013 “Joint Commission Sentinel Event Alert: Alarm Safety”. However, the announcement (Joint Commission 2013) has a considerable change in the expected timeline, allowing hospitals much more time for full implementation. The Joint Commission has done this in view of ongoing studies developing best practices for alarm management.
The new NPSG requires that, as of July 1, 2014, all affected organizations establish alarm system safety as a priority and, during 2014, identify the most important alarm signals to manage (based on input from staff, risk to patients, clinical necessity of alarms vs. unnecessary alarms that might contribute to noise and alarm fatigue, potential for patient harm based on internal incident history, and published best practices and guidelines).
However, it then gives organizations until January 1, 2016 to develop policies and procedures and staff educational programs to manage the alarms identified in the above exercise.
We certainly hope that this “reprieve” does not discourage organizations from moving forward expeditiously in improving alarm system management. The Joint Commission recognizes that many best practices will be developed or refined over the coming years and presumably is allowing for such changes to impact systems.
A good place to start is our July 2, 2013 Patient Safety Tip of the Week “Issues in Alarm Management”, which has some comprehensive recommendations about alarm safety and management. It has many links to useful resources and to many of our prior columns on alarm-related issues, which include some chilling stories of adverse outcomes related to alarm problems and their root causes.
Prior Patient Safety Tips of the Week pertaining to alarm-related issues:
The Joint Commission. The Joint Commission Announces 2014 National Patient Safety Goal. July 19, 2013
AAMI HTSI website (AAMI Healthcare Technology Safety Institute).
ECRI Institute. Alarm Safety Resource Site.
AORN (the Association for periOperative Registered Nurses) has provided many important patient safety resources over the years. Now, through a survey of its membership, AORN has developed a list of the top 10 patient safety issues (Steelman 2013). Their top 10 are:
1. Preventing wrong site/procedure/patient surgery
2. Preventing retained surgical items
3. Preventing medication errors
4. Preventing failures in instrument reprocessing
5. Preventing pressure injuries
6. Preventing specimen management errors
7. Preventing surgical fires
8. Preventing perioperative hypothermia
9. Preventing burns from energy devices
10. Responding to difficult intubation or airway emergencies
Each of the topics comes with a brief summary of the issue at hand and recommendations for what more can be done. Their recommendations are both evidence-based and practical. The sections on preventing instrument reprocessing failures, preventing burns from energy devices, and preventing specimen management errors are particularly informative.
If you are at all involved in perioperative management of patients, this is an outstanding resource that you will find very useful.
Steelman VM, Graling PR. Top 10 Patient Safety Issues: What More Can We Do? AORN Journal 2013; 97(6): 679-701
We’ve done multiple columns on the risks of inpatient suicide, not just of psychiatric inpatients but also patients on med/surg units (see the list at the end of today’s column).
A new retrospective analysis from the Mayo Clinic of suicide attempts by med/surg inpatients was recently published (Shekunov 2013). The authors found 8 suicide attempts among 777,404 med/surg inpatient admissions over a 12-year period (only one attempt proved fatal). They then developed case-control matches for these and compared characteristics of those patients who attempted suicide to those who did not.
While the overall rate of suicide attempts on med/surg units was low, the article shows that suicide can occur on med/surg units and underscores some important points to help prevent inpatient suicides. Half the patients had psychiatric consultations prior to the attempted suicide, though none had expressed suicidal intent in proximity to the attempt. The patients also had a higher likelihood of prior suicide attempts compared to controls. Importantly, stressors were identified in most. Inadequately controlled pain was considered a contributing factor in three and agitation or anxiety in two. Acute delirium, insomnia, and psychosocial difficulties were contributing factors in one each.
Compared to the literature, which suggests that med/surg inpatient suicide attempts tend to be more violent, less violent means were used in most of the cases in the current study. Overdose was the method used in half. Two lacerated their wrists, one attempted strangulation using a blood pressure cuff, and one swallowed several objects.
Perhaps the most salient lesson learned was that all the overdoses were attempted with medications the patients had brought in from home. That emphasizes the need for every healthcare facility to have strict policies and procedures on managing medications brought in from home. Medications brought in from home by patients, often unbeknownst to hospital staff, are a significant problem. An excellent Patient Safety Advisory from the Pennsylvania Patient Safety Authority (PPSA) in 2012 found over 900 medication errors in less than 7 years related to medications brought into hospitals by patients (PPSA 2012).
Note that all the suicide attempts in the Shekunov study occurred in the patient room (5 near the bed and 3 in the bathroom). We have stressed previously the importance of looking in other areas for suicide risk factors as well. For example, the bathroom in the radiology suite is a potentially vulnerable area.
We hope that you’ll look at our previous columns on the issue of inpatient suicide since they have lots of information and recommendations about identifying patient-level risk factors, environmental risk factors, precipitating events, assessment tools, and system interventions to reduce the chances of inpatient suicide.
Some of our prior columns on preventing hospital suicides:
· January 6, 2009 Patient Safety Tip of the Week “Preventing Inpatient Suicides”
· February 9, 2010 Patient Safety Tip of the Week “More on Preventing Inpatient Suicides”
· March 16, 2010 Patient Safety Tip of the Week “A Patient Safety Scavenger Hunt”
· December 2010 What’s New in the Patient Safety World column “ ”
· September 27, 2011 Patient Safety Tip of the Week “The Canadian Suicide Risk Assessment Guide”
· December 2011 What’s New in the Patient Safety World column “Columbia Suicide Severity Rating Scale”
· July 2012 “VA Checklist Reduces Suicide Risk”
Shekunov J, Geske JR, Bostwick JM. Inpatient medical–surgical suicidal behavior: a 12-year case–control study. Gen Hosp Psychiatry 2013; 35(4): 423–426
PPSA (Pennsylvania Patient Safety Authority). Patients Taking Their Own Medications While in the Hospital. Pa Patient Saf Advis 2012; 9(2): 50-57
Most of the literature on diagnostic error discusses two primary modes of decision making, “intuitive” vs. “rational” (also known as “analytical”). In our November 29, 2011 Patient Safety Tip of the Week “More on Diagnostic Error” we noted that it’s estimated we spend up to 95% of our time using the intuitive mode. In the intuitive mode we basically use a form of pattern recognition, drawing on our previous experiences. One of the key concepts is that we often do most of this thinking at a subconscious level.
But there is another form of intuition that sometimes enters the diagnostic process. A recent study (Woolley 2013) concluded that, rather than admonishing clinicians not to trust their intuition, we need to better understand the nature of various “intuitive” processes. Those authors make a distinction between making diagnoses based upon “first impressions” vs. “intuition”. They note that first impressions, while often using automatic, nonanalytical thinking, may still be relatively rational and justifiable. On the other hand, many clinicians consider their intuitions to be more like “gut feelings” where they do not understand the basis and often consider them irrational.
They recruited family physicians to conduct their study. Each was asked to identify 2 occasions where they felt they knew the diagnosis (or prognosis) but did not know why, one case for which they were correct and one in which they were incorrect. After conducting interviews and applying the Critical Decision Method to analyze the cases, three types of decision process emerged: gut feelings, recognitions, and insights.
“Gut feelings” were the most common. These were cases where, during initial data gathering, a feeling cast doubt over the initial interpretation. That feeling signaled alarm, often in response to a single cue that “did not seem right” or an unexpected pattern of cues. Sometimes they did not recognize what the nonfitting cues meant. At other times they were aware of some basis for their feeling but thought it was not evidence-based or supported by guidelines. They often believed their colleagues would have acted differently. One example was a 28-year-old man with flu-like symptoms whom the physician sent to the emergency room despite colleagues feeling he had nothing urgent. The patient turned out to have meningococcal septicemia.
“Recognitions” were instances where a diagnosis was formulated quickly with little information. These differ from first impressions in that the physicians may have been aware of conflicting information or absence of key symptoms and signs. In one example, a physician suspected alcohol abuse in a patient who vehemently denied it. The physician could see no one feature that stamped the case as alcohol abuse but found multiple subtle cues that led to a diagnosis of alcohol abuse, confirmed by a high blood alcohol level and subsequent patient admission of drinking.
“Insights” are cases in which initially there is no pattern of recognizable cues and no satisfactory explanation is found, though several diagnoses are considered. Subsequent information gathering suddenly results in a clear interpretation that integrates all the symptoms and signs. In these cases the physician was surprised and it was often a single piece of information that suddenly came into his/her awareness. The example given was a patient complaining of a severe headache in whom the physician, while examining her eyes, suddenly thought of glaucoma as a cause of headaches. That turned out to be the correct diagnosis.
They go on to describe the feelings these physicians had when relying on these collectively “intuitive” feelings. They often felt conflicted between their “intuition” and other interpretations they considered more rational. Some of the diagnoses suggested were considered highly unlikely, implausible, or rare. Some of the cues were considered out of the ordinary and not evidence-based. And often the pattern of cues was so complex that the physician could not verbalize them.
Note that on stratifying the family physicians by years in practice and by gender, they found that “gut feelings” were more frequently reported by experienced physicians and more often by female physicians.
Note that “gut feelings” are not unique to the medical field. It is not uncommon during root cause analyses of aviation accidents or near-misses to see that a pilot or other crew member had a “feeling of unease” or “gut feeling” that something was not quite right. These are often based on subtle cues or lack of expected cues.
It’s pretty clear that various forms of intuition, particularly the “gut feeling”, are often important in at least getting us to stop and think about the direction of our diagnostic thinking. Most experienced physicians can remember cases where that “gut feeling” surfaced and helped them avoid a potential disaster. In fact, when we train housestaff or nurses to challenge the medical hierarchy when they see something they don’t think is right we often tell them to use the phrase “I just have this funny feeling”. That often gets even the most recalcitrant physicians to pause and reexamine the situation.
Some of our prior Patient Safety Tips of the Week on diagnostic error:
· September 28, 2010 “Diagnostic Error”
· November 29, 2011 “More on Diagnostic Error”
· May 15, 2012 “Diagnostic Error Chapter 3”
· August 12, 2008 “Jerome Groopman’s ‘How Doctors Think’”
· August 10, 2010 “ ”
· January 24, 2012 “Patient Safety in Ambulatory Care”
· October 9, 2012 “Call for Focus on Diagnostic Errors”
· March 2013 “Diagnostic Error in Primary Care”
· May 2013 “Scope and Consequences of Diagnostic Errors”
· And our review of Malcolm Gladwell’s “Blink” in our Patient Safety Library
Woolley A, Kostopoulou O. Clinical Intuition in Family Medicine: More Than First Impressions. Ann Fam Med 2013; 11: 60-66. doi:10.1370/afm.1433
Print “August 2013 Clinical Intuition”
We noted AHRQ’s Patient Safety Primers in our What’s New in the Patient Safety World columns for August 2008 “AHRQ's New Patient Safety Primers” and February 2009 “Some More New AHRQ Patient Safety Primers”. AHRQ has recently updated some of these primers and added several more:
As we noted previously they are, in fact, primers – meaning they are very introductory works on several important areas related to patient safety. However, each has extensive links to both classic and contemporary bibliographic references and tools. The new ones are no different and are equally useful resources.
The primer on Diagnostic Errors lists several of the more common types of cognitive bias with clinical examples of each. It does stress the relative paucity of interventions proven to reduce diagnostic errors. Once again, a real strength of the primer is its very useful bibliography.
The primer on Adverse Events after Hospital Discharge is an update to an earlier version. It stresses that 20% of patients discharged from the hospital will suffer an adverse event, most of which are preventable. It focuses on various failed communication opportunities and discontinuities in care and stresses the importance of discharge planning, medication reconciliation, patient and family education, followup on pending tests, and attention to health literacy issues. The update notes programs like Project RED and the Transitions trial. There is a good discussion of our favorite tools, checklists and structured communication tools, as cornerstones. And the links and bibliography are what you’ve come to expect of these fine AHRQ primers.
The primer on Error Disclosure focuses on the trend toward full disclosure and apology after adverse patient events. Again, good bibliography. But it’s still a primer. You’ll find much more on disclosure and apology in our prior columns on the topic:
· July 24, 2007 “Serious Incident Response Checklist”
· June 16, 2009 “”
· June 22, 2010 “Disclosure and Apology: How to Do It”
· September 2010 “Followup to Our Disclosure and Apology Tip of the Week”
· November 2010 “ ”
· April 2012 “Error Disclosure by Surgeons”
· June 2012 “Oregon Adverse Event Disclosure Guide”
And several other very valuable resources on disclosure and apology:
· IHI’s “Respectful Management of Serious Clinical Adverse Events” (Conway 2010)
· The Canadian Disclosure Guidelines (Canadian Patient Safety Institute 2008)
· The Harvard Disclosure Guidelines (Massachusetts Coalition for the Prevention of Medical Errors 2006)
· The ACPE Toolkit (American College of Physician Executives)
· Oregon Patient Safety Commission Oregon Adverse Event Disclosure Guide.
The primers on Wrong-Site, Wrong-Procedure, and Wrong-Patient Surgery and Checklists have good up-to-date bibliographies. Those on Systems Approach and Root Cause Analysis are introductions to human factors and the systems approach to error. The primer on Handoffs and Signouts looks at handoffs involving nurses, housestaff, hospital transitions, and lessons from other industries. It has many new references from 2013.
The entire collection of patient safety primers can be found at the AHRQ Patient Safety Primers home page and is an extremely useful resource not only for those new to the patient safety movement but even for experienced patient safety and quality improvement personnel.
AHRQ Patient Safety Primers (home page)
AHRQ Patient Safety Primer “Diagnostic Errors”
AHRQ Patient Safety Primer “Adverse Events after Hospital Discharge”
AHRQ Patient Safety Primer “Error Disclosure”
AHRQ Patient Safety Primer “Wrong-Site, Wrong-Procedure, and Wrong-Patient Surgery”
AHRQ Patient Safety Primer “Systems Approach”
AHRQ Patient Safety Primer “Checklists”
AHRQ Patient Safety Primer “Root Cause Analysis”
AHRQ Patient Safety Primer “Handoffs and Signouts”