It’s been almost three years since we did our first series on patient safety issues related to alarms (see our Patient Safety Tips of the Week for March 5, 2007 “Disabled Alarms” and for March 26, 2007 and April 2, 2007). Since the early 1990s we have always put alarms at the top of our list of items to check when doing patient safety walk rounds, and we encourage nurses, physicians, and quality improvement personnel to conduct alarm rounds periodically. Problems with alarms are also one of our “Big 3” factors encountered most often in our root cause analyses. We even play a little game with hospital CEOs, betting them $50 that within 2 hours we can find in their hospital at least 5 alarms that have been disabled, had their volumes reduced to barely audible levels, or had their parameters set such that clinically significant deterioration will go unheeded. Then we head straight to the ICUs, ER, and dialysis unit to easily find our 5 alarms. (We never actually make them pay off the bet, but they sure get embarrassed when we find them!)
This month a fatality related to a cardiac alarm inadvertently left off at the Massachusetts General Hospital made the news (Kowalczyk 2010). The Globe article speculates that the alarm may have been turned off by someone during a previous patient event, thinking they were “pausing” the alarm rather than turning it off. Few other details are provided but the Globe article quotes Joint Commission personnel stating that they have been seeing increased reports of alarm-related incidents. The Mass General’s actions have included inspecting all alarms in their system, ensuring that the “off” switch cannot be used, assigning a nurse to each unit to ensure timely responses to all alarms, and instituting an educational program for all nurses working with the monitoring systems. They note that these are interim steps as they work on more thorough and durable long-term solutions.
But the Globe article also quotes patient safety guru Lucian Leape as questioning why anyone would manufacture a critical alarm or monitor that allows staff to turn it off. A point that we have made over and over is that medical equipment is all too often designed in places other than where it will be used. Microsoft wouldn’t dream of designing software without looking to see how users interact with it. Yet many pieces of medical monitoring equipment are designed without observing over time how healthcare professionals will interact with them. Everyone who has ever worked in an ICU (or other critical care area) knows that the first thing a human responder usually does when an alarm goes off is turn off the alarm! There is a huge difference between “pausing” an alarm (so you can attend to whatever that alarm was signaling) and disabling that alarm.
We encourage you to go back and carefully read each of the three previous columns mentioned above. They highlight the problems with the design of alarms and the equipment the alarms are attached to, the design of the environments in which the alarms are used, and the problem of alarm fatigue. More importantly, reading them will help you resist the urge to rush out and take disciplinary action against individuals in such cases. In virtually every case we have seen or investigated there are significant system issues (and usually cultural issues as well) that are the real root causes. Most striking along that line was the case discussed in our April 2, 2007 Patient Safety Tip of the Week. In that case, the teams investigating an incident in which the volume of an alarm in an ER had been turned down found exactly the same thing on three subsequent visits to the ER! On each occasion a different individual had turned down the volume. It was obvious this was a problem with the design of the ER and a problem with the safety culture on the unit.
The bottom line: when you find an alarm that has purposely been altered, keep looking – you will always find an important root cause that led to its being altered.
And when you find a specific problem with an alarm, don’t stop there. Go look at all the other equipment and sites using that same alarm. In the case described in our March 5, 2007 Patient Safety Tip of the Week “Disabled Alarms”, the finding of a piece of tape used to mute a ventilator blender alarm led to finding 6 other ventilators in the hospital system that had been similarly muted.
Alarms and alerts, whether audible or visual, are often poorly designed for the environments in which they are used. The indicator light on the portable ventilator in the April 2, 2007 case was on the rear of the ventilator in an area not readily visible. In the March 26, 2007 case the alarm did not immediately direct the attention of the nurse to the area of concern. And even the blender alarm described above in the March 5, 2007 case was poorly designed because it should have been anticipated someone might purposely silence the alarm during routine maintenance when oxygen was not being used.
Alarm fatigue is a real problem. Just as we see “alert fatigue” when physicians are exposed to too many alerts and reminders during CPOE, “alarm fatigue” refers to the human tendency to begin ignoring alarms when exposed to a constant bombardment by alarms, many of which are false alarms or not clinically important.
Sometimes we fail to ask the most important questions. One question seldom asked is “Why do we need so many alarms in the first place?”. That very question was asked in an article this month in Critical Care Medicine (Siebig et al 2010). The authors looked at clinical and physiological data and the alarms generated for patients in medical intensive care units. Strikingly, they found that only 15% of alarms were considered clinically relevant, and 40% of the alarms did not correctly describe the patient condition and were classified as technically false. No wonder so many alarms get ignored! A substantial portion of the alarms were caused by staff manipulating the monitoring systems (e.g., during blood drawing).
Most alarms in the Siebig study were simple threshold alarms (meaning they fire when a high or low threshold is exceeded). Thus they convey no meaningful information about trends, which may be even more important and would alert staff to clinical deterioration earlier. That article and the accompanying editorial (Blum 2010) call for future research into alarms and suggest development of monitoring algorithms that could monitor multiple physiological parameters simultaneously to identify clinically relevant changes earlier and more reliably. They also suggest using different audible tones to help differentiate signals indicating problems with the electrodes from those indicating problems with the patient.
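The limitation of threshold alarms can be made concrete with a small sketch. This is purely illustrative (the function names, thresholds, and sample readings are hypothetical, not any vendor’s algorithm): a parameter can drift steadily downward, signaling deterioration, while every individual reading stays inside the alarm limits, so a threshold alarm never fires even though a simple trend check would.

```python
def threshold_alarm(value, low=50, high=120):
    """Simple threshold alarm: fires only when a fixed limit is crossed."""
    return value < low or value > high

def trend_alarm(values, window=5, max_drop=10):
    """Illustrative trend check: fires when the parameter falls steadily
    across the last `window` readings by at least `max_drop` in total,
    even if every individual reading is still within the limits."""
    if len(values) < window:
        return False
    recent = values[-window:]
    steadily_falling = all(a > b for a, b in zip(recent, recent[1:]))
    return steadily_falling and (recent[0] - recent[-1]) >= max_drop

# Hypothetical heart-rate series drifting down but never crossing the low limit of 50:
readings = [88, 84, 79, 73, 66, 59]
print(any(threshold_alarm(v) for v in readings))  # False: no single reading crosses a limit
print(trend_alarm(readings))                      # True: sustained fall detected
```

The point of the sketch is simply that a trend-aware check surfaces the deterioration several readings before the low threshold would ever be crossed, which is exactly the kind of earlier, more reliable detection the Siebig article and the Blum editorial call for.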
Don’t forget that monitoring and alarm systems consist of much more than pieces of medical equipment (see our April 1, 2008 Patient Safety Tip of the Week “Pennsylvania PSA’s FMEA on Telemetry Alarm Interventions”). The Pennsylvania Patient Safety Authority’s “Alarm Interventions During Medical Telemetry Monitoring: A Failure Mode and Effects Analysis” analyzed data on alarm-related incidents from the Pennsylvania Patient Safety Reporting System and identified 29 steps involved in the telemetry monitoring process. It provides excellent recommendations regarding patient identification; optimal display location; ensuring the power source of the telemetry receivers; protocols for when monitoring is temporarily suspended or on standby (e.g., during transport or while electrodes are being manipulated); protocols for alarm volume levels; electrode placement, inspection, and maintenance; making alarm parameters appropriate to both the individual patient and the setting; and protocols for responding to all alarms (whether low- or high-priority), including establishment of a tiered backup response system. It also raises a very important question easily overlooked in an FMEA: “Is telemetry monitoring indicated in this patient at all?”
Because people must set up the alarms, it is no surprise that human error plays a big role in alarm-related incidents. Our June 19, 2007 Patient Safety Tip of the Week “Unintended Consequences of Technological Solutions” noted an adverse outcome related to transposition of telemetry receivers that were placed on two patients at about the same time. When the telemetry alarm-related incidents from the Pennsylvania Patient Safety Reporting System were analyzed, patient misidentification was one of the root causes seen fairly frequently.
Our quick list of things you should do to help avoid alarm-related incidents:
Alarms are wonderful safety devices. But, as we see with all technological advances, there are always unintended consequences. So vigilance and frequent auditing should become a routine part of your alarm safety program.
Update: See also our March 2, 2010 Patient Safety Tip of the Week “Alarm Sensitivity: Early Detection vs. Alarm Fatigue”.
Kowalczyk L. MGH death spurs review of patient monitors. Heart alarm was off; device issues spotlight a growing national problem. Boston Globe February 21, 2010
Siebig S, Kuhls S, Imhoff M, et al. Intensive care unit alarms - How many do we need? Critical Care Medicine 2010; 38(2): 451-456
Blum JM, Tremper KK. Alarms in the intensive care unit: Too much of a good thing is dangerous: Is it time to add some intelligence to alarms? Critical Care Medicine 2010; 38(2): 702-703
Pennsylvania Patient Safety Authority. Patient Safety Advisory supplement “Alarm Interventions During Medical Telemetry Monitoring: A Failure Mode and Effects Analysis”. March 2008
Patient Safety Tips of the Week pertaining to alarm-related issues:
March 5, 2007 “Disabled Alarms”
March 26, 2007
April 2, 2007
June 19, 2007 “Unintended Consequences of Technological Solutions”
April 1, 2008 “Pennsylvania PSA’s FMEA on Telemetry Alarm Interventions”
February 23, 2010 “Alarm Issues in the News Again”
March 2, 2010 “Alarm Sensitivity: Early Detection vs. Alarm Fatigue”