We are big fans of
clinical decision support systems (CDSS) as patient safety tools, keeping in
mind that too much CDSS may lead to alert fatigue and unintended consequences.
But well-reasoned clinical decision support rules that are also adequately
tested for both validity and usability may be very effective tools.
Brigham and Women’s
Hospital (BWH) in Boston probably has the most robust CDSS of any healthcare
organization anywhere and they just reported some disturbing findings on
malfunctions of CDSS alerts (Wright
2016). Serendipitously, the lead author had noted one such alert
malfunction while he was demonstrating the CDSS. It happened to be an alert
that would remind physicians to check the TSH level in patients who had been on
amiodarone for at least a year. The research team subsequently identified three
other examples of CDSS alert malfunctions and conducted a sample survey of
CMIO’s at various hospitals and found most of them had also experienced CDSS
malfunctions.
Alarmingly, they found that the alert malfunctions were
often very difficult to detect and some had eluded detection for long periods
of time (weeks or even years!). Moreover, the causes of the malfunctions were sometimes even more difficult to elucidate. They were, however, able to
identify several contributing factors:
Changes to data codes or data fields are often made by IT staff or external vendors who are not part of the CDSS team, and such changes may not be apparent to the CDSS team members. In the TSH/amiodarone case, for example, a change had been made to the drug code for amiodarone.
Alert malfunctions were most often first identified by end-users, but usually only when there was a sudden spike in the firing frequency of one or more alerts. Alerts that simply stopped firing after a malfunction were far more likely to elude detection.
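
To make that failure mode concrete, here is a minimal, purely illustrative sketch (in Python) of how an alert rule keyed to a hard-coded drug identifier can silently stop firing when that identifier is remapped, and how crude monitoring of daily alert volumes might surface the drop. The drug codes, field names, and thresholds are our own assumptions for illustration and do not reflect the actual BWH rule logic.

# Hypothetical sketch of a silent CDSS alert malfunction and a simple
# volume check that might catch it. All identifiers here are illustrative.
from datetime import date, timedelta
from statistics import mean

AMIODARONE_CODE = "DRUG-1234"   # code the rule was originally written against
TSH_LOOKBACK_DAYS = 365

def should_fire_tsh_alert(patient):
    """Remind clinicians to check TSH when a patient has been on amiodarone
    for at least a year and has no TSH result within the lookback window."""
    on_amiodarone = any(
        rx["drug_code"] == AMIODARONE_CODE
        and (date.today() - rx["start_date"]).days >= 365
        for rx in patient["medications"]
    )
    recent_tsh = any(
        lab["test"] == "TSH"
        and (date.today() - lab["date"]).days <= TSH_LOOKBACK_DAYS
        for lab in patient["labs"]
    )
    return on_amiodarone and not recent_tsh

# If pharmacy IT later remaps amiodarone to a new code, the rule silently
# stops matching: no error is raised, the alert simply never fires.
patient = {
    "medications": [
        {"drug_code": "DRUG-9876",   # new code after the remap
         "start_date": date.today() - timedelta(days=800)},
    ],
    "labs": [],
}
print(should_fire_tsh_alert(patient))   # False, even though it should fire

def volume_anomaly(recent_daily_counts, today_count, drop=0.5, spike=2.0):
    """Compare today's firing count for a rule against its recent baseline;
    a drop toward zero is the classic signature of a silent failure."""
    baseline = mean(recent_daily_counts)
    if today_count < drop * baseline:
        return "possible silent failure (volume drop)"
    if today_count > spike * baseline:
        return "possible malfunction (volume spike)"
    return "within expected range"

print(volume_anomaly([40, 38, 42, 41, 39], today_count=2))

Even this simple baseline comparison illustrates why a rule that suddenly stops firing is so easy to miss: nothing errors out, the alert volume just quietly falls, which is why end-users tend to notice spikes but not silences.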
The authors have several important recommendations:
The BWH researchers have identified a significant
vulnerability in our CDSS operations, one that has important patient safety
implications. Every healthcare organization needs to pay careful attention to this study and evaluate its own actual or potential vulnerabilities.
And, of course, most of you by now have seen the results of The Leapfrog Group’s most recent report on how hospitals perform on its CPOE evaluation tool (Leapfrog 2016). We’ve written about the Leapfrog tool in several prior columns (see our columns for July 27, 2010 “EMR’s Still Have a Long Way to Go”, June 2012 “Leapfrog CPOE Simulation: Improvement But Still Shortfalls”, and March 2015 “CPOE Fails to Catch Prescribing Errors”).
To fully meet Leapfrog’s standard, hospitals must:
In 2015, nearly two-thirds of hospitals (64%) fully met the standard, a considerable improvement from the 14% that did so in 2010. The
hospitals also demonstrated improved performance in medication reconciliation.
However, on the 2015 Leapfrog Hospital Survey, hospitals’ CPOE systems failed
to flag 39% of all potentially harmful drug orders, or nearly two out of every
five orders. The systems also missed 13% of potentially fatal orders.
The Wright study and the Leapfrog study demonstrate that it is never enough to simply implement a CPOE or e-prescribing system with clinical decision support and assume your patients will be safe from medication errors. Clearly, ongoing evaluation and assessment using validated tools are important to identify unexpected vulnerabilities. We should, of course, expect better design and function from our IT vendors. However, the Wright study clearly shows that problems may arise even when the initial design and implementation were sound, because subsequent changes to systems or files can result in gaps that go unidentified for long periods.
See some of our other
Patient Safety Tip of the Week columns dealing with unintended consequences of
technology and other healthcare IT issues:
References:
Wright A, Hickman T-T T, McEvoy D, et al. Analysis of
clinical decision support system malfunctions: a case series and survey. JAMIA
2016; First published online: 28 March 2016
http://jamia.oxfordjournals.org/content/early/2016/03/28/jamia.ocw005
The Leapfrog Group. Hospitals’ Computerized Systems Proven to Prevent Medication Errors, but More is Needed to Protect Patients from Harm or Death. The Leapfrog Group 2016; April 7, 2016 (full report)