
Patient Safety Tip of the Week

June 11, 2019

ISMP’s Grissinger on Overreliance on Technology

 

 

The recent crashes of the Boeing 737 MAX8 aircraft brought to everyone’s attention some of the downsides of technology, including both “automation surprises” and overreliance on technology. The same issues occur in health care. Several of our recent columns have had examples (see our Patient Safety Tips of the Week for December 11, 2018 “Another NMBA Accident”, January 1, 2019 “More on Automated Dispensing Cabinet (ADC) Safety”, and February 5, 2019 “Flaws in Our Medication Safety Technologies”).

 

But we discussed the issue of overreliance on technology in detail in our August 23, 2016 Patient Safety Tip of the Week “ISMP Canada: Automation Bias and Automation Complacency”.

 

This month, ISMP’s Matthew Grissinger used the same index incident from the 2016 ISMP Canada bulletin (ISMP Canada 2016) as a springboard for a good discussion in the journal P&T on overreliance on technology, automation bias, and automation complacency (Grissinger 2019). The incident occurred in a patient admitted with new-onset seizures. An order for phenytoin was handwritten using the brand name Dilantin. A hospital pharmacy staff member, who was relatively new to the clinical area, entered the first 3 letters, “DIL”, into the pharmacy IT system. The staff member was then interrupted and, when the task was resumed, diltiazem 300 mg was selected instead of Dilantin 300 mg. Back on the clinical unit, the handwritten order had been correctly transcribed by a nurse into the MAR. Later, when another nurse obtained the evening’s medications for the patient from the ADC (automated dispensing cabinet), he/she noted a discrepancy between the MAR and the ADC display but accepted the information on the ADC display as being correct. The diltiazem was erroneously administered and the patient developed significant hypotension and bradycardia.

 

First, the definitions:

Automation bias is “the tendency to favor or give greater credence to information supplied by technology (e.g., an ADC display) and to ignore a manual source of information that provides contradictory information (e.g., a handwritten entry on the computer-generated MAR), even if it is correct.”

 

Automation complacency, a closely linked, overlapping concept, “refers to the monitoring of technology with less frequency or vigilance because of a lower suspicion of error and a stronger belief in its accuracy.”

 

Grissinger gives the example of a nurse who relies on the ADC display listing the medications to be administered and forgets or ignores that the information from the device may depend on data entered by a person. In fact, we gave such an example in our August 23, 2016 Patient Safety Tip of the Week “ISMP Canada: Automation Bias and Automation Complacency”. That incident was a near-miss involving insulin. A physician, from a specialty not used to ordering insulin, did medication reconciliation and mistook the “U-100” formulation of insulin listed on records accompanying the patient to be a dose of 100 units of insulin. The physician entered a dose of 100 units of regular insulin into the CPOE system. The pharmacy system did not have dose range limits for insulin and there was no prompt for special review by the pharmacist. The nurse who received the syringe with 100 units of regular insulin was somewhat surprised by the relatively high dose but barcoded the patient’s wrist band ID and the medication and looked at the electronic MAR, all of which indicated correct patient, correct medication, correct dose. So, the medication was administered. Fortunately, because the nurse had an “uneasy feeling”, she went back and checked the patient’s records and found that he had been on 10 units of regular insulin prior to admission. She drew a stat blood glucose level and administered D50W, and potentially serious harm was prevented. In the old days, of course, a nurse would have immediately checked the records and the orders and spoken to the ordering physician before administering such a high dose of insulin. But, here, our tendency to believe that the technology is always correct biased the nurse toward first administering the insulin and then checking further, rather than checking further before administering it.
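
The missing dose-range check in that near-miss is the kind of gap that is straightforward to close in software. Below is a minimal Python sketch of a dose-range limit that could have flagged the 100-unit insulin order for pharmacist review; the drug names, limits, and function name are illustrative assumptions, not any vendor’s actual interface or clinical thresholds.

```python
# Hypothetical sketch of a dose-range limit check like the one the
# pharmacy system in the insulin near-miss lacked. All names and limits
# are illustrative assumptions, not actual clinical limits.

# Per-drug soft maximums that should trigger pharmacist review rather
# than silently passing through.
DOSE_REVIEW_LIMITS = {
    "insulin regular": 25,   # units per single dose (illustrative only)
    "diltiazem": 360,        # mg per day (illustrative only)
}

def needs_pharmacist_review(drug: str, dose: float) -> bool:
    """Return True if the ordered dose exceeds the review threshold."""
    limit = DOSE_REVIEW_LIMITS.get(drug.lower())
    if limit is None:
        # Unknown drug: be conservative and ask for review.
        return True
    return dose > limit

# The 100-unit regular insulin order from the near-miss would be flagged:
print(needs_pharmacist_review("insulin regular", 100))  # True
```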

 

Two of the commonest cognitive biases that often lead to error are interrelated: confirmation bias and ignoring disconfirming information (see our September 28, 2010 Patient Safety Tip of the Week “Diagnostic Error”). Automation bias, or overreliance on technology, is a very strong contributor to confirmation bias. In the case described by Grissinger, the nurse trusted the ADC display rather than the handwritten entry on the computer-generated MAR.

 

Grissinger cites the work of Kate Goddard and colleagues, who found that clinicians overrode their own correct decisions in favor of erroneous advice from technology between 6% and 11% of the time (Goddard 2012) and that the risk of an incorrect decision increased by 26% if the technology output was in error (Goddard 2014).

 

In a simulation exercise (Goddard 2014), clinicians were shown 20 hypothetical prescribing scenarios. They were asked to prescribe for each case and were then shown simulated advice that might be correct or incorrect. Participants were then asked whether they wished to change their prescription, and the post-advice prescription was recorded. While CDSS advice improved decision accuracy in 13.1% of prescribing cases, switches from a correct pre-advice decision to an incorrect post-advice decision occurred in 5.2% of all cases. The latter, of course, is a measure of automation bias.

 

Grissinger reiterates the 3 human factors in the ISMP Canada bulletin that contribute to automation bias and automation complacency:

1)  our tendency to select the pathway with the least cognitive effort

2)  our perception that the analytic capability of automated aids is superior to that of humans

3)  our tendency to “shed” responsibility when an automated system is performing the same function

 

He also notes that one’s experience may contribute, but possibly in different ways. For example, as experience makes us more confident in our decisions, we may rely less on technology. On the other hand, we often become desensitized as we gain experience with a technology, ignoring our instincts and going with the technology, particularly once we have developed trust in its reliability.

 

Grissinger offers several recommendations to reduce the risks of overreliance on technology, many of which were in the original ISMP Canada paper:

 

Analyze and address vulnerabilities. He suggests performing an FMEA (failure mode and effects analysis) for new technologies before undertaking facility-wide implementation. Also, encourage the reporting of technology-associated risks, issues, and errors.

 

Limit human-computer interfaces. Organizations should continue to enable seamless communication among technologies, thereby limiting the need for human interaction with the technology, which can introduce errors.

 

Design technology to reduce over-reliance. He provides a good example: current systems that allow a drug name to pop up after typing in just the first few letters often result in selection of the first name to appear. He notes that requiring four letters to generate a list of potential drug names could reduce this type of error.
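
As a simple illustration of that design point, here is a minimal Python sketch of a look-up routine that refuses to offer suggestions until at least four letters have been typed. The tiny formulary list and function name are hypothetical, not any pharmacy system’s actual interface.

```python
# Minimal sketch of a drug-name look-up that requires a minimum number of
# typed letters before offering suggestions. The formulary below is a tiny
# illustrative list, not a real drug database.

FORMULARY = ["Dilantin", "diltiazem", "Dilaudid", "digoxin", "diazepam"]

def suggest_drugs(typed: str, min_letters: int = 4) -> list[str]:
    """Return candidate drug names only once enough letters are typed."""
    if len(typed) < min_letters:
        return []  # too ambiguous; force the user to keep typing
    prefix = typed.lower()
    return [name for name in FORMULARY if name.lower().startswith(prefix)]

print(suggest_drugs("DIL"))   # [] -- three letters match too many look-alikes
print(suggest_drugs("DILA"))  # ['Dilantin', 'Dilaudid']
print(suggest_drugs("DILT"))  # ['diltiazem']
```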

 

Provide training. Provide training in the technology involved in the medication use system to all staff who utilize the technology. Include information about its limitations, as well as previously identified gaps and opportunities for error. Importantly, he recommends allowing trainees to experience automation failures during training. For example, you might include instances in which an important alert fails to fire, discrepancies between technology and handwritten entries in which the handwritten ones are correct, “auto-fill” or “auto-correct” errors, or an incorrect calculation of body surface area because a weight was entered in pounds instead of kilograms.
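
That last training scenario is easy to demonstrate with a worked example. The short Python sketch below uses the Mosteller formula for body surface area and shows how a weight entered in pounds but interpreted as kilograms inflates the result by roughly 48% (the square root of 2.2); the patient values are illustrative only.

```python
# Illustrative training example: body surface area (Mosteller formula)
# computed with a correct weight in kilograms versus the same weight
# mistakenly entered in pounds. Values are for demonstration only.
from math import sqrt

def bsa_mosteller(height_cm: float, weight_kg: float) -> float:
    """Body surface area (m^2) by the Mosteller formula."""
    return sqrt(height_cm * weight_kg / 3600)

height_cm = 170
weight_kg = 70
weight_entered_as_lb = 154  # 70 kg expressed in pounds, keyed in as "kg"

correct = bsa_mosteller(height_cm, weight_kg)
erroneous = bsa_mosteller(height_cm, weight_entered_as_lb)

print(f"Correct BSA:   {correct:.2f} m^2")    # about 1.82 m^2
print(f"Erroneous BSA: {erroneous:.2f} m^2")  # about 2.70 m^2, ~48% too high
```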

 

Reduce task distraction. He notes that automation failures are less likely to be identified if users are multitasking or are otherwise distracted or rushed.

 

 

In the “old” days (before CPOE and BCMA), after a physician wrote an order for a medication, a nurse would typically do a “mental approximation” to assess whether the dose ordered was in a reasonable range. We’ve seen more and more that the response now is simply to rely upon the IT systems and forego that mental approximation, losing an important extra check in the medication administration system. We’ve also described that mental approximation as an important step in managing home infusion (see our March 5, 2019 Patient Safety Tip of the Week “Infusion Pump Problems”). For example, if you are setting up an infusion of a chemotherapy agent that could be lethal if the total dose were administered over a few hours rather than over the several days intended, a simple calculation in your head would tell you “this infusion is going to be done in 4 hours, not in the 4 days that were intended”. (Of course, if you are a regular reader of our columns, you’ll recognize we would tell you to never hang an amount of medication that could be fatal if infused too rapidly!)
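
That mental approximation amounts to a one-line calculation: divide the volume in the bag by the programmed rate and ask whether the answer is anywhere near the intended duration. The Python sketch below makes the check explicit; the numbers, tolerance, and function names are hypothetical illustrations, not any pump’s actual logic.

```python
# Hypothetical sketch of the "mental approximation" for an infusion:
# expected duration = volume in the bag / programmed rate.
# All numbers and the tolerance threshold are illustrative assumptions.

def infusion_duration_hours(volume_ml: float, rate_ml_per_hr: float) -> float:
    """Hours the infusion will take at the programmed rate."""
    return volume_ml / rate_ml_per_hr

def looks_wrong(volume_ml: float, rate_ml_per_hr: float,
                intended_hours: float, tolerance: float = 0.5) -> bool:
    """Flag the programming if the expected duration is far from intended."""
    expected = infusion_duration_hours(volume_ml, rate_ml_per_hr)
    return abs(expected - intended_hours) > tolerance * intended_hours

# A 4-day (96-hour) infusion of a 960 mL bag should run at 10 mL/hr.
# Programming 240 mL/hr would finish it in 4 hours:
print(infusion_duration_hours(960, 240))         # 4.0 hours
print(looks_wrong(960, 240, intended_hours=96))  # True -- stop and recheck
```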

 

In our August 23, 2016 Patient Safety Tip of the Week “ISMP Canada: Automation Bias and Automation Complacency” we also discussed the aviation safety concept “automation surprise”. That refers to the fact that many complex computerized aviation systems may have the aircraft flying in a mode that is relatively masked to the pilot. For example, an aircraft may be flying under autopilot and if the autopilot disengages the pilot may not immediately be aware of several important flight parameters. There are numerous instances in the NTSB files about automation surprises contributing to aviation crashes.

 

Reports to NASA’s Aviation Safety Reporting System also provide examples of how reliance on autoflight can lead to loss of situational awareness (NASA 2013). In those reported examples, the crews’ awareness of the aircraft’s actual flight path was compromised while the autoflight systems were managing the flight.

 

In our January 7, 2014 Patient Safety Tip of the Week “Lessons From the Asiana Flight 214 Crash” we noted that one of the major issues contributing to that crash was apparently overreliance on technology. The pilots thought that the automatic throttle system was engaged, which should have increased engine thrust when the airplane’s speed fell below the recommended speed. However, that automatic throttle system was not engaged. Once the pilots recognized that their speed and altitude were too low and that the autothrottle had not automatically increased speed, they tried to initiate a “go-around” (i.e., to abort the landing, fly around, and try again), but it was too late. It’s pretty clear that pilots sometimes don’t understand what mode the computer systems are in. The FAA released a comprehensive study on pilots’ overreliance on automation and loss of situational awareness due to automation surprises (FAA 2013).

 

The recent Boeing 737 MAX8 crashes also illustrate overreliance on technology – but this time on the manufacturer’s side. Boeing had put in place a software system designed to prevent aerodynamic stalls by automatically forcing the plane’s nose down if a stall seemed imminent. The trigger for the software was input from an “angle of attack” sensor. Apparently, in the crashes, that sensor was faulty and there was no second or backup sensor in place. When the software triggered and forced the plane’s nose down, the pilots apparently did not know they had to manually disconnect that system. So, Boeing overrelied on technology to develop what it thought was a fix to a human problem. The unintended consequence was an “automation surprise” for the pilots.

 

Recently, we were test driving a new car that had a “smart” cruise control system, designed to slow down the vehicle as it neared the vehicle ahead. The car salesman said, “Go ahead and follow that car. This car will slow down and stop when he gets to the next stop sign.” Well, the road happened to curve just before that next stop sign. As the vehicle ahead turned at the curve, our vehicle’s “smart” system lost sight of it and sped up! Fortunately, we had not yet developed trust in the new technology and were ready with our foot on the brake to avoid a collision. But we suspect that, once we had come to trust and rely on that technology, we might not have been so lucky.

 

So, overreliance on technology occurs in our daily lives and in many industries. Healthcare is no different. We often use computer systems in which multiple “modes” are available, and we may not recognize which mode the system is operating in. Automation surprises can also occur in healthcare and, when coupled with automation bias, lead to serious adverse consequences. For example, if the computer is expecting a patient weight to be input in kilograms and someone instead inputs a weight in pounds, serious harm may be done to the patient. Or, if an infusion pump is expecting input of a dose or dose rate but instead receives a flow rate, or vice versa (the so-called “wrong-field programming error”), major adverse consequences may occur. Also, in all our discussions about alarm issues we note that erroneous assumptions are often made that an alarm will trigger when anything serious happens, without realizing that the alarm had been set to different physiologic parameters.
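
Both of those failure modes (a weight keyed in the wrong unit, and a value keyed into the wrong field) lend themselves to simple software cross-checks. The Python sketch below shows one possible set of plausibility checks, using assumed thresholds and field names purely for illustration; it is not modeled on any actual pump or EHR interface.

```python
# Illustrative cross-checks for two failure modes mentioned above:
# (1) a weight entered in pounds when kilograms are expected, and
# (2) a wrong-field programming error on an infusion pump (flow rate
#     entered where a dose rate belongs, or vice versa).
# Thresholds and field names are assumptions for this sketch only.

def weight_kg_suspicious(entered_kg: float, prior_weight_kg: float | None) -> bool:
    """Flag a weight that looks like it may actually be in pounds."""
    if prior_weight_kg is not None:
        # A jump of roughly 2.2x from the last recorded weight is suspicious.
        ratio = entered_kg / prior_weight_kg
        return 2.0 < ratio < 2.4
    return entered_kg > 200  # crude absolute sanity limit

def flow_rate_consistent(dose_mg_per_hr: float, concentration_mg_per_ml: float,
                         programmed_ml_per_hr: float, tolerance: float = 0.1) -> bool:
    """Check that the programmed flow rate matches the ordered dose rate."""
    expected_ml_per_hr = dose_mg_per_hr / concentration_mg_per_ml
    return abs(programmed_ml_per_hr - expected_ml_per_hr) <= tolerance * expected_ml_per_hr

print(weight_kg_suspicious(154, prior_weight_kg=70))             # True
print(flow_rate_consistent(2.0, 0.2, programmed_ml_per_hr=10))   # True (2 mg/hr at 0.2 mg/mL = 10 mL/hr)
print(flow_rate_consistent(2.0, 0.2, programmed_ml_per_hr=2))    # False -- dose rate keyed into flow field?
```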

 

The bottom line: we all likely have some degree of automation bias and automation complacency in both healthcare and our other daily activities. We still need to use common sense and never assume that the technology is flawless. In our June 2, 2009 Patient Safety Tip of the Week “Why Hospitals Should Fly…John Nance Nails It!” we noted that we all should really look at each thing we are doing in patient care and think “could what I am about to do harm this patient?”.

 

 

See some of our other Patient Safety Tip of the Week columns dealing with unintended consequences of technology and other healthcare IT issues.

 

 

References:

 

 

ISMP Canada. Understanding Human Over-reliance on Technology. ISMP Canada Safety Bulletin 2016; 16(5): 1-4

https://www.ismp-canada.org/download/safetyBulletins/2016/ISMPCSB2016-05_technology.pdf

 

 

Grissinger M. Understanding Human Over-Reliance On Technology. P&T 2019; 44(6): 320-321, 375

https://www.ptcommunity.com/journal/article/full/2019/6/320/understanding-human-over-reliance-technology

 

 

Goddard K, Roudsari A, Wyatt JC. Automation bias: a systematic review of frequency, effect mediators, and mitigators. J Am Med Inform Assoc 2012; 19(1): 121-127. Published online June 16, 2011

https://academic.oup.com/jamia/article/19/1/121/732254

 

 

Goddard K, Roudsari A, Wyatt JC. Automation bias: empirical results assessing influencing factors. Int J Med Inform 2014; 83(5): 368-375. Published online January 17, 2014

https://www.sciencedirect.com/science/article/pii/S1386505614000148?via%3Dihub

 

 

NASA. Autoflight Associated Loss of Situational Awareness. Callback 2013; 407: 1-2. December 2013

http://asrs.arc.nasa.gov/publications/callback/cb_407.html

 

 

FAA. Operational Use of Flight Path Management Systems. FAA September 5, 2013

http://media.nbcbayarea.com/documents/FAA_Final_Report_Recommendations+11-22-13.pdf

 

 

 

 
