Patient Safety Tip of the Week

August 23, 2016

ISMP Canada: Automation Bias and Automation Complacency

 

 

“It must be right, the computer said so.” Unfortunately, that mindset has become deeply ingrained in our thinking, and we see more and more adverse events related to over-reliance on technology. We love technology and consider it one of the most important tools in our armamentarium for preventing medical errors and adverse patient outcomes. Yet every year we end up writing several columns on the unintended consequences of technology (see the list of those columns at the end of today’s column).

 

This month, ISMP Canada published a safety bulletin on over-reliance on technology (ISMP Canada 2016). They first described a patient incident and then focused on two very important and related concepts: automation bias and automation complacency.

 

The incident occurred in a patient admitted with new-onset seizures. An order for phenytoin was handwritten using the brand name Dilantin. A hospital pharmacy staff member, who was relatively new to the clinical area, entered the first 3 letters “DIL” into the pharmacy IT system. The staff member was then interrupted and, when the task was resumed, diltiazem 300 mg was entered instead of Dilantin 300 mg. Back on the clinical unit, a nurse had correctly transcribed the handwritten order into the MAR (medication administration record). Later, when another nurse obtained the evening’s medications for the patient from the ADC (automated dispensing cabinet), he/she noted a discrepancy between the MAR and the ADC display but accepted the information on the ADC display as being correct. The diltiazem was erroneously administered and the patient developed significant hypotension and bradycardia.

 

Another example of over-reliance on technology that we give in several of our presentations is a near miss involving insulin. A physician, from a specialty not used to ordering insulin, performed medication reconciliation and mistook the “U-100” concentration of insulin listed on records accompanying the patient for a dose of 100 units. The physician entered a dose of 100 units of regular insulin into the CPOE system. The pharmacy system did not have dose range limits for insulin, and there was no prompt for special review by the pharmacist. The nurse who received the syringe with 100 units of regular insulin was somewhat surprised by the relatively high dose but scanned the barcodes on the patient’s wristband and the medication and checked the electronic MAR, all of which indicated correct patient, correct medication, correct dose. So the medication was administered. Fortunately, because she had an “uneasy feeling”, the nurse went back and checked the patient’s records and found that he had been on 10 units of regular insulin prior to admission. She drew a stat blood glucose level and administered D50W, and potentially serious harm was prevented. In the old days, of course, a nurse would have immediately checked the records and the orders and spoken to the ordering physician before administering such a high dose of insulin. But here our tendency to believe that the technology is always correct biased the nurse toward administering the insulin first and checking further afterward, rather than checking before administering it.
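To see why every check “passed” in this near miss, consider a minimal sketch of barcode medication administration (BCMA) logic. This is purely illustrative (the data structures and names are hypothetical, not any vendor’s actual implementation): the scan is verified against the order as it was entered, so an erroneously entered order sails through.

```python
# Minimal, hypothetical sketch of a BCMA (barcode medication administration) check.
# Names and data structures are illustrative only, not any vendor's actual system.

from dataclasses import dataclass

@dataclass
class Order:
    patient_id: str
    drug: str
    dose_units: float   # the dose as entered in CPOE

@dataclass
class Scan:
    patient_id: str     # from the wristband barcode
    drug: str           # from the medication barcode
    dose_units: float   # from the dispensed syringe label

def bcma_check(order: Order, scan: Scan) -> bool:
    """Return True when the scan matches the active order.
    It can only confirm the order as entered; it cannot detect
    that the order itself was wrong."""
    return (scan.patient_id == order.patient_id
            and scan.drug == order.drug
            and scan.dose_units == order.dose_units)

# The erroneous order (100 units instead of the intended 10) ...
order = Order(patient_id="MRN123", drug="insulin regular", dose_units=100)
# ... and the syringe dispensed to match it.
scan = Scan(patient_id="MRN123", drug="insulin regular", dose_units=100)

print(bcma_check(order, scan))  # True: "correct patient, drug, dose" per the order
```

The check confirms consistency with the order, not correctness of the order, and that gap is exactly what the nurse’s “uneasy feeling” filled.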

 

The first type of error in the ISMP Canada example is one we have encountered since the very first computers came out: the cursor error (also known variously as the mouseclick error, drop-down list error, picklist error, stylus error, or juxtaposition error, depending upon the setting and device being used). We’ve all done it – you have a list of choices and you think you touched one choice, yet your cursor or stylus actually hit the choice above or below the one you wanted. Usually we look to see what was chosen but, as in the ISMP Canada example, we may get distracted and not notice our erroneous choice. (Yes, your email might get sent to someone you didn’t want it to go to!)
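Here is a rough sketch of how a short prefix search sets up this kind of error. The formulary list and the minimum-character mitigation are illustrative assumptions on our part, not a specific vendor’s behavior or a formal requirement; the point is simply that a longer search string narrows the pick list and leaves less room for a look-alike selection.

```python
# Illustrative only: how a 3-letter prefix search can surface look-alike drug names.
# The formulary list and the minimum-character mitigation are hypothetical.

FORMULARY = ["DILANTIN", "DILTIAZEM", "DILAUDID", "DIGOXIN", "DIAZEPAM"]

def search(prefix: str, min_chars: int = 1) -> list[str]:
    """Return formulary entries starting with the typed prefix.
    Requiring a longer prefix (min_chars) shortens the pick list and
    reduces the chance of selecting an adjacent look-alike entry."""
    prefix = prefix.upper()
    if len(prefix) < min_chars:
        return []
    return [drug for drug in FORMULARY if drug.startswith(prefix)]

print(search("DIL"))                 # ['DILANTIN', 'DILTIAZEM', 'DILAUDID'] - easy to misselect
print(search("DIL", min_chars=5))    # [] - prompts the user to keep typing
print(search("DILAN", min_chars=5))  # ['DILANTIN']
```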

 

But the gist of the ISMP Canada article is not the error of choosing the wrong medication from a drop-down list. Rather, it is about our tendency to over-rely on technology and assume the technology is correct. They discuss two interrelated concepts: automation bias and automation complacency. Automation bias is “the tendency to favor or give greater credence to information derived from an automated decision-making system…and to ignore a manual (non-automated) source of information that provides contradictory information”. Examples include accepting the information on the ADC display rather than the handwritten MAR in the ISMP Canada incident, or trusting the barcoding system rather than the “gut feeling” in our insulin example. The closely related “automation complacency” refers to “monitoring of an automated process less frequently or with less vigilance than optimal because of a low degree of suspicion of error and a strong belief in the accuracy of the technology”. Both ignore the fact that the technology is only as good as the data entered into it.

 

ISMP Canada notes that 3 factors contribute to automation bias and automation complacency: (1) our tendency to select the pathway requiring the least cognitive effort, (2) our perception that the analytic capability of automated aids is superior to our own, and (3) our tendency to “shed” responsibility when an automated system is performing the same function.

 

(Note that similar factors may contribute to the complacency we often see in double check systems as we’ve described in our October 16, 2012 Patient Safety Tip of the Week “What is the Evidence on Double Checks?”.)

 

The ISMP Canada article goes on to discuss the conflicting evidence as to the effect training and experience might have on automation bias and automation complacency. Some studies suggest that inexperience predisposes to such errors whereas other studies suggest increased familiarity with a technology may lead to desensitization and habituation.

 

ISMP Canada recommends training about automated systems both at orientation and on an ongoing basis, including discussion of the limitations of such systems and any gaps or previous errors identified. It also suggests allowing trainees to experience automation failures during training. Further, a proactive risk assessment (e.g., failure mode and effects analysis, or FMEA) or a staged implementation should be used with new technologies to help identify unanticipated vulnerabilities. Input from end users should be sought up front, and feedback should be gathered after implementation. They also have recommendations about avoiding interruptions during double checks, having standardized ways to address identified medication discrepancies, and the importance of comparing the ADC display with the MAR when selecting a medication from the ADC.

 

Note that there are other important issues in the ISMP Canada incident. One is the failure to include an indication for the drug being ordered. A good system (whether manual or computerized) should require an indication. That gives everyone the opportunity to say “wait a minute, diltiazem is not used for treating seizures”. (But keep in mind that the indication may not be known for many medications taken prior to admission and continued during hospitalization.)
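As a rough illustration of what requiring an indication can catch, a sketch of an indication check at order entry is shown below. The drug-to-indication mapping is a tiny made-up sample for demonstration, not a real clinical knowledge base or any particular CPOE system’s logic.

```python
# Hypothetical sketch of an indication check at order entry.
# The indication mapping is a tiny illustrative sample, not a clinical reference.

DRUG_INDICATIONS = {
    "phenytoin": {"seizures", "status epilepticus"},
    "diltiazem": {"hypertension", "atrial fibrillation", "angina"},
}

def check_indication(drug: str, indication: str) -> str:
    """Require an indication and flag drug/indication mismatches."""
    if not indication:
        return "Order blocked: an indication is required."
    if indication.lower() not in DRUG_INDICATIONS.get(drug.lower(), set()):
        return (f"Alert: '{indication}' is not a listed indication for {drug}. "
                "Please verify the drug selected.")
    return "Indication accepted."

print(check_indication("diltiazem", "seizures"))   # flags the mismatch in the ISMP Canada case
print(check_indication("phenytoin", "seizures"))   # passes
```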

 

And the insulin example illustrates problematic medication reconciliation and the lack of both manual and electronic review of dosages for a high-alert medication. One excellent patient safety intervention for high-risk drugs is setting dose range limits in your CPOE or pharmacy IT systems. This is very valuable in preventing, for example, overdoses of chemotherapy agents. For insulin it is much more difficult than it sounds, because insulin dosages vary so widely across patients. Particularly at a large hospital treating many complex patients, it might not be surprising for a nurse to have administered 100 units of insulin to a patient. But it is still worth looking at your data, saying “we’ve seldom used a dose of insulin exceeding x units”, and then adding an alert that prompts physicians, pharmacists, or nurses to question orders for large doses of insulin.
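A sketch of that kind of soft-stop alert is shown below. The 50-unit threshold is purely a made-up example, not a recommendation; a real limit would come from a site’s own review of its historical dosing data and clinical judgment.

```python
# Illustrative soft-stop dose alert for insulin in a CPOE/pharmacy system.
# The threshold is a made-up example; a real limit would be derived from a
# site's own historical dosing data and clinical review.

SOFT_STOP_THRESHOLD_UNITS = 50   # e.g., "we've seldom used a dose exceeding x units"

def review_insulin_order(dose_units: float) -> str:
    """Return an alert message when an ordered dose exceeds the usual maximum."""
    if dose_units > SOFT_STOP_THRESHOLD_UNITS:
        return (f"ALERT: {dose_units} units exceeds the usual maximum of "
                f"{SOFT_STOP_THRESHOLD_UNITS} units. Confirm the dose with the "
                "prescriber and document the reason before proceeding.")
    return "Dose within the expected range."

print(review_insulin_order(100))  # would have interrupted the near miss above
print(review_insulin_order(10))   # the dose the patient was actually taking
```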

 

In aviation safety a term often encountered is “automation surprise”. It refers to the fact that complex computerized aviation systems may have the aircraft flying in a mode that is not readily apparent to the pilot. For example, an aircraft may be flying under autopilot, and if the autopilot disengages the pilot may not immediately be aware of several important flight parameters. There are numerous instances in the NTSB files of automation surprises contributing to aviation crashes. You yourself may have experienced an “automation surprise” when your car sped up as you approached the rear of another vehicle because you had forgotten it was on cruise control.

 

Reports to NASA’s Aviation Safety Reporting System also provide examples of how attention to autoflight can lead to loss of situational awareness (NASA 2013). In the reported examples, awareness of the aircraft’s actual flight path appears to have been compromised by the crew’s attention to managing and monitoring the automation itself.

 

In our January 7, 2014 Patient Safety Tip of the Week “Lessons From the Asiana Flight 214 Crash” we noted that one of the major issues contributing to that crash was apparently overreliance on technology. The pilots thought that the automatic throttle system was engaged, which should have increased engine thrust when the airplane’s speed fell below the recommended speed. However, that automatic throttle system was not engaged. Once the pilots recognized that their speed and altitude were too low and that the autothrottle had not automatically increased speed, they tried to initiate a “go-around” (i.e., to abort the landing and fly around for another attempt), but it was too late. It’s pretty clear that pilots sometimes don’t understand what mode the computer systems are in. The FAA released a comprehensive study on pilots’ overreliance on automation and the loss of situational awareness due to automation surprises (FAA 2013).

 

Healthcare is no different. We often use computer systems in which multiple “modes” are available, and we may not recognize which mode the system is operating in. Also, in all our discussions of alarm issues we note the erroneous assumption that an alarm will trigger whenever anything serious happens.

 

 

The bottom line: we all likely have some degree of automation bias and automation complacency in both healthcare and our other daily activities. We still need to use common sense and never assume that the technology is flawless. In our June 2, 2009 Patient Safety Tip of the Week “Why Hospitals Should Fly…John Nance Nails It!” we noted that we all should really look at each thing we are doing in patient care and think “could what I am about to do harm this patient?”.

 

 

 

See some of our other Patient Safety Tip of the Week columns dealing with unintended consequences of technology and other healthcare IT issues:

 

 

References:

 

 

ISMP Canada. Understanding Human Over-reliance on Technology. ISMP Canada Safety Bulletin 2016; 16(5): 1-4

https://www.ismp-canada.org/download/safetyBulletins/2016/ISMPCSB2016-05_technology.pdf

 

 

NASA. Autoflight Associated Loss of Situational Awareness. Callback 2013; 407 (December 2013): 1-2

http://asrs.arc.nasa.gov/publications/callback/cb_407.html

 

 

FAA. Operational Use of Flight Path Management Systems. FAA September 5, 2013

http://media.nbcbayarea.com/documents/FAA_Final_Report_Recommendations+11-22-13.pdf

 

 

 

 
