A human factors concept we often see during incident investigations and root cause analyses (RCAs) in healthcare has recently been in the media spotlight. Many of you heard the term “slip and capture error” for the first time after a volunteer police deputy fatally shot a man, thinking he was firing his taser, not his gun (Yan 2015).
Appalling as it sounds, this is not the first time this has happened. In 2009 a very similar shooting occurred on the Oakland BART system when an officer fired his gun rather than the intended taser (Force Science Institute 2010). In fact, there were at least 6 similar incidents prior to that 2009 shooting (Meyer 2010), so we now have at least 8 such incidents in total.
This sounds vaguely reminiscent of the concentrated potassium chloride issues in healthcare in the past. Multiple incidents occurred, dispersed in both time and location, so it took years for us to see a pattern and look for root causes.
As to the current incident, investigators with a background in human factors analysis were quick to suggest “slip and capture error” as a major contributing factor.
Those of us working in patient safety understand what slips are, but what about “capture errors”? Actually, we often talk about them but may not recognize that specific nomenclature. Basically, a capture error occurs when two potential actions share the same or similar initial sequences but one action is relatively unfamiliar and the other is a well-known and well-practiced action (the latter often carried out almost automatically or subconsciously). In effect, under certain circumstances the well-practiced action sequence will “capture” the intended action.
But the capture error is not a new concept. In fact, for years when teaching patient safety to medical students, residents, or other healthcare workers and telling them mistakes are inevitable, we have given them a classic example: “It’s Sunday morning. You intend to go to the grocery store. But you find yourself in your car halfway to your usual workplace/school, far past the grocery store.” (Usually about two thirds of the audience raises their hands when we ask if that has happened to any of them!) That happens to be a classic capture error. The more practiced activity “captured” the intended but less familiar activity.
Usually there are “enabling” factors that contribute to the occurrence of “capture” errors. These include stressful situations, emergencies, distractions, and interruptions, among others.
Capture errors have long been described by human factors pioneers like James Reason and Don Norman. How about some everyday examples of “capture” errors?
James Reason, widely known as the father of human factors research, provides numerous examples (Reason 1990).
Don Norman (see our November 6, 2007 Patient Safety Tip of the Week “Don Norman Does It Again!”) offers some memorable examples in his two books on human factors and the design of things (Norman 1988, Norman 2009):
Did you ever rent a car on a trip and turn on the windshield wipers instead of the lights because the control knobs were reversed from the car you usually drive?
In fact, the classic, predictable error of writing the previous year on a check in January of a new year is probably also a “capture” error.
So what are some healthcare examples of “capture” errors? A nurse or physician, confronted with a new version of a device (e.g., ventilator, infusion pump, dialysis machine), programs in the sequence of keystrokes or dial manipulations he or she used on the old device even though he or she has received in-service training on the new device.
Another example might occur during CPOE. You almost always choose the first option from a drop-down list of regimens for a certain anticonvulsant. Your software vendor updates the software and the drop-down list is now reordered. You still choose the first option, and this time your patient gets the wrong dose.
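The mechanics of that CPOE scenario can be sketched in a few lines of code. This is purely illustrative; the regimen strings and function names are hypothetical, not from any actual vendor system. The point is that the well-practiced habit selects by position, while the safe action selects by content:

```python
# Hypothetical illustration of the "pick the first option" habit in a CPOE
# drop-down list, before and after a vendor software update reorders it.
# All regimen strings are made up for illustration.

before_update = ["phenytoin 300 mg daily", "phenytoin 100 mg TID", "phenytoin load 1 g"]
after_update  = ["phenytoin load 1 g", "phenytoin 300 mg daily", "phenytoin 100 mg TID"]

def habitual_choice(options):
    """The well-practiced action: select the first entry, without really reading it."""
    return options[0]

def deliberate_choice(options, intended):
    """The safer action: select by content, not by position in the list."""
    return next(o for o in options if o == intended)

intended = "phenytoin 300 mg daily"
assert habitual_choice(before_update) == intended   # habit happens to be correct
assert habitual_choice(after_update) != intended    # same habit now picks the wrong dose
assert deliberate_choice(after_update, intended) == intended
```

The habit is "captured" because it was reinforced by being correct every time before the update; nothing about the action itself changed, only the environment around it.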
A nurse, distracted by a problem with a diabetic patient, delivers a dinner tray to another patient who is supposed to be NPO (Carayon 2012).
We can imagine other scenarios in which “capture” errors might occur. Last week (in our April 14, 2015 Patient Safety Tip of the Week “Using Insulin Safely in the Hospital”) we recommended that hospitals develop protocol-driven, evidence-based order sets for specific uses of insulin. So let’s assume I just admitted an elderly patient with obtundation and a nonketotic hyperosmolar state. Her past medical history is also relevant for heparin-induced thrombocytopenia (HIT). I log onto the EHR, find this patient’s record, and go to the order entry section. From a drop-down list of those order sets I accidentally choose the order set for DKA (diabetic ketoacidosis), which I use very often, rather than the order set for nonketotic hyperosmolar state, which I use only occasionally. I enter information where prompted but then hurry down to the last section, which deals with DVT prophylaxis, because I’m concerned about her history of HIT (note that this is now a salient distracting feature, as we discussed in our January 14, 2014 Patient Safety Tip of the Week “Diagnostic Error: Salient Distracting Features”). To avoid any heparin-based DVT prophylaxis I choose the “other” box and am shunted to another menu with an argatroban order set, which I choose. So a contributory factor (the history of HIT) helped a well-practiced action (choosing the DKA order set) win out over a more unfamiliar action (choosing the nonketotic hyperosmolar state order set) that shares many features of the DKA order set. That meets the definition of a “capture” error.
There is probably also some relationship, or at least overlap, of “capture” errors with another human factors concept: inattentional blindness (ISMP 2009). In the latter, which is really a sort of confirmation bias, we tend to see what we expect to see rather than what is actually there. This is often a contributing factor in incidents where medications are drawn up from the wrong vials.
Another interesting thought: technology may cause some “capture” errors. Autotext or automatic completion of phrases by a word processor or smartphone may lead to such errors. We’ve noted several times that every time we type “EHR” (for electronic health record) our word processor converts it to “HER”, and we might miss that on proofreading. Or our smartphone automatically inserts one email address when we really intended a different one. These examples really meet the definition of a “capture” error in that two actions start with the same sequence of steps and the one that is far more familiar (at least far more familiar to the computer!) takes over from the intended action. You can bet that there will be analogies with healthcare technologies.
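A toy sketch makes the “familiar to the computer” point concrete. This is an assumption about how a naive frequency-based autocorrect might work, not the actual algorithm in any word processor; the corpus counts are invented for illustration:

```python
# Hypothetical frequency-based autocorrect: among candidate rearrangements of
# the typed token, the word that is most "familiar" (most frequent in an
# assumed corpus) captures the intended but rarer one.
from itertools import permutations

word_frequency = {"her": 5000, "ehr": 3}  # invented corpus counts for illustration

def autocorrect(token, frequency):
    """Replace the token with the most frequent known word among its letter permutations."""
    candidates = {"".join(p) for p in permutations(token.lower())}
    known = [w for w in candidates if w in frequency]
    return max(known, key=frequency.get, default=token)

print(autocorrect("EHR", word_frequency))  # prints "her": the familiar word wins over "ehr"
```

The software behaves exactly like the well-practiced human: it executes the high-frequency response whenever the initial cue is ambiguous.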
In the taser/gun incidents, use of the taser is the relatively unfamiliar action and use of the gun is the more familiar and well-practiced action. Even if an officer has never fired his or her gun on duty, all officers spend considerable time on the shooting range, so they have practiced use of the gun frequently. But we suspect most have practiced using the taser much less frequently, and probably never under stressful conditions.
As Don Norman would tell us, the design of systems significantly impacts how humans use those systems. Design of the taser and its holster likely contributed to each of the incidents. In all seven previous taser/gun mistakes the taser had apparently been drawn by the “strong” (dominant) hand, though the location of the taser holster was variable (Meyer 2010). One of the recommendations made by the Force Science Institute after the BART case was to use “weak-side, weak-hand-draw” taser holsters to minimize the chance of unintentionally drawing one’s gun rather than the taser.
So how do we avoid “capture” errors? The ISMP article on inattentional blindness suggests that classic interventions like education and training or rules are unlikely to prevent such errors, at least if those are the only interventions. The same likely applies to “capture” errors. But frequent practice is an obvious intervention that would be expected to help. Checklists might be a good intervention for the new device scenario but certainly are not applicable to situations like the taser/gun scenario that play out in emergent time frames. Thoughtful design is likely the best intervention. But you can’t design things well without actually seeing all the settings and factors that might influence the response.
With everything in the media over the past year about potential excessive use of force by police, we often found ourselves saying “why didn’t they just use a taser rather than shoot the suspect?”. Sometimes police did not carry tasers because they were bulky or didn’t always work correctly. Ironically, making police carry poorly designed tasers could have unintended consequences like the most recent incident. After the BART shooting, BART changed its policy to require taser placement and holster design that accommodates only a “weak-side, weak-hand” draw (Meyer 2010). We’ll leave those design issues up to the experts in the field, but good design requires feedback from those who will actually use the equipment.
And it’s clear it is not enough to just receive some education and training on use of the taser. Taser use must be practiced just as often as (perhaps even more often than) gun use, and under conditions closely simulating those in which a taser is likely to be needed.
While there are clearly many other issues and factors contributing to the most recent taser/gun incident, the fact that 8 such incidents have occurred tells us that there is a strong underlying system vulnerability that is a primary root cause of such incidents.
Yan H. How easy is it to confuse a gun for a Taser? CNN 2015; April 14, 2015
Force Science Institute. Force Science explains "slips-and-capture errors" and other psychological phenomena that drove the fateful BART shooting. PoliceOne.com July 22, 2010
Meyer G. The BART shooting tragedy: Lessons to be learned. PoliceOne.com July 12, 2010
Reason J. Human Error. Cambridge: Cambridge University Press. 1990. p. 68
Norman D. The Psychology of Everyday Things. New York: Doubleday. 1988. p. 107
Norman D. The Design of Future Things. New York: Basic Books. 2009. p. 107
Carayon P (ed.). Handbook of Human Factors and Ergonomics in Health Care and Patient Safety. Boca Raton: CRC Press. 2012; p. 349
ISMP (Institute for Safe Medication Practices). Inattentional blindness: What captures your attention? ISMP Medication Safety Alert Acute Care Edition 2009; February 26, 2009