Some errors are predictable. The classic example is people writing checks in January dated for the previous year. Banks had to alter their usual time limits for cashing checks to account for this predictable error. Some errors in medicine are also predictable.
One type of predictable error in medicine is what we call the “two in a box” phenomenon. Assume there are 2 items (A and B) in a box or some other sort of container. You are looking for item A. You reach into the box and pull out B. Many people put B back in the box and pull out the other item, assuming it is A without checking its identity. That’s human nature. But what happens if your assumptions were wrong (i.e., if the two items were not actually different, or if an A was never in the box at all)? Suppose the “box” was a refrigerator with 2 units of blood in it. You’ve now pulled out a unit of blood that may be inappropriate for your patient. You say that’s why you have a double check built into your blood products policy? That may help you in that particular circumstance, but keep in mind that we know from industry in general that the error rate for someone checking someone else’s work may be as high as 10%!
Think of the circumstances in which you might find this “two in a box” problem in your facility. We have seen the following examples of this phenomenon in medical events with adverse outcomes:
So next time you are making safety walk rounds in your facility, take a look to see how often you come across this potential “two in a box” scenario. You’ll be surprised how often you find it.
When there is an untoward incident, the person who failed to check the item often gets reprimanded or punished in some other way. But it should be clear that this is a system design problem, one that set that individual (and potentially many others) up to do exactly what they did. So make sure your systems are designed to avoid putting anyone in the “two in a box” scenario.