Patient Safety Tip of the Week

 

May 27, 2008

If You Do RCA’s or Design Healthcare Processes…Read Gary Klein’s Work

 

 

 

 

Funny…it seems we always do book reviews on holiday weekends! Well, this one has been sitting on our computer for almost a year now. In our May 29, 2007 Tip of the Week we discussed some of the premises in Malcolm Gladwell’s book “Blink”. We noted that many decisions, particularly those made in urgent situations, are the result of “rapid cognition” rather than a more deliberate comparison of all possible options. Much of the research basis for that was developed over many years by the cognitive psychologist Gary Klein. Klein’s book “Sources of Power: How People Make Decisions” has been in our patient safety library for a long time, but we never really gave him and his colleagues their due in one of our columns.

 

 

It was Klein who began the work on naturalistic decision making (i.e., how people make decisions in field settings) in the 1980s. The field work consisted of interviews of people who had to make high-stakes decisions under time pressure (though much of his subsequent work showed that the same processes often apply even in the absence of time pressure), often with inadequate information, unclear goals, ambiguity, and dynamic (changing) situations or conditions. Prior to the work of Klein and his colleagues, the classical thinking about decision making was that people would assemble the data, analyze multiple options, assign weights to the important elements of each option, compare them, and then choose the most rational option. Klein’s interviews of firefighting captains, military commanders, nurses, and others showed that in most cases these individuals relied on their previous experience and mental simulation to choose a course of action that was likely to work. That is, they chose a single option rather than comparing multiple options.

 

Klein and others called this the Recognition-Primed Decision (RPD) Model. In their interviews they found that these critical decision makers often did not even think they had made any decisions. The terms “intuition” and “gut feeling” often appear when such scenarios are discussed after the fact. However, Klein is quick to point out that “intuition” is not, in fact, an inborn trait. Rather, it is based on one’s experience, and it often operates at a subconscious level.

 

 

In essence, the decision makers usually choose the first workable option, which may not necessarily be the best option. They call upon their recognition of patterns, which in turn is based upon their experience, to rapidly diagnose situations and apply courses of action. They recognize a situation as typical, or at least familiar, so that they readily have goals that make sense, see the cues that are important, know what to expect next, and know typical ways of responding. That pattern recognition includes not only positive cues in the situation but also cues that are missing. In fact, Klein asserts that the ability to recognize a missing event as a cue is often what separates experts from novices. Most of this pattern recognition takes place at a subconscious level, so the decision maker often cannot tell us why he/she made that decision.

 

 

Part of the process also involves making a mental simulation of how the course of action is likely to play out. This type of decision making does not always result in desired outcomes. Confirmation bias (relying on evidence that bolsters our decision) and dismissing contradictory or disconfirming evidence may lead us to stick with an initial decision that produces undesirable outcomes. Explaining away disconfirming evidence is one of the biggest factors leading to undesired outcomes. And the process may not work when the situation is too complicated. Typically, individuals using the RPD model consider a very limited number of variables. When Klein and colleagues analyzed decision making in a wide variety of settings, they concluded that most decisions are made using the RPD model, and experts are more likely to use it than are novices.
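
For readers who like to see a process written out, here is a minimal sketch, in Python, of the recognition-primed flow described above: recognize a pattern from the cues that are present (and absent), recall a typical course of action, mentally simulate it, and act on the first option that seems workable. This is purely our own illustrative caricature, not code or an example from Klein’s work; the “experience” structure and the clinical cues are entirely hypothetical.

```python
def recognition_primed_decision(cues, experience):
    """Return the first workable course of action (not necessarily the best one)."""
    # 1. Pattern recognition: find a known pattern whose expected cues are present
    #    and whose "should be absent" cues are indeed missing (experts also notice
    #    what is *not* there).
    for pattern in experience:
        if pattern["present"] <= cues and not (pattern["absent"] & cues):
            # 2. Recall the typical courses of action for that pattern, one at a time.
            for action, simulate in pattern["typical_actions"]:
                # 3. Mental simulation: imagine how this action would play out here.
                if simulate(cues):
                    return action  # act on the first option that simulates well
    return "gather more information"  # no familiar pattern fits: fall back


# Entirely made-up usage example
experience = [
    {
        "present": {"hypotension", "tachycardia"},   # cues that should be there
        "absent": {"active bleeding"},               # cues that should NOT be there
        "typical_actions": [
            ("give IV fluid bolus", lambda cues: "heart failure" not in cues),
        ],
    },
]

print(recognition_primed_decision({"hypotension", "tachycardia"}, experience))
# prints: give IV fluid bolus
```

The point of the sketch is simply the shape of the flow: options are considered serially, the first one that survives the mental simulation is acted upon, and nothing in the loop compares options against one another.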

 

 

Klein’s work suggests that experience is what makes an individual an “expert”, but also that such expertise can be deliberately developed. Experts tend to engage in deliberate practice, obtain feedback, and learn from their mistakes. And in developing that expertise, they rely heavily on stories, metaphors, and analogues that add meaning to their experiences. They also tend to more readily identify leverage points, where small changes can lead to large changes in outcomes.

 

 

Klein also talks about the need for “mind reading”. He says that often in critical situations, the communication leaves the recipient in the position of having to read the mind of the communicator. He gives the example of a surgeon telling an anesthesiologist to give a drug to lower the blood pressure (without specifically mentioning that low blood pressure is his goal). The anesthesiologist gives that drug, then responds to the lower blood pressure by giving a pressor agent that raises the blood pressure. The surgeon tells him to give more of the original drug. He does, and again responds to the lower blood pressure by giving additional pressor agent. And the vicious cycle goes on and on. In the patient safety movement, we of course talk extensively about this scenario in team training and simulation exercises and emphasize the need for “hearback” and other feedback to ensure that everyone on the team has a clear understanding of the goals and procedures.

 

 

So why is all this important to those participating in root cause analyses (RCA’s)? We commonly have hindsight bias when we review a case with an untoward outcome. We all have the tendency to say things like “why couldn’t they see that” or “why didn’t they do this”. Understanding the types of subconscious factors involved in decision making in such situations helps put us in the position of the actual participants in the situation. We can then better see what cues, or lack of cues, led them in the direction they went. That will ultimately help us design processes that avoid an untoward outcome in similar situations in the future, which is of course the primary goal of any RCA.

 

 

And the same goes for those involved in designing various processes in healthcare. We need to understand how people are likely to think and react in certain situations. Klein readily points out that our efforts to prevent errors often just make it more difficult to diagnose a situation. Sometimes accepting errors but making them more visible is much more important than preventing them. This especially applies where we are using technology in the background to handle errors and anomalies, thereby keeping the active participants in the dark about what is going on. Often they find out about those anomalies only when it is too late to take appropriate corrective action, or they take an action that actually magnifies the underlying anomaly.

 

 

As an aside, for anyone involved in the patient safety field, Klein’s discussion of the use of stories is right on target. We readily admit that all the statistics in the world about medical error and patient safety can’t convey the same message that a single story can. He points out the key elements of a good story: it is dramatic, empathetic, instructive, and often ends with a surprise. A good story is what sticks in people’s minds, and it is what they often draw upon in recognizing situations and taking a course of action that has a high likelihood of success.

 

 

For anyone involved in quality improvement or patient safety, this is a book that you definitely won’t regret reading.

 

 

References:

 

 

Klein G. Sources of Power: How People Make Decisions. Cambridge, MA: MIT Press; 1999.

 

 
