
Escaping from Fixation

Harnessing the power of curiosity to reduce diagnostic errors

Diagnostic errors crop up in all kinds of settings, often with very serious consequences. What might cause them? The Institute of Medicine report “Improving Diagnosis in Health Care” (Balogh et al., 2015) identified some of the usual suspects: workload, time pressure, lack of expertise, fatigue, communications breakdowns.

The report also listed a cognitive problem—jumping to an initial hypothesis that is wrong and getting stuck on that diagnosis. That’s the problem I want to tackle in this essay.

Why do we sometimes get stuck on an incorrect diagnosis?

Frequently, this getting-stuck error is blamed on confirmation bias: we jump to a conclusion and then, instead of testing it, we look for evidence to support it. However, as I explained in my last essay, the confirmation-bias explanation has some serious weaknesses. (Smith, 2018, raised similar concerns about confirmation bias.) When I went back and read the early studies cited in support of confirmation bias, I found that the majority of subjects in those experiments did not show confirmation bias. Apparently, the notion of confirmation bias is so compelling that people who like to find evidence of judgment and decision biases have distorted the findings and then locked into the distortions.

Also, many decision researchers now acknowledge that a confirmation strategy can actually be very useful: when we are very uncertain about what is going on, we can learn more by trying to confirm our speculations than by trying to falsify them. In other words, the confirmation strategy is more of a benefit than a bias. Therefore, I suggest we dismiss the notion of confirmation bias as an explanation for the getting-stuck error.

Fortunately, there’s a better account of getting-stuck errors: Fixation.

The concept of fixation is that we get stuck on an initial explanation. Often that initial explanation will be accurate, but when it is wrong, hindsight shows that we held on to it too long.

But fixation errors aren’t just a matter of holding onto our initial explanation too long. Fixation gets compounded when we dismiss any anomalous evidence that runs counter to our original diagnosis instead of taking these anomalies into account and revising our beliefs. De Keyser and Woods (1990) speculated about some of the ways that fixation works, and Feltovich et al. (2001) called these tactics “knowledge shields” that we use to deflect contrary data.

Chinn and Brewer (1993) listed six basic ways that knowledge shields can operate, six ways we can react to anomalous data that are inconsistent with our beliefs: (i) we can ignore the data; (ii) we can reject the data by finding some flaw or weakness in the way the data were collected or analyzed, or even speculate that the data reflect a random occurrence; (iii) we can decide that the data don’t really apply to the phenomenon of interest; (iv) we can set the data aside for the present, expecting that future developments will show why the anomaly is not really a problem; (v) we can find a way to interpret the data that allows us to preserve our beliefs; (vi) we can make cosmetic changes to our beliefs and fool ourselves into thinking that we have taken the data into account. Chinn and Brewer found that college students displayed each of these tactics, and so did established scientists. They also listed a seventh type of reaction: (vii) we can accept the data and change or discard our initial beliefs.
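
For readers who find it easier to see the taxonomy laid out explicitly, here is a minimal sketch in Python. It is purely my own illustration, not anything from Chinn and Brewer; the names are invented, and the only point is that the first six reactions preserve the current frame while the seventh revises it.

```python
from enum import Enum, auto

class AnomalyReaction(Enum):
    """The seven reactions to anomalous data described by Chinn and Brewer (1993)."""
    IGNORE = auto()             # (i) pay no attention to the data
    REJECT = auto()             # (ii) find a flaw in how the data were collected or analyzed
    EXCLUDE = auto()            # (iii) decide the data don't apply to the phenomenon
    HOLD_IN_ABEYANCE = auto()   # (iv) set the data aside for now
    REINTERPRET = auto()        # (v) interpret the data so the belief survives
    PERIPHERAL_CHANGE = auto()  # (vi) make only cosmetic changes to the belief
    ACCEPT_AND_REVISE = auto()  # (vii) accept the data and change or discard the belief

# The first six reactions act as "knowledge shields": they preserve the current frame.
KNOWLEDGE_SHIELDS = {r for r in AnomalyReaction if r is not AnomalyReaction.ACCEPT_AND_REVISE}

def preserves_frame(reaction: AnomalyReaction) -> bool:
    """Return True if the reaction deflects the anomaly rather than revising the diagnosis."""
    return reaction in KNOWLEDGE_SHIELDS

if __name__ == "__main__":
    for reaction in AnomalyReaction:
        status = "preserves frame" if preserves_frame(reaction) else "revises frame"
        print(f"{reaction.name}: {status}")
```

Running the sketch simply prints which reactions shield the frame and which one revises it; it is a compact restatement of the list above, nothing more.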

The sensemaking model presented by Klein et al. (2007) describes two pathways for reacting to data that question the way we have been framing a situation. We can try to preserve the frame we have been using, employing the six tactics described by Chinn and Brewer, or we can accept the anomaly (the seventh reaction on their list) and re-frame the situation. Both reactions have value. If we over-react to anomalies, even ones that are basically noise, we can keep reframing and reframing and never arrive at any interpretation, a condition referred to as “vagabonding.” On the other hand, if we under-react to anomalies and preserve the frame too long, we display fixation.

Some might argue that fixation is not a type of error. Rather, it is just the extreme case of a useful tendency: trying to preserve our initial frame. It isn’t realistic to keep re-thinking everything whenever we encounter a possible anomaly. Fixation may seem like an error only because in hindsight, after we know the correct diagnosis, we can determine that we went too far in preserving our initial beliefs.

But in another sense fixation does seem like an error: we aren’t trying to test our diagnosis when we encounter contrary evidence that shouldn’t be ignored or dismissed, and we aren’t aware of other possibilities. It’s hard to tell when our efforts to preserve our initial frame shade over into fixation, but I think we can generally agree that there is a point where the anomalies are so frequent and serious that a reasonable person should no longer dismiss them.

Consider the case of Josef Stalin in World War II. Stalin had forged a non-aggression pact with the German leader Adolf Hitler in 1939. Stalin was confident that this treaty would hold, even as he kept getting all kinds of information suggesting that Hitler was planning a surprise attack on the Soviet Union. Stalin used his knowledge shields to dismiss all of this evidence. He ignored most of it and dismissed other reports as false rumors designed to stir up trouble with his German ally. He even ordered the execution of some informants because he suspected they were secret agents attempting to mislead him. He kept explaining away the evidence right up to the actual German assault in June 1941, Operation Barbarossa. As a result, the Soviet defenses were caught unprepared. The Germans quickly occupied territory, seized weapons, killed and captured many soldiers, and almost took Moscow. In retrospect, Stalin had blundered, but even at the time, looking at what was knowable, we can conclude that he was fixated on a false belief.

Note: the term “fixation error” is sometimes used for situations in which a person keeps focusing attention on one display and ignoring others, or focusing on one problem and ignoring others (e.g., a pilot struggling to get the landing gear in place and ignoring indications of low fuel). This essay only considers fixation during the diagnosis of problems that have arisen, as in a physician trying to determine what is causing a patient’s symptoms or a panel operator in a petrochemical plant trying to understand why the temperature inside a reactor has dropped so sharply.

Therefore, the concept of fixation describes how we can hold on to our initial diagnosis despite strong contrary evidence, by deploying a variety of tactics to shield ourselves from having to think about the implications of that evidence.

The Balogh et al. Institute of Medicine report examines types of diagnostic errors but never once mentions fixation. Instead, it goes into detail about confirmation bias.

Fixation and confirmation bias seem to be explaining the same thing. What’s the difference?

Defective Thinkers or Effective Thinkers?

The concept of confirmation bias asserts that we need to change the way we think, whereas the concept of fixation claims that there’s nothing wrong with our thinking, just that we sometimes preserve our beliefs too stubbornly.

The concept of confirmation bias is part of a framework that views people as defective thinkers who are riddled with all sorts of biases that interfere with rational thinking. However, attempts to de-bias people to eliminate confirmation bias have repeatedly failed. We can’t do it, and we shouldn’t do it, because we don’t want to lose the benefits of the heuristic of seeking confirming evidence. When we label this heuristic tendency as a bias, we discourage people from speculating at the outset, and rapid speculation is valuable for guiding our exploration, especially when we face wicked problems under ambiguous, complex, and changing conditions involving contextual influences. The concept of correcting a bias makes it seem that the purpose of thinking is to avoid making errors, rather than to be curious and to explore and discover. These are the reasons to avoid the “defective thinker” formulation.

In contrast, the notion of fixation is part of a framework that views people as effective thinkers who are capable of insights. Our natural tendency is to quickly speculate, and we typically get it right. Sometimes we get it wrong, or sometimes the conditions change so that the initial diagnosis becomes overtaken by events, and then we usually reconceptualize smoothly (e.g., Fugelsang et al., 2004; Klein et al., 2005). But sometimes we don’t reconceptualize smoothly or quickly enough—we get stuck and then we need help getting un-stuck. The fixation approach isn’t trying to change the way we think. Instead, the idea is to help us escape from fixation when it occurs.

So what can we do? Let’s start with some common pieces of advice that don’t seem to be very helpful.

Questionable advice

Some people suggest that we should keep an open mind as a way to prevent fixation on an initial hypothesis, but I don’t like that advice very much. We are not built to keep an open mind. And an open mind is essentially a passive mind, not an inquiring or speculative mind.

Another common piece of advice is to identify all the assumptions we are making in order to spot any weak or false ones. But in the cases I have examined, the assumptions that get us into trouble are often those we make unconsciously, and we wouldn’t be likely to list them up front.

Some researchers encourage decision makers to inhibit intuitions and speculations until they have had a chance to thoroughly analyze the data, but this tactic seems like a recipe for paralysis by analysis. The idea of thinking first and then acting sounds safe, but it misses the kinds of learning and discovery that arise through action.

Each of these kinds of advice has one thing in common: it is intended to reduce the chance of making an error. I don’t believe any of them would actually reduce errors; I am not aware of evidence that they work. They would likely make things worse, not better, because they would reduce the chance of gaining insight.

And now a few other pieces of advice that I do find worthwhile, but with some reservations.

One valuable tactic is to use Differential Diagnosis, which involves setting out alternative diagnoses. This approach is well known in the healthcare community, and for many challenging cases it seems only natural for a diagnostician to consider various possible causes. However, Differential Diagnosis can be utopian: diagnosticians are likely to find it impractical to continually generate logical comparison sets for every choice point, so we need criteria for when to use it. Further, in complex and unfolding situations decision makers are unlikely to be able to imagine the true cause of the problem, and so they won’t be able to include it in the initial comparison set.

Another good tactic is to use generic questions that we can pose to ourselves or to others. (i) Cohen et al. (1997) suggest a crystal-ball method: “I am looking in an infallible crystal ball and I see that the diagnosis you are considering is wrong. What else can it be?” Croskerry (2003) has suggested the same kind of exercise. (ii) We could ask, “What’s the worst thing this could be?” (iii) This question is basically a test for fixation: “What evidence would it take for you to abandon your diagnosis?” If we can’t think of any such evidence, that’s a good sign that we are gripped by fixation. (iv) We can use a prospective-hindsight tactic: “Imagine that you got the diagnosis wrong. What cue or hint had you been ignoring?” I like these questions, but I’m not sure how to deploy them. Advising diagnosticians to ask them all the time seems impractical. Advising diagnosticians to ask them when they get stuck seems like a better idea, except that by the time diagnosticians know they’re stuck they’re no longer gripped by fixation. Still, these kinds of questions might help people once they realize they’ve been fixating. Perhaps they are best posed by team members who suspect that the prime decision makers are fixating.

Now let’s back up to see if there is another approach for escaping fixation and reducing diagnosis errors. We want to speculate and explore hypotheses—we just don’t want to get trapped.

Harnessing the power of curiosity

This approach tries to use curiosity to overcome our commitment to our initial diagnosis.

The core of the strategy is to become more curious about anomalies, the hints we could be noticing. That doesn’t mean chasing every anomaly; that’s not realistic. Instead, it means trying to at least notice contrary indicators, perhaps just for a few seconds, and to wonder what is causing them. We want the anomalies to at least get on our radar. For instance, in taking a medical history a physician might go through the motions of filling in all the blocks, but an actively curious physician would be asking the next question, working off minor discrepancies to tease out additional clues.

The idea of harnessing curiosity fits in with research I have done on the nature of insight. I discovered three different paths that lead to insights. One of these is the correction path, in which we recover from fixation on flawed beliefs. I investigated 120 cases of insights, and 27 of them involved the correction path. Then I wondered how these 27 decision makers escaped from their fixation. In 18 of the 27 cases the decision maker noticed a hint: an event, a comment, or some other sort of clue. Instead of dismissing the hint or explaining it away in the ways Chinn and Brewer described, these 18 individuals wondered about it and took it seriously. I believe that even more of the 27 cases revolved around examining the hint or anomaly; I only counted the 18 cases for which the records were sufficiently clear.

So the notion of harnessing curiosity is not merely an academic recommendation. It is based on the success stories of people who managed to escape from fixation and make important discoveries.

Let’s take this further. Once we become more curious about anomalies, we can try to keep track of how many of them we are explaining away. If our initial diagnosis is wrong, we should be getting more and more signals that contradict it, more and more to explain away. That’s another leverage point we can use. Mark Smith, the Chief Innovation Officer at MedStar, explained to me that he uses a two-strikes rule in examining a patient: if he feels confident in a diagnosis he may disregard an initial anomaly, but if he notices a second anomaly, that’s a wake-up call to step back and re-examine what’s going on.
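
As a rough illustration of that kind of bookkeeping, here is a minimal sketch in Python. It is my own invention, not Smith’s procedure; the class and method names are hypothetical, and the two-strike threshold is his informal heuristic rather than a validated standard.

```python
class AnomalyTracker:
    """Minimal sketch of a 'two-strikes' check: count the anomalies we have
    explained away and flag when it is time to step back and re-examine."""

    def __init__(self, strikes_allowed: int = 2):
        # Two strikes is the informal heuristic described in the text; the right
        # number surely depends on the stakes and the strength of the evidence.
        self.strikes_allowed = strikes_allowed
        self.explained_away = []

    def explain_away(self, anomaly: str) -> bool:
        """Record an anomaly we are about to dismiss.

        Returns True once the count reaches the threshold, signaling that we
        should re-examine the diagnosis instead of dismissing yet another anomaly.
        """
        self.explained_away.append(anomaly)
        return len(self.explained_away) >= self.strikes_allowed

if __name__ == "__main__":
    tracker = AnomalyTracker()
    print(tracker.explain_away("first anomaly"))   # False: one strike, keep the diagnosis
    print(tracker.explain_away("second anomaly"))  # True: wake-up call, re-examine
```

The point of the sketch is simply that the trigger for re-examining is the accumulating count, not any single anomaly on its own.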

Yet another leverage point is to notice how much work we are doing to explain away all these anomalies. Cohen et al. (1997) coined the term “snap back” to describe how the sheer effort of explaining away so many contrary data points can induce us to lose faith in our initial diagnosis and seek another one.

I see the power of curiosity as an antidote to fixation. We want to shift the mindset of diagnosticians so that they wonder about anomalies instead of dismissing them.

The primary ideas in this essay have emerged from a collaboration over the past few years with Terry Fairbanks, the Vice President for Quality and Safety at MedStar Health. Terry and I have been designing a workshop on diagnostic errors and we hope to test it out soon. I expect that the strategy described in this essay for overcoming fixation will expand and get revised as we go along.

Diagnostic errors are just one side of the coin. The other side is diagnostic successes. Diagnostic error is a serious problem and gets a lot of attention, whereas diagnostic success is often taken for granted. We need to consider both sides of this coin. Otherwise we may take steps to reduce diagnostic errors that also cut our chances for diagnostic success, leaving us worse off. Framing the problem as one of fixation, encouraging us to do some re-conceptualizing, seems like a better approach than framing it as a bias, requiring us to change the way we think.

References

Balogh, E.P., Miller, B.T., & Ball, J.R. (2015). Improving diagnosis in health care. Institute of Medicine, National Academies of Sciences, Engineering, and Medicine. Washington, DC: National Academies Press.

Chinn, C.A., & Brewer, W.F. (1993). The role of anomalous data in knowledge acquisition: A theoretical framework and implications for science instruction. Review of Educational Research, 63, 1-49.

Cohen, M., Freeman, J.T., & Thompson, B. (1997). Training the naturalistic decision maker. In C.E. Zsambok & G.A. Klein (Eds.), Naturalistic decision making (pp. 257-268). Mahwah, NJ: Erlbaum.

Croskerry, P. (2003). The importance of cognitive errors in diagnosis and strategies to minimize them. Academic Medicine, 78, 775-780.

De Keyser, V., & Woods, D.D. (1990). Fixation errors: Failures to revise situation assessment in dynamic and risky systems. In A.G. Colombo & A. Saiz de Bustamante (Eds.), System reliability assessment (pp. 231-251). Dordrecht, The Netherlands: Kluwer Academic.

Feltovich, P.J., Coulson, R.L., & Spiro, R.J. (2001). Learners’ (mis)understanding of important and difficult concepts: A challenge to smart machines in education. In K. Forbus & P.J. Feltovich (Eds.), Smart machines in education. Cambridge, MA: AAAI/MIT Press.

Fugelsang, J.A., Stein, C.B., Green, A.E., & Dunbar, K.N. (2004). Theory and data interactions of the scientific mind: Evidence from the molecular and the cognitive laboratory. Canadian Journal of Experimental Psychology, 58, 86-95.

Klein, G., Phillips, J.K., Rall, E., & Peluso, D.A. (2007). A data/frame theory of sensemaking. In R.R. Hoffman (Ed.), Expertise out of context (pp. 113-155). Mahwah, NJ: Erlbaum.

Klein, G., Pliske, R., Crandall, B., & Woods, D.D. (2005). Problem detection. Cognition, Technology & Work, 7, 14-28.

Smith, P. (2018). Making brittle technologies useful. In P.J. Smith & R.R. Hoffman (Eds.), Cognitive systems engineering: The future of a changing world. Boca Raton, FL: CRC Press (Taylor & Francis Group).
