4 Outcomes of Lazy Thinking

Using heuristics to understand why people fall prey to fake news.

In a recent New York Times article, “Why Do People Fall for Fake News?”, psychologists Gordon Pennycook and David Rand discuss people’s susceptibility to (strategic) misinformation. They present two leading explanations for why people believe such misinformation, which essentially boil down to rationalization and laziness. Much like many past posts in this blog, the article tackles faulty thinking and barriers to Critical Thinking.

With respect to rationalization, what is discussed is in many ways associated with confirmation bias: when presented with (mis)information, particularly on a politically charged issue, people will rationalize it in a manner that persuades them of what they want to be true. What can make this difficult to distinguish from fact (assuming one is not privy to the facts) is that rationalizing in this manner does require reasoning; so, it isn’t correct to say that someone who infers an inaccurate conclusion through rationalization isn’t reasoning at all. Rather, they’re applying their reasoning incorrectly and uncritically.

The laziness perspective is by no means new; it’s consistent with the work of two Nobel Prize winners. According to Daniel Kahneman (2011), we are lazy thinkers and cognitive misers, given that it takes additional cognitive effort to assimilate information and simultaneously assess its truth; and according to Herbert Simon (1957), rather than engage in extensive, reflective decision-making, people generally settle for a decision or solution that is satisfactory, or simply “good enough” (i.e. satisficing). This is why people go with their gut, their intuition, which is a big no-no for critical thinking. With that said, intuition is generally good at what it does: it saves our cognitive energy for the things that matter and staves off decision fatigue. However, people often apply it to ‘things that matter’ as well; hence the problem of people falling for misinformation.

An important aspect of the laziness we’ve been discussing is the tendency to apply a general mental framework to information: a heuristic. A large body of research indicates that heuristic-based thinking explains both the manner in which intuitive judgments come to mind and why they are often limited or grossly incorrect (Kahneman, 2011; Tversky & Kahneman, 1974). A heuristic is a “simple” experience-based protocol for problem-solving and decision-making, which acts as a mental shortcut: a “procedure that helps find the adequate, though often imperfect, answers to difficult questions” (Kahneman, 2011, p. 98). The misapplication of heuristics can lead to fallacious reasoning and cognitive biases (Gilovich, Griffin, & Kahneman, 2002; Kahneman, 2011; Kahneman, Slovic, & Tversky, 1982; Slovic, Fischhoff, & Lichtenstein, 1977; Tversky & Kahneman, 1974).

Let’s now turn our attention to four specific heuristics: three classics famously identified by Tversky and Kahneman (1974) and a more modern, though commonly encountered, one. These are, in effect, four outcomes of lazy thinking.

1. The Availability Heuristic (Tversky & Kahneman, 1974) refers to a mental framework whereby people base a judgment on the ease with which they can bring relevant information, such as an example, to mind. Consider the classic example:

Are there more words that begin with the letter ‘K’ than those with ‘K’ as the third letter?

Without adequate reflection, many answer that there are more words that begin with the letter “K,” given the relative ease with which these words come to mind compared with words that have “K” as their third letter (i.e. when searching your memory for words with the letter “K,” kitten comes to mind faster than hake). In reality, however, there are substantially more words in the English language with “K” as their third letter than with “K” as their first. In a more real-world context, how often things appear in the news can lead us to think that such occurrences are much more common than is actually the case. If you’re curious, you can check the word counts for yourself, as sketched below.
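Here is a minimal sketch of one way to check the claim against a word list. The path /usr/share/dict/words is an assumption (such lists vary by system and by source), so treat the counts as illustrative rather than a definitive census of English.

```python
# Minimal sketch: count words with "k" as the first vs. third letter in a
# local word list. The path below is an assumption and varies by system.
def count_k_positions(path="/usr/share/dict/words"):
    first = third = 0
    with open(path) as f:
        for line in f:
            word = line.strip().lower()
            if word.startswith("k"):
                first += 1
            if len(word) >= 3 and word[2] == "k":
                third += 1
    return first, third

if __name__ == "__main__":
    first, third = count_k_positions()
    print(f"'k' first: {first}; 'k' third: {third}")
```

On large word lists, the third-letter count should come out well ahead, consistent with Tversky and Kahneman’s observation.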

2. The Representativeness Heuristic (Tversky & Kahneman, 1974) is a shortcut for judging the likelihood of a phenomenon. When individuals rely on this heuristic, they are often wrong as a result of substituting what they perceive as representative of the real world for the actual likelihood of something. For example:

Which outcome is more likely when playing at a fair roulette table?

Black, Red, Red, Black, Red, Black

or

Black, Black, Black, Red, Red, Red

Again, without adequate reflection (or perhaps statistical knowledge), many fail to realize that both sequences are equally likely (a quick calculation, sketched below, shows why); instead, they report that the first is more likely because it looks like what they consider to be “random.” Intuition has a very poor understanding of statistics and, in particular, of the nature of true randomness (Kahneman, 2011). In a more real-world context, stereotyping is a prime example of how we use the representativeness heuristic: we apply the information we hold as representative to a novel situation or piece of information (e.g. librarians are quiet and organized).
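To see why the two sequences are equiprobable: each spin of a fair red/black wheel is independent, so any specific six-spin sequence has probability (1/2)^6 = 1/64. A minimal sketch (ignoring the green zero for simplicity):

```python
import random

# Any specific six-spin sequence on a fair red/black wheel has probability
# (1/2)**6 = 1/64 (the green zero is ignored for simplicity).
print((1/2) ** 6)  # 0.015625, for either sequence

# Quick simulation: both sequences occur at roughly the same rate.
def estimate(seq, trials=100_000):
    hits = sum(
        [random.choice("BR") for _ in range(6)] == list(seq)
        for _ in range(trials)
    )
    return hits / trials

print(estimate("BRRBRB"), estimate("BBBRRR"))  # both come out near 0.0156
```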

3. The Anchoring (and Adjustment) Heuristic (Tversky & Kahneman, 1974) refers to a mental rule of thumb whereby people use the initial piece of information presented to them (often a numerical value) as a starting point and then make a judgment or decision by adjusting away from this anchor. For example, consider the following two questions in turn:

Was Winston Churchill more or less than 40 when he died?

How old was Churchill when he died?

The vast majority realize that Churchill was well over 40 when he died; nevertheless, the first question implies that he was somewhere around 40, which primes people to consider that he was not as old as they might otherwise have thought. Perhaps 55 to 65 years old is reasonable to suggest? However, this would be wrong. Now, consider the following questions:

Was Winston Churchill more or less than 120 when he died?

How old was Churchill when he died?

Though 120 may seem like a preposterous age to many (most will, of course, have answered younger), it may have allowed you to adjust your answer to somewhere between 85 and 95, which, in its own right, would be a ripe old age. Interestingly, this would be accurate: Churchill died at 90, and it seems that the more preposterous anchor was in fact the more reasonable one (a spin on the classic Gandhi anchoring-and-adjustment example). Herein lies the contradictory nature of heuristics: people very often fall prey to an anchor because, in many situations, anchoring is the reasonable thing to do. When we are presented with difficult questions, we have a tendency to clutch at straws, and the anchor is a plausible straw (Kahneman, 2011).

Again, in a more real-world context, imagine shopping for a used car and finding one you’re interested in. You ask the salesman for the price, and he tells you that the car costs $8,900. Though you realize that used car salesmen are stereotyped as notorious for overcharging (ahem, representativeness heuristic), it remains that the counter-offer you propose is based on that initial figure and may still be well over the actual value of the car. So, in situations where negotiation is likely, be sure to be the first to offer a price!

4. The Affect Heuristic (Kahneman & Frederick, 2002) refers to a mental shortcut in which judgments are made in light of the thinker’s current emotion (e.g. disappointment or pleasure). Kahneman and Frederick’s research on this heuristic is based in part on a study conducted by Strack, Martin, and Schwarz (1988), in which college students were asked the following questions:

How happy are you with your life in general?

How many dates did you have last month?

The correlation between answers to the two questions was small when they were asked in this order, but substantially larger when the order was switched. This suggests that individuals who were asked the dating question first were primed by a question about their romantic life, which elicited an emotional reaction; that reaction heavily influenced how the following question was answered. As with the availability heuristic, individuals who have just been asked how frequently they date have information about their romantic life readily available and easily accessible, and they are likely to draw more heavily on this information than to search other facets of their life for consideration. Once again, in a more real-world example, educators commonly observe that when students are asked difficult questions and do not know the answer (i.e. when they lack the required knowledge), they more often than not respond with a related, emotion-based belief or attitude (Slovic et al., 2002).

Since the work of Tversky and Kahneman (1974), numerous other heuristics have been proposed and researched (e.g. the recognition, similarity, fluency, and effort heuristics, to name a few), all of which are used as a result of lazy thinking; and, likewise, their use has been rationalized. To overcome lazy thinking and rationalized biases, and to avoid falling prey to misinformation, Pennycook and Rand, along with many others in the field of Critical Thinking, recommend devoting more time, effort, and resources to the spread of accurate information, as well as to encouraging and training people to think critically.

References

Gilovich, T., Griffin, D., & Kahneman, D. (Eds.). (2002). Heuristics and biases: The psychology of intuitive judgment. Cambridge: Cambridge University Press.

Kahneman, D. (2011). Thinking, fast and slow. Great Britain: Penguin.

Kahneman, D., & Frederick, S. (2002). Representativeness revisited: Attribute substitution in intuitive judgment. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 49–81). New York: Cambridge University Press.

Kahneman, D., Slovic, P., & Tversky, A. (Eds.). (1982). Judgment under uncertainty: Heuristics and biases. Cambridge: Cambridge University Press.

Simon, H. A. (1957). Models of man. New York: Wiley.

Slovic, P., Finucane, M., Peters, E., & MacGregor, D. G. (2002). Rational actors or rational fools: Implications of the affect heuristic for behavioral economics. The Journal of Socio-Economics, 31(4), 329–342.

Slovic, P., Fischhoff, B., & Lichtenstein, S. (1977). Behavioral decision theory. Annual Review of Psychology, 28, 1–39.

Strack, F., Martin, L. L., & Schwarz, N. (1988). Priming and communication: Social determinants of information use in judgments of life satisfaction. European Journal of Social Psychology, 18(5), 429–442.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.
