


12 Common Biases That Affect How We Make Everyday Decisions

Make sure that the decisions that matter are not made based on bias.

Key points

  • Confirmation bias means that people favor ideas that confirm their existing beliefs.
  • Optimism bias leads people to overestimate the likelihood of positive outcomes, particularly when they are in a good mood.
  • Declinism refers to a bias in favor of the past, due to a resistance to change.

Though the concept of illusory superiority arguably dates back to Confucius and Socrates, it may come as a shock that its discussion in the form of the Dunning-Kruger Effect (Kruger & Dunning, 1999) is almost 20 years old. And though it may simply be a result of an echo chamber created through my own social media, the effect seems to be popping up quite frequently in the news and posts that I’ve been reading lately—even through memes. For those of you unfamiliar with the phenomenon, the Dunning-Kruger Effect refers to a cognitive bias in which individuals with a low level of knowledge in a particular subject mistakenly assess their knowledge or ability as greater than it is. Conversely, it also refers to experts underestimating their own level of knowledge or ability.

But, then again, maybe it’s not my echo chamber—maybe it is part and parcel of our new knowledge economy (Dwyer, 2017; Dwyer, Hogan & Stewart, 2014) and the manner in which we quickly and effortlessly process information (right or wrong) with the help of the internet. In any case, given the frequency with which I seem to have encountered mention of this cognitive bias lately, coupled with the interest in my previous blog post "18 Common Logical Fallacies and Persuasion Techniques," I decided it might be interesting to compile a similar list—this time, one of cognitive biases.

A cognitive bias refers to a "systematic error" in the thinking process. Such biases are often connected to a heuristic, which is essentially a mental shortcut—heuristics allow one to make an inference without extensive deliberation and/or reflective judgment, given that they are essentially schemas for such solutions (West, Toplak, & Stanovich, 2008). Though there are many interesting heuristics out there, the following list deals exclusively with cognitive biases. Furthermore, these are not the only cognitive biases out there (e.g., the halo effect and the just-world phenomenon); rather, in my experience, they are 12 of the most common biases affecting how we make everyday decisions.

1. The Dunning-Kruger Effect

To expand on the explanation above: experts are often aware of what they don’t know and (hopefully) engage their intellectual honesty and humility accordingly. In this sense, the more you know, the less confident you're likely to be—not because you lack knowledge, but because you're cautious about its limits. On the other hand, if you know only a little about something, you see it simplistically—biasing you to believe that the concept is easier to comprehend than it may actually be.

2. Confirmation Bias

Just because I put the Dunning-Kruger Effect in the number one spot does not mean I consider it the most commonly engaged bias—it is an interesting effect, sure; but in my critical thinking classes, the confirmation bias is the one I constantly warn students about. We all favour ideas that confirm our existing beliefs and what we think we know. Likewise, when we conduct research, we are all prone to seeking out sources that justify what we already believe about the subject. This bias highlights the importance of, as I discussed in my previous post on "5 Tips for Critical Thinking," playing devil’s advocate. That is, to overcome confirmation bias, we must consider both sides (or, if there are more than two, all sides) of the story. Remember, we are cognitively lazy—we don’t like changing our knowledge (schema) structures and how we think about things.

3. Self-Serving Bias

Ever fail an exam because your teacher hates you? Ever go in the following week and ace the next one because you studied extra hard despite that teacher? Congratulations, you’ve engaged the self-serving bias. We attribute successes and positive outcomes to our doing, basking in our own glory when things go right; but, when we face failure and negative outcomes, we tend to attribute these events to other people or contextual factors outside ourselves.

4. The Curse of Knowledge and Hindsight Bias

The curse of knowledge is similar in ways to the availability heuristic (Tversky & Kahneman, 1974) and, to some extent, the false consensus effect: once you (truly) understand a new piece of information, that information is now readily available to you and often seems obvious. It can be easy to forget that there was ever a time you didn’t know it, and so you assume that others, like yourself, also know this information: the curse of knowledge. However, it is often an unfair assumption that others share the same knowledge. The hindsight bias is similar to the curse of knowledge in that once we have information about an event, it then seems obvious that it was going to happen all along: "I should have seen it coming!"

5. Optimism/Pessimism Bias

As you probably guessed from the name, we have a tendency to overestimate the likelihood of positive outcomes, particularly if we are in good humour, and to overestimate the likelihood of negative outcomes if we are feeling down or have a pessimistic attitude. In either the case of optimism or pessimism, be aware that emotions can make thinking irrational. Remember one of my "5 Tips for Critical Thinking": Leave emotion at the door.

6. The Sunk Cost Fallacy

Though it is labeled a fallacy, I see the sunk cost effect as being just as much a bias as a form of faulty thinking, given the manner in which we think in terms of winning, losing, and breaking even. For example, we generally believe that when we put something in, we should get something out—whether it’s effort, time, or money. But sometimes we lose… and that’s it—we get nothing in return. A sunk cost refers to something lost that cannot be recovered. Our aversion to losing (Kahneman, 2011) makes us irrationally cling to the idea of recouping what has already been lost. In gambling, this is known as chasing the pot: we make a bet, lose, and then make another bet to recoup the original (and, hopefully, more), even though, rationally, we should consider the initial bet out-and-out lost. The appropriate advice here is to cut your losses.

7. Negativity Bias

Negativity bias is not totally separate from pessimism bias, but it is subtly and importantly distinct. In fact, it works according to mechanics similar to those of the sunk cost fallacy, in that it reflects our profound aversion to losing. We like to win, but we hate to lose even more. So, when we make a decision, we generally think in terms of outcomes—either positive or negative. The bias comes into play when we irrationally weigh the potential for a negative outcome as more important than that of a positive outcome.

8. The Decline Bias (a.k.a. Declinism)

You may have heard the complaint that the internet will be the downfall of information dissemination; but Socrates reportedly said much the same about the written word. Declinism refers to a bias in favour of the past over and above "how things are going." Similarly, you might know a member of an older generation who prefaces grievances with, "Well, back in my day," before following up with how things are supposedly getting worse. The decline bias may result from something I’ve mentioned repeatedly in my posts—we don’t like change. People like their worlds to make sense; they like things wrapped up in nice, neat little packages. Our world is easier to engage with when things make sense to us. When things change, so must the way in which we think about them; and because we are cognitively lazy (Kahneman, 2011; Simon, 1957), we try our best to avoid changing our thought processes.

9. The Backfire Effect

The backfire effect refers to the strengthening of a belief even after it has been challenged. Cook and Lewandowsky (2011) explain it very well in the context of changing people’s minds in their Debunking Handbook. The backfire effect may work from the same foundation as declinism, in that we do not like change. It is also similar to negativity bias, in that we wish to avoid losing and other negative outcomes—in this case, one’s idea is being challenged or rejected (i.e., perceived as being made out to be "wrong"), and thus one may hold on to the idea more tightly than before. However, there are caveats to the backfire effect—for example, we also tend to abandon a belief about specific facts if there is enough evidence against it.

10. The Fundamental Attribution Error

The fundamental attribution error is similar to the self-serving bias, in that we look for contextual excuses for our own failures but generally blame other people or their characteristics for theirs. It also may stem from the availability heuristic, in that we make judgments based only on the information we have at hand.

One of the best textbook examples of this integrates stereotyping: Imagine you are driving behind another car. The other driver is swerving a bit and unpredictably speeding up and slowing down. You decide to overtake them (so as to no longer be stuck behind such a dangerous driver) and, as you look over, you see a woman behind the wheel. The fundamental attribution error kicks in when you judge that her driving is poor because she’s a woman (tying into an unfounded stereotype). But what you probably don’t know is that the other driver has three children yelling and goofing around in the backseat, while she’s trying to get one to soccer, one to dance, and the other to a piano lesson. She’s had a particularly tough day and is now running late with all of the kids because she couldn’t leave work at the normal time. If we were that driver, we’d judge ourselves as driving poorly because of these circumstances, not because of who we are. Tangentially, my wife is a much better driver than I am.

11. In-Group Bias

As we have seen in considering the self-serving bias and the fundamental attribution error, we have a tendency to be relatively kind when making judgments about ourselves. Simply, in-group bias refers to the unfair favouring of someone from one’s own group. You might think that you’re unbiased, impartial, and fair, but we all succumb to this bias; we have evolved to be this way. That is, from an evolutionary perspective, this bias can be considered an advantage—favouring and protecting those similar to you, particularly with respect to kinship and the promotion of one’s own line.

12. The Forer Effect (a.k.a. The Barnum Effect)

As in the case of declinism, to better understand the Forer effect (commonly known as the Barnum Effect), it’s helpful to acknowledge that people like their world to make sense. If it didn’t, we would have no pre-existing routine to fall back on, and we’d have to think harder to contextualise new information. With that, if there are gaps in how we understand things, we will try to fill them in with what we intuitively think makes sense, subsequently reinforcing our existing schema(s). As our minds make such connections to consolidate our personal understanding of the world, it is easy to see how people tend to process vague information and interpret it in a manner that makes it seem personal and specific to them. Given our egocentric nature (along with our desire for nice, neat little packages and patterns), when we process vague information, we hold on to what we deem meaningful to us and discard what is not. Simply, we better process information we think is specifically tailored to us, regardless of its ambiguity. Specifically, the Forer effect refers to the tendency for people to accept vague and general personality descriptions as uniquely applicable to themselves, without realizing that the same description could apply to just about everyone else (Forer, 1949). For example, when people read their horoscope, even vague, general information can seem like it’s advising something relevant and specific to them.

While heuristics are generally useful for making inferences by providing us with cognitive shortcuts that help us stave off decision fatigue, some forms of heuristics can make our judgments irrational. Though various cognitive biases were covered in this post, these are by no means the only biases out there—just the most commonly engaged, in my experience, with respect to everyday decision-making. If you’re interested in learning more about these and other cognitive biases, I recommend checking out yourbias.is. Remember, we make thousands of decisions every day, some more important than others. Make sure that the ones that do matter are not made based on bias, but rather on reflective judgment and critical thinking.

References

Cook, J., & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland. Retrieved from http://www.skepticalscience.com/docs/Debunking_Handbook.pdf

Dwyer, C.P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press; with foreword by former APA President, Dr. Diane F. Halpern.

Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity, 12, 43–52.

Forer, B. R. (1949). The fallacy of personal validation: A classroom demonstration of gullibility. Journal of Abnormal Psychology, 44, 118–121.

Kahneman, D. (2011). Thinking, fast and slow. Great Britain: Penguin.

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134.

Simon, H. A. (1957). Models of man. New York: Wiley.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

West, R. F., Toplak, M. E., & Stanovich, K. E. (2008). Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology, 100(4), 930–941.
