

Belief Bias, Polarization, and Potential Solutions

There may be a way to have comfortable political conversations after all.

Key points

  • We often evaluate arguments and evidence by their believability rather than their logical validity.
  • This belief bias is found in both low-stakes and high-stakes reasoning.
  • Good reasoning may help overcome belief bias, but it probably won't solve it.
  • Overcoming belief bias may also require us to focus on our superordinate, shared beliefs.

If all flowers have petals and roses have petals, should we conclude from those claims that roses are flowers? Many people mistakenly think that we should (Markovits & Nantel, 1989).

Let me explain the mistake in case it wasn't obvious. That conclusion doesn't follow (logically) from those two premises; in fact, the implied argument commits the fallacy of affirming the consequent. The easiest way to reveal the problem is with an analogous argument: if all pregnant people have uteri and elderly grandmothers have uteri, should we conclude that elderly grandmothers are pregnant? Of course not! Having a uterus is not a sufficient condition for being pregnant!
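If it helps to see the structure laid bare, here is a minimal sketch of a brute-force check of the argument form (my illustration, not part of the cited research). It treats "is a flower," "has petals," and "is a rose" as simple true-or-false properties of a single thing and searches for a case where both premises hold but the conclusion fails; finding even one such case shows the form is invalid.

```python
from itertools import product

def implies(p, q):
    """Material conditional: 'if p then q' fails only when p is true and q is false."""
    return (not p) or q

# Premise 1:  if x is a flower, then x has petals.
# Premise 2:  if x is a rose,   then x has petals.
# Conclusion: if x is a rose,   then x is a flower.
# The form is invalid if some assignment of the three properties makes
# both premises true while the conclusion is false.
for is_flower, has_petals, is_rose in product([False, True], repeat=3):
    premise_1 = implies(is_flower, has_petals)
    premise_2 = implies(is_rose, has_petals)
    conclusion = implies(is_rose, is_flower)
    if premise_1 and premise_2 and not conclusion:
        print(f"Counterexample: is_flower={is_flower}, "
              f"has_petals={has_petals}, is_rose={is_rose}")
```

The counterexample it prints describes something that has petals and falls under "rose" in the premises, yet is not a flower. Nothing in the premises rules that case out; only our background belief that roses are flowers does, and that is exactly the kind of background belief the bias trades on.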

What Is the Bias?

So why would people be more likely to make this mistake with the first argument than the latter argument? One answer suggests that it is the believability of the conclusion: we probably believe that roses are flowers and our comfort with that conclusion may make us unlikely to look for a fallacy in the argument that was supposed to support it. However, we probably don't believe that elderly grandmothers tend to be pregnant. So when we encounter that conclusion, we are more likely to be on the lookout for a fallacy in its argument. This phenomenon is known as belief bias: we tend to evaluate arguments according to the believability of their conclusions rather than their logical structure.

Psychological scientists have been finding evidence of belief bias for most of the last century (e.g., Janis & Frick, 1943). And it's not just less-educated people who tend to exhibit belief bias: Some of the earliest scientific detections of belief bias were among graduate students at Columbia University (ibid.).

Why Care About Belief Bias?

Maybe you don't care if evaluations of silly arguments about roses and grandmothers are biased by people's prior beliefs. After all, it's not as though there are lives on the line with such mundane logic puzzles. Belief bias would probably matter outside of psychological science labs only if it showed up in high-stakes argument evaluation.

Well, it turns out that people do exhibit belief bias about more high-stakes arguments. For example, when evaluating evidence about immigration or gender quota policies (Strömbäck, Andersson, Västfjäll, and Tinghög, 2021), people who hold views that conflict with the evidence being presented to them are more likely to misinterpret that evidence. Similar results have been found for evidence about climate change (e.g., Stenhouse et al., 2018) and even COVID-19 misperceptions (Pennycook, McPhetres, Bago, and Rand, 2020).

What Can We Do About Belief Bias?

Presumably, the solutions to belief bias are simple. Intuitively, if we just become more open to alternative beliefs and double-check our initial impulses, then we'll be less likely to make the mistake of evaluating arguments and evidence according to their believability. And there is some evidence consistent with this. The better people performed on tests designed to trick them into relying on faulty intuitions, the more likely they were to correctly estimate the accuracy of fake news and the less likely they were to share it, even when the headlines aligned with their partisan beliefs (Pennycook & Rand, 2019). Also, agreeing with statements like "People should take into consideration evidence that goes against conclusions they favor" has correlated with being more accepting of human-caused climate change (Stenhouse et al., 2018).

However, even that last correlation held across political orientations, suggesting that a preference for actively open-minded thinking helps with controversial topics in general, but not necessarily with belief bias itself. After all, if open-minded preferences were overcoming politically motivated beliefs, then the correlation would have been stronger among one political bloc than another, but it wasn't.

Moreover, better reasoning performance sometimes predicts evaluating arguments and evidence more in line with one's prior beliefs, not less (e.g., Shoots-Reinhard et al., 2021). In other words, open-minded preferences and good reasoning can help, but they may not be a panacea when it comes to belief bias.

So what else can we do? In a recent paper, I propose that we may need to reframe how we think about arguments, evidence, and our prior beliefs. For example, if you and I disagree about a polarized issue, then we will be likely to evaluate the arguments and evidence in ways that align with our political commitments and affiliations. However, if we instead focus on prior beliefs that we share (e.g., that we trust the scientific method, or that we ought to adapt our beliefs to the best evidence), then we may be more likely to agree about how to evaluate the relevant evidence and arguments (Byrd, 2022).

Of course, if our discussion is heated or we feel that our team is being attacked, then this unity reframing exercise may be more challenging. However, some evidence suggests that if we eliminate perceived threats and attacks, people may be less biased by us-versus-them thinking (Roberts & Davidai, 2021).

My proposal is, of course, hypothetical. So perhaps you can test its plausibility: the next time you have a chance to talk about a polarized issue with someone whose affiliations or commitments differ from yours, try affirming their group's concerns or achievements before discussing the topics you disagree about. After all, they're probably concerned about the future, their family, and their bank account, just like the rest of us. See how that impacts the outcome of the conversation.

References

Byrd, N. (2022). Bounded Reflectivism & Epistemic Identity. Metaphilosophy, 53(1). https://doi.org/10.1111/meta.12534

Janis, I. L., & Frick, F. (1943). The relationship between attitudes toward conclusions and errors in judging logical validity of syllogisms. Journal of Experimental Psychology, 33(1), 73–77. https://doi.org/10.1037/h0060675

Markovits, H., & Nantel, G. (1989). The belief-bias effect in the production and evaluation of logical conclusions. Memory & Cognition, 17(1), 11–17. https://doi.org/10.3758/BF03199552

Pennycook, G., McPhetres, J., Bago, B., & Rand, D. (2020). Attitudes about COVID-19 in Canada, the U.K., and the U.S.A.: A novel test of political polarization and motivated reasoning. https://doi.org/10.31234/osf.io/zhjkp

Pennycook, G., & Rand, D. G. (2019). Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking. Journal of Personality, 88(2), 185–200. https://doi.org/10.1111/jopy.12476

Roberts, R., & Davidai, S. (2021). The psychology of asymmetric zero-sum beliefs. Journal of Personality and Social Psychology. https://doi.org/10.1037/pspi0000378

Shoots-Reinhard, B., Goodwin, R., Bjälkebring, P., Markowitz, D. M., Silverstein, M. C., & Peters, E. (2021). Ability-related political polarization in the COVID-19 pandemic. Intelligence, 88, 101580. https://doi.org/10.1016/j.intell.2021.101580

Stenhouse, N., Myers, T. A., Vraga, E. K., Kotcher, J. E., Beall, L., & Maibach, E. W. (2018). The potential role of actively open-minded thinking in preventing motivated reasoning about controversial science. Journal of Environmental Psychology, 57, 17–24. https://doi.org/10.1016/j.jenvp.2018.06.001
