Motivated Reasoning
Human beings are not always—in fact, probably not often—the objective, rational creatures we like to think we are. In the past few decades, psychologists have demonstrated the many ways people deceive themselves in the process of reasoning. Cognitive faculties are a distinguishing feature of humanity—lifting humankind out of caves and enabling language, arts, and sciences. Nevertheless, they are also rooted in and subject to influence, or bias, by emotions and instincts.
One of the most significant ways information processing and decision-making become warped is through motivated reasoning, in which bias steers reasoning toward a preferred conclusion or decision—a process that often occurs outside of conscious awareness.
Cognitive scientists see motivated reasoning as a force that operates in many domains. Studies by political psychologists highlight denial of climate change, or discrediting of its science, as important examples of motivated reasoning; people process scientific information about climate shifts to conform to pre-existing feelings and beliefs. After all, accepting that climate change is real portends unpleasant environmental consequences and would require most people to make significant lifestyle changes to head them off. Changing one’s mind and changing one’s lifestyle are hard work; people prefer mental shortcuts—in this case, fitting the evidence to their ready-made conclusions.
Motivated reasoning operates in more personal spheres as well. For example, it is seen as a mechanism people commonly use to preserve a favorable identity, particularly in Western cultures. To maintain positive self-regard, people (unwittingly) discount unflattering or troubling information that contradicts their self-image. Individuals engage in motivated reasoning as a way to avoid or lessen cognitive dissonance, the mental discomfort people experience when confronted by contradictory information, especially on matters that directly relate to their comfort, happiness, and mental health. Rather than re-examining a contradiction, it’s much easier to dismiss it.
Most decisions we make, conscious or unconscious, are influenced by motivation; there is an intended purpose underlying those decisions. Yet those goals sometimes conflict with each other. The process of balancing and prioritizing competing goals can determine the reasoning we use, which often results in motivated reasoning.
A cognitive bias refers to a systematic error in the thinking process, of which there are many. For example, confirmation bias involves favoring ideas that confirm preexisting beliefs. The Dunning-Kruger effect occurs when people with a low level of knowledge in a given domain overestimate their knowledge or ability. The self-serving bias is the tendency to attribute successes to one’s actions and attribute failures to external circumstances.
Other common biases include the hindsight bias, the negativity bias, the sunk cost fallacy, the decline bias, the backfire effect, the fundamental attribution error, the in-group bias, and the Barnum effect.
Confirmation bias is the tendency for people to believe evidence that confirms their preexisting beliefs and discount information that counters those beliefs. Seeking to corroborate our beliefs comes naturally, while it feels uncomfortable and counterintuitive to look for evidence that contradicts them.
An example of confirmation bias can sometimes be found in anxious individuals; someone highly sensitive to rejection may interpret ambiguous social cues as confirming their belief that no one likes them.
Cognitive dissonance is the discomfort of holding two conflicting beliefs at one time. For example, if you think of yourself as an ethical person, but then cheat on a test, that would create cognitive dissonance that you might then try to reason your way out of. Motivated reasoning can function to reduce cognitive dissonance.
People can be drawn to conspiracy theories for several reasons. In addition to pervasive misinformation and siloed information ecosystems, psychological factors that drive belief in conspiracy theories include the desire for understanding and certainty, the desire to maintain a positive self-image, and the desire for control and security.
For example, the thinking might go, “If global temperatures are rising catastrophically due to human activity, then I’ll have to make painful changes to my lifestyle. But if pundits and politicians assure me that global warming is a hoax, I can maintain my current way of living.” This is an example of motivated reasoning.
Motivated reasoning is a natural human tendency. But just because cognitive biases are pervasive doesn’t mean they can’t be changed. There are ways to identify and overcome erroneous thinking, whether in an individual’s decision-making or in society’s broader communication challenges.
To avoid errors in decision-making that can accompany motivated reasoning, you can take a few concrete steps. First, avoid making decisions when experiencing intense emotions. Second, be attentive to situations in which you’re motivated to prove yourself right, which can render you vulnerable to reasoning errors. Third, avoid jumping to conclusions based on what you think you know, because assumptions are often baseless.
Several tips can help you cultivate critical thinking daily:
1. Save critical thinking for things that matter—prioritize the investment in time and effort for decisions with important consequences.
2. Think critically in the morning—otherwise, decision fatigue can set in.
3. Take a step back—take time to reflect on your knowledge and its limits.
4. Play devil’s advocate—identify four compelling points for and against a given position.
5. Set your emotions aside—strong emotions can be persuasive without being reliable.
Misinformation seems to be spreading faster than ever, partly because of the way it’s magnified by social media. In addition to social media companies adopting rigorous rules about the information they propagate, teaching people about critical thinking, science literacy, and media literacy can also help protect people from falling prey to conspiracy theories.
As machine learning has progressed, people have come to realize that these systems often absorb human biases from the datasets they are trained on and can amplify existing patterns of decision-making. Researchers are now aware of these challenges and are actively working to reduce bias in artificial intelligence—for example, by incorporating steps to detect, audit, and mitigate algorithmic bias.