
The Danger of Searching for One True Cause

Why our preference for simple explanations is a problem.

Source: Pixabay / jarmoluk

What caused Trump to win the 2016 election? Was it economic anxiety? Racism? Sexism? Russian hacking? Immigration? Political correctness?

Implicit in questions like these is the idea that complex events like the 2016 election have only one cause. When a headline announces that “A new study reveals the real reason Obama voters switched to Trump,” the assumption is that the switch had one true cause, and that once this cause is identified, there is no need to look for others. This tendency to assume that complex events have only one cause is sometimes called “the fallacy of the single cause.”

Research suggests that people tend to prefer simple, single-cause explanations to more complex ones. One study led by UC Berkeley psychologist Tania Lombrozo found that participants are more likely to believe that two symptoms are caused by one disease than by two separate diseases, even when shown information suggesting that the two-disease explanation is more probable. Even young children prefer single-cause explanations: When shown a toy whose light turns on and whose fan spins, children are more likely to think both effects come from a common cause than from two independent causes.

What causes this tendency to think in terms of only one cause — or, should we say, what are some of the causes behind single-cause thinking? And what are the consequences of it?

Perhaps our prototypical understanding of causality consists of only one cause and one effect. This is what cognitive scientist George Lakoff argues in Metaphors We Live By, the classic book he co-authored with philosopher Mark Johnson. Lakoff believes that our understanding of causality is rooted in our understanding of the physical world. We learn about causality from a young age through our experience of manipulating objects: If we drop something, it falls, and if we hit something, it moves.

It makes sense for this prototypical idea of causality to be linear, with one cause and one effect, because this is typically how physical causality works. However, this conception of causality fails to explain more complex issues, such as global warming, poverty, or the results of the 2016 election. To understand issues like these, Lakoff argues that we need to think in terms of “systemic causation,” which involves multiple interacting causes, feedback loops, and probabilistic causes. Yet Lakoff acknowledges that thinking this way does not come naturally to us and is harder to express in ordinary language.

We might also find simple explanations attractive in their own right. In his book Factfulness, Hans Rosling says that we find simple ideas alluring because we like the feeling of really knowing and understanding something. But this can lead us to engage in what Rosling calls the “Single Perspective Instinct,” the tendency to think there is a single cause behind complex problems that can be solved with a single solution. For instance, it might be tempting to think that government regulation is the root cause of all problems and that removing regulation is the single solution. Or it might be tempting to view inequality as the true cause of all problems and see redistribution as the single solution. But, as Rosling argues, this instinct leads us to misunderstand the complexity of the world around us.

It may also be comforting to place all the blame for a complex problem on a single cause. In a previous blog post, I wrote about a series of studies suggesting that blaming an enemy can reduce existential anxiety and give us a greater sense of control. Similarly, it might be comforting to believe that societal issues stem from one root cause, because this would mean the problem can be understood, contained, and controlled. Recognizing the true complexity of an issue can lead to the unsettling realization that we do not fully understand it and that it may be much harder to solve than we think.

Of course, simple explanations that appeal to fewer causes can hold some benefits. People often defer to Occam’s Razor, the principle that, all else being equal, the simpler of two explanations is usually the better one. Simple explanations are easier to communicate, and they can make it easier for us to reliably predict and prevent future events.

Yet we should not assume that simple explanations are any more likely to be true. In fact, research suggests that our explanations of the world around us are often sparse and incomplete. Despite this, we think we can explain the world in much more depth than we actually can, a bias called the illusion of explanatory depth. One study found that people believe they understand policy positions much better than they actually do: When asked to explain how these policies work, describing, for instance, how a single-payer healthcare system operates, they come to a halt and realize how little they actually know. The researchers also found that the act of trying to explain how these policies work leads people to develop more moderate opinions about them, suggesting that overconfidence in our knowledge may fuel more extreme opinions. Similarly, single-cause thinking may undergird extreme beliefs, leading people to believe that complex problems can be addressed with simple solutions that focus on eliminating a single root cause.

Things are often far more complex than we would like to believe, and complex systemic issues are unlikely to have only one cause. So instead of asking “What caused Trump to win the 2016 election?” we can ask questions that acknowledge the causal complexity of the world around us, like “What were some of the factors that caused Trump to win the 2016 election?”
