
The Arguing Ape Hypothesis

Why the world needs more (not less) biased and lazy reasoning.

Source: euranet_plus/Flickr

Most people are better at reasoning than it seems -- even those you think voted for the wrong candidate in the last election. I'm not saying people are especially good at rationality. I'm just saying that they're pretty good at reasoning. And I'm not saying all of their reasoning has been productive. I'm just saying that the problem isn't with their reasoning per se. It has more to do with the environment in which they do their reasoning. If you're a bit skeptical, please bear with me as I set things up with a brief autobiographical detour.

When I was in graduate school, a colleague of mine was trying to determine which virtues we should develop if we want to put ourselves in a position to believe more true things and fewer false things.

I found his project interesting, but I was also a little bothered about something. He seemed to want everyone to develop the same set of epistemic virtues, such as conscientiousness, tenacity, imaginativeness, curiosity, discernment, etc.

Frankly, I didn't like his proposal. All that sameness. Yuck! But I also had some pragmatic concerns about the idea that we should all develop the same set of virtues.

Suppose there are people who are high in curiosity and low in discernment, while others have the opposite profile. What's wrong with that? Can't we just allow the curious folks to dig up all kinds of new ideas, and then let the discerning people test them? Moreover, is it possible that this arrangement would actually help us advance our collective knowledge more efficiently than the alternative where everyone keeps developing their weaknesses until they catch up with their strengths?

Not long after these discussions, a visiting scholar spoke at a departmental symposium. He argued that we should follow some stricter norms for scholarship. I'm relying on faded memory at this point, but he put forth something like the following principle:

Before speaking or publishing anything, scholars should try to anticipate and address every objection they can think of.

He wasn't just saying that some people in a particular scene had gotten a little sloppy and could stand to do a better job of anticipating objections because they were wasting everybody's time. His point was that we all have an ethical duty to go the extra mile in anticipating objections, and we should be embarrassed, and perhaps even open to censure, for the objections we leave unaddressed. In the Q&A period I asked him something like the following:

I agree that we should often anticipate and address objections before presenting our ideas. But how much time should scholars invest in this activity? Aren't other scholars, collectively, in a better position to generate objections to our views than we are? Can a case be made for some division of labor here?

The question got mixed reviews. The speaker more or less brushed it off, while one of the more distinguished professors in the room mumbled under her breath, "No, that's a really good question."

Good question or not, I didn't give it much more thought until just this year, when I read Hugo Mercier and Dan Sperber's wonderful new book "The Enigma of Reason".

The Double Enigma of Reason

Mercier and Sperber claim that, when viewed through the lens of evolutionary biology, Reason is doubly enigmatic.

The first enigma stems from the observation that reason seems to be a superpower unique to humans. If our development is more or less continuous with that of other animals, how did we get this ability that seems so radically different from that of any other creature?

The second enigma grows out of the observation that we seem to be pretty bad at reasoning much of the time. We are prone to bias and laziness, and we often persist in holding our beliefs in spite of overwhelming evidence to the contrary. If reason is an adaptation, why do we seem to be so bad at it?

Reason is doubly enigmatic because evolution doesn't generally create superpowers out of nowhere, and most adaptations aren't this flawed.

Mercier and Sperber try to demystify this double enigma. In the first half of the book they make the case that reason isn't really so very different from things other animals do. And in the second half they make the case that reason isn't as flawed as it appears to be. It only seems broken when we attribute the wrong function to it and use it in evolutionarily novel contexts.

According to Mercier and Sperber, the purpose of reason is not to make us better individual thinkers, but to make us better communal thinkers. And they claim that, in fact, most people do reason well when they engage their tribe-mates in argumentation.

And that seems to me to be a bold, interesting, and counter-intuitive claim that raises many questions.

Why Is Our Reasoning So Biased and Lazy?

If you've engaged in even a little bit of political wrangling on the internet over the last couple of years, you've probably noticed that people can be biased and lazy in their reasoning. People share articles without reading them. They repeat statistics without fact-checking them. They overestimate the strength of considerations supporting their side. And they often underestimate both the relevance and strength of considerations offered by their opponents.

So the question is: why? Why are we so naturally biased and lazy in our reasoning?

One of Mercier and Sperber's explanations goes something like this:

Suppose you and I each have in our minds a certain amount of information and a limited number of mental models for interpreting information. If my mind generates an idea, it's probably going to be at least somewhat congruent with the information and mental models I have available to me. Like most new ideas, my idea should probably be tested, and I might be able to think up some objections on my own, but that might also prove difficult, because the idea was generated by my mind, and, by default, fits fairly well with my view of the world.

But you have access to different information and different ways of looking at things. When you hear my idea, it's less likely to fit with your view of the world. And that means you are in a much better position to generate objections to my idea than I am. In many cases it won't even pay for me to spend much time vetting my own ideas. I'm better off just putting them out there and letting you do your thing. The roles we each play will be next to effortless, we'll both learn something from the process, and we'll do it quickly.

So my laziness can allow me to take advantage of your knowledge. And I return the favor by being a little stubborn. If I am somewhat biased in favor of my own ideas, I will put more effort into defending my idea. And that will give you a chance to see some of the relevant information and mental models I carry around in my mind.

If we are both a bit lazy and stubborn, soon enough we will pool together a wide range of relevant considerations. And, because we are now working from a similar set of considerations, our attitudes might converge. And because the set of considerations is larger than either of us started with, we will be evaluating the idea in a brighter light, and our opinions are likely to be better grounded.

In the right room, bias and laziness are not bugs, but features of human reasoning.

Why Are Bias And Laziness Considered Bad?

If reasoning works best when it is biased and lazy, then how did we ever get the crazy idea that bias and laziness are bad things?

Mercier and Sperber place some of the blame on the development of Rationality, and it seems to me that we can also place some of the blame on Cosmopolitanism. Let's consider these two factors in turn.

Somewhere along the way the people who spent the most time thinking about Reason got two ideas in their heads: 1) Reason is for individual thinkers, and 2) Reason is for getting at objective truth. And it's pretty clear that, for individuals trying to get at objective truth, bias and laziness are decidedly detrimental.

Now it's worth noting that, if this rationalist characterization of Reason has in fact been wrong, it's been a very useful fiction. When we elevated the goal of objective truth above the goal of social coordination we cleared the way for Science. And the emphasis on the individual reasoner reinforced the idea that the physical world, and not the community, should have the final say. But that doesn't mean the adaptive function of reasoning is to help individuals pursue objective truth. It just means we figured out a way to take our natural reasoning instincts and put them (awkwardly) to new uses -- much like bears learning to ride tricycles (only more useful).

And, truth be told, scientific practice still makes liberal use of biased and lazy humans arguing with each other just as they always have.

But we also think bias and laziness are bad because, in contemporary (cosmopolitan) political discourse, we have seen them take center stage in the presence of bad reasoning. And they have become guilty by association.

The suggestion here is that the problems with modern discourse aren't caused by bias and laziness per se. They are caused by the fact that we are reasoning in contexts in which bias and laziness aren't able to do their jobs.

And to make that case, I want us to consider the difference between late night bull sessions and contemporary social media discourse.

Social Media Arguments vs Late Night Bull Sessions

In a bull session lazy thinking is rewarded. If something pops into your mind, you just say it. And why not? If someone thinks you're full of crap, they'll tell you. All you know is that you're not going to give up your claim until you've pushed it as far as you can. The back and forth is vigorous, and, at the end of the evening, no one much cares which side of the argument prevails. It's just fun to hash it all out.

Bull sessions are usually more or less face to face, and tend to occur between friends who have known each other awhile. There is little concern about saying something dumb, because pecking orders are already well established, and status adjustments come in small increments these days. Plus it's late, and substances may or may not have been consumed, so, if someone does say something dumb, who can blame them?

In late night bull sessions, bias and laziness are free to become the potent forces of good they were meant to be.

But when arguments occur between strangers and concern material conflicts of interest across social identity lines, things are very different. Bias and laziness aren't able to do their jobs, because they don't facilitate a pooling of considerations. And that's because people don't trust (and are sometimes even afraid of) the ideas and mental models in the minds of the people on the other side of the issue.

Participants are more likely to use the words of the other as evidence of the wickedness or stupidity of the other than they are to use them to expand their understanding of the issue.

The problem with contemporary discourse is not bias and laziness. It's tribalism. Bull sessions tend to be intra-tribal discussions among longtime friends. Contemporary discourse across social identity lines, when the issues are salient to the differential material outcomes for those identity groups, is more like inter-tribal warfare.

This makes for a souped-up kind of motivated reasoning. These are fights over scarce resources, but they're worse than the degenerate arguments friends might have over who should eat the last slice of pizza. In these cases all kinds of social motivations are also in play. Our tribe-mates are watching and judging the way we interact with the other side. Charges of heresy and treason lurk under the surface. We have to be very careful to avoid giving the ideas of the other side too much credit. And there are positive rewards for twisting and misrepresenting those ideas. And that means the pace of pooling considerations and converging on mutually agreeable solutions will slow considerably, if it doesn't stop entirely.

How Can We Argue Better?

The suggestion here is that reasoning worked well in the environment of evolutionary adaptedness (EEA). And the EEA was much more like late night bull sessions than it was like contemporary online political discourse.

And if that's true, then we might do well to consider whether there are ways to make our arguments more like late night bull sessions, and less like war. Are there ways to take our mutual suspicion and replace it with some degree of mutual appreciation for the considerations that are locked up inside the minds of those on the other side?

I'm tempted to settle for simply asking the question without trying to answer it. I'm probably not in a much better position to answer it than any individual reader is. But if we're going to figure this out together, it can't hurt for me to get the ball rolling and put a couple of ideas out there.

1. Establish a Common Social Identity. My first suggestion is the same as the suggestion I made at the end of my last post ("Kids, Would You Please Lower Your Weapons?"). We have some power to choose whether we will approach an argument as friends or foes. Do we wish to argue as "Republicans" and "Democrats" in a zero-sum war? Or would we rather argue as "Americans" trying to figure out the best way to live together, while looking for win-win ways of dealing with our conflicting concerns? Our natural tendency toward bias and laziness will be much more useful when we frame our discourse as in-group problem-solving rather than in-group/out-group warfare.

2. Go right after the trust issue. If we aren't inclined to trust the considerations our conversation partner offers, why go through the motions? And if they aren't prepared to trust anything we offer, why waste our time? How many arguments do we have with others when both sides have already written off anything the other side might say before they ever open their mouths?

Perhaps, when we sense that tribalistic fervor is running high, we should cut to the chase and just ask something like this question:

"Do you think you could benefit at all from anything I have to say about this topic?"

If the answer is "no", then we can stop wasting each other's time. But there's also a chance that the question will make us both stop and consider for a moment whether we might benefit from hearing how things look from other points of view. And if it seems like the back and forth might make us all better off, we can proceed.

If we want more productive political discussions, perhaps it's time to stop worrying so much about the bias and laziness of the participants, and start worrying more about the fundamental distrust we have for members of other tribes.
