
Our Biggest Blind Spot

Become aware of one of the strongest opinions you didn't know you held.

Sometimes our strongest traits can be our greatest weaknesses. Similarly, sometimes our strongest opinions can lead to our greatest misperceptions.

So, what’s one of your strongest opinions? Maybe your political preference? Maybe your opinion on black licorice? Maybe whether you think it’s appropriate to wear socks with sandals?

Today, the specific, strong opinion I’m referring to is one that many hold: your opinion of yourself. That is, most people hold a very strong and positive attitude toward themselves, which contributes to one of the most common phenomena in human psychology…

The Bias Blind Spot

In general, people want to hold objective and accurate opinions. At the same time, people are aware that outside factors like limited information, strong emotions, and self-interest can all bias our opinions (i.e., lead us to hold opinions that are not objective or accurate).

In general, people are pretty good at recognizing the influence of bias on others. For example, if someone picks up an extra bag of chips at the store, you might realize they don’t necessarily love those chips; their decision to make the extra purchase may have been biased by advertising.

However, when it comes to evaluating our own decision-making processes, we are much less likely to observe (and admit to) any biasing influence. That is, the ‘bias blind spot’ refers to people’s tendency to underestimate the extent to which their own decision-making processes are biased.

In a classic study, researchers first described a variety of different psychological biases to participants. For example, participants were informed about the ‘halo effect’ – the bias in which we evaluate attractive people more positively (e.g., we think they’re more trustworthy) simply because they’re attractive. (In reality, attractiveness is unrelated to trustworthiness.)

After informing participants of these different decision-making biases, the researchers asked (1) how susceptible other people were to these biases, as well as (2) how susceptible the participants themselves were. And across different groups and a variety of psychological biases, people consistently indicated that other people were more prone to bias than they were.

And here’s the kicker: Even when researchers explicitly told participants that the vast majority of people fall prey to these biases, participants still indicated that those biases didn’t apply to them.

That’s right: I’m talking to all of you right now who read that previous information and thought: “Well, sure, those biases apply to a lot of people, but not me.”

Introspective Illusion

Let me take one more moment to illustrate how strong this favorable opinion of ourselves is: People tend to think others’ opinions are more often biased by self-interest than their own opinions are. Physicians tend to think other physicians are more biased by pharmaceutical gifts than they are themselves. And people on both sides of the political spectrum believe the opposite side is more biased by group ideology than their own side is!

Why does this occur? Largely, researchers point to what’s been called the ‘introspective illusion.’

In reality, a lot of our judgments and decisions are driven by nonconscious processes. We often hold preferences or make choices for reasons that aren’t apparent to us. For example, when shoppers in a mall were asked which of four ties they liked the most, they consistently pointed to one of the ties and offered a reason for their preference – even though all four ties were identical.

In other words, we tend to believe that we know the reasons underlying our decisions, even when those reasons are actually nonconscious (e.g., you pointed to that specific tie because it was the closest to your pointing hand). Thus, we don’t think we’re susceptible to bias, because we hold the illusion that we possess full introspective awareness.

However, when we evaluate other people, we tend to believe they don’t have the same level of introspective awareness. Unlike with our own minds, we don’t have access to all of their thoughts. So, we naturally assume they’re not as introspective as we are, which makes us believe they’re more susceptible to biasing factors.

Removing the Blinders

Although telling people that they have a bias blind spot can have some success in reducing it, other research suggests a better solution: Specifically, we need to help ourselves realize that none of us has full introspective awareness.

That is, pointing out various nonconscious, influencing factors that researchers have studied – subliminal primes, the mental accessibility of content, social mimicry – can help people recognize that they, too, are susceptible to the same bias as everyone else.

Even still, the bias blind spot is one strong opinion that might require more extensive persuading.

Interested in what some of these nonconscious factors biasing your decision might be? Check out this short article I wrote about a fascinating example of subliminal priming.

References

Pronin, E. (2007). Perception and misperception of bias in human judgment. Trends in Cognitive Sciences, 11(1), 37-43.

Pronin, E., & Kugler, M. B. (2007). Valuing thoughts, ignoring behavior: The introspection illusion as a source of the bias blind spot. Journal of Experimental Social Psychology, 43(4), 565-578.

Pronin, E., Lin, D. Y., & Ross, L. (2002). The bias blind spot: Perceptions of bias in self versus others. Personality and Social Psychology Bulletin, 28(3), 369-381.

More from Jake Teeny Ph.D.