
7 Reasons Why We Fall for Fake News

Critically thinking about fake news

The concept of fake news is nothing new. It refers to a story that isn't true, or isn't entirely true, and it can take the form of accidental misinformation or deliberate disinformation. What makes it more problematic now than ever is its sheer abundance and the fact that people keep falling for it. In a recent piece, "10 Ways to Spot Fake News," my purpose was to provide tips for identifying it; perhaps just as important, however, is understanding why we fall for it.

1. Confirmation bias.

Confirmation bias refers to our tendency to favor information that confirms our existing beliefs. If we don't account for this bias in our thinking, we are more likely to fall for fake news when we agree with what is being said. It works the other way around as well: confirmation bias yields the opposite effect, heightened skepticism, toward fake news stories we dislike. Yes, it's good that confirmation bias can, in some contexts, help us dispel fake news; but, at the end of the day, relying on this bias reflects a lack of critical thinking.

2. Lack of credibility evaluation.

We engage with the news in order to inform ourselves, generally because we weren't there to witness events unfold first-hand. As a result, we trust our news sources to give us information that is, in fact, true; in doing so, we place trust in the source's credibility. But we cannot do so blindly. We must first evaluate that credibility.

Such evaluation involves digging deeper into the article: assessing the sources of its claims, looking for evidence (rather than opinion, anecdote, or appeals to common belief), searching for replication across other news outlets, and assessing the credentials of the author, publisher, and/or website. Though I list various steps for evaluating a news story, I must concede that this is a simplified version of what is required. Credibility evaluation is quite an abstract skill and, as a result, people may lack both the ability and the inclination to apply such higher-order thinking.

As I mention throughout this blog, time and time again, one should apply critical thinking only to issues they care about or that are important to them (e.g., given the negative effects of decision fatigue [Baumeister, 2003] and cognitive load [Sweller, 2010]). If U.S. politics or children's healthcare isn't important to an individual, they are unlikely to dedicate time and effort to evaluating stories about it, which makes them more susceptible to fake news on those topics.

3. Attention and impatience.

On the other hand, let's assume that the topic in question is important to you and that you do have the skill of evaluating credibility. You are still susceptible to modern trends in information processing, not to mention the other psychological factors presented in this piece. In today's world, it can be argued that we have a surplus of information (Dwyer, 2017). We don't read everything in our social media newsfeed. We scroll past articles that are unimportant or uninteresting to us; we don't pay attention to them. Sometimes, we barely read the headlines; and if we do read a headline, that may be all we read.

We want our information fast because we have been primed to get it fast. Now, I'm not saying that fast, efficient access to information is a bad thing; this isn't declinism. But I recall a time when, if you wanted information on a current event, you had to hope it was covered in the newspaper, on the radio, or on the evening television news. Nowadays, we can type a few letters into our phones and what we want, from a wide array of sources, is there. But along with it comes other information, from unfamiliar sources, that we didn't necessarily seek out.

Moreover, we need to ask ourselves: Are we really attending to what is being said, or are we just looking for a quick answer? How deeply are we evaluating? Are we patient enough to engage with it properly? Are we even evaluating at all, or are we just skimming before moving on to the next report? This brings me to a concept that probably deserves its own book, let alone a blog post: Is knowledge about having an abundance of information, or about knowing what to do with it?

But, let’s not go off on a tangent! So, we keep scrolling through our newsfeed. If writers are concerned with getting you to read their article, then they’re going to dress it up in a way that makes it interesting. Thus, they grab your attention by using sensationalist language. Flip flops cause cancer was actually a headline from almost a decade ago. Of course, a thorough inspection of the article led to the understanding that any footwear that allows for exposure of skin on the foot to the sun, without proper protection, is correlated with increased chances of developing skin cancer—flip flops just happen to be the footwear that exposes the most skin.

Using the same logic, one could report, sensationally, "Baldness causes cancer!" In reality, a more truthful headline would have read: "Protecting yourself from the sun is important"; but that doesn't get clicks or sell papers. Notably, you don't have to believe the claim for this strategy to work. I read the article myself, and even though I did so only to see how one could jump to such a conclusion, the news outlet still won, because it got my click.

4. We are cognitively lazy.

As discussed throughout this blog, humans are cognitively lazy (Kahneman, 2011). Our brains have evolved to conserve energy for "more important" tasks, and so they don't much like expending effort when an intuitive decision that is good enough can be made (e.g., satisficing [Simon, 1957]). Is our belief in a random news story really that important in our day-to-day lives? It could be; but more often it probably isn't, and so we fail to engage in evaluation and reflective judgment. Instead, we use a simplified mode of information processing, yielding a conclusion that isn't necessarily accurate, such as choosing to believe the fake news report.

5. Our emotions are targeted.

One of the largest barriers to critical thinking is emotion because, put simply, it makes our thinking irrational. When people think with their emotions, they rely on gut-level, intuitive reasoning, fueled by how they feel and by past experiences associated with those feelings, which is the opposite of reflective, critical thought. Fake news, like propaganda, can evoke and breed emotions like fear and anger in the reader or listener. If you're emotional, you're not thinking rationally, and you are more susceptible to falling for fake news.

6. Reiteration: the illusory truth effect.

The illusory truth effect refers to the phenomenon in which the more often we have been exposed to a piece of information, the more likely we are to believe it. Earlier in this post, I mentioned that flip-flops had been reported to cause cancer. If you had never encountered that claim before, its mention here is now the second time you've seen it. The more you read about flip-flops and cancer, the stronger the link between the two becomes in your head, even though there is no causal relationship between them. However, debunking isn't necessarily a complete solution.

Now, I'll add the caveat that because you were introduced to this claim alongside its debunking, you're probably less likely to believe in the relationship. But imagine being presented with a piece of information repeatedly, without any objection, and only then being given compelling evidence that it is actually incorrect. Even if you accept the refuting evidence, the misinformation is still remembered and can implicitly affect your thinking in related contexts.

We are particularly susceptible to fake news, in this context, given the echo chambers we help create for ourselves on social media. As I outlined in How to Change People's Minds: The Art of Debunking, Cook and Lewandowsky's (2011) concise handbook is a quick and useful read on methods of debunking; its foundational point is that once people process information, factual or fake, it is quite difficult to remove that information's influence.

7. Social pressure.

The final reason why people fall for fake news is a big one, both in terms of its impact and the various subtopics it covers. One of the best-selling books of all time, How to Win Friends and Influence People (Carnegie, 1936), was perhaps so successful because people recognize the importance of social influence and, likewise, social pressure. When you think about it, the mechanisms of such pressure are quite simple on social media: if you say something that someone doesn't like, they might unfriend you; if it's something they really don't like, they might report you; and the more friends, followers, likes, views, or clicks you have, the more influence you and your (signaled) values carry.

I would argue that though these mechanisms of social pressure exist in real life, they aren't as straightforward there as they are on social media. With respect to the impact of social pressure on your decision-making about fake news, you might say, "Yeah, but I think for myself; I don't let other people affect my decisions." Well, that's not entirely true. Social pressure plays a much larger role than you think. Again, think about your echo chambers. How many people or organizations present you with information with which you disagree? Maybe you're like me and enjoy a good debate; but for the most part, you may block, hide, or even unfriend or unfollow individuals with different views.

Indeed, friendships in real life are also largely based on similarity and common ground. We are molded by the people around us. For example, research indicates that over the past few decades, the ratio of American psychology professors and lecturers voting for the liberal rather than the conservative presidential candidate has grown from 4:1 to 14:1 (as of 2012), with further research suggesting that this gap is widening (Duarte et al., 2015). As you will know from Which Side Are You On?, I'm nonpartisan when it comes to politics, which may explain why I find these results worrisome.

Remember, politics and social perspectives are not objectively right or wrong; they're based on beliefs about how things should be done. With that in mind, if your education or your job takes place in an environment that is biased to such an extent, there will surely exist some level of social pressure consistent with those views. The problem here may be that, despite all the hubbub about diversity of perspective, the mechanisms associated with social pressure may actually enhance polarized thinking, us versus them, with everyone thinking they're right. In a polarized arena, you're part of either the majority or the minority; if the latter, you may be pressured, implicitly or explicitly, by the majority into changing your position.

But just because the majority believes something does not make it true. "Well, everyone has their own truth." No, that's not correct either; that's subjectivity. When we are tasked with separating fact from fake news, only objectivity can yield an appropriate answer. So, be aware of the social climate, the political climate, and the majority view, because the pressures associated with these are likely to affect the information you engage with, as well as your belief in its truth or fakeness.

References

Baumeister, R. (2003). The psychology of irrationality: Why people make foolish, self-defeating choices. The Psychology of Economic Decisions, 1, 3-16.

Carnegie, D. (1936). How to win friends and influence people. New York: Pocket Books.

Cook, J. & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland. Retrieved from http://www.skepticalscience.com/docs/Debunking_Handbook.pdf

Duarte, J. L., Crawford, J. T., Stern, C., Haidt, J., Jussim, L., & Tetlock, P. E. (2015). Political diversity will improve social psychological science. Behavioral and Brain Sciences, 38.

Dwyer, C. P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press.

Kahneman, D. (2011). Thinking, fast and slow. Great Britain: Penguin.

Simon, H. A. (1957). Models of man. New York: Wiley.

Sweller, J. (2010). Cognitive load theory: Recent theoretical advances. In J. L. Plass, R. Moreno, & R. Brünken (Eds.), Cognitive load theory (pp. 29-47). New York: Cambridge University Press.
