
Misinformation on the Mind

Who’s to blame when we get the facts wrong?

Death panels, socialism, the end of Medicare. These are some of the pieces of misinformation that have pervaded public conversation and private thinking about the Affordable Care Act. And health care isn’t the only area where U.S. politics churn up a lot of misperceptions: Barack Obama is not a U.S. citizen, John Kerry lied to win military awards in Vietnam, undocumented immigrants get special government perks, vaccines contain autism-causing mercury, Saddam Hussein orchestrated the 9/11/2001 terrorist attacks, the Bush administration orchestrated the 9/11/2001 terrorist attacks, and an alien spacecraft crash-landed in Roswell, NM, in 1947. None of this is true, but many people seem to believe some or all of it. Who’s to blame when we get the facts wrong?

Agnotology, the study of ignorance, is a hot topic in political science journals, in part because there seems to be a lot of misinformation floating around in politics. Researchers have been trying to figure out why people believe rumors and untruths, and why they hold on to those misbeliefs even after being made aware they are wrong.

The recent interest in agnotology among political scientists stems from a 2000 study by Jim Kuklinski and colleagues at the University of Illinois, who found that people know almost nothing about social welfare policy—and what they think they know (that welfare disproportionately benefits racial and ethnic minorities, that it consumes a substantial portion of the federal government’s budget, and that welfare beneficiaries can receive benefits indefinitely) tends to be wrong. Kuklinski and his coauthors also found that people hold negative opinions of social welfare, but those opinions are premised on this misguided understanding of the policy. To see whether opinions would change, the researchers provided study participants with accurate information about federal welfare policy. To their surprise, opinions didn’t change and neither did people’s misbeliefs. Even after being told that they were misinformed, participants continued to hold onto their misperceptions about the policy!

Other researchers have identified patterns of widespread misinformation about Saddam Hussein, weapons of mass destruction in Iraq, tax policy, stem cell research, and health care.

Some academics have been quick to judge that citizens should know better and to wonder how people could continue to believe rumors and untruths even after they are told they are wrong. But who is really to blame for misinformation? Is the human mind capable of differentiating truth from fiction?

Philosophers have long defined “knowledge” as “justified, true belief.” That is, knowledge is a belief, substantiated by some logical inference or empirical justification, that happens to be true. The only thing that differentiates truth from fiction, then, is whether a “justified belief” is in fact true. Unfortunately, that means that as long as the psychological criteria for knowledge are met (a belief that is justified), the factor that makes something truth rather than fiction is external to the mind. Truth, if it is even knowable, is determined by social consensus, not psychological deliberation.

Consequently, it seems hard to blame people for holding misbeliefs when society, politicians, and the media readily dispense false and misleading information. PolitiFact, a nonpartisan website that judges whether statements made by politicians are true or false, recently published a summary of misinformation regarding the Affordable Care Act. Given the abundance of false information, it should come as no surprise that many people are misinformed about that legislation, let alone other aspects of politics.

But there is a lingering question about why people continue to misbelieve even when they are shown their beliefs are false. One possibility is that our political beliefs (whether true or false) do not meaningfully underlie our opinions. Instead, because we anticipate having to explain our subconsciously formed opinions to others, we develop beliefs that rationalize those opinions. Thus, believing Saddam Hussein had weapons of mass destruction doesn’t explain why someone supported the 2003 invasion of Iraq; instead, that person’s support for the invasion explains why they were willing to believe he had weapons of mass destruction. It is a somewhat counterintuitive possibility, but one fully consistent with the theory of motivated reasoning.

Brendan Nyhan (Dartmouth) and Jason Reifler (Georgia State) offer another intriguing and related possibility. They argue that people resist information that contradicts their beliefs (whether true or false) because that information threatens their worldview or “self-concept.” As a result, corrective information, even when shown to be true, is threatening, and the mind must cope with that threat before it can accept the new information. In their experiments, participants engage in “self-affirmation” exercises to bolster their self-concept so that they are less psychologically threatened by new, belief-challenging information about a variety of political issues (global climate change, the effect of the Iraq troop surge, and the U.S. economy). Nyhan and Reifler find that this self-affirmation makes participants much more open to the corrective information and reduces their misperceptions.

The takeaway from Nyhan and Reifler’s study is to be a discerning critic of information, both new information you encounter and the information stored in your own long-term memory. Just because you believe something doesn’t make it true. And, just because a politician says it doesn’t make it true either.
