

People Share Fake News Even When They Don't Believe It

Why care about truthfulness when you can have likes?

Key points

  • Experiments show that people share content on social media that they know is false.
  • People care about their political tribe and popularity more than accuracy.
  • Those who share misinformation aren't gullible. They're social actors.
Source: Image by Gerd Altmann from Pixabay

Recently we’ve talked about how heightened concerns about misinformation and radicalization online may be misguided and alarmist. First, there is no clear and consistent consensus on what “misinformation” actually is; people define it in varying ways. We also tend to overestimate others’ gullibility to fake news relative to our own, a pattern known as the third-person effect. And social media algorithms tend to steer people away from fringe or extremist content rather than toward it.

But still, people are very concerned about how much misinformation exists online, and much of that concern rests on the idea that masses of people are foolish, ignorant, or stupid, and will believe whatever they are told. Part of why I think this view is unwise is that research evidence points to a very different psychological mechanism: people intentionally share misinformation with others while knowing that it is false.

Educational Videos Do Not Reduce Sharing of Misinformation

In a field experiment led by Alexander Bor, researchers tested whether educational videos could teach Twitter users to detect fake news and thereby make them less susceptible to it. They recruited a representative sample of 1,600 people through YouGov and observed their sharing habits on Twitter. Half of the participants watched videos such as “How to Spot Fake News” by FactCheck.org or “Four Ways to Tell If Something Is True Online” by MediaSmarts; the other half served as a control group and watched no video. All participants then completed a fake-news quiz in which they were shown both real and false news headlines and asked to identify which were which. The first result was clear and consistent with other studies: participants in the educational-video condition scored higher on the quiz than the control group, meaning their discernment skills had improved.

Time to celebrate, right? Well, not so fast. The educational videos succeeded in guiding people toward true beliefs, but they did not reduce people’s tendency to share misinformation on Twitter. Those who saw the videos were just as likely to share misinformation as those in the control group. This strongly suggests an important phenomenon: People share content on social media that they know is false.

The authors speculated about why this might be happening: people have motivations other than an allegiance to the truth. Individuals share (retweet) misinformation because they think it will help their political tribe or help them score popularity points in their social network.

People Care More About Popularity Than Accuracy

This idea was supported by another set of experiments from a separate research team, which found that when sharing content on social media, people will sacrifice accuracy in exchange for social standing. This was true even for content that was not political. For instance, participants were asked whether they were willing to share a story touting the debunked conspiracy theory that the TWA 800 plane crash was caused by a missile launch followed by a government cover-up (not a salient issue that divides liberals and conservatives). While most participants said they would never share such fake news, over 40% were willing to do so when given an incentive. In part, they did it for the “likes,” or to be socially recognized; they were also more likely to share misinformation when the researchers offered raffle tickets tied to the number of likes they received. Participants seemed to realize that sharing conspiracy theories would earn more engagement and that this would pay off for them.

The authors concluded, “In politically polarized settings, people may intentionally share conspiracy theories or other types of misinformation (e.g., rumors or fake news) to advance other motives (e.g., to signal their identity) … our findings suggest that social motives (e.g., signaling social identity or eliciting social engagement) play a critical role in accounting for people’s decisions to broadcast conspiracy theories.”

Misinformation Itself Isn’t a Huge Problem

These results challenge the widely held notion that people share misinformation on social media because they are gullible. On the contrary, most people who share falsehoods do so because they believe it provides a social benefit: gaining attention, status, popularity, and respect in their community, or helping their political tribe defeat another. These motivations have little to do with naivete or poor reasoning skills. This is why I don’t share the view that misinformation on the internet is an enormous problem that threatens democracy. At best, I think it’s a nuisance on par with the misinformation found in supermarket tabloids, which have been around for more than a century.

References

Bor, A., Osmundsen, M., Rasmussen, S. H. R., Bechmann, A., & Petersen, M. (2020, September 24). "Fact-checking" videos reduce belief in but not the sharing of "fake news" on Twitter. https://doi.org/10.31234/osf.io/a7huq

Ren, Z. B., Dimant, E., & Schweitzer, M. E. (2023). Beyond belief: How social engagement motives influence the spread of conspiracy theories. Journal of Experimental Social Psychology, 104.
