Verified by Psychology Today

How to Discern Fake News from Real News

Choosing which “facts” to believe.

Last summer I made my annual trip to Moscow to teach. When I arrived, the first thing my contact asked me was, “Is Hillary dying?”

I replied, “What are you reading?” She told me she receives U.S. news on the Internet. I wanted to blame her perception on the Russian media, but I saw similar reports in the U.S.

Source: scandinaviastock/AdobeStock

Determining what is true is difficult. People will tell you it is simple if you just accept the facts. How do you know what facts to accept? Before you attempt to test your opinions externally with evidence, you need to look internally at the fears and assumptions that could lead you to believe unsupported opinions.

How your brain defines truth

Even if you think you are an independent thinker, most of your thoughts stem from the groups you identify with based on your upbringing, communities, religion, and even science. Shared meaning holds people together, giving you a sense of belonging, connection, and safety.

When you see evidence in the news that supports what you believe, you feel relieved and pleased.

If the news runs counter to your beliefs, you feel anxious or angry. You're more likely to call it “fake news” without looking to see if the evidence is strong or not. If someone says you are wrong, your brain sees this as a personal attack, making you even angrier. You then label their argument absurd, ignorant, or biased.

A USC study found that challenges to political beliefs, like challenges to religious beliefs, activate the same brain areas that fire when you see a snake or a fast-approaching car.1 You feel threatened and unable to hear rational evidence against your beliefs. You lose cognitive flexibility and become rigidly defensive.

News articles rarely paint the entire picture; the news you get comes in fragments shaped by the leanings of the outlets that deliver it. If you get your news from social media, like 62 percent of U.S. adults, you get information from like-minded people, so it isn’t balanced. This increases conviction and misinterpretation, especially when you read news that runs counter to your beliefs.

Convenient truth

Even when you recognize your biases, Nobel Prize winner Daniel Kahneman says you won’t stop to analyze what you read because it is too hard.2 It’s not only that you have too much to do:

  1. There is too much information to try to digest.
  2. The information comes in bits, so you fill in the gaps with generalities based on your beliefs.
  3. Short-term memory is limited, so you grasp only what matches your assumptions.

Reflective techniques

To counteract your brain’s tendency to construct “truth” out of comfort, convenience, and confusion, you can access your Reflective Intelligence to try to sort through your filters. This isn’t easy, but if you are courageous enough to accept a different reality, you might be able to see what else could be true.

NOTE: This doesn’t mean you should accept everything you see, especially bigotry and hatred. Reflection opens you to at least understand other people’s views and fears.

  1. After reading an article, write down your thoughts without censoring them.
  2. Read your words as if someone else wrote them. Circle the emotionally charged words and the judgments of people’s character rather than their actions.
  3. Ask, “What is causing me to think this way? What beliefs are forming these thoughts? What assumptions am I holding that are keeping me from opening my mind?” Consider these points:
  • Until you speak them, you are rarely aware of the assumptions behind your thoughts, just as you ride a bike or drive a car without thinking. Emotions can be a window to your values and beliefs.
  • Don’t stop your emotions and reactions, especially your impulse to agree or condemn. Be curious about why they are occurring.
  • Catch yourself saying, “This is absolutely wrong.” The word absolutely marks the limits of your thinking. You learned the belief out of survival; it doesn’t necessarily define what is true.
  • Hear the story you are telling: “They are trying to betray me” or “He’s a selfish idiot.” Ask yourself what you are afraid will happen if you believe something else. You can also ask why others react positively or negatively. The more you can suspend your past knowledge and withhold judgment, the more clearly you can see what else could be driving the actions you disagree with.
  • Practice interpreting with a beginner’s mind. Say to yourself, “If I had never seen this situation, or knew this person before, what might I perceive?” If you already know why people do what they do, there is nothing new to see. Are you willing to look from a different perspective?
  • Look for the values you fear are at stake, such as equality, fairness, loyalty, or prosperity. Other people’s actions might be supporting their values which may differ from yours. Rob Willer defines this difference in his TED talk, How to Have Better Political Conversations.

After questioning your beliefs and biases, you can better weigh the evidence to assess if the “facts” were intended to inform or manipulate.

Weighing evidence

Carl Sagan created a Baloney Detection Kit to check whether arguments rest on science or on pseudoscience and superstition.3 The following items come from his list:

  1. Quantify – How many people are included or how many times has this happened? Don’t rely on what is projected for the future without verifying strong patterns.
  2. Is the hypothesis testable? Can it be proven in other circumstances with the same result?
  3. Reject attacks on people; focus only on the opposing argument, not on who is making it.
  4. Question arguments based on...
  • an “authority” – the opinion of one person with no backup data. Also, when people say, “studies show…” and don’t cite any studies, don’t accept the numbers as facts.
  • one party claiming that another’s actions will create large-scale consequences, without evidence. Be wary of the word “all” applied to an entire group.
  • statistics that draw conclusions from a small, inadequate sample.
  • unsupported cause and effect - "it happened after so it was caused by…"
  • no middle - considering only the two extremes in a range of possibilities.
  • short-term vs. long-term – “We can’t care about science when we have a budget deficit."
  • slippery slope - give an inch and they will take a mile.

The more you question what you read, the sturdier your grasp of the truth.

References

1 Kaplan, Jonas T., Gimbel, Sarah I., and Harris, Sam, “Neural correlates of maintaining one’s political beliefs in the face of counterevidence,” Scientific Reports 6, Article number: 39589 (Dec. 2016).

2 Kahneman, Daniel, Thinking, Fast and Slow, Farrar, Straus and Giroux, 2013.

3 Sagan, Carl, The Demon-Haunted World: Science as a Candle in the Dark, Ballantine Books, 1997.

Reynolds, Marcia, Outsmart Your Brain: How to Make Decisions Feel Easy, Covisioning, 2004.
