
Anti-Vax Bias Is Partly a Failure of Science Communication

When explaining science, it's critical to differentiate possible from probable.

Key points

  • Probability is the likelihood of something possible occurring.
  • Although it was improbable given past efforts, manufacturing and rolling out an effective and safe COVID-19 vaccine was always possible.
  • Biomedical experts need to appreciate the needs and expectations of their audiences when communicating science.

There is a big difference between impossible and improbable, and between truth and lies. These differences matter even more when we're talking about science and critical issues of public health, such as during a pandemic. We are hugely dependent on science, and it is critical to almost every aspect of our lives, but effective science communication is hampered not just by what is communicated but by how.

Improbable does not mean impossible

A big part of the problem we have right now with anti-vaxxers can be laid at the feet of science communication, or, more accurately, at the feet of a failure of effective science communication. Many folks don't understand the difference between probable and possible, improbable and impossible, likely and unlikely. Scientists are trained to report their work in the starkest possible terms: report what you find, then describe those findings in terms of how likely they are to be reproduced. This is the probability part, where the statistics and statements of certainty come in.

As scientists, we're hardwired to make sure we never overstate what we find and to outline all the possible limitations. We're pretty negative, if you get right down to it. That scrutiny and stringency help science advance, but they don't help advance the public's understanding of science. Scientists have to make these distinctions clear for folks rather than expecting them to work out how to make those distinctions themselves.

At the beginning of the COVID-19 pandemic, there were all kinds of stories about how difficult it was going to be to come up with an effective vaccine on the short timeline needed to make a difference globally. Many scientific experts, and many science journalists who had interviewed them, appeared on television, on podcasts, on YouTube, on the radio, and in written media describing how extremely challenging it would be (it was often stated that it was basically going to be impossible) to develop an effective vaccine and roll it out safely.

What those folks were trying to get across was that the probability of achieving something that had not been done before was pretty low. It was very unlikely, but, critically, it wasn't impossible. It wasn't that it could not happen, just that it wasn't very likely to happen. But many interpreted this as, "Here's a scientist telling me something. I'm going to believe them. It's impossible."
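
To make that distinction concrete, here is a minimal sketch in Python using purely hypothetical numbers (the success probability and the number of vaccine programs below are illustrative assumptions, not estimates from any study). It simply shows that a small probability is not zero, and that many independent long-shot attempts can still add up to a good chance that at least one of them succeeds.

    # Hypothetical illustration: improbable is not impossible.
    # The numbers are made up for this example; they are not real estimates.
    p_single = 0.02    # assumed chance that any one vaccine program succeeds
    n_programs = 100   # assumed number of independent programs worldwide

    # The chance that at least one program succeeds is one minus the chance
    # that every single program fails.
    p_at_least_one = 1 - (1 - p_single) ** n_programs

    print(f"Chance a single program succeeds: {p_single:.0%}")                    # 2%
    print(f"Chance at least one of {n_programs} succeeds: {p_at_least_one:.0%}")  # about 87%

Only a probability of exactly zero means impossible; anything above zero is merely improbable, and improbable outcomes become far less surprising when enough attempts are made.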

But then, relatively quickly, it actually happened: numerous effective vaccines were developed, tested, and rolled out. How can those same folks then accept as truth information from someone who told them that the very thing they're now seeing as real couldn't actually be? For many, there's a big disconnect there, and it forces a judgment about how truthful the information is considered to be.

Judging truth depends on expectations

In their paper "Judging Truth," Nadia M. Brashier and Elizabeth J. Marsh suggest that when trying to figure out what to believe, people have an initial bias to accept information, because many claims are true. This is balanced against the feelings we have about information, which depend on the ease (or difficulty) of its presentation or explanation. Further, people evaluate whether the information they are given matches expectations based on what they have been told before. Brashier and Marsh wrote that we are in a post-truth world "where falsehoods travel further and faster than the truth." In such an environment, it's even more important that initial information be accurate and easily accessible. This is again where the distinction between probability and possibility is critical.

This distinction is reminiscent of stories you hear about someone who is in a horrible accident and whose neurosurgeon tells them they'll never walk again. The neurosurgeon is trying to temper their expectations, to point out that recovery is very unlikely, not that it's impossible. The intention is to save those folks from getting their hopes too high and then being disappointed. Personally, I think this is the wrong message to give and needs some revising, but I'm just using it as an example. Then, when that person actually does regain some function (which often happens), it's deemed a miracle: something that was said to be impossible, yet here it is.

It was never impossible, just improbable. It was unlikely, but it could happen. We accept that they can walk because they're walking in front of us. We don't deny what is happening because it doesn't match our expectations; we don't refuse to accept the outcome by saying it was so unlikely that it couldn't have happened and therefore must not have occurred. Yet this is exactly what happens with conspiracy-laden topics like vaccination.

Science communication has to be less negative and more accessible

The point of all this is that, moving forward, those who communicate science have to get better at explaining the difference between how likely something is to occur and whether it can never occur. The truth is that very few things are actually impossible, but a lot of things are improbable. Effectively communicating complex ideas to non-specialists requires an appreciation of others' expectations. There's a critical need to understand the audience and present information in terms that are accessible and comfortable for that audience.

© E. Paul Zehr (2021).
