Eric Horowitz

Why Are People Bad at Evaluating Risks?

Using evidence or data to communicate risk can be a fool’s errand.

Using evidence or data to communicate risk to the American public can be a fool’s errand. The most publicized “la, la, la, I can’t hear you!” moments involve people ignoring dangers that threaten ideology or political beliefs. Others may choose to ignore risks because immediate short-term pleasures are too alluring.

Yet biased risk assessment can also stem from factors that are less self-inflicted. A trio of new studies illustrates how seemingly benign ways of thinking can impair risk assessment even among open-minded people eager to learn the truth.

One thing that can lead people astray is "base rates." A lot has been written about the propensity to ignore base rates (for example, people tend to ignore a disease's low incidence rate and thus underestimate the chance that a positive test is a false positive), but new research suggests that when base rates are relatively low, the opposite can occur. That is, people can "double-count" base rates.
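If you want to see that classic error in numbers, here is a quick sketch; the 1% incidence rate and the test accuracies are invented for illustration, not taken from any particular study.

```python
# Invented numbers illustrating classic base-rate neglect.
incidence = 0.01             # assume 1% of people actually have the disease
sensitivity = 0.90           # assume P(positive test | disease) = 90%
false_positive_rate = 0.09   # assume P(positive test | no disease) = 9%

# Law of total probability: overall chance of a positive test.
p_positive = incidence * sensitivity + (1 - incidence) * false_positive_rate

# Bayes' rule: chance a positive test reflects actual disease.
p_disease_given_positive = incidence * sensitivity / p_positive

print(f"P(disease | positive test)        = {p_disease_given_positive:.1%}")
print(f"P(false positive | positive test) = {1 - p_disease_given_positive:.1%}")
# With a 1% base rate, roughly nine out of ten positives are false positives,
# far more than intuition suggests when the low incidence rate is ignored.
```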

In a series of experiments, participants were told that weather forecasts called for a 30% chance of rain in both Seattle and Phoenix. When participants were later asked how likely they were to bring an umbrella on a trip to either city, they revealed that they believed it was more likely to rain in Seattle. Even though the weather forecasts took Seattle's higher "base rate" of rain into account, people tended to count that rate again, deciding that a 30% forecast for Seattle was more likely to result in rain than a 30% forecast for Phoenix.

There are a number of ways this could screw up how one makes sense of risk. For example, if a doctor tells a non-smoking, slightly overweight 50-year-old he has an 8% chance of having a heart attack, the patient may think, "Well, I don't smoke, so it's probably less for me." The only problem is that the 8% already takes that into account. Similarly, when a resident of Atlanta hears a climate scientist say that in the next 30 years there's a 2% chance a climate catastrophe will strike their city, they may think, "Well, we're not near any major bodies of water so the chances are probably less than that." But once again the 2% already takes that into account. Double-counting base rates can lead well-intentioned analysis to overestimate or underestimate the chance of danger.
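To see the double-counting error in numbers, here is a minimal sketch. Only the 8% figure comes from the example above; the population-wide rate and the non-smoker adjustment are assumptions invented for illustration.

```python
# Hypothetical figures chosen to make the double-counting error concrete.
population_risk = 0.10        # assumed heart-attack risk for the population overall
nonsmoker_multiplier = 0.80   # assumed relative risk for non-smokers

# The doctor's 8% already folds the patient's profile into the estimate:
risk_given_profile = population_risk * nonsmoker_multiplier   # = 0.08

# The double-count: the patient applies the non-smoker discount a second time.
double_counted = risk_given_profile * nonsmoker_multiplier    # = 0.064

print(f"Doctor's estimate (already adjusted): {risk_given_profile:.1%}")
print(f"Patient's 'corrected' estimate:       {double_counted:.1%}")
# The patient walks away believing 6.4%, even though 8% was the right number all along.
```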

Another way people can miscalculate risk is by failing to fully grasp measurements in unfamiliar units. Daily life demands that we evaluate measurements in units like calories, metric tons, or interest rate points, but many people don't have a great sense of what those units mean. For example, the practical consequences of going from 0 to 500 calories are not the same as the consequences of going from 500 to 1,000 calories, even though the numeric increase is identical. In different units, quantities relate to real-world consequences in different, non-linear ways. Thus you can understand a 2% increase in metric tons yet fail to grasp the consequences of a 2% increase in interest rates.
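For readers who want the arithmetic, here is a rough sketch. The loan size, term, and rates are invented numbers, and a "2% increase in interest rates" is read here as a two-percentage-point jump.

```python
# Invented numbers: a $300,000, 30-year fixed-rate loan, at rates of 4% and 6%.

def monthly_payment(principal, annual_rate, years=30):
    """Standard fixed-rate amortization formula for a monthly payment."""
    r = annual_rate / 12              # monthly interest rate
    n = years * 12                    # total number of monthly payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

low = monthly_payment(300_000, 0.04)
high = monthly_payment(300_000, 0.06)
print(f"Payment at 4%: ${low:,.0f}/month; at 6%: ${high:,.0f}/month "
      f"({high / low - 1:.0%} more)")        # about 26% more, not 2% more

# By contrast, a 2% increase in metric tons really is just 2% more mass:
print(f"102 metric tons vs. 100 metric tons: {102 / 100 - 1:.0%} more")
```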

In an ideal world people would recognize when they're dealing with unfamiliar units and put less weight on the magnitude of the measurement. As an extreme example, if Bob had to estimate the value of 14 Zorgs and Joe had to estimate the value of 20 Zorgs, the difference in magnitudes should not lead to a difference in their estimates. Having never heard of a Zorg until now, each of them ought to be just as likely to believe their quantity of Zorgs is worth $5.

The good news is that a new study suggests people are wise enough to take their unfamiliarity with certain units into account. The bad news is that this may only happen when the units are made salient through something like a larger or darker font. When attention isn't drawn to the units, people are less likely to be cognizant of their lack of knowledge, and the result is that they are more sensitive to the magnitude of measurements in unfamiliar units. In multiple studies, when two groups of participants were asked to bid on different magnitudes of an unknown currency, the group bidding on the higher magnitude bid significantly more dollars, but only when the units were not made salient. They did this even though they had no idea how much the currency was worth.

Crucially, the researchers found this "deliberational blindness" can occur even for familiar but not well-understood units like horsepower. That means even a shaky, partial understanding of units could distort the assessment of risks involving health or financial decisions.

Finally, it’s not just numbers that cause problems for people. That pesky English language can also hamper assessments of risk. Specifically, new research shows that terms like "improbable" and "unlikely" are so ambiguous it may not be worth using them at all.

The paper, which was authored by a psychologist from the University of Oslo, first recounts how a solid body of research has shown that when people are asked to define an "unlikely" outcome, there is a consensus around outcomes that occur 10%-30% of the time. The researchers then describe a series of five studies that uncovered two ways this consensus falls apart. First, people treat outcomes with higher measurements as more unlikely than outcomes with lower measurements. For example, when participants read that a certain climate scientist thinks oceans are likely to rise 50-90 cm, they tended to call a 40 cm rise "probable" but a 100 cm rise "improbable."

Second, when people are given a distribution of outcomes, they often reveal their conception of an "unlikely" event to be something with close to a 0% chance of occurring. For example, in one experiment participants were told about the performance of a computer battery used by 100 students. Forty of the batteries lasted 2.5 hours, 25 batteries lasted 3 hours, 20 batteries lasted 2 hours, 10 batteries lasted 1.5 hours, five batteries lasted 3.5 hours, and zero batteries lasted 1 hour or 4 hours. When asked to complete the statement, "It is improbable/unlikely that the battery will last ______ hours," about 60% of participants answered 4 hours, despite the fact that none of the 100 batteries lasted that long. The same kind of result occurred in a variety of similar experiments, suggesting that things deemed improbable or unlikely are frequently interpreted as having close to a 0% chance of occurring.
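Tallied as percentages, the battery numbers above look like this; the counts are the ones reported in the study, and the tally is just arithmetic.

```python
# Battery lifetimes for the 100 students, as reported in the study above.
counts = {1.0: 0, 1.5: 10, 2.0: 20, 2.5: 40, 3.0: 25, 3.5: 5, 4.0: 0}

total = sum(counts.values())   # 100 batteries in all
for hours in sorted(counts):
    print(f"{hours:.1f} hours: {counts[hours] / total:.0%} of batteries")

# On the usual 10%-30% reading of "unlikely," the 1.5-hour (10%) or 2-hour (20%)
# outcomes would be the natural answers; yet roughly 60% of participants completed
# the sentence with 4 hours, an outcome that occurred 0% of the time.
```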

So what do people actually picture when they hear an outcome described as "unlikely"? Does their perception change when they're shown a whole distribution of outcomes? And do they err by imagining only magnitudes higher than the prediction?

The bottom line is that communicating risk is hard, particularly when it involves multiple numbers and probabilities. The good news is that being meticulously clear about what you mean can mitigate the cognitive shortcomings outlined in the studies above. Don’t use vague words. Emphasize how base rates are used. Make units salient, and use analogies so that people have a familiar frame of reference for changes in unfamiliar units. Some people will always find ways to ignore risks, but for those who want to understand them, it’s worth making that understanding as easy as possible.


Follow me on Twitter.

About the Author
Eric Horowitz

Eric Horowitz is a social science writer and education researcher.
