Deliberate Ignorance Can Both Increase and Reduce Uncertainty

What do you know when you choose not to know?

Key points

  • Deliberate ignorance is choosing not to know.
  • Deliberate ignorance can help manage uncertainty.
  • One must choose wisely when not to know something—or consult formal models.

Real knowledge is to know the extent of one's ignorance. – Confucius (riffing on Socrates)

Sometimes I don't open my fortune cookie. - Hoca Camide

Deliberate ignorance

Would you, now, like to know the time and manner of your death? Perhaps not; and for good reason. Some information is more of a curse than a blessing, even if, and particularly if, it is free of charge. People may not wish to learn about their demise ahead of time because of the emotional costs they can reasonably expect. Here, ignorance brings a sort of bliss. We can live well and go about our business until we expire without agonizing over the expiration itself, which would detract from our joie de vivre. Then again, had we the emotional fortitude, we might use this knowledge to live even more fully, making sure that our life projects are completed.

Source: Hilmi Abedillah/Shutterstock

Those who cling to the belief in free will presumably would (freely?!) choose to not know about their day zero because having that knowledge would reinforce determinism: "This is when it is going to happen and there is nothing you can do about it." Intuitions of agency, control, and free will rebel against such finality. These intuitions seek to disarm the premise that the future can—in principle—be foretold. If we live well from now on and take care of our lives, the prophecy can be disarmed, or so we imagine. Back in Attica, Presocratic dramatists made a living slaying this idea.

Then there is the psychology of regret. Obtaining potentially dangerous information is something we might regret. When we say "I wish I did not know," it is already too late. This is an anticipated regret of an act of commission (getting information). By contrast, when the information is eventually revealed as part of the unfolding progress of life, we might regret that we did not obtain it earlier. This is an anticipated regret of an inaction or omission (choosing ignorance). It is difficult to weigh one type of regret against the other, particularly when the regret of an action is stronger at first but weaker in the end than the regret of an inaction. Discounting future regrets, many people take the path of least resistance now, that is, they choose ignorance (Gigerenzer & Garcia-Retamero, 2017).

Choosing not to know is not a strategy we can handily dismiss as irrational. Once we delve into the contextual complexities of this issue, nuance quickly overshadows brute categorical conclusions (Hertwig & Engel, 2020). This is interesting because, in the classical rational-agent view, free information cannot have a negative value (a brute conclusion right there!); so we should take it. Besides allowing us to make better decisions, additional information is apt to reduce uncertainty, and uncertainty itself carries a negative value for most people (Krueger & Grüning, 2021).

Doubt and skepsis

So much for the standard view. It is easily shown that more information can increase uncertainty, though this may well be the exception rather than the rule. Any information that raises doubt, by definition, turns what used to seem fairly certain into something that seems less so. All sorts of skepticism and counterfactual thinking (considering alternatives or the opposite) are designed to break down (near) certainty; they introduce information that epistemically unsettles the recipient.

In Bayesian probability theory, what matters is the difference between a prior and a posterior belief (or the ratio of the latter over the former), where both are expressed as subjective probabilities. In general, more data will move the posterior probability toward one certainty (p = 0) or the other (p = 1), but they don't have to. An otherwise rational person who strongly (dis)believes in, say, telepathy will see her subjective belief move toward the middle when belief-contrary data are observed; and at p = .5, uncertainty is at its maximum. More data will also shrink the credible interval around the best Bayesian estimate (be it .5 or anything else), thereby providing a sort of meta-certainty even when primary uncertainty is at its maximum. "I am quite sure now," the Bayesian might say, "that I find myself in a state of uncertainty."
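The telepathy case can be sketched numerically. The numbers below are illustrative assumptions, not data: suppose a "hit" in a guessing task occurs with probability .5 if telepathy works but only .25 by chance, and a strong disbeliever starts with a prior of .01. Each belief-contrary hit then pushes her posterior toward the maximally uncertain middle, as binary entropy (a standard index of uncertainty) shows.

```python
import math

def update(prior, likelihood_h, likelihood_not_h):
    """One step of Bayes' rule for a binary hypothesis."""
    numerator = prior * likelihood_h
    return numerator / (numerator + (1 - prior) * likelihood_not_h)

def entropy(p):
    """Binary entropy in bits: 0 at certainty, maximal (1.0) at p = .5."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# A strong disbeliever in telepathy: prior P(telepathy) = .01.
# Hypothetical likelihoods: a hit has probability .5 under telepathy,
# .25 by chance.
p = 0.01
for trial in range(6):
    p = update(p, 0.5, 0.25)  # observe one belief-contrary hit
    print(f"after hit {trial + 1}: P = {p:.3f}, entropy = {entropy(p):.3f}")
```

With each hit, the posterior climbs toward .5 and entropy rises: the new data have made this rational observer less certain, not more.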

Consider the issue from the more familiar frequentist perspective. The conventional assumption, which often holds, is that more information amounts to an increase in sample size N, while nothing else (the mean, the variance) changes much. Under this regime, the precision with which we estimate the mean increases as the standard error of the mean decreases; s.e.m. = standard deviation / square root of N. As we sample more broadly, though, our standard deviation may increase, and the s.e.m. will remain constant if the standard deviation and the square root of N increase at the same rate. If, however, the standard deviation, that is, the variability of our data, increases faster than the square root of N, then the precision with which we estimate the mean decreases. Although this sets a high bar, the simple regularity shows that when information becomes diverse quickly enough, we end up less sure about what it tells us, on average.[1]
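A minimal sketch of the three regimes, with made-up numbers chosen only to make the arithmetic transparent: a constant standard deviation, a standard deviation that tracks the square root of N exactly, and one that grows linearly with N (and so outruns the square root).

```python
import math

def sem(sd, n):
    """Standard error of the mean: sd / sqrt(n)."""
    return sd / math.sqrt(n)

# Regime 1: sd stays at 10 as N grows -> precision improves.
print([round(sem(10.0, n), 2) for n in (25, 100, 400)])

# Regime 2: sd grows exactly with sqrt(N) -> precision is flat.
print([round(sem(2.0 * math.sqrt(n), n), 2) for n in (25, 100, 400)])

# Regime 3: sd grows linearly with N (faster than sqrt(N))
# -> the estimate of the mean gets *less* precise as data accumulate.
print([round(sem(0.4 * n, n), 2) for n in (25, 100, 400)])
```

Only the first regime delivers the conventional promise that more data mean more certainty; in the third, each additional wave of (ever more heterogeneous) data widens the error around the mean.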

Being open-minded about ignorance and uncertainty

If the standard deviation (the square root of the variance) of the data is one measure of uncertainty and the standard error (which depends on the square root of N) is another, higher-order, measure, then attentional zooming can make one or the other type more salient. Someone contemplating whether to retrieve potentially troubling information may wish to tolerate the uncertainty of not knowing for the reasons sketched above. Yet, this person might also realize that if she chose to retrieve the information, a reduction of uncertainty would not necessarily result. Instead, new uncertainties could assert themselves. If, as in the initial scenario, a person elects to preview her time and manner of death, many of her routines, protocols, and preferences could be thrown into disarray, creating new challenges for uncertainty management. "I have no idea," the person might think, "how to live, now that I know this." Inasmuch as people are able to anticipate this secondary and more severe kind of uncertainty, their decision to not know in the first place could be an effort to minimize uncertainty when it cannot be eliminated.

Recognizing that the science of deliberate ignorance is young, Hertwig and Engel (2020) end with a call for wisdom. Though their edited volume comprises chapters on mathematical modeling and normative analysis, the case-to-case nuances have a way of proliferating. Indeed, this may be an instance of the standard deviation outrunning the square root of the sample size. More research is needed, and until we have it, we must, at least in part, leave room for common sense and the reasonable-person heuristic. At any rate, we should be mindful of our decisions to know or to not know, and ask ourselves whether we can make a reasoned argument for our decision, at least when it is a decision of importance, like life and death.

[1] He who warned against stepping into the river that is 3 feet deep on average ignored, perhaps deliberately, the variance of the river's depth. If the river's depth ranges from 2.5 to 3.5 feet, you're fine.

References

Gigerenzer, G., & Garcia-Retamero, R. (2017). Cassandra's regret: The psychology of not wanting to know. Psychological Review, 124, 179-196.

Hertwig, R., & Engel, C. (Eds.). (2020). Deliberate ignorance: Choosing not to know. MIT Press.

Krueger, J. I., & Grüning, D. J. (2021). Psychological perversities and populism. In J. P. Forgas, W. D. Crano, & K. Fiedler (Eds.), The social psychology of populism: The tribal challenge to liberal democracy (The Sydney Symposium on Social Psychology, Vol. 22, pp. 125-142). Taylor & Francis.
