
Verified by Psychology Today


Distinguishing the Genuinely Rational From the Irrational

How our intuitions about rationality can lead us astray.

Key points

  • If someone made perfectly accurate judgments and sound decisions, would we recognize it?
  • The science and philosophy of judgment and decision-making suggests that the answer is often “no.”
  • Suggestions on how we can distinguish the genuinely rational from the irrational.

Part Two of Two

In a previous post, we considered three of seven "irrational" habits of highly rational people. Here we consider the remaining four, as well as some suggestions for distinguishing the genuinely rational from the irrational.

4. They avoid risks that don’t happen

As discussed, a rational person can look “irrational” or “paranoid” by virtue of thinking the “impossible” is possible or even (probably) true. Not only that, but they will also act to reduce risks that never actually happen.

This is because our leading theory of rational decision-making claims that we should make decisions not just based on how probable or improbable outcomes are, but based on their so-called expected utility, where an option's expected utility weights how good or bad each possible outcome is by its probability and sums the results.

This sometimes means that we should avoid decisions if there is an improbable chance of something really bad happening. For example, it can be rational to avoid playing Russian roulette even if the gun is unlikely to fire a bullet, simply because the off-chance of a bullet firing is so bad that the gamble's expected utility is strongly negative.
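To make the arithmetic concrete, here is a minimal Python sketch of an expected-utility comparison. The probability comes from a six-chamber revolver, but the utility numbers are purely illustrative stand-ins, not taken from any study:

```python
# Expected utility: weight each outcome's utility by its probability and sum.
def expected_utility(outcomes):
    """outcomes: iterable of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# One pull of the trigger in Russian roulette with a six-chamber revolver.
# The utilities are made-up: a small thrill versus a catastrophe.
play = expected_utility([(5 / 6, 10),        # survive: modest payoff
                         (1 / 6, -10_000)])  # bullet fires: catastrophic loss
decline = expected_utility([(1.0, 0)])       # walk away: nothing changes

# Even though a bullet is unlikely, the catastrophe dominates the sum,
# so declining has the higher expected utility.
```

On these (invented) numbers, the gamble's expected utility is deeply negative despite the bad outcome being improbable, which is exactly why a rational agent declines.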

Likewise, for many other decisions in life, it may be rational to avoid decisions if they have improbable outcomes that are sufficiently bad. This consequently means a rational person could often act to avoid many risks that never actually happen.

But as is well known, people often evaluate the goodness of a decision based on its outcome, and if the bad thing does not happen, the average person might evaluate that decision as “irrational.”

This kind of thing could happen quite often, too. For example, if a rational decision-maker avoids every highly negative outcome that has a 10 percent probability, then 90 percent of the time they will be avoiding negative outcomes that simply don't happen, potentially making them look quite irrational.

The situation is even worse if the evaluator has the "miscalibrated certainty" we considered earlier, so that the outcome not only fails to happen but also looks as if it had always been “impossible” from the evaluator's perspective.

5. They pursue opportunities that fail
But the same moral holds not only for decisions that avoid risk but also for decisions that pursue reward. For example, a rational decision-maker might accept an amazing job offer that has merely a 10 percent chance of continued employment if that possibility is sufficiently good. Of course, the decision to accept that job then has a 90 percent chance of resulting in unemployment, potentially making the decision seem like a “failure” when the probable outcome occurs.

More generally, a rational decision-maker would pursue risky options with 90 percent chances of failure if the options are sufficiently good all things considered: it is like buying a lottery ticket with a 10 percent chance of winning but with a sufficiently high reward.

But again, the rational decision-maker could look highly “irrational” in the 90 percent of cases where those decisions lead to less-than-ideal outcomes.

In any case, what both this habit and the preceding one have in common is that rational decision-making requires making decisions that lead to the best outcomes over many decisions in the long run, but humans often evaluate decision-making strategies based on mere one-off cases.
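This contrast between long-run and one-off evaluation can be simulated. The sketch below is my own illustration with invented payoffs, not an example from the article: a risky option that "fails" 90 percent of the time still outperforms a safe option over many repetitions, because its expected payoff is higher.

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

# Made-up payoffs: the risky option pays 100 with probability 0.1, else 0
# (expected value 10); the safe option always pays 5.
def risky_payoff():
    return 100 if random.random() < 0.1 else 0

TRIALS = 100_000
risky_avg = sum(risky_payoff() for _ in range(TRIALS)) / TRIALS
safe_avg = 5.0

# Judged one case at a time, the risky chooser "fails" about 90 percent of
# the time; judged over the long run, their average payoff is roughly double
# the safe chooser's.
```

An observer scoring any single trial would usually call the risky choice a mistake; an observer scoring the whole run would call it clearly superior.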

6. They are often irrational
Despite all that, arguably any realistic person who is as rational as could be would still be genuinely irrational to some degree. This is because our dominant theory of judgment and decision-making, dual process theory, entails that while we often make reflective judgments and decisions, there are countless situations where we do not and simply cannot.

Instead, the literature commonly affirms that everyone employs a set of so-called heuristics for judgment and decision-making which, while often adequate, also often lead to sub-optimal outcomes. Consequently, even if someone were as rational as could be, they would still make irrational judgments and decisions in countless contexts where they cannot be expected to rely on their more reflective faculties.

If we then focus solely on these unreflective contexts, we would get an inaccurate impression of how rational they are overall.

7. They do things that are often “crazy” or “unconventional”
All of the preceding thoughts then entail that rational people may do things that seem “crazy” or “unconventional” by common standards: they might believe in seemingly impossible things, act to reduce risks that never happen, or pursue opportunities that never materialize, and so on. This might express itself in weird habits, beliefs, or in many other ways.

But this shouldn’t be too surprising. After all, the history of humanity is a history of common practices that later generations appraise as unjustified or irrational. Large portions of humanity once believed that the earth was flat, that the earth was at the center of the universe, that women were incapable of or unsuited to voting, and so on.

Have we then finally reached the apex of understanding in humanity’s evolution, a point where everything we now do and say will appear perfectly rational by future standards? If history is anything to go by, then surely the answer is “No.” If that is the case, then perhaps the truly "rational" will be ahead of the rest—believing or doing things that seem crazy or irrational by our currently common standards.

How to distinguish the rational from the irrational

I hope I have conveyed just how frequently our untrained intuitions about what is rational may diverge from what is truly rational: what’s rational might appear “irrational” and vice versa. In a world where these intuitions might lead us astray, then, how can we tell rational from irrational, accurate from inaccurate, or wisdom from lack of wisdom?

Some common rules of thumb might not work too well. For example, sometimes the evidence fails to find that years of experience, age, or educational degrees improve accuracy, at least in domains like geopolitical forecasting.

Some suggestions supported by the evidence:

Suggestion #1: Measure calibration

First, track the calibration of the judgments you care about, whether they are yours or others’. I provide some tools and ideas for how to do this here. This can help us put things in perspective, avoid focusing on single cases, and detect pervasive miscalibration that can affect our decision-making. And as other studies suggest, past accuracy is the best predictor of future accuracy.
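As one hypothetical way of doing this, you could log each forecast with its stated probability and later check how often forecasts at each probability level came true. The data below are invented for illustration, and the Brier score is one standard summary of forecast accuracy:

```python
from collections import defaultdict

# Hypothetical forecast log: (stated probability, did the event occur?)
forecasts = [(0.9, True), (0.9, True), (0.9, False),
             (0.6, True), (0.6, False), (0.6, True),
             (0.2, False), (0.2, False), (0.2, True)]

# Brier score: mean squared gap between stated probability and outcome
# (0 = perfect; lower is better).
brier = sum((p - int(o)) ** 2 for p, o in forecasts) / len(forecasts)

# Calibration table: for each stated probability, how often did it happen?
buckets = defaultdict(list)
for p, o in forecasts:
    buckets[p].append(o)
for p in sorted(buckets):
    rate = sum(buckets[p]) / len(buckets[p])
    print(f"said {p:.0%} -> happened {rate:.0%}")
```

A well-calibrated forecaster's table lines up: things they say are 90 percent likely happen about 90 percent of the time, and so on down the list.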

Suggestion #2: Learn norms of reasoning
Additionally, I would suggest learning and practicing various norms of reasoning. These include the evidence-based suggestions for forming more accurate judgments in my book Human Judgment, such as practicing active open-minded thinking and thinking in terms of statistics. It also includes other norms, such as so-called “Bayesian reasoning,” which can produce more accurate judgments in the Monty Hall problem and potentially other contexts, as I discuss here and here.
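As a quick check of the Bayesian answer to the Monty Hall problem, a simulation (my own sketch, not code from the article) shows that switching doors wins about two-thirds of the time:

```python
import random

random.seed(1)  # fixed seed for reproducibility

def monty_hall(switch, trials=100_000):
    """Simulate the Monty Hall game; return the fraction of wins."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)   # door hiding the prize
        pick = random.randrange(3)  # contestant's initial choice
        # Host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d not in (pick, car))
        if switch:
            # Switch to the one remaining closed door.
            pick = next(d for d in range(3) if d not in (pick, opened))
        wins += (pick == car)
    return wins / trials

switch_rate = monty_hall(switch=True)
stay_rate = monty_hall(switch=False)
# Bayesian reasoning predicts roughly 2/3 for switching, 1/3 for staying.
```

The simulated frequencies match the Bayesian calculation, even though many people's intuition insists the two doors must be 50/50.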

Suggestion #3: Think in terms of expected utility
Finally, when evaluating the rationality of someone’s decisions, think in terms of expected utility theory. Expected utility theory is complicated, but here is a potentially helpful introduction to it (and from my former Ph.D. advisor—a really awesome person!). In short, though, expected utility theory requires us to ask what probabilities people attach to outcomes, how much they value those outcomes and, on my preferred version of it, whether their probabilities are calibrated and their values are in some sense objectively “correct.” Then, we can ask whether they are making decisions that lead to the best possible outcomes in the long run.

In these ways, I think we can better tell what’s rational from what’s not in a world where our intuitions can otherwise lead us astray.

More from John Wilcox Ph.D.