Whether we’re considering economy of motion or economy of thought, humans are remarkably efficient creatures. In some ways, this efficiency provides us with an enormous advantage. In others, it can leave us coming up short, particularly when it comes to cognition and the processing of information.
Occam’s Double Edge
The law of parsimony, more commonly referred to as Occam’s Razor, suggests that the simplest answer is usually the correct one. So, to paraphrase, rather than "multiplying entities without necessity," we are better served by seeking out and following the most direct route. In the service of problem solving, this lends us tremendous efficiency. It also introduces the sometimes disadvantageous effect of "cognitive miserliness."
We have evolved to "think lean," seeking the shortest distance between a problem and its solution, or between a concept and its understanding. Rather than expending a lot of mental energy on solving a problem or absorbing an idea, we hoard that energy, a propensity that cuts both ways. Put plainly, we are more likely to settle on a solution, or take on a concept, without thinking it through.
Systems of Thinking
In terms of cognition, we generally think about things in one of two ways, which is how we end up not thinking things through. The first is the somewhat automatic "seeing-hearing-experiencing is believing" mode of thought; the second is the more critical approach, which takes a bit of effort. True to our cognitive miserliness, we generally default to the first, less critical, approach.
Automatic, or lean, thinking lends itself to the trap of accepting misinformation for a couple of reasons. On the one hand, the easier something is to process, the more likely we are to believe it’s true. Quick and easy judgments tend to feel right, even when they aren’t demonstrably true, or are in fact demonstrably false, so we tend to go in that direction. On the other hand, the very efficiency that cognitive miserliness buys us can cause us to overlook critical details, particularly when those details run contrary to our "easy out" thinking.
Repetition Is Believing
All of these cognitive tendencies are amplified by something called "fluency." At its most basic, fluency refers to how easily we process information: the more often we hear something, the more easily we process it, and the more likely we are not only to believe it, but to accept it as truth. Bridge this construct with our tendency toward lean thinking, and it’s fairly easy to see how oft-repeated falsehoods, no matter how much evidence stands against them, come to be accepted as fact.
Think about it. What’s a mantra of your childhood that you still hold dear, even though your adult self knows it to be questionable, untrue, or even patently false? By the same token, there are likely some notions you have developed in your adult life that don’t hold up to closer examination.
It’s likely you’ve shed, or at least modified, some of your childhood beliefs over the years. Others you may still apply reflexively, an example of that pesky automatic thinking at work. The same holds, on both counts, for beliefs you’ve developed later in life. Either way, our cognitive biases can hamstring our critical thinking.
Mainstreaming Misinformation
We are, by nature, territorial. In cognitive terms, territoriality roughly translates to positionality: we stake out a claim around a concept, idea, or belief, then dig in to defend it, often at any cost. When that concept, idea, or belief is fed by lean thinking rather than critical thinking, and by fluency rather than fact-checking, we set ourselves up to become bearers of the falsehoods underlying the culture of misinformation.
Further, when we float these ideas and they are accepted, we contribute to the phenomenon social psychologists call groupthink, in which a group reaches consensus without critical reasoning or any substantive evaluation of consequences. Caught up in that social momentum, we feed a broader web of misinformation that ultimately undermines our social fabric.
The bottom line is that we are obliged to check our premises. Lean thinking allows us to problem-solve effectively and to respond to crises with efficiency and alacrity. It also puts us at a disadvantage when it comes to developing a deeper understanding of our social circumstances and, by association, our social responsibility.
© 2021 Michael J. Formica, All Rights Reserved