
Massively Intelligent

Hardly anyone can accurately explain how a zipper works. And yet we've split the atom and explored deep space. A new book explains how we manage it.

In a classic Monty Python sketch, panelists on the talk show "How to Do It" promise to teach viewers how to split an atom, construct a box girder bridge, and irrigate the Sahara "to make vast new areas of land cultivatable." But first, how to play the flute: John Cleese picks up the instrument, points to it, and declares, "You blow there, and you move your fingers up and down here."

As Steven Sloman and Philip Fernbach relate with less silliness in The Knowledge Illusion, most of us have no deeper understanding of how everyday devices like a toilet, a zipper, or a coffeemaker actually work, to say nothing of watches, cell phones, or space probes. And yet until we're actually asked, by researchers like the authors, to describe how a toilet functions—and they helpfully provide an elegant description—most of us assume that we do understand. This gap between what we know and what we think we know is called the illusion of explanatory depth.

"Our point is not that people are ignorant," Sloman and Fernbach write. "It's that people are more ignorant than they think they are." In fact, we know just enough to get by.

Sloman, a psychologist at Brown and the editor of the journal Cognition, and Fernbach, a cognitive scientist at the University of Colorado, explore the paradox that while individual humans may be fairly ignorant of how the world works, we're collectively capable of brilliance. Their breezy guide to the mechanisms of human intelligence breaks us down, then builds us up, only to break us down again: We're ignorant, but that's okay, because accessible, actionable knowledge is everywhere—but our easy access to information makes us recklessly overconfident.

A Global Brain Bank

The human mind is an outstanding problem solver but a less impressive storage device. We can hold, according to some estimates, about 1 gigabyte of memory, maybe as much as 10. But our minds are not computers. They rely not on memory and deliberation, as a machine must, but on pattern recognition and insight. Besides, there's just too much to know. Most of our knowledge instead resides outside of our heads—in our bodies, in the environment, and most crucially, in other people.

That's not a weakness, it's a strength. "You're seeing only a tiny bit of the world at a time, but you know the rest is there," the authors write. Our knowledge of how the world normally functions lets us learn all we need to know about an environment just by looking around, and to draw correct conclusions about the space we're in without seeing it all at once. In other words, the world is part of your memory. Did you already forget the punch line to the Monty Python sketch? That's fine: You know it's right there at the top of the page, easy to access.

We have succeeded as a species because of how well communities of brains work together. Every significant project ever attempted—pyramid, cathedral, or skyscraper—would be inconceivable if it had to rely on a single mind. Evolutionarily speaking, this is the social brain hypothesis: As social groups grew in size and complexity, we developed new cognitive capabilities to support our communities through specialization, distributed expertise, and a division of cognitive labor. Above all, we share intentionality, the capacity to pursue goals jointly, a trait easily observed in both the group play of children and the global collaboration of scientific research.

What's true of a particle accelerator, a massively complex product of distributed expertise, also holds for our smallest groups: Spouses are more prone to forget details in areas of their partner's clear expertise—the right wine or how to file taxes—and vice versa. We can focus our memory on what we have to, or want to, remember, secure in the knowledge that our partner has his or her territory equally well covered.

If we can't make use of other people's knowledge, we can't succeed. We can barely function. But if we fail to recognize that most of our knowledge lives outside our brains, we face a different problem. At least as much as it is a book about cognition, The Knowledge Illusion is a book about hubris.

To varying extents, we all live under the illusion of explanatory depth. "We fail to draw an accurate line between what is inside and outside our heads," the authors write. "So we frequently don't know what we don't know." The illusion is in some ways a byproduct of maturity. As preschoolers, we take an object in our hands and ask why and how it works until we're satisfied, or until our parents throw up their hands or distract us. Eventually, we stop asking. We come to tolerate our inability to understand our complex tools by choosing to stop noticing it. And then we go further: "We think the knowledge we have about how things work sits inside our skulls when in fact we're drawing a lot of it from the environment and from other people."

Ignorance, the authors remind us, is our natural state and nothing to be ashamed of, as long as it's tempered with humility. When it is not, however, we can fall prey to the Dunning-Kruger effect, the phenomenon in which those who perform worst on a task overrate their skills the most. As has been shown in studies of doctors, workers, students, and, perhaps most notably, drivers, those with the highest level of skill tend to underrate their abilities—they recognize how much they don't know and how much better they could perform. The worst performers, however, tend to lack a sense of which skills they're missing and so remain blind to their own deficiencies. "When the only way to evaluate how much you know is through your own knowledge," the authors write, "you'll never get an honest assessment."

The Dunning-Kruger effect is a particular risk in the political arena. Leaders "have the responsibility to learn about their own ignorance and effectively take advantage of others' knowledge and skills," the authors suggest, and voters have an obligation as well: "A mature electorate is one that makes the effort to appreciate a leader who recognizes that the world is complex and hard to understand."

How Facts Can Defeat Beliefs

It's not only knowledge that is spread across communities; beliefs and values are as well—and they're not always fact-based. The flip side of collective intelligence is groupthink, in which members of a group provide one another with support for a shared belief that may have no factual basis. The illusion of explanatory depth helps explain how we can passionately hold strong opinions with little factual support. But discarding an incorrect belief agreed upon by one's community—say, about climate change—is a daunting cognitive challenge.

Research into the illusion offers a way out—not through debate but through humility-evoking doubt. Studies have found that the best way to shift others' opinions is to employ the same strategy that helps them understand that they don't really know how a toilet works: Ask them to explain it. When people are prompted to do so, they are forced to acknowledge their lack of knowledge. The illusion is broken, and after such encounters, people report less attachment to extreme opinions. Similarly, when climate-change skeptics are shown videos or articles about the science behind the process—the mechanisms, not the blame—their cognitive wall begins to crack. In the end, the authors report, "no one wants to be wrong."
