How to Outsmart Your Brain
We’re up against the biological limits of what the brain can absorb. The way forward is to reach beyond it, using our own bodies and the world around us as resources.
By Annie Murphy Paul published July 6, 2021 - last reviewed on July 13, 2021
“Use your head.”
How many times have you heard that advice? Perhaps you’ve even urged it on someone else. The command is a common one, issued in schools, in the workplace, amid the trials of everyday life. Its refrain finds an echo in culture both high and low, from Auguste Rodin’s The Thinker, chin resting thoughtfully on fist, to the bulbous cartoon depiction of the brain that festoons all manner of products and websites—educational toys, nutritional supplements, cognitive fitness exercises. When we say it, we mean: Call on the more than ample powers of your brain, draw on the magnificent lump of tissue inside your skull. We place a lot of faith in that lump; whatever the problem, we believe the brain can solve it.
But what if our faith is misplaced? What if the directive to use your head, ubiquitous though it may be, is misguided? As it is, we use our brains entirely too much—to the detriment of our ability to think intelligently. What we need to do is think outside the brain.
Thinking outside the brain means skillfully engaging entities external to our heads—the feelings and movements of our bodies, the physical spaces in which we live and learn and work, and the minds of the other people around us—drawing them into our own mental processes. By reaching beyond the brain to recruit these extraneural resources, we are able to focus more intently, comprehend more deeply, and create more imaginatively—to entertain ideas that would be literally unthinkable by the brain alone.
It’s true that we’re more accustomed to thinking about our bodies, our spaces, and our relationships. But we can also think with and through them—by using the movements of our hands to understand and express abstract concepts, for example, or by arranging our workspace in ways that promote idea generation, or by engaging in social practices like teaching and storytelling that lead to deeper understanding and more accurate memory. Rather than exhorting ourselves and others to use our heads, we should be applying extraneural resources to the project of thinking outside the skull’s narrow circumference.
We’ve been led to believe that the human brain is an all-purpose, all-powerful thinking machine. We’re deluged with reports of discoveries about the brain’s astounding abilities, its lightning quickness and its protean plasticity; we’re told that the brain is a fathomless wonder, “the most complex structure in the universe.”
But when we clear away the hype, we confront the fact that the brain’s capacities are actually quite constrained and specific. The less-heralded scientific story of the past several decades has been researchers’ growing awareness of the brain’s limits. The human brain is limited in its ability to pay attention, limited in its capacity to remember, limited in its facility with abstract concepts, and limited in its power to persist at a challenging task.
Importantly, these limits apply to everyone’s brain. It’s not a matter of individual differences in intelligence; it’s a matter of the character of the organ we all possess, its biological nature and its evolutionary history. The brain does do a few things exquisitely well—sensing and moving the body, navigating through space, connecting with other humans. These activities it can manage fluently, almost effortlessly. But accurately recalling complex information? Engaging in rigorous logical reasoning? Grasping abstract or counterintuitive ideas? Not so much.
Here we arrive at a dilemma—one that we all share: The modern world is extraordinarily complex, bursting with information, built around nonintuitive ideas, centered on concepts and symbols. Succeeding in this world requires focused attention, prodigious memory, capacious bandwidth, sustained motivation, logical rigor, and proficiency with abstractions. The gap between what our biological brains are capable of and what modern life demands is large and getting larger each day. With every terabyte of data swelling humanity’s store of knowledge, our native faculties are further outstripped. With every twist of complexity added to the world’s problems, the naked brain becomes more unequal to the task of solving them.
Our response to the cognitive challenges posed by contemporary life has been to double down on what Andy Clark, a professor of cognitive philosophy at the University of Sussex, England, calls “brainbound” thinking—those very capacities that are, on their own, so woefully inadequate. We urge ourselves and others to grit it out, bear down, “just do it”—to think harder. But as we often find, to our frustration, the brain is made of stubborn and unyielding stuff, its vaunted plasticity notwithstanding.
Confronted by its limits, we may conclude that we ourselves (or our children or our students or our employees) are simply not smart enough, or not “gritty” enough. In fact, it’s the way we handle our mental shortcomings—which are, remember, endemic to our species—that is the problem. Our approach constitutes an instance of (as the poet William Butler Yeats put it in another context) “the will trying to do the work of the imagination.” The smart move is not to lean ever harder on the brain but to learn to reach beyond it.
The good news is that we have long been drawing extraneural resources into our thinking processes—that we already think outside the brain. The bad news is that we often do it haphazardly.
Our efforts at education and training, as well as at management and leadership, are aimed almost exclusively at promoting brainbound thinking. Beginning in elementary school, we are taught to sit still, work quietly, think hard. The skills we develop and the techniques we are taught are those that involve using our heads: committing information to memory, engaging in internal reasoning and deliberation, endeavoring to self-discipline and self-motivate.
Meanwhile, there is no corresponding cultivation of our ability to think outside the brain—no instruction, for instance, in how to tune in to the body’s internal signals, sensations that can profitably guide our choices and decisions. We’re not trained to use bodily movements and gestures to understand highly conceptual subjects like science and mathematics, or to come up with novel and original ideas. Schools don’t teach students how to restore their depleted attention with exposure to nature or how to arrange their study spaces so that they extend intelligent thought. Teachers and managers don’t demonstrate how abstract ideas can be turned into physical objects that can be manipulated and transformed in order to achieve insights and solve problems.
Employees aren’t shown how the social practices of imitation and vicarious learning can shortcut the process of acquiring expertise. Classroom groups and workplace teams aren’t coached in scientifically validated methods of increasing the collective intelligence of their members. Our ability to think outside the brain has been left almost entirely uneducated and undeveloped. This oversight is the result of what has been called our “neurocentric bias”—that is, our idealization and even fetishization of the brain—and our corresponding blind spot for all the ways cognition extends beyond the skull.
Until recently, science shared the larger culture’s neglect of thinking outside the brain. But this is no longer the case. Psychologists, cognitive scientists, and neuroscientists are now able to provide a clear picture of how extraneural inputs shape the way we think. Even more promising, they offer practical guidelines for enhancing our thinking through the use of outside-the-brain resources. Such developments are unfolding against the backdrop of a broader shift in how we view the mind—and, by extension, how we understand ourselves.
Our current ideas about the brain were born on February 14, 1946, when the Electronic Numerical Integrator and Computer, the world’s first general-purpose electronic computer, capable of performing calculations at unprecedented speed, was revealed to the world. Weighing 30 tons, the massive ENIAC used around 18,000 vacuum tubes, employed about 6,000 switches, and encompassed upwards of half a million soldered joints; it had taken more than 200,000 man-hours to build.
With funding from the U.S. Army, the ENIAC had been developed for computing artillery trajectories for American gunners fighting in Europe. Compiling trajectory tables—necessary for the effective use of new weapons—was a laborious process. A machine that could do the job with speed and accuracy would give the army an invaluable edge.
Now, six months after V-J Day, at a press conference to introduce the invention to the world, ENIAC was tasked with multiplying 97,367 by itself 5,000 times. “Watch closely—you may miss it,” warned the speaker as he pushed a button. Before the reporters had time to look up from their notepads, the task was complete, the answer delivered on a punch card into the announcer’s hand.
“One of the war’s top secrets, an amazing machine which applies electronic speeds for the first time to mathematical tasks hitherto too difficult and cumbersome for solution, was announced here tonight by the War Department,” The New York Times reported on its front page. “So clever is the device,” the reporter wrote, “that its creators have given up trying to find problems so long that they cannot be solved.”
The introduction of the ENIAC was not just a milestone in the history of technology. It was a turning point in the story of how we understand ourselves. In its early days, the invention was frequently compared to a human brain—a “giant electronic brain,” a “robot brain,” an “automatic brain,” a “brain machine.” But before long, the analogy got turned around. It became a commonplace that the brain is like a computer.
Indeed, the “cognitive revolution” that would sweep through American universities in the 1950s and 1960s was premised on the belief that the brain could be understood as a flesh-and-blood computing machine. The first generation of cognitive scientists “took seriously the idea that the mind is a kind of computer,” notes Brown University professor Steven Sloman. “Thinking was assumed to be a kind of computer program that runs in people’s brains.”
Since then, the brain-computer analogy has become only more pervasive and more powerful. The brain, according to this analogy, is a self-contained information-processing machine, sealed inside the skull as the ENIAC was sequestered in its locked room. From this inference emerges a second: The human brain has attributes, akin to gigabytes of RAM and megahertz of processing speed, that can be easily measured and compared. Following on these is the third and perhaps most significant supposition of all—that some brains, like some computers, are just better; they possess the biological equivalent of more memory storage, greater processing power, higher-resolution screens.
To this day, the computer metaphor dominates the way we think and talk about mental activity—but it’s not the only one that shapes our notion of the brain. “New Research Shows That the Brain Can Be Developed Like a Muscle,” blared the headline of a news article. The year was 2002, and Carol Dweck, a psychology professor then at Columbia University, was testing a new theory, investigating the possibility that the way we conceptualize the brain can affect how well we think.
Dweck’s idea, which she initially called “the incremental theory of intelligence,” would eventually become known to the world as the “growth mindset”—the belief that concerted mental effort could make people smarter, just as vigorous physical effort could make people stronger. At the center of it all is a metaphor: the brain as muscle. The mind, in this analogy, is akin to a bicep or a deltoid—a physical entity that varies in strength among individuals.
The two metaphors—brain as computer and brain as muscle—share some key assumptions: The mind is a discrete thing that is sealed in the skull; this discrete thing determines how well people are able to think; this thing has stable properties that can easily be measured, compared, and ranked. The two metaphors fit neatly with our society’s emphasis on individualism—its insistence that we operate as autonomous, self-contained beings, in possession of capacities and competencies that are ours alone. These comparisons also readily conform to our culture’s penchant for thinking in terms of good, better, best.
There even appear to be hardwired psychological factors underlying our embrace of these ideas about the brain. The belief that some core quantity of intelligence resides within each of our heads fits with a pattern of thought, apparently universal in humans, that psychologists call “essentialism”—the conviction that each entity we encounter possesses an inner essence that makes it what it is. We think in terms of enduring essences—rather than shifting responses to external influences—because we find such essences easier to process mentally, as well as more satisfying emotionally. From the essentialist perspective, people simply “are” intelligent or they are not.
Together, the historical, cultural, and psychological bases of our assumptions about the mind—that its properties are individual, inherent, and readily ranked according to quality—give them a powerful punch. Such assumptions have profoundly shaped the views we hold on the nature of mental activity, on the conduct of education and work, and on the value we place on ourselves and others. It’s therefore startling to contemplate that the whole lot of it could be misconceived.
On the morning of April 18, 2019, computer screens went dark across a swath of Seoul, South Korea’s largest city. Lights flickered out in schools and offices across the 234-square-mile metropolis, home to some 10 million people. Stoplights at street intersections blinked off, and electric trains slowed to a halt. The cause of the blackout was magpies. The birds are well known for making their nests out of whatever is available in the environment: twigs, string, moss, fishing line, plastic Easter grass, chopsticks, and more.
The densely packed urban neighborhoods of modern-day Seoul feature few trees or bushes, so magpies use what they can find: metal clothes hangers, TV antennas, lengths of steel wire. These materials conduct electricity—and so, when the birds build their nests on the city’s tall electrical transmission towers, the flow of electricity is regularly disrupted.
Our brains, it might be said, are like magpies, fashioning their finished products from the materials around them, weaving the bits and pieces they find into their trains of thought. Set beside the brain-as-computer and brain-as-muscle metaphors, it’s apparent that the brain as magpie is a very different kind of analogy. For one thing, thought happens not only inside the skull but out in the world, too; it’s an act of continuous assembly and reassembly that draws on resources external to the brain. For another: The kinds of materials available to think with affect the nature and quality of the thought that can be produced. And last: The capacity to think well—that is, to be intelligent—is not a fixed property of the individual but rather a shifting state that is dependent on access to extraneural resources and the knowledge of how to use them.
This is a radically new way of thinking about thinking. Recasting our model of how the mind functions has lately become an urgent necessity, as we find ourselves increasingly squeezed by two opposing forces: We need ever more to think outside the brain, even as we have become ever more stubbornly committed to the brainbound approach.
With the accelerated pace of our days and the escalating complexity of our duties at school and work, the demands on our thinking keep ratcheting up. There’s more information we must deal with. The information we have to process is coming at us faster. And the kind of information assaulting us is increasingly specialized and abstract. The knowledge and skills that we are biologically prepared to learn have been outstripped by the need to master a set of competencies that come far less naturally and are acquired with far more difficulty.
David Geary, a professor of psychology at the University of Missouri, makes a useful distinction between “biologically primary” and “biologically secondary” abilities. Human beings, he points out, are born ready to learn certain things: how to speak the language of the local community, how to find their way around a familiar landscape, how to negotiate the challenges of small-group living. We are not born to learn the intricacies of calculus or the counterintuitive rules of physics; we did not evolve to understand the workings of the financial markets or the complexities of global climate change. And yet we dwell in a world where such biologically secondary capacities hold the key to advancement, even survival. The demands of the modern environment have now met the limits of the biological brain.
For a time, it’s true, humanity was able to keep up with its own ever-advancing culture, resourcefully finding ways to use the biological brain better. As their everyday environments grew more intellectually demanding, people responded by upping their cognitive game. Continual engagement with the mental rigors of modern life—along with improving nutrition, rising living standards, and reduced exposure to infectious disease and other pathogens—produced a century-long climb in average IQ scores, as measured by intelligence tests taken by people all over the globe. But this upward trajectory is now leveling off.
In recent years, IQ scores have stopped rising or have even begun to drop. It may be that “our brains are already working at near-optimal capacity,” Nicholas Fitz and Peter Reiner write in the journal Nature. Efforts to wrest more intelligence from this organ, they add, “bump up against the hard limits of neurobiology.”
There is a way we can extend beyond our limits. It’s not by revving our brains like machines or bulking them up like muscles, but by strewing our world with rich materials and weaving them into our thoughts.
Thinking With Your Body
- Perform a body scan. Direct nonjudgmental attention to each part of your body in turn. Practicing this meditative exercise daily will improve your ability to perceive interoceptive sensations: the body’s internal signals, which integrate many kinds of information and can usefully guide your choices and decisions.
- Engage in micro movements. These small shifts of the body, such as those we make while working at a standing desk, help keep us alert and engaged.
- Achieve “hypofrontality.” Extremely vigorous and sustained exercise inhibits the activity of the prefrontal cortex, the brain’s taskmaster and critic—allowing more creative and original ideas to emerge.
- Act out the abstract. In order to grasp an abstract concept, or to commit it more firmly to memory, act it out with whole-body movements.
- Explicitly encourage gesture. When instructing a child or a student, or helping a friend or a colleague brainstorm, offer a suggestion: “Could you try moving your hands as you say that?”
- Include your hands in the conversation. When you’re on a Zoom call or teaching via video, make sure your viewers can see your hands. Using your hands to gesture makes your own speech more fluent and helps your audience remember what you say.
- Let your hands share the burden. Allow your hand gestures to “hold” some of your mental contents, thereby lightening cognitive load.
Thinking With Your Environs
- Apply “environmental self-regulation.” Instead of trying to get a grip on your thoughts and feelings from the inside, use exposure to the outside world—especially nature—to restore your equilibrium and your focus.
- Take an “awe walk.” Spend time outdoors, allowing yourself to wonder at and be moved by nature’s majesty. Awe can act as a reset button for the human brain, shaking us loose from old patterns and opening us up to new possibilities.
- Implement “sensory reduction.” Reduce the amount of environmental stimulation your brain has to attend to by thinking within a quiet, enclosed space—or by closing your eyes. Sensory reduction generates a state of “stimuli hunger” in which weakly activated inner knowledge (barely remembered facts, elusive imaginative notions) becomes accessible.
- Give yourself some privacy. Shielding yourself from the gaze of others reduces cognitive load and fosters experimentation.
- Sketch it out. Drawing the concept you’re thinking about has benefits beyond writing about it—even if you “can’t draw.” Attempting to capture a concept in visual terms deepens understanding and reinforces memory.
- Make it physical. The human brain evolved to manipulate physical objects, not abstract ideas. Create a concrete model of a concept, then look at it from different perspectives, manipulating its elements and trying out new combinations.
Thinking With Others
- Choose a model to imitate. Our culture values innovation and originality, but often the most efficient and effective approach to solving a problem is to copy what someone else has already done.
- Encourage close observation. Children in other cultures commonly learn by observing and imitating their elders. Research has found that American children are not so adept at this practice—but that these capacities can be deliberately cultivated.
- Take advantage of “the protégé effect.” When we teach material that we ourselves need to learn, it is we, the teachers, who learn it most thoroughly. As highly social creatures, we’re more motivated by the goal of conveying information to others than by the goal of simply studying for our own sake. Even struggling learners benefit from teaching younger students or family members.
- Move in sync. Engaging in coordinated physical movement with others will lead you to like them more, identify with them more closely, and cooperate with them more effectively.