Verified by Psychology Today

Build a Better Brain

Learning, the process by which we acquire information about our world, may actually change our brains for the better.

In one corner stands a stack of magazines. In another sits the Sunday paper. On the counter, the radio crackles with news, while nearby the fax machine hums. The information age is definitely upon us.

If you're like most, you're still reeling, struggling to take it all in, perhaps shutting down your input channels entirely or jettisoning subscriptions simply to survive. And it's not going to get any better. The outlook report from dataland is bleak: Every five years, the information load is doubling.

There's nothing left to do but hope for a bigger, brighter brain.

What the data doctors can't even hope to promise, science may yet deliver. In this, the Decade of the Brain, researchers are hot on the trail of how we acquire and store information. Merging psychology with biology, they have made a series of recent discoveries that appear to catch learning in its tracks.

Neuroscientists plumbing this virgin terrain now know that, along with genetic inheritance, experience shapes the very structure of our nervous systems—it alters the brain circuits that process everything from a French lesson to an auto repair guide. Learning, the process by which we acquire information about our world, may actually change our brains for the better. Animal research suggests that the more we use our brains, the more efficient our intellectual muscle gets.

Taken together, their work demonstrates that the brain is an extraordinarily plastic organ, responding actively to a novel environment by growing new connections to greet it. Although the brain is unlike any other organ in that it lacks the ability for cell-body renewal, nerve cells do generate new connections, or synapses—the points at which signals are transmitted—forging new and enhanced pathways for the flow of information.

These findings suggest you can essentially train your brain to collate more information faster, and access it quicker and better. And under the right conditions of stimulation, you can grow yourself a brain that will keep up with your information needs—perhaps even exceed them.

Nature does set certain parameters. We all start out with about the same number of neurons, or brain cells, having the same basic structure. By nine months of age, our nerve cells stop dividing, leaving each of us with about 100 billion to a trillion.

By far the most sophisticated thinking machine known to man, the adult brain massively outperforms today's best supercomputer. It processes billions of operations a second—approximately 10^15, versus a mere 10^9 for the machine—all in three pounds of tissue crammed inside the cranium. So densely packed is the brain that a sample no larger than a grain of rice contains one million neurons, 20 miles of axon—or the extension cords of nerve cells—and 10 billion synapses, calculates California neurobiologist Charles Stevens, Ph.D.
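Those throughput figures are easier to grasp with a bit of arithmetic. Here is a back-of-the-envelope sketch using the article's round estimates (not measured values):

```python
# Back-of-the-envelope comparison using the article's rough estimates.
brain_ops_per_sec = 10**15     # the brain: ~a quadrillion operations a second
computer_ops_per_sec = 10**9   # an early-1990s supercomputer: ~a billion

# The brain comes out roughly a million times faster.
ratio = brain_ops_per_sec / computer_ops_per_sec
print(f"Brain advantage: {ratio:,.0f}x")   # prints: Brain advantage: 1,000,000x
```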

The vast majority of these neurons are contained in the cerebral cortex, or neocortex, the most recently evolved part of the brain—a highly corrugated sheet of gray matter less than three eighths of an inch deep that overlies most other brain structures. The cortex accounts for 80 percent of the brain's total volume and contains the equipment responsible for sensations, thoughts, imagery, language, and other distinctively human abilities. Here, with the assistance of other brain structures, is where the brain makes sense of received stimuli, piecing together the signals from various sensory pathways, connecting them and interconnecting them, and converting them into felt experience.

Formerly the domain of philosophers, this once-ethereal territory has been opened for scientific exploration. Using such technological advances as electrodes, gels, high-powered microscopes and imaging devices including positron emission tomography, or PET scans (think of them as maps of energy flow), along with such low-tech equipment as sea slugs and rat brains, neuroscientists are providing an unprecedented understanding of our brains.

"Stimulation in general is very important to the development of the brain," reports neurobiologist Carla Shatz, Ph.D., of the University of California at Berkeley. While evolution has programmed us to perform certain basic tasks necessary to sustain life—such as eating and sleeping—we still have to learn how to do almost everything else.

Researchers believe that, from birth to adolescence, we are laying down the basic circuitry of the brain. As we grow up, the world subsequently makes its mark physically. Exposure to novel tasks and novel stimuli generates the development of new circuits and synapses for handling all of them. From then on, continued stimulation throughout life further strengthens these pathways and enhances their interconnections.

Scientists cannot yet quantify exactly how much an enriched environment helps the brains of young children to grow. But "we do know that deprivation and isolation can result in failure of the brain to form its rich set of connections," says Shatz.

Whether it's a new sensation or a fresh idea, every outside stimulus is first converted into electrical signals as it enters the cranium. These electrical signals trundle down known pathways, splitting off into multiple directions for processing. Where the lack of prior experience has left no established route, the signal will forge a new one, linking neuron to neuron as it travels along. The resulting chain is called a brain circuit, and the next time the same stimulus enters the brain, it speeds efficiently along its old route, now grooved into an expressway. Hundreds of millions of brain circuits are created by millions of experiences.

Sometime near the high-school prom age—around 18 years old—networks stop forming. We are hard-wired by the end of adolescence. Each of us is left with a "brainprint," or network system, which like a fingerprint, is unique to each one of us. This is the hardware that processes our thoughts.

Once it's in place, certain opportunities are no longer available to us. If, for example, we learn a second language after adolescence, it comes out sounding something like the first one. "We are incapable of acquiring new languages without an accent after adolescence," reports Mark Konishi, a biologist at California Institute of Technology who studies bird-song development. During development the connections form that process sound. But once our brains are so shaped, Konishi says, "we probably use the same neural substrate to process new sounds."

However, the brain is an enormously adaptive organ: The connections between neurons proliferate and shrink depending upon use. The links between them can be strengthened or weakened. "Brain networks can always be fine-tuned," says neurobiologist Stevens, of the Howard Hughes Medical Institute at the Salk Institute in La Jolla. The more synapses between cells, the more avenues for information transmission. The better your cells communicate with one another, the more information you can likely digest, understand and recall efficiently.

"Smarter" people—those who can consume and regurgitate facts with the efficiency of machines—may in fact have a greater number of neural networks more intricately woven together. And recall of any one part seems to summon up a whole web of information.

Pictures of the brain in action confirm this model of efficiency of information flow. Researchers scanning human brains by positron emission tomography (PET)—which highlights the regions that work hardest during various tasks—found that "smarter" brains consume less energy than other brains; to do the same tasks they require less glucose, their favored fuel. "It may be that once the brain becomes really well grooved you don't need as much energy," explains Eric Kandel, M.D., a neurobiologist at the Howard Hughes Medical Institute at Columbia University in New York.

Perhaps that explains why rats raised in enriched environments later learn faster than counterparts kept in barren cages. And perhaps it will help researchers to understand a recent controversial study showing a significant correlation between low levels of education and the incidence of Alzheimer's disease. According to neuroscientist Robert Katzman, Ph.D., of the University of California at San Diego, individuals who lack formal education may develop fewer synapses, or junctures between neurons, than individuals who have routinely stretched their minds. Then, when disease occurs, there is less brain reserve to call on, he says. When Alzheimer's disease strikes them, the loss of synapses is dramatic and quick to show.

Katzman hopes to directly investigate whether the number of synapses in uneducated people is actually different from that of educated people. In the meantime, neurobiologist Richard Mayeux, Ph.D., of Columbia University, appears to have confirmed part of what Katzman is getting at. He has shown that people with high IQs can withstand more brain scarring than less gifted people before they show a noticeable loss of intellect.

So managing large amounts of information throughout your life—as well as keeping your mind active into old age—doesn't just make you smarter, it also appears to buy you some time should you be stricken with a degenerative brain disease. And it can also help you withstand the more everyday ravages of age.

What makes it possible to change our brains to work faster and smarter? Human studies show that there are two kinds of learning. Declarative, or factual, learning consists of the acquisition of details about people, places, and things; it is presumed to be highly associative, drawing on rich neural interconnections. Procedural learning, on the other hand, involves information on how to do things that utilize motor skills and perceptual strategies, such as driving a car.

Each relies on different neural systems in the brain. Procedural learning, more narrow-channel, involves the specific sensory and motor systems underlying the particular skill. Declarative learning is processed in the hippocampus—a small, seahorse-shaped structure located at the base of the temporal lobe of the cerebral cortex. Not only is the hippocampus central to the formation and retrieval of lasting memories, it is part of the limbic system, or emotional brain.

The details of the still-unfolding story of the hippocampus and its role in the flow of information started with amnesiacs and are spun increasingly from nonhuman sources. Sophisticated as the new imaging technology is, it does not go far enough to pin down the complex doings of the human brain at work. For this, scientists have turned to the simpler systems of the sea snail Aplysia, and to rats and cats, among other creatures.

The bet is that the basic cellular processes in these neurons are similar to ours; that the same kinds of changes that animate the brains of "lower" animals animate ours as well. Even as this article goes to press, the various models are producing vast amounts of information that provide an increasingly complete idea of how brains input the newspaper stories, the lectures, the exhibits, the news, and the noise we want to remember.

At the start of this chain, the sensory organs—eyes, ears, nose, mouth, fingertips—transform stimuli into rhythmic patterns of electrical impulses. Then, one by one, millions of neurons pass the charge on to their neighbors. This process is accomplished by chemical as well as electrical means.

Picture a nerve cell. Extending out from the cell body in one direction is the axon, or output arm. Shorter receiving cables, or dendrites, stem from other parts of the cell. The ultra-thin fibers of axon and dendrite terminate in tiny branches. Between the axonal branches of one cell and the dendrite ends of the next is an infinitesimal space—the synapse—which is the site of communication between two neurons.

When an electric charge is sent from one cell to the next, it is ferried across the synaptic gap with the help of specialized chemicals known as neurotransmitters. The neurotransmitter influences the electrical conditions at the synapse, and the receiving neuron fires if it collects enough charge, carrying the starting stimulus to the next cell down the line.
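That fire-when-enough-charge-collects behavior can be caricatured in a few lines of code. What follows is a toy integrate-and-fire sketch, not a biophysical model; the threshold, leak, and input values are invented for illustration:

```python
def toy_neuron(inputs, threshold=1.0, leak=0.1):
    """Accumulate incoming charge; 'fire' once the total crosses a threshold.

    A cartoon of the integrate-and-fire idea: each input nudges the cell's
    charge up, a small leak pulls it back down, and the neuron emits a
    spike (then resets) only when enough charge has built up.
    """
    charge, spikes = 0.0, []
    for x in inputs:
        charge = max(0.0, charge - leak) + x   # leak a little, then integrate
        if charge >= threshold:
            spikes.append(1)                   # fire...
            charge = 0.0                       # ...and reset
        else:
            spikes.append(0)
    return spikes

print(toy_neuron([0.4, 0.4, 0.4]))   # prints: [0, 0, 1]
print(toy_neuron([0.2, 0.2]))        # prints: [0, 0] (too little charge)
```

A quick burst of inputs pushes the cell over threshold, while the same signals arriving sparsely never do, which is one way timing and synaptic strength shape what gets passed down the line.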

A single neuron can send and receive thousands of signals a second. All this brain noise produces a biological translation of the words that you just read. How this message is eventually stored, or retained, is less clear. Most neurobiologists suggest that memory involves some kind of sustained changes in the neurons and their connections—perhaps similar to those that occur during information acquisition. The cells that respond when you recall Hamlet's Act III soliloquy, for instance, may be the same ones that were throbbing when you were taking it in originally.

It is now widely accepted that memory is not stored in a single cell but is spread out over an extensive neuronal network. Each cell provides a tiny piece of a complex mosaic. "Even the simplest memory is spread out over millions of neurons," Stevens says.

Along the same lines, memory recall appears to involve multiple parts of the brain. The most convincing evidence comes from a study by neuroscientists Marcus Raichle, Ph.D., of Washington University in St. Louis, and San Diego's Larry Squire. The two peered by PET scan into the brains of a group of subjects asked to remember specific words. As the subjects reached back into their memory, their brain images flashed all over with light, a sign many sites were participating in the process. "Memory," says Raichle, "is like a piece of music—it has lots of different parts that come together to create the whole."

Further, "we appear to have specialized processing centers that act in different combinations when we recognize something. We know that the hippocampus plays a critical role in laying down new memories and recalling the recent past." Raichle and others think that memory formed in the hippocampus gets stored in the neighboring neocortex, a setup that frees the hippocampus for new tasks. No one, however, knows for sure how a short-term memory, which lasts for a couple of days at best, turns into long-term memory that can last a lifetime.

It is a problem Columbia University's Eric Kandel has been working on for three decades. In his painstaking investigations into how experience changes the nervous system, or the cellular and molecular mechanisms of learning and memory, he has focused on the simple nervous system of the sea snail Aplysia. Its 20,000 neurons are the largest in the animal kingdom. For this Kandel is considered by some to be the most reductionist scientist of our times.

Along with magnificently accessible nerve cells, Aplysia also has an easily observable behavior, the "gill withdrawal" reflex. Tap on Aplysia's spout, or siphon, and the snail withdraws its gill. Kandel found that if he shocked the snail's tail, the creature became "sensitized." It learned to overreact, to instantly withdraw its gill upon the slightest touch. (A basic form of learning, sensitization takes place in humans as well.)

Once Kandel identified the key cells that contribute to this type of learned behavior, he then looked for changes that, with training, occurred within the cells. He and his colleagues found that a single tail shock—which produces short-term memory for sensitization—activates a cascade of cellular events in which the sensory neuron releases more neurotransmitter so that the neural connections are strengthened between the sensory neuron from the siphon and the motor neuron for the gill. As a result, the communication between the sensory and motor cells becomes more efficient.

If not reinforced, this activity is transient, and the increase in strength of the connections lasts only minutes. However, when the tail shocks are repeated at least four or five times, long-term memory forms as a result of prodding the synthesis of new proteins within the nerve cells. Under these conditions, Kandel finds, the sensory neuron actually undergoes an anatomical change. The neurotransmitter acts as a growth factor; there is a doubling of the number of synaptic connections the sensory cells make onto the motor cells. Now the cell is altered for a period of weeks so that it can send messages more effectively than before, thereby enhancing information processing within the brain.
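Kandel's escalation, a transient boost from a single shock but a lasting anatomical change after four or five, can be sketched as a toy model. The numbers and the five-shock cutoff here are illustrative stand-ins, not measured values:

```python
def sensitize(num_shocks, base_strength=1.0):
    """Toy model of Aplysia sensitization, loosely after Kandel's account.

    Each tail shock releases extra neurotransmitter, transiently
    strengthening the siphon-to-gill connection. Only after repeated
    shocks (here, five or more) does new protein synthesis kick in,
    roughly doubling the number of synaptic connections and making
    the change last for weeks rather than minutes.
    """
    transient = base_strength * (1 + 0.2 * num_shocks)  # short-lived boost
    if num_shocks >= 5:
        # Long-term change: synaptic connections roughly double.
        return {"strength": base_strength * 2, "duration": "weeks"}
    return {"strength": transient, "duration": "minutes"}

print(sensitize(1))   # a single shock: strengthening that fades in minutes
print(sensitize(5))   # repeated shocks: doubled connections, lasting weeks
```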

That this phenomenon applies to you and me is becoming increasingly clear. What Kandel has observed of Aplysia, others have espied in mammals, including rats. University of Illinois neuroscientist William Greenough, Ph.D., for one, has found that neocortex neurons of rats reared in complex environments, and trained in a maze every day, had more extensive dendrites than did comparison animals. Their dendrites also sprouted more synapses. So experience seems to change the brains of rats much as it does those of sea snails.

Other researchers have recently focused increasing attention on another phenomenon, called long-term potentiation (LTP), that also seems to be a component of associative memory formation. To elicit the LTP response, researchers stick a probe into one section of hippocampal tissue and stimulate it briefly but intensely with electricity.

Later they stimulate the tissue with less shock—but communication across the synapse is found to be stronger. Moreover, the effect persists for days, sometimes weeks. This, researchers believe, "looks an awful lot like learning," a case of neuronal plasticity with an increase in synaptic response—in other words, the creation of new channels that increase the efficiency of information processing.

Scientists want to know precisely what changes take place in LTP, whether the molecular changes associated with LTP occur primarily in the receiver cell, the transmitting nerve cell, or in both. A variety of mechanisms affect synaptic strength. Perhaps various combinations of these determine how different forms of learning occur—for example, how facts are acquired versus how skills are retained.

Still, enough of the evidence is in to walk away with growing certainty. Simply making the attempt to keep up appears to stretch and strengthen our minds physically. And it may give us an edge against degenerative disease. We may survive the information age after all. And exit it in better shape than it found us.

Noshing For Neurons

Your brain hungers for nourishment. It needs protein and vitamins, among other nutrients, to make the membranes and chemicals that facilitate learning and memory. Tip the mental scales in your favor. Eat a balanced diet.

In a study of 26 teenagers, sociologist Stephen Schoenthaler of California State University in Turlock examined the relationship between nutrition and brain function. For three months, 15 of the kids received vitamin-mineral supplements. The others were given placebos.

After three months, the 15 on supplements scored significant gains in non-verbal IQ. Four kids from the supplement group and one on placebo increased an extraordinary 20 points each. When blood samples were drawn from these high five at the end of the trial, all had normal nutrient concentrations. By contrast, none had met laboratory norms at the beginning. The researchers also tested the brain activity of the four supplemented kids and found a significant reduction in brain wave abnormalities.

According to Harvard neurosurgeon Vernon Mark, M.D., certain nutrients have long been known to be essential to the chemical processes of a brain at work: protein, carbohydrates, lecithin, and vitamin B1 (also known as thiamine).

Mark identifies other 'key nutrients' for brain function: the electrolytes calcium, magnesium, sodium, potassium, and chloride; the vitamins B3 (niacin), B6 (pyridoxine), B9 (folate), B12, and C; and the minerals iron, copper, and zinc.

Vital brain nutrients also include choline, a near-vitamin that helps form neuron membranes, the site of a lot of communication action. The essential amino acids tyrosine and tryptophan, which are building blocks not only of proteins but of neurotransmitters as well, also have an effect on behavior and brain function, reports Mark in his book, Reversing Memory Loss (Houghton Mifflin; 1992).

Here are the goodies your brain craves:

  • PROTEIN: salmon, tuna, chicken, turkey
  • CARBOHYDRATES: potatoes, low-fat bread
  • LECITHIN: tofu, egg yolks
  • VITAMIN B1: organ meats, brewer's yeast, kidney beans, salmon
  • CALCIUM: dates, almonds, molasses, cheese
  • MAGNESIUM: shrimp, molasses, herring, codfish, almonds
  • SODIUM: along with chloride, the body gets enough in salt
  • POTASSIUM: spinach, raisins, peaches, parsnip, banana, dates, dried figs, most meats
  • VITAMIN B3: mushroom, collards, avocado, salmon, tuna, halibut, turkey, chicken, veal
  • VITAMIN B6: whole-wheat, rice, tuna, avocado, bananas
  • VITAMIN B9: cantaloupe, oranges, peas, rice, wheat germ
  • VITAMIN B12: Swiss cheese, most meats, fish such as herring, mackerel, snapper
  • VITAMIN C: citrus fruits
  • IRON: Pumpkin seeds, blackstrap molasses, walnuts, wheat germ, caviar, egg yolk
  • COPPER: mushrooms, oats, oysters, peanuts, salmon, honey, barley, blackstrap molasses
  • ZINC: dates, dried figs, egg yolk, fish, maple syrup, milk, oysters, wheat germ, sesame seeds
  • TYROSINE: peanuts, pickled herring, pumpkin seeds, lima beans
  • TRYPTOPHAN: peanuts, bananas, skim milk.

Smell Your Way To Success

Want to improve your memory? Then learn under conditions that stimulate all your senses. Writing, touching, talking, listening, even smelling—the more of them you stimulate in the process of memorizing, the better your ability to recall information. The idea has been around for a while, but a team of behavioral neuroscientists recently identified how such cues improve memory.

Michael S. Fanselow, Ph.D., and Jeansok J. Kim, Ph.D., of UCLA found that the hippocampus, an area of the brain known to integrate sensory information, plays an essential role in the short-term memory of contextual information—say, the clues you draw upon to locate those missing eyeglasses or your car in a crowded parking lot.

First they trained 22 rats to associate being in a flat, square box with a particular tone, delivered with a small electrical shock to their feet. The box also smelled of ammonia. Then Fanselow and Kim surgically damaged the hippocampus of eight of the rats at four different times after training: one day, one week, two weeks, four weeks.

There was no apparent memory of the box or the contextual clues in the rats altered one day after the training session. Hippocampal damage seemed to have destroyed any recollection of the training session. But rats damaged four weeks after the pairing of stimuli responded just as well as others that had been trained but not damaged. When put back in the box and exposed to the tone, they crouched in the corner afraid.

Fanselow concludes that the hippocampus is involved in learning that requires integrating various stimuli, including the look, feel, and smell of the box, or various bits of an environment. Further, he believes that the hippocampus is necessary for short-term recall of such memories, but that with time, the memories are down-loaded to another part of the brain, perhaps the neighboring neocortex, for long-term storage.

Last fall, neuroscientists led by Larry Squire, Ph.D., of the Veterans Affairs Medical Center in San Diego, produced the first pictures of human memory at work. Using positron emission tomography (PET), they confirmed that memory has no single location. Rather, bits of a memory are scattered all over the brain—possibly in the vast neocortex neighboring the areas in the hippocampus that first process the sensory input.

In remembering, the hippocampus acts like a telephone switchboard, activating the scattered cortical links. After a while, the cortical areas learn to dial direct—they establish independent neural connections among themselves, and the switchboard can be bypassed. It is likely that the more stimuli the brain is given to process, the better the connections will be.

In other words, taking account of your surroundings and various other sensory clues will aid your recall. That may explain why some professors have long advocated studying in the same room where you will be tested. According to a new study, wearing the same scent during learning and recall may also up your scores. Psychologist David G. Smith, Ph.D., of Bishop's University in Lennoxville, Canada, found that ambient odor can act as a contextual cue for retrieval of verbal stimuli. He had 47 subjects learn a list of 24 words while either jasmine incense or Lauren perfume wafted through the air. Later, the subjects relearned the list with either the same or an alternative odor present. Memory for the word list was best when the odor of the relearning session was the same one present at the time of initial learning. Ammonia, chocolate, and peppermint scents have produced similarly good results.

Don't stop at smell. To improve performance, marshal all the senses, cognition experts urge. "Create colorful, moving, three-dimensional mental images, complete with sound, rhythm, touch, and even scent to associate with the thing you want to remember," says Ronald Gross, Ph.D., in Peak Learning (Tarcher, 1991). The more senses you use in the process of memorizing something, the better your recall will be.

2X3=6, 2X3=6, 2X3=6, 2X3=6, 2X3=6, 2X3=6, 2X3=6, 2X3=6, 2X3=6

Whether you want to master a physical skill such as throwing curveballs or a body of information like the names of all the vice presidents of the United States, there's no getting around it. You must first conquer the basics. Recent findings demonstrate that—because it reduces the energy demands on the brain—practicing the simple stuff makes the acquisition of later information more efficient.

UCLA neurologist John Mazziotta, M.D., photographed several living brains at work. First, using PET scans, he watched people perform a routine task—signing their name. He observed very little activity in the motor cortex, where signals are integrated, and a relatively small response in the basal ganglia, which sits just beneath the cortex and receives commands directly.

Mazziotta's subjects literally processed the task without much thought. No surprise there; after all, they'd been signing their names for decades.

Then he asked his subjects to create the same autograph with their nondominant hand. This time cortical structures—but not the subcortical basal ganglia—flashed on. The brain was hard pressed to process the request. It called on multiple circuits to make sense of the novel task. The brain eventually quieted after repeated scribbles with the nondominant hand. In fact, it transferred activity from the cortex back to the basal ganglia, where less space and energy were consumed.

This pattern, theorizes Mazziotta, represents learning. As we master a task, the total brain area required starts to shrink. More space is freed up to devote to other things. Moreover, with experience, processing shifts from brain regions that spend lots of energy integrating and supervising activity—the conscious cortex—to areas that are more automatic. In doing this, the more conscious structures are made available for new challenges.

That helps explain the much-touted advantage of Chinese children over American kids in math. University of Missouri psychologist David Geary challenged two groups of first graders with a slew of standard problems. He watched them and measured their response times.

Overall, the Chinese kids did better. They solved three times as many problems as did the Americans. Instead of counting on their fingers, they retrieved answers from memory. When they got stuck, they broke the problems into logical pieces—an achievement that requires conceptual understanding. This was clearly beyond the reach of the Americans.

This early advantage, Geary insists, arises not from genetic difference or different teaching strategies. Methods here and in China appear pretty much the same. Rather, what gives the Chinese kids an edge is repeated practice of basic skills. They practice and practice their math. That makes automatic certain processes. "They don't have to think about the little stuff," says Geary. "This frees them to manage and integrate more sophisticated material."

The advantage endures. By the third grade, the American kids are way behind, with no hope of catching up.

In another study, older folks—who presumably mastered their basic mathematical skills way back in elementary school—were compared with latter-day college students. The elderly outwitted the young, Geary reports. That's not to say that math training will make you smarter overall. But then again...

Although practice can't improve global memory, reports psychologist Douglas Herrmann, Ph.D., author of Super Memory, it sure can boost retrieval ability in targeted areas. Practice at specific memory tasks can produce damn near spectacular results. For example, people who attempt to learn a long string of digits that is read to them normally remember only about seven of the digits correctly. But, after practicing for several months, many people can recall large series. One student learned to remember 80 digits in correct order.

The same holds true for many other types of information. Most people can normally recall only about a third of what they know. However, after a month of daily practicing to recall items from the specific category—say, state capitals or types of fruit—you could likely recall nearly all the items you have ever known, and recall them faster than ever before. The task becomes automatic. "So, if you want information at your fingertips," advises Herrmann, "practice remembering it."

At Least Take A Deep Breath

If you want to maximize your ability to take in information then find a way to minimize chronic stress.

Some stress is essential to good performance—it equips you with the oomph you need to face everyday challenges. Under the influence of hormones produced by the body when put under stress, blood is pumped faster and oxygen delivery to the brain is stepped up. That sets you up for quicker responses, sharper answers. But chronic stress, unremitting pressure for two or more years, can block learning—even kill brain cells.

Neurobiologist Richard Thompson, Ph.D., of the University of Southern California, recently demonstrated that stress can create learning deficits by disrupting long-term potentiation (LTP), a mechanism believed to be a component of associative memory formation. LTP is a sign of efficiency of synaptic communication, the carving of a quick path along which information can travel through the brain.

Thompson measured the capacity for LTP in brain cells taken from three groups of rats. One group was put under chronic stress for seven days by exposure to inescapable shocks. Another group experienced the same amount of shock but was trained to escape. The third group was left alone.

Later, in each set of animals, the researcher stimulated cells in the brain's hippocampus that are primary to memory, and found that LTP was blocked in the shock group that was helpless to control the stress. The rats that learned to short-circuit the negative stimulus retained LTP, but it was reduced. Not only does stress appear to affect learning, Thompson concludes, but the degree of control we have over it may determine how much learning disruption occurs.

Animal studies also suggest that chemicals produced naturally in response to stress—glucocorticoids, released by the adrenal glands—can actually damage our brains. In fact, when healthy rats were subjected to high levels of the stress chemicals for three months, parts of their hippocampus were so wrecked they looked like the brains of senile old rats.

Neurobiologist Robert Sapolsky, Ph.D., of Stanford University, pinpointed how the brain system breaks down. It turns out that stress hormones do not kill neurons directly. Instead, they disrupt their function, leaving them "queasy," as Sapolsky puts it.

When stressed, the body diverts energy to power muscles for fight or flight. Glucocorticoids suppress the uptake of glucose, the main energy source. Neurons lose about 20 to 30 percent of their energy. This is good for body muscles, but makes it tough for neurons to regulate incoming signals and chemicals. Neurotoxicity results. The brain cell essentially dies of excitement.

Indeed, says research psychologist Douglas J. Herrmann, Ph.D., people who complain of stressful lives often report slowed learning and memory failure. He reports that nurses who work in hospital intensive care wards have more memory failure than those who work in routine units.

In contrast, many studies have shown that people who exercise, meditate, or otherwise manage pressure, are whizzes at short-term memory, are more creative and have faster reaction times. For temporary relief, take slow, deep breaths. This delivers oxygen to the brain, clears carbon dioxide from the body, and short-circuits the stress response.