
Why the Digital World Is Making Us Forget Things

Analyzing the growing problem of digital-induced amnesia.

Key points

  • Digital-induced amnesia is the inability to process, retain, or recall information due to overconsumption of digital stimuli.
  • In an information-rich, time-poor world, we rely on machines to process information—maybe to our detriment.
  • The loss of memory and retention is the cost we accept for the convenience of information at our fingertips.
  • We must remember what we are losing, even as we make forgetting our standard way of being in the world.

Back in the old days—the 1980s—when I was a graduate student, the start-up whir of my Gateway computer focused my brain as I studied, wrote, and retained scientific information that would serve me in my future career in academia.

That type of concentration was possible throughout the ’80s, ’90s, and early 2000s—until the arrival of what Thomas Friedman described as the age of accelerations in his book Thank You for Being Late (2016). After decades of computers becoming mobile, powerful, and ubiquitous, focus seems to be our most endangered mental commodity, leading to a collective memory loss, or what I call digital-induced amnesia: the inability to process, retain, or recall information due to chronic overconsumption of digital stimuli. As our world becomes ever more information-rich and time-poor, our inability to retain and analyze that information for ourselves has us relying on machines—potentially to our detriment.

The Way Things Were

When I started my academic career, I understood that command of information and deep knowledge of my field were paramount to my success. I dutifully read and filed away mounds of scientific journal articles into manila folders tucked inside pea-soup-green hanging folders suspended on metal rods in the filing cabinet. Eventually, I would spice things up with folders of different colors.

To maintain meticulous command of an ever-growing body of information, every paper was placed in a folder and every folder in a category. Categories were assigned main numbers, and each folder within a category got a decimal appended to the main number to give it a unique identifier. If I read a new article on injection drug use among adolescents, it was assigned the file number 600.007: 600 for the category “adolescents,” and .007 as the next available number after paper 600.006 on adolescents and smoking. The system legend was kept in a three-ring loose-leaf binder. Until recently, I knew all of the category numbers by heart. I was a blast at parties, I’m sure.

At the time, I believed—maybe a little smugly—that this system gave me a competitive edge as a scholar, scientist, and expert, and I was unfailing in my filing discipline. But gradually, and then seemingly all at once, the world changed around me, file folders be damned. My efforts to migrate my file folder system into this exponentially expanding digital world quickly fell apart.

Losing Our “Human Hard Drives”

At the same time, many people seemingly welcomed the ability to turn their memories and the contents of their “human hard drives” over to a machine. It freed up their mental capacities for higher-order creative thinking. Data processing and quantitative analyses could now produce results without a human touch. But I found that threatening to my very identity as a scientist who needed to stay on top of the field.

Now I recognize that they were right and I was wrong: human memory skills are no longer the asset they once were. Most technology analysts agree that the shift came in 2007, when the iPhone entered our pockets and transformed our relationship with information—it became more readily available and far more voluminous.

As James Lang observes in his book Distracted (2020), distractedness is the norm among American college students, and my students in the classroom and the lab at George Mason University sometimes tell me they can’t remember the sentences they just read. Being unable to shift from the sympathetic (reacting) to the parasympathetic (contemplating) branches of the nervous system long enough to retain information can leave one anxious, agitated, and without the secure feeling of being knowledgeable.

I consider these to be symptoms of the widespread problem of digital-induced amnesia.

What It Means to Rely on Machines

T.S. Eliot once asked, “Where is the knowledge we have lost in information?” It is an apt question, one that encapsulates what we are missing in our distracted digital age. Without the learned ability to acquire knowledge by commanding, synthesizing, and retaining information, our critical thinking skills are only as good as the machines we depend on.

It isn’t tragic that I no longer need to meticulously hand-file my articles and that I can search for whatever pops into my head by effortlessly typing it into a search field. But it contributes to what Cal Newport describes in his book A World Without Email (2021) as the “hyperactive hive mind.” Our brains serve as human routers, receiving and transmitting data, day in and day out, as we search, find, forget, and repeat. It is a radically different set of behaviors from reading, contemplating, critiquing, synthesizing, physically filing, and ultimately retaining enduring knowledge.

Perhaps the loss of memory and retention at any age is the price we have accepted for the conveniences and instant rewards of fingertip information. Perhaps accessing it and knowing it for the moment is all that most people actually want. Some futurists contend that any loss of human capacity is only short-term, and that our brains will in time become more machine-like as we adapt to the constant conditioning of the thousands of digital acts we each perform every day.

If that is the case, we need to remember what we are losing even as we make forgetting our standard way of being in the world.
