‘Digital Literacy’ Will Never Replace The Traditional Kind

We're overestimating how much computers will teach our kids

Have you heard about the octopus who lives in a tree? In 2005, researchers at the University of Connecticut asked a group of seventh graders to read a website full of information about the Pacific Northwest Tree Octopus, or Octopus paxarbolis. The Web page described the creature’s mating rituals, preferred diet, and leafy habitat in precise detail. Applying an analytical model they’d learned, the students evaluated the trustworthiness of the site and the information it offered. Their judgment? The tree octopus was legit. All but one of the pupils rated the website as “very credible.” The headline of the university’s press release read, “Researchers Find Kids Need Better Online Academic Skills,” and it quoted Don Leu, professor of education at UConn and co-director of its New Literacies Research Lab, lamenting that classroom instruction in online reading is “woefully lacking.”

There’s something wrong with this picture, and it’s not just that the arboreal octopus is, of course, a fiction, presented by Leu and his colleagues to probe their subjects’ Internet savvy. The other fable here is the notion that what these kids need — what all our kids need — is to learn online skills in school. It would seem clear that what Leu’s seventh graders really require is knowledge: some basic familiarity with the biology of sea-dwelling creatures that would have tipped them off that the website was a whopper (say, when it explained that the tree octopus’s natural predator is the sasquatch). But that’s not how an increasingly powerful faction within education sees the matter. They are the champions of “new literacies” — or “21st century skills” or “digital literacy” or a number of other faddish-sounding concepts. In their view, skills trump knowledge, developing “literacies” is more important than learning mere content, and all facts are now Googleable and therefore unworthy of committing to memory.

There is a flaw in this popular account. Robert Pondiscio, an education consultant and staffer at the nonprofit Core Knowledge Foundation (and a former fifth-grade teacher), calls it the “tree octopus problem”: even the most sophisticated digital literacy skills won’t help students and workers navigate the world if they don’t have a broad base of knowledge about how the world actually operates. “When we fill our classrooms with technology and emphasize these new ‘literacies,’ we feel like we’re reinventing schools to be more relevant,” says Pondiscio. “But if you focus on the delivery mechanism and not the content, you’re doing kids a disservice.”

Indeed, evidence from cognitive science challenges the notion that skills can exist independent of factual knowledge. Dan Willingham, a professor of psychology at the University of Virginia, is a leading expert on how students learn. “Data from the last thirty years leads to a conclusion that is not scientifically challengeable: thinking well requires knowing facts, and that’s true not only because you need something to think about,” Willingham has written. “The very processes that teachers care about most — critical thinking processes such as reasoning and problem solving — are intimately intertwined with factual knowledge that is stored in long-term memory (not just found in the environment).” Just because you can Google the date of Black Tuesday doesn’t mean you understand why the Great Depression happened or how it compares to our recent economic slump. And sorting the wheat from the abundant online chaff requires more than simply evaluating the credibility of the source (the tree octopus material was supplied by the “Kelvinic University branch of the Wild Haggis Conservation Society,” which sounded impressive to the seventh graders in Don Leu’s experiment). It demands the knowledge of facts that can be used to independently verify or discredit the information on the screen.

There is no doubt that the students of today, and the workers of tomorrow, will need to innovate, collaborate, and evaluate, to name three of the “21st century skills” so dear to digital literacy enthusiasts. But such skills can’t be separated from the knowledge that gives rise to them. To innovate, you have to know what came before. To collaborate, you have to contribute knowledge to the joint venture. And to evaluate, you have to compare new information against knowledge you’ve already mastered. Nor is there any reason that these skills must be learned or practiced in the context of technology. Critical thinking is crucial, but English students engage in it whenever they parse a line of poetry or analyze the motives of an unreliable narrator. Collaboration is key, but it can be effectively fostered in the glee club or on the athletic field. Whatever is specific to the technological tools we use right now — and these tools are bound to change in any case — is designed to be easy to learn and simple to use.

This last point was colorfully expressed by Alan Eagle, a Silicon Valley executive who sends his children to a Waldorf School that does not allow computers in its classrooms. Using the Internet is “supereasy,” Eagle was quoted as saying in a much-discussed New York Times article. “At Google and all these places, we make technology as brain-dead easy to use as possible. There’s no reason why kids can’t figure it out when they get older.” What they won’t figure out is deep reading, advanced math, scientific reasoning — unless we teach them. While economically disadvantaged students may benefit from access to computers that they don’t have at home, more affluent kids are surrounded by technology. Does it make sense to limit their screen time at home and then give them more of the same at school?

Read more about the science of learning at www.anniemurphypaul.com, or email the author at annie@anniemurphypaul.com.

This post originally appeared on Time.com.
