
Back-to-School: Why Reading and Writing Isn’t Enough

Let's finally give students the literacy skills they need.

Key points

  • Students without digital literacy skills are vulnerable to online misinformation, manipulative algorithms, and filter bubbles.
  • Teens rely on celebrities and influencers on social media and YouTube for news, surveys suggest.
  • Some videos on TikTok that prominently feature COVID-19 misinformation have been viewed millions of times.
  • A new report makes a case for building resilience amongst students against the growing problem of misinformation.

Screen time soared during the pandemic, especially for students suddenly online for school and to stay connected with family and friends. As many head back into the classroom for the first time in over a year, there is much talk, as always, of teaching “literacy” (how to read and write). But these lessons are antiquated unless they include teaching students how to “read” online information and how to “write” (aka “post”) for online spaces, where anything they share can be seen by anyone and lives online forever.

Sadly, most students are not being taught these kinds of “literacy” skills. After a year and a half spent clicking, scrolling, and posting more than ever before, updating “literacy” education seems downright urgent.

Online misinformation, disinformation, and disturbing content are often just a click away, and youth without the proper skills can easily fall down a rabbit hole of algorithms, filter bubbles, conspiracy theories, and plain old junk. The information they encounter might influence their beliefs, end up in the papers they write, or, worse, spread to unsuspecting peers who believe it.

A new report, "Teaching Cyber Citizenship: Bridging Education and National Security to Build Resilience to New Online Threats," written by Peter W. Singer, Nathan Fisk, Jimmeka Anderson, and Lisa Guernsey for the think tank New America, aptly illustrates the urgent need for a literacy update with this example:

"Imagine a world in which a young student is looking at a computer screen and sees false information on YouTube, Facebook, Instagram, TikTok, or some not yet invented social media platform. Perhaps it is a conspiracy theory pushed by a foreign government or an extremist group, seeking to recruit her or cause harm to our democracy. Or it might be a veiled advertisement, seeking to induce her to buy some shoddy product or steal her personal information. Or maybe it is just a rumor among school peers that has run wild. Whatever it is, that information was designed to trigger emotions and lead to sharing, as well as real-world action to her detriment."

The authors envision a positive outcome for this all-too-common scenario. To paraphrase—they see a world where this student does not “click and share” because the student knows “information can be created to mislead” and has learned “how the online world works, including how algorithms shape content and target us,” and “how media is created and how to verify sources.”

We Can Fix This

The outcome above is entirely achievable through a comprehensive cyber civics education. This type of education would give equal time to everything students need to know about using technology safely and wisely: from cyberbullying to copyright, online safety to misinformation, and addictive technology design to reputation management, among many more topics than can fit in this post.

Unfortunately, the roadblocks to achieving this outcome are many. Schools and teachers lack time, resources, funds, or even a map of what skills to teach in which order. For many educators, the online ecosystem is still brand new terrain. Even the online spaces where youth access information are foreign and confusing for many adults.

Where Teens Get Information

Teens don’t rely on traditional media organizations to get information about current events. More than half (54 percent) get news at least a few times a week from social media platforms like Instagram, Facebook, and Twitter, and 50 percent get news from YouTube, according to a poll conducted by Common Sense Media and SurveyMonkey. Additionally, six in ten teens who get their news from social media and YouTube report they are more likely to rely on celebrities and influencers than news organizations.

In the time since this pre-pandemic poll was conducted, the social media landscape has shifted. TikTok in particular surged during COVID-19, growing a whopping 180 percent among 15- to 25-year-old users.

Quarantined teens spent hours scrolling through user-made videos on TikTok, and in the past year, many of these videos strayed from the site’s former staple of wannabe pop stars lip-syncing to popular tunes. Take a quick scroll through TikTok today and you’ll find many users sharing COVID-19 and vaccine “information.” According to a story published by WPR, “a recent analysis by the Institute for Strategic Dialogue found that some videos with COVID-19 misinformation had been viewed millions of times.”

This is disturbing on many levels. Developmentally, teens are primed to be influenced by their peers, and these user-made videos can capitalize on this vulnerability. Additionally, research shows that false news online spreads faster than the truth and that people are biased towards believing a claim if they have seen it before.

Currently, on TikTok, you’ll find a slew of user-made videos about ivermectin, a deworming medication for animals such as horses, cows, and dogs; the FDA warns that these animal formulations can be highly toxic to humans. According to a recent article in Rolling Stone, TikTok’s guidelines plainly prohibit “medical misinformation that can cause harm to an individual’s physical health.” Yet it took me all of 30 seconds to find several young people on TikTok endorsing the drug.

Falling Down the Algorithm Rabbit Hole

On TikTok, as on YouTube, algorithms track what users watch to determine what to serve next. In other words, if someone watches a video containing vaccine disinformation all the way through, the algorithm is likely to feed them more of the same (which means I can probably look forward to a lot more ivermectin videos). A young person unaware of algorithms and how they work may become the unsuspecting target of ever more misinformation.
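To make that feedback loop concrete, here is a minimal, purely hypothetical sketch in Python. It does not reflect TikTok’s or YouTube’s actual systems; the ToyRecommender class, its topics, and its weighting rule are all invented for illustration. The point is simply that watching a video to the end can nudge an engagement-driven recommender toward serving more of the same topic.

```python
from collections import defaultdict
import random

# Hypothetical illustration only: this does not represent any real
# platform's recommendation system. It sketches the general idea that
# watch-through behavior can bias what gets recommended next.

class ToyRecommender:
    def __init__(self, catalog):
        # catalog: {topic: [video titles]}
        self.catalog = catalog
        self.topic_weights = defaultdict(lambda: 1.0)  # start neutral

    def record_watch(self, topic, fraction_watched):
        # Watching a video to the end (fraction near 1.0) raises the
        # weight of its topic; skipping early (fraction near 0) lowers it.
        self.topic_weights[topic] += fraction_watched - 0.5

    def next_video(self):
        # Pick a topic in proportion to its learned weight,
        # then serve a random video from that topic.
        topics = list(self.catalog)
        weights = [max(self.topic_weights[t], 0.01) for t in topics]
        topic = random.choices(topics, weights=weights)[0]
        return topic, random.choice(self.catalog[topic])

recommender = ToyRecommender({
    "dance": ["clip A", "clip B"],
    "vaccine misinfo": ["clip C", "clip D"],
})

# A user who watches one misinformation video all the way through...
recommender.record_watch("vaccine misinfo", 1.0)
# ...is now more likely to be served more of the same.
print(recommender.next_video())
```

Even in this toy version, a single fully watched video shifts all future recommendations toward that topic, which is why understanding the mechanism matters for young users.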

The Surgeon General recently released a report about misinformation and disinformation, citing concerns that the pandemic is being worsened by an accompanying "infodemic." The report calls on educators to teach students about common tactics that help spread misinformation.

What Can Be Done?

The most obvious solution? Education. But a comprehensive cyber civics curriculum is still a long way off for most schools. In the short term, parents and teachers can equip students with some basics:

  1. Check the Source. Remind your children that anyone can post anything online. If they see something questionable on their social media feeds, encourage them to check out the source. Ask: Is it a random YouTuber or a subject-matter expert? Suggest they check other sources to see if the information is verified by other experts.
  2. Don’t Share. Encourage your children to think twice about sharing any information they have not determined to be true.
  3. Report. Teach your children that social media sites offer a way to report “False News.” This is important to do and can help stop the spread of misinformation.
  4. Be Aware of Algorithms. Tell your children that social media sites track their activities in order to deliver more of what they think they want or like. If your kids are like most, they won’t like being told what to think or like. Not by you or by YouTube.