
Fake News, Echo Chambers & Filter Bubbles: A Survival Guide

Internet misinformation biases our beliefs ― Is it time to unplug?

War News from Mexico, Richard Caton Woodville (1848). Source: Public domain

“Who's the more foolish? The fool or the fool who follows him?”

― Obi-Wan Kenobi, Star Wars

Rise of the Machines: A 2016 Timeline

This summer, the satirical sci-fi TV series BrainDead portrayed our modern state of highly polarized political discourse as the result of a conspiratorial plot on the part of space alien insects intending a hostile takeover of our planet. The alien bugs, so the story went, were infecting the brains of US Senators, causing them to abandon any spirit of compromise in favor of violent conflict so that we would kill each other off, paving the way for a new world order.

While that premise was intentionally far-fetched for comedic effect, current events now suggest something else familiar from science fiction, but increasingly real. Only, it’s not space aliens that are wresting control of our brains, it’s the internet.

We’ve heard about the dangers of using cellphones while driving, making it 23 times more likely to crash your car and resulting in 1.6 million accidents a year. Lately, we’ve been told that the continuous use of mobile devices has resulted in an epidemic of “text neck,” a hunch-backed, arthritic syndrome that might best be described as a posture of slavery to our artificial intelligence overlords.

Now, headlines suggest that a combination of fake news, fake posts, and fake tweets, consumed within our online echo chambers and filter bubbles may be intentionally feeding our brains a narrowly biased view of the world. Increasingly, the warped version of reality that we see online isn’t reality at all.

Back in January, in a blogpost called “Does the Internet Promote Delusional Thinking?”, I wrote about how the way our online world is constructed provides a technological enhancement of confirmation bias, our built-in neural bias towards preferring information that confirms our pre-existing beliefs. I cited the work of Michela Del Vicario and colleagues, whose paper, “The Spreading of Misinformation Online,”1 demonstrated this “echo chamber” effect on Facebook. And I quoted Eli Pariser, author of The Filter Bubble: How the New Personalized Web is Changing What We Read and How We Think, who described how those bubbles likewise have the potential to confine us within a “Web of one.” Now, less than a year later, “echo chambers,” “filter bubbles,” and “confirmation bias” have all become household words.

Two months ago, Stephanie McCrummen painted a compelling portrait of online confirmation bias in her Washington Post article, “Finally, Someone Who Thinks Like Me.” In it, McCrummen told the story of Melanie Austin, a Pennsylvania woman who repeatedly sought and found online evidence to support her own suspicions and other things she’d heard, such as that President Obama founded ISIS, or that First Lady Michelle Obama was born a man, or that Donald Trump would win the presidential election in a landslide. Many Washington Post readers probably dismissed the article as a tale of some isolated, paranoid country bumpkin, but many no doubt reconsidered that dismissal after the election results came in.

The Sunday before the election, the New York Times published an article by Jim Rutenberg called “Media’s New Challenge: Overcoming the Threat of Fake News” that warned of the proliferation of bogus news articles intended to influence voting. After Trump emerged with a victory, I wrote about how the consumption of fake news within echo chambers had allowed liberals to be taken by surprise by the election outcome, resulting in a kind of acute "Post-Trump Stress Disorder."

In the days that followed, fake news became a ubiquitous headline, with a flurry of subsequent articles suggesting that fake news had gotten Trump elected and that websites like Facebook were to blame. Vox writer Timothy Lee reported that Facebook engagement with fake news articles had outpaced engagement with mainstream news articles by the eve of the election. John Markoff followed with a New York Times piece noting that those fake news articles had been amplified by an “automated army of pro-Donald Trump chatbots,” and Washington Post writer Caitlin Dewey published an interview with Paul Horner, a prolific faux news author, who claimed, “I Think Donald Trump is in the White House Because of Me.”

By most accounts, the internet has now been credited with playing a significant role in determining the outcome of this year’s presidential race, making Donald Trump the 45th President of the United States. If artificial intelligence overlords are indeed behind it all, Election Night 2016 may have, in retrospect, been “Judgment Day” in a quiet, insidious war intended to destroy the human race.

Internet Misinformation: Why Worry?

On a more serious and less partisan note, let’s be clear: fake news, echo chambers, filter bubbles, and confirmation bias are not exclusive tools, traps, or foibles of the right or the left. They’re vulnerabilities for all of us, on either side of the political fence.

Why should we care? Because consuming misinformation within the narrow confines of our online worlds inhibits our ability to know what’s true, to make choices based on the most accurate information, to make informed decisions about what to believe, and to resist the many invisible forces that might not have our best interests in mind. Political differences aside, do we really want to allow ourselves to be misled by Russian trolls and Macedonian teenagers? Do we really want to allow ourselves to be constantly manipulated by corporate advertising?

If we allow ourselves to become passive vessels of a daily deluge of online information, we’ll never learn about our perceived opponents, whether in the spirit of understanding and “reaching across the aisle” to unite the country, or in efforts to mount a resistance or win an ideological war. If we allow ourselves to become slaves to confirmation bias on e-steroids, we’ll lose the ability to distinguish between what’s real and what’s not, and “truth” will be regarded as something that’s endlessly debatable, if not completely unknowable. In short, we’ll never learn anything. Is that how we want to live our lives?

Assuming we’re not already there, what can be done to avoid this fate?

Efforts to enhance the quality of online news are already underway by some of the major fountains of misinformation. Indeed, although Facebook founder Mark Zuckerberg made headlines last week by denying that his creation played a role in the outcome of the presidential election (read University of North Carolina School of Information and Library Science professor Zeynep Tufekci’s rebuttal here), both Facebook and Google have since claimed that they plan to withhold advertising from fake news sources, effectively preventing them from gaining easy access to viewers’ eyes. Facebook says it is now also developing a way to flag fake news sources, while New York magazine writer Brian Feldman has already created a browser extension that does this when surfing the web on Google Chrome. A similar plug-in called “FiB” was recently developed by three college students. If there’s a continued demand for such products, more will surely be on the way.

So, maybe there’s hope at the root of the problem for an improvement in the quality of our online information. But ultimately, it might be up to individuals to arm ourselves against misinformation by becoming more discriminating consumers of what we see online. In the next section, I'll discuss how we might go about doing just that.

Safeguarding Against Online Misinformation: A Survival Guide

► Learn to Recognize Fake News for What It Is

America has had a longtime affection for fake news. The National Enquirer has been going strong since 1926. The print edition of the Weekly World News ran for 30 years, keeping us informed about aliens at Roswell, the faked death of Elvis, and the discovery of Bigfoot before shutting down in 2007, only to resurrect itself as an online-only publication in 2009. Those periodicals have always provided some levity to the daily grind, if only when we glanced at them in the supermarket checkout line, but few of us have ever mistaken them for actual news.

In the internet age, however, demand for fake news has driven up supply, and with millions of links that indiscriminately alternate between opinions and well-researched facts popping up in a single Google search, the ability to discern reliable information from misinformation has been all but lost. Stephanie McCrummen’s Washington Post article gave us an inside look at an individual who had trouble in that regard, but sometimes it seems as if an entire generation has come to rely on Google searches for easy answers, while never being taught how to weed through the chaff for reliable information. Perhaps the reward of confirmation bias when we find something that tells us, “See? I was right!” is just too strong.

In scientific publishing, the proliferation of online journals that operate on a “predatory open access” model is a known problem whereby authors pay out of pocket for publication, often getting articles printed online with minimal peer review or editorial discretion. In response, University of Colorado librarian Jeffrey Beall has created a website called Scholarly Open Access that maintains a list of such publishers to help both researchers and readers to avoid the bad apples.

It’s been suggested that we might benefit from a similar watchdog for online news sources. In an effort to fill that niche, Melissa Zimdars, an Assistant Professor of Communication at Merrimack College, recently published an extensive list of fake news sources. But conservatives were quick to criticize her inclusion of websites like Breitbart.com and Infowars.com, and she has since taken the list down after being subjected to harassment and threats. Aside from the challenge of weeding through the sheer number of online news sources, the question of “who will watch the watchers” will inevitably plague the blacklisting of online information sources, or any related proposals for developing systems to rate them.

Paul Horner, the professional fake news writer mentioned earlier, said this about consumers of online news:

“Honestly, people are definitely dumber. They just keep passing stuff around. Nobody fact-checks anything anymore — I mean, that’s how Trump got elected. He just said whatever he wanted, and people believed everything, and when the things he said turned out not to be true, people didn’t care because they’d already accepted it. It’s real scary. I’ve never seen anything like it.”

New but as-yet-unpublished research suggests that fact-checking websites can help people revise their misinformed opinions rather than digging in their heels as part of confirmation bias and the “backfire effect.” However, claims of biased fact-checking abound, and the proliferation of fake fact-checking sites is inevitable. So online consumers will still have to learn to separate good fact-checking sites from bad ones and to seek out unbiased sources.

Again, it may end up being up to the individual to become shrewd – and fair-balanced – enough to figure things out. Fortunately, anyone can learn how to spot fake news. Dr. Zimdars has described some tips on how to do so, such as being wary of unfamiliar domain names, especially those that end in “lo” and “com.co,” and researching the source of a website by checking it on Snopes.com or Wikipedia.org. She also provides this sound advice:

“If [a news] story makes you REALLY ANGRY it’s probably a good idea to keep reading about the topic via other sources to make sure the story you read wasn’t purposefully trying to make you angry (with potentially misleading or false information) in order to generate shares and ad revenue.”

Guidelines for recognizing fake news can be helpful regardless of one's political affiliation. Similar tips for detecting fake news can be found from Dr. Zimdars here, from Factcheck.org here, and from the conservative site RedState.com here.

► Limit Your Use of Social Media for News and Consider Paying for Something Better

David Mikkelson, a writer at Snopes.com, notes that while some news is fake, “bad news” – including news delivered with a clear political slant – can be just as much of a problem. The best news, as former ABC anchorman Ted Koppel recently reminded us, is based on the “old fashioned concept” of objective reporting, but that may be something of a lost art.

In order to incentivize the return of objectivity into the landscape of news, it may be necessary to pay for it. We might all therefore consider subscribing to a reputable news source. The New York Times reported an increase of over 40,000 subscribers this past week, but if you’re a liberal, consider another subscription to the National Review to balance things out (see the following section about escaping our echo chambers). Or consider a less partisan-leaning publication like The Economist. Or consider The Week, an easily digestible summary of all types of news and opinion that cites sources from around the world, left, right, and center – it can be a helpful way to keep abreast of a broad range of information that allows you to find the source material when you’re particularly interested in a specific topic.

Tim Wu, author of The Attention Merchants: The Epic Scramble to Get Inside Our Heads, notes that beyond politics, the biased information that we’re fed online is as much a byproduct of the e-commerce business model that relies on advertising to make money. He’s therefore a proponent of avoiding online manipulation by paying for something that’s more tailored to serving you than the other way around:

“If you really want change... you probably have to pay for stuff, pay for content. Some people are like, ‘Oh my God I have to pay?’ But people do pay. They pay for Netflix, they pay for HBO, they pay for other types — they subscribe to newspapers sometimes.

“Generally speaking, when you pay for stuff it has more of your interests at heart. ...In other words, a lot of the websites are always serving two masters, they're both trying to get you entertained enough to stay there, or to click on things, but to also then make it a good platform for advertising. So I have sort of a plea to people who want to change these sort of things is, like, maybe just suck it up and start paying for more stuff.

“...Every time you click on a ‘like’ button on another site, you've told Facebook that you're doing that, and so therefore advertisers know who their fan base is. When you decide to ‘like’ something you may feel you're innocently putting out your preferences, but actually you're delivering something of enormous value, which is indicating that you essentially like to be advertised to by this company.

“It's so funny that the Internet has become a series of traps where you do innocent things like give your name or address or indicate a preference — ‘I like this thing’ — and therefore you open yourself up to a deluge of advertising based on those stated preferences. That's what you're doing, you're signaling who you are as a consumer.

“...Google, Facebook, Twitter — the whole set of companies essentially knows all your weaknesses and essentially how to manipulate you in subtle ways in order to have you do things you might not otherwise do.”

The bottom line is that free online news isn’t free. Rather than paying the cost of misinformation, consider investing in something better.

► Make a Conscious Effort to Learn Outside of Your Echo Chamber (and Be Nice In the Process)

In an effort to curb hate speech, Twitter recently suspended several high-profile “alt-right” accounts, including that of Richard Spencer, a prominent leader in the white nationalist movement. But that move has been criticized for censoring free speech and already seems to have resulted in a backlash proliferation of new fake accounts. With unfettered online talk all around us, it may be more useful to learn how to filter through it on our own, rather than relying on individual websites to do it for us.

Likewise, while a 2015 article published instructions on how to systematically “find and delete friends who support Donald Trump” on Facebook, that’s bad advice for the informed consumer of online information. Learning to recognize and avoid fake news is a worthwhile endeavor, but with opinion increasingly masquerading as news these days, we also need to keep ourselves aware of opinions that diverge from our own. In my last blogpost, “Understanding Post-Trump Stress Disorder: Why Liberals Didn’t See It Coming,” I offered this advice for escaping our echo chambers and filter bubbles:

Don’t inform yourself about what’s going on in the world by relying exclusively on your Facebook and Twitter feeds. Stay friends with that person whose divergent views and comments sometimes drive you crazy. If you’re a liberal, keep tabs on what's being said over at Fox News and read the Wall Street Journal and the National Review. Hell, for the next 4 years, you might even want to take a peek at Infowars once in a while.

With Steve Bannon’s role in Trump’s presidency going forward, liberals would probably do well to keep tabs on Breitbart.com as well. Of course, the same advice in reverse could be followed by a conservative reader.

In a similar vein, our ability to learn online would be enhanced by striving to maintain civility when participating in online discourse. As I discussed in a previous blogpost about internet trolling, the anonymity and lack of face-to-face interaction in online communication can bring out the worst in us. Most of us who engage in online discussion, by commenting on Facebook or tweeting on Twitter, have felt the jaw-clenching pressure to respond when we’re met with opposition, in an effort to prove ourselves right. In trying to win such an argument, it’s frighteningly easy to behave online in the same way that we behave when we’re alone in our cars and someone cuts us off in traffic. But that’s no way to learn. Being open-minded and keeping our cool when discussing topics with our ideological opposites will make us better informed, and therefore smarter, in the long run.

► Time to Unplug?

“Make America Great Again” was a rallying cry for the Trump campaign that seems to have captured a kind of nostalgia for a simpler time that has passed. There’s little doubt that the internet has been a transformative technology that has given us unprecedented access to an impossibly large amount of information at the touch of a button. The world wasn’t a better place when we had to go to the library to find one or two books, or the Encyclopedia Britannica, to complete a grade school assignment, but I do sometimes long for the days when we could tune into one of four trusted television networks or read a small handful of major print newspapers to get our news. And no doubt, many of us yearn for a time when children went outside to play, rather than remodeling their spines into geriatric curves while texting at the dinner table.

While there are upsides and downsides to all emerging technologies, with all the misinformation that our daily interaction with handheld devices feeds to our brains, I can empathize with the desire to go full Luddite and unplug completely. And with rumors that net neutrality – the requirement that internet service providers manage and deliver all online data in the same way, without restriction or censorship, to all consumers – is vulnerable under our new presidential administration, unplugging might seem an increasingly nostalgic way to Make America Great Again. Maybe Neo-Luddites will become the next Tea Party.

In the wake of the election and in order to gather information for this blogpost, I’ve probably been on Facebook and Twitter more than in any other week in my life. I’ve therefore gained a small, but exhausting, window into the world of writer and blogger Andrew Sullivan, who recently described how his full-time commitment to blogging and other online activity nearly killed him. Indeed, he says, the distraction of our online existences threatens our very souls. He managed to rescue his by unplugging from that distraction and putting a self-imposed end to his career as a blogger.

I’m glad to know that option exists. All of us can unplug, if only for a little while. Maybe we should try it more often.

Right now, it’s a sunny day in Los Angeles. I’m going outside to walk my dog. And I’m leaving my cellphone at home.

Dr. Joe Pierre and Psych Unseen can be followed on Facebook and Twitter.

To check out some of my fiction, click here to read the short story "Thermidor," published in Westwind last year.

References

1. Del Vicario M, Bessi A, Zollo F, et al. The spreading of misinformation online. Proceedings of the National Academy of Sciences 2016; 113:554-559.
