

The Death of Facts: The Emperor's New Epistemology

Denialism, deception, and the psychology of “truthiness”

The Emperor's New Clothes monument in Odense (photo by Владимир Шеляпин). Source: Creative Commons Attribution-Share Alike 3.0 Unported license.

“‘But he has nothing on!’ said the whole people at length. That touched the Emperor, for it seemed to him that they were right; but he thought within himself, ‘I must go through with the procession.’ And so he held himself a little higher, and the chamberlains held on tighter than ever, and carried the train which did not exist at all.”
The Emperor’s New Clothes, Hans Christian Andersen (1837)

To hear President-elect Trump surrogate and self-described “journalist and patriot” Scottie Nell Hughes tell it this past week, 2016 might very well have marked the death of facts.

As highlighted in an Esquire article by Jack Holmes, Hughes was interviewed on The Diane Rehm Show and was asked about claims that the President-elect had tweeted false information about winning the popular vote. In response, Hughes said:

"Well, I think it's also an idea of an opinion. And that's—on one hand, I hear half the media saying that these are lies. But on the other half, there are many people that go, 'No, it's true.' And so one thing that has been interesting this entire campaign season to watch, is that people that say facts are facts—they're not really facts. Everybody has a way—it's kind of like looking at ratings, or looking at a glass of half-full water. Everybody has a way of interpreting them to be the truth, or not truth. There's no such thing, unfortunately, anymore as facts. And so Mr. Trump's tweet, amongst a certain crowd—a large part of the population—are truth. When he says that millions of people illegally voted, he has some facts—amongst him and his supporters, and people believe they have facts to back that up. Those that do not like Mr. Trump, they say that those are lies and that there are no facts to back it up."1

The idea that there’s no such thing as facts, only subjective interpretations of reality, does seem to have emerged as a popular philosophy this past election year. We’ve grown accustomed to the idea that what politicians say in an election campaign might warrant verification, but now we’re also being told that we can’t trust the fact-checkers any more than we can trust fake news. As I noted in my last blog post, “Fake News, Echo Chambers & Filter Bubbles: A Survival Guide,” the apparent inability to distinguish between what’s accurate and what’s not leaves us vulnerable to concluding that truth is “endlessly debatable, if not completely unknowable.” That’s quite a nihilistic version of epistemology indeed.

While the rejection of truth seems to have peaked this year, it’s hardly a novel philosophy. Just over a decade ago, comedian Stephen Colbert coined the term “truthiness” to describe how gut intuition had become a preferred method of determining truth, supplanting the rational evaluation of evidence. In 2006, the Merriam-Webster dictionary declared “truthiness” the Word of the Year and defined it as “truth coming from the gut, not books; the quality of preferring concepts or facts one wishes to be true, rather than concepts or facts known to be true.”

But the history of truthiness goes back well beyond a decade ago. A recent article in The Atlantic by Megan Garber credits the historian Daniel Boorstin with the theory that “image” in America became preferred over reality in the century leading up to the 1960s. Garber writes that Boorstin conceived of image as a strict “replica of reality, be it a movie or a news report or a poster of Monet’s water lilies, that manages to be more interesting and dramatic and seductive than anything reality could hope to be” and as a “fundamentally democratic… illusion [that] we have repeatedly chosen for ourselves until we have ceased to see it as a choice at all.” Boorstin, Garber says, “worried that we don’t know what reality is anymore… and we don’t seem to care.” And while Boorstin implicated emerging media in creating the illusion of image, he made that claim in 1962, long before reality TV was a thing.

Looking at the bigger picture, Boorstin’s historical account can be thought of as one version of the larger 20th-century movement called “postmodernism,” which represented a kind of revolt against both the Age of Enlightenment, when science and reason claimed ascendancy, and modernism, when industrialization and technological advancement propelled civilization into two world wars. Postmodernism has been defined as:

“...largely a reaction to the assumed certainty of scientific, or objective, efforts to explain reality. In essence, it stems from a recognition that reality is not simply mirrored in human understanding of it, but rather, is constructed as the mind tries to understand its own particular and personal reality. For this reason, postmodernism is highly skeptical of explanations which claim to be valid for all groups, cultures, traditions, or races, and instead focuses on the relative truths of each person. In the postmodern understanding, interpretation is everything; reality only comes into being through our interpretations of what the world means to us individually. Postmodernism relies on concrete experience over abstract principles, knowing always that the outcome of one's own experience will necessarily be fallible and relative, rather than certain and universal. Postmodernism is "post" because it denies the existence of any ultimate principles, and it lacks the optimism of there being a scientific, philosophical, or religious truth which will explain everything for everybody - a characteristic of the so-called "modern" mind.”2

In 1991, the philosopher Daniel Dennett declared:

“Postmodernism, the school of ‘thought’ that proclaimed ‘There are no truths, only interpretations’ has largely played itself out in absurdity, but it has left behind a generation of academics in the humanities disabled by their distrust of the very idea of truth and their disrespect for evidence, settling for ‘conversations’ in which nobody is wrong and nothing can be confirmed, only asserted with whatever style you can muster.”3

Rejection of scientific explanations? Supplanting objective knowledge with subjective experience? The end of truth? Doesn’t that sound an awful lot like 2016?

Still, while it’s tempting to declare the rise of truthiness and the death of facts a kind of post-postmodernism, it could be argued that the conflict between subjective and objective knowledge goes back even farther, reflecting the age-old tension between faith and reason that dates to the very start of Western civilization. And yet, despite the impression of a longstanding dichotomy, it has been argued that an irreconcilable conflict between faith and reason is not historically accurate and that, at their best, the two complement one another. In 1998, Pope John Paul II issued the encyclical Fides et Ratio, which stated that, viewed properly, faith and reason are not only compatible but essential together. Faith without reason, he wrote, leads to superstition, whereas reason without faith leads to nihilism and relativism.

Following that argument to the present day, things seem to have somehow ended up backwards. How is it that truthiness, attributed to intuition and faith, and not reason, is what has led us to nihilism?

To understand this, we need to reconsider the premise that truthiness, or epistemological nihilism, really has anything to do with faith and instead think about faith and reason in psychological terms. Psychologically speaking, both faith and reason are cognitive attempts to seek truth and understand reality. Faith involves choosing to believe in something that provides our lives with meaning and serves as a kind of bookmark for gaps in knowledge. Some things are currently unknowable — Is there a God? What happens after we die? Is there a multiverse? Are we living in a computer simulation? In the domain of the uncertain, choosing to believe in a hypothesis, a mythology, or even a hunch can be an important contributor to mental health.

In contrast, reasoned scientific facts deal with the knowable. Ultimately, scientific truths are probabilistic — they’re degrees of confidence based on repeated observations and controlled experiments designed to establish causality. While verified scientific facts are by nature always open to a fresh look, many such facts are worthy of belief and inappropriate for disputation. The earth is round. Vaccines prevent disease. President-elect Trump won the Electoral College, but lost the popular vote. It’s mentally healthy to believe in reasoned facts because facts are predictive and help us in the day-to-day navigation of the physical world. Taken together, faith doesn’t require that reason or facts be rejected, just as thinking rationally doesn’t require that we abandon faith.

With that peaceful coexistence in mind, truthiness isn’t a faith-based rejection of facts after all; it’s narcissistic denialism. When Nathan Rabin interviewed Stephen Colbert about truthiness in 2006, Colbert said:

“Truthiness is 'What I say is right, and [nothing] anyone else says could possibly be true.' It's not only that I feel it to be true, but that *I* feel it to be true. There's not only an emotional quality, but there's a selfish quality.”4

In other words, the rejection of facts is often more about rejecting an opposing view out of a dogged insistence on being right. And when that insistence runs up against facts or expert opinion, arguing that facts don’t exist or that experts don’t actually know what they’re talking about offers a way out.

Beyond that, how best to understand fact denialism from a psychological perspective depends on the level of consciousness at which it operates. If the rejection of countervailing facts in favor of our own opinions occurs unconsciously or subconsciously, it probably represents the brain’s well-known inherent tendency toward confirmation bias. If denialism occurs more consciously, however, it becomes less an act of belief and might instead be described colloquially as contrarianism or arrogance. In some cases, it might be best characterized as deception.

In language, we have a rich vocabulary at our disposal to describe false statements that fly in the face of facts, with the right word depending on just those particulars. Dictionary.com recently proclaimed “xenophobia” its Word of the Year for 2016, but I already covered that topic around this time last year. This year, I think the Word of the Year should be “narrative.” Narratives are stories we tell ourselves and others that reflect our own experience of real events. As subjective accounts, they are inevitably biased in some way, often amount to fact denialism, and are sometimes outright lies.

In 2016, objective facts were often replaced by subjective narratives. The “Emperor’s New Epistemology” may not actually be all that new, but the current nihilism with regard to truth is as insubstantial as a set of invisible clothes, paraded about in public for all to see.

Dr. Joe Pierre and Psych Unseen can be followed on Facebook and Twitter.

To check out some of my fiction, click here to read the short story "Thermidor," published in Westwind last year.

References

1. The Diane Rehm Show. How Journalists are Rethinking Their Role Under a Trump Presidency, November 30, 2016. http://thedianerehmshow.org/audio/#/shows/2016-11-30/how-journalists-are-rethinking-their-role-under-a-trump-presidency/114095/@14:40

2. PBS.org. Postmodernism. http://www.pbs.org/faithandreason/gengloss/postm-body.html

3. Dennett DC. Dennett on Wieseltier v. Pinker in the New Republic. Edge, September 10, 2013. https://www.edge.org/conversation/dennett-on-wieseltier-v-pinker-in-the-new-republic

4. Rabin N. Stephen Colbert. The A.V. Club, January 5, 2006. http://www.avclub.com/article/stephen-colbert-13970
