
Verified by Psychology Today


How to Change People’s Minds

The art of debunking.

People love to be right. Research indicates that we hate being wrong even more; but the point stands: we generally love being right. Being right, however, is not, and shouldn’t be, the primary focus of critical thinking; rather, we use critical thinking to decide what to believe and what to do. Still, thinking critically for the purpose of being right can be a positive thing, particularly in educational settings; that is, when educating others.

I teach a master’s module on Critical Thinking for Leaders, wherein students are taught, for the purpose of enhancing their leadership skills in the context of Training and Education, how to both become better critical thinkers and empower their peers to do likewise. Gaining the skills to be ‘right’ and leading others down the correct path are desired educational outcomes. In this context, being right is a positive outcome.

In a recent class with this cohort, one of my students asked a question that cut straight to the core of this line of thinking: How do we get others to change their minds? A very interesting question–one which I have thought a great deal about before; and yet, I didn’t readily have an answer!

To clarify, this question wasn’t meant in the sense of how best to persuade people (for better or worse); rather, when we have critically thought about something and developed a reasonable conclusion, based on credible, relevant and logical evidence, how can we change the minds of those who haven’t thought critically–or worse: those who believe they have thought critically, yet are still wrong? Simply, how do we debunk erroneous information?

Just being right isn’t enough. What good is being right when the person you’re educating thinks that they’re right and you’re wrong? What if they don’t want your education? In this context, how do you succeed in educating?

Success in educating with respect to debunking erroneous information depends largely on the nature of the individual you are trying to educate. This is alluded to by the two questions above: how can we change the minds of (1) those who haven’t thought critically and (2) those who believe they have thought critically, yet are still wrong? Though similar, the subtle difference between the two questions is important, because it yields two different answers.

The first: How can we change the minds of those who haven’t thought critically?

Cook and Lewandowsky (2011) have put together a concise handbook on ‘debunking’ built on a foundational observation: once people process information, it’s quite difficult to remove that information’s influence. This concept is not new. The research literature has long discussed the power of belief(s) over cognitive processing, as well as how durable that power is. According to Cook and Lewandowsky (2011; see also Lewandowsky et al., 2012), a common strategy is to try to remove the influence of erroneous information simply by supplying the correct information. But this strategy is itself flawed, because it to some extent ignores implicit processes of bias. Instead, they recommend three routes towards successful debunking:

  • The position you are trying to teach must focus on the core evidence rather than on the misinformation. Though you should refute the incorrect information (i.e., the ‘myth’), your own position must remain the primary focus. If the refutation of the myth takes center-stage, cognitive processing of the myth is reinforced, making it less likely that you will successfully combat it.
  • Any mention of the myth should be prefaced by an explicit warning that the information to follow is erroneous.
  • The refutation should include alternative explanations for the supporting propositions presented within the original misinformation.

Accounting for these three routes, a procedure takes shape:

  1. State and emphasize the central claim (i.e., the alternative position).
  2. Present and reinforce the central claim through core evidence.
  3. Address the erroneous information as a myth.
  4. Explain how the myth is erroneous (while keeping the truthfulness of the alternative position center-stage).

Though these are useful recommendations based on extant research regarding how human cognition works (the handbook is certainly worth a read and includes discussion of a number of interesting cognitive effects), they don’t completely address contexts wherein people simply disagree and subsequently disregard your position in favor of their own–which brings us to our second question: How can we change the minds of those who believe they have thought critically, yet are still wrong?

Changing people’s minds is not easy; and it’s even more difficult when the person you’re working with believes they have already thought critically about the issue. The truth, it seems, is that there’s little you can do about that: it largely comes down to the person you’re trying to educate and their disposition towards critical thinking. For example, consider a checklist of dispositions (a few are presented below; Dwyer et al., 2016) for the person whose mind you’re trying to change. Ask whether they are inclined or willing to do the following:

  • Reflect (i.e., to contemplate one’s behavior, attitudes, and opinions, as well as the motivations behind them; and to distinguish what is known from what is not, acknowledging limited knowledge or uncertainty).
  • Be open-minded (i.e., to be cognitively flexible and avoid rigidity in thinking; to tolerate divergent or conflicting views and treat all viewpoints alike prior to analysis and evaluation; to detach from one’s own beliefs and seriously consider points of view other than one’s own, without bias or self-interest; to be open to feedback, accepting positive feedback and not rejecting criticism or constructive feedback without thoughtful consideration; to amend existing knowledge in light of new ideas and experiences; and to explore such new, alternative, or ‘unusual’ ideas).
  • Seek the truth (i.e., to have a desire for knowledge; to seek and offer both reasons and objections in an effort to inform and to be well-informed; to be willing to challenge popular beliefs and social norms by asking questions (of oneself and others); to be honest and objective in pursuing the truth even when the findings do not support one’s self-interest or pre-conceived beliefs or opinions; and to change one’s mind about an idea as a result of that desire for truth).
  • Be skeptical (i.e., to challenge ideas; to withhold judgment before engaging with all the evidence or when the evidence and reasons are insufficient; to take a position, and be able to change it, when the evidence and reasons are sufficient; and to look at findings from various perspectives).
  • Persevere (i.e., to be resilient and motivated to persist at working through complex tasks, and the frustration and difficulty inherent in such tasks, without giving up; to be motivated to get the job done correctly; and to desire progress).
  • Be organized and clear in your thinking and presentation.

If the individual you’re trying to educate is lacking in these dispositions, it will certainly be hard to change their mind. However, if they do possess these inclinations, it’s more likely that, through their willingness to think critically (as suggested, to some extent, by this checklist), they will shift their perspective to be more consistent with your own, already critically considered, view.

But, with that said, there is no easy way of changing someone’s mind. You cannot force an idea on someone. In reality, it’s quite possible that if you push too hard, the person may resent it and purposefully ‘switch off’ from the conversation, or even cling more strongly to their existing beliefs, perhaps out of spite. This piece offers two ways of looking at how debunking and changing people’s minds may work, while acknowledging that a great deal of uncertainty remains in this area. In concluding, I think it worthwhile to ask you, the readers: have you come across any interesting research or similar guidelines in this area? If so, I’d love to hear from you. I find this an intriguing area of investigation, and one that merits further study, particularly given the current climate in media, politics, and society.

References

Cook, J. & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland. Retrieved from http://www.skepticalscience.com/docs/Debunking_Handbook.pdf

Dwyer, C.P., Hogan, M.J., Harney, O.M., & Kavanagh, C. (2016). Facilitating a student-educator conceptual model of dispositions towards critical thinking through interactive management. Educational Technology Research and Development, 65(1), 47-73.

Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N. & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.
