

6 Ways to Safeguard a Murder Hunch

Bias that taints investigative logic can undermine justice.

Key points

  • Solving a murder can feel urgent and rewarding.
  • Either motivation can add pressure and invite errors rooted in bias.
  • Trained awareness of these weaknesses can help investigators avoid them.
Source: Photo by K. Ramsland

Three years ago, I noted the shocking response from amateur sleuths to an incident involving Elisa Lam at the Cecil Hotel in Los Angeles. I evaluated it in the context of common cognitive errors. Some of these sleuths “knew” what had happened and confidently named a culprit. But there was no crime, and nothing these self-proclaimed Sherlocks said turned out to be accurate. Such certainty in one’s deductive prowess generally derives from deficient information, coupled with an urgency to solve the case and ignorance of harmful cognitive shortcuts. It can derail an investigation and even undermine a just resolution.

I’ve recently been involved in a similar situation, but this time the web sleuths were cautious. They listed circumstantial facts but acknowledged that their case lacked proof. It was a law enforcement officer, a sheriff, who led the charge. When the case was solved quite differently from how he’d envisioned it, his bias became clear: he’d focused on a specific suspect, and that focus had shaped his selective filter.

Here’s the case: On December 2, 1990, the remains of a young woman were discovered on an abandoned farm near Lanagan, Missouri. Nearby were several items of female clothing. A white towel had been wrapped around the decedent’s head, and she was bound with ropes, cords, and cables. It took years, but she was eventually identified as Shauna Beth Garber. She’d worked at a chicken processing plant in the area.

An amateur sleuth wondered whether Dennis Rader, the BTK serial killer from Kansas, might be the culprit. Others chimed in. The circumstantial case involved several items: In 1990, Rader had traveled to Oklahoma and Missouri for work. The crime scene is an hour from where he’d played on his grandparents’ farms as a boy. Rader had a lengthy “project” list of targets for murder that included women outside Wichita. In a 1989-1990 journal entry, he’d expressed an intense need to kill. (By then, he had already murdered nine of his ten victims.) He liked to scout for abandoned farms. Most relevant, he bound his victims with items like those found on Garber. Also, among photos he took of himself engaged in autoerotic activities were some that showed him wearing items that resembled her clothing.

Rader seemed a solid suspect. His infamy as a serial killer with a bondage fetish understandably attracted conjecture. The sheriff linked the Garber case to an unsolved missing-persons case of his own. He urged detectives on the Garber case to interview Rader. They did, but they weren’t persuaded. Despite the apparent links, there was no solid evidence. Less than a year later, they solved the case with a different suspect, one confirmed by witnesses and evidence.

Investing too much significance in a hypothesis sets up conditions for error. It adds an emotional layer that endows the speculation with a feeling of confidence. The more investment, the stronger the sense that it’s right. More alarming, it encourages selective attention.

This is focalism, a form of confirmation bias. It involves getting so attached to a specific belief that one bypasses objective evaluation and adjusts all data to support the belief. Some items gain more emphasis than they deserve. The multiple bindings, for example, suggested that Garber's killer had a bondage fetish. Most investigators on this case, from amateur to professional, focused on this as a lead. However, the bindings turned out to be nothing more than whatever items were at hand to keep her under control. No fetish was involved.

Cognitive psychologist Daniel Kahneman identifies another error: What You See Is All There Is. We focus on readily available information. Period. “A mind that follows What You See Is All There Is,” Kahneman states, “will achieve high confidence much too easily by ignoring what it does not know.” (I say more about this in the previous post mentioned above.)

Still, investigations must start from the information at hand, even if incomplete. There will be blind spots and guesses, trial and error. How can investigators guard against moving too fast onto what they think is the right path?

Criminologist Kim Rossmo has studied biased investigations. He notes that all investigators arrive at scenes with a perceptual set derived from their education, culture, and experience. This helps them to make decisions, but even the most experienced can anchor an investigation in a faulty threshold diagnosis that defines its direction. People react to a scene, compare it with what they know, decide what it means, and then look for support. They often see what—and only what—they expect to see. Urgency encourages shortcuts.

Rossmo says that with training and effort, investigators can avoid some of these errors. His suggested strategies include:

1. Teach investigators about cognitive shortcuts and their impact on information gathering and processing.

2. Provide problem sets that can help form and analyze several competing hypotheses.

3. Brainstorm case-specific checklists of key assumptions and determine which are potentially damaging.

4. Identify areas of uncertainty as attractants for hasty shortcuts.

5. Examine the impact of ego investment and fatigue.

6. Study the path of solid investigations, as well as of those that made mistakes.

Genuine truth-seekers tread carefully. They know that no matter how coherent a case analysis seems, it might be incomplete. New information could revise the narrative significantly. Dissecting the speculation that linked Rader to Garber shows what it lacked and where it failed, providing a cautionary note about murder hunches.

References

Ask, K., & Granhag, P. A. (2005). Motivational sources of confirmation bias in criminal investigations. Journal of Investigative Psychology and Offender Profiling, 2(1), 43-63.

Howard, L., & Ramsland, K. (2024, June 9). Genealogy, suspect death solve 1990 “Grace Doe” case. Forensic Magazine. https://www.forensicmag.com/613433-Genealogy-Suspect-Death-Solve-1990-G…

Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.

Kruglanski, A. W., Webster, D. M., & Klem, A. (1993). Motivated resistance and openness to persuasion in the presence or absence of prior information. Journal of Personality and Social Psychology, 65(5), 861-876.

Rossmo, D. K. (2011, June). Failures in criminal investigation. Police Chief Magazine.

Rossmo, D. K. (2008). Cognitive biases: Perception, intuition, and tunnel vision. In D. K. Rossmo (Ed.), Criminal investigative failures (pp. 9-21). Boca Raton, FL: CRC Press.
