
Anomaly Detection: The Art of Noticing the Unexpected

What the analytical tradition is missing.

Unlike pattern-matching, which is about spotting connections and relationships, anomaly detection is about noticing disconnections—things that do not fit together. Anomalies get much less attention than pattern-matching, but the ability to spot them is extremely important: it enables us to escape fixation and to question the way we are making sense of a situation (Klein et al., 2007).

For example, in my 2013 book, Seeing What Others Don’t, I tell the story of a young police officer, waiting for traffic to move, who notices that the driver of the new BMW just ahead of him has taken a deep drag on his cigarette and then flicked the ashes. That surprises him—what owner of a new BMW would do that? So he investigates further, pulls the driver over, and discovers that the car was stolen.

Anomaly detection helps us to spot inconsistencies in our diagnosis of a problem, and therefore to escape from fixation. It lets us notice that a pre-existing plan might not work in our current environment. It helps us adapt to changing conditions, perhaps triggering our “Spidey sense” that something doesn’t feel right.

What is an anomaly? There are two different perspectives: a statistical perspective and a cognitive perspective.

The statistical perspective treats an anomaly as an outlier. Wikipedia offers this definition: “the identification of rare items, events or observations which raise suspicions by differing significantly from the majority of the data … Anomalies are also referred to as outliers, novelties, noise, deviations and exceptions.”

The statistical perspective includes all kinds of methods for spotting these outliers, such as visual analytic tools that make the outliers stand out.
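To make the statistical perspective concrete, here is a minimal sketch in Python of one common approach: flagging values that fall far from the rest of the data. The readings and the two-standard-deviation threshold are made up for illustration and do not describe any particular tool.

# A minimal sketch of the statistical perspective: flag readings that lie
# far from the rest of the data (hypothetical values and threshold).
from statistics import mean, stdev

readings = [98.6, 98.4, 98.7, 98.5, 103.2, 98.6, 98.5]  # made-up data

mu = mean(readings)
sigma = stdev(readings)

# Flag any reading more than two standard deviations from the mean.
outliers = [x for x in readings if abs(x - mu) / sigma > 2]
print(outliers)  # prints [103.2]; the method flags the deviation but says
                 # nothing about whether it should change our understanding

Notice that the procedure stops at flagging. Judging whether the flagged value actually matters is left to the person reading the output.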

The cognitive perspective is that an anomaly is a violation of our expectancies. Something happens that we didn’t expect. For example, a pattern is disrupted, or a surprising event occurs (back to the BMW example).

This post is about the cognitive perspective. The cognitive perspective does not include the powerful methods found in the statistical perspective. However, it captures something that is missing from the statistical perspective: our ability to form expectancies. It includes the way we can notice missing data—events that were expected but failed to occur—which typically don’t show up in the statistical methods that analyze events that have occurred. The cognitive perspective captures mental models, and the causal factors that allow us to appreciate the significance of a particular deviation.

Most deviations and outliers are uninteresting. At the cognitive level, anomalies matter primarily when they have the potential to alter the way we understand a situation. And that type of sensemaking is very different from the flagging of outliers found in statistical methods.

So now we can amend the cognitive definition of anomaly offered above. An anomaly is a violation of our expectancies that enables us to revise the way we understand a situation. Therefore, anomaly detection aligns closely with problem detection (see Klein, Pliske, Crandall & Woods, 2005).

Anomaly detection depends on a regular environment. If the environment is completely random there’s no way to spot an anomaly. Anomaly detection also depends on curiosity—the cues and discrepancies that catch our attention. Anomaly detection also depends heavily on expertise, which grows out of our experience.

Expertise. We use our expertise, our pattern repertoire, to form expectations. More experience results in better anomaly detection because our expectations are sharper, making anomalies easier to see. Experience also provides us with tacit knowledge, such as recognizing stimuli as familiar; that sense of familiarity lets us detect anomalies, which are departures from the familiar.

Expertise also provides us with richer mental models—richer sets of causes to take into account in assessing a potential anomaly.

Barriers to anomaly detection. One of the strongest barriers is our mindset. A mindset to be curious about inconsistencies is different from a mindset to dismiss or explain away any inconsistencies, what Perrow (1984) has called de minimus explanations. This dismissive mindset leads us to preserve a frame in the face of anomalies and contrary evidence, resulting in fixation errors.

Conclusion. The notion of an anomaly as an outlier is simple enough—and too simple. It ignores the cognitive dimension: that an anomaly is a violation of an expectancy. And it ignores the way we use sensemaking tactics to judge whether the violation of expectancies has the potential to change our story about what is going on. Algorithmic, statistical methods aren’t sensitive to these aspects of anomaly detection. Tools based on analytical methods may have some value, but because of their insensitivity to the cognitive dimension, they may flag irrelevant outliers and miss the important ones.

References

Klein, G. (2013). Seeing what others don’t: The remarkable ways we gain insights. New York, NY: PublicAffairs.

Klein, G., Phillips, J. K., Rall, E., & Peluso, D. A. (2007). A data/frame theory of sensemaking. In R. R. Hoffman (Ed.), Expertise out of context (pp. 113-155). Mahwah, NJ: Erlbaum.

Klein, G., Pliske, R. M., Crandall, B., & Woods, D. (2005). Problem detection. Cognition, Technology & Work, 7, 14-28.

Perrow, C. (1984). Normal accidents: Living with high-risk technologies. New York, NY: Basic Books.
