
An Intro to Entropy for Neuroscientists and Psychologists

Entropy is a confusing concept, but it can be a powerful quantitative tool.

Key points

  • The term entropy has many different definitions and can be used to quantify various properties of dynamical systems, including the human brain.
  • Entropy measures the amount of energy unavailable for work, the amount of disorder in a system, or the uncertainty regarding a signal's message.
  • Entropy can measure a brain's computational complexity by quantifying the number of accessible mental states (the size of the mental repertoire).

Entropy is one of the most useful concepts in science but also one of the most confusing. This article serves as a brief introduction to the various types of entropy that can be used to quantify the properties of dynamical systems, including the human brain. First, it is necessary to understand the term’s history and evolution.

The concept of entropy—commonly understood as “disorder”—was developed in the 19th century by Sadi Carnot and Rudolf Clausius, who were searching for the most efficient way to convert energy from heat flow into mechanical energy that could power machines. The steam engine had been around for some time and was proof that energy could be extracted from heat flow to do physical work.

What Carnot noticed was that if there is a temperature difference—or a “gradient”—between two bodies in contact, heat will spontaneously flow from the hotter to the colder body until both reach the same uniform temperature, a state known as “thermodynamic equilibrium.” We experience this phenomenon whenever a warm cup of coffee interacts with the surrounding air and cools to room temperature. When there is a temperature gradient, the flow of heat that ensues to eliminate it creates a physical force that can be harnessed to do work. But Carnot realized that converting heat energy into mechanical energy could never be one hundred percent efficient—some of the useful energy would always be lost to the surroundings through what physicists call dissipation.

For energy to be dissipated simply means that it gets uniformly dispersed into the environment, scattered in such a way that it can never again be harnessed to do work. A familiar example of energy dissipation is the body heat that we as humans—complex adaptive systems—constantly give off. Another is the heat being generated by your laptop computer as it computes. A swinging pendulum in a grandfather clock continually dissipates a small amount of energy due to constant friction with the air, which is why it must eventually come to a stop. Every mechanical process that happens in the universe dissipates some amount of useful energy by producing heat. This is the basis for the famous “second law of thermodynamics.” Entropy—as originally conceived—is a mathematical term representing the quantity of energy no longer available for work. Since there is more than one type of entropy, we will refer to this kind as thermal entropy.

As you can see, there has been no mention of order or disorder so far. So, where did this popular notion of entropy as disorder come from?

In the second half of the 19th century, support for atomic theory grew rapidly, and physicists began searching for microscale explanations of macroscopic phenomena, since accounts framed in terms of atoms and molecules were considered more fundamental. An Austrian physicist named Ludwig Boltzmann set out to explain the Second Law—namely heat energy’s tendency to disperse and dissipate—as a result of the statistical behavior of vast numbers of molecules moving according to simple laws of mechanics. Boltzmann was inspired to think microscopically by the then-recent discovery that the temperature of a gas is a direct consequence of how fast its individual molecules are moving.

Boltzmann reasoned that if heat was nothing but molecular motion, then its dissipation must involve a gradual diffusion and dampening of this excited motion over time, which he suspected could be traced to random collisions between molecules. To explore this model of energy dispersion—or entropy creation, since they are two sides of the same coin—he wisely chose a simple system: an ideal gas in a closed container.

Boltzmann imagined a bunch of molecules quickly zipping about in all directions, frequently bumping into one another like billiard balls on a pool table, transferring momentum around and spreading each other out in space. As the molecules randomly collide, the faster ones tend to slow down and the slower ones speed up until, eventually, the kinetic energy is spread evenly throughout the gas. Without any large-scale energy differences, there are no temperature gradients and, therefore, no heat flows that can be harnessed to do work. According to this explanation, an isolated system will inevitably approach thermodynamic equilibrium because of the effects of countless unseen molecular interactions.

Boltzmann described a system moving closer to thermodynamic equilibrium as becoming more “disordered” because no matter how the particles in the system were arranged initially—perhaps the faster molecules were neatly bunched up in one corner of the container—the collective configuration would inevitably drift toward a uniform spatial distribution that was devoid of any patterns or discernible structure. For this reason, Boltzmann considered this state of thermodynamic equilibrium and maximum entropy a state of “maximum disorder.”

The reason isolated systems naturally drift toward higher entropy or disorder is that there are simply many more ways to be mixed up and disorderly than there are ways to be organized and patterned. Systems naturally drift toward disorder from the large-scale effects of chance.

While it is not obvious on the surface, this conception of entropy—known as statistical entropy—allows us to think about disorder in terms of information, and in doing so, we see that entropy is curiously related to the knowledge of the observer and their ignorance of the precise physical state of the system under observation. To understand how entropy can be a measure of ignorance or uncertainty, we must appreciate the microstate-macrostate distinction.

A macrostate represents a collective property of a many-particle system and can be easily measured, like the average temperature of a system of gas molecules (a global measure), while the microstate is a detailed description of the position and velocity of every molecule in the system, which turns out to be impossible to measure due to quantum uncertainty and classical-scale chaos.

For any dynamical system, there are many different microstates that correspond to a single macrostate. This is because there are many equivalent ways for the individual molecules to be distributed and have the same average total kinetic energy. What Boltzmann showed was that entropy is a measure of how many different ways a system can be arranged without changing its macrostate. The higher the entropy, the more microstates there are that correspond to a single macrostate because there are many more ways to be arranged in a disordered mess.
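To make this counting idea concrete, here is a minimal sketch in Python built on a toy model that is not from the article: N molecules, each of which can sit in the left or right half of a container. The macrostate is simply how many molecules are on the left; the microstates are the specific assignments of individual molecules to sides; and the entropy is the logarithm of how many microstates each macrostate contains (with Boltzmann's constant set to 1 for simplicity).

import math

N = 10  # number of molecules in this toy example

for n_left in range(N + 1):
    W = math.comb(N, n_left)   # microstates consistent with this macrostate
    S = math.log(W)            # Boltzmann entropy S = k * ln(W), with k = 1
    print(f"{n_left:2d} molecules on the left: W = {W:4d} microstates, entropy = {S:.2f}")

Running this shows that the evenly mixed macrostate (five molecules on each side) has by far the most microstates and therefore the highest entropy, while the neatly bunched-up arrangements are comparatively rare.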

In 1957, the physicist E.T. Jaynes showed that Boltzmann’s entropy is not just a measure of disorder or of the number of microstates that correspond to a particular macrostate. Entropy is also a measure of the uncertainty, or ignorance, of an observer who knows the system’s macrostate but not its specific microstate. Because more disordered macrostates are consistent with more microstates, higher entropy means greater ignorance—or less certainty—about the specific microstate the system is in. It also means there is more surprise, and more information gained, upon learning the actual microstate, because more alternative possibilities are eliminated by the act of observation.

This is why statistical thermodynamics and Claude Shannon’s information theory are essentially the same theory: Shannon’s entropy, called information entropy, is a measure of how many states a system can be in or how many alternative messages can be sent over a communication channel. According to Shannon’s theory, information is a “reduction in uncertainty,” and the amount of information in a signal corresponds to the amount of uncertainty or ignorance that has been reduced upon receiving the message.
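Although the article does not spell out the formula, Shannon's entropy for a set of possible messages with probabilities p1, p2, and so on is H = -(p1·log2 p1 + p2·log2 p2 + ...), measured in bits. A minimal Python sketch:

import math

def shannon_entropy(probabilities):
    """Entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

For instance, shannon_entropy([0.9, 0.1]) for a heavily biased coin returns roughly 0.47 bits, reflecting that a nearly predictable outcome carries little surprise; the fair coin and the die discussed next are the simplest cases.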

Let’s think about the information that is gained when one flips a coin and observes the outcome. Before the coin toss, one does not know whether it will land on heads or tails, and with a fair coin, the odds are 50-50. When it lands and you look at the result, you collapse two equally likely possibilities into a single definite outcome, and in doing so, you acquire exactly one bit of information.

Now instead of a coin, which has only two possible states, imagine throwing a six-sided die while your eyes are closed. In this example, your uncertainty is greater because there are more states the die could potentially be in. When you open your eyes and observe how it landed, you reduce more uncertainty and therefore gain more bits of information. The amount of information gained grows with the number of possible alternatives; for equally likely outcomes, it equals the logarithm of that number.
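Worked out with logarithms (and assuming a fair coin and a fair die), the numbers behind these two examples are:

import math

print(math.log2(2))   # fair coin: 1.0 bit of uncertainty resolved
print(math.log2(6))   # fair six-sided die: about 2.58 bits

Ten equally likely outcomes would give about 3.32 bits, and so on: the information gained grows with the logarithm of the number of alternatives, not with the number itself.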

Because entropy can measure the number of states a system can be in, or the number of ways it can be configured, it can also be used to quantify the complexity of a complex adaptive system.

For example, one might reasonably assume that a species’ intelligence corresponds to the number of accessible states in its behavioral or mental repertoire. To quantify this number, we can describe it in terms of entropy: The more possible states the cognitive system can be in, the higher the entropy. In this application, entropy is not so much a measure of disorder as it is a measure of channel capacity, or cognitive bandwidth. It is also a measure of our ignorance of the exact internal or mental state of an organism at an instant in time if we cannot observe the organism’s state directly.
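As a purely hypothetical illustration (the state labels and counts below are invented for the sketch, not drawn from the article or from any dataset), one could estimate the entropy of an organism's state repertoire from how often each discrete state is observed:

from collections import Counter
import math

# Hypothetical sequence of observed discrete "mental states"
observations = ["rest", "focus", "rest", "explore", "focus", "rest", "play"]

counts = Counter(observations)
total = sum(counts.values())
entropy_bits = -sum((c / total) * math.log2(c / total) for c in counts.values())

print(f"Estimated repertoire entropy: {entropy_bits:.2f} bits")

A system with more accessible states, visited more evenly, would yield a higher number; a system locked into a single state would yield zero.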

Neuroscientist Robin Carhart-Harris of Imperial College London, whose “Entropic Brain Hypothesis” attempts to explain the effects of psychedelics on states of consciousness, cites Integrated Information Theory’s “phi” as a measure of information entropy for brains:

“The view taken here is that the human brain exhibits greater entropy than other members of the animal kingdom, which is equivalent to saying that the human mind possesses a greater repertoire of potential mental states than lower animals.”

This introductory article has only described the most basic types of entropy. For a summary of the different entropy measures being used in the brain-mind sciences—such as transfer, differential, permutation, or multiscale entropy—see this paper, which describes how such theoretical tools can be used to quantify brain function and information processing capacity.

This post is adapted from my book The Romance of Reality: How the Universe Organizes Itself to Create Life, Consciousness, and Cosmic Complexity.
