
How Consciousness Evolved as a Purely Physical Phenomenon

Part 3: Internal representations are key to the evolution of consciousness.


This is Part 3 of a five-part blog series on the evolutionary origins of consciousness. I encourage you to read Part 1 first for the overall context. Parts 2 through 5 look in a little more detail (but still necessarily in summarized form) at each of six books by scientists focusing on this intriguing, fundamental way of understanding consciousness. Here in Part 3, we discuss the theories of Joseph LeDoux, who is also a PT blogger.

Neuroscientist Joseph LeDoux’s career has focused on studying the amygdala, a small almond-shaped structure located deep in the temporal lobes of the brain that plays a central role in behavioral fear conditioning. His research has shown that, contrary to what we might expect, the physiological and behavioral defense responses to threats (such as automatically jumping back, heart racing, when you think you've just seen a snake right in front of you on a trail), reactions mediated by the amygdala, can be dissociated from the conscious feeling of fear that we assume invariably accompanies them. That is, under certain conditions you can have one without the other.

In The Deep History of Ourselves: The Four-Billion-Year Story of How We Got Conscious Brains,1 LeDoux devotes a sizeable first portion of the book to a wonderful, sweeping overview of evolution from the origin of life onward. The book is well worth reading for this alone. It is altogether very readable, with short, digestible, thematically grouped chapters (66 in all) and beautiful illustrations.

LeDoux explains the gradual evolution of nervous systems and brains. For a basic summary of those processes, see Parts 1 and 2 of this blog series (Feinberg and Mallatt, reviewed in Part 2, provide a similar overview of evolution, but they posit that basic consciousness began much further back in animal life than LeDoux does2).

Since it is impossible to cover all the subject matter of such a rich book in a blog post, I will mainly summarize here LeDoux’s discussion of the relationship between emotion-based feeling states and consciousness, the central role of internal representations in consciousness, and his ideas about higher-order theories of consciousness and subjective experience.

Which came first, consciousness or emotional feeling?

Drawing on his own research into the relationship between fear and defense responses, LeDoux focuses on the evolution of emotions and feelings in higher animals and their relation to consciousness. He argues that conscious feelings of fear, like other emotional feelings, were a later evolutionary add-on to physiological and behavioral defense responses and depend on cognition and consciousness. In his view, what came first was cognition, not emotion; he sees emotions as cognitively constructed. So in his model, consciousness came before emotion.

We shall see in Part 4 that this is the opposite of Antonio Damasio’s model: Damasio theorizes that emotion came first (albeit in a primitive form) and that consciousness is partly a product of those feeling states. Both Damasio’s and LeDoux’s models are plausible, and considering them both is instructive in theorizing about the evolution of consciousness. The main difference in their theories is that Damasio thinks emotion appeared early in evolution, whereas LeDoux thinks it appeared late, at least in terms of the feeling of emotion, with cognition and consciousness having evolved first.3

Cognition is the ability to form internal representations

LeDoux defines cognition as the ability to form internal representations (mental representations) and to use these to guide behavior. See Parts 1 and 2 and a previous blog post for more explanation of internal representations, their central role in consciousness, and how they are entirely physical (at a basic level they are sensory images; in more complex brains they are "ideas," though ideas are still built on sensory images). Internal representations enable an animal to respond to stimuli that are not physically present, behaving instead on the basis of stored representations. For example, there may be stored representations of a cue that has previously been associated with food, danger, or sex. Those representations can then guide behavior independent of the presence of the actual stimulus; it is the mental representation that is now guiding behavior.

Like Feinberg and Mallatt, LeDoux provides evidence that all vertebrates have brains that store internal representations, and that even invertebrates have basic cognitive representational ability. He traces the gradual evolution of more complex internal representations in more complex brains. Mammals have a much more complex kind of cognitive representation, with some deliberative capacity: the ability to form mental models that can predict things that are not currently present. This is a far more sophisticated cognitive capacity than simply having a static memory of what is there, and it enables the animal to weigh, plan, and calculate behavioral options.

Nonconscious cognitive processes

LeDoux also emphasizes the “cognitive unconscious”—the nonconscious cognitive processes that allow us to control complex behaviors without having to call upon consciousness. Most of our routine behavior and even quite complex behavior is carried out by nonconscious cognitive processes (most of our movements, much of our language processing, tasks like driving a car if the driver is very experienced, etc.). Many actions can be "delegated" to nonconscious cognitive processes, which reduces the cognitive load on conscious information processing. But conscious information processing enables greater behavioral control and more flexible responses to stimuli than do nonconscious cognitive processes.

Sense of self

LeDoux discusses the uniquely human capacity for language and its role in consciousness. And he discusses what is called "autonoetic self-awareness," a term introduced by the psychologist Endel Tulving to describe the ability of the human brain to represent the self as an entity, as a subjective experience (a mental self-representation):

“In other words, there are different views of the self—you have a self that’s your body, but autonoetic consciousness is about yourself as a subject, something that is part of the experience, so when you have an experience of danger, it's you that is in danger.”4

This might be a uniquely human capacity, to have such a strong internal representation of the self as agent—the protagonist in the narrative of one’s life. It enables such things as mental time travel to the past or to the future (autobiographical memory and foresight/planning).

Higher-order theories of consciousness, and the role of higher cortical regions

LeDoux develops a formulation based on higher-order theories (HOT) of consciousness:

“In David Rosenthal’s version of a higher-order theory (HOT), conscious awareness results when nonconscious first-order sensory information is cognitively re-represented, resulting in a higher-order state, which is a mental state about a lower-order mental state. The way to think about the difference between first-order and higher-order theory is in terms of the kinds of states involved: first-order theory focuses on a mental state that represents the world, while HOT adds an additional (higher) mental state that re-represents the sensory state. By definition, then, in HOT, the first-order state is a nonconscious one, and only becomes a conscious mental state with the help of a higher-order one. At the risk of over-simplification, the higher-order state makes conscious the lower-order one.

One implication of Rosenthal’s HOT is that we are not conscious of the higher-order state itself, only of the lower-order state. To become conscious of the content of the higher-order state requires that it be re-represented by an additional higher-order state. For example, HOT assumes that when we consciously see a red apple it is because the prefrontal cortex makes conscious the visual cortex’s representation of the visual properties of the object. To be aware that you are having the experience, an additional higher-order state, possibly also involving the prefrontal cortex, is needed.” (pp. 278-279).

LeDoux describes a model for how the brain performs higher-order processing in the prefrontal cortex (a higher cortical region) of sensory perceptions coming from primary sensory areas such as the visual cortex (a lower cortical region). He contrasts HOT with other prominent theories of consciousness, such as Global Workspace Theory (GWT) and its more current version, the global neuronal workspace model. Each of these theories has its strengths.

The prefrontal cortex, and the frontal pole of the prefrontal cortex in particular, are much more developed in humans than in other primates, with rich interconnections. The frontal pole (the front-most part of the prefrontal cortex) is unique to humans. LeDoux articulates a model for how abstract meta-representations of the self and the external world could be formed in this most evolved part of the brain when internal representations in lower areas are re-represented and restructured at those higher levels.

Subjectivity

In contrast to Daniel Dennett, whose theories we will discuss in Part 4 and who explains away subjective experience, LeDoux gives subjective experience a central role in his theory.

“I'm a fan of subjective experience, and want to hang on to it. That's why I've adopted this Higher-Order Theory (HOT) of consciousness, which is about how subjective experiences themselves come about.”4

“The basic idea, according to HOT, is that conscious experiences entail some kind of minimal inner awareness of one’s ongoing mental functioning, and this is due to the first-order state (e.g. visual image) being in some ways monitored or meta-represented by a relevant higher-order representation. This requirement of HOT distinguishes it from cognitive theories, such as GWT, which also invoke additional cognitive processes as a crucial element of conscious experience, but which do not posit this type of inner awareness.”5, 6

In Parts 2 and 3 of this blog series, we have seen that Feinberg and Mallatt, as well as LeDoux, strongly emphasize the central role of internal representations in consciousness—images that the brain forms of the external world and of the self. They provide a clear picture of the gradual evolution of increasingly complex internal representations, and they explain how subjective conscious experience could emerge from such representations of the self in relation to the external world.7

In humans, the sense of self is of course much more complex than in any other species. Our brains have layered or nested internal representations of internal representations—of our body, behaviors, proclivities, interactions with and feedback from other people, and our whole autobiographical history. These internal representations are increasingly abstract at their higher levels. The brain’s reflection on itself is recursive, like the infinite regress of reflections-of-reflections that you get by holding two mirrors facing each other. Conscious self-awareness almost certainly has its basis in this form of self-reflection.

Internal representations, while crucial, are not the only basis for consciousness. Feinberg, Mallatt, and LeDoux also emphasize the role of emotions (the evolution of which has an equally mechanistic, physical basis).

Part 4 of this five-part series looks at the further insights provided by Daniel Dennett and Antonio Damasio into the evolution of consciousness.

References

1. Joseph E. LeDoux, The Deep History of Ourselves: The Four-Billion-Year Story of How We Got Conscious Brains (New York: Viking, 2019).

2. LeDoux also goes into more detail about the 3+ billion years between the origin of life and the Cambrian explosion.

3. LeDoux suggests that the human experience of emotion is likely very different from what other animals experience. He doesn’t deny emotions in other animals. It’s just that what humans call emotion depends on the prior arrival of complex cognition and culture, and humans’ experience of emotion is very much shaped by these.


4. Joseph LeDoux, interview on Brain Science Podcast with Ginger Campbell, M.D., September 27, 2019. https://brainsciencepodcast.com/bsp/2019/161-ledoux

5. Brown R, Lau H, LeDoux JE. Understanding the Higher-Order Approach to Consciousness. Trends Cogn Sci. 2019;23(9):754-768. https://www.cell.com/action/showPdf?pii=S1364-6613%2819%2930161-5

6. GWT and global neuronal workspace theory are further explained in Parts 4 and 5 of this blog series.

7. Feinberg and Mallatt define a very elemental form of primary or sensory consciousness in simple animals that is based merely on the ability to experience sensory images or affects, lacking, of course, any kind of reflective self-awareness. They argue that this very basic, primary form of consciousness is present in all vertebrates, and also in cephalopods and some arthropods. According to their model, more complex animals with more complex brains begin to have higher forms of consciousness—there are gradations of consciousness. While LeDoux’s theory is also based on the central role of internal representations in consciousness, he takes a more cautious view on the question of whether non-human animals are conscious. LeDoux does not deny the possibility of consciousness in non-human animals, but argues that whatever the subjective experience of other animals may be, it is likely very different from the way we humans experience consciousness. But he also says that the differences would likely be smaller for closely related species than for more distant ones.

The HOT model favored by LeDoux requires that the lower-order sensory representations be re-represented by a higher-order mental state, possibly involving the prefrontal cortex, in order for awareness to occur. By contrast, in Feinberg and Mallatt’s model, having a prefrontal cortex or even any cortex at all is not a requirement for the primary form of consciousness or subjective experience they postulate in simpler animals.

LeDoux distinguishes three kinds of consciousness. He proposes that while only humans may have reflective self-awareness, other primates may have semantic awareness of their world and body, and other mammals may have a more primitive kind of statistical knowledge of the objects in their lives and of their value. In recent work, he and Hakwan Lau suggest how even a primitive, statistical form of experience might depend on higher-order meta-representation in order to be consciously experienced (LeDoux JE, Lau H. Seeing consciousness through the lens of memory. Curr Biol. 2020 Sep 21;30(18):R1018-R1022).

The other difference is that LeDoux does not think that the feeling of emotions can occur at a subcortical level, whereas Feinberg and Mallatt (like Damasio and others) do. In general, LeDoux cautions against attributing mental states to animals and claiming a scientific basis for doing so, not because he denies that other animals have such states, but because of the difficulty of scientifically demonstrating when a behavior in them depends on conscious versus nonconscious brain processes. In non-scientific situations, he has no issue with assuming animals have conscious feelings, and he treats his cat as if it does.
