Consciousness
We Need a Better Vocabulary for Mind and Consciousness
Differentiating consciousness from the concept of Mind2.
Updated October 7, 2023 Reviewed by Lybi Ma
Key points
- A major dispute has emerged in the science of consciousness.
- There are definitional problems, but this post shows how to define consciousness and mind.
- Mental processes can be divided into three distinct domains.
- The domain of Mind2 relates to consciousness but is not the same as the broad definition.
As I noted in a prior blog post, a fight has broken out in the field of consciousness research. More than 100 scientists and scholars signed a letter proclaiming that integrated information theory was pseudoscience.
From a Unified Theory of Knowledge (UTOK) perspective, the fight emerges because there remains significant confusion about the meanings of words like consciousness and mind. These concepts are related, but they are not the same. Consider, for example, that, as Freud demonstrated, there are clearly unconscious mental processes, which shows the two concepts are not identical. But how, exactly, are they related? Because UTOK allows us to clearly define core words in psychology (words like science, behavior, mind, cognition, and consciousness), we can use the system to sort out the details.
Consciousness has three distinct referents. The first, and broadest, refers to functional awareness and responsivity. Something demonstrates functional awareness and responsivity when it can be shown to control and adjust its behavior to evolving inputs. Using this definition, we can say something like a drone programmed to drop a package at a point and return to its origin is conscious of the landscape. After all, it shows a kind of functional awareness and responsivity. For example, if the wind blows and starts to push the drone off course, it will adjust accordingly. And, if we were to travel back in time with a drone and show people such an entity, they would think of it as a kind of magic or a conscious machine.
Putting this definition in everyday language, we use this meaning of the word conscious to refer to people who are awake and responsive. For example, when, after a car crash, we say the driver is unconscious, we mean that he is not demonstrating functional awareness and responsivity to his environment.
The next definition is the subjective conscious experience of being. This refers to what Thomas Nagel characterized as something that it is like to be. Humans have access to this meaning of consciousness via introspection. When someone asks you how you are feeling or what you are perceiving, you have access to this qualitative experience. Other animals like dogs also have this subjective experience of being.
Finally, there is explicit, recursive self-consciousness. This refers to the explicit awareness of experience and the implications therein. When folks claim humans have free will, it is usually justified by the fact that humans demonstrate what Douglas Hofstadter calls a strange loop. I am aware that I am writing this post, I can justify why I am doing so, and in so doing I now, apparently, have the freedom to stop doing so.
In A New Synthesis for Solving the Problem of Psychology: Addressing the Enlightenment Gap [1], I show that the concept of mental processes is confused and that there are, in fact, three different referents that need to be differentiated when we use the term. The first referent is neurocognitive activity. This refers to the sensory-motor looping system that enables animals to demonstrate functional awareness and responsivity. I call this the domain of Mind1. It can further be divided into Mind1a, which is the activity that takes place within the nervous system (i.e., neuro-information processing), and Mind1b, which is the activity that takes place between the animal and the environment (i.e., what many call "behavior").
The second referent is subjective conscious experience. This is the felt experience of being that arises from Mind1 and is what we humans identify as our phenomenology. This is the domain of Mind2.
Finally, there is the domain of self-conscious justification, which is Mind3. It evolves because of propositional language and socialization, which cause us to transform from primates into human persons.
There are significant parallels between the three definitions of consciousness and the three domains of mind. However, there are also differences. The key difference is that mindedness and the domains of mental processes, as defined by UTOK, emerge in animals with brains. That is, for something to be minded, it must have a nervous system. However, consciousness does not have this requirement. It is defined more by structural and functional relations and properties than by the medium it is embedded in.
Why is this important in the current fight? Consider that the primary reason the letter writers called integrated information theory pseudoscience was that it implied things like logic gates might be conscious. However, this depends on which meaning of the term is in play. If consciousness is defined by a set of structural and functional relations only, then it makes sense that some things without a brain might be conscious. If, however, the central referent is our subjective experience as minded animals with brains, then the claim seems odd. Simply put, in the language of UTOK, we can say that it is conceivable that a collection of logic gates might be conscious, but they cannot be said to have Mind2. The point here is clear. If the field of consciousness research had a shared vocabulary that differentiated Mind2 from consciousness in general, then much of the fight would likely be resolved.
References
1. Henriques, G. R. (2022). Mind2: Subjective conscious experience in animals and humans. In A New Synthesis for Solving the Problem of Psychology (pp. 357-388). Palgrave Macmillan.