Unintended Consequences #3: The Sine Qua Non of Human Existence
After agriculture, the ability to think became the measure of humanity.
Posted August 1, 2019
Agriculture created the need for an increasing number of calculations, including methods of irrigation, planting times, harvest times, weather prediction, and pest control, as well as food preservation and storage. All of these required planning ahead, and all were open to endless improvement.
In addition, agriculture vastly increased the role of hierarchy in human life. Dealing with hierarchy requires a lot of thought: Who is above you, who is below, whom do you curry favor with, whom can you lord it over? How can you rise in the hierarchy? Clever people could rise on the strength of their thinking alone. Counselors and advisors became indispensable to kings and emperors.
Perhaps more importantly for our theme, agriculture dramatically increased the need for deferred gratification (e.g., don’t eat the seed wheat). Deferred gratification placed a completely new and heavy burden on the cortex, requiring that it think about the emotions and desires of the organism and override them if necessary.
This increase in the role and value of the cortex has had profound effects on our intellectual and emotional lives. The classical Greeks (circa 500 BC) made reason the sine qua non of human existence; Socrates told us to know ourselves and declared that “the unexamined life is not worth living,” a phrase that Freudians love to echo. Descartes, with his famous statement, “I think, therefore I am,” doubled down, making the mind the very essence of being and identity. Maslow created a “hierarchy of needs” that put physiological needs at the bottom and “self-actualization” at the top. It’s another example of the cortex congratulating itself on its own importance.
It’s not surprising that the cortex eventually became the critic, the supreme judge of our behavior, our emotions, and even the thoughts that it itself produces. Today, self-evaluation, in America at least, is virtually universal and taken for granted (Glantz and Bernhard, 2018). Most people routinely think that they should be “better.” Self-improvement is the subject of an entire body of literature. We set arbitrary “standards” for ourselves and then condemn ourselves for not meeting them.
There are other ways of being. Lao Tsu, the founder of Taoism, was probably one of the first to realize that the mind had gone astray—had lost the Tao, the way. Today, the use of psychedelic drugs is one indication that people long to escape the tyranny of the critic. Another is the abiding interest in Eastern philosophy. The prevalence of low self-esteem and depression is yet another sign that thinking too much about the self can create more problems than it solves. As Blaise Pascal wrote several hundred years ago, “the heart has reasons that reason doesn’t recognize.”
The hunter-gatherer adaptation had been in existence for at least 250,000 years before the aggrandizement of the cortex. It was the crucible of the human genome, and it had developed, organically, out of millions of years of group-living by ancestral species. The breakdown of this ancient adaptation created a constant need for creative new solutions. Humans no longer possessed a way of life that could be taken for granted, that was automatic. After agriculture, we had to think about living in a rapidly changing environment. Alienation from the natural world was the price we paid for our brilliance.
References
Glantz, Kalman and Bernhard, J. Gary (2018). Self-Evaluation and Psychotherapy in the Market System. New York: Routledge.