Ethics and Morality
Morality and Community
What humans call morality is really about making communities better.
Posted November 9, 2019
In his new book, A Natural History of Human Morality, Michael Tomasello begins with the startling suggestion that what we call “morality” is really about mechanisms of creating group structure—that there are fundamentally human mechanisms that allow us to see ourselves as playing roles within a group and that allow us to put the group’s good ahead of our own. The part that startled me was not the insight that humans are social animals or that we have unique cognitive structures that allow us to accommodate group behaviors (see below). The part that startled me was the simple statement that this is what we call “morality.”
What Tomasello is arguing is that humans evolved to play an assurance game, sometimes called the stag-hunt game. It is important to differentiate an assurance game from a prisoner’s dilemma game. In the prisoner’s dilemma, two players can either “cooperate” or “defect.” If both defect, they both lose. If one cooperates and the other defects, the defecting player wins big. If both cooperate, they both do pretty well. The problem is that in a single-round prisoner’s dilemma, defecting is always better. In an assurance game, if both players cooperate, they do much better than if they both defect or than if one defects in the face of the other cooperating.
So the key to the assurance game is ensuring that the other player is also cooperating. If you know the other player is going to cooperate in an assurance game, then you also want to cooperate. (Note that an iterated or repeating prisoner’s dilemma game starts to look like an assurance game, see Axelrod and Hamilton.)
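The difference between the two games can be made concrete with payoff tables. The numbers below are illustrative textbook-style values, not from any particular study; a quick sketch shows why defection dominates in the prisoner's dilemma while the best move in the assurance game is to match the other player:

```python
# Payoff tables as (row player's payoff, column player's payoff)
# for actions C (cooperate) and D (defect). Numbers are illustrative.

# Prisoner's dilemma: defecting is the best response no matter what
# the other player does.
prisoners_dilemma = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),  # the lone defector "wins big"
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

# Assurance (stag-hunt) game: mutual cooperation beats everything,
# but cooperating alone is worse than mutual defection.
assurance = {
    ("C", "C"): (5, 5),
    ("C", "D"): (0, 2),
    ("D", "C"): (2, 0),
    ("D", "D"): (2, 2),
}

def best_response(game, other_action):
    """Return my payoff-maximizing action given the other player's action."""
    return max(["C", "D"], key=lambda mine: game[(mine, other_action)][0])

# In the prisoner's dilemma, defect is dominant:
print(best_response(prisoners_dilemma, "C"))  # D
print(best_response(prisoners_dilemma, "D"))  # D

# In the assurance game, the best response is whatever the other player does,
# which is why assurance (knowing the other will cooperate) is the whole game:
print(best_response(assurance, "C"))  # C
print(best_response(assurance, "D"))  # D
```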
It is important to recognize that the problem of the commons and the problem of public goods are not prisoner’s dilemma games; they are assurance problems (Runge). What I will explore in the rest of this blog post is the evidence that humans have developed social institutions to ensure cooperation within those assurance games (Ostrom, Wilson), and that this is what we call “morality.”
A conflict between levels—selfish win within groups; altruistic groups win over selfish groups.
In several books (Unto Others, Darwin’s Cathedral, Does Altruism Exist?), David Sloan Wilson and his colleagues make the case that there is an inherent conflict between organizational levels—within a group, the more selfish generally win over the more altruistic, but altruistic groups do much better than selfish groups.
Let’s take, as a canonical experiment, the tax problem: We have groups of players. Each player starts with the same amount of money. Each player in each group can put some of their money into a central pot, where the money triples.
The money in the central pot is then distributed to all of the players in the group equally, whether they put money in or not. (I call this the “tax problem” because it is a good model of putting money into a centralized governmental entity that builds infrastructure that is useful economically. Everyone gets to use the infrastructure, even the selfish players who cheated and didn’t pay their taxes.) Within the group, the selfish players do better than the altruistic players, because not only do they get their equal share of the pot, but they also kept more of their own money to start with. However, the more selfish players there are in the group, the less money there is to triple and return.
If we think of this from an evolutionary perspective, treating these payoffs as resources that drive successful reproduction, we find that at each evolutionary game step the proportion of selfish players grows within each group, but that altruistic groups grow faster than selfish groups.
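The within-group versus between-group arithmetic can be checked with a toy version of the tax problem. The endowment, multiplier, and group compositions below are made up for illustration (altruists contribute everything, selfish players contribute nothing):

```python
# A toy round of the "tax problem": everyone's contribution is pooled,
# tripled, and split equally among all group members.

ENDOWMENT = 10    # illustrative starting money per player
MULTIPLIER = 3    # the pot triples

def play_round(n_altruists, n_selfish):
    """Return (altruist payoff, selfish payoff, group average) after one round."""
    n = n_altruists + n_selfish
    pot = n_altruists * ENDOWMENT * MULTIPLIER
    share = pot / n                  # everyone gets an equal cut of the pot
    altruist = 0 + share             # gave everything away, gets only the share
    selfish = ENDOWMENT + share      # kept the endowment AND takes a share
    average = (n_altruists * altruist + n_selfish * selfish) / n
    return altruist, selfish, average

# Within any mixed group, a selfish player beats an altruist by exactly
# the endowment they kept:
altruist, selfish, _ = play_round(8, 2)
print(selfish - altruist)            # 10.0

# But the mostly-altruistic group ends up with far more resources per
# player than the mostly-selfish group:
_, _, avg_altruistic_group = play_round(8, 2)
_, _, avg_selfish_group = play_round(2, 8)
print(avg_altruistic_group, avg_selfish_group)   # 26.0 14.0
```

If those averages drive reproduction, the selfish fraction rises inside every group while altruistic groups outgrow selfish ones, which is exactly the conflict between levels described above.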
Peter Turchin calls this asabiya—the ability to work together—and points out that in any physical or economic conflict (such as between societies, cultures, or empires), the group that works together wins over those that don’t. One of my favorite stories about this is Salman Rushdie’s masterpiece Haroun and the Sea of Stories, in which the Guppee army of pages debate every order, but then fight together once they’ve come to consensus, while the much larger and scarier Chupwala army fight as much among themselves as against their enemy and, thus, lose the war.
Third-party punishment. Restorative justice.
The key to solving this conflict is third-party punishment. Third-party punishment is when one member of a group punishes another for being unfair to a third. It can be contrasted with coalition defense. Coalition defense occurs when you have a group (say with two agents A and B), and a third player (C) attacks a member of the coalition (C attacks A), and the other member comes to their defense (B attacks C). Third-party punishment occurs when the three members are all part of the same group (A, B, and C), and B punishes C for being unfair to A within the group.
There are lots of examples of coalition defense in non-human animals. Some of these coalitions are defined by simple relationships, like kinship. For example, parents defend their children. Don’t mess with mama bear’s cub! There are also lots of examples, particularly among primates, of siblings defending each other. As pointed out by numerous primatologists studying primates in the field (Cheney and Seyfarth, Strum, Sapolsky), primates like baboons and vervet monkeys are very aware of their kinship relationships and will coalition-defend along those relationships. (Don’t mess with my kid brother!) There are also extensive examples of non-sibling coalitions within primate tribes and groups. (Don’t mess with my friend!) These are held together by mutual grooming and other friendship-related behaviors (De Waal, Goodall).
But to my knowledge, there are no clear examples of third-party punishment in non-human animals. One of the most interesting examples of this lack of third-party punishment was the observations by Jane Goodall of the inability of the Gombe tribe of chimpanzees to handle the sociopaths, Passion and Pom (described in Through a Window). Passion and Pom were a mother-daughter pair (a kinship-related coalition) who would find a mother chimpanzee with an infant, harass the mother until she was separated from the infant, and then kill and eat the infant. The males of the tribe would interfere if they saw this happening, but did not treat Passion or Pom differently in other situations.
Importantly, this was not a problem of memory: Gilka, one of the mothers who had been attacked, clearly remembered the assault and reacted in terror to Passion and Pom later, but the males did not. Humans do not do this. Humans who harm children are punished and not suffered to survive within the tribe.
Tomasello points out that one of the key differences between humans and other primates is this third-party punishment, this restorative justice. Tomasello’s comparisons are all between humans and other primates, and while primates have complex social structures, they live in smaller groups and are not as socially complex as humans. Tomasello makes the suggestion that humans developed this moral structure because we passed through an evolutionary situation where group success was paramount, where we played the assurance game. In the language used above, the evolutionary equation shifted to emphasize altruistic-strengthening, between-group dynamics over selfish-strengthening, within-group dynamics.
This raises a very interesting question. How does morality play out in other, more social species? In Unto Others, Sober and Wilson argue that you see altruistic behavior evolve in transient groups when those groups are separated. So what happens in species that live in highly social groups where mutual help is sometimes critical to survival? What happens in other species that live in the assurance game?
For example, if a vampire bat misses a couple of meals, it will starve, so having a mutual insurance policy to smooth out noise in the economics would be highly beneficial, evolutionarily. It has been seen that vampire bats share blood, recognize cheaters, and show non-kin coalitions (Carter and Wilkinson). To my knowledge, it is not known if they show third-party punishment as well or only tit-for-tat individual punishment.
Do other animals gossip? Tit-for-tat punishment is a dyadic morality play, in which a player participating in a repeated prisoner’s dilemma game starts by cooperating, but then simply does what the other player did last time. This effectively punishes cheaters. In a world with lots of players playing repeated prisoner’s dilemma games, tit-for-tat is a generally-stable, winning strategy (Axelrod and Hamilton). (Remember that a repeated prisoner’s dilemma becomes an assurance game.)
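Tit-for-tat is simple enough to sketch directly. The payoffs below are the usual illustrative prisoner's dilemma values, not the parameters of Axelrod's actual tournament:

```python
# Illustrative prisoner's dilemma payoffs: (my payoff, their payoff).
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_history):
    """Cooperate first; afterwards, copy the opponent's previous move."""
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    """Return total payoffs for two strategies over repeated rounds."""
    history_a, history_b = [], []   # each side's record of the OTHER's moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a)
        move_b = strategy_b(history_b)
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a += pa
        score_b += pb
        history_a.append(move_b)
        history_b.append(move_a)
    return score_a, score_b

# A defector exploits tit-for-tat exactly once, then gets punished every
# round after; two tit-for-tat players cooperate throughout:
print(play(tit_for_tat, always_defect))   # (9, 14)
print(play(tit_for_tat, tit_for_tat))     # (30, 30)
```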
The problem is that as society gets larger, players become more and more likely to encounter someone only once. So, how do you turn this single-encounter prisoner’s dilemma into a repeated prisoner’s dilemma? Gossip and reputation. Gossip is a means of sharing information about individual behaviors within a group, turning individual, single-encounter prisoner’s dilemma games into a general, many-encounter assurance game. Interestingly, most human conversation is gossip.
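Here is a toy sketch of that reputation mechanism. The player names, strategies, and the instantly-shared reputation table are all invented for illustration:

```python
# A toy model of gossip: every encounter is one-shot, but defections are
# reported to a reputation table shared by the whole group.

def encounter(strategy, reputation, opponent):
    """One single-shot move: a 'reciprocator' cooperates unless gossip says
    the opponent is a cheater; a 'cheater' always defects."""
    if strategy == "cheater":
        return "D"
    return "D" if reputation.get(opponent) == "bad" else "C"

def run(population, pairs):
    """Play a sequence of one-shot pairings; count successful exploitations."""
    reputation = {}        # the gossip network: name -> "bad"
    exploitations = 0      # times someone defected on a cooperator
    for a, b in pairs:
        move_a = encounter(population[a], reputation, b)
        move_b = encounter(population[b], reputation, a)
        if move_a == "D":
            reputation[a] = "bad"   # gossip spreads to everyone at once
        if move_b == "D":
            reputation[b] = "bad"
        if move_a == "D" and move_b == "C":
            exploitations += 1
        if move_b == "D" and move_a == "C":
            exploitations += 1
    return exploitations

population = {"ann": "reciprocator", "bob": "reciprocator",
              "cal": "reciprocator", "zed": "cheater"}

# zed meets a DIFFERENT reciprocator each time, yet exploits only the first:
# by the second encounter, gossip has already reached the others.
pairs = [("zed", "ann"), ("zed", "bob"), ("zed", "cal")]
print(run(population, pairs))   # 1
```

Note a wrinkle visible even in this toy: a reciprocator who defects against a known cheater gets flagged “bad” too, which is a well-known complication in reputation models (punishment looks like cheating to third parties).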
Interestingly, in Darwin’s Cathedral, Wilson makes the suggestion that one role of religion is to create groups, and that the creation of those groups leads to within-group altruism. Moreover, those religious rules are particularly well-designed to ensure cooperation in that assurance game and to punish parties that cheat within that game. Is this part of why religion, particularly public religion, is so often thought of as being “about morality”? For example, the Ten Commandments are all about either defining a group [I-III, the important point of there being “no other gods before me”], structure within the group [V], or reducing in-group conflict [VI-X]. As pointed out by Pascal Boyer (Religion Explained), the other role religion plays is to create the illusion of (or, perhaps better to say, acceptance of) global, in-group monitoring to ensure in-group cooperation.
What this new perspective on morality suggests is that what we call “moral” is about what we owe to each other, about how to create groups, and how to maintain them. One of my favorite points about community is a point made by Lynn Stout that people express “passive altruism”—we tend to work well within the community—that there are basic community norms we accept (don’t randomly attack people in your community, don’t steal from each other, etc.).
Of course, one of the problems with these definitions is that these processes lead to within-group altruism and between-group conflict (and xenophobia). Turchin argues that there is a historical cycle of altruistic groups winning out until they are destroyed from within by selfishness, leaving them vulnerable to another altruistic group taking over. He also argues that empires appear at the boundaries of cultures when warring groups realize that they are part of a larger group because a more different group has appeared.
So maybe what we need is an alien invasion for us to realize that we are all part of the same humanity. (Of course, this just moves the group one step larger, at which point, we’ll have to ask how we bring those aliens into our humanity, but that’s for another day.)
References
Axelrod, R., & Hamilton, W. D. (1981). The evolution of cooperation. Science, 211(4489), 1390-1396.
Boyer, P. (2007). Religion Explained: The Evolutionary Origins of Religious Thought. Basic Books.
Card, O. S. (1986). Speaker for the Dead. Tor Books.
Carter, G. G., & Wilkinson, G. S. (2015). Social benefits of non-kin food sharing by female vampire bats. Proceedings of the Royal Society B: Biological Sciences, 282(1819), 20152524.
Cheney, D. L., & Seyfarth, R. M. (2018). How Monkeys See the World: Inside the Mind of Another Species. University of Chicago Press.
De Waal, F. B. (1989). Peacemaking Among Primates. Harvard University Press.
Goodall, J. (1986). The Chimpanzees of Gombe: Patterns of Behavior. Harvard University Press.
Goodall, J. (2010). Through a Window: My Thirty Years with the Chimpanzees of Gombe. HMH.
Moore, A., & Gibbons, D. (1987). Watchmen. DC Comics.
Ostrom, E. (1990). Governing the Commons: The Evolution of Institutions for Collective Action. Cambridge University Press.
Runge, C. F. (1981). Common property externalities: Isolation, assurance, and resource depletion in a traditional grazing context. American Journal of Agricultural Economics, 63(4), 595-606.
Rushdie, S. (2014). Haroun and the Sea of Stories. Penguin.
Sapolsky, R. M. (2007). A Primate's Memoir: A Neuroscientist's Unconventional Life Among the Baboons. Simon and Schuster.
Scanlon, T. (1998). What We Owe to Each Other. Harvard University Press.
Sober, E., & Wilson, D. S. (1999). Unto Others: The Evolution and Psychology of Unselfish Behavior. Harvard University Press.
Stout, L. (2008). Taking conscience seriously. In Moral Markets, edited by Paul Zak. Princeton University Press.
Stout, L. (2010). Cultivating Conscience: How Good Laws Make Good People. Princeton University Press.
Strum, S. C. (2001). Almost Human: A Journey into the World of Baboons. University of Chicago Press.
Tomasello, M. (2016). A Natural History of Human Morality. Harvard University Press.
Turchin, P. (2018). Historical Dynamics: Why States Rise and Fall. Princeton University Press.
Wilson, D. S. (2010). Darwin's Cathedral: Evolution, Religion, and the Nature of Society. University of Chicago Press.
Wilson, D. S. (2015). Does Altruism Exist?: Culture, Genes, and the Welfare of Others. Yale University Press.