

Regret in interpersonal games

Regret pulls toward rational, not moral, choice.


Regret is nothing but the revocation of our will and the conflict among our moods.
Michel de Montaigne [1]

After two ruminations on regret (here and here), I grudgingly concluded that if regret is so prevalent, there must be some good reason for its existence, although the arguments I have considered so far have fallen apart upon inspection. So on to attempt number three: game theory. Is a person who is sensitive to regret more likely to make moral or rational choices?

First, consider the prisoner's dilemma. Two players choose independently between cooperation and defection. If both cooperate, they each get payoff R. If both defect, they each get P. If one cooperates and one defects, the defector gets T and the cooperator gets S. The inequalities are T > R > P > S. Traditional analysis says that it is moral to cooperate because cooperation increases the other's payoff, and that it is rational to defect because defection increases one's own payoff.
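For readers who like to see the logic in code, here is a minimal sketch in Python; the specific numbers are my own, and any values satisfying T > R > P > S would do:

```python
# Illustrative payoffs satisfying T > R > P > S (the numbers are assumptions).
T, R, P, S = 5, 3, 1, 0

def my_payoff(my_move, other_move):
    """My prisoner's dilemma payoff; 'C' = cooperate, 'D' = defect."""
    table = {('C', 'C'): R, ('C', 'D'): S, ('D', 'C'): T, ('D', 'D'): P}
    return table[my_move, other_move]

# Defection strictly dominates: it pays more whatever the other player does.
for other in ('C', 'D'):
    assert my_payoff('D', other) > my_payoff('C', other)
```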

What might one regret? Arguably, regret is greatest for the player who cooperates hoping the other player will too, and who then meets with defection. In the absence of masochism, this player will switch to defection. In other words, regret is associated with rational defection in the prisoner's dilemma, which may seem odd for a moral emotion. More importantly, regret is unnecessary. A traditional game theorist would start out with defection anyway, knowing that its payoff is higher than the payoff of cooperation regardless of the other player's choice. If, nonetheless, the first move is cooperative, the heuristic of tit-for-tat can account for the switch to defection without the emotional baggage.
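Tit-for-tat itself is almost embarrassingly simple; a sketch, assuming moves are coded as 'C' and 'D':

```python
def tit_for_tat(opponent_history):
    """Cooperate on the first move, then copy the opponent's last move."""
    return 'C' if not opponent_history else opponent_history[-1]

# After the opponent defects, tit-for-tat defects too.
assert tit_for_tat([]) == 'C'
assert tit_for_tat(['C', 'D']) == 'D'
```

The switch to defection follows mechanically from the rule; no counterfactual brooding is needed.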

Second, consider the trust game. This is a sequential game, in which the first player chooses between transferring a sum of money to the second player and keeping it. If the money is transferred, it triples in value and the second player chooses between keeping it all and splitting it with the first player. If the first player distrusts the second player and keeps the money, there will be no opportunity for regret. Only if the first player is betrayed after trust can the choice of transferring the money be regretted. The result is greater suspicion, more distrust, and a decline in economic activity. As in the prisoner's dilemma, this result amounts to greater rationality and less morality; and again, a simple learning rule can account for it.
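A sketch of the payoffs and of such a bare-bones learning rule; the stake of 10 units, the initial propensity to trust, and the learning rate are my assumptions (the tripling comes from the game itself):

```python
def trust_game(transfer, split, stake=10):
    """Payoffs (player1, player2); the stake of 10 units is an assumption."""
    if not transfer:
        return (stake, 0)       # money kept: no trust, and no occasion for regret
    pot = stake * 3             # the transferred money triples, as in the text
    return (pot / 2, pot / 2) if split else (0, pot)

# A bare-bones learning rule: betrayal halves the propensity to trust.
p_trust = 0.9
payoff, _ = trust_game(transfer=True, split=False)   # trusted and betrayed
if payoff == 0:
    p_trust *= 0.5              # illustrative learning rate; no emotion required
print(p_trust)                  # 0.45
```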

To drive this last point home, consider this: You stop putting your hand on the hot burner because you know, from past experience or from observing others, that it hurts like hell. You do not need to avoid the burner out of fear that you would regret having failed to avoid it. The physical pain speaks in a loud and clear language. The emotional pain is the kind of suffering that Buddhists bemoan: tacked on, self-inflicted, and totally unproductive.

Third, consider the ultimatum game. This is also a sequential game, in which the first player proposes to the second player a way of splitting a sum of money. If the second player agrees, the money is divided as proposed; if the second player declines, no one gets anything. The safe option for the first player is to offer a 50-50 split. Here, regret is only imaginary: the player might imagine that a 60-40 split would also have been accepted. In contrast, the first player feels clear regret at not having offered more if a low offer is rejected. Anticipating this, the player might offer a bit more than he or she otherwise would.
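In code, the logic of the game is almost trivial; the responder's acceptance threshold below is an assumption for illustration:

```python
def ultimatum(total, offer, accepts):
    """Proposer keeps total - offer if the responder accepts; otherwise both get 0."""
    return (total - offer, offer) if accepts(offer) else (0, 0)

# Assumed responder: rejects any offer below 30% of a pot of 100.
accepts = lambda offer: offer >= 30

print(ultimatum(100, 50, accepts))   # (50, 50): the safe split
print(ultimatum(100, 40, accepts))   # (60, 40): would also have been accepted
print(ultimatum(100, 20, accepts))   # (0, 0): clear regret at not offering more
```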

So here at last we have an example where regret enhances both the morality and the rationality of choice. Yet the enhanced morality of offering the other player more than a dime is perhaps only a byproduct of the rational concern with not walking away with nothing. Nonetheless, anticipated regret seems useful here. Many economists think that the minimization of future regret is an important principle of choice. But this cannot be the whole story. If it were, everyone would propose a 50-50 split. But wait: that choice would open the door for the reverse kind of imaginary regret ("Had I been a bit more selfish, I still would have got away with it."). To say that whichever split a player in the ultimatum game proposes is by definition the regret-minimizing choice is not terribly enlightening; the argument is circular.

A more complete, but still circular, argument is that players try to find the minimum offer acceptable to the other. This offer balances the attempt to minimize regret ("Good thing that I did not offer less than I did") and its sister emotion of rejoicing ("I am glad I offered as little as I did, and got as much as I did.").
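This balancing act is easier to see with numbers. In the sketch below, I assume a population of responders whose acceptance thresholds are drawn uniformly between 10 and 50 (out of a pot of 100), and I score each offer by its average regret relative to having offered the responder's threshold exactly; all of these modeling choices are mine:

```python
import random

random.seed(1)
# Assumed responders: each rejects offers below a threshold drawn uniformly
# between 10 and 50, out of a pot of 100. The numbers are illustrative.
thresholds = [random.uniform(10, 50) for _ in range(10_000)]

def mean_regret(offer, total=100):
    """Average regret versus the counterfactual of offering the threshold exactly."""
    reg = 0.0
    for t in thresholds:
        if offer >= t:
            reg += offer - t        # accepted: 'I could have offered less'
        else:
            reg += total - t        # rejected: 'I should have offered more'
    return reg / len(thresholds)

best = min(range(101), key=mean_regret)
print(best)   # 50 with this seed: the offer simply mirrors the assumed thresholds
```

Move the assumed thresholds and the "regret-minimizing" offer moves with them, which is the circularity in action.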

What I think I have learned from this exercise is that if these games are played only once, the regret experienced after a disappointing choice made by the other player is wasted energy. If the game is played repeatedly, regret may guide a person towards more rational choices, although it is hard to see what the emotion contributes beyond what simple learning can accomplish. Along the way, choices also become less moral (prisoner's dilemma, trust game), or only seemingly more moral (ultimatum game).

To dispel the idea that regret always stands in the service of rationality, let's take a look at gambling. In the Dutch postcode lottery, individuals cannot help finding out whether they would have won had they played (Zeelenberg & Pieters, 2004): the winning number is a postcode. The Dutch play, in part, to guard against that regret. Gambling is self-defeating because the expected value of the gamble is less than the price of the ticket. The greater the anticipated regret, the greater the irrationality.
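The arithmetic of self-defeat is simple; the ticket price, prize, and odds below are invented for illustration and are not the lottery's actual terms:

```python
# Invented numbers for illustration (not the actual Dutch lottery terms).
ticket_price = 10.0
prize = 1_000_000.0
p_win = 1 / 200_000

expected_value = p_win * prize          # 5.0, less than the ticket price
net = expected_value - ticket_price     # -5.0 per ticket, on average

# Anticipated regret acts like an extra, purely emotional payoff for playing:
# the larger it looms, the more attractive the losing bet becomes.
regret_if_abstaining = 6.0              # assumed subjective cost of not playing
print(net + regret_if_abstaining > 0)   # True: regret tips the scale
```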

The prisoner's dilemma revisited. Not everyone will buy my assertion that cooperation begets more regret than defection does. Consider a payoff structure that is horrible for the sucker (T = 100, R = 90, P = 90, S = 0). Suppose you defected against (cheated on, if you wish) your friend on the expectation that he was already defecting against you. To your horror, you find out that he had not cheated. You: 100; he: 0. You are flooded with regret and switch to cooperation, hoping to set things right and to harvest the 90 + 90 payoff for the two of you. This scenario sounds moral, even moralistic: a story of sin and redemption. Alas, it overlooks my earlier claim that being a betrayed cooperator instills the greatest regret and hence the strongest motive to switch strategy. As you switch from defection to cooperation, your partner, who is also motivated by regret, does the opposite, and the two of you become ever more miserable. I think you should talk.
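Here is a sketch of that sad dance, using the payoffs from the paragraph above and a toy rule under which the regretful player switches after every mismatch:

```python
# Payoffs from the paragraph above: T = 100, R = 90, P = 90, S = 0.
payoffs = {('C', 'C'): (90, 90), ('C', 'D'): (0, 100),
           ('D', 'C'): (100, 0), ('D', 'D'): (90, 90)}

def regret_switch(my_move, other_move):
    """Toy rule: after a mismatch, the regretful player switches strategies."""
    if my_move != other_move:
        return 'C' if my_move == 'D' else 'D'
    return my_move

a, b = 'D', 'C'                  # you cheated; your friend did not
for _ in range(4):
    print(a, b, payoffs[a, b])
    a, b = regret_switch(a, b), regret_switch(b, a)
# D C (100, 0), C D (0, 100), D C (100, 0), C D (0, 100): out of phase forever
```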

[1] The translation is my own and comes with a bucket of salt (but no regrets). I read Montaigne in German because my French stinks (parce que mon français, ça pue).

Zeelenberg, M., & Pieters, R. (2004). Consequences of regret aversion: The case of the Dutch postcode lottery. Organizational Behavior and Human Decision Processes, 93, 155-168.
