A behavioral strategy that allows for mistakes and thus promotes cooperation
Mutual cooperation is a key element of how people work together. Whether it is friends doing favors for each other, animals exchanging food or aid, or nations coordinating policies, all of these are essentially cooperative interactions. Such interactions require individuals to be willing to help others, but also to push back when they are taken advantage of. But which rules ensure that cooperation can flourish without being exploited?
To investigate this question, Charlotte Rossetti and Christian Hilbe from the Max Planck Institute for Evolutionary Biology in Plön, together with collaborators from the Dalian Institute of Technology (China), use the so-called repeated prisoner’s dilemma. In this game, two players face the same decision at the same time: each can pay a small price to give the other player a financial advantage, or do nothing. Ideally, both players "cooperate" and pay this cost so that both receive the advantage. However, one player may also deviate, refuse to pay the cost, and thus pocket the advantage granted by the other player. How can this game be played in such a way that cooperation is possible and self-serving behavior is kept in check?
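A single round of this game can be sketched in a few lines of Python. The benefit and cost values below are illustrative choices, not parameters from the study:

```python
BENEFIT, COST = 3, 1  # illustrative values: helping costs 1 and grants the partner 3

def round_payoffs(action1, action2):
    """Return (payoff1, payoff2); 'C' = cooperate (pay the cost), 'D' = do nothing."""
    p1 = (BENEFIT if action2 == 'C' else 0) - (COST if action1 == 'C' else 0)
    p2 = (BENEFIT if action1 == 'C' else 0) - (COST if action2 == 'C' else 0)
    return p1, p2

print(round_payoffs('C', 'C'))  # (2, 2): both pay 1 and receive 3
print(round_payoffs('D', 'C'))  # (3, -1): the defector pockets the benefit for free
print(round_payoffs('D', 'D'))  # (0, 0): nobody gains anything
```

The dilemma is visible in the numbers: mutual cooperation (2 each) beats mutual defection (0 each), yet in any single round a unilateral defector earns more (3) than a cooperator.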
A classic strategy for the repeated prisoner’s dilemma is tit-for-tat: cooperate in the first round, and from then on copy whatever the partner did in the previous round. And indeed, in a society where people use tit-for-tat, cooperation can develop and flourish, but with one major drawback: if individuals sometimes make mistakes, mutual cooperation becomes unstable. "Tit-for-tat is a nice rule of thumb that is easy to implement and feels very human. After all, it’s based on the old adage of an eye for an eye," says Charlotte Rossetti of the Max Planck Institute for Evolutionary Biology. "But it’s not forgiving enough and doesn’t take into account the mistakes that we know are all too common in humans. If I accidentally make a mistake when I really wanted to cooperate, and then don’t cooperate again until you do too, then we’re out of sync."
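This out-of-sync effect is easy to reproduce. Here is a minimal Python sketch of two tit-for-tat players, with a single unintended defection injected in round three (the eight-round horizon is an arbitrary choice for illustration):

```python
def tit_for_tat(my_history, opp_history):
    """Cooperate in the first round, then copy the partner's previous move."""
    return 'C' if not opp_history else opp_history[-1]

h1, h2 = [], []
for rnd in range(8):
    a1 = tit_for_tat(h1, h2)
    a2 = tit_for_tat(h2, h1)
    if rnd == 2:
        a1 = 'D'  # player 1 slips once by mistake
    h1.append(a1)
    h2.append(a2)

print(''.join(h1))  # CCDCDCDC
print(''.join(h2))  # CCCDCDCD
```

After the single slip, the two players retaliate against each other in alternation indefinitely: exactly the instability Rossetti describes.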
To address these weaknesses, the researchers analyzed an alternative strategy they call "cumulative reciprocity," or CURE. Individuals who use CURE keep track of the imbalance of cooperative actions within an interaction: in each round, they check whether both players have cooperated equally often in the past, or whether the balance has tipped in favor of the other player. If the imbalance is zero or small enough, the strategy suggests cooperating. However, if the imbalance becomes too large, there is a risk of being taken advantage of, and the strategy suggests acting selfishly as well.
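One way to make this concrete is the following Python sketch. The imbalance measure (my cooperations minus the partner's) and the tolerance threshold of 1 are simplified illustrative assumptions, not necessarily the exact formulation used in the study:

```python
def cure_decision(my_history, opp_history, threshold=1):
    """Cooperate while the cooperation imbalance stays within a tolerance.

    Imbalance here counts how many more times I have cooperated than my
    partner; this count and the threshold are illustrative choices."""
    imbalance = my_history.count('C') - opp_history.count('C')
    return 'C' if imbalance <= threshold else 'D'

# Two CURE players; player 1 slips once by mistake in round three.
h1, h2 = [], []
for rnd in range(8):
    a1 = cure_decision(h1, h2)
    a2 = cure_decision(h2, h1)
    if rnd == 2:
        a1 = 'D'  # unintended error
    h1.append(a1)
    h2.append(a2)

print(''.join(h1))  # CCDCCCCC
print(''.join(h2))  # CCCCCCCC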
The first advantage of the CURE strategy is a practical one. By tracking a single number (the current imbalance), individuals can take the entire course of the interaction into account without having to store the outcome of each round in detail. This greatly simplifies the calculations and allowed the researchers to analyze the model comprehensively. To this end, the teams of Hilbe and Xia studied the mathematical properties of the strategy and performed extensive computer simulations, testing how cooperation evolves in different environments. The results show that CURE promotes fairness while allowing for errors. It can even prevail in a hostile environment: even in a population of egoists, cooperation can emerge.
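Why CURE is hard to exploit can be illustrated with a small pairwise match-up in Python. This is a toy sketch, not the evolutionary simulation from the study; the payoff values, imbalance measure, and tolerance threshold are all illustrative assumptions:

```python
def cure(my_h, opp_h, threshold=1):
    # Cooperate while the imbalance (my cooperations minus the partner's)
    # stays within the tolerated threshold; both definitions are illustrative.
    return 'C' if my_h.count('C') - opp_h.count('C') <= threshold else 'D'

def always_defect(my_h, opp_h):
    return 'D'

def play(strat1, strat2, rounds=100, benefit=3, cost=1):
    """Play a repeated donation game and return the two total payoffs."""
    h1, h2, p1, p2 = [], [], 0, 0
    for _ in range(rounds):
        a1, a2 = strat1(h1, h2), strat2(h2, h1)
        p1 += (benefit if a2 == 'C' else 0) - (cost if a1 == 'C' else 0)
        p2 += (benefit if a1 == 'C' else 0) - (cost if a2 == 'C' else 0)
        h1.append(a1)
        h2.append(a2)
    return p1, p2

print(play(cure, cure))           # (200, 200): stable mutual cooperation
print(play(cure, always_defect))  # (-2, 6): losses against a defector are capped early
```

Two CURE players sustain full cooperation over all 100 rounds, while against an unconditional defector CURE cooperates only until the imbalance exceeds its tolerance, so it cannot be milked indefinitely.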
Predicting human behavior
Another strength of CURE is its intuitiveness and simplicity, which makes it a good candidate for predicting real-world human behavior. To investigate this aspect in more detail, Charlotte Rossetti conducted an online experiment in which participants had the opportunity to play with another person for a small amount of money. The results show that CURE explains participants’ decisions better than other rules, especially when the experiment also simulates human error. The fact that people sometimes make mistakes when interacting with others can have a detrimental effect on cooperation. Therefore, any model that seeks to replicate human behavior as closely as possible must take this into account.
We know from psychology that most people in friendships and other close relationships do not keep precise accounts of who owes whom a favor. Instead, they have a general sense of whether the relationship is fair. CURE embodies this behavior well. Interestingly, the researchers’ approach does not require people to consciously choose such a strategy: rules of thumb like CURE can emerge naturally over time, and then help to enable mutual cooperation.