Tit for tat
Tit for tat is a highly effective strategy in game theory for the iterated prisoner's dilemma. It was first introduced by Anatol Rapoport in Robert Axelrod's two tournaments, held around 1980. Based on the English saying meaning "equivalent retaliation" ("tit for tat"), an agent using this strategy cooperates on the first move and then responds in kind to the opponent's previous action: if the opponent was previously cooperative, the agent cooperates; if not, the agent defects. This is similar to reciprocal altruism in biology.

Overview
This strategy depends on four conditions, which have allowed it to become the most prevalent strategy for the prisoner's dilemma:
# Unless provoked, the agent will always cooperate
# If provoked, the agent will retaliate
# The agent is quick to forgive
# The agent must have a good chance of competing against the opponent more than once.

In the last condition, the definition of "good chance" depends on the payoff matrix of the prisoner's dilemma. The important thing is that the competition continues long enough for repeated punishment and forgiveness to generate a long-term payoff higher than the possible loss from cooperating initially.

A fifth condition applies to make the competition meaningful: if an agent knows that the next play will be the last, it should naturally defect for a higher score. Similarly, if it knows that the next two plays will be the last, it should defect twice, and so on. The number of rounds must therefore not be known to the agents in advance.
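A minimal sketch of the decision rule in Python, assuming moves are encoded as the strings "C" (cooperate) and "D" (defect); the encoding and function name are illustrative rather than taken from the original tournament programs:

```python
def tit_for_tat(opponent_history):
    """Tit for tat: cooperate on the first move, then simply repeat
    the opponent's previous move ("C" = cooperate, "D" = defect)."""
    if not opponent_history:      # no history yet: open by cooperating
        return "C"
    return opponent_history[-1]   # otherwise mirror the opponent's last move
```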
Against a variety of alternative strategies, tit for tat was the most effective, winning in several annual automated tournaments against (generally far more complex) strategies created by teams of computer scientists, economists, and psychologists. Game theorists informally believed the strategy to be optimal (although no proof was presented).
Tit for tat remains the most effective strategy when the average performance of each competing team is compared. The team that recently beat a pure tit for tat team outperformed it with only some of its algorithms, because it submitted multiple algorithms that recognize each other and assume a master-and-slave relationship: one algorithm "sacrifices" itself, obtaining a very poor result, so that the other can outperform tit for tat on an individual basis (though not as a pair or group). Still, this "group" victory illustrates an important limitation of the prisoner's dilemma in representing social reality, namely that it includes no natural equivalent for friendship or alliances. The advantage of tit for tat thus pertains only to a Hobbesian world of rational solutions, not to a world in which humans are inherently social.

Example of play
The payoff matrix for this example of the prisoner's dilemma (payoffs listed as row player, column player):

                Cooperate    Defect
   Cooperate    3, 3         0, 5
   Defect       5, 0         1, 1

Assume there are four agents: two are tit for tat players ("variables") and two are "defectors", who simply try to maximize their own winnings by always giving evidence against the other. Assume that each player faces each of the other three in a match lasting six games. If one player gives evidence against a player who does not, the former gains 5 points and the latter nets 0. If both refrain from giving evidence, both gain 3 points. If both give evidence against each other, both gain 1 point.

When a variable faces off against a defector, the variable refrains from giving evidence in the first game while the defector does the opposite, gaining the defector 5 points. In the remaining five games, both players give evidence against each other, netting 1 point each game. The final score is: Defector - 10 | Variable - 5.
When the variables face off against each other, each refrains from giving evidence in all 6 games. 6 * 3 = 18 points, the final score being Variable(1) - 18 | Variable(2) - 18.
When the defectors face off, each gives evidence against the other in all 6 games. 6 * 1 = 6 points, the final score being Defector(1) - 6 | Defector(2) - 6.
The final score for each variable is 5 (game against defector(1)) + 5 (game against defector(2)) + 18 (game against variable) = 28 points. The final score for each defector is 10 (against variable(1)) + 10 (against variable(2)) + 6 (against defector) = 26 points.
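For concreteness, these totals can be reproduced with a short round-robin simulation. The Python sketch below uses the payoff values from the matrix above; the function and player names are illustrative. It prints 28 points for each variable and 26 for each defector.

```python
# Payoff to the first player for (my_move, opponent_move), matching the
# matrix above: mutual cooperation 3, mutual defection 1,
# lone defector 5, lone cooperator 0.
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def tit_for_tat(opponent_history):
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play_match(strategy_a, strategy_b, rounds=6):
    """Play one six-game match and return (score_a, score_b)."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)   # each player sees the other's past moves
        move_b = strategy_b(history_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

players = {
    "Variable 1": tit_for_tat,
    "Variable 2": tit_for_tat,
    "Defector 1": always_defect,
    "Defector 2": always_defect,
}

totals = {name: 0 for name in players}
names = list(players)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        score_i, score_j = play_match(players[names[i]], players[names[j]])
        totals[names[i]] += score_i
        totals[names[j]] += score_j

print(totals)   # variables finish with 28 points each, defectors with 26
```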
Despite the fact that the variables never won a match and the defectors never lost a match, the variables still came out ahead, because the final score is determined not by who wins matches but by who scores the most points. Simply put, the variables gained more points tying with each other than they lost to the defectors. The more variables there are in the game, the greater the advantage of being a variable.
(This example was taken from Piers Anthony's novel "Golem in the Gears".)

Implications
The success of the strategy, which is largely cooperative, took many by surprise. In successive competitions various teams produced complex strategies which attempted to "cheat" in a variety of cunning ways, but Tit for Tat eventually prevailed in every competition.
Some theorists believe this result may give insight into how groups of animals (and particularly human societies) have come to live in largely (or entirely) cooperative societies, rather than in the individualistic, "red in tooth and claw" way that might be expected from individuals engaged in a Hobbesian state of nature. This, and particularly its application to human society and politics, is the subject of Robert Axelrod's book "The Evolution of Cooperation".

Problems
While Axelrod has empirically shown that the strategy is optimal in some cases, two agents playing tit for tat remain vulnerable. A one-time, single-bit error in either player's interpretation of events can lead to an unending "death spiral". In this symmetric situation, each side perceives itself as preferring to cooperate, if only the other side would. But each is forced by the strategy into repeatedly punishing an opponent who continues to attack despite being punished in every game cycle. Both sides come to think of themselves as innocent and acting in self-defense, and their opponent as either evil or too stupid to learn to cooperate.
This situation frequently arises in real-world conflicts, ranging from schoolboy fights to civil and regional wars. Tit for two tats could be used to avoid this problem.
"Tit for Tat with forgiveness" is sometimes superior. When the opponent defects, the player will occasionally cooperate on the next move anyway. This allows for recovery from getting trapped in a cycle of defections. The exact probability that a player will respond with cooperation depends on the line-up of opponents.
The reason for these issues is that tit for tat is not a subgame perfect equilibrium. [Gintis, Herbert (2000). Game Theory Evolving. Princeton University Press. ISBN 0691009430.] If one agent defects while the opponent cooperates, the two agents end up alternating between cooperation and defection, yielding a lower payoff than if both continually cooperated. While this subgame is not directly reachable by two agents playing tit for tat, a strategy must be a Nash equilibrium in every subgame to be subgame perfect. Furthermore, this subgame can be reached if any noise is allowed in the agents' signaling. A subgame perfect variant of tit for tat known as "contrite tit for tat" may be created by employing a basic reputation mechanism. [Boyd, Robert (1989). "Mistakes Allow Evolutionary Stability in the Repeated Prisoner's Dilemma Game". Journal of Theoretical Biology 136: 47–56. doi:10.1016/S0022-5193(89)80188-2.]

Tit for two tats
Tit for two tats is similar to tit for tat in that it is nice, retaliating, forgiving and non-envious; the only difference between the two is how nice the strategy is.
In a tit for tat strategy, once an opponent defects, the tit for tat player immediately responds by defecting on the next move. This has the unfortunate consequence of causing two retaliatory strategies to defect against one another continuously, resulting in a poor outcome for both players. A tit for two tats player will let the first defection go unchallenged as a means of avoiding the "death spiral" described above. Only if the opponent defects twice in a row will the tit for two tats player respond by defecting.
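A minimal sketch of this rule, under the same illustrative "C"/"D" encoding (the opponent's history is assumed to be a list of past moves):

```python
def tit_for_two_tats(opponent_history):
    """Cooperate unless the opponent has defected on each of the last
    two moves; a single defection is let go unchallenged."""
    if opponent_history[-2:] == ["D", "D"]:   # two consecutive defections
        return "D"
    return "C"
```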
This strategy was put forward by Robert Axelrod during his second round of computer simulations at RAND. After analyzing the results of the first experiment, he determined that had a participant entered the tit for two tats strategy, it would have emerged with a higher cumulative score than any other program. As a result, he himself entered it, with high expectations, in the second tournament. Because of the more aggressive nature of the programs entered in the second round, however, tit for two tats did significantly worse than tit for tat: aggressive strategies were able to take advantage of its highly forgiving nature.

Popular culture
The tit for tat strategy was employed in an episode of "Numb3rs", in which FBI agents were interrogating and attempting to obtain information from an inmate on death row. The strategy was working, but the FBI would not implement a "tit for two tats".

Live and Let Live
The tit for tat strategy has been detected by analysts in the spontaneous non-violent behaviour, called "live and let live", that arose during the First World War. The Christmas truce of 1914 appears to be an example.

See also
* An eye for an eye
* Trigger strategy (a set of strategies of which tit for tat is a member)
* Quid pro quo
* Chicken (game)
* Nice Guys Finish First, a documentary by Richard Dawkins that discusses tit for tat
External links
* [http://www.wired.com/news/culture/0,1284,65317,00.html Wired magazine story about Tit for Tat being 'defeated' by a group of collaborating programs]
* [http://www2.owen.vanderbilt.edu/mike.shor/Courses/GTheory/docs/Axelrod.html Explanation of Tit for tat on Australian Broadcasting Corporation]
* [http://journal.ilovephilosophy.com/Article/Can-cooperation-every-occur-without-the-state-/1130 Article on Tit for Tat and its success in evolutionary cooperation]