Book Review: The Evolution of Cooperation by Robert Axelrod


(Basic Books, 10 East 53rd Street, New York, New York 10022), 1984 • 241 pages, index • $17.95

In the belief that nice guys always finish last in the marketplace, an arena of harsh Darwinian natural selection, many propose to ration freedom. The Invisible Hand, assuring that the market works to the advantage of all as each pursues his own self-interest, is in such disrepute that Axelrod doesn’t seem to recognize that his experiment in game theory has given this “myth” a solid theoretical foundation.

Nature is not always red in tooth and claw. A close study of biology reveals abundant instances of cooperation, even apparent altruism. Human history shows that bitter enemies may practice reciprocity under certain circumstances, as in the trench warfare of World War I, when both sides frequently refrained from shooting. Cooperation among rivals in business may develop all too readily in Axelrod’s view; understanding the mechanism may help prevent collusion.

Axelrod’s paradigm for the evolution of cooperation is the game of Prisoner’s Dilemma, invented about 1950 and the subject of a voluminous literature, particularly in the field of psychology. Though one round of this game evokes dog-eat-dog competition, in the iterated version, straightforward cooperation outcompetes deviousness and treachery, rather to everyone’s astonishment.

The classic Prisoner’s Dilemma is employed by prosecutors to get accomplices in crime to inform on each other. An easily understood variant is a business transaction. Suppose that a man who possesses a bag of money wishes to obtain a bag of diamonds. He and a diamond dealer are able to work out mutually agreeable terms. However, for some reason, the trade must take place in secret. Each must simultaneously leave his bag at a different spot in the woods. By cooperation, each can obtain something he values, the Reward (R). But there is always the Temptation (T) to get something for nothing, and leave the other fellow with the Sucker’s Payoff (S), an empty bag in exchange for a full one. If both parties “defect,” both will get an empty bag, the Punishment (P). If they both know that they will never have to deal with each other again, each could arrive, by impeccable logic, at the conclusion that he would be better off leaving an empty bag, regardless of what the other does.

Introducing the prospect of an indefinitely large number of future encounters between the same individuals changes the situation dramatically. The supposed short-term advantage of defection may be outweighed by the long-term advantage of cooperation. While an authority would be required to enforce the contract in the one-shot case, in the iterated Prisoner’s Dilemma honesty becomes the best policy, and to a large extent a self-policing one.

Axelrod set up an ingenious computer tournament in which the winner was the program amassing the largest number of points in a round-robin Prisoner’s Dilemma of about 200 encounters. Entries were submitted by political scientists, economists, psychologists, biologists, mathematicians, and computer scientists. At each encounter, two programs simultaneously decided to cooperate or defect. Each could remember the history of previous interactions with the other individual. For mutual cooperation, both were awarded three points (R). Mutual defection earned one point each (P). If just one program cooperated, it received no points (S), and its exploiter got five points (T). The winner was the simplest of all the rules: called TIT FOR TAT, it defected if and only if the other program had defected on the last previous encounter. Even more surprisingly, all of the eight top-ranking entries were “nice”; that is, they never defected first, at least not until near the end of the game. The “meanies,” which tried to take advantage of the programs that cooperated, often by clever and devious methods, were defeated by a wide margin.
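The tournament mechanics described above can be captured in a few lines. The sketch below uses the review’s payoffs (R=3, P=1, S=0, T=5) and a 200-round match; the strategy functions here are illustrative stand-ins, not the actual tournament entries.

```python
# Payoff table from the review: (my move, opponent's move) -> my points.
# 'C' = cooperate, 'D' = defect.
PAYOFF = {
    ('C', 'C'): 3,  # Reward
    ('D', 'D'): 1,  # Punishment
    ('C', 'D'): 0,  # Sucker's Payoff
    ('D', 'C'): 5,  # Temptation
}

def tit_for_tat(my_history, opp_history):
    """Cooperate first; thereafter, mirror the opponent's last move."""
    return opp_history[-1] if opp_history else 'C'

def always_defect(my_history, opp_history):
    """A 'meanie': defects unconditionally."""
    return 'D'

def play_match(strat_a, strat_b, rounds=200):
    """Return each side's total points over an iterated match."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strat_a(hist_a, hist_b)
        move_b = strat_b(hist_b, hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

# Two TIT FOR TAT players cooperate every round: 3 points apiece.
print(play_match(tit_for_tat, tit_for_tat))    # (600, 600)
# Against ALL-D, TIT FOR TAT is exploited exactly once, then retaliates.
print(play_match(tit_for_tat, always_defect))  # (199, 204)
```

Note how the second match illustrates the review’s point: TIT FOR TAT loses that particular encounter (199 to 204), yet its ability to sustain mutual cooperation with other “nice” rules is what wins the round-robin overall.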

An evolutionary biologist, John Maynard Smith, extended the game to populations. A “community” of individuals using a TIT FOR TAT strategy cannot be successfully “invaded” by a group of “meanies,” because the “natives” do so well when dealing with each other. On the other hand, a population of individuals that always behave treacherously can be “invaded” or can be “converted” by “nice” strategies, providing only that a large enough cluster of individuals is introduced so that the nice guys have a significant chance of meeting each other.
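The cluster condition can be made concrete with a back-of-the-envelope calculation. Assuming a well-mixed population and the 200-round match totals implied by the review’s payoffs (TFT vs. TFT earns 600 each; TFT vs. ALL-D earns 199 against 204; ALL-D vs. ALL-D earns 200 each), we can solve for the smallest invading fraction; this simplification is mine, not the book’s exact derivation.

```python
# Per-match totals for a 200-round game with payoffs R=3, P=1, S=0, T=5.
TFT_VS_TFT = 600   # mutual cooperation throughout
TFT_VS_D   = 199   # suckered once, then mutual defection
D_VS_TFT   = 204   # exploits once, then mutual defection
D_VS_D     = 200   # mutual defection throughout

def smallest_invading_fraction():
    """Smallest fraction p of TIT FOR TAT partners at which a TFT
    invader outscores a native defector, in this simplified
    well-mixed model:
        p*600 + (1-p)*199  >  p*204 + (1-p)*200
    which reduces to 397*p > 1."""
    return 1 / ((TFT_VS_TFT - TFT_VS_D) - (D_VS_TFT - D_VS_D))

print(smallest_invading_fraction())  # ~0.0025
```

Under these assumptions, a cluster amounting to barely a quarter of one percent of the population is enough for the nice guys to gain a foothold, which is why the invasion works with so small a beachhead.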

Axelrod draws some extremely significant conclusions: “Mutual cooperation can emerge in a world of egoists without central control by starting with a cluster of individuals who rely on reciprocity.” Furthermore, he notes that our robust hero TIT FOR TAT is not envious. It cannot receive more points than any rival in a series of encounters, and is frequently defeated, though not by much. Its success results from eliciting cooperative behavior from other players using many different strategies. Besides being “nice,” TIT FOR TAT is “forgiving”—it retaliates only once for each episode of defection, minimizing the chance of an unending “feud.” However, its “provocability” is essential for deterring “bullies.” Strategies that are too forgiving, or do not retaliate immediately, are unable to survive in a hostile environment.

The possibility of cooperation depends on the rules of the game. The foundation of cooperative relationships is not necessarily trust, but durability; future encounters must be anticipated. Furthermore, the payoff matrix must reward mutual cooperation; that is, unlike chess, Prisoner’s Dilemma is not a zero-sum game.

Although Axelrod explores many different applications of his findings, from biological evolution to arms control, one might wish he had speculated on the implications of current trends in society, especially in his own field of political science. The drift toward a planned economy is altering the payoff equations. The shift from individual to collective responsibility tends to diminish the “shadow of the future.” The concept of life as a zero-sum game reduces the Reward. Rapid, arbitrary changes dictated by the legislature, the courts, and the bureaucracy can increase the Temptation, while also discounting the reliability of future rewards. All these changes tend to destroy the conditions necessary for spontaneous cooperation. Not surprisingly, they are accompanied by pressures for more regulation. Just as in the single-round version of Prisoner’s Dilemma, in a socialist economy it is always advantageous to cheat (if not essential for survival).

Besides being profoundly important for all the social sciences, this work is a delight, and even an inspiration, to read. For scholars, it has nearly 200 references, and for those who remember some algebra, there are proofs in the appendix. Yet all with a high school education should be able to follow the lucid, elegantly simple argument.


December 1984
