[Update, 9/7/2011: A comment at Cheap Talk mentioned a new paper by Nicholas Bardsley which I find quite relevant to the final paragraph of this post. Essentially, Bardsley is able to completely change (as far as I’m concerned) the “sharing” characteristic of the dictator game just by changing the action set available to players; if the dictator can also “take” money, and not simply share, then indeed they take. The Hawthorne Effect Is Real, shout the villagers from the mountaintop.]
Here is one more experimental paper, which I believe is forthcoming in the AER as well. Experimentalists love the Ultimatum Game. In the Ultimatum Game, two anonymous people are matched and one of them is given X dollars. She is told to propose a split of the money between herself and the other player. The other player can then either accept his share of the split or reject it, in which case both parties get nothing. Tons of experiments over the past 20 years, everywhere from U.S. undergraduate labs to tribes in the Amazon, have found offers that tend to be rather high (30-50% of the stake) and high rejection rates of low offers. This is “strange” (more on this shortly) to economists because the unique subgame perfect Nash equilibrium is for the proposer to offer one penny and for the responder to accept. Even if you think that the so-called paradox is nothing of the sort – rather, people are unused to one-shot games and are instead trying to develop reputation in a repeated game called Life – there is an even stranger stylized fact: changing stakes doesn’t seem to affect behavior. That is, whether the stakes are 1 dollar, 10 dollars or 100 dollars, people still reject low offers. Why aren’t people responding to incentives at all?
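To make the backward-induction claim concrete, here is a minimal sketch of the one-shot game’s logic. This is my own illustration, not taken from any of the papers discussed; the $100 stake (in cents), the one-cent offer grid, and the 30% rejection threshold are all hypothetical numbers.

```python
# A toy sketch of backward induction in a one-shot ultimatum game.
# All numbers (a $100 stake in cents, a 30% rejection threshold) are
# hypothetical illustrations, not values from the papers discussed above.

def responder_accepts(offer_cents, stake_cents, min_share=0.0):
    """A purely money-maximizing responder (min_share = 0) accepts any
    positive offer; a fairness- or spite-driven responder rejects offers
    below some threshold share of the stake."""
    return offer_cents > 0 and offer_cents / stake_cents >= min_share

def proposer_best_offer(stake_cents, min_share=0.0):
    """The proposer's best move is the smallest offer the responder will
    accept, since a rejection leaves both players with nothing."""
    for offer in range(1, stake_cents + 1):
        if responder_accepts(offer, stake_cents, min_share):
            return offer
    return None  # the responder would reject everything

# Money-maximizing responder: the subgame perfect prediction is one cent.
print(proposer_best_offer(10_000))                 # -> 1 (a penny of $100)

# Responder who rejects anything below 30% of the stake: offers jump to $30.
print(proposer_best_offer(10_000, min_share=0.3))  # -> 3000 (cents)
```

The point is only that the penny-offer prediction follows immediately once the responder is assumed to care solely about money; any rejection threshold above zero pushes the predicted offer up one-for-one with that threshold.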
I remember a study a few years ago, from Indonesia perhaps, where many days’ worth of wages were being rejected seemingly out of spite. (And speaking of spite, ultimatum game papers are great examples of economists abusing language. One man’s “unfair offers were consistently rejected” is another man’s “primitive spite seems more important to responders than rational thought.”)
Andersen et al. (more on this also in a second) play the ultimatum game in India using stakes that range up to a year’s income. And unsurprisingly, stakes matter a lot. At the year’s-income stakes, only one offer was rejected, no matter how low the split, and that offer was less than 10% of the stake. As stakes increase from 20 rupees up to 20,000, the rejection rate for a given split falls, though it seems to fall fastest when stakes get very large. The takeaway: even given all of the experimental results on the Ultimatum Game, spite is probably not terribly important vis-a-vis more standard incentives across the range of “very important economic phenomena.” None of this is to say that CEOs won’t cost their firm millions out of spite – surely they sometimes do – but claims that human nature is hardwired for fairness or spite (or whatever you want to call it) even at the expense of standard maximizing behavior are limited claims indeed.
Two final notes here. First, I think economists need to come to some conclusion concerning norms on experimental papers. Econ has long had a standard of giving author billing only to those who were essential for the idea and the completion of a paper – rarely has this meant more than three authors. Credit for data collection, straightforward math, coding, etc. has generally been given in the acknowledgments. A lot of econ psych and experimental work strikes me as fighting that norm: five and six authors have become standard. (I should caveat this by saying that in the present paper, I have no idea how workload was divided; rather, I think it’s undeniable that more generally the work expected of a coauthor in experimental papers is lower than that which was traditional in economics.)
Second, and I’m sure someone has done this but I don’t have a cite, the “standard” instructions in ultimatum games seem to prime the results to a ridiculous degree. Imagine the following exercise. Give 100 dollars to a research subject (Mr. A). Afterwards, tell some other subject (Ms. B) that 100 dollars was given to Mr. A. Tell Mr. A that the other subject knows he was given the money, but don’t prime him to “share” or “offer a split” or anything similar. Later, tell Ms. B that she can, if she wishes, reverse the result and take the 100 dollars away from Mr. A; if she does so, any money Mr. A happened to have given her would also be taken. I hope we can agree that if you ran such an experiment, A would share no money and B would show no spite, as neither has been primed to see the 100 dollars as something that should have been shared in the first place. One doesn’t normally expect anonymous strangers to share their good fortune with you, surely. That is, feelings of spite, jealousy and fairness can be, and are, primed by researchers. I think this is worth keeping in mind when trying to apply the experimental results on ultimatum games to the real economy.
http://openarchive.cbs.dk/bitstream/handle/10398/8244/ECON_wp1-2011.pdf?sequence=1 (January 2011 working paper, forthcoming in the AER)
So people are rational enough to respond to changes in the incentives, but also irrational enough to be affected by the Hawthorne Effect? Sounds tricky, even if it is true.
-Brad
Indians love pleasing the white man!
😛