“The Ambiguity Aversion Literature: A Critical Assessment,” N. al-Najjar & J. Weinstein (2009)

A wide class of decision-theoretic models has recently attempted to model “ambiguity”, or the lack of firm knowledge about a probability. This matters because the traditional Bayesian literature draws no distinction between p(a)=p(b)=.5 stated because I have no idea what the true probability is, and p(a)=p(b)=.5 stated because, after updating on many pieces of data, I am confident that .5 is about right for each. The best-known model of ambiguity aversion is the multiple priors model of Gilboa & Schmeidler, in which agents hold a set of priors over some events and, when evaluating a decision, assume that the “worst-case” prior in that set is the true one. This model essentially comes from dropping the Sure Thing principle in Savage, and it allows Ellsberg-type paradoxes to be rationalized.
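
A minimal numerical sketch (mine, not the paper's) may make the worst-case mechanics concrete. In the three-color Ellsberg urn (30 red balls, plus 60 balls that are black or yellow in unknown proportion), a multiple priors agent who evaluates every bet at its worst-case prior strictly prefers betting on red to betting on black, yet prefers betting on black-or-yellow to betting on red-or-yellow, a pattern no single prior can deliver. The urn, payoffs, and prior set below are my own illustration:

```python
# Three-color Ellsberg urn: 30 red balls, 60 black+yellow in unknown proportion.
# A multiple-priors (maxmin) agent evaluates each bet at its worst-case prior.

def maxmin_value(bet, priors):
    """Worst-case expected payoff of `bet` (dict: color -> payoff) over a set of priors."""
    return min(sum(p[c] * bet[c] for c in bet) for p in priors)

# The set of priors: red is known to be 1/3; black can be anything from 0 to 60 balls of 90.
priors = [{"red": 30/90, "black": b/90, "yellow": (60 - b)/90} for b in range(0, 61)]

bets = {
    "bet on red":             {"red": 1, "black": 0, "yellow": 0},
    "bet on black":           {"red": 0, "black": 1, "yellow": 0},
    "bet on red or yellow":   {"red": 1, "black": 0, "yellow": 1},
    "bet on black or yellow": {"red": 0, "black": 1, "yellow": 1},
}

for name, bet in bets.items():
    print(f"{name}: worst-case expected payoff = {maxmin_value(bet, priors):.3f}")

# Output: red (0.333) beats black (0.000), and black-or-yellow (0.667) beats
# red-or-yellow (0.333), which is exactly the Ellsberg pattern; no single prior
# rationalizes both strict preferences at once.
```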

Al-Najjar and Weinstein find such models wanting. First, models like multiple priors, in addition to rationalizing the Ellsberg paradox, also rationalize behavior like the sunk cost fallacy that nearly all economists would agree is normatively irrational. Second, these sorts of models lead to problems with dynamic updating. In particular, new information can lead an agent to change not just his probabilities about acts, but his preferences over those acts themselves. If the decisionmaker is “sophisticated” and does not accept such reversals, then he may ignore new information altogether, even when that information is useful. Both of these behaviors are normatively unappealing.
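
The dynamic updating problem can be seen in the same urn (again my own sketch, not the authors' example). If the agent updates every prior in his set by Bayes' rule after learning the drawn ball is not yellow, the ex ante ranking of two bets reverses once the news arrives:

```python
# Prior-by-prior Bayesian updating of a multiple-priors agent can reverse ex-ante rankings.
# Same urn: 30 red, 60 black+yellow in unknown proportion. (Illustrative sketch only.)

def maxmin_value(bet, priors):
    return min(sum(p[c] * bet[c] for c in bet) for p in priors)

def condition(priors, event):
    """Update every prior in the set by Bayes' rule on `event` (a set of colors)."""
    updated = []
    for p in priors:
        pe = sum(p[c] for c in event)
        if pe > 0:
            updated.append({c: (p[c] / pe if c in event else 0.0) for c in p})
    return updated

priors = [{"red": 30/90, "black": b/90, "yellow": (60 - b)/90} for b in range(0, 61)]

bet_ry = {"red": 1, "black": 0, "yellow": 1}   # bet on red or yellow
bet_by = {"red": 0, "black": 1, "yellow": 1}   # bet on black or yellow

print("ex ante:    red-or-yellow =", round(maxmin_value(bet_ry, priors), 3),
      " black-or-yellow =", round(maxmin_value(bet_by, priors), 3))

posteriors = condition(priors, {"red", "black"})  # learn: the ball is not yellow
print("not yellow: red-or-yellow =", round(maxmin_value(bet_ry, posteriors), 3),
      " black-or-yellow =", round(maxmin_value(bet_by, posteriors), 3))

# Ex ante the agent prefers black-or-yellow (0.667 vs 0.333); conditional on "not yellow"
# the worst-case values become 0.333 vs 0.000, so the ranking reverses. This is the kind
# of dynamic inconsistency the authors criticize.
```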

The authors suggest that the “simple” answer for Ellsberg is the most sensible one: people use heuristics in unfamiliar situations such as laboratory experiments, and people who make Ellsberg-type mistakes would not make them if the consequences of such behavior were explained to them. Of course, there are many explanations for Ellsberg aside from loosening the Sure Thing principle. I particularly like a version of multiple priors in which, from the point of view of the analyst, the decisionmaker holds a set of priors, updates each element of that set according to Bayes, but then evaluates any given decision using any capacity (a mathematical generalization of a probability) consistent with those priors. This model is lacking in that it cannot pin down behavior, but it does allow the decisionmaker to, for instance, avoid the sunk cost fallacy while still being susceptible to Ellsberg behavior in a lab.
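
To make that suggestion concrete (this is my own reading of “a capacity consistent with those priors”, not a construction from the paper), one natural candidate is the lower envelope of the prior set, ν(A) = min over priors p of p(A), with acts evaluated by the Choquet integral against it. The sketch below is purely illustrative:

```python
# One capacity consistent with a set of priors: the lower envelope nu(A) = min_p p(A).
# Acts are then evaluated by the Choquet integral against nu. (Illustrative sketch.)

priors = [{"red": 30/90, "black": b/90, "yellow": (60 - b)/90} for b in range(0, 61)]

def lower_envelope(event):
    """nu(A): the smallest probability any prior in the set assigns to event A."""
    return min(sum(p[c] for c in event) for p in priors)

def choquet(bet, capacity):
    """Choquet integral of a finite-outcome bet (dict: color -> payoff) w.r.t. a capacity."""
    # Sort states from highest to lowest payoff, then sum marginal capacity weights.
    states = sorted(bet, key=bet.get, reverse=True)
    total, prev, upper = 0.0, 0.0, []
    for s in states:
        upper.append(s)
        w = capacity(upper)
        total += bet[s] * (w - prev)
        prev = w
    return total

bet_black = {"red": 0, "black": 1, "yellow": 0}
print("Choquet value of a bet on black:", round(choquet(bet_black, lower_envelope), 3))

# For this particular (convex) capacity the Choquet value coincides with the maxmin
# value (here 0.0), but other capacities consistent with the same prior set would give
# different numbers, which is why this model cannot pin down behavior.
```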

http://www.kellogg.northwestern.edu/~/media/Files/Faculty/Research/ArticlesBookChaptersWorkingPapers/AmbiguityFinal.ashx (Final working paper; final version published in Economics and Philosophy 25 (2009))

One thought on ““The Ambiguity Aversion Literature: A Critical Assessment,” N. al-Najjar & J. Weinstein (2009)”

  1. Jonathan Weinstein says:

    This post appeared within 24 hours of my viewing this blog for the first time! Coincidence? There is a possible non-coincidental explanation…

    The summary is mostly fine, but I find “new information can lead an agent to change not just his probabilities about acts, but his preferences over those acts themselves” to be confusing. We did object to the lack of separation between tastes and beliefs, in the sense that (in the sunk-cost example) your utility at an event that can no longer happen can change your beliefs over future events.

    Also, a major additional theme both here and in our previous paper "Comparative Testing of Experts" is: the idea that a true probability really exists and we just don't know it is useless and leads to confusion. This statement needs much more fleshing out than I will do here; see the end of p. 32 and p. 33.
