If you know the probability theorist Bruno de Finetti, you know him either for his work on exchangeable processes or for his legendary defense of finite additivity. Finite additivity essentially replaces the Kolmogorov assumption of countable additivity of probabilities. If Pr(i) for i=1 to N is the probability of mutually exclusive events i, then the probability of the union of all i is just the sum of the individual probabilities under either countable or finite additivity; countable additivity requires that this property hold even for a countably infinite collection of events.
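In symbols (a standard textbook statement, not Howson's notation), the two axioms differ only in how many pairwise disjoint events they cover:

\[
\text{Finite additivity:}\quad \Pr\Big(\bigcup_{i=1}^{N} A_i\Big) = \sum_{i=1}^{N} \Pr(A_i), \qquad
\text{Countable additivity:}\quad \Pr\Big(\bigcup_{i=1}^{\infty} A_i\Big) = \sum_{i=1}^{\infty} \Pr(A_i).
\]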
What is objectionable about countable additivity? There are three classic problems. First, countable additivity rules out some very reasonable subjective beliefs. For instance, I might imagine that a Devil is going to pick one of the integers, and that he is equally likely to pick any given number. That is, my prior is uniform over the integers. Countable additivity does not allow this: if the probability of any given number being picked is greater than zero, then the sum of those probabilities diverges, and if the probability of any given number being picked is zero, then by countable additivity the probability of the grand set (the union of all the singletons) is also zero, violating the usual axiom that the grand set has probability 1. The second problem, loosely related to the first, is that under countable additivity I literally cannot assign probabilities to some objects, such as a nonmeasurable set; merely finitely additive probabilities, by contrast, can be extended to every subset.
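To see the dilemma in two lines (my arithmetic, not the paper's): a uniform prior assigns every integer the same probability ε, so

\[
\varepsilon > 0 \;\Rightarrow\; \sum_{n} \Pr(\{n\}) = \infty, \qquad
\varepsilon = 0 \;\Rightarrow\; \Pr\Big(\bigcup_{n} \{n\}\Big) \overset{\text{c.a.}}{=} \sum_{n} \Pr(\{n\}) = 0 \neq 1.
\]

Finite additivity constrains only finite unions, so assigning zero to every singleton while giving the whole set probability 1 is perfectly admissible.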
The third problem, though, is the really worrying one. To the extent that a theory of probability has epistemological meaning and is not simply a mathematical abstraction, we might want to require that it not contradict well-known philosophical premises. Imagine that every day, nature selects either 0 or 1. Let us observe 1 every day until the present (call this day N). Let H be the hypothesis that nature will select 1 every day from now until infinity. It is straightforward to show that under countable additivity, continued observation of 1 drives the posterior probability of H to 1 as N grows large (at least whenever the prior on H is strictly positive). But this is just saying that induction works! And if there is any great philosophical advance in the modern era, it is Hume's (and Goodman's, among others) demolition of the idea that induction is sensible. My own introduction to finite additivity comes from a friend's work on consensus formation and belief updating in economics: we certainly don't want to bake in ridiculous conclusions about beliefs that rely entirely on countable additivity, given how strongly that assumption militates for induction. Aumann was always very careful on this point.
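A sketch of the standard argument (my gloss, not necessarily Howson's presentation): let E_N be the event that 1 is observed on each of the first N days, so E_1 ⊇ E_2 ⊇ … and H is the intersection of all the E_N. Countable additivity is equivalent, given finite additivity, to continuity along monotone sequences of events, hence

\[
\Pr(E_N) \;\downarrow\; \Pr(H), \qquad\text{so}\qquad \Pr(H \mid E_N) = \frac{\Pr(H \cap E_N)}{\Pr(E_N)} = \frac{\Pr(H)}{\Pr(E_N)} \;\longrightarrow\; 1 \quad\text{whenever } \Pr(H) > 0.
\]

Under mere finite additivity, Pr(E_N) need not converge to Pr(H), and the inductive conclusion is not forced.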
It turns out that if you simply replace countable additivity with finite additivity, all of these problems (among others) go away. Howson, in a paper in the newest issue of Synthese, asks why, given that clear benefit, anyone still finds countable additivity justifiable. Surely there are lots of pretty theorems, from Radon-Nikodym on down, that require countable additivity, but if a theorem critically hinges on an unjustifiable assumption, then what exactly are we to infer about the justifiability of the theorem itself?
Two serious objections are tougher to deal with for de Finetti acolytes: coherence and conditionalization. Coherence, a principle closely associated with de Finetti himself, says that there should be no set of bets, each fair given your beliefs, on which you are guaranteed to lose money. It is sometimes claimed that a uniform prior over the naturals is not coherent: you are willing to take a bet that any given natural number will not be drawn, but the conjunction of such bets over all natural numbers means you will lose money with certainty. This isn't too worrying, though; if we reject countable additivity, then why should we define coherence to apply to infinite conjunctions of bets?
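One standard way to spell out the alleged Dutch book (my gloss, not a quotation from the paper): with the uniform prior, each event {X = n} has probability zero, so for every n you regard it as fair to sell for nothing a bet that pays 1 if X = n. Sell all of them; exactly one number is drawn, so

\[
\text{total stakes received} = \sum_{n} 0 = 0, \qquad \text{total payout} = 1,
\]

a sure loss of 1, but only because infinitely many bets were accepted at once, which is exactly the step the finite additivist rejects.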
Conditionalization is more problematic. It means that given prior P(i), your posterior P(f) of event S after observing event E must be such that P(f)(S)=P(i)(S|E). This is just “Bayesian updating” off of a prior. Lester Dubins pointed out the following. Let A and B be two mutually exclusive hypotheses, such that P(A)=P(B)=.5. Let the random quantity X take positive integer values such that P(X=n|B)=0 (you have a uniform prior over the naturals conditional on B obtaining, which finite additivity allows), and P(X=n|A)=2^(-n). By the law of total probability, P(X=n)>0 for all n, and therefore by Bayes’ Theorem, P(A|X=n)=1 and P(B|X=n)=0, no matter which n obtains! Something is odd here. Before seeing the resolution of X, you would take a fair bet on B obtaining. But once n obtains (no matter which n!), you are guaranteed to lose money by betting on B.
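Working through Bayes’ Theorem for any particular n (my arithmetic, just spelling out the step in the text):

\[
\Pr(X=n) = \Pr(A)\Pr(X=n \mid A) + \Pr(B)\Pr(X=n \mid B) = \tfrac12 \cdot 2^{-n} + \tfrac12 \cdot 0 = 2^{-(n+1)} > 0,
\]
\[
\Pr(A \mid X=n) = \frac{\Pr(A)\Pr(X=n \mid A)}{\Pr(X=n)} = \frac{\tfrac12 \cdot 2^{-n}}{2^{-(n+1)}} = 1, \qquad
\Pr(B \mid X=n) = \frac{\Pr(B)\Pr(X=n \mid B)}{\Pr(X=n)} = 0.
\]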
Here is where Howson tries to save de Finetti with an unexpected tack. The problem in Dubins’ example is not finite additivity, but conditionalization – Bayesian updating from priors – itself! Here’s why. By a principle called “reflection”, if, using a suitable updating rule, your future probability of event B is p with certainty, then your current probability of event B must also be p. By Dubins’ argument, then, P(B)=0 must hold before X realizes. But that means your prior must be 0, which means that whatever independent reasons you had for the prior being .5 must be rejected. If we are to give up one of Reflection, Finite Additivity, Conditionalization, Bayes’ Theorem, or the Existence of Priors, Howson says we ought to give up conditionalization. Now, there are lots of good reasons why conditionalization is sensible within a utility framework, so at this point I will simply point you toward the full paper and let you decide for yourself whether Howson’s conclusion is sensible. In any case, the problems with countable additivity should be better known by economists.
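Formally, the reflection step runs roughly as follows (my paraphrase, not Howson’s exact statement): write P_new for the probability you will hold after learning the value of X. Dubins’ construction guarantees that P_new(B) = 0 whatever value is observed, so

\[
\Pr\big(P_{\text{new}}(B) = 0\big) = 1 \quad\Longrightarrow\quad \Pr(B) = 0 \;\text{ by Reflection},
\]

which contradicts the prior P(B) = .5 you started with, unless one of the listed principles is given up.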
Final version in Synthese, March 2014 [gated]. Incidentally, de Finetti was very tightly linked to the early econometricians. His philosophy – that probability is a form of logic and hence non-ampliative (“That which is logical is exact, but tells us nothing”) – simply oozes out of Savage/Aumann/Selten methods of dealing with reasoning under uncertainty. Read, for example, what Keynes had to say about what a probability is, and you will see just how radical de Finetti really was.
I think you switched A and B somewhere.
If you could provide an example (about conditionalization) with conjugate priors, I think it would be clear how the prior in the above example is odd. With conjugate priors, as you know, we can think of our prior as a previous posterior. If such a posterior could never have been obtained, then of course such a prior can never have emerged.
Is something like this the reasoning of the paper?
I should try to work it out, but if you mean that Dubins’ example has some strange properties, you are absolutely right.