[extropy-chat] Re: monty hall paradox again: reds and green gorfs

Eliezer Yudkowsky sentience at pobox.com
Sun May 23 09:36:47 UTC 2004


Eliezer Yudkowsky wrote:
> 
> Similarly, pick any envelopes T and 2T from an infinite smooth prior 
> from 0 to infinity, and your friend will always want to trade up.  I am 
> reminded of Martin Gardner's proof that all numbers are tiny: no matter 
> how large a finite number is, most numbers are very much larger.

PPS:  This is not an exact analogy.  But the expected value of an envelope 
selected uniformly at random from the range 0 to infinity is infinite.  Hence it is 
not surprising that after looking in any one envelope we find that the 
other envelope has an expected value of 1.25 times the first.  The problem 
is in the absurd prior.
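The 1.25 figure comes from weighting the two possibilities for the other 
envelope equally.  A quick sketch of that arithmetic (my own illustration, 
not from the original post):

```python
# If the opened envelope holds x dollars and the other is taken to hold
# 2x or x/2 with probability 1/2 each, its expected value is 1.25 * x.
x = 100.0
ev_other = 0.5 * (2 * x) + 0.5 * (x / 2)
print(ev_other / x)  # 1.25
```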

If we have a smooth prior *in the area of the dollar amount we actually 
find*, then it is always wise to trade up, and the expected value of the 
other envelope is 1.25 times the first.  If we have a logarithmic prior *in 
the area of the dollar amount we actually find*, then the other envelope 
has a 2/3 probability of being the smaller one.  In either case this 
reflects our *updated* probability *after* finding some actual dollar 
amount, because the *specific* value happens to lie in a smooth or 
logarithmic region of our prior.
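Both cases can be sketched by weighting the hypothesis "I opened the 
smaller envelope" by the prior weight at the observed amount x, and "I 
opened the larger envelope" by the prior weight at x/2, as the paragraph 
above describes.  A minimal sketch of my own (not code from the post):

```python
def ev_of_other(x, prior):
    """Expected value of the unopened envelope after seeing x dollars,
    weighting the two hypotheses by the prior weight at the smaller
    amount each implies."""
    w_small = prior(x)       # we opened the smaller envelope: pair is (x, 2x)
    w_large = prior(x / 2)   # we opened the larger envelope: pair is (x/2, x)
    p_small = w_small / (w_small + w_large)
    return p_small * (2 * x) + (1 - p_small) * (x / 2)

flat = lambda t: 1.0           # smooth prior near the observed amount
log_prior = lambda t: 1.0 / t  # logarithmic prior: equal weight per doubling

print(ev_of_other(100.0, flat))       # 125.0 -- always wise to trade up
print(ev_of_other(100.0, log_prior))  # ~100.0 -- the other envelope is
                                      # smaller with probability 2/3, so
                                      # trading gains nothing
```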

If one asks, "But which prior should I use?", the answer I would give is, 
"Well, guess how many dollars someone would be likely to actually put into 
an envelope, and try to have a prior distribution that reflects this."  The 
case of zorgs depends on alien psychology, but I would still bet a 
priori that one zorg is more likely to be worth ten dollars than a 
googolplexth of a dollar.

If you give up and cheat and use a uniform logarithmic distribution for 
your whole prior, then you will have no expected value for trading, and any 
paradoxes you find in the math are your fault for using bad math.
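A sketch of that last point (my own check, using the same 
hypothesis-weighting as the post describes): under a log-uniform prior 
everywhere, the unopened envelope's expected value equals whatever amount 
you are holding, at every dollar amount, so trading never has positive 
expected value.

```python
def ev_of_other(x):
    """Expected value of the unopened envelope under a log-uniform
    prior (weight 1/t at every amount t), given an observed amount x."""
    w_small = 1.0 / x        # we opened the smaller envelope: pair (x, 2x)
    w_large = 1.0 / (x / 2)  # we opened the larger envelope: pair (x/2, x)
    p_small = w_small / (w_small + w_large)
    return p_small * (2 * x) + (1 - p_small) * (x / 2)

# No expected gain from trading at any observed amount:
for x in (0.25, 1.0, 37.0, 1_000_000.0):
    print(x, ev_of_other(x))  # the two columns agree (up to rounding)
```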

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


