[extropy-chat] what is probability?

gts gts_2000 at yahoo.com
Sun Jan 14 17:39:30 UTC 2007

On the subjective theory of probability, two rational agents with the same  
background knowledge may hold different judgemental probabilities on the  
same outcome, at least in certain situations.

Though I don't necessarily endorse this view, I think I can defend it. Jef
wants to know, for example, how two rational machine intelligences might
offer different outputs given the same inputs.

Drawing on the cube conundrum, imagine this scenario:

Unbeknownst to each other, two rational agents (machine intelligences,
AIs, whatever) are each given a box containing a cube.

The cubes in the two boxes are identical. The agents know only that their  
side-lengths are between 3 and 5 centimeters, their surface areas are  
between 54 and 150 cm^2, and their volumes are between 27 and 125 cm^3.  
Note there is nothing challenging about these ranges; lots of real cubes  
could meet these constraints.
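(To see that all three stated ranges follow from the single side-length constraint, note that a cube of side s has surface area 6*s^2 and volume s^3. A quick check, my own illustration rather than anything from the original scenario:)

```python
# Check that sides of 3 and 5 cm produce exactly the stated
# bounds on surface area (54 to 150 cm^2) and volume (27 to 125 cm^3).
def cube_properties(s):
    """Return (side length, surface area, volume) for side length s in cm."""
    return s, 6 * s**2, s**3

print(cube_properties(3))  # (3, 54, 27)
print(cube_properties(5))  # (5, 150, 125)
```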

For some unspecified practical reason, each agent must use the given
information to make a best guess about the probable dimensions of the
actual cube in its box. (As often happens in the real world, our agents
are being forced to make decisions under conditions of high uncertainty.)

They make their decisions, then come together to compare their conclusions  
and their reasoning. Is it possible that our agents will have reached  
different conclusions?

I think so. In fact I'd be surprised if they didn't.

Our agents were forced to decide arbitrarily between 'side-length',
'surface area' and 'volume' as the parameter to use for estimating the
over-all dimensions of the cube. Each of these three methods leads to a
different but perfectly rational judgemental probability. As we've seen,
the three judgements are mutually inconsistent, and that is the paradox.
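To make the divergence concrete, suppose each agent asks the same question, "what is the probability that the side is at most 4 cm?", but spreads a uniform prior over a different parameter. This is a sketch of my own (the 4 cm threshold is my choice, not part of the original scenario), showing that the three equally natural priors give three different answers:

```python
# P(side <= 4 cm) under a uniform prior over each parameter.
# The event "side <= 4" is the same event as "area <= 6*4^2 = 96"
# and "volume <= 4^3 = 64"; only the prior differs.

def uniform_cdf(x, lo, hi):
    """P(X <= x) for X uniform on [lo, hi]."""
    return (x - lo) / (hi - lo)

p_side   = uniform_cdf(4,  3,  5)    # 0.5
p_area   = uniform_cdf(96, 54, 150)  # 42/96  = 0.4375
p_volume = uniform_cdf(64, 27, 125)  # 37/98  = ~0.3776

print(p_side, p_area, p_volume)  # three different, equally "rational" answers
```

Each agent can defend its number as the unique uniform-prior answer for its chosen parameter; the conflict only appears when they compare notes.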
