[extropy-chat] what is probability?
gts
gts_2000 at yahoo.com
Mon Jan 15 15:34:08 UTC 2007
On Sun, 14 Jan 2007 21:36:50 -0500, Russell Wallace
<russell.wallace at gmail.com> wrote:
> I'm inclined to think that issue would be best discussed in the context
> of a practical example - do you have one to hand?
Sure.
An urn contains 100 green and/or red marbles in unknown proportion. We
want to know pr(green) (the probability of drawing a green marble).
For better or worse, the principle of indifference (PI) tells us that
given no evidence to prefer either green or red, we should assign them
equal probabilities. So pr(green) = 0.5.
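To make that concrete, the PI prior here can be read as a uniform distribution over the 101 possible compositions of the urn (0 to 100 green marbles). A quick Python check (my own illustration, not part of the original problem) shows that this implies pr(green) = 0.5:

    # Uniform PI prior over the 101 possible urn compositions.
    compositions = range(101)              # possible counts of green marbles
    prior = [1 / 101] * 101                # equal weight on each composition
    # Probability of drawing green = expected fraction of green marbles.
    pr_green = sum(p * k / 100 for p, k in zip(prior, compositions))
    print(pr_green)                        # 0.5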
We set 0.5 as the prior probability in Bayes' theorem, then begin
performing experiments (drawing marbles with replacement), updating our
judgmental probability along the way according to the theorem (Bayesian
conditioning).
After we've conditioned on enough evidence, our judgmental probability
should match, in the limit, the true objective chance of drawing a green
marble from the urn (assuming we believe in such things as objective
chance).
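Here's a rough Python sketch of the process I have in mind; the true
composition (70 green out of 100) is just something I made up for
illustration, and is of course unknown to the agent doing the updating:

    import random

    N = 100
    true_green = 70                               # hidden from the "agent"
    posterior = [1 / (N + 1)] * (N + 1)           # PI prior over compositions

    random.seed(0)
    for draw in range(1, 1001):
        green = random.random() < true_green / N  # draw with replacement
        # Bayes' theorem: weight each hypothetical composition k by the
        # likelihood of the observed colour, then renormalize.
        likelihood = [(k / N) if green else (1 - k / N) for k in range(N + 1)]
        posterior = [p * l for p, l in zip(posterior, likelihood)]
        total = sum(posterior)
        posterior = [p / total for p in posterior]
        if draw % 200 == 0:
            pr_green = sum(p * k / N for p, k in zip(posterior, range(N + 1)))
            print(draw, round(pr_green, 3))       # approaches 0.7

The printed judgmental probability starts at 0.5 and settles near the true
chance of 0.7 as the evidence accumulates, which is the convergence I mean.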
This is, I think, not a totally unreasonable process, even though our
initial prior probability judgment was based on what you and I agree is
shaky reasoning.
Opposing the PI is the idea that given no evidence, we should assign no
probability at all, i.e., that we should accumulate some empirical
evidence *before* we start estimating probabilities. But this idea makes
some Bayesians very uncomfortable.
-gts