[ExI] Bayesian epistemology

Jef Allbright jef at jefallbright.net
Mon Aug 6 18:41:33 UTC 2007


On 8/6/07, Michael M. Butler <mmbutler at gmail.com> wrote:
> > It appears you may be unclear about the distinction between
> > probability and likelihood.
>
> 1) It's been a while since I've cracked open my E. T. Jaynes
> (_Probability Theory As Extended Logic_) and I seem to have packed it
> prior to my recent move. Can you point me (us) to someone on teh
> Intarweb who does a good concise job of making this distinction clear
> in a Bayesian (but not Bayesianismistic) context?


I did a quick "Intarweb search" for helpful references and found a
great deal of misinformation on this simple but fundamental point.  So
here's my attempt to convey it in simple terms:

Again, it's very much about context -- meaning necessarily partial information.

We can think of the likelihood as arising from the function (in
"reality") that determines the distribution of outcomes: it is that
sampling distribution, read as a function of the unknown parameters
with the observed data held fixed.  Probability, or better, posterior
probability, is proportional to the product of our prior and the
likelihood function, and it reflects our uncertain (i.e. incomplete)
knowledge of those parameters.
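
Here's a quick numerical sketch of that relationship (my own
illustration, not from Jaynes, using a made-up coin-tossing example
and a flat prior chosen purely for simplicity):

import numpy as np

# Hypothetical data: 7 heads in 10 tosses of a coin with unknown bias theta.
heads, tosses = 7, 10

theta = np.linspace(0.0, 1.0, 501)   # grid over the parameter
prior = np.ones_like(theta)          # flat prior (an assumption, for simplicity)

# Likelihood: the sampling distribution, read as a function of theta
# with the observed data held fixed.
likelihood = theta**heads * (1.0 - theta)**(tosses - heads)

# Posterior is proportional to prior * likelihood; normalize over the grid.
unnorm = prior * likelihood
posterior = unnorm / np.trapz(unnorm, theta)

print(theta[np.argmax(posterior)])   # posterior mode, ~0.7 for these data

The point is just that all of the data enters through the likelihood
term; the prior carries whatever we knew beforehand.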

From Jaynes' _Probability Theory: The Logic of Science_, section 8.5:

<quote>
In applying Bayes' theorem, the posterior pdf for a parameter theta
is always a product of a prior p(theta | I) and a likelihood function
L(theta); the only place where the data appear is in the latter.
Therefore it is manifest that

"Within the context of the specified model, the likelihood function
L(theta) from data D contains all the information about theta that is
contained in D."

For us, this is an immediate and mathematically trivial consequence of
the product rule of probability theory, and is no more to be
questioned than the multiplication table.  Put differently, two data
sets D, D' that lead to the same likelihood function to within a
normalization: L(theta; D') = a L(theta; D), where `a' is a constant
independent of theta, have just the same import for any inferences
about theta, whether it be point estimation, interval estimation, or
hypothesis testing.  But for those who think of a probability
distribution as a physical phenomenon arising from `randomness'
rather than a carrier of incomplete information, the above quoted
statement -- since it involves only the sampling distribution -- has a
meaning independent of the product rule and Bayes' theorem.  They call
it the `likelihood principle', and its status as a principle of
inference has been the subject of long controversy, still continuing
today.
</quote>
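
To make Jaynes' point concrete, here is a small Python sketch of the
classic stopping-rule example (my own illustration, not from the
book): the same data -- 9 heads and 3 tails -- arising either from a
fixed run of 12 tosses (binomial sampling) or from tossing until the
3rd tail appears (negative binomial sampling).  The two likelihood
functions differ only by a constant `a', so the posteriors are
identical:

import math
import numpy as np

theta = np.linspace(0.001, 0.999, 999)   # grid over the coin's bias
prior = np.ones_like(theta)              # flat prior, for simplicity

# Two sampling models for the same data D = (9 heads, 3 tails):
# binomial: n = 12 tosses fixed in advance
L_binom = math.comb(12, 9) * theta**9 * (1 - theta)**3
# negative binomial: toss until the 3rd tail appears
L_negbin = math.comb(11, 9) * theta**9 * (1 - theta)**3

# L_negbin = a * L_binom with a = 55/220, constant in theta, so
# normalizing wipes out the difference:
post_binom = prior * L_binom
post_binom /= np.trapz(post_binom, theta)
post_negbin = prior * L_negbin
post_negbin /= np.trapz(post_negbin, theta)

print(np.allclose(post_binom, post_negbin))   # True: same inferences about theta

A frequentist analysis, by contrast, can give different answers under
the two stopping rules, which is exactly why the likelihood principle
has been controversial.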

Sorry, too busy with work to continue here, but I think understanding
the foregoing is key.

- Jef


