[extropy-chat] what is probability?

Benjamin Goertzel ben at goertzel.org
Tue Jan 16 21:06:51 UTC 2007


Hi,

> For our purposes I think it's fair to say a sequence is random if there is
> and can be no discernible pattern,

This is what Chaitin's definition of randomness (in the references) says.

The problem is: Discernible by WHOM?  Discernibility of patterns can
only be defined objectively for infinitely large entities.  Otherwise
it can only be described relative to a specific finite Turing machine,
i.e. subjectively.
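
A concrete illustration of that observer-relativity (a sketch of my own,
with zlib standing in for one particular finite machine; the seed and
length are arbitrary):

import random, zlib

# One observer: zlib. It discerns no pattern in the output of a seeded
# PRNG, so the string looks "random" to it.
random.seed(42)
x = bytes(random.getrandbits(8) for _ in range(10000))
print(len(zlib.compress(x, 9)) / len(x))   # roughly 1: no compression achieved

# Another observer -- one that can run the generating program -- compresses
# the same string to a handful of bytes: "Mersenne Twister, seed 42,
# length 10000".  Whether x "has a discernible pattern" depends on which
# machine is looking.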

>i.e., the sequence is random if the
> observations are *independent trials* in the usual sense.

But "independent trials" is not a well-defined concept, so it can't
really be used to define randomness in a rigorous way.

> If I understand your question, you are really wanting know how randomness
> is defined subjectively. Yes?

Yeah...

I can define: "X is random to degree r with respect to A if A cannot
compress X to less than a fraction r of its original size".

So if A can't compress X at all, then X is random to degree 1 with
respect to A.

If A can compress X to close to zero size, then X is random to degree 0
with respect to A.

Then the Chaitin definition of randomness comes down to the
observation that as the size of X goes to infinity, the property of
"having degree of randomness 1" becomes independent of the observer A.

Anyway, my question is exactly how this sort of randomness connects
with the Bayesian interpretation of probability.

I think that these results

http://citeseer.ist.psu.edu/calude94borel.html

imply that Chaitin randomness entails exchangeability for infinite
sequences. They also show that, for finite sequences, almost all
sequences are exchangeable (so in that sense a 'randomly chosen'
sequence is very likely to be exchangeable).

However, exchangeability likely does NOT imply Chaitin randomness....
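
For reference (my gloss, not something taken from the Calude paper):
exchangeability of X_1, ..., X_n means the joint distribution is
invariant under permutations,

  P(X_1 = x_1, ..., X_n = x_n) = P(X_{pi(1)} = x_1, ..., X_{pi(n)} = x_n)

for every permutation pi of {1, ..., n}. Since that is a property of a
distribution rather than of an individual string, there is an informal
reason to expect the converse to fail: i.i.d. fair coin flips are
exchangeable, yet the all-zeros string is one of their possible outcomes
and is maximally compressible, so exchangeability by itself puts no
lower bound on the incompressibility of any particular sequence.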

-- Ben G


