[extropy-chat] Coin Flip Paradox (was Randomness)

The Avantguardian avantguardian2020 at yahoo.com
Mon Jan 29 04:59:59 UTC 2007

--- gts <gts_2000 at yahoo.com> wrote:

> On Sun, 28 Jan 2007 18:05:39 -0500, The
> Avantguardian  
> <avantguardian2020 at yahoo.com> wrote:

> Both Bayesians and propensity theorists can talk
> meaningfully about the  
> probability of single or rare events. Frequentists
> cannot.

Right. Which is a serious gap in their theory, in my
view. To put it another way: they borrow a tool from
calculus called a limit, try to define probability by
it, and the attempt fails. Probability is not a limit,
mathematically speaking. If it were, a random sequence
would have to *converge* to the hypothesized limit as n
approaches infinity. With a real-world random sequence
this does not happen, not even at infinity.

Instead, the measured frequency converges on the
probability infinitely many times and diverges from
that same probability just as often. Just because it
has a bill on its face does not make it a duck; it
could be a platypus. What I am saying is that the math
shows probability is not a frequency at all, limiting
or not, but is instead the average or expected frequency.

Forcing the concept of a limit onto probability
axiomatically, in the very definition, discards a great
deal of the information available to Bayesians.
Invoking infinities where they are not observed in
nature is sloppy and unnecessary.

> I think your paradox here is based on either a false
> intuition or a  
> misconception of frequentism. Frequentists do not
> argue that the observed  
> frequency of heads should ever be *exactly* 0.5 in
> any finite number of  
> observations.

But they don't explain why the observed frequency can
be *exactly* 0.5 infinitely many times in an infinite
sequence either. That is not what limits and the
mathematical convergence of sequences are designed to
describe. It's a mathematical fudge rather than a real
explanation.
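For what it's worth, those exact hits are easy to exhibit: whenever heads and tails are momentarily tied, the observed frequency is *exactly* 0.5, and a fair coin keeps re-tying. A sketch in Python (the flip count and seed are my own arbitrary choices) counts these moments:

```python
import random

random.seed(0)  # arbitrary seed, for reproducibility
n = 200_000
diff = 0   # running count of heads minus tails
ties = 0   # flips after which the observed frequency is exactly 0.5
for _ in range(n):
    diff += 1 if random.random() < 0.5 else -1
    if diff == 0:        # heads == tails, so frequency == 0.5 exactly
        ties += 1
print(f"moments of exactly-0.5 frequency in {n} flips: {ties}")
```

A single run typically shows hundreds of such ties, and random-walk theory says their number grows without bound as n does, which is the behavior a plain limit statement glosses over.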

> They argue merely that the observed
> frequency will  
> *converge* on 0.5 as n tends to infinity.
> > If probability were a limiting frequency on
> > the other hand then it would be mathematically
> > *guaranteed* that the frequency would be equal to
> > the probability at infinite n.
> But that is exactly the case according to the Von
> Mises's first empirical  
> law of probability. Probability is calculated as the
> limiting frequency  
> (calculated as a proportion or as a percentage) as n
> goes to infinity.

I am not saying you *can't* define probability in such
a fashion; I am saying it is the less useful definition
of randomness, especially since perfectly ordered
sequences, like the flips of a two-headed coin
continued to infinity, can pass as random under the
Axiom of Randomness when they clearly are not.

Just as defining "duckness" by the presence of a bill
allows platypuses (platypi?) to pass as ducks.

> It can be shown, by the way, that this convergence of
> the observed frequency  
> follows an inverse square rule, not unlike some
> other supposed empirical  
> laws of the universe.

Precisely. The *actual* frequency of any real random
sequence is more accurately described as chaotically
orbiting the probability, like a strange attractor,
rather than approaching it as any kind of deterministic
limit in the classical calculus sense.
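Whatever we call the rule, the measurable regularity here is that the *spread* of that orbit shrinks like 1/sqrt(n): for a fair coin the standard deviation of the observed frequency is 0.5/sqrt(n). A Python sketch (trial counts, sample sizes, and seed are my own choices) checks that sd * sqrt(n) hovers near 0.5 at every n:

```python
import math
import random

random.seed(2)  # arbitrary seed, for reproducibility
trials = 2000
ratios = []
for n in (100, 400, 1600):
    # observed frequency of heads in each of `trials` experiments of n flips
    freqs = [sum(random.random() < 0.5 for _ in range(n)) / n
             for _ in range(trials)]
    mean = sum(freqs) / trials
    sd = math.sqrt(sum((f - mean) ** 2 for f in freqs) / trials)
    ratios.append(sd * math.sqrt(n))  # theory predicts ~0.5 at every n
    print(f"n={n:5d}  sd={sd:.4f}  sd*sqrt(n)={sd * math.sqrt(n):.3f}")
```

The orbit tightens as n grows, but at no finite n does it pin the frequency to the probability, which is consistent with the "chaotic orbit" picture above.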

Well, that's my "Axiom of Randomness," and I am
sticking with it. If it is true, then Everett and Bohm
were right: all randomness is actually determinism with
hidden variables. And probability, uncertainty, free
will, and possibly consciousness itself all owe their
existence to blissful ignorance of a future that is, in
reality, meant to be.

I may even be able to come up with a theorem to prove
it, or rather to make good use of it, since it is
axiomatic. See how easy axioms are? heh. ;)

Stuart LaForge
alt email: stuart"AT"ucla.edu

"If we all did the things we are capable of doing, we would literally astound ourselves." - Thomas Edison

