[extropy-chat] Coin Flip Paradox
jef at jefallbright.net
Mon Jan 29 18:51:49 UTC 2007
Gordon, it might be more interesting and more productive--for the rest
of us--if you would contribute statements that you personally find
profound, newsworthy, or otherwise relevant to extropian themes.
It's a bit frustrating to watch as you continue to stir the pot,
paraphrasing the work of others, claiming to play devil's advocate, but
showing no particular insight or other tangible contribution to this list.
Probability, Randomness, Bayesian, Frequentist; it's all out there on
the web and in books. There certainly are different schools of thought
on these topics, but what is it that you think you're adding?
Do you have a point?
> -----Original Message-----
> From: extropy-chat-bounces at lists.extropy.org [mailto:extropy-chat-
> bounces at lists.extropy.org] On Behalf Of gts
> Sent: Monday, January 29, 2007 9:03 AM
> To: ExI chat list
> Subject: Re: [extropy-chat] Coin Flip Paradox
> On Sun, 28 Jan 2007 23:59:59 -0500, The Avantguardian
> <avantguardian2020 at yahoo.com> wrote:
> > The *actual* frequency of any real random
> > sequence would be more accurately described to
> > chaotically orbit the probability, like a strange
> > attractor, rather than approach it as any kind of
> > deterministic limit in a classical calculus sense.
> I totally disagree, and wonder where you came up with the unusual claim
> that frequencies "chaotically orbit the probability like a strange
> attractor". Do you have mathematical or empirical evidence to support
> that claim?
> Frequentists have plenty of evidence, both empirical and mathematical,
> to support their much more boring claim that frequencies converge in the
> ordinary way as n increases.
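A quick sketch of that convergence claim (the law of large numbers). The function name and parameters below are illustrative, not from the thread:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def running_frequency(p, n):
    """Flip a p-weighted coin n times; return the observed frequency of heads."""
    heads = sum(1 for _ in range(n) if random.random() < p)
    return heads / n

# The gap between observed frequency and p shrinks as n grows,
# in the ordinary (probabilistic) sense -- no strange attractor needed.
p = 0.5
for n in (100, 10_000, 1_000_000):
    freq = running_frequency(p, n)
    print(n, freq, abs(freq - p))
```

The same holds for a heavily weighted coin (say p = 0.9): the frequency still converges, just to 0.9 instead of 0.5.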
> But let's talk a bit about the meaning of randomness.
> I surmise that you see an ambiguity in the conventional view of
> randomness that I also see, but that you are expressing your displeasure
> about it in ways that make no sense to me.
> As I mentioned and you agreed, randomness and entropy are closely
> related ideas, but the ideas should (perhaps) be kept apart.
> Rafal objected, for example, when I wrote that a sequence of flips of a
> heavily weighted coin is still a completely random sequence. It seems
> his intuition was telling him that a weighted coin should produce a
> sequence less random than that of a fair coin.
> I think Rafal really meant that such a heavily weighted sequence has
> lower *entropy*, not lower *randomness*. I think people are sometimes
> confused about the two terms because of their close meanings.
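The entropy point can be made concrete with the standard Shannon formula for a single Bernoulli(p) flip. A weighted coin's flips are still independent (hence random in the probabilist's sense), but each flip carries less entropy than a fair coin's one bit:

```python
from math import log2

def coin_entropy(p):
    """Shannon entropy, in bits, of a single flip of a p-weighted coin."""
    if p in (0.0, 1.0):
        return 0.0  # a deterministic coin carries no entropy
    return -p * log2(p) - (1 - p) * log2(1 - p)

print(coin_entropy(0.5))  # fair coin: 1.0 bit per flip
print(coin_entropy(0.9))  # heavily weighted coin: about 0.47 bits per flip
```

So "lower entropy" and "less random" really are different claims: the 0.9-weighted coin's flips are just as independent, but far more predictable.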
> As probability theorists normally use the word (at least in my
> experience), randomness is mainly about the independence (or
> exchangeability) of individual trials/observations, not about the
> measure of disorder in the sequence of trials/observations.
> The situation is made more cloudy (or perhaps more clear, depending on
> your perspective) by algorithmic definitions of randomness.
> Consider a binary sequence generated by an idealized perfectly random
> fair coin, where Heads=1 and Tails=0. What if this unlikely sequence
> came up? 20 heads in a row! Is this freaky sequence still random? It
> doesn't *look* random, but how could it not still *be* random? After
> all, we stipulated in advance that it was generated by an idealized
> random coin-flip process.
> Well, according to the algorithmic definition of randomness, randomness
> is a property of the *sequence*, not a property of the *process*. So the
> sequence of 20 heads is extremely un-random by that definition even
> though it was obtained via a purely random process. This is a sort of
> marriage of entropy to randomness, for better or worse.
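One rough way to see the algorithmic (Kolmogorov-style) distinction is compressibility as a crude stand-in for description length: an all-heads sequence has a very short description, while a typical fair-coin sequence does not. This is only an illustration with an off-the-shelf compressor, not a real Kolmogorov-complexity computation:

```python
import random
import zlib

random.seed(0)  # fixed seed so the sketch is reproducible

all_heads = "1" * 1000  # like 1000 heads in a row: highly regular
typical = "".join(random.choice("01") for _ in range(1000))  # a typical fair-coin run

# zlib squeezes the all-heads string down to a handful of bytes;
# the typical string needs roughly a bit per flip and stays much longer.
len_heads = len(zlib.compress(all_heads.encode()))
len_typical = len(zlib.compress(typical.encode()))
print(len_heads, len_typical)
```

By this measure the all-heads run is "un-random" (short to describe) regardless of the perfectly random process that produced it, which is exactly the tension described above.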
> extropy-chat mailing list
> extropy-chat at lists.extropy.org