[extropy-chat] Random Thought Experiments (was Coin Flip Paradox)

The Avantguardian avantguardian2020 at yahoo.com
Wed Jan 31 04:05:12 UTC 2007


--- Jef Allbright <jef at jefallbright.net> wrote:

> The principle of indifference says simply and
> powerfully that equivalent
> states of information yield equivalent
> probabilities. It is a special
> case of the principle of maximum entropy which is
> even more elegant. 

Yes. As I mentioned in an earlier post, Boltzmann's
Equipartition Theorem may be thought of as the
Principle of Indifference applied to the internal
energy of a classical system.

> Q: Do you think there can be "information" without a
> subjective
> (necessarily context-limited) observer? Think deeply
> about this and
> curtains may fall.

This is so important a point that I will herein
describe some thought experiments to demonstrate it.
Hopefully these will shed some light on the questions
being addressed in this debate. 

Experiment 1: Random versus Hidden Variables or
"What's in your wallet?"

Let's say that Gordon and Jef run into each other and
decide to play a simple gambling game. The rules are
simple: whoever has less money in his wallet wins all
of the money in the other's wallet. Now is the amount
of money in a wallet deterministic or random? From
Jef's POV the amount in Gordon's wallet is a random
variable, but the amount in his own is definitively
non-random (he put it there, so he knows). From
Gordon's POV the exact opposite is true: Gordon knows
what's in his own wallet, but the amount in Jef's
wallet is a random variable. Thus the very same
objective reality can be completely predetermined to
one player and completely random to the other.

Bonus question: Is this a "fair" game? What are the
players' expectation values? Is it a wise gamble for
either to play?
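For what it's worth, here is a quick Monte Carlo
sketch of the game. Note the distribution is my own
assumption, purely for illustration: nothing in the
game itself fixes how much money a wallet holds, so I
just give each player an independent uniform
whole-dollar amount from $0 to $100.

```python
import random

def play_round(rng):
    """One round: the player with less money wins the
    other's wallet. A tie is a push."""
    jef = rng.randint(0, 100)     # assumed distribution
    gordon = rng.randint(0, 100)  # assumed distribution
    if jef < gordon:
        return gordon, -gordon    # Jef takes Gordon's wallet
    elif gordon < jef:
        return -jef, jef          # Gordon takes Jef's wallet
    return 0, 0

rng = random.Random(42)
trials = 100_000
jef_total = gordon_total = 0
for _ in range(trials):
    dj, dg = play_round(rng)
    jef_total += dj
    gordon_total += dg

# With symmetric priors, both sample means hover near zero.
print(jef_total / trials, gordon_total / trials)
```

The curious part is that each player can reason "if I
win, I win more than I stand to lose" - yet under any
shared symmetric prior the expectation is zero for
both, as the sample means suggest.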

Experiment 2: Maxwell's Demon or "Is entropy a state
of mind?"

Imagine you have a sealed two-chamber container of gas
in thermodynamic equilibrium. Furthermore, imagine
that there is a partition between the two chambers
with a tiny hole in it. Because the gas is in
equilibrium, it is in a state of maximum entropy: the
gas cannot be used to do any work.

Now imagine that the great sorcerer Maxwell summons a
tiny demon the size of the hole in the partition
between the chambers. The demon is then instructed to
act like a molecular goalie or a bouncer at a ritzy
nightclub. His job is to stand in the little doorway
between the two chambers and selectively block
molecules of gas he sees coming.

When he sees a fast-moving molecule of gas coming from
the right-hand chamber toward his doorway, he is to
step aside and let the molecule pass. Conversely, if
he sees a slow-moving molecule coming toward the
doorway from the left chamber, he is to step aside and
let it pass. In all other cases he is to stand
steadfastly in the doorway and let the molecule bounce
off of him.

As you can clearly see, the left-hand chamber will
soon fill with fast-moving molecules and become rather
hot, while the right-hand chamber will fill with slow
molecules and become cold. Thus, by expending the
merest pittance of energy (let's say you buy the demon
lunch) to gather *information* about the speeds of
individual molecules, you have stored up a lot of
energy by setting up hot and cold reservoirs that can
then be used to perform useful work.
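A toy simulation makes the sorting effect concrete.
The units here are made up: "energy" is just a squared
Gaussian speed, and the demon's fast/slow cutoff of
1.0 is an arbitrary choice of mine.

```python
import random

rng = random.Random(0)

# Both chambers start in "equilibrium": identical
# speed distributions (squared Gaussian ~ energy).
left = [rng.gauss(0, 1) ** 2 for _ in range(5000)]
right = [rng.gauss(0, 1) ** 2 for _ in range(5000)]

threshold = 1.0  # the demon's fast/slow cutoff (arbitrary)

for _ in range(200_000):
    # A random molecule approaches the hole from the right:
    # if it is fast, the demon lets it through to the left.
    if right:
        i = rng.randrange(len(right))
        if right[i] > threshold:
            left.append(right.pop(i))
    # A random molecule approaches from the left:
    # if it is slow, it is let through to the right.
    if left:
        i = rng.randrange(len(left))
        if left[i] <= threshold:
            right.append(left.pop(i))

mean = lambda xs: sum(xs) / len(xs)
# The left chamber's mean energy ("temperature") now
# exceeds the right's, though no work was done on the gas.
print(mean(left), mean(right))
```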

Thus even objective physical realities (i.e. entropy
and the Laws of Thermodynamics) can be hacked by
sufficient *information*.

Bonus Questions: Do ion channels in living cells mimic
Maxwell's Demon? Could MNT be used to construct the
mechanical equivalent of Maxwell's Demon?  

Experiment 3: Chaos Theory or "Methodical Madness"

Kolmogorov complexity says that a sequence is random
if there is no algorithm shorter than the sequence
itself that can generate it. Yet chaos theory finds
that many very simple deterministic equations display
such seemingly unpredictable behavior that, if you
didn't know the generating function to begin with, you
would assume they were random. They can even pass
standard statistical tests for randomness.
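As an illustration (the logistic map below is a
textbook chaotic recurrence of my own choosing, not
any function from this post), a one-line deterministic
rule yields a bit stream with no visible pattern:

```python
def logistic_bits(x0, n):
    """Iterate x -> 4x(1-x), a chaotic deterministic map,
    and emit 1 when x > 0.5, else 0."""
    x, bits = x0, []
    for _ in range(n):
        x = 4 * x * (1 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

bits = logistic_bits(0.123456789, 10_000)
ones = sum(bits)
print(ones / len(bits))  # near 0.5, like a fair coin
```

The stream is perfectly reproducible from the seed -
anyone holding x0 sees a deterministic sequence, while
anyone without it sees what looks like coin flips.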

To demonstrate this, examine the following two 64 bit
strings:

A.
1010000100100100001011101010010010010010001110001001111111101000

B.
0101101110110101111111100111101110010011001111000100001100010001

One of them is completely random, even to me. I
generated it by literally flipping a coin 64 times and
recording heads as 1s and tails as 0s.

The other is completely deterministic - not
pseudorandom! I generated it using a seemingly novel
function that I will unimaginatively call the Coin
Function, which I designed just to prove my point. It
is simply the first 64 values of the Coin Function
evaluated at the natural numbers 1 to 64.

The thought experiment is for you to try to figure out
which is which. 
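You can try a crude attack yourself. Kolmogorov
complexity is uncomputable in general, but an
off-the-shelf compressor gives an upper bound on it.
On strings this short, though, neither sequence
compresses at all - which is exactly the difficulty.
(A sketch using Python's zlib; it does not reveal
which string is which.)

```python
import zlib

A = "1010000100100100001011101010010010010010001110001001111111101000"
B = "0101101110110101111111100111101110010011001111000100001100010001"

def packed_size(bits):
    """Pack the 64-bit string into 8 bytes, then deflate
    at maximum effort; return the compressed length."""
    raw = int(bits, 2).to_bytes(8, "big")
    return len(zlib.compress(raw, 9))

# Both come out *larger* than the raw 8 bytes: a
# general-purpose compressor cannot tell the
# deterministic string from the coin flips at this length.
print(packed_size(A), packed_size(B))
```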

Bonus questions: Does my telling you that the output
of the Coin Function is completely predetermined make
it any less random from your POV? What are the 65th,
66th, and 67th values of the function?
     




Stuart LaForge
alt email: stuart"AT"ucla.edu

"If we all did the things we are capable of doing, we would literally astound ourselves." - Thomas Edison


 


