[extropy-chat] [COSMO-ASTRO] Weighing the Universe

scerir scerir at libero.it
Fri Apr 29 21:18:51 UTC 2005

A simulation is a collection of information. 
Information (by definition, I think) requires that 
the information storage system is not in 
thermodynamic equilibrium, so the storage system 
contains energy. But energy is mass.

This assumes the simulation metaverse 
is identical with this universe. 
For multiple reasons, it is unlikely 
that the metaverse physics is similar 
to this universe's physics.

You don't think it's a safe bet that, 
however exotic the physics of any hypothetical 
substrate metaverse, information will not travel, 
be stored, or be transformed free of cost?

Information should be whatever contributes to 
a reduction in the uncertainty of the state of a system.
The old principle (Jaynes, 1957) says that a system
is expected to be in the state with maximal entropy,
because if it were in a state with a lower entropy
it would also contain more information than
previously specified. Information, then, seems to be
"local" [1,2], sometimes even "subjective" [3].
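Jaynes' principle can be illustrated numerically: with no constraints, the maximum-entropy distribution over a finite set of states is uniform, and any lower-entropy state "contains more information than previously specified" by exactly its entropy deficit. A minimal Python sketch (the four-state distributions are illustrative choices, not from the original argument):

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# With no constraints, the maximum-entropy distribution over 4 states
# is uniform: entropy log2(4) = 2 bits.
uniform = [0.25, 0.25, 0.25, 0.25]

# A lower-entropy state encodes extra information relative to the
# maximum-entropy one; the deficit H(uniform) - H(peaked) equals the
# KL divergence of `peaked` from `uniform`.
peaked = [0.7, 0.1, 0.1, 0.1]

h_uniform = shannon_entropy(uniform)   # 2.0 bits
h_peaked = shannon_entropy(peaked)     # about 1.357 bits
info_gain = h_uniform - h_peaked       # about 0.643 bits

print(h_uniform, round(h_peaked, 3), round(info_gain, 3))
```

The uniform distribution is only the answer when nothing is specified; adding constraints (e.g. a fixed mean) shifts the maximum-entropy state accordingly.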

Is there a "metaverse"? Is the physics of this "metaverse"
different? In a certain sense QM sits between the exotic
physics of a hypothetical metaverse and the ordinary
physics of the ordinary universe.

The current interpretation of QM consists of a prescription
for computing the probability of finding (after measurement)
a certain state of affairs at a given time. One has to
make use (by integration over space) of the *simultaneous*
values of a certain function (of coordinates and time)
at that particular time. But in another Lorentz frame things 
change: a given region at a given time is no longer the same 
region at the same time, so the prescription for
computing the probability of finding a certain quantum 
state must be changed, and it must be changed also because
the store of *simultaneous* values of that function (of 
coordinates and time) that the prescription uses for computing 
the probabilities is, in general, not independent of the
specific Lorentz frame.
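The frame-dependence of "simultaneous values" is just the relativity of simultaneity: under a boost, t' = gamma (t - vx/c^2), so two events at equal t in one frame land at different t' in another. A minimal Python sketch (units with c = 1; the v = 0.6c boost and the event coordinates are arbitrary illustrative choices):

```python
import math

def lorentz_boost(t, x, v, c=1.0):
    """Transform event (t, x) into a frame moving at velocity v along x."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    t_prime = gamma * (t - v * x / c**2)
    x_prime = gamma * (x - v * t)
    return t_prime, x_prime

# Two events simultaneous (t = 0) in the lab frame, at different x:
event_a = (0.0, 0.0)
event_b = (0.0, 1.0)

# In a frame moving at v = 0.6c they are no longer simultaneous,
# so "a given region at a given time" is a frame-dependent notion.
ta, _ = lorentz_boost(*event_a, v=0.6)
tb, _ = lorentz_boost(*event_b, v=0.6)
print(ta, tb)  # ta = 0.0, tb is about -0.75
```

Any prescription that integrates over a t = const slice therefore picks out a different slice of spacetime in each frame.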

Here comes the magic. Following the above reasoning we
would expect that, e.g., the information about the probability
of a particle being at position x, at distance r from the
origin, reaches us with signal velocity c. 

Then |wavefunction(x,t - r/c)|^2 should represent 
the probability that a particle is at x, as seen from 
the origin. The normalization expression should then be 
Integral(t=fixed) |wavefunction(x,t - r/c)|^2 ds = 1
where ds is a measure on the backward light cone. 

QM, instead, uses the magic normalization expression
Integral(t=fixed) |wavefunction(x,t)|^2 dx dy dz = 1 
which means that the probability space, at a fixed time,
assumes that we have *instantaneous* knowledge
of the probabilities at all distances, and we sum all 
these probabilities to one.
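The standard equal-time normalization can be checked numerically for a concrete packet. The sketch below uses a 1D Gaussian wavepacket (the width and velocity are illustrative assumptions, not from the post) and integrates |wavefunction(x,t)|^2 over the whole spatial slice at a fixed t, exactly the "instantaneous" bookkeeping described above:

```python
import math

def psi_squared(x, t, sigma=1.0, v=0.5):
    """|psi|^2 for a normalized 1D Gaussian packet centered at v*t."""
    return math.exp(-((x - v * t) ** 2) / (2 * sigma**2)) / (
        sigma * math.sqrt(2 * math.pi))

def normalize_at_fixed_t(t, xmin=-20.0, xmax=20.0, n=100000):
    """Midpoint-rule integral of |psi|^2 over the t = const slice.

    This sums probabilities over *all* x at one instant, i.e. it
    presumes access to the entire simultaneity slice at once.
    """
    dx = (xmax - xmin) / n
    return sum(psi_squared(xmin + (i + 0.5) * dx, t) for i in range(n)) * dx

print(round(normalize_at_fixed_t(0.0), 6))  # 1.0
print(round(normalize_at_fixed_t(3.0), 6))  # 1.0 at any fixed t
```

The retarded-time alternative sketched earlier would instead weight |wavefunction(x, t - r/c)|^2 along the backward light cone, a different integration surface entirely.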

Here starts the nonlocality/nonseparability issue,
at least if you think there is, out there, a real space-time. 
But if you remove this belief [4], this faith, you realize 
that QM is close to being the exotic physics of a hypothetical 
metaverse.

[1] Asher Peres http://www.arxiv.org/abs/quant-ph/0310010
[2] Asher Peres died recently; his story is a nice read.
[3] David Mermin http://www.arxiv.org/abs/quant-ph/0107151
[4] "Most physicists are happy with our experimental results:
they conclude that quantum theory is once again well
supported by experimental data. (Some will even claim
that the experiments were not necessary since they
know that quantum theory is correct!). However, the issue
is not a matter of happiness or of simple belief in a
theory! If the speed of quantum information is indeed
infinite, or non-existing, then we are left with the two
remaining alternatives: either space-time or free
will is an illusion. I am tempted to vote for the first one!
But - again - it is not a matter of personal preference.
The real problem for physics is the following: how
could one test it?" -Nicolas Gisin
