[extropy-chat] 't Hooft, QC, complexity

scerir scerir at libero.it
Fri Dec 2 19:29:49 UTC 2005


Gerard 't Hooft has written something in Physics World
http://www.physicsweb.org/articles/world/18/12/2/1
which seems (to me) a bit confusing. Specifically,
he appears to treat QT as a temporary model, because
of its essential indeterminism, its essential symmetries,
and its essential principle of 'indistinguishability'
(superposition, entanglement, interference, and so on).
(On the contrary, it seems to me that precisely these
essential principles allow a sort of 'freedom', a sort
of evolution, etc.)

But the main problem is at the end of the article.
It seems that P. Davies (or is it G. 't Hooft?) writes:

<<This is defined by the maximum amount of information
that the universe can possibly have processed since
its origin in a Big Bang. Seth Lloyd of the Massachusetts
Institute of Technology has computed this to be about 10^120
bits (2000 Nature 406 1047 and 2002 Phys. Rev. Lett. 88 237901).
A system that requires more than this quantity of information
to describe it in detail is so complex that the normal
mathematical laws of physics cannot be applied to arbitrary
precision without exceeding the information capacity
of the universe. Cosmology thus imposes a small but
irreducible uncertainty, or fuzziness, in the operation
of physical laws.

For most systems the Lloyd limit is irrelevantly large.
But quantum systems are described by vectors in a so-called
Hilbert space, which may have a great - indeed infinite -
number of dimensions. According to my maximum-complexity
criterion, quantum mechanics will break down when 
the dimensionality of the Hilbert space exceeds about 10^120.

A simple example is an entangled state of many electrons. 
This is a special form of superposition in which up and down 
spin orientations co-exist in all possible combinations. 
Once there are about 400 electrons in such a state,
the Lloyd limit is exceeded, suggesting that it is 
at this level of complexity that classicality emerges. 

Although such a state is hard to engineer, it lies firmly 
within the design specifications of the hoped-for
quantum computer. This is a machine that would harness 
quantum systems to achieve an exponentially greater level 
of computing power than a conventional computer. 
If my ideas are right, then this eagerly awaited
technology will never achieve its full promise.>>
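
(A quick arithmetic check of the 400-electron figure quoted above:
the Hilbert space of n spin-1/2 particles has dimension 2^n, so the
quoted bound of about 10^120 is crossed near n = 120/log10(2) ≈ 399.
A minimal sketch in Python, where the 1e120 threshold is simply the
figure taken from the quote:)

    import math

    LLOYD_LIMIT = 1e120  # bits, the figure quoted from the article

    # Hilbert-space dimension of n entangled spin-1/2 particles is 2**n.
    # Smallest n whose dimension exceeds the quoted limit:
    n = math.ceil(120 / math.log10(2))
    print(n)                       # 399
    print(2.0 ** n > LLOYD_LIMIT)  # True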

Now there are many problems here, such as these:
1) Is information physical (Landauer principle)?
2) Is physics much richer than its informational
   content?
3) Is there a strict relation between the global
   information processed since the Big-Bang and
   the information we can process in the future?
Etc. 
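
(On point 1: Landauer's principle says that erasing one bit must
dissipate at least k_B T ln 2 of energy. A rough sketch of that number,
assuming room temperature T = 300 K purely for illustration:)

    import math

    k_B = 1.380649e-23             # Boltzmann constant, J/K
    T = 300.0                      # assumed room temperature, K
    E_min = k_B * T * math.log(2)  # Landauer bound per erased bit
    print(E_min)                   # ~2.9e-21 J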







