[Paleopsych] Physics World: Does God play dice?

Premise Checker checker at panix.com
Fri Dec 9 01:56:14 UTC 2005

Does God play dice? (December 2005) - Physics World - PhysicsWeb

[I like Seth Lloyd's 10^120 calculation of the maximum number of 
calculations since the universe began. I did something similar, namely to 
take 1. The number of photons, 2. The speed of light divided by the 
Planck distance - the inverse of the Planck time - as the number of 
movements a photon can make per unit of time, and 3. The number of 
units of time since the Big Bang. I multiplied them 
together and got, iirc, just this 10^120. Now since 2^10 is approx. 
10^3, 10^120 = (10^3)^40 ≈ (2^10)^40 = 2^400. I've seen the 10^120 
figure elsewhere, but this may just have been a repeat of Lloyd's 
reasoning.

[And so a key of 400 bits would be absolutely unbreakable in the next 13.5 
billion years, even if the entire universe were devoted to breaking it AND 
there were no barriers, like the speed of light, to slow down the 
calculations coming together.

[Yet I've been told that it is possible to crack a 600-bit encryption. 
Please reconcile this! And while you are at it, tell me how much 
communication is slowed down as the number of bits increases.
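The exponent conversion in the note above is easy to check directly; a minimal sketch (Python, standard library only, not from the original post):

```python
# Sanity-check the exponent conversion: how many bits is a number
# of order 10^120?
import math

bits = 120 * math.log2(10)   # log2(10^120)
print(round(bits, 1))        # -> 398.6, i.e. 10^120 is about 2^400

# Brute-force bound: a key of n bits needs ~2^n trials, so once n
# comfortably exceeds ~400, the universe's estimated 10^120
# elementary operations cannot enumerate the key space.
assert 2 ** 399 > 10 ** 120 > 2 ** 398
```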

[A different point: I can hardly think that superstring theory should be 
stopped just because we NOW have no means of testing it. This is like Dr. 
Michael Behe saying there must be an intelligent designer because Dr. 
Michael Behe cannot figure out how life evolved. It is immoral ever to 
stop inquiry. Unless you run out of grant money, of course.]

Forum: December 2005

Einstein was one of the founders of quantum mechanics, yet he
disliked the randomness that lies at the heart of the theory. God
does not, he famously said, play dice. However, quantum theory has
survived a century of experimental tests, although it has yet to be
reconciled with another of Einstein's great discoveries - the
general theory of relativity. Below four theorists - Gerard 't
Hooft, Edward Witten, Fay Dowker and Paul Davies - outline their
views on the current status of quantum theory and the way forward

   Gerard 't Hooft argues that the problems we face in reconciling
  quantum mechanics with general relativity could force us to
reconsider the basic principles of both theories.

Gerard 't Hooft

If there is any preconceived notion concerning the laws of nature -
one that we can rely on without any further questioning - it is the
   assumption that they are controlled by strict logic. Under all
  conceivable circumstances, the laws of nature should dictate how
   the universe evolves. Curiously, however, quantum mechanics has
given a new twist to this adage. It does not allow a precise
sequence of events to be predicted, only statistical averages. All
statistical averages can be predicted - in principle with infinite
   accuracy - but nothing more than that.

Einstein was one of the first people to protest against this
impoverishment of the concept of logic. It has turned out, however,
to be a fact of life. Quantum mechanics is the only known realistic
description of the microscopic parts of our universe like atoms and
  molecules, and it works just fine. Logically impoverished or not,
  quantum mechanics appears to be completely self-consistent.

But how does quantum mechanics tie in with particles that are much
smaller than atoms? The Standard Model is the beautiful solution to
   two fundamental problems: one, how to combine quantum mechanics
with Einstein's theory of special relativity; and two, how to
explain numerous experimental observations concerning the behaviour
  of sub-atomic particles in terms of a concise theory. This model
tells us how far we can go with quantum mechanics. Provided that we
  adhere strictly to the principles of quantum field theory, nature
   obeys both quantum mechanics and special relativity up to
arbitrarily small distance and time scales.

   Just like all other successful theories of nature, the Standard
Model obeys the notions of locality and causality, which makes this
theory completely comprehensible. In other words, the physical laws
of this theory describe in a meaningful way what happens under all
   conceivable circumstances. The standard theory of general
  relativity, which describes the gravitational forces in the
macroscopic world, approaches a similar degree of perfection.
Einstein's field equations are local, and here, cause also precedes
effect in a local fashion. These laws, too, are completely
comprehensible.

But how can we combine the Standard Model with general relativity?
Many theorists appear to think that this is just a technical
problem. But if I say something like "quantum general relativity is
  not renormalizable", this is much more than just a technicality.
Renormalizability has made the Standard Model possible, because it
lets us answer the question of what happens at extremely tiny
   distance scales. Or, more precisely, how can we see that cause
  precedes effect there? If cause did not precede effect, we would
  have no causality or locality - and no theory at all.

  Asking both questions in quantum gravity does not appear to make
   sense. At distance scales small compared with the Planck scale,
   some 10^-33 cm, there seems to be no such thing as a space-time
  continuum. That is because gravity causes space-time to be highly
curved at very small distances. And at small distance scales, this
curvature exceeds all bounds. But what exactly does this mean? Are
  space and time discrete? What then do concepts such as causality
and locality mean? Without proper answers to such questions, there
is no logically consistent formalism, not even a quantum-mechanical
one.

   One ambitious attempt to combine quantum mechanics with general
  relativity is superstring theory. However, I am unhappy with the
   answers that this theory seems to suggest to us. String theory
  seems to be telling us to believe in "magic": it is claimed that
  "duality theorems", which are not properly understood, will allow
us to predict features without reference to locality or causality.
  To me such magic is synonymous with "deceit". People only rely on
  magic if they do not understand what is really going on. This is
   not acceptable in physics.

  In thinking about these matters, I have reached a conclusion that
  few other researchers have adopted: the problem lies with quantum
mechanics, possibly with general relativity, or conceivably with
both.

  Quantum mechanics could well relate to micro-physics the same way
  that thermodynamics relates to molecular physics: it is formally
  correct, but it may well be possible to devise deterministic laws
  at the micro scale. However, many researchers say that the
  mathematical nature of quantum mechanics does not allow this - a
  claim deduced from what are known as "Bell inequalities". In 1964
   John Bell showed that a deterministic theory should, under all
   circumstances, obey mathematical inequalities that are actually
  violated by the quantum laws.
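Bell's result is commonly illustrated with the CHSH form of the inequality: any local deterministic model obeys |S| <= 2, while quantum mechanics predicts E(a, b) = -cos(a - b) for spin measurements on a singlet pair, giving |S| = 2*sqrt(2) at the standard angle settings. A minimal numerical sketch (not from the article):

```python
# CHSH illustration: local deterministic theories obey |S| <= 2;
# the quantum singlet-state correlation E(a, b) = -cos(a - b)
# violates this bound at the standard measurement angles.
import math

def E(a, b):
    # Quantum-mechanical correlation for a spin singlet
    return -math.cos(a - b)

a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))   # -> ~2.828, i.e. 2*sqrt(2) > 2: the bound is violated
```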

  This contradiction, however, arises if one assumes that the
  particles we talk about, and their properties, are real, existing
entities. But if we assume that objects are only real if they have
been precisely defined, including all oscillations as small as the
Planck scale - and that only our measurements of the properties of
  particles are real - then there is no blatant contradiction. One
might assume that all macroscopic phenomena, such as particle
positions, momenta, spins and energies, relate to microscopic
  variables in the same way thermodynamic concepts such as entropy
  and temperature relate to local, mechanical variables. Particles,
  and their properties, are not (or not entirely) real in the
ontological sense. The only realities in this theory are the things
  that happen at the Planck scale. The things we call particles are
  chaotic oscillations of these Planckian quantities. What exactly
these Planckian degrees of freedom are, however, remains a mystery.

  This leads me to an even more daring proposition. Perhaps general
  relativity does not appear in the formalism of the ultimate
equations of nature. In making the transition from a deterministic
theory to a statistical - i.e. quantum mechanical - treatment, one
may find that the quantum description develops many more symmetries
than the deeper deterministic description.

Let me try to clarify what I mean. If, according to the
   deterministic theory, two different states evolve into the same
  final state, then quantum mechanically these states will be
  indistinguishable. We call such a feature "information loss". In
  quantum field theories such as the Standard Model, we often work
   with fields that are not directly observable, because of "gauge
invariance", which is a symmetry. Now, I propose to turn this
  around. In a deterministic theory with information loss, certain
  states are unobservable (because information about them has
disappeared). When one uses a quantum-mechanical language to
describe such a situation, gauge symmetries naturally arise. These
symmetries are not present in the initial laws. The "general
co-ordinate covariance" of general relativity could be just such a
  symmetry. This is indeed an unusual view on the concept of
   symmetries in nature.
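A toy model (my own construction, not 't Hooft's formalism) may make the information-loss idea concrete: under a deterministic evolution law in which distinct states share a successor, those states become indistinguishable after one step, and only their equivalence classes behave like genuine states.

```python
# Toy illustration of "information loss" in a deterministic system:
# states 1 and 3 evolve to the same successor, so their histories
# cannot be distinguished afterwards.
step = {0: 1, 1: 2, 2: 1, 3: 2}   # deterministic evolution law

def evolve(state, n):
    for _ in range(n):
        state = step[state]
    return state

assert evolve(1, 1) == evolve(3, 1) == 2   # merged after one step

# Group initial states into equivalence classes by their successor;
# only these classes remain observable.
classes = {}
for s in step:
    classes.setdefault(step[s], set()).add(s)
print(classes)   # -> {1: {0, 2}, 2: {1, 3}}
```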

Nature provides us with one indication that perhaps points in this
direction: the unnatural, tiny value of the cosmological constant
Λ. It indicates that the universe has a propensity to stay flat.
Why this happens is a mystery that cannot be explained in any
  theory in which gravitation is subject to quantum mechanics. If,
  however, an underlying, deterministic description naturally
   features some preferred flat co-ordinate frame, the puzzle will
  cease to perplex us. There might be another example, which is the
  preservation of the symmetry between the quarks in the subatomic
   world, called charge-parity (CP) symmetry - a symmetry that one
  would have expected to be destroyed by their strong interactions.

The problem of the cosmological constant has always been a problem
of quantum gravity. I am convinced that the small value of Λ cannot
  be reconciled with the standard paradigms of quantized fields and
general relativity. It is obvious that drastic modifications in our
way of thinking, such as the ones hinted at in this text, are
  required to solve the problems addressed here.

   Edward Witten thinks that one of the most perplexing aspects of
  quantum mechanics is how to apply it to the whole universe

   Edward Witten

Quantum mechanics is perplexing, and likely to remain so. The
   departure from ordinary classical intuition that came with the
   emergence of quantum mechanics is almost surely irrevocable. An
improved future theory, if there is one, will probably only lead us
   farther afield.

   Is there any hint of a clue that might lead to a more complete
  theory? Experimental physicists are increasingly able to perform
   experiments that used to be called thought experiments in
textbooks. Quantum mechanics has held up brilliantly. If there is a
cloud on the horizon, it is that it is hard to see what it means to
apply quantum mechanics to the whole universe. I suppose that there
  are two aspects to this. Quantum-mechanical probabilities do not
  seem to make much sense when applied to the whole universe, which
appears to happen only once. And we all find it confusing to
  include ourselves in a quantum description of the whole universe.

   Yet applying quantum mechanics to something less than the whole
  universe - to an experimental system that is observed by a
classical human observer - is precisely what forces us to interpret
quantum mechanics in terms of probabilities. If we had a good
  understanding of what quantum mechanics means when applied to the
  whole universe, we might ultimately say that the notion that "God
  plays dice" results from trying to describe a quantum reality in
  classical terms.

  Fay Dowker thinks that the puzzles of quantum mechanics could be
solved by considering what are known as the "histories" of a
  system, as introduced by Richard Feynman

Fay Dowker

   The development of quantum mechanics was a major advance in our
understanding of the physical world. However, quantum mechanics has
  not yet come fully to fruition because it has not replaced
classical mechanics in the way that general relativity has replaced
  Newtonian gravity. In the latter case, we can start from general
   relativity and derive the laws of Newtonian gravity as an
   approximation; we can also predict when - and quantitatively to
what extent - that approximation is valid.

But we cannot yet derive classical mechanics from quantum mechanics
in the same way. The reason is that, in its standard textbook
formulation, quantum mechanics requires us to assume we have
  classical measuring equipment. Predictions about the measurements
  that are recorded, or observed, by this equipment form the
scientific output of the theory. But without a classical observer,
   we cannot make any predictions. While many physicists have been
content with quantum mechanics in its textbook form, others -
beginning with Einstein - have sought to complete the quantum
revolution and make it a truly universal theory, independent of any
  classical crutch.

One attempt to sort out quantum mechanics is to view it as a
generalization of classical "stochastic" theories, such as Brownian
motion. In Brownian motion, a particle moves along one of a number
  of possible trajectories, or "histories". The notion of a history
is crucial here: it is a complete description of the system at each
time between some initial and final times. A history is an a priori
possibility for the complete evolution of the system, and the
collection of all the histories is called the "sample space".

The system will have only one actual history from the sample space
but any one is an a priori possibility. The actual history is
   chosen from the sample space at random according to the "law of
  motion" for a Brownian particle. This law is a probability
   distribution, or "measure", on the sample space that outlines,
roughly, how likely each history is for the actual evolution.
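The histories picture for a classical stochastic process can be sketched with a discrete random walk (an illustrative construction, not from the article): every possible step sequence is an a priori history, and the law of motion is a probability measure on that sample space.

```python
# Histories view of an unbiased discrete random walk: the sample
# space is every possible +/-1 step sequence, and the measure
# assigns each history the probability 2^-n.
import itertools
import random

n_steps = 4
sample_space = list(itertools.product([-1, 1], repeat=n_steps))
print(len(sample_space))   # -> 16 possible histories (2^4)

# The actual history is drawn at random according to the measure;
# for an unbiased walk the measure is uniform.
actual = random.choice(sample_space)
prob = 1 / len(sample_space)
```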

  Quantum mechanics also has a sample space of possible histories -
  trajectories of a particle, say - but on this occasion the sample
space has a "quantal measure" associated with it. As with Brownian
  motion, the quantal measure gives a non-negative number for each
  subset of the sample space. However, this quantal measure cannot
  now be interpreted as a probability because of the phenomenon of
quantum interference, which means that the numbers cannot be added
  together like probabilities.

   For example, when electrons pass through a Young's double-slit
   set-up, the quantal measure of the set of all histories for the
  electron that ends up at a particular region on the screen is not
just the quantal measure of the set of histories that goes through
one slit added to the quantal measure of the set of histories that
goes through the other. Essentially, this is due to the phenomenon
  we call quantum interference between histories, which is due, in
   turn, to the way we calculate the quantum measure of a bunch of
  histories as the square of the sum of the amplitudes of the
  histories in the bunch. When you add some numbers and then square
the result, you do not get the sum of the squares - there are also
   cross terms, which are the expression of the interference that
spoils the interpretation as probabilities.
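The non-additivity of the quantal measure can be shown in a few lines; the amplitudes below are illustrative choices, not values from any specific experiment:

```python
# Why the quantal measure is not additive: with complex amplitudes
# A1 (histories through slit 1) and A2 (slit 2), the measure of
# the union is |A1 + A2|^2, which differs from |A1|^2 + |A2|^2 by
# the interference cross term 2*Re(A1 * conj(A2)).
import cmath

A1 = cmath.exp(1j * 0.0) / 2   # illustrative amplitudes
A2 = cmath.exp(1j * 2.0) / 2

together = abs(A1 + A2) ** 2
separate = abs(A1) ** 2 + abs(A2) ** 2
cross = 2 * (A1 * A2.conjugate()).real

print(together, separate)   # not equal: the cross term survives
assert abs(together - (separate + cross)) < 1e-12
```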

  The challenge is to find the right interpretation of this quantal
  measure, one that explains the textbook rules by predicting
   objectively when classical "measurement" situations arise. This
  includes the struggle to understand quantum mechanics as a theory
  that respects relativistic causality in the face of experimental
evidence that widely separated particles can be correlated in ways
  that seem incompatible with special relativity.

It is no coincidence that those physicists who are at the forefront
of developing this histories approach to quantum mechanics - people
like James Hartle from the University of California at Santa
Barbara, Chris Isham at Imperial College, London and Rafael Sorkin
  at Syracuse University - all work on the problem of quantum
gravity, which is the attempt to bring gravity within the framework
  of a universal quantum theory. In histories quantum gravity, each
history in the sample space of possibilities is not in space-time;
rather, each history is a space-time. If a theory of quantum
  gravity of this sort can be achieved, it would embody Einstein's
  hopes for a unification in which matter and space-time, observer
   and observed, are all treated on an equal footing.

  Paul Davies believes that the complexity of a system could define
  the boundary between the quantum and classical worlds

  Paul Davies

  Despite its stunning success in describing a wide range of
phenomena in the micro-world, quantum mechanics remains a source of
puzzlement. The trouble stems from meshing the quantum to the
  classical world of familiar experience. A quantum particle can be
in a superposition of states - for example it may be in many places
  at once - whereas the "classical" world of observation reveals a
single reality. This conundrum is famously captured by the paradox
of Schrödinger's cat, in which a quantum superposition is amplified
in order to put an animal into an apparently live-dead hybrid
state.

   Physicists divide into those who believe quantum mechanics is a
complete theory that applies to the universe as a whole, regardless
   of scale, and those who think it must break down at some level
between atom and observer. The former group subscribe to the "many
   universes" interpretation, according to which all branches of a
quantum superposition are equally valid and describe parallel
   realities. Though many physicists reject this interpretation as
   unacceptably bizarre, there is no consensus on the alternative.
   Quantum mechanics does not seem to fail at any obvious scale of
  size or mass, as the phenomenon of superconductivity attests. So
perhaps some other property of a physical system signals the
emergence of classicality from the quantum realm? I want to suggest
that complexity may be the appropriate quantity.

  Just how complex must a system be to qualify for the designation
  "classical"? A cat is, I submit, a classical object because it is
complex enough to be either alive or dead, and not both at the same
time. But specifying a precise measure of complexity is difficult.
Many definitions on offer are based on information theory or
computing. There is, however, a natural measure of complexity that
   derives from the very nature of the universe.

This is defined by the maximum amount of information that the
   universe can possibly have processed since its origin in a Big
  Bang. Seth Lloyd of the Massachusetts Institute of Technology has
computed this to be about 10^120 bits (2000 Nature 406 1047 and
2002 Phys. Rev. Lett. 88 237901). A system that requires more than
this quantity of information to describe it in detail is so complex
  that the normal mathematical laws of physics cannot be applied to
  arbitrary precision without exceeding the information capacity of
the universe. Cosmology thus imposes a small but irreducible
uncertainty, or fuzziness, in the operation of physical laws.

For most systems the Lloyd limit is irrelevantly large. But quantum
   systems are described by vectors in a so-called Hilbert space,
  which may have a great - indeed infinite - number of dimensions.
   According to my maximum-complexity criterion, quantum mechanics
will break down when the dimensionality of the Hilbert space
   exceeds about 10^120.

A simple example is an entangled state of many electrons. This is a
special form of superposition in which up and down spin
orientations co-exist in all possible combinations. Once there are
  about 400 electrons in such a state, the Lloyd limit is exceeded,
suggesting that it is at this level of complexity that classicality
emerges. Although such a state is hard to engineer, it lies firmly
within the design specifications of the hoped-for quantum computer.
This is a machine that would harness quantum systems to achieve an
exponentially greater level of computing power than a conventional
  computer. If my ideas are right, then this eagerly awaited
  technology will never achieve its full promise.
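The "about 400 electrons" figure follows from the dimension counting: N spin-1/2 particles span a Hilbert space of dimension 2^N, so Davies' criterion asks for the smallest N at which 2^N exceeds Lloyd's bound. A quick check (mine, not from the article):

```python
# Smallest number N of spin-1/2 particles whose Hilbert space
# dimension 2^N exceeds Lloyd's ~10^120 bound.
N = 0
while 2 ** N <= 10 ** 120:
    N += 1
print(N)   # -> 399, i.e. the limit is crossed near N ~ 400
```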

  About the authors

Gerard 't Hooft is in the Institute for Theoretical Physics,
University of Utrecht, the Netherlands, e-mail
g.thooft at phys.uu.nl; Edward Witten is in the Institute for
Advanced Study, Princeton, US, e-mail witten at ias.edu; Fay Dowker is
  at Imperial College, London, UK, e-mail f.dowker at imperial.ac.uk;
   and Paul Davies is professor of natural philosophy in the
  Australian Centre for Astrobiology, Macquarie University, Sydney,
   Australia, e-mail pdavies at els.mq.edu.au
