[Paleopsych] New Scientist: Higher laws and the mind-boggling complexity of life

Premise Checker checker at panix.com
Fri Apr 15 20:30:48 UTC 2005


Higher laws and the mind-boggling complexity of life
http://www.newscientist.com/article.ns?id=mg18524891.000&print=true
5 March 2005
      * Paul Davies is at the Australian Centre for Astrobiology at
        Macquarie University, Sydney

    TAKE a bucketful of subatomic particles. Put them together one way,
    and you get a baby. Put them together another way and you'll get a
    rock. Put them together a third way and you'll get one of the most
    surprising materials in existence: a high-temperature superconductor.
    How can such radically different properties emerge from different
    combinations of the same basic matter?

    The history of science is replete with investigations of the
    unexpected qualities that can arise in complex systems. Shoals of fish
    and ant colonies seem to display a collective behaviour that one would
    not predict from examining the behaviour of a single fish or ant.
    High-temperature superconductors and hurricanes offer two more
    examples where the whole seems to be greater than the sum of its
    parts. What is still hotly disputed is whether all such behaviour can
    ultimately be derived from the known laws of physics applied to the
    individual constituents, or whether it represents the manifestation of
    something genuinely new and different - something that, as yet, we
    know almost nothing about. A new factor could shed light on this most
    fundamental question. And it comes from an entirely unexpected
    quarter: cosmology.

    The standard scientific view, known as reductionism, says that
    everything can ultimately be explained in terms of the "bottom level"
    laws of physics. Take the origin of life. If you could factor in
    everything about the prebiotic soup and its environment - and assuming
    you have a big enough computer - you could in principle predict life
    from the laws of atomic physics, claim the reductionists.

    What has become increasingly clear, however, is that many complex
    systems are computationally intractable. To be sure, their evolution
    might be determined by the behaviour of their components, but
    predictive calculations are exponentially hard. The best that one can
    do is to watch and see how they evolve. Such systems are said to
    exhibit "weak emergence".
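
    To see what "weak emergence" means in practice, here is a minimal
    sketch (an illustration of the general idea, not an example drawn
    from the article): a one-dimensional cellular automaton such as rule
    110. The update rule for each cell is trivially simple, yet in
    general the only way to discover what large-scale patterns appear
    after many steps is to run the system forward and watch.

        # Rule 110: each cell's next value depends only on itself and its
        # two neighbours, yet the long-run patterns must be found by
        # simulation - the hallmark of weak emergence.
        def step(cells, rule=110):
            n = len(cells)
            out = []
            for i in range(n):
                left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
                index = (left << 2) | (centre << 1) | right  # neighbourhood as a number 0..7
                out.append((rule >> index) & 1)              # look up that bit of the rule
            return out

        cells = [0] * 40 + [1] + [0] * 40                    # start with a single live cell
        for _ in range(20):
            print("".join(".#"[c] for c in cells))
            cells = step(cells)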

    But a handful of scientists want to go beyond this, claiming that some
    complex systems may be understood only by taking into account
    additional laws or "organising principles" that emerge at various
    levels of complexity. This point of view is called "strong emergence",
    and it is anathema to reductionists.

    The debate is often cast in the language of "Laplace's demon." Two
    centuries ago, Pierre Laplace pointed out that if a superintelligent
    demon knew at one instant the position and motion of every particle in
    the universe, and the forces acting between them, the demon could do a
    massive calculation and predict the future in every detail, including
    the emergence of life and the behaviour of every human being. This
    startling conclusion remains an unstated act of faith among many
    scientists, and underpins the case for reductionism.

    Laplace's argument contains, however, a questionable assumption: the
    demon must have unlimited computational power. Is this reasonable? In
    recent years, there has been intense research into the physical basis
    of digital computation, partly spurred by efforts to build a quantum
    computer. The late Rolf Landauer of IBM, who was a pioneer of this
    field, stressed that all computation must have a physical basis and
    therefore be subject to two fundamental limitations. The first is
    imposed by the laws of physics. The second is imposed by the resources
    available in the real universe.

    The fundamental unit of information is the bit. In the standard
    binary arithmetic that computers use, a bit is simply a 1 or a 0. The
    most basic operation of information processing is a bit-flip:
    changing a 1 to a 0 or vice versa. Landauer showed that the laws of
    physics constrain the choreography of bit-flips in three ways. The
    first constraint comes from Heisenberg's uncertainty principle of
    quantum mechanics, which sets a minimum time for any operation
    carried out with a given amount of energy. The second is the finite
    speed of light, which restricts the rate at which information can be
    shunted from place to place. The third comes from thermodynamics,
    which treats entropy as the flip side of information: a physical
    system cannot store more bits of information in its memory than its
    total entropy allows.
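
    A minimal numerical sketch of these three limits, assuming their
    textbook forms (the Margolus-Levitin version of the time-energy
    bound, simple light-travel time for signalling, and S/(k ln 2) for
    memory capacity); the example values of 1 electronvolt, 1 centimetre
    and 1 joule per kelvin are arbitrary illustrations, not figures from
    the article:

        import math

        hbar = 1.054571817e-34    # reduced Planck constant, J s
        k_B = 1.380649e-23        # Boltzmann constant, J/K
        c = 2.998e8               # speed of light, m/s

        # 1. Time-energy limit: an operation using energy E takes at least
        #    pi * hbar / (2 * E) seconds.
        E = 1.6e-19               # about 1 eV, an arbitrary example energy
        print(f"minimum time per bit-flip at ~1 eV: {math.pi * hbar / (2 * E):.1e} s")

        # 2. Finite speed of light: a signal needs at least L / c seconds
        #    to cross a memory register of size L.
        L = 0.01                  # a 1 cm register, an arbitrary example size
        print(f"light-crossing time of 1 cm: {L / c:.1e} s")

        # 3. Thermodynamics: a system with total entropy S can hold at most
        #    S / (k_B * ln 2) bits.
        S = 1.0                   # J/K, an arbitrary example entropy
        print(f"maximum bits at S = 1 J/K: {S / (k_B * math.log(2)):.1e}")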

    Given that any attempt to analyse the universe and its processes must
    be subject to these fundamental limitations, how does that affect the
    performance of Laplace's demon? Not at all if the universe possesses
    infinite time and resources: the limitations imposed by physics could
    be compensated for simply by commandeering more of the universe to
    analyse the data. But the real universe is not infinite, at least not
    in the above sense. It originated in a big bang 13.7 billion years
    ago, which means light can have travelled at most 13.7 billion light
    years since the beginning. Cosmologists express this restriction by
    saying that there is a horizon in space 13.7 billion light years away.
    Because nothing can exceed the speed of light, regions of space
    separated by more than this distance cannot be in causal contact: what
    happens in one region cannot affect the other. This means the demon
    would have to make do with the resources available within the horizon.

    Seth Lloyd of the Massachusetts Institute of Technology recently
    framed the issue like this. Suppose the entire universe (within the
    effective horizon) is a computer: how many bits could it process in
    the age of the universe? The answer he arrived at after applying
    Landauer's limits is 10^120 bits. That defines the maximum available
    processing power. Any calculation requiring more than 10^120 bits is
    simply a fantasy, because the entire universe could not carry it out
    in the time available.
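
    A rough back-of-envelope sketch of how a number of this size arises.
    The round figures below - a horizon mass of about 10^53 kilograms and
    an age of 13.7 billion years - are my assumptions rather than Lloyd's
    published inputs, and the operation count is bounded by 2Et/(pi*hbar):

        import math

        hbar = 1.054571817e-34              # reduced Planck constant, J s
        c = 2.998e8                         # speed of light, m/s
        mass_within_horizon = 1e53          # kg, a rough assumed value
        age_of_universe = 13.7e9 * 3.156e7  # seconds

        energy = mass_within_horizon * c**2
        operations = 2 * energy * age_of_universe / (math.pi * hbar)
        # With these round inputs this prints roughly 10^121 - the same
        # order of magnitude as the 10^120 quoted in the article.
        print(f"about 10^{math.log10(operations):.0f} elementary operations")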

    Lloyd's number, vast though it is, defines a fundamental limit on the
    application of physical law. Landauer put it this way: "a sensible
    theory of physics must respect these limitations, and should not
    invoke calculative routines that in fact cannot be carried out". In
    other words, Landauer and Lloyd have discovered a fundamental limit to
    the precision of physics: we are not justified in claiming that a
    law must apply - now, or at any earlier time in the universe's
    existence - unless its computational requirements lie within this
    limit, for even a demon who commandeered the entire cosmos to compute
    could not achieve predictive precision beyond this limit. The inherent
    fuzziness that this limit to precision implies is quite distinct from
    quantum uncertainty, because it would apply even to deterministic
    laws.

    How does this bear on the question of strong emergence - the idea that
    there are organising principles that come into play beyond a certain
    threshold of complexity? The Landauer-Lloyd limit does not prove that
    such principles must exist, but it disproves the long-standing
    reductionist claim that they cannot exist. If the micro-laws - the
    laws of
    physics as we know them - cannot completely determine the future
    states and properties of some physical systems, then there are gaps
    left in which higher-level emergent laws can operate.

    So is there any way to tell if there is some substance to the
    strong-emergentists' claims? For almost all systems, the
    Landauer-Lloyd limit is nowhere near restrictive enough to make any
    difference to the conventional application of physical laws. But
    certain complex systems exceed the limit. If there are emergent
    principles at work in nature, it is to such complex systems that we
    should look for evidence of their effects.

    A prime example is living organisms. Consider the problem of
    predicting the onset of life in a prebiotic soup. A simple-minded
    approach is to enumerate all the possible combinations and
    configurations of the basic organic building blocks, and calculate
    their properties to discover which would be biologically efficacious
    and which would not.

    Calculations of this sort are already familiar to origin-of-life
    researchers. There is considerable uncertainty over the numbers, but
    it scarcely matters because they are so stupendously huge. For
    example, a typical small protein is a chain molecule made up of about
    100 amino acids of 20 varieties. The total number of possible
    combinations is about 10^130, and we must multiply this by the number
    of possible shapes the molecule can take, because its shape affects
    its function. This boosts the answer to about 10^200, already far in
    excess of the Landauer-Lloyd limit, and shows how the remorseless
    arithmetic of combinatorial calculations makes the answers shoot
    through the roof with even modest numbers of components.
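
    The arithmetic behind these figures is easy to check. The sketch
    below counts sequences only; the shape factor of roughly 10^70 is
    inferred from the article's own jump from 10^130 to 10^200 rather
    than calculated independently:

        import math

        # A chain of 100 amino acids drawn from 20 varieties.
        log10_sequences = 100 * math.log10(20)
        print(f"possible sequences: about 10^{log10_sequences:.0f}")        # ~10^130

        # Multiplying by an implied ~10^70 shapes per sequence gives the
        # article's ~10^200 configurations.
        print(f"with shapes included: about 10^{log10_sequences + 70:.0f}")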

    The foregoing calculation is an overestimate because there may be many
    other combinations of amino acids that exhibit biological usefulness -
    it's hard to know. But a plausible guesstimate is that a molecule
    containing somewhere between 60 and 100 amino acids would possess
    qualities that almost certainly couldn't have been divined in the age
    of the universe by any demon or computer, even with all the resources
    of the universe at its disposal. In other words, the properties of
    such a chain simply could not - even in principle - be traced back to
    a reductionist explanation.
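
    A sketch of where that threshold falls if one counts sequences alone:
    20^N first exceeds the roughly 10^120 limit at N of about 92, and
    folding in any shape factor only lowers that number, which is
    consistent with the 60-to-100 band suggested above.

        import math

        # Smallest chain length N for which 20^N exceeds 10^120,
        # i.e. N > 120 / log10(20).
        print(f"sequence count alone passes 10^120 at about {120 / math.log10(20):.0f} amino acids")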

    Strikingly, small proteins possess between 60 and 100 amino acids. The
    concordance between these two sets of numbers, one derived from
    theoretical physics and cosmology, the other from experimental
    biology, is remarkable. A similar calculation for nucleotides
    indicates that strings of DNA longer than about 200 base pairs might
    require additional organising principles to explain their properties.
    Since genes contain upwards of about this
    number of base pairs, the inference is clear: emergent laws may indeed
    have played a part in giving proteins and genes their functionality.
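
    The same counting argument, applied to the four-letter DNA alphabet,
    reproduces the figure of about 200 base pairs (a sketch, sequences
    only):

        import math

        # 4^N exceeds 10^120 when N > 120 / log10(4).
        print(f"threshold: about {120 / math.log10(4):.0f} base pairs")   # ~199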

    Biologists such as Christian de Duve have long argued that life is "a
    cosmic imperative", written into the laws of nature, and will emerge
    inevitably and naturally under the right conditions. However, they
    have never managed to point to the all-important laws that make the
    emergence of life "law-like". It seems clear from the Landauer-Lloyd
    analysis that the known laws of physics won't bring life into
    existence with any inevitability - they don't have life written into
    them. But if there are higher-level, emergent laws at work, then
    biologists like de Duve may be right after all - life may indeed be
    written into the laws of nature. These laws, however, are not the
    bottom-level laws of physics found in the textbooks.

    And while we are looking for phenomena that have long defied
    explanation by reductionist arguments, what about the emergence of
    familiar reality - what physicists call "the classical world" - from
    its basis in quantum mechanics? For several decades physicists have
    argued about how the weird and shadowy quantum micro-world interfaces
    with the concrete reality of the classical macro-world. The problem is
    that a quantum state is generally an amalgam of many alternative
    realities, coexisting in ghostly superposition. The macro-world
    presented to our senses is a single reality. How does the latter
    emerge from the former?

    There have been many suggestions that something springs into play and
    projects out one reality from many. The ideas for this "something"
    range from invoking the effect of the observer's mind to the influence
    of gravitation. It seems clear, however, that neither size nor mass
    is the relevant variable, because some quantum systems -
    superconductors, for example - can extend to everyday dimensions.

    One possible answer is that complexity is the key variable. Could it
    be that a quantum system becomes classical when it is complex enough
    for emergent principles to augment the laws of quantum mechanics,
    thereby bringing about the all-important projection event?

    To find where, on the spectrum from atom to brain, this threshold of
    complexity might lie, we can apply the Landauer-Lloyd limit to quantum
    states of various configurations. One such complex state, known as an
    entangled state, consists of a collection of particles such as electrons
    with spins directed either up or down, and linked together in a
    quantum superposition. Entangled states of this variety are being
    intensively studied for their potential role in quantum computation.
    The number of up-down combinations grows exponentially with the number
    of electrons, so that by the time one has about 400 particles the
    superposition has more components than the Landauer-Lloyd limit. This
    suggests that the transition from quantum to classical might occur, at
    least in spin-entangled states, at about 400 particles. Though this is
    beyond current technology, future experiments could test the idea.
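
    The figure of about 400 follows from the same counting argument (a
    sketch): a superposition of N spin-1/2 particles has 2^N components,
    which first exceeds the roughly 10^120 limit just below N = 400.

        import math

        # 2^N exceeds 10^120 when N > 120 / log10(2).
        print(f"threshold: about {120 / math.log10(2):.0f} entangled spins")   # ~399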

    For 400 years, a deep dualism has lain at the heart of science. On the
    one hand the laws of physics are usually considered universal,
    absolute and eternal: for example, a force of 2 newtons acting on a
    2-kilogram mass will cause it to accelerate by 1 metre per second per
    second, wherever and whenever in the universe the force is applied.

    On the other hand, there is another factor in our description of the
    physical world: its states. These are not fixed in time. All the
    states of a physical system - whether we are talking about a hydrogen
    atom, a box full of gas or the recorded prices on the London stock
    market - are continually moving and changing in a variety of ways.

    In our descriptions of how physical systems behave, these states are
    as important as the laws. After all, the laws act on the states to
    predict how the system will behave. A law without a state is like a
    traffic rule in a world with no cars: it doesn't have any application.

    What the new paradigm suggests is that the laws of physics and the
    states of the real world might be interwoven at the deepest level. In
    other words, the laws of physics do not sit, immutable, above the real
    world, but are affected by it.

    That sounds almost heretical, but some physicists - most notably John
    Wheeler - have long speculated that the laws of physics might not be
    fixed "from everlasting to everlasting", to use his quaint expression.
    Most cosmologists treat the laws of physics as "given", transcending
    the physical universe. Wheeler, however, insisted the laws are
    "mutable" and in some manner "congeal" into their present form as the
    universe expands from an initial infinitely dense state. In other
    words, the laws of physics might emerge continuously from the ferment
    of the hot big bang.

    It seems Wheeler's ideas and the Landauer-Lloyd limit point in the
    same direction. And that means the entire theory of the very early
    universe could be back in the melting pot.


