[Paleopsych] NYTBR: 'A Different Universe': You Are More Important Than a Quark

Premise Checker checker at panix.com
Sat Jun 18 23:29:45 UTC 2005

'A Different Universe': You Are More Important Than a Quark
New York Times Book Review, 5.6.19
[First chapter appended.]


    EVERY child knows how to learn what makes a toy work: bust it open. In
    that sense, we're all born reductionists, whose philosophy holds that
    anything can be explained by breaking it into its component parts. By
    analyzing them, one discovers how the parts act together to produce
    larger phenomena. If you crack open a windup clock, you can examine
    its gears to see what makes it tick.

    Some people resent reductionism because it sweeps away many mysteries.
    Behind spooky phenomena, reductionists have shown, are the ordinary
    ticktocks of nature's machinery, the concealed ropes and pulleys of
    cosmic-scale Penn and Teller tricks. Indeed, reductionism has
    reinforced the old philosophical suspicion that there is something
    vaguely unreal about ''reality'': as the Greek philosopher Democritus
    said, it's all just atoms and the void. To a hyper-reductionist, the
    invisibly small microworld is more ''real'' than everything else.
    Bigger objects -- cats, toasters, people, the sun, galactic
    superclusters -- are just second-order consequences. The atoms or
    quarks or leptons (or ''strings,'' if you follow the latest trendy
    theories) are what count, while you and I are just ephemera.

    It's a disillusioning view, but so far it has yielded undeniable
    benefits. By breaking matter into atoms, subatomic particles and
    subatomic forces, and by disassembling living organisms into such
    discrete elements as cells, genes, enzymes and so forth, scientists
    have learned much about how nature works, and how we can make it do
    our bidding.

    Inevitably, reductionism has been overused. Not everything can be
    reduced to cosmic nuts and bolts. In the emerging sciences of the 21st
    century, many researchers are dusting off an old saying: ''The whole
    is more than the sum of its parts.''

    A recent example: many molecular biologists once thought the chemical
    information stored on DNA coded for the full complexity of living
    organisms. But a few years ago, the Human Genome Project revealed
    people have far too few genes (not many more than a roundworm) to
    account for the kaleidoscopic complexity of the human body. By itself,
    it appears, DNA cannot explain it any more than you can infer the
    United States Constitution from the traffic laws of Topeka. Somehow,
    biologists propose, higher-level ''organizational'' or ''emergent''
    principles switch on at larger scales.

    Even physicists, wizards of the nonliving realm, are talking about
    emergent properties. Their change of heart is not easy, though, as
    Robert B. Laughlin, who received a Nobel Prize in Physics, shows us in
    his important, brain-tickling new book, ''A Different Universe.''
    Like the proverbial man with a hammer, to whom everything looks
    like a nail, some physicists
    spent decades trying to explain everything in terms of particles;
    thus, gravity was attributed to a hypothetical ''graviton.'' In recent
    decades, though, a few physicists have won acclaim for experiments
    with antireductionist implications. One example is a bizarre
    laboratory phenomenon called superfluidity, in which liquefied helium
    crawls vertically out of its beaker like the gelatinous monster in
    ''The Blob.''

    Laughlin, who teaches at Stanford University, illuminates emergent
    principles through a charming analogy: the paintings of Renoir and
    Monet. Up close the paintings look like ''daubs of paint,'' nothing
    more. Yet when we step back from the canvases, we see fields of
    flowers. ''The imperfection of the individual brush strokes tells us
    that the essence of the painting is its organization. Similarly'' --
    Laughlin adds in a most unexpected segue -- ''the ability of certain
    metals to expel magnetic fields exactly when they are refrigerated to
    ultralow temperatures strikes us as interesting because the individual
    atoms out of which the metal is made cannot do this.''

    A major step toward recognition of emergent phenomena was a discovery
    about electrical conductivity in 1980 by the German physicist Klaus
    von Klitzing. To understand its significance, be aware of its
    historical context: in the 19th century Edwin Hall had discovered
    principles of electrical conductivity usually called the Hall effect,
    and for a century afterward electrical conduction had been understood
    as simple Newtonian motion of electrons in a metal.

    Von Klitzing found a totally unexpected phenomenon -- that Hall
    conductivity in strong magnetic fields and ultralow temperatures
    changes in a precise, stepwise fashion as the field strength is
    varied. What identifies the effect as emergent is its precision and
    the fact that it disappears in small samples. The Nobel Prize in
    Physics awarded to him in 1985 specifically cites this work. Laughlin
    and two colleagues shared the 1998 prize for their studies of a
    similar phenomenon, one even more bizarre than von Klitzing's,
    ''unanticipated by any theory and not analogous to anything previously
    known in nature,'' as Laughlin writes.
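    For readers who want the formula behind the words: in standard
    notation from the physics literature (not from the review itself),
    von Klitzing's result is that the Hall conductance can take only the
    quantized values

    ```latex
    \sigma_{xy} = \nu \, \frac{e^{2}}{h}, \qquad \nu = 1, 2, 3, \ldots
    ```

    where e is the electron charge and h is Planck's constant. The
    quantization holds to better than one part in a billion regardless of
    the sample's material or imperfections, which is why it is read as a
    signature of emergence; in the fractional effect that Laughlin helped
    explain, nu takes certain rational values such as 1/3.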

    Talk of emergence makes many scientists nervous. The word, after all,
    has been co-opted by all kinds of people who have bowdlerized it,
    along with once precise terms like ''holistic'' and ''paradigm,'' for
    trivial purposes. More pertinent, emergence seems to defy common
    sense, just as the notion of the sphericity of the earth once did.
    There are no emergent principles in money, for example: 100 million
    pennies equals $1 million, not an emergent $2 million. To our primate
    brains, the whole is the sum of its parts. But when I once griped
    about the counterintuitiveness of quantum physics, a scientist at the
    University of Illinois replied dryly, ''Common sense is a poor guide
    to the nature of reality.''
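    The pennies example is pure linear arithmetic, which is exactly the
    reviewer's point: in money, nothing new appears at scale. A minimal,
    purely illustrative sketch (the function name is hypothetical):

    ```python
    def total_dollars(pennies: int) -> int:
        """Value of a pile of pennies in whole dollars: exactly the
        linear sum of its parts, with nothing 'emergent' on top."""
        return pennies // 100

    # 100 million pennies is exactly $1 million, not an emergent $2 million.
    print(total_dollars(100_000_000))  # -> 1000000
    ```
    
    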

    Laughlin's thesis is intriguing, if not completely persuasive. I can't
    help wondering if hard-core reductionists will eventually explain
    emergent phenomena in reductionist terms; they've pulled rabbits out
    of hats before. Still, his thesis reminds us of the great value of
    something most physicists assume they can live without: philosophy.
    Behind the seemingly concrete principles, practices and instruments of
    any laboratory, there are certain philosophical assumptions, often
    unexamined. In the 19th century physicists were hypnotized by the myth
    of the cosmic ether, an invisible medium through which light rippled,
    as waves ripple across a pond. In 1905, Albert Einstein, then a young
    patent clerk, awakened them. Likewise, Laughlin says, physicists face
    a philosophical ''crisis'' over emergence, ''a confrontation between
    reductionist and emergent principles that continues today.'' In the
    history of science, philosophical crises often precede scientific
    revolutions.

    This year is the 100th anniversary of Einstein's revolution. In
    Laughlin's view, another physics revolution is coming. He mocks
    speculations in the 1990's about an imminent end of science: ''We live
    not at the end of discovery but at the end of Reductionism, a time in
    which the false ideology of human mastery of all things through
    microscopics is being swept away by events and reason.'' To invoke a
    familiar metaphor, physicists have fruitfully spent the last century
    trying to map every twig, acorn and bird's nest in the trees. Now it's
    time to step back and see the forest.

    Keay Davidson, a science writer for The San Francisco Chronicle, is
    the author of ''Carl Sagan: A Life.''


First chapter of 'A Different Universe'


    Frontier Law

      Nature is a collective idea, and, though its essence exist in each
      individual of the species, can never in its perfection inhabit a
      single object.

        -- Henry Fuseli

    Many years ago, when I was living near New York, I attended a
    retrospective of Ansel Adams, the great nature photographer, at the
    Museum of Modern Art. Like many people born in the American West, I
    had always liked Mr. Adams's work and felt I appreciated it better
    than New Yorkers ever could, so I jumped at the chance to see it
    firsthand. It was well worth the effort. Anyone seeing these images
    close up realizes at once that they are not simply sterile pictures of
    rocks and trees but thoughtful comments on the meaning of things, the
    immense age of the earth, and the impermanence of human concerns. This
    exhibition made a much stronger impression on me than I had expected,
    and it flashes into my mind even now when I am wrestling with a tough
    problem or having difficulty separating what is important from what
    is not.

    Public television viewers were reminded recently by Ric Burns's
    excellent American Experience documentary that Mr. Adams's work, like
    any other art, was as much a creation of a specific time and place as
    of the artist himself. In the early part of the twentieth century,
    when Adams was a boy and the frontier had been declared closed,
    Americans debated vigorously over what its loss implied for their
    future. In the end, they decided that they did not want to be like
    Europe, that part of their identity, and of meaningful life generally,
    was in close proximity to wildness. Thus was born the metaphorical
    frontier: the myth of the cowboy, the vast landscape of the possible,
    the ideal of the rugged individual, which defines American culture to
    this day. Adams's work grew to maturity alongside this metaphor and
    derives its power by eliciting the nostalgia for untamed wilderness at
    its core.

    The idea of the frontier is not just quaint provincialism. It is often
    spoken of as such, especially in Europe, where the mythological nature
    of the American West has always been easier to discern than it is here
    and is often viewed with suspicion. I first saw this idea expressed in
    a lengthy article on America in the magazine Stern when I was a
    soldier stationed in Germany in the early 1970s. Such articles are
    appearing with increasing frequency nowadays as the cold war recedes
    into history. But the perception is incorrect. While the confluence of
    cultural forces that generated Adams's images is uniquely American,
    the images themselves are not. The longing for a frontier seems to lie
    deep in the human soul, and people from different parts of the world
    and with different cultural backgrounds understand it quickly and
    intuitively. In no country does one have to dig very deep to find an
    appreciation of, and identification with, wildness. Adams's work
    travels well for this reason and has universal appeal.

    The idea of science as a great frontier is similarly timeless. While
    there are clearly many nonscientific sources of adventure left,
    science is the unique place where genuine wildness may still be found.
    The wildness in question is not the lurid technological opportunism to
    which modern societies seem so hopelessly addicted, but rather the
    pristine natural world that existed before humans arrived: the vast
    openness of the lone rider splashing across the stream with three pack
    animals under the gaze of mighty peaks. It is the choreography of
    ecologies, the stately evolution of minerals in the earth, the motion
    of the heavens, and the birth and death of stars. Rumors of its death,
    to paraphrase Mark Twain, are greatly exaggerated.

    My particular branch of science, theoretical physics, is concerned
    with the ultimate causes of things. Physicists have no monopoly on
    ultimate causes, of course, for everyone is concerned with them to
    some extent. I suspect it is an atavistic trait acquired long ago in
    Africa for surviving in a physical world in which there actually are
    causes and effects, for example between proximity to lions and being
    eaten. We are built to look for causal relations between things and to
    be deeply satisfied when we discover a rule with cascading
    implications. We are also built to be impatient with the
    opposite: forests of facts from which we cannot extract any meaning.
    All of us secretly wish for an ultimate theory, a master set of rules
    from which all truth would flow and that could forever free us from
    the frustration of dealing with facts. Its concern for ultimate causes
    gives theoretical physics a special appeal even to nonscientists, even
    though it is by most standards technical and abstruse.

    It is also a mixture of good news and bad news. First you find that
    your wish for an ultimate theory at the level of human-scale phenomena
    has been fulfilled. We are the proud owners of a set of mathematical
    relationships that, as far as we know, account for everything in the
    natural world bigger than an atomic nucleus. They are very simple and
    beautiful and can be written in two or three lines. But then you find
    that this simplicity is highly misleading, rather like those
    inexpensive digital wristwatches with only one or two buttons. The
    equations are devilishly difficult to manipulate and impossible to
    solve in all but a small handful of instances. Demonstrating that they
    are correct requires arguments that are lengthy, subtle, and
    quantitative. It also requires familiarity with a huge body of work
    done after the Second World War. While the basic ideas were invented
    by Schrödinger, Bohr, and Heisenberg in the 1920s, it was not until
    powerful electronic computers were developed and armies of technically
    competent people were generated by governments that these ideas could
    be tested quantitatively against experiment over a wide range of
    conditions. Key technical developments, such as the purification of
    silicon and the perfection of atomic beam machines, were also
    important. Indeed, we might never have known for certain that the
    whole thing was correct had it not been for the cold war and the
    economic importance of electronics, radar, and accurate timekeeping,
    which made financing easy on various ostensibly practical grounds.
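    The ''two or three lines'' Laughlin alludes to are, presumably, the
    nonrelativistic many-body Schrodinger equation for electrons and
    nuclei interacting through Coulomb forces, which in Gaussian units
    reads

    ```latex
    \mathcal{H}\,\Psi = E\,\Psi, \qquad
    \mathcal{H} = -\sum_{j} \frac{\hbar^{2}}{2m}\nabla_{j}^{2}
                  -\sum_{\alpha} \frac{\hbar^{2}}{2M_{\alpha}}\nabla_{\alpha}^{2}
                  +\sum_{j<k} \frac{e^{2}}{|\mathbf{r}_{j}-\mathbf{r}_{k}|}
                  +\sum_{\alpha<\beta} \frac{Z_{\alpha}Z_{\beta}e^{2}}
                        {|\mathbf{R}_{\alpha}-\mathbf{R}_{\beta}|}
                  -\sum_{j,\alpha} \frac{Z_{\alpha}e^{2}}
                        {|\mathbf{r}_{j}-\mathbf{R}_{\alpha}|}
    ```

    where the r_j are electron positions, the R_alpha are nuclear
    positions with charges Z_alpha and masses M_alpha, and m is the
    electron mass. Writing it down is easy; solving it for anything
    bigger than a few particles is, as the chapter says, devilishly hard.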

    Thus eighty years after the discovery of the ultimate theory we find
    ourselves in difficulty. The repeated, detailed experimental
    confirmation of these relationships has now officially closed the
    frontier of reductionism at the level of everyday things. Like the
    closing of the American frontier, this is a significant cultural
    event, causing thoughtful people everywhere to debate what it means
    for the future of knowledge. There is even a best-selling book
    exploring the premise that science is at an end and that meaningful
    fundamental discovery is no longer possible. At the same time, the
    list of even very simple things found "too difficult" to describe with
    these equations continues to lengthen alarmingly.

    Those of us out on the real frontier listening to the coyotes howl at
    night find ourselves chuckling over all this. There are few things a
    real frontiersman finds more entertaining than insights about
    wilderness from people back in civilization who can barely find the
    supermarket. I find this moment in history charmingly similar to Lewis
    and Clark's wintering on the Columbia estuary. Through grit and
    determination their party had pushed its way across a continent, only
    to discover that the value had not been in reaching the sea but in the
    journey itself. The official frontier at that time was a legal fiction
    having more to do with property rights and homesteading policy than a
    confrontation with nature. The same is true today. The real frontier,
    inherently wild, may be found right outside the door, if one only
    cares to look.

    Despite being a wild place, the frontier is regulated by laws. In the
    mythical old West the law meant the force of civilization in a land
    where there was none, and it was often enforced by some heroic figure
    holding back the wildness of human nature through strength of will. A
    man had a choice of whether to obey this law or not, but he stood a
    good chance of getting gunned down if he did not. But there are
    natural laws as well, relationships among things that are always true
    regardless of whether people are present to observe them. The sun
    rises every morning. Heat flows from hot things to cold ones. Herds of
    deer spotting cougars always dash away. These are the exact opposite
    of laws of myth, in that they flow out of wildness and constitute its
    essence rather than being a means for its containment. Indeed,
    describing these things as laws is somewhat misleading, for it implies
    a kind of statute that otherwise willful natural things choose to
    obey. This is not correct. It is a codification of the way natural
    things are.

    The important laws we know about are, without exception, serendipitous
    discoveries rather than deductions. This is fully compatible with
    one's everyday experience. The world is filled with sophisticated
    regularities and causal relationships that can be quantified, for this
    is how we are able to make sense of things and exploit nature to our
    own ends. But the discovery of these relationships is annoyingly
    unpredictable and certainly not anticipated by scientific experts.
    This commonsense view continues to hold when the matter is examined
    more carefully and quantitatively. It turns out that our mastery of
    the universe is largely a bluff: all hat and no cattle. The argument
    that all the important laws of nature are known is simply part of this
    bluff. The frontier is still with us and still wild.

    The logical conflict between an open frontier on the one hand and a
    set of master rules on the other is resolved by the phenomenon of
    emergence. The term emergence has unfortunately grown to mean a number
    of different things, including supernatural phenomena not regulated by
    physical law. I do not mean this. I mean a physical principle of
    organization. Human societies obviously have rules of organization
    that transcend the individual. An automobile company, for example,
    does not cease to exist if one of its engineers gets run over by a
    truck. The government of Japan does not change very much after an
    election. But the inanimate world also has rules of organization, and
    they similarly account for many things that matter to us, including
    most of the higher-level physical laws we use in our daily lives. Such
    commonplace things as the cohesiveness of water or the rigidity of
    steel are simple cases in point, but there are countless others.
    Nature is full of highly reliable things that are primitive versions
    of impressionist paintings. A field of flowers rendered by Renoir or
    Monet strikes us as interesting because it is a perfect whole, while
    the daubs of paint from which it is constructed are randomly shaped
    and imperfect. The imperfection of the individual brush strokes tells
    us that the essence of the painting is its organization. Similarly,
    the ability of certain metals to expel magnetic fields exactly when
    they are refrigerated to ultralow temperatures strikes us as
    interesting because the individual atoms out of which the metal is
    made cannot do this.

    Since principles of organization (or, more precisely, their
    consequences) can be laws, these can themselves organize into new
    laws, and these into still newer laws, and so on. The laws of electron
    motion beget the laws of thermodynamics and chemistry, which beget the
    laws of crystallization, which beget the laws of rigidity and
    plasticity, which beget the laws of engineering. The natural world is
    thus an interdependent hierarchy of descent not unlike Jonathan
    Swift's society of fleas:

      So, naturalists observe, the flea
      Has smaller fleas that on him prey;
      And these have smaller still to bite 'em
      And so proceed ad infinitum.

    This organizational tendency is so powerful that it can be difficult
    to distinguish a fundamental law from one of its progeny. The only way
    we know that the behavior of cats is not fundamental, for example, is
    because cats fail to work when pushed beyond their proper operating
    limits, so to speak. Similarly, the only way we know atoms are not
    fundamental is that they come apart when caused to collide at great
    speed. This principle continues down to smaller and smaller scales:
    the nuclei from which atoms are made come apart when caused to collide
    at greater speed, the parts liberated from the nucleus come apart at
    even greater speeds, and so forth. Thus the tendency of nature to form
    a hierarchical society of physical laws is much more than an academic
    debating point. It is why the world is knowable. It renders the most
    fundamental laws, whatever they are, irrelevant and protects us from
    being tyrannized by them. It is the reason we can live without
    understanding the ultimate secrets of the universe.

    Thus the end of knowledge and the closing of the frontier it
    symbolizes is not a looming crisis at all, but merely one of many
    embarrassing fits of hubris in civilization's long history. In the end
    it will pass away and be forgotten. Ours is not the first generation
    to struggle to understand the organizational laws of the frontier,
    deceive itself that it has succeeded, and go to its grave having
    failed. One would be wise to be humble, like the Irish fisherman
    observing quietly that the sea is so wide and his boat so small. The
    wildness we all need to live, grow, and define ourselves is alive and
    well, and its glorious laws are all around.
