[Paleopsych] Hedgehog Review: Albert Borgmann: On the Blessings of Calamity and the Burdens of Good Fortune

Premise Checker checker at panix.com
Sun Oct 10 22:48:13 UTC 2004

Albert Borgmann: On the Blessings of Calamity and the Burdens of Good Fortune

          Albert Borgmann is Professor of Philosophy at the University of
    Montana. Professor Borgmann's work has been the topic of conferences
    and books such as Technology and the Good Life? edited by Eric Higgs,
    Andrew Light, and David Strong (2000). His books include: Holding On
    to Reality: The Nature of Information at the Turn of the Millennium
    (1999); Crossing the Postmodern Divide (1992); and Technology and the
    Character of Contemporary Life: A Philosophical Inquiry (1984).

          It is difficult now to recall the world of the nineties. At the
    time it seemed like the beginning of boundless prosperity, inspired by
    the manifest destiny of exploring and settling the new world of
    cyberspace, an era in which the iron laws of gravity and economics had
    been abrogated, a time of challenges that called for boldness and
    unconditional devotion.

          But at the turn of the millennium, diffidence and disappointment
    set in. We began to realize that the second coming might not occur in
    our lifetime. Limitless affluence would take longer, and more work was
    needed to construct hyperreal happiness. On September 11 of 2001,
    diffidence turned to despair and disappointment to
    sorrow. In retrospect we could see that in the nineties we had been
    turning our private spheres into cocoons of self-indulgence, and we
    had enveloped the public realm in a virtual fog of cell phones,
    pagers, beepers, personal CD players, digital cameras, and video
    games.

          September 11th was in a terrifying way what Virginia Woolf has
    called a moment of being, a situation that made us feel the shock of
    reality. [1] The attacks themselves were conducted in a primitively
    real way, and the terrors in turn shredded our cocoons and dispelled
    the virtual fog. Suddenly we became aware again of one another and of
    the things about us. People emerged from their seclusion and anonymity
    through their heroism, their selfless exertions, through acts of
    kindness and sometimes simply through the acknowledgment of tears and
    consolations. Suddenly the high-rises that had seemed so forbidding
    and aloof looked frail and precious. We felt affection and sorrow for
    the twin towers of the World Trade Center, which we had previously
    regarded as the height of witless arrogance.

          Calamity has a way of restoring us to reality and kindness. When
    the big snow paralyzed Chicago in 1967, people learned
    again how to walk, how to be neighbors, and how to attend to the
    simple tasks of getting milk and bread from the store on a sled and of
    clearing a space from the garage to the street. When an ice storm
    paralyzed the northern part of upstate New York early in 1996 and
    shut down electricity for weeks, people shared their fuel and their
    kitchens and volunteered to minister to the sick and the elderly in
    makeshift shelters. [2] When wildfires ravaged Montana in
    the summer of 2000, people sheltered and consoled one
    another, and the much detested "Feds" turned into heroic guardians.

          Yet, within weeks after the terror attacks, normalcy returned.
    People went back to their enclaves of entertainment. Irony and
    cynicism surfaced again. [3] And while the prospects for the economy
    are generally clouded, the video game business is confident of growth
    and profits. [4] The President urged us not exactly to indulge
    ourselves and not directly to consume, but certainly to go out and
    buy stuff; doing so usually comes to consumption and ends in
    indulgence.

          So should we hope for another disaster to wake us from our
    consumptive slumber and our sleepwalking pursuit of glamorous
    affluence? The blessings of calamity carry a forbidding price. Surely
    we must do everything to prevent catastrophe and misery and take up
    the burdens of good fortune that come with the progress of technology.
    Chief among them is the task of comprehending more consciously and
    deeply the benefits and liabilities of technology. For such purposes
    "technology" is not just the name for certain machineries and
    procedures that can be used for good or ill, although "technology" can
    certainly be so understood. But if we want to take the measure of the
    human condition in our time, "technology" is a suggestive and useful
    label for what is distinctive of contemporary culture as a whole.

          The characteristic forms of recent technology are information
    technology and biotechnology, and one way of locating both the crucial
    peril and the best hope of the moment is to consider the threats to
    mind, body, and world that appear to issue from these two
    technologies. The very identity of the human person and the very
    substance of reality are presumably called into question by
    developments in artificial intelligence, in genetics, and in virtual
    reality. Reactions to these prospects are as divided as they are to
    carnival rides--they produce exhilaration in some people and vertigo
    in others. [5]

          Each of these three areas of development--artificial
    intelligence, genetics, and virtual reality--is enormously complex and
    technically sophisticated, and laypeople are tempted to throw up their
    hands in frustration and to surrender their destiny to the experts.
    But "I give up" is not an acceptable reply to recent technology. We
    must do our best to penetrate the thickets of technical terms and
    scientific findings. In addition, I want to suggest, there is a method
    of outlining the shape of our future through thought experiments that
    suggest moral bounds that emerge and remain no matter how perfect the
    technology becomes.

Artificial Intelligence

          Let me begin with artificial intelligence. Its threat or its
    promise rests on the claim that the most distinctive human capacity,
    intelligence, is independent of the stuff that it is realized in and
    that computers consist of stuff that allows for the construction of
    intelligence that is at least as powerful as human intelligence. A
    related claim says that a person's particular intelligence can some
    day be transferred from the person's brain to a computer so that the
    essence of that person can exist alongside or beyond the person in
    question. Thus there could be duplicates of you, immortal versions of
    you, nonhuman superiors of you, but also vastly enhanced editions of
    you--prospects that surely can provoke excitement or vertigo.

          But how exactly could we tell whether an artificially
    intelligent computer had reached a stage of perfection that would at
    least equal human intelligence? The great British logician and
    mathematician Alan Turing proposed that we call a machine intelligent
    when in conversation it would be indistinguishable from a human being.
    For the purposes of our thought experiment we assume that the machine
    would easily pass the Turing test. There is at the moment no such
    computer, and, as far as I can tell, there would have to be presently
    inconceivable breakthroughs in our understanding of the syntax and
    semantics of natural language for such artificial intelligence to be
    possible. [6] But in a thought experiment we can set these problems
    aside.

          Now the revealing question is under what circumstances and to
    what extent we would find it worthwhile to converse with such a
    computer. To answer the question we have to distinguish domains of
    discourse, and for our purposes three are enough: scientific
    discourse, factual discourse, and moral discourse. These domains shade
    over into one another but are distinct in their more central regions.

          We would certainly find it useful to query the computer about
    scientific matters, for example, the law of gravity, the number of the
    solar planets, the effect of the gravitational force on the orbits of
    the planets, the state of the search for a theory of everything, etc.
    Propositions in reply to such queries are made from nowhere since they
    are true everywhere. The same is true of brute historical facts, the
    fact, for example, that the terror attacks on the World Trade Center
    took place on September 11, 2001, that the Pentagon was attacked the
    same day, that a total of four planes had been hijacked, etc.

          Search engines are beginning to resemble artificially
    intelligent sources of scientific and factual information. They are
    both more versatile and quicker than their printed forebears. They are
    less focused and trustworthy than a human expert, but then we rarely
    have the privilege to ask such an expert in person and on the spot. In
    well-bounded and formal areas such as chess, moreover, computers
    already surpass humans.

          As soon, however, as you ask the computer for a fuller account
    of an event like the attacks of September 11th, namely, for the
    background, the context, and the consequences of these events, the
    computer would have to assume a standpoint from which to tell the
    story. And at this point, moral matters come into play. From Osama bin
    Laden's point of view, this was a jihad; from our standpoint, it was
    terrorism. But so far, truth is still a guide for the computer. It was
    in truth terrorism, and not an act of holy war. Yet there are
    different standpoints that are morally valid and compatible with one
    another. A New Yorker's story of the terrors will differ from that of
    a Montanan; a sociologist will give an account that differs from a
    political scientist's. The point is that a selection from millions of
    facts and facets must be made, and any intelligible and consistent
    account betrays a point of view.

          But we would not find this unnatural or jarring in a computer.
    Even now we attribute a loose kind of standpoint and certain
    intentions to our personal computers, and as Daniel Dennett has
    pointed out, we would find it difficult to talk about the behavior of
    computers without ascribing states of mind to them. [7] We do this
    when we say of our PC: "It's looking for a file," or "It thinks it's
    connected to the local area network," etc. It is also true that an
    intelligent computer would be unpredictable without being bizarre,
    just as the best chess computers surprise their designers with their
    inventiveness (another mental property). And finally, it is certain
    that some people would respond to an intelligent computer the way they
    answer a person. After all, Eliza, an unintelligent program mimicking
    a psychoanalyst, was so treated. [8] Joseph Weizenbaum, the creator
    of the program, concealed its limitations by having Eliza turn
    statements to which no precooked reply was available into questions
    and by having it reply to unanswerable questions with "Tell me more
    about this" and the like. And yet people began to confide in Eliza as
    though it were a she or he.
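
          Weizenbaum's trick is easy to sketch. The following Python
    fragment is a minimal illustration of an Eliza-style responder; the
    patterns and reflections below are invented stand-ins, not Eliza's
    actual script. It matches a statement against a few templates, echoes
    a reflected fragment back as a question, and otherwise falls back on
    the stock evasions described above:

```python
import re

# Illustrative pattern/response pairs -- stand-ins, not Eliza's actual script.
RULES = [
    (re.compile(r"\bi need (.*)", re.IGNORECASE), "Why do you need {0}?"),
    (re.compile(r"\bi am (.*)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bmy (.*)", re.IGNORECASE), "Tell me more about your {0}."),
]

# Swap first- and second-person words so an echoed fragment reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}

def reflect(fragment):
    return " ".join(REFLECTIONS.get(word.lower(), word)
                    for word in fragment.split())

def eliza_reply(statement):
    # Turn a matched statement into a question built from its own words.
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1).rstrip(".!")))
    # Deflect questions for which no precooked reply is available.
    if statement.rstrip().endswith("?"):
        return "Why do you ask?"
    # The stock evasion for anything unmatched.
    return "Tell me more about this."

print(eliza_reply("I need a rest"))      # -> Why do you need a rest?
print(eliza_reply("What should I do?"))  # -> Why do you ask?
```

    A handful of such rules is enough to sustain the illusion of a
    listener, which is precisely why people began to confide in it.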

          Though it is practical to act as though a computer were a
    person, there are limits to a computer's personality and to the scope
    of its discourse. It fails to meet the principles of equality and
    dignity that are crucial to moral conduct and discourse. Both
    principles are rooted in our bodily being, and it follows, trivially
    in one sense, that computers cannot be equal to those principles since
    they are not embodied in the human way.

          But in another respect, the difference in physical structure of
    humans and computers, when made vivid and concrete, reveals the
    distance that separates humans from machines. As regards equality, I
    shape my conduct in emulation, competition, or companionship with
    others who are like me. My complaint about a surly colleague
    evaporates when I hear of my friend's losing his mother. When my
    mother dies, I take consolation from my friend because he has suffered
    the same sorrow. I look toward my declining years with confidence
    because my spouse of forty years will be with me. I learn who I am and
    what I ought to do in conversation with others who have a standpoint
    like mine and experience reality the way I do--as children of
    venerable parents, as parents of vulnerable children, as helpful
    friends, as mature persons, as wise elders.

          A computer has none of these properties, relations, or
    experiences. A computer has designers rather than parents. It has
    breakdowns rather than illnesses. It becomes obsolete rather than old.
    It can be replaced and, as Kant already observed, has a price rather
    than dignity:

      Whatever has a price can be replaced by something else as its
      equivalent; on the other hand, whatever is above all price, and
      therefore admits of no equivalent, has dignity. [9]

          Each of us is a unique and inexhaustible locus of convergence
    and transmission through our ancestry, both evolutionary and
    historical, through our descendants, through the sensibility of each
    of our organs, through our pleasures and pains, through our wounds and
    our scars, through our attacks and our embraces.

          In moral matters we may turn to an intelligent computer the way
    we now turn to the Psalms or to Chicken Soup for the Soul. [10] But
    in both cases it is the writers' experiences of pain and their
    fortitude in the face of it that give us a sense of trust and solace.
    No doubt artificial intelligence will become still more expert in
    cognitive and formal tasks. What it will always lack, however, is the
    human standpoint--the moral authenticity and authority of a good woman
    or a good man.


Genetics

          Yet even if the human mind in its moral dimensions is beyond
    simulation, the realization of this distinctively human power, the
    body, seems itself to be cut loose from traditional norms and
    constraints due to the impending transformative power of genetics.
    Here too exciting or vertiginous prospects seem to open up--the
    possibility, for example, to customize one's children as to their
    height, their looks, their health, and their character.

          Our professed hopes are more modest. As Nicholas Wade has
    reported, "Dr. Richard Lifton of Yale predicted that in 20 years
    researchers would be `able to identify the genes and pathways
    predisposing to every human disease.'" [11] Another of the problems
    scientists would like to see solved is "the biological basis of
    intelligence and cognition." [12] Here we obviously approach a
    different level of shaping and improving humans. Finding and utilizing
    the genetics of diseases will make humans, such as they are, healthier
    and longer-lived. But understanding the genetic organization of
    intelligence and cognition will allow us to build better humans--more
    insightful and resourceful persons, people of greatly superior
    quality, to put it summarily.

          Or will it? Here again a thought experiment suggests limits to
    what looks like limitless power and fearful possibility. Imagine the
    oral culture of ancient Greece, say 1000 BCE, when the
    Homeric epics were presented at the manors of the chieftains. Such an
    epic was the possession of a singer of songs and would be realized as
    a great event, rising powerfully, commanding attention, and finally
    receding into memory. Imagine how strange and unsettling it would have
    been for a singer or listener to be told that the entire epic could be
    fixed on papyrus from a store of no more than 24 letters,
    that such letters would compose words, that all the words of a
    language could be assembled in a dictionary, that there would be rules
    for the formation of words and for the formation of sentences from
    words.

          A quick and bright member of such an oral culture would realize
    that writing and grammar promise to provide incredible power over an
    epic. The entire poem could be inspected at leisure and to the
    smallest detail, and surely knowledge of the vocabulary and grammar
    would allow one to make vast improvements in the quality of an epic
    and to fix them for all time. Well, we do have such power over
    language now. Do we have the power to improve the quality of a novel
    such as The Firm? [13]

          John Grisham's book is well-constructed and peopled with
    interesting and engaging characters. It tells the tale of a young man
    who finds his way and identity in an unreal world. Is it one of the
    great novels of the last century? As regards literary quality it is
    like you and me--all right, nothing special, but certainly estimable and
    thoroughly decent. Can we make it into a masterpiece in the same
    genre, something like Thomas Mann's Magic Mountain, also a story of a
    young man searching for his identity under unreal circumstances?

          The analogy between the structure and quality of a novel and the
    organization and character of a person is loose, but it may be tight
    enough to suggest a limit of genetics when it comes to improving the
    quality of a person. We can hope to find a genetic cure for an obvious
    illness such as diabetes. Similarly we can do much by way of a spell
    check program to cure a poorly spelled novel. We may be able to
    determine genetically the color of a person's skin, eyes, or hair. We
    can change a novel's spelling from American to British English; we can
    replace adverbs and phrases such as "namely," "that is," "for
    example," "and so on," with their Latin abbreviations. More ambitious
    changes would, however, cause more damage than improvement. We could
    try to make the language more nuanced by replacing "said" alternately
    with "replied," "suggested," and "observed." We could replace basic
    color terms with their variants, replacing "blue" alternately with
    "indigo" and "cobalt." As often as not this would result in nonsense.
    Similarly, certain well-intentioned genetic changes may introduce as
    much debility as improvement. [15]
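
          The mechanical substitutions just described can be made literal.
    This Python sketch (the sample passage is invented, not a quotation
    from The Firm) rotates through synonyms exactly as imagined, and the
    output shows why blind replacement misfires:

```python
import itertools
import re

def rotate_word(text, target, variants):
    """Blindly replace each occurrence of `target` with the next variant
    in turn -- the mechanical 'improvement' described above."""
    cycle = itertools.cycle(variants)
    return re.sub(rf"\b{target}\b", lambda _: next(cycle), text)

# An invented sample passage, not a quotation from any novel.
passage = ('"Look at the blue sky," she said. '
           '"The blue water matches it," he said.')

edited = rotate_word(passage, "said", ["replied", "suggested", "observed"])
edited = rotate_word(edited, "blue", ["indigo", "cobalt"])
print(edited)
# -> "Look at the indigo sky," she replied.
#    "The cobalt water matches it," he suggested.
```

    The substitutions are grammatical, yet "cobalt water" and the
    alternation of speech verbs add a nuance the scene never asked for;
    applied at scale, such rule-driven edits degrade as much as they
    improve.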

          But could not a good writer introduce into The Firm the
    leitmotifs, philosophical discussions, subtle portraits, and artful
    syntax of The Magic Mountain? And could not an accomplished geneticist
    of the future analogously reshape the genetic material of an embryo?
    There are two problems here. The first is that the rewriter is not the
    analog of a geneticist but of a tutor, a personal trainer, or a
    cosmetic surgeon. The analog to the geneticist would be a programmer.
    But we cannot even conceive of a program that could perform the subtle
    and complex changes a competent editor can accomplish. Accordingly it
    seems unlikely that we will discover a theory of genetics that would
    allow us to grasp and control the complex and subtle ways in which all
    the human genes interact, not to mention the often unpredictable or
    uncontrollable forces of the environment that cooperate with the
    genes.

          The second problem is a corollary of the first. Since we do not
    understand fully or even thoroughly how exactly Thomas Mann wrote his
    novels, any emulation of Mann's style and constructions will look like
    a parody at best and a disaster at worst. There are surely ways of
    improving The Firm and the ordinary Joe and Josephine. But the result,
    via editing in one case and education in the other, would not be a
    creature of a higher order. Rather, The Firm would be more fully what
    it could have been, and Josephine and Joe would more fully come into
    their own.

Virtual Reality

          Even if mind and body retain the core of their primal integrity,
    the reality of the world that contains human beings has come into
    question through recent technology and vis-à-vis the new and different
    reality that has emerged from technology, namely, cyberspace.
    Moreover, while once reality was the last court of appeal and truth
    the supreme justice of knowledge, reality is a construction, we are
    now told, and truth an honorific term we bestow on the stories that
    are told by the powerful or that we all have consented to.

          The debates over these issues are mostly confined to English
    departments and to the social sciences. But these airy struggles have
    concrete counterparts in the foods, shelters, and entertainments of
    everyday life. What looks like old-fashioned ice cream is an
    engineered food drawn from genetically modified cows and corn. A
    building that seems to have the classic gravity of blocks of stone is
    a steel construction that has thin slices of limestone hung on it. In
    a film, the Colosseum seems to have been restored and filled with
    seething Romans, but the construction was done electronically rather
    than through stones, mortar, and living persons.

          The gravest challenge to the traditional world comes, however,
    from cyberspace because it is not merely a modification of
    old-fashioned reality but a separate reality. Especially in the
    nineties, there were confident predictions that the digital realm of
    bits and electrons would displace the world of bricks and atoms.
    [16] If cyberspace is the new realm, virtual reality is its
    capital. It is a city still under construction, and visitors are only
    shown some roughly finished buildings. Once completed, however,
    virtual reality is supposed to provide experiences that would be
    indistinguishable from "real" ones were it not for the greater glamour
    of hyperreality and the superior powers we will be able to exercise
    in it.

          Virtual reality still has an exotic aura about it that makes us
    overlook the slice of hyperreality we have learned to take for
    granted--music on compact discs. We have pretty well come to accept
    the acoustic realization of the information on CDs as real music, or
    more precisely as hyperreal music, that is, as music that is so
    flawlessly resonant as to surpass easily any actual performance. It
    is, nonetheless, a mere slice of hyperreality since the visual part of
    the performance is unavailable, and it is, by its very nature, a poor
    example of interactivity, so to speak--we do little in a concert hall
    besides listening, coughing, and applauding. We are not authorized
    to do to a live performance what we often do with a CD that is
    playing--interrupt it, start it over again, skip a portion of it, or
    stop it.

          Still, an enlargement of the CD's sonic segment to its full
    hyperreality will disclose the crucial limits of hyperreality. A
    hyperreal concert is in fact quite conceivable now. A supercompact DVD
    and a wall-sized digital screen of fine resolution together with an
    advanced stereo system will for all practical purposes provide you,
    sitting in a comfortable chair in front of the screen, with the same
    sensory input as you would receive front and center in a venerable
    concert hall with a top orchestra performing perfectly. [17]

          Presented with a scenario like that, thoughtful people are
    stumped when challenged to tell what difference there could possibly
    be between the virtual and the actual concert, and often such people
    turn away whatever scruples come to mind as romantic sentiments or
    Luddite resentments. Is there a difference? To simplify and focus the
    issue, let us stipulate the experience, defined as the sum of sights
    and sounds, to be exactly the same in the two cases, in virtuality
    and in actuality.

          The contexts and background experiences are different, of
    course, and as in the case of artificial intelligence, this difference
    is obvious and trivial at first sight, but pivotal and illuminating
    when considered closely. The virtual concert is disposable and
    discontinuous with reality, whereas the actual performance reveals and is
    continuous with our world. To gain access to virtual reality, one has
    to cross a threshold and enter a hyperreal world. Such a crossing may
    be entering a flight simulator; donning a helmet with earphones and
    screens; putting on a body suit; or powering up an amplifier,
    inserting a DVD, and turning on the screen.

          In all cases, the threshold is clearly marked and easily
    traversed. Because it is clearly marked, we never forget, when
    immersed in virtual reality, the distinctiveness and ease of the
    threshold, and this background knowledge subtly infects our central
    experiences--it is entirely at our disposal; we can at any time
    withdraw from it, and return to it, or replace it. Virtual reality is
    disposable because it is discontinuous, unlike an actual concert that
    is anchored in commitments to a certain time and place by the respect
    we owe to actual humans who give their best in performing for us, by
    our acknowledgment of the audience and the mutual expectations that
    govern a concert hall.

          Because of its discontinuity with actuality, a virtual concert
    reveals little about what matters in the world. It will continue to
    exist whether the hall has burned down or not, the conductor has died
    or not, the orchestra has disbanded or not. A CD or DVD is, of course,
    a record of the past, but it is not even that, strictly speaking,
    since the information it contains has been carefully tweaked and
    assembled from many takes. It is certainly not the record of one
    continuous, actual performance.

          A real concert, to the contrary, tells you much about the world
    you live in. It reflects what kind of music is supported here and to
    what extent. It shows what kind of artistry one can expect at the
    level of this particular orchestra and community. And here once more
    the moral authority and aesthetic authenticity that an actual
    performance possesses and a virtual one lacks are undiminished by
    advances in information and entertainment technology.


          What is the cumulative force of these reflections on technology
    and its effects on mind, body, and world? One result is surely that
    the common alarm about technology is misplaced. But why this
    fascination with the supposedly radical and revolutionary effects of
    technology? Social theorists and commentators realize, I suppose, that
    the house of American culture is not in order. But think of your
    reaction when last you contemplated cleaning up your garage, your
    closets, or just your post-holiday kitchen. It is one thing to
    recognize but quite another to remedy disorder, and it is harder still
    to determine why and how things got that way and how they could be put
    on a better footing. There seems to be a similar disinclination among
    most social theorists to acknowledge the common intuition that there
    is something wrong with the daily, inconspicuous, ordinary American
    household and to instigate a significant and sustained conversation
    about the quality of contemporary life. Given this apparently
    distasteful and intractable situation, it is convenient to be told:
    There is no need to put this house in order. It is obsolete,
    condemned, and will soon be torn down; we have to move out anyway, and
    we may as well begin to envision a radically new and revolutionary
    kind of life.

          We are alert to damage to the infrastructure, to the security or
    healthfulness of our lives, and willing, if not eager, to undertake
    judicial or environmental repairs. Hence you find most social critics
    and reformers in the utility room of the republic, worrying about the
    circuit breakers, the water lines, and the sewage pipes. But no one
    worries about the music room and the fact that the violin is out of
    tune, the flute is tarnished, and dust has settled on the piano. And
    worse, few are exercised by the transformation of the music room into
    a TV den. To be clear on a contentious point, I am not invoking a
    romantic view of the musical culture as a lost tradition, though there
    is some truth to seeing things that way. Something like the music
    room--a place of skilled and leisurely engagement--is at any rate the
    tacit ideal and promise that is supposed to warrant our obsessive
    concern with the means to the good life. Are we to conclude then that
    there is an enduring cultural ideal and that putting our house in
    order comes to sweeping technological junk into the corners to make
    room for Haydn's piano trios? Something like this scheme is needed,
    but the content need not be borrowed from the high culture of the
    past.

          One of the remarkable features of contemporary culture is that
    the distinctive creations of our time fail to be actually and tangibly
    central to our culture. Haydn's music was one of the characteristic
    achievements of late 18th-century Europe, and so were the violins and
    pianos built at the time. All of this occupied a central position in
    the culture of the day. Information technology is likely the crucial
    human achievement at the turn of the millennium. In outward appearance
    there have been few changes in kind within the last forty years. There
    have been massive quantitative changes--more highways, more
    high-rises, more cars, and more planes. What has qualitatively changed
    has taken place under the skin of buildings, planes, and cars; and it
    has surfaced inconspicuously in the boxes and keyboards we call
    computers. However, information technology is not just the distinctive
    marker of our time, it is also astounding and admirable in itself.
    Computer chips are by far the most densely complex creations of the
    human mind. They represent, moreover, the convergence of incredible
    ingenuity and strenuous diligence in logic, mathematics, computer
    science, material science, electrical engineering, industrial
    production, and marketing.

          But none of these amazing disciplines and structures are at the
    surface and center of contemporary culture. Of course, all of us use
    information technology for communication and information, and
    everybody employs it in entertainment. So what does a word-processing
    English professor learn about Boolean algebra and logic gates? Where
    does a video-game-playing teenager run into the properties of
    semiconductors and the behavior of electrons? Answer: Nothing and
    nowhere. Information technology has imploded into the subterranean
    machinery of our devices. What we enjoy at the surface is some
    commodity, some consumer good that, resting on a sophisticated
    machinery, is preternaturally instantaneous, ubiquitous, safe, and
    easy.

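          The subterranean machinery just described can at least be
    glimpsed in miniature. The sketch below, in Python for convenience
    and as a toy illustration rather than any device's actual circuitry,
    assembles a one-bit binary adder from elementary Boolean gates--the
    sort of logic-gate construction that underlies every keystroke yet
    never surfaces for the word-processing professor:

```python
# Elementary Boolean gates, the vocabulary of Boolean algebra.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two bits; return (sum, carry)."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    """Chain two half adders to add three bits."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

print(half_adder(1, 1))     # -> (0, 1): binary 10
print(full_adder(1, 1, 1))  # -> (1, 1): binary 11
```

    Chained along a row of bits, such adders become arithmetic; layered
    billions deep, they become the chip--none of it visible at the
    surface of the commodity.
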
          The development of personal computers over the past quarter
    century is a paradigm of the culture of technology--the divergence
    between the surfaces and the substructure, between the commodity and
    the machinery of the devices that inform the course of daily life. The
    increase in the speed, capacity, and sophistication of computer
    technology in the last twenty-five years is mind-boggling and defies
    all attempts at making it palpable through analogies and
    illustrations. Those of us who in the mid-seventies used computers to
    write articles, search for and retrieve information, or
    communicate with colleagues will realize immediately where that
    tremendous increase in computer power went--not into teaching us more
    about the nature of information and the structure of technology, but
    into concealing all this more tightly and, most important, into
    making the use and scope of computers easier, quicker, more
    stable. Information technology has furnished powerful tools for the
    sciences, and these tools have been engaged in the discovery of
    phenomena and relationships that would have remained hidden without
    those tools. But for most of us the progress of technology has been a
    transition from the engagement with things in their contexts to the
    consumption of commodities that are available anywhere and anytime.

          At the center of contemporary culture is consumption. This is a
    truism we are deeply conflicted about. We hang on to consumption
    because it still contains a measure of promise and plausibility. Yet
    we cannot bring ourselves to celebrate it anymore because we sense the
    vacuity at its center. We still are drawn to consumption because it
    promises uniquely pure enjoyment, pleasure unmixed with labor and
    unconstrained by challenges. But being so easy and undemanding,
    consumption has nothing ennobling or elevating about it.

          Looking back in light of this pattern at recent developments in
    artificial intelligence, genetics, and virtual reality, we can see
    that they fail to be truly revolutionary and only push along a
    tendency that has been emerging for over two hundred years. Future
    advances in artificial intelligence are unlikely to equal the
    procurement of knowledge that the Internet has already
    accomplished. Virtual reality
    will transform our sense of the actual world less than the telephone
    and television have done. And genetics is unlikely to produce the
    bursts of health and longevity that public health measures,
    vaccinations, and antibiotics have produced. But surely all three
    endeavors will make the realm of consumption and commodities still
    more instantly and ubiquitously available and more safely and easily
    accessible.

          To see the characteristic pattern of technology, that is, the
    pairing and perfection of easy commodities with intricate machineries,
    is to recognize why the characteristic achievements of our time have
    left the centers of our lives barren. Most of the enormous ingenuity
    and application that the best and the brightest of today are capable
    of flows into the creation or perfection of concealed machineries,
    never to be seen again. Most of the most difficult endeavors today
    serve consumption, and thus incisiveness begets superficiality,
    exertion begets passivity, and expertise begets ignorance. The
    disparity between the producers and recipients of culture was not
    always so stark. Writers, composers, and builders used to create works
    that invited deep and knowledgeable engagement. But Shakespeare's
    plays, Mozart's symphonies, and Jefferson's buildings attracted even
    the untutored ear or eye, and persistent attention often made amateurs
    into connoisseurs.

          These observations seem to leave us with the melancholy
    conclusion that when it comes to leisure we have to choose between
    contemporary distractions and obsolete engagements. Superficially it
    does seem that the activities and celebrations we take pleasure and
    pride in are old-fashioned and inherited from pretechnological
    activities and practices--reading books, running races, playing music,
    etc. But the fact is that traditional things assume new significance
    against the backdrop of the technological culture. That is true of
    mind, body, and world when seen within the horizons of artificial
    intelligence, genetics, and virtual reality.

          Vis-à-vis artificial intelligence the dignity of the mind's
    embodiment comes into focus. The human mind does not happen to be
    housed in wetware from which it could be extracted and transferred to
    the crystalline and metallic stuff of a computer. Rather the human
    mind is the uniquely unified sensibility, the precious vulnerability,
    and the generational connectedness of the body (though it is not
    merely that). The body in turn, when examined in light of genetics,
    emerges as the inexhaustible richness of evolution and the
    unsurpassable harmony of trillions of distinguishable parts. The
    world, finally, when contrasted with virtual reality, comes to the
    fore in its commanding presence and the unsearchable depth of its

          When the culture of technology prospers, that is, when research
    is revolutionary, industry productive, commerce flourishing, and
    consumers confident, we feel blessed with good fortune, as well we
    might. But blessings come with burdens. The clearest is the
    requirement that we share our prosperity with the poor, the hungry,
    and the sick here and around the globe. The hardest is to see the
    emptiness at the center of consumption and to search for those focal
    things and practices that deserve and reward our whole-hearted
    engagement.

    ^1 Virginia Woolf, "A Sketch of the Past," Moments of Being, ed.
    Jeanne Schulkind (New York: Harcourt, 1976) 70-3.
    ^2 Stephen Doheny-Farina, The Grid and the Village (New Haven: Yale
    University Press, 2001).
    ^3 Rick Lyman, "In Little Time, Pop Culture Is Almost Back to
    Normal," New York Times on the Web, 4 October 2001
    <www.nytimes.com/2001/10/04/arts/04POP.html>; Michiko Kakutani, "The
    Age of Irony Isn't Over After All," The New York Times, section 4
    (9 October 2001): 1.
    ^4 Chris Gaither, "Video Game Field Becomes Crowded and Highly
    Profitable," New York Times on the Web, 19 December 2001
    <www.nytimes.com/2001/12/17/technology/17GAME.html>.
    ^5 Bill Joy and Francis Fukuyama are alarmed by the potentially
    catastrophic abuse of biotechnology. Joy is also worried about
    information technology and nanotechnology. My sense is that utopians
    will be foiled and Cassandras disproven by the enormous, if
    intelligible, complexity of the brain. See Joy, "Why the Future
    Doesn't Need Us," Wired (April 2000), 3 April 2002
    <www.wired.com/wired/archive/8.04/joy_pr.html>; Fukuyama,
    "Biotechnology and the Threat of a Posthuman Future," The Chronicle
    of Higher Education (22 March 2002): B7-10.
    ^6 If we are to believe MIT's Technology Review, artificial
    intelligence researchers have turned their back on the project of
    simulating or equaling human intelligence. See Michael Hiltzik,
    "A.I. Reboots," Technology Review (March 2002): 46-55.
    ^7 Daniel Dennett, "Intentional Systems," Brainstorms (Montgomery:
    Bradford, 1978) 3-22.
    ^8 Joseph Weizenbaum, Computer Power and Human Reason (San Francisco:
    Freeman, 1976) 188-91.
    ^9 Immanuel Kant, Foundations of the Metaphysics of Morals, trans.
    Lewis White Beck (Indianapolis: Bobbs-Merrill, 1959) 53 (434 in the
    Prussian Academy edition).
    ^10 Jack Canfield and Mark Victor Hansen, eds., Chicken Soup for the
    Soul (Deerfield Beach: Health Communications, 1993).
    ^11 Nicholas Wade, "With Genome, a Radical Shift for Biology," The
    New York Times (25 December 2001): F7.
    ^12 Wade F7.
    ^13 John Grisham, The Firm (New York: Doubleday, 1991).
    ^14 Thomas Mann, Der Zauberberg (1924; Berlin: Fischer, 1954).
    ^15 Howard Gardner, Mihaly Csikszentmihalyi, and William Damon, Good
    Work (New York: Basic, 2001) 41-2, 117-21. Colin Tudge, "The Future
    of Humanity," New Statesman on the web, 8 April 2002
    <www.newstatesman.co.uk/site.php3?newTemplate=NSArticle_NS&newDisplayURN=2>.
    ^16 William Mitchell, City of Bits (Cambridge, MA: MIT Press, 1995);
    Nicholas Negroponte, Being Digital (New York: Vintage, 1996).
    ^17 Cf. Katie Hafner, "Drawn to the Hearth's Electronic Glow," New
    York Times on the Web, 24 January 2002
    <www.nytimes.com/2002/01/24/technology/circuits/24SCRE!.html?homepageinsidebox...>.

More information about the paleopsych mailing list