[Paleopsych] Edge Annual Question 2000: What Is Today's Most Underreported Story?

Premise Checker checker at panix.com
Thu Jan 12 18:38:26 UTC 2006

Edge Annual Question 2000: What Is Today's Most Underreported Story?
http://www.edge.org/3rd_culture/story/contributions.html (Links omitted)

[These are worth reviewing, six years later. Lots of transhumanist themes 
here. And lots of continued underreporting. Most of these are long-term 
trends, not stories.

[The biggest underreported story of 2005 was the collapse of the Standard 
Social Science ("blank-slate") Model. Over the last six years, the most 
underreported story was the demographic decline of Europe, but that is now 
rapidly changing, perhaps due to the collapse of the SSSM.

[The second most underreported is the continuing shift from equality to 
pluralism as the major preoccupation of the political "left." Is the 
"right" unifying around universalism? Look for defections from the right 
on the part of free-marketeers, leaving the theocrats and empire builders 
together on the "right."]


"Don't assume for a second that Ted Koppel, Charlie Rose and the
editorial high command at the New York Times have a handle on all the
pressing issues of the day....when Brockman asked 100 of the world's
top thinkers to come up with pressing matters overlooked by the media,
they generated a lengthy list of profound, esoteric and outright
entertaining responses."

-- "Web Site for Intellectuals Inspires Serious Thinking" by Elsa
Arnett, San Jose Mercury News

  The World Question Center:


102 contributions to date (71,200 words):
William Calvin, Mihaly Csikszentmihaly, Pattie Maes, George Dyson,
Douglas Rushkoff, Howard Gardner, Roger Schank, Lee Smolin, Judith
Rich Harris, Stewart Brand, John McWhorter, Paul Davies, Rodney
Brooks, Sally M. Gall, John Gilmore, Eric J. Hall, Stephen R. Kellert,
Thomas Petzinger, Jr, Sylvia Paull, James J. O'Donnell, Philip W.
Anderson, Stephen Grossberg, Brian Goodwin, Arnold Trehub, Ivan Amato,
Howard Rheingold, Clifford A. Pickover, Hans Weise, John Horgan,
Philip Elmer-DeWitt, Lance Knobel, Jeff Jacobs, Piet Hut, Freeman
Dyson, Kevin Kelly, Marc D. Hauser, Daniel Goleman, Philip Brockman,
Terrence J. Sejnowski, Bart Kosko, Dean Ornish, Keith Devlin, Andy
Clark, Anne Fausto-Sterling, Eberhard Zangger, Peter Cochrane, Hans
Ulrich Obrist, Ellis Rubinstein, Stuart Hameroff, David Lykken, Mehmet
C. Oz, M.D., Eduard Punset, Stephen H. Schneider, David G. Myers, Todd
Siler, Joseph Ledoux, Verena Huber-Dyson, Julian Barbour, Henry
Warwick, James Bailey, Robert R. Provine, Steven Quartz, Jaron Lanier,
Robert Hormats, Daniel Pink, Timothy Taylor, Carlo Rovelli, Peter
Schwartz, Leon M. Lederman, Phil Leggiere, Denise Caruso, Tor
Norretranders, Delta Willis, Charles Arthur, David M. Buss, Denis
Dutton, Tom de Zengotita, Rupert Sheldrake, Marney Morris, Raphael
Kasper, Jason McCabe Calacanis, Steven Pinker, Philip Campbell, Ernst
Poppel, David Braunschvig, Geoffrey Miller, Nancy Etcoff, Kenneth W.
Ford, Richard Potts, Robert Aunger, Colin Tudge, Paul W. Ewald, David
Bunnell, W. Brian Arthur, Margaret Wertheim, Thomas A. Bass, Rafael
Nunez, Margaret Wertheim, Randolph M. Nesse, M.D., Sherry Turkle,
Joseph Vardi

Joseph Vardi
How Kids Replaced the Generals

The most important untold story, in my opinion, is how kids have
replaced the generals as the major force defining the world's
innovation agenda; how the biblical prophecy "thou shalt turn your
swords into Sony PlayStation 2" is being fulfilled; how the underlying
power sending satellites to the skies, building fabs for 128-bit
machines, and pushing for broadband is no longer based on the defense
needs of the big powers, but on the imagination and passion of kids
(and adults) who want to play games, watch 1,000 channels of
television, and listen to music!

This is just amazing, and beautiful. Never in the history of mankind
have kids had such a profound influence on the creativity and
innovation agenda. It is also a very democratic process, as the
decision-making power is distributed widely.

DR. JOSEPH VARDI is the Principal of International Technologies
Ventures, a private venture capital enterprise, investing principally
for its own account, which has initiated, negotiated, structured and
arranged financing for the acquisition of operating companies; and
created and funded several high-tech companies in the fields of
Internet, software, telecommunications, electro-optics, energy,
environment and other areas.

Dr. Vardi is the founding investor and former chairman of Mirabilis
Ltd, creator of the extremely popular Internet communication program
ICQ, which took the web by storm and became one of the most successful
Internet products of all time, with currently over 65 million users.
The company was acquired by AOL.

LINK: Joseph Vardi's bio page on Edge

Sherry Turkle

I. A new kind of object: From Rorschach to Relationship

I have studied the effects of computational objects on human
developmental psychology for over twenty years, documenting the ways
that computation and its metaphors have influenced our thinking about
such matters as how the mind works, what it means to be intelligent,
and what is special about being human. Now, I believe that a new kind
of computational object -- the relational artifact -- is provoking
striking new changes in the narrative of human development, especially
in the way people think about life, and about what kind of
relationships it is appropriate to have with a machine. Relational
artifacts are computational objects designed to recognize and respond
to the affective states of human beings -- and indeed, to present
themselves as having "affective" states of their own. They include
children's playthings (such as Furbies and Tamagotchis), digital dolls
that double as health monitoring systems for the homebound elderly
(Matsushita's forthcoming Tama), sentient robots whose knowledge and
personalities change through their interactions with humans, and
software that senses its users' emotional states and responds with
"emotional states" of its own.

Over the past twenty years, I have often used the metaphor of
"computer as Rorschach" to describe the relationship between people
and their machines. I found computers used as a projective screen for
other concerns, a mirror of mind and self. But today's relational
artifacts make the Rorschach metaphor far less useful than before.
These artifacts do not so much invite projection as they demand
engagement. The computational object is no longer affectively
"neutral." People are learning to interact with computers through
conversation and gesture, and they are learning that to relate
successfully to a computer you do not have to know how it works;
rather, you take it "at interface value" -- that is, you assess its
emotional "state" much as you would if you were relating to another
person.
Through their experiences with virtual pets and digital dolls
(Tamagotchi, Furby, Amazing Ally), a generation of children are
learning that some objects require (and promise) emotional nurturance.
Adults, too, are encountering technology that attempts to meet their
desire for personalized advice, care and companionship (help wizards,
intelligent agents, AIBO, Matsushita's forthcoming Tama).

These are only the earliest, crude examples of the relational
technologies that will become part of our everyday lives in the next
century. There is every indication that the future of computational
technology will include ubiquitous relational artifacts that have
feelings, life cycles, moods, and a sense of humor; that reminisce;
that say they love us and expect us to love them back. What will it
mean to a person when their primary daily companion is a robotic dog?
Or their health care "worker" is a robot cat? Or their software
program attends to their emotional states and, in turn, has its own?
We need to know how these new artifacts affect people's way
of thinking about themselves, human identity, and what makes people
special. These artifacts also raise significant new questions about
how children approach the question of "What is alive?" In the proposed
research, the question is not what the computer will be like in the
future, but what we will be like -- what kind of people are we
becoming?

Relational artifacts are changing the narrative of human development,
including how we understand such "human" qualities as emotion, love,
and care. The dynamic between a person and an emotionally interactive,
evolving, caring machine is not the same as the relationship one might
have with another person, a pet, or a cherished inanimate object.

We have spent a large amount of social resources trying to build these
artifacts; now it is time to study what is happening to all of us as
we go forth into a world "peopled" with a kind of object we have never
experienced before. We need to more deeply understand the nature and
implications of this new sort of relationship -- and its potential to
fundamentally change our understanding of what it means to be human.

We need to be asking several kinds of new questions:

   o How are we to conceptualize the nature of our attachments to
   interactive robots, affective computers, and digital pets?

   o How does interacting with relational artifacts affect people's
   way of thinking about themselves and others, their sense of human
   identity and relationships? How do the models of development and
   values embedded in the design of relational artifacts both reflect
   and influence our ways of thinking about people?

   o What roles -- both productive and problematic -- can relational
   artifacts play in fulfilling a basic human need for relationship?
   Their first generation is being predominantly marketed to children
   and the elderly. What does this reflect about our cultural values
   about these groups? How will these objects influence their
   understanding of who they are as individuals and in the world? Are
   we reinforcing their marginality? Are we tacitly acknowledging that
   we do not have enough "human" time to spend with them?


In the 1960s through the 1980s, researchers in artificial intelligence
took part in what we might call the classical "great AI debates" where
the central question was whether machines could be "really"
intelligent. This classical debate was essentialist; the new
relational objects tend to enable researchers and their public to
sidestep such arguments about what is inherent in the computer.
Instead, the new objects depend on what people attribute to them; they
shift the focus to what the objects evoke in us. When we are asked to
care for an object (the robot Kismet, the plaything Furby), when the
cared-for object thrives and offers us its attention and concern,
people are moved to experience that object as intelligent. Beyond
this, they feel a connection to it. So the question here is not to
enter a debate about whether relational objects "really" have
emotions, but to reflect on a series of issues having to do with what
relational artifacts evoke in the user.

In my preliminary research on children and Furbies, I have found that
children describe these new toys as "sort of alive" because of the
quality of their emotional attachments to the Furbies and because of
their fantasies about the idea that the Furby might be emotionally
attached to them. So, for example, when I ask the question, "Do you
think the Furby is alive?" children answer not in terms of what the
Furby can do, but how they feel about the Furby and how the Furby
might feel about them.

Ron (6): Well, the Furby is alive for a Furby. And you know, something
  this smart should have arms. It might want to pick up something or
  to hug me.

Katherine (5): Is it alive? Well, I love it. It's more alive than a
  Tamagotchi because it sleeps with me. It likes to sleep with me.

Here, the computational object functions not only as an evocative
model of mind, but as a kindred other. With these new objects,
children (and adults) not only reflect on how their own mental and
physical processes are analogous to the machine's, but perceive and
relate to the machine as an autonomous and "almost alive" self.

My work with children and computational objects has evolved into a
decades-long narrative about the way computation has affected the way
we make sense of the world. In many ways, the behaviors and comments
of children have foreshadowed the reactions of adults. In the first
generation of computer culture I studied, the children of the late
1970s and early 1980s tended to resolve metaphysical conflicts about
machine "aliveness" by developing a concept of "the psychological
machine" -- concluding that psychology and a kind of consciousness
were possible in objects they knew were not alive. This way of coping with
the conundrums posed by computational objects was pioneered by
children, and later adopted by adults. Later cohorts of children,
responding to computational objects that were more complex and
problematic in new ways, again reliably foreshadowed the conclusions
the culture at large would soon reach. First, they explained that
although machines might be psychological in the cognitive sense (they
might be intelligent, they might have intentionality), they were not
psychological in the emotional sense, because they did not know pain,
or love, they were not mortal, and they did not have souls. Soon
after, the children I interviewed began consistently citing biology
and embodiment as the crucial criteria that separated people from
machines; they insisted that qualities like breathing, having blood,
being born, and, as one put it, "having real skin," were the true
signs of life. Now, I have begun to see a new pattern - children
describe relational artifacts not as "alive" or "not alive", but as
"sort-of-alive." Categories such as "aliveness" and "emotion" seem
poised to split in the same way that the categories of "psychological"
and "intelligent" did twenty years ago.

Children's reactions to the presence of "smart machines" have fallen
into discernable patterns over the past twenty years. Adults'
reactions, too, have been changing over time, often closely following
those of the children. To a certain extent, we can look to children to
see what we are starting to think ourselves. However, in the case of
relational artifacts, there is more to the choice of children as
subjects than a simple desire to stay ahead of the curve in
anticipating changes in computer culture. By accepting a new category
of relationship, with entities that they recognize as "sort-of-alive",
or "alive in a different, but legitimate way," today's children will
redefine the scope and shape of the playing field for social relations
in the future. Because they are the first generation to grow up with
this new paradigm, it is essential that we observe and document their

SHERRY TURKLE is a professor of the sociology of science at MIT. She
is the author of The Second Self: Computers and the Human Spirit;
Psychoanalytic Politics: Jacques Lacan and Freud's French Revolution;
and Life on the Screen: Identity in the Age of the Internet.

LINKS: Sherry Turkle's Home Page; See "The Cyberanalyst", Chapter 31
in Digerati

Randolph M. Nesse, M.D.
Is the Market on Prozac?

The press has been preoccupied with possible explanations for the
current extraordinary boom. Many articles say, as they always do while
a bubble grows, that this market is "different." Some attribute the
difference to new information technology. Others credit changes in
foreign trade, or the baby boomer's lack of experience with a real
economic depression. But you never see a serious story about the
possibility that this market is different because investors' brains
are different. There is good reason to suspect that they are.

Prescriptions for psychoactive drugs have increased from 131 million
in 1988 to 233 million in 1998, with nearly 10 million prescriptions
filled last year for Prozac alone. The market for antidepressants in
the USA is now $6.3 billion per year. Additional huge numbers of
people use herbs to influence their moods. I cannot find solid data on
how many people in the USA take antidepressants, but a calculation
based on sales suggests a rough estimate of 20 million.
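The sales-based estimate can be sketched as a back-of-the-envelope
calculation. The $6.3 billion market figure comes from the text; the
average annual drug cost per patient is a hypothetical assumption
introduced only for illustration.

```python
# Rough sales-based estimate of US antidepressant users, in the spirit
# of the calculation mentioned in the text.
annual_market_usd = 6.3e9  # US antidepressant market (figure from the text)

# Hypothetical assumption, not a figure from the text:
assumed_annual_cost_per_patient = 300.0  # average $/patient/year

estimated_users = annual_market_usd / assumed_annual_cost_per_patient
print(f"Estimated users: about {estimated_users / 1e6:.0f} million")
```

Under this assumed cost the estimate lands near the 20 million figure
cited; a different assumed cost would shift it proportionally.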

What percent of brokers, dealers, and investors are taking
antidepressant drugs? Wealthy, stressed urbanites are especially
likely to use them. I would not be surprised to learn that one in four
large investors has used some kind of mood-altering drug. What effects
do these drugs have on investment behavior? We don't know. A 1998
study by Brian Knutson and colleagues found that the serotonin
specific antidepressant paroxetine (Paxil) did not cause euphoria in
normal people, but did block negative affects like fear and sadness.
From seeing many patients who take such agents, I know that some
experience only improved mood, often a miraculous and even life-saving
change. Others, however, report that they become far less cautious
than they were before, worrying too little about real dangers. This is
exactly the mind-set of many current investors.

Human nature has always given rise to booms and bubbles, followed by
crashes and depressions. But if investor caution is being inhibited by
psychotropic drugs, bubbles could grow larger than usual before they
pop, with potentially catastrophic economic and political
consequences. If chemicals are inhibiting normal caution in any
substantial fraction of investors, we need to know about it. A more
positive interpretation is also easy to envision. If 20 million
workers are more engaged and effective, to say nothing of showing up
for work more regularly, that is a dramatic tonic for the economy.
There is every reason to think that many workers and their employers
are gaining such benefits. Whether the overall mental health of the
populace is improving remains an open question, however. Overall rates
of depression seem stable or increasing in most technological
countries, and the suicide rate is stubbornly unchanged despite all
the new efforts to recognize and treat depression.

The social effects of psychotropic medications are the unreported
story of our time. These effects may be small, or they may be large,
with the potential for social catastrophe or positive transformation.
I make no claim to know which position is correct, but I do know that
the question is important, unstudied, and in need of careful research.
What government agency is responsible for ensuring that such
investigations get carried out? The National Institute of Mental
Health? The Securities and Exchange Commission? Thoughtful
investigative reporting can give us preliminary answers that should
help to focus attention on the social effects of psychotropic
medications.
Randolph M. Nesse, M.D., is Professor of Psychiatry and Director of
the ISR Evolution and Human Adaptation Program at The University of
Michigan, and coauthor (with George C. Williams) of Why We Get Sick:
The New Science of Darwinian Medicine. 
LINK: Randolph Nesse's Home Page

Margaret Wertheim
Response to Paul Davies

I appreciate Paul Davies' response to my question "What is science,
and do indigenous knowledge systems also contain a genuine scientific
understanding of the world?" My point in raising this question is not
to suggest that Western science is not universal - clearly the same
law of gravity operates in the deserts of central Australia as
operates in the labs of Caltech. In that sense Western science is
indeed something that every culture can share and benefit from, if
they so choose. At issue here is really the reverse question: Might
there also be discoveries about the way the world works that have been
made by other cultures, that we in the West have not yet come to -
knowledge that we in turn might benefit from?

One example here is the Aboriginal tradition of fire burning. It is
now known that Aboriginal people traditionally managed the land and
its native flora and fauna through complex patterns of burning. Given
the huge risk of out-of-control bushfires in Australia, there is now
interest among some ecologists and park managers in understanding this
traditional knowledge. Another example is acupuncture. Some years ago
I had a serious case of hepatitis, for which Western medicine could do
nothing whatever. Eventually, after months of illness, I started to
see an acupuncturist, because Chinese medicine claims to have ways of
treating liver disease. Eventually I recovered. It is possible, of
course, that I might have recovered without the acupuncture, but many,
many people (including billions of Chinese) have had therapeutic
experiences with acupuncture. I do not claim to know how acupuncture
works, but it seems fair to at least keep an open mind that there
really is some deep understanding here of bodily function -- some
knowledge that we might truly benefit from.

The hard part of the question is: does such knowledge constitute a
genuine "science"? Paul suggests not, but I think this option should
not be ruled out. In practical terms, acupuncturists operate much the
same way that Western doctors do: you go for a diagnosis, they check
for various symptoms, then they prescribe certain treatments. All this
involves rational analysis based on a complex underlying theory of how
the body works. That theoretical foundation might well sound odd to
Western minds (it obviously does to Paul, as it does to me), but if
billions of people get well it seems hard to dismiss it completely. We
should not forget that our own medical science today incorporates
theoretical ideas (jumping genes, for example) that were scoffed at by
most scientists just a few decades ago. There are no doubt many more
"true" ideas about the human body that we have not yet come to --
things that might seem crazy today. A number of double-blind trials
have shown that acupuncture can be very effective -- so even by
Western standards it seems to pass the test. One could still argue
that it's not a "true science" but just a complex set of heuristics
that happens to work in lots of cases; but is this any less true of
much of our own medical science?

As "shining emblems of true science" Paul suggests "radio waves,
nuclear power, the computer and genetic engineering." The first three
examples all come out of physics, which, because of its mathematical
foundation, is somewhat different from most of the other sciences. As
philosophers of science have been saying for some time, it is
problematic to judge all sciences according to the methodologies of
physics. If that is the criterion for a "true science," then much of
modern biology would not count either. Paul's final example, genetic
engineering, is from the biological area, but it is the most
"atomized" part of biology. Historically, the whole area of gene
science (from Max Delbruck on) has been heavily influenced by a
physics mentality, and contemporary genetic engineering is indeed a
testimony to what can be achieved by applying a physics paradigm to
biology. But again, if this is our only criterion for "true science,"
then what is the status of other biological sciences such as ecology,
zoology, and indeed Darwin's theory of evolution? None of these would
seem to me to pass Paul's criteria.

Thus we come back to the question: What really is science? This is a
question of immense debate among philosophers of science, and among
many scientists. I don't claim to have a simple answer -- but I would
like to argue for a fairly expansive definition. Although I trained as
a physicist myself, and physics remains my personal favorite science,
I do not think it can or should be our only model for a "true
science."

By suggesting that indigenous knowledge systems contain genuine
scientific understandings of the world, I do not mean to imply that
Western science becomes less universal, only that there may well be
other truths that our science has yet to discover. The point is not
to diminish our own science, or our understanding of what science is,
but to enrich both.

MARGARET WERTHEIM is the author of Pythagoras' Trousers, a cultural
history of physics, and The Pearly Gates of Cyberspace: A History of
Space from Dante to the Internet. She writes regularly about science
for Salon, L.A. Weekly, The Sciences, Guardian, TLS and New Scientist.
She is the senior science reviewer for the Australian's Review of
Books.
Paul Davies 
Response to Margaret Wertheim

Margaret Wertheim asks what is meant by "science." I have an answer.
It must involve more than merely cataloguing facts, and discovering
successful procedures by trial and error. Crucially, true science
involves uncovering the principles that underpin and link natural
phenomena. Whilst I wholeheartedly agree with Margaret that we should
respect the world view of indigenous non-European peoples, I do not
believe the examples she cites -- Mayan astronomy, Chinese
acupuncture, etc. -- meet my definition. The Ptolemaic system of
epicycles achieved reasonable accuracy in describing the motion of
heavenly bodies, but there was no proper physical theory underlying
it. Newtonian mechanics, by contrast, not only described planetary
motions more simply, it connected the movement of the moon with the
fall of the apple. That is real science, because it uncovers things we
cannot know any other way. Has Mayan astronomy or Chinese acupuncture
ever led to a successful nontrivial prediction producing new knowledge
about the world? Many people have stumbled on the fact that certain
things work, but true science is knowing why things work. I am
open-minded about acupuncture, but if it does work, I would rather
put my faith in an explanation based on nerve impulses than in
mysterious energy flows that have never been demonstrated to have
physical reality.
Why did science take root in Europe? At the time of Galileo and
Newton, China was far more advanced technologically. However, Chinese
technology (like that of the Australian Aborigines) was achieved by
trial and error refined over many generations. The boomerang was not
invented by first understanding the principles of hydrodynamics and
then designing a tool. The compass (discovered by the Chinese) did not
involve formulating the principles of electromagnetism. These latter
developments emerged from the (true, by my definition) scientific
culture of Europe. Of course, historically, some science also sprang
from accidental discoveries only later understood. But the shining
emblems of true science -- such as radio waves, nuclear power, the
computer, genetic engineering - all emerged from the application of a
deep theoretical understanding that was in place before -- sometimes
long before -- the sought-after technology.

The reasons for Europe being the birthplace of true science are
complex, but they certainly have a lot to do with Greek philosophy,
with its notion that humans could come to understand how the world
works through rational reasoning, and the three monotheistic
religions -- Judaism, Christianity and Islam -- with their notion of a
real, lawlike, created order in nature, imposed by a Grand Architect.
Although science began in Europe, it is universal and now available to
all cultures. We can continue to cherish the belief systems of other
cultures, whilst recognizing that scientific knowledge is something
special that transcends cultures.

Paul Davies is an internationally acclaimed physicist, writer and
broadcaster, now based in South Australia. Professor Davies is the
author of some twenty books, including Other Worlds, God and the New
Physics, The Edge of Infinity, The Mind of God, The Cosmic Blueprint,
Are We Alone? and About Time.

LINK: "The Synthetic Path" -- Ch. 18 in The Third Culture

Rafael Nunez
The Death of Nations

For centuries, societies have organized themselves in terms of
kingdoms, countries, and states. Toward the second half of the 20th
century, these geographical, cultural, and political "units" acquired
a more precise meaning through the establishment of modern "nations".
The process was consolidated, among other things, through the creation
of the so-called United Nations and the independence of most colonial
territories in Africa during the 1960s. Today, we
naturally see the world as organized in clear-cut and well-defined
units: the world's nations (just check the colors of a political
atlas). Nations have their own citizens, well established territories,
capital cities, flags, currencies, stamps and postal systems, military
forces, embassies, national anthems, and even their own sport teams
competing in the various planet-scale events. This widespread view
has not only been taken for granted by most sectors of public opinion,
but has also served as the foundation of the highest form of
international organization -- the United Nations. The most serious
world affairs have been approached with this nation-oriented paradigm.
But the reality of our contemporary global society (which goes far
beyond pure global technology) is gradually showing that the world is
not a large collection of nations. Nations, as we know them, are no
longer the appropriate "unit of analysis" for running the world and
dealing with its problems. Here is why.

   o Environmental problems: Purely national/inter-national efforts to
   avoid the pollution of rivers, to protect the ozone layer, to
   manage (and avoid) environmental disasters, and to protect
   endangered species and biological diversity, have not given good
   results. New forms of global organizations, such as WWF and
   Greenpeace, have emerged to deal with these problems in a more
   efficient manner.

   o Natural resources: The management of the world's forests, the
   Antarctic ice, and fishing resources has shown that they don't
   belong to the national/inter-national realm. Again, new and more
   efficient forms of global organizations have emerged to address
   these problems.

   o Sovereignty: The relatively recent arrest in London of the
   ex-Chilean dictator Augusto Pinochet (facing potential extradition
   to Spain) has raised unprecedented and deep issues about the
   sovereignty of nations. The Chilean government claims that Pinochet
   should be judged in Chile, but international law seems to be
   gradually evolving toward a form of jurisdiction that is above the
   sovereignty of nations. The role of supranational organizations
   such as Amnesty International and Human Rights Watch is becoming
   extremely prominent in redefining these issues.

   o Neutrality: The complexity of contemporary world organization is
   leaving almost no room for neutrality. Contemporary Swiss society,
   for instance, is experiencing an important identity crisis, since
   its traditional neutrality is no longer tenable in the new European
   and international contexts. One of the essential aspects of Swiss
   national identity -- neutrality -- is collapsing. A simple fact
   illustrates this crisis: in 1992, during the World Expo in Seville,
   the official Swiss stand exhibited the motto
   "Switzerland does not exist".
   o Ethnic group representation: Many ethnic groups around the world
   whose territories extend over several nations, such as the Kurds
   (who live mainly in Eastern Turkey, Western Iran, and Northern
   Iraq) or the Aymaras (who live in Eastern Bolivia, Southern Peru,
   and Northern Chile), have had almost no representation in
   international organizations. Their problems haven't been heard in a
   world organized under the nation-paradigm. These groups, however,
   have been in the news over the last decade, bringing their issues
   more to the foreground and thus relegating the traditional nations
   to a less prominent role.
   o Epidemics: Serious epidemics such as AIDS and new forms of
   tuberculosis are spreading at alarming rates in some areas of the
   world. The cure, study, and control of these epidemics demand
   organizational efforts that go well beyond national/inter-national
   schemas. The emergence of many NGOs dealing with health issues is
   an attempt to provide more appropriate answers to these devastating
   situations.
   o Civil wars and ethnic cleansing: Stopping and controlling
   ethnic massacres such as those observed in the former Yugoslav
   regions, and those between Tutsis and Hutus in Africa, demands
   quick intervention and serious negotiation. A heavy
   nation-oriented apparatus is usually extremely slow and
   inefficient in dealing with these kinds of situations. It can't
   capture the subtleties of cultural dynamics.
   o Ongoing separatism and proliferation of nations: The world has
   more and more nations. Only a few dozen nations founded the
   United Nations half a century ago. Today the UN has around two
   hundred members (the International Olympic Committee and FIFA,
   the world football federation, have even more!). And it is not
   over. The former Soviet republics, Slovenia, Croatia, the Czech
   Republic, Slovakia, and so on, have already created new nations.
   Many others, such as the Basque country, Quebec, and Chechnya,
   are still looking for their independence. An ever increasing
   number of nations will eventually emerge.
   o Loss of national property and national icons: The openness and
   dynamism of international markets, as well as the globalization
   of foreign investment, have altered at unprecedented levels the
   sense of what is "national". For instance, many airlines (to take
   a very simple example) usually seen as "national" airlines today
   belong in fact to extra-national companies. Such is the case of
   Aerolineas Argentinas, LOT Polish Airlines, TAP Portugal, and LAN
   Peru, to mention only a few. National airlines, which in many
   countries have been seen as national icons, are simply not
   national anymore. Of course, the same applies to fishing waters,
   mines, forests, shopping malls, vineyards, and so on.

These are only a few examples. There are many others. Very serious
ones, such as the primacy of watersheds over national borders in
solving serious problems of water distribution. And less serious
ones, such as the potential collapse of one of Canada's national
sports (ice hockey) if its franchises continue to move to more
profitable lands in the United States. All these aspects of our
contemporary societies challenge the very notion of "nation", and
reveal the primacy of other factors which are not captured by
nation-oriented institutions. The world is now gradually adjusting
to these changes, and is coming up with new forms of organization
in which nations, as such, play a far less important role. Such is
the case of the formation of the European Community (which allows
for the free circulation of people and merchandise), the
establishment of a "European passport", and the creation of the
Euro as a common currency. After all, many national borders are,
like those straight lines one sees on the maps of Africa and North
America, extremely arbitrary. It shouldn't then be a surprise that
the world divided into nations is becoming an anachronism from the
days when the world was ruled by a few powerful kingdoms that
ignored fundamental aspects of ethnic, cultural, biological, and
environmental dynamics. We are now witnessing the death of nations
as we know them.

RAFAEL NUNEZ is Assistant Professor of Cognitive Science at the
University of Freiburg, and a Research Associate at the University of
California, Berkeley. He is co-editor (with Walter J. Freeman) of
Reclaiming Cognition: The Primacy of Action, Intention, and Emotion.

Thomas A. Bass 
Shifting Empires and Power

I'm currently thinking about Sophocles' Oedipus at Colonus and the
Book of Exodus, which is inclining me to the opinion that today's
unreported stories are similar to yesterday's: shifting empires and
power as the powerless struggle for sanctuary and their own form of
power.
THOMAS A. BASS is the author of The Predictors, Back to Vietnamerica,
Reinventing the Future, Camping with the Prince and Other Tales of
Science in Africa , and The Eudaemonic Pie. A frequent contributor to
Smithsonian, Audubon, Discover, The New York Times, and other
publications, he is Contributing Writer for Wired magazine and
Scholar-in-Residence at Hamilton College.

LINK: Thomas A. Bass Home Page

Margaret Wertheim
Indigenous Science

Over the last century we in the western world have gradually come to
take other cultures' religions, social systems, aesthetics, and
philosophies seriously. Unlike our eighteenth-century forebears we no
longer think of indigenous peoples of the non-white world as
"savages", but have come to understand that many of these cultures
are as complex and sophisticated as ours. The one area where we have
continued to proclaim our own specialness - and by extension our own
superiority - is science. "True science" - that is, a "truly
empirical" understanding of the world - is often said to be a
uniquely western pursuit, the one thing we alone have achieved.
Against this view, a small but growing body of scholars is beginning
to claim that many indigenous cultures also have a genuine scientific
understanding of the world - their claim is that science is not a
uniquely western endeavour. These "other" sciences are sometimes
referred to as "ethnosciences" - examples include (most famously)
Mayan astronomy and Chinese medicine, both of which are highly
accurate, though wildly different from their western equivalents.
Less well known are the logic-obsessed knowledge system of the Yolgnu
Aborigines of Arnhemland in northern Australia, and the complex
navigational techniques of the Pacific Islanders.
The claim that other cultures have genuine sciences (and sometimes
also complex logics) embedded in their knowledge systems raises again
the whole philosophical issue of what exactly the word "science"
means. Helen Verran, an Australian philosopher of science who is one
of the leaders of the ethnoscience movement, has made the point that
having the chance to study other sciences gives us a unique
opportunity to reflect back on our own science. Her work on the
Yolgnu provides an important window from which to see our own
scientific insights in a new light.

Sadly, some scientists seem inherently opposed to the very idea of
"other sciences". But studying these other ways of knowing may
enhance our own understanding of the world in ways we cannot yet
imagine. The example of acupuncture must surely give any skeptic at
least some pause for thought - the Chinese have performed operations
using acupuncture needles instead of anesthetic drugs. Likewise Mayan
astronomy, though based on the cycles of Venus, was as empirically
accurate as anything in the West before the advent of the telescope.

Two hundred years ago the idea that indigenous "savages" might be
genuine philosophers would have struck most Europeans as
preposterous. Today we have accepted this "preposterous" proposition,
but a similar prejudice prevails about science. Learning about, and
taking seriously, these other ways of knowing the world seems to me
one of the greatest tasks for the next century - before (as Steven
Pinker has rightly noted) this immense wealth of human understanding
disappears from our world.
MARGARET WERTHEIM is the author of Pythagoras' Trousers, a cultural
history of physics, and The Pearly Gates of Cyberspace: A History of
Space from Dante to the Internet. She writes regularly about science
for Salon, L.A. Weekly, The Sciences, Guardian, TLS and New
Scientist. She is the senior science reviewer for The Australian's
Review of Books.

W. Brian Arthur
The last word on Y2K: The Y1K Problem

Just before the year 1000, a rumor arose in a certain town in
Germany, Hamelin I believe, that the coming of the new time would
bring rats to the public buildings. Some had been spotted in the
basement of the town hall, some in the local stables. Rat preventers
were hired at great price, and indeed when the century turned no
rats were to be found.
The city fathers felt shamed by the scare and called the preventers
before them. You have spent a great deal of money on these rats --
but there were no rats, they said.

Ah, city fathers, said the preventers. That's because we prevented
them.
W. BRIAN ARTHUR is Citibank Professor at the Santa Fe Institute. From
1982 to 1996 he held the Morrison Chair in Economics and Population
Studies at Stanford University. Arthur pioneered the study of
positive feedbacks or increasing returns in the economy -- in
particular their role in magnifying small, random events in the
economy.

David Bunnell 
New York Times Sells Out!

Sources from deep inside The New York Times Company, owner of The
New York Times, Boston Globe, numerous TV stations, regional
newspapers, and various digital properties, and from The Onion, a
web-based satirical newspaper (www.theonion.com), have verified the
rumors. It's true: The Onion, Inc., headquartered in Madison,
Wisconsin, has made an offer to buy The New York Times Company for
stock.

This is a serious offer, and word is that NYT Chairman and Publisher
Arthur Sulzberger sees it as a way to instantly transform his
family's company into a major Internet content provider and thereby
pump up the company's stock, creating instant wealth for many
longtime shareholders.
According to people close to the talks, Sulzberger and other New York
Times executives were recently seen in Madison, Wisconsin, where they
reportedly attended a Friday afternoon beer bash at The Onion
headquarters. Apparently, the executives of both companies really hit
it off and have even gone on camping trips together. Executive Editor
Joseph Lelyveld of The New York Times and Onion Editor-in-Chief
Robert Siegel have formed a "mutual admiration club" and are
seriously considering swapping jobs once the merger is finalized.

"Some of the ideas these two groups discuss once they've had a few
beers are phenomenal, particularly when you get Mr. Sulzberger into
it," reported one of the Onion editors. The New York Times group is
particularly intrigued with the success that The Onion has had by
using invented names in all its stories except for public figures. By
applying an Internet journalistic standard to an old-media newspaper
like The Times, it is felt that editorial costs can be reduced by a
whopping 80%!

Cultural differences between the two companies and differing
standards of journalism aren't seen as major stumbling blocks to
getting the deal done. The biggest challenge will be to get the two
sides to agree to a valuation that gives shareholders of both
companies plenty to cheer for. This is complicated because The Onion
has a market cap that is several hundred billion dollars higher than
the New York Times Company's. The expectation, though, is that this
will be worked out to be similar to AOL's purchase of Time Warner,
with The Onion shareholders getting about 55 to 60 percent of the
merged company. Thus, New York Times shareholders will see an
instant uptick in their stock, which should compensate them more
than adequately for losing control of the company.
Onion Publisher & President Peter K. Haise will reportedly give up
his position to become Chairman of the combined company and move to
New York. Arthur Sulzberger will move to Wisconsin to run The Onion,
which will be the new flagship of what will be called the "Onion New
York Times Media Giant Company." Haise and Sulzberger have also
agreed to swap houses and families as part of the deal, which will
facilitate their need to move quickly.

The resulting "Onion New York Times Media Giant Company" will be one
of the world's largest media companies in terms of market cap, though
only half as big as AOL/Time Warner. The year 2000 is already being
seen as the year that old media surrendered to new media, and there
are some more surprises to come. The biggest merger yet could happen
this summer when Wired Digital spins out Suck.com, which will in turn
make a bid to buy The Walt Disney Corporation. Stay tuned, dear
readers: Suck!Disney could become the biggest acquisition of all
time.

DAVID BUNNELL is founder of PC Magazine, PC World, MacWorld, Personal
Computing, and New Media. He is CEO and Editor of Upside.

LINKS: Upside; David Bunnell in Upside

Further reading on Edge:

Chapter 4 - "The Seer" -- in Digerati; "PC MEMORIES, HOW I CREATED THE
PC" by David Bunnell

Paul W. Ewald

Infection Is Much Bigger Than We Thought, Bigger Than We Think, And
Perhaps Bigger Than We Can Think

I am confident that I don't know "today's most important unreported
story" because it hasn't been reported to me yet. But I'll take a stab
at one of today's most important under-reported stories: Infection is
much bigger than we thought, bigger than we think, and perhaps bigger
than we can think. With apologies to J.B.S. Haldane, let me offer a
less grandiose, but more tangible and testable (and ponderous)
version: The infectious diseases that are already here but not yet
generally recognized as infectious diseases will prove to be vastly
more important than the infectious diseases that newly arise in the
human population from some exotic source (such as the jungles of
Africa) or genetic diseases that will be newly discovered by the human
genome project. By "important" I mean both the amount of disease that
can be accounted for and the amount that can be ameliorated, but I
also mean how much of what we value as well as what we fear.

This pronouncement can be assessed incrementally, decade by decade,
over the next half-century. What are the diseases to keep an
eye on? Heart disease and stroke; Alzheimer's, Parkinson's and other
neurodegenerative diseases; impotence, polycystic ovary disease,
cancers of the breast and ovaries, the penis and prostate;
schizophrenia and the other major mental illnesses. The list goes on.

But is the scope really "bigger than we can think"? Who can say? We
can speculate that the scope of infection may extend far beyond what
many in the year 2000 would be willing to take seriously. If
schizophrenia and manic depression are caused largely by infection,
then perhaps the artistic breakthroughs in our society, the
groundbreaking work of van Gogh, for example, can also be attributed
to infection. How much of what we prize in society would not be here
were it not for our constant companions? Rather than pass judgment
now, I suggest that we return in 2010 to this offering and each of the
other contributions to see how each is faring as the fuzziness of the
present gives way to the acuity of hindsight.

PAUL W. EWALD is a professor of biology at Amherst College. He was the
first recipient of the George E. Burch Fellowship in Theoretic
Medicine and Affiliated Sciences, and he conceived a new discipline,
evolutionary medicine. He is the author of Evolution of Infectious
Disease, which is widely acknowledged as the watershed event for the
emergence of this discipline.

Robert Aunger 
The End of the Nation-State

One of the Big Stories of the last century was globalization -- the
rise of plodding great dinosaur-like institutions promoting the
interests of the Fortune 500. Of course, merger-mania continues to
capture headlines, creating ever-larger multinational firms,
centralizing information and money -- and hence power -- in the hands
of a few old White guys. This is Goliath, and Goliath at a scale
above the State. On David's side of the battle for our hearts and
souls, we have the Internet, the weapon of Everyman. The Internet is
the newfound instrument of the little people, bringing us all within
a few clicks of each other (the so-called "small world" phenomenon).
It is no accident that the first to flock to this medium were
minorities of all kinds -- poodle-lovers, UFO-watchers and other
fringe-dwellers. Here, through this network, they found a way to
broadcast their message across the world at virtually no cost,
through an avenue not controlled by Walmart or Banque Credit Suisse.

What is getting squeezed out in this picture is the institution in
the middle, the nation-state. It is easy for the media to focus on
the President as he waves to them while boarding Air Force One --
indeed, they fawn on these "photo-ops." The existence of standardized
channels, like the press advisor, for disseminating "important
messages" makes their job easy. Thus, the media haven't noticed that
the institution the President represents is increasingly irrelevant
to the course of events. Why? Let's look at the sources of State
power, and how they are being eroded.

First, money is no longer tied to any material token (see Thomas
Petzinger, Jr., this Forum). Once the link to cowrie shells or gold
bullion is severed, the exchange of value becomes a matter of trust.
And this trust is increasingly being placed in computers -- the
Internet again. Greenspan can control greenbacks, but not e-money.
Any zit-faced teenager can become an instant millionaire by flipping
a digit on a strategic computer account. This is digital
democratization, of a sort. So one of the vital sources of
centralized governmental power -- control over the money supply -- is
increasingly no longer in the hands of the State.

What about the distribution of wealth? It used to be that those close
to the political decision-making machinery could write the rules for
moneymaking and thus guarantee themselves advantages: policies
informed incentives. But the globalization of capital markets has
reversed that causal ordering: money now flows as if national
boundaries were invisible, slipping right 'round local rules and
regulations. The policy-makers are always a step behind. So the
State no longer finds it easy to ply favorites with favors.

The ultimate source of control, of course, is access to information.
What you don't know you can't act on. Governments have long recognized
how important this is. Can States nowadays control public opinion? Are
the media operated by people the State can coopt? Well, sometimes. But
the Fall of the Wall suggests control is never perfect. So you can
tell some of the people what to do some of the time, but not whole
populations what to think for very long. It just costs too much. And
(as Phil Leggiere points out elsewhere in this Forum), the Internet is
now a powerful means for protest against State interests. No wonder
States are trying hard to control this organically-grown monster.

States of course use various means besides the media to attract
allegiance. For example, they stir up patriotism by the
tried-and-true method of demonizing outsiders. However, of late, it
has become harder to direct aggression "outside," as made obvious by
the proliferation of aggressive conflicts along ethnic lines within
States (Jaron Lanier's non-Clausewitzian wars, in this Forum). The
other possibility, of course, is that some splinter group will get
hold of -- or make -- a nuclear warhead, and hold a government
ransom. So the ability to incite war -- another source of State
power -- seems to be coming from other quarters. This constitutes
additional evidence of the soon-to-be demise of States.

What people really care about, the social psychologists tell us, is
the group they identify with. You don't identify with Uncle Sam (a
clever anthropomorphizing gimmick that only works during war); you
identify with Uncle Fred and the other kin who share your name. So
it's difficult for people to identify with a country. It's too big --
just a jury-rigged bit of color on a map in many cases. How can you
care when your vote has no influence over outcomes? "Representative"
government is farcical when a population is counted in millions. Of
course, if you're rich, you can buy influence, but the ante is always
being upped as some other special interest vies for control over your
Man in Washington. Besides, those guys always logroll anyway. When
your self-concept, wealth and well-being derive from participation in
other kinds of community, the State becomes an anachronism.

The result of all this will not be the arrival of the Libertarian
heaven, a State-less society. It is just that mid-level governance
will be replaced by larger- and smaller-scale institutions. We won't
have monolithic Big Brother looking over our shoulders in the next
century. Instead, we will become a network of tightly linked
individuals, empowered by technologies for maintaining personal
relationships across space and time. We will all choose to be cyborgs
(Rodney A. Brooks), with implants that permanently jack us into the
global brain (Ivan Amato), because of the power we derive from our
environmentally augmented intelligence (Andy Clark, with apologies to
Edwin Hutchins and Merlin Donald). We will all come to live in what
Manuel Castells calls a Network Society, and begin, literally, to
"think globally and act locally."

ROBERT AUNGER works on cultural evolution at the Department of
Biological Anthropology, University of Cambridge, and is editor of
Darwinizing Culture: The Status of Memetics as a Science.

LINK: Robert Aunger Home Page

Richard Potts 
Emergence of an Integrated Human-Earth Organism

Several under-reported stories come to mind. Almost all powerful
stories concern human beings in some way or another, metaphorically
or directly. One result of globalization, let's call it cultural
unity, is a story of such power. I'll mention only one facet of this
story. Over the past 50,000 years, the vast diversification of human
culture -- the creation of quasi-distinct cultures, the plural --
stands as a peculiarity of Homo sapiens (relative to earlier humans
and other organisms). Human life has divided into diverse languages
and ways of organizing kin, technologies, economies, even mating and
demographic systems. It's a process that reflects our knack for
doing things differently from the people in the next valley.

Globalization may mean the dissolving (all too gradually) of tribal
mentality. But there's more to it. The related extinction of
languages, loss of local cultural information, and decay of cultural
barriers all point toward an eventual homogenization of behavior that
hasn't existed at such a scale (across all humans) since the
Paleolithic prior to 50,000 years ago, or even much earlier. The
result: the loss of alternative adaptive strategies and behavioral
options, which have been rather important in the history of human
adaptability.
That's pretty big.

In seeking a truly unreported story, though, it's wise to think a
little further out, to make an unexpected prediction. How can that be
done? "Unexpected prediction" seems contradictory. Well, the history
of life is full of curious experiments, and careful study lets one
fathom its rash opportunism and its rises and erasures of biotic
complexity. The history offers hints. An intriguing case is the
evolution of the complex cell, the basis of all eukaryotic life,
including multicellular organisms. The cell, with its nucleus,
mitochondria, centrioles, and other components, represents an
ecosystem of earlier organisms. The cell evidently emerged by
symbiosis of a few early organisms brought together in a single,
coordinated system. It's complex internally, but it evolved by
simplifying, by gleaning from the surrounding ecosystem. Each of us
carries around about a hundred trillion of these simplified early
ecosystems, which are coordinated at even higher levels of
organization -- tissues, organ systems, the individual.

The big unreported story that I fancy is a latter-day parallel to
this fateful development in life's history. Human alteration of
ecosystems presents the parallel -- a sweeping simplification of a
previously diverse biotic system. Homo sapiens has slashed, culled,
and gleaned. It has forged symbiotic relationships with a few other
species (domesticates) that help fuel its metabolism (economic
functions) as humans enhance the replication of those few at other
species' expense.

While these observations are somewhat familiar, the unreported part is
this: The global reach of this process threatens/promises to create a
single extended organism. The superorganism continues to alter the
planet and promises to touch virtually every place on the third rock
from the sun. Will this strange organism eventually harness the
intricate linkages of ocean, atmosphere, land, and deep Earth? Will
it seize control over the circulation of heat, moisture, energy, and
materials -- that is, the core operations of the planet? Hard to say
without a crystal ball. At its current trajectory, the process seems
destined to turn the planet into a cell, highly complex in its own
right but evolved by vast simplification of its original setting.
Certainly a different Gaia than is usually envisioned. If this story
has any validity, it's interesting that the initial loss of cultural
alternatives due to globalization roughly coincides with the emergence
of this incipient planetary organism.

What I suggest here is the onset of a Bizarre New World, not an
especially brave one. It might take more bravery to conserve Earth's
biological diversity and diverse ways of being human, salvaging
species and cultures from oblivion in a globalized world. Then
again...this may already be an old-fashioned sentiment.

Any important story, even one as complicated as this, needs a
headline:
Human-Earth Organism Evolves

Will It Survive? What Will It Become?

RICHARD POTTS is Director of The Human Origins Program, Department of
Anthropology, National Museum of Natural History, Smithsonian
Institution. The program focuses on the long history of ecosystem
responses to human pressures and vice versa. Museum researchers are
piecing together the climatic and ecological conditions that allowed
humans to evolve.

He is the author of Humanity's Descent : The Consequences of
Ecological Instability and a presenter, with Stephen Jay Gould, of a
videotape, Tales of the Human Dawn.

Kenneth W. Ford
The Swiftness of the Societal Changes That Occurred Two-thirds of the
Way Through the Century

No end of changes in our world are cited as we look back on the
twentieth century: population growth, scientific and medical advances,
communications technology, transportation, child rearing and family
structure, depletion of energy and mineral resources, and human impact
on the environment, to name a few. In general we analyze these changes
over the whole sweep of the century although some, to be sure, because
of their exponential character, have made their mark mainly toward the
end of the century.

What has gone largely unreported, it seems to me, is the suddenness
with which a set of societal changes occurred in less than a decade
between 1965 and 1975 (a step function to a mathematician, a seismic
shift to a journalist). In that period, we saw revolutionary change in
the way people dress, groom, and behave; in the entertainment that
grips them; in equity for minorities, women, and the variously
disabled; in higher education; and in the structure of organizations.

The unpopular Vietnam war can account for some of these changes, but
surely not all. The changes were too numerous and extended into too
many facets of our lives to be explained solely by antiwar fervor.
Moreover, what happened was not a blip that ended when the war ended.
The changes were permanent. With remarkable speed, as if a switch had
been thrown, we altered the way we deal with one another, the way we
see our individual relation to society, and the way we structure our
organizations to deal with people.

I lived through the period on a university campus, and saw rapid
changes in higher education, not to mention dress and behavior, that
are with us still. My own professional society, the American Physical
Society, transformed itself between the late 1960s and the early
1970s from an organization that held meetings and published papers
into an organization that, in addition, promotes equity, highlights
links between science and society, seeks to influence policy, and
cares for the welfare of its individual members. Why did so much --
with lasting impact -- happen so quickly?

KENNETH W. FORD is the retired director of the American Institute of
Physics. He recently taught high-school physics and served as science
director of the David and Lucile Packard Foundation. His most recent
book, written with John A. Wheeler, is Geons, Black Holes, and Quantum
Foam: A Life in Physics, which won the 1999 American Institute of
Physics Science Writing Prize.

Nancy Etcoff 
Good News

Four out of five stories on trends in American life that appear on
national television news describe negative, frightful trends rather
than hopeful ones. Crime stories are the top category of local news,
outnumbering segments on health, business, and government combined.

Perhaps we require the media to be our sentinel. But we also seek a
source of hope.
The popularity and prestige of science has never been higher because
science is forward looking. Science has become the bearer of hope, a
source of the sublime.

NANCY ETCOFF, a faculty member at Harvard Medical School and a
practicing psychologist and neuropsychologist in the Departments of
Psychiatry and Neurology at the Massachusetts General Hospital, has
been researching the perception of human faces for the past ten
years. Her work and ideas have been reported in The New York Times,
Newsweek, Rolling Stone, U.S. News and World Report, Discover,
Fortune, and Mademoiselle. She has been a featured guest on Dateline,
NPR, the Discovery Channel, and Day One. She is the author of
Survival of the Prettiest: The Science of Beauty.

Geoffrey Miller 
Social Policy Implications of the New Happiness Research

In the last ten years, psychology has finally started to deliver the
goods -- hard facts about what causes human happiness. The results
have been astonishing, but their social implications have not sparked
any serious public debate:

(1) Almost all humans are surprisingly happy almost all the time. 90%
of Americans report themselves to be "very happy" or "fairly happy".
Also, almost everyone thinks that they are happier than the average
person. To a first approximation, almost everyone is near the maximum
on the happiness dimension, and this has been true throughout history
as far back as we have reliable records. (This may be because our
ancestors preferred happy people as sexual partners, driving happiness
upwards in both sexes through sexual selection).

(2) Individuals still differ somewhat in their happiness, but these
differences are extremely stable across the lifespan, and are almost
entirely the result of heritable genetic differences (as shown by
David Lykken's and Auke Tellegen's studies of identical twins reared
apart).
(3) Major life events that we would expect to affect happiness over
the long term (e.g. winning the lottery, death of a spouse) only
affect it for six months or a year. Each person appears to hover
around a happiness "set-point" that is extremely resistant to change.

(4) The "usual suspects" in explaining individual differences in
happiness have almost no effect. A person's age, sex, race, income,
geographic location, nationality, and education level have only
trivial correlations with happiness, typically explaining less than 2%
of the variance. An important exception is that hungry, diseased,
oppressed people in developing nations tend to be slightly less happy
-- but once they reach a certain minimum standard of calorie intake
and physical security, further increases in material affluence do not
increase their happiness very much.

(5) For those who suffer from very low levels of subjective
well-being (e.g. major depression), the most potent anti-depressants
are pharmaceutical, not social or economic. Six months on Prozac(TM),
Wellbutrin(TM), Serzone(TM), or Effexor(TM) will usually put a
depressed person back near a normal happiness set-point (apparently
by increasing serotonin's effects in the left prefrontal cortex). The
effects of such drugs are much stronger than any increase in wealth
or status, or any other attempt to change the external conditions of
life.
The dramatic, counter-intuitive results of happiness research have
received a fair amount of media attention. The leading researchers,
such as Ed Diener, David Myers, David Lykken, Mihaly Csikszentmihalyi,
Norbert Schwarz, and Daniel Kahneman, are regularly interviewed in the
popular press. Yet the message has influenced mostly the self-help
genre of popular psychology books (which is odd, given that the whole
concept of self-help depends on ignoring the heritability and
stability of the happiness set-point). The research has not produced
the social, economic, and political revolution that one might have
expected. Journalists have not had the guts to rock our ideological
boats by asking serious questions about the broader social
implications of the research.

Popular culture is dominated by advertisements that offer the
following promise: buy our good or service, and your subjective
well-being will increase. The happiness research demonstrates that
most such promises are empty. Perhaps all advertisements for
non-essential goods should be required to carry the warning: "Caution:
scientific research demonstrates that this product will increase your
subjective well-being only in the short term, if at all, and will not
increase your happiness set-point". Of course, luxury goods may work
very well to signal our wealth and taste to potential sexual partners
and social rivals, through the principles of conspicuous consumption
that Thorstein Veblen identified. However, the happiness research
shows that increases in numbers of sexual partners and social status
do not boost overall long-term happiness. There are good evolutionary
reasons why we pursue sex and status, but those pursuits are
apparently neither causes nor consequences of our happiness level.
Some journalists may have realized that the happiness research
challenges the consumerist dream-world upon which their advertising
revenues depend -- their failure to report on the implications of the
research for consumerism is probably no accident. They are in the
business of selling readers to advertisers, not telling readers that
advertising is irrelevant to their subjective well-being.

Also, if we take the happiness research seriously, most of the
standard rationales for economic growth, technological progress, and
improved social policy simply evaporate. In economics for example,
people are modelled as agents who try to maximize their "subjective
expected utility". At the scientific level, this assumption is very
useful in understanding consumer behavior and markets. But at the
ideological level of political economy, the happiness literature shows
that "utility" cannot be equated with happiness. That is, people may
act as if they are trying to increase their happiness by buying
products, but they are not actually achieving that aim. Moreover,
increasing GNP per capita, which is a major goal of most governments
in the world, will not have any of the promised effects on subjective
well-being, once a certain minimum standard of living is in place.
None of the standard "social indicators" of economic, political, and
social progress are very good at tracking human happiness.

When hot-headed socialists were making this claim 150 years ago, it
could be dismissed as contentious rhetoric. Equally, claims by the
rich that "money doesn't buy happiness" could be laughed off as
self-serving nonsense that perpetuated the oppression of the poor by
creating a sort of envy-free pseudo-contentment. But modern science
shows both were right: affluence produces rapidly diminishing returns
on happiness. This in turn has a stark and uncomfortable message for
those of us in the developed world who wallow in material luxuries:
every hundred dollars that we spend on ourselves will have no
detectable effect on our happiness; but the same money, if given to
hungry, ill, oppressed developing-world people, would dramatically
increase their happiness. In other words, effective charity donations
have a powerful hedonic rationale (if one takes an objective view of
the world), whereas runaway consumerism does not. Tor Norretranders
(in this Edge Forum) has pointed out that 50 billion dollars a year --
one dollar a week from each first world person -- could end world
hunger, helping each of the 6 billion people in the world to reach
their happiness set-point. The utilitarian argument for the rich
giving more of their money to the poor is now scientifically
irrefutable, but few journalists have recognized that revolutionary
implication. (Of course, equally modest contributions to the welfare
of other animals capable of subjective experience would also have a
dramatic positive effect on overall mammalian, avian, and reptilian
happiness.)
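
Norretranders's figure is back-of-envelope arithmetic. A sketch,
assuming roughly one billion people in the developed world (that
population count is my assumption, not stated in the text):

```python
# Check of the "50 billion dollars a year" figure: one dollar a week
# from each developed-world person. The population count is assumed.
FIRST_WORLD_POPULATION = 1_000_000_000  # assumption: ~1 billion people
DOLLARS_PER_WEEK = 1
WEEKS_PER_YEAR = 52

annual_total = FIRST_WORLD_POPULATION * DOLLARS_PER_WEEK * WEEKS_PER_YEAR
print(f"${annual_total / 1e9:.0f} billion per year")  # prints "$52 billion per year"
```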

Other contributors to this Edge Forum have also alluded to the social
implications of happiness research. David Myers pointed out the lack
of correlation between wealth and happiness: "it's not the economy,
stupid". Douglas Rushkoff and Denise Caruso bemoaned America's descent
into mindless, impulsive consumerism and media addiction, neither of
which deliver the promised hedonic pay-offs. Daniel Goleman identified
the hidden social effects of our daily consumption habits -- they not
only fail to make us happier, but they impose high environmental costs
on everyone else. Others have suggested that some external substitute
for consumerism might be more hedonically effective. David Pink
championed a switch from accumulating money to searching for meaning.
John Horgan was excited about the quiet proliferation of better
psychedelic drugs. Howard Rheingold thinks more electronic democracy
will help. They may be right that spiritualism, LSD, and online voting
will increase our happiness, but the scientific evidence makes me
skeptical. If these advances don't change our genes or our serotonin
levels in left prefrontal cortex, I doubt they'll make us happier.
There may be other rationales for these improvements in the quality of
life, but, ironically, our subjective quality of life is not one of
them.

Perhaps the most important implication of the happiness literature
concerns population policy. For a naïve utilitarian like me who
believes in the greatest happiness for the greatest number, the
happiness research makes everything much simpler. To a first
approximation, every human is pretty happy. From an extra-terrestrial
utilitarian's viewpoint, human happiness could be treated as a
constant. It drops out of the utilitarian equation. That leaves just
one variable: the total human population size. The major way to
maximize aggregate human happiness is simply to maximize the number of
humans who have the privilege of living, before our species goes
extinct.

Obviously, there may be some trade-offs between current population
size and long-term population sustainability. However, most of the
sustainability damage is due not to our large populations per se, but
to runaway consumerism in North America and Europe, and catastrophic
environmental policies everywhere else. Peter Schwartz (in this Edge
Forum) mentioned the declining growth rate of the world's population
as if it were unreported good news. I take a different view: the good
news for a utilitarian who appreciates the happiness research would be
a reduction in America's pointless resource-wastage and Brazil's
deforestation rate, accompanied by a luxuriantly fertile boom in world
population. Given modest technological advances, I see no reason why
our planet could not sustain a population of 20 billion people for
several hundred thousand generations. This would result in a
utilitarian aggregate of 10 quadrillion happy people during the
life-span of our species -- not bad for such a weird, self-deluded
sort of primate.
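
Miller's "10 quadrillion" is the product of his two stated quantities.
A quick check, reading "several hundred thousand generations" as
500,000 (my reading of his phrase):

```python
# Aggregate human lives in Miller's scenario: population per generation
# times number of generations. 500,000 generations is one reading of
# "several hundred thousand".
population_per_generation = 20_000_000_000  # 20 billion, as stated
generations = 500_000                       # assumed reading

total_lives = population_per_generation * generations
print(f"{total_lives:.0e} lives")  # prints "1e+16 lives", i.e. 10 quadrillion
```
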
GEOFFREY MILLER is an evolutionary psychologist at University College
London, and author of The Mating Mind: How Sexual Choice Shaped Human
Nature. He is currently researching the implications of evolutionary
psychology for consumer behavior and marketing.

LINKS: Geoffrey Miller Home Page

Further Reading on Edge: "Sexual Selection and the Mind": A Talk with
Geoffrey Miller

David Braunschvig 
The Non-US Uniform Mobile Standard

The current American monopoly on Internet innovation is not etched in
stone.

With about two thirds of the worldwide internet user base in North
America, US-based companies generate over 80% of global revenue and
represent about 95% of the sector's overall market capitalization of
about a trillion US dollars (as of January 2000).
This is indeed a paradox for a medium that was designed to be open and
global.

Quite understandable, though, when you consider that US entrepreneurs
benefit from: abundant venture capital, more efficient equity markets,
flexible employment, higher PC penetration, efficient infrastructures
and earlier deregulation leading to lower communications costs to
consumers, better business-academia linkages and a large, homogeneous
domestic market.

Of course, the rest of the world is catching up, as is increasingly
reported in the news. In Europe alone, the aggregate market value of
internet companies has shot up by a factor of 30 in the past year,
admittedly from a low base of $2 billion in early 1999 -- to be
contrasted with "only" a four-fold increase for internet companies
quoted in US
markets. Thus, observers generally agree that the disproportionately
low aggregate capitalization of the non-US internet companies is a
temporary phenomenon.

However, the media here often views the primacy of US innovation in
the internet -- which is of course the premise of its leadership --
as something like an American birthright. During the past "American
century" this has been the conventional wisdom for other equally
significant sectors. In the late 1960s, a major unreported story was
that Boeing's leadership in civil aircraft construction was more
fragile than one would expect; yet, Airbus's orders surpassed Boeing's
last year. Ten years ago, US dominance in cellular telecommunications
technology seemed equally impregnable. Since then, the European GSM
consortium has spawned a technology which is now widely accepted as a
global standard for digital mobile telephony. Likewise, could new
internet concepts and user experiences emerge outside of the US, with
global relevance and reach?

A remarkably underreported story is that the existence of a uniform
mobile standard outside of the US is poised to be the foundation of a
new generation of internet-enabled applications, which can be an
extremely significant innovation. If portable devices and
internet-enabled mobility are to be at the center of the current
information revolution, Europe seems at an advantage to seed the
landscape with new concepts, technologies and companies leveraging
their consistent mobile infrastructure.

In Europe, location-sensitive services are being tested as we speak,
enabling merchants to reach pedestrians and motorists with information
and opportunities. Thus, rather than competing online with pure-play
e-commerce companies, established bricks-and-mortar businesses could
find their revenge in the high streets, thanks to these devices. The
best technologies enabling these experiences might well come from all
over the world -- but the first movers are likely to find privileged
ground in Europe: a caveat for the complacent in the US!

DAVID BRAUNSCHVIG is a managing director of Lazard Frères & Co. LLC in
New York, where he advises governments and corporations on
transactions and technology. In addition to his ongoing work as an
advisor in the fields of information technology, Internet services,
and "new media," he has advised the Mexican government on the
privatization of its national satellite system.

LINK: Lazard Frères & Co. LLC

Ernst Pöppel
My Own Story

Today's most important unreported story is of course my own story.
This must be true for everybody. But who else would be interested?

ERNST PÖPPEL is a brain researcher, Chair of the Board of Directors at
the Center for Human Sciences and Director of the Institute for
Medical Psychology, University of Munich.

Philip Campbell 
"Chemistry for non-chemists"; Entrepreneurism

I found your demand for "most" important a bit of a distraction, so
forgive me for ignoring it.

My first thought was "chemistry for non-chemists". Few people write
about chemistry for the public, few stories appear in the press. It's
intrinsically difficult and, anyway, biology is, for the foreseeable
future, just too sensational (in both good and questionable senses)
and fast-moving for all but the sexiest of the rest of science to get
much of a chance to compete for space in the media. But there is room
for unusual science writers who know how to hit a nerve with a neat
association between interesting chemistry and the everyday world -
there just seem to be too few in existence and/or too little demand.
(I'd mention John Emsley and my colleague Philip Ball as two
honourable examples.)

The second thought was entrepreneurism. Because of inevitable business
secrecy, entrepreneurism too rarely gets adequately opened up to
scrutiny and public awareness. That's not to imply a hostile intent -
entrepreneurism can provide the basis of riveting tales in a positive
as well as negative senses. But, in Europe especially, chief
executives of high-technology companies who bemoan the lack of an
entrepreneurial culture unsurprisingly resist suggestions that a
well-proven journalist be given the chance to roam around their
company and write about what they find. Partly as a result of such
inevitable caution, and partly because of the way the media approaches
business, the public tends to get basic news and oceans of speculation
about share prices and profits, gee-whiz accounts of technology,
misrepresentation from lobby groups on both sides of a divide,
lectures on management, partial autobiographies of successful business
people, but, unless a company collapses, nothing like the whole truth.
More could surely be done, though the obstacles are daunting.

PHILIP CAMPBELL is the Editor-in-Chief of Nature.

LINK: Nature

Steven Pinker 
The Loss of our Species' Biography

Just as we are beginning to appreciate the importance of our
prehistoric and evolutionary roots to understanding the human
condition, precious and irreplaceable information about them is in
danger of being lost forever:

1. Languages. The 6,000 languages spoken on the planet hold
information about prehistoric expansions and migrations, about
universal constraints and learnable variation in the human language
faculty, and about the art, social system, and knowledge of the people
who speak them. Between 50% and 90% of those languages are expected to
vanish in this century (because of cultural assimilation), most before
they have been systematically studied.

2. Hunter-gatherers. Large-scale agriculture, cities, and other
aspects of what we call "civilization" are recent inventions (< 10,000
years old), too young to have exerted significant evolutionary change
on the human genome, and have led to cataclysmic changes in the human
lifestyle. The best information about the ecological and social
lifestyle to which our minds and bodies are biologically adapted lies
in the few remaining foraging or hunting and gathering peoples. These
peoples are now assimilating, being absorbed, being pushed off their
lands, or dying of disease.

3. Genome diversity. The past decade has provided an unprecedented
glimpse of recent human evolutionary history from analyses of
diversity in mitochondrial and genomic DNA across aboriginal peoples.
As aboriginal people increasingly intermarry with larger groups, this
information is being lost (especially with the recent discovery that
mitochondrial DNA, long thought to be inherited only along the female
line, in fact shows signs of recombination).

4. Fossils. Vast stretches of human prehistory must be inferred from a
small number of precious hominid fossils. The fossils aren't going
anywhere, but political instability in east Africa closes down crucial
areas of exploration, and because of a lack of resources existing
sites are sometimes inadequately protected from erosion and vandalism.

5. Great apes in the wild. Information about the behavior of our
closest living relatives, the bonobos, chimpanzees, gorillas, and
orangutans, requires many years of intensive observation in
inaccessible locations, but these animals and their habitats are
rapidly being destroyed.

What these five areas of research have in common, aside from being
precious and endangered, is that they require enormous dedication from
individual researchers, they are underfunded (often running on a
shoestring from private foundations), and have low prestige within
their respective fields. A relatively small reallocation of priorities
(either by expanding the pie or by diverting resources from
juggernauts such as neuroscience and molecular biology, whose subject
matter will still be around in ten years) could have an immeasurable
payoff in our understanding of ourselves. How will we explain to
students in 2020 that we permanently frittered away the opportunity to
write our species' biography?

STEVEN PINKER is Professor of Psychology in the Department of Brain
and Cognitive Sciences, and author of Language Learnability and
Language Development, Learnability and Cognition, The Language
Instinct, How the Mind Works, and Words and Rules.

LINKS: The Official Steven Pinker Web Page; The Unofficial Web Page
about Steven Pinker

Jason McCabe Calacanis
The Farce of the Slacker Generation (Or What the Hell Happened to
Generation X)?

In the early nineties, when I graduated from college, the media was
obsessed with a generation of indifferent teenagers and
twenty-somethings who couldn't be bothered with social causes,
careers, or the general state of humanity. Ironically, the same media
structure which had previously been upset with the 60s generation for
being too rebellious was now upset with the kids born in the 70s and
the
80s for not being rebellious enough. They branded us slackers and they
called us generation X, ho-hum.

Fast forward five short years. The same media covering the same
generation, but instead of dismissing them as slackers they anoint
them business titans and revolutionaries controlling the future of
business, media and culture. With technology as their ally they will
not rest until they've disintermediated anyone or anything
inefficient. This group of rebels is on a mission, and their drive is
matched only by their insane work ethic. Never a mention of slackers
or generation X.

The story that isn't being told in all of this is why a generation of
slackers would suddenly create and drive one of the biggest paradigm
shifts in the history of industry. Clearly part of this is a matter of
perspective. The media giveth and the media taketh away, all in
their desire to create sexy stories through polarization,
generalization and, of course, exaggeration. However, looking deeper
into the issue, one finds that a generation of young adults, having
stumbled onto a new medium (the Internet), was smart enough to seize
the opportunity, taking their own piece of the pie and leaving the
dead to bury the dead (think: old media).

What did we, as generation X, inherit in the early 90s? The remnants
of a five-year, cocaine-infused party on Wall Street that ended in
tears and a recession. Our generation wasn't filled with slackers; it
was filled with individuals so media-savvy, and so media-saturated,
that we knew that participating in the existing paradigm would only
result in low pay and long hours for some old-school company. Is it a
coincidence that this same group of people are the ones who owned the
media that obsessed over the slacker generation? Perhaps they hoped
to guilt us into getting into line?

Equity is the revolution of our generation, as in having equity in the
company you work for. This equity, in the form of stock options, is
not on the same level as the equality that the 60s generation fought
for, but it is certainly an evolution of that same movement. Don't
believe the hype.

JASON MCCABE CALACANIS is Editor and Publisher of Silicon Alley
Daily, The Digital Coast Weekly, and Silicon Alley Reporter, and
Chairman and CEO of Rising Tide Studios.

LINKS: Silicon Alley Daily; The Digital Coast Weekly; Silicon Alley
Reporter

Raphael Kasper 
The Fact That There Are No Longer ANY Unreported Stories

"Today's most important unreported story" may be the fact that there
are no longer ANY unreported stories. To be sure, there are stories
that are given less attention than some might think appropriate, and
others that are inaccurate or misleading. But the proliferation of
sources of information -- the Internet, myriad cable television
stations, niche magazines, alternative newspapers -- makes it
virtually certain that no occurrence can remain secret or unmentioned.

The dilemma that faces all of us is not one of ferreting out
information that is hidden but of making sense of the information that
is readily available. How much of what we read and see is reliable?
And how can we tell? In the not-so-distant past, we could rely, to an
extent, on the brand name of the information provider. We all applied
our own measures of credence to stories we read in the New York Times
or
in the National Enquirer. But who knows anything about the authors of
what we read at www.whatever.com?

Everything -- all that has happened and much that has not -- is
reported.

RAPHAEL KASPER, a physicist, is Associate Vice Provost for Research at
Columbia University and was Associate Director of the Superconducting
Super Collider Laboratory.

LINK: Columbia University Record

Marney Morris
The Consequences of Choices Made About The Internet Today

"America, where are you going in your automobile?" -- Allen Ginsberg

The years 1939 and 1999 were snapshots in time -- revealing the world
as it was and as it would be. At the 1939 New York World's Fair,
General Motors' "Futurama" and Ford's "Road of Tomorrow" showcased
freeways and spiral ramps scrolling around urban towers. The future
was clear. America would rebuild its cities and highways to sing a
song of prosperity and personal freedom.

In 1999, a snapshot of the Internet revealed what was and what will
be. Wires are still being strung. The e-commerce structure is still
being built. And content is still incunabulum.

What is today's most important unreported story? That the choices made
about the Internet today will have great consequences in the next
century. As in the automotive age, exuberant times make it easy to
forget that a bit of thoughtful design will profoundly influence the
fabric of our future society.

What's the issue? Access. Half the people in the US don't have
computers, but 98% have TVs and 97% have phones. Why do they say they
don't have computers? "Too complicated."

Using a computer should be as easy as turning on a TV. And it could
be.

Computer interfaces should be self-explanatory. And simple. And they
could be.

The quality of information design in the US is declining. Good
information design should reveal relationships about the information.
It should make you smarter. Learning disorders are on the rise. It's
not because we are getting better at diagnosing them. It's because we
are creating them. All too often our textbooks are confusing or
misleading. And that same lack of thoughtful design pervades the
personal computer, and the Internet. Information design is a science
that needs to underpin our society if we are going to remain
democratic and vital.

The biggest difference between 1939 and 1999? The automobile was
simple at the outset. It took years to make it complicated and
inaccessible. Computers have been unnecessarily complicated since they
began. It is hard to make things simple. But they could be.

MARNEY MORRIS teaches interaction design in the Engineering department
at Stanford University and is the founder of Animatrix, a design
studio that has built over 300 interactive projects since 1984.
Animatrix is currently creating Sprocketworks.com.

LINKS: Animatrix; Sprocketworks.com

Rupert Sheldrake
The Rise of Organic Farming in Europe

Once seen as a marginal enterprise of interest only to health food
fanatics, organic farming is booming in Europe. Over the last 10
years, the acreage under organic management has been growing by 25 per
cent per year. At present growth rates, 10 per cent of Western
European agriculture would be organic by 2005, and 30 per cent by
2010. But in some parts of Europe the growth rates are even higher. In
Britain, within the last 12 months the acreage more than doubled, but
even so the surge in demand for organic food greatly outstrips the
supply, and 70 per cent has to be imported. Most supermarket chains in
the UK now carry a range of organic products, and the market is
growing at 40 per cent per year. By the end of this year, nearly half
the baby food sold in Britain will be organic.
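
Sheldrake's projections are consistent with simple compound growth at
25 per cent per year. A sketch; the baseline organic share of about
3.3 per cent in 2000 is my assumption, back-solved from his
10-per-cent-by-2005 figure:

```python
# Compound growth of the organic share of Western European agriculture.
# The 2000 baseline share is assumed; the 25%/year rate is as reported.
baseline_share = 0.033  # assumed organic share of acreage in 2000
growth_rate = 0.25      # 25 per cent per year

def share_in(years: int) -> float:
    """Projected organic share after the given number of years."""
    return baseline_share * (1 + growth_rate) ** years

print(f"2005: {share_in(5):.0%}, 2010: {share_in(10):.0%}")
```

With these numbers the projection gives about 10 per cent by 2005 and
about 31 per cent by 2010, in line with the figures quoted.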

Why is this happening? It reflects a major shift in public attitudes,
which are probably changing more rapidly in Britain than in other
countries.

First there was the trauma of mad cow disease and the emergence of
CJD, its human form, contracted through eating beef. No one knows
whether the death toll will rise to thousands or even millions; the
incubation time can be many years. Then in 1999 there was the
remarkable public rejection of genetically modified foods, much to the
surprise of Monsanto and their government supporters.

Recent surveys have shown that the third of the public who now buy
organic
food do so primarily because they perceive it as healthier, but many
also want to support organic farming because they think it is better
for the environment and for animal welfare.

The rise in organic farming together with the continuing growth of
alternative medicine are symptoms of a mass change in world view.

Governments and scientific institutions are not at the leading edge of
this change; they are at the trailing edge. A major paradigm shift is
being propelled by the media, consumers' choices and market forces. A
change of emphasis in the educational system and in the funding of
scientific and medical research is bound to follow, sooner or later.

RUPERT SHELDRAKE is a biologist and author of Dogs That Know When
Their Owners Are Coming Home, And Other Unexplained Powers of Animals,
The Rebirth of Nature and Seven Experiments That Could Change the
World, as well as many technical papers in scientific journals. He was
formerly a Research Fellow of the Royal Society, and is currently a
Fellow of the Institute of Noetic Sciences. He lives in London.

LINK: Rupert Sheldrake Online

Tom de Zengotita
Linda Tripp's Makeover

It's being covered as a publicity ploy by a Lewinsky scandal leftover
-- that is, not much and certainly not seriously. But check out the
pictures. This is a MAJOR makeover. It represents the culmination of a
process we have been tracking for a while -- but in two different
arenas of celebrity, the real and the Hollywood.

Remember the new Nixon? Distant ancestor of Al Gore remakes and of
McCain and Bradley, the "story" candidates, and every prominent real
life figure today who is now obliged to play some version of
themselves. On the other front, we have cases of performer
resurrections in new guises, ranging from the straightforward
comebacks of Frank Sinatra and John Travolta, to the Madonna and
Michael Jackson makeovers, to Garth Brooks's effort to recreate
himself as a fictional celebrity whose name escapes me at the moment.

Linda Tripp's makeover represents a consolidation, a fusion of these
trends. This marks a moment when the possibility of making and
remaking one's image collapses into the possibility of making and
remaking oneself literally.

And it's just the beginning...

TOM DE ZENGOTITA, anthropologist, teaches philosophy and anthropology
at The Dalton School and at the Draper Graduate Program at New York
University.

Denis Dutton 
The Gradual Growth of a Prosperous Middle Class in China and in India

Few large-scale, gradual demographic changes can be expected to
generate headlines. The exceptions are those which point toward
catastrophe, such as the widespread belief a generation ago that the
population bomb would doom millions in the third world.

In fact, the most significant unreported story of our time does deal
with the so called third world, and it is the obverse of the panic
about overpopulation. It is the story of the gradual growth of a
prosperous middle class in China and in India.

The story is truly dull: yes, millions of Indians can now shop in
malls, talk to each other on cell phones, and eat mutton burgers and
vegetarian fare at McDonald's. Such news goes against the main reason
for wanting to cover Indian cultural stories in the first place, which
has traditionally been to stress cultural differences from the West.
That millions of people increasingly have a level of wealth that is
approaching the middle classes of the West (in buying power, if not in
exact cash equivalence) is not really newsworthy.

Nevertheless, this development is of staggering importance. Middle
class peoples worldwide, particularly in a world dependent on global
trade, have important values in common. They share the values they
place on material comfort. They borrow living styles from one
another. They appreciate to an increasing extent each other's cultures
and entertainments. And they place an important value on social
stability. Countries with prosperous middle classes are less likely to
declare war on one another: they have too much to lose. In the modern
world, war is a pastime for losers and ideologues; the middle classes
tend to be neither.

When I was a Peace Corps volunteer in India in the 1960s, I accepted
the conventional belief that south Asia would experience widespread
famine by the 1980s. My first surprise was returning to India in 1988
and finding that far from moving closer to famine, India was richer
than ever.

Now in the computer age, and having abandoned the Fabianism of Nehru,
India is showing its extraordinary capacity to engage productively
with the knowledge economies of the world. China too will contribute
enormously to the world economy of the 21st century.

The story does not square with many old prejudices about the backward
Orient, nor does it appeal to our sense of exoticism. But the emerging
middle class of Asia will change the human face of the world.

DENIS DUTTON teaches the philosophy of art at the University of
Canterbury, New Zealand. He writes widely on aesthetics and is editor
of the journal Philosophy and Literature, published by the Johns
Hopkins University Press. He is also editor of the Web page, Arts &
Letters Daily. Prof. Dutton is a director of Radio New Zealand, Inc.

LINK: Arts & Letters Daily

David M. Buss
Discrimination in the Mating Market

Hundreds of stories are reported every year about discrimination,
bias, and prejudice against women, minorities, and those who are
different. But there's a more pervasive, universal, and possibly more
insidious form of discrimination that goes on every day, yet lacks a
name or an organized constituency: discrimination on the mating
market.
Although there are important individual differences in what people
want (e.g., some like blondes, some like brunettes), people worldwide
show remarkable consensus in their mating desires. Nearly everyone,
for example, wants a partner who is kind, understanding, intelligent,
healthy, attractive, dependable, resourceful, emotionally stable, and
who has an interesting personality and a good sense of humor. No one
desires those who are mean, stupid, ugly, or riddled with parasites.
To the degree that there exists consensus about the qualities people
desire in mating partners, a mating hierarchy is inevitably
established. Some people are high in mate market value; others are
low. Those at the top, the "9's" and "10's" are highly sought and in
great demand; those near the bottom, the "1's" and the "2's," are
invisible, ignored, or shunned.

Being shunned on the mating market relegates some individuals to a
loveless life that may cause bitterness and resentment. As the rock
star Jim Morrison noted, "women seem wicked when you're unwanted."
Discrimination on the mating market, of course, cuts across sex, race,
and other groups that have names and organized advocates. Neither men
nor women are exempt. For those who suffer discrimination on the
mating market, there exists no judicial body to rectify the injustice,
no court of appeals. It's not against the law to have preferences in
mating, and no set of social customs declares that all potential mates
must be treated equally or given a fair chance to compete.

But it's not just the rock bottom losers on the mating market that
suffer. A "4" might aspire to mate with a "6," or "7" might aspire to
mate with a "9." Indeed, it's likely that sexual selection has forged
in us desires for mates who may be just beyond our reach. The "7" who
is rejected by the "9" may suffer as much as the "4" who is rejected
by the "6."

People bridle at attaching numbers to human beings and making the
hierarchy of mate value so explicit. We live in a democracy, where
everyone is presumed to be created equal. Attaching a different value
to different human beings violates our sensibilities, so we tend not
to speak in civilized company of this hidden form of discrimination
that has no name. But it exists nonetheless, pervasive and insidious,
touching the lives of everyone save those few who opt out of the
mating market entirely.

DAVID M. BUSS is a Professor of Psychology at the University of Texas
at Austin where he teaches courses in evolutionary psychology and the
psychology of human mating. He is the author of The Dangerous Passion:
Why Jealousy is as Necessary as Love and Sex; The Evolution Of Desire:
Strategies Of Human Mating; and Evolutionary Psychology: The New
Science Of The Mind.

Charles Arthur 
The Peculiar Feedback Loops - Both Negative and Positive - That Drive
Media Reporting of Technological and Science Issues

The most important unreported story, and perhaps one that is
impossible to report, is about the peculiar feedback loops-- both
negative and positive -- that drive media reporting of technological
and science issues.

In Britain, the science reporting agenda in the past year has been
dominated by stories about genetically modified food and crops.
Britons have rejected them; experimental crops have been torn up
(thus preventing the experiments from producing results that could
show whether or not the crops had harmful effects).
Supermarkets vie with each other to find some way in which they don't
use genetically modified ingredients or crops. Newspapers run
"campaigns" against genetically modified ingredients.

There is an incredible positive feedback loop operating there, driving
ever wilder hysteria -- at least amongst the media. Whether the public
really cares is hard to ascertain.

Meanwhile climate change, that oft-repeated phrase, is almost accepted
as being right here, right now; to the extent that my news editor's
eyes glaze over at the mention of more global warming data, more
melting ice shelves (apart from "Are there good pictures?" A calving
ice shelf can do it.) There is clearly a negative feedback loop
running there. The only way to garner interest is to present someone
or some paper which says it isn't happening. Which seems to me
pointless, before Stephen Schneider jumps on me.

But what is making those loops run in the way they do? Why doesn't
genetically modified food get a negative loop, and climate change a
positive one? What are the factors that make these loops run with a +
or - on the input multiplier?
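The "+ or -" on the input multiplier can be made concrete with a toy iteration. This is only an illustrative sketch: the model and the gain values are assumptions of mine, not anything from the essay.

```python
# Toy model of media feedback: this week's coverage of a story feeds
# back into next week's coverage through a constant gain g. A gain
# above 1 (positive feedback) produces runaway hysteria; a gain below
# 1 (negative feedback) lets interest decay toward nothing.

def coverage_series(initial, gain, weeks):
    """Return story coverage over time under a constant feedback gain."""
    series = [initial]
    for _ in range(weeks):
        series.append(series[-1] * gain)
    return series

# Hypothetical gains for the essay's two examples.
gm_food = coverage_series(10.0, 1.5, 6)   # positive loop: GM food
climate = coverage_series(10.0, 0.6, 6)   # negative loop: climate data
print(f"GM food coverage grows to:   {gm_food[-1]:.1f}")
print(f"climate coverage decays to:  {climate[-1]:.1f}")
```

The open question in the essay is precisely what sets the sign and size of the gain for a given story; the sketch only shows how different the outcomes are once the gain is fixed.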

Damned if I know how it all works. But I'll read about it with fascination.
As we are more and more media-saturated, understanding how all this
works looks increasingly important, yet increasingly hard to do.

CHARLES ARTHUR is technology editor at The Independent newspaper.

Delta Willis
Weird Ape Fouls Planet

In the spirit of tabloid headlines (Ted No Longer Fonda Jane) my
Exclusive, Untold Story would be headed Weird Ape Fouls Planet.

Granted Bill McKibben got very close with his book The End of Nature,
but there continues a pervasive denial that stops this story from
honest resolution.

First, of course, we'd rather not hear about it. At the Jackson Hole
Wildlife Film Festival we discussed the dilemma of presenting the
dreadful conservation stories without depressing the audience, which
is another way of asking, Shall we not hit this nail on the head
again? So beyond the various lobbies and doubts that obscure issues
such as global warming, the media hesitate to offend, or to bore.

Secondly, it's a story I'd rather not research and write because it is
a bummer. The first half I did attempt; I truly do think humans are
quite a wonderfully weird, unique species and the story of our place
within the evolution of life on earth is fantastic (and for many,
still unbelievable.) But when it comes to our impact on the earth, the
already palpable effects of over-population, I get bogged down in the
details, or distracted by Untold Story Number Two (Equal Rights
Amendment Never Passed U.S. Senate).

For example: there was a rush by health insurers to provide for
coverage of Viagra use, but not of contraception pills for women.
Enormous pressures remain for reproduction, from social and religious
ones (the Vatican Rag) to the biological cues that inspired John
Updike (The Witches of Eastwick) to put these words in the mouth of
Jack Nicholson: What a Bait They Set Up. In a roundabout way the
fallibilities of being human were covered ad nauseam (Leader of Free
World Impeached for Thong Thing) and maybe the reason that too will
pass is because we would prefer to deny the power of these urges, on
par with drugs and greed, plus ego, i.e. parking one's off road 4-WD
Range Rover in front of the Hard Rock Café.

So I'm with Bugs Bunny, who said, people are the strangest animals
because we have this ability to reason and yet that base stem of the
brain, wherever it is located on your anatomy, tends to rule the day.
Hollywood might be the only medium that can rattle our cage on such
issues of perspective, truly seeing ourselves in context; journalism
no longer seems capable of delivering profound, incisive news, unless
you dare to have canines as sharp as those of Maureen Dowd.

DELTA WILLIS is the author of The Hominid Gang, Behind the Scenes in
the Search for Human Origins; The Leakey Family: Makers of Modern
Science; and The Sand Dollar & The Slide Rule: Drawing Blueprints
from Nature.

Tor Norretranders
A Dollar a Week Will End World Hunger

It's now a billion to a billion: Of the six billion human beings
currently alive on this planet, one billion live with a daily agenda
of malnutrition, hunger and polluted drinking water, while another one
billion -- including you and me -- live lives where hunger is never
really an issue.

The number of really rich and really poor people on the planet now
match. That makes the following piece of arithmetic very simple:

If all of us who are rich (in the sense that starvation is out of the
question and has always been) want to provide the economic resources
necessary to end hunger, how much should we pay? We assume that all
existing government and NGO aid programs continue, but will be
supplemented by a world-wide campaign for private donations to end
hunger (feed your antipode).

The cost of providing one billion people with 250 kilograms of grain
every year is approximately $40 billion a year. That would
seem to be a lot of money, but with one billion people to pay, it is
no big deal: $40 a year! An even more moderate estimate is provided by
the organization Netaid: just $13 billion a year and the basic
health and food needs of the world's poorest people could be met.

With $50 billion a year as an estimated cost of ending world hunger,
the expense for each well-off person is one dollar a week. It is the
growth in the number of rich people on the planet, while the number of
poor has not grown, that results in this favorable situation,
unprecedented in human history.
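The essay's arithmetic can be checked in a few lines. The dollar figures are the essay's own estimates; the function name is just an illustration.

```python
# Back-of-the-envelope arithmetic from the essay: if one billion
# well-off people split the estimated annual cost of ending world
# hunger, what does each person pay per year and per week?

def per_person_cost(total_annual_cost, payers):
    """Return (annual, weekly) cost per payer, in dollars."""
    annual = total_annual_cost / payers
    return annual, annual / 52

BILLION = 1_000_000_000

# Grain-based estimate: $40 billion a year across one billion payers.
annual, weekly = per_person_cost(40 * BILLION, BILLION)
print(f"grain estimate: ${annual:.0f}/year, ${weekly:.2f}/week")

# Round figure used in the essay: $50 billion a year.
annual, weekly = per_person_cost(50 * BILLION, BILLION)
print(f"$50B estimate:  ${annual:.0f}/year, ${weekly:.2f}/week")
```

At $50 billion a year the per-person cost comes to $50 a year, or roughly 96 cents a week, which is the essay's "one dollar a week".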

The advent of the Internet makes this proposal practical and
conceptually clear:

Living in a global village makes it meaningful to help end global
hunger, just like the populations of most industrialized countries
have already done on a national scale.

The Internet provides a simple way of collecting the money (this
writer broke the embargo on his own unreported story and sent $100 to
www.netaid.org to pay the global tax to end hunger for himself and one
child). The money flowing through organizations such as netaid.org and
hungersite.org will attract public attention and scrutiny of their
efficiency in turning money into food for the hungry.

Also, the Internet makes it perfectly clear who should consider
her/himself as part of the rich billion on the planet and hence pay a
dollar a week: every user of the Internet. In a few years' time the
number of users will be one billion and we could see the end of hunger
on this planet.

Obviously, once the money to end hunger is available, all sorts of
obstructions will appear before those in need are fed: bureaucracies,
mafias, corruption, waste. But is it not then time that we deal with
them? A very important effect of annual donations from a billion
people is the resulting global awareness of the embarrassment involved
in the present unnecessary situation.

TOR NORRETRANDERS is a Danish science writer, lecturer and consultant
based in Copenhagen. He is the author of The User Illusion: Cutting
Consciousness Down to Size. His latest book, in Danish, is Frem i

Denise Caruso 
Maybe Media Is the Real Opiate of the People

One of today's great untold stories -- or, I should say, it keeps
trying to get itself told and is usually mercilessly thrashed or
ignored entirely -- is the degree to which our behavior is manipulated
and conditioned by media.

Most everyone has heard about the studies (more than 200 at last
count) that show a direct correlation between increased aggression and
exposure to violence portrayed in media. The most compelling of this
research suggests that the visual media in particular -- television,
movies and even video games -- employ psychological techniques such as
desensitization and Pavlovian conditioning which change how we think
about and react to violent behavior.

Of course, the entertainment and advertising industries dismiss these
studies, saying it's impossible that their little ol' movie or TV show
or 30-second ad or point-and-shooter could actually influence anyone's
behavior.

That's what they say to Congress, anyhow, when they get called on the
carpet for irresponsible programming.

But how do their protestations square with the gazillion-dollar
business of TV advertising, in particular? This is an industry which
is based entirely on the proposition that it can and does, in fact,
impel people to buy a new car or a new pair of shoes, to drink more
beer or get online -- to do something different than they've been
doing, in some shape or form.

So one of those statements has to be a lie, and if you follow the
money, you can make a pretty good guess which one.

Once you are willing to consider this premise (and if you've read the
studies and/or are willing to honestly observe your own behavior, it's
pretty hard not to), it becomes apparent that a whole lot more than
our attitudes toward violence may be influenced by visual media.

Not long ago, for example, it occurred to me that the rising obesity
rate of our TV-addicted population might actually have something to do
with the fact that any given hour of programming will yield an
infinity of food porn -- sexy, slender women shoving two-pound
dripping hamburgers into lipsticked mouths, or normal-sized families
cheerily gorging themselves at tables piled with giant lobsters and
steaks and all manner of things oozing fat and sugar.

I mentioned this theory of mine to a colleague last November and,
remarkably, the very next day, a blurb in The New York Times' science
section announced that a Stanford University study had correlated
children's obesity with television watching, and that the American
Institute for Cancer Research found that most Americans overestimated
a normal-sized portion of food by about 25 percent.

Neither of these two studies directly linked TV's food bonanza with
overeating, but they do suggest a connection between what our eyes see
and what our brains subsequently do with that information.

It's then no giant leap to wonder whether the constant barrage of TV
"news" and political programming -- from the Clinton-Lewinsky
extravaganza to Sunday morning's meet-the-pundits ritual to the
"coverage" of the latest batch of presidential hopefuls -- is another
case of media desensitization in action.

Could TV itself, the place where most Americans get their daily fix of
news, actually be causing America's vast political ennui and depressed
voter turnout? Have we become so anesthetized by what we watch that we
require the specter of Jesse Ventura or Donald Trump as president to
engage, even superficially, in the political process?

The studies that correlate media exposure with a flattened cultural
affect about violence would support that general premise.

But as we know, correlation does not prove causation. To prove that TV
"causes" violence, for example, you'd have to conduct a controlled,
double-blind experiment which, if successful, would result in someone
committing a violent act.

The human subjects committee at any responsible research lab or
university would never approve such an experiment, and for good
reason.

But it must be possible to set up a sufficiently rigorous,
violence-free experiment to measure the actual neurological and
behavioral effects of visual media. Wouldn't we all like to know what
really happens -- what happens in our brains, what humans can be
impelled to do -- as a result of spending so many hours in front of
TVs and computers and movie screens?

Considering the massive amount of visual stimuli that is pumped into
our brains every day -- and the astronomical profits made by the
industries who keep the flow going -- this seems like a story
eminently worth reporting.

DENISE CARUSO is Digital Commerce/Technology Columnist, The New York
Times.

LINK: "Digital Commerce" Column Archive

Phil Leggiere 
Appropriation of the Internet as an Effective and Powerful Tool of
Large Scale Global Social Protest

Buried beneath the blitz of news coverage of the rise of e-commerce
and the emergence of the World Wide Web as a new focal point of
consumerism (the most ubiquitous stories of the moment) is a
potentially just as significant, still unreported, story: the
appropriation of the Internet as an effective and powerful tool of
large scale global social protest.

Most major mainstream broadcast and cable news coverage and commentary
rather jadedly treated the WTO protests in Seattle last month as a
fluke, a nostalgic hippie flashback. Their cynicism reflects not only
the blinders of their Beltway mindsets, but the bias of their own, now
challenged, media formats.

For most of the past forty years, since broadcast television emerged
in about 1960 as the primary deliverer (and definer) of news,
political activism evolved in a kind of dependent relationship (which
superficially some took to be a symbiotic one) to television.
Intuitively, sometimes by instinct, sometimes, as students of McLuhan,
quite consciously, activists of the civil rights and anti-Vietnam War
movements, attuned to the persuasive power of the mediated image,
learned to cast and craft their political protests at least in part as
media politics. Grass roots organizing remained, as always, the
essential underpinning of a viable social movement, but angling for a
dramatic visually intense slot on the nightly news (what Abbie
Hoffman called "Becoming an Advertisement for the Revolution" or
"Media Freaking") became a primary tactic, if not full fledged

The power relationship, however, was always ultimately one-sided.
Those who lived by the televised image could be easily squashed by
the image gatekeepers, cancelled like a burnt-out sit-com or
cops-and-robbers show once their novelty effect ebbed. And when "The
Whole World" was no longer watching, communication was pretty easily

What the WTO protests represent, far from Luddite know-nothing-ism
(despite the handful of brick throwing John Zerzan/Theodore Kaczynski
"anarcho-primitivists" whom broadcast TV reflexively and inevitably
locked-in on as the TV stars of the event) is the first social protest
movement created largely through and communicating largely via the
Web. Which is to say the first, potentially at least, able to by-pass
the gatekeepers of mainstream media while reaching hundreds of
thousands, perhaps millions, of participants/observers/ sympathizers
and others, globally on an ongoing basis.

This suggests that the populist cyber-punk roots of Net BBS's are
surviving and even flourishing alongside the corporate branding the
Web is undergoing. With due apologies to the great writer Bruce
Sterling (who advises us to retire cyber prefixes once and for all), I
can't help thinking that, despite the apparent easy triumph of
cyber-commercialization (the Web as global strip-mall), the next few
years may also witness the blossoming of the first era of mass global
populist cyber-protest.

PHIL LEGGIERE is a free-lance journalist and book reviewer for Upside
Magazine and several other publications.

Leon M. Lederman 
Survival Depends on the Race Between Education and Catastrophe

A greatly underrated crisis looming over us was predicted by the
futurist H. G. Wells. In about 1922 he commented that survival would
depend on the race between education and catastrophe. The
justification for this profound foresight can be seen in the
incredible violence of this century we have survived, and the newfound
capacity of mankind to obliterate the planet. Today, although
political rhetoric extols education, the educational system we have
cleverly devised and which is in part a product of the wisdom of our
founding fathers defies reform. It is a system incapable of learning
from either our successes or our failures.

How many parents and policy makers know that the system for teaching
science in 99% of our high schools was installed over 100 years ago?

A National Committee of Ten in 1893 chaired by a Harvard President
recommended that high school children be instructed in science in the
sequence Biology, then Chemistry, and then Physics. The logic was not
wholly alphabetical since Physics was thought to require a more
thorough grounding in mathematics.

Then came the 20th century, the most scientifically productive
century in the history of mankind. Revolutions in all these and other
disciplines have changed the fundamental concepts and have created a
kind of hierarchy of sciences; the discovery of the atom, quantum
mechanics, nuclear sciences, molecular structures, quantum chemistry,
earth sciences and astrophysics, cellular structures and DNA. To all
of this, the high school system was unmoved.

These events and pleas to high school authorities from scientists and
knowledgeable teachers went unheeded. The system defies change. We
still teach the disciplines as unconnected subjects with ninth grade
biology as a chore of memorizing more new words than 9th and 10th
grade French together!

This is only one dramatic example of the resistance of the system to
change. Our well-documented failure in science education is matched by
failures in geography, history, literature and so forth.

"So what?" critics say. "Look at our booming economy. If we can do so
well economically, our educational system can't be all that
bad."

Here is where appeal to H. G. Wells' insightful vision enters. The
trend lines of our work force are ominous. Increasing numbers of our
citizens are cut-off from access to technological components of
society, are alienated and are condemned to scientific and
technological illiteracy. We have, by this process, solidified and
increased the gap between the two classes of our culture. And the
formative elements of culture outside of school: TV, cinema, and radio
. . . strongly encourage this partition. Look at the social (as well
as economic) status of teachers. Most parents want the best teachers
for their children, but would bridle at the suggestion that their
children become teachers.

The penalties of continuing to graduate cultural illiterates (in
science and the humanities) may not be evident in year 2000 Wall
Street, but it is troubling the leaders of our economic success, the
CEO's of major corporations who see a grim future in our workforce.
Can we continue to import educated workers? As the low-level service
jobs continue to give way to robots and computers, the needs are
increasingly for workers who have high level reasoning skills, which a
proper education can supply to the vast majority of students.

But what is it that threatens "catastrophe" in the 21st century?
Aside from the dark implication of a hardening two-class system, there
is a world around us that provides global challenges to society and
solutions require large popular consensus. Global climate change,
population stabilization, the need for research to understand
ourselves and our world, the need for extensive educational reform,
support for the arts, preservation of natural resources, clean air and
water, clean streets and city beautification, preservation of our
wilderness areas and our biodiversity -- these and other elements make
life worth living, and cannot sensibly be confined to enclaves of the

You don't have to be a rocket scientist to construct catastrophes out
of a failed educational system.

LEON M. LEDERMAN , the director emeritus of Fermi National Accelerator
Laboratory, has received the Wolf Prize in Physics (1982), and the
Nobel Prize in Physics (1988). In 1993 he was awarded the Enrico Fermi
Prize by President Clinton. He is the author of several books,
including (with David Schramm) From Quarks to the Cosmos : Tools of
Discovery, and (with Dick Teresi) The God Particle: If the Universe Is
the Answer, What Is the Question?

LINKS: The Story of Leon; Leon M. Lederman Science Information Center

Peter Schwartz 
The Dramatic Fall in the Rate of Growth in Global Population

My candidate for most important unreported story is the dramatic fall
in the rate of growth in global population. Instead of hitting 20, 30
or even 50 billion as was feared only a few years ago, with all the
associated horror, it is likely to reach between ten and eleven
billion by mid century. The implications for the carrying capacity of
the planet are profound.

PETER SCHWARTZ is an internationally renowned futurist and business
strategist. He is cofounder and chairman of Global Business Network, a
unique membership organization and worldwide network of strategists,
business executives, scientists, and artists based in Emeryville,
California. He is the author of The Art of the Long View : Planning
for the Future in an Uncertain World and coauthor (with Peter Leyden &
Joel Hyatt) of The Long Boom.

LINK: Global Business Network; The Long Boom Home Page

Carlo Rovelli 
The End of the Dream of a More Gentle World

This unreported story is not particularly up to date. It is perhaps
not even unreported (everything is reported -- does anything exist if
it isn't reported?). And I do not know how "important" it is.
Importance is determined by what is perceived to be important, and
what is perceived as important is reported. What remains is what will
be viewed as important by our descendants -- but let us leave them the
burden of that choice -- and what is perceived as important by
individuals or groups.

I have been reading the other "most important unreported stories" in
this fascinating collection of answers, and I have been surprised how
closely each writer's sense of importance matches his or her specific
interests, or personal history. So, I shall
allow myself to be absorbed by the same soft self-indulgence...

The most important under-reported story I want to talk about is the
old story of the great dream of a more gentle world. A more just
world, not based on the competition of everybody against everybody,
but based on sharing, the world as a collective adventure.

The dream is dead.

Killed by the simple fact that most of the time -- and this one is no
exception -- the strong and aggressive win and the gentle lose. Killed
by the fears it generated in the privileged. Killed by its own
naivete, by the horrors its unrealism generated, by its incapacity to
defend itself from the thirst for power hidden inside it. And killed
by the thousands of reported and over-reported stories of its sins.

The new century rises over a dreamless new world, with richer riches
and with desperates by the millions. This is reality and we go with
it, its imperfections and its promises, which are not small.

But the dream is dead. It moved generations, it inspired peoples,
made countries, it filled with light the youth of so many of us, from
Prague to San Francisco, from Paris to Beijing, from Mexico City to
Bologna. It led some to the country, some to take up arms, some to let
their minds fly, some to Africa, some to join the party, some to fight
the party. But it was for humanity, for a better future for everybody;
it was pure, beautiful and generous. And real.

Stories do not go unreported because they are kept hidden, but simply
because there are different perceptions of reality and of importance.
Perception of reality changes continuously. What was there, and big,
may then vanish like a strange morning dream, leaving nothing but
confused, undecoded traces and incomprehensible stories. A strong
ideology believes itself to be realist, and calls its reported stories
"reality". The old dreams are transformed into irrational monsters;
then they go unreported; then they have never existed.

CARLO ROVELLI is a theoretical physicist, working on quantum gravity
and on foundations of spacetime physics. He is professor of physics at
the University of Pittsburgh, PA, and at the Centre de Physique
Theorique in Marseille, France.

LINK: Carlo Rovelli Home Page

Timothy Taylor 
The sexual abuse of children by women

While I was writing about libidinousness among female primates for my
book The Prehistory of Sex: Four Million Years of Human Sexual
Culture, a friend told me that she had been sexually abused by her
mother. My research had helped her cut through the cultural myth that
only men could be sexually violent. Since then five more people have
told me that as children they were sexually abused by females (not all
by their own mother -- adult relatives and unrelated persons figure
too). A seventh person believes she may have been abused by a female,
but her memory is clouded by later abuse by a male. The seven come
from a range of social and ethnic backgrounds; three are men and four
are women. None of them has had any form of memory-tweaking therapy,
such as hypnotic regression. Indeed only two of the seven have
mentioned their abuse to a doctor or therapist (as compared with two
out of the three people I know who were abused by males).

Each abused child grew up in ignorance of others, in a culture in
which their kind of story was not told. As with abuse by males, the
psychological effects are profound and long-term. The ability to name
what happened is thus won with difficulty and has come only recently
to each of the victims I know: maturity and parenthood, supportive
friends, and the simple realization that it can actually happen, have
all played a part.

Whatever its biological and cultural antecedents -- poor parent-infant
bonding, the urge to control and dominate, repression, hidden
traditions of perversion, etc. -- the truth of abuse by males has only
recently been accepted, and the extent of it probably remains
underestimated. By contrast, abuse by females is almost totally
unreported outside specialist clinical literature. Successful criminal
prosecutions, rare enough for the former, are almost unheard of for
the latter except where they comprise part of more unusual
psychopathic crimes (such as the torture and murder of children). But
there is nothing inherently implausible about there being as many
female as male paedophiles in any given human community. That women
paedophiles have been a systemic part of recent social reality is, in
my view, today's most important unreported story.

TIMOTHY TAYLOR teaches in the Department of Archaeological Sciences,
University of Bradford, UK, and conducts research on the later
prehistoric societies of southeastern Europe. He is the author of The
Prehistory of Sex: Four Million Years of Human Sexual Culture.

Daniel Pink 
Maslow's America

Time's "Person of the Year" should have been Abraham Maslow.

The great psychologist is the key to understanding the biggest
economic story of our day -- a story that's been obscured by stock
tickers crawling across the bottom of every television screen, by
breathless magazine covers about dot-com fired insta-wealth, and by
the endless decoding of Alan Greenspan's every emission.

Deep into the middle class, Americans are enjoying a standard of
living unmatched in world history and unthinkable to our ancestors
just 100 years ago. This development goes well beyond today's high Dow
and low unemployment rate. (Insert startling factoid about VCRs,
longevity, car ownership, antibiotics, indoor plumbing, or computing
power here.) And demographics are only deepening the significance of
the moment. Roughly seven out of eight Americans were not alive during
the Great Depression -- and therefore have no conscious memory of
outright, widespread, hope-flattening economic privation. (Note: long
gas lines and short recessions don't qualify as life-altering
hardship.) As a result, the default assumption of middle class
American life has profoundly changed: the expectation of comfort has
replaced the fear of privation.

Enter Maslow and his hierarchy of needs. I imagine that at least a
majority of Americans have satisfied the physiological, safety, and
even social needs on the lower levels of Maslow's pyramid. And that
means that well over 100 million people are on the path toward
self-actualization, trying to fulfill what Maslow called "metaneeds."
This is one reason why work has become our secular religion -- and why
legions of people are abandoning traditional employment to venture out
on their own. (It's also why I guarantee that in the next twelve
months we'll see newsmagazine stories about despondent, unfulfilled
"What's It All About, Alfie?" Internet millionaires.)

What happens when life for many (though, of course, not all) Americans
-- and ever more people in the developed world -- ceases being a
struggle for subsistence and instead becomes a search for meaning?

It could herald an era of truth, beauty, and justice. Or it could get
really weird.

DAN PINK, a contributing editor at Fast Company and former chief
speechwriter to Vice President Al Gore, is completing a book on the
free-agent economy.

LINK: Pink's Free Agent Nation Website

Robert Hormats 
The Information Revolution Requires A Matching Education Revolution

Today's most important unreported story is that for many millions of
people in the industrialized and developing countries education and
training are not keeping up with the information technology
revolution. As the world enters the 21st century we need more robust
education and training, benchmarking to ensure that educational
systems provide the skills needed for this new era, and resource
commitments that recognize that educational investments are critical
to economic prosperity and social stability in this new century.

If the benefits of the information technology revolution are to be
broadly shared, and its economic potential fully realized, a far
greater effort is required during and after school years to enable
larger numbers of people to utilize and benefit from new information
technologies.

Failure to do this will widen the digital divide and the income gap
within and among nations, sowing seeds of social unrest and political
instability. It also will deprive our economies of the talents of many
people who could make enormous contributions to science, medicine,
business, the arts and many other fields of endeavor were they able to
realize their full educational and professional potential.

The goal of our societies should be not only to be sure schools and
homes are wired and online -- itself a critical infrastructure
challenge -- but to provide education and training programs so that
larger and larger numbers of people at all income levels can use these
new technologies to learn and create during their school years and
throughout their lives.

For the US, whose population is steadily aging, this means ensuring
that older citizens have greater training in the use of these new
technologies. And it means that younger Americans, especially
minorities who will become an increasingly significant portion of the
21st century workforce, have far greater education and training in the
use of information technologies than many do now. The better trained
they are the better position they will be in to contribute
productively to the US economy -- empowered by these new technologies.
In the emerging economies, IT education is an important part of their
evolution into dynamic participants in the global information economy,
attracting more and more investment based not only on low labor costs
or large domestic markets but also on their innovativeness and ability
to adapt to a world where more and more high quality jobs are
knowledge based. In much of Asia the financial crisis received so much
attention that much of the world paid little attention to the dynamic
changes taking place in the region's information technology sector;
impressive as those changes are, they can be even more impressive as
greater investment in human capital expands the number of
information-technology-savvy citizens in these countries and thus
broadens the base of high-tech prosperity.

In the least developed economies, IT education should be a top
priority. It is greatly in the world's interest that they be able to
achieve their full economic potential. A substantial amount of
international support from the private sector and governments will be
needed. This can both prevent these nations from falling further
behind and unlock the innovative potential of their peoples.

An education revolution in industrialized, emerging and developing
nations is needed to keep up with and realize the full potential of
the information technology revolution. We should not become so
enamoured of technology that we ignore the human dimension that is so
critical to its success and to the social progress that these
technologies have the potential to accelerate.

ROBERT HORMATS is Vice-Chairman of Goldman Sachs International.

LINK: Goldman Sachs

Jaron Lanier 
The End of Clausewitzian War

Prior to about twenty years ago, wars could almost always be
understood as depressingly rational events perceived by instigators as
being in their own self interests. Certain recent wars and other acts
of organized violence are astonishing in that they seem to break this
age old pattern. A striking example is the series of awful
confrontations in the former Yugoslavia.

If it was only an evil strongman, a Slobodan Milosevic, who instigated
the bloodshed, events would have kept true to the old established
pattern. Many a leader has instigated conflict, conjuring a demonized
foreign or domestic enemy to rouse support and gain power. But that is
not really what happened in this case of Yugoslavia. In the past, the
demons were accused of posing a material threat. Hitler claimed the
Jews had taken all the money, for example. Yes, he claimed they (we)
were morally degenerate, etc., but that alone would perhaps not have
roused a whole population to support war and genocide. The material
rationale seemed indispensable.

By contrast, in Yugoslavia a large number of both middle level leaders
and ordinary citizens, not limited to the Serbs or another single
group, rather suddenly decided to knowingly lower their immediate
standard of living, their material prospects for the foreseeable
future, their security, and their effective long term options and
freedoms in order to reinforce a sense of ethnic identity. This is
remarkably unusual. While ethnic, religious, and regional movements
have throughout history sought political independence, they have
almost never before resorted to large scale violence unless economic
or some other form of material degradation was a critical motivation.
Had the English Crown been more generous in the matter of taxation,
for instance, it might well have held on to the American Colonies.

It is often pointed out that the cultural context for conflict in the
Balkans is extraordinarily old and entrenched, but there are awful
psychic wounds in collective memory all over the world. There are
plenty of individuals who might under other circumstances be drawn
once again into conflict over the proper placement of the border
between Germany and Poland, for example, but there is absolutely no
material incentive at this time to make an issue of it, and every
material incentive to live with the situation as it is.

Similarly, if an uninformed, uneducated population had burst into
violent conflict on the basis of bizarre beliefs that the enemy posed
a serious threat of some kind, perhaps abducting children to drink
their blood, then that would have kept to the historical pattern as
well. Neither Von Clausewitz nor any other theorist of war has claimed
that war has always in fact been in the self interest of perpetrators,
only that it was perceived to be so. But Yugoslavia was a nation that
was relatively prosperous, well educated, and informed. Yugoslav
society was not closed or controlled to the extent of other
contemporary nations formed upon related ideologies. There were
relatively open borders and extensive commerce, tourism, and cultural
contact with the West.

And Yugoslavia was not Germany between the wars. Yugoslavs were not
humiliated or frustrated relative to other populations across their
borders. The material conditions were critically different. There was
no sense of hopeless economic disintegration, no reason to think,
"Even war would be better than this, or at least a risk worth taking."

Before Yugoslavia, war famously spared nations blessed with
McDonald's hamburger franchises. The comforting common wisdom was
that economic interdependence reduced the threat of war. Economic
globalism was supposed to remove the material incentives from making
war, and indeed it probably has done that.

In former Yugoslavia, an upwelling of need for absolute identity
trumped rational, material self interest. This phenomenon can also
perhaps be seen in some instances in the rise of Islamic militancy.
The recent rise of violent events perpetrated in the name of
"traditional" identities, values, and beliefs is startling. Once
again, such violence has always existed, but almost always before it
has been coupled with a component of material motivation. The Biblical
Israelites were enslaved and subjected to economic abuse, for example.
The fundamentalists who attack abortion clinics seek no improved
material prospects. Neither do the Taliban. Or the bombers of the
Federal Building in Oklahoma City.

In all these cases, identity has become more important than wealth,
and that is new.

Another possible explanation that haunts me is that the human spirit
cannot cope with the changes technology makes to human identity. This
can be as simple as MTV blasting into the lives of children who
otherwise would never have known the meaning of spandex, piercing, or
whatever is in fashion on a particular day. Any thinking person,
though, must know that the changes to the human condition wrought by
such technologies as MTV, or even abortion and birth control, are mere
whispers compared to the roar of changes that will soon come to pass.

JARON LANIER , a computer scientist and musician, is a pioneer of
virtual reality, and founder and former CEO of VPL. He is currently
the lead scientist for the National Tele-Immersion Initiative.

Further reading on Edge: Chapter 17, "The Prodigy," in Digerati

LINKS: Jaron Lanier's Home Page; The National Tele-Immersion
Initiative

Steve Quartz 
The Coming Transformation in Human Life and Society in the
Post-Genomic World

Although there hasn't been any shortage of stories on genes in the
press, public dialogue hasn't even begun to seriously consider how
radically genetic technologies will alter human life and society --
and probably much sooner than we think. Forget cloning -- the pace
of the Human Genome Project combined with the emerging dominance of
market forces in dictating how spinoff technologies from gene therapy
to engineering novel genes will be utilized suggests that we'll soon
be able to retool human life (altering human traits from life history
-- aging, reproduction -- to intelligence and personality). We haven't
really begun to consider the enormous implications these will have for
the design of human society and social policy, from the family unit to
education and work. My bet is that feasible technologies to retool
human life will put us face to face with the basic dilemma of deciding
what it means to be human within two decades.

STEVEN QUARTZ is a professor in the division of Humanities and Social
Sciences and the Computation and Neural Systems Program at Caltech. He
is the author, with Terrence Sejnowski, of the forthcoming Who We Are:
How Today's Revolutionary Understanding of the Brain is Rewriting Our
Deepest Beliefs About Ourselves.

Robert R. Provine
The Walkie-Talkie Theory: Bipedalism Was Necessary For Human Speech

Speech is a byproduct of the respiratory adjustments associated with
walking upright on two legs. With bipedalism came a secondary and
unrecognized consequence, the respiratory plasticity necessary for
speech. Quadrupedal species must synchronize their locomotion and
respiratory cycles at a ratio of 1:1 (strides per breath), a coupling
required by the shared, rhythmic use of the thoracic complex (sternum,
ribs, and associated musculature), and the need to endure impacts of
the forelimbs during running. Without such synchronization, running
quadrupeds would fall face first into the dust because their thorax
would be only a floppy air-filled bag that could not absorb the shock
of forelimb strikes. Human bipedal runners, free of these mechanical
constraints on the thorax, employ a wide variety of phase-locked
patterns (4:1, 3:1, 2:1 [most common], 1:1, 5:2, and 3:2), evidence of
a more plastic coupling between respiratory rhythm and gait. The
relative emancipation of breathing from locomotion permitted by
bipedality was necessary for the subsequent selection for the
virtuosic acts of vocalization we know as speech.

The contribution of bipedality to speech evolution has been neglected
because linguists typically focus on higher-order cognitive and
neurobehavioral events that occur from the neck up and overlook the
neuromuscular processes that produce the modified respiratory
movements known as speech.

ROBERT R. PROVINE is Professor of Psychology and Neuroscience at the
University of Maryland Baltimore County where he studies the
development and evolution of the nervous system. The walkie-talkie
theory is presented in his forthcoming book Laughter: A Scientific
Investigation.

James Bailey 
The Spread of Universal Visual Literacy

Beneath Keith Devlin's "Death of the Paragraph" lies a deeper and even
less reported story: the spread of Universal Visual Literacy. Visual
Literacy is the ability not just to understand knowledge in visual
form but also to create it. Future generations of scientists (and
poets) are growing up with Photoshop in their fingertips. To them, a
conjunction is a video fade or wipe as much as a but or a yet. A
modifier is a texture on a 3D model as much as an adverb. With the
(under-reported) close of the Gutenberg Era, young scholars are no
longer constrained to old textual modes of communication. With the aid
of new electronic tools for expressing knowledge visually, they will
go back and forth with a facility unknown since Leonardo.

The reason this matters hugely is that visual modes of knowing can
accurately apprehend and communicate realities that are parallel,
whereas paragraphs and equations force us to pretend, with Descartes,
that life really happens in the single-step-at-a-time sequences that
the printing press demands. In the Gutenberg Era, the master
scientific concept was the equality of two strings of symbols on a
printed page. For young scientists growing up today, the master
scientific concept is the all-at-once docking of one molecular shape
onto the binding site of another on a computer screen.

Perhaps the most egregious example of using the old sequential
concepts of the Gutenberg Era to try to express parallel reality is
our current enthusiasm for the lame assertion that life is speeding
up. Here is a candidate for the most over-reported story of our time.
As a culture we are stringing together whole bookloads of paragraphs
trying to apply the centuries-old sequential concept of speed to
whatever is going on right now, because that is the best that text
seems to be able to do. Count on today's fourth-grader, with her iBook
and her Chime plug-in in her back pack, to do a whole lot better some
day. But by the time she finishes high school, she will still be able
to understand our old ways, because, along with her daily biology and
art classes, she will humor her physicist parents and take two days a
week of Algebra Appreciation.

JAMES BAILEY is an independent scholar focusing on the impact of
electronic computing on the overall history of ideas. He is the author
of After Thought: The Computer Challenge to Human Intelligence.

Henry Warwick 
My friend, John Brockman, asked me, "What is the most important
unreported story of the year?"

Deprived of sleep and somewhat dulled by holiday festivities, I had no
reaction except a mumbled "Damned if I know..."

I immediately set out to understand this question (which soon
dominated My Every Waking Hour of my Holiday Vacation), and in order
to get my mind around the question and all it implied, I would need to
do some research beyond the most propitious mixture of rum and eggnog,
and how to cook a turkey dinner for nine...

I basically asked almost everyone I met, making a (typically) cheerful
nuisance of myself. The results were most interesting, and I quickly
found that the results of my research, like many of the previous
responses, were also conditioned by the world views that came with
the career choices and life objectives of the people I asked.

Most of the people I asked, deprived of sleep and dulled by the holiday
festivities, shrugged and said "Damned if I know..." I found such
informal results less than satisfactory. Over more "eggnog" I came to
the conclusion I should ask people who might actually know. The next
day, I talked to people in the news trade, figuring, if they publish
the stories that do get reported, they would certainly know what
doesn't get reported.

I contacted a number of people in this regard, among them: an editor
of a major San Francisco weekly newspaper, a writer for a major San
Francisco daily newspaper, a photo editor for another daily, and a
writer for a weekly newspaper and the internet, located near Seattle.
Four people from completely different backgrounds -- and they all said
(basically) the same thing -- "How can it be that there is incredible
poverty amidst incomparable wealth, so often resulting in
homelessness?"

This threw me for a loop, because I didn't anticipate unanimity from
such a diverse lot. What also struck me was how I felt they were
wrong. While I do think poverty does deserve a greater examination,
and is certainly an important issue, I don't feel like it is
particularly "unreported" much less unknown. Anyone who lives in an
urban center in America (and many other countries for that matter)
knows about the reality that is poverty and homelessness.

I also felt I had to discount their answer, to a certain degree. For
one thing, they spend much of their time reporting on the
headline-eating news -- the acts, both dastardly and venal, of
society's misfits, madmen, and squalid criminals, both elected and
otherwise. These people are journalists, and journalists, especially
American journalists, have a tradition -- bordering on an archetype --
of being the
voice for the voiceless, the muckraker, the fourth estate, the ever
critical conscience of a secular society. This would make their odd
unanimity explainable, and, to a degree, underscore the value and
gravity of their choice.

But -- John didn't ask them, he asked me. And what do I know? Enough
to make me a worthy opponent at most trivia games. Enough that I'm not
homeless. Yet.

Are the growing ranks of the homeless and poor amidst our ever
deepening sense of prosperity and wealth the answer to Brockman's
question?

Or is it something broader and deeper? Ever since there have been
small privileged classes of the rich and/or powerful, there have been
the endless ranks of peasants and proles, microserfs and
burgerflippers, all of them struggling to feed their children, and
then there have been those who look up to peasants and proles,
microserfs and burgerflippers: the misfits, the madmen, and the
squalid criminals both elected and otherwise. Perhaps that's an
important untold story -- the grand parade of the society's faceless
"losers", the peasants and refugees fleeing some obscene tyrant and
his witless army of cannon fodder dupes and cruel henchmen, and why on
earth do they all buy into this fiction we call "Civilisation"?

Or, is it less of a fiction as one might imagine, and simply the
natural product of a status conscious primate whose every activity is
amplified and processed by its symbolic language center? Does reason
in human relations only extend as far as the highly codified and
ritualistic systems of voting and criminal justice? Can the
objectivity of science ever be used to develop social and economic
systems that will eliminate injustice and poverty? Or, I wondered, is
such a quest based in an outmoded socio-political messianistic
teleology? Are we fated to forever step over the prone bodies of those
less fortunate or healthy? If the answer to "When will the horror ever
end?" is "Never", then the big unreported story of the year is the
true loss of "Utopia" and the evisceration of the humanist's hope by
the knives of history and a scientifically informed realism.

Should we then also apply the logical conclusions of the Copernican
revolution to our own human existence? With a decentered earth, sun,
galaxy, and now, if some theories are correct, a decentered Universe,
it is now logical that we should apply the lens of decentering to
ourselves, our civilisations and cultures, and to our actions both
collective and personal. Perhaps that's the most important unreported
story of the year -- we're really not "The Story" any more, and what
we do is likely of little, if any, consequence. Are we, as persons and
a species, merely bit players in a peculiar performance? Improvising
before an empty house, and all of our preening culture and posturing
civilised rhetoric but a vain and oddly comical conceit? On this tiny
planet of water, trees, and concrete -- are we small participants in a
giant multiverse that is actually less moving material incidentals
expressing an equation of variables and constants, and more of a
growing, blooming, beautiful, if very slowly dying, flower?

A Flower?

HENRY WARWICK is an artist, composer, and scientist whose formal
education consists of a BFA from Rutgers University in visual systems
studies, a major of his own invention. He lives in San Francisco, and
his works can be appreciated at www.kether.com.
LINKS: Henry Warwick's Website: kether.com

Julian Barbour 
a) The Extraordinary Proliferation of Jobs and Careers; b) James P.
Carse's Jewel of a Book, Finite and Infinite Games

The very notion of the "most important unreported story" makes it
inevitable that any answer will be highly subjective. For can anyone
honestly say they know for sure what is the most important thing in
life? One might say that the very essence of life is growth and
uncertainty about the direction in which it happens. I feel forced to
look for striking unreported stories whose ultimate importance is
inevitably unknown. Then I find that no story is completely
unreported, only underreported compared with the standards I would
set.

In this line, I find that the existence of serious (though not
conclusive) scientific evidence for the complete nonexistence of time
has been strikingly underreported. But having just published a book on
the subject (and also undergone Edge edition No. 60 on this theme), I
do not feel like returning to it. I offer two substitutes.

Though certainly reported, there is a feature of modern life that I
feel should be given more prominence. It is the extraordinary
proliferation of jobs and careers that are now open to people, both
young and old, compared with my childhood over 50 years ago. I can
gauge this especially well because of a piece of research that is
worth mentioning on Edge. In the midst of the Second World War, but
when it was already clear that the Allies would win, the British war
cabinet decided that it should start thinking about how life should be
improved for both the urban and the rural population. Task forces were
set to work, researching the existing state. As it happened, the
village 20 miles north of Oxford where I grew up and still live (South
Newington) was chosen as the centre of an oblong region (measuring 4
by 6 miles on the Ordnance Survey map of Britain) that was to be
studied in detail as typical of rural Britain. Every village in it was
surveyed, and almost every person living in this region questioned
about their lifestyle and occupations.

The outcome was a book and a film called Twentyfour Square Miles, made
just after the war, when I was a boy of eight. Every few years, this
film is shown in the village. What is most striking is the subsequent
incredible mushrooming of career opportunities (and the large degree
of equalization of prospects between the sexes). Back in 1945, boys
could look forward to jobs in agriculture (quite a lot), as motor
mechanics (perhaps 20 such jobs in the entire region), working in the
aluminum factory in the nearby town of Banbury, and not much else. For
girls the options, apart from motherhood, were essentially limited to
domestic service, working as salespersons, and secretarial or other
clerking jobs. Probably only about one child in 20 would go on to
higher education. The transformation in 55 years has been amazing --
and it seems to me to be accelerating. In fact, any reasonably bright
young (or even relatively old) person in the region can now choose to
follow more or less any career.

There is some concern today that we are all becoming more and more
stereotyped. It seems to me that the mere fact that we now engage in
such a huge variety of occupations should largely offset any such
danger of stereotyping. Now to my second offering.

A highly original book not widely known must be an important
unreported story. (The President of Edge can hardly disagree!) It
turns out that the book I have in mind is actually quite widely known
(it has about half a million sales worldwide, as I learned recently on
the phone from its author) but seems to be almost completely unknown
in the UK. I am a sufficiently chauvinistic Brit to think that the UK
reading world is important, so here goes.

I learned about the book from Stewart Brand's The Clock of the Long
Now: It was a professor of religion, James P. Carse of New York
University, who came up with the idea of "the infinite game". His 1986
jewel of a book, Finite and Infinite Games, begins, "A finite game is
played for the purpose of winning, an infinite game for the purpose of
continuing the game."

This struck me as an extremely novel idea. I immediately ordered the
book and was not disappointed. A slim volume written in the addictive
aphoristic manner of Leibniz's Monadology and Wittgenstein's
Tractatus, with both of which it most certainly can be compared, I
think it is perhaps the most original perspective on life and the
world I have encountered. In fact, I was so delighted that I logged
onto amazon.co.uk and ordered 50 copies to send to friends as
Christmas and New Year presents (along with a copy of this entry to
serve as explanation for the said slim volume).

My order had a gratifying effect on the Amazon sales ranking of Finite
and Infinite Games. Before my order it stood at 25499. Two hours later
it was 4287. I mention this in the hope that some high Amazon
executive reads Edge and will ponder a somewhat less erratic way to
measure the rankings, which fluctuate so wildly as to be almost
useless. I think they need a Longer Now.

I am writing this on 31st December 1999 while listening to BBC Radio
3's monumental day-long 2000-year history of music called "The
Unfinished Symphony". The subtitle of Carse's book seems especially
apt in this connection --" A Vision of Life As Play and Possibility".
Perhaps the idea of life as an everlasting game or an ongoing symphony
is the most important unreported story.

On the subject of slim volumes, the British photographer David Parker,
who specializes in science studies and monumental landscapes with a
trace of the human element (Utah mainly), recently told me the
ultimate traveling-light story. Years ago he was trekking in Bolivia
with a friend who takes traveling light to the extreme. He therefore
allowed himself just one book, the Tractatus. To lighten even that
slim load, he would read one page each evening, tear it out and then
deposit it carefully wherever his bed happened to be: beneath a stone
if in the open or in the bedside table to rival the Gideon Bible if in
a hotel. One can hardly call the Tractatus an unreported story, but
the method of dissemination is worthy of Professor Emeritus James P.
Carse.

JULIAN BARBOUR , a theoretical physicist, has specialized in the study
of time and inertia. He is the author of Absolute or Relative
Motion? and The End of Time.

Further Reading on Edge: "Time In Quantum Cosmology"; Julian Barbour
comments in the Reality Club on "A Possible Solution For The Problem
Of Time In Quantum Cosmology" by Stuart Kauffman and Lee Smolin

LINK: Julian Barbour's Website

Verena Huber-Dyson 
The Classification Theorem for Finite Simple Groups

It has been a great century for mathematical groups of all shapes and
sizes! They have been part and parcel of our daily lives for the last
couple of millennia or more. Apart from ruling our bureaucracy, groups
of transformations and symmetries are the keepers of law and order
among mathematical structures, the sciences and the arts -- as well as
the sources of beauty. Every finite group admits a decomposition into
a finite sequence of simple ones similar to the prime factorization of
integers. But it has proven spectacularly difficult to obtain a
systematic grasp of the set of all finite simple groups.
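
[The decomposition mentioned here is what group theorists call a
composition series; the Jordan-Holder theorem guarantees that its
simple factors are unique up to reordering, just as a prime
factorization is. A small worked illustration (an editorial example,
not the author's) for the symmetric group S_4, of order 24:

```latex
% A composition series for S_4; each subgroup is normal in the one before it:
S_4 \,\triangleright\, A_4 \,\triangleright\, V_4 \,\triangleright\, \mathbb{Z}_2 \,\triangleright\, 1
% The successive quotients are the simple "factors," mirroring 24 = 2 \cdot 3 \cdot 2 \cdot 2:
S_4/A_4 \cong \mathbb{Z}_2, \quad
A_4/V_4 \cong \mathbb{Z}_3, \quad
V_4/\mathbb{Z}_2 \cong \mathbb{Z}_2, \quad
\mathbb{Z}_2/1 \cong \mathbb{Z}_2
```

Here A_4 is the alternating group and V_4 the Klein four-group; every
factor in this example happens to be cyclic of prime order, hence
simple.]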

The first step was a 254 page proof in 1961 of a key conjecture dating
back to the last century. Then a team of over 50 group theorists
started snowballing under the leadership of Daniel Gorenstein who, in
1980, announced the result of their labors, recorded in some 15,000
highly technical journal pages: every finite simple group belongs
either to one of two fairly well understood infinite families or to
one of 16 less tractable infinite lists or is one of the 26 so-called
sporadic groups -- a strange, elusive lot whose discovery was a hunt
full of hunches, surprises and insights, reminiscent of the chase for
elementary particles. It was an extraordinarily fruitful proof. Within
2 years of its completion a conference was dedicated to applications
in nearly all branches of mathematics. But beyond merely establishing
truth a good proof must illuminate and explain. At first ascent
everybody is just glad to have scaled the peak.

Then comes the search for a more elegant, easier, snappier route. By
1985 that second stage of the project was well under way. The
"enormous proof" has set a new trend in mathematics. It is a True Tale
of a Tower of Babel with a Happy Ending! All those mathematicians
toiling side by side, if spread all over the globe, each with his own
outlook, language, bag of tricks: constructing geometries, permuting
objects, calculating characters, centralizing, fusing, signalizing and
inventing all sorts of new terms for situations that had been lying
there waiting to be recognized, named and used. They were not treading
on each other's toes but collaborating in a prolific way unprecedented
in the history of mathematics!

This wonderful happening did provoke lively professional discussions
but not much attention from the popular media. It is difficult to
advertise, unintelligible without technical explanations and lacking
in historical romance. Much of the friendliness of the sporadic
monster is lost on an audience gaping at its size, incapable of
appreciating its capricious charms. And there are no melodramatic side
shows; group theorists -- especially finite ones -- make up just about
the sanest and nicest species among mathematicians. Glitter and
glamour are not engendered by the laborious toil that went into the
quest for a classification of the finite simple groups. Finally, the
"second generation" has not yet completed its task, a good reason for
holding back the popular fanfares.

But it is a great story in progress to be carried over into the new
century.

VERENA HUBER-DYSON received her Ph.D. in mathematics at the University
of Zuerich with a thesis in finite group theory, then, in 1948, came
as a postdoc to the Institute for Advanced Study in Princeton. After
a short but fruitful marriage she resumed her career, at UC Berkeley
in the early sixties, then at the University of Illinois at Chicago
Circle, and
finally retired from the University of Calgary. Her research papers on
the interface between Logic and Algebra concern decision problems in
group theory. Her monograph Goedel's theorem: a workbook on
formalization (Teubner Texte, Leipzig 1991) is an attempt at a
self-contained interdisciplinary introduction to logic and the
foundations of mathematics.

Joseph LeDoux 
Educational Inequities

As a nation we pay lip service to the idea that we're all created
equal. But the rich are getting richer and the poor are getting
poorer. Is this because the poor have bad brains that can't learn to
do better, or because their brains never get the opportunity to learn?
We know that even the best of brains needs input from the environment
to form and flourish. So why do we allow schools in poor
neighborhoods, as a rule, to be so much worse?

The difference is less about race than about class. Shouldn't
education be more standardized from neighborhood to neighborhood, city
to city, and state to state? Improvement of educational opportunities
wouldn't solve all the problems the poor face, but is an obvious place
to start. Critics of liberal social policy often claim that pouring
money into a situation doesn't help. I'm not suggesting that the poor
get anything extra, just that they get what others get. A decent
education is a right not a privilege.

JOSEPH LEDOUX is a Professor at the Center for Neural Science, New
York University. He has written the most comprehensive examination to
date of how systems in the brain work in response to emotions,
particularly fear. Among his fascinating findings is the role of the
amygdala structure within the brain. He is the author of the
recently published The Emotional Brain: The Mysterious Underpinnings
of Emotional Life, coauthor (with Michael Gazzaniga) of The Integrated
Mind, and editor with W. Hirst of Mind and Brain: Dialogues in
Cognitive Neuroscience.

Further Reading on Edge: "Parallel Memories: Putting Emotions Back
Into The Brain" -- A Talk With Joseph LeDoux on Edge

LINK: LeDoux Lab: Center for Neural Science Home Page

Todd Siler 
Applying our Lessons from the Nuclear Age to this Age of Molecular
Biology and Genetic Engineering

The general public is wondering about the deep connection between
these two Ages in which we've glimpsed the power of atoms and are
beginning to glean the power of genes.

Welcome to the WISDOMillennium! There's no better time to tap our
collective wisdom as we welcome new opportunities to work together
toward advancing science, technology and society. In order to
responsibly and thoughtfully use our scientific insights into the
nature of molecules and genes, we need to hold in our conscious mind
(and conscience) the key learnings from our more freewheeling
experimental work on the atom, which has led to the development of
increasingly sophisticated nuclear weapons. With some concerted
effort, perhaps we can avoid repeating the mistake of mindlessly
innovating, constructing and stockpiling weapons of mass destruction.
Clearly, our boldest insights into the basic building blocks of
biological matter can be abused in a similar manner, as we start to
scratch that big itch of curiosity -- designing, for example, evermore
exotic bioweapons with will-o'-the-wisp applications.

Anyone with a sense of wonder can't help but marvel at those
potentially mind-altering innovations in genetic engineering that keep
rolling out of the laboratories and into our lives. All this
excitement has a way of momentarily silencing our skepticism, as we
tend to overlook the impact these innovative works may be having on
the whole of human ecology. Somehow, we need to keep at the forefront
of our wonderment "the thinking eye," as the Symbolist painter Paul Klee
referred to that most essential element of creativity: higher
awareness. We need to look ahead -- cautiously and with full vision --
reviewing the times our eyes were "wide shut" to wanton exploits of
scientific explorations.

Given our penchant for reaching for the impossible -- and,
occasionally, realizing "the unthinkable" (those worst case scenarios
and experiences involving serious human error) -- perhaps the science
community would be wise to do some collaborative, projective thinking
and moral forecasting about the more questionable applications of
experimental research in both Molecular Biology and Genetic
Engineering; hopefully, this collaboration will take place before the
world community is forced to face yet another flagrant act of poor
judgment. Surely, there must be a way of supporting basic scientific
research while safeguarding ourselves from what cynics gleefully call
"the inevitable": you know the story (it's always the same story):
some megalomaniac spearheads a group of clowns, or clones -- or cloned
clowns -- who proudly announce their spectacular creation, a lethal
new life-form. "Here, let us demonstrate how this new viral strain
works...Oops! Gosh, we had no idea this creature would be quite so
devastating." As the story goes -- some millions of deaths and
apologies later -- another group of renegade researchers will try
their hand at reengineering virtually every cell in our bodies, as
they blindly attempt to morph the human spirit to fit an inhumane
world. I mean, just imagine what would happen if we pushed aside
bioethics and other governors of conscience, while envisioning what we
want the human race to become as it "grows up"? Will we end up looking
like amoebas, but thinking like gods? Or will we look like gods, but
think like amoebas? "Inquiring minds want to know."

There's an anxious public just waiting to be informed about the whole
field of Molecular Biology and the prospects of The Genome Project.
Instead of reporting on this field and Project in a straightforward
manner -- treating the story as if it were merely another development
in the history of ideas -- it needs to be seen laterally, against the
backdrop and fallout of the Nuclear Age. Looking below the surface and
politics of The Cold War that initially drove this Age, we need to
drill down to see what lies at the core of human creativity in the
service of its aggressive tendencies. Maybe this report could address
one of the most perplexing aspects of human creative potential --
namely, how highly intelligent people can take good ideas and turn
them into bad ones real fast, bearing really grave consequences. But
what's a "good idea" anyway? I think The Genome Project is a great
idea because of its enormous medical benefits. However, I fear the
more radical possibilities of its broad applications -- particularly,
those that sway toward expanding the work on advanced weapon systems.
I suspect we'll always be dealing with this dilemma of deciding what's
a fair use of scientific knowledge, and what isn't; what's beneficial,
and what isn't; what improves the human condition, and what doesn't.
In responding to the public's escalating fears about our nuclear
future, R. Buckminster Fuller once remarked: "There are five billion
people on this planet and no one seems to know what to do."

Have you heard those haunting words of Albert Schweitzer rumbling in
the distance? "Man has lost the capacity to foresee and to forestall.
He will end by destroying the earth," said Dr. Schweitzer. I gather
from his pensive perspective, we're no longer opening Pandora's Box;
we're living in it. The question remains: Do we always have to live in
this Box? If so, what's the best way to live in it and flourish?

In order to make our way with some peace of mind in this new Age of
engineered atoms, molecules and genes, we need to quickly learn from
the past. That means figuring out how we're going to initiate policies
in bioethics while encouraging greater social responsibility. Jacob
Bronowski touched on this issue in his classic muse, Science and Human
Values -- prompting us to always consider the dimension of values in
science that should never vanish from our view. We must take the time
now -- in this fresh moment -- to envision ways of ensuring a healthy
collective future. One step towards this end is to transfer our
learnings from the Nuclear Age to this new Age, and then transform
them. I trust we won't find this act of connection-making an exercise
in futility.

There's a dark line in Kurt Vonnegut's novel Hocus Pocus that may
bleed into this new millennium -- permanently staining it. In one
sweeping definition, Vonnegut sums up the whole of 20th-century
thought as "the complicated futility of ignorance." He proceeds to
define high art as "making the most of futility." I'm wondering
whether many scientists see how this edgy definition applies to high
science, as well. Is it so futile to summon our leading scientists and
technological visionaries to strategize about the next steps for
growing and applying our knowledge of molecules and genes? (This
gathering might work if everyone checked their ego at the door and
removed any Master of the Universe costume or attitude before "dancing
naked in the mind field" [to borrow Kary Mullis's lively expression].)

Finally, the following questions should be included in this unreported
story, adding to the bonfire of challenging world questions: Is there
one thing in particular that would help improve the state of the
world? What is it, and who's working on it?

TODD SILER, artist, author, inventor, is the Founder and Director of
Psi-Phi Communications, a company that provides creative catalysts and
communication tools for breakthroughs & innovation in Fortune 500
Companies and schools. He recently presented his latest book, Thinking
Like A Genius, at the World Economic Forum, in Davos. He is the author
of Breaking the Mind Barrier and co-author (with Patricia Ward
Biederman) of "Creativity and A Civil Society," a commissioned report
by the Institute for Civil Society.

David G. Myers
The Disconnect Between Wealth and Well-Being: It's Not the Economy,
Stupid

Does money buy happiness? Few of us would agree. But would a little
more money make us a little happier? Many of us smirk and nod. There
is, we believe, some connection between fiscal fitness and emotional
fulfillment. Most of us tell Gallup that, yes, we would like to be
rich. Three in four entering American collegians -- nearly double the
1970
proportion -- now consider it "very important" or "essential" that
they become "very well off financially." Money matters.

Think of it as today's American dream: life, liberty, and the purchase
of happiness. "Of course money buys happiness," writes Andrew Tobias.
Wouldn't anyone be happier with the indulgences promised by the
magazine sweepstakes: a 40 foot yacht, deluxe motor home, private
housekeeper? Anyone who has seen Lifestyles of the Rich and Famous
knows as much. "Whoever said money can't buy happiness isn't spending
it right," proclaimed a Lexus ad.

Well, are rich people happier? Researchers have found that in poor
countries, such as Bangladesh, being relatively well off does make for
greater well-being. We need food, rest, shelter, social contact.

But -- the underreported story in our materialistic age -- in
countries where nearly everyone can afford life's necessities,
increasing affluence matters surprisingly little. The correlation
between income and happiness is "surprisingly weak," observed
University of Michigan researcher Ronald Inglehart in one 16 nation
study of 170,000 people. Once comfortable, more money provides
diminishing returns. The second piece of pie, or the second $100,000,
never tastes as good as the first.

Even lottery winners, those whose income is much higher than 10 years
ago, and the very rich -- the Forbes 100 wealthiest Americans
surveyed by University of Illinois psychologist Ed Diener -- are only
slightly happier than the average American. Making it big brings
temporary joy. But in the long run wealth is like health: Its utter
absence can breed misery, but having it doesn't guarantee happiness.
Happiness seems less a matter of getting what we want than of wanting
what we have.

Has our collective happiness floated upward with the rising economic
tide? In 1957, when economist John Kenneth Galbraith was about to
describe the
United States as the Affluent Society, Americans' per person income,
expressed in today's dollars, was $8700. Today it is $20,000. Compared
to 1957, we are now "the doubly affluent society" -- with double what
money buys. We have twice as many cars per person. We eat out two and
a half times as often. Compared to the late 1950s when few Americans
had dishwashers, clothes dryers, or air conditioning, most do today.
So, believing that it's very important to be very well off, are we now
happier?

We are not. Since 1957, the number of Americans who say they are "very
happy" has declined from 35 to 32 percent. Meanwhile, the divorce rate
has doubled, the teen suicide rate has nearly tripled, the violent
crime rate has nearly quadrupled (even after the recent decline), and
depression has mushroomed. These facts of life explode a bombshell
underneath our society's materialism: Economic growth has provided no
boost to human morale. When it comes to psychological well being, it
is not the economy, stupid.

We know it, sort of. Princeton sociologist Robert Wuthnow reports that
89 percent of people say "our society is much too materialistic."
Other people are too materialistic, that is. For 84 percent also
wished they had more money, and 78 percent said it was "very or fairly
important" to have "a beautiful home, a new car and other nice
things."

But one has to wonder, what's the point? "Why," wondered the prophet
Isaiah, "do you spend your money for that which is not bread, and your
labor for that which does not satisfy?" What's the point of
accumulating stacks of unplayed CD's, closets full of seldom worn
clothes, garages with luxury cars -- all purchased in a vain quest for
an elusive joy? And what's the point of leaving significant inherited
wealth to one's heirs, as if it could buy them happiness, when that
wealth could do so much good in a hurting world?

DAVID G. MYERS is the John Dirk Werkman Professor of Psychology at
Hope College and, most recently, author of The American Paradox:
Spiritual Hunger in an Age of Plenty.

Further Reading on Edge: "What Questions are on Psychologists' Minds
Today?" David G. Myers. 
LINK: David G. Myers Home Page.

Stephen H. Schneider
The Way Stories About Complex Scientific Controversies Are Often
Unintentionally Misreported by the Mainstream Media.

And since policy making to deal with such controversies calls for
value judgments, and that in turn requires a scientifically literate
public to telegraph their value preferences to leaders,
miscommunication of the nature of scientific controversy has serious
implications for democracy in a world of exploding complexity.

In political reporting, it is both common and appropriate to "get the
other side": if the Democrat gives a speech, the Republican gets
comparable time/inches/prominence. This doctrine of "balance" -- which
is still taught proudly in journalism schools in the U.S. -- is
supposed to underlie the journalistic independence of the Fourth
Estate. [And let's not forget that conflict packaged in sound
bite-sized chunks garners higher ratings than more circumspect
coverage.]

But while journalists rightly defend the need for balance in truly
bipolar stories, how many scientific controversies are really
two-sided? More likely, there will be several competing paradigms and
half a dozen marginal ideas kicking around any scientific controversy.
And when the issues have high-stakes political winners and losers --
like the global warming topic I work in -- it is to be expected that
various special interests will compete for their spin. We've all seen
media filled with the views of environmental catastrophists,
technological cornucopians, ideological opponents of collective
controls on entrepreneurial activities, or denial from industrial
producers of products that pollute -- to name the usual prime players.
And each often has their hired or favored PhDs handy with ready
explanations and slick sound bites -- e.g., why carbon dioxide buildup
in the air will be either catastrophic or good for you.

Unfortunately, here is where a serious -- and largely unreported by
the very people who bring us this daily show -- disjuncture occurs.
For example, in the name of "balance", a 200-scientist,
two-years-in-the-making refereed scientific assessment gets comparable
space or airtime to a handful of "contrarian" scientists saying it
"ain't so". When I challenge this equal time reporting to my media
colleagues, they accuse me of being against "balance". This parade of
dueling scientists isn't remotely "balance" I respond, but rather,
utter distortion -- unless the journalist also reports the relative
credibility of each quoted position. I call the latter doctrine
"perspective" -- as opposed to what journalists label "balance."

In science all opinions are decidedly not equal, and we spend the bulk
of our effort winnowing the less from the more probable lines of
inquiry. Moreover, when we are assessors, we are obligated to report
whether our estimates of the likelihood of some set of hypothesized
outcomes are based on objective rather than subjective odds. I don't
have space to get into the "frequentist" versus "Bayesian" debate over
what is ever "objective", but awareness of the issue is also part of
what scientific literacy entails -- even for scientists.

Nevertheless, I do agree it would be irresponsible not to cover
minority opinions in media accounts of complex controversies. My
concern comes when contradictory scientific opinions are offered
without any attempt to report the relative credibility of these views.
Then, the public -- and political leaders too for the most part -- are
left to do that difficult assessment job themselves. More often than
not the "dueling scientists" get equal time in the story, confusion
sets in and outlier opinions win equal status at the bar of public
opinion with more widely accepted views. Of course, as Kuhn has taught
us, once in a while someone comes along to overthrow the mainstream
doctrine -- but we celebrate these paradigm busters primarily because
they are rare, not commonplace. One well-known editor argued with me
that to report scientific credibility "calls for a judgement on the
part of the journalist, and that most reporters lack specialized
qualifications to do that." "Without making judgments, how can they
choose what to report and whom to quote?" I responded. "Why don't you
get someone from the Flat Earth Society to 'balance' every space shot
you cover -- isn't that a 'judgment' about their lack of credibility?"
Of course, they could hire such specialists, but only a few major
media outlets do -- and those reporters are decidedly not at the top
of the respect hierarchy in corporate media.

Science must always examine and test dissent, even if it takes a long
time to reduce some uncertainties. But science policy needs to know
where the mainstream is at the moment. My mantra to those seeking
scientific literacy in order to address the implications of the debate
is to remember to ask all competing claimants of scientific "truth"
three questions: 1), "What can happen?", 2), "What are the odds?", and
3) "How do you know?" And if you intend to ask the third question,
plan to have a pen and paper along and be willing to check references,
for question 3) isn't a sound bite-length inquiry.

In summary, most stories turn the doctrine of balance on its head by
applying it too literally to complex, multi-faceted scientific
debates. Then, the unreported story becomes that there actually are
different probabilities that belong to each of the various positions
covered, yet these conflicting positions appear in the story to be
equally likely.

STEPHEN H. SCHNEIDER is Professor in the Biological Sciences
Department at Stanford University and the Former Department Director
and Head of Advanced Study Project at the National Center for
Atmospheric Research Boulder. He is internationally recognized as one
of the world's leading experts in atmospheric research and its
implications for environment and society. Dr. Schneider's books
include The Genesis Strategy: Climate Change and Global Survival; The
Coevolution Of Climate and Life and Global Warming: Are We Entering
The Greenhouse Century?; and Laboratory Earth.

Eduard Punset 
The End of the Brain

It is the best candidate for the major unreported event of the next
century. Only people in need really do need a brain. Plants don't, and
get along pretty well without one: photosynthesis alone largely
fulfils all their requirements. Actually, now that we know that we
share, for
better or worse similar DNA, -- the instruction booklet that designs
living organisms robust enough to ensure survival, but flexible enough
to adapt to changing environments --, the missing brain is the only
difference between plants and us. And as we'll quickly learn during
the next century, it is nothing to be very proud of.

Although it seems harder to define the differences within the brainy
species themselves, including primates and other animals, it is,
however, rather surprising to find that, in the history of human
thought, there has hardly been a single intellectual who has not
condescended to share the collective and unending appraisal of the
substantial differences between men and the rest. In fact, this debate
has bored quite a few generations of learned hominids. Fortunately, it
is about to end thanks to, above all other scientists, Lynn Margulis.
Let me explain why.

It has taken quite some time and arguing to show that most animals do
indeed communicate and master reasonably evolved languages to that
end. There is nothing terribly creative about the capacity to learn a
language; as Steven Pinker pointed out, it is genetic, and could not be
more damn simple, since it is digital. Men can do it; other mammals
too. The only surprising thing about it is the sheer impotence of
current scientific thought to unveil the basics of animal culture.

Despite the fact that we share the same genes, tool-making also helped
to substantiate the differences between hominids and chimps. Of
course, a few of those genes are different, but we still don't know
which of them actually makes the difference. The tool making
singularity, however, has not outlived the language exclusiveness.
Like other people moved by curiosity, I have enjoyed looking at
zoologist Sabater's collection of chimps' sandals, hats, seed catchers
and sticks for all sorts of widely different uses, such as beating, or
digging in the sand in search of fresh water during the dry season,
instead of trying to drink from muddy soils.

The identification of consciousness -- since scientists assumed twenty
years ago that the scientific method could be extended to these
domains, up to then left to superstition -- looked like the final
argument. "We're conscious of ourselves. We know who we are. And they
don't." It was the most serious argument ever put forward in our
defense. It did not matter that chimps could also recognize themselves
in a mirror; somehow they would not show the precise awareness, nor
the same cognitive capacity to ruminate about oneself. Unfortunately,
biologists like Lynn Margulis showed that bacteria -- as far back as
two billion years ago -- could not manage their electric-like motors,
nor their magnetic navigation systems, without some realization of
what on earth they were building those ultramodern transport systems
for. You just can't pretend any longer that bacteria are not conscious
too.

For those still interested in the old debate about the differences
between the brainy species, let me remind you that the most avant
garde argument now runs something like this: only the descendants of
the Australopithecus have developed the capacity to generate symbols.
Nobody can demonstrate when or how it happened; I myself am convinced
that the whole thing started six thousand years ago when people
settled down to work the land, and women had to leave their babies
unattended, shouting all day long.

But total allegiance to symbols like the San Francisco 49ers, the
Serbian motherland, or the Manchester United colors is undeniably
human. No chimpanzee would risk his life for these or similar symbols,
nor, for that matter, leave his newborn unattended. Chimp mothers love
to carry them. There at last is something which
makes us really different from other animals. The capacity to generate
symbols and to blindly follow them, has indeed taken Homo Sapiens a
long way off from the brain's original purpose: to go in the right
direction, and to anticipate a few questions. A very lucid New York
physiologist, attending a Neuroscience Congress last December at the
birthplace of Ramon y Cajal, actually told me he knows of a particular
species that ends up eating its own brain once it settles in the right
place and knows the basic answers.

Could it not be that the brain has taken over a bunch of simple people
who were only in need of a few addresses and of guessing what on earth
was going to happen tomorrow? The World Health Organization is
predicting that life expectancy will reach one hundred and twenty five
years very shortly. Neuroscientists should start worrying about the
outcome of forty additional years with jammed brains immersed in the
process of deepening their symbolic capacity, leaving at long last an
unbridgeable and recognizable gap with plants and animals.

Yet despite this distinctive capacity to generate symbols, some 25% of
the population -- excluding criminals -- have serious brain
dysfunctions, and most medical observers already agree that brain
disorders will be the most serious health threat in the twenty-first
century. The lucky 75% who will not be insane already know, according
to the latest statistics, that more patients die as a result of
practitioners' brains guessing wrongly about the nature and treatment
of real or invented illness than people succumb on the roads and to
heart failure combined.

Thankfully, building a collective brain through the Internet should
alleviate the stress of saturated individual brains, and help manage
the lives of the great majority of people who have already been
overcome by too many choices regarding the path to follow and the
answers to non-formulated questions, even under current life
expectancy models. I'm afraid that quite a few of them will, however,
regret the placid and constructive life of brainless plants.

EDUARD PUNSET holds a Master of Science in Economics from the LSE and
is Professor of Economic Policy at the Chemical Institute of Ramon
Llull University in Barcelona. He was Chairman of the Bull
Technological Institute, Professor of Innovation and Technology at
Madrid University, and IMF Representative in the Caribbean. He
actively participated in the Spanish political transition to democracy
as Minister for Relations with the European Union, and Member of the
European Parliament. He is currently Director and Producer of
Networks, a weekly science programme on Spanish public television. His
latest book is A Field Guide to Survive in the XXIst Century.

Mehmet C. Oz, M.D.
The True Nature of Much of Human Illness Eludes Mankind.

We routinely tolerate concepts that challenge our perceived reality.
Electrons are allowed to spiral through the air transmitting
information and sprouting quarks, yet we insist on a very concrete,
biomedical understanding of our body.

The body is sacred, and our provincial understanding of its workings
is imbued with cultural biases that are bred from birth and dominated
by guttural rather than cerebral influences. The resulting theology of
medicine insists that all citizens having heart attacks must have
chest pain, or smoke, or have type A personalities, or have high
cholesterol, or be obese. The observation that half the victims of
mankind's largest killer do not fit this profile eludes the public.
And how do these risk factors explain why heart attacks are most
likely to occur on Monday mornings? We make the error of assuming that
medicine, which describes tendencies rather than certainty, is a
mathematical field rather than a statistical field. Even the act of
observing disease in the form of a patient-physician relationship can
alter the natural history of the illness, the medical equivalent of
the Heisenberg uncertainty principle in physics. This may explain the
well-known placebo effect and its dark relative, the nocebo effect.

Religious and scientific explanations for disease crisscross
frequently in the brain. Consider the eyesight you use to read a
sentence describing your grandmother's love for you. We can describe
how you see the words on the page, and even how you know what the
words mean and how they fit together. We can understand the memory
processes that allow you to recall your grandmother's face, but how do
you know that the image you are seeing is your grandmother? Do you
have an individual brain cell reserved for each object you ever see,
with one reserved for a young grandmother and another for a more
mature memory? At some point we push our scientific understanding into
the abyss of art and become surrounded by the seeming darkness of
theology. What makes this experience particularly frightening is the
associated realization that we, as individuals, are so unique that
cookie cutter answers offered to humanity on losing weight or avoiding
heart disease are unlikely to work. We each have to shoulder our own
burden in seeking these answers on our journey through life.

MEHMET C. OZ, M.D., is Irving Assistant Professor, Department of
Surgery, Columbia-Presbyterian, and a leading cardiac surgeon who has
pioneered both high-technology approaches (he was involved in the
invention of the cardiac assist device) and the use of complementary
medicine in surgery. He is author of Healing from the Heart.

David Lykken 
The Reduction Since 1993 In American Crime Is An Illusion

The much-touted reduction since 1993 in American crime is an illusion.
The U.S. rate of violent crime today is still nearly four times what
it was in 1960. The recent dip in crime is the predictable result of
our segregating in our prisons more than six times the number who were
inmates as recently as 1975. A few of these inmates are psychopaths,
persons whose genetic temperaments made them too difficult for the
average parents to successfully socialize. A few others are mentally
ill or retarded or sheer victims of circumstance. But most are
sociopaths, persons broadly normal in genetic endowment who matured
unsocialized due to parental mis-, mal-, or non-feasance. As with our
language talent, humans evolved an ability to acquire a conscience, to
feel empathy and altruism, to accept the responsibilities of a member
of the social group. But, like the language talent, this proclivity
for socialization must be elicited, shaped, and reinforced during
childhood. The epidemic of crime that began in the 1960s is due
largely to the fact that, of males aged 15 to 24, the group
responsible for at least half our violent crime, the proportion who
were reared without fathers is now four times what it was in 1960.

More than two-thirds of -- abused children, juvenile delinquents,
school dropouts, pregnant teen-agers, homeless persons, adult
criminals -- were reared without the participation of their biological
fathers. Calculated separately for white and black youngsters, it can
be shown that a fatherless boy is seven times more likely to be
incarcerated as a delinquent than a boy raised by both biological
parents. Judith Rich Harris argues that parents are fungible, that
children are shaped mainly by their genes and their peers. I think she
is 80% correct but I think that there are a few super-parents who
effectively nurture and cultivate their children (and largely
determine their choice of peers). And I am certain that the bottom 10%
of parents are truly malignant -- immature, or overburdened, or
indifferent, or sociopathic themselves -- so that their children are
almost certain to be robbed of their rights to life, liberty, and the
pursuit of happiness.

Suppose we were to require that those who wish to have -- and keep --
a baby be mature, married, self-supporting, and never convicted of a
crime of violence. If the parents of the 1.3 million Americans
currently in prison had met such simple licensure requirements, I
believe that at least a million of those inmates would instead be
tax-paying citizens and neighbors. Interfering with parental rights,
even as modestly as this, is rather frightening because the instinct
to procreate is as strong in us as it is in all the birds and beasts.
But Homo sapiens should be able to agree that the rights of the
children outweigh those of parents who are unable or unwilling to grow
up, get married, and get a job.

DAVID LYKKEN is a behavioral geneticist at the University of Minnesota
who recently published the results of a study of 1500 pairs of twins
in the May issue of Psychological Science. He is the proponent of a
set-point theory of happiness, the idea that one's sense of well-being
is half determined by genetics and half determined by circumstances.
His research illustrates that a person's baseline levels of
cheerfulness, contentment, and psychological satisfaction are largely
a matter of heredity. He is the author of Happiness: What Studies on
Twins Show Us about Nature, Nurture, and the Happiness Set Point.

Further Reading on Edge: "How Can Educated People Continue to be
Radical Environmentalists?" A Talk by David Lykken.

Stuart Hameroff
The Imminent Paradigm Shift In Understanding The Conscious Mind

Today's most important unreported story is an imminent paradigm shift
in understanding consciousness. Quantum computation will soon replace
our familiar classical computation as primary metaphor for the
brain/mind. The purported brain=mind=computer analogy promising
robot/computer superiority and human/machine hybridization from
near-future classical computers is a myth promulgated by the
"silicon-industrial complex. "

Quantum computation was proposed in the 1980s by Feynman, Benioff,
Deutsch and others to take advantage of the mysterious but well
documented quantum phenomena of 1) superposition (particles existing
in multiple states or locations simultaneously) and 2) entanglement
(instantaneous, non-local communication among quantum states). Whereas
classical computers represent information digitally as "bits" of
either 1 OR 0, quantum computation utilizes "qubits" in which
information exists in quantum superposition of both 1 AND 0. While in
superposition, multiple entangled qubits may interact nonlocally,
resulting in massively parallel computation.

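The bit-versus-qubit contrast Hameroff describes can be sketched with a toy classical state-vector simulation. This is only an illustration of the amplitude idea under simple assumptions (equal superposition, no entangling gates); it says nothing about the essay's biological claims, and the function names are my own:

```python
import itertools
import math

# A classical n-bit register holds ONE of 2**n values at a time; an
# n-qubit register carries a complex amplitude for EVERY one of the
# 2**n basis states simultaneously.

def uniform_superposition(n_qubits):
    """Equal superposition over all 2**n basis states (the state
    obtained by applying a Hadamard gate to each qubit)."""
    dim = 2 ** n_qubits
    amp = 1 / math.sqrt(dim)          # normalization: amplitudes' squares sum to 1
    return [complex(amp, 0)] * dim

def measurement_probabilities(state):
    """Born rule: the probability of observing a basis state is the
    squared magnitude of its amplitude."""
    return [abs(a) ** 2 for a in state]

state = uniform_superposition(3)      # 3 qubits -> 8 amplitudes at once
probs = measurement_probabilities(state)
labels = ["".join(bits) for bits in itertools.product("01", repeat=3)]
for label, p in zip(labels, probs):
    print(label, round(p, 3))         # each of the 8 outcomes is equally likely
```

Note the exponential cost that makes classical simulation infeasible at scale: the list grows as 2**n, which is exactly the resource a physical quantum computer is hoped to provide for free.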
In 1994 Peter Shor of Bell Labs proved that quantum computers (if they
can be built) could factor large numbers into their primes
(the key to modern cryptography, banking codes etc) with unprecedented
efficiency, rendering conventional systems obsolete. Shor's work
sparked major funding in the general area of quantum information
(quantum computation, quantum cryptography, quantum teleportation). An
apparent roadblock to quantum computation -- the problem of
decoherence by environmental interactions -- was potentially solved in
the mid-1990s by groups who developed quantum error correction codes
which can detect and repair decoherence before quantum computation is
disrupted.

In the past several years numerous quantum computational prototypes
have been developed, and various technologies for full-blown,
large-scale quantum computers are being explored. It seems almost
inevitable that quantum computation will have an enormous impact on
information technology.

The brain/mind has traditionally been compared to contemporary
vanguards of information processing (dating from the Greeks' "seal
ring in wax" as a metaphor for memory, to the telephone switching
circuit, to the hologram, to the modern day classical computer in
which consciousness "emerges" from complex computation among simple
neurons). As quantum computation comes to the forefront of technology,
human nature (and ego) will surely resist the notion that technology
bears superior intellect, and search for quantum computation in the
brain.

There are cogent reasons for believing that quantum computation does
indeed operate in the brain, and such suggestions have been made by
theorists including Sir John Eccles and Sir Roger Penrose. However
critics quickly point out that the warm, wet, noisy brain must be
inhospitable to delicate quantum effects which (in the case of
superconductors, Bose-Einstein condensates etc) seem to require
complete isolation and temperatures near absolute zero to prevent
decoherence.

On the other hand "quantum-mind" advocates suggest that biological
quantum coherence is metabolically "pumped", point to several lines of
evidence suggesting that biological evolution has solved the
decoherence problem, observe that only quantum computation can solve
the enigmatic features of consciousness, and propose testable
predictions of quantum-mind theories (by contrast, experimental
predictions regarding classical computational emergence of
consciousness have not been put forth). The implication, and potential
theme for the next century, is that we are not strictly emergent
products of higher order complexity, but creatures connected to the
basic level of the universe.

STUART HAMEROFF, M.D., is Professor of Anesthesiology and Psychology
and Associate Director of the Center for Consciousness Studies, The
University of Arizona, Tucson, Arizona.

LINK: Stuart Hameroff Home Page.

Ellis Rubinstein
The Erosion of Traditional "Divides"

I believe the world to be experiencing an unprecedented erosion of
traditional "divides". Yes, we can all point to examples of horrific
ideological conflict, but such tribalism surely seems anachronistic to
most of us. And that is because many of us have grown accustomed in
the latter decades of the 20th century to a kind of social
enlightenment that stems from urbanization, globalization, and the
sharing of common information disseminated by our extraordinary new
communication tools.

Now, it may seem obvious that nationalism and political and religious
ideologies are having an increasingly hard time remaining "pure" in
the face of increased face-to-face contact with those who see things
differently from us. Moreover, we cannot easily cling to our most
formative views when we increasingly find ourselves in conversation
via phone and e-mail with others who see the world differently from
us. And, finally, it must be ever more difficult to remain isolated in
our views of others when we are surrounded by images of them -- often
touching images -- on film and television.

Still, all this may have been discussed somewhat in various media.
What especially intrigues me is the apparent erosion among relatively
educated families of a different "cultural divide": the generational
divide. What are the drivers of this shift? And what are its effects?

In my necessarily limited experience, I have observed that parents and
children are increasingly "friends". It has been much noted that the
Baby Boom generation and their children share many of the same
interests in music. At formal events -- I think of my recent
experience in Sweden attending the Nobel festivities -- teenagers and
20-somethings happily mingled with their elders who, if they were
male, were dressed in cut-aways. Indeed, some of the young people were
wearing special costumes in order to play roles in this highly
traditional event. In my time, we would have seriously considered
committing suicide before putting on costumes provided by our elders,
then attending hours of events populated largely by our parents and
grandparents, and finally dancing to "retro" music in the closest
imaginable proximity to our parents and even grandparents.

I conclude from this and similar experiences that as this new
millennium begins, a sort of truce has taken place between
generations, with parents and children attempting to bridge divides
that, in my view, ought naturally to exist between them. If I am
correct about this, then surely there are major ramifications on our
culture...and I'm not at all sure they would only be for the good. I
worry, for example, that some needed element of rebelliousness is
being "bred out" of the system of growing up. I worry that this may
have an effect on creative thought. And I worry that the potential
lack of tension between generations might lead to a kind of stagnation
in the arts, humanities and sciences.

Am I alone in this concern? I personally haven't seen this topic
addressed in the all-too-limited spectrum of publications I can
personally scan. So maybe others have publicly shared this concern. If
not, however, I vote for this as one of the most important
underreported stories of our time.

ELLIS RUBINSTEIN is Editor of Science

LINKS: Science Magazine; Ellis Rubinstein's Science Page

Hans Ulrich Obrist
The Roads Not Taken

I see unrealized projects as the most important unreported stories. As
Bergson showed, realization is only one case of the possible. There
are many amazing unrealized projects out there, forgotten projects,
misunderstood projects, lost projects, realizable projects,
poetic-utopian dream constructs, unrealizable projects, partially
realized projects, censored projects and so on. The beginning of a new
millennium seems like a good moment to remember certain roads not
taken in an active and dynamic and not nostalgic way.

HANS ULRICH OBRIST has curated exhibitions at Musee d'Art Moderne de
la Ville de Paris, Kunsthalle Wien, Deichtorhallen, Hamburg, and
Serpentine Gallery in London, among other institutions. He currently
divides his time between France, Switzerland and Austria. After an
initial training in economics and politics, he switched to
contemporary art and has organized a variety of exhibitions in such
unlikely venues as his own house, a monastery library, an airplane and
a hotel.

Peter Cochrane 
Decoding the Human Genome is a Long Term Project

We are currently being fed a series of snippets detailing the
unraveling of the human blueprint -- our genome. If every isolated and
obscure report were strung end to end, and edited thoroughly into a
coherent whole, we might just have a comprehensible story -- and one
of the most important to be told. But let me jump to the end point and
make an educated guess! I suspect that when we decide that tweaking
the odd gene or two seems like a good idea to cure this or that
disease or shortcoming, or to get this or that eye color etc, we will
get a heck of a surprise. It seems to me that the likelihood that
individual genes are singularly responsible for any one trait is
pretty slim. So I am putting my money on a reasonably strong
interdependency, that is -- we will have to attend to complex
combinations of genes to achieve some desired effect. Realizing an
adequate level of understanding of the complexity of the genome code
book may well take considerably more effort than the initial mapping
of all the raw elements. Like Morse Code and the German Enigma
machines of W.W.II, we will almost certainly require some hefty
computing power to do the job. Now this really will be worth
reporting.

PETER COCHRANE, Chief Technologist BT, Collier Chair for The Public
Understanding of Science & Technology, @Bristol, and the author of
Tips for Time Travelers.

LINKS: Peter Cochrane's Home Page; C2G: Communications Consultants
Group, BT Adastral Park

Eberhard Zangger
A Lost Civilization

Beyköy is a hamlet sitting on the lonely edge of the world -- in the
highlands of Phrygia in western Turkey, hundreds of kilometers away
from either city or coast. In 1876, a local peasant made a remarkable
discovery in these forsaken backwoods -- one that ranks today as the
world's most important unreported story.

This farmer's pasture extended along the foot of a low but extensive
mound, believed to hold the remains of an ancient city. While working
his ox and plow along the base of the hill, the farmer struck some
objects in the furrow which gave off a metallic noise. After picking
up and scrutinizing the large pieces of metal which he had
accidentally unearthed, the peasant noticed they were covered with
inscriptions.

Several years later, the American scholar William M. Ramsay passed
through the village. Speaking with some local people, Ramsay inquired
if they possessed any coins and artifacts picked up off the ground,
with the intention of finding a bargain. Among the pieces offered to
him was a tiny fragment containing a short inscription. Wondering
aloud about its original location, the locals pointed towards the
knoll. After further inquiry, Ramsay found out about the series of
spectacular bronze tablets which had also been found there.
Unfortunately, they were gone and nobody was able to say where.

As it turned out, the Beyköy bronze tablets had made their way into
the archives of the Ottoman empire. Recognizing their possible
significance, the curator in charge did not hesitate to contact the
world's foremost authority, German-born Anatolia expert Albrecht
Goetze, in order to publish the finds. In the late 1950s, this Yale
University professor began to investigate the texts. For over
ten years, he examined, translated and interpreted them. Then, he
informed another colleague of their existence. Goetze completed his
investigations and manuscript shortly before he died in 1971. Owing to
the subsequent death of his editor, the monograph, however, has never
been published and the Beyköy tablets remain completely unknown.
Goetze had found that they contained lists of Anatolian states, kings,
and military actions from as early as the fourth millennium BC up
until the eighth century BC. These texts proved conclusively that
today's Turkey was once the home of a civilization older and more
important to European history than Pharaonic Egypt. Yet, this
civilization has remained virtually uninvestigated to the present
day.

Now, why should the discovery of the Beyköy tablets be considered the
world's most important unreported story? After all, archaeological
discoveries -- like that of the man in the ice -- have often surprised
the general public and upset established scholarship. The discovery of
the Beyköy tablets, however, is of a different order of magnitude. It
shows that the center of European history is to be found way outside
the frontiers of the hitherto Old -- and accepted -- World. Therefore,
its impact equals a scientific revolution similar to those caused by
Nicolaus Copernicus, Charles Darwin and Sigmund Freud. Copernicus'
work generated an upheaval, since it demonstrated that humankind is
not at the center of the universe. Darwin's research demonstrated that
humans do not stand at the ultimate zenith of Creation; instead they
are more or less an accidental product of an evolutionary process
millions of years long. And Freud showed that we are not even in full
command of our own mind, our inner selves.

The publication of the Beyköy tablets will take this sequence of
upheavals in western thought one step further. It will demonstrate --
once and for all -- that "western culture" is an arbitrary, abstract
concept resting upon the wishful thinking of certain eighteenth
century members of the educated class. Old world civilizations,
especially that of ancient Greece, evolved from frontier outposts of
much older Asian civilizations in the adjacent Orient. The practices
characterizing western civilization -- domestication of animals and
plants, agriculture, husbandry, permanent homes, life in village
communities, metallurgy, politics and cosmology -- all clearly
originated in Asia.

After this paradigm shift, only one even greater upheaval remains --
proving that intelligent life exists elsewhere in the universe.

EBERHARD ZANGGER is an exploder of myths and the discoverer of the
lost continent of Atlantis, which was never lost in the first place.
His five books on ancient civilizations have been published
internationally -- with more to come. Feeling a bit estranged from the
great thinkers of the planet, he is impatiently waiting for the fourth
culture to begin.

Anne Fausto-Sterling
The End of Gene Control

As the 20th century draws to a close, biologists triumphantly announce
the beginning of the end of the project to sequence the human genome.
Metaphoric hyperbole runs rampant as we speak of "reading the book of
life" and of "unraveling the essence of what it means to be human".
But less noticed is the fact that developmental biologists who study
the role of genes in development are busily dethroning the gene.

When I was a young embryologist I lectured about genes in development.
Following the dogma of the time, I told my students that there were
two groups of genes. First, there were housekeeping genes -- those
responsible for the mundane daily functions of the cell -- the
feminine duties of maintenance. These genes supposedly kept the
machinery running smoothly -- respiration and waste disposal went on
quietly and demurely. But the really important genes were the
development genes -- those masculine entities that pioneered new
territory and wrought new form from undifferentiated plasm. The goal
of any self-respecting developmental geneticist was to find those
special genes for development and thus unravel the mystery of how
genes control the formation of new organisms.

The successes have been many and profound. Developmental biologists
have uncovered myriad genes involved in embryo formation. They have
found an amazing continuity of genetic structure and function across
the phyla. We now understand in fabulous detail the function of many
genes in development. But something funny happened on the way to the
genetic forum. The distinction between housekeeping genes and
development genes has become increasingly hard to maintain. Some
development genes fall into the category of transcription regulators,
as might be expected for genes that control genetic expression. But
many turn out to be involved in cell communication and signaling. What
is more these genes don't control development. In a real sense
development controls the genes. The same genetic read-out can have a
vastly different outcome depending upon when during development and in
which cell the protein is produced. Indeed, most development genes
seem to act at multiple times during development and in many tissue
and cell types. The same gene can play a key role in quite a variety
of developmental events.

The important story is that the search for genes that control
development has shown us that our initial idea that genes control
processes within an organism is wrong. Instead genes are one set of
actors within a developmental system. The system itself contains all
of the pre-existing contents of the cell, organ or organism. These
include thousands of gene products, other chemicals such as ions,
lipids, carbohydrates and more, all organized and compartmentalized in
a highly-structured physical setting (the cell and its substructures,
the organ and its tissues, the organism and its organ systems). From
before the turn of the century embryologists debated whether the
cytoplasm controlled the nucleus or vice versa. What the last decade
of research on genes in development reveals is that both things are
simultaneously true -- the system and its history control development.
Genes are but one of many crucial components of the process.

ANNE FAUSTO-STERLING is Professor of Biology and Women's Studies at
Brown University. She is the author of Myths of Gender: Biological
Theories About Women and Men.

LINK: Anne Fausto-Sterling Home Page

Andy Clark
That the Human Mind is Less and Less In the Head

Human brains are making the human world smarter and smarter, so that
they (the brains) can be dumb in peace. Or rather, we are
progressively altering our environment so that our brains, which are
good at pattern-matching, at sensory recognition, and at the
manipulation of objects in the world, can support intelligent choice
and action by their participation in much larger networks. Human
development is a process in which the brain becomes deeply tuned to
the available environmental surround, be it pen, paper and sketchpad,
or PC, Palm Pilot and designer software. As the century closes, and
our typical, reliable environmental props and supports become ever
more sophisticated and interlinked, so the mental machinery that makes
us who we are is becoming ever more extended, interanimated and
networked. In the near future, software agents whose profiles have
evolved alongside those of a specific individual will count as part of
the individual person. To say that I use those software agents will be
strictly false. Instead, my brain and the various personalized
manifestations of new technology will constitute integrated thinking
things. I will no more be the user of these close-knit technologies
than I am the user of the various sub-systems already existing within
my biological brain. Instead, better to see both those sub-systems and
these close-knit external technologies, as together constituting a
spatially extended kind of user or person. The next step may be, as
Rodney Brooks suggests, to put as much of that technology back inside
the biological membrane as possible. This buys easier portability
without changing the real state of affairs. We are already (mental)
cyborgs.

ANDY CLARK is Professor of Philosophy and Director of the Philosophy /
Neuroscience / Psychology Program at Washington University in St
Louis. He has written extensively on Artificial Neural Networks,
Robotics and Artificial Life. His latest book is Being There: Putting
Brain, Body and World Together Again.

LINK: Andy Clark's Home Page

Keith Devlin 
The Death of the Paragraph

The most significant unreported story? The mathematician in me screams
that this is a paradox. The moment I write about it, it ceases to be
unreported. That aside, there are so many reporters chasing stories
today, it would be hard to point to something that has the status of
being a "story" but has not yet been reported. On the other hand, as
Noam Chomsky is fond of reminding us, there's a difference between
being reported somewhere, by somebody, and being covered by the major
news organizations.

I'll settle for a trend. I'm not sure if it will turn out to be a
story, but if it does it will be big. It's the death of the paragraph.

We may be moving toward a generation that is cognitively unable to
acquire information efficiently by reading a paragraph. They can read
words and sentences -- such as the bits of text you find on a
graphical display on a web page -- but they are not equipped to
assimilate structured information that requires a paragraph to get
across.

To give just one example, a recent study of 10,000 community college
students in California found that, in the 18-25-year age group, just
17% of the men could acquire information efficiently through reading
text. For the remaining 83%, the standard college textbook is little
more than dead weight to carry around in their bag! The figure for
women in the same age group is a bit higher: just under 35% can learn
well from textually presented information.

These figures contrast with those for students aged 35 or over: 27% of
males and over 42% of females find it natural to learn from reading.
Of course, that's still less than half the student population, so any
ideas we might fondly harbor of a highly literate older generation are
erroneous. But if the difference between the figures for the two
generations indicates the start of a steady decline in the ability to
read text of paragraph length, then a great deal of our scientific and
cultural heritage is likely to become highly marginalized.

Half a century after the dawn of the television age, and a decade into
the Internet, it's perhaps not surprising that the medium for
acquiring information that both age groups find most natural is visual
and nonverbal: pictures, video, illustrations, and diagrams. According to
the same college survey, among the 18-25 age-group, 48% of males and
36% of females favor this method of acquiring information. The figures
for the over-35s are almost identical: 46% and 39%.

If these figures reflect the start of a story that has not been
reported, then by the time somebody does write it, there may not be
many people around able to read it. The social, cultural, scientific,
and political consequences of that are likely to be major.

KEITH DEVLIN, mathematician, is a Senior Researcher at Stanford
University, and the Dean of Science at Saint Mary's College of
California. He is the author of Goodbye, Descartes: The End of Logic
and the Search for a New Cosmology of the Mind; Life by the Numbers;
and The Language of Mathematics: Making the Invisible Visible.

LINK: Keith Devlin Home Page

Dean Ornish 
The Globalization of Illness

Many developing countries are copying the Western way of living, and
they are now copying the Western way of dying.

Illnesses like coronary heart disease that used to be very rare in
countries such as Japan and other Asian countries are becoming
epidemics, causing huge drains on their economies as well as enormous
personal suffering -- much of which could be avoided. The same is true
for prostate cancer, breast cancer, colon cancer, diabetes,
hypertension, obesity, arthritis, and so on. Trillions of dollars in
direct and indirect expenditures could be avoided, along with untold
suffering.

DEAN ORNISH has 22 years' experience directing clinical research
demonstrating, for the first time, that comprehensive lifestyle
changes may reverse even severe coronary heart disease without drugs
or surgery. Founder, President, and Director, non-profit Preventive
Medicine Research Institute. Clinical Professor of Medicine,
University of California, San Francisco. Physician consultant to
President Clinton and several members of the U.S. Congress. He is the
author of five best-selling books, including New York Times'
bestsellers Dr. Dean Ornish's Program for Reversing Heart Disease; Eat
More, Weigh Less; and Love & Survival: The Scientific Basis for the
Healing Power of Intimacy.

LINK: Dean Ornish, M.D. -- "Healthy Living"

Bart Kosko

The most important unreported story at the dawn of the Information Age
has two parts: (1) The last Sunday of the 20th century passed and the
United States Government still continued to outlaw first-class mail on
Sunday and (2) no one complained.

BART KOSKO is Professor, Electrical Engineering Department, University
of Southern California, and author of Fuzzy Thinking; Nanotime; and
The Fuzzy Future: From Society and Science to Heaven in a Chip.

LINK: Bart Kosko Home Page

Terrence J. Sejnowski
Exercise Can Make you Smarter

A revolution recently occurred in neuroscience that has far reaching
implications for our future.

According to all the textbooks in neuroscience, we are born with a
full complement of around 100 billion neurons and it is all
downhill from there. This was a discouraging "fact".

Fred Gage, a colleague of mine at the Salk Institute, has discovered
that new neurons are born every day in the hippocampus, an important
part of the brain for long-term memory of facts and events, even in
adults. This was first found in rats, but has now been shown in
monkeys and humans, and not just in the hippocampus, but also in the
cerebral cortex, the storehouse of our experience and the fountainhead
of our creativity. This was widely reported, but what has emerged
since then is even more encouraging.

First, the new neurons in the hippocampus don't survive unless the
animal exercises; a running wheel in an otherwise standard lab cage is
enough to keep new neurons alive and well in mice. Second, the
increase in the strengths of connections between neurons in the
hippocampus that occurs when they are strongly activated, called
long-term potentiation, is twice as strong in mice that had access to
a running wheel. Finally, the mice with exercise were smarter at
memory tasks. We still do not know how all this happens, but the
bottom line is that something as basic as exercise can make you
smarter. Recess in schools and executive gyms help not only the body
but also sharpen the mind.

These results have implications for graceful aging.

Until recently, the dominant view of how the brain develops made the
assumption that experience selects neurons and connections from a
fixed, pre-existing repertoire. This view had some plausibility when
it was thought that all of the neurons you will ever have are present
at birth, coupled with evidence of neuron death and pruning of
connections during childhood. However, if the brain makes new neurons
in adults then this cannot be the whole story, since the growth of new
connections must also be occurring, and doing so in an
experience-dependent way.

This discovery, coupled with increasing evidence that new connections
can grow even between old neurons as a consequence of an enriched
environment, means that an active mind along with an active body
predisposes us for a lifetime of learning. This is good news for the
baby boomers who have embraced health clubs and challenging new
experiences, but bad news for exercise-phobic couch potatoes.

In short, "Use it or lose it".

An active lifestyle and a rich social life are the best insurance
against premature senility. We will learn much more about how the
brain renews itself in the next century as neuroscientists reveal more
about the mechanisms and circumstances that make new neurons survive
and grow stronger connections. Ultimately, this will lead to greater
productivity from the elderly, the fastest-growing segment in western
societies.

TERRENCE J. SEJNOWSKI, a pioneer in Computational Neurobiology, is
regarded by many as one of the world's foremost theoretical brain
scientists. In 1988, he moved from Johns Hopkins University to the
Salk Institute, where he is a Howard Hughes Medical Investigator and
the director of the Computational Neurobiology Laboratory. In addition
to co-authoring The Computational Brain, he has published over 250
scientific articles.

LINKS: Terrence Sejnowski Home Page; Computational Neurobiology Lab

Philip Brockman
How Great the Kids Are Today.

The young people I work with at NASA's Langley Research Center are
sharp and hard working. We get some of the best, but they are almost
all great. And what is rarely understood, never mind reported, is that
they have to process about twice the information that I had to deal
with starting out in research in 1959.

Of course, the biggest untrue story told, one probably found in the
first cave writings, is that "the younger generation is going to
hell."

PHILIP BROCKMAN, a physicist, has been at NASA LaRC (Langley Field,
Virginia) since 1959. His research includes: Shock tubes; Plasma
propulsion; Diode laser spectroscopy; Heterodyne remote sensing; Laser
research; Laser injection seeding; Remote sensing of atmospheric
species, winds, windshear and vortices. He is currently supporting all
solid state laser development for aircraft and spaceborne remote
sensing of species and winds and developing coherent lidars to measure
wake vortices in airport terminal areas.

He is a recipient of NASA's Exceptional Service Medal (ESM).

LINK: Philip Brockman's NASA Home Page

Daniel Goleman 
Hidden Consequences of Our Daily Choices as Consumers of Products and
Services

What is the biggest unreported story?

The hidden consequences for our health and the environment, and for
social and economic justice, of our daily choices as consumers of
products and services.

Our individual habits of consumption, when multiplied by our vast
numbers, have devastating impacts -- but we are blind to the chain
that links our individual choices with their vaster consequences. I'd
like to know what these links are -- but they lack transparency. We
need something akin to the labels of nutritional value on foods that
would surface these hidden consequences of our own actions.

A case in point: what is the environmental cost of choosing to buy a
hamburger? How many acres for cattle to graze, how much erosion or
degrading of land as a consequence, how much more greenhouse gases
added to the atmosphere, how much water used for this purpose rather
than other things? How does a burger made from beef compare in this
regard to, say, one made from turkey, or from soybeans?

Another case in point: since childhood, I've suffered from asthma.
Asthma is becoming epidemic among children, especially in urban
neighborhoods. One clear reason for the upsurge is the increase in
airborne particulates that irritate and inflame respiratory passages.
I live in the Connecticut River Valley of Massachusetts, which,
because of prevailing winds, receives a large portion of its
particulates from the pollution in the metro New York City area, as
well as from the industrial Midwest states. One coal-powered
electrical plant in Ohio, a notoriously bad offender, contributes
almost half the airborne particulates that reach my area from the
Midwest. Who are the largest industrial customers of that electrical
plant? What products that I'm now buying are built using power from
that plant? I might boycott them if I knew.

Individually, the consequences of my choices are admittedly
negligible. But summed across millions of consumers making
similarly informed choices, the impact could be quite great. We could
'vote' for more benign consequences if we had this missing
information.

I applaud, for instance, the mutual funds and other corporate citizens
who are offering shareholders the option to get their reports via the
internet, rather than wasting resources -- trees, power, etc. --
mailing thousands of hard copies. One fund informs me, for instance,
that if all members sign up for internet reports, the savings in pulp
amounts to more than three hundred trees per year.

I want more choices like that.

So how about a new investigative beat for journalism: hidden
consequences?

DANIEL GOLEMAN is the author of the international bestsellers,
Emotional Intelligence and Working with Emotional Intelligence. For
twelve years he covered the behavioral and brain sciences for the New
York Times, and has also taught at Harvard. His previous books include
Vital Lies, Simple Truths; The Meditative Mind; and, as coauthor, The
Creative Spirit. Dr. Goleman is CEO of Emotional Intelligence Services.

LINK: Emotional Intelligence Services (EI)

Marc D. Hauser
(1) The Health Crisis in Africa and (2) The Poor Level of Educational Distribution

Having lived for several years in East Africa, I am struck by two
observations which seem to me to have escaped sufficient reporting.
The first concerns the health crisis on the continent. Unlike anywhere
that I have ever lived, and especially over the past 5-10 years, I am
overwhelmed by the illnesses. When one walks among villages in Uganda,
or on the streets of cities such as Nairobi and Kampala, one sees
AIDS. There is no need to go to the local newspapers and read the
latest counts. It is right in front of one's eyes. Perhaps more
depressing than the AIDS crisis is the problem with malaria. Unlike
AIDS, for which we have only the weakest medicines, we certainly do
have the medical technology to eradicate most of the problems with
malaria. Cracking this problem will not, however, require medical
expertise alone. Rather, it will require a creative team of doctors,
anthropologists, sociologists, and economists working hand in hand,
trying to understand the local mores, the local economy, and the
struggles of daily life. It will also have to crack the problem of
medical distribution, still a problem in much of Africa.

A second problem comes from my experience teaching in Uganda. I had
the great pleasure, and honor of teaching students in Uganda. I have
never seen such a thirst to learn. And yet, much of the educational
system is lacking the basic supplies and materials that would transform
these students into highly educated scholars. Like the poor
distribution of medicinal supplies, there is an equally poor level of
educational distribution. The inequities here are dramatic. Those of
us who write for Edge should put our heads together to think of
some way to help, to lend our minds to theirs. Any takers?

MARC D. HAUSER is an evolutionary psychologist, and a professor at
Harvard University where he is a fellow of the Mind, Brain, and
Behavior Program. He is a professor in the departments of Anthropology
and Psychology, as well as the Program in Neurosciences. He is the
author of The Evolution of Communication.

LINKS: Primate Cognitive Neuroscience Laboratory, Marc Hauser Home Page

Kevin Kelly
The Population Fizzle

While the absolute numbers of humans on Earth will continue to rise
for another generation at least, birth rates around the world are
sinking, particularly in the developed countries. In the coming
decades many emerging countries -- currently boosting the rise in
absolute population numbers -- will themselves make the transition to
lower birth rates. While most estimates see the global population of
Earth peaking around 2040-50, what none of them show is what happens
afterward. Unless some unknown counterforce arises, what happens after
the peak is that the world's birth rate steadily sinks below
replacement level. Increasing people's longevity can only slightly
postpone this population fizzle.
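
Kelly's arithmetic can be made concrete with a toy projection: once fertility settles below the replacement level of roughly 2.1 children per woman, population shrinks by a fixed factor per generation. The starting population, fertility rate, and generation length below are illustrative assumptions, not demographic forecasts.

```python
# Toy projection of population under sub-replacement fertility.
# All rates here are illustrative assumptions, not estimates.

def project(pop, tfr, years, replacement=2.1, gen_length=30):
    """Project population forward. The annual growth factor is derived
    from the ratio of total fertility rate to replacement level,
    spread evenly over one generation length."""
    annual_factor = (tfr / replacement) ** (1.0 / gen_length)
    history = [pop]
    for _ in range(years):
        pop *= annual_factor
        history.append(pop)
    return history

# Hypothetical: 9 billion people at the peak, fertility settling at 1.6.
traj = project(9.0, tfr=1.6, years=100)
print(round(traj[-1], 2))  # population (billions) a century after the peak
```

Under these made-up numbers, the population falls by more than half within a century of the peak, which is the "fizzle" the essay describes.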

The main reason this is unreported is that it will be decades before
it happens, and until it does, the population boomers are correct:
our numbers swell exponentially. But the story is worth reporting for
three reasons:

1) Long before the world's population fizzles, it will age, as it
already is doing. The average age of a human on Earth has been
increasing since 1972. The developing countries are on track to become
Geezer Countries in another twenty years or less. This will affect
culture, politics, and business.

2) There is currently no cultural force in the modern societies to
maintain human population levels over the long term. Population
robustness these days comes primarily from undeveloped countries, and
from cultural practices that are disappearing in developed countries.
Once the developing countries become developed, there will be little
at work to maintain an average of one offspring per person, with
millions having more children to balance the millions with few or none.
How many future readers will be willing to have three or more kids?

3) There is no evidence in history of declining population and
increasing prosperity. In the past, rising prosperity has always
accompanied rising populations. Prosperity riding upon a declining
population *may* be possible through some wisdom or technology that we
don't possess now, but it would require developing an entirely new
skill set for society.

This story could be reported by asking the non-obvious questions of
the usual population pundits: what happens afterwards? What happens in
China with a continually declining birth rate? What would happen to
the US without immigration? Who is going to work in Europe?

KEVIN KELLY is a founding editor of Wired magazine. In 1993 and 1996,
under his co-editorship, Wired won its industry's Oscar -- The
National Magazine Award for General Excellence. Prior to the launch of
Wired , Kelly was editor/publisher of the Whole Earth Review, a
journal of unorthodox technical and cultural news. He is the author of
New Rules for the New Economy; and Out of Control: The New Biology of
Machines, Social Systems, and the Economic World.

LINK: Kevin Kelly Bio Page

Freeman Dyson
The Enduring Vitality of the More Moderate Kinds of Religions

Here is my not very original answer to this year's question: what is
today's most important unreported story?

The enduring vitality of the more moderate kinds of religion. Although
the fanatical and fundamentalist versions of religion receive heavy
attention from the media, the moderate versions of religion do not.
The majority of religious people are neither fanatical nor
fundamentalist. This is true of Christians, Jews and Moslems alike. In
the modern world, with inequalities between rich and poor growing
sharper, governments are increasingly incapable or unwilling to take
care of the poor. Organized religions increasingly provide the glue
that holds societies together, giving equal respect to those who
profit from economic prosperity and those who are left behind. This is
true even in such a prosperous and technologically advanced community
as Princeton, New Jersey, where I live. Princeton has more than twenty
churches, all trying in their different ways to reach out to people in
need, all bridging the gap between young and old as well as the gap
between rich and poor. Religion plays this same role, holding
communities together with bonds of fellowship, among the Arabs of the
West Bank, the Jews of Brooklyn, and the African-Americans of Trenton,
New Jersey. Religions that have endured for a thousand years and
helped generations of oppressed people to survive are not about to
disappear.

FREEMAN DYSON is professor of physics at the Institute for Advanced
Study, in Princeton. His professional interests are in mathematics and
astronomy. Among his many books are Disturbing the Universe, Infinite
in All Directions, Origins of Life, From Eros to Gaia, Imagined Worlds,
and The Sun, the Genome, and the Internet.

Piet Hut 
There Are No Things.

That's right. No thing exists, there are only actions. We live in a
world of verbs, and nouns are only shorthand for those verbs whose
actions are sufficiently stationary to show some thing-like behavior.
These statements may seem like philosophy or poetry, but in fact they
are an accurate description of the material world, when we take into
account the quantum nature of reality.

Future historians will be puzzled by the fact that this interpretation
has not been generally accepted, 75 years after the discovery of
quantum mechanics. Most physics text books still describe the quantum
world in largely classical terms. Consequently anything quantum seems
riddled with paradoxes and weird behavior. One generally talks about
the "state" of a particle, such as an electron, as if it really had an
independent thing-like existence, as in classical mechanics. For
example, the term 'state vector' is used, even though its operational
properties belie almost anything we normally associate with a state.

Two voices have recently stressed this verb-like character of reality,
those of David Finkelstein, in his book Quantum Relativity, and of
David Mermin, in his article "What is quantum mechanics trying to tell
us?" [1998, Amer. J. of Phys. 66, 753]. In the words of the second
David: "Correlations have physical reality; that which they correlate
does not." In other words, matter acts, but there are no actors
behind the actions; the verbs are verbing all by themselves without a
need to introduce nouns. Actions act upon other actions. The ontology
of the world thus becomes remarkably simple, with no duality between
the existence of a thing and its properties: properties are all there
is. Indeed: there are no things.

Two hundred years ago, William Blake scolded the physicists for their
cold and limited view of the world, in terms of a clockwork mechanism,
in which there was no room for spontaneity and wonder. Fortunately,
physicists did not listen to the poet, and pushed on with their
program. But to their own utter surprise, they realized with the
discovery of quantum mechanics that nature exhibits a deeply
fundamental form of spontaneity, undreamt of in classical physics. An
understanding of matter as dissolving into a play of interactions,
partly spontaneous, would certainly have pleased Blake.

What will be next? While physics may still seem to lack a fundamental
way of touching upon meaning and wonder, who is to say that those will
remain forever outside the domain of physics? We simply do not know
and cannot know what physics will look like, a mere few hundred years
from now.

There is an analogy with computer languages. Physicists have a
traditional aversion to learning any language other than Fortran, with
which they grow up, no matter how useful the other languages may be.
But while physicists never parted from their beloved Fortran, Fortran
itself changed out from under them, incorporating many of the features
that the other languages had pioneered. So, when asked how future
physicists will program, a good answer is: we have not the foggiest
idea, but whatever it is, it will still be called Fortran.

Similarly, our understanding of the material world, including the very
notion of what matter and existence is, is likely to keep changing
radically over the next few hundred years. In what direction, we have
no idea. The only thing we can safely predict is that the study of
those wonderful new aspects of reality will still be called physics.

PIET HUT is professor of astrophysics at the Institute for Advanced
Study, in Princeton. He is involved in the project of building GRAPEs,
the world's fastest special-purpose computers, at Tokyo University,
and he is also a founding member of the Kira Institute.

LINKS: Piet Hut's Home Page, GRAPEs; and Kira Institute.

Jeff Jacobs 
The Lack of Services to Heal Abused Children

The emotional, physical, sexual abuse of children in our society and
the lack of services to heal them.

JEFF JACOBS is the Founder of Civitas Initiative, and President of
Harpo Entertainment Group.
LINK: Civitas Initiative

Lance Knobel
Two Stories, One Encouraging, the Other Worrying

First is the incredible dynamism, energy and economic hope found in
the great cities of the developing world. Most coverage of developing
world cities concentrates on the problems: environment, overcrowding,
shanty towns. But these cities represent the best hope of the world's
poor nations. Modern myth has it that these cities are growing at
unprecedented rates in an increasingly urbanized world. In fact the
cities of the north grew at even greater rates in the 19th century.
Mexico City, Sao Paulo, Calcutta, Buenos Aires and Rio all had more
people moving out than moving in over the decade of the '80s. More
than two-thirds of the world's cities with populations of more than 1
million were important cities 200 years ago. Miami and Phoenix grew
more rapidly than Nairobi in the 20th century; Los Angeles more
rapidly than Calcutta. Urban growth is generally good: rising levels
of urbanization are strongly associated with growing and diversifying
economies, and most of the nations in the south whose economic
performance over the last 10 to 15 years is so envied by others are
also the nations with the most rapid increase in their levels of
urbanization. The developing world's cities need to be celebrated,
not lamented.

Less encouraging is the largely ignored story of climate change. All
media have great difficulty in covering stories that develop over
generations. Who in 1900 would have written about the changing role of
women or the spread of democracy, two of the extraordinary shifts of
the last century? The failure of the Kyoto conference to secure
significant action on climate change from the advanced, industrialized
countries means that the problems will get far worse before there is
any chance of them getting better.

LANCE KNOBEL is editor-in-chief of World Link, the magazine of the
World Economic Forum, and is responsible for the program of the
Forum's Annual Meeting in Davos, Switzerland. He also publishes their
nascent Weblog.

LINKS: The World Economic Forum, Worldlink: The Online Magazine of the
World Economic Forum, and Weblog

Philip Elmer-Dewitt
O.J.'s Search for the Real Killer.

PHILIP ELMER-DEWITT is Science and Technology Editor, Time Magazine.
He has been writing about science and technology for Time since he
reported a cover story on computer "Whiz Kids" in 1982. He became a
staff writer in 1983, a senior writer in 1993, a senior editor in 1994
and science editor in 1995. He started two new sections in the
magazine -- Computers (1982) and Technology (1987) -- and in 1994
helped launch Time Online, America's first interactive newsmagazine.
LINK: Philip Elmer-DeWitt's Home Page.

John Horgan
The Quiet Resurgence of Psychedelic Compounds as Instruments of Both
Spiritual and Scientific Exploration

The story that has gripped me lately is the quiet resurgence of
psychedelic compounds as instruments of both spiritual and scientific
exploration. This trend is unfolding worldwide. I just attended a
conference in Switzerland at which scholars presented findings on the
physiological and psychological effects of drugs such as psilocybin,
LSD and MDMA (Ecstasy). At the meeting, I met an American chemist who
had synthesized a new compound that seems to induce transcendent
experiences as reliably as LSD does but with a greatly reduced risk of
bad trips; a Russian psychiatrist who for more than 15 years has
successfully treated alcoholics with the hallucinogen ketamine; and a
German anthropologist who touts the spiritual benefits of a potent
Amazonian brew called ayahuasca. Long a staple of Indian shamans,
ayahuasca now serves as a sacrament for two fast-growing churches in
Brazil. Offshoots of these churches are springing up in the U.S. and
elsewhere.

Several non-profit groups in the U.S. are attempting to rehabilitate
the image of psychedelic drugs through public education and by
supporting research on the drugs' clinical and therapeutic potential.
They include the Heffter Institute, based in Santa Fe, New Mexico, and
the Multidisciplinary Association for Psychedelic Studies (MAPS), based
in Florida. The question is, will this new psychedelic movement
founder, as its predecessor did in the 1960s? Or will it bring about
the profound spiritual and social changes that advocates envision?

JOHN HORGAN is a freelance writer and author of The End of Science and
The Undiscovered Mind. A senior writer at Scientific American from
1986 to 1997, he has also written for the New York Times, Washington
Post, New Republic, Slate, London Times, Times Literary Supplement and
other publications. He is now writing a book on mysticism.

Further Reading on Edge: "Why I Think Science Is Ending" A Talk With
John Horgan; "The End of Horgan?"

Hans Weise
That There Are So Many Important Unreported Stories.

Given the number of media outlets for independent voices to tell good
stories, the vanilla quality of mainstream reporting is like the
proverbial frog in a pot of water who doesn't notice the slow
temperature increase and boils cozily. In a consumer-oriented culture
under a booming economy, critical voices are marginalized and the
questions "we" ask ourselves lose color and substance and become
sensational and picayune. A news magazine won't tell you about
record-setting U.S. arms sales, for instance, but it will tell you
that Ricky Martin is the sexiest man alive.

Consumers don't like to hear that things aren't what they seem, so
advertisers won't support those who publish such nonsense. The same
goes for education -- people are still uncomfortable with the idea
that we've descended from apes and so, in the case of Kansas at least,
students needn't be burdened with that knowledge.

HANS WEISE is a filmmaker, writer, and Web developer living in
Alexandria, VA. He was a home-schooled auto-didact who majored in
cinema studies at NYU and then studied archaeology and astronomy via
Harvard extension.

Clifford A. Pickover
The Immortalization of Humanity

The most unreported story deals with the evolution of human lifespans
and intelligence. Although we hear news reports about how humans will live
longer in the future, we rarely hear reports that our children or
grandchildren will be immortal by the end of the next century. Given
the tremendous advances in molecular biochemistry that will take place
by 2100, we will certainly uncover the molecular and cellular
mysteries of aging, and therefore many humans will live forever,
assuming they don't suffer a fatal accident. I am amazed that this
obvious concept is not discussed more often or taken more seriously.
Of course, the ecological, economic, political, social, and religious
implications will be extreme. Imagine an immortal Pope discussing the
afterlife with his followers -- or the growth of two social classes,
those that can afford immortality and those too poor to gain access to
the required anti-aging "treatment."

Similarly, most scientists and lay people seem to think that there is
intelligent, space-faring life elsewhere in the universe. A related
unreported story is just how special human intelligence is. Despite
what we see in Star Wars and Star Trek, I don't expect intelligence to
be an inevitable result of evolution on other worlds. Since the
beginning of life on Earth, as many as 50 billion species have arisen,
and only one of them has acquired technology. If intelligence has such
high survival value, why are so few creatures intelligent? Mammals
are not the most successful or plentiful of animals. Ninety-five
percent of all animal species are invertebrates. Most of the worm
species on our planet have not even been discovered yet, and there are
a billion insects wandering the Earth.

If humankind were destroyed in some great cataclysm, in my opinion
there is very little possibility that our level of intelligence would
ever be achieved on Earth again. If human intelligence is an
evolutionary accident, and mathematical, linguistic, artistic, and
technological abilities a very improbable bonus, then there is little
reason to expect that life on other worlds will ever develop
intelligence that allows them to explore the stars. Both intelligence
and mechanical dexterity appear to be necessary to make radio
transmitting devices for communication between the stars. How likely
is it that we will find a race having both traits? Very few Earth
organisms have much of either. As evolutionary biologist Jared Diamond
has suggested, those that have acquired a little of one (smart
dolphins, dexterous spiders) have acquired none of the other, and the
only species to acquire a little of both (chimpanzees) has been rather
unsuccessful. The most successful creatures on Earth are the dumb and
clumsy rats and beetles, which both found better routes to their
current dominance. If we do receive a message from the stars, it will
undermine much of our current thinking about evolutionary mechanisms.

Despite the improbabilities, we must continue to scan the stars for
signs of intelligence. I agree with the ancient Persian proverb, "The
seeker is a finder," which suggests we must always search in order to
understand our place in our universe.

CLIFFORD A. PICKOVER is a research staff member at the IBM Watson
Research Center in Yorktown Heights, New York. He received his Ph.D.
from Yale University and is the author of over twenty books on such
topics as computers and creativity, art, mathematics, black holes,
human behavior and intelligence, time travel, and alien life. His web
site, www.pickover.com, has received over 200,000 visits, and his
latest book is Surfing Through Hyperspace: Understanding Higher
Universes in Six Easy Lessons.

LINK: Clifford A. Pickover's Home Page

Howard Rheingold
How Will The Internet Influence Democracy?

The way we learn to use the Internet in the next few years (or fail to
learn) will influence the way our grandchildren govern themselves. Yet
only a tiny fraction of the news stories about the impact of the Net
focus attention on the ways many-to-many communication technology
might be changing democracy -- and those few stories that are
published center on how traditional political parties are using the
Web, not on how grassroots movements might be finding a voice.

Democracy is not just about voting for our leaders. Democracy is about
citizens who have the information and freedom of communication they
need to govern themselves. Although it would be illogical to say that
the printing press created modern democratic nation-states, it would
have been impossible to conceive, foment, or implement self-government
without the widespread literacy made possible by printing technology.
The more we know about the kind of literacy citizens are granted by
the Internet, the better our chances of using that literacy to
strengthen democracy. And what could be more important? What good is
health and wealth and great personal home entertainment media without
liberty?

Every communication technology alters governance and political
processes. Candidates and issues are packaged and sold on television
by the very same professionals who package and sell other commodities.
In the age of mass media, the amount of money a candidate can spend on
television advertising is the single most important influence on
electoral success. Now that the Internet has transformed every desktop
into a printing press, broadcasting station, and place of assembly,
will enough people learn to make use of this potential? Or will our
lack of news, information, and understanding of the Net as a political
tool leave us defenseless against the centralization of capital, power,
and knowledge that modern media also make possible?

The same tool that affords tremendous power to the grassroots, the
broad citizenry, the cacophony of competing "factions" necessary for
healthy democracy, also affords tremendous power to the elites who
already have wealth and power. Guess who can best afford to apply the
tool to further their ends? What's in it for big media interests to
inform us about how we can compete with big media interests?

The political power afforded to citizens by the Web is not a
technology issue. Technology makes a great democratization of
publishing, journalism, and public discourse possible, but does not
determine whether or not that potential will be realized. Every
computer connected to the Net can publish a manifesto, broadcast audio
and video eyewitness reports of events in real time, and host a virtual
community where people argue about those manifestos and broadcasts.
Will only the cranks, the enthusiasts, the fringe groups take
advantage of this communication platform? Or will many-to-many
communication skills become a broader literacy, the way knowing and
arguing about the issues of the day in print was the literacy
necessary for the American revolution?

The "public sphere" is what the German political philosopher Habermas
called that part of public life where ordinary people exchange
information and opinions regarding potholes on main street and
national elections, school bonds and foreign policy. Habermas claimed
that the democratic revolutions of the 18th century were incubated in
the coffee houses and committees of correspondence, informed by the
pamphlets and newspaper debates where citizens argued about how to
govern themselves without a King. Public governance could only emerge
from public opinion. Habermas wrote: "By 'public sphere,' we mean
first of all a domain in our social life in which such a thing as
public opinion can be formed." The public sphere is the reason why the
modern coup d'etat requires paratroopers to capture television
broadcast stations -- because those are the places where the power to
influence public opinion is concentrated.

The problem with the public sphere during the past sixty years of
broadcast communications has been that a small number of people have
wielded communication technology to mold the public opinion of entire
populations. The means of creating and distributing the kind of media
content that could influence public opinion -- magazines, newspapers,
radio and television stations -- were too expensive for any but a few.
Just as books were once too expensive for any but a few. The PC and
the Internet changed that. Desktop video, desktop radio, desktop
debates, and digicam journalism have drastically reduced the barriers
to publishing and broadcasting. These technological capabilities have
emerged only recently, and are evolving rapidly. While much attention
is focused on how many-to-many audio technology is threatening the
existing music industry, little attention is focused on political
portals. While all eyes are on e-commerce, relatively few know about
public opinion BBSs, cause-related marketing, web-accessible voting
and finance data.

Look at VoxCap, the Minnesota E-Democracy project, the
California Voter Foundation, and scores of other unreported
experiments. Imagine what might happen if more people were told that
the Web could help them remain free, as well as enhance their
shopping.
HOWARD RHEINGOLD is the author of Virtual Reality, and The Virtual
Community, and was the editor of Whole Earth Review and the Millennium
Whole Earth Catalog.

Further reading on Edge: Chapter 24, "The Citizen" in Digerati.

LINK: rheingold's brainstorms.

Ivan Amato
The Planet Itself Is Becoming Self Aware

Living flesh is innervated with all kinds of sensors like taste buds,
pressure sensors, photoreceptors and position sensors in muscle
fibers, which monitor internal and external conditions. Brains analyze
signals from these sensors using built-in and ever-evolving models of
the world (that include the owners of the brains) and then use these
analyses to formulate plans of action.

One of the most important un(der)reported stories today is the way the
inanimate world built by humanity is becoming ever more innervated
with sensors (cameras, microphones, strain gauges, magnetic sensors,
GPS receivers, transponders, infrared sensors, satellite surveillance,
etc.) as well as communications systems linking these sensors to
computers that can store, analyze and act on those signals just like
biological brains. What's more, all of these sensors are likely to
ultimately link into a next-generation Internet via
ultra-miniaturized, on-board, wireless connections (one of the main
R&D thrusts of the microelectromechanical systems community). Millions
upon millions of thermometers, barometers, GPS transponders in vehicles,
seismic monitors, radiation monitors, department store surveillance
cameras, and thousands of other gadgets watching the world will all
feed data into the system. This will amount to a global-scale,
sensitive infrastructure -- a planet-sized body, that is -- whereby
myriad sensory signals will constantly feed into a global-scale
cyberspace coursing with sophisticated pattern-recognition abilities,
knowledge-discovery (data-mining) systems, and other artificial
cognition tools. One consequence will be that Earth will have a new
kind of planetary self-awareness akin to the bodily awareness living
creatures have due to their sensory tissue. Debate about personal
privacy will become almost moot since the entire world will constitute
a glass house. On the up side, the complexity of this worldwide
awareness -- and the new categories of data about the world that will
become available -- is likely to lead to emergent phenomena as
surprising as the way life emerges from molecules and consciousness
from life.

IVAN AMATO, freelance print and radio writer; editor of the Pathways
of Discovery essay series for Science Magazine; author of Stuff: The
Materials The World Is Made Of and Pushing the Horizon, which is an
institutional history of the Naval Research Laboratory.

Arnold Trehub
How The Human Brain Models The World

A plausible explanation of how the brain can create internal models of
veridical and hypothetical worlds has long eluded theorists. But
recently there has been significant progress in the theoretical
understanding of this defining aspect of human cognition, and it has
scarcely been reported. About a decade ago, I wrote in The Cognitive
Brain that the capability for invention is arguably the most
consequential characteristic that distinguishes humans from all other
creatures. Our cognitive brain is especially endowed with neuronal
mechanisms that can model within their biological structures all
conceivable worlds, as well as the world we directly perceive or know
to exist. External expressions of an unbounded diversity of
brain-created models constitute the arts and sciences and all the
artifacts and enterprises of human society.

The newsworthy story is that we now have, for the first time, a
biologically credible large-scale neuronal model that can explain in
essential structural and dynamic detail how the human brain is able to
create internal models of its intimate world and invent models of a
wider universe.

ARNOLD TREHUB is adjunct professor of psychology at the University of
Massachusetts Amherst. He has been the director of a laboratory
devoted to psychological and neurophysiological research and is the
author of The Cognitive Brain.

Brian Goodwin
Quality Pigs

My story is about pigs! How could anything connected with pigs
possibly have significant cultural consequences? It comes from
research that entails a fundamental change in the scope of scientific
inquiry. To appreciate what is at stake, we need to recall a basic
assumption in the practice of western science: reliable knowledge
about nature depends upon measurement. We can be sure of the
wavelength of light rays from the setting sun, but there's no way we
can determine the beauty of a sunset. Or we can find out the weight of
a pig, but we can never know if a pig is happy or sad. Western science
is about quantities, which are regarded as 'objective' properties of
the world that everyone using the same method of measurement can agree
on. It is not about qualities such as pleasure, pain, honesty,
happiness or grief, which are regarded as subjective states that are
not objectively real, however important they may seem to us.

But what if it could be shown that qualities can be evaluated just as
reliably and consistently as quantities? And by essentially the same
scientific procedures? This is what has been shown in studies by a
research team working in Edinburgh. People were shown videos of
individual pigs interacting in a standard pen with the team leader,
Francoise Wemelsfelder. They were asked to write down for each pig any
set of terms that they felt described the quality of its behavior.
These included words such as bold, aggressive, playful for one animal;
timid, shy, nervous for another; indifferent, independent,
self-absorbed for a third, and so on. There was no limit to the number
of descriptors that could be used for any pig. A routine procedure was
then followed in which each pig was evaluated again by each observer
using all their chosen pig-descriptive terms and the results compared
over the whole group of observers to see if there was consistency of
evaluation. This type of procedure is regularly used in evaluation of
food quality and flavour, but it has never before been used to see if
people agree about an animal's 'subjective' state in terms of its
behavior.

The results were startling: there was a high level of consensus among
people about the quality of behavior shown by different pigs. Their
assessments were not arbitrary, personally idiosyncratic descriptions,
but evaluations with a high degree of intersubjective consistency.
This is precisely the basis of scientific 'objectivity': agreement
between different observers using an agreed method of observation.
This opens the door to a science of qualities with startling
implications.
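
The consistency check described here can be sketched in a toy
calculation. The pig scores, the single descriptor ("boldness"), and
the use of mean pairwise rank correlation are illustrative assumptions
only, not the Edinburgh team's actual statistical procedure:

```python
from itertools import combinations

def ranks(xs):
    """Assign average ranks to a list of scores (handles ties)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank for a tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(a, b):
    """Spearman correlation = Pearson correlation of the ranks."""
    ra, rb = ranks(a), ranks(b)
    n = len(ra)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    va = sum((x - ma) ** 2 for x in ra) ** 0.5
    vb = sum((y - mb) ** 2 for y in rb) ** 0.5
    return cov / (va * vb)

# Hypothetical data: three observers score the same five pigs on
# "boldness", each using their own freely chosen scale.
observers = {
    "obs1": [8, 2, 5, 9, 1],
    "obs2": [7, 3, 6, 9, 2],
    "obs3": [9, 1, 4, 8, 2],
}

# Intersubjective consistency: mean pairwise rank correlation.
pairs = list(combinations(observers.values(), 2))
consensus = sum(spearman(a, b) for a, b in pairs) / len(pairs)
print(round(consensus, 2))  # -> 0.87; values near 1.0 mean consensus
```

The point of the sketch is only that "agreement between different
observers" is itself measurable: the closer the mean pairwise
correlation is to 1.0, the less idiosyncratic the qualitative
judgements are.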

The most important aspects of our lives are connected with qualities:
quality of relationships, quality of education, quality of our
environment, quality of life generally. We spend a great deal of time
evaluating the behavior of those on whom we depend and trying to sort
out whether they are happy, angry, depressed, reliable, and so on;
i.e., we get a lot of practice at evaluating others' internal states
by reading their behavior. And on the whole we are pretty good at it,
despite dramatic errors of judgement to which we are prone. So it
isn't all that surprising that people with no familiarity with pigs
should nevertheless be very consistent at evaluating the quality of
their behavior. But what is most dramatically lacking in the lives of
people in 'developed' countries at the moment is, by general
consensus, quality of life. Quantities we have in abundance - of food,
technological gadgets of all kinds, cars, aircraft, information, and
so on; the things that our science of measurement and quantities has
been so successful at providing. But that science has degraded
qualities such as beauty, love, joy, grief, and creativity to mere
epiphenomal subjectivity, regarding them as ephemeral shadows with no
objective reality. We intuitively know better. But now we can actually
explore this territory systematically, scientifically, and reinvest
our world with the qualities that are essential for living full lives;
not just for humans but also for pigs and cows and trees and cities
and landscapes and watersheds and cultures and the biosphere. With a
science of qualities we can start to recover the wisdom we lost when
we restricted our search for reliable knowledge to measurable
quantities and cut ourselves off from the qualitative half of the
world without which we and all else must perish.

BRIAN GOODWIN is a professor of biology at the Schumacher College,
Milton Keynes, and the author of Temporal Organization in Cells and
Analytical Physiology, How The Leopard Changed Its Spots: The
Evolution of Complexity, and (with Gerry Webster) Form and
Transformation: Generative and Relational Principles in Biology. Dr.
Goodwin is a member of the Board of Directors of the Santa Fe
Institute.

Further reading on Edge: Chapter 4, The Third Culture; "A New Science
of Qualities:" A Talk With Brian Goodwin

Stephen Grossberg
How does a brain give rise to a mind?

When we think about our conscious experiences of the world, we are
aware of vivid colors, sounds, and feelings. These seem to occur in a
world of three spatial dimensions evolving through time. When we look
at a typical brain, we see an unbelievably intricate web of networks
energized by electrical and chemical processes. These networks often
have incredibly large numbers of components. Given such different
levels of description, it has been difficult to comprehend how a brain
can give rise to a mind.

During the past decade, theorists of mind and brain have finally been
able to model the detailed temporal dynamics of known brain cells and
circuits...AND the observable behaviors that they control, using the
same model. Thus, for the first time in history, there are theories
powerful enough to begin to explicitly show how our minds arise from
our brains.

The exciting thing about this progress is that, in every case, it
depends upon qualitatively new concepts and theories about how our
brains adapt on their own, moment-by-moment, to a rapidly changing
world. Many outstanding problems in technology also require that an
intelligent agent adapt on its own, moment-by-moment, to a rapidly
changing world. These new concepts and theories have therefore begun
to find their way into new technological applications.

Thus the new progress about how our brains work promises to provide a
more human technology. The story has not been adequately reported
because the new concepts are qualitatively new -- for the same reason
that brains seem to be so different from minds -- and many reporters
have not taken the time to understand them.

STEPHEN GROSSBERG is one of the principal founders of the fields of
computational neuroscience, connectionist cognitive science, and
artificial neural network research. He is Wang Professor of Cognitive
and Neural Systems and Professor of Mathematics, Psychology, and
Biomedical Engineering at Boston University. He is Co-Editor-in-Chief
of the journal Neural Networks, which is the official journal of the
three major neural modeling societies in the world.

LINK: Stephen Grossberg Home Page

Philip W. Anderson
(1) The closing of the High Flux Beam Reactor at Brookhaven National
Lab; (2) Eppawala

I have two answers, which will certainly qualify because you may well
never have heard of either.

The first, and my primary answer because it is a local matter that the
Third Culture can hope to affect, is the closing of the High Flux Beam
Reactor at Brookhaven National Lab.

Many high profile media scientists proclaimed the Supercollider
decision as the point at which the US definitively turned away from
science. But it was nothing of the sort. It was a badly managed and
unwisely promulgated project, immensely expensive and disconnected
with the rest of science, about which many perfectly reasonable
scientists had serious doubts -- not just me, though I seem to have
taken the brunt of the blame.

My nominee for this turning point is the HFBR. A coalition of
pseudo-environmentalists and trendy New Agers, useful and well-heeled
Clinton friends and campaign contributors with Long Island real
estate, blew up a leak amounting to no more tritium than sits in the
exit signs of your local movie theatre into a major issue, and
Bill Richardson, the Secretary of Energy, caved without listening to
the scientists involved. It is reported that the coalition arranged an
interview for the Secretary with a supermodel on the afternoon the
scientists had asked for a last appeal.

In any case the loss of the HFBR closes one of the world's most
productive scientific instruments and sends the entire community off
to our friendly competitors in Europe and Japan. Neutron scattering
and diffraction is central to much of condensed matter physics and
useful in biology, chemistry and several branches of technology.
Approximately 300 experiments were run the last year the HFBR was up.
There was no conceivable economic reason for shutting it down -- it
was a very inexpensive instrument relative to the projects which are
replacing it. Its real problem is the anti-intellectual bias of the
majority culture. If only we had been able to label it "organic"
rather than "nuclear" it would have survived.

I will be brief about the other.

You will never have heard the name "Eppawala". This is a project of a
US-Japan consortium to mine phosphate in gigantic quantities from a
mountain in Sri Lanka, destroying a thousand-year-old irrigation
system, numerous antiquities, and many villages and compensating the
locals on a typical Third World scale with a minute fraction of the
profits -- profits which hardly exist if one were to count the true
cost of the project. It is a staggering example of the misuse of
economic reasoning which characterizes third world "development"
projects. Not just third world, in my opinion!

PHILIP W. ANDERSON is a Nobel laureate physicist at Princeton and one
of the leading theorists on superconductivity. He is the author of A
Career in Theoretical Physics, and The Economy as an Evolving Complex
System.

LINK: Philip W. Anderson Home Page.

James J. O'Donnell
The Remaking Of Personality As Seen In Language.

Until a few years ago, tracking the evolution of language was only
possible for idle gentlemen: university scholars, amateur dilettantes
(whence arose the Oxford English Dictionary), or pompous columnists
(only one of whom has a weekly piece in the New York Times Magazine).
It was carried on from an elite gentleman's club perspective and
consisted of deploring decline and patronizing neologism.

But now we have the tools to do a vastly better job of paying
attention to what *we* are saying. Huge quantities of "real" language
as it is spoken and written can be collected easily and subjected to
sophisticated tracking analysis. Gender, class, nationality: all can
be revealed and studied as never before. The ethical and political
bases of society as it is (not as it imagines itself to be) can be
displayed and analyzed.
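
The kind of tracking analysis described here can be sketched with a
toy corpus study. The year-stamped snippets and the word being tracked
are invented for illustration; a real study would use a large,
balanced corpus:

```python
from collections import Counter
import re

# Hypothetical mini-corpus: year-stamped snippets of "real" language
# as it is spoken and written.
corpus = {
    1950: "the chairman said he shall not yield the floor",
    1980: "the chairperson said they will not yield the floor",
    2000: "the chair said they will not give up the floor",
}

def rel_freq(text, word):
    """Occurrences of `word` per 1,000 tokens of the text."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return 1000 * Counter(tokens)[word] / len(tokens)

# Track how a form rises or falls across the decades.
for year, text in sorted(corpus.items()):
    print(year, round(rel_freq(text, "shall"), 1))
```

Even this trivial frequency-per-decade measure shows the shape of the
method: collect dated language, normalize counts, and watch forms
(here the declining "shall") rise and fall across social groups and
time.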

Why is this important? Because the way we talk about ourselves and
others is the way we create ourselves and our society. The ethical and
social revolutions of the last half century have been far reaching,
but it is still possible for those who prefer nostalgia to justice to
wish those revolutions away. The emergence of a serious journalism of
language, supported by good science, would document the way in which
all classes and social groups have changed and continue to change. It
would tell us things about ourselves that we know in our hearts but
have not had the self-awareness, the wisdom, and the courage to say to
ourselves aloud. I believe we would all be happier if we knew how far
we have come, and I can think of no better way of measuring and
showing it.

JAMES J. O'DONNELL, Professor of Classical Studies and Vice Provost
for Information Systems and Computing at the University of
Pennsylvania, is the author of Avatars of the Word: From Papyrus to
Cyberspace.

LINK: The James J. O'Donnell Website.

Sylvia Paull
Women Are Still Considered Inferior To Men

After two thousand years of "civilization," women are still considered
inferior to men by most cultures, whether in developed, developing, or
undeveloped nations.

Although the media report on glass ceilings in the workplace, they do
not penetrate beyond the economic discrimination women face into the
culture itself: What is it that makes most men think they are superior
to women?

Why is the thought of electing a woman president of the United States
so unthinkable to most of the population? Why is it surprising that
most Fortune 1000 companies still lack a woman on their board of
directors? Why do women athletes still lack funding and popular
support on a scale that their male counterparts garner?

Because after 2,000 years of recorded history, and 20,000 years of
artifact-preserved history, women have generally been relegated to
the role of breeders, not leaders. And even though technological and
economic advances have allowed women to have children as well as
professional careers, their multimillennial image as background
breeders persists.

This pervasive fallacy continues to limit the creative potential of
half of the world's population. The underlying belief in women's
inferiority seems to be so ingrained in our collective psyches that
even the media doesn't seem motivated to investigate -- let alone
challenge -- its roots.

SYLVIA PAULL is founder of Gracenet , a group of women who work in
computing, high-tech journalism, and related fields and who support
the advancement of women in these fields.

LINK: Gracenet

Thomas Petzinger, Jr.
The End Of Money

No one, least of all in the press -- least of all in the business
press -- has seen the beginnings of what may be the greatest
revolution in the history of commerce: the end of money, and with it
the concept of the customer.

Until there was money, there was no such thing as a customer. It
wasn't swapping tools for fish that turned a Polynesian islander from
a trader into a customer. That's simply barter. The idea of "buyer"
and "seller" emerged only when one party swapped something with a
fixed use for something fungible. Often, the money received by the
seller had a modest utilitarian purpose; gold, for instance, could be
hammered into nose rings, false teeth or satellite solar arrays. But
money became the foundation of economic life precisely because it had
symbolic more than practical value.

Then God gave us lawyers and accountants to prevent underweighing and
overcharging, to make sure that every exchange of tangible things for
intangible money was perfectly balanced, perfectly reciprocal. But
this is a conceit of economists, accountants and lawyers, as everyday
commercial life reveals. Because it can be turned into anything, money
represents dreams unfulfilled, and unrequited dreams, at any price,
are worth more than dreams realized. We all realize this intuitively.
A buyer asks a seller to give up a mere thing; a seller asks a buyer
to give up hopes and possibilities. For the same reason, it's more
costly for sellers to recruit buyers than for buyers to recruit
sellers: Sellers can exchange their stuff for only one thing (money),
while buyers can exchange their money for anything. That's why, in the
real world of purportedly balanced transactions, sellers invariably
defer to buyers -- why we say "the customer is king" and "The customer
is always right."

But let's say it's 2000 and you're Time Inc. You own some of the
best-known media properties in the world: Sports Illustrated, People
magazine, etc. You want to leverage those properties. So you approach
Yahoo!, say, or America Online. You propose to provide content to
them. They propose to promote your brand. And as you sit down to the
bargaining table to sort out the economics of all this, you throw up
your hands and ask, "Are we paying you or are you paying us?" That's
how these negotiations actually go.

"Who's paying whom?" Asking a question like that signals that maybe
nobody needs to pay anything to anybody. Lots of value is created, but
"nobody's paying for it". It just happens because two (or more)
business partners create something together. In these situations firms
can't begin to account for the nickels and dimes in the deal and may
not even bother trying. In these situations, relationships triumph
over transactions. Money drastically diminishes as a factor in the
deal. And the identity of the customer -- Are we paying you or are you
paying us? -- becomes fuzzy. The very concept of the customer begins
to disappear.

Look at Silicon Valley. Every major firm there is a node in a complex
network in which a huge fraction of the value creation could never be
accounted for in monetary terms. Should Intel pay for Microsoft to
optimize operating systems in a way that makes Intel chips ubiquitous?
Or should Microsoft pay Intel to design chips that make Microsoft
operating systems ubiquitous?

The press and the pundits are clueless about the effects of these
de-monetized value-added dealings. No wonder, because all their
measurements are expressed as units of money. Unless some dough
changes hands, even the biggest commercial developments are as unheard
as trees falling in the distant forest. The data mavens at Commerce
are blind to the value created when Yahoo! adds a new Web site listing
or when Mapquest shaves 0.6 miles off my trip. When the Labor
Department calculates the Consumer Price Index it has no idea that its
own Web pages are being dished out on free Linux source code or that a
building contractor in Bowie, Md., decided to eat a change order
because he wanted to preserve the goodwill of his client -- and that
more and more of the economy is being transacted on such a basis. When
Dr. Greenspan and the pooh-bahs at the Fed deliberate over the
"irrational exuberance" of the stock market, how much weight do you
suppose they're giving to the fact that the marginal cost of a
transaction in a world of e-commerce has essentially dropped to zero?
More de-monetization.

Today most of the money in the world isn't even made of paper, much
less metal. It exists as binary digits. No wonder the central banks of
the world are heaving their gold reserves into a collapsing market.
Who needs gold when money sheds the slightest pretense of being
anything but data? Say good-bye to gold. Gold is history. If you want
currency backed by something tangible, sign up for 5,000 frequent
flier miles on a new Visa card.

THOMAS PETZINGER, JR., has spent 21 years as an editor, reporter and
weekly columnist at The Wall Street Journal, where he served most
recently as Millennium Editor. His latest book is The New Pioneers:
The Men and Women Who Are Transforming the Workplace and Marketplace.

LINK: Thomas Petzinger, Jr. Home Page

Stephen Kellert
The Concept Of Ecology

As Aldo Leopold once suggested, the greatest invention of the 20th
century might be the concept of ecology. It reminds us of the illusory
nature of the concept of the autonomous, unitary, individual being. We
are all -- ourselves as persons, the human animal, all species, in all
likelihood, the universe -- a constant product of relational and
transactional processes.

STEPHEN R. KELLERT, a social ecologist at Yale and E.O. Wilson's
collaborator, is recognized as the world's foremost authority on human
relationships to animals. The New York Times has featured his work in
various articles, and his research has been recognized by a number of
awards, including the Distinguished Individual Achievement Award of
the Society for Conservation Biology. Kellert is coeditor with E.O.
Wilson of The Biophilia Hypothesis and author of Kinship to Mastery
and The Value of Life.

Eric J. Hall
The World Is Losing Potable Water At A Rate That Is Unprecedented.

Quite an interesting question, and one that required both research and
introspection. I finally had to rely on what I have experienced in
travelling to six of the earth's continents.

The world is losing potable water at a rate that is unprecedented.
Saharan and sub-saharan Africa water supplies are contaminated with
bilharzia (schistosomiasis), giardia, and other water-borne
parasites and diseases. Russia has seen the Caspian Sea and other
land-locked lakes decline rapidly over the past 25 years due to
evaporation from a warmer climate and drainage to irrigate arid lands.
Populations are expanding into areas where water is scarce and
aquifers will be drained in less than 100 years at current consumption
rates. Even my favorite Sierra streams are polluted with giardia and
require treatment before you can drink the water. China, India, Asia,
and South America are also faced with the same problems of water-borne
diseases and decreasing supplies.

In the 19th century we saw battles fought over water rights in New
Mexico. What will happen in the next century when hydro projects in
one nation decrease the flow of water to a neighboring country? This
is already happening in Turkey, where a
massive hydro project threatens Iraq's water supply. Will this be
Saddam's next war?

Scientists still do not understand what the impact of desalinization
of sea water will have on the balance of the global ecosystem. The
salt removed has to go somewhere, usually back into the water. Will
increased salinity impact polar ice caps or the temperature of the
earth's oceans? We have already seen an increase in the salinity of
water in Russia, the Middle East, and even in areas in the USA.

In terms of consumption, we waste a tremendous amount of water as
individuals. More water goes down the drain or the gutter as we
continue to plant water-intensive lawns or keep the tap running while
we brush our teeth. Recent droughts in California have highlighted how
much water can be saved on a daily basis. Conversely, in many
third-world countries more time is spent each day bringing the
necessary water to the home as wells dry up. You can't be productive
if you're always fetching water.

The next global war will be fought over water, not imperialism or
political ideology. Man can live without a lot of things, but not
water. Mainstream media in all countries have been absent in their
reporting as well as conservation groups like the Sierra Club. What
will the world do when there's not enough rain to go around?

ERIC J. HALL is President of The Archer Group, a consulting firm
specializing in emerging technology companies. He has helped found
companies including Yahoo!, Women.com, The ImagiNation Network, and
rightscenter.com, Inc.

John Gilmore
The World Isn't Going to Hell

Media sells your attention to advertisers using bad news. This makes
people think the bad news is the real state of the world. Pollution is
making the world a worse place, raw materials are being used up,
bigger populations are overconsuming us into wretchedness, etc. They
even want all of us to waste our time sorting garbage for fear that
we'll run out of barren land to put it on!

The real story is that human life has gotten better and better and
better over the centuries. The world used to be a very polluted place
-- if you count deadly infectious bacteria in the environment.
Centuries of focus on clean drinking water, separating sewers from
food and water supplies, medicine, and nutrition have resulted in
human life span being literally doubled and tripled, first in
"civilized" countries and then in "developing" countries. China has
doubled life-span in this century. Everyone grows up taller and
stronger than hundreds of years ago. There is much less pollution in
London today than in any recorded century. There is much less
pollution in the US today than in any recorded decade. There are more
proven oil reserves than ever before, and in fifty years when those
have been used, another fifty or sixty years' worth will have been
worth locating. There are more acres of forest in North America than a
hundred years ago. (There's a market for growing trees now, and they
can be transported to where people want to buy the wood! Two hundred
years ago it was more work to move wood twenty miles in carts on mud
roads, than it was to take it across the Atlantic!) Resources of all
types are getting cheaper and cheaper, as measured in decades and
centuries. There is no reason to believe that these trends will
change. (Remember the people like Paul Ehrlich who predicted world
famine by 2000? They are still making predictions, but you shouldn't
believe 'em any more, because it's 2000 and the starving hordes aren't
here.)

Prof. Julian Simon was a "liberal who got mugged" -- by the facts. He
started off trying to prove the environment was getting worse, but
whenever he found actual historical data, it contradicted that thesis.
Eventually he changed his mind and started writing books about it.

Ultimate Resource 2 is his updated book about how all resources
except one-of-a-kind objects are becoming less scarce; the one
exception is humans.
(Human attention commands higher and higher prices over the decades, a
trend easily visible in the price of labor, despite there being more
humans around.)

The State of Humanity is a very well documented (footnoted) survey of
human life, health, and the environment. It points you to actual
historical data showing the real long-term trends in human longevity,
health, welfare, prices of materials, acres of forest, number of
species known, pollution, smoke: you name it, he's got it.

Knee-jerk liberals beware! These are the kinds of books that you won't
like to read. The interior feeling of a mind stretching is
uncomfortable, though the result is well worth it.

JOHN GILMORE is an entrepreneur disguised as a philanthropist. Or
perhaps vice versa. He co-founded the Electronic Frontier Foundation,
the "alt" newsgroups, the Cypherpunks, and the 1989 open-source company
Cygnus Solutions. He's been pushing encryption technology out of
government spy agency control for 15 years. He's a big believer in
civil rights, even for Internet users and those who like drugs. He
thinks nation-states have killed more people than any other single
force in history, and wishes to see them wither to a tiny fraction of
their current size and power over their citizens. And he refuses to
"recycle" his trash -- if it's worth it to you, you can recycle it.

Sally M. Gall
The Potentially Glorious, or Dangerous, Massive Cultural Impact of
Mitochondrial DNA Studies.

The increasing use of mitochondrial DNA in determining genetic
relationships among human beings opens up the extraordinary
possibility of a global registry in which every individual knows his
or her antecedents and degree of genetic closeness to all other living
human beings.

What would be the result of such knowledge? A delight in finding out
that we are all more or less brothers and sisters under the skin,
leading -- one hopes -- to a decrease in hostilities between
antagonistic groups? Or would a new clannishness emerge in which
anyone who is more, say, than six degrees of genetic separation from
oneself is identified as a natural enemy?

SALLY M. GALL holds degrees from Harvard/Radcliffe and NYU. A
librettist, poet, critic, and scholar, she now specializes in writing
texts for a broad range of music drama.

Rodney A. Brooks 
People are Morphing into Machines.

Since I work in building autonomous humanoid robots, reporters always
ask me what will happen when the robots get really smart. Will they
decide that we (us, people) are useless and stupid and take over the
world from us? I have recently come to realize that this will never
happen. Because there won't be any us (people) for them (pure robots)
to take over from.

Barring an asteroid-sized thwack that knocks humans back into
pre-technological society, humankind has embarked on a journey of
technological manipulation of our bodies. The first few decades of the
new millennium will be a moral battleground as we question, reject,
and accept these innovations. Different cultures will accept them at
different rates (e.g., organ transplantation is currently routine in
the United States, but unacceptable in Japan), but our ultimate nature
will lead to widespread adoption.

And just what are these technologies? Already there are thousands of
people walking around with cochlear implants, enabling the formerly
deaf to hear again -- these implants include direct electronic to
neural connections. Human trials have started with retina chips being
inserted in blind people's eyes (for certain classes of blindness,
such as macular degeneration), enabling simple perceptions. Recently I
was confronted with a researcher in our lab, a double leg amputee,
stepping off the elevator that I was waiting for -- from the knees up
he was all human, from the knees down he was robot, and prototype
robot at that -- metal shafts, joints full of magneto-restrictive
fluids, single board computers, batteries, connectors, and wire
harnesses flopping everywhere; not a hint of antiseptic packaging --
it was all hanging out for all to see. Many other researchers are
placing chips in animal, and sometimes human, flesh and letting
neurons grow and connect to them. The direct neural interface between
man and machine is starting to happen. At the same time surgery is
becoming more acceptable for all sorts of body modifications -- I
worry that I am missing the boat carrying these heavy glasses around
on my nose when everyone else is going down to the mall and having
direct laser surgery on their eyes to correct their vision. And at the
same time cellular level manipulation of our bodies is becoming real
through genetic therapies.

Right now we ban Olympic athletes who have used steroids. Fairly soon
we may have to start banning kids with neural Internet connection
implants from having them switched on while taking the SATs. Not long
after that it may be virtually mandatory to have one in order to have
a chance taking the new ISATs (Internet SATs).

We will become a merger between flesh and machines, and we (the
robot-people) will be a step ahead of them (the pure robots). We won't
have to worry about them taking over.

RODNEY A. BROOKS is Director of the MIT Artificial Intelligence
Laboratory, and Fujitsu Professor of Computer Science. He is also
Chairman and Chief Technical Officer of IS Robotics, an 85-person
robotics company. Dr. Brooks also appeared as one of the four
principals in the Errol Morris movie "Fast, Cheap, and Out of Control"
(named after one of his papers in the Journal of the British
Interplanetary Society) in 1997 (one of Roger Ebert's 10 best films of
the year).

Further reading on Edge:"The Deep Question" A Talk With Rodney Brooks.

LINKS: Rodney A. Brooks Home Page; and Cog: A Humanoid Robot.

John McWhorter 
The Transformation Of The American Musical Ear

There are now two generations of Americans who have grown up after the
rock revolution of the late 1960s, for whom classical music and the
old style Broadway/Hollywood songs are largely marginal. As a result,
today's typical American ear is attuned more to rhythm and vocal
emotion -- the strengths of rock and rap -- than to melody and
harmony, the strengths of classical music and Golden Age pop. This is
true not just of teenagers but of people roughly fifty and under, and
has been the most seismic shift in musical sensibility since the
advent of ragtime introduced the American ear to syncopation a century
ago.

A catchy beat is not just one element, but the sine qua non in most
pop today, opening most songs instead of the instrumental prelude of
the old days. The increasing popularity of rhythm-centered Third World
pop (pointedly called "World Beat") underscores this change in taste.
Certainly folks liked a good beat before Elvis, but much of even the
most crassly commercial dance music before the 1950s was couched in
melody and harmony to a degree largely unknown in today's pop. Our
expectations have so shifted that the rock music that critics today
call "melodic" would sound like Gregorian chants to members of even
the cheesiest little high school dance band in 1930.

Yet what pop has lost in craft it has gained in psychological
sophistication, and the focus on vocal emotion is part of this. In the
old days, singers made their marks as individuals, no doubt, but, for
example, Sinatra's artistry was in being able to suggest a range of
emotions within the context of rather homogenous lyrics. Modern pop
singers like Alanis Morissette are freed from these constraints, and
the variety and individuality of many modern pop lyrics have made them
America's true poetry; indeed, many listeners relate to the lyrics of
their favorite rock singers with an intensity our grandparents were
more likely to devote to the likes of Robert Frost.

Yet the fact remains that for the typical American of the future,
melody and harmony will be as aesthetically marginal as they are to
the African musician whose music is based on marvelously complex
rhythms, with a vocal line serving largely rhythmic and/or decorative
ends (notably, World Beat listeners are little concerned with not
understanding most of the lyrics; it's the vocal texture that
matters). Lyrics will continue to count, but their intimate linkage to
musical line will be of no more concern than individual expression or
complex rhythm was to pop listeners sixty years ago. I once attended a
screening of a concert video from the mid-1960s in which Sammy Davis,
Jr., who occupied the transitional point between the old and the
current sensibility, sang Cole Porter's "I've Got You Under My Skin"
first "straight", and then without accompaniment, eventually moving
into scatting and riffing rhythmically to the merest suggestion of the
written vocal line for a good few minutes, in a vein we would today
call "performance art". Young hipsters behind me whispered "This is
rad!"; a few seconds later I heard an elderly woman in the front row
mumble "Enough of this is enough!" She would have been happy to hear
Davis simply sing it through once with the orchestra; the hipsters
wouldn't have minded Davis walking out and doing only the vocal riffs
-- and they are the American musical ear of today and tomorrow.

JOHN H. MCWHORTER is Assistant Professor of Linguistics at the
University of California at Berkeley. He taught at Cornell University
before entering his current position at Berkeley. He specializes in
pidgin and creole languages, particularly of the Caribbean, and is the
author of Toward a New Model of Creole Genesis and The Word on the
Street : Fact and Fable About American English. He also teaches black
musical theater history at Berkeley and is currently writing a musical
biography of Adam Clayton Powell, Jr.

Further reading on Edge: "The Demise of Affirmative Action at
Berkeley": An Essay by John McWhorter.

Stewart Brand 
The Peace Dividend

Whatever happened to looking for the Peace Dividend? What if the
rampant prosperity in America these years is it? Money not spent on
Defense gets spent on something. Research not sequestered into Defense
applications gets loose into the world faster, and in pace with other
events in technology and science. Policies not organized around
paranoia can be organized around judicious optimism. The more former
enemies there are, the more new customers and suppliers.

(Are Democrats getting credit for something that Republicans did --
winning and ending the Cold War? But maybe Democrats are exactly who
you want running things when a long debilitating war ends.)

Question: if the Peace Dividend is prosperity, is it a blip good for
only a couple years, or is it a virtuous circle that goes on and on?

STEWART BRAND is founder of the Whole Earth Catalog, cofounder of The
Well, cofounder of Global Business Network, cofounder and president of
The Long Now Foundation. He is the original editor of The Whole Earth
Catalog, author of The Media Lab: Inventing the Future at MIT, How
Buildings Learn, and The Clock of the Long Now: Time and
Responsibility (MasterMinds Series).

Further reading on Edge: "The Clock of the Long Now" A Talk by Stewart
Brand; and Chapter 3, "The Scout" in Digerati.

LINKS: Global Business Network; and The Long Now Foundation.

Judith Rich Harris
Parenting Styles Have Changed But Children Have Not

What stories are most likely to go unreported? Those that have to do
with things that happen so gradually that they aren't noticed, or
happen so commonly that they aren't news, and those that have
politically incorrect implications.

A story that has gone unreported for all three reasons is the gradual
and pervasive change in parenting styles that has occurred in this
country since the 1940s, and the consequences (or lack of
consequences) of that change.

In the early part of this century, parents didn't worry about shoring
up their children's self-esteem or sense of autonomy, and they didn't
feel called upon to provide them with "unconditional love." They
worried that their children might become spoiled, self-centered, or
disobedient. In those days, spankings were administered routinely,
often with a weapon such as a belt or a ruler. Kisses were exchanged
once a day, at bedtime. Declarations of parental love were made once a
lifetime, from the deathbed.

The gradual but dramatic change in parenting styles over the past 50
years occurred mainly because more and more parents were listening to
the advice of the "experts," and the experts' advice gradually
changed. Nowadays parents are told that spankings will make their
children more aggressive, that criticism will destroy their
self-esteem, and that children who feel loved will be kinder and more
loving to others. As a result of this advice, most parents today are
administering far fewer spankings and reprimands, and far more
physical affection and praise, than their grandparents did.

But that's only half the story. The other half is the results, or lack
of results, of this change in parenting styles. Are today's children
less aggressive, kinder, more self-confident, or happier than the
children of two generations ago? If anything, the opposite is true.
Rates of childhood depression and suicide, for example, have gone up,
not down. And certainly there has been no decline in aggressiveness.

The implications, whatever they are, are bound to be politically
incorrect. Perhaps the "experts" don't know what they're talking
about. Perhaps parenting styles are less important than people have
been led to believe. Perhaps human nature is more robust than most
people give it credit for -- perhaps children are designed to resist
whatever their parents do to them. It's possible that being hit by a
parent doesn't make children want to go right out and hit their
playmates, any more than being kissed by a parent makes them want to
go right out and kiss their playmates. It's even possible (dare I
suggest it?) that those parents who are still doling out a lot of
punishment have aggressive kids because aggressiveness is, in part,
passed on genetically.

But now I'm getting into a story that HAS been reported.

JUDITH RICH HARRIS is a writer and developmental psychologist;
co-author of The Child: A Contemporary View Of Development; winner of
the 1997 George A. Miller Award for an outstanding article in general
psychology, and author of The Nurture Assumption: Why Children Turn
Out The Way They Do.

Further reading on Edge: "Children Don't Do Things Halfway". A Talk
with Judith Rich Harris; Judith Rich Harris Comments on Frank J.
Sulloway's Talk "How Is Personality Formed?".

LINK: The Nurture Assumption Web Site.

Lee Smolin 
The Internationalization of the Third Culture

The internationalization of the third culture, by which I mean the
growth of a class of people who do creative work of some kind
(science, arts, media, business, technology, finance, fashion...) who
live and work in a country other than their own, are married to such a
person, or both. This is not a new situation, but what is new is the
extent to which the combination of inexpensive air travel, the
telephone, the Internet, and computer technologies makes living and
working outside of one's native country not only easy but increasingly
attractive for a growing proportion of people in these professions.
This is a natural consequence of the internationalization of these
areas, which has made frequent international travel, and periods of
studying and working abroad the norm rather than the exception. It is
made possible by the ascendancy of English as a global language and
the long period in which the developed world has been more or less at
peace. With the end of the cold war, the growth of democracies in
Latin America and the Far East and the unification of Europe there
remain few significant political obstructions to the growth in size
and influence of a denationalized community of people who work in
exactly those areas which are most critical for shaping the human
future.

This class of people shares not only a common language and a common
set of tastes in food, clothing, coffee, furniture, housing,
entertainment, etc., but is increasingly coming to share a common
political outlook, one far more international than that of the old
literary cultures, each based on a national language and history. It
is perhaps too early to characterize this outlook, but
it involves a mix of traditional social democratic and environmental
concerns with an interest (or perhaps self-interest) in the links
between creative work, international exchange of ideas and
technologies and economic growth. Moreover, they share an interest in
the conditions which make their lives possible, which are peace,
stability, democracy and economic prosperity, and these are more
important to them than the nationalist concerns of their native
countries. It is not surprising that the daily experience of juggling
different languages, identities and cultures gives these people a much
more optimistic outlook concerning issues such as pluralism and
multiculturalism than those from the literary cultures. Most of them
feel an attachment and identification to their native culture, but
they also feel alienated from the party politics and petty
nationalisms of their home countries. When they move to a new country
they do not immigrate in the traditional sense, rather they enter a
denationalized zone in which their colleagues and neighbors come from
an array of countries and the place where they happen to be is less
important than the work they do.

How they and their children will resolve these different loyalties is
far from clear. One can meet young people whose parents each speak a
different language, who grew up in a third country, did a university
education in a fourth, and now work in a fifth. What the political
loyalties of such people will be is impossible to predict, but it
seems not impossible that the growing concentrations of such people in
the areas of work that most influence public taste and economic growth
may catalyze the evolution of nation states into local governments and
the invention of a global political system.

LEE SMOLIN is a theoretical physicist; professor of physics and member
of the Center for Gravitational Physics and Geometry at Pennsylvania
State University; author of The Life of The Cosmos.

Further Reading on Edge: "A Theory of the Whole Universe" in The Third
Culture; "A Possible Solution For The Problem Of Time In Quantum
Cosmology" by Stuart Kauffman and Lee Smolin.

Roger Schank 
The Politicians Who Are Running On Education Platforms Don't Actually
Care About Education

As each presidential candidate makes education the big issue for his
campaign, we need to understand that none of them actually wants
change in education. There are two main reasons for this. The first is
vested interests. Those who oppose real change in our schools include
teachers' unions, who lobby heavily for the status quo; book
publishers, who are afraid of losing the textbook investments they
have made; testing services and test preparation services, which have
a significant investment in keeping things as they are; and parents,
who really would be quite frightened if the schools changed in a way
that made their own educations seem irrelevant. No politician wants to
challenge a group like this, so no politician wants to do any more
than pay lip service to the issue. The second reason is more
insidious. When real
reformers propose that everyone should be equally educated there are
gasps from the elitists who run our government. Their concern? If
everyone were educated, who would do the menial jobs? Hard as it may
be to believe, this issue is raised quite often in Washington.

ROGER SCHANK is the Chairman and Chief Technology Officer of
Cognitive Arts and has been the Director of the Institute for the
Learning Sciences at Northwestern University since its founding in
1989. He holds three faculty appointments at Northwestern University
as John Evans Professor of Computer Science, Education, and
Psychology. His books include: Dynamic Memory: A Theory of Learning in
Computers and People , Tell Me a Story: A New Look at Real and
Artificial Memory, The Connoisseur's Guide to the Mind, and Engines
for Education, and Virtual Learning: A Revolutionary Approach to
Building a Highly Skilled Workforce.

Further reading on Edge: "Information is Surprises" -- Roger Schank in
The Third Culture; and "The Disrespected Student -- or -- The Need for
the Virtual University": A Talk with Roger Schank.

LINKS: Cognitive Arts; and Institute for the Learning Sciences.

Howard Gardner
(1) African Civil Wars; (2) Evolutionary Theory Is Not Intuitive;
Creationism Is

Two stories, one of global importance, the other of importance in the
areas in which I work:

Global: At any one time in recent years, there have been civil wars
raging in several countries in Africa. Thousands of individuals die
each year. Last year, according to one authority, 1/3 of the African
countries were at war. Yet because the political aspects of these
conflicts are no longer of interest to Americans (because the Cold War
is over), the economic stakes have no global importance, and African
populations do not capture the attention of well-off Westerners, one
needs to be a specialist to find out the details of these conflicts.
The contrast with the attention paid to the death of an American
youth, particularly one from a middle-class family, is shocking and
hard to justify sub specie aeternitatis. Of course, mere knowledge of
these conflicts does not in itself solve anything; but it is a
necessary step if we are to consider what might be done to halt this
violence.

Local: In my own areas of psychology and education, there is plenty of
interest nowadays in student achievement in schools. Yet the coverage
of these matters in the press almost entirely leaves out knowledge
which enjoys wide consensus among researchers. In the area of human
development, it is recognized that youngsters pass through stages or
phases, and it makes no sense to treat a four year old as if he or she
were simply a "slower" or less informed middle school student or
adult. In the area of cognitive studies, it is recognized that
youngsters "construct" their own theories by which they attempt to
make sense of the world; and that these intuitive theories often fly
directly in the face of the theories and disciplines which we hope
that they will ultimately master.

Because these points are not well understood by journalists,
policymakers, and the general public, we keep implementing policies
that are doomed to fail. Efforts to teach certain materials in certain
ways to youngsters who aren't ready to assimilate them will not only
be ineffective but they are likely to cause children to come to
dislike formal education. And efforts to instruct that fail to take
into account -- and challenge -- the often erroneous theories that
youngsters have already developed will delude us into thinking that
the students are actually understanding materials that remain opaque
to them.

I think that this happens because as humans we are predisposed to come
up with this theory of learning: Our minds are initially empty and the
job of education is to fill those vessels with information. It is very
difficult for humans to appreciate that the actual situation is quite
different: in our early years, we construct all kinds of explanations
for things. Our scholarly disciplines can only be mastered if we get
rid of these faulty explanations and construct, often slowly and
painfully, better kinds of explanations. Put sharply, evolutionary
theory is not intuitive; creationism is. And that is why
eight-year-olds are invariably creationists, whether their parents are
fundamentalists or atheists.

HOWARD GARDNER, the major proponent of the theory of multiple
intelligences, is Professor of Education at Harvard University and
holds research appointments at the Boston Veteran's Administration
Medical Center and Boston University School of Medicine. His numerous
books include Leading Minds, Frames of Mind, The Mind's New Science: A
History of the Cognitive Revolution, To Open Minds, Multiple
Intelligences, and Extraordinary Minds: Portraits of Four Exceptional
Individuals. He has received both a MacArthur Prize Fellowship and the
Louisville Grawemeyer Award.

Further Reading on Edge: "Truth, Beauty, and Goodness: Education for
All Human Beings".

Douglas Rushkoff 
America's Descent Into Computer-Aided Unconsciousness And Consumer

We have taught our machines to conduct propaganda. Web sites and other
media are designed to be "sticky," using any means necessary to
maintain our attention. Computers are programmed to stimulate
Pavlovian responses from human beings, using techniques like
one-to-one marketing, collaborative filtering, and hypnotic
information architecture.

Computers then record our responses in order to refine these
techniques, automatically and without the need for human intervention.
The only metric used to measure the success of banner ads and web
sites is the amount of economic activity -- consumption and production
-- they are able to stimulate in their human user/subjects. As a
result, the future content and structure of media will be designed by
machines with no priority other than to induce spending.

It amounts to a closed feedback loop between us and our computers,
where -- after their initial programming -- the machines take the active
role and human beings behave automatically. Programs adjust themselves
in real time, based on their moment to moment success in generating
the proper, mindless responses from us. In fact, computers and
software are already charged with the design of their own successors.
They are encouraged to evolve, while we are encouraged to devolve into
impulsive, thoughtless passivity.

Those who stand a chance of resisting -- people who actually think --
are rewarded handsomely for their compliance, and awarded favorable
media representations such as "geek chic." These monikers are reserved
for intelligent people who surrender their neural power to the
enhancement of the machine, by becoming vested web programmers, for
example. Those who refuse to suspend active thought are labeled
communist, liberal, or simply "unfashionably pessimistic." Worse, they
are unfaithful enemies of NASDAQ, and the divinely ordained expansion
of the US economy.

Ultimately, if such a story were actually reported, it would have to
dress itself in irony, or appear as the result of an abstract
intellectual exercise, so as not to attract too much attention.

DOUGLAS RUSHKOFF, a Professor of Media Culture at New York
University's Interactive Telecommunications Program, is an author,
lecturer, and social theorist. His books include Free Rides, Cyberia:
Life in the Trenches of Hyperspace, The GenX Reader (editor), Media
Virus! Hidden Agendas in Popular Culture, Ecstasy Club (a novel),
Playing the Future, and Coercion: Why We Listen to What "They" Say.

Further reading on Edge: "The Thing That I Call Doug: A Talk with
Douglas Rushkoff".

LINK: Doug Rushkoff Home Page.

George B. Dyson

The Size Of The Digital Universe

Cosmologists have measured the real universe with greater precision
than any reportable metric encompassing the extent of the digital
universe -- even though much of it is sitting on our desks.

Sales figures of all kinds are readily available, whereas absolute
numbers (of processors, CPU cycles, addressable memory, disk space,
total lines of code) are scarce. We are left to make rough
approximations (skewed by volatility in prices) where there could and
should be a precise, ongoing count.

Have we reached Avogadro's number yet?

GEORGE DYSON is a leading authority in the field of Russian Aleut
kayaks -- the subject of his book Baidarka, numerous articles, and a
segment of the PBS television show Scientific American Frontiers. His
early life and work was portrayed in 1978 by Kenneth Brower in his
classic dual biography, The Starship and the Canoe. Now ranging more
widely as a historian of technology, Dyson's most recent book is
Darwin Among the Machines.

Further reading on Edge: "Darwin Among the Machines; or, The Origins
of Artificial Life"; and "CODE - George Dyson & John Brockman: A

Pattie Maes

What Scientific Research Is Really Like

The public still thinks of research as a very serious and lonely
activity. The picture of a scientist that typically comes to mind is
that of a person in a labcoat hunched over heavy books, locked up in
an ivory tower. The truth is that scientists are more like a group of
uninhibited, curious kids at play. Maybe teenagers would be more
interested in science if they had a more accurate picture of what
research is like and realized that one way to avoid growing up is to
become a scientist.

PATTIE MAES is an Associate Professor at MIT's Media Laboratory, where
she founded and directs the Software Agents Group, and is principal
investigator of the e-markets Special Interest Group. She currently
holds the Sony Corporation Career Development Chair.

Further reading on Edge: Intelligence Augmentation -- A Talk With
Pattie Maes.

LINK: Pattie Maes' Home Page.

Mihaly Csikszentmihalyi

The Reasons For Right-Wing Extremism In Europe And The U.S.

Today (but I hope not tomorrow) I think the most important unreported
story concerns the reasons for a return of right-wing extremism in
Europe and, for the first time, in the U.S. Since I am not a journalist,
I would not report such a story, but I would first find out if it is
really true, and if true then study what its causes are. Is it that
people are running out of hope and meaning? Have the Western
democracies run out of believable goals? What conditions favor fascism
and what can we do to prevent them from spreading?

MIHALY CSIKSZENTMIHALYI (pronounced "chick-SENT-me high"), a
Hungarian-born polymath and the Davidson Professor of Management at
the Claremont Graduate University in Claremont, California, has been
thinking about the meaning of happiness since he was a child in
wartime Europe. He is author of Flow: The Psychology Of Optimal
Experience;
The Evolving Self: A Psychology For The Third Millennium; Creativity;
and Finding Flow.

LINK: FlowNet.

William H. Calvin

Abrupt Climate Change

That's easy: abrupt climate change, the sort of thing where most of
the earth returns to ice-age temperatures in just a decade or two,
accompanied by a major worldwide drought. Then, centuries later, it
flips back just as quickly. This has happened hundreds of times in the
past.

The earth's climate has at least two modes of operation that it flips
between, just as your window air-conditioner cycles between fan and
cool with a shudder. And it doesn't just settle down into the
alternate mode: the transition often has a flicker like an aging
fluorescent light bulb. There are sometimes a half-dozen whiplash
cycles between warm-and-wet and cool-and-dusty, all within one
madhouse century. On a scale far larger than we saw in the El Niño
several years ago, major forest fires denude much of the human
habitat.

To the extent the geophysicists understand the mechanism, it's due to
a rearrangement in the northern extension of the Gulf Stream. A number
of computer simulations, dating back to 1987, of the winds and ocean
currents have shown that gradual global warming can trigger such a
mode switch within several centuries, mostly due to the increased
rainfall into the northern North Atlantic Ocean (if the cold salty
surface waters are diluted by fresh water, they won't flush in the
usual manner that allows more warm water to flow north and lose its
heat). Meltwater floods from Iceland and Greenland will do the job if
tropical-warming-enhanced rainfall doesn't.

This has been the major story in the geophysical sciences of the last
decade. I've been puzzled since 1987 about why this story hasn't been
widely reported. A few newspapers finally started reporting the story
in some detail two years ago but still almost no one knows about it,
probably because editors and readers confuse it with gradual climate
change via greenhouse gases. This longstanding gradual warming story
seems to cause the abrupt story to be sidetracked, even though another
abrupt cooling is easily the most catastrophic outcome of gradual
warming, far worse than the usual economic and ecological burden.

How would I report it? Start with the three million year history of
abrupt coolings and how they have likely affected prehuman evolution.
Our ancestors lived through a lot of these abrupt climate changes, and
some humans will survive the next one. It's our civilization that
likely won't, just because the whiplashes happen so quickly that
warfare over plummeting resources leaves a downsized world where
everyone hates their neighbors for good reason. Fortunately, if we get
our act together, there are a few things we might do to stabilize the
patient, buying some extra time in the same manner as preventive
medicine has extended the human lifespan.

WILLIAM H. CALVIN is a theoretical neurophysiologist on the faculty of
the University of Washington who writes about the brain and evolution.
Among his many books are How Brains Think, The Cerebral Code, and
Lingua ex Machina. He is the author of a cover story for The Atlantic
Monthly, "The Great Climate Flip-flop," January 1998, and a
forthcoming book, Cool, Crash and Burn: The Once and Future Climate of
Human Evolution.

Further reading on Edge: "Competing for Consciousness: How
Subconscious Thoughts Cook on the Backburner": A Talk by William H.
Calvin.

LINK: William Calvin Home Page.

  Add your own suggestions for Today's Most Important Unreported Story
  at the Edge discussion forum at
