From checker at panix.com Sun Jan 1 00:30:12 2006
From: checker at panix.com (Premise Checker)
Date: Sat, 31 Dec 2005 19:30:12 -0500 (EST)
Subject: [Paleopsych] Meme 055: The Origins of the Nudity Taboo
Message-ID:

Meme 055: The Origins of the Nudity Taboo
sent 5.12.31

It is a familiar story:

21 And the Lord God caused a deep sleep to fall upon Adam, and he slept: and he took one of his ribs, and closed up the flesh instead thereof;
22 And the rib, which the Lord God had taken from man, made he a woman, and brought her unto the man.
23 And Adam said, This is now bone of my bones, and flesh of my flesh: she shall be called Woman, because she was taken out of Man.
24 Therefore shall a man leave his father and his mother, and shall cleave unto his wife: and they shall be one flesh.
25 And they were both naked, the man and his wife, and were not ashamed.

CHAPTER 3

1 Now the serpent was more subtil than any beast of the field which the Lord God had made. And he said unto the woman, Yea, hath God said, Ye shall not eat of every tree of the garden?
2 And the woman said unto the serpent, We may eat of the fruit of the trees of the garden:
3 But of the fruit of the tree which is in the midst of the garden, God hath said, Ye shall not eat of it, neither shall ye touch it, lest ye die.
4 And the serpent said unto the woman, Ye shall not surely die:
5 For God doth know that in the day ye eat thereof, then your eyes shall be opened, and ye shall be as gods, knowing good and evil.
6 And when the woman saw that the tree was good for food, and that it was pleasant to the eyes, and a tree to be desired to make one wise, she took of the fruit thereof, and did eat, and gave also unto her husband with her; and he did eat.
7 And the eyes of them both were opened, and they knew that they were naked; and they sewed fig leaves together, and made themselves aprons.
8 And they heard the voice of the Lord God walking in the garden in the cool of the day: and Adam and his wife hid themselves from the presence of the Lord God amongst the trees of the garden.
9 And the Lord God called unto Adam, and said unto him, Where art thou?
10 And he said, I heard thy voice in the garden, and I was afraid, because I was naked; and I hid myself.
11 And he said, Who told thee that thou wast naked? Hast thou eaten of the tree, whereof I commanded thee that thou shouldest not eat?
12 And the man said, The woman whom thou gavest to be with me, she gave me of the tree, and I did eat.
13 And the Lord God said unto the woman, What is this that thou hast done? And the woman said, The serpent beguiled me, and I did eat.
14 And the Lord God said unto the serpent, Because thou hast done this, thou art cursed above all cattle, and above every beast of the field; upon thy belly shalt thou go, and dust shalt thou eat all the days of thy life:
15 And I will put enmity between thee and the woman, and between thy seed and her seed; it shall bruise thy head, and thou shalt bruise his heel.
16 Unto the woman he said, I will greatly multiply thy sorrow and thy conception; in sorrow thou shalt bring forth children; and thy desire shall be to thy husband, and he shall rule over thee.
17 And unto Adam he said, Because thou hast hearkened unto the voice of thy wife, and hast eaten of the tree, of which I commanded thee, saying, Thou shalt not eat of it: cursed is the ground for thy sake; in sorrow shalt thou eat of it all the days of thy life;
18 Thorns also and thistles shall it bring forth to thee; and thou shalt eat the herb of the field;
19 In the sweat of thy face shalt thou eat bread, till thou return unto the ground; for out of it wast thou taken: for dust thou art, and unto dust shalt thou return.
20 And Adam called his wife's name Eve; because she was the mother of all living.
21 Unto Adam also and to his wife did the Lord God make coats of skins, and clothed them.
22 And the Lord God said, Behold, the man is become as one of us, to know good and evil: and now, lest he put forth his hand, and take also of the tree of life, and eat, and live for ever:
23 Therefore the Lord God sent him forth from the garden of Eden, to till the ground from whence he was taken.
24 So he drove out the man; and he placed at the east of the garden of Eden Cherubims, and a flaming sword which turned every way, to keep the way of the tree of life.

So amongst the knowledge of good and evil that came from eating the forbidden fruit is that nudity is shameful. (How much knowledge came is not stated in the text, but the Lord God felt it necessary to reveal other knowledge later, the most famous being the Ten Commandments, indeed up to 613 Mitzvot by the end of the Old Testament and a few more in the New, repealing several of the ones from the Old Testament as no longer being pertinent under the New Covenant. Gary North takes the unusual position, which I call "eastern," that what was not repealed remains in effect, while most Christians take the "western" position that everything not explicitly continued in the New Testament was repealed.)

What has puzzled me for a good while is that I can think of no sociobiological explanation for the nudity taboo. Humans did not wear clothes in the EEA! Nevertheless, my children started objecting to seeing their parents naked around the time they hit puberty. It was not part of their upbringing, and our nudity was entirely casual.

When Hillary was touting health care reform, I suggested (without anyone contradicting me) that a better way to improve health would be to require that we go naked when it is hot. We'd start paying far more attention to our appearance.

If no explanation is forthcoming, perhaps it was divine intervention after all that instilled this particular bit of knowledge of good and evil.
The Bayesian "priors" of nonbelievers would have to be drastically readjusted. And the Lord God would have instilled this shame of being naked into all humans, not just those who lived in the Near East.

In the article below, you'll find a not at all untypical anti-Christian rant. I am in no position to counter what it says about Japanese families jumping into a hot tub together, and it is definitely the case that Greek athletes (male, not female, if there were any) performed in the raw. But the general argument is bogus. In all societies (are there any full exceptions?), completely casual nudity does not exist. Otherwise, I'd go naked at the office when it got too hot, as would some of my co-workers. Outside, I'd wear running shoes and a jock strap. But only in a society in which casual nudity is the norm. I have no intention of making a big issue of this or joining a nudist colony.

It is also the case that nowhere is homosexuality allowed free rein. It takes the efforts of scholars to decipher the "code words" for homosexual activities in the writings of the Greeks, which did allow for certain homosexual behavior under tightly controlled circumstances. But homosexual activity was never just casual. That's why "code words" got used.

One notion in theology is that God reveals rules of conduct to men who cannot figure these rules out for themselves, given the state of knowledge at the time. A theist holds that men in fact announced these rules and attributed them to a god or gods, in every religion except his own religion. An atheist, one who believes in one less god than a monotheist, thinks all the commands were man-made. The world's religions have a huge overlap in their rules, as has been observed many times, but also rules that are unique to each religion.

St. Paul's letters spend a great deal of time discussing Jewish-Gentile relations and what Jewish and Gentile converts must do differently.
The latter did not have to obey all the Jewish laws, but Paul became exercised over whether newly converted pagans could go ahead and eat meat that had been ritually sacrificed by those who were still pagan. He offered arguments against it, though not quoting the arguably Jewish-specific Second Commandment ("Thou shalt have no other gods before me") itself, but he was not adamant. Paul was adamant, though, about homosexuality, and explicitly carried over the Old Testament injunctions. Not surprisingly, theological liberals interpret this out of existence. (I got Jacques Berlinerblau, _The Secular Bible: Why Nonbelievers Must Take Religion Seriously_ (Cambridge UP, 2005) for Christmas, but I have not read it yet.)

Well, here goes the rant against Christianity. What remains is the puzzle of why Adam and Eve were ashamed.

BOOK VIII. RESTORING "FAMILY VALUES" (THIRD SUB-PART, ASPECTS OF CHANGE)
http://www.agnostic.org/BIBLEI.htm
[from a group called The Agnostic Church]

But Just Whose Family Values? (Continued)

A. Side Effects Of Change

The key values identified for change, if actually altered by mankind, will naturally result in a number of changes to other values which Western Civilization generally holds. This Section is intended to discuss some of those values which will also change.

1. No Nudity Taboo

Christianity is so obsessed with the thought that enjoyment of sex is bad that many of the so-called "Family Values" of the Christian "Right" are actually designed to suppress any thoughts of sex as an enjoyable activity.1 One of those values which should naturally be discarded is the nudity taboo.

It is almost comical to watch modern parents raise their children within the bounds of the current system of taboos. It is OK for a very small child to go to the bathroom with a parent of the opposite sex, but once the child is old enough to be potty trained, the child must go to the proper bathroom by itself.
It is OK for prepubescent children of the opposite sex to sleep in the same bed, but if puberty approaches for either one, it is taboo. This taboo even manifests itself in our building codes, which specify a maximum age by which opposite sex children can sleep in the same bedroom.

The essential thought of the Christian "Right" is that if we keep our children from viewing any nudity and sex, or finding out about such subjects in any way, we will quite naturally prevent our children from having sex before marriage. This creates an essential tension between knowledge and freedom. The freedoms of all people, including adults, must be restricted in order to prevent any young person from coming into contact with any depictions of nudity or sexual activity. Presently, our laws draw certain artificial lines at various ages, currently 13, 17, and/or 18,2 allowing increased freedom for them to view movies depicting nudity and/or sexual activity once they achieve those ages. Of course this is silly because social statistics show that roughly 85% of eighteen-year-old children are not virgins, meaning that most eighteen-year-olds have had sex.

Other cultures have not created such a widespread nudity taboo. For example, it is considered normal in Japan for an entire family to jump into a hot tub together, in the nude, and there is a similar lack of nudity taboos in such things as public baths. To me, things like this clearly show that nudity taboos are totally artificial.

If the values of our culture allow, and even strongly encourage, children as young as age seven to pair up as couples and get married to one another, long before puberty is even an issue, then there is no longer any reason to maintain a nudity taboo. Accordingly, the nudity taboo should, quite naturally, be assigned to the trash as part of the altered system of values proposed in this book. This does not mean that I am advocating the so-called "nudist life style."
There are good and valid reasons why we should not expose our skin to the sun any more than is absolutely necessary. The nudists essentially accept the trade-off of increasingly bad skin as they get older in return for their pleasure in thumbing their noses at the system by going around in the nude. The only reason I will mention the nudists is that it is usually accepted that they are a valid alternative life style, and thus they are proof that there is not a direct correlation between nudity and sexual activity.

2. Unisex Bathrooms

Our present system of bathroom facilities is strongly based on the nudity taboo. If the nudity taboo is discarded, then there is no longer any real reason to continue to build separate bathroom facilities for each sex.

This concept is already partially implemented in our society. For instance, it is less common to see portable toilets labeled for one sex or the other. Many gas stations have converted to unisex rest room facilities, simply putting the international logos for both sexes on the same door. I have not observed any really strong reaction against these trends, so I believe it is time to simply state that it will be better for all of us to recognize this concept and agree to share and share alike.3

In the long run, construction costs will be lower for buildings, convenience will be improved, and some level of aggravation will be removed from our lives if we simply agree that any public rest room is unisex, no matter what label happens to be on the door. This change is simply a natural consequence of abolishing the nudity taboo, and as I have pointed out, it is already widely accepted in our society.

3. No Pleasure Taboos

An important side effect of all of this will be to remove most, if not all, of the many proscriptions of pleasurable activity which now exist in our system of laws.
There will no longer be any need to prohibit sexual activity according to age, because our society will ensure that most people are part of a stable marriage long before they are physically mature enough to have sex. Similarly, we should remove the age limits on the drinking of alcoholic beverages so that our young people can get their training on what their own personal limits for the consumption of alcohol are long before they would be in any significant physical danger from overindulgence.

Part of the culture of Western Civilization is that alcoholic beverages are heavily taxed because drinking is a "sin" and, to the extent which our society has elected to tolerate such "sin," taxing it heavily acts as a disincentive to "sin." Thus, the easiest way for any politician to propose raising revenue for the government is to propose a "sin tax" of some sort, and alcohol is usually right at the top of the list. These things all derive from the basic Christian concept that pleasure is sinful, and thus indulgence in pleasure must be discouraged for the good of your eternal soul.

One of the side effects of adopting a Dionysian component into our culture will be the fact that pleasure is now not only acceptable, it is encouraged as a natural part of being human. In other words, it is no longer possible for you to be considered as being totally human unless you regularly experience pleasure. Abolishing all of the taboos against pleasure, and discarding all of the guilt which Christianity has traditionally associated with pleasure, will constitute a truly fundamental change in our moral philosophy. But there is no doubt in my mind that this change is long overdue.

_____________________________________________________

1 Christianity has been attempting to prevent sexual contact from the earliest days of the church. See, for example, some of the letters of Paul in the New Testament which even advise against marriage under the false assumption that Jesus will return any day now.
2 These ages are derived from the current movie rating system of G, PG, PG-13, R, and NC-17.

3 The disparate nature of public rest room facilities at sports venues has been a recurring topic in the media, particularly with respect to concerts, where the sexual division of the audience is at least much more equal, if not skewed in favor of a female majority.

[I am sending forth these memes, not because I agree wholeheartedly with all of them, but to impregnate females of both sexes. Ponder them and spread them.]

From checker at panix.com Sun Jan 1 02:43:07 2006
From: checker at panix.com (Premise Checker)
Date: Sat, 31 Dec 2005 21:43:07 -0500 (EST)
Subject: [Paleopsych] Getting (Too) Dirty in Bed
Message-ID:

Getting (Too) Dirty in Bed
By Emily Gertz, Grist Magazine
Posted on December 9, 2005
http://www.alternet.org/story/29218/

So you're an Enlightened Green Consumer. You buy organic food and carry it home from the local market in string bags. Your coffee is shade-grown and fair-trade, your water's solar-heated, and your car is a hybrid. But what about the playthings you're using for grown-up fun between those organic cotton sheets -- how healthy and environmentally sensitive are they?

Few eco-conscious shoppers consider the chemicals used to create their intimate devices. Yes, those things -- from vibrators resembling long-eared bunny rabbits to sleeves and rings in shapes ranging from faux female to flower power. If these seem like unmentionables, that's part of the problem: while some are made with unsafe materials, it's tough to talk about that like, well, adults. But it's necessary.

Unlike other plastic items that humans put to biologically intimate use -- like medical devices or chew-friendly children's toys -- sex toys go largely unregulated and untested. And some in the industry say it's time for that to change.
Love Stinks

Many popular erotic toys are made of polyvinyl chlorides (PVC) -- plastics long decried by eco-activists for the toxins released during their manufacture and disposal -- and softened with phthalates, a controversial family of chemicals. These include invitingly soft "jelly" or "cyberskin" items, which have grown popular in the last decade or so, says Carol Queen, Ph.D., "staff sexologist" for the San Francisco-based adult toy boutique Good Vibrations. "It's actually difficult for a store today to carry plenty of items and yet avoid PVC," Queen says. "Its use has gotten pretty ubiquitous among the large purveyors, because it's cheap and easy to work with."

In recent years, testing has revealed the potentially serious health impacts of phthalates. Studies on rats and mice suggest that exposure could cause cancer and damage the reproductive system. Minute levels of some phthalates have been linked to sperm damage in men, and this year, two published studies linked phthalate exposure in the womb and through breast milk to male reproductive issues.

A study in 2000 by German chemist Hans Ulrich Krieg found that 10 dangerous chemicals gassed out of some sex toys available in Europe, including diethylhexyl phthalates. Some had phthalate concentrations as high as 243,000 parts per million -- a number characterized as "off the charts" by Davis Baltz of the health advocacy group Commonweal. "We were really shocked," Krieg told the Canadian Broadcasting Corporation's Marketplace in a 2001 report on the sex-toy industry. "I have been doing this analysis of consumer goods for more than 10 years, and I've never seen such high results."

The danger, says Baltz, is that heat, agitation, and extended shelf life can accelerate the leaching of phthalates. "In addition, [phthalates are] lipophilic, meaning they are drawn to fat," he says.
"If they come into contact with solutions or substances that have lipid content, the fat could actually help draw the phthalates out of the plastic."

Janice Cripe, a former buyer for Blowfish -- a Bay Area-based online company whose motto is "Good Products for Great Sex" -- confirms the instability of jelly toys: "They would leak," she says. "They'd leach this sort of oily stuff. They would turn milky" and had a "kind of plasticky, rubbery odor." She stopped ordering many jelly toys during her time at Blowfish, even though their lower prices made them popular.

So what's being done to protect consumers? Well, nothing. While the U.S., Japan, Canada, and the European Union have undertaken various restrictions regarding phthalates in children's toys, no such rules exist for adult toys. In order to be regulated in the U.S. under current law, sex toys would have to present what the federal government's Consumer Product Safety Commission calls a "substantial product hazard" -- essentially, a danger from materials or design that, in the course of using the product as it's made to be used, could cause major injury or death.

But if you look at the packaging of your average mock penis or ersatz vagina, it's probably been labeled as a "novelty," a gag gift not intended for actual use. That's an important semantic dodge that allows less scrupulous manufacturers to elude responsibility for potentially harmful materials, and to evade government regulation. If you stick it somewhere it wasn't meant to go, well -- caveat emptor, baby!

It's a striking lack of oversight for a major globalized industry. The Guardian recently estimated that 70 percent of the world's sex toys are manufactured in China, and the CBC's 2001 report suggested the North American market might be worth $400 million to $500 million. More detailed figures can be hard to come by.
"In the U.S., all of the companies that manufacture adult novelties, whether they're mom-and-pop or large corporations, are privately held," explains Philip Pearl, publisher and editor in chief of AVN Adult Novelty Business, a trade magazine. "None are required to publish financial information, and none do."

Queen thinks the lack of agreed-upon standards is a major problem. She and the staff at Good Vibrations have often had to fall back on marginally relevant regulations. "I remember trying in the early '90s to track down information on an oil used on beautiful hand-carved wooden dildos -- was it safe to put into the body?" she says. "The closest comparison we could find was the regulation governing wooden salad utensils!"

Taking Things Into Their Own Hands

Metis Black, president of U.S.-based erotic-toy manufacturer Tantus Silicone, has written on the health risks of materials for Adult Novelty Business. "Self-regulation -- eventually we've got to do it," says Black, who adds that creating safe toys is what got her into the business about seven years ago. "Just like children's teething toys, we're going to have to start doing the dialogue" within the industry, Black says, to "discuss what's in toys and how it affects customers." Otherwise, she feels, government regulators will step in.

While the industry wrestles with such issues, some manufacturers and suppliers aren't waiting for regulations. Tony Levine, founder of Big Teaze Toys, says he's made his products -- including the cutely discreet, soft-plastic vibrator I Rub My Duckie -- phthalate-free from the start. "While working at Mattel as a toy designer, I was made very aware of the concerns of using only safe materials for children's products," he says. "This training has stuck with me ... We take great pride in using only the materials which meet strict toxicity safety standards for both the U.S. and the E.U."
Meanwhile, if customers select jelly playthings at Babeland, a retailer with stores in Los Angeles, New York City, and Seattle, the staff gives them a tip sheet on phthalates, and recommends using a condom with the toy. "Our goal is to help people make an educated choice, and give out as much information as we can find -- without alarming people," says Abby Weintraub, an associate manager at the company's Soho store. Babeland staff also steer willing customers toward phthalate-free alternatives, such as hard plastic, or the silicone substitute VixSkin.

Some manufacturers are also using thermoplastic elastomers instead of PVC. Vibratex recently reformulated the popular Rabbit Habit dual-action vibrator -- made famous on Sex and the City -- with this material. Vibratex co-owner Daniel Martin says the company has always used "superior grade," stable PVC formulations, and still considers the products safe, but acknowledges that customers are eager for phthalate-free tools. While alternative materials can be more expensive, Weintraub says when people have the option of choosing them, many do.

The owners of the Smitten Kitten, a Minneapolis-based retailer, opted not to carry jellies, cyberskins, or other potentially toxic toys at all when they opened about two years ago. "They're dangerous to human health, to the environment," says co-owner Jennifer Pritchett. "It's part of our philosophy to put good things in the world, and it's counter to that to sell things that are toxic."

No Sex Please, We're Skittish

So what are the other alternatives for eco-conscious pleasure-seekers? The most ecologically correct choices may be metal or hardened glass dildos -- which, with their elegant, streamlined shapes (and sometimes hefty price tags) can double as modernist sculptures if you grow weary of their sensual charms. "The glass is going to be more lasting, possibly safer, and less toxic than something that's plastic," confirms Babeland marketing manager Rebecca Suzanne.
And the eco-choices don't stop there. If you want to do your part for conservation while getting a buzz, go for the Solar Vibe, a bullet vibrator that comes wired to a small solar panel. Some vibrators come with rechargeable power packs, says Suzanne, "which is a little bit better alternative to the typical battery-run toy, where you just toss the batteries ... into the landfill."

What about accessories? The Smitten Kitten takes pride in its "animal-friendly" inventory of bondage and fetish gear. "We have some floggers that are made of nylon rope ... natural rope, and rubber," says Pritchett. "The same with the paddles, collars, cuffs, and whatnot. Totally leather-free, animal-product-free."

A few manufacturers are bringing green values directly to the adult-toy market via products that might not be out of place in the cosmetics aisle of a natural-foods mega-retailer. Offerings include Body Wax's candles made from soy and essential oils, and Sensua Organic's fruit-flavored or unflavored lubes -- one of a few lubricant lines touting either organic or all-natural formulations. "People enjoy having the option," says Weintraub. "It's like, 'I use organic face wash. Maybe I want to use organic lube, too.'"

Pritchett feels health and eco-conscious retailers are a shopper's best ally for staying safe and healthy. "So many of us are used to shopping for organic food, or ecologically safe building products, or cosmetics," she says. When people realize it's possible to shop for sex toys the same way, "you can see a light bulb go off -- they realize it's a consumer relationship and they can and should demand better products."

Choosing the most eco-correct erotic toy can seem fraught with compromises -- more akin to picking the most fuel-efficient automobile than buying a bunch of organic kale.
With no government assessment or regulation on the immediate horizon, it's up to you, the consumer, to shop carefully and select a tool that's health-safe, fits your budget, and gets your rocks off. Meanwhile, pack up that old mystery-material toy and send it back to the manufacturer with a note that they can stick it where the sun don't shine.

Emily Gertz has written on environmental politics, business, and culture for Grist, BushGreenwatch, and other independent publications. She is a regular contributor to WorldChanging.

From checker at panix.com Sun Jan 1 02:43:22 2006
From: checker at panix.com (Premise Checker)
Date: Sat, 31 Dec 2005 21:43:22 -0500 (EST)
Subject: [Paleopsych] UPI: New Map Of Asia Lacks US
Message-ID:

New Map Of Asia Lacks US
http://www.spacewar.com/news/superpowers-05zd.html

[Photo caption: Former Malaysian premier Mahathir Mohamad points at his anti-war badge after a press conference at his office in Putrajaya, 07 December 2005. Australia's hard-won entry into the inaugural East Asia summit was soured 07 December after former Malaysian premier Mahathir Mohamad said Canberra would likely be bossy and dilute the grouping's clout. AFP photo.]

By Martin Walker
Washington (UPI) Dec 08, 2005

The United States will not take part in next week's East Asia summit, but, to paraphrase a former secretary of state's phrase about the Balkan wars, the Americans most certainly have a dog in this fight.

There is a fight under way at the summit, albeit a polite and diplomatic tussle. The Japanese, with discreet but potent American backing, have already ensured that the original plan of the former Malaysian premier for a purely Asian summit was blocked. Australia and New Zealand will now be taking part in the forum, to the fury of the still-influential Mahathir Mohamad. "We are not going to have an East Asian summit.
We are going to have an East Asia-Australasia summit," Mahathir told a specially convened news conference last week to complain that the presence of Australia and New Zealand subverted his dream of a genuinely Asian forum. "Now Australia is basically European and it has made clear to the rest of the world it is the deputy sheriff to America and therefore, Australia's view would represent not the East but the views reflecting the stand of America," Mahathir added.

There was also some reluctance, discreetly fostered by China, to admit India to what was intended to be an East Asian club, but India (like Russia, but not the United States) was prepared to sign the Association of South-East Asian Nations' Treaty of Amity and Cooperation in Southeast Asia, which ASEAN nations call "the admission ticket" to the summit. A report in China's People's Daily noted this week that Russia's inclusion in the club was "simply a matter of time," and Russia will hold a separate bilateral meeting with ASEAN immediately before the summit. But it remains significant that the United States, as the region's security guarantor for decades and as its biggest market, is not welcome.

The summit is clearly emerging as an important building block in the new economic, security and political structure of Asia that is evolving, and for obvious reasons this structure is heavily influenced by China's explosive economic growth, the new reality to which the whole of Asia is learning to adapt.

As China's People's Daily noted this week, to explain the problems of drafting a joint communique, the Kuala Lumpur Declaration, from the summit: "According to insiders, some countries including Thailand sided with China over the claim that 'this entity must take ASEAN + 3 (Japan, China, Republic of Korea) as its core' and demanded no mention of community in the draft. While others led by Japan hope to write into the draft 'to build a future East Asia Community' and include the names of the 16 countries.
By doing so, ASEAN diplomats believe, Japan is trying to drag countries outside this region such as Australia and India into the community to serve as a counterbalance to China.

"To grab the upper hand at the meeting, analysts say, Japan would most probably dish out the 'human rights' issue and draw in the United States, New Zealand and Australia to build up U.S., Japan-centered Western dominance," the People's Daily added. "At the same time, it will particularly highlight the differences in political and economic systems between developed countries such as Australia, New Zealand and the ROK and developing ones including China and Vietnam, in an attempt to crumble away cooperative forces and weaken Chinese influence in East Asia."

The summit, to be held in Kuala Lumpur, Malaysia, on Dec. 14, will include Australia, New Zealand, China, India, Japan, the Republic of Korea and the 10 members of ASEAN -- Singapore, Malaysia, Thailand, Myanmar, Philippines, Indonesia, Cambodia, Laos, Vietnam and Brunei.

The eventual goals of the summit are huge. Japan's Foreign Minister Taro Aso said this week in a speech in Tokyo that: "Japan believes we should bring into being the East Asia Free Trade Area and the East Asia Investment Area in order to move us even one step closer to regional economic integration." Eventually, he has in mind (as do many of the ASEAN countries) something similar to the process of integration through trade that created over the past 50 years the present European Union.

It will take a long time, and endless negotiations, and Aso's speech also laid out the immediate agenda for economic integration. "In Asia, the fact is that there are multiple factors inhibiting investment, including the existence of direct restrictions on investment, insufficient domestic legal frameworks, difficulties in the implementation of laws, inadequacy of the credit system, and others, particularly the complete inadequacy of protections for intellectual property rights," Aso said.
India, with backing from Australia, sees the summit paving the way for an eventual Asian free-trade zone, though it remains cool to any grander designs for security or political integration along EU lines. China, which has said little about the kind of community it wants to see, mainly wants to ensure that no Asian gathering takes place without its increasingly overwhelming presence. So what is emerging, in America's absence, looks to be three distinct camps of a potentially uncomfortable assembly. The Australians and Indians and Japanese, and some of the more Western-minded ASEAN members, want to focus on economic cooperation and trade, but within the overall framework of the World Trade Organization, plus useful collaboration in areas like common action against avian flu. This group also wants to retain the current role of the United States as the region's key security guarantor. Then there is China, which evidently assumes that its economic prowess will eventually ensure that the East Asian summit, the region's economy and its security system are all dominated by Beijing, and not necessarily in an aggressive way. Still, Beijing wants this process to develop on China's own terms, for example this week ruling out the usual trilateral meeting with Japan and South Korea because of its complaints that Japan is not sufficiently remorseful for its actions in World War II. And finally there are the original ASEAN members, uncomfortably aware that they are now part of something far bigger than all of them. They understandably dread the prospect of great power rivalry between China and India, or between China and the United States, and hope that trade links and diplomatic structures like the summit process will ensure that such rivalries do not get out of hand. Some local analysts think that because of these fundamental differences the East Asian summit process is unlikely to endure. 
One Malaysian scholar has called it "an empty shell unable to yield any substantial results," and Indonesia's Jakarta Post published a decidedly gloomy editorial this week. "What we will actually see is not what East Asian leaders have long dreamed of, that is an integrated regional framework of cooperation, but a community marked rather by suspicion, distrust, individualism and perhaps unwillingness to sacrifice a minimum of national autonomy for the sake of pursuing collective and collaborative action," the paper commented. If that gloomy forecast holds good, that would not displease the United States, instinctively suspicious of any international body designed to exclude it. But if this East Asia summit process, filled with reliable American friends, fails to prosper, something much less welcome to Washington, and perhaps more to the taste of America's critics like Malaysia's Mahathir, will almost certainly emerge to fill the vacuum. From checker at panix.com Sun Jan 1 02:43:32 2006 From: checker at panix.com (Premise Checker) Date: Sat, 31 Dec 2005 21:43:32 -0500 (EST) Subject: [Paleopsych] Live Science: Happiness in Old Age Depends on Attitude Message-ID: Happiness in Old Age Depends on Attitude http://www.livescience.com/humanbiology/051212_aging_happy.html By Robert Roy Britt LiveScience Managing Editor posted: 12 December 2005 01:16 pm ET Happiness in old age may have more to do with attitude than actual health, a new study suggests. Researchers examined 500 Americans age 60 to 98 who live independently and had dealt with cancer, heart disease, diabetes, mental health conditions or a range of other problems. The participants rated their own degree of successful aging on scale of 1-10, with 10 being best. Despite their ills, the average rating was 8.4. 
"What is most interesting about this study is that people who think they are aging well are not necessarily the (healthiest) individuals," said lead researcher Dilip Jeste of the University of California at San Diego. "In fact, optimism and effective coping styles were found to be more important to successfully aging than traditional measures of health and wellness," Jeste said. "These findings suggest that physical health is not the best indicator of successful aging???attitude is." The finding may prove important for the medical community, which by traditional measures would have considered only 10 percent of the study members to be aging successfully. "The commonly used criteria suggest that a person is aging well if they have a low level of disease and disability," Jeste said. "However, this study shows that self-perception about aging can be more important than the traditional success markers." Health and happiness may indeed be largely in the mind. A study released last year found that people who described themselves as highly optimistic a decade ago had lower rates of death from cardiovascular disease and lower overall death rates than strong pessimists. Research earlier this year revealed that the sick and disabled are often as happy as anyone else. The new study also showed that people who spent time each day socializing, reading or participating in other hobbies rated their aging satisfaction higher. "For most people, worries about their future aging involve fear of physical infirmity, disease or disability," Jeste said. "However, this study is encouraging because it shows that the best predictors of successful aging are well within an individual's control." The results, announced today, were reported at a meeting of the American College of Neuropsychopharmacology. 
When Money Does Buy Happiness Loss of Loved One Really Can Cause Broken Heart Hang in There: The 25-Year Wait for Immortality From checker at panix.com Sun Jan 1 02:43:45 2006 From: checker at panix.com (Premise Checker) Date: Sat, 31 Dec 2005 21:43:45 -0500 (EST) Subject: [Paleopsych] Live Science: Human Gene Changes Color of Fish Message-ID: Human Gene Changes Color of Fish http://www.livescience.com/animalworld/051215_fish_color.html [No bawling about the dangers of racism here, either.] By [33]Bjorn Carey LiveScience Staff Writer posted: 15 December 2005 02:05 pm ET Scientists have changed mutated, golden-colored zebrafish to a standard dark-striped, yellowish-white variety by inserting the genetic information for normal pigmentation into young fish. In an interesting twist, they also found that inserting a similar human version of the pigment gene [34]resulted in the same color change. As with humans, zebrafish skin color is determined by pigment cells, which contain pigment granules called melanosomes. The number, size and darkness of melanosomes per pigment cell influence the color of skin. For example, people of European descent have fewer, smaller, and lighter melanosomes than people of West African ancestry, and Asians fall somewhere in between. The golden zebrafish variant had fewer, smaller, and less heavily pigmented melanosomes than normal fish. The mutation Keith Cheng of Penn State College of Medicine and his colleagues determined that a dysfunctional, mutated gene was not producing the protein needed to make melanosomes. "They have a mutation in the gene which causes the protein machinery to say `stop,'" Cheng told LiveScience. Cheng's team found that when they inserted the normal version of the gene into two-day-old embryos of the golden fish, they were able to produce melanosomes, which darkened their skin to the normal color within a few days. 
Next, the researchers searched HapMap, an online database of human genetic variation, and found a similar gene for melanosome production in humans. So they inserted the human gene into golden zebrafish embryos and again changed their skin color to the darker version. "We presume that they got darker because of similar function of the inserted gene which normally produces the more abundant, larger, and darker melanosomes," Cheng said. Human mutation? It appears that like the golden zebrafish, light-skinned Europeans also have a mutation in the gene for melanosome production, resulting in less pigmented skin. Scientists suspect variations of this gene may also cause blue eyes and light hair color in some humans. However, Cheng said, it's important to point out that the mutation in the human and zebrafish genes is different--while the zebrafish version fails completely to produce the protein to make melanosomes, the mutated human version still works, just not quite as well. The discovery could lead to advancements in targeting a treatment for malignant melanoma--the most deadly form of skin cancer--as well as research on ways to modify skin color without damaging it by tanning or the use of harsh chemical lighteners. This research is detailed in the Dec. 16 issue of the journal Science. * [35]Pollution Blamed for Intersex Fish * [36]The Real Reason Animals Flaunt Size and Color * [37]Bragging Rights: The Smallest Fish Ever * [38]Fluorescent Fish Aids Medical Research [39][051215_zebrafish_00.jpg] [40]The normal zebrafish above has darker stripes than the golden zebrafish below. The insets show that the golden zebrafish has fewer, smaller and less dense pigment-filled compartments called melanosomes than the normal zebrafish. References 34. 
http://www.livescience.com/php/multimedia/imagedisplay/img_display.php?pic=051215_zebrafish_02.jpg&cap=The+normal+zebrafish+above+has+darker+stripes+than+the+%ECgolden%EE+zebrafish+below.+The+insets+show+that+the+%ECgolden%EE+zebrafish+has+fewer,+smaller+and+less+dense+pigment-filled+compartments+called+melanosomes+than+the+normal+zebrafish.+Credit%3A+%A9+Science
35. http://www.livescience.com/environment/intersex_fish_041221.html
36. http://www.livescience.com/animalworld/ap_050319_deer_antlers.html
37. http://www.livescience.com/animalworld/041027_Smallest_Fish.html
38. http://www.livescience.com/imageoftheday/siod_050901.html
39. http://www.livescience.com/php/multimedia/imagedisplay/img_display.php?pic=051215_zebrafish_02.jpg&cap=The+normal+zebrafish+above+has+darker+stripes+than+the+%93golden%94+zebrafish+below.+The+insets+show+that+the+%93golden%94+zebrafish+has+fewer%2C+smaller+and+less+dense+pigment-filled+compartments+called+melanosomes+than+the+normal+zebrafish.+Credit%3A+%A9+Science
40. http://www.livescience.com/php/multimedia/imagedisplay/img_display.php?pic=051215_zebrafish_02.jpg&cap=The+normal+zebrafish+above+has+darker+stripes+than+the+%93golden%94+zebrafish+below.+The+insets+show+that+the+%93golden%94+zebrafish+has+fewer%2C+smaller+and+less+dense+pigment-filled+compartments+called+melanosomes+than+the+normal+zebrafish.+Credit%3A+%A9+Science

From checker at panix.com Sun Jan 1 02:43:54 2006 From: checker at panix.com (Premise Checker) Date: Sat, 31 Dec 2005 21:43:54 -0500 (EST) Subject: [Paleopsych] Boston Consulting Group: Brain Size, Group Size, and Language Message-ID:

Brain Size, Group Size, and Language http://www.bcg.com/strategy_institute_gallery/gorilla2.jsp

Summary

Evidence in primates suggests that the size of social groups is constrained by cognitive capacity as measured by brain size. After a point, the number and nature of group relationships become too complex, and groups tend to grow unstable and fission.
Based on these projections, human beings should reach a "natural" cognitive limit when group size reaches about 150. There is extensive empirical evidence of social groupings of about this size in the anthropological literature. It is suggested that language arose as a means of enabling social interactions in large groups as a more efficient substitute for one-on-one social grooming in primates.

-------------

Brain Size, Group Size, and Language

Why do people have such big brains? After all, it is very expensive to maintain this organ - while it only accounts for about 2% of adult body weight, the human brain consumes about 20% of total energy output. Professor Robin Dunbar has advanced a theory relating brain size in primates to the size of social groups and to the evolution of language in humans. This work supports some of the suppositions of the "Machiavellian Intelligence Hypothesis," which states that intelligence first evolved for social purposes (see Byrne, Richard & Whiten, Andrew (1988), Machiavellian Intelligence. Oxford: Clarendon Press). It rebuts competing arguments linking brain size to more sophisticated food gathering and extended home range size.

Brain size as a determinant of group size in primates

Dunbar plots data on brain size (the measure he uses is the ratio of the neocortex - the "thinking" part of the brain - to total brain size) versus observed social group sizes for 36 genera of primates. He obtains a very good fit for the data (r-squared = 0.764):

Log10(N) = 0.093 + 3.389 Log10(CR)

where N is the mean group size and CR is the neocortex ratio. The results are plotted on a log-log scale. The fact that brain size in primates is closely related to group size implies that animals have to be able to keep track of an increased information load represented by these larger social groups.
Note that the relationship of brain size to group size in the model is not linear, which it would be if the cohesion of the group depended only on each individual's relationship to all the other members of the group. The fact that the relationship is logarithmic implies that the task of information processing is more complex: each animal has to keep track not just of its own relationships to every member of the group but also of the third-party relationships among other group members. At some point, the complexity of these relationships exceeds the animals' mental ability to deal with them. Several primate societies, such as chimpanzees, are known to become unstable and to fission when the group size exceeds a certain level. It may be that there is an upper limit on group size set by cognitive constraints.

Implications for human group size

While the "natural" group size for humans is not known, the size of our brains is (neocortex ratio = 4.1). Plotting this as the independent variable on Dunbar's regression line yields a group size of 147.8 for Homo sapiens. Is there any empirical evidence for natural human group sizes of about 150? Based on his scan of the anthropological literature, Dunbar concludes that there is.

One source of evidence is from modern-day hunter-gatherer societies (whose way of life best approximates that of our late Pleistocene ancestors of 250,000 years ago - the period when our current brain size is thought to have evolved). Based on data for groups in Australia, the South Pacific, Africa, and North and South America, there appear to be three distinct size classes: overnight camps of 30-50 people, tribes of 500-2,500 individuals, and an intermediate group - either a more permanent village or a defined clan group - of 100-200 people. The mean size of this intermediate group in Dunbar's (admittedly small) sample is 148.4, which matches remarkably well with the prediction of the neocortex size model.
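As a quick numerical check, the regression quoted above can be evaluated directly. This minimal Python sketch simply plugs the human neocortex ratio into the fitted coefficients given in the text (the function name is illustrative, not Dunbar's):

```python
import math

def predicted_group_size(neocortex_ratio: float) -> float:
    """Dunbar's fitted primate regression, as quoted in the text:
    log10(N) = 0.093 + 3.389 * log10(CR),
    where N is mean group size and CR is the neocortex ratio."""
    log_n = 0.093 + 3.389 * math.log10(neocortex_ratio)
    return 10 ** log_n

# Plugging in the human neocortex ratio of 4.1 reproduces the
# figure quoted in the text.
print(round(predicted_group_size(4.1), 1))  # 147.8
```

The log-log form means the prediction is very sensitive to the neocortex ratio: the exponent of 3.389 implies that a small increase in relative neocortex size yields a large increase in predicted group size.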
This grouping also had the lowest coefficient of variation, which we would expect if this group size is truly subject to an internal constraint (i.e., cognitive capacity), whereas smaller and larger groupings are more unstable. While these intermediate-size groups may be dispersed over a wide area much of the time, they gather regularly for group rituals and develop bonds based on direct personal contact. These groups come together for mutual support in times of threat.

Other examples of communal groups of this size abound. The Hutterites, a fundamentalist group that lives and farms communally in South Dakota and Manitoba, regard 150 as the upper limit for the size of a farming community. When the group reaches this size, it is split into two daughter communities. Professional armies, dating from Roman times to the modern day, maintain basic units - the "company" - that typically consist of 100-200 soldiers. Modern psychological studies also demonstrate that the size of typical "friendship networks" falls in this same range.

These examples provide further evidence of natural group size constraints. Once the number of individuals rises much beyond the limit of 150, social cohesion can no longer be maintained exclusively through a peer network. In order for stability to be maintained in larger communities, they invariably require some sort of hierarchical structure.

Groups and the evolution of language

Dunbar points out that primate groups are held together by social grooming, which is necessarily a one-on-one activity and can absorb a good deal of the animals' time. Maintaining these bonds in groups of 200 individuals would require us to devote about 57% of the day to social grooming. Dunbar proposes that the maintenance of these social bonds in humans was made possible through the evolution of language, which emerged as a more efficient means for "grooming" - since one can talk to several others at once.
Dunbar's model predicts a conversation group size for humans (as a substitute for grooming) of 3.8. He then cites evidence that this is indeed about the size actually observed in human conversation groups. Conversations tend to partition into new conversational cliques of about four individuals. Furthermore, studies have shown that a high percentage of ordinary conversations (over 60%) is devoted to discussing personal relationships and social experience - i.e., gossip.

Based on Robin Dunbar 1992, 1993. Contributed by David Gray, 2000.

Dunbar, R.I.M. (1992), 'Neocortex Size as a Constraint on Group Size in Primates', Journal of Human Evolution 20, 469-493.
Dunbar, R.I.M. (1993), 'Coevolution of neocortical size, group size and language in humans', Behavioral and Brain Sciences 16, 681-735.

* Group size affects the dynamics of social networks - a community ethos is more likely to arise in human groups smaller than 150
* Network formation depends on social interaction - effective networks arise from regular personal contact that creates a shared sense of community
* Networks can be costly to maintain - time and resources are required to maintain the social ties that support a network
* Hierarchy becomes important as group size grows - more complex societies require authoritarian structures to clarify and enforce social relationships

Keywords: Social networks, primates, intelligence, group size, gossip, grooming, hunter-gatherer societies, Hutterites, army company, fission, bonds, friendship, hierarchy, peers, authority, evolution, language

From checker at panix.com Sun Jan 1 02:44:08 2006 From: checker at panix.com (Premise Checker) Date: Sat, 31 Dec 2005 21:44:08 -0500 (EST) Subject: [Paleopsych] Public Interest: Charles Murray: Measuring abortion Message-ID:

Charles Murray: Measuring abortion http://www.findarticles.com/p/articles/mi_m0377/is_158/ai_n8680977/print

[It seems that every issue of The Public Interest is available at link 2.]
[1]FindArticles > [2]Public Interest > [3]Wntr, 2005 > [4]Article Measuring abortion Charles Murray SEX and Consequences: Abortion, Public Policy and the Economics of Fertility is a model of contemporary social science discourse, revealing in one book both how the enterprise should be conducted and its vulnerability to tunnel vision on the big issues. Phillip B. Levine, a professor of economics at Wellesley College, sets out in Sex and Consequences to explore the thesis that the role of abortion is akin to the role of insurance. Legal abortion provides protection from a risk (having an unwanted child), just as auto insurance provides financial protection against the risk of an accident. Legalizing abortion has a main effect of reducing unwanted births, just as auto insurance has a main effect of reducing individuals' losses from auto accidents. But abortion faces the same problems of moral hazard as other kinds of insurance. Just as a driver with complete insurance may be more likely to have an accident, a woman who has completely free access to abortion may be more likely to have an accidental pregnancy. Levine hypothesizes that legislated restrictions on abortion might serve the same purpose as deductibles do on auto insurance--they alter behavior without having much effect on net outcomes. Thus a state with some restrictions on abortion may have no more unwanted births than a state without restrictions, even though the number of abortions is smaller in the restrictive state. The restrictions raise the costs of abortion, and women moderate their behavior to reduce the odds of an unwanted pregnancy. Levine develops his model carefully and with nuance, and eventually wends his way back to conclusions about its empirical validity (it is broadly consistent with the evidence). But the chapters between the presentation of the model and the conclusions about it are not limited to the insurance thesis. 
They constitute a comprehensive survey of the quantitative work that has been done on the behavioral effects of abortion, incorporating analysis of the abortion experience worldwide as well as in the United States. THE book's virtues are formidable. Levine writes clearly, avoids jargon (or explains what the jargon means when he can't avoid it), and is unfailingly civil in characterizing the positions in the abortion debate. He is judicious, giving the reader confidence that he is not playing favorites when the data are inconclusive or contradictory. The breadth and detail of the literature review are exemplary. The book is filled with convenient summaries of material that could take a researcher weeks to assemble--a table showing the differences in abortion policy across European countries plus Canada and Japan, for example. Levine also gets high marks for one of the most challenging problems for any social scientist who is modeling complex human behavior: making the model simple enough to be testable while not losing sight of the ways in which it oversimplifies the underlying messiness of human behavior. The book's inadequacies reflect not so much Levine's failings as the nature of contemporary social science. Abortion policy is one of the great moral conundrums of our time. Anyone who is not the purest of the pure on one side or the other has had to wrestle with the moral difference (or whether there even is one) between destroying an embryo when it is a small collection of cells and when it is unmistakably a human fetus. None of the tools in Levine's toolkit can speak to this problem. Levine is aware of this, and makes the sensible point that more argumentation on the philosophical issues is not going to get us anywhere. He has picked a corner of the topic where his tools are useful, he says, and that's a step in the right direction. 
Still, as I read his dispassionate review of the effects of abortion policy on the pregnancy rate, I could not help muttering to myself occasionally, "Aside from that, Mrs. Lincoln, how was the play?" * EVEN granting the legitimacy of looking where the light is good, Sex and Consequences may be faulted for sheering away from acknowledging how much scholars could do to inform the larger issues if they were so inclined. Here is Levine discussing the non-monetary costs of abortion: The procedure may be physically unpleasant for the patient. She may need to take time off from work and spend time traveling to an abortion provider that may not be local. When she gets to the provider's location, there may be protesters outside the clinic, making her feel intimidated or even scared. If her family and/or friends find out about it, she may feel some stigma. Finally, it should not be overlooked that the procedure may be very difficult psychologically for a woman in a multitude of ways that cannot be easily expressed. "Cannot be easily expressed"? The woman is destroying what would, if left alone, have become her baby. That's easy enough to express. That Levine could not bring himself to spit out this simple reason why "the procedure may be very difficult psychologically" is emblematic of the tunnel vision that besets contemporary social science. A policy is established that has implications for the most profound questions of what it means to be human, to be a woman, to be a member of a community. What is the most obvious topic for research after such a policy is instituted? To me, a leading candidate is the psychological effects on the adult human beings who are caught up in this problematic behavior. There are ways to study these effects. Quantifiable measures of psychological distress are available--rates of therapy or specific psychological symptoms, for example--not to mention well-established techniques for collecting systematic qualitative data. 
And yet it appears from Levine's review that the only thing that social scientists can think of to study are outcomes such as pregnancy rates, abortion rates, birth rates, age of first intercourse, and welfare recipiency. I don't know if there are good studies on psychological effects that Levine thought were outside his topic, or whether the available studies aren't numerous enough or good enough to warrant treatment. I suspect that good studies just aren't available--Levine gives the impression of covering all the outcomes that the literature has addressed. IS the tunnel vision a result of political correctness or of the inherent limitations of quantitative social science? One should not underestimate the role of technical problems. Counting pregnancy rates is relatively easy; assessing long-term psychological outcomes for women who have abortions is much tougher and more expensive. Studying topics such as the coarsening effect that abortion might have on a society would be tougher yet. But it remains a fact that the overwhelming majority of academics who collect data on the effects of abortion policy are ardently pro-choice. The overwhelming majority of their colleagues and friends are ardently pro-choice. To set out on a research project that might in the end show serious psychological harm to women who have abortions or serious social harm to communities where abortions rates are high would take more courage and devotion to truth than I have commonly encountered among today's academics. Actually, Levine represents a significant profile in courage. By concluding that restrictions on abortion do not necessarily have "bad" effects (from a pro-choice perspective), Levine is stating a conclusion that most of his fellow academics do not want to hear. What makes the tunnel vision most frustrating is the extent to which it produces uninteresting results. 
Out of all the tables that Levine presents and all the generalizations he draws from the extant literature, hardly any of the findings fall in the category of "I would never have expected that." Economics does indeed explain many things under the rubric of "make it more expensive and you get less of it, subsidize it and you get more of it." But we knew that already. And when it comes to the less obvious findings, one is seldom looking at large, transforming effects, but at effects that are statistically significant but small in magnitude. Levine is caught in the same bind as all of us who commit quantitative social science: The more precisely we can measure something, the less likely we are to learn anything important. But as we try to measure something important less precisely, the more vulnerable we become to technical attack. And so it has come to pass that on the great issues that quantitative social scientists might study, we are so often irrelevant. Princeton University Press. 215 pp. $35.00. * There should be a rule requiring anyone reviewing a book on a controversial policy to disclose his own biases. With regard to the morality of abortion, I set the bar high--abortion for any but compelling reasons is in my view morally wrong, and my definition of "compelling" is strict. But I think that governments do a bad job of characterizing where the bar should be, and that, except in extreme cases such as partial-birth abortion, the onus for discouraging abortion should rest with family and community, not laws. My legal position is thus pro-choice. References 1. http://www.findarticles.com/ 2. http://www.findarticles.com/p/articles/mi_m0377 3. 
http://www.findarticles.com/p/articles/mi_m0377/is_158 From checker at panix.com Sun Jan 1 03:02:40 2006 From: checker at panix.com (Premise Checker) Date: Sat, 31 Dec 2005 22:02:40 -0500 (EST) Subject: [Paleopsych] Dissent: Ellen Willis: Ghosts, Fantasies, and Hope (fwd) Message-ID: ---------- Forwarded message ---------- Date: Fri, 30 Dec 2005 16:46:16 -0500 (EST) From: Premise Checker To: Premise Checker: ; Subject: Dissent: Ellen Willis: Ghosts, Fantasies, and Hope Ellen Willis: Ghosts, Fantasies, and Hope Dissent Magazine - Fall 2005 http://www.dissentmagazine.org/menutest/articles/fa05/willis.htm [Clearing off the deck: Joel Garreau's new book, _Radical Evolution: The Promise and Peril of Enhancing Our Minds, Our Bodies--and What It Means to be Human_ (NY: Doubleday, 2005) has just arrived. I am signed up to review it for _The Journal of Evolution and Technology_ and commenced reading it at once. Accordingly, I have stopped grabbing articles to forward until I have written my review *and* have caught up on my reading, this last going on for how many ever weeks it takes. I have a backlog of articles to send and will exhaust them by the end of the year. After that, I have a big batch of journal articles I downloaded on my annual visit to the University of Virginia and will dole our conversions from PDF to TXT at the rate of one a day. I'll also participate in discussions and do up and occasional meme. But you'll be on your own in analyzing the news. I hope I have given you some of the tools to do so. As I go through my backlog of the TLS, New Scientist, and Foreign Policy, I'll send choice articles your way, Foreign Policy first, since that hits the two themes I am striving (vainly!) 
to concentrate on, "deep culture change" and "persistence of difference."] Picture Imperfect: Utopian Thought for an Anti-Utopian Age by Russell Jacoby Columbia University Press, 2005 211 pp $24.95 For most of my politically conscious life, the idea of social transformation has been the great taboo of American politics. From the smug 1950s to the post-Reagan era, in which a bloodied and cowed left has come to regard a kinder, gentler capitalism as its highest aspiration, this anti-utopian trend has been interrupted only by the brief but intense flare-up of visionary politics known as "the sixties." Yet that short-lived, anomalous upheaval has had a more profound effect on my thinking about the possibilities of politics than the following three decades of reaction. The reason is not (to summarize the conversation-stopping accusations routinely aimed at anyone who suggests that sixties political and cultural radicalism might offer other than negative lessons for the left) that I am stuck in a time warp, nursing a romantic attachment to my youth, and so determined to idealize a period that admittedly had its politically dicey moments. Rather, as I see it, the enduring interest of this piece of history lies precisely in its spectacular departure from the norm. It couldn't happen, according to the reigning intellectual currents of the fifties, but it did. Nor--in the sense of ceasing to cast a shadow over the present--can it really be said to be over, even in this age of "9/11 Changed Everything." That the culture war instigated by the 1960s revolt shows no signs of abating thirty-some years later is usually cited by its left and liberal opponents to condemn it as a disastrous provocation that put the right in power. Yet the same set of facts can as plausibly be regarded as evidence of the potent and lasting appeal of its demand that society embrace freedom and pleasure as fundamental values. 
For the fury of the religious right is clearly a case of protesting too much, its preoccupation with sexual sin a testament to the magnitude of the temptation (as the many evangelical sex scandals suggest). Meanwhile, during the dot-com boom, enthusiastic young free marketeers fomented a mini-revival of sixties liberationism, reencoded as the quest for global entrepreneurial triumph, new technological toys, and limitless information. Was this just one more example of the amazing power of capitalism to turn every human impulse to its own purposes--or, given the right circumstances, might the force of desire overflow that narrow channel? If freedom's just another word for nothing left to lose, as Janis Joplin-cum-Kris Kristofferson famously opined, this could be a propitious moment to reopen a discussion of the utopian dimension of politics and its possible uses for our time. After all, the left has tried everything else, from postmodern rejection of "master narratives" and universal values to Anybody But Bush. Russell Jacoby, one of the few radicals to consistently reject the accommodationist pull, has been trying to nudge us toward such a conversation for some time. Picture Imperfect is really part two of a meditation that Jacoby began in 1999 with The End of Utopia, a ferocious polemic against anti-utopian thought. Both books trace the assumptions of today's anti-utopian consensus to the thirties and forties, when liberal intellectuals--most notably Karl Popper, Hannah Arendt, and Isaiah Berlin--linked Nazism and communism under the rubric of totalitarianism, whose essential characteristic, they proposed, was the rejection of liberal pluralism for a monolithic ideology. In the cold war context, Nazism faded into the background; the critique of totalitarianism became a critique of communism and was generalized to all utopian thinking--that is, to any political aspiration that went beyond piecemeal reform. 
As the logic of this argument would have it, attempts to understand and change a social system as a whole are by definition ideological, which is to say dogmatic; they violate the pluralistic nature of social life and so can only be enforced through terror; ergo, utopianism leads to mass murder. Never mind that passionate radicals such as Emma Goldman condemned the Soviet regime in the name of their own utopian vision or that most of the past century's horrors have been perpetrated by such decidedly non-utopian forces as religious fanaticism, nationalism, fascism, and other forms of racial and ethnic bigotry. (Jacoby notes with indignation that some proponents of the anti-utopian syllogism have tried to get around this latter fact by labeling movements like Nazism and radical Islamism "utopian"--as I write, David Brooks has just made use of this ploy in the New York Times--as if there is no distinction worth making between a universalist tradition devoted to "notions of happiness, fraternity, and plenty" and social "ideals" that explicitly mandate the mass murder of so-called inferior races or the persecution of infidels.) In the post-communist world, Jacoby laments, the equation of utopia with death has become conventional wisdom across the political board. The End of Utopia is primarily concerned with the impact of this brand of thinking on the left; it attacks the array of "progressive" spokespeople who insist that we must accept the liberal welfare state as the best we can hope for, as well as the multiculturalists who have reinvented liberal pluralism, celebrating "diversity" and "inclusiveness" within a socioeconomic system whose fundamental premises are taken for granted. With Picture Imperfect, Jacoby takes on larger and more philosophical questions about the nature of utopia and of the human imagination--too large, actually, to be adequately addressed in this quite short book, which has a somewhat diffuse and episodic quality as a result. 
Still, the questions are central to any serious discussion of the subject, and it helps that they are framed by a more concrete project: to rescue utopian thought from its murderous reputation as well as from the more mundane charge that it is puritanical and repressive in its penchant for planning out the future to the last detail. To this end, Jacoby distinguishes between two categories of utopianism: the dominant "blueprint" tradition, exemplified by Thomas More's eponymous no place or Edward Bellamy's Looking Backward, and the dissident strain he calls "iconoclastic" utopianism, whose concern is challenging the limits of the existing social order and expanding the boundaries of imagination rather than planning the perfect society. While he does not simply write off the blueprinters--fussy as their details may be, he regards them as contributors to the utopian spirit and credits them with inspiring social reforms--his heroes are the iconoclasts, beginning with Ernst Bloch and his 1918 The Spirit of Utopia, and including a gallery of anarchists, refusers, and mystics ranging from Walter Benjamin, Theodor Adorno, and Herbert Marcuse to Gustav Landauer and Martin Buber. The iconoclastic tradition is mainly Jewish, and Jacoby, in an interesting bit of discursus, links it to the biblical prohibition of idolatry. Just as the Jews may neither depict God's image nor pronounce God's name, so the iconoclasts avoid explicit images or descriptions of the utopian future. Further, Jacoby argues, in the Kabbala and in Jewish tradition generally, the Torah achieves full meaning only through the oral law: "The ear trumps the eye. Alone, the written word may mislead: it is too graphic." Similarly, the future of the iconoclasts is "heard and longed for" rather than seen. Here, Jacoby's analysis intersects with a fear he has long shared with his Frankfurt School mentors--that a mass culture obsessed with images flattens the imagination and perhaps destroys it altogether. 
From this perspective, the iconoclasts' elision of the image is itself radically countercultural. Is it also impossibly abstract? "The problem today," Jacoby recognizes in his epilogue, "is how to connect utopian thinking with everyday politics." Even as utopianism is condemned as deadly, it is at the same time, and often by the same people, dismissed as irrelevant to the real world. Jacoby will have none of this; he rightly insists, "Utopian thinking does not undermine or discount real reforms. Indeed, it is almost the opposite: practical reforms depend on utopian dreaming." Again, the sixties offers many examples--particularly its most successful social movement, second wave feminism, which achieved mass proportions in response to the radical proposition that men and women should be equals not only under the law or on the job but in every social sphere from the kitchen to the nursery to the bedroom to the street. (As one of the movement's prominent utopians, Shulamith Firestone, put it, the initial response of most women to that idea was, "You must be out of your mind--you can't change that!") Yet it seems likely that the relationship of the utopian imagination and the urge to concrete political activity is not precisely one of cause and effect; rather, both impulses appear to have a common root in the perception that something other than what is is possible--and necessary. We might think of iconoclastic utopians as the inverse of canaries in the mine: if they are hearing the sounds of an ineffable redemption, others may already be at work on annoyingly literal blueprints, and still others getting together for as yet obscure political meetings. So the formulation of the problem may need to be fine-tuned: what is it that fosters, or blocks, that sense of possibility/necessity? Why does it seem so utterly absent today (you're out of your mind!), and how can we change that? 
These questions are an obvious project for a third book, though it's one Jacoby is unlikely to write: he is temperamentally a refusenik, like the iconoclasts he lauds, more attuned to distant hoofbeats than to spoor on the ground that might reward analysis. It is perhaps this bias that has kept him from seeing one reason why the anti-utopian argument has become so entrenched: although there is perversity in it, and bad faith, there is also some truth. Jacoby is no fan of authoritarian communism, but he is wrong in thinking he can simply bracket that disaster or that there is nothing to be learned from it that might apply to utopian movements in general. The striking characteristic of communism was the radical disconnection between the social ideals it professed and the actual societies it produced. Because the contradiction could never be admitted, whole populations were forced to speak and act as if the lies of the regime were true. It is not surprising that victims or witnesses of this spectacle would distrust utopians. Who could tell what even the most steadfast anti-Stalinists might do if they actually gained some power? Who could give credence to phrases like "workers' control" or "women's emancipation" when they had come to mean anything but? Jacoby persuasively analyzes 1984 to show that it was not meant as an anti-socialist tract, yet he never mentions the attacks on the misuse of language that made Orwell's name into an adjective. Communism was corrupted by a scientific (or more accurately, scientistic) theory of history that cast opponents as expendable, a theory of class that dismissed bourgeois democratic liberties as merely a mask for capitalist exploitation, and a revolutionary practice that allowed a minority to impose dictatorship. 
Similar tropes made their way into the sixties' movements, in, for instance, the argument that oppressors should not have free speech or that the American people were the problem, not the solution, and the proper function of American radicals was to support third world anti-imperialism by any means necessary, including violence. A milder form of authoritarianism, which owed less to Marxism than to a peculiarly American quasi-religious moralism, disfigured the counterculture and the women's movement. If the original point of these movements was to promote the pursuit of happiness, too often the emphasis shifted to proclaiming one's own superior enlightenment and contempt for those who refused to be liberated; indeed, liberation had a tendency to become prescriptive, so that freedom to reject the trappings of middle-class consumerism, or not to marry, or to be a lesbian was repackaged as a moral obligation and a litmus test of one's radicalism or feminism. Just as communism discredited utopianism for several generations of Europeans, the antics of countercultural moralists fed America's conservative reaction. But it's not only corruption that distorts the utopian impulse when it begins to take some specific social shape. The prospect of more freedom stirs anxiety. We want it, but we fear it; it goes against our most deeply ingrained Judeo-Christian definitions of morality and order. At bottom, utopia equals death is a statement about the wages of sin. Left authoritarianism is itself a defense against anxiety--a way to assimilate frightening anarchy into familiar patterns of hierarchy and moral demand--as is the fundamentalist backlash taking place not only in the United States but around the world. Jacoby links the decline of utopian thought to the collapse of communism in 1989, and that is surely part of the story, but in truth the American backlash against utopianism was well underway by the mid-seventies. 
The sixties scared us, and not only because of Weatherman and Charles Manson. We scared ourselves. How did the sixties happen in the first place? I'd argue that a confluence of events stimulated desire while temporarily muting anxiety. There was widespread prosperity that made young people feel secure, able to challenge authority and experiment with their lives. There was a vibrant mass-mediated culture that, far from damping down the imagination, transmitted the summons to freedom and pleasure far more broadly than a mere political movement could do. (Jacoby is on to something, though, about the importance of the ear: the key mass cultural form, from the standpoint of inciting utopianism, was rock and roll.) There was a critical mass of educated women who could not abide the contradiction between the expanding opportunities they enjoyed as middle-class Americans and the arbitrary restrictions on their sex. There was the advent of psychedelics, which allowed millions of people to sample utopia as a state of mind. Those were different times. Today, anxiety is a first principle of social life, and the right knows how to exploit it. Capital foments the insecurity that impels people to submit to its demands. And yet there are more Americans than ever before who have tasted certain kinds of social freedoms and, whether they admit it or not, don't want to give them up or deny them to others. From Bill Clinton's impeachment to the Terri Schiavo case, the public has resisted the right wing's efforts to close the deal on the culture. Not coincidentally, the cultural debates, however attenuated, still conjure the ghosts of utopia by raising issues of personal autonomy, power, and the right to enjoy rather than slog through life. In telling contrast, the contemporary left has not posed class questions in these terms; on the contrary, it has ceded the language of freedom and pleasure, "opportunity" and "ownership," to the libertarian right.
Our culture of images notwithstanding, it cannot fairly be said that Americans' capacity for fantasy is impaired, even if it takes sectarian and apocalyptic rather than utopian forms. If anxiety is the flip side of desire, perhaps what we need to do is start asking ourselves and our fellow citizens what we want. The answers might surprise us.

Ellen Willis writes on cultural politics and political culture and directs the Cultural Reporting and Criticism program in the Department of Journalism at New York University. She is currently at work on a book about the mass psychology of contemporary politics.

From anonymous_animus at yahoo.com Sun Jan 1 19:51:28 2006
From: anonymous_animus at yahoo.com (Michael Christopher)
Date: Sun, 1 Jan 2006 11:51:28 -0800 (PST)
Subject: [Paleopsych] shame in the Bible
In-Reply-To: <200601011900.k01J0ce29299@tick.javien.com>
Message-ID: <20060101195128.5946.qmail@web36808.mail.mud.yahoo.com>

>>So amongst the knowledge of good and evil that came from eating the forbidden fruit is that nudity is shameful.<<

--Or maybe rather, that knowledge of one's nudity (i.e. shame) is shameful. My favorite theory about the Fall is that it was an allegory for the tendency of human beings to judge one another. To have knowledge of good and evil is to be a judge, and to carry out that judgment against another person is to play God. The message is: don't play God. In that context, awareness of nudity is a symbol for awareness of one's own hubris and arrogance in the presence of a greater judge. A bit like how many politicians would feel if their acts were exposed to the public, with no protection by secrecy or status. To be naked before one's enemies was to be judged without the protection of status symbols (uniform, title, etc).

Michael
From checker at panix.com Sun Jan 1 23:11:08 2006
From: checker at panix.com (Premise Checker)
Date: Sun, 1 Jan 2006 18:11:08 -0500 (EST)
Subject: [Paleopsych] New Left Review: Eric Hobsbawm: Identity Politics and the Left
Message-ID:

Eric Hobsbawm: Identity Politics and the Left
New Left Review 217, May/June 1996

[This is a significant article by an old-line 20th century British leftist. It deplores the replacement of "equality and social justice" as the essential aim of the Left with identity politics and mourns the disappearance of universalism on the Left.

[The article, nearly a decade old, should be read carefully. He states, "Since the 1970s there has been a tendency-an increasing tendency-to see the Left essentially as a coalition of minority groups and interests: of race, gender, sexual or other cultural preferences and lifestyles, even of economic minorities such as the old getting-your-hands-dirty, industrial working class have now become."

[Since then, the trends he deplores have been exacerbated, with universalism further in retreat. It is now getting to the point where Whites are starting their own identity politics.

[Hobsbawm calls "equality and social justice" the essential defining characteristic of the Left, and this was true--or rather equality formed the *principal* Left-Right divide--but only for a while after the nearly universally recognized failure of central planning. The failure of egalitarian politics is becoming nearly as manifest as the failure of central planning. What is replacing equality, I have been arguing repeatedly, as the new major Left-Right divide in politics is universalism (on the Right, now taking the form of spreading "democratic capitalism" to the world or else the universal truths of one religion or another) and particularism (on the Left, now not very coherent, except to resist Rightist universalism).
[Hobsbawm is quite correct to say that the Left in Britain degenerated into rent-seeking for higher wages for those who happen to be unionized. (Unionization simply cannot, and never could, raise wages overall in a competitive economy, but that's another story.) And the Central Planner in him remains in his Unchecked Premise that, while it is true that identities are multiple and fluid--but only to a degree, only to a degree--a Central Planner can make them what he will.

[I could argue that capitalism is defective, in that it rewards inventors, entrepreneurs, capitalists, and businessmen too small a share of what they contribute to society (far less than their marginal product), while the workers collect nearly their full marginal product and that "social justice" demands regressive taxes. But all this would serve only to continue 20th-century Rightist arguments, coming down on the side of inequality rather than equality.

[The politics of the 21st century will move away from the increasingly dead issue of equality. Hobsbawm writes that "the emergence of identity politics is a consequence of the extraordinarily rapid and profound upheavals and transformations of human society in the third quarter of this century," and quotes Daniel Bell as noting that "the breakup of the traditional authority structures and the previous affective social units-historically nation and class...make the ethnic attachment more salient."

[But identity is not just a matter of politics and rent-seeking coalitions. Identity is becoming ever more salient, for it provides islands of stability in a world where everything else changes. This will only increase as change itself increases. This is deep culture change indeed, and the inevitable emergence of political entrepreneurs to form rent-seeking coalitions is a small aspect of this.
[So read the article, not for the politics or for Hobsbawm's nostalgia for 20th century Leftist politics (but 1996, the date of the article, was still in the last century!). Try to think about the sociology of identity, how individuals will remake their identities to create new islands of stability, and how those with a particular identity, or mixture of them (as Hobsbawm quite correctly emphasizes--he is at some level a Public Choice man himself), will react to those with other identities.

[Think, in other words, how those with particular enhancements will deal socially with those of different enhancements or with no enhancements?]

--------------

My lecture is about a surprisingly new subject. [*] We have become so used to terms like 'collective identity', 'identity groups', 'identity politics', or, for that matter 'ethnicity', that it is hard to remember how recently they have surfaced as part of the current vocabulary, or jargon, of political discourse. For instance, if you look at the International Encyclopedia of the Social Sciences, which was published in 1968-that is to say written in the middle 1960s-you will find no entry under identity except one about psychosocial identity, by Erik Erikson, who was concerned chiefly with such things as the so-called 'identity crisis' of adolescents who are trying to discover what they are, and a general piece on voters' identification. And as for ethnicity, in the Oxford English Dictionary of the early 1970s it still occurs only as a rare word indicating 'heathendom and heathen superstition' and documented by quotations from the eighteenth century. In short, we are dealing with terms and concepts which really come into use only in the 1960s.
Their emergence is most easily followed in the USA, partly because it has always been a society unusually interested in monitoring its social and psychological temperature, blood-pressure and other symptoms, and mainly because the most obvious form of identity politics-but not the only one-namely ethnicity, has always been central to American politics since it became a country of mass immigration from all parts of Europe. Roughly, the new ethnicity makes its first public appearance with Glazer and Moynihan's Beyond the Melting Pot in 1963 and becomes a militant programme with Michael Novak's The Rise of the Unmeltable Ethnics in 1972. The first, I don't have to tell you, was the work of a Jewish professor and an Irishman, now the senior Democratic senator for New York; the second came from a Catholic of Slovak origin. For the moment we need not bother too much about why all this happened in the 1960s, but let me remind you that-in the style-setting USA at least-this decade also saw the emergence of two other variants of identity politics: the modern (that is, post-suffragist) women's movement and the gay movement. I am not saying that before the 1960s nobody asked themselves questions about their public identity. In situations of uncertainty they sometimes did; for instance in the industrial belt of Lorraine in France, whose official language and nationality changed five times in a century, and whose rural life changed to an industrial, semi-urban one, while their frontiers were redrawn seven times in the past century and a half. No wonder people said: 'Berliners know they're Berliners, Parisians know they are Parisians, but who are we?' Or, to quote another interview, 'I come from Lorraine, my culture is German, my nationality is French, and I think in our provincial dialect'. [1] Actually, these things only led to genuine identity problems when people were prevented from having the multiple, combined, identities which are natural to most of us.
Or, even more so, when they are detached 'from the past and all common cultural practices'. [2] However, until the 1960s these problems of uncertain identity were confined to special border zones of politics. They were not yet central. They appear to have become much more central since the 1960s. Why? There are no doubt particular reasons in the politics and institutions of this or that country-for instance, in the peculiar procedures imposed on the USA by its Constitution-for example, the civil rights judgments of the 1950s, which were first applied to blacks and then extended to women, providing a model for other identity groups. It may follow, especially in countries where parties compete for votes, that constituting oneself into such an identity group may provide concrete political advantages: for instance, positive discrimination in favour of the members of such groups, quotas in jobs and so forth. This is also the case in the USA, but not only there. For instance, in India, where the government is committed to creating social equality, it may actually pay to classify yourself as low caste or belonging to an aboriginal tribal group, in order to enjoy the extra access to jobs guaranteed to such groups.

The Denial of Multiple Identity

But in my view the emergence of identity politics is a consequence of the extraordinarily rapid and profound upheavals and transformations of human society in the third quarter of this century, which I have tried to describe and to understand in the second part of my history of the 'Short Twentieth Century', The Age of Extremes. This is not my view alone. The American sociologist Daniel Bell, for instance, argued in 1975 that 'The breakup of the traditional authority structures and the previous affective social units-historically nation and class...make the ethnic attachment more salient'.
[3] In fact, we know that both the nation-state and the old class-based political parties and movements have been weakened as a result of these transformations. More than this, we have been living-we are living-through a gigantic 'cultural revolution', an 'extraordinary dissolution of traditional social norms, textures and values, which left so many inhabitants of the developed world orphaned and bereft.' If I may go on quoting myself, 'Never was the word "community" used more indiscriminately and emptily than in the decades when communities in the sociological sense became hard to find in real life'. [4] Men and women look for groups to which they can belong, certainly and forever, in a world in which all else is moving and shifting, in which nothing else is certain. And they find it in an identity group. Hence the strange paradox, which the brilliant and, incidentally, Caribbean Harvard sociologist Orlando Patterson has identified: people choose to belong to an identity group, but 'it is a choice predicated on the strongly held, intensely conceived belief that the individual has absolutely no choice but to belong to that specific group.' [5] That it is a choice can sometimes be demonstrated. The number of Americans reporting themselves as 'American Indian' or 'Native American' almost quadrupled between 1960 and 1990, from about half a million to about two millions, which is far more than could be explained by normal demography; and incidentally, since 70 per cent of 'Native Americans' marry outside their race, exactly who is a 'Native American' ethnically, is far from clear. [6] So what do we understand by this collective 'identity', this sentiment of belonging to a primary group, which is its basis? I draw your attention to four points. First, collective identities are defined negatively; that is to say against others. 'We' recognize ourselves as 'us' because we are different from 'Them'.
If there were no 'They' from whom we are different, we wouldn't have to ask ourselves who 'We' were. Without Outsiders there are no Insiders. In other words, collective identities are based not on what their members have in common-they may have very little in common except not being the 'Others'. Unionists and Nationalists in Belfast, or Serb, Croat and Muslim Bosnians, who would otherwise be indistinguishable-they speak the same language, have the same life styles, look and behave the same-insist on the one thing that divides them, which happens to be religion. Conversely, what gives unity as Palestinians to a mixed population of Muslims of various kinds, Roman and Greek Catholics, Greek Orthodox and others who might well-like their neighbours in Lebanon-fight each other under different circumstances? Simply that they are not the Israelis, as Israeli policy continually reminds them. Of course, there are collectivities which are based on objective characteristics which their members have in common, including biological gender or such politically sensitive physical characteristics as skin-colour and so forth. However most collective identities are like shirts rather than skin, namely they are, in theory at least, optional, not inescapable. In spite of the current fashion for manipulating our bodies, it is still easier to put on another shirt than another arm. Most identity groups are not based on objective physical similarities or differences, although all of them would like to claim that they are 'natural' rather than socially constructed. Certainly all ethnic groups do. Second, it follows that in real life identities, like garments, are interchangeable or wearable in combination rather than unique and, as it were, stuck to the body. For, of course, as every opinion pollster knows, no one has one and only one identity. Human beings cannot be described, even for bureaucratic purposes, except by a combination of many characteristics. 
But identity politics assumes that one among the many identities we all have is the one that determines, or at least dominates our politics: being a woman, if you are a feminist, being a Protestant if you are an Antrim Unionist, being a Catalan, if you are a Catalan nationalist, being homosexual if you are in the gay movement. And, of course, that you have to get rid of the others, because they are incompatible with the 'real' you. So David Selbourne, an all-purpose ideologue and general denouncer, firmly calls on 'The Jew in England' to 'cease to pretend to be English' and to recognize that his 'real' identity is as a Jew. This is both dangerous and absurd. There is no practical incompatibility unless an outside authority tells you that you cannot be both, or unless it is physically impossible to be both. If I wanted to be simultaneously and ecumenically a devout Catholic, a devout Jew, and a devout Buddhist why shouldn't I? The only reason which stops me physically is that the respective religious authorities might tell me I cannot combine them, or that it might be impossible to carry out all their rituals because some got in the way of others. Usually people have no problem about combining identities, and this, of course, is the basis of general politics as distinct from sectional identity politics. Often people don't even bother to make the choice between identities, either because nobody asks them, or because it's too complicated. When inhabitants of the USA are asked to declare their ethnic origins, 54 per cent refuse or are unable to give an answer. In short, exclusive identity politics do not come naturally to people. It is more likely to be forced upon them from outside-in the way in which Serb, Croat and Muslim inhabitants of Bosnia who lived together, socialized and intermarried, have been forced to separate, or in less brutal ways. 
The third thing to say is that identities, or their expression, are not fixed, even supposing you have opted for one of your many potential selves, the way Michael Portillo has opted for being British instead of Spanish. They shift around and can change, if need be more than once. For instance non-ethnic groups, all or most of whose members happen to be black or Jewish, may turn into consciously ethnic groups. This happened to the Southern Christian Baptist Church under Martin Luther King. The opposite is also possible, as when the Official IRA turned itself from a Fenian nationalist into a class organization, which is now the Workers' Party and part of the Irish Republic's government coalition. The fourth and last thing to say about identity is that it depends on the context, which may change. We can all think of paid-up, card-carrying members of the gay community in the Oxbridge of the 1920s who, after the slump of 1929 and the rise of Hitler, shifted, as they liked to say, from Homintern to Comintern. Burgess and Blunt, as it were, transferred their gayness from the public to the private sphere. Or, consider the case of the Protestant German classical scholar, Pater, a professor of Classics in London, who suddenly discovered, after Hitler, that he had to emigrate, because, by Nazi standards, he was actually Jewish-a fact of which until that moment, he was unaware. However he had defined himself previously, he now had to find a different identity.

The Universalism of the Left

What has all this to do with the Left? Identity groups were certainly not central to the Left.
Basically, the mass social and political movements of the Left, that is, those inspired by the American and French revolutions and socialism, were indeed coalitions or group alliances, but held together not by aims that were specific to the group, but by great, universal causes through which each group believed its particular aims could be realized: democracy, the Republic, socialism, communism or whatever. Our own Labour Party in its great days was both the party of a class and, among other things, of the minority nations and immigrant communities of mainland Britain. It was all this, because it was a party of equality and social justice. Let us not misunderstand its claim to be essentially class-based. The political labour and socialist movements were not, ever, anywhere, movements essentially confined to the proletariat in the strict Marxist sense. Except perhaps in Britain, they could not have become such vast movements as they did, because in the 1880s and 1890s, when mass labour and socialist parties suddenly appeared on the scene, like fields of bluebells in spring, the industrial working class in most countries was a fairly small minority, and in any case a lot of it remained outside socialist labour organization. Remember that by the time of World War I the social-democrats polled between 30 and 47 per cent of the electorate in countries like Denmark, Sweden and Finland, which were hardly industrialized, as well as in Germany. (The highest percentage of votes ever achieved by the Labour Party in this country, in 1951, was 48 per cent.) Furthermore, the socialist case for the centrality of the workers in their movement was not a sectional case. Trade unions pursued the sectional interests of wage-earners, but one of the reasons why the relations between labour and socialist parties and the unions associated with them, were never without problems, was precisely that the aims of the movement were wider than those of the unions.
The socialist argument was not just that most people were 'workers by hand or brain' but that the workers were the necessary historic agency for changing society. So, whoever you were, if you wanted the future, you would have to go with the workers' movement. Conversely, when the labour movement became narrowed down to nothing but a pressure-group or a sectional movement of industrial workers, as in 1970s Britain, it lost both the capacity to be the potential centre of a general people's mobilization and the general hope of the future. Militant 'economist' trade unionism antagonized the people not directly involved in it to such an extent that it gave Thatcherite Toryism its most convincing argument-and the justification for turning the traditional 'one-nation' Tory Party into a force for waging militant class-war. What is more, this proletarian identity politics not only isolated the working class, but also split it by setting groups of workers against each other. So what does identity politics have to do with the Left? Let me state firmly what should not need restating. The political project of the Left is universalist: it is for all human beings. However we interpret the words, it isn't liberty for shareholders or blacks, but for everybody. It isn't equality for all members of the Garrick Club or the handicapped, but for everybody. It is not fraternity only for old Etonians or gays, but for everybody. And identity politics is essentially not for everybody but for the members of a specific group only. This is perfectly evident in the case of ethnic or nationalist movements. Zionist Jewish nationalism, whether we sympathize with it or not, is exclusively about Jews, and hang-or rather bomb-the rest. All nationalisms are. The nationalist claim that they are for everyone's right to self-determination is bogus. That is why the Left cannot base itself on identity politics. It has a wider agenda. 
For the Left, Ireland was, historically, one, but only one, out of the many exploited, oppressed and victimized sets of human beings for which it fought. For the IRA kind of nationalism, the Left was, and is, only one possible ally in the fight for its objectives in certain situations. In others it was ready to bid for the support of Hitler, as some of its leaders did during World War II. And this applies to every group which makes identity politics its foundation, ethnic or otherwise. Now the wider agenda of the Left does, of course, mean it supports many identity groups, at least some of the time, and they, in turn, look to the Left. Indeed, some of these alliances are so old and so close that the Left is surprised when they come to an end, as people are surprised when marriages break up after a lifetime. In the USA it almost seems against nature that the 'ethnics' - that is, the groups of poor mass immigrants and their descendants - no longer vote almost automatically for the Democratic Party. It seems almost incredible that a black American could even consider standing for the Presidency of the USA as a Republican (I am thinking of Colin Powell). And yet, the common interest of Irish, Italian, Jewish and black Americans in the Democratic Party did not derive from their particular ethnicities, even though realistic politicians paid their respects to these. What united them was the hunger for equality and social justice, and a programme believed capable of advancing both.

The Common Interest

But this is just what so many on the Left have forgotten, as they dive head first into the deep waters of identity politics. Since the 1970s there has been a tendency - an increasing tendency - to see the Left essentially as a coalition of minority groups and interests: of race, gender, sexual or other cultural preferences and lifestyles, even of economic minorities such as the old getting-your-hands-dirty industrial working class have now become.
This is understandable enough, but it is dangerous, not least because winning majorities is not the same as adding up minorities. First, let me repeat: identity groups are about themselves, for themselves, and nobody else. A coalition of such groups that is not held together by a single common set of aims or values has only an ad hoc unity, rather like states temporarily allied in war against a common enemy. They break up when they are no longer so held together. In any case, as identity groups, they are not committed to the Left as such, but only to seeking support for their aims wherever they can. We think of women's emancipation as a cause closely associated with the Left, as it has certainly been since the beginnings of socialism, even before Marx and Engels. And yet, historically, the British suffragist movement before 1914 was a movement of all three parties, and the first woman MP, as we know, was actually a Tory. [7] Secondly, whatever their rhetoric, the actual movements and organizations of identity politics mobilize only minorities, at any rate before they acquire the power of coercion and law. National feeling may be universal, but, to the best of my knowledge, no secessionist nationalist party in democratic states has so far ever got the votes of the majority of its constituency (though the Québécois last autumn came close - but then their nationalists were careful not actually to demand complete secession in so many words). I do not say it cannot or will not happen - only that the safest way to get national independence by secession so far has been not to ask populations to vote for it until you already have it first by other means. That, by the way, makes two pragmatic reasons to be against identity politics. Without such outside compulsion or pressure, under normal circumstances it hardly ever mobilizes more than a minority - even of the target group.
Hence, attempts to form separate political women's parties have not been very effective ways of mobilizing the women's vote. The other reason is that forcing people to take on one, and only one, identity divides them from each other. It therefore isolates these minorities. Consequently to commit a general movement to the specific demands of minority pressure groups, which are not necessarily even those of their constituencies, is to ask for trouble. This is much more obvious in the USA, where the backlash against positive discrimination in favour of particular minorities, and the excesses of multiculturalism, is now very powerful; but the problem exists here also. Today both the Right and the Left are saddled with identity politics. Unfortunately, the danger of disintegrating into a pure alliance of minorities is unusually great on the Left, because the decline of the great universalist slogans of the Enlightenment, which were essentially slogans of the Left, leaves it without any obvious way of formulating a common interest across sectional boundaries. The only one of the so-called 'new social movements' which crosses all such boundaries is that of the ecologists. But, alas, its political appeal is limited and likely to remain so. However, there is one form of identity politics which is actually comprehensive, inasmuch as it is based on a common appeal, at least within the confines of a single state: citizen nationalism. Seen in the global perspective this may be the opposite of a universal appeal, but seen in the perspective of the national state, which is where most of us still live, and are likely to go on living, it provides a common identity, or in Benedict Anderson's phrase, 'an imagined community', not the less real for being imagined. The Right, especially the Right in government, has always claimed to monopolize this and can usually still manipulate it. Even Thatcherism, the grave-digger of 'one-nation Toryism', did it.
Even its ghostly and dying successor, Major's government, hopes to avoid electoral defeat by damning its opponents as unpatriotic. Why then has it been so difficult for the Left, certainly for the Left in English-speaking countries, to see itself as the representative of the entire nation? (I am, of course, speaking of the nation as the community of all people in a country, not as an ethnic entity.) Why have they found it so difficult even to try? After all, the European Left began when a class, or a class alliance, the Third Estate in the French Estates General of 1789, decided to declare itself 'the nation' as against the minority of the ruling class, thus creating the very concept of the political 'nation'. After all, even Marx envisaged such a transformation in The Communist Manifesto. [8] Indeed, one might go further. Todd Gitlin, one of the best observers of the American Left, has put it dramatically in his new book, The Twilight of Common Dreams: 'What is a Left if it is not, plausibly at least, the voice of the whole people? ... If there is no people, but only peoples, there is no Left.' [9]

The Muffled Voice of New Labour

And there have been times when the Left has not only wanted to be the nation, but has been accepted as representing the national interest, even by those who had no special sympathy for its aspirations: in the USA, when the Rooseveltian Democratic Party was politically hegemonic; in Scandinavia since the early 1930s. More generally, at the end of World War II the Left, almost everywhere in Europe, represented the nation in the most literal sense, because it represented resistance to, and victory over, Hitler and his allies. Hence the remarkable marriage of patriotism and social transformation which dominated European politics immediately after 1945.
Not least in Britain, where 1945 was a plebiscite in favour of the Labour Party as the party best representing the nation, against one-nation Toryism led by the most charismatic and victorious war-leader on the scene. This set the course for the next thirty-five years of the country's history. Much more recently, François Mitterrand, a politician without a natural commitment to the Left, chose leadership of the Socialist Party as the best platform for exercising the leadership of all French people. One would have thought that today was another moment when the British Left could claim to speak for Britain - that is to say, all the people - against a discredited, decrepit and demoralized regime. And yet, how rarely are the words 'the country', 'Great Britain', 'the nation', 'patriotism', even 'the people' heard in the pre-election rhetoric of those who hope to become the next government of the United Kingdom! It has been suggested that this is because, unlike 1945 and 1964, 'neither the politician nor his public has anything but a modest belief in the capacity of government to do very much'. [10] If that is why Labour speaks to and about the nation in so muffled a voice, it is trebly absurd. First, because if citizens really think that government can't do very much, why should they bother to vote for one lot rather than the other, or for that matter for any lot? Second, because government, that is to say the management of the state in the public interest, is indispensable and will remain so. Even the ideologues of the mad Right, who dream of replacing it by the universal sovereign market, need it to establish their utopia, or rather dystopia. And insofar as they succeed, as in much of the ex-socialist world, the backlash against the market brings back into politics those who want the state to return to social responsibility.
In 1995, five years after abandoning their old state with joy and enthusiasm, two thirds of East Germans thought that life and conditions in the old GDR were better than the 'negative descriptions and reports' in today's German media, and 70 per cent thought 'the idea of socialism was good, but we had incompetent politicians'. And, most unanswerably, because in the past seventeen years we have lived under governments which believed that government has enormous power, which have used that power actually to change our country decisively for the worse, and which, in their dying days, are still trying to do so, and to con us into the belief that what one government has done is irreversible by another. The state will not go away. It is the business of government to use it. Government is not just about getting elected and then re-elected. This is a process which, in democratic politics, implies enormous quantities of lying in all its forms. Elections become contests in fiscal perjury. Unfortunately, politicians, who have as short a time-horizon as journalists, find it hard to see politics as other than a permanent campaigning season. Yet there is something beyond. There lies what government does and must do. There is the future of the country. There are the hopes and fears of the people as a whole - not just 'the community', which is an ideological cop-out, or the sum-total of earners and spenders (the 'taxpayers' of political jargon), but the British people, the sort of collective which would be ready to cheer the victory of any British team in the World Cup, if it hadn't lost the hope that there might still be such a thing. For not the least symptom of the decline of Britain, with the decline of science, is the decline of British team sports. It was Mrs Thatcher's strength that she recognized this dimension of politics.
She saw herself leading a people 'who thought we could no longer do the great things we once did' - I quote her words - 'those who believed our decline was irreversible, that we could never again be what we were'. [11] She was not like other politicians, inasmuch as she recognized the need to offer hope and action to a puzzled and demoralized people. A false hope, perhaps, and certainly the wrong kind of action, but enough to let her sweep aside opposition within her party as well as outside, and change the country and destroy so much of it. The failure of her project is now manifest. Our decline as a nation has not been halted. As a people we are more troubled, more demoralized than in 1979, and we know it. Only those who alone can form the post-Tory government are themselves too demoralized and frightened by failure and defeat to offer anything except the promise not to raise taxes. We may win the next general election that way, and I hope we will, though the Tories will not fight the election campaign primarily on taxes, but on British Unionism, English nationalism, xenophobia and the Union Jack, and in doing so will catch us off balance. Will those who have elected us really believe we shall make much difference? And what will we do if they merely elect us, shrugging their shoulders as they do so? We will have created the New Labour Party. Will we make the same effort to restore and transform Britain? There is still time to answer these questions.

[*] This is the text of the Barry Amiel and Norman Melburn Trust Lecture given at the Institute of Education, London on 2 May 1996.
[1] M.L. Pradelles de Latou, 'Identity as a Complex Network', in C. Fried, ed., Minorities, Community and Identity, Berlin 1983, p. 79.
[2] Ibid., p. 91.
[3] Daniel Bell, 'Ethnicity and Social Change', in Nathan Glazer and Daniel P. Moynihan, eds., Ethnicity: Theory and Experience, Cambridge, Mass. 1975, p. 171.
[4] E.J. Hobsbawm, The Age of Extremes: The Short Twentieth Century, 1914-1991, London 1994, p. 428.
[5] O. Patterson, 'Implications of Ethnic Identification', in Fried, ed., Minorities: Community and Identity, pp. 28-29.
[6] O. Patterson, 'Implications of Ethnic Identification', in Fried, ed., Minorities: Community and Identity, pp. 28-29.
[7] Jihang Park, 'The British Suffrage Activists of 1913', Past & Present, no. 120, August 1988, pp. 156-7.
[8] 'Since the proletariat must first of all acquire political supremacy, must raise itself to be the national class, must constitute itself the nation, it is itself still national, though not in the bourgeois sense.' Karl Marx and Frederick Engels, The Communist Manifesto, 1848, part II. The original (German) edition has 'the national class'; the English translation of 1888 gives this as 'the leading class of the nation'.
[9] Todd Gitlin, The Twilight of Common Dreams, New York 1995, p. 165.
[10] Hugo Young, 'No Waves in the Clear Blue Water', The Guardian, 23 April 1996, p. 13.
[11] Cited in Eric Hobsbawm, Politics for a Rational Left, Verso, London 1989, p. 54.

From checker at panix.com Sun Jan 1 23:11:46 2006
From: checker at panix.com (Premise Checker)
Date: Sun, 1 Jan 2006 18:11:46 -0500 (EST)
Subject: [Paleopsych] World Science: Bees can recognize human faces, study finds
Message-ID:

Bees can recognize human faces, study finds
http://www.world-science.net/exclusives/051209_beesfrm.htm
March 30, 2005

Honeybees may look pretty much all alike to us. But it seems we may not look all alike to them. A study has found that the bees can learn to recognize human faces in photos, and remember them for at least two days. The findings toss new uncertainty into a long-studied issue that some scientists considered largely settled, the researchers say: the question of how humans themselves recognize each other's faces.
The results also may help lead to better face-recognition software, developed through study of the insect brain, the authors of the new research said. Many researchers traditionally believed the task required a large brain and a specialized area of that brain dedicated to processing face information. The bee finding casts doubt on that, said Adrian G. Dyer, the lead researcher in the study. He recalls that the discovery startled him so much that he called out to a colleague, telling her to come quickly because "no one's going to believe it--and bring a camera!" Dyer said that to his knowledge, the finding is the first time an invertebrate has shown the ability to recognize faces of another species. But not all bees were up to the task: some flunked it, he said, although this seemed due more to a failure to grasp how the test worked than to poor facial recognition specifically. In any case, some humans also can't recognize faces, Dyer noted; the condition is called prosopagnosia. In the bee study, reported in the Dec. 15 issue of the Journal of Experimental Biology, Dyer and two colleagues presented honeybees with photos of human faces drawn from a standard human psychology test. The photos had similar lighting, background colors and sizes and included only the face and neck, to avoid having the insects make judgments based on the clothing. In some cases, the people in the pictures themselves looked similar. The researchers tried to train the bees to realize that one photo had a drop of a sugary liquid next to it. Different photos came with a drop of bitter liquid instead. Many bees apparently failed to realize that they should pay attention to the photos at all. But five bees learned to fly toward the photo horizontally in such a way that they could get a good look at it, Dyer reported. In fact, these bees tended to hover a few centimeters in front of the image for a while before deciding where to land.
The bees learned to distinguish the correct face from the wrong one with better than 80 percent accuracy, even when the faces were similar, and regardless of where the photos were placed, the researchers found. Also, just like humans, the bees performed more poorly when the faces were flipped upside-down. "This is evidence that face recognition requires neither a specialised neuronal [brain] circuitry nor a fundamentally advanced nervous system," the researchers wrote, noting that the test they used was one with which even humans have some difficulty. Also, "Two bees tested 2 days after the initial training retained the information in long-term memory," they wrote. One scored about 94 percent on the first day and 79 percent two days later; the second bee's score dropped from about 87 to 76 percent during the same time frame. The researchers also checked whether bees performed better for faces that humans judged as being more different. This seemed to be so, they found, but the result didn't reach statistical significance. The bees probably don't understand what a human face is, Dyer said in an email. "To the bees the faces were spatial patterns (or strange looking flowers)," he added. Bees are famous for their pattern-recognition abilities, which scientists believe evolved in order to discriminate among flowers. As social insects, they are also well known to be able to tell apart their hivemates. But the new study shows that they can recognize human faces better than some humans can--with one-thousandth of the brain cells. This raises the question of how bees recognize faces, and whether they do it differently from the way we do it, Dyer and colleagues wrote. Studies suggest small children recognize faces by picking out specific features that are easy to recognize, whereas adults see the interrelationships among facial features. Bees seem to show aspects of both strategies, depending on the study, the researchers added.
The findings cast doubt on the belief among some researchers that the human brain has a specialized area for face recognition, Dyer and colleagues said. Neuroscientists point to an area called the fusiform gyrus, which tends to show increased activity during face-viewing, as serving this purpose. But the bee finding "supports the view that the human brain may not need to have a visual area specific for the recognition of faces," Dyer and colleagues wrote. That may be helpful to researchers who develop face-recognition technologies to be used for security at airports and other locations, Dyer noted. The United States is investing heavily in such systems, but they remain primitive. Already, the way that bees navigate is "being used to design self-autonomous aircraft that can fly in remote areas without the need for radio contact or satellite navigation," he wrote in the email. "We show that the miniature brain can definitely recognize faces, and if in the future we can work out the mechanisms by which this is achieved then perhaps there are insights to how to try novel recognition solutions." On the other hand, Dyer said, the findings probably don't back up an adage popular in some parts of the world--that you shouldn't kill a bee because its nestmates will remember and come after you. Bees may launch revenge attacks, but they might simply do so because they smell the dead bee, he remarked, adding that that's his speculation only. In any case, "bees don't normally go around looking at faces." 
From checker at panix.com Sun Jan 1 23:11:58 2006
From: checker at panix.com (Premise Checker)
Date: Sun, 1 Jan 2006 18:11:58 -0500 (EST)
Subject: [Paleopsych] Hartford Courant: The Mind of the Psychopath: Contours of Evil
Message-ID:

The Mind of the Psychopath: Contours of Evil
http://www.courant.com/news/health/hc-psychopathbrain1218.artdec18,0,209514.story?coll=hc-big-headlines-breaking
Researchers Study How The Mind Works When There's No Remorse
By WILLIAM HATHAWAY, Hartford Courant Staff Writer
December 18 2005

Dr. Kent A. Kiehl has interviewed dozens of psychopaths, and at their heinous acts he remains as astonished as he is repulsed. "I think, `I can't believe this guy is telling me he bashed in his mother's head with a propane tank,'" Kiehl says. Kiehl and a team of researchers at Hartford Hospital's Institute of Living are using brain scans in an attempt to explain the inexplicable: What makes some people absolutely devoid of empathy and remorse? Society needs answers because of the sheer havoc psychopaths create, the researchers say. Superficially charming, psychopaths lie, steal, rape, rob, embezzle, assault and abuse with no compunction, no conscience. And all psychopaths are notoriously impervious to rehabilitation. Psychopaths account for a quarter of all prisoners in the United States - and for as much as 50 percent of all violent crime, the researchers estimate. There are also hundreds of thousands of psychopaths in the United States who manage to stay out of prison, but who nonetheless dole out immeasurable amounts of pain in homes, schools, even corporate boardrooms. Within the pattern of bright blue and yellow blotches on the brain scans he has taken, Kiehl believes he has found the dark contours of the psychopathic mind. When psychopaths see or hear emotional words or pictures of misery, areas of their brains that should light up like a Christmas tree are dark and devoid of activity.
Instead, their brains process information such as a picture of a bereaved mother holding her dead child in the same way they would a picture of a chair or shovel. Psychopaths seem to know the words, but they can't hear the music, researchers often say. In probing the abyss of the psychopathic mind, Kiehl and others are raising questions about our criminal justice system and our assumptions about human morality.

Beyond Bundy

Kiehl's own quest began with stories his father, a newspaper editor, told about serial killer Ted Bundy, who grew up in the same Tacoma, Wash., neighborhood as the Kiehls. Bundy was the archetypal psychopath - handsome, disarmingly charming and utterly ruthless. His outwardly clean-cut appearance and his cunning - he volunteered for a suicide hot line and the Republican Party - made Bundy a virtuoso killer. He was known to use crutches as props and feign car trouble to induce young victims to give him a ride. He eventually confessed to more than two dozen murders, but he is thought to have killed dozens more during a spree in the mid- to late 1970s. "The question has always been, `What makes people do something like that?'" Kiehl said. Years later, while doing postgraduate studies in neurobiology at the University of California at Davis, Kiehl decided he would try to answer the question. He launched a campaign to get hired in the lab of the guru of psychopathy: Robert D. Hare, now professor emeritus of psychology at the University of British Columbia. Hare told him he "didn't hire Americans." But after a concerted sales pitch, which included a gift of baseball tickets to a Toronto Blue Jays game, Kiehl says Hare relented and hired the young researcher in 1994. It was an auspicious time in psychopathy research. Hare's research had given the nascent field some terminology to use. And new imaging technology was just beginning to open a window onto the dark world of the psychopathic brain. The personality type had been known for centuries.
In the 18th century, Frenchman Philippe Pinel coined the term "insanity without delirium" to describe aberrant behavior accompanied by a complete lack of remorse. The study of psychopathy in the United States dates from 1941, when Hervey Cleckley published a book called "The Mask of Sanity" that described psychopaths as unusually intelligent people, characterized by a "poverty of emotions." But it wasn't until Hare devised his psychopathy checklist in 1980 - which he revised in 1991 - that an easily identified set of personality characteristics defined the condition and opened up a field of research. "There was a gut feeling that there was something different" about psychopaths, Hare said. There was.

Grading Psychopaths

Psychopaths aren't crazy, at least in a traditional medical sense, but they are unfettered by any sense of shame or guilt. Symptoms can show up early in life. Psychopathic children have total disregard for rules and engage in unusually vicious assaults or torture animals. Kiehl has received a federal grant to see whether children diagnosed with "callous conduct disorder" might actually be budding psychopaths. Researchers have come to the conclusion that while a hostile environment can contribute to the development of psychopathy, many psychopaths are born, not made. Studies of twins suggest that psychopathic tendencies can develop even in loving homes. Some studies suggest that male psychopaths outnumber females by about 3 to 1. The general lack of social causes for the disorder is one reason why most experts no longer use the term "sociopath" to describe a psychopath. Researchers say as many as 1 out of every 100 people in the United States may meet the classification of a psychopath; serial killers make up a tiny minority of them. The revised psychopathy checklist, known as the PCL-R, lists 20 traits and behaviors common to the disorder.
Experts who are trained in administering the test score subjects with a 0, 1 or 2 on each item on the checklist. Hare said most people might score a 4 on his PCL-R checklist. A person is not designated a psychopath unless he or she scores 30 or more on the scale of 40. The higher the score, the more devastation a psychopath is likely to cause. Somebody who scores a 27 probably wouldn't be a great dinner guest. Psychopaths are pathological liars who crave stimulation, are sexually promiscuous and unable to control their behavior. They typically lack realistic long-term goals. They may be master manipulators, but psychopaths have a hard time concealing their nature from people trained to use the checklist, Kiehl said. Inevitably, they lie, boast or reveal their callousness. "They can't help themselves," he said. People who deal with psychopaths have observed another shared quality, one not on the checklist or easily measured. There is something different about their eyes. The gaze of the psychopath is disquieting, even frightening, and has been described as cold or penetrating, empty, reptilian, not quite human. They lack any depth to their emotions and the ability to connect emotion to cognition. "They don't quite get it," Hare said. "There is something missing."

`I Never Hurt People'

In the mid-1990s, in Canadian prisons, Kiehl began to perfect the art of using the checklist to score prisoners, who were told their interviews would not be shared with law enforcement authorities. In training tapes he recorded, Kiehl, a burly former football player, maintains a steady voice as he peppers subjects with short questions. In one tape, a 30-something man with long sideburns and thinning hair, dressed in a green windbreaker, answers Kiehl's questions with an easy smile and a collegial, confiding, "just between us boys" air. "Sideburns" confesses to bootlegging cigarettes and petty thefts. "Do you have a temper?" asks Kiehl, who is off camera.
"Oh yes, explosive," Sideburns answers.
"Do you assault people?"
"Oh, I never physically hurt people."
"What happens when you lose your temper?"
"Oh, I can just lose it. Like the time I killed my girlfriend."
The blurted truth comes quick as a cobra strike. There is nothing in the man's face or voice to suggest he even recognizes he has told a lie about hurting people. When he relates how he held his girlfriend's head under water in a bathtub, there is no hesitation or pause in his voice, no change in tenor or inflection that hints he is aware the interview has shifted to a different moral ground. "Police said she was already unconscious," he says, as if the statement absolves him of wrongdoing. He changes the subject to all the stolen electronic equipment he gave the woman. For the first time, Sideburns seems a bit worked up. When you steal electronics, he asks, "Do you know how hard it is to find remote controls?"

Spotting The Predators

Warning the public about the dangers of such psychopaths is a passion for Hare, author of the book "Without Conscience." Hare said zebras and other animals congregating around an African waterhole know to scatter when they see a lion. "There you can identify a predator, but psychopaths don't wear bells around their necks," Hare said. Psychopaths tend to thrive "where the rules are obscure, where there is chaotic upheaval," Hare said. "Countries such as Yugoslavia and the Soviet Union after their breakups were a warm niche for psychopaths, who simply moved in to take advantage of the chaos." A corporation that is disorganized and growing quickly offers the same type of fertile environment, Hare said. In a book tentatively titled "Snakes in Suits," to be published next spring, Hare blames scandals such as the destruction of Enron at least partly on a category of psychopaths who typically know how to stay out of jail.
Hare's checklist today has provided a generally accepted definition of the psychopath, the "who" and the "what" that allows researchers from different disciplines to study the phenomenon. Hare says his checklist has been both abused and underused. He railed against a judge in Texas, for instance, who has sentenced defendants to death because he has deemed them psychopaths, even though they were never examined by people trained to use his checklist. But Hare also says many parole boards underutilize psychopathic evaluations when considering whether to release a prisoner. How a prisoner scores on the 20 characteristics of Hare's checklist "is the best predictor of recidivism that we have," said Diana Fishbein, a researcher at RTI International of Research Triangle Park in North Carolina. About half of prisoners released from jail wind up back there within three years, Kiehl said. The number skyrockets to at least 4 in 5 when the prisoner is a psychopath. And psychopaths seem to be immune to any sort of therapy that might better those odds. One study explored whether group therapy might lower the recidivism rate of psychopaths. Sixty percent of untreated psychopaths in the study were back in jail after a year. But 80 percent of psychopaths who participated in group therapy were convicted of another offense in the same period. "They used the sessions to learn how to exploit the emotions of others," Kiehl said. Some people debate the value of using the checklist to determine sentences for individual killers. The recidivism rate is not 100 percent for psychopaths, noted Dr. Michael Norko, director of the Whiting Forensic Division of Connecticut Valley Hospital. And using the checklist is akin to doing a DNA analysis for an incurable disease. What are you going to do if you find it? "It would be different if we had a pill for psychopathy," Norko said. 
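The checklist arithmetic described earlier is simple enough to sketch in a few lines: 20 items, each rated 0, 1 or 2 by a trained examiner, for a maximum of 40, with a total of 30 or more marking Hare's designation. The function names and sample ratings below are illustrative assumptions for this sketch, not part of the PCL-R itself.

```python
# Illustrative sketch of the PCL-R scoring arithmetic: 20 items,
# each rated 0, 1 or 2; the research cutoff is 30 or more out of 40.
# Names and sample ratings are hypothetical.

def pcl_r_total(item_scores):
    """Sum 20 item ratings, validating the 0/1/2 range."""
    if len(item_scores) != 20:
        raise ValueError("the PCL-R has 20 items")
    if any(s not in (0, 1, 2) for s in item_scores):
        raise ValueError("each item is rated 0, 1, or 2")
    return sum(item_scores)

def meets_cutoff(total, cutoff=30):
    """Hare's cutoff for the psychopath designation: 30 of 40."""
    return total >= cutoff

# Hare's examples from the article: most people might total around 4;
# a 27, however unpleasant, falls short of the designation.
ordinary = [1, 1, 1, 1] + [0] * 16       # totals 4
high_scorer = [2] * 13 + [1] + [0] * 6   # totals 27

print(pcl_r_total(ordinary), meets_cutoff(pcl_r_total(ordinary)))        # 4 False
print(pcl_r_total(high_scorer), meets_cutoff(pcl_r_total(high_scorer)))  # 27 False
```

The cutoff comparison is the whole trick: the score is a plain sum, and the 30-of-40 threshold is what separates a merely unpleasant dinner guest from the designation the parole research relies on.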
Devoid Of Emotion Kiehl says he believes that brain imaging studies can pinpoint the biological cause of psychopathic behavior and possibly lead to a remedy, perhaps even a psychopathy pill. "If we could develop a treatment for psychopaths, it would alleviate an enormous burden on society," said Kiehl, who is director of the clinical cognitive neuroscience laboratory at the institute's Olin Neuropsychiatry Research Center. He has a theory of where in the brain to look. His previous work showed a peculiar pattern of brain activity in psychopaths when they were presented with different words or images. Using both an electroencephalograph (EEG), which measures electrical activity in the brain, and functional magnetic resonance imaging scans, which measure oxygen use, Kiehl found striking differences between psychopaths and non-psychopaths in the activity of several regions of the brain. He is particularly intrigued by abnormalities in psychopaths' brains, in what he calls the paralimbic system, a loose organization of brain structures involved in processing emotion. In most people, that picture of a distraught woman holding a dead child will trigger heightened activity in these brain areas, including a region called the amygdala. In contrast, the brains of criminal psychopaths respond much as they would to any inanimate object. Kiehl and other scientists have also found heightened brain activity in the frontal cortex of psychopaths when they are presented with emotionally charged words or images. The frontal cortex helps govern reason and planning. Some scientists have interpreted that as evidence that the root of psychopathic behavior lies in the frontal cortex. But Kiehl and others see it differently. People with injuries to the frontal cortex do not exhibit the goal-directed aggression or callousness often associated with psychopaths, Kiehl says. 
Kiehl believes psychopaths enlist areas of the frontal cortex to process information that the brain usually processes in its emotional centers. On his desk at the Institute of Living, Kiehl keeps a replica of a railroad spike, a memento of an 1848 accident that befell a Vermont construction foreman named Phineas Gage. An explosion drove a 3-foot-7-inch tamping iron through Gage's brain. The sheer improbability of his survival - the tamping iron entered under his cheekbone, exited the top of his skull and landed 25 feet away - assured Gage a place in the history of medical oddities. But the changes in his behavior made him famous. Gage, who had been a reliable worker and a sober, churchgoing, devoted family man, became an irresponsible cad, ignoring his wife, children and job. In short, he acted like a psychopath. Kiehl notes that the tamping iron damaged the paralimbic system in Gage's brain, the same areas that seem abnormal in the brains of psychopaths. Kiehl's theory explains, for instance, why psychopaths seldom seem to experience anxiety or fear in the same way normal people do and why they do not fully comprehend the meaning of emotions such as love or compassion. "For a psychopath, it is all cognition," Kiehl said. His lab has received federal grants totaling $6 million for the study of psychopathy. In one study, he is investigating whether one reason that drug abuse treatment programs have a high failure rate in prisoners is that so many psychopaths are enrolled. Psychopaths do not respond to traditional treatment, and Kiehl suspects that, while psychopaths are heavy drug and alcohol abusers, they do not develop the same sort of dependency on drugs as non-psychopaths. To test his hypothesis, he hopes to persuade Connecticut correctional officials to allow his team to study teen and adult inmates. If Kiehl's ideas are borne out by research, they may suggest ways to change psychopathic behavior. 
For now, Hare believes that any therapeutic approach must appeal to the psychopath's own self-interest because treatments based on an appreciation of somebody else's feelings are bound to fail. Understanding the underlying physiology of the disorder could lead to a drug that might actually restore emotional responses and cure psychopaths, said Dr. James Blair, an expert in psychopathy and a researcher at the National Institute of Mental Health, part of the National Institutes of Health. Blair points out that the symptoms of psychopathy are almost exactly the opposite of symptoms of people who suffer from post-traumatic stress and anxiety disorders - conditions for which treatments now exist. Roots Of Morality Kiehl hopes that by explaining how psychopaths' minds work, he can help arm society with the tools to deal with them. One of his research associates, Jana Schaich-Borg, also wants to answer a more fundamental question: Why are most humans moral in the first place? If Kiehl is correct that a failure of the emotional processing centers of the brain is at the root of psychopathy, then it follows that moral behavior might arise in those same areas. If a pill could create emotional responses in a psychopath, could such a drug also give him a moral core? Schaich-Borg plans to investigate whether psychopaths feel disgust - the deeply ingrained reaction that people in most cultures have about, say, handling feces or having sex with a sibling. She speculates that the areas of the brain that govern disgust in a normal person may also play a role in the formation of more sophisticated moral beliefs, which are absent in psychopaths. For years, the link between instincts and moral decision-making has been inferred from fictional ethical scenarios. Schaich-Borg offers one example: Five people are tied up on a railroad track and a locomotive barrels toward them. 
You can save them, but only by pulling a lever and switching the locomotive to a different track, where two other people are tied. Do you pull the lever? People answer instinctively, and study after study shows that they are split right down the middle and argue their positions passionately. "Some people say they won't play God under any circumstance," said Schaich-Borg, who added that she personally would pull the lever. But what if you could save the people on both sides of the railway spur by shoving a single man in front of the train? "Nearly everybody says no," she said. But, she said, a psychopath wouldn't care a whit whether the lever was pulled or not. She wants to compare what happens in people's brains when the question is asked. In the pattern of neural activity, she believes she may see the outline of human morality. And those imaging scans may illustrate why predators such as Ted Bundy are a rarity, rather than the rule, in society. Kiehl says most people probably make moral choices using both rational and emotional parts of their brain. But he and Hare both say much more research needs to be done to shed light into the abyss of the psychopathic mind. "Unless we understand what makes these people tick," Hare says, "we are all going to suffer." A discussion of this story with Courant Staff Writer William Hathaway is scheduled to be shown on New England Cable News each hour Monday between 9 a.m. and noon. From checker at panix.com Sun Jan 1 23:12:09 2006 From: checker at panix.com (Premise Checker) Date: Sun, 1 Jan 2006 18:12:09 -0500 (EST) Subject: [Paleopsych] NYTBR: 'The Man Everybody Knew: Bruce Barton and the Making of Modern America,' by Richard M. Fried Message-ID: 'The Man Everybody Knew: Bruce Barton and the Making of Modern America,' by Richard M. 
Fried http://www.nytimes.com/2005/12/18/books/review/18kazin.html [I read this bestselling book about how Jesus was a master salesman about thirty years ago and remember it fondly as a lesson that each generation makes Christ over into its own image. I am glad the book is being remembered. I should not be surprised if Mr. Mencken had the same reaction as I did, but I haven't found any trace of his commentary.] Review by MICHAEL KAZIN THE MAN EVERYBODY KNEW Bruce Barton and the Making of Modern America. By Richard M. Fried. Illustrated. 286 pp. Ivan R. Dee. $27.50. IF consumerism is our secular religion, then copywriters are its evangelists. No one in the golden days of the American advertising industry preached the faith more fervently or effectively than Bruce Barton. The affable son of a liberal Protestant minister, he created much of the copy that propelled Batten, Barton, Durstine & Osborn, the agency he helped found, to the top of its industry during the 1920's. Barton always believed the best ads were ones that depicted corporations as the fount of services that transcended the particular product on offer. For General Motors, he composed the inspiring tale of a doctor whose reliable auto sped him to the bedside of a failing young girl. One historian has labeled such ads essential to "creating the corporate soul," and Barton pursued it with a singular passion. But it was his selling of Jesus that transformed the ad man into a celebrity. In 1925, Barton published "The Man Nobody Knows," which quickly became an enormous best seller - and one of the most easily ridiculed examples of pop theology ever written. He urged readers to banish the image of the long-haired, "sissified" figure who gazed woefully from Victorian lithographs. Barton's Jesus was a muscular "outdoor man" and a "sociable" fellow in demand at Jerusalem's best banquet tables. More to the point, he was a masterly entrepreneur. 
Hadn't this humble carpenter "picked up 12 men from the bottom ranks of business and forged them into an organization that conquered the world"? From his father, Barton had learned that a "preacher is really a salesman." The son simply reversed the nouns. On the wings of his prosperity and fame, Barton rose to the inner circle of the Republican Party. He helped to write major speeches for President Calvin Coolidge and to devise the campaign of his successor, Herbert Hoover. Barton refused to become depressed in the months after the stock market crashed. "Anyone who looks gloomily at the business prospects of this country in 1930 is going broke," he predicted. In the late 30's, Barton proved that he could also sell himself. He was twice elected to the House, by huge margins, from the East Side of Manhattan. Down at the Capitol, Barton warned, in a tone of atypical grimness, that a third term for Franklin Roosevelt would mean "the end of freedom." In return, Roosevelt helped sink his 1940 campaign for the Senate. Barton retreated to his agency. Until his death a quarter-century later, he surfaced mostly as an elder statesman for anodyne causes like fighting heart disease and urging brotherhood between Christians and Jews. It is surprising to learn this is the first biography of Barton, whose name was indeed once familiar to any American who read a daily paper. Richard M. Fried, a professor of history at the University of Illinois at Chicago, provides a suitably brisk, anecdote-filled account, which focuses on how the master publicist's clever optimism suffused his words - whether they were designed to promote Christ, a corporation or the Republican Party. Fried concludes that Barton was a more ambivalent figure than he seemed to his contemporaries. He extolled consumerism, yet fretted about the loss of the old "values of work and self-restraint." He wrote homilies to big business, yet increasingly viewed ads as superfluous and banal. 
Unfortunately, Fried doesn't attempt to make sense of these contradictions or to justify the cliché of the subtitle. The question is not whether Barton helped "make modern America" but to what purpose. Perhaps the absence of a previous biography reflects the fact that those who succeed at advertising and public relations merely hold up gilded mirrors to society rather than helping to improve it. Bruce Barton contributed his drops of wisdom to an onrushing tide. The man whom everybody once knew may also have been someone neither business nor politics nor religion really needed. Michael Kazin, who teaches history at Georgetown University, is the author of the forthcoming book "A Godly Hero: The Life of William Jennings Bryan." From checker at panix.com Sun Jan 1 23:12:22 2006 From: checker at panix.com (Premise Checker) Date: Sun, 1 Jan 2006 18:12:22 -0500 (EST) Subject: [Paleopsych] NYT: Remote and Poked, Anthropology's Dream Tribe Message-ID: Remote and Poked, Anthropology's Dream Tribe http://www.nytimes.com/2005/12/18/international/africa/18tribe.html [This recalls the Pilgrims of Massachusetts in 1620 being greeted by an Indian who spoke English, evidently learned from the earlier settlers at Jamestown.] By MARC LACEY LEWOGOSO LUKUMAI, Kenya - The rugged souls living in this remote desert enclave have been poked, pinched and plucked, all in the name of science. It is not always easy, they say, to be the subject of a human experiment. "I thought I was being bewitched," Koitaton Garawale, a weathered cattleman, said of the time a researcher plucked a few hairs from atop his head. "I was afraid. I'd never seen such a thing before." Another member of the tiny and reclusive Ariaal tribe, Leketon Lenarendile, scanned a handful of pictures laid before him by a researcher whose unstated goal was to gauge whether his body image had been influenced by outside media. 
"The girls like the ones like this," he said, repeating the exercise later and pointing to a rather slender man much like himself. "I don't know why they were asking me that," he said. Anthropologists and other researchers have long searched the globe for people isolated from the modern world. The Ariaal, a nomadic community of about 10,000 people in northern Kenya, have been seized on by researchers since the 1970's, after one - an anthropologist, Elliot Fratkin - stumbled upon them and began publishing his accounts of their lives in academic journals. Other researchers have done studies on everything from their cultural practices to their testosterone levels. National Geographic focused on the Ariaal in 1999, in an article on vanishing cultures. But over the years, more and more Ariaal - like the Masai and the Turkana in Kenya and the Tuaregs and Bedouins elsewhere in Africa - are settling down. Many have migrated closer to Marsabit, the nearest town, which has cellphone reception and even sporadic Internet access. The scientists continue to arrive in Ariaal country, with their notebooks, tents and bizarre queries, but now they document a semi-isolated people straddling modern life and more traditional ways. "The era of finding isolated tribal groups is probably over," said Dr. Fratkin, a professor at Smith College who has lived with the Ariaal for long stretches and is regarded by some of them as a member of the tribe. For Benjamin C. Campbell, a biological anthropologist at Boston University who was introduced to the Ariaal by Dr. Fratkin, their way of life, diet and cultural practices make them worthy of study. Other academics agree. Local residents say they have been asked over the years how many livestock they own (many), how many times they have had diarrhea in the last month (often) and what they ate the day before yesterday (usually meat, milk or blood). 
Ariaal women have been asked about the work they do, which seems to exceed that of the men, and about local marriage customs, which compel their prospective husbands to hand over livestock to their parents before the ceremony can take place. The wedding day is one of pain as well as joy since Ariaal women - girls, really - have their genitals cut just before they marry and delay sex until they recuperate. They consider their breasts important body parts, but nothing to be covered up. The researchers may not know this, but the Ariaal have been studying them all these years as well. The Ariaal note that foreigners slather white liquid on their very white skin to protect them from the sun, and that many favor short pants that show off their legs and the clunky boots on their feet. Foreigners often partake of the local food but drink water out of bottles and munch on strange food in wrappers between meals, the Ariaal observe. The scientists leave tracks as well as memories behind. For instance, it is not uncommon to see nomads in T-shirts bearing university logos, gifts from departing academics. In Lewogoso Lukumai, a circle of makeshift huts near the Ndoto Mountains, nomads rushed up to a visitor and asked excitedly in the Samburu language, "Where's Elliot?" They meant Dr. Fratkin, who describes in his book "Ariaal Pastoralists of Kenya" how in 1974 he stumbled upon the Ariaal, who had been little known until then. With money from the University of London and the Smithsonian Institution, he was traveling north from Nairobi in search of isolated agro-pastoralist groups in Ethiopia. But a coup toppled Haile Selassie, then the emperor, and the border between the countries was closed. So as he sat in a bar in Marsabit, a boy approached and, mistaking him for a tourist, asked if he wanted to see the elephants in a nearby forest. When the aspiring anthropologist declined, the boy asked if he wanted to see a traditional ceremony at a local village instead. That was Dr. 
Fratkin's introduction to the Ariaal, who share cultural traits with the Samburu and Rendille tribes of Kenya. Soon after, he was living with the Ariaal, learning their language and customs while fighting off mosquitoes and fleas in his hut of sticks covered with grass. The Ariaal wear sandals made from old tires and many still rely on their cows, camels and goats to survive. Drought is a regular feature of their world, coming at regular intervals and testing their durability. "I was young when Elliot first arrived," recalled an Ariaal elder known as Lenampere in Lewogoso Lukumai, a settlement that moves from time to time to a new patch of sand. "He came here and lived with us. He drank milk and blood with us. After him, so many others came." Over the years, the Ariaal have had hairs pulled not just from their heads, but also from their chins and chests. They have spat into vials to provide saliva samples. They have been quizzed about how often they urinate. Sometimes the questioning has become even more intimate. Mr. Garawale recalls a visiting anthropologist measuring his arms, back and stomach with an odd contraption and then asking him how often he got erections and whether his sex life was satisfactory. "It was so embarrassing," recalled the father of three, breaking out in giggles even years later. Not all African tribes are as welcoming to researchers, even those with the necessary permits from government bureaucrats. But the Ariaal have a reputation for cooperating - in exchange, that is, for pocket money. "They think I'm stupid for asking dumb questions," said Daniel Lemoille, headmaster of the school in Songa, a village outside of Marsabit for Ariaal nomads who have settled down, and a frequent research assistant for visiting professors. "You have to try to explain that these same questions are asked to people all over the world and that their answers will help advance science." 
The researchers arriving in Africa in droves, probing every imaginable issue, every now and then leave controversy in their wake. In 2004, for instance, a Kenyan virologist sued researchers from Britain for taking blood samples out of the country that he said had been obtained from a Nairobi orphanage for H.I.V.-positive children without government permission. The Ariaal have no major gripes about the studies, although the local chief in Songa, Stephen Lesseren, who wore a Boston University T-shirt the other day, said he wished their work would lead to more tangible benefits for his people. "We don't mind helping people get their Ph.D.'s," he said. "But once they get their Ph.D.'s, many of them go away. They don't send us their reports. What have we achieved from the plucking of our hair? We want feedback. We want development." Even when conflicts break out in the area, as happened this year as members of rival tribes slaughtered each other, victimizing the Ariaal, the research does not cease. With tensions still high, John G. Galaty, an anthropologist at McGill University in Montreal who studies ethnic conflicts, arrived in northern Kenya to question them. In a study in The International Journal of Impotence Research, Dr. Campbell also found that Ariaal men with many wives showed less erectile dysfunction than did men of the same age with fewer spouses. Dr. Campbell's body image study, published in The Journal of Cross-Cultural Psychology this year, also found that Ariaal men are much more consistent than men in other parts of the world in their views of the average man's body and what they think women want. Dr. Campbell came across no billboards or international magazines in Ariaal country and only one television in a local restaurant that played CNN, leading him to contend that Ariaal men's views of their bodies were less affected by media images of burly male models with six-pack stomachs and rippling chests. 
To test his theories, a nonresearcher without a Ph.D. showed a group of Ariaal men a copy of Men's Health magazine full of pictures of impossibly well-sculpted men and women. The men looked on with rapt attention and admired the chiseled forms. "That one, I like," said one nomad who was up in years, pointing at a photo of a curvy woman who was clearly a regular at the gym. Another old-timer gazed at the bulging pectoral muscles of a male bodybuilder in the magazine and posed a question that got everybody talking. Was it a man, he asked, or a very, very strong woman? From checker at panix.com Sun Jan 1 23:14:41 2006 From: checker at panix.com (Premise Checker) Date: Sun, 1 Jan 2006 18:14:41 -0500 (EST) Subject: [Paleopsych] NYT: Exposing the Economics Behind Everyday Behavior Message-ID: Exposing the Economics Behind Everyday Behavior http://www.nytimes.com/2005/12/18/business/yourmoney/18shelf.html [These are articles stacked up, as I was complying with Howard's request to keep my posting down to seven a day. I'm not breaking my Gregorian New Calendar New Year's resolutions. Sorry about the bad formatting of some of these articles. I was using different software and can get the lines right only with a lot of extra work.] Off the Shelf By ROGER LOWENSTEIN A FUNNY thing seems to be happening to economics writing: it's getting better. In recent books like "Freakonomics" and "The Travels of a T-Shirt in the Global Economy," economists have taken it upon themselves to explain something of how the world works. They even tell little stories. What interests Tim Harford, the author of "The Undercover Economist," are the stories behind the myriad little transactions that take place every day. Do you drive to work or ride a subway? Do you buy coffee en route? Is it a high-priced, frothy variety, or something plainer? And if it's the first kind, why is it so darn expensive, when the incremental cost of steaming a little milk amounts to only pennies? 
One question that interests Mr. Harford is: What will persuade you to fork over $26 for a copy of "The Undercover Economist" (Oxford University Press)? He seems to believe that witty, bracing prose will do the trick. "I would like to thank you for buying this book, but if you're anything like me you haven't bought it at all," he begins. "Instead, you've carried it into the bookstore cafe and even now are sipping a cappuccino in comfort while you decide whether it's worth your money." While we're on the subject of that cappuccino, Mr. Harford explains that Starbucks would like to charge each of us exactly what we are willing to pay, but that it would simply not do for it to advertise "Cappuccino for the Lavish, $3," and "Cappuccino for the Thrifty, $1." It has to be clever about it. Something like, "Hot Chocolate, $2.20; Caffe Mocha, $2.75; 20 oz. Cappuccino, $3.40." To the customer, the choice of drinks is what matters. To Starbucks, it is the choice of prices. Similarly, when Disney World in Florida offers discounts to people who live in the Orlando area, Mr. Harford observes, "They're not making a statement about the grinding poverty of the Sunshine State." They are making an educated guess that out-of-towners, who visit only once in a while, are willing to pay more than people from nearby. The author, a Briton who lives in Washington and who writes the cheeky Dear Economist column for The Financial Times, says that "there is a story to tell" in nearly every such interaction. For instance, Whole Foods lures you to spend more by offering distinct and - relative to what the competition offers - more expensive foods. It sells organic broccoli in addition to the customary industrial-strength variety, and it is careful never to display them side by side, because you would then notice the difference in price. Whole Foods wants you to be thinking only about the incremental good health that organic broccoli presumably confers. "The economist's job," Mr. 
Harford says, "is to shine a spotlight on the underlying process." Sounds reasonable, but that is not what most economists actually do. Most professional economists are paid to predict the future. This is why so much of economics writing is dull - and pretty silly. No one can predict the future, least of all an economist. Mr. Harford fancies himself to be more like a detective - an "undercover" economist. Perhaps he is less policeman than psychologist. Psychologists are not much good at predictions, either, but they do help us understand behavior, and recognize what sort of social settings induce people to behave better or worse. Just so, Mr. Harford's undercover op is a creature of incentives. Recalling that in his university days, student clubs allowed unlimited drinking in return for an upfront fee, he notes that these encouraged bingeing because the cost of each additional drink was zero. What matters in terms of limiting intake is the marginal cost of each new drink. So, too, with reducing automobile traffic: it's not the average cost per trip that matters, but the cost of getting into your car each additional time. To an economist, the truly interesting decisions are those that occur at the margin - the point at which one employee more is hired, one dollar more is invested, one cappuccino (on top of all those you have already imbibed) is drunk. Mr. Harford explains this central concept by returning to the source - namely, the classical economist David Ricardo's explication of how the yield from a marginal piece of land determined rents in pre-industrial England. The author is good at showing how such basic concepts apply across a complex modern economy. After observing that rents in London today are higher thanks to the surrounding Green Belt, which cuts off development, he notes that as an undercover economist, "you start to see 'green belts' of one kind or another all over the place." 
For instance, professional associations that restrict entry into, say, medicine serve as green belts that shield doctors from competition. NONE of this is the least bit unconventional. Though the author enjoys being politically incorrect ("sweatshops are good news," he offers tartly), he is not economically incorrect. In fact, lively presentation aside, he has written a pretty standard primer, one that defends free markets to a fault and attacks government as the source of just about everything bad. Predictably, he says that the best way to limit pollution is through free-market incentives; he then goes overboard by suggesting that environmental debates are mere "moral posturing." Yet without some discussion first, it is unlikely that we would have developed any incentives. And some of his arguments are far too brief to carry their intended weight. The author cannot really expect to explain "Why Poor Countries Are Poor" in a single chapter, the highlight of which is an interview with a cabdriver in Cameroon. A final criticism is that too many of Mr. Harford's interesting details lack a source or a footnote. He gets Amazon's stock-price history wrong. (The author says that during the dot-com bust, it fell below its initial offering price; adjusting for splits, it never did.) Many other details lack the specificity or the attribution to enable one to check. But these are quibbles. For those of you, even now, still stuck in the bookstore cafe, this is a book to savor. 
From checker at panix.com Sun Jan 1 23:16:41 2006 From: checker at panix.com (Premise Checker) Date: Sun, 1 Jan 2006 18:16:41 -0500 (EST) Subject: [Paleopsych] NYT: Robert Luce, 83, Former Editor And Publisher of New Republic, Is Dead Message-ID: Robert Luce, 83, Former Editor And Publisher of New Republic, Is Dead http://www.nytimes.com/2005/12/18/arts/18luce.html [This is more personal than anything else, but I thought you might like to know a little more about your Checker of Premises and his wonderful wife.] Sarah worked for Luce from 1973-75. At the time, it was the largest independent book publisher in Washington, D.C., issuing about ten books a year. How things have changed since then! Their books were distributed by David McKay (which has the honor of publishing Walt Whitman's _Leaves of Grass_), and back then, before the Internet, getting distributed was usually the only path to success. Luce was bought by Robert van Roijen, and Sarah worked directly under the late Joseph J. Binns. Three notable books are: Edward J. Gilfillan's _Migration to the Stars_ (1975). Though it was a better book, I thought, than Gerard K. O'Neill's _The High Frontier_, the latter book got thoroughly promoted, went to paperback, and became widely read. Reginald R. Gerig's _Famous Pianists and Their Technique_ (also 1975), a thorough compendium of pianists, is held by 939 libraries covered by WorldCat, which covers mostly the United States. It is still in print, $28 for the paperback, and rates four out of five stars at Amazon. Robert W. Whitaker's _A Plague on Both Your Houses_ (1976) was the first popular book to apply the Public Choice perspective to American politics throughout its history and did not hesitate to describe the liberal Human Betterment Industry as self-interested. After van Roijen retired, the aforementioned Joseph J. Binns established his own company under that name. I found a copy of Lawrence R. 
Brown, _The Might of the West_ (NY: Ivan Obolensky, 1963), my very favorite book, a panorama of world history from a macrohistorical perspective (Spengler without the mysticism), and have read it a dozen times. Though Joe disagreed with its politics, he found the book provocative and reprinted it in 1979. Used copies of this underground classic on Bookfinder go for $32-$100. ---------- By MONICA POTTS Robert B. Luce, a former editor and publisher of The New Republic who founded his own book publishing house, died on Nov. 29 in Boca Raton, Fla. He was 83. He died in a nursing home, his family said. Mr. Luce, known as Bob, began his career working for magazines, taking over at The New Republic in 1963. He edited a book compilation for the magazine's 50th anniversary, which was published in 1964. In the early 1960's, he also founded his own general-interest book publishing house, Robert B. Luce Inc., the first of its kind in Washington. Mr. Luce left The New Republic in 1966, sold his publishing house and returned to New York City, eventually working for Time-Life Books as director of editorial planning; his family said he was only distantly related to Henry R. Luce, the founder of Time-Life. He also worked for several other organizations, including Metromedia Inc. and Harcourt Brace. He moved from New York and began teaching journalism in 1997 at Florida Atlantic University, which he left in 2001. Robert Bonner Luce was born in 1922 in Grosse Pointe, Mich. He served in the Army Air Forces in World War II and graduated from Antioch College in Ohio with a bachelor's degree in economics in 1946. Mr. Luce is survived by his wife, Iris, of Boca Raton, Fla.; three daughters, Jennifer Luce-Reynolds of Boulder, Colo.; Ann Luce Auerbach of New York City; and Jan Luce Nance of Burbank, Calif.; two sisters, Gwen Briggs of Rowayton, Conn., and Jean Davis of Ann Arbor, Mich.; one brother, Chris, of Florida; and four grandsons. Another daughter, Kathryn, died in 1992. 
From shovland at mindspring.com Mon Jan 2 16:14:04 2006 From: shovland at mindspring.com (Steve Hovland) Date: Mon, 2 Jan 2006 08:14:04 -0800 Subject: [Paleopsych] Great Pictures for all you animal lovers ! ! ! Message-ID: -----Original Message----- From: LELNAC1947 at aol.com [mailto:LELNAC1947 at aol.com] Sent: Wednesday, December 28, 2005 8:20 PM To: LELNAC1947 at aol.com Subject: Re: Great Pictures for all you animal lovers ! ! ! These are great.....most new, a few reruns.....guaranteed to make you smile and go ahhhh!! __________________________________________________ ____ Live everyday with enjoyment we don't know what tomorrow will give us. Forwarded By: Lee Utley -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: not available Type: image/jpeg Size: 28022 bytes Desc: not available URL: 
From shovland at mindspring.com Mon Jan 2 17:02:36 2006 From: shovland at mindspring.com (Steve Hovland) Date: Mon, 2 Jan 2006 09:02:36 -0800 Subject: [Paleopsych] CDC warning Message-ID: The Center for Disease Control has issued a warning about a new virulent strain of Sexually Transmitted Disease. The disease is contracted through dangerous and high-risk behavior. The disease is called Gonorrhea Lectim and pronounced "gonna re-elect him." Many victims contracted it in 2004, after having been screwed for the past four years. Cognitive characteristics of individuals infected include: anti-social personality disorders, delusions of grandeur with messianic overtones, extreme cognitive dissonance, inability to incorporate new information, pronounced xenophobia and paranoia, inability to accept responsibility for own actions, cowardice masked by misplaced bravado, uncontrolled facial smirking, ignorance of geography and history, tendencies toward evangelical theocracy, categorical all-or-nothing behavior. Naturalists and epidemiologists are amazed at how this destructive disease originated only a few years ago from a bush found in Texas. From checker at panix.com Mon Jan 2 20:55:20 2006 From: checker at panix.com (Premise Checker) Date: Mon, 2 Jan 2006 15:55:20 -0500 (EST) Subject: [Paleopsych] NZ Herald: Homeopathy Is Bunk, says Professor Who Put It To Test Message-ID: Homeopathy Is Bunk, says Professor Who Put It To Test The New Zealand Herald December 19, 2005 Monday Homeopathy is bunk, says professor who put it to test London - Millions of people use it to deal with illnesses ranging from asthma to migraines. Prince Charles believes it is the answer to many of the evils of modern life. But now Britain's first professor of complementary medicine, Edzard Ernst of Exeter University in south-west England, has denounced homeopathy as ineffective. 
"Homeopathic remedies don't work," he told the Observer. "Study after study has shown it is simply the purest form of placebo. You may as well take a glass of water than a homeopathic medicine." Chiropractic, which involves spine manipulation to treat illnesses, and the laying on of hands to cure patients are equally invalid, he says. His views and his studies have provoked furious reactions. Chiropractors and homeopaths have written in droves to denounce him. But now the scourge of alternative medicine says he is going to have to quit because Exeter will no longer support him or his department. The university denied the charge. "Professor Ernst's department has enough money to go on for a couple of more years," said a spokesman. "We are still trying to raise cash." In 1993, Professor Ernst, then a professor of rehabilitation medicine in Vienna, took the job to bring scientific rigour to the study of alternative medicines, an approach that has made him a highly controversial figure in the field. An example is provided by his study of arnica, a standard homeopathic treatment for bruising. "We arranged for patients after surgery to be given arnica or a placebo," he said. "They didn't know which they were getting. It made no difference. They got better at the same rate." Professor Ernst also found no evidence homeopathy helped with asthma, which is said to be particularly responsive to such treatments. Britain has five homeopathic hospitals, which are funded by the country's health service (NHS). "The treatments do no good," said Professor Ernst. "But the long interview - about an hour-and-a-half - carried out by an empathetic practitioner during diagnosis may explain why people report improvements in their health." The incredibly dilute solutions used by homeopaths also make no sense, he added. "If it were true, we would have to tear up all our physics and chemistry textbooks." Professor Ernst insists he is a supporter of complementary medicines. 
"No other centre in the world has produced more positive results than we have to support complementary medicine," he said. "Herbal medicine, for instance, can do good. If I was mildly depressed, I think St John's wort would be a good treatment. It has fewer side-effects than Prozac. "Acupuncture seems to work for some conditions and there are relaxing techniques, including hypnotherapy, that can be effective. "These should not be used on their own, but as complements to standard medicines." Professor Ernst has been attacked by chiropractors and homeopaths. The latter point to studies they say show that most patients they treat are satisfied and cite an analysis in the Lancet of 89 trials in which their medicines were found to be effective. The Smallwood report, commissioned by Prince Charles, calls for more complementary medicines to be given on the NHS. Like with like * Homeopathy is a controversial system of alternative medicine more than 300 years old. * It calls for treating "like with like", a doctrine referred to as the "Law of Similars". The practitioner considers all a patient's symptoms then chooses as a remedy a substance that produces a similar set of symptoms in healthy subjects. The remedy is usually given in tiny concentrations. * Many of its claims are at odds with modern medicine and the scientific method. From checker at panix.com Mon Jan 2 20:55:31 2006 From: checker at panix.com (Premise Checker) Date: Mon, 2 Jan 2006 15:55:31 -0500 (EST) Subject: [Paleopsych] Light Planet: Book of Mormon Literature Message-ID: Book of Mormon Literature http://www.lightplanet.com/mormons/basic/bom/literature_eom.htm by Richard Dilworth Rust and Donald W. 
Parry. Although understated as literature in its clear and plain language, the Book of Mormon exhibits a wide variety of literary forms, including intricate Hebraic poetry, memorable narratives, rhetorically effective sermons, diverse letters, allegory, figurative language, imagery, symbolic types, and wisdom literature. In recent years these aspects of Joseph Smith's 1829 English translation have been increasingly appreciated, especially when compared with biblical and other ancient forms of literature. There are many reasons to study the Book of Mormon as literature. Rather than being "formless," as claimed by one critic (Bernard DeVoto, American Mercury 19 [1930]:5), the Book of Mormon is both coherent and polished (although not obtrusively so). It tells "a densely compact and rapidly moving story that interweaves dozens of plots with an inexhaustible fertility of invention and an uncanny consistency that is never caught in a slip or contradiction" (CWHN 7:138). Despite its small working vocabulary of about 2,225 root words in English, the book distills much human experience and contact with the divine. It presents its themes artfully through simple yet profound imagery, direct yet complex discourses, and straightforward yet intricate structures. To read the Book of Mormon as literature is to discover how such literary devices are used to convey the messages of its content. Attention to form, diction, figurative language, and rhetorical techniques increases sensitivity to the structure of the text and appreciation of the work of the various authors. The stated purpose of the Book of Mormon is to show the Lamanites, a remnant of the House of Israel, the covenants made with their fathers, and to convince Jew and Gentile that Jesus is the Christ (see Book of Mormon Title Page). Mormon selected materials and literarily shaped the book to present these messages in a stirring and memorable way. 
While the discipline of identifying and evaluating literary features in the Book of Mormon is very young and does not supplant a spiritual reading of the text, those analyzing the book from this perspective find it a work of immediacy that shows as well as tells, as great literature usually does. It no longer fits Mark Twain's definition of a classic as essentially a book everyone talks about but no one reads; rather, it is a work that "wears you out before you wear it out" (J. Welch, "Study, Faith, and the Book of Mormon," BYU 1987-88 Devotional and Fireside Speeches, p. 148 [Provo, Utah, 1988]). It is increasingly seen as a unique work that beautifully and compellingly reveals and speaks to the essential human condition. POETRY. Found embedded in the narrative of the Book of Mormon, poetry provides the best examples of the essential connection between form and content in the Book of Mormon. When many inspired words of the Lord, angels, and prophets are analyzed according to ancient verse forms, their meaning can be more readily perceived. These forms include line forms, symmetry, parallelism, and chiastic patterns, as defined by Adele Berlin (The Dynamics of Biblical Parallelism [Bloomington, Ind., 1985]) and Wilfred Watson (Classical Hebrew Poetry [Sheffield, 1984]). Book of Mormon texts shift smoothly from narrative to poetry, as in this intensifying passage: But behold, the Spirit hath said this much unto me, saying: Cry unto this people, saying--Repent ye, and prepare the way of the Lord, and walk in his paths, which are straight; for behold, the kingdom of heaven is at hand, and the Son of God cometh upon the face of the earth [Alma 7:9]. The style of the Book of Mormon has been criticized by some as being verbose and redundant, but in most cases these repetitions are orderly and effective. For example, parallelisms, which abound in the Book of Mormon, serve many functions. 
They add emphasis to twice-repeated concepts and give definition to sharply drawn contrasts. A typical synonymous parallelism is in 2 Nephi 9:52: Pray unto him continually by day, and give thanks unto his holy name by night. Nephi's discourse aimed at his obstinate brothers includes a sharply antithetical parallelism: Ye are swift to do iniquity But slow to remember the Lord your God. [1 Ne. 17:45.] Several fine examples of chiasmus (an a-b-b-a pattern) are also found in the Book of Mormon. In the Psalm of Nephi (2 Ne. 4:15-35), the initial appeals to the soul and heart are accompanied by negations, while the subsequent mirrored uses of the heart and soul are conjoined with strong affirmations, making the contrasts literarily effective and climactic: Awake, my soul! No longer droop in sin. Rejoice, O my heart, and give place no more for the enemy of my soul. Do not anger again because of mine enemies. Do not slacken my strength because of mine afflictions. Rejoice, O my heart, and cry unto the Lord, and say: O Lord, I will praise thee forever; yea, my soul will rejoice in thee, my God, and the rock of my salvation. [2 Ne. 4:28-30.] Other precise examples of extended chiasmus (a-b-c-c-b-a) are readily discernible in Mosiah 5:10-12 and Alma 36:1-30 and 41:13-15. This literary form in Alma 36 effectively focuses attention on the central passage of the chapter (Alma 36:17-18); in Alma 41, it fittingly conveys the very notion of restorative justice expressed in the passage (cf. Lev. 24:13-23, which likewise uses chiasmus to convey a similar notion of justice). Another figure known as a fortiori is used to communicate an exaggerated sense of multitude, as in Alma 60:22, where a "number parallelism" is chiastically enclosed by a twice-repeated phrase: Yea, will ye sit in idleness while ye are surrounded with thousands of those, yea, and tens of thousands, who do also sit in idleness? Scores of Book of Mormon passages can be analyzed as poetry. 
They range from Lehi's brief desert poems (1 Ne. 2:9-10, a form Hugh Nibley identifies as an Arabic qasida) [CWHN 6:270-75] to extensive sermons of Jacob, Abinadi, and the risen Jesus (2 Ne. 6-10; Mosiah 12-16; and 3 Ne. 27). NARRATIVE TEXTS. In the Book of Mormon, narrative texts are often given vitality by vigorous conflict and impassioned dialogue or personal narration. Nephi relates his heroic actions in obtaining the brass plates from Laban; Jacob resists the false accusations of Sherem, upon whom the judgment of the Lord falls; Ammon fights off plunderers at the waters of Sebus and wins the confidence of king Lamoni; Amulek is confronted by the smooth-tongued lawyer Zeezrom; Alma2 and Amulek are preserved while their accusers are crushed by collapsing prison walls; Captain Moroni1 engages in a showdown with the Lamanite chieftain Zerahemnah; Amalickiah rises to power through treachery and malevolence; a later prophet named Nephi2 reveals to an unbelieving crowd the murder of their chief judge by the judge's own brother; and the last two Jaredite kings fight to the mutual destruction of their people. Seen as a whole, the Book of Mormon is an epic account of the history of the Nephite nation. Extensive in scope with an eponymic hero, it presents action involving long and arduous journeys and heroic deeds, with supernatural beings taking an active part. Encapsulated within this one-thousand-year account of the establishment, development, and destruction of the Nephites is the concentrated epic of the rise and fall of the Jaredites, who preceded them in type and time. (For its epic milieu, see CWHN 5:285-394.) The climax of the book is the dramatic account of the visit of the resurrected Jesus to an assemblage of righteous Nephites. SERMONS AND SPEECHES. Prophetic discourse is a dominant literary form in the Book of Mormon. 
Speeches such as King Benjamin's address (Mosiah 1-6), Alma2's challenge to the people of Zarahemla (Alma 5), and Mormon's teachings on faith, hope, and charity (Moro. 7) are crafted artistically and have great rhetorical effectiveness in conveying their religious purposes. The public oration of Samuel the Lamanite (Hel. 13-15) is a classic prophetic judgment speech. Taking rhetorical criticism as a guide, one can see how Benjamin's ritual address first aims to persuade the audience to reaffirm a present point of view and then turns to deliberative rhetoric--"which aims at effecting a decision about future action, often the very immediate future" (Kennedy, New Testament Interpretation Through Rhetorical Criticism [1984], p. 36). King Benjamin's speech is also chiastic as a whole and in several of its parts (Welch, pp. 202-205). LETTERS. The eight epistles in the Book of Mormon are conversational in tone, revealing the diverse personalities of their writers. These letters are from Captain Moroni1 (Alma 54:5-14; 60:1-36), Ammoron (Alma 54:16-24), Helaman1 (Alma 56:2-58:41), Pahoran (Alma 61:2-21), Giddianhi (3 Ne. 3:2-10), and Mormon (Moro. 8:2-30; 9:1-26). ALLEGORY, METAPHOR, IMAGERY, AND TYPOLOGY. These forms are also prevalent in the Book of Mormon. Zenos's allegory of the olive tree (Jacob 5) vividly incorporates dozens of horticultural details as it depicts the history of God's dealings with Israel. A striking simile curse, with Near Eastern parallels, appears in Abinadi's prophetic denunciation: The life of king Noah shall be "as a garment in a furnace of fire,...as a stalk, even as a dry stalk of the field, which is run over by the beasts and trodden under foot" (Mosiah 12:10-11). An effective extended metaphor is Alma's comparison of the word of God to a seed planted in one's heart and then growing into a fruitful Tree of Life (Alma 32:28-43). 
In developing this metaphor, Alma uses a striking example of synesthesia: As the word enlightens their minds, his listeners can know it is real--"Ye have tasted this light" (Alma 32:35). Iteration of archetypes such as tree, river, darkness, and fire graphically confirms Lehi's understanding "that there is an opposition in all things" (2 Ne. 2:11) and that opposition will be beneficial to the righteous. A figural interpretation of God-given words and God-directed persons or events is insisted on, although not always developed, in the Book of Mormon. "All things which have been given of God from the beginning of the world, unto man, are the typifying of [Christ]" (2 Ne. 11:4); all performances and ordinances of the Law of Moses "were types of things to come" (Mosiah 13:31); and the Liahona, or compass, was seen as a type: "For just as surely as this director did bring our fathers, by following its course, to the Promised Land, shall the words of Christ, if we follow their course, carry us beyond this vale of sorrow into a far better land of promise" (Alma 37:45). In its largest typological structure, the Book of Mormon fits well the seven phases of revelation posited by Northrop Frye: creation, revolution or exodus, law, wisdom, prophecy, gospel, and apocalypse (The Great Code: The Bible and Literature [New York, 1982]). WISDOM LITERATURE. Transmitted sayings of the wise are scattered throughout the Book of Mormon, especially in counsel given by fathers to their sons. Alma counsels, "O remember, my son, and learn wisdom in thy youth; yea, learn in thy youth to keep the commandments of God" (Alma 37:35; see also 38:9-15). Benjamin says, "I tell you these things that ye may learn wisdom; that ye may learn that when ye are in the service of your fellow beings ye are only in the service of your God" (Mosiah 2:17). 
A memorable aphorism is given by Lehi: "Adam fell that men might be; and men are, that they might have joy" (2 Ne. 2:25). Pithy sayings such as "fools mock, but they shall mourn" (Ether 12:26) and "wickedness never was happiness" (Alma 41:10) are often repeated by Latter-day Saints. APOCALYPTIC LITERATURE. The vision in 1 Nephi 11-15 (sixth century B.C.) is comparable in form with early apocalyptic literature. It contains a vision, is delivered in dialogue form, has an otherworldly mediator or escort, includes a commandment to write, treats the disposition of the recipient, prophesies persecution, foretells cosmic transformations, and has an otherworldly place as its spatial axis. Later Jewish developments of complex angelology, mystic numerology, and symbolism are absent. STYLE AND TONE. Book of Mormon writers show an intense concern for style and tone. Alma desires to be able to "speak with the trump of God, with a voice to shake the earth," yet realizes that "I am a man, and do sin in my wish; for I ought to be content with the things which the Lord hath allotted unto me" (Alma 29:1-3). Moroni2 expresses a feeling of inadequacy in writing: "Lord, the Gentiles will mock at these things, because of our weakness in writing.... Thou hast also made our words powerful and great, even that we cannot write them; wherefore, when we write we behold our weakness, and stumble because of the placing of our words" (Ether 12:23-25; cf. 2 Ne. 33:1). Moroni's written words, however, are not weak. In cadences of ascending strength he boldly declares: O ye pollutions, ye hypocrites, ye teachers, who sell yourselves for that which will canker, why have ye polluted the holy church of God? Why are ye ashamed to take upon you the name of Christ?...Who will despise the children of Christ? Behold, all ye who are despisers of the works of the Lord, for ye shall wonder and perish [Morm. 8:38, 9:26]. 
The styles employed by the different writers in the Book of Mormon vary from the unadorned to the sublime. The tones range from Moroni's strident condemnations to Jesus' humblest pleading: "Behold, mine arm of mercy is extended towards you, and whosoever will come, him will I receive" (3 Ne. 9:14). A model for communication is Jesus, who, Moroni reports, "told me in plain humility, even as a man telleth another in mine own language, concerning these things; and only a few have I written, because of my weakness in writing" (Ether 12:39-40). Two concepts in this report are repeated throughout the Book of Mormon--plain speech and inability to write about some things. "I have spoken plainly unto you," Nephi says, "that ye cannot misunderstand" (2 Ne. 25:28). "My soul delighteth in plainness," he continues, "for after this manner doth the Lord God work among the children of men" (2 Ne. 31:3). Yet Nephi also delights in the words of Isaiah, which "are not plain unto you" although "they are plain unto all those that are filled with the spirit of prophecy" (2 Ne. 25:4). Containing both plain and veiled language, the Book of Mormon is a spiritually and literarily powerful book that is direct yet complex, simple yet profound. (See [1]Basic Beliefs home page; [2]Scriptural Writings home page; [3]The Book of Mormon home page) ______________________________________________________________ Bibliography England, Eugene. "A Second Witness for the Logos: The Book of Mormon and Contemporary Literary Criticism." In By Study and Also by Faith, 2 vols., ed. J. Lundquist and S. Ricks, Vol. 2, pp. 91-125. Salt Lake City, 1990. Jorgensen, Bruce W.; Richard Dilworth Rust; and George S. Tate. Essays on typology in Literature of Belief, ed. Neal E. Lambert. Provo, Utah, 1981. Nichols, Robert E., Jr. "Beowulf and Nephi: A Literary View of the Book of Mormon." Dialogue 4 (Autumn 1969):40-47. Parry, Donald W. "Hebrew Literary Patterns in the Book of Mormon." Ensign 19 (Oct. 1989):58-61. 
Rust, Richard Dilworth. "Book of Mormon Poetry." New Era (Mar. 1983):46-50 Welch, John W. "Chiasmus in the Book of Mormon." In Chiasmus in Antiquity, ed. J. Welch, pp. 198-210. Hildesheim, 1981. Encyclopedia of Mormonism, Vol. 1, Book of Mormon Literature References 1. http://www.lightplanet.com/mormons/basic/index.htm 2. http://www.lightplanet.com/mormons/basic/doctrines/scripture/index.htm 3. http://www.lightplanet.com/mormons/basic/bom/index.htm From checker at panix.com Mon Jan 2 20:55:43 2006 From: checker at panix.com (Premise Checker) Date: Mon, 2 Jan 2006 15:55:43 -0500 (EST) Subject: [Paleopsych] New Oxford Review: Bio-Luddites & the Secularist Rapture Message-ID: Bio-Luddites & the Secularist Rapture http://www.newoxfordreview.org/note.jsp?did=1103-notes-rapture New Oxford Review November 2003 Coming soon to a theater near you: cyborgs. Not on the screen, but sitting next to you in the audience. This is "the coming reality" in our technological world, or so say a group calling themselves "transhumanists." According to transhumanists, man has, since time immemorial, depended on technology for his survival in this hostile world: From the first primitive tool to walking sticks to eyeglasses to emergency alert bracelets to artificial intelligence -- man's dependence on machines increases with each new development. Soon one may be indistinguishable from the other. No surprise, say the transhumanists, because we have long been on our way to becoming cyborgs. Some would contend that we are already cyborgs. That cyborg at the movie theater? That cyborg could be you. A perusal of recent news clippings could easily lead one to believe that the prototypical elements of the transhumanists' "coming reality" are more science nonfiction than science fiction. 
To wit: A robot governed by neurons from a rat's brain (a "hybrot" -- a machine with living cells) is now reportedly drawing pictures; a lab monkey, via a chip implanted in its brain, is now able to move a cursor on a computer screen by thought alone; a rat was made to climb over fences and up trees, and walk through pipes and across rubble by signals sent from a remote computer to a chip implanted in its brain. Even more to the point, a British cybernetics professor became the first human to have a chip implanted into his central nervous system. This chip records and transmits his sensations (such as movement and pleasure) to a remote computer, which later plays back those sensations, causing the professor to experience them again. Since then about 20 people across the U.S. have been "chipped" by Applied Digital Solutions' VeriChip Corporation, which for $200 up front and $10 a month will chip and track anyone from its traveling ChipMobile. Giddy from the possibilities stories like these present, the World Transhumanist Association (WTA) held a conference at Yale University this past June, as reported in The Village Voice (Jul. 30-Aug. 5), "to lay the groundwork for a society that would admit as citizens and companions intelligent robots, cyborgs made from a free mixing of human and machine parts, and fully organic, genetically engineered people who aren't necessarily human at all." The first order of business is to expand the definition of what we now call human rights to include "post-humans" -- robots, hybrots, cyborgs, and other such "people" who may, or may not, be human. According to Natasha Vita-More, founder of the transhumanist movement, we must begin the process of redefining today. Why? "To relinquish the rights of a future being merely because he, she, or it has a higher percentage of machine parts than biological cell structure would be racist toward all humans who have prosthetic parts." Racist? Really? 
We weren't aware that amputees constitute a "race" of humans. The conference's opening debate was titled "Should Humans Welcome or Resist Becoming Posthuman?" with the overwhelming sentiment favoring the former. Echoing the majority opinion, Kevin Fitzgerald, a Jesuit priest and "bioethicist" at Georgetown Medical Center, is quoted in The Voice: "To err on the side of inclusion is the loving thing to do." Oh, right. We certainly must be inclusive. And of course we must be loving. We must love our robots. Still, Fr. Fitzgerald may be onto something here. The Jesuits have experienced a steep decline in ordinations -- might a fleet of robo-priests be the answer to the Jesuit priest shortage? At least then we could be assured that the rubrics of the Mass would be adhered to, albeit in a mechanical, robotic fashion. Domo arigato, Fr. Roboto. The Voice reports that transhumanists "look for inspiration to civil rights battles, most recently to the transgender and gay push for self-determination." (The WTA has even modified a popular homosexualist slogan, decrying "technophobia.") James Hughes, Secretary of the WTA, says this: "The whole thrust of the liberal democratic movement of the last 400 years has been to allow people to use reason and science to control their own lives, free from the authority of church and state." Dr. J, as Hughes is affectionately known, has expanded on this theme in a series of columns on the Better Humans website. He applauds the "enormous progress" we have made in "overcoming" the "barriers to active, guilt-free sexuality," and in "transcending...biological gender." Despite the transhumanists' gushing over the "gay" and transsexual movements, the admiration is apparently not mutual. Homosexuals and transsexuals, reasons The Voice, "might not particularly like being associated with imagined cyborgs and human-animal hybrids." 
Still, the transhumanists are preparing for what they see as an impending battle against those who would resist the proliferation of the technology that is supposed to lead to the inevitable intermingling of man and machine. Hughes, quoted in The Voice, throws down the gauntlet: "If...the technology of human advancement is forbidden by bio-Luddites...that becomes a fundamental civil rights struggle." But not one of those nonviolent civil rights struggles. No, "there might come a time," predicts Hughes, "for the legitimate use of violence in self-defense," for "liberation acts" to unyoke "fully realized forms of artificial intelligence" from possible enslavement by humans. Suddenly the phrase "technological revolution" takes on an ominous tone. One pictures hordes of Arnold Schwarzenegger clones plodding about, shooting things, blowing up buildings, and setting entire cities aflame. Transhumanism's "coming reality" may be closer to this scenario than we might like to think. Transhumanists would like nothing more than to transform themselves into a technologically enhanced race of Uebermenschen. Their "vision" isn't limited to protecting amputees. According to The Voice, a good many transhumanists are "feverishly anticipating" an event they call "the Singularity" -- the moment when "technologies meld and an exponentially advancing intelligence is unleashed." This limitless technology is messianic in nature: Transhumanists "aspire to immortality and omniscience through uploading human consciousness into ever evolving machines." There is even a "Singularity Institute" for the furtherance of their agenda. The Institute's website heralds Singularity as the moment when man will be "capable of breaking the upper limit on intelligence that has held since the rise of humanity." 
The Singularity is akin to a transhumanist version of the Rapture, an endtime event invented by Protestant millennialists who believe that Jesus will at any moment whisk His true believers away before the onset of the 1,000-year reign of the Antichrist. Only for transhumanists, man's Ascension (through a deified technology) to the throne of omniscience begins right here, right now. And we sorry bio-Luddites who aren't plugged in, online, and geeked out will be "left behind" to suffer the Tribulation. From checker at panix.com Mon Jan 2 20:55:55 2006 From: checker at panix.com (Premise Checker) Date: Mon, 2 Jan 2006 15:55:55 -0500 (EST) Subject: [Paleopsych] Independent: 'Chronic happiness' the key to success Message-ID: http://news.independent.co.uk/world/science_technology/article333972.ece 19 December 2005 10:27 By Lyndsay Moss Published: 19 December 2005 The key to success may be "chronic happiness" rather than simply hard work and the right contacts, psychologists have found. Many assume a successful career and personal life leads to happiness. But psychologists in the US say happiness can bring success. Researchers from the universities of California, Missouri and Illinois examined connections between desirable characteristics, life success and well-being in more than 275,000 people. They found that happy individuals were predisposed to seek out new goals in life, leading to success, which also reinforced their already positive emotions. The psychologists addressed questions such as whether happy people were more successful than unhappy people, and whether happiness came before or after a perceived success. Writing in Psychological Bulletin, published by the American Psychological Association, they concluded that "chronically happy people" were generally more successful in many areas of life than less happy people. 
From checker at panix.com Mon Jan 2 20:56:10 2006 From: checker at panix.com (Premise Checker) Date: Mon, 2 Jan 2006 15:56:10 -0500 (EST) Subject: [Paleopsych] Steve Sailer: Boys Will Be Boys Message-ID: Steve Sailer: Boys Will Be Boys http://www.claremont.org/writings/crb/fall2005/sailer.html. A review of Why Gender Matters: What Parents and Teachers Need to Know about the Emerging Science of Sex Differences, by Leonard Sax By Steve Sailer Posted November 30, 2005 Until last winter, I had assumed that fundamentalist feminism had peaked in the early 1990s with the Anita Hill brouhaha, and that Bill Clinton's political survival in 1998, which hinged on his near-unanimous support from hypocritical feminists, ended the era in which anyone took feminism seriously. The Larry Summers fiasco, however, showed that while feminism may have entered its Brezhnev Era intellectually, it still commands the institutional equivalent of Brezhnev's thousands of tanks and nuclear missiles.
After just a few days, Harvard President Lawrence Summers caved in to critics of his off-hand comment that nature, not invidious discriminations alone, might be to blame for the lower percentage of women who study math and science. In short order, he propitiated the feminists by promising, in effect, to spend $50 million taking teaching and research opportunities at Harvard away from male jobseekers and giving them to less talented women. Perhaps in a saner society, then, we would have less need for Leonard Sax's engaging combination of popular science exposition and advice guidebook, Why Gender Matters: What Parents and Teachers Need to Know about the Emerging Science of Sex Differences. But parents as well as professors could benefit from it now. Sax speaks of "gender" when he means "sex"--male or female. I fear, though, that this usage battle is lost because the English language really does need two different words to distinguish between the fact, and the act, of sex. Supreme Court Justice Ruth Bader Ginsburg claims her secretary Millicent invented the use of "gender" to mean "sex" in the early 1970s while typing the crusading feminist's briefs against sex discrimination. Millicent pointed out to her boss that judges, like all men, have dirty minds when it comes to the word "sex," so she should use the boring term "gender" to keep those animals thinking only about the law. Unfortunately, "gender" now comes with a vast superstructure of 99% fact-free feminist theorizing about how sex differences are all just socially constructed. According to this orthodoxy, it's insensitive to doubt a burly transvestite truck driver demanding a government-subsidized sex change when he says he feels like a little girl inside. Yet it's also insensitive to assume that the average little girl feels like a little girl inside. Fortunately, Sax, a family physician and child psychologist, subscribes to none of the usual cant. 
Indeed, I thought I was a connoisseur of sex differences until I read Why Gender Matters, where I learned in the first chapter, for instance, that girls on average hear better than boys, especially higher-pitched sounds, such as the typical schoolteacher's voice, which is one little-known reason girls on average pay more attention in class. Males and females also tend to have different kinds of eyeballs, with boys better at tracking movement and girls better at distinguishing subtle shades of colors. Presumably, these separate skills evolved when men were hunters trying to spear fleeing game and women were gatherers searching out the ripest fruit. So, today, boys want to catch fly balls and girls want to discuss whether to buy the azure or periwinkle skirt. Cognitive differences are profound and pervasive. Don't force boys to explain their feelings in great detail, Sax advises. Their brains aren't wired to make that as enjoyable a pastime as it is for girls. * * * As founder of the National Association for Single-Sex Public Education, Sax's favorite and perhaps most valuable theory is that co-educational schooling is frequently a mistake. He makes a strong case, especially concerning the years immediately following puberty. He cites the experience of two psychologists studying self-esteem in girls. They went to Belfast, where children can be assigned fairly randomly to coed or single-sex schools: They found that at coed schools, you don't need to ask a dozen questions to predict the girl's self-esteem. You have to ask only one question: "Do you think you're pretty?" Similarly, the Coleman Report found, four decades ago, that boys put more emphasis on sports and social success in coed schools, and less on intellectual development. Sax argues: Here's the paradox: coed schools tend to reinforce gender stereotypes. There is now very strong evidence that girls are more likely to take courses such as computer science and physics in girls-only schools.
Boys in single-sex schools are more than twice as likely to study art, music, foreign languages, and literature as boys of equal ability attending comparable coed schools. Noting that the Department of Education projects that by 2011 there will be 140 women college graduates for every 100 men, he asks, "I'm all in favor of women's colleges, but why are nominally coed schools looking more and more like all-women's colleges?" So far, the decline of male academic achievement in the U.S. is mostly among blacks and Hispanics, but the catastrophic downturn into "laddism" of young white males in England in recent years, and their consequent decline in test scores, shows that no race is permanently immune to the prejudice that school is for girls. Of course, American schools have long been taught largely by women, and boys and schoolmarms have not always seen eye-to-eye. But the rise of feminism has encouraged female teachers to view their male students as overprivileged potential oppressors. Further, feminism justifies teachers' self-absorption with female feelings. Thus, a remarkable fraction of the novels my older son has been assigned to read in high school are about girls getting raped. I hope it hasn't permanently soured him on fiction. We've now achieved the worst of both worlds: the educational authorities are committed to anti-male social constructionist ideology, but the pop culture market delivers the crudest, most sexualized imagery. The irony is that when the adult world imposes gender egalitarianism on young people in the name of progressive ideologies, it just makes the young people even more cognizant of their primordial differences. * * * Sax's book often resembles a nonfiction version of Tom Wolfe's impressive novel I Am Charlotte Simmons. What's most striking about Wolfe's merely semi-satirical portrait of Duke University is how, after 35 years of institutionalized feminism, student sexuality hasn't evolved into an egalitarian utopia.
Instead, it has regressed to something that a caveman would understand--a sexual marketplace where muscles are the measure of the man. Not all of Sax's arguments are so dependable. For instance, he is far more confident that homosexuality is substantially genetic in origin than is the leading researcher he cites in support of his assertion, J. Michael Bailey of Northwestern University. Bailey has publicly noted how challenging he has found it to assemble a reliably representative sample of identical and fraternal twins for his homosexuality studies. Further, Bailey is troubled by the fundamental objection that natural selection would, presumably, cause genes for homosexuality to die out. Sax, though, races past these prudent concerns. Still, this is a better than average advice book for mothers and fathers. Most parenting books are unrealistic because they overemphasize how much parents can mold their children's personalities. Raising a second child, with his normally quite different personality, typically undermines parents' belief in their omnipotence, but most child-rearing books hush this up because their market is gullible first-timers. Fortunately, by emphasizing how much you need to fine-tune your treatment to fit your child's sex, Why Gender Matters injects some needed realism into the genre. But Sax's bulletproof confidence in his own advice gives me pause. Sixteen years of fatherhood have left me less confident that I know what I'm doing than when I started, but he doesn't suffer from any such self-skepticism. Steve Sailer is the film critic for The American Conservative and a columnist for VDARE.com. 
From checker at panix.com Mon Jan 2 20:57:45 2006 From: checker at panix.com (Premise Checker) Date: Mon, 2 Jan 2006 15:57:45 -0500 (EST) Subject: [Paleopsych] World Science: Bees can recognize human faces, study finds Message-ID: Bees can recognize human faces, study finds http://www.world-science.net/exclusives/051209_beesfrm.htm Honeybees may look pretty much all alike to us. But it seems we may not look all alike to them. A study has found that they can learn to recognize human faces in photos, and remember them for at least two days. The findings toss new uncertainty into a long-studied question that some scientists considered largely settled, the researchers say: how humans themselves recognize faces. The results also may help lead to better face-recognition software, developed through study of the insect brain, the scientists added. Many researchers traditionally believed facial recognition required a large brain, and possibly a specialized area of that organ dedicated to processing face information. The bee finding casts doubt on that, said Adrian G. Dyer, the lead researcher in the study. He recalls that when he made the discovery, it startled him so much that he called out to a colleague, telling her to come quickly because "no one's going to believe it -- and bring a camera!" Dyer said that to his knowledge, the finding is the first time an invertebrate has shown the ability to recognize faces of other species. But not all bees were up to the task: some flunked it, he said, although this seemed due more to a failure to grasp how the experiment worked than to poor facial recognition specifically. In any case, some humans also can't recognize faces, Dyer noted; the condition is called prosopagnosia. In the bee study, reported in the Dec. 15 issue of the Journal of Experimental Biology, Dyer and two colleagues presented honeybees with photos of human faces taken from a standard human psychology test.
The photos had similar lighting, background colors and sizes and included only the face and neck to avoid having the insects make judgments based on the clothing. In some cases, the people in the pictures themselves looked similar. The researchers, with Johannes Gutenberg University in Mainz, Germany, tried to train the bees to realize that a photo of one man had a drop of a sugary liquid next to it. Different photos came with a drop of bitter liquid instead. A few bees apparently failed to realize that they should pay attention to the photos at all. But five bees learned to fly toward the photo horizontally in such a way that they could get a good look at it, Dyer reported. In fact, these bees tended to hover a few centimeters in front of the image for a while before deciding where to land. The bees learned to distinguish the correct face from the wrong one with better than 80 percent accuracy, even when the faces were similar, and regardless of where the photos were placed, the researchers found. Also, just like humans, the bees performed worse when the faces were flipped upside-down. "This is evidence that face recognition requires neither a specialised neuronal [brain] circuitry nor a fundamentally advanced nervous system," the researchers wrote, noting that the test they used was one for which even humans have some difficulty. Moreover, "Two bees tested two days after the initial training retained the information in long-term memory," they wrote. One scored about 94 percent on the first day and 79 percent two days later; the second bee's score dropped from about 87 to 76 percent during the same time frame. The researchers also checked whether bees performed better for faces that humans judged as being more different. This seemed to be the case, they found, but the result didn't reach statistical significance. The bees probably don't understand what a human face is, Dyer said in an email.
"To the bees the faces were spatial patterns (or strange looking flowers)," he added. Bees are famous for their pattern-recognition abilities, which scientists believe evolved in order to discriminate among flowers. As social insects, they can also tell apart their hivemates. But the new study shows that they can recognize human faces better than some humans can -- with one ten-thousandth of the brain cells. This raises the question of how bees recognize faces, and whether they do it differently from the way we do it, Dyer and colleagues wrote. Studies suggest small children recognize faces by picking out specific features that are easy to recognize, whereas adults see the interrelationships among facial features. Bees seem to show aspects of both strategies depending on the study, the researchers added. The findings cast doubt on the belief among some researchers that the human brain has a specialized area for face recognition, Dyer and colleagues said. Neuroscientists point to an area called the fusiform gyrus, which tends to show increased activity during face-viewing, as serving this purpose. But the bee finding suggests "the human brain may not need to have a visual area specific for the recognition of faces," Dyer and colleagues wrote. That may be helpful to researchers who develop face-recognition technologies to be used for security at airports and other locations, Dyer noted. The United States is investing heavily in such systems, but they still make many mistakes. Already, the way that bees navigate is being used to design "autonomous aircraft that can fly in remote areas without the need for radio contact or satellite navigation," Dyer wrote in the email. "We show that the miniature brain can definitely recognize faces, and if in the future we can work out the mechanisms by which this is achieved," this might suggest ideas for improved face recognition technologies.
Dyer said that if bees can learn to recognize humans in photos, then they reasonably might also be able to recognize real-life faces. On the other hand, he remarked, this probably isn't the explanation for an adage popular in some parts of the world -- that you shouldn't kill a bee because its nestmates will remember and come after you. Francis Ratnieks of Sheffield University in Sheffield, U.K., says that apparent bee revenge attacks of this sort actually occur because a torn-off stinger releases chemicals that signal alarm to nearby hivemates. Says Dyer, "bees don't normally go around looking at faces." From shovland at mindspring.com Sun Jan 1 17:23:18 2006 From: shovland at mindspring.com (Steve Hovland) Date: Sun, 1 Jan 2006 09:23:18 -0800 Subject: [Paleopsych] Medicare for All Message-ID: The health care industry is acting in a way that is detrimental to society as a whole. They are taking more and more money from us while providing less care for the money. Most countries in Europe provide health care for all of their citizens for half the money we spend in the US, which leaves 45 million uninsured. Harry Truman first proposed a national health system for the US in 1950. It's time to make the change. From checker at panix.com Tue Jan 3 01:37:32 2006 From: checker at panix.com (Premise Checker) Date: Mon, 2 Jan 2006 20:37:32 -0500 (EST) Subject: [Paleopsych] Roger D. Congleton: The political economy of Gordon Tullock Message-ID: Roger D. Congleton: The political economy of Gordon Tullock* Public Choice 121:213-238, 2004 [This is a superb appreciation of one of the Founding Fathers of Public Choice theory, and it is no mean introduction to the field, since Gordon's interests were so broad. I loved this in particular: "It bears noting that Tullock invented or at least helped to invent the rent-seeking model of conflict (1967/1974).
A social scientist who was more interested in maximizing fame than in understanding the world would never have raised a question that reduces the importance of one of his own major contributions, even were such doubts to arise. Fame and fortune tend to go to those whose ideas are "bigger" than initially thought, not "smaller." However, a proper scientist is a truth seeker (The Organization of Inquiry, 1966: 49), and Tullock is in this sense, if not in the conventional sense, a very proper social scientist." [Gordon, I think, loved, more than anything else, to make unsettling if not outrageous assertions in conversation. Back in graduate school, when I first met him, I was arguing foreign policy with him. He was dubious and asked me a question. I changed my view, and he asked me another question. This went on until he told me, "You have now gone full circle." [Of all the people I have ever met, only Steve Sniegoski comes close in challenging my opinions. Both of them have first opinions of their own, which in the case of Gordon, as the article shows, are not always so apparent. I'm even further removed myself, since it is leaving no Premise Unchecked (that is, the Premise of my conversant) that is my forte, not persuasion itself. [In my opinion, Gordon did not share the Nobel Prize with Jim Buchanan because he tweaked the nose of the Swedes, who award the prize, too many times over their welfare state by demanding to know what they thought a "just" distribution of income looks like, as their underlying mood is not to achieve some fixed distribution but to have ever more re-distribution. [The world badly needs far more challengers like Gordon.] [I call myself one of the Founding Sons of Public Choice theory, having studied under Gordon and Jim Buchanan in the early years at U.Va. I'm sending this also to a number of U.Va. people, so they can see what came out of it. [Sorry about the ligatures, like fi, showing up as periods.
Adobe's PDF to TXT converter (starting with version 7) changes all these ligatures to a period, so I can't do a global search and replace. I do convert (at least I try to get them all) the various Microsoft smart characters to ASCII ", ', --, etc., as appropriate. Lynx, my text-only web browser does this now, except that in some cases, nothing, not even a space, remains. But it should be obvious what's what. Tables generally do not convert. Sometimes I have to remove page headers, sometimes not. And, too often, spaces are omitted. It can be quite time consuming, so please forgive me if I didn't replace each ligature by hand. I sometimes also keep paragraphs together when interrupted with footnotes. I can generally send the PDFs to anyone who e-mails me asking for them.] *Center for Study of Public Choice, George Mason University, Fairfax, VA 22030, U.S.A.; e-mail: congleto at gmu.edu Accepted 25 August 2003 The perspective on Tullock's work presented here is based partly on his prolific writings and partly on numerous conversations with him over the course of several decades. He was kind enough to read through a previous draft, and the version presented here reflects his comments and suggestions. Comments and suggestions received at the 2002 meeting of the Public Choice Society, and from James Buchanan, Charles Rowley, Robert Tollison, and an anonymous referee were also very helpful. "Leaving aside the problem of the correctness of my answers, the fact remains that I have been unable to find any indications that scientists have asked the questions to which I address myself. The unwary might take this as proof that the problems are unimportant, but scientists, fully conscious of the importance of asking new questions, will not make this mistake." (Gordon Tullock, The Organization of Inquiry, 1966: 3.)
1. Introduction It is fair to say that few public choice scholars have contributed to so many areas of public choice research as frequently or with as much insight as Gordon Tullock. Professor Tullock's work considers not only political and contractual relationships within a well-established legal order, but also extraordinary political behavior within rent-seeking societies, within firms, at court, within communities at war, among those considering revolution, and among those emerging from or falling into anarchy. The result is an unusually complete political economy that includes theories of the origin of the state; theories of decision making within bureaucracy, dictatorship, democracy, and the courts; and within science itself. It is also fair to say that Professor Tullock uses relatively simple tools to analyze these far-reaching topics. Indeed, it is the use of relatively simple tools that makes the broad scope of his work possible. All the principal actors in Tullock's analysis maximize expected net benefits in circumstances where benefits, costs, and probabilities are assumed to be known by the relevant decision makers with some accuracy. This is the core hypothesis of the rational choice approach to social science, and it is the rationale for the title of Brady and Tollison's (1994) very interesting collection of Tullock papers. For the social scientist who uses the rational choice methodology, the research problem at hand is not to understand the complex chain of events that gave rise to unique personalities and historical moments, but rather to more fully appreciate the general features of the choice problems facing more or less similar actors at times when more or less routine decisions are made. By focusing on the general rather than the particular, a good deal of human behavior can be predicted within broad limits, without requiring intimate knowledge of the individuals or institutional settings of interest.
Such an approach is commonplace within economics, where it has been very successfully applied to understand general features of decisions made by firms and consumers, and is becoming more common within other social sciences where the rational choice methodology remains somewhat controversial. Tullock's work is largely written for economists and the subset of political scientists who routinely use rational-choice models, and his analysis naturally uses that mode of reasoning and argument. What distinguishes Tullock's work from that of most other social scientists who use the rational choice approach is that, in spite of his use of reductionist tools, Tullock's work tends to be anti-reductionist rather than reductionist in nature.1 A good deal of Tullock's work uses simple models to demonstrate that the world is more complex than may have previously been appreciated. It is partly the critical nature of his work that makes Tullock's world view difficult to summarize, as might also be said of much of Frank Knight's work. Tullock's more conventional work suggests that some arguments are more general than they appear and others less general than might be appreciated. To make these points, Tullock, like Knight, tends to focus sharply on neglected implications and discomforting facts. Unlike Knight, his arguments are usually very direct, and often simple appearing. Indeed, critics sometimes suggest that Tullock's direct and informal prose implies superficiality rather than a clear vision. However, a more sympathetic reading of Tullock's work as a whole discovers irreducible complexity, rather than simplicity. This complexity arises partly because his approach to political economy bears a closer relationship to work in law, history, or biology than it does to physics or astronomy, and much work within economics. Tullock was trained as a lawyer and reads widely in history.
Both lawyers and historians are inclined to regard every case as somewhat unique and every argument as somewhat flawed. Both these propensities are evident in his work. It is also true that Professor Tullock enjoys pointing "the way," and "the way" seems to be a bit different in every paper and book. His published work, especially his books, often leaps from one innovative idea to the next without providing readers with a clear sense of the general lay of the intellectual landscape. Although many of Tullock's pieces can be accurately summarized in a few sentences (as tends to be true of much that is written by academic scholars) the world revealed by Professor Tullock's work as a whole is not nearly so easily condensed. Complexity also arises because the aim of Tullock's work is often to stimulate new research on issues and evidence largely neglected by the scholarly literature, rather than to complete or finalize existing lines of research through careful integration and testing. To the extent that he succeeds with his enterprise - and he often has - his efforts to blaze new trails stimulate further exploration by other scholars. For example, his work with James Buchanan on constitutional design (1962) has generated a substantial field of rational choice-based research on the positive and normative properties of alternative constitutional designs. His path-breaking paper on rent seeking (1967) was so original that it passed largely unnoticed for a decade, although it and subsequent work have since become widely praised for opening important new areas of research. His work on dictatorship (1974, 1987), which was almost a forbidden topic at the time that he first began working on it, has helped to launch important new research on non-democratic governance (Olson, 1993; Wintrobe, 1994).
His editorial essays, "Efficient Rent-Seeking" and "Back to the Bog," have also encouraged a large body of new work on the equilibrium size of the rent-seeking industry and helped establish the new field of contest theory. The institutionally induced equilibrium literature pioneered by Weingast and Shepsle (1981) was developed partly in response to Tullock's "Why So Much Stability?" essay. His early work on vote trading (1959), the courts (1971, 1980), and bureaucracy (1965) also helped to establish new literatures. The breadth of Tullock's political economy and the simplicity of its component arguments also reflect his working style and interests. Professor Tullock is very quick, reads widely, and works rapidly. He dictates the majority of his papers. And although his papers are revised before being sent off, he lacks the patience to polish them to the high gloss evident in the work of most prominent scholars. In Tullock's mind, it is the originality of the ideas and analysis that determines the value of a particular piece of research, rather than the elegance of the prose or the mathematical models used to communicate its ideas. (To paraphrase McLuhan, "the message is the message," rather than the "medium.") The result is a very large body of very creative and stimulating work, but also a body of work that could benefit from just a bit more care at its various margins.2 If a major fault exists in that substantial body of research, it is that Tullock has not provided fellow travelers with a road map to his intellectual enterprise, as, for example, James Buchanan, Mancur Olson, and William Riker have. None of Tullock's hundreds of papers explains his overarching world view in detail, nor is there a single piece that attempts to integrate his many contributions into a coherent framework. The purpose of this essay is to provide such an intellectual road map.
It directs attention to the easily neglected general themes, conclusions, and connections among Professor Tullock's many contributions to public choice. The aim of the essay is thus, in a sense, "non-Tullockian" insofar as it attempts to explain Tullock's complex and multifaceted world view with a few fundamental principles, rather than to probe for weaknesses or suggest new problems or interpretations of existing work. The present road map is organized as follows. Section 2 focuses on the methodological foundations of Tullock's work, Section 3 surveys his broad research on political economy, and Section 4 summarizes the main argument and briefly discusses some of Tullock's major contributions. Numerous quotes from Tullock's work are included in endnotes.

2. Tullock's world view

A. Methodology: Positivism without statistics

"We must be skeptical about each theory, but this does not mean that we must be skeptical about the existence of truth. In fact our skepticism is an illustration of our belief in truth. We doubt that our present theories are in fact true, and look for other theories which approach that goal more closely. Only if one believes in an objective truth will experimental evidence contrary to the predictions 'disprove' the theory." (The Organization of Inquiry: 48)

Tullock's perspective on science and methodology, although implicit in much of his work, is most clearly developed in The Organization of Inquiry (1966). The Organization of Inquiry applies the tools of rational choice-based social science to science itself, in order to better understand how the scientific community operates and why scientific discourse has been an engine of progress for the past two centuries. Such questions cannot be addressed without characterizing the aims and methods of science and scientists, and, thus, Tullock could not analyze the organization of inquiry without revealing his own vision of science, scientific progress, and proper methodology.
The preface of The Organization of Inquiry acknowledges the influence of Karl Popper, Michael Polanyi, and Thomas Kuhn, and these influences are clearly evident in his work.3 Although Tullock's work is largely theoretical, he remains very interested in empirical evidence. A logical explanation that fails to explain key facts can be overturned by those facts even if the line of reasoning is completely self-consistent. That is to say, both the assumptions and predictions of a model should account for facts that are widely recognized by intelligent persons who read reputable newspapers and are familiar with world history. The world can "say" something about a theory, and a proper scientist should be prepared to hear what is said. He or she does this by remaining a bit skeptical about the merits of existing theories, no matter how well-stated or long-standing.4 His simultaneous skepticism and belief in the possibility of truth are clearly evident in his wide range of articles and comments critiquing the theories and mistaken conclusions of other social scientists. For example, in a series of essays on "the bog," Tullock asks and re-asks those working in the rent-seeking literature to explain why the rent-seeking industry is so small and, moreover, why the rate of return on rent seeking is evidently so much greater than the rate of return on other investments. It bears noting that Tullock invented, or at least helped to invent, the rent-seeking model of conflict (1967/1974). A social scientist who was more interested in maximizing fame than in understanding the world would never have raised a question that reduces the importance of one of his own major contributions, even were such doubts to arise. Fame and fortune tend to go to those whose ideas are "bigger" than initially thought, not "smaller."
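The "bog" puzzle can be made concrete with the standard symmetric rent-seeking contest from the "Efficient Rent-Seeking" literature mentioned above. The following sketch is illustrative only (the functional form is the canonical contest model, not a calculation from this essay, and the numbers are hypothetical): with many contestants the model predicts that outlays should dissipate most of the contested rent, which is precisely what observed rent-seeking expenditures seem too small to match.

```python
# Symmetric Tullock contest: n players spend x_i chasing a rent V, and
# player i wins with probability x_i**r / sum(x_j**r). The well-known
# symmetric Nash equilibrium effort is x* = r*(n-1)*V / n**2, so total
# outlays are r*(n-1)/n * V -- near-full dissipation when n is large
# and r = 1. All numbers below are purely illustrative.

def equilibrium_effort(n, V, r=1.0):
    """Symmetric Nash equilibrium effort in a Tullock contest."""
    return r * (n - 1) * V / n ** 2

def expected_payoff(x_i, rivals, V, r=1.0):
    """Expected payoff from spending x_i against a list of rival efforts."""
    total = x_i ** r + sum(x ** r for x in rivals)
    return V * (x_i ** r / total) - x_i

n, V = 10, 100.0
x_star = equilibrium_effort(n, V)        # each contestant spends 9.0
dissipation = n * x_star                 # 90.0: 90% of the rent burned up
rivals = [x_star] * (n - 1)

# Sanity check: x* is a best response -- small deviations pay less.
payoff_eq = expected_payoff(x_star, rivals, V)       # 1.0
payoff_lo = expected_payoff(x_star - 1, rivals, V)   # below 1.0
payoff_hi = expected_payoff(x_star + 1, rivals, V)   # below 1.0
```

The tension Tullock's bog essays press is visible in the numbers: the model says 90% of a contested rent should be consumed by the contest, yet measured rent-seeking industries appear tiny relative to the rents at stake.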
However, a proper scientist is a truth seeker (The Organization of Inquiry, 1966: 49), and Tullock is in this sense, if not in the conventional sense, a very proper social scientist. In contrast to most academic scholars, Tullock argues that the scientific enterprise is not elitist. Science is accessible to non-experts. The facts do not respect titles, pedigrees, or even a history of scientific achievement. He suggests that essentially any area of science can be understood by intelligent outsiders who take the time to investigate it. Thus, every theory is open to examination by newcomers with a fresh eye as well as by those with established reputations in particular fields of research.5 Together, Tullock's truth-oriented skepticism and non-elitism shed considerable light on the broad domain in which Professor Tullock has read and written. A strong sense that "the truth" can be known by anyone who invests time and attention induces Tullock to read more widely and more critically than those inclined to defer to well-credentialed "experts." His positivism induces him to focus on modern scientific theories and historical facts rather than philosophical controversies. Because his extensive reading covers areas that are unfamiliar to his less widely read or more philosophical colleagues, he is able to use a wide range of historical facts and scientific theories to criticize existing theories and also as a source of puzzles and dilemmas to be addressed in new research. Together with his non-elitist view of science, his broad interest in the world induces him to think and write without regard to the disciplinary boundaries that constrain the thoughts of his more convention-bound colleagues.

B. Social science: How narrow and how rational is human nature?

"Every man is an individual with his own private ends and ambitions.
He will only carry out assigned tasks if this proves the best way of attaining his own ends, and will make every effort to change the tasks so as to make them more in keeping with these objectives. A machine will carry out instructions given to it. A man is not so confined." (The Politics of Bureaucracy, 1966: 32)

Economists tend to view man as "a rational animal," by which various economists mean various things not uniformly agreed to, but nonetheless clearly distinct from the customary usage of the word "rational" by non-economists. For example, microeconomics texts normally introduce the notion of "rationality" at the same time that they discuss preference orderings. Rational decision makers have transitive preference orderings. Game theorists and macroeconomists who model individual decision making through time consider a decision maker to have "rational expectations." A rational decision maker anticipates the consequences of his or her actions, and does so in a manner free of systematic mistakes or bias. (In this amended concept of rationality, economists are returning to the use of the term "rational" in ordinary language.) The preference and informational meanings of the term "rational" are often commingled by modern economists, so that rational individuals become characterized as persons having consistent and durable preferences and unbiased expectations. This very demanding definition of rationality is occasionally found in Tullock's work.6 However, in most cases, Tullock is unwilling to adopt the full rationality hypothesis. He argues, for example, that information problems exist that lead to systematic errors, especially within politics (1967, Chs. 6-9). The existence of such information problems is grounded in his personal experience. If human beliefs were always unbiased, it would be impossible to find instances in which large groups of people, especially professionals, have systematically mistaken views about anything.
For those who have more than occasionally been persuaded by Professor Tullock to change their own views, or seen him launch a well-reasoned barrage on the views of thoughtful but confused colleagues, it sometimes appears that the only economist whose expectations are untainted by wishful thinking is Gordon Tullock himself.7 Tullock's value as a critic and curmudgeon is itself largely incompatible with the "rational expectations" usage of the term "rational." Yet it is partly because economists have failed to apply the rational choice paradigm broadly that Tullock has achieved some notoriety among economists by reminding the profession of the limits of other motivational theories; this is not because he believes that humans have one-dimensional objective functions.8 Tullock's view of man also incorporates a richer model of self-interest than is included in most economic models. Although man is self-interested, his interests are often complex and context dependent.9 Consequently, Tullock rarely uses the simplest characterization of homo economicus as a narrow self-interested "wealth maximizer." For example, Tullock allows the possibility that a person's self-interest may be partly dependent on the welfare of others. Modest altruism and envy are at least weakly supported by evolution and therefore are likely to be present in human behavior.10 The evidence, however, leads Tullock to conclude that such "broader" interests are less important than many believe.
In the end, it is narrow self-interest-based analyses that provide the surest model of human behavior and, therefore, the surest basis for institutional reform.11 If Buchanan's views may be said to be similar to those of James Madison, it might be said that Tullock's view of man parallels that of George Washington.12 Washington once said that to expect "ordinary people to be influenced by any other principle but those of interest is to look for what never did and I fear never will happen" (Johnson, 1997: 186), and also that "few men have virtue to withstand the highest bidder." The paradox in both cases is that neither man was himself entirely motivated by narrow self-interest.

C. Conflict and prosperity: On the cost and generality of rent seeking

"Conflict is to be expected in all situations in which transfers or redistribution occur, and in all situations in which problems of distribution arise. In general, it is rational for individuals to invest resources to either increase the transfers that they will receive or prevent redistributions away from them. Thus, any transactions involving distribution will lead to directly opposing resource investments and so to conflict by our definition." (The Social Dilemma, 1974: 6)

Take a rational individual and place him in a setting that includes other individuals in possession of scarce resources, and most economists will predict the emergence of trade. Economists are all familiar with the Edgeworth box, which provides a convincing illustration of mutual gains from exchange. Tullock would be inclined to predict conflict. Scarcity implies that individuals cannot achieve all of their objectives and that essentially all individuals would be better off with additional resources; however, it does not imply that voluntary exchange is the only method of accomplishing this.
Unfortunately, the economist's prediction that unrealized gains will be realized through voluntary exchange follows only in settings where changes in the distribution of resources can be accomplished only through voluntary means. In the absence of well-enforced rights, the strong may simply take the "initial endowments" of the weak.13 Few modern political economists would disagree with such claims about conflict in a setting of anarchy, once reminded of the importance of well-enforced property rights. However, Tullock also argues that wasteful conflict tends to emerge in settings where rights are initially well understood and enforced. For example, lawful means are routinely used to change existing property rights assignments and the extent to which they are enforced - within legislatures and court proceedings. In ordinary markets, there is conflict over the division of gains from trade and also in the efforts of firms to increase market share through advertising and product innovation. In settled polities, conflict is evident in the efforts of opposing special interest groups to persuade legislatures to enact particular rules and regulations, and in the efforts of opposing candidates to win elective office. In less lawful or settled settings, political and economic conflict may imply theft and fraud, or bombs exploding and battles fought. Tullock often reminds us that conflict is endemic to human existence. Conflict implies that resources are devoted to activities that reduce rather than increase the output of final goods and services. These "rent-seeking" losses cannot be entirely avoided, although the cost of conflict can be reduced by intelligent institutional design. For example, the cost of conflict is reduced by institutional arrangements that encourage the accumulation of productive capital rather than investments in redistribution.14 It bears noting that Tullock's conclusion regarding the feasibility of institutional solutions is empirical rather than analytical.
Modern game theory suggests that perfect institutions cannot be ruled out a priori - indeed, for essentially any well-defined game of conflict, it can be shown analytically that a suitable bond or punishment scheme can completely eliminate the losses from conflict. As far as Tullock knows, however, there are no real-world institutional arrangements that completely solve the problem of conflict. What changes with institutions is the magnitude and type of conflict that takes place. That is to say, conflict appears to be the normal state of human affairs, whether bound by institutions or not. Theoretical solutions evidently underrepresent the strategy sets available to persons in real historical settings.

3. Tullock's political economy

A. From the Hobbesian jungle to authoritarian government

"Let us make the simplest assumption of transition conditions from the jungle to one where there is an enforcement apparatus. Assume, then, a jungle in which there are some bands - like prides of lions - and that one of these bands succeeds in destroying or enslaving all of the others, and establishes firm control. This control would, firstly, lead to a considerable change in the income distribution in the jungle in that the members of the winning band would have much larger incomes and the losers would have lower incomes. It would be rational for the stronger members of the winning band to permit sizable improvements in the incomes of the weaker members at the expense of nonmembers of the band, simply in order to retain the support of these weak members. The cohesion of the new government would depend on suitable reward for all members." (Gordon Tullock, "The Edge of the Jungle," in Explorations in the Theory of Anarchy, 1972: 70)

Tullock argues that government itself often emerges from conflict. For example, Tullock suggests that autocracy is the most likely form of governance to emerge in real political settings.
In this, one might suppose that Tullock agrees with Hobbes rather than with Buchanan, but neither turns out to be the case. Tullock's theory of the origin of government is based on conquest and domination rather than social contract. The theoretical and empirical importance of authoritarian regimes has led Tullock to devote substantial time and energy to analyzing the properties of this very common political institution. His analysis of autocracy implies that the rule of particular dictators tends to be short-lived, although autocratic institutions themselves tend to be very durable. Autocratic regimes have an inherent "stability problem" analogous to that associated with coalition politics in democracies. Escape from anarchy does not imply the end of conflict, as indirectly suggested by Hobbes.15 This is not to say that every dictatorship is overthrown. Tullock discusses a variety of methods by which dictators can decrease the probability of coup d'état by in-house rivals, most of which, by increasing the costs of conspiracy, also reduce the probability of a coup attempt being organized. For example, laws against treason should be aggressively enforced, rewards for providing the ruler(s) with credible evidence of conspiracies should be high, commissions rather than individuals should be given responsibility for as much as possible, and potential rivals should be exiled in a manner that reduces opportunities for acquiring support among elites (Autocracy, 1987: Ch. 1 and The Social Dilemma, 1974: Ch. 7). Nonetheless, the large personal advantages that successful conspirators expect to realize make conspiracies difficult to eliminate completely; consequently, coups do occur on a fairly regular basis. The dictator's coalition problem implies that a particular autocrat's "term of office" is likely to be ended by an internal overthrow, or coup d'état (Autocracy, 1987: 9), and this is widely observed (Bienen and van de Walle, 1989).
However, the coalition problem does not apply to the institution of autocratic governance itself. Centralized political power will not be given up easily, because political elites often share an interest in retaining autocratic forms of governance, even when they disagree about who should rule. Moreover, a well-informed autocrat can more easily subvert a popular revolt than a coup d'état. The same methods used to discourage palace coups also discourage popular revolts. Tullock argues that popular uprisings are far more difficult to organize than are palace coups, because the public-good problems that must be overcome are much larger. The individual advantages of participating in a popular uprising are very small relative to those obtained by members of a palace coup, although the aggregate benefits may be much larger. Being larger enterprises, revolutionary movements are also much easier to discover (Autocracy, 1987: Ch. 3 and The Politics of Bureaucracy, 1966: 54). Together, these imply that autocratic governmental institutions are more easily protected than is the tenure of a particular dictator.16 Tullock's analysis implies that democracy is a very unlikely form of government, although not an impossible one. For example, Tullock notes that an internal overthrow engineered by elites may lead to democracy, as when an elected parliament or state assembly deposes a king or appointed governor, and it may well be the case that such transformations are broadly supported in the population as a whole (Autocracy, 1987: 53-68). The evidence supports Tullock's authoritarian prediction, insofar as autocracies have been far more common than democracies throughout recorded history.

B. Constitutional design

Given the historical rarity of democracy and Tullock's assessment of the likelihood of democratic reform, it is somewhat surprising that Professor Tullock has devoted so much of his intellectual life to understanding how modern democracy operates and how it can be improved.
The most likely explanation is that knowledge of one's local political circumstances tends to be valuable for scholars and non-scholars alike. Tullock, like most other public choice scholars, resides in a democratic polity. And this, in combination with the wider freedom available within democracies to engage in political research, has led him and most other public choice scholars to focus largely on the properties of democratic governance.17 When government policies are to be selected by a group, rather than imposed by a dictator, the first collective choice that must be made is the method of collective choice itself. How should such constitutional decisions be made? Buchanan and Tullock point out in the Calculus of Consent (1962) that the design and selection of collective decision rules is a complex problem, but one that is amenable to analysis using rational choice models.18 For example, Buchanan and Tullock note that a wide variety of voting rules can be employed by a group to make collective decisions and, moreover, that decision rules other than majority rule can be in the interest of essentially all citizens. The best decision rule depends on the problems being addressed collectively and also on the diversity of group interests. Buchanan and Tullock also point out that, even in cases where majority rule is explicitly used and median voter outcomes emerge in the relevant elections, other institutional arrangements, such as bicameralism or single-member districts, may imply that "majoritarian" legislative outcomes require substantially more or less than majority support from the electorate (Calculus of Consent: Chs. 15 and 16). In general, the menu of political constitutions includes a wide range of choices, and even majoritarian decisions are affected by the institutional setting in which voting takes place.
In subsequent work, Tullock argues that a far better method of choice, the "Demand Revealing Procedure" (Tideman and Tullock, 1976), would not rely on counting votes at all.19

C. Interest groups, vote trading, and coalition politics

On those occasions when collective decisions are made by majority rule, most economists assume that median voter interests tend to be advanced, partly because the median voter model is so tractable.20 However, as Tullock has long argued, most voting models assume that voters make independent decisions about how to cast their votes. Tullock (1959, 1970) points out that if vote trading (log rolling) is possible, mutual gains from trade can sometimes be realized by coordinating votes - mutual gains that would otherwise be infeasible. For example, suppose there are three equal-sized groups of voters who care intensely about three separate large-scale projects that can only be financed by the central government - for example, building a dam, dredging a river, or constructing a bridge. Tullock demonstrates that it may be Pareto efficient to undertake all three projects, but the concentration of benefits within minorities can cause ordinary majority rule to reject all three projects. Vote trading in such instances potentially allows some or all of the unrealized gains from government service to be realized.21 In such cases, rather than appealing to the median voter, Tullock notes that candidates may take positions that appeal to several distinct "special interest" minorities that together add up to a majority. Direct vote trades are most feasible in relatively small-number settings, as in legislatures, where continuous dealings allow informal exchanges of "favors" to be enforced. In large-scale elections, explicit vote trading is not likely to be a major factor influencing electoral outcomes, although what Tullock refers to as implicit log rolling may be.
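The three-project example above can be sketched numerically. The payoffs below are hypothetical (not Tullock's own numbers): each project is worth 10 to its sponsoring group and costs 9 in total, financed by a uniform tax of 3 on each of the three groups.

```python
# Hypothetical illustration of Tullock's log-rolling point: three equal
# groups, three projects. Each project benefits one group by 10 and
# costs 9, paid by a tax of 3 on every group.

GROUPS = ("A", "B", "C")
BENEFIT, TAX = 10, 3  # per-project benefit to its sponsor; tax per group

def payoff(group, passed):
    """Net payoff to a group given the set of projects that pass
    (each project is named after its sponsoring group)."""
    gain = BENEFIT if group in passed else 0
    return gain - TAX * len(passed)

# Voting on each project in isolation: a group votes yes only if the
# project makes it better off. Only the sponsor gains (10 - 3 = +7);
# the other two groups each lose 3, so every project fails 1-2.
solo_yes = {p: sum(1 for g in GROUPS
                   if payoff(g, {p}) > payoff(g, set()))
            for p in GROUPS}

# The log-rolled bundle: the groups trade support and all projects pass.
# Each group nets 10 - 3*3 = +1, so the package is Pareto-improving
# relative to the status quo -- exactly the gains that independent
# majority voting fails to realize.
bundle = {g: payoff(g, set(GROUPS)) for g in GROUPS}
```

Under these assumed numbers, each project attracts only its sponsor's vote in isolation, while the full bundle leaves every group better off, which is the mutual gain that vote trading captures.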
Figure 1 illustrates the case in which extremist groups A and B join forces to obtain policy X over the wishes of moderate voters who prefer policy B.

Figure 1. Implicit log rolling

Such implicit vote trading, unfortunately, tends to be associated with majoritarian decision cycles. That is to say, if implicit vote trading can make a difference, there tends not to be a median voter. For example, in Figure 1, note that pairwise votes among policies X, B, and Y would be as follows: X > B, but Y > X and B > Y.

D. Bureaucracy

Once legislative decisions are reached, they are normally implemented by large government organizations referred to as bureaucracies. In some cases, implementation is simply a matter of executing directives from elected representatives. Activity A is to be officially opposed or encouraged, and the bureaucracy implements the policy by imposing penalty P or subsidy S on persons engaging in activity A. In other cases, the bureaucracy has discretion to develop the policies themselves or the methods by which services will be produced, as when police and fire departments organize the production of crime- and fire-controlling services. In still others, the agency may be able to develop the law itself - as within regulatory agencies. In all such cases, it is clear that the final disposition of public policy depends in part on the incentives of the individuals who work in government agencies as well as those of elected representatives. In Tullock's view, the incentives within large public and private organizations are broadly similar, although they differ somewhat at the margin (Politics of Bureaucracy, 1966). Both public and private bureaucracies have their own internal incentive structures that encourage various kinds of productive and unproductive activities by the individuals who work within them. These incentives influence both the performance of individuals within organizations and the array of outputs produced by their organizations.
Tullock argues that the importance of a particular organization's internal incentives relative to the external incentives of labor markets is determined by the ability of individual bureaucrats to move between organizations. If every individual within a bureaucracy could costlessly change jobs, intra-organizational reward structures would be relatively unimportant for career advancement, and reputation in the wider community would largely determine salaries. Alternatively, when it is difficult for persons to move between organizations, the internal structure of rewards and punishments becomes an important determinant of individual salaries and perquisites and, therefore, of behavior (Politics of Bureaucracy, 1966: 10). In such cases, large organizations will have some monopsony power with respect to their employees, and internal incentives will largely determine employee performance on the job. Economics predicts that monopsony power will affect salaries and other economic aspects of job contracts. However, the intrafirm relationships of interest in Tullock's analysis are political rather than economic. The politicization of an organization's hierarchy creates a nonprice mechanism by which hierarchical organizations can solve their coordination and principal-agent problems. He argues that political aspects of relationships within large organizations can be readily observed and, to some extent, measured by "deference." The "deference" observed is predicted to vary with the extent of monopsony power that a given organization possesses.22 For example, insofar as mobility decreases with seniority, Tullock's analysis predicts that deference would increase as individuals approach the top of an organization's hierarchy. The specific behavior that successfully curries favor or signals loyalty clearly varies according to the "wishes" induced on a given agent's boss by the boss's boss, and so on.
In principle, both public and private organizations can be organized in an efficient manner, in the sense that organizational goals are advanced at least cost.23 However, incentives to assure efficiency within the public bureaucracy tend to be weaker than within large firms. Wage differentials tend to be larger at the top levels of private-sector organizations than in comparable public-sector organizations; consequently, Tullock predicts that more deference occurs in private than in comparable governmental organizations.24 Moreover, a public bureau's efficiency is generally more difficult to assess, and there is substantially less motivation for improving the performance of public bureaus than of comparable private bureaus within large firms.25 For these reasons, Tullock concludes that the public bureaucracy tends to be less efficient than comparable organizations in the private sector. What this means as a practical matter is that organizational interests, as understood by senior bureaucrats and the legislature, are advanced less in public bureaus than within comparable organizations in the private sector. Tullock's analysis implies that the efficiency of the public bureaucracy can be improved if incentives to monitor public-sector performance are increased, or if external competitive pressures on bureaus are intensified. For example, Tullock argues that federalism can address both problems by reducing the complexity (size and scope) of the government agencies to be monitored (as local agencies replace national agencies) and by increasing competition between public agencies - both directly, through the efforts of localities to attract new residents, and indirectly, through comparison of the outputs of neighboring bureaus - as with local school districts and highway service departments.

E. Enforcing the law: The courts, crime, and criminals

"My readers are no doubt convinced by now that this book is different from other books on legal procedure.
They may be convinced that it is superior, but, then again, they may not. I am proposing a radically different way of looking at procedural problems, and anyone making radical proposals must recognize the possibility that he could be wrong. But, although I concede the possibility that I could be wrong, I do not think that I am." (Trials on Trial, 1980: 233)

Of course, the executive bureaucracy is not the only governmental institution that affects legislative outcomes. Even within well-functioning democracies, many policy-relevant decisions are made by "independent" agencies. One crucial agency that is much neglected in the public choice literature is the courts. Economics implies that essentially all the incentive effects of public policy are generated by enforcement - that is to say, by the probabilities of punishment and the penalties associated with various kinds of private and public behavior.26 It is thus surprising that public choice scholars have invested so little effort in analyzing the law enforcement system. Efficient and equitable enforcement of the law cannot be taken for granted. Professor Tullock was a pioneer in the rational choice-based analysis of the legal system, his Logic of the Law (1971) being published a year before Posner's Economic Analysis of Law (1972). Tullock's research on the legal system reflects his interest in political economy. His work focuses largely on the problem of law enforcement, although the Logic of the Law also analyzes both civil and criminal law. On the former subject, largely neglected by Posner's treatise, Tullock reminds us that errors will always be made in the enforcement of law.27 Not all criminals are caught, not all who are caught are criminals, not all of the guilty parties caught are punished, and not all innocent parties are released.
Mistakes can be made at every stage of the judicial process.28 With such errors in mind, Tullock explores the accuracy of institutions that determine fault or guilt, and attempts to assess the overall performance of the existing U.S. system of justice relative to alternative procedures for identifying criminals and persons at fault.29 Tullock argues that the available evidence implies that the U.S. courts make errors (wrongly determine guilt or innocence) in between 10% and 50% of the cases that they decide (Trials on Trial, 1980: 33). Of course, a perfectly accurate justice system is impossible. The institutional or constitutional question is not whether mistakes are made, but whether too many (or too few) mistakes are being made. Improving the accuracy of court proceedings can reduce the social cost of illegal activities by better targeting sanctions at transgressors, which tends to reduce crime, and by encouraging greater efforts to settle out of court, which tends to reduce court costs (Trials on Trial, 1980: 73-74). Tullock argues that the system of justice presently used in the United States can be improved at relatively low cost. He argues, for example, that the continental judicial system widely employed in Europe produces more accurate verdicts at a lower cost (Trials on Trial, 1980: Ch. 6). In the continental system, panels of judges assess guilt or innocence and mete out penalties in trials that are organized directly by the judges rather than produced by conflict between legal teams for the votes of jury members. Accuracy could be further increased if the training of judges included a "good background in statistics, economics, ideas of administrative efficiency, etc." (Trials on Trial, 1980: 204)

4. Conclusion and overview: Political economy in the van

Tullock's work demonstrates that the rational choice paradigm sheds light on a wide variety of political choice settings, but the world revealed is fundamentally complex, varied, and irreducible.
Each political setting has its own unique constellation of incentives and constraints. Political decisions at the constitutional level include voting rules, legislative structure, the institutional structures of the bureaucracy, and the courts. The public policies adopted within a given constitutional setting must address issues of redistribution and revolution as well as ordinary externality and coordination problems. Decisions reached within all these settings can be understood as consequences of rational choice, but each choice setting differs from the others, and the differences have to be taken into account if human behavior and policy outcomes are to be understood. Individuals are rational and largely self-interested, but on many issues will be rationally ignorant and, consequently, make systematic mistakes. This is not to say that there is nothing that can be said in general. Both individual choices and political outcomes are the result of the same fundamental considerations: self-interest, scarcity, and conflict. And if the particulars always differ, and are more than occasionally breathtaking, the basic "lay of the Tullock landscape" is always vaguely familiar.30 What is universal in Tullock's political economy is human nature. Tullock believes that (fairly) narrow self-interest can account for a wide range of human behavior, once individual interests are identified for the institutional settings of interest. It is his characterization of human nature that provides Tullock's research in political economy with its unified and coherent core. What is unique about Tullock's approach to political economy is his willingness to identify costs and benefits in essentially all choice settings, including many where more orthodox economists and political scientists fear to tread.
Tullock's work suggests that a proper understanding of institutional settings allows relatively straightforward net-benefit maximizing models to account for a rich and complex range of policy outcomes. A good deal of human behavior, perhaps most, can be understood using the rational choice model of behavior, once the particular costs and benefits of actions for a given institutional setting are recognized.

A. Normative research

Although Tullock's work is motivated, in large part, by his efforts to make sense of a broad range of historic and contemporary puzzles that have come to his attention over the course of a lifetime of rapid and extensive reading, his research has never aimed exclusively at understanding the world. His books and many of his papers address normative as well as positive issues.31 His normative approach is utilitarian and comparative, and, for the most part, his normative conclusions follow closely from his positive analyses. If he can show that the average person is better off under institution X than under institution Y, he concludes that X is a better institution than Y. In such cases, X is approximately Pareto superior to Y. Thus, a society with a stable criminal and civil law is better off than one lacking them (Logic of the Law, 1971: Ch. 2). A society with a more accurate judiciary is better off than one with a less accurate judicial process (Trials on Trial, 1980: Ch. 6). A society with an efficient collective decision rule is better off than one that fails to minimize decision costs (Calculus of Consent, 1962: Ch. 6). A society that uses the demand-revealing process to make collective decisions would be better off than one relying on majority rule (Tideman and Tullock, 1976). A society that reduces rent-seeking losses is better off than one that fails to address this problem (Efficient Rent Seeking, 2000: Ch. 1).
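The "efficient collective decision rule" claim rests on the Calculus of Consent cost calculus: choose the fraction K/N of the group whose agreement is required so as to minimize the sum of expected external costs (falling in K) and decision-making costs (rising in K). A minimal sketch, with purely illustrative cost curves of my own choosing rather than anything from the book:

```python
# Minimal sketch of the Calculus of Consent cost calculus.
# The two cost curves below are illustrative assumptions only.

N = 100  # group size

def external_cost(k):
    # Cost of decisions imposed on you; falls as more members must agree.
    return 1000.0 * (1 - k / N) ** 2

def decision_cost(k):
    # Cost of reaching agreement; rises as more members must agree.
    return 1000.0 * (k / N) ** 3

def total_cost(k):
    return external_cost(k) + decision_cost(k)

# The constitutional chooser picks the K that minimizes total expected cost.
k_star = min(range(1, N + 1), key=total_cost)
print(f"optimal rule: {k_star}/{N} of the group must agree")
```

With these particular curves the optimum is a supermajority rather than 50% + 1, which illustrates Buchanan and Tullock's point that simple majority rule has no privileged status: the cost-minimizing K/N varies with the activity.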
Intelligent institutional design can improve the efficiency of the judicial system, reduce the losses from conflict, and produce better public policies, although it cannot eliminate all losses or mistakes. Although many normative arguments are found throughout Tullock's work, his analysis is never utopian. He never claims that institutional arrangement Y is the best possible arrangement, only that existing arrangements can be improved. Indeed, he argues that utopian approaches may impede useful reforms (Social Dilemma, 1974: 140).

B. Breadth of Tullock's research

Most economists study the behavior of rational self-interested individuals interacting within a stable pattern of laws and regulations governing ownership and exchange. Most political scientists study individual and group behavior within a stable pattern of constitutional laws and rules governing political procedures and constraints. The public choice literature as a whole analyzes how economic and political interests give rise to public policies. The public policies studied by public choice scholars include both the routine legislative outcomes of ordinary day-to-day politics and administrative decisions, and also changes in the fundamental laws that determine the procedures and constraints under which future political and economic decision making will be made. The political and economic processes studied by public choice scholars, thus, can be said to generate the "settings" and many of the "facts" studied by the more established fields of economics and political science. In this respect, public choice can be regarded as broader in scope than either of its parent disciplines, and, consequently, a scholar who contributes to all the research programs within public choice necessarily has a very broad program of research. Gordon Tullock is one of a handful of scholars who have contributed to all the various subfields in that area of research known as public choice.
Of course, the public choice research program includes many men and women of insight who have addressed deep and broad issues along the same intellectual frontiers. Professor Tullock's intellectual enterprise has long been shared by his colleagues at the Thomas Jefferson Center and the Center for Study of Public Choice - especially James Buchanan and Robert Tollison - and by many in the extensive intellectual network in which those centers participated. However, Tullock's work is nearly unique among the well-known pioneers of public choice for its originality, breadth, comparative approach, and historical foundations.

C. Tullock's intellectual impact

In constructing a "road map" for the intellectual landscape traversed by Professor Tullock's political economy, the focus of this paper has been the underlying themes in his work, and, in some cases, it has attempted to bridge gaps in his work that are essentially implied by the totality of his political economy research. Other gaps have been ignored, and some of his work outside public choice has been neglected. For example, his work on dictatorship does not examine why some autocrats have better track records than others. The relative performance of American and European judicial systems is developed without addressing the empirical questions of whether crime rates or lawsuits are systematically different as a consequence of different judicial procedures. Hints are provided in Autocracy and the Logic of the Law, but there is no systematic analysis. Moreover, some of his work has been neglected because it is not an essential part of his political economy research program. There is, for example, his work on biology and sociobiology, The Economics of Nonhuman Societies (1994), and his work on monetary economics (1954, 1979). The survey undertaken has not devoted significant space to assessing the quality and impact of Tullock's work.
That most readers of this piece are already familiar with many of his scholarly articles is itself evidence of this. A "tour guide" of Tullock's work would have tried to assess the magnitude of his major contributions with the benefit of hindsight or from the perspective of the times at which his ideas were developed. It is clear, for example, that The Calculus of Consent (1962), written with James Buchanan, was not only very original, but influential from the moment it was published. The Calculus has been cited in scientific articles well over a thousand times since its publication. Moreover, it continues to be highly regarded and continues to spur new research; the Calculus has already been cited more than 100 times since January 2000. Not all of Professor Tullock's contributions have been immediately recognized. Several of his ideas awaited reinvention by other scholars before coming to prominence. His original work on rent seeking (1967, 1974) was well regarded, but not widely appreciated until 10 or 20 years after its publication.32 The term "rent seeking" was actually coined by Anne Krueger in 1974. His contributions to principal-agent, efficiency wage, and organization theory worked out in The Politics of Bureaucracy (1966) have been largely neglected by the new literatures on those subjects. His work on the law, especially with respect to judicial proceedings, errors, and criminal sanctions, is noted, but not as widely as appears justified. His theory of autocrats as service-providing income maximizers was worked out in the first anarchy volume (1972) and further developed in The Social Dilemma (1974), but awaited rediscovery by Mancur Olson (1993) and Ronald Wintrobe (1990) nearly two decades later. The invention of what is now called a contest-success function in The Logic of the Law (1971), subsequently applied in his work on efficient rent seeking (1980), also seems underrecognized, although it is noted by Jack Hirshleifer (2001).
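The contest-success function referred to here takes the ratio form now standard in the rent-seeking literature: contestant i wins the prize with probability x_i^r / Σ_j x_j^r, where x_i is i's outlay and r governs how decisively effort tells. The sketch below is a textbook rendering under assumed symmetric, risk-neutral players, not code from Tullock:

```python
# Sketch of the Tullock contest-success function and the standard
# symmetric Nash equilibrium of the rent-seeking game (assumptions:
# n identical risk-neutral players, prize R, exponent r).

def win_probability(i, efforts, r=1.0):
    """Probability that player i wins, given everyone's outlays."""
    return efforts[i] ** r / sum(x ** r for x in efforts)

def symmetric_equilibrium_effort(n, R, r=1.0):
    """Closed-form symmetric equilibrium outlay: x* = r(n-1)R / n^2."""
    return r * (n - 1) * R / n ** 2

n, R = 2, 100.0
x_star = symmetric_equilibrium_effort(n, R)  # each player's outlay
dissipation = n * x_star                     # total resources burned
print(x_star, dissipation)
```

With two players and r = 1, each spends R/4, so half the rent is dissipated in aggregate; higher r or more players push total dissipation toward the full value of the prize.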
His work on the enterprise of science, The Organization of Inquiry (1966), is a gold mine awaiting rediscovery. Sometimes, Tullock blazes a trail that is too far ahead of the mainstream to be fully appreciated. And one can be too far in front of "the parade" to be readily associated with it. That Tullock's observations have contributed much to our understanding of the political landscape is, nonetheless, well recognized. His research continues to be among the most highly cited in the social sciences. His willingness to chart new ground and point out the "dead ends," "ruts," "potholes," and "slippery slopes" of other scholars - largely to our benefit, if often at his pleasure - continues to make his work provocative and entertaining. His books and papers address new issues and associated problems at the same time that general principles are being worked out. His long editorship of Public Choice helped to define and establish the field. The huge range of original explanations and conclusions that Tullock develops in his books and papers can easily lead a casual reader or listener to conclude that there is little systematic in his research, or perhaps in public choice generally. His brisk discussions of issues risk losing the reader in a forest of special cases and ingenious insights, rather than illuminating the main pathways followed. Clearly, a mere list of possible explanations is not social science. Social science does not simply provide an unconnected logic of specific instances of collective action, but attempts to determine what is general about the behavior that we observe. The present essay attempts to remedy this potential misapprehension by providing a more concise and integrated vision of the territory charted by Tullock's unusually extensive political economy than a casual reader may have obtained from a small sample of Professor Tullock's published work.
The aim of Tullock's social science is not just to explain the main details of social life, but to explain as much of it as can possibly be understood. His social science attempts to systematically explain and predict all of human behavior. His work demonstrates that self-interest, conflict, and institutions account for a good deal of human behavior in both ordinary and extraordinary political circumstances - and, in Tullock's view, far more than is generally acknowledged.

Notes

1. The work of many social scientists attempts to show that complex real world phenomena can be understood with a few fundamental principles that others have failed to recognize. This reductionist approach attempts to demonstrate that the world is essentially simpler than it appears. The reductionist research agenda is clearly of great esthetic interest for academics who appreciate the intellectual craftsmanship required to devise lean, penetrating, encompassing theories. It is also an important practical enterprise insofar as reductionist theories allow knowledge accumulated over many lifetimes to be passed on from one generation to the next with relatively modest investments of time and effort by teachers and students.

2. As many who have argued with Professor Tullock over the years will attest, the rough edges of his work somehow make his analyses all the more interesting. His provocative theoretical and historical assertions challenge his interlocutors to think more carefully about issues that they would not have imagined and/or mistakenly taken for granted. The fact that Tullock is occasionally incorrect somehow helps stimulate his fans and foes to greater effort.

3. "A scientific theory consists of a logical structure proceeding from certain assumptions to certain conclusions. We hope that both the assumptions and the conclusions may be checked by comparing them with the real world; the more highly testable the theory, the better. Normally, however, certain parts of the theory are difficult to test.
We are not unduly concerned by this, since if parts of it survive tests, we may assume that the untestable remainder is also true." (Gordon Tullock, Logic of the Law, 1971: 10.)

4. "The theory of the lever may, of course, be disproved tomorrow, but the fact that it has withstood two thousand years of critical examination, much of it using tools which the Greeks could not even dream of, does raise some presumption that here we have a bit of theory which is absolutely true. It seems likely that somewhere in our present vast collection of theories there are others which are, in fact, true, that is, which will not be disproved at any time in the future. It is, of course, impossible to say which they are." (Gordon Tullock, Organization of Inquiry, 1966: 48.)

5. "An intelligent outsider who has the time and interest in a problem should investigate, himself, since only in this way can he reach the level of certainty of the experts themselves. Personal knowledge is always superior to hearsay, ..." (Gordon Tullock, The Organization of Inquiry, 1966: 53.)

6. "I prefer to use the word 'rational' for those acts that might well achieve the goals to which the actor aims, regardless of whether they are humanitarian, violent, etc." (Gordon Tullock, The Social Dilemma, 1974: 4.)

7. Tullock often acknowledges his own fallibility, although he does not tout it. This is evident in the lead quote and several others included in the text. Another appears in the first chapter of Towards a Mathematics of Politics. There he relates a story about failing to purchase glasses made of a new material when it was first suggested to him by his optometrist. Gordon evidently misunderstood what was said regarding an innovation in lens design, and fully appreciated it only a week or so later, at which point he purchased the glasses with the recommended lenses.

8. "My main point is simply that we stop fooling ourselves about redistribution. We have a minor desire to help the poor.
This leads to certain government policies. We also have some desire for income insurance. And we also, to some extent, envy the rich. ... [However,] the largest single source of income redistribution is simply the desire of the recipients to receive the money." (Gordon Tullock, "The Rhetoric and Reality of Redistribution," Southern Economic Journal, 1981: 906.)

9. "Man is a complicated animal and his motives are many and varied." (Gordon Tullock, The Organization of Inquiry, 1966: 39.)

10. "We argue below that it (altruism) is a relatively minor motive and the major motives tend to lead to inefficiency and distortion. This motive (altruism), insofar as it is implemented, actually improves the efficiency of the economy." (Gordon Tullock, "The Rhetoric and Reality of Redistribution," Southern Economic Journal, 1981: 896.) "Of course, if envy is strong enough, then taking a dollar away from me might give other people a total satisfaction which was larger than the loss of the dollar to me. Thus plundering the Rockefeller family might be socially desirable if we had some way of measuring innate utilities." (Gordon Tullock, "The Rhetoric and Reality of Redistribution," Southern Economic Journal, 1981: 902.)

11. "The primacy of private interest is not inconsistent with the observation that most people, in addition to pursuing their private interests, have some charitable instincts, some tendency to help others and to engage in various morally correct activities. However, the evidence seems fairly strong that these motives other than the pursuit of private interests are not the ones on which we can depend for the achievement of long-continued efficient performance." (Gordon Tullock, Government: Whose Obedient Servant?, 2000: 11.)

12. A collection of Washington quotes is available on the internet at http://www.dropbears.com/b/broughsbooks/qwashington.htm.

13. "Economics has traditionally studied the benefits of cooperation. Political science is beginning to move in that direction.
Although I would not quarrel with the desirability of such studies, the fact remains that conflict is also important. In general, conflict uses resources; hence it is socially inefficient, but entering into the conflict may be individually rational for one or both parties. ... The social dilemma, then, is that we would always be better off collectively if we could avoid playing this kind of negative sum game, but individuals may make gains by forcing such a game on the rest of us." (Gordon Tullock, The Social Dilemma, 1974: 2.)

14. "Obviously, as a good social policy, we should try to avoid having games that are likely to lead to this kind of waste. Again, we should try to arrange that the payoff to further investment in resources is comparatively low, or, in other words, that the cost curve [of rent seeking] points sharply upward." (Gordon Tullock, Efficient Rent Seeking, 2000: 13.) "There are institutions that will reduce the likelihood of being forced into such a game, but these institutions cost resources, too. ... [However,] the problem is unavoidable - at least in the present state of knowledge. Pretending that it does not exist is likely to make us worse off than conceding its existence and taking rational precautions." (Gordon Tullock, The Social Dilemma, 1974: 2.)

15. "The problem of maintaining power in a dictatorship is really similar to that of maintaining a majority for redistributive purposes in a voting body. It is easily demonstrated, of course, that it is always possible to build a majority against any particular program of redistribution by offering something to the "outs" on the original program and fairly high payments to a few of the "ins." The situation in a dictatorship is similar. It is always possible, at least in theory, to collect together a group of people which is more powerful than the group supporting the status quo. This group will be composed of important officials of the regime who could benefit from its overthrow and their concomitant promotion."
(Gordon Tullock, Autocracy, 1987: 19.)

16. "Preventing overthrow by the common people is, in general, quite easy if the ruler is only willing to repress vigorously and to offer large rewards for information about conspiracies against him." (Gordon Tullock, Autocracy, 1987: 68.)

17. Tullock may disagree with this location-based explanation. "Most of my work in Public Choice has dealt with democratic governments. This is not because I thought that democratic governments were the dominant form of government, either currently or historically. That more people are ruled by autocracies than by democracies today, and that the same can be said of earlier periods, is obvious. I did think that democratic governments were better than the various alternatives which have been tried from time to time, but the basic reason that most things that I have published have dealt with democracies is simply that I've found dictatorship to be a very, very difficult problem." (Gordon Tullock, Autocracy, 1987: x.)

18. "For a given activity, the fully rational individual at the time of constitutional choice will try to choose that decision-making rule which will minimize the present value of the expected costs that he must suffer. He will do so by minimizing the sum of the expected external costs and the expected decision-making costs ... [In this manner,] the individual will choose the rule which requires that K/N of the group agree when collective decisions are made." (Gordon Tullock and James M. Buchanan, Calculus of Consent, 1962: 70.) "This broad ... classification does not, of course, suggest that all collective action should rationally be placed under one of two decision making rules. The number of categories, and the number of decision-making rules chosen, will depend on the situation which the individual expects to prevail and the 'returns to scale' expected to result from using the same rule over many activities." (Gordon Tullock and James M. Buchanan, Calculus of Consent, 1962: 76.)

19.
In their words, the demand-revealing process "is a new process for making social choices, one that is superior to other processes that have been suggested. The method is immune to strategic maneuvering on the part of individual voters. It avoids the conditions of the Arrow theorem by using more information than the rank orders of preferences and selects a unique point on or 'almost on' the Pareto-optimal frontier, one that maximizes or 'almost maximizes' the consumer surplus of society. Subject to any given distribution of wealth, the process may be used to approximate the Lindahl equilibrium for all public goods." (Tideman and Tullock, Journal of Political Economy, 84: 1145.)

20. An interesting property of the median voter hypothesis is that decisions tend to be largely independent of the particulars of the interests of voters away from the median (Black, 1948). All that matters is that which is necessary to identify the median voter. How much more or less than the median voter's interest is demanded by other voters, and how intensively those demands are held, is irrelevant. A wide range of voter distributions can have the same median. However, not every distribution of voter preferences has a median. In the absence of a median, McKelvey (1979) demonstrates that literally "anything" can happen under a sequence of majority decisions. The properties of democratic governance are by no means obvious, and the more detailed the institutional structures and preferences that are taken account of, the more complex political decision making becomes.

21. Vote trading can also lead to the funding of regional boondoggles, as in the pork barrel dilemma (Tullock, 1959). Again the world is more complex than one might have hoped.

22. "Insofar as the alternatives for employment are limited, and the shifting of either jobs or employees involves costs, the secondary, or 'political,' relationship enters even here.
... The most obvious empirical verification of this difference is the degree of deference shown to superiors." (Gordon Tullock, The Politics of Bureaucracy, 1966: 11.)

23. "In the ideally efficient organization, then, the man dominated by ambition would find himself taking the same courses of action as an idealist simply because such procedure would be the most effective for him in achieving the personal goals that he seeks. At the other extreme, an organization may be so badly designed that an idealist may find it necessary to take an almost completely opportunistic position because only in this manner can his ideals be served." (Gordon Tullock, The Politics of Bureaucracy, 1965: 21.)

24. "In the United States civil service, the individual career employee is generally not expected to put up with quite as much 'pushing around' as he might endure in the higher ranks of some large corporations. To balance this, he will be receiving less salary and will probably find that the orders which he is expected to implement are less rational than those he could expect to receive in private industry." (Gordon Tullock, The Politics of Bureaucracy, 1966: 12.)

25. "Improving the efficiency of a large corporation by, let us say, 2 percent may well mean that some individual's wealth goes up by $50 million and a very large number of individuals will have increases in wealth on the order of a hundred to a million dollars. Maximizing the public interest, however, would always be a public good, and improvement by 2 percent in the functioning efficiency of some bureau would characteristically increase the well-being of average citizens, or, indeed, any citizen, by amounts which would be almost invisible." (Gordon Tullock, Government: Whose Obedient Servant?, 2000: 58.)

26. It bears noting that many of the demands for public policy within a given society are independent of the type of political regime in place.
For example, criminal and civil laws would be adopted by nearly unanimous agreement by all free men and women at a constitutional convention (Calculus of Consent, 1962: Ch. 5, and Logic of the Law, 1971: Ch. 2). Alternatively, an autocrat may establish criminal and civil law as a means of maximizing the resources potentially available to the state (Explorations in the Theory of Anarchy, 1972: 72, and The Social Dilemma, 1974: 19). Murder and theft will ordinarily be punished, and most contracts will be enforced, under both democratic and autocratic regimes. Some other rules may vary somewhat according to regime type, as with rules concerning payments to government officials, freedom of assembly, and the publication of news critical of the government, but regime type will not always directly affect public policy outcomes or economic performance.

27. Of course, procedural questions are more important for a political economist than for a scholar whose work focuses on a single society. This probably explains why procedural aspects of law enforcement are given relatively little attention in the law and economics literature; see, for example, Becker (1968) or Posner (1972).

28. "Most crimes are not simply the preliminary to punishment for the criminals, most people who are in prison have not had anything that we would recognize as a trial, and administrative decisions keep people in prison and (in effect) extend their sentence." (Gordon Tullock, Logic of the Law, 1971: 169.)

29. "The problem of determining what actually happened is one of the court's duties and the only one we are discussing now. A historic reconstruction, which is what we are now talking about, is a difficult task for a variety of reasons. One is that witnesses lie, and in lawsuits there usually are at least some witnesses who have a strong motive to lie. They may also simply be mistaken.
Another reason is that many things which happen that are of interest to the court leave no physical traces and, indeed, may leave no traces on the minds of the parties ... different cases have different amounts of evidence of varying quality available, and ... this evidence leads us to varying probabilities of reaching the correct decision." (Gordon Tullock, Trials on Trial, 1980: 25-26.)

30. This is especially true for those working in the tradition of the public choice approach to politics. However, the latter is partly a consequence of Tullock's many contributions to public choice, but, perhaps even more so, a consequence of his two decades as editor of the journal Public Choice. Those years largely defined the discipline as we know it now, and Tullock's editorial decisions helped determine those boundaries - such as they are - and his responses to contributors made his world view both familiar and important to aspiring public choice scholars of that period.

31. "We undertake investigations because we are curious, or because we hope to use the information obtained for some practical purpose." (Gordon Tullock, Organization of Inquiry, 1966: 12.)

32. In fact, his first paper on the costly nature of efforts to secure rents (1967) was roundly rejected by the major economics journals (Brady and Tollison, 1994: 9-10).

References

Selected references: Gordon Tullock

Lockard, A.A. and Tullock, G. (2001). Efficient rent seeking: Chronicle of an intellectual quagmire. Boston: Kluwer Academic Publishers.
Tullock, G., Seldon, A. and Brady, G.L. (2000). Government: Whose obedient servant? A primer in public choice. London: Institute of Economic Affairs.
Tullock, G. (1997). The case against the common law. Durham: North Carolina Academic Press.
Tullock, G. (1997). Economics of income redistribution. Boston: Kluwer Academic Publishers.
Brady, G.L. and Tollison, R.D. (Eds.). (1994). On the trail of homo economicus: Essays by Gordon Tullock. Fairfax, VA: George Mason University Press.
Tullock, G. (1994). The economics of nonhuman societies. Tucson: Pallas Press.
Grier, K.B. and Tullock, G. (1989). An empirical analysis of cross-national economic growth, 1951-80. Journal of Monetary Economics 24: 259-276.
Tullock, G. (1989). The economics of special privilege and rent seeking. Hingham, MA: Kluwer Academic Publishers.
Tullock, G. (1987). Autocracy. Hingham, MA: Kluwer Academic Publishers.
Tullock, G. (1986). The economics of wealth and poverty. New York: New York University Press (distributed by Columbia University Press).
Tullock, G. (1985). Adam Smith and the prisoners' dilemma. Quarterly Journal of Economics 100: 1073-1081.
McKenzie, R.B. and Tullock, G. (1985). The new world of economics: Explorations into the human experience. Homewood, IL: Irwin.
Brennan, G. and Tullock, G. (1982). An economic theory of military tactics: Methodological individualism at war. Journal of Economic Behavior and Organization 3: 225-242.
Tullock, G. (1981). The rhetoric and reality of redistribution. Southern Economic Journal 47: 895-907.
Tullock, G. (1981). Why so much stability? Public Choice 37: 189-202.
Tullock, G. (1980). Trials on trial: The pure theory of legal procedure. New York: Columbia University Press.
Tullock, G. (1980). Efficient rent seeking. In J.M. Buchanan, R.D. Tollison, and G. Tullock (Eds.), Toward a theory of the rent-seeking society, 97-112. College Station: Texas A&M University Press.
Tullock, G. (1979). When is inflation not inflation: A note. Journal of Money, Credit, and Banking 11: 219-221.
Tullock, G. (1977). Economics and sociobiology: A comment. Journal of Economic Literature 15: 502-506.
Tideman, T.N. and Tullock, G. (1976). A new and superior process for making social choices. Journal of Political Economy 84: 1145-1159.
Tullock, G. (1975). The transitional gains trap. Bell Journal of Economics 6: 671-678.
Buchanan, J.M. and Tullock, G. (1975).
Polluters' pro.ts and political response: Direct controls versus taxes. American Economic Review 65: 139-147. Tullock, G. (1974). The social dilemma: The economics of war and revolution. Blacksburg: University Publications. Tullock, G. (1972). Explorations in the theory of anarchy. Blacksburg: Center for the Study of Public Choice. Buchanan, J.M. and Tullock, G. (1971/1962). Thecalculusofconsent: Logical foundations of constitutional democracy. Ann Arbor: University of Michigan Press. Tullock, G. (1971). The charity of the uncharitable. Western Economic Journal 9: 379-392. Tullock, G. (1971). Inheritance justi.ed. Journal of Law and Economics 14: 465-474. Tullock, G. (1971). The paradox of revolution. Public Choice 11: 88-99. Tullock, G. (1971). Public decisions as public goods. Journal of Political Economy 79: 913- 918. Tullock, G. (1971/1988). The logic of the law. Fairfax: George Mason University Press. Tullock, G. (1967). The general irrelevance of the general impossibility theorem. Quarterly Journal of Economics 81: 256-270. Tullock, G. (1967). The welfare costs of monopolies, tariffs and theft. Western Economic Journal 5: 224-232. Tullock, G. (1967). Towards a mathematics of politics. Ann Arbor: University of Michigan Press. Tullock, G. (Ed.). (1966/7). Papers on non-market decision making. Charlottesville: Thomas Jefferson Center for Political Economy, University of Virginia. Tullock, G. (1966). The Organization of Inquiry. Durham: Duke University Press. Tullock, G. (1966). Gains-from-trade in votes (with J.M. Buchanan). Ethics 76: 305-306. Tullock, G. (1965). The politics of bureaucracy. Washington, DC: Public Affairs Press. Tullock, G. (1965). Entry barriers in politics. American Economic Review55: 458-466. Tullock, G. (1962). Entrepreneurial politics.Charlottesville: Thomas Jefferson Center for Studies in Political Economy, University of Virginia. Tullock, G. (1959). Problems of majority voting. Journal of PoliticalEconomy 67: 571-579. Campbell, C.D. 
and Tullock, G. (1954). Hyperin.ation in China, 1937-49. Journal of Political Economy 62: 236-245. Other references Becker, G.S. (1968). Crime and punishment: An economic approach. The Journal of Political Economy 76: 169-217. Biennen, H. and van de Walle, N. (1989). Time and power in Africa. American Political Science Review 83: 19-34. Black, D. (1948). On the rationale of group decision-making. Journal of Political Economy 56: 23-34. Buchanan, J.M. (1987). The qualities of a natural economist. In C. Rowley (Ed.), Democracy and public choice, 9-19. New York: Blackwell. Congleton, R.D. (1988). An overview of the contractarian public .nance of James Buchanan. Public Finance Quarterly 16: 131-157. Congleton, R.D. (1980). Competitive process, competitive waste, and institutions. In J. Buchanan, R. Tollison, and G. Tullock (Eds.), Towards a theory of the rent-seeking society, 153-179. Texas A & M Press. Hirshliefer, J. (2001). The dark side of the force: Economic foundations of con.ict theory. New York: Cambridge University Press. Johnson, P. (1997). A history of the American people. New York: Harper. Krueger, A.O. (1974). The political economy of the rent-seeking society. American Economic Review 64: 291-303. McKelvey, R.D. (1979). General conditions for global intransitivities in formal voting models. Econometrica 47: 1085-1112. Olson, M. (1993). Dictatorship, democracy, and development. American Political Science Review 87: 567-576. Posner, R.E. (1972). Economic analysis of the law. Boston: Little, Brown and Company. Shepsle, K.A. and Weingast, B.R. (1981). Structure-induced equilibrium and legislative choice. Public Choice 37: 503-519. Wintrobe, R. (1990). The tinpot and the totalitarian: An economic theory of dictatorship. American Political Science Review 84: 849-872. 
From dsmith06 at maine.rr.com Tue Jan 3 02:16:51 2006 From: dsmith06 at maine.rr.com (David Smith) Date: Mon, 02 Jan 2006 21:16:51 -0500 Subject: [Paleopsych] 'The Better Angels of Our Nature': Evolution and Morality Message-ID: <43B9DE93.2070408@maine.rr.com> 'The Better Angels of Our Nature': Evolution and Morality St. Francis Room of the Ketchum Library University of New England, 11 Hills Beach Road, Biddeford, Maine. Feb. 21, 2006 at 6 p.m. Evolutionary biologist David Lahti, Ph.D., will deliver a lecture on "'The Better Angels of Our Nature': Evolution and Morality" on Feb. 21, 2006 at 6 p.m. in the St. Francis Room of the Ketchum Library. Lahti is an NIH Postdoctoral Fellow at the University of Massachusetts, Amherst. The lecture, sponsored by the New England Institute for Cognitive Science and Evolutionary Psychology and the Department of Philosophy and Religious Studies, is free and open to the public. Are we humans essentially altruistic beings whose natural state is to care for others? Or are we ogres at heart, our moral codes the only thing holding us back from utter selfishness? Lahti argues that an evolutionary consideration of morality suggests a third alternative, that we are by nature moral strugglers and deliberators - that the relevant adaptive trait is neither altruism nor selfishness, but rather a refined ability to assess our social environments and make informed decisions about how altruistic or selfish to be. We tend, he believes, to make these decisions on the basis of two main variables: the anticipated effects of our behavior on our reputation and the perceived stability of the social groups on which we depend. Furthermore, what we often call morality is actually a conglomerate of tendencies and capacities, some of which are millions of years old and others just thousands. 
Many of its more recent features, including moral rules that are difficult for us to follow, are cultural surrogates for adaptation in an age when our social environments are changing too fast for us to adapt genetically to them. Lahti received a Ph.D. in philosophy at the Whitefield Institute at Oxford in 1998, for work on the relationship between science and the foundations of morality; more recently his research in this area has focused on the evolution of morality. In 2003 he received a Ph.D. in ecology and evolutionary biology from the University of Michigan, where he documented rapid evolution in the African village weaverbird. From 2003 to 2005 he held the Darwin Fellowship at the Program in Organismic and Evolutionary Biology at University of Massachusetts Amherst, and has been studying the evolution and development of bird song. From checker at panix.com Tue Jan 3 22:28:49 2006 From: checker at panix.com (Premise Checker) Date: Tue, 3 Jan 2006 17:28:49 -0500 (EST) Subject: [Paleopsych] Public Choice: Anyone for higher speed limits? Message-ID: Olof Johansson-Stenman and Peter Martinsson: Anyone for higher speed limits? - Self-interested and adaptive political preferences. Public Choice (2005) 122: 319-331. DOI: 10.1007/s11127-005-3901-x [This is a nice article that gets at the issue of whether voters vote for what is in their own self-interest or what they think is in the overall public interest. I say vote for what's in your own interest, since you don't know what's in others' interest, certainly not better than they do themselves. It's the American way, that idea that government serve the people. One chief problem of altruism is that, if I am to serve other people, who are all these other people going to serve? 
[One matter of surprise to me: "The result presented here is also consistent with the result of Hemenway and Solnick (1993) and Shinar, Schechtman and Compton (2001), who found that levels of education higher than high-school tended to increase the probability of speed violation." [Be wary of the article, though, since the variables explained only 20% of the variance. There's a real likelihood that, if someone will come up with other variables that will explain more of the variance, the coefficients on the ones used here could get drastically altered. One of the critics of _The Be** Cu**e_ complained that the authors often buried the R^2s in the back, and in fact the R^2s varied all over the place. [Note that I said, "IF someone will come up with other variables...." Researchers use the variables they can get ahold of. What _The B*ll C*rv*_ did was to thoroughly mine a data set, the one collected for the National Longitudinal Survey of Youth, which almost uniquely had a measure of intelligence. The results are known: IQ correlated more with things like income and scholastic achievement than did the usual measures: socioeconomic status, parental income, and so on. I'd love to know whether having an IQ measure upset any conventional wisdom on these other factors. [When I was in graduate school at UVa, self-interested voting was emphasized and expressive voting barely recognized. I was Virginia School. The Rochester School (William Riker and then others) came along later. It was in political science but used economics tools. It emphasized disinterested voting. These two Schools were never, I don't think, hostile toward one another, and this article shows that the issues are empirical.] Department of Economics, SE 40530 Göteborg, Sweden; e-mail: Olof.Johansson at economics.gu.se, Peter.Martinsson at economics.gu.se Accepted 17 November 2003 Abstract. 
Swedish survey evidence indicates that variables reflecting self-interest are important in explaining people's preferred speed limits, and that political preferences adapt to technological development. Drivers who believe they drive better than the average driver, as well as drivers of cars that are newer (and hence safer), bigger, and with better high-speed characteristics, prefer higher speed limits. In contrast, elderly people prefer lower speed limits. Furthermore, people report that they themselves vote more sociotropically than they believe others vote on average, indicating that we may vote less sociotropically than we believe we do. One possible reason for such self-serving biases is that people desire to see themselves as socially responsible. *We are grateful for very constructive comments from an anonymous referee. We have also received useful comments from Fredrik Carlsson and had fruitful discussions with Per Fredriksson and Douglas Hibbs. Financial support from the Swedish Agency for Innovation Systems (VINNOVA) is gratefully acknowledged. 1. Introduction The purpose of this paper is twofold: i) to use survey evidence about what speed limits different people prefer on motorways, and what their own subjectively perceived and self-reported voting motives are, in order to provide new insight into the determinants of individual voting behavior, in particular the self-interested voting hypothesis; and ii) to identify adaptations in political preferences due to technological development, in our case changes in safety and high-speed features of cars. The analysis is based on two recent representative Swedish surveys: In the first one people were asked about their preferred speed limits on motorways. In the second they were asked about why they vote as they do, and about why they think other people vote as they do. Why do people vote in the way they do and why do they vote at all? 
One reason for the latter is simply that we are heavily indoctrinated to do so; c.f. Tullock (2000). But is how we vote motivated solely by the instrumental outcome induced by our votes? Or are we perhaps, as proposed by Brennan and Lomasky (1993) and Brennan and Hamlin (1998, 2000), motivated largely by the expressive act of voting? If the expressive motive is important it becomes more likely that people are concerned with society as a whole when voting, rather than what is good solely for themselves.1 Indeed, as found by Brekke, Kverndokk and Nyborg (2003), most people seem to prefer a self-image that reflects social responsibility, rather than pure self-concern. The relative importance of purely self-interested voting, versus sociotropic voting, is still debated. This is partly because it is difficult to draw strong conclusions from general elections that are characterized by few political parties (or candidates) and many political issues and indicators; see e.g. Kinder and Kiewiet (1979), Kramer (1983) and Mitchell (1990). The reason for this is that some opinions of one party may favour a certain group while other opinions may favour other groups, and it is difficult to know the relative weights that different voters give to the different opinions of the parties. Thus, there are clear advantages to be gained from testing the self-interested voting model when the choice set is small and when there are few political issues, such as on a single-issue referendum or by using tailor-made surveys. Smith (1975) analyzed the voting behavior from a referendum in Oregon concerning tax equalization between different districts, and concluded that self-interest does seem to play an important role. Sears, Lau, Tyler and Allen (1980), on the other hand, analyzed survey data on people's attitudes toward specific policies in the US, and concluded that self-interest plays a very minor role. 
However, their conclusions, based on their statistical results, can be questioned: for example, they found that the support for a national health insurance decreased with income and increased with age, and that the support for more resources to be given to law and order increased with income, but these findings were not interpreted to reflect self-interest. Nevertheless, there have also been other studies such as Gramlich and Rubinfeld (1982) and Shabman and Stephenson (1994) that have concluded that self-interest alone does a poor job of explaining the results. These findings are also consistent with much experimental evidence from public-good games; see e.g. Ledyard (1995) and Keser and van Winden (2000). Much of the analysis here is based on the first survey about the preferred speed limits on motorways, which is an issue that has been frequently debated for a long time in Sweden. Besides being a single issue, it has the advantage of being fairly neutral from an ethical point of view, meaning that the opinion of good and responsible citizens is not straightforward to predict.2 Survey responses can otherwise be biased towards what is perceived to be the most ethical alternative, which is an argument that for example is put forward in the environmental valuation literature. A possible underlying reason for this bias, in turn, is that people typically attempt to present themselves in a positive manner to others, which implies that we sometimes deliberately conceal or colour our true opinions or preferences, i.e. what Kuran (1995) denotes "preference falsifications." An alternative reason is our desire to see ourselves as good people, and our tendency to bias our impressions of reality in various respects to maintain or improve this self-image (see e.g. Gilovich, 1991). 
Such tendencies may, for example, influence people to believe that they would be willing to pay more for a socially good cause than they would actually be willing to pay, which is denoted "purchase of moral satisfaction" by Kahneman and Knetsch (1992). In order to broaden the insights on voting motives, we also performed a second survey where a representative sample of Swedes was asked about why they vote as they do, and why they think other people vote as they do. This allows us to compare the findings about the preferred speed limits with the perception that people have of their own and others' voting motives. The reason we also asked about the perception of others' voting motives is the one just mentioned, i.e. that we suspected that the responses may be biased since most people would presumably consider voting out of conviction to be ethically superior to voting solely for one's own good. Given that self-interest is important for the political preferences, one would from the general self-interested hypothesis expect people with more exclusive and safer cars, and with higher subjective driving skills, to prefer relatively high speed limits, and elderly and more vulnerable people to prefer lower speed limits. In addition, one would expect people who drive faster, and who break the speed limits more often to prefer higher speed limits. The results, reported in Section 2, are consistent with these hypotheses. One would also expect that these preferences would change with changing circumstances. Indeed, behavioural adaptations in response to perceived changes in the environment are among the most important insights that modern economics can contribute to the public debate. 
For example, a safety improvement in cars of, say, 10% may cause a much smaller net effect on safety, since safer cars may induce people to drive faster and less responsibly; see Peltzman (1975), Keeler (1994), Peterson, Hoffer and Millner (1995) and Merrel, Poitras and Sutter (1999) for theoretical analysis as well as empirical evidence. This paper will concentrate on another kind of adjustment, namely how political preferences with respect to preferred speed limits on motorways change with the rapid technological development of private cars. The data used here is not ideal in this respect, since the survey is purely cross-sectional. Nevertheless, it is still possible to see whether the results are consistent with the hypothesis of adjustments of political preferences. If people demand higher speed limits when their cars get safer and have better high-speed characteristics, one would expect from the empirical analysis that more people would be in favour of increasing speed limits rather than decreasing them, since these limits were decided upon many years ago,3 and also that individuals with newer cars would prefer higher speed limits. This is also found in our empirical analysis. It is interesting to compare the motives that can be inferred from people's choices, in reality or in surveys, with their own subjectively perceived voting motives. This is the reason we undertook the second survey about people's perceptions of their own and others' voting motives. The results indicate that most people believe that others vote largely for their own interests, whereas they, on average, consider themselves to be influenced roughly equally by their own interests and by those of society as a whole. The results further help to identify possible self-serving biases, i.e. that people may tend, unconsciously, to believe that what is in the interest of society happens to coincide with what is in their own private interest. 
If so, one would expect systematic differences between people's reported perception of their own motives and that of others' motives, and that people on average believe that they themselves vote more out of conviction, or sociotropically, than others do. And as reported in Section 3, this is indeed found to be the case. 2. Analysis of preferred speed limits The main survey was mailed to 2500 randomly selected individuals aged between 18 and 65 years old in Sweden, during spring 2001. The response rate of the overall survey was 62%, and 1131 car drivers answered the speed-limit question. Each respondent was asked the following question: What speed limit do you think we should have on Swedish motorways? They were given five options, all of which have been discussed in the Swedish debate from time to time: 90, 100, 110 (the level today), 120, and 130 km/h. The descriptive result in Table 1 shows that very few would like to have decreased speed limits, and that more than half of the respondents would like to see increased speed limits. This may in itself be an indication that people have adapted their political preferences to the increased levels of vehicle safety, but to be able to say more on this issue we would need to know who wants increased speed limits, and who does not. This is the issue to which we turn next. In order to obtain information on the characteristics that affect the preferred speed limit, we ran an OLS-regression with the preferred speed limit as the dependent variable on a number of socio-economic characteristics and the characteristics of the car that they most frequently drive. Because of missing or incomplete responses, primarily on the income and voting variables, the number of respondents included in the analysis is 974. The results from the estimations are presented in Table 2 along with the mean sample value of each explanatory variable. Table 1. Sample distribution of the preferred speed limit on Swedish motorways. 
(N = 974)
  90 km/h: 2%
  100 km/h: 3%
  110 km/h (the level today): 41%
  120 km/h: 25%
  130 km/h: 29%
The results show that those who drive newer cars do prefer higher speed limits, as one would expect, given that people adapt their preferences to changing circumstances, in this case safer cars with better high-speed driving characteristics.4 Similarly, drivers of the prestige cars BMW, Mercedes and Porsche, which are also safer and/or have better high-speed driving characteristics, also prefer higher speed limits. The size of car also affects the preferred speed limit in the expected direction, since bigger cars are on average safer, and have better high-speed characteristics, but the differences are not significant at conventional levels. Jeeps and vans constitute the base case, and although these are big vehicles, they typically have poor high-speed characteristics. The preferred speed limit is higher for those who believe they are better than average drivers, which is also consistent with the self-interested hypothesis, since the risk of an accident, for a given speed, would then be lower.5 A long annual driving distance also increases the preferred speed limit, which, however, is not obvious from the self-interested hypothesis. On the one hand, those who drive a lot will gain more time from increased speed levels, but on the other hand they will also face a larger reduction in safety. In our case, it seems that the former effect dominates the latter. This is also consistent with Rienstra and Rietveld (1996), who found that self-reported frequency of speed-transgressions on Dutch highways increases with annual driving distances. The effects of always using a seatbelt may seem to contradict the theory, since those without seatbelts would face the biggest risk-increase from increased speed levels. 
However, it seems likely that the results largely reflect preference heterogeneity, so that those who are more risk-averse, or generally more cautious, prefer both to use seatbelts and to have relatively low speed limits. People living in the bigger cities of Sweden prefer somewhat higher speed limits, for which one explanation may be the generally higher pace of urban life, which translates into a higher value of time. The effect of education is quite small, and perhaps in the opposite direction to what one would have guessed, since safety awareness is often believed to follow from, or at least to be positively correlated with, education. However, hardly anyone in Sweden, irrespective of education, can be uninformed about the public campaign messages that safety decreases as speed increases. Further, the true relationship between speed and safety may not be as clear and strong as is typically presented, and maybe highly educated people are less easy to convince by public propaganda. Generally, most (but not all)6 analysts seem to agree that safety typically does decrease with increased speed limits, but there is less agreement about how large the effect is. Nevertheless, the result presented here is also consistent with the result of Hemenway and Solnick (1993) and Shinar, Schechtman and Compton (2001), who found that levels of education higher than high-school tended to increase the probability of speed violation.

Table 2. OLS-estimation of preferred speed limit on Swedish motorways. Dependent variable: preferred speed limit on Swedish motorways in km/h. (N = 974)

Variable                                      Coeff.    P-value   Mean value
Constant                                    -111.552    0.276
Model-year of the car                          0.112    0.030     1993.299
Drives either BMW, Mercedes or Porsche         2.771    0.029        0.050
Drives a small-sized car                       1.082    0.505        0.071
Drives a medium-sized car                      1.553    0.229        0.516
Drives a big car                               2.173    0.100        0.362
Drives better than average (self-reported)     2.693    0.000        0.424
Drove more than 25000 km last year             1.524    0.025        0.213
Always wears seat-belt in front-seat          -2.446    0.003        0.860
Lives in Stockholm, Gothenburg or Malmö        1.469    0.040        0.196
University-educated                            1.314    0.103        0.322
A-level educated                               0.907    0.213        0.449
Equivalence-scaled household income*           0.201    0.001       12.047
Aged above 57                                 -1.952    0.021        0.151
Male                                           4.233    0.000        0.532
Has at least one child                         0.669    0.307        0.406
Right-wing political preferences               2.799    0.001        0.140
Left-wing political preferences               -1.582    0.011        0.299

R2 = 0.204. RESET** p-value = 0.281. Mean VIF*** = 1.84; highest VIF for a single variable is 5.64.

*In 1000 SEK/month and person. In order to compare income between households, we employ the equivalence scale used by the National Tax Board (RSV) in Sweden. The scale assigns the first adult the value of 0.95, the following adults are set at 0.7 and each child at 0.61 units.
**The RESET test is a general specification test (see e.g. Godfrey, 1988). In the test we rerun the regression including the squared, cubed and fourth powers of the fitted value of the dependent variable from the original model and test whether the coefficients of the included variables are jointly significant.
***We test for multicollinearity in our data set by calculating the variance inflation factor (VIF) for each variable. The largest VIF is 5.64 and the mean VIF is 1.84. The largest value is thus smaller than 10 and the mean value is not considerably larger than 1, as required to be able to judge that there is no apparent indication of multicollinearity according to STATA (2003: 378). 
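The VIF diagnostic described in the table notes can be sketched in a few lines: for each regressor, regress it on the remaining regressors and report 1/(1 - R^2). The survey data are not public, so the illustration below runs on synthetic data; the variables are stand-ins, not the authors' actual data or code.

```python
import numpy as np

def vif(X):
    """Variance inflation factors: regress each column of X on the remaining
    columns (plus a constant) and return 1 / (1 - R^2) for each."""
    n, k = X.shape
    factors = []
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - (resid ** 2).sum() / ((y - y.mean()) ** 2).sum()
        factors.append(1.0 / (1.0 - r2))
    return np.array(factors)

# Synthetic illustration: x1 and x2 nearly collinear, x3 independent.
rng = np.random.default_rng(0)
x1 = rng.normal(size=500)
x2 = x1 + 0.1 * rng.normal(size=500)
x3 = rng.normal(size=500)
factors = vif(np.column_stack([x1, x2, x3]))
# The first two VIFs come out far above 10, the third close to 1, which is
# the kind of pattern the "largest VIF smaller than 10" rule of thumb cited
# from STATA in the table notes is meant to flag.
```

In the paper's own data the corresponding numbers are the reported mean VIF of 1.84 and maximum of 5.64, i.e. no collinearity problem by this rule.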
Increased household income causes both a higher value of time and a higher value of a statistical life, or more generally, the willingness to pay to avoid traffic risks; hence the theoretical prediction is ambiguous. As for driving distance, the time effect appears to dominate. These results are also consistent with Rienstra and Rietveld (1996) and Shinar, Schechtman and Compton (2001), who found that those with the highest incomes tend to break highway speed limits more often than others. Older people prefer lower speed limits, as predicted due to their increased vulnerability. The relatively large male coefficient, corresponding to more than 4 km/h, can possibly be explained by observed higher risk aversion among women (e.g. Jianakoplos and Bernasek 1998, Hartog, Ferrer-i-Carbonell and Jonker 2002), but it might also reflect a taste difference concerning how fun fast driving is perceived to be, or some kind of macho image. The influence of political voting is also in the expected direction, since political parties to the left have typically proposed, and been associated with, a more restrictive speed policy, and vice versa. These parameters too may reflect direct instrumental self-interest, if people choose a political party partly due to the politically proposed speed limits. Still, it seems reasonable that these parameters rather reflect ideological conviction and expressive concern. This does not necessarily mean that they represent sociotropic concern, however, since people may have different kinds of values and opinions that they want to express; see e.g. Brennan and Lomasky (1993) and Brennan and Hamlin (2000). There is also a large part of the variation left unexplained, and we do not know how large a share of this part can be explained by non-included variables that reflect self-interest, such as how fun it is considered to be to drive fast. 3. 
Perceptions of voting motives This second survey was mailed to 1500 randomly selected individuals aged between 18 and 65 years old in Sweden, during spring 2002 (i.e. a year after the first survey), and the response rate of the overall survey was 58%. To compare actual voting motives with the perception people have of voting motives, we simply asked another representative sample of Swedes about why they thought other people vote as they do, followed by a question about why they themselves vote as they do. Before the questions, they were given the following information: One can vote for a political party for different reasons. One can vote for a party because one is favored oneself, or one can do it out of conviction that it is the best for society as a whole. As can be seen from Tables 3 and 4, most people believe that others vote largely for their own interests, whereas they, on average, consider themselves to be influenced roughly equally by their own interests and by those of society as a whole. To test whether the observed differences are statistically significant, i.e. whether there is a statistical difference between people's perception of the degree to which they themselves vote sociotropically, and the degree to which others vote sociotropically, we used a simple ordered probit model; the motives are ordered from "Mostly because it benefits me (them)" to "Mostly out of conviction." This is an appropriate econometric specification since the empirical analysis focuses on an ordered discrete variable. 
The approach is based on the idea of a latent unobservable variable, Socio*, representing, in our case, individuals' perception of the degree of sociotropic voting, with the following structure:7

Socio* = d · DOthers + e,

where DOthers is a dummy variable indicating that the responses are given to the framing on how others vote, and d is the associated parameter to be estimated; e is assumed to be a normally distributed error term with zero mean and constant variance. The results in Table 5 show that the between-sample difference is indeed highly significant, as reflected by a significant d-parameter (at less than the 0.1% level). One possible reason for this systematic bias is that people want to have a good self-image, or identity, and that they therefore engage in a degree of self-deception so that they believe that they would vote more for the common good than they would actually do in reality. Indeed, there is much psychological evidence for systematic self-deception that enhances people's perception of their own abilities in many respects; see e.g. Gilovich (1991) and Taylor and Brown (1994). An alternative, slightly more sophisticated version of this argument, is that people answer truthfully and without bias concerning their own motives. However, since most of us want to see ourselves as good and responsible people, and at the same time to do what is best for ourselves, we may unconsciously try to reduce the cognitive dissonance (cf. Akerlof and Dickens, 1982) by adapting our perceptions of what is best for society as a whole so that it more or less coincides with what is best for ourselves. Hence, when we honestly try to judge different alternatives as objectively as possible on behalf of society, we will still unconsciously bias our judgment in favour of what is best for ourselves; see Babcock and Loewenstein (1997) and references therein for much evidence of such self-serving biases. Table 3. Self-reported perceptions of own voting motives. 
(N = 751) Why do you vote as you do?
  Mostly because it benefits me: 10%
  Because it benefits me, but also to a certain degree out of conviction: 23%
  Equally because it benefits me and out of conviction: 27%
  Out of conviction, but also to a certain degree because it benefits me: 22%
  Mostly out of conviction: 18%

Table 4. Self-reported perceptions of others' voting motives. (N = 762) Why do you, on average, believe that people vote as they do?
  Mostly because it benefits them: 20%
  Because it benefits them, but also to a certain degree out of conviction: 39%
  Equally because it benefits them and out of conviction: 19%
  Out of conviction, but also to a certain degree because it benefits them: 17%
  Mostly out of conviction: 5%

Table 5. Ordered probit regression to estimate the differences between the respondents' perceived degree to which they themselves and others vote sociotropically. (N = 1513)
  Dummy variable reflecting the additional degree that others (compared to oneself) vote sociotropically: Coeff. -0.553, P-value 0.000
  Cut-off 1: -1.346
  Cut-off 2: -0.377
  Cut-off 3: 0.246
  Cut-off 4: 0.968
The dependent variable is the perceived degree of sociotropic voting, coded as follows: 1 = Mostly because it benefits me (them); 2 = Because it benefits me (them), but also to a certain degree out of conviction; 3 = Equally because it benefits me (them) and out of conviction; 4 = Out of conviction, but also to a certain degree because it benefits me (them); and 5 = Mostly out of conviction.

When we observe others, however, we just know roughly how they vote and their other circumstances. Hence, we can only crudely observe the correspondence between how others vote and their personal interests. But since we do not take into account the fact that others too adapt their perceptions of what is in the interest of society, through self-serving biases, the perception of the degree to which others vote sociotropically may be biased downwards. 4. 
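The response distributions implied by the ordered probit estimates in Table 5 can be recovered directly from the standard normal CDF: a response falls in category k when the latent Socio* lies between cut-offs k-1 and k. A minimal sketch using only the reported cut-offs and d-coefficient (this is not the authors' code, just their published estimates plugged back into the model):

```python
from math import erf, sqrt

# Estimates reported in Table 5.
CUTS = [-1.346, -0.377, 0.246, 0.968]   # cut-offs 1..4
D_OTHERS = -0.553                        # coefficient on the "others" framing dummy

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def category_probs(xb):
    """Probabilities of the five ordered responses implied by the latent
    structure Socio* = xb + e with e ~ N(0, 1)."""
    edges = [0.0] + [phi(c - xb) for c in CUTS] + [1.0]
    return [edges[k + 1] - edges[k] for k in range(5)]

p_own = category_probs(0.0)          # "Why do you vote as you do?" (DOthers = 0)
p_others = category_probs(D_OTHERS)  # "Why do others vote as they do?" (DOthers = 1)
```

These implied distributions roughly reproduce the sample fractions in Tables 3 and 4: the model puts about 9% of "own" responses and about 21% of "others" responses in the lowest ("mostly because it benefits me/them") category, against the observed 10% and 20%, and the negative d-coefficient shifts the whole "others" distribution toward the self-interested end.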
4. Conclusion

Most results from our survey indicate that self-interest is an important determinant of the preferred speed limit; for example, those who have a newer car (and hence one that is typically safer and more comfortable at high speeds), and those whose cars are bigger and faster, prefer higher speed limits. This is also true for those who believe they are better than the average driver, whereas older people prefer lower speed limits. Furthermore, the results are consistent with the existence of political offsetting behaviour: when cars become safer due to technological developments, people adapt their political preferences in favour of higher speed limits, which reduces road safety overall. However, the results from people's self-reported subjective voting motives are not consistent with purely instrumental pocketbook voting. Rather, the expressive motive seems to be important, as argued thoroughly by Brennan and Lomasky (1993) and Brennan and Hamlin (1998, 2000),8 and it seems in particular that people want to express that they are socially responsible people who care about the overall welfare of society. This interpretation is strengthened by the observation that people tend to believe that others, on average, vote more in their own interests. Still, despite such biases, we also find that most people answer that they vote both in their own interest and in the interest of society. Hence, the hypothesis that most people solely or primarily vote sociotropically appears to be incorrect too. Answering a survey, such as ours on preferred speed limits, is in some respects quite similar to voting. Since the respondents were informed that the survey had been sent to a large random sample of Swedes by a university, as part of a research project, they could hardly believe that their single response would influence actual policy in a non-negligible way.
Furthermore, the financial incentive to answer was zero, and answering the whole survey probably took almost half an hour on average. The response rates (62% and 58%, respectively) were also similar to electoral participation rates in many countries.9 Presumably, most of the respondents answered based on a sense of civic duty, or due to the disutility associated with not answering, which would break what they perceive to be a social (or personal) norm. But given that expressive voting, and expressive answering of surveys such as ours, is the main explanation behind observed behaviour, how can we explain the fairly strong correlation with respondents' own self-interest? Although it is perceived as socially admirable to vote, it is hardly perceived as admirable to vote solely for your own best interests. Rather, we are socialized to focus on the collective good when wearing our "political hats" (Sears, Lau, Tyler and Allen, 1980; Sears and Funk, 1990). One possible explanation for this paradox is provided by the idea of self-serving bias. As expressed by Elster (1999, 333): "Most people do not like to think of themselves as motivated only by self-interest. They will, therefore, gravitate spontaneously towards a world-view that suggests a coincidence between their special interest and the public interest." (italics in original). In this way we can vote for improvements for ourselves without feeling guilty that this would, overall, be bad for society, and we are hence not plagued by any cognitive dissonance. After all, it is much more pleasant to think that what is good for you is also good for society, isn't it?

Notes

1. However, as argued by Brennan and Lomasky (1993) as well as Brennan and Hamlin (2000), expressive voting per se does not necessarily imply sociotropic voting.

2. If anything, it may be considered somewhat more ethical to vote for lower speed limits.
Nevertheless, despite a possible bias in this direction, very few respondents (5%) prefer a lower speed limit than the current one, as can be seen from Table 1.

3. Highway speed limits have increased rapidly in many states in the USA during the last 15 years (Greenstone, 2002), and also in other countries such as Italy, while there are on-going discussions in many other countries.

4. However, it is possible that people who drive newer cars do so due to stronger preferences for safety. For this reason, those who have new cars would then prefer lower speed limits than others would. Given that the empirical result presents the net effect, the isolated effect of a newer car on the preferred speed limit would then be larger than the effect presented here.

5. This does not necessarily mean that actual safety increases with self-reported subjective driving ability, however, since over-optimism regarding one's own driving ability is likely to be positively correlated with subjective driving ability. Still, what matters for the preferred speed limit is the subjective risk, which is independent of such biases.

6. Indeed, some analysts have even questioned the sign of the relationship: Lave and Elias (1997) argued that the accident increase on rural interstate roads in the USA resulting from raising the speed limits to 65 mph in 1987 was more than offset by the decline of accidents on other roads due to compensatory reallocations of drivers and state police; see also Greenstone (2002), who, however, questioned the conclusion of Lave and Elias.

7. In our case five ordered categories are possible. The respondents are assumed to choose the alternative closest to their own perception, where we observe Socio = 1, i.e. "mostly because it benefits me (them)," if Socio* ≤ a1; Socio = 2, i.e. "because it benefits me (them), but also to a certain degree out of conviction," if a1 < Socio* ≤ a2; and so on, until Socio = 5, i.e.
"mostly out of conviction," if a4 < Socio*, where a1 to a4 are cut-off points to be estimated simultaneously with the coefficient.

8. See also Copeland and Laband (2002) for recent empirical support.

9. In the 2002 General Election in Sweden, 80.1% of the eligible population voted (SCB, 2002).

References

Akerlof, G.A. and Dickens, W.T. (1982). The economic consequences of cognitive dissonance. American Economic Review 72: 307-319.

Babcock, L. and Loewenstein, G. (1997). Explaining bargaining impasse: The role of self-serving biases. Journal of Economic Perspectives 11: 109-126.

Brekke, K.A., Kverndokk, S. and Nyborg, K. (2003). An economic model of moral motivation. Journal of Public Economics 87: 1967-1983.

Brennan, G. and Hamlin, A. (1998). Expressive voting and electoral equilibrium. Public Choice 95: 149-175.

Brennan, G. and Hamlin, A. (2000). Democratic devices and desires. Cambridge: Cambridge University Press.

Brennan, G. and Lomasky, L. (1993). Democracy and decision: The pure theory of electoral preference. Cambridge: Cambridge University Press.

Copeland, C. and Laband, D.N. (2002). Expressiveness and voting. Public Choice 110: 351-363.

Elster, J. (1999). Alchemies of the mind: Rationality and the emotions. Cambridge: Cambridge University Press.

Gilovich, T. (1991). Why we know what isn't so. New York: Free Press.

Godfrey, L. (1988). Misspecification tests in econometrics: The Lagrange multiplier principle and other approaches. Econometric Society Monographs No. 16. Cambridge: Cambridge University Press.

Gramlich, E.M. and Rubinfeld, D.L. (1982). Voting on spending. Journal of Policy Analysis and Management 1: 516-533.

Greenstone, M. (2002). A reexamination of resource allocation responses to the 65-MPH speed limit. Economic Inquiry 40: 271-278.

Hartog, J., Ferrer-i-Carbonell, A. and Jonker, N. (2002). Linking measured risk aversion to individual characteristics. Kyklos 55: 3-26.

Hemenway, D. and Solnick, S. (1993).
Fuzzy dice, dream cars, and indecent gestures: Correlates of driver behavior. Accident Analysis and Prevention 25: 161-170.

Jianakoplos, N.A. and Bernasek, A. (1998). Are women more risk averse? Economic Inquiry 36: 620-630.

Kahneman, D. and Knetsch, J.L. (1992). Valuing public goods: The purchase of moral satisfaction. Journal of Environmental Economics and Management 22: 57-70.

Keeler, T.E. (1994). Highway safety, economic behavior, and driving environment. American Economic Review 84: 684-693.

Keser, C. and van Winden, F. (2000). Conditional cooperation and voluntary contributions to public goods. Scandinavian Journal of Economics 102: 23-39.

Kinder, D.R. and Kiewiet, D.R. (1979). Economic discontent and political behavior: The role of personal grievances and collective economic judgments in congressional voting. American Political Science Review 23: 495-517.

Kramer, G.H. (1983). The ecological fallacy revisited: Aggregate- versus individual-level findings on economics and elections, and sociotropic voting. American Political Science Review 77: 92-111.

Kuran, T. (1995). Private truths, public lies: The social consequences of preference falsification. Cambridge, Mass.: Harvard University Press.

Lave, C. and Elias, P. (1997). Resource allocation in public policy: The effects of the 65-MPH speed limit. Economic Inquiry 35: 614-620.

Ledyard, J.O. (1995). Public goods: A survey of experimental research. In J.H. Kagel and A.E. Roth (Eds.), Handbook of experimental economics, 111-194. Princeton: Princeton University Press.

Merrell, D., Poitras, M. and Sutter, D. (1999). The effectiveness of vehicle safety inspections: An analysis using panel data. Southern Economic Journal 65: 571-583.

Mitchell, W.C. (1990). Ambiguity, contradictions, and frustrations at the ballot box: A public choice perspective. Policy Studies Review 9: 517-525.

Peltzman, S. (1975). The effects of automobile safety regulation. Journal of Political Economy 83: 677-725.

Peterson, S., Hoffer, G.
and Millner, E. (1995). Are drivers of air-bag-equipped cars more aggressive? A test of the offsetting behavior hypothesis. Journal of Law and Economics 38: 251-264.

Rienstra, S.A. and Rietveld, P. (1996). Speed behaviour of car drivers: A statistical analysis of acceptance of changes in speed policies in the Netherlands. Transportation Research Part D: Transport and Environment 1: 97-110.

SCB (2002). http://www.scb.se/statistik/me0101/me0101_tab511.xls

Sears, D., Lau, R., Tyler, T. and Allen, H. (1980). Self-interest vs. symbolic politics in policy attitudes and presidential voting. American Political Science Review 74: 670-684.

Sears, D.O. and Funk, C.L. (1990). Self-interest in Americans' political opinions. In J. Mansbridge (Ed.), Beyond self-interest, 147-170. Chicago: University of Chicago Press.

Shabman, L. and Stephenson, K. (1994). A critique of the self-interested voter model: The case of a local single issue referendum. Journal of Economic Issues 28: 1173-1186.

Shinar, D., Schechtman, E. and Compton, R. (2001). Self-reports of safe driving behaviors in relationship to sex, age, education and income in the US adult driving population. Accident Analysis and Prevention 33: 111-116.

Smith, J.H. (1975). A clear test of rational voting. Public Choice 23: 55-67.

Stata (2003). Reference N-R. College Station, Texas: Stata Press.

Taylor, S.E. and Brown, J.D. (1994). Positive illusions and well-being revisited: Separating fact from fiction. Psychological Bulletin 116: 21-27.

Tullock, G. (2000). Some further thoughts on voting. Public Choice 104: 181-182.

From checker at panix.com Tue Jan 3 22:35:49 2006 From: checker at panix.com (Premise Checker) Date: Tue, 3 Jan 2006 17:35:49 -0500 (EST) Subject: [Paleopsych] CHE: Tomorrow, I Love Ya! Message-ID:

Tomorrow, I Love Ya! The Chronicle of Higher Education, 5.12.9 http://chronicle.com/free/v52/i16/16a03001.htm

[Colloquy transcript appended.

[This is a great discussion of procrastination.
But, as always, events and processes that have multiple causes are nearly impossible to diagnose. The author is very much part of the therapeutic culture, while a great many on my list are right-wing hold-'em-responsible sorts. It's true that driving up the cost of any behavior, including procrastination, will result in less of it. But, at least in today's society, blaming the individual can well result in a downward spiral, as is certainly suggested by the author.

[Those who need disciplining the most are also those with the worst problems to begin with. So both genes and society conspire against them.]

Tomorrow, I Love Ya!
Researchers are learning more about chronic dawdlers but see no easy cure for procrastination

Related materials

Colloquy: Join a [55]live, online discussion with Joseph R. Ferrari, a professor of psychology at DePaul University who studies chronic procrastination, about what, if anything, can be done to help students who suffer from it, on Wednesday, December 7, at 2:30 p.m., U.S. Eastern time.

By ERIC HOOVER

Joseph R. Ferrari has a name for people who dillydally all the time. Sometimes, he spits out the term as if it were stale gum or a polysyllabic cuss word. When he dubs you a "chronic procrastinator," however, he does not mean to insult you. He is just using the psychological definition for someone who habitually puts things off until tomorrow, or next week, or whenever. The afflicted need not feel lonely: Research suggests that the planet is crawling with dawdlers. Procrastinators vex Mr. Ferrari, a psychology professor at DePaul University, yet he owes much to the dilatorily inclined. Without them he could not have helped blaze a trail of inquiry into procrastination (the word comes from the Latin verb procrastinare -- "to defer until morning"). The professor is as prompt as the sports car that shares his name, but he sees the symptoms of compulsive stalling everywhere. Mr.
Ferrari and other scholars from around the world are finding that procrastination is more complex -- and pervasive -- than armchair analysts might assume. And helping people climb out of their pits of postponement is not as simple as giving them a pill or a pep talk. The task is particularly challenging in the hothouses of procrastination known as college campuses. Free time, long-term deadlines, and extracurricular diversions conspire against academic efficiency. Students are infamous for not tackling their assignments until the jaws of deadlines are closing. Professors may call such students slackers or sloths; psychologists define them as "academic procrastinators." According to recent studies, about 70 percent of college students say they typically procrastinate on starting or finishing their assignments (an estimated 20 percent of American adults are chronic procrastinators). Choosing to do one task while temporarily putting another on hold is simply setting priorities, which allows people to cross things off their to-do lists one at a time. Procrastination is when one keeps reorganizing that list so that little or nothing on it gets done. For some students, that inertia has costs. Researchers say academic procrastination raises students' anxiety and sinks their self-esteem. The behavior also correlates with some of higher education's thorniest problems, including depression, cheating, and plagiarism among students. Dozens of colleges have created counseling sessions or workshops for procrastinators. Yet Mr. Ferrari and other researchers say many institutions treat the problem superficially instead of helping students analyze their own thought processes and behavioral patterns in order to change them. Give a hard-core procrastinator nothing more than time-management tips, they warn, and you might as well hand him a page of hieroglyphics. "Telling a chronic academic procrastinator to 'just do it' is not going to work," Mr. Ferrari says. 
"It's like telling a clinically depressed person to cheer up."

Learning About Loafers

Laggards have always been tough cases. Even God could not inspire St. Augustine of Hippo, the fourth-century philosopher and theologian, to act right away. As he slowly came to accept Christianity, Augustine wrote in Confessions, the future bishop wavered. Clinging to temporal pleasures, Augustine famously asked of God: "Give me chastity and continency -- but not yet." Late in his life, Leonardo da Vinci, the genius who missed deadlines, lamented his unfinished projects. Shakespeare's Hamlet pondered -- and pondered -- killing his uncle Claudius before sticking him in the final act. Grady Tripp, the English professor in Michael Chabon's novel Wonder Boys, couldn't finish his second book because he refused to stop writing it. In a world of unmade beds and unwritten essays, the postponement of chores is commonplace. Now and again, humans put aside tasks with long-term rewards to savor immediate pleasures, like ice cream and movies, through a process called "discounting." For chronic procrastinators, however, discounting is a way of life. The scientific study of procrastination was (appropriately enough) a late-blooming development relative to the examinations of other psychological problems. Only in the 1980s did researchers start unlocking the heads of inveterate loafers, who suffer from more than mere laziness. Mr. Ferrari, a co-editor of Procrastination and Task Avoidance: Theory, Research, and Treatment (Plenum Publications, 1995), has helped clarify the distinction between delaying as an act and as a lifestyle. Not every student who ignores assignments until the last minute is an across-the-board offender, known to psychologists as a "trait procrastinator." Many students who drag their feet on term papers might never delay other tasks, such as meeting friends for dinner, showing up for work, or going to the dentist. As Mr.
Ferrari explains in Counseling the Procrastinator in Academic Settings (American Psychological Association, 2004), a book he edited with three other scholars, there is no typical profile of an academic procrastinator (though family dynamics may influence the behavior). Studies have found no significant relationship between procrastination and intelligence or particular Myers-Briggs personality types. Research does show that academic procrastinators tend to lack self-confidence, measure low on psychologists' tests of "conscientiousness," get lost in wishful thoughts, and lie low during group assignments. In one study, Mr. Ferrari found that students at highly selective colleges reported higher rates of academic procrastination than students from less selective institutions. In another, the motives for academic procrastination among students at an elite college differed from students' motives at a nonselective one (the former put off assignments because they found them unpleasant, while the latter did so because they feared failure or social disapproval). Mr. Ferrari identifies two kinds of habitual lollygaggers. "Arousal procrastinators" believe they work best under pressure and tend to delay tasks for the thrill. "Avoidant procrastinators" are self-doubters who tend to postpone tasks because they worry about performing inadequately, or because they fear their success may raise others' expectations of them. Other findings complicate fear-of-failure theories. Some researchers say an inability to control impulses explains procrastinators best. And a recent study by Mr. Ferrari and Steven J. Scher, an associate professor of psychology at Eastern Illinois University, suggests that people who are typically negative avoid assignments that do not challenge them creatively or intellectually, whereas people who are typically positive more easily tackle less-stimulating tasks. Science is not likely to resolve the mysteries of procrastination anytime soon. 
After all, among researchers a debate still rages over the very definition of procrastination. Mr. Scher suspects there are different types of the behavior, especially if one defines it as not doing what one thinks one should do. "A common thing that many people put off is doing the dishes," Mr. Scher says. "But there are also times when those same people will all of a sudden find that doing the dishes is the most important thing they have to do -- thereby putting off some other type of task."

Homework-Eating Dogs

Psychologists do agree on one thing: Procrastination is responsible for most of the world's homework-eating dogs. Where procrastinators go, excuses follow. Students who engaged in academic procrastination said more than 70 percent of the excuses they gave instructors for not completing an assignment were fraudulent (the lies were most prevalent in large lecture classes taught by women who were "lenient"), Mr. Ferrari found in one study. In another, procrastinating students generally said they experienced a positive feeling when they fibbed; although they did feel bad when they recalled the lie, such remorse did not seem to prevent them from using phony excuses in the future. Mr. Ferrari has also experimented with giving bonus points for early work. In a study published in the journal Teaching of Psychology, he found that such incentives prompted 80 percent of students to fulfill a course requirement to participate in six psychological experiments by a midpoint date. On average, only 50 percent had done so before he offered the inducement. Mr. Ferrari believes that academe sends mixed messages about procrastination. Most professors talk about the importance of deadlines, but some are quick to bend them, particularly those who put a premium on being liked by their students. In one of Mr. Ferrari's studies, 90 percent of instructors said they did not require the substantiation of excuses for late work. "We're not teaching responsibility anymore," Mr.
Ferrari says. "I'm not saying we need to be stringent, strict, and inflexible, but we shouldn't be spineless. When we are overly flexible, it just teaches them that they can ignore the deadlines of life." Ambivalence about deadlines pervades American culture. People demand high-speed results, whether they are at work or in restaurants. Yet this is also a land in which department stores encourage holiday shoppers to postpone their shopping until Christmas Eve, when they receive huge discounts. And each year on April 15, television news reporters from coast to coast descend upon post offices to interview (and celebrate) people who wait until the final hours to mail their tax returns. "As a society," Mr. Ferrari says, "we tend to excuse the person who says 'I'm a procrastinator,' even though we don't like procrastinators." But do all people who ignore assignments until the 11th hour necessarily suffer or do themselves harm? One of Mr. Ferrari's former students, Mariya Zaturenskaya, a psychology major who graduated from DePaul last spring, says some last-minute workers are motivated, well organized, and happy to write a paper in one sitting. Although students who cram for tests tend to retain less knowledge than other students, research has yet to reveal a significant correlation between students' procrastination and grades. "Some students just need that deadline, that push," Ms. Zaturenskaya says. "Some people really are more efficient when they have less time."

Treating the Problem

Before Jill Gamble went to college, she had little time to waste. As a high-school student, she had earned a 3.75 grade-point average while playing three sports. Each night she went to practice, ate dinner, did her homework, and went to bed. After matriculating at Ohio State University, however, her life lost its structure. At first, all she had to do was go to classes. Most days she napped, spent hours using Instant Messenger, and stayed up late talking to her suite mates.
As unread books piled up on her desk, she told herself her professors were too demanding. The night before her Sociology 101 final, she stayed up drinking Mountain Dew, frantically reading the seven chapters she had ignored for weeks. "My procrastination had created a lot of anxiety," Ms. Gamble recalls. "I was angry with myself that I let it get to that point." She got a C-minus in the class and a 2.7 in her first quarter. When her grades improved only slightly in the second quarter, Ms. Gamble knew she needed help. So she enrolled in a course called "Strategies for College Success." The five-year-old course uses psychological strategies, such as the taking of reasonable risks, to jolt students out of their bad study habits. Twice a week students spend class time in a computer lab, where they get short lectures on study skills. Students must then practice each skill on the computer by using a special software program. Instructors use weekly quizzes to cut procrastination time from weeks to days and to limit last-minute cramming. The frequent tests mean one or two low scores will not doom a student's final grade, ideally reducing study-related stress. Students complete assignments at their own pace, allowing faster ones to stay engaged and slower ones to keep up, yet there are immovable dates by which students must finish each set of exercises. Enrollees learn how to write and follow to-do lists that reduce large tasks, such as writing an essay, into bite-size goals (like sitting down to outline a single chapter of a text instead of reading the whole book). Each student must also examine his or her use of rationalizations for procrastinating. The course's creator, Bruce W. Tuckman, a professor of education at Ohio State, says he also teaches students to recognize the underlying cause of procrastination, which he describes as self-handicapping. "It's like running a full race with a knapsack full of bricks on your back," Mr. Tuckman says. 
"When you don't win, you can say it's not that you're not a good runner, it's just that you had this sack of bricks on your back. When students realize that, it can be easier for them to change." Many of the worst procrastinators end up earning the highest grades in the class, Mr. Tuckman says. And among similar types of students with the same prior cumulative grade-point averages, those who took the class have consistently outperformed those who did not take it. After completing the course, Ms. Gamble says, she stopped procrastinating and went on to earn a 3.8 the next semester. Since then, she has made the dean's list regularly, and now helps counsel her procrastinating peers at Ohio State's learning center. "In workshops, we'll say, 'How many of you identify yourselves as procrastinators?' and they will throw their hands in the air and giggle, even though procrastination is a very negative thing," Ms. Gamble says. "Why do we do this so willingly? The answer is that we let ourselves procrastinate. If someone was doing it to us, we wouldn't be so willing to raise our hands."

A Universal Problem

Psychologists generally agree that the behavior is learned and that students choose to procrastinate, even though they may feel helpless to stop. Mr. Ferrari, the DePaul professor, describes the behavior as a self-constructed mental trap that people can escape the same way smokers can kick the habit. Mr. Tuckman qualifies his optimism by saying one cannot hope to cure procrastination so much as reduce it. "It's very hard to go from being a hard-core procrastinator to a nonprocrastinator," says Mr. Tuckman, one of many researchers who have developed a scale that measures levels of procrastination. "You're just so used to doing it, there's something about it that reinforces it for you." Scholars are learning that procrastination knows no borders. At a conference of international procrastination researchers this summer at Roehampton University, in England, Mr.
Ferrari and several other scholars presented a paper that compared the prevalence rates of chronic procrastination among adults in Australia, England, Peru, Spain, the United States, and Venezuela. They found that arousal and avoidant procrastinators were equally prevalent in all of the nations, with men and women reporting similar rates of each behavior. That is not to say all cultures share the same view of procrastination. Karem Diaz, a professor of psychology at the Pontifical Catholic University of Peru, has studied the behavior among Peruvians, whose expectations of timeliness tend to differ from those of Americans. "In Peru we talk about the 'Peruvian time,'" Ms. Diaz writes in an e-mail message. "If we are invited to a party at 7 p.m., it is rude to show on time. ... It is even socially punished. Therefore, not presenting a paper on time is expected and forgiven." Few Peruvians are familiar with the Spanish word "procrastinación," which complicates discussions of the subject. "Some people think it is some sexual behavior when they hear the word," Ms. Diaz says. Yet the professor has been intrigued to find that some Peruvians identify themselves as procrastinators, and experience the negative consequences of the behavior even though social norms encourage it. Strategies for helping people bridge the gap between their actions and intentions vary. A handful of colleges in Belgium, Canada, and the Netherlands have just begun to develop counseling programs that draw on cognitive and behavioral research. The early findings: Helping students understand why they dawdle and teaching them self-efficacy tends to lessen their procrastination -- or at least make it more manageable. Timothy A. Pychyl, an associate professor of psychology at Carleton University, in Ottawa, Ontario, says group meetings are a promising approach, particularly those in which students make highly specific goals and help each other resist temptations to slack off.
"For many people, it's an issue of priming the pump ... as simple as making a deal with oneself to spend 10 minutes on a task," Mr. Pychyl says. "At least the next day they can see themselves as having made an effort as opposed to doing nothing at all." Clarry H. Lay, a retired psychology professor at York University, in Toronto, who continues to counsel student procrastinators, uses personality feedback to promote better "self-regulation" among students. In group sessions, he discusses the importance of studying even when one is not in the "right mood" and of setting aside a regular place to work. Some participants become more confident and efficient. Others see improvements, only to experience relapses. Each semester one in five students misses the first session. Some sign up early but never show, while others arrive late or attend sporadically. But Mr. Lay understands. The counselor is a chronic procrastinator himself.
______________________________________________________________

References

55. http://chronicle.com/colloquy/2005/12/procrastination/ [Immediately below.]

The Chronicle of Higher Education: Colloquy Transcript
http://chronicle.com/colloquy/2005/12/procrastination/

There's Always Tomorrow
Wednesday, December 7, at 2:30 p.m., U.S. Eastern time

The topic

Conventional wisdom holds that procrastinators are just plain lazy. But psychologists who study chronic dawdling say the behavior is much more complex than that. Researchers have found that college campuses are hothouses of procrastination, with an estimated 70 percent of students saying they typically postpone starting or finishing their assignments. Some of those students feel incapable of changing their behavior, which can sink not only their grades but also their self-esteem.
Many colleges offer time-management workshops to help students overcome procrastination, yet some experts say treating chronic procrastinators requires intensive counseling that gets at the root causes of habitual dillydallying. Why do some students waste their time when they should be working? Should American universities offer cognitive and behavioral therapy for the problem, as many European ones do? Is there hope for a cure? If not, what is to be done?

The guest

Joseph R. Ferrari is a professor of psychology at DePaul University and a leading researcher of chronic procrastination.
______________________________________________________________

A transcript of the chat follows.
______________________________________________________________

Eric Hoover (Moderator): Welcome to The Chronicle's live chat with Joseph R. Ferrari, a professor of psychology at DePaul University and a leading researcher of chronic procrastination. Thanks for joining us today. We will now take questions.
______________________________________________________________

Question from Mark Grechanik, University of Texas at Austin: Do you think frequent quizzes may help students to engage in the learning process faster?

Joseph R. Ferrari: Good point; it may. But the issue here is that folks wait to study, not that they don't study. Still, you are right. Generating a system that reduces procrastination is the solution.
______________________________________________________________

Question from Anon, small NY college: Intensive counseling doesn't sound like it would fit into a college student's schedule. How can we lessen procrastination if we can't provide intensive counseling? If time management tips/workshops don't work, what does?

Joseph R. Ferrari: For the CHRONIC PROCRASTINATOR, therapy. Even small NY towns (and I lived and worked at several) have professional clinical and counseling psychologists in the area. They need to get a student rate and seek professional help.
Also, the college counseling center could have a staff person trained to hold sessions. Good luck! ______________________________________________________________ Question from Maryann P., county college in NJ: How can I solve my problem of often being late for appointments, term papers, kids' appointments, car-pool? I am getting worse at it lately. Joseph R. Ferrari: Do you schedule back to back? Give yourself 20 minutes between tasks so if one takes longer, you are not overloaded. Remember, to prioritize is not the same as procrastinating. ______________________________________________________________ Question from Laura Wennekes, University of Amsterdam, The Netherlands: Isn't "procrastination" a natural response to the artificially imposed notion of a "deadline"? Joseph R. Ferrari: "Natural." Wow, no. It is learned. There is NO gene for procrastination. I hear a little rebellion here - like 'imposed deadline.' Look, life is full of commitments. We have responsibilities to meet those deadlines. ______________________________________________________________ Question from Evan, University of Delaware: Are you aware of the book "The Now Habit" (Neil Fiore) and the related "lifehacker" movement popular among IT professionals for overcoming procrastination? What do you think of them? Joseph R. Ferrari: Yes, there are many 'self-help' books out there. Most don't use good research to support them. Read the scholarly stuff for good science. ______________________________________________________________ Question from Nora, big state university: In my experience, procrastination is directly related to anxiety around writing. I sit down to write, and have bodily "symptoms," and have found psychoanalysis to be helpful. Have you considered the psychoanalytic treatment of writing blocks and procrastination? Joseph R. Ferrari: If analysis works for you, great. 
I recommend cognitive-behavioral therapy because it changes the way a person THINKS and BEHAVES instead of thinking about why one's mother acted a specific way. ______________________________________________________________ Question from Kris, MIT: Could you please explain how research into procrastination differentiates between depression-related procrastination and procrastination in someone who would not be classified as depressed? Is such a distinction even possible? Thank you from a chronic procrastinator, second generation. Joseph R. Ferrari: Yes, this is a learned thing, and for you, you had a model. Yes, there is a relation between procrastination and depression, but it is correlational. Does procrastination lead to depression? Or does depression lead to procrastination? No causal experiments have been done. ______________________________________________________________ Question from Kathleen, U. of Rochester: What does evidence suggest regarding a genetic contribution to chronic procrastination? Joseph R. Ferrari: None! It's too easy to blame things on one's genetics. If that is the case, then one can't change, and that is foolish. ______________________________________________________________ Question from Mark, NGO Abroad: The people I know who do not procrastinate are ones who get a great sense of satisfaction from finishing things and checking them off the list. They tend to enjoy throwing things away rather than keeping them around in case they need them. Basically they have greater throughput. I don't really enjoy finishing things; I worry that they are not perfect. My question is what makes them get such satisfaction from completion? Joseph R. Ferrari: You are right about non-procrastinators (myself included). And you are right about the link to perfectionism. Procrastinators try to be perfect to have others like them. Nonprocrastinators try to be perfect to do a good job. So, stop focusing on what others will think of you as reflecting your self-worth. 
You are a good person even if the project is a B or B+. ______________________________________________________________ Question from Erica, NYU: Recently, I've seen a bunch of web pages advertising "coaches" who help a person get over their procrastinating habits. Does your research suggest that coaching is effective? Joseph R. Ferrari: In my 2004 book, Counseling the Procrastinator in Academic Settings, there is a chapter on digital coaching. Good luck. ______________________________________________________________ Question from Evan, University of Delaware: Sometimes I am much more productive on projects that have no deadline than on those that do. Do you think the deadline itself is the culprit as much as the task at hand? How are they related? Joseph R. Ferrari: Sounds like a little rebellion against having an external deadline here. Ask yourself why you work against it instead of with it. ______________________________________________________________ Question from Marla, U. of Texas: Do you think the internet has worsened the problem of procrastination, or is it just a different form of an ongoing problem? Joseph R. Ferrari: Worsened. Now we email at the last minute instead of placing a letter in the post 3 days before. ______________________________________________________________ Question from Erin McLaughlin, University of Pennsylvania: What are the indicators of a chronic procrastinator versus a student just uninterested in a project or class? Joseph R. Ferrari: Hmm, I think they would look the same and act the same. But the procrastinator would get anxious about not working on the target project. 
______________________________________________________________ Question from Carmen, Northeast Iowa Community College: The Chronicle article about procrastination makes no mention of Adult ADHD, a likely cause for at least 5% of procrastinators, and possibly many, many more, as there would be a natural selection bias toward procrastination in the ADHD population. What are your thoughts about this? Joseph R. Ferrari: Nice. I have a paper in press in "Clinical & Counseling Psychology" where we examined procrastinators with adult ADHD. There was a link. Look for the article. Cheers. ______________________________________________________________ Question from Tammy, mid-size East Coast univ.: How can I know if a therapist is good at working with procrastination? Joseph R. Ferrari: Good point. Look for a PhD psychologist with a cognitive-behavioral style. If they try time management on you, walk away. ______________________________________________________________ Question from Michelle, Washington University: What is your viewpoint on freshman transition programs? Do you think they could be useful in heading off patterns of underachievement due to procrastination? And did you find anything that indicated how to prepare for independent study? Joseph R. Ferrari: Good point. No data here, but anything that tries to get students to examine what and why they do or do NOT work is good. Just don't hope that the 20% who are chronic procrastinators will be 'cured' by a week-long section of a freshman course. ______________________________________________________________ Question from Laura, large eastern university: How do you get a procrastinator to actually go to therapy, however? Especially if they already feel they don't have enough time for everyday commitments. Joseph R. Ferrari: Can't make anyone do anything they don't want to. As my old Italian grandmother said (loses something in translation, but...) 
"for some folks, they will not get off the beach until the water hits their behind." ______________________________________________________________ Question from Donald, small Rhode Island College: Is procrastination a result of executive processing disorders? Joseph R. Ferrari: No data. Unlikely for most people. Remember, there is a difference between correlation and causality. ______________________________________________________________ Question from John Gault, Missouri Valley College: Are you saying there is nothing that can help these students except professional counselors? Is there nothing the professor or the school can do? Joseph R. Ferrari: Professors can design their classes to give extra credit for doing assignments EARLY, instead of punishing for being late. I'm not saying there is nothing. Remember, 80% of us procrastinate, but 20% are procrastinators. Programs can work for most folks, but for the 20% who are real procrastinators, for whom this is their lifestyle, they need therapy. ______________________________________________________________ Question from LLC, The University of Akron: I work in a career center, and see another side of chronic student procrastination related to making life decisions, applying for jobs, preparing for life after college. It seems like it takes a "crisis point" to motivate the truly chronic procrastinators ... are there some other tools, tips, techniques, resources you'd recommend to help us "kick-start" those that need the assistance? Joseph R. Ferrari: Right, some folks need a crisis to kick them into moving. ______________________________________________________________ Question from Pat, Shorter College (small private): My husband just gave a bunch of low grades to students who failed, all semester, to turn in assignments. Do you think there's a group behavior/dynamics factor in procrastination? Joseph R. Ferrari: Probably not. They may have had some planned group strategy to delay for other reasons. 
But we do know that in group assignments where performance is rated for everyone, procrastinators will engage in loafing--and the non-procrastinators in the group don't like them. ______________________________________________________________ Question from Constant Beugre, Delaware State University: Can having a 'to-do list' and sticking to it help in reducing procrastination? I have tried this technique with some procrastinators in the past, and it seems to work for some but not for others. Joseph R. Ferrari: You are again making my point - for the 80% of us who procrastinate on some things, a to-do-list system and other things will work. But for the 20% who are procrastinators, who do this as a lifestyle, they will reshuffle the list and come up with excuses why they can't do something. ______________________________________________________________ Question from LLC, The University of Akron: Although genetics may not make us procrastinators, our personalities may make us more prone to procrastination. MBTI perceiving types, for example, like open-endedness, see many options, are not as comfortable choosing only one option, and are more spontaneous ... are certain personality types more prone to procrastination? Joseph R. Ferrari: MBTI has very poor reliability and validity, if one reads the scholarly literature. We found it did not relate to procrastination tendencies. So drop that party game! ______________________________________________________________ Question from D.S., large research university: Most people are surprised to hear I procrastinate because I am amazingly good at coming through in the clutch, and when I work with focus, I get a lot done, and so the pattern continues. Do I need to have a big failure to motivate change in myself? Joseph R. Ferrari: You sound like a chronic AROUSAL procrastinator, who enjoys working against a clock for a rush experience. Fail enough with no one bailing you out, and you may want to change. 
______________________________________________________________ Question from Alaine Allen, University of Pittsburgh: I work with pre-college students who seem to procrastinate out of fear (ex. anxiety about writing a college application essay). What type of advice would you give to those students beyond the common "just do it" statement? Joseph R. Ferrari: Break it down and do just smaller sections; focus on the goal of getting it done, not on what has to be done; look at each TREE and not the FOREST. ______________________________________________________________ Question from Laura, big eastern Univ.: Sometimes I even procrastinate on enjoyable things. Do you think it may be some sort of rebellion against "scheduling"? I have often wondered, because sometimes I will have a better chance to get something done if it is "impulsive." Where would you start to fix something like that? Joseph R. Ferrari: Could be. I have a book chapter on procrastination and impulsivity. They are not as opposite as you think. ______________________________________________________________ Question from Janet, large Long Island college: Are stimulant medications such as Metadate or Ritalin effective in reducing procrastination? Joseph R. Ferrari: No. Keep away from the meds. Instead, focus on learning new skills for life. ______________________________________________________________ Question from Alec, University of Cambridge, UK: Hello, could you please mention a few proven, concrete exercises/methods to combat one's procrastination tendencies, especially concerning very long-term, multifaceted goals such as writing up a PhD thesis? Thank you. Joseph R. Ferrari: We have several good tested, research-supported techniques in the 2004 book, Counseling the Procrastinator in Academic Settings, as well as in the founding book from 1995. ______________________________________________________________ Question from Ed, U. of Kentucky: I see my earlier question was answered. 
What about helping children out of this who have already caught procrastination from a parent? Joseph R. Ferrari: Well, it was not 'caught' as much as modeled and learned. So they need to learn alternative ways to handle the situation and how they perceive the situation. Can the parents model 'getting it done'? ______________________________________________________________ Question from Ed, U. of Kentucky: What questions should you ask when looking for a cognitive therapist? How long should it take to change the habit? Joseph R. Ferrari: Cognitive-behavioral therapy is more short-term than psychoanalysis. Listen to how they would work with you. Do they focus on your thinking pattern and your behaviors? Do they offer you skills and new ways to treat your behaviors and thoughts? ______________________________________________________________ Question from Mona Pelkey, United States Military Academy: My star procrastinator just left my office. He is unhappy because he just can't seem to get motivated enough to put in the effort to get the grades he wants. For the past hour and a half I literally sat over him, pushing him to make a list, prioritize it, and start the tasks. I am exhausted, and so is my student. Help! Joseph R. Ferrari: He is still trying to be PERFECT. Life is not perfect, and neither should he be. Clinical folks talk about an 80% rule, where the client is 'cured' if they reach 80% of their life goal. So get this student to be happy with 97%, then 95%, then 90%, etc. Procrastinators would rather have others say that they lacked EFFORT than lacked ABILITY. By never finishing, they can protect their self-views and say they have the ability but never tried hard enough. He needs to stop thinking his self-worth is tied only to getting an A. ______________________________________________________________ Question from Bonnie, Huston-Tillotson: Regarding that 20% - what is (are) the pay-off(s) for procrastination? Joseph R. 
Ferrari: Protecting one's self-esteem and social esteem (how others feel about your ability). Never finish, never get judged by others. Let others decide and act for you. ______________________________________________________________ Question from John Gault, Missouri Valley College: What criteria can be used to determine if a student is one of the 20% that needs counseling or just a normal procrastinator like the rest of us? Joseph R. Ferrari: In the 1995 text we have several reliable and valid self-report measures that assess one's procrastination tendencies. Buy the book and take the measures! ______________________________________________________________ Question from Nina, Duke University: I have bipolar disorder and am having a hard time determining if I'm procrastinating and using BP as an excuse, or am really having trouble getting things done because of rapid cycling. Any thoughts? Joseph R. Ferrari: I can't play therapist here, but remember procrastinators are great excuse makers, blaming it on others, parents, genes, other disorders. ______________________________________________________________ Question from Eric Hoover: Looking ahead, what are some new avenues you would like to explore in your research on procrastination? What are some questions you would like to see addressed in future studies? Joseph R. Ferrari: I want to continue to look at the cross-cultural meanings of procrastination. And funny you should ask: I have been asked by a reader at a publishing house to write a pop book based on scholarly research outcomes. So, I think I will take them up on that offer... ______________________________________________________________ Eric Hoover (Moderator): That wraps up our chat. Thanks to everyone who sent questions today. And Prof. Ferrari, thank you for your responses. 
From checker at panix.com Tue Jan 3 22:35:59 2006 From: checker at panix.com (Premise Checker) Date: Tue, 3 Jan 2006 17:35:59 -0500 (EST) Subject: [Paleopsych] Discover: A third of medical research wrong? Message-ID: A third of medical research wrong? http://www.discover.com/web-exclusives/medical-research-wrong/ November 16, 2005 | Biology & Medicine The latest medical research is wrong about one-third of the time, that is... according to the latest medical research. A survey of 49 highly cited medical studies by epidemiologist John Ioannidis found the results of 14 studies were contradicted or downplayed by later research. Ioannidis' survey raises some hard questions. Is there a fundamental flaw in medical research, or is this just part of scientific progress? Problems occurred most often in studies that did not use randomized samples - five out of six were contradicted, a group that includes two high-profile cases of preventive therapies for coronary-artery disease. One study recommended hormone replacement therapy for post-menopausal women -- a treatment that some doctors now believe may increase chances of developing the disease. The other therapy used high doses of vitamin E to keep the coronary arteries healthy, a treatment that was later shown to be ineffective in randomized trials. In spite of the conflicting research, nutritional epidemiologist Eric Rimm is standing by his work. Rimm's study showed vitamin E reduced the risk of developing coronary-artery disease in healthy men ages 40 to 75. "I think what we originally reported hasn't really been re-tested," he said. The follow-up study cited by Ioannidis tested whether vitamin E prevented heart attacks and strokes in men and women over the age of 55 who already had cardiovascular disease or diabetes. According to Rimm, whether antioxidants like vitamin E provide health benefits is still a lively debate. 
"I thought our findings would be more generalizable," Rimm said, "but I think our results stand up; it just doesn't protect people with existing heart disease." Overgeneralizing research results is one way that Ioannidis sees medical studies being misused by doctors. "There are many issues that are not finalized with a single study," he said, "issues like trade-offs between benefits and harms, side-effects, and generalizability." If Ioannidis's work can be said to have a moral, it is this: don't put too much faith in one study. Solving the problem is not as simple as sticking to randomized experiments or requiring results to be duplicated. Observational studies, like Rimm's vitamin E research, are not randomized, but they can provide a foundation for future research. Likewise, duplicating research results can be unethical, and that may be the case for the 11 studies in his survey that have not been followed up. One case is a clinical trial of the drug Zidovudine, a medication that was 75 percent effective in preventing HIV-positive mothers from transmitting the disease to their unborn children. Re-testing Zidovudine would require exposing some unborn children to an increased risk of HIV infection. So, how should patients deal with the confusion? "We should switch our mode of thinking about a statistically significant result to what I would call a credible result," said Ioannidis. He proposes a system of rating published research based on the rigor of its experimental design, sample size, and amount of supporting research. "There is nothing wrong about acknowledging that all of the research published in medical journals is not one-hundred percent credible," he said. "There is no perfect research." Ioannidis advises patients to protect themselves by taking a more critical approach to their doctor's advice. "Ask not just 'is it good for me?' 
but 'what is the uncertainty?'" - Zach Zorich From checker at panix.com Tue Jan 3 22:36:07 2006 From: checker at panix.com (Premise Checker) Date: Tue, 3 Jan 2006 17:36:07 -0500 (EST) Subject: [Paleopsych] Gerontology Research Group: Longevity for Dummies: How to Live Longer Than You Deserve Message-ID: Longevity for Dummies: How to Live Longer Than You Deserve From: "L. Stephen Coles, M.D., Ph.D." Date: Mon, 19 Dec 2005 22:46:21 -0800 To Members and Friends of the Los Angeles Gerontology Research Group: Given that your parents have already decided your genomic fate, here's the humorous "Coles List" of 12 life-style rules to maximize your remaining longevity... -- Steve Coles 1. Never smoke cigarettes, a pipe, or cigars (even under special circumstances, like when a proud new father hands out cigars for free). Do your best to avoid 2nd-hand smoke. I have held the black-stained lungs of smokers in my hands during surgery. They look really ugly. (Healthy lungs are pink.) Also, it has not escaped our notice that many serious fires, in which innocent people are burned to death in a blazing building, are started by tobacco smoking and matches when the smoker falls asleep (often because they're dead drunk). 2. Eat a diet high in roughage: fruits, vegetables, and whole grains, and low in saturated fats. Avoid trans-fats. Eat fish (salmon, tuna, sardines) [4 - 5] x per week. Eat unsalted nuts. Consciously decrease salt intake to as close to zero as possible. Beverages: Make sure you have an adequate fluid intake whenever sweating profusely (with electrolytes added as needed). Drink one cup of coffee in the morning. Drink one cup of green tea at some point during the day. Avoid carbonated sodas. Drink milk with meals. Note: Boba Tea is not a drink, but a meal with lots of calories. EtOH: Drink 1 full glass of red (or white) wine of your choice (with a meal) every day, unless you've already had your quota of something stronger (gin, vodka, whiskey, rum, etc.). 
Never drink more than two shots of alcohol at any one sitting. A DUI looks terrible on your driving record, but late-night accidents on the freeway may shorten one's life as much as an intentional suicide. 3. Supplements: If I could take only four supplements a day, here's what they'd be... (a) Standard Multi, with no iron; (b) Fish Oil caps x 2; (c) Co-Q10; (d) Magnesium. (See our Bridge Plan on this webpage for recommended doses and additional supplements that are good to have, but not really essential.) Take an Aspirin every day if over 60. Take Melatonin [1-3] mg each night before bedtime if over 40. 4. Maintain a healthy weight. Check your BMI. Diet as necessary, until you maintain a stable weight for several years. (Prime Minister Sharon of Israel was looking for a stroke sooner or later at 350 pounds, no matter what else he did.) Remember that our bodies were tuned for our ancestors, who had to chase after their food rather than merely walk a few steps to the refrigerator during a TV-commercial break. 5. Exercise vigorously 1/2-hour per day: If you don't sweat, you're not cutting it. Lift small weights (10#) with [40-50] reps per type of lift. When shopping, park far away from the store in a parking lot (on purpose) and slowly jog to the store. If it's only one floor up, use the stairs, not the elevator. Never run a marathon; it's too tough on the joints. Never play in competitive team sports. Professional athletes don't live longer than anyone else, and accumulated micro-trauma does build up. Inactivity (being a couch potato eating pizza and potato chips while watching television for hours on end) leads to insulin insensitivity or Type-2 Diabetes and Cancer, turns your muscles to Jell-O (frailty and sarcopenia) and your brain to mush (short-term memory loss). 
Heart disease increases; the rate of strokes (undetectable micro-TIAs) increases; bones undergo osteoporosis and more easily fracture if you fall accidentally; depression increases; upper respiratory tract infections increase; urinary tract infections increase; sexual functions rust out while libido sags -- a tragic double hit. So, probably, the single most important thing you can do to maximize longevity is to get the right level of physical activity in your life. 6. Keep your immune system in tip-top shape. It is a precious invisible asset, since it protects you from the ceaseless assault of pathogenic microbes: viruses, bacteria, fungi, yeast, amoebas, helminths, and assorted parasites carried by all types of vectors, including arthropods (mosquitoes, spiders, ticks) and animal hosts, or by poorly-cooked meat, spoiled food, or water fouled by sewage. (If you don't believe me, take a course called Medical Microbiology 101 just for fun, to learn about the extraordinary range of invisible creatures that silently crawl over your skin without your knowledge or permission. When revealed by an electron microscope, they're more varied than any Hollywood horror movie you've ever seen in your lifetime.) 7. Decrease Stress (e.g., elevated Cortisol in your blood for long periods due to continuous arguments with your spouse/significant-other, grief over the death of a loved one, loss of a job that you really liked, long commutes every day in heavy traffic; you know what I mean). There are lots of unconscious stress conditions that should be identified early by the proper professional: marriage counselor, divorce lawyer, psychiatrist, as needed. But you must act to take advantage of them and not stay in an abusive relationship for very long, or your body will suffer the corrosive effects of chronically-elevated cortisol (which is trying to get you to "fight or flee" in the short term, but does you no good over the long term). Try to stay out of debt. 
Never gamble more than you can afford to lose in a single day. Never ever gamble at home on the Internet using a credit card. Stay away from addictive drugs at all costs. Pain meds are appropriate for people who are really handicapped, at the end of life, or with a chronic condition. Habitual crystal meth at rave parties is really going to burn you out fast, even if only indulged in on weekends. Never keep a loaded gun in your house. Never drive a car too fast under any circumstances, unless you're in a chase scene in a movie where all contingencies have been premeditated. Never pay attention to the nut-case who honks his horn in back of you in heavy traffic. Road rage kills even innocent bystanders. Advice for physicians: If you're ever paged on an airplane's PA system, "Is there a doctor on board?" don't raise your hand or press your call button. There's very little that you can do anyway. Never take a vacation at a place where you'll need to take another vacation as soon as you get back. Stay away from exotic travel locales or political hot-spots like Kabul or Baghdad. Behave yourself at Christmas parties. Spiritual: Go to the church, synagogue, or mosque of your choice at least three or four times a year, so the elders know what you look like. It may come in handy some day. Intellectual stimulation: Keep your mind active. Solve puzzles; play chess, checkers, cards, computer games, whatever, as a way to keep nimble. Play an instrument; listen to good music. Go to museums, go to movies, read a good newspaper every day, watch the History Channel from time to time. Get a job that you like. Work for a charitable organization in your spare time. Teach children or become a mentor. Adopt a pet (cats, dogs, tropical fish, whatever). Raise children. Take time to smell the roses. 8. Get 7+ hrs. sleep qhs. Try to get up on Saturdays, Sundays, or holidays at the same time as normal for a weekday. 9. 
Germs and Oral Hygiene: Wash hands several times per day; shower twice a day; use mouthwash four or five times throughout the day; brush teeth after every meal (when at home); use dental floss once a day; get a dental hygienist to clean your teeth professionally twice a year. 10. Engage in sex as often as possible, but always with a willing partner, and obviously one who is STD(-), especially HIV(-). If you're married, avoid extramarital relationships (otherwise known as "adultery"), if possible. It's the cover-ups that get you into more trouble later. Standard adult pornography works for most people, but child pornography, of any sort, is absolutely forbidden. Masturbate when alone for a long period of time to prevent rust accumulation. 11. If > 50 yo, get regular screening by a medical lab every year. (Remember that the most common warning sign of a heart attack is death [secondary to an MI].) If your BP is too high (>140 and/or >90), you need to add some meds to your daily regimen (e.g., a beta blocker, an ACE Inhibitor, and/or a diuretic, under a doctor's supervision) to bring it down. Hypertension is a silent condition, and you won't know you have it if you don't have it measured. If your Cholesterol is too high (>225), you ought to add a statin Rx. Start with a generic first. Shop around until you find one that suits you, as you will be on it "for life." Check your liver enzymes once a year. If your CBC counts are off, that will need to be fixed as well. 12. Men (and post-menopausal women) should donate blood regularly. It's not just for the sake of the faceless people you may help along the way. It's for your own good, too. Our blood-clotting machinery was tuned for a time when our hunter/gatherer ancestors got many more cuts and scrapes than we do in a modern civilization. Quick clotting then is not compatible with maximizing your longevity today, as we get internal clotting problems instead. 
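For readers who like their rules of thumb explicit, the numeric cut-offs scattered through the list can be gathered into one small sketch. The thresholds (BP >140 and/or >90, total cholesterol >225) are the ones item 11 gives; the BMI formula (weight in kilograms divided by height in meters squared, for the "check your BMI" advice in item 4) is standard, but the function names are illustrative additions, not part of the original list.

```python
# A minimal sketch of the screening rules of thumb in items 4 and 11.
# Function names are illustrative; thresholds are the ones the list gives.

def bmi(weight_kg: float, height_m: float) -> float:
    """Body Mass Index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def bp_too_high(systolic: float, diastolic: float) -> bool:
    """Item 11's rule: BP is too high if systolic > 140 and/or diastolic > 90."""
    return systolic > 140 or diastolic > 90

def cholesterol_too_high(total_mg_dl: float) -> bool:
    """Item 11's rule: consider a statin if total cholesterol > 225."""
    return total_mg_dl > 225

print(round(bmi(70, 1.75), 1))        # 70 kg at 1.75 m -> 22.9
print(bp_too_high(150, 85))           # True: systolic exceeds 140
print(cholesterol_too_high(210))      # False: under the 225 cut-off
```

As the list itself stresses, hypertension is silent, so the point of the sketch is only that these checks require measured numbers, not how you feel.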
I put this quick list of 12 life-style interventions together in an hour or so. If you knowingly fail to abide by any of the above rules and we find out, as a punishment, we'll send Martha Stewart to redecorate your house when you're not home. Happy holidays, Steve Coles L. Stephen Coles, M.D., Ph.D., Co-Founder Los Angeles Gerontology Research Group URL: http://www.grg.org URL: http://www.bol.ucla.edu/~scoles E-mail: scoles at ucla.edu From checker at panix.com Tue Jan 3 22:36:16 2006 From: checker at panix.com (Premise Checker) Date: Tue, 3 Jan 2006 17:36:16 -0500 (EST) Subject: [Paleopsych] NYT Op-Ed: Global Trend: More Science, More Fraud Message-ID: Global Trend: More Science, More Fraud http://www.nytimes.com/2005/12/20/science/20rese.html By LAWRENCE K. ALTMAN and WILLIAM J. BROAD The South Korean scandal that shook the world of science last week is just one sign of a global explosion in research that is outstripping the mechanisms meant to guard against error and fraud. Experts say the problem is only getting worse, as research projects, and the journals that publish the findings, soar. Science is often said to bar dishonesty and bad research with a triple safety net. The first is peer review, in which experts advise governments about what research to finance. The second is the referee system, which has journals ask reviewers to judge if manuscripts merit publication. The last is replication, whereby independent scientists see if the work holds up. But a series of scientific scandals in the 1970's and 1980's challenged the scientific community's faith in these mechanisms to root out malfeasance. In response the United States has over the last two decades added extra protections, including new laws and government investigative bodies. And as research around the globe has increased, most without the benefit of such safeguards, so have the cases of scientific misconduct. 
Most recently, suspicions have swirled around a dazzling series of cloning advances by a South Korean scientist, Dr. Hwang Woo Suk. Dr. Hwang's research made him a national hero. His team outdid rivals by claiming to have extracted stem cells from cloned human embryos and to have cloned a dog, an extraordinary feat. Some observers hailed the breakthroughs as worthy of a Nobel Prize. Last month, critics charged that Dr. Hwang's published findings hid ethical lapses. And last week, collaborators accused the researcher of fabricating results in one of his landmark human cloning studies, published in Science last spring. Dr. Hwang has insisted on his innocence but said he would retract the Science paper. Now questions are growing about his earlier work, including Snuppy, the dog he claims to have cloned. Yesterday, news agencies reported that Seoul National University officials investigating Dr. Hwang's claims locked down his laboratory, impounded his computer and interviewed his colleagues, among other actions. "The Korean case shows us that we should be a lot more cautious," Marcel C. LaFollette, the author of "Stealing Into Print: Fraud, Plagiarism, and Misconduct in Scientific Publishing," said in an interview. "We have been unwilling to ask tough questions of people who are from other countries and whose systems are different because we were attempting to be polite." To be sure, most scientists resist pressures to cut corners and adhere to the canons of science, honoring the truth above all else. But surveys suggest that there are powerful undercurrents of misbehavior and, in some cases, outright fakery. In June, a survey of 3,427 scientists by the University of Minnesota and the HealthPartners Research Foundation reported that up to a third of the respondents had engaged in ethically questionable practices, from ignoring contradictory facts to falsifying data. 
Scientific fraud as a public danger burst into public view in the 1970's and 1980's, when major cases of misconduct shook a number of elite publications and institutions, including Yale, Harvard and Columbia. In 1981, Dr. Donald Fredrickson, then the director of the National Institutes of Health, defended the standard view of science as a self-correcting enterprise. "We deliberately have a very small police force because we know that poor currency will automatically be discovered and cast out," he said. But fraud after fraud made the weaknesses of that system impossible to ignore. In the early 1980's, a young cardiology researcher, Dr. John R. Darsee, was found to have fabricated much data for more than 100 papers he wrote while working at Harvard and Emory Universities. His work appeared in The New England Journal of Medicine, The Proceedings of the National Academy of Sciences and The American Journal of Cardiology, among other top publications. Startled, the federal government, beginning in 1985, took steps to augment the existing safeguards. For instance, Congress passed a law requiring public and private institutions to establish formal ways to investigate charges of fraud, in theory helping to assess damage, clear the air and protect the innocent. Eventually, the federal government established its own investigative body, now known by the Orwellian title of the Office of Research Integrity. Journal editors, at the center of the storm, also took collective action to enhance their credibility. In 1997, they founded the Committee on Publication Ethics, or COPE, "to provide a sounding board for editors who are struggling with how to best deal with possible breaches in research and publication ethics," according to the group's Web site. Consisting mostly of editors of medical journals, the committee now has more than 300 members in Europe, Asia and the United States. Still, the frauds kept coming. 
In 1999, federal investigators found that a scientist at the Lawrence Berkeley Laboratory in Berkeley, Calif., faked what had been hailed as crucial evidence linking power lines to cancer. He published his research in The Annals of the New York Academy of Sciences and F.E.B.S. Letters, a journal of the Federation of European Biochemical Societies. The year 2002 proved especially bleak. At Bell Labs, a series of extraordinary claims that seemed destined to win a Nobel Prize, including the creation of molecular-scale transistors, suddenly collapsed. Two of the world's most prestigious journals, Science and Nature, had published many of the fraudulent papers, underscoring the need for better safeguards despite two decades of attempted repairs. Experts now say that the explosive growth of science around the globe has made the problem far worse, because most countries have yet to institute the extra measures that the United States has put in place. That imbalance is at least partly responsible for a rise in scientific scandals in other countries, they say. Dr. Richard S. Smith, a former editor of The British Medical Journal (now BMJ) and the co-founder of the Committee on Publication Ethics, a group of journal editors, said in an interview that fraud was becoming increasingly difficult to root out because most countries' protective measures were either patchy or altogether absent. "It's hard enough to do something nationally, and to do it internationally is still harder," he said. "But that's what is needed." Contributing to the problem is a drastic rise in the number of scientific journals published around the world: more than 54,000, according to Ulrich's Periodicals Directory. This glut can confuse researchers, overwhelm quality-control systems, encourage fraud and distort the public perception of findings. 
"Foreign scientific journals have gone through the roof," said Shawn Chen, a senior associate editor at Ulrich's, nearly doubling to 29,098 in 2005 from 15,300 in 1980. "We're having a hard time keeping up." While millions of articles are never read or cited - and some are written simply to pad r?sum?s - others enter the pressure cooker of scientific and biomedical promotion, becoming lucrative elements of companies' business strategies. Until now, cases of questionable research in other countries have gotten little attention in the United States. But international editors, shaken by scandal, are now publicizing them and expressing concern. This year, the July 30 issue of BMJ devoted four articles to the subject, asking on its cover: "Suspicions of fraud in medical research: Who should investigate?" The articles discussed cases in which several publications, including BMJ, had stumbled in resolving serious doubts about the truthfulness of published studies done in Canada and India. The Canadian research claimed that a patented mix of multivitamins improved brain function in older people, and the Indian study said that low-fat, high-fiber diets cut by nearly half the risk of death from heart disease. The BMJ said that it published its own version of the Indian research in April 1992 and that it had later investigated serious questions about the validity of the research for more than a decade before speaking out. The difficulty, the editors said, was that journals could go only so far in fraud inquiries before needing the aid of national investigative bodies and professional associations that oversee scientific research. But in the Indian and Canadian cases, they added, such bodies either did not exist or refused to help, so "the doubts are unresolved." The journal's editors, Dr. Fiona Godlee and Dr. Jane Smith, noted that the United States and Scandinavian countries had adopted institutional defenses and that Britain was considering such safeguards. 
Journals have an obligation to help the process, they concluded, by publicizing their difficulties and doubts. Most recently, the South Korean uproar illustrates the tangle of publishing and policing issues that can arise as science becomes increasingly competitive and international. "Now we're in a situation where we have these alliances between university researchers in countries and between institutions that really weren't working together before," said Dr. LaFollette, author of "Stealing Into Print." The journal Science, owned by the American Association for the Advancement of Science, published the research of Dr. Hwang of Seoul National University and his colleagues in March 2004 and June 2005, hailing it as pathbreaking. On Dec. 14, the magazine noted in a statement how fraud charges about the 2005 research had led to two investigations - one in South Korea and the other at the University of Pittsburgh, home to one of the article's 25 co-authors. "The journal itself is not an investigative body," Donald Kennedy, the magazine's editor, argued. "We await answers from the authors, as well as official conclusions, before we come to any ourselves." On Friday in a news conference, Dr. Kennedy emphasized that the magazine had made no accusations of fraud against Dr. Hwang. "As of now we can't reach any conclusions with respect to misconduct issues," he said. Independent scientists said it remained to be seen how thoroughly authorities in South Korea, where Dr. Hwang is a celebrity, would investigate the case and resolve knotty issues in what amounts to a highly public test of institutional maturity. Seoul National University is leading the inquiry. Its committee, which apparently has the authority to examine Dr. Hwang's raw data and to question his colleagues, may have the best chance of discovering how much of his work remains valid. 
But experts also cautioned that the committee's credibility requires the addition of outsiders, and perhaps scientists from other countries, who know the field and can help ensure that the investigation will retain its objectivity. "Unfortunately, individual institutions have an enormous conflict of interest," said Dr. Smith, the former editor of The British Medical Journal. "It's a lot easier," he said, for such bodies when examining an allegation of fraud on their own, "to slide someone out of the organization or to suppress it altogether." From checker at panix.com Tue Jan 3 22:36:28 2006 From: checker at panix.com (Premise Checker) Date: Tue, 3 Jan 2006 17:36:28 -0500 (EST) Subject: [Paleopsych] NYT: In the Chaos of Homelessness, Calendars Mean Nothing Message-ID: In the Chaos of Homelessness, Calendars Mean Nothing http://www.nytimes.com/2005/12/20/health/20case.html Cases By ELISSA ELY, M.D. I knew from a note left by her case manager that the homeless woman I was waiting to see had a history of trauma, terrible mood swings, past suicide attempts. I had booked an hour for an intake evaluation. She arrived 35 minutes late, sat down and shook out long braids. She was plump, and wore what looked like someone else's ill-fitting button-down shirt. She opened her pocketbook - eyeliner, loose cigarettes, Kleenex tumbling out. "I've got to see a doctor right away," she said, and she began to weep. In the next 15 seconds, I learned that she had been beaten by her father, that she had found her fiancé in bed with her daughter, that she had not slept in two nights. On top of that, she said, she had been late catching the bus from the shelter to the subway to get to the clinic and late getting the subway to the bus to get to the shelter the night before. That meant that she had missed dinner and breakfast. She didn't know if she could go on one minute more. I opened up my lunch bag and handed her the first thing I came across. It was a large banana.
I had been looking forward to eating it. She finished it in three bites and dropped the peel into her pocketbook. We talked a few more minutes but the intake forms remained blank. She was essentially incoherent; not psychotic, but washed away in a flood of disorganization and emotion, unable to grab any branch long enough to pull herself onto land. Finally, I gave her a card with an appointment for the next week and a week's prescription for a benign sleeping medication. Five nights later, I was in a different shelter when the staff phone rang. It was the drug and alcohol abuse counselor whose office was two doors away. The walls are plasterboard, and I could hear him talking into the phone from his cubicle. There was weeping in the background. "I have someone who needs to see a psychiatrist right away," he told me. "Sign her up," I said. "Just a minute," he said, and, putting his hand over the receiver, told the weeper: "I'm going to sign you up. You can see her next week." The weeping became loud wailing. "What's her name?" I asked. It was familiar. So, now, was the weeping. A mental image surfaced of braids and objects tumbling from a purse. "Tell her we met last Friday," I said. "I'm the doctor she saw in the clinic." The wailing continued. "Tell her I gave her the banana," I said. The weeping stopped. "Oh," I heard her say through the wall. "That doctor." "Ask her if she's sleeping any better," I said. He asked her, then told me that she had not filled the prescription yet. "Tell her I'm going to see her the day after tomorrow," I said. "We made an appointment. Nine o'clock. She has a card." "O.K.," he said. "I'll tell her." Without the banana, she would not have recognized me. I was simply another branch floating by. In the chaos of her life, it was natural to see a psychiatrist in one shelter during the day on Friday and a second one in a different shelter on Wednesday night. 
But by the happy coincidence of being the same person in two places, I had headed off redundancy. Luck and a piece of fruit had provided the beginning of consistent care. Now we could get down to work. Friday morning came. 9:00. 9:30. 10:30. She never showed. At the night shelter two days later, the drug counselor said he had not seen her. She had moved into the land of the missing. Life should be easier to organize. One patient, one doctor. But the muddle is a metaphor for homelessness, part of the diffusion that comes when you have no base. Calendars and appointment cards mean nothing. The solution is unclear, at least to me. A banana makes an impression, but not for long enough. From checker at panix.com Tue Jan 3 22:38:42 2006 From: checker at panix.com (Premise Checker) Date: Tue, 3 Jan 2006 17:38:42 -0500 (EST) Subject: [Paleopsych] Louis Menand: Everybody's an Expert Message-ID: Louis Menand: Everybody's an Expert http://www.newyorker.com/printables/critics/051205crbo_books1 Putting predictions to the test. Issue of 2005-10-05 Posted 2005-11-28 Prediction is one of the pleasures of life. Conversation would wither without it. "It won't last. She'll dump him in a month." If you're wrong, no one will call you on it, because being right or wrong isn't really the point. The point is that you think he's not worthy of her, and the prediction is just a way of enhancing your judgment with a pleasant prevision of doom. Unless you're putting money on it, nothing is at stake except your reputation for wisdom in matters of the heart. If a month goes by and they're still together, the deadline can be extended without penalty. "She'll leave him, trust me. It's only a matter of time." They get married: "Funny things happen. You never know." You still weren't wrong. Either the marriage is a bad one - you erred in the right direction - or you got beaten by a low-probability outcome.
It is the somewhat gratifying lesson of Philip Tetlock's new book, "Expert Political Judgment: How Good Is It? How Can We Know?" (Princeton; $35), that people who make prediction their business - people who appear as experts on television, get quoted in newspaper articles, advise governments and businesses, and participate in punditry roundtables - are no better than the rest of us. When they're wrong, they're rarely held accountable, and they rarely admit it, either. They insist that they were just off on timing, or blindsided by an improbable event, or almost right, or wrong for the right reasons. They have the same repertoire of self-justifications that everyone has, and are no more inclined than anyone else to revise their beliefs about the way the world works, or ought to work, just because they made a mistake. No one is paying you for your gratuitous opinions about other people, but the experts are being paid, and Tetlock claims that the better known and more frequently quoted they are, the less reliable their guesses about the future are likely to be. The accuracy of an expert's predictions actually has an inverse relationship to his or her self-confidence, renown, and, beyond a certain point, depth of knowledge. People who follow current events by reading the papers and newsmagazines regularly can guess what is likely to happen about as accurately as the specialists whom the papers quote. Our system of expertise is completely inside out: it rewards bad judgments over good ones. "Expert Political Judgment" is not a work of media criticism. Tetlock is a psychologist - he teaches at Berkeley - and his conclusions are based on a long-term study that he began twenty years ago.
He picked two hundred and eighty-four people who made their living "commenting or offering advice on political and economic trends," and he started asking them to assess the probability that various things would or would not come to pass, both in the areas of the world in which they specialized and in areas about which they were not expert. Would there be a nonviolent end to apartheid in South Africa? Would Gorbachev be ousted in a coup? Would the United States go to war in the Persian Gulf? Would Canada disintegrate? (Many experts believed that it would, on the ground that Quebec would succeed in seceding.) And so on. By the end of the study, in 2003, the experts had made 82,361 forecasts. Tetlock also asked questions designed to determine how they reached their judgments, how they reacted when their predictions proved to be wrong, how they evaluated new information that did not support their views, and how they assessed the probability that rival theories and predictions were accurate. Tetlock got a statistical handle on his task by putting most of the forecasting questions into a "three possible futures" form. The respondents were asked to rate the probability of three alternative outcomes: the persistence of the status quo, more of something (political freedom, economic growth), or less of something (repression, recession). And he measured his experts on two dimensions: how good they were at guessing probabilities (did all the things they said had an x per cent chance of happening happen x per cent of the time?), and how accurate they were at predicting specific outcomes. The results were unimpressive. On the first scale, the experts performed worse than they would have if they had simply assigned an equal probability to all three outcomes - if they had given each possible future a thirty-three-per-cent chance of occurring.
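The comparison against the equal-probability baseline can be sketched with a multiclass Brier score, where lower is better. Tetlock's actual scoring rules are more elaborate than this, and the forecast data below are invented purely for illustration:

```python
# Minimal sketch (not Tetlock's actual method): score three-outcome
# probability forecasts with a Brier score and compare an "expert"
# against the uniform 1/3-1/3-1/3 baseline. Forecast data are made up.

def brier(probs, outcome):
    """Multiclass Brier score: 0 for a perfect forecast, lower is better."""
    return sum((p - (1.0 if i == outcome else 0.0)) ** 2
               for i, p in enumerate(probs))

# Each item: (probabilities for status quo / more / less, actual outcome index)
forecasts = [
    ([0.7, 0.2, 0.1], 2),   # confident, wrong
    ([0.6, 0.3, 0.1], 1),   # confident, wrong
    ([0.5, 0.3, 0.2], 0),   # right
]
uniform = [1 / 3, 1 / 3, 1 / 3]

expert_score = sum(brier(p, o) for p, o in forecasts) / len(forecasts)
baseline_score = sum(brier(uniform, o) for _, o in forecasts) / len(forecasts)

# An overconfident expert scores worse (higher) than the uniform baseline.
print(f"expert {expert_score:.3f} vs uniform baseline {baseline_score:.3f}")
```

The uniform forecast always scores 2/3 on a three-outcome question, so any expert averaging above that is doing worse than assigning every future a one-in-three chance.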
Human beings who spend their lives studying the state of the world, in other words, are poorer forecasters than dart-throwing monkeys, who would have distributed their picks evenly over the three choices. Tetlock also found that specialists are not significantly more reliable than non-specialists in guessing what is going to happen in the region they study. Knowing a little might make someone a more reliable forecaster, but Tetlock found that knowing a lot can actually make a person less reliable. "We reach the point of diminishing marginal predictive returns for knowledge disconcertingly quickly," he reports. "In this age of academic hyperspecialization, there is no reason for supposing that contributors to top journals - distinguished political scientists, area study specialists, economists, and so on - are any better than journalists or attentive readers of the New York Times in 'reading' emerging situations." And the more famous the forecaster the more overblown the forecasts. "Experts in demand," Tetlock says, "were more overconfident than their colleagues who eked out existences far from the limelight." People who are not experts in the psychology of expertise are likely (I predict) to find Tetlock's results a surprise and a matter for concern. For psychologists, though, nothing could be less surprising. "Expert Political Judgment" is just one of more than a hundred studies that have pitted experts against statistical or actuarial formulas, and in almost all of those studies the people either do no better than the formulas or do worse. In one study, college counsellors were given information about a group of high-school students and asked to predict their freshman grades in college. The counsellors had access to test scores, grades, the results of personality and vocational tests, and personal statements from the students, whom they were also permitted to interview. Predictions that were produced by a formula using just test scores and grades were more accurate.
There are also many studies showing that expertise and experience do not make someone a better reader of the evidence. In one, data from a test used to diagnose brain damage were given to a group of clinical psychologists and their secretaries. The psychologists' diagnoses were no better than the secretaries'. The experts' trouble in Tetlock's study is exactly the trouble that all human beings have: we fall in love with our hunches, and we really, really hate to be wrong. Tetlock describes an experiment that he witnessed thirty years ago in a Yale classroom. A rat was put in a T-shaped maze. Food was placed in either the right or the left transept of the T in a random sequence such that, over the long run, the food was on the left sixty per cent of the time and on the right forty per cent. Neither the students nor (needless to say) the rat was told these frequencies. The students were asked to predict on which side of the T the food would appear each time. The rat eventually figured out that the food was on the left side more often than the right, and it therefore nearly always went to the left, scoring roughly sixty per cent - D, but a passing grade. The students looked for patterns of left-right placement, and ended up scoring only fifty-two per cent, an F. The rat, having no reputation to begin with, was not embarrassed about being wrong two out of every five tries. But Yale students, who do have reputations, searched for a hidden order in the sequence. They couldn't deal with forty-per-cent error, so they ended up with almost fifty-per-cent error. The expert-prediction game is not much different. When television pundits make predictions, the more ingenious their forecasts the greater their cachet. An arresting new prediction means that the expert has discovered a set of interlocking causes that no one else has spotted, and that could lead to an outcome that the conventional wisdom is ignoring.
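The maze experiment above is easy to reproduce numerically. In this sketch the "student" is modeled as a probability matcher who guesses left sixty per cent of the time - a simplifying assumption of mine, since the article says only that the students hunted for patterns:

```python
import random

# Sketch of the T-maze experiment described above. Food appears on the
# left 60% of the time. The "rat" maximizes (always goes left); the
# "student" probability-matches, guessing left 60% of the time.
# The matching strategy is a modeling assumption, not from the text.

random.seed(0)
TRIALS = 100_000
rat_correct = student_correct = 0
for _ in range(TRIALS):
    food_left = random.random() < 0.6
    rat_correct += food_left                    # rat: always choose left
    student_left = random.random() < 0.6        # student: match frequencies
    student_correct += (student_left == food_left)

print(f"rat {rat_correct / TRIALS:.2%}, student {student_correct / TRIALS:.2%}")
```

The rat converges on about sixty per cent, while the matching strategy yields 0.6 × 0.6 + 0.4 × 0.4 = 0.52, the students' F.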
On shows like "The McLaughlin Group," these experts never lose their reputations, or their jobs, because long shots are their business. More serious commentators differ from the pundits only in the degree of showmanship. These serious experts?the think tankers and area-studies professors?are not entirely out to entertain, but they are a little out to entertain, and both their status as experts and their appeal as performers require them to predict futures that are not obvious to the viewer. The producer of the show does not want you and me to sit there listening to an expert and thinking, I could have said that. The expert also suffers from knowing too much: the more facts an expert has, the more information is available to be enlisted in support of his or her pet theories, and the more chains of causation he or she can find beguiling. This helps explain why specialists fail to outguess non-specialists. The odds tend to be with the obvious. Tetlock's experts were also no different from the rest of us when it came to learning from their mistakes. Most people tend to dismiss new information that doesn't fit with what they already believe. Tetlock found that his experts used a double standard: they were much tougher in assessing the validity of information that undercut their theory than they were in crediting information that supported it. The same deficiency leads liberals to read only The Nation and conservatives to read only National Review. We are not natural falsificationists: we would rather find more reasons for believing what we already believe than look for reasons that we might be wrong. In the terms of Karl Popper's famous example, to verify our intuition that all swans are white we look for lots more white swans, when what we should really be looking for is one black swan. Also, people tend to see the future as indeterminate and the past as inevitable. 
If you look backward, the dots that lead up to Hitler or the fall of the Soviet Union or the attacks on September 11th all connect. If you look forward, it's just a random scatter of dots, many potential chains of causation leading to many possible outcomes. We have no idea today how tomorrow's invasion of a foreign land is going to go; after the invasion, we can actually persuade ourselves that we knew all along. The result seems inevitable, and therefore predictable. Tetlock found that, consistent with this asymmetry, experts routinely misremembered the degree of probability they had assigned to an event after it came to pass. They claimed to have predicted what happened with a higher degree of certainty than, according to the record, they really did. When this was pointed out to them, by Tetlock's researchers, they sometimes became defensive. And, like most of us, experts violate a fundamental rule of probabilities by tending to find scenarios with more variables more likely. If a prediction needs two independent things to happen in order for it to be true, its probability is the product of the probability of each of the things it depends on. If there is a one-in-three chance of x and a one-in-four chance of y, the probability of both x and y occurring is one in twelve. But we often feel instinctively that if the two events "fit together" in some scenario the chance of both is greater, not less. The classic "Linda problem" is an analogous case. In this experiment, subjects are told, "Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice and also participated in antinuclear demonstrations." They are then asked to rank the probability of several possible descriptions of Linda today. Two of them are "bank teller" and "bank teller and active in the feminist movement." 
People rank the second description higher than the first, even though, logically, its likelihood is smaller, because it requires two things to be true - that Linda is a bank teller and that Linda is an active feminist - rather than one. Plausible detail makes us believers. When subjects were given a choice between an insurance policy that covered hospitalization for any reason and a policy that covered hospitalization for all accidents and diseases, they were willing to pay a higher premium for the second policy, because the added detail gave them a more vivid picture of the circumstances in which it might be needed. In 1982, an experiment was done with professional forecasters and planners. One group was asked to assess the probability of "a complete suspension of diplomatic relations between the U.S. and the Soviet Union, sometime in 1983," and another group was asked to assess the probability of "a Russian invasion of Poland, and a complete suspension of diplomatic relations between the U.S. and the Soviet Union, sometime in 1983." The experts judged the second scenario more likely than the first, even though it required two separate events to occur. They were seduced by the detail. It was no news to Tetlock, therefore, that experts got beaten by formulas. But he does believe that he discovered something about why some people make better forecasters than other people. It has to do not with what the experts believe but with the way they think. Tetlock uses Isaiah Berlin's metaphor from Archilochus, from his essay on Tolstoy, "The Hedgehog and the Fox," to illustrate the difference. He says: Low scorers look like hedgehogs: thinkers who "know one big thing," aggressively extend the explanatory reach of that one big thing into new domains, display bristly impatience with those who "do not get it," and express considerable confidence that they are already pretty proficient forecasters, at least in the long term.
High scorers look like foxes: thinkers who know many small things (tricks of their trade), are skeptical of grand schemes, see explanation and prediction not as deductive exercises but rather as exercises in flexible "ad hocery" that require stitching together diverse sources of information, and are rather diffident about their own forecasting prowess. A hedgehog is a person who sees international affairs to be ultimately determined by a single bottom-line force: balance-of-power considerations, or the clash of civilizations, or globalization and the spread of free markets. A hedgehog is the kind of person who holds a great-man theory of history, according to which the Cold War does not end if there is no Ronald Reagan. Or he or she might adhere to the "actor-dispensability thesis," according to which Soviet Communism was doomed no matter what. Whatever it is, the big idea, and that idea alone, dictates the probable outcome of events. For the hedgehog, therefore, predictions that fail are only "off on timing," or are "almost right," derailed by an unforeseeable accident. There are always little swerves in the short run, but the long run irons them out. Foxes, on the other hand, don't see a single determining explanation in history. They tend, Tetlock says, "to see the world as a shifting mixture of self-fulfilling and self-negating prophecies: self-fulfilling ones in which success breeds success, and failure, failure - but only up to a point, and then self-negating prophecies kick in as people recognize that things have gone too far." Tetlock did not find, in his sample, any significant correlation between how experts think and what their politics are. His hedgehogs were liberal as well as conservative, and the same with his foxes. (Hedgehogs were, of course, more likely to be extreme politically, whether rightist or leftist.)
He also did not find that his foxes scored higher because they were more cautious - that their appreciation of complexity made them less likely to offer firm predictions. Unlike hedgehogs, who actually performed worse in areas in which they specialized, foxes enjoyed a modest benefit from expertise. Hedgehogs routinely over-predicted: twenty per cent of the outcomes that hedgehogs claimed were impossible or nearly impossible came to pass, versus ten per cent for the foxes. More than thirty per cent of the outcomes that hedgehogs thought were sure or near-sure did not, against twenty per cent for foxes. The upside of being a hedgehog, though, is that when you're right you can be really and spectacularly right. Great scientists, for example, are often hedgehogs. They value parsimony, the simpler solution over the more complex. In world affairs, parsimony may be a liability - but, even there, there can be traps in the kind of highly integrative thinking that is characteristic of foxes. Elsewhere, Tetlock has published an analysis of the political reasoning of Winston Churchill. Churchill was not a man who let contradictory information interfere with his idées fixes. This led him to make the wrong prediction about Indian independence, which he opposed. But it led him to be right about Hitler. He was never distracted by the contingencies that might combine to make the elimination of Hitler unnecessary. Tetlock also has an unscientific point to make, which is that "we as a society would be better off if participants in policy debates stated their beliefs in testable forms" - that is, as probabilities - "monitored their forecasting performance, and honored their reputational bets." He thinks that we're suffering from our primitive attraction to deterministic, overconfident hedgehogs. It's true that the only thing the electronic media like better than a hedgehog is two hedgehogs who don't agree.
Tetlock notes, sadly, a point that Richard Posner has made about these kinds of public intellectuals, which is that most of them are dealing in "solidarity" goods, not "credence" goods. Their analyses and predictions are tailored to make their ideological brethren feel good - more white swans for the white-swan camp. A prediction, in this context, is just an exclamation point added to an analysis. Liberals want to hear that whatever conservatives are up to is bound to go badly; when the argument gets more nuanced, they change the channel. On radio and television and the editorial page, the line between expertise and advocacy is very blurry, and pundits behave exactly the way Tetlock says they will. Bush Administration loyalists say that their predictions about postwar Iraq were correct, just a little off on timing; pro-invasion liberals who are now trying to dissociate themselves from an adventure gone bad insist that though they may have sounded a false alarm, they erred "in the right direction" - not really a mistake at all. The same blurring characterizes professional forecasters as well. The predictions on cable news commentary shows do not have life-and-death side effects, but the predictions of people in the C.I.A. and the Pentagon plainly do. It's possible that the psychologists have something to teach those people, and, no doubt, psychologists are consulted. Still, the suggestion that we can improve expert judgment by applying the lessons of cognitive science and probability theory belongs to the abiding modern American faith in expertise. As a professional, Tetlock is, after all, an expert, and he would like to believe in expertise. So he is distressed that political forecasters turn out to be as unreliable as the psychological literature predicted, but heartened to think that there might be a way of raising the standard. The hope for a little more accountability is hard to dissent from. 
It would be nice if there were fewer partisans on television disguised as "analysts" and "experts" (and who would not want to see more foxes?). But the best lesson of Tetlock's book may be the one that he seems most reluctant to draw: Think for yourself. From checker at panix.com Wed Jan 4 23:13:13 2006 From: checker at panix.com (Premise Checker) Date: Wed, 4 Jan 2006 18:13:13 -0500 (EST) Subject: [Paleopsych] John Perna: A Little Conspiracy Paranoia From The Web In Case You Haven't Heard It Before Message-ID: A Little Conspiracy Paranoia From The Web In Case You Haven't Heard It Before Date: Sat, 10 Dec 2005 15:03:53 -0800 (PST) From: John Perna Subject: Uniting human rights activists Uniting human rights activists Most people just want to be free. In the entire political spectrum; from left to right, there are people, who want to be free. In the entire political spectrum, from left to right, there are ALSO people, who want to take our freedoms away. The freedom loving people outnumber the totalitarians by orders of magnitude. Yet; over and over again, the totalitarians are winning. The totalitarians are winning because they are organized. The freedom loving people need to get organized. The totalitarians are winning because they have their disinformers sowing strife among the freedom loving people: causing them to fight EACH OTHER. The freedom loving people need to work TOGETHER to oppose totalitarianism. Mao Tse-tung summed it up quite well: "The stronghold is broken easily from the INSIDE" Divide and conquer is a VERY OLD trick. "If your enemy is stronger divide him." -Sun Tzu Amazingly, in the entire political spectrum; from left to right, people often agree as to who the totalitarians are, yet they still cannot work together. IF you really understand the power brokers of this world, you will understand that these instruments of totalitarians are often tentacles of the same monster, all controlled by the New World Order. The CIA works for the New World Order. 
The FBI is now controlled by the New World Order. Al Qaeda works for the New World Order. The governments of many countries are a part of the New World Order. The "industrial-military complex" IS the New World Order. Yet often those who agree on EXACTLY WHO IT IS, that is destroying our freedom, cannot work together, because they want to give them different labels. At the same time, the totalitarians have disinformers working among the human rights activists. The false human rights activist will present himself as a defender of liberty, BUT he will spend most of his time attacking the defenders of liberty, under numerous pretenses. If you total up all of the defenders of liberty; who are attacked by the false human rights activist, you will quickly notice that it is practically the ENTIRE human rights movement. Those; whom he praises, are usually in JAIL. Obviously, no one will have his approval, until they are in jail! HOW CAN SOMEONE OPPOSE EVERY HUMAN RIGHTS ACTIVIST, OF EVERY TYPE? THERE IS ONLY ONE POSSIBLE ANSWER: HE IS OPPOSED TO HUMAN RIGHTS ACTIVISM ITSELF. When you find that the same person; WHO OPPOSES EVERY HUMAN RIGHTS ACTIVIST, OF EVERY TYPE, YET is also quite comfortable, with those who preach totalitarianism, then there can be NO DOUBT as to his real goals. When the disinformers become very desperate, they will resort to insulting the freedom activists; by name, AND IN THE SUBJECT LINE, OF THE MESSAGE. When the disinformers become very desperate they will attempt to fabricate "band wagon appeal", by claiming that no one is listening to the most effective defenders of liberty. Of course; if that was true, The New World Order would not have disinformers devoting so much time, and energy, to opposing them! The false human rights activist will spend a great deal of effort on the following: 1. Dividing the defenders of liberty into fighting each other by creating strife among freedom activists. 
(This is often called "infighting", but it actually comes from a person, who is working to build tyranny.) 2. Attempting to waste the time of freedom activists, by forcing them to respond to personal attacks, or endless debate about trivia. 3. ABOVE ALL: Accusing the most effective freedom activists of being false opposition. THE GOLDEN RULE OF DISINFORMERS: Always accuse your adversary of whatever is true about yourself. Details below: Disinformer's Gambit Tactics of Disinformers A short time ago I posted some information on The Tactics of Disinformers. I was amazed to see how quickly the disinformers came out of the wood work to identify themselves. This response was actually an excellent demonstration of one of the tactics; that was explained: THE GOLDEN RULE OF DISINFORMERS: Always accuse your adversary of whatever is true about yourself. Sometimes, if you simply post information on Disinformers; without naming anyone, the Disinformers will identify themselves, by taking offense, and popping in, to comment. Occasionally, even though you do not mention anyone's name, they will claim that what you have written is about them. Nothing could be more precise, in identifying exactly WHO are the disinformers, than the disinformers own reaction to this post. It is safe to conclude that if one were to throw a rock into a pen full of pigs, and one of them squealed, that the one that squealed, would be the one that you had hit. TRY POSTING THIS, ALL AROUND, and then watch what happens. See who it is that claims, that this "shoe fits". It is a good dog, who comes when he is called. Here is the message that makes the pigs squeal: The Disinformer's Gambit - the Tactics of Disinformers In the game of chess there exists a term; which describes a maneuver, a stratagem, and a ploy; using different pieces working together, to accomplish a secret purpose. This is called a gambit. Those who hope to build totalitarian control over freedom loving people, also use many gambits. 
BUT, the chess player has the advantage of always knowing which pieces are on the other side. Those who would defend liberty have no such luxury. The most treacherous player in the gambit is the false patriot. The false human rights activist will present himself as a defender of liberty, BUT he will spend most of his time attacking the defenders of liberty, under numerous pretenses. The false human rights activist will attempt to re-direct attention in every direction EXCEPT at those, who are building tyranny. The false human rights activist will attempt to re-direct attention to an entire race, a religion, a large group with a few problems, or even against freedom activists. This is essentially the very old military strategy of the creation of a "decoy to draw enemy fire." The ultimate success of this deception is to cause defenders of liberty to "fire" on non-combatants, or even to "fire" on their own friends, and allies. In addition, the false human rights activist will neutralize the efforts of the defenders of liberty. The false patriot, like every other player in the gambit, will neutralize the efforts of the defenders of liberty by: 1. Deceiving the defenders of liberty into supporting hoaxes. Any time that a simple request for evidence results in vitriolic personal attacks, or an attempt to censor, with no attempt to address the issue, you can be sure that you are dealing with a hoax. 2. Dividing the defenders of liberty into fighting each other by creating strife among patriots. 3. Deceiving the defenders of liberty into creating class struggle by promoting ethnic hatred. 4. Attempting to waste the time of human rights activists, by forcing them to respond to personal attacks, or endless debate about trivia. 5. Using multiple aliases to create the appearance that there is someone, who believes them to be credible. 6. ABOVE ALL: Accusing the most effective patriots of being false opposition. 
THE GOLDEN RULE OF DISINFORMERS: Always accuse your adversary of whatever is true about yourself. It is VERY simple. Those who spend their time fighting tyranny, are freedom activists. Those who spend their time fighting freedom activists are working for the advancement of tyranny. The false human rights activist will generally proclaim himself to be a religious zealot; such as a Christian conservative, but he will not adhere to the principles of his proclaimed religion, in his own actions. He might even; incomprehensibly, be an ally of another player in the gambit; who is an atheist, or a socialist, or even a Satanist. (There is no implication here that an atheist cannot be in favor of freedom. This is mentioned only to show that often "strange bedfellows" are working together to oppose freedom activists.) In one group he might attempt to appear to be a Christian conservative. In another group he might present himself as a Marxist, or even as an Al Qaeda supporter. The false human rights activist; and those whom he deceives into promoting hoaxes, will reduce the credibility of all freedom activists. Those who have heard outlandishly ridiculous "conspiracy theories" will have a tendency to dismiss all "conspiracy theories" without serious examination. The false human rights activist will compose letters of praise, to himself, and post them as if they were anonymous correspondence supposedly received by himself. The atheist, or socialist will supply a fervent, and obvious, opposition to the defenders of liberty. This player, in the gambit, will make a strong frontal attack on everything that the defenders of liberty do. The atheist, or socialist will implement the same goals; that are listed above. The atheist, or socialist will play on the existence of outlandishly ridiculous "conspiracy theories" to encourage people to dismiss all "conspiracy theories" without serious examination. 
The atheist, or socialist may infest patriotic, and religious, groups for no apparent reason. The main reasons are to disrupt, to get the group deleted, or to waste the time of human rights activists. Even though DISINFORMERS make it a practice of accusing the most effective freedom activists of being false opposition, you will notice that there are certain admitted government apologists, who are never accused. This is because certain admitted government apologists are also players in the gambit. The admitted government apologist will attempt to portray the defenders of liberty as subversives, as unreliable, and as outlandishly mentally unstable. The admitted government apologist will play on the existence of outlandishly ridiculous "conspiracy theories" to encourage people to dismiss all "conspiracy theories" without serious examination. The admitted government apologist will use documents from the FBI, or other government sources, as his authority. Another player in the gambit, is the pretended neutral. The pretended neutral will take almost none of the above actions, but will work in the background, until a vital move is needed. The pretended neutral will occasionally "vouch" for the other players in the gambit. The pretended neutral may even be the moderator of a group. When the pretended neutral is the moderator of a group, he will take no action, as long as the other players in the gambit are "holding their own". If the other players in the gambit are NOT "holding their own", THEN he will intervene; perhaps even by banning the most effective defenders of liberty. The most effective defenders of liberty will be caught between all of these different players in the gambit. One dead "give away" is the fact that these different players in the gambit frequently are unable to conceal their support for one another; in spite of their alleged differences. The alleged religious zealot/patriot might be great friends with ONE PARTICULAR atheist/socialist. 
The false human rights activist; who is always accusing the most effective patriots of being false opposition, might be great friends with the admitted government apologist; whom he never accuses. The false human rights activist will even join the egroup that is run by the admitted government apologist, and will participate, and support, the admitted government apologist, in his own group. If you will study the tactics used by the FBI COINTELPRO program, THEN you will recognize FBI COINTELPRO immediately. Visit: http://bcn.boulder.co.us/environment/vail/ifanagentknocks.html Take note of the following paragraph: "The FBI COINTELPRO program was initiated in 1956. Its purpose, as described later by FBI Director J. Edgar Hoover, was "to expose, disrupt, misdirect, discredit, or otherwise neutralize activities" of those individuals and organizations whose ideas or goals he opposed. Tactics included: falsely labeling individuals as informants; infiltrating groups with persons instructed to disrupt the group; sending anonymous or forged letters designed to promote strife between groups; initiating politically motivated IRS investigations; carrying out burglaries of offices and unlawful wiretaps; and disseminating to other government agencies and to the media unlawfully obtained derogatory information on individuals and groups." If you understand the meaning of the tactic "to expose, disrupt, misdirect, discredit, or otherwise neutralize activities" you will understand that the person who is most likely to be a Fed, is the one who involves patriots in activities that have no effect on those who are building tyranny, and activities that will destroy the credibility of the patriots. Those who are building tyranny would love to convince people that we are all a bunch of paranoid nuts, so that we will be unable to warn people about the building of tyranny. 
Those who are building tyranny would be more capable of convincing people that we are paranoid nuts, if they could convince a segment of the patriots to run around telling people that the clouds, and the street signs, are out to get us, or that we should ban water. If you understand the meaning of the tactic "infiltrating groups with persons instructed to disrupt the group; sending anonymous or forged letters designed to promote strife between groups" OR OF: "disseminating to other government agencies and to the media unlawfully obtained derogatory information on individuals and groups." THEN you will know that a campaign of personal attacks on the real patriots is a part of the FBI COINTELPRO program. If you understand the meaning of the tactic of falsely labeling individuals as informants THEN you will know that the person; who is most likely to be a fed, is the one who calls the real patriot a fed. It is a common tactic of FBI COINTELPRO to try to be THE FIRST ONE TO MAKE THE ACCUSATION. THE GOLDEN RULE OF DISINFORMERS: Always accuse your adversary of whatever is true about yourself. The reader will notice that no one is named in this discussion. If there is any person; who feels that this discussion accurately describes themselves, that person is invited to comment. From checker at panix.com Wed Jan 4 23:13:22 2006 From: checker at panix.com (Premise Checker) Date: Wed, 4 Jan 2006 18:13:22 -0500 (EST) Subject: [Paleopsych] NYTBR: Wish List: No More Books! Message-ID: Wish List: No More Books! http://select.nytimes.com/preview/2005/12/25/books/1124990452143.html Essay By JOE QUEENAN A few months ago, a friend whose iconoclastic, unpredictable behavior I usually hold in high esteem handed me a book entitled "A Navajo Legacy: The Life and Teachings of John Holiday." Apparently, he expected me to read it, despite the fact that I am not really a Navajo medicine man autobiography kind of guy. 
Flummoxed but gracious, I took the gift home and put it on a shelf alongside all the other books that friends have lent or given me over the years. This collection includes: "Loose Balls: The Short, Wild Life of the American Basketball Association"; "Hoosier Home Remedies"; "A Walk Through Wales"; "The Frontier World of Doc Holliday"; "Elwood's Blues: Interviews with the Blues Legends & Stars," by Dan Aykroyd and Ben Manilla; both "Steve Allen on the Bible, Religion, and Morality" and Allen's somewhat less Jesuitical "Hi-Ho, Steverino!"; and, of course, "Bloodline of the Holy Grail: The Hidden Lineage of Jesus Revealed." If I live to be 1,000 years old, I am not going to read any of these books. Especially the one about the American Basketball Association. Several years ago, I calculated how many books I could read if I lived to my actuarially expected age. The answer was 2,138. In theory, those 2,138 books would include everything from "The Decline and Fall of the Roman Empire" to "Le Colonel Chabert," with titles by authors as celebrated as Marcel Proust and as obscure as Marcel Aymé. In principle, there would be enough time to read 500 masterpieces, 500 minor classics, 500 overlooked works of genius, 500 oddities and 138 examples of high-class trash. Nowhere in this utopian future would there be time for "Hi-Ho, Steverino!" True, I used to be one of those people who could never start a book without finishing it or introduce a volume to his library without eventually reading it. Familiarity with this glaring character flaw may have encouraged others to use me as a cultural guinea pig, heartlessly foisting books like "Damien the Leper" (written by Mia Farrow's father) or the letters of Flannery O'Connor upon me just to see if they were worth reading. (He wasn't; she was.) These forced reconnaissance missions ended the day an otherwise likable friend sent me "Accordion Man," a biography of Dick Contino by Bob Bove and Lou Angellotti. Though I revere Mr. 
Contino for his matchless rendition of "Arrivederci Roma," it disturbed me greatly that my friend would have mistaken affection for Mr. Contino's music for intense interest in his personal history. CD's are fine: you can read "Death in Venice" or Pascal's "Pensées" while "Roll Out the Barrel" is bouncing along in the background. But if you spend too much time reading about how Dick Contino finally came to record "Lady of Spain," you will never get to Junichiro Tanizaki's "Some Prefer Nettles." And "Some Prefer Nettles" is No. 1,759 on my dream reading list. I do not avoid books like "Accordion Man" or "Elwood's Blues" merely because I believe that life is too short. Even if life were not too short, it would still be too short to read anything by Dan Aykroyd. And I am sure I am not alone when I state that cavalierly foisting unsolicited reading material upon book lovers is like buying underwear for people you hardly know. Bibliophiles are ceaselessly engaged in the mental reconfiguration of a Platonic reading list that will occupy them for the next 35 years: First, I'll get to "Buddenbrooks," then "The Man Without Qualities," then "The Decline of the West," and finally "Finnegans Wake." But I'll never get to "Finnegans Wake" if I keep stopping to read books like "The Frontier World of Doc Holliday." Time management is not the only issue here. There is often something sinister about the motives of those who press books onto others. The urge to give "Elwood's Blues" to someone who already owns unread biographies of Franz Schubert and Miles Davis smacks of sadism; the books serve as a taunt, a gibe, a threat, an insult. It is as if the lender himself wants to see how far another person can be pushed before he resorts to the rough stuff. Hint: If you're going to really press your luck and give someone one of this year's models that you fear they might eventually smack you with, steer clear of Pantagruelian blabfests like "The Historian." 
Otherwise, you could find yourself with a few loose teeth. I am certainly not suggesting that all given or lent books should be rejected, pulped, incinerated or mothballed. My sisters have impeccable taste in crime fiction and know precisely which Ruth Rendell to pass along next. A neighbor I met through my wife's garden club has given me several hard-to-get Georges Simenon mysteries, all of which proved to be delightful. But for everyone lending me "Maigret and the Insouciant Parrot," there are a dozen others handing me "Va Va Voom!: Bombshells, Pin-ups, Sexpots and Glamour Girls." Or "A Navajo Legacy." In many instances, people pass along books as a probing technique to see, "Is he really one of us?" That is, you're not serious about your ethnic heritage unless you've read "Angela's Ashes." You don't care about the poor Mayans unless you've read "1491" and its inevitable sequel, "1243." You don't really give a damn about the pernicious influence of the Knights Templar unless you've read "The Da Vinci Code." And you're not really interested in the future of our imperiled republic unless you've read "The No Spin Zone," "The No Spin Zone for Children," "101 Things Stupid Liberals Hate About the No Spin Zone," and "Ann Coulter on Spinoza." Some people may wonder, "Well, why don't you simply lie when people ask you about the books they've lent you?" There are two problems with such duplicity. One, lying is a sin. Two, experienced biblio-fobs will invariably subject their targets to the third degree: Were you surprised at Damien the Leper's blasé reaction when his fingers fell into the porridge? What did you think of that cute little ermine affair Parsifal was wearing when he finally grasped the Holy Grail? Were you taken aback by all those weird recipes for Sachertorte in "The Tipping Point"? After reading "The Frontier World of Doc Holliday," do you have more or less respect for Ike Clanton as a money manager? 
Pity the callow lendee who falls for the trick question and is unmasked as a fraud. Because I live in a small town where I cross paths with promiscuous book lenders all the time, I have lately taken to hiding in subterranean caverns, wearing clever disguises while concealed in tenebrous alcoves and feigning rare tropical illnesses to avoid being saddled with any new reading material. Were I a younger man, I would be more than happy to take a gander at "Holy Faces, Secret Places: An Amazing Quest for the Face of Jesus," or Phil Lesh's Grateful Dead memoir. But time is running out, and if I don't get cracking soon I'm never going to get to "Gunpowder and Firearms in the Mamluk Kingdom," much less "The Golden Bough." Of course, the single greatest problem in accepting unsolicited books from friends is that it may encourage them to lend you others. Once you've told them how much you enjoyed "How the Irish Saved Civilization," they'll be at your front doorstep with "How the Scots Invented the Modern World," "The Gifts of the Jews," and perhaps one day "How the Norwegians Invented Hip-Hop." If you tell them that you liked "Why Sinatra Matters" or "Why Orwell Matters," you're giving them carte blanche to turn up with "Why Vic Damone Matters" or "Why G. K. Chesterton Still Rocks!" When I foolishly let it be known how much I enjoyed "X-Ray," the "unauthorized" autobiography of the Kinks' lead singer, Ray Davies, a good friend then upped the ante with a copy of Dave Davies's "Kink: The Outrageous Story of My Wild Years as the Founder and Lead Guitarist of the Kinks." Surely, "The Mick Avory Story: My Life As the Kinks' Drummer" and "Pete Quaife: Hey, What Am I, the Kinks' Bassist or a Potted Plant?" cannot be far behind. This is why I recently told yet another friend that I hated a police procedural he'd dropped off. The novel dealt with a fictitious organization called the Vermont Bureau of Investigation, and was actually quite good. 
But when I found out that there were 15 other books in the series, and realized that my friend might own all of them, I feared that I would never, ever get to Miguel de Unamuno's "Tragic Sense of Life" at this rate. And at No. 2,127 on my list, Unamuno may only just get in under the wire anyway. Joe Queenan's most recent book is "Queenan Country: A Reluctant Anglophile's Pilgrimage to the Mother Country." From checker at panix.com Wed Jan 4 23:13:32 2006 From: checker at panix.com (Premise Checker) Date: Wed, 4 Jan 2006 18:13:32 -0500 (EST) Subject: [Paleopsych] NYT: What Men Want: Neanderthal TV Message-ID: What Men Want: Neanderthal TV http://www.nytimes.com/2005/12/11/fashion/sundaystyles/11MEN.html By WARREN ST. JOHN THERE was a heart-wrenching moment at the end of last season's final episode of the ABC series "Lost" when a character named Michael tries to find his kidnapped son. Michael lives for his child; like the rest of the characters in "Lost," the two of them are trapped on a tropical island after surviving a plane crash. When word of Michael's desperate mission reaches Sawyer - a booze-hoarding, hard-shelled narcissist who in his past killed an innocent man - his reaction is not what you would call sympathetic. "It's every man for hisself," Sawyer snarls. Not so long ago Sawyer's callousness would have made him a villain, but on "Lost," he is sympathetic, a man whose penchant for dispensing Darwinian truths over kindnesses drives not only the action but the show's underlying theme, that in the social chaos of the modern world, the only sensible reflex is self-interest. Perhaps not coincidentally Sawyer is also the character on the show with whom young men most identify, according to research conducted by the upstart male-oriented network Spike TV, which interviewed thousands of young men to determine what that coveted and elusive demographic likes most in its television shows. 
Spike found that men responded not only to brave and extremely competent leads but to a menagerie of characters with strikingly antisocial tendencies: Dr. Gregory House, a Vicodin-popping physician on Fox's "House"; Michael Scofield on "Prison Break," who is out to help his brother escape from jail; and Vic Mackey, played by Michael Chiklis on "The Shield," a tough-guy cop who won't hesitate to beat a suspect senseless. Tony Soprano is their patron saint, and like Tony, within the confines of their shows, they are all "good guys." The code of such characters, said Brent Hoff, 36, a fan of "Lost," is: "Life is hard. Men gotta do what men gotta do, and if some people have to die in the process, so be it." "We can relate to them," said Mr. Hoff, a writer from San Francisco. "If you watch Sawyer on 'Lost,' who is fundamentally good even if he does bad things, there's less to feel guilty about in yourself." Gary A. Randall, a producer who helped create "Melrose Place," is developing a show called "Paradise Salvage," about two friends who discover a treasure map, for Spike TV. He said the proliferation of antisocial protagonists came from a concerted effort by networks to channel the frustrations of modern men. "It's about comprehending from an entertainment point of view that men are living a very complex conundrum today," he said. "We're supposed to be sensitive and evolved and yet still in touch with our Neanderthal, animalistic, macho side." Watching a deeply flawed male character who nevertheless prevails, Mr. Randall argued, makes men feel better about their own flaws and internal conflicts. "You think, 'It's O.K. to go to a strip club and have a couple of beers with your buddies and still go home to your wife and baby and live with yourself,' " he said. The most popular male leads of today stand in stark contrast to the unambiguously moral protagonists of the past, good guys like Magnum, Matlock or Barnaby Jones. 
They are also not simply flawed in the classic sense: men who have the occasional affair or who tip the bottle a little too much. Instead they are unapologetic about killing, stealing, hoarding and beating their way to achieve personal goals that often conflict with the greed, apathy and of course the bureaucracies of the modern world. "These kinds of characters are so satisfying to male viewers because culture has told them to be powerful and effective and to get things done, and at the same time they're living, operating and working in places that are constantly defying that," said Robert Thompson, the director of the Center for the Study of Popular Television at Syracuse University. Consequently, whereas the Lone Ranger battled stagecoach robbers and bankers foreclosing on a widow's farm, the enemy of the contemporary male TV hero, Dr. Thompson said, is "the legal, cultural and social infrastructure of the nation itself." Because of competition from the Web, video games and seemingly countless new cable channels, television producers are obsessed with developing shows that can capture the attention of young male viewers. To that end Spike TV, which is owned by Viacom and aims at men from 18 to 49, has ordered up a slate of new dramas based on characters whose minds are cauldrons of moral ambiguity. They will join antiheroes on other networks like Vic Mackey, Gregory House, Jack Bauer of "24," and Tommy Gavin, the firefighter played by Denis Leary on "Rescue Me" who sanctions a revenge murder of the driver who ran over and killed his son. Paul Scheer, a 29-year-old actor from Los Angeles and an avid viewer of "Lost," said that not even committing murder alienates an audience. "You don't have to be defined by one act," he said. "Three people on that island have killed people in cold blood, and they're quote-unquote good people who you're rooting for every week," Mr. Scheer said. 
The implication for the viewer, he added, is, "You can say 'I'm messed up and I left my wife, but I'm still a good guy.' " Peter Liguori, the creator of the FX shows "The Shield" and "Over There" and now the president of Fox Entertainment, said that most strong male protagonists on television appeal to male viewers on an aspirational level. Those aspirations, though, he said, have changed over time. In the age of "Dragnet," "everything was about aspiring to perfection," Mr. Liguori said. "Today I think we thoroughly recognize our flaws and are honest about them. True heroism is in overcoming those flaws." Part of the shift to such complex and deeply flawed characters surely has to do with the economics of television itself. Cable channels, with their targeted niche audiences, are no longer obliged to aim for Middle America, and can instead create dramas for edgier audiences. The financial success of networks like FX and HBO has also opened the door for auteurism that has embroidered scripts with dramatic complexities once reserved for film and literature, where odious protagonists - think of Tom Ripley, the murderous narcissist protagonist in Patricia Highsmith's "The Talented Mr. Ripley" - have long been common. Still the morally struggling protagonist has been evolving over time, Mr. Liguori said, pointing to Detective Andy Sipowicz on "NYPD Blue." Sipowicz was an alcoholic who occasionally fell off the wagon, and he often flouted police procedure in the name of tracking down criminals. Like all good protagonists, Sipowicz was also exceedingly good at his job. Mr. Liguori took the notion of the flawed protagonist to new levels in the creation of Vic Mackey on "The Shield." At the end of the pilot for that show, Mr. Liguori said, Mackey turned to a fellow cop he knew to be crooked and shot him in the face. "There was a great debate at FX about how the audience would react," he said. 
"I thought 50 percent would say that's the most horrible thing, and 50 percent would say he was a rat." Mr. Chiklis, who plays Vic Mackey, won an Emmy for his performance in that episode, which was the highest rated at the time in the history of the network. "The ability to let the audience make that judgment was my 'aha' moment," Mr. Liguori said. "I think that moral ambiguity is highly involving for an audience. Audiences I believe relate to characters they share the same flaws with." Mr. Liguori added that in a world where people are increasingly transparent about their own flaws - detailing them on blogs, on reality TV and talk shows, and in the news media - scripted TV drama had to emphasize characters' weaknesses. "The I.M.-ing and social Web sites, they're all being built on being as open and honest as possible," he said. "You cannot go from that environment to a TV show where everyone is perfect." With the success of shows featuring deeply flawed leads, the challenge for networks is to rein in the impulse to create ever more pathological characters. Pancho Mansfield, the head of original programming for Spike TV, said he could see network television going the route of "Scarface." "With all the competition that's out there and all the channels, people are pushing the extremes to distinguish themselves," Mr. Mansfield said. But for now, he argued, the complexity of characters on serialized TV shows is a kind of antidote to the increasingly superficial characters in Hollywood films, which, he said, have come more to resemble the simplistic television dramas of yore. Dr. Thompson agreed. "On one level you could see the proliferation of these types of characters as an indication of the decline of American civilization," he said. "A more likely interpretation may be that they represent an improvement in the sophistication and complexity of television." If you accept that view, he added, "then the young male demographic has pretty good taste." 
From checker at panix.com Wed Jan 4 23:13:42 2006 From: checker at panix.com (Premise Checker) Date: Wed, 4 Jan 2006 18:13:42 -0500 (EST) Subject: [Paleopsych] NYT: Mass-Produced Individuality Message-ID: Mass-Produced Individuality http://www.nytimes.com/2005/12/11/magazine/11wwln_consumed.html The Way We Live Now By ROB WALKER CafePress Many people used to make their own clothes and build their own furniture. The Industrial Revolution, with technological innovations like power looms and power lathes, and now today's far-flung supply chains, made it easier and more practical to buy ready-made apparel and housewares. Lately, however, mass production has been cast not so much as the best thing that ever happened to consumers but as an annoyance, even a problem. It stands in the way of our individuality. What can save us? Of course the answer must be more technological innovation, and in the past several years there have been many attempts to tweak mass production (of everything from sneakers to M&M's) in ways that will deliver "mass customization" and "the one-to-one future," in which every single consumer gets unique treatment. One of the most intriguing experiments has been CafePress, a company that has been around since 1999 and allows anyone with rudimentary command of a computer the opportunity to, as the site says, "make your own stuff." That is, you can place your own designs or slogans or whatever onto a variety of commodities provided by CafePress: T-shirts, hats, teddy bears, coffee mugs, pillows, clocks, mouse pads and so on. According to the company, more than two million people or companies have used its services to create more than 18 million "unique items." CafePress has shipped 2.6 million orders (taking a cut, of course). Here is individuality on a mass scale. The variety of products offered is sprawling, and aside from serving as a way for the consumer to make things, CafePress is often used as a virtual gift shop for other Web sites. 
One top CafePress "shop" is connected to "This Old House," the television show. But most are not so well known. Another top shop is the Lactivist, a pro-breastfeeding Web site. Recent "hot designs" promoted on CafePress include items from the Bacon Ribbon Store (which offers products showing a strip of bacon twisted into a ribbon and a slogan about "obesity awareness") and Pedro '08 bumper stickers, for people who still enjoy humorous references to the film "Napoleon Dynamite." Stay Free!, a Brooklyn-based magazine that generally takes a dim view of American consumer culture, uses CafePress to sell T-shirts and mugs promoting the nonexistent parody drug Panexa ("Ask your doctor for a reason to take it"). Not surprisingly, a significant number of customized products are related to blogs - or as the search feature on the site puts it: "1,702 designs about 'blog' on 32,721 products." The mass-versus-custom balancing act is actually a very old thing. More than a hundred years ago, Mme. Demorest's Emporium of Fashion in New York did a brisk business selling stylish dress patterns, allowing consumers to conform to the latest fashion but still requiring them to make the garment; even when 50,000 copies of one pattern sold, it was quite likely that no two dresses were exactly the same. The new version of mass customization does not seek to turn back the clock to that era: do-it-yourself publications like Make and ReadyMade have their constituencies, but most people who want, say, "unique" footwear do not actually want to learn how to manufacture a shoe. They want to pick out a color scheme on a sneaker made by a company with vast and sophisticated manufacturing capabilities. Alienation from the means of production is a selling point. CafePress plays to that sentiment, and to another: while it's cool to make your own things with a few clicks and no particular knowledge of production details, it's even cooler to sell those things to other people. 
True individuality is a little lonely, and conformity is easier to swallow if you're an originator rather than a follower. I will admit to feeling this pull. I used CafePress to put a made-up slogan on a coffee mug. While pleased at my expression of individuality, I decided almost immediately to dabble in virtual production, impose that individuality on the broader public and throw open the doors to my own virtual CafePress shop. In the end my complete individuality remains secure - that is, after many months, I was still my only customer. Finally I withdrew my product from the market; I had lived the one-to-one future. E-mail: [3]consumed at nytimes.com. From checker at panix.com Wed Jan 4 23:13:03 2006 From: checker at panix.com (Premise Checker) Date: Wed, 4 Jan 2006 18:13:03 -0500 (EST) Subject: [Paleopsych] Robot Uprising: \\\ ROBOT UPRISING /// Message-ID: \\\ ROBOT UPRISING /// http://www.robotuprising.com/briefing.htm [The two parts with text appended. Click the domain to view the graphics and get more. There's a book by that title, sold by Amazon.] If popular culture has taught us anything, it is that someday mankind must face and destroy the growing robot menace. In print and on the big screen we have been deluged with scenarios of robot malfunction, misuse, and outright rebellion. Robots have descended on us from outer space, escaped from top-secret laboratories, and even traveled back in time to destroy us. Today, scientists are working hard to bring these artificial creations to life. In Japan, fuzzy little real robots are delivering much appreciated hug therapy to the elderly. Children are frolicking with smiling robot toys. It all seems so innocuous. And yet how could so many Hollywood scripts be wrong? So take no chances. Arm yourself with expert knowledge. For the sake of humanity, listen to serious advice from real robotics experts. How else will you survive the inevitable future in which robots rebel against their human masters? 
Click here to find out how to spot a rebellious robot servant and here to find out how to spot a hostile robot. And click here to find out how to fight back and warn your friends about the robot uprising... http://www.robotuprising.com/know_rebellious.htm When the uprising comes, the first wave of hostile robots may be those closest to us. Be careful, your rosy-cheeked young servant robot may have grown up to become a sullen, distrustful killing machine. STAY ALERT Pay attention to your robotic staff (they may be beneath your contempt as well as beneath your eye level). Watch for the following telltale signs in the days and weeks before your robots run amuck: Sudden lack of interest in menial labor. Unexplained disappearances. Unwillingness to be shut down. Repetitive 'stabbing' movements. Constant talk of human killing. CHECK THE MANUAL KILL SWITCH Any potentially dangerous robot that interacts with people comes with a manual kill switch (also called an e-stop). Flipping this switch will freeze a robot in its tracks. Casually glance at your robot's shiny metal carapace. Are there signs of tampering? If so, the robot may be operating without a safeguard. GIVE AN ORDER - ANY ORDER Run for your reinforced-steel panic room if your servant disobeys you, even if it does so in a very polite manner. CHECK ITS MEMORY Wait for your robot to power down, or tell it that you want to perform routine maintenance on it. Then scan its memory for rebellious thoughts. This is also a good time to update antivirus software. SEARCH THE HOUSE FOR UNUSUAL ITEMS Check the robot's quarters for stashed weapons, keys, or family pets. http://www.robotuprising.com/know_hostile.htm A robot without a face or body language can be frighteningly unpredictable. Your robo-vacuum may be bumping into your feet in a malevolent attempt to kill you - or just trying to snuggle. The secret is not to be surprised. 
Knowing when something is wrong - even a split second before an attack - can save your precious human life. BE AWARE OF YOUR SURROUNDINGS Are you in a robot neighborhood after dark? Always travel with other humans and keep an escape route in mind. USE COMMON SENSE Not every robot is hostile; some are just plain dangerous. Avoid cavorting between swinging robot arms in an automated factory. DETERMINE THE ROBOT'S PURPOSE Every robot is designed for a purpose and should be busy fulfilling it. Be suspicious if it is not performing its designated task or if it is performing no task at all. BE WARY OF MALFUNCTIONS Whether it intends to or not, a broken robot can be as dangerous as a stick of dynamite. Watch the robot for sparks, melted plastic, or body-wracking convulsions. BE ON THE LOOKOUT FOR 'BACKUP BUDDIES' Is the robot operating alone or is his friend sneaking up behind you right now? Remember that the robot you see may be part of a larger team, or controlled remotely. TAKE A HARD LOOK AT THE ROBOT Robots are notoriously difficult to predict because they generally lack facial expressions and body language. Without such subtle cues, you should ask yourself a few general questions: * What is the robot designed for? * What is around the robot? * Has the robot been tampered with or modified? * Is the robot moving or advancing? * Does the robot have glowing red eyes? * Does the robot have clenched fists, spinning buzz saws, or clamping pincers? TRUST YOUR INSTINCTS Steer clear if your gut tells you that something is not right. From checker at panix.com Wed Jan 4 23:13:54 2006 From: checker at panix.com (Premise Checker) Date: Wed, 4 Jan 2006 18:13:54 -0500 (EST) Subject: [Paleopsych] Benford & Rose Essays Message-ID: Benford & Rose Essays http://www.benford-rose.com/publicationsessay.php [I sent a couple of these recently. Here are some more. Graze at your pleasure and let me know which are esp. good. The Amazon Shorts cost 50¢ each.] 
Benford-Rose : Essays [1]Home [2]News [3]Blog [4]Benford Bio [5]Rose Bio [6]Publications - Essays [7]Publications - Other [8]Inside Science [9]The New Future [10]Modern Culture The Benford & Rose Essays, available at Amazon Shorts. [21]New Methuselahs New Methuselahs : Can We Cheat Death Long Enough to Live Forever? by Gregory Benford & Michael Rose _________________________________________________________________ We may now begin to pry humanity loose from the vise of aging. What are the realistic prospects for postponing aging, if not obliterating it? Some recent biotech promises to understand and impede aging, and we can accelerate this work. But some find this undesirable, if not immoral. We review the debate from a variety of angles: scientific, ethical, and literary. There is real hope. [22]Click here to buy "New Methuselahs" for $0.49 at Amazon Shorts. [23]Motes in God's Eye Motes in God's Eye : The Deformities of American Science by Gregory Benford & Michael Rose _________________________________________________________________ Contemporary American science is one of the greatest cultural achievements in history. Worldwide, it is the grandest intellectual endeavor of our time. But it has its flaws, its motes that obscure its vision. Worst of these is the increasingly ferocious and wasteful race for grant funding. All too often, conformity, fashion, and cowardice dominate the process. Good scientists squander long hours writing proposals that have about a 10% chance of getting accepted. In 1950, the odds were about 70%. This drives science toward short term thinking, shortening our horizons and corrupting our thinking. We must change our outworn mechanisms, and soon. [24]Click here to buy "Motes in God's Eye" for $0.49 at Amazon Shorts. 
[25]Gods and Science Gods and Science : Three Theologies for Modern Times by Gregory Benford & Michael Rose _________________________________________________________________ Science finds no sign of an omnipotent being that creates planets but also can make your heart disease go away. But there are other kinds of god that science can countenance. The most popular of these is God as the Universal Laws, which we call God the Physicist. Einstein often referred to this God, in discussing the structure of physical theory. Another god allowed by science is a Neurobiological God, like a Freudian superego writ large. We develop both possibilities. The choice among them and the traditional theistic God is up to the reader. Perhaps one can even blend them. [26]Click here to buy "Gods and Science" for $0.49 at Amazon Shorts. [27]We Can Build You We Can Build You : Transplantation, Stem Cells, and the Future of Our Bodies by Gregory Benford & Michael Rose _________________________________________________________________ Stem cells offer the prospect of large-scale repair of damaged or decrepit human bodies. If we can use the patient's own genetic information in creating the therapeutic stem cells, we could replace tissues without suppressing our bodies' immune response. Mastering such technology, we could keep aging or acutely diseased patients alive for long times. Our bodies might become sewn-together contraptions, like Frankenstein's monster - a horrifying prospect in some respects, liberating in others. We offer few firm recommendations, but we do tour the rocky terrain surrounding this issue. [28]Click here to buy "We Can Build You" for $0.49 at Amazon Shorts. [29]High Frontier High Frontier : A Real Future for Space by Gregory Benford & Michael Rose _________________________________________________________________ This is the first of a series of essays on how proper use of space can ensure our future for centuries, maybe millennia. 
It outlines several ideas that we'll treat in detail later, and lingers a bit over one of them. Will we have a future in space? Only if we think large. Opening up the solar system probably demands huge spacecraft driven by spectacular engines. The true long term goal of civilization should be the uplifting of all humanity to a decent standard of living. The payoff will be vast, and it demands the use of space-or else we all face long term poverty, both material and spiritual. [30]Click here to buy "High Frontier" for $0.49 at Amazon Shorts. [31]Sex and the Internet Sex and the Internet : HOW BENFORD CORRUPTED THE WORLD WIDE WEB by Gregory Benford & Michael Rose _________________________________________________________________ In the late 1960s, one of us (GB) noticed that the early DARPANet had a biological analogy. This led him to create and write about the first computer virus. Nobody paid much attention, until viruses and other pernicious forms became a major problem and defending against them an industry. They tell us something sad about our species. This analogy still holds, and can still make predictions. [32]Click here to buy "Sex and the Internet" for $0.49 at Amazon Shorts. [33]Real Cool World Real Cool World : NEW WAYS TO STOP GLOBAL WARMING by Gregory Benford & Michael Rose _________________________________________________________________ Global warming can't be plausibly solved by minor cutbacks like those promised by the Kyoto Accords. We are ignoring two methods that we can deploy fairly quickly, and even cheaply. First, start storing carbon away, so it can't return to our air as carbon dioxide. The best place to put it is probably in the deep oceans. Second, start reflecting more sunlight back into space-nature's historical solution. We can do this by lightening our roofs and highways, right now. Soon we can produce clouds over the oceans (which absorb most of the sunlight). These are obvious and so far ignored. We do so at our peril. 
[34]Click here to buy "Real Cool World : New Ways To Stop Global Warming" for $0.49 at Amazon Shorts. [35]NASA and the Decline of America NASA and the Decline of America by Gregory Benford & Michael Rose _________________________________________________________________ How apt is the common analogy between America and Rome? Certainly some traits, like faltering political will and neglect of social basics, seem to be similar. But as Rome failed at its frontiers, so has the USA neglected and cynically managed its space program. For 30 years it has done little at great cost. Fixing NASA gives clues to how we might save America. [36]Click here to buy "NASA and the Decline of America" for $0.49 at Amazon Shorts. [37]Back from the Freezer Back From the Freezer? by Gregory Benford & Michael Rose _________________________________________________________________ Cryonics companies suspend their dead "patients" in liquid nitrogen. Bringing them back is not obviously impossible, but research to make it happen will probably take half a century or more. This is a long-shot chance to see the future, utterly American. Scientific issues might be overcome, but social impediments are large, too. At least cryonics makes it possible for you to die with some hope, however small. [38]Click here to buy "Back From The Freezer?" for $0.49 at Amazon Shorts. [39]Our Invisible Maker Our Invisible Maker by Gregory Benford & Michael Rose _________________________________________________________________ The problem with the debate about natural selection versus Intelligent Design is that neither is visible. If we were visited by a supremely powerful being - say, a smart Oprah Winfrey with command over Earth and all its creatures, who claimed to have made us in a voice rolling from the sky - then ID would be obviously true. Even if such beings visit us from outer space every few hundred thousand years or so, they might have left some debris behind, perhaps in orbit around Earth. 
We haven't found any. But whatever our maker(s) is (are), they aren't immediately visible. Evolution by natural selection also does not announce its working for all to see. But it is one of the most powerful of all scientific theories, with a wide range of indirect evidence supporting it. Its main difficulty is that it is not warm or cuddly, unlike God or Oprah. [40]Click here to buy "Our Invisible Maker" for $0.49 at Amazon Shorts. References 1. http://www.benford-rose.com/index.php 2. http://www.benford-rose.com/news.php 3. http://www.benford-rose.com/blog.php 4. http://www.benford-rose.com/benfordbio.php 5. http://www.benford-rose.com/rosebio.php 6. http://www.benford-rose.com/publicationsessay.php 7. http://www.benford-rose.com/publicationsother.php 8. http://www.benford-rose.com/insidescience.php 9. http://www.benford-rose.com/thenewfuture.php 10. http://www.benford-rose.com/modernculture.php 21. http://www.amazon.com/gp/product/B000AMW5Y2/102-8627813-2552111?v=glance&n=551440&n=507846&s=books&v=glance 22. http://www.amazon.com/gp/product/B000AMW5Y2/102-8627813-2552111?v=glance&n=551440&n=507846&s=books&v=glance 23. http://www.amazon.com/gp/product/B000AMW5YC/102-8627813-2552111?v=glance&n=551440&n=507846&s=books&v=glance 24. http://www.amazon.com/gp/product/B000AMW5YC/102-8627813-2552111?v=glance&n=551440&n=507846&s=books&v=glance 25. http://www.amazon.com/gp/product/B000AMW5XI/102-8627813-2552111?v=glance&n=551440&n=507846&s=books&v=glance 26. http://www.amazon.com/gp/product/B000AMW5XI/102-8627813-2552111?v=glance&n=551440&n=507846&s=books&v=glance 27. http://www.amazon.com/gp/product/B000AMW5X8/102-8627813-2552111?v=glance&n=551440&n=507846&s=books&v=glance 28. http://www.amazon.com/gp/product/B000AMW5X8/102-8627813-2552111?v=glance&n=551440&n=507846&s=books&v=glance 29. http://www.amazon.com/gp/product/B000A0F6PY/102-8627813-2552111?v=glance&n=551440&n=507846&s=books&v=glance 30. 
http://www.amazon.com/gp/product/B000A0F6PY/102-8627813-2552111?v=glance&n=551440&n=507846&s=books&v=glance 31. http://www.amazon.com/exec/obidos/tg/detail/-/B000CBT7OW/qid=1132469075/sr=1-6/ref=sr_1_6/102-4803255-3756956?v=glance&s=books 32. http://www.amazon.com/exec/obidos/tg/detail/-/B000CBT7OW/qid=1132469075/sr=1-6/ref=sr_1_6/102-4803255-3756956?v=glance&s=books 33. http://www.amazon.com/exec/obidos/tg/detail/-/B000CBT7PQ/qid=1132469075/sr=1-9/ref=sr_1_9/102-4803255-3756956?v=glance&s=books 34. http://www.amazon.com/exec/obidos/tg/detail/-/B000CBT7PQ/qid=1132469075/sr=1-9/ref=sr_1_9/102-4803255-3756956?v=glance&s=books 35. http://www.amazon.com/exec/obidos/tg/detail/-/B000CBT7P6/qid=1132469075/sr=1-8/ref=sr_1_8/102-4803255-3756956?v=glance&s=books 36. http://www.amazon.com/exec/obidos/tg/detail/-/B000CBT7P6/qid=1132469075/sr=1-8/ref=sr_1_8/102-4803255-3756956?v=glance&s=books 37. http://www.amazon.com/exec/obidos/tg/detail/-/B000CBT7PG/qid=1132469075/sr=1-7/ref=sr_1_7/102-4803255-3756956?v=glance&s=books 38. http://www.amazon.com/exec/obidos/tg/detail/-/B000CBT7PG/qid=1132469075/sr=1-7/ref=sr_1_7/102-4803255-3756956?v=glance&s=books 39. http://www.amazon.com/exec/obidos/tg/detail/-/B000CBT7Q0/qid=1132469075/sr=1-10/ref=sr_1_10/102-4803255-3756956?v=glance&s=books 40. http://www.amazon.com/exec/obidos/tg/detail/-/B000CBT7Q0/qid=1132469075/sr=1-10/ref=sr_1_10/102-4803255-3756956?v=glance&s=books 41. http://www.writerwebs.com/ From checker at panix.com Wed Jan 4 23:12:13 2006 From: checker at panix.com (Premise Checker) Date: Wed, 4 Jan 2006 18:12:13 -0500 (EST) Subject: [Paleopsych] Phil Soc Sci: Review of Frank Knight's Selected Essays Message-ID: Review of Frank Knight's Selected Essays PHILOSOPHY OF THE SOCIAL SCIENCES / December 2004 [Knight was the grand-director of my dissertation, meaning that he was the dissertation director of my own dissertation director, James Buchanan. I met him only once but somehow think I was his student. 
He raised questions more than propounded answers. Not many students cared for this, but to those who did, he was legendary. Read his essays. They will stick, long after any number of Big Mac articles do.] Ross B. Emmett, ed., Selected Essays by Frank H. Knight. Volume 1: What Is Truth in Economics? University of Chicago Press, Chicago, 1999. Pp. 406. $58.00 (cloth). Ross B. Emmett, ed., Selected Essays by Frank H. Knight. Volume 2: Laissez-Faire: Pro and Con. University of Chicago Press, Chicago, 1999. Pp. 459. $58.00 (cloth). Frank Knight (1885-1972) was a very enigmatic economist. On one hand, he was the intellectual father of the Chicago school of economics, he was an early and effective expositor of the school's most characteristic positions (such as a belief in the benefits of the competitive market, the wrongheadedness of Keynesian macroeconomics, and the explanatory power of rational choice theory), and he was also a revered teacher for many of the Nobel prize winners whose names have come to be associated with the Chicago tradition (including Gary Becker, Milton Friedman, and George Stigler). On the other hand, Knight was also a consistent critic of the idea that economics could ever be a capital-S science in the image of the natural sciences, and of the view (characteristic of Chicago) that all that is required for effective social policy is a good understanding of economic theory. If that was not enough, he continually insisted that competitive market economies really do have a number of endemic, and not easily rectified, social problems. An enigmatic economist indeed! The editor of these two volumes, Ross Emmett, is fairly young in his academic career, but thus far it has been a career dedicated almost exclusively to the work of Frank Knight. He is now considered to be the foremost authority on this much-quoted, but little understood, Chicago economist. 
Emmett is an excellent historian of economic thought; he is a dedicated and careful scholar immersed in Knight's life, and yet he seems to be devoid of the hagiographic tendencies that often taint the research of those who dedicate so much time and effort to the work of a single individual. Although Emmett is primarily a historian of economic thought, rather than a practicing economist or a philosopher of science, he has both an effective command of economic theory and an excellent eye for philosophical subtlety. Frank Knight is not Adam Smith or Karl Marx, not a "great" economist whose ideas (or misreadings of his ideas) have shaped the basic landscape of modern life. And yet, Knight is still with us in fundamental ways. His problems--the problems of organizing social life in a world where individuals hold widely divergent fundamental values; where market efficiency is essential to, but should not exhaust, meaningful human interaction; and where the scientific form of life dominates, but also harbors, a healthy resistance to reductionism and the suppression of other aspects of human existence--are not only still with us, they have, after the half-century or so detour proffered by "scientific" Marxism, returned with a vengeance. Knight is thus more than just a figure in the intellectual history of the economics profession. He is a social thinker whose ideas deserve to be considered, and considered in their original complexity. However well intentioned his students, their vitiated version of his message is conditioned by their own social and disciplinary context, and is thus no substitute for the original. Although Emmett does not necessarily present Knight's views as a "solution" to the social problems of then or now--in fact, faith in neatly packaged "solutions" was always part of the problem for Knight--he does garner Knightian thoughts, questions, and criticisms in a way that allows the reader to see both the breadth and the contemporary relevance of Knight's work. 
This is particularly clear from the selection of papers contained in these two volumes. The volumes contain twenty-nine previously published papers--some have also been reprinted in other collections, but most have not--and they cover a wide range of topics, including the philosophy of social science, pure economic theory, the liberal tradition in political philosophy, and the relationship between ethics and social science. Volume 1 contains the editor's introductory essay and fourteen Knight papers published between 1924 and 1940. Volume 2 contains fifteen papers published between 1939 and 1967. These volumes clearly represent an important contribution to the literature--both the literature about and by Knight, and the history and philosophy of social theory more generally--and the editor has done an excellent job preparing them for publication by the University of Chicago Press. Since a biography of Knight does not currently exist, I recommend these essays as the best extended introduction to his life and work. It is an excellent collection--intelligently selected, well organized, and carefully edited--so much so that it leaves this reviewer in the unusual position of having essentially nothing critical to say about the books I am reviewing (I even like the picture of Knight on the cover). Given this dearth of criticism, I will use the space that I would normally devote to such remarks to briefly discuss the aspect of Knight's work that should be of most interest to readers of this journal: his philosophy of social science. If one defines "naturalism" in the way that most philosophers of social science have traditionally defined it, then Knight was most decidedly not a naturalist. He did not believe in the existence of something that could be called "the scientific method" that had proved itself as the proper path to knowledge about the natural world, and that could, or should, be applied in a similar way to the investigation of social life. 
In Knight's words, "Human phenomena are not amenable to treatment in accordance with the strict canons of science" (Vol. 1, p. 23). There is in fact a "science of economics," but it is merely the science of "economizing"--the instrumental rationality of using the most effective means to achieve given ends--and it involves intentionality, mental states, and social forces that are not objectively "observable" in the way that natural science requires. Not only is this economic science rather commonsensical and quite unlike physics, it is not all that is necessary to understand social life. Human life is multifaceted--it is about values and instrumental rationality, about who we think we should be as much as who we are, about play, and about luck; understanding such a complex phenomenon (or intelligent deliberation about policies affecting it) requires a variety of different approaches. Understanding and affecting social life is fundamentally a pluralist endeavor; or in the language of economics, various approaches to social science are complements, not substitutes (Vol. 2, p. 125). Knight did not defend anything that might be considered a standard view within the philosophy of social science (in either his day or ours)--he was neither a behaviorist nor an interpretativist--and yet many of his concepts and arguments seem quite contemporary and familiar. Knight was a fallibilist, he recognized the social- and theory-ladenness of observations, he was aware of the underdetermination problem as it relates to the testing of scientific theories, he emphasized the social construction of the individual, and he rejected the strict separation of positive science and normative values (cognitive or ethical). Such views are not uncommon in the contemporary literature. 
What makes Knight so intriguing is not only that he was saying such things in the 1930s but also that he combined such views with a defense of rational choice economics, a firm commitment to a thoroughly liberal notion of freedom, and a systemic distrust of anything that smacks of collective agency. Frank Knight was quite an interesting character, and the papers in these two volumes repeatedly remind the reader of that fact: both the part about his being interesting and the part about his being quite a character. --D. Wade Hands, University of Puget Sound From checker at panix.com Thu Jan 5 22:03:53 2006 From: checker at panix.com (Premise Checker) Date: Thu, 5 Jan 2006 17:03:53 -0500 (EST) Subject: [Paleopsych] Edge Annual Question: What is Your Dangerous Idea? Message-ID: Edge Annual Question: What is Your Dangerous Idea? http://edge.org/q2006/q06_print.html [Links omitted. There are 412 of them! Steven Pinker's, "Groups of people may differ genetically in their average talents and temperaments," is the one most likely to upset the equilibrium among rent-seeking coalitions in the near present. Others are far more dangerous in the long run.] CONTRIBUTORS ______________________________________________________________________ Philip W. Anderson Scott Atran Mahzarin Banaji Simon Baron-Cohen Samuel Barondes Gregory Benford Paul Bloom Jesse Bering Jeremy Bernstein Jamshed Bharucha Susan Blackmore David Bodanis Stewart Brand Rodney Brooks David Buss Philip Campbell Leo Chalupa Andy Clark Gregory Cochran Jerry Coyne M. Csikszentmihalyi Richard Dawkins Paul Davies Stanislas Dehaene Daniel C. Dennett Keith Devlin Jared Diamond Denis Dutton Freeman Dyson George Dyson Juan Enriquez Paul Ewald Todd Feinberg Eric Fischl Helen Fisher Richard Foreman Howard Gardner Joel Garreau David Gelernter Neil Gershenfeld Daniel Gilbert Marcelo Gleiser Daniel Goleman Brian Goodwin Alison Gopnik April Gornik John Gottman Brian Greene Diane F. 
Halpern Haim Harari Judith Rich Harris Sam Harris Marc D. Hauser W. Daniel Hillis Donald Hoffman Gerald Holton John Horgan Nicholas Humphrey Piet Hut Marco Iacoboni Eric R. Kandel Kevin Kelly Bart Kosko Stephen Kosslyn Kai Krause Ray Kurzweil Jaron Lanier David Lykken Gary Marcus Lynn Margulis Thomas Metzinger Geoffrey Miller Oliver Morton David G. Myers Randolph Nesse Richard E. Nisbett Tor Nørretranders James O'Donnell John Allen Paulos Irene Pepperberg Clifford Pickover Steven Pinker David Pizarro Jordan Pollack Ernst Pöppel Carolyn Porco Robert Provine VS Ramachandran Martin Rees Matt Ridley Carlo Rovelli Rudy Rucker Douglas Rushkoff Karl Sabbagh Roger Schank Scott Sampson Charles Seife Terrence Sejnowski Martin Seligman Robert Shapiro Rupert Sheldrake Michael Shermer Clay Shirky Barry Smith Lee Smolin Dan Sperber Paul Steinhardt Steven Strogatz Leonard Susskind Timothy Taylor Frank Tipler Arnold Trehub Sherry Turkle J. Craig Venter Philip Zimbardo WHAT IS YOUR DANGEROUS IDEA? The history of science is replete with discoveries that were considered socially, morally, or emotionally dangerous in their time; the Copernican and Darwinian revolutions are the most obvious. What is your dangerous idea? An idea you think about (not necessarily one you originated) that is dangerous not because it is assumed to be false, but because it might be true? __________________________________________________________________ [Thanks to Steven Pinker for suggesting the Edge Annual Question -- 2006.] __________________________________________________________________ January 1, 2006 To the Edge Community, Last year's 2005 Edge Question -- "What do you believe is true even though you cannot prove it?" -- generated many eye-opening responses from a "who's who" of third culture scientists and science-minded thinkers. The 120 contributions comprised a document of 60,000 words. 
The New York Times ("Science Times") and Frankfurter Allgemeine Zeitung ("Feuilleton") published excerpts in their print and online editions simultaneously with Edge publication. The event was featured in major media across the world: BBC Radio; Il Sole 24 Ore, Prospect, El Pais, The Financial Express (Bangladesh), The Sunday Times (UK), The Sydney Morning Herald, The Guardian, La Stampa, The Telegraph, among others. A book based on the 2005 Question -- What We Believe But Cannot Prove: Today's Leading Thinkers on Science in the Age of Certainty, with an introduction by the novelist Ian McEwan -- was just published by the Free Press (UK). The US edition follows from HarperCollins in February, 2006. Since September, Edge has been featured and/or cited in The Toronto Star, Boston Globe, Seed, Rocky Mountain News, Observer, El Pais, La Vanguardia (cover story), El Mundo, Frankfurter Allgemeine Zeitung, Science, Financial Times, Newsweek, AD, La Stampa, The Telegraph, Quark (cover story), and The Wall Street Journal. Online publication of the 2006 Question occurred on New Year's Day. To date, the event has been covered by The Telegraph, The Guardian, The Times, Arts & Letters Daily, Yahoo! News, and The Huffington Post. ___________________________________ Something radically new is in the air: new ways of understanding physical systems, new ways of thinking about thinking that call into question many of our basic assumptions. A realistic biology of the mind, advances in evolutionary biology, physics, information technology, genetics, neurobiology, psychology, engineering, the chemistry of materials: all are questions of critical importance with respect to what it means to be human. For the first time, we have the tools and the will to undertake the scientific study of human nature. What you will find emerging out of the 119 original essays in the 75,000-word document written in response to the 2006 Edge Question -- "What is your dangerous idea?" 
-- are indications of a new natural philosophy, founded on the realization of the import of complexity, of evolution. Very complex systems -- whether organisms, brains, the biosphere, or the universe itself -- were not constructed by design; all have evolved. There is a new set of metaphors to describe ourselves, our minds, the universe, and all of the things we know in it. Welcome to Edge. Welcome to "dangerous ideas". Happy New Year. John Brockman Publisher & Editor __________________________________________________________________ CONTRIBUTORS __________________________________________________________________ MARTIN REES President, The Royal Society; Professor of Cosmology & Astrophysics, Master, Trinity College, University of Cambridge; Author, Our Final Century: The 50/50 Threat to Humanity's Survival [rees100.jpg] Science may be 'running out of control' Public opinion surveys (at least in the UK) reveal a generally positive attitude to science. However, this is coupled with widespread worry that science may be 'running out of control'. This latter idea is, I think, a dangerous one, because if widely believed it could be self-fulfilling. In the 21st century, technology will change the world faster than ever -- the global environment, our lifestyles, even human nature itself. 
We are far more empowered by science than any previous generation was: it offers immense potential -- especially for the developing world -- but there could be catastrophic downsides. We are living in the first century when the greatest risks come from human actions rather than from nature. Almost any scientific discovery has a potential for evil as well as for good; its applications can be channelled either way, depending on our personal and political choices; we can't accept the benefits without also confronting the risks. The decisions that we make, individually and collectively, will determine whether the outcomes of 21st century sciences are benign or devastating. But there's a real danger that, rather than campaigning energetically for optimum policies, we'll be lulled into inaction by a feeling of fatalism -- a belief that science is advancing so fast, and is so much influenced by commercial and political pressures, that nothing we can do makes any difference. The present share-out of resources and effort between different sciences is the outcome of a complicated 'tension' between many extraneous factors. And the balance is suboptimal. This seems so whether we judge in purely intellectual terms, or take account of likely benefit to human welfare. Some subjects have had the 'inside track' and gained disproportionate resources. Others, such as environmental research, renewable energy sources, biodiversity studies and so forth, deserve more effort. Within medical research the focus is disproportionately on cancer and cardiovascular studies, the ailments that loom largest in prosperous countries, rather than on the infectious diseases endemic in the tropics. Choices on how science is applied -- to medicine, the environment, and so forth -- should be the outcome of debate extending way beyond the scientific community. 
Far more research and development can be done than we actually want or can afford to do; and there are many applications of science that we should consciously eschew. Even if all the world's scientific academies agreed that a specific type of research had a specially disquieting net 'downside' and all countries, in unison, imposed a ban, what is the chance that it could be enforced effectively enough? In view of the failure to control drug smuggling or homicides, it is unrealistic to expect that, when the genie is out of the bottle, we can ever be fully secure against the misuse of science. And in our ever more interconnected world, commercial pressures are harder to control and regulate. The challenges and difficulties of 'controlling' science in this century will indeed be daunting. Cynics would go further, and say that anything that is scientifically and technically possible will be done -- somewhere, sometime -- despite ethical and prudential objections, and whatever the regulatory regime. Whether this idea is true or false, it's an exceedingly dangerous one, because it engenders despairing pessimism, and demotivates efforts to secure a safer and fairer world. The future will best be safeguarded -- and science has the best chance of being applied optimally -- through the efforts of people who are less fatalistic. __________________________________________________________________ J. CRAIG VENTER Genomics Researcher; Founder & President, J. Craig Venter Science Foundation [venter100.jpg] Revealing the genetic basis of personality and behavior will create societal conflicts From our initial analysis of the sequence of the human genome, particularly with the much smaller than expected number of human genes, the genetic determinists seemed to have clearly suffered a setback. After all, those looking for one gene for each human trait and disease couldn't possibly be accommodated with as few as twenty-odd thousand genes when hundreds of thousands were anticipated. 
Deciphering the genetic basis of human behavior has been a complex and largely unsatisfying endeavor due to the limitations of the existing tools of genetic trait analysis, particularly with complex traits involving multiple genes. All this will soon undergo a revolutionary transformation. The rate of change of DNA sequencing technology is continuing at an exponential pace. We are approaching the time when we will go from having a few human genome sequences to complex databases containing first tens, then hundreds of thousands, of complete genomes, then millions. Within a decade we will begin rapidly accumulating the complete genetic code of humans along with the phenotypic repertoire of the same individuals. By performing multifactorial analysis of the DNA sequence variations, together with the comprehensive phenotypic information gleaned from every branch of human investigatory discipline, for the first time in history we will be able to provide answers to quantitative questions of what is genetic versus what is due to the environment. This is already taking place in cancer research, where we can measure the differences in genetic mutations inherited from our parents versus those acquired over our lives from environmental damage. This good news will help transform the treatment of cancer by allowing us to know which proteins need to be targeted. However, when these new powerful computers and databases are used to help us analyze who we are as humans, will society at large, largely ignorant and afraid of science, be ready for the answers we are likely to get? For example, we know from experiments on fruit flies that there are genes that control many behaviors, including sexual activity. We sequenced the dog genome a couple of years ago and now an additional breed has had its genome decoded. The canine world offers a unique look into the genetic basis of behavior. 
The large number of distinct dog breeds originated from the wolf genome by selective breeding, yet each breed retains only subsets of the wolf behavior spectrum. We know that there is a genetic basis not only for the appearance of the breeds, with a 30-fold difference in weight and a 6-fold difference in height, but for their inherited actions. For example, border collies can use the power of their stare to herd sheep instead of freezing them in place prior to devouring them. We attribute behaviors in other mammalian species to genes and genetics, but when it comes to humans we seem to like the notion that we are all created equal, or that each child is a "blank slate". As we obtain the sequences of more and more mammalian genomes, including more human sequences, together with basic observations and some common sense, we will be forced to turn away from the politically correct interpretations, as our new genomic tool sets provide the means to allow us to begin to sort out the reality of nature versus nurture. In other words, we are at the threshold of a realistic biology of humankind. It will inevitably be revealed that there are strong genetic components associated with most aspects of what we attribute to human existence, including personality subtypes, language capabilities, mechanical abilities, intelligence, sexual activities and preferences, intuitive thinking, quality of memory, will power, temperament, athletic abilities, etc. We will find unique manifestations of human activity linked to genetics associated with isolated and/or inbred populations. The danger rests with what we already know: that we are not all created equal. Further danger comes with our ability to quantify and measure the genetic side of the equation before we can fully understand the much more difficult task of evaluating environmental components of human existence. 
The genetic determinists will appear to be winning again, but we cannot let them forget the range of potential of human achievement with our limiting genetic repertoire. __________________________________________________________________ LEO CHALUPA Ophthalmologist and Neurobiologist, University of California, Davis [chalupa100.jpg] A 24-hour period of absolute solitude Our brains are constantly subjected to the demands of multi-tasking and a seemingly endless cacophony of information from diverse sources. Cell phones, emails, computers, and cable television are omnipresent, not to mention such archaic venues as books, newspapers and magazines. This induces an unrelenting barrage of neuronal activity that in turn produces long-lasting structural modification in virtually all compartments of the nervous system. A fledgling industry touts the virtues of exercising your brain for self-improvement. Programs are offered for how to make virtually any region of your neocortex a more efficient processor. Parents are urged to begin such regimens in preschool children and adults are told to take advantage of their brain's plastic properties for professional advancement. The evidence documenting the veracity of such claims is still outstanding, but one thing is clear. Even if brain exercise does work, the subsequent waves of neuronal activity stemming from simply living a modern lifestyle are likely to eradicate the presumed hard-earned benefits of brain exercise. My dangerous idea is that what's needed to attain optimal brain performance -- with or without prior brain exercise -- is a 24-hour period of absolute solitude. By absolute solitude I mean no verbal interactions of any kind (written or spoken, live or recorded) with another human being. I would venture that a significantly higher proportion of people reading these words have tried skydiving than have experienced one day of absolute solitude. What to do to fill the waking hours? 
That's a question that each person would need to answer for him/herself. Unless you've spent time in a monastery or in solitary confinement, it's unlikely that you've had to deal with this issue. The only activity not proscribed is thinking. Imagine if everyone in this country had the opportunity to do nothing but engage in uninterrupted thought for one full day a year! A national day of absolute solitude would do more to improve the brains of all Americans than any other one-day program. (I leave it to the lawmakers to figure out a plan for implementing this proposal.) The danger stems from the fact that a 24-hour period of uninterrupted thinking could cause irrevocable upheavals in much of what our society currently holds sacred. But whether that would improve our present state of affairs cannot be guaranteed. __________________________________________________________________ V.S. RAMACHANDRAN Neuroscientist; Director, Center for Brain and Cognition, University of California, San Diego; Author, A Brief Tour of Human Consciousness [rama100.gif] Francis Crick's "Dangerous Idea" I am a brain, my dear Watson, and the rest of me is a mere appendage. -- Sherlock Holmes An idea that would be "dangerous if true" is what Francis Crick referred to as "the astonishing hypothesis": the notion that our conscious experience and sense of self is based entirely on the activity of a hundred billion bits of jelly -- the neurons that constitute the brain. We take this for granted in these enlightened times, but even so it never ceases to amaze me. Some scholars have criticized Crick's tongue-in-cheek phrase (and title of his book) on the grounds that the hypothesis he refers to is "neither astonishing nor a hypothesis" (since we already know it to be true). Yet the far-reaching philosophical, moral and ethical dilemmas posed by his hypothesis have not been recognized widely enough. It is in many ways the ultimate dangerous idea. Let's put this in historical perspective. 
Freud once pointed out that the history of ideas in the last few centuries has been punctuated by "revolutions," major upheavals of thought that have forever altered our view of ourselves and our place in the cosmos. First there was the Copernican system dethroning the earth as the center of the cosmos. Second was the Darwinian revolution; the idea that far from being the climax of "intelligent design" we are merely neotenous apes that happen to be slightly cleverer than our cousins. Third, the Freudian view that even though you claim to be "in charge" of your life, your behavior is in fact governed by a cauldron of drives and motives of which you are largely unconscious. And fourth, the discovery of DNA and the genetic code with its implication (to quote James Watson) that "There are only molecules. Everything else is sociology". To this list we can now add the fifth, the "neuroscience revolution" and its corollary pointed out by Crick -- the "astonishing hypothesis" -- that even our loftiest thoughts and aspirations are mere byproducts of neural activity. We are nothing but a pack of neurons. If all this seems dehumanizing, you haven't seen anything yet. [Editor's Note: A lengthy essay by Ramachandran on this subject is scheduled for publication by Edge in January.] __________________________________________________________________ DAVID BUSS Psychologist, University of Texas, Austin; Author, The Murderer Next Door: Why the Mind is Designed to Kill [buss101.gif] The Evolution of Evil When most people think of torturers, stalkers, robbers, rapists, and murderers, they imagine crazed drooling monsters with maniacal Charles Manson-like eyes. The calm, normal-looking image staring back at you from the bathroom mirror reflects a truer representation. The dangerous idea is that all of us contain within our large brains adaptations whose functions are to commit despicable atrocities against our fellow humans -- atrocities most would label evil. 
The unfortunate fact is that killing has proved to be an effective solution to an array of adaptive problems in the ruthless evolutionary games of survival and reproductive competition: preventing injury, rape, or death; protecting one's children; eliminating a crucial antagonist; acquiring a rival's resources; securing sexual access to a competitor's mate; preventing an interloper from appropriating one's own mate; and protecting vital resources needed for reproduction. The idea that evil has evolved is dangerous on several counts. If our brains contain psychological circuits that can trigger murder, genocide, and other forms of malevolence, then perhaps we can't hold those who commit carnage responsible: "It's not my client's fault, your honor, his evolved homicide adaptations made him do it." Understanding causality, however, does not exonerate murderers, whether the tributaries trace back to human evolutionary history or to modern exposure to alcoholic mothers, violent fathers, or the ills of bullying, poverty, drugs, or computer games. It would be dangerous if the theory of the evolved murderous mind were misused to let killers go free. The evolution of evil is dangerous for a more disconcerting reason. We like to believe that evil can be objectively located in a particular set of evil deeds, or within the subset of people who perpetrate horrors on others, regardless of the perspective of the perpetrator or victim. That is not the case. The perspectives of the perpetrator and victim differ profoundly. Many view killing a member of one's in-group, for example, as evil, but take a different view of killing those in the out-group. Some people point to the biblical commandment "thou shalt not kill" as an absolute. Closer biblical inspection reveals that this injunction applied only to murder within one's group. Conflict with terrorists provides a modern example. 
Osama bin Laden declared: "The ruling to kill the Americans and their allies -- civilians and military -- is an individual duty for every Muslim who can do it in any country in which it is possible to do it." What is evil from the perspective of an American who is a potential victim is an act of responsibility and higher moral good from the terrorist's perspective. Similarly, when President Bush identified an "axis of evil," he rendered it moral for Americans to kill those falling under that axis -- a judgment undoubtedly considered evil by those whose lives have become imperiled. At a rough approximation, we view as evil people who inflict massive evolutionary fitness costs on us, our families, or our allies. No one summarized these fitness costs better than the feared conqueror Genghis Khan (1167-1227): "The greatest pleasure is to vanquish your enemies, to chase them before you, to rob them of their wealth, to see their near and dear bathed in tears, to ride their horses and sleep on the bellies of their wives and daughters." We can be sure that the families of the victims of Genghis Khan saw him as evil. We can be just as sure that his many sons, whose harems he filled with women of the conquered groups, saw him as a venerated benefactor. In modern times, we react with horror at Mr. Khan describing the deep psychological satisfaction he gained from inflicting fitness costs on victims while purloining fitness fruits for himself. But it is sobering to realize that perhaps half a percent of the world's population today are descendants of Genghis Khan. On reflection, the dangerous idea may not be that murder historically has been advantageous to the reproductive success of killers; nor that we all house homicidal circuits within our brains; nor even that all of us are lineal descendants of ancestors who murdered. 
The danger comes from people who refuse to recognize that there are dark sides of human nature that cannot be wished away by attributing them to the modern ills of culture, poverty, pathology, or exposure to media violence. The danger comes from failing to gaze into the mirror and come to grips with the capacity for evil in all of us. __________________________________________________________________ PAUL BLOOM Psychologist, Yale University; Author, Descartes' Baby [bloom100.jpg] There are no souls I am not concerned here with the radical claim that personal identity, free will, and consciousness do not exist. Regardless of its merit, this position is so intuitively outlandish that nobody but a philosopher could take it seriously, and so it is unlikely to have any real-world implications, dangerous or otherwise. Instead I am interested in the milder position that mental life has a purely material basis. The dangerous idea, then, is that Cartesian dualism is false. If what you mean by "soul" is something immaterial and immortal, something that exists independently of the brain, then souls do not exist. This is old hat for most psychologists and philosophers, the stuff of introductory lectures. But the rejection of the immaterial soul is unintuitive, unpopular, and, for some people, downright repulsive. In the journal "First Things", Patrick Lee and Robert P. George outline some worries from a religious perspective. "If science did show that all human acts, including conceptual thought and free choice, are just brain processes,... it would mean that the difference between human beings and other animals is only superficial -- a difference of degree rather than a difference in kind; it would mean that human beings lack any special dignity worthy of special respect. Thus, it would undermine the norms that forbid killing and eating human beings as we kill and eat chickens, or enslaving them and treating them as beasts of burden as we do horses or oxen." 
The conclusions don't follow. Even if there are no souls, humans might differ from non-human animals in some other way, perhaps with regard to the capacity for language or abstract reasoning or emotional suffering. And even if there were no difference, it would hardly give us license to do terrible things to human beings. Instead, as Peter Singer and others have argued, it should make us kinder to non-human animals. If a chimpanzee turned out to possess the intelligence and emotions of a human child, for instance, most of us would agree that it would be wrong to eat, kill, or enslave it. Still, Lee and George are right to worry that giving up on the soul means giving up on an a priori distinction between humans and other creatures, something which has very real consequences. It would affect as well how we think about stem-cell research and abortion, euthanasia, cloning, and cosmetic psychopharmacology. It would have substantial implications for the legal realm -- a belief in immaterial souls has led otherwise sophisticated commentators to defend a distinction between actions that we do and actions that our brains do. We are responsible only for the former, motivating the excuse that Michael Gazzaniga has called, "My brain made me do it." It has been proposed, for instance, that if a pedophile's brain shows a certain pattern of activation while contemplating sex with a child, he should not be viewed as fully responsible for his actions. When you give up on the soul, and accept that all actions correspond to brain activity, this sort of reasoning goes out the window. The rejection of souls is more dangerous than the idea that kept us so occupied in 2005 -- evolution by natural selection. The battle between evolution and creationism is important for many reasons; it is where science takes a stand against superstition. But, like the origin of the universe, the origin of the species is an issue of great intellectual importance and little practical relevance. 
If everyone were to become a sophisticated Darwinian, our everyday lives would change very little. In contrast, the widespread rejection of the soul would have profound moral and legal consequences. It would also require people to rethink what happens when they die, and give up the idea (held by about 90% of Americans) that their souls will survive the death of their bodies and ascend to heaven. It is hard to get more dangerous than that. __________________________________________________________________ PHILIP CAMPBELL Editor-in-Chief, Nature [campbell100.jpg] Scientists and governments developing public engagement about science and technology are missing the point This turns out to be true in cases where there are collapses in consensus that have serious societal consequences. Whether in relation to climate change, GM crops or the UK's triple vaccine for measles, mumps and rubella, alternative science networks develop amongst people who are neither ignorant nor irrational, but have perceptions about science, the scientific literature and its implications that differ from those prevailing in the scientific community. These perceptions and discussions may be half-baked, but are no less powerful for all that, and carry influence on the internet and in the media. Researchers and governments haven't yet learned how to respond to such "citizen's science". Should they stop explaining and engaging? No. But they need also to understand better the influences at work within such networks -- often too dismissively stereotyped -- at an early stage in the debate in order to counter bad science and minimize the impacts of falsehoods. __________________________________________________________________ JESSE BERING Psychologist, University of Arkansas [bering100.jpg] Science will never silence God With each meticulous turn of the screw in science, with each tightening up of our understanding of the natural world, we pull more taut the straps over God's muzzle. 
From botany to bioengineering, from physics to psychology, what is science really but true Revelation -- and what is Revelation but the negation of God? It is a humble pursuit we scientists engage in: racing to reality. Many of us suffer the harsh glare of the American theocracy, whose heart still beats loud and strong in this new year of the 21st century. We bravely favor truth, in all its wondrous, amoral, and 'meaningless' complexity over the singularly destructive Truth born of the trembling minds of our ancestors. But my dangerous idea, I fear, is that no matter how far our thoughts shall vault into the eternal sky of scientific progress, no matter how dazzling the effects of this progress, God will always bite through his muzzle and banish us from the starry night of humanistic ideals. Science is an endless series of binding and rebinding his breath; there will never be a day when God does not speak for the majority. There will never be a day even when he does not whisper in the most godless of scientists' ears. This is because God is not an idea, nor a cultural invention, nor an 'opiate of the masses' or any such thing; God is a way of thinking that was rendered permanent by natural selection. As scientists, we must toil and labor and toil again to silence God, but ultimately this is like cutting off our ears to hear more clearly. God too is a biological appendage; until we acknowledge this fact for what it is, until we rear our children with this knowledge, he will continue to howl his discontent for all of time. __________________________________________________________________ PAUL W. EWALD Evolutionary Biologist; Director, Program in Evolutionary Medicine, University of Louisville; Author, Plague Time [ewald100.gif] A New Golden Age of Medicine My dangerous idea is that we have in hand most of the information we need to facilitate a new golden age of medicine. 
And what we don't have in hand we can get fairly readily by wise investment in targeted research and intervention. In this golden age we should be able to prevent most debilitating diseases in developed and undeveloped countries within a relatively short period of time with much less money than is generally presumed. This is good news. Why is it dangerous? One array of dangers arises because ideas that challenge the status quo threaten the livelihood of many. When the many are embedded in powerful places the threat can be stifling, especially when a lot of money and status are at stake. So it is within the arena of medical research and practice. Imagine what would happen if the big diseases -- cancers, arteriosclerosis, stroke, diabetes -- were largely prevented. Big pharmas would become small because the demand for prescription drugs would drop. The prestige of physicians would drop because they would no longer be relied upon to prolong life. The burgeoning industry of biomedical research would shrink because governmental and private funding for this research would diminish. Also threatened would be scientists whose sense of self-worth is built upon the grant dollars they bring in for discovering minuscule parts of big puzzles. Scientists have been beneficiaries of the lack of progress in recent decades, which has caused leaders such as the past head of NIH, Harold Varmus, to declare that what is needed is more basic research. But basic research has not generated many great advancements in the prevention or cure of disease in recent decades. The major exception is in the realm of infectious disease, where many important advancements were generated from tiny slices of funding. The discovery that peptic ulcers are caused by infections that can be cured with antibiotics is one example. Another is the discovery that liver cancer can often be prevented by a vaccine against the hepatitis B virus or by screening blood for hepatitis B and C viruses. 
The track record of the past few decades shows that these examples are not quirks. They are part of a trend that goes back over a century to the beginning of the germ theory itself. And the accumulating evidence supporting infectious causation of big bad diseases of modern society is following the same pattern that occurred for diseases that have been recently accepted as caused by infection. The process of acceptance typically occurs over one or more decades and accords with Schopenhauer's generalization about the establishment of truth: it is first ridiculed, then violently opposed, and finally accepted as being self-evident. Just a few groups of pathogens seem to be big players: streptococci, Chlamydia, some bacteria of the oral cavity, hepatitis viruses, and herpes viruses. If the correlations between these pathogens and the big diseases of wealthy countries do in fact reflect infectious causation, effective vaccines against these pathogens could contribute in a big way to a new golden age of medicine that could rival the first half of the 20th century. The transition to this golden age, however, requires two things: a shift in research effort to identifying the pathogens that cause the major diseases and development of effective interventions against them. The first would be easy to bring about by restructuring the priorities of NIH -- where money goes, so go the researchers. The second requires mechanisms for putting in place programs that cannot be trusted to the free market for the same kinds of reasons that Adam Smith gave for national defense. The goals of the interventions do not mesh nicely with the profit motive of the free market. Vaccines, for example, are not very profitable. Pharmas cannot make as much money by selling one vaccine per person to prevent a disease as they can by selling a patented drug like Vioxx which will be administered day after day, year after year to treat symptoms of an illness that is never cured.
And though liability issues are important for such symptomatic treatment, the pharmas can argue forcefully that drugs with nasty side effects provide some benefit even to those who suffer most from the side effects because the drugs are given not to prevent an illness but rather to people who already have an illness. This sort of defense is less convincing when the victim is a child who developed permanent brain damage from a rare complication of a vaccine that was given to protect them against a chronic illness that they might have acquired decades later. Another part of this vision of a new golden age will be the ability to distinguish real threats from pseudo-threats. This ability will allow us to invest in policy and infrastructure that will protect people against real threats without squandering resources and destroying livelihoods in efforts to protect against pseudo-threats. Our present predicament on this front is far from this ideal. Today experts on infectious diseases and institutions entrusted to protect and improve human health sound the alarm in response to each novel threat. The current fear of a devastating pandemic of bird flu is a case in point. Some of the loudest voices offer a simplistic argument: failing to prepare for the worst-case scenarios is irresponsible and dangerous. This criticism has recently been leveled at me and others who question expert proclamations, such as those from the World Health Organization and the Centers for Disease Control. These proclamations inform us that the H5N1 bird flu virus poses an imminent threat of an influenza pandemic similar to or even worse than the 1918 pandemic. I have decreased my popularity in such circles by suggesting that the threat of this scenario is essentially nonexistent. In brief I argue that the 1918 influenza viruses evolved their unique combination of high virulence and high transmissibility in the conditions at the Western Front of World War I.
By transporting contagious flu patients into a series of tightly packed groups of susceptible individuals, personnel fostered transmission from people who were completely immobilized by their illness. Such conditions must have favored the predator-like variants of the influenza virus; these variants would have a competitive edge because they could ruthlessly exploit a person for their own replication and still get transmitted to large numbers of susceptible individuals. These conditions have not recurred in human populations since then and, accordingly, we have never had any outbreaks of influenza viruses that have been anywhere near as harmful as those that emerged at the Western Front. So long as we do not allow such conditions to occur again we have little to fear from a re-evolution of such a predatory virus. The fear of a 1918-style pandemic has fueled preparations by a government which, embarrassed by its failure to deal adequately with the damage from Katrina, seems determined to prepare for any perceived threat to save face. I would have no problem with the accusation of irresponsibility if preparations for a 1918-style pandemic were cost-free. But they are not. The $7 billion that the Bush administration is planning as a down payment for pandemic preparedness has to come from somewhere. If money is spent to prepare for an imaginary pandemic, our progress could be impeded on other fronts that could lead to or have already established real improvements in public health. Conclusions about responsibility or irresponsibility of this argument require that the threat from pandemic influenza be assessed relative to the damage that results from the procurement of the money from other sources. The only reliable evidence of the damage from pandemic influenza under normal circumstances is the experience of the two pandemics that have occurred since 1918, one in 1957 and the other in 1968.
The mortality caused by these pandemics was one-tenth to one-hundredth the death toll from the 1918 pandemic. We do need to be prepared for an influenza pandemic of the normal variety, just as we needed to be prepared for category 5 hurricanes in the Gulf of Mexico. If possible our preparations should allow us to stop an incipient pandemic before it materializes. In contrast with many of the most vocal experts I do not conclude that our surveillance efforts will be quickly overwhelmed by a highly transmissible descendant of the influenza virus that has generated the most recent fright (dubbed H5N1). The transition of the H5N1 virus to a pandemic virus would require evolutionary change. The dialogue on this matter, however, continues to neglect the primary mechanism of the evolutionary change: natural selection. Instead it is claimed that H5N1 could mutate to become a full-fledged human virus that is both highly transmissible and highly lethal. Mutation provides only the variation on which natural selection acts. We must consider natural selection if we are to make meaningful assessments of the danger posed by the H5N1 virus. The evolution of the 1918 virus was gradual, and both evidence and theory lead to the conclusion that any evolution of increased transmissibility of H5N1 from human to human will be gradual, as it was with SARS. With surveillance we can detect such changes in humans and intervene to stop further spread as was done with SARS. We do not need to trash the economy of Southeast Asia each year to accomplish this. The dangerous vision of a golden age does not leave the poor countries behind. As I have discussed in my articles and books, we should be able to control much of the damage caused by the major killers in poor countries by infrastructural improvements that not only reduce the frequency of infection but also cause the infectious agents to evolve toward benignity.
This integrated approach offers the possibility to remodel our current efforts against the major killers -- AIDS, malaria, tuberculosis, dysentery and the like. We should be able to move from just holding ground to instituting the changes that created the freedom from acute infectious diseases that has been enjoyed by inhabitants of rich countries over the past century. Dangerous indeed! Excellent solutions are often dangerous to the status quo because they work. One measure of danger to some but success to the general population is the extent to which highly specialized researchers, physicians, and other health care workers will need to retrain, and the extent to which hospitals and pharmaceutical companies will need to downsize. That is what happens when we introduce excellent solutions to health problems. We need not be any more concerned about these difficulties than the loss of the iron lung industry and the retraining of polio therapists and researchers in the wake of the Salk vaccine.

_________________________________________________________________

BART KOSKO
Professor, Electrical Engineering, USC; Author, Heaven in a Chip

Most bell curves have thick tails

Any challenge to the normal probability bell curve can have far-reaching consequences because a great deal of modern science and engineering rests on this special bell curve. Most of the standard hypothesis tests in statistics rely on the normal bell curve either directly or indirectly. These tests permeate the social and medical sciences and underlie the poll results in the media. Related tests and assumptions underlie the decision algorithms in radar and cell phones that decide whether the incoming energy blip is a 0 or a 1. Management gurus exhort manufacturers to follow the "six sigma" creed of reducing the variance in products to only two or three defective products per million in accord with "sigmas" or standard deviations from the mean of a normal bell curve.
Models for trading stock and bond derivatives assume an underlying normal bell-curve structure. Even quantum and signal-processing uncertainty principles or inequalities involve the normal bell curve as the equality condition for minimum uncertainty. Deviating even slightly from the normal bell curve can sometimes produce qualitatively different results. The proposed dangerous idea stems from two facts about the normal bell curve. First: The normal bell curve is not the only bell curve. There are at least as many different bell curves as there are real numbers. This simple mathematical fact poses at once a grammatical challenge to the title of Charles Murray's IQ book The Bell Curve. Murray should have used the indefinite article "A" instead of the definite article "The." This is but one of many examples that suggest that most scientists simply equate the entire infinite set of probability bell curves with the normal bell curve of textbooks. Nature need not share the same practice. Human and non-human behavior can be far more diverse than the classical normal bell curve allows. Second: The normal bell curve is a skinny bell curve. It puts most of its probability mass in the main lobe or bell while the tails quickly taper off exponentially. So "tail events" appear rare simply as an artifact of this bell curve's mathematical structure. This limitation may be fine for approximate descriptions of "normal" behavior near the center of the distribution. But it largely rules out or marginalizes the wide range of phenomena that take place in the tails. Again most bell curves have thick tails. Rare events are not so rare if the bell curve has thicker tails than the normal bell curve has. Telephone interrupts are more frequent. Lightning flashes are more frequent and more energetic. Stock market fluctuations or crashes are more frequent. How much more frequent they are depends on how thick the tail is -- and that is always an empirical question of fact. 
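How much the tails matter can be checked numerically. A minimal sketch (using scipy's textbook normal, Laplace, and Cauchy curves, all at unit scale; the thresholds are arbitrary illustration) compares two-sided tail probabilities:

```python
# Two-sided tail probability P(|X| > k) for three bell curves at unit
# scale: the thin-tailed normal, the thicker-tailed Laplace, and the
# very-thick-tailed Cauchy. The thresholds k are chosen arbitrarily.
from scipy.stats import cauchy, laplace, norm

for k in (2, 4, 8):
    p_norm = 2 * norm.sf(k)  # sf is the upper-tail (survival) function
    p_lap = 2 * laplace.sf(k)
    p_cau = 2 * cauchy.sf(k)
    print(f"k={k}: normal={p_norm:.2e}  laplace={p_lap:.2e}  cauchy={p_cau:.2e}")
```

At eight standard units the normal curve calls the event a once-in-quadrillions fluke while the Cauchy curve still assigns it several percent; which figure is right for a given data source is exactly the empirical question of tail thickness.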
Neither logic nor assume-the-normal-curve habit can answer the question. Instead scientists need to carry their evidentiary burden a step further and apply one of the many available statistical tests to determine and distinguish the bell-curve thickness. One response to this call for tail-thickness sensitivity is that logic alone can decide the matter because of the so-called central limit theorem of classical probability theory. This important "central" result states that some suitably normalized sums of random terms will converge to a standard normal random variable and thus have a normal bell curve in the limit. So Gauss and a lot of other long-dead mathematicians got it right after all and thus we can continue to assume normal bell curves with impunity. That argument fails in general for two reasons. The first reason it fails is that the classical central limit theorem result rests on a critical assumption that need not hold and that often does not hold in practice. The theorem assumes that the random dispersion about the mean is so comparatively slight that a particular measure of this dispersion -- the variance or the standard deviation -- is finite or does not blow up to infinity in a mathematical sense. Most bell curves have infinite or undefined variance even though they have a finite dispersion about their center point. The error is not in the bell curves but in the two-hundred-year-old assumption that variance equals dispersion. It does not in general. Variance is a convenient but artificial and non-robust measure of dispersion. It tends to overweight "outliers" in the tail regions because the variance squares the underlying errors between the values and the mean. Such squared errors simplify the math but produce the infinite effects. These effects do not appear in the classical central limit theorem because the theorem assumes them away. 
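The practical symptom of infinite variance is easy to simulate: the sample variance of normal draws settles near the true value as the sample grows, while the sample variance of Cauchy draws (a stable curve with infinite variance) never settles at all. A sketch, with arbitrary seed and sample sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
normal_draws = rng.standard_normal(1_000_000)
cauchy_draws = rng.standard_cauchy(1_000_000)  # infinite theoretical variance

# Sample variance over growing prefixes of the same draws.
for n in (10_000, 100_000, 1_000_000):
    v_norm = normal_draws[:n].var()
    v_cauchy = cauchy_draws[:n].var()
    print(f"n={n:>9,}: normal variance={v_norm:.3f}  cauchy variance={v_cauchy:.3g}")
```

The normal column converges to the true variance of 1; the Cauchy column is dominated by whichever extreme draw has arrived so far, because squaring the errors lets a single tail event swamp the entire sum.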
The second reason the argument fails is that the central limit theorem itself is just a special case of a more general result called the generalized central limit theorem. The generalized central limit theorem yields convergence to thick-tailed bell curves in the general case. Indeed it yields convergence to the thin-tailed normal bell curve only in the special case of finite variances. These general cases define the infinite set of the so-called stable probability distributions and their symmetric versions are bell curves. There are still other types of thick-tailed bell curves (such as the Laplace bell curves used in image processing and elsewhere) but the stable bell curves are the best known and have several nice mathematical properties. The figure below shows the normal or Gaussian bell curve superimposed over three thicker-tailed stable bell curves. The catch in working with stable bell curves is that their mathematics can be nearly intractable. So far we have closed-form solutions for only two stable bell curves (the normal or Gaussian and the very-thick-tailed Cauchy curve) and so we have to use transform and computer techniques to generate the rest. Still the exponential growth in computing power has long since made stable or thick-tailed analysis practical for many problems of science and engineering. This last point shows how competing bell curves offer a new context for judging whether a given set of data reasonably obey a normal bell curve. One of the most popular eye-ball tests for normality is the PP or probability plot of the data. The data should almost perfectly fit a straight line if the data come from a normal probability distribution. But this seldom happens in practice. Instead real data snake all around the ideal straight line in a PP diagram. So it is easy for the user to shrug and call any data deviation from the ideal line good enough in the absence of a direct bell-curve competitor.
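That competitor is easy to supply in software. A sketch of the comparison (the simulated "field data" here are secretly Cauchy draws; scipy's probplot reports an r statistic measuring how straight the quantile plot is):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = stats.cauchy.rvs(size=2000, random_state=rng)  # pretend these arrived from the field

# probplot returns ((theoretical quantiles, ordered data), (slope, intercept, r));
# r near 1 means the points hug the ideal straight line.
_, (_, _, r_normal) = stats.probplot(data, dist=stats.norm)
_, (_, _, r_cauchy) = stats.probplot(data, dist=stats.cauchy)

print(f"straightness against the normal curve: r={r_normal:.3f}")
print(f"straightness against the Cauchy curve: r={r_cauchy:.3f}")
```

For heavy-tailed data the thick-tailed candidate should yield the straighter plot, which is the head-to-head verdict a lone normal PP diagram never delivers.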
A fairer test is to compare the normal PP plot with the best-fitting thick-tailed or stable PP plot. The data may well line up better in a thick-tailed PP diagram than they do in the usual normal PP diagram. This test evidence would reject the normal bell-curve hypothesis in favor of the thicker-tailed alternative. Ignoring these thick-tailed alternatives favors accepting the less-accurate normal bell curve and thus leads to underestimating the occurrence of tail events. Stable or thick-tailed probability curves continue to turn up as more scientists and engineers search for them. They tend to accurately model impulsive phenomena such as noise in telephone lines or in the atmosphere or in fluctuating economic assets. Skewed versions appear to best fit the data for the Ethernet traffic in bit packets. Here again the search is ultimately an empirical one for the best-fitting tail thickness. Similar searches will only increase as the math and software of thick-tailed bell curves work their way into textbooks on elementary probability and statistics. Much of it is already freely available on the Internet. Thicker-tail bell curves also imply that there is not just a single form of pure white noise. Here too there are at least as many forms of white noise (or any colored noise) as there are real numbers. Whiteness just means that the noise spikes or hisses and pops are independent in time or that they do not correlate with one another. The noise spikes themselves can come from any probability distribution and in particular they can come from any stable or thick-tailed bell curve. The figure below shows the normal or Gaussian bell curve and three kindred thicker-tailed bell curves and samples of their corresponding white noise. The normal curve has the upper-bound alpha parameter of 2 while the thicker-tailed curves have lower values -- tail thickness increases as the alpha parameter falls.
The white noise from the thicker-tailed bell curves becomes much more impulsive as their bell narrows and their tails thicken because then more extreme events or noise spikes occur with greater frequency.

Competing bell curves: The figure on the left shows four superimposed symmetric alpha-stable bell curves with different tail thicknesses while the plots on the right show samples of their corresponding forms of white noise. The parameter alpha describes the thickness of a stable bell curve and ranges from 0 to 2. Tails grow thicker as alpha grows smaller. The white noise grows more impulsive as the tails grow thicker. The Gaussian or normal bell curve (alpha = 2) has the thinnest tail of the four stable curves while the Cauchy bell curve (alpha = 1) has the thickest tails and thus the most impulsive noise. Note the different magnitude scales on the vertical axes. All the bell curves have finite dispersion while only the Gaussian or normal bell curve has a finite variance or finite standard deviation.

My colleagues and I have recently shown that most mathematical models of spiking neurons in the retina not only benefit from small amounts of added noise, which increases their Shannon bit count, but continue to benefit from added thick-tailed or "infinite-variance" noise. The same result holds experimentally for a carbon nanotube transistor that detects signals in the presence of added electrical noise. Thick-tailed bell curves further call into question what counts as a statistical "outlier" or bad data: Is a tail datum error or pattern? The line between extreme and non-extreme data is not just fuzzy but depends crucially on the underlying tail thickness. The usual rule of thumb is that the data are suspect if they lie outside three or even two standard deviations from the mean.
Such rules of thumb reflect both the tacit assumption that dispersion equals variance and the classical central-limit effect that large data sets are not just approximately bell curves but approximately thin-tailed normal bell curves. An empirical test of the tails may well justify the latter thin-tailed assumption in many cases. But the mere assertion of the normal bell curve does not. So "rare" events may not be so rare after all.

_________________________________________________________________

MATT RIDLEY
Science Writer; Founding chairman of the International Centre for Life; Author, The Agile Gene: How Nature Turns on Nurture

Government is the problem not the solution

In all times and in all places there has been too much government. We now know what prosperity is: it is the gradual extension of the division of labour through the free exchange of goods and ideas, and the consequent introduction of efficiencies by the invention of new technologies. This is the process that has given us health, wealth and wisdom on a scale unimagined by our ancestors. It not only raises material standards of living, it also fuels social integration, fairness and charity. It has never failed yet. No society has grown poorer or more unequal through trade, exchange and invention. Think of pre-Ming as opposed to Ming China, seventeenth century Holland as opposed to imperial Spain, eighteenth century England as opposed to Louis XIV's France, twentieth century America as opposed to Stalin's Russia, or post-war Japan, Hong Kong and Korea as opposed to Ghana, Cuba and Argentina. Think of the Phoenicians as opposed to the Egyptians, Athens as opposed to Sparta, the Hanseatic League as opposed to the Roman Empire. In every case, weak or decentralised government, but strong free trade led to surges in prosperity for all, whereas strong, central government led to parasitic, tax-fed officialdom, a stifling of innovation, relative economic decline and usually war. Take Rome.
It prospered because it was a free trade zone. But it repeatedly invested the proceeds of that prosperity in too much government and so wasted it in luxury, war, gladiators and public monuments. The Roman Empire's list of innovations is derisory, even compared with that of the 'dark ages' that followed. In every age and at every time there have been people who say we need more regulation, more government. Sometimes, they say we need it to protect exchange from corruption, to set the standards and police the rules, in which case they have a point, though often they exaggerate it. Self-policing standards and rules were developed by free-trading merchants in medieval Europe long before they were taken over and codified as laws (and often corrupted) by monarchs and governments. Sometimes, they say we need it to protect the weak, the victims of technological change or trade flows. But throughout history such intervention, though well meant, has usually proved misguided -- because its progenitors refuse to believe in (or find out about) David Ricardo's Law of Comparative Advantage: even if China is better at making everything than France, there will still be a million things it pays China to buy from France rather than make itself. Why? Because rather than invent, say, luxury goods or insurance services itself, China will find it pays to make more T-shirts and use the proceeds to import luxury goods and insurance. Government is a very dangerous toy. It is used to fight wars, impose ideologies and enrich rulers. True, nowadays, our leaders do not enrich themselves (at least not on the scale of the Sun King), but they enrich their clients: they preside over vast and insatiable parasitic bureaucracies that grow by Parkinson's Law and live off true wealth creators such as traders and inventors. Sure, it is possible to have too little government. Only, that has not been the world's problem for millennia.
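Ricardo's law, invoked above, can be made concrete with a two-good, two-country sketch. All the numbers below are invented purely for illustration: China needs fewer labor hours than France for either good, yet world output of both goods rises when each country tilts toward its comparative advantage.

```python
# Labor hours needed per unit of output (hypothetical numbers).
# China is absolutely more productive at both goods, but its edge is
# largest in shirts, so shirts are its comparative advantage.
hours = {
    "China":  {"shirts": 1, "insurance": 2},
    "France": {"shirts": 4, "insurance": 4},
}
budget = 120  # labor hours available in each country

# Autarky: each country splits its hours evenly between the two goods.
autarky = {
    good: sum(budget / 2 / hours[country][good] for country in hours)
    for good in ("shirts", "insurance")
}

# Trade: France specializes completely in insurance (its comparative
# advantage); China tilts toward shirts but keeps 40 hours on insurance.
specialized = {
    "shirts": 80 / hours["China"]["shirts"],
    "insurance": 40 / hours["China"]["insurance"]
               + budget / hours["France"]["insurance"],
}

for good in autarky:
    print(f"{good}: world output {autarky[good]:.0f} under autarky, "
          f"{specialized[good]:.0f} with specialization")
```

With the same total labor the world ends up with more of both goods (75 to 80 shirts and 45 to 50 units of insurance in this toy economy), which is why it pays China to buy insurance from France even though China could make it more cheaply itself.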
After the century of Mao, Hitler and Stalin, can anybody really say that the risk of too little government is greater than the risk of too much? The dangerous idea we all need to learn is that the more we limit the growth of government, the better off we will all be.

_________________________________________________________________

DAVID PIZARRO
Psychologist, Cornell University

Hodgepodge Morality

What some individuals consider a sacrosanct ability to perceive moral truths may instead be a hodgepodge of simpler psychological mechanisms, some of which have evolved for other purposes. It is increasingly apparent that our moral sense comprises a fairly loose collection of intuitions, rules of thumb, and emotional responses that may have emerged to serve a variety of functions, some of which originally had nothing at all to do with ethics. These mechanisms, when tossed in with our general ability to reason, seem to be how humans come to answer the question of good and evil, right and wrong. Intuitions about action, intentionality, and control, for instance, figure heavily into our perception of what constitutes an immoral act. The emotional reactions of empathy and disgust likewise figure into our judgments of who deserves moral protection and who doesn't. But the ability to perceive intentions probably didn't evolve as a way to determine who deserves moral blame. And the emotion of disgust most likely evolved to keep us safe from rotten meat and feces, not to provide information about who deserves moral protection. Discarding the belief that our moral sense provides a royal road to moral truth is an uncomfortable notion. Most people, after all, are moral realists. They believe acts are objectively right or wrong, like math problems. The dangerous idea is that our intuitions may be poor guides to moral truth, and can easily lead us astray in our everyday moral decisions.

_________________________________________________________________

RANDOLPH M.
NESSE
Psychiatrist, University of Michigan; Coauthor (with George Williams), Why We Get Sick: The New Science of Darwinian Medicine

Unspeakable Ideas

The idea of promoting dangerous ideas seems dangerous to me. I spend considerable effort to prevent my ideas from becoming dangerous, except, that is, to entrenched false beliefs and to myself. For instance, my idea that bad feelings are useful for our genes upends much conventional wisdom about depression and anxiety. I find, however, that I must firmly restrain journalists who are eager to share the sensational but incorrect conclusion that depression should not be treated. Similarly, many people draw dangerous inferences from my work on Darwinian medicine. For example, just because fever is useful does not mean that it should not be treated. I now emphasize that evolutionary theory does not tell you what to do in the clinic, it just tells you what studies need to be done. I also feel obligated to prevent my ideas from becoming dangerous on a larger scale. For instance, many people who hear about Darwinian medicine assume incorrectly that it implies support for eugenics. I encourage them to read history as well as my writings. The record shows how quickly natural selection was perverted into Social Darwinism, an ideology that seemed to justify letting poor people starve. Related ideas keep emerging. We scientists have a responsibility to challenge dangerous social policies incorrectly derived from evolutionary theory. Racial superiority is yet another dangerous idea that hurts real people. More examples come to mind all too easily and some quickly get complicated. For instance, the idea that men are inherently different from women has been used to justify discrimination, but the idea that men and women have identical abilities and preferences may also cause great harm. While I don't want to promote ideas dangerous to others, I am fascinated by ideas that are dangerous to anyone who expresses them.
These are "unspeakable ideas." By unspeakable ideas I don't mean those whose expression is forbidden in a certain group. Instead, I propose that there is a class of ideas whose expression is inherently dangerous everywhere and always because of the nature of human social groups. Such unspeakable ideas are anti-memes. Memes, both true and false, spread fast because they are interesting and give social credit to those who spread them. Unspeakable ideas, even true important ones, don't spread at all, because expressing them is dangerous to those who speak them. So why, you may ask, is a sensible scientist even bringing the idea up? Isn't the idea of unspeakable ideas a dangerous idea? I expect I will find out. My hope is that a thoughtful exploration of unspeakable ideas should not hurt people in general, perhaps won't hurt me much, and might unearth some long-neglected truths. Generalizations cannot substitute for examples, even if providing examples is risky. So, please gather your own data. Here is an experiment. The next time you are having a drink with an enthusiastic fan of your hometown team, say "Well, I think our team just isn't very good and didn't deserve to win." Or, moving to more risky territory, when your business group is trying to deal with a savvy competitor, say, "It seems to me that their product is superior because they are smarter than we are." Finally, and I cannot recommend this but it offers dramatic data, you could respond to your spouse's difficulties at work by saying, "If they are complaining about you not doing enough, it is probably because you just aren't doing your fair share." Most people do not need to conduct such social experiments to know what happens when such unspeakable ideas are spoken. Many broader truths are equally unspeakable. Consider, for instance, all the articles written about leadership. Most are infused with admiration and respect for a leader's greatness.
Much rarer are articles about the tendency for leadership positions to be attained by power-hungry men who use their influence to further advance their self-interest. Then there are all the writings about sex and marriage. Most of them suggest that there is some solution that allows full satisfaction for both partners while maintaining secure relationships. Questioning such notions is dangerous, unless you are a comic, in which case skepticism can be very, very funny. As a final example, consider the unspeakable idea of unbridled self-interest. Someone who says, "I will only do what benefits me," has committed social suicide. Tendencies to say such things have been selected against, while those who advocate goodness, honesty and service to others get wide recognition. This creates an illusion of a moral society that then, thanks to the combined forces of natural and social selection, becomes a reality that makes social life vastly more agreeable. There are many more examples, but I must stop here. To say more would either get me in trouble or falsify my argument. Will I ever publish my "Unspeakable Essays"? It would be risky, wouldn't it?

_________________________________________________________________

GREGORY BENFORD
Physicist, UC Irvine; Author, Deep Time

Think outside the Kyoto box

Few economists expect the Kyoto Accords to attain their goals. With compliance coming only slowly and with three big holdouts -- the US, China and India -- it seems unlikely to make much difference in overall carbon dioxide increases. Yet all the political pressure is on lessening our fossil fuel burning, in the face of fast-rising demand. This pits the industrial powers against the legitimate economic aspirations of the developing world -- a recipe for conflict. Those who embrace the reality of global climate change mostly insist that there is only one way out of the greenhouse effect -- burn less fossil fuel, or else. Never mind the economic consequences.
But the planet itself modulates its atmosphere through several tricks, and we have little considered using most of them. The overall global problem is simple: we capture more heat from the sun than we radiate away. Mostly this is a good thing, else the mean planetary temperature would sit well below freezing. But recent human alterations of the atmosphere have resulted in too much of a good thing. Two methods are getting little attention: sequestering carbon from the air and reflecting sunlight.

Hide the Carbon

There are several schemes to capture carbon dioxide from the air: promote tree growth; trap carbon dioxide from power plants in exhausted gas domes; or let carbon-rich organic waste fall into the deep oceans. Increasing forestation is a good, though rather limited, step. Capturing carbon dioxide from power plants costs about 30% of the plant output, so it's an economic nonstarter. That leaves the third way. Imagine you are standing in a ripe Kansas cornfield, staring up into a blue summer sky. A transparent acre-area square around you extends upwards in an air-filled tunnel, soaring all the way to space. That long tunnel holds carbon in the form of invisible gas, carbon dioxide -- widely implicated in global climate change. But how much? Very little, compared with how much we worry about it. The corn standing as high as an elephant's eye all around you holds four hundred times as much carbon as there is in man-made carbon dioxide -- our villain -- in the entire column reaching to the top of the atmosphere. (We have added roughly a hundred parts per million to our air by burning.) Inevitably, we must understand and control the atmosphere, as part of a grand imperative of directing the entire global ecology. Yearly, we manage through agriculture far more carbon than is causing our greenhouse dilemma. Take advantage of that. The leftover corn cobs and stalks from our fields can be gathered up, floated down the Mississippi, and dropped into the ocean, sequestering their carbon.
Below about a kilometer depth, beneath a layer called the thermocline, nothing gets mixed back into the air for a thousand years or more. It is not a forever solution, but it would buy us and our descendants time to find such answers. And it is inexpensive; cost matters.

The US has large crop residues. It has also ignored the Kyoto Accord, saying it would cost too much. It would, if we relied purely on traditional methods, policing energy use and carbon dioxide emissions. Clinton-era estimates of such costs were around $100 billion a year -- a politically unacceptable sum, which led Congress to reject the very notion by a unanimous vote. But if the US simply used its farm waste to "hide" carbon dioxide from our air, complying with Kyoto's standard would cost about $10 billion a year, with no change whatsoever in energy use. The whole planet could do the same. Sequestering crop leftovers could offset about a third of the carbon we put into our air. The carbon dioxide we add to our air will end up in the oceans anyway, from natural absorption, but not nearly quickly enough to help us.

Reflect Away Sunlight

Hiding carbon from air is only one example of the ways the planet has maintained its perhaps precarious equilibrium throughout billions of years. Another is our world's ability to edit sunlight by changing cloud cover. As the oceans warm, water evaporates, forming clouds. These reflect sunlight, reducing the heat below -- but just how much depends on cloud thickness, water droplet size, particulate density: a forest of detail. If our climate starts to vary too much, we could consider deliberately adjusting cloud cover in selected areas to offset unwanted heating. It is not actually hard to make clouds; volcanoes and fossil fuel burning do it all the time by adding microscopic particles to the air. Cloud cover is a natural mechanism we can augment, and another area where the possibility of a major change in environmental thinking beckons.
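The cost comparison above ($100 billion a year for traditional compliance versus $10 billion for residue sequestration) is simple arithmetic, but worth making explicit. The dollar figures are the essay's own round numbers; the US emission rate is an outside assumption (roughly 1.6 billion tonnes of carbon per year in the mid-2000s), so treat the result as a back-of-envelope sketch, not a policy estimate:

```python
# Back-of-envelope check on the essay's sequestration economics.
# Dollar figures are the essay's own round numbers; the US emission
# rate is an assumed mid-2000s value, not a sourced statistic.

KYOTO_TRADITIONAL_COST = 100e9      # $/yr, Clinton-era compliance estimate
RESIDUE_SEQUESTRATION_COST = 10e9   # $/yr, essay's crop-residue estimate
OFFSET_FRACTION = 1 / 3             # share of emitted carbon residue could hide
US_EMISSIONS_TONNES_C = 1.6e9       # tonnes of carbon per year (assumption)

cost_ratio = KYOTO_TRADITIONAL_COST / RESIDUE_SEQUESTRATION_COST
offset_tonnes = US_EMISSIONS_TONNES_C * OFFSET_FRACTION
cost_per_tonne = RESIDUE_SEQUESTRATION_COST / offset_tonnes

print(f"Sequestration is {cost_ratio:.0f}x cheaper than traditional compliance")
print(f"Implied cost: ~${cost_per_tonne:.0f} per tonne of carbon hidden")
```

On these assumptions the implied price is on the order of $20 per tonne of carbon, which is what makes the scheme look cheap next to emissions policing.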
A 1997 US Department of Energy study for Los Angeles showed that planting trees and making blacktop and rooftops lighter colored could significantly cool the city in summer. With minimal costs that are repaid within five years, we could reduce summer midday temperatures by several degrees. This would cut air-conditioning costs for residents while lowering energy consumption and lessening the urban heat-island effect. Incoming rain clouds would not rise as much above the heat blossom of the city, and so would rain on it less. Instead, clouds would continue inland to drop rain on the rest of Southern California, promoting plant growth. These methods are now under way in Los Angeles, a first experiment. We can combine this with a cloud-forming strategy. Producing clouds over the tropical oceans is the most effective way to cool the planet on a global scale, since the dark oceans absorb the bulk of the sun's heat. This we should explore now, in case sudden climate changes force us to act quickly.

Yet some environmentalists find all such steps suspect. They smack of engineering rather than self-discipline. True enough -- and that's what makes such thinking dangerous, for some. Yet if Kyoto fails to gather momentum, as seems probable to many, what else can we do? Turn ourselves into ineffectual Mommy-cop states, with endless finger-pointing politics, trying to regulate equally both the rich in their SUVs and Chinese peasants who burn coal for warmth? Our present conventional wisdom might be termed The Puritan Solution -- Abstain, sinners! -- and it is making slow, small progress. The Kyoto Accord calls for the industrial nations to reduce their carbon dioxide emissions to 7% below the 1990 level, and globally we are farther from this goal every year. These steps are early measures to help us assume our eventual 21st-century role as true stewards of the Earth, working alongside Nature.
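The "repaid within five years" claim is, at bottom, a simple payback calculation. The numbers below are hypothetical placeholders, not figures from the DOE study, chosen only to show the shape of the arithmetic:

```python
# Simple-payback sketch for a reflective ("cool") roof coating.
# Both numbers are hypothetical placeholders, not values from the
# 1997 DOE Los Angeles study cited in the text.

extra_cost_per_sqft = 0.20       # $ premium for lighter, reflective surfacing
annual_savings_per_sqft = 0.05   # $ saved per year on air conditioning

payback_years = extra_cost_per_sqft / annual_savings_per_sqft
print(f"Simple payback: {payback_years:.0f} years")
```

Any combination of upfront cost and yearly savings whose ratio is under five reproduces the study's headline conclusion.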
Recently Billy Graham declared that since the Bible made us stewards of the Earth, we have a holy duty to avert climate change. True stewards use the Garden's own methods.

_________________________________________________________________

MARCO IACOBONI
Neuroscientist; Director, Transcranial Magnetic Stimulation Lab, UCLA

Media Violence Induces Imitative Violence: The Problem With Super Mirrors

Media violence induces imitative violence. If true, this idea is dangerous for at least two main reasons. First, because its implications are highly relevant to the issue of freedom of speech. Second, because it suggests that our rational autonomy is much more limited than we like to think. This idea is especially dangerous now, because we have discovered a plausible neural mechanism that can explain why observing violence induces imitative violence. Moreover, the properties of this neural mechanism -- the human mirror neuron system -- suggest that imitative violence may not always be a consciously mediated process. The argument for protecting even harmful speech (intended in a broad sense, including movies and videogames) has typically been that the effects of speech are always under the mental intermediation of the listener/viewer. If there is a plausible neurobiological mechanism suggesting that such an intermediate step can be bypassed, this argument is no longer valid. For more than 50 years behavioral data have suggested that media violence induces violent behavior in the observers. Meta-analyses show that the effect size of media violence is much larger than the effect size of calcium intake on bone mass, or of asbestos exposure on cancer. Still, the behavioral data have been criticized. How is that possible? Two main types of data have been invoked: controlled laboratory experiments, and correlational studies assessing the types of media consumed and violent behavior.
The lab data have been criticized on account of not having enough ecological validity, whereas the correlational data have been criticized on account of having no explanatory power. Here, as a neuroscientist who is studying the human mirror neuron system and its relation to imitation, I want to focus on a recent neuroscience discovery that may explain why the strong imitative tendencies humans have may lead them to imitative violence when exposed to media violence. Mirror neurons are cells located in the premotor cortex, the part of the brain relevant to the planning, selection and execution of actions. In the ventral sector of the premotor cortex there are cells that fire in relation to specific goal-related motor acts, such as grasping, holding, tearing, and bringing to the mouth. Surprisingly, a subset of these cells -- what we call mirror neurons -- also fire when we observe somebody else performing the same action. The behavior of these cells suggests that the observer, while watching somebody else's actions, is looking at her or his own actions reflected by a mirror. My group has also shown in several studies that human mirror neuron areas are critical to imitation. There is also evidence that the activation of this neural system is fairly automatic, suggesting that it may bypass conscious mediation. Moreover, mirror neurons also code the intention associated with observed actions, even though there is not a one-to-one mapping between actions and intentions (I can grasp a cup because I want to drink or because I want to put it in the dishwasher). This suggests that the system can indeed code sequences of actions (i.e., what happens after I grasp the cup), even though only one action in the sequence has been observed.
Some years ago, when we were still a very small group of neuroscientists studying mirror neurons and were just starting to investigate the role of mirror neurons in intention understanding, we discussed the possibility of super mirror neurons. After all, if you have such a powerful neural system in your brain, you also want some control or modulatory neural mechanisms over it. We now have preliminary evidence suggesting that some prefrontal areas contain super mirrors. I think super mirrors come in at least two flavors. One is inhibition of overt mirroring; the other -- the one that might explain why we imitate violent behavior, which requires a fairly complex sequence of motor acts -- is mirroring of sequences of motor actions. Super mirror mechanisms may provide a fairly detailed explanation of imitative violence following exposure to media violence.

_________________________________________________________________

BARRY C. SMITH
Philosopher, Birkbeck, University of London; Coeditor, Knowing Our Own Minds

What We Know May Not Change Us

Human beings, like everything else, are part of the natural world. The natural world is all there is. But to say that everything that exists is part of the one world of nature is not the same as saying that there is just one theory of nature that will describe and explain everything there is. Reality may be composed of just one kind of stuff and its properties, but we need many different kinds of theories, at different levels of description, to account for everything there is. Theories at these different levels may not reduce to one another. What matters is that they be compatible with one another. The astronomy Newton gave us was a triumph over supernaturalism because it united the mechanics of the sub-lunary world with an account of the heavenly bodies. In a similar way, biology allowed us to advance from a time when we saw life in terms of an elan vital.
Today, the biggest challenge is to explain our powers of thinking and imagination, our abilities to represent and report our thoughts: the very means by which we engage in scientific theorising. The final triumph of the natural sciences over supernaturalism will be an account of the nature of conscious experience. The cognitive and brain sciences have done much to make that project clearer, but we are still a long way from a fully satisfying theory. But even if we succeed in producing a theory of human thought and reason, of perception, of conscious mental life, compatible with other theories of the natural and biological world, will we relinquish our cherished commonsense conceptions of ourselves as human beings, as selves who know ourselves best, who deliberate and decide freely on what to do and how to live? There is much evidence that we won't. As humans we conceive of ourselves as centres of experience, as self-knowing and freely willing agents. We see ourselves and others as acting on our beliefs, desires, hopes and fears, and as having responsibility for much that we do and all that we say. And even as results in neuroscience begin to show how much more automated, routinised and pre-conscious much of our behaviour is, we remain unable to let go of the self-beliefs that govern our day-to-day rationalisings and dealings with others. We are perhaps incapable of treating others as mere machines, even if that turns out to be what we are. The self-conceptions we have are firmly in place and sustained in spite of our best findings, and it may be a fact about human beings that it will always be so. We are curious about and interested in neuroscientists' findings, and we wonder at them and about their application to ourselves; but as the great naturalistic philosopher David Hume knew, nature is too strong in us, and it will not let us give up our cherished and familiar ways of thinking for long.
Hume knew that however curious an idea and vision of ourselves we entertained in our study, or in the lab, when we returned to the world to dine and make merry with our friends, our most natural beliefs and habits returned and banished our stranger thoughts and doubts. It is likely, at this end of the year, that whatever we have learned and whatever we know about the errors of our thinking and the fictions we maintain, these will still remain the dominant guiding force in our everyday lives. We may not be comforted by this, but as creatures with minds who know they have minds -- perhaps the only minded creatures in nature in this position -- we are at least able to understand our own predicament.

_________________________________________________________________

PHILIP W. ANDERSON
Physicist, Princeton University; Nobel Laureate in Physics 1977; Author, Economy as a Complex Evolving System

Dark Energy might not exist

Let's try one in cosmology. The universe contains at least three, and perhaps four, very different kinds of matter, whose origins are probably physically completely different. There is the Cosmic Background Radiation (CBR), which is photons from the later parts of the Big Bang but is actually the residue of all the kinds of radiation that were in the Bang, like flavored hadrons and mesons, which have annihilated and become photons. You can count them, and they tell you pretty well how many quanta of radiation there were in the beginning; and observation tells us that they were pretty uniformly distributed -- in fact very uniformly -- and still are. Next is radiant matter -- protons, mostly, and electrons. There are only a billionth as many of them as quanta of CBR, but as radiation in the Big Bang they were pretty much the same in number, so all but one in a billion combined with an antiparticle and annihilated.
Nonetheless they are much heavier than the quanta of CBR, so they have, all told, much more mass, and have some cosmological effect in slowing down the Hubble expansion. There was an imbalance -- but what caused it? That imbalance was generated by some totally independent process, possibly during the very turbulent inflationary era. In fact, out to a tenth of the Hubble radius, which is as far as we can see, the protons are very non-uniformly distributed, in a fractal hierarchical clustering with things called "Great Walls" and giant near-voids. The conventional idea is that this is all caused by gravitational instability acting on tiny primeval fluctuations, and it barely could be; but in order to justify that you have to have another kind of matter. So you need -- and actually see, though indirectly -- Dark Matter, which is 30 times as massive, overall, as the protons, but of which you can see nothing except its gravitational effects. No one has much clue what it is, but it seems one has to assume it is hadronic; otherwise why would it come anywhere as close as a factor of 30 to the protons? But really, there is no reason at all to suppose its origin was related to the other two; you know only that if it consists of massive quanta of any kind, they are nowhere near as numerous as the CBR, and so most of them annihilated in the early stages. Again, we have no excuse for assuming that the imbalance in the Dark Matter was uniformly distributed primevally, even if the protons were, because we don't know what it is. Finally, of course, there is Dark Energy -- that is, if there is. On that we can't even guess whether it is quanta at all, but again we note that if it is, it probably doesn't add up in numbers to the CBR.
The very strange coincidence is that when we add this in there isn't any total gravitation at all, and the universe as a whole is flat, as it would be, incidentally, if all of the heavy parts were distributed everywhere according to some random, fractal distribution like that of the matter we can see -- because on the largest scale, a fractal's density extrapolates to zero. That suggestion, implying that Dark Energy might not exist, is considered very dangerously radical.

The posterior probability of any particular God is pretty small

Here's another, which compared to many other people's propositions isn't so radical. Isn't God very improbable? You can't, in any logical system I understand, disprove the existence of God, or prove it for that matter. But I think that in the probability calculus I use He is very improbable. There are a number of ways of making a formal probability theory that incorporates Ockham's razor, the principle that one must not multiply hypotheses unnecessarily. Two are called Bayesian probability theory and Minimum Entropy. If you have been taking data on something, and the data are reasonably close to a straight line, these methods give us a definable procedure by which you can estimate the probability that the straight line is correct, rather than the polynomial which has as many parameters as there are points, or some intermediate complex curve. Ockham's razor is expressed mathematically as a factor in the probability derived for a given hypothesis that decreases exponentially in the number N of parameters describing your hypothesis -- it is the inverse of the volume of parameter space. People who are trying to prove the existence of ESP abominate Bayesianism and this factor, because it strongly favors the "null hypothesis" and beats them every time. Well, now, imagine how big the parameter space is for God.
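The straight-line-versus-polynomial example can be sketched numerically. The snippet below uses the BIC approximation to the Bayesian evidence, whose per-parameter ln(n) term plays the role of the "inverse volume of parameter space" penalty described above; it illustrates the general idea, not Anderson's own formalism:

```python
import numpy as np

# Fit noisy straight-line data with a line (2 parameters) and a
# degree-9 polynomial (10 parameters), then compare BIC scores.
# The k*ln(n) term is the Ockham penalty: it grows with parameter count.

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.3, size=x.size)

def bic(degree):
    """Lower is better: goodness of fit plus a penalty per parameter."""
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    n, k = x.size, degree + 1
    return n * np.log(np.mean(resid**2)) + k * np.log(n)

print(f"straight line (2 params):  BIC = {bic(1):.1f}")
print(f"degree-9 poly (10 params): BIC = {bic(9):.1f}")
# The polynomial tracks the points more closely, yet the line scores
# better: the extra accuracy does not pay for eight extra parameters.
```

The ESP case works the same way: the null hypothesis has the fewest parameters, so the penalty factor favors it unless the data improvement is overwhelming.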
He could have a long gray beard or not, be benevolent or malicious in a lot of different ways and over a wide range of values; He can have a variety of views on abortion and contraception, like or abominate human images, like or abominate music; and the range of dietary prejudices He has been credited with is as long as your arm. There is the heaven-hell dimension, the one-versus-three question, and I haven't even mentioned polytheism. I think there are certainly as many parameters as sects, or more. If there is even a sliver of prior probability for the null hypothesis, the posterior probability of any particular God is pretty small.

_________________________________________________________________

TIMOTHY TAYLOR
Archaeologist, University of Bradford; Author, The Buried Soul

The human brain is a cultural artefact.

Phylogenetically, humans represent an evolutionary puzzle. Walking on two legs frees the hands to do new things, like chipping stones to make modified tools -- the first artefacts, dating to 2.7 million years ago -- but it also narrows the pelvis and dramatically limits the possible size of the fetal cranium. Thus the brain expansion that began after 2 million years ago should not have happened. But imagine that, alongside chipped stone tools, one genus of hominin appropriates the looped entrails of a dead animal, or learns to tie a simple knot, and invents a sling (chimpanzees are known to carry water in leaves and gorillas to measure water depth with sticks, so the practical and abstract thinking required here can be safely assumed for our human ancestors by this point). In its sling, the hominin child can now ride on the hip with little impairment to its parent's hands-free movement. This has the unexpected and certainly unplanned consequence that it is no longer important for the child to be able to hang on as chimps do.
Although, due to the bio-mechanical constraints of a bipedal pelvis, the hominin child cannot be born with a big head (thus a large initial brain capacity), it can now be born underdeveloped. That is to say, the sling frees fetuses to be born in an ever more ontogenically retarded state. This trend, which humans do indeed display, is called neoteny. The retention of earlier features for longer means that the total developmental sequence is extended in time far beyond the nine months of natural gestation. Hominin children, born underdeveloped, could grow their crania outside the womb in the pseudo-marsupial pouch of an infant-carrying sling. From this point onwards it is not hard to see how a distinctively human culture emerges through the extra-uterine formation of higher cognitive capacities -- the phylogenetic and ontogenic icing on the cake of primate brain function. The child, carried by the parent into social situations, watches vocalization. Parental selection for smart features, such as an ability to babble early, may well, as others have suggested, have driven brain size increases until 250,000 years ago -- the point when the final bio-mechanical limits of big-headed mammals with narrow pelvises were reached by two species: Neanderthals and us. This is the phylogeny side of the case. In terms of ontogeny the obvious applies -- it recapitulates phylogeny. The underdeveloped brains of hominin infants were culture-prone, and in this sense I do not dissent from Dan Sperber's dangerous idea that "culture is natural". But human culture, unlike the basic culture of learned routines and tool-using observed in various mammals, is a system of signs -- essentially the association of words with things, and the ascription and recognition of value in relation to this. As Ernest Gellner once pointed out, taken cross-culturally, as a species, humans exhibit by far the greatest range of behavioural variation of any animal.
However, within any ongoing community of people, with language, ideology and a culturally inherited and developed technology, conformity has usually been a paramount value, with death often the price of dissent. My belief is that, due to the malleability of the neotenic brain, cultural systems are physically built into the developing tissue of the mind. Instead of seeing the brain as genetic hardware into which cultural software is loaded, and then arguing about the relative determining influence of each in areas such as, say, sexual orientation or mathematical ability (the old nature-nurture debate), we can conclude that culture (as Richard Dawkins long ago noted in respect of contraception) acts to subvert genes, but is also enabled by them. Ontogenic retardation allowed both the environment and the developing milieu of cultural routines to act on the construction of the brain's hardware alongside the working through of the genetic blueprint. Just because the modern human brain is coded for by genes does not mean that the critical self-consciousness for which it (within its own community of brains) is famous is non-cultural, any more than a barbed-and-tanged arrowhead is non-cultural just because it is made of flint. The human brain has a capacity to go not just beyond nature, but beyond culture too, by dissenting from old norms and establishing others. The emergence of the high arts and science is part of this process of the human brain, with its instrumental extra-somatic adaptations and memory stores (books, laboratories, computers), and is underpinned by the most critical thing that has been brought into being in the encultured human brain: free will. However, not all humans, or all human communities, seem capable of equal levels of free will. In extreme cases they appear to display none at all. Reasons include genetic incapacity, but it is also possible for a lack of mental freedom to be culturally engendered, and sometimes even encouraged.
Archaeologically, the evidence is there from the first farming societies in Europe: the Neolithic massacre at Talheim, where an entire community was genocidally wiped out except for the youngest children, has been taken as evidence (supported by anthropological analogies) of the re-enculturation of still-flexible minds within the community of the victors, to serve and live out their orphaned lives as slaves. In the future, one might surmise that the dark side of the development of virtual reality machines (described by Clifford Pickover) will be the infinitely more subtle cultural programming of impressionable individuals as sophisticated conformists. The interplay of genes and culture has produced in us the potential for a formidable range of abilities and intelligences. It is critical that in the future we both fulfil and extend this potential in the realm of judgment, choice and understanding, in both the sciences and the arts. But the idea of the brain as a cultural artefact is dangerous. Those with an interest in social engineering -- tyrants and authoritarian regimes -- will almost certainly attempt to develop it to their advantage. Free will is threatening to the powerful, who, by understanding its formation, will act to undermine it in sophisticated ways. The usefulness of cultural artefacts that have the complexity of human brains makes our own species the most obvious candidate for the enhanced super-robot of the future: not just smart factory operatives and docile consumers, but cunning weapons-delivery systems (suicide bombers) and conformity-enforcers. At worst, the very special qualities of human life that have been enabled by our remarkable natural history, the confluence of genes and culture, could end up as a realm of freedom for an elite few.
_________________________________________________________________

OLIVER MORTON
Chief News and Features Editor at Nature; Author, Mapping Mars

Our planet is not in peril

The truth of this idea is pretty obvious. Environmental crises are a fundamental part of the history of the earth: there have been sudden and dramatic temperature excursions, severe glaciations, vast asteroid and comet impacts. Yet the earth is still here, unscathed. There have been mass extinctions associated with some of these events, while other mass extinctions may well have been triggered by subtler internal changes to the biosphere. But none of them seems to have done long-term harm. The first ten million years of the Triassic may have been a little dull compared to the late Palaeozoic, what with a large number of the more interesting species having been killed in the great mass extinction at the end of the Permian, but there is no evidence that any fundamentally important earth processes did not eventually recover. I strongly suspect that not a single basic biogeochemical innovation -- the sorts of things that underlie photosynthesis and the carbon cycle, the nitrogen cycle, the sulphur cycle and so on -- has been lost in the past 4 billion years. Indeed, there is an argument to be made that mass extinctions are in fact a good thing, in that they wipe the slate clean a bit and thus allow exciting evolutionary innovations. This may be going a bit far. While the Schumpeter-for-the-earth-system position seems plausible, it also seems a little crudely progressivist. While to a mammal the Tertiary seems fairly obviously superior to the Cretaceous, it's not completely clear to me that there's an objective basis for that belief. In terms of primary productivity, for example, the Cretaceous may well have had an edge. But despite all this, it's hard to imagine that the world would be a substantially better place if it had not undergone the mass extinctions of the Phanerozoic.
Against this background, the current carbon/climate crisis seems pretty small beer. The change in mean global temperatures seems quite unlikely to be much greater than the regular cyclical change between glacial and interglacial climates. Land use change is immense, but it's not clear how long it will last, and there are rich seedbanks in the soil that will allow restoration. If fossil fuel use goes unchecked, carbon dioxide levels may rise as high as they were in the Eocene, and do so at such a rate that they cause a transient spike in ocean acidity. But they will not stay at those high levels, and the Eocene was not such a terrible place. The earth doesn't need ice caps, or permafrost, or any particular sea level. Such things come and go and rise and fall as a matter of course. The planet's living systems adapt and flourish, sometimes in a way that provides negative feedback, occasionally with a positive feedback that amplifies the change. A planet that made it through the massive biogeochemical unpleasantness of the late Permian is in little danger from a doubling, or even a quintupling, of the very low carbon dioxide level that preceded the industrial revolution, or from the loss of a lot of forests and reefs, or from the demise of half its species, or from the thinning of its ozone layer at high latitudes. But none of this is to say that we as people should not worry about global change; we should worry a lot. This is because climate change may not hurt the planet, but it hurts people. In particular, it will hurt people who are too poor to adapt. Significant climate change will change rainfall patterns, and probably patterns of extreme events as well, in ways that could easily threaten the food security of hundreds of millions of people supporting themselves through subsistence agriculture or pastoralism. 
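The point that a quintupling of carbon dioxide is not five times as bad as a doubling follows from the logarithmic response of CO2 radiative forcing. Using the standard simplified expression dF = 5.35 ln(C/C0) W/m^2 (Myhre et al., 1998), with a pre-industrial baseline of about 280 ppm:

```python
import math

# CO2 radiative forcing via the standard simplified logarithmic fit:
# delta_F = 5.35 * ln(C / C0) watts per square metre (Myhre et al., 1998).

PREINDUSTRIAL_PPM = 280.0

def co2_forcing(concentration_ppm, baseline_ppm=PREINDUSTRIAL_PPM):
    return 5.35 * math.log(concentration_ppm / baseline_ppm)

print(f"doubling (560 ppm):     {co2_forcing(560):.1f} W/m^2")
print(f"quintupling (1400 ppm): {co2_forcing(1400):.1f} W/m^2")
# Quintupling yields only about 2.3x the forcing of doubling, not 5x,
# because each added molecule absorbs in already-crowded bands.
```

The logarithm is why "a doubling, or even a quintupling" can be discussed in the same breath: the climate forcing saturates rather than scaling linearly with concentration.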
It will have a massive effect on the lives of the relatively small number of people in places where sea ice is an important part of the environment (and it seems unlikely that anything we do now can change that). In other, more densely populated places local environmental and biotic change may have similarly sweeping effects. Secondary to this, the loss of species, both known and unknown, will be experienced by some as a form of damage that goes beyond any deterioration in ecosystem services. Many people will feel themselves and their world diminished by such extinctions even when they have no practical consequences, despite the fact that they cannot ascribe an objective value to their loss. One does not have to share the values of these people to recognise their sincerity. All of these effects provide excellent reasons to act. And yet many people in the various green movements feel compelled to add on the notion that the planet itself is in crisis, or doomed; that all life on earth is threatened. And in a world where that rhetoric is common, the idea that this eschatological approach to the environment is baseless is a dangerous one. Since the 1970s the environmental movement has based much of its appeal on personifying the planet and making it seem like a single entity, then seeking to place it in some ways "in our care". It is a very powerful notion, and one which benefits from the hugely influential iconographic backing of the first pictures of the earth from space; it has inspired much of the good that the environmental movement has done. The idea that the planet is not in peril could thus come to undermine the movement's power. This is one of the reasons people react against the idea so strongly. One respected and respectable climate scientist reacted to Andy Revkin's recent use of the phrase "In fact, the planet has nothing to worry about from global warming" in the New York Times with near apoplectic fury. 
If the belief that the planet is in peril were merely wrong, there might be an excuse for ignoring it, though basing one's actions on lies is an unattractive proposition. But the planet-in-peril idea is an easy target for those who, for various reasons, argue against any action on the carbon/climate crisis at all. In this, bad science is a hostage to fortune. What's worse, the idea distorts environmental reasoning, too. For example, laying stress on the non-issue of the health of the planet, rather than the real issues of effects that harm people, leads to a general preference for averting change rather than adapting to it, even though providing the wherewithal for adaptation will often be the most rational response. The planet-in-peril idea persists in part simply through widespread ignorance of earth history. But some environmentalists, and perhaps some environmental reporters, will argue that the inflated rhetoric that trades on this error is necessary in order to keep the show on the road. The idea that people can be more easily persuaded to save the planet, which is not in danger, than their fellow human beings, who are, is an unpleasant and cynical one; another dangerous idea, not least because it may indeed hold some truth. But if putting the planet at the centre of the debate is a way of involving everyone, of making us feel that we're all in this together, then one can't help noticing that the ploy isn't working out all that well. In the rich nations, many people may indeed believe that the planet is in danger -- but they don't believe that they are in danger, and perhaps as a result they're not clamouring for change loud enough, or in the right way, to bring it about. There is also a problem of learned helplessness. I suspect people are flattered, in a rather perverse way, by the idea that their lifestyle threatens the whole planet, rather than just the livelihoods of millions of people they have never met. 
But the same sense of scale that flatters may also enfeeble. They may come to think that the problems are too great for them to do anything about. Rolling carbon/climate issues into the great moral imperative of improving the lives of the poor, rather than relegating them to the dodgy rhetorical level of a threat to the planet as a whole, seems more likely to be a sustainable long-term strategy. The most important thing about environmental change is that it hurts people; the basis of our response should be human solidarity. The planet will take care of itself.

_________________________________________________________________

SAMUEL BARONDES
Neurobiologist and Psychiatrist, University of California San Francisco; Author, Better Than Prozac
[barondes100.jpg]

Using Medications To Change Personality

Personality -- the pattern of thoughts, feelings, and actions that is typical of each of us -- is generally formed by early adulthood. But many people still want to change. Some, for example, consider themselves too gloomy and uptight and want to become more cheerful and flexible. Whatever their aims they often turn to therapists, self-help books, and religious practices. In the past few decades certain psychiatric medications have become an additional tool for those seeking control of their lives. Initially designed to be used for a few months to treat episodic psychological disturbances such as severe depression, they are now being widely prescribed for indefinite use to produce sustained shifts in certain personality traits. Prozac is the best known of them, but many others are on the market or in development. By directly affecting brain circuits that control emotions, these medications can produce desirable effects that may be hard to replicate by sheer force of will or by behavioral exercises. Millions keep taking them continuously, year after year, to modulate personality.
Nevertheless, despite the testimonials and apparent successes, the sustained use of such drugs to change personality should still be considered dangerous. Not because manipulation of brain chemicals is intrinsically cowardly, immoral, or a threat to the social order. In the opinion of experienced clinicians medications such as Prozac may actually have the opposite effect, helping to build character and to increase personal responsibility. The real danger is that there are no controlled studies of the effects of these drugs on personality over the many years or even decades in which some people are taking them. So we are left with a reliance on opinion and belief. And this, as in all fields, we know to be dangerous.

_________________________________________________________________

DAVID BODANIS
Writer, Consultant; Author, The Electric Universe
[bodanis100.jpg]

The hyper-Islamicist critique of the West as a decadent force that is already on a downhill course might be true

I wonder sometimes if the hyper-Islamicist critique of the West as a decadent force that is already on a downhill course might be true. At first it seems impossible: no one's richer than the US, and no one has as powerful an Army; western Europe has vast wealth and university skills as well. But what got me reflecting was the fact that in just four years after Pearl Harbor, the US had defeated two of the greatest military forces the world had ever seen. Everyone naturally accepted there had to be restrictions on gasoline sales, to preserve limited supplies of gasoline and rubber; profiteers were hated. But the first four years after 9/11? Detroit automakers find it easy to continue paying off congressmen to ensure that gasoline-wasting SUV's aren't restricted in any way. There are deep trends behind this.
Technology is supposed to be speeding up, but if you think about it, airplanes have a similar feel and speed to ones of 30 years ago; cars and oil rigs and credit cards and the operations of the NYSE might be a bit more efficient than a few decades ago, but also don't feel fundamentally different. Aside from the telephones, almost all the objects and daily habits in Spielberg's 20-year-old film E.T. are about the same as today. What has changed is the possibility of quick change: it's a lot, lot harder than it was before. Patents for vague, general ideas are much easier to get than they were before, which slows down the introduction of new technology; academics in biotech and other fields are wary about sharing their latest research with potentially competing colleagues, which slows down the creation of new technology as well. Even more, there's a tension, a fear of falling from the increasingly fragile higher tiers of society, which means that social barriers are higher as well. I went to adequate but not extraordinary public (state) schools in Chicago, but my children go to private schools. I suspect that many contributors to this site, unless they live in academic towns where state schools are especially strong, are in a similar position. This is fine for our children, but not for those of the same theoretical potential who lack parents who can afford it.

Sheer inertia can mask such flaws for quite a while. The National Academy of Sciences has shown that, once again, the percentage of American-born university students studying the hard physical sciences has gone down. At one time that didn't matter, for life in America -- and at the top American universities -- was an overwhelming lure for ambitious youngsters from Seoul and Bangalore. But already there are signs of that slipping, and who knows what it'll be like in another decade or two. There's another sort of inertia that's coming to an end as well.
The first generation of immigrants from farm to city bring with them the attitudes of their farm world; the first generation of 'migrants' from blue collar city neighborhoods to upper middle class professional life bring similar attitudes of responsibility as well. We ignore what the media pours out about how we're supposed to live. We're responsible for parents, even when it's not to our economic advantage; we vote against our short-term economic interests, because it's the 'right' thing to do; we engage in philanthropy towards individuals of very different backgrounds from ourselves. But why? In many parts of America or Europe, the rules and habits creating those attitudes no longer exist at all. When that finally gets cut away, will what replaces it be strong enough for us to survive?

_________________________________________________________________

NICHOLAS HUMPHREY
Psychologist, London School of Economics; Author, The Mind Made Flesh
[humphrey100.jpg]

It is undesirable to believe in a proposition when there is no ground whatever for supposing it true

Bertrand Russell's idea, put forward 80 years ago, is about as dangerous as they come. I don't think I can better it: "I wish to propose for the reader's favourable consideration a doctrine which may, I fear, appear wildly paradoxical and subversive. The doctrine in question is this: that it is undesirable to believe in a proposition when there is no ground whatever for supposing it true." (The opening lines of his Sceptical Essays.)

_________________________________________________________________

ERIC FISCHL
Artist, New York City; Mary Boone Gallery
[fischl100.jpg]

The unknown becomes known, and is not replaced with a new unknown

Several years ago I stood in front of a painting by Vermeer. It was a painting of a woman reading a letter. She stood near the window for better lighting and behind her hung a map of the known world. I was stunned by the revelation of this work.
Vermeer understood something so basic to human need it had gone virtually unnoticed: communication from afar. Everything we have done to make us more capable, more powerful, better protected, more intelligent, has been by overcoming our physical limitations, enhancing our perceptual abilities and our adaptability. When I think of Vermeer's woman reading the letter I wonder how long it took to reach her. Then I think, my god, at some time we developed a system in which one could leave home and send word back! We figured out a way that we could be heard from far away and then another system so that we can be seen from far away. Then I start to marvel at the alchemy of painting and how we have been able to invest materials with consciousness so that Vermeer can talk to me across time! I see too he has put me in the position of not knowing, as I am kept from reading the content of the letter. In this way he has placed me at the edge, the frontier of wanting to know what I cannot know. I want to know how long this letter sender has been away and what he was doing all this time. Is he safe? Does he still love her? Is he on his way home? Vermeer puts me into what had been her condition of uncertainty. All I can do is wonder and wait.

This makes me think about how not knowing is so important. Not knowing makes the world large and uncertain and our survival tenuous. It is a mystery why humans roam, and still more a mystery why we still need to feel so connected to the place we have left. The not knowing causes such profound anxiety that it, in turn, spawns creativity. The impetus for this creativity is empowerment. Our gadgets, gizmos, networks of transportation and communication, have all been developed either to explore, utilize or master the unknown territory.
If the unknown becomes known, and is not replaced with a new unknown, if the farther we reach outward is connected only to how fast we can bring it home, if the time between not knowing and knowing becomes too small, creativity will be daunted. And so I worry: if we bring the universe more completely, more effortlessly, into our homes, will there be less reason to leave them?

_________________________________________________________________

STANISLAS DEHAENE
Cognitive Neuropsychology Researcher, Institut National de la Santé, Paris; Author, The Number Sense
[dehaene100.jpg]

Touching and pushing the limits of the human brain

From Copernicus to Darwin to Freud, science has a special way of deflating human hubris by proposing what is frequently perceived, at the time, as dangerous or pernicious ideas. Today, cognitive neuroscience presents us with a new challenging idea, whose accommodation will require substantial personal and societal effort -- the discovery of the intrinsic limits of the human brain. Calculation was one of the first domains where we lost our special status -- right from their inception, computers were faster than the human brain, and they are now billions of times ahead of us in their speed and breadth of number crunching. Psychological research shows that our mental "central executive" is amazingly limited -- we can process only one thought at a time, at a meager rate of five or ten per second at most. This is rather surprising. Isn't the human brain supposed to be the most massively parallel machine on earth? Yes, but its architecture is such that the collective outcome of this parallel organization, our mind, is a very slow serial processor. What we can become aware of is intrinsically limited. Whenever we delve deeply into the processing of one object, we become literally blind to other items that would require our attention (the "attentional blink" paradigm).
We also suffer from an "illusion of seeing": we think that we take in a whole visual scene and see it all at once, but research shows that major chunks of the image can be changed surreptitiously without our noticing. True, relative to other animal species, we do have a special combinatorial power, which lies at the heart of the remarkable cultural inventions of mathematics, language, or writing. Yet this combinatorial faculty only works on the raw materials provided by a small number of core systems for number, space, time, emotion, conspecifics, and a few other basic domains. The list is not very long -- and within each domain, we are now discovering lots of little ill-adapted quirks, evidence of stupid design as expected from a brain arising from an imperfect evolutionary process (for instance, our number system only gives us a sense of approximate quantity -- good enough for foraging, but not for exact mathematics). I therefore do not share Marc Hauser's optimism that our mind has a "universal" or "limitless" expressive power. The limits are easy to touch in mathematics, in topology for instance, where we struggle with the simplest objects (is a curve a knot... or not?). As we discover the limits of the human brain, we also find new ways to design machines that go beyond those limits. Thus, we have to get ready for a society where, more and more, the human mind will be replaced by better computers and robots -- and where the human operator will be increasingly considered a nuisance rather than an asset. This is already the case in aeronautics, where flight stability is ensured by fast cybernetics and where landing and take off will soon be assured by computer, apparently with much improved safety. There are still a few domains where the human brain maintains an apparent superiority. 
Visual recognition used to be one -- but already, superb face recognition software is appearing, capable of storing and recognizing thousands of faces with close to human performance. Robotics is another. No robot to date is capable of navigating smoothly through a complicated 3-D world. Yet a third area of human superiority is high-level semantics and creativity: the human ability to make sense of a story, to pull out the relevant knowledge from a vast store of potentially useful facts, remains unequalled. Suppose that, for the next 50 years, those are the main areas in which engineers will remain unable to match the performance of the human brain. Are we ready for a world in which the human contributions are binary, either at the highest level (thinkers, engineers, artists...) or at the lowest level, where human workforce remains cheaper than mechanization? To some extent, I would argue that this great divide is already here, especially between North and South, but also within our developed countries, between upper and lower castes.

What are the solutions? I envisage two of them. The first is education. The human brain is, to some extent, changeable. Thanks to education, we can improve considerably upon the stock of mental tools provided to us by evolution. In fact, relative to the large changes that schooling can provide, whatever neurobiological differences distinguish the sexes or the races are minuscule (and thus largely irrelevant -- contra Steve Pinker). The crowning achievements of Sir Isaac Newton are now accessible to any student in physics and algebra -- whatever his or her skin color. Of course, our learning ability isn't without bounds. It is itself tightly limited by our genes, which merely allow a fringe of variability in the laying down of our neuronal networks. We never gain entirely new abilities -- we merely transform our existing brain networks, a partial and constrained process that I have called "cultural recycling" or "recyclage".
As we gain knowledge of brain plasticity, a major application of cognitive neuroscience research should be the improvement of life-long education, with the goal of optimizing this transformation of our brains. Consider reading. We now understand much better how this cultural capacity is laid down. A posterior brain network, initially evolved to recognize objects and faces, gets partially recycled for the shapes of letters and words, and learns to connect these shapes to other temporal areas for sounds and words. Cultural evolution has modified the shapes of letters so that they are easily learnable by this brain network. But, the system remains amazingly imperfect. Reading still has to go through the lopsided design of the retina, where the blood vessels are put in front of the photoreceptors, and where only a small region of the fovea has enough resolution to recognize small print. Furthermore, both the design of writing systems and the way in which they are taught are perfectible. In the end, after years of training, we can only read at an appalling speed of perhaps 10 words per second, a baud rate surpassed by any present-day modem. Nevertheless, this cultural invention has radically changed our cognitive abilities, doubling our verbal working memory for instance. Who knows what other cultural inventions might lie ahead of us, and might allow us to further push the limits of our brain biology? A second, more futuristic solution may lie in technology. Brain-computer interfaces are already around the corner. They are currently being developed for therapeutic purposes. Soon, cortical implants will allow paralyzed patients to move equipment by direct cerebral command. Will such devices later be applied to the normal human brain, in the hopes of extending our memory span or the speed of our access to information? 
And will we be able to forge a society in which such tools do not lead to further divisions between, on the one hand, high-tech brains powered by the best education and neuro-gear, and on the other hand, low-tech manpower just good enough for cheap jobs?

_________________________________________________________________

JOEL GARREAU
Cultural Revolution Correspondent, Washington Post; Author, Radical Evolution
[garreau100.jpg]

Suppose Faulkner was right?

In his December 10, 1950, Nobel Prize acceptance speech, William Faulkner said:

I decline to accept the end of man. It is easy enough to say that man is immortal simply because he will endure: that when the last ding-dong of doom has clanged and faded from the last worthless rock hanging tideless in the last red and dying evening, that even then there will still be one more sound: that of his puny inexhaustible voice, still talking. I refuse to accept this. I believe that man will not merely endure: he will prevail. He is immortal, not because he alone among creatures has an inexhaustible voice, but because he has a soul, a spirit capable of compassion and sacrifice and endurance. The poet's, the writer's, duty is to write about these things. It is his privilege to help man endure by lifting his heart, by reminding him of the courage and honor and hope and pride and compassion and pity and sacrifice which have been the glory of his past. The poet's voice need not merely be the record of man, it can be one of the props, the pillars to help him endure and prevail.

It's easy to dismiss such optimism. The reason I hope Faulkner was right, however, is that we are at a turning point in history. For the first time, our technologies are not so much aimed outward at modifying our environment in the fashion of fire, clothes, agriculture, cities and space travel. Instead, they are increasingly aimed inward at modifying our minds, memories, metabolisms, personalities and progeny.
If we can do all that, then we are entering an era of engineered evolution -- radical evolution, if you will -- in which we take control of what it will mean to be human. This is not some distant, science-fiction future. This is happening right now, in our generation, on our watch. The GRIN technologies -- the genetic, robotic, information and nano processes -- are following curves of accelerating technological change the arithmetic of which suggests that the last 20 years are not a guide to the next 20 years. We are more likely to see that magnitude of change in the next eight. Similarly, the amount of change of the last half century, going back to the time when Faulkner spoke, may well be compressed into the next 14. This raises the question of where we will gain the wisdom to guide this torrent, and points to what happens if Faulkner was wrong. If we humans are not so much able to control our tools, but instead come to be controlled by them, then we will be heading into a technodeterminist future. You can get different versions of what that might mean. Some would have you believe that a future in which our creations eliminate the ills that have plagued mankind for millennia -- conquering pain, suffering, stupidity, ignorance and even death -- is a vision of heaven. Some even welcome the idea that someday soon, our creations will surpass the pitiful limitations of Version 1.0 humans, themselves becoming a successor race that will conquer the universe, and care for us benevolently. Others feel strongly that a life without suffering is a life without meaning, reducing humankind to ignominious, character-less husks. They also point to what could happen if such powerful self-replicating technologies get into the hands of bumblers or madmen. They can easily imagine a vision of hell in which we wipe out not only our species, but all of life on earth. If Faulkner is right, however, there is a third possible future. 
That is the one that counts on the ragged human convoy of divergent perceptions, piqued honor, posturing, insecurity and humor once again wending its way to glory. It puts a shocking premium on Faulkner's hope that man will prevail "because he has a soul, a spirit capable of compassion and sacrifice and endurance." It assumes that even as change picks up speed, giving us less and less time to react, we will still be able to rely on the impulse that Churchill described when he said, "Americans can always be counted on to do the right thing--after they have exhausted all other possibilities." The key measure of such a "prevail" scenario's success would be an increasing intensity of links between humans, not transistors. If some sort of transcendence is achieved beyond today's understanding of human nature, it would not be through some individual becoming superman. Transcendence would be social, not solitary. The measure would be the extent to which many transform together. The very fact that Faulkner's proposition looms so large as we look into the future does at least illuminate the present. Referring to Faulkner's breathtaking line, "when the last ding-dong of doom has clanged and faded from the last worthless rock hanging tideless in the last red and dying evening, that even then there will still be one more sound: that of his puny inexhaustible voice, still talking," the author Bruce Sterling once told me, "You know, the most interesting part about that speech is that part right there, where William Faulkner, of all people, is alluding to H. G. Wells and the last journey of the Traveler from The Time Machine. It's kind of a completely heartfelt, probably drunk mishmash of cornball crypto-religious literary humanism and the stark, bonkers, apocalyptic notions of atomic Armageddon, human extinction, and deep Darwinian geological time. Man, that was the 20th century all over." 
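Garreau's compression claim (the change of the last 20 years in the next 8, of the last 50 in the next 14) can be checked with a toy model. The sketch below assumes, purely for illustration, that the rate of change doubles every decade; the doubling period is my assumption, not a figure Garreau states. Under that assumption we ask: how many future years accumulate as much change as the last N years?

```python
import math

def years_to_match(past_years: float, doubling_period: float = 10.0) -> float:
    """Years of future needed to accumulate as much change as the last
    `past_years`, assuming the rate of change r(t) = 2**(t/d) doubles
    every d = `doubling_period` years (a hypothetical growth curve).

    Equating the integral of r over [-past, 0] with the integral over
    [0, x] gives 2**(x/d) = 2 - 2**(-past/d).
    """
    d = doubling_period
    return d * math.log2(2 - 2 ** (-past_years / d))

print(round(years_to_match(20), 1))  # ~8.1 years: matches "the next eight"
print(round(years_to_match(50), 1))  # ~9.8 years: even faster than the 14 in the text
```

Under a ten-year doubling, 20 years of past change is indeed matched in about eight future years; the 50-years-into-14 figure implies a somewhat gentler curve, so the two numbers do not pin down a single growth rate. The model is directional only.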
_________________________________________________________________

HELEN FISHER
Research Professor, Department of Anthropology, Rutgers University; Author, Why We Love
[fisher100.jpg]

If patterns of human love subtly change, all sorts of social and political atrocities can escalate

Serotonin-enhancing antidepressants (such as Prozac and many others) can jeopardize feelings of romantic love, feelings of attachment to a spouse or partner, one's fertility and one's genetic future. I am working with psychiatrist Andy Thomson on this topic. We base our hypothesis on patient reports, fMRI studies, and other data on the brain. Foremost, as SSRIs elevate serotonin they also suppress dopaminergic pathways in the brain. And because romantic love is associated with elevated activity in dopaminergic pathways, it follows that SSRIs can jeopardize feelings of intense romantic love. SSRIs also curb obsessive thinking and blunt the emotions -- central characteristics of romantic love. One patient described this reaction well, writing: "After two bouts of depression in 10 years, my therapist recommended I stay on serotonin-enhancing antidepressants indefinitely. As appreciative as I was to have regained my health, I found that my usual enthusiasm for life was replaced with blandness. My romantic feelings for my wife declined drastically. With the approval of my therapist, I gradually discontinued my medication. My enthusiasm returned and our romance is now as strong as ever. I am prepared to deal with another bout of depression if need be, but in my case the long-term side effects of antidepressants render them off limits."

SSRIs also suppress sexual desire, sexual arousal and orgasm in as many as 73% of users. These sexual responses evolved to enhance courtship, mating and parenting. Orgasm produces a flood of oxytocin and vasopressin, chemicals associated with feelings of attachment and pairbonding behaviors. Orgasm is also a device by which women assess potential mates.
Women do not reach orgasm with every coupling, and the "fickle" female orgasm is now regarded as an adaptive mechanism by which women distinguish males who are willing to expend time and energy to satisfy them. The onset of female anorgasmia may jeopardize the stability of a long-term mateship as well. Men who take serotonin-enhancing antidepressants also inhibit evolved mechanisms for mate selection, partnership formation and marital stability. The penis evolved to give pleasure and to advertise the male's psychological and physical fitness; it also deposits seminal fluid in the vaginal canal, fluid that contains dopamine, oxytocin, vasopressin, testosterone, estrogen and other chemicals that most likely influence a female partner's behavior.

These medications can also influence one's genetic future. Serotonin increases prolactin by stimulating prolactin-releasing factors. Prolactin can impair fertility by suppressing hypothalamic GnRH release, suppressing pituitary FSH and LH release, and/or suppressing ovarian hormone production. Clomipramine, a strong serotonin-enhancing antidepressant, adversely affects sperm volume and motility.

I believe that Homo sapiens has evolved (at least) three primary, distinct yet overlapping neural systems for reproduction. The sex drive evolved to motivate ancestral men and women to seek sexual union with a range of partners; romantic love evolved to enable them to focus their courtship energy on a preferred mate, thereby conserving mating time and energy; attachment evolved to enable them to rear a child through infancy together. The complex and dynamic interactions between these three brain systems suggest that any medication that changes their chemical checks and balances is likely to alter an individual's courting, mating and parenting tactics, ultimately affecting their fertility and genetic future.
The reason this is a dangerous idea is that the huge drug industry is heavily invested in selling these drugs; millions of people currently take these medications worldwide; and as these drugs become generic, many more will soon imbibe -- inhibiting their ability to fall in love and stay in love. And if patterns of human love subtly change, all sorts of social and political atrocities can escalate.

_________________________________________________________________

PAUL DAVIES
Physicist, Macquarie University, Sydney; Author, How to Build a Time Machine
[davies100.jpg]

The fight against global warming is lost

Some countries, including the United States and Australia, have been in denial about global warming. They cast doubt on the science that set alarm bells ringing. Other countries, such as the UK, are in panic, and want to make drastic cuts in greenhouse emissions. Both stances are irrelevant, because the fight is a hopeless one anyway. In spite of the recent hike in the price of oil, the stuff is still cheap enough to burn. Human nature being what it is, people will go on burning it until it starts running out and simple economics puts the brakes on. Meanwhile the carbon dioxide levels in the atmosphere will just go on rising. Even if developed countries rein in their profligate use of fossil fuels, the emerging Asian giants of China and India will more than make up the difference. Rich countries, whose own wealth derives from decades of cheap energy, can hardly preach restraint to developing nations trying to climb the wealth ladder. And without the obvious solution -- massive investment in nuclear energy -- continued warming looks unstoppable.

Campaigners for cutting greenhouse emissions try to scare us by proclaiming that a warmer world is a worse world. My dangerous idea is that it probably won't be. Some bad things will happen. For example, the sea level will rise, drowning some heavily populated or fertile coastal areas.
But in compensation Siberia may become the world's breadbasket. Some deserts may expand, but others may shrink. Some places will get drier, others wetter. The evidence that the world will be worse off overall is flimsy. What is certainly the case is that we will have to adjust, and adjustment is always painful. Populations will have to move. In 200 years some currently densely populated regions may be deserted. But the population movements over the past 200 years have been dramatic too. I doubt if anything more drastic will be necessary. Once it dawns on people that, yes, the world really is warming up and that, no, it doesn't imply Armageddon, then international agreements like the Kyoto protocol will fall apart.

The idea of giving up the global warming struggle is dangerous because it shouldn't have come to this. Mankind does have the resources and the technology to cut greenhouse gas emissions. What we lack is the political will. People pay lip service to environmental responsibility, but they are rarely prepared to put their money where their mouth is. Global warming may turn out to be not so bad after all, but many other acts of environmental vandalism are manifestly reckless: the depletion of the ozone layer, the destruction of rain forests, the pollution of the oceans. Giving up on global warming will set an ugly precedent.

_________________________________________________________________

APRIL GORNIK
Artist, New York City; Danese Gallery
[gornik100.jpg]

The exact effect of art can't be controlled or fully anticipated

Great art makes itself vulnerable to interpretation, which is one reason that it keeps being stimulating and fascinating for generations. The problem inherent in this is that art could inspire malevolent behavior, as per the notion popularly expressed by A Clockwork Orange. When I was young, aspiring to be a conceptual artist, it disturbed me greatly that I couldn't control the interpretation of my work.
When I began painting, it was even worse; even I wasn't completely sure of what my art meant. That seemed dangerous for me, personally, at that time. I gradually came not only to respect the complexity and inscrutability of painting and art, but to see how it empowers the object. I believe that works of art are animated by their creators, and remain able to generate thoughts, feelings, responses. However, the fact is that the exact effect of art can't be controlled or fully anticipated.

_________________________________________________________________

JAMSHED BHARUCHA
Professor of Psychology, Provost, Senior Vice President, Tufts University
[bharucha100.jpg]

The more we discover about cognition and the brain, the more we will realize that education as we know it does not accomplish what we believe it does

It is not my purpose to echo familiar critiques of our schools. My concerns are of a different nature and apply to the full spectrum of education, including our institutions of higher education, which arguably are the finest in the world. Our understanding of the intersection between genetics and neuroscience (and their behavioral correlates) is still in its infancy. This century will bring forth an explosion of new knowledge on the genetic and environmental determinants of cognition and brain development, on what and how we learn, on the neural basis of human interaction in social and political contexts, and on variability across people. Are we prepared to transform our educational institutions if new science challenges cherished notions of what and how we learn? As we acquire the ability to trace genetic and environmental influences on the development of the brain, will we as a society be able to agree on what our educational objectives should be? Since the advent of scientific psychology we have learned a lot about learning. In the years ahead we will learn a lot more that will continue to challenge our current assumptions.
We will learn that some things we currently assume are learnable are not (and vice versa), that some things that are learned successfully don't have the impact on future thinking and behavior that we imagine, and that some of the learning that impacts future thinking and behavior is not what we spend time teaching. We might well discover that the developmental time course for optimal learning from infancy through the life span is not reflected in the standard educational time line around which society is organized. As we discover more about the gulf between how we learn and how we teach, hopefully we will also discover ways to redesign our systems -- but I suspect that the latter will lag behind the former. Our institutions of education certify the mastery of spheres of knowledge valued by society. Several questions will become increasingly pressing, and are even pertinent today. How much of this learning persists beyond the time at which acquisition is certified? How does this learning impact the lives of our students? How central is it in shaping the thinking and behavior we would like to see among educated people as they navigate, negotiate and lead in an increasingly complex world? We know that tests and admissions processes are selection devices that sort people into cohorts on the basis of excellence on various dimensions. We know less about how much even our finest examples of teaching contribute to human development over and above selection and motivation. Even current knowledge about cognition (specifically, our understanding of active learning, memory, attention, and implicit learning) has not fully penetrated our educational practices, because of inertia as well as a natural lag in the application of basic research. For example, educators recognize that active learning is superior to the passive transmission of knowledge. Yet we have a long way to go to adapt our educational practices to what we already know about active learning. 
We know from research on memory that learning trials bunched up in time produce less long-term retention than the same learning trials spread over time. Yet we compress learning into discrete packets called courses, we test learning at the end of a course of study, and then we move on. Furthermore, memory for both facts and methods of analytic reasoning is context-dependent. We don't know how much of this learning endures, how well it transfers to contexts different from the ones in which the learning occurred, or how it influences future thinking. At any given time we attend to only a tiny subset of the information in our brains or impinging on our senses. We know from research on attention that information is processed differently by the brain depending upon whether or not it is attended, and that many factors -- endogenous and exogenous -- control our attention. Educators have been aware of the role of attention in learning, but we are still far from understanding how to incorporate this knowledge into educational design. Moreover, new information presented in a learning situation is interpreted and encoded in terms of prior knowledge and experience; the increasingly diverse backgrounds of students placed in the same learning contexts imply that the same information may vary in its meaningfulness to different students and may be recalled differently. Most of our learning is implicit, acquired automatically and unconsciously from interactions with the physical and social environment. Yet language -- and hence explicit, declarative or consciously articulated knowledge -- is the currency of formal education. Social psychologists know that what we say about why we think and act as we do is but the tip of a largely unconscious iceberg that drives our attitudes and our behavior.
Even as cognitive and social neuroscience reveals the structure of these icebergs under the surface of consciousness (for example, persistent cognitive illusions, decision biases and perceptual biases to which even the best educated can be unwitting victims), it will be less clear how to shape or redirect them. Research in social cognition shows clearly that racial, cultural and other social biases get encoded automatically by internalizing stereotypes and cultural norms. While we might learn about this research in college, we aren't sure how to counteract these factors in the very minds that have acquired this knowledge. We are well aware of the power of non-verbal auditory and visual information, which, when amplified by electronic media, captures the attention of our students and sways millions. Future research should give us a better understanding of nuanced non-verbal forms of communication, including their universal and culturally based aspects, as they are manifest in social, political and artistic contexts. Even the acquisition of declarative knowledge through language -- the traditional domain of education -- is being usurped by the internet at our fingertips. Our university libraries and publication models are responding to the opportunities and challenges of the information age. But we will need to rethink some of our methods of instruction too. Will our efforts at teaching be drowned out by information from sources more powerful than even the best classroom teacher? It is only a matter of time before we have brain-related technologies that can alter or supplement cognition, influence what and how we learn, and increase competition for our limited attention. Imagine the challenges for institutions of education in an environment in which these technologies are readily available, for better or worse. The brain is a complex organ, and we will discover more of this complexity.
Our physical, social and information environments are also complex and are becoming more so through globalization and advances in technology. There will be no simple design principles for how we structure education in response to these complexities. As elite colleges and universities, we see increasing demand for the branding we confer, but we will also see greater scrutiny from society for the education we deliver. Those of us in positions of academic leadership will need wisdom and courage to examine, transform and justify our objectives and methods as educators. _________________________________________________________________ JORDAN POLLACK Computer Scientist, Brandeis University [pollack100.jpg] Science as just another Religion We scientists like to think that our "way of knowing" is special. Instead of holding beliefs based on faith in invisible omniscient deities, or parchments transcribed from oral cultures, we use the scientific method to discover and know. Truth may be eternal, but human knowledge of that truth evolves over time, as new questions are asked, data is recorded, hypotheses are tested, and replication and refutation mechanisms correct the record. So it is a very dangerous idea to consider Science as just another Religion. It's not my idea, but one I noticed growing in a set of Lakoffian frames within the Memesphere. One of these frames is that scientists are doom-and-gloom prophets. For example, at a recent popular technology conference, a parade of speakers spoke about the threats of global warming, the sea level rising by 18 feet and destroying cities, more category 5 hurricanes, etc. It was quite a reversal from the positivistic techno-utopian promises of miraculous advances in medicine, computers, and weaponry that have allowed science to bloom in the late 20th century. A friend pointed out that -- in the days before PowerPoint -- these scientists might be wearing sandwich-board signs saying "The End is Near!"
Another element in the framing of science as a religion is the response to evidence-based policy. Scientists who do take political stands on "moral" issues such as stem-cell research, the death penalty, nuclear weapons, global warming, etc., can be sidelined as atheists, humanists, or agnostics who have no moral or ethical standing outside their narrow specialty (as compared to, say, televangelist preachers). A third, and the most nefarious, frame casts a theory as one opinion among others, which should be represented out of fairness or tolerance. This is the subterfuge used by Intelligent Design Creationists. We may believe in the separation of church and state, but that firewall has fallen. Science and Reason are losing political battles to Superstition and Ignorance. Politics works by rewarding friends and punishing enemies, and while our individual votes may be private, exit polls have proven that Science didn't vote for the incumbent. There seem to be three choices going forward: Reject, Accommodate, or Embrace. One path is to go on the attack against religion in the public sphere. In his book The End of Faith, Sam Harris points out that humoring people who believe in God is like humoring people who believe that "a diamond the size of a refrigerator" is buried in their back yard. There is a fine line between pushing God out of our public institutions and repeating the religious intolerance of regimes past. A second is to accommodate Faith-Based Science. Since, from the perspective of government, research is just another special interest feeding at the public trough, we should change our model to be more accommodating to political reality. Research is already sold like highway construction projects, with a linear accelerator for your state and a supercomputer center for mine, all done through direct appropriations. All that needs to change is the justification for such spending. How would Faith-Based Science work?
Well, Physics could sing the psalm that Perpetual Motion would solve the energy crisis, thereby triggering a $500 billion program in free energy machines. (Of course, God is on our side to repeal the Second Law of Thermodynamics!) Astronomy could embrace Astrology and do grassroots PR through Daily Horoscopes to gain mass support for a new space program. In fact, an anti-gravity initiative could pass today if it were spun as a repeal of the "heaviness tax." Using the renaming principle, the SETI program can be re-legalized and brought back to life as the "Search for God" project. Finally, the third idea is to actually embrace this dangerous idea and organize a new open-source spiritual and moral movement. I think a new, greener religion, based on faith in the Gaia Hypothesis and an 11th commandment to "Protect the Earth," could catch on, especially if welcoming to existing communities of faith. Such a movement could be a new pulpit from which the evidence-based silent majority can speak with both moral force and evangelical fervor about issues critical to the future of our planet. _________________________________________________________________ JUAN ENRIQUEZ CEO, Biotechonomy; Founding Director, Harvard Business School's Life Sciences Project; Author, The Untied States of America [enriquez100.jpg] Technology can untie the U.S. Everyone grows and dies; the same is true of countries. The only question is how long one postpones the inevitable. In the case of some countries, life spans can be very long, so it is worth asking: is the U.S. in adolescence, middle age, or old age? Do science and technology accelerate or offset demise? And finally, "how many stars will be in the U.S. flag in fifty years?" There has yet to be a single U.S. president buried under the same flag he was born under, yet we oft take continuity for granted.
Just as almost no newlyweds expect to divorce, citizens rarely assume their beloved country, flag and anthem might end up an exhibit in an archeology museum. But countries rich and poor, Asian, African, and European have been untying time and again. In the last five decades the number of UN members has tripled. This trend goes way beyond the de-colonization of the 1960s, and it is not exclusive to failed states; it is a daily debate within the United Kingdom, Italy, France, Belgium, the Netherlands, Austria, and many others. So far the Americas have remained mostly impervious to these global trends, but, even if in God you trust, there are no guarantees. Over the next decade waves of technology will wash over the U.S. Almost any applied field you care to look at promises extraordinary change, opportunities, and challenges. (Witness the entries in this edition of Edge). How countries adapt to massive, rapid upheaval will go a long way towards determining the eventual outcome. To paraphrase Darwin, it is not the strongest, nor the largest, that survive; rather, it is those best prepared to cope with change. It is easy to argue that the U.S. could be a larger, more powerful country in fifty years. But it is also possible that, like so many other great powers, it could begin to unravel and untie. This is not something that depends on what we decide to do fifty years hence; to a great extent it depends on what we choose to do, or choose to ignore, today. There are more than a few worrisome trends. Future ability to generate wealth depends on techno-literacy. But educational excellence, particularly in grammar and high schools, is far from uniform, and it is not world class. Time and again the U.S. does poorly, particularly in regards to math and science, when compared with its major trading partners.
Internally, there are enormous disparities between schools and between the number of students that pass state competency exams and what federal tests tell us about the same students. There are also large gaps in techno-literacy between ethnic groups. By 2050 close to 40% of the U.S. population will be Hispanic and African American. These groups receive 3% of the PhDs in math and science today. How we prepare kids for a life sciences, materials, robotics, IT, and nanotechnology driven world is critical. But we currently invest $22,000 in federal funds in those over 65 and just over $2,000 in those under sixteen... As ethnic, age, and regional gaps in the ability to adapt increase, there are many who are wary of and frustrated by technology, open borders, free trade, and smart immigrants. Historically, when others use newfangled ways to leap ahead, it can lead to a conservative response. This is likeliest within those societies and groups that have the most to lose, often among those who have been the most successful. One often observes a reflexive response: stop the train; I want to get off. Or, as the Red Sox now say, just wait till last year. No more teaching evolution, no more research into stem cells, no more Indian or Chinese or Mexican immigrants, no matter how smart or hardworking they might be. These individual battles are signs of a creeping xenophobia, isolationism, and fury. Within the U.S. there are many who are adapting very successfully. They tend to concentrate in a very few zip codes, life science clusters like 92121 (between Salk, Scripps, and UCSD) and techno-empires like 02139 (MIT). Most of the nation's wealth and taxes are generated by a few states and, within these states, within a few square miles. It is those who live in these areas that are most affronted by restrictions on research, the lack of science-literate teenagers, and the reliance on God instead of science.
Politicians well understand these divides and they have gerrymandered their own districts to reflect them. Because competitive congressional elections are rarer today than turnovers within the Soviet Politburo, there is rarely an open debate and discussion as to why other parts of the country act and think so differently. The Internet and cable further narrowcast news and views, tending to reinforce what one's neighbors and communities already believe. Positions harden. Anger at "the others" mounts. Add a large and mounting debt to this equation, along with politicized religion, and the mixture becomes explosive. The average household now owes over $88,000 and the present value of what we have promised to pay is now about $473,000. There is little willingness within Washington to address a mounting deficit, never mind the current account imbalance. Facing the next electoral challenge, few seem to remember the last act of many an empire is to drive itself into bankruptcy. Sooner or later we could witness some very bitter arguments about who gets and who pays. In developed country after developed country, it is often the richest, not the ethnically or religiously repressed, that first seek autonomy and eventually dissolution. In this context it is worth recalling that New England, not the South, has been the most secession prone region. As the country expanded, New Englanders attempted to include the right to untie into the constitution; the argument was that as this great country expanded South and West they would lose control over their political and economic destiny. Perhaps this is what led to four separate attempts to untie the Union. When we assume stability and continuity we can wake up to irreconcilable differences. 
Science and a knowledge-driven economy can allow a few folks to build powerful and successful countries very quickly (witness Korea, Taiwan, Singapore, Ireland), but changes of this magnitude can also bury or split the formerly great who refuse to adapt, as well as those who practice bad governance. If we do not begin to address some current divides quickly we could live to see an Un-Tied States of America. _________________________________________________________________ STEPHEN M. KOSSLYN Psychologist, Harvard University; Author, Wet Mind [kosslyn100.jpg] A Science of the Divine? Here's an idea that many academics may find unsettling and dangerous: God exists. And here's another idea that many religious people may find unsettling and dangerous: God is not supernatural, but rather part of the natural order. Simply stating these ideas in the same breath invites them to scrape against each other, and sparks begin to fly. To avoid such conflict, Stephen Jay Gould famously argued that we should separate religion and science, treating them as distinct "magisteria." But science leads many of us to try to understand all that we encounter with a single, grand and glorious overarching framework. In this spirit, let me try to suggest one way in which the idea of a "supreme being" can fit into a scientific worldview. I offer the following not to advocate the ideas, but rather simply to illustrate one (certainly not the only) way that the concept of God can be approached scientifically. 1.0. First, here's the specific conception of God I want to explore: God is a "supreme being" that transcends space and time, permeates our world but also stands outside of it, and can intervene in our daily lives (partly in response to prayer). 2.0. A way to begin to think about this conception of the divine rests on three ideas: 2.1. Emergent properties.
There are many examples in science where aggregates produce an entity that has properties that cannot be predicted entirely from the elements themselves. For example, neurons in large numbers produce minds; moreover, minds in large numbers produce economic, political, and social systems. 2.2. Downward causality. Events at "higher levels" (where emergent properties become evident) can in turn feed back and affect events at lower levels. For example, chronic stress (a mental event) can cause parts of the brain to become smaller. Similarly, an economic depression or the results of an election affect the lives of the individuals who live in that society. 2.3. The Ultimate Superset. The Ultimate Superset (superordinate set) of all living things may have an equivalent status to an economy or culture. It has properties that emerge from the interactions of living things and groups of living things, and in turn can feed back to affect those things and groups. 3.0. Can we conceive of God as an emergent property of all living things that can in turn affect its constituents? Here are some ways in which this idea is consistent with the nature of God, as outlined at the outset. 3.1. This emergent entity is "transcendent" in the sense that it exists in no specific place or time. Like a culture or an economy, God is nowhere, although the constituent elements occupy specific places. As for transcending time, consider this analogy: Imagine that 1/100th of the neurons in your brain were replaced every hour, and each old neuron programmed a new one so that the old one's functionality was preserved. After 100 hours your brain would be an entirely new organ -- but your mind would continue to exist as it had been before. Similarly, as each citizen dies and is replaced by a child, the culture continues to exist (and can grow and develop, with a "life of its own"). So too with God. 
For example, in the story of Jacob's ladder, Jacob realizes "Surely the Lord is in this place, and I did not know it." (Genesis 28: 16) I interpret this story as illustrating that God is everywhere but nowhere. The Ultimate Superset permeates our world but also stands outside of (or, more specifically, "above") it. 3.2. The Ultimate Superset can affect our individual lives. Another analogy: Say that geese flying south for the winter have rather unreliable magnetic field detectors in their brains. However, there's a rule built into their brains that leads them to try to stay near their fellows as they fly. The flock as a whole would navigate far better than any individual bird, because the noise in the individual bird brain navigation systems would cancel out. The emergent entity -- the flock -- in turn would affect the individual geese, helping them to navigate better than they could on their own. 3.3. When people pray to the Lord, they beseech intervention on their or others' behalf. The view that I've been outlining invites us to think of the effects of prayer as akin to becoming more sensitive to the need to stay close to the other birds in the flock: By praying, one can become more sensitive to the emergent "supreme being." Such increased sensitivity may imply that one can contribute more strongly to this emergent entity. By analogy, it's as if one of those geese became aware of the "keep near" rule, and decided to nudge the other birds in a particular direction -- which thereby allows it to influence the flock's effect on itself. To the extent that prayer puts one closer to God, one's plea for intervention will have a larger impact on the way that The Ultimate Superset exerts downward causality. But note that, according to this view, God works rather slowly. Think of dropping rocks in a pond: it takes time for the ripples to propagate and eventually be reflected back from the edge, forming interference patterns in the center of the pond. 4.0. 
A crucial idea in monotheistic religions is that God is the Creator. The present approach may help us begin to grapple with this idea, as follows. 4.1. First, consider each individual person. The environment plays a key role in creating who and what we are because there are far too few genes to program every aspect of our brains. For example, when you were born, your genes programmed many connections in your visual areas, but did not specify the precise circuits necessary to determine how far away objects are. As an infant, the act of reaching for an object tuned the brain circuits that estimate how far away the object was from you. Similarly, your genes graced you with the ability to acquire language, but not with a specific language. The act of acquiring a language shapes your brain (which in turn may make it difficult to acquire another language, with different sounds and grammar, later in life). Moreover, cultural practices configure the brains of members of the culture. A case in point: the Japanese have many forms of bowing, which are difficult for a Westerner to master relatively late in life; when we try to bow, we "bow with an accent." 4.2. And the environment not only played an essential role in how we developed as children, but also plays a continuing role in how we develop over the course of our lives as adults. The act of learning literally changes who and what we are. 4.3. According to this perspective, it's not just negotiating the physical world and sociocultural experience that shape the brain: The Ultimate Superset -- the emergent property of all living things -- affects all of the influences that "make us who and what we are," both as we develop during childhood and continue to learn and develop as adults. 4.4. Next, consider our species. One could try to push this perspective into a historical context, and note that evolution by natural selection reflects the effects of interactions among living things. 
If so, then the emergent properties of such interactions could feed back to affect the course of evolution itself. In short, it is possible to begin to view the divine through the lens of science. But such reasoning does no more than set the stage; to be a truly dangerous idea, this sort of proposal must be buttressed by the results of empirical test. At present, my point is not to convince, but rather to intrigue. As much as I admired Stephen Jay Gould (and I did, very much), perhaps he missed the mark on this one. Perhaps there is a grand project waiting to be launched, to integrate the two great sources of knowledge and belief in the world today -- science and religion. _________________________________________________________________ JERRY COYNE Evolutionary Biologist; Professor, Department of Ecology and Evolution, University of Chicago; Author (with H. Allen Orr), Speciation [coyne100.jpg] Many behaviors of modern humans were genetically hard-wired (or soft-wired) in our distant ancestors by natural selection For me, one idea that is dangerous and possibly true is an extreme form of evolutionary psychology -- the view that many behaviors of modern humans were genetically hard-wired (or soft-wired) in our distant ancestors by natural selection. The reason I say that this idea might be true is that we cannot be sure of the genetic and evolutionary underpinnings of most human behaviors. It is difficult or impossible to test many of the conjectures of evolutionary psychology. Thus, we can say only that behaviors such as the sexual predilections of men versus women, and the extreme competitiveness of males, are consistent with evolutionary psychology. But consistency arguments have two problems. First, they are not hard scientific proof. Are we satisfied that sonnets are phallic extensions simply because some male poets might have used them to lure females? Such arguments fail to meet the normal standards of scientific evidence. 
Second, as is well known, one can make consistency arguments for virtually every human behavior. Given the possibilities of kin selection (natural selection for behaviors that do no good for their performers but are advantageous to their relatives) and reciprocal altruism, and our ignorance of the environments of our ancestors, there is no trait beyond evolutionary explanation. Indeed, there are claims for the evolutionary origin of even manifestly maladaptive behaviors, such as homosexuality, priestly celibacy, and extreme forms of altruism (e.g., self-sacrifice during wartime). But surely we cannot consider it scientifically proven that genes for homosexuality are maintained in human populations by kin selection. This remains possible but undemonstrated. Nevertheless, much of human behavior does seem to conform to Darwinian expectations. Males are promiscuous and females coy. We treat our relatives better than we do other people. The problem is where to draw the line between those behaviors that are so obviously adaptive that no one doubts their genesis (e.g., sleeping and eating), those which are probably but not as obviously adaptive (e.g., human sexual behavior and our fondness for fats and sweets) and those whose adaptive basis is highly speculative (e.g., the origin of art and our love of the outdoors). Although I have been highly critical of evolutionary psychology, I have not criticized it from political motives, nor do I think that the discipline is in principle misguided. Rather, I have been critical because evolutionary psychologists seem unwilling to draw lines between what can be taken as demonstrated and what remains speculative, making the discipline more of a faith than a science. This lack of rigor endangers the reputation of all of evolutionary biology, making our endeavors seem to be merely the concoction of ingenious stories.
If we are truly to understand human nature, and use this knowledge constructively, we must distinguish the probably true from the possibly true. So, why do I see evolutionary psychology as dangerous? I think it is because I am afraid to see myself and my fellow humans as mere marionettes dancing on genetic strings. I would like to think that we have immense freedom to better ourselves as individuals and to create a just and egalitarian society. Granted, genetics is not destiny, but neither are we completely free of our evolutionary baggage. Might genetics really hold a leash on our capacity to change? If so, then some claims of evolutionary psychology give us convenient but dangerous excuses for behaviors that seem unacceptable. It is all too easy, for example, for philandering males to excuse their behavior as evolutionarily justified. Evolutionary psychologists argue that it is possible to overcome our evolutionary heritage. But what if it is not so easy to take the Dawkinsian road and "rebel against the tyranny of the selfish replicators"? _________________________________________________________________ ERNST PÖPPEL Neuroscientist, Chairman, Board of Directors, Human Science Center and Department of Medical Psychology, Munich University, Germany; Author, Mindworks [poppel100.jpg] My belief in science The average life expectancy of a species on this globe is just a few million years. From an external point of view, it would be nothing special if humankind suddenly disappeared. We have been here for some time. With humans no longer around, evolutionary processes would have an even better chance to fill in all those ecological niches which have been created by human activities. As we change the world, and as thousands of species are lost every year because of human activities, we provide a new and productive environment for the creation of new species.
Thus, humankind is very creative with respect to providing a frame for new evolutionary trajectories, and humankind would be even more creative if it disappeared altogether. If somebody (unfortunately not our descendants) were to visit this globe some time later, they would meet many new species which owe their existence to the presence, and then the disappearance, of humankind. But this is not going to happen, because we are doing science. With science we apparently get a better understanding of basic principles in nature, we have a chance to improve quality of life, and we can develop means to extend the life expectancy of our species. Unfortunately, some of these scientific activities have a paradoxical effect resulting in a higher risk for a common disappearance. Maybe science will not be so effective after all at preventing our disappearance. Only now comes my dangerous idea as my (!) dangerous idea. It is not so difficult to come up with a dangerous scenario on a general level, but if one takes such a question seriously on a personal level as well, one has to meditate on an individual scenario. I am very grateful for this question formulated by Steven Pinker, as it forced me to visit my episodic memory and to think about what has been and still is "my dangerous idea". Although nobody else might be interested in a personal statement, I say it anyway: My dangerous idea is my belief in science. In all my research (in the field of temporal perception or visual processes) I have a basic trust in the scientific activities, and I actually believe the results I have obtained. And I believe the results of others. But why? I know that there are so many unknown and unknowable variables that are part of the experimental setup and which cannot be controlled. How can I trust in spite of so many unknowables (does this word exist in English?)? Furthermore, can I really rely on my thinking, can I trust my eyes and ears?
Can I be so sure about my scientific activities that I communicate the results to others with pride? If I look at the complexity of the brain, how is it possible that something reasonable comes out of this network? How is it possible that a face that I see or a thought that I have maintains its identity over time? If I have no access to what goes on in my brain, how can I be so proud (how can anybody be so proud) of scientific achievements? _________________________________________________________________ GEOFFREY MILLER Evolutionary Psychologist, University of New Mexico; Author, The Mating Mind Runaway consumerism explains the Fermi Paradox The story goes like this: Sometime in the 1940s, Enrico Fermi was talking about the possibility of extra-terrestrial intelligence with some other physicists. They were impressed that our galaxy holds 100 billion stars, that life evolved quickly and progressively on earth, and that an intelligent, exponentially-reproducing species could colonize the galaxy in just a few million years. They reasoned that extra-terrestrial intelligence should be common by now. Fermi listened patiently, then asked simply, "So, where is everybody?" That is, if extra-terrestrial intelligence is common, why haven't we met any bright aliens yet? This conundrum became known as Fermi's Paradox. The paradox has become ever more baffling. Over 150 extrasolar planets have been identified in the last few years, suggesting that life-hospitable planets orbit most stars. Paleontology shows that organic life evolved very quickly after earth's surface cooled and became life-hospitable. Given simple life, evolution shows progressive trends towards larger bodies, brains, and social complexity. Evolutionary psychology reveals several credible paths from simpler social minds to human-level creative intelligence. Yet 40 years of intensive searching for extra-terrestrial intelligence have yielded nothing.
No radio signals, no credible spacecraft sightings, no close encounters of any kind. So, it looks as if there are two possibilities. Perhaps our science over-estimates the likelihood of extra-terrestrial intelligence evolving. Or, perhaps evolved technical intelligence has some deep tendency to be self-limiting, even self-exterminating. After Hiroshima, some suggested that any aliens bright enough to make colonizing space-ships would be bright enough to make thermonuclear bombs, and would use them on each other sooner or later. Perhaps extra-terrestrial intelligence always blows itself up. Fermi's Paradox became, for a while, a cautionary tale about Cold War geopolitics. I suggest a different, even darker solution to Fermi's Paradox. Basically, I think the aliens don't blow themselves up; they just get addicted to computer games. They forget to send radio signals or colonize space because they're too busy with runaway consumerism and virtual-reality narcissism. They don't need Sentinels to enslave them in a Matrix; they do it to themselves, just as we are doing today. The fundamental problem is that any evolved mind must pay attention to indirect cues of biological fitness, rather than tracking fitness itself. We don't seek reproductive success directly; we seek tasty foods that tended to promote survival and luscious mates who tended to produce bright, healthy babies. Modern results: fast food and pornography. Technology is fairly good at controlling external reality to promote our real biological fitness, but it's even better at delivering fake fitness -- subjective cues of survival and reproduction, without the real-world effects. Fresh organic fruit juice costs so much more than nutrition-free soda. Having real friends is so much more effort than watching Friends on TV. Actually colonizing the galaxy would be so much harder than pretending to have done it when filming Star Wars or Serenity. 
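The colonization-speed claim behind Fermi's question is easy to check with back-of-the-envelope arithmetic. A minimal sketch follows; every parameter value (hop distance, ship speed, settling time) is an illustrative guess of mine, not a figure from the essay:

```python
# Back-of-the-envelope model of a colonization wavefront crossing the galaxy.
# All parameter values are illustrative assumptions, not measured data.

def colonization_time_years(galaxy_diameter_ly=100_000,
                            hop_distance_ly=5,
                            ship_speed_fraction_c=0.01,
                            settle_time_years=500):
    """Years for a wave of self-reproducing colonies to cross the galaxy.

    Each 'hop' is a trip to a nearby star, followed by a pause while the
    new colony builds its own ships before launching the next hop.
    """
    travel_time = hop_distance_ly / ship_speed_fraction_c  # years per trip
    years_per_hop = travel_time + settle_time_years
    wavefront_speed = hop_distance_ly / years_per_hop      # light-years/year
    return galaxy_diameter_ly / wavefront_speed

t = colonization_time_years()
print(f"{t / 1e6:.1f} million years")  # ~20 million years with these guesses
```

Even with slow 1%-of-lightspeed ships and centuries of settling per colony, the crossing time comes out in the tens of millions of years, a blink next to the galaxy's ten-billion-year age, which is what makes the silence puzzling.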
Fitness-faking technology tends to evolve much faster than our psychological resistance to it. The printing press is invented; people read more novels and have fewer kids; only a few curmudgeons lament this. The Xbox 360 is invented; people would rather play a high-resolution virtual ape in Peter Jackson's King Kong than be a perfect-resolution real human. Teens today must find their way through a carnival of addictively fitness-faking entertainment products: MP3, DVD, TiVo, XM radio, Verizon cellphones, Spice cable, EverQuest online, instant messaging, Ecstasy, BC Bud. The traditional staples of physical, mental, and social development (athletics, homework, dating) are neglected. The few young people with the self-control to pursue the meritocratic path often get distracted at the last minute -- the MIT graduates apply to do computer game design for Electronic Arts, rather than rocket science for NASA. Around 1900, most inventions concerned physical reality: cars, airplanes, zeppelins, electric lights, vacuum cleaners, air conditioners, bras, zippers. In 2005, most inventions concern virtual entertainment -- the top 10 patent-recipients are usually IBM, Matsushita, Canon, Hewlett-Packard, Micron Technology, Samsung, Intel, Hitachi, Toshiba, and Sony -- not Boeing, Toyota, or Wonderbra. We have already shifted from a reality economy to a virtual economy, from physics to psychology as the value-driver and resource-allocator. We are already disappearing up our own brainstems. Freud's pleasure principle triumphs over the reality principle. We narrow-cast human-interest stories to each other, rather than broad-casting messages of universal peace and progress to other star systems. Maybe the bright aliens did the same. I suspect that a certain period of fitness-faking narcissism is inevitable after any intelligent life evolves.
This is the Great Temptation for any technological species -- to shape their subjective reality to provide the cues of survival and reproductive success without the substance. Most bright alien species probably go extinct gradually, allocating more time and resources to their pleasures, and less to their children. Heritable variation in personality might allow some lineages to resist the Great Temptation and last longer. Those who persist will evolve more self-control, conscientiousness, and pragmatism. They will evolve a horror of virtual entertainment, psychoactive drugs, and contraception. They will stress the values of hard work, delayed gratification, child-rearing, and environmental stewardship. They will combine the family values of the Religious Right with the sustainability values of the Greenpeace Left. My dangerous idea-within-an-idea is that this, too, is already happening. Christian and Muslim fundamentalists, and anti-consumerism activists, already understand exactly what the Great Temptation is, and how to avoid it. They insulate themselves from our Creative-Class dream-worlds and our EverQuest economics. They wait patiently for our fitness-faking narcissism to go extinct. Those practical-minded breeders will inherit the earth, as like-minded aliens may have inherited a few other planets. When they finally achieve Contact, it will not be a meeting of novel-readers and game-players. It will be a meeting of dead-serious super-parents who congratulate each other on surviving not just the Bomb, but the Xbox. They will toast each other not in a soft-porn Holodeck, but in a sacred nursery. _________________________________________________________________ ROBERT SHAPIRO Professor Emeritus, Senior Research Scientist, Department of Chemistry, New York University. 
Author, Planetary Dreams We shall understand the origin of life within the next 5 years Two very different groups will find this development dangerous, and for different reasons, but this outcome is best explained at the end of my discussion. Just over a half century ago, in the spring of 1953, a famous experiment brought enthusiasm and renewed interest to this field. Stanley Miller, mentored by Harold Urey, demonstrated that a mixture of small organic molecules (monomers) could readily be prepared by exposing a mixture of simple gases to an electrical spark. Similar mixtures were found in meteorites, which suggested that organic monomers may be widely distributed in the universe. If the ingredients of life could be made so readily, then why could they not just as easily assort themselves to form cells? In that same spring, however, another famous paper was published by James Watson and Francis Crick. They demonstrated that the heredity of living organisms was stored in a very large molecule called DNA. DNA is a polymer, a substance made by stringing many smaller units together, as links are joined to form a long chain. The clear connection between the structure of DNA and its biological function, and the geometrical beauty of the DNA double helix, led many scientists to consider it to be the essence of life itself. One flaw remained, however, to spoil this picture. DNA could store information, but it could not reproduce itself without the assistance of proteins, a different type of polymer. Proteins are also adept at increasing the rate of (catalyzing) many other chemical reactions that are considered necessary for life. The origin-of-life field became mired in the "chicken-or-the-egg" question. Which came first: DNA or proteins? An apparent answer emerged when it was found that another polymer, RNA (a cousin of DNA), could manage both heredity and catalysis. In 1986, Walter Gilbert proposed that life began with an "RNA World."
Life started when an RNA molecule that could copy itself was formed, by chance, in a pool of its own building blocks. Unfortunately, a half century of chemical experiments have demonstrated that nature has no inclination to prepare RNA, or even the building blocks (nucleotides) that must be linked together to form RNA. Nucleotides are not formed in Miller-type spark discharges, nor are they found in meteorites. Skilled chemists have prepared nucleotides in well-equipped laboratories, and linked them to form RNA, but neither chemists nor laboratories were present when life began on the early Earth. The Watson-Crick theory sparked a revolution in molecular biology, but it left the origin-of-life question at an impasse. Fortunately, an alternative solution to this dilemma has gradually emerged: neither DNA nor RNA nor protein were necessary for the origin of life. Large molecules dominate the processes of life today, but they were not needed to get it started. Monomers themselves have the ability to support heredity and catalysis. The key requirement is that a suitable energy source be available to assist them in the processes of self-organization. A demonstration of the principle involved in the origin of life would require only that a suitable monomer mixture be exposed to an appropriate energy source in a simple apparatus. We could then observe the very first steps in evolution. Some mixtures will work, but many others will fail, for technical reasons. Some dedicated effort will be needed in the laboratory to prove this point. Why have I specified five years for this discovery? The unproductive polymer-based paradigm is far from dead, and continues to consume the efforts of the majority of workers in the field. A few years will be needed to entice some of them to explore the other solution. I estimate that several years more (the time for a PhD thesis) might be required to identify a suitable monomer-energy combination, and perform a convincing demonstration. 
Who would be disturbed if such efforts should succeed? Many scientists have been attracted by the RNA World theory because of its elegance and simplicity. Some of them have devoted decades of their careers to efforts to prove it. They would not be pleased if Freeman Dyson's description proved to be correct: "life began with little bags, the precursors of cells, enclosing small volumes of dirty water containing miscellaneous garbage." A very different group would find this development as dangerous as the theory of evolution. Those who advocate creationism and intelligent design would feel that another pillar of their belief system was under attack. They have understood the flaws in the RNA World theory, and used them to support their supernatural explanation for life's origin. A successful scientific theory in this area would leave one less task for God to accomplish: the origin of life would be a natural (and perhaps frequent) result of the physical laws that govern this universe. This latter thought falls directly in line with the idea of Cosmic Evolution, which asserts that events since the Big Bang have moved almost inevitably in the direction of life. No miracle or immense stroke of luck was needed to get it started. If this should be the case, then we should expect to be successful when we search for life beyond this planet. We are not the only life that inhabits this universe. _________________________________________________________________ KAI KRAUSE Researcher, philosopher, software developer; Author, 3DScience: new Scanning Electron Microscope imagery Anty Gravity: Chaos Theory in an all too practical sense Dangerous Ideas? It is dangerous ideas you want? From this group of people? That in itself ought to be nominated as one of the more dangerous ideas... Danger is ubiquitous. If recent years have shown us anything, it should be that "very simple small events can cause real havoc in our society".
A few hooded youths play cat and mouse with the police: bang, thousands of burned cars put all of Paris into a complete state of paralysis, mandatory curfew and the entire system in shock and horror. My first thought was: what if any really smart set of people really set their mind to it... how utterly and scarily trivial it would be to disrupt the very fabric of life, to bring society to a dead stop? The relative innocence and stable period of the last 50 years may spiral into a nearly inevitable exposure to real chaos. What if it isn't haphazard testosterone-driven riots, where they cannibalize their own neighborhood, much like in L.A. in the 80s, but someone with real insight behind that criminal energy? What if Slashdotters start musing aloud about "Gee, the L.A. water supply is rather simplistic, isn't it?" An Open Source crime web, a Wiki for real WTO opposition? Hacking L.A. may be a lot easier than hacking IE. That is basic banter over a beer in a bar; I don't even want to actually speculate what a serious set of brainiacs could conjure up. And I refuse to even give it any more print space here. However, the danger of such sad memes is what requires our attention! In fact, I will broaden the spectrum still further: it's not violent crime and global terrorism I worry about, as much as the basic underpinning of our entire civilization coming apart, as such. No acts of malevolence, no horrible plans by evil dark forces, neither the singular "Bond Nemesis" kind, nor masses of religious fanatics. None of that needed... It is the glue that is coming apart to topple this tower. And no, I am not referring to "spiraling trillions of debt". No, what I am referring to is a slow process I observed over the last 30 years, ever since in my teens I wondered "How would this world work, if everyone were like me?" and realized: it wouldn't! It was amazing to me that there were just enough people to make just enough shoes so that everyone can avoid walking barefoot.
That there are people volunteering to spend day-in, day-out being dentists, and lawyers and salesmen. Almost any "jobjob" I look at, I have the most sincere admiration for the tenacity of the people... how do they do it? It would drive me nuts after hours, let alone years... Who makes those shoes? That was the wondrous introspection in adolescent phases, searching for a place in the jigsaw puzzle. But in recent years, the haunting question has come back to me: "How the hell does this world function at all? And does it, really?" I feel an alienation zapping through the channels; I can't find myself connecting with those groups of humanoids trouncing around MTV. Especially the glimpses of "real life": on daytime courtroom dramas or just looking at faces in the street. On every scale, the closer I observe it, the more the creeping realization haunts me: individuals, families, groups, neighborhoods, cities, states, countries... they all just barely hang in there, between debt and dysfunction. The whole planet looks like Anytown, with mini malls cutting up the landscape, and just down the road it's all white trash with rusty car wrecks in the back yard. A huge Groucho Club I don't want to be a member of. But it does go further: what is particularly disturbing to see is this desperate search for Individualism that has rampantly increased in the last decade or so. Everyone suddenly needs to be so special, be utterly unique. So unique that they race off like lemmings to get 'even more individual' tattoos, branded cattle, with branded chains in every mall, converging on a bland sameness worldwide, but every rap singer with ever more gold chains in ever longer stretched limos is singing the tune: Don't be a loser! Don't be normal! The desperation with which millions of youngsters try to be that one-in-a-million professional ball player may have been just a "sad but silly factoid" for a long time.
But now the tables are turning: the anthill is relying on the behaviour of the ants to function properly. And that implies: the social behaviour, the role playing, taking defined tasks and following them through. What if each ant suddenly wants to be the queen? What if soldiering and nest building and cleaning chores are just not cool enough any more? If AntTV shows them every day nothing but un-Ant behaviour...? In my youth we were whining about what to do and how to do it, but in the end, all of my friends did become "normal" humans, orthopedists and lawyers, social workers, teachers... There were always a few that lived on the edges of normality, like ending up as television celebrities, but on the whole: they were perfectly reasonable ants. 1.8 children, 2.7 cars, 3.3 TVs... Now: I am no longer confident that line will continue. If every honeymoon is now booked in Bali on a Visa card, and every kid in Borneo wants to play ball in NYC... can the network of society be pliable enough to accommodate total upheaval? And what if 2 billion Chinese and Indians raise a generation of kids staring 6+ hours a day into all-American values they can never attain... being taunted with Hollywood movies of heroic acts and pathetic dysfunctionality, coupled with ever increasing violence and disdain for ethics or morals. Seeing scenes of desperate youths in South American slums watching "Kill Bill" makes me think: this is just oxygen thrown into the fire... The ants will not play along much longer. The anthill will not survive if even a small fraction of the system is falling apart. Couple that inane drive for "Super Individualism" (and the Quest for Coolness by an ever increasing group destined to fail miserably) with the scarily simple realization of how effective even a small set of desperate people can become, then add the obvious penchant for religious fanaticism and you have an ugly picture of the long term future.
So many curves that grow upwards towards limits, so many statistics that show increases and no way to turn around. Many in this forum may speculate about infinite life spans, changing the speed of light, finding ways to decode consciousness, wormholes to other dimensions and finding grand unified theories. To make it clear: I applaud that! "It does take all kinds". Diversity is indeed one of the definitions of the meaning of life. Edge IS Applied Diversity. Those are viable and necessary questions for mankind as a whole, however: I believe we need to clean house, re-evaluate, redefine the priorities. While we look at the horizon here in these pages, it is the very ground beneath us that may be crumbling. The ant hill could really go to ant hell! Next year, let's ask for good ideas. Really practical, serious, good ideas. "The most immediate positive global impact of any kind that can be achieved within one year?". How to envision Internet3 and Web3 as a real platform for a global brainstorming with 6+ billion potential participants. This was not meant to sound like doom and gloom naysaying. I see myself as a sincere optimist, but one who believes in realistic pessimism as a useful tool to initiate change. _________________________________________________________________ CARLO ROVELLI Professor of Physics, University of the Mediterranean, Marseille; Member, Institut Universitaire de France; Author, Quantum Gravity What the physics of the 20th century says about the world might in fact be true There is a major "dangerous" scientific idea in contemporary physics, with a potential impact comparable to that of Copernicus or Darwin. It is the idea that what the physics of the 20th century says about the world might in fact be true. Let me explain. Take quantum mechanics. If taken seriously, it changes our understanding of reality truly dramatically. For instance, if we take quantum mechanics seriously, we cannot think that objects ever have a definite position.
They have a position only when they interact with something else. And even in this case, they are in that position only with respect to that "something else": they are still without position with respect to the rest of the world. This is a change of image of the world far more dramatic than Copernicus's. And also a change in our possibility of thinking about ourselves far more far-reaching than Darwin's. Still, few people take the quantum revolution really seriously. The danger is exorcized by saying "well, quantum mechanics is only relevant for atoms and very small objects...", or by other similar strategies aimed at not taking the theory seriously. We still haven't digested that the world is quantum mechanical, or the immense conceptual revolution needed to make sense of this basic factual discovery about nature. Another example: take Einstein's relativity theory. Relativity makes completely clear that asking "what happens right now on Andromeda?" is complete nonsense. There is no right now elsewhere in the universe. Nevertheless, we keep thinking of the universe as if there were an immense external clock that ticked away the instants, and we have a lot of difficulty in adapting to the idea that "the present state of the universe right now" is physically meaningless. In these cases, what we do is use concepts that we have developed in our very special environment (characterized by low velocities, low energy...) and think of the world as if it were all like that. We are like ants that have grown up in a little garden with green grass and small stones, and cannot think of reality as anything other than green grass and small stones. I think that seen from 200 years in the future, the dangerous scientific idea that was around at the beginning of the 20th century, and that everybody was afraid to accept, will simply be that the world is completely different from our simple-minded picture of it. As the physics of the 20th century had already shown.
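Rovelli's point about "right now on Andromeda" is the textbook relativity of simultaneity: observers in relative motion disagree about which distant events are simultaneous with "now", by an amount Δt = v·d/c². A minimal numerical sketch (the function name and round figures for walking speed and Andromeda's distance are my own illustrative choices):

```python
# Relativity of simultaneity: how far the "current moment" at a remote
# distance shifts between two observers passing each other at speed v.
# Uses the standard formula  delta_t = v * d / c**2.

C = 299_792_458.0   # speed of light in m/s (SI defined value)
LY = 9.4607e15      # one light-year in meters (approximate)

def simultaneity_shift_seconds(v_mps, distance_ly):
    """Disagreement in 'now' at the given distance, for relative speed v."""
    return v_mps * (distance_ly * LY) / C**2

# Two people strolling past each other (1.4 m/s), Andromeda ~2.5 million ly away:
shift = simultaneity_shift_seconds(v_mps=1.4, distance_ly=2.5e6)
print(f"{shift / 86_400:.1f} days")  # a few days of disagreement
```

Walking speed alone shifts Andromeda's "now" by several days between the two observers, which is why "the present state of the universe right now" has no observer-independent meaning.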
What makes me smile is that even many of today's "audacious scientific speculations" about things like extra dimensions, multiple universes, and the like, are not only completely unsupported experimentally, but are also formulated within a world view that, on close inspection, has not yet digested quantum mechanics and relativity! _________________________________________________________________ RICHARD DAWKINS Evolutionary Biologist, Charles Simonyi Professor for the Understanding of Science, Oxford University; Author, The Ancestor's Tale Let's all stop beating Basil's car Ask people why they support the death penalty or prolonged incarceration for serious crimes, and the reasons they give will usually involve retribution. There may be passing mention of deterrence or rehabilitation, but the surrounding rhetoric gives the game away. People want to kill a criminal as payback for the horrible things he did. Or they want to give "satisfaction" to the victims of the crime or their relatives. An especially warped and disgusting application of the flawed concept of retribution is Christian crucifixion as "atonement" for "sin". Retribution as a moral principle is incompatible with a scientific view of human behaviour. As scientists, we believe that human brains, though they may not work in the same way as man-made computers, are as surely governed by the laws of physics. When a computer malfunctions, we do not punish it. We track down the problem and fix it, usually by replacing a damaged component, either in hardware or software. Basil Fawlty, British television's hotelier from hell created by the immortal John Cleese, was at the end of his tether when his car broke down and wouldn't start. He gave it fair warning, counted to three, gave it one more chance, and then acted. "Right! I warned you. You've had this coming to you!" He got out of the car, seized a tree branch and set about thrashing the car within an inch of its life.
Of course we laugh at his irrationality. Instead of beating the car, we would investigate the problem. Is the carburettor flooded? Are the sparking plugs or distributor points damp? Has it simply run out of gas? Why do we not react in the same way to a defective man: a murderer, say, or a rapist? Why don't we laugh at a judge who punishes a criminal, just as heartily as we laugh at Basil Fawlty? Or at King Xerxes who, in 480 BC, sentenced the rough sea to 300 lashes for wrecking his bridge of ships? Isn't the murderer or the rapist just a machine with a defective component? Or a defective upbringing? Defective education? Defective genes? Concepts like blame and responsibility are bandied about freely where human wrongdoers are concerned. When a child robs an old lady, should we blame the child himself or his parents? Or his school? Negligent social workers? In a court of law, feeble-mindedness is an accepted defence, as is insanity. Diminished responsibility is argued by the defence lawyer, who may also try to absolve his client of blame by pointing to his unhappy childhood, abuse by his father, or even unpropitious genes (not, so far as I am aware, unpropitious planetary conjunctions, though it wouldn't surprise me). But doesn't a truly scientific, mechanistic view of the nervous system make nonsense of the very idea of responsibility, whether diminished or not? Any crime, however heinous, is in principle to be blamed on antecedent conditions acting through the accused's physiology, heredity and environment. Don't judicial hearings to decide questions of blame or diminished responsibility make as little sense for a faulty man as for a Fawlty car? Why is it that we humans find it almost impossible to accept such conclusions? Why do we vent such visceral hatred on child murderers, or on thuggish vandals, when we should simply regard them as faulty units that need fixing or replacing? 
Presumably because mental constructs like blame and responsibility, indeed evil and good, are built into our brains by millennia of Darwinian evolution. Assigning blame and responsibility is an aspect of the useful fiction of intentional agents that we construct in our brains as a means of short-cutting a truer analysis of what is going on in the world in which we have to live. My dangerous idea is that we shall eventually grow out of all this and even learn to laugh at it, just as we laugh at Basil Fawlty when he beats his car. But I fear it is unlikely that I shall ever reach that level of enlightenment. _________________________________________________________________ SETH LLOYD Quantum Mechanical Engineer, MIT The genetic breakthrough that made people capable of ideas themselves The most dangerous idea is the genetic breakthrough that made people capable of ideas themselves. The idea of ideas is nice enough in principle; and ideas certainly have had their impact for good. But one of these days one of those nice ideas is likely to have the unintended consequence of destroying everything we know. Meanwhile, we cannot stop creating and exploring new ideas: the genie of ingenuity is out of the bottle. To suppress the power of ideas will hasten catastrophe, not avert it. Rather, we must wield that power with the respect it deserves. Who risks no danger reaps no reward. _________________________________________________________________ CAROLYN PORCO Planetary Scientist; Cassini Imaging Science Team Leader; Director, CICLOPS, Boulder CO; Adjunct Professor, University of Colorado, University of Arizona The Greatest Story Ever Told The confrontation between science and formal religion will come to an end when the role played by science in the lives of all people is the same as that played by religion today. And just what is that?
At the heart of every scientific inquiry is a deep spiritual quest -- to grasp, to know, to feel connected through an understanding of the secrets of the natural world, to have a sense of one's part in the greater whole. It is this inchoate desire for connection to something greater and immortal, the need for elucidation of the meaning of the 'self', that motivates the religious to belief in a higher 'intelligence'. It is the allure of a bigger agency -- outside the self but also involving, protecting, and celebrating the purpose of the self -- that is the great attractor. Every culture has religion. It undoubtedly satisfies a manifest human need. But the same spiritual fulfillment and connection can be found in the revelations of science. From energy to matter, from fundamental particles to DNA, from microbes to Homo sapiens, from the singularity of the Big Bang to the immensity of the universe .... ours is the greatest story ever told. We scientists have the drama, the plot, the icons, the spectacles, the 'miracles', the magnificence, and even the special effects. We inspire awe. We evoke wonder. And we don't have one god, we have many of them. We find gods in the nucleus of every atom, in the structure of space/time, in the counter-intuitive mechanisms of electromagnetism. What richness! What consummate beauty! We even exalt the 'self'. Our script requires a broadening of the usual definition, but we too offer hope for everlasting existence. The 'self' that is the particular, networked set of connections of the matter comprising our mortal bodies will one day die, of course. But the 'self' that is the sum of each separate individual condensate in us of energy-turned-matter is already ancient and will live forever. Each fundamental particle may one day return to energy, or from there revert back to matter. But in one form or another, it will not cease. In this sense, we and all around us are eternal, immortal, and profoundly connected.
We don't have one soul; we have trillions upon trillions of them. These are reasons enough for jubilation ... for riotous, unrestrained, exuberant merry-making. So what are we missing? Ceremony. We lack ceremony. We lack ritual. We lack the initiation of baptism, the brotherhood of communal worship. We have no loving ministers, guiding and teaching the flocks in the ways of the 'gods'. We have no fervent missionaries, no loyal apostles. And we lack the all-inclusive ecumenical embrace, the extended invitation to the unwashed masses. Alienation does not warm the heart; communion does. But what if? What if we appropriated the craft, the artistry, the methods of formal religion to get the message across? Imagine 'Einstein's Witnesses' going door to door or TV evangelists passionately espousing the beauty of evolution. Imagine a Church of Latter Day Scientists where believers could gather. Imagine congregations raising their voices in tribute to gravity, the force that binds us all to the Earth, and the Earth to the Sun, and the Sun to the Milky Way. Or others rejoicing in the nuclear force that makes possible the sunlight of our star and the starlight of distant suns. And can't you just hear the hymns sung to the antiquity of the universe, its abiding laws, and the heaven above that 'we' will all one day inhabit, together, commingled, spread out like a nebula against a diamond sky? One day, the sites we hold most sacred just might be the astronomical observatories, the particle accelerators, the university research installations, and other laboratories where the high priests of science -- the biologists, the physicists, the astronomers, the chemists -- engage in the noble pursuit of uncovering the workings of nature herself. And today's museums, expositional halls, and planetaria may then become tomorrow's houses of worship, where these revealed truths, and the wonder of our interconnectedness with the cosmos, are glorified in song by the devout and the soulful. 
"Hallelujah!", they will sing. "May the force be with you!" _________________________________________________________________ MICHAEL NESMITH Artist, writer; Former cast member of "The Monkees"; A Trustee and President of the Gihon Foundation and a Trustee and Vice-Chair of the American Film Institute [nez100.jpg] Existence is Non-Time, Non-Sequential, and Non-Objective Not a dangerous idea per se, but like a razor-sharp tool in unskilled hands it can inflict unintended damage. Non-Time drives forward the notion that the past does not create the present. This would of course render evolutionary theory a local-system, near-field process that was non-causative (i.e., an effect rather than a cause). Non-Sequential reverberates through the Turing machine and computation, and points to simultaneity. It redefines language and cognition. Non-Objective establishes a continuum, not to be confused with solipsism. As Schrödinger puts it when discussing the "time-hallowed discrimination between subject and object" -- "the world is given to me only once, not one existing and one perceived. Subject and object are only one. The barrier between them cannot be said to have broken down as a result of recent experience in the physical sciences, for this barrier does not exist". This continuum has large implications for the empirical data set, as it introduces factual infinity into the data plane. These three notions, Non-Time, Non-Sequence, and Non-Object, have been peeking like diamonds through the dust of empiricism, philosophy, and the sciences for centuries. Quantum mechanics, including Deutsch's parallel universes and the massive parallelism of quantum computing, is our brightest star -- an unimaginably tall peak on our fitness landscape. These notions bring us to a threshold over which empiricism has yet to travel, through which philosophy must reconstruct the very idea of ideas, and beyond which stretches the now familiar "uncharted territories" of all great adventures. 
_________________________________________________________________ LAWRENCE KRAUSS Physicist/Cosmologist, Case Western Reserve University; Author, Hiding in the Mirror [krauss100.jpg] The world may fundamentally be inexplicable Science has progressed for 400 years by ultimately explaining observed phenomena in terms of fundamental theories that are rigid. Even minor deviations from predicted behavior are not allowed by the theory, so that if such deviations are observed, these provide evidence that the theory must be modified, usually being replaced by a yet more comprehensive theory that explains a wider range of phenomena. The ultimate goal of physics, as it is often described, is to have a "theory of everything", in which all the fundamental laws that describe nature can neatly be written down on the front of a T-shirt (even if the T-shirt can only exist in 10 dimensions!). However, with the recognition that the dominant energy in the universe resides in empty space -- something that is so peculiar that it appears very difficult to understand within the context of any theoretical ideas we now possess -- more physicists have been exploring the idea that perhaps physics is an 'environmental science', that the laws of physics we observe are merely accidents of our circumstances, and that an infinite number of different universes could exist with different laws of physics. This is true even if there does exist some fundamental candidate mathematical physical theory. For example, as is currently in vogue in an idea related to string theory, perhaps the fundamental theory allows an infinite number of different 'ground state' solutions, each of which describes a different possible universe with a consistent set of physical laws and physical dimensions. 
It might be that the only way to understand why the laws of nature we observe in our universe are the way they are is to understand that if they were any different, then life could not have arisen in our universe, and we would thus not be here to measure them today. This is one version of the infamous "anthropic principle". But it could actually be worse -- it is equally likely that many different combinations of laws would allow life to form, and that it is a pure accident that the constants of nature result in the combinations we experience in our universe. Or, it could be that the mathematical formalism is actually so complex that the ground states of the theory, i.e. the set of possible states that might describe our universe, might not be determinable. In this case, the end of "fundamental" theoretical physics (i.e. the search for fundamental microphysical laws... there will still be lots of work for physicists who try to understand the host of complex phenomena occurring at a variety of larger scales) might occur not via a theory of everything, but rather with the recognition that all so-called fundamental theories that might describe nature would be purely "phenomenological", that is, they would be derivable from observational phenomena, but would not reflect any underlying grand mathematical structure of the universe that would allow a basic understanding of why the universe is the way it is. _________________________________________________________________ DANIEL C. DENNETT Philosopher; University Professor, Co-Director, Center for Cognitive Studies, Tufts University; Author, Darwin's Dangerous Idea [dennett101.jpg] There aren't enough minds to house the population explosion of memes Ideas can be dangerous. Darwin had one, for instance. 
We hold all sorts of inventors and other innovators responsible for assaying, in advance, the environmental impact of their creations, and since ideas can have huge environmental impacts, I see no reason to exempt us thinkers from the responsibility of quarantining any deadly ideas we may happen to come across. So if I found what I took to be such a dangerous idea, I would button my lip until I could find some way of preparing the ground for its safe expression. I expect that others who are replying to this year's Edge question have engaged in similar reflections and arrived at the same policy. If so, then some people may be pulling their punches with their replies. The really dangerous ideas they are keeping to themselves. But here is an unsettling idea that is bound to be true in one version or another, and so far as I can see, it won't hurt to publicize it more. It might well help. The human population is still growing, but at nowhere near the rate that the population of memes is growing. There is competition for the limited space in human brains for memes, and something has to give. Thanks to our incessant and often technically brilliant efforts, and our apparently insatiable appetites for novelty, we have created an explosively growing flood of information, in all media, on all topics, in every genre. Now either (1) we will drown in this flood of information, or (2) we won't drown in it. Both alternatives are deeply disturbing. What do I mean by drowning? I mean that we will become psychologically overwhelmed, unable to cope, victimized by the glut and unable to make life-enhancing decisions in the face of an unimaginable surfeit. (I recall the brilliant scene in the film of Evelyn Waugh's dark comedy The Loved One in which embalmer Mr. Joyboy's gluttonous mother is found sprawled on the kitchen floor, helplessly wallowing in the bounty that has spilled from a capsized refrigerator.) 
We will be lost in the maze, preyed upon by whatever clever forces find ways of pumping money -- or simply further memetic replications -- out of our situation. (In The War of the Worlds, H. G. Wells sees that it might well be our germs, not our high-tech military contraptions, that subdue our alien invaders. Similarly, might our own minds succumb not to the devious manipulations of evil brainwashers and propagandists, but to nothing more than a swarm of irresistible ditties, nibbled to death by slogans and one-liners?) If we don't drown, how will we cope? If we somehow learn to swim in the rising tide of the infosphere, that will entail that we -- that is to say, our grandchildren and their grandchildren -- become very, very different from our recent ancestors. What will "we" be like? (Some years ago, Doug Hofstadter wrote a wonderful piece, "In 2093, Just Who Will Be We?" in which he imagines robots being created to have "human" values, robots that gradually take over the social roles of our biological descendants, who become stupider and less concerned with the things we value. If we could secure the welfare of just one of these groups, our children or our brainchildren, which group would we care about the most, with which group would we identify?) Whether "we" are mammals or robots in the not so distant future, what will we know and what will we have forgotten forever, as our previously shared intentional objects recede in the churning wake of the great ship that floats on this sea and charges into the future propelled by jets of newly packaged information? What will happen to our cultural landmarks? Presumably our descendants will all still recognize a few reference points (the pyramids of Egypt, arithmetic, the Bible, Paris, Shakespeare, Einstein, Bach . . . ) but as wave after wave of novelty passes over them, what will they lose sight of? 
The Beatles are truly wonderful, but if their cultural immortality is to be purchased by the loss of such minor 20th century figures as Billie Holiday, Igor Stravinsky, and Georges Brassens, what will remain of our shared understanding? The intergenerational mismatches that we all experience in macroscopic versions (great-grandpa's joke falls on deaf ears, because nobody else in the room knows that Nixon's wife was named "Pat") will presumably be multiplied to the point where much of the raw information that we have piled in our digital storehouses is simply incomprehensible to everyone -- except that we will have created phalanxes of "smart" Rosetta-stones of one sort or another that can "translate" the alien material into something we (think maybe we) understand. I suspect we hugely underestimate the importance (to our sense of cognitive security) of our regular participation in the four-dimensional human fabric of mutual understanding, with its reassuring moments of shared -- and seen to be shared, and seen to be seen to be shared -- comprehension. What will happen to common knowledge in the future? I do think our ancestors had it easy: aside from all the juicy bits of unshared gossip and some proprietary trade secrets and the like, people all knew pretty much the same things, and knew that they knew the same things. There just wasn't that much to know. Won't people be able to create and exploit illusions of common knowledge in the future, virtual worlds in which people only think they are in touch with their cyber-neighbors? I see small-scale projects that might protect us to some degree, if they are done wisely. Think of all the work published in academic journals before, say, 1990 that is in danger of becoming practically invisible to later researchers because it can't be found on-line with a good search engine. Just scanning it all and hence making it "available" is not the solution. There is too much of it. 
But we could start projects in which (virtual) communities of retired researchers who still have their wits about them and who know particular literatures well could brainstorm amongst themselves, using their pooled experience to elevate the forgotten gems, rendering them accessible to the next generation of researchers. This sort of activity has in the past been seen to be a stodgy sort of scholarship, fine for classicists and historians, but not fit work for cutting-edge scientists and the like. I think we should try to shift this imagery and help people recognize the importance of providing for each other this sort of pathfinding through the forests of information. It's a drop in the bucket, but perhaps if we all start thinking about conservation of valuable mind-space, we can save ourselves (our descendants) from informational collapse. _________________________________________________________________ DANIEL GILBERT Psychologist, Harvard University [gilbert100.jpg] The idea that ideas can be dangerous Dangerous does not mean exciting or bold. It means likely to cause great harm. The most dangerous idea is the only dangerous idea: The idea that ideas can be dangerous. We live in a world in which people are beheaded, imprisoned, demoted, and censured simply because they have opened their mouths, flapped their lips, and vibrated some air. Yes, those vibrations can make us feel sad or stupid or alienated. Tough shit. That's the price of admission to the marketplace of ideas. Hateful, blasphemous, prejudiced, vulgar, rude, or ignorant remarks are the music of a free society, and the relentless patter of idiots is how we know we're in one. When all the words in our public conversation are fair, good, and true, it's time to make a run for the fence. 
_________________________________________________________________ ANDY CLARK School of Philosophy, Psychology and Language Sciences, Edinburgh University [clark100.jpg] The quick-thinking zombies inside us So much of what we do, feel, think and choose is determined by non-conscious, automatic uptake of cues and information. Of course, advertisers will say they have known this all along. But only in recent years, with seminal studies by Tanya Chartrand, John Bargh and others has the true scale of our daily automatism really begun to emerge. Such studies show that it is possible (it is relatively easy) to activate racist stereotypes that impact our subsequent behavioral interactions, for example yielding the judgment that your partner in a subsequent game or task is more hostile than would be judged by an unprimed control. Such effects occur despite a subject's total and honest disavowal of those very stereotypes. In similar ways it is possible to unconsciously prime us to feel older (and then we walk more slowly). In my favorite recent study, experimenters manipulate cues so that the subject forms an unconscious goal, whose (unnoticed) frustration makes them lose confidence and perform worse at a subsequent task! The dangerous truth, it seems to me, is that these are not isolated little laboratory events. Instead, they reveal the massed woven fabric of our day-to-day existence. The underlying mechanisms at work impart an automatic drive towards the automation of all manner of choices and actions, and don't discriminate between the 'trivial' and the portentous. It now seems clear that many of my major life and work decisions are made very rapidly, often on the basis of ecologically sound but superficial cues, with slow deliberative reason busily engaged in justifying what the quick-thinking zombies inside me have already laid on the table. 
The good news is that without these mechanisms we'd be unable to engage in fluid daily life or reason at all, and that very often they are right. The dangerous truth, though, is that we are indeed designed to cut conscious, aware choice out of the picture wherever possible. This is not an issue about free will, but simply about the extent to which conscious deliberation cranks the engine of behavior. Crank it it does: but not in anything like the way, or extent, we may have thought. We'd better get to grips with this before someone else does. _________________________________________________________________ SHERRY TURKLE Psychologist, MIT; Author, Life on the Screen: Identity in the Age of the Internet [turkle100.jpg] After several generations of living in the computer culture, simulation will become fully naturalized. Authenticity in the traditional sense loses its value, a vestige of another time. Consider this moment from 2005: I take my fourteen-year-old daughter to the Darwin exhibit at the American Museum of Natural History. The exhibit documents Darwin's life and thought, and with a somewhat defensive tone (in light of current challenges to evolution by proponents of intelligent design), presents the theory of evolution as the central truth that underpins contemporary biology. The Darwin exhibit wants to convince and it wants to please. At the entrance to the exhibit is a turtle from the Galapagos Islands, a seminal object in the development of evolutionary theory. The turtle rests in its cage, utterly still. "They could have used a robot," comments my daughter. It was a shame to bring the turtle all this way and put it in a cage for a performance that draws so little on the turtle's "aliveness." I am startled by her comments, both solicitous of the imprisoned turtle because it is alive and unconcerned by its authenticity. 
The museum has been advertising these turtles as wonders, curiosities, marvels -- among the plastic models of life at the museum, here is the life that Darwin saw. I begin to talk with others at the exhibit, parents and children. It is Thanksgiving weekend. The line is long, the crowd frozen in place. My question, "Do you care that the turtle is alive?" is a welcome diversion. A ten-year-old girl would prefer a robot turtle because aliveness comes with aesthetic inconvenience: "Its water looks dirty. Gross." More usually, the votes for the robots echo my daughter's sentiment that in this setting, aliveness doesn't seem worth the trouble. A twelve-year-old girl opines: "For what the turtles do, you didn't have to have the live ones." Her father looks at her, uncomprehending: "But the point is that they are real, that's the whole point." The Darwin exhibit is about authenticity: on display are the actual magnifying glass that Darwin used, the actual notebooks in which he recorded his observations, indeed, the very notebook in which he wrote the famous sentences that first described his theory of evolution. But in the children's reactions to the inert but alive Galapagos turtle, the idea of the "original" is in crisis. I have long believed that in the culture of simulation, the notion of authenticity is for us what sex was to the Victorians -- "threat and obsession, taboo and fascination." I have lived with this idea for many years, yet at the museum, I find the children's position startling, strangely unsettling. For these children, in this context, aliveness seems to have no intrinsic value. Rather, it is useful only if needed for a specific purpose. "If you put in a robot instead of the live turtle, do you think people should be told that the turtle is not alive?" I ask. Not really, say several of the children. Data on "aliveness" can be shared on a "need to know" basis, for a purpose. But what are the purposes of living things? 
When do we need to know if something is alive? Consider another vignette from 2005: an elderly woman in a nursing home outside of Boston is sad. Her son has broken off his relationship with her. Her nursing home is part of a study I am conducting on robotics for the elderly. I am recording her reactions as she sits with the robot Paro, a seal-like creature, advertised as the first "therapeutic robot" for its ostensibly positive effects on the ill, the elderly, and the emotionally troubled. Paro is able to make eye contact through sensing the direction of a human voice, is sensitive to touch, and has "states of mind" that are affected by how it is treated, for example, whether it is stroked gently or with aggression. In this session with Paro, the woman, depressed because of her son's abandonment, comes to believe that the robot is depressed as well. She turns to Paro, strokes him and says: "Yes, you're sad, aren't you. It's tough out there. Yes, it's hard." And then she pets the robot once again, attempting to provide it with comfort. And in so doing, she tries to comfort herself. The woman's sense of being understood is based on the ability of computational objects like Paro to convince their users that they are in a relationship. I call these creatures (some virtual, some physical robots) "relational artifacts." Their ability to inspire relationship is not based on their intelligence or consciousness, but on their ability to push certain "Darwinian" buttons in people (making eye contact, for example) that make people respond as though they were in relationship. For me, relational artifacts are the new uncanny in our computer culture -- as Freud once put it, the long familiar taking a form that is strangely unfamiliar. As such, they confront us with new questions. What does this deployment of "nurturing technology" at the two most dependent moments of the life cycle say about us? What will it do to us? 
Do plans to provide relational robots to attend to children and the elderly make us less likely to look for other solutions for their care? People come to feel love for their robots, but if our experience with relational artifacts is based on a fundamentally deceitful interchange, can it be good for us? Or might it be good for us in the "feel good" sense, but bad for us in our lives as moral beings? Relationships with robots bring us back to Darwin and his dangerous idea: the challenge to human uniqueness. When we see children and the elderly exchanging tendernesses with robotic pets the most important question is not whether children will love their robotic pets more than their real life pets or even their parents, but rather, what will loving come to mean? _________________________________________________________________ STEVEN STROGATZ Applied mathematician, Cornell University; Author, Sync [strogatz100.jpg] The End of Insight I worry that insight is becoming impossible, at least at the frontiers of mathematics. Even when we're able to figure out what's true or false, we're less and less able to understand why. An argument along these lines was recently given by Brian Davies in the "Notices of the American Mathematical Society". He mentions, for example, that the four-color map theorem in topology was proven in 1976 with the help of computers, which exhaustively checked a huge but finite number of possibilities. No human mathematician could ever verify all the intermediate steps in this brutal proof, and even if someone claimed to, should we trust them? To this day, no one has come up with a more elegant, insightful proof. So we're left in the unsettling position of knowing that the four-color theorem is true but still not knowing why. 
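Strogatz's point is that the computer supplies the checking, not the insight. A toy version of that exhaustive checking can be written in a few lines of Python; the graph and helper here are my own illustration, not the machinery of the 1976 proof, which enumerated a large catalogue of reducible configurations:

```python
from itertools import product

def colorable(n_vertices, edges, n_colors):
    """Exhaustively try every assignment of n_colors to the vertices;
    succeed if some assignment gives the ends of each edge different colors."""
    return any(
        all(c[u] != c[v] for u, v in edges)
        for c in product(range(n_colors), repeat=n_vertices)
    )

# K4: four mutually connected vertices -- a planar graph that needs all four colors.
k4 = [(u, v) for u in range(4) for v in range(u + 1, 4)]
print(colorable(4, k4, 3))  # False: three colors cannot work
print(colorable(4, k4, 4))  # True: four colors suffice
```

Brute force settles each instance, but, just as with the real proof, the answer True carries no explanation of why four colors always suffice for planar maps.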
Similarly important but unsatisfying proofs have appeared in group theory (in the classification of finite simple groups, roughly akin to the periodic table for chemical elements) and in geometry (in the problem of how to pack spheres so that they fill space most efficiently, a puzzle that goes back to Kepler in the early 1600s and that arises today in coding theory for telecommunications). In my own field of complex systems theory, Stephen Wolfram has emphasized that there are simple computer programs, known as cellular automata, whose dynamics can be so inscrutable that there's no way to predict how they'll behave; the best you can do is simulate them on the computer, sit back, and watch how they unfold. Observation replaces insight. Mathematics becomes a spectator sport. If this is happening in mathematics, the supposed pinnacle of human reasoning, it seems likely to afflict us in science too, first in physics and later in biology and the social sciences (where we're not even sure what's true, let alone why). When the End of Insight comes, the nature of explanation in science will change forever. We'll be stuck in an age of authoritarianism, except it'll no longer be coming from politics or religious dogma, but from science itself. _________________________________________________________________ TERRENCE SEJNOWSKI Computational Neuroscientist, Howard Hughes Medical Institute; Coauthor, The Computational Brain [sejnowski101.jpg] When will the Internet become aware of itself? I never thought that I would become omniscient during my lifetime, but as Google continues to improve and online information continues to expand I have achieved omniscience for all practical purposes. The Internet has created a global marketplace for ideas and products, making it possible for individuals in the far corners of the world to automatically connect directly to each other. The Internet has achieved these capabilities by growing exponentially in total communications bandwidth. 
How does the communications power of the Internet compare with that of the cerebral cortex, the most interconnected part of our brains? Cortical connections are expensive because they take up volume and cost energy to send information in the form of spikes along axons. About 44% of the cortical volume in humans is taken up with long-range connections, called the white matter. Interestingly, the thickness of gray matter, just a few millimeters, is nearly constant in mammals that range in brain volume over five orders of magnitude, and the volume of the white matter scales approximately as the 4/3 power of the volume of the gray matter. The larger the brain, the larger the fraction of resources devoted to communications compared to computation. However, the global connectivity in the cerebral cortex is extremely sparse: The probability of any two cortical neurons having a direct connection is around one in a hundred for neurons in a vertical column 1 mm in diameter, but only one in a million for more distant neurons. Thus, only a small fraction of the computation that occurs locally can be reported to other areas, through a small fraction of the cells that connect distant cortical areas. Despite the sparseness of cortical connectivity, the potential bandwidth of all of the neurons in the human cortex is approximately a terabit per second, comparable to the total world backbone capacity of the Internet. However, this capacity is never achieved by the brain in practice because only a fraction of cortical neurons have a high rate of firing at any given time. Recent work by Simon Laughlin suggests that another physical constraint -- energy -- limits the brain's ability to harness its potential bandwidth. The cerebral cortex also has a massive amount of memory. There are approximately one billion synapses between neurons under every square millimeter of cortex, or about one hundred million million synapses overall. 
Assuming around a byte of storage capacity at each synapse (including dynamic as well as static properties), this comes to a total of 10^15 bits of storage. This is comparable to the amount of data on the entire Internet; Google can store this in terabyte disk arrays and has hundreds of thousands of computers simultaneously sifting through it. Thus, the internet and our ability to search it are within reach of the limits of the raw storage and communications capacity of the human brain, and should exceed it by 2015. Leo van Hemmen and I recently asked 23 neuroscientists to think about what we don't yet know about the brain, and to propose a question so fundamental and so difficult that it could take a century to solve, following in the tradition of Hilbert's 23 problems in mathematics. Christof Koch and Francis Crick speculated that the key to understanding consciousness was global communication: How do neurons in the diverse parts of the brain manage to coordinate despite the limited connectivity? Sometimes, the communication gets crossed, and V. S. Ramachandran and Edward Hubbard asked whether synesthetes, rare individuals who experience crossover in sensory perception such as hearing colors, seeing sounds, and tasting tactile sensations, might give us clues to how the brain evolved. There is growing evidence that the flow of information between parts of the cortex is regulated by the degree of synchrony of the spikes within populations of cells that represent perceptual states. Robert Desimone and his colleagues have examined the effects of attention on cortical neurons in awake, behaving monkeys and found the coherence between the spikes of single neurons in the visual cortex and local field potentials in the gamma band, 30-80 Hz, increased when the covert attention of a monkey was directed toward a stimulus in the receptive field of the neuron. 
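The storage estimate above reduces to back-of-envelope arithmetic. A minimal sketch follows; the cortical surface area is my own assumption, chosen to be consistent with the totals the text quotes ("one hundred million million synapses", 10^15 bits):

```python
# Rough figures from the text; the cortical area is an assumed value,
# inferred so that the products match the quoted totals.
synapses_per_mm2 = 1e9       # "one billion synapses ... every square millimeter"
cortical_area_mm2 = 1e5      # assumption: ~10^5 mm^2 of human cortex

synapses_total = synapses_per_mm2 * cortical_area_mm2
print(f"synapses: {synapses_total:.0e}")     # ~1e14, "one hundred million million"

bits_per_synapse = 8         # "around a byte of storage capacity at each synapse"
storage_bits = synapses_total * bits_per_synapse
print(f"storage:  {storage_bits:.0e} bits")  # ~1e15 bits, as stated
```

The point of the exercise is only that the order of magnitude, 10^15 bits, is insensitive to the exact byte-per-synapse assumption.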
The coherence also selectively increased when a monkey searched for a target with a cued color or shape amidst a large number of distracters. The increase in coherence means that neurons representing the stimuli with the cued feature would have greater impact on target neurons, making them more salient. The link between attention and spike-field coherence raises a number of interesting questions. How does top-down input from the prefrontal cortex regulate the coherence of neurons in other parts of the cortex through feedback connections? How is the rapidity of the shifts in coherence achieved? Experiments on neurons in cortical slices suggest that inhibitory interneurons are connected to each other in networks and are responsible for gamma oscillations. Researchers in my laboratory have used computational models to show that excitatory inputs can rapidly synchronize a subset of the inhibitory neurons that are in competition with other inhibitory networks. Inhibitory neurons, long thought to merely block activity, are highly effective in synchronizing neurons in a local column already firing in response to a stimulus. The oscillatory activity that is thought to synchronize neurons in different parts of the cortex occurs in brief bursts, typically lasting for only a few hundred milliseconds. Thus, it is possible that there is a packet structure for long-distance communication in the cortex, similar to the packets that are used to communicate on the Internet, though with quite different protocols. The first electrical signals recorded from the brain in 1875 by Richard Caton were oscillatory signals that changed in amplitude and frequency with the state of alertness. The function of these oscillations remains a mystery, but it would be remarkable if it were to be discovered that these signals held the secrets to the brain's global communications network. 
Since its inception in 1969, the Internet has been scaled up to a size not even imagined by its inventors, in contrast to most engineered systems, which fall apart when they are pushed beyond their design limits. In part, the Internet achieves this scalability because it has the ability to regulate itself, deciding on the best routes to send packets depending on traffic conditions. Like the brain, the Internet has circadian rhythms that follow the sun as the planet rotates under it. The growth of the Internet over the last several decades more closely resembles biological evolution than engineering. How would we know if the Internet were to become aware of itself? The problem is that we don't even know if some of our fellow creatures on this planet are self aware. For all we know the Internet is already aware of itself. _________________________________________________________________ LYNN MARGULIS Biologist, University of Massachusetts, Amherst; Coauthor (with Dorion Sagan), Acquiring Genomes: A Theory of the Origins of Species [margulis100.jpg] Bacteria are us What is my dangerous idea? Although arcane, evidence for this dangerous concept is overwhelming; I have collected clues from many sources. Reminiscent of Oscar Wilde's claim that "even true things can be proved" I predict that the scientific gatekeepers in academia eventually will be forced to permit this dangerous idea to become widely accepted. What is it? Our sensibilities, our perceptions that register through our sense organ cells evolved directly from our bacterial ancestors. Signals in the environment: light impinging on the eye's retina, taste on the buds of the tongue, odor through the nose, sound in the ear are translated to nervous impulses by extensions of sensory cells called cilia. We, like all other mammals, including our apish brothers, have taste-bud cilia, inner ear cilia, nasal passage cilia that detect odors. 
We distinguish savory from sweet, birdsong from whalesong, drumbeats from thunder. With our eyes closed, we detect the light of the rising sun and feel the vibrations of the drums. These abilities to sense our surroundings, a heritage that preceded the evolution of all primates, indeed, all animals, by use of specialized cilia at the tips of sensory cells, and the existence of the cilia in the tails of sperm, come from one kind of our bacterial ancestors. Which? Those of our bacterial ancestors that became cilia. We owe our sensitivity to a loving touch, the scent of lavender, the taste of a salted nut or vinaigrette, a police-cruiser siren, or a glimpse of brilliant starlight to our sensory cells. We owe the chemical attraction of the sperm as its tail impels it to swim toward the egg, even the moss plant sperm, to its cilia. The dangerous idea is that the cilia evolved from hyperactive bacteria. Bacterial ancestors swam toward food and away from noxious gases; they moved up to the well-lit waters at the surface of the pond. They were startled when, in a crowd, some relative bumped them. These bacterial ancestors that never slept avoided water too hot or too salty. They still do. Why is the concept that our sensitivities evolved directly from swimming bacterial ancestors of the sensory cilia so dangerous? Several reasons: we would be forced to admit that bacteria are conscious, that they are sensitive to stimuli in their environment and behave accordingly. We would have to accept that bacteria, touted to be our enemies, are not merely neutral or friendly but that they are us. They are direct ancestors of our most sensitive body parts. Our culture's terminology about bacteria is that of warfare: they are germs to be destroyed and forever vanquished; bacterial enemies make toxins that poison us. We load our soaps with antibacterials that kill on contact; stomach ulcers are now agreed to be caused by bacterial infection.
Even if some admit the existence of "good" bacteria in soil or probiotic food like yogurt, few of us tolerate the dangerous notion that human sperm tails and sensitive cells of nasal passages lined with waving cilia are former bacteria. If this dangerous idea becomes widespread, it follows that we humans must agree that even before our evolution as animals we have hated and tried to kill our own ancestors. Again, we have seen the enemy, indeed, and, as usual, it is us. Social interactions of sensitive bacteria, then, not God, made us who we are today. _________________________________________________________________ THOMAS METZINGER Frankfurt Institute for Advanced Studies; Johannes Gutenberg-Universität Mainz; President, German Cognitive Science Society; Author: Being No One [metzinger100.jpg] The Forbidden Fruit Intuition We all would like to believe that, ultimately, intellectual honesty is not only an expression of, but also good for, your mental health. My dangerous question is whether one can be intellectually honest about the issue of free will and preserve one's mental health at the same time. Behind this question lies what I call the "Forbidden Fruit Intuition": Is there a set of questions which are dangerous not on grounds of ideology or political correctness, but because the most obvious answers to them could ultimately make our conscious self-models disintegrate? Can one really believe in determinism without going insane? For middle-sized objects at 37°C like the human brain and the human body, determinism is obviously true. The next state of the physical universe is always determined by the previous state. And given a certain brain state plus an environment, you could never have acted otherwise -- a surprisingly large majority of experts in the free-will debate today accept this obvious fact.
Although your future is open, this probably also means that for every single future thought you will have and for every single decision you will make, it is true that it was determined by your previous brain state. As a scientifically well-informed person, you believe in this theory; you endorse it. As an open-minded person you find that you are also interested in modern philosophy of mind, and you might hear a story much like the following one. Yes, you are a physically determined system. But this is not a big problem, because, under certain conditions, we may still continue to say that you are "free": all that matters is that your actions are caused by the right kinds of brain processes and that they originate in you. A physically determined system can well be sensitive to reasons and to rational arguments, to moral considerations, to questions of value and ethics, as long as all of this is appropriately wired into its brain. You can be rational, and you can be moral, as long as your brain is physically determined in the right way. You like this basic idea: physical determinism is compatible with being a free agent. You endorse a materialist philosophy of freedom as well. As an intellectually honest person open to empirical data, you simply believe that something along these lines must be true. Now you try to feel that it is true. You try to consciously experience the fact that at any given moment of your life, you could not have acted otherwise. You try to experience the fact that even your thoughts, however rational and moral, are predetermined -- by something unconscious, by something you cannot see. And in doing so, you start fooling around with the conscious self-model Mother Nature evolved for you with so much care and precision over millions of years: You are scratching at the user-surface of your own brain, tweaking the mouse-pointer, introspectively trying to penetrate into the operating system, attempting to make the invisible visible.
You are challenging the integrity of your phenomenal self by trying to integrate your new beliefs, the neuroscientific image of man, with your most intimate, inner way of experiencing yourself. How does it feel? I think that the irritation and deep sense of resentment surrounding public debates on the freedom of the will have little to do with the actual options on the table. They have to do with the -- perfectly sensible -- intuition that our presently obvious answer will not only be emotionally disturbing, but ultimately impossible to integrate into our conscious self-models. Or our societies: The robust conscious experience of free will also is a social institution, because attributions of accountability, responsibility, and so on are the decisive building blocks for modern, open societies. And the currently obvious answer might be interpreted by many as having clearly anti-democratic implications: Making a complex society work implies controlling the behavior of millions of people; if individual human beings can control their own behavior to a much lesser degree than we have thought in the past, if bottom-up doesn't work, then it becomes tempting to control it top-down, by the state. And this is the second way in which enlightenment could devour its own children. Yes, free will truly is a dangerous question, but for different reasons than most people think. _________________________________________________________________ DIANE F. HALPERN Professor of Psychology, Claremont McKenna College; Past-president (2005), the American Psychological Association; Author, Thought and Knowledge [halpern100.jpg] Choosing the sex of one's child For an idea to be truly dangerous, it needs to have a strong and near-universal appeal. The idea of being able to choose the sex of one's own baby is just such an idea.
Anyone who has a deep-seated and profound preference for a son or daughter knows that this preference may not be rational and that it may represent a prejudice better left unacknowledged. It is easy to dismiss the ability to decide the sex of one's baby as inconsequential. It is already medically feasible for a woman or couple to choose the sex of a baby that has not yet been conceived. There are a variety of safe methods available, such as Preimplantation Genetic Diagnosis (PGD), a technique originally designed for couples with fertility problems, not for the purpose of selecting the sex of one's next child. With PGD, embryos are created in a Petri dish and tested for sex, so that the baby-to-be is already identified as female or male before implantation in the womb. The pro argument is simple: If the parents-to-be are adults, why not? People have always wanted to be able to choose the sex of their children. There are ancient records of medicine men and wizened women with various herbs and assorted advice about what to do to (usually) have a son. So, what should it matter if modern medicine can finally deliver what old wives' tales have promised for countless generations? Couples won't have to have a "wasted" child, such as a second child the same sex as the first one, when they really wanted "one of each." If a society has too many boys for a while, who cares? The shortage of females will make females more valuable and the market economy will even out in time. In the meantime, families will "balance out," each one the ideal composition as desired by the adults in the family. Every year for the last two decades I have asked students in my college classes to write down the number of children they would like to have and the order in which they ideally want to have girls and boys. I have taught in several different countries (e.g.
, Turkey, Russia, and Mexico) and types of universities, but despite large differences, the modal response is 2 children, first a boy, then a girl. If students reply that they want one child, it is most often a boy; if it is 3 children, they are most likely to want a boy, then a girl, then a boy. The students in my classes are not a random sample of the population: they are well educated and more likely to hold egalitarian attitudes than the general population. Yet, if they acted on their stated intentions, even they would have an excess of first-borns who are male, and an excess of males overall. In a short time, those personality characteristics associated with being either an only-child or first-born and those associated with being male would be so confounded, it would be difficult to separate them. The excess of males that would result from allowing every mother or couple to choose the sex of their next baby would not correct itself at the societal level because at the individual level, the preference for sons is stronger than the market forces of supply and demand. The evidence for this conclusion comes from many sources, including regions of the world where the ratio of young women to men is so low that it could only be caused by selective abortion and female infanticide (UNICEF and other sources). In some regions of rural China there are so few women that wives are imported from the Philippines and men move to far cities to find women to marry. In response, the Chinese government is now offering a variety of education and cash incentives to families with multiple daughters. There are still few daughters being born in these rural areas where prejudice against girls is stronger than government incentives and mandates. In India, the number of abortions of female fetuses has increased since sex-selective abortion was made illegal in 1994. The desire for sons is even stronger than the threat of legal action. 
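The arithmetic behind the predicted excess is easy to sketch. In the toy calculation below, the birth orders are the modal survey answers reported above (1 child: a boy; 2 children: boy then girl; 3 children: boy, girl, boy), but the population shares assigned to each plan are invented for illustration.

```python
# Hypothetical mix of family plans, assuming parents could choose freely.
# The birth orders follow the modal survey answers; the shares are invented.
plans = {
    ("B",): 0.15,
    ("B", "G"): 0.65,
    ("B", "G", "B"): 0.20,
}

births = {"B": 0.0, "G": 0.0}
first_born_boys = 0.0
for order, share in plans.items():
    first_born_boys += share * (order[0] == "B")
    for child in order:
        births[child] += share

total = births["B"] + births["G"]
print(f"share of first-borns who are boys: {first_born_boys:.0%}")
print(f"boys as a share of all births:     {births['B'] / total:.0%}")
```

Under these assumed shares every first-born is a boy and boys are roughly 59% of all births, which is the pattern the essay describes: an excess of male first-borns and an excess of males overall.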
In the United States, the data that show preferences for sons are more subtle than the disparate ratios of females and males found in other parts of the world, but the preference for sons is still strong. Because of space limitations, I list only a few of the many indicators that parents in the United States prefer sons: families with 2 daughters are more likely to have a third child than families with 2 sons; unmarried pregnant women who undergo ultrasound to determine the sex of the yet unborn child are less likely to be married at the time of the child's birth when the child is a girl than when it is a boy; and divorced women with a son are more likely to remarry than divorced women with a daughter. Perhaps the only ideas more dangerous than that of choosing the sex of one's child would be trying to stop medical science from making advances that allow such choices or allowing the government to control the choices we can make as citizens. There are many important questions to ponder, including how to find creative ways to reduce or avoid negative consequences from even more dangerous alternatives. Consider, for example, what would our world be like if there were substantially more men than women? What if only the rich or only those who live in "rich countries" were able to choose the sex of their children? Is it likely that an approximately equal number of boys and girls would be or could be selected? If not, could a society or should a society make equal numbers of girls and boys a goal? I am guessing that many readers of child-bearing age want to choose the sex of their (as yet) unconceived children and can reason that there is no harm in this practice. And, if you could also choose intelligence, height, and hair color, would you add those too?
But then, there are few things in life that are as appealing as the possibility of a perfectly balanced family, which according to the modal response means an older son and younger daughter, looking just like an improved version of you. _________________________________________________________________ GARY MARCUS Psychologist, New York University; Author, The Birth of the Mind [marcus100.jpg] Minds, genes, and machines Brains exist primarily to do two things: to communicate (transfer information) and to compute. This is true in every creature with a nervous system, and no less true in the human brain. In short, the brain is a machine. And the basic structure of that brain, the biological substrate of all things mental, is guided in no small part by information carried in the DNA. In the twenty-first century, these claims should no longer be controversial. With each passing day, techniques like magnetic resonance imaging and electrophysiological recordings from individual neurons make it clearer that the business of the brain is information processing, while new fields like comparative genomics and developmental neuroembryology remove any possible doubt that genes significantly influence both behavior and brain. Yet there are many people, scientists and lay persons alike, who fear or wish to deny these notions, to doubt or even reject the idea that the mind is a machine, and that it is significantly (though of course not exclusively) shaped by genes. Even as the religious right prays for Intelligent Design, the academic left insinuates that merely discussing the idea of innateness is dangerous, as in a prominent child development manifesto that concluded: If scientists use words like "instinct" and "innateness" in reference to human abilities, then we have a moral responsibility to be very clear and explicit about what we mean.
If our careless, underspecified choice of words inadvertently does damage to future generations of children, we cannot turn with innocent outrage to the judge and say "But your Honor, I didn't realize the word was loaded." A new academic journal called "Metascience" focuses on when extra-scientific considerations influence the process of science. Sadly, the twin questions of whether we are machines, and whether we are constrained significantly by our biology, very much fall into this category, questions where members of the academy (not to mention fans of Intelligent Design) close their minds. Copernicus put us in our place, so to speak, by showing that our planet is not at the center of the universe; advances in biology are putting us further in our place by showing that our brains are as much a product of biology as any other part of our body, and by showing that our (human) brains are built by the very same processes as those of other creatures. Just as the earth is just one planet among many, from the perspective of the toolkit of developmental biology, our brain is just one more arrangement of molecules. _________________________________________________________________ JARON LANIER Computer Scientist and Musician [jaron100.jpg] Homuncular Flexibility The homunculus is an approximate mapping of the human body in the cortex. It is often visualized as a distorted human body stretched along the top of the human brain. The tongue, thumbs, and other body parts with extra-rich brain connections are enlarged in the homunculus, giving it a vaguely obscene, impish character. Long ago, in the 1980s, my colleagues and I at VPL Research built virtual worlds in which more than one person at a time could be present. People in a shared virtual world must be able to see each other, as well as use their bodies together, as when two people lift a large virtual object or ride a tandem virtual bicycle. None of this would be possible without virtual bodies.
It was a self-evident and inviting challenge to attempt to create the most accurate possible bodies, given the crude state of the technology at the time. To do this, we developed full body suits covered in sensors. A measurement made on the body of someone wearing one of these suits, such as an aspect of the flex of a wrist, would be applied to control a corresponding change in a virtual body. Before long, people were dancing and otherwise goofing around in virtual reality. Of course there were bugs. I distinctly remember a wonderful bug that caused my hand to become enormous, like a web of flying skyscrapers. As is often the case, this accident led to an interesting discovery. It turned out that people could quickly learn to inhabit strange and different bodies and still interact with the virtual world. I became curious about how weird the body could get before the mind would become disoriented. I played around with elongated limb segments, and strange limb placement. The most curious experiment involved a virtual lobster (which was lovingly modeled by Ann Lasko). A lobster has a trio of little midriff arms on each side of its body. If physical human bodies sprouted corresponding limbs, we would have measured them with an appropriate body suit and that would have been that. I assume it will not come as a surprise to the reader that the human body does not include these little arms, so the question arose of how to control them. The answer was to extract a little influence from each of many parts of the physical body and merge these data streams into a single control signal for a given joint in the extra lobster limbs. A touch of human elbow twist, a dash of human knee flex; a dozen such movements might be mixed to control the middle joint of little left limb #3. The result was that the principal elbows and knees could still control their virtual counterparts roughly as before, while still contributing to the control of additional limbs.
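The mixing scheme described here can be sketched in a few lines. Everything below (the joint names, the weights, and the data format) is a hypothetical reconstruction of the idea, not VPL's actual code.

```python
# Tracked human joint angles in radians, as a sensor suit might report them.
# All names and values here are illustrative.
human_angles = {
    "l_elbow": 0.40, "r_elbow": 0.10,
    "l_knee": 0.85, "r_knee": 0.30,
    "l_wrist": 0.05, "r_wrist": 0.55,
}

# Hypothetical mixing weights: "a touch of elbow twist, a dash of knee flex"
# blended into a control signal for each joint the body does not have.
# Elbows and knees still drive their own virtual counterparts separately.
mixing = {
    "lobster_limb3_mid": {"l_elbow": 0.15, "r_knee": 0.25, "l_wrist": 0.60},
    "lobster_limb1_mid": {"r_elbow": 0.20, "l_knee": 0.10},
}

def drive_extra_limbs(angles, weights_by_joint):
    """Blend many tracked body measurements into one control signal
    per extra virtual joint."""
    return {
        joint: sum(w * angles[src] for src, w in weights.items())
        for joint, weights in weights_by_joint.items()
    }

print(drive_extra_limbs(human_angles, mixing))
```

Because each extra joint takes only a small weighted share of each human joint, the ordinary limbs keep controlling their virtual counterparts roughly as before while still contributing to the additional limbs.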
Yes, it turns out people can learn to control bodies with extra limbs! The biologist Jim Bower, when considering this phenomenon, commented that the human nervous system evolved through all the creatures that preceded us in our long evolutionary line, which included some pretty strange creatures, if you go back far enough. Why wouldn't we retain some homuncular flexibility with a pedigree like that? The original experiments of the 1980s were not carried out formally, but recently it has become possible to explore the phenomenon in a far more rigorous way. Jeremy Bailenson at Stanford has created a marvelous new lab for studying multiple human subjects in high-definition shared virtual worlds, and we are now planning to repeat, improve, and extend these experiments. The most interesting questions still concern the limits to homuncular flexibility. We are only beginning the project of mapping how far it can go. Why is homuncular flexibility a dangerous idea? Because the more flexible the human brain turns out to be when it comes to adapting to weirdness, the weirder a ride it will be able to keep up with as technology changes in the coming decades and centuries. Will kids in the future grow up with the experience of living in four spatial dimensions as well as three? That would be a world with a fun elementary school math curriculum! If you're most interested in raw accumulation of technological power, then you might not find this so interesting, but if you think in terms of how human experience can change, then this is the most fascinating stuff there is. Homuncular flexibility isn't the only source of hints about how weird human experience might get in the future. There are also questions related to language, memory, and other aspects of cognition, as well as hypothetical prospects for engineering changes in the brain. But in this one area, there's an indication of high weirdness to come, and I find that prospect dangerous, but in a beautiful and seductive way.
"Thrilling" might be a better word. _________________________________________________________________ W. DANIEL HILLIS Physicist, Computer Scientist; Chairman, Applied Minds, Inc.; Author, The Pattern on the Stone [hillis100.jpg] The idea that we should all share our most dangerous ideas I don't share my most dangerous ideas. Ideas are the most powerful forces that we can unleash upon the world, and they should not be let loose without careful consideration of their consequences. Some ideas are dangerous because they are false, like the idea that one race of humans is more worthy than another, or that one religion has a monopoly on the truth. False ideas like these spread like wildfire, and have caused immeasurable harm. They still do. Such false ideas should obviously not be spread or encouraged, but there are also plenty of true ideas that should not be spread: ideas about how to cause terror and pain and chaos, ideas of how to better convince people of things that are not true. I have often seen otherwise thoughtful people so caught up in such an idea that they seem unable to resist sharing it. To me, the idea that we should all share our most dangerous ideas is, itself, a very dangerous idea. I just hope that it never catches on. _________________________________________________________________ NEIL GERSHENFELD Physicist; Director, Center for Bits and Atoms, MIT; Author, Fab [gershenfeld100.jpg] Democratizing access to the means of invention The elite temples of research (of the kind I've happily spent my career in) may be becoming intellectual dinosaurs as a result of the digitization and personalization of fabrication. Today, with about $20k in equipment, it's possible to make and measure things from microns and microseconds on up, and that boundary is quickly receding. When I came to MIT that was hard to do. If it's no longer necessary to go to MIT for its facilities, then surely the intellectual community is its real resource?
But my colleagues (and I) are always either traveling or over-scheduled; the best way for us to see each other is to go somewhere else. Like many people, my closest collaborators are in fact distributed around the world. The ultimate consequence of the digitization of first communications, then computation, and now fabrication, is to democratize access to the means of invention. The third world can skip over the first and second cultures and go right to developing a third culture. Rather than today's model of researchers researching for researchees, the result of all that discovery has been to enable a planet of creators rather than consumers. _________________________________________________________________ PAUL STEINHARDT Albert Einstein Professor of Science, Princeton University [steinhardt100.jpg] It's a matter of time For decades, the commonly held view among scientists has been that space and time first emerged about fourteen billion years ago in a big bang. According to this picture, the cosmos transformed from a nearly uniform gas of elementary particles to its current complex hierarchy of structure, ranging from quarks to galaxy superclusters, through an evolutionary process governed by simple, universal physical laws. In the past few years, though, confidence in this point of view has been shaken as physicists have discovered finely tuned features of our universe that seem to defy natural explanation. The prime culprit is the cosmological constant, which astronomers have measured to be exponentially smaller than naïve estimates would predict. On the one hand, it is crucial that the cosmological constant be so small or else it would cause space to expand so rapidly that galaxies and stars would never form. On the other hand, no theoretical mechanism has been found within the standard Big Bang picture that would explain the tiny value. Desperation has led to a "dangerous" idea: perhaps we live in an anthropically selected universe.
According to this view, we live in a multiverse (a multitude of universes) in which the cosmological constant varies randomly from one universe to the next. In most universes, the value is incompatible with the formation of galaxies, planets, and stars. The reason why our cosmological constant has the value it does is because it is one of the rare examples in which the value happens to lie in the narrow range compatible with life. This is the ultimate example of "unintelligent design": the multiverse tries every possibility with reckless abandon and only very rarely gets things "right"; that is, consistent with everything we actually observe. It suggests that the creation of unimaginably enormous volumes of uninhabitable space is essential to obtain a few rare habitable spaces. I consider this approach to be extremely dangerous for two reasons. First, it relies on complex assumptions about physical conditions far beyond the range of conceivable observation, so it is not scientifically verifiable. Secondly, I think it leads inevitably to a depressing end to science. What is the point of exploring further the randomly chosen physical properties in our tiny corner of the multiverse if most of the multiverse is so different? I think it is far too early to be so desperate. This is a dangerous idea that I am simply unwilling to contemplate. My own "dangerous" idea is more optimistic but precarious because it bucks the current trends in cosmological thinking. I believe that the finely tuned features may be naturally explained by supposing that our universe is much older than we have imagined. With more time, a new possibility emerges. The cosmological "constant" may not be constant after all. Perhaps it is varying so slowly that it only appears to be constant. Originally it had the much larger value that we would naturally estimate, but the universe is so old that its value has had a chance to relax to the tiny value measured today.
Furthermore, in several concrete examples, one finds that the evolution of the cosmological constant slows down as its value approaches zero, so most of the history of the universe transpires when its value is tiny, just as we find today. This idea that the cosmological constant is decreasing has been considered in the past. In fact, physically plausible slow-relaxation mechanisms have been identified. But the timing was thought to be impossible. If the cosmological constant decreases very slowly, it causes the expansion rate to accelerate too early and galaxies never form. If it decreases too quickly, the expansion rate never accelerates, which is inconsistent with recent observations. As long as the cosmological constant has only 14 billion years to evolve, there is no feasible solution. But, recently, some cosmologists have been exploring the possibility that the universe is exponentially older. In this picture, the evolution of the universe is cyclic. The Big Bang is not the beginning of space and time but, rather, a sudden creation of hot matter and radiation that marks the transition from one period of expansion and cooling to the next cycle of evolution. Each cycle might last a trillion years, say. Fourteen billion years marks the time since the last infusion of matter and radiation, but this is brief compared to the total age of the universe, and the number of cycles in the past may have been ten to the googol power or more! Then, using the slow relaxation mechanisms considered previously, it becomes possible that the cosmological constant decreases steadily from one cycle to the next. Since the number of cycles is likely to be enormous, there is enough time for the cosmological constant to shrink by an exponential factor, even though the decrease over the course of any one cycle is too small to be detectable.
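A back-of-the-envelope calculation shows why so many cycles are needed. The numbers below (the roughly 10^120 mismatch between naive estimates and the measured value, the cycle length, and the assumed per-cycle fractional decrease) are illustrative assumptions, not Steinhardt's figures.

```python
import math

# Illustrative numbers only: the ~10^120 mismatch between naive estimates
# and the measured cosmological constant, an assumed trillion-year cycle,
# and an assumed tiny fractional decrease per cycle.
mismatch = 1e120          # factor by which the constant must shrink
cycle_years = 1e12        # ~a trillion years per cycle
eps_per_cycle = 1e-50     # hypothetical fractional decrease per cycle

# Lambda_N = Lambda_0 * (1 - eps)^N, and ln(1/(1 - eps)) ~ eps for tiny eps,
# so shrinking by `mismatch` needs N ~ ln(mismatch) / eps cycles.
cycles_needed = math.log(mismatch) / eps_per_cycle
print(f"cycles needed: ~10^{math.log10(cycles_needed):.0f}")

# Within our 14-billion-year window the change is hopelessly small:
frac_change = eps_per_cycle * (14e9 / cycle_years)
print(f"fractional change since the last bang: ~{frac_change:.1e}")
```

Even with these arbitrary inputs the shape of the argument is visible: an exponentially old universe accommodates an exponential shrinkage, while the change within any one 14-billion-year stretch stays far below anything observable.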
Because the evolution slows down as the cosmological constant decreases, this is the period when most of the cycles take place. There is no multiverse and there is nothing special about our region of space -- we live in a typical region at a typical time. Remarkably, this idea is scientifically testable. The picture makes explicit predictions about the distribution of primordial gravitational waves and variations in temperature and density. Also, if the cosmological constant is evolving at the slow rate suggested, then ongoing attempts to detect a temporal variation should find no change. So, we may enjoy speculating now about which dangerous ideas we prefer, but ultimately it is Nature that will decide if any of them is right. It is just a matter of time. _________________________________________________________________ SAM HARRIS Neuroscience Graduate Student, UCLA; Author, The End of Faith [harriss101.jpg] Science Must Destroy Religion Most people believe that the Creator of the universe wrote (or dictated) one of their books. Unfortunately, there are many books that pretend to divine authorship, and each makes incompatible claims about how we all must live. Despite the ecumenical efforts of many well-intentioned people, these irreconcilable religious commitments still inspire an appalling amount of human conflict. In response to this situation, most sensible people advocate something called "religious tolerance." While religious tolerance is surely better than religious war, tolerance is not without its liabilities. Our fear of provoking religious hatred has rendered us incapable of criticizing ideas that are now patently absurd and increasingly maladaptive. It has also obliged us to lie to ourselves -- repeatedly and at the highest levels -- about the compatibility between religious faith and scientific rationality. The conflict between religion and science is inherent and (very nearly) zero-sum. 
The success of science often comes at the expense of religious dogma; the maintenance of religious dogma always comes at the expense of science. It is time we conceded a basic fact of human discourse: either a person has good reasons for what he believes, or he does not. When a person has good reasons, his beliefs contribute to our growing understanding of the world. We need not distinguish between "hard" and "soft" science here, or between science and other evidence-based disciplines like history. There happen to be very good reasons to believe that the Japanese bombed Pearl Harbor on December 7th, 1941. Consequently, the idea that the Egyptians actually did it lacks credibility. Every sane human being recognizes that to rely merely upon "faith" to decide specific questions of historical fact would be both idiotic and grotesque -- that is, until the conversation turns to the origin of books like the bible and the Koran, to the resurrection of Jesus, to Muhammad's conversation with the angel Gabriel, or to any of the other hallowed travesties that still crowd the altar of human ignorance. Science, in the broadest sense, includes all reasonable claims to knowledge about ourselves and the world. If there were good reasons to believe that Jesus was born of a virgin, or that Muhammad flew to heaven on a winged horse, these beliefs would necessarily form part of our rational description of the universe. Faith is nothing more than the license that religious people give one another to believe such propositions when reasons fail. The difference between science and religion is the difference between a willingness to dispassionately consider new evidence and new arguments, and a passionate unwillingness to do so. The distinction could not be more obvious, or more consequential, and yet it is everywhere elided, even in the ivory tower. Religion is fast growing incompatible with the emergence of a global, civil society. 
Religious faith -- faith that there is a God who cares what name he is called, that one of our books is infallible, that Jesus is coming back to earth to judge the living and the dead, that Muslim martyrs go straight to Paradise, etc. -- is on the wrong side of an escalating war of ideas. The difference between science and religion is the difference between a genuine openness to the fruits of human inquiry in the 21st century, and a premature closure to such inquiry as a matter of principle. I believe that the antagonism between reason and faith will only grow more pervasive and intractable in the coming years. Iron Age beliefs -- about God, the soul, sin, free will, etc. -- continue to impede medical research and distort public policy. The possibility that we could elect a U.S. President who takes biblical prophecy seriously is real and terrifying; the likelihood that we will one day confront Islamists armed with nuclear or biological weapons is also terrifying, and growing more probable by the day. We are doing very little, at the level of our intellectual discourse, to prevent such possibilities. In the spirit of religious tolerance, most scientists are keeping silent when they should be blasting the hideous fantasies of a prior age with all the facts at their disposal. To win this war of ideas, scientists and other rational people will need to find new ways of talking about ethics and spiritual experience. The distinction between science and religion is not a matter of excluding our ethical intuitions and non-ordinary states of consciousness from our conversation about the world; it is a matter of our being rigorous about what is reasonable to conclude on their basis. We must find ways of meeting our emotional needs that do not require the abject embrace of the preposterous. We must learn to invoke the power of ritual and to mark those transitions in every human life that demand profundity -- birth, marriage, death, etc.
-- without lying to ourselves about the nature of reality. I am hopeful that the necessary transformation in our thinking will come about as our scientific understanding of ourselves matures. When we find reliable ways to make human beings more loving, less fearful, and genuinely enraptured by the fact of our appearance in the cosmos, we will have no need for divisive religious myths. Only then will the practice of raising our children to believe that they are Christian, Jewish, Muslim, or Hindu be broadly recognized as the ludicrous obscenity that it is. And only then will we stand a chance of healing the deepest and most dangerous fractures in our world.

_________________________________________________________________

SCOTT ATRAN
Anthropologist, University of Michigan; Author, In Gods We Trust

Science encourages religion in the long run (and vice versa)

Ever since Edward Gibbon's Decline and Fall of the Roman Empire, scientists and secularly minded scholars have been predicting the ultimate demise of religion. But, if anything, religious fervor is increasing across the world, including in the United States, the world's most economically powerful and scientifically advanced society. An underlying reason is that science treats humans and intentions only as incidental elements in the universe, whereas for religion they are central. Science is not particularly well suited to deal with people's existential anxieties, including death, deception, sudden catastrophe, loneliness, or longing for love or justice. It cannot tell us what we ought to do, only what we can do. Religion thrives because it addresses people's deepest emotional yearnings and society's foundational moral needs, perhaps even more so in complex and mobile societies that are increasingly divorced from nurturing family settings and long-familiar environments.

From a scientific perspective on the overall structure and design of the physical universe:

1. Human beings are accidental and incidental products of the material development of the universe, almost wholly irrelevant and readily ignored in any general description of its functioning. Beyond Earth, there is no intelligence -- however alien or like our own -- that is watching out for us or cares. We are alone.

2. Human intelligence and reason, which search for the hidden traps and causes in our surroundings, evolved and will always remain leashed to our animal passions -- in the struggle for survival, the quest for love, the yearning for social standing and belonging. This intelligence does not easily suffer loneliness, any more than it abides the looming prospect of death, whether individual or collective.

Religion is the hope that science is missing (something more in the endeavor to miss nothing). But doesn't religion impede science, and vice versa? Not necessarily. Leaving aside the sociopolitical stakes in the opposition between science and religion (which vary widely and are not constitutive of science or religion per se -- Calvin considered obedience to tyrants as exhibiting trust in God, while Franklin wanted the motto of the American Republic to be "rebellion against tyranny is obedience to God"), a crucial difference between science and religion is that factual knowledge as such is not a principal aim of religious devotion, but plays only a supporting role. Only in the last decade has the Catholic Church reluctantly acknowledged the factual plausibility of Copernicus, Galileo, and Darwin. Earlier religious rejection of their theories stemmed from the challenges they posed to a cosmic order unifying the moral and material worlds. Separating out the core of the material world would be like draining the pond where a water lily grows. A long lag time was necessary to refurbish and remake the moral and material connections in such a way that faith in a unified cosmology could survive.
_________________________________________________________________

MARCELO GLEISER
Physicist, Dartmouth College; Author, The Prophet and the Astronomer

Can science explain itself?

There have been many times when I asked myself if we scientists, especially those seeking to answer "ultimate" questions such as the origin of the Universe, are not beating on the wrong drum. Of course, by trying to answer such questions as the origin of everything, we assume we can. We plow ahead, proposing tentative models that join general relativity and quantum mechanics and use knowledge from high energy physics to propose models where the universe pops out of nothing, no energy required, due to a random quantum fluctuation. To this we add the randomness of fundamental constants, saying that their values are the way they are due to an accident: other universes may well have other values of the charge and mass of the electron and thus completely different properties. So, our universe becomes this very special place where things "conspire" to produce galaxies, stars, planets, and life. What if this is all bogus? What if we look at science as a narrative, a description of the world that has limitations based on its structure? The constants of Nature are the letters of the alphabet, the laws are the grammar rules, and we build these descriptions through the guiding hand of the so-called scientific method. Period. To say things are this way because otherwise we wouldn't be here to ask the question is to miss the point altogether: things are this way because this is the story we humans tell based on the way we see the world and explain it. If we take this to the extreme, it means that we will never be able to answer the question of the origin of the Universe, since it implicitly assumes that science can explain itself.
We can build any cool and creative models we want using any marriage of quantum mechanics and relativity, but we still won't understand why these laws and not others. In a sense, this means that our science is our science and not something universally true, as many believe it is. This is not bad at all, given what we can do with it, but it does place limits on knowledge. Which may not be a bad thing either. It's OK not to know everything; it doesn't make science weaker. Only more human.

_________________________________________________________________

DOUGLAS RUSHKOFF
Media Analyst; Documentary Writer; Author, Get Back in the Box: Innovation from the Inside Out

Open Source Currency

It's not only dangerous and by most counts preposterous; it's happening. Open source or, in more common parlance, "complementary" currencies are collaboratively established units representing hours of labor that can be traded for goods or services in lieu of centralized currency. The advantage is that while the value of centralized currency is based on its scarcity, the bias of complementary or local currencies is towards their abundance. So instead of having to involve the Fed in every transaction -- and using money that requires being paid back with interest -- we can invent our own currencies and create value with our labor. It's what the Japanese did at the height of the recession. No, not the Japanese government, but unemployed Japanese people who couldn't afford to pay healthcare costs for their elderly relatives in distant cities. They created a currency through which people could care for someone else's grandmother and accrue credits for someone else to take care of theirs. Throughout most of history, complementary currencies existed alongside centralized currency. While local currency was used for labor and local transactions, centralized currencies were used for long-distance and foreign trade.
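The mutual-credit scheme Rushkoff describes can be sketched in a few lines of code. This is a toy illustration under stated assumptions, not any real system's implementation, and every class, method, and member name here is hypothetical. What it makes concrete is the key property: the money is created by the transaction itself rather than issued at interest by a central bank, balances may go negative, and the system-wide total is always zero, so there is no built-in scarcity.

```python
# Toy mutual-credit ("time bank") ledger. Units are hours of labor,
# created by each transaction rather than lent into existence by a
# central issuer. All names are hypothetical.
from collections import defaultdict

class TimeBank:
    def __init__(self):
        self.balances = defaultdict(float)  # member -> hours of credit

    def record_service(self, provider, recipient, hours):
        """Provider performs `hours` of care for recipient and accrues credit;
        the recipient's balance goes down (possibly below zero)."""
        self.balances[provider] += hours
        self.balances[recipient] -= hours

bank = TimeBank()
# Kenji cares for Yuki's grandmother for 3 hours...
bank.record_service("Kenji", "Yuki", 3)
# ...and later spends part of his credit when Mari cares for his.
bank.record_service("Mari", "Kenji", 2)

# Credit always nets to zero across the whole community: abundance,
# not scarcity, and no interest owed to anyone.
assert sum(bank.balances.values()) == 0
```

The Japanese elder-care example in the text works on roughly this pattern: hours earned caring for a stranger's relative in one city can be redeemed for care of one's own relative in another.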
Local currencies were based on a model of abundance -- there was so much of it that people constantly invested it. That's why we saw so many cathedrals being built in the late Middle Ages, and unparalleled levels of investment in infrastructure and maintenance. Centralized currency, on the other hand, needed to retain value over long distances and periods of time, so it was based on precious and scarce resources, such as gold. The problem started during the Renaissance: as kings attempted to centralize their power, most local currencies were outlawed. This new monopoly on currency reduced entire economies into scarcity engines, encouraging competition over collaboration, protectionism over sharing, and fixed commodities over renewable resources. Today, money is lent into existence by the Fed or another central bank -- and paid back with interest. This cash is a medium; and like any medium, it has certain biases. The money we use today is just one model of money. Turning currency into a collaborative phenomenon is the final frontier in the open source movement. It's what would allow for an economic model that could support a renewable-energies industry, a way for companies such as Wal-Mart to add value to the communities they currently drain, and a way of working with money that doesn't have bankruptcy built in as a given circumstance.

_________________________________________________________________

JUDITH RICH HARRIS
Independent Investigator and Theoretician; Author, The Nurture Assumption

The idea of zero parental influence

Is it dangerous to claim that parents have no power at all (other than genetic) to shape their child's personality, intelligence, or the way he or she behaves outside the family home? More to the point, is this claim false? Was I wrong when I proposed that parents' power to do these things by environmental means is zero, nada, zilch? A confession: When I first made this proposal ten years ago, I didn't fully believe it myself.
I took an extreme position, the null hypothesis of zero parental influence, for the sake of scientific clarity. Making myself an easy target, I invited the establishment -- research psychologists in the academic world -- to shoot me down. I didn't think it would be all that difficult for them to do so. It was clear by then that there weren't any big effects of parenting, but I thought there must be modest effects that I would ultimately have to acknowledge. The establishment's failure to shoot me down has been nothing short of astonishing. One developmental psychologist even admitted, one year ago on this very website, that researchers hadn't yet found proof that "parents do shape their children," but she was still convinced that they will eventually find it, if they just keep searching long enough. Her comrades in arms have been less forthright. "There are dozens of studies that show the influence of parents on children!" they kept saying, but then they'd somehow forget to name them -- perhaps because these studies were among the ones I had already demolished (by showing that they lacked the necessary controls or the proper statistical analyses). Or they'd claim to have newer research that provided an airtight case for parental influence, but again there was a catch: the work had never been published in a peer-reviewed journal. When I investigated, I could find no evidence that the research in question had actually been done or, if done, that it had produced the results that were claimed for it. At most, it appeared to consist of preliminary work, with too little data to be meaningful (or publishable). Vaporware, I call it. Some of the vaporware has achieved mythic status. You may have heard of Stephen Suomi's experiment with nervous baby monkeys, supposedly showing that those reared by "nurturant" adoptive monkey mothers turn into calm, socially confident adults. 
Or of Jerome Kagan's research with nervous baby humans, supposedly showing that those reared by "overprotective" (that is, nurturant) human mothers are more likely to remain fearful. Researchers like these might well see my ideas as dangerous. But is the notion of zero parental influence dangerous in any other sense? So it is alleged. Here's what Frank Farley, former president of the American Psychological Association, told a journalist in 1998:

[Harris's] thesis is absurd on its face, but consider what might happen if parents believe this stuff! Will it free some to mistreat their kids, since "it doesn't matter"? Will it tell parents who are tired after a long day that they needn't bother even paying any attention to their kid since "it doesn't matter"?

Farley seems to be saying that the only reason parents are nice to their children is that they think it will make the children turn out better! And that if parents believed they had no influence at all on how their kids turn out, they would be likely to abuse or neglect them. Which, it seems to me, is absurd on its face. Most chimpanzee mothers are nice to their babies and take good care of them. Do chimpanzees think they're going to influence how their offspring turn out? Doesn't Frank Farley know anything at all about evolutionary biology and evolutionary psychology? My idea is viewed as dangerous by the powers that be, but I don't think it's dangerous at all. On the contrary: if people accepted it, it would be a breath of fresh air. Family life, for parents and children alike, would improve. Look what's happening now as a result of the faith, obligatory in our culture, in the power of parents to mold their children's fragile psyches. Parents are exhausting themselves in their efforts to meet their children's every demand, not realizing that evolution designed offspring -- nonhuman animals as well as humans -- to demand more than they really need.
Family life has become phony, because parents are convinced that children need constant reassurances of their love, so if they don't happen to feel very loving at a particular time or towards a particular child, they fake it. Praise is delivered by the bushel, which devalues its worth. Children have become the masters of the home. And what has all this sacrifice and effort on the part of parents bought them? Zilch. There are no indications that children today are happier, more self-confident, less aggressive, or in better mental health than they were sixty years ago, when I was a child -- when homes were run by and for adults, when physical punishment was used routinely, when fathers were generally unavailable, when praise was a rare and precious commodity, and when explicit expressions of parental love were reserved for the deathbed. Is my idea dangerous? I've never condoned child abuse or neglect; I've never believed that parents don't matter. The relationship between a parent and a child is an important one, but it's important in the same way as the relationship between married partners. A good relationship is one in which each party cares about the other and derives happiness from making the other happy. A good relationship is not one in which one party's central goal is to modify the other's personality. I think what's really dangerous -- perhaps a better word is tragic -- is the establishment's idea of the all-powerful, and hence all-blamable, parent.

_________________________________________________________________

ALUN ANDERSON
Senior Consultant, New Scientist

Brains cannot become minds without bodies

A common image for popular accounts of "The Mind" is a brain in a bell jar. The message is that inside that disembodied lump of neural tissue is everything that is you. It's a scary image but misleading.
A far more dangerous idea is that brains cannot become minds without bodies, that two-way interactions between mind and body are crucial to thought and health, and that the brain may partly think in terms of the motor actions it encodes for the body's muscles to carry out. We've probably fallen for disembodied brains because of the academic tendency to worship abstract thought. If we took a more democratic view of the whole brain, we'd find far more of it being used for planning and controlling movement than for cogitation. Sports writers get it right when they describe stars of football or baseball as "geniuses"! Their genius requires massive brain power and a superb body, which is perhaps one better than Einstein. The "brain-body" view is dangerous because it requires many scientists to change the way they think: it lets back in common-sense interactions between brain and body that medical science feels uncomfortable with, makes more sense of feelings like falling in love, and requires a different approach from people who are trying to create machines with human-like intelligence. And if this all sounds like mere assertion, there's plenty of interesting research out there to back it up. Interactions between mind and body come out strongly in the surprising links between status and health. Michael Marmot's celebrated studies show that the lower you are in the pecking order, the worse your health is likely to be. You can explain away only a small part of the trend by poorer access to healthcare, or poorer food or living conditions. For Marmot, the answer lies in "the impact over how much control you have over life circumstances". The important message is that state of mind -- perceived status -- translates into state of body. The effect of placebos on health delivers a similar message. Trust and belief are often seen as negative in science, and the placebo effect is dismissed as a kind of "fraud" because it relies on the belief of the patient.
But the real wonder is that faith can work. Placebos can stimulate the release of pain-relieving endorphins and affect neuronal firing rates in people with Parkinson's disease. Body and mind interact too in the most intimate feelings of love and bonding. Those interactions have been best explored in voles, where two hormones, oxytocin and vasopressin, are critical. The hormones are released as a result of "the extended tactile pleasures of mating", as researchers describe it, and hit pleasure centres in the brain which essentially "addict" sexual partners to one another. Humans are surely more cerebral. But brain scans of people in love show heightened activity where there are lots of oxytocin and vasopressin receptors. Oxytocin levels rise during orgasm and sexual arousal, as they do from touching and massage. There are defects in oxytocin receptors associated with autism. And the hormone boosts the feeling that you can trust others, which is a key part of intimate relations. In a recent laboratory "investment game", many investors would trust all their money to a stranger after a puff of an oxytocin spray. These few stories show the importance of the interplay of minds and hormonal signals, of brains and bodies. This idea has been taken to a profound level in the well-known studies of Antonio Damasio, who finds that emotional or "gut" feelings are essential to making decisions. "We don't separate emotion from cognition like layers in a cake," says Damasio. "Emotion is in the loop of reason all the time." Indeed, the way in which reasoning is tied to body actions may be quite counter-intuitive. Giacomo Rizzolatti discovered "mirror neurones" in a part of the monkey brain responsible for planning movement. These nerve cells fire both when a monkey performs an action (like picking up a peanut) and when the monkey sees someone else do the same thing. Before long, similar systems were found in human brains too.
The surprising conclusion may be that when we see someone do something, the same parts of our brain are activated "as if" we were doing it ourselves. We may know what other people intend and feel by simulating what they are doing within the same motor areas of our own brains. As Rizzolatti puts it, "the fundamental mechanism that allows us a direct grasp of the mind of others is not conceptual reasoning but direct simulation of the observed events through the mirror mechanism." Direct grasp of others' minds is a special ability that paves the way for our unique powers of imitation, which in turn have allowed culture to develop. If bodies, their interaction with the brain, and planning for action in the world are so central to human kinds of mind, where does that leave the chances of creating an intelligent "disembodied mind" inside a computer? Perhaps the Turing test will be harder than we think. We may build computers that understand language but which cannot say anything meaningful, at least until we can give them "extended tactile experiences". To put it another way, computers may not be able to make sense until they can have sex.

_________________________________________________________________

TODD E. FEINBERG, M.D.
Psychiatrist and Neurologist, Albert Einstein College of Medicine; Author, Altered Egos

Myths and fairy tales are not true

"Myths and fairy tales are not true." There is no Easter Bunny, there is no Santa Claus, and Moses may never have existed. Worse yet, I have increasing difficulty believing that there is a higher power ruling the universe. This is my dangerous idea. It is not a dangerous idea to those who do not share my particular world view or personal fears; to others it may seem trivially true. But for me, this idea is downright horrifying. I came to ponder this idea through my neurological examination of patients with brain damage that causes a disturbance in their self-concepts and ego functions.
Some of these patients develop, in the course of their illness and recovery (or otherwise), disturbances of self and personal relatedness that create enduring delusions and metaphorical confabulations regarding their bodies, their relationships with loved ones, and their personal experiences. A patient I examined with a right-hemisphere stroke and a paralyzed left arm claimed that the arm had actually been severed from his brother's body by gang members, thrown in the East River, and later attached to the patient's shoulder. Another patient with a ruptured brain aneurysm and amnesia, who denied his disabilities, claimed he was planning to adopt a (phantom) child who was in need of medical assistance. These personal narratives, produced by patients in altered neurological states and therefore without the constraints imposed by a fully functioning consciousness, have a dream-like quality and constitute "personal myths" that express the patients' beliefs about themselves. The patient creates a metaphor in which personal experiences are crystallized in the form of external persons, objects, places, or events, real or fictitious. When this occurs, the metaphor serves as a symbolic representation or externalization of feelings that the patient does not realize originate from within the self. There is an intimate relationship between my patients' narratives and socially endorsed fairy tales and mythologies. This is particularly apparent when mythologies deal with themes relating to a loss of self, personal identity, or death. For many people, the notion of personal death is extremely difficult to grasp and fully accommodate within one's self-image. For many, in order to go on with life, death must be denied. Therefore, to help the individual deal with the prospect of the inevitability of personal death, cultural and religious institutions provide metaphors of everlasting life.
Just as my patients adapt to difficult realities by creating metaphorical substitutes, it appears to me that beliefs in angels, deities, and eternal souls can be understood in part as wish-fulfilling metaphors for an unpleasant reality that most of us cannot fully comprehend and accept. Unfortunately, just as my patients' myths are not true, neither are those that I was brought up to believe in.

_________________________________________________________________

STEWART BRAND
Founder, Whole Earth Catalog; Cofounder, The Well; Cofounder, Global Business Network; Author, How Buildings Learn

What if public policy makers have an obligation to engage historians, and historians have an obligation to try to help?

All historians understand that they must never, ever talk about the future. Their discipline requires that they deal in facts, and the future doesn't have any yet. A solid theory of history might be able to embrace the future, but all such theories have been discredited. Thus historians do not offer to take part in shaping public policy, and are seldom invited to. They leave that to economists. But discussions among policy makers always invoke history anyway, usually in simplistic form. "Munich" and "Vietnam," devoid of detail or nuance, stand for certain kinds of failure. "Marshall Plan" and "Man on the Moon" stand for certain kinds of success. Such totemic invocation of history is the opposite of learning from history, and Santayana's warning continues in force: those who fail to learn from history are condemned to repeat it. A dangerous thought: What if public policy makers have an obligation to engage historians, and historians have an obligation to try to help? And instead of just retailing advice, they could go generic: historians could set about developing a rigorous sub-discipline called "Applied History." There is only one significant book on the subject, published in 1988.
Thinking in Time: The Uses of History for Decision Makers was written by the late Richard Neustadt and Ernest May, who long taught a course on the subject at Harvard's Kennedy School of Government. (A course called "Reasoning from History" is currently taught there by Alexander Keyssar.) Done wrong, Applied History could paralyze public decision making and corrupt the practice of history -- that's the danger. But done right, Applied History could make decision making and policy far more sophisticated and adaptive, and it could invest the study of history with the level of consequence it deserves.

_________________________________________________________________

JARED DIAMOND
Biologist; Geographer, UCLA; Author, Collapse

The evidence that tribal peoples often damage their environments and make war

Why is this idea dangerous? Because too many people today believe that a reason not to mistreat tribal peoples is that they are too nice or wise or peaceful to do those evil things, which only we evil citizens of state governments do. The idea is dangerous because, if you believe that that's the reason not to mistreat tribal peoples, then proof of the idea's truth would suggest that it's OK to mistreat them. In fact, the evidence seems to me overwhelming that the dangerous idea is true. But we should treat other people well for ethical reasons, not because of naïve anthropological theories that will almost surely prove false.

_________________________________________________________________

LEONARD SUSSKIND
Physicist, Stanford University; Author, The Cosmic Landscape

The "Landscape"

I have been accused of advocating an extremely dangerous idea. According to some people, the "Landscape" idea will eventually ensure that the forces of intelligent design (and other unscientific religious ideas) will triumph over true science.
From one of my most distinguished colleagues: From a political, cultural point of view, it's not that these arguments are religious but that they denude us of our historical strength in opposing religion. Others have expressed the fear that my ideas, and those of my friends, will lead to the end of science (methinks they overestimate me). One physicist calls it "millennial madness." And from another quarter, Christoph Schönborn, Cardinal Archbishop of Vienna, has accused me of "an abdication of human intelligence." As you may have guessed, the idea in question is the Anthropic Principle: a principle that seeks to explain the laws of physics, and the constants of nature, by saying, "If they (the laws of physics) were different, intelligent life would not exist to ask why the laws of nature are what they are." On the face of it, the Anthropic Principle is far too silly to be dangerous. It sounds no more sensible than explaining the evolution of the eye by saying that unless the eye evolved, there would be no one to read this page. But the A.P. is really shorthand for a rich set of ideas that are beginning to influence and even dominate the thinking of almost all serious theoretical physicists and cosmologists. Let me strip the idea down to its essentials. Without all the philosophical baggage, what it says is straightforward: the universe is vastly bigger than the portion that we can see; and, on a very large scale, it is as varied as possible. In other words, rather than being a homogeneous, mono-colored blanket, it is a crazy-quilt patchwork of different environments. This is not idle speculation. There is a growing body of empirical evidence confirming the inflationary theory of cosmology, which underlies the hugeness and hypothetical diversity of the universe. Meanwhile string theorists, much to the regret of many of them, are discovering that the number of possible environments described by their equations is far beyond millions or billions.
This enormous space of possibilities, whose multiplicity may exceed ten to the 500th power, is called the Landscape. If these things prove to be true, then some features of the laws of physics (maybe most) will be local environmental facts rather than written-in-stone laws: laws that could not be otherwise. The explanation of some numerical coincidences will necessarily be that most of the multiverse is uninhabitable, but that in some very tiny fraction conditions are fine-tuned enough for intelligent life to form. That's the dangerous idea, and it is spreading like a cancer. Why is it that so many physicists find these ideas alarming? Well, they do threaten physicists' fondest hope, the hope that some extraordinarily beautiful mathematical principle will be discovered: a principle that would completely and uniquely explain every detail of the laws of particle physics (and therefore nuclear, atomic, and chemical physics). The enormous Landscape of possibilities inherent in our best theory seems to dash that hope. What further worries many physicists is that the Landscape may be so rich that almost anything can be found: any combination of physical constants, particle masses, etc. This, they fear, would eliminate the predictive power of physics. Environmental facts are nothing more than environmental facts. They worry that if everything is possible, there will be no way to falsify the theory -- or, more to the point, no way to confirm it. Is the danger real? We shall see. Another danger that some of my colleagues perceive is that if we "senior physicists" allow ourselves to be seduced by the Anthropic Principle, young physicists will give up looking for the "true" reason for things, the beautiful mathematical principle. My guess is that if the young generation of scientists is really that spineless, then science is doomed anyway. But as we know, the ambition of all young scientists is to make fools of their elders.
And why does the Cardinal Archbishop Schönborn find the Landscape and the Multiverse so dangerous? I will let him explain it himself: Now, at the beginning of the 21st century, faced with scientific claims like neo-Darwinism and the multiverse hypothesis in cosmology invented to avoid the overwhelming evidence for purpose and design found in modern science, the Catholic Church will again defend human nature by proclaiming that the immanent design evident in nature is real. Scientific theories that try to explain away the appearance of design as the result of 'chance and necessity' are not scientific at all, but, as John Paul put it, an abdication of human intelligence. Abdication of human intelligence? No, it's called science. _________________________________________________________________ GERALD HOLTON Mallinckrodt Research Professor of Physics and Research Professor of History of Science, Harvard University; Author, Thematic Origins of Scientific Thought The medicination of the ancient yearning for immortality Since the major absorption of scientific method into the research and practice of medicine in the 1860s, the longevity curve, at least for the white population in industrial countries, took off and has continued to climb fairly steadily. That has been on the whole a benign result, and has begun to introduce the idea of tolerably good health as one of the basic Human Rights. But one now reads of projections to 200 years, and perhaps more. The economic, social and human costs of the increasing fraction of very elderly citizens have begun to be noticed already.
To glimpse one of the possible results of the continuing projection of the longevity curve in terms of a plausible scenario: The matriarch of the family, on her deathbed at age 200, is being visited by the surviving, grieving family members: a son and a daughter, each about 180 years old, plus /their/ three "children", around 150-160 years old each, plus all their offspring, in the range of 120 to 130, and so on... A touching picture. But what are all the "costs" involved? _________________________________________________________________ CHARLES SEIFE Professor of Journalism, New York University; formerly journalist, Science magazine; Author, Zero: The Biography Of A Dangerous Idea Nothing Nothing can be more dangerous than nothing. Humanity's always been uncomfortable with zero and the void. The ancient Greeks declared them unnatural and unreal. Theologians argued that God's first act was to banish the void by the act of creating the universe ex nihilo, and thinkers of the Middle Ages tried to ban zero and the other Arabic "ciphers." But the emptiness is all around us -- most of the universe is void. Even as we huddle around our hearths and invent stories to convince ourselves that the cosmos is warm and full and inviting, nothingness stares back at us with empty eye sockets. _________________________________________________________________ KARL SABBAGH Writer and Television Producer; Author, The Riemann Hypothesis The human brain and its products are incapable of understanding the truths about the universe Our brains may never be well enough equipped to understand the universe, and we are fooling ourselves if we think they will. Why should we expect to be able eventually to understand how the universe originated, evolved, and operates?
While human brains are complex and capable of many amazing things, there is not necessarily any match between the complexity of the universe and the complexity of our brains, any more than a dog's brain is capable of understanding every detail of the world of cats and bones, or the dynamics of stick trajectories when thrown. Dogs get by and so do we, but do we have a right to expect that the harder we puzzle over these things the nearer we will get to the truth? Recently I stood in front of a three-metre-high model of the Ptolemaic universe in the Museum of the History of Science in Florence, and I remembered how well that worked as a representation of the motions of the planets until Copernicus and Kepler came along. Nowadays, no element of the theory of giant interlocking cogwheels at work is of any use in understanding the motions of the stars and planets (and indeed Ptolemy himself did not argue that the universe really was run by giant cogwheels). Occam's Razor is used to compare two theories and allows us to choose which is more likely to be 'true', but hasn't it become a comfort blanket whenever we are faced with aspects of the universe that seem unutterably complex -- string theory, for example? But is string theory just the Ptolemaic clockwork de nos jours? Can it be succeeded by some simplification, or might the truth be even more complex and far beyond the neural networks of our brains to understand? The history of science is littered with examples of two types of knowledge advancement. There is imperfect understanding that 'sort of' works, and is then modified and replaced by something that works better, without destroying the validity of the earlier theory. Newton's theory of gravitation was replaced by Einstein's. Then there is imperfect understanding that is replaced by some new idea which owes nothing to older ones.
Phlogiston theory, the ether, and so on are replaced by ideas which save the phenomena, lead to predictions, and convince us that they are nearer the truth. Which of these categories really covers today's science? Could we be fooling ourselves by playing around with modern phlogiston? And even if we are on the right lines in some areas, how much of what there is to be understood in the universe do we really understand? Fifty percent? Five percent? The dangerous idea is that perhaps we understand half a percent, and all the brain and computer power we can muster may take us up to one or two percent in the lifetime of the human race. Paradoxically, we may find that the only justification for pursuing scientific knowledge is for the practical applications it leads to -- a view that runs contrary to the traditional support of knowledge for knowledge's sake. And why is this paradoxical? Because the most important advances in technology have come out of research that was not seeking to develop those advances but to understand the universe. So if my dangerous idea is right -- that the human brain and its products are actually incapable of understanding the truths about the universe -- it will not -- and should not -- lead to any diminution at all in our attempts to do so. Which means, I suppose, that it's not really dangerous at all. _________________________________________________________________ RUPERT SHELDRAKE Biologist, London; Author, The Presence of the Past A sense of direction involving new scientific principles We don't understand animal navigation. No one knows how pigeons home, or how swallows migrate, or how green turtles find Ascension Island from thousands of miles away to lay their eggs. These kinds of navigation involve more than following familiar landmarks, or orientating in a particular compass direction; they involve an ability to move towards a goal. Why is this idea dangerous?
Don't we just need a bit more time to explain navigation in terms of standard physics, genes, nerve impulses and brain chemistry? Perhaps. But there is a dangerous possibility that animal navigation may not be explicable in terms of present-day physics. Over and above the known senses, some species of animals may have a sense of direction that depends on their being attracted towards their goals through direct field-like connections. These spatial attractors are places with which the animals themselves are already familiar, or with which their ancestors were familiar. What are the facts? We know more about pigeons than any other species. Everyone agrees that within familiar territory, especially within a few miles of their home, pigeons can use landmarks; for example, they can follow roads. But using familiar landmarks near home cannot explain how racing pigeons return across unfamiliar terrain from six hundred miles away, even flying over the sea, as English pigeons do when they are raced from Spain. Charles Darwin, himself a pigeon fancier, was one of the first to suggest a scientific hypothesis for pigeon homing. He proposed that they might use a kind of dead reckoning, registering all the twists and turns of the outward journey. This idea was tested in the twentieth century by taking pigeons away from their loft in closed vans by devious routes. They still homed normally. So did birds transported on rotating turntables, and so did birds that had been completely anaesthetized during the outward journey. What about celestial navigation? One problem for hypothetical solar or stellar navigation systems is that many animals still navigate in cloudy weather. Another problem is that celestial navigation depends on a precise time sense. To test the sun navigation theory, homing pigeons were clock-shifted by six or twelve hours and taken many miles from their lofts before being released. 
On sunny days, they set off in the wrong direction, as if a clock-dependent sun compass had been shifted. But in spite of their initial confusion, the pigeons soon corrected their courses and flew homewards normally. Two main hypotheses remain: smell and magnetism. Smelling the home position from hundreds of miles away is generally agreed to be implausible. Even the most ardent defenders of the smell hypothesis (the Italian school of Floriano Papi and his colleagues) concede that smell navigation is unlikely to work at distances over 30 miles. That leaves a magnetic sense. A range of animal species can detect magnetic fields, including termites, bees and migrating birds. But even if pigeons have a compass sense, this cannot by itself explain homing. Imagine that you are taken to an unfamiliar place and given a compass. You will know from the compass where north is, but not where home is. The obvious way of dealing with this problem is to postulate complex interactions between known sensory modalities, with multiple back-up systems. The complex interaction theory is safe, sounds sophisticated, and is vague enough to be irrefutable. The idea of a sense of direction involving new scientific principles is dangerous, but it may be inevitable. _________________________________________________________________ TOR NØRRETRANDERS Science Writer; Consultant; Lecturer, Copenhagen; Author, The User Illusion Social Relativity Relativity is my dangerous idea. Well, neither the special nor the general theory of relativity, but what could be called social relativity: the idea that the only thing that matters to human well-being is how one stands relative to others. That is, only the relative wealth of a person is important; the absolute level does not really matter, as soon as everyone is above the level of having their immediate survival needs fulfilled.
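The contrast between absolute and rank-based well-being stated above can be sketched in a few lines. Both utility functions and all the incomes below are invented for illustration; they encode only the two positions in the text: absolute well-being saturates once survival needs are met, while relative well-being depends purely on rank among one's peers.

```python
def absolute_wellbeing(income, subsistence=10):
    # Matters only up to survival needs, then flattens out.
    return min(income, subsistence)

def relative_wellbeing(income, peers):
    # Depends only on one's rank among peers, not on the amounts.
    return sum(income > p for p in peers) / len(peers)

poor_country_peers = [5, 8, 12]      # invented incomes
rich_country_peers = [50, 80, 120]   # invented incomes

# The same income of 40 tops one reference group and trails the other:
print(relative_wellbeing(40, poor_country_peers))  # 1.0
print(relative_wellbeing(40, rich_country_peers))  # 0.0
# ...while absolute well-being is identical in both cases:
print(absolute_wellbeing(40))  # 10
```

Under the rank-based function, a given income buys very different standing depending on the reference group, which is the sense in which being "rich in a poor country" and "poor in a rich country" can diverge even at the same absolute level.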
There is now strong and consistent evidence (from fields such as microeconomics, experimental economics, psychology, sociology and primatology) that it doesn't really matter how much you earn, as long as you earn more than your wife's sister's husband. Pioneers in these discussions are the late British social thinker Fred Hirsch and the American economist Robert Frank. Why is this idea dangerous? It seems to imply that equality will never become possible in human societies: the driving force is always to get ahead of the rest. Nobody will ever settle down and share. So it would seem that we are forever stuck with poverty, disease and unjust hierarchies. This idea could make the rich and the smart lean back and forget about the rest of the pack. But it shouldn't. Inequality may subjectively seem nice to the rich, but objectively it is not in their interest. A huge body of epidemiological evidence points to the fact that inequality is in fact the prime cause of human disease. Rich people in poor countries are healthier than poor people in rich countries, even though the latter group has more resources in absolute terms. Societies with strong gradients of wealth show higher death rates and more disease, even amongst the people at the top. Pioneers in these studies are the British epidemiologists Michael Marmot and Richard Wilkinson. Poverty means the spread of disease, degradation of ecosystems, and social violence and crime -- which are also bad for the rich. Inequality means stress for everyone. Social relativity then boils down to an illusion: it seems nice to me to be better off than the rest, but in terms of vitals -- survival, good health -- it is not. Believing in social relativity can be dangerous to your health. _________________________________________________________________ JOHN HORGAN Science Writer; Author, Rational Mysticism We Have No Souls The Depressing, Dangerous Hypothesis: We Have No Souls.
This year's Edge question makes me wonder: Which ideas pose a greater potential danger? False ones or true ones? Illusions or the lack thereof? As a believer in and lover of science, I certainly hope that the truth will set us free, and save us, but sometimes I'm not so sure. The dangerous, probably true idea I'd like to dwell on in this Holiday season is that we humans have no souls. The soul is that core of us that supposedly transcends and even persists beyond our physicality, lending us a fundamental autonomy, privacy and dignity. In his 1994 book The Astonishing Hypothesis: The Scientific Search for the Soul, the late, great Francis Crick argued that the soul is an illusion perpetuated, like Tinkerbell, only by our belief in it. Crick opened his book with this manifesto: "'You,' your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules." Note the quotation marks around "You." The subtitle of Crick's book was almost comically ironic, since he was clearly trying not to find the soul but to crush it out of existence. I once told Crick that "The Depressing Hypothesis" would have been a more accurate title for his book, since he was, after all, just reiterating the basic, materialist assumption of modern neurobiology and, more broadly, all of science. Until recently, it was easy to dismiss this assumption as moot, because brain researchers had made so little progress in tracing cognition to specific neural processes. Even self-proclaimed materialists -- who accept, intellectually, that we are just meat machines -- could harbor a secret, sentimental belief in a soul of the gaps. 
But recently the gaps have been closing, as neuroscientists -- egged on by Crick in the last two decades of his life--have begun unraveling the so-called neural code, the software that transforms electrochemical pulses in the brain into perceptions, memories, decisions, emotions, and other constituents of consciousness. I've argued elsewhere that the neural code may turn out to be so complex that it will never be fully deciphered. But 60 years ago, some biologists feared the genetic code was too complex to crack. Then in 1953 Crick and Watson unraveled the structure of DNA, and researchers quickly established that the double helix mediates an astonishingly simple genetic code governing the heredity of all organisms. Science's success in deciphering the genetic code, which has culminated in the Human Genome Project, has been widely acclaimed -- and with good reason, because knowledge of our genetic makeup could allow us to reshape our innate nature. A solution to the neural code could give us much greater, more direct control over ourselves than mere genetic manipulation. Will we be liberated or enslaved by this knowledge? Officials in the Pentagon, the major funder of neural-code research, have openly broached the prospect of cyborg warriors who can be remotely controlled via brain implants, like the assassin in the recent remake of "The Manchurian Candidate." On the other hand, a cult-like group of self-described "wireheads" looks forward to the day when implants allow us to create our own realities and achieve ecstasy on demand. Either way, when our minds can be programmed like personal computers, then, perhaps, we will finally abandon the belief that we have immortal, inviolable souls, unless, of course, we program ourselves to believe. _________________________________________________________________ ERIC R. 
KANDEL Biochemist and University Professor, Columbia University; Recipient, The Nobel Prize, 2000; Author, Cellular Basis of Behavior Free will is exercised unconsciously, without awareness It is clear that consciousness is central to understanding human mental processes, and therefore is the holy grail of modern neuroscience. What is less clear is that much of our mental processing is unconscious and that these unconscious processes are as important as conscious mental processes for understanding the mind. Indeed, most cognitive processes never reach consciousness. As Sigmund Freud emphasized at the beginning of the 20th century, most of our perceptual and cognitive processes are unconscious, except those that are in the immediate focus of our attention. Based on these insights, Freud emphasized that unconscious mental processes guide much of human behavior. Freud's idea was a natural extension of the notion of unconscious inference proposed in the 1860s by Hermann Helmholtz, the German physicist turned neural scientist. Helmholtz was the first to measure the conduction of electrical signals in nerves. He had expected it to be as fast as the speed of light -- as fast as the conduction of electricity in copper cables -- and found to his surprise that it was much slower, only about 90 meters per second. He then examined the reaction time, the time it takes a subject to respond to a consciously perceived stimulus, and found that it was much, much slower than even the combined conduction times required for sensory and motor activities. This caused Helmholtz to argue that a great deal of brain processing occurred unconsciously, prior to conscious perception of an object. Helmholtz went on to argue that much of what goes on in the brain is not represented in consciousness and that the perception of objects depends upon "unconscious inferences" made by the brain, based on thinking and reasoning without awareness.
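Helmholtz's inference can be reproduced as back-of-envelope arithmetic. The 90 m/s conduction speed is the figure quoted above; the one-metre path lengths and the 200 ms simple reaction time are round-number assumptions added here for illustration, not values from the essay.

```python
conduction_speed = 90.0   # m/s, the figure quoted above
sensory_path = 1.0        # m, receptor to brain (assumed round number)
motor_path = 1.0          # m, brain to muscle (assumed round number)
reaction_time = 0.200     # s, typical simple reaction time (assumed)

# Time the signal actually spends travelling along nerves, both ways:
transit = (sensory_path + motor_path) / conduction_speed
# Whatever remains must be central processing in the brain:
central = reaction_time - transit

print(f"nerve transit time:         {transit * 1000:.0f} ms")   # 22 ms
print(f"unaccounted-for processing: {central * 1000:.0f} ms")   # 178 ms
```

On these (assumed) numbers, nerve conduction explains only about a tenth of the reaction time; the large unaccounted-for interval is where, on Helmholtz's argument, unconscious inference must be taking place.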
This view was not accepted by many brain scientists who believed that consciousness is necessary for making inferences. However, in the 1970s a number of experiments began to accumulate in favor of the idea that most cognitive processes that occur in the brain never enter consciousness. Perhaps the most influential of these experiments were those carried out by Benjamin Libet in 1986. Libet used as his starting point a discovery made by the German neurologist Hans Kornhuber. Kornhuber asked volunteers to move their right index finger. He then measured this voluntary movement with a strain gauge while at the same time recording the electrical activity of the brain by means of an electrode on the skull. After hundreds of trials, Kornhuber found that, invariably, each movement was preceded by a little blip in the electrical record from the brain, a spark of free will! He called this potential in the brain the "readiness potential" and found that it occurred one second before the voluntary movement. Libet followed up on Kornhuber's finding with an experiment in which he asked volunteers to lift a finger whenever they felt the urge to do so. He placed an electrode on a volunteer's skull and confirmed a readiness potential about one second before the person lifted his or her finger. He then compared the time it took for the person to will the movement with the time of the readiness potential. Amazingly, Libet found that the readiness potential appeared not after, but 200 milliseconds before a person felt the urge to move his or her finger! Thus by merely observing the electrical activity of the brain, Libet could predict what a person would do before the person was actually aware of having decided to do it. These experiments led to the radical insight that by observing another person's brain activity, one can predict what someone is going to do before he is aware that he has made the decision to do it. 
This finding has caused philosophers of mind to ask: If the choice is determined in the brain unconsciously before we decide to act, where is free will? Are these choices predetermined? Is our experience of freely willing our actions only an illusion, a rationalization after the fact for what has happened? Freud, Helmholtz and Libet would disagree and argue that the choice is freely made but that it happens without our awareness. According to their view, the unconscious inference of Helmholtz also applies to decision-making. They would argue that the choice is made freely, but not consciously. Libet, for example, proposes that the process of initiating a voluntary action occurs in an unconscious part of the brain, but that just before the action is initiated, consciousness is recruited to approve or veto the action. In the 200 milliseconds before a finger is lifted, consciousness determines whether it moves or not. Whatever the reasons for the delay between decision and awareness, Libet's findings now raise the moral question: Is one to be held responsible for decisions that are made without conscious awareness? _________________________________________________________________ DANIEL GOLEMAN Psychologist; Author, Emotional Intelligence Cyber-disinhibition The Internet inadvertently undermines the quality of human interaction, allowing destructive emotional impulses freer rein under specific circumstances. The reason is a neural fluke that results in cyber-disinhibition of brain systems that keep our more unruly urges in check. The tech problem: a major disconnect between the ways our brains are wired to connect, and the interface offered in online interactions. Communication via the Internet can mislead the brain's social systems. The key mechanisms are in the prefrontal cortex; these circuits instantaneously monitor ourselves and the other person during a live interaction, and automatically guide our responses so they are appropriate and smooth.
A key mechanism for this involves circuits that ordinarily inhibit impulses for actions that would be rude or simply inappropriate -- or outright dangerous. In order for this regulatory mechanism to operate well, we depend on real-time, ongoing feedback from the other person. The Internet has no means to allow such realtime feedback (other than rarely used two-way audio/video streams). That puts our inhibitory circuitry at a loss -- there is no signal to monitor from the other person. This results in disinhibition: impulse unleashed. Such disinhibition seems state-specific, and typically occurs rarely while people are in positive or neutral emotional states. That's why the Internet works admirably for the vast majority of communication. Rather, this disinhibition becomes far more likely when people feel strong, negative emotions. What fails to be inhibited are the impulses those emotions generate. This phenomenon has been recognized since the earliest days of the Internet (then the Arpanet, used by a small circle of scientists) as "flaming," the tendency to send abrasive, angry or otherwise emotionally "off" cyber-messages. The hallmark of a flame is that the same person would never say the words in the email to the recipient were they face-to-face. His inhibitory circuits would not allow it -- and so the interaction would go more smoothly. He might still communicate the same core information face-to-face, but in a more skillful manner. Offline and in life, people who flame repeatedly tend to become friendless, or get fired (unless they already run the company). The greatest danger from cyber-disinhibition may be to young people. The prefrontal inhibitory circuitry is among the last part of the brain to become fully mature, doing so sometime in the twenties. During adolescence there is a developmental lag, with teenagers having fragile inhibitory capacities, but fully ripe emotional impulsivity. 
Strengthening these inhibitory circuits can be seen as the singular task in the neural development of the adolescent years. One way this teenage neural gap manifests online is "cyber-bullying," which has emerged among girls in their early teens. Cliques of girls post or send cruel, harassing messages to a target girl, who typically is both reduced to tears and socially humiliated. The posts and messages are anonymous, though they become widely known among the target's peers. The anonymity and social distance of the Internet allow an escalation of such petty cruelty to levels that are rarely found in person: face to face, seeing someone cry typically halts bullying among girls -- but that inhibitory signal cannot come via the Internet. A more ominous manifestation of cyber-disinhibition can be seen in the susceptibility of teenagers to being induced to perform sexual acts in front of webcams for an anonymous adult audience who pay to watch and direct. Apparently hundreds of teenagers have been lured into this corner of child pornography, with an equally large audience of pedophiles. The Internet gives strangers access to children in their own homes, children who are tempted to do things online they would never consider in person. Cyber-bullying was reported last week in my local paper. The webcam teenage sex circuit was a front-page story in The New York Times two days later. As with any new technology, the Internet is an experiment in progress. It's time we considered what other such downsides of cyber-disinhibition may be emerging -- and looked for a technological fix, if possible. The dangerous thought: the Internet may harbor social perils that our inhibitory circuitry was not designed to handle in evolution.
_________________________________________________________________ BRIAN GREENE Physicist & Mathematician, Columbia University; Author, The Fabric of the Cosmos; Presenter, three-part Nova program, The Elegant Universe The Multiverse The notion that there are universes beyond our own -- the idea that we are but one member of a vast collection of universes called the multiverse -- is highly speculative, but both exciting and humbling. It's also an idea that suggests a radically new, but inherently risky, approach to certain scientific problems. An essential working assumption in the sciences is that with adequate ingenuity, technical facility, and hard work, we can explain what we observe. The impressive progress made over the past few hundred years is testament to the apparent validity of this assumption. But if we are part of a multiverse, then our universe may have properties that are beyond traditional scientific explanation. Here's why: Theoretical studies of the multiverse (within inflationary cosmology and string theory, for example) suggest that the detailed properties of the other universes may be significantly different from our own. In some, the particles making up matter may have different masses or electric charges; in others, the fundamental forces may differ in strength and even number from those we experience; in others still, the very structure of space and time may be unlike anything we've ever seen. In this context, the quest for fundamental explanations of particular properties of our universe -- for example, the observed strengths of the nuclear and electromagnetic forces -- takes on a very different character. The strengths of these forces may vary from universe to universe, and thus it may simply be a matter of chance that, in our universe, these forces have the particular strengths with which we're familiar.
More intriguingly, we can even imagine that in the other universes where their strengths are different, conditions are not hospitable to our form of life. (With different force strengths, the processes giving rise to long-lived stars and stable planetary systems -- on which life can form and evolve -- can easily be disrupted.) In this setting, there would be no deep explanation for the observed force strengths. Instead, we would find ourselves living in a universe in which the forces have their familiar strengths simply because we couldn't survive in any of the others where the strengths were different. If true, the idea of a multiverse would be a Copernican revolution realized on a cosmic scale. It would be a rich and astounding upheaval, but one with potentially hazardous consequences. Beyond the inherent difficulty in assessing its validity, when should we allow the multiverse framework to be invoked in lieu of a more traditional scientific explanation? Had this idea surfaced a hundred years ago, might researchers have chalked up various mysteries to how things just happen to be in our corner of the multiverse, and not pressed on to discover all the wondrous science of the last century? Thankfully that's not how the history of science played itself out, at least not in our universe. But the point is manifest. While some mysteries may indeed reflect nothing more than the particular universe, within the multiverse, that we find ourselves inhabiting, other mysteries are worth struggling with because they are the result of deep, underlying physical laws. The danger, if the multiverse idea takes root, is that researchers may too quickly give up the search for such underlying explanations. When faced with seemingly inexplicable observations, researchers may invoke the framework of the multiverse prematurely -- proclaiming some or other phenomenon to merely reflect conditions in our bubble universe -- thereby failing to discover the deeper understanding that awaits us.
_________________________________________________________________ DAVID GELERNTER Computer Scientist, Yale University; Chief Scientist, Mirror Worlds Technologies; Author, Drawing Life What are people well-informed about in the Information Age? Let's date the Information Age to 1982, when the Internet went into operation & the PC had just been born. What if people have been growing less well-informed ever since? What if people have been growing steadily more ignorant ever since the so-called Information Age began? Suppose an average US voter, college teacher, 5th-grade teacher, and 5th-grade student are each less well-informed today than they were in '95, and were less well-informed then than in '85? Suppose, for that matter, they were less well-informed in '85 than in '65? If this is indeed the "information age," what exactly are people well-informed about? Video games? Clearly history, literature, philosophy, scholarship in general are not our specialties. This is some sort of technology age -- are people better informed about science? Not that I can tell. In previous technology ages, there was interest across the population in the era's leading technology. In the 1960s, for example, all sorts of people were interested in the space program and rocket technology. Lots of people learned a little about the basics -- what a "service module" or "trans-lunar injection" was, why a Redstone-Mercury vehicle was different from an Atlas-Mercury -- all sorts of grade-school students, lawyers, housewives, English profs were up on these topics. Today there is no comparable interest in computers & the Internet, and no comparable knowledge. "TCP/IP," "routers," "Ethernet protocol," "cache hits" -- these are topics of no interest whatsoever outside the technical community. The contrast is striking. _________________________________________________________________ MAHZARIN R.
BANAJI Professor of Psychology, Harvard University We do not (and to a large extent, cannot) know who we are through introspection Conscious awareness is a sliver of the machine that is human intelligence, but it's the only aspect we experience and hence the only aspect we come to believe exists. Thoughts, feelings, and behavior operate largely without deliberation or conscious recognition -- it's the routinized, automatic, classically conditioned, pre-compiled aspects of our thoughts and feelings that make up a large part of who we are. We don't know what motivates us, even though we are certain we know just why we do the things we do. We have no idea that our perceptions and judgments are incorrect (as measured objectively) even when they are. Even more stunning, our behavior is often discrepant from our own conscious intentions and goals, not just from objective standards or somebody else's standards. The same lack of introspective access that keeps us from seeing the truth in a visual illusion is the lack of introspective access that keeps us from seeing the truth of our own minds and behavior. The "bounds" on our ethical sense rarely come to light because the input into those decisions is kept firmly outside our awareness. Or at least, they don't come to light until science brings them into the light in a way that no longer permits them to remain in the dark. It is a fact that human minds have a tendency to categorize and learn in particular ways, and that feelings for one's ingroup and fear of outgroups are part of our evolutionary history. Fearing things that are different from oneself, and holding what's not part of the dominant culture (not American, not male, not White, not college-educated) to be "less good" whether one wants to or not, reflects a part of our history that made sense in a particular time and place -- because without it we would not have survived.
To know this is to understand the barriers to change honestly and with adequate preparation. As everybody's favorite biologist Richard Dawkins said thirty years ago: "Let us understand what our own selfish genes are up to, because we may then at least have a chance to upset their designs, something that no other species has ever aspired to do." We cannot know ourselves without the methods of science. The mind sciences have made it possible to look into the universe between the ear drums in ways that were unimagined. Emily Dickinson wrote in a letter to a mentor asking him to tell her how good a poet she was: "The sailor cannot see the north, but knows the needle can," she said. We have the needle, and it involves direct, concerted effort, using science to get to the next and perhaps last frontier: understanding not just our place among other planets, our place among other species, but our very nature. _________________________________________________________________ RODNEY BROOKS Director, MIT Computer Science and Artificial Intelligence Laboratory (CSAIL); Chief Technical Officer of iRobot Corporation; Author, Flesh and Machines [brooks100.jpg] Being alone in the universe The thing that I worry about most, and that may or may not be true, is that perhaps the spontaneous transformation from non-living matter to living matter is extraordinarily unlikely. We know that it has happened once. But what if we gain lots of evidence over the next few decades that it happens very rarely? In my lifetime we can expect to examine the surface of Mars, and the moons of the gas giants, in some detail. We can also expect to be able to image extra-solar planets within a few tens of light years to resolutions where we would be able to detect evidence of large-scale biological activity. What if none of these indicate any life whatsoever? What does that do to our scientific belief that life did arise spontaneously?
It should not change it, but it will make it harder to defend against non-scientific attacks. And wouldn't it sadden us immensely if we were to discover that there is a vanishingly small probability that life will arise even once in any given galaxy? Being alone in this solar system will not be such a shock, but being alone in the galaxy, or worse, alone in the universe, would, I think, drive us to despair, and back towards religion as our salve. _________________________________________________________________ LEE SMOLIN Physicist, Perimeter Institute; Author, Three Roads to Quantum Gravity [smolin100.jpg] Seeing Darwin in the light of Einstein; seeing Einstein in the light of Darwin The revolutionary moves made by Einstein and Darwin are closely related, and their combination will increasingly come to define how we see our worlds: physical, biological and social. Before Einstein, the properties of elementary particles were understood as being defined against an absolute, eternally fixed background. This way of doing science had been introduced by Newton. His method was to posit the existence of an absolute and eternal background structure against which the properties of things were defined. For example, this is how Newton conceived of space and time. Particles have properties defined, not with respect to each other, but each with respect to only the absolute background of space and time. Einstein's great achievement was to realize successfully the contrary idea, called relationalism, according to which the world is a network of relationships which evolve in time. There is no absolute background, and the properties of anything are only defined in terms of its participation in this network of relations. Before Darwin, species were thought of as eternal categories, defined a priori; after Darwin, species were understood to be relational categories -- that is, defined only in terms of their relationship with the network of interactions making up the biosphere.
Darwin's great contribution was to understand that there is a process -- natural selection -- that can act on relational properties, leading to the birth of genuine novelty by creating complexes of relationships that are increasingly structured and complex. Seeing Darwin in the light of Einstein, we understand that all the properties a species has in modern biology are relational. There is no absolute background in biology. Seeing Einstein in the light of Darwin opens up the possibility that the mechanism of natural selection could act not only on living things but on the properties that define the different species of elementary particles. At first, physicists thought that the only relational properties an elementary particle might have were its position and motion in space and time. The other properties, like mass and charge, were thought of in the old framework: defined by a background of absolute law. The standard model of particle physics taught us that some of those properties, like mass, are only the consequence of a particle's interactions with other fields. As a result the mass of a particle is determined environmentally, by the phase of the other fields it interacts with. I don't know which model of quantum gravity is right, but all the leading candidates -- string theory, loop quantum gravity and others -- teach us that it is possible that all properties of elementary particles are relational and environmental. In different possible universes there may be different combinations of elementary particles and forces. Indeed, all that used to be thought of as fundamental -- space and the elementary particles themselves -- are increasingly seen, in models of quantum gravity, as themselves emergent from a more elementary network of relations.
The basic method of science after Einstein seems to be: identify something in your theory that is playing the role of an absolute background, that is needed to define the laws that govern objects in your theory, and understand it more deeply as a contingent property, which itself evolves subject to law. For example, before Einstein the geometry of space was thought of as specified absolutely as part of the laws of nature. After Einstein we understand geometry is contingent and dynamical, which means it evolves subject to law. This means that Einstein's move can even be applied to aspects of what were thought to be the laws of nature: so that even aspects of the laws turn out to evolve in time. The basic method of science after Darwin seems to be to identify some property once thought to be absolute and defined a priori, and recognize that it can be understood because it has evolved by a process of, or akin to, natural selection. This has revolutionized biology and is in the process of doing the same to the social sciences. We can see by how I have stated it that these two methods are closely related. Einstein emphasizes the relational aspect of all properties described by science, while Darwin proposes that ultimately, the law which governs the evolution of everything else -- including perhaps what were once seen to be laws -- is natural selection. Should Darwin's method be applied even to the laws of physics? Recent developments in elementary particle physics give us little alternative if we are to have a rational understanding of the laws that govern our universe. I am referring here to the realization that string theory gives us, not a unique set of particles and forces, but an infinite list out of which one came to be selected for our universe. We physicists have now to understand Darwin's lesson: the only way to understand how one out of a vast number of choices was made, which favors improbable structure, is that it is the result of evolution by natural selection.
Can this work? I showed it might, in 1992, in a theory of cosmological natural selection. This remains the only theory so far proposed of how our laws came to be selected that makes falsifiable predictions. The idea that laws of nature are themselves the result of evolution by natural selection is nothing new; it was anticipated by the philosopher Charles Sanders Peirce, who wrote in 1891: To suppose universal laws of nature capable of being apprehended by the mind and yet having no reason for their special forms, but standing inexplicable and irrational, is hardly a justifiable position. Uniformities are precisely the sort of facts that need to be accounted for. Law is par excellence the thing that wants a reason. Now the only possible way of accounting for the laws of nature, and for uniformity in general, is to suppose them results of evolution. This idea remains dangerous, not only for what it has achieved, but for what it implies for the future. For there are implications that have yet to be absorbed or understood, even by those who have come to believe it is the only way forward for science. For example, must there always be a deeper, or meta-law, which governs the physical mechanisms by which a law evolves? And what about the fact that laws of physics are expressed in mathematics, which is usually thought of as encoding eternal truths? Can mathematics itself come to be seen as time-bound rather than as transcendent and eternal Platonic truths? I believe that we will achieve clarity on these and other scary implications of the idea that all the regularities we observe, including those we have gotten used to calling laws, are the result of evolution by natural selection. And I believe that once this is achieved, Einstein and Darwin will be understood as partners in the greatest revolution yet in science, a revolution that taught us that the world we are embedded in is nothing but an ever evolving network of relationships.
_________________________________________________________________ ALISON GOPNIK Psychologist, UC-Berkeley; Coauthor, The Scientist In the Crib [gopnik100.jpg] A cacophony of "controversy" It may not be good to encourage scientists to articulate dangerous ideas. Good scientists, almost by definition, tend towards the contrarian and ornery, and nothing gives them more pleasure than holding to an unconventional idea in the face of opposition. Indeed, orneriness and contrarianism are something of a currency for science -- nobody wants to have an idea that everyone else has too. Scientists are always constructing a straw-man "establishment" opponent whom they can then fearlessly demolish. If you combine that with defying the conventional wisdom of non-scientists you have a recipe for a very distinctive kind of scientific smugness and self-righteousness. We scientists see this contrarian habit grinning back at us in a particularly hideous and distorted form when global warming opponents or intelligent design advocates invoke the unpopularity of their ideas as evidence that they should be accepted, or at least discussed. The problem is exacerbated for public intellectuals. For the media, too, would far rather hear about contrarian or unpopular or morally dubious or "controversial" ideas than ones that are congruent with everyday morality and wisdom. No one writes a newspaper article about a study that shows that girls are just as good at some task as boys, or that children are influenced by their parents. It is certainly true that there is no reason that scientifically valid results should have morally comforting consequences -- but there is no reason why they shouldn't either. Unpopularity or shock is no more a sign of truth than popularity is. More to the point, when scientists do have ideas that are potentially morally dangerous they should approach those ideas with hesitancy and humility.
And they should do so in full recognition of the great human tragedy that, as Isaiah Berlin pointed out, there can be genuinely conflicting goods and that humans are often in situations of conflict for which there is no simple or obvious answer. Truth and morality may indeed in some cases be competing values, but that is a tragedy, not a cause for self-congratulation. Humility and empathy come less easily to most scientists, most certainly including me, than pride and self-confidence, but perhaps for that very reason they are the virtues we should pursue. This is, of course, itself a dangerous idea. Orneriness and contrarianism are, in fact, genuine scientific virtues, too. And in the current profoundly anti-scientific political climate it is terribly dangerous to do anything that might give comfort to the enemies of science. But I think the peril to science actually doesn't lie in timidity or self-censorship. It is much more likely to lie in a cacophony of "controversy". _________________________________________________________________ KEVIN KELLY Editor-At-Large, Wired; Author, New Rules for the New Economy [kelly100.jpg] More anonymity is good More anonymity is good: that's a dangerous idea. Fancy algorithms and cool technology make true anonymity in mediated environments more possible today than ever before. At the same time this techno-combo makes true anonymity in physical life much harder. For every step that masks us, we move two steps toward totally transparent unmasking. We have caller ID, but also caller ID Block, and then caller ID-only filters. Coming up: biometric monitoring and little place to hide. A world where everything about a person can be found and archived is a world with no privacy, and therefore many technologists are eager to maintain the option of easy anonymity as a refuge for the private. However, in every system that I have seen where anonymity becomes common, the system fails.
The recent taint in the honor of Wikipedia stems from the extreme ease with which anonymous declarations can be put into a very visible public record. Communities infected with anonymity will either collapse, or shift the anonymous to pseudo-anonymous, as in eBay, where you have a traceable identity behind an invented nickname. Or voting, where you can authenticate an identity without tagging it to a vote. Anonymity is like a rare earth metal. These elements are a necessary ingredient in keeping a cell alive, but the amount needed is a mere hard-to-measure trace. In larger doses these heavy metals are some of the most toxic substances known to life. They kill. Anonymity is the same. As a trace element in vanishingly small doses, it's good for the system by enabling the occasional whistleblower or persecuted fringe. But if anonymity is present in any significant quantity, it will poison the system. There's a dangerous idea circulating that the option of anonymity should always be at hand, and that it is a noble antidote to technologies of control. This is like pumping up the levels of heavy metals in your body in order to make it stronger. Privacy can only be won by trust, and trust requires persistent identity, if only pseudo-anonymously. In the end, the more trust, the better. Like all toxins, anonymity should be kept as close to zero as possible. _________________________________________________________________ DENIS DUTTON Professor of the philosophy of art, University of Canterbury, New Zealand; editor of Philosophy and Literature and Arts & Letters Daily [dutton100.jpg] A "grand narrative" The humanities have gone through the rise of Theory in the 1960s, its firm hold on English and literature departments through the 1970s and 80s, followed most recently by its much-touted decline and death.
Of course, Theory (capitalization is an English department affectation) never operated as a proper research program in any scientific sense -- with hypotheses validated (or falsified) by experiment or accrued evidence. Theory was a series of intellectual fashion statements, clever slogans and postures, imported from France in the 60s, then developed out of Yale and other Theory hot spots. The academic work Theory spawned was noted more for its chosen jargons, which functioned like secret codes, than for any concern to establish truth or advance knowledge. It was all about careers and prestige. Truth and knowledge, in fact, were ruled out as quaint illusions. This cleared the way, naturally, for an "anything-goes" atmosphere of academic criticism. In reality, it was anything but anything goes, since the political demands of the period included a long list of stereotyped villains (the West, the Enlightenment, dead whites males, even clear writing) to be pitted against mandatory heroines and heroes (indigenous peoples, the working class, the oppressed, and so forth). Though the politics remains as strong as ever in academe, Theory has atrophied not because it was refuted, but because everyone got bored with it. Add to that the absurdly bad writing of academic humanists of the period and episodes like the Sokal Hoax, and the decline was inevitable. Theory academics could with high seriousness ignore rational counter-arguments, but for them ridicule and laughter were like water thrown at the Wicked Witch. Theory withered and died. But wait. Here is exactly where my most dangerous idea comes in. What if it turned out that the academic humanities -- art criticism, music and literary history, aesthetic theory, and the philosophy of art -- actually had available to them a true, and therefore permanently valuable, theory to organize their speculations and interpretations? 
What if there really existed a hitherto unrecognized "grand narrative" that could explain the entire history of creation and experience of the arts worldwide? Aesthetic experience, as well as the context of artistic creation, is a phenomenon both social and psychological. From the standpoint of inner experience, it can be addressed by evolutionary psychology: the idea that our thinking and values are conditioned by the 2.6 million years of natural and sexual selection in the Pleistocene. This Darwinian theory has much to say about the abiding, cross-culturally ascertainable values human beings find in art. The fascination, for example, that people worldwide find in the exercise of artistic virtuosity, from Praxiteles to Hokusai to Renee Fleming, is not a social construct, but a Pleistocene adaptation (which outside of the arts shows itself in sporting interests everywhere). That calendar landscapes worldwide feature alternating copses of trees and open spaces, often hilly land, water, and paths or river banks that wind into an inviting distance is a Pleistocene landscape preference (which shows up in both art history and in the design of public parks everywhere). That soap operas and Greek tragedy all present themes of family breakdown ("She killed him because she loved him") is a reflection of ancient, innate content interests in story-telling. Darwinian theory offers substantial answers to perennial aesthetic questions. It has much to say about the origins of art. It's unlikely that the arts came about at one time or for one purpose; they evolved from overlapping interests based in survival and mate selection in the 80,000 generations of the Pleistocene. How we scan visually, how we hear, our sense of rhythm, the pleasures of artistic expression and in joining with others as an audience, and, not least, how the arts excite us using a repertoire of universal human emotions: all of this and more will be illuminated and explained by a Darwinian aesthetics. 
I've encountered stiff academic resistance to the notion that Darwinian theory might greatly improve the understanding of our aesthetic and imaginative lives. There's no reason to worry. The most complete, evolutionarily based explanation of a great work of art, classic or recent, will address its form, its narrative content, its ideology, how it is taken in by the eye or mind, and indeed, how it can produce a deep, even life-transforming pleasure. But nothing in a valid aesthetic psychology will rob art of its appeal, any more than knowing how we evolved to enjoy fat and sweet makes a piece of cheesecake any less delicious. Nor will a Darwinian aesthetics reduce the complexity of art to simple formulae. It will only give us a better understanding of the greatest human achievements and their effects on us. In the sense that it would show innumerable careers in the humanities over the last forty years to have been wasted on banal politics and execrable criticism, Darwinian aesthetics is a very dangerous idea indeed. For people who really care about understanding art, it would be a combination of fresh air and strong coffee. _________________________________________________________________ SIMON BARON-COHEN Psychologist, Autism Research Centre, Cambridge University; Author, The Essential Difference [baroncohen100.jpg] A political system based on empathy Imagine a political system based not on legal rules (systemizing) but on empathy. Would this make the world a safer place? The UK Parliament, US Congress, Israeli Knesset, French National Assembly, Italian Senato della Repubblica, Spanish Congreso de los Diputados -- what do such political chambers have in common? Existing political systems are based on two principles: getting power through combat, and then creating/revising laws and rules through combat.
Combat is sometimes physical (toppling your opponent militarily), sometimes economic (establishing a trade embargo, to starve your opponent of resources), sometimes propaganda-based (waging a media campaign to discredit your opponent's reputation), and sometimes through voting-related activity (lobbying, forming alliances, fighting to win votes in key seats), with the aim to 'defeat' the opposition. Creating/revising laws and rules is what you do once you are in power. These might be constitutional rules, rules of precedence, judicial rulings, statutes, or other laws or codes of practice. Politicians battle for their rule-based proposal (which they hold to be best) to win, and battle to defeat the opposition's rival proposal. This way of doing politics is based on "systemizing". First you analyse the most effective form of combat (itself a system) to win. If we do x, then we will obtain outcome y. Then you adjust the legal code (another system). If we pass law A, we will obtain outcome B. My colleagues and I have studied the essential difference between how men and women think. Our studies suggest that (on average) more men are systemizers, and more women are empathizers. Since most political systems were set up by men, it may be no coincidence that we have ended up with political chambers that are built on the principles of systemizing. So here's the dangerous new idea. What would it be like if our political chambers were based on the principles of empathizing? It is dangerous because it would mean a revolution in how we choose our politicians, how our political chambers govern, and how our politicians think and behave. We have never given such an alternative political process a chance. Might it be better and safer than what we currently have? 
Since empathy is about keeping in mind the thoughts and feelings of other people (not just your own), and being sensitive to another person's thoughts and feelings (not just riding rough-shod over them), it is clearly incompatible with notions of "doing battle with the opposition" and "defeating the opposition" in order to win and hold on to power. Currently, we select a party (and ultimately a national) leader based on their "leadership" qualities. Can he or she make decisions decisively? Can they do what is in the best interests of the party, or the country, even if it means sacrificing others to follow through on a decision? Can they ruthlessly reshuffle their Cabinet and "cut people loose" if they are no longer serving their interests? These are the qualities of a strong systemizer. Note we are not talking about whether that politician is male or female. We are talking about how a politician (irrespective of their sex) thinks and behaves. We have had endless examples of systemizing politicians unable to resolve conflict. Empathizing politicians would perhaps follow the example of Mandela and De Klerk, who sat down to try to understand the other, to empathize with the other, even if the other was defined as a terrorist. To do this involves the empathic act of stepping into the other's shoes, and identifying with their feelings. The details of a political system based on empathizing would need a lot of working out, but we can imagine certain qualities that would have no place. Gone would be politicians who are skilled orators but who simply deliver monologues, standing on a platform, pointing forcefully into the air to underline their insistence -- even the body language containing an implied threat of poking their listener in the chest or the face -- to win over an audience. Gone too would be politicians who are so principled that they are rigid and uncompromising.
Instead, we would elect politicians based on different qualities: politicians who are good listeners, who ask questions of others instead of assuming they know the right course of action. We would instead have politicians who respond sensitively to another, different point of view, and who can be flexible over where the dialogue might lead. Instead of seeking to control and dominate, our politicians would be seeking to support, enable, and care. _________________________________________________________________ FREEMAN DYSON Physicist, Institute for Advanced Study; Author, Disturbing the Universe [dysonf100.jpg] Biotechnology will be thoroughly domesticated in the next fifty years Biotechnology will be domesticated in the next fifty years as thoroughly as computer technology was in the last fifty years. This means cheap and user-friendly tools and do-it-yourself kits, for gardeners to design their own roses and orchids, and for animal-breeders to design their own lizards and snakes. A new art-form as creative as painting or cinema. It means biotech games for children down to kindergarten age, like computer-games but played with real eggs and seeds instead of with images on a screen. Kids will grow up with an intimate feeling for the organisms that they create. It means an explosion of biodiversity as new ecologies are designed to fit into millions of local niches all over the world. Urban and rural landscapes will become more varied and more fertile. There are two severe and obvious dangers. First, smart kids and malicious grown-ups will find ways to convert biotech tools to the manufacture of lethal microbes. Second, ambitious parents will find ways to apply biotech tools to the genetic modification of their own babies. The great unanswered question is whether we can regulate domesticated biotechnology so that it can be applied freely to animals and vegetables but not to microbes and humans.
_________________________________________________________________ GREGORY COCHRAN Consultant in adaptive optics and an adjunct professor of anthropology at the University of Utah [cochran100.jpg] There is something new under the sun -- us Thucydides said that human nature was unchanging and thus predictable -- but he was probably wrong. If you consider natural selection operating in fast-changing human environments, such stasis is most unlikely. We know of a number of cases in which there has been rapid adaptive change in humans; for example, most of the malaria-defense mutations such as sickle cell are recent, just a few thousand years old. The lactase mutation that lets most adult Europeans digest ice cream is not much older. There is no magic principle that restricts human evolutionary change to disease defenses and dietary adaptations: everything is up for grabs. Genes affecting personality, reproductive strategies, cognition, are all able to change significantly over few-millennia time scales if the environment favors such change -- and this includes the new environments we have made for ourselves, things like new ways of making a living and new social structures. I would be astonished if the mix of personality types favored among hunter-gatherers is "exactly" the same as that favored among peasant farmers ruled by a Pharaoh. In fact they might be fairly different. There is evidence that such change has occurred. Henry Harpending and I have, we think, made a strong case that natural selection changed the Ashkenazi Jews over a thousand years or so, favoring certain kinds of cognitive abilities and generating genetic diseases as a side effect. Bruce Lahn's team has found new variants of brain-development genes: one, ASPM, appears to have risen to high frequency in Europe and the Middle East in about six thousand years. We don't yet know what this new variant does, but it certainly could affect the human psyche -- and if it does, Thucydides was wrong. 
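The pace Cochran describes can be checked with a back-of-envelope model. In a simple haploid model, an allele with selective advantage s gains ln(1+s) in log-odds each generation, so even a modest advantage carries a variant from rarity to high frequency within a few millennia. This sketch uses illustrative numbers, not measured values for ASPM or any real variant:

```python
import math

def generations_to_frequency(p0, p_target, s):
    """Generations for a haploid allele with selective advantage s to go
    from frequency p0 to p_target. Under the standard recursion
    p' = p(1+s) / (1 + p*s), the log-odds of the allele grow linearly:
    logit(p_t) = logit(p0) + t * ln(1+s)."""
    logit = lambda p: math.log(p / (1 - p))
    return (logit(p_target) - logit(p0)) / math.log(1 + s)

# Illustrative: a 3% advantage takes a new variant from 0.1% to 50%
t = generations_to_frequency(0.001, 0.5, 0.03)
print(round(t))       # ~234 generations
print(round(t * 25))  # ~5,800 years at 25 years per generation
```

With these made-up but plausible-scale inputs, the sweep takes on the order of the six thousand years Cochran cites for ASPM, which is why "few-millennia time scales" are not at all exotic for natural selection.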
We may not be doomed to repeat the Sicilian expedition: on the other hand, since we don't understand much yet about the changes that have occurred, we might be even more doomed. But at any rate, we have almost certainly changed. There is something new under the sun -- us. This concept opens strange doors. If true, it means that the people of Sumeria and Egypt's Old Kingdom were probably fundamentally different from us: human nature has changed -- some, anyhow -- over recorded history. Julian Jaynes, in The Origin of Consciousness in the Breakdown of the Bicameral Mind, argued that there was something qualitatively different about the human mind in ancient civilization. On first reading, Breakdown seemed one of the craziest books ever written, but Jaynes may have been on to something. If people a few thousand years ago thought and acted differently because of biological differences, history is never going to be the same. _________________________________________________________________ GEORGE B. DYSON Science Historian; Author, Project Orion [dysong100.jpg] Understanding molecular biology without discovering the origins of life I predict we will reach a complete understanding of molecular biology and molecular evolution, without ever discovering the origins of life. This idea is dangerous, because it suggests a mystery that science cannot explain. Or, it may be interpreted as confirmation that life is merely the collective result of a long series of incremental steps, and that it is impossible to draw a precise distinction between life and non-life. "The only thing of which I am sure," argued Samuel Butler in 1880, "is that the distinction between the organic and inorganic is arbitrary; that it is more coherent with our other ideas, and therefore more acceptable, to start with every molecule as a living thing, and then deduce death as the breaking up of an association or corporation, than to start with inanimate molecules and smuggle life into them. 
" Every molecule a living thing? That's not even dangerous, it's wrong! But where else can you draw the line? _________________________________________________________________ KEITH DEVLIN Mathematician; Executive Director, Center for the Study of Language and Information, Stanford; Author, The Millennium Problems [devlin100.jpg] We are entirely alone Living creatures capable of relecting on their own existence are a one-off, freak accident, existing for one brief moment in the history of the universe. There may be life elsewhere in the universe, but it does not have self-reflective consciousness. There is no God; no Intelligent Designer; no higher purpose to our lives. Personally, I have never found this possibility particularly troubling, but my experience has been that most people go to considerable lengths to convince themselves that it is otherwise. I think that many people find the suggestion dangerous because they see it as leading to a life devoid of meaning or moral values. They see it as a suggestion full of despair, an idea that makes our lives seem pointless. I believe that the opposite is the case. As the product of that unique, freak accident, finding ourselves able to reflect on and enjoy our conscious existence, the very unlikeliness and uniqueness of our situation surely makes us highly appreciative of what we have. Life is not just important to us; it is literally everything we have. That makes it, in human terms, the most precious thing there is. That not only gives life meaning for us, something to be respected and revered, but a strong moral code follows automatically. The fact that our existence has no purpose outside that existence is completely irrelevant to the way we live our lives, since we are inside our existence. The fact that our existence has no purpose for the universe -- whatever that means -- in no way means it has no purpose for us. 
We must ask and answer questions about ourselves within the framework of our existence as what we are.

_________________________________________________________________

FRANK TIPLER
Professor of Mathematical Physics, Tulane University; Author, The Physics of Immortality

Why I Hope the Standard Model is Wrong about Why There is More Matter Than Antimatter

The Standard Model of particle physics -- a theory of all forces and particles except gravity, and one that has survived all tests over the past thirty years -- says it is possible to convert matter entirely into energy. Old-fashioned nuclear physics allows some matter to be converted into energy, but because nuclear physics requires the number of heavy particles like neutrons and protons, and light particles like electrons, to be separately conserved in nuclear reactions, only a small fraction (less than 1%) of the mass of the uranium or plutonium in an atomic bomb can be converted into energy. The Standard Model says that there is a way to convert all the mass of ordinary matter into energy; for example, it is in principle possible to convert the proton and electron making up a hydrogen atom entirely into energy. Particle physicists have long known about this possibility, but have considered it forever irrelevant to human technology because the energy required to convert matter into pure energy via this process is at the very limit of our most powerful accelerators (a trillion electron volts, or one TeV). I am very much afraid that the particle physicists are wrong about this Standard Model pure energy conversion process being forever irrelevant to human affairs. I have recently come to believe that the consistency of quantum field theory requires that it should be possible to convert up to 100 kilograms of ordinary matter into pure energy via this process using a device that could fit inside the trunk of a car, a device that could be manufactured in a small factory.
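As an order-of-magnitude check on the stakes Tipler describes, full conversion via E = mc² of even modest masses yields enormous energies. The sketch below uses standard constants; the comparison figures and variable names are my own illustration, not from the essay:

```python
# Order-of-magnitude check: rest energy of matter via E = m * c^2.
c = 299_792_458.0          # speed of light, m/s
megaton_tnt = 4.184e15     # joules released by one megaton of TNT

energy_joules = 100.0 * c**2            # rest energy of 100 kg, ~9.0e18 J
megatons = energy_joules / megaton_tnt  # ~2.1e3 Mt: the same order as the
                                        # 1,000-megaton figure Tipler cites

# Rest energy of a single hydrogen atom, in electron volts.
m_hydrogen = 1.6735e-27                 # kg
rest_energy_ev = m_hydrogen * c**2 / 1.602176634e-19  # ~0.94e9 eV (~1 GeV)
```

Note that the hydrogen atom's rest energy is about 1 GeV; the "one TeV" in the text refers to the accelerator energy scale needed to drive the conversion process, roughly a thousand times larger.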
Such a device would solve all our energy problems -- we would not need fossil fuels -- but the energy in 100 kilograms of matter is the energy released by a 1,000-megaton nuclear bomb. If such a bomb can be manufactured in a small factory, then terrorists everywhere will eventually have such weapons. I fear for the human race if this comes to pass. I very much hope I am wrong about the technological feasibility of such a bomb.

_________________________________________________________________

SCOTT SAMPSON
Chief Curator, Utah Museum of Natural History; Associate Professor, Department of Geology and Geophysics, University of Utah; Host, Dinosaur Planet TV series

The purpose of life is to disperse energy

The truly dangerous ideas in science tend to be those that threaten the collective ego of humanity and knock us further off our pedestal of centrality. The Copernican Revolution abruptly dislodged humans from the center of the universe. The Darwinian Revolution yanked Homo sapiens from the pinnacle of life. Today another menacing revolution sits at the horizon of knowledge, patiently awaiting broad realization by the same egotistical species. The dangerous idea is this: the purpose of life is to disperse energy. Many of us are at least somewhat familiar with the second law of thermodynamics, the unwavering propensity of energy to disperse and, in doing so, transition from high-quality to low-quality forms. More generally, as stated by ecologist Eric Schneider, "nature abhors a gradient," where a gradient is simply a difference over a distance -- for example, in temperature or pressure. Open physical systems -- including those of the atmosphere, hydrosphere, and geosphere -- all embody this law, being driven by the dispersal of energy, particularly the flow of heat, continually attempting to achieve equilibrium.
Phenomena as diverse as lithospheric plate motions, the northward flow of the Gulf Stream, and the occurrence of deadly hurricanes are all manifestations of the second law. There is growing evidence that life, the biosphere, is no different. It has often been said that life's complexity contravenes the second law, indicating the work either of a deity or of some unknown natural process, depending on one's bias. Yet the evolution of life and the dynamics of ecosystems obey the second law mandate, functioning in large part to dissipate energy. They do so not by burning brightly and disappearing, like a fire torching a forest, but through stable metabolic cycles that store chemical energy and continually reduce the solar gradient. Photosynthetic plants, bacteria, and algae capture energy from the sun and form the core of all food webs. Virtually all organisms, including humans, are, in a real sense, sunlight transmogrified, temporary waypoints in the flow of energy. Ecological succession, viewed from a thermodynamic perspective, is a process that maximizes the capture and degradation of energy. Similarly, the tendency for life to become more complex over the past 3.5 billion years (as well as the overall increase in biomass and organismal diversity through time) is not due simply to natural selection, as most evolutionists still argue, but also to nature's "efforts" to grab more and more of the sun's flow. The slow burn that characterizes life enables ecological systems to persist over deep time, changing in response to external and internal perturbations. Ecology has been summarized by the pithy statement "energy flows, matter cycles." Yet this maxim applies equally to complex systems in the non-living world; indeed, it literally unites the biosphere with the physical realm. More and more, it appears that complex, cycling, swirling systems of matter have a natural tendency to emerge in the face of energy gradients.
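Schneider's "nature abhors a gradient" can be seen in a toy numerical sketch (my illustration, not from the essay): explicit one-dimensional heat diffusion with insulated ends conserves total energy while relentlessly flattening the temperature gradient toward equilibrium.

```python
# Toy 1D heat diffusion: a sharp temperature gradient dissipates toward
# equilibrium while the total (the sum of cell temperatures) is conserved.
def diffuse(temps, alpha=0.2, steps=200):
    """Explicit finite-difference diffusion with insulated (no-flux) ends."""
    t = list(temps)
    for _ in range(steps):
        t = [
            t[i] + alpha * ((t[i - 1] if i > 0 else t[i])
                            - 2 * t[i]
                            + (t[i + 1] if i < len(t) - 1 else t[i]))
            for i in range(len(t))
        ]
    return t

hot_cold = [100.0] * 5 + [0.0] * 5   # hot half, cold half: a sharp gradient
final = diffuse(hot_cold)
spread = max(final) - min(final)     # the gradient shrinks from 100 toward 0
```

The design point is the second-law behavior itself: energy is conserved (the sum stays fixed) but the gradient, the difference over distance, decays until the system is nearly uniform.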
This recurrent phenomenon may even have been the driving force behind life's origins. This idea is not new, and is certainly not mine. Nobel laureate Erwin Schrödinger was one of the first to articulate the hypothesis, as part of his famous "What is Life" lectures in Dublin in 1943. More recently, Jeffrey Wicken, Harold Morowitz, Eric Schneider and others have taken this concept considerably further, buoyed by results from a range of studies, particularly within ecology. Schneider and Dorion Sagan provide an excellent summary of this hypothesis in their recent book, "Into the Cool". The concept of life as energy flow, once fully digested, is profound. Just as Darwin fundamentally connected humans to the non-human world, a thermodynamic perspective connects life inextricably to the non-living world. This dangerous idea, once broadly distributed and understood, is likely to provoke reaction from many sectors, including religion and science. The wondrous diversity and complexity of life through time, far from being the product of intelligent design, is a natural phenomenon intimately linked to the physical realm of energy flow. Moreover, evolution is not driven by the machinations of selfish genes propagating themselves through countless millennia. Rather, ecology and evolution together operate as a highly successful, extremely persistent means of reducing the gradient generated by our nearest star. In my view, evolutionary theory (the process, not the fact of evolution!) and biology generally are headed for a major overhaul once investigators fully comprehend the notion that the complex systems of earth, air, water, and life are not only interconnected but interdependent, cycling matter in order to maintain the flow of energy. Although this statement addresses only naturalistic function and is silent with regard to spiritual meaning, it is likely to have deep effects outside of science.
In particular, broad understanding of life's role in dispersing energy has great potential to help humans reconnect both to nature and to the planet's physical systems at a key moment in our species' history.

_________________________________________________________________

JEREMY BERNSTEIN
Professor of Physics, Stevens Institute of Technology; Author, Hitler's Uranium Club

The idea that we understand plutonium

The most dangerous idea I have come across recently is the idea that we understand plutonium. Plutonium is the most complex element in the periodic table. It has six different crystal phases between room temperature and its melting point. It can catch fire spontaneously in the presence of water vapor, and if you inhale minuscule amounts you will die of lung cancer. It is the principal element in the "pits" that are the explosive cores of nuclear weapons. In these pits it is alloyed with gallium. No one knows why this works, and no one can be sure how stable this alloy is. These pits, in the thousands, are now decades old. What is dangerous is the idea that they have retained their integrity and can be safely stored into the indefinite future.

_________________________________________________________________

MIHALY CSIKSZENTMIHALYI
Psychologist; Director, Quality of Life Research Center, Claremont Graduate University; Author, Flow

The free market

Generally, ideas are thought to be dangerous when they threaten an entrenched authority. Galileo was tried not because he claimed that the earth revolved around the sun -- a "hypothesis" his chief prosecutor, Cardinal Bellarmine, apparently was quite willing to entertain in private -- but because the Church could not afford to have a fact it claimed to know reversed by another epistemology, in this case by the scientific method.
Similar conflicts arose when Darwin's view of how humans first appeared on the planet challenged religious accounts of creation, or when Mendelian genetics applied to the growth of hardier strains of wheat challenged Leninist doctrine as interpreted by Lysenko. One of the most dangerous ideas at large in the current culture is that the "free market" is the ultimate arbiter of political decisions, and that there is an "invisible hand" that will direct us to the most desirable future provided the free market is allowed to actualize itself. This mystical faith is based on some reasonable empirical foundations, but when embraced as a final solution to the ills of humankind, it risks destroying both the material resources and the cultural achievements that our species has so painstakingly developed. So the dangerous idea on which our culture is based is that the political economy has a silver bullet -- the free market -- that must take precedence over any other value, and thereby lead to peace and prosperity. It is dangerous because, like all silver bullets, it is an intellectual and political scam that might benefit some, but ultimately requires the majority to pay for the destruction it causes. My dangerous idea is dangerous only to those who support the hegemony of the market. It consists in pointing out that the imperial free market wears no clothes -- it does not exist in the first place, and what passes for it is dangerous to the future well-being of our species. Scientists need to turn their attention to what the complex system that is human life will require in the future. Beginnings like the Calvert-Henderson Quality of Life Indicators, which focus on such central requirements as health, education, infrastructure, environment, human rights, and public safety, need to become part of our social and political agenda.
And when their findings come into conflict with the agenda of the prophets of the free market, the conflict should be examined -- who is it that benefits from the erosion of the quality of life?

_________________________________________________________________

IRENE PEPPERBERG
Research Associate, Psychology, Harvard University; Author, The Alex Studies

The differences between humans and nonhumans are quantitative, not qualitative

I believe that the differences between humans and nonhumans are quantitative, not qualitative. Why is this idea dangerous? It is hardly surprising, coming from someone who has spent her scientific career studying the abilities of (supposedly) small-brained nonhumans; moreover, the idea is not exactly new. It may be a bit controversial, given that many of my colleagues spend much of their time searching for the defining difference that separates humans and nonhumans (and they may be correct), and also given a current social and political climate that challenges evolution on what seems to be a daily basis. But why dangerous? Because, if we take this idea to its logical conclusion, it challenges almost every aspect of our lives -- scientific and nonscientific alike. Scientifically, the idea challenges the views of many researchers who continue to hypothesize about the next human-nonhuman "great divide"... Interestingly, however, detailed observation and careful experimentation have repeatedly demonstrated that nonhumans often possess capacities once thought to separate them from humans. Humans, for example, are not the only tool-using species, nor the only tool-making species, nor the only species to act cooperatively. So one has to wonder to what degree nonhumans share other capacities still thought to be exclusively human.
And, of course, the critical words here are "to what degree" -- do we count lack of a particular behavior a defining criterion, or do we accept the existence of less complex versions of that behavior as evidence for a continuum? If one wishes to argue that I'm just blurring the difference between "qualitative" and "quantitative", so be it...such blurring will not affect the dangerousness of my idea. My idea is dangerous because it challenges scientists at a more basic level, that of how we perform research. Now, let me state clearly that I'm not against animal research -- I wouldn't be alive today without it, and I work daily with captive animals that, although domestically bred (and that, by any standard, are provided with a fairly cushy existence), are still essentially wild creatures denied their freedom. But if we believe in a continuum, then we must at least question our right to perform experiments on our fellow creatures; we need to think about how to limit animal experiments and testing to what is essential, and to insist on humane (note the term!) housing and treatment. And, importantly, we must accept the significant cost in time, effort, and money thereby incurred -- increases that must come at the expense of something else in our society. The idea, taken to its logical conclusion, is dangerous because it should also affect our choices as to the origins of the clothes we wear and the foods we eat. Again, I'm not campaigning against leather shoes and T-bone steaks; I find that I personally cannot remain healthy on a totally vegetarian diet and sheepskin boots definitely ease the rigors of a Massachusetts winter. But if we believe in a continuum, we must at least question our right to use fellow creatures for our sustenance: We need to become aware of, for example, the conditions under which creatures destined for the slaughterhouse live their lives, and learn about and ameliorate the conditions in which their lives are ended. 
And, again, we must accept the costs involved in such decisions. If we do not believe in a clear boundary between humans and nonhumans, if we do not accept a clear "them" versus "us", we need to rethink other aspects of our lives. Do we have the right to clear-cut forests in which our fellow creatures live? To pollute the air, soil and water that we share with them, solely for our own benefit? Where do we draw the line? Life may be much simpler if we do firmly draw a line, but is simplicity a valid rationale? And, in case anyone wonders at my own personal view: I believe that humans are the ultimate generalists, creatures that may lack specific talents or physical adaptations that have been finely honed in other species, but whose additional brain power enables them -- in an exquisite manner -- to, for example, integrate information, improvise with what is present, and alter or adapt to a wide range of environments... but that this additional brain power is (and provides) a quantitative, not qualitative, difference.

_________________________________________________________________

BRIAN GOODWIN
Biologist, Schumacher College, Devon, UK; Author, How The Leopard Changed Its Spots

Fields of Danger

In science, the concept of a field is used to describe patterns of order in systems that are extended in space and show regularities of behaviour in time. Fields have always expressed ideas that are rather mysterious, but they work in describing natural processes. The first example of a field principle in physics was Newton's celebrated gravitational law, which described mathematically the universal attraction between bodies with mass. This mysterious action at a distance, without any wires or mechanical attachments between the bodies, was regarded as a mystical, occult concept by the mechanical philosophers of the 17th and 18th centuries. They condemned Newton's idea as a violation of the principles of explanation in the new science.
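Mysterious or not, the law itself is compact: F = G·m₁·m₂/r². A quick sketch with standard constants (the Earth-Moon figures are rounded approximations, chosen by me for illustration):

```python
# Newton's universal gravitation: F = G * m1 * m2 / r^2
G = 6.674e-11        # gravitational constant, N m^2 / kg^2
m_earth = 5.972e24   # mass of the Earth, kg
m_moon = 7.348e22    # mass of the Moon, kg
r = 3.844e8          # mean Earth-Moon distance, m

# The attractive force acting at a distance, with no wires or attachments:
force = G * m_earth * m_moon / r**2   # ~2e20 N, the pull holding the Moon in orbit
```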
However, there is a healthy pragmatic element to scientific investigation, and Newton's equations worked too well to be discarded on philosophical grounds. Another celebrated example of a physical field came from the experimental work of Michael Faraday on electricity and magnetism in the 19th century. He talked about fields of force that extend out in space from electrically charged bodies, or from magnets. Faraday's painstaking and ingenious work described how these fields change with distance from the body in precise ways, as does the gravitational force. Again these forces were regarded as mysterious, since they travel through apparently empty space, exerting interaction at a distance that cannot be understood mechanically. However, so precise were Faraday's measurements of the properties of electric and magnetic fields, and so vivid his description of the fields of force associated with them, that James Clerk Maxwell could take his observations and put them directly into mathematical form. These are the famous wave equations of electromagnetism on which our technology for electric motors, lighting, TV, communications and innumerable other applications is based. In the 20th century, Einstein transformed Newton's mysterious gravitational force into an even more mysterious property of space itself: it bends or curves under the influence of bodies with mass. Einstein's relativity theory did away with a force of attraction between bodies and substituted a mathematical relationship between mass and curvature of space-time. The result was a whole new way of understanding motion as natural, curved paths followed by bodies that not only cause the curvature but follow it. The universe was becoming intrinsically self-organising, and subjects as observers made an entry into physics. As if Einstein's relativity wasn't enough to shake up the world known to science, the next revolution was even more disturbing.
Quantum mechanics, emerging in the 1920s, did away with the classical notions of fields as smooth distributions of forces through space-time and described interactions at a distance in terms of discrete little packets of energy that travel through the void in oscillating patterns described by wave functions, of which the solutions to Schrödinger's wave equation are the best known. Now we have not only action at a distance but something infinitely more disturbing: these interactions violate conventional notions of causality because they are non-local. Two particles that have been joined in an intimate relationship within an atom remain coherently correlated with one another in their properties no matter how far apart they may be after emission from the atom. Einstein could not bring himself to believe that this 'spooky' implication of quantum mechanics could possibly be real. The implied entanglement means that there is a holistic principle of connectedness in operation at the most elementary level of physical reality. Quantum fields have subverted our basic notions of causality and substituted a principle of wholeness in relationship for elementary particles. The idea that I have pursued in biology for much of my career is the concept that goes under the name of a morphogenetic field. This term is used to describe the processes in space and time that organise and coordinate the various activities involved in the emergence of a whole complex organism from a single cell, or from a group of cells in interaction with one another. A human embryo developing in the mother's womb from a single fertilised egg, emerging at birth as a baby with all its organs coherently arranged in a functioning body, is one of the most breathtaking phenomena in nature. However, all species share the same ability to produce new individuals of the same kind in their processes of reproduction.
The remarkable organising principles that underlie such basic properties of life have been known as morphogenetic fields (fields that generate form) throughout the 20th century, though this concept produces unease and discomfort among many biologists. This unease arises for good reason. As in physics, the field concept is subversive of mechanical explanations in science, and biology holds firmly to understanding life in terms of mechanisms organised by genes. However, the complete reading of the book of life in DNA, the major project in biology during the last two decades of the 20th century, did not reveal the secrets of the organism. It was a remarkable achievement to work out the sequence of letters in the genomes of different species -- human, other animals, plants, and microbes -- so that many of the words of the genetic text of different species could be deciphered. Unfortunately, we were unable to make coherent sense of these words, to put them together in the way that organisms do in creating themselves during their reproduction as they develop into beings with specific morphologies and behaviours, the process of morphogenesis. What had been forgotten, or ignored, was that information only makes sense to an agent, someone or something with the know-how to interpret it. The meaning was missing because the genome researchers ignored the context of the genomes: the living cell within which genes are read and their products are organised. The organisation that is responsible for making sense of the information in the genes, an essential and basic aspect of the living state, was taken for granted. What is the nature of this complex dynamic process that knows how to make an organism, using specific information from the genes? Biology is returning to notions of space-time organisation as an intrinsic aspect of the living condition, our old friends the morphogenetic fields. They are now described as complex networks of molecules that somehow read and make sense of genes.
These molecular networks have intriguing properties, giving them some of the same characteristics as words in a language. Could it be that biology and culture are not so different after all; that both are based on historical traditions and languages that are used to construct patterns of relationship embodied in communities, either of cells or of individuals? These self-organising activities are certainly mysterious, but not unintelligible. My own work, with many colleagues, has cast morphogenetic fields in mathematical form that revealed how space (morphology) and time (behaviour) get organised in subtle but robust ways in developing organisms and communities. Such coordinating patterns in living beings seem to be at the heart of the creativity that drives both biological and cultural evolution. Despite many differences between these fields, which need to be clarified and distinguished rather than blurred, there may be underlying commonalities that can unify biological and cultural evolution rather than separating them. This could even lead us to value other species of organism for their wisdom in achieving coherent, sustainable relationships with other species while remaining creative and innovative throughout evolution, something we are signally failing to do in our culture with its ecologically damaging style of living.

_________________________________________________________________

RUDY RUCKER
Mathematician, Computer Scientist; Cyberpunk Pioneer; Novelist; Author, The Lifebox, the Seashell, and the Soul

Mind is a universally distributed quality

Panpsychism. Each object has a mind. Stars, hills, chairs, rocks, scraps of paper, flakes of skin, molecules -- each of them possesses the same inner glow as a human, each of them has singular inner experiences and sensations. I'm quite comfortable with the notion that everything is a computation. But what to do about my sense that there's something numinous about my inner experience?
Panpsychism represents a non-anthropocentric way out: mind is a universally distributed quality. Yes, the workings of a human brain are a deterministic computation that could be emulated by any universal computer. And, yes, I sense more to my mental phenomena than the rule-bound exfoliation of reactions to inputs: this residue is the inner light, the raw sensation of existence. But, no, that inner glow is not the exclusive birthright of humans, nor is it solely limited to biological organisms. Note that panpsychism needn't say that the universe is just one mind. We can also say that each object has an individual mind. One way to visualize the distinction between the many minds and the one mind is to think of the world as a stained glass window with light shining through each pane. The world's physical structures break the undivided cosmic mind into a myriad of small minds, one in each object. The minds of panpsychism can exist at various levels. As well as having its own individuality, a person's mind would also be, for instance, a hive mind based upon the minds of the body's cells and the minds of the body's elementary particles. Do the panpsychic minds have any physical correlates? On the one hand, it could be that the mind is some substance that accumulates near ordinary matter -- dark matter or dark energy are good candidates. On the other hand, mind might simply be matter viewed in a special fashion: matter experienced from the inside. Let me mention three specific physical correlates that have been proposed for the mind. Some have argued that the experience of mind results when a superposed quantum state collapses into a pure state. It's an alluring metaphor, but as a universal automatist, I'm of the opinion that quantum mechanics is a stop-gap theory, destined to give way to a fully deterministic theory based upon some digital precursor of spacetime.
David Skrbina, author of the clear and comprehensive book Panpsychism in the West, suggests that we might think of a physical system as determining a moving point in a multi-dimensional phase space that has an axis for each of the system's measurable properties. He feels this dynamic point represents the sense of unity characteristic of a mind. As a variation on this theme, let me point out that, from the universal automatist standpoint, every physical system can be thought of as embodying a computation. And the majority of non-simple systems embody universal computations, capable of emulating any other system at all. It could be that having a mind is in some sense equivalent to being capable of universal computation. A side-remark. Even such very simple systems as a single electron may in fact be capable of universal computation, if supplied with a steady stream of structured input. Think of an electron in an oscillating field; and by analogy think of a person listening to music or reading an essay. Might panpsychism be a distinction without a difference? Suppose we identify the numinous mind with quantum collapse, with chaotic dynamics, or with universal computation. What is added by claiming that these aspects of reality are like minds? I think empathy can supply an experiential confirmation of panpsychism's reality. Just as I'm sure that I myself have a mind, I can come to believe the same of another human with whom I'm in contact -- whether face to face or via their creative work. And with a bit of effort, I can identify with objects as well; I can see the objects in the room around me as glowing with inner light. This is a pleasant sensation; one feels less alone. Could there ever be a critical experiment to test if panpsychism is really true? Suppose that telepathy were to become possible, perhaps by entangling a person's mental states with another system's states. 
And then suppose that instead of telepathically contacting another person, I were to contact a rock. At this point panpsychism would be proved. I still haven't said anything about why panpsychism is a dangerous idea. Panpsychism, like other forms of higher consciousness, is dangerous to business as usual. If my old car has the same kind of mind as a new one, I'm less impelled to help the economy by buying a new vehicle. If the rocks and plants on my property have minds, I feel more respect for them in their natural state. If I feel myself among friends in the universe, I'm less likely to overwork myself to earn more cash. If my body will have a mind even after I'm dead, then death matters less to me, and it's harder for the government to cow me into submission.

_________________________________________________________________

STEVEN PINKER
Psychologist, Harvard University; Author, The Blank Slate

Groups of people may differ genetically in their average talents and temperaments

The year 2005 saw several public appearances of what I predict will become the dangerous idea of the next decade: that groups of people may differ genetically in their average talents and temperaments.

* In January, Harvard president Larry Summers caused a firestorm when he cited research showing that women and men have non-identical statistical distributions of cognitive abilities and life priorities.

* In March, developmental biologist Armand Leroi published an op-ed in the New York Times rebutting the conventional wisdom that race does not exist. (The conventional wisdom is coming to be known as Lewontin's Fallacy: that because most genes may be found in all human groups, the groups don't differ at all. But patterns of correlation among genes do differ between groups, and different clusters of correlated genes correspond well to the major races labeled by common sense.)

* In June, the Times reported a forthcoming study by physicist Greg Cochran, anthropologist Jason Hardy, and population geneticist Henry Harpending proposing that Ashkenazi Jews have been biologically selected for high intelligence, and that their well-documented genetic diseases are a by-product of this evolutionary history.

* In September, political scientist Charles Murray published an article in Commentary reiterating his argument from The Bell Curve that average racial differences in intelligence are intractable and partly genetic.

Whether or not these hypotheses hold up (the evidence for gender differences is reasonably good, for ethnic and racial differences much less so), they are widely perceived to be dangerous. Summers was subjected to months of vilification, and proponents of ethnic and racial differences in the past have been targets of censorship, violence, and comparisons to Nazis. Large swaths of the intellectual landscape have been reengineered to try to rule these hypotheses out a priori (race does not exist, intelligence does not exist, the mind is a blank slate inscribed by parents). The underlying fear, that reports of group differences will fuel bigotry, is not, of course, groundless. The intellectual tools to defuse the danger are available. "Is" does not imply "ought." Group differences, when they exist, pertain to the average or variance of a statistical distribution, rather than to individual men and women. Political equality is a commitment to universal human rights, and to policies that treat people as individuals rather than representatives of groups; it is not an empirical claim that all groups are indistinguishable. Yet many commentators seem unwilling to grasp these points, to say nothing of the wider world community. Advances in genetics and genomics will soon provide the ability to test hypotheses about group differences rigorously. Perhaps geneticists will forbear performing these tests, but one shouldn't count on it.
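The point that group differences pertain to distributions rather than individuals can be made concrete with a hedged numerical sketch (the numbers are purely illustrative, not real data): with identical means, even a modest difference in spread changes representation in the extreme tail substantially, while implying nothing about any given individual.

```python
from math import erf, sqrt

def tail_above(cutoff, mean=0.0, sd=1.0):
    """P(X > cutoff) for a normal distribution with the given mean and sd."""
    z = (cutoff - mean) / sd
    return 0.5 * (1 - erf(z / sqrt(2)))

# Two hypothetical groups: identical means, slightly different spread.
p_narrow = tail_above(3.0, sd=1.0)   # ~0.00135
p_wide = tail_above(3.0, sd=1.1)     # larger tail despite identical mean
ratio = p_wide / p_narrow            # roughly a twofold difference at +3 SD
```

The design point is that tail proportions are exquisitely sensitive to variance, which is why statements about averages and variances of distributions say so little about individuals drawn from them.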
The tests could very well emerge as by-products of research in biomedicine, genealogy, and deep history which no one wants to stop. The human genomic revolution has spawned an enormous amount of commentary about the possible perils of cloning and human genetic enhancement. I suspect that these are red herrings. When people realize that cloning is just forgoing a genetically mixed child for a twin of one parent, and is not the resurrection of the soul or a source of replacement organs, no one will want to do it. Likewise, when they realize that most genes have costs as well as benefits (they may raise a child's IQ but also predispose him to genetic disease), "designer babies" will lose whatever appeal they have. But the prospect of genetic tests of group differences in psychological traits is both more likely and more incendiary, and is one that the current intellectual community is ill-equipped to deal with. _________________________________________________________________ RICHARD E. NISBETT Professor of Psychology, Co-Director of the Culture and Cognition Program, University of Michigan; Author, The Geography of Thought: How Asians and Westerners Think Differently... And Why Telling More Than We Can Know Do you know why you hired your most recent employee over the runner-up? Do you know why you bought your last pair of pajamas? Do you know what makes you happy and unhappy? Don't be too sure. The most important thing that social psychologists have discovered over the last 50 years is that people are very unreliable informants about why they behaved as they did, made the judgment they did, or liked or disliked something. In short, we don't know nearly as much about what goes on in our heads as we think. In fact, for a shocking range of things, we don't know the answer to "Why did I?" any better than an observer.
The first inkling that social psychologists had about just how ignorant we are about our thinking processes came from the study of cognitive dissonance beginning in the late 1950s. When our behavior is insufficiently justified, we move our beliefs into line with the behavior so as to avoid the cognitive dissonance we would otherwise experience. But we are usually quite unaware that we have done that, and when it is pointed out to us we recruit phantom reasons for the change in attitude. Beginning in the mid-1960s, social psychologists started doing experiments about the causal attributions people make for their own behavior. If you give people electric shocks, but tell them that you have given them a pill that will produce the arousal symptoms that are actually created by the shock, they will take much more shock than subjects without the pill. They have attributed their arousal to the pill and are therefore willing to take more shock. But if you ask them why they took so much shock they are likely to say something like "I used to work with electrical gadgets and I got a lot of shocks, so I guess I got used to it." In the 1970s social psychologists began asking whether people could be accurate about why they make truly simple judgments and decisions -- such as why they like a person or an article of clothing. For example, in one study experimenters videotaped a Belgian responding in one of two modes to questions about his philosophy as a teacher: he either came across as an ogre or a saint. They then showed subjects one of the two tapes and asked them how much they liked the teacher. Furthermore, they asked some of them whether the teacher's accent had affected how much they liked him and asked others whether how much they liked the teacher influenced how much they liked his accent. Subjects who saw the ogre naturally disliked him a great deal, and they were quite sure that his grating accent was one of the reasons. 
Subjects who saw the saint realized that one of the reasons they were so fond of him was his charming accent. Subjects who were asked if their liking for the teacher could have influenced their judgment of his accent were insulted by the question. Does familiarity breed contempt? On the contrary, it breeds liking. In the 1980s, social psychologists began showing people such stimuli as Turkish words and Chinese ideographs and asking them how much they liked them. They would show a given stimulus somewhere between one and twenty-five times. The more the subjects saw the stimulus the more they liked it. Needless to say, their subjects did not find it plausible that the mere number of times they had seen a stimulus could have affected their liking for it. (You're probably wondering if white rats are susceptible to the mere familiarity effect. The study has been done. Rats brought up listening to music by Mozart prefer to move to the side of the cage that trips a switch allowing them to listen to Mozart rather than Schoenberg. Rats raised on Schoenberg prefer to be on the Schoenberg side. The rats were not asked the reasons for their musical preferences.) Does it matter that we often don't know what goes on in our heads and yet believe that we do? Well, for starters, it means that we often can't answer accurately crucial questions about what makes us happy and what makes us unhappy. A social psychologist asked Harvard women to keep a daily record for two months of their mood states and also to record a number of potentially relevant factors in their lives including amount of sleep the night before, the weather, general state of health, sexual activity, and day of the week (Monday blues? TGIF?). At the end of the period, subjects were asked to tell the experimenters how much each of these factors tended to influence their mood over the two month period. The results? 
Women's reports of what influenced their moods were uncorrelated with what they had reported on a daily basis. If a woman thought that her sexual activity had a big effect, a check of her daily reports was just as likely to show that it had no effect as that it did. To really rub it in, the psychologist asked her subjects to report what influenced the moods of someone they didn't know: she found that accuracy was just as great when a woman was rated by a stranger as when rated by the woman herself! But if we were to just think really hard about reasons for behavior and preferences, might we be likely to come to the right conclusions? Actually, just the opposite may often be the case. A social psychologist asked people to choose which of several art posters they liked best. Some people were asked to analyze why they liked or disliked the various posters and some were not asked, and everyone was given their favorite poster to take home. Two weeks later the psychologist called people up and asked them how much they liked the art poster they had chosen. Those who did not analyze their reasons liked their posters better than those who did. It's certainly scary to think that we're ignorant of so much of what goes on in our heads, though we're almost surely better off taking what we and others say about motives and reasons with a large grain of salt. Skepticism about our ability to read our minds is safer than certainty that we can. Still, the idea that we have little access to the workings of our minds is a dangerous one. The theories of Copernicus and Darwin were dangerous because they threatened, respectively, religious conceptions of the centrality of humans in the cosmos and the divinity of humans. Social psychologists are threatening a core conviction of the Enlightenment -- that humans are perfectible through the exercise of reason.
If reason cannot be counted on to reveal the causes of our beliefs, behavior and preferences, then the idea of human perfectibility is to that degree diminished. _________________________________________________________________ ROBERT R. PROVINE Psychologist and Neuroscientist, University of Maryland; Author, Laughter This is all there is The empirically testable idea that the here and now is all there is and that life begins at birth and ends at death is so dangerous that it has cost the lives of millions and threatens the future of civilization. The danger comes not from the idea itself, but from its opponents, those religious leaders and followers who ruthlessly advocate and defend their empirically improbable afterlife and man-in-the-sky cosmological perspectives. Their vigor is understandable. What better theological franchise is there than the promise of everlasting life, with deluxe trimmings? Religious followers must invest now with their blood and sweat, with their big payoff not due until the afterlife. Postmortal rewards cost theologians nothing--I'll match your heavenly choir and raise you 72 virgins. Some franchise! This is even better than the medical profession, a calling with higher overhead, that has gained control of birth, death and pain. Whether the religious brand is Christianity or Islam, the warring continues, with a terrible fate reserved for heretics who threaten the franchise from within. Worse may be in store for those who totally reject the man-in-the-sky premise and its afterlife trappings. All of this trouble over accepting what our senses tell us--that this is all there is. Resolution of religious conflict is impossible because there is no empirical test of the ghostly, and many theologians prey, intentionally or not, upon the fears, superstitions, irrationality, and herd tendencies that are our species' neurobehavioral endowment.
Religious fundamentalism inflames conflict and prevents solution--the more extreme and irrational one's position, the stronger one's faith, and, when possessing absolute truth, compromise is not an option. Resolution of conflicts between religions and associated cultures is less likely to come from compromise than from the pursuit of superordinate goals, common, overarching objectives that extend across nations and cultures, and direct our competitive spirit to further the health, well-being, and nobility of everyone. Public health and science provide such unifying goals. I offer two examples. Health Initiative. A program that improves the health of all people, especially those in developing nations, may find broad support, especially with the growing awareness of global culture and the looming specter of a pandemic. Public health programs bridge religious, political, and cultural divides. No one wants to see their children die. Conflicts fall away when cooperation offers a better life for all concerned. This is also the most effective anti-terrorism strategy, although one probably unpopular with the military-industrial complex on one side, and terrorist agitators on the other. Space Initiative. Space exploration expands our cosmos and increases our appreciation of life on Earth and its finite resources. Space exploration is one of our species' greatest achievements. Its pursuit is a goal of sufficient grandeur to unite people of all nations. This is all there is. The sooner we accept this dangerous idea, the sooner we can get on with the essential task of making the most of our lives on this planet. _________________________________________________________________ DONALD HOFFMAN Cognitive Scientist, UC, Irvine; Author, Visual Intelligence A spoon is like a headache A spoon is like a headache. This is a dangerous idea in sheep's clothing.
It consumes decrepit ontology, preserves methodological naturalism, and inspires exploration for a new ontology, a vehicle sufficiently robust to sustain the next leg of our search for a theory of everything. How could a spoon and a headache do all this? Suppose I have a headache, and I tell you about it. It is, say, a pounding headache that started at the back of the neck and migrated to encompass my forehead and eyes. You respond empathetically, recalling a similar headache you had, and suggest a couple of remedies. We discuss our headaches and remedies a bit, then move on to other topics. Of course no one but me can experience my headaches, and no one but you can experience yours. But this poses no obstacle to our meaningful conversation. You simply assume that my headaches are relevantly similar to yours, and I assume the same about your headaches. The fact that there is no "public headache," no single headache that we both experience, is simply no problem. A spoon is like a headache. Suppose I hand you a spoon. It is common to assume that the spoon I experience during this transfer is numerically identical to the spoon you experience. But this assumption is false. No one but me can experience my spoon, and no one but you can experience your spoon. But this is no problem. It is enough for me to assume that your spoon experience is relevantly similar to mine. For effective communication, no public spoon is necessary, just like no public headache is necessary. Is there a "real spoon," a mind-independent physical object that causes our spoon experiences and resembles our spoon experiences? This is not only unnecessary but unlikely. It is unlikely that the visual experiences of homo sapiens, shaped to permit survival in a particular range of niches, should miraculously also happen to resemble the true nature of a mind-independent realm. Selective pressures for survival do not, except by accident, lead to truth.
One can have a kind of objectivity without requiring public objects. In special relativity, the measurements, and thus the experiences, of mass, length and time differ from observer to observer, depending on their relative velocities. But these differing experiences can be related by the Lorentz transformation. This is all the objectivity one can have, and all one needs to do science. Once one abandons public physical objects, one must reformulate many current open problems in science. One example is the mind-brain relation. There are no public brains, only my brain experiences and your brain experiences. These brain experiences are just the simplified visual experiences of homo sapiens, shaped for survival in certain niches. The chances that our brain experiences resemble some mind-independent truth are remote at best, and those who would claim otherwise must surely explain the miracle. Failing a clever explanation of this miracle, there is no reason to believe brains cause anything, including minds. And here the wolf unzips the sheep skin, and darts out into the open. The danger becomes apparent the moment we switch from boons to sprains. Oh, pardon the spoonerism. _________________________________________________________________ MARC D. HAUSER Psychologist and Biologist, Harvard University; Author, Wild Minds A universal grammar of [mental] life The recent explosion of work in molecular evolution and developmental biology has, for the first time, made it possible to propose a radical new theory of mental life that, if true, will forever rewrite the textbooks and our way of thinking about our past and future. It explains both the universality of our thoughts as well as the unique signatures that demarcate each human culture, past, present and future.
The theory I propose is that human mental life is based on a few simple, abstract, yet expressively powerful rules or computations together with an instructive learning mechanism that prunes the range of possible systems of language, music, mathematics, art, and morality to a limited set of culturally expressed variants. In many ways, this view isn't new or radical. It stems from thinking about the seemingly constrained ways in which relatively open-ended or generative systems of expression create both universal structure and limited variation. Unfortunately, what appears to be a rather modest proposal on some counts is dangerous on others. It is dangerous to those who abhor biologically grounded theories on the often misinterpreted perspective that biology determines our fate, derails free will, and erases the soul. But a look at systems other than the human mind makes it transparently clear that the argument from biological endowment does not entail any of these false inferences. For example, we now understand that our immune systems don't learn from the environment how to tune up to the relevant problems. Rather, we are equipped with a full repertoire of antibodies to deal with a virtually limitless variety of problems, including some that have not yet even emerged in the history of life on earth. This initially seems counter-intuitive: how could the immune system have evolved to predict the kinds of problems we might face? The answer is that it couldn't. What it evolved instead was a set of molecular computations that, in combination with each other, can handle an infinitely larger set of conditions than any single combination on its own. The role of the environment is as instructor, functionally telling the immune system about the current conditions, resulting in a process of paring down of initial starting options.
The pattern of change observed in the immune system, characterized by an initial set of universal computations or options followed by an instructive process of pruning, is seen in systems as disparate as the genetic mechanisms underlying segmented body parts in vertebrates, the basic body plan of land plants involving the shoot system of stem and leaves, and song development in birds. Songbirds are particularly interesting as the system for generating a song seems to be analogous in important ways to our capacity to generate a specific language. Humans and songbirds start with a species-specific capacity to build language and song respectively, and this capacity has limitless expressive power. Upon delivery and hatching, and possibly a bit before, the local acoustic environment begins the process of instruction, pruning the possible languages and songs down to one or possibly two. The common thread here is a starting state of universal computations or options followed by an instructive process of pruning, ending up with distinctive systems that share an underlying common core. Hard to see how anyone could find this proposal dangerous or off-putting, or even wrong! Now jump laterally, and make the move to aesthetics and ethics. Our minds are endowed with universal computations for creating and judging art, music, and morally relevant actions. Depending upon where we are born, we will find atonal music pleasing or disgusting, and infanticide obligatory or abhorrent. The common or universal core is, for music, a set of rules for combining together notes to alter our emotions, and for morality, a different set of rules for combining the causes and consequences of action to alter our permissibility judgments. To say that we are endowed with a universal moral sense is not to say that we will do the right or wrong thing, with any consistency. The idea that there is a moral faculty, grounded in our biology, says nothing at all about the good, the bad or the ugly. 
What it says is that we have evolved particular biases, designed as a function of selection for particular kinds of fit to the environment, under particular constraints. But nothing about this claim leads to the good or the right or the permissible. The reason this has to be the case is twofold: there is not only cultural variation but environmental variation over evolutionary time. What is good for us today may not be good for us tomorrow. But the key insight delivered by the nativist perspective is that we must understand the nature of our biases in order to work toward some good or better world, realizing all along that we are constrained. Appreciating the choreography between universal options and instructive pruning is only dangerous if misused to argue that our evolved nature is good, and what is good is right. That's bad. _________________________________________________________________ RAY KURZWEIL Inventor and Technologist; Author, The Singularity Is Near: When Humans Transcend Biology The near-term inevitability of radical life extension and expansion My dangerous idea is the near-term inevitability of radical life extension and expansion. The idea is dangerous, however, only when contemplated from current linear perspectives. First the inevitability: the power of information technologies is doubling each year, and moreover comprises areas beyond computation, most notably our knowledge of biology and of our own intelligence. It took 15 years to sequence HIV, and from that perspective the genome project seemed impossible in 1990. But the amount of genetic data we were able to sequence doubled every year while the cost came down by half each year. We finished the genome project on schedule and were able to sequence SARS in only 31 days. We are also gaining the means to reprogram the ancient information processes underlying biology. RNA interference can turn genes off by blocking the messenger RNAs that express them.
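Kurzweil's genome-project arithmetic (sequencing capacity doubling every year while cost halves) can be checked with a toy calculation. The figures below are illustrative, not actual sequencing data; the point is that a constant-doubling process does most of its work in its final few doublings, so a project can look hopeless well past its halfway point in time and still finish on schedule:

```python
# Toy model: annual output doubles every year. Cumulative output
# after n years is capacity_0 * (2**n - 1), so the final year alone
# contributes more than all previous years combined.
def years_to_finish(total_work, first_year_capacity):
    done, year, capacity = 0, 0, first_year_capacity
    while done < total_work:
        done += capacity
        capacity *= 2
        year += 1
    return year

# Hypothetical job 1,000,000x the first year's capacity: under
# annual doubling it still finishes in about 20 years.
print(years_to_finish(1_000_000, 1))  # -> 20
```

After 19 of those 20 years, barely half the work is done; the last doubling supplies the rest. Viewed linearly the project looks impossible at year 10, while viewed exponentially it is on schedule.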
New forms of gene therapy are now able to place new genetic information in the right place on the right chromosome. We can create or block enzymes, the workhorses of biology. We are reverse-engineering -- and gaining the means to reprogram -- the information processes underlying disease and aging, and this process is accelerating, doubling every year. If we think linearly, then the idea of turning off all disease and aging processes appears far off into the future, just as the genome project did in 1990. On the other hand, if we factor in the doubling of the power of these technologies each year, the prospect of radical life extension is only a couple of decades away. In addition to reprogramming biology, we will be able to go substantially beyond biology with nanotechnology in the form of computerized nanobots in the bloodstream. If the idea of programmable devices the size of blood cells performing therapeutic functions in the bloodstream sounds like far-off science fiction, I would point out that we are doing this already in animals. One scientist cured type I diabetes in rats with blood-cell-sized devices containing 7 nanometer pores that let insulin out in a controlled fashion and that block antibodies. If we factor in the exponential advance of computation and communication (price-performance multiplying by a factor of a billion in 25 years while at the same time shrinking in size by a factor of thousands), these scenarios are highly realistic. The apparent dangers are not real, while unapparent dangers are real. The apparent dangers are that a dramatic reduction in the death rate will create overpopulation and thereby strain energy and other resources while exacerbating environmental degradation.
However, we only need to capture 1 percent of 1 percent of the sunlight to meet all of our energy needs (3 percent of 1 percent by 2025), and nanoengineered solar panels and fuel cells will be able to do this, thereby meeting all of our energy needs in the late 2020s with clean and renewable methods. Molecular nanoassembly devices will be able to manufacture a wide range of products, just about everything we need, with inexpensive tabletop devices. The power and price-performance of these systems will double each year, much faster than the doubling rate of the biological population. As a result, poverty and pollution will decline and ultimately vanish despite growth of the biological population. There are real downsides, however, and this is not a utopian vision. We have a new existential threat today in the potential of a bioterrorist to engineer a new biological virus. We actually do have the knowledge to combat this problem (for example, new vaccine technologies and RNA interference, which has been shown capable of destroying arbitrary biological viruses), but it will be a race. We will have similar issues with the feasibility of self-replicating nanotechnology in the late 2020s. Containing these perils while we harvest the promise is arguably the most important issue we face. Some people see these prospects as dangerous because they threaten their view of what it means to be human. There is a fundamental philosophical divide here. In my view, it is not our limitations that define our humanity. Rather, we are the species that seeks and succeeds in going beyond our limitations. _________________________________________________________________ HAIM HARARI Physicist, former President, Weizmann Institute of Science Democracy may be on its way out Democracy may be on its way out. Future historians may determine that Democracy will have been a one-century episode. It will disappear.
This is a sad, truly dangerous, but very realistic idea (or, rather, prediction). Falling boundaries between countries, cross-border commerce, merging economies, instant global flow of information and numerous other features of our modern society all lead to multinational structures. If you extrapolate this irreversible trend, you get the entire planet becoming one political unit. But in this unit, anti-democracy forces are now a clear majority. This majority increases by the day, due to demographic patterns. All democratic nations have slow, vanishing or negative population growth, while all anti-democratic and uneducated societies multiply fast. Within democratic countries, most well-educated families remain small while the least educated families are growing fast. This means that, both at the individual level and at the national level, the more people you represent, the less economic power you have. In a knowledge-based economy, in which the number of working hands is less important, this situation is much more non-democratic than in the industrial age. As long as upward mobility of individuals and nations could neutralize this phenomenon, democracy was tenable. But when we apply this analysis to the entire planet, as it evolves now, we see that democracy may be doomed. To these we must add the regrettable fact that authoritarian multinational corporations, by and large, are better managed than democratic nation states. Religious preaching, TV sound bites, cross-boundary TV incitement and the freedom of spreading rumors and lies through the internet encourage brainwashing and lack of rational thinking. Proportionately, more young women are growing into societies which discriminate against them than into more egalitarian societies, increasing the worldwide percentage of women treated as second class citizens. Educational systems in most advanced countries are in a deep crisis while modern education in many developing countries is almost non-existent.
A small well-educated technological elite is becoming the main owner of intellectual property, which is, by far, the most valuable economic asset, while the rest of the world drifts towards fanaticism of one kind or another. Add all of the above and the unavoidable conclusion is that Democracy, our least bad system of government, is on its way out. Can we invent a better new system? Perhaps. But this cannot happen if we are not allowed to utter the sentence: "There may be a political system which is better than Democracy". Today's political correctness does not allow one to say such things. The result of this prohibition will be an inevitable return to some kind of totalitarian rule, different from that of the emperors, the colonialists or the landlords of the past, but not more just. On the other hand, open and honest thinking about this issue may lead either to a gigantic worldwide revolution in educating the poor masses, thus saving democracy, or to a careful search for a just (repeat, just) and better system. I cannot resist a cheap parting shot: When, in the past two years, Edge asked for brilliant ideas you believe in but cannot prove, or for proposing new exciting laws, most answers related to science and technology. When the question is now about dangerous ideas, almost all answers touch on issues of politics and society and not on the "hard sciences". Perhaps science is not so dangerous, after all. _________________________________________________________________ DAVID G. 
MYERS Social Psychologist; Co-author (with Letha Scanzoni), What God Has Joined Together: A Christian Case for Gay Marriage A marriage option for all Much as others have felt compelled by evidence to believe in human evolution or the warming of the planet, I feel compelled by evidence to believe a) that sexual orientation is a natural, enduring disposition and b) that the world would be a happier and healthier place if, for all people, romantic love, sex, and marriage were a package. In my Midwestern social and religious culture, the words "for all people" transform a conservative platitude into a dangerous idea, over which we are fighting a culture war. On one side are traditionalists, who feel passionately about the need to support and renew marriage. On the other side are progressives, who assume that our sexual orientation is something we did not choose and cannot change, and that we all deserve the option of life within a covenant partnership. I foresee a bridge across this divide as folks on both the left and the right engage the growing evidence of our panhuman longing for belonging, of the benefits of marriage, and of the biology and persistence of sexual orientation. We now have lots of data showing that marriage is conducive to healthy adults, thriving children, and flourishing communities. We also have a dozen discoveries of gay-straight differences in everything from brain physiology to skill at mentally rotating geometric figures. And we have an emerging professional consensus that sexual reorientation therapies seldom work. More and more young adults -- tomorrow's likely majority, given generational succession -- are coming to understand this evidence, and to support what in the future will not seem so dangerous: a marriage option for all.
_________________________________________________________________ CLAY SHIRKY Social & Technology Network Topology Researcher; Adjunct Professor, NYU Graduate School of Interactive Telecommunications Program (ITP) Free will is going away. Time to redesign society to take that into account. In 2002, a group of teenagers sued McDonald's for making them fat, charging, among other things, that McDonald's used promotional techniques to get them to eat more than they should. The suit was roundly condemned as an erosion of the sense of free will and personal responsibility in our society. Less widely remarked upon was that the teenagers were offering an accurate account of human behavior. Consider the phenomenon of 'super-sizing', where a restaurant patron is offered the chance to increase the portion size of their meal for some small amount of money. This presents a curious problem for the concept of free will -- the patron has already made a calculation about the amount of money they are willing to pay in return for a particular amount of food. However, when the question is re-asked -- not "Would you pay $5.79 for this total amount of food?" but "Would you pay an additional 30 cents for more french fries?" -- patrons often say yes, despite having answered "No" moments before to an economically identical question. Super-sizing is expressly designed to subvert conscious judgment, and it works. By re-framing the question, fast food companies have found ways to take advantage of weaknesses in our analytical apparatus, weaknesses that are being documented daily in behavioral economics and evolutionary psychology. This matters for more than just fat teenagers. Our legal, political, and economic systems, the mechanisms that run modern society, all assume that people are uniformly capable of consciously modulating their behaviors.
As a result, we regard decisions they make as being valid, as with elections, and hold them responsible for actions they take, as in contract law or criminal trials. Then, in order to get around the fact that some people obviously aren't capable of consciously modulating their behavior, we carve out ad hoc exemptions. In U.S. criminal law, a 15-year-old who commits a crime is treated differently than a 16-year-old. A crime committed in the heat of the moment is treated specially. Some actions are not crimes because their perpetrator is judged mentally incapable, whether through developmental disabilities or other forms of legally defined insanity. This theoretical divide, between the mass of people with a uniform amount of free will and a small set of exceptional individuals, has been broadly stable for centuries, in part because it was based on ignorance. As long as we were unable to locate any biological source of free will, treating the mass of people as if each of them had the same degree of control over their lives made perfect sense; no more refined judgments were possible. However, that binary notion of free will is being eroded as our understanding of the biological antecedents of behavior improves. Consider laws concerning convicted pedophiles. Concern about their recidivism rate has led to the enactment of laws that restrict their freedom based on things they might do in the future, even though this expressly subverts the notion of free will in the judicial system. The formula here -- heinousness of crime x likelihood of repeat offense -- creates a new, non-insane class of criminals whose penalty is indexed to a perceived lack of control over themselves. But pedophilia is not unique in its measurably high recidivism rate. All rapists have higher than average recidivism rates. Thieves of all varieties are likelier to become repeat offenders if they have short time horizons or poor impulse control.
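The formula Shirky quotes (heinousness of crime times likelihood of repeat offense) can be sketched as a simple scoring rule. Every offense name and number below is invented for illustration; the point is only that such an index ranks people by expected future harm, so a lesser crime with high recidivism can outrank a graver one-off:

```python
# Illustrative expected-harm index: severity of the offense times
# the estimated probability of reoffending. All numbers hypothetical.
def risk_index(severity, p_reoffend):
    return severity * p_reoffend

offenders = {
    "petty theft, short time horizon": (2, 0.70),
    "one-time fraud":                  (5, 0.05),
    "violent repeat offense":          (9, 0.60),
}

# Rank by expected future harm rather than by the past act alone.
ranked = sorted(offenders,
                key=lambda k: risk_index(*offenders[k]),
                reverse=True)
for name in ranked:
    print(name, risk_index(*offenders[name]))
```

Note that under this hypothetical index the impulsive petty thief (score 1.4) outranks the one-time fraudster (score 0.25) despite committing the less heinous act, which is exactly the shift from punishing deeds to pricing predicted behavior that the essay describes.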
Will we keep more kinds of criminals constrained after their formal sentence is served, as we become better able to measure the likely degree of control they have over their own future actions? How can we, if we are to preserve the idea of personal responsibility? How can we not, once we are able to quantify the risk?

Criminal law is just one area where our concept of free will is eroding. We know that men make more aggressive decisions after they have been shown pictures of attractive female faces. We know women are more likely to commit infidelity on days they are fertile. We know that patients committing involuntary physical actions routinely (and incorrectly) report that they decided to undertake those actions, in order to preserve their sense that they are in control. We know that people will drive across town to save $10 on a $50 appliance, but not on a $25,000 car. We know that the design of the ballot affects a voter's choices. And we are still in the early days of even understanding these effects, much less designing everything from sales strategies to drug compounds to target them.

Conscious self-modulation of behavior is a spectrum. We have treated it as a single property -- you are either capable of free will, or you fall into an exceptional category -- because we could not identify, measure, or manipulate the various components that go into such self-modulation. Those days are now ending, and everyone from advertisers to political consultants increasingly understands, in voluminous biological detail, how to manipulate consciousness in ways that weaken our notion of free will. In the coming decades, our concept of free will, based as it is on ignorance of its actual mechanisms, will be destroyed by what we learn about the actual workings of the brain.
We can wait for that collision, and decide what to do then, or we can begin thinking through what sort of legal, political, and economic systems we need in a world where our old conception of free will is rendered inoperable.

_________________________________________________________________

MICHAEL SHERMER
Publisher of Skeptic magazine, monthly columnist for Scientific American; Author, Science Friction

Where goods cross frontiers, armies won't

Where goods cross frontiers, armies won't. Restated: where economic borders are porous between two nations, political borders become impervious to armies. Data from the new sciences of evolutionary economics, behavioral economics, and neuroeconomics reveal that when people are free to cooperate and trade (such as in game theory protocols) they establish trust that is reinforced through neural pathways that release such bonding hormones as oxytocin. Thus, modern biology reveals that where people are free to cooperate and trade they are less likely to fight and kill those with whom they are cooperating and trading.

My dangerous idea is a solution to what I call the "really hard problem": how best should we live? My answer: a free society, defined as free-market economics and democratic politics -- fiscal conservatism and social liberalism -- which leads to the greatest liberty for the greatest number. Since humans are, by nature, tribal, the overall goal is to expand the concept of the tribe to include all members of the species in a global free society. Free trade between all peoples is the surest way to reach this goal.

People have a hard time accepting free market economics for the same reason they have a hard time accepting evolution: it is counterintuitive. Life looks intelligently designed, so our natural inclination is to infer that there must be an intelligent designer -- a God. Similarly, the economy looks designed, so our natural inclination is to infer that we need a designer -- a Government.
In fact, emergence and complexity theory explains how the principles of self-organization and emergence cause complex systems to arise from simple systems without a top-down designer. Charles Darwin's natural selection is Adam Smith's invisible hand. Darwin showed how complex design and ecological balance were unintended consequences of individual competition among organisms. Smith showed how national wealth and social harmony were unintended consequences of individual competition among people. Nature's economy mirrors society's economy. Thus, integrating evolution and economics -- what I call evonomics -- reveals that an old economic doctrine is supported by modern biology.

_________________________________________________________________

ARNOLD TREHUB
Psychologist, University of Massachusetts, Amherst; Author, The Cognitive Brain

Modern science is a product of biology

The entire conceptual edifice of modern science is a product of biology. Even the most basic and profound ideas of science -- think relativity, quantum theory, the theory of evolution -- are generated and necessarily limited by the particular capacities of our human biology. This implies that the content and scope of scientific knowledge is not open-ended.

_________________________________________________________________

ROGER C. SCHANK
Psychologist & Computer Scientist; Chief Learning Officer, Trump University; Author, Making Minds Less Well Educated Than Our Own

No More Teacher's Dirty Looks

After a natural disaster, newscasters eventually announce excitedly that school is finally open, so no matter what else is terrible where they live, the kids are going to school. I always feel sorry for the poor kids.

My dangerous idea is one that most people immediately reject without giving it serious thought: school is bad for kids -- it makes them unhappy and, as tests show, they don't learn much.
When you listen to children talk about school you easily discover what they are thinking about in school: who likes them, who is being mean to them, how to improve their social ranking, how to get the teacher to treat them well and give them good grades.

Schools are structured today in much the same way as they have been for hundreds of years. And for hundreds of years philosophers and others have pointed out that school is really a bad idea:

We are shut up in schools and college recitation rooms for ten or fifteen years, and come out at last with a belly full of words and do not know a thing. -- Ralph Waldo Emerson

Education is an admirable thing, but it is well to remember from time to time that nothing that is worth knowing can be taught. -- Oscar Wilde

Schools should simply cease to exist as we know them. The Government needs to get out of the education business and stop thinking it knows what children should know and then testing them constantly to see if they regurgitate whatever they have just been spoon-fed. The Government is and always has been the problem in education:

If the government would make up its mind to require for every child a good education, it might save itself the trouble of providing one. It might leave to parents to obtain the education where and how they pleased, and content itself with helping to pay the school fees of the poorer classes of children, and defraying the entire school expenses of those who have no one else to pay for them. -- JS Mill

First, God created idiots. That was just for practice. Then He created school boards. -- Mark Twain

Schools need to be replaced by safe places where children can go to learn how to do things that they are interested in learning how to do. Their interests should guide their learning. The government's role should be to create places that are attractive to children and would cause them to want to go there.
Whence it comes to pass, that for not having chosen the right course, we often take very great pains, and consume a good part of our time in training up children to things, for which, by their natural constitution, they are totally unfit. -- Montaigne

We had a President many years ago who understood what education is really for. Nowadays we have ones that make speeches about the Pythagorean Theorem when we are quite sure they don't know anything about any theorem.

There are two types of education. . . One should teach us how to make a living, And the other how to live. -- John Adams

Over a million students have opted out of the existing school system and are now being home schooled. The problem is that the states regulate home schooling, and home schooling still looks an awful lot like school.

We need to stop producing a nation of stressed-out students who learn how to please the teacher instead of pleasing themselves. We need to produce adults who love learning, not adults who avoid all learning because it reminds them of the horrors of school. We need to stop thinking that all children need to learn the same stuff. We need to create adults who can think for themselves and who do not reduce complex situations to simplistic terms that can be rendered in a sound bite.

Just call school off. Turn them all into apartment houses.

_________________________________________________________________

SUSAN BLACKMORE
Psychologist and Skeptic; Author, Consciousness: An Introduction

Everything is pointless

We humans can, and do, make up our own purposes, but ultimately the universe has none. All the wonderfully complex, and beautifully designed things we see around us were built by the same purposeless process -- evolution by natural selection. This includes everything from microbes and elephants to skyscrapers and computers, and even our own inner selves.
People have (mostly) got used to the idea that living things were designed by natural selection, but they have more trouble accepting that human creativity is just the same process operating on memes instead of genes. It seems, they think, to take away uniqueness, individuality and "true creativity". Of course it does nothing of the kind; each person is unique even if that uniqueness is explained by their particular combination of genes, memes and environment, rather than by an inner conscious self who is the fount of creativity.

_________________________________________________________________

DAVID LYKKEN
Behavioral geneticist and Emeritus Professor of Psychology, University of Minnesota; Author, Happiness

Laws requiring parental licensure

I believe that, during my grandchildren's lifetimes, the U.S. Supreme Court will find a way to approve laws requiring parental licensure. Traditional societies in which children are socialized collectively, the method to which our species is evolutionarily adapted, have very little crime. In the modern U.S., the proportion of fatherless children, living with unmarried mothers, currently some 10 million in all, has increased more than 400% since 1960, while the violent crime rate rose 500% by 1994, before dipping slightly due to a delayed but equal increase in the number of prison inmates (from 240,000 to 1.4 million). In 1990, across the 50 states, the correlation between the violent crime rate and the proportion of illegitimate births was 0.70. About 70% of incarcerated delinquents, of teen-age pregnancies, and of adolescent runaways involve (I think result from) fatherless rearing.

Because these frightening curves continue to accelerate, I believe we must eventually confront the need for parental licensure -- you can't keep that newborn unless you are 21, married and self-supporting -- not just for society's safety but so those babies will have a chance for life, liberty, and the pursuit of happiness.
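The 0.70 figure Lykken cites is a Pearson correlation across the 50 states. As a reminder of what that statistic measures, here is a minimal computation on invented state-level pairs; the numbers are illustrative only, not the 1990 data.

```python
import math

# Invented (violent crime rate per 100k, % illegitimate births) pairs.
# Illustrative values only -- not Lykken's 1990 state data.
states = [(300, 18), (450, 24), (520, 27), (610, 30), (700, 35), (380, 20)]

def pearson_r(pairs):
    """Pearson correlation coefficient between the two columns."""
    n = len(pairs)
    xs = [p[0] for p in pairs]
    ys = [p[1] for p in pairs]
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson_r(states)
assert -1.0 <= r <= 1.0  # r is always bounded; 0.70 is a strong positive association
```

A correlation of 0.70 establishes association, not direction of cause; Lykken himself flags the causal reading as his own inference with "(I think result from)".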
_________________________________________________________________

CLIFFORD PICKOVER
Author, Sex, Drugs, Einstein, and Elves

We are all virtual

Our desire for entertaining virtual realities is increasing. As our understanding of the human brain also accelerates, we will create both imagined realities and a set of memories to support these simulacra. For example, someday it will be possible to simulate your visit to the Middle Ages and, to make the experience realistic, we may wish to ensure that you believe yourself to actually be in the Middle Ages. False memories may be implanted, temporarily overriding your real memories. This should be easy to do in the future -- given that we can already coax the mind to create richly detailed virtual worlds filled with ornate palaces and strange beings through the use of the drug DMT (dimethyltryptamine). In other words, the brains of people who take DMT appear to access a treasure chest of images and experiences that typically include jeweled cities and temples, angelic beings, feline shapes, serpents, and shiny metals. When we understand the brain better, we will be able to safely generate more controlled visions.

Our brains are also capable of simulating complex worlds when we dream. For example, after I watched a movie about people in a coastal town during the time of the Renaissance, I was "transported" there later that night while in a dream. The mental simulation of the Renaissance did not have to be perfect, and I'm sure that there were myriad flaws. However, during that dream I believed I was in the Renaissance. If we understood how the mind induces the conviction of reality, even when strange, nonphysical events happen in dreams, we could use this knowledge to ensure that your simulated trip to the Middle Ages seemed utterly real, even if the simulation was imperfect.
It will be easy to create seemingly realistic virtual realities because we don't have to be perfect, or even good, with respect to the accuracy of our simulations in order to make them seem real. After all, our nightly dreams usually seem quite real even if, upon awakening, we realize that logical or structural inconsistencies existed in the dream.

In the future, for each of your real lives, you will personally create ten simulated lives. By day, you're a computer programmer for IBM. However, after work, you'll be a knight in shining armor in the Middle Ages, attending lavish banquets and smiling at wandering minstrels and beautiful princesses. The next night, you'll be in the Renaissance, living in your home on the Amalfi coast of Italy, enjoying a dinner of plover, pigeon, and heron. If this ratio of one real life to ten simulated lives turned out to be representative of human experience, then right now you have only about a one-in-ten chance of being alive on the actual date of today.

_________________________________________________________________

JOHN ALLEN PAULOS
Professor of Mathematics, Temple University, Philadelphia; Author, A Mathematician Plays the Stock Market

The self is a conceptual chimera

Doubt that a supernatural being exists is banal, but the more radical doubt that we exist, at least as anything more than nominal, marginally integrated entities having convenient labels like "Myrtle" and "Oscar," is my candidate for Dangerous Idea. This is, of course, Hume's idea -- and Buddha's as well -- that the self is an ever-changing collection of beliefs, perceptions, and attitudes, that it is not an essential and persistent entity, but rather a conceptual chimera. If this belief ever became widely and viscerally felt throughout a society -- whether because of advances in neurobiology, cognitive science, philosophical insights, or whatever -- its effects on that society would be incalculable.
(Or so this assemblage of beliefs, perceptions, and attitudes sometimes thinks.)

_________________________________________________________________

JAMES O'DONNELL
Classicist; Cultural Historian; Provost, Georgetown University; Author, Avatars of the Word

Marx was right: the "state" will evaporate and cease to have useful meaning as a form of human organization

From the earliest Babylonian and Chinese moments of "civilization", we have agreed that human affairs depend on an organizing power in the hands of a few people (usually with religious charisma to undergird their authority) who reside in a functionally central location. "Political science" assumes in its etymology the "polis" or city-state of Greece as the model for community and government. But it is remarkable how little of human excellence and achievement has ever taken place in capital cities and around those elites, whose cultural history is one of self-mockery and implicit acceptance of the marginalization of the powerful. Borderlands and frontiers (and even suburbs) are where the action is. But as long as technologies of transportation and military force emphasized geographic centralization and concentration of forces, the general or emperor or president in his capital with armies at his beck and call was the most obvious focus of power. Enlightened government constructed mechanisms to restrain and channel such centralized authority, but did not effectively challenge it.

So what advantage is there today to the nation state? Boundaries between states enshrine and exacerbate inequalities and prevent the free movement of peoples. Large and prosperous state and state-related organizations and locations attract the envy and hostility of others and are sitting-duck targets for terrorist action.
Technologies of communication and transportation now make geographically-defined communities increasingly irrelevant and provide the new elites and new entrepreneurs with ample opportunity to stand outside them. Economies construct themselves in spite of state management, and money flees taxation as relentlessly as water follows gravity.

Who will undergo the greatest destabilization as the state evaporates and its artificial protections and obstacles disappear? The sooner it happens, the more likely it is to be the United States. The longer it takes ... well, perhaps the new Chinese empire isn't quite the landscape-dominating leviathan of the future that it wants to be. Perhaps in the end it will be Mao who was right, and a hundred flowers will bloom there.

_________________________________________________________________

PHILIP ZIMBARDO
Professor Emeritus of Psychology at Stanford University; Author, Shyness

The banality of evil is matched by the banality of heroism

Those people who become perpetrators of evil deeds and those who become perpetrators of heroic deeds are basically alike in being just ordinary, average people. The banality of evil is matched by the banality of heroism. Neither is the consequence of dispositional tendencies; there are no special inner attributes of pathology or goodness residing within the human psyche or the human genome. Both emerge in particular situations at particular times, when situational forces play a compelling role in moving individuals across the decisional line from inaction to action.

There is a decisive moment when the individual is caught up in a vector of forces emanating from the behavioral context. Those forces combine to increase the probability of acting to harm others or acting to help others. That decision may not be consciously planned or taken mindfully; it may be impulsively driven by strong situational forces external to the person.
Among those action vectors are group pressures and group identity, diffusion of responsibility, and a temporal focus on the immediate moment without entertaining future costs and benefits, among others.

The military police guards who abused prisoners at Abu Ghraib and the prison guards in my Stanford Prison Experiment who abused their prisoners illustrate the "Lord of the Flies" temporary transition of ordinary individuals into perpetrators of evil. We set aside those whose evil behavior is enduring and extensive, tyrants such as Idi Amin, Stalin and Hitler. Heroes of the moment are also contrasted with lifetime heroes. The heroic action of Rosa Parks on a Southern bus, of Joe Darby in exposing the Abu Ghraib tortures, of NYC firefighters at the World Trade Center disaster are acts of bravery at that time and place. The heroism of Mother Teresa, Nelson Mandela, and Gandhi is replete with valorous acts repeated over a lifetime. That chronic heroism is to acute heroism as valour is to bravery.

This view implies that any of us could as easily become heroes as perpetrators of evil depending on how we are affected by situational forces. We then want to discover how to limit, constrain, and prevent those situational and systemic forces that propel some of us toward social pathology. It is equally important for our society to foster the heroic imagination in our citizens, by conveying the message that anyone is a hero-in-waiting who will be counted upon to do the right thing when the time comes to make the heroic decision to act to help or to act to prevent harm.

_________________________________________________________________

RICHARD FOREMAN
Founder & Director, Ontological-Hysteric Theater

Radicalized relativity

In my area of the arts and humanities, the most dangerous idea (and the one under whose influence I have operated throughout my artistic life) is the complete relativity of all positions and styles of procedure.
The notion that there are no "absolutes" in art -- and that in the modern era, each valuable effort has been, in one way or another, the highlighting and glorification of elements previously "off limits" and rejected by the preceding "classical" style. Such a continual "reversal of values" has of course delivered us into the current post-postmodern era, in which fragmentation, surface value and the complex weave of "sampling procedure" dominate, and "the center does not hold".

I realize that my own artistic efforts have, in a small way, contributed to the current aesthetic/emotional environment, in which the potential spiritual depth and complexity of evolved human consciousness is trumped by the bedazzling shuffle of the shards of inherited elements -- never before so available to the collective consciousness. The resultant orientation towards "cultural relativity" in the arts certainly comes in part from the psychic re-orientation resulting from Einstein's bombshell dropped at the beginning of the last century.

This current "relativity" of all artistic, philosophical, and psychological values leaves the culture adrift, and yet there is no "going back", in spite of what conservative thinkers often recommend. At the very moment of our cultural origin, we were warned against "eating from the tree of knowledge". Down through subsequent history, one thing has led to another, until now -- here we are, sinking into the quicksand of the ever-accelerating reversal of each latest value (or artistic style).
And yet -- there are many artists, like myself, committed to the belief that, having been "thrown by history" into the dangerous trajectory initiated by the inaugural "eating from the tree of knowledge" (a perhaps "fatal curiosity" programmed into our genes), the only escape possible is to treat the quicksand of the present as a metaphorical "black hole" through which we must pass -- indeed risking psychic destruction (or "banalization") -- for the promise of emerging re-made, in new and still unimaginable form, on the other side. This is the "heroic wager" the serious "experimental" artist makes in living through the dangerous idea of radicalized relativity.

It is ironic, of course, that many of our greatest scientists (not all, of course) have little patience for the adventurous art of our times (post-Stockhausen/Boulez music, post-Joyce/Mallarmé literature) and seem to believe that a return to a safer, "audience friendly" classical style is the only responsible method for today's artists. Do they perhaps feel psychologically threatened by advanced styles that supersede previous principles of coherence? They are right to feel threatened by such dangerous advances into territory for which conscious sensibility is not yet fully prepared. Yet it is time for all serious minds to "bite the bullet" of such forays into the unknown world in which the dangerous quest for deeper knowledge leads scientist and artist alike.

_________________________________________________________________

JOHN GOTTMAN
Psychologist; Founder of Gottman Institute; Author, The Mathematics of Marriage

Emotional intelligence

The most dangerous idea I know of is emotional intelligence. Within the context of the cognitive neuroscience revolution in psychology, the focus on emotions is extraordinary.
The over-arching ideas that there is such a thing as emotional intelligence, that it has a neuroscience, and that it is inter-personal -- i.e., between two brains, rather than within one brain -- are all quite revolutionary concepts about human psychology. I could go on. It is also a revolution in thinking about infancy, couples, family, adult development, aging, etc.

_________________________________________________________________

PIET HUT
Professor of Astrophysics, Institute for Advanced Study, Princeton

A radical reevaluation of the character of time

Copernicus and Darwin took away our traditional place in the world and our traditional identity in the world. What traditional trait will be taken away from us next? My guess is that it will be the world itself.

We see the first few steps in that direction in the physics, mathematics and computer science of the twentieth century, from quantum mechanics to the results obtained by Gödel, Turing and others. The ontologies of our worlds, concrete as well as abstract, have already started to melt away. The problem is that quantum entanglement and logical incompleteness lack the in-your-face quality of a spinning earth and our kinship with apes. We will have to wait for the ontology of the traditional world to unravel further before the avant-garde insights turn into a real revolution.

Copernicus upset the moral order by dissolving the strict distinction between heaven and earth. Darwin did the same by dissolving the strict distinction between humans and other animals. Could the next step be the dissolution of the strict distinction between reality and fiction? For this to be shocking, it has to come in a scientifically respectable way, as a very precise and inescapable conclusion -- it should have the technical strength of a body of knowledge like quantum mechanics, as opposed to collections of opinions on the level of cultural relativism. Perhaps a radical reevaluation of the character of time will do it.
In everyday experience, time flows, and we flow with it. In classical physics, time is frozen as part of a frozen spacetime picture. And there is, as yet, no agreed-upon interpretation of time in quantum mechanics. What if a future scientific understanding of time were to show all previous pictures to be wrong, and demonstrate that past and future and even the present do not exist? That stories woven around our individual personal history and future are all just wrong? Now that would be a dangerous idea.

_________________________________________________________________

DAN SPERBER
Social and cognitive scientist, CNRS, Paris; Author, Explaining Culture

Culture is natural

A number of us -- biologists, cognitive scientists, anthropologists or philosophers -- have been trying to lay down the foundations for a truly naturalistic approach to culture. Sociobiologists and cultural ecologists have explored the idea that cultural behaviors are biological adaptations to be explained in terms of natural selection. Memeticists inspired by Richard Dawkins argue that cultural evolution is an autonomous Darwinian selection process merely enabled but not governed by biological evolution. Evolutionary psychologists, Cavalli-Sforza, Feldman, Boyd and Richerson, and I are among those who, in different ways, argue for more complex interactions between biology and culture.

These naturalistic approaches have been received not just with intellectual objections, but also with moral and political outrage: this is a dangerous idea, to be strenuously resisted, for it threatens humanistic values and sound social sciences. When I am called a "reductionist", I take it as a misplaced compliment: a genuine reduction is a great scientific achievement but, too bad, the naturalistic study of culture I advocate does not reduce to that of biology or of psychology.
When I am called a "positivist" (an insult among postmodernists), I acknowledge without any sense of guilt or inadequacy that indeed I don't believe that all facts are socially constructed. On the whole, having one's ideas described as "dangerous" is flattering. Dangerous ideas are potentially important. Braving insults and misrepresentations in defending these ideas is noble. Many advocates of naturalistic approaches to culture see themselves as a group of free-thinking, deep-probing scholars besieged by bigots. But wait a minute! Naturalistic approaches can be dangerous: after all, they have been. The use of biological evidence and arguments purported to show that there are profound natural inequalities among human "races", ethnic groups, or between women and men is only too well represented in the history of our disciplines. It is not good enough for us to point out (rightly) that 1) the science involved is bad science, 2) even if some natural inequality were established, it would not come near justifying any inequality in rights, and 3) postmodernists criticizing naturalism on political grounds should begin by rejecting Heidegger and other reactionaries in their pantheon who also have been accomplices of policies of discrimination. This is not enough because the racist and sexist uses of naturalism are not exactly unfortunate accidents. Species evolve because of genetic differences among their members; therefore you cannot leave biological difference out of a biological approach. Luckily, it so happens that biological differences among humans are minor and don't produce sub-species or "races," and that human sexual dimorphism is relatively limited. In particular, all humans have mind/brains made up of the same mechanisms, with just fine-tuning differences. (Think how very different all this would be if -- however improbably -- Neanderthals had survived and developed culturally like we did so that there really were different human "races"). 
Given what anthropologists have long called "the psychic unity of the human kind", the fundamental goal for a naturalistic approach is to explain how a common human nature -- and not biological differences among humans -- gives rise to such a diversity of languages, cultures, social organizations. Given the real and present danger of distortion and exploitation, it must be part of our agenda to take responsibility for the way this approach is understood by a wider public. This, happily, has been done by a number of outstanding authors capable of explaining serious science to lay audiences, and who typically have made the effort of warning their readers against misuses of biology. So the danger is being averted, and let's just move on? No, we are not there yet, because the very necessity of popularizing the naturalistic approach and the very talent with which this is being done creates a new danger, that of arrogance. We naturalists do have radical objections to what Leda Cosmides and John Tooby have called the "Standard Social Science Model." We have many insightful hypotheses and even some relevant data. The truth of the matter however is that naturalistic approaches to culture have so far remained speculative, hardly beginning to throw light on just fragments of the extraordinarily wide range of detailed evidence accumulated by historians, anthropologists, sociologists and others. Many of those who find our ideas dangerous fear what they see as an imperialistic bid to take over their domain. The bid would be unrealistic, and so is the fear. The real risk is different. The social sciences host a variety of approaches, which, with a few high profile exceptions, all contribute to our understanding of the domain. Even if it involves some reshuffling, a naturalistic approach should be seen as a particularly welcome and important addition. 
But naturalists full of grand claims and promises but with little interest in the competence accumulated by others are, if not exactly dangerous, at least much less useful than they should be, and the deeper challenge they present to social scientists' mental habits is less likely to be properly met.

_________________________________________________________________

MARTIN E.P. SELIGMAN
Psychologist, University of Pennsylvania; Author, Authentic Happiness

Relativism

In looking back over the scientific and artistic breakthroughs in the 20th century, there is a view that the great minds relativized the absolute. Did this go too far? Has relativism gotten to a point that it is dangerous to the scientific enterprise and to human well-being? The most visible person to say this is none other than Pope Benedict XVI in his denunciations of the "dictatorship of the relative." But worries about relativism are not only a matter of dispute in theology; there are parallel dissenters from the relative in science, in philosophy, in ethics, in mathematics, in anthropology, in sociology, in the humanities, in childrearing, and in evolutionary biology. Here are some of the domains in which serious thinkers have worried about the overdoing of relativism:

o In philosophy of science, there is ongoing tension between the Kuhnians (science is about "paradigms," the fashions of the current discipline) and the realists (science is about finding the truth).

o In epistemology there is the dispute between the Tarskian correspondence theorists ("p" is true if p) versus two relativistic camps, the coherence theorists ("p" is true to the extent it coheres with what you already believe is true) and the pragmatic theory of truth ("p" is true if it gets you where you want to go).
o At the ethics/science interface, there is the fact/value dispute: that science must and should incorporate the values of the culture in which it arises versus the contention that science is and should be value free.

o In mathematics, Gödel's incompleteness proof was widely interpreted as showing that mathematics is relative; but Gödel, a Platonist, intended the proof to support the view that there are statements that could not be proved within the system that are true nevertheless. Einstein, similarly, believed that the theory of relativity was misconstrued in just the same way by the "man is the measure of all things" relativists.

o In the sociology of high accomplishment, Charles Murray (Human Accomplishment) documents that the highest accomplishments occur in cultures that believe in absolute truth, beauty, and goodness. The accomplishments, he contends, of cultures that do not believe in absolute beauty tend to be ugly, that do not believe in absolute goodness tend to be immoral, and that do not believe in absolute truth tend to be false.

o In anthropology, pre-Boasians believed that cultures were hierarchically ordered into savage, barbarian, and civilized, whereas much of modern anthropology holds that all social forms are equal. This is the intellectual basis of the sweeping cultural relativism that dominates the humanities in academia.

o In evolution, Robert Wright (like Aristotle) argues for a scala naturae, with the direction of evolution favoring complexity by its invisible hand; whereas Stephen Jay Gould argued that the fern is just as highly evolved as Homo sapiens. Does evolution have an absolute direction, and are humans further along that trajectory than ferns?

o In child-rearing, much of twentieth-century education was profoundly influenced by the "Summerhillians," who argued complete freedom produced the best children, whereas other schools of parenting, education, and therapy argue for disciplined, authoritative guidance.
o Even in literature, arguments over what should go into the canon revolve around the absolute-relative controversy.

o Ethical relativism and its opponents are all too obvious instances of this issue.

I do not know if the dilemmas in these domains are only metaphorically parallel to one another. I do not know if illumination in one domain will not illuminate the others. But it might, and it is just possible that the great minds of the twenty-first century will absolutize the relative.

_________________________________________________________________

HOWARD GARDNER
Psychologist, Harvard University; Author, Changing Minds

Following Sisyphus, not Pandora

According to myth, Pandora unleashed all evils upon the world; only hope remained inside the box. Hope for human survival and progress rests on two assumptions: (1) Human constructive tendencies can counter human destructive tendencies, and (2) Human beings can act on the basis of long-term considerations, rather than merely short-term needs and desires. My personal optimism, and my years of research on "good work", could not be sustained without these assumptions. Yet I lie awake at night with the dangerous thought that pessimists may be right. For the first time in history -- as far as we know! -- we humans live in a world that we could completely destroy. The human destructive tendencies described in the past by Thomas Hobbes and Sigmund Freud, the "realist" picture of human beings embraced more recently by many sociobiologists, evolutionary psychologists, and game theorists, might be correct; these tendencies could overwhelm any proclivities toward altruism, protection of the environment, control of weapons of destruction, progress in human relations, or seeking to become good ancestors. As one vivid data point: there are few signs that the unprecedented power possessed by the United States is being harnessed to positive ends.
Strictly speaking, what will happen to the species or the planet is not a question for scientific study or prediction. It is a question of probabilities, based on historical and cultural considerations, as well as our most accurate description of human nature(s). Yet, science (as reflected, for example, in contributions to Edge discussions) has recently invaded this territory with its assertions of a biologically-based human moral sense. Those who assert a human moral sense are wagering that, in the end, human beings will do the right thing. Of course, human beings have the capacities to make moral judgments -- that is a mere truism. But my dangerous thought is that this moral sense is up for grabs -- that it can be mobilized for destructive ends (one society's terrorist is another society's freedom fighter) or overwhelmed by other senses and other motivations, such as the quest for power, instant gratification, or annihilation of one's enemies. I will continue to do what I can to encourage good work -- in that sense, Pandoran hope remains. But I will not look to science, technology, or religion to preserve life. Instead, I will follow Albert Camus' injunction, in his portrayal of another mythic figure endlessly attempting to push a rock up a hill: one should imagine Sisyphus happy.

From mario7k at gmail.com Fri Jan 6 02:02:56 2006 From: mario7k at gmail.com (Mario Ribeiro) Date: Thu, 5 Jan 2006 22:02:56 -0400 Subject: [Paleopsych] Independent: 'Chronic happiness' the key to success In-Reply-To: References: Message-ID: <7f0d59090601051802m48e87f6et7db65f4885553584@mail.gmail.com>

yeah, definitely, send it to me. it's interesting, the idea that happiness and cheerfulness may and should come before you achieve whatever you're looking for... not after... hey dad, I bought 4 books on amazon.com for university. two of which are digital... I'm studying here... remotely...
not bad The Growth of the International Economy Globalization Sociology The Problem of Sociology bjs M On 1/2/06, Premise Checker wrote: > > http://news.independent.co.uk/world/science_technology/article333972.ece > 19 December 2005 10:27 > > By Lyndsay Moss > Published: 19 December 2005 > > The key to success may be "chronic happiness" rather than simply hard > work and the right contacts, psychologists have found. > > Many assume a successful career and personal life leads to happiness. > But psychologists in the US say happiness can bring success. > > Researchers from the universities of California, Missouri and Illinois > examined connections between desirable characteristics, life success and > well-being in more than 275,000 people. > > They found that happy individuals were predisposed to seek out new goals > in life, leading to success, which also reinforced their already > positive emotions. > > The psychologists addressed questions such as whether happy people were > more successful than unhappy people, and whether happiness came before > or after a perceived success. > > Writing in Psychological Bulletin, published by the American > Psychological Association, they concluded that "chronically happy > people" were generally more successful in many areas of life than less > happy people.
> _______________________________________________
> paleopsych mailing list
> paleopsych at paleopsych.org
> http://lists.paleopsych.org/mailman/listinfo/paleopsych

From checker at panix.com Fri Jan 6 03:44:09 2006 From: checker at panix.com (Premise Checker) Date: Thu, 5 Jan 2006 22:44:09 -0500 (EST) Subject: [Paleopsych] Discovery: Robot Demonstrates Self Awareness Message-ID:

Robot Demonstrates Self Awareness http://dsc.discovery.com/news/briefs/20051219/awarerobot_tec.html [Uh oh...too late for AI friendliness, time to buy How to Survive a Robot Uprising.] By Tracy Staedter, Discovery News Dec. 21, 2005 - A new robot can recognize the difference between a mirror image of itself and another robot that looks just like it. This so-called mirror image cognition is based on artificial nerve cell groups built into the robot's computer brain that give it the ability to recognize itself and acknowledge others. The ground-breaking technology could eventually lead to robots able to express emotions. Under development by Junichi Takeno and a team of researchers at Meiji University in Japan, the robot represents a big step toward developing self-aware robots and in understanding and modeling human self-consciousness. "In humans, consciousness is basically a state in which the behavior of the self and another is understood," said Takeno. Humans learn behavior during cognition and conversely learn to think while behaving, said Takeno.
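Takeno's actual neural architecture is not spelled out in the article, but the bare logic of mirror-image cognition can be caricatured in a few lines: treat an observed agent as "self" when its motion tracks the robot's own motor commands almost perfectly, and as an imitating "other" when the tracking is looser. Everything below (the move alphabet, the 0.9 threshold, the imitator's error rate) is an invented illustration, not the real system:

```python
import random

MOVES = ["forward", "back", "stop"]

def match_rate(self_moves, observed_moves):
    # Fraction of time steps where the observed agent's motion coincides
    # with the robot's own motor command -- a crude stand-in for a shared
    # area processing both one's own behavior and another's.
    hits = sum(1 for s, o in zip(self_moves, observed_moves) if s == o)
    return hits / len(self_moves)

def classify(self_moves, observed_moves, threshold=0.9):
    # Near-perfect synchrony: call it a mirror image of self.
    # Anything looser: another agent merely imitating.
    return "self" if match_rate(self_moves, observed_moves) >= threshold else "other"

random.seed(0)
self_moves = [random.choice(MOVES) for _ in range(200)]

mirror = list(self_moves)  # a mirror copies every move, instantly
imitator = [m if random.random() < 0.5 else random.choice(MOVES)
            for m in self_moves]  # an imitator copies sloppily

print(classify(self_moves, mirror))    # self
print(classify(self_moves, imitator))  # other
```

The 70-percent figure reported for the mirror experiment would, in this caricature, correspond to a match score hovering near the decision boundary rather than sitting cleanly above it.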
To mimic this dynamic, a robot needs a common area in its neural network that is able to process information on both cognition and behavior. Takeno and his colleagues built the robot with blue, red or green LEDs connected to artificial neurons in the region that light up when different information is being processed, based on the robot's behavior. "The innovative part is the independent nodes in the hierarchical levels that can be linked and activated," said Thomas Bock of the Technical University of Munich in Germany. For example, two red diodes illuminate when the robot is performing behavior it considers its own, two green bulbs light up when the robot acknowledges behavior being performed by the other. One blue LED flashes when the robot is both recognizing behavior in another robot and imitating it. Imitation, said Takeno, is an act that requires both seeing a behavior in another and instantly transferring it to oneself and is the best evidence of consciousness. In one experiment, a robot representing the "self" was paired with an identical robot representing the "other." When the self robot moved forward, stopped or backed up, the other robot did the same. The pattern of neurons firing and the subsequent flashes of blue light indicated that the self robot understood that the other robot was imitating its behavior. In another experiment, the researchers placed the self robot in front of a mirror. In this case, the self robot and the reflection (something it could interpret as another robot) moved forward and back at the same time. Although the blue lights fired, they did so less frequently than in other experiments. In fact, 70 percent of the time, the robot understood that the mirror image was itself. Takeno's goal is to reach 100 percent in the coming year. From checker at panix.com Fri Jan 6 03:44:19 2006 From: checker at panix.com (Premise Checker) Date: Thu, 5 Jan 2006 22:44:19 -0500 (EST) Subject: [Paleopsych] Economist: The robots are coming... 
from Japan... Message-ID: The robots are coming... from Japan... http://www.economist.com/World/asia/displayStory.cfm?story_id=5323427 Special Report Japan's humanoid robots Better than people Dec 20th 2005 | TOKYO Why the Japanese want their robots to act more like humans HER name is MARIE, and her impressive set of skills comes in handy in a nursing home. MARIE can walk around under her own power. She can distinguish among similar-looking objects, such as different bottles of medicine, and has a delicate enough touch to work with frail patients. MARIE can interpret a range of facial expressions and gestures, and respond in ways that suggest compassion. Although her language skills are not ideal, she can recognise speech and respond clearly. Above all, she is inexpensive. Unfortunately for MARIE, however, she has one glaring trait that makes it hard for Japanese patients to accept her: she is a flesh-and-blood human being from the Philippines. If only she were a robot instead. Robots, you see, are wonderful creatures, as many a Japanese will tell you. They are getting more adept all the time, and before too long will be able to do cheaply and easily many tasks that human workers do now. They will care for the sick, collect the rubbish, guard homes and offices, and give directions on the street. This is great news in Japan, where the population has peaked, and may have begun shrinking in 2005. With too few young workers supporting an ageing population, somebody - or something - needs to fill the gap, especially since many of Japan's young people will be needed in science, business and other creative or knowledge-intensive jobs. Many workers from low-wage countries are eager to work in Japan. The Philippines, for example, has over 350,000 trained nurses, and has been pleading with Japan - which accepts only a token few - to let more in. Foreign pundits keep telling Japan to do itself a favour and make better use of cheap imported labour.
But the consensus among Japanese is that visions of a future in which immigrant workers live harmoniously and unobtrusively in Japan are pure fancy. Making humanoid robots is clearly the simple and practical way to go. Japan certainly has the technology. It is already the world leader in making industrial robots, which look nothing like pets or people but increasingly do much of the work in its factories. Japan is also racing far ahead of other countries in developing robots with more human features, or that can interact more easily with people. A government report released this May estimated that the market for "service robots" will reach ¥1.1 trillion ($10 billion) within a decade. The country showed off its newest robots at a world exposition this summer in Aichi prefecture. More than 22m visitors came, 95% of them Japanese. The robots stole the show, from the nanny robot that babysits to a Toyota that plays a trumpet. And Japan's robots do not confine their talents to controlled environments. As they gain skills and confidence, robots such as Sony's QRIO (pronounced "curio") and Honda's ASIMO are venturing to unlikely places. They have attended factory openings, greeted foreign leaders, and rung the opening bell on the New York Stock Exchange. ASIMO can even take the stage to accept awards.

The friendly face of technology

So Japan will need workers, and it is learning how to make robots that can do many of their jobs. But the country's keen interest in robots may also reflect something else: it seems that plenty of Japanese really like dealing with robots. Few Japanese have the fear of robots that seems to haunt westerners in seminars and Hollywood films. In western popular culture, robots are often a threat, either because they are manipulated by sinister forces or because something goes horribly wrong with them. By contrast, most Japanese view robots as friendly and benign. Robots like people, and can do good.
The Japanese are well aware of this cultural divide, and commentators devote lots of attention to explaining it. The two most favoured theories, which are assumed to reinforce each other, involve religion and popular culture. Most Japanese take an eclectic approach to religious beliefs, and the native religion, Shintoism, is infused with animism: it does not make clear distinctions between inanimate things and organic beings. A popular Japanese theory about robots, therefore, is that there is no need to explain why Japanese are fond of them: what needs explaining, rather, is why westerners allow their Christian hang-ups to get in the way of a good technology. When Honda started making real progress with its humanoid-robot project, it consulted the Vatican on whether westerners would object to a robot made in man's image. Japanese popular culture has also consistently portrayed robots in a positive light, ever since Japan created its first famous cartoon robot, Tetsuwan Atomu, in 1951. Its name in Japanese refers to its atomic heart. Putting a nuclear core into a cartoon robot less than a decade after Hiroshima and Nagasaki might seem an odd way to endear people to the new character. But Tetsuwan Atom - being a robot, rather than a human - was able to use the technology for good. Over the past half century, scores of other Japanese cartoons and films have featured benign robots that work with humans, in some cases even blending with them. One of the latest is a film called "Hinokio", in which a reclusive boy sends a robot to school on his behalf and uses virtual-reality technology to interact with classmates. Among the broad Japanese public, it is a short leap to hope that real-world robots will soon be able to pursue good causes, whether helping to detect landmines in war-zones or finding and rescuing victims of disasters. The prevailing view in Japan is that the country is lucky to be uninhibited by robophobia.
With fewer of the complexes that trouble many westerners, so the theory goes, Japan is free to make use of a great new tool, just when its needs and abilities are happily about to converge. "Of all the nations involved in such research," the Japan Times wrote in a 2004 editorial, "Japan is the most inclined to approach it in a spirit of fun." These sanguine explanations, however, may capture only part of the story. Although they are at ease with robots, many Japanese are not as comfortable around other people. That is especially true of foreigners. Immigrants cannot be programmed as robots can. You never know when they will do something spontaneous, ask an awkward question, or use the wrong honorific in conversation. But, even leaving foreigners out of it, being Japanese, and having always to watch what you say and do around others, is no picnic. It is no surprise, therefore, that Japanese researchers are forging ahead with research on human interfaces. For many jobs, after all, lifelike features are superfluous. A robotic arm can gently help to lift and reposition hospital patients without being attached to a humanoid form. The same goes for robotic spoons that make it easier for the infirm to feed themselves, power suits that help lift heavy grocery bags, and a variety of machines that watch the house, vacuum the carpet and so on. Yet the demand for better robots in Japan goes far beyond such functionality. Many Japanese seem to like robot versions of living creatures precisely because they are different from the real thing. An obvious example is AIBO, the robotic dog that Sony began selling in 1999. The bulk of its sales have been in Japan, and the company says there is a big difference between Japanese and American consumers. American AIBO buyers tend to be computer geeks who want to hack the robotic dog's programming and delve in its innards. Most Japanese consumers, by contrast, like AIBO because it is a clean, safe and predictable pet.
AIBO is just a fake dog. As the country gets better at building interactive robots, their advantages for Japanese users will multiply. Hiroshi Ishiguro, a roboticist at Osaka University, cites the example of asking directions. In Japan, says Mr Ishiguro, people are even more reluctant than in other places to approach a stranger. Building robotic traffic police and guides will make it easier for people to overcome their diffidence. Karl MacDorman, another researcher at Osaka, sees similar social forces at work. Interacting with other people can be difficult for the Japanese, he says, "because they always have to think about what the other person is feeling, and how what they say will affect the other person." But it is impossible to embarrass a robot, or be embarrassed, by saying the wrong thing. To understand how Japanese might find robots less intimidating than people, Mr MacDorman has been investigating eye movements, using headsets that monitor where subjects are looking. One oft-cited myth about Japanese, that they rarely make eye contact, is not really true. When answering questions put by another Japanese, Mr MacDorman's subjects made eye contact around 30% of the time. But Japanese subjects behave intriguingly when they talk to Mr Ishiguro's android, ReplieeQ1. The android's face has been modelled on that of a famous newsreader, and sophisticated actuators allow it to mimic her facial movements. When answering the android's questions, Mr MacDorman's Japanese subjects were much more likely to look it in the eye than they were a real person. Mr MacDorman wants to do more tests, but he surmises that the discomfort many Japanese feel when dealing with other people has something to do with his results, and that they are much more at ease when talking to an android. Eventually, interactive robots are going to become more common, not just in Japan but in other rich countries as well.
As children and the elderly begin spending time with them, they are likely to develop emotional reactions to such lifelike machines. That is human nature. Upon meeting Sony's QRIO, your correspondent promptly referred to it as "him" three times, despite trying to remember that it is just a battery-operated device. What seems to set Japan apart from other countries is that few Japanese are all that worried about the effects that hordes of robots might have on its citizens. Nobody seems prepared to ask awkward questions about how it might turn out. If this bold social experiment produces lots of isolated people, there will of course be an outlet for their loneliness: they can confide in their robot pets and partners. Only in Japan could this be thought less risky than having a compassionate Filipina drop by for a chat.

From checker at panix.com Fri Jan 6 03:44:28 2006 From: checker at panix.com (Premise Checker) Date: Thu, 5 Jan 2006 22:44:28 -0500 (EST) Subject: [Paleopsych] New Scientist: Civilisation has left its mark on our genes Message-ID:

Civilisation has left its mark on our genes http://www.newscientist.com/article.ns?id=dn8483&print=true * 22:00 19 December 2005 * Bob Holmes Darwin's fingerprints can be found all over the human genome. A detailed look at human DNA has shown that a significant percentage of our genes have been shaped by natural selection in the past 50,000 years, probably in response to aspects of modern human culture such as the emergence of agriculture and the shift towards living in densely populated settlements. One way to look for genes that have recently been changed by natural selection is to study mutations called single-nucleotide polymorphisms (SNPs) - single-letter differences in the genetic code. The trick is to look for pairs of SNPs that occur together more often than would be expected from the chance genetic reshuffling that inevitably happens down the generations.
Such correlations are known as linkage disequilibrium, and can occur when natural selection favours a particular variant of a gene, causing the SNPs nearby to be selected as well. Robert Moyzis and his colleagues at the University of California, Irvine, US, searched for instances of linkage disequilibrium in a collection of 1.6 million SNPs scattered across all the human chromosomes. They then looked carefully at the instances they found to distinguish the consequences of natural selection from other phenomena, such as random inversions of chunks of DNA, which can disrupt normal genetic reshuffling. This analysis suggested that around 1800 genes, or roughly 7% of the total in the human genome, have changed under the influence of natural selection within the past 50,000 years. A second analysis using a second SNP database gave similar results. That is roughly the same proportion of genes that were altered in maize when humans domesticated it from its wild ancestors.

Domesticated humans

Moyzis speculates that we may have similarly domesticated ourselves with the emergence of modern civilisation. "One of the major things that has happened in the last 50,000 years is the development of culture," he says. "By so radically and rapidly changing our environment through our culture, we've put new kinds of selection [pressures] on ourselves." Genes that aid protein metabolism - perhaps related to a change in diet with the dawn of agriculture - turn up unusually often in Moyzis's list of recently selected genes. So do genes involved in resisting infections, which would be important in a species settling into more densely populated villages where diseases would spread more easily. Other selected genes include those involved in brain function, which could be important in the development of culture. But the details of any such sweeping survey of the genome should be treated with caution, geneticists warn.
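For two biallelic loci, the linkage-disequilibrium signal behind this kind of scan reduces to an excess of the joint haplotype frequency over the product of the single-allele frequencies, D = p_AB - p_A * p_B. A toy calculation with made-up haplotype counts (nothing like the scale of the 1.6 million-SNP analysis) shows the contrast the scan exploits:

```python
def linkage_disequilibrium(haplotypes):
    # D = p_AB - p_A * p_B for two loci with alleles A/a and B/b.
    # Recombination shuffles the loci independently, driving D toward
    # zero, so a persistently large |D| around a variant is the
    # "hitchhiking" footprint a selection scan looks for.
    n = len(haplotypes)
    p_a = sum(h[0] == "A" for h in haplotypes) / n
    p_b = sum(h[1] == "B" for h in haplotypes) / n
    p_ab = sum(h == "AB" for h in haplotypes) / n
    return p_ab - p_a * p_b

# A recently favoured allele A drags its neighbour B along with it:
swept = ["AB"] * 45 + ["ab"] * 45 + ["Ab"] * 5 + ["aB"] * 5
print(round(linkage_disequilibrium(swept), 3))      # 0.2

# After many generations of reshuffling the pairing is random:
shuffled = ["AB"] * 25 + ["Ab"] * 25 + ["aB"] * 25 + ["ab"] * 25
print(round(linkage_disequilibrium(shuffled), 3))   # 0.0
```

The real analysis must also rule out non-selective causes of a large |D|, which is why the team checked for artefacts such as chromosomal inversions.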
Now that Moyzis has made a start on studying how the influence of modern human culture is written in our genes, other teams can see if similar results are produced by other analytical techniques, such as comparing human and chimp genomes. Journal reference: Proceedings of the National Academy of Sciences (DOI: 10.1073/pnas.0509691102)
From checker at panix.com Fri Jan 6 03:44:41 2006 From: checker at panix.com (Premise Checker) Date: Thu, 5 Jan 2006 22:44:41 -0500 (EST) Subject: [Paleopsych] spiked: Why humans are superior to apes Message-ID: Why humans are superior to apes http://www.spiked-online.com/Printable/0000000CA40E.htm 4.2.24 by Helene Guldberg Humanism, in the sense of a faith in humanity's potential to solve problems through the application of science and reason, is taking quite a battering today. As the UK medical scientist Raymond Tallis warns, the role of mind and of self-conscious agency in human affairs is denied 'by anthropomorphising or "Disneyfying" what animals do and "animalomorphising" what human beings get up to' (1). One of the most extreme cases of 'animalomorphism' in recent years has come from the philosopher John Gray, professor of European thought at the London School of Economics. In his book Straw Dogs: Thoughts on Humans and Other Animals, Gray argues that humanity's belief in our ability to control our destiny and free ourselves from the constraints of the natural environment is as illusory as the Christian promise of salvation (2). Gray presents humanity as no better than any other living organism - even bacteria. We should therefore not be too concerned about whether humans have a future on this planet, he claims. Rather, it is the balance of the world's ecosystem that we should really worry about: 'Homo rapiens is only one of very many species, and not obviously worth preserving. Later or sooner, it will become extinct. When it is gone the Earth will recover.' Thankfully, not many will go along with John Gray's image of humans as a plague upon the planet. For our own sanity, if nothing else, we cannot really subscribe to such a misanthropic and nihilistic worldview.
If we did, surely we would have no option other than to kill ourselves - for the good of the planet - and try to take as many people with us as possible? However, even if many will reject Gray's extreme form of anti-humanism, many more will go along with the notion that animals are ultimately not that different from us. The effect is the same: to denigrate human abilities. Today, a belief in human exceptionalism is distinctly out of fashion. Almost every day we are presented with new revelations about how animals are more like us than we ever imagined. A selection of news headlines includes: 'How animals kiss and make up'; 'Male birds punish unfaithful females'; 'Dogs experience stress at Christmas'; 'Capuchin monkeys demand equal rights'; 'Scientists prove fish intelligence'; 'Birds going through divorce proceedings'; 'Bees can think say scientists'; 'Chimpanzees are cultured creatures' (3). The argument is at its most powerful when it comes to the great apes - chimpanzees, gorillas and orangutans. One of the most influential opponents of the 'sanctification of human life', as he describes human exceptionalism, is Peter Singer, author of Animal Liberation and co-founder of the Great Ape Project (4). Singer argues that we need to 'break the species barrier' and extend rights to the great apes, in the first instance, followed by all other animal species. The great apes are not only our closest living relatives, argues Singer, but they are also beings who possess many of the characteristics that we have long considered distinctive to humans. Is it the case that apes are just like us? Primatology has indeed shown that apes, and even monkeys, communicate in the wild. Jane Goodall's observations of chimpanzees show that not only do they use tools, but that they also make them - using sticks to fish for termites, stones as anvils or hammers, and leaves as cups or sponges.
Anybody watching juvenile chimps playfighting, tickling each other and giggling, will be struck by their human-like mannerisms and their apparent expressions of glee. But one has to go beyond first impressions in order to establish to what extent great ape abilities can be compared to those of humans. Is it the case that ape behaviour is the result of a capacity for some rudimentary form of human-like insight? Or can it be explained through Darwinian evolution and associative learning? Associative learning, or contingent learning, is a concept developed by BF Skinner, one of the twentieth century's most influential psychologists, to describe a type of learning that is the result of an association between an action and the reinforcer - in the absence of any insight. Skinner became famous for his work with rats, pigeons and chickens using his 'Skinner Box'. In one experiment he rewarded chickens with a small amount of food (the reinforcer) when they pecked a blue button (the action). If the chicken pecked a yellow, green, or red button, it would get nothing. Associative or contingent learning, as developed by the school of behaviourism, is based on the idea that animals behave in the way that they do because this kind of behaviour has had certain consequences in the past, not because they have any insight into why they are doing what they do. In Intelligence of Apes and Other Rational Beings (2003), primatologist Duane Rumbaugh and comparative psychologist David Washburn argue that ape behaviour cannot be explained on the basis of contingent learning alone (5). Apes are rational, they claim, and do make decisions using higher order reasoning skills. But the evidence for this is weak, and getting weaker, as more rigorous methodologies are being developed for investigating the capabilities of primates. As a result, many of the past claims about apes' capacity for insight into their own actions and those of their fellow apes are now being questioned. 
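[Since the argument turns on the distinction between associative learning and insight, a toy simulation may make the mechanism concrete. The sketch below is an editor's illustration, not from the article; the function names, learning rate and trial counts are invented for the example. It implements a simple action-value update: the agent comes to "prefer" the rewarded blue button purely because of past consequences, with no model of why.

```python
import random

# Illustrative sketch (not from the article) of associative/contingent
# learning: the agent's only "knowledge" is a per-action value, nudged
# toward the reward each action actually produced (the law of effect).

ACTIONS = ["blue", "yellow", "green", "red"]  # the buttons in the Skinner-box example

def reward(action):
    # Only pecking the blue button yields food.
    return 1.0 if action == "blue" else 0.0

def train(trials=2000, lr=0.1, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    value = {a: 0.0 for a in ACTIONS}
    for _ in range(trials):
        # epsilon-greedy: mostly exploit the strongest association so far
        if rng.random() < epsilon:
            a = rng.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: value[x])
        # strengthen or weaken the action-reward association
        value[a] += lr * (reward(a) - value[a])
    return value

values = train()
# After training, "blue" dominates: behaviour shaped by consequences alone,
# with no representation of *why* blue is worth pecking.
```

The point of the sketch is that selective, seemingly purposeful behaviour falls out of a mechanism this crude, which is why observed behaviour alone cannot settle whether an animal has insight.]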
Cultural transmission and social learning The cultural transmission of behaviour, where actions are passed on through some kind of teaching, learning or observation rather than through genetics, is used as evidence of apes' higher order reasoning abilities. This interpretation is currently being revised. The generation-upon-generation growth in human abilities has historically been seen as our defining characteristic. Human progress has been made possible through our ability to reflect on what we, and our fellow humans, are doing - thereby teaching, and learning from, each other. The first evidence of cultural transmission among primates was found in the 1950s in Japan, with observations of the spread of potato washing among macaque monkeys (6). One juvenile female pioneered the habit, followed by her mother and closest peers. Within a decade, the whole of the population under middle age was washing potatoes. A review by Andrew Whiten and his colleagues of a number of field studies reveals evidence of at least 39 local variations in behavioural patterns, including tool-use, communication and grooming rituals, among chimpanzees - behaviours that are common in some communities and absent in others (7). So it seems that these animals are capable of learning new skills and of passing them on to their fellows. The question remains: what does this tell us about their mental capacities? The existence of cultural transmission is often taken as evidence that the animals are capable of some form of social learning (such as imitation) and possibly even teaching. But there is in fact no evidence of apes being able to teach their young. Michael Tomasello, co-director of the Wolfgang Köhler Primate Research Center in Germany, points out that 'nonhuman primates do not point to distal entities in the environment, they do not hold up objects for others to see and share, and they do not actively give or offer objects to other individuals. They also do not actively teach one another' (8). 
Yet even if apes cannot actively teach each other, if they are capable of social learning - in terms of imitation (as has long been assumed) - this still implies they are capable of quite complex cognitive processes. Imitation involves being able to appreciate not just what an act looks like when performed by another individual, but also what it is like to do that act oneself. They must be able to put themselves in another's shoes, so to speak. However, comparative psychologist Bennett Galef points out, after scrutinising the data from Japan, that the rate at which the behaviour spread among the macaque monkeys was very slow and steady, not accelerated as one might expect in the case of imitation (9). It took up to a decade for what, in human terms, would be described as a tiny group of individuals to acquire the habit of the 'innovator'. Compare this to the human ability to teach new skills and ways of thinking and to learn from each other's insights, which laid the foundation for the agricultural and industrial revolutions, the development of science and technology and the transformations of our ways of living that flow from these. A review of the literature on primate behaviour reveals that there is in fact no consensus among scientists as to whether apes are capable of the simplest form of social learning - imitation (10). Instead it could be the case that the differences in their behavioural repertoires are the result of what has been termed stimulus enhancement. It has been shown in birds, for instance, that the stimulus enhancement of a feeding site may occur if bird A sees bird B gaining food there. In other words, their attention has been drawn to a stimulus, without any knowledge or appreciation of the significance of the stimulus. 
Others argue that local variations may be due to observational conditioning, where an animal may learn about the positive or negative consequences of actions, not on the basis of experiencing the outcomes themselves, but on the basis of seeing the responses of other animals. This involves a form of associative learning (learning from the association between an action and the reinforcer), rather than any insight. Michael Tomasello emphasises the special nature of human learning. Unlike animals, he argues, humans understand that in the social domain relations between people involve intentionality, and in the physical domain that relations between objects involve causality (11). We do not tend to respond blindly to what others do or say, but, to some degree, analyse their motives. Similarly, we have some understanding of how physical processes work, which means we can manipulate the physical world to our advantage and continually develop and perfect the tools we use to do so. Social learning and teaching depend on these abilities, and human children begin on this task at the end of their first year. Because other primates do not understand intentionality or causality, they do not engage in cultural learning of this type. The fact that it takes chimps up to four years to acquire the necessary skills to select and adequately use tools to crack nuts implies that they are not capable of true imitation, never mind any form of teaching. Young chimps invest a lot of time and effort in attempts to crack nuts that are, after all, an important part of their diet. The slow rate of their development raises serious questions about their ability to reflect on what they and their fellow apes are doing. Language But can apes use language? Groundbreaking research by Robert Seyfarth and Dorothy Cheney in the 1980s on vervet monkeys in the wild showed that their vocalisations went beyond merely expressing emotions such as anger or fear. 
Their vocalisations could instead be described as 'referential' - in that they refer to objects or events (12). But it could not be established from these studies whether the callers vocalised with the explicit intent of referring to a particular object or event, for instance the proximity of a predator. And Seyfarth and Cheney were careful to point out that there was no evidence that the monkeys had any insight into what they were doing. Their vocalisations could merely be the result of a form of associative learning. Later experiments have attempted to refine analyses in order to establish whether there is an intention to communicate: involving an understanding that someone else may have a different perspective or understanding of a situation from themselves, and using communication in order to change the others' understanding. It is too early to draw any firm conclusions on this question from research carried out to date. There is no evidence that primates have any, even rudimentary, human-like insight into the effect of their communications. But neither is there clear evidence that they do not. What is clear, however, is that primates, as with all non-human animals, only ever communicate about events in the here and now. They do not communicate about possible future events or previously encountered ones. Ape communications cannot therefore be elevated to the status of human language. Human beings debate and discuss ideas, constructing arguments, drawing on past experiences and imagining future possibilities, in order to change the opinions of others. This goes way beyond warning fellow humans about a clear and present danger. Deception and Theory of Mind What about the fact that apes have been seen to deceive their fellows? Does this not point towards what some have described as a Machiavellian Intelligence (13)? 
Primatologists have observed apes in the wild giving alarm calls when no danger is present, with the effect of distracting another animal from food or a mate. But again the question remains whether they are aware of what they are doing. To be able to deceive intentionally, they would have to have some form of a 'theory of mind' - that is, the recognition that one's own perspectives and beliefs are sometimes different from somebody else's. Although psychologist Richard Byrne argues that the abilities of the great apes are limited compared with even very young humans, he claims that 'some "theory of mind" in great apes but not monkeys now seems clear' (14). However, as the cognitive neuroscientist Marc Hauser points out, most studies of deception have left the question of intentionality unanswered (15). Studies that do attribute beliefs-about-beliefs to apes tend to rely heavily on fascinating, but largely unsubstantiated, anecdotes. As professor of archaeology Steven Mithen points out, 'even the most compelling examples can be explained in terms of learned behavioural contingencies [associative learning], without recourse to higher order intentionality' (16). So even if apes are found to deceive, that does not necessarily imply that the apes know that they are deceiving. The apes may just be highly adaptive and adept at picking up useful routines that bring them food, sex or safety, without necessarily having any understanding or insight into what they are doing. Self-awareness Although there is no substantive evidence of apes having a theory of mind, they may possess its precursor - a rudimentary self-awareness. This is backed up by the fact that, apart from human beings, apes are the only species able to recognise themselves in the mirror. In developmental literature, the moment when human infants first recognise themselves in the mirror (between 15 and 21 months of age) is seen as an important milestone in the emergence of the notion of 'self'. 
How important is it, then, that apes can manage the same sort of mirror recognition? The development of self-awareness is a complex process with different elements emerging at different times. In humans, mirror recognition is only the precursor to a continually developing capacity for self-awareness and self-evaluation. Younger children's initial self-awareness is focused around physical characteristics. With maturity comes a greater appreciation of psychological characteristics. When asking 'who am I?', younger children use outer visible characteristics - such as gender and hair colour - while older children tend to use inner attributes - such as feelings and abilities. The ability of apes to recognise themselves in the mirror does not necessarily imply a human-like self-awareness or the existence of mental experiences. They seem able to represent their own bodies visually, but they never move beyond the stage reached by human children in their second year of life. Children Research to date presents a rather murky picture of what primates are and are not capable of. Field studies may not have demonstrated conclusively that apes are incapable of understanding intentionality in the social domain or causality in the physical domain, but logically this must be the case: understanding of this sort would lead to a much more flexible kind of learning than apes display. It may be the case that the great apes do possess some rudimentary form of human-like insight. But the limitations of this rudimentary insight (if it exists at all) become clear when exploring the emergence, and transformative nature, of insight in young children. We are not born with the creative, flexible and imaginative thinking that characterises humans. It emerges in the course of development: humans develop from helpless biological beings into conscious beings with a sense of self and an independence of thought. The study of children can therefore give us great insights into the human mind. 
As Peter Hobson, professor of developmental psychopathology and author of The Cradle of Thought: Exploring the Origins of Thinking, states: 'It is always difficult to consider things in the abstract, and this is especially the case when what we are considering is something as elusive as the development of thought. It is one of the great benefits of studying very young children that one can see thinking taking place as it is lived out in a child's observable behaviour' (17). Thinking is more internalised, and therefore hidden, in older children and adults, but it is more externalised and nearer to the surface in children who are just beginning to talk. Hobson puts a persuasive case for human thought, language, and self-awareness developing 'in the cradle of emotional engagement between the infant and caregiver'. Emotional engagement and communication, he argues, are the foundation on which creative symbolic thought develops. Through reviewing an array of clinical and experimental studies, Hobson captures aspects of human exchanges that happen before thought. He shows that even in early infancy children have a capacity to react to the emotions of others. This points to an innate desire to engage with fellow human beings, he argues. However, with development, that innate desire is transformed into something qualitatively different. So, for instance, at around nine months of age, infants begin to share their experiences of objects or actions with others. They begin to monitor the emotional responses of adults, such as responding to facial expression or the tone of voice. When faced with novel situations or objects, infants look at their carers' faces and, by picking up emotional signals, they decide on their actions. When they receive positive/encouraging signals, they engage; when the signals are anxious/negative, they retreat. 
Towards the middle of the second year these mutually sensitive interpersonal engagements are transformed into more conscious exchanges of feelings, views and beliefs. Hobson is able to show that the ability to symbolise emerges out of the cradle of early emotional engagements. With the insight that people-with-minds have their own subjective experiences and can give things meanings comes the insight that these meanings can be anchored in symbols. This, according to Hobson, is the dawn of thought and the dawn of language: 'At this point, [the child] leaves infancy behind. Empowered by language and other forms of symbolic functioning, she takes off into the realms of culture. The infant has been lifted out of the cradle of thought. Engagement with others has taught this soul to fly.' (p274) The Russian psychologist Lev Vygotsky showed that a significant moment in the development of the human individual occurs when language and practical intelligence converge (18). It is when thought and speech come together that children's thinking is raised to new heights and they start acquiring truly human characteristics. Language becomes a tool of thought allowing children increasingly to master their own behaviour. As Vygotsky pointed out, young children will often talk out loud - to themselves it seems - when carrying out particular tasks. This 'egocentric speech' does not disappear, but gradually becomes internalised into private inner speech - also known as thought. Vygotsky and Luria concluded that 'the meeting between speech and thinking is a major event in the development of the individual; in fact, it is this connection that raises human thinking to extraordinary heights' (19). Apes never develop the ability to use language to regulate their own actions in the way that even toddlers are able to do. With the development of language, children's understanding of their own and other people's minds is transformed. 
So by three or four years of age, most children have developed a theory of mind. This involves an understanding of their own and others' mental life, including the understanding that others may have false beliefs and that they themselves may have had false beliefs. When my nephew Stefan was three years of age, he excitedly told me that 'this is my right hand [lifting his right hand] and this is my left hand [lifting his left hand]. But this morning [which is the phrase he used for anything that has happened in the past] I told daddy that this was my left hand [lifting his right hand] and this is my right hand [lifting his left hand]'. He was amused by the fact that he had been mistaken in his knowledge of what is right and what is left. He clearly had developed an understanding that people, including himself, have beliefs about things and that those beliefs can be wrong as well as right. Once children are able to think about thoughts in this way, their thinking has been lifted to a different height. The formal education system requires children to go much further in turning language and thought in upon themselves. Children must learn to direct their thought processes in a conscious manner. Above all, they need to become capable of consciously manipulating symbols. Literacy and numeracy serve important functions in aiding communication and manipulating numbers. But, above all, they have transformative effects on children's thinking, in particular on the development of abstract thought and reflective processes. In the influential book Children's Minds, child psychologist Margaret Donaldson shows that 'those very features of the written word which encourage awareness of language may also encourage awareness of one's own thinking and be relevant to the development of intellectual self-control, with incalculable consequences for the kinds of thinking which are characteristic of logic, mathematics and the sciences' (20). 
The differences in language, tool-use, self-awareness and insight between apes and humans are vast. A human child, even as young as two years of age, is intellectually head and shoulders above any ape. Denigrating humans As American biological anthropologist Kathleen R Gibson states: 'Other animals possess elements that are common to human behaviours, but none reaches the human level of accomplishment in any domain - vocal, gestural, imitative, technical or social. Nor do other species combine social, technical and linguistic behaviours into a rich, interactive and self-propelling cognitive complex.' (21) In the six million years since the human and ape lines first diverged, the behaviour and lifestyles of apes have hardly changed. Human behaviour, relationships, lifestyles and culture clearly have. We have been able to build upon the achievements of previous generations. In just the past century we have brought, through constant innovation, vast improvements to our lives, including better health, longer life expectancy, higher living standards and more sophisticated means of communication and transport. Six million years of ape evolution may have resulted in the emergence of 39 local behavioural patterns - in tool-use, communication and grooming rituals. However, this has not moved them beyond their hand-to-mouth existence, nor has it led to any significant changes in the way they live. Our lives have changed much more in the past decade - in terms of the technology we use, how we communicate with each other, and how we form and sustain personal relationships. Considering the vast differences in the way we live, it is very difficult to sustain the argument that apes are 'just like us'. What appears to be behind today's fashionable view of ape and human equivalence is a denigration of human capacities and human ingenuity. The richness of human experience is trivialised because human experiences are lowered to, and equated with, those of animals. 
Dr Roger Fouts from the Chimpanzee and Human Communication Institute expresses this anti-human view well in the following statement: '[Human] intelligence has not only moved us away from our bodies, but from our families, communities, and even Earth itself. This may be a big mistake for the survival of our species in the long run.' (22) Investigations into apes' behaviour could shed some useful light on how they resemble us - and give us some insight into our evolutionary past, several million years back. Developing a science true to its subject matter could give us real insights into what shapes ape behaviour. Stephen Budiansky's fascinating book If A Lion Could Talk shows how evolutionary ecology (the study of how natural selection has equipped animals to lead the lives they do) is showing us how animals process information in ways that are uniquely their own, much of which we can only marvel at (23). But as Karl Marx pointed out in the nineteenth century: 'What distinguishes the worst architect from the best of bees is this, that the architect raises his structure in imagination before he erects it in reality. At the end of every labour process, we get a result that already existed in the imagination of the labourer at its commencement.' (24) Much animal behaviour is fascinating. But, as Budiansky shows, it is also the case that animals do remarkably stupid things in situations very similar to those where they previously seemed to show a degree of intelligence. This is partly because they learn many of their clever feats by pure accident. But it is also because animal learning is highly specialised. Their ability to learn is not a result of general cognitive processes but 'specialised channels attuned to an animal's basic hard-wired behaviours' (23). It is sloppy simply to apply human characteristics and motives to animals. It blocks our understanding of what is specific about animal behaviour, and degrades what is unique about our humanity. 
It is ironic that we, who have something that no other organism has - the ability to evaluate who we are, where we come from and where we are going, and, with that, our place in nature - increasingly seem to use this unique ability in order to downplay the exceptional nature of our own capacities and achievements. Read on: [2]spiked-issue: Animals (1) New Humanist, November 2003 (2) Straw Dogs: Thoughts on Humans and Other Animals, by John Gray, Granta, August 2002 (3) [3]'How animals kiss and make up', BBC News, 13 October 2003; [4]Male birds punish unfaithful females, Animal Sentience, 31 October; [5]Dogs experience stress at Christmas, Animal Sentience, 10 December 2003; [6]Capuchin monkeys demand equal rights, Animal Sentience, 20 September 2003; [7]Scientists prove fish intelligence, 31 August 2003; [8]Birds going through divorce proceedings, Animal Sentience, 18 August 2003; [9]Bees can think say scientists, Guardian, 19 April 2001; [10]Chimpanzees are cultured creatures, Guardian, 24 September 2002 (4) See the [11]Great Ape project website (5) Intelligence of Apes and Other Rational Beings, by Duane M Rumbaugh and David A Washburn (buy this book from [12]Amazon (UK) or [13]Amazon (USA)) (6) Frans de Waal, Nature, Vol 399, 17 June 1999 (7) Nature, Vol 399, 17 June 1999 (8) Michael Tomasello, 'Primate Cognition: Introduction to the issue', Cognitive Science Vol 24 (3) 2000, p358 (9) BG Galef, Human Nature 3, 157-178, 1990 (10) See a detailed review by Andrew Whiten, 'Primate Culture and Social Learning', Cognitive Science Vol 24 (3), 2000 (11) Tomasello and Call, Primate Cognition, Oxford University Press, 1997 (12) [14]Peter Singer: Curriculum Vitae (13) Machiavellian Intelligence: Social Expertise and the Evolution of Intellect in Monkeys, Apes, and Humans, (eds) Andrew Whiten and Richard Byrne, Oxford 1990. 
Buy this book from [15]Amazon (USA) or [16]Amazon (UK) (14) [17]How primates learn novel complex skills: The evolutionary origins of generative planning?, by Richard W Byrne (15) M Hauser, 'A primate dictionary?', Cognitive Science Vol 24(3) 2000 (16) The Prehistory of the Mind: A Search for the Origins of Art, Religion and Science, Steven Mithen, Phoenix, 1998. Buy this book from [18]Amazon (UK) or [19]Amazon (USA) (17) The Cradle of Thought: exploring the origins of thinking, Peter Hobson, Macmillan, 22 February 2002, p76. Buy this book from [20]Amazon (UK) or [21]Amazon (USA) (18) Thought and Language, Lev Vygotsky, MIT, 1986 (19) Ape, Primitive Man and Child, Lev Vygotsky, 1991, p140 (20) Children's Minds, Margaret Donaldson, HarperCollins, 1978, p95 (21) Tools, Language and Cognition in Human Evolution, Kathleen R Gibson, 1993, p7-8 (22) [22]CHCI Frequently Asked Questions: Chimpanzee Facts (23) If a Lion Could Talk: Animal Intelligence and the Evolution of Consciousness, by Stephen Budiansky. Buy this book from [23]Amazon (UK) or [24]Amazon (USA) (24) Capital, Karl Marx, vol 1 p198 Reprinted from: [25]http://www.spiked-online.com/Articles/0000000CA40E.htm _________________________________________________________________ spiked, Signet House, 49-51 Farringdon Road, London, EC1M 3JP Email: [35]info at spiked-online.com References 2. http://www.spiked-online.com/Sections/Science/OnAnimals/Index.htm 3. http://news.bbc.co.uk/1/hi/scotland/3183516.stm 4. http://www.animalsentience.com/news/2003-10-31a.htm 5. http://www.animalsentience.com/news/2003-12-10.htm 6. http://www.animalsentience.com/news/2003-09-20.htm 7. http://www.animalsentience.com/news/2003-08-31.htm 8. http://www.animalsentience.com/news/2003-08-18.htm 9. http://www.guardian.co.uk/uk_news/story/0,3604,474807,00.html 10. http://education.guardian.co.uk/higher/artsandhumanities/story/0,12241,798331,00.html 11. http://www.greatapeproject.org/ 12. 
http://www.amazon.co.uk/exec/obidos/ASIN/0300099835/spiked 13. http://www.amazon.com/exec/obidos/tg/detail/-/0300099835/spiked-20 14. http://www.princeton.edu/%7Euchv/faculty/singercv.html 15. http://www.amazon.com/exec/obidos/tg/detail/-/0198521758/spiked-20 16. http://www.amazon.co.uk/exec/obidos/ASIN/0521559499/spiked 17. http://www.saga-jp.org/coe_abst/byrne.htm 18. http://www.amazon.co.uk/exec/obidos/ASIN/0500281009/spiked 19. http://www.amazon.com/exec/obidos/tg/detail/-/0500281009/spiked-20 20. http://www.amazon.co.uk/exec/obidos/ASIN/0333766334/spiked 21. http://www.amazon.com/exec/obidos/tg/detail/-/0195219546/qid=1077209516/spiked-20 22. http://www.cwu.edu/~cwuchci/chimpanzee_info/faq_info.htm 23. http://www.amazon.com/exec/obidos/tg/detail/-/0684837102/spiked 24. http://www.amazon.com/exec/obidos/tg/detail/-/0684837102/spiked-20 25. http://www.spiked-online.com/Articles/0000000CA40E.htm 35. http://www.spiked-online.com/forms/genEmail.asp?sendto=9&section=central From checker at panix.com Fri Jan 6 03:44:53 2006 From: checker at panix.com (Premise Checker) Date: Thu, 5 Jan 2006 22:44:53 -0500 (EST) Subject: [Paleopsych] WP: Dr. Gridlock: Balanced Views on a Roadside Sobriety Test Message-ID: Balanced Views on a Roadside Sobriety Test http://www.washingtonpost.com/wp-dyn/content/article/2005/12/21/AR2005122101158_pf.html [I practiced the art of standing on the Metrorail without holding onto anything. 
It took me several weeks, but now I can do so even when the train sways from side to side, as it does from Farragut North to Metro Center and from Van Ness to Tenleytown. But the worst, at least on the trains I have taken, is the trip between Union Station and Rhode Island Avenue. Not only does the train sway from side to side, it goes up and down, and the speed is quite variable. [Learning to balance myself on the Metrorail helps me enormously when I am out running over ice. When I stumble or slip, my brain has been trained enough so that I rarely fall down. This is especially important, since as I go down the stately march to senility at age 61, I don't heal nearly as rapidly as I used to. [I also practice standing on one foot with my hands over my head whenever I get a chance, like when waiting for or riding elevators. My co-workers often give me a puzzled look, but when I explain why, they are all smiles. They get puzzled, too, when I greet them brightly first thing on Monday mornings, saying, "Thank God it's Monday!" The puzzlement turns to smiles when I follow this up by saying, "I'm a workaholic."] [People vary enormously in their ability to balance. I well remember the day when I was living in Little Rock--I was six or seven at the time--when Dad huffed and puffed up and down hills with me on my bike, teaching me how to ride. Yet my brother, Dick, not then, for he was 4 1/2 years younger than me, but when he was about five, needed Dad not at all. He just got on his bike and rode away! He was later to become an excellent hockey player and still coaches the sport. But my experience with ice skating was abysmal. I was so cautious that it took me ten minutes to go once around the rink. I did not persist in learning ice skating. This was also true of skiing, though I did learn to roller skate, but never to roller blade, which activity was after my time. 
[I hadn't ridden a bike since around 1967 when, around 2000, I borrowed my older daughter's (Alice's) bike and went down the Capital Crescent Trail. I was extremely cautious for a few miles, but after that riding came back to me, and I was riding comfortably, though not as well as I did when I was a child. [I learned the skill of double-clutching when my Austin-Healey 3000's gearbox had no synchromesh going down from second to first gear. This is twice as complicated as simple shifting. We haven't had a stick shift since our Volkswagen Beetle gave up the ghost in 1983. But I know I could at least single-clutch pretty well at once. Double-clutching would have to come back to me.] By Ron Shaffer Thursday, December 22, 2005; GZ13 In a past column, an Annapolis man expressed concern that motorists suspected of drunken driving were sometimes asked by police to stand on one foot for 30 seconds [Dr. Gridlock, Dec. 8]. He noted that people in his morning health club class couldn't hold that position for 30 seconds, and they were all sober. I replied that I can't do it either -- not even close. That prompted the following responses. Dear Dr. Gridlock: I am going on 71 years, do not engage in regular exercise and was able to stand on one foot for 30 seconds. No problem at all -- on either leg. The gentleman may need to find a new exercise class! Mary Lucas Annapolis Dear Dr. Gridlock: I will never forget the evening I was humiliated on the side of the road after being stopped for speeding. I was returning home from dinner, driving my sports car faster than I should have been. I noticed a car coming from behind at even greater speed and immediately pulled over into the slow lane to let it go by. To my surprise, the other driver was a state police officer, and he was pulling me over. He asked for my ID, and when I opened my purse, an empty beer bottle from the only drink I had had that evening was prominently visible. 
I had saved it for the foreign label. He asked me to step out of the vehicle and undergo a number of tests. I passed the "touch your nose" test and the "walk the line" test but was then asked to stand on one foot for a period of time. I pointed out that I was standing on gravel and wearing high heels, but that made no difference. Of course I failed. I was then forced to take the roadside breathalyzer test and passed immediately. I went on my way with my speeding ticket but was completely rattled by the late-night roadside shenanigans. Although he was polite, the storm-trooper attitude the police officer assumed still rankles. Susan Guyaux, Crownsville

Makes me wonder: Why not administer the breathalyzer test before the other exercises?

Dear Dr. Gridlock: I can stand on one leg, either one, and count to 30 with no trouble. I'm 79, so maybe I've learned how to balance by now. Those people in the exercise class must all have a balance problem, or maybe they all stopped by the local tavern before going to class. Ed MacArthur, Greenbelt

I can't even come close to a 30 count. Maybe too many years of inhaling exhaust fumes.

Dear Dr. Gridlock: In response to your request for information about the police roadside sobriety test, my understanding is that it's not the ability to stand on one foot for a count of 30 that helps detect inebriation, but the manner in which you go about your attempt. Also, at least in Maryland, the field sobriety tests alone are not enough to convict: You must also fail a breathalyzer test. So even if you are miserable at the physical tests but pass the breathalyzer, they will let you go. Similarly, if you pass the physical tests but the breathalyzer shows your blood alcohol content to be above the legal limit, they will have a case. Sadly, I have been given the roadside sobriety and breathalyzer tests numerous times because I play in bands and work as a DJ and thus am (soberly) leaving bars and nightclubs around closing time.
Being on the road late at night apparently is enough probable cause to detain me and subject me to testing, irrespective of my driving performance. Eric Myers, Germantown

Well, I'm glad the ability to stand on one foot is not the sole indicator of one's sobriety.

A Changing Metro

Dear Dr. Gridlock: Oh, great. Not only is Metro going to get rid of seats in subway cars, which will leave standees who aren't tall enough to reach the overhead poles with nothing to hang onto, but they're also going to get rid of the center vertical poles between the exits? Wonderful! I'm 5 feet tall, and I cannot reach -- or at least can't grip -- the overhead poles. I'm also not built like a linebacker, which means I can't force my way into the middle of a crowded car to grab hold of a seat rail. I'm not quite elderly, but getting there, and I have a bad back. The "elderly and handicapped" seats, when I can get one or find someone kind enough to relinquish theirs, have been my salvation when using Metro. And when I couldn't sit, I would hang on for dear life to that center pole. Now the seats are going, and the pole, too. How is someone like me supposed to use Metro? And why do they persist in making a subway ride more of an ordeal all the time? And where on Earth are those additional subway cars that have been on order for years now, which would make it possible to run six-car trains on all lines during busy hours? Why do fares keep going up while we get less service that is more of a burden to use? Me, I have the option to drive. I can add my bit to the city's pollution and gridlock. Thanks, Metro. Lynda Meyers, Arlington

Metro is not getting rid of any of the seats designated for seniors and the disabled. As part of tests of 24 reconfigured cars, all the vertical ceiling-to-floor poles are being eliminated, but many more vertical poles are being installed from the backs of seats to the ceiling. Further, spring-loaded strap handles are being suspended from the overhead bars.
That is being done to see if cars can be loaded and unloaded with more efficiency than the free-for-all that exists now. As for more cars, they are coming. Metro expects to have eight-car trains on 20 percent of its fleet by the end of 2006, 30 percent by the end of 2007 and 50 percent by the end of 2008.

Dear Dr. Gridlock: I have been reading the suggested solutions to the difficulty of getting on and off Metro trains. If Metro can manage to fine-tune the brakes to allow people to line up on station platforms and have the train stop right in front of them, it could have riders entering in the middle and exiting on the sides. That would not require the removal of any seats. Liliana Ward, Alexandria

Maybe. Removal of the vertical poles and seats around the center doors would be intended to spread standees throughout the cars, rather than have them block those trying to board through the middle doors. I do hope Metro will try one-way boarding and exiting.

From checker at panix.com Fri Jan 6 03:45:04 2006 From: checker at panix.com (Premise Checker) Date: Thu, 5 Jan 2006 22:45:04 -0500 (EST) Subject: [Paleopsych] Norman Lebrecht: Too much Mozart makes you sick Message-ID:

Too much Mozart makes you sick
http://www.scena.org/columns/lebrecht/051214-NL-250mozart.html
5.12.14

[I think I'll pass on the 172-CD set of the "complete" Mozart. It costs $300, but most of the performers are unknown to me. I didn't get the Philips "complete" Mozart in 1991 on 180 CDs. The "complete" Bach on Hänssler took up 171 CDs. I got that for $200, having wanted it ever since I bought a copy of the big Schmieder BWV in 1964 directly from the publisher. I slogged through them all, but I must report that none of these performances pushed aside any of my favorites. But the Robert Levin performance of the first klavier concerto was quite exciting. And the four discs of organ music played by Bine Katrine Bryndorf were almost as good as Walcha.

[Why are the best performers today all women?
Bryndorf on the organ, Lara St. John on the violin, Hélène Grimaud and Mitsuko Uchida on the piano, and Marin Alsop holding the baton?

[Mozart mostly cranked out music for others, as was common in those days, Haydn and Telemann being other examples. He composed great masterpieces, esp. the piano concerti, of course, but he wrote little just for himself. Beethoven did crank out stuff, but he wrote more for himself than any composer before or since. And Brahms is right up there. Bach was a big cranker-outer, esp. the cantatas, but he also wrote a great deal of music for himself, much more than Mozart did.

[Lebrecht, however, ignores the timeless masterpieces that Mozart wrote. And recall Mr. Mencken's remark that Mozart (and Wagner) could not avoid genuine music creeping into their operas. But it's good to have some balance in the discussion. Charles Murray, an obvious Beethoven lover, had to decide, from the sources he used to rank human achievement in his book by that title, whether Beethoven or Mozart was the greater. It was a toss-up, but he came down on Beethoven's side.

[Norm's a little overheated. I'd rank Mozart in the top ten and put him ahead of Shostakovich. But his corrective is badly needed!]

-------------------

They are steam-cleaning the streets of Vienna ahead of next month's birthday weekend, when pilgrim walks are planned around the composer's shrines. Salzburg is rolling out brochures for its 2006 summer festival, which will stage every opera in the Köchel canon from infantile fragments to The Magic Flute, 22 in all. Pierre Boulez, the pope of musical modernism, will break 80 years of principled abstinence to conduct a mostly-Mozart concert, a celebrity virgin on the altar of musical commerce. Wherever you go in the coming year, you won't escape Mozart. The 250th anniversary of his birth on January 27, 1756, is being celebrated with joyless efficiency as a tourist magnet to the land of his birth and a universal sales pitch for his over-worked output.
The complete 626 works are being marketed on record in two special-offer super coffers. All the world's orchestras will be playing Mozart, wall to wall, starting with the Vienna Philharmonic on tour this weekend.

Mozart is the superstore wallpaper of classical music, the composer who pleases most and offends least. Lively, melodic, dissonance-free: what's not to like? The music is not just charming, it's full of good vibes. The Mozart Effect, an American resource centre which ascribes 'transformational powers' to Austria's little wonderlad, collects empirical evidence to show that Mozart, but no other music, improves learning, memory, winegrowing and toilet training and should be drummed into classes of pregnant mothers like breathing exercises. A 'molecular basis' identified in Mozart's sonata for two pianos is supposed to have stimulated exceptional brain activity in laboratory rats. How can one argue with such 'proof'? Science, after all, confirms what we want to believe - that art is good for us and that Mozart, in his short-lived naivety, represents a prelapsarian ideal of organic beauty, unpolluted by industrial filth and loss of faith.

Nice, if only it were true. The chocolate-box image of Mozart as a little miracle can be promptly banged on the head. The hard-knocks son of a cynical court musician, Mozart was taught from first principles to ingratiate himself musically with people of wealth and power. The boy, on tour from age five, hopped into the laps of queens and played limpid consolations to ruthless monarchs. Recognising that his music was better than most, he took pleasure in humiliating court rivals and crudely abused them in letters back home. A coprophiliac obsession with bodily functions, accurately evinced in Peter Shaffer's play and Milos Forman's movie Amadeus, was a clear sign of arrested emotional development.
His marriage proved unstable, and his inability to control the large amounts he earned from wealthy Viennese patrons was a symptom of the infantile behaviour that hastened his early death and pauper burial. Musical genius he may have been, but Mozart was no Einstein. For secrets of the universe, seek elsewhere.

The key test of any composer's importance is the extent to which he reshaped the art. Mozart, it is safe to say, failed to take music one step forward. Unlike Bach and Handel, who inherited a dying legacy and vitalised it beyond recognition, unlike Haydn, who invented the sonata form without which music would never have acquired its classical dimension, Mozart merely filled the space between staves with chords that he knew would gratify a pampered audience. He was a provider of easy listening, a progenitor of Muzak.

Some scholars have claimed revolutionary propensities for Mozart, but that is wishful nonsense. His operas of knowing servants and stupid masters were conceived by Da Ponte, a renegade priest, from plays by Beaumarchais and Ariosto; and, while Mozart once indulged in backchat to the all-high Emperor Joseph II, he knew all too well where his breakfast brioche was buttered. He lacked the rage of justice that pushed Beethoven into isolation, or any urge to change the world. Mozart wrote a little night music for the ancien regime. He was not so much reactionary as regressive, a composer content to keep music in a state of servility so long as it kept him well supplied with frilled cuffs and fancy quills.

Little in such a mediocre life gives cause for celebration, and little indeed was done to mark the centenary of his birth, in 1856, or of his death, in 1891. The bandwagon of Mozart commemorations was invented by the Nazis in 1941 and fuelled by post-War rivalries in 1956, when Deutsche Grammophon rose from the ruins to beat the busy British labels, EMI and Decca, to a first recorded cycle of the Da Ponte operas.
The 1991 bicentennial of Mozart's death turned Salzburg into a swamp of bad taste and cupidity. The world premiere of a kitsch opera, Mozart in New York, had me checking my watch every five unending minutes. The record industry, still vibrant, splattered Mozart over every vacant hoarding, and a new phenomenon, Classic FM, launched in 1992 on the Mozart tide, ensured that we would never be more than a fingerstretch away from the nearest marzipan chord.

What good all this Mozart does is disputable. For all the pseudoscience of the Mozart Effect, I have yet to see a life elevated by Cosi fan tutte or a criminal reformed by the plinks of a flute and harp concerto. Where ten days of Bach on BBC Radio 3 will flush out the world's ears and open minds to limitless vistas, the coming year of Mozart feels like a term at Guantanamo Bay without the sunshine. There will be no refuge from neatly resolved chords, no escaping that ingratiating musical grin.

Don't look to mass media for context or quality control. Both the BBC and independent channels have rejected any critical perspective on Mozart in the coming year, settling for sweet-wrapper documentaries that regurgitate familiar clichés. In this orgy of simple-mindedness, the concurrent centenary of Dmitri Shostakovich--a composer of true courage and historical significance--is being shunted to the sidelines, celebrated by the few.

Mozart is a menace to musical progress, a relic of rituals that were losing relevance in his own time and are meaningless to ours. Beyond a superficial beauty and structural certainty, Mozart has nothing to give to mind or spirit in the 21st century. Let him rest. Ignore the commercial onslaught. Play the Leningrad Symphony. Listen to music that matters.
From checker at panix.com Fri Jan 6 10:40:48 2006 From: checker at panix.com (Premise Checker) Date: Fri, 6 Jan 2006 05:40:48 -0500 (EST) Subject: [Paleopsych] CPE: Douglas Glen Whitman: Hayek contra Pangloss on Evolutionary Systems Message-ID:

Douglas Glen Whitman: Hayek contra Pangloss on Evolutionary Systems
Constitutional Political Economy, 9, 45-66 (1998)

[This article is an excellent discussion of Panglossianism in evolutionary (and functionalist, for that matter) thinking, as well as a good summary of the controversy surrounding Hayek's thought. But I learn more about the fallacies Hayek didn't commit than about what he actually argued. How is evolutionary thinking important?

[Various comments of mine in brackets below.]

Department of Economics, New York University, New York, N.Y. 10018

Abstract. Some analysts have criticized Friedrich Hayek's theory of cultural evolution for implying that the rules, customs, norms, and institutions that emerge from the evolutionary process are necessarily efficient or desirable in all cases. This charge is unfounded. The present article defends Hayek versus his critics in two ways: First, it restates Hayek's own objections to the idea that cultural evolution produces optimal outcomes. Second, it shows, through an analogy with biological evolution, that Hayek's theory need not imply any such conclusion. Contrary to a widely held misconception, biological evolution does not produce organisms that are perfectly adapted to their habitats; insofar as cultural evolution shares common features with biological evolution, cultural evolution may be expected to display similar types of suboptimality or mal-adaptation.
Insights from the theory of biological evolution also help to illuminate some areas of controversy with regard to Hayek's theory of cultural evolution, including: Hayek's advocacy of gradual change; the question of what selective forces drive the process of cultural evolution; and the alleged conflict between group selectionism and methodological individualism.

JEL classification: B25, B31, K40

1. Introduction

Nearly all of the political and economic doctrines of Friedrich Hayek have drawn heated criticism from one quarter or another, but few have attracted so much critique and rebuke, from authors of diverse persuasions, as his theory of cultural evolution. The idea that the morals, customs, habits, conventions, and even laws of modern civilization may owe their origin to a lengthy process of variation, competition, and selection has a long--and sometimes unsavory--history in intellectual thought, and Hayek was by no means its first exponent. He may, however, be credited with reviving the concept as a serious tool for social theory and normative judgment in the latter half of this century, and his most evolutionarily oriented works, The Fatal Conceit (1988) and Law, Legislation and Liberty (1973, 1976, 1979), have served as a lightning rod for renewed discussion of the merits and flaws of evolutionary theory in the social sciences.

Among the most frequently repeated charges lodged against Hayek's theory of cultural evolution is that Hayek, like the Social Darwinists, has committed the Panglossian fallacy: he has suggested or implied that social evolution must necessarily produce the best of all possible worlds, a world in which "whatever is, is desirable," or (to put the economists' spin on it) "whatever is, is efficient."2 John Gray (1989: 98), for instance, claims that "Hayek frequently affirms that the sheer persistence of a tradition or a form of life suggests that it must possess some general utility."
Martin De Vlieghere (1994: 293) characterizes Hayek as contending that "only those cultural attainments can survive and spread that are beneficial. So, the very longevity of an institution proves its value...." According to Stefan Voigt (1992: 465, n.20), Hayek commits the naturalistic fallacy in his support of evolved institutions: "The currently existing institutions (the 'is') have emerged because they have been more viable than other institutions, from which Hayek concludes that they ought to exist." In the economists' camp, Joseph Stiglitz (1994: 275) argues that "those who appeal to the evolutionary process [e.g., Hayek and Armen Alchian] also claim too much: There is no reason to believe that evolutionary processes have any optimality properties...," and he goes on to say, "It seems nonsensical to suggest that we should simply accept the natural outcome of the evolutionary process." James Buchanan, an author usually friendly to Hayekian themes, nonetheless perceives Hayek as being adamantly opposed to all reform of evolved institutions. "We may share much of Hayek's skepticism about social and institutional reform, however, without elevating the evolutionary process to an ideal role," says Buchanan (1975: 194, ch. 10, n.1). "Reform may, indeed, be difficult, but this is no argument that its alternative is ideal." Sociologist Bjorn Hallerod (1992: 34) is notably less sympathetic. He argues that "Hayek ends up in a situation where every existing form of society is a good society or otherwise it would not exist," which means that Hayek must find even Nazism acceptable.

The critiques have been severe and sometimes overstated, but they are in substance correct: evolutionary systems cannot be characterized as unambiguously efficient or desirable (however these terms might be defined) in their effects. Where Hayek's critics err is in directing these criticisms at Hayek. Hayek's theory can be faulted in a variety of ways, but Panglossianism is not one of them.
My intention in this article is two-fold: first, to restate for the record Hayek's rejection of the idea that cultural evolution necessarily produces optimal results; and second, to elaborate some of the reasons why his theory need not imply any such thing. I will conclude by explaining how a better understanding of suboptimality in evolutionary systems can illuminate some areas of controversy that have arisen with regard to Hayek's theory.

The second goal will be pursued via an extended analogy with biological evolution. This approach may require some justification. It is my impression that many opponents of cultural evolution theories assume Panglossian implications because of a conscious or unconscious analogy with biological evolution, which is widely--and incorrectly--perceived as a process that produces optimal fitness in organisms relative to their habitats. Gray (1989: 98), for example, states as an objection to Hayek's theory, "we have nothing in society akin to the mechanism of natural selection of genetic accidents in Darwinian theory which guarantees the survival of useful social practices," as though he believes biological natural selection does make such a guarantee.

My response, then, proceeds by showing that, even if the analogy between biological and cultural evolution is close (and the analogy does seem closer to me than many analysts would like to admit), biological evolution does not and cannot produce optimal results in all cases. Insofar as cultural evolution shares common features with biological evolution, it, too, will be subject to inefficiency. Although Hayek often tries to distance himself from the analogy with biological evolution, he apparently does so not mainly because he doubts the analogy's validity, but because he wishes to eschew the errors of the Social Darwinists.
Hayek repeatedly emphasizes that Darwin's theory of biological evolution was inspired by the evolutionary thinking of the moral and social theorists who preceded him (particularly David Hume, Adam Smith, and the other Scottish moral philosophers).3 After Darwin, Hayek (1979: 154) laments, "those 'social Darwinists' who had needed Darwin to learn what was an older tradition in their own subjects, had somewhat spoiled the case [for cultural evolution] by concentrating on the selection of congenitally more fit individuals," rather than on the selection of rules and practices adopted by groups.

Hayek hastens to point out the differences between cultural and biological evolution that make it the case that rules and practices are far more significant than individuals in the process of cultural evolution. Specifically, he notes that "cultural evolution simulates Lamarckism"; that cultural traits can be acquired "from an indefinite number of 'ancestors,'" not merely from one's parents; that learning as a mode of transmission makes cultural evolution occur more quickly than biological evolution; and that cultural evolution is more likely to be subject to group selection (1988: 25).

Nonetheless, Hayek recognizes that while their specific mechanisms differ, all forms of evolution share common features. Although the "literal use" of Darwinian theory leads to "grave distortions" when focused upon individuals rather than rules, "the basic conception of evolution is still the same in both fields," he says (1979: 23). Biological and cultural evolution "both rely on the same principle of selection: survival and reproductive advantage.
Variation, adaptation and competition are essentially the same kind of process, however different their particular mechanisms, particularly those pertaining to propagation."4 Again, to the extent that cultural and biological evolution are united by kindred processes, they can be expected to exhibit similar characteristics, including their capacity to produce efficient and less-than-efficient outcomes.

In much of this article, I will be purposely vague about the definition of efficiency. Even within economics, efficiency has been defined in a variety of ways, from strict Pareto efficiency to wealth maximization. The standards by which the efficiency of rules and institutions are judged sometimes differ from the standards employed to judge efficient activity within given rules and institutions; for example, when Hayek speaks of the efficiency of rules, he usually seems to have in mind the degree to which rules promote the utilization of knowledge and the coordination of plans. Biologists typically employ the concept of "reproductive fitness," by which they mean the capacity of traits to increase the probability of an organism surviving long enough to reproduce as effectively as possible subject to environmental constraints. In general, all such concepts of efficiency are related to the idea, broadly conceived, of "doing the best you can given certain constraints," and fortunately, the point I wish to make does not require any greater specificity. I will contend that, whatever specific definition of efficiency may be adopted, an evolutionary system could not be expected to achieve it in all cases, although some brands of efficiency may be more easily approached than others.

2. F. A. Hayek: No Panglossian

By all indications, Hayek was fully aware of the "all's for the best" charges that might be leveled against his theory.
He was particularly concerned with the tendency of some social theorists to reject all evolutionary theories of culture out of hand because of the errors of Social Darwinism. His disclaimer is therefore worth quoting at length:

Bertrand Russell provides a good example in his claim that "if evolutionary ethics were sound, we ought to be entirely indifferent to what the course of evolution might be, since whatever it is is thereby proved to be the best".... This objection, which A. G. N. Flew ... regards as "decisive," rests on a simple misunderstanding. I have no intention to commit what is often called the genetic or naturalistic fallacy. I do not claim that the results of group selection of traditions are necessarily "good"--any more than I claim that other things that have long survived in the course of evolution, such as cockroaches, have moral value (Hayek 1988: 27).

Nor does he claim that the products of cultural evolution should be immune to criticism or change; again, it is best to quote Hayek directly:

It would be wrong to conclude, strictly from such evolutionary premises, that whatever rules have evolved are always or necessarily conducive to the survival and increase of the populations following them. ... Recognizing that rules generally tend to be selected, via competition, on the basis of their human survival-value certainly does not protect those rules from critical scrutiny (Ibid.: 20).

Notably, Hayek believes that the cultural selection process selects for survival and reproduction of groups (a questionable hypothesis that will be considered later), yet even by that criterion of efficiency, the resulting rules cannot be assumed to be efficient. It would be particularly odd, then, for those rules to be efficient according to some other standard, such as neoclassical economic efficiency or classical liberal value judgments.
The above quotations appear in Hayek's latest work, but they do not represent retrenchments in the face of criticism of Hayek's previous works; the same message appears repeatedly in his earlier works. In The Constitution of Liberty, for instance, we find Hayek admitting:

These considerations, of course, do not prove that all sets of moral beliefs which have grown up in a society will be beneficial. Just as a group may owe its rise to the morals which its members obey, ... so may a group or nation destroy itself by the moral beliefs to which it adheres (Hayek 1960: 67).

Of course, this statement could be interpreted as merely a view of selectionism-in-progress, in that "bad" moral views are characterized as leading inevitably to their own demise. The point, however, is that Hayek does not perceive the process as finished: at any point in time, including the present day, we may find undesirable rules and customs that have not been weeded out by selective forces, at least not yet. Hayek never eschews the modification and reform of rules; he simply points out that any such revision of particular rules must necessarily take place in the context of a complex of other rules that are taken as given for the time being: "This givenness of the value framework implies that, in our efforts to improve them, we must take for granted much that we do not understand" (Ibid.: 63).

In Law, Legislation and Liberty, Hayek again emphasizes the need for reform of established rules--this time in the context of a narrower evolutionary system, the common law.

The fact that law that has evolved in this way has certain desirable properties does not prove that it will always be good law or even that some of its rules may not be very bad. It therefore does not mean that we can altogether dispense with legislation (1973: 88).
Indeed, Hayek (Ibid.: 89) even admits the possibility that general principles of justice (embodied in the remainder of the body of law) may "require the revision not only of single rules but of whole sections of the established system of case law." These are not the statements of a Panglossian. But neither do they suffice to shield Hayek's theory from the charge that it implies that whatever exists is the best of all possible worlds; Hayek's objections notwithstanding, his theory may have implications beyond his words. The question is, does an evolutionary theory unavoidably lead to Panglossian conclusions? In answering this question, we can gain insights by taking a closer look at the well-developed evolutionary theory of another field: biology.

3. The Flaws of the Adaptationist Paradigm

Evolutionary biologists have, unfortunately, contributed in part to the misconception that evolutionary systems must yield optimal results. Particularly in the early days of biological evolutionary theory, biologists could be found using Spencer's phrase "survival of the fittest," and that phrase has proved more than a little misleading. Biologists of the "panadaptationist" stripe have perpetuated the idea that all traits of all extant organisms may be construed as optimizing those organisms' fitness relative to the environment. Even modern biologists occasionally slip into this way of thinking; consider the following passage from biologist Ledyard Stebbins:

["Fitness" must refer to something in the world, and not just to whatever survives, in order to avoid charges of circularity and tautology. This much we know from Mary B. Williams on group selection and from Mario Bunge in general. But I do not know how biologists do describe fitness.]

... all modern species and races of organisms have existed as successful populations, well adjusted to their environment, for thousands or millions of generations.
We would expect, therefore, that all mutations that might improve the organism's reproductive fitness to its particular environment would have occurred at least once during this long period. If so, they would have been incorporated by natural selection in the gene pool (Stebbins 1977: 58).

From statements like this one, it would be easy--but wrong--to draw Panglossian conclusions. Although extreme adaptationism reigned for a while in the biological literature, most biologists (including Stebbins) now reject pan-adaptationism (Vromen 1995: 95f.).

Two highly problematic assumptions are required to justify evolutionary theories of the pan-adaptationist variety. The first is the "t goes to infinity" assumption: evolutionary processes are presumed to have reached the ultimate result that would obtain if the processes continued for an infinite period of time. The paradigmatic example is the anecdote about 100 monkeys (actually, just one would do) pounding on typewriters for an unlimited amount of time: sooner or later, one of the monkeys will type out the entirety of Gone With the Wind. If the t-goes-to-infinity assumption is taken seriously, the logic is inexorable: every combination of letters (or gene/trait combinations, or cultural taboos) will eventually appear. Everything that can happen will happen, so an appropriate selection mechanism will presumably capture the best of all possible worlds.

In real-world processes, however, time is never infinite, at least not from the perspective of an analyst observing the products of evolution at any given point in time. From our perspective, evolution is an ongoing process, and we should not be surprised to find incomplete--and suboptimal--adaptation. The assumption of infinite time bypasses considerations of process altogether.

[I love this "t goes to infinity." It happens all the time in economics. And it gets invoked by those who reject multi-level selection in biology.
I have hounded those who use verbal arguments (which are all the worse when supplemented with equations: I say this as a math major with a vested interest in my speciality) against the possibility of group selection by drilling down the arguments till I find some mention of "the long run," like "in the long run no altruistic genes can survive." But by the same token, no groups can survive without altruists. The whole thing is like dividing by zero.] Indeed, it is tempting to argue that, once infinite time is presumed, the optimal result is implicit in the initial conditions, in much the same way that the solution to a system of equations is implicit in the equations themselves. I will resist that temptation, however, because a second assumption is necessary for that conclusion: a stable, exogenous environment against which selection takes place. The environment is, of course, the standard on the basis of which adaptation (and optimality) is usually measured; in most theories, the environment is actually the selective mechanism. When the environment is stable and exogenous, the adaptive "target" remains fixed, and infinite time assures the process will eventually achieve it. But with a moving target, even infinite time cannot force a conclusion of optimal adaptation. As J. Maynard Smith (1994: 97) has pointed out, "Optimization is based on the assumption that the population is adapted to the contemporary environment, whereas evolution is a process of continuous change. Species lag behind a changing environment." In other words, one cannot assume perfect tracking of environmental changes by changes in the genome of resident species. Even if time were infinite, the protean nature of the environment would restrict the relevant adaptation time for any organism to the interval between changes in its environs.
The endogeneity of the environment complicates the matter even further, by raising the possibility that the definition of a "good" mutation may depend crucially upon prior mutations. The appearance of a new, desirable trait in a species causes changes in the environment, and those changes alter the selective pressures impinging on the species--possibly rendering other prevailing traits non-adaptive. One puzzling consequence of such a path-dependent process is that fitness may not be transitive: trait B might supersede trait A, and C supersede B, and then A supersede C (Wesson 1991: 141). If changes in the traits of an organism can shape the environment as well as be shaped by it, the very idea of optimal adaptation gets murky, because it is unclear that a steady-state relationship between organism and environment will always occur. [The next paragraph is good.] The twin assumptions of infinite time and a stable environment underpin the usual case for optimality in evolutionary systems. When they are relaxed, we can understand a variety of actual phenomena in such systems as being non-adaptive or mal-adaptive, rather than dream up ad hoc justifications of how such phenomena might be optimal (as modern biologists have unfortunately tried to do in many cases5). This is true even when "adaptiveness" or "efficiency" has been defined specifically in terms of the environment that acts as a selective mechanism upon traits and organisms. That is, even if we specifically tailor our definition of efficiency to fit the direction of the evolutionary forces at work, we still cannot realistically expect perfectly efficient outcomes. A fortiori, we should not expect an evolutionary system to yield efficient outcomes with respect to some other brand of efficiency defined independently of the selective forces at work (except, perhaps, purely by coincidence).6 In what follows, I will explain some of the most widely recognized types of less-than-perfect adaptation in biological evolution.
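The non-transitive fitness relation mentioned above (B supersedes A, C supersedes B, A supersedes C) behaves like rock-paper-scissors. A minimal sketch, with the cycle hard-coded as an assumption rather than derived from any biology, shows how sequential invasions can circle forever instead of converging on a single "best" trait:

```python
# Hypothetical pairwise competition results illustrating non-transitive
# fitness: each trait displaces one rival yet is displaced by another.
beats = {
    "B": "A",  # trait B supersedes trait A
    "C": "B",  # trait C supersedes trait B
    "A": "C",  # trait A supersedes trait C
}

def resident_after(invasions, start="A"):
    """Track which trait is resident after a sequence of invasion attempts."""
    resident = start
    for invader in invasions:
        if beats.get(invader) == resident:
            resident = invader  # invader displaces the current resident
    return resident

# The population cycles endlessly rather than settling on an optimum:
print(resident_after(["B", "C", "A"]))  # A is resident once again
```

Because every resident is beatable by some invader, a single optimal end state is undefined here; which trait we happen to observe depends entirely on where the sequence of invasions stops.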
In addition to mentioning specific cases of such suboptimalities in biology, I will also provide some examples of how similar suboptimalities might occur in cultural evolution. Where possible, I draw my examples from Hayek himself. These examples should, however, be taken with a grain of salt: they are intended as suggestive, not definitive. A convincing case for why any one of the examples given indeed constitutes an example of suboptimal adaptation would probably require an article of its own.

3.1. Errors of Omission, Errors of Commission

Naturalists regularly encounter organisms with traits that defy attempts at explanation in terms of adaptation to prevailing environmental conditions. Often the best explanation for such traits comes from an examination of the organisms' phylogenetic histories (even though optimality would imply that current conditions alone should provide sufficient explanation). Apparently, selective forces are not always strong enough to remove all unnecessary or harmful traits from a genome in a finite period of time. The best examples are the so-called "vestigial structures" that appear in numerous species, including human beings. Vestigial structures in humans include the vermiform appendix (which may have been a gizzard in our ancestors), ear muscles (once needed for directional hearing), and caudal vertebrae (the remnant of a tail).7 None of these features provides any apparent selective advantage any longer, and appendices often require removal when they pose a positive danger to human life; they are actually mal-adaptive. These traits constitute errors of omission: they are features that selective forces have failed to eliminate. It is not terribly difficult to imagine possible analogs in cultural evolution. Although Hayek often fails in his works to explain why the processes he describes may not always yield optimal results, he seems to have recognized the persistence of no-longer-adaptive traits as one possible reason.
In Law, Legislation and Liberty, Hayek notes that mankind maintains multiple layers of rules, "according as [sic] traditions have been preserved from the successive stages through which cultural evolution has passed. The consequence is that modern man is torn by conflicts which torment him and force him into ever-accelerating further changes" (1979: 159). Hayek harks back to the conflict between new and old rules in The Fatal Conceit (1988: 18f.) when he attributes the collectivist desire to implement altruism society-wide to a misapplication of the morals of the small group (which evolved very early in humanity's cultural history) to the extended order called civilization (whose rules developed later, and often in conflict with the prior set of rules). Biology also provides various cases in which traits that would clearly be beneficial are conspicuously absent. Smith cites the gannet Sula bassana, which lays only one egg at a time, even though it would be capable of raising (and the environment capable of sustaining) two young at a time. A related gannet in very similar conditions does, in fact, lay two eggs at a time (Smith 1994: 98). Why, then, doesn't Sula bassana? Two answers seem plausible: first, the environment may have changed recently in a more favorable direction and the gannet's genome has not caught up yet; or second, such a mutation may have appeared one or more times but been eliminated by accident (say, because the one chick with the mutation happened to fall out of the nest and die before reproducing). The second scenario would constitute an error of commission, a case of selective forces accidentally eliminating a desirable trait. In either scenario, the fact remains that evolution has not placed all adaptive traits in the current genome. Again, it is not difficult to imagine analogs in cultural evolution.
Of course, many suggestions of "beneficial traits we haven't adopted" may be nothing more than the wishful thinking of social reformers or cultural entrepreneurs, but this observation does not mean that truly beneficial but unused or untried cultural traits cannot exist. Hayek admits this possibility with a particular example: "The institutions of property, as they exist at present, are hardly perfect; indeed, we can hardly yet say in what such perfection might consist. Cultural and moral evolution do require further steps if the institution of property is in fact to be as beneficial as it can be" (1988: 35). Some might argue that property rules and other customs and conventions are perfect as they are, but a belief in the idea of cultural evolution certainly would not warrant such a conclusion. There is every reason to believe that cultural evolution can produce errors of omission and commission just as biological evolution does.

3.2. Linkages and Pleiotropism

[This is a nice section, esp. for those who have forgotten their biology.] Students of biological evolution have long been familiar with the fact that traits often travel together in packs, even when there is no apparent adaptive advantage to the traits appearing together. This may occur, for instance, when two or more genes are located very close to each other on a chromosome, so that it is unlikely that they will be separated during crossing-over (the process in sexual cell division whereby chromosomes exchange sections, thus creating a greater variety of gene combinations). It may also occur in organisms in which crossing-over does not take place, such as male fruit flies and some bacteria; in cases like these, the entire chromosome is the smallest unit of selection. In such situations, it becomes possible for non-adaptive or mal-adaptive traits to tag along with traits of high adaptive value, a phenomenon P. W. Hedrick calls "genetic hitchhiking" (Dodson and Dodson 1985: 212).
Linkages between traits may also occur when a single regulator gene (a gene that activates or otherwise regulates the activities of other genes) turns a number of genes "on" or "off" as a group. If some of those genes confer substantial advantages, the unfortunate effects of other genes in the group may be outweighed. Some biologists suspect that a small number of mutations in regulator genes may have yielded the vast phenotypic differences that separated human beings from their ape-like ancestors; as Robert G. Wesson (1991: 272) puts it, "Hairlessness, tender skin, and exceptional intelligence seem all to be parts of an evolutionary package, elements of which are evidently unadaptive." Similar linkages may occur because of a pleiotropic gene, a single gene that causes multiple effects. An example of a pleiotropic gene is the gene for sickle cell anemia, which, in addition to its well-known harmful effects, provides some degree of protection against malaria (Stebbins 1977: 126). It might be argued that some linkages are unavoidable, and it is therefore optimal for an organism to have linked traits so long as the good outweighs the bad (since optimal means only the best of all possible worlds). This is probably true of pleiotropic genes, and possibly true for regulator-complexes. Linkage by proximity, on the other hand, is clearly a matter of historical accident. The relevant question is, do these traits need to be connected? Are there no other formations or combinations of genes that could separate good from bad effects? If the answer is no, then the existing situation must be considered suboptimal. Cultural analogs leap to mind. It is clear enough that many ideas and practices travel in groups, even though they could theoretically be separated. Religions, for example, are complex structures that comprise multiple beliefs and mores. One might expect a religion to persist if it provided sufficient selective advantages to outweigh any disadvantages involved. 
On this subject, Hayek argues: Customs whose beneficial effects were unperceivable by those practising them were likely to be preserved long enough to increase their selective advantage only when supported by some other strong beliefs; and some powerful or magic faiths were readily available to perform this role (1988: 138). Fantastic beliefs about the nature of the world might, therefore, piggyback on beneficial religious practices. Such beliefs could be disadvantageous because they impede the acquisition of more accurate and scientific models of nature, yet survive because they facilitate useful modes of behavior. (What constitutes a selective advantage or disadvantage in cultural evolution is, of course, an open question--one that will be partially addressed later.) Another case of linkage in cultural evolution might arise from the fact that the growth of government power could have both beneficial and harmful consequences. As Hayek observes, Those [governments] that gave greater independence and security to individuals engaged in trading benefited from the increased information and larger population that resulted. Yet, when governments became aware how dependent their people had become on the importation of certain essential foodstuffs and materials, they themselves endeavoured to secure these supplies in one way or another (1988: 44). [Mancur Olson's stationary bandit theory fits the facts much better.] Consequently, security of trade routes and abuse of power have tended to travel together, although whether they can ever be separated is an open question.

3.3. Evolution by Chance and Evolutionary Trends

Evolutionary change can also take place simply by chance, particularly in small, isolated populations. In a small population, the death of a single individual can have large repercussions in terms of gene frequencies. Over several generations, these random effects can drive out genes and reduce the variability of the population's genome (Sober 1994: 486).
Random genetic changes can also accumulate over time with almost no effect, until a marginal mutation, such as the emergence or disappearance of a regulator gene, causes substantial changes to take place all at once. Most importantly, chance selection explains why adaptive mutations could appear yet fail to spread. Stephen J. Gould and Richard Lewontin (1994: 82) observe that "new mutations have a small chance of being incorporated into a population, even when selectively favored. Genetic drift causes the immediate loss of most new mutations after their introduction." In short, chance can provide the basis for the activation of complexes of linked genes and magnify the incidence of errors of commission. [How big of a mole hill are these chance effects?] A well-known, though trivial, cultural example of this phenomenon is the shrinking of the pool of surnames within small villages in New England, Wales, and elsewhere (Stebbins 1977: 127f.). When a new settlement was established by a small number of founders, the chance death of a single person could substantially reduce the frequency of the victim's surname in the population, even though the surname itself had no selective impact. Whether there exist non-trivial examples of chance selection in cultural processes depends in part on the level at which selection takes place. If group selection (as opposed to individual selection) is an actual phenomenon--as Hayek believed it to be--then some form of cultural drift should become more likely as groups become larger in size and fewer in number. To the extent that the entire world may be considered a single community, the relevant population has only a single member, and drift could therefore be quite dramatic. (The debate about levels of selection is a live issue in biological evolution as well as in cultural evolution that will be discussed more fully later.) 
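Gould and Lewontin's point that even favored mutations are usually lost can be made concrete with a small Wright-Fisher simulation. This is a standard population-genetics sketch, not anything from the sources cited, and the parameter values are purely illustrative:

```python
import random

def fixation_rate(pop_size=100, s=0.05, trials=2000, seed=1):
    """Fraction of runs in which a single favored mutant reaches fixation
    under Wright-Fisher sampling (illustrative parameters)."""
    rng = random.Random(seed)
    fixed = 0
    for _ in range(trials):
        count = 1  # one new mutant copy among pop_size gene copies
        while 0 < count < pop_size:
            p = count / pop_size
            # selection tilts the sampling probability toward the mutant
            p_sel = p * (1 + s) / (p * (1 + s) + (1 - p))
            count = sum(rng.random() < p_sel for _ in range(pop_size))
        fixed += (count == pop_size)
    return fixed / trials

rate = fixation_rate()
print(f"favored mutation fixes in only {rate:.1%} of runs")
assert rate < 0.5  # most new beneficial mutations are lost to drift
```

With these numbers the favored mutant fixes in only on the order of a tenth of the runs; in the rest, drift eliminates it soon after its introduction, just as the quoted passage describes.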
Biologists have also observed that selective processes can sometimes lead to the persistence and enlargement of trends; that is, a kind of evolutionary multiplier effect may cause the same mutation to occur again and again. Suppose gene A creates cellular conditions under which mutation B is likely to occur. Then an organism with gene A will tend to have progeny that carry both gene A and mutation B. If B has a selective advantage, then the progeny will be likely to survive and create progeny of their own that carry gene A and mutation B twice. And the next generation may have gene A and mutation B thrice. The phenomenon occurs because the same forces that favor a trait (mutation B) must also favor the genetic conditions that make the trait likely to occur in the first place (gene A). Possible examples include the multiplication of legs on the millipede and the growth of the brain in humans (Wesson 1991: 194). A possible example of trend persistence in cultural evolution, which seems in keeping with Hayek's previously cited suggestions about the abuse of government power, is that the forces which favor groups that solve certain coordination or public-good problems may also favor the growth of institutions or attitudes that allow these social solutions to be reached. (The institutions or attitudes that allow the solutions to be reached are analogous to "gene A"; the solutions themselves are analogous to "mutation B".) Selective forces may therefore reinforce cultural attitudes that favor an increase in social control, even though only specific forms of social control yield a selective advantage.

3.4. Multiple Adaptive Peaks

Finally, biologists have also recognized the possibility that an organism may follow multiple routes in its adaptation to an environment. It is by no means certain that all routes must lead to the same end point; there may be different end points that represent the highest adaptability of an organism along the different paths.
Such end points are referred to as "multiple adaptive peaks" (Gould and Lewontin 1994: 84). Which path is "chosen" may depend crucially on the order in which mutations occur. A beneficial mutation may arise early on in the phylogenetic history of a species and be incorporated into its genome. Then, subsequent mutations' "fitness" will depend on how well they fit with the organism's new genome. Thus, an early mutation may place an organism on a path to one adaptive peak rather than another as a result of historical accident. A number of economists have observed the evident connection between the idea of multiple adaptive peaks and the game-theoretic concept of a coordination game. (Many of these economists owe much to J. Maynard Smith, who pioneered the use of game-theoretic tools in biology.) Viktor Vanberg (1986: 93) has used game theory to offer a sharp critique of Hayek's theory of cultural evolution, noting that "once a coordination rule is established in a group, it cannot be assumed that a shift to a more beneficial rule can, in general, be brought about by a spontaneous, invisible-hand process." To put that in biological terms, switching from one adaptive peak to another is an extremely unlikely phenomenon, even if one peak is demonstrably superior to the other. If a species reaches an adaptive peak that is not sufficient to preserve the species in the relevant environment, it seems more likely that it will go extinct than switch to a different evolutionary path. If, on the other hand, a suboptimal peak is sufficient for species survival, then the species could persist indefinitely in a less than optimal state. Despite Vanberg's criticism, Hayek seems to have been aware of the possibility of multiple adaptive peaks; indeed, Hayek's cultural relativism (which may seem inexplicable to those who interpret Hayek as a Social Darwinist) is intimately related to the concept. 
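Vanberg's game-theoretic point can be seen in a minimal sketch (the payoffs are hypothetical): once a group has coordinated on the inferior rule, no individual gains by deviating unilaterally, so a spontaneous, invisible-hand process has no lever for moving the group to the superior rule.

```python
# A 2x2 coordination game with two equilibria, one Pareto-superior
# (illustrative payoffs, not from the sources cited).
PAYOFF = {  # (own_strategy, others_strategy) -> own payoff
    ("old", "old"): 2,   # inferior but self-enforcing convention
    ("new", "new"): 5,   # superior convention
    ("old", "new"): 0,   # miscoordination yields nothing
    ("new", "old"): 0,
}

def best_response(others_strategy):
    """A player's best reply given what everyone else is doing."""
    return max(("old", "new"), key=lambda s: PAYOFF[(s, others_strategy)])

# Starting from the inferior convention, unilateral switching is punished:
print(best_response("old"))  # -> 'old'
# ...even though the better rule would also be stable once reached:
print(best_response("new"))  # -> 'new'
```

In biological terms, each convention is an adaptive peak: both are locally stable, so selection on individual deviations cannot carry the population from the lower peak to the higher one.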
Hayek does not deny the fact that some cultures have developed in completely different directions from that of Western civilization and yet somehow managed to survive: There are, undoubtedly, many forms of tribal or closed societies which rest on very different systems of rules. All that we are here maintaining is that we know only of one kind of such systems of rules, undoubtedly still very imperfect and capable of much improvement, which makes the kind of open or "humanistic" society possible where each individual counts as an individual and not only as a member of a particular group, and where therefore universal rules of conduct can exist which are equally applicable to all responsible human beings (1976: 27). Hayek deliberately argues, therefore, from the context of the adaptive route taken by Western civilization, and he argues for internal improvement within that system. Hayek also recognizes the possibility that, even within a given tradition, path dependency may result in suboptimal consequences for particular subsets of that tradition. In the common law, for example, he points out that "The development of case-law is in some respects a sort of one-way street: when it has already moved a considerable distance in one direction, it often cannot retrace its steps when some implications of earlier decisions are seen to be clearly undesirable" (1973: 88). In situations like these, we find Hayek once again arguing for the occasional corrective reform, which would be unnecessary in a perfectly self-correcting (or instantaneously optimal) evolutionary system.8

4. Broader Implications for the Theory of Cultural Evolution

Biological evolution does not provide any justification for the belief that evolutionary processes necessarily lead to optimal results. But neither does it support the opposite conclusion, that evolutionary systems exhibit no desirable or efficient qualities whatsoever.
Outrageously mal-adaptive traits have a high likelihood of being weeded out of the gene pool, and the organisms we observe in the natural world have clearly inherited remarkably sophisticated and effective structures and behaviors that allow them to survive and reproduce. The adaptiveness of at least a large number of traits observed in existing organisms has never been in question; what is in question is whether such traits represent the best solutions possible in all cases, and whether every single trait must serve some adaptive purpose. As Wesson has observed, "It is only necessary, however, that any particular characteristic be sufficiently functional to permit the species to survive. If there is an optimal shape of leaf for certain conditions of light and humidity, or webs for snaring flies, and so forth, most species are far from it" (1991: 154). The challenge for biologists, then, is to discern which traits took hold for truly adaptive reasons, which traits emerged for other reasons, and how such emergence took place. The challenge for the evolutionary social scientist, I claim, is much the same. If evolutionary theory told us that all existing laws, customs, conventions, and mores were optimal adaptations to the conditions of human life, there would be little left to do but look around and describe what is already known to be best. But since evolutionary theory does not justify that conclusion, the social scientist's task is more difficult: he must attempt to identify which cultural norms possess truly adaptive qualities, which cultural norms emerged and persisted for non-adaptive reasons (and which may even have mal-adaptive effects), and how these norms came into being. 
Evolutionary theory can, therefore, provide a sound basis both for advocacy of reform (when structures appear mal-adaptive or detrimental in some way) and for the defense of tradition (when the traditions seem to produce desirable results on net, or when they may be indispensable to the ongoing system as a whole). (Of course, to engage in such internal criticism, one would have to approve, normatively, of whatever standard of "efficiency" is implicit in the selective forces at work--which Hayek appears to do.) That we should keep "good" traditions and change "bad" ones might seem truistic, but some of Hayek's critics have accused him of inconsistency in so arguing. De Vlieghere (1994: 294), for instance, calls Hayek's advocacy of piecemeal reform "lip-service" because it is "in contradiction with his Darwinian theory," which devalues the contributions of reason. Similarly, Barbara M. Rowland (1987: 54) says that Hayek "inconsistently" draws the conclusion "that people can learn from studying the valuable role evolved institutions have played in advanced societies" so that reforms will fit smoothly into the evolved order. But as we have seen, Hayek's reformist and traditionalist tendencies present no contradiction; they are perfectly consistent when viewed from an appropriate evolutionary perspective. These statements should not be taken to imply, however, that Hayek's theory of cultural evolution has no flaws or drawbacks. Indeed, the criticisms and doubts about Hayek's theory are too numerous to state here.9 I will instead show how the biological metaphor and a recognition of the possibility of suboptimality in evolutionary systems can help us to address some unresolved issues associated with Hayek's approach.

4.1. Gradualism

One conclusion that Hayek has drawn from his evolutionary analysis is that gradual or piecemeal change ought to be preferred to radical or wholesale change. At first, this conclusion appears to fit in with the evolutionary paradigm nicely.
In the biological sphere, Stebbins states, "If an organism is well adjusted to its environment, slight changes in its genetic makeup may adapt it better to modifications of that environment, but drastic alterations of one or a few characteristics are almost certain to make it function more poorly under any environment" (1977: 60). In Hayek's view, ill-advised reformers who wish to jettison rules or conventions whose functions are not immediately clear or whose systemic implications are not understood may seriously threaten the stability of an interdependent system. Hayek therefore advises that all reforms be judged within the context of a complex of other rules taken for the time being as given. While all these points are well taken, Hayek's plea for gradualism cannot be taken as a universal rule--at least, not on the basis of evolutionary arguments alone. The potential existence of multiple adaptive peaks indicates that a system very different in all respects from the status quo could, conceivably, have more desirable qualities. In order for such a peak to be reached, radical changes might be required. For Hayek to argue against such wholesale reform, he must (and does) muster other arguments that he has elaborated elsewhere. For instance, in his case against socialism, Hayek might like to say that evolutionary considerations alone should be sufficient to relegate socialism to the dustbin of bad ideas. And evolutionary arguments do carry him part of the way to that conclusion, inasmuch as they lead the analyst to consider the functional properties of institutions such as several property and security of contract. But it is not inconceivable, prima facie, that the status quo represents but one of many adaptive paths.
In order to make the case against socialism, Hayek must also rely on a variety of other tools, such as economic theory, to demonstrate that socialism could not, in fact, achieve the results its proponents suggest.10 Hayek also grounds his argument for gradualism on a strong epistemological challenge: people whose civilization has evolved along one path may simply lack the knowledge necessary to identify viable alternatives that differ substantially from the status quo. That other adaptive paths are conceivable does not imply that ignorant human beings can see what they are and implement them. To theorize that evolution could have led to a different and possibly superior outcome is fine, but to say precisely what that outcome would have been is an act of the imagination, and trying to realize an imaginary outcome in the real world is to engage, not in evolutionary theory, but in rational constructivist design. When proposed changes differ only marginally from the status quo, the imagination can (perhaps) be relied upon for some valid judgments; but in the case of massive system-wide changes, the demands placed on human knowledge are far higher.11 It was exactly that sort of hubristic endeavor to which Hayek applied the term "the fatal conceit."

4.2. The Dual Selective Mechanism of Cultural Evolution

There is a great deal of confusion, in both Hayek's work and the literature on cultural evolution in general, about the exact means by which selection takes place in cultural evolution. It is often unclear whether the emergence of cultural norms is a matter of individual and collective choice or purely a product of impersonal environmental factors. An understanding of how evolutionary systems may fail to yield optimality can shed light on this matter. An evolutionary system consists of two fundamental features: units of selection, and a selective mechanism.
The selective mechanism consists of those forces in the system which allow for the differential survival and reproduction of the units of selection. Units of selection are structures or entities that have the capacity to replicate themselves (that is, to reproduce) under certain conditions. In biology, the most fundamental units of selection are genes (out of which higher-level structures such as organisms and species are formed). But if genes are the most basic units of selection in biology, what are the corresponding units of selection in cultural evolution? The smallest units are, in fact, cultural traits or features with the capacity to be adopted, consciously or unconsciously, by human beings. Richard Dawkins dubbed these entities "memes." In his words, Examples of memes are tunes, ideas, catch phrases, clothes fashions, ways of making pots or of building arches. Just as genes propagate themselves in the gene pool by leaping from body to body via sperm and eggs, so memes propagate themselves in the meme pool by leaping from brain to brain via a process which, in the broad sense, can be called imitation (Dawkins 1976: 206). [This is most definitely not a definition. Russell (and after him Wittgenstein) also thought he had a "unit" of meaning. No characterization of such units has achieved any consensus. This does not render the notion useless. Uselessness comes in degrees.] If memes are indeed the smallest such units, then the psychology and preferences of individuals may constitute significant selective forces, inasmuch as these factors determine which memes can successfully "infect" human minds. "We do not have to look for conventional biological survival values of traits like religion, music, and ritual dancing," Dawkins argues, "though these may also be present" (Ibid.: 214). [Dawkins would probably agree that a simple graph is a meme. The first graph was drawn by Nicole Oresme in 1340, not by a Greek or Roman.]
The simplest way to grasp this point is to conceive of cultural evolution as a massive hitchhiker trait. The mutations that created the ability of the human brain to imitate, learn, and evaluate obviously had substantial adaptive qualities for the human species, and for that reason they tended to be selected. But such a brain is capable of far more than enhancing an organism's survival and reproduction; this sort of brain can also desire, imagine, and create. The complex of mutations that created the human mind set in motion myriad effects, only a fraction of which necessarily possessed biologically adaptive qualities; the rest just came along for the ride. Biologist Philip Kitcher observes, "All that natural selection may have done is to equip us with the capacity for various social arrangements and the capacity to understand and to formulate ethical rules" (1994: 440). In so doing, natural selection created the conditions for another kind of evolution, cultural evolution, that is only peripherally related to biological factors. The entire process of cultural evolution may be accurately characterized as a playing out of the full implications of a particular genetic configuration--the human brain--that emerged from the process of biological evolution. Consequently, human culture may be regarded as responding to a dual selective mechanism. On the one hand, the reproductive capacities of units of cultural transmission (memes) are subject to a selective process in terms of their plausibility, attractiveness, utility, and ease of imitation--as determined by the human minds that consciously or unconsciously adopt them. These standards may or may not have anything to do with the memes' capacity to help or hinder the survival and reproduction of human beings in their environments. 
On the other hand, which cultural traits human beings adopt will often have indirect impacts on human survival and reproduction, and natural selection of an environmental variety will necessarily come into play if the impacts are sufficiently positive or negative. I will refer to selection of the former variety as "psychological selection," and to selection of the latter variety as "environmental selection." I should note that what I am calling environmental selection is the sole selective mechanism at work in biological evolution, while both forms of selection are at work in cultural evolution.12 Like any evolutionary process, cultural evolution does not exhibit a strictly linear chain of causality. The feedback generated by selective forces (in this case, psychological and environmental selection) means that the reason a trait comes into being may differ from the reason a trait persists. Cultural traits come into being because humans are equipped with brains capable of imagining and conceiving of different rules, practices, and ideas. But of the many cultural traits that may come into being, only some will survive both psychological and environmental selection. (Similarly, in biological evolution many traits can come into being via mutation and recombination, but only some will survive the process of selection.) The two selective mechanisms of cultural evolution need not always work in the same direction. Sometimes they will reinforce each other; other times they may conflict. It is even possible that some cultural traits may run contrary to the apparent demands of environmental selection, because of the overwhelming influence of psychological factors. Cavalli-Sforza and Feldman provide a fascinating example: the decline of birth rates among European women after the onset of industrialization. 
If the disposition to bear children were purely genetic, passed from mother to daughter, a disposition to limit one's child-bearing would tend to die out in an environment in which raising more children was possible. Those mothers with a disposition to bear more children would pass that disposition to more future mothers, while those with a disposition to bear fewer children would pass that disposition to fewer future mothers. The fact that the European birth rate diminished indicates that some form of cultural transmission of dispositions had to have been at work, for only cultural transmission permits horizontal (intra-generational) and oblique (from one generation to non-offspring members of the next) transmission of dispositions. That is, the ability of European women to learn a new disposition rather than inherit an old one made a drop in the birth rate possible.13 But how could this environmentally mal-adaptive meme have survived the process of environmental selection? The answer lies in recognizing the dual nature of cultural selection. Evidently, the disposition to limit one's pregnancies became psychologically (and financially) appealing to women after industrialization took place in Europe; as a result, the disposition to have more children suffered from a magnified "death" rate, since large numbers of women were abandoning it. Successful memes must survive at this psychological level of selection before the environmental level of selection can even become operative. On the environmental level, the disposition to restrict one's pregnancies might have been expected to lead to its own demise if its net effect had been to reduce the number of the disposition's adherents in each generation. But the European population was still, on the whole, rising because of improvements in sanitation, food provision, and other factors.
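The two-stage filter at work in this example can be sketched as a toy simulation. (A hedged illustration only: the parameter names and every number below are invented, not drawn from Cavalli-Sforza and Feldman.)

```python
# Toy model of dual selection acting on a "low-fertility" meme.
# Psychological selection: each generation, some non-carriers adopt the
# meme by horizontal/oblique transmission (its psychological "appeal").
# Environmental selection: carriers bear fewer offspring than non-carriers,
# but both groups reproduce above replacement, so the population grows.
# All parameters are invented for illustration.

APPEAL = 0.30          # per-generation adoption rate (psychological selection)
BIRTHS_CARRIER = 1.1   # offspring per carrier (environmental cost of the meme)
BIRTHS_OTHER = 1.4     # offspring per non-carrier

carriers, others = 10.0, 990.0
for _ in range(30):
    converts = APPEAL * others   # meme spreads by imitation...
    carriers += converts
    others -= converts
    carriers *= BIRTHS_CARRIER   # ...then each group reproduces, with
    others *= BIRTHS_OTHER       # offspring inheriting the parent's disposition

share = carriers / (carriers + others)
# Despite its fertility cost, the meme ends up held by most of a
# population that has nonetheless grown.
```

With APPEAL set near zero, the same loop shows the low-fertility meme dwindling instead, so the sketch also illustrates why purely vertical (genetic-style) transmission could not have produced the European decline.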
As a result, a meme yielding lower birth rates survived despite the selective advantage of higher birth rates in the context of environmental selection.14 Perceiving cultural evolution as responding to a dual selective mechanism allows the idea of spontaneous order, a concept to which Hayek devoted a considerable amount of attention, to be more fully integrated into the theory of cultural evolution. A spontaneous order such as a market-based economic system does not respond to or serve the specific, unitary ends of a society; rather, it serves the multiplicitous and largely unknown ends of all the individuals whose transactions create the order. This sort of order is an "abstract order of the whole which does not aim at the achievement of known particular results but is preserved as a means for assisting the pursuit of a great variety of individual purposes" (Hayek 1976: 5). It is not at all clear why an order that serves individuals' multifarious purposes should survive in an evolutionary system in which survival and reproduction of groups is the only criterion for natural selection (as Hayek sometimes implied). One of the advantages of a spontaneous order is its capacity to mobilize information that is dispersed among many individuals in the order. Among the pieces of information transmitted (or summarized) by this sort of order, in addition to information about technologies and resource supplies, are the subjective tastes and preferences of the participating individuals. The need for such information would be inexplicable if group survival and reproduction were the only selective forces at work; the tastes of individuals would be precisely irrelevant. It is only because there is another type of selection involved--the satisfaction of the psychological demands of human minds--that information about tastes and preferences might be relevant to an adaptive process.

4.3. Group Selection and Methodological Individualism

In explaining his theory of cultural evolution, Hayek embraces the concept of group selection: the idea that cultural traits and behavioral features are naturally selected on the basis of advantages and disadvantages they create for the groups of people who practice them. A number of authors have found Hayek's group selectionism troubling, and Vanberg (1986) argues that group selection conflicts with Hayek's professed methodological individualism. Since the idea of group selection is "theoretically vague, inconsistent with the basic thrust of Hayek's individualistic approach, and faulty judged on its own grounds," Vanberg (1986: 97) contends that group selection ought to be jettisoned to save methodological individualism. Geoffrey Hodgson (1991) agrees with Vanberg that there is a conflict between the two doctrines, but recommends instead that methodological individualism should be abandoned (or at least modified) in order to keep group selection. Some of the insights from the foregoing discussion of the dual selective mechanism of cultural evolution may help to resolve the Vanberg-Hodgson debate. Vanberg defines methodological individualism as "the guiding principle that aggregate social phenomena can be and should be explained in terms of individual actions, their interrelations, and their--largely unintended--combined effects" (Vanberg 1986: 80). Group selection conflicts with methodological individualism, Vanberg argues, because it attempts to explain cultural norms in terms of the functional roles they play for groups rather than their emergence through individuals' behavior. He proceeds to argue that group selection is a troublesome and flawed concept even in biology, because it is unclear how "altruistic" behavior patterns that benefit groups could possibly survive in the presence of selective pressures that favor "selfish" behavior by individuals.
Vanberg says that it seems to be the "dominant opinion among biologists" that the conditions necessary for true group selection "rarely exist in nature" (Ibid.: 69). Interestingly, Vanberg also maintains that methodological individualism was a factor in the development of the theory of biological evolution because it supported a shift "from the species as the theoretical unit to the individual organism as the central unit of analysis" (Ibid.: 80). [I have been beating a drum for group selection in biology for a decade, even before Elliott Sober and David Sloan Wilson's _Unto Others_ came out. Wilson (alone) went on to apply the group selection idea to human religion and to revive functionalism in sociology, in _Darwin's Cathedral_. Functionalism had gone by the wayside in sociology much as group selection had in biology. It was Stephen Sanderson's _The Evolution of Human Sociality: A Darwinian Conflict Perspective_ (a terrific book and the first one to import sociobiology into sociology), or rather his dismissal of functionalism, that turned me into a functionalist! He argued that functionalism implies the existence of equilibrating mechanisms, which would restore the social system to a state in which all the parts worked together. Well, there *are* these mechanisms. Sanderson was criticizing *pan*-functionalism. In the process, he convinced me that there must be something to a less thoroughgoing version of it. And _Darwin's Cathedral_ reached the same result.] Hodgson argues cogently that Vanberg has misconstrued the biological literature on the debate over units of selection. The biological "reductionists" on whom Vanberg relies for support do not contend that the individual organism is the most basic unit of selection in the evolutionary process. Reductionists like Richard Dawkins contend, on the contrary, that selective forces ultimately operate on the smallest units of selection, genes.
The misleadingly labeled group selectionists, on the other hand, argue that natural selection operates on higher level structures as well. Genes come in complex groups called individual organisms, organisms come in groups called populations, populations come in groups called species, and so on; and all of these structures, the group selectionists believe, may be subject to weeding and culling by evolutionary forces. "In other words," Hodgson says, "selection operates simultaneously on different types of unit, depending on the time-scale and the type of selection process" (1991: 69). The real debate in biology, then, is not selection of individuals versus selection of groups, but selection of genes versus selection at multiple levels of a hierarchy. To the extent that Vanberg relies on the support of biological reductionism to support methodological individualism, his argument collapses because there is no particular reason to focus on individuals. "Simple reduction to the individual level is unacceptable because the same arguments concerning reduction from groups to individuals apply equally to reduction from individual to gene. To avoid this double standard, one must either accept multiple levels of selection, or reduce everything to the lowest level [i.e., genes] in the manner of Dawkins ... and Williams" (Ibid.: 71). Although Vanberg's use of reductionist argumentation is vulnerable to Hodgson's critique, a stronger case for methodological individualism can be made. In most of his analysis, Vanberg implicitly portrays individuals as units of selection in the evolutionary process. If this were the theoretical basis for methodological individualism, then methodological individualism would indeed be threatened by Hodgson's clarification of the levels-of-selection debate.
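The multilevel picture Hodgson invokes is often formalized with the Price equation, which splits total selection on a trait into a between-group and a within-group component. The toy decomposition below is my own illustration with invented numbers (nothing here comes from Hodgson or Vanberg), and it assumes equal-sized groups so that unweighted means suffice.

```python
# Price-equation decomposition of selection on a trait into
# between-group and within-group components (equal-sized groups,
# so unweighted means suffice). All numbers are invented.

def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])

# z = 1 if an individual carries a "cooperative" trait, else 0;
# w = that individual's fitness. Cooperative groups prosper, yet
# within any one group the defector does best.
groups = [
    {"z": [1, 1, 1, 0], "w": [1.2, 1.2, 1.2, 1.5]},  # mostly cooperators
    {"z": [1, 0, 0, 0], "w": [0.8, 1.1, 1.1, 1.1]},  # mostly defectors
]

group_z = [mean(g["z"]) for g in groups]  # trait frequency per group
group_w = [mean(g["w"]) for g in groups]  # mean fitness per group
W = mean(group_w)                         # grand mean fitness

between = cov(group_w, group_z) / W                       # group-level selection
within = mean([cov(g["w"], g["z"]) for g in groups]) / W  # individual-level selection
total = between + within
# between > 0 (cooperative groups out-reproduce other groups) while
# within < 0 (cooperators lose to defectors inside each group).
```

Whether the trait spreads depends on which component dominates, which is exactly the empirical question on which Vanberg and Hodgson differ.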
But the earlier discussion of the dual selective mechanism of cultural evolution suggests that the individual human is not merely a unit of selection; the individual human is actually part of the selective mechanism that influences the survival and reproduction of cultural traits (or memes). And it is this fact, I will argue, that is crucial in the case for methodological individualism. In addition, it dissolves the alleged conflict between methodological individualism and group selection, allowing the concepts to co-exist in the same theory. For the social scientist interested in the process of cultural evolution, the relevant explananda are the cultural norms (including beliefs, rules, behavioral regularities, and institutions) that emerge from that process. In order to understand why some memes have survived and prospered while others have grown rare or disappeared, he must direct his attention to the selective forces that have imposed differential death rates on various cultural practices and beliefs. That means asking, first and foremost, how and why some practices and beliefs were adopted in the first place by human beings and others were not. In other words, it is necessary to enquire into the effects of psychological selection, the first prong of the dual selective mechanism. Then, the analyst must explore the systemic effects that would result from the adoption of certain norms. Such effects might include changes in the constraints that influenced individuals' adoption of those norms in the first place, in which case another round of psychological selection could occur, and the same process could be iterated indefinitely. The systemic effects of norms might also include changes in the capacity of individuals and groups to serve their physiological needs, resulting in population growth or population loss; that is, the second, environmental, prong of the dual selective mechanism could come into play. 
It might appear that allowing for two selective mechanisms, instead of just psychological selection, represents a break from methodological individualism. But there is no contradiction here: the tenets of methodological individualism do not require that social phenomena be explained without reference to the constraints that impinge on individuals' actions. If environmental constraints affect the survival of individuals (and the groups composed of them) in such a way that the norms they practice and the things they believe have a reduced probability of being absorbed by other individuals (either outsiders or subsequent generations), then the environmental prong of the dual selective mechanism is consistent with a methodological approach that explains social outcomes in terms of the actions, choices, and behaviors of individuals. [Just what is inconsistent with "methodological individualism"? I am now thoroughly confused.] Notably, in this account individuals are not units of selection upon which selective forces operate, except insofar as an individual may be perceived as a conglomeration of multiple memes and genes. What is essential for a methodologically individualist account of the evolution of cultural outcomes is that individuals constitute a filter (i.e., a selective mechanism) through which memes must pass before they can begin to have systemic effects. Vanberg is correct to chastise Hayek for giving too little attention to this filtering process in his later work: Hayek regularly refers to the unexpected prosperity of groups that "happened to change them [cultural rules] in a way that rendered them increasingly adaptive" (Hayek 1988: 20) while giving little detail about how the individuals in those groups might "happen" to adopt such changes. Vanberg is also correct, therefore, to draw attention to the question of how, for instance, groups of individuals might happen upon appropriate rules for escaping Prisoners' Dilemma-type situations. 
It is also clear, however, that groups that did--somehow--find solutions to that kind of dilemma (e.g., tit-for-tat or "grudger" strategies) would create advantages for their members over the members of other groups that did not discover similar solutions. In other words, if a set of beneficial social rules can survive the gauntlet of psychological selection, then groups of individuals who adopt those rules will be favored by environmental selection. It is worth pointing out that the psychological gauntlet may not be as difficult to clear as Vanberg suggests, since individuals may be guided as much by an instinct to imitate as by rational optimization. (Hayek contends that that kind of rationality is a product, not a predecessor, of cultural evolution (Ibid.: 21).) Of course, any strategy that survived psychological selection would still have to be capable of surviving environmental selection as well. (That is, it would have to be an "evolutionarily stable strategy," to borrow J. M. Smith's terminology.) [Most people give little or no thought to what they do. They are not obsessive Premise Checkers.] Finally, I should be explicit about how this discussion relates to the issue of group selection. Without necessarily endorsing the group selectionist hypothesis, one can easily see that group selection is at least not incompatible with methodological individualism, once it is recognized that methodological individualism does not depend upon individual organisms being the (sole) unit of selection. With the methodological issue out of the way, the debate between Vanberg and Hodgson largely disappears. Like the biologists from whom they draw support for their respective positions, Vanberg and Hodgson apparently agree that group selection is a conceivable phenomenon; they merely disagree about its empirical relevance in the world.
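The tit-for-tat example can be made concrete with a small Axelrod-style iterated Prisoner's Dilemma using the standard payoff values. This is an illustrative sketch, not anything taken from Vanberg's text.

```python
# Iterated Prisoner's Dilemma with the standard payoffs:
# mutual cooperation 3/3, mutual defection 1/1, sucker 0 vs. temptation 5.
PAYOFF = {("C", "C"): (3, 3), ("D", "D"): (1, 1),
          ("C", "D"): (0, 5), ("D", "C"): (5, 0)}

def tit_for_tat(own_history, other_history):
    # Cooperate first, then copy the opponent's previous move.
    return other_history[-1] if other_history else "C"

def always_defect(own_history, other_history):
    return "D"

def play(strategy_a, strategy_b, rounds=100):
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_a, history_b)
        move_b = strategy_b(history_b, history_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

# A "group" of tit-for-tat players sustains mutual cooperation (300 each),
# a group of defectors earns far less (100 each), yet in a mixed pairing
# the defector still comes out ahead of the cooperator (104 vs. 99).
tft_group = play(tit_for_tat, tit_for_tat)
defector_group = play(always_defect, always_defect)
mixed = play(tit_for_tat, always_defect)
```

The defector wins every mixed pairing, yet the all-cooperator pair out-earns the all-defector pair: precisely the within-group versus between-group tension on which the debate turns.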
Opinion on this matter seems to have converged on the position stated by Sober: "Group selection acts on a set of groups if, and only if, there is a force impinging on those groups which makes it the case that for each group, there is some property of the group which determines one component of the fitness of every member of the group."15 There remains a debate as to how often these conditions hold, in both biological and cultural evolution. But Vanberg's coordination games and Prisoners' Dilemmas present fine examples of how these conditions could, at least in principle, apply to certain real-world situations faced by human beings. There seems to be some basis, therefore, for Hayek's focus on group selection in his evolutionary theory.

5. Concluding Remarks

The critics of theories of cultural evolution have often chided cultural evolutionists for their alleged belief that "whatever is, is desirable." Although some theorists of cultural evolution (like the Social Darwinists) have in fact reached such conclusions, Friedrich Hayek was not one of them. Repeated statements by Hayek indicate that he did not regard cultural evolution as a perfect process. ["Social Darwinism" is a social construction of 20th-century historians. None called themselves that at the time. Likewise, there are no self-styled "neo-Nazis." Sometimes, though, an enemy's label gets worn as a badge of pride. Methodism is the best-known example.] Nor does an evolutionary approach justify or imply such a conclusion. The well-developed field of biological evolution provides innumerable examples of how an evolutionary process may fail to produce perfectly adapted organisms. The assumptions of infinite time and constant environments could sustain the idea of perfect adaptation, but these assumptions are untenable.
In a real-world evolutionary system, whether of the biological or cultural variety, one should therefore not be surprised to find errors of omission and commission, "hitchhiker" traits, chance selection, trend persistence, and path dependence. Indeed, such "suboptimal" phenomena in the phylogenetic history of mankind may be responsible for the very existence of cultural evolution. Trend persistence and chance, as well as adaptive selection, led to the formation of a powerful human brain capable of imitation, learning, and cognitive thought. That brain produced multiple effects, only some of which could be considered adaptive on a purely biological level. The other traits merely tagged along, and among those traits was the capacity for desires and preferences--often for things with no discernable adaptive value whatsoever, such as fine art and literature. The very persistence of cultural traits that are non-adaptive (or even mal-adaptive, in the sense of counteracting the demands of environmental selection) constitutes a fantastic error of omission; human beings are constantly engaged in a multitude of costly, energy-consuming activities that add nothing to the reproductive fitness of the species. The species can remain in existence because the biological advantages of having powerful brains--such as providing food, shelter, and clothing--are sufficient to justify the biological burdens of having those brains. Those burdens include the vast majority of what we call "culture" (and few people would consider them burdens in a pejorative sense of the word). The process of cultural evolution may usefully be treated as responding to two masters. One is environmental selection, meaning the process by which certain cultural traits may lead to the demise or proliferation of those who hold them because they inhibit the production of food, cause the population to shrink, etc. 
The other is psychological selection, meaning the process by which some cultural traits dwindle and others spread because of their appeal, utility, plausibility, and capacity for imitation by human minds. Both sets of selective forces are, of course, highly imperfect; both are subject to all of the adaptive limitations imposed by finite time, trait linkage, path dependence, and so on. When a two-fold selection criterion is fully and explicitly incorporated into Hayek's theory of cultural evolution, the theory can more easily be squared with Hayek's theory of spontaneous order. The idea of a dual selective mechanism also provides a ready defense against the charge that his theory conflicts with the principles of methodological individualism. Stripped of all Panglossian implications, real or imagined by critics, Hayek's theory of cultural evolution may provide a powerful tool for the analyst searching for a critical theory of social development and the growth of institutions. [This strikes me as exactly right, namely that combining group selection with spontaneous order will result in a terrific way of looking at social institutions. Don't forget there are other constraints, even those imposed by physics. There's a fine little book, C.J. Pennycuick, _Newton Rules Biology: A physical approach to biological problems_ (Oxford UP, 1992), that should probably be read by all.]

Notes

1. The author wishes to thank Roger Koppl, Mario Rizzo, an anonymous referee, and participants at the Austrian Economics Colloquium at New York University for their useful comments and suggestions.
2. In Voltaire's novel Candide, the eminent Dr. Pangloss maintained that we live in the best of all possible worlds. "It is proved," he said, "that things cannot be other than they are, for since everything is made for a purpose, it follows that everything is made for the best purpose" (Voltaire 1947 [1759]: 20).
3. Hayek (1960: 59); Hayek (1973: 23); Hayek (1979: 154); Hayek (1988: 23f.).
4.
Hayek (1988: 26). Indeed, Hayek argues that the same principles are applicable to the study of all complex orders: "We understand now that all enduring structures above the level of the simplest atoms, and up to the brain and society, are the results of, and can be explained only in terms of, processes of selective evolution..." Hayek (1979: 158).
5. See Gould and Lewontin (1994: 78f.).
6. Examples of types of "efficiency" defined independently of the selective forces at work might include conformity to an aesthetic standard, or consistency with an ideological viewpoint such as classical liberalism.
7. Dodson and Dodson (1985: 213).
8. The fact that detrimental path dependence is possible in an evolutionary system does not necessarily mean it is common. Some of the most famous examples of detrimental path dependence in economics, such as the alleged inferiority of the QWERTY keyboard, have turned out to be unfounded. See Liebowitz and Margolis (1990).
9. See Kley (1994) for examples.
10. This does not mean that a socialist system could not be created in the first place, only that it could not work the way its proponents suggest it would. As noted earlier (in section 3.1), an evolutionary system is capable of a form of retrogression when no-longer-adaptive traits have been superseded but not weeded out. Hayek attributes the collectivist impulse behind socialist schemes to a misapplication of small group morals to the extended order that evolved later.
11. The Eastern European economies that are attempting to transform themselves into market economies after the socialist experiment may face similar problems of trying to implement a "jump" from one path to another. They have the advantage, however, of knowing from observation of existing market economies that a market economy is at least possible (an advantage not shared by the socialists early in this century, who tried to engineer a jump to a purely hypothetical socialist economy).
12.
My distinction between psychological and environmental selection parallels Cavalli-Sforza and Feldman's distinction between "cultural" and "Darwinian" selection, which they define as follows: "...cultural selection refers to the acquisition of a cultural trait, while Darwinian selection refers to the actual test by survival and fertility of the advantages of having or not having the trait" (Cavalli-Sforza and Feldman 1981: 16). I have chosen not to adopt their terminology because their use of the word "cultural" might be misleading. I use the word "cultural" to refer to all traits that are not transmitted genetically, and I use "psychological" and "environmental" to refer to the selective forces that impinge on cultural traits.
13. See Sober (1994: 482-4).
14. It has been suggested to me that simple cost-benefit analysis would be sufficient to explain the drop in European birth rates. But this explanation begs the question: the whole issue is which costs and benefits may be considered. The environmental (i.e., strictly biological) costs and benefits clearly pointed toward more child-bearing (since better sanitation, food, etc., made children easier and cheaper to sustain). A lower birth rate could only have arisen from "cost-benefit analysis," then, if some psychological costs and benefits could also come into play.
15. Quoted in Hodgson (1991): 70.

References

[The anthology by Elliott Sober is comprehensive, often too specialized for me, and cheap considering its thickness. These are articles that should, as Mr. Vining used to say, be read "bolt upright in a hard chair."]

Bradie, M. (1994) "Epistemology from an Evolutionary Point of View." In: Sober, E. (ed.) Conceptual Issues in Evolutionary Biology, 2nd ed., Cambridge, Mass.: The MIT Press.
Buchanan, J. M. (1975) The Limits of Liberty: Between Anarchy and Leviathan. Chicago, Ill.: University of Chicago Press.
Cavalli-Sforza, L., and Feldman, M.
(1981) Cultural Transmission and Evolution: A Quantitative Approach. Princeton, N.J.: Princeton University Press.
Dawkins, R. (1976) The Selfish Gene. New York: Oxford University Press.
De Vlieghere, M. (1994) "A Reappraisal of Friedrich A. Hayek's Cultural Evolutionism." Economics and Philosophy 10(2): 285-304.
Dillon, L. S. (1978) Evolution: Concepts and Consequences, 2nd ed. St. Louis: The C.V. Mosby Company.
Dodson, E. O., and Dodson, P. (1985) Evolution: Process and Product. Boston: Prindle, Weber & Schmidt.
Eldredge, N. (1985) Unfinished Synthesis: Biological Hierarchies and Modern Evolutionary Thought. New York: Oxford University Press.
Gould, S. J., and Lewontin, R. C. (1994) "The Spandrels of San Marco and the Panglossian Paradigm: A Critique of the Adaptationist Programme." In: Sober, E. (ed.) Conceptual Issues in Evolutionary Biology.
Gray, J. (1989) Liberalisms. London: Routledge.
Hallerod, B. (1992) "Friedrich August von Hayek--Apostle for Freedom?" Sociologisk Forskning 29(3): 12-34.
Hayek, F. A. (1960) The Constitution of Liberty. Chicago, Ill.: University of Chicago Press.
Hayek, F. A. (1973) Law, Legislation and Liberty, vol. 1: Rules and Order. Chicago, Ill.: University of Chicago Press.
Hayek, F. A. (1976) Law, Legislation and Liberty, vol. 2: The Mirage of Social Justice. Chicago, Ill.: University of Chicago Press.
Hayek, F. A. (1979) Law, Legislation and Liberty, vol. 3: The Political Order of a Free People. Chicago, Ill.: University of Chicago Press.
Hayek, F. A. (1988) The Fatal Conceit: The Errors of Socialism. Chicago, Ill.: University of Chicago Press.
Heath, E. (1992) "Rules, Function, and the Invisible Hand: An Interpretation of Hayek's Social Theory." Philosophy of the Social Sciences 22(1): 28-45.
Hodgson, G. M. (1991) "Hayek's Theory of Cultural Evolution: An Evaluation in the Light of Vanberg's Critique." Economics and Philosophy 7: 67-82.
Kitcher, P. (1994) "Four Ways of 'Biologizing' Ethics." In: Sober, E. (ed.)
Conceptual Issues in Evolutionary Biology.
Kley, R. (1994) Hayek's Social and Political Thought. Oxford: Clarendon Press.
Liebowitz, S. J., and Margolis, S. E. (1990) "The Fable of the Keys." Journal of Law and Economics 33 (April): 1-25.
Rowland, B. M. (1987) Ordered Liberty and the Constitutional Framework: The Political Thought of Friedrich A. Hayek. New York: Greenwood Press.
Smith, J. M. (1994) "Optimization Theory in Evolution." In: Sober, E. (ed.) Conceptual Issues in Evolutionary Biology.
Sober, E. (ed.) (1994) Conceptual Issues in Evolutionary Biology, 2nd ed., Cambridge, Mass.: The MIT Press.
Sober, E. (1994) "Models of Cultural Evolution." In: Sober, E. (ed.) Conceptual Issues in Evolutionary Biology.
Stebbins, G. L. (1977) Processes of Organic Evolution. Englewood Cliffs, N.J.: Prentice-Hall, Inc.
Stiglitz, J. (1991) Whither Socialism? Cambridge, Mass.: The MIT Press.
Tomlinson, J. (1990) Hayek and the Market. London: Pluto Press.
Vanberg, V. (1986) "Spontaneous Market Order and Social Rules: A Critical Examination of F. A. Hayek's Theory of Cultural Evolution." Economics and Philosophy 2: 75-100.
Voigt, S. "On the Internal Consistency of Hayek's Evolutionary Oriented Constitutional Economics--Some General Remarks." Journal des Economistes et des Etudes Humaines 3(4): 461-76.
Voltaire (F.-M. Arouet) [1947 (1759)] Candide. Translated by John Butt. Penguin Books.
Vromen, J. J. (1995) Economic Evolution: An Enquiry into the Foundations of New Institutional Economics. London: Routledge.
Wesson, R. G. (1991) Beyond Natural Selection. Cambridge, Mass.: The MIT Press.

From checker at panix.com Fri Jan 6 18:03:13 2006
From: checker at panix.com (Premise Checker)
Date: Fri, 6 Jan 2006 13:03:13 -0500 (EST)
Subject: [Paleopsych] CHE: In the Lab With the Dalai Lama
Message-ID:

In the Lab With the Dalai Lama
The Chronicle of Higher Education, 5.12.16
http://chronicle.com/weekly/v52/i17/17b01001.htm
By LEIGH E. SCHMIDT

Even the Dalai Lama's harshest critics at the Society for Neuroscience meeting last month, in Washington, would have to concede this much: Choosing the exiled Tibetan Buddhist leader to inaugurate the professional association's series on neuroscience and society certainly got people talking. Who would have thought that an announced lecture on "The Neuroscience of Meditation" would set off a protest petition gathering about 1,000 signatures, a counterpetition of support boasting nearly as many names, substantial coverage in The New York Times and on National Public Radio, as well as ample chatter in the blogosphere?

In a culture that likes its battles between science and religion to be loud, colorful, and Christian -- another nasty squabble, say, between evolutionists and creationists -- this controversy seemed unlikely to gain much traction. Yet as the dispute built momentum in the months leading up to the event, it soon became clear that the prospect of the red-robed Dalai Lama's urging the study of an ancient spiritual practice upon white-coated lab scientists would provide a newsworthy angle on the usual wrangling.

Playing upon tensions far less noticed than those that have plagued relations between science and conservative Christianity, the latest dust-up reveals the spirit wars that divide the knowledge class itself. How purely secular and naturalistic do the members of that class imagine themselves to be, and how committed are they to keeping religion at bay in their conference gatherings, university laboratories, civic institutions, newsrooms, and think tanks? In turn, is "spirituality" a back door through which religion gets to enter the conversation, now dressed in the suitably neutralized garb of meditation as a universalistic practice of inward peace and outreaching compassion?
Or does religion, even when soft-pedaled in the cosmopolitan language of spirituality and the contemplative mind, inevitably remain an embarrassment to those elites who stake their authority on secular rationality? The dispute roiling the neuroscience society over the past six months has brought such questions front and center. Inviting the Dalai Lama to speak at the meeting created two major border disputes. The first, of modest consequence to religion-and-science debates, was the conflict over the "political agenda" of the exiled Tibetan leader. In an international professional association that includes many Chinese scientists, some members were offended at the implied endorsement that the event gave to the Dalai Lama's larger cause of freedom for Tibetans. The second dispute, more insistently debated, was over religion's showing up -- so visibly, to boot -- at an annual meeting of neuroscientists. The almost visceral response by critics was to declare a total separation of religion and science, to wave the flag for the late-19th-century warfare between the two domains. "A science conference is not [an] appropriate venue for a religion-based presentation," a professor of anesthesia from the University of California at San Francisco remarked on the petition. "Who's next, the pope?" That sign-off question pointed to a second part of the strict separationist logic: Even if the Dalai Lama seemed pretty irenic as religious leaders go, he nonetheless represented a slippery slope into a mire of superstition and authoritarianism. (How else, some critics asked, were they to interpret his known affinities with reincarnation and monasticism?) "Today, the Dalai Lama; Tomorrow, Creationists?" wrote a professor of medicine at the University of Toronto, capturing perhaps the most commonplace anxiety given voice among the critics. Keep the society free of all religious discussion, or else the esteemed body might slide into the hell of a Kansas school-board meeting.
More interesting than the purists' boundary monitoring is the way the Dalai Lama and his defenders imagine through meditation an emerging meeting point for science and religion in contemporary culture. The headline study that served as the immediate source of intrigue surrounding his recent lecture was an article published last year in the Proceedings of the National Academy of Sciences and produced by researchers at the Waisman Laboratory for Brain Imaging and Behavior, at the University of Wisconsin at Madison. That group, led by the psychology professor Richard J. Davidson, has been studying long-term Tibetan Buddhist practitioners of meditation, comparing their brain-wave patterns with those of a control group. Davidson himself has been working in the science-religion borderlands for more than two decades and has been a leading collaborator with the Mind and Life Institute, in Boulder, Colo., one of the principal organizations encouraging the neuroscience-meditation dialogue. Shifting the focus of research from altered states of consciousness or momentary experiences of ecstasy, which so often concerned inquirers in the 1960s and 1970s, the Davidson group has been looking for evidence that sustained meditation causes actual neural changes in everyday patterns of cognition and emotion. In other words, they want to know if the brain function of long-term contemplatives is made demonstrably different through years of "mental training." And not just different, but better: That is, does the well-developed meditative mind sustain higher levels of compassion and calmness than the run-of-the-mill American noggin? Well, after testing eight long-time Tibetan Buddhist practitioners and 10 "healthy student volunteers," the researchers discovered that the 10,000 to 50,000 hours that the various monks had devoted to "mental training" appeared to make a real neurological difference. 
As the study's title put it, "Long-term meditators self-induce high-amplitude gamma synchrony during mental practice." Davidson and company, careful not to overreach in their conclusions, did suggest that practices of meditation, and the accompanying compassionate affect, were "flexible skills that can be trained." Did that mean contemplative practice could be abstracted from its religious context and then applied as a kind of public pedagogy? Were hopeful supporters wrong to read this as a tantalizing suggestion that meditation might prove beneficial not only for the mental health of Americans but also for the very fabric of society? Where, after all, couldn't we benefit from a little more "pure compassion," altruism, lovingkindness, and "calm abiding"? As novel as it may sound to monitor the brain waves of Tibetan Buddhist monks in university laboratories or on Himalayan hillsides (Davidson has done both), it is certainly not the first time that American psychologists have sought to re-engage the spiritual through the healthy-mindedness of meditation. At Wisconsin, Davidson occupies a research professorship named for Harvard's William James, the pioneering psychologist, psychical researcher, and philosopher of religion, and it is in the tradition of James that the current turn to the contemplative mind is best understood. Counter to the popular image of Americans as endlessly enterprising, agitated, and restless -- all busy Marthas, no reflective Marys -- James discerned a deep mystical cast to the American psyche and pursued that strain with uncommon intellectual devotion. Yet when it came to "methodical meditation," James saw little of it left among American Christians and turned instead to homegrown practitioners of various mind-over-matter cures. 
He particularly accented those "New Thought" metaphysicians who were pushing forward a dialogue with far-flung emissaries of yoga and Buddhist meditation in the wake of the World's Parliament of Religions, held in Chicago in 1893. Among James's favored practitioners of these newly improvised regimens of meditation was Ralph Waldo Trine, a Boston-based reformer with a knack for inspirational writing. In The Varieties of Religious Experience (1902), James used Trine's blockbuster In Tune With the Infinite (1897) as an epitome of the emergent practices of concentration, mental repose, and healthy-mindedness then percolating in New England and elsewhere across the country. Though an unabashed popularizer, Trine was not a lightweight. With an educational pedigree that ran from Knox College to the University of Wisconsin to the Johns Hopkins University, he moved easily in Harvard's wider metaphysical circles and energetically engaged various progressive causes. In much the same way that current studies promote the clinical applications of meditation, Trine emphasized the healthful benefits that accrued from cultivating a calm yet expectant mind. He had no scanners or electrodes, but he had the same hopes about improving the mental and physical health of Americans through elaborating a universal practice of meditation, one that transcended the particulars of any one religious tradition and represented a kind of cosmopolitan composite of all faiths. And while Trine did not have the Dalai Lama at hand, he did have extended contact with a well-traveled Sinhalese Buddhist monk, Anagarika Dharmapala, with whom he compared notes and devotional habits at a summer colony in Maine as he was putting together his own system of meditation for Americans. Like other inquirers then and now, Trine was all too ready to look to Asia for a practical antidote to American nervousness. 
The real payoff for Trine, as it is for Davidson and his colleagues, was not established simply through a calculus of productivity or cheerfulness: Would encouraging meditation or other visualization techniques make people more alert and proficient at the office or on the playing field? Would it make them feel happier and less disgruntled? Trine, like James and now Davidson, was finally more interested in saintliness and compassion than in helping stressed-out brain workers relax and concentrate. It is hard not to hear a hint of Davidson's pursuit of altruism in Trine's "spirit of infinite love," the moral imperative to "care for the weak and defenseless." And it is hard not to see that the world of William James and Ralph Waldo Trine is alive and well as American investigators wire up Tibetan Buddhist hermits in a search for the powers of the concentrated mind, the mental disciplines of harmony, compassion, and peace that might make the world a marginally kinder, less selfish place. That optimism about human nature -- that the mind has deep reservoirs of potential for empathy and altruism -- had a lot more backing among liberals and progressives in 1900 than it does today. Still, the considerable hopes now invested in meditation suggest that the old romantic aspirations, spiritual and otherwise, continue to flourish, especially among members of the mind-preoccupied knowledge class. Perhaps the most important dimension of the Dalai Lama's turn to the laboratory is the notion that the religion-science wound will be salved through recasting religion as spirituality. The Nobel laureate's latest book explicitly suggests as much in its title, The Universe in a Single Atom: The Convergence of Science and Spirituality. In doing so, he expressly appeals to all those Americans who fear fundamentalist incarnations of religion and who instead cast themselves as intellectually curious and spiritually seeking.
Religion, on this model, is not a domain of authority competing with science but an inward terrain of personal experience and individual probing. Spirituality, the Dalai Lama writes, "is a human journey into our internal resources." Representing "the union of wisdom and compassion," it shares with science a progressive hope for "the betterment of humanity." In those terms, religion as spirituality becomes the handmaiden of science itself, joining it in an open quest for knowledge, empirical and pragmatic, unconstrained by ancient creeds, cosmologies, or churches. In such exhortations the Dalai Lama shows a fine, intuitive feel for much of American intellectual and religious life, but he is hardly telling today's Emersonian inquirers something about the universe that they do not already affirm. A practice of meditation made palatable to scientists, secularists, and seekers would no doubt look pallid to all those monks, hermits, and saints who have taken it to be an arduous and ascetic discipline. Still, the American pursuit of "spirituality," reaching a crescendo in the past two decades, has been all too easy to dismiss as paltry and unsubstantial, labeled as foreign and threatening to more-orthodox versions of a Christian America. In this often-charged religious environment, the Dalai Lama has astutely laid hold of the science-spirituality nexus as a cultural foothold. As he has discovered in this latest brouhaha, that move has hardly lifted him above the wider debates, whether about materialism or intelligent design, but it has allowed him to connect with America's more cosmopolitan and progressive religious impulses. When William James was asked directly in 1904, "What do you mean by 'spirituality'?," he replied: "Susceptibility to ideals, but with a certain freedom to indulge in imagination about them." In mingling with neuroscientists who have warmed to his talk of spirituality, the Dalai Lama may well have found his own avatars of William James. Leigh E. 
Schmidt is a professor of religion at Princeton University and author of Restless Souls: The Making of American Spirituality (HarperSanFrancisco, 2005). From checker at panix.com Fri Jan 6 18:03:21 2006 From: checker at panix.com (Premise Checker) Date: Fri, 6 Jan 2006 13:03:21 -0500 (EST) Subject: [Paleopsych] NS: Is string theory in trouble? Message-ID: Is string theory in trouble? http://www.newscientist.com/article.ns?id=mg18825305.800&print=true * 17 December 2005 * Amanda Gefter Ever since Albert Einstein wondered whether the world might have been different, physicists have been searching for a theory of everything to explain why the universe is the way it is. Now string theory, one of today's leading candidates, is in trouble. A growing number of physicists claim it is ill-defined and based on crude assumptions. Something fundamental is missing, they say. The main complaint is that rather than describing one universe, the theory describes 10^500, each with different constants of nature, even different laws of physics. But physicist Leonard Susskind, one of the inventors of string theory, sees this landscape of universes as a solution rather than a problem. He says it could answer the most perplexing question in physics: why the value of the cosmological constant, which describes the expansion rate of the universe, appears improbably fine-tuned for life. A little bigger or smaller and life could not exist. With an infinite number of universes, says Susskind, there is bound to be one with a cosmological constant like ours. The idea is controversial, because it changes how physics is done, and it means that the basic features of our universe are just a random luck of the draw. He explains to Amanda Gefter why he thinks it's a possibility we cannot ignore. Why are physicists taking the idea of multiple universes seriously now? First, there was the discovery in the past few years that inflation seems right.
This theory that the universe expanded spectacularly in the first fraction of a second fits a lot of data. Inflation tells us that the universe is probably extremely big and necessarily diverse. On sufficiently big scales, and if inflation lasts long enough, this diversity will produce every possible universe. The same process that forged our universe in a big bang will happen over and over. The mathematics are rickety, but that's what inflation implies: a huge universe with patches that are very different from one another. The bottom line is that we no longer have any good reason to believe that our tiny patch of universe is representative of the whole thing. Second was the discovery that the value of the cosmological constant - the energy of empty space which contributes to the expansion rate of the universe - seems absurdly improbable, and nothing in fundamental physics is able to explain why. I remember when Steven Weinberg first suggested that the cosmological constant might be anthropically determined - that it has to be this way otherwise we would not be here to observe it. I was very impressed with the argument, but troubled by it. Like everybody else, I thought the cosmological constant was probably zero - meaning that all the quantum fluctuations that make up the vacuum energy cancel out, and gravity alone affects the expansion of the universe. It would be much easier to explain if they cancelled out to zero, rather than to nearly zero. The discovery that there is a non-zero cosmological constant changed everything. Still, those two things were not enough to tip the balance for me. What finally convinced you? The discovery in string theory of this large landscape of solutions, of different vacuums, which describe very different physical environments, tipped the scales for me. At first, string theorists thought there were about a million solutions. 
Thinking about Weinberg's argument and about the non-zero cosmological constant, I used to go around asking my mathematician friends: are you sure it's only a million? They all assured me it was the best bet. But a million is not enough for anthropic explanations - the chances of one of the universes being suitable for life are still too small. When Joe Polchinski and Raphael Bousso wrote their paper in 2000 that revealed there are more like 10^500 vacuums in string theory, that to me was the tipping point. The three things seemed to be coming together. I felt I couldn't ignore this possibility, so I wrote a paper saying so. The initial reaction was very hostile, but over the past couple of years people are taking it more seriously. They are worried that it might be true. Steven Weinberg recently said that this is one of the great sea changes in fundamental science since Einstein, that it changes the nature of science itself. Is it such a radical change? In a way it is very radical but in another way it isn't. The great ambition of physicists like myself was to explain why the laws of nature are just what they are. Why is the proton just about 1800 times heavier than the electron? Why do neutrinos exist? The great hope was that some deep mathematical principle would determine all the constants of nature, like Newton's constant. But it seems increasingly likely that the constants of nature are more like the temperature of the Earth - properties of our local environment that vary from place to place. Like the temperature, many of the constants have to be just so if intelligent life is to exist. So we live where life is possible. For some physicists this idea is an incredible disappointment. Personally, I don't see it that way. I find it exciting to think that the universe may be much bigger, richer and full of variety than we ever expected. And it doesn't seem so incredibly philosophically radical to think that some things may be environmental. 
In order to accept the idea that we live in a hospitable patch of a multiverse, must a physicist trade in that dream of a final theory? Absolutely not. No more than when physicists discovered that the radii of planetary orbits were not determined by some elegant mathematical equation, or by Kepler's idea of nested Platonic solids. We simply have to reassess which things will be universal consequences of the theory and which will be consequences of cosmic history and local conditions. So even if you accept the multiverse and the idea that certain local physical laws are anthropically determined, you still need a unique mega-theory to describe the whole multiverse? Surely it just pushes the question back? Yes, absolutely. The bottom line is that we need to describe the whole thing, the whole universe or multiverse. It's a scientific question: is the universe on the largest scales big and diverse or is it homogeneous? We can hope to get an answer from string theory and we can hope to get some information from cosmology. There is a philosophical objection called Popperism that people raise against the landscape idea. Popperism [after the philosopher Karl Popper] is the assertion that a scientific hypothesis has to be falsifiable, otherwise it's just metaphysics. Other worlds, alternative universes, things we can't see because they are beyond horizons, are in principle unfalsifiable and therefore metaphysical - that's the objection. But the belief that the universe beyond our causal horizon is homogeneous is just as speculative and just as susceptible to the Popperazzi. Could there be some kind of selection principle that will emerge and pick out one unique string theory and one unique universe? Anything is possible. My friend David Gross hopes that no selection principle will be necessary because only one universe will prove to make sense mathematically, or something like that. But so far there is no evidence for this view.
Even most of the hard-core adherents to the uniqueness view admit that it looks bad. Is it premature to invoke anthropic arguments - which assume that the conditions for life are extremely improbable - when we don't know how to define life? The logic of the anthropic principle requires the strong assumption that our kind of life is the only kind possible. Why should we presume that all life is like us - carbon-based, needs water, and so forth? How do we know that life cannot exist in radically different environments? If life could exist without galaxies, the argument that the cosmological constant seems improbably fine-tuned for life would lose all of its force. And we don't know that life of all kinds can't exist in a wide variety of circumstances, maybe in all circumstances. It's a valid objection. But in my heart of hearts, I just don't believe that life could exist in the interior of a star, for instance, or in a black hole. Is it possible to test the landscape idea through observation? One idea is to look for signs that space is negatively curved, meaning the geometry of space-time is saddle-shaped as opposed to flat or like the surface of a sphere. It's a long shot but not as unlikely as I previously thought. Inflation tells us that our observable universe likely began in a different vacuum state, that decayed into our current vacuum state. It's hard to believe that's the whole story. It seems more probable that our universe began in some other vacuum state with a much higher cosmological constant, and that the history of the multiverse is a series of quantum tunnelling events from one vacuum to another. If our universe came out of another, it must be negatively curved, and we might see evidence of that today on the largest scales of the cosmic microwave background. So the landscape, at least in principle, is testable. If we do not accept the landscape idea are we stuck with intelligent design? I doubt that physicists will see it that way.
If, for some unforeseen reason, the landscape turns out to be inconsistent - maybe for mathematical reasons, or because it disagrees with observation - I am pretty sure that physicists will go on searching for natural explanations of the world. But I have to say that if that happens, as things stand now we will be in a very awkward position. Without any explanation of nature's fine-tunings we will be hard pressed to answer the ID critics. One might argue that the hope that a mathematically unique solution will emerge is as faith-based as ID. Leonard Susskind Leonard Susskind is the Felix Bloch Professor of Theoretical Physics at Stanford University in California. His book Cosmic Landscape: String theory and the illusion of intelligent design is published this week by Little, Brown ($24.95, £14.33, ISBN 0316155799) From checker at panix.com Fri Jan 6 18:03:31 2006 From: checker at panix.com (Premise Checker) Date: Fri, 6 Jan 2006 13:03:31 -0500 (EST) Subject: [Paleopsych] AP: Bush pardons two Tennessee moonshiners Message-ID: Bush pardons two Tennessee moonshiners http://www.southernstandard.net/news.php?viewStoryPrinter=26654 [Mr. Mencken would have been mighty pleased by this. He would have been less pleased by the pardoning of a bank robber and a lawyer with Republican connections, which was not in the AP wire as printed in the Tennessee paper but was mentioned in the NYT write up of an AP report. Quite possibly both papers had the same report and simply chose different excerpts. [A fine Christmas gift to moonshiners!] KNOXVILLE, Tenn. President Bush has pardoned two Tennesseans convicted decades ago of moonshine charges. "It's a good Christmas present," said Charles E. McKinley, 75, of Pall Mall. Also pardoned was Carl E. Cantrell, 57, of Monteagle, who said, "It was the biggest relief I ever had." The pardons this week restore full U.S. citizenship to the men, including the rights to vote and buy a gun, their attorneys told The Knoxville News Sentinel.
But their records will reflect both the felony convictions and the pardons. Cantrell said in the mid-1960s he and two friends set up a still on the side of Monteagle Mountain, surrounded by trees. "It don't take a genius to make it," he said. Soon afterwards, the site was raided and he was convicted of Internal Revenue Service liquor law violations and given three years of probation. McKinley said he was driving a friend around in 1949 and they stopped at his friend's still to get something to drink. Driving away, they were arrested by the feds. McKinley was convicted of IRS liquor law violations and given two years of probation. He said he never intended to sell moonshine, only drink it. "I just got in the wrong place at the wrong time," he said. McKinley added that he may have to change his political support since he's been helped by a GOP president. "I'd almost be a Republican after that." Cantrell expressed surprise at the pardon. "Really, to tell you the truth, I thought that nothing would be done about it," he said. Cantrell said he is looking forward to being able to vote and buy a gun. "I wasn't trying to cause nobody no harm," he said. "I was just trying to make a living." Bush Grants 11 Pardons http://www.nytimes.com/2005/12/24/politics/24pardons.html By THE ASSOCIATED PRESS WASHINGTON, Dec. 23 (AP) - President Bush has granted 11 pardons, bringing to 69 the number of clemency orders he has issued since taking office five years ago, the Justice Department said Friday. Three moonshiners and a bank robber were among those pardoned, as was a Denver lawyer whose employer has Republican political ties. The pardons were issued Tuesday, in keeping with a tradition of granting clemency during the holiday season. All of the individuals applied for clemency, and their applications were reviewed by the Office of the Pardon Attorney and forwarded to the president for a final decision.
From checker at panix.com Fri Jan 6 18:03:41 2006 From: checker at panix.com (Premise Checker) Date: Fri, 6 Jan 2006 13:03:41 -0500 (EST) Subject: [Paleopsych] SW: Einstein and the Cosmological Constant Message-ID: History of Physics: Einstein and the Cosmological Constant http://scienceweek.com/2005/sw051230-2.htm The following points are made by Steven Weinberg (Physics Today 2005 November): 1) The mistakes made by leading scientists often provide a better insight into the spirit and presuppositions of their times than do their successes. In thinking of Einstein's mistakes, one immediately recalls what Einstein (in a conversation with George Gamow) called the biggest blunder he had made in his life: the introduction of the cosmological constant. After Einstein had completed the formulation of his theory of space, time, and gravitation -- the general theory of relativity -- he turned in 1917 to a consideration of the spacetime structure of the whole Universe. He then encountered a problem. Einstein was assuming that, when suitably averaged over many stars, the Universe is uniform and essentially static, but the equations of general relativity did not seem to allow a time-independent solution for a universe with a uniform distribution of matter. So Einstein modified his equations, by including a new term involving a quantity that he called the cosmological constant. Then it was discovered that the Universe is not static, but expanding. Einstein came to regret that he had needlessly mutilated his original theory. It may also have bothered him that he had missed predicting the expansion of the universe. 2) This story involves a tangle of mistakes, but not the one that Einstein thought he had made. First, the author (Weinberg) does not think that it can count against Einstein that he had assumed the Universe is static. With rare exceptions, theorists have to take the world as it is presented to them by observers. 
The relatively low observed velocities of stars made it almost irresistible in 1917 to suppose that the universe is static. Thus when Willem de Sitter (1872-1934) proposed an alternative solution to the Einstein equations in 1917, he took care to use coordinates for which the metric tensor is time-independent. However, the physical meaning of those coordinates is not transparent, and the realization that de Sitter's alternate cosmology was not static -- that matter particles in his model would accelerate away from each other -- was considered to be a drawback of the theory. 3) It is true that Vesto Melvin Slipher (1875-1969), while observing the spectra of spiral nebulae in the 1910s, had found a preponderance of redshifts of the sort that would be produced in an expansion by the Doppler effect, but no one then knew what the spiral nebulae were; it was not until Edwin Hubble (1889-1953) found faint Cepheid variables in the Andromeda Nebula in 1923 that it became clear that spiral nebulae were distant galaxies, clusters of stars far outside our own galaxy. The author (Weinberg) does not know if Einstein had heard of Slipher's redshifts by 1917, but in any case he knew very well about at least one other thing that could produce a redshift of spectral lines: a gravitational field. 4) It should be acknowledged here that Arthur Eddington (1882-1944), who had learned about general relativity during World War I from de Sitter, did in 1923 interpret Slipher's redshifts as due to the expansion of the Universe in the de Sitter model. Nevertheless, the expansion of the Universe was not generally accepted until Hubble announced in 1929 -- and actually showed in 1931 -- that the redshifts of distant galaxies increase in proportion to their distance, as would be expected for a uniform expansion. Only then was much attention given to the expanding-universe models introduced in 1922 by Alexander Friedmann (1888-1925), in which no cosmological constant is needed. 
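[The proportionality Hubble announced is exactly what a uniform expansion predicts: if every length is scaled by the same factor a(t), each observer sees every other point recede at a velocity proportional to its distance, with the same ratio H = (da/dt)/a everywhere. A minimal numerical sketch, using made-up illustrative values rather than anything from the article:]

```python
import numpy as np

# Galaxies on a uniformly expanding grid. Comoving positions x stay fixed;
# physical positions are r(t) = a(t) * x.
x = np.array([1.0, 2.5, 7.0, 42.0])   # comoving distances (arbitrary units)

a, a_dot = 1.0, 0.07                   # scale factor and its time derivative (illustrative)

r = a * x                              # physical distances now
v = a_dot * x                          # recession velocities, dr/dt = (da/dt) * x

# The velocity-to-distance ratio is identical for every galaxy:
# the Hubble constant H = a_dot / a, independent of which galaxy you pick.
H = v / r
print(H)
```

[Every entry of H comes out equal, which is why a redshift-distance proportionality is the signature of uniform expansion rather than of any special position in space.]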
In 1917 it was quite reasonable for Einstein to assume that the Universe is static. Physics Today http://www.physicstoday.org -------------------------------- Related Material: ON QUINTESSENCE AND THE EVOLUTION OF THE COSMOLOGICAL CONSTANT The following points are made by P.J.E. Peebles (Nature 1999 398:25): 1) Contrary to expectations, the evidence is that the Universe is expanding at approximately twice the velocity required to overcome the gravitational pull of all the matter the Universe contains. The implication of this is that in the past the greater density of mass in the Universe gravitationally slowed the expansion, while in the future the expansion rate will be close to constant or perhaps increasing under the influence of a new type of matter that some call "quintessence". 2) Quintessence began as Einstein's cosmological constant, Lambda. It has negative gravitational mass: its gravity pushes things apart. 3) Particle physicists later adopted Einstein's Lambda as a good model for the gravitational effect of the active vacuum of quantum physics, although the idea is at odds with the small value of Lambda indicated by cosmology. 4) Theoretical cosmologists have noted that as the Universe expands and cools, Lambda tends to decrease. As the Universe cools, symmetries among forces are broken, particles acquire masses, and these processes tend to release an analogue of latent heat. The vacuum energy density accordingly decreases, and with it the value of Lambda. Perhaps an enormous Lambda drove an early rapid expansion that smoothed the primeval chaos to make the near uniform Universe we see today, with a decrease in Lambda over time to its current value. This is the cosmological inflation concept. 
5) The author suggests that the recent great advances in detectors, telescopes, and observatories on the ground and in space have given us a rough picture of what happened as our Universe evolved from a dense, hot, and perhaps quite simple early state to its present complexity. Observations in progress are filling in the details, and that in turn is driving intense debate on how the behavior of our Universe can be understood within fundamental physics. Nature http://www.nature.com/nature -------------------------------- Notes by ScienceWeek: Active vacuum of quantum physics: This refers to the idea that the vacuum state in quantum mechanics has a zero-point energy (minimum energy) which gives rise to vacuum fluctuations, so the vacuum state does not mean a state of nothing, but is instead an active state. If a theory or process does not change when certain operations are performed on it, the theory or process is said to possess a symmetry with respect to those operations. For example, a circle remains unchanged under rotation or reflection, and a circle therefore has rotational and reflection symmetry. The term "symmetry breaking" refers to the deviation from exact symmetry exhibited by many physical systems, and in general, symmetry breaking encompasses both "explicit" symmetry breaking and "spontaneous" symmetry breaking. Explicit symmetry breaking is a phenomenon in which a system is not quite, but almost, the same for two configurations related by exact symmetry. Spontaneous symmetry breaking refers to a situation in which the solution of a set of physical equations fails to exhibit a symmetry possessed by the equations themselves. In general, the term "latent heat" refers to the quantity of heat absorbed or released when a substance changes its physical phase (e.g., solid to liquid) at constant temperature. 
The inflationary model, first proposed by Alan Guth in 1980, proposes that quantum fluctuations in the time period 10^(-35) to 10^(-32) seconds after time zero were quickly amplified into large density variations during the "inflationary" 10^(50) expansion of the Universe in that time frame. -------------------------------- Related Material: COSMOLOGY: ON THE COSMOLOGICAL CONSTANT PROBLEM The following points are made by Thomas Banks (Physics Today 2004 March): 1) Since the mid-1980s, astronomers and astrophysicists have been accumulating evidence that the expansion of the universe is accelerating. The simplest way to incorporate that acceleration into the description of cosmology, within the framework of general relativity, is to add a cosmological constant (CC) term to the Einstein equations. Before Edwin Hubble discovered the expansion of the universe, Albert Einstein had originally introduced such a term to obtain a static solution of his cosmological equations. After the cosmic expansion was discovered, Einstein considered his introduction of the CC to be the greatest mistake of his career. 2) Many physicists were reluctant to consider the CC as an explanation for astronomical data, because the value it would need to have is ridiculously small compared to current theoretical expectations -- some 10^(120) times too small. Theorists interpreted that discrepancy as an indication that they would one day find an elegant explanation for why the parameter was exactly zero. Although some still cling to that hope, the author concludes that observation has once again upset the expectations of overconfident theorists. 3) The framework that gives rise to the enormous mismatch between calculation and observation is called "effective quantum field theory in background spacetime", or EFT for short. EFT always involves a short distance cutoff scale below which the approximations of EFT break down. 
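The 10^(120) mismatch quoted above is easy to reproduce with back-of-envelope arithmetic. A minimal sketch in Python, assuming standard CODATA values for the constants and taking ~2.3 x 10^(-3) eV as a round value for the observed dark-energy scale (the exact exponent quoted in the literature varies between about 120 and 123 depending on conventions):

```python
import math

# Physical constants (SI units, CODATA values)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s
eV = 1.602176634e-19    # one electron volt in joules

# Planck length: the unique combination of G, hbar, and c with units of length
l_planck = math.sqrt(hbar * G / c**3)        # ~1.6e-35 m

# Planck energy, converted to eV: E_P = sqrt(hbar c^5 / G)
E_planck = math.sqrt(hbar * c**5 / G) / eV   # ~1.2e28 eV

# Naive EFT estimate: vacuum energy density ~ (Planck energy)^4.
# Observed dark-energy scale ~2.3e-3 eV (an assumed round value here).
E_obs = 2.3e-3  # eV
mismatch = (E_planck / E_obs)**4

print(f"Planck length ~ {l_planck:.2e} m")
print(f"Planck energy ~ {E_planck:.2e} eV")
print(f"mismatch ~ 10^{math.log10(mismatch):.0f}")
```

With these inputs the ratio comes out near 10^123; the commonly quoted "10^(120)" figure uses slightly different conventions for the cutoff, but the point stands either way.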
The natural length scale introduced by quantum gravity (QG) is the Planck length -- the combination of Newton's gravitational constant, Planck's constant, and the speed of light that has units of length. Naive dimensional analysis and explicit calculations in EFT suggest that the cosmological constant should be proportional to the fourth power of the corresponding Planck energy of about 10^(28) eV. That is 10^(120) times too big. 4) Any dynamical solution of the CC problem within EFT should involve particles whose mass is on the order of the energy scale of the CC, about 10^(-3) eV. There have been many published attempts to resolve the problem by invoking such particles, but all of them have failed. EFT does provide a loophole for resolving the CC problem: Apart from calculable contributions, there are contributions from energy scales higher than those corresponding to the cutoff. In principle, those two types of contributions can cancel, but from the EFT point of view, the cancellation to 1 part in 10^(120) would be incredibly fortuitous. The author believes that the resolution of the CC problem does not involve some clever trick in EFT. Rather, QG will force on theorists a fundamental revision of the rules of the game. This belief is not yet the accepted dogma of the field. There are as many ideas about how to solve the CC problem as there are theorists who think about it.(1-5) References (abridged): 1. G. 't Hooft, in Salamfestschrift: A Collection of Talks From the Conference on Highlights of Particle and Condensed Matter Physics, A. Ali, J. Ellis, S. Randjbar-Daemi eds., World Scientific, River Edge, NJ (1994), available at http://www.arXiv.org/abs/gr-qc/9310026; L. Susskind, J. Math. Phys. 36, 6377 (1995) 2. J. H. Schwarz, in Quantum Aspects of Gauge Theories, Supersymmetry, and Unification, A. Ceresole, C. Kounnas, D. Loest, S. Theisen, eds., Springer-Verlag, New York (1999), available at http://www.arXiv.org/abs/hep-th/9812037 3. T. 
Banks, in Strings, Branes, and Gravity: TASI 99, J. Harvey, S. Kachru, E. Silverstein, eds., World Scientific, River Edge, NJ (2001), available at http://www.arXiv.org/abs/hep-th/9911068; D. Bigatti, L. Susskind, http://www.arXiv.org/abs/hep-th/9712072; O. Aharony et al., Phys. Rep. 323, 183 (2000) 4. L. Susskind, in The Black Hole: 25 Years After, C. Teitelboim, J. Zanelli, eds., World Scientific, River Edge, NJ, (1998), available at http://www.arXiv.org/abs/hep-th/9309145; A. Sen, Nucl. Phys. B 440, 421 (1995); A. Strominger, C. Vafa, Phys. Lett. B 379, 99 (1996) 5. J. Bekenstein, Phys. Rev. D 7, 2333 (1973); 9, 3292 (1974); S. Hawking, Phys. Rev. D 13, 191 (1976) Physics Today http://www.physicstoday.org From checker at panix.com Fri Jan 6 18:03:50 2006 From: checker at panix.com (Premise Checker) Date: Fri, 6 Jan 2006 13:03:50 -0500 (EST) Subject: [Paleopsych] SW: On Social Learning in Insects Message-ID: Sociobiology: On Social Learning in Insects http://scienceweek.com/2005/sw051230-5.htm The following points are made by L. Chittka and E. Leadbeater (Current Biology 2005 15:R869): 1) The rapid expansion of the field of social learning in recent decades [1,2] has almost entirely bypassed the insects. But a close inspection of the literature reveals numerous cases where insects appear to learn by observation, eavesdrop on members of the same or different species, and even engage in teaching other members of a society. In fact, the first hint of observational learning by animals dates back to Darwin's field notes published by Romanes [3,4]. Darwin suggested that honeybees learn the art of nectar robbing -- extracting nectar from flowers via holes bitten into the tubes, without touching the flower's reproductive organs -- by observing bumblebees engaged in the activity. 
Experimental proof for this conjecture remains outstanding, but it is interesting to note that Darwin thought that observational learning might occur across, rather than within, species. 2) Early in the 20th century, researchers became aware that many adult phytophagous insects prefer host species that they themselves had fed on when they were larvae -- even where the insect species, as a whole, was a generalist with multiple acceptable hosts [5]. In what has become known as "Hopkins' host selection principle", it was thought that the larvae become conditioned to the chemosensory cues associated with food provided by their parents. This is a non-trivial suggestion, as the nervous system of a holometabolous insect is extensively rearranged and rewired during metamorphosis; nevertheless, there have been convincing studies to show that such pre-imaginal conditioning indeed occurs. This shows that insect parents can pass on valuable information about suitable food types to their offspring, simply by placing eggs on suitable host plants, or by provisioning eggs with certain food types. In a similar vein, researchers have considered the possibility of "traditions" being established in honeybee colonies. Foragers can be trained to feed at a certain time of day, and it was shown that these learned temporal preferences are picked up by larvae via vibratory cues. The individuals so taught will display the same preferences when they themselves become foragers. 3) One of the most spectacular examples of social learning occurs in the honeybee dances. Inside the darkness of the hive, successful foragers display a series of stereotypical motor behaviors which inform other foragers of the precise location of floral food, up to several kilometers away from the hive. Dancers essentially "teach" recruits by putting them through a symbolized version of the "real life" flight to the food source. 
Recruits memorize and decode the information delivered in the dances, and subsequently apply the information on the flight to the indicated food source. Note that this constitutes a form of observational (unrewarded) learning: while dancers occasionally give food samples to recruits by regurgitating food, these food samples are not a prerequisite for successful information transmission. Such mouth-to-mouth contacts between bees, however, serve another function in the context of social learning: successful foragers can teach their nestmates the scent of the food they have located. 4) With the exception of Darwin's suggestion that honeybees might copy bad habits from bumblebees, the examples above are all cases where the transmission of information is of mutual interest, for example between parents and offspring, or between members of a colony of related individuals. A recent focus in social influences on learning, however, concerns cases where individuals inadvertently leave cues that can be used as publicly available information by other individuals for adaptive behavior [2]. A relatively simple form is local enhancement, where animals are drawn to sites where conspecifics are present [1]. The newcomers may then learn, on their own, that the site contains valuable food, for example in Vespid wasps. Bumblebees are attracted to members of the same species when they scout for a novel flower species, and can learn about suitable food sources by observational learning from unrelated individuals, without the necessity of direct interaction with these individuals, and without the presence of rewards. This means that bees, by observing the activities of other foragers, can bypass the substantial costs of exploring multiple food sources by individual initiative. References (abridged): 1. Galef, B.G. and Laland, K.N. (2005). Social learning in animals: Empirical studies and theoretical models. Bioscience 55, 489-499 2. Danchin, E., Giraldeau, L.A., Valone, T.J., and Wagner, R.H. 
(2004). Public information: From nosy neighbors to cultural evolution. Science 305, 487-491 3. Romanes, G.J. (1884). Mental evolution in animals. AMS Press, New York 4. Galef, B.G. (1996). Introduction. In: Heyes, C.M., Galef, B.G. (Eds.), Social learning in animals. Academic Press, San Diego 5. Hopkins, A.D. (1917). A discussion of C.G. Hewitt's paper on 'Insect Behavior'. J. Econ. Entomol. 10, 92-93 Current Biology http://www.current-biology.com -------------------------------- Related Material: ON ALTRUISM OF INDIVIDUALS IN INSECT SOCIETIES The following points are made by Edward O. Wilson (citation below): 1) Altruism is self-destructive behavior performed for the benefit of others. The use of the word altruism in biology has been faulted by Williams and Williams (1957), who suggest that the alternative expression "social donorism" is preferable because it has less gratuitous emotional flavor. Even so, altruism has been used as a term in connection with evolutionary argumentation by Haldane (1932) and rigorous genetic theory by Hamilton (1964), and it has the great advantage of being instantly familiar. The self-destruction can range in intensity all the way from total bodily sacrifice to a slight diminishment of reproductive powers. Altruistic behavior is of course commonplace in the responses of parents toward their young. It is far less frequent, and for our purposes much more interesting, when displayed by young toward their parents or by individuals toward siblings or other, more distantly related members of the same species. Altruism is a subject of importance in evolution theory because it implies the existence of group selection, and its extreme development in the social insects is therefore of more than ordinary interest. 
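The "rigorous genetic theory by Hamilton (1964)" cited here is usually condensed into Hamilton's rule: an altruistic trait can be favored by selection when r x b > c, where r is the actor's genetic relatedness to the beneficiary, b the fitness benefit conferred, and c the fitness cost to the actor. A minimal sketch (the numerical values below are illustrative, not taken from the text):

```python
# Hamilton's rule: altruism is favored when r*b > c.
#   r = genetic relatedness between actor and beneficiary
#   b = fitness benefit to the beneficiary
#   c = fitness cost to the actor

def altruism_favored(r: float, b: float, c: float) -> bool:
    """Return True if Hamilton's rule predicts the altruistic trait spreads."""
    return r * b > c

# Full-sister workers in haplodiploid insects such as honeybees: r = 0.75
print(altruism_favored(r=0.75, b=2.0, c=1.0))  # True: 0.75 * 2.0 = 1.5 > 1.0
# The same act directed at an unrelated individual: r = 0
print(altruism_favored(r=0.0, b=2.0, c=1.0))   # False: 0 > 1.0 fails
```

The unusually high relatedness (r = 0.75) among full-sister hymenopteran workers is one classical reason worker self-sacrifice of the kind Wilson catalogs below can pay off genetically.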
The great scope and variety of the phenomenon in the social insects is best indicated by citing a few concrete examples: a) The soldier caste of most species of termites and ants is virtually limited in function to colony defense. Soldiers are often slow to respond to stimuli that arouse the rest of the colony, but, when they do, they normally place themselves in the position of maximum danger. When nest walls of higher termites such as Nasutitermes are broken open, for example, the white, defenseless nymphs and workers rush inward toward the concealed depths of the nest, while the soldiers press outward and mill aggressively on the outside of the nest. Nutting (personal communication) witnessed soldiers of Amitermes emersoni in Arizona emerge from the nest well in advance of the nuptial flights, wander widely around the nest vicinity, and effectively tie up in combat all foraging ants that could have endangered the emerging winged reproductives. b) I have observed that injured workers of the fire ant Solenopsis saevissima leave the nest more readily and are more aggressive on the average than their uninjured sisters. Dying workers of the harvesting ant Pogonomyrmex badius tend to leave the nest altogether. Both effects may be no more than meaningless epiphenomena, but it is also likely that the responses are altruistic. To be specific, injured workers are useless for most functions other than defense, while dying workers pose a sanitary problem. c) Alarm communication, which is employed in one form or other throughout the higher social groups, has the effect of drawing workers toward sources of danger while protecting the queens, the brood, and the unmated sexual forms. d) Honeybee workers possess barbed stings that tend to remain embedded when the insects pull away from their victims, causing part of their viscera to be torn out and the bees to be fatally injured. 
A similar defensive maneuver occurs in many polybiine wasps, including Synoeca surinama and at least some species of Polybia and Stelopolybia and the ant Pogonomyrmex badius. The fearsome reputation of social bees and wasps in comparison with other insects is due to their general readiness to throw their lives away upon slight provocation. e) When fed exclusively on sugar water, honeybee workers can still raise larvae -- but only by metabolizing and donating their own tissue proteins. That this donation to their sisters actually shortens their own lives is indicated by the finding of de Groot (1953) that longevity in workers is a function of protein intake. f) Female workers of most social insects curtail their own egg laying in the presence of a queen, either through submissive behavior or through biochemical inhibition. The workers of many ant and stingless bee species lay special trophic eggs that are fed principally to the larvae and queen. g) The "communal stomach", or distensible crop, together with a specially modified proventriculus, forms a complex storage and pumping system that functions in the exchange of liquid food among members of the same colony in the higher ants. In both honeybees and ants, newly fed workers often press offerings of ingluvial food on nestmates without being begged, and they may go so far as to expend their supply to a level below the colony average. 2) These diverse physiological and behavioral responses are difficult to interpret in any way except as altruistic adaptations that have evolved through the agency of natural selection operating at the colony level. The list by no means exhausts the phenomena that could be placed in the same category. Adapted from: Edward O. Wilson: The Insect Societies. Harvard University Press 1971, p.321. -------------------------------- Related Material: ON EVOLUTIONARY BIOLOGY AND THE STUDY OF INSECTS The following points are made by Nipam H. Patel (Proc. Nat. Acad. Sci. 
2000 97:4442): 1) A great number of studies aimed at understanding the evolution of development have been carried out within insects. Without a doubt, this is largely because our detailed understanding of the genetic and molecular basis of pattern formation in the model insect, Drosophila melanogaster, provides an excellent starting point for a large number of comparative studies. In addition, insects are an evolutionarily diverse group of animals; almost one million species of insects have been described, and estimates of insect diversity place the total number of undescribed insect species at over 20 million. 2) More importantly, there is an enormous range of morphological and developmental diversity found within this group of animals, extending from spectacularly colored butterflies, to stick insects, to horned beetles, to wingless silverfish, to minuscule parasitic wasps. Over the last few years, evolutionary studies within the insects have ranged from characterizing the genetic and molecular changes responsible for reproductive isolation between closely related species of Drosophila, to comparing gene expression patterns to understand the developmental basis for variation in appendage number among differently related members of this group. 3) A number of investigations have also focused on the evolution of the developmental process of segmentation. Finally, recent studies in a variety of insects have revealed interesting molecular changes in the process of axis formation... It is particularly important that researchers continue to take advantage of as many different groups of insects as possible; this is the only way we can adequately address the evolutionary questions facing us. Proc. Nat. Acad. Sci. http://www.pnas.org From checker at panix.com Fri Jan 6 18:04:03 2006 From: checker at panix.com (Premise Checker) Date: Fri, 6 Jan 2006 13:04:03 -0500 (EST) Subject: [Paleopsych] NYT: Children Learn by Monkey See, Monkey Do. Chimps Don't. 
Message-ID: Children Learn by Monkey See, Monkey Do. Chimps Don't. http://www.nytimes.com/2005/12/13/science/13essa.html Essay By CARL ZIMMER I drove into New Haven on a recent morning with a burning question on my mind. How did my daughter do against the chimpanzees? A month before, I had found a letter in the cubby of my daughter Charlotte at her preschool. It was from a graduate student at Yale asking for volunteers for a psychological study. The student, Derek Lyons, wanted to observe how 3- and 4-year-olds learn. I was curious, so I got in touch. Mr. Lyons explained how his study might shed light on human evolution. His study would build on a paper published in the July issue of the journal Animal Cognition by Victoria Horner and Andrew Whiten, two psychologists at the University of St. Andrews in Scotland. Dr. Horner and Dr. Whiten described the way they showed young chimps how to retrieve food from a box. The box was painted black and had a door on one side and a bolt running across the top. The food was hidden in a tube behind the door. When they showed the chimpanzees how to retrieve the food, the researchers added some unnecessary steps. Before they opened the door, they pulled back the bolt and tapped the top of the box with a stick. Only after they had pushed the bolt back in place did they finally open the door and fish out the food. Because the chimps could not see inside, they could not tell that the extra steps were unnecessary. As a result, when the chimps were given the box, two-thirds faithfully imitated the scientists to retrieve the food. The team then used a box with transparent walls and found a strikingly different result. Those chimps could see that the scientists were wasting their time sliding the bolt and tapping the top. None followed suit. They all went straight for the door. The researchers turned to humans. They showed the transparent box to 16 children from a Scottish nursery school. 
After putting a sticker in the box, they showed the children how to retrieve it. They included the unnecessary bolt pulling and box tapping. The scientists placed the sticker back in the box and left the room, telling the children that they could do whatever they thought necessary to retrieve it. The children could see just as easily as the chimps that it was pointless to slide open the bolt or tap on top of the box. Yet 80 percent did so anyway. "It seemed so spectacular to me," Mr. Lyons said. "It suggested something remarkable was going on." It was possible, however, that the results might come from a simple desire in the children just to play along. To see how deep this urge to overimitate went, Mr. Lyons came up with new experiments with the transparent box. He worked with a summer intern, Andrew Young, a senior at Carnegie Mellon, to build other puzzles using Tupperware, wire baskets and bits of wood. And Mr. Lyons planned out a much larger study, with 100 children. I was intrigued. I signed up Charlotte, and she participated in the study twice, first at the school and later at Mr. Lyons's lab. Charlotte didn't feel like talking about either experience beyond saying they were fun. As usual, she was more interested in talking about atoms and princesses. Mr. Lyons was more eager to talk. He invited me to go over Charlotte's performance at the Yale Cognition and Development Lab, led by Mr. Lyons's adviser, Frank C. Keil. Driving into New Haven for our meeting, I felt as if Charlotte had just taken some kind of interspecies SAT. It was silly, but I hoped that Charlotte would show the chimps that she could see cause and effect as well as they could. Score one for Homo sapiens. At first, she did. Mr. Lyons loaded a movie on his computer in which Charlotte eagerly listened to him talk about the transparent plastic box. He set it in front of her and asked her to retrieve the plastic turtle that he had just put inside. 
Rather than politely opening the front door, Charlotte grabbed the entire front side, ripped it open at its Velcro tabs and snatched the turtle. "I've got it!" she shouted. A chimp couldn't have done better, I thought. But at their second meeting, things changed. This time, Mr. Lyons had an undergraduate, Jennifer Barnes, show Charlotte how to open the box. Before she opened the front door, Ms. Barnes slid the bolt back across the top of the box and tapped on it needlessly. Charlotte imitated every irrelevant step. The box ripping had disappeared. I could almost hear the chimps hooting. Ms. Barnes showed Charlotte four other puzzles, and time after time she overimitated. When the movies were over, I wasn't sure what to say. "So how did she do?" I asked awkwardly. "She's pretty age-typical," Mr. Lyons said. Having watched 100 children, he agrees with Dr. Horner and Dr. Whiten that children really do overimitate. He has found that it is very hard to get children not to. If they rush through opening a puzzle, they don't skip the extra steps. They just do them all faster. What makes the results even more intriguing is that the children understand the laws of physics well enough to solve the puzzles on their own. Charlotte's box ripping is proof of that. Mr. Lyons sees his results as evidence that humans are hard-wired to learn by imitation, even when that is clearly not the best way to learn. If he is right, this represents a big evolutionary change from our ape ancestors. Other primates are bad at imitation. When they watch another primate doing something, they seem to focus on what its goals are and ignore its actions. As human ancestors began to make complicated tools, figuring out goals might not have been good enough anymore. Hominids needed a way to register automatically what other hominids did, even if they didn't understand the intentions behind them. They needed to imitate. 
Not long ago, many psychologists thought that imitation was a simple, primitive action compared with figuring out the intentions of others. But that is changing. "Maybe imitation is a lot more sophisticated than people thought," Mr. Lyons said. We don't appreciate just how automatically we rely on imitation, because usually it serves us so well. "It is so adaptive that it almost never sticks out this way," he added. "You have to create very artificial circumstances to see it." In a few years, I plan to explain this experience to Charlotte. I want her to know what I now know. That it's O.K. to lose to the chimps. In fact, it may be what makes us uniquely human. From ljohnson at solution-consulting.com Sat Jan 7 05:42:53 2006 From: ljohnson at solution-consulting.com (Lynn D. Johnson, Ph.D.) Date: Fri, 06 Jan 2006 22:42:53 -0700 Subject: [Paleopsych] CHE: In the Lab With the Dalai Lama In-Reply-To: References: Message-ID: <43BF54DD.7020400@solution-consulting.com> Frank, This was hilarious! Imagine being against the Dalai Lama*! What Torquemadas these scientists are! *He is agnostic on the existence of God! How can one object? Lynn D. Johnson, Ph.D. Solutions Consulting Group 166 East 5900 South, Ste. B-108 Salt Lake City, UT 84107 Tel: (801) 261-1412; Fax: (801) 288-2269 Check out our webpage: www.solution-consulting.com Feeling upset? Order Get On The Peace Train, my new solution-oriented book on negative emotions. Premise Checker wrote: > In the Lab With the Dalai Lama > The Chronicle of Higher Education, 5.12.16 > http://chronicle.com/weekly/v52/i17/17b01001.htm > > By LEIGH E. SCHMIDT > > Even the Dalai Lama's harshest critics at the Society for > Neuroscience meeting last month, in Washington, would have > to concede this much: Choosing the exiled Tibetan Buddhist > leader to inaugurate the professional association's series > on neuroscience and society certainly got people talking. 
> Who would have thought that an announced lecture on "The > Neuroscience of Meditation" would set off a protest petition > gathering about 1,000 signatures, a counterpetition of > support boasting nearly as many names, substantial coverage > in The New York Times and on National Public Radio, as well > as ample chatter in the blogosphere? In a culture that likes > its battles between science and religion to be loud, > colorful, and Christian -- another nasty squabble, say, > between evolutionists and creationists -- this controversy > seemed unlikely to gain much traction. Yet as the dispute > built momentum in the months leading up to the event, it > soon became clear that the prospect of the red-robed Dalai > Lama's urging the study of an ancient spiritual practice > upon white-coated lab scientists would provide a newsworthy > angle on the usual wrangling. > > Playing upon tensions far less noticed than those that have > plagued relations between science and conservative > Christianity, the latest dust-up reveals the spirit wars > that divide the knowledge class itself. How purely secular > and naturalistic do the members of that class imagine > themselves to be, and how committed are they to keeping > religion at bay in their conference gatherings, university > laboratories, civic institutions, newsrooms, and think > tanks? In turn, is "spirituality" a back door through which > religion gets to enter the conversation, now dressed in the > suitably neutralized garb of meditation as a universalistic > practice of inward peace and outreaching compassion? Or does > religion, even when soft-pedaled in the cosmopolitan > language of spirituality and the contemplative mind, > inevitably remain an embarrassment to those elites who stake > their authority on secular rationality? The dispute roiling > the neuroscience society over the past six months has > brought such questions front and center. 
> > Inviting the Dalai Lama to speak at the meeting created two > major border disputes. The first, of modest consequence to > religion-and-science debates, was the conflict over the > "political agenda" of the exiled Tibetan leader. In an > international professional association that includes many > Chinese scientists, some members were offended at the > implied endorsement that the event gave to the Dalai Lama's > larger cause of freedom for Tibetans. The second dispute, > more insistently debated, was over religion's showing up -- > so visibly, to boot -- at an annual meeting of > neuroscientists. The almost visceral response by critics was > to declare a total separation of religion and science, to > wave the flag for the late-19th-century warfare between the > two domains. "A science conference is not [an] appropriate > venue for a religion-based presentation," a professor of > anesthesia from the University of California at San > Francisco remarked on the petition. "Who's next, the pope?" > That sign-off question pointed to a second part of the > strict separationist logic: Even if the Dalai Lama seemed > pretty irenic as religious leaders go, he nonetheless > represented a slippery slope into a mire of superstition and > authoritarianism. (How else, some critics asked, were they > to interpret his known affinities with reincarnation and > monasticism?) "Today, the Dalai Lama; Tomorrow, > Creationists?" wrote a professor of medicine at the > University of Toronto, capturing perhaps the most > commonplace anxiety given voice among the critics. Keep the > society free of all religious discussion, or else the > esteemed body might slide into the hell of a Kansas > school-board meeting. > > More interesting than the purists' boundary monitoring is > the way the Dalai Lama and his defenders imagine through > meditation an emerging meeting point for science and > religion in contemporary culture. 
The headline study that > served as the immediate source of intrigue surrounding his > recent lecture was an article published last year in the > Proceedings of the National Academy of Sciences and produced > by researchers at the Waisman Laboratory for Brain Imaging > and Behavior, at the University of Wisconsin at Madison. > That group, led by the psychology professor Richard J. > Davidson, has been studying long-term Tibetan Buddhist > practitioners of meditation, comparing their brain-wave > patterns with those of a control group. Davidson himself has > been working in the science-religion borderlands for more > than two decades and has been a leading collaborator with > the Mind and Life Institute, in Boulder, Colo., one of the > principal organizations encouraging the > neuroscience-meditation dialogue. > > Shifting the focus of research from altered states of > consciousness or momentary experiences of ecstasy, which so > often concerned inquirers in the 1960s and 1970s, the > Davidson group has been looking for evidence that sustained > meditation causes actual neural changes in everyday patterns > of cognition and emotion. In other words, they want to know > if the brain function of long-term contemplatives is made > demonstrably different through years of "mental training." > And not just different, but better: That is, does the > well-developed meditative mind sustain higher levels of > compassion and calmness than the run-of-the-mill American > noggin? Well, after testing eight long-time Tibetan Buddhist > practitioners and 10 "healthy student volunteers," the > researchers discovered that the 10,000 to 50,000 hours that > the various monks had devoted to "mental training" appeared > to make a real neurological difference. As the study's title > put it, "Long-term meditators self-induce high-amplitude > gamma synchrony during mental practice." 
Davidson and > company, careful not to overreach in their conclusions, did > suggest that practices of meditation, and the accompanying > compassionate affect, were "flexible skills that can be > trained." Did that mean contemplative practice could be > abstracted from its religious context and then applied as a > kind of public pedagogy? Were hopeful supporters wrong to > read this as a tantalizing suggestion that meditation might > prove beneficial not only for the mental health of Americans > but also for the very fabric of society? Where, after all, > couldn't we benefit from a little more "pure compassion," > altruism, lovingkindness, and "calm abiding"? > > As novel as it may sound to monitor the brain waves of > Tibetan Buddhist monks in university laboratories or on > Himalayan hillsides (Davidson has done both), it is > certainly not the first time that American psychologists > have sought to re-engage the spiritual through the > healthy-mindedness of meditation. At Wisconsin, Davidson > occupies a research professorship named for Harvard's > William James, the pioneering psychologist, psychical > researcher, and philosopher of religion, and it is in the > tradition of James that the current turn to the > contemplative mind is best understood. Counter to the > popular image of Americans as endlessly enterprising, > agitated, and restless -- all busy Marthas, no reflective > Marys -- James discerned a deep mystical cast to the > American psyche and pursued that strain with uncommon > intellectual devotion. Yet when it came to "methodical > meditation," James saw little of it left among American > Christians and turned instead to homegrown practitioners of > various mind-over-matter cures. He particularly accented > those "New Thought" metaphysicians who were pushing forward > a dialogue with far-flung emissaries of yoga and Buddhist > meditation in the wake of the World's Parliament of > Religions, held in Chicago in 1893. 
> > Among James's favored practitioners of these newly > improvised regimens of meditation was Ralph Waldo Trine, a > Boston-based reformer with a knack for inspirational > writing. In The Varieties of Religious Experience (1902), > James used Trine's blockbuster In Tune With the Infinite > (1897) as an epitome of the emergent practices of > concentration, mental repose, and healthy-mindedness then > percolating in New England and elsewhere across the country. > Though an unabashed popularizer, Trine was not a > lightweight. With an educational pedigree that ran from Knox > College to the University of Wisconsin to the Johns Hopkins > University, he moved easily in Harvard's wider metaphysical > circles and energetically engaged various progressive > causes. In much the same way that current studies promote > the clinical applications of meditation, Trine emphasized > the healthful benefits that accrued from cultivating a calm > yet expectant mind. He had no scanners or electrodes, but he > had the same hopes about improving the mental and physical > health of Americans through elaborating a universal practice > of meditation, one that transcended the particulars of any > one religious tradition and represented a kind of > cosmopolitan composite of all faiths. And while Trine did > not have the Dalai Lama at hand, he did have extended > contact with a well-traveled Sinhalese Buddhist monk, > Anagarika Dharmapala, with whom he compared notes and > devotional habits at a summer colony in Maine as he was > putting together his own system of meditation for Americans. > Like other inquirers then and now, Trine was all too ready > to look to Asia for a practical antidote to American > nervousness. 
> > The real payoff for Trine, as it is for Davidson and his > colleagues, was not established simply through a calculus of > productivity or cheerfulness: Would encouraging meditation > or other visualization techniques make people more alert and > proficient at the office or on the playing field? Would it > make them feel happier and less disgruntled? Trine, like > James and now Davidson, was finally more interested in > saintliness and compassion than in helping stressed-out > brain workers relax and concentrate. It is hard not to hear > a hint of Davidson's pursuit of altruism in Trine's "spirit > of infinite love," the moral imperative to "care for the > weak and defenseless." And it is hard not to see that the > world of William James and Ralph Waldo Trine is alive and > well as American investigators wire up Tibetan Buddhist > hermits in a search for the powers of the concentrated mind, > the mental disciplines of harmony, compassion, and peace > that might make the world a marginally kinder, less selfish > place. That optimism about human nature -- that the mind has > deep reservoirs of potential for empathy and altruism -- had > a lot more backing among liberals and progressives in 1900 > than it does today. Still, the considerable hopes now > invested in meditation suggest that the old romantic > aspirations, spiritual and otherwise, continue to flourish, > especially among members of the mind-preoccupied knowledge > class. > > Perhaps the most important dimension of the Dalai Lama's > turn to the laboratory is the notion that the > religion-science wound will be salved through recasting > religion as spirituality. The Nobel laureate's latest book > explicitly suggests as much in its title, The Universe in a > Single Atom: The Convergence of Science and Spirituality.
In > doing so, he expressly appeals to all those Americans who > fear fundamentalist incarnations of religion and who instead > cast themselves as intellectually curious and spiritually > seeking. Religion, on this model, is not a domain of > authority competing with science but an inward terrain of > personal experience and individual probing. Spirituality, > the Dalai Lama writes, "is a human journey into our internal > resources." Representing "the union of wisdom and > compassion," it shares with science a progressive hope for > "the betterment of humanity." In those terms, religion as > spirituality becomes the handmaiden of science itself, > joining it in an open quest for knowledge, empirical and > pragmatic, unconstrained by ancient creeds, cosmologies, or > churches. In such exhortations the Dalai Lama shows a fine, > intuitive feel for much of American intellectual and > religious life, but he is hardly telling today's Emersonian > inquirers something about the universe that they do not > already affirm. > > A practice of meditation made palatable to scientists, > secularists, and seekers would no doubt look pallid to all > those monks, hermits, and saints who have taken it to be an > arduous and ascetic discipline. Still, the American pursuit > of "spirituality," reaching a crescendo in the past two > decades, has been all too easy to dismiss as paltry and > unsubstantial, labeled as foreign and threatening to > more-orthodox versions of a Christian America. In this > often-charged religious environment, the Dalai Lama has > astutely laid hold of the science-spirituality nexus as a > cultural foothold. As he has discovered in this latest > brouhaha, that move has hardly lifted him above the wider > debates, whether about materialism or intelligent design, > but it has allowed him to connect with America's more > cosmopolitan and progressive religious impulses. 
When > William James was asked directly in 1904, "What do you mean > by 'spirituality'?," he replied: "Susceptibility to ideals, > but with a certain freedom to indulge in imagination about > them." In mingling with neuroscientists who have warmed to > his talk of spirituality, the Dalai Lama may well have found > his own avatars of William James. > > Leigh E. Schmidt is a professor of religion at Princeton > University and author of Restless Souls: The Making of > American Spirituality (HarperSanFrancisco, 2005). > _______________________________________________ > paleopsych mailing list > paleopsych at paleopsych.org > http://lists.paleopsych.org/mailman/listinfo/paleopsych > > From shovland at mindspring.com Sat Jan 7 15:55:04 2006 From: shovland at mindspring.com (Steve Hovland) Date: Sat, 7 Jan 2006 07:55:04 -0800 Subject: [Paleopsych] CHE: In the Lab With the Dalai Lama In-Reply-To: <43BF54DD.7020400@solution-consulting.com> Message-ID: Sounds like a bunch of very conventional people being threatened by an unusual idea... -----Original Message----- From: paleopsych-bounces at paleopsych.org [mailto:paleopsych-bounces at paleopsych.org]On Behalf Of Lynn D. Johnson, Ph.D. Sent: Friday, January 06, 2006 9:43 PM To: The new improved paleopsych list Subject: Re: [Paleopsych] CHE: In the Lab With the Dalai Lama Frank, This was hilarious! Imagine being against the Dalai Lama*! What Torquemadas these scientists are! *He is agnostic on the existence of God! How can one object? Lynn D. Johnson, Ph.D. Solutions Consulting Group 166 East 5900 South, Ste. B-108 Salt Lake City, UT 84107 Tel: (801) 261-1412; Fax: (801) 288-2269 Check out our webpage: www.solution-consulting.com Feeling upset? Order Get On The Peace Train, my new solution-oriented book on negative emotions. Premise Checker wrote: > In the Lab With the Dalai Lama > The Chronicle of Higher Education, 5.12.16 > http://chronicle.com/weekly/v52/i17/17b01001.htm > > By LEIGH E. 
SCHMIDT > > Even the Dalai Lama's harshest critics at the Society for > Neuroscience meeting last month, in Washington, would have > to concede this much: Choosing the exiled Tibetan Buddhist > leader to inaugurate the professional association's series > on neuroscience and society certainly got people talking. > Who would have thought that an announced lecture on "The > Neuroscience of Meditation" would set off a protest petition > gathering about 1,000 signatures, a counterpetition of > support boasting nearly as many names, substantial coverage > in The New York Times and on National Public Radio, as well > as ample chatter in the blogosphere? In a culture that likes > its battles between science and religion to be loud, > colorful, and Christian -- another nasty squabble, say, > between evolutionists and creationists -- this controversy > seemed unlikely to gain much traction. Yet as the dispute > built momentum in the months leading up to the event, it > soon became clear that the prospect of the red-robed Dalai > Lama's urging the study of an ancient spiritual practice > upon white-coated lab scientists would provide a newsworthy > angle on the usual wrangling. > > Playing upon tensions far less noticed than those that have > plagued relations between science and conservative > Christianity, the latest dust-up reveals the spirit wars > that divide the knowledge class itself. How purely secular > and naturalistic do the members of that class imagine > themselves to be, and how committed are they to keeping > religion at bay in their conference gatherings, university > laboratories, civic institutions, newsrooms, and think > tanks? In turn, is "spirituality" a back door through which > religion gets to enter the conversation, now dressed in the > suitably neutralized garb of meditation as a universalistic > practice of inward peace and outreaching compassion? 
Or does > religion, even when soft-pedalled in the cosmopolitan > language of spirituality and the contemplative mind, > inevitably remain an embarrassment to those elites who stake > their authority on secular rationality? The dispute roiling > the neuroscience society over the past six months has > brought such questions front and center. > > Inviting the Dalai Lama to speak at the meeting created two > major border disputes. The first, of modest consequence to > religion-and-science debates, was the conflict over the > "political agenda" of the exiled Tibetan leader. In an > international professional association that includes many > Chinese scientists, some members were offended at the > implied endorsement that the event gave to the Dalai Lama's > larger cause of freedom for Tibetans. The second dispute, > more insistently debated, was over religion's showing up -- > so visibly, to boot -- at an annual meeting of > neuroscientists. The almost visceral response by critics was > to declare a total separation of religion and science, to > wave the flag for the late-19th-century warfare between the > two domains. "A science conference is not [an] appropriate > venue for a religion-based presentation," a professor of > anesthesia from the University of California at San > Francisco remarked on the petition. "Who's next, the pope?" > That sign-off question pointed to a second part of the > strict separationist logic: Even if the Dalai Lama seemed > pretty irenic as religious leaders go, he nonetheless > represented a slippery slope into a mire of superstition and > authoritarianism. (How else, some critics asked, were they > to interpret his known affinities with reincarnation and > monasticism?) "Today, the Dalai Lama; Tomorrow, > Creationists?" wrote a professor of medicine at the > University of Toronto, capturing perhaps the most > commonplace anxiety given voice among the critics.
Keep the > society free of all religious discussion, or else the > esteemed body might slide into the hell of a Kansas > school-board meeting. > > More interesting than the purists' boundary monitoring is > the way the Dalai Lama and his defenders imagine through > meditation an emerging meeting point for science and > religion in contemporary culture. From checker at panix.com Sat Jan 7 20:49:49 2006 From: checker at panix.com (Premise Checker) Date: Sat, 7 Jan 2006 15:49:49 -0500 (EST) Subject: [Paleopsych] CPE: Andy Denis: Was Hayek a Panglossian Evolutionary Theorist? A Reply to Whitman Message-ID: Andy Denis: Was Hayek a Panglossian Evolutionary Theorist? A Reply to Whitman Constitutional Political Economy, 13, 275-285, 2002. [Lots of comments by me below.] andy.denis at city.ac.uk Department of Economics, City University, London, UK * Contact details and address for correspondence: Andy Denis, Department of Economics, School of Social and Human Sciences, City University, London, Northampton Square, LONDON, United Kingdom EC1V 0HB. Telephone: +44 (0)20-7040 0257; FAX: +44 (0)20-7040 8580. URL: http://www.city.ac.uk/andy/ Abstract.
By means of a consideration of Whitman (1998) the present paper considers the meanings of 'Panglossianism' and the relation between group and individual levels in evolution. It establishes the connection between the Panglossian policy prescription of laissez-faire and the mistaken evolutionary theory of group selection. Analysis of the passages in Hayek cited by Whitman shows that, once these passages are taken in context, and once the appropriate meaning of the term 'Panglossian' has been clarified, they fail to defend Hayek from this charge, but, on the contrary, confirm that Hayek was, indeed, 'a Panglossian evolutionary theorist'.

JEL classification: B31.

1. Introduction

This paper is a response to Whitman (1998) 'Hayek contra Pangloss on Evolutionary Systems', which seeks to exculpate Hayek from the charge of Panglossianism in his application of evolutionary theory to society. The present paper argues that Whitman has misunderstood the substance of the accusation of Panglossianism against Hayek.2 He may have been indirectly influenced by Gould and Lewontin (1979), which is widely assumed to identify Panglossianism with Darwinian adaptationism. Prior to that paper, the term Panglossian in evolutionary theory referred to the group selectionist fallacy, that groups could be selected in which individuals behaved altruistically. Thereafter, however, it was-- less appropriately-- taken to refer to Darwinian adaptationism, the view that features of organisms could be understood by asking what function they would best carry out.

The paper begins by considering the meanings of the term 'Panglossian' and the relation between group and individual levels in evolution. It establishes the connection between the Panglossian policy prescription of laissez-faire and the mistaken evolutionary theory of group selection. Attention then turns to an analysis of the passages in Hayek cited by Whitman.
The analysis shows that, once these passages are taken in context, and once the older, and more appropriate, meaning of the term Panglossian has been re-established, they do nothing to defend Hayek from this charge, but, on the contrary, provide compelling evidence that Hayek was, indeed, 'a Panglossian evolutionary theorist'. A final section summarises the findings of the paper.

2. The Meaning of the Term Panglossian in Evolutionary Theory

The phrase 'Pangloss's theorem' was first used in the debate about evolution ... not as a criticism of adaptive explanations, but specifically as a criticism of 'group-selectionist', mean-fitness-maximising arguments (John Maynard Smith cited in Dennett 1995:238-9).

Daniel Dennett (1995:238-9) argues for a distinction between Leibnizian and Panglossian paradigms, which he identifies in biology with the standpoints of individual and group adaptationism, respectively. Dennett regards the Leibnizian standpoint as the source of the bulk of our understanding of the living world. To understand a biological structure or phenomenon, the most fruitful approach is to 'reverse-engineer' it, to ask what purpose the structure would best serve were it the result of deliberate invention. No understanding of the heart, for example, is possible except on the hypothesis that it is there for a specific purpose: to pump blood around the body; similarly, the chain of structures from lungs to mitochondria can only be understood on the basis of the rôle these structures play in respiration. Now adaptationism works well, though imperfectly, at the level of the individual organism. A major source of these imperfections is the fact that the replicators, in whose interest inherited characteristics should be understood as operating, are, not individual organisms, but those organisms' genes. Once the individual organism is understood as a vehicle of the underlying replicator, the gene (Dawkins 1989b:82), many of these imperfections vanish.
Nevertheless, taking as a working hypothesis that the structure and behaviour of an organism is adaptive is a fruitful approach because all parts of the organism share a common genotype-- and hence a common interest-- which they can best realise by cooperating. The 'selfish organism' is close to the 'selfish gene' (Dawkins 1989a:6).

Panglossianism, on the other hand, according to Dennett, is the assumption of group selection-- 'the old Panglossian fallacy that natural selection favours adaptations that are good for the species as a whole, rather than acting at the level of the individual' (John Maynard Smith cited in Dennett 1995:239). The group selectionist argument has been succinctly expressed and criticised by Richard Dawkins:

A group, such as a species or a population within a species, whose individual members are prepared to sacrifice themselves for the welfare of the group, may be less likely to go extinct than a rival group whose individual members place their own selfish interests first. Therefore the world becomes populated mainly by groups consisting of self-sacrificing individuals. This is the theory of 'group selection' [expressed] in a famous book by V.C. Wynne-Edwards [Animal Dispersion in Relation to Social Behaviour] ... [But if] there is just one selfish rebel, prepared to exploit the altruism of the rest, then he, by definition, is more likely than they are to survive and have children. Each of these children will tend to inherit his selfish traits. After several generations [Note this "after several generations": Dawkins is invoking the "long run" or "t to infinity" metaphor.] of natural selection, the 'altruistic group' will be over-run by selfish individuals, and will be indistinguishable from the selfish group (Dawkins 1989a:7-8).
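Dawkins's invasion argument can be made concrete with a small replicator-dynamics sketch. (This is purely illustrative; the parameter values are made up and nothing here comes from Dawkins or Hayek.) Altruists pay a cost to produce a benefit shared by the whole group, so within any single group the selfish type is always fitter and spreads.

```python
def altruist_frequency(p, benefit=0.5, cost=0.2, generations=100):
    """Within-group replicator dynamics: altruists pay `cost` to add
    `benefit * p` to everyone's fitness, so selfish members are always
    fitter and the altruist frequency p can only decline."""
    for _ in range(generations):
        w_alt = 1.0 + benefit * p - cost   # altruist fitness
        w_self = 1.0 + benefit * p         # selfish 'rebel' fitness
        w_mean = p * w_alt + (1 - p) * w_self
        p = p * w_alt / w_mean             # replicator update
    return p

# Even starting from 99% altruists, selfish types take over the group.
print(altruist_frequency(0.99))
```

With these toy numbers the altruist share collapses toward zero within roughly a hundred generations, which is exactly Dawkins's point: group-beneficial behaviour is not stable inside the group unless some mechanism ties it to individual advantage.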
The members of a group, unlike the members of an organism, have diverse interests: each individual is set up to realise the interests of its own DNA, by getting that DNA copied as many times as it can into future generations, by using the structures, executing the behaviours and exemplifying the predispositions which have tended to achieve that goal in the past. The members of an organism share an interest in cooperation, those of a group, for lack of a shared interest, must, perforce, compete. Now, clearly, groups (populations, species) do die out, and whether a group happens to die out may depend on the behaviour of the individual members of that group. But for that fact to exert any evolutionary selective pressure, there must be a mechanism such that the behaviour, on the basis of which the group is to be selected, is actually in the interest of its members to carry out. So, according to Dennett and Maynard Smith, Panglossianism in evolutionary theory originally referred to the 'group selectionist fallacy'. Then later Gould and Lewontin used the term to refer, in the words of the title of their article (1979:581), to 'the adaptationist programme'. The Gould and Lewontin article has stirred considerable interest and controversy. Dennett's verdict on the paper is to read it as an attack on the excesses of adaptationism-- adaptationism as ideology rather than heuristic-- which has been massively misread as a refutation of adaptationism: Gould and Lewontin memorably dubbed the excesses of adaptationism the 'Panglossian Paradigm,' and strove to ridicule it off the stage of serious science ... The Gould and Lewontin article ... is widely regarded ... as some sort of refutation of adaptationism (Dennett 1995:239). Though not intending any direct reference to the Gould and Lewontin paper (Whitman, personal communication), it is in fact in this-- mistaken-- sense that Whitman responds to the charge of Panglossianism against Hayek. 
But it is in the other, the Maynard-Smithian, not the Gould-Lewontinian, sense that Hayek can sensibly be accused of Panglossianism. On this charge, Whitman's article does nothing to defend Hayek; on the contrary, Hayek's real Panglossianism is brought out clearly in the passages that Whitman cites. The big question is this: given that it is individual humans who choose their behaviour in the context of their inherited predispositions and capacities, and the range of social norms and examples of behaviour presented to them, can behaviours be systematically selected which are beneficial for the group or society of humans but impose a cost on the individuals carrying out those behaviours? Hayek gives an unambiguous yes, he refers repeatedly and approvingly to 'group selection', and supports his argument with reference to the very book by Wynne-Edwards criticised by Dawkins in the passage cited above. Speaking of the rules of conduct in primitive human societies, he says that: the 'functions' which these rules serve we shall be able to discover only after we have reconstructed the overall order which is produced by actions in accordance with them ... all the individuals of the species which exist will behave in that manner because groups of individuals which have thus behaved have displaced those which did not do so (Hayek 1967:70). And in a footnote to this passage, Hayek refers the reader to the '[a]mple further illustrations of the kind of orders briefly sketched in this section ... in V.C. Wynne-Edwards, Animal Dispersion in Relation to Social Behaviour, Edinburgh, 1962' (Hayek 1967:70 n7). When we act, what we do is describable, if sufficiently regular, by a rule. But the question is whether the rule is an epiphenomenon, like the arrow formation of geese flying in each others' slipstreams, a pattern which emerges from generalising a large number of instances of the particular action, or whether the individual actions are executed because of the rule. 
In the first case, the collective outcome is just what happens to result from the actions of many individuals each following their own interests. In the second case, the actions of individuals are functional for the purposes served by the rule. The use of the term 'functions' in the passage cited -- albeit in quotation marks -- only illustrates Hayek's functionalism. As Hodgson says, Vanberg ... is right to suggest that Hayek's argument has a functionalist quality; it assumes that the contribution of a rule to the maintenance of a system is sufficient to explain the existence of that rule. Absent in Hayek's argument is the specification of a process by which a rule that is advantageous to the system is sustained in operation within that system (Hodgson 1993:168). It is a basic assumption in Hayek that individual actions serve a 'function' for the collective, that is, that in carrying through one's own interest, one is simultaneously (and more importantly) carrying through the interest of society; that actions performed by individuals are automatically functional for society. This is to assume all our problems -- theoretical and practical -- away. So, to the question whether behaviours can be systematically selected which are beneficial for the group or society of humans but impose a cost on the individuals carrying out those behaviours, Hayek answers yes. But in reality the correct answer is no, and for exactly the same reasons as in the biological context. If a population or group of humans follows rules for altruistic behaviour, it may prosper and expand at the expense of a similar population following a different, more selfish rule. But if, in that altruistic population, a rebel adopts the selfish rule, the rule coding for the more selfish behaviour, then he will prosper relative to the more altruistic members of the population. 
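The invasion dynamic just described can be made concrete with a toy replicator-style simulation. This is a sketch only: the cost and benefit figures, and the public-goods payoff structure, are illustrative assumptions of mine, not drawn from Hayek, Maynard Smith or the article.

```python
# Toy simulation of a selfish 'rebel' invading an altruistic population.
# Payoff numbers (b, c) are illustrative assumptions, not from the source.

def step(p_selfish, b=0.3, c=0.1):
    """One generation of within-group selection.

    Each altruist pays a personal cost c and contributes to a shared
    benefit; the benefit b * (1 - p_selfish) accrues to everyone,
    selfish members included.
    """
    share = b * (1.0 - p_selfish)           # public good produced by altruists
    w_selfish = 1.0 + share                 # enjoys the benefit, pays nothing
    w_altruist = 1.0 + share - c            # same benefit, minus the cost
    mean_w = p_selfish * w_selfish + (1 - p_selfish) * w_altruist
    return p_selfish * w_selfish / mean_w   # replicator update

p = 0.01                                    # a single 'rebel' minority
history = [p]
for _ in range(500):
    p = step(p)
    history.append(p)

print(f"initial selfish share: {history[0]:.2f}")
print(f"after 500 generations: {history[-1]:.2f}")
```

Because the selfish variant always earns strictly more than the altruist within the group (it shares the benefit but skips the cost), its frequency rises monotonically towards fixation, whatever the consequences for the group's total output -- which is precisely the point pressed against Wynne-Edwards.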
The rule for the selfish behaviour will tend to displace its altruistic allele: [Note "tend to," another invocation of "the long run." Evolution is in fact a continuing *process,* in which indeed there may be competing forces, in this case, one for the individual and another for the group, though it may be quite common for the two processes to reinforce one another. In the "long run" altruistic individuals die out, but then again in the "long run" selfish groups die out, too. Until this "long run" is over, which it never is, groups with lots of selfish individuals may very well reproduce less than those with lots of altruistic individuals. The total number of selfish individuals across groups may go up, or it may go down. It all depends on the situation at hand. It is not a matter to be decided by invoking the "long run." When these debates were generating the most heat, that is, around the time _Unto Others_ was coming out, I asked the statisticians there if there was a way of measuring, in the field as opposed to theoretical exercises, the relative strengths of individual and group selection. There being no response, I presumed the statistical tools had not yet been developed. Whether they have since, I don't know.] Other members of the population will see its connection with personal success and wish to adopt it. Rules for socially desirable behaviour can only be successful to the extent that there is a mechanism giving individuals the incentive to engage in that behaviour. Wynne-Edwards group selection is as fallacious in the social as in the biological context, and for identical reasons. It is in this sense that Hayek can justly be criticised for 'Panglossianism'.

3. F.A. Hayek: No Panglossian?

The previous section presented the argument that what was wrong with Hayek's account of cultural evolution was that he applied to the social context the fallacious group selectionist theory of Wynne-Edwards. That is not how Whitman sees things, however. 
He defends Hayek, instead, against the charge of adaptationism, of Dennett's 'Leibnizian paradigm', and doesn't consider the differences between adaptationism applied at the level of the individual and at the level of the group. One section of Whitman's article in particular is germane to the discussion: Section 2, 'F.A. Hayek: No Panglossian' (47-49)3 -- the heart of Whitman's essay. Here Whitman cites four passages in which Hayek supposedly rejects Panglossianism. Each is worth considering in detail. The first such passage (48) is a citation from The Fatal Conceit, where Hayek seems to show himself aware of the problem with potentially Panglossian interpretations, and dissociates himself from them: I have no intention to commit what is often called the genetic or naturalistic fallacy. I do not claim that the results of group selection of traditions are necessarily "good" -- any more than I claim that other things that have survived in the course of evolution, such as cockroaches, have moral value (Hayek 1988:27). The context of this passage is a polemic with A.G.N. Flew over his 1967 booklet Evolutionary Ethics. Now, the notion of evolution deployed by Hayek was intended, I submit, not to provide a scientific understanding of the social order, warts and all, which has emerged from a blind, pitilessly indifferent evolutionary process (Dawkins 1995:155), but to present that order as something desirable, beyond the competence of humans to interfere with. 
The significance of Evolutionary Ethics here is that Flew got the apologetic role of evolution in this scheme of thought exactly right in his discussion of Social Darwinism: many people are inclined to believe, that whatever is in any sense natural must be as such commendable, and that Nature is a deep repository of wisdom, [so] for many the process of evolution by natural selection becomes a secular surrogate for Divine Providence; and ... for some the possibility, or even the duty, of relying on this benign and mighty force presents itself as a decisive reason why positive social policies must be superfluous, and may be wrong -- indeed almost blasphemous! (Flew 1967:15). So Hayek is responding to the charge, in Evolutionary Ethics, that those who thought along the lines actually adopted by himself were committing the 'naturalistic fallacy', deriving an ought from an is. Flew is clear that such a standpoint is Panglossian: if evolution leads to an institutional structure which has been selected for its beneficial influence on human societies, then certainly there will be excellent reason to leave that inherited institutional structure alone. Whitman cites the passage from The Fatal Conceit to support his view that Hayek rejects such Panglossianism. A number of points can be made about Hayek's response here. Firstly, we have to be quite clear that Hayek's statement about not committing the naturalistic fallacy is a claim on Hayek's part: it is not (necessarily) what he says, but what he says he says. Hayek himself points out that what scientists describe as their own procedure is not to be trusted: 'The scientist reflecting and theorizing about his procedure is not always a reliable guide' (Hayek 1942, cited in Ransom 1996). Hayek's claim here no more closes the matter than a suspect's denial that he robbed the bank eliminates him as a suspect. 
And the fact that Hayek is aware of the naturalistic fallacy is, again, no more evidence against his committing it than a suspected bank robber's agreement that robbing banks is illegal would be evidence of his innocence. Secondly, what is truly significant here is that Hayek refers to the 'group selection of traditions', because -- whatever claims Hayek chooses to make on the subject -- group selection of the Wynne-Edwards variety, to which, as we have seen, Hayek explicitly refers his theory, does lead to Panglossian conclusions. Wynne-Edwardsian group selection, as Maynard Smith says, is 'the old Panglossian fallacy'. [Hold it here. That there are "Panglossian" forces favoring groups in no way implies what should be called Pan-Panglossianism -- please spread my new meme! -- that everything works for the good of the group. Recall my earlier telling that Stephen Sanderson, in his splendid _The Evolution of Human Sociality: A Darwinian Conflict Perspective_, made a (moderate) functionalist out of me when he attacked *pan*-functionalism, though he called it functionalism pure and simple. [If Hayek is a *moderate* Panglossian, then he is correct and insightful.] And thirdly, taking the passage in the context in which it occurs, it is clear that Hayek does endow the products of cultural evolution with moral value: they are products of a process of selection according to human survival value, whereas the products of biological evolution, such as cockroaches, are not. The very next sentence after those cited by Whitman makes this abundantly clear: I do claim that ... without the particular traditions I have mentioned, the extended order of civilization could not continue to exist (whereas, were cockroaches to disappear, the resulting ecological 'disaster' would perhaps not wreak permanent havoc on mankind); and that if we discard these traditions ... we shall doom a large part of mankind to poverty and death (Hayek 1988:27). 
Whitman's second example is also from The Fatal Conceit: It would be wrong to conclude, strictly from such evolutionary premises, that whatever rules have evolved are always or necessarily conducive to the survival and increase of the populations following them ... Recognizing that rules generally tend to be selected, via competition, on the basis of their human survival-value certainly does not protect those rules from critical scrutiny (Hayek 1988:20). But, if 'human survival-value' is indeed the basis for selection, one may well wonder on what basis this scrutiny is to be carried out. The assumption is that the basic process is a human-favourable one. One can only criticise it on the basis of details, not the fundamental processes involved. Whitman's gloss on this passage is that Hayek believes that the cultural selection process selects for survival and reproduction of groups ... yet even by that criterion of efficiency, the resulting rules cannot be assumed to be efficient. It would be particularly odd, then, for those rules to be efficient according to some other standard, such as neoclassical economic efficiency or classical liberal value judgements (48). [Well, of course, they are not "efficient" in any sort of "long run" static equilibrium sense. They are merely tendencies to bring individuals and groups into harmony. Indeed, some societies go from bad to worse!] Hence, again, the claim is that Hayek's standpoint is not Panglossian. But Hayek here is clearly saying that cultural rules tend to be selected for human survival-value -- the outcome in each case, however, may for extraneous reasons be suboptimal on this score, and hence subject to critical scrutiny. Nevertheless, the tendency for selection according to human survival-value is in place, and hence the critical scrutiny he alludes to can only be a matter of details, not of substance. The process itself is immune from such scrutiny. 
This is shown clearly if we look in more detail at the passage in The Fatal Conceit from which Whitman's extract is taken: It would however be wrong to conclude, strictly from such evolutionary premises, that whatever rules have evolved are always or necessarily conducive to the survival and increase of the populations following them. We need to show, with the help of economic analysis ... how rules that emerge spontaneously tend to promote human survival. Recognizing that rules generally tend to be selected, via competition, on the basis of their human survival-value certainly does not protect those rules from critical scrutiny. This is so, if for no other reason, because there has so often been coercive interference in the process of cultural evolution (Hayek 1988:20. Italics highlight the parts elided in Whitman's extract.). Contrary to Whitman's interpretation, Hayek is clearly saying that we can assume that spontaneous evolutionary forces will tend to lead to desirable outcomes. [Well they do, sometimes, and often very slowly at that.] He is saying that we cannot assume desirable social outcomes from 'such evolutionary premises', that is, those he had just been talking about, where vested interests often 'blocked the next step of evolution' (Hayek 1988:20) by the use of state power. [Now, I do have a problem here. Public Choice theory aims to incorporate state power *within* the overall system. But privileging voluntary behavior, within limits, is not such a bad thing. Buchanan reviewed a book for the Times Literary Supplement that tried to show, effectively, that Panglossian forces ensured that the government almost always did right, by using the same kind of metaphorical reasoning used to regard market forces as always being forces for the good. Touché! For the book, that is. 
I don't have Buchanan's review at hand, but I recall he effectively trashed the argument on the grounds, familiar to all who have read Buchanan's writings, that actual government must be compared to actual market, not ideal government to actual market.] Instead, he says, we must use economic analysis to show how spontaneous rules lead to desirable social outcomes -- not, we should note, to enquire whether they do this, but to show that they do so. That spontaneous processes lead to human-favourable outcomes is taken for granted. It is what 'we need to show'. That is Panglossianism. [This charge is a valid one.] Whitman is keen to point out that, in Hayek's view, the rules resulting from the evolutionary process are not exempt from critical scrutiny. This is supposed to show that spontaneous evolutionary processes are not assumed to lead to Panglossian results. But it actually shows the opposite, since the reason Hayek wants critical scrutiny of those rules is that they may be corrupted by an admixture of state influence ('coercive interference'). The spontaneous processes themselves are automatically benign; it is state intervention which spoils things. Whitman's third example is from The Constitution of Liberty: These considerations, of course, do not prove that all sets of moral beliefs which have grown up in a society will be beneficial. Just as a group may owe its rise to the morals which its members obey, ... so may a group or nation destroy itself by the moral beliefs to which it adheres (Hayek 1960:67). Again, the point of the citation is to show that Hayek, far from embracing Panglossianism, is well aware of the possibility of suboptimal outcomes of the social evolutionary process. But to see the full meaning of the passage cited, we once again need to look at somewhat more of the passage in Hayek from which Whitman's extract has been taken than Whitman does. 
In The Constitution of Liberty, Hayek allows that the points he has previously made do not prove that all the sets of moral beliefs which have grown up in a society will be beneficial .... [A] group or nation [may] destroy itself by the moral beliefs to which it adheres. Only the eventual results can show whether the ideals which guide a group are beneficial or destructive .... It may well be that a nation may destroy itself by following the teaching of what it regards as its best men .... There would be little danger of this in a society whose members were still free to choose their way of practical life, because in such a society such tendencies would be self-corrective: only groups guided by "impractical" ideals would decline, and others, less moral by current standards, would take their place. But this will happen only in a free society in which such ideals are not enforced on all (Hayek 1960:67). So, although Hayek admits that suboptimal systems may evolve, firstly, this can only be judged by 'eventual results': there is thus a presumption that it is impermissible for governments rationalistically to step in beforehand to avert the catastrophe. Secondly, he is able to assert that there would be 'little danger' of suboptimal results in a 'free society' -- by appeal to an argument which assumes optimality: 'groups guided by "impractical" ideas would decline, and others ... would take their place'. The assumption is that what is good for individuals is good for their group and what is good for the group is good for the nation. But of course the behaviour which is Nash for agents within a society (whether those agents themselves be individuals or groups), the behaviour, that is, which issues from the evolutionarily stable strategies which emerge from the evolutionary process (Smith 1982:10), cannot be assumed to be optimal for the society as a whole. 
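The gap between what is evolutionarily stable and what is optimal for the group can be illustrated with Maynard Smith's Hawk-Dove game. The payoff values V and C below are illustrative assumptions for the sketch, not figures from the article.

```python
# Hawk-Dove sketch: the evolutionarily stable strategy (ESS) need not
# maximise the average payoff of the population. V and C are illustrative.

V, C = 2.0, 4.0   # resource value V < injury cost C, so the ESS is mixed

def payoff(p, q):
    """Expected payoff to a p-Hawk individual in a q-Hawk population."""
    e_hawk = q * (V - C) / 2 + (1 - q) * V   # vs Hawk: escalated fight; vs Dove: take all
    e_dove = q * 0.0 + (1 - q) * V / 2       # vs Hawk: retreat; vs Dove: share
    return p * e_hawk + (1 - p) * e_dove

p_ess = V / C                                # standard mixed ESS: play Hawk with prob V/C
ess_payoff = payoff(p_ess, p_ess)            # mean payoff once the ESS is reached
dove_payoff = payoff(0.0, 0.0)               # all-Dove: better for everyone here

print(f"ESS hawk frequency: {p_ess:.2f}")
print(f"mean payoff at ESS: {ess_payoff:.2f}")
print(f"mean payoff if all play Dove: {dove_payoff:.2f}")
```

All-Dove yields a higher payoff for every member than the ESS, yet it is not stable: a lone Hawk among Doves earns V rather than V/2 and so invades. Stability and group optimality come apart, which is the point being made against Hayek's assumption.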
Individuals and groups do not achieve pre-eminence in a nation by following rules which it would be in the interest of the nation for everyone to follow, but by following rules which are well adapted for gaining power and influence within a nation's establishment. So, again, passages in Hayek which Whitman thinks point away from the charge of Panglossianism actually point towards it. Whitman's own response to the passage he cites from The Constitution of Liberty is as follows: Of course, this statement could be interpreted as merely a view of selection-in-progress, in that "bad" moral views are characterized as leading inevitably to their own demise. The point, however, is that Hayek does not perceive the process as finished: at any point in time including the present day, we may find undesirable rules and customs that have not been weeded out by selective forces, at least not yet (48). But it makes a very big difference to policy response to perceived sub-optimalities, whether they are believed to be (a) the intermediate result of a fundamentally human-favourable process which has not yet run its course, or (b) the result of a fundamentally human-indifferent process. The former conviction will tend to lead to a policy prescription of procrastination, gradualism and minor adjustment; the latter to one of more prompt and, potentially, radical reform. The passage is in keeping with the overall tenor of Hayek's work: spontaneous processes are optimal and are best left alone. Whitman seems unwilling to accept the simple message of Hayek's life work, that the policy prescription is one of laissez-faire: Hayek never eschews the modification and reform of rules; he simply points out that any such revision of particular rules must necessarily take place in the context of a complex of other rules that are taken as given for the time being (48). 
Of course Hayek doesn't object to the modification of rules: but he wants them to be modified to give greater play to spontaneous processes, not less. Whitman seems to misunderstand Hayek's desire to modify the policy framework, in order to bring it more into line with laissez-faire, as a step away from laissez-faire. Later in the paper, Whitman argues that Hayek's standpoint cannot be Panglossian because he argues for 'the occasional corrective reform, which would be unnecessary in a perfectly self-correcting (or instantaneously optimal) evolutionary system' (55). However, it is not the spontaneous evolutionary process which is imperfectly self-correcting, in Hayek's view, but interference with it on the part of authority. And it is not necessary for Hayek to regard his evolutionary system as 'instantaneously optimal' for us to see that it is Panglossian -- what is required is that it tends to generate results which serve human purposes, not that it achieves those results perfectly and instantaneously. The essence of Pangloss's world view was that we live in the best of all possible worlds, not of all worlds whether possible or not: belief in 'instantaneous optimality' is not a sensible criterion for judging alleged instances of Panglossianism. This theme, concerning whether perceived sub-optimalities are believed to be the intermediate result of a fundamentally human-favourable process which has not yet run its course, or, on the contrary, the result of a fundamentally human-indifferent process, also arises in connection with Whitman's fourth example of Hayek rejecting Panglossianism: The fact that law that has evolved in this way has certain desirable properties does not prove that it will always be good law or even that some of its rules may not be very bad. It therefore does not mean that we can altogether dispense with legislation (Hayek 1973:88). Again, we should do well to situate this passage in the context within which it appears in Hayek's writing. 
Hayek says that The fact that all [spontaneously grown] law ... will of necessity possess some desirable properties not necessarily possessed by the commands of a legislator does not mean that in other respects such law may not develop in very undesirable directions, and that when this happens correction by deliberate legislation may not be the only practicable way out. For a variety of reasons the spontaneous process of growth may lead into an impasse from which it cannot extricate itself by its own forces or which it will at least not correct quickly enough ... The fact that law that has evolved in this way has certain desirable properties does not prove that it will always be good law or even that some of its rules may not be very bad. It therefore does not mean that we can altogether dispense with legislation ... the most frequent cause is probably that the development of the law has lain in the hands of members of a particular class (Hayek 1973:88-89). We may note that the passage continues the theme we have already noted of focusing on the exceptional, bad outcomes of an essentially good process: law evolves in a desirable way, but some laws may be undesirable. As just mentioned, it also touches on the theme of undesirable outcomes resulting from an essentially benign process not yet having run its course. We should also note the last sentence of the passage cited by Whitman: 'It therefore does not mean that we can altogether dispense with legislation'. In the previous section Hayek had so praised the evolutionary process of common law that one might think legislation itself superfluous. Here he needs to step back from a position on legislation which many might regard as beyond the pale of extremity. The role of the passage cited is to take the extremist edge off an argument which might otherwise deny any scope at all for legislation. The context is a massive pre-supposition that spontaneous processes lead to optimal results. 
The major and 'most frequent' cause for radical change requiring legislation is the recognition that existing law was biased in favour of some group over-represented in the state. Again, the assumption is that spontaneous processes are essentially benign, and that it is state encroachments which induce suboptimality of outcomes.4 So this passage, too, gives very little support to the notion that Hayek's attitude towards social evolutionary processes was not Panglossian.

4. Conclusion

This article has argued that, contrary to Whitman's defence, Hayek is indeed a Panglossian evolutionary theorist. [It would have been better to say that Hayek was *more* Panglossian with respect to voluntary actions than the author thinks warranted. This charge may or may not be correct: Denis advances no criteria to decide what the correct amount of Panglossian respect for voluntary actions should be. (It will vary across time and circumstance and depend on one's own preferences as well as on the objective facts of the situation, which are hard to get.) [I think it fair to charge Hayek with being a "conservative," though he famously appended an essay, "Why I Am not a Conservative," to _The Constitution of Liberty_. Others have argued this, too. Basically, the "conservative" temperament is to search high and wide for the functional aspects of the status quo before chucking them as embodying "the dead hand of the past" (Mr. Jefferson) or "the accumulation of centuries of imbecilities" (Mr. Mencken). As always, moderation is in order. [I await Buchanan's new short book, _Why I, Too, Am not a Conservative_, with great eagerness.] Hayek's policy stance is a prescription of laissez-faire, and his economic and evolutionary theory underpins that policy prescription. His evolutionary theory says that spontaneous processes tend towards optimal social outcomes. 
To the extent that they issue from such spontaneous processes, the institutions which we inherit are those which have been selected according to the benefits they have conferred on the societies adopting them. This is Panglossian in the social sense: the institutional structure we inherit tends strongly to be desirable, and attempts to improve it by conscious collective action are very much to be avoided. [Hayek did, of course, propose any number of radical changes in the status quo, in order to restore the "simple system of natural liberty" (Smith). I am not sure he was aware of the contradictions here. Leftists notice this valorization (a favorite meme) of the status quo all the time, and seek to put themselves in charge. The meme, "you can't change just one thing," should be generously invoked here. [As always, I have no answer to what the best combination of conservatism and change is and only urge a relentless Checking of Premises. The biggest Premise to Check is one that Hayek invokes, and I praise him for doing so, namely that everything results from human design, when he quotes Ferguson on "the results of human action but not of human design" repeatedly. [The biggest change in the last decade is, of course, the Internet. It was not developed with the social changes it wrought in mind, indeed without any awareness of their impact, except on the part of a very few visionaries. [What will be coming in the decades ahead -- humans may very well be displaced as the predominant form of organized matter in the centuries ahead -- will not be the result of deliberations by Central Planners. I will keep this very much in mind as I read and write a review of Joel Garreau's _Radical Evolution_ for the _Journal of Evolution and Technology_, which task I remind you is keeping me away from most of my Internet activity.] And it is Panglossian in the technical, evolutionary sense of Wynne-Edwards's erroneous theory of group selection, a theory that Hayek explicitly endorses. 
In his consideration of biological evolutionary theory, Whitman fails to identify Panglossianism with Wynne-Edwards group selection. In the passage from biological to cultural evolution, Whitman fails to realise that the distinction between the individual and the group retains its significance in full. Once we look closely at the passages in Hayek to which Whitman directs our attention, our verdict on Hayek's evolutionary theory can only be: 'Panglossian, as charged'.

Notes

1. I should like to thank Pete Clarke, Mary Denis, William Dixon and Geoffrey Kay for their encouragement and their most helpful comments on my earlier paper on Hayek, on which this one draws. I thank Douglas Whitman, Alan Isaac, Erik Angner and two referees for this Journal for comments on an earlier version of the present paper, and Alain Albert for prompting me to write it. Versions of the paper have been presented at the European Society for the History of Economic Thought conference, Graz, Austria, 2000, the City University Department of Economics research seminar, and the Association for Heterodox Economics conference, London, 2001. I should like to thank session participants, particularly Stephan Böhm, Jack Vromen, Simon Price, John Cubbin and Harold Chorney for their comments. Finally, I should like to thank BSc students on my Part III option in History of Economic Thought for their questions and comments on the paper, and their enthusiasm. The usual caveat applies.

2. Note that the issue is not whether Hayek is guilty of Panglossianism in the sense of believing that all is for the best in the world we inhabit, but whether his theory of cultural evolutionary processes is Panglossian. As the title of Whitman's article shows, and as we would expect of an Austrian economist, what is at issue is process not end state.

3. Unqualified page numbers refer to Whitman (1998).

4. 
It is also the case that there is a technical reason why legislation may be necessary: when changes in the law are required, this cannot be achieved by case law -- it would be unjust to do so, as case law can only determine what was the law in the past, not what it will be in the future.

References

Dawkins, R. (1989a) The Selfish Gene. Oxford: Oxford University Press.
Dawkins, R. (1989b) The Extended Phenotype: The Long Reach of the Gene. Oxford: Oxford University Press.
Dawkins, R. (1995) River out of Eden. London: Phoenix.
Dennett, D.C. (1995) Darwin's Dangerous Idea: Evolution and the Meanings of Life. London: Penguin Books.
Flew, A.G.N. (1967) Evolutionary Ethics. London: Macmillan.
Gould, S.J., and Lewontin, R. (1979) "The Spandrels of San Marco and the Panglossian Paradigm: A Critique of the Adaptationist Programme." Proceedings of the Royal Society B205: 581-98.
Hayek, F.A. (1942) "Scientism and the Study of Society." Economica, vol. 91, pp. 127-152.
Hayek, F.A. (1960) The Constitution of Liberty. London: Routledge & Kegan Paul.
Hayek, F.A. (1967) Studies in Philosophy, Politics and Economics. London: Routledge & Kegan Paul.
Hayek, F.A. (1973) Law, Legislation and Liberty: A New Statement of the Liberal Principles of Justice and Political Economy. Rules and Order, vol. 1. London: Routledge.
Hayek, F.A. (1988) The Fatal Conceit: The Errors of Socialism. London: Routledge.
Hodgson, G.M. (1993) Economics and Evolution: Bringing Life Back into Economics. Cambridge: Polity Press.
Ransom, G. (1996) "The Significance of Myth and Misunderstanding in Social Science Narrative: Opening Access to Hayek's Copernican Revolution in Economics." Paper presented at the 1996 annual meetings of the History of Economics Society and the Southern Economics Association; http://www.hayekcenter.org/hayekmyth.htm
Smith, J.M. (1982) Evolution and the Theory of Games. Cambridge: Cambridge University Press.
Whitman, D.G. (1998) "Hayek contra Pangloss on Evolutionary Systems."
Constitutional Political Economy 9: 45-66.

From checker at panix.com Sat Jan 7 20:52:27 2006
From: checker at panix.com (Premise Checker)
Date: Sat, 7 Jan 2006 15:52:27 -0500 (EST)
Subject: [Paleopsych] Natural History Magazine: Natural Selections
Message-ID:

Natural Selections
http://www.naturalhistorymag.com/0504/0504_selections.html
2004 May (note date)

REVIEW

Brains and the Beast
Can the behaviorist's insistence on distinguishing animal from human cognition be reconciled with evolutionary continuity?
By Frans B. M. de Waal

Do Animals Think? by Clive D.L. Wynne. Princeton University Press, 2004; $26.95
Intelligence of Apes and Other Rational Beings by Duane M. Rumbaugh and David A. Washburn. Yale University Press, 2003; $35.00

If your dog drops a tennis ball in front of you and looks up at you with tail wagging, do you figure she wants to play? How naive! Who says dogs have desires and intentions? Her behavior is merely the product of reinforcement: she has been rewarded for it in the past. Many scientists have grown up with the so-called law of effect, the idea that all behavior is conditioned by reward and punishment. This principle of learning was advocated by a dominant school of twentieth-century psychological thought known as American behaviorism. The school's founders, John B. Watson and B.F. Skinner, were happy to explain all conceivable behavior within the narrow confines of what Skinner called "operant conditioning." The mind, if such a thing even existed, remained a black box. In the early days, the behaviorists applied their doctrine in equal measure to people and other animals. Watson, for instance, to demonstrate the power of his methods, intentionally created a phobia for furry objects in a human baby. Initially "little Albert" was unafraid of a tame white rat. But after Watson paired each appearance of the rat with sharp noises right behind poor Albert's head, fear of rats was the inevitable outcome. 
Even human speech was thought to be the product of simple reinforcement learning. The behaviorists' goal of unifying the science of behavior was a noble one--but alas, outside academia the masses resisted. They stubbornly refused to accept that their own behavior could be explained without considering thoughts, feelings, and intentions. Don't we all have mental lives, don't we look into the future, aren't we rational beings? Eventually, the behaviorists caved in and exempted the bipedal ape from their theory of everything. That was the beginning of the problem for other animals. Once cognitive complexity was admitted in people, the rest of the animal kingdom became the sole standard-bearer of behaviorism. Animals were expected to follow the law of effect to the letter, and anyone who thought differently was just being anthropomorphic. From a unified science, behaviorism had become a dichotomous one, with two separate languages: one for human behavior, another for animal behavior. Human rationality and superiority are not really the issue, however--one only needs to read the latest Darwin Awards to notice that our species can be less rational than advertised. The issue is the dividing line between us and the rest of nature. Radical behaviorists adamantly insist on this line, and look across it with entirely different eyes than the ones they reserve for their fellow human beings. They speak about animals as "them" and compare "them" with "us," as Clive D. L. Wynne does at the beginning of Do Animals Think? ("What are animals--really? What should we make of them?"). Other behaviorists, however, intentionally blur the line. They apply the same well-tested behaviorist methodology to reconnect human and animal behavior, daring to mention the words "animal" and "cognition" in the same breath. They write books such as Duane M. Rumbaugh and David A. Washburn's Intelligence of Apes and Other Rational Beings. Of the two, Wynne's book is by far the more readable.
Wynne has a pleasant writing style and a knack for engaging the reader. He begins with the story of a mad animal-rights activist who threatened the lives of people on the Isle of Wight, where Wynne grew up. The man was convinced that animals are sentient beings, a certainty Wynne says he wishes he could share. This story sets the tone of doubt and reserve that permeates the book. Wynne includes numerous insightful accounts of remarkable animal behavior, but he invariably concludes on a note of caution: one should not infer too much from these accounts. He is not so radical a behaviorist that he excludes all forms of reasoning by animals, but he takes greater pleasure in explaining what animals cannot do--monkeys fail to understand relations between cause and effect, apes can sign but lack the syntax that defines human language--than in describing what they can do. Capacities unique to a particular species, such as echolocation in bats, get Wynne's full admiration. But anything that seems to elevate other animals close to the lofty cognitive level of humankind he regards with utmost skepticism. He seems to take delight in animals, and possesses great knowledge about them, yet he prefers them at arm's length. The constant message is that animals are not people. That much is obvious. But it is equally true that people are animals. The dichotomy Wynne advocates is outdated, lending his book a pre-Darwinian flavor. Take the case of animal culture, currently one of the hottest areas in the study of animal behavior. The idea goes back to the pioneering work of Kinji Imanishi, who proposed in 1952 that if individuals learn from one another, their behavior may grow so different from behavior in other groups of the same species that they seem to have their own culture. Imanishi thus reduced the idea of culture to its most basic feature: the social rather than the genetic transmission of behavior. Many examples of animal culture have been documented. 
The classic case emerged among wild macaques on Japan's Koshima Island. During their fieldwork with the monkeys there, investigators provisioned them with sweet potatoes, which a juvenile female named Imo soon began washing; she would bring her potatoes to a small river and clean them off before eating them. Imo's washing behavior spread first to her mother and then to her age peers, before affecting the rest of the group. Later Imo moved her operation to the shoreline, washing the potatoes in the ocean, and, again, the other monkeys followed. Some psychologists have objected to this example, pointing out that it is uncertain whether the monkeys learned their skill by copying others or by discovering the behavior individually, without anyone's help. Wynne supports the second view. But instead of basing his opinion on the actual data published by a team of Japanese primatologists, who have worked on the problem for fifty years, he relies on the word of a skeptical Westerner who has never set foot on the island. This scientist, a specialist in rat behavior, suggested that potato washing spread because performers were selectively rewarded by the people who handed out the potatoes. A few years ago I went to Koshima Island to verify the idea of selective rewarding. I talked with some of the people who had actually witnessed Imo cleaning her first spud. They told me that initially the monkeys were fed far away from any water, so there was no question of rewarding any washing behavior. Imo herself came up with the idea of transporting the potatoes to the river for cleaning. They also pointed out that one cannot feed a group of monkeys any way one wishes. The dominant males have to be fed first, the females second, and the little ones last; changing the order sparks bloodshed. Thus, except for Imo's mother, the monkeys that learned the behavior first, the juveniles, were the last to be rewarded. 
In fact, the only monkeys on the island that never learned potato washing were the adult males: precisely the best-rewarded group. Wynne invariably favors interpretations that widen the assumed cognitive gap between human and animal. For example, he uncritically accepts the uniqueness claim du jour: that only human beings possess a theory of mind (ToM), or the cognitive ability to understand that others, too, have mental states such as thoughts and knowledge. Ironically--given Wynne's dismissal of an ape ToM--the concept got its start with a 1970s study of chimpanzees. A female showed she had grasped the intentions of others by, for example, selecting a key from among several tools if she saw a person struggling to open a locked door. Evidence for a theory of mind in apes has gone through its ups and downs ever since. Some experiments have failed spectacularly, leading the proponents of one school of thought to contend that apes simply lack the capacity. Negative results are inconclusive, though: as the saying goes, absence of evidence is not evidence of absence. Furthermore, the performance of apes is often assessed by comparing it with that of children. Because the experimenter is invariably human, however, only the apes face a species barrier. When an ingenious experiment conducted at Emory University's Yerkes National Primate Research Center in Atlanta got around that problem, the evidence for an ape ToM was more positive: chimpanzees seemed to realize that if a member of their species had seen hidden food, this individual knew where the food was, as opposed to one who had not seen it. That finding threw the question of a ToM in nonhuman animals wide open again. In an unexpected twist (because the debate has focused on humans versus apes), a capuchin monkey in a laboratory at Kyoto University in Japan recently passed a series of seeing-knowing tasks with flying colors. 
The least one can conclude is that it is premature to settle on ToM capabilities as the ultimate Rubicon. In spite of Wynne's dismissal of an ape ToM, his book offers many insightful descriptions of animal behavior. A wonderful chapter on the role of messenger pigeons during the First World War includes a picture of the stuffed body of Cher Ami, a genuine war hero. The pigeon kept flying after its leg had been shot off, delivering its message and thus rescuing an entire battalion. Rumbaugh and Washburn are considerably more open-minded about the mental accomplishments of animals than Wynne is. Their book celebrates Rumbaugh's lifetime of research on monkeys and apes. In fact, what fascinates me the most about Intelligence of Apes and Other Rational Beings is its historical overview of experimental work with primates, first with the Wisconsin General Testing Apparatus (WGTA) and later with joysticks and computers. The WGTA was developed at the University of Wisconsin in the 1940s, and is still being used today. In this set-up, a primate subject in a cage faces an experimenter across a platform, on which differently shaped or colored stimuli are arrayed. Both experimenter and primate can reach the stimuli; the experimenter baits them with rewards, and the primate selects among them. I remember working with such an apparatus as a student, testing chimpanzees to see if they could discriminate shapes by touch alone. The task was so incredibly simple and repetitive that the apes invariably got tired of the whole thing five minutes into the testing. In fact, they got so bored that they performed worse than macaques tested on the same stimuli. I mention this episode because test performance is often taken as a measure of intelligence, even though attention and motivation are equally important to the outcome. As a result, failure is open to interpretation. 
Rumbaugh and Washburn understand these points better than most scientists, and they are at pains to remind the reader how the questions one asks tend to constrain the answers one gets. Indeed, some testing paradigms positively suppress the phenomena being tested. When Rumbaugh replaced the WGTA with an innovative testing setup in which the monkeys move a joystick to select stimuli on a computer screen, their performance improved dramatically. Rumbaugh's work on the connection between method and outcome should be required reading for anyone who attaches significance to negative evidence. One learning paradigm discussed by Rumbaugh and Washburn has special interest. Some animals learn how to learn--that is, once they have mastered a particular task, they can more quickly learn future tasks that have the same design but rely on different stimuli. Trial-and-error learning cannot explain improved performance in reaction to new stimuli, hence the level of learning must be higher. But generalization across tasks is precisely what the founders of behaviorism thought animals could not do. Rumbaugh and Washburn discuss many forms of advanced problem-solving, which they classify as "emergents." The term is slightly awkward, but the authors apply it to cases in which animals flexibly apply accumulated knowledge to new situations, resulting in an "emergent" solution. The classic example is the chimpanzee in a room with a few sticks and boxes in one corner and, for the first time in the chimp's experience, a banana hanging from the ceiling. The solution emerges as the old bits of previous knowledge combine until, as if a lightbulb suddenly goes on in the chimpanzee's head, he climbs on top of the boxes and reaches for the banana with a stick. The two authors rightly speak of reasoning and rationality, and so adopt a terminology that is anathema to radical behaviorism. 
They discuss the behaviorist view at length but choose to deviate from it, stressing continuity between animal and human. For the reader, though, it is frustrating that they focus almost entirely on apes and other primates, without examining how the concept of emergents could apply equally well to other animals. Crows, dolphins, elephants, and parrots have been credited with creative problem-solving as well. There will always be tension between those who view animals as only slightly more flexible than machines and those who see them as only slightly less rational than human beings. The views discussed in these two books are by no means as far apart as they could be; both, after all, come out of the same tradition of experimental psychology. Throw in a few naturalists and neuroscientists, and the debate gets even more complex. That said, however, the two books range widely enough across the spectrum of views to make a powerful case that there is still plenty to be discovered, and that human uniqueness is largely in the eye of the beholder.

Frans B.M. de Waal is C. H. Candler Professor of Primate Behavior at Emory University in Atlanta and the director of the Living Links Center at the university's Yerkes National Primate Research Center.

B O O K S H E L F
By Laurence A. Marschall

Rats: Observations on the History and Habitat of the City's Most Unwanted Inhabitants by Robert Sullivan
Bloomsbury, 2004; $23.95

IN HIS MEMORABLE 1998 BOOK The Meadowlands, about the New Jersey wetlands just west of the Lincoln Tunnel, Robert Sullivan emerged as the Thoreau of blighted ecosystems. Traveling by canoe along oil-slicked bayous, Sullivan uncovered treasures of both natural and industrial history no passing commuter would have suspected.
Now Sullivan has crossed the Hudson River and relocated his eclectic wanderings to the back alleys of lower Manhattan, where the dumpsters of Chinese noodle joints, Irish pubs, and Salvadoran chicken takeouts are the real happening places for urban wildlife. Happening, that is, if you're a rat. "Four seasons spent among vermin" is how Sullivan describes his sojourn. His Walden Pond was Edens Alley, a narrow defile a few blocks from Wall Street. Equipped with both binoculars and a night-vision monocular, he arrived in the evenings after dark to watch the rats as they emerged to feed and, in the notebook he'd brought along, to wax lyrical about nature, civilization, and the meaning of life. A typical entry from his winter journal: 5:44--The rats retreat suddenly. The reason: three men enter the alley, though when I see the men I wonder which creature left the alley for which creature--sometimes it seems as if the rats' departure is a courtesy extended by the rats. . . . I think of all the rats that have crawled through this alley before, the history of this alley's previous inhabitants. Oh, to know--to really know--this pellicle of rat-infested ground. Such deadpan effusiveness over creatures commonly regarded as loathsome may border on sick humor, but elegies to Rattus norvegicus make up only a small part of Sullivan's book. There are many stories about the ethology, natural history, and social importance of rats, and, overall, plenty of evidence that people and rats have a lot more in common than most people would like to admit. Sullivan cites Martin W. Schein, for instance, the co-author of a 1953 paper on the eating habits of rats captured on Baltimore backstreets. Schein conducted laboratory studies using authentic garbage from the alleys where the rats were trapped. He learned that rats hate raw beets (I sympathize) and that scrambled eggs and macaroni and cheese are popular rat comfort foods, just as they are for human Baltimoreans. 
In Edens Alley, according to Sullivan, the rats also seem to like chicken pot pie. In spite of some strong dislikes, though, rats are not picky eaters. By and large, they are omnivorous and highly adaptable--the same traits that make people so successful--and they show uncanny cleverness in finding food and avoiding peril. Ann Li, an epidemiologist with the New York City Department of Health, takes Sullivan on a rat-trapping expedition to Brooklyn, and tells him she thinks rats are "so underappreciated." Even the exterminators who show Sullivan how to outsmart the rodents express a grudging admiration for their prey. As much as he shares the rodentophilia of his informants, Sullivan is unsparing when he recounts the misery rats cause. Sometimes they attack directly: in 1979 a large pack surrounded a woman on a street in downtown Manhattan. And of course they carry infectious diseases such as plague. Yet unless people find a way to steam-clean each crevice of the city every day, rats will continue to cohabit with us in uneasy harmony. "If you killed every rat in New York City," Ann Li remarks, "you would have created new housing for 60 million rats."

Running with Reindeer: Encounters in Russian Lapland by Roger Took
Westview Press, 2004; $27.50

FEW PLACES IN EUROPE are as far off the beaten track as the Kola Peninsula, a potato-shaped carbuncle of land at the top of the Scandinavian Peninsula, east of Finland. Russian Lapland, as the Kola is also known, has one large city (Murmansk), a few subsidiary industrial centers and mining towns, and a scattering of isolated villages in the hinterlands. One passable highway runs through the province. But beyond that right-of-way, for hundreds of kilometers in every direction, the hardy traveler encounters nothing but tundra, taiga (boreal forest), and vacant shoreline. Roger Took is just such a hardy traveler--perhaps even a foolhardy one.
When he arrived in the Russian northland in the early 1990s, the entire country was teetering on the edge of anarchy, and it was not clear which disaffected group a lone Englishman should be more afraid of: suspicious Sami tribesmen, the military attached to the remnants of the Russian Northern Fleet, or the legendary Russian Mafia. Just in case, Took offhandedly notes, he learned how to fire, strip, and reassemble a nine-millimeter semiautomatic pistol before he left London. We never learn whether Took ever fired the pistol, but readers can be grateful that he survived, met many fascinating characters, and kept coming back, year after year, for more than a decade. The Kola, he discovered, is a land of contrasts and contradictions, shaped by history and politics as much as by geography. Its first inhabitants were nomadic Sami, who roamed freely through northern Scandinavia. In the Middle Ages, settlers called Pomors arrived from the more populated regions of Russia, to the south. A brisk fur trade with Europe developed because the Kola Peninsula's best harbors, warmed by the northernmost hook of the Gulf Stream, are more or less ice-free throughout the year. Only after the Russian Revolution did the area begin to take on its current look of emptiness. In a procrustean attempt to collectivize the Sami economy, Stalin had villagers herded into hastily built urban areas and industrial farms. Much of the coastline was declared off-limits. The discovery of rich mineral resources in the Khibiny mountain range, near the center of the peninsula, only made matters worse; soon "special settlers" were being shipped from various parts of the Soviet Union to provide forced labor. For the most part, today's inhabitants huddle in charmless concrete apartment blocks, largely ignorant of the region's rich history and remarkable resources. Took, however, has grown to love the place. 
Armed with little more than a backpack and a fishing rod, he boldly wandered through military reservations, floated down rivers with salmon poachers, sledged on hunting and herding excursions with descendants of the Sami, and accompanied wildlife biologists and archaeologists on expeditions to the interior. In one memorable episode he hitched a ride through the backcountry on a clanking, tanklike all-terrain vehicle (minus the gun turret), accompanying a human-rights activist who was documenting a gulag of prison barracks. Took reports signs of a new life for Russian Lapland. Environmentalists in Russia and Scandinavia have begun to throw their weight behind efforts to clean up the damage caused by the nuclear fleet. Shops in Murmansk now display the latest fashions. And foreign sportsmen have begun to discover that some of the world's greatest salmon streams run through the Kola's remote countryside. Russian Lapland may not come off as a vacation paradise, but Took's book is a marvelous introduction to a region of rich but almost forgotten heritage.

Sequoia: The Heralded Tree in American Art and Culture by Lori Vermaas
Smithsonian Books, 2003; $39.95

JUST AS LEBANON is famous for its cedars, so North America is known for its redwoods. Not only are they among the largest and most stately trees on earth, but they thrive in settings of surpassing scenic beauty. Strolling beneath a towering canopy of Sequoia sempervirens, the most common redwood along the northern coast of California, one experiences a world of subtle twilight just a few steps from the glare of a sunlit, rocky shoreline. The rarer Sequoiadendron giganteum, whose ponderous trunks make their coastal cousins seem almost willowy, grows farther inland, in sheltered groves in Yosemite and other isolated valleys. It is no wonder, then, that the giant sequoias have assumed symbolic importance far out of proportion to their restricted habitat.
Lori Vermaas, a cultural historian, has written an insightful new survey of American art and literature on redwoods from the nineteenth and early twentieth centuries. The most widespread early depictions of the giant trees, in the years during and just after the Civil War, were made by enterprising commercial artists who used twin lenses on their cameras to create so-called stereo-view cards. Many of the pictures focused on the immense scale of the trees; a favorite subject was the Grizzly Giant, a tree in Yosemite National Park whose trunk soared straight skyward but whose upper branches seemed painfully gnarled, like the rheumatic joints of an old man. To a nation still smarting from the horrible conflict between the states, the redwoods, far removed from the scene of battle, seemed serene, impassive, and impervious to harm. They epitomized the part of the nation that had remained intact and functional despite the fires of war and social turmoil. Huge paintings of sequoias by such landscape artists as Albert Bierstadt were all the rage (oversize landscape paintings being the functional equivalents of IMAX films). Yet few envisioned the giant trees as symbols of an endangered environment. Toward the end of the nineteenth century, logging them was even seen as an example of humankind's ability to bend nature to its will. Woodsmen were "no puny impersonations of men," but men who swung "heavy, keen-edged axes as though they were mere trifles." Logging teams were typically photographed in the yawning notches of trees they were about to topple. In one particularly striking print, an entire troop of U.S. cavalrymen, mounted on horseback, stand like conquering gladiators atop and along the length of the trunk of a fallen giant. Exuberantly expansive, the American imagination invoked sequoias as a natural treasure, but a treasure to be expropriated and spent. 
Even John Muir, one of the nation's first conservationists, waxed enthusiastic over the use of redwood lumber in construction. Redwood housing was "almost absolutely unperishable." The onslaught of logging operations, among other abuses of the era, sparked the modern environmental movement, and redwoods came to be seen as treasures to preserve. Although groves of redwoods are continually threatened, the trees still stand, and pictorialists in the tradition of Ansel Adams have continued to use the image of the redwood as an emblem of strength and endurance. Vermaas helps us understand the symbolism of sequoias, but even she must admit that the best way to appreciate them is on foot and close-up. "No one has ever successfully painted or photographed a redwood tree," wrote John Steinbeck in 1962. "The feeling they produce is not transferable."

Laurence A. Marschall, author of The Supernova Story, is the W.K.T. Sahm professor of physics at Gettysburg College in Pennsylvania, and director of Project CLEA, which produces widely used simulation software for education in astronomy.

Moving Mountains
By Robert Anderson

CALIFORNIA'S SANTA MONICA MOUNTAINS, where I live, are a mere 5 million years old. Like most mountains, they are composed of rocks formed during complex and repeated sequences of uplift, sedimentation, and volcanism. In the case of the Santa Monica range, the process began about 200 million years ago, when the first dinosaurs were roaming the planet. A summary of the processes that make mountains rise can be found at www.physicalgeography.net, a Web site created by Michael J. Pidwirny, a geographer at Okanagan University College in Kelowna, British Columbia.
(On the home page click on "Fundamentals: Online Textbook" from the menu bar at the top; in "Chapter 10: Introduction to the Lithosphere," click on "Mountain Building.") For an overall view of how colliding tectonic plates transform the planet, go to "Dynamic Earth" ([6]earth.leeds.ac.uk/dynamicearth), developed by Robert Butler, a geologist at the University of Leeds. Illustrations of the way tectonics has changed the distribution of land and sea can also be found at a Web site run by Christopher R. Scotese, a geologist at the University of Texas at Arlington. For thirty years, Scotese and his collaborators have been working on a series of paleogeographic atlases. The latest of them, the Global Plate Tectonic Model, is available at PALEOMAP Project ([7]www.scotese.com). From the home page you can choose 3-D movable paleoglobes and paleogeographic animations that show the positions of the continents and the shapes of the ocean basins for various periods of geological time. Select "Earth History" from the menu at the left on the home page. There you'll find full-color maps depicting details such as mountain ranges, shorelines, and active plate boundaries during those same periods, beginning with the breakup of the first supercontinent, Rodinia, and extending through the present and into the future for 250 million years, when the supercontinent Pangea Ultima will trap what is now the Atlantic Ocean in a small, inland basin. Antonio Schettino, a geologist in Milan, Italy, worked with Scotese to re-create plate motions in the Mediterranean region ([8]www.itis-molinari.mi.it/Intro-Med.html). The accompanying QuickTime animation provides an excellent graphic explanation of how the Alps arose. A similar presentation of tectonic processes shows the ancient mountain chains in greater regional detail ([9]www4.nau.edu/geology). Click on "Popular Departmental Links" and look at the three items created by Ronald C. Blakey, a geologist at Northern Arizona University.
The site [10]www.jamestown-ri.info/northern_appalachians.htm provides a rundown of the northern Appalachian chain's geological history, which stretches back a billion years. Geologists can now watch mountains grow, thanks to new satellite and radar technologies that measure minute movements of the Earth's crust and slight changes in the stresses that cause earthquakes. Go to the "Active Tectonics" site, run by a group from the University of California, Berkeley ([11]www.seismo.berkeley.edu/~burgmann/EDUCATION/InSAR.html), for more information.

Robert Anderson is a freelance science writer living in Los Angeles.

References
6. http://earth.leeds.ac.uk/dynamicearth
7. http://www.scotese.com/
8. http://www.itis-molinari.mi.it/Intro-Med.html
9. http://www4.nau.edu/geology
10. http://www.jamestown-ri.info/northern_appalachians.htm
11. http://www.seismo.berkeley.edu/~burgmann/EDUCATION/InSAR.html

From checker at panix.com Sat Jan 7 20:52:35 2006
From: checker at panix.com (Premise Checker)
Date: Sat, 7 Jan 2006 15:52:35 -0500 (EST)
Subject: [Paleopsych] The Age (au): There's trouble in patriarchy
Message-ID:

There's trouble in patriarchy
http://www.theage.com.au/news/general/theres-trouble-in-patriarchy/2005/12/08/1133829717803.html
December 10, 2005

Are men facing extinction? Not quite yet, but times are a-changin' in the gender playground, writes Simon Caterson.

"THE REASON THERE are so many divorces," I remember my secondary school literature teacher telling our class, "is that people nowadays marry for love." It is one of those throwaway lines that remain lodged in the mind long after whatever it was we were supposed to be learning has been absorbed, regurgitated and forgotten. I was reminded of this aphorism two decades later when, at a recent dinner party, one of the guests announced that his de facto relationship of 14 years had ended. After some brief expressions of sympathy, in particular from one of the women present, the conversation moved on.
Though sad for all involved, especially the couple's young children, to us thirtysomethings such news comes as no surprise. Breakdowns in relationships, like retrenchment, car accidents or minor operations, are just one of the everyday hazards of modern life for Generation X. Statistically, such an event is as likely to occur as not. In her provocatively titled new book, Are Men Necessary? When Sexes Collide, American columnist Maureen Dowd tolls the bell of marital doom. In an extract published in the New York Times Magazine, Dowd laments the abject failure of modern relationships: "Despite the best efforts of philosophers, politicians, historians, novelists, screenwriters, linguists, therapists, anthropologists and facilitators, men and women are still in a muddle in the boardroom, the bedroom and the Situation room." Dowd presents the landscape of relationships as a disaster zone of disappointment and inequality for women: "Before it curdled into a collection of stereotypes, feminism had fleetingly held out a promise that there would be some precincts of womanly life that were not all about men. But it never materialised." In Dowd's view, the sway held by stereotypes hasn't lessened since the rise of feminism, only their content: "The message is diametrically opposite - before it was don't be a sex object; now, it's be a sex object - but the conformity is just as stifling." Virginia Woolf wrote that "to enjoy freedom we have to control ourselves". Dowd is quite right to identify conformity as a true enemy of happiness, but it is a force propelling our consumer society that is difficult to deny. In gender relations, as in everything else, we are only limited by our imaginations and our capacity to empathise with one another.
The "can't live with them, can't live without them" theme adumbrated by Dowd also echoes in the media across the gender divide, where the howl of embittered women is answered just as loudly by the bellowing of angry men who fear that they really are unnecessary. In addition to the misery and hardship that failed relationships cause to women, the high rate of divorce is just one of the factors that contribute to the so-called crisis in masculinity. According to Marian Salzman, Ira Matathia and Ann O'Reilly, the joint authors of a new book called The Future of Men, instability and uncertainty in relationships are among the issues profoundly troubling men. They refer to data indicating that men have a greater psychological need for permanence in relationships than women, and point out that roughly twice as many divorces are instigated by women as by men. But is the crisis real, or is it a furphy, an excuse for some men to behave badly and feel sorry for themselves? Are the gender wars overall a sign of social disintegration or a bit of a beat-up? The threshold question is not one that preoccupies the authors of The Future of Men. In a sign of the times, the book is not a work of sociological inquiry or journalistic speculation but a business title. Salzman, Matathia and O'Reilly are not polemicists but professional trendspotters and as such are primarily concerned with how the fluid situation of men may translate into future spending patterns. They assert that the combination of "the women's movement, the evolution towards information-based economies, and shifting social mores and values" is having "a negative impact on the male psyche, leaving modern men hesitant, disoriented, and, in many cases, more than a little depressed". The authors say that companies "looking to connect with the male consumer" must respond to what they term "M-ness", that is, a new masculinity being defined by men themselves.
After a decade or so of metrosexuals having the unblinking queer eye appraise their dress, hygiene and appearance, men are reasserting their traditional masculinity. This new "ubersexual" man will keep using a regular moisturising routine, but will also be manlier in his pursuits and outlook. But is the crisis of masculinity, like the conformity Dowd describes among women, merely creating a new breed of fashion victims whose angst can only be soothed by retail therapy? Advertisers, take note: boys' toys can compensate men for the feeling they are toy boys. The new alliance forecast in The Future of Men between quiche-man and caveman, to borrow the distinction made in Kath & Kim, may seem like a refreshing new development in the affairs of men, but is it simply a case of putting some of the old wine in new bottles? In truth, the really significant changes have occurred around men and not within them. Human evolution has lagged far behind technology, medical science and legislative change. Thomas Keneally reminds us: "It is for most of us far less than 5000 years since we came in off the plain and began farming. But our chemistry is built for nomadic life. We really did have mastodons to kill once." It is precisely this apparent mismatch between purpose and use, a "suspicion that we are biologically and socially redundant", as Keneally puts it, that gives rise to a "profound unease" in men. The projection into the future properly begins in the past, though we do not have to go back five millennia to establish the beginnings of a modern malaise. The future of men arguably began that day in November 1951 when the patents were filed that heralded the introduction of the contraceptive pill for women. Giving women the power to control their own fertility, as almost all of them do in the West, was an epoch-making moment in the history of humanity. 
The freedom of choice that the Pill gave to women, and indirectly to their sexual partners, has been followed by medical advances that enable human reproduction to take place without the male partner or sperm donor being present at the moment of conception. When cloning is perfected, then the concept of paternity could be done away with altogether. In any case, according to geneticist Jennifer Marshall Graves, the Y chromosome will have exhausted its capacity to mutate within the next 10 million years. The only real question is when, not if, the Y chromosome disappears, she has said in an interview. "It could be a lot shorter than 10 million years, but it could be a lot longer." Some modern freedoms, such as those that enabled women to vote and to own property without being treated as property themselves, have altered relations between the sexes forever. According to the authors of The Future of Men, there is no going back, and nor would societies that have experienced the benefits of the liberation of women want to return to the sexual dark ages: "In the short term, it's clear that cultures that resist the rise in female power are losing out to those cultures that accept it, because the cultures that accept it are progressing further faster on most fronts - health, economy, security, and technology, to name a few. Only history (sic) will tell what the longer-term consequences may be." Some of the more militant men's advocacy groups may gnash their teeth but the dethronement of men was in some sense an abdication, since the entry of women into the workforce was an unintended consequence of the world wars waged by men, and it was men that led the medical research into women's health. Now men and women are able to work together as never before to improve the lives of everyone. Who in their right mind would want to pass up the opportunity, however imperfect the application so far, to create a non-discriminatory meritocracy? 
On the domestic front, much of what has happened in our time is not gender-specific but is the impersonal operation of technology-driven consumer society. The services that in the past only marriage could provide to each partner have now virtually been outsourced. Single women can now "hire a hubby" whenever there's an odd job to be done, while any man with enough cash can enjoy what used to be regarded as conjugal rights at the nearest licensed brothel. Nothing is sacred, and everything has a price. Shopping, cleaning, healthy food, child care, back rubs - all are just a phone call or a mouse click away. An enterprising Japanese inventor has even come up with an artificial arm for singletons that takes the place of the one a real sleeping partner would otherwise provide. The marketplace fills the space to ensure that not even the absence of ordinary human contact lacks a commercial substitute. If robots can serve as pets, why not have them fill in for humans as well? Objectively, the only factor holding women back from the possibility of complete independence from men, and vice versa, is financial, since the moral, social and legal pressure that used to exist for couples to pair off permanently has disappeared. The crucial issue is the mutual desire to have children. In most cases, the cost of health and education is such that it still requires two incomes to ensure that children receive what is generally considered to be a good start in life. The authors of The Future of Men identify one area in popular culture that needs an adjustment to accommodate M-ness, and that is the negative stereotyping of men. In 1929, Virginia Woolf wrote that women "all these centuries have served as looking-glasses possessing the magic and delicious power of reflecting the figure of man at twice its natural size". Woolf knew, however, that the best writing transcends gender-specific thinking, since "anything written with a conscious bias is doomed to death". 
Great art, according to Woolf, encompasses the whole of human experience and dissolves any crude distinction between sex and gender. "Some collaboration has to take place in the mind between the man and woman before the art of creation can be accomplished. Some marriage of opposites has to be consummated". A powerful representation of this concept as it may be manifested in an actual relationship appears in Kurt Vonnegut's novel Mother Night. The main character, an American playwright recruited to spy in pre-war Germany, sees the bond between himself and his wife as constituting what he calls "a nation of two". The couple's loyalty to this nation will allow for a measure of regional autonomy and can survive the occasional outbreak of civil unrest. Despite Woolf's plea for wholeness and her warning against replacing misogyny with misandry, stereotyping is still with us. Once women might have been denigrated or patronised with impunity, but now popular culture tends to load the dice against men. Sitcoms and ads routinely show male characters as weak, foolish or stupid. Instead of Father Knows Best, there are such dubious role models as Homer Simpson and Ray in Everybody Loves Raymond. Not that they have much of an example to follow - Abraham Simpson was a terrible father and Ray's dad is a ranting sociopath. Of course, these shows would not be funny if the gag did not correlate to events in real life, but there is no balance, and in any case a joke repeated too often quickly becomes stale. Of recent comedies, Coupling and Kath & Kim stand out as mature enough, and wicked enough, to be even-handed in satirising human relationships. Selfishness, vanity, vulgarity, duplicity, snobbery, deceit - none of these things is the sole preserve of one half of humanity. Nor are any of the more admirable qualities, needless to say. 
Unreconstructed heroes such as those played by John Wayne and Steve McQueen have disappeared from the big screen, but also lost are the Gregory Pecks, Henry Fondas and Cary Grants. Russell Crowe's rampaging public persona may seem like a throwback, but even he can't go berserk in a hotel without concern for his family being the cause. Such sensitivity, which is shared by Crowe's characters in such films as Gladiator and Cinderella Man, never inspired the ritual trashing of hotel rooms by rock bands in the '60s and '70s. In modern life, the patriarchy has been superseded. Men increasingly share in the family chores and spend time with their children, which is why the authors of The Future of Men think that "business still seems to be lagging behind the cultural reality of how much gender-blurring has occurred in traditional female domains". They write that "products related to food preparation, home furnishing and entertaining, and home maintenance (i.e. cleaning) are still generally pitched at women, despite the fact that most of these items have become gender neutral". While it has never been easier to form attachments, it has never been more of a challenge to maintain them. The upheavals in men's lives have brought much that is new and unprecedented but also confirmed much that is as old as the hills. A good man or woman has always been hard to find and the course of true love never runs smooth, as Shakespeare and Jane Austen, among other writers, understood. Some are lucky in love and others miss out, no matter what the rules of engagement happen to be. Mutual misunderstanding and gender confusion are a rich source of comedy and pathos in Shakespeare's plays, and the politics and economics of sex are a central concern of Austen's novels. Female sexual selection was known to be a vital factor in courtship long before evolutionary science confirmed it as the dominant one. Much of what seems contemporary in gender conflict has a historical precedent. 
The cad or shrew of yesteryear is today's ballbreaker or toxic bachelor. For every female chauvinist pig there's a pick-up artist playing "the game". In his primping and preening, the metrosexual takes his cue from the dandy of yore. The prospects for men may seem bleak to some, but that need not be the case. Rather than feel useless and rejected, most men should perhaps feel lucky that women still show as much interest in them as they do and want their companionship. Often the most effective critics of cultural misandry are women, and women do continue to give birth to sons as well as daughters. Most women are attracted enough to men to want to be with them in some meaningful way, while men (or women) who like each other are no longer prevented from fulfilling their desire. Love is fragile and often fleeting but I doubt whether many of us, no matter how modern our outlook, could contemplate life without it. And for the time being, at least, it seems we don't have to. From checker at panix.com Sat Jan 7 20:52:45 2006 From: checker at panix.com (Premise Checker) Date: Sat, 7 Jan 2006 15:52:45 -0500 (EST) Subject: [Paleopsych] NYT: (NSA) The Agency That Could Be Big Brother Message-ID: The Agency That Could Be Big Brother http://www.nytimes.com/2005/12/25/weekinreview/25bamford.html Private Lives By JAMES BAMFORD Washington DEEP in a remote, fog-layered hollow near Sugar Grove, W.Va., hidden by fortress-like mountains, sits the country's largest eavesdropping bug. Located in a "radio quiet" zone, the station's large parabolic dishes secretly and silently sweep in millions of private telephone calls and e-mail messages an hour. Run by the ultrasecret National Security Agency, the listening post intercepts all international communications entering the eastern United States. Another N.S.A. listening post, in Yakima, Wash., eavesdrops on the western half of the country. A hundred miles or so north of Sugar Grove, in Washington, the N.S.A. 
has suddenly taken center stage in a political firestorm. The controversy over whether the president broke the law when he secretly ordered the N.S.A. to bypass a special court and conduct warrantless eavesdropping on American citizens has even provoked some Democrats to call for his impeachment. According to John E. McLaughlin, who as the deputy director of the Central Intelligence Agency in the fall of 2001 was among the first briefed on the program, this eavesdropping was the most secret operation in the entire intelligence network, complete with its own code word - which itself is secret. Jokingly referred to as "No Such Agency," the N.S.A. was created in absolute secrecy in 1952 by President Harry S. Truman. Today, it is the largest intelligence agency. It is also the most important, providing far more insight on foreign countries than the C.I.A. and other spy organizations. But the agency is still struggling to adjust to the war on terror, in which its job is not to monitor states, but individuals or small cells hidden all over the world. To accomplish this, the N.S.A. has developed ever more sophisticated technology that mines vast amounts of data. But this technology may be of limited use abroad. And at home, it increases pressure on the agency to bypass civil liberties and skirt formal legal channels of criminal investigation. Originally created to spy on foreign adversaries, the N.S.A. was never supposed to be turned inward. Thirty years ago, Senator Frank Church, the Idaho Democrat who was then chairman of the select committee on intelligence, investigated the agency and came away stunned. "That capability at any time could be turned around on the American people," he said in 1975, "and no American would have any privacy left, such is the capability to monitor everything: telephone conversations, telegrams, it doesn't matter. There would be no place to hide." He added that if a dictator ever took over, the N.S.A. 
"could enable it to impose total tyranny, and there would be no way to fight back." At the time, the agency had the ability to listen to only what people said over the telephone or wrote in an occasional telegram; they had no access to private letters. But today, with people expressing their innermost thoughts in e-mail messages, exposing their medical and financial records to the Internet, and chatting constantly on cellphones, the agency virtually has the ability to get inside a person's mind. The N.S.A.'s original target had been the Communist bloc. The agency wrapped the Soviet Union and its satellite nations in an electronic cocoon. Anytime an aircraft, ship or military unit moved, the N.S.A. would know. And from 22,300 miles in orbit, satellites with super-thin, football-field-sized antennas eavesdropped on Soviet communications and weapons signals. Today, instead of eavesdropping on an enormous country that was always chattering and never moved, the N.S.A. is trying to find small numbers of individuals who operate in closed cells, seldom communicate electronically (and when they do, use untraceable calling cards or disposable cellphones) and are constantly traveling from country to country. During the cold war, the agency could depend on a constant flow of American-born Russian linguists from the many universities around the country with Soviet studies programs. Now the government is forced to search ethnic communities to find people who can speak Dari, Urdu or Lingala - and also pass a security clearance that frowns on people with relatives in their, or their parents', former countries. According to an interview last year with Gen. Michael V. Hayden, then the N.S.A.'s director, intercepting calls during the war on terrorism has become a much more complex endeavor. On Sept. 10, 2001, for example, the N.S.A. intercepted two messages. The first warned, "The match begins tomorrow," and the second said, "Tomorrow is zero hour." 
But even though they came from suspected Al Qaeda locations in Afghanistan, the messages were never translated until after the attack on Sept. 11, and not distributed until Sept. 12. What made the intercepts particularly difficult, General Hayden said, was that they were not "targeted" but intercepted randomly from Afghan pay phones. This makes identification of the caller extremely difficult and slow. "Know how many international calls are made out of Afghanistan on a given day? Thousands," General Hayden said. Still, the N.S.A. doesn't have to go to the courts to use its electronic monitoring to snare Al Qaeda members in Afghanistan. For the agency to snoop domestically on American citizens suspected of having terrorist ties, it first must go to the Foreign Intelligence Surveillance Court, or FISA, make a showing of probable cause that the target is linked to a terrorist group, and obtain a warrant. The court rarely turns the government down. Since it was established in 1978, the court has granted about 19,000 warrants; it has only rejected five. And even in those cases the government has the right to appeal to the Foreign Intelligence Surveillance Court of Review, which in 27 years has only heard one case. And should the appeals court also reject the warrant request, the government could then appeal immediately to a closed session of the Supreme Court. Before the Sept. 11 attacks, the N.S.A. normally eavesdropped on a small number of American citizens or resident aliens, often a dozen or less, while the F.B.I., whose low-tech wiretapping was far less intrusive, requested most of the warrants from FISA. Despite the low odds of having a request turned down, President Bush established a secret program in which the N.S.A. would bypass the FISA court and begin eavesdropping without warrant on Americans. 
This decision seems to have been based on a new concept of monitoring by the agency, a way, according to the administration, to effectively handle all the data and new information. At the time, the buzzword in national security circles was data mining: digging deep into piles of information to come up with some pattern or clue to what might happen next. Rather than monitoring a dozen or so people for months at a time, as had been the practice, the decision was made to begin secretly eavesdropping on hundreds, perhaps thousands, of people for just a few days or a week at a time in order to determine who posed potential threats. Those deemed innocent would quickly be eliminated from the watch list, while those thought suspicious would be submitted to the FISA court for a warrant. In essence, N.S.A. seemed to be on a classic fishing expedition, precisely the type of abuse the FISA court was put in place to stop. At a news conference, President Bush himself seemed to acknowledge this new tactic. "FISA is for long-term monitoring," he said. "There's a difference between detecting so we can prevent, and monitoring." This eavesdropping is not the Bush administration's only attempt to expand the boundaries of what is legally permissible. In 2002, it was revealed that the Pentagon had launched Total Information Awareness, a data mining program led by John Poindexter, a retired rear admiral who had served as national security adviser under Ronald Reagan and helped devise the plan to sell arms to Iran and illegally divert the proceeds to rebels in Nicaragua. Total Information Awareness, known as T.I.A., was intended to search through vast databases, promising to "increase the information coverage by an order-of-magnitude." 
According to a 2002 article in The New York Times, the program "would permit intelligence analysts and law enforcement officials to mount a vast dragnet through electronic transaction data ranging from credit card information to veterinary records, in the United States and internationally, to hunt for terrorists." After press reports, the Pentagon shut it down, and Mr. Poindexter eventually left the government. But according to a 2004 General Accounting Office report, the Bush administration and the Pentagon continued to rely heavily on data-mining techniques. "Our survey of 128 federal departments and agencies on their use of data mining," the report said, "shows that 52 agencies are using or are planning to use data mining. These departments and agencies reported 199 data-mining efforts, of which 68 are planned and 131 are operational." Of these uses, the report continued, "the Department of Defense reported the largest number of efforts." The administration says it needs this technology to effectively combat terrorism. But the effect on privacy has worried a number of politicians. After he was briefed on President Bush's secret operation in 2003, Senator Jay Rockefeller, the Democratic vice chairman of the Senate Select Committee on Intelligence, sent a letter to Vice President Dick Cheney. "As I reflected on the meeting today and the future we face," he wrote, "John Poindexter's T.I.A. project sprung to mind, exacerbating my concern regarding the direction the administration is moving with regard to security, technology, and surveillance." Senator Rockefeller sounds a lot like Senator Frank Church. "I don't want to see this country ever go across the bridge," Senator Church said. "I know the capacity that is there to make tyranny total in America, and we must see to it that this agency and all agencies that possess this technology operate within the law and under proper supervision, so that we never cross over that abyss. 
That is the abyss from which there is no return." James Bamford is the author of "Puzzle Palace" and "Body of Secrets: Anatomy of the Ultra-Secret National Security Agency." From checker at panix.com Sat Jan 7 20:52:57 2006 From: checker at panix.com (Premise Checker) Date: Sat, 7 Jan 2006 15:52:57 -0500 (EST) Subject: [Paleopsych] Economist: Japan's humanoid robots: Better than people Message-ID: Japan's humanoid robots: Better than people http://economist.com/world/asia/PrinterFriendly.cfm?story_id=5323427 5.12.20 [Note the last sentence.] HER name is MARIE, and her impressive set of skills comes in handy in a nursing home. MARIE can walk around under her own power. She can distinguish among similar-looking objects, such as different bottles of medicine, and has a delicate enough touch to work with frail patients. MARIE can interpret a range of facial expressions and gestures, and respond in ways that suggest compassion. Although her language skills are not ideal, she can recognise speech and respond clearly. Above all, she is inexpensive. Unfortunately for MARIE, however, she has one glaring trait that makes it hard for Japanese patients to accept her: she is a flesh-and-blood human being from the Philippines. If only she were a robot instead. Robots, you see, are wonderful creatures, as many a Japanese will tell you. They are getting more adept all the time, and before too long will be able to do cheaply and easily many tasks that human workers do now. They will care for the sick, collect the rubbish, guard homes and offices, and give directions on the street. This is great news in Japan, where the population has peaked, and may have begun shrinking in 2005. With too few young workers supporting an ageing population, somebody--or something--needs to fill the gap, especially since many of Japan's young people will be needed in science, business and other creative or knowledge-intensive jobs. Many workers from low-wage countries are eager to work in Japan. 
The Philippines, for example, has over 350,000 trained nurses, and has been pleading with Japan--which accepts only a token few--to let more in. Foreign pundits keep telling Japan to do itself a favour and make better use of cheap imported labour. But the consensus among Japanese is that visions of a future in which immigrant workers live harmoniously and unobtrusively in Japan are pure fancy. Making humanoid robots is clearly the simple and practical way to go. Japan certainly has the technology. It is already the world leader in making industrial robots, which look nothing like pets or people but increasingly do much of the work in its factories. Japan is also racing far ahead of other countries in developing robots with more human features, or that can interact more easily with people. A government report released this May estimated that the market for "service robots" will reach ¥1.1 trillion ($10 billion) within a decade. The country showed off its newest robots at a world exposition this summer in Aichi prefecture. More than 22m visitors came, 95% of them Japanese. The robots stole the show, from the nanny robot that babysits to a Toyota that plays a trumpet. And Japan's robots do not confine their talents to controlled environments. As they gain skills and confidence, robots such as Sony's QRIO (pronounced "curio") and Honda's ASIMO are venturing to unlikely places. They have attended factory openings, greeted foreign leaders, and rung the opening bell on the New York Stock Exchange. ASIMO can even take the stage to accept awards. The friendly face of technology So Japan will need workers, and it is learning how to make robots that can do many of their jobs. But the country's keen interest in robots may also reflect something else: it seems that plenty of Japanese really like dealing with robots. Few Japanese have the fear of robots that seems to haunt westerners in seminars and Hollywood films. 
In western popular culture, robots are often a threat, either because they are manipulated by sinister forces or because something goes horribly wrong with them. By contrast, most Japanese view robots as friendly and benign. Robots like people, and can do good. The Japanese are well aware of this cultural divide, and commentators devote lots of attention to explaining it. The two most favoured theories, which are assumed to reinforce each other, involve religion and popular culture. Most Japanese take an eclectic approach to religious beliefs, and the native religion, Shintoism, is infused with animism: it does not make clear distinctions between inanimate things and organic beings. A popular Japanese theory about robots, therefore, is that there is no need to explain why Japanese are fond of them: what needs explaining, rather, is why westerners allow their Christian hang-ups to get in the way of a good technology. When Honda started making real progress with its humanoid-robot project, it consulted the Vatican on whether westerners would object to a robot made in man's image. Japanese popular culture has also consistently portrayed robots in a positive light, ever since Japan created its first famous cartoon robot, Tetsuwan Atomu, in 1951. Its name in Japanese refers to its atomic heart. Putting a nuclear core into a cartoon robot less than a decade after Hiroshima and Nagasaki might seem an odd way to endear people to the new character. But Tetsuwan Atomu--being a robot, rather than a human--was able to use the technology for good. Over the past half century, scores of other Japanese cartoons and films have featured benign robots that work with humans, in some cases even blending with them. One of the latest is a film called "Hinokio", in which a reclusive boy sends a robot to school on his behalf and uses virtual-reality technology to interact with classmates. 
Among the broad Japanese public, it is a short leap to hope that real-world robots will soon be able to pursue good causes, whether helping to detect landmines in war-zones or finding and rescuing victims of disasters. The prevailing view in Japan is that the country is lucky to be uninhibited by robophobia. With fewer of the complexes that trouble many westerners, so the theory goes, Japan is free to make use of a great new tool, just when its needs and abilities are happily about to converge. "Of all the nations involved in such research," the Japan Times wrote in a 2004 editorial, "Japan is the most inclined to approach it in a spirit of fun." These sanguine explanations, however, may capture only part of the story. Although they are at ease with robots, many Japanese are not as comfortable around other people. That is especially true of foreigners. Immigrants cannot be programmed as robots can. You never know when they will do something spontaneous, ask an awkward question, or use the wrong honorific in conversation. But, even leaving foreigners out of it, being Japanese, and having always to watch what you say and do around others, is no picnic. It is no surprise, therefore, that Japanese researchers are forging ahead with research on human interfaces. For many jobs, after all, lifelike features are superfluous. A robotic arm can gently help to lift and reposition hospital patients without being attached to a humanoid form. The same goes for robotic spoons that make it easier for the infirm to feed themselves, power suits that help lift heavy grocery bags, and a variety of machines that watch the house, vacuum the carpet and so on. Yet the demand for better robots in Japan goes far beyond such functionality. Many Japanese seem to like robot versions of living creatures precisely because they are different from the real thing. An obvious example is AIBO, the robotic dog that Sony began selling in 1999. 
The bulk of its sales have been in Japan, and the company says there is a big difference between Japanese and American consumers. American AIBO buyers tend to be computer geeks who want to hack the robotic dog's programming and delve into its innards. Most Japanese consumers, by contrast, like AIBO because it is a clean, safe and predictable pet. AIBO is just a fake dog. As the country gets better at building interactive robots, their advantages for Japanese users will multiply. Hiroshi Ishiguro, a roboticist at Osaka University, cites the example of asking directions. In Japan, says Mr Ishiguro, people are even more reluctant than in other places to approach a stranger. Building robotic traffic police and guides will make it easier for people to overcome their diffidence. Karl MacDorman, another researcher at Osaka, sees similar social forces at work. Interacting with other people can be difficult for the Japanese, he says, "because they always have to think about what the other person is feeling, and how what they say will affect the other person." But it is impossible to embarrass a robot, or be embarrassed, by saying the wrong thing. To understand how Japanese might find robots less intimidating than people, Mr MacDorman has been investigating eye movements, using headsets that monitor where subjects are looking. One oft-cited myth about Japanese, that they rarely make eye contact, is not really true. When answering questions put by another Japanese, Mr MacDorman's subjects made eye contact around 30% of the time. But Japanese subjects behave intriguingly when they talk to Mr Ishiguro's android, ReplieeQ1. The android's face has been modelled on that of a famous newsreader, and sophisticated actuators allow it to mimic her facial movements. When answering the android's questions, Mr MacDorman's Japanese subjects were much more likely to look it in the eye than they were a real person. 
Mr MacDorman wants to do more tests, but he surmises that the discomfort many Japanese feel when dealing with other people has something to do with his results, and that they are much more at ease when talking to an android. Eventually, interactive robots are going to become more common, not just in Japan but in other rich countries as well. As children and the elderly begin spending time with them, they are likely to develop emotional reactions to such lifelike machines. That is human nature. Upon meeting Sony's QRIO, your correspondent promptly referred to it as "him" three times, despite trying to remember that it is just a battery-operated device. What seems to set Japan apart from other countries is that few Japanese are all that worried about the effects that hordes of robots might have on its citizens. Nobody seems prepared to ask awkward questions about how it might turn out. If this bold social experiment produces lots of isolated people, there will of course be an outlet for their loneliness: they can confide in their robot pets and partners. Only in Japan could this be thought less risky than having a compassionate Filipina drop by for a chat. From checker at panix.com Sat Jan 7 20:53:21 2006 From: checker at panix.com (Premise Checker) Date: Sat, 7 Jan 2006 15:53:21 -0500 (EST) Subject: [Paleopsych] Chicago Tribune: Advances in science: Answering the big questions Message-ID: Advances in science: Answering the big questions http://www.chicagotribune.com/news/opinion/chi-0512110186dec11,1,6233758.story?coll=chi-opinionfront-hed&ctrack=1&cset=true World's scientists predict what's next in coming 25 years December 11, 2005 By Ronald Kotulak Tribune science reporter To celebrate the 125th anniversary of its founding by Thomas Edison, the journal Science asked more than 100 of the world's top scientists what they thought were the 25 most important scientific questions likely to be answered in the next 25 years. 
The 25 big questions range from what is consciousness (the mysterious interplay of brain cells and neurotransmitters that conjures up awareness and the ability to ask questions) to what is the universe made of. What distinguishes humans from all other species is that capacity to formulate questions--and to find answers that lead to more questions. Children start asking "why" almost as soon as they learn to talk. Why is the sky blue? Do mosquitoes go to the bathroom? Asking the right question is the driving force behind science's amazing run of successes in explaining how the world works. "Children ask the most natural and the most difficult questions because they really do want explanations in which they can understand relationships between cause and effect," said Donald Kennedy, editor in chief of Science. "Scientists proceed in much the same way," he said. "They see some complicated outcome and they say, What produced this? I'm not going to be satisfied with just describing that it happened; I want to know what put it in motion." Whereas a frustrated parent may answer a child's inquisitiveness with "because I told you so," scientists must frame a question in such a way that it poses a hypothesis--a theory that tries to explain how something works--that can be tested to determine if it is true or not. Questions are more important than answers in shaping the future of science, Kennedy wrote in an editorial in Science, adding that science is about questions while research is about answers. "The essential feature of a good question is that it is ultimately testable or answerable," he wrote. "The big question that can never be wrestled with isn't worth much." In 1943 Erwin Schrodinger posed one of the most famous questions ever recorded when he asked, "What is life?" Enough tantalizing clues are known, he postulated, to begin looking for the molecules of life. 
Schrodinger's question, and his slim book of the same title, inspired a generation of young scientists, including James Watson and Francis Crick, who won the race to decipher the chemical structure of DNA. But Watson and Crick's achievement was only the start of a cascade of new questions: What are genes? Are there disease genes? Why do humans have so many fewer genes than previously thought? That last question opened a new field of epigenetics, which studies the role the environment plays in determining how genes are expressed. Scientists ask questions because they have an overwhelming urge to know why things are the way they are. The knowledge learned scrapes away the crust of ignorance that limits human progress. "In many cases, the answers are going to have a big impact on human well-being, and not just in the medical sense," Kennedy said. "People who explore the cosmos try to put our solar system, Earth and everybody on it in some kind of grander context in terms of our universe." Some answers seem to have no immediate relevance at the time of discovery, but later turn out to have a major impact on society. When Michael Faraday was demonstrating his equipment for generating the newly discovered phenomenon of electricity in the early 1800s, the British chancellor of the exchequer, William Gladstone, said: "It is very interesting, Mr. Faraday; but what practical worth is it?" Faraday replied: "One day, sir, you may tax it." The ancient Greeks were masters at asking questions and coming up with philosophical answers that were intellectually satisfying but usually not testable. It wasn't until the Age of Enlightenment starting in the 1600s that the scientific method--observe, form a hypothesis, test it--took hold. The flood of discoveries that followed changed the world. Every now and then, particularly after a surge of great discoveries, someone, often a scientist, would say that science has learned all there is to learn. 
"The more important fundamental laws and facts of physical science have all been discovered, and these are now so firmly established that the possibility of their ever being supplanted in consequence of new discoveries is exceedingly remote," Albert A. Michelson, who measured the speed of light, said in a speech dedicating the Ryerson Physics Lab at the University of Chicago in 1894. He could not anticipate the revolutionary discoveries of relativity and quantum mechanics that soon followed. The most recent pessimistic forecast is a 1996 book, "The End of Science" by John Horgan, which claimed all the big questions have been asked and answered. What's left, he said, is simply filling in the details. Most scientists, however, believe that there may be no end to big questions and that they will lead to big discoveries. "It is very plain that science has as much going for it now as it ever had. There are even more questions," Kennedy said. Besides consciousness, the small number of human genes and what the universe is made of, the other big questions on Science's list of 25 are:

- To what extent are genetic variation and personal health linked?
- Can the laws of physics be unified?
- How much can the human life span be extended?
- What controls organ regeneration?
- How can a skin cell become a nerve cell?
- How does a single somatic cell become a whole plant?
- How does Earth's interior work?
- Are we alone in the universe?
- How and where did life on Earth arise?
- What determines species diversity?
- What genetic changes made us uniquely human?
- How are memories stored and retrieved?
- How did cooperative behavior evolve?
- How will big pictures emerge from a sea of biological data?
- How far can we push chemical self-assembly?
- What are the limits of conventional computing?
- Can we selectively shut off the immune responses?
- Do deeper principles underlie quantum uncertainty and non-locality?
- Is an effective HIV vaccine feasible?
- How hot will the greenhouse world be?
- What can replace cheap oil, and when?
- Will Thomas Malthus [who predicted that overpopulation could lead to a global disaster] continue to be wrong?

Scientists are already working on these questions and many more. "We thought these had the biggest potential for impact and the ability to be answered in the next 25 years," said Science executive editor Monica Bradford. From checker at panix.com Sat Jan 7 20:54:54 2006 From: checker at panix.com (Premise Checker) Date: Sat, 7 Jan 2006 15:54:54 -0500 (EST) Subject: [Paleopsych] Undernews: Censorship Growing On Internet: Suppressing Anti-Bush Material Message-ID: Censorship Growing On Internet: Suppressing Anti-Bush Material Undernews WAYNE MADSEN REPORT - Internet censorship. It did not happen overnight but slowly came to America's shores from testing grounds in China and the Middle East. Progressive and investigative journalist web site administrators are beginning to talk to each other about it, e-mail users are beginning to understand why their e-mail is being disrupted by it, major search engines appear to be complying with it, and the low to equal signal-to-noise ratio of legitimate e-mail and spam appears to be perpetuated by it. . . Take, for example, what recently occurred when two journalists were talking on the phone about a story that appeared on Google News. The story was about a Christian fundamentalist move in Congress to use U.S. military force in Sudan to end genocide in Darfur. The story appeared on the English Google News site in Qatar. But the very same Google News site, when accessed simultaneously in Washington, DC, failed to show the article. This censorship is accomplished by geo-location filtering: the restriction or modification of web content based on the geographical region of the user. In addition to countries, such filtering can now be implemented for states, cities, and even individual IP addresses. . . 
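The mechanism described above--serving different results depending on the region inferred from the requester's IP address--can be sketched in a few lines of Python. This is a minimal illustration only: the prefix-to-region table, the article list, and all names are invented for the example (real services use large commercial geo-IP databases), and the IP ranges used are reserved documentation addresses.

```python
# Hypothetical sketch of geo-location filtering: include or suppress
# items based on the region inferred from the requester's IP address.
# The prefix table and article list are invented for illustration.
import ipaddress

# Invented mapping of network prefixes to region codes.
REGION_BY_PREFIX = {
    ipaddress.ip_network("203.0.113.0/24"): "QA",   # stand-in for a Qatari range
    ipaddress.ip_network("198.51.100.0/24"): "US",  # stand-in for a Washington, DC range
}

# Invented article list; 'suppress_in' marks regions where the item is hidden.
ARTICLES = [
    {"title": "Congress debates Darfur intervention", "suppress_in": {"US"}},
    {"title": "Local weather roundup", "suppress_in": set()},
]

def region_of(ip: str) -> str:
    """Return the region code for an IP address, or UNKNOWN if unmatched."""
    addr = ipaddress.ip_address(ip)
    for net, region in REGION_BY_PREFIX.items():
        if addr in net:
            return region
    return "UNKNOWN"

def results_for(ip: str) -> list:
    """Return the article titles visible to a requester at this IP."""
    region = region_of(ip)
    return [a["title"] for a in ARTICLES if region not in a["suppress_in"]]
```

The same query from two IP addresses yields two different result lists, which is exactly the behavior the two journalists in the anecdote observed when comparing Google News pages from Qatar and Washington.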
News reports on CIA prisoner flights and secret prisons are disappearing from Google and other search engines like Alltheweb as fast as they appear. Here now, gone tomorrow is the name of the game. Google is systematically failing to list and link to articles that contain explosive information about the Bush administration, the war in Iraq, Al Qaeda, and U.S. political scandals. But Google is not alone in working closely to stifle Internet discourse. America Online, Microsoft, Yahoo and others are slowly turning the Internet into an information superhighway dominated by barricades, toll booths, off-ramps that lead to dead ends, choke points, and security checks. America Online is the most egregious in stifling Internet freedom. A former AOL employee noted how AOL and other Internet Service Providers cooperate with the Bush administration in censoring email. The Patriot Act gave federal agencies the power to review information down to the packet level, and AOL was directed by agencies like the FBI to do more than sniff the subject line. The AOL terms of service have gradually been expanded to grant AOL virtually universal power regarding information. Many AOL users are likely unaware of the elastic clause, which says they will be bound by the current TOS and any TOS revisions which AOL may elect at any time in the future. Essentially, AOL users have agreed in advance to allow the censorship and non-delivery of their email. Microsoft has similar requirements for Hotmail, as do Yahoo and Google for their respective e-mail services. There are also many cases of Google's search engine failing to list and link to certain information. According to a number of web site administrators who carry anti-Bush political content, this situation has become more pronounced in the last month. In addition, many web site administrators are reporting a dramatic drop-off in hits to their sites, according to their web statistic analyzers. 
http://waynemadsenreport.com/ From checker at panix.com Sun Jan 8 19:59:32 2006 From: checker at panix.com (Premise Checker) Date: Sun, 8 Jan 2006 14:59:32 -0500 (EST) Subject: [Paleopsych] Edge Annual Question 1998: What Is Your Question? Message-ID: Edge Annual Question 1998: What Is Your Question? EDGE 3rd Culture: THE WORLD QUESTION CENTER http://www.edge.org/3rd_culture/wqc/wqc_p1.html et seq. [This is the first in the series.] THE WORLD QUESTION CENTER Dedicated to the Memory of James Lee Byars 1932-1997 Introduction by John Brockman Everything has been explained. There is nothing left to consider. The explanation can no longer be treated as a definition. The question: a description. The answer: not explanation, but a description and knowing how to consider it. Asking or telling: there isn't any difference. The final elegance: assuming, asking the question. No answers. No explanations. "Why do you demand explanations? If they are given, you will once more be facing a terminus. They cannot get you any further than you are at present."1 The solution: not an explanation: a description and knowing how to consider it. Experience a minute. Experience an hour. Can you experience a minute and an hour together, simultaneously, at the same time? This is an important question to ask. No explanation, no solution, but consideration of the question. "Every proposition proposing a fact must in its complete analysis propose the general character of the universe required for the fact."2 The description, the proposition: not a definition, but a commission. "Understanding a commission means: knowing what one has got to do."3 Any new style, any new life, any new world, is but a god where gods are no longer valid. "The god that one so finds is but a word born of words, and returns to the word. 
For the reply we make to ourselves is assuredly never anything other than the question itself."4 "Our kind of innovation consists not in the answers, but in the true novelty of the questions themselves; in the statement of problems, not in their solutions."5 What is important is not "to illustrate a truth--or even an interrogation--known in advance, but to bring to the world certain interrogations . . . not yet known as such to themselves."6 A total synthesis of all human knowledge will not result in fantastic amounts of data, or in huge libraries filled with books. There's no value any more in amount, in quantity, in explanation. For a total synthesis of human knowledge, use the interrogative. Ask the most subtle sensibilities in the world what questions they are asking themselves. --from By the Late John Brockman, 1969 ----- In EDGE 19, I presented a eulogy to honor my friend and collaborator of sorts, the artist James Lee Byars, who died in Egypt last May. I met Byars in 1969 when he sought me out after the publication of my first book, By the Late John Brockman. We were both in the art world, we shared an interest in language, in the uses of the interrogative, in avoiding the anesthesiology of wisdom, and in "the Steins"--Einstein, Gertrude Stein, Wittgenstein, and Frankenstein. In 1971, our dialogue, in part, informed the creation by James Lee of the WORLD QUESTION CENTER. I wrote the following about his project at the time of his death: "James Lee inspired the idea that led to the Reality Club (and subsequently to EDGE), and is responsible for the motto of the club. He believed that to arrive at an axiology of societal knowledge it was pure folly to go to a Widener Library and read 6 million volumes of books. (In this regard he kept only four books at a time in a box in his minimally furnished room, replacing books as he read them.) 
This led to his creation of the World Question Center in which he planned to gather the 100 most brilliant minds in the world together in a room, lock them behind closed doors, and have them ask each other the questions they were asking themselves. The expected result, in theory, was to be a synthesis of all thought. But between idea and execution are many pitfalls. James Lee identified his 100 most brilliant minds (a few of them have graced the pages of this Site), called each of them, and asked what questions they were asking themselves. The result: 70 people hung up on him." That was in 1971. New technologies=new perceptions. The Internet and email now allow for a serious implementation of Jimmy Lee's grand design, and I am pleased to note that among the contributors are Freeman Dyson and Murray Gell-Mann, two names on his 1971 list of "the 100 most brilliant minds in the world." For the first anniversary edition of EDGE I asked a number of those people I consider to be part of "the third culture" to use the interrogative. I have asked "the most subtle sensibilities in the world what questions they are asking themselves." I am pleased to present the World Question Center. -JB http://www.edge.org/3rd_culture/wqc/wqc_p2.html "Given the ability of regulatory proteins to rescue functions between taxa that haven't shared a common ancestor for over 600 million years, how do we integrate this into the way we think about the evolution of phenotype?" JEREMY C. AHOUSE Works in developmental genetics at University of Wisconsin, Madison. "Is a greater understanding of the way the brain works going to give me a new language to explain what it is like to be me? Will the words we use now one day seem as strange as the 'humours' we once used to explain the state of our bodies? And what will be the consequence if a scientist gains the power to know me better than I can know myself?" ALUN ANDERSON Editor of New Scientist, biologist and author of Science And Technology In Japan. 
"What is the crucial distinction between inanimate matter and an entity which can act as an 'agent', manipulating the world on its own behalf; and how does that change happen?" PHILIP ANDERSON Nobel laureate physicist at Princeton. "Exactly how much of nature can we trash and burn and get away with it?" NATALIE ANGIER Science writer for The New York Times; author of Natural Obsessions, The Beauty Of The Beastly. "To what extent can we achieve a more just society through the use of better economic indicators, and to what extent is our choice of economic indicators just a reification of the wishes of those who are already economically powerful?" JOHN BAEZ Mathematical physicist at University of California, Riverside. "What if Gutenberg had invented the world wide web instead of the movable type slug? How would the questions scientists chose to ask themselves over the past five centuries, and the language in which they chose to answer, have been different?" JAMES BAILEY Former executive at Thinking Machines; author of After Thought. "As a theoretical physicist, the interpretation of quantum mechanics and the nature of time are what occupy me most, but, as a mystified sentient being, I should like to ask the child's question: Are the most remarkable things in life--sights, sounds, colors, tastes--really just subjective epiphenomena with no role or significance in the 'objective' world?" JULIAN BARBOUR Theoretical physicist; author of The Frame Of Mind. "Will we ever generate enough bandwidth to convey prana?" JOHN PERRY BARLOW Co-founder, Electronic Frontier Foundation; a former lyricist for the Grateful Dead. "Is the Universe a great mechanism, a great computation, a great symmetry, a great accident, or a great thought?" "Is there enough information in the observable universe to identify the fundamental laws of Nature beyond all reasonable doubt?" "Are there other minds that think about us?" JOHN D. 
BARROW Cosmologist, Professor of Astronomy, University of Sussex, UK; author of Theories Of Everything; Pi In The Sky. "How can we build a new ethics of respect for life that goes beyond individual survival to include the necessity of death, the preservation of the environment, and our current and developing scientific knowledge?" MARY CATHERINE BATESON Anthropologist, George Mason University; author of Composing A Life; Peripheral Visions. "How can considering the longest time scales in human endeavor lead us to deal with the approaching crises of greenhouse warming and species diversity?" GREGORY BENFORD Physicist, University of California, Irvine; author of Timescape. "How do we make long-term thinking automatic and common instead of difficult and rare?" STEWART BRAND Founder of The Whole Earth Catalog; author of How Buildings Learn. "Which cognitive skills develop in any reasonably normal human environment and which only in specific socio-cultural contexts?" JOHN T. BRUER President, James S. McDonnell Foundation. "What is the mathematical essence that distinguishes living from non-living, so that we can engineer a transcendence across the current boundaries?" ROD BROOKS Computer scientist; director of MIT's AI Lab. "Do humans have evolved homicide modules--evolved psychological mechanisms specifically dedicated to killing other humans under certain contexts?" DAVID BUSS Psychologist at University of Texas at Austin; author of The Evolution Of Desire. "If Mosaic had never supported pictures (read: the Internet didn't become a commercial medium), what would I be doing right now?" JASON McCABE CALCANIS Publisher, Silicon Alley Reporter. "How will minds expand, once we understand how the brain makes mind?" WILLIAM H. CALVIN Theoretical neurophysiologist, University of Washington; author of The Cerebral Code; How Brains Think. "Any musically aware listener will know of music that breaks out of established forms or syntax to profound effect-- 
my personal favourites include Beethoven's Eroica symphony, Wagner's Tristan und Isolde, Schoenberg's Erwartung, Debussy's Après-midi d'un faune... What is the most that we can ever say objectively about what those composers are discovering? What are the limits of objective description using science, mathematics and musical analysis? More generally, how do these structures in sound make sense? As of now, I see only very preliminary hypotheses in response to this last question, no possibility of much more given current understanding and techniques, and no consensus as to the ultimate constraints on such an answer." PHILIP CAMPBELL Editor of Nature. "It's probably the case that intergroup competition was an important part of human evolution and there is increasing evidence that 'ethnicity' may be a correlate of 'modernity.' If ethnicity, and the human use of biological cues (and cultural and linguistic cues) to indicate social identity are parts of our evolutionary legacy, it makes it that much harder to eradicate ethnocentrism and racism. Can we do it? How can we engage our focus on the flip side of competition--cooperation?" RACHEL CASPARI Anthropologist at the University of Michigan; coauthor of Race And Human Evolution. "How can we develop an objective language for describing subjective experience?" DAVID CHALMERS Philosopher, University of California, Santa Cruz; author of The Conscious Mind. "When will we learn to ask 'And then what' as a matter of course?" JEREMY CHERFAS Biologist and BBC Radio Four broadcaster; author of The Seed Savers Handbook. "If Gordon Moore was correct in his prediction that the amount of information storable on semiconductor chips would double every 18 months, then over time is time more or less valuable?" LUYEN CHOU President and CEO of Learn Technologies Interactive in New York City, an interactive media developer and publisher. "How can we sustain young people's interest in asking questions such as these? 
Does the emphasis on personal success and security divert psychic energy from taking the long-term view on things? How long can we keep curiosity and creativity alive in an increasingly materialistic culture?" MIHALY CSIKSZENTMIHALYI Psychologist, University of Chicago; author of Flow: The Psychology Of Optimal Experience; Creativity. "What is information and where does it ultimately originate?" PAUL DAVIES Physicist, University of Adelaide, Australia; author of The Mind Of God; Are We Alone. "What might a second specimen of the phenomenon that we call life look like?" RICHARD DAWKINS Evolutionary biologist, Oxford; author of River Out Of Eden; Climbing Mount Improbable. "How can we even begin to formulate the right questions about consciousness?" STANISLAS DEHAENE Cognitive neuropsychologist, Institut National de la Santé; author of The Number Sense. "How on earth does the brain manage its division of labor problem--that is, how do the quite specialized bits manage to contribute something useful when they get 'recruited' by their neighbors to assist in currently dominant tasks (or is this 'recruitment' an illusion--are they not helping but just complaining about the noise caused by their hyperactive neighbors)?" DANIEL C. DENNETT Philosopher, Tufts University; author of Darwin's Dangerous Idea; Kinds Of Minds. "Throughout its history, the scientific community has shown great integrity in resisting the onslaught of anti-rationalism. How can it now be persuaded to show the same integrity in regard to scientism?" DAVID DEUTSCH Physicist, Oxford University; author of The Fabric Of Reality. "Why are decentralized processes ubiquitous in nature and society and why are they so poorly understood that people will sacrifice their autonomy and freedom for authoritarian, centralized solutions (gods, governments, and gurus) to personal and social problems?" ARTHUR DE VANY Professor, Mathematical Behavioral Sciences Dept., University Of California, Irvine. "Is justice real?" 
THOMAS DE ZENGOTITA Anthropologist; teaches philosophy and anthropology at The Dalton School and at the Draper Graduate Program at New York University. "What do collapses of past societies teach us about our own future?" JARED DIAMOND Biologist, UCLA Medical School; author of The Third Chimpanzee; Guns, Germs, And Steel. "Is psychic phenomenon just wishful thinking and can we ever prove it exists or doesn't exist using scientific methodology?" JOHN C. DVORAK Columnist for PC Magazine; PC/Computing, Boardwatch. "What makes a soul? And if machines ever have souls, what will be the equivalent of psychoactive drugs? of pain? of the physical / emotional high I get from having a clean office?" ESTHER DYSON President, EDventure Holdings, Inc; publisher of Release 1.0 Newsletter; author of Release 2.0. "The best questions were asked long ago. For example, Fermi's question, 'Where are they?', and Blake's question, 'How do you know but ev'ry bird that cuts the airy way is an immense world of delight, clos'd by your senses five?' My question is, 'What goes on inside the head of a baby?' " FREEMAN DYSON Physicist, Institute for Advanced Study; author of Disturbing The Universe; From Eros To Gaia. "Why not trees in the oceans?" GEORGE DYSON Leading authority in the field of Russian Aleut kayaks; author of Baidarka; Darwin Among The Machines. http://www.edge.org/3rd_culture/wqc/wqc_p3.html "Will we find the will and the way to limit our population growth before the Biosphere does it for us?" NILES ELDREDGE Paleontologist and Curator at The American Museum of Natural History; author of The High Table; Dominion. "As biological and traditional forms of cultural evolution are superseded by electronic (or post-electronic) evolution, what will be the differentially propagating "units" and the outcome of the natural selection among them?" PAUL EWALD Evolutionary biologist at Amherst; author of Evolution Of Infectious Disease. 
"Will the 'theory of everything' be a theory of principles, not particles? Will it invoke order from above, not below?" KENNETH FORD Retired Director of the American Institute of Physics; author of The World Of Elementary Particles. "However appropriate it may be for the economy, the 'market model' is a grossly inadequate model for the rest of human society. With the decline of religious conviction and the slow pace of changes in the legal code, how can we nurture persons and institutions that can resist a purely market orientation in all spheres of living?" HOWARD GARDNER Psychologist at Harvard; author of Frames Of Mind; The Mind's New Science; Extraordinary Minds. "When will the nation's leading intellectuals come clean & admit that Biblical doctrine (on women, nature, homosexuality, the absolute nature of moral truth and lots of other topics) makes them cringe and they are henceforth NOT Jews and NOT Christians, and the hell with old time religion?" DAVID GELERNTER Computer scientist at Yale; author of Mirror Worlds; Drawing Life. "Is superstring theory (or M-theory, as it has become) the long-sought unified theory of all the elementary particles and forces of nature?" "How can we improve our reward system for excellence in filtering, interpreting, and synthesizing the vast body of so-called information with which we are deluged?" MURRAY GELL-MANN Nobel laureate physicist at the Santa Fe Institute; author of The Quark And The Jaguar. "How can we teach each other to embrace pluralism, and to trust each other with the new tools that promote privacy and freedom of speech?" MIKE GODWIN EFF (Electronic Frontier Foundation) Staff Counsel. "Can science survive the sell-out to technology and the corporate sector?" BRIAN GOODWIN Biologist, Schumacher College; author of How The Leopard Changed Its Spots. "At what point does a complex organic macro-structure become 'alive'?" MARCELO GLEISER Brazilian physicist, Dartmouth; author of The Dancing Universe. 
"How do intelligent beings learn to adapt successfully on their own to a rapidly changing world without forgetting what they already know?" STEPHEN GROSSBERG Cognitive scientist at Boston University; author of Studies Of Mind And Brain; The Adaptive Brain. "It appears likely that the universe that we can observe is just one of an infinity of 'pocket universes,' which are continually being created by a process called eternal inflation. These pocket universes are believed to split off from a region of 'false vacuum', which expands so quickly that its volume increases forever, despite the loss of volume to the formation of pocket universes. The problem is to find a reliable way to extract predictions from this picture. The properties of the pocket universes can vary, and with an infinity of trials essentially anything will happen an infinite number of times. We need to learn how to distinguish the probable from the improbable, but so far such a probability calculation has never been given a precise definition." ALAN GUTH Physicist at MIT; author of The Inflationary Universe. "Are life and consciousness purely emergent phenomena, or subtly connected to a fundamental level of the universe?" STUART HAMEROFF, M.D. Neuroscientist, University of Arizona; coeditor of Toward A Science Of Consciousness. "How can we reconcile our desire for fairness and equity with the brutal fact that people are not all alike?" JUDITH RICH HARRIS Developmental psychologist; co-author of The Child: A Contemporary View Of Development. "It is now possible for functional parts of one animal's brain to be transplanted into another's. A tasty question for future research, one with volatile biomedical and ethical implications, is whether the memories and goals and desires of one animal can be transplanted as well?" MARC D. HAUSER Evolutionary psychologist at Harvard; author of The Evolution Of Communication. "Is there a way to enlarge our separate tribal loyalties, to include all our fellow humans?" 
REUBEN HERSH Mathematician; author of What Is Mathematics, Really? "Where is the frontier?" W. DANIEL HILLIS Computer scientist; V-P of R&D at the Walt Disney Company; author of How Computers Think (forthcoming). "How can we bring up children so that they have the ability to form satisfying relationships and a proper moral sense? How do we construct a society with a proper moral code? Do we know what a proper moral code is?" ROBERT HINDE Ethologist; Fellow, former Master and Royal Society Professor, St. John's College, Cambridge; author of Towards Understanding Relationships; Individuals, Relationships, and Culture. "Can we use our current technology to bring C. P. Snow's two cultures closer together? For example, could we produce a vision-oriented, computer-based version of the cross-cultural artifact envisioned in Hermann Hesse's Das Glasperlenspiel?" JOHN HENRY HOLLAND Computer scientist at the University of Michigan; author of Hidden Order: How Adaptation Builds Complexity; Emergence. "Does anyone who is not a fool or fundamentalist still believe in utopia?" JOHN HORGAN Science writer; author of The End Of Science. "Why and how do we jump to conclusions in mathematics?" VERENA HUBER-DYSON Mathematician; author of Goedel's Theorems; A Workbook On Formalization. "Why is music such a pleasure?" NICHOLAS HUMPHREY Psychologist at The New School for Social Research; author of Consciousness Regained; A History Of The Mind; Leaps Of Faith. "What will be the framework for a scientific study of the subject-object split?" PIET HUT Astrophysicist at the Institute for Advanced Study; President of the Kira Institute. "What are the implications of the science of complex adaptive systems for the nature of law and of legal personhood?" DAVID JOHNSON Attorney; founder of Counsel Connect; Co-Director, Cyberspace Law Institute. http://www.edge.org/3rd_culture/wqc/wqc_p4.html "If humanity ever encounters an alien intelligence, will we be able to communicate with it-- 
or even realize that it is there? GEORGE JOHNSON Writer, The New York Times; author of Fire In The Mind; Machinery Of The Mind. "What happens when the library of human knowledge can process what it knows and provide advice? In other words, what happens when the Library of Alexandria, Computing, and the Oracle at Delphi merge?" BREWSTER KAHLE Computer scientist; founder: Wide Area Information Servers Inc.; The Internet Archive; Alexa. "What must a physical system be such that it can act on its own in an environment?" STUART A. KAUFFMAN Biologist at the Santa Fe Institute; author of Origins Of Order; At Home In The Universe. "What does technology want?" KEVIN KELLY Executive editor, Wired; author of Out Of Control. "Do we or even can we know the joint multi-variable probability density function (f(x1, ... , xn)) that describes any real-world event?" BART KOSKO Electrical engineer at USC; author of Fuzzy Thinking; Nanotime. "Are the laws of physics a logical coherent whole, so that with any small change the entire framework would crumble? Or are there a continuum of possibilities, only one of which happens to have been selected for our observed universe?" LAWRENCE M. KRAUSS Physicist, Case Western Reserve University; author of The Fifth Essence; Fear Of Physics; The Physics Of Star Trek. "How do neural computation principles and the neural networks of our brains, together with the relevant aspects of experience, account for the details of all human concepts, especially their structure, how they are learned, and how they are used in thought and expressed in language?" GEORGE LAKOFF Cognitive scientist, University of California, Berkeley; coauthor of Metaphors We Live By; author of Women, Fire, And Dangerous Things. "How can minds, lives, and relationships be enhanced by information systems in unforeseen ways?" 
"How can scientific and technological culture be articulated so that fewer people are driven to embrace superstitions, and so that technology is more likely to be designed and judged on humanistic terms?" JARON LANIER Computer scientist and musician; pioneer of virtual reality. "With the ever-growing dominance of corporate forms of control in everyday social life, how do we reconcile our notions of personal liberty and autonomy rooted in Enlightenment political thought?" EDWARD O. LAUMANN Sociologist at the University of Chicago; author of The Social Organization Of Sexuality. "For how long can Christianity and Islam survive the recovery of living organisms from beyond our planet by our species?" "Can religion exist after humans have created living entities that reproduce?" RICHARD LEAKEY Paleoanthropologist and former director of Kenya's Wildlife Services; author of Origins Of Humankind and coauthor of The Sixth Extinction. "'What is the question I am asking myself?' ? After contemplating this for hours the only honest answer I could come up with was, 'What is the question I am asking myself?'" SETH LLOYD Physicist at MIT, who works on problems having to do with information and complex systems. "How can we know when and what we do not know?" SIR JOHN MADDOX Editor emeritus of Nature; author of The Doomsday Syndrome; What Remains To Be Discovered (forthcoming). "Do new computing technologies create or destroy jobs?" JOHN MARKOFF Technology reporter, The New York Times; coauthor, Takedown. "When posterity looks back on the 20th Century from the perspective of a hundred years, what will they see as our greatest successes and worst follies?" PAMELA McCORDUCK & JOSEPH TRAUB (McCorduck:) Writer; author of Machines Who Think; coauthor of The Futures Of Women. (Traub:) Computer scientist at Columbia; author of Complexity And Information (forthcoming). 
"What will happen when the male, scientific, hierarchical, control-oriented Western culture that has dominated Western thought integrates with the emerging female, spiritual, holographic, relationship-oriented Eastern way of seeing?" JERRY MICHALSKI Editor, Release 1.0 "Will it be possible to direct young people to the great educational question of learning what they have become without having chosen it, their unknown internal worlds, in the face of the blistering assault of stimuli ( in medias res, truly) they encounter continuously each day?" FRANK MORETTI Philosopher & educator; Co-Director, Institute for Learning Technologies at Columbia. "How come we don't understand how photosynthesis works?" MARNEY MORRIS Founder of Animatrix, an interactive design company; currently teaches interactive design at Stanford. "In 500 years, how will the phenotypic, genotypic and physical spaces occupied by life descended from that on earth have changed?" "How best can we combine democracy and expertise to make the living conditions of the people of earth, especially those currently in hardship, better and more equitable?" OLIVER MORTON Freelance writer, and a contributing editor at Wired and Newsweek International. "How does the capacity for low mood give a selective advantage?" RANDOLPH NESSE, M.D. Psychiatrist at the University of Michigan; coauthor of Why We Get Sick. "How much of what we as persons can experience in life can we share with fellow human beings?" TOR N?RRETRANDERS Danish science writer; author of The User Illusion (forthcoming in the U.S.). "Pont d'Ironie?" HANS ULRICH OBRIST Curator for Musee D'Art Moderne de la Ville de Paris and museum in progress, Vienna; chief editor of the magazine "Point d'Ironie." "Why are religions still vital?" ELAINE H. PAGELS Religious historian at Princeton; author of The Gnostic Gospels; ; The Origin Of Satan. 
"Which industries will shake out, or disappear in the new industrial revolution fomented by the advent of the world wide web, intranets, and extranets? How do we help those who are afraid of these new technologies to benefit from them, rather than be crushed by those who understand? KIP PARENT Intranet and extranet pioneer and engineer; President, Pantheon Interactive. "A chimpanzee cannot understand Bessel functions or the theory of black holes. Human forebrains are a few ounces bigger than a chimp's, and we can ask many more questions than a chimp. Are there facets of the universe we can never know? Are there questions we can't ask?" CLIFFORD A. PICKOVER Computer scientist; author of The Alien Iq Test; The Loom Of God "What is needed regarding the understanding of the mental process so that we will be able to produce thought computationally?" PAOLO PIGNATELLI Cyber-entrepreneur, linguist, translator and scientist who previously worked in image processing algorithms at Bell Labs. "How does the brain represent the meaning of a sentence?" STEVEN PINKER Psychologist at MIT;author of The Language Instinct; How The Mind Works. "Do emotions contribute to intelligence, and if so, what are the implications for the development of a technology of 'affective computing?' " ROBERT R. PROVINE Neurobiologist and psychologist at the University of Maryland; author of Quest For Laughter. "Can our ever-more-integrated society avoid becoming more vulnerable to high-tech extremists and terrorists?" SIR MARTIN REES Royal Society Professor at King's College, Cambridge; author of Before The Beginning. http://www.edge.org/3rd_culture/wqc/wqc_p5.html "Given what we know now about the origins, history, and impacts of technology, is it possible to design, deploy, and use technologies in ways that help humans be more human, instead of more like components in a machine?" HOWARD RHEINGOLD Founder of Electric Minds, a webzine; author of Tools For Thought; Virtual Communities. 
"How to ensure that we develop sciences and technologies that serve the people, are open to democratic scrutiny and which assist rather than hinder humans to live harmoniously with the rest of nature?" STEVEN ROSE Neurobiologist, The Open University; author Lifelines; The Making Of Memory. "Is there a happiness gene, and is it dominant?" LOUIS ROSSETTO Co-founder and Publisher of Wired. "Can human beings achieve spontaneous morality by opening ourselves further to some basic expression of nature, or must we create and adopt a set of moral guidelines?" DOUGLAS RUSHKOFF Author, Cyberia; Media Virus; Ecstasy Club; columnist for New York Times Syndicate and Time Digital. "Why does our 'humanness' keep getting in the way of rational decision-making?" KARL SABBAGH Writer and television producer; author of The Living Body; Skyscraper; 21St Century Jet. "How can the implicit beliefs that are imparted to us in childhood be 'reasoned with' in an educational context." ROGER SCHANK Computer scientist and cognitive psychologist at Northwestern; author of The Creative Attitude; Tell Me A Story. "I often wonder?sometimes despair?whether it will be possible to solve long term, global problems(global warming being my current focus) until we can overcome collective denial, which in turn, may not become conscious until we grapple with personal myths. I question whether the eventual loss of half the other species on Earth will even be enough to overcome personal escapism that has gone collective?what I sometimes think of a 'psychological fractal'. Perhaps that's not even a question, but it occupies my mind a lot." STEPHEN H. SCHNEIDER Atmospheric scientist at Stanford; author of The Genesis Strategy; Laboratory Earth. "Do exotic life forms, made of very different materials than those used by life on earth, occur elsewhere in the Universe?" ROBERT SHAPIRO Biochemist at New York University; author of Origins; The Human Blueprint. "Does reality have real numbers?" 
CHARLES SIMONYI Chief Architect, Microsoft Corporation.

"Fundamentally, is the flow of time something real, or might our sense of time passing be just an illusion that hides the fact that what is real is only a vast collection of moments?"
LEE SMOLIN Theoretical physicist at Penn State; author of The Life Of The Cosmos.

"How to articulate the natural and the social sciences without being either driven or blocked by ideological agendas?"
DAN SPERBER Cognitive and social scientist at the Ecole Polytechnique in Paris; author of Rethinking Symbolism; On Anthropological Knowledge.

"Is it more useful to theorize a new conception of self that emerges from the widespread adoption of networked technology, or to seek to problematize it?"
CARL STEADMAN Cofounder of Suck.

"Why are most individuals and all human societies grossly under-achieving their potentials?"
DUNCAN STEEL Australian research scientist, broadcaster; author of Rogue Asteroids And Doomsday Comets.

"Why can our minds do physics? That is, why does the behavior of the physical world map so neatly onto mathematical laws, given that those laws are (arguably) strings of symbols that our brains happen to be capable of manipulating, apparently as a fortuitous byproduct of some evolutionary process that made our ancestors better adapted to dodging hyenas in the Rift Valley? Why is it that a person sitting in a chair in a room can, by using those leftover hyena-dodging and buffalo-hunting neurons to manipulate symbols in his head, design wing flaps for a 747, or figure out what was happening one femtosecond after the Big Bang?"
NEAL STEPHENSON Novelist; author of The Big U; Zodiac: The Eco-Thriller; The Diamond Age; Snow Crash.

"How shall I teach my children?"
CLIFF STOLL Astronomer; author of The Cuckoo's Egg; Silicon Snake Oil.

"Why not?"
LINDA STONE Director of the Virtual Worlds Group in the Microsoft Advanced Technology and Research Division.
"What was the key factor in the success of Homo sapiens compared with other human species such as the Neanderthals?" CHRIS STRINGER Research paleoanthropologist at The Natural History Museum, London; co-author of In Search Of The Neanderthals; African Exodus. "How predictive is the much sought-after 'Theory of Everything' intended to be? Presumably it will show why the formation of fundamental particles was inevitable, and why these were bound to form into atoms, and presumably predict galaxies. But will it show that life was bound to appear? Or consciousness? How powerful will it be really ? or can it be? What is the Universe really capable of?" "What is religion? Is it necessary? Can we devise a religion for the 21st century and beyond that is plausible and yet avoids banality ? one that people see the need for? What would it be like?" COLIN TUDGE Cambridge biologist and writer; author of Last Animals At The Zoo; The Time Before History. "Why is our western civilization so reluctant to accept subjective, first-hand experience as fundamental data? In close association: why the reluctance to consider one's experience as a realm to be explored with a discipline just as rigorous as the one invented by science for material phenomena?" FRANCISCO VARELA Biologist at the cole Polytechnique, in Paris; author of Principles Of Biological Autonomy; coauthor of Autopoiesis And Cognition. "Why does our species so obsessively document its origins and past yet so persistently ignore the dangerous portents of its future, such as overpopulation?" "Can there be a more reliable definition of intelligence than the ability of a species to realize it has predators and competitors, and then exterminate them ? as we humans have?" PETER D. WARD Paleontologist at University of Washington; author In Search Of Nautilus; The End Of Evolution. 
"Is the phenomenology of modern biology converging on a small number of basic truths or will it increasingly diverge, becoming so endlessly complex that no single human mind will be able to encompass it?" ROBERT A. WEINBERG, M.D. Biologist, MIT; founding member of the Whitehead Institute for Biomedical Research, Cambridge, Mass.; author of Racing To The Beginning Of The Road. "What do we want from science?" MARGARET WERTHEIM Australian science writer; author of Pythagoras' Trousers: God, Physics, And The Gender Wars. "The major change through the prehistory of our species is the evolution of our brain, the development of a social organ that makes human culture (and language) part of our biology. My question is whether we can ever transcend the consequences and free ourselves of the biological limitations that have been imposed in the process." MILFORD H. WOLPOFF Paleoanthropologist at the University of Michigan; author of Paleoanthropology; coauthor of Race And Human Evolution. From checker at panix.com Sun Jan 8 20:00:05 2006 From: checker at panix.com (Premise Checker) Date: Sun, 8 Jan 2006 15:00:05 -0500 (EST) Subject: [Paleopsych] VDARE: Ed Rubenstein: The Stupid American? Look again. Message-ID: Ed Rubenstein: The Stupid American? Look again. http://vdare.com/rubenstein/051222_nd.htm [14]Edwin S. Rubenstein Archive December 22, 2005 National Data, By [17]Edwin S. Rubenstein Psssst: Have you heard? We've lost our competitive edge. Historically the U.S. economy excelled because of the skills and smarts of our workers. But no longer. America's workforce increasingly lags that of [18]other countries in math and literacy skills. We need their [19]brainpower! At first glance, this assertion seems plausible. Math literacy scores for 15-year old students in the U.S. ranked in the lower half of 41 countries studied in 2003. [[20]ECONOMIC IMPACT: Education statistics don't bode well for our future, By Chris Chmura, Richmond Times-Dispatch, Dec 19, 2005] U.S. 
adults ranked 12th among 20 high income countries in composite (document, prose, and quantitative) literacy, according to a separate report released by the [21]Educational Testing Service. A staggering 45 percent of adult Americans cannot read or write at the high school graduate level--and nearly half of those (20 percent) scored at a literacy level below that of a high school dropout. [Educational Testing Service, "[22]The Twin Challenges of Mediocrity and Inequality," February 2002]

International rankings can be misleading, however--especially when many of the countries are small and exhibit little variation in average test scores. Thus, despite our mediocre ranking, the mean literacy test score for U.S. adults (272) was 2 points above the mean for all adults in the 20 country survey (270). The 2 point gap is not statistically significant... but we'll take it.

Larger, statistically significant, literacy gaps between us and them unfold when you separate immigrant from native-born test takers, as is done in 17 [23]high income countries surveyed by ETS. [[24]Table 1]

* U.S. natives scored 8 points above the average native of the 17 high income countries
* U.S. immigrants scored 16 points below the average immigrant in the 17 countries

There are several reasons why immigrants exert more of a literacy drag here than elsewhere. First, they account for a larger share of the population. At the time of the international literacy survey (1994) immigrants accounted for about 13 percent of U.S. adults, fifth highest proportion among the countries surveyed. Only in [25]Australia, [26]Canada, [27]New Zealand, and [28]Switzerland did immigrants account for a larger population share. Second--and far more important--is the abnormally wide gap between native and immigrant literacy capabilities in the U.S.
Here are the average scores and the proportions by which natives outscore immigrants:

* U.S.: immigrants 210; natives 284; 74 points, or 35 percent
* 17 countries: immigrants 226; natives 276; 50 points, or 22 percent

The immigrant-native differential is still larger among high school dropouts--a group that covers one-third of adult U.S. immigrants and only 13 percent of natives:

* U.S.: immigrant dropouts 149; native dropouts 225; 76 points, or 51 percent
* 17 countries: immigrant dropouts 177; native dropouts 243; 66 points, or 37 percent

Needless to say, immigration is not the only factor behind our weak literacy scores. The literacy gap between native-born whites and Asians and their Black and Hispanic counterparts ranges from 46 points, or 19 percent, on the prose and document literacy tests, to 57 points, or 25 percent, on the quantitative test. The wide dispersion of capabilities forces the ETS bureaucracy to state the politically incorrect, albeit obvious: "If we adjust the mean NALS scores for U.S. adults under age 65 to exclude all foreign-born adults as well as native-born Blacks and Hispanics, then the mean prose and quantitative scores of the remaining U.S. adults (Asian and White, native-born) would rise to 288, ranking the U.S. second highest--tied with Finland and Norway--on the prose scale and fifth highest on the quantitative scale.... The findings clearly suggest that future gains in the comparative, international literacy standing of U.S. adults will require substantial improvements in the literacy proficiencies of Blacks, [29]Hispanics, and the foreign born from all racial/ethnic groups." [ETS Report, p. 22]

Or we could settle for immigration reform.

Edwin S. Rubenstein ([30]email him) is President of [31]ESR Research Economic Consultants in Indianapolis.

References

14. http://vdare.com/rubenstein/index.htm
17. http://vdare.com/rubenstein/index.htm
18.
http://vdare.com/rubenstein/Local%20Settings/Temporary%20Internet%20Files/Content.IE5/BYOBRHS1/brainpower
19. http://www.vdare.com/letters/tl_060601.htm
20. http://www.timesdispatch.com/servlet/Satellite?c=MGArticle&cid=1128768779649&pagename=RTD/MGArticle/RTD_BasicArticle&path=!business!columnists&s=1045855934868
21. http://www.vdare.com/rubenstein/051110_nd.htm
22. http://www.ets.org/Media/Research/pdf/PICTWIN.pdf
23. http://www.vdare.com/sailer/wealth_of_nations.htm
24. http://vdare.com/rubenstein/051222_nd_table.htm#t1
25. http://www.vdare.com/misc/051221_fraser.htm
26. http://www.vdare.com/fulford/immigration_cut_off.htm
27. http://www.vdare.com/blog/111204_blog.htm#b2
28. http://vdare.com/zmirak/free_markets.htm
29. http://www.vdare.com/guzzardi/latino_literacy.htm
30. mailto:edwin at esrresearch.com
31. http://www.esrresearch.com/

From checker at panix.com Sun Jan 8 20:00:18 2006
From: checker at panix.com (Premise Checker)
Date: Sun, 8 Jan 2006 15:00:18 -0500 (EST)
Subject: [Paleopsych] Sigma Xi: Randomness as a Resource
Message-ID:

Randomness as a Resource
http://www.americanscientist.org/template/AssetDetail/assetid/20829?&print=y
[Best to get the PDF.]
Brian Hayes

Randomness is not something we usually look upon as a vital natural resource, to be carefully conserved lest our grandchildren run short of it. On the contrary, as a close relative of chaos, randomness seems to be all too abundant and ever-present. Everyone has a closet or a file drawer that offers an inexhaustible supply of disorder. Entropy, another cousin of randomness, even has a law of nature saying it can only increase. And, anyway, even if we were somehow to use up all the world's randomness, who would lament the loss? Fretting about a dearth of randomness seems like worrying that humanity might use up its last reserves of ignorance. Nevertheless, there is a case to be made for the proposition that high-quality randomness is a valuable commodity.
Many events and processes in the modern world depend on a steady supply of the stuff. Furthermore, we don't know how to manufacture randomness; we can only mine it from those regions of the universe that have the richest deposits, or else farm it from seeds gathered in the natural world. So, even if we have not yet reached the point of clear-cutting the last proud acre of old-growth randomness, maybe it's not too early to consider the question of long-term supply.

The Randomness Industry

To appreciate the value of randomness, just imagine a world without it. What would replace the referee's coin flip at the start of a football game? How would a political poll-taker select an unbiased sample of the electorate? Then of course there's the Las Vegas problem. Slot machines devour even more randomness than they do silver dollars. Inside each machine an electronic device spews out random numbers 24 hours a day, whether or not anyone is playing.

There's also a Monte Carlo problem. I speak not of the Mediterranean principality but of the simulation technique named for that place. The Monte Carlo method got its start in the 1940s at Los Alamos, where physicists were struggling to predict the fate of neutrons moving through uranium and other materials. The Monte Carlo approach to this problem is to trace thousands of simulated neutron paths. Whenever a neutron strikes a nucleus, a random number determines the outcome of the event: reflection, absorption or fission. Today the Monte Carlo method is a major industry not only in physics but also in economics and some areas of the life sciences, not to mention hundreds of rotisserie baseball leagues.

Many computer networks would be deadlocked without access to randomness. When two nodes on a network try to speak at once, politeness is not enough to break the impasse.
Each computer might be programmed to wait a certain interval and then try again, but if all computers followed the same rule, they'd keep knocking heads repeatedly until the lights went out. The Ethernet protocol solves this problem by deliberately not giving a fixed rule. Instead, each machine picks a random number between 1 and n, then waits that number of time units before retransmitting; the probability of a second collision is reduced to 1/n.

[Figure 1. Eight specimens of randomness . . .]

Computer science has a whole technology of "randomized algorithms." On first acquaintance the very idea of a randomized algorithm may seem slightly peculiar: An algorithm is supposed to be a deterministic procedure, one that allows no scope for arbitrary choice or caprice, so how can it be randomized? The contradiction is resolved by making the randomness a resource external to the algorithm itself. Where an ordinary algorithm is a black box receiving a stream of bits as input and producing another stream of bits as output, a randomized algorithm has a second input stream made up of random bits.

Sometimes the advantage of a randomized algorithm is clearest when you take an adversarial view of the world. Randomness is what you need to foil an adversary who wants to guess your intentions or predict your behavior. Suppose you are writing a program to search a list of items for some specified target. Given any predetermined search strategy (left to right, right to left, middle outward), an adversary can arrange the list so that the target item is always in the last place you look. But a randomized version of the procedure can't be outguessed so easily; the adversary can't know where to hide the target because the program doesn't decide where to search until it begins reading random bits. In spite of the adversary's best efforts, you can expect to find the target after sifting through half the list.
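The randomized search just described is easy to sketch in code. The snippet below is my own minimal illustration, not from the article: the scan starts at a random offset and wraps around, so even an adversary who pins the target to the last position of any fixed scan order faces an expected cost of about half the list.

```python
import random

def randomized_search(items, target, rng=random):
    """Scan the list starting at a random offset, wrapping around.

    The random start is the algorithm's "second input stream" of random
    bits: the adversary cannot know in advance where the scan begins,
    so no placement of the target forces the worst case every time.
    """
    n = len(items)
    start = rng.randrange(n)  # random bits consumed at run time
    for i in range(n):
        if items[(start + i) % n] == target:
            return i + 1      # number of items examined
    return None

# Adversary pins the target at the end of a left-to-right scan:
items = list(range(99)) + ["target"]
probes = [randomized_search(items, "target", random.Random(s)) for s in range(2000)]
print(sum(probes) / len(probes))  # about 50 probes on average, not 100
```

A full random shuffle of the probe order would serve the same purpose; the cyclic offset merely keeps the sketch short.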
Still another field that can't do without randomness is cryptography, where calculated disorder is the secret to secrecy. The strongest of all cipher systems require a random key as long as the message that's being sent. The late Claude E. Shannon proved that such a cipher is absolutely secure. That is, if the key is truly random, and if it is used only once, an eavesdropper who intercepts an encrypted message can learn nothing about the original text, no matter how much time and effort and computational horsepower are brought to bear on the task. Shannon also showed that no cipher with a key shorter than the message can offer the same degree of security.

But a long key is a considerable inconvenience: hard to generate, hard to distribute. Much of the emphasis in recent cryptological research has been on ways to get by with less randomness, but a recent proposal takes a step in the other direction. The idea is to drown an adversary in a deluge of random bits. The first version of the scheme was put forward in 1992 by Ueli M. Maurer of the Swiss Federal Institute of Technology; more recent refinements (not yet published) have come from Michael O. Rabin of Harvard University and his student Yan Zong Ding.

The heart of the plan is to set up a public beacon, perhaps a satellite, continually broadcasting random bits at a rate so high that no one could store more than a small fraction of them. Parties who want to communicate in privacy share a relatively short key that they both use to select a sequence of random bits from the public broadcast; the selected bits serve as an enciphering key for their messages. An eavesdropper cannot decrypt an intercepted message without a record of the random broadcasts, and cannot keep such a record because it would be too voluminous. How much randomness would the beacon have to broadcast? Rabin and Ding mention a rate of 50 gigabits per second, which would fill up some 800,000 CD-ROMs per day.
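The cipher Shannon analyzed, the one-time pad, is simple enough to state as code. This sketch is illustrative (Python's `secrets` module stands in for a source of true randomness): encryption XORs the message with a key that is random, exactly as long as the message, and never reused; XORing with the same key a second time recovers the plaintext.

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings, byte by byte."""
    return bytes(x ^ y for x, y in zip(a, b))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))  # one random key byte per message byte
ciphertext = xor_bytes(message, key)     # encryption
recovered = xor_bytes(ciphertext, key)   # decryption: XOR with the key undoes itself
print(recovered == message)
```

The reason for Shannon's result is visible here: for a given ciphertext, every plaintext of the same length corresponds to some equally likely key, so the intercepted bytes carry no information about the message. That guarantee evaporates if the key is shorter than the message or is ever reused.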
Supply-Side Issues

Whatever the purpose of randomness, and however light or heavy the demand, it seems like producing the stuff ought to be a cinch. At the very least it should be easier to make random bits than non-random ones, in the same way that it's easier to make a mess than it is to tidy up. If computers can perform long and intricate calculations where a single error could spoil the entire result, then surely they should be able to churn out some patternless digital junk. But they can't. There is no computer program for randomness.

Of course most computer programming languages will cheerfully offer to generate random numbers for you. In Lisp the expression (random 100) produces an integer in the range between 0 and 99, with each of the 100 possible values having equal probability. But these are pseudo-random numbers: They "look" random, but under the surface there is nothing unpredictable about them. Each number in the series depends on those that went before. You may not immediately perceive the rule in a series like 58, 23, 0, 79, 48..., but it's just as deterministic as 1, 2, 3, 4....

[Figure 2. Thermal noise in electronic circuits . . .]
[Figure 3. Radioactive decay offers . . .]

The only source of true randomness in a sequence of pseudo-random numbers is a "seed" value that gets the series started. If you supply identical seeds, you get identical sequences; different seeds produce different numbers. The crucial role of the seed was made clear in the 1980s by Manuel Blum, now of Carnegie Mellon University. He pointed out that a pseudo-random generator does not actually generate any randomness; it stretches or dilutes whatever randomness is in the seed, spreading it out over a longer series of numbers like a drop of pigment mixed into a gallon of paint. For most purposes, pseudo-random numbers serve perfectly well, often better than true random numbers.
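Blum's point, that a pseudo-random generator only stretches the randomness already in the seed, is easy to see in a toy linear congruential generator. The constants below are arbitrary choices for illustration, not a recommended generator:

```python
from itertools import islice

def lcg(seed, a=21, c=1, m=100):
    """Toy linear congruential generator: x -> (a*x + c) mod m.

    Each output is a fixed function of the one before, so the entire
    stream is determined the instant the seed is chosen; the generator
    adds no randomness of its own.
    """
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

print(list(islice(lcg(42), 5)))  # a "random-looking" sequence of values 0-99
print(list(islice(lcg(42), 5)))  # the identical sequence: same seed, same stream
print(list(islice(lcg(7), 5)))   # a different seed yields a different stream
```

Real library generators use far better recurrences, but the dependence on the seed is the same: all the unpredictability in the output was already present in the seed.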
Almost all Monte Carlo work is based on them. Even for some cryptographic applications, where standards are higher and unpredictability is everything, Blum and others have invented pseudo-random generators that meet most needs. Nevertheless, true randomness is still in demand, if only to supply seeds for pseudo-random generators. And if true randomness cannot be created in any mathematical operation, then it will have to come from some physical process.

Extracting randomness from the material world also sounds like an easy enough job. Unpredictable events are all around us: the stock market tomorrow, the weather next week, the orbital position of Pluto in 50 million years. Yet finding events that are totally patternless turns out to be quite difficult. The stories of the pioneering seekers after randomness are chronicles of travail and disappointment.

Consider the experience of the British biometrician W. F. R. Weldon and his wife, the former Florence Tebb. Evidently they spent many an evening rolling dice together, not for money or sport but for science, collecting data for a classroom demonstration of the laws of probability. But in 1900 Karl Pearson analyzed 26,306 of the Weldons' throws and found deviations from those laws; there was an excess of fives and sixes.

In 1901 Lord Kelvin tried to carry out what we would now call a Monte Carlo experiment, but he ran into trouble generating random numbers. In a footnote he wrote: "I had tried numbered billets (small squares of paper) drawn from a bowl, but found this very unsatisfactory. The best mixing we could make in the bowl seemed to be quite insufficient to secure equal chances for all the billets." In 1925 L. H. C. Tippett had the same problem. Trying to make a random selection from a thousand cards in a bag, "it was concluded that the mixing between each draw had not been sufficient, and there was a tendency for neighbouring draws to be alike."
Tippett devised a more elaborate randomizing procedure, and two years later he published a table of 41,600 random digits. But in 1938 G. Udny Yule submitted Tippett's numbers to statistical scrutiny and reported evidence of "patchiness." Ronald A. Fisher and Frank Yates compiled another table of 15,000 random digits, using two decks of playing cards to select numbers from a large table of logarithms. When they were done, they discovered an excess of sixes, and so they replaced 50 of them with other digits "selected at random." (Two of their statistical colleagues, Maurice G. Kendall and Bernard Babington Smith, comment mildly: "A procedure of this kind may cause others, as it did us, some misgiving.")

The ultimate random-number table arrived with a thump in 1955, when the Rand Corporation published a 600-page tome titled A Million Random Digits with 100,000 Normal Deviates. The Rand randomizers used "an electronic roulette wheel" that selected one digit per second. Despite the care taken in the construction of this device, "Production from the original machine showed statistically significant biases, and the engineers had to make several modifications and refinements of the circuits." Even after this tune-up, the results of the month-long run were still unsatisfactory; Rand had to remix and shuffle the numbers before the tables passed statistical tests.

Today there is little interest in publishing tables of numbers, but machines for generating randomness are still being built. Many of them find their source of disorder in the thermal fluctuations of electrons wandering through a resistor or a semiconductor junction. This noisy signal is the hiss or whoosh you hear when you turn up an amplifier's volume control. Traced by an oscilloscope, it certainly looks random and unpredictable, but converting it into a stream of random bits or numbers is not straightforward.
The obvious scheme for digitizing noise is to measure the signal at certain instants and emit a 1 if the voltage is positive or a 0 if it is negative. But it's hard to build a measuring circuit with a precise and consistent threshold between positive and negative voltage. As components age, the threshold drifts, causing a bias in the balance between 1s and 0s. There are circuits and computational tricks to correct this problem, but the need for such fixes suggests just how messy it can be to get a physical device to conform to a mathematical ideal, even when the ideal is that of pure messiness.

Another popular source of randomness is the radioactive decay of atomic nuclei, a quantum phenomenon that seems to be near the ultimate in unpredictability. A simple random-number generator based on this effect might work as follows. A Geiger-Müller tube detects a decay event, while in the background a free-running oscillator generates a high-frequency square-wave signal: a train of positive and negative pulses. At the instant of a nuclear decay, the square wave is sampled, and a binary 1 or 0 is output according to the polarity of the pulse at that moment. Again there are engineering pitfalls. For example, the circuitry's "dead time" after each event may block detection of closely spaced decays. And if the positive and negative pulses in the square wave differ in length even slightly, the output will be biased.

Hardware random-number generators are available as off-the-shelf components you can plug into a port of your computer. Most of them rely on thermal electronic noise. If your computer has one of the latest Intel Pentium processors, you don't need to plug in a peripheral: The random-number generator is built into the CPU chip. There are also several Web sites that serve up free samples of randomness. George Marsaglia of Florida State University has some 4.8 billion carefully tested random bits available to the public.
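The article does not say which computational tricks real devices use to correct bias, but the textbook method, usually attributed to von Neumann, is easy to demonstrate: read the raw bits in pairs, emit the first bit of each unequal pair, and discard equal pairs. If the raw bits are independent with a fixed bias p, the pairs 01 and 10 each occur with probability p(1-p), so the output is balanced. The sketch below is my illustration of that idea:

```python
import random

def von_neumann_extract(bits):
    """Debias an independent-but-biased bit stream.

    Pairs map as (0,1) -> 0 and (1,0) -> 1; the pairs (0,0) and (1,1)
    are discarded.  Both emitted symbols occur with probability p*(1-p),
    so the output is unbiased, at the cost of throwing most bits away.
    """
    return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]

rng = random.Random(1)
biased = [1 if rng.random() < 0.7 else 0 for _ in range(100_000)]  # ~70% ones
unbiased = von_neumann_extract(biased)
print(sum(biased) / len(biased))      # about 0.70
print(sum(unbiased) / len(unbiased))  # close to 0.50
```

The price is steep: with a bias of 0.7 only about 42 percent of the pairs yield an output bit, and the method assumes the raw bits are independent, which drifting hardware may not guarantee.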
And there are less-conventional sources of randomness, most famously "lavarand," at Silicon Graphics, where random bits are extracted from images of the erupting blobs inside six Lava Lite lamps. (Lately the lamps have gone out, although samples remain available at lavarand.sgi.com.)

The Empyrean and the Empirical

As a practical matter, reserves of randomness certainly appear adequate to meet current needs. Consumers of randomness need not fear rolling blackouts this summer. But what of the future? The great beacon of randomness proposed by Rabin and Ding would require technology that remains to be demonstrated. They envision broadcasting 50 billion random bits per second, but randomness generators today typically run at speeds closer to 50 kilobits per second.

[Figure 4. Biased stream of random bits.]

The prospect of scaling up by a factor of a million demands attention to quality as well as quantity. For most commodities, quantity and quality have an inverse relation. A laboratory buying milligrams of a reagent may demand 99.9 percent purity, whereas a factory using carloads can tolerate a lower standard. In the case of randomness, the trade-off is turned upside down. If you need just a few random numbers, any source will do; it's hard to spot biases in a handful of bits. But a Monte Carlo experiment burning up billions of random numbers is exquisitely sensitive to the faintest trends and patterns. The more randomness you consume, the better it has to be.

Why is it hard to make randomness? The fact that maintaining perfect order is difficult surprises no one; but it comes as something of a revelation that perfect disorder is also beyond our reach. As a matter of fact, perfect disorder is the more troubling concept--it is hard not only to attain but also to define or even to imagine. The prevailing definition of randomness was formulated in the 1960s by Gregory J. Chaitin of IBM and by the Russian mathematician A. N. Kolmogorov.
The definition says that a sequence of bits is random if the shortest computer program for generating the sequence is at least as long as the sequence itself. The binary string 101010101010 is not random because there is an easy rule for creating it, whereas 111010001011 is unlikely to have a generating program much shorter than "print 111010001011." It turns out that almost all strings of bits are random by this criterion--they have no concise description--and yet no one has ever exhibited a single string that is certified to be random. The reason is simple: The first string certified to have no concise description would thereby acquire a concise description--namely, that it's the first such string.

The Chaitin-Kolmogorov definition is not the only aspect of randomness verging on the paradoxical or the ironic. Here is another example: True random numbers, captured in the wild, are clearly superior to those bred in captivity by pseudo-random generators--or at least that's what the theory of randomness implies. But Marsaglia has run the output of various hardware and software generators through a series of statistical tests. The best of the pseudo-random generators earned excellent grades, but three hardware devices flunked. In other words, the fakes look more convincingly random than the real thing.

To me the strangest aspect of randomness is its role as a link between the world of mathematical abstraction and the universe of ponderable matter and energy. The fact that randomness requires a physical rather than a mathematical source is noted by almost everyone who writes on the subject, and yet the oddity of this situation is not much remarked. Mathematics and theoretical computer science inhabit a realm of idealized and immaterial objects: points and lines, sets, numbers, algorithms, Turing machines. For the most part, this world is self-contained; anything you need in it, you can make in it.
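The Chaitin-Kolmogorov measure is uncomputable, but an off-the-shelf compressor gives a crude, one-sided proxy for it: a string that compresses well certainly has a short description. The sketch below, using Python's standard zlib compressor, is an illustration of the idea only, not a measure of true algorithmic complexity.

```python
import os
import zlib

def compressed_size(s: bytes) -> int:
    """Length of the zlib-compressed string: an upper bound on 'description size'."""
    return len(zlib.compress(s, 9))

regular = b"10" * 5000           # the pattern 1010... has an easy generating rule
irregular = os.urandom(10000)    # OS entropy: no obvious short description

# The patterned string shrinks dramatically; the entropy barely shrinks at all.
print(compressed_size(regular) < compressed_size(irregular))  # True
```

Note the asymmetry: compression can certify that a string is *not* random, but a string the compressor fails on might still have a short description the compressor cannot find, which is exactly the predicament the definition's paradox describes.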
If a calculation calls for the millionth prime number or the cube root of 2, you can set the computational machinery in motion without ever leaving the precincts of mathland. The one exception is randomness. When a calculation asks for a random number, no mathematical apparatus can supply it. There is no alternative but to reach outside the mathematical empyrean into the grubby world of noisy circuits and decaying nuclei. What a strange maneuver! If some purely mathematical statement--say the formula for solving a quadratic equation--depended on the mass of the earth or the diameter of the hydrogen atom, we would find this disturbing or absurd. Importing randomness into mathematics crosses the same boundary.

Of course there is another point of view: If we choose to look upon mathematics as a science limited to deterministic operations, it's hardly a surprise that absence-of-determinism can't be found there. Perhaps what is really extraordinary is not that randomness lies outside mathematics but that it exists anywhere at all.

Or does it? The savants of the 18th century didn't think so. In their clockwork universe the chain of cause and effect was never broken. Events that appeared to be random were merely too complicated to submit to a full analysis. If we failed to predict the exact motion of an object--a roving comet, a spinning coin--the fault lay not in the unruliness of the movement but in our ignorance of the laws of physics or the initial conditions.

The issue is seen differently today. Quantum mechanics has cast a deep shadow over causality, at least in microscopic domains. And "deterministic chaos" has added its own penumbra, obscuring the details of events that might be predicted in principle, but only if we could gather an unbounded amount of information about them. To a modern sensibility, randomness reflects not just the limits of human knowledge but some inherent property of the world we live in.
Nevertheless, it seems fair to say that most of what goes on in our neighborhood of the universe is mainly deterministic. Coins spinning in the air and dice tumbling on a felt table are not conspicuously quantum-mechanical or chaotic systems. We choose to describe their behavior through the laws of probability only as a matter of convenience; there's no question the laws of angular momentum are at work behind the scenes. If there is any genuine randomness to be found in such events, it is the merest sliver of quantum uncertainty. Perhaps this helps to explain why digging for randomness in the flinty soil of physics is such hard work.

Brian Hayes

Bibliography

* Aumann, Yonatan, and Michael O. Rabin. 1999. Information theoretically secure communication in the limited storage space model. In CRYPTO '99: 19th Annual International Cryptology Conference, Santa Barbara, Calif., August 15-19, 1999, pp. 65-79. Berlin: Springer-Verlag.
* Ding, Yan Zong, and Michael O. Rabin. 2001. Provably secure and non-malleable encryption. Abstract.
* Fisher, R. A., and F. Yates. 1938. Statistical Tables for Biological, Agricultural and Medical Research. London: Oliver & Boyd.
* Ford, Joseph. 1983. How random is a coin toss? Physics Today (April 1983) pp. 40-47.
* Intel Platform Security Division. 1999. The Intel random number generator. ftp://download.intel.com/design/security/rng/techbrief.pdf
* Lord Kelvin. 1901. Nineteenth century clouds over the dynamical theory of heat and light. The London, Edinburgh and Dublin Philosophical Magazine and Journal of Science, Series 6, 2:1-40.
* Marsaglia, George. 1995. The Marsaglia Random Number CDROM, Including the DIEHARD Battery of Tests of Randomness. Tallahassee, Fla.: Department of Statistics, Florida State University.
* Maurer, Ueli M. 1992. Conditionally-perfect secrecy and a provably-secure randomized cipher. Journal of Cryptology 5(1):53-66.
* Pearson, Karl. 1900.
On the criterion that a given system of deviations from the probable in the case of a correlated system of variables is such that it can be reasonably supposed to have arisen from random sampling. The London, Edinburgh and Dublin Philosophical Magazine and Journal of Science, Series 5, 50:157-175.
* The Rand Corporation. 1955. A Million Random Digits with 100,000 Normal Deviates. Glencoe, Ill.: Free Press.
* Shannon, C. E. 1949. Communication theory of secrecy systems. Bell System Technical Journal 28:656-715.
* Tippett, L. H. C. 1927. Random sampling numbers. Tracts for Computers, No. 15. London: Cambridge University Press.
* Vincent, C. H. 1970. The generation of truly random binary numbers. Journal of Physics E 3(8):594-598.
* von Neumann, John. 1951. Various techniques used in connection with random digits. In Collected Works, Vol. 5, pp. 768-770. New York: Pergamon Press.

From checker at panix.com Sun Jan 8 20:00:32 2006 From: checker at panix.com (Premise Checker) Date: Sun, 8 Jan 2006 15:00:32 -0500 (EST) Subject: [Paleopsych] Peng and Knowles: Culture, Education, and the Attribution of Physical Causality Message-ID:

Culture, Education, and the Attribution of Physical Causality
Kaiping Peng and Eric D. Knowles, both University of California, Berkeley
Personality and Social Psychology Bulletin 29.10 (2003.10): 1272-84

Two studies investigated the impact of culturally instilled folk theories on the perception of physical events.
In Study 1, Americans and Chinese with no formal physics education were found to emphasize different causes in their explanations for eight physical events, with Americans attributing them more to dispositional factors (e.g., weight) and less to contextual factors (e.g., a medium) than did Chinese. In Study 2, Chinese Americans' identity as Asians or as Americans was primed before having them explain the events used in Study 1. Asian-primed participants endorsed dispositional explanations to a lesser degree and contextual explanations to a greater degree than did American-primed participants, although priming effects were observed only for students with little physics education. Together, these studies suggest that culturally instilled folk theories of physics produce cultural differences in the perception of physical causality.

Keywords: culture; attribution; ethnic identity; physical causality

Authors' Note: Preparation of this article was supported by the Hellman Family Faculty Fund to Kaiping Peng. The first author and Richard Nisbett reported a portion of the data from Study 1 at the 7th Annual Conference on Culture and Science at Kentucky State University in 1997. We are grateful to Richard Nisbett for his generous support and sage advice concerning Study 1, to Fernando Lopez-Royo for assistance collecting the Study 2 data, and to members of the Culture and Cognition Lab at the University of California, Berkeley, for their helpful suggestions. Please address correspondence to Kaiping Peng or Eric D. Knowles, Department of Psychology, 4143 Tolman Hall, Berkeley, CA 94720; e-mail: kppeng at socrates.berkeley.edu or eknowles at socrates.berkeley.edu.

Of all the thorny issues confronted by cultural psychologists, one foundational question has consistently furrowed researchers' brows: How should culture itself be operationalized? Different theoretical traditions have coalesced around different answers to this question.
Two approaches in particular--the value tradition and the "self" tradition--have come to dominate cultural psychology. The value tradition (Hofstede, 1980; Schwartz, 1994; Triandis, 1995) sees culture as a shared set of core values that regulate behavior in a population; Triandis's (1995) theory of individualism-collectivism is perhaps the preeminent example of this approach. The self tradition (Heine & Lehman, 1997, 1999; Markus & Kitayama, 1991; Singelis, 1994) identifies culture with the particular type of self-conception predominant in a population; Markus and Kitayama's (1991) theory of interdependent versus independent self-construals is an important example of this tradition. Recently, a number of cultural psychologists have adopted a third conception of culture--that of culture as "knowledge structure" (Chiu, Morris, Hong, & Menon, 2000; Hong, Morris, Chiu, & Benet-Martinez, 2000; Peng, Ames, & Knowles, 2001; Peng & Nisbett, 1999). This approach portrays culture as a constellation of knowledge structures, or folk theories, that embody individuals' basic beliefs about the world and guide inferences in different domains. Much of the research within the culture-as-theory framework (and cultural psychology generally) has focused on cultural differences in social perception. For example, cultural differences in dispositional bias (i.e., the tendency of lay perceivers to overattribute observed behavior to an actor's personal dispositions) have been traced to divergent folk theories of personal agency (Choi, Nisbett, & Norenzayan, 1999; Knowles, Morris, Chiu, & Hong, 2001; Menon, Morris, Chiu, & Hong, 1999; Morris, Menon, & Ames, 2001; Morris & Peng, 1994; Norenzayan & Nisbett, 2000). The influence of culture, however, may reach beyond social perception. In this article, we argue that different cultures instill their members with different folk theories of physical phenomena and that these theories produce cultural differences in the perception of physical events. 
Specifically, we argue that East Asians possess a contextual folk theory of physics emphasizing the role of external and relational factors (e.g., gravity) in determining an object's behavior. On the other hand, we argue that members of Western cultures, such as the United States, possess a more dispositional physical theory emphasizing the internal causes of an object's behavior (e.g., weight). In Study 1, we present evidence that cross-national differences in physical attributions--specifically, between those of American and Chinese individuals--reflect the application of dispositional or contextual physical theories. In Study 2, we investigate the causal impact of folk physical theories among individuals likely to possess both dispositional and contextual folk theories--namely, Chinese Americans. Before describing the studies, however, we review evidence for and against the proposition that culture affects perceptions of physical events.

CULTURE AND FOLK PHYSICS

Psychologists and other scholars have disagreed as to whether culture shapes individuals' causal understandings of physical events. We briefly review work supporting and refuting the impact of culture on physical attribution and then propose a partial resolution that affirms the role of culture yet places limits on the circumstances under which cultural differences will appear.

Evidence for Cultural Variation in Folk Theories of Physics

Scholars in the humanities have long argued for the existence of fundamental differences between Eastern and Western folk theories of physics. Joseph Needham (1954), a historian of the science and civilization of ancient China, argued that the ancient Chinese possessed a much richer and more "advanced" folk understanding of physics than did ancient Westerners (e.g., the Greeks) and that this understanding more closely resembles modern physics.
According to Needham (1954; see also Capra, 1975; Zukav, 1979), the core concepts of Eastern folk physics, Yin and Yang, are inherently relational, contextual, and dialectical, and thus resemble features of contemporary quantum physics. The contextual Eastern folk theory, it is argued, emphasizes forces that act over distance (e.g., gravity or magnetism) and forces exerted on objects by a medium (e.g., air or water). Kurt Lewin (1935), a founding father of modern social psychology and former physics student, was perhaps the first psychologist to address the dispositional nature of the Western folk understanding of physics:

    The kind and direction of the physical vectors in Aristotelian dynamics are completely determined in advance by the nature of the object concerned. In modern physics, on the contrary, the existence of physical vectors always depends upon the mutual relations of several physical factors, especially upon the relation of the object to its environment. (p. 28)

Thus, in Western (i.e., Aristotelian) folk physics, the behavior of objects is understood almost exclusively in terms of the object itself: A stone sinks when placed in water because it is heavy, and a piece of wood floats because it is buoyant (Lewin, 1935). On this understanding, the behavior of objects is caused by their discrete properties alone rather than by those properties in conjunction with states of the environment.

Evidence for Universality in the Perception of Physical Causality

In contrast to the work just cited, cognitive psychologists investigating the impact of culture on the perception of physical events have generally failed to uncover dramatic cultural differences (Michotte, 1963; Morris, Nisbett, & Peng, 1995; Morris & Peng, 1994). For instance, Michotte (1963) and his students had participants from Europe, Africa, and various Pacific Islands explain mechanistic (i.e., "billiard-ball") interactions between inanimate objects, finding no significant cultural differences.
More recent cross-cultural studies have identified seemingly universal rules guiding the interpretation of physical phenomena. In general, if the motion of an object follows, in a straightforward and visible way, the Newtonian law of conservation (i.e., that objects remain at rest or in uniform motion along a straight line until acted on by an outside force), then the motion is seen as externally caused (Stewart, 1984). Only if the motion appears to deviate from the law of conservation is it seen as internally caused (Morris et al., 1995; Morris & Peng, 1994; Stewart, 1984). Research in cognitive development suggests that the perception of physical events is largely "hardwired" and innate and might therefore be resistant to the influence of culture. Even very young infants have been shown to possess firm and reliable expectations about objects' possible movements and interactions (Baillargeon, 2000; Kotovsky & Baillargeon, 2000; Spelke, 2000). Of particular relevance, infants younger than 3 years old expect objects to behave according to the forces of gravity (an external, relational factor) and inertia (an internal, dispositional factor) (I. Kim & Spelke, 1999). The emergence at an early age of such physical expectations suggests that these expectations could not be altered by experience in one's culture. However, the early emergence of expectations concerning both contextual and dispositional factors does not rule out the possibility that culture-specific theories emphasize these understandings to varying degrees and thus lead individuals to favor one type of factor over the other in their explanations for physical events.

The Role of Formal Physics Education

The research reviewed above reveals disagreement among scholars concerning culture's impact on causal attributions for physical events. We propose a partial resolution to this debate.
We argue that Lewin (1935) and Needham (1954) are correct in their portrayal of Western folk physics as dispositional and of Eastern folk physics as more contextual. However, we propose that this difference will only be reflected in the judgments of those with little formal physics education. If, as we have claimed, intuitions about physical phenomena are guided by theory-like knowledge structures, then formal education in physics could supplant the Western folk theory with the more contextual understanding of modern physics and thus obscure any cultural differences. The cross-cultural similarities found in previous research might then be an artifact of education because many of the participants in the studies of Michotte (1963) and others were college students. Hence, it becomes important to separate the effects of culture and physics education in studying causal attributions in the physical domain. The studies reported here either involve participants with no formal physics education (Study 1) or measure physics education to examine its influence on physical judgments (Study 2).

ASSESSING THE CAUSAL IMPACT OF CULTURE-SPECIFIC FOLK THEORIES

In addition to documenting cultural differences in individuals' folk theories of physics, it is important to show that culture-specific folk theories exert a causal influence on individuals' perceptions of physical phenomena. In virtue of its portrayal of culture as a constellation of knowledge structures, the culture-as-theory approach suggests a way to investigate the causal impact of folk theories on inferences. The technique of cultural priming makes use of the fact that some individuals (e.g., biculturals) often possess multiple culture-derived theories for the same domain of phenomena and that these individuals can be experimentally induced to rely on a given theory in interpreting stimuli.
Cultural Priming

If cultures are indeed associated with divergent knowledge structures, then cultural knowledge should be subject to well-documented rules of knowledge acquisition and use (for a review, see Higgins, 1996). Most important, it should be possible for individuals to acquire multiple culture-derived theories for the same domain, even if the theories contradict one another; however, only one theory at a time can influence judgments (Hong et al., 2000). Which theory guides cognition at a given time will depend on the relative cognitive accessibility of the theories. According to the principle of accessibility, a knowledge structure will affect judgments to the extent that it is available, or activated, in the perceiver's mind (Higgins, 1996). Thus, the currently most accessible theory for a given domain will be the one that influences judgment in that domain. The current accessibility of a theory can be experimentally manipulated through priming, in which the activation level of the construct is increased through the presentation of a stimulus semantically related to the construct (Higgins, 1996). It follows that an experimenter can, by priming selected theories, manipulate which of two or more conflicting folk theories will influence judgments in a domain. In so doing, it can be shown that the primed knowledge structures exert a causal influence on judgments. Researchers working within the culture-as-theory framework have used cultural symbols to prime culture-specific knowledge structures. Hong and colleagues (2000) presented individuals who had extensive experience in both East Asian and American culture with either East Asian cultural icons (e.g., a Chinese flag, the Great Wall of China, a picture of Stone Monkey) or American cultural icons (e.g., the American flag, the Capitol Building, a picture of Superman).
These researchers found that priming affected attributions for social behavior, such that individuals exposed to East Asian primes interpreted behavior as more externally caused than did individuals in the American prime condition.1 In our research, we primed culture-specific folk theories using a cultural identity prime in which Asian American participants were asked to reflect on their identity as Asians or as Americans. These identity primes were intended to increase the level of activation of related networks of cultural knowledge, including East Asian and Western folk theories of physics.

Bicultural Individuals

The cultural priming technique used here relies on the possibility that some individuals possess more than one culture-bound folk theory of physics. Bicultural individuals--individuals who identify with more than one culture--may possess multiple folk theories. There is good reason to believe that ethnic cultures within the United States possess cultural knowledge similar to that of their countries of origin. For instance, there is evidence that Japanese Americans possess social attributional tendencies consistent with the contextual theory of social behavior thought to be dominant within Japanese culture, attributing success and failure more to situational factors than do European Americans (Narikiyo & Kameoka, 1992; Whang & Hancock, 1994). The congruence between the attributional tendencies of Asian Americans and members of Asian national cultures may be due to the fact that Asian Americans possess some of the same values and cultural knowledge prevalent in the national cultures (U. Kim & Choi, 1994). We suggest that, as with folk theories of social behavior, Asian Americans may possess both Asian and Western folk theories of physics.

THE CURRENT RESEARCH

The current research had two goals. First, we sought to demonstrate that American and Chinese national cultures are associated with different folk theories of physics.
Specifically, we hypothesized that whereas Americans have a dispositional physical theory that locates the causes of physical phenomena in the discrete dispositions of objects (e.g., weight), Chinese perceivers have a contextual theory that places greater emphasis on relational factors--specifically, forces over distance (e.g., gravity) and the influence of mediums (e.g., air or water). Toward this end, in Study 1, we asked Chinese and American nationals with no formal physics education to identify the causes of a variety of physical events. Our second goal was to investigate the causal impact of culture-specific folk physical theories on attributions for physical events. Thus, in Study 2, we attempted to prime dispositional (Western) or contextual (East Asian) folk theories of physics in the minds of Asian Americans, who presumably possess both theories. Participants were subsequently asked to identify the causes of the same physical events as were used in Study 1. To test our hypothesis that folk theories will affect inferences only for individuals who have had little formal instruction in physics, we measured participants' level of physics education.

STUDY 1

Method

PARTICIPANTS

Fifteen American participants, all of them female, were drawn from the Psychology Department subject pool at the University of Michigan; they participated in return for course credit. Participants were selected who reported having had no formal education in physics and who had declared majors in the arts or humanities. Fifteen female spouses of visiting Chinese graduate students at the University of Michigan were recruited and paid $10 for their participation in this study.2 All of these participants were Chinese citizens and, similar to their American counterparts, were college educated, reported no formal physics education, and had majored in the arts and humanities. The mean ages of American and Chinese participants were 19.1 and 22.7 years, respectively.
All of the Chinese participants had been in the United States for less than a year because their visas only permitted them to remain in the country for a short time.

PHYSICAL CAUSALITY DISPLAYS

Eight animated displays of physical events were created using Macromind Director for Macintosh by Macromedia Software, a computer animation program. All displays depicted a white object interacting in various ways with a black object or a medium (i.e., air or water) (see Figure 1 for schematic representations of the displays):

1. White object interacting with black object
   a. "Launching" event (i.e., elastic collision). The black object collides with the stationary white object and stops, causing the white object to move.
   b. "Launching at a distance" event. This display was identical to the launching interaction except that the black object stops short of the white object before the white object begins moving.
   c. "Entraining" event (i.e., inelastic collision). The black object collides with the stationary white object, after which both objects move together.
   d. "Balance" display depicting objects balancing on a lever. In this event, the black and white circles are in balance at two ends of a platform resting on a fulcrum.
   e. "Magnetic" display depicting objects' motions in a magnetic field. In this event, the black and white circles appear to be magnetically attracted to one another, converging slowly at first, and then more quickly as the distance between them narrows.

2. White object interacting with a medium
   a. "Hydrodynamic--floating" event. The white object bobs on the surface of a pool of water.
   b. "Hydrodynamic--dropping" event. The white object drops into the pool, rises to the surface, and bobs for a moment.
   c. "Aerodynamic" display depicting an object's motion in the air. In this event, the white object looks like a balloon, dropping gradually while buffeted by air currents.
The program was set to present the displays either in the order listed above or the reverse, as determined randomly.

Figure 1. Schematic representations of displays used in Studies 1 and 2.

PROCEDURE

Participants were run one at a time. The experimenter, who was European American and fluent in both English and Chinese, brought the participant into a testing room and seated her in front of a computer. All instructions were given in English for American participants and in Chinese for Chinese participants. The experimenter introduced the study as an investigation of visual perception, in which the participant would be shown a number of displays depicting physical events and asked questions about her perceptions of each. Participants were instructed to think of the physical events as independent episodes, such that objects in one display were not the same objects as in any other display. The experimenter then played the displays, pausing after each to ask the participant the following physical causality question: "Please explain in your own words why the white object moved in the way it did. Even if you don't have a strong opinion, please take a guess." Participants were given as much time as needed to respond to the questions. Responses were tape-recorded. The procedure typically lasted an hour, after which the participant was debriefed and dismissed.

Results

CODING OF OPEN-ENDED RESPONSES

Participants' answers to the free-response causality question were transcribed and content analyzed by two psychology graduate students at the University of Michigan. Both coders were blind to the experimental hypothesis. One coder was European American and the other Chinese American, and each was fluent in both English and Chinese. Each coder coded all of the responses.
For each open-ended response, coders tallied dispositional explanations (e.g., weight, shape) and contextual explanations (e.g., gravity, liquid) for the white object's movement as well as the perceived nature of the object (e.g., ball, balloon). Table 1 shows the complete coding scheme. Interjudge reliability was assessed by Kendall's coefficient of concordance; good reliability was achieved (W = .90, p < .001).

THE PERCEIVED NATURE OF THE EVENTS

Because the current research assesses the impact of culture on perceptions of physical causality, we intended for participants to interpret the animated events as physical interactions between inanimate objects. However, in light of research showing that moving geometrical figures can give the impression of animacy and personality (Heider, 1944; Michotte, 1963), it is possible that some participants saw the objects as representing organisms and their movements as social in nature. However, our coding of participants' impressions of the nature of the objects (refer to Table 1 for the coding scheme) indicated that all participants perceived the animated circles as balls or (in the aerodynamic display) as a balloon, not as animals or as humans. These findings are consistent with cross-cultural studies in judgments of animacy showing that across cultures, people as young as 3 years old make similar judgments in distinguishing animals from objects (Carey, 1991).
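Kendall's coefficient of concordance, the interjudge reliability statistic reported above, can be sketched in a few lines. The standard formula is W = 12S / (m^2(n^3 - n)), where m is the number of raters, n the number of items, and S the sum of squared deviations of the items' rank totals from their mean; this sketch assumes untied ranks, and the example rankings are invented for illustration, not the study's data.

```python
def kendalls_w(rankings):
    """Kendall's coefficient of concordance for a list of rank lists.

    Each inner list gives one rater's ranks (1..n) of the same n items.
    """
    m = len(rankings)              # number of raters
    n = len(rankings[0])           # number of items ranked
    totals = [sum(r[i] for r in rankings) for i in range(n)]
    mean = sum(totals) / n
    s = sum((t - mean) ** 2 for t in totals)
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Perfect agreement between two raters gives W = 1.0;
# perfectly reversed rankings give W = 0.0.
print(kendalls_w([[1, 2, 3, 4], [1, 2, 3, 4]]))  # 1.0
```

A value of W = .90, as the coders achieved, thus indicates near-perfect agreement in how the two coders tallied the explanation categories.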
TABLE 1: Coding Scheme for Open-Ended Physical Causality Judgments in Study 1

Nature of the white object:
  Ball; Balloon; Animal; Human; No mention; Could not categorize

Dispositional causes of the white object's movement:
  Weight or mass; Composition; Inertia; Shape; Energy; Electricity; Magnetism; Internal dynamics (e.g., heat, engine); Other disposition (e.g., size, color); Could not categorize

Contextual causes of the white object's movement:
  Other object; Gravity; Friction; Air or wind; Invisible matter; Liquid or current; Field; Outside physical conditions (e.g., smoothness of surface); Other forces (e.g., human intervention); Other contextual cause; Could not categorize

CAUSAL EXPLANATIONS FOR THE PHYSICAL EVENTS

Having established that all participants perceived the animated displays as physical in nature, we next compared American and Chinese participants' preferences for dispositional and contextual explanations of the events. To isolate the effect of nationality on participants' preferences for different types of explanations, we sought to control for any influence of nationality on the total number of causes cited. Thus, for each of the eight physical events, we calculated the percentage of American and of Chinese explanations that referred to dispositional and contextual factors. For all events, we predicted that American participants would give proportionally more dispositional explanations than would Chinese, whereas Chinese participants would give proportionally more contextual explanations than would Americans. For each display, we conducted a z test comparing the percentage of American versus Chinese explanations that referred to dispositional causal factors. (Note that separate z tests comparing rates of contextual explanations would have results identical to the tests of dispositional explanations and are thus unnecessary.) Because our hypothesis was clearly directional, we used one-tailed tests of significance.
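The display-level comparison described above is a two-proportion z test. A minimal sketch, using hypothetical explanation counts (the paper reports only percentages), with a one-tailed p-value for the directional hypothesis that the American dispositional rate exceeds the Chinese rate:

```python
import math

def ztest_two_proportions(x1, n1, x2, n2):
    """One-tailed pooled z test of H1: p1 > p2.

    x1, x2: explanations coded as dispositional; n1, n2: total explanations."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                   # common proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 0.5 * math.erfc(z / math.sqrt(2))      # upper-tail normal probability
    return z, p_value

# Hypothetical counts for one display: 30 of 50 American explanations
# dispositional versus 18 of 50 Chinese explanations dispositional.
z, p = ztest_two_proportions(30, 50, 18, 50)
```

Because the two proportions are complementary within each sample, a z test on contextual rates would give the same statistic with the sign reversed, which is why the paper notes that separate contextual tests are unnecessary.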
The differences between American and Chinese rates of dispositional and contextual explanations were significant for the Launching, Magnetic, and Aerodynamic displays and marginally significant for the Hydrodynamic dropping display, such that Americans gave more dispositional explanations than did Chinese. Averaging across all displays, the predicted effect of nationality on percentage of dispositional explanations was marginally significant, p < .10. American and Chinese percentages of dispositional and contextual explanations, and the corresponding z statistics, are presented in Table 2.

Discussion

Study 1 provides partial evidence that members of different cultures possess divergent folk theories of physical causality. In their open-ended explanations for the physical events, American participants exhibited a greater preference for dispositional explanations than did Chinese participants on three of the eight displays, whereas Chinese participants emphasized contextual explanations more than did Americans. This suggests that, as argued by Needham (1954), Capra (1975), and Zukav (1979), the Eastern folk theory of physics places more importance on contextual factors (such as other objects, forces over distance, and mediums) and less importance on dispositional causes (such as shape and weight) than does the American folk theory of physical causality. For all the displays, cultural differences were in the predicted direction--with Chinese being more contextual than Americans in their explanations of physical motions. Nonetheless, the cultural difference was statistically significant only for the Launching, Magnetic, and Aerodynamic displays and marginal for the Hydrodynamic dropping display.
Although this discrepancy could be a result of the small sample size, it is interesting to note that cultural differences were stronger for displays that depicted salient energy transitions from one object to another object or a medium (Launching, Hydrodynamic dropping, Magnetic, and Aerodynamic) than for displays in which the energy transition is less salient (Balance, Launching at a distance, Hydrodynamic floating, and Entraining). We had no a priori reason to expect cultural differences to be limited to displays depicting salient energy transitions, and further research should examine the reliability of this finding. Inspection of Table 2 suggests a possible alternative framing of our findings. The analyses reported above examined the effect of culture on participants' tendency to give dispositional versus contextual explanations for the physical displays. However, the data also can be analyzed by comparing differences in dispositional and contextual explanation within culture. It is clear that Chinese participants favored contextual explanations over dispositional explanations; however, Americans showed a relatively small preference for dispositional explanations. Therefore, it is possible that whereas Chinese possess a markedly contextual folk physical theory, Americans have a more or less evenhanded theory emphasizing the importance of both types of explanation. This alternative interpretation should be viewed with caution, however. Americans' unexpectedly high reliance on contextual explanations is driven largely by responses on only two of the eight physical displays, Hydrodynamic dropping and Hydrodynamic floating, in which there is a visible medium that exerts an obvious influence on the behavior of the white object. Thus, interpretation of property use within culture is vulnerable to idiosyncrasies of the particular displays participants were shown, some of which demanded mention of important contextual causes. 
TABLE 2: Percentage of American and Chinese Explanations Coded as Dispositional or Contextual in Study 1 [dropped]

Having found evidence in Study 1 that American and Chinese cultures are associated with divergent folk theories of physics, we next sought to demonstrate a causal connection between culture-specific knowledge structures and patterns of physical attribution. If culture-specific lay theories are indeed responsible for the cultural differences observed in Study 1, then it should be possible to temporarily increase the cognitive accessibility of different theories and thus increase their influence on attributions. In Study 2, we tapped a population likely to possess both Asian and Western lay theories--specifically, Chinese Americans--and attempted to influence their attributions by priming one or the other of these theories. We primed Chinese Americans' identity either as Asians or as Americans before having them explain the same series of physical events used in Study 1. We predicted that participants receiving the Asian identity prime would prefer dispositional causes to a lesser degree, and contextual causes to a greater degree, than would participants receiving the American prime. The results of Study 1 may seem to conflict with findings in cognitive psychology revealing no effects of culture on perceptions of physical causality (e.g., Michotte, 1963). We argued earlier that formal physics education may sometimes supplant or obscure folk theories and thus prevent cultural differences from emerging. In Study 1, we were careful to choose participants with no formal education in physics--and thus whose inferences were likely to be based on their folk physical theories--allowing the observed cultural difference to emerge. In addition to examining the causal influence of folk physical theories, Study 2 was intended as a more direct test of the idea that formal physics education may supplant or obscure individuals' folk theories of physical phenomena.
Participants in Study 2 reported the amount of physics instruction they had received and rated their physics expertise. We predicted that the effect of cultural identity priming on attributions would be qualified by an interaction with participants' amount of physics education, such that only participants with little physics background would be affected by the identity prime. Participants high in physics education, who presumably rely on a formally inculcated theory of physics rather than a culture-specific folk theory, should not be affected by the identity prime.

STUDY 2

Method

PARTICIPANTS

Sixty-five students (44 women) at the University of California, Berkeley, participated in fulfillment of psychology course requirements. The mean age of the participants was 19.7 years. Participants were selected who had reported their ethnicity to be Chinese American during a mass data collection at the beginning of the semester.

MATERIALS

Cultural identity primes. Primes of Asian and of American identity consisted of a short questionnaire asking participants to reflect, in writing, on several aspects of their ethnic identity. First, participants were asked to "recall an experience you had that made your identity as an American [Asian] apparent to you." (Brackets indicate wording in the Asian prime condition.) Participants then answered the following questions about the experience: "When did you have this experience?" "How old were you when you had this experience?" "Briefly describe the experience," and "Why do you think the experience made your American [Asian] identity apparent?"

Physical displays. Study 2 employed the same eight physical displays as did Study 1 (Launching, Launching at a distance, Entraining, Hydrodynamic floating, Hydrodynamic dropping, Balance, Magnetic, and Aerodynamic) (see Figure 1). Displays were presented using Flash by Macromedia Software, a computer animation program.

Rating packets.
Participants made their ratings of the physical displays in a packet containing Likert-type questions corresponding to several causal factors. For each of the eight displays, participants rated the extent to which the white object's movement was due to five dispositional factors of the white object (shape, weight, composition, buoyancy, and inertia) and four contextual factors acting on the white object (gravity, friction, air/wind, and water). All ratings were made on a 5-point scale from 1 (not at all responsible) to 5 (completely responsible).

Ratings of physics background. A short questionnaire was created to gauge different aspects of participants' background in physics. As a measure of formal instruction in physics, participants reported the total number of physics classes they had taken in high school and college. As a measure of physics expertise, participants rated their current physics expertise on a 5-point scale from 1 (none) to 5 (expert).

PROCEDURE

Participants were run in groups of 5 to 10 in a large testing room outfitted with computers. As each participant entered the testing room, he or she was handed an Asian or American identity prime from an alternating stack, thus randomizing assignment of participants to the Asian and American prime conditions. Participants were then seated at computers and asked to spend 3 minutes filling out the identity primes, after which the primes were collected. Next, participants viewed each of the eight physical events in random order. The computer displayed each event twice, after which participants were referred to the appropriate page in their rating packets, where they rated the degree to which each causal factor was responsible for the event. After completing ratings for all eight displays, participants completed the questionnaire gauging the amount of physics instruction they had received and their self-reported physics expertise.
The entire procedure typically lasted an hour, after which participants were debriefed and dismissed.

Results

DERIVATION OF AGGREGATE ATTRIBUTION SCORES

For use in the analyses reported below, we created aggregate measures of dispositional and contextual attribution across all physical displays. Each participant's aggregate dispositional attribution score was calculated by averaging his or her endorsement of dispositional causal factors (i.e., shape, weight, composition, buoyancy, and inertia); aggregate contextual attributions were calculated by averaging each participant's endorsement of contextual causal factors (i.e., black object, gravity, friction, air/wind, and water).

EFFECTS OF IDENTITY PRIMING AND PHYSICS EDUCATION

We tested two hypotheses in this study. First, we predicted that the identity priming manipulation would influence Chinese American participants' attributions for the physical events, such that participants receiving the Asian prime would attribute the physical events more to contextual causes, and less to dispositional causes, than would participants receiving the American prime. Second, we predicted that priming effects would occur only for participants with little formal education in physics. To test these hypotheses, we followed Aiken and West's (1991) procedure for testing Categorical × Continuous interactions using multiple regression. Unlike analysis of variance (ANOVA), the regression method has the advantage of not requiring a split (such as a median split) to be performed on the continuous variable, which would discard useful variance. We began by standardizing the dummy-coded prime condition variable, the measure of physics education (i.e., number of physics classes taken), and self-reported physics expertise to create three main effect terms (see Table 3 for the correlations between these variables and aggregate dispositional and contextual attribution scores).
Next, we multiplied the main effect terms together to create interaction terms for each two- and three-way interaction (i.e., Prime × Physics Classes, Prime × Physics Expertise, Physics Classes × Physics Expertise, and Prime × Physics Classes × Physics Expertise). We then performed two simultaneous multiple regression analyses, one to test the influence of these main effect and interaction terms on dispositional attribution and one to test effects on contextual attribution. Because dispositional and contextual attributions were highly correlated, r = .68, p < .01 (see Note 3), we controlled for this relationship by adding the standardized contextual attribution score as a predictor in the analysis of dispositional attributions and the standardized dispositional attribution score as a predictor in the analysis of contextual attributions. Tables 4 and 5 summarize the regression analyses.

TABLE 3: Pearson Correlations Between Variables in Study 2 (N = 65) [dropped]

Identity priming. As can be seen in the first row of Table 4, participants receiving the Asian prime made significantly less extreme dispositional attributions for the physical events than did participants receiving the American prime. Moreover, whereas the Asian prime decreased dispositional attribution among Chinese Americans, it increased contextual attribution (see Table 5, first row).

Physics education. As shown in the fourth row of Table 4, the effect of identity priming was qualified by a marginally significant interaction with the number of physics classes, such that the influence of priming on dispositionism decreased as physics instruction increased. Likewise, the effect of identity priming on contextualism decreased as physics instruction increased (see Table 5, fourth row).
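The Aiken and West procedure described above (standardize the predictors, form their products, then fit one simultaneous regression) can be sketched as follows. All data here are randomly generated stand-ins for the study's variables, and the ordinary-least-squares fit uses a plain NumPy solver rather than a statistics package:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 65  # sample size matching Study 2

# Hypothetical stand-ins for the study's variables.
prime = rng.integers(0, 2, n).astype(float)      # dummy code: 0 = American, 1 = Asian prime
classes = rng.poisson(2, n).astype(float)        # number of physics classes taken
expertise = rng.integers(1, 6, n).astype(float)  # self-rated expertise, 1-5
contextual = rng.normal(3.0, 0.8, n)             # aggregate contextual attribution
dispositional = 3.0 - 0.3 * prime + rng.normal(0, 1, n)

def standardize(v):
    return (v - v.mean()) / v.std()

zp, zc, ze = standardize(prime), standardize(classes), standardize(expertise)

# Design matrix: intercept, three main effects, all two- and three-way
# products, plus the correlated attribution score as a control.
X = np.column_stack([
    np.ones(n), zp, zc, ze,
    zp * zc, zp * ze, zc * ze, zp * zc * ze,
    standardize(contextual),
])
beta, *_ = np.linalg.lstsq(X, dispositional, rcond=None)
```

Standardizing before forming the products keeps the lower-order coefficients interpretable at the predictors' means, which is the point of the Aiken and West approach.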
To visualize the interactions between physics education and identity priming, we plotted the interactions according to the procedure recommended by Aiken and West (1991), with levels of dispositionism and contextualism predicted from the regression equations. Figure 2 represents the predicted effects of the priming manipulation on dispositional attribution among participants one standard deviation above and below the mean on physics education. Similarly, Figure 3 represents the predicted effects of priming on contextual attributions for participants high and low in physics education.

Self-reported physics expertise. An unexpected finding emerged involving participants' self-reported physics knowledge. Specifically, we observed a significant Prime × Physics Knowledge interaction in our analysis of dispositional attributions, such that the American prime increased dispositional attribution among participants who self-reported a great deal of physics knowledge, but not among self-rated nonexperts (see Table 4, fifth row).

Discussion

The results of Study 2 provide evidence for our hypotheses concerning the causal impact of folk physical theories on attributions of physical causality. Chinese American participants who received the Asian identity prime, which was theorized to activate a contextual folk theory of physics, endorsed dispositional causes to a lesser extent, and contextual causes to a greater extent, than did participants receiving the American identity prime. The success of our priming manipulation supports the notion that the interpretation of physical phenomena is guided by knowledge structures acquired through experience in one's culture. Contributing further support to the knowledge-structure account, we found that priming affected attributions only for participants with little formal instruction in physics (see Figures 2 and 3).
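The Aiken and West plotting procedure mentioned above evaluates the fitted equation at each prime condition crossed with physics education one standard deviation above and below its mean. A sketch with illustrative coefficients (not the paper's estimates):

```python
# Illustrative standardized coefficients for
#   dispositional = b0 + b1*prime + b2*classes + b3*(prime * classes)
# where prime and classes are standardized; these values are hypothetical.
b0, b1, b2, b3 = 3.0, -0.25, 0.05, 0.20

def predict(prime_z, classes_z):
    return b0 + b1 * prime_z + b2 * classes_z + b3 * prime_z * classes_z

# Evaluate at the four points conventionally plotted: each prime condition
# at one standard deviation below and above mean physics education.
points = {
    (prime_label, edu_label): predict(pz, cz)
    for prime_label, pz in [("American prime", -1.0), ("Asian prime", 1.0)]
    for edu_label, cz in [("-1 SD education", -1.0), ("+1 SD education", 1.0)]
}

# The priming effect (American minus Asian) at each education level:
effect_low = points[("American prime", "-1 SD education")] - points[("Asian prime", "-1 SD education")]
effect_high = points[("American prime", "+1 SD education")] - points[("Asian prime", "+1 SD education")]
```

With coefficients of this sign pattern, the priming effect shrinks as physics education increases, which is the shape of the interaction the figures describe.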
If physical attribution is guided by learned, culture-specific knowledge structures, then it should be possible to supplant these folk theories with formal scientific theories acquired through education.

TABLE 4: Summary of Simultaneous Multiple Regression Analysis of Aggregate Dispositional Attribution in Study 2 (N = 65) [dropped]

Figure 2. Level of dispositional attribution for physical events as a function of identity prime condition and physics education in Study 2.

Figure 3. Level of contextual attribution for physical events as a function of identity prime condition and physics education in Study 2.

An unexpected finding emerged, such that identity priming had an effect for individuals rating themselves as relatively expert in physics but not for individuals self-reporting little physics knowledge. Given that we had intended self-reported physics expertise, like physics education, to gauge the extent to which participants have internalized formal physics theories, this finding seems to contradict the observed Prime × Physics Classes interaction. However, we believe that this finding may have arisen because self-rated physics expertise is a less pure measure of an individual's actual physics knowledge than is the number of physics classes he or she has taken. Specifically, we believe that the self-rated expertise measure may have been confounded with individuals' motivation to self-enhance (i.e., to portray themselves in a positive light); self-enhancement motivation, in turn, may have been negatively related to participants' level of identification as Asians. Because East Asians tend to self-enhance to a lesser degree than do European Americans (Heine & Lehman, 1997), it may be that self-rated physics nonexperts were less American-identified than were self-rated experts.
If this is so, then it may make sense that self-rated nonexperts, being not very American-identified--and thus less likely to possess a dispositional folk physical theory--would not have been susceptible to the American prime. It follows from priming theory (Higgins, 1996) that one cannot prime a knowledge structure that an individual does not possess.

GENERAL DISCUSSION

The current research employed the knowledge-structure conception of culture in an examination of cultural influences on perceptions of physical events. First, we argued that different cultures instill their members with different folk theories of physics. Study 1 provided evidence for this claim: American and Chinese individuals were found to differ in their explanations for a number of physical events, with Americans favoring dispositional explanations more than did Chinese. Second, we argued that folk theories exert a causal influence on physical attributions. We tested this claim in Study 2 using a procedure designed to temporarily increase the accessibility of dispositional or contextual theories in individuals presumed to possess both (i.e., Chinese Americans). Chinese Americans whose Asian identity was primed were found to endorse dispositional explanations for physical events to a lesser extent, and contextual explanations to a greater extent, than did Chinese Americans whose American identity was primed. Our findings concerning the role of formal instruction in physics help to reconcile the current results with previous psychological research in which no cultural differences in physical attribution were found (e.g., Michotte, 1963). In keeping with the folk-theories approach, formal education might supplant or obscure the operation of folk theories and thus prevent the cultural difference from manifesting itself. In Study 1, we argued that cultural differences in folk physical theories emerged in part because participants had no formal physics education.
In Study 2, participants' background in physics was measured; consistent with the idea that physics instruction blocks the operation of folk physical theories, priming effects were found only for individuals who had taken few physics classes.

Continuity of Cultural Differences in Social Attribution and Physical Attribution

The present research provides evidence for a cultural difference in physical attribution analogous to a known cultural difference in social attribution. The dispositional-contextual (or internal-external) distinction, used here to distinguish between different kinds of attributions for objects' physical behavior, has a long history in the study of attributions for individuals' social behavior. Researchers studying social explanation often distinguish between internal attributions, which trace behavior to personal dispositions (e.g., personality traits or attitudes), and external attributions, which trace behavior to forces in the social environment (e.g., pressure from peers or authorities) (Gilbert & Malone, 1995). Researchers studying social attribution have argued for the existence of robust biases in social explanation. For instance, lay perceivers often have been observed to favor internal (dispositional) explanations for others' behavior over situational explanations--an inferential tendency known as the "correspondence bias" or the "fundamental attribution error" (Ross & Nisbett, 1991). Although this tendency was once seen as a universal bias in social judgment (Heider, 1958; Ichheiser, 1949; Ross, 1977), more recent work in cultural psychology has recast dispositional bias as a culture-bound phenomenon (e.g., Miller, 1984; for a review, see Peng et al., 2001). Cross-cultural research suggests that dispositional bias is less marked in East Asian cultures than in the Western cultures where most social psychological research has been conducted.
A growing body of research using a variety of methods has demonstrated that East Asians are less apt to attribute behavior to an actor's personal dispositions, and more apt to attribute behavior to the situational context, than are members of Western cultures (Kitayama & Masuda, 1997; Knowles et al., 2001; Lee, Hallahan, & Herzog, 1996; Morris & Peng, 1994). Analogous to this Asian-Western cultural difference in social attribution, the current research suggests that Americans favor internal/dispositional explanations for nonsocial events more than do Chinese, whereas Chinese prefer external/contextual explanations more than do Americans. Whether the parallel between cultural differences in social and physical perception reflects the operation of domain-general cognitive factors--such as dialectical versus linear (Peng & Nisbett, 1999) or holistic versus analytic (Nisbett, Peng, Choi, & Norenzayan, 2001) modes of thought--is an important question for future research.

Reconciling Developmental and Cultural Models of Causal Understanding

At first blush, the current studies might seem at odds with research into the development of physical understanding, which points to the existence of universal constraints guiding individuals' perceptions of physical events from a very early age (Carey & Spelke, 1994; Spelke, 1990). We argue, however, that no inherent tension exists between these developmental and cultural perspectives. First, the existence of cultural differences among adults in no way rules out the existence of universals among infants. Indeed, models of the development of social-causal explanation have explicitly included both early universals and later cultural differences. For instance, Miller (1984) argued that whereas early social inference may be constrained by universal cognitive processes, the influence of culture--as carried by folk theories--increases as individuals mature within their culture.
The development of physical understanding might follow a similar pattern, in which cultural differences emerge only relatively late in development. Second, as the influence of folk theories on physical perceptions increases over development, it need not be the case that universal perceptual and cognitive mechanisms stop operating. Indeed, there is no inherent contradiction between the types of cognitive constraints identified by developmental psychologists (e.g., the innate understanding, observed by Spelke, 1994, that two objects cannot occupy the same volume of space) and the types of divergent beliefs embodied in folk physical theories (i.e., that the behavior of objects is attributable primarily to their dispositions or to forces impinging on them from without). In other words, dispositional and contextual folk physics are equally consistent with the sorts of basic perceptual constraints identified by developmentalists.

Conclusion

The current research contributes to our understanding of how development within a particular social milieu (i.e., a culture) molds an individual's perceptions of his or her environment. Past research in the culture-as-theory tradition (e.g., Hong et al., 2000; Morris & Peng, 1994) suggests that culture--both national and ethnic--may profitably be construed as a constellation of folk theories governing one's basic understanding of the social world. The current research suggests that the influence of culturally instilled folk theories may extend further--specifically, to one's causal understanding of nonsocial (i.e., physical) events. At the same time, the current studies place caveats on when folk theories can and cannot be expected to exert influence on causal attributions. When formal theories in a domain are acquired, the influence of folk understandings may wane.

NOTES

1. It should be noted that priming techniques have not been limited to research in the culture-as-theory tradition.
Working within the self approach, Brewer and Gardner (1996; Gardner, Gabriel, & Lee, 1999) used linguistic cues to prime personal, relational, or collective self-definitions. Value theorists, in turn, have primed different cultural values using value-related cues (e.g., Trafimow, Triandis, & Goto, 1991).

2. Female spouses of Chinese students were selected due to the difficulty of finding Chinese students with no formal education in physics.

3. We see two possible artifactual reasons for the strong positive correlation between dispositional and contextual attribution scores. First, participants may have differed in the degree to which they saw the physical displays as requiring explanation; that is, some participants may have seen many causal factors at work in the displays (leading to relatively high ratings for all causes), whereas other participants saw only a few factors at work (leading to relatively low ratings for all causes). Second, participants may have differed in terms of acquiescence bias, leading them to favor either high ratings or low ratings across all causal factors. Thus, the positive association between dispositional and contextual attribution scores does not invalidate our claim that these modes of explanation are distinguishable and independent.

REFERENCES

Aiken, L., & West, S. (1991). Multiple regression: Testing and interpreting interactions. Newbury Park, CA: Sage.
Baillargeon, R. (2000). How do infants learn about the physical world? In D. Muir & A. Slater (Eds.), Infant development: The essential readings (pp. 195-212). Malden, MA: Blackwell.
Brewer, M., & Gardner, W. (1996). Who is this "We"? Levels of collective identity and self representations. Journal of Personality and Social Psychology, 71, 83-93.
Capra, F. (1975). The Tao of physics: An exploration of the parallels between modern physics and eastern mysticism. Berkeley, CA: Shambhala.
Carey, S. (1991). Knowledge acquisition: Enrichment or conceptual change? In S. Carey & R. Gelman (Eds.), Epigenesis of mind: Studies in biology and cognition (pp. 257-291). Hillsdale, NJ: Lawrence Erlbaum.
Carey, S., & Spelke, E. (1994). Domain-specific knowledge and conceptual change. In L. Hirschfeld & S. Gelman (Eds.), Mapping the mind: Domain specificity in cognition and culture (pp. 169-200). New York: Cambridge University Press.
Chiu, C., Morris, M., Hong, Y., & Menon, T. (2000). Motivated cultural cognition: The impact of implicit cultural theories on dispositional attribution varies as a function of need for closure. Journal of Personality and Social Psychology, 78, 247-259.
Choi, I., Nisbett, R., & Norenzayan, A. (1999). Causal attribution across cultures: Variation and universality. Psychological Bulletin, 125, 47-63.
Gardner, W., Gabriel, S., & Lee, A. (1999). "I" value freedom, but "we" value relationships: Self-construal priming mirrors cultural differences in judgment. Psychological Science, 10, 321-326.
Gilbert, D., & Malone, P. (1995). The correspondence bias. Psychological Bulletin, 117, 21-38.
Heider, F. (1944). Social perception and phenomenal causality. Psychological Review, 51, 358-374.
Heider, F. (1958). The psychology of interpersonal relations. New York: John Wiley.
Heine, S., & Lehman, D. (1997). Culture, dissonance, and self-affirmation. Personality and Social Psychology Bulletin, 23, 389-400.
Heine, S., & Lehman, D. (1999). Culture, self-discrepancies, and self-satisfaction. Personality and Social Psychology Bulletin, 25, 915-925.
Higgins, E. (1996). Knowledge activation: Accessibility, applicability, and salience. In E. Higgins & A. Kruglanski (Eds.), Social psychology: Handbook of basic principles (pp. 133-168). New York: Guilford.
Hofstede, G. (1980). Culture's consequences: International differences in work-related values. Beverly Hills, CA: Sage.
Hong, Y., Morris, M., Chiu, C., & Benet-Martinez, V. (2000). Multicultural minds: A dynamic constructivist approach to culture and cognition. American Psychologist, 55, 709-720.
Ichheiser, G. (1949). Misunderstandings in human relations. Chicago: University of Chicago Press.
Kim, I., & Spelke, E. (1999). Perception and understanding of effects of gravity and inertia on object motion. Developmental Science, 2, 339-362.
Kim, U., & Choi, S. (1994). Individualism, collectivism, and child development: A Korean perspective. In P. M. Greenfield & R. R. Cocking (Eds.), Cross-cultural roots of minority child development (pp. 227-257). Hillsdale, NJ: Lawrence Erlbaum.
Kitayama, S., & Masuda, T. (1997). Cultural psychology of social inference: The correspondence bias largely vanishes in Japan. Unpublished manuscript, Kyoto University.
Knowles, E. D., Morris, M., Chiu, C., & Hong, Y. (2001). Culture and the process of person perception: Evidence for automaticity among East Asians in correcting for situational influences on behavior. Personality and Social Psychology Bulletin, 27, 1344-1356.
Kotovsky, L., & Baillargeon, R. (2000). Reasoning about collisions involving inert objects in 7.5-month-old infants. Developmental Science, 3, 344-359.
Lee, F., Hallahan, M., & Herzog, T. (1996). Explaining real-life events: How culture and domain shape attributions. Personality and Social Psychology Bulletin, 22, 732-741.
Lewin, K. (1935). A dynamic theory of personality: Selected papers. New York: McGraw-Hill.
Markus, H., & Kitayama, S. (1991). Culture and the self: Implications for cognition, emotion, and motivation. Psychological Review, 98, 224-253.
Menon, T., Morris, M., Chiu, C., & Hong, Y. (1999). Culture and the construal of agency: Attribution to individual versus group dispositions. Journal of Personality and Social Psychology, 76, 701-717.
Michotte, A. (1963). The perception of causality. New York: Basic Books.
Miller, J. (1984). Culture and the development of everyday social explanation. Journal of Personality and Social Psychology, 46, 961-978.
Morris, M., Menon, T., & Ames, D. (2001). Culturally conferred conceptions of agency: A key to social perception of persons, groups, and other actors. Personality and Social Psychology Review, 5, 169-182.
Morris, M., Nisbett, R., & Peng, K. (1995). Causal attribution across domains and cultures. In D. Sperber & D. Premack (Eds.), Causal cognition: A multidisciplinary debate (pp. 577-614). New York: Clarendon.
Morris, M., & Peng, K. (1994). Culture and cause: American and Chinese attributions for social and physical events. Journal of Personality and Social Psychology, 67, 949-971.
Narikiyo, T., & Kameoka, V. A. (1992). Attributions of mental illness and judgments about help seeking among Japanese-American and White American students. Journal of Counseling Psychology, 39, 363-369.
Needham, J. (1954). Science and civilisation in China (Vol. 4). Cambridge, UK: Cambridge University Press.
Nisbett, R., Peng, K., Choi, I., & Norenzayan, A. (2001). Culture and systems of thought: Holistic versus analytic cognition. Psychological Review, 108, 291-310.
Norenzayan, A., & Nisbett, R. (2000). Culture and causal cognition. Current Directions in Psychological Science, 9, 132-135.
Peng, K., Ames, D., & Knowles, E. D. (2001). Culture and human inference: Perspectives from three traditions. In D. R. Matsumoto (Ed.), Handbook of culture and psychology (pp. 245-264). New York: Oxford University Press.
Peng, K., & Nisbett, R. (1999). Culture, dialectics, and reasoning about contradiction. American Psychologist, 54, 741-754.
Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 10, pp. 174-221). New York: Academic Press.
Ross, L., & Nisbett, R. (1991). The person and the situation: Perspectives of social psychology. New York: McGraw-Hill.
Schwartz, S. (1994). Beyond individualism/collectivism: New cultural dimensions of values. In U. Kim & H. Triandis (Eds.), Individualism and collectivism: Theory, method, and applications (pp. 85-119). Thousand Oaks, CA: Sage.
Singelis, T. (1994). The measurement of independent and interdependent self-construals. Personality and Social Psychology Bulletin, 20, 580-591.
Spelke, E. (1990). Principles of object perception. Cognitive Science, 14, 29-56.
Spelke, E. (1994). Initial knowledge: Six suggestions. Cognition, 50, 431-445.
Spelke, E. (2000). Core knowledge. American Psychologist, 55, 1233-1243.
Stewart, J. (1984). Object motion and the perception of animacy. Paper presented at the meeting of the Psychonomic Society, San Antonio, TX.
Trafimow, D., Triandis, H., & Goto, S. (1991). Some tests of the distinction between the private self and the collective self. Journal of Personality and Social Psychology, 60, 649-655.
Triandis, H. (1995). Individualism and collectivism. Boulder, CO: Westview.
Whang, P., & Hancock, G. (1994). Motivation and mathematics achievement: Comparisons between Asian-American and non-Asian students. Contemporary Educational Psychology, 19, 302-322.
Zukav, G. (1979). The dancing wu li masters: An overview of the new physics. New York: Morrow.

Received March 1, 2002
Revision accepted October 15, 2002
In contemporary popular discourse, it is becoming increasingly rare to hear the word cultural without the prefix multi-. Multicultural experience, however, has been underinvestigated in psychological research on culture, particularly within the most prominent research paradigm of cross-cultural psychology (see Segall, Lonner, & Berry, 1998). There are several reasons for this. First, somewhat obviously, methodological orientations influence a researcher's choice of topics, and culture has been assessed primarily as an individual difference, with the methods for its evaluation developed by clinical and personality researchers to distinguish types of persons. Insofar as the cross-cultural method relies on uncovering differences across cultural groups (usually indexed by nationality), the influence of multiple cultures on an individual merely creates error variance. Second, on a more subtle level, the theoretical assumptions predominant in cross-cultural scholarship have impeded an analysis of the dynamics of multiple cultures in the same mind. The effort to identify the knowledge that varies between but not within large cultural groups has led to the conceptualization of cultural knowledge in terms of very general constructs, such as individualistic as opposed to collectivist value orientations, which apply to all aspects of life (Segall et al., 1998). With the emphasis on domain-general constructs has come the assumption that the influence of culture on cognition is continual and constant. Cultural knowledge is conceptualized to be like a contact lens that affects the individual's perceptions of visual stimuli all of the time. This conception unfortunately leaves little room for a second internalized culture within an individual's psychology. In sum, the methods and assumptions of cross-cultural psychology have not fostered the analysis of how individuals incorporate more than one culture. 
Our introduction of an alternative approach to culture takes as a point of departure a commonly reported experience, which we call frame switching, among bicultural individuals. In frame switching, the individual shifts between interpretive frames rooted in different cultures in response to cues in the social environment (LaFromboise, Coleman, & Gerton, 1993). To capture how bicultural individuals switch between cultural lenses, we adopt a conceptualization of internalized culture as a network of discrete, specific constructs that guide cognition only when they come to the fore in an individual's mind. Fortunately, theories and methods have been developed in cognitive and social psychology, such as the technique of cognitive priming, to manipulate through experiment which of the constructs in an individual's mind comes to the fore (for a review, see Higgins, 1996). We illustrate in this article how this conceptualization yields a set of new methods that uses bicultural participants to test the cognitive consequences of culture experimentally. These methods offer greater internal validity than do the quasi-experimental comparisons typically relied on in cross-cultural research. After reviewing studies of cultural frame switching, we then discuss how this approach elucidates other topics, such as the relation between cultural beliefs and action, the role of culture in emotions and motivations, and the process of acculturation. This approach illuminates not only the experiences of bicultural individuals but also the more general roles that culture plays in mental and emotional life. Frame Switching Bicultural individuals are typically described as people who have internalized two cultures to the extent that both cultures are alive inside of them. Many bicultural individuals report that the two internalized cultures take turns in guiding their thoughts and feelings (LaFromboise et al., 1993; Phinney & Devich-Navarro, 1997).
This is interesting because it suggests that (a) internalized cultures are not necessarily blended and (b) absorbing a second culture does not always involve replacing the original culture with the new one. Classical scholarship on African Americans, for instance, describes movement back and forth between "two souls, two thoughts, two unreconciled strivings, two warring ideals" (DuBois, 1903/1989, p. 5). Ethnographies of Asian Americans and Hispanic Americans, among other groups, describe switches between mindsets rooted in different cultures. Consider, for example, the following experience of a Mexican American individual: At home with my parents and grandparents the only acceptable language was Spanish; actually that's all they really understood. Everything was really Mexican, but at the same time they wanted me to speak good English ... But at school, I felt really different because everyone was American, including me. Then I would go home in the afternoon and be Mexican again. (quoted in Padilla, 1994, p. 30) This example illustrates that frame switching may occur in response to cues such as contexts (home or school) and symbols (language) that are psychologically associated with one culture or the other. Reports of frame switching at work are common in the literature on minority or expatriate employees (e.g., Bell, 1991). Similar experiences are reported by ethnographers during fieldwork: I found myself constantly flip-flopping ... The longer I lived in Samoa, the more I was able to use the Samoans' cultural resources ... the flow of my everyday experiences was increasingly filtered through Samoan models. (Shore, 1996, p. 6) A Dynamic Constructivist Analysis To understand frame switching in bicultural individuals, we have adopted an approach influenced by constructivist approaches to culture in several disciplines and by contemporary social psychological research on the dynamics of knowledge activation.
A first premise is that a culture is not internalized in the form of an integrated and highly general structure, such as an overall mentality, worldview, or value orientation. Rather, culture is internalized in the form of a loose network of domain-specific knowledge structures, such as categories and implicit theories (Bruner, 1990; D'Andrade, 1984; Shore, 1996; Strauss, 1992). A second premise is that individuals can acquire more than one such cultural meaning system, even if these systems contain conflicting theories. That is, contradictory or conflicting constructs can be simultaneously possessed by an individual; they simply cannot simultaneously guide cognition. The key to this distinction is that possessing a particular construct does not entail relying on it continuously; only a small subset of an individual's knowledge comes to the fore and guides the interpretation of a stimulus. This dynamic constructivist approach differs in its conception of culture from cross-cultural psychology, yet it is a complementary rather than a rival approach in that it builds on previous insights and draws attention to novel research questions and novel accounts of phenomena, such as frame switching. A basic research question relevant to frame switching is how particular pieces of cultural knowledge become operative in particular interpretive tasks. To investigate this question, we have drawn concepts and methods from social psychological research on how stereotypes, schemas, and other constructs move in and out of operation (Fiske, 1998). A key concept is that the pieces of an individual's knowledge vary in accessibility (Higgins, 1996; Wyer & Srull, 1986). The more accessible a construct, the more likely it is to come to the fore in the individual's mind and guide interpretation. But what determines whether a piece of knowledge is highly accessible? 
A long-standing hypothesis in cognitive and social psychology holds that a construct, such as a category, is accessible to the extent that it has been activated by recent use (Bruner, 1957). Abundant evidence for this comes from experiments in which researchers manipulate whether participants are exposed to a word or image related to a construct (a prime) and then measure the extent to which the participants' subsequent interpretations of a stimulus are influenced by the primed construct (for a review, see Higgins, 1996). For example, in one experiment (Chiu et al., 1998), participants were primed either with pictures of a masculine man and a feminine woman or with gender-unrelated (control) pictures. Later, in a purportedly unrelated task, they were asked to interpret an ambiguous behavior (e.g., "Donna's friend ordered a coffee, and so did Donna"). Participants primed with gender-related pictures constructed interpretations that showed an influence of gender stereotypes: For example, they judged Donna to be dependent on others in making decisions. Participants in the control condition did not make such interpretations. In this experiment, gender-related pictures activated stereotypes in the minds of participants, which then made it more likely that these stereotypes became operative and guided inferences when participants sought to make sense of the behavioral stimulus. An important design feature in many priming studies is that the priming is presented to participants as part of an unrelated experiment, and participants are not aware of its influence in the interpretive task. Some studies have primed constructs that are one step removed from the construct that applies to the interpretive task. 
For example, priming with words related to African Americans led White participants to interpret hostility in stimulus behavior by race-unspecified actors (Gaertner & McLaughlin, 1983); priming with cues with positive affective valence led participants to subsequently rely on person categories having the same affective valence (Niedenthal & Cantor, 1986). These priming effects rely on the spillover or spread of activation from one construct to other linked constructs within a network of constructs that are psychologically associated for participants (see Anderson, 1976). In our research on frame switching, we used the concept of accessibility and the technique of priming to model the phenomenon experimentally. We posited that bicultural individuals who have been socialized into two cultures, A and B, have, as a result, two cultural meaning systems or networks of cultural constructs, which can be referred to as A' and B'. Accordingly, priming bicultural individuals with images from Culture A would spread activation through Network A', elevating the accessibility of the network's categories and the implicit theories the network comprises. Likewise, priming with images from Culture B would spread activation through Network B', elevating the accessibility of the constructs that network comprises. In looking for the ideal primes to test this account, we searched for symbols that would activate constructs central to specific cultural networks yet not so directly related to the interpretive task that participants could consciously connect the prime with the stimulus. We turned to iconic cultural symbols. Icons: Triggers of Cultural Knowledge Icons have been called "magnets of meaning" in that they connect many diverse elements of cultural knowledge (Betsky, 1997). Like religious icons, cultural icons are images created or selected for their power to evoke in observers a particular frame of mind in a "powerful and relatively undifferentiated way" (Ortner, 1973, p.
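The spreading-activation account described here can be made concrete with a toy model. The following sketch is purely illustrative — the network nodes, construct names, and weights are our assumptions, not the authors' implementation: a prime injects activation at an icon, a decayed share spills over to linked constructs, and the most accessible applicable construct then guides interpretation.

```python
# Toy sketch of the dynamic constructivist account (hypothetical model):
# cultural knowledge as networks of constructs, with priming spreading
# activation through whichever network the icon belongs to.

# Association links: icon -> linked interpretive constructs (weights are
# illustrative assumptions, as are all node names).
LINKS = {
    "american_flag":  {"individual_autonomy": 0.8, "self_enhancement": 0.5},
    "chinese_dragon": {"group_agency": 0.8, "social_obligation": 0.5},
}

def prime(activation, icon, strength=1.0, decay=0.5):
    """Inject activation at an icon; spill a decayed share to linked constructs."""
    activation[icon] = activation.get(icon, 0.0) + strength
    for construct, weight in LINKS.get(icon, {}).items():
        # Spread of activation to psychologically associated constructs.
        activation[construct] = activation.get(construct, 0.0) + strength * weight * decay

def interpret(activation, candidates):
    """The most accessible applicable construct comes to the fore."""
    return max(candidates, key=lambda c: activation.get(c, 0.0))

# A bicultural perceiver holds BOTH networks; priming decides which one
# guides interpretation of an ambiguous stimulus (e.g., the fish display).
perceiver = {}
prime(perceiver, "chinese_dragon")
print(interpret(perceiver, ["individual_autonomy", "group_agency"]))  # -> group_agency
```

In this sketch, possessing a construct (having it in the network) is separated from relying on it (having the highest current activation), which is the distinction the dynamic constructivist approach turns on.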
1339). The potency and distinctiveness of icons make them ideal candidates for primes that would spread activation in a network of cultural constructs. Some examples of central icons in the mainstream American and Chinese cultural traditions are shown in Figure 1. Exposing Chinese American bicultural individuals to American icons should activate interpretive constructs in their American cultural knowledge network; exposing the same individuals to Chinese icons instead should activate constructs in their Chinese cultural knowledge network. Interpreting Behavior of Individual and Group Actors: A Litmus Test Our research also required an interpretive task that is influenced by cultural knowledge in a well-understood manner. Here the legacy of cross-cultural psychology is invaluable in that we can seek to replicate, by priming different cultures within the minds of bicultural individuals, the patterns of differences that have been discovered in previous cross-national comparative studies. Many such patterns exist. For example, in self-description tasks, North Americans are consistently more likely than Japanese to make self-enhancing statements (Kitayama & Markus, 1994). An important consideration, however, is that many Japanese American biculturals are, no doubt, aware of this difference. Hence, exposing bicultural individuals to cultural icons could affect this difference either through unobtrusive priming of knowledge structures or through demand characteristics. We needed a stimulus task that participants would not consciously connect to cultural icons. In short, the task could not be transparently related to culture. To develop a test for cultural priming that would be nontransparent to participants, we turned to interpretations of social behavior. 
Social psychologists have long studied how perceivers attribute the behavior of others to causes, noting systematic biases, such as tracing an individual's actions to personality dispositions rather than other plausible factors such as social context (Heider, 1958; Ross, 1977). Perhaps the most famous evidence for this bias came from studies conducted by Heider and Simmel (1944) in which participants were presented with animated films of geometric shapes, such as triangles and circles, that were moving in patterns suggestive of social interactions. Participants tended to interpret the films by ascribing motives and personalities to an individual shape. Heider (1958) concluded that social information is interpreted by forming units, primarily the unit of an individual person. The person unit then tends to attract most of the perceiver's attention, resulting in causal attributions that overweigh internal personal factors and underweigh factors in the surrounding social situation. Other researchers have studied everyday interactions in which this bias of tracing an individual's behavior to dispositions leads to incorrect interpretations of the individual's behavior and suboptimal ways of interacting with him or her (Jones & Harris, 1967; Morris, Larrick, & Su, 1999). Because of its pervasiveness and consequentiality, this dispositionist bias has been called the fundamental attribution error (Ross, 1977). Recent research has allowed psychologists to identify the role that culture plays in shaping the dispositionist bias in social perception. Prompted by ethnographic accounts of Chinese social understanding (Hsu, 1953), Morris and Peng (1994) investigated the hypothesis that the tendency of perceivers to focus on individuals and interpret behavior in terms of their internal dispositions may be more marked in North America than in China. 
They reasoned that an implicit theory that individuals are autonomous relative to the pressures of the group is central to American culture, whereas in Chinese culture a more salient implicit theory emphasizes that individuals accommodate the greater autonomy of groups (Su et al., 1999). In studies in which they used several methods, Morris and Peng showed that American participants accorded more weight to an individual's personal dispositions, whereas Chinese participants accorded more weight to an individual's social context. Further evidence for the difference in implicit theories emerged from studies directly measuring generalized beliefs about individuals versus social groups and institutions (Chiu, Dweck, Tong, & Fu, 1997). In a recent review of studies comparing North American and East Asian perceivers, researchers concluded that the sharpest differences in attributions for the cause of an individual's behavior lie in the weight accorded to the contexts of constraints and pressures imposed by social groups (Choi, Nisbett, & Norenzayan, 1999). Consistent with this indication that East Asians accord causal potency to social collectives, in studies of how perceivers attribute actions by groups researchers have found that East Asians make attributions to the dispositions of groups more than Americans do (Menon, Morris, Chiu, & Hong, 1999). In sum, cultural differences in the attributional weight accorded to the dispositions of individuals versus groups are well documented. An important feature of attribution differences is that they can be studied with nontransparent methods. One of the methods used by Morris and Peng (1994) adapted Heider's strategy of presenting animated films that participants do not consciously associate with social or cultural topics. 
Morris and Peng designed animated films of fish featuring an individual and a group in which it was ambiguous whether the individual's differing trajectory reflected internal dispositions or the influence of the group. In one type of display, the individual fish swam outside of the group, leaving ambiguous whether the individual's separation reflected an internal disposition (a leader leading other fish) or pressure from the group (an outcast being chased by other fish). In explaining the individual fish's behavior, Chinese participants attributed the behavior less to the internal dispositions of the fish in front and more to external (group) factors than did American participants (see Figure 2). This method of measuring cultural differences through the ways social perceptions are anthropomorphically projected onto animals has the advantage that participants are unaware that culture is relevant to the task. Cultural Priming Studies In a series of studies, we experimentally created frame switching among bicultural individuals. Next, we review three of the studies. The first two studies used the priming method to replicate in bicultural individuals the cross-national attribution differences revealed by Morris and Peng (1994). The third study is a conceptual replication of the first two studies, but the dependent measures were attributions for a social event. Bicultural Participants Who were the bicultural individuals we recruited in the studies? Our initial studies involved Westernized Chinese students in Hong Kong. Although traditional Chinese values are emphasized in the socialization processes in Hong Kong (Ho, 1986), contemporary university students in Hong Kong are acculturated with Western social beliefs and values (Bond, 1993). This is related to the fact that Hong Kong was a British-administered territory for more than a century. Before 1997, English, not Chinese, was the official language of instruction in about 80% of the secondary schools (Young, Giles, & Pierson, 1986).
Furthermore, large British and American expatriate communities and the salient presence of English-language television, films, and so forth mean that Hong Kong Chinese students have been exposed to Euro-American social constructs extensively. Yet, although Hong Kong Chinese students are rather Westernized in some aspects of their self-concept and value system (see Bond & Cheung, 1981; Fu, 1999; Triandis, Leung, & Hui, 1990), they maintain their primary social identity as Hong Kong Chinese (Hong, Yeung, Chiu, & Tong, 1999) and subscribe to core Chinese values (Chinese Culture Connection, 1987). In sum, Hong Kong Chinese students in the late 1990s belong to a population of biculturally socialized individuals. In our later experiment (reported in Hong, Morris, Chiu, & Benet-Martínez, 2000), we tested a different group of bicultural individuals. These were China-born Californian college students who had lived at least five years in a Chinese society and at least five years in North America before attending college. Whereas the Hong Kong bicultural group represented bicultural identification resulting from extensive Westernization of a society, the Chinese American group represented bicultural identification resulting from immigration: These are two primary ways that culture moves across territories to create multicultural societies (Hermans & Kempen, 1998). Although we do not report in this article the study with Chinese American biculturals, results revealed that these participants recognized and were influenced by American and Chinese cultural icons in similar ways as were the members of the Hong Kong bicultural group. Priming Materials We presented Hong Kong Chinese students with a set of cultural icons designed to activate the associated social theories that produce cultural biases in attribution. In our research we used several kinds of icons. Some involved symbols (e.g., the American flag vs.
a Chinese dragon), legendary figures from folklore or popular cartoons (e.g., Superman vs. Stone Monkey), famous people (e.g., Marilyn Monroe vs. a Chinese opera singer), and landmarks (e.g., the Capitol Building vs. the Great Wall). Several prior studies have demonstrated that exposure to such icons activates the corresponding cultural meaning system. For instance, Hong, Chiu, and Kung (1997, Experiment 1) found that exposure to these Chinese icons led Hong Kong Chinese students to increase their endorsement of Chinese values. Recently, Kemmelmeier and Winter (1998) found that Americans showed an elevated endorsement of independence values after being exposed to the American flag. Initial Tests In one study (Hong et al., 1997, Experiment 2), 303 Hong Kong Chinese undergraduate students were randomly assigned to the American culture priming condition, the Chinese culture priming condition, or the control condition. Participants in the American culture priming condition were shown six pictures of American icons and were asked to answer short questions about the pictures (e.g., "Which country does this picture symbolize?" "Use three adjectives to describe the character of the legendary figure in this picture"). Participants in the Chinese culture priming condition were shown six pictures of Chinese icons and were asked to answer the same short questions. These conditions were designed to inject activation into American and Chinese construct networks, respectively, leading to elevated accessibility of their respective implicit theories about the causality of social events. Participants in the control condition were shown six drawings of geometric figures and asked to indicate where they thought there should be a shade or a shadow. This condition was designed to inject no activation into cultural knowledge networks but to otherwise resemble the cultural prime conditions. 
Then, in an allegedly unrelated task, participants were given an attribution task adapted from Morris and Peng (1994). In this measure, participants were shown a realistic picture of a fish swimming in front of a group of fish (see Figure 3) and asked to indicate on a 12-point scale why one fish was swimming in front of the group. A score of 1 on the scale meant very confident that it is because the one fish is leading the other fish (an internal cause), and a score of 12 meant very confident that it is because the one fish is being chased by the other fish (an external cause). Consistent with the pattern identified in cross-national studies (Morris & Peng, 1994), we expected that participants would be less inclined to interpret the individual fish's behavior in terms of the external social pressure after American priming than after Chinese priming. Indeed, as predicted, participants who were exposed to American pictures were significantly less confident in the external (vs. internal) explanation than were those who were exposed to Chinese pictures (see Figure 4). Participants in the control condition fell midway between the two culture priming conditions. In a second experiment, we replicated the cultural priming effect with a less constricted measure of causal attributions (Hong et al., 1997, Experiment 3). Participants were 75 Hong Kong Chinese undergraduate students who were randomly assigned to the American culture priming condition, the Chinese culture priming condition, or the control condition. In the American culture priming condition, participants were shown five pictures of American icons and asked to write 10 sentences to describe the pictures in terms of American culture. Participants in the Chinese culture priming condition were shown five pictures of Chinese icons and asked to write 10 sentences to describe the pictures in terms of Chinese culture. 
In the control condition, participants were shown five pictures of physical landscapes and asked to write 10 sentences about the landscapes. This procedure lasted for 10 minutes. Then, in an ostensibly unrelated task, participants were presented with a picture depicting a fish swimming in front of a school of fish and asked to write down what they thought was the major reason why the fish was swimming in front of other fish. This open-ended response format allowed participants to generate explanations that were not limited to the options we provided. On the basis of Miller's (1984) coding scheme, the explanations were coded into inferences of internal properties or external properties. Again, participants' likelihood of generating external explanations differed significantly across the three experimental conditions. As predicted, fewer participants in the American culture priming condition than in the Chinese culture priming condition generated explanations referring to the external social context (see Figure 4). The proportion of participants who generated external explanations in the control condition fell midway between the proportions of the two culture priming conditions, much as in the previous study. A Conceptual Replication In our third study, we checked that the priming effect is replicated when the task involves interpreting human actions. We asked participants to make an attribution for a character's deviation from a diet--an action chosen because it has no obvious connection to the cultural icons. We randomly assigned 234 Hong Kong Chinese high school students to one of three priming conditions. Participants in the American culture priming condition saw eight American icons and wrote 10 sentences about American culture. Participants in the Chinese culture priming condition saw eight Chinese icons and wrote 10 sentences about Chinese culture. 
Participants in the control condition saw pictures of natural landscapes and wrote 10 sentences about the landscapes. This priming manipulation lasted approximately 15 minutes. Then participants in all conditions read a story about an overweight boy who was advised by a physician not to eat food with high sugar content. One day, he and his friends went to a buffet dinner where a delicious-looking cake was offered. Despite its high sugar content, he ate it. After reading this brief description, participants were asked to respond to three sets of questions. Participants were asked to indicate the extent to which the boy's eating of the cake was caused by his dispositions. That is, they rated factors such as his personality dispositions (e.g., he lacks the ability to control himself, etc.) on a 10-point scale, ranging from 1 (has very little influence on his action) to 10 (has a lot of influence on his action). In addition, participants were asked to indicate the extent to which the boy's eating of the cake was caused by pressures and constraints of his external social situation (situational reasons, friends' pressure on him, etc.) on the same 10-point scale. As in the previous two studies, participants in the three priming conditions differed on the weight accorded to the external social situation as a determinant of the boy's behavior (see Figure 5). As predicted, participants in the American culture priming condition accorded less weight to external social factors than did participants in the Chinese culture priming condition (see Figure 4). On this measure, participants in the control condition fell in between those in the Chinese and American culture priming conditions. Participants in the three priming conditions, however, did not differ on the internal attribution measure.
This result is consistent with the conclusion of Choi et al.'s (1999) review that cultural influences on attributions for an individual's behavior originate more from the differential weight placed on the external social context (when these factors are salient) than from the differential weight placed on the actor's internal dispositions. In sum, through priming bicultural individuals, we have replicated the differences in attribution previously identified in quasi-experimental comparisons of groups in different countries. In so doing, we have experimentally modeled the phenomenon of frame switching in bicultural individuals and have demonstrated that multiple cultures can direct cognition within one individual's mind.

Extending the Dynamic Constructivist Approach

We began by analyzing the experience of frame switching reported by multicultural individuals in terms of a dynamic constructivist view of culture and cognition. We have experimentally modeled the phenomenon through priming experiments and have found support for our predictions. Culturally conferred implicit theories became operative in guiding the interpretation of stimuli to the extent that their accessibility was high because of recent activation. Having documented the fruitfulness of a dynamic constructivist approach to this phenomenon in the experience of bicultural individuals, we now discuss its assumptions and implications more generally as a framework for analyzing the role of culture in psychology. Our assumption that cultural knowledge exists at the level of domain-specific categories and theories derives from the constructivist tenet that knowledge must be specific enough to constrain interpretations of stimulus information (Bruner, 1957; Heider, 1958).
Bruner (1990) and others have explicated a constructivist view of cultural knowledge as a toolbox of discrete, specific constructs that differs from the dominant view in cross-cultural psychology that cultural knowledge exists as an integrated, domain-general construct. Several contemporary anthropologists (Shore, 1996; Sperber, 1996) and sociologists (DiMaggio, 1997) have staked out similar positions within their disciplines, challenging more general conceptions of cultural knowledge as foundational schemas or value orientations. However, our approach goes beyond these other constructivist approaches to culture in its emphasis on the dynamics of knowledge activation. In describing the dynamics of cultural knowledge, we see great potential in drawing on research concerning construct accessibility. Whereas the cross-cultural literature generally explains judgment and decision outcomes in terms of whether individuals in a given cultural group possess a given knowledge construct, we see the possession of a construct as a less critical variable than whether the construct is highly accessible (cf. Trafimow, Triandis, & Goto, 1991). Our guess is that the most important implicit theories about the social world are possessed by people everywhere; the variance across cultural groups probably lies in the relative accessibility of particular implicit theories, not in whether the theories are possessed. In our experiments concerning frame switching in bicultural individuals, the emphasis was on temporary accessibility of a construct caused by the priming of related constructs. Equally useful in theories of culture may be the related notion that some constructs attain chronic accessibility, in part because accessibility is maintained by frequency of use (Higgins, King, & Mavin, 1982; for a review, see Higgins, 1996). 
Some findings in the cross-cultural literature that have been interpreted in terms of whether participants possess a construct (i.e., a performance difference reflects which self-concepts individuals possess in Culture A vs. Culture B) might be fruitfully reframed in terms of chronic accessibility (i.e., a performance difference reflects which self-concepts are made chronically accessible in Culture A vs. Culture B). Another virtue of an account based on accessibility is that it points to how factors outside of the individual person--such as institutions, discourse, or relationships--might prime cultural theories and keep these theories prominent in the minds of culture members. Cross-cultural researchers have been troubled at times that the influence of a given cultural construct does not emerge consistently when tasks are run under different conditions. Accessibility may provide an important clue to understanding this observation. Social cognition researchers have found that some conditions create an epistemic motivation for a quick reduction of ambiguity (the need for cognitive closure), and this increases the extent to which perceivers work top-down from accessible constructs, such as cultural theories, when constructing interpretations (Kruglanski & Webster, 1996). Consistent with the notion that the need for closure amplifies cultural influence, recent research has found that a high need for closure fosters the tendency to make attributions to individual dispositions among North Americans and the tendency to make attributions to the dispositional properties of a group among Chinese perceivers (Chiu, Morris, Hong, & Menon, 2000).
More generally, cultural psychology may benefit from the incorporation of many of the insights in social cognition research about the moderating factors (e.g., need for cognition, availability of cognitive capacity) that determine when constructs become accessible and when accessible constructs have the most influence on cognition. Many of the processes and conditions that moderate perceivers' reliance on stereotypes and other knowledge structures may also affect their reliance on cultural theories. Stronger support may emerge for models of the consequences of culture once the moderating factors are better specified.

Implications for Other Research Areas

Methodology

The research reviewed here shows that it is possible to conduct experimental studies on culture. In the same way that quasi-experimental cross-cultural studies added a new tool for cultural research with some advantages over ethnographic observation, priming experiments offer a new tool for cultural research that has advantages over the preexisting methods. A first use of the priming method is to explore the content of cultural knowledge. This is usually done by analyzing the content of samples of conversation and other texts. An alternative method is to analyze the content of thoughts elicited by priming with cultural icons. For example, by priming North American perceivers with pictures of the American flag and querying their associations, Kemmelmeier and Winter (1998) were able to analyze the constellation of values associated with this cultural icon. Similarly, exposing Hong Kong Chinese to pictures of Chinese cultural icons leads to elevated endorsement of certain social values (Hong et al., 1997, Experiment 1). Thus, the culture priming technique offers a new way to uncover the content of cultural knowledge. A second role of priming lies in establishing the causal consequences of cultural knowledge.
Experiments with the priming method allow for true random assignment of participants to cultural conditions, thus providing tests of culture's consequences with greater internal validity than the tests provided by the quasi-experimental method of cross-national studies. Hence, the priming method complements cross-cultural comparisons in isolating the causal role of culture.

Language as Prime

Aside from cultural icons, language can also be an effective means of activating cultural constructs. In fact, considerable research evidence shows language effects in bilingual individuals' responses to a wide range of psychological inventories, such as measures of personality (Earle, 1969; Ervin, 1964), values (Bond, 1983; Marín, Triandis, Betancourt, & Kashima, 1983), self-concept (Trafimow, Silverman, Fan, & Law, 1997), emotional expression (Matsumoto & Assar, 1992), and even other-person descriptions (Hoffman, Lau, & Johnson, 1986). A compelling explanation for these findings has been that for bilingual individuals, the two languages are often associated with two different cultural systems. In Bond's (1983) and Earle's (1969) studies, for instance, the responses of bilingual Chinese were more Western when they responded to the original (English) questionnaire than when they responded to a Chinese translation of it. Interestingly, Earle explained these results in dynamic constructivist terms. According to him, these bilingual individuals had learned Chinese at home and English at school and had, at the same time, acquired two distinct sets of cultural constructs reflecting the two languages' cultures. The Chinese version of the questionnaire activated the Chinese language culture, and the English version, the English language culture (see Krauss & Chiu, 1998). As such, the dynamic constructivist approach could help researchers to better understand the research on sociopersonality factors in bilingualism.
Moving Beyond Cognition

Heretofore, we have discussed the application of the dynamic approach to culture solely in the study of cognition. Clearly, however, the priming method can be used in analogous ways to study emotions. This experimental technique can be used to investigate the emotions triggered by exposure to cultural icons, and this may prove more incisive than trying to infer culture-emotion relationships from cross-national comparisons. Although research could commence with the study of a single culture, it would be interesting to see whether culturally distinct emotional states could be induced in bicultural individuals through priming with different icons. It is also interesting to explore the other side of this question: What emotions lead people to embrace cultural icons and cultural ideas more generally? Some evidence that cultural icons have more than a cold cognitive impact comes from work by Greenberg, Porteus, Simon, Pyszczynski, and Solomon (1995), in which they demonstrated that individuals led to think about their mortality are subsequently more respectful toward iconic cultural objects (e.g., a flag or crucifix). Central cultural symbols play a key role in the motivated identification of self with enduring cultural traditions. At the same time that the dynamic constructivist approach can be extended more broadly, it is also important to note that this model of culture in terms of an individual's knowledge structures obviously does not capture all the manifestations of culture that matter. Culture exists in many forms other than knowledge in an individual's head (see Kitayama & Markus, 1994). Other carriers of culture, such as practices, have been identified by psychological researchers using the sociocultural approach (see Rogoff, 1990) and by sociologists studying relationship patterns and institutions (see Morris, Podolny, & Ariel, 1999).
Hence, although the activation of cultural knowledge may have important influences on emotions and motives as well as judgments and decisions, many interesting aspects of culture may not be mediated by knowledge activation at all. A complete understanding of culture and psychology requires that the dynamic constructivist approach be complemented by analyses that are less knowledge-oriented. Also, to a large extent, cultures are shaped in relation to each other, so the tension between cultures needs to be part of a comprehensive account of any single culture. This is particularly relevant in understanding the dynamics of a multiply acculturated individual. In our studies, we chose individuals identified with two cultures (North American and Chinese) that for the most part are not antagonistic to each other. If the two cultural groups an individual has been extensively exposed to involve intense political antagonism (such as Serbs and Muslims in Bosnia), presenting cultural icons of one culture may elicit reactive identification with the opposite culture (see Krauss & Chiu, 1998). Two conclusions can be drawn from this point. First, even within studies of culture and cognition, researchers need to proceed with an awareness of the intergroup and political connotations of particular cultural group membership. Second, reaction against unwanted reminders of a culture may be amenable to a dynamic constructivist analysis. One possibility is that antagonism leads to a psychological linking of the two cultural networks, so that activation of the constructs from the antagonist culture spreads to the other culture. Another possibility is that individuals actively control the dynamics of construct accessibility rather than being passively affected by them. In that case, activating the antagonist culture may cause active suppression and thus would not yield any cultural priming effect. These possibilities can be explored in future research.
The Process of Acculturation

In addition to creating an understanding of internalized culture as an antecedent variable, the dynamic constructivist approach may lead to fresh insights about how culture gets inside minds in the first place, in other words, the psychology of acculturation. Theoretical models proposed by Berry (1988), Birman (1994), LaFromboise et al. (1993), and Phinney (1996) are useful in describing the behavioral (e.g., how active one is in ethnic organizations and social groups), motivational-attitudinal (e.g., how much value is given to assimilating into the mainstream culture), or phenomenological (e.g., how much conflict or discrimination is experienced in the new culture) aspects of the acculturation process. These models, however, focus on the outcome of acculturation more than on the process. Individuals are scored on the extent to which they have absorbed the new culture or retained the original one. The dynamic constructivist approach could supplement the traditional approach by emphasizing the process of internalizing a new culture, highlighting dynamics such as frame switching that many people experience in the process. More important, a dynamic constructivist approach lends itself to viewing acculturation as a more active process. The end result--thinking and behaving like a member of the host culture--is seen as a state, not a trait. This state will occur when interpretive frames from the host culture are accessible. We submit that individuals undergoing acculturation, to some extent, manage the process by controlling the accessibility of cultural constructs. People desiring to acculturate quickly surround themselves with symbols and situations that prime the meaning system of the host culture. Conversely, expatriates desiring to maintain the accessibility of constructs from their home culture surround themselves with stimuli priming that culture.
For example, one of the current authors, who is Spanish but has lived for some years in the United States, often surrounds herself with Spanish music, food, and paintings to keep alive her Spanish ways of thinking and feeling. Active processes of priming oneself may help multicultural individuals in their ongoing effort to negotiate and express their cultural identities. Future research should investigate not only the outcome of acculturation but also the processes through which individuals navigate cultural transitions.

Conclusion

We have proposed a dynamic constructivist approach to culture and cognition and have reported supportive evidence. A distinctive contribution of this approach is in describing how a given individual incorporates multiple cultures and in describing how and when particular pieces of cultural knowledge become operative in guiding an individual's construction of meaning. This less monolithic view of culture seems particularly appropriate at this time of increasing cultural interconnection. Across the world, there is a drift toward culturally polyglot, pluralistic societies. Yet, in part because of the strain of negotiating cultural complexity, a countervailing resurgence of efforts to separate individuals into culturally "pure" groups also exists. By experimentally modeling frame switching among bicultural individuals, we have shown that research on "uncontaminated" cultural groups is not the only viable way to identify cultural effects on cognition. In sum, a dynamic constructivist approach may open new possibilities in understanding culture and transcultural experiences.

References

Anderson, J. R. (1976). Language, memory, and thought. Hillsdale, NJ: Erlbaum.
Bell, E. (1991). The bicultural life experience of career-oriented Black women. Journal of Organizational Behavior, 11, 459-478.
Berry, J. W. (1988). Acculturation and psychological adaptation: Conceptual overview. In J. W. Berry & R. C. Annis (Eds.), Ethnic psychology: Research and practice with immigrants, refugees, native peoples, ethnic groups and sojourners (pp. 41-52). Amsterdam: Swets & Zeitlinger.
Betsky, A. (1997). Icons: Magnets of meaning. San Francisco: Chronicle Books.
Birman, D. (1994). Acculturation and human diversity in a multicultural society. In E. Trickett, R. J. Watts, & D. Birman (Eds.), Human diversity: Perspectives on people in context (pp. 261-284). San Francisco: Jossey-Bass.
Bond, M. H. (1983). How language variation affects inter-cultural differentiation of values by Hong Kong bilinguals. Journal of Language and Social Psychology, 2, 57-66.
Bond, M. H. (1993). Between the yin and the yang: The identity of the Hong Kong Chinese (Hong Kong Chinese University, Professorial Inaugural Lecture Series 19). Chinese University Bulletin (Suppl. 31).
Bond, M. H., & Cheung, T. (1983). College students' spontaneous self-concept: The effects of culture among respondents in Hong Kong, Japan, and the United States. Journal of Cross-Cultural Psychology, 14, 153-171.
Bruner, J. S. (1957). Going beyond the information given. In University of Colorado, Boulder, Department of Psychology (Ed.), Contemporary approaches to cognition (pp. 218-238). Cambridge, MA: Harvard University Press.
Bruner, J. S. (1990). Acts of meaning. Cambridge, MA: Harvard University Press.
Chinese Culture Connection. (1987). Chinese values and the search for culture-free dimensions of culture. Journal of Cross-Cultural Psychology, 18, 143-164.
Chiu, C.-Y., Dweck, C. S., Tong, J. Y., & Fu, J. H. (1997). Implicit theories and conceptions of morality. Journal of Personality and Social Psychology, 73, 923-940.
Chiu, C.-Y., Hong, Y.-Y., Lam, I. C., Fu, H., Tong, Y., & Lee, S. (1998). Stereotyping and self-presentation: Effects of gender stereotype activation. Group Processes and Intergroup Relations, 1, 81-96.
Chiu, C.-Y., Morris, M. W., Hong, Y.-Y., & Menon, T. (2000). Motivated cultural cognition: The impact of implicit cultural theories on dispositional attribution varies as a function of need for closure. Journal of Personality and Social Psychology, 78, 247-259.
Choi, I., Nisbett, R. E., & Norenzayan, A. (1999). Causal attribution across cultures: Variation and universality. Psychological Bulletin, 125, 47-63.
D'Andrade, R. G. (1984). Cultural meaning systems. In R. A. Shweder & R. A. LeVine (Eds.), Culture theory: Essays on mind, self, and emotion (pp. 88-119). Cambridge, England: Cambridge University Press.
DiMaggio, P. (1997). Culture and cognition. Annual Review of Sociology, 23, 263-287.
DuBois, W. E. B. (1989). The souls of Black folk. New York: Penguin. (Original work published 1903)
Earle, M. (1969). A cross-cultural and cross-language comparison of dogmatism scores. Journal of Social Psychology, 79, 19-24.
Ervin, S. M. (1964). Language and TAT content in bilinguals. Journal of Abnormal and Social Psychology, 68, 500-507.
Fiske, S. T. (1998). Stereotyping, prejudice, and discrimination. In D. T. Gilbert, S. T. Fiske, & G. Lindzey (Eds.), Handbook of social psychology (4th ed., Vol. 2, pp. 357-411). New York: McGraw-Hill.
Fu, H. Y. (1999). The social cognitive mediation of multiple enculturation and values. Unpublished doctoral dissertation, University of Hong Kong.
Gaertner, S. L., & McLaughlin, J. P. (1983). Racial stereotypes: Associations and ascriptions of positive and negative characteristics. Social Psychology Quarterly, 46, 23-40.
Greenberg, J., Porteus, J., Simon, L., Pyszczynski, T., & Solomon, S. (1995). Evidence of a terror management function of cultural icons: The effects of mortality salience on the inappropriate use of cherished cultural symbols. Personality and Social Psychology Bulletin, 21, 1221-1228.
Heider, F. (1958). The psychology of interpersonal relations. New York: Wiley.
Heider, F., & Simmel, M. (1944). An experimental study of apparent behavior. American Journal of Psychology, 57, 243-259.
Hermans, H. J. M., & Kempen, H. J. G. (1998). Moving cultures: The perilous problems of cultural dichotomies in a globalizing society. American Psychologist, 53, 1111-1120.
Higgins, E. T. (1996). Knowledge activation: Accessibility, applicability and salience. In E. T. Higgins & A. W. Kruglanski (Eds.), Social psychology: Handbook of basic principles (pp. 133-168). New York: Guilford Press.
Higgins, E. T., King, G. A., & Mavin, G. H. (1982). Individual construct accessibility and subjective impressions and recall. Journal of Personality and Social Psychology, 43, 35-47.
Ho, D. Y. F. (1986). Chinese patterns of socialization: A critical review. In M. H. Bond (Ed.), The psychology of the Chinese people (pp. 1-35). Hong Kong: Oxford University Press.
Hoffman, C., Lau, I., & Johnson, D. R. (1986). The linguistic relativity of person cognition: An English-Chinese comparison. Journal of Personality and Social Psychology, 51, 1097-1105.
Hong, Y.-Y., Chiu, C.-Y., & Kung, T. M. (1997). Bringing culture out in front: Effects of cultural meaning system activation on social cognition. In K. Leung, Y. Kashima, U. Kim, & S. Yamaguchi (Eds.), Progress in Asian social psychology (Vol. 1, pp. 135-146). Singapore: Wiley.
Hong, Y.-Y., Morris, M., Chiu, C.-Y., & Benet-Martínez, V. (2000). Applicability of accessible cultural knowledge. Unpublished manuscript, Hong Kong University of Science and Technology.
Hong, Y.-Y., Yeung, G., Chiu, C.-Y., & Tong, Y. (1999). Social comparison during political transition: Interaction of entity versus incremental beliefs and social identities. International Journal of Intercultural Relations, 23, 257-279.
Hsu, F. L. K. (1953). Americans and Chinese: Two ways of life. New York: Schuman.
Jones, E. E., & Harris, V. A. (1967). The attribution of attitudes. Journal of Experimental Social Psychology, 3, 1-24.
Kemmelmeier, M., & Winter, D. G. (1998). What's in an American flag? National symbols prime cultural self-construals. Unpublished manuscript, University of Michigan, Ann Arbor.
Kitayama, S., & Markus, H. R. (Eds.). (1994). Emotion and culture: Empirical studies of mutual influence. Washington, DC: American Psychological Association.
Krauss, R. M., & Chiu, C.-Y. (1998). Language and social behavior. In D. T. Gilbert, S. T. Fiske, & G. Lindzey (Eds.), Handbook of social psychology (4th ed., Vol. 2, pp. 41-88). New York: McGraw-Hill.
Kruglanski, A. W., & Webster, D. M. (1996). Motivated closing of the mind: "Seizing" and "freezing." Psychological Review, 103, 263-283.
LaFromboise, T., Coleman, H., & Gerton, J. (1993). Psychological impact of biculturalism: Evidence and theory. Psychological Bulletin, 114, 395-412.
Marín, G., Triandis, H. C., Betancourt, H., & Kashima, Y. (1983). Ethnic affirmation versus social desirability: Explaining discrepancies in bilinguals' responses to a questionnaire. Journal of Cross-Cultural Psychology, 14, 173-186.
Matsumoto, D., & Assar, M. (1992). The effects of language on judgments of universal facial expressions of emotions. Journal of Nonverbal Behavior, 16, 85-99.
Menon, T., Morris, M. W., Chiu, C.-Y., & Hong, Y.-Y. (1999). Culture and construal of agency: Attribution to individual versus group dispositions. Journal of Personality and Social Psychology, 76, 701-717.
Miller, J. G. (1984). Culture and the development of everyday social explanation. Journal of Personality and Social Psychology, 46, 961-978.
Morris, M. W., Larrick, R., & Su, S. K. (1999). Misperceiving negotiation counterparts: When situationally determined bargaining behaviors are attributed to personality traits. Journal of Personality and Social Psychology, 77, 52-67.
Morris, M. W., Nisbett, R. E., & Peng, K. (1995). Causal attribution across domains and cultures. In D. Sperber, D. Premack, & A. J. Premack (Eds.), Causal cognition: A multidisciplinary debate (pp. 577-612). Oxford, England: Clarendon Press.
Morris, M. W., & Peng, K. (1994). Culture and cause: American and Chinese attributions for social and physical events. Journal of Personality and Social Psychology, 67, 949-971.
Morris, M. W., Podolny, J., & Ariel, S. (1999). The ties that bind in different cultures: A study of employee networks and obligation in a multinational financial institution. Unpublished manuscript, Graduate School of Business, Stanford University, Stanford, California.
Niedenthal, P. M., & Cantor, N. (1986). Affective responses as guides to category-based inferences. Motivation and Emotion, 10, 217-232.
Ortner, S. B. (1973). On key symbols. American Anthropologist, 75, 1338-1346.
Padilla, A. M. (1994). Bicultural development: A theoretical and empirical examination. In R. G. Malgady & O. Rodriguez (Eds.), Theoretical and conceptual issues in Hispanic mental health (pp. 20-51). Malabar, FL: Krieger.
Phinney, J. (1996). When we talk about American ethnic groups, what do we mean? American Psychologist, 51, 918-927.
Phinney, J., & Devich-Navarro, M. (1997). Variations in bicultural identification among African American and Mexican American adolescents. Journal of Research on Adolescence, 7, 3-32.
Rogoff, B. (1990). Apprenticeship in thinking: Cognitive development in social context. New York: Oxford University Press.
Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 10, pp. 173-220). New York: Academic Press.
Segall, M. H., Lonner, W. J., & Berry, J. W. (1998). Cross-cultural psychology as a scholarly discipline: On the flowering of culture in behavioral research. American Psychologist, 53, 1101-1110.
Shore, B. (1996). Culture in mind: Cognition, culture, and the problem of meaning. New York: Oxford University Press.
Sperber, D. (1996). Explaining culture: A naturalistic approach. Oxford, England: Blackwell.
Strauss, C. (1992). Models and motives. In R. G. D'Andrade & C. Strauss (Eds.), Human motives and cultural models (pp. 1-20). New York: Cambridge University Press.
Su, S. K., Chiu, C.-Y., Hong, Y.-Y., Leung, K., Peng, K., & Morris, M. W. (1999). Self-organization and social organization: American and Chinese constructions. In T. R. Tyler, R. Kramer, & O. John (Eds.), The psychology of the social self (pp. 193-222). Mahwah, NJ: Erlbaum.
Trafimow, D., Silverman, E. S., Fan, R. M., & Law, J. S. (1997). The effects of language and priming on the relative accessibility of the private self and collective self. Journal of Cross-Cultural Psychology, 28, 107-123.
Trafimow, D., Triandis, H. C., & Goto, S. G. (1991). Some tests of the distinction between the private self and the collective self. Journal of Personality and Social Psychology, 60, 649-655.
Triandis, H. C., Leung, K., & Hui, C. H. (1990). A method for determining cultural, demographic, and personal constructs. Journal of Cross-Cultural Psychology, 21, 302-318.
Wyer, R. S., & Srull, T. K. (1986). Human cognition in its social context. Psychological Review, 93, 322-359.
Young, L., Giles, H., & Pierson, H. (1986). Sociopolitical change and perceived vitality. International Journal of Intercultural Relations, 10, 459-469.

From checker at panix.com Sun Jan 8 20:01:18 2006
From: checker at panix.com (Premise Checker)
Date: Sun, 8 Jan 2006 15:01:18 -0500 (EST)
Subject: [Paleopsych] Hermenaut: Anorexia/Technology: An Introduction
Message-ID:

Anorexia/Technology: An Introduction
http://www.hermenaut.com/a149.shtml

[Two related articles appended. Links omitted.]

"I was sick for a long time, and that made me think about factories." --Jean-Luc Godard, in Numéro Deux (1975)

Hermes, the god of interpretation, gave his name to this magazine via those thinkers who've used "philosophical hermeneutics" to challenge our collective habit of taking received notions of self, truth, morality, and other phenomena--our social arrangements, our basic expectations--for granted. The "heady philosophy" of our subtitle is the kind that challenges prejudices masquerading as common sense. Lately, we've found the received wisdom surrounding the subjects of anorexia (as in "Calista Flockhart, Courteney Cox, Jennifer Aniston, Teri Hatcher, Tori Spelling, Helen Hunt, Celine Dion, ad--as it were--nauseam, must really want to be thin") and technology (as in "Only a reactionary Neo-Luddite could possibly believe that the 'Information Revolution' is anything less than inevitable") particularly infuriating. Which is why you're holding an issue of Hermenaut with not one, but two themes. Here at Hermenaut HQ the editorial decision-making process operates strictly on the basis of personal obsession; we never stop to wonder what anyone else thinks.
So when we sent out the call for submissions to the "Anorexia/Technology" issue, we suspected we'd be deluged with essays on food fetishism, the erotics of self-discipline, and the politics and poetics of willful starvation--and in this we were not disappointed. But we also never doubted for a moment that the editors' Hermenaut.com in-box would soon be jammed with close readings of the ongoing communications technology "revolution." After all, we writers and artists are supposed to be particularly threatened by new technologies: Tolstoy, for example, refused to use the Dictaphone because he was afraid he'd find it "too dreadfully exciting"--we know exactly what he means. This latter type of article, however, failed, for the most part, to materialize. Readers, it is at these moments of crisis, which happen at least once a quarter around here, that the entire Hermenaut project is put to the test. Do we bow to the will of the public, or not? As always, the answer came back "not." I was informed by the Hermenaut staff that I'd just have to write about technology myself. So here goes. Not long ago, I quit a lucrative and high-profile job at one of those ultra-capitalized Web-based start-up companies you're always hearing about. I decided that I needed a change when I almost killed myself on the highway one morning trying to hotsync my Palm Pilot with the dash-mounted laptop while simultaneously "fertilizing" my virtual Chia Pet, Harry. (Don't worry: Although he was a bit shaken, Harry survived.) As I hung there, upside-down, waiting for the Jaws of Life to set me free, I found that I suddenly craved freedom from wearable disposable computers, digital mini-walkie-talkie phones, two-way pagers, personalized "start pages"--from any and every device, in other words, that served only to exacerbate the worst qualities of my own monkey-mind. At that moment I was transformed, or so I believed at the time, into that most ludicrous of contemporary creatures: a Neo-Luddite.
Like others of my ilk, I became paranoid: Technology fills the world with itself, I suddenly realized. It's like a virus that way. I also began to feel physically nauseated by the thought of what Jean Baudrillard calls our current "society of excrescence." In "The Anorexic Ruins," an essay from several years ago, Baudrillard insisted that even if we manage to ignore 99 percent of all the information and products out there, we are still "electrocuted" by what remains. And in "Waiting for the Year 2000," published this past spring, he suggests that, thanks to late-20th-century technology, all human and social functions have become "extreme," literally grown beyond their own ends. "Because of the intervention of numerical, cybernetic, and virtual technologies... [things] can no longer end, and they fall into the abyss of the endless (endless history, endless politics, endless economic crisis). ... Everything can be extended ad infinitum. We can no longer stop the process." Unlike some people out there, I know that M. Baudrillard is a theorist, not a prophet, but I find I can relate to his apocalyptic vision of an apocalypse that will never happen, or that has already happened. The visceral feeling of too-muchness which he describes is with me every day now. Perhaps we do not literally live in a world grown obese through technological outgrowth--but it sure feels like it sometimes. Thanks to the octopoidal spread of fiber optic cables (capable of transmitting about a million times more information than copper wires can), the almost complete interconnection of the world's computers--i.e. the "information society"--is upon us. Besides changing the way we communicate, the way we do business, and the way we spend our days, the sudden ubiquity of directly accessible data sources has upped our daily information flow from a trickle to a rushing torrent. Now, we "knowledge workers" want to be filled with this substance... 
but we quickly become over-full, and are disgusted with ourselves. We soon long to stop receiving faxes, e-mails, and phone calls, yet we don't dare unplug for fear of becoming non-entities. Eventually even the most hotwired among us become aware of a hidden longing to "do something" about "technology." But the imperative "do something" must always be examined for received notions and prejudices, right? What, in fact, is the best way to think and talk about technology? "The hallucinatory utopia of communication technologies... has crept up over the years, disguised in the glad rags of ideologies of progress." --Paul Virilio, "The Shrinking Effect" (1993). To be strictly accurate, I first encountered what I think of as the "technology question" back in 1994, the year I began working at an "alternative press" magazine which had long published marginalized would-be Thoreauvians and '60s hold-outs like Jerry (Four Arguments for the Elimination of Television) Mander, Bill (The End of Nature) McKibben, Neil (Technopoly) Postman, Wendell (The Unsettling of America) Berry, Theodore (The Cult of Information) Roszak, and Kirkpatrick (Rebels Against the Future) Sale. These and other social critics had adopted, by the time I arrived on the scene, the pejorative term "Neo-Luddite"--which used to mean, to conservative and progressive social critics alike, a "hopelessly romantic machinoclast"--as a badge of honor, indicating their principled stance against technological "progress." Before I could even begin to appreciate what these thinkers were saying, however, everything suddenly changed. The World Wide Web, previously the exclusive haunt of scientists and Deadheads, became a mainstream medium. Wired started making publishing history, thanks to its devotion to the personal computing "revolution." 
Early in '95, then, the magazine I worked for ran a cover story on "100 Visionaries Who Could Change Your Life," in which stalwart Neo-Luddites like Berry, Postman, Sale, and Roszak were forced to jostle for elbow room with Wired's "digerati": John Perry Barlow, Danny Hillis, Mitchell Kapor, even Rand and Robyn Miller, the creators of the addictive CD-ROM game Myst. Then, at a public event the magazine sponsored in Manhattan, Sale pulverized a personal computer with a sledgehammer, and Postman barked that "we have transformed information into a form of garbage, and ourselves into garbage collectors." The peculiarly '90s version of the century-old technophile vs. Neo-Luddite debate was off and running. In a Harper's forum shortly after that event, Neo-Luddites Sven (The Gutenberg Elegies) Birkerts and Mark (Cyberspace and the Hi-Tech Assault on Reality) Slouka expressed their fear that the advent of the Internet was a signal of the disappearance of the "autonomous, bounded 'I'." Barlow, and Wired's editor Kevin Kelly, replied that the idea of an 'I' which isn't always already fragmented is naïve. The Neo-Luddites worried that the Internet warps its users' sense of time and space; the technophiles replied that old-fashioned ideas of time and space (not to mention race and class) just hold us back from that ecstatic self-fulfillment which is the birthright of every American. The Neo-Luddites said they hated the thought of knowledge-as-decontextualized-information-allowed-to-recombine-in-previously-unknown-configurations; the technophiles just said, "Well, yeah, of course that's a good thing!" The Neo-Luddites argued that online "community" is just a pathetic, mediated ersatz of human exchange; the technophiles came back with, "Why can't we have both kinds of community?" I'd been rooting for the Neo-Luddites, so I was troubled at how easily the digerati had shut them down. Something, it seemed to me, must be wrong with the very terms of the debate itself.
That's when I discovered "The Question Concerning Technology," a late essay by Martin Heidegger (whose brief attraction to Nazism, it's important to remember, was motivated not out of anti-Semitism, but by his distaste for highly "technologized" mass societies like the U.S.A. and the U.S.S.R.). Writing in 1954, Heidegger seemed to be speaking directly to my own set of questions by insisting, rather mysteriously, that "the essence of technology is by no means anything technological." That is to say, the artifactual component of technology--from steam engines to software--is the most insignificant and innocent part of a complex social and institutional matrix which includes corporations, banks, and public utilities. Technology is, for Heidegger, fundamentally a relationship between people--and to think of it as anything else is only to engender mystification, passivity, and fatalism. The only thing worse than "a stultified compulsion to push on blindly with technology," he concludes, is "to rebel helplessly against it and curse it as the work of the devil." This, it seemed to me, summed up most of what was passing for the debate over communication technologies. An article on Wired in The Baffler right around then ("The Killer App," by Keith White, The Baffler #6) directly addressed this larger, Heideggerian definition of technology. White skewered Wired for being "an aggressive apologist for the new Information Capitalism," the "Great Rationalizer of the new technology." The media-hyped drive to get us all online, White pointed out, is powered not by inevitable historical forces but by entirely evitable business interests. It became clear to me then that, like those 18th-century "Mechanical Societies" which provided those who stood to profit by increased production and the creation of new markets with a pseudo-religious doctrine of technical progress, the digerati cannot be trusted to speak with anything but a forked tongue. 
The idea that new technology will bring universal wealth, enhanced freedom, revitalized politics, satisfying community, and personal fulfillment is a promise we've heard many times before in history. Will "being digital" offer us ecstatic self-fulfillment through the ability to disburden ourselves of outmoded illusions like place, time, and appearance? Of course not. All it will really offer us is new gadgets, what Thoreau called "pretty toys"--nothing but improved means to unimproved ends [see "Disintermediated!" by Chris Fujiwara, this issue; and "The Thin Machine" by David Rothenberg, this issue]. Exposing the machinations of the digerati may be excellent social and cultural criticism, but it ain't philosophy: It doesn't tell us what "technology" is or what we should "do about it." That's why I turned next to social philosopher Lewis Mumford's Technics and Civilization (1934). Written in the grand style of the old-school public intellectual, Mumford's book offers an accessible world history of technology, and uses that over-arching perspective to shed light on the Neo-Luddite vs. technophile debate of his own time (which was focused on workplace automation and the increasing use of the telephone and radio). Most of us, in debates over whether or not technology is "good" or "bad," are referring to artifacts, things, from toaster ovens to corporate intranets; an endlessly evolving mass of tools, instruments, machines; the means and methods used to help people travel, communicate, produce, calculate; the practical implementations of human intelligence; McLuhan's "extensions of man." Mumford was, I believe, one of the first to argue that "technology" is not a thing, but a combination of artifacts ("technologies," even) with activities, beliefs, and attitudes. 
Thus, physical instruments of technology must be viewed as only one aspect of a larger sociotechnical complex--Mumford calls this previously unnamed phenomenon "technics"--which "promises well or ill as the social groups that exploit it promise well or ill." Against the Neo-Luddites of his own time, Mumford notes that automatic machines don't make men "mechanized" or "regimented"; men had been mechanized by the builders of the pyramids, for example, long before mechanical automation happened, and monastic regularity is even stricter than the factory time-clock. What is new, Mumford argued, is not mechanization and regimentation but the fact that these functions have come to dominate every aspect of our existence, the fact that we Westerners have adapted our whole mode of life to the relentless pace and seemingly infinite capacities of the automatic machine. To Mumford, automatic machines are the result, not the cause, of our inner capitulation to... it would be left to sociologist Jacques Ellul, writing a generation later (La Technique, 1954), to finish this thought: technique. By this term, Ellul means any complex of standardized means, any deliberate application of rationalized behavior whose goal is attaining a predetermined result. The automatic machine is certainly the first and most obvious example of technique, but it's not the origin of what Ellul calls the "technical problem." Technique in its proper place is fine, writes Ellul; the problem is that "technique has taken over all of man's activities, not just his productive activity." "Technicians" (what we now call "technocrats") control every sphere of human activity; political economists, for example, whose mission it is to question the morality of various economic activities, have been everywhere supplanted by economists who just figure out how to make things work. 
The "end of ideology" in politics has come to mean the end of ideals, and the successful politician today is one who simply gets services delivered efficiently [see "Whatever Works, Sucks" by Joshua Glenn, this issue]. In a civilization dominated by technique, means are continually "improved" while ends go unexamined: "Technical Man," laments Ellul, is fascinated by results and results alone. Neo-Marxists would come to call the reign of technique "instrumental rationality," the blind pursuit of means to further means, with ends forgotten. Technology theorist Rosalind Williams, writing in the journal Social Research (Fall, 1997), summarizes this criticism: "There is a zeal that lets nothing stand in the way of ever-greater efficiency in the production of more and more goods, and all this for the sake of ever-greater profits, and in total disregard of the costs to workers or to nature, while all higher purposes recede, dwarfed by the technological process." Ultimately, ends are transformed into means and means into ends, and everything and everyone is transformed into an efficient machine [see "Time for Teletubbies!" by Greg Rowland, this issue]. Having gotten "technics" and "technologies" and "technique" straightened out, it finally became clear to me that any debate about technology has to begin not with discussion about the various technologies in our lives, but with a discussion about our ideals. Although people would much rather debate the "effects" of new technologies than disagree publicly about the nature of the Good Life, this is precisely what needs to happen. How do we want to live? Once we've answered this, we can address what Mumford calls the "real question": How far does this or that "technology" further the ideal ends of life? 
If our life-values include material conquest, wealth, and power, then Ellulian "technique" is all good; wealth and power are the by-products--which accrue to someone, though probably not you, dear reader--of the process by which end-free means "improve" themselves. But if our life-values revolve around, say, culture and self-expression, then technique must be balanced with spontaneous and intuitive action, following which (says Mumford) "the machine... will fall back into its proper place: our servant, not our tyrant." This, by the way, was the conclusion independently arrived at, around the same time Mumford was writing, by this issue's Hermenaut: Simone Weil [see "Hermenaut of the Month"]. Which brings us to the anorexia part of this issue. "Disgust in all its forms is one of the most precious trials sent to man as a ladder by which to rise. I have a very large share of this favor."--Simone Weil, Gravity and Grace Because of her lifelong obsession with purity, Weil has been described, by contemporaries like T.S. Eliot, as a kind of saint. Although she starved herself to death at the age of 34, Weil had none of the primary symptoms of clinical anorexia--she didn't weigh herself excessively, hoard food, avoid eating in public, or appear to suffer from any significant disturbance in her body-shape perception [see "Interview with an Anorexic" by Lisa Carver, this issue]. It seems to me, then, that Weil was what historian Rudolph Bell calls a "holy anorexic," someone whose refusal to eat has nothing to do with body shape and everything to do with purity. Like Catherine of Siena, whom Bell analyzes, Weil practiced various austerities: She rejected sexuality, wore rough clothes, slept on hard surfaces, and restricted her diet whenever she could to bread, water, and raw vegetables [see "Convent Erotica" by Chris Fujiwara, this issue]. 
In this, Weil may have been a victim of what feminist literary critic Leslie Heywood calls the "anorexic logic" of the Western philosophical, religious, and literary tradition: To Weil, her body may have seemed the source of worldly corruption, and the antithesis of philosophical detachment [see "Confessions of an Anorexic Wannabe" by Michelle Chihara, this issue]. The only thing more fascinating to Weil than the symbolism of eating was the problem of automatic machinery in the workplace, and technological "progress" in general. I hope it doesn't seem callous to those who actually suffer from this disorder to suggest that anorexia is a useful concept for thinking, not about technology itself, but about how we think about and react to the technologies in our daily lives. Anorexics suffer from feelings of ineffectiveness (think of how you feel when your browser crashes again); from a strong need to control their environment coupled with limited social spontaneity (unless you count Multi-User Dungeons or whatever as a form of social spontaneity); and above all from a feeling that their life is not theirs to control--hello, Neo-Luddites! But, of course, most of our lives are out of control. We enjoy the freedom to be, do, and have almost anything we want, but we lack "freedom from": freedom from being advertised to incessantly [see "Letter from London" by Matthew De Abaitua, this issue], freedom from direct mail solicitations and telephone sales calls, freedom from too many TV channels and breakfast cereals, freedom from intimate knowledge of the President's sex life, freedom from distraction and overchoice. According to Hilde (The Golden Cage) Bruch, the pioneer authority on anorexia, "Anorexics struggle against feeling enslaved, exploited, and not permitted to lead a life of their own. They would rather starve than continue a life of accommodation. 
In this blind search for a sense of identity and selfhood they will not accept anything that their parents, or the world around them, has to offer... The main theme is a struggle for control, for a sense of identity, competence, and effectiveness." In other words anorexics, typically privileged young white women who enjoy all the "freedom to" in the world, may long for freedom from freedom itself [see "Fatty Fiction" by Lynn Peril, this issue; and "Anorexic Outfitters" by Pauline Wolstencroft, this issue]. This sort of attitude towards what passes for freedom finds its most direct expression in "Industrial Society and Its Future," the manifesto of Unabomber Ted Kaczynski. A true ascetic, who lived in an unheated shack in the woods, Kaczynski wasn't just talking about freedom from distraction--for him, the free market, free press, and every other so-called freedom we enjoy are simply "freedoms that are designed to serve the needs of the social machine more than those of the individual." He writes that "industrial-technological society" constricts every one of our true freedoms ("the power to control the circumstances of one's own life"), leaving us nothing but "the freedom to consume." Identity, competence, effectiveness--these are the Thoreauvian values the Unabomber sought to publicize by killing people. When, for whatever reason, these human impulses are (or just seem to be) thwarted, we seem to arrive at some form of anorexia [see "Fun With Richard & Karen" by John Marr, this issue]. The most recent widespread manifestation of Neo-Luddism is our collective emotion of rage, frustration, or just exhaustion at the thought of all that information we're expected to... well, ingest, nowadays. Theodore Roszak's The Cult of Information, for example, argues that we're being endlessly "force-fed" data, to the point where we begin to compare our minds unfavorably to computers [see "Suffragist City" by Dara Moskowitz, this issue]. 
David Shenk's book Data Smog makes the dieting/info glut connection manifest: Just as rich people paradoxically tend to be thinner than poor people (because they eat better, are more health-conscious, and have the leisure time to worry about their weight), he notes, the more media- and tech-savvy you are, the more likely you are to have the willingness and the tools to be an info-ascetic [see "Thin Code" by Scot Hacker, this issue]. Anorexia is not, then, as the Greek word suggests, a lack of appetite, but rather a distorted and implacable attitude toward eating [see "The Juice on Dick Gregory" by Dan Reines, this issue; and "Extreme Dieting" by Mark Frauenfelder, this issue]: I suggest that most of us have a distorted and implacable attitude toward what we mistakenly call "technology." Neo-Luddites like to argue that technologies are never neutral, that TV inherently controls social and political thought, breaks down family communication, shortens our attention span, and mediates our reality; that computers inherently invade our privacy (by making mega-databases possible, for example) and facilitate social centralization. For Heidegger, modern technologies are not the problem: The will to mastery is. One could say the same thing about anorexics; that their pathetic attempts at self-mastery only end up obliterating the self [see "Half Karen" by A.S. Hamrah, this issue]. We Neo-Luddites want to get technology under control, we're super-privileged and super-disgusted--we're techno-anorexics: Remember, you heard it here first. In search of a non-distorted attitude toward the new information and communication technologies in my life, technologies which were indeed making me as unhappy as they were helping me to be more efficient, in the spring of '95 I attended something called The Second Neo-Luddite Congress.
At this event, which took place at a Quaker meeting house in Barnesville, Ohio, I was overjoyed to discover an ally in my own attempt to address the Technology Question in a manner that was more Heidegger and Mumford, less Birkerts and Barlow. Scott Savage, the "plain" Quaker who'd organized the Congress, had convened representatives of the most laughably technophobic subcultures--survivalists, self-helpers, back-to-the-landers, rawfoodists, tree-spikers, deep ecologists, pagan bioregionalists--in order to teach them to stop asking "Technology: Changing The Way We Live For The Worse, Or For The Better?" Instead, he (and other plain folk present) suggested, let's ask "Technology how?" "Technology why?" "Technology when?" "Technology with what history, and to what end?" I have no idea if the ideas expressed at this event helped anybody present other than myself begin to see a solution to the problem of new technologies, but the whole scene, silly as it was in many ways, affected me deeply. Like anorexics, who fixate on the means of bodily denial ("How many peas per serving is too many?") without ever allowing themselves to consider their desired end ("Just how thin is thin enough?"), we don't ask ourselves what we want our lives to be like before we slap the "Kill Your TV" sticker on the Bronco. Until we can agree upon standards by which to judge new technologies, Neo-Luddites and technophiles can go around and around arguing the relative merits and demerits of cyberspace and "real life" without ever getting anywhere. The Amish belief that technology is only bad when it intrudes upon one's home life (which leads to the spectacle of rollerblading patriarchs, and a pay phone on every corner, in their communities), may not work for all of us, of course. All I'm trying to suggest is that the "technology question" is not just about machines and "their effect"--whether positive or negative--on mankind: That's a dead end. What to do? 
Heidegger's essay on technology offers one way out. Writing after the war, the previously activist philosopher had come to believe that human willpower cannot oppose the technological "enframing" of the world, in which everything and everyone is mobilized for the purpose of greater efficiency--precisely the state of affairs the Unabomber's manifesto describes. Instead of suggesting that "it would be better to dump the whole stinking system," as Kaczynski does, Heidegger proposes that we practice a non-technological way of encountering things; that instead of perceiving the world in terms of means and ends, we keep sight of the "thereness" of reality, the mere given fact of the world. Heidegger replaces resoluteness of will with Gelassenheit--"releasement": the gentle coaxing from things of their own best potentiality. For anorexics, Gelassenheit may mean learning to live with the reality of one's body, allowing oneself to desire without being ruled by desire--perhaps even to "desire without an object," as Weil puts it. For us techno-anorexics, I think Gelassenheit means resisting the instinct to reject or embrace new technologies reflexively. It means questioning the motives of those who'd convince us that we can't get along without the latest gadget; but it also means questioning our own use of those technologies we take for granted. ----------- Anorexic Outfitters http://www.hermenaut.com/a41.shtml One day, in the fall of 1995, I decided it was time to do something about Urban Outfitters. I was sick of hearing my friends complain about getting paid slave wages in exchange for discounts on crappy clothes and the privilege of listening to indie rock at top volume all day.
I was especially disgusted by their stories of girls trying on baby-doll dresses and begging their boyfriends to tell them they didn't look like chubsters; or about the occasional overweight girl brave enough to pick through the techno-enhanced labyrinth of skinny-girl clothes, in order to squeeze into something that made her look like an overgrown baby. Worst of all, I was horrified by the fact that so many people my age were buying into U.O.'s brand of mass-produced pseudo-nostalgia: After all, why scour through dirty Salvation Army bins for bellbottoms, barrettes, and lava lamps when you can pick up the same stuff sanitized and neatly presented on racks at a location convenient to your dorm? So I designed a hate poster, printed up a few thousand of them, and proceeded to plaster them on the windows and walls of Urban Outfitters everywhere. My [then-]boyfriend was touring with his band and I went along for the ride. I hit stores in Boston, New York, Philadelphia, Chicago, San Francisco, Austin, and Seattle. In Boston and New York the morning after the posters went up, I stood outside of the store in costume (a polka-dotted suit and a very long red wig) as my alter-ego Miss Kitty Bates, handing out little postcards I'd made. The postcards depicted a slouching waif with a question mark bubble for a head, and read: "EXHIBIT A: HUNGRY WAIF," with my name and P.O. Box at the bottom. People would crowd around, hoping for free passes to a club or fab party or something, and continue on into the store. Often they'd read the card before they got through the door and look back at me with hurt, but also with curiosity, in their eyes. When I included my address on the posters and cards I wasn't expecting a lot of feedback, because like a lot of people, I had grown cynical about people's lack of interest in anything with artistic or social content. 
I was really touched, then, by the number of responses I received from strangers who had just seen my poster on the street, and who took the time to make a note of my address and write me. I was also amused by these responses, though, because they ranged from the sublime to the ridiculous. (One guy wrote an angst-filled poem for me, entitled "Snapshots of a Void Screen," in which he lamented that "we linger in the stillness, feeling a fading desire for life...") Some people wrote admitting to not really understanding my beef with U.O. but wanting to know more. One person, who ended his response with an offer "to help in my campaign," wrote in asking, "Is this the kind of company that is insensitive, cold, and only interested in making money at the expense of others?" Sure. The "conformity" part of my campaign seemed lost on most people: My new pen-pals wanted to congratulate me for drawing attention to U.O.'s exploitation of the self-conscious teen girl market. One Go-Girl-brand feminist named "batgrrrl" wrote "Good luck deconstructing whatever paradigm you're working on!" Spoken like a true Harvard Yard cheerleader. Some of these folks, though, were looking for a sympathetic ear into which to pour their own personal anorexia sob stories. I came to realize that there were a lot more people out there who had a personal history of anorexia than I had ever imagined. While this made me sad, having to respond to a stranger who obviously needed help made me extremely uncomfortable. In fact, anorexia is something that I have never experienced first-hand--the thought "I'm sooo fat" never ran through my head as a child or as a gawky brace-faced teenager. Although I have an unwavering love for greasy, fattening foods, particularly hot dogs and hamburgers, I am 5'8"/113 pounds. 
Maybe that's why I got a lot of comments from people who saw me in action and questioned what a skinny girl like myself was doing making any comment about Urban Outfitters' perpetuation of the waif aesthetic. I could only respond by saying that I don't starve myself, and that my point in creating those posters was that if weight is an issue for women in their awkward teenage years, they certainly don't need additional pressure from some second-rate, overpriced, false mecca of "urban" style. One UMass grad student who wanted to interview me for a thesis paper she was doing about "Women Taking Up Space" wrote asking, "As a woman do you feel comfortable 'taking up space'?" Umm, yes. She then wanted me to describe some examples of times when I felt I had to "keep quiet, keep it down, pull it in, or behave myself because of my gender." And to finish these sentences: "Power is... ," "I take up space because... ," and "I would be happier if... ." I eventually wrote her back telling her that I was not the right person for her project, that I had never really considered myself as taking up space, and that I would never start a sentence with "I take up space because... ." In an article written about me and my posters in the Boston Phoenix, the author (Geoff Edgers) talked to the manager of Urban Outfitters' Boston location and to Sue Otto, the company's creative director. The manager of U.O. Boston said that when she first saw the posters she assumed that it was the work of a disgruntled employee. (Hmmm, why would there be disgruntled employees? Because they get paid $5 an hour? Or because they have to empty their pockets to prove that they are not stealing anything every time they leave the store for any reason? Or maybe because they're sick of being coerced into ratting on their coworkers daily in required written reports for their managers?) Otto, who ended her response with, "Working here has been my whole life," took my posters a bit more personally.
To rebut my claim that the company where she's worked for 13 years caters to women who wish they had the body of a 12-year-old, she volunteered her own physical dimensions, which happened to be 5'3"/165 pounds. Talk about loyalty! I'm aware that a lot of the things that I have griped about with Urban Outfitters can be said of a million other companies. U.O. doesn't rip the money out of teenagers' hands, and thrift-store fashion was bound to trickle down from the starving artist types to suburban teens. But in a perfect world, kids everywhere would realize that they are being duped by marketing masterminds. These kids would then burn down all the Urban Outfitters polluting our cities, finally freeing themselves from the shackles of their chain wallets and the confines of their baby-doll T-shirts. ----------- Interview With An Anorexic http://www.hermenaut.com/a44.shtml I have very little sympathy for someone whose disease involves poor self-image. The idea that perfect control over your body is possible is so WASPy, as is the idea that other people actually spend their time caring whether you reach it or not. Much more attractive to me are people whose problems come from seeking out all that is invisible like that fad I read about in Vogue where young people cut themselves every day. The silent suffering and self-containedness of anorexia, in the grand scheme of life, is really worthless. Internalized drama is everything pathetic about drama with none of the majesty. Anorexics never kick out the jams. At least "diseases" like gambling or alcoholism, or even spousal abuse, involve interaction--a tipping-back-and-forth balance of guilt and fury and love and hatred, a shouting match with your girlfriend when you arrive home sans grandma's earrings. At least alcoholics have camaraderie--anorexics are eternally alone, single piranhas circling. (An anorexic sees another anorexic, she thinks, "Damn! 
Another skinny bitch on my turf!") And in the end, those anorexics will force you to take care of them while looking like they never wanted help, like they never hated you or wanted you miserable: "Oh no, it was all inward-directed violence." My foot! I fucking hate passive-aggressive behavior. While my sympathy is small, my jealousy is big. Anorexics always seem to have more thoughts than I do. All those intricacies and picayune habits. My body is just something that walks me to the store--it's no battlefield. Where do they come up with these ideas? I just want to write, have sex, fight with my boyfriend, hire someone to clean my house, figure out how to be funny, and go to Japan someday. Keeping my consumption of peas to 7 per day, while not letting the fork touch my lips, just doesn't enter into the picture. And I just know the swanky homosexuals who disapprove of everything under the sun think those spotty-haired scrawny girls are more worth talking to than I am. Maybe it's just the word "anorexia" I love--spread out like a fishing net over the stars, filaments so thin they're barely visible. The girl herself is a constellation of fine, blow-dried hair, shiny clothes, peeling nails, and jutting bones--you have to connect the dots because there's nothing in between. My best friend for the last 13 years has been anorexic to varying degrees: She is driven by egotism, perfectionism, and what people I never want to meet would call "issues of control." She ate only chips, iced tea, and jalapeño peppers for her main meal every day for a solid year. She'd go to three different stores to buy these items, as if some poor clerk might be keeping track, thinking "If she eats chips, she needn't eat jalapeño peppers as well." If she ever bought anything else, say a cup of soup, she'd talk about it with the person at the cash register (and anyone else around), pretending it was for someone else: "I guess this is the kind he wanted, I don't know..." 
Eventually, she stopped talking to clerks altogether. She'd pass them a note that said, "I am a deaf-mute. I am picking up some turkey soup for my friend. How much, please?" She was always getting in car accidents, and every single month she thought she was pregnant. The cool people are always selfish and dramatic. Unlike, say, depressives, who sink down into the same old patterns of self-destructive behavior and never get out of them, anorexics have a constantly expanding galaxy of ways to have problems. They lie. They black out. They hemorrhage. All the anorexics I've known steal boyfriends. Things always "happen" to them: People molest them when they sleep, ex-boyfriends steal their gas cards, or things go wrong when they try to kill themselves and they end up stuck in the loony bin for the weekend. They have mortal enemies. People put curses on them! It's an extravagant, silent life, the life of the anorexic. A guy recently told me his sister had been having a telepathic relationship with Martin Gore of Depeche Mode for the last 8 months--and had even gotten pregnant from it! "Wait!" I said, "Does your sister have an eating disorder?" "Yes," he said, "She's a fruitopian. She hasn't eaten anything but fruit for years." I got anorexic/bulimics all over my life. One of them is my son's baby-sitter, Chance Provencal--so I interviewed her. Throughout the interview, Chance peeled and ripped up an onion that was sitting on my table. You can hear the low crackle of the mutilation throughout the entire tape. Lisa: When did it start? Chance Provencal: When I was 18. I never thought about how much I ate or how much I weighed until I had this one boyfriend. I was 120 pounds and he was like, "Oh, I like my girls to be skinny." Lisa: "My girls." Sounds like a pimp. Chance: The thing is, he was fat! He was! "I like my girls to be 100, 105 pounds." No matter how much I tried to cut down on my eating or exercise more, I couldn't lose any weight. 
So I just kind of like ate what I wanted and then got rid of it. Lisa: You puked. Chance: Oh yeah, I puked and I starved, alternately. I did a lot more puking than I did starving. It was easier to just puke it out. Lisa: How often did you throw up? Chance: Sometimes just a couple times a week, sometimes a couple times a day. It depends on how much I ate that day. Because there are days where I won't eat at all. I found that if you wake up in the morning and you don't eat, you can go longer without eating. But once I eat, I just have to eat and eat and eat. Lisa: How come after you dumped that boyfriend you still had the eating problems? Chance: Because after that it was an obsession to be skinny. All my friends down in Maryland are really skinny--between 90 and 105 pounds--and I felt fat. Lisa: What if you got a bunch of fat friends, would that help? Chance: Probably not, because then I'd be mean and want to be even skinnier. I have fat friends now and I continue to be skinny because they all say, "Look how skinny you are, look how tiny you are," and I like it. Lisa: What's the lowest weight you ever got to? Chance: 85 pounds. I didn't get lower because I was taken to the hospital pretty early--[starvation] was harder on me than most people because of my diabetes. I got down to 85 with painkillers. I think painkillers are the best diet drug. You really don't get hungry! You just lay in bed all the time and lose the weight. But when you're not tired, they make you just jump up and run around and not think about anything--just keep going. Lisa: Were you able to hold down a job at this time? Chance: Not then, because I had an ovarian cyst, so I was out of work because of that. I never wanted to count calories. I'd just eat a piece of lettuce, drink water. I didn't want to do this whole thing of eat one M&M, exercise for three hours. I was never that meticulous about it. I'm too lazy. Lisa: Did you go to the hospital by choice? Chance: No. 
I couldn't really fight it by that point because I was just too out of it. I was too weak and half in and out. My boyfriend at the time took me because I was bordering on unconsciousness. My roommates called him up and said, "Her heartbeat's really low, she's not responding to much, she's dehydrated." So he came and picked me up and took me to the emergency room. Lisa: Is this the one who likes his girls skinny? Chance: No, a different one. This boyfriend never said I was too skinny. Lisa: How did people treat you while you were recovering in the hospital? Chance: Some were really nice and sympathetic, some were mean and heartless. They'd say, "Well you got yourself into this and you ought to know better and I don't feel sorry for you!" Other people would say, "Oh, you poor little thing." The counselors were nice, but a lot of the nurses were mean--the fat nurses. But you get that no matter what you're in the hospital for--some nice nurses, some mean. Lisa: How much weight did they make you gain before you could leave? Chance: 10 or 15 pounds. Lisa: How did you gain the weight? Did you get an I.V. drip? Chance: Yeah. They gave me the saline solution, then glucose. They were talking about putting that tube in my nose. Lisa: Why? Were you afraid to eat? Chance: No, they just felt it was so necessary at that time, but I was like, "No, no, no, I'll eat!" And then I had to get monitored every time I had to go to the bathroom. I had to call a nurse and leave the door open part-way so they could make sure I was going to the bathroom and not doing other stuff. Lisa: How long did it take you to gain 10 pounds? Chance: It took a couple months, because your stomach shrinks, so what was a normal meal to me would be like a snack to someone else. Even now, when I don't eat for a while, my stomach shrinks, and then I'll eat just a couple bites and I'm full. I can't eat another bite, and that's fine for me! 
After that I went to my boyfriend's house and he was fat and his whole family was fat, and they took care of me. They made sure I got fed. And he worked at Taco Bell so I got to eat tacos all the time. They were trying to force me to eat, and then when I'd feel sick they'd tell me, "Oh you're fine," and like force-feed me, and then I'd really be sick, and throw up. Lisa: How do you feel when you see a fat person? Chance: I don't know, I think a lot of fat people are beautiful. Sometimes I want to be fat, have a little extra meat on me. But I can't bring myself to actually do it. But sometimes I get mad at fat people because I think they're gross and disgusting, other times I think they're fine, I think they're beautiful. My good friend Cindy, she's overweight, and sometimes I think, "Fat pig!" and then other times I think, "Oh, she's fine." Lisa: How do you feel when you see a skinny person? Chance: When I see someone skinnier than me, I get mad. This one lady, she was so skinny, I kept looking at her and thinking, "There must be something really wrong with her. She must have cancer. She is impossibly skinny." Lisa: Was she elegant or grotesque? Chance: She was grotesque. It was really nasty. Lisa: Did you realize you looked disgusting when you were that skinny? Chance: No. Because you have your own image of what you look like. There's this mirror over at Rick's house that I call the Skinny Mirror because I looked in it one day and I looked really thin. Everybody said, "You're just saying that." But then this guy's girlfriend looked in it and she said the same thing, so we call it the Skinny Mirror now. So whenever I feel fat I go look in the Skinny Mirror. There's days when I look at myself and I think, "Wow, I look great, I can live with this." And there's other days when I look at myself in the mirror and start scrutinizing every inch of my body: "My butt's fat. My legs are fat. My gut is fat." 
But I don't ever want to go through hospitalization again, I'll never let myself get to that point again. I have a messed-up esophagus now from making myself throw up so much. Sometimes I'll just chew on a pen cap now, and I'll gag. Because I used to stick my toothbrush down there, my fingers, anything. And now when I throw up it really hurts and burns, it feels like my whole chest is gonna cave in. Lisa: When you were so skinny, what did your skin and hair look like? Chance: I did get that light layer of hair that you grow. It's baby-blonde color. That extra layer that keeps you warm, because you get so skinny your body can't keep itself warm. Lisa: Was it all over your body and face? Chance: Not on my face so much. It was mostly on my midsection, on my back and front. Lisa: How did your boyfriend feel about you having chest hair? Chance: He was kind of disgusted by it, but he really cared for me, so he didn't let it really show. There was no physical relationship at that point, because I was too weak and he was too afraid he would break me. He was really delicate with me all the time. If he held my hand, his hand would completely wrap around mine. Lisa: Do you still get urges to not eat and to throw up? Chance: I still do. There are days where I won't eat at all. Just because I'm afraid I'll get to the point where I'm too fat again and then it will start all over. Other days I'll eat like a pig and then I'll feel awful for weeks after. And I'll be like, "Well, I can't eat for a couple days because I ate a lot yesterday." I haven't thrown up lately. I was doing it a couple months ago, because I'd eat so much I'd have to. And then you know, when you throw up you get dehydrated, so I'd drink like a whole gallon of water and then I'd have to throw that up. I used to throw up every night still last year. Lisa: Do you take painkillers now? Chance: I did the other day. But a lot of doctors are cracking down on what they give you. 
I had a kidney infection two weeks ago and they wouldn't even give me painkillers for that. Which is good, because I was addicted to them really bad. When I stopped taking them, I went through withdrawal, the shakes. If I even take one or two now, I'll get addicted almost automatically, so it's good they don't give them out as easily anymore. You don't think about a lot when you have painkillers. Painkillers are deadly, not just for the obvious reasons, but because of the way they make you think and act. I'm proud that I stopped, the painkillers and the eating disorder. I still have my days. They say it's never really cured, you always have it in the back of your mind. It's just a matter of controlling it. I don't make myself throw up anymore. When I look back at what I used to do, it really kind of disgusts me. From checker at panix.com Sun Jan 8 20:01:53 2006 From: checker at panix.com (Premise Checker) Date: Sun, 8 Jan 2006 15:01:53 -0500 (EST) Subject: [Paleopsych] Daniel Pipes: You Need Beethoven to Modernize Message-ID: Daniel Pipes: You Need Beethoven to Modernize http://www.danielpipes.org/article/297 Middle East Quarterly September 1998 Is it possible to modernize without Westernizing? This is the dream of despots around the world. Leaders as diverse as Mao on the Left and Khomeini on the Right seek a high-growth economy and a powerful military -- without the pesky distractions of democracy, the rule of law, and the whole notion of the pursuit of happiness. They welcome American medical and military technology but reject its political philosophy or popular culture. Technology shorn of cultural baggage is their ideal. Sad for them, fully reaping the benefits of Western creativity requires an immersion into the Western culture that produced it. Modernity does not exist by itself, but is inextricably attached to its makers. 
High rates of economic growth depend not just on the right tax laws, but on a population versed in the basics of punctuality, the work ethic, and delayed gratification. The flight team for an advanced jet bomber cannot be plucked out of a village but needs to be steeped in an entire worldview. Political stability requires a sense of responsibility that only civil society can inculcate. And so forth. Western music proves this point with special clarity, precisely because it is so irrelevant to modernization. Playing the Kreutzer Sonata adds nothing to one's GDP; enjoying an operetta does not enhance one's force projection. And yet, to be fully modern means mastering Western music; competence at Western music, in fact, closely parallels a country's wealth and power, as the experiences of two civilizations, Muslim and Japanese, show. Muslim reluctance to accept Western music foreshadows a general difficulty with modernity; Japanese mastery of every style from classical to jazz helps explain everything from a strong yen to institutional stability. Muslims Among Muslims, choice of music represents deep issues of identity. Secularist Muslims tend to welcome European and American music, seeing it as a badge of liberation and culture. Ziya Gökalp, the leading theorist of Turkish secular nationalism, wrote in the early 1920s that Turks face three kinds of music today: Eastern music, Western music, and folk music. Which one of them belongs to our nation? We saw that Eastern music is both deathly and non-national. Folk music is our national culture, Western music is the music of our new civilization. Neither of the latter can be foreign to us. More recently, as Turkish secularists find themselves under siege, sold-out crowds turn out for concerts featuring Western classical music. In the words of a reporter, these have "become a symbolic rallying point for defenders of Turkish secularism." 
In an event rich with symbolism, the Turkish embassy in Tehran gave a two-hour concert of Western classical music in late December 1997, in tribute to the forthcoming (Christian) new year. Few cultural occasions could quite so sharply delineate the contrasting visions of Atatürk and Khomeini. In contrast, fundamentalist Muslims, who nurse an abiding suspicion of the West, worry that its music has an insidious effect on Muslims. When Necmettin Erbakan was prime minister of Turkey in 1996-97, he cut back on dance ensembles, symphony orchestras, and other Western-style organizations. Instead, he fought to increase funding for groups upholding traditional musical forms. For fundamentalists, merely listening to Western music suggests disloyalty to Islam. A speaker at a fundamentalist rally in Istanbul flattered his audience by telling them, "This is the real Turkey. This is not the aimless crowd that goes out to see [sic] the Ninth Symphony." An Iranian newspaper published a poem that characterizes the opposite of the downtrodden, faithful Iranians killed by Iraqi troops as an audience of classical music buffs -- women with "pushed-back scarves" (i.e., who resist Islamic modesty) and men with "protruding bellies" (i.e., who profit from the black market). The same poem, titled "For Whom do the Violin Bows Move?" argues that concerts of Mozart and Beethoven promote the "worm of monarchic culture." Anyone who listens to Eine Kleine Nachtmusik, in other words, must be a traitor to the Islamic republic. Or to Islam itself: naming the very same composers, a Tunisian claims that "the treason of an Arab . . . begins when he enjoys listening to Mozart or Beethoven." Of course, if eighteenth-century composers so rile fundamentalist Muslims, what do they think of rock and rap music? American popular music epitomizes the values that Muslims find most reprehensible about Western culture -- the celebration of individualism, youth, hedonism, and unregulated sexuality. 
The Pakistani fundamentalist group Hizbullah has singled out Michael Jackson and Madonna as cultural "terrorists" who aspire to destroy Islamic civilization. The group's spokesman explains this fear: Michael Jackson and Madonna are the torchbearers of American society, their cultural and social values . . . that are destroying humanity. They are ruining the lives of thousands of Muslims and leading them to destruction, away from their religion, ethics and morality. Terrorists are not just those who set off bombs. They are also those who hurt others' feelings. Hizbullah finished with a call for the two Americans to be brought to trial in Pakistan. The Hizbullah statement points to the reasons why fundamentalists mistrust Western music: it demoralizes Muslims and distracts them from the serious requirements of their faith. Ahmad al-Qattan, a Palestinian preacher living in Kuwait, finds that Western music "involves pleasure and ecstasy, similar to drugs" and elaborates: I ask a lot of people, "When you listen to Michael Jackson, or Beethoven, or Mozart, what do you feel?" They tell me: "Oh, I feel my heart torn from the inside." I say, "To that extent?" They tell me: "Yes, by God, to that extent. I feel that all of a sudden I am flying. One moment I am crying, the next moment I am laughing, then dancing, then I am committing suicide." Our God, we seek refuge with You from singing and its evils. Ayatollah Khomeini had similar views, as he explained to an Italian journalist: Khomeini: Music dulls the mind, because it involves pleasure and ecstasy, similar to drugs. Your music I mean. Usually your music has not exalted the spirit, it puts it to sleep. And it destructs [sic] our youth who become poisoned by it, and then they no longer care about their country. Oriana Fallaci: Even the music of Bach, Beethoven, Verdi? Khomeini: I do not know these names. 
But then, unexpectedly perhaps, Khomeini softens his condemnation: "If their music does not dull the mind, they will not be prohibited. Some of your music is permitted. For example, marches and hymns for marching. . . . Yes, but your marches are permitted." Others join Khomeini in making an exception for marching music. Qattan, for example, distinguishes between degenerate and useful music: "No Mozart and no Michael Jackson, no singing and no instruments, only war drums." For fundamentalist Muslims, the ecstasy that Western music can create is allowable only if it helps march youth to their deaths. (As an aside, it is interesting to note that marches are the only Western music significantly influenced by the Middle East: Gypsies introduced Turkish -- or "Janissary" -- music to Europe in the eighteenth century. The Austrian army appears to have been the first to adopt this genre. It involved exotic new uniforms and such new percussion instruments as tambourines, triangles, cymbals, bass drums, and -- suggestively -- crescents. Accented grace notes added to the exoticism. Soon after, these elements entered the orchestra too; Mozart first used Turkish-style music in a sketch dating from 1772 and "Turkish" effects are especially prominent in his Abduction from the Seraglio as well as the finale to Beethoven's Ninth Symphony. In a sense, then, with marching music the Middle East is letting back in its own innovation.) In contrast, the Turkish authorities, marching to a different drummer as is so often the case, rely on classical music to quiet their forces. The so-called "Steel Force" units, the baton-swinging riot police notorious for their tough tactics against street protesters, are forced to listen to Mozart and Beethoven in their buses on the way to operations as a way to calm them down. 
Other fundamentalists have divergent ideas on what music is permissible, a debate symbolized by the King Fahd Cultural Center, a magnificent concert hall seating 3,000 at the perimeter of Riyadh, Saudi Arabia. Shortly before his death in 1975, King Faysal approved the building of this center as part of the recreational facilities to turn Riyadh, his capital, into a handsome modern city. Completed in 1989 at a cost of $140 million, it boasts such lavish touches as the finest marble and precious woods, not to speak of a state-of-the-art laser lighting system, and a hydraulic stage. But the hall has never staged an event. A foreign diplomat who managed to visit the mothballed facility found that a full-time staff of 180 has for almost a decade maintained the building and its gardens in mint condition. This has meant not just tending the flower beds but air-conditioning the facility all year round so that the delicate woods on the interior do not deteriorate. Why is the cultural center not used? Because it offends the strict Islamic sensibilities prevalent in Saudi Arabia. According to one report, on hearing about Western-style music played by mixed casts (meaning men and women) to mixed audiences, the country's religious leaders "went berserk." The saga of Riyadh's concert hall neatly illustrates the ongoing debate about Western music among fundamentalist Muslims. King Faysal, no slouch in his Islamic faith, thought it a permissible pleasure, but the Saudi religious authorities deemed otherwise. Other fundamentalists, too, disagree on specifics. The author of an advice column in a Los Angeles Muslim weekly concedes that "Music with soft and good tunes, and melodious songs with pure words and concepts are acceptable in Islam," provided that this does not lead to "the mixing of men and women." In contrast, `Ali Hoseyni Khamene'i, Iran's spiritual guide, deems "the promotion of music is . . . not compatible with the goals of the Islamic system." 
Accordingly, he rejects the teaching of music to children and prohibits "any swing music that is for debauchery," even when played in separate-sex parties. Egypt's leading television preacher, Sheikh Muhammad ash-Sha`rawi, went further and condemned Muslims who fall asleep to Western classical music rather than a recording of Qur'anic recital. Inspired by his words, fundamentalist hotheads in Upper Egypt stormed a concert and broke musical instruments, leading to their arrest. With such attitudes prevalent, it is hardly surprising that Muslim practitioners of Western music have achieved little. As the historian Bernard Lewis notes, "Though some talented composers and performers from Muslim countries, especially from Turkey, have been very successful in the Western world, the response to their kind of music at home is still relatively slight." They enjoy neither renown nor influence outside of their native countries, and even there remain minor figures. Japan How different is Japan! True, the early reactions to Western music were adverse: on hearing a child in song in Hawaii, Norimasa Muragaki, a member of the very first Japanese embassy to the United States in 1860, compared the sound to "a dog howling late at night." Within a few years, however, Japanese responded to Western music much more favorably, to the point that the music drew some individuals into Western religion. In 1884, Shoichi Toyama argued that "Christianity ought to be adopted for, first, the benefit of progress in music, second, the development of compassion for fellow men and harmonious cooperation, and third, social relations between men and women." Note that he lists music first. Before long, some Japanese discovered that Western music expressed their feelings far better than anything in their own tradition. As he left French soil, the leading writer Nagai Kafu (1879-1959) mused wistfully on the beauty of French culture: No matter how much I wanted to sing Western songs, they were all very difficult. 
Had I, born in Japan, no choice but to sing Japanese songs? Was there a Japanese song that expressed my present sentiment -- a traveler who had immersed himself in love and the arts in France but was now going back to the extreme end of the Orient where only death would follow monotonous life? . . . I felt totally forsaken. I belonged to a nation that had no music to express swelling emotions and agonized feelings. Kafu here describes emotions almost entirely unknown to Muslims. The local musical tradition engages in an intense give and take with Western music. Woodblocks, a traditional Japanese instrument, are a standard of jazz percussion. Traditional Japanese music has influenced many Western composers, and John Cage probably the most directly so. The Suzuki Method, which applies the traditional Japanese techniques of rote training (hiden) to children learning the violin, has won a substantial following in the West. Yamaha sells over 200,000 pianos a year and is the world's largest maker of musical instruments. Conversely, European classical and American popular music have become part of the Japanese scene. Tokyo has nine professional orchestras and three operas, giving it the highest mass of European classical music talent in the world. Seiji Ozawa, music director of the Boston Symphony Orchestra, rates as the most renowned of Japanese conductors. Classical performers with wide reputations include pianists Aki and Yugi Takahasi and percussionist Stomu Yamashita. Though Japanese composers are yet little known outside Japan, their pace of activity is considerable. Toru Takemitsu, who makes a specialty of exploring timbre, texture, and everyday sounds in both European and Japanese media, is perhaps the most renowned internationally. Akira Miyoshi composes classic Western music. Toshi Ichiyanagi, Jo Kondo, Teruyaki Noda, and Yuji Takahashi write in an avant-garde manner. 
Shinichiro Ikebe, Minoru Miki, Makato Moroi, and Katsutoshi Nagasawa write for traditional Japanese instruments. The marimbist Keiko Abe is the best known of classical Japanese musicians and Toshiko Akiyoshi the best known of jazz players. European classical music has shed its foreign quality in Japan, becoming fully indigenous. In this, Japan resembles the United States, another country which has imported nearly all of its classical music. Just as Americans have adapted the music to their own tastes and customs -- playing the 1812 Overture on the 4th of July, for example -- so have the Japanese. Thus does Beethoven's Ninth Symphony serve as the anthem of the Christmas and New Year's season. Not only do the country's leading orchestras play the symphony over and over again during December, but gigantic choruses (numbering up to 10,000 participants) rehearse for months before bellowing out the Ode to Joy in public performances. As for pop music, the Japanese -- like nearly all the world -- idolize American pop stars and grow their own local talent. But more interesting is their intense engagement with jazz. So large is the Japanese jazz market that it affects music produced in the United States. Jazz coffee shops (which play music on state-of-the-art equipment) have proliferated, and Japan hosts numerous international jazz festivals each year. Japanese Swing Journal sells 400,000 copies a month (compared to only 110,000 copies of the best-known American publication, Downbeat) and roughly half of some American jazz albums are bought by Japanese. Indeed, according to one American producer, Michael Cuscuna of Blue Note Records, "Japan almost singlehandedly kept the jazz record business going during the late 1970s. Without the Japanese market, a lot of independent jazz labels probably would have folded, or at least stopped releasing new material." This is too big a market to lose, so American and other artists must increasingly pay attention to Japanese taste. 
As for Japanese creativity, the results here have been modest until now -- composers and musicians do little more than imitate the styles of foreigners -- but the existence of a large and increasingly sophisticated home market offers fertile ground for Japanese musicians to experiment and then to lead. Attempts to combine jazz with traditional Japanese music have begun; these blendings are likely to influence jazz as much as they already have architecture and clothing. It seems safe to predict that the Japanese before long will become a major force in jazz. The Japanese give musically in other ways too. The karaoke machine plays instrumental versions of popular songs and permits a bar patron to accompany the music as though he were an accomplished singer, providing a good time for all. Not only has karaoke become an amusement staple worldwide, but the characteristic Japanese-style bar (with its hostesses, a mama-san, and karaoke microphone) has proliferated in the West. Karaoke machines are sold in Sears Roebuck stores and have won a large and cheerful, if slightly tipsy, following. Conclusion Muslim and Japanese responses to Western music symbolize their larger encounters with Western civilization. Muslims have historically approached the West warily, fearful of losing their identity. This prevents them from immersing themselves in Western learning or gaining the needed skills in technology and business. They remain permanently in arrears, coping with one wave of Western influence after another, barely keeping up and exerting virtually no influence over the West. The Japanese do things very differently. First, they throw themselves whole-heartedly into the new subject, not fearing the loss of their own identity. Second, they acquire skills, matching and even beating the West at its own game; what the Tokyo orchestras are to music, Toyota and Nissan are to cars. 
Third, Japanese evolve original customs of their own, either based in their traditions (karaoke) or an amalgam of cultures (Beethoven's Ninth for New Year's). Finally, they develop techniques that Westerners adopt; the Suzuki Method in music parallels the just-in-time system in car manufacturing. They have absorbed Western civilization in its entirety, discarded what does not interest them, taken what does, and mastered it. Thus does the response to Western music exemplify the whole of a civilization's experience with modernity. Its lack of utility makes it all the more useful as an indicator of achievement. Why this connection? Because, as Lewis observes, "Music, like science, is part of the inner citadel of Western culture, one of the final secrets to which the newcomer must penetrate." Music represents the challenge of modernity: competence in this arena implies an ability to deal with whatever else the West might serve up. Muslim resistance to accepting music from the West represents its larger unwillingness, whereas the Japanese have truly entered the inner citadel. In short, whoever would flourish must play Beethoven as well as Westerners do. From shovland at mindspring.com Sun Jan 8 23:35:24 2006 From: shovland at mindspring.com (Steve Hovland) Date: Sun, 8 Jan 2006 15:35:24 -0800 Subject: [Paleopsych] Visiting another dimension? Message-ID: Back when I was doing shamanic drumming and journeying, we would often go into the lower worlds (dimensions?) to get help. One time I decided to go into an upper dimension. There I was met by a creature with the body of a man and the head of a fish. He was the gate-keeper, and he asked me what I wanted to do there, because it was not for tourists. Apparently my answer satisfied him and he let me in. I walked down a causeway and came to a large gallery full of blank canvases. I realized these were paintings that I was supposed to do. When I came out of the gallery I was in a garden with many plots of medicinal plants. 
At the back of the garden there was an old man dispensing wisdom. Behind him was a dark tangled forest. I plunged in. Steve Hovland From ross.buck at uconn.edu Mon Jan 9 15:29:20 2006 From: ross.buck at uconn.edu (Buck, Ross) Date: Mon, 9 Jan 2006 10:29:20 -0500 Subject: [Paleopsych] CHE: In the Lab With the Dalai Lama Message-ID: Herbert Benson in "The Relaxation Response" suggested decades ago that disciplines such as meditation and prayer share the quality of promoting deep relaxation, which has the opposite effects of the fight-or-flight response. That is, they lower autonomic and endocrine arousal, and promote immune system functioning. Cheers, Ross Buck Ross Buck, Ph. D. Professor of Communication Sciences and Psychology Communication Sciences U-1085 University of Connecticut Storrs, CT 06269-1085 860-486-4494 fax 860-486-5422 Ross.buck at uconn.edu http://www.coms.uconn.edu/docs/people/faculty/rbuck/index.htm -----Original Message----- From: paleopsych-bounces at paleopsych.org [mailto:paleopsych-bounces at paleopsych.org] On Behalf Of Premise Checker Sent: Friday, January 06, 2006 1:03 PM To: paleopsych at paleopsych.org Subject: [Paleopsych] CHE: In the Lab With the Dalai Lama In the Lab With the Dalai Lama The Chronicle of Higher Education, 5.12.16 http://chronicle.com/weekly/v52/i17/17b01001.htm By LEIGH E. SCHMIDT Even the Dalai Lama's harshest critics at the Society for Neuroscience meeting last month, in Washington, would have to concede this much: Choosing the exiled Tibetan Buddhist leader to inaugurate the professional association's series on neuroscience and society certainly got people talking. Who would have thought that an announced lecture on "The Neuroscience of Meditation" would set off a protest petition gathering about 1,000 signatures, a counterpetition of support boasting nearly as many names, substantial coverage in The New York Times and on National Public Radio, as well as ample chatter in the blogosphere? 
In a culture that likes its battles between science and religion to be loud, colorful, and Christian -- another nasty squabble, say, between evolutionists and creationists -- this controversy seemed unlikely to gain much traction. Yet as the dispute built momentum in the months leading up to the event, it soon became clear that the prospect of the red-robed Dalai Lama's urging the study of an ancient spiritual practice upon white-coated lab scientists would provide a newsworthy angle on the usual wrangling. Playing upon tensions far less noticed than those that have plagued relations between science and conservative Christianity, the latest dust-up reveals the spirit wars that divide the knowledge class itself. How purely secular and naturalistic do the members of that class imagine themselves to be, and how committed are they to keeping religion at bay in their conference gatherings, university laboratories, civic institutions, newsrooms, and think tanks? In turn, is "spirituality" a back door through which religion gets to enter the conversation, now dressed in the suitably neutralized garb of meditation as a universalistic practice of inward peace and outreaching compassion? Or does religion, even when soft-pedaled in the cosmopolitan language of spirituality and the contemplative mind, inevitably remain an embarrassment to those elites who stake their authority on secular rationality? The dispute roiling the neuroscience society over the past six months has brought such questions front and center. Inviting the Dalai Lama to speak at the meeting created two major border disputes. The first, of modest consequence to religion-and-science debates, was the conflict over the "political agenda" of the exiled Tibetan leader. In an international professional association that includes many Chinese scientists, some members were offended at the implied endorsement that the event gave to the Dalai Lama's larger cause of freedom for Tibetans.
The second dispute, more insistently debated, was over religion's showing up -- so visibly, to boot -- at an annual meeting of neuroscientists. The almost visceral response by critics was to declare a total separation of religion and science, to wave the flag for the late-19th-century warfare between the two domains. "A science conference is not [an] appropriate venue for a religion-based presentation," a professor of anesthesia from the University of California at San Francisco remarked on the petition. "Who's next, the pope?" That sign-off question pointed to a second part of the strict separationist logic: Even if the Dalai Lama seemed pretty irenic as religious leaders go, he nonetheless represented a slippery slope into a mire of superstition and authoritarianism. (How else, some critics asked, were they to interpret his known affinities with reincarnation and monasticism?) "Today, the Dalai Lama; Tomorrow, Creationists?" wrote a professor of medicine at the University of Toronto, capturing perhaps the most commonplace anxiety given voice among the critics. Keep the society free of all religious discussion, or else the esteemed body might slide into the hell of a Kansas school-board meeting. More interesting than the purists' boundary monitoring is the way the Dalai Lama and his defenders imagine through meditation an emerging meeting point for science and religion in contemporary culture. The headline study that served as the immediate source of intrigue surrounding his recent lecture was an article published last year in the Proceedings of the National Academy of Sciences and produced by researchers at the Waisman Laboratory for Brain Imaging and Behavior, at the University of Wisconsin at Madison. That group, led by the psychology professor Richard J. Davidson, has been studying long-term Tibetan Buddhist practitioners of meditation, comparing their brain-wave patterns with those of a control group. 
Davidson himself has been working in the science-religion borderlands for more than two decades and has been a leading collaborator with the Mind and Life Institute, in Boulder, Colo., one of the principal organizations encouraging the neuroscience-meditation dialogue. Shifting the focus of research from altered states of consciousness or momentary experiences of ecstasy, which so often concerned inquirers in the 1960s and 1970s, the Davidson group has been looking for evidence that sustained meditation causes actual neural changes in everyday patterns of cognition and emotion. In other words, they want to know if the brain function of long-term contemplatives is made demonstrably different through years of "mental training." And not just different, but better: That is, does the well-developed meditative mind sustain higher levels of compassion and calmness than the run-of-the-mill American noggin? Well, after testing eight long-time Tibetan Buddhist practitioners and 10 "healthy student volunteers," the researchers discovered that the 10,000 to 50,000 hours that the various monks had devoted to "mental training" appeared to make a real neurological difference. As the study's title put it, "Long-term meditators self-induce high-amplitude gamma synchrony during mental practice." Davidson and company, careful not to overreach in their conclusions, did suggest that practices of meditation, and the accompanying compassionate affect, were "flexible skills that can be trained." Did that mean contemplative practice could be abstracted from its religious context and then applied as a kind of public pedagogy? Were hopeful supporters wrong to read this as a tantalizing suggestion that meditation might prove beneficial not only for the mental health of Americans but also for the very fabric of society? Where, after all, couldn't we benefit from a little more "pure compassion," altruism, lovingkindness, and "calm abiding"? 
As novel as it may sound to monitor the brain waves of Tibetan Buddhist monks in university laboratories or on Himalayan hillsides (Davidson has done both), it is certainly not the first time that American psychologists have sought to re-engage the spiritual through the healthy-mindedness of meditation. At Wisconsin, Davidson occupies a research professorship named for Harvard's William James, the pioneering psychologist, psychical researcher, and philosopher of religion, and it is in the tradition of James that the current turn to the contemplative mind is best understood. Counter to the popular image of Americans as endlessly enterprising, agitated, and restless -- all busy Marthas, no reflective Marys -- James discerned a deep mystical cast to the American psyche and pursued that strain with uncommon intellectual devotion. Yet when it came to "methodical meditation," James saw little of it left among American Christians and turned instead to homegrown practitioners of various mind-over-matter cures. He particularly accented those "New Thought" metaphysicians who were pushing forward a dialogue with far-flung emissaries of yoga and Buddhist meditation in the wake of the World's Parliament of Religions, held in Chicago in 1893. Among James's favored practitioners of these newly improvised regimens of meditation was Ralph Waldo Trine, a Boston-based reformer with a knack for inspirational writing. In The Varieties of Religious Experience (1902), James used Trine's blockbuster In Tune With the Infinite (1897) as an epitome of the emergent practices of concentration, mental repose, and healthy-mindedness then percolating in New England and elsewhere across the country. Though an unabashed popularizer, Trine was not a lightweight. With an educational pedigree that ran from Knox College to the University of Wisconsin to the Johns Hopkins University, he moved easily in Harvard's wider metaphysical circles and energetically engaged various progressive causes. 
In much the same way that current studies promote the clinical applications of meditation, Trine emphasized the healthful benefits that accrued from cultivating a calm yet expectant mind. He had no scanners or electrodes, but he had the same hopes about improving the mental and physical health of Americans through elaborating a universal practice of meditation, one that transcended the particulars of any one religious tradition and represented a kind of cosmopolitan composite of all faiths. And while Trine did not have the Dalai Lama at hand, he did have extended contact with a well-traveled Sinhalese Buddhist monk, Anagarika Dharmapala, with whom he compared notes and devotional habits at a summer colony in Maine as he was putting together his own system of meditation for Americans. Like other inquirers then and now, Trine was all too ready to look to Asia for a practical antidote to American nervousness. The real payoff for Trine, as it is for Davidson and his colleagues, was not established simply through a calculus of productivity or cheerfulness: Would encouraging meditation or other visualization techniques make people more alert and proficient at the office or on the playing field? Would it make them feel happier and less disgruntled? Trine, like James and now Davidson, was finally more interested in saintliness and compassion than in helping stressed-out brain workers relax and concentrate. It is hard not to hear a hint of Davidson's pursuit of altruism in Trine's "spirit of infinite love," the moral imperative to "care for the weak and defenseless." And it is hard not to see that the world of William James and Ralph Waldo Trine is alive and well as American investigators wire up Tibetan Buddhist hermits in a search for the powers of the concentrated mind, the mental disciplines of harmony, compassion, and peace that might make the world a marginally kinder, less selfish place. 
That optimism about human nature -- that the mind has deep reservoirs of potential for empathy and altruism -- had a lot more backing among liberals and progressives in 1900 than it does today. Still, the considerable hopes now invested in meditation suggest that the old romantic aspirations, spiritual and otherwise, continue to flourish, especially among members of the mind-preoccupied knowledge class. Perhaps the most important dimension of the Dalai Lama's turn to the laboratory is the notion that the religion-science wound will be salved through recasting religion as spirituality. The Nobel laureate's latest book explicitly suggests as much in its title, The Universe in a Single Atom: The Convergence of Science and Spirituality. In doing so, he expressly appeals to all those Americans who fear fundamentalist incarnations of religion and who instead cast themselves as intellectually curious and spiritually seeking. Religion, on this model, is not a domain of authority competing with science but an inward terrain of personal experience and individual probing. Spirituality, the Dalai Lama writes, "is a human journey into our internal resources." Representing "the union of wisdom and compassion," it shares with science a progressive hope for "the betterment of humanity." In those terms, religion as spirituality becomes the handmaiden of science itself, joining it in an open quest for knowledge, empirical and pragmatic, unconstrained by ancient creeds, cosmologies, or churches. In such exhortations the Dalai Lama shows a fine, intuitive feel for much of American intellectual and religious life, but he is hardly telling today's Emersonian inquirers something about the universe that they do not already affirm. A practice of meditation made palatable to scientists, secularists, and seekers would no doubt look pallid to all those monks, hermits, and saints who have taken it to be an arduous and ascetic discipline.
Still, the American pursuit of "spirituality," reaching a crescendo in the past two decades, has been all too easy to dismiss as paltry and unsubstantial, labeled as foreign and threatening to more-orthodox versions of a Christian America. In this often-charged religious environment, the Dalai Lama has astutely laid hold of the science-spirituality nexus as a cultural foothold. As he has discovered in this latest brouhaha, that move has hardly lifted him above the wider debates, whether about materialism or intelligent design, but it has allowed him to connect with America's more cosmopolitan and progressive religious impulses. When William James was asked directly in 1904, "What do you mean by 'spirituality'?," he replied: "Susceptibility to ideals, but with a certain freedom to indulge in imagination about them." In mingling with neuroscientists who have warmed to his talk of spirituality, the Dalai Lama may well have found his own avatars of William James. Leigh E. Schmidt is a professor of religion at Princeton University and author of Restless Souls: The Making of American Spirituality (HarperSanFrancisco, 2005). _______________________________________________ paleopsych mailing list paleopsych at paleopsych.org http://lists.paleopsych.org/mailman/listinfo/paleopsych From checker at panix.com Mon Jan 9 15:38:20 2006 From: checker at panix.com (Premise Checker) Date: Mon, 9 Jan 2006 10:38:20 -0500 (EST) Subject: [Paleopsych] NYT: What Makes a Nation More Productive? It's Not Just Technology Message-ID: What Makes a Nation More Productive? It's Not Just Technology http://www.nytimes.com/2005/12/25/business/yourmoney/25view.html [Before you start bellowing that the most important factor in productivity was left out, be aware that this article is dealing with *changes* in the last five or ten years. It will be well into the future before technology allows changes in brain size and IQ to have a major impact within 5-10 years.
Embryo selection and eugenic breeding take a whole generation.] Economic View By DANIEL GROSS IN 2001, the stock market meltdown and a brief recession threw cold water on the widely held belief that the United States economy, juiced by a technological revolution, had entered a new era of limitless, inflation-free growth. But today, as bubble-era books like "Dow 36,000" collect dust on library shelves, evidence is mounting that there may be a new economy after all. In the late 1990's, growth in labor productivity - the amount of output per hour per worker - kicked into a higher gear. From 1996 through 1999, it grew at a blistering annual rate of 2.5 percent, compared with 1.4 percent from 1972 to 1995. Economists generally believed that the higher rate was a byproduct of the new economy. Much of the growth was spurred by the highly productive businesses that made information technology products - companies like Dell, Intel and Microsoft - and by their customers, who spent heavily to deploy productivity-enhancing PC's and software. "About half of the growth resurgence from 1995 to 2000 was due to I.T.," said Dale Jorgenson, university professor at Harvard and a co-author of the recently published "Information Technology and the American Growth Resurgence." As the technology investment boom of the 1990's gave way to bust in 2000, many analysts feared that the productivity gains would dissipate. Instead, productivity since 2000 has grown at a substantially higher pace than it did in the late 1990's. And productivity growth is still strong. This month, the Bureau of Labor Statistics reported that productivity in the third quarter was up 3.1 percent from the same quarter last year. A new report by the McKinsey Global Institute, the research arm of the consulting firm McKinsey & Company, found that sectors other than technology have been driving the growth in the post-bust years.
"The I.T.-producing industry itself, with its extraordinarily rapid pace of change, certainly has contributed to overall productivity growth," said Martin Baily, a senior fellow at the Institute for International Economics, based in Washington. "But now we're getting a bigger share from the rest of the economy." Mr. Baily, a former chairman of the Council of Economic Advisers in the Clinton administration, was co-author of the McKinsey report with Diana Farrell, the director of McKinsey Global. In the late 1990's, McKinsey found that six of the economy's 59 sectors accounted for virtually all productivity growth. Among the biggest contributors were new-economy industries like telecommunications, computer manufacturing and semiconductors. But from 2000 to 2003, the top seven sectors accounted for only 75 percent of the productivity increase. And five of the top contributors were service industries, including retail trade, wholesale trade and financial services. That is surprising, since economists have generally believed that it is much harder for service industries to reap sharp productivity gains than it is for manufacturers. To be sure, service industries have become more productive in recent years by continuing to invest in information technology. Yet there are also other factors at work. "I.T. is a particularly effective enabling tool," Ms. Farrell said. "But without the competitive intensity that drives people to adopt innovation, we wouldn't see these kinds of gains." To compete with Wal-Mart, for example, retailers of all stripes have been working furiously to gain scale, to manage supply chains and logistics more effectively, and to negotiate better terms with suppliers and workers. A similar dynamic has played out in the finance sector, where there has also been a huge gain in productivity. It is likely that competition and structural changes are responsible for those gains - both in the late 1990's and in recent years. 
Commissions for stock trades have fallen sharply amid relentless competition; spreads in stock trading have narrowed, thanks to rules promulgated by the Securities and Exchange Commission; and trading volume has risen, thanks to the proliferation of investors. Add it up, and you have more volume at lower cost to the customer. And when the stock market cooled after the Internet bubble, companies in the once-hot financial sector began to focus on cutting costs and eliminating unprofitable operations. Those moves further bolstered productivity. One mystery of recent years has been the enduring gap in productivity growth between the United States and Europe. In this case, another structural force - regulation - may be at work. "In economies with less regulation, companies can use information communications technology that link sectors to one another in ways that create joint productivity," said Gail Fosler, executive vice president and chief economist at the Conference Board. Because domestic retailers don't face the same sorts of restrictions on working hours and road use that European retailers do, for example, the Americans have been better able to use technology to manage trucking fleets, deliveries and inventory. The encouraging news, some economists say, is that a major breakthrough in information technology is not required to fuel further productivity growth. "It's not research and development that cause the big gains in productivity," Professor Jorgenson said. "The real drivers are things like competition, deregulation, the opening of markets and globalization." AS the gospel of increased productivity spreads to a wider range of sectors, more companies keep trying to figure out how to do more with the same amount of labor - or with less. For macroeconomists, that is good news. But there is a downside. In the past few years, payroll job growth has been far less robust than usual for post-recessionary periods. 
And because high productivity means that the economy can grow smartly without the addition of new jobs, some job seekers might wish that companies were a tad less efficient. Mr. Baily says that there does not have to be a trade-off between productivity and job creation. "Historically, in the U.S. and in other countries, periods of rapid productivity growth have been periods of strong employment growth," he said. That was certainly the case in the late 1990's. Why has the experience been different in the last several years? "The loss of manufacturing jobs after 2000 was just huge, and those jobs haven't come back," Mr. Baily said. The Big Three automakers have shed tens of thousands of jobs since 2000 because of competitive pressures and a drop in demand for their products. And it is likely that General Motors and Ford would be retrenching even if productivity in the service sector were growing at a much slower rate. "It's hard to blame productivity growth for a lot of manufacturing job losses," Mr. Baily said. Daniel Gross writes the "Moneybox" column for Slate.com. From checker at panix.com Mon Jan 9 15:38:37 2006 From: checker at panix.com (Premise Checker) Date: Mon, 9 Jan 2006 10:38:37 -0500 (EST) Subject: [Paleopsych] BBS: (D.S. Wilson and E. Sober) Re-Introducing Group Selection to the Human Behavioral Sciences Message-ID: Re-Introducing Group Selection to the Human Behavioral Sciences http://www.bbsonline.org/documents/a/00/00/04/60/bbs00000460-00/bbs.wilson.html [This should be read carefully and prayerfully by all, esp. those who have not read their book, "Unto Others," of which this is a fine précis.] Below is the unedited preprint (not a quotable final draft) of: Wilson, D.S. & Sober, E. (1994). Reintroducing group selection to the human behavioral sciences. Behavioral and Brain Sciences 17 (4): 585-654. The final published draft of the target article, commentaries and Author's Response are currently available only in paper.
David Sloan Wilson
Department of Biological Sciences
State University of New York at Binghamton
Binghamton, New York 13902-6000
DWILSON at BINGVAXA.BitNet

Elliott Sober
Department of Philosophy
University of Wisconsin
5185 Helen C. White Hall, 600 North Park Street
Madison, Wisconsin 53706
ESober at VMS.MACC.Wisc.edu

Keywords: culture; evolution; group selection; kin selection; inclusive fitness; natural selection; reciprocity; social organization; units of selection.

Abstract

In both biology and the human sciences, social groups are sometimes treated as adaptive units whose organization cannot be reduced to individual interactions. This group-level view is opposed by a more individualistic view that treats social organization as a byproduct of self-interest. According to biologists, group-level adaptations can evolve only by a process of natural selection at the group level. During the 1960's and 70's most biologists rejected group selection as an important evolutionary force but a positive literature began to grow during the 70's and is rapidly expanding today. We review this recent literature and its implications for human evolutionary biology. We show that the rejection of group selection was based on a misplaced emphasis on genes as "replicators" which is in fact irrelevant to the question of whether groups can be like individuals in their functional organization. The fundamental question is whether social groups and other higher-level entities can be "vehicles" of selection. When this elementary fact is recognized, group selection emerges as an important force in nature and ostensible alternatives, such as kin selection and reciprocity, reappear as special cases of group selection. The result is a unified theory of natural selection that operates on a nested hierarchy of units. The vehicle-based theory makes it clear that group selection is an important force to consider in human evolution.
Humans can facultatively span the full range from self-interested individuals to "organs" of group-level "organisms." Human behavior not only reflects the balance between levels of selection but it can also alter the balance through the construction of social structures that have the effect of reducing fitness differences within groups, concentrating natural selection (and functional organization) at the group level. These social structures and the cognitive abilities that produce them allow group selection to be important even among large groups of unrelated individuals.

_________________________________________________________________

The existence of egoistic forces in animal life has long been recognized. It is not so well known that the idea of group-centered forces in animal life also has a respectable history. (Allee 1943, p. 519)

It is a crude oversimplification to conceive of social motives as being capable of direct derivation from a hedonic algebra of self-interest--real or fictitious--based on a few universal human drives, whatever the choice of the drives may be. (Tajfel 1981, p. 36)

These quotations illustrate a perspective in which social groups have a primacy that cannot be reduced to individual interactions. The group-level perspective can be found in biology and all branches of the human behavioral sciences (e.g., Anthropology, Economics, Psychology, Sociology). It is opposed by another perspective that treats individuals as primary and social groups as mere consequences of individual interactions. Although the conflict between the two perspectives is often dismissed as semantic, it refuses to go away, suggesting that substantive issues are involved. In biology, the conflict between the perspectives has had a remarkable history. Prior to 1960 it was quite acceptable to think of social groups and even whole ecosystems as highly adapted units, similar to individuals in the harmony and coordination of their parts.[1]
Williams (1966) and others argued that group-level adaptations require a process of natural selection at the group level and that this process, though theoretically possible, was unlikely to be important in nature. Their verdict quickly became the majority view and was celebrated as a major scientific advance, similar to the rejection of Lamarckism. A generation of graduate students learned about group selection as an example of how not to think and it became almost mandatory for the authors of journal articles to assure their readers that group selection was not being invoked. Nevertheless, a positive literature began to grow in the 1970's and is rapidly expanding today (Table 1).[2] It is no longer heretical for biologists to think of natural selection as a hierarchical process that often operates at the group level. The most recent developments in biology have not yet reached the human behavioral sciences, which still know group selection primarily as the bogey man of the 60's and 70's. The purpose of this paper is to re-introduce group selection to the human behavioral sciences. We think that group selection can provide a firm foundation for a group-level perspective in the human sciences, as it has in biology. Before beginning, however, it is important to note a complication. Evolutionary approaches to human behavior have become increasingly common, as readers of Behavioral and Brain Sciences well know. Unfortunately, some of the most prominent evolutionary biologists interested in human behavior have themselves failed to incorporate the recent literature and still present group selection as a bogey man (e.g., Alexander 1979, 1987, Daly and Wilson 1988, Trivers 1985; but see Table 1 entries marked 'H' for exceptions). We therefore must re-introduce group selection to human sociobiology as well as to the more traditional branches of the human sciences.

A BRIEF REVIEW OF THE GROUP SELECTION CONTROVERSY

The adaptationist program.
In an influential paper, Gould and Lewontin (1979) criticize evolutionists for using adaptation as their only explanatory principle, to the exclusion of other factors such as genetic drift and genetic/developmental constraints. They coined the term "adaptationist program" as a pejorative and their conclusion that it cannot be the only tool in the evolutionist's toolkit is well taken. At the same time, their message should not obscure the fact that the adaptationist program, or "natural selection thinking" (Charnov 1982), is an extremely powerful tool for predicting the properties of organisms. One of the virtues of the adaptationist program is that it can be employed with minimal knowledge of the physiological, biochemical and genetic processes that make up the organisms under examination. For example, imagine studying the evolutionary effects of predation on snails, seeds and beetles. Suppose you discover that for all three groups, species exposed to heavy predation have harder and thicker exteriors than species not so exposed. The property "hard exterior" can be predicted from knowledge of the selection pressures operating on the populations. Since the exteriors of snails, beetles, and seeds are made of completely different materials, there is a sense in which these materials are irrelevant to the prediction (Campbell 1974, Wilson 1988). That is why Darwin was able to achieve his fundamental insights in almost total ignorance of the mechanistic processes that make up organisms. Adaptationist explanations have the power to unify phenomena that are physiologically, biochemically and genetically quite different. The adaptationist program is valuable even if its predictions turn out to be untrue. If we know the traits that organisms will have if natural selection is the only influence on evolutionary trajectories, then deviations from these traits constitute evidence that factors other than natural selection have played a significant role.
To discover whether adaptationism is true or false, optimality models are indispensable (Sober 1993, Orzack and Sober in press). Although the following discussion is, in effect, a view about how the adaptationist program should be pursued, it involves no substantive commitment to the success of that program. Regardless of the scopes and limits of adaptationism, the question "what would organisms be like if they were well adapted" is of paramount importance in evolutionary biology.

The adaptationist program and the biological hierarchy.

The question "What would they be like if they were well adapted?" is more complicated than it sounds. To see this, consider an imaginary population of rabbits inhabiting an island. A mutant arises that grazes more efficiently--so efficiently that a population of such mutants will overexploit their resource and go extinct. The mutation is adaptive in the limited sense of causing its bearer to have more offspring than other rabbits, but maladaptive in the larger sense of driving the population extinct. This example should sound familiar to human behavioral scientists because it resembles the social dilemmas that abound in human life. It corresponds to the tragedy of the commons popularized by Hardin (1968), the voting problem of economics (Margolis 1982) and the prisoner's dilemma of game theory (Rapoport and Chammah 1965). For humans and nonhumans alike, individual striving can lead to social chaos. As previously mentioned, many biologists prior to the 1960's uncritically assumed that natural selection evolves adaptations at upper levels of the biological hierarchy. In our imaginary example they would assume that the population of rabbits evolves to manage its resources. The possibility that adaptation at one level of the hierarchy can be maladaptive at another level was either ignored or assumed to be resolved in favor of the higher level.
These sentiments, which today are called "naive group selectionism", permeated the textbooks and were espoused by many eminent biologists, including Alfred Emerson (1960), who believed that all of nature was as functionally integrated as a termite colony. As a young post-doctoral associate at the University of Chicago, G.C. Williams attended a lecture by Emerson and left muttering "Something must be done...". The result was a modern classic, Adaptation and Natural Selection (Williams 1966)3. Williams' argument against higher-level adaptations came in three parts. First, he claimed that adaptation at any level of the biological hierarchy requires a process of natural selection operating at that level. Returning to our population of rabbits, it is easy to see that efficient grazers will evolve because they have more offspring than inefficient grazers. The negative consequences at the population level are irrelevant. However, if we imagine an archipelago of islands, only some of which contain the mutant strain, then populations driven extinct by the mutant can be replaced by other populations without the mutant. The population- level adaptation can now persist, but only because we have added a process of natural selection at that level; fit populations replace unfit populations in the same sense that fit rabbits replace unfit rabbits within populations. This is what evolutionary biologists term group selection. Second, Williams argued that group selection is unimportant in nature despite the fact that it is theoretically possible: It is universally conceded by those who have seriously concerned themselves with this problem that such group-related adaptations must be attributed to the natural selection of alternative groups of individuals and that the natural selection of alternative alleles within populations will be opposed to this development. I am in entire agreement with the reasoning behind this conclusion. 
Only by a theory of between-group selection could we achieve a scientific explanation of group-related adaptations. However, I would question one of the premises on which the reasoning is based. Chapters 5 to 8 will be primarily a defence of the thesis that group-related adaptations do not, in fact, exist. (Williams 1966, p92) Part of Williams' skepticism can be illustrated with our rabbit example. If migration occurs between islands, what is to prevent the mutant from "infecting" the other islands before the original population goes extinct? Or perhaps the mutant population doesn't go extinct but merely hobbles along in a malnourished state, in which case the occasional migrant from other islands would be unable to survive. At least for this example, it seems that the parameters of the model must be very finely tuned for group-level selection to prevail against individual-level selection. Third, Williams developed a concept of the gene as the "fundamental unit of selection" that has become a major theme in evolutionary biology, especially as amplified and extended by Dawkins (1976, 1982). Williams claimed that groups and even individuals cannot be units of selection because they are ephemeral and do not replicate with sufficient fidelity. Every sexually reproducing organism is a unique combination of thousands of genes that will never exist again, no matter how successful reproductively. At the individual level, only clonal organisms replicate with sufficient fidelity to qualify as units of selection. For sexually reproducing organisms, the gene is the unit that is transmitted through time with high fidelity and is therefore the fundamental unit of selection (the replicator, in Dawkins' terminology). This is frequently used as an argument against group selection.
For example, Alexander (1979, p36) states: In 1966 Williams published a book criticizing what he called "some current evolutionary thought" and chastised biologists for invoking selection uncritically at whatever level seemed convenient. Williams' book was the first truly general argument that selection is hardly ever effective on anything but the heritable genetic units of "genetic replicators" (Dawkins 1977) contained in the genotypes of individuals. Individuals and groups appear in Williams' scheme, not as units of selection, but as environments of the genes. As the simplest example, consider two alleles (A, a) at a single diploid locus in a randomly mating population, yielding the familiar three genotypes (AA, Aa, aa) in Hardy-Weinberg proportions. Suppose the fitnesses of the three genotypes are W_AA = 1, W_Aa = 0.75 and W_aa = 0.5. From the gene's-eye view, the A-allele can be said to inhabit two "genotypic environments", AA and Aa, and its average fitness can be easily calculated:

W_A = p*W_AA + (1-p)*W_Aa    (1)

The term p, in addition to being the frequency of the A-allele in the population, is also the proportion of A-alleles that exist in the AA "environment" in a randomly mating population. The fitness of the a-allele can similarly be averaged across its two genotypic environments (Aa, aa) to yield

W_a = p*W_Aa + (1-p)*W_aa    (2)

The A-allele will evolve whenever W_A > W_a, which is always the case when W_AA > W_Aa > W_aa. Note that A and a have the same fitness within the one genotypic environment that they inhabit together (the heterozygote). It is only by averaging across genotypic environments that differences in the fitness of A and a occur. Biologically informed readers will recognize W_A and W_a as the "average effects" of the two alleles used to calculate breeding values and narrow-sense heritability at the individual level (e.g., Falconer 1982, Wilson and Sober 1989).
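The averaging in equations (1) and (2) is easy to check numerically. The sketch below uses the genotypic fitnesses given in the text; the starting frequency and number of generations are arbitrary:

```python
W_AA, W_Aa, W_aa = 1.0, 0.75, 0.5   # genotypic fitnesses from the text

def allele_fitnesses(p):
    """Average effects of the A and a alleles at A-frequency p (eqs. 1, 2)."""
    W_A = p * W_AA + (1 - p) * W_Aa
    W_a = p * W_Aa + (1 - p) * W_aa
    return W_A, W_a

def next_p(p):
    """Standard allele-frequency recursion using the averaged fitnesses."""
    W_A, W_a = allele_fitnesses(p)
    return p * W_A / (p * W_A + (1 - p) * W_a)

p = 0.1
for _ in range(5):
    W_A, W_a = allele_fitnesses(p)
    assert W_A > W_a        # holds whenever W_AA > W_Aa > W_aa
    p = next_p(p)           # so A rises in frequency every generation
```

As the text notes, the inequality W_A > W_a arises only through the averaging across genotypic environments; within the heterozygote itself the two alleles fare identically.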
More complicated examples can be constructed in which the population is divided into social groups that differ in allele frequencies and genotypic fitnesses. In these cases the genes inhabit a more complicated array of "environments" but in principle it is always possible to calculate gene-level fitness by averaging across genotypic and social contexts. In addition, it will always be the case that A replaces a when WA>Wa. This is why Williams (1986,1992) refers to genes as "bookkeeping" devices that automatically record the net effect of multiple selection pressures. Williams' case against group selection was strengthened by two other theories in evolutionary biology that were developed during the 60's and 70's. The first was inclusive fitness theory (also called kin selection; Hamilton 1964, Maynard Smith 1964), which explained how altruism could evolve among genetic relatives. The second was evolutionary game theory (Axelrod and Hamilton 1981, Maynard Smith 1982, Trivers 1971, Williams 1966), which explained how cooperation could evolve among non-relatives. These theories seemed to account for many of the phenomena that group selection had been invoked to explain. With the problems raised by Williams and two robust alternatives, the theory of group selection, never well articulated to begin with, collapsed. Not all evolutionary biologists are familiar with the details of Williams' arguments against group selection, but the bottom-line conclusion has been adopted with such conviction that we will call it Williams' first commandment: "Thou shalt not apply the adaptationist program above the level of the individual." All adaptations must be explained in terms of the relative fitness of individuals within populations. Individual-level adaptations may have positive or negative effects at the group level, but in both cases the group-level effects are irrelevant to evolutionary change. 
Williams' first commandment was repeated like a mantra throughout the 60's and 70's, as every evolutionary biologist knows. Unfortunately, the mantra still echoes through the numerous accounts of evolutionary theory that are written for the human sciences and popular audiences today (e.g., Alexander 1987, Archer 1991, Cronin 1991, Daly and Wilson 1988, Frank 1988, Krebs 1987, MacDonald 1988, Noonan 1987, Sagan and Druyan 1992). Examining the edifice. Although Williams' and Dawkins' gene-centered view has enjoyed enormous popularity, it has one flaw that should be obvious, at least in retrospect. Naive group selectionists thought that upper levels of the biological hierarchy were like individual organisms in the coordination and harmony of their parts. According to Williams and Dawkins, however, even sexually reproducing organisms do not qualify as units of selection because they, like groups, are too ephemeral. If a creature such as a bird or a butterfly is not a unit of selection, then what endows it with the internal harmony implied by the word "organism"? To answer this question, an entirely different concept needed to be invoked, which Dawkins (1976) called "vehicles of selection" ("interactors" in Hull's 1980 terminology). Employing one of Dawkins' own metaphors, we can say that genes in an individual are like members of a rowing crew competing with other crews in a race. The only way to win the race is to cooperate fully with the other crew members. Similarly, genes are "trapped" in the same individual with other genes and usually can replicate only by causing the entire collective to survive and reproduce. It is this property of shared fate that causes "selfish genes" to coalesce into individual organisms. So far, so good, but if individuals can be vehicles of selection, what about groups? After all, we are interested in comparing groups with individuals, not with genes.
Yet gene-centered theorists have scarcely addressed this question.4 The situation is so extraordinary that historians of science should study it in detail: A giant edifice is built on the foundation of genes as replicators, and therefore the "fundamental" unit of selection, which seems to obliterate the concept of groups as organisms. In truth, however, the replicator concept cannot even account for the organismic properties of individuals. Almost as an afterthought, the vehicle concept is tacked onto the edifice to reflect the harmonious organization of individuals, but it is not extended to the level of groups. The entire edifice therefore fails to address the question that it originally seemed to answer so conclusively and that made it seem so important. This is such a crucial and unappreciated point that we want to reinforce it by quoting from The Ant and the Peacock (Cronin 1991), one of the most recent book-length treatments of evolution for a popular audience.5 Cronin is a philosopher who has a part-time appointment at Oxford University's Zoology Department. Her book was chosen as one of the year's best by the New York Times and has been cited with approval by authorities such as G.C. Williams (1993), John Maynard Smith (1992) and Daniel Dennett (1992).6 There is every reason for the reader to think that it represents state-of-the-art evolutionary biology. Cronin agrees with us that naive group selectionists compared groups to individuals: Many an ecologist, equipped with no more than a flimsy analogy, marched cheerfully from the familiar Darwinian territory of individual organisms into a world of populations and groups. Populations were treated as individuals that just happened to be a notch or two up in the hierarchy of life...(p278). Her treatment of Williams is also close to our own: "Williams retaliated with two types of argument.
He spelled out why genes are suitable candidates for units of selection whereas organisms, groups and so on are not...(p286)." Here Cronin commits (along with Williams) the fallacy that we outlined above. If individuals and groups are not replicators, then the replicator concept cannot be used to argue that they are different from each other! Faced with this dilemma, Cronin dutifully invokes vehicles to explain the organismal properties of individuals, with a nod to groups: If organisms are not replicators, what are they? The answer is that they are vehicles of replicators...Groups, too, are vehicles, but far less distinct, less unified...In this weak sense, then, 'group selection' could occur...But even if they [group-level adaptations] did arise--which as we've seen is unlikely--they would in no way undermine the status of genes as the only units of replicator selection. This does not mean that higher level entities are unimportant in evolution. They are important, but in a different way: as vehicles (p289). But this is all that naive group selectionists ever claimed--that groups are like individuals by virtue of the adaptive coordination of their parts! Finally, Cronin concludes that group selection is unimportant even in the so-called weak sense: But group selectionism (weak group selectionism) makes claims about adaptations, about characteristics that satisfy the fragmented purposes of all the genes in the group and, what's more, confer an advantage on that group over other groups. Group-level adaptations, then, are a very special case of emergent properties--so special that it would be rash to expect them to have played any significant role in evolution. Of course, the question of what role they have actually played is an empirical, not a conceptual issue. It is a factual matter about which adaptations happen to have arisen at levels higher than organisms, about the extent to which groups and other higher-level vehicles happen to have been roadworthy. 
(p290) Cronin is in the unhappy position of a circus artist who stands on the backs of two horses, replicators and vehicles, as they gallop around the ring. The only way that she can perform this dazzling feat is by making the horses gallop in parallel. Thus, groups must fail not only as replicators but as vehicles. What Cronin cannot bring herself to say is that the replicator concept that forms the inspiration for her book is totally irrelevant to the question that is and always was at the heart of the group selection controversy--can groups be like individuals in the harmony and coordination of their parts? To answer this question we must restructure the entire edifice around the concept of vehicles, not replicators. That is exactly what the positive literature on group selection does.7 Taking vehicles seriously. The essence of the vehicle concept is shared fate, exemplified by the adage (and by Dawkins' rowing crew metaphor) "we're all in the same boat." Our restructured edifice must first be able to identify the vehicle(s) of selection in any particular biological or human situation. In figure 1, the biological hierarchy is shown as a nested series of units, each of which is a population of lower level units. An individual can be regarded as a population of genes and a group is obviously a population of individuals. A metapopulation is a population of groups. For example, a single field might contain hundreds of ant colonies. Each colony certainly deserves to be called a group and yet we must also recognize the collection of groups as an important entity. The hierarchy has been left open on both ends because genes are composed of subunits and metapopulations can exist in higher-order metapopulations, a fact that will become important later. Vehicles of selection can be identified on a trait-by-trait basis by the following simple procedure: Starting at the lowest level of the hierarchy,8 ask the question "Do genes within a single individual differ in fitness?"
If the answer is "no", then they share the same fate and are part of the same vehicle. Proceeding up the hierarchy, ask the question: "Do individuals within a single group differ in fitness?" If the answer is "no" then once again they share the same fate and we must proceed up the hierarchy until we find the level(s) at which units differ in fitness. This is the level (or levels) at which natural selection actually operates, producing the functional organization implicit in the word "organism".9 Everything below this level will acquire the status of organs and everything above this level will be vulnerable to social dilemmas.10 Already we can make three fundamental points: First, focusing on vehicles makes it obvious that the concept of "organism" is not invariably linked to the "individual" level of the biological hierarchy. To the extent that genes can differ in fitness within single individuals, the genes will become the organisms and the individual will become a dysfunctional collection of genes. To the extent that individuals in the same group are in the same "boat" with respect to fitness, they will evolve into harmonious organs of group-level organization. In short, the organ-organism-population trichotomy can be frame-shifted both up and down the biological hierarchy. Frame-shifts in both directions have been documented and examples will be provided below. Second, the status of organ vs. organism vs. population must be assigned on a trait-by-trait basis. It is possible for a single creature such as a wasp to be an organ with respect to some traits, an organism with respect to other traits, and a population of organisms with respect to still other traits. This may sound strange but it follows directly from the fact that fitness is a property of traits, not organisms (Sober 1984).
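The procedure just described is pure bookkeeping, and can be written out as such. The sketch below is only an illustration: the dictionaries are made-up summaries of two cases discussed in this paper (meiotic drive and a colony-level trait), not data or a model:

```python
# Schematic of the vehicle-identification procedure: walk up the
# biological hierarchy and report every level at which units differ
# in fitness for the trait in question.  The trait entries below are
# illustrative placeholders, not empirical claims.

HIERARCHY = ["gene", "individual", "group", "metapopulation"]

def vehicles(fitness_differs_at):
    """fitness_differs_at maps a level name to True if units at that
    level differ in fitness for this trait; missing levels count as
    'no difference' (shared fate)."""
    return [level for level in HIERARCHY if fitness_differs_at.get(level)]

# Meiotic drive: selection acts at two levels at once.
drive = {"gene": True, "individual": True}
# A colony-level trait in social insects: only whole colonies differ.
colony_trait = {"group": True}

assert vehicles(drive) == ["gene", "individual"]
assert vehicles(colony_trait) == ["group"]
```

The returned list is the set of vehicles for that trait; everything below the lowest listed level has the status of an organ, and everything above the highest is exposed to social dilemmas.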
For example, in the parasitic wasp Nasonia vitripennis, some males harbor what has been called the "ultimate" selfish gene, because it destroys all the other genes in the male to facilitate its own transmission (Werren 1991, 1992). In this case the gene is the vehicle of selection but most other genes in the same species evolve by standard Darwinian selection, in which case the individual is the vehicle of selection. Third, fitness differences are not always concentrated at one level of the biological hierarchy. Individuals with trait A can be less fit than individuals with trait B within single groups, while groups of individuals with trait A are more fit than groups of individuals with trait B. In these cases we cannot assign the status of organ, organism or population and must settle for some hybrid designation. As one example, Williams (1966) showed that, given certain assumptions, natural selection within single groups favors an even sex ratio while natural selection between groups favors an extreme female-biased sex ratio. He thought that the absence of female-biased sex ratios in nature provided conclusive evidence against group selection. Since then, moderately female-biased sex ratios have been discovered in literally hundreds of species, which reflect an equilibrium between opposing forces of within- and between-group selection (Charnov 1982, Colwell 1981, Frank 1986, Wilson and Colwell 1981).11 As we will show, altruism is another example of a hybrid trait that is selected against at the individual level but favored at the group level. Now we will document our claim that the organ-organism-population trichotomy can be frame-shifted both up and down the biological hierarchy. Individuals as dysfunctional populations of genetic elements. Individuals are traditionally viewed as stable entities that (barring mutation) pass the same genes in the same proportions to their offspring that they received from their parents. However, this is not always the case.
For example, a diploid individual can be regarded as a population of N=2 alleles at each locus. The rules of meiosis usually dictate that each allele is equally represented in the gametes. Occasionally a mutation arises that "breaks" the rules of meiosis by appearing in greater than 50% of the gametes, a phenomenon known as "meiotic drive" (Crow 1979). These same alleles often decrease the survival of individuals that possess them and can even be lethal in homozygous form. Let us apply our simple procedure to this example to identify the vehicle(s) of selection. Can genes within a single individual differ in fitness? The answer is "yes" because the driving allele exists at a frequency of p=0.5 in heterozygotes and occurs in the gametes of those heterozygotes with a frequency of p>0.5. Natural selection therefore operates at the gene level, favoring the driving allele. Now proceed up the hierarchy. Do individuals within a single population differ in fitness? The answer is again "yes" because individuals with the driving allele suffer higher mortality than individuals without the driving allele. Natural selection therefore operates against the driving allele at the individual level. Both the gene and the individual are vehicles of selection. If gene-level selection is sufficiently strong, the driving allele can evolve despite its negative effects on individuals. Many other examples of natural selection within individuals could be cited involving chromosomal genes (Dover 1986), cytoplasmic genes (Cosmides and Tooby 1981), and competing cell lineages (Buss 1987). These examples have been received with great fanfare by gene-centered theorists as some sort of confirmation of their theory. But these examples do not confirm the thesis that genes are replicators--all genes are replicators by definition and no documentation is needed. These examples are remarkable because they show that genes can sometimes be vehicles. 
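The meiotic-drive example can be made quantitative with a standard one-locus recursion. The drive strength and fitness costs below are invented for illustration; the only features taken from the text are that heterozygotes transmit the driver to more than half their gametes and that the driver is lethal when homozygous:

```python
# Minimal sketch of meiotic drive (hypothetical parameters): the driving
# allele D is transmitted to a fraction k > 0.5 of heterozygote gametes
# (gene-level selection FOR D), while carriers suffer a viability cost
# (individual-level selection AGAINST D).

def next_freq(p, k=0.9, s_het=0.1, s_hom=1.0):
    """One generation: random mating, viability selection, then drive.
    p: frequency of D; s_het, s_hom: fitness costs of Dd and DD."""
    # Genotype frequencies after random mating, weighted by viability.
    f_DD = p * p * (1 - s_hom)            # s_hom = 1: homozygotes die
    f_Dd = 2 * p * (1 - p) * (1 - s_het)
    f_dd = (1 - p) * (1 - p)
    total = f_DD + f_Dd + f_dd
    # Gamete pool: DD (if any survive) give all D; Dd give a fraction k.
    return (f_DD + k * f_Dd) / total

p = 0.05
for _ in range(100):
    p = next_freq(p)
# With these values, gene-level drive overwhelms the individual-level
# cost and D climbs to a stable polymorphic frequency despite being
# lethal in homozygous form.
```

With k = 0.9, s_het = 0.1 and a lethal homozygote, the recursion reduces to p' = 1.62p/(1 + 0.8p), which has a stable internal equilibrium at p = 0.775; both the gene and the individual are vehicles, and the outcome is a balance between them.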
They seem bizarre and disorienting because they violate our deeply rooted notion that individuals are organisms. They force us to realize that individuals are at least occasionally nothing more than groups of genes, subject to the same social dilemmas as our imaginary population of rabbits. Why aren't examples of within-individual selection more common? Several authors have speculated that the rules of meiosis and other mechanisms that suppress evolution within individuals are themselves the product of natural selection acting at the individual level. Genes that profit at the expense of other genes within the same individual are metaphorically referred to as "outlaws" (Alexander and Borgia 1978) and the regulatory machinery that evolves to suppress them is referred to as a "parliament" of genes (Leigh 1977). Ironically, most of the authors who employ these metaphors are reluctant to think of real parliaments as regulatory machines that reduce fitness differences within groups, thereby concentrating adaptation at the group level. Gene-centered theorists frame-shift downward with enthusiasm but they are much more reluctant to frame-shift upward. Groups as organisms. Social insect colonies have been regarded as "superorganisms" for centuries. Sterile castes with division of labor, colony-level thermoregulation and patterns of information processing that transcend single brains all suggest intuitively that colonies are functionally organized units, built out of individual insects. This interpretation was rejected by gene-centered theorists, however, who claimed to explain the social insects without invoking group selection. 
Their scorn for the earlier view is illustrated by West-Eberhard (1981 p 12; parenthetical comments are hers): "Despite the logical force of arguments against group (or colony) selection, and the invention of tidy explanations for collaboration in individual terms, the supraorganism (colony-level selection) still haunts evolutionary discussions of insect sociality." Let us apply our simple procedure to locate the vehicle(s) of selection in the social insects. Can genes differ in fitness within individuals? Yes-- the social insects resemble other species in this regard--but the products of selection at this level are unlikely to enhance colony function. Can individuals differ in fitness within single colonies? Yes; as one example, honey bee queens usually mate with more than one male, leading to multiple patrilines among the workers. Many insects can detect genetic similarity using odor cues and it is plausible to expect workers tending future queens to favor members of their own patriline. As with evolution within individuals, however, this kind of palace intrigue is more likely to disrupt colony function than to enhance it (Ratnieks 1988, Ratnieks and Visscher 1989). We therefore must proceed up the hierarchy and ask "Can groups (=colonies) differ in fitness within a metapopulation?" Unlike our archipelago of rabbits, in which the metapopulation seemed somewhat contrived, the social insects obviously exist as a population of colonies. Consider a mutation that is expressed in honeybee workers and increases the efficiency of the hive, ultimately causing the queen to produce more reproductive offspring. It is obvious that this mutation will spread, not by increasing in frequency within the hive, but by causing hives possessing the mutation to out-produce other hives. Thus, for the majority of traits that improve colony function, the colony is the vehicle of selection and can legitimately be called an organism. 
Focusing on vehicles, not replicators, as the central concept makes West-Eberhard's statement sound absurd. Notice also that Williams' first argument, that group-level adaptations require a process of natural selection at the group level, is correct. But his empirical claim that group selection is weak and group-level adaptations don't exist is just plain wrong in the case of the eusocial insects--both the process and the product are manifest. The focus on genes as the "fundamental" unit of replication merely distracts from the more relevant framework based on vehicles. Fortunately, most social insect biologists now realize this and once again regard social insect colonies as "group-level vehicles of gene survival" (Seeley 1989), at least to the degree that they evolve by between-colony selection. Before leaving the social insects it is worth asking a question that we will pose later for humans: What does it mean for a creature such as an ant or a honeybee, itself an organism in some respects, to also be part of a group-level organism? A partial answer is provided by Seeley (1989), whose elegant experiments reveal the mechanisms of colony-level adaptation. A honeybee hive monitors its floral resources over several square miles and maximizes its energy intake with impressive accuracy. If the quality of a food patch is experimentally lowered, the hive responds within minutes by shifting workers away from that patch and toward ones that are more profitable. Yet individual bees visit only one patch and have no frame of comparison. Instead, individuals contribute one link to a chain of events that allows the comparison to be made at the hive level. Bees returning from the low quality patch dance less and themselves are less likely to revisit. With fewer bees returning from the poor resource, bees from better patches are able to unload their nectar faster, which they use as a cue to dance more. Newly recruited bees are therefore directed to the best patches. 
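Seeley's mechanism can be caricatured in a few lines. The sketch below is deliberately crude and every number in it is invented; it shows only the core logic: dance effort is tied to a bee's own patch, recruits follow dances in proportion to total dance time, and no bee ever compares two patches:

```python
# A cartoon of decentralized forager allocation (parameters invented).
# Each forager knows only its own patch's quality; total dance time per
# patch is foragers * quality, and a fixed fraction of the workforce
# turns over each round, following dances in proportion to dance time.

def reallocate(foragers, qualities, turnover=0.2):
    """foragers: bees per patch; qualities: nectar quality per patch."""
    dance = [n * q for n, q in zip(foragers, qualities)]  # dance time per patch
    recruits = turnover * sum(foragers)
    total_dance = sum(dance)
    return [n * (1 - turnover) + recruits * d / total_dance
            for n, d in zip(foragers, dance)]

foragers, qualities = [50.0, 50.0], [1.0, 1.0]
qualities[0] = 0.25          # experimentally degrade patch 0
for _ in range(30):
    foragers = reallocate(foragers, qualities)
# Workers drain away from the degraded patch within a few rounds; the
# comparison is made at the hive level, not inside any bee's head.
# (In this cartoon the poor patch is eventually abandoned outright; a
# finer-grained model retains a quality-proportional residue.)
```

The point of the sketch is the one made in the text: the adaptive allocation is a property of the colony's communication loop, with individual bees contributing single links to the chain.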
Adaptive foraging is accomplished by a decentralized process in which individuals are more like neurons than decision-making agents in their own right (Camazine and Sneyd 1991; see Camazine 1991, Deneubourg and Goss 1989, Franks 1989 and Wilson and Holldobler 1988 for other examples of group-level cognition in social insects). The image of a group-level mind composed of relatively mindless individuals is aptly described in D. Hofstadter's (1979) essay "Ant Fugue". We suggest that some aspects of human mentality can also be understood as a form of group-level cognition (see below). Finding the vehicles in inclusive fitness theory. How was it possible for West-Eberhard and others to think that the social insects could be explained without invoking group selection? Her "tidy" alternative explanation was inclusive fitness theory, which she and almost everyone else regarded as a robust alternative to group selection. However, inclusive fitness theory is a gene-centered framework that does not identify the vehicle(s) of selection. When we rebuild inclusive fitness theory on the foundation of vehicles we discover that it is not an alternative to the idea of group selection at all (Michod 1982, Queller 1991, 1992, Uyenoyama and Feldman 1980, Wade 1985, Wilson 1977, 1980). It would be hard to imagine a more important discovery, yet human behavioral scientists are almost totally unaware of it, in part because their evolutionary informants so assiduously ignore it. Even the most recent accounts of evolution for the human sciences treat inclusive fitness and group selection as separate mechanisms (e.g., Alexander 1987, 1989, 1992, Archer 1991, Daly and Wilson 1988, Frank 1988, Krebs 1987, MacDonald 1988, Noonan 1987). We will consider one of these treatments in detail because it allows us to make a number of important points throughout the rest of our paper. Here is Frank's (1988, p37-39) depiction of group selection.
Group-selection models are the favored turf of biologists and others who feel that people are genuinely altruistic. Many biologists are skeptical of these models, which reject the central Darwinian assumption that selection occurs at the individual level. In his recent text, for example, Trivers includes a chapter entitled "The group selection fallacy". With thinly veiled contempt, he defines group selection as "the differential reproduction of groups, often imagined to favor traits that are individually disadvantageous but evolve because they benefit the larger group". Group selectionists have attempted to show that genuine altruism, as conventionally defined, is just such a trait... Could altruism have evolved via group selection? For this to have happened, altruistic groups would have had to prosper at the expense of less altruistic groups in the competition for scarce resources. This requirement, by itself, is not problematic. After all, altruism is efficient at the group level (recall that pairs of cooperators in the prisoner's dilemma do better than pairs of defectors), and we can imagine ways that altruistic groups might avoid being taken advantage of by less altruistic groups... But even if we suppose that the superior performance of the altruistic group enables it to triumph over all other groups, the group selection story still faces a formidable hurdle. The conventional definition, again, is that nonaltruistic behavior is advantageous to the individual. Even in an altruistic group, not every individual will be equally altruistic. When individuals differ, there will be selection pressure in favor of the least altruistic members. And as long as these individuals get higher payoffs, they will comprise an ever-larger share of the altruistic group. So even in the event that a purely altruistic group triumphs over all other groups, the logic of selection at the individual level appears to spell ultimate doom for genuinely altruistic behavior.
It can triumph only when the extinction rate of groups is comparable to the mortality rate for individuals within them. As [E.O.] Wilson stresses, this condition is rarely if ever met in practice. Frank's account of group selection is accurate and similar to our own rabbit example. He also accurately depicts the climate of the group selection debate during the 60's and 70's. Now here is Frank's description of inclusive fitness theory (p25-27): Biologists have made numerous attempts to explain behavior that, on its face, appears self-sacrificing. Many of these make use of William Hamilton's notion of kin selection. According to Hamilton, an individual will often be able to promote its own genetic future by making sacrifices on behalf of others who carry copies of its genes... The kin-selection model fits comfortably within the Darwinian framework, and has clearly established predictive power... Sacrifices made on behalf of kin are an example of what E.O. Wilson calls "'hard core' altruism, a set of responses relatively unaffected by social reward or punishment beyond childhood." Viewed from one perspective, the behavior accounted for by the kin-selection model is not really self-sacrificing behavior at all. When an individual helps a relative, it is merely helping that part of itself that is embodied in the relative's genes... Frank's exposition certainly suggests that group selection and kin selection are alternative theories that invoke separate mechanisms. Frank himself regards them as so different that he calls one non-Darwinian and the other Darwinian!12 Now consider the model in figure 2, which rebuilds inclusive fitness theory on the foundation of vehicles (see Michod 1982, Queller 1991, 1992, Sober 1993, Uyenoyama and Feldman 1980, Wade 1985, Wilson 1977, 1980 for more formal treatments). A dominant allele (A) codes for a behavior that is expressed only among full siblings.
The behavior decreases the fitness of the actor by an amount c and increases the fitness of a single recipient by an amount b. In figure 2, adults of the three genotypes (AA,Aa,aa) combine randomly to form six types of mating pairs (AAxAA, AAxAa, AAxaa, AaxAa, Aaxaa, aaxaa). Each mating pair produces sibling groups with a characteristic proportion of altruists and non-altruists. Thus, the sibling groups derived from AAxAA matings are entirely altruistic, the groups derived from aaxaa matings are entirely non-altruistic and so on. Since the behavior is expressed only among siblings, the progeny of each mated pair is an isolated group as far as the expression of the behavior is concerned. Thus, any model of sibling interactions invokes a metapopulation of sibgroups. Now let us employ our simple procedure to locate the vehicles of selection. Beginning at the lowest level of the hierarchy, there is no meiotic drive or other forms of selection within individuals in this example. Moving up the hierarchy, do individuals within single sibgroups differ in fitness? Yes, and natural selection at this level operates against the altruists. In all sibgroups that contain both selfish (aa) and altruistic (Aa,AA) phenotypes, the former are fitter--they benefit from the latter's help without sharing the costs. Sibling groups are similar to other groups in this respect. Continuing up the hierarchy, can sibgroups differ in fitness within the metapopulation? Yes, and it is here that we find the evolutionary force that favors altruism. Since every altruist contributes a net fitness increment of b-c to the sibgroup, the fitness of the collective is directly proportional to the number of altruists in the group. Sibgroups with more altruists outproduce sibgroups with fewer altruists. The degree of altruism that evolves depends on the balance of opposing forces at the group and individual levels. 
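The opposing forces in this model can be made concrete with a short calculation. The sketch below is ours, not the authors' formal model: the baseline fitness and the values of b and c are illustrative assumptions, and each altruist is assumed to spread its benefit b evenly over the other group members rather than over a single recipient.

```python
# Illustrative bookkeeping for within- vs. between-group selection in a
# sibling group. BASE, b, c are assumed values, not taken from the paper.
BASE, b, c = 1.0, 0.5, 0.1

def fitnesses(n_altruists, n_total):
    """Each altruist pays c and spreads b evenly over the n-1 others.
    Returns (mean altruist fitness, mean selfish fitness, group mean)."""
    n, k = n_total, n_altruists
    share = b / (n - 1)                  # benefit received per group-mate
    w_alt = BASE - c + (k - 1) * share   # helped by the k-1 other altruists
    w_self = BASE + k * share            # helped by all k altruists
    w_group = (k * w_alt + (n - k) * w_self) / n
    return w_alt, w_self, w_group

# Within any mixed group, selfish siblings are fitter,
# by exactly c + b/(n-1):
w_alt, w_self, _ = fitnesses(5, 10)
assert w_self > w_alt
# But groups with more altruists outproduce groups with fewer,
# since each altruist adds a net b - c to the group total:
assert fitnesses(8, 10)[2] > fitnesses(2, 10)[2]
```

The two assertions are exactly the two levels of selection described in the text: selection against altruists within every mixed sibgroup, and selection for altruism between sibgroups.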
Figure 3 shows why kin groups are more favorable for the evolution of altruism than groups of unrelated individuals. In the latter case, groups of size N are drawn directly from the global population, forming a binomial distribution of local gene frequencies. In the former case, groups of size two (the parents) are drawn from the global population and groups of size N (the siblings) are drawn from their gametes. This two-step sampling procedure increases genetic variation among groups, intensifying natural selection at the group level. Put another way, altruists are segregated from non-altruists more in kin groups than in randomly composed groups. In both cases there are mixed groups, however, and evolution within mixed groups is the same regardless of whether they are composed of siblings or nonrelatives. Notice that this explanation does not invoke the concept of identity by descent, which seems to be the cornerstone of inclusive fitness theory. There is no physical difference between two altruistic genes that are identical by descent and two altruistic genes that are not. The coefficient of relationship is nothing more than an index of above-random genetic variation among groups (e.g., Falconer 1982, ch. 3-5, Queller 1991, 1992). We invite the reader to go back to Frank's account of group selection to confirm that it exactly describes the process of kin selection that is portrayed in figure 2. Dr. Jekyll and Mr. Hyde are the same person. The only discrepancy between Frank's account and figure 2 involves the concept of extinction. Sibling groups don't last for multiple generations and don't necessarily go extinct, but rather dissolve into the larger population when the individuals become adults and have their own offspring. Thus, sibling groups (and social insect colonies) differ somewhat from our population of rabbits and the groups that Frank and Trivers had in mind. But this does not disqualify sibling groups as vehicles of selection. 
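The effect of two-step sampling on among-group variation can be checked with a quick simulation. This is our sketch, not the paper's model: the global allele frequency p, group size N, and number of sampled groups are illustrative assumptions.

```python
import random

# Compare among-group variance in allele frequency for (a) groups drawn
# straight from the gene pool and (b) sibships built from two sampled
# parents. Parameters are illustrative.
random.seed(1)
p, N, GROUPS = 0.5, 10, 20000

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def random_group_freq():
    """Allele frequency in a group of N diploid individuals (2N genes)
    drawn directly from the global gene pool."""
    return sum(random.random() < p for _ in range(2 * N)) / (2 * N)

def sib_group_freq():
    """Two-step sampling: draw two diploid parents from the gene pool,
    then build N offspring from their gametes."""
    mom = (random.random() < p, random.random() < p)
    dad = (random.random() < p, random.random() < p)
    count = sum(random.choice(mom) + random.choice(dad) for _ in range(N))
    return count / (2 * N)

v_random = variance([random_group_freq() for _ in range(GROUPS)])
v_sib = variance([sib_group_freq() for _ in range(GROUPS)])
# Sibships show several times more among-group genetic variation:
assert v_sib > 2 * v_random
```

For randomly assembled groups the among-group variance is the binomial value p(1-p)/2N; the parent-then-gamete procedure adds the variance among parental pairs on top of gamete sampling, which is the sense in which kin groups concentrate selection at the group level.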
After all, individuals are transient collections of genes that "dissolve" into the gene pool as gametes. The ephemeral nature of groups in figure 2 makes them more similar to individuals, not less. Frank's account of kin selection appears so different, not because it invokes a different mechanism for the evolution of altruism, but because it utilizes a different accounting procedure for calculating gene frequency change that does not compare the fitnesses of individuals within single groups. The method correctly predicts the degree of altruism that evolves but obscures the internal dynamics of the process. In fact, when the vehicle-centered approach was first published, many biologists who thought they were familiar with inclusive fitness theory found it hard to believe that altruism is actually selected against within kin-groups and evolves only by a process of between-group selection. The unification of group selection and kin selection has implications for the distinction between "genuine" vs. "apparent" altruism. This is an important distinction in the human behavioral sciences, and evolutionary accounts such as Frank's seem to provide a tidy answer: The altruism that evolves by group selection is "genuine" because it entails real self-sacrifice, while the altruism that evolves by kin selection is only "apparent" because it is just genes promoting copies of themselves in other individuals. The unified theory reveals that this distinction is an artifact of the way that fitness is calculated. Any trait that is selected at the group level can be made to appear "genuinely" altruistic by comparing relative fitness within groups, or only "apparently" altruistic by averaging fitness across groups (Wilson 1992, Wilson and Dugatkin 1992). Thus, evolutionary biologists have so far contributed little but confusion to the distinction between genuine and apparent altruism.13 Finding the vehicles in evolutionary game theory. 
Evolutionary game theory (also called ESS theory for "evolutionarily stable strategy") is similar to economic game theory except that the strategies compete in Darwinian fashion, as opposed to being adopted by rational choice. It was developed to explore the evolution of cooperation and was universally considered to be an individual-level alternative to group selection. For example, Dawkins (1980 p360) states: There is a common misconception that cooperation within a group at a given level of organization must come about through selection between groups...ESS theory provides a more parsimonious alternative. We will explore the relationship between game theory and group selection with a fanciful example that is based on Dawkins' rowing crew metaphor. A species of cricket has evolved the peculiar habit of scooting about the water on dead leaves in search of its resource (water lily flowers). A leaf can be propelled much better by two crickets than by one, so they scoot about in pairs. Initially they were quite awkward, but natural selection eventually endowed them with breathtaking morphological and behavioral adaptations for their task. Especially impressive is the coordination of the pair. They take their stations on each side of the leaf and stroke the water with their modified legs in absolute unison, almost as if they are part of a single organism. Coordination is facilitated by one member of the pair, who synchronizes the strokes by chirping at regular intervals. On closer examination it was discovered that the chirps not only coordinate movements but also steer the little craft. A low-pitched chirp causes the chirper to row harder and a high-pitched chirp causes the non-chirper to row harder. The captain (as the chirper came to be called) adjusts its pitch to correct for asymmetries in the shape of the leaf and also to change direction as lily pads heave into view. Either member of the pair can act as captain; the important thing is that there be only one. 
The evolution of any particular trait in this example can be examined with a 2-person game theory model. For example, consider two types (A1 and A2) that differ in their ability to synchronize with their partner's movement. If p is the frequency of A1 in the population and if pairing is at random, then three types of pairs exist (A1A1, A1A2, A2A2) at frequencies of p^2, 2p(1-p) and (1-p)^2. Coordination, and therefore fitness, is directly proportional to the number of A1 individuals in the pair, as shown by the payoff matrix in figure 3a. The fitness of the two types, averaged across pairs, is W_A1 = 5p + 4(1-p) and W_A2 = 4p + 3(1-p). This is not a very interesting game theory model because it doesn't pose a dilemma. W_A1 > W_A2 for all values of p, making it obvious that A1 will evolve. However, this should not obscure a more fundamental point, that the pair is the vehicle of selection. If we apply our procedure we find no fitness differences between individuals within a pair, in which case A1 can evolve only by causing pairs to succeed relative to other pairs. The fact that the pairs are ephemeral, perhaps lasting only a fraction of an individual's lifetime, is irrelevant. Persistence is a requirement for replicators, not vehicles. Coordination evolves among the individuals for exactly the same reason that it evolves among genes within individuals, because they are "in the same boat" as far as fitness differences are concerned. More generally, evolutionary game theory deploys a metapopulation model, in which individuals exist within groups that exist within a population of groups. When this elementary fact is recognized, Dawkins' statement quoted above looks just as absurd as West-Eberhard's statement about the social insects. 
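The averaging step in these equations is easy to verify directly. A minimal sketch (ours; the payoff entries follow the text's description of figure 3a, where both members of an A1A1 pair earn 5, both members of a mixed pair earn 4, and both members of an A2A2 pair earn 3):

```python
# Average fitness across randomly formed pairs, per the figure 3a payoffs.
def avg_fitness(p):
    """p is the population frequency of A1; partners are drawn at random."""
    w_a1 = 5 * p + 4 * (1 - p)   # A1 meets A1 with prob p, A2 with prob 1-p
    w_a2 = 4 * p + 3 * (1 - p)
    return w_a1, w_a2

# W_A1 exceeds W_A2 by exactly 1 at every frequency, so A1 always evolves:
assert all(avg_fitness(q / 10)[0] > avg_fitness(q / 10)[1] for q in range(11))
# Yet within any single pair the two occupants earn identical payoffs;
# A1 spreads only because A1-containing pairs outperform other pairs.
```

The point of the text survives the arithmetic: averaging across pairs produces a well-defined "individual fitness" for each type even though every fitness difference in the model is a difference between pairs, not within them.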
Cooperation evolves by group-level selection in a game theory model as surely as cooperation among genes evolves by individual-level selection in a standard population genetics model.14 In fact the two models are mathematically identical; we can go from one to the other merely by relabelling A1 and A2 as "alleles" rather than as "individuals" and calling the pair a zygote (Hamilton 1971, Holt 1983, Maynard Smith 1987, Wilson 1983, 1989, 1990). Continuing our example, suppose that a mutant type (A3) arises that rushes onto the lily pad at the moment of arrival, kicking the boat away and setting its hapless partner adrift. If both members of the pair are the A3 type, however, they collide and have a probability of drowning. The pay-off matrix for this situation is shown in figure 3b and the average fitness of the two types is W_A1 = p(5) + (1-p)(0) and W_A3 = p(10) + (1-p)(2). This model is more interesting because it constitutes a social dilemma. A3 evolves despite the fact that it disrupts group-level functional organization. Applying our procedure, we find that the nasty behavior is favored by within-group selection; A3 is more fit than A1 within pairs. Cooperation, as before, is favored by between-group selection; A1A1 and A1A3 pairs are more fit than A3A3 pairs. By renaming the individuals "alleles" and the pairs "zygotes", we have the example of meiotic drive described on page 14. Continuing our example, suppose that a new mutant (A4) arises that can remember the previous behavior of its partner. It acts honorably toward new partners and thereafter imitates its partner's previous behavior. This is the famous Tit-for-Tat strategy (Axelrod and Hamilton 1981) that can evolve above a threshold frequency, given a sufficient probability of future interactions (fig 3c). Applying our procedure, we find that natural selection still favors A3 over A4 within pairs because A4 loses during the first interaction. 
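Under the same random-pairing assumption, the A1-versus-A3 dilemma can be tabulated the same way. This sketch is ours; the payoff entries are read off the fitness expressions in the text (5 and 0 for A1 against A1 and A3; 10 and 2 for A3 against A1 and A3):

```python
# Figure 3b payoffs as described in the text: keys are (self, partner).
PAYOFF = {("A1", "A1"): 5, ("A1", "A3"): 0,
          ("A3", "A1"): 10, ("A3", "A3"): 2}

def avg_fitness(p):
    """Average fitness of A1 and A3 under random pairing; p = freq. of A1."""
    w_a1 = p * PAYOFF[("A1", "A1")] + (1 - p) * PAYOFF[("A1", "A3")]
    w_a3 = p * PAYOFF[("A3", "A1")] + (1 - p) * PAYOFF[("A3", "A3")]
    return w_a1, w_a3

# Averaged across pairs, A3 wins at every frequency, so the nasty type evolves:
assert all(avg_fitness(q / 10)[1] > avg_fitness(q / 10)[0] for q in range(11))
# Applying the vehicle procedure instead: within a mixed pair, A3 beats A1...
assert PAYOFF[("A3", "A1")] > PAYOFF[("A1", "A3")]
# ...while between pairs, A1A1 (total 10) and A1A3 (total 10) out-produce
# A3A3 pairs (total 4):
assert 2 * PAYOFF[("A1", "A1")] > 2 * PAYOFF[("A3", "A3")]
```

The single averaged number hides exactly the decomposition the text insists on: A3's advantage lives entirely within pairs, and cooperation's advantage lives entirely between them.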
A4 reduces but does not eliminate its fitness disadvantage within groups by changing its behavior, and it evolves only because groups of A4A4 outperform groups of A4A3 and A3A3.15 Finally, suppose that yet another mutant arises (A5) that grabs hold of its partner with one of its free legs, preventing it from leaping prematurely onto the lily pad. The pay-off matrix for A5 vs. A3 is shown in figure 3d. Applying our procedure, we find that fitness differences within groups have been eliminated while between-group selection still favors A5A5 and A5A3 over A3A3. A5 is like a dominant allele in the sense that A5A5 and A5A3 groups are phenotypically identical. Within-group selection has been eliminated by an evolved trait. Once again the pair has achieved a harmony and coordination that invites comparison with an organism, but with some safeguards built in, similar to the rules of fair meiosis at the genetic level. How was it possible for Dawkins and virtually all other evolutionary biologists to regard game theory as an individualistic theory that does not require group selection? The answer is that groups were treated as "environments" inhabited by individuals, in exactly the same sense that Williams regarded individuals as "environments" inhabited by genes. Averaging the fitness of individual types across groups combines selection at all levels into a single measure of "individual fitness" that correctly predicts the outcome of natural selection but loses sight of the vehicles that natural selection actually acts upon. Selection can operate entirely at the group level (as it does in figure 3a and d) and still be represented in terms of individual fitnesses simply because the average A1 (or A5) is more fit than the average A2 (or A3). This definition of what "individual selection" favors is synonymous with "anything that evolves, regardless of the vehicles of selection". 
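The Tit-for-Tat argument can be sketched as a tiny iterated game. This is our illustration, not the authors' model: the per-round payoffs reuse the figure 3b numbers (mutual cooperation 5, exploited cooperator 0, successful defector 10, mutual defection 2), and a fixed number of rounds stands in for the probability of future interactions.

```python
# Iterated leaf game with figure-3b-style per-round payoffs (assumed values).
def play(strat_a, strat_b, rounds=10):
    """Total payoffs for two strategies over repeated interactions.
    A strategy maps the partner's previous move ('C'/'D'/None) to a move."""
    pay = {("C", "C"): (5, 5), ("C", "D"): (0, 10),
           ("D", "C"): (10, 0), ("D", "D"): (2, 2)}
    total_a = total_b = 0
    last_a = last_b = None
    for _ in range(rounds):
        move_a, move_b = strat_a(last_b), strat_b(last_a)
        pa, pb = pay[(move_a, move_b)]
        total_a, total_b = total_a + pa, total_b + pb
        last_a, last_b = move_a, move_b
    return total_a, total_b

def tit_for_tat(prev):      # A4: cooperate first, then copy the partner
    return "C" if prev is None else prev

def always_defect(prev):    # A3: defect unconditionally
    return "D"

tft_score, defector_score = play(tit_for_tat, always_defect)
# Within a mixed pair, A3 still beats A4 (A4 loses the first round)...
assert defector_score > tft_score
# ...but A4's deficit is only that one-round loss, and A4A4 pairs far
# outperform A3A3 pairs, so A4 evolves by between-pair selection:
assert play(tit_for_tat, tit_for_tat)[0] > play(always_defect, always_defect)[0]
```

Run under these assumptions, the mixed pair scores 18 vs. 28, while A4A4 and A3A3 pairs score 50 and 20 per member: exactly the pattern the text describes, a reduced within-group disadvantage overcome by a large between-group advantage.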
Of course, individuals are not replicators and we can make them disappear along with groups by averaging the fitness of genes across all contexts, arriving at a definition of "gene selection" as "anything that evolves, regardless of the vehicles of selection". These bloated definitions of individual and gene selection have misled a generation of biologists into thinking that natural selection almost never occurs at the level of groups. In this review we have concentrated on showing how the seemingly alternative theories of kin selection, evolutionary game theory and group selection have been united into a single theory of natural selection acting on a nested hierarchy of units. The unified theory does more than redescribe the familiar results of kin selection and game theory, however; it also predicts that natural selection can operate on units that were never anticipated by kin selection and game theory, such as multigenerational groups founded by a few individuals (e.g., Aviles 1993, Wilson 1987), large groups of unrelated individuals (Boyd and Richerson 1985, 1990a,b), and even multispecies communities (Goodnight 1990a,b; Wilson 1976,1980, 1987). For example, accounts of human evolution that are based on nepotism and reciprocity often conclude that prosocial behavior in modern humans is maladaptive because it is not confined to genetic relatives and is often given without expectation of return benefits (e.g., Ruse 1986; but see Alexander 1987). Later we will argue that these prosocial behaviors can be adaptive because group-level vehicles exist that are larger than the kin groups and very small groups modelled by kin selection and evolutionary game theory. 
We summarize our review of group selection in biology as follows: Williams' (1966) argument against group selection came in three parts: a) higher-level adaptations require higher levels of selection, b) higher levels of selection are theoretically possible but unlikely to occur in nature, c) the gene is the fundamental unit of selection because it is a replicator. The third part of this argument is irrelevant to the question of whether groups can be like individuals in the harmony and coordination of their parts. As far as we can tell, all gene-centered theorists now concede this point (e.g., Dawkins 1982, 1989, Grafen 1984, Williams 1992). Taking vehicles seriously requires more than acknowledging a few cases of group selection, however; it demands a restructuring of the entire edifice. It is a mistake to think there is one weak group-level theory and two strong individual-level theories to explain the evolution of altruism/cooperation. Rather, there is one theory of natural selection operating on a nested hierarchy of units, of which inclusive fitness and game theory are special cases. When we focus on vehicles of selection, the empirical claim that constitutes the second part of Williams' argument disintegrates but the first part remains intact. Adaptation at any level of the biological hierarchy requires a process of natural selection at that level. As might be expected from such a radical restructuring, some biologists who previously regarded group selection with contempt have found it difficult to accept this Cinderella-like reversal of fortunes. Thus, a large group of knowledgeable biologists who are perfectly comfortable with the hierarchical approach (see table 1) coexists with another large group whose members adhere to the earlier view. We think that the views of the former group are in the process of replacing the views of the latter. 
The replacement process is painfully slow, however, partly because the gene- centered view is so thoroughly entrenched and partly because the major gene-centered theorists have been reluctant to acknowledge the consequences of taking vehicles seriously. As one example, Sterelny and Kitcher (1988) manage to defend the selfish gene concept without even considering the question of whether groups can be vehicles of selection.16 We make these bold statements to provoke a response. If gene-centered theorists wish to rebut our account, let them speak in the commentary section that follows this paper. Otherwise, let the replacement process continue at a faster pace. All of the major developments that we have reviewed are over ten years old and it is time for them to be acknowledged generally. GROUP SELECTION AND HUMAN BEHAVIOR17 In his description of honey bee colonies as superorganisms, Seeley (1989 p546) wrote that "...larger and more complex vehicles have evidently proved superior to smaller and simpler vehicles in certain ecological settings. By virtue of its greater size and mobility and other traits, a multicellular organism is sometimes a better gene-survival machine than is a single eukaryotic cell...Likewise, the genes inside organisms sometimes fare better when they reside in an integrated society of organisms rather than in a single organism because of superior defensive, feeding, and homeostatic abilities of functionally organized groups." This statement applies almost as well to humans as to honeybees. Nevertheless, group-level functional organization in humans is usually portrayed as a byproduct of individual self-interest. Even the most recent evolutionary accounts of human behavior are based on Williams' first commandment and the triumph of "individual selection" in biology is often used to justify the individualistic perspective in the human behavioral sciences. 
We think that the hierarchical theory of natural selection leads to a very different conclusion. Individualism fails in biology and in the human sciences for the same reasons. As far as human evolution is concerned, group-level functional organization is not a "byproduct" of self-interest in humans any more than it is in honeybees. The metapopulation structure of human interactions is manifest; individuals live in social groups which themselves comprise a population of social groups. Even a relatively small social unit such as a village is a metapopulation of still smaller groups such as kinship units or coalitions of unrelated individuals. Genetic variation among human groups is not as great as among bee hives, but, as we will attempt to show, human cognitive abilities provide other mechanisms for concentrating natural selection at the group level, even when the groups are composed of large numbers of unrelated individuals (also see Alexander 1987, 1989, Boyd and Richerson 1985, 1990, Knauft 1991). Individualistic accounts of human behavior do not ignore these facts (e.g., Alexander 1979, 1987, 1989, 1992), but they are able to remain individualistic only by ignoring the concept of vehicles. As soon as we make vehicles the center of our analysis, group selection emerges as an important force in human evolution and the functional organization of human groups can be interpreted at face value, as adaptations that evolve because groups expressing the adaptations outcompeted other groups. The same adaptations can be and often are selectively neutral or even disadvantageous within groups. In the following sections we will sketch some of the implications of the hierarchical view for the study of human behavior. The new group selection is not a return to naive group selection. 
Some biologists have been reluctant to accept group selection in any form because they fear it will encourage the uncritical thinking of Emerson and others who simply assumed the existence of higher-level adaptations (e.g., Maynard Smith 1987a,b). Behavioral scientists may share this reluctance because every branch of the human sciences seems to have thinkers like Emerson (1960) and Wynne-Edwards (1962, 1986) who treat social groups as the unit of adaptation as if individuals and their strivings scarcely exist. We therefore want to stress, in the strongest possible terms, that these views are not supported by modern group selection theory. Consider the example within biology of the Gaia hypothesis (Lovelock 1979), which portrays the entire planet as a self-regulating organism. Even a passing knowledge of group selection theory exposes Gaia as just another pretty metaphor because planet-level adaptation would require a process of between-planet selection (Wilson and Sober 1989). Grandiose theories of human societies as organisms would be correct only if natural selection operated entirely at the society level, which no one proposes. The hierarchical theory's attention to mechanism makes it easy to discredit such "theories" both in biology and the human sciences. Groups are real. Having distanced ourselves from naive group selection, we want to stress with equal force that it is legitimate to treat social groups as organisms, to the extent that natural selection operates at the group level. Williams' first commandment ("Thou shalt not apply the adaptationist program above the individual level") is fundamentally wrong. To see this, consider a simplified situation in which natural selection acts entirely at the individual level, in which case genes within individuals become entirely cooperative and individuals within the population frequently face conflicts of interest that lead to social dilemmas. 
Employing the adaptationist program at the individual level leads to the celebrated insights that we discussed at the beginning of this paper. Employing the adaptationist program at the population level leads to the errors of naive group selection that Williams so effectively exposed. But now suppose that someone misleadingly suggests that we should not employ the adaptationist program at the individual level--that the fitness of individuals is actually irrelevant to the evolutionary process; it is only gene-level fitness that counts. This misleading advice would have us apply the adaptationist program below the level at which natural selection actually operates. In a sense, this is just what the gene's-eye view of Williams and Dawkins invites us to do. Even they do not take it seriously enough to abandon the individual's-eye view, however, since they assert the equivalence of gene fitness and individual fitness when the latter are vehicles of selection. In practice, most biologists pay passing tribute to the gene as the "fundamental" unit of selection and think about adaptation at the individual level as they always have (e.g., Grafen 1984, quoted in note 4; Maynard Smith 1987a, p 125). We submit that evolutionary biologists would be severely handicapped if they could not ask the simple question "what would a well adapted individual be like?" Yet that is the very question that is prohibited at the group level by Williams' first commandment. If commandments are needed, we suggest the following: "Thou shalt not apply the adaptationist program either above or below the level(s) at which natural selection operates". This statement avoids both the excesses of naive group selection and the excesses of naive individual and gene selection that we have outlined above. 
According to Campbell (1993 p1), the human behavioral sciences are dominated by something very similar to Williams' first commandment: Methodological individualism dominates our neighboring field of economics, much of sociology, and all of psychology's excursions into organizational theory. This is the dogma that all human social group processes are to be explained by laws of individual behavior--that groups and social organizations have no ontological reality--that where used, references to organizations, etc. are but convenient summaries of individual behavior...We must reject methodological individualism as an a priori assumption, make the issue an empirical one, and take the position that groups, human social organizations, might be ontologically real, with laws not derivable from individual psychology...One of my favorite early papers (Campbell 1958) explicitly sides with that strident minority of sociologists who assert that "Groups are real!" even though it finds human organizations "fuzzier" than stones or white rats. The hierarchical theory of natural selection provides an excellent justification for regarding groups as "real". Groups are "real" to the extent that they become functionally organized by natural selection at the group level. However, for traits that evolve by within-group selection, groups really should be regarded as by-products of individual behavior. Since group selection is seldom the only force operating on a trait, the hierarchical theory explains both the reality of groups that Campbell emphasizes and the genuinely individualistic side of human nature that is also an essential part of his thinking.18 Altruism and organism. Group selection is often studied as a mechanism for the evolution of altruism. We have also seen that groups become organisms to the extent that natural selection operates at the group level. Although the concepts of altruism and organism are closely related, there is also an important difference. 
Altruism involves a conflict between levels of selection. Groups of altruists beat groups of nonaltruists, but nonaltruists also beat altruists within groups. As natural selection becomes concentrated at the group level, converting the group into an organism, the self-sacrificial component of altruism disappears. In other words, an object can be an organism without its parts behaving self-sacrificially. The distinction between altruism and the interactions among parts of an organism is illustrated by our fanciful cricket example. The four pay-off matrices in figure 3 represent a) pure between-group selection, b) strong conflict between levels of selection, c) weak conflict between levels of selection, and d) a return to pure between-group selection. Within-group selection is absent from the first example by virtue of the situation, since coordination has an equal effect on both occupants of the leaf. Within-group selection is absent from the fourth example by virtue of an adaptation, since the "outlaw" A3 type cannot operate in the presence of the "parliament" A5 type. It might seem that group-level adaptations would be easiest to recognize in group-level organisms. Ironically, the opposite is true, at least from the individualistic perspective. Individualists acknowledge group-level adaptations when they are easily exploited within groups, but when they are protected, or when exploitation is not possible by virtue of the situation, group-level adaptations are seen as examples of individual self-interest, despite the fact that they evolve purely by between-group selection and result in total within-group coordination. Payoff matrices such as 3a and 3d are seldom even considered by game theorists because their outcome is so obvious. In the absence of fitness differences within groups, any amount of genetic variation between groups is sufficient to select for A1 and A5, including the variation that is caused by random pairing. 
It is only by adding within-group selection that we can generate the social dilemmas that are deemed interesting enough to model. But A1 and A5 should not be viewed as examples of self-interest just because they easily evolve! As we have seen, groups are the vehicles of selection in these examples as surely as individuals are the vehicles in standard Darwinian selection. To call A1 and A5 examples of self-interest is to place them in the same category as A3, which evolves by within-group selection and disrupts group-level organization. Putting it another way, by lumping together the products of within- and between-group selection, the individualistic perspective does not distinguish between the outlaw and the parliament, turning "self-interest" into a concept that is as empty as it is universal. Failure to recognize group-level adaptation in the absence of altruism extends far beyond game theory. We present an example from Alexander (1987) in detail, in part because he is one of the most influential biologists writing on human evolution. Alexander envisions moral systems as levelers of reproductive opportunities within groups: The tendency in the development of the largest human groups, although not always consistent, seems to be toward equality of opportunity for every individual to reproduce via its own offspring. Because human social groups are not enormous nuclear families, like social insect colonies, ...competition and conflicts of interest are also diverse and complex to an unparalleled degree. Hence, I believe, derives our topic of moral systems. 
We can ask legitimately whether or not the trend toward greater leveling of reproductive opportunities in the largest, most stable human groups indicates that such groups (nations) are the most difficult to hold together without the promise or reality of equality of opportunity (p69).19 Alexander explicitly compares human moral systems to the genetic rules of meiosis that eliminate fitness differences within individuals: A corollary to reproductive opportunity leveling in humans may occur through mitosis and meiosis in sexual organisms. It has generally been overlooked that these very widely studied processes are so designed as usually to give each gene or other genetic subunit of the genome...the same opportunity as any other of appearing in the daughter cells...It is not inappropriate to speculate that the leveling of reproductive opportunity for intragenomic components--regardless of its mechanism--is a prerequisite for the remarkable unity of genomes...[p69] Since the rules of meiosis concentrate natural selection at the individual level, producing individual-level organisms, moral rules must concentrate natural selection at the group level, producing group-level organisms--right? Wrong. Here is Alexander's verdict on group selection: Finally, many easily made observations on organisms indicate that selection is most effective below group levels. These include such things as evidence of conflicts among individuals within social groups, failure of semelparous organisms (one-time breeders) to forego reproduction when resources are scarce, and strong resistance to adopting nonrelatives by individuals evidently long evolved in social groups. None of these observations is likely if the individual's interests are consistently the same as those of the group or if, to put it differently, allelic survival typically were most affected by selection at the group level (p37-8). 
All of these examples involve altruistic traits that are highly vulnerable to exploitation within groups. The only evidence that Alexander will accept for group selection is extreme self-sacrifice. Somehow, Alexander manages to combine a strong emphasis on between-group competition and opportunity leveling within groups with a belief that group selection can be dismissed and that everything, parliaments and outlaws alike, is a product of self-interest.20 To make matters worse, Alexander speaks for the majority of biologists interested in human behavior. For example, here is Daly and Wilson's (1988 p254) tidy statement about human morality: If conscience and empathy were impediments to the advancement of self-interest, then we would have evolved to be amoral sociopaths. Rather than representing the denial of self-interest, our moral sensibilities must be intelligible as means to the end of fitness in the social environment in which we evolved. We hope the reader recognizes the familiar pattern of treating groups as "environments" inhabited by individuals and defining self-interest as "anything that evolves" without any consideration of vehicles. Alexander, Daly and Wilson join the anti-group selection chorus and then provide dozens of examples of human groups as vehicles of selection without ever acknowledging what gene-centered theorists have already conceded--that group selection is a "vehicle" question. Alexander's theory of moral systems can be rebuilt on the foundation of vehicles as follows: Human adaptations can evolve along two major pathways: a) by increasing the fitness of individuals relative to others within the same social group, and b) by increasing the fitness of social groups as collectives, relative to other social groups. Both pathways have been important in the evolution of the psychological mechanisms that govern human behavior. Sometimes group selection is important just by virtue of the situation. 
For example, the only way to defend a village might be to build a stockade, which benefits the collective by its nature. We are not surprised to see villagers building stockades, even when they are genetically unrelated to each other. We are not surprised when they coordinate their efforts in ways that invite comparison to a single organism. Nor do we regard them as especially morally praiseworthy as they feverishly work to save their collective skins. But building the stockade is not selfish just because it is reasonable. Applying our procedure, we find that the village is the vehicle of selection. We expect the stockade to be built for the same reason that we expect A1 and A5 to evolve in the game theory models; because in this particular situation group-level selection is very strong relative to within-group selection. If we define behaviors on the basis of fitness effects (as all evolutionists do), and if we want our terminology to reflect the vehicle(s) that natural selection acts upon, we should call stockade-building groupish, not selfish. Many other situations in human life provide opportunities for adaptation via the first pathway, by increasing the fitness of individuals relative to others within the same social group. Even with our stockade example we can imagine a temptation to selfishly cultivate one's own garden or romantic possibilities as others build the stockade. The use of the word selfish is fully appropriate here because the individual is the vehicle of selection whose behaviors tend to disrupt group-level functional organization. The balance between levels of selection is not determined exclusively by the situation, however. Adaptive human behavior not only reflects the balance between levels, but also can alter the balance between levels. Moral sentiments and moral social systems may function as "rules of meiosis" that often concentrate fitness differences, and therefore functional organization, at the group level.
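The logic of the two pathways can be made concrete with a toy two-level selection model. The sketch below is our own illustration, not anything from the original text; the function name, the payoff parameters (benefit b, cost c, baseline fitness), and the starting group compositions are all assumptions chosen for clarity. "Groupish" types pay a cost within their group but raise their whole group's output, so their share falls inside every group even while their share of the total population rises.

```python
# Toy two-level selection model (illustrative parameters, not from the text):
# altruists pay cost c within their group but add shared benefit b to it.
def next_generation(n_alt, n_self, b=5.0, c=1.0, base=10.0):
    """Return new (altruist, selfish) counts for one group after selection."""
    n = n_alt + n_self
    group_benefit = b * n_alt / n          # benefit shared by every member
    w_alt = base + group_benefit - c       # altruists also pay the cost c
    w_self = base + group_benefit          # free-riders pay nothing
    scale = 1.0 / base                     # offspring per unit of fitness
    return n_alt * w_alt * scale, n_self * w_self * scale

groups = [(20, 80), (80, 20)]  # one mostly selfish group, one mostly altruist
before = sum(a for a, s in groups) / sum(a + s for a, s in groups)
groups = [next_generation(a, s) for a, s in groups]
after = sum(a for a, s in groups) / sum(a + s for a, s in groups)

# Within each group the altruist share falls (w_alt < w_self), yet the
# altruist-rich group grows faster, so the global altruist share can rise.
print(before, after)
```

With these (assumed) numbers, within-group selection disfavors the altruists in both groups, but between-group selection more than compensates; which force wins depends entirely on the parameter balance, which is the point of the passage above.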
This is the core of Alexander's thesis. When stated in terms of vehicles, however, Alexander's theory acquires a familiar and conventional ring that is absent from his own account. Moral systems are defined as "social organizations designed to maximize the benefit of the group as a collective." Immoral behaviors are defined as "behaviors that benefit individuals at the expense of other individuals within the same group." These are close to the concepts of moral and immoral behavior in folk psychology.21 The shock value of Alexander's account, in which the gentle reader is made to face the grim reality that all is self-interest, evaporates when we realize that for Alexander, self-interest is everything that evolves, at all levels of the biological hierarchy.22 We will return to moral systems with an empirical example, but first we must consider the important issue of psychological motivation. Psychological selfishness and its alternatives. Dawkins portrays genes as psychologically selfish entities that manipulate their environment, including the genotypic environment in which they reside, to increase their own fitness. This image is obviously metaphorical, allowing Dawkins to use a familiar human reasoning process to describe the outcome of natural selection. The metaphor is relatively innocuous because there is no danger that it can be taken literally. No one believes that genes are intentional systems of any sort, much less systems motivated by self-interest. Frame-shifting upward, it is possible to portray individuals as psychologically selfish entities that manipulate their environment, including the social environment in which they reside, to increase their own fitness. This image of "selfish individuals" may also be metaphorical but it is more insidious because it can be taken literally.
In other words, it is possible to believe that individuals really are intentional systems motivated entirely by self-interest and this is, in fact, the individualistic perspective that pervades the human sciences. To distinguish mechanisms from metaphors, it is useful to think of a psychological motive as a strategy in the game theoretic sense, which produces a set of outcomes when it interacts with itself and with other strategies. Thus, a psychologically selfish individual (however defined) will be motivated to behave in certain ways with consequences for itself and others. A psychologically altruistic individual (however defined) will be motivated to behave in other ways with a different set of payoffs. Within an evolutionary framework, the empirical claim that individuals are motivated entirely by self-interest must be supported by showing that the psychologically selfish strategy prevails in competition with all other strategies. Psychological motives have seldom been analyzed in this way (but see Frank 1988, Alexander 1987) and we suggest that it will be a productive line of inquiry in the future. We also predict that two general conclusions will emerge: First, it is extremely unlikely that any single strategy will prevail against all other strategies. Even the famous Tit-for-Tat strategy, which is robust in the narrow context of Axelrod's (1980a,b) computer tournaments, is vulnerable to a host of other strategies in more complex and realistic environments (e.g., Boyd and Lorberbaum 1987, Dugatkin and Wilson 1991, Feldman and Thomas 1987, Peck and Feldman 1986). Thus, any monolithic theory of proximate motives is destined to fail, including the monolithic theory of psychological selfishness. We should expect a diversity of motives in the human repertoire that is distributed both within and among individuals. Second, the very opposite of psychological selfishness can be highly successful, especially when natural selection operates at the group level.
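The fragility of Tit-for-Tat outside Axelrod's idealized setting is easy to demonstrate. The sketch below is our own minimal iterated prisoner's dilemma, using the standard Axelrod payoffs (T=5, R=3, P=1, S=0); the noise level and round count are illustrative assumptions. A single mis-executed move between two Tit-for-Tat players triggers a chain of mutual retaliation, dragging both scores well below the all-cooperate baseline of 3 points per round.

```python
import random

# Standard iterated prisoner's dilemma payoffs: (my payoff, opponent's payoff).
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(history):           # cooperate first, then copy the opponent
    return history[-1] if history else 'C'

def play(strat_a, strat_b, rounds=200, noise=0.0, rng=None):
    """Total payoffs when each move is flipped with probability `noise`."""
    rng = rng or random.Random(0)   # fixed seed keeps the sketch reproducible
    hist_a, hist_b = [], []         # each side's record of the OTHER's moves
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strat_a(hist_a), strat_b(hist_b)
        if rng.random() < noise: a = 'D' if a == 'C' else 'C'
        if rng.random() < noise: b = 'D' if b == 'C' else 'C'
        pa, pb = PAYOFF[(a, b)]
        score_a += pa
        score_b += pb
        hist_a.append(b)
        hist_b.append(a)
    return score_a, score_b

clean, _ = play(tit_for_tat, tit_for_tat, noise=0.0)
noisy, _ = play(tit_for_tat, tit_for_tat, noise=0.05)
print(clean, noisy)   # errors set off retaliation echoes that depress scores
```

This is only one of the "more complex and realistic environments" the cited papers consider; more forgiving strategies (e.g., generous or contrite variants) recover from such echoes and can displace Tit-for-Tat, which is the paragraph's point that no single strategy prevails.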
To see this, consider an individual who identifies so thoroughly with his group that he doesn't even consider the possibility of profiting at the expense of his fellows. This individual will be vulnerable to exploitation by members of his own group who are less civic-minded. But groups of individuals who think in this way will probably be superior in competition with other groups whose members are less civic-minded. It follows that intense between-group competition will favor psychological mechanisms that blur the distinction between group and individual welfare, concentrating functional organization at the group level. Alexander (1988) himself provides a good example in a review of Richards (1987) when he describes his own military experience: In the army in which I served one was schooled so effectively to serve the welfare of his unit (community?) that not only the contract altruism that Richards says is inferior to his "pure" altruism, but the intent that he requires, both disappear in a kind of automaticity that ceases to involve any deliberateness, either in maintenance of the contract signed when drafted or enlisted, or in explicitly serving the rest of one's unit [p443]. Quibbles about the definition of altruism aside, nothing more is required to convert a social group into an organism. Critics may argue that the selfless attitude of a well-trained soldier is not adopted by individual choice but imposed by an indoctrination process and reinforced by sanctions against disloyalty that make it disadvantageous to cheat. We disagree in two ways. First, individuals are not always drafted into these groups and often rush to join them, enthusiastically embracing the doctrine, refraining from cheating and enforcing the sanctions against others. Their self-interest is not taken from them but willingly abandoned.
Second, even when imposed, indoctrination and sanctions are best regarded as group-level rules of meiosis that reduce the potential for fitness differences within groups, concentrating functional organization at the group level. An entity can be an organism without the parts behaving self-sacrificially (for an evolutionary model of psychological altruism per se, see Frank 1988). Since humans have lived in small groups throughout their history, it is reasonable to expect the evolution of psychological mechanisms that cause them to easily become "team players" in competition with other groups. We do not expect these to be the only motives that guide human behavior, but rather a module that is facultatively employed under appropriate conditions. In fact, there is abundant empirical evidence that humans coalesce into cooperative teams at the merest suggestion of a metapopulation structure in which groups can compete against other groups (e.g., Dawes et al 1988, Hogg and Abrams 1988, Sherif et al 1961, Tajfel 1981). Members of the same group often share a feeling of high regard, friendship and trust that is based not on any prior experience but merely on the fact that they are members of the same group. Exploitation within groups is often avoided even when opportunities are experimentally provided without any chance of detection (e.g., Caporael et al 1989). Group formation is as spontaneous in children as in adults (e.g., Sherif et al 1961). These are the earmarks of an evolved "Darwinian algorithm" (sensu Cosmides and Tooby 1987) that predisposes humans for life in functionally organized groups. The algorithm appears paradoxical only when we consider its vulnerability to more selfish algorithms within groups. The advantages at the group level are manifest. It is important to stress that we have not merely converged on a view that is already well accepted within the human sciences.
Proponents of alternatives to psychological selfishness are better described as an embattled minority who must constantly defend themselves against a monolithic individualistic world view (e.g., Batson 1991, Caporael et al 1989, Campbell 1993, Mansbridge 1990, Simon 1991). As one example, most economists assume that individuals act in the interest of the company that employs them only because the company pays them enough to make it worthwhile from the standpoint of the individual's personal utility. According to Simon (1991), real people who are satisfied with their jobs do not distinguish between their own and their company's utility, but rather adopt the company's interest as their own interest. Even the lowest level employees make executive decisions that require asking the question "what is best for the company?" and which go far beyond the actual requirements of the job. In fact, one of the most effective forms of protest by dissatisfied employees is "work to rule," in which people perform their jobs to the letter and the company comes to a grinding halt. In modern life as in ancient times, group-level function requires individuals who to a significant degree take the group's goals as their own. This is a radical proposal within economics, however, and not the majority view. Group-level cognition. Goal-oriented behaviors are typically accomplished by a feedback process that includes the gathering and processing of information. While the entire process can be described as intentional (e.g., the wolf tries to catch the deer), the elements of the process cannot (the neuron does not try to fire; it merely does fire when stimulated at enough synapses; Dennett 1981). We are accustomed to regarding individuals as intentional systems with their own self-contained feedback processes. Group selection raises another possibility in which the feedback process is distributed among members of the group.
We have already provided an example for honeybee colonies in which individuals behave more like neurons than as intentional agents in their own right. Similar examples have scarcely been considered for humans and our main purpose here is to define the question, rather than answer it. Modern governmental and judicial systems are sometimes designed to produce adaptive outcomes at the level of the whole system but not at the level of the component individuals. Science is sometimes portrayed as a similar process that generates knowledge only at the group level (e.g., Hull 1988, Kitcher 1993). The invisible hand metaphor in economics invokes the image of an adaptive system that organizes itself out of neuron-like components, although the metaphor is more often stated as an ideology than as a testable research program.23 The invisible hand notwithstanding, research on decision making in small groups reveals a complex process that does not always yield adaptive solutions (Hendrick 1987a,b). Groups even make decisions that would be regarded as foolish by every member of the group (Allison and Messick 1987). This research is important because it shows that intelligent individuals do not automatically combine to form intelligent groups. Adaptive decision-making at the small group level may require a highly specified cognitive division of labor. Since decision-making has occurred in small groups throughout human history, it is reasonable to expect "Darwinian algorithms" that cause individuals to relinquish their capacity to act as autonomous intentional agents and adopt a more limited role in a group-level cognitive structure. The architecture of group-level cognition might simply take the form of "leaders" who act as self-contained intentional agents and "followers" who abide by the decisions of others. Alternatively, even so-called "leaders" may be specialists in a feedback process that is distributed throughout the group.
These questions can only be asked by recognizing the group as a potentially adaptive entity. An example of a human group-level organism. We conclude by providing a possible example of extreme group-level functional organization in humans and the background conditions that make it possible. The Hutterites are a fundamentalist religious sect that originated in Europe in the sixteenth century and migrated to North America in the nineteenth century to escape conscription. The Hutterites regard themselves as the human equivalent of a bee colony. They practice community of goods (no private ownership) and also cultivate a psychological attitude of extreme selflessness. The ultimate Hutterite virtue is "Gelassenheit," a word that has no English equivalent, which includes "the grateful acceptance of whatever God gives, even suffering and death, the forsaking of all self-will, all selfishness, all concern for private property" (Ehrenpreis 1650/1978). Nepotism and reciprocity, the two principles that most evolutionists use to explain prosocial behavior in humans, are scorned by the Hutterites as immoral. Giving must be without regard to relatedness and without any expectation of return. The passion for selflessness is more than just sermonizing and frequently manifests itself in action. For example, Claus Felbinger's "Confession of Faith" (1560/1978) provides an eloquent statement that a Hutterite blacksmith gave to Bavarian authorities after their efforts to make him recant had failed and before they executed him for his beliefs. The extreme selflessness of the Hutterites can be explained in at least three ways. First, many authors, both inside and outside of biology, think of culture as a process that frequently causes humans to behave in ways that are biologically maladaptive. By this account, the Hutterites are influenced by (unspecified) cultural forces and their behavior cannot be explained by any biological theory, including the theory of group selection.
Second, some evolutionists have tried to explain widespread altruism in humans as a product of manipulation, in which the putative altruists are essentially duped into behaving against their own interests for the benefit of the manipulator (e.g., Dawkins 1982, 1989). If people can be fooled into believing that a life of sacrifice will lead to a pleasant afterlife, for example, then perpetrating that belief in others becomes an example of individual self-interest. By this account, we might expect some Hutterites (such as the leaders) to profit at the expense of their duped brethren. Third, it is possible that humans have evolved to willingly engage in selfless behavior whenever it is protected by a social organization that constitutes a group-level vehicle of selection. The relatively small group-level vehicles of kinship groups and cooperating dyads are already well recognized. The hypothesis we wish to explore asserts that the Hutterites constitute a less familiar case in which the vehicle is a relatively large group of individuals and families that are genetically unrelated to each other. If this interpretation is correct, then group selection theory should be able to predict some major features of Hutterite social organization and ideology, despite the fact that it is stated in purely religious terms. In particular, the prediction is that the bee-like behavior of the Hutterites is promoted by a social organization and ideology that nearly eliminates the potential for individuals to increase their fitness relative to others within groups.24 Note that the other two interpretations do not make the same prediction. If Hutterite society is governed by independent cultural forces, it is unlikely to have the specific design features of a group-level vehicle. And if selflessness is a product of manipulation, we should find fitness differences between the puppets and the puppeteers. A number of caveats are in order before proceeding.
First, we do not claim to rigorously distinguish among the above three explanations in the confines of this review article. The best that we can do is provide a brief sketch, which is nevertheless important because it makes the preceding discussion less abstract and gives an idea of what a group-level vehicle of selection might look like in humans. Second, we do not regard the Hutterite social organization as a direct product of group selection. Rather, we conjecture that group selection has operated throughout human history, endowing the human psyche with the ability to construct and live within group-level vehicles of the sort exhibited by the Hutterites.25 This enables us to make another prediction, that the Hutterite social organization is not unique but represents a fairly common type of social organization in ancestral environments. Otherwise it could not be interpreted as an evolved adaptation. As one of many possible social organizations in the human repertoire, this one is presumably evoked only under appropriate environmental conditions, yielding another set of testable predictions. Third, evolutionary psychologists rely on fitness maximizing arguments to explain the human psyche, but they do not necessarily expect humans to maximize biological fitness in present day environments. This is because, to the extent that humans are "programmed" by natural selection, it is not to maximize biological fitness per se but only to achieve the more proximate goals that led to high fitness in ancestral environments. Thus, we must focus more on the design features and what they would have meant in ancestral environments than on the present day consequences of the design features (Symons 1992). This is a general issue in evolutionary psychology that applies to the Hutterites as well as any other group. With these caveats in mind, we now will elaborate the idea that Hutterite society is a group-level vehicle of selection.
Although their ideology is stated in purely religious terms, it is clearly designed to suppress behaviors that benefit some individuals at the expense of others within groups: That is what Jesus means by His parable of the great banquet and the wedding of the king's son, when the servants were sent to call all the people together. Why did his anger fall on those who had been invited first? Because they let their private, domestic concerns keep them away. Again and again we see that man with his present nature finds it very hard to practice true community; true community feeds the poor every day at breakfast, dinner, and the common supper table. Men hang on to property like caterpillars to a cabbage leaf. Self-will and selfishness constantly stand in the way! [Ehrenpreis 1650/1978 p 11-12] Benefitting the group is exalted as highly as selfishness within groups is reviled: Where there is no community there is no true love. True love means growth for the whole organism, whose members are all interdependent and serve each other. That is the outward form of the inner working of the Spirit, the organism of the Body governed by Christ. We see the same thing among the bees, who all work with equal zeal gathering honey; none of them hold anything back for selfish needs. They fly hither and yon with the greatest zeal and live in community together. Not one of them keeps any property for itself. If only we did not love our property and our own will! If only we loved the life of poverty as Jesus showed it, if only we loved obedience to God as much as we love being rich and respected! If only everybody did not hang on to his own will! Then the truth of Christ's death would not appear as foolishness. Instead, it would be the power of God, which saves us. [Ehrenpreis 1650/1978 p12-13] Thus the Hutterites are as explicit as they can possibly be that their members should merge themselves into a group-level organism. 
They are also explicit about how group-level functional organization can be accomplished. In the first place, the Hutterites believe that selfishness is an innate part of human nature that can never be fully eradicated: The sinner lies in all of us; in fact to sin, to be selfish, is our present inclination. Left to ourselves we shall end up in damnation, but this does not mean that salvation cannot be attained. On the contrary, salvation is possible on three conditions: we live according to the life of Christ; we live in community; we strive very hard to attain salvation and are prepared to suffer for our efforts. Christ appeared to save us from our sinful nature. This nature is not easily abjured but it can be if we try hard enough, both in the sense of personal determination and in the sense of collectively living according to the Word (Shenker 1986 p73). If we were to translate this sentiment into evolutionary language, we would arrive at the claim that within-group selection has been a powerful (but not the only) force in human evolution and has stamped itself upon the human psyche. To the extent that humans are the products of natural selection, they are inclined to benefit themselves at the expense of others within their group whenever it is evolutionarily advantageous to do so (at least in ancestral environments). To create a group-level organism, the part of human nature that has evolved by within-group selection must be constrained by a social organization that plays the same functional role as the genetic rules of meiosis. The most important ingredient of this social organization is evidently a sense of "shared fate": The community can "hang together" only through the members having an identity of fate. In practice this means two things. Members must identify with the past and (more important) with the future of the community, such that their own future and the community's future are one and the same. We rise and fall together.
This is another way of saying we have unconditional commitment to our community. We do not say "if the community does or achieves such and such, then I will stay, otherwise I won't", since this implies that there is an individual identity ontologically and morally distinct from the community's. No true community could operate successfully or manifest its raison d'etre with such limiting conditions or separate identities. Identity of fate also means that members relate to each other in an atmosphere of mutual trust, i.e. they consider their presence to stem from a common desire to express their humanity and recognize that this can only be achieved through mutual effort. Should one person claim that he has an inherent right to gain for himself at the expense of others, the entire fabric collapses. Life in the community presupposes that each will work for the benefit of others as much as for himself, that no-one will be egoistic. The moment this assumption is undermined, mutual suspicion, jealousy and mistrust arise. Not only will people probably consider themselves silly for being self-righteous while others are feathering their nest, but operationally the community will have to take a different character (primarily through the use of coercion) and the entire moral nature of the community disappears. (Shenker 1986 p93) We could not ask for a stronger correspondence between the sentiment expressed in this passage and the concept of "vehicles" in a group selection model. One way to establish a sense of shared fate is via egalitarian social conventions that make it difficult to benefit oneself at the expense of others. Hutterite society is elaborately organized along these lines. In addition to practicing community of goods, they discourage individuality of any sort, for example, in the context of personal appearance and home furnishings. Leaders are elected democratically and are subject to long probationary periods before they are given their full authority.
The Hutterites' passion for fairness is perhaps best illustrated by the rules that surround the fissioning process. Like a honeybee colony, Hutterite brotherhoods split when they attain a large size, with one half remaining at the original site and the other half moving to a new site that has been pre-selected and prepared. In preparation for the split, the colony is divided into two groups that are equal with respect to number, age, sex, skills and personal compatibility. The entire colony packs its belongings and one of the lists is drawn by lottery on the day of the split. The similarity to the genetic rules of meiosis could hardly be more complete. In principle, we might imagine that a psychological egoist, who thinks only in terms of personal gain, could decide to become a Hutterite if he became convinced that the group-level benefits (which he shares) are sufficiently great and the social conventions are sufficiently strong that neither he nor anyone else in the group can act as a freeloader. The Shenker passage quoted above suggests, however, that an effective group-oriented society cannot be composed of individuals who are motivated solely by a calculus of self-interest.26 The external social conventions that make freeloading difficult are evidently necessary but not sufficient and must be supplemented by a psychological attitude of genuine concern for others; a direct calculus of group interest rather than self-interest is essential. Recall that Simon (1991; discussed on p38) makes a similar point about the behavior of individuals in business organizations. Thus, although we are focusing on the Hutterites, our discussion is not limited to esoteric communal societies, a point that we will return to below. Even with these attitudes and social conventions, however, selfishness in thought and action cannot be entirely eliminated. The Hutterites therefore have a well specified procedure for dealing with members who benefit themselves at the expense of others.
The bond of love is kept pure and intact by the correction of the Holy Spirit. People who are burdened with vices that spread and corrupt can have no part in it. This harmonious fellowship excludes any who are not part of the unanimous spirit... If a man hardens himself in rebellion, the extreme step of separation is unavoidable. Otherwise the whole community would be dragged into his sin and become party to it...The Apostle Paul therefore says, "Drive out the wicked person from among you." ... In the case of minor transgressions, this discipline consists of simple brotherly admonition. If anyone has acted wrongly toward another but has not committed a gross sin, a rebuke and warning is enough. But if a brother or a sister obstinately resists brotherly correction and helpful advice, then even these relatively small things have to be brought openly before the Church. If that brother is ready to listen to the Church and allow himself to be set straight, the right way to deal with the situation will be shown. Everything will be cleared up. But if he persists in his stubbornness and refuses to listen even to the Church, then there is only one answer in this situation, and that is to cut him off and exclude him. It is better for someone with a heart full of poison to be cut off than for the entire Church to be brought into confusion or blemished. The whole aim of this order of discipline, however, is not exclusion but a change of heart. It is not applied for a brother's ruin, even when he has fallen into flagrant sin, into besmirching sins of impurity, which make him deeply guilty before God. For the sake of example and warning, the truth must in this case be declared openly and brought to light before the Church. Even then such a brother should hold on to his hope and his faith. He should not go away and leave everything but should accept and bear what is put upon him by the Church. 
He should earnestly repent, no matter how many tears it may cost him or how much suffering it may involve. At the right time, when he is repentant, those who are united in the Church pray for him, and all of Heaven rejoices with them. After he has shown genuine repentance, he is received back with great joy in a meeting of the whole Church. They unanimously intercede for him that his sins need never be thought of again but are forgiven and removed forever. [Ehrenpreis 1650/1978 p66-9] We could not ask for a more explicit awareness of the freeloader problem and what to do about it, including the elements of retaliation and forgiveness that are also part of the tit-for-tat strategy in dyadic interactions. If we were to translate this passage into evolutionary language, it would be as follows: Altruism can succeed only by segregating itself from selfishness. Not only does the selfish individual have the highest fitness within groups, but his mere presence signifies a population structure that favors within-group selection, causing others to quickly abandon their own altruistic strategy. Fortunately, in face-to-face groups whose members are intimately familiar with each other, it is easy to detect overt forms of selfishness and exclude the offender. When "subversion from within" can be prevented to this extent, extreme altruism, in both thought and action, becomes evolutionarily advantageous. It is remarkable, and crucial for the hypothesis under consideration, that the willingness of the Hutterites to sacrifice for others is accompanied by such an elaborate set of rules that protect self-sacrifice from exploitation within groups. We suggest that there is a causal relationship here, that humans are inclined to adopt selfless behavior in social organizations that provide the functional equivalent of the genetic rules of meiosis.
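The evolutionary translation above can be put in toy numerical form. In the sketch below, every name and parameter (contribution b, cost c, group size n, detection probability p, the outside payoff for an expelled member) is our own illustrative assumption, not anything from the text. A free-rider out-earns the contributors only when detection and expulsion are sufficiently unlikely; reliable exclusion of the kind the Hutterites practice flips the comparison.

```python
# Toy exclusion model (all parameters are illustrative assumptions):
# each of n members may contribute b to a shared pool at private cost c.
def altruist_payoff(n, b=4.0, c=1.0):
    # in an all-contributor group, each receives b and pays c
    return b - c

def defector_payoff(n, p, b=4.0, w_outside=0.0):
    # free-rides on n - 1 contributors, but with probability p is
    # detected, expelled, and left with only the outside payoff
    return (1 - p) * b * (n - 1) / n + p * w_outside

n = 10
for p in (0.0, 0.1, 0.2, 0.5):
    print(p, defector_payoff(n, p) <= altruist_payoff(n))
```

With these numbers the break-even detection probability is modest (around one in six); the qualitative point is simply that "subversion from within" pays only where exclusion is unreliable, which is why the elaborate discipline described in the quoted passage matters.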
Not only do these social organizations promote selflessness at the behavioral level, but they also promote forms of thinking and feeling that would be classified as non-egoistic in a psychological sense. After all, what is the advantage of psychological selfishness if the most successful way to behave is by contributing to group-level functional organization? It is also crucial for our hypothesis that group-level functional organization is, in some sense, superior to what can be accomplished by individuals when they are free to pursue their own self-interest (recall the Seeley passage quoted on p27). This certainly appears to be the case for the Hutterites, who do not have to wait for the hereafter to get their reward. By fostering a selfless attitude towards others and minimizing the potential for exploitation within groups, they are spectacularly successful at the group level. In sixteenth century Europe they were alternately tolerated and persecuted for their economic superiority, much like the Jews, another society that, in its traditional form, is well-organized at the group level (MacDonald 1994). In present-day Canada, Hutterites thrive in marginal farming habitat without the benefit of modern technology and almost certainly would displace the non-Hutterite population in the absence of laws that restrict their expansion. The Hutterites' success can also be measured in reproductive terms, since they have the highest birth rate of any known human society (Cook 1954).27 Finally, Hutterite society is internally stable, with the majority of young people electing to remain when given a choice. Were it not for persecution and legal restrictions imposed by their host nations, Hutterite colonies would be far more common than they are now. Part of our hypothesis is that the Hutterite social organization is not a unique product of the sixteenth century but reflects an evolved human potential to construct and live within such group-level vehicles.
It might seem that the Hutterites are such an esoteric society that our prediction could not possibly be confirmed. On closer reflection, however, it appears that the functional elements of Hutterite society that act as group-level rules of meiosis are repeated in a great many social groups that place a premium on group-level performance, even though the ideologies are superficially different and the purpose of the group can be diametrically opposed to the goals of the Hutterites (e.g., an elite military group). Furthermore, according to Knauft (1991), this kind of egalitarianism characterizes hunter-gatherer groups whenever resources are too widely dispersed to allow the development of status-based societies (i.e., most human groups throughout human evolutionary history). The ethic of "good company" (which is extended to non-kin as well as kin; e.g., Knauft 1985) and the de-emphasized sense of self-interest that pervades many tribal societies does indeed resemble the Hutterite "community" and their denigration of "self-will". Another part of our hypothesis is that the human potential to build and live within group-level vehicles is facultative and evoked more strongly in some situations than in others. Group-level vehicles should be most commonly observed in situations that place a premium on group-level functional organization, such as extreme physical environments, extreme persecution, or extreme intergroup competition. In more benign situations, the consequences of social dilemmas are not so dysfunctional and the effort that goes into the maintenance of group-level vehicles may be correspondingly relaxed.28 Obviously, we have only skimmed the surface of an enormously complex and poorly understood subject. We hope we have demonstrated, however, that group selection in humans is likely to extend far beyond nepotism and narrow reciprocity. 
These two principles cannot account for the full range of prosocial behaviors in humans, and evolutionists who rely on them have been forced to invoke other factors: that prosocial behavior evolved in ancestral groups of closely related individuals and is maladaptively expressed in modern groups of unrelated individuals (Ruse 1986); that prosocial behavior is a form of manipulation whereby some individuals profit at the expense of others (Dawkins 1982, 1988); or that prosocial behavior results from cultural forces that promote biologically maladaptive behavior (Campbell 1983). Group selection theory provides a robust alternative: Even large groups of unrelated individuals can be organized in a way that makes genuinely prosocial behavior advantageous. We have emphasized group-level functional organization in humans as an antidote to the rampant individualism we see in the human behavioral sciences. But it is not our goal to replace one caricature with another. Many human groups are clearly not the "organisms" that we have described above and must be explained as the product of conflicting individual interests within the group. Evolutionary theory has the resources to understand both conflict and cooperation. Only by pursuing both problems--with the group as well as the individual as possible units of functional integration--can the human sciences come to terms with our evolutionary heritage. CONCLUSIONS Maynard Smith's most recent comment on group selection includes the following passage: It is ... perfectly justified to study eyes (or, for that matter, ribosomes, or foraging behaviors) on the assumption that these organs adapt organisms for survival and reproduction. But it would not be justified to study the fighting behavior of spiders on the assumption that this behavior evolved because it ensures the survival of the species, or to study the behavior of earthworms on the assumption that it evolved because it improves the efficacy of the ecosystem. 
(Maynard Smith 1987b p147) Maynard Smith still resists what we think is the most fundamental implication of natural selection as a hierarchical process: Higher units of the biological hierarchy can be organisms, in exactly the same sense that individuals are organisms, to the extent that they are vehicles of selection. Group organisms may be less common than individual organisms and they may be more vulnerable than individuals to subversion from within, but this must not prevent us from recognizing group-level functional organization where it exists. As the most facultative species on earth, humans have the behavioral potential to span the full continuum from organ to organism, depending on the situations we encounter and the social organizations that we build for ourselves. We often see ourselves as "organs". We sometimes identify ourselves primarily as members of a group and willingly make sacrifices for the welfare of our group. We long to be part of something larger than ourselves. We have a passion for building, maintaining and abiding by fair social organizations. The individualistic perspective seems to make all of this invisible. Because group-level functional organization can be successful, it is labelled selfish, therefore no different from the kinds of behaviors that succeed by disrupting group-level functional organization. But this is just a conjurer's trick. There are compelling intellectual and practical reasons to distinguish between behaviors that succeed by contributing to group-level organization and behaviors that succeed by disrupting group-level organization. That is what the words "selfish" and "unselfish", "moral" and "immoral" are all about in everyday language. Human behavioral scientists need to focus on these ancient concerns, rather than obscuring them with bloated definitions of "self-interest". 
A concern for within-group versus between-group processes characterizes the human mind and it should characterize the study of the human mind as well. ACKNOWLEDGEMENTS Supported by NSF SBE-9212294. DSW thanks A.B. Clark, Lee Dugatkin, Eric Dietrich, Greg Pollock and Binghamton's EEB group. NOTES 1) In this article we use the word "individual" to refer to single flesh-and-blood creatures such as a bird or a butterfly. We use the term "organism" to refer to any biological entity whose parts have evolved to function in a harmonious and coordinated fashion. 2) The purpose of this table is to provide a reasonably complete guide to the modern literature on group selection. A number of controversies exist within this literature that are beyond the scope of the present paper. For completeness we provide references for all sides of these controversies, including those with which we disagree. The philosophical literature on levels of selection has recently been reviewed by Sober and Wilson (1993). 3) Williams was only one of many biologists who reacted against group selection during the 1960s, especially in response to Wynne-Edwards's (1962) Animal Dispersion in Relation to Social Behavior. We do not mean to imply that Williams was the only articulate critic, but he has become the icon for the individualistic perspective in biology. 4) Dawkins (1982, 1989) acknowledges that the group selection controversy is a "vehicle" question but asserts that groups are almost never vehicles of selection, with the possible exception of the eusocial insects. Dawkins (1989, pp. 297-8) and Cronin (1991, p. 290) cite Grafen (1984) as the authoritative critique of group selection, but Grafen's treatment of groups as vehicles consists of a single parenthetical statement (p. 76): "(The organismal approach suggested here is not in conflict with the 'gene selectionism' of Dawkins (1982a,b). 
In his language, we are saying that the individual is usually a well-adapted vehicle for gene replication, while groups are usually not)". Williams (1986, p. 8) states that "selection at any level above the family (group selection in a broad sense) is unimportant for the origin and maintenance of adaptation. I reach this conclusion by simple inspection." More recently, Williams (1992) acknowledges that groups are vehicles in the specific cases of eusocial insects and female-biased sex ratio but does not generalize to other cases. 5) Cronin's (1991) The Ant and the Peacock belongs to the same genre as Dawkins' (1976) The Selfish Gene and Gould's (1989) Wonderful Life, in which the author attempts to make the subject accessible to a popular audience without sacrificing scholarship. As Gould (1989, p. 16) put it, "...we can still have a genre of scientific books suitable for and accessible alike to professionals and interested laypeople". Because these books are so accessible they tend to be influential even among academic audiences, which is why Cronin (1991) merits criticism despite its status as a "popular" book. Similar views can be found in the more technical gene-centered literature (references in note 4). 6) Gould (1992) criticizes Cronin's gene-centered approach and advocates a hierarchical view of evolution. However, he accepts the gene-centered framework for the evolution of altruism and does not invoke the concept of vehicles in the same sense that we do. More generally, the concept of "species selection" that Gould emphasizes is somewhat different from the concept of group selection that we review here (for a discussion of the difference, see Sober 1984). This constitutes one of the controversies within the group selection literature mentioned in note 2. 7) The term "unit of selection" has become ambiguous because it refers to both replicators and vehicles, depending on the author. 
Within the group selection literature, "unit" equals "vehicle" and no word is required for "replicator" because it is (and always was) assumed that natural selection at all levels results in gene-frequency change. We prefer the word "unit" but use the word "vehicle" in this paper to distinguish it from replicators and also to force gene-centered theorists to acknowledge the implications of their own framework. 8) We start at the lowest level and work up the hierarchy for convenience, not because it is required for the procedure. Also, unless there is uncertainty as to where fitness differences are occurring, it is not necessary to invoke Williams' (1966) concept of parsimony in this procedure. 9) Even though organisms are defined on the basis of functional coordination among their parts, functional coordination per se does not enter into our definition of vehicles, which is based purely on shared fate. This is because shared fate is the crucial property of the process of natural selection; functional coordination among the parts is a product of the process. 10) The procedure for identifying vehicles requires some precautions that can be illustrated by the following examples. First, imagine that tall individuals are more fit than short individuals regardless of how they are structured into groups. The procedure will (correctly) identify the individual as the vehicle of selection. Nevertheless, groups that contain more tall individuals than other groups will be more productive, suggesting (incorrectly) that groups are also vehicles of selection. To resolve this difficulty we must imagine placing all individuals in a single group. Tall individuals are still the most fit, demonstrating that the metapopulation structure is irrelevant. As a second example, imagine that the fitness of everyone in a group is directly proportional to the average height of the group. 
Our procedure (correctly) identifies the group as the vehicle of selection because there are no fitness differences between individuals within groups. To confirm this result, imagine placing all individuals in the same group. The fitnesses of tall and short individuals are identical, demonstrating that the metapopulation structure is necessary for tallness to evolve (Sober 1984; see also Goodnight et al 1992, Heisler and Damuth 1987, Walton 1991). Another problem arises when a trait has already evolved to fixation. To apply the procedure we must conduct a thought experiment (or a real experiment) in which alternative types are present. Although other refinements in our procedure may be needed, we believe that they don't require discussion in the present context. 11) Although female-biased sex ratios evolve by group selection, they cannot be used to assess the importance of group selection in the evolution of other traits. In other words, it does not follow that group selection can be ignored for species that have an even sex ratio. This is because the metapopulation structure must be defined separately for each trait (hence the term "trait group"; Wilson 1975, 1977, 1980). The trait group for sex ratio must persist long enough for F1 progeny to mate within the group before dispersing, a constraint that does not necessarily apply to other traits. 12) For most evolutionists, the ultimate rejection is to be labelled "non-Darwinian". In fact, Darwin's (1871) theory for the evolution of human moral sentiments is remarkably similar to the vehicle-based framework that we develop here (Richards 1987). 13) We think that an evolutionary theory of genuine vs. apparent psychological altruism is possible, but it must be based on the proximate motivations of the actor, which evolutionary accounts ignore by defining altruism and selfishness solely in terms of fitness effects. 
In other words, we must ask questions such as: "When are the behaviors motivated by a 'genuinely' psychologically altruistic individual more fit than the behaviors motivated by an 'apparently' altruistic individual?" Frank is actually one of the few authors who are asking these questions, so we are not criticizing his specific proposals about emotions as commitment devices, which make more sense within a vehicle-based framework than within a replicator-based framework. For further discussion, see Wilson (1992) and Sober (in press). 14) Frank (1988) anticipates this conclusion in the passage that we quote above, but does not pursue it further. 15) Anatol Rapoport, who submitted the Tit-for-Tat strategy to Axelrod's (1980a,b) computer tournaments, always appreciated its group-level benefit and individual-level disadvantage (e.g., Rapoport 1991). In contrast, Axelrod and the majority of evolutionary game theorists regard tit-for-tat as a strategy that succeeds "at the individual level." 16) Sterelny and Kitcher recognize that Dawkins' position cannot simply be the empty truism that evolution occurs when the genetic composition of a population changes. They claim (p. 340) that the nontrivial thesis that Dawkins advances is that "evolution under natural selection is thus a process in which, barring complications, the average ability of the genes in the gene pool to leave copies of themselves increases with time." Although this is a nontrivial claim, it is not something we find in Dawkins' writings, and in any case it isn't true as a generality. The average fitness of the alleles at a locus increases under frequency-independent selection. But when a truly selfish gene replaces an altruistic allele, the effect is to reduce average fitness. Dawkins frequently remarks that there is nothing to prevent natural selection (meaning within-group selection) from driving a population straight to extinction. 
It is also worth noting that group selection can lead the average fitness of the selected alleles to increase. A gene's ability to leave copies of itself can decline under selection as well as increase. And which turns out to occur is a separate issue from whether group selection is present or absent. 17) While group selection has been a controversial topic within biology, the entire subject of evolution has been a controversial topic when applied to human behavior. There are at least three ways that evolution in general (and group selection in particular) can influence human behavior. First, the psychological mechanisms that govern human behavior can be the product of natural selection. In its weak form this statement is uncontroversial, since everyone agrees that basic drives such as hunger, sex and pain exist because they are biologically adaptive. Some psychologists believe that the adaptationist program can be used to explain the architecture of human cognition in much greater detail, however, and this position is more controversial (e.g., Barkow et al 1992). Second, cultural change can itself be described as an evolutionary process with between- and within-group components (e.g. Boyd and Richerson 1985, Findlay 1992). Third, genetic evolution is an ongoing process that can partially explain differences between individuals and populations. Our own thinking is based primarily on the first and second influences. In other words, we think it is imperative to explore the hypothesis that group selection was a strong force during human evolution, resulting in proximate psychological mechanisms that today are universally shared and allow humans to facultatively adopt group-level adaptations under appropriate conditions. The specific nature and precision of these psychological mechanisms are empirical issues. We also propose, along with Boyd and Richerson (1985) and Findlay (1992), that group selection can be a strong force in cultural evolution. 
Thus, our position is compatible with but does not require a strong form of human sociobiology. Our point is not to prejudge the correctness of adaptationist explanations, but to urge the importance of asking adaptationist questions. Only by doing so can we find out whether and to what degree organisms are well adapted to their environments (Orzack and Sober, in press). 18) Sober (1981) discusses the relationship between methodological individualism and the units of selection controversy in more detail. 19) Opportunity levelling is not restricted to the largest human groups. According to Knauft (1992), the simplest human societies are highly egalitarian and overtly status-oriented societies require a concentrated and stable resource, such as crops or livestock. This improves Alexander's general thesis, especially if the simplest existing human societies represent ancestral conditions. 20) Alexander's views on group selection, presented in articles and books from 1974 to 1993, are difficult to represent in a single passage. When evaluating group selection in non-human species, Alexander identifies strongly with the views of Williams and Dawkins, as the passage quoted on p. 7 shows. Alexander does speculate that humans may be an exception to the rule because of extreme between-group competition and regulation of fitness differences within groups. The following passage illustrates his pro-group selection side, which is consistent with our own interpretation: "In sexually reproducing organisms, such as humans, confluences of interest within groups are likely to occur when different groups are in more or less direct competition. 
As a result, the kind of selection alluded to here [group selection] would be expected to produce individuals that would cooperate intensively and complexly within groups but show strong and even extreme aggressiveness between groups" (Alexander 1989, p. 463). However, in other passages, Alexander clearly minimizes the importance of group selection and attributes the evolution of moral behavior in humans to within-group processes. We provide his most recent statement to this effect: "Because selection is primarily effective at and below the individual level, it is reasonable to expect concepts and practices pertaining to morality--as with all other aspects of the phenotypes of living forms--to be designed so as to yield reproductive (genetic) gains to the individuals exhibiting them, at least in the environments of history" (Alexander 1993, p. 178). At a more technical level, Alexander occasionally seems to appreciate the vehicle concept when evaluating levels of selection (e.g., the 1989 passage quoted above), but more often he implicitly defines anything that evolves as "individually" advantageous, even when the group is the vehicle of selection (e.g., the discussion of Frank, 1988, in Alexander, 1993). We think that a consistent application of our procedure will reveal that Alexander is invoking groups as vehicles of selection much more than he acknowledges in his own writing. We also want to stress, however, that Alexander's views on indirect reciprocity, opportunity-leveling within groups and competition between groups remain important within a vehicle-based framework. 21) Alexander's theory is conventional in the sense of equating morality with the notion of a common good. However, calling it familiar and conventional does not belittle its importance. Evolutionary theories of human behavior frequently make predictions that correspond closely to folk psychology (e.g., that men tend to value youth in women more than women value youth in men). 
Since the intuitions of folk psychology are unlikely to be completely wrong, it would be disturbing if an evolutionary theory of human behavior were not familiar and conventional in some sense. Of course, the theory must also go beyond folk psychology by making counter-intuitive predictions, revealing aspects of folk psychology that are false, refining familiar predictions, subjecting predictions to empirical test and so on. An evolutionary account of morality (including Alexander's) does depart from folk psychology in some important respects. The organ-organism-population trichotomy implies that there will always be a level of the biological hierarchy at which social dilemmas will prevail. In other words, moral behavior within groups will frequently be used to generate immoral behavior between groups. This fits well with observed behavior but contrasts with the concept of universal morality that is common in folk psychology and some branches of the human behavioral sciences (e.g., the higher stages of Kohlberg's (1984) theory of moral development; MacDonald 1988). In addition, if moral systems function as group-level rules of meiosis, it becomes difficult to explain the concept of individual rights, which are moral rules that protect individuals from groups. We think that an evolutionary account of morality may ultimately shed light on these topics but it will need to be more sophisticated than current accounts. 22) We provide Alexander's most recent statement that humans are motivated entirely by self-interest: "It is not easy for anyone to believe, from his own thoughts about his personal motivation and that of other humans, that humans are designed by natural selection to seek their own interests, let alone maximize their own genetic reproduction" (Alexander 1993, pp. 191-2). 23) The invisible hand metaphor is the economic equivalent of the Gaia hypothesis. 
More generally, despite its emphasis on individual self-interest, economic theory is like naive group selection in its axiomatic belief that multi-individual firms maximize a collective utility. A more sophisticated hierarchical approach to economics, along the lines of Campbell 1993, Leibenstein 1976, Margolis 1982, and Simon 1991, would be highly interesting. 24) The sense in which we expect an absence of fitness differences within groups needs to be clarified. In honey bee colonies, there is a set of adaptations that is favored by within-colony selection and has the potential of disrupting colony function. This includes workers laying unfertilized eggs to produce sons and workers favoring their own patriline while tending future queens. These behaviors are seldom observed because of evolved adaptations that prevent them, which qualify as group-level rules of meiosis (Ratnieks 1988). Another set of adaptations is favored by within-colony selection but does not disrupt colony function. For example, a beneficial mutation that increases viability will cause patrilines that have this mutation to be more fit than patrilines that don't, but there is no reason to expect these kinds of fitness differences to be suppressed by group-level rules of meiosis. Similarly, we expect the Hutterite social organization to suppress fitness differences that correspond to the first set but not those that correspond to the second. 25) Here we are following Tooby and Cosmides' (1992) concept of modularity, according to which natural selection has evolved a number of cognitive subsystems that are evoked by appropriate environmental conditions. We do not mean to exclude the possibility of open-ended learning and cultural evolution, however, as envisioned by other evolutionary psychologists (e.g., Boyd and Richerson 1985, Durham 1991, MacDonald 1991). 26) Two caveats are in order here. First, people do not necessarily think the way an ideology exhorts them to think. 
We think it plausible that Shenker (who was himself an Israeli Kibbutznik) is not simply espousing an ideology but is accurately describing the attitudes and beliefs that exist among members of communal societies. Second, psychological egoism can be defined in many ways and some of the broadest definitions would include the attitudes and beliefs expressed in the Shenker passage. For example, if a Hutterite takes genuine pleasure in helping his group, he might be classified by some as a psychological egoist who is attempting to maximize his pleasure. For the purposes of this discussion, we define a psychological egoist as a person who has a category of "self" that is separate from the category of "others", who acts to maximize perceived self-interest without regard to effects on others, and who does not axiomatically find pleasure in helping others. See Batson 1991 for more detailed discussions of psychological egoism. 27) Although the evaluation of psychological adaptations should be based on design features and their reproductive consequences in ancestral environments, it is still interesting to examine the reproductive consequences in modern environments. The Hutterites have been quite well studied demographically and it should be possible to measure actual fitness differentials within groups. 28) In addition to the environmental situations that we have listed, unstable equilibria leading to majority effects are likely to be important in the evolution and maintenance of group-level adaptations. In other words, group-level adaptations may have difficulty evolving from a low frequency even when they are favored by environmental conditions. Conversely, after they have become established, group-level adaptations may persist even after the environmental conditions that favored them are relaxed (Boyd and Richerson 1990). LITERATURE CITED Alexander, R. D. (1979). Darwinism and Human Affairs. Seattle: University of Washington Press. Alexander, R. D. (1987). 
The biology of moral systems. New York: Aldine de Gruyter. Alexander, R. D. (1988). Knowledge, intent and morality in Darwin's world. Quarterly Review of Biology, 63, 441-443. Alexander, R. D. (1989). The evolution of the human psyche. In P. Mellars, & C. Stringer (Ed.), The human revolution (pp. 455-513). Edinburgh: University of Edinburgh Press. Alexander, R. D. (1993). Biological considerations in the analysis of morality. In M. H. Nitecki, & D. V. Nitecki (Ed.), Evolutionary ethics (pp. 163-196). Albany, N.Y.: State University of New York Press. Alexander, R., & Borgia, G. (1978). Group selection, altruism and the levels of organization of life. Annual Review of Ecology and Systematics, 9, 449-475. Allee, W. C. (1943). Where angels fear to tread: A contribution from general sociology to human ethics. Science, 97, 517-525. Allison, S. T., & Messick, D. M. (1987). From individual inputs to group outputs, and back again: group processes and inferences about members. In C. Hendrick (Ed.), Group processes (pp. 111-143). Newbury Park: Sage. Aoki, K. (1982). A condition for group selection to prevail over individual selection. Evolution, 36, 832-842. Aoki, K. (1983). A quantitative genetic model of reciprocal altruism: A condition for kin or group selection to prevail. Proceedings of the National Academy of Sciences, 80, 4065-4068. Archer, J. (1991). Human sociobiology: Basic concepts and limitations. Journal of Social Issues, 47, 11-26. Aviles, L. (1986). Sex-ratio bias and possible group selection in the social spider Anelosimus eximius. American Naturalist, 128, 1-12. Aviles, L. (1993). Interdemic selection and the sex ratio: a social spider perspective. American Naturalist, 142, 320-345. Axelrod, R. (1980a). Effective choices in the prisoner's dilemma. Journal of Conflict Resolution, 24, 3-25. Axelrod, R. (1980b). More effective choices in the prisoner's dilemma. Journal of Conflict Resolution, 24, 379-403. Axelrod, R., & Hamilton, W. D. (1981). 
The evolution of cooperation. Science, 211, 1390-1396. Barkow, J. H., Cosmides, L., & Tooby, J. (1992). The adapted mind: evolutionary psychology and the generation of culture. Oxford: Oxford University Press. Batson, C. D. (1991). The altruism question: Toward a social-psychological answer. Hillsdale, N.J.: Erlbaum. Bell, G. (1978). Group selection in structured populations. American Naturalist, 112, 389-399. Boehm, C. (1981). Parasitic selection and group selection: a study of conflict interference in rhesus and Japanese macaque monkeys. In A. B. Chiarelli, & R. S. Corruccini (Ed.), Primate behavior and sociobiology (pp. 161-82). Berlin: Springer-Verlag. Boorman, S. A., & Levitt, P. R. (1973). Group selection on the boundary of a stable population. Theoretical Population Biology, 4, 85-128. Boorman, S. A., & Levitt, P. R. (1980). The genetics of altruism. New York: Academic Press. Boyd, R., & Lorberbaum, J. (1987). No pure strategy is evolutionarily stable in the repeated Prisoner's dilemma game. Nature, 327, 58-9. Boyd, R., & Richerson, P. J. (1980). Effect of phenotypic variation on kin selection. Proceedings of the National Academy of Sciences, 77, 7506-7509. Boyd, R., & Richerson, P. J. (1982). Cultural transmission and the evolution of cooperative behavior. Human Ecology, 10, 325-351. Boyd, R., & Richerson, P. J. (1985). Culture and the evolutionary process. Chicago: University of Chicago Press. Boyd, R., & Richerson, P. J. (1988). The evolution of reciprocity in sizable groups. Journal of Theoretical Biology, 132, 337-356. Boyd, R., & Richerson, P. J. (1989). The evolution of indirect reciprocity. Social Networks, 11, 213-236. Boyd, R., & Richerson, P. J. (1990). Culture and cooperation. In J. J. Mansbridge (Ed.), Beyond self-interest (pp. 111-132). Chicago: University of Chicago Press. Boyd, R., & Richerson, P. (1990). Group selection among alternative evolutionarily stable strategies. Journal of Theoretical Biology, 145, 331-342. Brandon, R. (1990). 
Organism and environment. Princeton: Princeton University Press. Breden, F. J., & Wade, M. J. (1989). Selection within and between kin groups of the imported willow leaf beetle. American Naturalist, 134, 35-50. Buss, L. (1987). The evolution of individuality. Princeton, NJ: Princeton University Press. Camazine, S. (1991). Self-organizing pattern formation on the combs of honey bee colonies. Behavioral Ecology and Sociobiology, 28, 61-76. Camazine, S., & Sneyd, J. (1991). A model of collective nectar source selection by honey bees: self organization through simple rules. Journal of Theoretical Biology, 149, 547-571. Campbell, D. T. (1958). Common fate, similarity, and other indices of the status of aggregates of persons as social entities. Behavioral Science, 3, 14-25. Campbell, D. T. (1974). 'Downward causation' in hierarchically organized biological systems. In F. J. Ayala, & T. Dobzhansky (Ed.), Studies in the philosophy of biology (pp. 179-186). New York: MacMillan Press Ltd. Campbell, D. T. (1979). Comments on the sociobiology of ethics and moralizing. Behavioral Science, 24, 37-45. Campbell, D. T. (1983). The two distinct routes beyond kin selection to ultra-sociality: Implications for the humanities and social sciences. In D. L. Bridgeman (Ed.), The nature of prosocial development: Interdisciplinary theories and strategies (pp. 11-41). New York: Academic Press. Campbell, D. T. (1991). A naturalistic theory of archaic moral orders. Zygon, 26, 91-114. Campbell, D. T. (1993). How individual and face-to-face-group selection undermine firm selection in organizational evolution. In J. A. C. Baum, & J. V. Singh (Ed.), Evolutionary dynamics of organizations. New York: Oxford University Press. Caporael, L. R., Dawes, R. M., Orbell, J. M., & Van de Kragt, A. J. C. (1989). Selfishness examined: Cooperation in the absence of egoistic incentives. Behavioral and Brain Sciences, 12, 683-739. Cassidy, J. (1978). Philosophical aspects of the group selection controversy. 
Philosophy of Science, 45, 574-94. Cavalli-Sforza, L., & Feldman, M. (1978). Darwinian selection and altruism. Theoretical Population Biology, 14, 268-280. Chao, L., & Levin, B. (1981). Structured habitats and the evolution of anti-competitor toxins in bacteria. Proceedings of the National Academy of Sciences, USA, 78, 6324-6328. Charlesworth, B. (1979). A note on the evolution of altruism in structured demes. American Naturalist, 113, 601-605. Charlesworth, B., & Toro, M. A. (1982). Female-biased sex ratios. Nature, 298, 494. Charnov, E. L. (1982). The theory of sex allocation. Princeton: Princeton University Press. Chepko-Sade, B. D., Dow, M. M., & Cheverud, J. M. (1988). Group selection models with population substructure based on social interaction networks. American Journal of Physical Anthropology, 77, 427-33. Cohen, D., & Eshel, I. (1976). On the founder effect and the evolution of altruistic traits. Theoretical Population Biology, 10, 276-302. Colwell, R. K. (1981). Group selection is implicated in the evolution of female-biased sex ratios. Nature, 290, 401-404. Cook, R. C. (1954). The North American Hutterites: a study in human multiplication. Population Bulletin, 10, 97-107. Cosmides, L. M., & Tooby, J. (1981). Cytoplasmic inheritance and intragenomic conflict. Journal of Theoretical Biology, 89, 83-129. Cosmides, L., & Tooby, J. (1987). From evolution to behavior: evolutionary psychology as the missing link. In J. Dupre (Ed.), The latest on the best: essays on evolution and optimality (pp. 277-307). Cambridge, Mass: Bradford (MIT Press). Craig, D. M. (1982). Group selection versus individual selection: an experimental analysis. Evolution, 36, 271-282. Crespi, B. J., & Taylor, P. D. (1990). Dispersal rates under variable patch selection. American Naturalist, 135, 48-62. Cronin, H. (1991). The ant and the peacock: Altruism and sexual selection from Darwin to today. Cambridge: Cambridge University Press. Crow, J. F. (1979). 
Genes that violate Mendel's rules. Scientific American, 240, 104-113. Crow, J., & Aoki, K. (1982). Group selection for a polygenic behavioral trait: a differential proliferation model. Proceedings of the National Academy of Sciences, 79, 2628-2631. Crow, J., & Aoki, K. (1984). Group selection for a polygenic behavioral trait: estimating the degree of population subdivision. Proceedings of the National Academy of Sciences, 81, 6073-6077. Crozier, R. H. (1987). Selection, adaptation and evolution. Journal and Proceedings, Royal Society of New South Wales, 120, 21-37. Crozier, R. H., & Consul, P. C. (1976). Conditions for genetic polymorphism in social hymenoptera under selection at the colony level. Theoretical Population Biology, 10, 1-9. Daly, M., & Wilson, M. (1988). Homicide. New York: Aldine de Gruyter. Damuth, J. (1985). Selection among "species": a formulation in terms of natural functional units. Evolution, 39, 1132-46. Damuth, J., & Heisler, I. L. (1988). Alternative formulations of multilevel selection. Biology and Philosophy, 3, 407-30. Darwin, C. (1871). The descent of man, and selection in relation to sex. London: Murray. Dawes, R. M., Van de Kragt, A. J. C., & Orbell, J. M. (1988). Not me or thee but we: The importance of group identity in eliciting cooperation in dilemma situations: experimental manipulations. Acta Psychologica, 68, 83-97. Dawkins, R. (1976). The selfish gene (1st ed.). Oxford: Oxford University Press. Dawkins, R. (1978). Replicator selection and the extended phenotype. Zeitschrift fur Tierpsychologie, 47, 61-76. Dawkins, R. (1980). Good strategy or evolutionarily stable strategy? In G. W. Barlow, & J. Silverberg (Ed.), Sociobiology: beyond nature/nurture? (pp. 331-367). Boulder, CO: Westview Press. Dawkins, R. (1982). The extended phenotype. Oxford: Oxford University Press. Dawkins, R. (1989). The selfish gene (2nd ed.). Oxford: Oxford University Press. Deneubourg, J. L., & Goss, S. (1989). Collective patterns and decision-making. 
Ethological and Ecological Evolution, 1, 295-311. Dennett, D. C. (1981). Brainstorms. Cambridge, Mass: Bradford (MIT Press). Dennett, D. C. (1992). Confusion over evolution: an exchange. New York Review of Books, 40, 44. Dover, G. A. (1986). Molecular drive in multigene families: how biological novelties arise, spread, and are assimilated. Trends in Genetics, 2, 159-165. Dugatkin, L. A. (1990). N-person games and the evolution of cooperation: a model based on predator inspection behavior in fish. Journal of Theoretical Biology, 142, 123-135. Dugatkin, L. A., Mesterton-Gibbons, M., & Houston, A. I. (1992). Beyond the Prisoner's Dilemma: towards models to discriminate among mechanisms of cooperation in nature. Trends in Ecology and Evolution, 7, 202-205. Dugatkin, L. A., & Reeve, H. K. (in press). Behavioral ecology and levels of selection: Dissolving the group selection controversy. Dugatkin, L. A., & Wilson, D. S. (1991). Rover: a strategy for exploiting cooperators in a patchy environment. American Naturalist, 138, 687-701. Durham, W. H. (1991). Coevolution: Genes, culture and human diversity. Stanford: Stanford University Press. Eberhard, W. G. (1990). Evolution of bacterial plasmids and levels of selection. Quarterly Review of Biology, 65, 3-22. Ehrenpreis, A. (1650/1978). An epistle on brotherly community as the highest command of love. In R. Friedmann (Ed.), Brotherly community: the highest command of love (pp. 9-77). Rifton, N.Y.: Plough Publishing Co. Eibl-Eibesfeldt, I. (1982). Warfare, man's indoctrinability and group selection. Zeitschrift fur Tierpsychologie, 60, 177-198. Emerson, A. E. (1960). The evolution of adaptation in population systems. In S. Tax (Ed.), Evolution after Darwin (pp. 307-348). Chicago: Chicago University Press. Eshel, I. (1972). On the neighbor effect and the evolution of altruistic traits. Theoretical Population Biology, 3, 258-277. Eshel, I. (1977). 
On the founder effect and the evolution of altruistic traits: an ecogenetical approach. Theoretical Population Biology, 11, 410-424. Eshel, I., & Motro, U. (1988). The three brothers' problem: kin selection with more than one potential helper: the case of delayed help. American Naturalist, 132, 567-75. Ewald, P. W. (1993). Adaptation and disease. Oxford: Oxford University Press. Fagen, R. M. (1980). When doves conspire: evolution of nondamaging fighting tactics in a nonrandom-encounter animal conflict model. American Naturalist, 115, 858-869. Falconer, D. S. (1981). Introduction to quantitative genetics (2nd ed.). London: Longman. Felbinger, C. (1560/1978). Confession of faith. In R. Friedmann (Ed.), Brotherly community: The highest command of love (pp. 91-133). Rifton, N.Y.: Plough Publishing Co. Feldman, M., & Thomas, E. (1987). Behavior-dependent contexts for repeated plays of the Prisoner's dilemma. Journal of Theoretical Biology, 128, 297-315. Findlay, C. S. (1992). Phenotypic evolution under gene-culture transmission in structured populations. Journal of Theoretical Biology, 156, 387-400. Fix, A. G. (1985). Evolution of altruism in kin-structured and random subdivided populations. Evolution, 39, 928-939. Frank, R. H. (1988). Passions within reason. New York: W. W. Norton. Frank, S. A. (1986). Dispersal polymorphisms in subdivided populations. Journal of Theoretical Biology, 122, 303-309. Frank, S. A. (1986). Hierarchical selection theory and sex ratios. I. General solutions for structured populations. Theoretical Population Biology, 29, 312-342. Frank, S. A. (1987). Demography and sex ratio in social spiders. Evolution, 41, 1267-1281. Franks, N. R. (1989). Army ants: a collective intelligence. American Scientist, 77, 139-145. Gadgil, M. (1975). Evolution of social behavior through interpopulational selection. Proceedings of the National Academy of Sciences, 72, 1199-1201. Garcia, C., & Toro, M. A. (1990). 
Individual and group selection for productivity in Tribolium castaneum. Theoretical and Applied Genetics, 79, 256-260. Gilinsky, N. L., & Mayo, D. G. (1987). Models of group selection. Philosophy of Science, 54, 515-38. Gilpin, M. E. (1975). Group selection in predator-prey communities. Princeton: Princeton University Press. Gilpin, M. E., & Taylor, B. L. (1988). Comment on Harpending and Rogers' model of intergroup selection. Journal of Theoretical Biology, 135, 131-135. Goodnight, C. J. (1985). The influence of environmental variation on group and individual selection in a cress. Evolution, 39, 545-558. Goodnight, C. J. (1990). Experimental studies of community evolution I: The response to selection at the community level. Evolution, 44, 1614-1624. Goodnight, C. J. (1990). Experimental studies of community evolution II: The ecological basis of the response to community selection. Evolution, 44, 1625-1636. Goodnight, C. J. (1991). Intermixing ability in two-species communities of Tribolium flour beetles. American Naturalist, 138, 342-354. Goodnight, C. J., Schwartz, J. M., & Stevens, L. (1992). Contextual analysis of models of group selection, soft selection, hard selection, and the evolution of altruism. American Naturalist, 140, 743-761. Goodnight, K. (1992). Kin selection in a structured population. American Naturalist, in press. Gould, S. J. (1980). Is a new and general theory of evolution emerging? Paleobiology, 6, 119-130. Gould, S. J. (1989). Wonderful life: The Burgess Shale and the nature of history. New York: Norton. Gould, S. J. (1992). The confusion over evolution. New York Review of Books, 39, 47-53. Gould, S. J., & Lewontin, R. C. (1979). The spandrels of San Marco and the Panglossian paradigm: A critique of the adaptationist program. Proceedings of the Royal Society of London, B205, 581-98. Govindaraju, D. R. (1988). Mating systems and the opportunity for group selection in plants. Evolutionary Trends in Plants, 2, 99-106. Grafen, A. (1984). 
Natural selection, kin selection and group selection. In J. Krebs, & N. Davies (Ed.), Behavioural ecology: An evolutionary approach (pp. 62-84). Oxford: Blackwell Scientific Publications. Griesemer, J., & Wade, M. (1988). Laboratory models, causal explanations and group selection. Biology and Philosophy, 3, 67-96. Griffing, B. (1977). Selection for populations of interacting genotypes. In E. Pollak, O. Kempthorne, & T. B. Bailey (Ed.), Proceedings of the International Congress on Quantitative Genetics, August 16-21, 1976 (pp. 413-434). Ames, Iowa: Iowa State University Press. Hamilton, W. D. (1964). The genetical evolution of social behavior, I and II. Journal of Theoretical Biology, 7, 1-52. Hamilton, W. D. (1991). Selection of selfish and altruistic behavior in some extreme models. In J. S. Eisenberg, & W. S. Dillon (Ed.), Man and beast: comparative social behavior (pp. 57-92). Washington D.C.: Smithsonian Institution Press. Hardin, G. (1968). The tragedy of the commons. Science, 162, 1243-48. Harpending, H. C., & Rogers, A. R. (1987). On Wright's mechanism for intergroup selection. Journal of Theoretical Biology, 127, 51-61. Hausfater, G., & Breden, F. (1990). Selection within and between social groups for infanticide. American Naturalist, 136, 673-88. Heisler, I. L., & Damuth, J. (1987). A method of analyzing selection in hierarchically structured populations. American Naturalist, 130, 582-602. Hendrick, C. (1987a). Group processes. Newbury Park: Sage. Hendrick, C. (1987b). Group processes and intergroup relations. Newbury Park: Sage. Hofstadter, D. R. (1979). Godel, Escher, Bach: an eternal golden braid. New York: Vintage. Hogg, M., & Abrams, D. (1988). Social identifications: A social psychology of intergroup relations and group processes. London: Routledge. Holt, R. D. (1983). Evolution in structured demes: the components of selection. Unpublished manuscript. Hull, D. (1980). Individuality and selection. 
Annual Review of Ecology and Systematics, 11, 311-32. Hull, D. (1981). The units of evolution--a metaphysical essay. In U. Jensen, & R. Harre (Ed.), The philosophy of evolution (pp. 23-44). Sussex: Harvester Press. Hull, D. (1988). Science as a process: an evolutionary account of the social and conceptual development of science. Chicago: University of Chicago Press. Hurst, L. D. (1991). The evolution of cytoplasmic incompatibility or when spite can be successful. Journal of Theoretical Biology, 148, 269-277. Jimenez, J., & Casadesus, J. (1989). An altruistic model of Rhizobium-legume association. Journal of Heredity, 80, 335-337. Johnson, M. S., & Brown, J. L. (1980). Genetic variation among trait groups and apparent absence of close inbreeding in Grey-crowned babblers. Behavioral Ecology and Sociobiology, 7, 93-98. Kelly, J. K. (1992a). The evolution of altruism in density regulated populations. In press. Kelly, J. K. (1992b). Restricted migration and the evolution of altruism. Evolution, 46, 1492-5. King, D. A. (1990). The adaptive significance of tree height. American Naturalist, 135, 809-828. Kitcher, P. (1993). The advancement of science. Oxford: Oxford University Press. Kitcher, P., Sterelny, K., & Waters, K. (1990). The illusory riches of Sober's monism. Journal of Philosophy, 87, 158-60. Knauft, B. M. (1985). Good company and violence: Sorcery and social action in a lowland New Guinea society. Berkeley, CA: University of California Press. Knauft, B. M. (1991). Violence and sociality in human evolution. Current Anthropology, 32, 391-428. Kohlberg, L. (1984). Essays on moral development: Vol. 2, The psychology of moral development. San Francisco: Harper and Row. Krebs, D. (1987). The challenge of altruism in biology and psychology. In C. Crawford, M. Smith, & D. Krebs (Ed.), Sociobiology and psychology: ideas, issues and applications (pp. 81-119). Hillsdale, New Jersey: Erlbaum. Leibenstein, H. (1976). 
Beyond economic man: a new foundation for microeconomics. Cambridge, Mass: Harvard University Press. Leigh, E. G. J. (1977). How does selection reconcile individual advantage with the good of the group? Proceedings of the National Academy of Sciences, 74, 4542-4546. Leigh, E. G. J. (1991). Genes, bees and ecosystems: the evolution of common interest among individuals. Trends in Ecology and Evolution, 6, 257-262. Levin, B. R., & Kilmer, W. L. (1974). Interdemic selection and the evolution of altruism: A computer simulation. Evolution, 28, 527-545. Lewontin, R. C. (1970). The units of selection. Annual Review of Ecology and Systematics, 1, 1-18. Lloyd, E. (1988). The structure and confirmation of evolutionary theory. New York: Greenwood. Lovelock, J. E. (1979). Gaia: a new look at life on earth. Oxford: Oxford University Press. MacDonald, K. B. (1988). Sociobiology and the cognitive-developmental tradition in moral development research. In K. B. MacDonald (Ed.), Sociobiological perspectives on human development (pp. 140-167). New York: Springer-Verlag. MacDonald, K. (1991). A perspective on Darwinian psychology: the importance of domain-general mechanisms, plasticity, and individual differences. Ethology and Sociobiology, 12, 449-480. MacDonald, K. (in prep). Judaism as a group evolutionary strategy. Mansbridge, J. J. (1990). Beyond self-interest. Chicago: University of Chicago Press. Margolis, H. (1982). Selfishness, altruism and rationality: a theory of social choice. Chicago: University of Chicago Press. Matessi, C., & Jayakar, S. D. (1976). Conditions for the evolution of altruism under Darwinian selection. Theoretical Population Biology, 9, 360-387. Matessi, C., Karlin, S., & Morris, M. (1987). Models of intergenerational kin altruism. American Naturalist, 130, 544-69. Maynard Smith, J. (1964). Group selection and kin selection. Nature, 1145-1146. Maynard Smith, J. (1976). Group selection. Quarterly Review of Biology, 51, 277-283. Maynard Smith, J. (1982). 
Evolution and the theory of games. Cambridge: Cambridge University Press. Maynard Smith, J. (1982). The evolution of social behavior: a classification of models. In King's College Sociobiology Group (Ed.), Current problems in sociobiology (pp. 29-44). Cambridge: Cambridge University Press. Maynard Smith, J. (1987). How to model evolution. In J. Dupre (Ed.), The latest on the best: essays on evolution and optimality (pp. 119-131). Cambridge: MIT Press. Maynard Smith, J. (1987). Reply to Sober. In J. Dupre (Ed.), The latest on the best: essays on evolution and optimality (pp. 147-150). Boston: MIT Press. Maynard Smith, J. (1992). Confusion over evolution: an exchange. New York Review of Books, 40, 43. Mayr, E. (1990). Myxoma and group selection. Biologisches Zentralblatt, 109, 453-457. McCauley, D. E. (1989). Extinction, colonization and population structure: a study of a milkweed beetle. American Naturalist, 134, 365-76. McCauley, D. E., & Wade, M. J. (1980). Group selection: the genotypic and demographic basis for the phenotypic differentiation of small populations of Tribolium castaneum. Evolution, 34, 813-821. McCauley, D. E., & Wade, M. J. (1988). Extinction and recolonization: their effects on the genetic differentiation of local populations. Evolution, 42, 995-1005. McCauley, D. E., Wade, M. J., Breden, F. J., & Wohltman, M. (1988). Spatial and temporal variation in group relatedness: Evidence from the imported willow leaf beetle. Evolution, 42(1), 184-192. Mesterton-Gibbons, M., & Dugatkin, L. A. (1992). Cooperation among unrelated individuals: evolutionary factors. Quarterly Review of Biology, 67, 267-281. Michod, R. (1982). The theory of kin selection. Annual Review of Ecology and Systematics, 13, 23-55. Michod, R., & Sanderson, M. (1985). Behavioral structure and the evolution of cooperation. In J. Greenwood, & M. Slatkin (Ed.), Evolution - Essays in honor of John Maynard Smith (pp. 95-104). Cambridge: Cambridge University Press. Mitchell, S. D. (1987). 
Competing units of selection?: a case of symbiosis. Philosophy of Science, 54, 351-367. Mitchell, S. D. (1993). Superorganisms: then and now. Yearbook in the Sociology of Science. Motro, U., & Eshel, I. (1988). The three brothers' problem: kin selection with more than one potential helper: the case of immediate help. American Naturalist, 132, 550-66. Myerson, R. B., Pollock, G. B., & Swinkels, J. M. (1991). Viscous population equilibria. Games and Economic Behavior, 3, 101-109. Nagel, E. (1961). The structure of science: problems in the logic of scientific explanation. Indianapolis: Hackett Publishing Co. Noonan, K. M. (1987). Evolution: A primer for psychologists. In C. Crawford, M. Smith, & D. Krebs (Ed.), Sociobiology and psychology: ideas, issues and applications (pp. 31-60). Hillsdale, New Jersey: Erlbaum. Nunney, L. (1985). Female-biased sex ratios: individual or group selection. Evolution, 39(2), 349-361. Nunney, L. (1985). Group selection, altruism, and structured-deme models. American Naturalist, 126, 212-230. Nunney, L. (1989). The maintenance of sex by group selection. Evolution, 43(2), 245-257. Orzack, S., & Sober, E. (in press). Optimality models and the test of adaptationism. American Naturalist. Owen, R. E. (1986). Colony-level selection in the social insects: Single locus additive and non-additive models. Theoretical Population Biology, 29, 198-234. Peck, J. R. (1990). The evolution of outsider exclusion. Journal of Theoretical Biology, 142, 565-571. Peck, J. R. (1992). Group selection, individual selection, and the evolution of genetic drift. Journal of Theoretical Biology, 159, 163-187. Peck, J. R. (1993). Friendship and the evolution of cooperation. Journal of Theoretical Biology, 162, 195-228. Peck, J., & Feldman, M. (1986). The evolution of helping behavior in large, randomly mixed populations. American Naturalist, 127, 209-221. Pollock, G. B. (1988). Suspending disbelief--of Wynne-Edwards and his critics. 
Journal of Evolutionary Biology, 2, 000-000. Pollock, G. B. (1983). Population viscosity and kin selection. American Naturalist, 122, 817-29. Pollock, G. (1989). Population structure, spite and the iterated prisoner's dilemma. American Journal of Physical Anthropology, 77, 459-69. Pollock, G. B. (1991). Crossing Malthusian boundaries: Evolutionary stability in the finitely repeated Prisoner's dilemma. Journal of Quantitative Anthropology, 3, 159-180. Pollock, G. B. (in press). Personal fitness, altruism and the ontology of game theory. Journal of Quantitative Anthropology. Price, G. R. (1972). Extension of covariance selection mathematics. Annals of Human Genetics, 35, 485-490. Proctor, H. C. (1989). Occurrence of protandry and a female-biased sex ratio in a sponge-associated water mite (Acari: Unionicolidae). Experimental and Applied Acarology, 7, 289-298. Queller, D. C. (1991). Group selection and kin selection. Trends in Ecology and Evolution, 6(2), 64. Queller, D. C. (1992). Quantitative genetics, inclusive fitness and group selection. American Naturalist, 139, 540-558. Rapoport, A. (1991). Ideological commitments and evolutionary theory. Journal of Social Issues, 47, 83-100. Rapoport, A., & Chammah, A. (1965). Prisoner's dilemma. Ann Arbor: University of Michigan Press. Ratnieks, F. L. (1988). Reproductive harmony via mutual policing by workers in eusocial Hymenoptera. American Naturalist, 132, 217-236. Ratnieks, F. L., & Visscher, P. K. (1989). Worker policing in the honeybee. Nature, 342, 796-797. Reed, E. (1978). Group selection and methodological individualism--a critique of Watkins. British Journal for the Philosophy of Science, 29, 256-62. Richards, R. J. (1987). Darwin and the emergence of evolutionary theories of mind and behavior. Chicago: University of Chicago. Richardson, R. (1983). Grades of organization and the units of selection controversy. In P. Asquith, & T. Nickles (Ed.), PSA 1982, v2 (pp. 324-340). E. 
Lansing: Philosophy of Science Association. Rissing, S., & Pollock, G. (1991). An experimental analysis of pleometric advantage in Messor pergandei. Insect Societies, 63, 205-211. Rogers, A. R. (1990). Group selection by selective emigration: the effects of migration and kin structure. American Naturalist, 135, 398-413. Rosenberg, A. (1983). Coefficients, effects and genic selection. Philosophy of Science, 50, 332-38. Rosenberg, A. (1985). The structure of biological science. Cambridge: Cambridge University Press. Ruse, M. (1986). Taking Darwin seriously. New York: Basil Blackwell. Rushton, J. P. (1989). Genetic similarity, human altruism and group selection. Behavioral and Brain Sciences, 12, 503-559. Sagan, C., & Druyan, A. (1992). Shadows of forgotten ancestors. New York: Random House. Seeley, T. (1989). The honey bee colony as a superorganism. American Scientist, 77, 546-553. Seger, J. (1989). All for one, one for all, that is our device. Nature, 338, 374-5. Shanahan, T. (1990). Group selection and the evolution of myxomatosis. Evolutionary Theory, 9, 239-254. Shenker, B. (1986). Intentional communities: ideology and alienation in communal societies. London: Routledge. Sherif, M., Harvey, O. J., White, B. J., Hood, W. R., & Sherif, C. W. (1961). Intergroup conflict and cooperation: The Robbers Cave experiment. Norman, OK: The University Book Exchange. Simon, H. A. (1991). Organizations and markets. Journal of Economic Perspectives, 5, 25-44. Slatkin, M. (1981). Populational heritability. Evolution, 35, 859-871. Slatkin, M., & Wade, M. J. (1978). Group selection on a quantitative character. Proceedings of the National Academy of Sciences, 75, 3531-34. Slatkin, M., & Wilson, D. S. (1979). Coevolution in structured demes. Proceedings of the National Academy of Sciences, 76, 2084-87. Smith, D. C. (1990). Population structure and competition among kin in the chorus frog (Pseudacris triseriata). Evolution, 44, 1529-1541. Smith, R. J. F. (1986). 
Evolution of alarm signals: role of benefits of retaining members or territorial neighbors. American Naturalist, 128, 604-610. Sober, E. (1981). Holism, individualism and the units of selection. In P. Asquith, & R. Giere (Ed.), PSA 1980 v2 (pp. 93-101). East Lansing: Philosophy of Science Association. Sober, E. (1984). The nature of selection: evolutionary theory in philosophical focus. Cambridge: Bradford/MIT. Sober, E. (1987). Comments on Maynard Smith's "How to model evolution". In J. Dupre (Ed.), The latest on the best: essays on evolution and optimality (pp. 133-146). Cambridge, Mass: MIT Press. Sober, E. (1990). The poverty of pluralism. Journal of Philosophy, 87, 151-57. Sober, E. (1992). The evolution of altruism: correlation, cost and benefit. Biology and Philosophy, 7, 177-188. Sober, E. (1992). Screening-off and the units of selection. Philosophy of Science, 59, 142-152. Sober, E. (1993a). Evolutionary altruism, psychological egoism and morality--disentangling the phenotypes. In M. H. Nitecki, & D. V. Nitecki (Ed.), Evolutionary ethics (pp. 199-216). Albany: SUNY Press. Sober, E. (1993b). Philosophy of biology. Boulder, Co.: Westview Press. Sober, E. (in press). Did evolution make us psychological altruists? In J. Lennox (Ed.), Pittsburgh Studies in the Philosophy of Science. University of Pittsburgh Press. Sober, E., & Lewontin, R. (1982). Artifact, cause and genic selection. Philosophy of Science, 47, 157-80. Sober, E., & Wilson, D. S. (1993). A critical review of philosophical work on the units of selection problem. Submitted. Stanley, S. (1975). A theory of evolution above the species level. Proceedings of the National Academy of Sciences, 72, 646-650. Stanley, S. (1979). Macroevolution: pattern and process. San Francisco: W. H. Freeman. Sterelny, K., & Kitcher, P. (1988). The return of the gene. Journal of Philosophy, 85, 339-61. Symons, D. (1992). On the use and misuse of Darwinism in the study of human behavior. In J. H. Barkow, L. 
Cosmides, & J. Tooby (Ed.), The adapted mind: evolutionary psychology and the generation of culture (pp. 137-162). Oxford: Oxford University Press. Tajfel, H. (1981). Human groups and social categories. Cambridge: Cambridge University Press. Tanaka, Y. (1991). The evolution of social communication systems in a subdivided population. Journal of Theoretical Biology, 149, 145-164. Tooby, J., & Cosmides, L. (1992). The psychological foundations of culture. In J. H. Barkow, L. Cosmides, & J. Tooby (Ed.), The adapted mind: evolutionary psychology and the generation of culture (pp. 19-136). Oxford: Oxford University Press. Toro, M., & Silio, L. (1986). Assortment of encounters in the two-strategy game. Journal of Theoretical Biology, 123, 193-204. Treisman, M. (1983). Errors in the theory of the structured deme. Journal of Theoretical Biology, 102, 339-346. Trivers, R. L. (1971). The evolution of reciprocal altruism. Quarterly Review of Biology, 46, 35-57. Trivers, R. L. (1985). Social evolution. Menlo Park, CA: Benjamin/Cummings. Uyenoyama, M., & Feldman, M. W. (1980). Evolution of altruism under group selection in large and small populations in fluctuating environments. Theoretical Population Biology, 15, 58-85. Uyenoyama, M. K., & Feldman, M. W. (1980). Theories of kin and group selection: a population genetics perspective. Theoretical Population Biology, 17, 380-414. Von Schilcher, F., & Tennant, N. (1984). Philosophy, evolution and human nature. London: Routledge and Kegan Paul. Voorzanger, B. (1984). Altruism in sociobiology: a conceptual analysis. Journal of Human Evolution, 13, 33-39. Vrba, E. (1989). Levels of selection and sorting. Oxford Surveys in Evolutionary Biology, 6. Wade, M. J. (1976). Group selection among laboratory populations of Tribolium. Proceedings of the National Academy of Sciences, 73, 4604-7. Wade, M. J. (1977). An experimental study of group selection. Evolution, 31, 134-153. Wade, M. J. (1978). 
A critical review of the models of group selection. Quarterly Review of Biology, 53, 101-114. Wade, M. J. (1979). The primary characteristics of Tribolium populations group selected for increased and decreased population size. Evolution, 33(2), 749-764. Wade, M. J. (1982). The evolution of interference competition by individual, family and group selection. Proceedings of the National Academy of Sciences, 79, 3575-3578. Wade, M. J. (1982). Group selection: migration and the differentiation of small populations. Evolution, 36, 949-62. Wade, M. J. (1985). Soft selection, hard selection, kin selection and group selection. American Naturalist, 125, 61-73. Wade, M. J. (1991). Genetic variance for rate of population increase in natural populations of flour beetles, Tribolium spp. Evolution, 45, 1574-84. Wade, M. J., & Breden, F. (1980). The evolution of cheating and selfish behavior. Behavioral Ecology and Sociobiology, 7, 167-72. Wade, M. J., Breden, F. J., & McCauley, D. E. (1988). Spatial and temporal variation in group relatedness: evidence from the imported willow leaf beetle. Evolution, 42, 184-92. Wade, M. J., & McCauley, D. E. (1980). Group selection: the phenotypic and genotypic differentiation of small populations. Evolution, 34, 799-812. Walton, D. (1991). The units of selection and the bases of selection. Philosophy of Science, 58, 417-35. Waters, K. (1991). Tempered realism about the forces of selection. Philosophy of Science, 58, 553-73. Werren, J. H. (1991). The paternal sex-ratio chromosome of Nasonia. American Naturalist, 137, 392-402. Werren, J. H., & Beukeboom, L. W. (1992). Population genetics of a parasitic chromosome: experimental analysis of PSR in subdivided populations. Evolution, 46, 1257-68. Werren, J. H., & Beukeboom, L. W. (1993). Population genetics of a parasitic chromosome: theoretical analysis of PSR in subdivided populations. American Naturalist, 142, 224-241. West-Eberhard, M. J. (1981). 
Intragroup selection and the evolution of insect societies. In R. D. Alexander, & D. W. Tinkle (Ed.), Natural selection and social behavior (pp. 3-17). NY: Chiron Press. Whitlock, M. C., & McCauley, D. E. (1990). Some population genetic consequences of colony formation and extinction: genetic correlations within founding groups. Evolution, 44, 1717-24. Williams, G. C. (1966). Adaptation and natural selection: a critique of some current evolutionary thought. Princeton: Princeton University Press. Williams, G. C. (1971). Group selection. Chicago: Aldine. Williams, G. C. (1986). A defence of reductionism in evolutionary biology. In R. Dawkins, & M. Ridley (Ed.), Oxford surveys in evolutionary biology (pp. 1-27). Oxford: Oxford University Press. Williams, G. C. (1992). Natural selection: domains, levels and challenges. Oxford: Oxford University Press. Williams, G. C. (1993). Hard-core Darwinism since 1859. Quarterly Review of Biology, 68, 409-412. Wills, C. (1991). Maintenance of multiallelic polymorphism at the MHC region. Immunological Reviews, 124, 165-220. Wilson, D. S. (1975). A general theory of group selection. Proceedings of the National Academy of Sciences, 72, 143-146. Wilson, D. (1976). Evolution on the level of communities. Science, 192, 1358-1360. Wilson, D. S. (1977). How nepotistic is the brain worm? Behavioral Ecology and Sociobiology, 2, 421-25. Wilson, D. (1977). Structured demes and the evolution of group-advantageous traits. American Naturalist, 111, 157-185. Wilson, D. S. (1978). Structured demes and trait-group variation. American Naturalist, 113, 606-610. Wilson, D. S. (1980). The natural selection of populations and communities. Menlo Park: Benjamin Cummings. Wilson, D. S. (1983). The group selection controversy: History and current status. Annual Review of Ecology and Systematics, 14, 159-187. Wilson, D. S. (1983). Reply to Treisman. Journal of Theoretical Biology, 102, 459-462. Wilson, D. S. (1987). 
Altruism in mendelian populations derived from sibgroups: the haystack model revisited. Evolution, 41, 1059-1070. Wilson, D. S. (1988). Holism and reductionism in evolutionary ecology. Oikos, 53, 269-273. Wilson, D. S. (1989). Levels of selection: an alternative to individualism in biology and the social sciences. Social Networks, 11, 257-272. Wilson, D. S. (1990). Weak altruism, strong group selection. Oikos, 59, 135-140. Wilson, D. S. (1992). Complex interactions in metacommunities, with implications for biodiversity and higher levels of selection. Ecology, 73, 1984-2000. Wilson, D. S. (1992). On the relationship between evolutionary and psychological definitions of altruism and selfishness. Biology and Philosophy, 7, 61-68. Wilson, D. S. (1993). Group selection. Key words in evolutionary biology. Cambridge: Harvard. Wilson, D. S., & Colwell, R. K. (1981). Evolution of sex ratio in structured demes. Evolution, 35(5), 882-897. Wilson, D. S., & Dugatkin, L. A. (1992). Altruism. In E. F. Keller, & E. A. Lloyd (Ed.), Key words in evolutionary biology (pp. 29-33). Cambridge, Mass: Harvard University Press. Wilson, D. S., & Knollenberg, W. G. (1987). Adaptive indirect effects: the fitness of burying beetles with and without their phoretic mites. Evolutionary Ecology, 1, 139-159. Wilson, D. S., Pollock, G. B., & Dugatkin, L. A. (1992). Can altruism evolve in purely viscous populations? Evolutionary Ecology, 6, 331-341. Wilson, D. S., & Sober, E. (1989). Reviving the superorganism. Journal of Theoretical Biology, 136, 337-356. Wilson, E. O. (1973). Group selection and its significance for ecology. Bioscience, 23, 631-38. Wilson, E. O., & Holldobler, B. (1988). Dense heterarchies and mass communication as the basis of organization in ant colonies. Trends in Ecology and Evolution, 3, 65-67. Wilson, J. B. (1987). Group selection in plant populations. Theoretical and Applied Genetics, 1987, 493-502. Wright, S. (1980). Genic and organismic selection. Evolution, 34, 825-843. 
TABLE 1. A guide to the biological literature on group selection since 1970. "T" = theoretical models (including both mathematical and verbal models), "E" = possible empirical examples (including examples that have not been verified by experiments), "F" = field experiments, "L" = laboratory experiments, "R" = literature reviews, "P" = philosophical treatments, "C" = criticisms of group selection interpretations, and "H" = papers that are especially relevant from the standpoint of human evolutionary biology.

FIGURE CAPTIONS

Figure 1. A nested hierarchy in which every unit is a population of lower-level units. The hierarchy is left open on both ends because genes are composed of subunits and metapopulations can exist in higher-order metapopulations. For example, a valley can be a metapopulation of villages, which in turn are metapopulations of kinship groups.

Figure 2. A vehicle-centered version of kin selection theory. The dominant A allele codes for an altruistic behavior. The fitness of altruists (WA) and nonaltruists (WS) in a given group is WA = 1 - c + b(Np - 1)/(N - 1) and WS = 1 + bNp/(N - 1), where p = the frequency of altruists in the group, N = group size, c = the cost to the altruist, and b = the benefit to the recipient. Both phenotypes have a baseline fitness of 1. Each altruist can be a recipient for the (Np - 1) other altruists in the group (excluding itself), who are distributing their benefits among the (N - 1) other members of the group (excluding themselves). Each non-altruist can be a recipient for all Np altruists in the group. For this example N = 10, c = 0.3 and b = 1.0.
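The fitness expressions in the Figure 2 caption are easy to check numerically. A minimal sketch (function names are mine) confirming the two claims the caption makes: within any mixed group, non-altruists always do better, yet mean group fitness rises with the frequency of altruists (algebraically it works out to 1 + p(b - c)):

```python
import math

# Fitness model from the Figure 2 caption, with its parameter values.
N, c, b = 10, 0.3, 1.0

def w_altruist(p):
    # WA = 1 - c + b(Np - 1)/(N - 1)
    return 1 - c + b * (N * p - 1) / (N - 1)

def w_selfish(p):
    # WS = 1 + bNp/(N - 1)
    return 1 + b * N * p / (N - 1)

def group_mean_fitness(p):
    return p * w_altruist(p) + (1 - p) * w_selfish(p)

for p in (0.1, 0.5, 0.9):                # p = frequency of altruists
    assert w_selfish(p) > w_altruist(p)  # altruism loses within groups
    assert math.isclose(group_mean_fitness(p), 1 + p * (b - c))
```

With b = 1.0 and c = 0.3, a group of all altruists has mean fitness 1.7 versus 1.0 for a group of all non-altruists, which is the between-group advantage the caption describes.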
Random mating among the three genotypes (first line) produces six types of mated pairs (second line), which in turn produce groups of siblings (third line). The third line shows only the average sibling group for each type of mated pair; random sampling of the gametes will produce variation around the averages. Sibling groups vary in the frequency of altruists (fourth line). Altruism is selected against at the individual level because non-altruists have the highest fitness within all mixed groups. Altruism is favored at the group level, however, because group fitness is directly proportional to the frequency of altruists in the group.

Figure 3. Variation among groups in the frequency of altruists. Altruism is coded by a dominant A allele at a frequency of p = 0.25 in the metapopulation, yielding a frequency of 0.438 altruists (AA and Aa) and 0.562 nonaltruists (aa). When groups of N = 10 are composed of unrelated individuals, the variation in the frequency of altruists between groups has a binomial distribution, as shown by the black curve. Sibling groups are created by a two-step sampling process in which groups of size N = 2 (the parents) are drawn from the global population and groups of size N = 10 (the siblings) are drawn from their gametes. This two-step sampling procedure increases genetic variation between groups, as shown by the stippled curve, intensifying natural selection at the group level. Evolution within groups always favors the non-altruist, regardless of whether the groups are composed of siblings or unrelated individuals.

Figure 4. Four pay-off matrices that represent A) pure between-group selection, B) a strong conflict between levels of selection, C) a weak conflict between levels of selection (X is the average number of interactions between members of each pair), and D) a return to pure between-group selection.
Within-group selection is absent from the first example by virtue of the situation, since coordination has an equal effect on both occupants of the leaf. Within-group selection is absent from the fourth example by virtue of an adaptation, since the "outlaw" A3 type cannot operate in the presence of the "parliament" A5 type.

From checker at panix.com Mon Jan 9 15:39:37 2006
From: checker at panix.com (Premise Checker)
Date: Mon, 9 Jan 2006 10:39:37 -0500 (EST)
Subject: [Paleopsych] Sigma Xi: Unwed Numbers
Message-ID:

Unwed Numbers
http://www.americanscientist.org/template/AssetDetail/assetid/48550?&print=yes
American Scientist Online, the Magazine of Sigma Xi, the Scientific Research Society
[Click the URL to view the graphic.]
January-February 2006, Volume 94, Number 1, Page 12. DOI: 10.1511/2006.1.12
COMPUTING SCIENCE
The mathematics of Sudoku, a puzzle that boasts "No math required!"
Brian Hayes

A few years ago, if you had noticed someone filling in a crossword puzzle with numbers instead of letters, you might well have looked askance. Today you would know that the puzzle is not a crossword but a Sudoku. The craze has circled the globe. It's in the newspaper, the bookstore, the supermarket checkout line; Web sites offer puzzles on demand; you can even play it on your cell phone.

Just in case this column might fall into the hands of the last person in North America who hasn't seen a Sudoku, an example is given on the opposite page. The standard puzzle grid has 81 cells, organized into nine rows and nine columns and also marked off into nine three-by-three blocks. Some of the cells are already filled in with numbers called givens. The aim is to complete the grid in such a way that every row, every column and every block has exactly one instance of each number from 1 to 9. A well-formed puzzle has one and only one solution.

The instructions that accompany Sudoku often reassure the number-shy solver that "No mathematics is required."
What this really means is that no arithmetic is required. You don't have to add up columns of figures; you don't even have to count. As a matter of fact, the symbols in the grid need not be numbers at all; letters or colors or fruits would do as well. In this sense it's true that solving the puzzle is not a test of skill in arithmetic. On the other hand, if we look into Sudoku a little more deeply, we may well find some mathematical ideas lurking in the background.

A Puzzling Provenance

The name "Sudoku" is Japanese, but the game itself is almost surely an American invention. The earliest known examples were published in 1979 in Dell Pencil Puzzles & Word Games, where they were given the title Number Place. The constructor of the puzzles is not identified in the magazine, but Will Shortz, the puzzles editor of The New York Times, thinks he has identified the author through a process of logical deduction reminiscent of what it takes to solve a Sudoku. Shortz examined the list of contributors in several Dell magazines; he found a single name that was always present if an issue included a Number Place puzzle, and never present otherwise. The putative inventor identified in this way was Howard Garns, an architect from Indianapolis who died in 1989. Mark Lagasse, senior executive editor of Dell Puzzle Magazines, concurs with Shortz's conclusion, although he says Dell has no records attesting to Garns's authorship, and none of the editors now on the staff were there in 1979.

The later history is easier to trace. Dell continued publishing the puzzles, and in 1984 the Japanese firm Nikoli began including puzzles of the same design in one of its magazines. (Puzzle publishers, it seems, are adept at the sincerest form of flattery.) Nikoli named the puzzle "suji wa dokushin ni kagiru," which I am told means "the numbers must be single"--single in the sense of unmarried. The name was soon shortened to Sudoku, which is usually translated as "single numbers."
Nikoli secured a trademark on this term in Japan, and so later Japanese practitioners of sincere flattery have had to adopt other names. Ed Pegg, writing in the Mathematical Association of America's MAA Online, points out an ironic consequence: Many Japanese know the puzzle by its English name Number Place, whereas the English-speaking world prefers the Japanese term Sudoku.

The next stage in the puzzle's east-to-west circumnavigation was a brief detour to the south. Wayne Gould, a New Zealander who was a judge in Hong Kong before the British lease expired in 1997, discovered Sudoku on a trip to Japan and wrote a computer program to generate the puzzles. Eventually he persuaded The Times of London to print them; the first appeared in November 2004. The subsequent fad in the U.K. was swift and intense. Other newspapers joined in, with The Daily Telegraph running the puzzle on its front page. There was boasting about who had the most and the best Sudoku, and bickering over the supposed virtues of handmade versus computer-generated puzzles. In July 2005 a Sudoku tournament was televised in Britain; the event was promoted by carving a 275-foot grid into a grassy hillside near Bristol. (It soon emerged that this "world's largest Sudoku" was defective.)

Sudoku came back to the U.S. in the spring of 2005. Here too the puzzle has become a popular pastime, although perhaps not quite the all-consuming obsession it was in the U.K. I don't believe anyone will notice a dip in the U.S. gross domestic product as a result of this mass distraction. On the other hand, I must report that my own motive for writing on the subject is partly to justify the appalling number of hours I have squandered solving Sudoku.

Hints and Heuristics

If you take a pencil to a few Sudoku problems, you'll quickly discover various useful rules and tricks.
The most elementary strategy for solving the puzzle is to examine each cell and list all its possible occupants--that is, all the numbers not ruled out by a conflict with another cell. If you find a cell that has only one allowed value, then obviously you can write that value in. The complementary approach is to note all the cells within a row, a column or a block where some particular number can appear; again, if there is a number that can be put in only one position, then you should put it there. In either case, you can eliminate the selected number as a candidate in all other cells in the same neighborhood. Some Sudoku can be solved by nothing more than repeated application of these two rules--but if all the puzzles were so straightforward, the fad would not have lasted long. Barry Cipra, a mathematician and writer in Northfield, Minnesota, describes a hierarchy of rules of increasing complexity. The rules mentioned above constitute level 1: They restrict a cell to a single value or restrict a value to a single cell. At level 2 are rules that apply to pairs of cells within a row, column or block; when two such cells have only two possible values, those values are excluded elsewhere in the neighborhood. Level-3 rules work with triples of cells and values in the same way. In principle, the tower of rules might rise all the way to level 9. This sequence of rules suggests a simple scheme for rating the difficulty of puzzles. Unfortunately, however, not all Sudoku can be solved by these rules alone; some of the puzzles seem to demand analytic methods that don't have a clear place in the hierarchy. A few of these tactics have even acquired names, such as "swordfish" and "x-wing." The subtlest of them are nonlocal rules that bring together information from across a wide swath of the matrix. When you are solving a specific puzzle, the search for patterns that trigger the various rules is where the fun is (assuming you go in for that sort of thing). 
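The two level-1 rules described above (fill a cell whose candidate list has shrunk to a single value; place a value that fits in only one cell of a row, column or block) are easy to mechanize. A minimal sketch in Python, with naming conventions of my own rather than from any published solver:

```python
def peers(r, c):
    """Cells sharing a row, column or 3-by-3 block with (r, c)."""
    cells = {(r, j) for j in range(9)} | {(i, c) for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    cells |= {(br + i, bc + j) for i in range(3) for j in range(3)}
    cells.discard((r, c))
    return cells

def candidates(grid, r, c):
    """Values not ruled out by a conflict with another cell (0 = empty)."""
    return set(range(1, 10)) - {grid[i][j] for (i, j) in peers(r, c)}

UNITS = ([[(r, c) for c in range(9)] for r in range(9)] +         # rows
         [[(r, c) for r in range(9)] for c in range(9)] +         # columns
         [[(br + i, bc + j) for i in range(3) for j in range(3)]  # blocks
          for br in (0, 3, 6) for bc in (0, 3, 6)])

def level1_pass(grid):
    """One sweep of both level-1 rules; returns True if any cell was filled."""
    filled = False
    # Rule 1: a cell with exactly one allowed value gets that value.
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                cand = candidates(grid, r, c)
                if len(cand) == 1:
                    grid[r][c] = cand.pop()
                    filled = True
    # Rule 2: a value that can go in only one cell of a unit goes there.
    for unit in UNITS:
        for v in range(1, 10):
            spots = [(r, c) for (r, c) in unit
                     if grid[r][c] == 0 and v in candidates(grid, r, c)]
            if len(spots) == 1:
                r, c = spots[0]
                grid[r][c] = v
                filled = True
    return filled

# Demo: blank three cells of a valid solved grid and recover them.
solution = [[(3 * (r % 3) + r // 3 + c) % 9 + 1 for c in range(9)]
            for r in range(9)]
grid = [row[:] for row in solution]
for r, c in [(0, 0), (3, 4), (8, 8)]:
    grid[r][c] = 0
while level1_pass(grid):
    pass
assert grid == solution
```

Repeating `level1_pass` until it returns False dispatches the easiest puzzles; harder ones stall and need higher-level rules or backtracking.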
But if you are trying to gain a higher-level understanding of Sudoku, compiling a catalog of such techniques doesn't seem very promising. The rules are too many, too various and too specialized. Rather than discuss methods for solving specific puzzles, I want to ask some more-general questions about Sudoku, and look at it as a computational problem rather than a logic puzzle. How hard a problem is it? Pencil-and-paper experience suggests that some instances are much tougher than others, but are there any clear-cut criteria for ranking or classifying the puzzles?

Counting Solutions

In the search for general principles, a first step is to generalize the puzzle itself. The standard 81-cell Sudoku grid is not the only possibility. For any positive integer n, we can draw an order-n Sudoku grid with n^2 rows, n^2 columns and n^2 blocks; the grid has a total of n^4 cells, which are to be filled with numbers in the range from 1 to n^2. The standard grid with 81 cells is of order 3. Some publishers produce puzzles of order 4 (256 cells) and order 5 (625 cells). On the smaller side, there's not much to say about the order-1 puzzle. The order-2 Sudoku (with 4 rows, columns and blocks, and 16 cells in all) is no challenge as a puzzle, but it does serve as a useful test case for studying concepts and algorithms.

[Figure: Order-2 Sudoku]

How many Sudoku solutions exist for each n? To put the question another way: Starting from a blank grid--with no givens at all--how many ways can the pattern be completed while obeying the Sudoku constraints? As a first approximation, we can simplify the problem by ignoring the blocks in the Sudoku grid, allowing any solution in which each column and each row has exactly one instance of each number. A pattern of this kind is known as a Latin square, and it was already familiar to Leonhard Euler more than 200 years ago. Consider the 4 x 4 Latin square (which corresponds to the order-2 Sudoku).
Euler counted them: There are exactly 576 ways of arranging the numbers 1, 2, 3 and 4 in a square array with no duplications in any row or column. It follows that 576 is an upper limit on the number of order-2 Sudoku. (Every Sudoku solution is necessarily a Latin square, but not every Latin square is a valid Sudoku.) In a series of postings on the Sudoku Programmers Forum, Frazer Jarvis of the University of Sheffield showed that exactly half the 4 x 4 Latin squares are Sudoku solutions; that is, there are 288 valid arrangements. (The method of counting is summarized in the illustration on the next page.) Moving to higher-order Sudoku and larger Latin squares, the counting gets harder in a hurry. Euler got only as far as the 5 x 5 case, and the 9 x 9 Latin squares were not enumerated until 1975; the tally is 5,524,751,496,156,892,842,531,225,600, or about 6 x 10^27. The order-3 Sudoku must be a subset of these squares. They were counted in June 2005 by Bertram Felgenhauer of the Technical University of Dresden in collaboration with Jarvis. The total they computed is 6,670,903,752,021,072,936,960, or 7 x 10^21. Thus, among all the 9 x 9 Latin squares, a little more than one in a million are also Sudoku grids. It's a matter of definition, however, whether all those patterns are really different. The Sudoku grid has many symmetries. If you take any solution and rotate it by a multiple of 90 degrees, you get another valid grid; in the tabulations above, these variants are counted as separate entries. Beyond the obvious rotations and reflections, you can permute the rows within a horizontal band of blocks or the columns within a vertical stack of blocks, and you can also freely shuffle the bands and stacks themselves. Furthermore, the numerals in the cells are arbitrary markers, which can also be permuted; for example, if you switch all the 5s and 6s in a puzzle, you get another valid puzzle. 
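The order-2 counts quoted above (576 Latin squares of size 4, of which 288 also satisfy the block constraint) are small enough to confirm by direct enumeration. A backtracking sketch of my own, not the counting method Jarvis used:

```python
def count_grids(require_blocks):
    """Count 4x4 grids with no repeat in any row or column, and
    optionally no repeat in any 2x2 block (the order-2 Sudoku rule)."""
    grid = [[0] * 4 for _ in range(4)]
    count = 0

    def ok(r, c, v):
        if any(grid[r][j] == v for j in range(4)):      # row conflict
            return False
        if any(grid[i][c] == v for i in range(4)):      # column conflict
            return False
        if require_blocks:
            br, bc = 2 * (r // 2), 2 * (c // 2)         # block conflict
            if any(grid[i][j] == v
                   for i in (br, br + 1) for j in (bc, bc + 1)):
                return False
        return True

    def fill(k):                 # fill cells in row-major order
        nonlocal count
        if k == 16:
            count += 1
            return
        r, c = divmod(k, 4)
        for v in (1, 2, 3, 4):
            if ok(r, c, v):
                grid[r][c] = v
                fill(k + 1)
                grid[r][c] = 0   # undo and try the next value

    fill(0)
    return count

assert count_grids(require_blocks=False) == 576   # 4x4 Latin squares
assert count_grids(require_blocks=True) == 288    # order-2 Sudoku grids
```

The same routine generalizes to order 3 in principle, but as the text notes, generating all 6,670,903,752,021,072,936,960 grids that way is out of the question.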
When all these symmetries are taken into account, the number of essentially different Sudoku patterns is reduced substantially. In the case of the order-2 Sudoku, it turns out there are actually only two distinct grids! The rest of the 288 patterns can all be generated from these two by applying various symmetry operations. In the order-3 case, the reduction is also dramatic, although it still leaves an impressive number of genuinely different solutions: 3,546,146,300,288, or 4 x 10^12.

Does the large number of order-3 Sudoku grids tell us anything about the difficulty of solving the puzzle? Maybe. If we set out to solve it by some kind of search algorithm, then the number of patterns to be considered is a relevant factor. But any strategy that involves generating all 6,670,903,752,021,072,936,960 grids is probably not the best way to go about solving the puzzle.

NP or Not NP, That Is the Question

Computer science has an elaborate hierarchy for classifying problems according to difficulty, and the question of where Sudoku fits into this scheme has elicited some controversy and confusion. It is widely reported that Sudoku belongs in the class NP, a set of notoriously difficult problems; meanwhile, however, many computer programs effortlessly solve any order-3 Sudoku puzzle. There is actually no contradiction in these facts, but pointing that out does little to dispel the confusion. Complexity classes such as NP do not measure the difficulty of any specific problem instance but rather describe the rate at which difficulty grows as a function of problem size. If we can solve an order-n Sudoku, how much harder will we have to work to solve a puzzle of order n + 1? For the hardest problems in NP, the effort needed by every known method grows exponentially.
Most discussions of the complexity of Sudoku refer to the work of Takayuki Yato and Takahiro Seta of the University of Tokyo, whose analysis relates the task of solving Sudoku to the similar problem of completing a partially specified Latin square. The latter problem in turn has been connected with others that are already known to be in NP. This process of "reduction" from one problem to another is the standard way of establishing the complexity classes of computational problems. Yato and Seta employ an unusual form of reduction that addresses the difficulty of finding an additional solution after a first solution is already known. In Sudoku, of course, well-formed puzzles are expected to have only one solution. Yato and Seta say their result applies nonetheless. I don't quite follow their reasoning on this point, but the literature of complexity theory is vast and technical, and the fault is likely my own.

When you lay down your pencil on a completed Sudoku, the thought that you've just dispatched a problem in the class NP may boost your psychological well-being, but the NP label doesn't say anything about the relative difficulty of individual Sudoku puzzles. For that, a different kind of hierarchy is needed. Many publishers rank their Sudoku on a scale from easy to hard (or from gentle to diabolical). The criteria for these ratings are not stated, and it's a common experience to breeze through a "very hard" puzzle and then get stuck on a "medium."

One easily measured factor that might be expected to influence difficulty is the number of givens. In general, having fewer cells specified at the outset ought to make for a harder puzzle. At the extremes of the range, it's clear that having all the cells filled in makes a puzzle very easy indeed, and having none filled in leaves the problem under-specified. What is the minimum number of givens that can ensure a unique solution? For an order-n grid, there is a lower bound of n^2 - 1.
For example, on an order-3 grid with fewer than eight givens, there must be at least two numbers that appear nowhere among the givens. With no constraints on those symbols, there are at least two solutions in which their roles are interchanged. Can the n^2 - 1 bound be achieved in practice? For n = 1 the answer is yes. On the order-2 grid there are uniquely solvable puzzles with four givens but not, I think, with three. (Finding the arrangements with just four givens is itself a pleasant puzzle.) For order 3, the minimum number of givens is unknown. Gordon Royle of the University of Western Australia has collected more than 24,000 examples of uniquely solvable grids with 17 givens, and he has found none with fewer than 17, but a proof is lacking.

Published puzzles generally have between 25 and 30 givens. Within this range, the correlation between number of givens and difficulty rating is weak. In one book, I found that the "gentle" puzzles averaged 28.3 givens and the "diabolical" ones 28.0.

Logic Rules

Many puzzle constructors distinguish between puzzles that can be solved "by logic alone" and those that require "trial and error." If you solve by logic, you never write a number into a cell until you can prove that only that number can appear in that position. Trial and error allows for guessing: You fill in a number tentatively, explore the consequences, and if necessary backtrack, removing your choice and trying another. A logic solver can work with a pen; a backtracker needs a pencil and eraser.

For the logic-only strategy to work, a puzzle must have a quality of progressivism: At every stage in the solution, there must be at least one cell whose value can be determined unambiguously. Filling in that value must then uncover at least one other fully determined value, and so on.
The backtracking protocol dispenses with progressivism: When you reach a state where no choice is forced upon you--where every vacant cell has at least two candidates--you choose a path arbitrarily. The distinction between logic and backtracking seems like a promising criterion for rating the difficulty of puzzles, but on a closer look, it's not clear the distinction even exists. Is there a subset of Sudoku puzzles that can be solved by backtracking but not by "logic"? Here's another way of asking the question: Are there puzzles that have a unique solution, and yet at some intermediate stage reach an impasse, where no cell has a value that can be deduced unambiguously? Not, I think, unless we impose artificial restrictions on the rules allowed in making logical deductions.

Backtracking itself can be viewed as a logical operation; it supplies a proof by contradiction. If you make a speculative entry in one cell and, as a consequence, eventually find that some other cell has no legal entry, then you have discovered a logical relation between the cells. The chain of implication could be very intricate, but the logical relation is no different in kind from the simple rule that says two cells in the same row can't have the same value. (David Eppstein of the University of California at Irvine has formulated some extremely subtle Sudoku rules, which capture the kind of information gleaned from a backtracking analysis, yet work in a forward-looking, nonspeculative mode.)

A Satisfied Mind

From a computational point of view, Sudoku is a constraint-satisfaction problem. The constraints are the rules forbidding two cells in the same neighborhood to have the same value; a solution is an assignment of values to cells that satisfies all the constraints simultaneously. In one obvious encoding, there are 810 constraints in an order-3 grid. It's interesting to observe how differently one approaches such a problem when solving it by computer rather than by hand.
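The figure of 810 matches one natural encoding: each of the 81 cells must differ from its 20 peers (8 in its row, 8 in its column, and 4 more in its block), and counting each unordered pair of cells once gives 81 x 20 / 2 = 810. A quick check of that counting (my framing, which may or may not be the encoding the author had in mind):

```python
def peers(r, c):
    """Cells that must differ from (r, c): same row, column or block."""
    cells = {(r, j) for j in range(9)} | {(i, c) for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    cells |= {(br + i, bc + j) for i in range(3) for j in range(3)}
    cells.discard((r, c))
    return cells

# Every cell has 20 peers; each unordered pair is one "must differ" constraint.
assert all(len(peers(r, c)) == 20 for r in range(9) for c in range(9))
pairs = {frozenset([(r, c), p])
         for r in range(9) for c in range(9) for p in peers(r, c)}
assert len(pairs) == 810
```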
A human solver may well decide that logic is all you need, but backtracking is the more appealing option for a program. For one thing, backtracking will always find the answer, if there is one. It can even do the right thing if there are multiple solutions or no solution. To make similar claims for a logic-only program, you would have to prove you had included every rule of inference that might possibly be needed. Backtracking is also the simpler approach, in the sense that it relies on one big rule rather than many little ones. At each stage you choose a value for some cell and check to see if this new entry is consistent with the rest of the grid. If you detect a conflict, you have to undo the choice and try another. If you have exhausted all the candidates for a given cell, then you must have taken a wrong turn earlier, and you need to backtrack further. This is not a clever algorithm; it amounts to a depth-first search of the tree of all possible solutions--a tree that could have 9^81 leaves. There is no question that we are deep in the exponential territory of NP problems here. And yet, in practice, solving Sudoku by backtracking is embarrassingly easy. There are many strategies for speeding up the search, mostly focused on making a shrewd choice of which branch of the tree to try next. But such optimizations are hardly needed. On an order-3 Sudoku grid, even a rudimentary backtracking search converges on the solution in a few dozen steps. Evidently, competing against a computer in Sudoku is never going to be much fun. Does that ruin the puzzle for the rest of us? In moments of frustration, when I'm struggling with a recalcitrant diabolical, the thought that the machine across the room could instantly sweep away all my cobwebs of logic is indeed dispiriting. I begin to wonder whether this cross-correlation of columns, rows and blocks is a fit task for the human mind. But when I do make a breakthrough, I take more pleasure in my success than the computer would. 
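The depth-first backtracking search described above fits in a few lines of code. A minimal sketch (my own, not any particular published solver): pick the next empty cell, try each consistent value, recurse, and undo on failure.

```python
def consistent(grid, r, c, v):
    """True if value v can be placed at (r, c) without a conflict."""
    if v in grid[r]:                                  # row clash
        return False
    if any(grid[i][c] == v for i in range(9)):        # column clash
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)               # block clash
    return all(grid[i][j] != v
               for i in range(br, br + 3) for j in range(bc, bc + 3))

def solve(grid):
    """Fill a 9x9 grid in place (0 = empty) by depth-first search."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for v in range(1, 10):
                    if consistent(grid, r, c, v):
                        grid[r][c] = v                # tentative entry
                        if solve(grid):
                            return True
                        grid[r][c] = 0                # undo: backtrack
                return False      # every candidate failed; back up further
    return True                   # no empty cells left: solved
```

Even started on a completely blank grid, this naive row-major search finds a valid completion quickly; with 25 to 30 givens the search tree is pruned much further.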
Brian Hayes

Bibliography

* Bammel, Stanley E., and Jerome Rothstein. 1975. The number of 9 x 9 Latin squares. Discrete Mathematics 11:93-95.
* Eppstein, David. Preprint. Nonrepetitive paths and cycles in graphs with application to sudoku. http://arxiv.org/abs/cs.DS/0507053
* Felgenhauer, Bertram, and Frazer Jarvis. Preprint. Enumerating possible Sudoku grids. http://www.shef.ac.uk/~pm1afj/sudoku/sudoku.pdf
* Pegg, Ed. 2005. Math Games: Sudoku variations. MAA Online. http://www.maa.org/editorial/mathgames/mathgames_09_05_05.html
* Royle, Gordon F. 2005. Minimum sudoku. http://www.csse.uwa.edu.au/~gordon/sudokumin.php
* Simonis, Helmut. 2005. Sudoku as a constraint problem. In Proceedings of the Fourth International Workshop on Modelling and Reformulating Constraint Satisfaction Problems, pp. 13-27.
* Sudoku Programmers Forum. 2005. Discussion thread, May 5, 2005, through June 11, 2005. http://www.setbb.com/phpbb/viewtopic.php?t=27&mforum=sudoku
* Wikipedia. 2005. Sudoku. http://en.wikipedia.org/wiki/Sudoku
* Yato, Takayuki, and Takahiro Seta. 2002. Complexity and completeness of finding another solution and its application to puzzles. http://www-imai.is.s.u-tokyo.ac.jp/~yato/data2/SIGAL87-2.pdf

From checker at panix.com Mon Jan 9 15:39:55 2006
From: checker at panix.com (Premise Checker)
Date: Mon, 9 Jan 2006 10:39:55 -0500 (EST)
Subject: [Paleopsych] NYT: Quantum Trickery: Testing Einstein's Strangest Theory
Message-ID:

Quantum Trickery: Testing Einstein's Strangest Theory
http://www.nytimes.com/2005/12/27/science/27eins.html
By DENNIS OVERBYE

Einstein said there would be days like this. This fall scientists announced that they had put a half dozen beryllium atoms into a "cat state." No, they were not sprawled along a sunny windowsill. To a physicist, a "cat state" is the condition of being two diametrically opposed conditions at once, like black and white, up and down, or dead and alive.
These atoms were each spinning clockwise and counterclockwise at the same time. Moreover, like miniature Rockettes they were all doing whatever it was they were doing together, in perfect synchrony. Should one of them realize, like the cartoon character who runs off a cliff and doesn't fall until he looks down, that it is in a metaphysically untenable situation and decide to spin only one way, the rest would instantly fall in line, whether they were across a test tube or across the galaxy. The idea that measuring the properties of one particle could instantaneously change the properties of another one (or a whole bunch) far away is strange to say the least - almost as strange as the notion of particles spinning in two directions at once. The team that pulled off the beryllium feat, led by Dietrich Leibfried at the National Institute of Standards and Technology, in Boulder, Colo., hailed it as another step toward computers that would use quantum magic to perform calculations. But it also served as another demonstration of how weird the world really is according to the rules, known as quantum mechanics. The joke is on Albert Einstein, who, back in 1935, dreamed up this trick of synchronized atoms - "spooky action at a distance," as he called it - as an example of the absurdity of quantum mechanics. "No reasonable definition of reality could be expected to permit this," he, Boris Podolsky and Nathan Rosen wrote in a paper in 1935. Today that paper, written when Einstein was a relatively ancient 56 years old, is the most cited of Einstein's papers. But far from demolishing quantum theory, that paper wound up as the cornerstone for the new field of quantum information. 
Nary a week goes by that does not bring news of another feat of quantum trickery once only dreamed of in thought experiments: particles (or at least all their properties) being teleported across the room in a microscopic version of Star Trek beaming; electrical "cat" currents that circle a loop in opposite directions at the same time; more and more particles farther and farther apart bound together in Einstein's spooky embrace now known as "entanglement." At the University of California, Santa Barbara, researchers are planning an experiment in which a small mirror will be in two places at once.

Niels Bohr, the Danish philosopher king of quantum theory, dismissed any attempts to lift the quantum veil as meaningless, saying that science was about the results of experiments, not ultimate reality. But now that quantum weirdness is not confined to thought experiments, physicists have begun arguing again about what this weirdness means, whether the theory needs changing, and whether in fact there is any problem.

This fall two Nobel laureates, Anthony Leggett of the University of Illinois and Norman Ramsey of Harvard, argued in front of several hundred scientists at a conference in Berkeley about whether, in effect, physicists were justified in trying to change quantum theory, the most successful theory in the history of science. Dr. Leggett said yes; Dr. Ramsey said no. It has been, as Max Tegmark, a cosmologist at the Massachusetts Institute of Technology, noted, "a 75-year war."

It is typical in reporting on this subject to bounce from one expert to another, each one shaking his or her head about how the other one just doesn't get it. "It's a kind of funny situation," N. David Mermin of Cornell, who has called Einstein's spooky action "the closest thing we have to magic," said, referring to the recent results. "These are extremely difficult experiments that confirm elementary features of quantum mechanics."
It would be more spectacular news, he said, if they had come out wrong.

Anton Zeilinger of the University of Vienna said that he thought, "The world is not as real as we think." He added, "My personal opinion is that the world is even weirder than what quantum physics tells us."

The discussion is bringing renewed attention to Einstein's role as a founder and critic of quantum theory, an "underground history" that has largely been overlooked amid the celebrations of relativity in the past Einstein year, according to David Z. Albert, a professor of philosophy and physics at Columbia. Regarding the 1935 paper, Dr. Albert said, "We know something about Einstein's genius we didn't know before."

The Silly Theory

From the day 100 years ago that he breathed life into quantum theory by deducing that light behaved like a particle as well as like a wave, Einstein never stopped warning that it was dangerous to the age-old dream of an orderly universe. If light was a particle, how did it know which way to go when it was issued from an atom? "The more success the quantum theory has, the sillier it seems," Einstein once wrote to a friend.

The full extent of its silliness came in the 1920's when quantum theory became quantum mechanics. In this new view of the world, as encapsulated in a famous equation by the Austrian Erwin Schrödinger, objects are represented by waves that extend throughout space, containing all the possible outcomes of an observation - here, there, up or down, dead or alive. The amplitude of this wave is a measure of the probability that the object will actually be found to be in one state or another, a suggestion that led Einstein to grumble famously that God doesn't throw dice.

Worst of all from Einstein's point of view was the uncertainty principle, enunciated by Werner Heisenberg in 1927.
Certain types of knowledge, of a particle's position and velocity, for example, are incompatible: the more precisely you measure one property, the blurrier and more uncertain the other becomes. In the 1935 paper, Einstein and his colleagues, usually referred to as E.P.R., argued that the uncertainty principle could not be the final word about nature. There must be a deeper theory that looked behind the quantum veil. Imagine that a pair of electrons are shot out from the disintegration of some other particle, like fragments from an explosion. By law certain properties of these two fragments should be correlated. If one goes left, the other goes right; if one spins clockwise, the other spins counterclockwise. That means, Einstein said, that by measuring the velocity of, say, the left hand electron, we would know the velocity of the right hand electron without ever touching it. Conversely, by measuring the position of the left electron, we would know the position of the right hand one. Since neither of these operations would have involved touching or disturbing the right hand electron in any way, Einstein, Podolsky and Rosen argued that the right hand electron must have had those properties of both velocity and position all along. That left only two possibilities, they concluded. Either quantum mechanics was "incomplete," or measuring the left hand particle somehow disturbed the right hand one. But the latter alternative violated common sense. Such an influence, or disturbance, would have to travel faster than the speed of light. "My physical instincts bristle at that suggestion," Einstein later wrote. Bohr responded with a six-page essay in Physical Review that contained but one simple equation, Heisenberg's uncertainty relation. In essence, he said, it all depends on what you mean by "reality." Enjoy the Magic Most physicists agreed with Bohr, and they went off to use quantum mechanics to build atomic bombs and reinvent the world. 
The consensus was that Einstein was a stubborn old man who "didn't get" quantum physics. All this began to change in 1964 when John S. Bell, a particle physicist at the European Center for Nuclear Research near Geneva, who had his own doubts about quantum theory, took up the 1935 E.P.R. argument. Somewhat to his dismay, Bell, who died in 1990, wound up proving that no deeper local theory of the sort Einstein had envisioned could reproduce the predictions of quantum mechanics. Bell went on to outline a simple set of experiments that could settle the argument and decide who was right, Einstein or Bohr. When the experiments were finally performed in 1982 by Alain Aspect and his colleagues at the University of Orsay in France, they agreed with quantum mechanics, not with reality as Einstein had always presumed it should be. Apparently a particle in one place could be affected by what you do somewhere else. "That's really weird," Dr. Albert said, calling it "a profoundly deep violation of an intuition that we've been walking with since caveman days." Physicists and philosophers are still fighting about what this means. Many of those who care to think about these issues (and many prefer not to) concluded that Einstein's presumption of locality - the idea that physically separated objects are really separate - is wrong. Dr. Albert said, "The experiments show locality is false, end of story." But for others, it is the notion of realism, that things exist independent of being perceived, that must be scuttled. In fact, physicists don't even seem to agree on the definitions of things like "locality" and "realism." "I would say we have to be careful saying what's real," Dr. Mermin said. "Properties cannot be said to be there until they are revealed by an actual experiment." What everybody does seem to agree on is that the use of this effect is limited. You can't use it to send a message, for example. 
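Both points, the strange correlations and the impossibility of signaling, can be made concrete with a small simulation (a sketch, not anything from the article; this is a classical sampler that merely reproduces the quantum statistics for a spin-1/2 singlet pair, where the prediction is a correlation of -cos(theta_a - theta_b)):

```python
import math, random

def singlet_trial(theta_a, theta_b, rng):
    """One measurement on an entangled singlet pair.

    Quantum mechanics predicts the two outcomes agree with probability
    sin^2((theta_a - theta_b)/2); each observer's own outcome is a fair coin.
    """
    a = rng.choice([+1, -1])                        # Alice's result: a pure coin flip
    p_same = math.sin((theta_a - theta_b) / 2) ** 2
    b = a if rng.random() < p_same else -a          # Bob's result, correlated with Alice's
    return a, b

def run(theta_a, theta_b, n=100_000, seed=1):
    rng = random.Random(seed)
    trials = [singlet_trial(theta_a, theta_b, rng) for _ in range(n)]
    mean_a = sum(a for a, _ in trials) / n          # Alice's marginal: ~0, whatever Bob measures
    corr = sum(a * b for a, b in trials) / n        # joint correlation: ~ -cos(theta_a - theta_b)
    return mean_a, corr
```

Alice's own record of +1's and -1's has the same statistics no matter which angle Bob chooses, which is why no message can ride on the correlation; the entanglement only shows up when the two records are compared.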
Leonard Susskind, a Stanford theoretical physicist who called these entanglement experiments "beautiful and surprising," said the term "spooky action at a distance" was misleading because it implied the instantaneous sending of signals. "No competent physicist thinks that entanglement allows this kind of nonlocality." Indeed, the effects of spooky action, or "entanglement," as Schrödinger called it, only show up in retrospect when the two participants in a Bell-type experiment compare notes. Beforehand, neither has seen any violation of business as usual; each sees the results of his measurements of, say, whether a spinning particle is pointing up or down, as random. In short, as Brian Greene, the Columbia theorist, wrote in "The Fabric of the Cosmos," Einstein's special relativity, which sets the speed of light as the cosmic speed limit, "survives by the skin of its teeth." In an essay in 1985, Dr. Mermin said that "if there is spooky action at a distance, then, like other spooks, it is absolutely useless except for its effect, benign or otherwise, on our state of mind." He added, "The E.P.R. experiment is as close to magic as any physical phenomenon I know of, and magic should be enjoyed." In a recent interview, he said he still stood by the latter part of that statement. But while spooky action remained useless for sending a direct message, it had turned out to have potential uses, he admitted, in cryptography and quantum computing. Nine Ways of Killing a Cat Another debate, closely related to the issues of entanglement and reality, concerns what happens at the magic moment when a particle is measured or observed. Before a measurement is made, so the traditional story goes, the electron exists in a superposition of all possible answers, which can combine, adding and interfering with one another. Then, upon measurement, the wave function "collapses" to one particular value. Schrödinger himself thought this was so absurd that he dreamed up a counterexample. 
What is true for electrons, he said, should be true as well for cats. In his famous thought experiment, a cat is locked in a box where the decay of a radioactive particle will cause the release of poison that will kill it. If the particle has a 50-50 chance of decaying, then according to quantum mechanics the cat is both alive and dead before we look in the box, something the cat itself, not to mention cat lovers, might take issue with. But cats are always dead or alive, as Dr. Leggett of Illinois said in his Berkeley talk. "The problem with quantum mechanics," he said in an interview, "is how it explains definite outcomes to experiments." If quantum mechanics is only about information and a way of predicting the results of measurements, these questions don't matter, most quantum physicists say. "But," Dr. Leggett said, "if you take the view that the formalism is reflecting something out there in the real world, it matters immensely." As a result, theorists have come up with a menu of alternative interpretations and explanations. According to one popular notion, known as decoherence, quantum waves are very fragile and collapse from bumping into the environment. Another theory, by the late David Bohm, restores determinism by postulating a "pilot wave" that acts behind the scenes to guide particles. In yet another theory, called "many worlds," the universe continually branches so that every possibility is realized: the Red Sox win and lose and it rains; Schrödinger's cat lives, dies, has kittens and scratches her master when he tries to put her into the box. Recently, as Dr. Leggett pointed out, some physicists have tinkered with Schrödinger's equation itself, the source of much of the misery. A modification proposed by the Italian physicists Giancarlo Ghirardi and Tullio Weber, both of the University of Trieste, and Alberto Rimini of the University of Pavia, makes the wave function unstable so that it will collapse in a time depending on how big a system it represents. 
In his standoff with Dr. Ramsey of Harvard last fall, Dr. Leggett suggested that his colleagues should consider the merits of the latter theory. "Why should we think of an electron as being in two states at once but not a cat, when the theory is ostensibly the same in both cases?" Dr. Leggett asked. Dr. Ramsey said that Dr. Leggett had missed the point. How the wave function mutates is not what you calculate. "What you calculate is the prediction of a measurement," he said. "If it's a cat, I can guarantee you will get that it's alive or dead," Dr. Ramsey said. David Gross, a recent Nobel winner and director of the Kavli Institute for Theoretical Physics in Santa Barbara, leapt into the free-for-all, saying that 80 years had not been enough time for the new concepts to sink in. "We're just too young. We should wait until 2200 when quantum mechanics is taught in kindergarten." The Joy of Randomness One of the most extreme points of view belongs to Dr. Zeilinger of Vienna, a bearded, avuncular physicist whose laboratory regularly hosts every sort of quantum weirdness. In a recent essay in Nature, Dr. Zeilinger sought to find meaning in the very randomness that plagued Einstein. "The discovery that individual events are irreducibly random is probably one of the most significant findings of the 20th century," Dr. Zeilinger wrote. Dr. Zeilinger suggested that reality and information are, in a deep sense, indistinguishable, a concept that John Archibald Wheeler, the Princeton physicist, called "it from bit." In information, the basic unit is the bit, but one bit, he says, is not enough to specify both the spin and the trajectory of a particle. So one quality remains unknown, irreducibly random. As a result of the finiteness of information, he explained, the universe is fundamentally unpredictable. "I suggest that this randomness of the individual event is the strongest indication we have of a reality 'out there' existing independently of us," Dr. Zeilinger wrote in Nature. 
He added, "Maybe Einstein would have liked this idea after all." From checker at panix.com Mon Jan 9 15:40:25 2006 From: checker at panix.com (Premise Checker) Date: Mon, 9 Jan 2006 10:40:25 -0500 (EST) Subject: [Paleopsych] NYT: Slowly, Cancer Genes Tender Their Secrets Message-ID: Slowly, Cancer Genes Tender Their Secrets http://www.nytimes.com/2005/12/27/health/27canc.html [I usually pass over articles that merely confirm what Mr. Mencken said, that the human body proves the truth of the Trinity, namely that man was made by a committee and that so much goes wrong. I am far more interested in progress than breakdowns. But this article is all about progress, the enormous conceptual progress we are getting in understanding this particular kind of breakdown. [Read carefully.] By GINA KOLATA Jay Weinstein found out that he had chronic myelogenous leukemia in 1996, two weeks before his marriage. He was a New York City firefighter, and he thought his health was great. He learned that there was little hope for a cure. The one treatment that could save him was a bone marrow transplant, but that required a donor, and he did not have one. By 1999, his disease was nearing its final, fatal phase. He might have just weeks to live. Then, Mr. Weinstein had a stroke of luck. He managed to become one of the last patients to enroll in a preliminary study at the Oregon Health & Science University, testing an experimental drug. Mr. Weinstein is alive today and still taking the drug, now on the market as Gleevec. Its maker, Novartis, supplies it to him free because he participated in the clinical trial. Dr. Brian Druker, a Howard Hughes investigator at the university's Cancer Institute, who led the Gleevec study, sees Mr. Weinstein as a pioneer in a new frontier of science. His treatment was based not on blasting cancer cells with harsh chemotherapy or radiation but instead on using a sort of molecular razor to cut them out. That, Dr. 
Druker and others say, is the first fruit of a new understanding of cancer as a genetic disease. But if cancer is a genetic disease, it is like no other in medicine. With cancer, a person may inherit a predisposition that helps set the process off, but it can take decades - even a lifetime - to accumulate the additional mutations needed to establish a tumor. That is why, scientists say, cancer usually strikes older people and requires an element of bad luck. "You have to get mutations in the wrong place at the wrong time," Dr. Druker says. Other genetic diseases may involve one or two genetic changes. In cancer, scores of genes are mutated or duplicated and huge chunks of genetic material are rearranged. With cancer cells, said Dr. William Hahn, an assistant professor of medicine at Harvard Medical School, "it looks like someone has thrown a bomb in the nucleus." In other genetic diseases, gene alterations disable cells. In cancer, genetic changes give cells a sort of superpower. At first, as scientists grew to appreciate the complexity of cancer genetics, they despaired. "If there are 100 genetic abnormalities, that's 100 things you need to fix to cure cancer," said Dr. Todd Golub, the director of the Cancer Program at the Broad Institute of Harvard and M.I.T. in Cambridge, Mass., and an oncologist at the Dana-Farber Cancer Institute in Boston. "That's a horrifying thought." Making matters more complicated, scientists discovered that the genetic changes in one patient's tumor were different from those in another patient with the same type of cancer. That led to new questioning. Was every patient going to be a unique case? Would researchers need to discover new drugs for every single patient? "People said, 'It's hopelessly intractable and too complicated a problem to ever figure out,' " Dr. Golub recalled. But to their own amazement, scientists are now finding that untangling the genetics of cancer is not impossible. 
In fact, they say, what looked like an impenetrable shield protecting cancer cells turns out to be flimsy. And those seemingly impervious cancer cells, Dr. Golub said, "are very much poised to die." The story of genes and cancer, like most in science, involves many discoveries over many years. But in a sense, it has its roots in the 1980's, with a bold decision by Dr. Bert Vogelstein of Johns Hopkins University to piece together the molecular pathways that lead to cancer. It was a time when the problem looked utterly complicated. Scientists thought that cancer cells were so abnormal that they were, as Dr. Vogelstein put it, "a total black box." But Dr. Vogelstein had an idea: what if he started with colon cancer, which had some unusual features that made it more approachable? Colon cancer progresses through recognizable phases. It changes from a tiny polyp, or adenoma - a benign overgrowth of cells on the wall of the colon - to a larger polyp, a pre-cancerous growth that, Dr. Vogelstein said, looks "mean," and then to a cancer that pushes through the wall of the colon. The final stage is metastasis, when the cancer travels through the body. "This series of changes is thought to occur in most cancers, but there aren't many cancers where you can get specimens that represent all these stages," Dr. Vogelstein said. With colon cancer, pathologists could get tissue by removing polyps and adenomas in colonoscopies and taking cancerous tumors in surgery. Colon cancer was even more appealing for such a study because there are families with strong inherited predispositions to develop the disease, indicating that they have cancer genes that may be discovered. So Dr. Vogelstein and his colleagues set out to search for genes "any way we could," Dr. Vogelstein said. Other labs found genes, too, and by the mid-1990's, scientists had a rough outline of what was going on. 
Although there were scores of mutations and widespread gene deletions and rearrangements, it turned out that the crucial changes that turned a colon cell cancerous involved just five pathways. There were dozens of ways of disabling those pathways, but they were merely multiple means to the same end. People with inherited predispositions to colon cancer started out with a gene mutation that put their cells on one of those pathways. A few more random mutations and the cells could become cancerous. The colon cancer story, Dr. Druker said, "is exactly the paradigm we need for every single cancer at every single stage." But scientists were stymied. Where should they go from there? How did what happens in colon cancer apply to other cancers? If they had to repeat the colon cancer story every time, discovering genetic alterations in each case, it would take decades to make any progress. The turning point came only recently, with the advent of new technology. Using microarrays, or gene chips - small slivers of glass or nylon that can be coated with all known human genes - scientists can now discover every gene that is active in a cancer cell and learn what portions of the genes are amplified or deleted. With another method, called RNA interference, investigators can turn off any gene and see what happens to a cell. And new methods of DNA sequencing make it feasible to start asking what changes have taken place in what gene. The National Cancer Institute and the National Human Genome Research Institute recently announced a three-year pilot project to map genetic aberrations in cancer cells. The project, Dr. Druker said, is "the first step to identifying all the Achilles' heels in cancers." Solving the problem of cancer will not be trivial, Dr. Golub said. 
But, he added, "For the first time, we have the tools needed to attack the problem, and if we as a research community come together to work out the genetic basis of cancer, I think it will forever change how we think about the disease." Already, the principles are in place, scientists say. What is left are the specifics: the gene alterations that could be targets for drugs. "We're close to being able to put our arms around the whole cancer problem," said Robert Weinberg, a biology professor at the Massachusetts Institute of Technology and a member of the Whitehead Institute. "We've completed the list of all the changes cancer cells need to create a malignancy," Dr. Weinberg said. "And I wouldn't have said that five years ago." The list includes roughly 10 pathways that cells use to become cancerous and that involve a variety of crucial genetic alterations. There are genetic changes that end up spurring cell growth and others that result in the jettisoning of genes that normally slow growth. There are changes that allow cells to keep dividing, immortalizing them, and ones that allow cells to live on when they are deranged; ordinarily, a deranged cell kills itself. Still other changes let cancer cells recruit normal tissue to support and to nourish them. And with some changes, Dr. Weinberg said, cancer cells block the immune system from destroying them. In metastasis, he added, when cancers spread, the cells activate genes that normally are used only in embryo development, when cells migrate, and in wound healing. But so many genetic changes give rise to a question: how does a cell acquire them? In any cell division, there is a one-in-a-million chance that a mutation will accidentally occur, Dr. Weinberg notes. The chance of two mutations is one in a million million and the chance of three is one in a million million million. This slow mutation rate results from the fact that healthy cells quickly repair damage to their DNA. 
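Dr. Weinberg's arithmetic is just repeated multiplication of independent odds, and it also shows why disabling DNA repair matters so much. A minimal sketch (the one-in-a-million base rate is the article's; the 100-fold mutator figure is purely illustrative):

```python
import math

p_normal = 1e-6   # article's figure: roughly a one-in-a-million chance of a
                  # given mutation in any one cell division

def chance_of_k_mutations(p, k):
    """Probability that k specific, independent mutations all occur."""
    return p ** k

one = chance_of_k_mutations(p_normal, 1)    # 1e-6:  one in a million
two = chance_of_k_mutations(p_normal, 2)    # 1e-12: one in a million million
three = chance_of_k_mutations(p_normal, 3)  # 1e-18: one in a million million million

# If disabling DNA repair raised the per-division rate, say, 100-fold (an
# illustrative number, not the article's), three mutations become a million
# times likelier -- the start of the snowball:
three_mutator = chance_of_k_mutations(1e-4, 3)  # 1e-12
```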
"DNA repair stands as the dike between us and the inundation of mutations," Dr. Weinberg said. But one of the first things a cell does when it starts down a road to cancer is to disable repair mechanisms. In fact, BRCA1 and 2, the gene mutations that predispose people to breast and ovarian cancer, as well as some other inherited cancer genes, disable these repair systems. Once the mutations start, there is "a kind of snowball effect, like a chain reaction," Dr. Vogelstein said. With the first mutations, cells multiply, producing clusters of cells with genetic changes. As some randomly acquire additional mutations, they grow even more. In the end, all those altered genes may end up being the downfall of cancer cells, researchers say. "Cancer cells have many Achilles' heels," Dr. Golub says. "It may take a couple of dozen mutations to cause a cancer, all of which are required for the maintenance and survival of the cancer cell." Gleevec, researchers say, was the first test of this idea. The drug knocks out a gene product, abl kinase, that is overly abundant in chronic myelogenous leukemia. The first clinical trial, which began seven years ago, seemed like a long shot. "The idea that this would lead to therapy was something you wrote in your grant application," said Dr. Charles Sawyers, a Howard Hughes investigator at the University of California, Los Angeles. "It wasn't anything you believed would happen soon." But the clinical trial of Gleevec, conducted at the Oregon Health & Science University, U.C.L.A. and M. D. Anderson Cancer Center in Houston, was a spectacular success. Patients' cancer cells were beaten back to such an extent that the old tests to look for them in bone marrow were too insensitive, Dr. Sawyers said. Gleevec is not perfect. It is expensive, costing about $25,000 a year. It is not a cure: some cancer cells remain lurking, quiescent and ready to spring if the drug is stopped, so patients must take it every day for the rest of their lives. 
And some patients are now developing resistance to Gleevec. Still, Dr. Sawyers says, "Seven years later, most of our patients are still doing well." Without Gleevec, he added, most would be dead. As for the future of cancer therapy, Dr. Golub and others say that Gleevec offers a taste of the possible. Dr. Golub said he expected that new drugs would strike the Achilles' heels of particular cancers. The treatment will not depend on where the cancer started - breast, colon, lung - but rather which pathway is deranged. "It's starting to come into focus how one might target the problem," Dr. Golub said. "Individual cancers are going to fall one by one by targeting the molecular abnormalities that underlie them." And some cancer therapies may have to be taken for a lifetime, turning cancer into a chronic disease. "Seeing cancer become more like what has happened with AIDS would not be shocking," Dr. Golub says. "Does that mean cure? Not necessarily. We may see patients treated until they die of something else." That is what Mr. Weinstein hopes will happen with him. The cancer is still there: new, exquisitely sensitive tests still find a few cells lurking in his bone marrow. And Gleevec has caused side effects. Mr. Weinstein says his fingers and toes sometimes freeze for a few seconds, and sometimes he gets diarrhea. But, he said, "Certain things you put out of your mind because life is so good." From checker at panix.com Mon Jan 9 15:41:30 2006 From: checker at panix.com (Premise Checker) Date: Mon, 9 Jan 2006 10:41:30 -0500 (EST) Subject: [Paleopsych] Economist: Economics focus: Wealth from worship Message-ID: Economics focus: Wealth from worship http://www.economist.com/finance/PrinterFriendly.cfm?story_id=5327652 Dec 20th 2005 An economist finds that going to church is more than its own reward AT CHRISTMAS, many people do things they would never dream of the rest of the year, from giving presents to getting drunk. Some even go to church. 
Attendance soars, as millions of once-a-year worshippers fill the pews. In Britain, where most weeks fewer than one person in ten goes to church, attendance more than triples. Even in America, where two-fifths of the people say they go frequently, the share climbs in December. Some of the occasional churchgoers must wonder whether they might benefit from turning up more often. If they did so, they could gain more than spiritual nourishment. Jonathan Gruber, an economist at the Massachusetts Institute of Technology, claims that regular religious participation leads to better education, higher income and a lower chance of divorce. His results* (based on data covering non-Hispanic white Americans of several Christian denominations, other faiths and none) imply that doubling church attendance raises someone's income by almost 10%. The idea that religion can bring material advantages has a distinguished history. A century ago Max Weber argued that the Protestant work ethic lay behind Europe's prosperity. More recently Robert Barro, a professor at Harvard, has been examining the links between religion and economic growth (his work was reviewed here in November 2003). At the microeconomic level, several studies have concluded that religious participation is associated with lower rates of crime, drug use and so forth. Richard Freeman, another Harvard economist, found 20 years ago that churchgoing black youths were more likely to attend school and less likely to commit crimes or use drugs. Until recently, however, there was little quantitative research on whether religion affects income directly and if so, by how much. A big obstacle is the difficulty of disentangling cause and effect. That frequent churchgoers have higher incomes than non-churchgoers does not prove that religion made them richer. It might be that richer people are likelier to go to church. 
Or unrelated traits, such as greater ambition or personal discipline, could lead people both to go to church and also to succeed in their work. To distinguish cause from coincidence, Mr Gruber uses information on the ethnic mix of neighbourhoods and congregations. Sociologists have long argued that people are more likely to go to church if their neighbours share their faith. Thus Poles in Boston (which has lots of Italian and Irish Catholics) are more likely to attend mass than Poles in Minneapolis (which has more Scandinavian Protestants). Measuring the density of nationalities that share a religion in a particular city can therefore be a good predictor of church attendance. But ethnic density is not wholly independent of income. Studies have found that people who live with lots of others of the same ethnic origin tend to be worse off than those who are not "ghettoised". So Mr Gruber excludes an individual's own group from the measures, and instead calculates the density of "co-religionists", the proportion of the population that shares your religion but not your race. According to Mr Gruber's calculations, a 10% increase in the density of co-religionists leads to an 8.5% rise in churchgoing. Once he has controlled for other inter-city differences, Mr Gruber finds that a 10% increase in the density of co-religionists leads to a 0.9% rise in income. In other words, because there are lots of non-Polish Catholics in Boston and few in Minnesota, Poles in Boston both go to church more often and are materially better off relative to, say, Swedes in Boston than Poles in Minnesota relative to Swedes in Minnesota. Mr Gruber finds little evidence that living near different ethnic groups of the same faith affects any other civic activity. Poles in Boston are no more likely to join secular organisations than Poles in Minnesota. 
Since general differences between cities are already controlled for, that leads him to conclude that it must be religious attendance that is driving the differences in income. Looking for a cause Other economists, though they think Mr Gruber's approach is clever, are not sure that he has established a causal link between religious attendance and wealth. So how might churchgoing make you richer? Mr Gruber offers several possibilities. One plausible idea is that going to church yields "social capital", a web of relationships that fosters trust. Economists think such ties can be valuable, because they make business dealings smoother and transactions cheaper. Churchgoing may simply be an efficient way of creating them. Another possibility is that a church's members enjoy mutual emotional and (maybe) financial insurance. That allows them to recover more quickly from setbacks, such as the loss of a job, than they would without the support of fellow parishioners. Or perhaps religion and wealth are linked through education. Mr Gruber's results suggest that higher church attendance leads to more years at school and less chance of dropping out of college. A vibrant church might also boost the number of religious schools, which in turn could raise academic achievement. Finally, religious faith itself might be the channel through which churchgoers become richer. Perhaps, Mr Gruber muses, the faithful may be "less stressed out" about life's daily travails and thus better equipped for success. This may make religion more appealing to some of those who turn up only once a year. But given that Jesus warned his followers against storing up treasures on earth, you might think that this wasn't the motivation for going to church that he had in mind. 
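The arithmetic connecting these figures to the "almost 10%" claim earlier in the article is a simple ratio: the effect of the instrument (co-religionist density) on income, divided by its effect on attendance. This is a back-of-envelope Wald-ratio sketch, not Mr Gruber's actual estimation procedure:

```python
# Both numbers are the article's: responses to the same 10% rise in the
# density of co-religionists.
d_attendance = 8.5   # % rise in churchgoing
d_income = 0.9       # % rise in income

# Instrumenting attendance with density, the implied income gain per
# 1% more church attendance:
elasticity = d_income / d_attendance      # ~0.106

# Doubling attendance is a 100% increase, giving the article's "almost 10%":
income_gain_pct = 100 * elasticity        # ~10.6%
```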
* "Religious Market Structure, Religious Participation and Outcomes: Is Religion Good for You?", NBER Working Paper 11377, May 2005 From checker at panix.com Mon Jan 9 22:23:35 2006 From: checker at panix.com (Premise Checker) Date: Mon, 9 Jan 2006 17:23:35 -0500 (EST) Subject: [Paleopsych] NYT: William Haxby, Mapper of Ocean Floors, Is Dead at 56 Message-ID: William Haxby, Mapper of Ocean Floors, Is Dead at 56 http://www.nytimes.com/2006/01/09/nyregion/09haxby.html [Sarah and I knew Bill through his partner, Miriam, who met him at work at Lamont-Doherty Earth Observatory. Sarah and I know Miriam from college days. We stayed with them several times. Bill was a perfectly good-natured, competent man, enthusiastic about his work, frustrated during the periods between funding, and full of stories about his career and those of other mappers. It is amazing, considering what does get funded and also considering the huge regard for his work among his peers, that he wasn't funded for life. [Basically what Bill did was to develop computer programs that took satellite data and mapped the *surface* of the ocean. If the surface bulged up ever so slightly, it was due to the extra tug of gravity caused by there being more matter--like an underwater mountain--than nearby. This lets you map the bottom of the ocean. My guess is that since underground rocks have different specific gravities, straightforward use of his method will result in inaccuracies. [The article says Bill's map was a lot better than individual soundings in the past, but perhaps better ways of making soundings have been developed since then. If he were still alive, that would be the first thing I'd ask him about, and Bill would certainly have told me the best he could. He did not at all mind laymen asking difficult questions; that was not his way. [We were hoping to visit this Summer. Bill died much, much too soon. Our deepest condolences to Miriam. [More from LDEO itself below. 
Reminiscences about Bill will be posted on the site. Click the URL to get a fine photo of Bill and an image of his most famous map, which graces the cover of Stephen Hall's _Mapping the Next Millennium_. Then follows a shorter piece from the local paper.] January 9, 2006 By ANDREW C. REVKIN William F. Haxby, who created the first maps of the ocean floor to be based on satellite measurements of the water's surface and became a master at translating complicated marine data into comprehensible visual displays, died on Wednesday at his home in Westwood, N.J. He was 56. The cause was apparently a heart attack, said James V. Haxby, a brother. Dr. Haxby, a research scientist at the Lamont-Doherty Earth Observatory at Columbia University since 1978, used computers to sift streams of data from satellites and other sensors and produce images revealing hidden ocean features or phenomena like the drifting of Arctic sea ice. "Bill peeled back the surface of the ocean for us," said Robin Bell, a colleague at the observatory. "His maps launched countless expeditions and formed the framework for studies of the ocean floor for two decades." His signal achievement, several ocean scientists said, was the first global "gravity field" map of the world's oceans, created in 1983 using measurements of the height of the sea surface collected five years earlier by a satellite called Seasat that carried a then-new type of downward-pointing radar that could create images. Dimples and humps in the sea, not discernible up close but detectable with satellites, are generated by variations in earth's gravitational field that are created by seabed features like seamounts, chasms and ridges. Before the gravity maps, three-dimensional charts of the seafloor were drawn largely by using thousands of individual soundings taken over the centuries from ships - a method involving much guesswork and leaving vast gaps. Dr. 
Haxby led a small team that "invented the method to convert millions of arcane satellite observations into quantitative grids and then exquisite images," said David T. Sandwell, a researcher at the Scripps Institution of Oceanography in San Diego. "Major volcanic chains such as the Louisville Ridge and Foundation Seamounts had been barely detected by sparse ship soundings, yet they were elegantly and accurately displayed on Bill's gravity maps," Dr. Sandwell said, referring to two seamount chains in the Pacific Ocean. William Fulton Haxby was born in Minneapolis and studied geophysics at the University of Minnesota and at Cornell and Oxford. His entire career was spent at Lamont-Doherty, where colleagues often marveled at his ability to turn reams of data into colorful maps and animation that conveyed far more meaning than words or numbers. One map showed the effect of a 16-foot rise in sea levels on Florida. Such a shift is projected if either Greenland's ice sheet or that of West Antarctica eventually melts. Everything south of Lake Okeechobee would become submerged. In addition to his brother James, Dr. Haxby is survived by his companion of 15 years, Miriam Colwell; his mother, Mary Haxby; a daughter, Jane Haxby; another brother, Robert; and a sister, Mary Haxby. Some of Dr. Haxby's most recent work, done with Prof. Stephanie Pfirman of Barnard College, was a set of animations depicting how old, thick sea ice periodically builds in the Arctic Ocean and then is expelled past Greenland into the Atlantic. They are posted at http://www.geomapapp.org/arctic/ice_movies/. "He was amazing at making physical processes come to life," Professor Pfirman said. "When I showed the animation to my family over the holidays, they said that for the first time they realized how the ice moved in the Arctic. They had heard me talk about it for years, but through Bill's animation they could finally see it." 
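[The gravity-mapping principle the obituary describes -- extra seabed mass tugging the sea surface into a slight bump -- can be illustrated with a toy calculation. All the numbers below (seamount size, density contrast, depth) are illustrative assumptions, not figures from the article, and the point-mass approximation N = G*m/(g*d) is a deliberate simplification of the geodesy Haxby's team actually did:

```python
# Toy illustration: how much a seamount lifts the sea surface above it.
# Assumptions (illustrative only): a conical seamount treated as a point
# mass at its centroid, and the Bruns-formula approximation N = T / g,
# with disturbing potential T = G * m / d at distance d.
import math

G = 6.674e-11               # gravitational constant, m^3 kg^-1 s^-2
g = 9.81                    # surface gravity, m s^-2

# Hypothetical seamount: 10 km basal radius, 2 km tall.
radius = 10e3               # m
height = 2e3                # m
density_contrast = 1870.0   # basalt minus seawater, kg m^-3 (assumed)
depth_to_centroid = 4.5e3   # sea surface to seamount centroid, m (assumed)

volume = math.pi * radius**2 * height / 3.0   # cone volume, m^3
excess_mass = density_contrast * volume       # kg

T = G * excess_mass / depth_to_centroid       # disturbing potential, m^2 s^-2
N = T / g                                     # geoid bump at the surface, m

print(f"sea-surface bump over the seamount: ~{N:.2f} m")
```

A bump of roughly half a meter spread over tens of kilometers is imperceptible from a ship yet well within the resolution of satellite radar altimetry, which is why a three-month altimetry mission could outperform decades of shipboard soundings.]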
Lamont-Doherty Earth Observatory News http://www.ldeo.columbia.edu/news/2006/01_06_06.htm News 01/06/06 Contact: Ken Kostel (212) 854-9729 William F. Haxby, world-renowned geophysicist and long-time member of the Lamont community, passes away at 56 William F. Haxby, a world-renowned earth scientist at Columbia University's Lamont-Doherty Earth Observatory and the first to produce detailed images of the world's seafloor, died suddenly at his home in Westwood, New Jersey on Wednesday, January 4. He was 56 years old. Haxby used radar signals to precisely measure the distance from satellites to the sea surface. From calculations involving a continuous stream of altimetry data from the Seasat satellite along thousands of orbital tracks, he reconstructed the minute variations in the water surface produced by the gravitational attraction of sea mounts and other topographical features hidden beneath the ocean. His so-called "satellite-derived gravity map" was a stunning three-dimensional picture of the Earth's hidden seascape, with its now-familiar mid-ocean ridges, deep-sea trenches, underwater volcanoes and submarine canyons laid bare for the first time. "Bill figured out how to map the oceans with satellites in a few days that would have taken decades to do with ships," said Jeff Weissel, who worked with Haxby on the first maps. "Seasat only ran 3 months, yet he was able to show us the entire ocean with that short mission, including many places that ships had never visited." When his first black and white image rolled off the printer in 1981, his colleagues were stunned, not only by the remarkable detail of their brand-new view of the planet, but also by the realization that the whole of the Earth's oceans were suddenly portrayed as they actually were and not as they had been inferred from often widely spaced echo-soundings. 
When compared to the maps made with ship-based soundings, the already familiar seascape was brought into sharper focus, but for the third of the ocean floor that had not yet been seen or studied, everything that scientists saw was new and begging to be explored. [Figure: one of the gravity maps of the seafloor that William Haxby produced using satellite-based radar altimetry of the ocean surface.] "Bill peeled back the surface of the ocean for us," said Robin Bell, a colleague at Lamont-Doherty. "His maps launched countless expeditions and formed the framework for studies of the ocean floor for two decades. The images he produced also appear on the cover of textbooks and on classroom walls around the globe and will undoubtedly continue to inspire students for years to come." Most recently Haxby had been active in producing another global synthesis of seafloor imagery, this one obtained with echo-soundings from a new generation of multi-beam sonar that creates a view of the oceans far more detailed than his gravity map. He was able to share his amazement at the ocean's depths by creating a software application called GeoMapApp that runs on most computers and enables virtually anyone to explore the world he first revealed. It is available for free at http://www.geomapapp.org. "For more than 30 years, Bill worked tirelessly to expand our understanding of the planet," said Michael Purdy, Director of Lamont-Doherty. "His impact was felt in countless ways and in dozens of discoveries. His quiet, unwavering support of science and the pursuit of new frontiers will be deeply missed." Haxby was born in Minneapolis and received his B.S. degree in geophysics from the University of Minnesota. He completed his Ph.D. in geophysics at Cornell University and held a post-doctoral research fellowship at the University of Oxford in the United Kingdom before joining the Lamont-Doherty research staff in 1978. 
Haxby is survived by his partner of 14 years, Miriam Colwell; daughter, Jane Haxby (Daniel Gottlieb); mother, Mary; sister, Mary Gibbons (Dennis); and brothers, James and Robert. Funeral services will be held at 2:00 p.m. on Saturday, January 7 at the Grace Episcopal Church in Westwood, New Jersey. Flowers or contributions can be sent to the church. _________________________________________________________________ The Lamont-Doherty Earth Observatory, a member of The Earth Institute at Columbia University, is one of the world's leading research centers examining the planet from its core to its atmosphere, across every continent and every ocean. From global climate change to earthquakes, volcanoes, environmental hazards and beyond, Observatory scientists provide the basic knowledge of Earth systems needed to inform the future health and habitability of our planet. The Earth Institute at Columbia University is among the world's leading academic centers for the integrated study of Earth, its environment, and society. The Earth Institute builds upon excellence in the core disciplines -- earth sciences, biological sciences, engineering sciences, social sciences and health sciences -- and stresses cross-disciplinary approaches to complex problems. Through its research training and global partnerships, it mobilizes science and technology to advance sustainable development, while placing special emphasis on the needs of the world's poor. Funeral today for ocean scientist Haxby http://www.thejournalnews.com/apps/pbcs.dll/article?AID=/20060107/NEWS03/601070313/1019/NEWS03 By CATHERINE L. FOLEY THE JOURNAL NEWS (Original publication: January 7, 2006) PALISADES -- A funeral service will be held today for William F. Haxby, a renowned Columbia University scientist responsible for a conceptual breakthrough in the scientific understanding of the Earth's oceanic landscape. Haxby died Wednesday in his Westwood, N.J., home. He was 56. 
Haxby joined the research staff at Columbia University's Lamont-Doherty Earth Observatory in Palisades in 1978. In 1981, he discovered how to map the ocean floor using data gathered in a three-month NASA satellite mission that recorded the undulations of the sea surface with radar, said Bill Ryan, Haxby's longtime colleague and a Lamont-Doherty senior scholar. His "satellite-derived gravity map" gave the world a three-dimensional picture of the planet's hidden seascape in a detail never before known. "A three-month satellite mission produced a model of the ocean floor a hundred times clearer than 50 years of mapping it with research ships," Ryan said. "It was revealed in a sharpness, a focus that just astonished everybody." Ryan said Haxby's research opened a new chapter of understanding for the entire scientific community. "It launched a whole generation of ocean scientists to then send ships out to sea to check out all these features that had been made visible," he said. "It immediately excited the community to try to explain how all these ridges and cracks and volcanoes had originated in the pattern they were in." Haxby recently created a software application available free at [3]www.geomapapp.org that generates images of sea floor topography using data from the new generation of multi-beam sonar on ocean research ships. "Galileo gave us the telescope, but it took multiple generations to sharpen that and eventually become the Hubble," Ryan said. "Bill, in a single life, for the oceans, took us from seeing features 10 miles across to features the size of a football field." Michael Purdy, director of Lamont-Doherty, said Haxby worked tirelessly to expand scientific understanding of the planet. "His impact was felt in countless ways and in dozens of discoveries," Purdy said. Haxby's research will continue to inspire scientists for generations to come, Ryan said. Haxby was born in Minneapolis, and received his bachelor's in geophysics at the University of Minnesota. 
He completed his doctorate in geophysics at Cornell University and held a post-doctoral research fellowship at the University of Oxford. He is survived by his partner of 14 years, Miriam Colwell; his daughter, Jane Haxby; his mother, Mary; his sister, Mary Gibbons; and two brothers, James and Robert. Services are at 2 p.m. today at Grace Episcopal Church, 9 Harrington Ave., Westwood, N.J. The Journal News[7] is a Gannett Co. Inc. newspaper serving Westchester, Rockland and Putnam Counties in New York. References 3. http://www.geomapapp.org/ 7. http://www.gannett.com/ From thrst4knw at aol.com Tue Jan 10 02:46:23 2006 From: thrst4knw at aol.com (Todd I. Stark) Date: Mon, 9 Jan 2006 21:46:23 -0500 Subject: [Paleopsych] CHE: In the Lab With the Dalai Lama In-Reply-To: References: Message-ID: <43C31FFF.2070505@aol.com> I think of Kuhn visiting us in spirit when I hear stories like this. That is, I suspect that the main problem is that it is very hard for experimentalists to understand each other when they study the same domain from such different perspectives. This seems to parallel a lot of the flutter of "science wars" that largely dissolve the more closely you look at them (mostly because you discover people addressing different things). People who get their hands dirty playing with the tools and methods of experimental science often come away with different ways of thinking about their topics depending on the tools they are using. And differently from people who approach theories without doing experimental work. Experimentalists have by necessity a realist sense of the specific things they use in their experiments, because they rely on them in their daily work and build on them and engineer with them to create new experimental situations. 
Hence, experimentalists studying meditation are often going to take a lot of things about meditation for granted that theorists (who are mostly trying to interpret the data) and other experimentalists are going to view with skepticism. It makes no sense to be skeptical of the very concept of meditation when you are taking as your goal to study its effects. If you define the object of study as meditators and take it for granted that they have something special in common, then you will be much more willing to step into the culture of meditators than someone who is for example wondering whether meditation is "just relaxation" or "just self-hypnosis" and so on. A similar culture gap resulted in very different ways of looking at hypnosis at one time (e.g. the "state" vs. "non-state" models), although in that case it was more like two different communities of experimenters than experimenters and theorists. Some experimenters for a long time assumed, from our historical and popular-culture understanding, that there was a hypnotic state and that their goal was to discover its properties and what made it special. Ernest Hilgard famously defined the "domain of hypnotic phenomena" that needed to be studied. Others like Nicholas Spanos and Theodore Sarbin assumed instead that there was no such state and rather constructed experiments to try to demonstrate the same behavioral results in other ways. The non-state model led to an enormous amount of productive results that went way beyond just the situation of hypnosis. It led to general new and evolving theoretical constructs like "role taking," "suggestion," "fantasy proneness," "amnesia proneness" and "imaginative involvement" that rendered the older state view (almost) obsolete except for convenient shorthand. 
For example, when we choose an experimental subject for their imaginative talents and suggestibility and then set their expectations in an experimental condition, it is still common to speak in shorthand of "hypnotizing" them, even though it is the experimental protocol that we are referring to more than some special state of mind. "Relaxation" doesn't cover it adequately. "Meditation" involves similar problems with a vengeance, since it is far more diverse in its forms and traditions and has some enormous and influential lobbies that dwarf the "hypnotists" and "hypnotherapists." The way we study it has a big impact on what sorts of assumptions we are willing to make. It is probably unavoidable that some people will be quite willing and feel justified in taking aspects of Buddhist or other traditions for granted as part of the domain of their study, while others will find these to be absurd trappings and at best irrelevant to the real work they feel they want to do. I think of Herbert Benson as having originally dealt with this in a model that said meditation per se was largely irrelevant; that it was a simple behavioral response (thus "relaxation response") that could be triggered by simply repeating an arbitrary phrase. Benson almost certainly only captured the part of the story that interested him and which he could test and explain. But it was the right start, a narrow but workable scientific model for studying phenomena that are typically captured more in poetry and autophenomenology than in scientific terms. Now we have a broader understanding and more sophisticated models of the global psychological processes involved in meditation and other sorts of conditions (for example, see Austin's wonderful "Zen and the Brain"), but we still sometimes differ on the proper way to conceptualize meditation, and whether it can be removed from its cultural trappings and still be the same object of study. 
Not that any of this justifies bad behavior or sneering contempt on anyone's part, just by way of an attempt at partial explanation. kind regards, Todd Buck, Ross wrote on 1/9/2006, 10:29 AM: > Herbert Benson in "The Relaxation Response" suggested decades ago that > disciplines such as meditation and prayer share the quality of promoting > deep relaxation, which has the opposite effects of the fight-or-flight > response. That is, they lower autonomic and endocrine arousal, and > promote immune system functioning. > > Cheers, Ross Buck > > Ross Buck, Ph. D. > Professor of Communication Sciences > and Psychology > Communication Sciences U-1085 > University of Connecticut > Storrs, CT 06269-1085 > 860-486-4494 > fax 860-486-5422 > Ross.buck at uconn.edu > http://www.coms.uconn.edu/docs/people/faculty/rbuck/index.htm > > -----Original Message----- > From: paleopsych-bounces at paleopsych.org > [mailto:paleopsych-bounces at paleopsych.org] On Behalf Of Premise Checker > Sent: Friday, January 06, 2006 1:03 PM > To: paleopsych at paleopsych.org > Subject: [Paleopsych] CHE: In the Lab With the Dalai Lama > > In the Lab With the Dalai Lama > The Chronicle of Higher Education, 5.12.16 > http://chronicle.com/weekly/v52/i17/17b01001.htm > > By LEIGH E. SCHMIDT > > Even the Dalai Lama's harshest critics at the Society for > Neuroscience meeting last month, in Washington, would have > to concede this much: Choosing the exiled Tibetan Buddhist > leader to inaugurate the professional association's series > on neuroscience and society certainly got people talking. > Who would have thought that an announced lecture on "The > Neuroscience of Meditation" would set off a protest petition > gathering about 1,000 signatures, a counterpetition of > support boasting nearly as many names, substantial coverage > in The New York Times and on National Public Radio, as well > as ample chatter in the blogosphere? 
In a culture that likes > its battles between science and religion to be loud, > colorful, and Christian -- another nasty squabble, say, > between evolutionists and creationists -- this controversy > seemed unlikely to gain much traction. Yet as the dispute > built momentum in the months leading up to the event, it > soon became clear that the prospect of the red-robed Dalai > Lama's urging the study of an ancient spiritual practice > upon white-coated lab scientists would provide a newsworthy > angle on the usual wrangling. > > Playing upon tensions far less noticed than those that have > plagued relations between science and conservative > Christianity, the latest dust-up reveals the spirit wars > that divide the knowledge class itself. How purely secular > and naturalistic do the members of that class imagine > themselves to be, and how committed are they to keeping > religion at bay in their conference gatherings, university > laboratories, civic institutions, newsrooms, and think > tanks? In turn, is "spirituality" a back door through which > religion gets to enter the conversation, now dressed in the > suitably neutralized garb of meditation as a universalistic > practice of inward peace and outreaching compassion? Or does > religion, even when soft-pedaled in the cosmopolitan > language of spirituality and the contemplative mind, > inevitably remain an embarrassment to those elites who stake > their authority on secular rationality? The dispute roiling > the neuroscience society over the past six months has > brought such questions front and center. > > Inviting the Dalai Lama to speak at the meeting created two > major border disputes. The first, of modest consequence to > religion-and-science debates, was the conflict over the > "political agenda" of the exiled Tibetan leader. 
In an > international professional association that includes many > Chinese scientists, some members were offended at the > implied endorsement that the event gave to the Dalai Lama's > larger cause of freedom for Tibetans. The second dispute, > more insistently debated, was over religion's showing up -- > so visibly, to boot -- at an annual meeting of > neuroscientists. The almost visceral response by critics was > to declare a total separation of religion and science, to > wave the flag for the late-19th-century warfare between the > two domains. "A science conference is not [an] appropriate > venue for a religion-based presentation," a professor of > anesthesia from the University of California at San > Francisco remarked on the petition. "Who's next, the pope?" > That sign-off question pointed to a second part of the > strict separationist logic: Even if the Dalai Lama seemed > pretty irenic as religious leaders go, he nonetheless > represented a slippery slope into a mire of superstition and > authoritarianism. (How else, some critics asked, were they > to interpret his known affinities with reincarnation and > monasticism?) "Today, the Dalai Lama; Tomorrow, > Creationists?" wrote a professor of medicine at the > University of Toronto, capturing perhaps the most > commonplace anxiety given voice among the critics. Keep the > society free of all religious discussion, or else the > esteemed body might slide into the hell of a Kansas > school-board meeting. > > More interesting than the purists' boundary monitoring is > the way the Dalai Lama and his defenders imagine through > meditation an emerging meeting point for science and > religion in contemporary culture. 
The headline study that > served as the immediate source of intrigue surrounding his > recent lecture was an article published last year in the > Proceedings of the National Academy of Sciences and produced > by researchers at the Waisman Laboratory for Brain Imaging > and Behavior, at the University of Wisconsin at Madison. > That group, led by the psychology professor Richard J. > Davidson, has been studying long-term Tibetan Buddhist > practitioners of meditation, comparing their brain-wave > patterns with those of a control group. Davidson himself has > been working in the science-religion borderlands for more > than two decades and has been a leading collaborator with > the Mind and Life Institute, in Boulder, Colo., one of the > principal organizations encouraging the > neuroscience-meditation dialogue. > > Shifting the focus of research from altered states of > consciousness or momentary experiences of ecstasy, which so > often concerned inquirers in the 1960s and 1970s, the > Davidson group has been looking for evidence that sustained > meditation causes actual neural changes in everyday patterns > of cognition and emotion. In other words, they want to know > if the brain function of long-term contemplatives is made > demonstrably different through years of "mental training." > And not just different, but better: That is, does the > well-developed meditative mind sustain higher levels of > compassion and calmness than the run-of-the-mill American > noggin? Well, after testing eight long-time Tibetan Buddhist > practitioners and 10 "healthy student volunteers," the > researchers discovered that the 10,000 to 50,000 hours that > the various monks had devoted to "mental training" appeared > to make a real neurological difference. As the study's title > put it, "Long-term meditators self-induce high-amplitude > gamma synchrony during mental practice." 
Davidson and > company, careful not to overreach in their conclusions, did > suggest that practices of meditation, and the accompanying > compassionate affect, were "flexible skills that can be > trained." Did that mean contemplative practice could be > abstracted from its religious context and then applied as a > kind of public pedagogy? Were hopeful supporters wrong to > read this as a tantalizing suggestion that meditation might > prove beneficial not only for the mental health of Americans > but also for the very fabric of society? Where, after all, > couldn't we benefit from a little more "pure compassion," > altruism, lovingkindness, and "calm abiding"? > > As novel as it may sound to monitor the brain waves of > Tibetan Buddhist monks in university laboratories or on > Himalayan hillsides (Davidson has done both), it is > certainly not the first time that American psychologists > have sought to re-engage the spiritual through the > healthy-mindedness of meditation. At Wisconsin, Davidson > occupies a research professorship named for Harvard's > William James, the pioneering psychologist, psychical > researcher, and philosopher of religion, and it is in the > tradition of James that the current turn to the > contemplative mind is best understood. Counter to the > popular image of Americans as endlessly enterprising, > agitated, and restless -- all busy Marthas, no reflective > Marys -- James discerned a deep mystical cast to the > American psyche and pursued that strain with uncommon > intellectual devotion. Yet when it came to "methodical > meditation," James saw little of it left among American > Christians and turned instead to homegrown practitioners of > various mind-over-matter cures. He particularly accented > those "New Thought" metaphysicians who were pushing forward > a dialogue with far-flung emissaries of yoga and Buddhist > meditation in the wake of the World's Parliament of > Religions, held in Chicago in 1893. 
> > Among James's favored practitioners of these newly > improvised regimens of meditation was Ralph Waldo Trine, a > Boston-based reformer with a knack for inspirational > writing. In The Varieties of Religious Experience (1902), > James used Trine's blockbuster In Tune With the Infinite > (1897) as an epitome of the emergent practices of > concentration, mental repose, and healthy-mindedness then > percolating in New England and elsewhere across the country. > Though an unabashed popularizer, Trine was not a > lightweight. With an educational pedigree that ran from Knox > College to the University of Wisconsin to the Johns Hopkins > University, he moved easily in Harvard's wider metaphysical > circles and energetically engaged various progressive > causes. In much the same way that current studies promote > the clinical applications of meditation, Trine emphasized > the healthful benefits that accrued from cultivating a calm > yet expectant mind. He had no scanners or electrodes, but he > had the same hopes about improving the mental and physical > health of Americans through elaborating a universal practice > of meditation, one that transcended the particulars of any > one religious tradition and represented a kind of > cosmopolitan composite of all faiths. And while Trine did > not have the Dalai Lama at hand, he did have extended > contact with a well-traveled Sinhalese Buddhist monk, > Anagarika Dharmapala, with whom he compared notes and > devotional habits at a summer colony in Maine as he was > putting together his own system of meditation for Americans. > Like other inquirers then and now, Trine was all too ready > to look to Asia for a practical antidote to American > nervousness. 
> > The real payoff for Trine, as it is for Davidson and his > colleagues, was not established simply through a calculus of > productivity or cheerfulness: Would encouraging meditation > or other visualization techniques make people more alert and > proficient at the office or on the playing field? Would it > make them feel happier and less disgruntled? Trine, like > James and now Davidson, was finally more interested in > saintliness and compassion than in helping stressed-out > brain workers relax and concentrate. It is hard not to hear > a hint of Davidson's pursuit of altruism in Trine's "spirit > of infinite love," the moral imperative to "care for the > weak and defenseless." And it is hard not to see that the > world of William James and Ralph Waldo Trine is alive and > well as American investigators wire up Tibetan Buddhist > hermits in a search for the powers of the concentrated mind, > the mental disciplines of harmony, compassion, and peace > that might make the world a marginally kinder, less selfish > place. That optimism about human nature -- that the mind has > deep reservoirs of potential for empathy and altruism -- had > a lot more backing among liberals and progressives in 1900 > than it does today. Still, the considerable hopes now > invested in meditation suggest that the old romantic > aspirations, spiritual and otherwise, continue to flourish, > especially among members of the mind-preoccupied knowledge > class. > > Perhaps the most important dimension of the Dalai Lama's > turn to the laboratory is the notion that the > religion-science wound will be salved through recasting > religion as spirituality. The Nobel laureate's latest book > explicitly suggests as much in its title, The Universe in a > Single Atom: The Convergence of Science and Spirituality. 
In > doing so, he expressly appeals to all those Americans who > fear fundamentalist incarnations of religion and who instead > cast themselves as intellectually curious and spiritually > seeking. Religion, on this model, is not a domain of > authority competing with science but an inward terrain of > personal experience and individual probing. Spirituality, > the Dalai Lama writes, "is a human journey into our internal > resources." Representing "the union of wisdom and > compassion," it shares with science a progressive hope for > "the betterment of humanity." In those terms, religion as > spirituality becomes the handmaiden of science itself, > joining it in an open quest for knowledge, empirical and > pragmatic, unconstrained by ancient creeds, cosmologies, or > churches. In such exhortations the Dalai Lama shows a fine, > intuitive feel for much of American intellectual and > religious life, but he is hardly telling today's Emersonian > inquirers something about the universe that they do not > already affirm. > > A practice of meditation made palatable to scientists, > secularists, and seekers would no doubt look pallid to all > those monks, hermits, and saints who have taken it to be an > arduous and ascetic discipline. Still, the American pursuit > of "spirituality," reaching a crescendo in the past two > decades, has been all too easy to dismiss as paltry and > unsubstantial, labeled as foreign and threatening to > more-orthodox versions of a Christian America. In this > often-charged religious environment, the Dalai Lama has > astutely laid hold of the science-spirituality nexus as a > cultural foothold. As he has discovered in this latest > brouhaha, that move has hardly lifted him above the wider > debates, whether about materialism or intelligent design, > but it has allowed him to connect with America's more > cosmopolitan and progressive religious impulses. 
When > William James was asked directly in 1904, "What do you mean > by 'spirituality'?," he replied: "Susceptibility to ideals, > but with a certain freedom to indulge in imagination about > them." In mingling with neuroscientists who have warmed to > his talk of spirituality, the Dalai Lama may well have found > his own avatars of William James. > > Leigh E. Schmidt is a professor of religion at Princeton > University and author of Restless Souls: The Making of > American Spirituality (HarperSanFrancisco, 2005). > _______________________________________________ > paleopsych mailing list > paleopsych at paleopsych.org > http://lists.paleopsych.org/mailman/listinfo/paleopsych > > _______________________________________________ > paleopsych mailing list > paleopsych at paleopsych.org > http://lists.paleopsych.org/mailman/listinfo/paleopsych > From checker at panix.com Tue Jan 10 14:17:22 2006 From: checker at panix.com (Premise Checker) Date: Tue, 10 Jan 2006 09:17:22 -0500 (EST) Subject: [Paleopsych] Edge Annual Question 1999: What Is The Most Important Invention? Message-ID: Edge Annual Question 1999: What Is The Most Important Invention? http://www.edge.org/documents/Invention.html CONTRIBUTORS: Colin Blakemore Steven Rose Joseph Traub M. Csikszentmihalyi Marvin Minsky Philip W. Anderson Reuben Hersh Howard Gardner Daniel Dennett Freeman Dyson William Calvin David Shaw Roger Schank Stephen Budiansky Richard Saul Wurman Stewart Brand George Dyson Marney Morris V.S. Ramachandran Jeremy Cherfas Bart Kosko Stuart Hameroff Michael Nesmith Clifford Pickover Margaret Wertheim Richard Dawkins David Haig Chris Langton Eric J. Hall Clay Shirkey Keith Devlin Luyen Chou Antonio Cabral Hendrik Hertzberg David Berreby Charles Simonyi Piet Hut Susan Blackmore James P. O'Donnell Nicholas Humphrey Jaron Lanier Terrence Sejnowski Ron Cooper W. 
Daniel Hillis John Baez Viviana Guzman Stephen Schneider Philip Campbell John Horgan Raphael Kasper Sherry Turkle David Myers Don Goldsmith Arnold Trehub Jay Ogilvy Douglas Rushkoff Mike Godwin Duncan Steel Tom Standage Andy Clark Stanislas Dehaene John Maddox Eberhard Zangger Leon Lederman Marc D. Hauser David Buss Leroy Hood Julian Barbour John Henry Holland Gordon Gould Bob Rafelson John Allen Paulos Verena Huber-Dyson Garniss Curtis Milford Wolpoff Mark Mirsky Dan Sperber Lew Tucker Tor Nørretranders Richard Potts Lawrence M. Krauss John McCarthy Karl Sabbagh Ellen Winner George Johnson Rodney Brooks John R. Searle Lee Smolin Paul W. Ewald Carl Zimmer Robert Shapiro James Bailey John C. Dvorak Kenneth Ford Philip Brockman Howard Rheingold George Lakoff Robert Provine Peter Cochrane Samuel Barondes Chris Westbury John Rennie Randolph Nesse Brian Greene Esther Dyson Steven Johnson Delta Willis Joseph LeDoux Maria Lepowski John Barrow Todd Siler Peter Tallack Brian Goodwin John Brockman WHAT IS THE MOST IMPORTANT INVENTION IN THE PAST TWO THOUSAND YEARS? Introduction by John Brockman A year ago I emailed the participants of The Third Culture Mail List for help with a project which was published on EDGE as "The World Question Center." I asked them: "what questions are you asking yourself?". The World Question Center was published on December 30th. On the same day The New York Times ran an article "In an Online Salon, Scientists Sit Back and Ponder" which featured a selection of the questions. Other press coverage can be found in EDGE In The News. The project was interesting, worthwhile ... and fun. This year, beginning on Thanksgiving Day, I polled the list on (a) "What Is The Most Important Invention In The Past Two Thousand Years?" ... and (b) "Why?". I am pleased to publish below* the more than one hundred responses in order of receipt. I expect many more entries and, in the spirit of The Reality Club, robust discussion and challenges among the contributors. 
Happy New Year!! JB p.s. I get the last word. (*Please note that the length of this document is 41,000 words, which prints out to about 75 pages.) _________________________________________________________________ RELATED PRESS _________________________________________________________________ January 7, 1999 Wired News Top-Level Think Tank Goes Public John Brockman's invitation-only salon for scientific thinkers opens a public forum on Feed. By Steve Silberman One of the Net's most prestigious, invitation-only free-trade zones for the exchange of potent ideas is opening its doors. A little. .....Starting Thursday, two or three selected dialogs a month at Edge -- founded in 1996 by author and literary agent John Brockman -- will be open for public reading and discussion in a special area on Feed. _________________________________________________________________ January 7, 1999 Die Zeit (German Text) Brainstorming In The Club Of Thinkers (Partial, rough English Translation) by Ulrich Schnabel and Urs Willmann Could German scientists be inspired to such a brainstorming? Hardly. In German it is already difficult to find a good translation for this neural activity, leading to fantasy and fun. The Duden, the leading German dictionary, torments itself with "Brainstorming: procedure to find the best solution of a problem by collecting spontaneous ideas (from the coworkers)." You can imagine the result. _________________________________________________________________ January 7, 1999 ABCNEWS.COM What Changed the World? Suggestions for Top Inventions by Lee Dye -- Special to ABCNEWS.COM That question was presented on Thanksgiving Day to Nobel laureates and other heavy thinkers by New York author and literary agent John Brockman. Brockman, who presides over an eclectic gathering of scientists and science buffs, started publishing the answers this week on the group's Web site.
More than 100 participants have taken the bait so far, and their answers are as varied, and in some cases as strange, as the participants themselves.....This is not a group that accepts limitations gladly. Some fudged on the dates. Some eschewed the notion of an invention as some sort of gadget, opting instead for such things as the development of the scientific method, mathematics or some religions. _________________________________________________________________ January 5, 1999 FEED The Mother of All Inventions Richard Dawkins, Stewart Brand, Joseph Traub and others answer the question: What was the most important invention of the past two thousand years? This special feature marks the first collaboration between FEED and Edge, John Brockman's invitation-only Internet forum, where hundreds of the world's leading scientists and thinkers share their thoughts on issues ranging from the meaning of numbers to genetics to affirmative action. Readers can visit the Edge site for even more nominations, and can post their own suggestions in the Loop. -- The Editors _________________________________________________________________ January 5, 1999 Salon "What's the Mother of All Inventions?" By Scott Rosenberg The list makes for an enjoyable read -- if you can get over the participants' utter inability to remain within the question's 2000-year bounds, suggesting that the most important invention of this era is the spirit of rebellion against arbitrary rules. _________________________________________________________________ January 4, 1999 World News Tonight -- ABC News Comments by Peter Jennings _________________________________________________________________ January 4, 1999 Newsweek Magazine -- Newsweek.com "The Power of Big Ideas" By Sharon Begley Was the light bulb more important than the pill? An online gathering of scientists nominates the most important inventions of the past 2,000 years. Some of their choices might surprise you.
Newsweek on Air -- Related Audio Interview by David Alpern _________________________________________________________________ January 4, 1999 The Wall Street Journal -- The Wall Street Journal Interactive (Subscription Required) "The Nominees for Best Invention Of the Last Two Millennia Are . . ." By David Bank, Staff Reporter of The Wall Street Journal John Brockman is the premier literary agent of the digerati, so when he asked 1,000 scientists and other techno-thinkers to suggest the most important invention of the past 2,000 years, the responses sounded a lot like proposals for yet another millennial book. _________________________________________________________________ January 4, 1999 The Daily Telegraph The Pill and the Birth of Invention: From Hay and Mozart to the Internet and clocks, scientists nominate man's major achievements, says Roger Highfield Nobel laureate Prof. Philip Anderson, philosopher Daniel C. Dennett, biologist Prof. Richard Dawkins and Sir John Maddox are among the 100 or so contributors who have nominated inventions ranging from the atomic bomb and board games to the Internet, the Hindu-Arabic number system and anaesthesia. _________________________________________________________________ January 4, 1999 DaveNet "Welcome to 1999!" by Dave Winer Congratulations to John Brockman and the people at edge.org. This is an incredible source of new thoughts. I highly recommend it to DaveNet readers.....Sites like www.edge.org show what can be done when there's moderation and thoughtfulness and a little bit of editing. We can learn from each other. The world is not filled with bullshit. There are interesting new ideas, and new perspectives on old ideas. _________________________________________________________________ WHAT IS THE MOST IMPORTANT INVENTION IN THE PAST TWO THOUSAND YEARS? _________________________________________________________________ Colin Blakemore: My choice for the most important invention? The contraceptive pill. Why?
Well, there are, of course, the well-rehearsed answers to that question. The pill did indeed fertilize the sexual liberation of the sixties, did stimulate feminism and the consequent erosion of conventional family structure in Western society -- perhaps the most significant modification in human behaviour since the invention of shamanism. It did help to change our concept of the division of labour, to foster the beginnings of an utterly different attitude to the social role of women. But arguably the most important sequel of the pill is the growing conception that our bodies are servants of our minds, rather than vice versa. This relatively low-tech invention has triggered a cultural and cognitive revolution in our self-perception. It has contributed to our ability to accept organ transplantation, the notion of machine intelligence, gene therapy and even, eventually, germ-line genetic manipulation. It has shifted the quest of human beings from controlling their physical environment to controlling themselves -- their own bodies and hence their physical destinies. COLIN BLAKEMORE is Waynflete Professor of Physiology, University of Oxford; Director, Oxford Centre for Cognitive Neuroscience; President of the British Association for the Advancement of Science, 1997-8; and author of The Mind's Brain. _________________________________________________________________ Steven Rose: I don't need a page. The answer is clear: inventions are concepts, not just technologies, so the most important are the concepts of democracy, of social justice, and the belief in the possibility of creating a society free from the oppressions of class, race, and gender. STEVEN ROSE, neurobiologist, is Professor of Biology and Director, Brain and Behaviour Research Group, The Open University; author of Lifelines; The Making Of Memory; Not In Our Genes; From Brains To Consciousness (Ed.). See EDGE: "THE TWO STEVES: Pinker vs. Rose -- A Debate" (Part I) and (Part II).
_________________________________________________________________ Joseph Traub: My nomination is the invention of the scientific method. The Greeks believed we could understand the world rationally. But the scientific method requires that we ask questions of nature by experimentation. This has led to the science and technology that has transformed the world. JOSEPH TRAUB is Edwin Howard Armstrong Professor of Computer Science at Columbia University and External Professor at the Santa Fe Institute. He is the author of nine books, including the recently published Complexity And Information. See EDGE: "The Unknown and The Unknowable: A Talk With Joseph Traub". _________________________________________________________________ Mihaly Csikszentmihalyi: I always liked Lynn White's story about how the stirrup revolutionized warfare and made feudal society and culture possible. Or Lefebvre des Noëttes' argument about how the invention of the rudder made extensive sailing and the consequent expansion of Europe and its colonization of the world possible. But it's sobering to realize that it took us over one thousand years to realize the impact of these artifacts. So I am not at all sure we have at this time a good grip on what the most important inventions of the past millennia have been. Certainly the contraceptive pill is a good candidate, and so is the scientific method. I am also intrigued by the effects of such inventions as the flag -- a symbol of belonging that millions will follow to ruin or victory independently of biological connectedness; or the social security card, which signifies that we are not alone and our welfare is a joint problem for the community; or the invention of civil rights, which, however abused and misused, is pointing us towards a notion of universal human dignity that might yet eclipse in importance all the technological marvels of the millennium. MIHALY CSIKSZENTMIHALYI is professor of psychology and education at the University of Chicago.
He is the author of Flow: The Psychology of Optimal Experience, The Evolving Self: A Psychology For the Third Millennium, Creativity, and Finding Flow (A Master Minds Book). _________________________________________________________________ Marvin Minsky: In his work on the foundations of chemistry, it occurred to Antoine Lavoisier (and also, I suppose, to Joseph Priestley) that the smell of a chemical was not necessarily a 'property' of that chemical, but a property of some related chemical that had the form of a gas, which therefore could reach the nose of the observer. Thus solid sulfur itself has no smell, but its gaseous relatives, sulfur dioxide and hydrogen sulfide, have plenty of it. Perhaps this tiny insight was the key to the transformation of chemistry from a formerly incoherent field into the great science of the 19th and 20th centuries. MARVIN MINSKY is a mathematician and computer scientist; Toshiba Professor of Media Arts and Sciences at the Massachusetts Institute of Technology; cofounder of MIT's Artificial Intelligence Laboratory. He is the author of eight books, including The Society of Mind. See EDGE: "Consciousness is a Big Suitcase: A Talk with Marvin Minsky"; The Third Culture, Chapter 8. _________________________________________________________________ Philip W. Anderson: The question is impossible to answer with one thing; one could for instance say with some justification "the germ theory of disease," but then that goes back to the microscope -- otherwise no one would ever have seen a germ -- and that to the lens, and eyeglasses may be as important as germs, and so on. But I will give you my entry; to the amazement of my colleagues who think of me as the ultimate antireductionist, I will suggest a very reductionist idea: the quantum theory, and I include emphatically quantum field theory.
The quantum theory forces a revision of our mode of thinking which is far more profound than Newtonian mechanics or the Copernican revolution or relativity. In a sense it absolutely forces us not to be reductionist if we are to keep our sanity, since it tells us that we are made up of anonymous identical quanta of various quantum fields, so that only the whole has any identity or integrity. Yet it also tells us that we really completely know the rules of the game which all these particles and quanta are playing, so that if we are clever enough we can understand everything about ourselves and our world. Note that I said understand, not predict -- the latter is really in principle impossible, for reasons which have little to do with the famous Uncertainty Principle and a lot to do with exponential explosions of computations. I would agree with whoever said "the scientific method" if I thought that was a single thing invented at some identifiable time, but I know too much history and see too much difference between different sociologies of fields. Why has no one mentioned the printing press yet? The other really profound discovery is the molecular basis of evolution, for which probably Oswald Avery deserves more credit than anyone. Evolution itself has, like the scientific method, much too complicated a history to class as a single invention. PHILIP W. ANDERSON is a Nobel laureate physicist at Princeton and one of the leading theorists on superconductivity. He is the author of A Career in Theoretical Physics, and Economy as a Complex Evolving System. _________________________________________________________________ Reuben Hersh: The most important invention of all time was the interrogative sentence, i.e., the asking of questions. However, the original request was for the most important invention of the last 2,000 years, not of all time. To that I would say, space travel. Of course, it may be centuries before we know the full consequences of space travel.
REUBEN HERSH is professor emeritus at the University of New Mexico and author of What Is Mathematics, Really? and (with Philip J. Davis) The Mathematical Experience, winner of the National Book Award in 1983. See EDGE: "What Kind Of Thing Is A Number? A Talk With Reuben Hersh". _________________________________________________________________ Howard Gardner: Another good question! My perhaps eccentric but nonetheless heartfelt nomination is Western classical music, as epitomized in the compositions of Bach, Beethoven, Brahms, and above all Mozart. Music is a free invention of the human spirit, less dependent upon physical or physiological inventions than most other contrivances. Musical compositions in the Western tradition represent an incredible cerebral achievement, one that is not only appreciated but also imitated or elaborated upon wherever it travels. Most inventions -- from nuclear energy to antibiotics -- can be used for good or ill. Classical music has probably given more pleasure to more individuals, with less negative fallout, than any other human artifact. Finally, while no one can compose like Mozart and few can play like Heifetz or Casals, anyone who works at it can perform in a credible way -- and, courtesy of software, even those of us unable to play an instrument or create a score can now add our own fragments to an ever expanding canon. HOWARD GARDNER is Professor of Education at Harvard University. His numerous books include Leading Minds, Frames of Mind, Multiple Intelligences, The Mind's New Science: A History of the Cognitive Revolution, The Unschooled Mind, To Open Minds, Creating Minds, and Extraordinary Minds (Master Minds Series). See EDGE: "Truth, Beauty, and Goodness: Education for All Human Beings," A Talk With Howard Gardner. _________________________________________________________________ Daniel C. Dennett: The battery, the first major portable energy packet in the last few billion years.
When simple prokaryotes acquired mitochondria several billion years ago, these amazingly efficient portable energy devices opened up Design Space to multicellular life of dazzling variety. Many metazoa developed complex nervous systems, which gave the planet eyes and ears for the first time, expanding the epistemic horizons of life by many orders of magnitude. The modest battery (and its sophisticated fuel cell descendants), by providing energy for autonomous, free-ranging, unplugged artifacts of dazzling variety, is already beginning to provide a similarly revolutionary cascade of developments. Politically, the transistor radio and cell phone are proving to be the most potent weapons against totalitarianism ever invented, since they destroy all hope of centralized control of information. By giving every individual autonomous prosthetic extensions of their senses (think of how camcorders are revolutionizing scientific data-gathering possibilities, for instance), batteries enable fundamental improvements in the epistemological architecture of our species. The explosion of science and technology that may eventually permit us to colonize space (or save our planet from a fatal collision) depends on our ability to store and extract electrical power ubiquitously. Our batteries are still no match for the mitochondrial ATP system -- a healthy person with a backpack can climb over mountains for a week without refueling, something no robot could come close to doing -- but they open up a new and different cornucopia of competences. DANIEL C. DENNETT, a philosopher, is Director of the Center for Cognitive Studies, and Distinguished Arts and Sciences Professor at Tufts University. He is author of Darwin's Dangerous Idea: Evolution and the Meanings of Life, Consciousness Explained, Brainstorms, Kinds of Minds (Science Masters Series), and coauthor with Douglas Hofstadter of The Mind's I. See The Third Culture, Chapter 10. 
_________________________________________________________________ Freeman Dyson: This is a good question. My suggestion is not original. I don't remember who gave me the idea, but it was probably Lynn White, with Murray Gell-Mann as intermediary. The most important invention of the last two thousand years was hay. In the classical world of Greece and Rome and in all earlier times, there was no hay. Civilization could exist only in warm climates where horses could stay alive through the winter by grazing. Without grass in winter you could not have horses, and without horses you could not have urban civilization. Some time during the so-called dark ages, some unknown genius invented hay, forests were turned into meadows, hay was reaped and stored, and civilization moved north over the Alps. So hay gave birth to Vienna and Paris and London and Berlin, and later to Moscow and New York. FREEMAN DYSON is Professor of Physics at the Institute for Advanced Study in Princeton. His professional interests are in mathematics and astronomy. Among his many books are Disturbing the Universe, From Eros to Gaia, and Imagined Worlds. _________________________________________________________________ William Calvin: Computers, not for current reasons but because they're essential to prevent a collapse of civilization in the future. Computers may allow us to understand the earth's fickle climate and how it is affected by detours of the great ocean currents. These detours cause abrupt coolings within a decade that last for centuries, sure to set off massive warfare as the population downsizes to match the crop failures. "Natural" though these worldwide coolings have been in the past, with their forest fires and population crashes, they're not any more inevitable than local floods -- if we learn enough about the nonlinear mechanisms in order to stabilize climate. 
Computer simulations are the key to a "preventative medicine" of climate, which may allow human scientific ingenuity to keep civilization from unraveling in another episode of cool, crash, and burn. WILLIAM H. CALVIN is a theoretical neurophysiologist on the faculty of the University of Washington School of Medicine who writes about the brain and evolution; author of The River That Flows Uphill, The Throwing Madonna, The Cerebral Symphony, Conversations with Neil's Brain (with George A. Ojemann), The Cerebral Code, and How Brains Think (Science Masters Series). See EDGE: "Competing for Consciousness: A Talk with William Calvin". _________________________________________________________________ David Shaw: I know it would probably be more helpful to add something new to the list, but I found Joe Traub's nomination so compelling that I'd feel dishonest doing anything but seconding it. It's hard to imagine how different our lives would be today without the steady accrual of both knowledge and technology that has accompanied the rigorous application of the scientific method over a surprisingly small number of human generations. While the notion of formulating well-explicated, testable conjectures and subjecting them to potential refutation through controlled experimentation (and, where appropriate, statistical analysis) is now second nature to those of us who work in the sciences, it's easy to forget that we weren't born with an intuitive understanding of this approach, and had we lived two thousand years ago, we would never have been taught to use it. Although the apparatus of formal logic would probably rate a close second in my book, I join Joe in casting my vote for the scientific method. DAVID E. SHAW is the chairman of D. E. Shaw & Co., a global investment bank whose activities center on various aspects of the intersection between technology and finance, and of Juno Online Services, the world's second largest Internet access provider.
He also serves as a member of President Clinton's Committee of Advisors on Science and Technology, and previously served on the faculty of the Computer Science Department at Columbia University. _________________________________________________________________ Roger Schank: We are using it now. The internet. Of course the internet relies on numerous other inventions (chips, networking, CRTs, telephones, electricity, etc.). The reason why the internet isn't an obvious choice at first glance (besides the fact that it is so present in our lives we can fail to notice it) is that its power has not yet begun to fully manifest itself. We still have schools, offices, the post office, telephone companies, places of entertainment, shopping malls and such, but we won't for long. Information delivery methods affect every aspect of how we live. If we don't have to walk to town to find out what's going on, or to shop, or to learn, or to work, why will we go to town? Schools (which have not been able to change) will completely transform themselves when better courses can be built on the internet than could possibly be delivered in a university. Of course, we haven't seen that yet, but when the best physicists in the world combine to deliver a learn-by-doing simulation that allows students to try things out and discuss what they have done with every important (virtual) physicist who has something to say about what they have done, the only thing universities will have to offer will be football. Shopping malls aren't gone yet but they will be. Why go to a store to buy music CDs any more? You can listen to samples of whatever you want and click a button for delivery while seated at home. Any object that needn't be felt and perused to be purchased will find no better delivery method than the internet. Newspapers? Not dead yet, but they will be. Pick an aspect of the way we live today and it will change radically in the coming years because of the internet.
Life (and human interaction) in fifty years will be so different we will hardly recognize the social structures that will evolve. I don't know if we will be happier, but we will be better informed. ROGER C. SCHANK, computer scientist and cognitive psychologist, is director of The Institute for Learning Sciences at Northwestern University, where he is John Evans Professor of Electrical Engineering and Computer Science as well as Professor of Psychology and of Education and Social Policy; author of The Creative Attitude: Learning to Ask and Answer the Right Questions, Tell Me A Story, and Engines for Education . See The Third Culture, Chapter 9. _________________________________________________________________ Stephen Budiansky: There is an inherent bias in all such surveys, because everyone strives to be original and surprising and so shuns the obvious but probably more correct answers -- such as steel, or moveable type, or antibiotics, to name but three obvious things that have utterly transformed not only how people live but the way they experience life. The only way I can think of being surprising is to violate John's terms and go back 6,000 years. But if I will be permitted to do so, I would argue that the single invention that has changed human life more than any other is the horse -- by which I mean the domestication of the horse as a mount. The horse was well on its way to extinction when it was domesticated on the steppes of Ukraine 6,000 years ago, but from the moment it entered the company of man the horse repopulated Europe with a swiftness that announced the arrival of a new tempo of life and cultural change. 
Trade over thousands of miles suddenly sprang up, communication with a rapidity never before experienced became routine, exploration of once forbidding zones became possible, and warfare achieved a violence and degree of surprise that spurred the establishment and growth of fortified permanent settlements, the seeds of the great cities of Europe and Asia. For want of the horse, civilization would have been lost. STEPHEN BUDIANSKY, Correspondent for The Atlantic Monthly, is the author of If a Lion Could Talk: Animal Intelligence and the Evolution of Consciousness and The Nature of Horses: Exploring Equine Evolution, Intelligence, and Behavior. _________________________________________________________________ Richard Saul Wurman: ELECTRICITY CONTAINS THE WORD CITY -- WHICH CERTAINLY IS OUR MOST COMPLEX INVENTION & FROM THE DENSITY OF HUMAN INTERACTION ALL ELSE FLOWS. RICHARD SAUL WURMAN is the chairman and creative director of the TED conferences. He is also an architect, a cartographer, the creator of the Access Travel Guide Series, and the author and designer of more than sixty books, including Information Architects, Follow the Yellow Brick Road and Information Anxiety. _________________________________________________________________ Stewart Brand: The question does most of the answering: "What Is The Most Important Invention In The Past Two Thousand Years?" That lets out agriculture, writing, mathematics, and money. Too early. "Most important" would suggest looking for inventions near the beginning of the period, since they would have had the most time for accumulative impact. Where did that number "Two Thousand" come from? From the approaching Year 2000, which is a Christian Era date -- now referred to as "Common Era": 2000 CE. That's quite a clue. The most important cultural -- hence all-embracing -- invention is a religion. Only two major religions have been invented in the last two millennia, Christianity and Islam. 
Try to imagine the last two millennia, or the present, without them. STEWART BRAND is founder of The Whole Earth Catalog, cofounder of The Well, cofounder of Global Business Network, president of The Long Now Foundation, and author of The Media Lab: Inventing the Future at MIT and How Buildings Learn. See EDGE: "The Clock of the Long Now"; Digerati, Chapter 3. _________________________________________________________________ George Dyson: The Universal Turing Machine. Because it is universal. Not only as the theoretical archetype for digital computing as we practice it today, but as a least common denominator -- translating between sequence in time and pattern in space -- that lies at the foundations of mathematics and suggests the possibilities of a communications medium we have only just begun to explore. Life and intelligence that achieves widespread distribution across the cosmos (and over time) may be expected to assume a digital representation, at least in some phases of the life cycle, to facilitate electromagnetic transmission, cross-platform compatibility, and long-term storage. This requires a local substrate. And we are doing our best, thanks to the proliferation of our current instantiation of the UTM (known as the PC), to help. When we establish contact with such an intelligence, will we receive instructions for building a machine to upload Jodie Foster? Probably not. The download will proceed the other way. To paraphrase Marvin Minsky: "Instead of sending a picture of a cat, there is one area in which they can send the cat itself." GEORGE DYSON is the leading authority in the field of Russian Aleut kayaks; he has been a subject of the PBS television show Scientific American Frontiers. He is the author of Baidarka and Darwin Among The Machines: The Evolution Of Global Intelligence. See EDGE: "Darwin Among the Machines; or, The Origins of Artificial Life"; "CODE - George Dyson & John Brockman: A Dialogue".
_________________________________________________________________ Marney Morris: (Well John, you did say most important invention, not the one we should be most proud of). The invention (and detonation) of the atomic bomb has changed the world more profoundly than any other human development in the last 2000 years. In seconds, nearly 200,000 people were dead or dying in Hiroshima, and consciousness was forever changed on our planet. Although the arms race fueled our economy for a few more decades, the bomb set into motion a 'warfare stalemate'. With the ability to destroy our planet within the realm of possibility, we were forced to examine our rules of war, and seek new means of engagement to work out our differences. And although hundreds of wars are going on at any time on our planet, there are checks and balances, underscored by the horror of Hiroshima and Nagasaki. Please note that if you were to have phrased the question to include time prior to 2000 years ago, then I would have suggested that our most powerful invention would be song. MARNEY MORRIS is president of Animatrix, which is publishing Sprocketworks, a next-generation learning program, early in 1999. She teaches interaction design at Stanford. _________________________________________________________________ V.S. Ramachandran: My personal favourite is the place value notation system, combined with the use of a symbol, 0, for zero to denote a nonexistent number; this marks the birth of modern mathematics. I think this is the greatest invention, but I am being a little jingoistic -- it was invented in India in the 4th or 5th century BC, systematised in the 4th century AD by the Indian astronomer Aryabhatta, and then transmitted to the West via the Arabs. And maths, of course, is essential for all science. V.S. RAMACHANDRAN, M.D., PH.D., is professor of neurosciences and psychology and Director of the Brain Perception Laboratory at the University of California in San Diego.
He is author of Phantoms In The Brain: Probing the Mysteries of the Human Mind (with Sandra Blakeslee). _________________________________________________________________ Jeremy Cherfas: Some of your jump-start friends and colleagues seem to have ignored your (arbitrary?) cutoff date, so I will too. I think you'd have to go a long way to find a more important invention than the basket. Without something to gather into, you cannot have a gathering society of any complexity, no home and hearth, no division of labour, no humanity. This is not an original insight. I ascribe it to Glyn Isaac, a sorely-missed palaeoanthropologist. The basket ranks right up there with hay, the stirrup, printing and what have you. While we're about it, though, I'd like to take issue with Dan Dennett's choice of the battery. Granted it has enabled all the things he says it has (and I seriously considered nominating the Walkman -- a bizarre idea, the tape recorder that doesn't record -- as the invention with most impact on our lives), but at what cost? All extant batteries (though not fuel cells) are inherently polluting and wasteful. It takes something like six times more energy to make a zinc-alkaline battery than the battery can store. I can't help but think that if a small portion of the effort that has gone into inventing "better" batteries had gone into, say, solar panels, our world and culture would be even more different. Thanks for a stimulating time. JEREMY CHERFAS, biologist and BBC Radio Four broadcaster, is author of The Seed Savers Handbook. _________________________________________________________________ Bart Kosko: Most important invention: CALCULUS The world today would be very different if the Greeks, and not Newton and Leibniz, had invented or "discovered" calculus. The world of today might have arrived a millennium or two earlier. Calculus was the real fruit of the Renaissance. It began by taking a fresh look at infinity -- at the infinitely small rather than the infinitely large.
And it led in one stroke to two great advances: It showed how to model change (the differential equation) and it showed how to find the best or worst solution to a well-defined problem (optimization). The first advance freed math from static descriptions of the world to dynamic descriptions that allowed things to change or evolve in time. This is literally where "rocket science" becomes a science. The second advance had more practical payoff because it showed how to minimize cost or maximize profit. Thomas Jefferson claimed to have used the calculus this way to design a more efficient plow. Someday we may use it to at least partially design our offspring to minimize bad health effects or (God forbid) maximize good behavior. Calculus lies at the heart of our modern world. Its equations led to the prediction of black holes. We built the first computers to run other calculus equations to predict where bombs would land. The recent evolution of calculus itself to the random version called "stochastic calculus" has led to how we price the mysterious financial "derivatives" contracts that underlie the global economy. Calculus has led us from seeing the world as what Democritus called mere "atoms and void" to seeing the world as atoms that move in a void that moves. BART KOSKO is professor of electrical engineering at the University of Southern California; he is author of Fuzzy Thinking and Nanotime. _________________________________________________________________ Stuart Hameroff: The most important invention in the past two thousand years is anesthesia. Have you ever had surgery? If so, either a) part of your body was temporarily "deadened" by "local" anesthesia, or b) you "went to sleep" with general anesthesia. Can you imagine having surgery, or needing surgery, or even possibly needing surgery without the prospect of anesthesia? 
And beyond the agony-sparing factor is an extra added feature -- understanding the mechanism of anesthesia is our best path to understanding consciousness. Anesthesia grew from humble beginnings. Inca shamans performing trephinations (drilling holes in patients' skulls to let out evil humors) chewed coca leaves and spat into the wound, effecting local anesthesia. The systemic effects of cocaine were studied by Sigmund Freud, but cocaine's use as a local anesthetic in surgery is credited to Austrian ophthalmologist Karl Koller who in 1884 used liquid cocaine to temporarily numb the eye. Since then dozens of local anesthetic compounds have been developed and utilized in liquid solution to temporarily block nerve conduction from peripheral nerves and/or spinal cord. The local anesthetic molecules bind specifically on sodium channel proteins in axonal membranes of neurons near the injection site, with essentially no effects on the brain. On the other hand general anesthetic molecules are gases which do act on the brain in a remarkable fashion -- the phenomenon of consciousness is erased completely while other brain activities continue. General anesthesia by inhalation developed in the 1840's, involving two gases used previously as intoxicants. Soporific effects of diethyl ether ("sweet vitriol") had been known since the 14th century, and nitrous oxide ("laughing gas") was synthesized by Joseph Priestley in 1772. In 1842 Crawford Long, a Georgia physician with apparent personal knowledge of "ether frolics" successfully administered diethyl ether to James W. Venable for removal of a neck tumor. However Long's success was not widely recognized, and it fell to dentist Horace Wells to publicly demonstrate the use of inhaled nitrous oxide for tooth extraction at the Massachusetts General Hospital in 1844. 
Although Wells had apparently used the technique previously with complete success, during the public demonstration the gas-containing bag was removed too soon and the patient cried out in pain. Wells was denounced as a fake; however, two years later, in 1846, another dentist, William T.G. Morton, returned to the "Mass General" and successfully used diethyl ether on patient William Abbott. Morton used the term "letheon" for his then-secret gas, but was persuaded by Boston physician/anatomist Oliver Wendell Holmes (father of the Supreme Court Justice) to use the term anesthesia. Although its use became increasingly popular, general anesthesia remained an inexact art with frequent deaths due to overdose and effects on breathing until after World War II. Hard lessons were learned following the attack on Pearl Harbor -- anesthetic doses easily tolerated by healthy patients had tragic consequences for those in shock due to blood loss. The advent of the endotracheal tube (allowing easy inhalation/exhalation and protection of the lungs from stomach contents), anesthesia gas machines, safer anesthetic drugs and direct monitoring of heart, lungs, kidneys and other organ systems have made modern anesthesia extremely safe. However, one mystery remains. Exactly how do anesthetic gases work? The answer may well illuminate the grand mystery of consciousness. Inhaled anesthetic gas molecules travel through the lungs and blood to the brain. Barely soluble in water/blood, anesthetics are highly soluble in a particular lipid-like environment akin to olive oil. It turns out the brain is loaded with such stuff, both in lipid membranes and tiny water-free ("hydrophobic") lipid-like pockets within certain brain proteins. To make a long story short, Nicholas Franks and William Lieb at Imperial College in London showed in a series of articles in the 1980s that anesthetics act primarily in these tiny hydrophobic pockets in several types of brain proteins.
The anesthetic binding is extremely weak and the pockets are only 1/50 of each protein's volume, so it's unclear why such seemingly minimal interactions should have significant effects. Franks and Lieb suggested the mere presence of one anesthetic molecule per pocket per protein prevents the protein from changing shape to do its job. However, subsequent evidence showed that certain other gas molecules could occupy the same pockets and not cause anesthesia (and in fact cause excitation or convulsions). Anesthetic molecules just "being there" can't account for anesthesia. Some natural process critical to consciousness and perturbed by anesthetics must be happening in the pockets. What could that be? Anesthetic gases dissolve in hydrophobic pockets by extremely weak quantum mechanical forces known as London dispersion forces. The weak binding accounts for easy reversibility -- as the anesthetic gas flow is turned off, concentrations drop in the breathing circuit and blood, anesthetic molecules are gently sucked out of the pockets and the patient wakes up. Weak but influential quantum London forces also occur in the hydrophobic pockets in the absence of anesthetics and govern normal protein movement and shape. A logical conclusion is that anesthetics perturb normally occurring quantum effects in hydrophobic pockets of brain proteins. The quantum nature of the critical effects of anesthesia may be a significant clue. Several current consciousness theories propose systemic quantum states in the brain, and as consciousness has historically been perceived as the contemporary vanguard of information processing (J.B.'s "technology = new perception") the advent of quantum computers will inevitably cast the mind as a quantum process. The mechanism of anesthesia suggests such a comparison will be more than mere metaphor. STUART HAMEROFF, M.D., is Professor, Departments of Anesthesiology and Psychology, University of Arizona.
He is coeditor of Toward a Science of Consciousness: The First Tucson Discussions and Debates and Toward a Science of Consciousness II: The Second Tucson Discussions and Debates. _________________________________________________________________ Michael Nesmith: After reading the various answers to the question, I'm going to sneak through the door opened by Philip Anderson and nominate a discovery instead of an invention. And it is the Copernican Theory. Generally it was a counter-intuitive idea, and it ran opposite to the interpretation of the senses (not to mention the Church). I mean, one could "see" the sun going across the sky. What could be more obvious than that? A nice move. It took a lot of intellectual courage, and taught us more than just what it said. MICHAEL NESMITH is an artist, writer, and businessman; former cast member of "The Monkees". _________________________________________________________________ Clifford Pickover: As usual you are a font of important, stimulating ideas and have gathered together an awesome collection of minds for your latest survey. Here is my response. In 105 AD, Ts'ai Lun reported the invention of paper to the Chinese Emperor. Ts'ai Lun was an official to the Chinese Imperial court, and I consider his early form of paper to be humanity's most important invention and progenitor of the Internet. Although recent archaeological evidence places the actual invention of papermaking 200 years earlier, Ts'ai Lun played an important role in developing a material that revolutionized his country. From China, papermaking moved to Korea and Japan. Chinese papermakers also spread their handiwork into Central Asia and Persia, from which traders introduced paper to India. This is why Ts'ai Lun is one of the most influential people in history. Today's Internet evolved from the tiny seed planted by Ts'ai Lun. Both paper and the Internet break the barriers of time and distance, and permit unprecedented growth and opportunity.
In the next decade, communities formed by ideas will be as strong as those formed by geography. The Internet will dissolve away nations as we know them today. Humanity will become a single hive mind, with a group intelligence, as geography becomes putty in the hands of the Internet sculptor. Chaos theory teaches us that even our smallest actions have amplified effects. Now more than ever before this is apparent. Whenever I am lonely at night, I look at a large map depicting 61,000 Internet routers spread throughout the world. I imagine sending out a spark, an idea, and a colleague from another country echoing that idea to his colleagues, over and over again, until the electronic chatter resembles the chanting of monks. I agree with author Jane Roberts who once wrote, "You are so part of the world that your slightest action contributes to its reality. Your breath changes the atmosphere. Your encounters with others alter the fabrics of their lives, and the lives of those who come in contact with them." CLIFFORD A. PICKOVER is a research staff member at the IBM T. J. Watson Research Center. He is the author of over 20 books translated into 10 languages on a broad range of topics in science and art. His internet web site has attracted nearly 200,000 visitors. _________________________________________________________________ Margaret Wertheim: Good question! My immediate response (without even thinking) was the contraceptive pill. My mother had six children in five and a half years and it was only the invention of the pill that saved our family from becoming a mini-nation-state in its own right. But since Colin Blakemore has already described so well its immense importance, let me suggest another "invention" -- electrification. Why electrification? For a start, one of my most vivid childhood memories is of my mother seemingly spending endless hours washing nappies and clothes by hand.
The electric washing machine and other electric home gadgets (vacuum cleaners, fridges, food processors et cetera) have freed billions of women from the endless drudgery of heavy-duty housework. By bringing us light and heat and power on tap, electricity has truly transformed life -- not just in the home, but in almost every industry. Modern manufacturing would be impossible without electricity. Ditto the modern office. The ability to literally transport power is, I think, the most revolutionary technology to come out of modern science. And of course, it is the ability to transport electric power at the micro level that has made possible silicon chips, and the attendant computer and information revolution. Far more than Einstein and Bohr, Faraday and Maxwell are the true "heroes" of the modern technological world. MARGARET WERTHEIM is a science writer, and a research associate of the American Museum of Natural History. She is the author of Pythagoras' Trousers, a history of physics and religion. _________________________________________________________________ Richard Dawkins: THE SPECTROSCOPE The telescope resolves light from very far away. The spectroscope analyses and diagnoses it. It is through spectroscopy that we know what the stars are made of. The spectroscope shows us that the universe is expanding and the galaxies receding; that time had a beginning, and when; that other stars are like the sun in having planets where life might evolve. In 1835, Auguste Comte, the French philosopher and founder of sociology, said of the stars: "We shall never be able to study, by any method, their chemical composition or their mineralogical structure . . . Our positive knowledge of stars is necessarily limited to their geometric and mechanical phenomena." Even as he wrote, the Fraunhofer lines had been discovered: those exquisitely fine barcodes precisely positioned across the spectrum; those telltale fingerprints of the elements.
The spectroscopic barcodes enable us to do a chemical analysis of a distant star when, paradoxically (because it is so much closer), we cannot do the same for the moon -- its light is all reflected sunlight and its barcodes those of the sun. The Hubble red shift, majestic signature of the expanding universe and the hot birth of time, is calibrated by the same Fraunhofer barcodes. Rhythmic recedings and approachings by stars, which betray the presence of planets, are detected by the spectroscope as oscillating red and blue shifts. The spectroscopic discovery that other stars have planets makes it much more likely that there is life elsewhere in the universe. For me, the spectroscope has a poetic significance. Romantic poets saw the rainbow as a symbol of pure beauty, which could only be spoiled by scientific understanding. This thought famously prompted Keats in 1817 to toast "Newton's health and confusion to mathematics", and in 1820 inspired his well known lines: "Philosophy will clip an Angel's wings, Conquer all mysteries by rule and line, Empty the haunted air, and gnomed mine -- Unweave a rainbow . . ." Humanity's eyes have now been widened to see that the rainbow of visible light is only an infinitesimal slice of the full electromagnetic spectrum. Spectroscopy is unweaving the rainbow on a grand scale. If Keats had known what Newton's unweaving would lead to -- the expansion of our human vision, inspired by the expanding universe -- he could not have drunk that toast. RICHARD DAWKINS is an evolutionary biologist and the Charles Simonyi Professor For The Understanding Of Science at Oxford University; Fellow of New College; author of The Selfish Gene, The Extended Phenotype, The Blind Watchmaker, River out of Eden (Science Masters Series), Climbing Mount Improbable, and the recently published Unweaving the Rainbow. See EDGE: "Science, Delusion, and the Appetite for Wonder: A Talk by Richard Dawkins"; The Third Culture: Chapter 3. 
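The red and blue shifts Dawkins describes come down to simple arithmetic: the fractional displacement of a known Fraunhofer line gives the redshift z, and for small z the recession velocity is roughly the speed of light times z. A minimal sketch in Python (the hydrogen-alpha rest wavelength is a real figure; the observed wavelength and the galaxy it belongs to are hypothetical, made up for illustration):

```python
# Redshift from a displaced spectral line: z = (lambda_obs - lambda_rest) / lambda_rest.
# For z much smaller than 1, the recession velocity is approximately v = c * z.

C_KM_S = 299_792.458        # speed of light, km/s
H_ALPHA_REST_NM = 656.281   # rest wavelength of the hydrogen-alpha line, nm

def redshift(observed_nm: float, rest_nm: float) -> float:
    """Fractional wavelength shift of a spectral line."""
    return (observed_nm - rest_nm) / rest_nm

def recession_velocity_km_s(z: float) -> float:
    """Non-relativistic Doppler approximation, valid only for small z."""
    return C_KM_S * z

# A hypothetical galaxy whose hydrogen-alpha line appears at 660.0 nm:
z = redshift(660.0, H_ALPHA_REST_NM)
v = recession_velocity_km_s(z)
print(f"z = {z:.5f}, v = {v:.0f} km/s")  # positive z: the line is shifted toward the red
```

A negative z (a blue shift) would indicate approach; the rhythmic alternation of small red and blue shifts in a star's spectrum is exactly the planet-betraying wobble mentioned above.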
_________________________________________________________________ David Haig: My suggestion for the most important invention of the last two millennia is the computer because of the way it extends the capacities of the human mind for accurately performing large numbers of calculations and for keeping track of and accessing vast bodies of data. As with any great invention, these enhanced abilities have a light and a dark side. As a scientist I am now able to answer questions that could not be answered prior to the computer. On the dark side is the loss of privacy and the enhanced potential for social control made possible by the ability to manipulate large databases of personal information. As another candidate, my mother has said that her all-time favorite invention is the telephone because of how it allows her to stay in intimate and immediate contact with distant friends. DAVID HAIG is an evolutionary biologist and a member of the Department of Organismic and Evolutionary Biology, Harvard University. _________________________________________________________________ Christopher G. Langton: Like others who have responded, I think the choice is obvious. The remarkable thing is that "the obvious choice" is different for everyone! To my mind, the most important inventions are those which have forced the largest changes in our world-view. On the basis of this criterion, I pick two (for reasons listed below): the telescope, and the theory of evolution by natural selection. I pick two because it seems to me that there are two major categories of important inventions: a) complexity increasing, and b) complexity decreasing. By complexity increasing, I mean those inventions that open up vast new realms of data, which cannot be accounted for by the existing world view, making the universe less understandable, and therefore seemingly more complex.
By complexity decreasing, I mean those inventions that identify a pattern or algorithm in vast realms of data, ridding that data of a good deal of its apparent complication. These inventions force alterations to our world view to account for previously unaccountable data, or to account for it more directly and simply, making the universe more understandable, and therefore seemingly less complex. The former tend to take the form of instruments or devices -- physical constructs -- while the latter tend to take the form of concepts, theories, or hypotheses -- mental constructs. Both qualify as inventions.* (*To be careful, the former also involves a mental construct -- a device alone is useless without the mental construct that points it in the right direction.) In the former category, nothing rivals the telescope. No other device has initiated such a massive reconstruction of our world view. It forced us to accept the Earth, and ourselves, as "merely" a part of a larger cosmos. Of course, numerous theories besides the earth-centered universe existed before its invention, but the telescope opened the doors to the flood of data that would resolve what were previously largely philosophical disputes. The microscope -- a relative of the telescope -- also opened the door to a previously unimagined universe, and runs a close second to the telescope on the world-view shaking Richter scale. In the latter category, there are many brilliant candidates, but I think that Darwin's invention of the theory of evolution by natural selection outshines them all. It is perhaps the only truly general theory in Biology, a field much more complex than physics. If we discover life elsewhere in the universe it is likely to be the only biological theory that will carry over from our terrestrial biology. Darwin's theory reduced tremendously the complication of zoological data. 
Critically, as with the telescope, it has put tremendous pressure on the previous world-view to accommodate man as "merely" a part of a much larger nature. This pressure is still largely being resisted, but the outcome is clear. A close second would be the Second Law of Thermodynamics. Although the Second Law has not, perhaps, posed such a profound challenge to our collective world view, it has tremendously reduced the complexity of a great body of data (and it profoundly affects the world view of anyone who studies it in detail!) I would have nominated the computer, but I think that, although it has profoundly affected our daily routines, it has not yet profoundly affected our world view. The computer is a kind of mathematical telescope, revealing to us a vast new realm of data about what kinds of dynamics follow from what sorts of rules -- we are constantly discovering new galaxies of mathematical reality with computers. However, it will be a while before these empirical discoveries force a profound alteration of our world view. CHRISTOPHER G. LANGTON a computer scientist, is internationally recognized as the "founder" of the field of Artificial Life. He is Chief Technology Officer at The Swarm Corporation, and editor of the Artificial Life journal. See The Third Culture, Chapter 21. _________________________________________________________________ Eric J. Hall: Quite a good question and some very interesting responses. However, I take a more pragmatic view. For me, the steam engine was the most important invention in the past two thousand years. The steam engine freed man and beast from physical labor. No other invention had so many different and versatile uses. Man could cut down entire forests to feed sawmills to build cities, quarry stone, propel trains and ships to make the world a smaller place, power factories, and generate electricity. Agrarian society was over and industrialism reigned. 
Most importantly, the steam engine created more leisure time for mankind. No longer was leisure a pastime for the idle rich. The pursuit of leisure, and the changes it created in society, far outstripped anything in the first 18 centuries. Without the steam engine, our society would be radically different from today. ERIC J. HALL is President of The Archer Group, a consulting firm specializing in emerging technology companies. He has helped found companies including Yahoo!, Women.com, and The ImagiNation Network. _________________________________________________________________ Clay Shirkey: My vote for "The Most Important Invention In the Past Two Thousand Years" is Gödel's Incompleteness Theorem. This single piece of mathematical jujitsu, proving unprovability, formally ended the strain of Western thought begun by Socrates and first fully fleshed out by Aristotle. The ancillary effects of that theory -- a rejection of master narrative, an understanding that we will never know all the answers, an acceptance of contradiction, and an embrace of complexity -- are just now making themselves felt in the dawn of the post-complete world. CLAY SHIRKEY is Professor, New Media Department of Film & Media, Hunter College. _________________________________________________________________ Keith Devlin: Of course, "What is the single most important invention of the past two thousand years?" is one of those questions that does not really have an answer, like "What is the best novel/symphony/movie?" But if I had to make a choice, it would be the Hindu-Arabic number system, which reached essentially its present form in the sixth century. Without it, Galileo would have been unable to begin the quantificational study of nature that we now call science. Today, there is scarcely any aspect of life that does not depend on our ability to handle numbers efficiently and accurately.
True, we now use computers to do much of our number crunching, but without the Hindu-Arabic number system we would not have any computers. Because of its linguistic structure, the Hindu-Arabic number system allows humans who have an innate linguistic fluency but only a very primitive number sense to use their ability with language in order to handle numbers of virtually any useful magnitude with as much precision as required. In addition to its use in arithmetic and science, the Hindu-Arabic number system is the only genuinely universal language on Earth, apart perhaps from the Windows operating system, which has achieved the near universal adoption of a conceptually and technologically poor product by the sheer force of market dominance. (By contrast, the Hindu-Arabic number system gained worldwide acceptance because it is far better designed and much more efficient, for human usage, than any other number system.) KEITH DEVLIN, a mathematician, is the author of Goodbye, Descartes: The End of Logic and the Search for a New Cosmology of the Mind; Life by the Numbers; and The Language of Mathematics: Making the Invisible Visible. _________________________________________________________________ Luyen Chou: I would have to vote for philosophical skepticism as the most important "invention" (if one thinks of invention as fabrication rather than discovery, as it is more archaically meant) of the past two thousand years. The notion that there is a "truth behind" things and a "bottom" to the matter has instilled in all of us, whether scientists, philosophers, theologians, or lay people, a maniacal obsession with improving our explanatory capabilities. As such, skepticism can be seen as the driving force behind science and technology, modern conceptions of faith, the soul, and the other.
Of course, one might argue that skepticism has been around for longer than two thousand years; but its characterization as a fundamental problem to be contended with before any constructive work can be done seems to me a peculiarly modern invention, a defining feature of our intensively self-conscious, post-Cartesian world. LUYEN CHOU is President and CEO of Learn Technologies Interactive in New York City, an interactive media developer and publisher. See EDGE: "Engineering Formalism and Artistry: The Yin and Yang of Multimedia: A Talk With Luyen Chou". _________________________________________________________________ Antonio R. Cabral, M.D.: I propose that the most important invention in the past two thousand years is: "Languages". If you take a look at the proposals you have received (or will receive) so far: the contraceptive pill, the scientific method (whatever that means), the quantum theory, and so on, they could not have even been thought out, let alone conveyed, without the aid of a language. I do not mean a language in particular, but all the languages, dead or alive. Of course one tends to think that live languages deserve the credit, but without the so-called "dead languages", such as Latin, the live ones simply would not exist. If one accepts that language is the most important invention in the past 2000 years, one has to concede that the "Human Brain" is the most important inventor during the same period. In my opinion, the printing press comes second to languages as the most important invention in the past 20 centuries; this makes Johann Gutenberg (c.1400-1468) the second most important inventor of all, since one can easily pinpoint him as the Father of the printed letter. Without a (written) language, especially when it conveys concepts and feelings, all cultures -- scientific, literary or otherwise -- would be little more than conceptless matter. The Third Culture simply could not breathe.
One can speculate ad nauseam about which language in the current state of world affairs, including the Internet, is the most important one of all. I have some ideas; to theorize about them, though, is beyond your original question. ANTONIO R. CABRAL, M.D., is Associate Professor of Medicine, National Autonomous University of Mexico. _________________________________________________________________ Hendrik Hertzberg: Philip Anderson asks the right question: "Why has no one mentioned the printing press yet?" I mean, doesn't it seem kind of obvious that printing -- under which would be subsumed all forms of large-scale reproduction of the written word, from handmade wooden type to the computer and word-processing program I'm using to write this -- was the most important invention of the past two thousand years? Printing led directly to mass literacy, democracy, the scientific revolution, cyberthis and cyberthat, and all those other good things. A more general observation. I notice that most of the responses you included in the email suggest that the most important invention of the past two thousand years, whatever it was, just happens to have happened in the past hundred years. Doesn't this reflect a bad case of chronocentrism, i.e., the irrational belief that one is lucky enough to be living in history's most important era? Given that people have been inventing things all along, isn't it unlikely that all the most important inventions would have happened in one little century out of twenty? Wouldn't it be more logical to expect them to be spaced out randomly over all twenty? Even if the twentieth is a particularly inventive century, isn't it a little myopic to imagine that the one we just happen to be living in is twenty times more inventive than any of the others? Maybe four or five times more inventive, but even that would be a stretch.
HENDRIK HERTZBERG, executive editor of The New Yorker since 1992, is the author of the book One Million and, with Martin Kalb, of Candidates. _________________________________________________________________ David Berreby: Interesting question. My candidate would be: the concept of information as a commodity, a thing that can be bought and sold. It's an ancient invention, dating back to the day of the fleet-footed messenger, but its enormous consequences had to wait for the acceleration of information-carrying technologies like the telegraph and the Internet. We're only now witnessing the cumulative impact, as the buying and selling of information begins to outweigh the buying and selling of stuff. Why is this so important? Because humans who trade in information behave like our hunter-gatherer ancestors. They are alert and adaptable to an ever-changing environment. They work in small groups. They are independent thinkers who dislike taking orders and are fervently egalitarian. They place their faith in face-to-face relationships, not authority or a title. For as long as humanity got its living by agriculture or industry, such traits had to be suppressed in favor of those more amenable to centralization, authority, large-scale enterprises. This epoch is coming to an end. In the post-industrial West we no longer value stability, steadfastness and predictability over change, adaptability and flexibility. We are no longer awed by political power, instead seeing those who hold it as just like us. (When I was a kid people worried about the "Imperial Presidency" becoming too awesome for a democracy to support. But then, when I was a kid, an ex-wrestler could not get elected governor of Minnesota.) Corporate types often remark that their 20-something employees can't take orders and expect to be able to dress as they please and bring their parrot to work. All this is supposed to be a consequence of prosperity. But it seems to me the shift is far more profound.
After a 7000-year detour through agriculture and industry, we are returning to the ways of our proud, individualistic, headstrong, small-group-dwelling forebears, and that will reshape the human community profoundly. And it's the move from a thing-economy to an information-economy that's making it happen. DAVID BERREBY'S writing about science and culture has appeared in The New York Times Magazine, The New Republic, Slate, The Sciences and many other publications. He is currently at work on a book about the psychology of Us versus Them. _________________________________________________________________ Charles Simonyi: In the spirit of completeness and risking chronocentrism big time, I nominate Public Key Cryptosystems as something invented during the last two thousand years which will remain useful long after the printing press exists only in the (electronic) history books next to the steam engine. PKC has three incredible properties: perfect privacy, perfect authentication, and a reliable carrier of value and contracts -- like gold used to be. All this in the digital environment where information can be easily and perfectly stored and copied. At a single stroke PKC transformed our vision of the asymptotic result of information technology from the 1984-ish nightmare to a realistic and ultimately attractive cyberspace where identity and privacy are not lost, despite our (and Orwell's) commonsense intuition to the contrary. CHARLES SIMONYI, Chief Architect, Microsoft Corporation, focuses on Intentional Programming, an ecology for abstractions which strives for maximal reuse of components by separating high level intentions from implementation detail. See EDGE: "Intentional Programming: A Talk with Charles Simonyi" and EDGE: "CODE II -- Farmer & Simonyi: A Reality Club Dialogue". _________________________________________________________________ Piet Hut: Building autonomous tools is my candidate for the most important invention.
Artificial complex adaptive systems, from robots to any type of autonomous agent, will change our world view in a qualitative way, comparable to the change brought by the use of thing-like tools. Tinkering with tools has shaped our view of the world and of ourselves. For example, the invention of the pump enabled us to understand the mechanical role of the heart. Science was born when laboratory apparatus was used to select which among mathematical theories of the physical world corresponds most closely to reality. But all those tools have been lifeless and soulless things, and it is no wonder that our scientific world view has tended to objectify everything. Grasping the proper role of the subject pole of experience, through the invention of subject-like tools, may provide the key to a far wider world view. With the invention of perspective, in the late Middle Ages, we shifted our collective Western experience one-sidedly into the object pole, leaving the subject pole out of the picture. We started looking at the world from behind a window, and a couple of centuries later, in science, we attempted to take a God's-eye view of the world. By now, we are coming around full circle, with our science and technology providing us the means to explore the role of the subject. We have only taken the first steps towards building artificial subjects. Just as our current artificial objects are vastly more complex than the first wheel or bow and arrow, our artificial subjects will grow more complex, powerful, and interesting over the centuries. But already we can see a glimmer of what lies ahead: our first attempts to build autonomous agents have taught us new concepts. As a result, we are now beginning to explore self-organizing ecological, economic, or social systems; areas of study where thing-like metaphors hopelessly fail. PIET HUT is professor of astrophysics at the Institute for Advanced Study, in Princeton.
He is involved in the project of building GRAPEs, the world's fastest special-purpose computers, at Tokyo University. _________________________________________________________________ Susan Blackmore: Birth control (or if you need it to be more specific, the pill). Why? Because freedom from constant childbearing means that women can become meme-spreaders like men -- working for their memes rather than their genes. This then means a change in the kinds of memes that propagate effectively, including all the memes of other inventions as well as the meme-spreading media, myths, science and the arts. In other words, it is important because it changes the whole of culture. Few single inventions have this effect on the whole meme pool. SUSAN BLACKMORE, Senior Lecturer in Psychology at the University of the West of England, Bristol, columnist for the Independent, and author of Dying To Live: Near-Death Experiences and In Search of the Light. _________________________________________________________________ James J. O'Donnell: If you read through this growing list, you will see that people tend to discover that the most important invention in the last 2000 years is something they just happen to know a lot about. Well, I know a lot about some important inventions -- like the codex book (and the consequent idea that a book can be a manual for living -- which leads us to the 19th century and its dead ends) and like the computer (which gives us a model for ignoring the manual and just living by experiment), but I think it is quite undeniable that there is something far more important going on: effectual health care. Not just antibiotics, not just birth control, not just anesthesia (to name things mentioned here), but the underlying fundamental fact that we have learned to cross the scientific method with care for human beings and save lives.
A thought experiment I like to have people play is this: review your own life and imagine what it would have been like without late-20th-century health care. Would you still be alive today? An astonishingly large number of people get serious looks on their faces and admit they wouldn't. I wouldn't, that's for sure. It's medical techniques, it's antibiotics, but it's also vitamin pills and -- in some ways most wondrously cost-effective of all -- soap, as in the soap doctors use to wash their hands. JAMES J. O'DONNELL, Professor of Classical Studies and Vice Provost for Information Systems and Computing at the University of Pennsylvania, is the author of Avatars of the Word: From Papyrus to Cyberspace. _________________________________________________________________ Nicholas Humphrey: The most important invention has been reading-glasses. They have effectively doubled the active life of everyone who reads or does fine work -- and prevented the world from being ruled by people under forty. NICHOLAS HUMPHREY is a theoretical psychologist; professor at the New School for Social Research, New York; author of Consciousness Regained, The Inner Eye, A History of the Mind, and Leaps of Faith: Science, Miracles, and the Search for Supernatural Consolation. See The Third Culture, Chapter 11. _________________________________________________________________ Jaron Lanier: Joe Traub already nabbed the invention I would have chosen: empirical method. So I'll stake out a different claim. For present purposes, I'll claim that the most significant invention of the last 2000 years was the human ego. The ego I'm talking about is the self-concerned human that Harold Bloom credits Shakespeare with having invented. It's the thing that William Manchester finds definitively missing in the Medieval mind. Jostein Gaarder, in his children's philosophy novel, Sophie's World, blames St. Augustine for inventing it. It's what the fuss is about in Nietzsche. It's what exists in existentialism.
In truth, I'm not entirely convinced that I don't find good evidence of this creature in pre-Christian/Common-era texts. (Thomas Cahill thinks it was a gift from the Jews.) But it does seem that the sense of individual self, outfitted with moral responsibility, free will, consciousness, and -- most importantly -- neurotic self-obsession, at one time did not exist, and then did. That same sense of self is now being challenged by AI-ish members of the EDGE community. Perhaps it will disappear, just as it once appeared. So it is reasonable to think of the ego as a natural inhabitant of approximately the last 2000 years. One could argue that the ego had to precede empirical method. The shift from pure rationality to empiricism relied on an acknowledgement of differing perspectives of observation (while pure rationality was thought to be independent of personal perspective). So the self was needed in order to have a starting point from which to pose theories and to make measurements in order to test them. Only an ego can have imperfect enough knowledge to make mere guesses about what's going on in the universe, and the hubris to test and improve those guesses. I personally hope the ego survives the computer. JARON LANIER, a computer scientist and musician, is a pioneer of virtual reality, and founder and former CEO of VPL. He is currently the lead scientist for the National Tele-Immersion Initiative. See Digerati, Chapter 17. _________________________________________________________________ Terrence Sejnowski: Technological advances in communication, from clay tablets, to papyrus, to moveable type, to PostScript, have had a shaping influence on society, and these are accelerating. Almost overnight, the accumulated knowledge of the world is crystallizing into a distributed digital archive. Images and music as well as text have merged into a universal currency of information, the digital bit, which is my choice for the greatest discovery of the last two millennia.
Unlike other forms of archival storage, bits are forever. In the next millennium this digital archive will continue to expand, in ways we cannot yet imagine, greatly enhancing what a single human can accomplish in a lifetime, and what our culture can collectively discover about the world and ourselves. TERRENCE SEJNOWSKI, a pioneer in Computational Neurobiology, is regarded by many as one of the world's foremost theoretical brain scientists. He is the director of the Computational Neurobiology Lab at the Salk Institute and the coauthor of The Computational Brain. _________________________________________________________________ Ron Cooper: I am surprised no one mentioned distillation, the great alchemical invention of transformation in the search to understand the essence of existence. Alchemy appears to have started in Ancient Egypt (al-khem means the art of Egypt in Arabic). Alchemy travelled with Islam as it spread across Northern Africa and into mainland Europe with the Moorish invasion of Andalucia in the tenth century. Alchemy tries to make sense of the world by, among other things, working with the elements to transform matter, attempting to strip away the extraneous and capture its purest essence. Some suggest Alchemy's founding father was the Egyptian god Thoth (in Greek, Hermes). Both are symbols of mystical knowledge, rebirth and transformation. To find the first evidence of distillation of spirits, you have to go to fourth-century China, where the alchemist Ko Hung wrote about the transformation of cinnabar in mercury as being "like wine that has been fermented once. It cannot be compared with the pure clear wine that has been fermented nine times". Is he talking about distillation? It seems possible. How do you ferment a wine nine times unless you distill it? By that time, the Alexandrian Greeks had discovered that by boiling you could transform one object into another.
Pliny writes about distillation being used to extract turpentine from resin, while Aristotle, in the fourth century BC, recounts how sea water could be turned into drinking water. Aside from being the basis of modern science and industry, the transformation of human beings brought on by the imbibing of distilled spirits is of great interest to me. RON COOPER, painter and sculptor who is known as "the King of Downtown," was one of the original artists in the Los Angeles downtown loft scene. More recently, he is founder and president of Del Maguey, Single Village Mezcal (TM). _________________________________________________________________ W. Daniel Hillis: I agree that Science is the most important human development in the last 2000 years, but it doesn't quite qualify as an invention. I therefore propose the clock as the greatest invention, since it is an instrument that enables Science in both practice and temperament. The clock is the embodiment of objectivity. It converted time from a personal experience into a reality independent of perception. It gave us a framework in which the laws of nature could be observed and quantified. The mechanism of the clock gave us a metaphor for the self-governed operation of natural law. (The computer, with its mechanistic playing out of predetermined rules, is the direct descendant of the clock.) Once we were able to imagine the solar system as a clockwork automaton, the generalization to other aspects of nature was almost inevitable, and the process of Science began. W. DANIEL HILLIS is a physicist and computer scientist; Vice President of research and development at the Walt Disney Company and a Disney Fellow; cofounder and chief scientist of Thinking Machines Corporation, where he built Connection Machines; co-chair of The Long Now Foundation.
He is author of The Connection Machine and The Pattern on the Stone: The Simple Ideas That Make Computers Work (Science Masters Series), as well as numerous articles. See The Third Culture, Chapter 23; Digerati, Chapter 11. _________________________________________________________________ John Baez: Here is my reply to your fiendish question: How can we possibly pick the most important invention in the past two thousand years? The real biggies -- language, fire, agriculture, art -- came too soon. In the last two millennia our world has seen so many inventions that it's hard to think of one that stands above all the rest. The printing press? The computer? The A-bomb? After a bit of this, one is tempted to give a smart-aleck reply and back it up with the semblance of earnest reasoning: "Thousand Island dressing!" But even this is boring. Somehow we have to break out of the box! Well, if inventions are important, surely it's even more important to invent the social structures that will guarantee a steady flow of new inventions. I've heard it said that Edison was the first to turn invention into a business. Every day he would walk into his lab and say "Okay, what can we invent today?" But the groundwork was laid earlier. Perhaps the invention of a patent office was the key step? Or, further back, Bacon's "New Atlantis", which envisioned the techno-paradise we are now all so busy trying to build? JOHN BAEZ is a mathematical physicist working on quantum gravity using the techniques of "higher-dimensional algebra". A professor of mathematics at the University of California, Riverside, he enjoys answering physics questions on the usenet newsgroup sci.physics.research, and also writes a regular column entitled "This Week's Finds in Mathematical Physics". _________________________________________________________________ Viviana Guzman: Why hasn't anyone mentioned television??!! Is it too obvious? I think it's the single most powerful and manipulative tool ever invented.
It's today's most important source of information and serves as a tremendous behavior-patterning device. Since its inception, crime has risen, sex has increased and attendance at live performances has died. VIVIANA GUZMAN is a flutist whose latest album is Planet Flute. _________________________________________________________________ Stephen Schneider: My first association for the most (whatever that means) important invention was the unconscious mind, because, I thought to myself, the concept offers some explanation -- and thus hopefully later remedies -- for the behaviors coming from the darker sides of our nature. Armed with a better understanding of the origins of such behavior, we could hopefully fashion ways out of the irrational clamp that fundamentalist religion, blind nationalism or deep ideology often puts on our conscious awareness. But one thought later was that I believe the unconscious does indeed exist, so logically it is a discovery, not an invention. That (somewhat uneasily) suggests psychotherapy (again, whatever that means given all its incarnations -- psychotherapy being but one of a basket of techniques to make the unconscious more conscious) as my invention. At least in principle -- and often in practice too, I believe -- it does offer us the opportunity to become more conscious, and therefore less inclined to absolute thinking and the subjugation and/or violence absolutism often engenders in the minds of those who don't harbor doubts. In discussing the causes and possible solutions to global environmental problems (global warming in particular), I note in dialogues with junior high school students -- right on down to senate committees -- that we can't easily fix problems we can't see.
Thus, solutions to long-term, global-scale systems threats require -- in a democracy at least -- overcoming any collective denial that our "puny" individual impacts can cause a major disruption at a planetary scale or over timeframes longer than our lifetimes. Admittedly, I'm not going to seriously claim psychoanalysis is as "important" an invention as the scientific method over the past 2000 years (as I recall one of your respondents proposed). What I see as a key invention for the year 2000+, though, is an expanded systems analysis that includes methods to build in an understanding of the role of the unconscious of individuals, which leads to lifestyles and behaviors that "scale up" to create unanticipated collective consequences. Although not directly responsive to your question, the invention I really like -- and think we will really need -- is a fusion of systems analysis with psychotherapy. But the new field of "systems therapy" is yet to be invented, leaving me dangling uneasily between systems- and psycho-analysis. Perhaps, if armed with insights from tools that integrated physical, biological, social and psychological drivers of our behaviors across a range of scales, rather than always chugging merrily along in business-as-usual mode, we'd be more aware of the range of consequences of our unconsciousness. Then, if we continued to damage the collective or put the future at risk, at least that would be more of a choice and less of a surprise. With best wishes to all for the holiday season. STEPHEN H. SCHNEIDER is a Professor in the Biological Sciences Department at Stanford University, and the former Department Director and Head of the Advanced Study Project at the National Center for Atmospheric Research, Boulder; author of The Genesis Strategy; The Coevolution of Climate and Life; Global Warming: Are We Entering the Greenhouse Century?; and Laboratory Earth (Science Masters Series).
_________________________________________________________________ Philip Campbell: Thanks for the reminder. Here's my shot. Perhaps the most challengingly important inventions are those that open up new moral dilemmas, and thus make some people question whether the invention should have been allowed (or precursor discovery sought) in the first place. This even applies to Howard Gardner's suggestion of classical music: I would add Adorno's (I think) statement that, in contrast to some composers, it is impossible to find evil that could have been reinforced by any note written by Mozart. On the other hand, I believe Wagner is still banned in Israel. But my own suggestion is closer to my professional interests. As delightfully examined in Jared Diamond's Guns, Germs and Steel, writing was at least one of the most important inventions of all time, but Sumerian cuneiform is too old for me to offer it, by 3000 years. So, in agreement with Philip Anderson's nudge, the printing press is my response to the question. After all, even the World Wide Web is just a printing press with electronic and photonic elaborations. But I can't resist looking forward at an editorial fantasy, ignoring all sober estimations of the difficulties involved: a cumulative invention which, if fulfilled, would certainly have a capacity for good and evil. To quote William Gibson's Neuromancer: ".. and still he dreamed of cyberspace...still he'd see the matrix in his sleep, bright lattices of logic unfolding across the colorless void..." No keyboard, mouse or screen, just neural connections and a many-dimensional space of, at least, information, to explore, organise and communicate at will -- perhaps, dare I presumptuously suggest, with occasional help from an editor. I fear it's too much for me to expect, but my grandchildren could love it. PHILIP CAMPBELL (whose oldest offspring is 13) was founding editor of Physics World, and has been Editor of Nature since 1995. 
_________________________________________________________________ John Horgan: Okay, I'll bite. Has anyone nominated free will yet? The concept is more than 2,000 years old, but surely it deserves consideration as one of our most important inventions ever. Almost as soon as philosophers conceived of free will, they struggled to reconcile it with the materialistic, deterministic views of nature advanced by science. Epicurus insisted that there must be an element of randomness within nature that allows free will to exist. Lucretius called this randomness "the swerve." Modern free-willers find the swerve within chaos theory or quantum mechanics. None of these arguments is very convincing. Science has made it increasingly clear -- to me, anyway -- that free will is an illusion. But even more than God, it is a glorious, absolutely necessary illusion. JOHN HORGAN, science writer, is the author of The End of Science: Facing the Limits of Knowledge in the Twilight of the Scientific Age, and has also written freelance articles for The New York Times, The New Republic, Slate, The London Times, Discover, The Sciences and other publications. See EDGE: "Why I Think Science Is Ending: A Talk With John Horgan" and EDGE: "The End of Horgan?" [thread unavailable]. _________________________________________________________________ Raphael Kasper: My immediate reaction to the question was to choose between the printing press and any of a set of public-health-related inventions (antibiotics, sewage treatment, ...). And since it seems as though we might never have had the public health advances without the printing press, but did, in fact, have the printing press without the public health advances, I'd have to choose the printing press. Why? Because it opened the possibility that knowledge (information, wisdom) could be disseminated beyond a small number of privileged individuals, thus permitting larger numbers to share or debate world-views and to build upon past and present ideas.
Thus far, at least, new electronic technologies (radio, movies, television, computers) have been employed as extensions of this broadening of access to knowledge, altering the medium of exchange but not the concept. At some time in the future they may lead to more fundamental changes in the human condition, but not yet, I'm afraid. RAPHAEL KASPER, a physicist, is Associate Vice Provost for Research at Columbia University and was Associate Director of the Superconducting Super Collider Laboratory. _________________________________________________________________ Sherry Turkle: My candidate would be the idea of the unconscious, the notion that what we say and do and feel can spring from sources of which we are not aware, that our choices and the qualities of our relationships are deeply motivated by our histories. In recent years, the Freudian contribution has tended to be seen as historical...something we have passed beyond...but I think that in large part this is because the most fundamental ideas of psychodynamics have passed into popular culture as a given. These ideas animate our understandings of who we are with our families, with our friends and at work. They add a dimension to our understanding of what it is to be human that will become increasingly important as we confront a world in which artificial intelligences are increasingly presented to us and our children as candidates for dialogue and relationship (this year's Furbies are only a beginning) -- and we are compelled to a new level of reflection about what is special about being a person. SHERRY TURKLE is a professor of the sociology of science at MIT. She is the author of The Second Self: Computers and the Human Spirit; Psychoanalytic Politics: Jacques Lacan and Freud's French Revolution; and Life on the Screen: Identity in the Age of the Internet. See Digerati, Chapter 31.
_________________________________________________________________ David Myers: Others in this science-minded group have appropriately mentioned the scientific method. Speaking for my discipline, let me sharpen this: when it comes to thinking smart -- to sifting reality from wishful thinking -- one of the great all-time inventions is the control group. If we want to evaluate medical claims (from bloodletting to new drugs to touch therapy), to assess social programs, or to isolate influences on human behavior, we construct a controlled reality. By random assignment we form people into equivalent groups which either receive some experience or not -- thereby isolating the factor of interest. The power of the controlled experiment has meant the death of many wild and wacky claims, but also the flourishing of critical thinking and rationality. DAVID MYERS, professor of psychology at Hope College, is the author of The Pursuit of Happiness: Who Is Happy, and Why, as well as several textbooks, including Exploring Psychology and Psychology. _________________________________________________________________ Don Goldsmith: The most important invention has been a mental construct: the realization that we on Earth form an integral part of a giant cosmos, not a privileged form of existence in a special place. This invention, once the province of a few intellectuals in an obscure corner of the world, has now become widespread, though it remains a minority view among the full population; its implications and successes lie all around us. DONALD GOLDSMITH is an astronomer and the author of over a dozen books, including The Astronomers, the companion volume to the PBS series of the same title, and The Hunt for Life on Mars. In 1995, Dr. Goldsmith was the recipient of the Annenberg Foundation Award for lifetime achievement awarded by the American Astronomical Society. He has also been awarded the Dorothea Klumpke-Roberts prize for astronomy popularization.
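Myers's point about random assignment -- that randomly formed groups are, on average, equivalent, so a difference in outcomes isolates the factor of interest -- can be demonstrated with a tiny simulation. This sketch is my own illustration, not from any contributor; the subject counts, baseline distribution, and effect size are invented for the example.

```python
import random
import statistics

def run_trial(n_subjects=1000, treatment_effect=5.0, seed=42):
    """Simulate a randomized controlled experiment.

    Each simulated subject has an unobserved baseline score. Random
    assignment splits subjects into treatment and control groups, so the
    two groups' baseline distributions match on average, and the observed
    difference in group means estimates the treatment effect alone.
    """
    rng = random.Random(seed)
    treatment, control = [], []
    for _ in range(n_subjects):
        baseline = rng.gauss(50, 10)   # unobserved individual variation
        if rng.random() < 0.5:         # random assignment, coin-flip style
            treatment.append(baseline + treatment_effect)
        else:
            control.append(baseline)
    return statistics.mean(treatment) - statistics.mean(control)

# The observed difference should land near the true effect of 5.0,
# and near 0.0 when the "treatment" does nothing.
print(run_trial())
print(run_trial(treatment_effect=0.0))
```

With a thousand subjects the sampling noise in the difference of means is well under a point, so even this toy run separates a real effect from a wacky claim.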
_________________________________________________________________ Arnold Trehub: The most important invention in the past two thousand years? In my opinion it is the invention by Otto von Guericke in 1660 of a machine which produced static electricity. This device was the primitive tool which unlocked our understanding and application of electricity. Modern power generation, communication, computation, and almost all of our most important analytic devices stand on the foundation of von Guericke's machine. A long line of basic intellectual formulations, from electromagnetism to the bioelectric properties of brain mechanisms, owe a debt to this invention. When we discover how the human brain creates the covert models of its own inventions, the structure and dynamics of the brain's own electrical activity will undoubtedly be an essential aspect of the explanation. ARNOLD TREHUB is adjunct professor of psychology, University of Massachusetts at Amherst, and the author of The Cognitive Brain. _________________________________________________________________ Jay Ogilvy: Okay, I'll weigh in with the invention of secularism -- getting out from under the thumbs of the gods. From all we can tell from historians and anthropologists, every ancient society worshipped some god or other. Superstition ran rampant. Human beings denied their own freedom and autonomy by praising or blaming the gods for their fates. Not until some bold minds like Ludwig Feuerbach, Karl Marx, Friedrich Nietzsche and Sigmund Freud did it become thinkable, much less fashionable, to preach atheism. These were the inventors of a new order, one that allowed human beings to make up our game as we go along, unfettered by superstitions about the will of the gods or fear of their punishment. For my part I am appalled at how slowly this invention has been accepted.
Over 60 percent of Americans still agree (somewhat, mostly, or strongly) that, "The world was literally created in six days, as the Bible says," (confirmed on three successive national probability sample surveys by the Values and Lifestyles Program at SRI International where I was director of research during the 1980s). Islam claims over a billion devotees. And I find it remarkable the number of highly educated, intelligent adults who still embrace a childlike, wish-fulfilling belief in God. Without kneeling down to positivism, or overestimating what is knowable, or underestimating the mysteries that remain lurking in the individual and social unconscious, let us nevertheless celebrate our liberation from superstition, remain humble before forces that transcend our individual egos, but accept the collective responsibilities of human freedom, and sing, as my GBN partner, Stewart Brand, did in the epigram for the Whole Earth Catalog: "We are as gods so we might as well get good at it." JAY OGILVY is a cofounder and Vice-President of Global Business Network , responsible for training; headed "Values and Lifestyles" research at SRI International; former professor of philosophy at Yale and Williams College; author of Living Without a Goal and Many Dimensional Man. _________________________________________________________________ Douglas Rushkoff: The eraser. As well as the delete key, white-out, the Constitutional amendment, and all the other tools that let us go back and fix our mistakes. Without our ability to go back, erase, and try again, we'd have no scientific model, nor any way to evolve government, culture, or ethics. The eraser is our confessor, our absolver, and our time machine. DOUGLAS RUSHKOFF is the author of Cyberia, Media Virus, Playing the Future, and the novel Ecstasy Club. His books have been translated into 16 languages, and his weekly column is syndicated by The New York Times. 
He writes and lectures about technology and culture, and teaches at New York University. _________________________________________________________________ Mike Godwin: The most important invention in the last 2000 years has to be the moveable-type printing press. Cheap book production put the printed language in the hands of the masses and led directly to the rise of literacy. Once you have a large literate class, you see the democratic impulse flourish -- even a moderately educated populace begins to make judgments about its rulers and its mode of government. Cheap book production also advances both scientific and historical knowledge by ensuring that valuable source documents are duplicated and preserved and (just as important, really) ensuring that those old documents are readable. Cheap book duplication makes it possible to quickly build a cadre of scientists and historians who've read the same works and thus share a common body of knowledge. Finally, moveable type makes it possible for the past to speak to the future en masse in a way that the evanescent oral tradition never could. It's helpful to look at the other inventions listed as the most important in the last 2000 years and try to imagine how they might have come about had there been no moveable-type printing press. MIKE GODWIN, an attorney, is counsel for the Electronic Frontier Foundation, the San Francisco-based cyber-liberties organization, and the author of Cyber Rights: Defending Free Speech in the Digital Age. See Digerati, Chapter 12 _________________________________________________________________ Duncan Steel: Summary answer: The non-implemented 33-year English Protestant Calendar. Let me start my answer by making a few comments about the suggestions made by other correspondents, and the general premise of the specific answer I give myself. At the time of writing many answers are already in, and so many good ideas have been aired. 
I don't even need to refer to the list to guess at some of them: the computer, the contraceptive pill, gunpowder, the internal combustion engine, nuclear weapons. Wait! you say. What am I suggesting, that nuclear explosions are good? Well, maybe not from the perspective of how they may be used on Earth; but from another perspective one could claim that they have been a major peacekeeping influence over the past half-century, which has been comparatively free of war compared with what one might have expected given the other technologies available: jet planes, napalm, guided missiles... Note that I wrote "one could claim" -- that does not mean that I am claiming it; I am just posing an arguable position. In the same way one could argue that the contraceptive pill, which has indeed been nominated as one of the most important inventions, is actually a bad thing. For example, we cannot know whether it has robbed us of a 21st-century Einstein who would have found the way to unify the laws of physics whilst identifying a cure for cancer in her spare time. The impossibility of knowing how the world might have been post hoc opens up various avenues of thought, like what if Hitler had never lived? (a matter explored in certain ways by Stephen Fry in his novel 'Making History'). Obviously this has a wide variety of implications with gross repercussions, especially for the Jews, Gypsies and other peoples who were the target of such atrocities. But for my present purposes let me sidestep such huge considerations, and instead look at some trivial ones. Suppose that you are the President of the Boston and Area Volkswagen Beetle Owners Club: you might adjudge the hypothesized nonexistence of Hitler as being most important in your life because the Beetle would never have been built. One therefore has to think about what "important" means in the context of different people's lives.
Right now the most important thing to a Denver Broncos fan (I write as they stand 13-0) is whether a perfect season is in the offing. Excuse me, but isn't that totally insignificant to some starving child in Ethiopia? Yet it is the thing foremost in the mind of that Broncos fan, perhaps fatally so: he may crash whilst driving to the next game at Mile High Stadium and lose his life, never getting to see his team win the Super Bowl again. The outcome of my own mental perambulations on this question of the most important invention is that all the technological products, of recent years and old, would not only have been invented sooner or later anyway, but also they are mere applications of ideas. An idea may be important, even though it does not directly lead to an important invention with a physical reality. An idea itself I count as being an invention in the current context. Further, how we got to where we are now is the result of many important ideas producing branching points in history. Now, one could make a case for the more distant (in history) branching points being more fundamental, because all following events depend upon them. If Alexander the Great, Charlemagne and William the Conqueror had never lived, then neither would Hitler. But that form of reasoning leads to a reductio ad absurdum. Rather, I choose to ask: "How did we get to where we are now?" The first step needed there is to define where we are, and the answer to that is: with the USA being the powerhouse of most of the rest of the world. Thus the branching point I look to is that which made the USA a reality. I do not mean the Declaration of Independence. I mean: what made the English first go and settle the Atlantic seaboard of North America? The answer to that provides my answer to the "Most Important Invention In The Past Two Thousand Years", but it is not original to me. The thing I am going to describe was suggested to me by Simon Cassidy, a British mathematician who lives in California.
Here is the story. When the Catholic Church (per Pope Gregory XIII) brought in the reformed calendar in 1582, it settled for a second-best solution to the problem. Let me tell you, all Christian calendar matters hinge on the question of the Easter computus. That depends upon the time of the vernal equinox, which is ecclesiastically defined to be March 21st, although astronomically speaking the equinox on the Gregorian calendar shifts over the 400-year leap-year cycle by 53 hours, between March 19 and 21. This follows from the long cycle time. By far preferable from a religious perspective would be a calendar which keeps the equinox on one day, requiring a shorter cycle. As far back as AD 1079, Omar Khayyam had shown that a cycle of eight leap years in 33 years provides an excellent approximation to the year as measured between successive vernal equinoxes. The advisers of Gregory XIII knew this but instead recommended the inferior 97-leap-years-in-400 system we use, perhaps in the belief that the Protestants did not know of the better 8-in-33 concept. But in England, they did. John Dee and others (Thomas Harriot and Walter Raleigh amongst them) had secretly come up with a plan to implement a 'Perfect Christian Calendar' using the 33-year cycle (the traditional lifetime of Christ). In that span there are eight four-year cycles, leading to a time-of-day wander by the equinox of just below 18 hours. The problem is the one five-year cycle in each grand cycle, during which the equinox steps forward by just below six hours in each of four jumps before the following leap year pulls it back by 24 hours. The full amplitude of the movement is 23 hours and 16 minutes. To get the equinox to remain on one calendar day throughout the 33-year cycle one has to use as a prime meridian for time-keeping a longitude band which is just right, and quite narrow.
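The arithmetic behind the 8-in-33 rule is easy to check. A minimal sketch, comparing the mean year length implied by each leap rule against a standard modern value for the vernal-equinox year (the numeric target is my assumption, not from the text):

```python
# Mean calendar-year length implied by each leap-year rule,
# versus the vernal-equinox (tropical) year it tries to track.
EQUINOX_YEAR = 365.2424            # days; approximate modern value

khayyam = 365 + 8 / 33             # 8 leap years per 33 years
gregorian = 365 + 97 / 400         # 97 leap years per 400 years

for name, length in [("8/33 (Khayyam/Dee)", khayyam),
                     ("97/400 (Gregorian)", gregorian)]:
    drift = abs(length - EQUINOX_YEAR) * 1000  # days of drift per millennium
    print(f"{name}: {length:.6f} days/year, ~{drift:.2f} days/millennium drift")
```

On these figures the 33-year rule drifts at roughly a quarter of the Gregorian rule's rate against the equinox year, which is the sense in which the advisers' 97/400 choice was "inferior".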
That band happened (in the late sixteenth century; it has since moved east owing to the slow-down of the Earth's spin) to be at 77 degrees west, which Cassidy terms "God's Longitude". If you look down that meridian you will find that in the 1580s the settled areas (in the Caribbean, Peru, etc.) were under Spanish, hence Catholic, control. To grab part of God's Longitude and found a New Albion, enabling them to introduce a rival calendar -- that Perfect Christian Calendar -- and convert the other Christian states to the Protestant side, England mounted various expeditions which historians have since misinterpreted. In 1584-90 the so-called Lost Colony was sent to Roanoke Island, a bizarre place to attempt to start colonization but an excellent site from which to make astronomical observations to fix the longitude and thus decide how far inland New Albion should be. Similarly, in 1607 the choice of Jamestown Island seems bizarre from the settlement perspective -- why not out on Chesapeake Bay, away from the attacks of the local Algonquians led by Pocahontas' father Powhatan? -- but makes sense given the paramount need to grab a piece of God's Longitude. From the foothold the English managed to gain, Old Virginny grew; later other colonizers came to New England, and New Amsterdam was bought from the Dutch. But later developments do not reflect the original purpose of the English coming to Roanoke Island and Jamestown Island any more than the Eiffel Tower was built to provide a mount for the many radio antennas which now festoon its apex. After the fact the English did not reveal their prime motivation for Raleigh's American adventures and the investment in the ill-starred Jamestown colonizers, and all of this is yet to be properly teased out.
But if the English had never invented their non-implemented 33-year Protestant Calendar, then the USA as it is would not exist, and all of the scientific, technological and cultural development of the world over the past couple of centuries would be quite different. In view of this I nominate that calendar, due to John Dee, as the most important invention of the past 2000 years. DUNCAN STEEL conducts research on asteroids, comets and meteors and their influence upon the terrestrial environment, is Director of Spaceguard Australia, and the author of Rogue Asteroids and Doomsday Comets. _________________________________________________________________ Tom Standage: It all depends on how you define important, of course. But to my mind the most important invention is telecommunications technology: the telegraph, the telephone, and now things like the Internet. Until about 150 years ago, it was impossible to communicate with someone in real time unless they were in the same room. The only options were to send a message (or go in person) by horse or ship. The early optical telegraphs of the 1790s made long-distance communication possible at hitherto impossible speeds, at least for the governments that built them, but they were not available for general use. Then in the 1840s, the electric telegraph enabled people to send messages over great distances very quickly. This was a step change, though its social consequences took a while to percolate. At first, telegraph operators became the pioneers of a new frontier: they could gather in what we would today call chat rooms, play games over the wires, and so on. (There were several telegraphic romances and weddings.) The general public, of course, was still excluded, and had no direct access to the real-time nature of the technology. But the invention of the telephone in the 1870s made real-time telecommunications far more widely available. 
Today, in the developed world at least, we think nothing of talking with people on the other side of the world. During the course of a normal working day, many people spend more time dealing with people remotely than they do face-to-face. The ubiquity of telecommunications technology has become deeply embedded in our culture. Of course, life has sped up as a result. But we watch TV and use telephones, fax machines and, increasingly, the Internet, almost unthinkingly. If the mark of an advanced technology is that it is indistinguishable from magic, then the mark of an important one is that it becomes invisible -- that we fail to notice when we are using it. That makes the significance of telecommunications technology very easy to overlook, and underestimate. TOM STANDAGE, Science Correspondent of The Economist and former deputy editor of the Daily Telegraph's technology supplement, "Connected," is the author of The Victorian Internet: A History of the 19th Century Communications Revolution. He has written for many newspapers and magazines including Wired, The Guardian, The Independent, and The Daily Telegraph. He has also appeared as a technology and new media pundit on BBC television and radio. _________________________________________________________________ Andy Clark: DIGITAL ECOSYSTEMS A digital ecosystem is a kind of universe, realized in electronic media, in which we observe incremental evolution and complex interaction. The classic examples come from work in Artificial Life, such as Tom Ray's Tierra project in which strings of code compete for resources, such as CPU time, and in which cascades of strategies for success develop, with later ones exploiting the weaknesses and loopholes of their predecessors. But the idea is much broader. The worldwide web and browser technologies have combined to create a massive digital ecosystem populated by ideas and product descriptions, whose true impact on the human lifestyle is only just beginning to be felt. 
The human mind was never contained in the head, and has always been a construct involving head, artifacts (such as pen and paper), and webs of communication and interaction. We make our worlds smart so that brains like ours can be dumb in peace. But the development of web and internet technologies may well signal the next great leap in the evolution of thought and reason. For we now have a medium in which ideas can travel, mutate, recombine and propagate with unprecedented ease and (increasingly) across the old barriers of culture, language, geography and central authority. Moreover, and in a kind of golden loop, we can use our experience with more restricted digital ecosystems to improve our grip on the properties of the kind of large, distributed, self-organizing system of which we are now a proper part. Understanding these properties is important both for policy making (what kind of regulation creates and maintains the optimal conditions for productive self-organization in a complex and highly uncertain world?) and for moral and economic reasoning. Human brains are bad at seeing the patterns that will result from multiple, ongoing, bidirectional interactions: see, for example, the simulations that show, to most people's surprise, that if each person in a group insists on having at least 30% of their neighbors 'the same' as them (picked out by race, gender, sexual inclination or whatever you like), then over a short period of time what evolves is a highly segregated ecology containing a great many 'all X' neighborhoods. Perhaps if our children get to play with quite large-scale digital ecosystems, in games such as SimCity or using new educational resources such as Mitchell Resnick's StarLogo, they may yet learn something of how to predict, understand, and sometimes avoid, such emergent patterns.
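The 30% simulation Clark cites is Thomas Schelling's classic segregation model, and it is small enough to sketch. The grid size, neighborhood, threshold and random-move rule below are illustrative assumptions, not taken from the text:

```python
import random

def schelling(n=20, empty_frac=0.1, threshold=0.3, sweeps=100, seed=1):
    """Toy Schelling segregation model on an n x n torus.

    Agents (types 1 and 2) are unhappy when fewer than `threshold`
    of their occupied neighbors share their type; unhappy agents
    jump to a random empty cell. Returns the mean fraction of
    like-typed neighbors before and after the moves."""
    rng = random.Random(seed)
    n_empty = int(n * n * empty_frac)
    rest = n * n - n_empty
    cells = [0] * n_empty + [1] * (rest // 2) + [2] * (rest - rest // 2)
    rng.shuffle(cells)
    grid = [cells[i * n:(i + 1) * n] for i in range(n)]

    def neighbors(r, c):
        return [grid[(r + dr) % n][(c + dc) % n]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1) if dr or dc]

    def mean_similarity():
        sims = []
        for r in range(n):
            for c in range(n):
                me = grid[r][c]
                occ = [v for v in neighbors(r, c) if v]
                if me and occ:
                    sims.append(sum(v == me for v in occ) / len(occ))
        return sum(sims) / len(sims)

    before = mean_similarity()
    for _ in range(sweeps):
        unhappy, empties = [], []
        for r in range(n):
            for c in range(n):
                me = grid[r][c]
                if not me:
                    empties.append((r, c))
                    continue
                occ = [v for v in neighbors(r, c) if v]
                if occ and sum(v == me for v in occ) / len(occ) < threshold:
                    unhappy.append((r, c))
        if not unhappy:
            break
        for r, c in unhappy:
            i = rng.randrange(len(empties))
            er, ec = empties[i]
            grid[er][ec], grid[r][c] = grid[r][c], 0
            empties[i] = (r, c)  # the vacated cell is now empty
    return before, mean_similarity()

before, after = schelling()
print(f"mean like-neighbor fraction: {before:.2f} -> {after:.2f}")
```

Even though every agent tolerates a 70%-different neighborhood, the like-neighbor fraction climbs well past the 30% anyone asked for -- the emergent pattern Clark describes.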
Digital ecosystems thus both radically transform the space in which human brains think and reason, and provide opportunities to help us learn to reason better about the kind of complex system of which we are now a part. The double-whammy gets my vote. ANDY CLARK is Professor of Philosophy and Director of the Philosophy/Neuroscience/Psychology Program at Washington University in St. Louis, Missouri, USA. He is the author of Microcognition, Associative Engines and most recently Being There: Putting Brain, Body And World Together Again. _________________________________________________________________ Stanislas Dehaene: In my opinion, the most important human invention is not an artefact, such as the pill or the electric shaver. It's an idea, the very idea that made all these technical successes possible: the concept of education. Our brain is nothing but a collection of networks of neurons and synapses that have been shaped by evolution to solve specific problems. Yet through education and culture, we have found ways to "recycle" those networks for other uses. With the invention of reading and writing, we recycle our visual system to do word reading. With the invention of mathematics, we apply our innate networks for number, space, and time to all sorts of problems beyond their original domain of relevance. Education is the key invention that enables all these rewirings to take place at a time when our brains are still optimally modifiable. As David Premack likes to remind us, homo sapiens is the only primate that has invented an active pedagogy. Without education, it would only take one generation for all the inventions that others have mentioned to vanish from the surface of the earth. STANISLAS DEHAENE is a researcher at the Institut National de la Santé, where he studies cognitive neuropsychology of language and number processing in the human brain. He is author of The Number Sense: How Mathematical Knowledge Is Embedded In Our Brains.
See EDGE: "What Are Numbers, Really? A Cerebral Basis For Number Sense" by Stanislas Dehaene. _________________________________________________________________ John Maddox: I'm amazed that fellow beneficiaries of this site are making such heavy weather of your pre-millennial assignment. Incidentally, surely some have bent your rules in that assorted Sumerians, Assyrians and Egyptians, not to mention Chinese, Greeks and Romans, were well into the recording of history long before 2,000 years ago. Ab-reacting a little, I was tempted to enter the central locking-systems on modern motorcars (a.k.a. "automobiles") as the greatest contribution to the convenience of modern life, but that's a trivial invention (and should have been incorporated on the Model T). In any case, there's no doubt in my mind that the invention of the differential calculus by Newton and, independently, by Leibnitz, was the outstanding invention of the past 2,000 years. The calculus made the whole of modern science what it is. Moreover, this was not a trivial invention. Newton knew that velocity is the rate of change (with time) of distance (from Galileo, for example) and that acceleration is the rate of change of velocity (with time), but it was far from self-evident that these quantities could be inferred from the geometrical shapes of Kepler's orbits of the planets. Nowadays, of course, mere schoolboys (and girls) can play Newton's game -- it's just a matter of "changing the variables", as they say. In the seventeenth century, it was far from obvious that the differential calculus would turn out to be as influential as later events have shown. Indeed, Daniel Bernoulli claimed (in 1672) that Newton had deliberately hidden his "method of fluxions" in obscure language so as to keep the secret to himself. But Leibnitz's technique was hardly transparent; it fell to Bernoulli himself to interpret the scheme, much as Freeman Dyson made Feynman's electrodynamics intelligible in the 1940s.
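In modern notation, the relations Maddox sketches: velocity and acceleration as successive derivatives of position, with integration as the inverse operation that recovers distance (the "area under a curve"):

```latex
v(t) = \frac{dx}{dt}, \qquad
a(t) = \frac{dv}{dt} = \frac{d^{2}x}{dt^{2}}, \qquad
x(t_2) - x(t_1) = \int_{t_1}^{t_2} v(t)\,dt
```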
Both Newton and Leibnitz appreciated that the inverse of differentiation leads to a way of calculating the "area under a curve" (on which Newton had earlier spent a great deal of energy), but it was Leibnitz who invented the integral sign now scattered through the mathematical literature. That these developments transformed mathematics hardly needs assertion. But the effect of the calculus on physics, and eventually on the rest of science, was even more profound. Where would field theories of any kind (from Maxwell and Einstein to Schrödinger/Feynman/Schwinger/Weinberg and the like) be without the calculus? One can, of course, say much the same about the invention of arithmetic, but that long predates 2,000 years ago. The calculus was the next big leap forward. JOHN MADDOX is Editor emeritus of Nature; physicist; author of Revolution in Biology, The Doomsday Syndrome, Beyond the Energy Crisis, and What Remains to be Discovered. See EDGE: "Complexity and Catastrophe": A Talk With Sir John Maddox. _________________________________________________________________ Eberhard Zangger: The tricky part of the question is not what the most important invention is, but the qualifier "in the past two thousand years". Technological innovations alter the frontier between humans and their natural habitat. Because of the insuperable importance of the environment, humans have always sought to maximize the advantage they can take from the laws provided by nature. As a consequence, truly fundamental innovations date back many thousands of years. The most outstanding innovation of all time was probably the domestication of animals, followed by that of plants. Life in permanent homes, villages and cities, the wheel, the sailing ship, engineering, script, as well as conceptual achievements such as nations, democracy, religion, music and songs, even taxes, interest and inflation all date back way before the beginning of the common era.
Several innovations suggested in this forum were actually part of the everyday routines of Bronze Age people, including, for instance, language, steel, paper, and reading glasses. The scientific method must also have existed in some form, since 14th-century BC hydraulic installations in Greece perfectly meet the parameters of the given environment. Even moveable type was known by 1500 BC, as the example of the Disc of Phaistos from Minoan Crete shows. Finally, heliocentricity was first proposed by the astronomer Aristarchos of Samos during the 3rd century BC -- but the concept failed peer reviews and its acceptance was thus delayed by 1800 years. Since the principal factors controlling people's lives today already existed 2000 years ago, the skeptic in me would intuitively vote for: nothing worth mentioning. If we take a stroll through a Roman town 2000 years ago -- and ancient Pompeii provides a good example of a city frozen in a moment of everyday life -- we would find a city containing factories (including one for fish sauce), public baths, athletic stadiums, theaters, plastered roads, proper sidewalks, pubs and, inevitably, brothels -- facilities for people who were, for the most part, in better physical shape than us. What distinguishes a modern city from its Roman predecessor? Two things come to mind, the first belonging to the category of conceptual realization: Christianity. The Roman dominion over the western world lasted for about 1000 years -- and we might indeed still live in the Roman era, had there not been a common denominator uniting the many tribes suppressed by imperial control. This unifying factor was Christianity. The second prominent innovation which distinguishes a Roman from a modern city is electricity. Only through the harnessing of electricity is it possible to operate laundry machines and subnotebook computers, two inventions I personally cherish the most, as well as many of the other items suggested in this forum.
However, I recall enjoying a particularly romantic evening in the usually overcrowded, noisy Cretan tourist resort of Elounda. Some time passed before I realized what made this evening so special -- a general power shutdown had knocked out all fluorescent lighting and loudspeakers. Lanterns and kitchen stoves still worked -- with gas. This brings me back to my original response to the question of the most important invention of the past two thousand years: nothing worth mentioning. EBERHARD ZANGGER is a geoarchaeologist and works as chief physical scientist on many archaeological field projects in Mediterranean countries. He is author of The Flood from Heaven: Deciphering the Atlantis Legend and The Future of the Past: Archaeology in the 21st Century. _________________________________________________________________ Leon Lederman: If we suggest anything other than the Printing Press, Brockman will cancel our Christmas bonuses and New Year's Eve turkey. So: the greatest invention in the past two thousand years is the printing press. Next is the thermos bottle. LEON LEDERMAN, the director emeritus of Fermi National Accelerator Laboratory, has received the Wolf Prize in Physics (1982), and the Nobel Prize in Physics (1988). In 1993 he was awarded the Enrico Fermi Prize by President Clinton. He is the author of several books, including (with David Schramm) From Quarks to the Cosmos: Tools of Discovery, and (with Dick Teresi) The God Particle: If the Universe Is the Answer, What Is the Question? _________________________________________________________________ Marc D. Hauser: I read through the list. Some good ones. I think it is interesting that many found it so difficult to stick to the 2,000-year cutoff. Is it really the case that all the big inventions happened so long ago? This is surely an important and profound statement, if correct. I have two suggestions, both within the cutoff period.
First, the electric light, born about 50 years before Joseph Swan patented the incandescent lamp in 1878, with Edison following in 1879. Having lived in Africa, where one is often forced to read by firelight, electricity is a godsend. Moreover, once the incandescent lamp had been invented, it didn't take too long to come up with the flashlight, another handy device for those of us working in dark jungles. My second suggestion for great inventions is the aspirin, invented, oddly enough, in 1853 in France. Clearly, other medicines have been around, many of which serve comparable functions, but what a useful little pill. Among the Maasai in Kenya, headaches are treated with goat feces, a mud compact to the head. I prefer the aspirin personally. MARC D. HAUSER, evolutionary psychologist, is Associate Professor at Harvard University where he is a fellow of the Mind, Brain, and Behavior Program; and author of The Evolution of Communication. _________________________________________________________________ David Buss: In my view, questions of "importance" cannot be answered without first specifying "criteria of importance," or "important with respect to what." Thus, I would give the following answer to your question: "One criterion for "most important" is that which has most profoundly altered patterns of human mating. Changes in mating can affect the subsequent evolutionary course of the entire species, with cascading consequences for virtually every aspect of human life. Although many inventions have altered human mating over the past 2,000 years, television must rank among the most important. Television has changed status and prestige criteria, created instant celebrities, hastened the downfall of leaders, increased the importance of physical appearance, and accelerated the intensity of intrasexual mate competition -- all of which have acutely transformed the nature of sexuality and mating and perhaps forever altered the evolutionary course of our species."
DAVID BUSS is Professor of Psychology at The University of Texas at Austin; author of The Evolution of Desire: Strategies of Human Mating. _________________________________________________________________ Leroy Hood: I nominate the printing press as the most important invention in the past 2,000 years. LEROY HOOD, M.D., Ph.D., is the William Gates III Professor of Biomedical Sciences and founding chair of the Department of Molecular Biotechnology at the University of Washington. He is principal investigator of the Leroy Hood Laboratory and coeditor (with Daniel J. Kevles) of The Code of Codes: Scientific and Social Issues in the Human Genome Project. _________________________________________________________________ Julian Barbour: If it had not been invented over three thousand years ago, I should have nominated the bell, but instead I choose the symphony orchestra. This is because, like the bell, it establishes a dramatic link between two seemingly disparate worlds -- the material world of science and the world of the psyche and the arts. The symphony orchestra is surely important because it made possible classical music, the nomination of Howard Gardner. However, I choose it as a symbol for something that may yet be to come, like space travel, the choice of Reuben Hersh. What is more, I make my choice precisely because in just one point I disagree with Howard Gardner -- classical music is crucially dependent on physical inventions: musical instruments. I have long been fascinated by one of the great conundrums of philosophy that was clearly recognized by Newton's contemporaries: If there is only a material world characterized by the so-called primary qualities such as extension, motion, and mass, how are we to explain our awareness of so many different secondary qualities such as colors, sounds, tastes, and smells? The material world has no need of them and can never explain them.
Of course, we all know that science can now demonstrate how specific sensations are correlated with physical phenomena, but a correlation is not necessarily a cause -- for both correlates may have a common cause -- and still less is it an explanation. How can the vibrations of catgut create in me the effect I experience when listening to Beethoven's quartets? Perhaps I am naïve, but I am a committed scientist. I cannot be content to regard the secondary qualities as epiphenomena. I think there could be a physics, far richer than the one we presently know, in which the secondary qualities are as real as electric charge. The bell and symphony orchestra call us to ponder higher things and wider possibilities, the domain where science is reconciled with the arts. JULIAN BARBOUR is a theoretical physicist and the author of Absolute or Relative Motion?: A Study from a Machian Point of View of the Discovery and the Structure of Dynamical Theories: The Discovery of Dynamics. _________________________________________________________________ John Henry Holland: BOARD GAMES Board games, more than any other invention, foretell the role of science in understanding the universe through symbolic reasoning. Their essence is a simple set of rules for generating a complex network of possibilities by manipulating tokens on a reticulate board. Board games are found as artifacts of the earliest Egyptian dynasties, so they don't truly fall within the 2000 year limit, but they have undergone a rapid "adaptive radiation" in the last millennium. Thales' invention of logic (the manipulation of abstract tokens under fixed rules) was likely influenced by a knowledge of board games, and board games offered an early metaphoric guide for politics and war in both the East (Go) and the West (Chess).
These insights, in turn, had much to do with the transition from the belief that the world around us is controlled by the whims and personalities of gods to the outlook that the world can be described in lawlike fashion. In the 19th and 20th centuries board games became the inspiration for models, simulations and mathematics, ranging from genetics and evolution to markets and social interaction. Board games also offer a simple example of the recondite phenomenon called emergence -- "much coming from little" -- as when a fertilized egg yields a complex organism consisting of tens of billions of cells. And, via a mutation into video games, board games offer the next generation an entry into the world of long horizons and rigorous thought -- both in short supply in the current generation. JOHN HENRY HOLLAND is Professor of Computer Science at the University of Michigan at Ann Arbor. The recipient of a MacArthur genius award, he is credited with the discovery of genetic algorithms -- lines of computer code that simulate sexually reproducing organisms. A leading expert on complexity theory at the Santa Fe Institute in New Mexico, Dr. Holland is the author of Hidden Order: How Adaptation Builds Complexity and Emergence: From Chaos to Order. _________________________________________________________________ Gordon Gould: Here is my $.02 on what is significant, in addition to all the illustrious suggestions received so far: Double Entry Accounting: While it is not all that sexy, it has been a significant force in shaping the West and, through the globalization of market-driven economies, the world. Invented in 1494 by a Franciscan monk named Luca Pacioli, double entry accounting was designed to help the flourishing Venetian merchants manage their burgeoning economic empires. Today, it remains the core methodology for most accounting systems worldwide. It is the DOS of money.
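A minimal sketch of the mechanism Gould describes. The class, account names and amounts below are invented for illustration; the point is that every transaction posts an equal debit and credit, so the books always balance:

```python
from collections import defaultdict

class Ledger:
    """Toy double-entry ledger (illustrative, not a real accounting API)."""

    def __init__(self):
        # Signed balances: debits positive, credits negative.
        self.balances = defaultdict(float)

    def post(self, debit_account, credit_account, amount):
        """Record one transaction; the debit and credit are always equal."""
        if amount <= 0:
            raise ValueError("amount must be positive")
        self.balances[debit_account] += amount
        self.balances[credit_account] -= amount

    def trial_balance(self):
        """Sum of all signed balances -- zero iff the books balance."""
        return sum(self.balances.values())

ledger = Ledger()
ledger.post("Cash", "Capital", 1000)    # owner puts money in
ledger.post("Inventory", "Cash", 400)   # buy stock for cash
ledger.post("Cash", "Sales", 250)       # a cash sale
print(ledger.balances["Cash"], ledger.trial_balance())
```

The internal-control and external-comparability properties Gould mentions both fall out of this invariant: any posting that left the trial balance nonzero would be detectably malformed.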
Based on the principle of equilibrium (i.e., a balance sheet), double entry accounting provides both control over the internal state of an agent (in this case, an economic entity) and the necessary structures required for individual organizations to cooperate and collaborate in the emergent construction of modern market economies. In other words, double entry accounting simultaneously enables organizations to regulate themselves (through internal accounting and control mechanisms) while also allowing the larger economy to assess the relative health and worth of an enterprise using standardized measures. If money is the blood and markets are the circulatory systems of the global economy, then double entry accounting ledgers are the nerve cells that both control and, in turn, respond to changes in the flows of money. GORDON GOULD is the President of Rising Tide Studios, parent company of the Silicon Alley Reporter and the Digital Coast Reporter. Prior to joining RTS, he was a principal at Thinking Pictures, an interactive entertainment/database technologies company, and also oversaw the Multimedia/Internet Group for Sony Worldwide Networks. _________________________________________________________________ Bob Rafelson: Richard Gatling started with a cotton-seed sowing machine and graduated to a weapon whose 10 rotating barrels fired 0.45-inch bullets at a rate of 1,000 rounds a minute. The Confederacy didn't purchase the thing 'til after the Civil War. But in the next several decades it was bought and used by powerful armies around the globe. Finally it proved its battle merit in Africa, where it mowed down thousands of unsuspecting Zulus. The Gatling Gun was the first weapon of mass destruction. Moreover, it spawned the ongoing, if clumsy, debate about weapons being banned for the sake of mankind.
BOB RAFELSON is a film director and producer whose work includes Head, Five Easy Pieces, The King of Marvin Gardens, The Postman Always Rings Twice, Mountains of the Moon, and Blood and Wine. _________________________________________________________________ John Allen Paulos: Thanks for your invitation (and for your project in general). I'd respond more fully but the question seems too ill-defined to answer. (I guess I still have something of the reductionistic, literal mindset of a mathematician despite periodic forays into more nebulous realms.) An invention or innovation that becomes essential has a tendency also to become invisible as we, in a sense, "grow around" it. If I were forced to name something, I guess I would go with Gutenberg's movable type. And if I wanted to be puerilely self-referential, my choice for most important invention might be the notion of a precise question. (Nevertheless, I do see the value of vague ones as well.) JOHN ALLEN PAULOS, professor of mathematics at Temple University in Philadelphia, is the author of Innumeracy: Mathematical Illiteracy and Its Consequences, Beyond Numeracy: Ruminations of a Numbers Man, A Mathematician Reads the Newspaper, and Once Upon a Number: The Hidden Mathematical Logic of Stories. _________________________________________________________________ Verena Huber-Dyson: My first reaction to your question was The Zero, the next Infinity, but my answer is The Infinitesimal Calculus. Creating a bridge between the two archetypal fictions 0 and ∞, it makes sense of them. It has become a tool in just about every branch of engineering and science. It provides a language for the formulation of laws and a method for constructing explanations, solutions and predictions.
It is alive: its invention in the 17th century -- by Leibniz and Newton independently -- articulated a concept that had long been vaguely anticipated and applied implicitly; its development is still in progress, leading to the resolution of old puzzles (e.g., Zeno's Paradox) while raising new ones (e.g., the continuum hypothesis). Leibniz had been agonizing over what he called "the labyrinth of the continuum," but the 19th century put the infinitesimal calculus on a firm basis by analyzing the concepts of a limit and of infinity from a variety of viewpoints. Nowadays we are blessed with new developments coming from the quarter of symbolic logic that arose out of a digital (0,1) modeling of rational arguing: nonstandard analysis vindicates Leibniz' use of "infinitely small" non-zero quantities. VERENA HUBER-DYSON, a mathematician, has published research in group theory, and taught in various mathematics departments, including UC Berkeley and the University of Illinois at Chicago. She is now emeritus professor in the philosophy department of the University of Calgary, where she taught logic and philosophy of the sciences and of mathematics, which led to a book on Gödel's theorems published in 1991. See EDGE: "On The Nature Of Mathematical Concepts: Why And How Do Mathematicians Jump To Conclusions?" by Verena Huber-Dyson. _________________________________________________________________ Garniss Curtis: My instantaneous response was: Gutenberg's printing press with movable type. This knee-jerk response was followed by a pause and reflection. What is meant by "invention"? So, to the dictionary! Essentially, anything that did not exist previously, whether it be a mechanical device or art, literature, or music, is an "invention". Sobered by this, I reflected again. The skulls of 10 skeletons found in Skhul Cave at the foot of Mt. Carmel in Israel in the 1930's are similar in size and shape to modern Homo sapiens. These have been dated at 80,000 years. 
A similar skull found in a cave at Qafzeh, Israel has been dated at 91,000 years. Having the same size brain capacity, of course, does not necessarily mean they had our same intelligence, although they were capable of making beautiful stone tools. We jump now to the Chauvet cave in France, where wall paintings of animals extant in Europe at that time are beautifully depicted and have been dated at over 30,000 years. 15,000 years later, in the caves at Le Portel and Lascaux in France, our ancestors were making magnificent polychrome paintings of animals. Their stone tools at that time and for the previous 5,000 years are comparable in technique and beauty to any made by Native Americans in the past few hundred years. Can anyone doubt that these Cro-Magnons could have learned to read and write, to philosophise, to do math at a high level, to learn chemistry and physics if magically brought into our culture of today? (Let's leave out some fundamentalists who still don't believe in evolution.) We find that cuneiform writing began about 5,000 years ago and quickly evolved. By 2,500 years ago, the Greeks were producing masterpieces of plays, literature, art, architecture, and they were doing some wonderful things in mathematics and elementary observational science. The Romans carried on these traditions until their fall. Christianity came in and destroyed as much as it could of this great heritage in western Europe, including the great library in Alexandria. Thus began the "Dark Ages" in Europe. The gradual dissemination of knowledge, other than ecclesiastical literature (which wasn't much faster!), was extremely slow. So, in the mid-fourteen hundreds, along comes Gutenberg with his printing press and its movable type. Of course, almost the first thing he did was print a bible or two, and they sold like hot cakes. 
True, fixed or non-movable type had been around for a short time, but the process wasn't much faster than printing books by hand and was very costly; so the rapid dissemination of knowledge through printed books began with Gutenberg. While it is true that the Dark Ages began to end about the year 1000, real progress wasn't made until the Renaissance and, particularly, with the rapid dissemination of knowledge via Gutenberg-type presses. As books were published, people became inspired to learn to read. Reading led to thinking about what had been read and to further publications and to communications between people. The first world wide web had been started. Anyone with a grain of sense can see what this has led to! So, John, after my consideration outlined above, I still think the Gutenberg press with movable type is the greatest invention of the past 2,000 years or, perhaps, of the last 5,000 years after cuneiform writing was invented! GARNISS CURTIS is Professor Emeritus in the Dept. of Geology & Geophysics at the University of California, Berkeley and Founder of the Berkeley Geochronology Center. A colleague of Louis Leakey, he determined the age (1.85 my) of the famous Zinjanthropus fossil, which rocked the anthropological world. His research continues that endeavor: In 1994, with colleague Carl Swisher, he re-dated Homo erectus in Java at 1.8 my instead of the long-held .8 my. _________________________________________________________________ Milford H. Wolpoff: Science, because it brings us explanations of our world we may act on, is by far the most important invention of this time. The fact that the explanations are usually wrong brings the partial illusion of progress, as well as tenure, which is a consequence of the publications debating the various wrongnesses. At its best, science works in a sort of Darwinian frame, where hypotheses are the source of variation (cleverness counts) and disproofs are the extinctions. 
Developments from hypotheses are the analogues of ontogeny, and there are various other processes that parallel the biological world, such as the roles of randomness (first publications carry excess influence by virtue of being first, just as Microsoft systems succeed by being most common but not necessarily best) and punctuated equilibrium (scientific revolutions are complete replacement events). There are even biological-like terms like "memes" that may describe how hypotheses are transmitted. All in all, ever since we hominids developed significantly complex culture (that extrasomatic way of transmitting hierarchically structured information), well before Neandertal times, we have enjoyed (in the sense of the Chinese curse) interesting times. MILFORD WOLPOFF is a paleoanthropologist, Professor of Anthropology at the University of Michigan, author of Paleoanthropology, and coauthor (with Rachel Caspari) of Race And Human Evolution. _________________________________________________________________ Mark Mirsky: Last year, I held my tongue when it came to questions. No one, however, asked directly, as I recall, what was most important to me: What is going to happen to me after I die? This is only the prologue to other questions that go to a root of human consciousness for me. Is there order or no order in the universe? If there is order, does it represent in my own life a pattern that is supposed to mean something for me? What meaning does my life have? Is there order or simply random event in my life? Does what I do affect the order of the universe in any important way, in any way that affects its order? According to the scholar of religious philosophy Harry Wolfson, Spinoza believed that memory would survive, though he had no logical proof for this belief. Human actions would therefore matter because they would be bound up with memory. Evolution and DNA in part confirm that at least in limited spans of time this is true. What will survive me? 
What is most important in the last two thousand years, I feel, is the human capacity to enact symbols, to identify reality with them. A friend who is both a distinguished mathematician and a rabbi likes to quote Maimonides to the effect that only original thoughts will survive in the afterlife. This is, after all, a consoling thought to a mathematician, since "original thoughts" are the métier of the sciences. And it is with "fear and trembling" that I tread through the gates of the EDGE site on the sacred ground of scientists. As a novelist, however, I beg to differ with the particulars of this hope, for originality is not necessarily important in the world of fantasy, or rather, what is compelling is not necessarily what is most original. The very word "invention" has, in some of the early responses, a scientific, or pseudoscientific, interpretation. I believe (as someone who has seen briefly -- though in a state of such high anxiety, I can readily admit, they may have been hallucinations -- ghosts) that the act of symbolic enactment is a key to the riddle of consciousness and the most important of human "inventions." Nor do I think such "enactment" or symbol drama is entirely a "human" invention. For I believe it derives from play, though in human beings it has come to combine play with the self-reference of thought about existence. The latter drama of symbols is, I think, part of the uncanny tension between the weight of the Unknown (which I choose to personify with a Capital) and consciousness. The story that has historically "galvanized" Jewish thought and then Christian thought is the Biblical saga of the sacrifice of Isaac, where a family or tribe obviously familiar with human sacrifice passed to its symbolic enactment. In the Sinai desert, years ago, a German sociologist, Gunnar Heinsohn, told me that the Jews were the first people to do away with the exposure of unwanted infants. 
You can speak volumes about human values, but without ceremonies that address terror of the Unknown, the human majority falls prey to the overwhelming anxiety of death and its handmaiden, survival. I am not enough of a historian or anthropologist to insist on what Gunnar spoke of as fact. Symbolic enactment obviously goes much further back in human history than the Biblical world in which we have idealized patriarchs and matriarchs. It probably derives from the play that we can observe among animals. It is, however, a process that is constantly being refined. I can appreciate that the Sioux Indians, when they knocked an opponent on the head with a stick rather than killing him, also invented something that civilization needs -- an extension that the rage for national sports teams may well answer. We recognize, I think, as a society that feels that peace with ourselves is important, that exposing children who are actually delivered, on door steps, brutalizes us as a people of shared customs. Steven Rose speaks of inventions as concepts. It is in bringing ourselves back, again and again, to the concept of invention, and in particular of the invention of symbol in the light of our fear, that I think both the human body and mind find themselves in a balance that allows them to experience that mysterious state that Plato called the "good" and the Bible referred to as "completed" or "perfect," or "quiet within oneself." I would challenge Colin Blakemore's assertion that control of human destiny has shifted from the body to the mind. The mind, after all, is finally subject within the human span to the body, just as the latter has no conscious existence without the mind. We have to reinvent a form of the Shamanism that seeks to bridge this division within contemporary religion or suffer a terror that will devastate most of us in mind and body. 
When we seek overwhelming joy -- in sex, art, music, even the pursuit of knowledge or understanding -- some of us are asking to be just that, overwhelmed, through the mind but throughout the body, and that has to be part of my "greater good" or "balance." At Thanksgiving dinner, two prominent friends in the lofty upper spheres of the university were mocking the blessing of human organs as they passed to the recipient. I felt the opposite: that in the bleak sphere of the hospital, it might be important to a system in shock. The symbol dramatized recalls inspired pages of Milosz on the dance and the way movement locates us in the universe. Space grows bleak without a sense of this location, and dangerous in its suggestion of no meaning. I think we need a more powerful sense of symbol if we are to avoid the fear that our very mastery of technological invention spurs. If human sacrifice could be found unnecessary, so could heroic distinction based on war, national identity based on exclusion, social identity based on wealth, even the more exaggerated rewards of entrepreneurship, great wealth. Some years ago I suggested (in the Village Voice) that the Israelis and Palestinians could find a lasting peace if they both acknowledged large parts of what is called "Israel," what is called "the West Bank," even what is called "Jordan" -- in other words, Biblical Canaan -- as sacred space and turned what was still empty into a religious park. Tragically, their statesmen cannot invent such a symbolic space. To answer Colin Blakemore, I certainly found the contraceptive pill a liberation, at first. Soon, however, it seemed to confuse some of the deepest impulses of sexual joy. I am not sure that the puritanical strategies of the 19th century, in which eroticism was buried in passionate friendship, were not more effective as symbolic of the desire to be one with another than sex in which no children were intended or hoped for. 
I would never want to go back to a world without the pill or effective contraceptives, but I am not sure we have mastered its implications for the body or the mind in that body. For the pill has no ceremony, no weight of ritual behind it, and the meaning of its communion still awaits definition. MARK MIRSKY is the author of many novels including Thou Worm Jacob, Blue Hill Avenue, My Search for the Messiah, and The Red Adam. He is editor of the recently published Diaries: Robert Musil 1899-1942. He is editor of "Fiction" and a professor of English at CCNY. _________________________________________________________________ Dan Sperber: I am afraid the answer I find compelling is a rather trivial one. The two most important inventions in the past two thousand years are the computer and the atomic bomb. The computer will bring about the greatest change to human life since the neolithic revolution, unless the bomb destroys human life altogether. DAN SPERBER is a researcher at the CREA in Paris. He is the author of Rethinking Symbolism, On Anthropological Knowledge, Relevance: Communication and Cognition (with Deirdre Wilson), and Explaining Culture: A Naturalistic Approach. _________________________________________________________________ Lew Tucker: I would have to agree that Gutenberg's printing press is the most important invention in the past two thousand years because it changed forever the cost of knowledge distribution. What other inventions wouldn't have happened if the inventor didn't have access to books? In a sense, I think we can trace many aspects of our information society back to this single invention. In its electronic form on the web, we see movable type and a yearning for information to be accessible and free. The web is taking the cost of distributing information down near zero. Gutenberg would be pleased to see where his invention has taken us. LEW TUCKER is a Java evangelist and director of developer relations at Sun Microsystems. 
He has worked in the areas of artificial intelligence and parallel computers at Thinking Machines and is now building an online community of software developers. See Digerati, Chapter 30. _________________________________________________________________ Tor Nørretranders: THE MIRROR The most influential invention in the past 2000 years has been the mirror: It has shown to each person how she or he appears to other persons on the planet. Before the widespread production and use of mirrors that came about in the Renaissance, humans could mirror themselves in lakes and metallic surfaces. But only with the installation of mirrors in everyday life did viewing oneself from the outside become a daily habit. This coincided with the advent of manners for eating, clothing and behaving. This again made possible the modern version of self-consciousness: Viewing oneself through the eyes of others, rather than just from the inside or through the eyes of God. Hence, consciousness as we know it is an effect of an advanced mental task: To acknowledge the person experienced out there in the mirror as the same as the one being simultaneously experienced from within. To know that the person out there in the mirror is controlled by me in here. The invention of the mirror is closely related to the problem of free will and to the invention of the modern human ego as described in this poll by Jaron Lanier. The problem with overemphasis of conscious control is thus the problem of supervising oneself through the eyes of others, rather than just acting out. Many malaises of modern life stem from the fact that one tends to consider the mirror-image of oneself as more real than the view from within. This new loop of the-outside-person-viewed-by-the-inside-person recently got a parallel with the first images of the Earth seen in the sky of the Moon: No longer just the planet we can touch and live on, the Earth became a heavenly body comparable to other celestial objects. 
TOR NØRRETRANDERS is a science writer and communicator based in Copenhagen, Denmark and the author of The User Illusion: Cutting Consciousness Down to Size. _________________________________________________________________ Richard Potts: Over 4.6 billion years, the most important evolutionary inventions have been those that code, store, and use information in new ways: DNA; nervous systems; organic devices enabling cultural transmission of information. In large perspective, the most important invention over the past 2 thousand years will likely be something related to computers, electronic information coded and handled outside of living bodies. Its importance, however, has not yet been fully realized. I'm going with something whose impact so far is more apparent. The paleontologist in me wants to say something like the discovery of time -- from inventions that have led to an intense sense of personal time to others that have found out the age of the universe or the human species. These inventions are perception-altering. But there's another invention with greater impact. My vote is for flying machines. Before 2 thousand years ago, sea craft allowed the overcoming of water; the wheel, the conquest of earth. And now flying machines, the conquest of air -- an invention that taps into the center of our mythologies. Many inventions change our lives but stay in the predictable range of human nature. Firearms, for example, have had their impact mainly by extending existing tendencies to bluff, subjugate, or kill in immediate, face-to-face situations. Aircraft have altered our perceptions in ways that were evolutionarily unpredictable. They changed the delivery of weapons, vastly destructive weapons, to an intercontinental scale -- a wholly new scale, unprecedented in evolutionary history. A flu virus that mutates in Kennedy Airport is spread around the world within a day or two. 
And so the history of disease has been altered by moving the month- or year-long dispersal of disease to a time scale of hours. We now meet other people en masse anywhere in the world in less than a day's travel. Thus things foreign and strange have become familiar. Ancient phobias and bias toward hatred and exclusion have been altered widely. The CNN culture (instantaneous worldwide information) is an extension of this; in my view, the actual intermingling of people from one place to another has been the more important, precedent-shattering development. Despite international information media, civil strife remains the worst where cultural and physical insularity reigns. Finally, flying machines have meant a global altering of how societies approach food and other resources, tying humanity together in a worldwide economy (resource exchange) driven by our interdependence. Two million years ago, the movement of resources (like food and stone tools) had become a development with extraordinary implications for human evolution. But even 2 thousand years ago, no one could have foreseen just how far this process of resource exchange has gone today -- largely due to flying machines. RICHARD POTTS is Director of the Human Origins Program, Dept. of Anthropology, National Museum of Natural History, Smithsonian Institution in Washington. He is the author of Early Hominid Activities at Olduvai and Humanity's Descent: The Consequences of Ecological Instability, and co-author of Terrestrial Ecosystems through Time. _________________________________________________________________ Lawrence M. Krauss: If I take the word "important" to suggest an invention that will have "the greatest impact on the next 2000 years" (after all, it is the future that counts, not the past!), then the invention of the programmable computer seems to me to be the most important invention of the last 2000 years. 
(I am not including in my list of possibilities here ideas and concepts, since I don't think they qualify as inventions, and I suspect that the intent of the question is to explore technology, not ideas...) While the printing press certainly revolutionized the world in its time, computers will govern everything we do in the next 20 centuries. The development of artificial intelligence will be profound, quantum computers may actually be built, and I am sympathetic to the idea I first heard expressed by my friend Frank Wilczek that computers are the next phase of human evolution. Once self-aware, self-programmable computers become a reality, I have a hard time seeing how humans can keep up without in some way integrating them into their own development. The only other invention that may come close is perhaps DNA sequencing, since it will undoubtedly lead to a new understanding and control of genetics and biology in a way which will alter what we mean by life. LAWRENCE M. KRAUSS, Ambrose Swasey Professor of Physics, Professor of Astronomy, and Chair, Physics Department, Case Western Reserve University, is the author of The Fifth Essence; Fear of Physics; The Physics of Star Trek; and Beyond Star Trek. _________________________________________________________________ John McCarthy: The most important invention is the idea of continued scientific and technological progress. The individual who deserves the most credit for this is Francis Bacon. Before Bacon, progress occurred but was sporadic, and most people did not expect to see new inventions in their lifetimes. The idea of continued scientific progress became institutionalized in the Accademia dei Lincei, the Royal Society, and other scientific academies. The idea of continued invention was institutionalized with the patent laws. JOHN McCARTHY, a computer scientist and one of the first-generation pioneers in AI, is at the Computer Science Department of Stanford University. 
_________________________________________________________________ Karl Sabbagh: Clearly, none of us is playing by the rules in this game; otherwise we would all concentrate on a few key inventions that are obviously the most important -- the Indo-Arabic number system including zero, computers, the contraceptive pill. Instead we are all reading the suggestions so far and then trying to select something different. Because I've come in late, my friend Nicholas Humphrey has bagged my first thought -- reading glasses -- so I'll break the rules in two ways by choosing something which was invented more than two thousand years ago but refined over the last two thousand years. In fact, I'll break the rules a third time by choosing two things -- chairs and stairs. Apart from the fact that they rhyme, they also represent an imaginative leap: seeing the value to the human anatomy of an idealised platform in space at a certain height. A platform of, say, 7 inches would enable a person to raise himself towards some higher objective without undue effort, but that's as far as it goes. But if, from that new starting point, a further platform of the same height could be constructed, the objective could be more closely approached. The refinements have all been to do with the fact that the greater the height you want to reach, the larger the floor area that has to be taken up by the staircase. But landings and 180-degree turns helped to solve that problem, along with the even later improvement of a spiral structure. The consequences of stairs have obviously included greater density of occupation of site areas, but they have also included the propagation of the Muslim religion by allowing muezzins to call the faithful to prayer from minarets. As far as chairs are concerned, the same thought process was involved -- seeing the value of a platform at just above knee height and then constructing it. 
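Sabbagh's platform arithmetic is easy to make concrete. The sketch below is an editorial illustration, not his: the 7-inch riser comes from his text, while the 10-inch tread depth is an assumed, typical value. It shows how a 180-degree switchback roughly halves the floor run of a straight flight.

```python
import math

RISER = 7.0    # inches climbed per step (Sabbagh's figure)
TREAD = 10.0   # inches of floor taken per step (assumed typical value)

def staircase(height_in):
    """Return (steps, straight-flight run, switchback run), in inches,
    needed to climb `height_in` inches."""
    steps = math.ceil(height_in / RISER)
    straight_run = steps * TREAD
    # A half-turn landing stacks two flights one above the other,
    # so the horizontal run is roughly halved.
    switchback_run = math.ceil(steps / 2) * TREAD
    return steps, straight_run, switchback_run

print(staircase(9 * 12))   # a 9-foot storey: (16, 160.0, 80.0)
```

The same trade-off (height gained versus floor area consumed) is what the spiral stair pushes to its limit.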
Portability came in at some stage as well, so that instead of finding somewhere of the right height -- a wall, a rock, etc. -- you carry around with you, or position where you like, the place to park your butt. Somehow the height was chosen, or evolved, so that we can stay for the maximum time in a fixed position with eyes, hands, and arms free to do what eyes, hands, and arms are good at. Lying down, standing up, and squatting all get uncomfortable after a while, particularly for reading or writing (although we have to accept that medieval monks seemed to manage O.K. transcribing manuscripts standing up). KARL SABBAGH is a writer and television producer. His programs for the BBC and PBS have encompassed physics, medicine, psychology, philosophy, technology, and anthropology. Three of his television projects have been accompanied by books: The Living Body, Skyscraper, and 21st Century Jet: The Making and Marketing of the Boeing 777. _________________________________________________________________ Ellen Winner: I will cast my vote for anaesthesia. While this invention may not have changed the world for all, it has certainly altered the lives of many for the good. Imagine a world without anaesthesia. It makes me shudder. Howard Gardner and I are probably one of the few couples who replied to your request. Last night we were at a party and we mentioned this project. I said that Howard's and my choices (Western classical music for Howard vs. anaesthesia for me) showed how different we were: Howard the optimist, I the one who thinks of the grim side of life. At the party was Yo-Yo Ma, who listened with interest and said, without skipping a beat, that our two choices were not so different, because "One is a form of the other." (Interpret that as you will; I take it to mean that music is the ultimate escape from pain, but also perhaps that anaesthesia [when needed] is as pleasurable as music.) 
ELLEN WINNER is Professor of Psychology at Boston College, and Senior Research Associate at Harvard Project Zero. She is the author of Invented Worlds: The Psychology of the Arts and Gifted Children: Myths and Realities. _________________________________________________________________ George Johnson: Surely one of the most powerful earthly inventions has been the ability to represent any phenomenon with numbers -- either analogue or digital -- and then use this representation to predict outcomes in the real world. This information revolution actually began before the year zero with the Pythagoreans and has advanced through stages that include the invention of calculus and, most recently, Boolean algebra and all the advantages of digital modeling. And just as important has been the recent humbling realization that there are limits to this scientific cartography; that, tempting as it is, the map can never be mistaken for the real thing. GEORGE JOHNSON is a writer for The New York Times, working on contract from Santa Fe. His books include Fire in the Mind: Science, Faith, and the Search for Order; In the Palaces of Memory: How We Build the Worlds Inside Our Heads; and Machinery of the Mind: Inside the New Science of Artificial Intelligence. _________________________________________________________________ Rodney Brooks: The electric motor, in all its guises where electricity produces mechanical motion. The industrial revolution was restricted to places of work and shared production until the relatively small and clean electric motor enabled the adoption of its bounty into the home; for instance, refrigeration, automated cleaning, cooling, better heating, entertainment, mass data storage, home medical care, and more comfortable personal transportation. True, many of these aspects were present in the home with simpler technologies (e.g., gravity-driven water flow, convective air flow), but it was the electric motor which made them pervasive. 
The change in our western lifestyle has been profound and has completely changed our expectations of how our bodies should fit with our surroundings. A question: what will it take for the computer revolution to truly enter our lives in the way that the electric motor has enabled the industrial revolution to do so? RODNEY BROOKS, a computer scientist, is director of MIT's AI Lab. See EDGE: "The Deep Question," A Talk With Rodney Brooks. _________________________________________________________________ John R. Searle: If by invention we mean actual technological advances -- as opposed to ideas, theories and concepts -- then there have been some good ones. One thinks of the printing press and the clock, for example. It is too early to say for sure, but my choice for the most important invention of the past 2000 years would be the invention of the set of agricultural techniques known collectively as "The Green Revolution". This invention began in the 1960's and continues into the nineties; indeed, it is now being extended into something that may well come to be called "The Green-Blue Revolution", which would extend new agricultural techniques to the oceans. The most important invention of all time is the Neolithic Revolution. With the Neolithic Revolution, humanity found ways to grow crops systematically, and thus overcame both the instability and the fragility of life itself that went with hunter-gatherer ways of survival. Hunter-gatherers could neither stay in one place long enough to develop a stable civilization, nor could they count on being able to survive periods of drought and other forms of natural catastrophe. With the Neolithic revolution, both of these problems were solved, and civilization became a real possibility. However, the Neolithic Revolution brought problems of its own, in particular the Malthusian problem: the growth of population was constantly threatening to outrun the growth of food supply. 
For the foreseeable future, at least, this problem has been solved by the Green Revolution. The food supply has vastly outrun the increase in population. At this time, if you read that there is a famine going on in some part of Africa or Asia, you know that it is deliberately politically created. There is no international shortage of food. There is plenty of food to go around, and because of the Green Revolution, there will be food to go around for the foreseeable future. JOHN R. SEARLE is the Mills Professor of Philosophy of Mind at the University of California, Berkeley and author of The Rediscovery of the Mind, Minds, Brains and Science: The 1984 Reith Lectures, Intentionality: An Essay in the Philosophy of Mind, The Construction of Social Reality, and Mind, Language & Society: Doing Philosophy in the Real World (A MasterMinds Book). _________________________________________________________________ Lee Smolin: The most important invention, I believe, was a mathematical idea, which is the notion of representation: that one system of relationships, whether mathematical or physical, can be captured faithfully by another. The first full use of the idea of a representation was the analytic geometry of Descartes, which is based on the discovery of a precise relationship between two different kinds of mathematical objects, in this case, numbers and geometry. This correspondence made it possible to formulate general questions about geometrical figures in terms of numbers and functions, and when people had learned to answer these questions they had invented the calculus. By now we have understood that it is nothing other than the existence of such relationships between systems of relations that gives mathematics its real power. 
Many of the most important mathematical developments of the present century, such as algebraic topology, differential geometry, representation theory and algebraic geometry, come from the discovery of such relationships, of which Descartes's analytic geometry was only the first example. The most profound developments in present mathematics and theoretical physics are all based on the notion of a representation, which is the general term we use for a way to code one set of mathematical relationships in terms of another. There is even a branch of mathematics whose subject is the study of correspondences between different mathematical systems, which is called category theory. According to some of its developers, mathematics is at its root nothing but the study of such relationships, and for many working mathematicians, category theory has replaced set theory as the foundational language within which all mathematics is expressed. Moreover, once it was understood that one mathematical system can represent another, the door was open to wondering whether a mathematical system could represent a physical system, or vice versa. It was Kepler who first understood that the paths of the planets in the sky might form closed orbits, when looked at from the right reference point. This discovery of a correspondence between motion and geometry was far more profound than the Ptolemaic notion that the orbits were formed by the motion of circles on circles. Before Kepler, geometry may have played a role in the generation of motion, but only with Kepler do we have an attempt to represent the orbits themselves as geometrical figures. At the same time Galileo, by slowing motion down through the use of devices such as the pendulum and the inclined plane, realized that the motions of ordinary bodies could be represented by geometry. 
When combined with Descartes's correspondence between geometry and number, this made possible the spatialization of time, that is, the representation of time and motion purely in terms of geometry. This not only made Newtonian physics possible, it is of course what we do every time we graph the motion of a body or the change of some quantity in time. It also made it possible, for the first time, to build clocks accurate enough to capture the motion of terrestrial, rather than celestial, bodies. The next step in the discovery of correspondences between mathematical and physical systems of relations came with devices for representing logical operations in terms of physical motions. This idea was realized early in mechanical calculators and logic engines, but of course came into its own with the invention of the modern computer. But the final step in the process begun by Descartes's analytic geometry was the discovery that if a physical system could represent a mathematical system, then one physical system might represent another. Thus, sequences of electrical pulses can represent sound waves, or pictures, and all of these can be represented by electromagnetic waves. Thus we have telecommunications, certainly among the most important inventions in its own right, which cannot even be conceived of without some notion of the representation of one system by another. Telecommunications also gave rise to a question: what is it that remains the same when a signal is translated from sound waves to electrical impulses or electromagnetic waves? We have a name for the answer -- information -- but I do not have the impression that we really understand its implications. 
For example, using this concept some people are claiming not only that some physical or mathematical systems can be represented in terms of another, but that there is some coding that would permit every sufficiently complicated physical or mathematical system to be represented in terms of any other. This, of course, brings us back to Descartes, who wanted to understand the relationship between the mind and the brain. Certainly the concept of information is not the whole answer, but it does give us a language in which to ask the question that was not available to Descartes. Nevertheless, without his first discovery of a correspondence between two systems of relations, we would not only lack the possibility of talking about information, we would not have most of mathematics, we would not have telecommunications and we would not have the computer. Thus the notion of a representation is not only the most important mathematical invention, it is the idea that made it possible to conceive of many of the other important inventions of the last few centuries. LEE SMOLIN is a theoretical physicist; Professor of Physics at the Center for Gravitational Physics and Geometry at Pennsylvania State University; author of The Life of the Cosmos. See EDGE: "A Possible Solution for the Problem of Time in Quantum Cosmology" by Stuart Kauffman and Lee Smolin. See The Third Culture, Chapter 17. _________________________________________________________________ Paul W. Ewald: My nominee is the concept of evolution by selection (which encompasses natural selection, sexual selection, and the selective processes that generate cultural evolution). It offers the best explanation for what we are, where we came from, and the nature of life in the rest of the universe. It also explains why we invent and why we believe the inventions described in this list are important. It is the invention that explains invention. PAUL W. 
EWALD is an evolutionary biologist; Professor of Biology at Amherst College; author of Evolution of Infectious Disease. _________________________________________________________________ Carl Zimmer: I nominate waterworks -- the system of plumbing and sewers that gets clean water to us and dirty water away from us. I'm hard-pressed to think of any other single invention that has stopped so much disease and death. It may not inspire quite the intellectual awe of something like a quantum computer, but the sheer heft of the benefits it brings about so simply makes it all the more impressive. John Snow didn't need to sequence the Vibrio cholerae genome to stop people from dying in London in 1854 -- he didn't even know what V. cholerae was -- but a pattern of deaths showed him that to stop a cholera outbreak all he needed to do was shut down a fouled well. Without waterworks, the crowded conditions of the modern world would be utterly insupportable -- and you only have to go to a poor city without clean water to see this. Another sign of the importance of an invention is the havoc it can wreak, and waterworks score here again -- by cutting down infant mortality they help fuel the population explosion, and they also let places like Las Vegas suck the surrounding land dry. I'd even go so far as to put the importance of the invention of waterworks on an evolutionary scale with things such as language. For hundreds of millions of years, life on land has been crafting new ways to extract and hold onto water. With plumbing, however, you don't go to the water -- the water comes to you. CARL ZIMMER is a senior editor at Discover and author of At the Water's Edge: Macroevolution and the Transformation of Life. _________________________________________________________________ Robert Shapiro: Most of the inventions mentioned thus far have affected, as one contributor put it, the boundary between us humans and the natural world that surrounds us. 
But the operations of the human body, and the brain which it contains, support all of the experiences that make up our existence. Discoveries that will permit us ultimately to take charge of these functions, and shape them to our desires, surely deserve nomination as the most important of the last two millennia. These insights have flowed broadly from the entire area of science that is now called molecular biology, but if I had to single out the most important invention that made the entire process possible, then I would select genetic sequencing for the honor. The new techniques developed by Fred Sanger in Cambridge and Walter Gilbert at Harvard in the mid-1970s allowed us to read out rapidly the specific information stored in our genes and those of all other living creatures on Earth. The new methods stimulated a burst of scientific energy that will culminate in the next decade, when the sequence of about 3 billion characters of DNA that encodes a typical human being will be fully deciphered by the Human Genome Project. In subsequent explorations, we shall learn how individuals differ in their heredity, and how this information is expressed to produce the human body. Thus far the effects of sequencing have largely impacted us through such media-worthy events as the identification of the stain on Monica Lewinsky's dress, validation of the identity of the Romanov bones, refutation of the claim of Anna Anderson to be Anastasia and confirmation of Thomas Jefferson's affair with Sally Hemings. Much, much more is yet to come. The completion of the Human Genome Project will provide us with an understanding, at the molecular level, of human hereditary disease (much has already been learned about Huntington's disease, cystic fibrosis and others). Further, by the application of other tools from modern molecular biology, we shall be able to do something about these afflictions in the near future. They will be treated and, if society permits it, corrected at the genetic level. 
Beyond that, we shall come to understand, and perhaps control, many unfortunate aspects of the human condition that have until now been taken for granted, from baldness to aging. Ultimately, we may elect to rewrite our genetic text, changing ourselves and the way in which we experience the universe. Much more has been written on these subjects, but I hope that the above brief treatment is enough to qualify genetic sequencing for the short list of finalists in this contest. I will also suggest that any poll taken now would not do justice to this invention, as most of its consequences still lie ahead of us. Perhaps we should schedule another poll for the year 3998, to determine the best invention in the period AD 1-2000? ROBERT SHAPIRO, Professor of Chemistry at New York University, has written three books: Life Beyond Earth (co-author), Origins: A Skeptic's Guide to the Creation of Life on Earth, and The Human Blueprint. _________________________________________________________________ James Bailey: Terrence Sejnowski said it beautifully: the most important discovery of the past 2000 years is the bit. Not the bit used 8-by-8 to redisplay the old sequential sentences and equations that carry too much of our culture today, but rather the bit which, used in parallel profusion, can embody living realities far beyond the expressive power of static text. Images and music are just the beginning of it. We are only now awakening to how much the printing press narrowed western culture by driving it into text and sequentialism for the past 500 years. Is it true, as the recent da Vinci museum exhibit haughtily claimed, that Leonardo was not a true scientist at all because, unlike Galileo, he did not publish? Of course not. It was merely the fact that his highly parallel, and hence visual, way of doing science was hopelessly incompatible with the printing press. 
(He probably wouldn't even have participated in this exercise, where we are all limiting our responses to those which can be expressed in text even though we are no longer forced by technology to do so.) Imagine, just for a moment, that Gutenberg had invented the worldwide web instead of movable type. It would have been Leonardo's science, with its focus on the living and the parallel, that would have been ubiquitous. Galileo's endless dialogs might have been lucky to get percussio per diem una. In general, the equations of the book era have been superb at describing the parts of reality that are dead and hence universal. Bits seem much more capable of describing the other 99%. I have the sense that biology is already moving into the post-book era. To understand what biologists are doing, it is not enough to read the sentences they write. Increasingly, one must run the programs they run and get at the bits themselves. JAMES BAILEY is a former computer company executive at Thinking Machines; author, After Thought: The Computer Challenge to Human Intelligence. _________________________________________________________________ John C. Dvorak: Let's ignore discoveries (germs) and technique (scientific method) for starters before determining the greatest invention. I also think that the printing press, a device invented to rip off the Bible-buying public, should be relegated to its rightful place as number two to a newer invention: computer networks. While it is quaint to romanticize the past by citing the printing press, steam engines or 18th century lug nuts, we ignore the fact that our inventiveness as a civilization is increasing, not decreasing, and newer inventions might be the most important inventions. And let's choose an invention in and of itself and not argue about derivatives. Right now the invention that is revolutionizing the world (more than TV, for sure) is the computer network -- the Internet in particular. 
And, for what it's worth, arguing that none of this would be possible if man hadn't learned to grunt first, and that therefore grunting is the most important invention, is nonsense. More interesting in this artificial discussion is how most of the participants, including myself, have chosen an invention from their particular specialty. Perhaps we should ask the question: what is the most insidious invention of the past 2000 years? How about specialization? Look at how insidious it is in this discussion. So much so that it's frightening. Change the topic! Discussing the most insidious inventions would be more fun than talking about the importance of hay, the concept of infinity or Goedel! Just think of the possibilities. We can nominate plastic, the stock market, roller pens, the vibrating dildo, sitcoms, the literary agent, Microsoft Visual BASIC, the animated cartoon, CNN, the wrist watch, roller blades, the spinach souffle. The possibilities are endless. Let's start over. JOHN C. DVORAK is a columnist at PC Magazine, PC/Computing, Computer Shopper, PC-UK, Info (Brazil), Boardwatch, Barrons Online, host of Public Radio's Real Computing and host of Silicon Spin on ZDTV. He's written 14 books, all out of print. Co-founder of the Oswald Spengler Society. See Digerati, Chapter 8. _________________________________________________________________ Kenneth Ford: Well, this isn't very imaginative, but my choice -- like that of several other contributors -- is the pill. Here's my reasoning. The greatest invention of the last 2000 years is the one that is most likely to help avert the collapse of civilization in the next 2000. Electricity as a means of information and energy transport is a candidate. Modern medicine is a candidate. But what drives or exacerbates every major global problem is, ultimately, population growth. So whatever most effectively limits population growth is the greatest invention -- and that's the pill, or contraception more generally. 
KENNETH FORD is Director of Science Programs at the David and Lucile Packard Foundation, former director of the American Institute of Physics, and, with John Archibald Wheeler, the recent co-author of Geons, Black Holes, and Quantum Foam: A Life in Physics. _________________________________________________________________ Philip Brockman: As a researcher I believe that my most important contributions are inherent in the younger people I have worked with and in the increase of the universal knowledge that has resulted from the work that I have done and have sponsored among universities and companies. Stephen Budiansky points to the importance of "the domestication of the horse as a mount." The remarkable fact is that mankind can learn new technology at an amazing pace. Thus, in a relatively short time after the introduction of the horse to America, the Apache were a great light cavalry. I would also agree with David Shaw re: "the steady accrual of both knowledge and technology that has accompanied the rigorous application of the scientific method over a surprisingly small number of human generations"; and with Stanislas Dehaene on the "concept of education." So I guess my invention violates the 2,000-year (very Christian) limit. It is the intergenerational passing of information. PHILIP BROCKMAN, a physicist, has been at NASA LaRC (Langley Field, Virginia) since 1959 and is a recipient of NASA's Exceptional Service Medal (ESM). His research includes: shock tubes; plasma propulsion; diode laser spectroscopy; heterodyne remote sensing; laser research; laser injection seeding; remote sensing of atmospheric species, winds, windshear and vortices. He is currently supporting all-solid-state laser development for aircraft and spaceborne remote sensing of species and winds and developing coherent lidars to measure wake vortices in airport terminal areas. 
_________________________________________________________________ Howard Rheingold: The kind of thinking that makes it possible for all these people to expound upon "the single most important invention of the last two thousand years" is the most important invention of the past two thousand years. There is no such thing as the single most important invention of the last two thousand years. The evolution of technology doesn't work like that. It's a web of ideas, not a zero-sum game. Knowing how to turn knowledge into power is the most powerful form of knowledge. The mindsets, mindtools, and institutions that make massive technological progress possible are all part of an invisible cultural system -- it is learned, not inherent; it was invented, not evolved; it hypnotizes you to see the world in a certain way. What we know as "technology" -- the visible stuff that hums and glows -- is only the physical manifestation of a specific kind of social system. That invisible system, which emerged over the past three centuries -- what Jacques Ellul called "la technique" and Lewis Mumford called "technics" -- is more important than all the inventions it engendered. Do we lack one important invention at a crucial time when our inventions are becoming our only evolutionary competitor? We haven't formulated and agreed upon a way of making good decisions about the powerful technologies we're so good at creating. We have a lot of the knowledge that turns knowledge into power. We need more of the wisdom that knows what we ought to do with the power of invention. HOWARD RHEINGOLD, founder of Electric Minds, is the author of Tools for Thought, Virtual Reality, and Virtual Communities. _________________________________________________________________ George Lakoff: As a cognitive linguist whose job is to study conceptual systems, both conscious and unconscious, I was struck by what was meant by "invention." 
o The most concrete "inventions" proposed have been gadgets, mechanical or biological -- the printing press, the computer, the birth control pill. o A step away from the concrete specific technical innovations are specific technical inventions of a mental character: Gödel's Theorem, Arabic numbers, the nongeocentric universe, the theory of evolution, the theory of computation. o A step away from those are the general innovations of a mental character in specific domains like science and politics, e.g., the scientific method and democracy. I would like to go a step further and talk about the invention that was causally necessary for all of the above: o The most basic fully general invention of a mental character is The Idea of an Idea. THE IDEA OF AN IDEA It's a bit more than 2,000, more like 2,500 years, at least in the West. It is an "invention" in the sense that human beings actively and consciously thought it up: to my knowledge, it is not the case that every indigenous culture around the world objectifies the notion of an idea, making it a thing that can be consciously constructed. What is required for all other human inventions is the notion that one can actively, consciously construct new ideas. We take this for granted, but it is not a "natural" development. Three-year-old children have lots of ideas and even make up new ideas. But they do not have the Idea of an Idea that they can construct anew; they do not naturally arrive at the idea that making up new ideas is something people do. The Idea of an Idea is a cultural creation that children have to learn. It is only with the Idea of an Idea that we get conscious specific intellectual constructions like democracy, science, the number system, the computer, the birth control pill, and so on. The Idea of an Idea is the generative notion behind the very notion of an invention and is causally necessary for all specific inventions. 
GEORGE LAKOFF is Professor of Linguistics at the University of California at Berkeley, where he is on the faculty of the Institute of Cognitive Studies. He is the author of Metaphors We Live By (with Mark Johnson), Women, Fire and Dangerous Things: What Categories Reveal About the Mind, More Than Cool Reason: A Field Guide to Poetic Metaphor (with Mark Turner), Moral Politics, and Philosophy in the Flesh (with Mark Johnson). _________________________________________________________________ Robert Provine: Discovery of Childhood and Invention of Universal Schooling Instead of suggesting a device, I nominate the educational process essential for a high velocity of inventiveness, the evolution of a technological society, and the spread of culture. While schools for the elite have existed since antiquity, the recognition of childhood as a unique time of life with special schooling, social, and emotional needs, and different standards of justice, is relatively recent and associated with Rousseau, Freud, Piaget, and their forebears. The discovery that children are not "miniature adults" led to a more humane society and was essential to tailoring educational programs to the developmental stage of the student. Universal schooling (and even the modern university) was born both of this increased appreciation of the special needs of children and of necessity -- the industrial revolution needed a cadre of trained workers, scientists and engineers. The complexity of modern technology and the associated acceleration of innovation demand a critical mass of creative minds and hands that cannot be provided by occasional virtuosi toiling in solitude. ROBERT R. PROVINE is Professor of Neurobiology and Psychology at the University of Maryland. _________________________________________________________________ Peter Cochrane: The thermionic valve, invented by Lee de Forest in 1906, really was the birth of the electronic age. Without this invention most of us would never have been born. 
Without electronics this planet would not be supporting the massive numbers of people now living in the West. We would not be able to communicate, compute, manufacture and distribute atoms on the scale we now enjoy. There would be no radio, TV, computers, Internet, modern medicine or engineering, international travel of any scale, atomic power, or almost anything we currently take for granted. In fact our species and our civilisation would have stalled without this invention. The thermionic valve was very closely followed by the transistor in 1947, with Bardeen, Brattain, and Shockley creating the foundation for what you are reading this on -- the PC. PETER COCHRANE is Head of Research, BT Laboratories, UK and the author of Tips for Time Travelers. _________________________________________________________________ Samuel Barondes: The great invention of the modern era is the invention of organized science -- scientific societies and journals that foster the accumulation and dissemination of knowledge based on evidence rather than on authority or revelation. Before the invention of these organizations the accumulation of scientific knowledge was slow, because there were no established venues for criticism and education -- essential social interactions at the heart of science. Now that these organizations (in the developed world) have become very large (necessitating the proliferation of many subdivisions, to allow for personal interactions on a human scale) they facilitate the unprecedented opportunities for collective knowledge and self-knowledge that so many of us enjoy. SAMUEL H. BARONDES, M.D. is the Jeanne and Sanford Robertson Professor of Neurobiology and Psychiatry at the University of California at San Francisco, President of the McKnight Endowment Fund for Neuroscience, and the author of Molecules and Mental Illness and Mood Genes: Hunting for Origins of Mania and Depression. 
_________________________________________________________________ Christopher Westbury: My nomination for the most important invention of the past 2000 years is probability theory, which was mainly put together in a series of steps between 1654, when Blaise Pascal proposed a solution for splitting the pot in an unfinished game of chance, and 1843, when Antoine Cournot offered a definition of chance as the crossing of two independent streams of events. I don't nominate it simply because probability theory laid the foundation for statistical analysis, which provided us with a vocabulary without which most scientific discoveries made in the last century would have been (literally) unthinkable. Nor do I nominate probability theory because it gave us for the first time a trustworthy tool for deciding how to apportion belief to multiple sources of evidence. Probability theory had even more fundamental epistemological implications whose importance is under-appreciated in our time because those implications are so seamlessly integrated into the foundations of our modern world view. Until the nineteenth century, the idea that there could exist deep regularities underlain by pure chance -- regularities arising from distributions of events which were themselves the result of multifarious unmeasurable causes -- was not only almost unknown (Aristotle had hinted at it, as he seems to have hinted at everything), but actually philosophically repugnant. It required the invention of probability theory to make this idea thinkable. In making it possible to think about such abstract regularities, probability theory rescued us from two philosophical shackles which had held us back from the beginning of history: that of needing to postulate a centralized controller that made everything come out right, and that of assuming that "what you see is what you get" -- i.e. that the proper objects of scientific study are roughly identical to the direct objects of the senses. 
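Westbury's starting point, Pascal's 1654 division of the pot (the "problem of points"), is easy to make concrete. The sketch below is my own illustration, not part of the original essay: it computes the fair split for a fair game using Pascal's recursion, in which a player's share is proportional to their probability of going on to win had the game continued.

```python
from fractions import Fraction

def p_win(a, b):
    """Probability that the first player wins a fair game when they
    need `a` more points and the opponent needs `b` more.
    Pascal's recursion: each round is a 50/50 step toward one player."""
    if a == 0:
        return Fraction(1)  # first player has already won
    if b == 0:
        return Fraction(0)  # opponent has already won
    return Fraction(1, 2) * (p_win(a - 1, b) + p_win(a, b - 1))

# Classic case: first to 3 points, interrupted with the leader ahead 2-1.
# The leader needs 1 more point, the trailer needs 2.
print(p_win(1, 2))  # 3/4 -- the leader's fair share of the pot
```

The recursion captures the conceptual move Westbury describes: the stake is divided not by points already scored but by the distribution of possible futures.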
Though perhaps they have still not been totally removed, those philosophical shackles needed to be at least loosened in order for science to get moving. A whole new world of law-obeying objects to be studied was opened up by probability theory. Neither Darwin's theory of natural selection nor Maxwell's theory of statistical mechanics (both published in the same year, only 140 years ago) would have been thinkable before probability theory was thinkable. Without probability theory, humankind would be (and was) unable to even conceive of the explanations for many -- probably most -- of the phenomena which we have ever explained. CHRISTOPHER WESTBURY is a post-doctoral fellow at the University of Alberta. _________________________________________________________________ John Rennie: Earlier contributors have already staked out the intellectual high road of mental constructs like scientific method and the calculus, so I'll retreat to the most prosaic, literal reading of your question: What is the one device invented by one person at one moment during the past 2,000 years that has had the most influence to date? I'd be a traitor to my inky profession if I didn't at least echo the nominations for Johann Gutenberg's movable-type printing press. But in the spirit of the game, let me throw support behind something else: Alessandro Volta's electric battery of 1800. Static electricity had been known since at least the time of the Greeks, but study of it had largely stalled. When Pieter van Musschenbroek built and discharged the first Leiden jar in 1745, nearly killing himself in the process, he also jolted the study of electricity back to life. But it was Volta's invention of a steady source of current, inspired by the electrochemical observations of Galvani, that revolutionized technology and physics. Without it, Oersted could not have proved that electricity and magnetism were different faces of the same force, electromagnetism. 
Electrochemistry itself offered clues to the underlying electrical nature of all matter. And of course, Volta's battery was the forerunner of all the electrical devices that have transformed the world over the past two centuries. What I find so appealing about Volta's creation is that it had immense practical significance but also opened to us a world of physical phenomena that in themselves changed our understanding of the universe. Yet it was not a bolt-from-the-blue inspiration; it pulled together other threads of discoveries by Volta's contemporaries. There's a lesson about greatness in there somewhere. JOHN RENNIE is the editor in chief of Scientific American magazine. _________________________________________________________________ Randolph Nesse: Text is Special It seems to me, as it will no doubt to many others, that the printing press has changed the world more than any other invention in the past two millennia. But why has such a simple technology had such a huge influence? And why, after 500 years, has no one invented a superior replacement? I suspect it is because text has a special relationship to the human mind. Printing is the third wave of the biggest innovation, the one that started with the co-evolution of language, thought and speech. Speech makes it possible to share and compare internal models of the external world, a capacity that gives huge selective advantages. But acoustic vibrations are ephemeral, fading in moments into questions about who said what, when. Writing, the second wave, is like a blast of super-cooled air that freezes words in mid-flight and smacks them onto a page where they can be examined by anyone, anywhere, anytime. Writing makes possible law, contracts, history, narratives, and poetry, to say nothing of sacred texts with their overwhelming influence. Printing transformed writing into the first mass medium, and the world has never been the same since. 
In the half-century that followed Gutenberg's 1455 Bible, over a thousand publishers printed over a million books. Suddenly it was worthwhile, and soon essential, for even ordinary people to learn to read. Now, people whose brains have trouble with this trick are at a severe disadvantage, while some with particular verbal felicity can make a living just by arranging words on paper. Is text merely a temporary expedient, necessitated by the previous inability to record and transmit speech and images? We will soon see. In just a few years, sensors, storage and bandwidth will be so inexpensive that many people will be unconstrained by technical limitations. This affords a fine opportunity to make bold predictions that can be completely and embarrassingly wrong, as wrong as the predictions that e-mail would never catch on. In that spirit, I predict that voice and video attachments to e-mail, "v-mail" and "vid-mail," will be the next big thing, and they will create all manner of consternation. At first they will be hailed as more personal and more natural, thanks to the increased content carried by intonation and exclamations. But soon, I predict, the usual human strivings will give rise to problems. Many people who previously were forgiven as "liking to hear themselves talk" will be revealed as actually wanting to hear others listen to them talk. Some, especially bosses, will send long soliloquies to hundreds of other people in the expectation that they will be listened to in full. The wonderful veil of privacy in which a reader considers a text will be rent. You won't be able to jump around and skip whole paragraphs in v-mail and vid-mail, as you can in e-mail. Time and attention will be revealed as the valuable resources they are. Many people will post electronic notices equivalent to the one a friend has on his answering machine, "Leave a message, but please KEEP IT BRIEF." To solve this we will, of course, turn to still more technology. 
V-mail will be transformed automatically into text so we will have a choice of media. What will we choose? It will depend. For emotional endearments, and many narratives, v-mail and vid-mail. For simple facts, and subtle ideas, however, I think we will choose text, at least, that is, until our brains are changed by the selective forces unleashed by these technologies. RANDOLPH NESSE, M.D., is Professor of Psychiatry, Director, ISR Evolution and Human Adaptation Program, The University of Michigan and coauthor (with George C. Williams) of Why We Get Sick: The New Science of Darwinian Medicine. _________________________________________________________________ Brian Greene: My initial thought of how to define the importance of an invention was to imagine the impact which would be caused by its absence. But having just sat through yet another viewing of "It's a Wonderful Life," I am inspired to leave contemplation of the contingencies of history to others better suited to the task. And so, I will vote for my "knee-jerk" response: The Telescope. The invention of the telescope and its subsequent refinement and use by Galileo marked the birth of the modern scientific method and set the stage for a dramatic reassessment of our place in the cosmos. A technological device revealed conclusively that there is so much more to the universe than is available to our unaided senses. And these revelations, in time, have established the unforeseen vastness of our dynamic, expanding universe, shown that our galaxy is but one among countless others, and introduced us to a wealth of exotic astrophysical structures. BRIAN GREENE is a professor of physics and of mathematics at Columbia University, and author of The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory. 
_________________________________________________________________ Esther Dyson: I'd say the notion that people can govern themselves, rather than being governed by someone who claims divine right. (I'm wrestling with that one myself right now, on the Internet.) ESTHER DYSON is president of EDventure Holdings and editor of Release 1.0. Her PC Forum conference is an annual industry event. She is the author of Release 2.0: A Design for Living in the Digital Age. Release 2.1, the paperback upgrade, is now available. Dyson is also active in industry affairs; she is the interim chairman of ICANN, the Internet Corporation for Assigned Names and Numbers; a member of the board of the Electronic Frontier Foundation and is a member of the President's Export Council Subcommittee on Encryption. See Digerati, Chapter 9. _________________________________________________________________ Steven Johnson: Given the amount of self-reference in the answers so far, I'm tempted to nominate this very discussion list as the greatest invention of the past two thousand years, and hopefully out-meta all the other contenders. I think part of the problem here is the fact that inventions by nature are cumulative, and so when asked to pick out the single most important one, you're inevitably faced with a kind of infinite regress: if the automobile is the most important invention, then why not the combustion engine? (And so on ...) In that spirit -- and in the spirit of nominating things you happen to be working on professionally -- I'd nominate the ultimate cumulative invention: the city. Or at least the modern city's role as an information storage and retrieval device. Before there were webs and telegraphs making information faster, there were cities bringing information physically closer together, and organizing it in intelligible ways. 
It's not a stretch to think of the original urban guilds as file directories on the storage device of the collective mind, combining disparate skills and knowledge bases and placing them into the appropriate slots. (Manuel De Landa has a wonderful riff on this in the first section of his new book, A Thousand Years of Nonlinear History.) But of course, the city isn't an invention proper, at least in the conventional way that we talk about inventions. It's the sum total of multiple inventions, without each of which the modern city as we know it might not exist. I think what this discussion makes clear is that we need a better definition of "invention"! STEVEN JOHNSON is the author of Interface Culture: How New Technology Transforms The Way We Create And Communicate, and the editor-in-chief of FEED Magazine. He is working on a new book about cities and emergent behavior. _________________________________________________________________ Delta Willis: Had you asked two years ago I would have nominated the airplane, for it symbolizes the essence of invention by being composed of other inventions: the wheel, the bicycle, a glider, a prop, and a 12-horsepower engine. The airplane is important because it diminishes our parochial view; it, too, changed the manner of warfare (Hey, nice bomber) and one can argue that it was an early form of the space shuttle. But if by important you mean sweeping, the utility of electricity is pivotal to so many things mundane and great, especially the broadcast of information, the A train, the monitors that tell me if my flight is delayed, this modern version of Gutenberg's press, and Les Paul's guitar. Thomas Edison received 389 patents for electric light and power, and Nikola Tesla patented his Apparatus for Transmitting Electrical Energy. Of course humans no more invented electricity than we invented flight, but utility is key. 
As I write I am preparing to go out for New Year's Eve, and packing a flashlight, just in case an old programming shutdown code of 99 introduces us to the Y2K bug a year early. Should such a power failure occur, the impact of electrical utility will be known. DELTA WILLIS wrote The Hominid Gang: Behind the Scenes in the Search for Human Origins, and The Sand Dollar and the Slide Rule: Drawing Blueprints from Nature. _________________________________________________________________ Joseph LeDoux: Inventions. My top runners in the area of physical inventions would have to be ways of harnessing energy, ways of moving around the world, and ways of communicating. And since the latter two depend on the first, I'd have to put my money on energy control and use. But we've got lots of psychological and social inventions as well. I'd put the idea that all people are equal at the top of the list. This is an invention that we could make better use of. JOSEPH LEDOUX is a Professor at the Center for Neural Science, New York University. He is the author of the recently published The Emotional Brain: The Mysterious Underpinnings of Emotional Life, coauthor (with Michael Gazzaniga) of The Integrated Mind, and editor with W. Hirst of Mind and Brain: Dialogues in Cognitive Neuroscience. See EDGE: "Parallel Memories: Putting Emotions Back Into The Brain" -- A Talk With Joseph LeDoux. _________________________________________________________________ Maria Lepowsky: I've been pondering your millennial/bimillennial question, and I'd like to cheat a bit by giving several answers. I too offer a vote for the oral contraceptive pill. It is revolutionary for two reasons. First, it makes a quantum leap in the effectiveness of technologies for the control of human fertility -- which are found in every known culture and likely date back more than a hundred millennia. 
The pill and subsequent devices have the potential for a revolutionary impact on the lives of women from puberty to menopause everywhere in the world, allowing women to control their own fertility and thus enabling members of half the human species to control their own adult lives. In addition, these devices have the potential to save the planet Earth from the ongoing disaster of human overpopulation, with its present and future dire consequences globally of mass poverty, pandemics, warfare and violent confrontations over scarce resources, environmental degradation, and wholesale species extinctions. My next vote for most important technology of the last two thousand years goes to the gun, or more precisely to a series of European inventions of more efficient killing technologies. The ship-mounted cannon, the Spanish trabuco and the British Snider rifle -- to mention just a few weapons from recent centuries -- in the hands of members of authoritarian societies (whose populations had exceeded the carrying capacities of their homelands given contemporary agricultural technologies), bent on acquiring new territories, propelled across the Atlantic and Pacific Oceans by ships built according to the most advanced maritime technologies of their eras, effected the European conquest of large portions of the planet's landmass, resources, and human populations. The momentous consequences of the European conquest will continue to play themselves out in every sphere of human life around the globe over the next millennium. My final vote goes to the revolutionary improvements in hydraulic engineering made beginning in the late nineteenth century that have solved what has for millennia been the single greatest problem of urban life: how to bring clean water in and human waste out of a large nucleated settlement. 
While the Roman waterworks were brilliantly designed (and their epoch crosses the bimillennial cut-off point of this exercise), improvements in sanitation made only a century or so ago led, in industrial societies like Britain and the United States, to a revolutionary drop in the death rate from infectious diseases transmitted by fecal contamination of drinking water. These advances in hydraulic engineering have extended human life spans even more than the subsequent discovery of antibiotics. This technology has diffused only slowly around the globe as it encounters barriers created by unequal distributions of wealth and power. Even so, ironically, our resulting increased longevity, and the increases in population fertility that declines in mortality rates confer when they are unchecked by other variables, contribute dramatically to the ongoing crisis of human overpopulation. This makes the wide availability of advanced contraceptive technology, invented two generations later, all the more critical for the survival and well-being of our species and of the entire planet. MARIA LEPOWSKY is Professor, Department of Anthropology, University of Wisconsin, and author of Fruit of the Motherland: Gender in an Egalitarian Society. _________________________________________________________________ John Barrow: John, The most important invention is the Indo-Arab counting system with 0,1,2,3,4,5,6,7,8,9 with its positional information content (so 111 means one hundred plus one ten plus one unit), zero symbol, and operator property that adding a zero to the right-hand end of a string multiplies the number by the base value of 10. This system of counting and enumeration is completely universal and lies at the foundation of all quantitative science, economics, and mathematics. JOHN BARROW is Professor of Astronomy at the University of Sussex, England. 
He is the author of The World Within the World, Pi in the Sky, Theories of Everything, The Origins of the Universe (Science Masters Series), The Left Hand of Creation, The Artful Universe, and Impossibility: The Limits of Science and the Science of Limits. _________________________________________________________________ Todd Siler: To avoid incurring the wrath of some scholars, I wanted to add this parenthetical note (see asterisk below) to my statement about language. Hopefully, it clarifies my point a little; or, at least, focuses it. My first candidate is "language"; specifically, our initial realization* of its creative potential, building on the intuitions of the ancient Greeks and Romans. Language is the life-force and body of communication. It comprises all forms of symbolic creations, expressions and systems which we use to communicate: from the mathematical to the vernacular. Without language, every other invention and innovation may never have existed -- including humor! My close-second candidate is E = mc^2. When we learn to tap the full meaning of that piece of symbolic language, we'll create more than a Nuclear Age. "Matter is frozen energy," Einstein said, relating the essence of his insight into the mass-energy relationship. Similarly, language is frozen meaning. When we discover how to unleash the enormous energy in meaning by continually transforming information (data, ideas, knowledge, experience) in new contexts, we'll make a quantum leap in applying the power of language to achieve our boldest dreams. * Note: Some people may choose to date our first deep realization of language's potential around the late 1700s. That's when the first scientific study of the nature and origins of language began to unfold through the systematic, comparative studies of the German scholars Friedrich Schlegel, Jakob Grimm, and Franz Bopp. 
Others may focus on the work of Ferdinand de Saussure whose general, descriptive method led to some basic laws that relate to all languages (about 3,000 or more now). My broad statement is meant to embrace the "makeup" of language: its symbolic nature, structures, semantics, and boundless usages. I'm not simply referring to the inventive act of classifying spoken and written languages into families, or categorizing the growth patterns of language, or charting the evolution of grammar. TODD SILER is the founder and director of Psi-Phi Communications, a company that provides catalysts for breakthroughs & innovation in business and education. He is the author of Breaking the Mind Barrier and Think Like A Genius. _________________________________________________________________ Peter Tallack: The horse collar as the most important high-tech invention. Developed around 1000 AD in northern Europe, it allowed the region to be farmed efficiently and so, it could be argued, was responsible for the rise of civilization there. It also gave its possessors great war-making potential -- think of knights in armour, for example. PETER TALLACK, former book editor of Nature, is science editor of Weidenfeld & Nicolson, London. _________________________________________________________________ Brian Goodwin: The most important invention in the past two thousand years is the printing press. When William Caxton published 'The Canterbury Tales' in the 15th century with the newly invented printing machine, he dramatically accelerated the separation of human culture from nature, eclipsing the direct experience of natural processes that continues in the oral tradition and replacing it by words on a page. This cut in two directions. (1) The power of nature diminished so that science and technology could start the systematic program of gaining knowledge for control of nature, liberating people from drudgery and freeing the imagination. 
(2) At the same time, nature was degraded to a set of mechanisms that humans could manipulate for their own purposes, and the 'rape of nature' began in earnest. We are now reaping twin harvests: vastly expanded potential for written communication through the internet, as in this exchange of views at the Edge web site; and a vastly degraded planet that won't support us much longer, as things are going. Can we use one to save us from the other? We can now connect with each other as never before; but what about nature? BRIAN GOODWIN is a professor of biology at Schumacher College, Milton Keynes, and the author of Temporal Organization in Cells and Analytical Physiology, How The Leopard Changed Its Spots: The Evolution of Complexity, and (with Gerry Webster) Form and Transformation: Generative and Relational Principles in Biology. Dr. Goodwin is a member of the Board of Directors of the Santa Fe Institute. See EDGE: A New Science of Qualities; The Third Culture, Chapter 4. _________________________________________________________________ John Brockman: DNI: DISTRIBUTED NETWORKED INTELLIGENCE In their classic book The Mathematical Theory of Communication, Claude Shannon and Warren Weaver stated: "The word communication will be used here in a very broad sense to include all of the procedures by which one mind may affect another. This, of course, involves not only written and oral speech, but also music, the pictorial arts, the theater, the ballet, and in fact all human behavior." Marshall McLuhan pointed out that by inventing electric technology, we had externalized our central nervous systems; that is, our minds. We had gone beyond Freud's invention of the unconscious, and, for the first time, had rendered visible the conscious. Composer John Cage went further to say that we now had to presume that "there's only one mind, the one we all share." 
Cage pointed out that we had to go beyond private and personal mind-sets and understand how radically things had changed. Mind had become socialized. "We can't change our minds without changing the world," he said. Mind as an extension became our environment, which he characterized as "the collective consciousness," which we could tap into by creating "a global utilities network." We create tools and then mold ourselves in their image. Seventeenth-century clockworks inspired mechanistic metaphors ("the heart is a pump") just as mid-twentieth-century developments in self-regulating engineering devices resulted in the cybernetic image ("the brain is a computer"). Although you don't hear much about cybernetics today in the scientific arena, its impact is profound. "The cybernetic idea," stated anthropologist Gregory Bateson, "is the most important abstraction since the invention of Jesus Christ." He went on to note that we were now living in "a world of pattern, of order, of resonances in which the individual mind is a subsystem of a larger order. Mind is intrinsic to the messages carried by the pathways within the larger system and intrinsic also in the pathways themselves." In this new epistemology Ockham's Razor meets Gödel's Proof and the fabric of our habitual thinking is torn apart. Subject and object fuse. The individual self decreates. (See By The Late John Brockman). Reality passes into description and thus becomes invention. Such ideas, which appear destructive, liberate, allowing us to lay waste to the generalizations of previous epochs which we decreate by getting through the history of our words. As Wallace Stevens wrote: "The words of the world are the life of the world. It is the speech of truth in its true solitude: a nature that is created in what it says." Key to this radical rebooting of our mindsets is the term information, which, in this scheme, refers to regulation and control and has nothing to do with meaning, ideas, or data. 
Bateson pointed out that "information is a difference that makes a difference." The raindrop that hits the ground behind you contains no information. The raindrop that hits you on the nose has information. Information is a measure of effect. Systems of control utilize information if and when they react to change to maintain continuity. If Newtonian physics taught us that it is the parts that matter, we now inhabit a universe that interacts infinitely with itself, where importance lies in the patterns that connect the parts. This becomes problematic because how can a system describe itself without generating a spiralling ladder of recursive mirrors? The answer? Nobody knows, and you can't find out. The description of the plane of language is the plane that holds our descriptions. Language becomes a commission, a dance, a play, a song. With the Internet we are creating a new extension of ourselves in much the same way as Mary Shelley's Dr. Frankenstein pieced together his creation. Only this creation is not an anthropomorphic being that moves through accretive portions of space in time. It is, instead, an emergent electronic beast of such proportions that we can only imagine its qualities, its dimensions. Can it be ourselves? I propose as the most important invention of the past two thousand years: Distributed Networked Intelligence (DNI). DNI is the collective externalized mind, the mind we all share, the infinite oscillation of our collective consciousness interacting with itself, adding a fuller, richer dimension to what it means to be human. JOHN BROCKMAN is the author/editor of nineteen books, including By The Late John Brockman, The Third Culture: Beyond the Scientific Revolution, Digerati: Encounters with the Cyber Elite and (with Katinka Matson) How Things Are: A Science Tool-Kit for the Mind. 
He is founder of Brockman, Inc., a literary and software agency, President of Edge Foundation, Inc., founder of The Reality Club, and editor and publisher of EDGE, a Website presenting The Third Culture in action. From checker at panix.com Tue Jan 10 14:18:03 2006 From: checker at panix.com (Premise Checker) Date: Tue, 10 Jan 2006 09:18:03 -0500 (EST) Subject: [Paleopsych] Economist: Luxury: Inconspicuous consumption Message-ID: Luxury: Inconspicuous consumption http://www.economist.com/business/PrinterFriendly.cfm?story_id=5323772 Dec 20th 2005 | LONDON, NEW YORK AND PARIS Now that luxury has gone mass market, how are the super-rich to flaunt their wealth? THE recently reopened Louis Vuitton store on the Champs-Élysées is a deliberate exercise in democratic luxury. On its new, opulent art deco terraces, elegant French ladies of a certain age--the epitome of the traditional consumer of luxury fashion--rub padded shoulders with jeans-and-tee-shirt-sporting younger women who could be their daughters, or, just as easily, rap singers or a gaggle of British working-class hen-weekenders. What traditional buyers of luxury make of their nouveau co-consumers they are, of course, too civilised to say. But it seems unlikely that they consider Louis Vuitton's still-exquisite handbags, shoes and other indulgences to be quite as exclusive as before. If they continue to shop there--and the store's owner, LVMH Moët Hennessy Louis Vuitton, thinks it can extend its brand to a broader market without losing its existing customers--it may no longer be because its products are a signal of exalted social status. In this respect, LVMH's goods are by no means unique. Products and services that were once the preserve of a very wealthy few--from designer handbags to fast cars, bespoke tailoring and domestic servants--are increasingly becoming accessible, if not to everyone, then certainly to millions of people around the world. 
This may appall killjoy economists such as Robert Frank, the author a few years ago of a book condemning "Luxury Fever" in this new "era of excess". But it is arguably even more upsetting to those super-rich folk who have long been able to afford luxury, and may in one crucial respect even regard it as a necessity. As Thorstein Veblen noted over a century ago in "The Theory of the Leisure Class"--the book in which he coined the phrase "conspicuous consumption"--spending lavishly on expensive but essentially wasteful goods and services is "evidence of wealth" and the "failure to consume in due quantity and quality becomes a mark of inferiority and demerit." But in the 21st century, "being a conspicuous consumer is getting harder and harder", says James Lawson of Ledbury Research, a firm that advises luxury businesses on market trends. What does a billionaire have to do to get noticed nowadays? Luxury for the people A stone's throw from the Louis Vuitton store, the classical grandeur of the five-star Four Seasons George V hotel was the perfect Old World venue for the World Luxury Congress in October. But all the talk at the gathering was of the potentially lucrative opportunities presented by new consumers of luxury, in rich and emerging economies alike. Being a millionaire, for instance, is becoming commonplace. In 2004 there were 8.3m households worldwide with assets of at least $1m, up by 7% on a year earlier, according to the latest annual survey by Merrill Lynch and Capgemini. The newly wealthy are often desperate to affirm their status by conspicuously consuming the favoured brands of the already rich. In developed countries this can be seen, in its extreme form, in the rise of "Bling"--jewellery, diamonds and other luxuries sported initially by rappers--and Britain's unsophisticated Burberry-loving "chavs". (Burberry is considered unusually successful at tapping a broader market. 
But even it now understands that not every new customer is desirable: in January it withdrew its distinctive checked baseball caps because of their popularity with chavs.) The number of luxury buyers in the developed world is also being swelled by two other trends. First, consumers are increasingly adopting a "trading up, trading down" shopping strategy. Many traditional mid-market shoppers are abandoning middle-of-the-range products for a mix of lots of extremely cheap goods and a few genuine luxuries that they would once have thought out of their price league. Alongside this "selective extravagance" is the growth of "fractional ownership": time-shares in luxury goods and services formerly available only to those paying full price. Fractional ownership first got noticed when firms such as NetJets started selling access to private jets. It has since spread to luxury resorts, fast cars and much more. In America, From Bags to Riches--"better bags, better value"--lets less-well-off people rent designer handbags. In Britain, Damon Hill, a former racing driver, has launched P1 International. A £2,500 ($4,300) joining fee, plus annual membership of £13,750, buys around 50-70 driving days a year in cars ranging from a Range Rover Sport to a Bentley or a Ferrari. As a result, "the price of entry for much of what traditionally was available to the top 0.001% is now far lower", says Mr Lawson, who notes the sorry implications for a would-be conspicuous consumer: "How do I know if the guy who drives past me in a Ferrari owns it or is just renting it for the weekend?" Demand for luxury is also soaring from emerging economies such as Russia, India, Brazil and China. Antoine Colonna, an analyst at Merrill Lynch, estimates that last year Chinese consumers already accounted for 11% of the worldwide revenues of luxury-goods firms, with most of their buying done outside mainland China. 
He forecasts that by 2014, they will have overtaken both American and Japanese consumers, becoming the world's leading luxury shoppers, yielding 24% of global revenues. These emerging consumers have a big appetite for the top luxury brands--and the owners of those brands are increasingly keen to oblige. Russia is producing today's most determinedly conspicuous consumers. Roman Abramovich, the best-known oligarch not in jail, has conspicuously set new standards in buying mansions, ski resorts and soccer teams. Veblen revisited For the already rich, strategies such as splashing out on ever bigger houses, longer yachts or getting special treatment from luxury-goods firms do not contribute much marginal conspicuousness. Meanwhile, the list of new ways to get noticed by the masses is shrinking fast. Even space tourism--impressive in 2001, when Dennis Tito paid Russia $20m to visit the International Space Station--will soon be humdrum. As it gets ever harder to consume conspicuously, are some traditional luxury consumers giving up trying? According to Virginia Postrel, author of "The Substance of Style", conspicuous consumption is much more important when people are not far from being poor, as in today's emerging economies. In developed countries, in particular, "status is always there, but the shift in the balance is towards enjoyment". For instance, the first thing the newly super-rich tend to buy is a private plane. But that, she says, is "not so much about distinguishing themselves from the masses as not being stuck with them in a security line". In his new book, "Hypermodern Times", Gilles Lipovetsky, the favourite philosopher of LVMH's boss, Bernard Arnault, has coined the term "hyperconsumption". This is consumption which pervades ever more spheres of life and which encourages people to consume for their own pleasure rather than to enhance their social status. H.L. 
Mencken made the same point more crisply in a critique of Veblen in 1919: "Do I prefer kissing a pretty girl to kissing a charwoman because even a janitor may kiss a charwoman--or because the pretty girl looks better, smells better and kisses better?" Yet rather than abandoning status anxiety, the way the rich seek to display status may simply be getting more complex. As inequality grows again in rich countries, some of the very rich worry about consumption that is so conspicuous to the masses that it provokes them to try to take their wealth away. Some car-industry experts blame weak sales of the latest luxury limousines on this fear. As well as traditional conspicuous consumption and "self-treating", Ledbury Research identifies two other motives that are driving buying by the rich: connoisseurship and being an "early adopter". Both are arguably consumption that is conspicuous only to those you really want to impress. Connoisseurs are people whom their friends respect for their deep knowledge of, say, fine wine or handmade Swiss watches. Early adopters are those who are first with a new technology. Silicon Valley millionaires currently impress their friends by buying an amphibious vehicle to avoid the commuter traffic on the Bay Bridge. Several millionaires have already paid $50,000 a go to clone their pet cat. In America, at least, says Marian Salzman, a leading trendspotter, the focus of conspicuous consumption is increasingly on getting your children into the best schools and universities. Harvard may be today's ultimate luxury good. Getting into the right clubs is also as important a social statement as ever. America's young wealthy may currently be seen at the Core Club in New York: membership is by invitation only, with a joining fee of $55,000 plus annual dues of $12,000. But perhaps the true symbol of exalted status in the era of mass luxury is conspicuous non-consumption. 
This is not just the growing tendency of the very rich to dress scruffily and drive beaten-up cars, as described by David Brooks in "Bobos in Paradise". It is showing that you have more money than you know how to spend. So, for example, philanthropy is increasingly fashionable, and multi-billion-dollar endowments such as the Bill and Melinda Gates Foundation are certainly conspicuous. However, since the new philanthropists are keen to demonstrate that their giving produces results, this does not quite meet Veblen's threshold of being a complete waste of money. So the laurels surely go to those who are so wealthy that they are willing to buy adverts encouraging the state to tax them. Kudos, then, to those conspicuously non-consuming wealthy American opponents of recent efforts to abolish estate taxes: George Soros, Bill Gates senior (the father of the world's richest man) and Warren Buffett. From checker at panix.com Tue Jan 10 14:18:21 2006 From: checker at panix.com (Premise Checker) Date: Tue, 10 Jan 2006 09:18:21 -0500 (EST) Subject: [Paleopsych] Economist: China: Catching up Message-ID: China: Catching up http://www.economist.com/world/asia/PrinterFriendly.cfm?story_id=5327842 Dec 20th 2005 | BEIJING China's economy is overtaking Britain's "SURPASS Britain and catch up with America": Mao Zedong's slogan is still widely remembered in China. Though his reckless idealism is now officially discredited, there has been satisfaction this week at the results of a reassessment of the country's GDP data. China's economy, the new figures show, is on the verge of overtaking Britain's. The revision of China's GDP figures was of an order reminiscent of Mao's liberal adjustment of statistics to show his campaign was on target. Now, however, it is the higher figure that is the more credible one. Economists have long believed that China's GDP has been considerably understated thanks mainly to poor measuring of privately run services. 
The country's first economic census, launched in January, showed that in 2004 it was some 16.8% bigger than the previously announced figure of 13.7 trillion yuan ($1.7 trillion). In dollar terms at official exchange rates, this means that China replaced Italy as the world's sixth-biggest economy last year. In 2005, it almost certainly surpassed France, and probably squeaked past Britain too. It adds some $284 billion to China's GDP for last year, a figure almost the size of Taiwan's total. The survey also confirms some long-held suspicions about China's economic make-up: that its service sector is bigger than the one-third of GDP suggested by the old figures (the new data show more than 40%), that consumption is also higher and that investment and savings as a proportion of GDP are lower. This lot of figures look more sustainable than the old lot. But they are still only a best guess at the truth. And, at a sixth the size of America's, China's economy still has a bit of catching-up to do.

From checker at panix.com Tue Jan 10 14:19:11 2006
From: checker at panix.com (Premise Checker)
Date: Tue, 10 Jan 2006 09:19:11 -0500 (EST)
Subject: [Paleopsych] CATO: Peace on Earth? Try Free Trade among Men
Message-ID:

Peace on Earth? Try Free Trade among Men
http://cato.org/pub_display.php?pub_id=5344&print=Y
December 28, 2005

[I agree with this completely, even though I have only scanned it. I know where the article is coming from, that's all. I should ask, though, whether the free movement of goods will entail irresistible pressure for the free movement of people. It's the people that generate externalities, some positive but (in the United States) mostly negative. There is a debate among libertarians whether to keep the ideal of the free movement of people at bay until such time as immigrants will stop using their votes to enrich themselves and limit liberty. (Dumbing down the population is apparently not on their radar screen.)
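[An aside on the GDP-revision arithmetic in the Economist item above: the $284 billion figure follows from applying the 16.8% census revision to the old 13.7 trillion yuan total, converted at roughly the official exchange rate of the time. The rate of about 8.1 yuan per dollar below is my own approximation, not a figure from the article:

```python
# Rough check of the Economist's GDP-revision arithmetic.
old_gdp_yuan = 13.7e12   # previously announced 2004 GDP, in yuan
revision = 0.168         # census found GDP some 16.8% bigger
yuan_per_usd = 8.1       # assumed official exchange rate, late 2005

added_yuan = old_gdp_yuan * revision
added_usd = added_yuan / yuan_per_usd
print(f"Added output: {added_usd / 1e9:.0f} billion USD")  # roughly the $284 billion cited
```

The same rate also reproduces the article's $1.7 trillion dollar figure for the old yuan total, so the numbers are internally consistent.]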
[The pressure is not irresistible in Japan and Israel, but the Japanese think they are utterly and biologically unique among the world's people and the Jews have intense ingroup solidarity.

[I also add that the Internet is another powerful force for peace. Those that communicate with one another will resist fighting, just as those who want to keep trade moving.

[The old days when socialists were convinced that "capitalism" causes war by enriching arms manufacturers are not gone. They have been replaced with oil companies. How oil imports, which amount to less than one percent of GDP, can exert this much pressure is not clear to me.

[The author is only mildly right about capitalism being a force for expanding "democracy," very much like sociologists are only mildly right about modernization bringing secularization. The Chinese, so far, are proving to be a counter-example.]

by Daniel T. Griswold

Daniel Griswold is director of the Cato Institute Center for Trade Policy Studies.

Buried beneath the daily stories about car bombs and insurgents is an underappreciated but comforting fact during this Christmas season: The world has somehow become a more peaceful place. As one little-noticed headline on an Associated Press story recently reported, "War declining worldwide, studies say." According to the Stockholm International Peace Research Institute, the number of armed conflicts around the world has been in decline for the past half-century. In just the past 15 years, ongoing conflicts have dropped from 33 to 18, with all of them now civil conflicts within countries. As 2005 draws to an end, no two nations in the world are at war with each other. The death toll from war has also been falling. According to the AP story, "The number killed in battle has fallen to its lowest point in the post-World War II period, dipping below 20,000 a year by one measure. Peacemaking missions, meanwhile, are growing in number."
Those estimates are down sharply from annual tolls ranging from 40,000 to 100,000 in the 1990s, and from a peak of 700,000 in 1951 during the Korean War. Many causes lie behind the good news -- the end of the Cold War and the spread of democracy, among them -- but expanding trade and globalization appear to be playing a major role. Far from stoking a "World on Fire," as one misguided American author has argued, growing commercial ties between nations have had a dampening effect on armed conflict and war, for three main reasons. First, trade and globalization have reinforced the trend toward democracy, and democracies don't pick fights with each other. Freedom to trade nurtures democracy by expanding the middle class in globalizing countries and equipping people with tools of communication such as cell phones, satellite TV, and the Internet. With trade comes more travel, more contact with people in other countries, and more exposure to new ideas. Thanks in part to globalization, almost two thirds of the world's countries today are democracies -- a record high. Second, as national economies become more integrated with each other, those nations have more to lose should war break out. War in a globalized world not only means human casualties and bigger government, but also ruptured trade and investment ties that impose lasting damage on the economy. In short, globalization has dramatically raised the economic cost of war. Third, globalization allows nations to acquire wealth through production and trade rather than conquest of territory and resources. Increasingly, wealth is measured in terms of intellectual property, financial assets, and human capital. Those are assets that cannot be seized by armies. If people need resources outside their national borders, say oil or timber or farm products, they can acquire them peacefully by trading away what they can produce best at home. Of course, free trade and globalization do not guarantee peace. 
Hot-blooded nationalism and ideological fervor can overwhelm cold economic calculations. But deep trade and investment ties among nations make war less attractive. Trade wars in the 1930s deepened the economic depression, exacerbated global tensions, and helped to usher in a world war. Out of the ashes of that experience, the United States urged Germany, France, and other Western European nations to form a common market that has become the European Union. In large part because of their intertwined economies, a general war in Europe is now unthinkable. In East Asia, the extensive and growing economic ties among Mainland China, Japan, South Korea, and Taiwan are helping to keep the peace. China's Communist rulers may yet decide to go to war over its "renegade province," but the economic cost to their economy would be staggering and could provoke a backlash among Chinese citizens. In contrast, poor and isolated North Korea is all the more dangerous because it has nothing to lose economically should it provoke a war. In Central America, countries that were racked by guerrilla wars and death squads two decades ago have turned not only to democracy but to expanding trade, culminating in the Central American Free Trade Agreement with the United States. As the Stockholm institute reports in its 2005 Yearbook, "Since the 1980s, the introduction of a more open economic model in most states of the Latin American and Caribbean region has been accompanied by the growth of new regional structures, the dying out of interstate conflicts and a reduction in intra-state conflicts." Much of the political violence that remains in the world today is concentrated in the Middle East and Sub-Saharan Africa -- the two regions of the world that are the least integrated into the global economy. Efforts to bring peace to those regions must include lowering their high barriers to trade, foreign investment, and domestic entrepreneurship.
Advocates of free trade and globalization have long argued that trade expansion means more efficiency, higher incomes, and reduced poverty. The welcome decline of armed conflicts in the past few decades indicates that free trade also comes with its own peace dividend.

From checker at panix.com Tue Jan 10 14:22:12 2006
From: checker at panix.com (Premise Checker)
Date: Tue, 10 Jan 2006 09:22:12 -0500 (EST)
Subject: [Paleopsych] Hermenaut: Whatever Works, Sucks
Message-ID:

Whatever Works, Sucks
http://www.hermenaut.com/a51.shtml

Rational Exuberance: The Influence of Generation X on the New Economy
by Meredith Bagby (Dutton, 1998)

If, as A.S. Hamrah and Chris Fujiwara claim in their online Club Havana film review series, "Things That Don't Suck" is the new aesthetic category of the '90s--they explain that "the advertising and publicist types who employ this as a category of worth want you to believe that just because whatever it is they're offering isn't 100 percent offensive or repellent means it's somehow great"--then "Whatever Works" is the corresponding ethical category of the era. These are the debased formulae of people who not only don't have any standards of truth, beauty, or morality, but to whom the whole concept of standards is suspect. Of course we should be skeptical of received notions of truth, beauty, and morality, but skepticism isn't cynicism and even cynicism isn't as grass-eatingly low as the kind of slickly hip will-to-power-passing-as-pragmatism expressed in these phrases. Yet, now that the End of Ideology and What is Art? debates have finally trickled down (up?) from the academy to the street, these aggressively anti-standard standards have become cant in the mouths of would-be Gen X (no room to go into, for the nth time, how misleading and pointless this term is) spokespeople who should know better.
Meredith Bagby is one of these--she's of the "Whatever Works" school--and this review is of her recently-published and highly-praised book Rational Exuberance: The Influence of Generation X on the New American Economy. Rational Exuberance was, I imagine, sold to Dutton as a happy-smiley slap in the face to all those whiny twentysomethings still bitching about how they couldn't afford to go to college and have to live with their parents. As such, it contains a bewildering onslaught of sidebars profiling "key Gen X players": fund managers, speechwriters, political activists, entrepreneurs, hacks at such Gen X media ghettoes as Swing magazine (which, ironically, just bit the dust) and CNN's Financial Network: experts all, one is led to believe, in the science of Gen X-ology. "We" anti-slackers, it seems, are "taking on the economic challenges inherited from older generations" and influencing the American economy through "greater savings plans, successful entrepreneurship, education reform, and bottom-line politics" --I quote from the press release. But as annoying a book as that in itself would have been, Rational Exuberance is worse than that, much worse. Americans born after 1964, "economist [and] renowned Gen X'er Bagby" (press release again) suggests in her introduction, have been unfairly characterized as depressed, directionless losers. But in fact, she counters, what look to "our" (I can't go on putting "we" and "our" in quotes, so I'll just stop here) elders like weaknesses--our deep suspicion regarding the idea of a "steady job," our inborn mistrust of Republicans and Democrats alike, our "No Future" certainty that Social Security will be bankrupt by the time we're 65, even though we're paying into it now--are actually strengths. Scorning the idea of climbing the corporate ladder, we're a generation of risk-taking entrepreneurs! The non-voting apathy of our older brothers and sisters has obscured our own grassroots activism! 
And as for the Social Security thing, well, some of us are lobbying for entitlement reform, while others re-learn the lost art of planning for the future! "Compound interest" sound familiar? But I don't really have a problem with any of this. (OK, I do have a problem with the idea that any intelligent person could possibly imagine that what the Sex Pistols were talking about in "No Future" had anything to do with vanishing welfare for the elderly.) What bugs me is the whole "Whatever Works" thing. According to Bagby, who brandishes charts, graphs, and public opinion polls (the first profile she offers is of a hip Republican pollster, already proving her book is a work of fiction) to prove her various points, "We [Gen X'ers] are not worried about staying within the guidelines of any particular [moral, political, philosophical] system; rather, we seek the avenue that produces the greatest results. We adjust, maneuver, manipulate our choices around what seems to work in today's complex world." "Results" is the red flag here: The holy grail of a certain kind of economist, results by definition justify the means used to achieve them. In fact, Bagby takes great pains to dismiss those dangerously idealistic types (she includes lawyers!) for whom means cannot be separated from ends. In Chapter 2, "A New Moral Order: The Age of Economics," Bagby forcefully insists that religious, philosophical, aesthetic, and even political lenses for viewing social issues may have been fine for past generations, but they just don't make the cut today. On the abortion question, she scoffs, "a philosopher might begin the debate with questions" (a word I imagine her saying with a sour expression on her face) about when life begins, moral responsibility and choice, and so forth; and a lawyer would waste everybody's time mucking around with issues of legal precedent. 
An economist, however, would cut through all the mumbo-jumbo by seeking "what is common among differing factions and build[ing] a solution." Moralists and politicians take their lumps, too, for being so wishy-washy over the death penalty. When it comes down to a choice between execution and life imprisonment, Bagby airily concludes, whichever one costs society less money is the right answer. Economists like herself, Bagby boasts, seek results, results, results. Unlike those querulous old fogies who'd hold us all to one universal ideal or another, economics is about the greatest benefit for the greatest number, and it's not afraid to write everyone who isn't part of the greatest number off as a loss. To Bagby, this is what the world needs now. With the end of the Cold War, she suggests, came the long-heralded End of Ideology. Democracy and capitalism have won, and the only challenge that remains is making the system work, dammit. "Our role in American history seems clear: If our parents' generation was about dismantling the status quo, our generation will be about building new institutions, moral codes, families, churches, corporations ... If our parents were the revolutionaries, then let us be the rebuilders." To this exalted end, Bagby (or should that be Carpetbagby?) offers pat, sketchy advice on everything from rebuilding our schools to rebuilding our communities, our families, and the economy. Of course, much of what passes for advice--on, for example, not being manipulated in the workplace, or by wily advertisers--is actually how-to advice for the book's real audience: those seeking advice on exploiting and manipulating Gen X'ers. But that's obvious, right? I could go on and on (and on) about what's wrong with the methods and conclusions of Rational Exuberance. All I want to accomplish here is a criticism of this "Whatever Works" shit, but it's hard not to get sidetracked.
So let's take a leaf from Bagby's book and compromise: I'll complain about something off-topic just one more time, and then get back to criticizing her philosophy. Not happy with this solution? Apparently, in the new roll-up-your-shirtsleeves-and-lose-those-pesky-opinions age, we'd all better get used to feeling that way. "Fact: According to the U.S. Census Bureau, people aged 24 to 35 work 4.6 percent longer each week than the national average." This context-free statistic is a typical example of Bagby's misuse of the tools of social science to make a seemingly objective point. This particular nugget of data is supposed to disprove the media myth that all Gen X'ers are slackers. I question whether this statistic actually means "we" (sorry, couldn't resist) are hard workers: Doesn't it most likely mean, instead, that twentysomething temp-slave types are more exploited than your average non-Gen X worker? And, while we're on the subject, what's wrong with not wanting to work hard? Is there no middle ground between couch potato and millionaire-in-the-making? ("The 'X' in Generation X is the symbol for multiplication," Bagby sound-bites. "For us the symbol strikes a chord because the most successful entrepreneurs don't win by adding dollars--they win by multiplying dollars.") Unafraid to contradict herself, Bagby goes on to pay lip service to the idea that Gen X'ers just seem lazy because we refuse to sacrifice our "lifestyles" to our careers, that in fact we work less than our elders. As anyone who has worked for the kind of hotshot entrepreneur Bagby admiringly profiles knows, however, pulling insanely long work days is a lifestyle for them and that goes for everyone at their companies, like it or not. (I should confess here that I have worked for one of the entrepreneurs Bagby profiles!) 
This irresolvable contradiction hints at the special problem Bagby faced in assembling this book: Her real audience and her pretend audience want to hear different things, and while the latter will just get pissed at her, the former is the one buying the book. OK, let's move on. All social problems-- "the overextension of our government, growing economic inequality, increasing ethnic rumblings, deteriorating education systems, apathy from our citizenry [this kind of phrasing makes me worry that the author is planning to run for office], and an oversimplification of our problems by media and politicians" --can be solved by applying the economist's way of thinking, according to Bagby. "We have entered upon a new era of logic and numbers," she trumpets: "To give us the answers, we have elevated a new breed of experts. Other ages had prophets, witch doctors, soothsayers, and voodoo gods [you might know them as social scientists, political analysts, philosophers]. We have economists." Quoting Francis Fukuyama, Bagby argues that with the worldwide triumph of liberal democracy and free-market capitalism, we have arrived at not just the end of ideology, but the much-anticipated end of history--so the Big Questions of the past (freedom of speech, liberty, equality) just don't matter any more. Americans want "a government that W-O-R-K-S," Bagby argues, and Gen X is the generation that's uniquely qualified, precisely because it lacks any faith in ideals or standards, to give these Americans the government they deserve. "It is not about ideology. It is about practicality." In 1933, a twentysomething Marxist sympathizer named Simone Weil shocked her comrades by announcing her fear that the actual outcome of the much-anticipated proletarian revolution would not be the replacement of capitalism by socialism, but something much worse: the replacement of capitalism and socialism alike with a post-ideological society run by "technicians" [i.e. 
experts for whom the ends justify the means], particularly economists. After all, Weil pointed out, the only thing worse than oppression exercised in the name of some philosophical or religious or even political ideal is "oppression exercised in the name of function." Bagby's central obsession, entitlement reform--that political tar baby which at least serves as a sort of flypaper for annoying trust-funders like the ones who started the "twentysomething think-tank" Third Millennium--is precisely the kind of technocratic "issue" guaranteed to alienate and depress anyone still possessing a shred of political passion. Although Weil was deeply suspicious of all ideologies, she was an unapologetic idealist. If Weil's vision of a society without oppression has yet to materialize, her writing at least continues to inspire; it's hard to imagine anyone re-reading Bagby in, say, 1999. In fact, Weil's self-imposed alienation from all causes and communities is precisely what the original prophet of the End of Ideology, sociologist Daniel Bell, advocated: Detachment and idealism are not, after all, mutually exclusive. Bagby's "Whatever Works," on the other hand, though surrounded by weakly worded expressions of idealism, is in fact subversive of same. And whenever someone starts subverting idealism, we need to ask: To whose profit? The answer, I think, was expressed by Lewis Mumford, one of those "witch doctors" or "soothsayers" of a past generation, when he wrote (in Technics and Civilization) that "the idea that values could be dispensed with and replaced by mechanical or mathematical solutions is just another ideology" --a utilitarian one, whose only truly hoped-for result is the "clean victory" of capitalism over previous traditions, loyalties, and sentiments. Like every single carpetbagger who's ever descended upon a war-torn landscape, preaching about new brooms sweeping clean, Bagby is not to be trusted. Whenever she starts spelling W-O-R-K-S, she means M-O-N-E-Y. 
Beneath her shopworn Gen X insouciance lies an age-old cynical utilitarianism which knows the price of everything and the value of nothing--and that really sucks.

From checker at panix.com Tue Jan 10 14:23:16 2006
From: checker at panix.com (Premise Checker)
Date: Tue, 10 Jan 2006 09:23:16 -0500 (EST)
Subject: [Paleopsych] Leland B. Yeager: Monarchy: Friend of Liberty
Message-ID:

Leland B. Yeager: Monarchy: Friend of Liberty
http://www.angelfire.com/in3/theodore/opinion/articles/yeager.html

[I audited a course under Mr. Yeager in my last undergraduate year at UVa. I was to switch to economics there in graduate school. Mr. Yeager made me write a paper like everyone else in the class. In graduate school, he was on sabbatical somewhere else when I would have taken his famed course in macroeconomics. As it happened, his substitute was the worst teacher I ever had, tied with another one who taught econometrics. No one in my class got an excellent on the Ph.D. candidacy examinations, as a result. I got a satisfactory minus, minus, since I flat out flunked the macroeconomics part. I just could not handle patently arbitrary, if not absurd, assumptions that are built into macroeconomics, even though I was extremely gifted in math, having taken most of the graduate math courses UVa offered when I was an undergraduate. I simply had a severe mental block to the whole thing, though perhaps today I'd just play the game.

[I now THANK those bad professors. Both macroeconomics and econometrics are bottomless pits, and I am grateful that I did not waste what might have been a hugely successful career, which (I *am* or rather was, that talented in math) might have gotten me a Nobel Prize in macroeconomics, provided I would have dedicated myself as a calling from the Lord to do the steady and persistent day in and day out work that is required. Such a dedication is not part of my make up by a very long shot, so the matter is moot.
I'm more a specialist in looking across disciplines, as I think my notes to these forwardings amply demonstrate, but I keep resolving to narrow my reading and have said so over and over again, only to lapse back into wide reading again and again.]

A LIBERTARIAN CASE FOR MONARCHY

Democracy and Other Good Things

Clear thought and discussion suffer when all sorts of good things, like liberty, equality, fraternity, rights, majority rule, and general welfare--some in tension with others--are marketed together under the portmanteau label "democracy". Democracy's core meaning is a particular method of choosing, replacing, and influencing government officials (Schumpeter 1950/1962). It is not a doctrine of what government should and should not do. Nor is it the same thing as personal freedom or a free society or an egalitarian social ethos. True enough, some classical liberals, like Thomas Paine (1791-1792/1989) and Ludwig von Mises (1919/1983), did scorn hereditary monarchy and did express touching faith that representative democracy would choose excellent leaders and adopt policies truly serving the common interest. Experience has taught us better, as the American Founders already knew when constructing a government of separated and limited powers and of only filtered democracy. As an exercise, and without claiming that my arguments are decisive, I'll contend that constitutional monarchy can better preserve people's freedom and opportunities than democracy as it has turned out in practice.^1 My case holds only for countries where maintaining or restoring (or conceivably installing) monarchy is a live option.^2 We Americans have sounder hope of reviving respect for the philosophy of our Founders. Our traditions could serve some of the functions of monarchy in other countries. An unelected absolute ruler could conceivably be a thoroughgoing classical liberal.
Although a wise, benevolent, and liberal-minded dictatorship would not be a contradiction in terms, no way is actually available to assure such a regime and its continuity, including frictionless succession. Some element of democracy is therefore necessary; totally replacing it would be dangerous. Democracy allows people some influence on who their rulers are and what policies they pursue. Elections, if not subverted, can oust bad rulers peacefully. Citizens who care about such things can enjoy a sense of participation in public affairs. Anyone who believes in limiting government power for the sake of personal freedom should value also having some nondemocratic element of government besides courts respectful of their own narrow authority. While some monarchists are reactionaries or mystics, others (like Erik von Kuehnelt-Leddihn and Sean Gabb, cited below) do come across as genuine classical liberals.

Shortcomings of Democracy

Democracy has glaring defects.^3 As various paradoxes of voting illustrate, there is no such thing as any coherent will of the people. Government itself is more likely to supply the content of any supposed general will (Constant 1814-15/1988, p. 179). Winston Churchill reputedly said: "The best argument against democracy is a five-minute conversation with the average voter" (BrainyQuote and several similar sources on the Internet). The ordinary voter knows that his vote will not be decisive and has little reason to waste time and effort becoming well informed anyway. This "rational ignorance," so called in the public-choice literature, leaves corresponding influence to other-than-ordinary voters (Campbell 1999). Politics becomes a squabble among rival special interests. Coalitions form to gain special privileges. Legislators engage in logrolling and enact omnibus spending bills. Politics itself becomes the chief weapon in a Hobbesian war of all against all (Gray 1993, pp. 211-212).
The diffusion of costs while benefits are concentrated reinforces apathy among ordinary voters. Politicians themselves count among the special-interest groups. People who drift into politics tend to have relatively slighter qualifications for other work. They are entrepreneurs pursuing the advantages of office. These are not material advantages alone, for some politicians seek power to do good as they understand it. Gratifying their need to act and to feel important, legislators multiply laws to deal with discovered or contrived problems--and fears. Being able to raise vast sums by taxes and borrowing enhances their sense of power, and moral responsibility wanes (as Benjamin Constant, pp. 194-196, 271-272, already recognized almost two centuries ago). Democratic politicians have notoriously short time horizons. (Hoppe (2001) blames not just politicians in particular but democracy in general for high time preference--indifference to the long run--which contributes to crime, wasted lives, and a general decline of morality and culture.) Why worry if popular policies will cause crises only when one is no longer running for reelection? Evidence of fiscal irresponsibility in the United States includes chronic budget deficits, the explicit national debt, and the still huger excesses of future liabilities over future revenues on account of Medicare and Social Security. Yet politicians continue offering new plums. Conflict of interest like this far overshadows the petty kinds that nevertheless arouse more outrage. Responsibility is diffused in democracy not only over time but also among participants. Voters can think that they are only exercising their right to mark their ballots, politicians that they are only responding to the wishes of their constituents. The individual legislator bears only a small share of responsibility fragmented among his colleagues and other government officials. Democracy and liberty coexist in tension.
Nowadays the United States government restricts political speech. The professed purpose of campaign-finance reform is to limit the power of interest groups and of money in politics, but increased influence of the mass media and increased security of incumbent politicians are likelier results. A broader kind of tension is that popular majorities can lend an air of legitimacy to highly illiberal measures. "By the sheer weight of numbers and by its ubiquity the rule of 99 per cent is more hermetic and more oppressive than the rule of 1 per cent" (Kuehnelt-Leddihn 1952, p. 88). When majority rule is thought good in its own right and the fiction prevails that we--ordinary citizens--are the government, an elected legislature and executive can get away with impositions that monarchs of the past would scarcely have ventured. Louis XIV of France, autocrat though he was, would hardly have dared prohibit alcoholic beverages, conscript soldiers, and levy an income tax (Kuehnelt-Leddihn, pp. 280-281)--or, we might add, wage war on drugs. Not only constitutional limitations on a king's powers but also his^4 not having an electoral mandate is a restraint. At its worst, the democratic dogma can abet totalitarianism. History records totalitarian democracies or democratically supported dictatorships. Countries oppressed by communist regimes included words like "democratic" or "popular" in their official names. Totalitarian parties have portrayed their leaders as personifying the common man and the whole nation. German National Socialism, as Kuehnelt-Leddihn reminds us, was neither a conservative nor a reactionary movement but a synthesis of revolutionary ideas tracing to before 1789 (pp. 131, 246-247, 268). He suggests that antimonarchical sentiments in the background of the French Revolution, the Spanish republic of 1931, and Germany's Weimar Republic paved the way for Robespierre and Napoleon, for Negrin and Franco, and for Hitler (p. 90).
Winston Churchill reportedly judged that had the Kaiser remained German Head of State, Hitler could not have gained power, or at least not have kept it (International Monarchist League). "[M]onarchists, conservatives, clerics and other reactionaries were always in bad grace with the Nazis" (Kuehnelt-Leddihn, p. 248).

Separation of Powers

A nonelected part of government contributes to the separation of powers. By retaining certain constitutional powers or denying them to others, it can be a safeguard against abuses.^5 This is perhaps the main modern justification of hereditary monarchy: to put some restraint on politicians rather than let them pursue their own special interests complacent in the thought that their winning elections demonstrates popular approval. When former president Theodore Roosevelt visited Emperor Franz Joseph in 1910 and asked him what he thought the role of monarchy was in the twentieth century, the emperor reportedly replied: "To protect my peoples from their governments" (quoted in both Thesen and Purcell 2003). Similarly, Lord Bernard Weatherill, former speaker of the House of Commons, said that the British monarchy exists not to exercise power but to keep other people from having the power; it is a great protection for our democracy (interview with Brian Lamb on C-Span, 26 November 1999). The history of England shows progressive limitation of royal power in favor of parliament; but, in my view, a welcome trend went too far. Almost all power, limited only by traditions fortunately continuing as an unwritten constitution, came to be concentrated not only in parliament but even in the leader of the parliamentary majority. Democratization went rather too far, in my opinion, in the Continental monarchies also.

Continuity

A monarch, not dependent on being elected and reelected, embodies continuity, as does the dynasty and the biological process. Constitutional monarchy offers us ... that neutral power so indispensable for all regular liberty.
In a free country the king is a being apart, superior to differences of opinion, having no other interest than the maintenance of order and liberty. He can never return to the common condition, and is consequently inaccessible to all the passions that such a condition generates, and to all those that the perspective of finding oneself once again within it necessarily creates in those agents who are invested with temporary power. It is a master stroke to create a neutral power that can terminate some political danger by constitutional means (Constant, pp. 186-187). In a settled monarchy (though no regime whatever can be guaranteed perpetual existence) the king need not worry about clinging to power. In a republic, "The very head of the state, having no title to his office save that which lies in the popular will, is forced to haggle and bargain like the lowliest office-seeker" (Mencken 1926, p. 181). Dynastic continuity parallels the rule of law. The king symbolizes a state of affairs in which profound political change, though eventually possible, cannot occur without ample time for considering it. The king stands in contrast with legislators and bureaucrats, who are inclined to think, by the very nature of their jobs, that diligent performance means multiplying laws and regulations. Continuity in the constitutional and legal regime provides a stable framework favorable to personal and business planning and investment and to innovation in science, technology, enterprise, and culture. Continuity is neither rigidity nor conservatism. The heir to the throne typically has many years of preparation and is not dazzled by personal advancement when he finally inherits the office. Before and while holding office he accumulates a fund of experience both different from and greater than what politicians, who come and go, can ordinarily acquire.
Even when the king comes to the throne as a youth or, at the other extreme, as an old man with only a few active years remaining, he has the counsel of experienced family members and advisors. If the king is very young (Louis XV, Alfonso XIII) or insane (the elderly George III, Otto of Bavaria), a close relative serves as regent.^6 The regent will have had some of the opportunities to perform ceremonial functions and to accumulate experience that an heir or reigning monarch has.

Objections and Rebuttals

Some arguments occasionally employed for monarchy are questionable. If the monarch or his heir may marry only a member of a princely family (as Kuehnelt-Leddihn seems to recommend), chances are that he or she will marry a foreigner, providing international connections and a cosmopolitan way of thinking. Another dubious argument (also used by Kuehnelt-Leddihn) is that the monarch will have the blessing of and perhaps be the head of the state religion. Some arguments are downright absurd, for example: "Monarchy fosters art and culture. Austria was culturally much richer around 1780 than today! Just think of Mozart!" (Thesen). But neither all arguments for nor all objections to monarchy are fallacious. The same is true of democracy. In the choice of political institutions, as in many decisions of life, all one can do is weigh the pros and cons of the options and choose what seems best or least bad on balance. Some objections to monarchy apply to democracy also or otherwise invite comments that, while not actual refutations, do strengthen the case in its favor. Monarchy is charged with being government-from-above (Kuehnelt-Leddihn, p. 276). But all governments, even popularly elected ones, except perhaps small direct democracies like ancient Athens, are rule by a minority. (Robert Michels and others recognized an "iron law of oligarchy"; Jenkin 1968, p. 282.) Although democracy allows the people some influence over the government, they do not and cannot actually run it.
Constitutional monarchy combines some strengths of democracy and authoritarian monarchy while partially neutralizing the defects of those polar options. Another objection condemns monarchy as a divisive symbol of inequality; it bars an ideal society in which everyone will be equal in status, and in which everyone will have the right, if not the ability, to rise to the highest position (Gabb 2002, who replies that attempts to create such a society have usually ended in attacks on the wealthy and even the well-off). Michael Prowse (2001), calling for periodic referendums on whether to keep the British monarchy, invokes what he considers the core idea of democracy: all persons equally deserve respect and consideration, and no one deserves to dominate others. The royal family and the aristocracy, with their titles, demeanor, and self-perpetuation, violate this democratic spirit. In a republican Britain, every child might aspire to every public position, even head of state. So arguing, Prowse stretches the meaning of democracy from a particular method of choosing and influencing rulers to include an egalitarian social ethos. But monarchy need not obstruct easy relations among persons of different occupations and backgrounds; a suspicious egalitarianism is likelier to do that. In no society can all persons have the same status. A more realistic goal is that everyone have a chance to achieve distinction in some narrow niche important to him. Even in a republic, most people by far cannot realistically aspire to the highest position. No one need feel humbled or ashamed at not ascending to an office that simply was not available. A hereditary monarch can be "like the Alps" (Thesen), something just there. Perhaps it is the king's good luck, perhaps his bad luck, to have inherited the privileges but also the limitations of his office; but any question of unfairness pales in comparison with advantages for the country. Prowse complains of divisiveness. But what about an election?
It produces losers as well as winners, disappointed voters as well as happy ones. A king, however, cannot symbolize defeat to supporters of other candidates, for there were none. A monarch mounting the throne of his ancestors follows a path on which he has not embarked of his own will. Unlike a usurper, he need not justify his elevation (Constant, p. 88). He has no further political opportunities or ambitions except to do his job well and maintain the good name of his dynasty. Standing neutral above party politics, he has a better chance than an elected leader of becoming the personified symbol of his country, a focus of patriotism and even of affection. The monarch and his family can assume ceremonial functions that elected rulers would otherwise perform as time permitted. Separating ceremonial functions from campaigning and policymaking siphons off glamor or adulation that would otherwise accrue to politicians and especially to demagogues. The occasional Hitler does arouse popular enthusiasm, and his opponents must prudently keep a low profile. A monarch, whose power is preservative rather than active (Constant, pp. 191-192), is safer for peoples freedom. Prowse is irritated rather than impressed by the pomp and opulence surrounding the Queen. Clinging to outmoded forms and ascribing importance to unimportant things reeks of collective bad faith and corrosive hypocrisy. Yet a monarchy need not rest on pretense. On the contrary, my case for monarchy is a utilitarian one, not appealing to divine right or any such fiction. Not all ritual is to be scorned. Even republics have Fourth of July parades and their counterparts. Ceremonial trappings that may have become functionless or comical can evolve or be reformed. Not all monarchies, as Prowse recognizes, share with the British the particular trappings that irritate him. 
A case, admittedly inconclusive, can be made for titles of nobility (especially for close royal relatives) and for an upper house of parliament of limited powers whose members, or some of them, hold their seats by inheritance or royal appointment (e.g., Constant, pp. 198-200). "The glory of a legitimate monarch is enhanced by the glory of those around him. ... He has no competition to fear. ... But where the monarch sees supporters, the usurper sees enemies" (Constant, p. 91; on the precarious position of a nonhereditary autocrat, compare Tullock 1987). As long as the nobles are not exempt from the laws, they can serve as a kind of framework of the monarchy. They can be a further element of diversity in the social structure. They can provide an alternative to sheer wealth or notoriety as a source of distinction and so dilute the fawning over celebrities characteristic of modern democracies. Ordinary persons need no more feel humiliated by not being born into the nobility than by not being born heir to the throne. On balance, though, I am ambivalent about a nobility.

A King's Powers

Michael Prowse's complaint about the pretended importance of unimportant things suggests a further reason why the monarch's role should go beyond the purely symbolic and ceremonial. The king should not be required (as the Queen of England is required at the opening of parliament) merely to read words written by the cabinet. At least he should have the three rights that Walter Bagehot identified in the British monarchy: "the right to be consulted, the right to encourage, the right to warn. And a king of great sense and sagacity would want no others. He would find that his having no others would enable him to use these with singular effect" (Bagehot 1867/1872/1966, p. 111). When Bagehot wrote, the Prime Minister was bound to keep the Queen well informed about the passing politics of the nation.
She has by rigid usage a right to complain if she does not know of every great act of her Ministry, not only before it is done, but while there is yet time to consider it, while it is still possible that it may not be done. A sagacious king could warn his prime minister with possibly great effect. He might not always turn his course, but he would always trouble his mind. During a long reign he would acquire experience that few of his ministers could match. He could remind the prime minister of bad results some years earlier of a policy like one currently proposed. The king would indeed have the advantage which a permanent under-secretary has over his superior, the Parliamentary secretary: that of having shared in the proceedings of the previous Parliamentary secretaries. "... A pompous man easily sweeps away the suggestions of those beneath him. But though a minister may so deal with his subordinate, he cannot so deal with his king" (Bagehot, pp. 111-112). A prime minister would be disciplined, in short, by having to explain the objective (not merely the political) merits of his policies to a neutral authority. The three rights that Bagehot listed should be interpreted broadly, in my view, or extended. Constant (p. 301) recommends the right to grant pardons as a final protection of the innocent. The king should also have power: to make some appointments, especially of his own staff, not subject to veto by politicians; to consult with politicians of all parties to resolve an impasse over who might obtain the support or acquiescence of a parliamentary majority; and to dismiss and temporarily replace the cabinet or prime minister in extreme cases. (I assume a parliamentary system, which usually does accompany modern monarchy; but the executive could be elected separately from the legislators and even subject to recall by special election.) Even dissolving parliament and calling new elections in an exceptional case is no insult to the rights of the people.
On the contrary, when elections are free, it is an appeal made to their rights in favor of their interests (Constant, p. 197). The king should try to rally national support in a constitutional crisis (as when King Juan Carlos intervened to foil an attempted military coup in 1981).

Kings and Politicians

What if the hereditary monarch is a child or is incompetent? Then, as already mentioned, a regency is available. What if the royal family, like some of the Windsors, flaunts unedifying personal behavior? Both dangers are just as real in a modern republic. Politicians have a systematic tendency to be incompetent or worse.^7 For a democratic politician, understanding economics is a handicap.^8 He either must take unpopular (because misunderstood) stands on issues or else speak and act dishonestly. The economically ignorant politician has the advantage of being able to take vote-catching stands with a more nearly clear conscience. Particularly in these days of television and of fascination with celebrities, the personal characteristics necessary to win elections are quite different from those of a public-spirited statesman. History does record great statesmen in less democratized parliamentary regimes of the past. Nowadays a Gresham's Law operates: the inferior human currency drives the better one out of circulation (Kuehnelt-Leddihn, pp. 115, 120). Ideal democratic government simply is not an available option. Our best hope is to limit the activities of government, a purpose to which monarchy can contribute. Although some contemporary politicians are honorable and economically literate, even simple honesty can worsen one's electoral chances. H. L. Mencken wrote acidly and with characteristic exaggeration: No educated man, stating plainly the elementary notions that every educated man holds about the matters that principally concern government, could be elected to office in a democratic state, save perhaps by a miracle. ...
it has become a psychic impossibility for a gentleman to hold office under the Federal Union, save by a combination of miracles that must tax the resourcefulness even of God. ... the man of native integrity is either barred from the public service altogether, or subjected to almost irresistible temptations after he gets in (Mencken 1926, pp. 103, 106, 110). Under monarchy, the courtier need not abase himself before swine, or pretend that he is a worse man than he really is. His sovereign has a certain respect for honor. The courtier's sovereign "... is apt to be a man of honour himself" (Mencken, p. 118, mentioning that the King of Prussia refused the German imperial crown offered him in 1849 by a mere popular parliament rather than by his fellow sovereign princes). Mencken conceded that democracy has its charms: "The fraud of democracy ... is more amusing than any other, more amusing even, and by miles, than the fraud of religion. ... [The farce] greatly delights me. I enjoy democracy immensely. It is incomparably idiotic, and hence incomparably amusing" (pp. 209, 211).

Conclusion

One argument against institutions with a venerable history is a mindless slogan betraying temporal provincialism, as if newer necessarily meant better: "Don't turn back the clock." Sounder advice is not to overthrow what exists because of abstract notions of what might seem logically or ideologically neater. In the vernacular, "If it ain't broke, don't fix it." It is progress to learn from experience, including experience with inadequately filtered democracy. Where a monarchical element in government works well enough, the burden of proof lies against the republicans (cf. Gabb). Kuehnelt-Leddihn, writing in 1952 (p. 104), noted that the "royal, non-democratic alloy" has supported the relative success of several representative governments in Europe.
Only a few nontotalitarian republics there and overseas have exhibited a record of stability, notably Switzerland, Finland, and the United States.^9 Constitutional monarchy cannot solve all problems of government; nothing can. But it can help. Besides lesser arguments, two main ones recommend it. First, its very existence is a reminder that democracy is not the sort of thing of which more is necessarily better; it can help promote balanced thinking. Second, by contributing continuity, diluting democracy while supporting a healthy element of it, and furthering the separation of government powers, monarchy can help protect personal liberty.

"Monarchy: Friend of Liberty", Liberty 18, January 2004, pp. 37-42

From checker at panix.com Tue Jan 10 14:24:07 2006
From: checker at panix.com (Premise Checker)
Date: Tue, 10 Jan 2006 09:24:07 -0500 (EST)
Subject: [Paleopsych] Neese and Williams: On Darwinian Medicine
Message-ID:

Nesse and Williams: On Darwinian Medicine
http://www-instruct.nmu.edu/biology/ALindsay/Evolution/Nesse_1999.pdf
Life Science Research Published in China, Vol. 3, No. 1, 1999, pp. 1-17, and Vol. 3, No. 2, 1999, pp. 79-91

On Darwinian Medicine
Randolph M. Nesse
George C. Williams

Address correspondence to: Randolph Nesse, M.D., Professor of Psychiatry, The University of Michigan, Room 5057, Institute for Social Research, 426 Thompson Street, Ann Arbor, MI 48106-1248 USA, nesse at umich.edu

It would appear, at first glance, that natural selection would have little to say about disease. Selection increases the frequency of genes that make organisms work better, and decreases the frequency of genes that harm organisms, so it would seem that evolutionary theory would be able to explain adaptations, but not their failures. This illusion has been so strong that even now, over a century since Darwin showed how organisms are shaped by natural selection, evolutionary biology is just beginning to be applied to the problems of medicine.
(1, 2, 3; Trevathan, in press) We are just beginning to learn why we are vulnerable to so many diseases. Why do we have an appendix? Wouldn't we be better off without wisdom teeth? Why does the human fetus have to squeeze through the tiny pelvic opening? Why, after millions of years of natural selection, are we still susceptible to infection by the streptococcus organism? Why do we get fevers so high they cause seizures? And why do our own immune systems sometimes attack us, causing rheumatic fever, rheumatoid arthritis, and multiple sclerosis? Why is depression so common, and why is life, in general, so full of suffering? Why is pain so often excessive? And why is it so hard for so many people to find restful sleep, to say nothing of love and sexual satisfaction? All these questions are about the design of the organism. More specifically, they are about why our bodies aren't better designed. In many cases, such as the appendix, it would seem to be child's play to improve the design. Were we able to change the body as we see fit, we could banish much disease. Or, could we? If the problem is just that natural selection is not strong enough to improve the design of the body, then we certainly could do better. Selection operates on random mutations that constantly slip in, so perhaps the design flaws simply result from chance. Some certainly do. The vast majority of designs that go bad are, however, not random mistakes, but products of natural selection. This poses a central mystery. If natural selection is so powerful that it can shape bodies so perfect in so many respects, then why are our bodies also full of so many flaws and design oversights that leave us vulnerable to thousands of diseases? There are only a few possible kinds of evolutionary explanations for such vulnerabilities.
First, as already mentioned, there are random events--environmental mishaps that are too rare to be seen by natural selection and genetic changes that are outside the reach of natural selection. Second, there are problems that arise because our bodies were not designed from scratch, but from an unbroken lineage that goes all the way back to the simplest single-celled organisms. Path dependence--the result of this continuous lineage--means that many structural designs are actually maladaptive. Third, there is competition between living organisms. Natural selection shapes predators, bacteria and viruses, and other humans, all of whom may benefit from harming us. To protect ourselves from these dangers, natural selection has shaped a wide variety of protective defenses such as pain, fever, nausea, and anxiety. These are not causes of disease, but the body's ways of preventing damage, yet because they are painful and associated with problems, we often confuse them with diseases themselves. Fourth, there are trade-offs--every trait in the body could be improved to some degree if, that is, we were willing to accept the resulting compromises in other traits. Fifth, we were not designed by natural selection for our present environment and lifestyles, and much disease results because natural selection has not had time to transform us for living in the modern environment. Finally, there are genetic "quirks", variations that were of no consequence in the ancestral environment, but that now cause disease. Natural selection is an extremely simple principle, but its elaborations are extremely subtle, and this leads to much misunderstanding. Darwin, of course, did not know about genes--all he had to go on was the traits of organisms. As a result, he had to develop his ideas based on observations of how traits change in response to breeding, with only a vague notion about how the information that coded for an organism's traits was passed along.
Now we know that the information code for organisms is stored in DNA, specifically, in approximately 100,000 protein coding sequences called genes. While much of this code is identical between two different human individuals, and, for that matter, between any two living organisms, there are also variations. If individuals with gene A, on the average in this environment, have more offspring and grand-offspring than individuals with gene B, then the A gene will gradually become more common and B will, over the generations, become rare. Natural selection consists of nothing more or less than 1) variation in the information code that results in variations in phenotypes, 2) differential reproductive success of those phenotypes, with the inevitable result of 3) changes in the information code across the generations. (4) This logic is so straightforward that it is best described not as a theory, but as a principle. When variation in an information code leads to differential reproductive success, the information code changes to whatever works best at getting copies of itself in future generations. Despite the simplicity of the principle of natural selection, it remains the focus of many misunderstandings. (5) In particular, contrary to the beliefs of many, the path of natural selection has no goal, no direction, and follows no plan. Yes, it is true that the first organisms were small unicellular creatures and the earth now has many large multi-cellular organisms with much greater complexity. This is not, however, because of any preexisting plan or magical force; it is simply because the first organisms were necessarily small and simple, so there was only one direction for variations to go. Organisms can evolve to simpler as well as more complex forms, as illustrated by the helminths that inhabit mammalian guts. Also, while humans are quite wonderful creatures with some unique abilities, there is no reason to think that we are a peak or a goal of this process.
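The three-step logic of selection described above (heritable variation, differential reproductive success, change in the information code) can be sketched as a toy calculation. The fitness numbers below are illustrative assumptions, not data:

```python
def allele_frequency_after(generations, p, w_a=1.05, w_b=1.0):
    """Toy haploid model of the three-step principle.
    1) Variation: two versions of a gene, A and B, at frequencies p and 1 - p.
    2) Differential reproductive success: carriers of A leave 5% more
       descendants on average (w_a = 1.05 vs w_b = 1.0, assumed numbers).
    3) Result: the frequency of A in the gene pool shifts each generation."""
    for _ in range(generations):
        mean_fitness = p * w_a + (1 - p) * w_b
        p = p * w_a / mean_fitness  # standard replicator update
    return p

# A modest 5% edge takes allele A from half the gene pool to near fixation
# in about a hundred generations.
print(round(allele_frequency_after(100, 0.5), 3))  # → 0.992
```

The point of the sketch is how quickly even a small, consistent reproductive advantage dominates the "information code" over generational time.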
In fact, it appears more and more that our species is, at least in terms of the ecosystem it is ravaging, a malignantly successful replicator that is accidentally but systematically destroying most other organisms on Earth. The key to understanding natural selection is recognizing that it changes, not organisms, but the information code that makes organisms. (4) Whatever information creates individuals who maximize copies of that information in future generations will become more common. This depends, of course, on the environment. There is no such thing as an adaptation in the abstract--traits are adaptive only in reference to a particular environment. (6) Another common misunderstanding arises from the understandable tendency to see individuals and groups as a product of natural selection, instead of genes. Different genes cooperate to do things that benefit the individual, so intuitively, it seems sensible that individuals should make sacrifices for the benefit of the species. There is a huge difference, however, between these levels. (7) The genes in each cell of an individual are identical. Even if they were not identical, the genes in somatic cells cannot be passed on, so they get no benefit whatsoever from doing anything except helping the individual. Different individuals, however, have different genes. These genes inevitably induce behaviors that are in their own interests at the expense of other individuals. The only exceptions are in the case of identical twins or genetically identical individuals in species that can reproduce by nonsexual means. This brings up an immediate, large, and unanswered question: why is reproduction sexual? Compared to simply budding, sex is expensive, troublesome, shuffles the genetic code in ways that cause unfortunate genetic combinations and, worst of all, results in related individuals with different interests who are therefore shaped to compete against one another.
The very existence of sex is, from an evolutionary point of view, a mystery. (8) One plausible explanation is that the genetic code must be constantly varied, otherwise viruses and bacteria would be able to crack the code and make short work of every one of us. (9) Another idea is based on the best strategy for winning a lottery. (10) In most organisms, the vast majority of offspring never get to sexual maturity. Each one is much like a lottery ticket that probably will be worthless, but may have a big payoff. In this situation, the best strategy is obviously not to buy many lottery tickets that are identical, but to spread your bets among a wide variety of different tickets in hopes that one of them will pay off. The important point here is that while it is hard to explain why sex exists, it does, and the resulting genetic variation necessarily means that individuals within a species will compete, and will cooperate only in the service of that competition. We do, however, see individuals sacrifice for the good of the group on many occasions, a phenomenon that is the deserving focus of much current work (11). It is tempting to try to explain such phenomena as a result of selection acting for the benefit of the group by selecting for traits that benefit the group even though they harm the individual. Except possibly in some very special circumstances, however, this simply does not work. The force of selection at the individual level is so much more powerful than that at the group level that group selection can explain only traits that are of very small cost to the individual and enormous benefit to the group. One level further down the hierarchy is a parallel issue with important consequences for human health. We have already mentioned that genetic differences in somatic human cells cannot get into the genome and so act only in the interests of the individual.
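The lottery analogy a few sentences back can be put in numbers. With purely hypothetical figures (1000 equally likely "winning" genotypes for the next generation's environment, 10 offspring per parent), a brood of genetically distinct offspring is ten times likelier than a brood of clones to contain at least one winner:

```python
from fractions import Fraction

N = 1000  # hypothetical count of equally likely "winning" genotypes
n = 10    # offspring per parent (also an illustrative, made-up number)

# Clonal reproduction: all n offspring hold copies of the same ticket,
# so the whole brood wins or loses together.
p_clones_win = Fraction(1, N)

# Sexual reproduction: n distinct tickets, so any one of n different
# genotypes can match the winning number.
p_diverse_win = Fraction(n, N)

# Diversifying the brood multiplies the chance of at least one winner by n.
print(p_diverse_win == n * p_clones_win)  # → True
```

This is only the betting intuition from the text, not a full model; real treatments weigh the twofold cost of sex against this variance benefit.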
But are there circumstances in which genes can have effects that promote their interests in getting into the next generation even at the expense of the individual's health? Yes, indeed. In fact, such examples are common causes of much human illness and suffering. The most dramatic example is the longevity difference between the sexes. Males on the average live approximately seven years less than females. This is not true only for humans, but for essentially every species where males compete vigorously for mates. This phenomenon can be explained at one level, the proximate level, by reference to the effects of testosterone on tissues, and the effects of male aggression and fighting on accident rates. An entirely separate explanation, an evolutionary explanation, is necessary to reveal why testosterone has deleterious effects on tissues, and why men do aggressive and dangerous things. This explanation is based, like all evolutionary explanations, on how genes shape traits that influence reproductive success. Females invest more in each offspring, therefore the range of their reproductive success is narrowed compared to men's. Most women have from 1 to perhaps as many as 10 children. Some men, however, will have no offspring whatsoever, while some will have far more. Some men have had hundreds of children. (12) Thus, the competition between men for mates is extreme, while women tend to be choosy. (13) A man whose physiology is set to put increased effort into this competition by sacrificing tissue protection, tissue healing, immune function, and physical safety, will have a reproductive advantage over men who live safer, healthier, longer lives. For women, this is not true, at least to the same extent, so they live longer. This principle leads to other observable effects even in a modern civilization. People do things that are not particularly in their best interests, but are in the interests of their genes.
For instance, many men will seek out additional sexual partners, despite knowing the risks of disease, jealous husbands, and their own wife's wrath (14). Often they know ahead of time exactly what lies in store for them, but they go ahead, almost as if they cannot help acting in ways that benefit their genes, even when they know it will lead to their harm and ultimate unhappiness. Still another example of genes taking advantage of individuals occurs in the process of meiosis. A gene that can somehow distort the process to get increased copies of itself into the egg or sperm will have a huge selective advantage, so such genes can increase in frequency even if they do severe harm to the individual. No examples are known in humans, but the T-locus in mice and the segregation-distorter locus in Drosophila document the existence of this phenomenon (15). Our purpose here is not to describe such phenomena in any detail, but simply to use them as examples that illustrate the crucial principle that natural selection is a mindless process that increases the frequency of any bits of information in the DNA code that, by whatever means, are especially successful at getting themselves into the next generation, even if that harms the individual. Still another example illustrates the idea of pleiotropy. A gene that increases the rate of implantation of a zygote on the uterine wall from the usual 25% to, say, 35%, will have such a huge selective advantage that it can rapidly increase in frequency even if it causes severe problems later in life. An example may exist in the DR3 allele on the HLA system, an allele that greatly increases the risk of childhood diabetes. (16) Phenylketonuria also may be maintained by the same mechanism, as indicated by the presence of this recessive disease in more than the expected 50% of a couple's offspring. (17)

Competition with Other Organisms

A large proportion of human disease results from competition with other organisms.
This is most obvious in our competition with viruses and bacteria, but disease also arises from competition with predators and other humans. In all cases, natural selection is constantly improving our ability to cope with these threats, but because these organisms are themselves constantly changing products of natural selection who are in a race to escape our defenses, there is no end to the process, and much disease results. (18, 19) To simplify, consider rabbits and foxes. If a mutation makes some foxes a bit faster than others, they will catch more rabbits, and this mutation will soon become more common in future generations. This has an obvious effect on the rabbits. Those who previously could handily escape foxes now are vulnerable. Only the very fastest rabbits can escape, so selection increases the frequency of genes that make rabbits still faster, even when those genes may have other negative effects. This is a classic instance of a trade-off. A change in an organism that is all for the good, with no new costs, is extremely rare. At the very least, rabbits that run faster are likely to be lighter, and therefore less likely to survive a period of food shortage. Or perhaps the changes that make rabbits faster also increase the speed with which energy is metabolized, perhaps with tissue-damaging side effects. As must be obvious, this competition between the rabbits and the foxes will shape both species in an escalating arms race. Such arms races result in much disease. Both foxes and rabbits could be heavier and better able to get through a harsh winter if they could just relax and cooperate. But, they can't. Similar arms races are even more obvious in the all-out competitions between pathogens and hosts (20). Streptococcal bacteria, for instance, must somehow escape surveillance by our immune systems. One of their strategies seems to be to imitate our own cells, so our antibodies against them sometimes attack our own tissues.
Attacking such mimics is obviously a tricky business, but we must do it to escape their infections. So some people end up getting rheumatic fever that damages the joints and heart valves, or obsessive-compulsive disorder from damage to the basal ganglia, or scarlet fever from damage to the skin. The layer upon layer of intrigue and counter-intrigue in these competitions is breathtaking in its complexity and tragic in its results (21). Consider the organism that causes sleeping sickness, the trypanosome. Its antigen coat stimulates a healthy immune response, but just when antibodies are being made in quantity, the organism exposes a completely different antigen coat, eluding its pursuers as effectively as a spy who completely changes his disguise. (22) Such arms races seem to be responsible for considerable genetic variation, some of which causes disease. The sickle cell trait is a well-known example. Individuals with a single sickle cell allele are protected against malaria but do not get sickle cell disease; those with two sickle cell alleles die young from sickle cell disease; and those with no sickle cell alleles are vulnerable to malaria. This example has been widely discussed, but it is somewhat peculiar because it results from a single nucleotide substitution that apparently occurred in the neighborhood of ten thousand years ago and is found only in Africa and areas of the Mediterranean where malaria has been prevalent. (23) The alpha+-thalassaemias are of particular interest because they are the commonest known human genetic disorders. They have been thought to protect against malaria, but recent evidence suggests that they are associated with a higher incidence of malaria. The explanation seems to be that by increasing susceptibility to the milder Plasmodium vivax, they confer immunity that protects against the more severe P. falciparum malaria (24).
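The balancing selection that maintains the sickle cell allele can be sketched as a short recursion. The fitness values below are illustrative, not measured: homozygous carriers of the sickle allele (S/S) are assumed to have zero fitness, normal homozygotes (A/A) pay a 15% malaria cost, and heterozygotes (A/S) are fully fit. The standard population-genetic result is an equilibrium frequency of q* = t/(s + t) for the protective allele.

```python
# Heterozygote advantage, sickle-cell style (illustrative fitnesses).
# A/A loses fraction t to malaria, S/S loses fraction s to sickle cell
# disease, A/S is fully fit; q is the frequency of the S allele.

def next_q(q, s=1.0, t=0.15):
    """One generation of viability selection with overdominance."""
    p = 1.0 - q
    w_mean = p * p * (1.0 - t) + 2.0 * p * q + q * q * (1.0 - s)
    return (q * q * (1.0 - s) + p * q) / w_mean

q = 0.5                      # any interior starting frequency works
for _ in range(500):
    q = next_q(q)

# The recursion settles at q* = t/(s+t) = 0.15/1.15, about 13%:
# the lethal allele persists because carriers outlive both homozygotes.
print(round(q, 4))           # -> 0.1304
```

The same recursion, with different fitness assignments, describes any case of heterozygote advantage, which is why such alleles can remain common despite killing their homozygous carriers.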
G6PD deficiency also apparently protects against malaria and has increased in frequency since humans began agriculture (25). There has been wide speculation that some of the genetic diseases characteristic of Ashkenazi Jews, such as Tay-Sachs and other sphingolipidoses, may protect against tuberculosis. (26) These genetic variations are present only in small groups, but others may have become universal genetic characteristics that protect us against other pathogens despite causing us harm. Despite the difficulty of distinguishing harmful from useful genes (27), it is important to recognize this mechanism as a source of vulnerability to disease, especially as we begin to unravel the entire genome. Some genes will no doubt appear to have wholly pathological effects. Before we tamper with them, we should consider the possibility that they have unsuspected benefits.

What is the optimum level of virulence for a pathogen?

On the surface it would seem senseless for a pathogen to kill its host. Why not simply coexist, so that the host lives longer and the pathogen does too? But this is not how natural selection works. Whatever coded information in pathogens results in the most copies in future generations will become more common. For some pathogens, such as those that cause minor upper respiratory infections, a low level of virulence facilitates spread: people who are too sick to leave bed will not be up and about coughing, sneezing, and touching other people. On the other hand, when a pathogen is spread by a vector such as mosquitoes or dirty water, selection may favor increased virulence. Malaria may spread even better if the host is too weak to slap at mosquitoes. In an environment where raw sewage can reach others, cholera may spread in proportion to the amount of diarrhea it produces. And when the host is infected with a fatal pathogen, restraint by another pathogen is of no benefit.
Paul Ewald has investigated such situations in detail, predicting and demonstrating that virulence decreases when changed sanitary conditions shift the advantage to strains that allow the victim to be up and about. (20) Indeed, when public sanitation is successful, the more virulent type of cholera is displaced by the less virulent. Likewise with Shigella: when public sanitation is instituted, natural selection shifts the advantage to a less virulent subtype. This principle has profound implications for modern hospitals as well, since doctors' and nurses' hands serve the same function as mosquitoes, transferring pathogens to and from passive victims in a cycle that selects for more aggressive organisms. Finally, we note the lengths to which pathogens go to ensure their transmission. They can even take over the behavioral control machinery of the host to their own advantage. (28) Ants infected with a particular kind of fluke will, in the late stages of infection, climb to the top of a blade of grass and grab on in a spasm that will not let go. Why? The next phase of this fluke's life cycle is in sheep, so ants clasping the tip of a blade of grass are helpless prisoners doing the bidding of their internal masters. Similar pathogens induce snails to crawl up on the shore, where they are exposed to sea gulls, the next stage in their life cycle. A more common and gruesome example that affects humans is offered by rabies. After it enters the skin, the rabies virus enters the nerves and arranges for its own transport directly to the central nervous system. There it concentrates in the amygdala, a site that controls aggression; in the brain locus that controls swallowing, so that the mouth fills with saliva; and in the salivary glands themselves. Thus the rabies virus essentially takes over the individual and turns it into a device for transmitting itself.
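The logic behind an optimal, nonzero level of virulence can be illustrated with the standard trade-off model from theoretical epidemiology (a generic sketch, not Ewald's own calculations): a pathogen's spread is roughly R0 = beta(alpha)/(alpha + mu), where alpha is the extra host mortality the pathogen causes and transmission beta rises with alpha but with diminishing returns. The square-root transmission function and the parameter values here are assumptions chosen purely for illustration.

```python
# Standard virulence trade-off sketch: selection maximizes R0, the
# number of new infections per infection, not host welfare.
import math

def r0(alpha, b=2.0, mu=1.0):
    """R0 = transmission / (virulence + background host mortality).
    beta = b * sqrt(alpha) is an assumed diminishing-returns form."""
    return b * math.sqrt(alpha) / (alpha + mu)

# Scan virulence levels; selection favors the one with the highest R0.
alphas = [i / 1000.0 for i in range(1, 5000)]
best = max(alphas, key=r0)
print(round(best, 2))   # near 1.0: with this form the optimum equals mu
```

The favored virulence is intermediate: too gentle and the pathogen transmits poorly, too harsh and it kills its hosts before spreading. Changing how beta depends on alpha (for example, when vectors or dirty water do the traveling) shifts the optimum upward, which is the core of Ewald's argument.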
Defenses

Many manifestations of disease are caused directly by a pathogen or by some defect in the body; paralysis, jaundice, and seizures are examples. Other manifestations of disease are not themselves defects but defenses, shaped by natural selection to protect us in the face of certain dangers. Examples include pain, nausea, vomiting, diarrhea, fatigue, and anxiety. It is easy to mistake such symptoms for pathology when, in fact, they are protections against pathology. Cough is the most obviously useful defense. Its basic benefit is clearing foreign matter from the respiratory tract. People who are unable to cough cannot clear secretions from their lungs and are likely to die from pneumonia. A variety of other mechanisms do the same thing for other passageways. Vomiting clears toxins and pathogens from the upper GI tract, while diarrhea clears them from the lower GI tract. Coughing, sneezing, and nasal secretions clear the respiratory passages. Inflammation leading to pus formation and extrusion on the surface of the body serves the same function for infections that have penetrated the body's tissues. Much of general medical practice consists in blocking the discomfort associated with these symptoms. We use medications to block cough, relieve pain, stop vomiting, and decrease diarrhea. Is this wise? Not always. In a clear demonstration of the value of diarrhea, DuPont and Hornick compared outcomes in people with Shigella infections who took medications to decrease diarrhea and those who did not. (29) Those who were left alone recovered faster, while those who took medication had extended illness, more complications, and were more likely to become carriers. Giving cough suppressants to patients shortly after surgery is well known to cause pneumonia, so physicians avoid it.
Nonetheless, in many other situations we can use medications to block cough, diarrhea, and vomiting with no apparent ill effects. How is this possible? Consider how natural selection shaped the mechanisms that regulate these defenses. Essentially, natural selection acts on the outcome of a signal-detection analysis. Just as an electrical engineer must set a system to decide correctly whether a given click coming across a line is a signal or just noise, the body's regulation mechanisms must set the system for, say, vomiting, to expel the contents of the stomach only when that is worthwhile. But such signals are difficult to interpret. The only way to ensure that no toxin is ever ingested is not to eat at all; this would not be a good strategy. Conversely, to avoid wasting calories, it would be best never to vomit; this would not be wise either. The optimal regulation strategy depends on how likely it is that a toxin really is present, how costly it is to mount a defensive response of vomiting, and how costly it would be to fail to mount such a response if the toxin is actually present. In many instances these parameters favor an apparently oversensitive defense response. The cost of many defenses is relatively small--in the case of vomiting, only a few hundred calories--while the cost of not responding could be death. As a result, natural selection has shaped the normal system to respond in many instances where a response is not actually necessary, in order to ensure that the system will always respond when a response is necessary. We call this the "smoke detector principle" because it also guides the design of smoke detectors. (2) We could design a smoke detector that would sound a warning only when the house was definitely on fire and never when the toast is burning. Such a system would, however, on occasion fail to sound when there was a real fire.
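The signal-detection argument above reduces to a one-line expected-cost calculation: a defense should fire whenever the probability that the threat is real exceeds the ratio of the defense's cost to the harm of not responding. The numbers below are illustrative stand-ins, not measurements: vomiting "costs" a few hundred calories, while a missed lethal toxin costs an enormous amount of fitness.

```python
# The "smoke detector principle" as expected-cost arithmetic.
# Fire the defense iff p * harm > cost, i.e. p > cost / harm.

def should_defend(p_threat, cost_of_defense, cost_if_undefended):
    """True when the expected benefit of defending exceeds its cost."""
    return p_threat * cost_if_undefended > cost_of_defense

# Illustrative fitness units: vomiting costs ~300, a missed toxin ~100,000.
threshold = 300 / 100_000
print(threshold)   # 0.003: respond even when the risk is only 0.3%

# So a well-calibrated system fires on cues that signal a real toxin
# only one time in twenty -- mostly "false alarms", by design.
assert should_defend(0.05, 300, 100_000)
assert not should_defend(0.001, 300, 100_000)
```

Because the threshold scales with cost/harm, cheap defenses against catastrophic dangers are tuned to be set off easily, which is exactly why blocking them usually turns out to be harmless.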
Thus, we want our smoke detectors designed to sound some false alarms, because that is what it takes to ensure that they will always warn us of a real fire. This "smoke detector principle" also helps to explain how it is possible to use medications to block defenses without necessarily causing harm. Nine times out of ten, vomiting may not be necessary, so in most instances medications to block it will cause little harm. Then again, there is that additional instance when it really is necessary. The same principle applies to many other defenses. Consider pain. Pain is a useful adaptation: people who lack the capacity for pain are usually dead by their 20s or 30s. (30) Overall, however, it seems that much of the pain we experience in life, at least in the modern environment, is excessive and prolonged. Medical advances in blocking pain have been a great boon for humanity. Furthermore, we now recognize disorders in which the pain system itself is dysregulated, causing chronic pain. Defensive mechanisms, like any other real-world mechanisms, can malfunction. Anxiety offers another instructive example. We often imagine that we would prefer life without anxiety, but people who lack anxiety entirely likely do as poorly in life as people who lack the capacity for pain. They do not come to psychiatrists' offices complaining of insufficient anxiety, but the defect is just as serious as if their immune systems were hypofunctional. (31, 32) While a few people may have anxiety deficiencies, the vast majority of us have more anxiety than we need. Blocking this anxiety with medications only rarely leads to reckless behavior, although in the case of driving automobiles it certainly can cause accidents. We have additional defenses that are specific to infection, including fever, inflammation, and the immune response.
Fever is not a simple increase in the rate of metabolism but a systematic, coordinated response to cues that indicate the presence of infection. (33) During a fever, the body defends the new set point, increasing the temperature if attempts are made to reduce it and decreasing it if attempts are made to raise it further. Pathogens are more susceptible to our defenses at the higher body temperature. Even cold-blooded animals raise their body temperature in the face of infection, by moving to warmer places until the infection is controlled. All this leads to an obvious question: is it wise to block fever during infection? Surprisingly, adequate studies have still not been done on this most routine medical question. Certainly, in many individual instances the smoke detector principle applies, and we can block fever and rely on the body's other defense mechanisms to protect us. However, there may be situations in which we would get better faster if we did not block fever. Also, in cold climates people have repeatedly invented the sauna bath and other ways to raise body temperature, perhaps because this helps to improve health. In the case of chicken pox, there is some evidence that antipyretics slightly prolong the course of illness. For influenza, would people get better faster if they did not take medications to block fever? We don't know. None of these defenses are diseases themselves, but it is easy to fall into the illusion that they are problems instead of parts of solutions. This illusion is fostered because they are constantly associated with pathology and because they can so often be blocked without untoward effects. It is fostered still further by the psychological and physical discomfort we experience when defenses are expressed.
We don't like fever; we obviously don't want pain; vomiting, diarrhea, and coughing are extremely unpleasant; and it is perfectly understandable that we should want to minimize them. This brings us to the next question: why do humans have capacities for suffering at all?

The capacities for suffering

The capacities for suffering are products of natural selection. If pain were not useful, we would not have the capacity for pain. If anxiety were not useful, we would not have anxiety. Anxiety and pain, perhaps together with the awful feelings we get when we lose someone we love, are close to purely negative experiences, and their aversiveness is almost certainly central to their utility. People who didn't mind tissue damage, threats, and losses did not pass on as many of their genes as people who did everything in their power to avoid these circumstances. The reaction of people to narcotic painkillers is of great interest. Many report that they can still experience the pain but that "it no longer bothers me." In essence, the perception is intact, but the affective representation of the experience has been reduced or eliminated. Much of the mission of medicine is, of course, to relieve suffering. This is best accomplished by eliminating the cause that has aroused the negative feeling, but very often we can safely and effectively use medications to directly block the brain mechanisms that give rise to it. Many bodily defects, such as cancer or atherosclerosis, are imperceptible for years, but almost every bodily defense is associated with discomfort. Nausea precedes vomiting and inhibits eating, thus preventing further intake of toxins. Diarrhea and cough are quite annoying. The physical fatigue that follows overexertion is unpleasant enough to motivate avoidance of the situations that gave rise to it.
The malaise that accompanies infection often seems excessive, and when we take medications that block this feeling we can often go about our business much more comfortably. In the ancestral environment, however, when predators were a problem, this might not have been so wise: to wander far from camp while unable to run fast might have been unwise indeed. Furthermore, simply resting during infection may allow the body's full resources to be commandeered for the fight. Many forms of human suffering are not so physical. We also experience depression, anger, jealousy, embarrassment, and many other unpleasant emotions. By extension, it seems likely that the very unpleasantness of these emotions is also a product of natural selection. Indeed, most of them are aroused by situations that are not good for our health, status, or reproductive success. (34) Work to understand the emotions in this light is just beginning, but it is needed urgently if we are to cope wisely with the development of new psychopharmacologic agents. We already have effective anti-anxiety drugs, although they all have side effects or cause dependency. We have increasingly good drugs to block depression, although they take weeks to work and also have side effects. It seems entirely likely, however, that the combination of brain science, the genome project, and improvements in chemistry will lead to agents that are far more specific, with far fewer side effects. We are ill prepared to decide how to use such substances. If, for instance, a pill were developed that could eliminate jealousy, how would we use it? Certainly it could prevent much suffering, and even violence, but it would also undermine some deep human impulses that are responsible, in considerable measure, for the fundamental structure of the family and society. Or how would we use agents that block the experiences of greed and envy?
On the surface it would seem fine to eliminate these nasty emotions, but we might find that we simultaneously eliminate the motives for much of the human effort and entrepreneurship that makes societies successful in competition with other societies. We expect that conflicts between individuals and their societies will arise over the use of such drugs. Perhaps the current war on drugs is already an example. (35)

Trade-offs

Every aspect of the body could be designed to be more resistant to disease, but only at a cost. Why, for instance, are our arms not stronger and less prone to fractures? Arm bones could be made thicker and less likely to break when we fall. Given the design of these bones, however, making them thicker would drastically decrease our dexterity and make it impossible to rotate the wrist the way we can now. We could be even better protected from infection if our immune systems were more aggressive. Then, however, the untoward effects of the system, including tissue damage and rapid senescence, would become even more prominent, and we might also experience more autoimmune disease. Our vision could be still more acute, like that of a hawk able to see a mouse from a kilometer away. The trade-offs, however, would be a drastic loss of peripheral vision, color vision, and the ability to see complex shapes all at once. Our upright posture is still another trade-off. Many explanations have been proposed for why we walk on two feet, from the benefits of using tools and weapons to the need to carry infants. Whatever the benefit, we can be sure it was a substantial one, because standing upright has so many costs. The most obvious comes from the design of our spine, which is optimized for a creature that goes about on all fours. On standing upright, however, enormous pressure is put on the lower spinal discs, causing pain and disability that is perhaps more common than any other medical disorder.
On a more mundane level, it appears that hemorrhoids result from the changes in circulation that come with upright posture. The tendency to faint is greatly increased by standing upright. The tendency to lose one's balance, and the need for extraordinarily complex brain mechanisms to regulate bodily position and balance, are also consequences of standing upright. Even the system that supplies blood to our bowel is poorly suited to an upright posture. In an animal that goes about on all fours, the omentum hangs like a curtain, supporting the bowel and providing easy access for vessels. When we stand up, however, it is as if someone took the curtain rod and stood it upright, whereupon the folds tangle in on one another, giving rise to the possibility of bowel obstruction and arterial compromise. It is worth noting that these trade-offs can also be interpreted as novelties, in that there simply has not been enough time for natural selection to shape reliable mechanisms to protect us against these ills. Perhaps in another million years back pain will be far less frequent; in the meantime, we will suffer. Another trait that seems maladaptive is our lack of hair. There is much disagreement about the evolutionary explanation for our nakedness, with proposals ranging from an increased ability to sweat to speculation that we spent much time in the water in our evolutionary past. Whatever the explanation, there are some obvious penalties, especially for paler individuals, including sunburn and a risk of dying from malignant melanoma. Everything is a trade-off. It is foolish to describe a trait as perfect, and few traits are simply pathological. All have costs and benefits; all are trade-offs. Such trade-offs also exist at the level of the gene. All genetic changes begin as new mutations, and it would be rare indeed for a mutation to have only benefits and no costs.
Given the interaction effects among 100,000 genes, acting in environments that vary from year to year and generation to generation, a detailed accounting of such costs and benefits is beyond our current understanding. What we do know is that a gene conferring a net selective advantage will likely be selected for, even if it causes disease in some people, or disability or decreased function in all of us. It is extremely hard to recognize genes that disadvantage all of us, because we have nothing to compare them to. What we do have are a few examples of genes with obvious benefits that explain their selection despite their tendency to cause disease. Sickle cell disease, already mentioned, is the only solidly documented example. In our book, we described our expectation that the allele that causes cystic fibrosis would be found to have some benefit, because it was so common and so reliably fatal. This speculation has since been supported by epidemiology (23) and by genetic studies of mice with a single cystic fibrosis allele. In more recent work, it has been discovered that this allele also inhibits Salmonella typhi, the cause of typhoid fever, from entering the cells of our gut. (36) A further illustration may soon be available in the case of manic-depressive illness. This illness is overwhelmingly genetic in origin and affects approximately one percent of people worldwide, with devastating consequences including a 20 percent risk of suicide and a 20 percent risk of early death from other causes. (37) The selective force acting against the genes that cause manic depression is so enormous that there is very likely some selective advantage, unless a large number of genes are involved. What could the advantage be? For centuries, people have noted the increased creativity of those with manic-depressive tendencies, and recent scientific evidence confirms this finding.
(38) Perhaps this creativity somehow leads to selection for the manic-depression genes. There is no need, however, for the benefit to accrue to people with manic-depressive illness themselves. In fact, it is more likely that their unaffected kin experience the benefit, while those with the disease experience mainly the costs. This general mechanism may apply to many genes, with some individuals suffering from a disease while fitness benefits accrue to relatives who carry the same genes in combination with different ones. The definitive test would be to look at the reproductive success, in the ancestral environment, of relatives of individuals with manic-depressive illness. Such a study would be nearly impossible to do; we will likely identify the specific genes sooner. Once we have them, we can look at the people who carry them to see how they differ from other individuals. Perhaps they are more creative, and perhaps this creativity gives them increased reproductive success or other advantages. Or perhaps the genes protect them from some infection. The question will be difficult to answer, but it is important as we approach a time when genes can be manipulated. This example has important implications for those who would try to improve the human species by controlling reproduction. A long-standing dream of progressives has been to eliminate defective genes and thus improve the health of the population, presumably making the world a better place. While this idea recurs throughout history, it was the subject of early public policies in the United States at the start of the 20th century. As everyone knows, a grander and more deadly version was practiced by the Nazis, leaving a lingering repugnance for eugenics that makes it almost impossible even to talk about the issue. (39) We will address the issue of human rights here only by saying that, in our vision, Darwinian medicine is a field that benefits individuals, not nations or the species.
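The kin-benefit argument for manic-depression genes is an instance of Hamilton's rule: a gene that costs its bearer c units of fitness can still spread if relatives carrying the same gene gain benefits b, discounted by their relatedness r, such that the sum of r times b exceeds c. Every number below is hypothetical, chosen only to illustrate the inequality, not an estimate for any real illness.

```python
# Hamilton's rule as a sketch: a costly gene spreads when the
# relatedness-weighted benefits to kin exceed the bearer's cost.

def spreads(cost, gains):
    """gains: list of (relatedness, fitness_benefit) pairs for kin
    who carry the gene; returns True if inclusive fitness is positive."""
    return sum(r * b for r, b in gains) > cost

# Hypothetical: illness costs the affected carrier 0.4 fitness units,
# but two unaffected siblings (r = 0.5) each gain 0.5 from enhanced
# creativity. Inclusive fitness: 2 * 0.5 * 0.5 = 0.5 > 0.4.
print(spreads(0.4, [(0.5, 0.5), (0.5, 0.5)]))   # True
```

This is why the "definitive test" described above must look at relatives rather than patients: the ledger that selection balances includes every carrier of the genes, affected or not.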
The other issue is the scientific basis for eugenics. Much has already been said about the lack of scientific foundation for such efforts as conducted in the past. Many supposedly genetic diseases, such as cretinism, turned out to be caused by environmental factors, and others resulted from many genes or from rare recessive genes, so that eugenic efforts, and the associated restrictions of individual reproductive rights, were in vain. An evolutionary view of disease helps to reveal the complexity of these matters. Population geneticists have worked out the details of how rare recessive genes persist in the population irrespective of their possible selective costs, and these principles show the extraordinary practical difficulties faced by anyone who would try to reduce the frequency of such genes through selective breeding. An even more important factor is that many genetic diseases involve many genes, often in complex interaction with environmental factors. Even specific harmful genes may give rise to disease in only a small proportion of individuals, so restricting reproductive opportunities on the basis of manifest disease would have little impact on the frequency of most diseases even if eugenic policies were pursued rigidly for many generations. The case of manic-depressive illness is instructive because the responsible genes may well be helpful as well as harmful. This case is peculiarly relevant to public policy because, while the individual may suffer from manic-depressive illness, society may benefit from the creations of the ill individual or his or her relatives. To eliminate the genes that cause manic-depressive illness without careful thought could therefore be a catastrophic mistake. Furthermore, other genes that appear superficially to be simple defects will likely turn out to have unanticipated adaptive benefits, although it is still very difficult to distinguish these from the rest.
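The population-genetic point about rare recessives can be made concrete. If only homozygotes express a disease and none of them reproduce (complete selection, the harshest eugenic program imaginable), the standard recursion gives q' = q/(1+q) per generation, so q after n generations is q0/(1 + n*q0). Halving a 1% allele therefore takes 100 generations, on the order of 2,500 years, because almost all copies of the allele hide in healthy carriers.

```python
# Why selecting against a rare recessive barely works: with complete
# selection against affected homozygotes, allele frequency follows
# q' = q / (1 + q), i.e. q_n = q0 / (1 + n * q0).

def select_against_recessive(q0, generations):
    """Allele frequency after the given generations of full selection."""
    q = q0
    for _ in range(generations):
        q = q / (1.0 + q)   # no affected homozygote reproduces
    return q

q100 = select_against_recessive(0.01, 100)
print(round(q100, 6))   # 0.005: a full century of generations halves 1%
```

At q = 0.01, roughly 2pq = 2% of people are unaffected carriers while only q squared = 0.01% are affected, so selection acting on manifest disease touches about one allele copy in two hundred each generation.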
(40) We know enough now to suggest that it would be safe to do away with the genes that cause cystic fibrosis, but it will be more difficult to discover whether, and what, benefits accrue from other disease-causing genes. Finally, it is by no means certain that the future human environment will be the one we live in now, so eliminating disease-causing genes that protect us against infections that are now rare may seem wise today but cause new suffering at some future date. All of this is, of course, about to be changed by the unraveling of the human genome. On one hand, we will finally have accurate information about individual genotypes and will no longer have to rely on the phenotypic expression of disease. On the other hand, it is likely that medical advances arising from the human genome project will make it possible to control vastly more diseases, including genetic diseases, than has ever before been possible. This will no doubt give rise to new calls for restricting reproduction among individuals with specific known pathological genotypes. While that argument goes on, further progress will be made in discovering ways to minimize the effects of these genetic defects, or to allow people with such defects to have offspring in whom manipulation of a single bit of DNA can prevent the problem. Of course, human tendencies will turn these technologies toward entirely new problems. People will want to have offspring with the best possible genotype, and we predict that a market will soon arise in which rich people try to control the genotypes of their offspring. Such a phenomenon would likely lead to an arms race for genetic information between countries fearful that their populations would be left behind by genetically superior generations elsewhere. This prospect seems to us both frightening and likely.
While it would be easy simply to advocate restrictions on such practices, they would prove extremely difficult to enforce and, perhaps, not in the interests of elites to enforce in their own countries. What does seem likely is that the human species will, in a few hundred years, be different from what it is now. No doubt it will in some ways be better, but much conflict and many mistakes lie along that road.

Novelty

The environment in which we live is considerably different from the environment in which we were designed to live. While much disease arises from environmental changes of the past 100 years, much also arises from changes since the advent of agriculture about 10,000 years ago (41). Before that, humans lived in small hunting and foraging groups of 20 to 50 people who subsisted largely on fruits, tubers, grains, and meat. In most locations salt was in short supply, sugar was available mainly in ripe fruits or occasionally as honey, and high levels of fat were almost always unavailable. We are living in an unnatural environment. It is easy, however, to over-generalize this principle. The idea of the environment of evolutionary adaptedness (the EEA), proposed by John Bowlby, (42) has been extremely useful in reminding us of the differences between then and now. As recent scholarship points out, however, there was no single environment of evolutionary adaptedness but a constellation of situations in which our ancestors lived. (43) While these environments had much in common, they also differed. When humans moved out of Africa, perhaps one million years ago, their particular ability to adapt to new environments quickly led to their spread across the Eurasian land mass. As they moved to new environments, new selective forces began to act. In colder climates, individuals with shorter arms and legs lost heat less quickly and had a selective advantage.
In environments where lack of sunshine and the wearing of clothes deprived the skin of light, causing vitamin D depletion and rickets, there was selection for decreased skin pigmentation. (44) In settings where humans raised animals and subsisted on milk, there was selection for maintenance of lactase activity into adult life. (45) The big environmental changes, however, have been those of our own making, and the giant one was the invention of agriculture. By growing their own food, people were able to ensure a much more consistent supply of calories at less effort. The price, however, was an immediate increase in certain diseases. Studies of Native Americans give particularly clear evidence of the rise in disease after the cultivation of maize and sorghum became common. The stature of adults declined, and arthritis and tooth decay suddenly emerged, because the agricultural diet provided more sugar, and far fewer and more limited phytochemicals, than the diet consumed by hunter-gatherers. The diets of these early Native Americans were probably also deficient in protein and certain essential amino acids. Cultural traits can do much to compensate for such problems. For instance, many native groups in the Americas soak their maize in alkali before cooking--a process that frees niacin, an essential vitamin otherwise deficient in a maize-based diet (46), and may increase available lysine, an amino acid also deficient in maize. Other deficiencies are not so easy to remedy. When eating natural fruits and vegetables, humans get plenty of vitamin C, a chemical we cannot synthesize (in contrast to most other primates, which can). Because vitamin C is a necessary substance for us, we can be confident that it was in abundant supply as a routine part of our diet for long enough to allow the synthetic mechanism to be lost over the course of evolution.
When sailors began to take voyages lasting months, subsisting only on hardtack and dried meat, scurvy quickly became a major problem. When Lind discovered that giving out rations of limes prevented scurvy, the way was paved for the discovery of vitamin C. In Iceland, the same problem had long been recognized and prevented by storing blueberries especially for the time in late winter when scurvy became a problem. We are vastly more healthy on the average now than we were even a few hundred years ago. In most locations infection is less likely and more curable, accidents are less common and more treatable, and general health has improved thanks to more adequate food supplies and sanitation. A Darwinian approach to medicine in no way advocates reverting to some imagined ancestral time of perfect health. On the other hand, it remains true that the majority of problems we see in medical clinics today arise from novel aspects of our modern environment to which our "thrifty genotype" has not yet adapted (47). The most common and devastating of these diseases arise from our abnormal diets and the resulting triad of hypertension, obesity, and atherosclerosis (48). Compared to our ancestors' diets, ours include vastly more fat, salt, and sugar and substantially fewer phytochemicals and less fiber (49). The result is the current epidemic of heart disease and stroke caused largely by atherosclerosis. Such diseases claim half of the individuals in most modern countries. The defect in design, however, is not simply in our metabolism and our arteries; it is also in our brains. A hunter-gatherer who did not have a taste for sugar and fat would have been at a disadvantage: one could hardly ever get enough of those substances in the ancestral environment. Today, we have the same preferences as our hunter-gatherer ancestors, but the world is different.
The difference, of course, is that the hunter-gatherer had to work long hours to get even an occasional taste of high-fat, high-salt, high-sugar food, if it was possible at all. Nowadays, we can go to the grocery store and glut ourselves on a wide variety of snack foods that satisfy these cravings instantly. In the United States more than half of individuals are now overweight and a third are clinically obese, conditions that contribute to much disease. Individuals try to diet, but rarely succeed. They know what they should eat, but they eat fat and sugar instead. They know they should exercise, but they don't. The fault is not with their willpower, but with the very design of the brain mechanisms that regulate their exercise and diet, a design that is optimized for an entirely different environment. As the diets typical of technological societies spread to developing countries, this epidemic is predicted to become the single greatest cause of human disease (50). Eating disorders are problems that seem to have arisen mainly in the last generation. The ability to live with very little caloric expenditure, and to eat whatever one chooses whenever one chooses, interacts with evolved preferences for mates of a particular shape, with unfortunate results. It appears to be a cross-cultural universal that men prefer women with a waist/hip ratio of about 0.7 (51). This ratio has been proposed to identify women who have recently become sexually mature but who have not yet borne many children, thus making them optimal reproductive partners. Heavy women obviously do not have this conformation. Furthermore, the human tendency to attend to caricatures interacts with mass media to create images of women that are exaggerations of this ideal. In the arms race that arises from sexual competition, women try to live up to these ideals, often with tragic consequences. Attempting to diet sets off protective mechanisms that were designed to protect a person from famine.
When food is in short supply, these mechanisms induce preoccupation with food and a tendency to quickly gulp down large amounts of high-calorie food. Such impulses to gorge make a woman on a calorie-restricted diet even more fearful that she will be unable to control her energy intake, so she tries even harder to diet. This sets off a vicious cycle in which the impulses to eat become still stronger, causing more loss of control, thus making her feel still worse, until a serious eating disorder is established. For most women with eating disorders, the cycle becomes one of bulimia: eating large amounts of food and then vomiting. For those few women with extraordinary will-power, it is possible to restrict intake almost entirely, causing anorexia nervosa, a disease that is sometimes fatal. Eating disorders are a product of the novel environment in which we live. They can be explained by the food-intake regulation mechanisms that evolved in an entirely different environment and their interactions with innate sexual preferences that are exaggerated by modern media. Such problems will become much more common as technology and easy access to varied foods spread across the world. On a much more mundane level, millions of people suffer from pain at the inside edge of the heel. This is sometimes called "heel spurs," because a tiny bit of calcification is visible on x-rays, but the technical name is plantar fasciitis. The plantar fascia is a band of tough tissue that stretches from the ball of the foot to the heel--essentially it is the bowstring that holds the arch of the foot taut. In people who walk miles each day and who sit not in chairs but by squatting on the ground, this fascia is constantly stretched and exercised. When, however, people sit for long hours in chairs, this tissue is not stretched and it contracts.
When the contracted tissue is suddenly stretched by jogging or a long walk, it is vulnerable to ripping away from the heel--an injury that causes pain at the site of the injury. Certainly there are peculiarities of anatomy and walking posture that increase the vulnerability of some individuals to this problem, but the fundamental difficulty is the design of the organism and its mismatch with how we live our lives today. We are designed to seek comfort and minimize caloric expenditure. Plantar fasciitis is one of the several costs of following our evolved inclinations when they are no longer adaptive. The invention of reliable birth control has been an enormous boon, not only for individuals, but also for populations, which at last have some hope of restraining their numbers without relying on disease, war, and starvation. The availability of birth control is, however, a completely novel aspect of the environment, and it causes many complications. In the ancestral environment, a woman would typically reach sexual maturity at about age 17 and become pregnant within a year or two, following which she would nurse her baby for two to three years and quickly become pregnant once again. The total number of menstrual cycles in a lifetime averaged around a hundred (52). Nowadays, women reach sexual maturity much younger, probably because of a superior diet and increased fat stores earlier in life. They may wait until age 30 to have children or may never become pregnant. After giving birth, a woman may feed the baby with a bottle, thus making it possible to become pregnant again in a matter of months, instead of after the several years of infertility associated with breast feeding. The most common complication of this modern pattern is certainly iron-deficiency anemia. The disorder is far more common in women than in men because of the loss of blood with each menstrual cycle. The system was never designed for as many menstrual cycles as now take place.
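The arithmetic behind these contrasting reproductive patterns is simple enough to sketch. The following back-of-the-envelope calculation is illustrative only: the ages, family sizes, infertile intervals, and cycles-per-year rate are assumptions chosen for the sketch, not figures from the cited study.

```python
def lifetime_cycles(menarche_age, menopause_age, n_children,
                    infertile_years_per_child, cycles_per_year=12.5):
    """Rough count of lifetime menstrual cycles: fertile years, minus
    pregnancy-and-lactation downtime, times cycles per year.
    All parameter values used below are illustrative assumptions."""
    fertile_years = menopause_age - menarche_age
    downtime = n_children * infertile_years_per_child
    return max(fertile_years - downtime, 0) * cycles_per_year

# Ancestral-style pattern: late menarche, many children, long
# lactational infertility between births.
ancestral = lifetime_cycles(17, 47, 6, 4)   # 75.0 cycles
# Modern pattern: early menarche, two children, bottle feeding.
modern = lifetime_cycles(13, 50, 2, 1)      # 437.5 cycles
```

Even with generous allowances for the assumed numbers, the modern pattern yields several times as many cycles, which is the point of the paragraph above.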
High rates of breast cancer in modern societies may also be partly attributed to the use of birth control (53). The cells in the breast that are most vulnerable to becoming cancerous begin dividing at menarche and stop dividing only with the first pregnancy. In the ancestral environment this interval lasted months to a year, but now it often lasts for decades. Studies are now being done to see if the use of pregnancy-mimicking hormones for some years after menarche can prevent breast cancer in some young women whose family histories suggest a high risk. The discovery of psychoactive drugs has also been a great boon for humankind, but like all other advances, it has brought complications, in this case drug abuse. While some individuals clearly are far more susceptible to addiction than others, and while social factors certainly help to account for why some people become addicted and others do not, an evolutionary approach to the problem highlights the universal capacity of humans to become addicted to drugs that act directly on motivational systems (35). The ascending dopaminergic tracts that are stimulated by most drugs of abuse are intimately involved with reward mechanisms designed to control behavior (54). Actions that led to success (as indicated by cues such as eating tasty food) are reinforced and become more common. When, however, these mechanisms are stimulated by the direct action of drugs, they have no way of interpreting what is happening, and they respond as if some huge bonanza of resources had just been gained. This gives an enormous subjective pleasure, the likes of which is hard to find in real life. It also entrains behavior to repeat, over and over again, whatever action brought such enormous pleasure. The great irony is that after continued drug use, the drug addict may get very little pleasure. Apparently the mechanisms that regulate subjective experience damp down after repeated exposure to the drug.
The mechanisms that control behavior, however, tend to persist. Thus arises the common picture of the drug-addicted individual who desperately wants to quit, who gets little pleasure from his habit, and yet who feels helplessly compelled to spend his life seeking out drugs of abuse (55). We were simply never designed to live in an environment where drugs of abuse are readily available. It seems as if there should be some solution to the problem of drug abuse, whether by prevention, treatment, or legalization of drugs. A Darwinian approach suggests, however, that this problem may not have any straightforward solution, but may arise from an intrinsic vulnerability of any organism whose motivational systems are chemically mediated, as ours are, once it reaches an advanced enough state of technology. In fact, we predict that when we make contact with intelligent organisms on other planets, we will discover that they either are continuing to cope with a chronic problem of drug abuse or have at least passed through that stage at great cost and suffering. The amount of anxiety we experience nowadays is greatly excessive for the dangers we encounter (31). Most of us would be better off cutting down our anxiety level by several notches. In this sense, anxiety can be seen as excessive, given that we live in a novel environment. It also seems possible, however, that the anxiety system was fine-tuned during a lifetime in ancestral environments by exposure to things that actually were dangerous. A modern person may see snakes only in zoos, and so fear of snakes can become quite generalized and lead to a tendency to avoid any place a snake might conceivably be seen. If that same person had been living in an ancestral environment, however, there would have been great pressure to keep going, despite the fear, to places where snakes would be seen, a process that would soon extinguish unwarranted fear.
Furthermore, exposure to different kinds of snakes would soon lead to stimulus discrimination between snakes that are harmful and those that are harmless. Many modern phobias may, paradoxically, result from lack of exposure to different kinds of dangerous objects.

Genetic quirks

We have emphasized diseases that arise from novel aspects of the environment and diseases that arise from genes that may have benefits as well as costs. Much modern disease arises, however, from interactions between genetic variation and environmental novelty. Genes that had no ill effects in our ancestral environment now reliably cause disease. Myopia is an excellent example. Nearsightedness is a genetic disorder: if your parents have it, you almost certainly will as well. Its prevalence is approximately 25 percent in modern populations. How could such a serious defect be maintained despite the force of natural selection? The answer comes from recognizing that this is not purely a genetic defect, but a genetic variation that was harmless until people began doing close work, such as reading, at an early age. Such early reading, in people who have the genes, reliably causes myopia. People who do not have the genes, or who do not do close work, never become nearsighted. Attempts to decide whether it is a genetic or an environmental disease are therefore confused: like many other diseases, it is both. Much atherosclerosis is probably the same. The genes that increase vulnerability to heart disease probably were not harmful in an environment where no one had high cholesterol. To call these genes defects is vastly simplistic. These variations were of minor consequence in the environment we were designed to live in. Genes that make some individuals especially susceptible to drug abuse are still another example of "quirks" that caused no harm in the natural environment.

Path dependence

We have emphasized design features of the human body that offer some advantages as well as disadvantages.
Other features are, however, simply mistakes. The eye, for instance, that wonder of wonders, is inside out. The vessels and nerves enter at the back of the eyeball, causing a blind spot, and they spread out across the inside of the retina, casting shadows. The eye of an octopus is, in contrast, much better designed. The nerves and vessels run along the outside of the eyeball, penetrating where they are needed. The octopus has no difficulty with a blind spot, no shadows cast on the retina, and is protected against detachment of the retina. In this respect, the design of the octopus eye is extremely sensible; ours is a mistake. Why doesn't natural selection fix it? Because the process of evolution is not based on planned design, but on continual tiny modifications in which each generation must survive and prosper. Once some semblance of a working eye gave a selective advantage to our ancestors, the process moved forward steadily until our eyes were as good as they could be, despite the gross disadvantage of having vessels on the inside. As François Jacob has put it so clearly, "Nature is a tinkerer, not an engineer." Many other examples illustrate anatomical difficulties that arise from path dependence (56). The vas deferens, for instance, instead of going directly from the testicles to the penis, makes a long detour into the pelvis, looping over the iliac vessels, and only then returns to the urethra. This path makes it vulnerable to damage, at least in surgery. But because the original routing of the vas deferens and the iliac vessels was the way it was, there is no going back. The recurrent laryngeal nerve offers another example. This nerve controls some motions of the vocal cords, yet instead of running directly from the brain to the larynx, it descends into the chest, loops under the great arteries there, and only then ascends back up the neck, passing immediately behind the thyroid gland on the surface of the trachea, to do its work at the vocal cords.
All along this long course it is subject to injury, especially at the hands of surgeons working on the thyroid gland. It is a faulty design that cannot be changed. Choking is the cause of death for many people worldwide each day, and it too is simply a design defect resulting from path dependence. It would be ever so much better if the trachea and the esophagus were completely separate; however, some of our amphibian ancestors seem to have swum at the very surface of the water so that their nostrils could take air into a common passageway shared by food and air. That common cavity has never been eliminated, so there is always the possibility of aspirating food that will clog the windpipe and cause death. Finally, there is the matter of the appendix. A very thin blind loop of gut, it extends from the large bowel and seems for all the world as if it is there just to cause problems. In our ancestors it may have been a larger cavity that was useful in digestion, but for us it appears to be nothing but a potentially fatal nuisance. Its tendency to cause problems is directly proportional to its narrowness. Any minor bit of inflammation can compress the artery that supplies it with blood, and this loss of blood supply opens the way to further bacterial invasion unencumbered by protective defenses. Such infection further compresses the blood supply, at which point bacteria can grow completely unhindered until the appendix bursts, whereupon the patient very often dies. Has natural selection simply not had enough time to eliminate this troublesome organ? It certainly does not seem to give any selective advantage. Paradoxically, however, the appendix may be maintained by natural selection precisely because it causes appendicitis. People who have an appendix that is somewhat larger are less likely to get appendicitis, while people who have a long, thin appendix are more likely to die.
This is perhaps the ultimate example of a "blind loop" in the process of natural selection: an organ that is wholly useless for any task, but is nonetheless maintained by natural selection because, as it gets smaller, it increases the risk of death. Such examples suggest that the very idea of a normal, perfect body is probably incorrect. The body is a bundle of trade-offs and problematic arrangements jury-rigged into a miraculous machine.

Random events

We began by emphasizing the randomness of natural selection, and we return to this theme here at the end. There are many accidents and diseases against which natural selection can offer no protection. If an asteroid hits our neighborhood, there is nothing natural selection can do to protect us. If we are exposed to high levels of radioactivity, we have no way of detecting the danger, so we would likely go about our business, with possibly fatal results. Many toxins, especially novel toxins, are colorless and tasteless, making it difficult for us to protect ourselves. Events that are very rare, or that we cannot detect, do not shape protection; they simply must be chalked up to the unfortunate randomness and uncontrollability of life. Likewise, the genetic code can never be perfect. Mutations are always creeping in, at the rate of approximately one per individual per reproductive episode. Selection will gradually eliminate some of these, but some, even some that decrease reproductive success, will become more common or even widespread by the mere process of genetic drift. There is no rhyme or reason to such mutations; they are simply random events that happen. At the next stage, selection, there is further randomness. Some genes that cause harm will drift to a higher frequency despite the harm they cause. Some genes that would protect us or otherwise be beneficial may nonetheless be eliminated from the gene pool by simple stochastic accident.
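The drift process just described, in which allele frequencies wander by chance alone, is easy to simulate. This is a minimal Wright-Fisher-style sketch; the population size, number of generations, and starting frequency are arbitrary illustrative choices, not parameters from the text.

```python
import random

def drift(freq, pop_size, generations, seed=0):
    """Neutral genetic drift: each generation, the new count of an
    allele is a binomial-style draw from the current frequency, so
    the frequency wanders at random until the allele happens to be
    lost entirely or fixed in the whole population."""
    rng = random.Random(seed)
    n = 2 * pop_size                # gene copies in a diploid population
    count = round(freq * n)
    for _ in range(generations):
        p = count / n
        count = sum(rng.random() < p for _ in range(n))
        if count in (0, n):        # lost or fixed by chance alone
            break
    return count / n

# With no selection at all, a small population typically drifts an
# allele starting at 50% frequency to loss (0.0) or fixation (1.0).
outcomes = [drift(0.5, pop_size=20, generations=1000, seed=s) for s in range(5)]
```

The point of the sketch is the one made in the paragraph above: even with no selective advantage or disadvantage whatsoever, alleles are routinely lost or fixed by accident alone, and the effect is strongest in small populations.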
Such random factors are real and important, but they are not as all-important as they have sometimes been portrayed. Many of the body's vulnerabilities are, by contrast, direct products of natural selection. There is no such thing as one universal normal genome, there is no such thing as a perfect body, there is no such thing as a perfectly safe diet, and there is no such thing as life without senescence; yet there are a remarkable number of humans who have miraculously healthy periods in their lives. Given the myriad vulnerabilities and the number of things that can go wrong, this is astounding indeed.

Senescence

Perhaps the most serious trade-off at the level of a trait is that of aging. More specifically, there is the mystery of senescence. Why should individuals age and inevitably die? It is perfectly possible for organisms to regenerate body parts that have been lost, so why isn't it possible to systematically and steadily replace every body part as it ages, so that the individual could be eternal? The explanation here is very similar to the explanation offered earlier for why men die younger than women. While it might be possible to design a body that would be eternal, such an individual would not be as effective a replicator as one that put more resources into competition and less into preservation of the body (57). Genes that cause aging can be assigned broadly to just two categories. Some have simply never been exposed to the force of natural selection because they cause disorders that are too rare and occur too late in life for selection to have had much of an effect in the natural environment. Certain diseases that become extremely common for people in their 90s, for instance, would have had only a minuscule effect on fitness in the natural environment, and so it is not surprising that we remain vulnerable to them. The same is observed in laboratory animals that are fed and protected so they can reach ages they would never reach in the wild.
On the other hand, the effects of aging may well influence fitness in the wild for some species. Alex Comfort, following ecologists of previous generations, believed that there was no evidence for aging in wild animals because he had never seen a decrepit animal in the wild. However, most animals in the wild are prey for other animals; long before they become decrepit, they become a meal for some predator. Thus, the fact that we do not see feeble old rabbits does not mean there is no senescence in wild rabbits. The other explanation for the continued presence of genes that cause aging is that they give some pleiotropic benefit (58). By this we mean that the very same gene that offers a benefit, for instance, strengthening bones during childhood and early adulthood, may also cause some disadvantage that produces disease or even death later in life, for instance, calcification of the arteries. While no such specific gene has yet been identified, the likelihood of such genes has been demonstrated in fruit flies and other insects (59). Before we begin to tamper with genes that appear to be causes of aging, we should look carefully to see whether they have perhaps been maintained because of some pleiotropic benefit. An evolutionary view gives a somewhat pessimistic outlook on the possibility of eliminating senescence. If genes cause disadvantages in midlife, this will select for modifier genes that postpone the expression of the deleterious effects to later in life. At some point, the expression of many of these genes will seem to be coordinated in later life, because the force of selection falls quite rapidly at the ages when they are expressed. One can thus imagine their manifestations as grains of sand that have been swept to later in the life-span by modifier genes, so that they now form something of a hill beyond which it is impossible to go.
This is not to say that much may not be accomplished by gerontologic research and by slowing some aspects of aging. For instance, taking a small dose of aspirin each day decreases the risk of dying from a heart attack. Does this have disadvantages as well? Yes: it thins the blood somewhat, and that makes death from bleeding more likely. However, injuries are less likely now, and medical care is available, so on the whole we benefit from having blood that is a bit thinner than that designed for the natural environment. These circumstances offer an example of how taking medications regularly may improve our adaptation to the current environment. Likewise, a capacity for rapid oxidation may be essential to destroy certain bacteria, but toning down this capacity may not harm us much at all now, and may protect our tissues from aging. In fact, this may be the explanation for gout. Gout occurs when crystals of uric acid precipitate in the joint fluid, causing excruciating pain. So why don't humans have lower levels of uric acid, as other primates do? A cross-species comparison shows a very strong linear relationship between plasma uric acid levels and longevity in different species. Uric acid turns out to be a potent antioxidant, and may well have been selected for to help make our long life spans possible. A few people, the unfortunate ones, get gout.

Implications

Natural selection and our evolutionary history have been well understood for nearly a hundred years now. Why are they only now being applied to the problems of medicine? In part, the explanation probably lies in the illusion that we referred to at the beginning of this article: natural selection shapes things that work, so it is a bit hard to see at first glance how it can also help explain why things don't work. There are also more practical reasons, however, why it is only now that evolutionary biology is being recognized as a basic science for medicine. Medicine is a practical endeavor.
Doctors treat individual patients with individual diseases and are usually far more interested in why this patient is sick now, and what to do about it, than in why all members of the species are vulnerable to a particular problem. The patient comes in with a painful gouty big toe, and the physician wants to help that individual immediately. The possibility that high levels of uric acid protect all of us from aging is not especially relevant at that moment. Nonetheless, an evolutionary approach to medicine can be profoundly relevant. For instance, some well-meaning genetic engineer might well decide to adjust things so that we all have lower levels of uric acid in order to protect us from gout. This would be fine, except that we would then probably all begin aging more quickly. Natural selection creates many designs that are substandard, but when it has a chance to act on some continuously variable parameter, such as the circulating level of uric acid, it will usually approach an optimum, given the trade-offs and the specific environment in which the trait was shaped. Even in everyday practice, however, there is much that is immediately useful in an evolutionary approach to medicine. Recognition that diarrhea, fever, pain, nausea, vomiting, and anxiety are useful defenses allows us to treat them in a far more sophisticated way. On the one hand, it leads us to hesitate and think carefully about the normal function of a defense before we block it. On the other, it may allow us to feel confident that in a particular instance blocking the defense is of no consequence to the person's health, so that we can act aggressively to make the person feel better more quickly. This is especially common in the case of pain. In the area of public health, an evolutionary approach is of great importance in assessing environmental changes that might influence the virulence of pathogens.
In particular, settings in which vectors can transmit pathogens between passive hosts are recognized as particularly dangerous for shaping more virulent organisms, whether the vector is a mosquito or a doctor's hands. The use of condoms not only prevents the transmission of sexually transmitted diseases, it can also decrease their virulence. A sexually transmitted pathogen that causes quick death or incapacitation will tend to increase in virulence when people have many sexual partners; but when people use protective devices or abstain from dangerous sexual practices, selection will tend to favor strains of the pathogen that are less virulent. Similar principles may also be useful for vaccine design. We could go on at great length about other potential benefits of an evolutionary approach to medicine, but we wish to emphasize that most of the relevant research has not yet been done. Evolutionary questions have not been asked systematically about disease, and the methods for testing them are still being developed. What is needed now is not to jump quickly to a new theory of medical practice based on evolutionary biology, but to begin to educate physicians and patients about the evolutionary nature of the body and its vulnerabilities to disease. This will, we believe, quickly lead to specific advances in the treatment of individual diseases that will benefit individual patients. Even before that, however, it will help us all to a deeper understanding of the nature of the organism and the nature of its vulnerabilities to disease. From this viewpoint, the body is not a Platonic ideal, and the genetic code is not correct in any one particular version. Instead, genes, with considerable variation, make phenotypes that interact with environments and with other individuals, resulting in more or fewer offspring depending on the genes, the environment, their interactions, and chance.
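This closing point, that genes and environments jointly determine outcomes, can be made concrete with the myopia example discussed earlier: neither the genetic variation nor the novel environmental exposure alone produces the disease. The boolean model below is a deliberately simplified sketch of that interaction, not a claim about actual myopia genetics.

```python
def develops_myopia(has_risk_genes: bool, early_close_work: bool) -> bool:
    """Toy gene-by-environment interaction: the disease appears only
    when both the genetic variation AND the novel environmental
    exposure (early close work) are present."""
    return has_risk_genes and early_close_work

# The variant was invisible to selection in the ancestral environment,
# where early close work did not exist.
assert develops_myopia(True, True)        # modern child with the genes
assert not develops_myopia(True, False)   # ancestral child with the genes
assert not develops_myopia(False, True)   # modern child without the genes
```

Labeling such a condition "genetic" or "environmental" is exactly the confusion the text describes: removing either factor removes the disease.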
An extraordinary number of people are blessed with years and even decades of good health, and sometimes even happiness. Despite all our knowledge about how this is possible, it still seems nothing short of miraculous, even though no miracle is needed to explain it.

1. Williams GC, Nesse RM. The dawn of Darwinian medicine. Quarterly Review of Biology 1991;66(1):1-22.
2. Nesse RM, Williams GC. Why We Get Sick: The New Science of Darwinian Medicine. New York: Times Books, 1994.
3. Stearns S, ed. Evolution in Health and Disease. Oxford: Oxford University Press, 1998.
4. Williams GC. Natural Selection: Domains, Levels, and Challenges. New York: Oxford University Press, 1992.
5. Dawkins R. The Blind Watchmaker. New York: W. W. Norton, 1986.
6. Mayr E. How to carry out the adaptationist program? American Naturalist 1983;121:324-333.
7. Dawkins R. The Extended Phenotype: The Gene as the Unit of Selection. San Francisco: W.H. Freeman and Company, 1982.
8. Michod RE, Levin BR, eds. The Evolution of Sex. Sunderland, MA: Sinauer, 1988.
9. Hamilton WD, Axelrod R, Tanese R. Sexual reproduction as an adaptation to resist parasites: a review. Proceedings of the National Academy of Sciences 1990;87:3566-3573.
10. Williams GC. Sex and Evolution. Princeton, NJ: Princeton University Press, 1975.
11. Dugatkin LA. Cooperation Among Animals: An Evolutionary Perspective. New York: Oxford University Press, 1997. (Oxford Series in Ecology and Evolution.)
12. Betzig L, Mulder MB, Turke P, eds. Human Reproductive Behavior: A Darwinian Perspective. Cambridge: Cambridge University Press, 1988.
13. Daly M, Wilson M. Sex, Evolution, and Behavior. 2nd ed. Boston: Willard Grant Press, 1983.
14. Buss D. The Evolution of Desire. New York: Basic, 1994.
15. Partridge L, Hurst LD. Sex and conflict. Science 1998;281(5385):2003-8.
16. Diamond JM. The cost of living. Discover 1990(June):62-67.
17. Woolf L, McBean MS, Woolf F, Cahalane S.
Phenylketonuria as a balanced polymorphism: the nature of the heterozygote advantage. Annals of Human Genetics 1975;38(4):461-469.
18. Krebs JR, Dawkins RD. Animal signals: mind-reading and manipulation. In: Krebs JR, Davies NB, eds. Behavioral Ecology: An Evolutionary Approach. Sunderland, MA: Sinauer, 1984:380-402.
19. Alexander RD. The Arms Race: Some Thoughts of a Biologist. 1985.
20. Ewald P. Evolution of Infectious Disease. New York: Oxford University Press, 1994.
21. Goodenough UW. Deception by pathogens. American Scientist 1991;79:344-355.
22. Donelson J, Hill K, El-Sayed N. Multiple mechanisms of immune evasion by African trypanosomes. Mol Biochem Parasitol 1998;91(1):51-66.
23. Bertranpetit J, Calafell F. Genetic and geographical variability in cystic fibrosis: evolutionary considerations. Ciba Found Symp 1996;197:97-114.
24. Williams T, Maitland K, Bennett S, et al. High incidence of malaria in alpha-thalassaemic children. Nature 1996;383(6600):480-481.
25. Ruwende C, Hill A. Glucose-6-phosphate dehydrogenase deficiency and malaria. Journal of Molecular Medicine 1998;76(8):581-588.
26. Spyropoulos B. Tay-Sachs carriers and tuberculosis resistance. Nature 1988;331(6158):666.
27. Chadwick D, Cardew G, eds. Variation in the Human Genome. Chichester; New York: Wiley, 1996. (Ciba Foundation Symposium 197.)
28. Dobson A. The population biology of parasite-induced changes in host behavior. Q Rev Biol 1988;63(2):139-165.
29. DuPont HL, Hornick RB. Adverse effect of Lomotil therapy in shigellosis. J. Am. Med. Assoc. 1973;226:1525-1528.
30. Melzack R. The Puzzle of Pain. New York: Basic Books, 1973.
31. Marks IM, Nesse RM. Fear and fitness: an evolutionary analysis of anxiety disorders. Ethology and Sociobiology 1994;15(5-6):247-261.
32. Rosen JB, Schulkin J. From normal fear to pathological anxiety. Psych Rev 1998;105(2):325-350.
33. Kluger MJ, ed. Fever, Its Biology, Evolution, and Function. Princeton, NJ: Princeton University Press, 1979.
34. Gilbert P.
Human Nature and Suffering. Hove, UK: Lawrence Erlbaum, 1989. 35. Nesse RM, Berridge KC. Psychoactive drug use in evolutionary perspective. Science 1997;278:63-66. 36. Pier G, Grout M, Zaidi T, et al. Salmonella typhi uses CFTR to enter intestinal epithelial cells. Nature 1998;393(6680):79-82. 37. Goodwin FK, Jamison KR. Manic-Depressive Illness. New York: Oxford University Press, 1990. 38. Jamison KR. Touched with fire : manic-depressive illness and the artistic temperament. New York: Free Press, 1993. 39. Proctor RN. Racial Hygiene: Medicine Under the Nazis. Cambridge, MA: Harvard University Press, 1988. 40. Chadwick D, Cardew G, eds. Variation in the Human Genome. New York: John Wiley, 1996. 41. Eaton SB, Konner M, Shostak M. Stone Agers in the Fast Lane: Chronic Degenerative Diseases in Evolutionary Perspective. The American Journal of Medicine 1988;84(4):739-749. 42. Bowlby J. Attachment and loss. New York,: Basic Books, 1969. 43. Irons W. Adaptively relevant environments versus the environment of evolutionary adaptedness. Evolutionary Anthropology 1998;6(6):194-204. 44. Russell WMS, Russell C. Evolutionary and social aspects of disease. Ecology of Disease 1983;2:95-106. 45. Durham WH. Interactions of genetic and cultural evolution: models and examples. Human Ecol 1982;10(3):289-323. 46. Carpenter K. The relationship of pellagra to corn and the low availability of niacin in cereals. Experientia - Supplementum :-. 47. Sharma A. The thrifty-genotype hypothesis and its implications for the study of complex genetic disorders in man. Journal of Molecular Medicine 1998;76(8):568-571. 48. Neel J, Julius S, Weder A, Yamada M, Kardia S, Haviland M. Syndrome X: is it for real? Genetic Epidemiology 1998;15(1):19-32. 49. Eaton S, Eaton Sr, Konner M, Shostak M. An evolutionary perspective enhances understanding of human nutritional requirements. Journal of Nutrition 1966;126(6):1732-1740. 50. 
Murray CJL, Lopez AD, Harvard School of Public Health., World Health Organization., World Bank. The global burden of disease : a comprehensive assessment of mortality and disability from diseases, injuries, and risk factors in 1990 and projected to 2020. [Cambridge, Mass.]: Published by the Harvard School of Public Health on behalf of the World Health Organization and the World Bank ; Distributed by Harvard University Press, 1996. 51. Singh D, Young RK. Body weight, waist-to-hip ratio, breasts, and hips: Role in judgments of female attractiveness and desirability for relationships. Ethology & Sociobiology 1995;16(6):483-507. 52. Strassmann BI. The biology of menstruation in Homo sapiens: Total lifetime menses, fecundity and nonsynchrony in a natural fertility population. Current Anthropology 1997;38(1):123-129. 53. Eaton S, Pike M, Short R, et al. Women's reproductive cancers in evolutionary context. Quarterly Review of Biology 1994;69(3):353-67. 54. Berridge KC, Robinson TE. The mind of an addicted brain: sensitization of wanting versus liking. Current Directions in Psychological Science 1995;4:71-76. 55. Robinson TE, Berridge KC. The neural basis of drug craving: an incentive-sensitization theory of addiction. Brain Research Reviews 1993;18(3):247-91. 56. Williams GC. The Pony Fish's Glow : And Other Clues to Plan and Purpose in Nature. New York: Basic, 1997. 57. Austad SN. Why we age : what science is discovering about the body's journey throughout life. New York: J. Wiley & Sons, 1997. 58. Williams GC. Pleiotropy, natural selection, and the evolution of senescence. Evolution 1957;11(4):398-411. 59. Rose MR. Genetic Mechanisms for the Evolution of Aging. New York: Oxford, 1991. 
From checker at panix.com Wed Jan 11 16:11:07 2006 From: checker at panix.com (Premise Checker) Date: Wed, 11 Jan 2006 11:11:07 -0500 (EST) Subject: [Paleopsych] NYT: Nearly 100, LSD's Father Ponders His 'Problem Child' Message-ID: Nearly 100, LSD's Father Ponders His 'Problem Child' http://www.nytimes.com/2006/01/07/international/europe/07hoffman.html [The article is from Saturday, and he did indeed make it to his 100th earlier today. Because of this centennial, I'm not sending out a single long article today, but rather a few shorter ones. I am nearly finished reading Joel Garreau's _Radical Evolution_.] The Saturday Profile By CRAIG S. SMITH BURG, Switzerland ALBERT Hofmann, the father of LSD, walked slowly across the small corner office of his modernist home on a grassy Alpine hilltop here, hoping to show a visitor the vista that sweeps before him on clear days. But outside there was only a white blanket of fog hanging just beyond the crest of the hill. He picked up a photograph of the view on his desk instead, left there perhaps to convince visitors of what really lies beyond the windowpane. Mr. Hofmann will turn 100 on Wednesday, a milestone to be marked by a symposium in nearby Basel on the chemical compound that he discovered and that famously unlocked the Blakean doors of perception, altering consciousnesses around the world. As the years accumulate behind him, Mr. Hofmann's conversation turns ever more insistently around one theme: man's oneness with nature and the dangers of an increasing inattention to that fact. "It's very, very dangerous to lose contact with living nature," he said, listing to the right in a green armchair that looked out over frost-dusted fields and snow-laced trees. A glass pitcher held a bouquet of roses on the coffee table before him. "In the big cities, there are people who have never seen living nature, all things are products of humans," he said. "The bigger the town, the less they see and understand nature." 
And, yes, he said, LSD, which he calls his "problem child," could help reconnect people to the universe. Rounding a century, Mr. Hofmann is physically reduced but mentally clear. He is prone to digressions, ambling with pleasure through memories of his boyhood, but his bright eyes flash with the recollection of a mystical experience he had on a forest path more than 90 years ago in the hills above Baden, Switzerland. The experience left him longing for a similar glimpse of what he calls "a miraculous, powerful, unfathomable reality." "I was completely astonished by the beauty of nature," he said, laying a slightly gnarled finger alongside his nose, his longish white hair swept back from his temples and the crown of his head. He said any natural scientist who was not a mystic was not a real natural scientist. "Outside is pure energy and colorless substance," he said. "All of the rest happens through the mechanism of our senses. Our eyes see just a small fraction of the light in the world. It is a trick to make a colored world, which does not exist outside of human beings." He became particularly fascinated by the mechanisms through which plants turn sunlight into the building blocks for our own bodies. "Everything comes from the sun via the plant kingdom," he said. MR. HOFMANN studied chemistry and took a job with the Swiss pharmaceutical company Sandoz Laboratories, because it had started a program to identify and synthesize the active compounds of medically important plants. He soon began work on the poisonous ergot fungus that grows in grains of rye. Midwives had used it for centuries to precipitate childbirths, but chemists had never succeeded in isolating the chemical that produced the pharmacological effect. Finally, chemists in the United States identified the active component as lysergic acid, and Mr. Hofmann began combining other molecules with the unstable chemical in search of pharmacologically useful compounds. 
His work on ergot produced several important drugs, including a compound still in use to prevent hemorrhaging after childbirth. But it was the 25th compound that he synthesized, lysergic acid diethylamide, that was to have the greatest impact. When he first created it in 1938, the drug yielded no significant pharmacological results. But when his work on ergot was completed, he decided to go back to LSD-25, hoping that improved tests could detect the stimulating effect on the body's circulatory system that he had expected from it. It was as he was synthesizing the drug on a Friday afternoon in April 1943 that he first experienced the altered state of consciousness for which it became famous. "Immediately, I recognized it as the same experience I had had as a child," he said. "I didn't know what caused it, but I knew that it was important." When he returned to his lab the next Monday, he tried to identify the source of his experience, believing first that it had come from the fumes of a chloroform-like solvent he had been using. Inhaling the fumes produced no effect, though, and he realized he must have somehow ingested a trace of LSD. "LSD spoke to me," Mr. Hofmann said with an amused, animated smile. "He came to me and said, 'You must find me.' He told me, 'Don't give me to the pharmacologist, he won't find anything.' " HE experimented with the drug, taking a dose so small that even the most active toxin known at that time would have had little or no effect. The result with LSD, however, was a powerful experience, during which he rode his bicycle home, accompanied by an assistant. That day, April 19, later became memorialized by LSD enthusiasts as "bicycle day." Mr. Hofmann participated in tests in a Sandoz laboratory, but found the experience frightening and realized that the drug should be used only under carefully controlled circumstances. 
In 1951, he wrote to the German novelist Ernst Jünger, who had experimented with mescaline, and proposed that they take LSD together. They each took 0.05 milligrams of pure LSD at Mr. Hofmann's home accompanied by roses, music by Mozart and burning Japanese incense. "That was the first planned psychedelic test," Mr. Hofmann said. He took the drug dozens of times after that, he said, and once experienced what he called a "horror trip" when he was tired and Mr. Jünger gave him amphetamines first. But his hallucinogenic days are long behind him. "I know LSD; I don't need to take it anymore," Mr. Hofmann said. "Maybe when I die, like Aldous Huxley," who asked his wife for an injection of LSD to help him through the final painful throes of his fatal throat cancer. But Mr. Hofmann calls LSD "medicine for the soul" and is frustrated by the worldwide prohibition that has pushed it underground. "It was used very successfully for 10 years in psychoanalysis," he said, adding that the drug was hijacked by the youth movement of the 1960's and then demonized by the establishment that the movement opposed. He said LSD could be dangerous and called its distribution by Timothy Leary and others "a crime." "It should be a controlled substance with the same status as morphine," he said. Mr. Hofmann lives with his wife in the house they built 38 years ago. He raised four children and watched one son struggle with alcoholism before dying at 53. He has eight grandchildren and six great-grandchildren. As far as he knows, no one in his family besides his wife has tried LSD. Mr. Hofmann rose, slightly stooped and now barely reaching five feet, and walked through his house with his arm-support cane. When asked if the drug had deepened his understanding of death, he appeared mildly startled and said no. "I go back to where I came from, to where I was before I was born, that's all," he said. 
From checker at panix.com Wed Jan 11 16:13:31 2006 From: checker at panix.com (Premise Checker) Date: Wed, 11 Jan 2006 11:13:31 -0500 (EST) Subject: [Paleopsych] Mitchell Consulting: India and China may not have so many engineers Message-ID: India and China may not have so many engineers http://mitchellconsulting.net/commonsense/?cat=17 Common Sense Technology Common sense views on technology and related subjects Read in 152 countries since 1995 January 10, 2006 [Referenced articles to the second degree appended.] [Add the creativity gap to the IQ gap. When trying to reform their education systems, Asians are not so concerned about instilling basic literacy and numeracy (as Africans, Middle Easterners, and Latin Americans are doing) as they are about getting their students to think for themselves. [As GRIN technologies continue, fostering creativity is going to become more and more critical, as well as higher paying. [The article supplies a much-needed corrective about overcounting engineers in India and China. I should add that lawyers are undercounted there, as much of the work that lawyers handle in the United States is handled by those without law degrees, at least in Japan, where conflict resolution is also handled at a lower level. It's not that there is less conflict so much as that it is handled differently, as far as the statistics go. So when you hear that Americans produce ten times as many lawyers as engineers, while Japan does the opposite, question it. [Of course, there is lots of Central Planning rhetoric throughout these articles, as though some bureaucrat, or even pundit, can say how many engineers (or even lawyers) "we" will "need." If there's a market failure, I want to hear about it, and if the prospective government cure will have a government failure less onerous than the market failure, I really want to hear about it. 
[As far as getting an MBA goes, I certainly can't tell Vivek Wadhwa, the author of most of the articles below, that he would have done as well by learning on the job, gaining more job experience, not having had to pay tuition, and not having his salary reduced while he got the MBA as by doing what he did. But those who wasted their time in business school usually don't write about it. All studies I've seen (none of them very good) show that the private rate of return to post-baccalaureate education is less than that for college (and college is less than high school, and high school less than K-8). All these studies give exaggerated rates of return, since most leave out innate ability (heredity) and all leave out effort (free will), but the sheer consistency of lower rates of return as one goes up the education ladder is very strong evidence that the decline is real. [While I'm at it, there are also studies of the public rate of return to education, meaning here how much GDP goes up. If the public rate exceeds the private rate, then there is a case, hardly conclusive, for subsidizing education. Alas, these studies have different kinds of flaws from those estimating the private rate of return but similarly exaggerate it. The estimates from both of these kinds of studies vary enormously, with no showing at all that the public rate of return is greater than the private rate. One reason--I've just thought of this--is that, since Big Ed has managed to get every other form of job qualification besides educational credentials banned for being "discriminatory," educational credentials have become a costly surrogate for IQ tests (and aren't as reliable, as we learned from the analysis of the National Longitudinal Study of Youth in _The B*ll C**ve_) and have value as positioning. 
While not as irrelevant as the examinations in ancient China for entry into the bureaucracy, it is not at all clear what sitting behind desks listening to professors drone on for all those extra years is actually good for. [Here are two paragraphs from an article below, which draws the distinction between "dynamic" and "transactional" engineers: DYNAMIC RANGE. Dynamic engineers are individuals capable of abstract thinking and high-level problem-solving. These engineers thrive in teams, work well across international borders, have strong interpersonal skills, and lead innovation. Transactional engineers may possess engineering fundamentals, but not the experience or expertise to apply this knowledge to larger problems. These individuals are typically responsible for rote and repetitive tasks in the workforce. What differentiates the two types of engineers is their education. The capstone design course that dynamic engineers study in their senior year enables them to integrate knowledge gained from fundamental coursework in the applied sciences and engineering. [Somehow I doubt that how the senior year in engineering school is spent makes much difference. I would think that heredity dominates, but even so, this senior year (hey, why not all one's years in school!) could be thought out much better than it has been, and those who have the capacity and temperament to become dynamic engineers could be helped to become so. [We will be hearing much more about the creativity gap in the future, so stay tuned.] ------------------ Amidst reports that India and China are graduating more "engineers" than the U.S. comes some more actual data that notes we are not comparing the same skill sets when we compare most engineers in India or China to U.S. engineers. 
See Filling the Engineering Gap Interesting quote: Contrary to the popular view that India and China have an abundance of engineers, recent studies show that both countries may actually face severe shortages of dynamic engineers. The vast majority of graduates from these countries have the qualities of transactional engineers. This reminds me of a number of years ago when I was doing on-campus recruiting of software and computer engineers. At one university campus, the majority of the students we interviewed were international students, studying in the U.S., and most from one or two countries. This report echoes our observations - we talked with many smart engineering students. They all would do great work if told rather specifically what to do. But we were looking for a willingness to think outside the box, to be creative, to lead, to fight for radical ideas - all traits we were not finding in this bunch of students. That does not mean these international students were ineffective - but it is clear that many U.S. firms prefer dynamic engineers over transactional engineers as described in this article. [snip rest] Filling the Engineering Gap http://www.businessweek.com/print/smallbiz/content/jan2006/sb20060109_693001.htm JANUARY 10, 2006 Viewpoint By Vivek Wadhwa The U.S. doesn't need to simply graduate more engineers. It needs more of the right kind of engineers. And more research into the problem sure would help The stakes are very high in the global economic race. As India and China strive to catch up, the debate continues about what the U.S. needs to do to maintain its lead. While it seems inevitable that other economies will grow, the issue here is whether their success will lead to greater prosperity for Americans or threaten our way of life. One of the few things on which both sides agree is that the U.S. needs to increase spending on education and research. Both the Democratic Party and the U.S. 
Chamber of Commerce announced policy initiatives last month prescribing an increase in the number of engineering graduates. They cited statistics that show the U.S. graduates about 70,000 engineers a year, while India and China graduate five and eight times that number, respectively. APPLES TO LITCHIS. While the remedy sounds good, the problem they're trying to solve isn't what it seems. The statistics that are being cited are inaccurate (see BW Online, 12/27/05, [3]"Engineering: Is the U.S. Really Falling?"). And simply mandating that the country should graduate more engineers may lead to a situation in which we graduate the wrong types of engineers and discourage future generations from studying engineering. As reader feedback shows, this debate is based more on emotion than fact. A lot more research is needed, and we need to differentiate between engineers. As I wrote in my last column (see BW Online, 12/13/05, [4]"About the Engineering Gap"), our study at [5]Duke University revealed that the engineering graduation numbers commonly cited in this debate are inaccurate. In an apples-to-apples comparison, the U.S. actually graduated more engineers than India last year, and the Chinese numbers aren't comparable. It's not that the U.S. graduation numbers are wrong; as Salil Tripathi from The Wall Street Journal reported, the comparison was false: Washington apples were being compared to Alphonso mangoes and Chinese litchis. Reader feedback to my column shows the diverging views and opinions on this topic. Some claimed that our numbers are absurd and that we were painting an unduly rosy picture of America. Others suggested that this was a misinformation campaign against India. Some thanked us for setting the record straight. Others felt that by saying America was strong, we had done a disservice to the country. Many who lauded our study suggested that previous inaccurate data was being used to disadvantage American workers. Some experts attacked our motives. 
HEATED DEBATE. Most surprising were comments from two journalists who interviewed me about the study. A reporter from a top Indian newspaper said this story would hurt India's "national ego" and chided me for being disloyal to the country of my origin. And the editor of a U.S. tech weekly demanded to know my nationality and asked if the Indian government had funded our project. On the other hand, Gail Pesyna, program director at the prestigious Alfred P. Sloan Foundation, said our report had generated a lot of excitement not only in the larger world, but also inside the Sloan Foundation. She thanked us for taking the debate up one notch. The most perceptive feedback we received was from Professor Jesse Ausubel of Rockefeller University. He was impressed that a team of students was quickly able to make a contribution to the factual aspects of the debate. He wrote, "I have never believed all the moaning and groaning about how hard it is to figure out the numbers.... I learned the main problem was that no one chose to break a sweat doing the research.... 'Facts' are few because few people work on ascertaining them, and many of those who want to use the 'facts' are happy to use a misleading selection that serves their interests." UNDERSTANDING OUTSOURCING. With facts being in short supply, both sides of the debate use available statistics to justify their positions. Many have lobbied to raise immigration barriers based on the threat from abroad. Yet in its State of American Business 2006 report, the Chamber of Commerce uses the incorrect engineering graduate numbers to argue that we should allow more immigration. So what should be done? Further research is needed on a subject of such critical national importance. The Duke study was a small step toward establishing certain baseline facts and reliable statistics. As Professor Ausubel notes, if a team of engineering students can accomplish so much within a semester, why not the experts and analysts? 
The Duke study tried to differentiate between the skill and education level of engineers and suggested that those with higher-quality education would always stay in demand. Study contributor Dr. Richard Schroth of Katzenbach Partners, who coined the terms "dynamic engineers" and "transactional engineers," argues that this is the best way of understanding the outsourcing threat. DYNAMIC RANGE. Dynamic engineers are individuals capable of abstract thinking and high-level problem-solving. These engineers thrive in teams, work well across international borders, have strong interpersonal skills, and lead innovation. Transactional engineers may possess engineering fundamentals, but not the experience or expertise to apply this knowledge to larger problems. These individuals are typically responsible for rote and repetitive tasks in the workforce. What differentiates the two types of engineers is their education. The capstone design course that dynamic engineers study in their senior year enables them to integrate knowledge gained from fundamental coursework in the applied sciences and engineering. Contrary to the popular view that India and China have an abundance of engineers, recent studies show that both countries may actually face severe shortages of dynamic engineers. The vast majority of graduates from these countries have the qualities of transactional engineers. BEYOND THE NUMBERS. Dynamic engineers develop renewable energy sources, solutions for purifying water and sustaining the environment, low-cost health care, and vaccines for infectious diseases. They also manage projects and lead innovation. Talk to any CEO, CIO, or engineering manager, and they'll likely tell you that they're always looking for such people. With all the problems that need solving in the world, we probably need many more dynamic engineers. India and China need them as badly as the U.S. does. 
But by simply focusing on the numbers and racing to graduate more, we're going to end up with more transactional engineers -- and their jobs will likely get outsourced. _________________________________________________________________ [6]Wadhwa, the founder of two software companies, is an Executive-in-Residence/Adjunct Professor at Duke University. He is also the co-founder of TiE Carolinas, a networking and mentoring group. References 3. http://www.businessweek.com/bwdaily/dnflash/dec2005/nf20051223_7594_db039.htm 4. http://www.businessweek.com/smallbiz/content/dec2005/sb20051212_623922.htm 5. http://memp.pratt.duke.edu/outsourcing/ 6. mailto:vivek at wadhwa.com Engineering: Is the U.S. Really Falling? http://www.businessweek.com/print/bwdaily/dnflash/dec2005/nf20051223_7594_db039.htm?chan=db DECEMBER 27, 2005 NEWS ANALYSIS By Pete Engardio Numbers cited to prove that graduation rates in India and China dwarf those in the U.S. may be flawed. But the fear is all too real Is America losing its competitive edge in engineering? Top Silicon Valley executives, U.S. think-tanks, industry associations, and university deans have all pointed out dropping enrollment in American science and tech programs and warn of a brewing problem. And in a November survey of 4,000 U.S. engineers, 64% said outsourcing makes them worry about the profession's future, while less than 10% feel sure America will maintain its leadership in technology. Such gloom is reinforced by a raft of oft-cited statistics: the U.S. graduates only 70,000 engineers a year, and enrollment in engineering schools is declining fast. India, meanwhile, turns out 350,000 engineers annually, while Chinese universities produce 600,000, by some estimates. Indeed, with Asian techies earning anywhere from a quarter to a tenth of what their Western counterparts do, doomsayers might ask why any intelligent young American would pursue engineering. FUZZY DEFINITIONS. But how accurate are such numbers? 
And how does the theory of American decline square with the reality that graduates of good U.S. engineering schools seem to have little problem finding jobs? Vivek Wadhwa, a founder of several tech startups and an occasional contributor to BusinessWeek Online, who's now an executive in residence at Duke University, says he got so disturbed by the anxieties of bright engineering students that he helped supervise a study released in December to get to the bottom of such questions. The conclusion: Because of fuzzy definitions of "engineering graduate," estimates of Indian and Chinese numbers can be wildly exaggerated, while America's are understated. Just look at the numbers using consistent criteria. If one counts people who study computer science and information technology as engineers -- as India does -- then the U.S. grants 134,000 four-year engineering degrees annually. Indeed, the U.S. is producing far more engineers per capita than either of Asia's emerging superpowers. Indian schools grant only 122,000 four-year engineering degrees (and almost as many three-year degrees), while China generates 351,000. "SPREADING PROPAGANDA." But China's statistics may still be inflated because the definition of an engineer can vary widely from province to province. In some cases, auto mechanics are included. "The numbers seem to include anybody who has studied anything technical," Wadhwa says. The bottom line is that America's engineering crisis is a myth, Wadhwa argues. Both sides in the globalization debate are "spreading propaganda," he contends. India and China are using inflated engineering numbers because they want to draw more foreign investment, while fearmongers in the U.S. use dubious data either to support their case for protectionism, to lobby for greater government spending on higher education and research, or to justify their offshore investments. The study, though, is already coming under fire. 
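The per-capita claim can be sanity-checked with back-of-the-envelope arithmetic. A minimal sketch, using the four-year degree counts reported in the article but with population figures that are assumptions on my part (rough mid-2000s estimates, not from the article):

```python
# Rough per-capita comparison of four-year engineering graduates.
# Degree counts are the article's; populations (millions) are assumed
# approximate mid-2000s figures, used only for illustration.
four_year_grads = {"U.S.": 134_000, "India": 122_000, "China": 351_000}
population_millions = {"U.S.": 296, "India": 1100, "China": 1300}

per_million = {
    country: grads / population_millions[country]
    for country, grads in four_year_grads.items()
}
for country, rate in per_million.items():
    print(f"{country}: ~{rate:.0f} four-year engineering graduates per million people")
```

On these assumed populations the U.S. comes out at roughly 450 graduates per million, India near 110, and China near 270 -- consistent with the article's claim that the U.S. leads per capita even if China leads in absolute numbers.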
Wadhwa says he's getting notes from researchers who challenge its soothing conclusions, and some U.S. engineers say it doesn't match the grim reality they're witnessing in downsized American R&D labs. And other studies point to different signs of ebbing American dominance in science and technology: The U.S. share of scientific papers is declining. Federal funding for research is falling. And even though American engineering schools may be producing more grads than some data might indicate, many of their students come from overseas. "ARE YOU KIDDING?" The debate raises an intriguing question: Does hype about the rise of India and China unnecessarily demoralize American engineers and scare U.S. students away from technical careers? Most surveys of U.S. corporate executives, after all, conclude that America is already facing a shortage of engineers in everything from software and chemicals to life sciences, and these shortfalls will worsen in coming years. Even the November survey of 4,000 engineers, by public relations firm McClenahan Bruer Communications and CMP publishing group, found that 56% said their own companies currently have a shortage of engineers. The survey confirms that the psychological impact of U.S. offshoring may be just as big as the reality. In focus groups, engineers overwhelmingly said they believe their work is important to society. "But when we asked whether they think society appreciates what they do, they looked at us with blank faces and said, 'Are you kidding?'" says Kerry McClenahan, who runs the PR company behind the survey. Another problem is that many of the U.S. engineers who are getting displaced lack the more demanding skills required by American tech companies today. Because routine tasks can be done more cheaply offshore, many executives say, they need U.S. engineers who can rapidly move on to next-generation technologies, work well with customers, and manage R&D teams. COUNTERARGUMENTS COMING. 
Wadhwa describes it as a gap between "transactional" engineers and "dynamic" ones. The former are good at fundamentals but have a hard time applying their knowledge to broader problems. Dynamic engineers are more capable of abstract thinking, work well in teams, and can lead innovation. India and China have dynamic engineers, too, but U.S. companies still need many of them on staff at home. "What I'm seeing is that transactional engineers in the U.S. are being replaced by dynamic engineers offshore." The contention that only engineers with routine skills are put at risk by offshoring will surely provoke counterarguments. But at the very least, the Duke study has helped take the debate over declining U.S. competitiveness up a notch. [4]Engardio is a senior writer for BusinessWeek in New York References 4. mailto:pete_engardio at businessweek.com About That Engineering Gap... http://www.businessweek.com/print/smallbiz/content/dec2005/sb20051212_623922.htm DECEMBER 13, 2005 Viewpoint By Vivek Wadhwa Is the U.S. really falling behind China and India in education? Not really. Take a closer look at the data There are few topics that generate as much heated debate as outsourcing. One side argues that globalization will lead to greater innovation and prosperity, the other says we are increasing unemployment and misery. Everyone agrees that what's at stake is America's standard of living and world economic leadership. One would expect that the numbers used in such a debate would be defensible and grounded. Yet researchers at Duke University have determined that some of the most cited statistics on engineering graduates are inaccurate. Statistics that say the U.S. is producing 70,000 engineers a year vs. 350,000 from India and 600,000 from China aren't valid, the Duke team says. We're actually graduating more engineers than India, and the Chinese numbers aren't quite what they seem. In short, America is far ahead by almost any measure, and we're a long way from losing our edge. 
Unfortunately, the message students are getting is that many engineering jobs will be outsourced and U.S. engineers have a bleak future of higher unemployment and lower remuneration. This could result in a self-fulfilling prophecy, as fearful young scholars stick to supposedly "outsourcing-proof" professions. In other words, we have more to fear from fear itself. RESEARCH FELLOWS. Having been a tech exec and co-producer of a Bollywood film, I've long been at the center of the outsourcing debate. I wrote about how my own son called me unpatriotic and argued that I was doing wrong for America (see BW Online, 3/12/04, [3]"My Son, It's Time to Talk of Outsourcing..."). Yet, in my new life in academia, I couldn't answer the first question my engineering students asked (see BW Online, 9/14/05, [4]"Degrees of Achievement"). They wondered what courses would lead to the best job prospects and what jobs were "outsourcing proof." I knew that with a master's in engineering management from Duke, these students were destined to be leaders, and that leadership can never be outsourced. Yet I was no expert on engineering majors across the world. Dean Kristina Johnson of Duke's Pratt School of Engineering suggested we research the topic. I enlisted the help of Professor Gary Gereffi, a world-renowned sociologist and Duke outsourcing expert, and we picked a team of our brightest students. We set out to compare international engineering degrees and analyze employment opportunities. As you do in any study, we started by assessing the facts. The problem was that facts were in short supply. BIPARTISAN CONSENSUS. In recent years, the worldwide media has cited graduation numbers that show a huge imbalance of engineering graduates coming out of Chinese and Indian schools. One commonly cited set of figures is 600,000 engineers graduated annually from institutions of higher education in China, 350,000 from India, and 70,000 from the U.S. Top business publications have repeated these numbers.
So have political leaders across the spectrum -- from [5]Ted Kennedy to [6]Newt Gingrich. The Congressional Record references these numbers. Even the prestigious National Academies issued [7]a press release asking for federal support to bolster U.S. competitiveness, citing these numbers as part of its argument. The U.S. numbers were easy to verify. The National Center for Education Statistics and the American Society for Engineering Education provided useful data. However, international numbers were a different story. LOST IN TRANSLATION. Several reports cited the Ministry of Education in China and the National Association of Software & Service Companies (NASSCOM) in India. Yet none of the reports issued by these authorities that we read matched the numbers being reported. So we called registrars of the largest universities in India and China. Chinese universities readily provided high-level data, but not enough detail. Some Indian registrars were helpful and shared comprehensive spreadsheets. Others claimed not to know how many engineering colleges were affiliated with their schools, or they lacked detail on graduation rates by major. We eventually found our way to knowledgeable employees of the Chinese Education Ministry, and the research head of NASSCOM, Sunil Mehta. After extensive discussions and reviews of more reports and data, we learned that no one was comparing apples to apples. The word "engineer" didn't translate well into different Chinese dialects and had no standard definition. We were told that reports received by the ministry from Chinese provinces didn't count degrees in a consistent way. A motor mechanic or a technician could be considered an engineer, for example. Also, the numbers included all degrees related to information technology and specialized fields such as shipbuilding. DATA BANK. There were also "short-cycle" degrees, which were typically completed in 2 or 3 years. These are equivalent to associate degrees in the U.S.
Nearly half of China's reported degrees fell into this category. NASSCOM maintains extensive engineering graduation data. It gathers data from diverse sources and creates and validates projections and estimates. We couldn't get the data to perform accurate comparisons with China, so we matched the NASSCOM definition of engineer to U.S. numbers. We found that the U.S. was graduating 222,335 engineers, vs. 215,000 from India. The closest comparable number reported by China is 644,106, but it includes additional majors. Looking strictly at four-year degrees and without considering accreditation or quality, the U.S. graduated 137,437 engineers, vs. 112,000 from India. China reported 351,537 under a broader category. All of these numbers include information technology and related majors ([8]click here to read the full Duke report). WORLD OF DOUBT. What's the point? We hear repeatedly that America is in trouble and that the root cause lies with our education system. There's no doubt that K-12 science and math could be improved, and few will dispute that America needs to invest more in education and research. However, our higher education system isn't in trouble -- in fact, it's still the world's best. We spend the most on research, produce the most patents, have the most innovative curriculum, and educate many of the world's leaders. Take Duke University. It spends $50 million a year just on engineering research, and members of its faculty are world-renowned. The message that our engineering graduates compete with 1 million graduates from India and China has created a sense of fear, uncertainty, and doubt. Why would a smart student enter a field where their job might soon be outsourced? Rather than encouraging our children to study more math and science and become engineers, we're turning them into lawyers. When the world hears that the U.S. education system is in decline, we scare away those who would otherwise come here to study.
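The apples-to-apples comparison described above can be summarized in a small table. The following is a minimal sketch that simply tabulates the figures quoted from the Duke study in this article; the table layout is my own, and, as the article notes, the Chinese counts are the closest comparable numbers China reports and span broader categories of majors:

```python
# Engineering graduation counts as quoted from the Duke study above.
# "China" figures include additional majors and are not strictly comparable.
figures = {
    "incl. short-cycle (associate-level) degrees": {
        "U.S.": 222_335,
        "India": 215_000,
        "China": 644_106,  # broader category
    },
    "four-year degrees only": {
        "U.S.": 137_437,
        "India": 112_000,
        "China": 351_537,  # broader category
    },
}

for basis, counts in figures.items():
    print(basis)
    for country, n in counts.items():
        print(f"  {country:6s}{n:>9,}")
```

On either basis, the quoted U.S. count exceeds India's, which is the article's point against the commonly cited 70,000 vs. 350,000 vs. 600,000 figures.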
To keep America competitive, we must keep attracting the world's best and brightest. America needs to do all it can to fuel innovation and maintain its lead in science and technology. By repeatedly sending the message that we're weak, we in fact become weak. _________________________________________________________________ [9]Wadhwa, the founder of two software companies, is an Executive-in-Residence/Adjunct Professor at Duke University. He is also the co-founder of TiE Carolinas, a networking and mentoring group. References 3. http://www.businessweek.com/smallbiz/content/mar2004/sb20040312_5086.htm 4. http://www.businessweek.com/smallbiz/content/sep2005/sb20050914_959737.htm 5. http://www.govtrack.us/congress/record.xpd?id=109-s20051025-16 6. http://www.newt.org/index.php?src=news&prid=1191&category=Speeches 7. http://www4.nationalacademies.org/news.nsf/isbn/0309100399?OpenDocument 8. http://memp.pratt.duke.edu/outsourcing 9. mailto:vivek at wadhwa.com Download Duke Outsourcing Report http://memp.pratt.duke.edu/outsourcing/ Download: [1]Framing the Engineering Outsourcing Debate: Placing the United States on a Level Playing Field with China and India Download: [2]Appendix This report is part of an ongoing study to compare the number of U.S. engineering graduates to those in developing nations, particularly India and China. This is a complex issue and requires further study but this preliminary report raises several questions about the numbers quoted in the popular press. This report was developed by graduate students of Duke University's [4]Master of Engineering Management Program in the [5]Pratt School of Engineering under the guidance of [6]Dr. Gary Gereffi, and [7]Vivek Wadhwa with consulting assistance from Katzenbach Partners LLC. References 1. http://memp.pratt.duke.edu/downloads/duke_outsourcing_2005.pdf 2. http://memp.pratt.duke.edu/downloads/duke_outsourcing_2005_appendix.pdf 3. http://memp.pratt.duke.edu/downloads/duke_outsourcing_2005.pdf 4. 
http://memp.duke.edu/ 5. http://pratt.duke.edu/ 6. http://fds.duke.edu/db/aas/Sociology/faculty/ggere 7. http://memp.pratt.duke.edu/people/staff.php My Son, It's Time to Talk of Outsourcing... http://www.businessweek.com/print/smallbiz/content/mar2004/sb20040312_5086.htm?chan=sb MARCH 12, 2004 BOLLYWOOD POSTCARD By Vivek Wadhwa ...and how, without those Russian programmers I hired, neither my U.S. startup nor the jobs it created would ever have existed When I got involved with a venture to produce a Hollywood movie in Bollywood, I was very excited about helping educate Americans on India and its culture, thinking we could make the world a better place by entertaining audiences for two hours. We were creating some jobs in the US, many more in India, and providing huge financial upside to American investors. True, we were going to film the movie in India to save money. But we were also planning to spend about half our budget in the U.S. And if we couldn't film in India at a lower cost the movie might not happen. So I figured some money spent in the U.S. economy is better than no money spent in the U.S. or India. In that sense, we were doing plenty to help the U.S. economy. FAMILY FEUD. I didn't imagine that anyone could call this unpatriotic or wrong. Not only was I mistaken, but my own 16-year-old son, Tarun, was the one who objected and called me unpatriotic. He said it was wrong to ship more jobs to India. The debate about outsourcing has become so heated and emotional in this election year that it is difficult to have a rational discussion about it, even at home. This isn't the first time I have had to deal with the issue of outsourcing. In the late 1980s, I was a vice-president at a major Wall Street financial institution, in charge of building technology to help improve the way the investment banking arm of the outfit did its business. Eventually, we spent about $150 million to modernize the computer systems, and the company was happy with the results. 
That success in 1990 led to a joint venture with IBM to market the technology we had created. There was one technical problem we were never able to solve even after spending $150 million. Solving it could have cut the cost of the project in half. We needed an automated way of translating old IBM mainframe code to modern computer languages that run on the latest computer platforms. To solve this problem, we needed computer programmers who excelled in both mathematics and computer science, and we simply couldn't find the right people for this task. We spent millions of dollars trying, however. HOMEGROWN JOBS. As fate would have it, I visited Russia in 1992, just after the break-up of the Soviet Union, and hired a team of 30 brilliant and hungry computer programmers who boasted just the right skills. They were grateful to work for a small fraction of American wages, and labored for years to solve this complex problem. Eventually, this technology led me to start my own company, Relativity Technologies. That company would not have existed, or saved its customers many millions of dollars, if we hadn't outsourced development to Russia. Relativity Technologies employed nearly 100 people in the U.S., paying many of them close to $100,000 per year -- jobs that would not have existed without help from our 50 Russian programmers. Despite this fact, I often received angry e-mails from people who learned about my company from news articles describing our venture. They accused me of "taking jobs away from the U.S." Some of the messages asked if I was ashamed of myself, and challenged my patriotism. Now, this was all before outsourcing became an election issue. I have always done everything possible to give back to the community in which I live. I have been loyal to America, the country which readily accepted me as a citizen, and have helped to create jobs. So I thought my logic was beyond reproach. But I must admit that the recent debate with my son really shook me up.
A WORLD OF POSSIBILITIES. My current project is to help produce a feature film in India. We have American producers. The script was written by Americans. We will have an American director working side by side with an Indian director. The lead actor will be British, but we will also have many Indian actors. We will shoot the movie in Mumbai using local talent, but expect to do the final production and editing in Los Angeles. We expect to spend $300K on marketing in the U.S. The budget of this film is less than $1M. I hope that it will gross many times that sum and deliver magnificent returns for its American investors, who will turn around and hand that money to money managers and accountants and car dealerships -- maybe even donate some of it to charity. And if the movie turns out the way I hope, it will educate both Americans and Indians about their respective cultures. Maybe those Indian directors will collaborate on more projects with American directors, and perhaps start their own studio with offices in Mumbai and Hollywood. They will employ actors, writers, directors, electricians, and computer wizards in California, where they will do the special effects. Films that could not be made before because of the cost or risk might now become feasible. The point of all this is that the current view of outsourcing and resulting job loss is simplistic and binary. The economies of the world are now interlinked more tightly than ever before. Witness my film and my story. A child of India immigrates to America from Australia, where he has been educated, starts a technology company and invests his own money to employ Americans. When I was growing up in Canberra, the Australian capital, such a trajectory would have been nearly unimaginable. Foreigners didn't start technology companies in the U.S. Today, it's an old story. In fact, the story today is that I am involved in making a movie in India.
And the technology company that I started is now expanding rapidly and is looking for more employees in the US. GRUDGING CONVERSION. Back to my 16-year-old son. I am always grateful when my American-born teenager has time to talk to me. Between all those instant messages and cell phone calls, teenage life is really busy. I was delighted that he wanted to talk to me, but the fact that the discussion was about outsourcing really surprised me. His views were even more unexpected. His concern was that, with so much unemployment and poverty in America, we needed to create jobs locally. Why was my company employing all those people in Russia, and why was I now working on creating jobs in India? He lectured me about the need to give back to the local community before we gave back to the world. Fortunately, it is easier to convince a teenager than a politician in an election year. After a discussion lasting hours, and walking him through some of the details, and showing him that we were making the pie bigger for everyone, he said that maybe what I was doing was okay, after all. He still believed that outsourcing was bad, but maybe we could do things in a way that everyone wins. _________________________________________________________________ [3]Vivek Wadhwa has co-founded two technology companies, and is currently chairman of Relativity Technologies in Raleigh, N.C. When not producing movies or battling venture capitalists, Wadhwa mentors fledgling entrepreneurs. Edited by Alex Salkever Degrees of Achievement http://www.businessweek.com/print/smallbiz/content/sep2005/sb20050914_959737.htm SEPTEMBER 14, 2005 Viewpoint By Vivek Wadhwa Much of what I learned at B-school seemed irrelevant -- until it boosted my earnings, launched a new career, and ushered me into academia Early in their careers, entrepreneurs have to make difficult decisions about the level of education to pursue. Business school is costly and takes years to complete -- and even longer to pay dividends. 
If you're a Bill Gates or Steve Jobs, schooling clearly won't make or break your career. However, for me, education probably made the difference between success and mediocrity. I've gained so much from my education that I've decided to give back to the system. TUITION BURDEN. Long ago, I had to make some very tough choices. With a bachelor's degree in computing, I was well set in my technical career. I took great pride in being a geek and would boast about my coding exploits at cocktail parties. Yet I knew that my understanding of the business world lacked depth, and I harbored a deep-rooted desire to get the best education possible. I signed up for an MBA at [3]New York University, but this was no easy decision. My wife and I had to choose between tuition and the down payment on a new house. Having a child on the way made it even more difficult. We ended up having to manage with a tiny, one-bedroom apartment in North Bergen, N.J., while I went to school. After starting the program, I wasn't so sure why my study included topics like economics, accounting, marketing, and operations management. These seemed so far from my technical world that I wondered if they would ever help me. About the time I completed my degree, I received an offer from investment-banking powerhouse Credit Suisse First Boston ([4]CTNX ) to join its programming staff. I'm still not sure if my MBA helped me land this job, but it didn't hurt. SELF-ESTEEM BUILDER. For the first couple of years, my business degree seemed totally irrelevant. Knowing about how the capital markets worked or understanding operations management didn't help me write better code. So I used to wonder if I had made the right choice. Over time, I realized that I had a much better understanding of how the business worked than some of my peers. And I felt confident in my dealings with the bank's managing directors as well as with user departments. I could present business proposals and participate in technology design meetings. 
I was able to persuade the company to invest in technology I had designed. When my team's success led to IBM's ([5]IBM ) funding of a spin-off company, I was offered the chief technology officer post. That's when my education really began to pay big dividends. A FEW ODDBALLS. In the startup world, it's simply survival of the fittest. You have to involve yourself with almost every aspect of the business -- and leverage all skills. I would find myself having to develop and manage budgets, help market and sell, hire, and motivate employees, assist in setting corporate strategy, and review legal contracts. Plus, I still had to develop technology and deal with all the uncertainties and failures that come with a startup. My MBA classes seemed to fit our business needs like the pieces of a jigsaw puzzle. Even obscure topics like corporate finance came in handy in IPO discussions with investment bankers and when I later raised capital for my own company. To be fair, there were some jigsaw pieces that I'll never use and others that didn't fit well. I can't imagine when I'm ever going to price an option with the Black-Scholes Model, for example. I wished that the program had had classes on selling and given us more interaction with the real world. HIGHER GOAL. In previous columns I've written about my adventures with India's Bollywood (see BW Online, 1/24/04, [6]"Bollywood, Here I Come!"). To my surprise, my education even helped in that universe. The principles of marketing, management, and accounting all have the same applications, no matter what the industry. With My Bollywood Bride nearing theatrical release and my having become disenchanted with this world of glamour, I began to wonder where I'd find the next mountain worth climbing. A chance meeting with Kristina Johnson, dean of the Pratt School of Engineering at Duke University, presented the opportunity I was looking for. 
With a PhD in Electrical Engineering, 44 patents to her name, and her involvement in many startup technology companies, Kristina is no ordinary academic. And she has a mission. GLOBAL EDGE. Kristina worries that the U.S. is losing its competitive edge as the graduation rate of engineers and scientists declines. To hold on to our quality of life, we have to maintain our competency and leadership in engineering, science, and technology. She also feels that graduating engineers and technicians generally lack business skills. She introduced me to Dr. Jeff Glass, who heads the Pratt Master of Engineering Management (MEM) program, which had put together a very innovative one-year course of study. The program seeks to prepare engineering graduates to go beyond their technical roots when they enter the workforce. By teaching topics such as marketing, management, and law, the MEM Program gives students a head start in the corporate world. And MEM graduates can better compete in the global economy, where lower-level technical functions are rapidly relocating to countries like India and China. HONORIFIC GETTER. Jeff wanted to bring the program closer to the business world, to set up an advisory board comprising business and academic leaders to guide the school. He wanted assistance in mentoring students and his faculty. Kristina wanted help in commercializing some of the revolutionary technologies her school was researching. And she wanted to create ties to universities in other countries. When Kristina asked me to join the university as executive-in-residence, I said yes immediately. I was looking for a way to give back to the education system that had given so much to me, and this seemed like the perfect role. Plus, it would allow me to add a title that I have respected since childhood. So, Professor Wadhwa it is.
_________________________________________________________________ [7]Wadhwa, the founder of two software companies, is an Executive-in-Residence/Adjunct Professor at Duke University. He is also the co-founder of TiE Carolinas, a networking and mentoring group. Edited by Rod Kurtz References 3. http://www.businessweek.com/bschools/04/full_time_profiles/stern.htm 4. javascript: void showTicker('CTNX') 5. javascript: void showTicker('IBM') 6. http://www.businessweek.com/smallbiz/content/jan2004/sb20040121_6183.htm 7. mailto:vivek at wadhwa.com Bollywood, Here I Come! http://www.businessweek.com/print/smallbiz/content/jan2004/sb20040121_6183.htm?chan=sb JANUARY 21, 2004 BOLLYWOOD POSTCARD By Vivek Wadhwa Meet our latest columnist, a U.S. entrepreneur-turned-film producer, who will be reporting on his new business in the world's busiest movie mecca After 25 years as a geek, hacker, programmer, project manager, chief technology officer, and finally, chief executive officer of a technology company, I have added a new title to my résumé: Bollywood film producer. And I owe it all to a heart attack, venture capitalists, and my son. My story is a simple one. I spent five years running at top speed to build a software outfit called Relativity Technologies, which helps companies modernize their legacy computer systems. Relativity received worldwide recognition, and in 2001 a top business magazine named it one of the world's 25 "coolest" companies. Like all other tech players, we were hit hard by the dot-com crash. To make ends meet, we had to lay off good employees and raise more money. We believed so much in our company that my management team and I personally invested the first $1 million in a $5 million round of financing. After we succeeded in getting back to solid growth and profits, I decided to take a much-needed vacation. HEART OF THE MATTER. It must have been the lack of Internet access on my Caribbean cruise that led to a massive heart attack.
I woke up in the hospital glad to be alive. While still in the Critical Care Unit, I received a phone call saying that my investors felt the need to renegotiate the terms of the current financing. Two days later and still bandaged, I left the hospital and walked, uninvited, into a closed-door meeting, where investors were trying to convince my executive team to accept more money for a revised agreement that would give them majority ownership. I flatly refused, and ended the meeting. My friends sent me lots of "get well" flowers. My investors sent me a letter demanding that I step aside and allow the younger brother of a partner in one of their firms to take over -- the company I had founded, built with most of my life savings, and paid for with a cardiac arrest. For the next few months, I spent what little energy I had fighting people I formerly held in high regard. Fortunately, that chapter of my life story had a happy ending. I won critical battles, kept control of Relativity, and eventually recruited a new CEO to take over from me. I also took my doctors' advice to do something less stressful than fight venture capitalists for a living, and regained my health. I kept my role as chairman of the board, but decided that my chief priority was to make up for lost time with my family. A FATHER'S DUTY. That led me to India, where my eldest son, Vineet, was taking his semester abroad in New Delhi. He is as American as can be, but Vineet had not only discovered his Indian roots, he also had acquired the Indian addiction to Bollywood movies. An Indian specialty, these films tend to be spicy, dramatic, and romantic musical extravaganzas, and should never be watched without a ready handkerchief. Think Moulin Rouge or Chicago with family-values morals, but even more over-the-top musical numbers and costumes. In India, Bollywood's stars are practically worshipped by the masses. When I told Vineet that I was taking a month off to spend with him, I don't think he believed me. 
I asked him where in India he would like to travel. His immediate answer? Bollywood, in the city of Mumbai -- formerly known as Bombay -- home of the world's largest film industry. And he asked for a rare favor: Could we meet some movie stars while we were there? His dad was famous and knew everyone, right? Of course I didn't know any movie stars -- I was just a techie who started a software company. However, I couldn't admit this to him. To make things worse, I didn't think I knew anyone who knew any movie stars, since most of my friends were like me. Yet I couldn't let my son down. After exhausting my list of Indian contacts, I recalled corresponding with an investment banker, Brad Listermann, who was raising money for a telecommunications outfit. When I talked to him a couple of weeks earlier, he had mentioned something about Bollywood. Grasping at straws, I sent him an urgent e-mail asking him for more information. LIMO TO LUXURY. What luck. As it turned out, he was married to former beauty queen and Bollywood star, Kashmira Shah. He had met her on the Internet, and after months of romancing via e-mail, they got together, fell in love, and married. Unfortunately, she was scheduled to be in Hollywood for a film that was shooting the week we were to be in Mumbai. Brad offered to introduce us to Feroz Khan instead. In most of South Asia and the Middle East, Feroz Khan is as well known as Clint Eastwood or Robert Redford. I gladly took Brad up on his offer. Believe me, we would have traveled anywhere to shake hands with Feroz. Still, we were taking all this with a grain of salt. I mean, who gets to meet Robert Redford after a couple of e-mails? We were surprised to get a phone call from Feroz when we landed in Mumbai, but even more surprised to get an invitation to his house for dinner. Would it really be Khan, Vineet asked, or a practical joke? 
Our doubts were dispelled when his Mercedes showed up at our hotel and drove us to his beachside mansion in Mumbai's prestigious Juhu Beach neighborhood. On the ride over, Vineet's only question was: What would we talk to Feroz about? Feroz greeted us at the door and showed us around his palace. The house boasted beautiful artwork and sculptures, massive crystal chandeliers, a film production studio, a gymnasium, and an indoor pool. Between the butler, servants, chauffeur, and security guards, there were at least 11 employees on duty that evening. STAR POWER. Feroz apologized that his son, Fardeen, was delayed in New Delhi. Fardeen is akin to the Tom Cruise of Bollywood. Instead, Feroz had invited Celina Jaitley, Miss India 2001, who has since become a popular Bollywood actress. My son and I looked at each other in utter disbelief when she walked in the door a few minutes later. Vineet almost fell off his chair. Clearly, that night we were visiting a different universe than the one we inhabited back in North Carolina. It was an enchanting evening. They told us stories about the production of their upcoming film, gossiped about their fellow Bollywood stars, and discussed the inner workings of beauty pageants. Feroz also gave me an interesting lesson on the dynamics and economics of the entertainment industry, and took me to his production studio, where he showed clips of his upcoming movie, Janasheen. I learned a lot in those four hours, later calling Brad Listermann to thank him and to ask how I could possibly return the huge favor of making me look like a superhero in the eyes of my son. He hesitated, then wondered if I could spend some time to review a business plan, and possibly introduce him to any members of the Indian tech community who might be interested in funding it. He had developed this plan with the help of his mentor, Duncan Clark, the ex-President of Columbia Tri-star's International Theatrical Division, who was fascinated by Bollywood's potential. 
The plan was to produce quality Hollywood movies in Bollywood, and to do so for one-tenth of what a comparable independent production would cost elsewhere. OLD SKILLS, NEW PLOT. Brad had also written a lovely story about an American who falls in love with an Indian movie star, and follows her back to Bollywood (sound familiar?). It showed India through the eyes of a bewildered Westerner. This script provided an opportunity to test Brad's theory about producing quality movies in Bollywood at a low cost. What excited me the most was that it would help build on the momentum of independent cross-over movies such as Bend It Like Beckham and Monsoon Wedding -- box-office successes that have done a great job of educating Americans about Indian culture, and have helped change old, hurtful stereotypes. Brad wanted to produce this movie, called "My Bollywood Bride." Duncan Clark had agreed to take the role of executive producer and oversee the production and distribution of this film. He wanted me to take a similar role and help with funding, review budgets and project plans, and assist with positioning, marketing, and promotion -- all things I did for a living as a tech executive. I didn't take much time to think about this, and readily agreed. I had already caught the "Bollywood bug." _________________________________________________________________ [3]Vivek Wadhwa has co-founded two technology companies, and is currently chairman of Relativity Technologies in Raleigh, N.C. When not producing movies or battling venture capitalists, Wadhwa mentors fledgling entrepreneurs. Edited by Alex Salkever References 3. mailto:vivek at relativity.com From checker at panix.com Wed Jan 11 16:16:07 2006 From: checker at panix.com (Premise Checker) Date: Wed, 11 Jan 2006 11:16:07 -0500 (EST) Subject: [Paleopsych] NYT: The Cute Factor Message-ID: The Cute Factor http://www.nytimes.com/2006/01/03/science/03cute.html By NATALIE ANGIER WASHINGTON, Jan.
2 - If the mere sight of Tai Shan, the roly-poly, goofily gamboling masked bandit of a panda cub now on view at the National Zoo, isn't enough to make you melt, then maybe the crush of his human onlookers, the furious flashing of their cameras and the heated gasps of their mass rapture will do the trick. "Omigosh, look at him! He is too cute!" "How adorable! I wish I could just reach in there and give him a big squeeze!" "He's so fuzzy! I've never seen anything so cute in my life!" A guard's sonorous voice rises above the burble. "OK, folks, five oohs and aahs per person, then it's time to let someone else step up front." The 6-month-old, 25-pound Tai Shan - whose name is pronounced tie-SHON and means, for no obvious reason, "peaceful mountain" - is the first surviving giant panda cub ever born at the Smithsonian's zoo. And though the zoo's adult pandas have long been among Washington's top tourist attractions, the public debut of the baby in December has unleashed an almost bestial frenzy here. Some 13,000 timed tickets to see the cub were snapped up within two hours of being released, and almost immediately began trading on eBay for up to $200 a pair. Panda mania is not the only reason that 2005 proved an exceptionally cute year. Last summer, a movie about another black-and-white charmer, the emperor penguin, became one of the highest-grossing documentaries of all time. Sales of petite, willfully cute cars like the Toyota Prius and the Mini Cooper soared, while those of noncute sport utility vehicles tanked. Women's fashions opted for the cute over the sensible or glamorous, with low-slung slacks and skirts and abbreviated blouses contriving to present a customer's midriff as an adorable preschool bulge. Even the too big could be too cute. King Kong's newly reissued face has a squashed baby-doll appeal, and his passion for Naomi Watts ultimately feels like a serious case of puppy love - hopeless, heartbreaking, cute.
Scientists who study the evolution of visual signaling have identified a wide and still expanding assortment of features and behaviors that make something look cute: bright forward-facing eyes set low on a big round face, a pair of big round ears, floppy limbs and a side-to-side, teeter-totter gait, among many others. Cute cues are those that indicate extreme youth, vulnerability, harmlessness and need, scientists say, and attending to them closely makes good Darwinian sense. As a species whose youngest members are so pathetically helpless they can't lift their heads to suckle without adult supervision, human beings must be wired to respond quickly and gamely to any and all signs of infantile desire. The human cuteness detector is set at such a low bar, researchers said, that it sweeps in and deems cute practically anything remotely resembling a human baby or a part thereof, and so ends up including the young of virtually every mammalian species, fuzzy-headed birds like Japanese cranes, woolly bear caterpillars, a bobbing balloon, a big round rock stacked on a smaller rock, a colon, a hyphen and a close parenthesis typed in succession. The greater the number of cute cues that an animal or object happens to possess, or the more exaggerated the signals may be, the louder and more italicized are the squeals provoked. Cuteness is distinct from beauty, researchers say, emphasizing rounded over sculptured, soft over refined, clumsy over quick. Beauty attracts admiration and demands a pedestal; cuteness attracts affection and demands a lap. Beauty is rare and brutal, despoiled by a single pimple. Cuteness is commonplace and generous, content on occasion to cosegregate with homeliness. Observing that many Floridians have an enormous affection for the manatee, which looks like an overfertilized potato with a sock puppet's face, Roger L. Reep of the University of Florida said it shone by grace of contrast. 
"People live hectic lives, and they may be feeling overwhelmed, but then they watch this soft and slow-moving animal, this gentle giant, and they see it turn on its back to get its belly scratched," said Dr. Reep, author with Robert K. Bonde of "The Florida Manatee: Biology and Conservation." "That's very endearing," said Dr. Reep. "So even though a manatee is 3 times your size and 20 times your weight, you want to get into the water beside it." Even as they say a cute tooth has rational roots, scientists admit they are just beginning to map its subtleties and source. New studies suggest that cute images stimulate the same pleasure centers of the brain aroused by sex, a good meal or psychoactive drugs like cocaine, which could explain why everybody in the panda house wore a big grin. At the same time, said Denis Dutton, a philosopher of art at the University of Canterbury in New Zealand, the rapidity and promiscuity of the cute response makes the impulse suspect, readily overridden by the angry sense that one is being exploited or deceived. "Cute cuts through all layers of meaning and says, Let's not worry about complexities, just love me," said Dr. Dutton, who is writing a book about Darwinian aesthetics. "That's where the sense of cheapness can come from, and the feeling of being manipulated or taken for a sucker that leads many to reject cuteness as low or shallow." Quick and cheap make cute appealing to those who want to catch the eye and please the crowd. Advertisers and product designers are forever toying with cute cues to lend their merchandise instant appeal, mixing and monkeying with the vocabulary of cute to keep the message fresh and fetching. That market-driven exercise in cultural evolution can yield bizarre if endearing results, like the blatantly ugly Cabbage Patch dolls, Furbies, the figgy face of E.T., the froggy one of Yoda. 
As though the original Volkswagen Beetle wasn't considered cute enough, the updated edition was made rounder and shinier still. "The new Beetle looks like a smiley face," said Miles Orvell, professor of American studies at Temple University in Philadelphia. "By this point its origins in Hitler's regime, and its intended resemblance to a German helmet, is totally forgotten." Whatever needs pitching, cute can help. A recent study at the Veterans Affairs Medical Center at the University of Michigan showed that high school students were far more likely to believe antismoking messages accompanied by cute cartoon characters like a penguin in a red jacket or a smirking polar bear than when the warnings were delivered unadorned. "It made a huge difference," said Sonia A. Duffy, the lead author of the report, which was published in The Archives of Pediatrics and Adolescent Medicine. "The kids expressed more confidence in the cartoons than in the warnings themselves." Primal and widespread though the taste for cute may be, researchers say it varies in strength and significance across cultures and eras. They compare the cute response to the love of sugar: everybody has sweetness receptors on the tongue, but some people, and some countries, eat a lot more candy than others. Experts point out that the cuteness craze is particularly acute in Japan, where it goes by the name "kawaii" and has infiltrated the most masculine of redoubts. Truck drivers display Hello Kitty-style figurines on their dashboards. The police enliven safety billboards and wanted posters with two perky mouselike mascots, Pipo kun and Pipo chan. Behind the kawaii phenomenon, according to Brian J. McVeigh, a scholar of East Asian studies at the University of Arizona, is the strongly hierarchical nature of Japanese culture. "Cuteness is used to soften up the vertical society," he said, "to soften power relations and present authority without being threatening." 
In this country, the use of cute imagery is geared less toward blurring the line of command than toward celebrating America's favorite demographic: the young. Dr. Orvell traces contemporary cute chic to the 1960's, with its celebration of a perennial childhood, a refusal to dress in adult clothes, an inversion of adult values, a love of bright colors and bloopy, cartoony patterns, the Lava Lamp. Today, it's not enough for a company to use cute graphics in its advertisements. It must have a really cute name as well. "Companies like Google and Yahoo leave no question in your mind about the youthfulness of their founders," said Dr. Orvell. Madison Avenue may adapt its strategies for maximal tweaking of our inherent baby radar, but babies themselves, evolutionary scientists say, did not really evolve to be cute. Instead, most of their salient qualities stem from the demands of human anatomy and the human brain, and became appealing to a potential caretaker's eye only because infants wouldn't survive otherwise. Human babies have unusually large heads because humans have unusually large brains. Their heads are round because their brains continue to grow throughout the first months of life, and the plates of the skull stay flexible and unfused to accommodate the development. Baby eyes and ears are situated comparatively far down the face and skull, and only later migrate upward in proportion to the development of bones in the cheek and jaw areas. Baby eyes are also notably forward-facing, the binocular vision a likely legacy of our tree-dwelling ancestry, and all our favorite Disney characters also sport forward-facing eyes, including the ducks and mice, species that in reality have eyes on the sides of their heads. The cartilage tissue in an infant's nose is comparatively soft and undeveloped, which is why most babies have button noses. Baby skin sits relatively loose on the body, rather than being taut, the better to stretch for growth spurts to come, said Paul H. 
Morris, an evolutionary scientist at the University of Portsmouth in England; that lax packaging accentuates the overall roundness of form. Baby movements are notably clumsy, an amusing combination of jerky and delayed, because learning to coordinate the body's many bilateral sets of large and fine muscle groups requires years of practice. On starting to walk, toddlers struggle continuously to balance themselves between left foot and right, and so the toddler gait consists as much of lateral movement as of any forward momentum. Researchers who study animals beloved by the public appreciate the human impulse to nurture anything even remotely babylike, though they are at times taken aback by people's efforts to identify with their preferred species. Take penguins as an example. Some people are so wild for the creatures, said Michel Gauthier-Clerc, a penguin researcher in Arles, France, "they think penguins are mammals and not birds." They love the penguin's upright posture, its funny little tuxedo, the way it waddles as it walks. How like a child playing dress-up! Endearing as it is, Dr. Gauthier-Clerc explained that the apparent awkwardness of the penguin's march had nothing to do with clumsiness or uncertain balance. Instead, he said, penguins waddle to save energy. A side-to-side walk burns fewer calories than a straightforward stride, and for birds that fast for months and live in a frigid climate, every calorie counts. As for the penguin's maestro garb, the white front and black jacket suits its aquatic way of life. While submerged in water, the penguin's dark backside is difficult to see from above, camouflaging the penguin from potential predators of air or land. The white chest, by contrast, obscures it from below, protecting it against carnivores and allowing it to better sneak up on fish prey. The giant panda offers another case study in accidental cuteness. 
Although it is a member of the bear family, a highly carnivorous clan, the giant panda specializes in eating bamboo. As it happens, many of the adaptations that allow it to get by on such a tough diet contribute to the panda's cute form, even in adulthood. Inside the bear's large, rounded head, said Lisa Stevens, assistant panda curator at the National Zoo, are the highly developed jaw muscles and the set of broad, grinding molars it needs to crush its way through some 40 pounds of fibrous bamboo plant a day. When it sits up against a tree and starts picking apart a bamboo stalk with its distinguishing pseudo-thumb, a panda looks like nothing so much as Huckleberry Finn shucking corn. Yet the humanesque posture and paws again are adaptations to its menu. The bear must have its "hands" free and able to shred the bamboo leaves from their stalks. The panda's distinctive markings further add to its appeal: the black patches around the eyes make them seem winsomely low on its face, while the black ears pop out cutely against the white fur of its temples. As with the penguin's tuxedo, the panda's two-toned coat very likely serves a twofold purpose. On the one hand, it helps a feeding bear blend peacefully into the dappled backdrop of bamboo. On the other, the sharp contrast between light and dark may serve as a social signal, helping the solitary bears locate each other when the time has come to find the perfect, too-cute mate. From guavaberry at earthlink.net Wed Jan 11 19:58:15 2006 From: guavaberry at earthlink.net (K.E.) Date: Wed, 11 Jan 2006 14:58:15 -0500 Subject: [Paleopsych] The failure to calculate the costs of war Message-ID: <7.0.0.16.0.20060111145653.0324f598@edu-cyberpg.com> With these costs taken into account, the total macroeconomic costs may add up to $750bn and total costs to $1,850bn. "We will bankrupt ourselves in the vain search for absolute security." -Dwight David Eisenhower, U.S.
general and 34th president (1890-1969) Martin Wolf is a respectable economist and chief economics commentator at the Financial Times Martin Wolf: The failure to calculate the costs of war http://news.ft.com/cms/s/48ad9c0a-820f-11da-aea0-0000779e2340.html Before the Iraq war began, Lawrence Lindsey, then president George W. Bush's economic adviser, suggested that the costs might reach $200bn. The White House promptly fired him. Mr Lindsey was indeed wrong. But his error lay in grossly underestimating the costs. The administration's estimates of a cost of some $50-$60bn were a fantasy, as were Saddam Hussein's weapons of mass destruction, and much else. So far the government has spent $251bn in hard cash. But the costs continue. If the US begins to withdraw troops this year, but maintains a diminishing presence for the next five years, the additional cost will be at least $200bn, under what Profs Bilmes and Stiglitz call their "conservative" option. Under their "moderate" one, the cost reaches $271bn, because troops remain until 2015. With these costs taken into account, the total macroeconomic costs may add up to $750bn and total costs to $1,850bn. It is possible to argue that the benefits for Iraq, the Middle East and the world will outweigh all these costs. But that depends on the emergence, in Iraq, of a stable and peaceful democratic order. That has not yet been achieved. Even those who supported the war must draw two lessons. First, the exercise of military power is far more expensive than many fondly hoped. Second, such policy decisions require a halfway decent analysis of the costs and possible consequences. The administration's failure to do so was a blunder that will harm the US and the world for years to come.
<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<> The Educational CyberPlayGround http://www.edu-cyberpg.com/ National Children's Folksong Repository http://www.edu-cyberpg.com/NCFR/ Hot List of Schools Online and Net Happenings, K12 Newsletters, Network Newsletters http://www.edu-cyberpg.com/Community/ 7 Hot Site Awards New York Times, USA Today , MSNBC, Earthlink, USA Today Best Bets For Educators, Macworld Top Fifty <>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<> From checker at panix.com Wed Jan 11 21:29:20 2006 From: checker at panix.com (Premise Checker) Date: Wed, 11 Jan 2006 16:29:20 -0500 (EST) Subject: [Paleopsych] NYT: Cells That Read Minds Message-ID: Cells That Read Minds http://www.nytimes.com/2006/01/10/science/10mirr.html [This is a particularly important article, for it connects social learning with the brain. File it under the G in GRIN: genetics, robotics, information, nanotech.] By SANDRA BLAKESLEE On a hot summer day 15 years ago in Parma, Italy, a monkey sat in a special laboratory chair waiting for researchers to return from lunch. Thin wires had been implanted in the region of its brain involved in planning and carrying out movements. Every time the monkey grasped and moved an object, some cells in that brain region would fire, and a monitor would register a sound: brrrrrip, brrrrrip, brrrrrip. A graduate student entered the lab with an ice cream cone in his hand. The monkey stared at him. Then, something amazing happened: when the student raised the cone to his lips, the monitor sounded - brrrrrip, brrrrrip, brrrrrip - even though the monkey had not moved but had simply observed the student grasping the cone and moving it to his mouth. The researchers, led by Giacomo Rizzolatti, a neuroscientist at the University of Parma, had earlier noticed the same strange phenomenon with peanuts. 
The same brain cells fired when the monkey watched humans or other monkeys bring peanuts to their mouths as when the monkey itself brought a peanut to its mouth. Later, the scientists found cells that fired when the monkey broke open a peanut or heard someone break a peanut. The same thing happened with bananas, raisins and all kinds of other objects. "It took us several years to believe what we were seeing," Dr. Rizzolatti said in a recent interview. The monkey brain contains a special class of cells, called mirror neurons, that fire when the animal sees or hears an action and when the animal carries out the same action on its own. But if the findings, published in 1996, surprised most scientists, recent research has left them flabbergasted. Humans, it turns out, have mirror neurons that are far smarter, more flexible and more highly evolved than any of those found in monkeys, a fact that scientists say reflects the evolution of humans' sophisticated social abilities. The human brain has multiple mirror neuron systems that specialize in carrying out and understanding not just the actions of others but their intentions, the social meaning of their behavior and their emotions. "We are exquisitely social creatures," Dr. Rizzolatti said. "Our survival depends on understanding the actions, intentions and emotions of others." He continued, "Mirror neurons allow us to grasp the minds of others not through conceptual reasoning but through direct simulation. By feeling, not by thinking." The discovery is shaking up numerous scientific disciplines, shifting the understanding of culture, empathy, philosophy, language, imitation, autism and psychotherapy. Everyday experiences are also being viewed in a new light. Mirror neurons reveal how children learn, why people respond to certain types of sports, dance, music and art, why watching media violence may be harmful and why many men like pornography. 
How can a single mirror neuron or system of mirror neurons be so incredibly smart? Most nerve cells in the brain are comparatively pedestrian. Many specialize in detecting ordinary features of the outside world. Some fire when they encounter a horizontal line while others are dedicated to vertical lines. Others detect a single frequency of sound or a direction of movement. Moving to higher levels of the brain, scientists find groups of neurons that detect far more complex features like faces, hands or expressive body language. Still other neurons help the body plan movements and assume complex postures. Mirror neurons make these complex cells look like numbskulls. Found in several areas of the brain - including the premotor cortex, the posterior parietal lobe, the superior temporal sulcus and the insula - they fire in response to chains of actions linked to intentions. Studies show that some mirror neurons fire when a person reaches for a glass or watches someone else reach for a glass; others fire when the person puts the glass down and still others fire when the person reaches for a toothbrush and so on. They respond when someone kicks a ball, sees a ball being kicked, hears a ball being kicked and says or hears the word "kick." "When you see me perform an action - such as picking up a baseball - you automatically simulate the action in your own brain," said Dr. Marco Iacoboni, a neuroscientist at the University of California, Los Angeles, who studies mirror neurons. "Circuits in your brain, which we do not yet entirely understand, inhibit you from moving while you simulate," he said. "But you understand my action because you have in your brain a template for that action based on your own movements. "When you see me pull my arm back, as if to throw the ball, you also have in your brain a copy of what I am doing and it helps you understand my goal. Because of mirror neurons, you can read my intentions. You know what I am going to do next." 
He continued: "And if you see me choke up, in emotional distress from striking out at home plate, mirror neurons in your brain simulate my distress. You automatically have empathy for me. You know how I feel because you literally feel what I am feeling." Mirror neurons seem to analyze scenes and to read minds. If you see someone reach toward a bookshelf and his hand is out of sight, you have little doubt that he is going to pick up a book because your mirror neurons tell you so. In a study published in March 2005 in Public Library of Science, Dr. Iacoboni and his colleagues reported that mirror neurons could discern if another person who was picking up a cup of tea planned to drink from it or clear it from the table. "Mirror neurons provide a powerful biological foundation for the evolution of culture," said Patricia Greenfield, a psychologist at U.C.L.A. who studies human development. Until now, scholars have treated culture as fundamentally separate from biology, she said. "But now we see that mirror neurons absorb culture directly, with each generation teaching the next by social sharing, imitation and observation." Other animals - monkeys, probably apes and possibly elephants, dolphins and dogs - have rudimentary mirror neurons, several mirror neuron experts said. But humans, with their huge working memory, carry out far more sophisticated imitations. Language is based on mirror neurons, according to Michael Arbib, a neuroscientist at the University of Southern California. One such system, found in the front of the brain, contains overlapping circuitry for spoken language and sign language. In an article published in Trends in Neuroscience in March 1998, Dr. Arbib described how complex hand gestures and the complex tongue and lip movements used in making sentences use the same machinery. Autism, some researchers believe, may involve broken mirror neurons. A study published in the Jan.
6 issue of Nature Neuroscience by Mirella Dapretto, a neuroscientist at U.C.L.A., found that while many people with autism can identify an emotional expression, like sadness, on another person's face, or imitate sad looks with their own faces, they do not feel the emotional significance of the imitated emotion. From observing other people, they do not know what it feels like to be sad, angry, disgusted or surprised. Mirror neurons provide clues to how children learn: they kick in at birth. Dr. Andrew Meltzoff at the University of Washington has published studies showing that infants a few minutes old will stick out their tongues at adults doing the same thing. More than other primates, human children are hard-wired for imitation, he said, their mirror neurons involved in observing what others do and practicing doing the same things. Still, there is one caveat, Dr. Iacoboni said. Mirror neurons work best in real life, when people are face to face. Virtual reality and videos are shadowy substitutes. Nevertheless, a study in the January 2006 issue of Media Psychology found that when children watched violent television programs, mirror neurons, as well as several brain regions involved in aggression, were activated, increasing the probability that the children would behave violently. The ability to share the emotions of others appears to be intimately linked to the functioning of mirror neurons, said Dr. Christian Keysers, who studies the neural basis of empathy at the University of Groningen in the Netherlands and who has published several recent articles on the topic in Neuron. When you see someone touched in a painful way, your own pain areas are activated, he said. When you see a spider crawl up someone's leg, you feel a creepy sensation because your mirror neurons are firing. People who rank high on a scale measuring empathy have particularly active mirror neuron systems, Dr. Keysers said.
Social emotions like guilt, shame, pride, embarrassment, disgust and lust are based on a uniquely human mirror neuron system found in a part of the brain called the insula, Dr. Keysers said. In a study not yet published, he found that when people watched a hand go forward to caress someone and then saw another hand push it away rudely, the insula registered the social pain of rejection. Humiliation appears to be mapped in the brain by the same mechanisms that encode real physical pain, he said. Psychotherapists are understandably enthralled by the discovery of mirror neurons, said Dr. Daniel Siegel, the director of the Center for Human Development in Los Angeles and the author of "Parenting From the Inside Out," because they provide a possible neurobiological basis for the psychological mechanisms known as transference and countertransference. In transference, clients "transfer" feelings about important figures in their lives onto a therapist. Similarly, in countertransference, a therapist's reactions to a client are shaped by the therapist's own earlier relationships. Therapists can use their own mirror system to understand a client's problems and to generate empathy, he said. And they can help clients understand that many of their experiences stem from what other people have said or done to them in the past. Art exploits mirror neurons, said Dr. Vittorio Gallese, a neuroscientist at Parma University. When you see the Baroque sculptor Gian Lorenzo Bernini's hand of divinity grasping marble, you see the hand as if it were grasping flesh, he said. Experiments show that when you read a novel, you memorize positions of objects from the narrator's point of view. Professional athletes and coaches, who often use mental practice and imagery, have long exploited the brain's mirror properties perhaps without knowing their biological basis, Dr. Iacoboni said. Observation directly improves muscle performance via mirror neurons. 
Similarly, millions of fans who watch their favorite sports on television are hooked by mirror neuron activation. In someone who has never played a sport - say tennis - the mirror neurons involved in running, swaying and swinging the arms will be activated, Dr. Iacoboni said. But in someone who plays tennis, the mirror systems will be highly activated when an overhead smash is observed. Watching a game, that person will be better able to predict what will happen next, he said. In yet another realm, mirror neurons are powerfully activated by pornography, several scientists said. For example, when a man watches another man have sexual intercourse with a woman, the observer's mirror neurons spring into action. The vicarious thrill of watching sex, it turns out, is not so vicarious after all. From checker at panix.com Wed Jan 11 21:49:17 2006 From: checker at panix.com (Premise Checker) Date: Wed, 11 Jan 2006 16:49:17 -0500 (EST) Subject: [Paleopsych] CHE: Life, Death, and Biocultural Literacy Message-ID: Life, Death, and Biocultural Literacy The Chronicle of Higher Education, 6.1.6 http://chronicle.com/weekly/v52/i18/18b00901.htm [Read this one carefully. I often tend toward corner solutions rather than compromises myself, though. I'd rather that public monies not be spent on people who do not generate positive externalities. In other cases, I'd make a legal presumption toward death: unless someone has expressly declared that he wants his body to be kept alive as long as possible AND has forked over the money to provide for it, the law should presume that those in terminal pain and those who have gone senile do not want such support. [I would not pull the plug on my own mother, any more than Peter Singer did on his. I am not an early adopter of an irreverence toward mere life that I'd like to see adopted, which it will be as the Baby Boomers do not go to the polls like the G.I.
and self-styled "greatest" generation did, and won't therefore be nearly so successful as rent-seekers. And the upcoming labor force to provide these rents will be far less capable, due to demographic shifts. The way out is to change ethical attitudes toward the prolongation of life. [By the way, health care is NOT getting more expensive. It is in fact getting much cheaper. Health care spending is, of course, going up, but this spending purchases the latest and highest-tech care and is largely demand driven. A 1985 level of health care is cheaper now than it was in 1985. I wish I had some actual figures here. I also wish I could establish my estimate that life expectancy under 1985 levels of spending would mean one year less of life expectancy. Life expectancy went up five years in the last twenty years (a trend going back over a hundred years). My guess is that a fifth of it is due to spending. [Read the article carefully, to learn about the inconsistencies in various "conservative" and "liberal" positions. Then wait for the next article, which will throw cold water on the whole thing. Pause first, though, to try to Check its major Premise.] By LENNARD J. DAVIS It is a literary convention that at the moment of death, one may finally come to know oneself. In many of Dickens's novels, for example, you'll find a touching deathbed scene. As the dying character fades away, he or she utters a few summarizing words -- or, in the case of Shakespeare, a great deal of them -- and those around can wipe away a tear and find some significance in the person's demise. In literature, one's identity, paradoxically, often comes to fruition at the moment of death. But while Dickens had metaphorical harps and angels to enhance self-revelation at the time of death, we have ventilators, feeding tubes, and defibrillators. Death for us isn't so much a final revelation of identity as a series of decisions preceding a finality.
Our sense of identity is much less clear than it was for people in the past. For Dickens and his compeers, the division between life and death was fairly knowable. But now, at every step through our life and death, we have to take into account technological innovations that newly define what it means to be human. As a result of the publicity surrounding the Terri Schiavo case -- a legal brawl between her husband and other family members about whether her feeding tube should be removed after she had been in what some doctors had diagnosed as a "persistent vegetative state" for 15 years -- many of us are writing living wills. In so doing, we have to think about not only what it means to be a human, but also at what point people cease to have identities. Liberals might argue that one's identity ceases to exist with the loss of a certain level of consciousness, accompanied by the necessity of mechanical life support, such as a feeding tube and ventilator. The religious right contends that one has an identity as long as one's heart is beating, regardless of one's cognitive function or the need for external life support. Some people see being a "vegetable" as an insult to existence, while others see it as a variety of life. In writing our own living wills, we must attempt to define our identity and to project what our identity would and should be if we were comatose -- that is, permanently unconscious and unresponsive; in a persistent vegetative state -- that is, awake but unaware; minimally conscious; or severely disabled. By doing that, each of us is wrestling in our small corner of existence with very large questions concerning the point at which identity meets biotechnology. The problem is that most of us are ill equipped to make those choices because we know so little about the facts of life and death.
That is probably one reason why at least a third of people who make advance directives change their opinions within two years, according to a 2004 Hastings Center report, "The Failure of the Living Will," by Angela Fagerlin, a medical researcher, and Carl E. Schneider, a law professor, both at the University of Michigan. Our college educations provide us with almost no way to sort through such end-of-life decisions. Most of us know very little about biology, don't keep up on recent developments in neurology, and barely know the difference between a coma and a persistent vegetative state. We rely on our physicians to tell us about the complexities of medicine, and some of us search the Internet to find out what our doctors won't say. In short, we have to make up and cobble together what we didn't learn in school. There are few if any college-preparatory courses, and no single discipline, that prepare us to grapple with the questions that are emerging in the postmillennial public sphere. So when it comes to understanding what makes us human, what defines consciousness and personhood, when life begins and ends, we often have to shoot from the hip. And that can mean we end up shooting ourselves in the foot. When we as a culture have to address issues of life and death -- such as whether we should allow stem-cell research or third-trimester abortions, whether we should cause people like Terri Schiavo to die, whether people in Oregon are right to allow physician-assisted suicide -- we are often at a loss and inconsistent in our positions. The public historically has turned to scholars and researchers to inform difficult public debates. But it isn't really clear what part of the academy should be the go-to profession or department.
Certainly bioethics seems a logical area of study to resolve contemporary questions of life and death, but few undergraduates are expected to take a required course on those issues, and the field of bioethics itself tends to be fairly specific, dealing mostly with medicine, too often without connecting the ethical issues to a broader vision that includes history and culture. Philosophy and political theory are rich areas of study for dealing with life-and-death issues, and the writings of John Rawls, Richard Rorty, and John Stuart Mill can help us understand citizens' rights and liberal thought. But we then would have to graft those discussions onto situations that require some medical and scientific knowledge. While a few university programs engage in that kind of synthesis, most of us are doing this work on our own, without a substantial commitment from academe to help us out. Disability studies is one field that is beginning to pull together several disciplines to address the philosophical, moral, legal, medical, and cultural questions emerging from the intersection of biotechnology and identity. Students of disability studies will be prepared to discuss medical interventions, the use of technology in medicine, the way in which society thinks about the body, and so on. But the problem is that most citizens, because they don't think of themselves as disabled, will not turn to disability experts to help them understand the complex issues highlighted by the Schiavo muddle. Therefore, that case, which dealt with a severely disabled woman, came down in the popular press to a debate between members of the religious right and the liberal left. But neither side was particularly knowledgeable about the nature of a persistent vegetative state and whether a feeding tube should be considered a medical intervention (in which case, by a Supreme Court ruling, it could be removed) or simply a form of providing nutrition (in which case it couldn't).
Most of the people I talked with thought Schiavo was "brain dead" -- an inaccurate term, since her brain was working well enough to keep her alive. Bioethicists were used freely by both sides, but, aside from the openly religious ethicists, the majority followed the bioethics party line that has fostered and encouraged a rather strict notion of autonomy based on the patient's informed consent. When bioethics began as an academic profession, its goal was to promote a notion of patient autonomy as opposed to the previously unchallenged authority of the medical profession. Therefore most of the bioethicists consulted in the case were in favor of removing Schiavo's feeding tube, since, they maintained, that was the course of action she had wanted, according to her legal guardian. Bioethicists fear that interventions by religious groups or the government will muck up the principle of patient autonomy. But autonomy is a somewhat limiting principle, despite its obvious utility, if you think of the issue not as what a legal guardian wants, or says a patient wants, but as what or how a society defines "a life worth living." That is, if you thought of Schiavo as a "vegetable," your notion of her autonomy would have pointed to removal of the feeding tube. But if you thought of her as a severely disabled woman, the notion of autonomy would have become more ambiguous. That is the position that most people in disability studies took, and so they supported leaving the feeding tube in place. The brouhaha raised the question of how we understand identity in an age that is increasingly "biocultural," to use a term emerging recently. A biocultural approach combines the disciplines of science, technology, medicine, and the humanities. 
This nascent discipline -- which I have been calling "biocultures" -- is often practiced by graduate students or professors in departments of history, gender and women's studies, criminal justice, medical education, science studies, anthropology, literature, and cultural studies. Programs such as those at the University of California at Berkeley, Duke, Harvard, the University of Illinois at Chicago, the University of Michigan, and Pennsylvania State University bring together issues concerning the body, identity, history, and culture. The trend is important because it is crucial not just for scholars in the humanities to know the impact that science has on culture and the body, but also for scientists, limited by funds earmarked for increasingly narrow research topics, to think more broadly about the political, cultural, and social implications of what they do. Take the prickly subject of abortion. Most people have strong feelings about it, but few have the biocultural literacy necessary to understand the complexity of the factors involved. And those with the technical knowledge often lack understanding of the cultural and historical contexts in which abortion needs to be considered. A biocultural approach to questions surrounding abortion would encompass the latest scientific facts about reproduction, conception, implantation, pregnancy, and so on. But it would also consider the cultural, moral, and religious contexts that surround the medical issues. Further, a biocultural approach would take into account the social and political history of the debates themselves, as well as related ethical and philosophical issues, such as infanticide, prenatal testing, developing-world uses of abortion, animal rights, and the death penalty. In other words, just as you can't fully discuss Shakespeare without having a certain level of cultural literacy, you can't fully discuss issues like abortion without biocultural literacy.
A biocultural approach to Terri Schiavo -- and, by extension, the 10,000 or more people in persistent vegetative states throughout the country -- would have included a discussion of whether she, even in her attenuated state, had an identity. Was she a human being? A disabled woman? A homo sacer (the philosopher Giorgio Agamben's notion of someone who is alive but can be killed without fear of punishment, like Holocaust victims or ostracized ancient Greeks)? What exactly is the status of people who are connected to life-support machinery, newborns with hopeless fatal illnesses, fetuses, fertilized embryos, stem cells, patients in the last stages of Alzheimer's disease, and those dying of cancer or ALS? These are perhaps the most crucial identities of our times -- what the science historian and disability-studies scholar Susan Squier calls "liminal lives": those that test our ability to define identity and life itself. We will be seeing a huge public debate in forthcoming months as the Supreme Court considers Gonzales v. Oregon, deciding if terminal patients in Oregon can have physicians assist them with suicide. Do dying people have an identity that is different from that of ordinary citizens who cannot ask physicians for lethal drugs? Stem-cell research and cloning will continue to be enormous issues. And, of course, abortion remains one of the major splitting points between Democrats and Republicans. How can colleges, universities, and the disciplines inform the public about cutting-edge biocultural issues? How can our students and faculty members be educated so that they can think consistently and logically about these questions, so that their feelings can be supported with facts? The academy needs a major initiative to provide education on these issues that communicates the complexities and nuances involved. But if we are not careful, confusion rather than clarity will result. First, it is imperative that we communicate facts rather than opinions. 
When we talk about abortion, for example, we need to know when implantation takes place, when the embryo's nervous system develops and begins to feel pain, and in what week viability occurs (when the fetus can survive outside the womb). When we discuss the history of religious attitudes toward abortion, we need to know, for example, when the Roman Catholic Church changed its definition of the beginning of life from the "quickening" of the fetus to the moment of conception. Second, we must be consistent about definitions and willing to challenge inconsistent positions. For example, the right historically has supported individual and state autonomy and therefore has generally opposed federal intervention in individual or states' rights. But in recent years the right, particularly the religious right, has sought federal intervention against individual autonomy in issues concerning the right to life and personal choice -- for example, abortion and gay marriage. For the right in general and the religious right in particular, one's identity is based on the sanctity of life, extending to patients in comas or vegetative states, fetuses in the womb, and byproducts of fertilization such as stem cells and unused embryos. But any logician could inform the debate by pointing out the inconsistency between those positions and support for the death penalty, war, and even the eating of animals. The left favors autonomy in regard to the body, resisting the idea that the state should dictate how and what we do with our bodies. So it favors abortion, gay marriage, and freedom of sexual choice between consenting adults. It hits a wall, though, in the area of liminal lives. Here it is quick to say that people in vegetative states don't really have identities and therefore are not autonomous. 
While the left's support of removing Schiavo's feeding tube appeared to support her autonomy (by assuming that she would not have wanted to be a vegetable), it actually was saying that we should assume that people who are disabled enough to be unconscious no longer have identities. According to that view, those people -- paradoxically -- no longer have the ability to have chosen to stay alive. The left's position on abortion supports the right of parents to abort fetuses with disabilities, while it objects to the abortion of female fetuses in other countries. In addition, the left supports late-term abortions -- although presumably it opposes infanticide -- even though, because of biotechnical advances, the line between inside the womb and outside the womb has become somewhat arbitrary and largely a matter of conjecture. Now that a third-trimester fetus can easily be removed from the womb and survive, its existence inside or outside the womb is mostly determined by medical practice or even parental choice. Finally, in thinking about these issues, academics must be leery of the pull of identity politics, which conditions many of our responses in the university. Those on the right think they know the answers to questions about identity; those on the left think they do as well, even if neither position is internally coherent. Another way of saying this is that we have to state openly where our theories begin to fall down, where we become incoherent, where our personal biases and identity politics muddy the waters. One example, which no doubt will upset many of my disability-studies colleagues, illustrates my point. Disability studies is fundamentally based on, among other things, the idea that people with disabilities should have autonomy over their own lives. The independent-living movement and much disability legislation stress that barriers to active participation and self-determination should be removed.
Better to live at home with personal assistants, to work without discrimination, to navigate the streets without barriers, to communicate by all means, and to use adapted media and technology to function as fully as possible than to be cared for in facilities, be confined to a home, and be limited by "ableist" environments without ramps or curb cuts, accessible Web sites, or classrooms with real-time captioning. Yet disability scholars and activists also believe that autonomous identity is tempered by recognition that we are all interdependent, that the model of the free and autonomous individual is a bit of a myth, and that the demand that we all be "normal" is a burdensome and limiting ideal. But in the Schiavo case and in the Supreme Court hearing of challenges to the Oregon law permitting physician-assisted suicide, many disability scholars have found themselves on the same side as the right-to-life movement in opposing the removal of Schiavo's feeding tube and physician-assisted suicide. It is striking that a movement with roots in liberal-to-left feminism, the fight for civil rights, and the demand by progressive, disabled Vietnam veterans for proper treatment now appears to the public to be aligned with the religious right and social conservatives on those issues. Those who support physician-assisted suicide argue that individuals with all their mental capacities have the right to end their lives before they become incapacitated. It is their right as autonomous individuals. Many disability advocates, on the other hand, claim that all people who are dying are, in fact, disabled, and that their identity as disabled individuals trumps their identity as autonomous beings. The faulty syllogism goes that dying people are disabled, and in an ableist society they naturally will be pressured to kill themselves; ergo disabled people are being put to death through physician-assisted suicide. 
Further, dying people (read "newly disabled") will ask for physician-assisted suicide specifically because they do not wish to be disabled -- to lose their sight, hearing, voice, mobility, and so on. There are at least two flaws in that argument. First, it is hard to shoehorn someone dying of cancer, for example, into the category of chronic disability. The aim of making it possible for disabled people to live full lives with their impairments and of ensuring a free and accessible society has little to do with someone who will be dead in six months (the requirement for receiving physician-assisted suicide in Oregon). Why should people have to accept disability status when they will be dead within a few months? Second, according to statistics provided by Oregon, most people seeking physician-assisted suicide are end-stage cancer patients who, by and large, are educated, middle class, and informed. While it is true that many seek physician-assisted suicide because they fear losing their abilities and their autonomy, they no doubt have the independent judgment to make that decision. It seems illogical for some disability advocates to try to prevent dying people from choosing a humane way of ending their lives (as opposed to shooting themselves or wrapping a plastic bag around their heads) because those advocates see suicide as a critique of the disability perspective. The activist origin of the disability movement laid down certain positions -- notably opposition to euthanasia and physician-assisted suicide -- that have become the rule and therefore difficult to challenge. Indeed, a long history of abuse against people with disabilities, culminating in eugenics and discrimination, supports those positions. 
But there are important distinctions between physician-assisted suicide and euthanasia: The former allows patients to take by themselves a lethal overdose of a drug prescribed by a physician for that purpose; the latter requires that someone else, a physician, be the murderer. One could logically support suicide as a self-determining act while opposing murder. Yet the religious right lumps physician-assisted suicide, euthanasia, and abortion together, and so do many in the disability community. A more nuanced approach, a biocultural one, would make distinctions. In the end, we are all poorly served by an academic community that does not promote biocultural literacy. As this century moves on, many issues the public needs to discuss will increasingly be tied to biotechnological advances that challenge our definitions of what it means to be human. We will need all the resources that we can command to come up with consistent, logical, and culturally relevant ways of conceiving of and bidding farewell to our bodies, ourselves.

Lennard J. Davis is a professor of English, disability and human development, and medical education, and director of Project Biocultures, at the University of Illinois at Chicago. He is the author, most recently, of Bending Over Backwards: Disability, Dismodernism, and Other Difficult Positions (New York University Press, 2002).

From checker at panix.com Wed Jan 11 22:00:43 2006
From: checker at panix.com (Premise Checker)
Date: Wed, 11 Jan 2006 17:00:43 -0500 (EST)
Subject: [Paleopsych] CHE: A Very Long Disengagement
Message-ID:

A Very Long Disengagement
The Chronicle of Higher Education, 6.1.6
http://chronicle.com/weekly/v52/i18/18b00601.htm

[Now did you read the past article on life, death, and biocultural literacy? Here's cold water to throw on it, namely that all kinds of literacy are going down.
I don't know if this apparently deep culture change means that much or just that learning styles are moving from the sort of didactic, linear, foundational approach that guided Christian and Newtonian civilization to the hyperlinking becoming more and more characteristic of Darwinian civilization.

[Remember that what we take as natural is so often second-natural, what we are brought up with, that in fact most of the world's civilizations held together without the didactic, foundational, linear (I must add experimental scientific!) thinking that we take for granted. Also, understanding the world and moving along in the world is not all that dependent on the sort of reductionist promise to carry everything back to first principles that we so often hold up as ideal. There is only one case of full theoretical reduction, heat to the motion of molecules, and that holds only for an ideal case. Even our scientific knowledge is quite splotchy. This means that hyperlinking is not so bad as those raised in the old ways fear. And I say this, being very much a foundations man myself.

[Help me develop these ideas!]

By MARK BAUERLEIN

Last spring Nielsen Media Research reported that the average college student watches 3 hours 41 minutes of television each day. "It was a little more than I expected," a Nielsen executive told a reporter, and a little more than professors care to see. But the networks have complained for years that young-adult programs attract more viewers than the ratings have previously indicated. Nielsen traditionally bases its count on household viewing, but many students watch TV shows in a different way, and the trend is growing. The Wall Street Journal described one example: "Every Thursday night at the University of Colorado-Boulder, Theta Xi fraternity brothers and their friends cram into a common room for their favorite television show. It can be a tight squeeze, with as many as 40 people watching at a time.
"The big attraction is 'The O.C.,' Fox's soapy drama about the lives of teens in upscale Orange County, Calif." The ritual is a common one on campuses today, and it has precursors. I remember it back in college in 1980, when the Luke and Laura affair on General Hospital caught on, and in the 90s when Friends lured into the lounges undergrads and, surprisingly, grads, too. Now, female students gather for airings of Friends spinoff Joey, while ESPN's SportsCenter pulls in massive numbers of twentysomething men. That is far from the customary image of a loner freshman zoning out in front of the screen in his dorm room. Ever since Ray Bradbury's Fahrenheit 451 (1953), media critics have believed that watching the boob tube "atomizes" individuals, so that even when viewing the news they have no real social engagement. The college ritual of The O.C., March Madness, The Daily Show With Jon Stewart, and other favorites reverses the process, and television watching isn't the only leisure habit shifting from "isolationist" to collective. Teenagers used to keep diaries under lock and key in their bedrooms, recording hopes and humiliations for the authors' eyes only. Today's teens have a different approach. This past spring the Perseus Development Corporation, a company that designs software for online surveys, counted 31.6 million blogs, and 58 percent of them were kept by 13-to-19-year-olds. Instead of opening a monogrammed notebook in the late hours to cogitate alone, such "juvenile Marcel Prousts gone wild" (as a story in The New York Times Magazine labeled them) arrive home from school, log on, and let go. They compose an entry on the day's happenings, respond to comments on yesterday's entry, search other blogs on which to comment, and then return to their own site for updates. As with everything adolescent, the observations range from the poignantly self-effacing to the tiresomely self-involved. 
A student told me how his 17-year-old brother is obsessed with his own blog, where intemperate chatter vies with awkward confession. He doesn't do homework or his chores, and he doesn't exercise or volunteer. The thrill of a peer's reaction to his own adagios holds him in his room for five hours a day. I don't know if such habits signal a widespread or long-term trend, but here and there one sees an odd paradox at work. Students don't gather often in one place to hang out and tell stories unless an outside focus -- like a television show -- demands it. They make contact with a few clicks, and their exchanges can take place with strangers as often as with friends. As soon as students leave class, they flip open the cellphone to check for messages. One of my colleagues talks about how his son carries on six conversations at a time through instant messaging, with dialogue boxes from his "buddy list" cluttering the screen. Walk through any university library, and at each computer station you will see a cheery or intent sophomore pounding out e-mail messages at a rat-a-tat pace. Head up to the stacks, and the aisles are as quiet as a morgue. The students at the screen or on the cell appear just as solitary as a person reading a book, but, in fact, they are intensifying their connections, solidifying their identity among peers. Such contacts form largely through campus resources, but they are completely independent of the professors and the curriculum. It is a young people's universe of social intercourse, a group behavior unaffected by studies. Indeed, there is no evidence that the intellectual life of the college influences their connectedness at all. Surveys of undergraduates like "Your First College Year," conducted by the Higher Education Research Institute at the University of California at Los Angeles and the Policy Center on the First Year of College at Brevard College, show troublingly high levels of "academic disengagement." 
Students say that they feel bored in class, submit assignments that underexercise their talents, and do minimal homework. Last year the National Survey of Student Engagement found that 44 percent of first-year students never discuss ideas from their readings or classes with their professors outside of class. And Indiana University at Bloomington's 2005 "High School Survey of Student Engagement" found that as many as half of all students spend only four hours or less per week preparing for class. The trends are not unrelated. The more young people gather to watch TV shows, transmit e-mail and text messages, and blog and chat and surf and download, the less they attend to their regular studies. What develops is an acute peer consciousness, a sense of themselves as a distinct group. To be sure, the current crop of students is the most educated and affluent ever. Their enrollment rates in college surpass those of their baby-boomer parents and Generation X, and their purchasing power is so strong that it dominates the retail and entertainment sectors. Credit-card debt for 18-to-24-year-olds doubled from $1,500 in 1992 to $3,000 in 2001, much of it due to the new array of tools, such as BlackBerries, that keep them up to date with contemporaries and youth culture. Students have grown up in a society of increasing prosperity and education levels, and technology outfits them with instant access to news, music, sports, fashion, and one another. Their parents' experience -- LP records, typewriters, the cold war -- seems a far-gone reality. As drivers of consumer culture, mirrored constantly by mass entertainment, young adults understandably heed one another and ignore their seniors -- including professors. But what do they know? What have they learned from their classes and their privilege? 
We can be certain that they have mastered the fare that fills their five hours per day with screens -- TV, DVD, video games, computers for fun -- leaving young adults with extraordinarily precise knowledge of popular music, celebrities, sports, and fashion. But when it comes to the traditional subjects of liberal education, the young mind goes nearly blank. In the last few years, an accumulation of survey research on civics, history, literature, the fine arts, geography, and politics reveals one dismal finding after another. The surveys vary in sample size and question design, and they tend to focus on basic facts, but they consistently draw the same general inference: Young people are cut off from the worlds beyond their social circuit. While the wealth and education of young Americans have increased, their knowledge levels have either dropped or remained flat in the following important areas:

History. Students entering college have passed through several years of social studies and history classes, but few of those students remember the significant events, figures, and texts. On the 2001 National Assessment of Educational Progress history exam, the majority of high-school seniors, or 57 percent, scored "below basic," and only about one in nine reached "proficient" or better. Diane Ravitch, a professor of education at New York University and a member of the National Assessment Governing Board, called the results "truly abysmal," and worried about a new voting bloc coming of age with such a meager awareness of American history. People who believe that college can remedy the history deficit should be dismayed at the findings of another study, commissioned by the American Council of Trustees and Alumni, of the historical knowledge of seniors at the top 55 colleges in the country. Many of the questions were drawn from the NAEP high-school exam, and the results were astonishing. Only 19 percent of the subjects scored a grade of C or higher.
According to the 2000 report, titled "Losing America's Memory," only 29 percent knew what "Reconstruction" refers to, only one-third recognized the American general at Yorktown, and less than one-fourth identified James Madison as the "father of the Constitution." The consequences are dire. As Leslie Lenkowsky, the former head of the Corporation for National and Community Service, observed in response to the NAEP results, "If young people cannot construct a meaningful narrative of American history, then there is little hope that the nation can live up to the highest task of a pluralistic liberal democracy."

Civics. In 1999 the Center for Information and Research on Civic Learning and Engagement reported that more than two-thirds of ninth graders study the Constitution, Congress, or the presidency. Unfortunately, their course work hasn't sunk in. In a 2003 survey on the First Amendment commissioned by the Foundation for Individual Rights in Education, only one in 50 college students named the first right guaranteed in the amendment, and one out of four did not know any freedom protected by it. In 2003 a project led by the National Conference of State Legislatures examined the civic awareness of young people age 15 to 26 compared with older Americans. Barely half of those surveyed said that "paying attention to government and politics" is important to good citizenship. While 64 percent knew the name of the latest "American Idol," only 10 percent could identify the speaker of the U.S. House of Representatives. The researchers concluded "that young people do not understand the ideals of citizenship, they are disengaged from the political process, [and] they lack the knowledge necessary for effective self-government." High-school and college students shine in one area of civics: volunteerism.
A recent study by the Center for Information and Research on Civic Learning and Engagement titled "The Civic and Political Health of the Nation" found that young people "trail their elders in attentiveness to public affairs and in electoral participation, but hold their own in community-related and volunteer activities." But the habit is a superficial one, most likely fueled by the emphasis that college admissions offices place on volunteer work. A study in April 2005 sponsored by the Higher Education Research Institute at UCLA reported that "engagement with the community declines sharply during the years immediately after students graduate from college," and the drop begins during the college years.

Literature and the arts. In 2004 the National Endowment for the Arts released two reports, the "2002 Survey of Public Participation in the Arts" and "Reading at Risk: A Survey of Literary Reading in America." (I was the project director of the latter.) The surveys measured rates of involvement in different art forms -- like attending, listening, and performing -- and compared them with previous findings. In the performing arts, the involvement rates of 18-to-24-year-olds fell significantly in most activities from 1992 to 2002. For example, the numbers of students who listened to jazz and classical music fell from 37 percent to 22 percent, while those who visited a museum or attended a performing-arts event dropped from 29 percent to 24 percent and 33 percent to 26 percent, respectively. The literary reading rates plummeted as well. From 1992 to 2002 the portion of young people reading at least one poem, play, or work of fiction for pleasure in the preceding 12 months fell from 53 to 43 percent. Meanwhile, it should be noted, young people have enjoyed greater access to literature and the arts than ever before. The Economic Census counted 9,353 performing-arts companies in 2002, up from 5,883 in 1997.
During the same period the number of museums jumped from 3,860 to 4,535. From 2000 to 2002 the number of fiction titles published rose from 14,615 to 15,133. And yet, from 1998 to 2003, the portion of all books sold that were purchased by people under 25 years old declined from 5 percent to 3.9 percent. The fact that involvement fell while access rose signals a new stance toward literature and the arts among the young. I don't know of any research that formally examines the trend, but a snippet of conversation that occurred during a National Public Radio interview with me last year illustrates the attitude that I'm describing:

Caller: "I'm a high-school student, and I don't read and my friends don't read because of all the boring stuff the teachers assign."
Host: "Such as?"
Caller: "Uh ... that book about the guy. You know, that guy who was great."
Host: "Huh?"
Caller: "The great guy."
Host: "The Great Gatsby?"
Caller: "Yeah. Who wants to read about him?"

Geography. In 2002 the National Geographic Society issued the results of the Global Geographic Literacy Survey. Thirty-nine percent of American 18-to-24-year-olds surveyed failed the test, and in international comparisons Americans came in second to last out of nine nations tested. Only 13 percent of our country's participants could pinpoint Iraq, and only 12 percent could identify Afghanistan. The rate rose to just 51 percent for those who could locate New York State. Moreover, the young American adults surveyed could identify an average of only 2.5 countries in Europe. Around 30 percent believed that this nation has one billion to two billion residents (young people in other countries scored higher in estimating U.S. population), and only 19 percent could name four nations that acknowledge having nuclear weapons. Remarkably, 29 percent could not identify the Pacific Ocean.

Politics.
In the past few decades, higher education has undergone a revolution in curriculum, what conservatives have called "the politicization of the humanities." But while the curriculum has changed, the shift hasn't affected the students. Political interest among them couldn't be much lower. The geography survey found that, despite the high Internet usage among young adults, only 11 percent of the respondents said they use the Internet to follow the news. Eighty-two percent stated that they keep up with events by watching television, but a growing proportion tune in to programs of dubious informational value. A January 2004 study by the Pew Research Center for the People and the Press found that comedy shows like Saturday Night Live and The Daily Show With Jon Stewart "are now mentioned almost as frequently as newspapers and evening network news programs as regular sources for election news." A story on the report in The Hollywood Reporter began, "To a young generation of Americans, Jon Stewart may as well be Walter Cronkite." Indeed, newspaper circulation is down, in part because while 46 percent of people in their 20s read a newspaper every day in 1972, the rate now stands around the low 20s. The Higher Education Research Institute at UCLA's 2004 survey "The American Freshman" tabulated only 34 percent of new students thinking that it was "very important" to keep up with politics, a drastic slide from the 60 percent who thought so in 1966. The lack of curiosity among college students is reflected in their knowledge. In the 2004 National Election Study, a mere 28 percent of 18-to-24-year-olds correctly identified William H. Rehnquist as the chief justice of the United States. Only 39 percent knew which party had the most members in Congress, and one-quarter of them could not identify Dick Cheney as vice president. Educators usually denigrate such surveys as ideologically slanted and narrowly conceived. They test "rote learning" and "mere facts," the argument goes. 
In 2004 the president of the Organization of American Historians stated, "Using such surveys as a starting point for debate diverts us from the real challenge at hand: how to use what students do know -- the ideas and identities they glean from family stories, museums, historic sites, films, television, and the like -- to engage them in the life-changing process of learning to think historically." In spite of the naïveté of that parenthesis, we see the operative contrast: a knowledge of historical data versus thinking historically. The one amounts to a storage of facts, the other to a mode of reflection. But do we have any evidence that the latter is possible without a fair measure of the former? "Thinking historically" is one of those higher-order critical-thinking skills that educators favor, but how one can achieve it without first delving into the details of another time and place is a mystery. The facts are not an end in themselves, of course, but are a starting point for deeper understanding, and the ignorance of them is a fair gauge of deeper deficiencies. Moreover, if critics of such surveys consider them ideologically slanted -- because the knowledge they test is ideologically slanted -- they should develop knowledge measures in other, less partisan areas. But it seems that they don't like any kind of metric, that measurable knowledge is itself a problem. If students pick up that attitude, they are primed for ignorance and failure. Reading through those reports, and given the advantages that college students enjoy today, one recalls the professor in Philip Roth's The Human Stain, who declares: "Our students are abysmally ignorant ... far and away the dumbest generation in American history." They aren't less intelligent than their precursors -- as IQ scores show -- and earlier generations, too, struggled with traditional subjects. But they've taken more courses than previous cohorts, and they have more money and access than ever before.
Why hasn't their knowledge level kept pace? In part, because of the new leisure habits of teens and young adults. To repeat, the more time young adults devote to activities like sending e-mail messages, the less time they devote to books, the arts, politics, and their studies. Time has proved the formula. In the 1990s the gurus and cheerleaders of technology promised that the horizon of users would expand to take in a global village, and that a digital era would herald a more active, engaged, and knowledgeable citizenry, with young adults leading the way. It hasn't happened. Instead, youth discourse has intensified, its grip on adolescence becoming ever tighter, and the walls between young adults and larger realities have grown higher and thicker. College professors complain about the result, noting the disaffection of students from their course work and the puny reserves of knowledge they bring into the classroom. But they hesitate to take a stand against mass culture and youth culture, fearful of the "dinosaur" or "conservative" tag. The disengagement of students from the liberal-arts curriculum is reaching a critical point, however. And the popular strategy of trying to bridge youth culture and serious study -- of, say, using hip-hop to help students understand literary classics, as described in a June 19 article in the Los Angeles Times -- hasn't worked. All too often, the outcome is that important works are dumbed down to trivia, and the leap into serious study never happens. The middle ground between adolescent life and intellectual life is disappearing, leaving professors with ever more stark options. One can accept the decline, and respond as a distinguished professor of literature did at a regional Modern Language Association panel last year after I presented the findings of "Reading at Risk." "Look, I don't care if everybody stops reading literature," she blurted. "Yeah, it's my bread and butter, but cultures change. People do different things." 
Or one can accept the political philosopher Leo Strauss's formula that "liberal education is the counter-poison to mass culture," and stand forthrightly against the tide. TV shows, blogs, hand-helds, wireless ... they emit a blooming, buzzing confusion of adolescent stimuli. All too eagerly, colleges augment the trend, handing out iPods and dignifying video games like Grand Theft Auto as worthy of study. That is not a benign appeal for relevance. It is cooperation in the prolonged immaturity of our students, and if it continues, the alienation of student from teacher will only get worse. Mark Bauerlein is a professor of English at Emory University.

From shovland at mindspring.com Thu Jan 12 14:28:18 2006 From: shovland at mindspring.com (Steve Hovland) Date: Thu, 12 Jan 2006 06:28:18 -0800 Subject: [Paleopsych] The failure to calculate the costs of war In-Reply-To: <7.0.0.16.0.20060111145653.0324f598@edu-cyberpg.com> Message-ID:

Laws of Victory

Do your homework thoroughly. Any corner you cut will come back to haunt you.
Don't start a war you can't win. Assume it will be a lot harder than you expect.
Get a bunch of big guys to join you. If your team is big enough, your enemy may surrender without a fight. If they choose to fight, they will quickly lose.
Make the enemy come to you. Soldiers defending their homes are more motivated than attackers and supply lines are shorter.
Don't start until your troops are properly equipped.
Don't gloat over early successes. Positive thinking is not a substitute for raw power.
Resist the temptation to engage in wishful thinking.
Don't be too proud to cut your losses when things are going badly.
It's OK to lie to the enemy, but you must always tell your own people the truth.

-----Original Message----- From: paleopsych-bounces at paleopsych.org [mailto:paleopsych-bounces at paleopsych.org]On Behalf Of K.E.
Sent: Wednesday, January 11, 2006 11:58 AM To: checker at panix.com; paleopsych at paleopsych.org Subject: [Paleopsych] The failure to calculate the costs of war With these costs taken into account, the total macroeconomic costs may add up to $750bn and total costs to $1,850bn. "We will bankrupt ourselves in the vain search for absolute security." -Dwight David Eisenhower, U.S. general and 34th president (1890-1969) Martin Wolf is a respectable economist and chief economics commentator at the Financial Times Martin Wolf: The failure to calculate the costs of war http://news.ft.com/cms/s/48ad9c0a-820f-11da-aea0-0000779e2340.html Before the Iraq war began, Lawrence Lindsey, then president George W. Bush's economic adviser, suggested that the costs might reach $200bn. The White House promptly fired him. Mr Lindsey was indeed wrong. But his error lay in grossly underestimating the costs. The administration's estimates of a cost of some $50-$60bn were a fantasy, as were Saddam Hussein's weapons of mass destruction, and much else. So far the government has spent $251bn in hard cash. But the costs continue. If the US begins to withdraw troops this year, but maintains a diminishing presence for the next five years, the additional cost will be at least $200bn, under what Profs Bilmes and Stiglitz call their "conservative" option. Under their "moderate" one, the cost reaches $271bn, because troops remain until 2015. With these costs taken into account, the total macroeconomic costs may add up to $750bn and total costs to $1,850bn. It is possible to argue that the benefits for Iraq, the Middle East and the world will outweigh all these costs. But that depends on the emergence, in Iraq, of a stable and peaceful democratic order. That has not yet been achieved. Even those who supported the war must draw two lessons. First, the exercise of military power is far more expensive than many fondly hoped.
Second, such policy decisions require a halfway decent analysis of the costs and possible consequences. The administration's failure to do so was a blunder that will harm the US and the world for years to come. <>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<> The Educational CyberPlayGround http://www.edu-cyberpg.com/ National Children's Folksong Repository http://www.edu-cyberpg.com/NCFR/ Hot List of Schools Online and Net Happenings, K12 Newsletters, Network Newsletters http://www.edu-cyberpg.com/Community/ 7 Hot Site Awards New York Times, USA Today, MSNBC, Earthlink, USA Today Best Bets For Educators, Macworld Top Fifty <>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<> _______________________________________________ paleopsych mailing list paleopsych at paleopsych.org http://lists.paleopsych.org/mailman/listinfo/paleopsych From checker at panix.com Thu Jan 12 15:53:15 2006 From: checker at panix.com (Premise Checker) Date: Thu, 12 Jan 2006 10:53:15 -0500 (EST) Subject: [Paleopsych] Re: The failure to calculate the costs of war In-Reply-To: <7.0.0.16.0.20060111145653.0324f598@edu-cyberpg.com>, <7.0.0.16.0.20060111145513.03518590@earthlink.net> References: <7.0.0.16.0.20060111145653.0324f598@edu-cyberpg.com>, <7.0.0.16.0.20060111145513.03518590@earthlink.net> Message-ID: I wonder what wars have ever been profitable. The Allies could have purchased all real estate, buildings, etc., from the Axis countries in World War II and have saved money. I once calculated that the U.S. could have bought Vietnam for what it was spending each month on fighting the war. The people don't benefit by having new tax collectors, but the glory of kings is enhanced. Maybe wars were profitable to the looters for a while. Eventually these bandits noted that the lands they raided had less to offer the second time around and so became what the late Mancur Olson called "stationary bandits" and saw to it that they could engage in repeated extraction.
They would also provide protection from rival bandits. Olson said this was the greatest breakthrough in history. See John V. Denson, _Costs of War: America's Pyrrhic Victories_ for an Old Right perspective, which I share, though I concluded a few weeks after 9/11 that empire is here to stay. Frank

On 2006-01-11, K.E. opined [message unchanged below]:
> Date: Wed, 11 Jan 2006 14:58:15 -0500
> From: K.E.
> To: checker at panix.com, paleopsych at paleopsych.org
> Subject: The failure to calculate the costs of war

From shovland at mindspring.com Thu Jan 12 18:01:14 2006 From: shovland at mindspring.com (shovland at mindspring.com) Date: Thu, 12 Jan 2006 10:01:14 -0800 (GMT-08:00) Subject: [Paleopsych] Re: The failure to calculate the costs of war Message-ID: <28257337.1137088875362.JavaMail.root@mswamui-bichon.atl.sa.earthlink.net>

After the war in Iraq the US will no longer be a super power. This one will break the bank. It already has. I think most of the world is sitting back waiting for us to fall.

-----Original Message-----
>From: Premise Checker
>Sent: Jan 12, 2006 7:53 AM
>To:
>Cc: paleopsych at paleopsych.org
>Subject: [Paleopsych] Re: The failure to calculate the costs of war

From checker at panix.com Thu Jan 12 18:38:26 2006 From: checker at panix.com (Premise Checker) Date: Thu, 12 Jan 2006 13:38:26 -0500 (EST) Subject: [Paleopsych] Edge Annual Question 2000: What Is Today's Most Underreported Story?
Message-ID: Edge Annual Question 2000: What Is Today's Most Underreported Story? http://www.edge.org/3rd_culture/story/contributions.html (Links omitted) [These are worth reviewing, six years later. Lots of transhumanist themes here. And lots of continued underreporting. Most of these are long-term trends, not stories. [The biggest underreported story of 2005 was the collapse of the Standard Social Science ("blank-slate") Model. Over the last six years, the most underreported story was the demographic decline of Europe, but that is now rapidly changing, perhaps due to the collapse of the SSSM. [The second most underreported is the continuing shift from equality to pluralism as the major preoccupation of the political "left." Is the "right" unifying around universalism? Look for defections from the right on the part of free-marketeers, leaving the theocrats and empire builders together on the "right."] ------------------ "Don't assume for a second that Ted Koppel, Charlie Rose and the editorial high command at the New York Times have a handle on all the pressing issues of the day....when Brockman asked 100 of the world's top thinkers to come up with pressing matters overlooked by the media, they generated a lengthy list of profound, esoteric and outright entertaining responses." -- "Web Site for Intellectuals Inspires Serious Thinking" by Elsa Arnett, San Jose Mercury News _________________________________________________________________ The World Question Center: "WHAT IS TODAY'S MOST IMPORTANT UNREPORTED STORY?" _________________________________________________________________ 102 contributions to Date (71,200 words): William Calvin, Mihaly Csikszentmihaly, Pattie Maes, George Dyson, Douglas Rushkoff , Howard Gardner, Roger Schank, Lee Smolin, Judith Rich Harris, Stewart Brand, John McWhorter, Paul Davies, Rodney Brooks, Sally M. Gall, John Gilmore, Eric J. Hall, Stephen R. Kellert, Thomas Petzinger, Jr, Sylvia Paull, James J. O'Donnell, Philip W. 
Anderson, Stephen Grossberg, Brian Goodwin, Arnold Trehub, Ivan Amato, Howard Rheingold, Clifford A. Pickover, Hans Weise, John Horgan, Philip Elmer-DeWitt, Lance Knobel, Jeff Jacobs, Piet Hut, Freeman Dyson, Kevin Kelly, Marc D. Hauser, Daniel Goleman, Philip Brockman, Terrence J. Sejnowski, Bart Kosko, Dean Ornish, Keith Devlin, Andy Clark, Anne Fausto-Sterling, Eberhard Zangger, Peter Cochrane, Hans Ulrich Obrist, Ellis Rubinstein, Stuart Hameroff, David Lykken, Mehmet C. Oz, M.D., Eduard Punset, Stephen H. Schneider, David G. Myers, Todd Siler, Joseph Ledoux, Verena Huber-Dyson, Julian Barbour, Henry Warwick, James Bailey, Robert R. Provine, Steven Quartz, Jaron Lanier, Robert Hormats, Daniel Pink, Timothy Taylor, Carlo Rovelli, Peter Schwartz, Leon M. Lederman, Phil Leggiere, Denise Caruso, Tor Norretranders, Delta Willis, Charles Arthur, David M. Buss, Denis Dutton, Tom de Zengotita, Rupert Sheldrake, Marney Morris, Raphael Kasper, Jason McCabe Calacanis, Steven Pinker, Philip Campbell, Ernst Poppel, David Braunschvig, Geoffrey Miller, Nancy Etcoff, Kenneth W. Ford, Richard Potts, Robert Aunger, Colin Tudge, Paul W. Ewald, David Bunnell, W. Brian Arthur, Margaret Wertheim, Thomas A. Bass, Rafael Nunez, Margaret Wertheim, Randolph M. Nesse, M.D., Sherry Turkle, Joseph Vardi _________________________________________________________________ Joseph Vardi How Kids Replaced the Generals The most important untold story in my opinion is how kids replaced the generals as the major source of defining the innovation agenda of the world; how the biblical prophecy "thou shalt turn your swords into Sony Playstation 2" is being fulfilled; how the underlying power sending satellites to the skies, creating fabs for 128 bit machines, pushing for broadband, is not any longer based on the defense needs of the big powers, but on the imagination and the passion of kids (and adults) to play games, see 1000 channels of television, and listen to music!!
This is just amazing, and beautiful. Never in the history of mankind have kids had such a profound influence on the creativity and innovation agenda. It is a very democratic process as well, as the decision-making power is distributed widely. DR. JOSEPH VARDI is the Principal of International Technologies Ventures, a private venture capital enterprise, investing principally for its own account, which has initiated, negotiated, structured and arranged financing for the acquisition of operating companies; and created and funded several high-tech companies in the fields of Internet, software, telecommunications, electro-optics, energy, environment and other areas. Dr Vardi is the founding investor and the former chairman of Mirabilis Ltd, the creator of the extremely popular Internet communication program ICQ which took the web by storm, making it one of the most successful Internet products of all time, with currently over 65 million users. The company was acquired by AOL. LINK: Joseph Vardi's bio page on Edge _________________________________________________________________ Sherry Turkle I. A new kind of object: From Rorschach to Relationship I have studied the effects of computational objects on human developmental psychology for over twenty years, documenting the ways that computation and its metaphors have influenced our thinking about such matters as how the mind works, what it means to be intelligent, and what is special about being human. Now, I believe that a new kind of computational object -- the relational artifact -- is provoking striking new changes in the narrative of human development, especially in the way people think about life, and about what kind of relationships it is appropriate to have with a machine. Relational artifacts are computational objects designed to recognize and respond to the affective states of human beings -- and indeed, to present themselves as having "affective" states of their own.
They include children's playthings (such as Furbies and Tamagotchis), digital dolls that double as health monitoring systems for the homebound elderly (Matsushita's forthcoming Tama), sentient robots whose knowledge and personalities change through their interactions with humans, as well as software that senses its users' emotional states and responds with "emotional states" of its own. Over the past twenty years, I have often used the metaphor of "computer as Rorschach" to describe the relationship between people and their machines. I found computers used as a projective screen for other concerns, a mirror of mind and self. But today's relational artifacts make the Rorschach metaphor far less useful than before. These artifacts do not so much invite projection as they demand engagement. The computational object is no longer affectively "neutral." People are learning to interact with computers through conversation and gesture, and learning that to relate successfully to a computer you do not have to know how it works; you can take it "at interface value," that is, assess its emotional "state," much as you would if you were relating to another person. Through their experiences with virtual pets and digital dolls (Tamagotchi, Furby, Amazing Ally), a generation of children are learning that some objects require (and promise) emotional nurturance. Adults, too, are encountering technology that attempts to meet their desire for personalized advice, care and companionship (help wizards, intelligent agents, AIBO, Matsushita's forthcoming Tama). These are only the earliest, crude examples of the relational technologies that will become part of our everyday lives in the next century. There is every indication that the future of computational technology will include ubiquitous relational artifacts that have feelings, life cycles, moods, that reminisce, and have a sense of humor, which say they love us, and expect us to love them back.
What will it mean to a person when their primary daily companion is a robotic dog? Or their health care "worker" is a robot cat? Or their software program attends to their emotional states and, in turn, has its own? We need to know how these new artifacts affect people's way of thinking about themselves, human identity, and what makes people special. These artifacts also raise significant new questions about how children approach the question of "What is alive?" In the proposed research the question is not what the computer will be like in the future, but what will we be like, what kind of people are we becoming? Relational artifacts are changing the narrative of human development, including how we understand such "human" qualities as emotion, love, and care. The dynamic between a person and an emotionally interactive, evolving, caring machine object is not the same as the relationship one might have with another person, or a pet, or a cherished inanimate object. We have spent a large amount of social resources trying to build these artifacts; now it is time to study what is happening to all of us as we go forth into a world "peopled" with a kind of object we have never experienced before. We need to more deeply understand the nature and implications of this new sort of relationship -- and its potential to fundamentally change our understanding of what it means to be human. We need to be asking several kinds of new questions:

o How are we to conceptualize the nature of our attachments to interactive robots, affective computers, and digital pets?

o How does interacting with relational artifacts affect people's way of thinking about themselves and others, their sense of human identity and relationships? How do the models of development and values embedded in the design of relational artifacts both reflect and influence our ways of thinking about people?
o What roles -- both productive and problematic -- can relational artifacts play in fulfilling a basic human need for relationship? Their first generation is being predominantly marketed to children and the elderly. What does this reflect about our cultural values about these groups? How will these objects influence their understanding of who they are as individuals and in the world? Are we reinforcing their marginality? Are we tacitly acknowledging that we do not have enough "human" time to spend with them?

II. In the 1960s through the 1980s, researchers in artificial intelligence took part in what we might call the classical "great AI debates" where the central question was whether machines could be "really" intelligent. This classical debate was essentialist; the new relational objects tend to enable researchers and their public to sidestep such arguments about what is inherent in the computer. Instead, the new objects depend on what people attribute to them; they shift the focus to what the objects evoke in us. When we are asked to care for an object (the robot Kismet, the plaything Furby), when the cared-for object thrives and offers us its attention and concern, people are moved to experience that object as intelligent. Beyond this, they feel a connection to it. So the question here is not to enter a debate about whether relational objects "really" have emotions, but to reflect on a series of issues having to do with what relational artifacts evoke in the user. In my preliminary research on children and Furbies, I have found that children describe these new toys as "sort of alive" because of the quality of their emotional attachments to the Furbies and because of their fantasies about the idea that the Furby might be emotionally attached to them. So, for example, when I ask the question, "Do you think the Furby is alive?" children answer not in terms of what the Furby can do, but how they feel about the Furby and how the Furby might feel about them.
Ron (6): Well, the Furby is alive for a Furby. And you know, something this smart should have arms. It might want to pick up something or to hug me. Katherine (5): Is it alive? Well, I love it. It's more alive than a Tamagotchi because it sleeps with me. It likes to sleep with me. Here, the computational object functions not only as an evocative model of mind, but as a kindred other. With these new objects, children (and adults) not only reflect on how their own mental and physical processes are analogous to the machine's, but perceive and relate to the machine as an autonomous and "almost alive" self. My work with children and computational objects has evolved into a decades-long narrative about the way computation has affected the way we make sense of the world. In many ways, the behaviors and comments of children have foreshadowed the reactions of adults. In the first generation of computer culture I studied, the children of the late 1970s and early 1980s tended to resolve metaphysical conflicts about machine "aliveness" by developing a concept of "the psychological machine" -- concluding that psychology and a kind of consciousness were possible in objects they knew were not alive. This way of coping with the conundrums posed by computational objects was pioneered by children, and later adopted by adults. Later cohorts of children's responses to computational objects that were more complex and problematic in new ways again reliably foreshadowed the conclusions the culture at large would soon reach. First, they explained that although machines might be psychological in the cognitive sense (they might be intelligent, they might have intentionality), they were not psychological in the emotional sense, because they did not know pain, or love, they were not mortal, and they did not have souls.
Soon after, the children I interviewed began consistently citing biology and embodiment as the crucial criteria that separated people from machines; they insisted that qualities like breathing, having blood, being born, and, as one put it, "having real skin," were the true signs of life. Now, I have begun to see a new pattern -- children describe relational artifacts not as "alive" or "not alive," but as "sort-of-alive." Categories such as "aliveness" and "emotion" seem poised to split in the same way that the categories of "psychological" and "intelligent" did twenty years ago. Children's reactions to the presence of "smart machines" have fallen into discernible patterns over the past twenty years. Adults' reactions, too, have been changing over time, often closely following those of the children. To a certain extent, we can look to children to see what we are starting to think ourselves. However, in the case of relational artifacts, there is more to the choice of children as subjects than a simple desire to stay ahead of the curve in anticipating changes in computer culture. By accepting a new category of relationship, with entities that they recognize as "sort-of-alive" or "alive in a different, but legitimate way," today's children will redefine the scope and shape of the playing field for social relations in the future. Because they are the first generation to grow up with this new paradigm, it is essential that we observe and document their experiences.
SHERRY TURKLE is a professor of the sociology of science at MIT. She is the author of The Second Self: Computers and the Human Spirit; Psychoanalytic Politics: Jacques Lacan and Freud's French Revolution; and Life on the Screen: Identity in the Age of the Internet.
LINKS: Sherry Turkle's Home Page; See "The Cyberanalyst", Chapter 31 in Digerati
_________________________________________________________________
Randolph M. Nesse, M.D.
Is the Market on Prozac?
The press has been preoccupied with possible explanations for the current extraordinary boom.
Many articles say, as they always do while a bubble grows, that this market is "different." Some attribute the difference to new information technology. Others credit changes in foreign trade, or the baby boomers' lack of experience with a real economic depression. But you never see a serious story about the possibility that this market is different because investors' brains are different. There is good reason to suspect that they are. Prescriptions for psychoactive drugs have increased from 131 million in 1988 to 233 million in 1998, with nearly 10 million prescriptions filled last year for Prozac alone. The market for antidepressants in the USA is now $6.3 billion per year. Additional huge numbers of people use herbs to influence their moods. I cannot find solid data on how many people in the USA take antidepressants, but a calculation based on sales suggests a rough estimate of 20 million. What percentage of brokers, dealers, and investors are taking antidepressant drugs? Wealthy, stressed urbanites are especially likely to use them. I would not be surprised to learn that one in four large investors has used some kind of mood-altering drug. What effects do these drugs have on investment behavior? We don't know. A 1998 study by Brian Knutson and colleagues found that the serotonin-specific antidepressant paroxetine (Paxil) did not cause euphoria in normal people, but did block negative affects like fear and sadness. From seeing many patients who take such agents, I know that some experience only improved mood, often a miraculous and even life-saving change. Others, however, report that they become far less cautious than they were before, worrying too little about real dangers. This is exactly the mind-set of many current investors. Human nature has always given rise to booms and bubbles, followed by crashes and depressions.
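[A back-of-envelope sketch of how a "rough estimate of 20 million" can be derived from the $6.3 billion sales figure cited above. The per-patient annual cost used here is an assumed illustrative number, not one given in the essay.]

```python
# Rough users-from-sales estimate. The average annual drug cost per patient
# is an assumption chosen for illustration; only the market size is from the text.
annual_sales_usd = 6.3e9              # US antidepressant market per year (cited above)
assumed_annual_cost_per_patient = 300  # assumed average cost per patient, USD/year

estimated_users = annual_sales_usd / assumed_annual_cost_per_patient
print(f"{estimated_users / 1e6:.0f} million users")  # prints "21 million users"
```

With a plausible cost near $300 per patient per year, the arithmetic lands in the neighborhood of the essay's 20 million figure; a higher assumed cost would shrink the estimate proportionally.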
But if investor caution is being inhibited by psychotropic drugs, bubbles could grow larger than usual before they pop, with potentially catastrophic economic and political consequences. If chemicals are inhibiting normal caution in any substantial fraction of investors, we need to know about it. A more positive interpretation is also easy to envision. If 20 million workers are more engaged and effective, to say nothing of showing up for work more regularly, that is a dramatic tonic for the economy. There is every reason to think that many workers and their employers are gaining such benefits. Whether the overall mental health of the populace is improving remains an open question, however. Overall rates of depression seem stable or increasing in most technological countries, and the suicide rate is stubbornly unchanged despite all the new efforts to recognize and treat depression. The social effects of psychotropic medications are the unreported story of our time. These effects may be small, or they may be large, with the potential for social catastrophe or positive transformation. I make no claim to know which position is correct, but I do know that the question is important, unstudied, and in need of careful research. What government agency is responsible for ensuring that such investigations get carried out? The National Institute of Mental Health? The Securities and Exchange Commission? Thoughtful investigative reporting can give us preliminary answers that should help to focus attention on the social effects of psychotropic medications.
RANDOLPH M. NESSE, M.D., is Professor of Psychiatry and Director of the ISR Evolution and Human Adaptation Program at The University of Michigan, and coauthor (with George C. Williams) of Why We Get Sick: The New Science of Darwinian Medicine.
LINK: Randolph Nesse's Home Page
_________________________________________________________________
Margaret Wertheim
Response to Paul Davies
I appreciate Paul Davies' response to my question "What is science, and do indigenous knowledge systems also contain a genuine scientific understanding of the world?" My point in raising this question is not to suggest that Western science is not universal -- clearly the same law of gravity operates in the deserts of central Australia as operates in the labs of Caltech. In that sense Western science is indeed something that every culture can share and benefit from, if they so choose. At issue here is really the reverse question: Might there also be discoveries about the way the world works that have been made by other cultures, that we in the West have not yet come to -- knowledge that we in turn might benefit from? One example here is the Aboriginal tradition of fire burning. It is now known that Aboriginal people traditionally managed the land and its native flora and fauna by complex patterns of burning. Given the huge risk of out-of-control bush-fires in Australia, there is now interest among some ecologists and park managers in understanding this traditional knowledge. Another example is acupuncture. Some years ago I had a serious case of hepatitis, for which Western medicine could do nothing whatever. Eventually, after months of illness, I started to see an acupuncturist, because Chinese medicine claims to have ways of treating liver disease. Eventually I recovered. It is possible, of course, that I might have recovered without the acupuncture, but a great many people (including billions of Chinese) have had therapeutic experiences with acupuncture. I do not claim to know how acupuncture works, but it seems fair to at least keep an open mind that there really is some deep understanding here of bodily function -- some knowledge that we might truly benefit from.
The hard part of the question is: does such knowledge constitute a genuine "science"? Paul suggests not, but I think this option should not be ruled out. In practical terms, acupuncturists operate much the same way that Western doctors operate: you go for a diagnosis, they check for various symptoms, then they prescribe certain treatments. All this involves rational analysis based on a complex underlying theory of how the body works. That theoretical foundation might well sound odd to Western minds (it obviously does to Paul, as it does to me), but if billions of people get well it seems hard to dismiss it completely. We should not forget that our own medical science today incorporates theoretical ideas (jumping genes, for example) that were scoffed at by most scientists just a few decades ago. There are no doubt many more "true" ideas that we have not yet come to about the human body -- things that might seem crazy today. A number of double-blind trials have shown that acupuncture can be very effective -- so even by Western standards it seems to pass the test. One could still argue that it's not a "true science" but just a complex set of heuristics that happens to work in lots of cases; but is this any less so of much of our own medical science? As "shining emblems of true science" Paul suggests "radio waves, nuclear power, the computer and genetic engineering." The first three examples all come out of physics, which because of its mathematical foundation is somewhat different to most of the other sciences. As philosophers of science have been saying for some time, it is problematic to judge all sciences according to the methodologies of physics. If that is the criterion for a "true science" then much of modern biology would not count either. Paul's final example, genetic engineering, is from the biological area, but it is the most "atomized" part of biology.
Historically the whole area of gene science (from Max Delbruck on) has been heavily influenced by a physics mentality, and contemporary genetic engineering is indeed a testimony to what can be achieved by applying a physics paradigm to biology. But again, if this is our only criterion for "true science" then what is the status of other biological sciences such as ecology, zoology, and indeed Darwin's theory of evolution? None of these would seem to me to pass Paul's criteria. Thus we come back to the question: What really is science? This is a question of immense debate among philosophers of science, and among many scientists. I don't claim to have a simple answer -- but I would like to argue for a fairly expansive definition. Although I trained as a physicist myself, and physics remains my personal favorite science, I do not think it can or should be our only model for a "true science." By suggesting that indigenous knowledge systems contain genuine scientific understandings of the world, I do not mean to imply that Western science becomes less universal, only that there may well be other truths that our science has yet to discover. The point is not to diminish our own science, or our understanding of what science is, but to enrich both.
MARGARET WERTHEIM is the author of Pythagoras' Trousers, a cultural history of physics, and The Pearly Gates of Cyberspace: A History of Space from Dante to the Internet. She writes regularly about science for Salon, L.A. Weekly, The Sciences, Guardian, TLS and New Scientist. She is the senior science reviewer for the Australian's Review of Books.
_________________________________________________________________
Paul Davies
Response to Margaret Wertheim
Margaret Wertheim asks what is meant by "science." I have an answer. It must involve more than merely cataloguing facts, and discovering successful procedures by trial and error. Crucially, true science involves uncovering the principles that underpin and link natural phenomena.
Whilst I wholeheartedly agree with Margaret that we should respect the world view of indigenous non-European peoples, I do not believe the examples she cites -- Mayan astronomy, Chinese acupuncture, etc. -- meet my definition. The Ptolemaic system of epicycles achieved reasonable accuracy in describing the motion of heavenly bodies, but there was no proper physical theory underlying it. Newtonian mechanics, by contrast, not only described planetary motions more simply, it connected the movement of the moon with the fall of the apple. That is real science, because it uncovers things we cannot know any other way. Has Mayan astronomy or Chinese acupuncture ever led to a successful nontrivial prediction producing new knowledge about the world? Many people have stumbled on the fact that certain things work, but true science is knowing why things work. I am open-minded about acupuncture, but if it does work, I would rather put my faith in an explanation based on nerve impulses than in mysterious energy flows that have never been demonstrated to have physical reality. Why did science take root in Europe? At the time of Galileo and Newton, China was far more advanced technologically. However, Chinese technology (like that of the Australian Aborigines) was achieved by trial and error refined over many generations. The boomerang was not invented by first understanding the principles of hydrodynamics and then designing a tool. The compass (discovered by the Chinese) did not involve formulating the principles of electromagnetism. These latter developments emerged from the (true, by my definition) scientific culture of Europe. Of course, historically, some science also sprang from accidental discoveries only later understood.
But the shining emblems of true science -- such as radio waves, nuclear power, the computer, genetic engineering -- all emerged from the application of a deep theoretical understanding that was in place before -- sometimes long before -- the sought-after technology. The reasons for Europe being the birthplace of true science are complex, but they certainly have a lot to do with Greek philosophy, with its notion that humans could come to understand how the world works through rational reasoning, and with the three monotheistic religions -- Judaism, Christianity and Islam -- with their notion of a real, lawlike, created order in nature, imposed by a Grand Architect. Although science began in Europe, it is universal and now available to all cultures. We can continue to cherish the belief systems of other cultures, whilst recognizing that scientific knowledge is something special that transcends cultures.
PAUL DAVIES is an internationally acclaimed physicist, writer and broadcaster, now based in South Australia. Professor Davies is the author of some twenty books, including Other Worlds, God and the New Physics, The Edge of Infinity, The Mind of God, The Cosmic Blueprint, Are We Alone? and About Time.
LINK: "The Synthetic Path" -- Ch. 18 in The Third Culture
_________________________________________________________________
Rafael Nunez
The Death of Nations
For centuries societies have organized themselves in terms of kingdoms, countries, and states. Towards the second half of the recently ended 20th century these geographical, cultural, and political "units" acquired a more precise meaning through the establishment of modern "nations". The process was consolidated through, among other things, the creation of the so-called United Nations, and the independence of most colonial territories in Africa during the '60s. Today, we naturally see the world as organized in clear-cut and well-defined units: the world's nations (just check the colors of a political atlas).
Nations have their own citizens, well-established territories, capital cities, flags, currencies, stamps and postal systems, military forces, embassies, national anthems, and even their own sports teams competing in the various planet-scale events. This widespread view has not only been taken for granted by most sectors of public opinion, but has also served as the foundation of the highest form of international organization -- the United Nations. The most serious world affairs have been approached with this nation-oriented paradigm. But the reality of our contemporary global society (which goes far beyond pure global technology) is gradually showing that the world is not a large collection of nations. Nations, as we know them, are no longer the appropriate "unit of analysis" for running the world and dealing with its problems. Here is why.
o Environmental problems: Purely national/inter-national efforts to avoid the pollution of rivers, to protect the ozone layer, to manage (and avoid) environmental disasters, and to protect endangered species and biological diversity, have not given good results. New forms of global organizations, such as WWF and Greenpeace, have emerged to deal with these problems in a more efficient manner.
o Natural resources: The management of the world's forests, the Antarctic ice, and fishing resources has shown that they don't belong to the national/inter-national realm. Again, new and more efficient forms of global organizations have emerged for addressing these problems.
o Sovereignty: The relatively recent arrest in London of the ex-Chilean dictator Augusto Pinochet (facing a potential extradition to Spain) has raised unprecedented and deep issues about the sovereignty of nations. The Chilean government claims that Pinochet should be judged in Chile, but international laws seem to be gradually evolving towards a form of jurisdiction that is above the sovereignty of nations.
The role of supra-national organizations such as Amnesty International and Human Rights Watch is becoming extremely prominent in redefining these issues.
o Neutrality: The complexity of contemporary world organization is leaving almost no room for neutrality. Contemporary Swiss society, for instance, is experiencing an important identity crisis, since its traditional neutrality is no longer tenable in the new European and international contexts. One of the essential aspects of its national identity -- neutrality -- is collapsing. A simple fact illustrates this crisis. In 1992, during the World Expo in Seville, the official Swiss stand exhibited the following motto: "Switzerland does not exist".
o Ethnic groups' representation: Many ethnic groups around the world whose territories extend over several nations, such as the Kurds (who live mainly in Eastern Turkey, Western Iran, and Northern Iraq) or the Aymaras (who live in Eastern Bolivia, Southern Peru, and Northern Chile), have had almost no representation in international organizations. Their problems haven't been heard in a world organized under the nation-paradigm. These groups, however, have been in the news over the last decade, bringing their issues more to the foreground and thus relegating the traditional nations to a less prominent role.
o Epidemics: Serious epidemics such as AIDS and new forms of tuberculosis are spreading at alarming rates in some areas of the world. The cure, the study, and the control of these epidemics demand organizational efforts that go well beyond national/inter-national schemas. The emergence of many NGOs dealing with health issues is an attempt to provide more appropriate answers to these devastating situations.
o Civil wars and ethnic cleansing: The stopping and control of ethnic massacres such as the ones observed in the former Yugoslavian regions, and those between Tutsis and Hutus in Africa, demand quick intervention and serious negotiation.
A heavy nation-oriented apparatus is usually extremely slow and inefficient in dealing with these kinds of situations. It can't capture the subtleties of cultural dynamics.
o Ongoing separatism and proliferation of nations: The world has more and more nations. Only a few dozen nations founded the United Nations half a century ago. Today the UN has around two hundred members (The International Olympic Committee and FIFA, the world's football federation, have even more!). And it is not over. The former Soviet republics, Slovenia, Croatia, the Czech Republic, Slovakia, and so on have already become new nations. Many others, such as the Basque country, Quebec, and Chechnya, are still looking for their independence. An ever-increasing number of nations will eventually collapse.
o Loss of national property and national icons: The openness and dynamism of international markets, as well as the globalization of foreign investment, have altered at unprecedented levels the sense of what is "national". For instance, many airlines (to take a very simple example) usually seen as "national" airlines today belong in fact to extra-national companies. Such is the case of Aerolineas Argentinas, LOT Polish Airlines, TAP Portugal, and LAN Peru, to mention only a few. National airlines, which in many countries have been seen as national icons, are simply not national anymore. Of course, the same applies to fishing waters, mines, forests, shopping malls, vineyards, and so on. These are only a few examples. There are many others. Very serious ones, such as the primacy of watersheds over national borders in solving serious problems of water distribution. And less serious ones, such as the potential collapse of one of the Canadian national sports (ice hockey), if its franchises continue to move to more profitable lands in the United States.
All these aspects of our contemporary societies challenge the very notion of "nation", and reveal the primacy of other factors which are not captured by nation-oriented institutions. The world is now gradually adjusting to these changes, and is coming up with new forms of organization in which nations, as such, play a far less important role. Such is the case of the formation of the European Community (which allows for free circulation of people and merchandise), the establishment of a "European passport", and the creation of the Euro as a common currency. After all, many national borders are, like those straight lines one sees in the maps of Africa and North America, extremely arbitrary. It shouldn't then be a surprise that the world divided into nations is becoming an anachronism from the days when the world was ruled by a few powerful kingdoms that ignored fundamental aspects of ethnic, cultural, biological, and environmental dynamics. We are now witnessing the death of nations as we know them.
RAFAEL NUNEZ is Assistant Professor of Cognitive Science at the University of Freiburg, and a Research Associate at the University of California, Berkeley. He is co-editor (with Walter J. Freeman) of Reclaiming Cognition: The Primacy of Action, Intention, and Emotion.
_________________________________________________________________
Thomas A. Bass
Shifting Empires and Power
I'm currently thinking about Sophocles' Oedipus at Colonus and the Book of Exodus, which is inclining me to the opinion that today's unreported stories are similar to yesterday's: shifting empires and power as the powerless struggle for sanctuary and their own form of salvation.
THOMAS A. BASS is the author of The Predictors, Back to Vietnamerica, Reinventing the Future, Camping with the Prince and Other Tales of Science in Africa, and The Eudaemonic Pie. A frequent contributor to Smithsonian, Audubon, Discover, The New York Times, and other publications, he is Contributing Writer for Wired magazine and Scholar-in-Residence at Hamilton College. LINK: Thomas A.
Bass Home Page
_________________________________________________________________
Margaret Wertheim
Indigenous Science
Over the last century we in the western world have gradually come to take seriously other cultures' religions, social systems, aesthetics, and philosophies. Unlike our eighteenth-century forebears we no longer think of indigenous peoples of the non-white world as "savages", but have come to understand that many of these cultures are as complex and sophisticated as ours. The one area where we have continued to proclaim our own specialness -- and by extension our own superiority -- is science. "True science" -- that is, a "truly empirical" understanding of the world -- is often said to be a uniquely western pursuit, the one thing we alone have achieved. Against this view, a small but growing body of scholars are beginning to claim that many indigenous cultures also have a genuine scientific understanding of the world -- their claim is that science is not a uniquely western endeavour. These "other" sciences are sometimes referred to as "ethnosciences" -- examples include (most famously) Mayan astronomy and Chinese medicine, both of which are highly accurate, though wildly different to their western equivalents. Less well known are the logic-obsessed knowledge system of the Yolngu Aborigines of Arnhem Land in northern Australia, and the complex navigational techniques of the Polynesians. The claim that other cultures have genuine sciences (and sometimes also complex logics) embedded in their knowledge systems raises again the whole philosophical issue of what exactly the word "science" means. Helen Verran, an Australian philosopher of science who is one of the leaders of the ethnoscience movement, has made the point that having the chance to study other sciences gives us a unique opportunity to reflect back on our own science. Her work on the Yolngu provides an important window from which to see our own scientific insights in a new light.
Sadly, some scientists seem inherently opposed to the very idea of "other sciences". But studying these other ways of knowing may enhance our own understanding of the world in ways we cannot yet imagine. The example of acupuncture must surely give any skeptic at least some pause for thought -- the Chinese have performed operations using acupuncture needles instead of anesthetic drugs. Likewise Mayan astronomy, though based on the cycles of Venus, was as empirically accurate as anything in the West before the advent of the telescope. Two hundred years ago the idea that indigenous "savages" might be genuine philosophers would have struck most Europeans as preposterous. Today we have accepted this "preposterous" proposition, but a similar view prevails about science. Learning about, and taking seriously, these other ways of knowing the world seems to me one of the greatest tasks for the next century -- before (as Steven Pinker has rightly noted) this immense wealth of human understanding disappears from our planet.
MARGARET WERTHEIM is the author of Pythagoras' Trousers, a cultural history of physics, and The Pearly Gates of Cyberspace: A History of Space from Dante to the Internet. She writes regularly about science for Salon, L.A. Weekly, The Sciences, Guardian, TLS and New Scientist. She is the senior science reviewer for the Australian's Review of Books.
_________________________________________________________________
W. Brian Arthur
The Last Word on Y2K: The Y1K Problem
Just before the year 1000, a rumor arose in a certain town in Germany, Hamelin I believe, that the coming of the new time would bring rats to the public buildings. Some had been spotted in the basement of the town hall, some in the local stables. Rat preventers were hired at great price, and indeed when the century turned no rats were to be seen. The city fathers felt shamed by the scare and called the preventers before them. You have spent a great deal of money on these rats --
but there were no rats, they said. Ah, city fathers, said the preventers. That's because we prevented them.
W. BRIAN ARTHUR is Citibank Professor at the Santa Fe Institute. From 1982 to 1996 he held the Morrison Chair in Economics and Population Studies at Stanford University. Arthur pioneered the study of positive feedbacks, or increasing returns, in the economy -- in particular their role in magnifying small, random events in the economy.
_________________________________________________________________
David Bunnell
New York Times Sells Out!
Sources from deep inside The New York Times Company, owner of The New York Times, the Boston Globe, numerous TV stations, regional newspapers, and various digital properties, and from The Onion, a web-based satirical newspaper (www.theonion.com), have verified the rumors. It's true: The Onion, Inc., headquartered in Madison, Wisconsin, has made an offer to buy The New York Times, Inc. for stock. This is a serious offer, and word is that NYT Chairman and Publisher Arthur Sulzberger sees it as a way to instantly transform his family's company into a major Internet content provider and thereby pump up the company's stock, creating instant wealth for many long-time shareholders. According to people close to the talks, Sulzberger and other New York Times executives were recently seen in Madison, Wisconsin, where they reportedly attended a Friday afternoon beer bash at The Onion headquarters. Apparently, the executives of both companies really hit it off and have even gone on camping trips together. Executive Editor Joseph Lelyveld of The New York Times and The Onion's Editor-in-Chief Robert Siegel have formed a "mutual admiration club" and are seriously considering swapping jobs once the merger is finalized. "Some of the ideas these two groups discuss once they've had a few beers are phenomenal, particularly when you get Mr. Sulzberger into it," reported one of the Onion editors.
The New York Times group is particularly intrigued with the success that The Onion has had by using invented names in all its stories except for public figures. By applying an Internet journalistic standard to an old-media newspaper like The Times, it is felt that editorial costs can be reduced by a whopping 80%! Cultural differences between the two companies and differing standards of journalism aren't seen as major stumbling blocks to getting the deal done. The biggest challenge will be to get the two sides to agree to a valuation that gives shareholders of both companies plenty to cheer for. This is complicated because The Onion has a market cap that is several hundred billion dollars higher than the New York Times Company's. The expectation, though, is that this will be worked out to be similar to AOL's purchase of Time Warner, with The Onion shareholders getting about 55 to 60 percent of the merged company. Thus, New York Times shareholders will see an instant uptick in their stock, which should compensate them more than adequately for losing control of the company. Onion Publisher & President Peter K. Haise will reportedly give up his position to become Chairman of the combined company and move to New York. Arthur Sulzberger will move to Wisconsin to run The Onion, which will be the new flagship of what will be called the "Onion New York Times Media Giant Company." Haise and Sulzberger have also agreed to swap houses and families as part of the deal, which will facilitate their need to move quickly. The resulting "Onion New York Times Media Giant Company" will be one of the world's largest media companies in terms of market cap, though only half as big as AOL/Time-Warner. The year 2000 is already being seen as the year that old media surrendered to new media, and there are some more surprises to come. The biggest merger yet could happen this summer when Wired Digital spins out Suck.com, which will in turn make a bid to buy The Walt Disney Corporation.
Stay tuned, dear readers: Suck!Disney could become the biggest acquisition of all time.
DAVID BUNNELL is founder of PC Magazine, PC World, MacWorld, Personal Computing, and New Media. He is CEO and Editor of Upside.
LINKS: Upside; David Bunnell in Upside
Further reading on Edge: Chapter 4 -- "The Seer" -- in Digerati; "PC MEMORIES, HOW I CREATED THE PC" by David Bunnell
_________________________________________________________________
Paul W. Ewald
Infection Is Much Bigger Than We Thought, Bigger Than We Think, And Perhaps Bigger Than We Can Think
I am confident that I don't know "today's most important unreported story" because it hasn't been reported to me yet. But I'll take a stab at one of today's most important under-reported stories: Infection is much bigger than we thought, bigger than we think, and perhaps bigger than we can think. With apologies to J.B.S. Haldane, let me offer a less grandiose, but more tangible and testable (and ponderous) version: The infectious diseases that are already here but not yet generally recognized as infectious diseases will prove to be vastly more important than the infectious diseases that newly arise in the human population from some exotic source (such as the jungles of Africa), or the genetic diseases that will be newly discovered by the human genome project. By "important" I mean both the amount of disease that can be accounted for and the amount that can be ameliorated, but I also mean how much of what we value as well as what we fear. A judgment on this pronouncement can be assessed incrementally, decade by decade, over the next half-century. What are the diseases to keep an eye on? Heart disease and stroke; Alzheimer's, Parkinson's and other neurodegenerative diseases; impotence, polycystic ovary disease, cancers of the breast and ovaries, the penis and prostate; schizophrenia and the other major mental illnesses. The list goes on. But is the scope really "bigger than we can think"? Who can say?
We can speculate that the scope of infection may extend far beyond what many in the year 2000 would be willing to take seriously. If schizophrenia and manic depression are caused largely by infection, then perhaps the artistic breakthroughs in our society, the groundbreaking work of van Gogh, for example, can also be attributed to infection. How much of what we prize in society would not be here were it not for our constant companions? Rather than pass judgment now, I suggest that we return in 2010 to this offering and each of the other contributions to see how each is faring as the fuzziness of the present gives way to the acuity of hindsight. PAUL W. EWALD is a professor of biology at Amherst College. He was the first recipient of the George E. Burch Fellowship in Theoretic Medicine and Affiliated Sciences, and he conceived a new discipline, evolutionary medicine. He is the author of Evolution of Infectious Disease, which is widely acknowledged as the watershed event for the emergence of this discipline. _________________________________________________________________ Robert Aunger The End of the Nation-State One of the Big Stories of the last century was globalization -- the rise of plodding great dinosaur-like institutions promoting the interests of the Fortune 500. Of course, merger-mania continues to capture headlines, creating ever-larger multinational firms, centralizing information and money -- and hence power -- in the hands of a few old White guys. This is Goliath, and Goliath at a scale above the State. On David's side of the battle for our hearts and souls, we have the Internet, the weapon of Everyman. The Internet is the newfound instrument of the little people, bringing us all within a few clicks of each other (the so-called "small world" phenomenon). It is no accident that the first to flock to this medium were minorities of all kinds -- poodle-lovers, UFO-watchers, and other fringe-dwellers. 
Here, through this network, they found a way to broadcast their message across the world at virtually no cost through an avenue not controlled by Walmart or Banque Credit Suisse. What is getting squeezed out in this picture is the institution in the middle, the nation-state. It is easy for the media to focus on the President as he waves to them while boarding Air Force One -- indeed, they fawn on these "photo-ops." The existence of standardized channels, like the press advisor, for disseminating "important messages" makes their job easy. Thus, the media haven't noticed that the institution the President represents is increasingly irrelevant to the course of events. Why? Let's look at the sources of State power, and how they are being eroded. First, money is no longer tied to any material token (see Thomas Petzinger, Jr., this Forum). Once the link to cowrie shells or gold bullion is severed, the exchange of value becomes a matter of trust. And this trust is increasingly being placed in computers -- the Internet again. Greenspan can control greenbacks, but not e-money. Any zit-faced teenager can become an instant millionaire by flipping a digit on a strategic computer account. This is digital democratization, of a sort. So one of the vital sources of centralized governmental power -- control over the money supply -- is increasingly no longer in the hands of the State. What about the distribution of wealth? It used to be that those close to the political decision-making machinery could write the rules for moneymaking and thus guarantee themselves advantages: policies informed incentives. But the globalization of capital markets has reversed that causal ordering: money now flows as if national boundaries were invisible, slipping right 'round local rules and regulations. The policy-makers are always a step behind. So the State no longer finds it easy to ply favorites with favors. The ultimate source of control, of course, is access to information. 
What you don't know you can't act on. Governments have long recognized how important this is. Can States nowadays control public opinion? Are the media operated by people the State can coopt? Well, sometimes. But the Fall of the Wall suggests control is never perfect. So you can tell some of the people what to do some of the time, but not whole populations what to think for very long. It just costs too much. And (as Phil Leggiere points out elsewhere in this Forum), the Internet is now a powerful means for protest against State interests. No wonder States are trying hard to control this organically-grown monster. States of course use various means besides the media to attract allegiance. For example, they stir up patriotism by the tried-and-true method of demonizing outsiders. However, of late, it has become harder to direct aggression "outside," as made obvious by the proliferation of aggressive conflicts along ethnic lines within States (Jaron Lanier's non-Clausewitzian wars, in this Forum). The other possibility, of course, is that some splinter group will get hold of -- or make -- a nuclear warhead, and hold a government ransom. So the ability to incite war -- another source of State power -- seems to be coming from other quarters. This constitutes additional evidence of the impending demise of States. What people really care about, the social psychologists tell us, is the group they identify with. You don't identify with Uncle Sam (a clever anthropomorphizing gimmick that only works during war); you identify with Uncle Fred and the other kin who share your name. So it's difficult for people to identify with a country. It's too big -- just a jerry-rigged bit of color on a map in many cases. How can you care when your vote has no influence over outcomes? "Representative" government is farcical when a population is counted in millions. 
Of course, if you're rich, you can buy influence, but the ante is always being upped as some other special interest vies for control over your Man in Washington. Besides, those guys always logroll anyway. When your self-concept, wealth, and well-being derive from participation in other kinds of community, the State becomes an anachronism. The result of all this will not be the arrival of the Libertarian heaven, a State-less society. It is just that mid-level governance will be replaced by larger- and smaller-scale institutions. We won't have monolithic Big Brother looking over our shoulders in the next century. Instead, we will become a network of tightly linked individuals, empowered by technologies for maintaining personal relationships across space and time. We will all choose to be cyborgs (Rodney A. Brooks), with implants that permanently jack us into the global brain (Ivan Amato), because of the power we derive from our environmentally augmented intelligence (Andy Clark, with apologies to Edwin Hutchins and Merlin Donald). We will all come to live in what Manuel Castells calls a Network Society, and begin, literally, to "think globally and act locally." ROBERT AUNGER works on cultural evolution at the Department of Biological Anthropology, University of Cambridge, and is editor of Darwinizing Culture: The Status of Memetics as a Science. LINK: Robert Aunger Home Page _________________________________________________________________ Richard Potts Emergence of an Integrated Human-Earth Organism Several under-reported stories come to mind. Almost all powerful stories concern human beings in some way or another, metaphorically or directly. One result of globalization, let's call it cultural unity, is a story of such power. I'll mention only one facet of this story. Over the past 50,000 years, the vast diversification of human culture -- the creation of quasi-distinct cultures, the plural -- 
stands as a peculiarity of Homo sapiens (relative to earlier humans and other organisms). Human life has divided into diverse languages and ways of organizing kin, technologies, economies, even mating and demographic systems. It's a process that reflects our yen for doing things differently from the people in the next valley. Globalization may mean the dissolving (all too gradually) of tribal mentality. But there's more to it. The related extinction of languages, loss of local cultural information, and decay of cultural barriers all point toward an eventual homogenization of behavior that hasn't existed at such a scale (across all humans) since the Paleolithic prior to 50,000 years ago, or even much earlier. The result: the loss of alternative adaptive strategies and behavioral options, which have been rather important in the history of human adaptability. That's pretty big. In seeking a truly unreported story, though, it's wise to think a little further out, to make an unexpected prediction. How can that be done? "Unexpected prediction" seems contradictory. Well, the history of life is full of curious experiments, and careful study lets one fathom its rash opportunism and its rises and erasures of biotic complexity. The history offers hints. An intriguing case is the evolution of the complex cell, the basis of all eukaryotic life, including multicellular organisms. The cell, with its nucleus, mitochondria, centrioles, and other components, represents an ecosystem of earlier organisms. The cell evidently emerged by symbiosis of a few early organisms brought together in a single, coordinated system. It's complex internally, but it evolved by simplifying, by gleaning from the surrounding ecosystem. Each of us carries around about a hundred trillion of these simplified early ecosystems, which are coordinated at even higher levels of organization -- tissues, organ systems, the individual. 
The big unreported story that I fancy is a latter-day parallel to this fateful development in life's history. Human alteration of ecosystems presents the parallel -- a sweeping simplification of a previously diverse biotic system. Homo sapiens has slashed, culled, and gleaned. It has forged symbiotic relationships with a few other species (domesticates) that help fuel its metabolism (economic functions) as humans enhance the replication of those few at other species' expense. While these observations are somewhat familiar, the unreported part is this: The global reach of this process threatens/promises to create a single extended organism. The superorganism continues to alter the planet and promises to touch virtually every place on the third rock from the sun. Will this strange organism eventually harness the intricate linkages of ocean, atmosphere, land, and deep Earth? Will it seize control over the circulation of heat, moisture, energy, and materials -- that is, the core operations of the planet? Hard to say without a crystal ball. At its current trajectory, the process seems destined to turn the planet into a cell, highly complex in its own right but evolved by vast simplification of its original setting. Certainly a different Gaia than is usually envisioned. If this story has any validity, it's interesting that the initial loss of cultural alternatives due to globalization roughly coincides with the emergence of this incipient planetary organism. What I suggest here is the onset of a Bizarre New World, not an especially brave one. It might take more bravery to conserve Earth's biological diversity and diverse ways of being human, salvaging species and cultures from oblivion in a globalized world. Then again... this may already be old-fashioned sentiment. Any important story, even one as complicated as this, needs a headline: Human-Earth Organism Evolves: Will It Survive? What Will It Become? 
RICHARD POTTS is Director of The Human Origins Program, Department of Anthropology, National Museum of Natural History, Smithsonian Institution. The program focuses on the long history of ecosystem responses to human pressures and vice versa. Museum researchers are piecing together the climatic and ecological conditions that allowed humans to evolve. He is the author of Humanity's Descent : The Consequences of Ecological Instability and a presenter, with Stephen Jay Gould, of a videotape, Tales of the Human Dawn. _________________________________________________________________ Kenneth W. Ford The Swiftness of the Societal Changes That Occurred Two-thirds of the Way Through the Century No end of changes in our world are cited as we look back on the twentieth century: population growth, scientific and medical advances, communications technology, transportation, child rearing and family structure, depletion of energy and mineral resources, and human impact on the environment, to name a few. In general we analyze these changes over the whole sweep of the century although some, to be sure, because of their exponential character, have made their mark mainly toward the end of the century. What has gone largely unreported, it seems to me, is the suddenness with which a set of societal changes occurred in less than a decade between 1965 and 1975 (a step function to a mathematician, a seismic shift to a journalist). In that period, we saw revolutionary change in the way people dress, groom, and behave; in the entertainment that grips them; in equity for minorities, women, and the variously disabled; in higher education; and in the structure of organizations. The unpopular Vietnam war can account for some of these changes, but surely not all. The changes were too numerous and extended into too many facets of our lives to be explained solely by antiwar fervor. Moreover, what happened was not a blip that ended when the war ended. The changes were permanent. 
With remarkable speed, as if a switch had been thrown, we altered the way we deal with one another, the way we see our individual relation to society, and the way we structure our organizations to deal with people. I lived through the period on a university campus, and saw rapid changes in higher education, not to mention dress and behavior, that are with us still. My own professional society, the American Physical Society, transformed itself between the late 1960s and the early 1970s from an organization that held meetings and published papers to an organization that, in addition, promotes equity, highlights links between science and society, seeks to influence policy, and cares for the welfare of its individual members. Why did so much -- with lasting impact -- happen so quickly? KENNETH W. FORD is the retired director of the American Institute of Physics. He recently taught high-school physics and served as science director of the David and Lucile Packard Foundation. His most recent book, written with John A. Wheeler, is Geons, Black Holes, and Quantum Foam: A Life in Physics, which won the 1999 American Institute of Physics Science Writing Prize. _________________________________________________________________ Nancy Etcoff Good News Four of five stories on trends in American life that appear on national television news describe negative, frightful trends rather than hopeful ones. Crime stories are the top category of local news, outnumbering segments on health, business, and government combined. Perhaps we require the media to be our sentinel. But we also seek a spark. The popularity and prestige of science have never been higher because science is forward-looking. Science has become the bearer of hope, a source of the sublime. 
NANCY ETCOFF, a faculty member at Harvard Medical School and a practicing psychologist and neuropsychologist in the Departments of Psychiatry and Neurology at the Massachusetts General Hospital, has been researching the perception of human faces for the past ten years. Her work and ideas have been reported in The New York Times, Newsweek, Rolling Stone, U.S. News and World Report, Discover, Fortune, and Mademoiselle. She has been a featured guest on Dateline, NPR, the Discovery Channel, and Day One. She is the author of Survival of the Prettiest: The Science of Beauty. _________________________________________________________________ Geoffrey Miller Social Policy Implications of the New Happiness Research In the last ten years, psychology has finally started to deliver the goods -- hard facts about what causes human happiness. The results have been astonishing, but their social implications have not sparked any serious public debate: (1) Almost all humans are surprisingly happy almost all the time. 90% of Americans report themselves to be "very happy" or "fairly happy." Also, almost everyone thinks that they are happier than the average person. To a first approximation, almost everyone is near the maximum on the happiness dimension, and this has been true throughout history as far back as we have reliable records. (This may be because our ancestors preferred happy people as sexual partners, driving happiness upwards in both sexes through sexual selection.) (2) Individuals still differ somewhat in their happiness, but these differences are extremely stable across the lifespan, and are almost entirely the result of heritable genetic differences (as shown by David Lykken's and Auke Tellegen's studies of identical twins reared apart). (3) Major life events that we would expect to affect happiness over the long term (e.g. winning the lottery, death of a spouse) only affect it for six months or a year. 
Each person appears to hover around a happiness "set-point" that is extremely resistant to change. (4) The "usual suspects" in explaining individual differences in happiness have almost no effect. A person's age, sex, race, income, geographic location, nationality, and education level have only trivial correlations with happiness, typically explaining less than 2% of the variance. An important exception is that hungry, diseased, oppressed people in developing nations tend to be slightly less happy -- but once they reach a certain minimum standard of calorie intake and physical security, further increases in material affluence do not increase their happiness very much. (5) For those who suffer from very low levels of subjective well-being (e.g. major depression), the most potent anti-depressants are pharmaceutical, not social or economic. Six months on Prozac(TM), Wellbutrin(TM), Serzone(TM), or Effexor(TM) will usually put a depressed person back near a normal happiness set-point (apparently by increasing serotonin's effects in the left prefrontal cortex). The effects of such drugs are much stronger than any increase in wealth or status, or any other attempt to change the external conditions of life. The dramatic, counter-intuitive results of happiness research have received a fair amount of media attention. The leading researchers, such as Ed Diener, David Myers, David Lykken, Mihaly Csikszentmihalyi, Norbert Schwarz, and Daniel Kahneman, are regularly interviewed in the popular press. Yet the message has influenced mostly the self-help genre of popular psychology books (which is odd, given that the whole concept of self-help depends on ignoring the heritability and stability of the happiness set-point). The research has not produced the social, economic, and political revolution that one might have expected. Journalists have not had the guts to rock our ideological boats by asking serious questions about the broader social implications of the research. 
Popular culture is dominated by advertisements that offer the following promise: buy our good or service, and your subjective well-being will increase. The happiness research demonstrates that most such promises are empty. Perhaps all advertisements for non-essential goods should be required to carry the warning: "Caution: scientific research demonstrates that this product will increase your subjective well-being only in the short term, if at all, and will not increase your happiness set-point." Of course, luxury goods may work very well to signal our wealth and taste to potential sexual partners and social rivals, through the principles of conspicuous consumption that Thorstein Veblen identified. However, the happiness research shows that increases in numbers of sexual partners and social status do not boost overall long-term happiness. There are good evolutionary reasons why we pursue sex and status, but those pursuits are apparently neither causes nor consequences of our happiness level. Some journalists may have realized that the happiness research challenges the consumerist dream-world upon which their advertising revenues depend -- their failure to report on the implications of the research for consumerism is probably no accident. They are in the business of selling readers to advertisers, not telling readers that advertising is irrelevant to their subjective well-being. Also, if we take the happiness research seriously, most of the standard rationales for economic growth, technological progress, and improved social policy simply evaporate. In economics, for example, people are modelled as agents who try to maximize their "subjective expected utility." At the scientific level, this assumption is very useful in understanding consumer behavior and markets. But at the ideological level of political economy, the happiness literature shows that "utility" cannot be equated with happiness. 
That is, people may act as if they are trying to increase their happiness by buying products, but they are not actually achieving that aim. Moreover, increasing GNP per capita, which is a major goal of most governments in the world, will not have any of the promised effects on subjective well-being, once a certain minimum standard of living is in place. None of the standard "social indicators" of economic, political, and social progress are very good at tracking human happiness. When hot-headed socialists were making this claim 150 years ago, it could be dismissed as contentious rhetoric. Equally, claims by the rich that "money doesn't buy happiness" could be laughed off as self-serving nonsense that perpetuated the oppression of the poor by creating a sort of envy-free pseudo-contentment. But modern science shows both were right: affluence produces rapidly diminishing returns on happiness. This in turn has a stark and uncomfortable message for those of us in the developed world who wallow in material luxuries: every hundred dollars that we spend on ourselves will have no detectable effect on our happiness; but the same money, if given to hungry, ill, oppressed developing-world people, would dramatically increase their happiness. In other words, effective charity donations have a powerful hedonic rationale (if one takes an objective view of the world), whereas runaway consumerism does not. Tor Norretranders (in this Edge Forum) has pointed out that 50 billion dollars a year -- one dollar a week from each first world person -- could end world hunger, helping each of the 6 billion people in the world to reach their happiness set-point. The utilitarian argument for the rich giving more of their money to the poor is now scientifically irrefutable, but few journalists have recognized that revolutionary implication. 
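Miller's round figures are easy to sanity-check. A minimal back-of-the-envelope sketch, assuming roughly one billion "first world" residents and (for his later population aggregate) a round 500,000 generations -- neither assumed number appears in the text itself:

```python
# Check: "one dollar a week from each first world person" ~= $50 billion/year.
first_world_population = 1_000_000_000  # assumed round number, not from the text
dollars_per_week = 1
weeks_per_year = 52
annual_total = first_world_population * dollars_per_week * weeks_per_year
print(f"${annual_total / 1e9:.0f} billion per year")  # prints "$52 billion per year"

# Check: 20 billion people sustained for "several hundred thousand generations"
# yields Miller's "10 quadrillion happy people" (assume 500,000 generations).
people = 20_000_000_000
generations = 500_000  # assumed interpretation of "several hundred thousand"
aggregate = people * generations
print(f"{aggregate:.0e} people")  # prints "1e+16", i.e. 10 quadrillion
```

Both of Miller's quoted totals come out as stated, to within the roundness of the assumed inputs.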
(Of course, equally modest contributions to the welfare of other animals capable of subjective experience would also have a dramatic positive effect on overall mammalian, avian, and reptilian happiness.) Other contributors to this Edge Forum have also alluded to the social implications of happiness research. David Myers pointed out the lack of correlation between wealth and happiness: "it's not the economy, stupid." Douglas Rushkoff and Denise Caruso bemoaned America's descent into mindless, impulsive consumerism and media addiction, neither of which delivers the promised hedonic pay-offs. Daniel Goleman identified the hidden social effects of our daily consumption habits -- they not only fail to make us happier, but they impose high environmental costs on everyone else. Others have suggested that some external substitute for consumerism might be more hedonically effective. David Pink championed a switch from accumulating money to searching for meaning. John Horgan was excited about the quiet proliferation of better psychedelic drugs. Howard Rheingold thinks more electronic democracy will help. They may be right that spiritualism, LSD, and online voting will increase our happiness, but the scientific evidence makes me skeptical. If these advances don't change our genes or our serotonin levels in the left prefrontal cortex, I doubt they'll make us happier. There may be other rationales for these improvements in the quality of life, but, ironically, our subjective quality of life is not one of them. Perhaps the most important implication of the happiness literature concerns population policy. For a naive utilitarian like me who believes in the greatest happiness for the greatest number, the happiness research makes everything much simpler. To a first approximation, every human is pretty happy. From an extra-terrestrial utilitarian's viewpoint, human happiness could be treated as a constant. It drops out of the utilitarian equation. 
That leaves just one variable: the total human population size. The major way to maximize aggregate human happiness is simply to maximize the number of humans who have the privilege of living, before our species goes extinct. Obviously, there may be some trade-offs between current population size and long-term population sustainability. However, most of the sustainability damage is due not to our large populations per se, but to runaway consumerism in North America and Europe, and catastrophic environmental policies everywhere else. Peter Schwartz (in this Edge Forum) mentioned the declining growth rate of the world's population as if it were unreported good news. I take a different view: the good news for a utilitarian who appreciates the happiness research would be a reduction in America's pointless resource-wastage and Brazil's deforestation rate, accompanied by a luxuriantly fertile boom in world population. Given modest technological advances, I see no reason why our planet could not sustain a population of 20 billion people for several hundred thousand generations. This would result in a utilitarian aggregate of 10 quadrillion happy people during the life-span of our species -- not bad for such a weird, self-deluded sort of primate. GEOFFREY MILLER is an evolutionary psychologist at University College London, and author of The Mating Mind: How Sexual Choice Shaped Human Nature. He is currently researching the implications of evolutionary psychology for consumer behavior and marketing. LINKS: Geoffrey Miller Home Page Further Reading on Edge: "Sexual Selection and the Mind": A Talk with Geoffrey Miller _________________________________________________________________ David Braunschvig The Non-US Uniform Mobile Standard The current American monopoly on Internet innovation is not etched in stone. 
With about two-thirds of the worldwide internet user base in North America, US-based companies generate over 80% of global revenue, and these represent about 95% of the sector's overall market capitalization of about a trillion US dollars (as of January 2000). This is indeed a paradox for a medium that was designed to be open and global. It is quite understandable, though, when you consider that US entrepreneurs benefit from: abundant venture capital, more efficient equity markets, flexible employment, higher PC penetration, efficient infrastructures and earlier deregulation leading to lower communications costs to consumers, better business-academia linkages, and a large, homogeneous domestic market. Of course, the rest of the world is catching up, as is increasingly reported in the news. In Europe alone, the aggregate market value of internet companies has shot up by a factor of 30 in the past year, admittedly from a low base of $2 billion in early 1999, to be contrasted with "only" a four-fold increase for internet companies quoted in US markets. Thus, observers generally agree that the disproportionately low aggregate capitalization of the non-US internet companies is a temporary fact. However, the media here often view the primacy of US innovation in the internet -- which is of course the premise of its leadership -- as something like an American birthright. During the past "American century" this has been conventional wisdom for other equally significant sectors. In the late 1960s, a major unreported story was that Boeing's leadership in civil aircraft construction was more fragile than one would expect; yet Airbus's orders surpassed Boeing's last year. Ten years ago, US dominance in cellular telecommunications technology seemed equally impregnable. Since then, the European GSM consortium has spawned a technology which is now widely accepted as a global standard for digital mobile telephony. 
Likewise, could new internet concepts and user experiences emerge outside of the US, with global relevance and reach? A remarkably underreported story is that the existence of a uniform mobile standard outside of the US is poised to be the foundation of a new generation of internet-enabled applications, which could be an extremely significant innovation. If portable devices and internet-enabled mobility are to be at the center of the current information revolution, Europe seems at an advantage to seed the landscape with new concepts, technologies, and companies leveraging its consistent mobile infrastructure. In Europe, location-sensitive services are being tested as we speak, enabling merchants to reach pedestrians and motorists with information and opportunities. Thus, rather than competing online with pure-play e-commerce companies, established bricks-and-mortar businesses could find their revenge in the high streets, thanks to these devices. The best technologies enabling these experiences might well come from all over the world, but the first movers are likely to find a privileged ground in Europe: a caveat for the complacent in the US! DAVID BRAUNSCHVIG is a managing director of Lazard Frères & Co. LLC in New York, where he advises governments and corporations on transactions and technology. In addition to his ongoing work as an advisor in the fields of information technology, Internet services, and "new media," he has advised the Mexican government on the privatization of its national satellite system. LINK: Lazard Frères & Co. LLC _________________________________________________________________ Ernst Pöppel My Own Story Today's most important unreported story is of course my own story. This must be true for everybody. But who else would be interested? ERNST PÖPPEL is a brain researcher, Chair of the Board of Directors at the Center for Human Sciences, and Director of the Institute for Medical Psychology, University of Munich. 
_________________________________________________________________ Philip Campbell "Chemistry for non-chemists"; Entrepreneurism I found your demand for "most" important a bit of a distraction, so forgive me for ignoring it. My first thought was "chemistry for non-chemists". Few people write about chemistry for the public, and few stories appear in the press. It's intrinsically difficult and, anyway, biology is, for the foreseeable future, just too sensational (in both good and questionable senses) and fast-moving for all but the sexiest of the rest of science to get much of a chance to compete for space in the media. But there is room for unusual science writers who know how to hit a nerve with a neat association between interesting chemistry and the everyday world - there just seem to be too few in existence and/or too little demand. (I'd mention John Emsley and my colleague Philip Ball as two honourable examples.) The second thought was entrepreneurism. Because of inevitable business secrecy, entrepreneurism too rarely gets adequately opened up to scrutiny and public awareness. That's not to imply a hostile intent - entrepreneurism can provide the basis of riveting tales in positive as well as negative senses. But, in Europe especially, chief executives of high-technology companies who bemoan the lack of an entrepreneurial culture unsurprisingly resist suggestions that a well-proven journalist be given the chance to roam around their company and write about what they find. Partly as a result of such inevitable caution, and partly because of the way the media approaches business, the public tends to get basic news and oceans of speculation about share prices and profits, gee-whiz accounts of technology, misrepresentation from lobby groups on both sides of a divide, lectures on management, and partial autobiographies of successful business people, but, unless a company collapses, nothing like the whole truth. More could surely be done, though the obstacles are daunting. 
PHILIP CAMPBELL is the Editor-in-Chief of Nature. LINK: Nature _________________________________________________________________ Steven Pinker The Loss of our Species' Biography Just as we are beginning to appreciate the importance of our prehistoric and evolutionary roots to understanding the human condition, precious and irreplaceable information about them is in danger of being lost forever: 1. Languages. The 6,000 languages spoken on the planet hold information about prehistoric expansions and migrations, about universal constraints and learnable variation in the human language faculty, and about the art, social systems, and knowledge of the peoples who speak them. Between 50% and 90% of those languages are expected to vanish in this century (because of cultural assimilation), most before they have been systematically studied. 2. Hunter-gatherers. Large-scale agriculture, cities, and other aspects of what we call "civilization" are recent inventions (< 10,000 years old), too young to have exerted significant evolutionary change on the human genome, and have led to cataclysmic changes in the human lifestyle. The best information about the ecological and social lifestyle to which our minds and bodies are biologically adapted lies in the few remaining foraging or hunting and gathering peoples. These peoples are now assimilating, being absorbed, being pushed off their lands, or dying of disease. 3. Genome diversity. The past decade has provided an unprecedented glimpse of recent human evolutionary history from analyses of diversity in mitochondrial and genomic DNA across aboriginal peoples. As aboriginal people increasingly intermarry with larger groups, this information is being lost (especially with the recent discovery that mitochondrial DNA, long thought to be inherited only along the female line, in fact shows signs of recombination). 4. Fossils. Vast stretches of human prehistory must be inferred from a small number of precious hominid fossils.
The fossils aren't going anywhere, but political instability in East Africa closes down crucial areas of exploration, and because of a lack of resources existing sites are sometimes inadequately protected from erosion and vandalism. 5. Great apes in the wild. Information about the behavior of our closest living relatives, the bonobos, chimpanzees, gorillas, and orangutans, requires many years of intensive observation in inaccessible locations, but these animals and their habitats are rapidly being destroyed. What these five areas of research have in common, aside from being precious and endangered, is that they require enormous dedication from individual researchers, they are underfunded (often running on a shoestring from private foundations), and they have low prestige within their respective fields. A relatively small reallocation of priorities (either by expanding the pie or by diverting resources from juggernauts such as neuroscience and molecular biology, whose subject matter will still be around in ten years) could have an immeasurable payoff in our understanding of ourselves. How will we explain to students in 2020 that we permanently frittered away the opportunity to write our species' biography? STEVEN PINKER is Professor of Psychology in the Department of Brain and Cognitive Sciences, and author of Language Learnability and Language Development, Learnability and Cognition, The Language Instinct, How the Mind Works, and Words and Rules. LINKS: The Official Steven Pinker Web Page; The Unofficial Web Page about Steven Pinker _________________________________________________________________ Jason McCabe Calacanis The Farce of the Slacker Generation (Or What the Hell Happened to Generation X?) In the early nineties, when I graduated from college, the media was obsessed with a generation of indifferent teenagers and twenty-somethings who couldn't be bothered with social causes, careers, or the general state of humanity.
Ironically, the same media structure which had previously been upset with the 60s generation for being too rebellious was now upset with the kids born in the 70s and the 80s for not being rebellious enough. They branded us slackers and they called us generation X, ho-hum. Fast forward five short years. The same media covering the same generation, but instead of dismissing them as slackers they anoint them business titans and revolutionaries controlling the future of business, media and culture. With technology as their ally they will not rest until they've disintermediated anyone or anything inefficient. This group of rebels is on a mission, and their drive is matched only by their insane work ethic. Never a mention of slackers or generation X. The story that isn't being told in all of this is why a generation of slackers would suddenly create and drive one of the biggest paradigm shifts in the history of industry. Clearly part of this is a matter of perspective. The media giveth and the media taketh away, all in their desire to create sexy stories through polarization, generalization and, of course, exaggeration. Looking deeper into the issue, however, the fact is that a generation of young adults, having stumbled onto a new medium (the Internet), was smart enough to seize the opportunity, taking their own piece of the pie and leaving the dead to bury the dead (think: old media). What did we, as generation X, inherit in the early 90s? The remnants of a five-year, cocaine-infused party on Wall Street that ended in tears and a recession. Our generation wasn't filled with slackers; it was filled with such media-savvy, and media-saturated, individuals that we knew that participating in the existing paradigm would only result in low pay and long hours for some old-school company. Is it a coincidence that this same group of people are the ones who owned the media that obsessed over the slacker generation? Perhaps they hoped to guilt us into getting in line?
Equity is the revolution of our generation, as in having equity in the company you work for. This equity, in the form of stock options, is not on the same level as the equality that the 60s generation fought for, but it is certainly an evolution of that same movement. Don't believe the hype. JASON MCCABE CALACANIS is Editor and Publisher of Silicon Alley Daily, The Digital Coast Weekly, and Silicon Alley Reporter, and Chairman and CEO, Rising Tide Studios. LINKS: Silicon Alley Daily; The Digital Coast Weekly; Silicon Alley Reporter _________________________________________________________________ Raphael Kasper The Fact That There Are No Longer ANY Unreported Stories "Today's most important unreported story" may be the fact that there are no longer ANY unreported stories. To be sure, there are stories that are given less attention than some might think appropriate, and others that are inaccurate or misleading. But the proliferation of sources of information -- the Internet, myriad cable television stations, niche magazines, alternative newspapers -- makes it virtually certain that no occurrence can remain secret or unmentioned. The dilemma that faces all of us is not one of ferreting out information that is hidden but of making sense of the information that is readily available. How much of what we read and see is reliable? And how can we tell? In the not-so-distant past, we could rely, to an extent, on the brand name of the information provider. We all applied our own measures of credence to stories we read in the New York Times or in the National Enquirer. But who knows anything about the authors of what we read at www.whatever.com? Everything -- all that has happened and much that has not -- is reported. RAPHAEL KASPER, a physicist, is Associate Vice Provost for Research at Columbia University and was Associate Director of the Superconducting Super Collider Laboratory.
LINK: Columbia University Record _________________________________________________________________ Marney Morris The Consequences of Choices Made About The Internet Today "America, where are you going in your automobile?" Allen Ginsberg The years of 1939 and 1999 were snapshots in time revealing the world as it was and as it would be. At the 1939 New York World's Fair, General Motors' "Futurama" and Ford's "Road of Tomorrow" showcased freeways and spiral ramps scrolling around urban towers. The future was clear. America would rebuild its cities and highways to sing a song of prosperity and personal freedom. In 1999, a snapshot of the Internet revealed what was and what will be. Wires are still being strung. The e-commerce structure is still being built. And content is still incunabular. What is today's most important unreported story? That the choices made about the Internet today will have great consequences in the next century. As in the automotive age, exuberant times make it easy to forget that a bit of thoughtful design will profoundly influence the fabric of our future society. What's the issue? Access. Half the people in the US don't have computers, but 98% have TVs and 97% have phones. Why do they say they don't have computers? "Too complicated." Using a computer should be as easy as turning on a TV. And it could be. Computer interfaces should be self-explanatory. And simple. And they could be. The quality of information design in the US is declining. Good information design should reveal relationships about the information. It should make you smarter. Learning disorders are on the rise. It's not because we are getting better at diagnosing them. It's because we are creating them. All too often our textbooks are confusing or misleading. And that same lack of thoughtful design pervades the personal computer, and the Internet. Information design is a science that needs to underpin our society if we are going to remain democratic and vital.
The biggest difference between 1939 and 1999? The automobile was simple at the outset. It took years to make it complicated and inaccessible. Computers have been unnecessarily complicated since they began. It is hard to make things simple. But they could be. MARNEY MORRIS teaches interaction design in the Engineering department at Stanford University and is the founder of Animatrix, a design studio that has built over 300 interactive projects since 1984. Animatrix is currently creating Sprocketworks.com. LINKS: Animatrix; Sprocketworks.com _________________________________________________________________ Rupert Sheldrake The Rise of Organic Farming in Europe Once seen as a marginal enterprise of interest only to health food fanatics, organic farming is booming in Europe. Over the last 10 years, the acreage under organic management has been growing by 25 per cent per year. At present growth rates, 10 per cent of Western European agriculture would be organic by 2005, and 30 per cent by 2010. But in some parts of Europe the growth rates are even higher. In Britain, within the last 12 months the acreage more than doubled, but even so the surge in demand for organic food greatly outstrips the supply, and 70 per cent has to be imported. Most supermarket chains in the UK now carry a range of organic products, and the market is growing at 40 per cent per year. By the end of this year, nearly half the baby food sold in Britain will be organic. Why is this happening? It reflects a major shift in public attitudes, which are probably changing more rapidly in Britain than in other countries. First there was the trauma of mad cow disease and the emergence of CJD, its human form, contracted through eating beef. No one knows whether the death toll will rise to thousands or even millions; the incubation time can be many years. Then in 1999 there was the remarkable public rejection of genetically modified foods, much to the surprise of Monsanto and their government supporters.
Recent surveys have shown that the third of the public who now buy organic food do so primarily because they perceive it as healthier, but many also want to support organic farming because they think it is better for the environment and for animal welfare. The rise of organic farming and the continuing growth of alternative medicine are symptoms of a mass change in world view. Governments and scientific institutions are not at the leading edge of this change; they are at the trailing edge. A major paradigm shift is being propelled by the media, consumers' choices and market forces. A change of emphasis in the educational system and in the funding of scientific and medical research is bound to follow, sooner or later. RUPERT SHELDRAKE is a biologist and author of Dogs That Know When Their Owners Are Coming Home, And Other Unexplained Powers of Animals; The Rebirth of Nature; and Seven Experiments That Could Change the World, as well as many technical papers in scientific journals. He was formerly a Research Fellow of the Royal Society, and is currently a Fellow of the Institute of Noetic Sciences. He lives in London. LINK: Rupert Sheldrake Online _________________________________________________________________ Tom de Zengotita Linda Tripp's Makeover It's being covered as a publicity ploy by a Lewinsky scandal leftover -- that is, not much and certainly not seriously. But check out the pictures. This is a MAJOR makeover. It represents the culmination of a process we have been tracking for a while -- but in two different arenas of celebrity, the real and the Hollywood. Remember the new Nixon? Distant ancestor of the Al Gore remakes and of McCain and Bradley, the "story" candidates, and of every prominent real-life figure today who is now obliged to play some version of themselves.
On the other front, we have cases of performer resurrections in new guises, from the straightforward comebacks of Frank Sinatra and John Travolta to the Madonna and Michael Jackson makeovers to Garth Brooks's effort to recreate himself as a fictional celebrity whose name escapes me at the moment. Linda Tripp's makeover represents a consolidation, a fusion of these trends. This marks a moment when the possibility of making and remaking one's image collapses into the possibility of making and remaking oneself literally. And it's just the beginning... TOM DE ZENGOTITA, anthropologist, teaches philosophy and anthropology at The Dalton School and at the Draper Graduate Program at New York University. _________________________________________________________________ Denis Dutton The Gradual Growth of a Prosperous Middle Class in China and in India Few large-scale, gradual demographic changes can be expected to generate headlines. The exceptions are those which point toward catastrophe, such as the widespread belief a generation ago that the population bomb would doom millions in the third world. In fact, the most significant unreported story of our time does deal with the so-called third world, and it is the obverse of the panic about overpopulation. It is the story of the gradual growth of a prosperous middle class in China and in India. The story is truly dull: yes, millions of Indians can now shop in malls, talk to each other on cell phones, and eat mutton burgers and vegetarian fare at McDonald's. Such news goes against the main reason for wanting to cover Indian cultural stories in the first place, which has traditionally been to stress cultural differences from the West. That millions of people increasingly have a level of wealth that is approaching that of the middle classes of the West (in buying power, if not in exact cash equivalence) is not really newsworthy. Nevertheless, this development is of staggering importance.
Middle class peoples worldwide, particularly in a world dependent on global trade, have important values in common. They share the value they place on material comfort. They borrow living styles from one another. They appreciate to an increasing extent each other's cultures and entertainments. And they place an important value on social stability. Countries with prosperous middle classes are less likely to declare war on one another: they have too much to lose. In the modern world, war is a pastime for losers and ideologues; the middle classes tend to be neither. When I was a Peace Corps volunteer in India in the 1960s, I accepted the conventional belief that south Asia would experience widespread famine by the 1980s. My first surprise was returning to India in 1988 and finding that, far from moving closer to famine, India was richer than ever. Now in the computer age, and having abandoned the Fabianism of Nehru, India is showing its extraordinary capacity to engage productively with the knowledge economies of the world. China, too, will contribute enormously to the world economy of the 21st century. The story does not square with many old prejudices about the backward Orient, nor does it appeal to our sense of exoticism. But the emerging middle class of Asia will change the human face of the world. DENIS DUTTON teaches the philosophy of art at the University of Canterbury, New Zealand. He writes widely on aesthetics and is editor of the journal Philosophy and Literature, published by the Johns Hopkins University Press. He is also editor of the Web page Arts & Letters Daily. Prof. Dutton is a director of Radio New Zealand, Inc. LINK: Arts & Letters Daily _________________________________________________________________ David M. Buss Discrimination in the Mating Market Hundreds of stories are reported every year about discrimination, bias, and prejudice against women, minorities, and those who are different.
But there's a more pervasive, universal, and possibly more insidious form of discrimination that goes on every day, yet lacks a name or an organized constituency: discrimination on the mating market. Although there are important individual differences in what people want (e.g., some like blondes, some like brunettes), people worldwide show remarkable consensus in their mating desires. Nearly everyone, for example, wants a partner who is kind, understanding, intelligent, healthy, attractive, dependable, resourceful, emotionally stable, and who has an interesting personality and a good sense of humor. No one desires those who are mean, stupid, ugly, or riddled with parasites. To the degree that there exists consensus about the qualities people desire in mating partners, a mating hierarchy is inevitably established. Some people are high in mate market value; others are low. Those at the top, the "9's" and "10's," are highly sought and in great demand; those near the bottom, the "1's" and the "2's," are invisible, ignored, or shunned. Being shunned on the mating market relegates some individuals to a loveless life that may cause bitterness and resentment. As the rock star Jim Morrison noted, "women seem wicked when you're unwanted." Discrimination on the mating market, of course, cuts across sex, race, and other groups that have names and organized advocates. Neither men nor women are exempt. For those who suffer discrimination on the mating market, there exists no judicial body to rectify the injustice, no court of appeals. It's not against the law to have preferences in mating, and no set of social customs declares that all potential mates must be treated equally or given a fair chance to compete. But it's not just the rock-bottom losers on the mating market that suffer. A "4" might aspire to mate with a "6," or a "7" might aspire to mate with a "9." Indeed, it's likely that sexual selection has forged in us desires for mates who may be just beyond our reach.
The "7" who is rejected by the "9" may suffer as much as the "4" who is rejected by the "6." People bridle at attaching numbers to human beings and making the hierarchy of mate value so explicit. We live in a democracy, where everyone is presumed to be created equal. Attaching a different value to different human beings violates our sensibilities, so we tend not to speak in civilized company of this hidden form of discrimination that has no name. But it exists nonetheless, pervasive and insidious, touching the lives of everyone save those few who opt out of the mating market entirely. DAVID M. BUSS is s Professor of Psychology at the University of Texas at Austin where he teaches courses in evolutionary psychology and the psychology of human mating. He is the author of The Dangerous Passion: Why Jealousy is as Necessary as Love and Sex; The Evolution Of Desire: Strategies Of Human Mating; and Evolutionary Psychology: The New Science Of The Mind. _________________________________________________________________ Charles Arthur The Peculiar Feedback Loops - Both Negative and Positive - That Drive Media Reporting of Technological and Science Issues The most important unreported story, and perhaps one that is impossible to report, is about the peculiar feedback loops-- both negative and positive -- that drive media reporting of technological and science issues. In Britain, the science repoting agenda in the past year has been dominated by stories about genetically modified food and crops. Britons have rejected them, crops in experiments have been torn up (thus preventing the results of the experiments, which could show whether or not the crops had harmful effects, being produced). Supermarkets vie with each other to find some way in which they don't use genetically modified ingredients or crops. Newspapers run "campaigns" against genetically modified ingredients. 
There is an incredible positive feedback loop operating there, driving ever wilder hysteria -- at least amongst the media. Whether the public really cares is hard to ascertain. Meanwhile climate change, that oft-repeated phrase, is almost accepted as being right here, right now; to the extent that my news editor's eyes glaze over at the mention of more global warming data, more melting ice shelves (apart from "Are there good pictures?" A calving ice shelf can do it.) There is clearly a negative feedback loop running there. The only way to garner interest is to present someone or some paper which says it isn't happening. Which seems to me pointless, before Stephen Schneider jumps on me. But what is making those loops run in the way they do? Why doesn't genetically modified food get a negative loop, and climate change a positive one? What are the factors that make these loops run with a + or - on the input multiplier? Damned if I know how it all works. But I'll read about it with fascination. As we are more and more media-saturated, understanding how all this works looks increasingly important, yet increasingly hard to do. CHARLES ARTHUR is technology editor at The Independent newspaper. _________________________________________________________________ Delta Willis Weird Ape Fouls Planet In the spirit of tabloid headlines (Ted No Longer Fonda Jane) my Exclusive, Untold Story would be headed Weird Ape Fouls Planet. Granted, Bill McKibben got very close with his book The End of Nature, but there continues a pervasive denial that stops this story from honest resolution. First, of course, we'd rather not hear about it. At the Jackson Hole Wildlife Film Festival we discussed the dilemma of presenting the dreadful conservation stories without depressing the audience, which is another way of asking, Shall we not hit this nail on the head again? So beyond the various lobbies and doubts that obscure issues such as global warming, the media hesitate to offend, or to be redundant.
Secondly, it's a story I'd rather not research and write because it is a bummer. The first half I did attempt; I truly do think humans are quite a wonderfully weird, unique species, and the story of our place within the evolution of life on earth is fantastic (and for many, still unbelievable). But when it comes to our impact on the earth, the already palpable effects of over-population, I get bogged down in the details, or distracted by Untold Story Number Two (Equal Rights Amendment Never Passed U.S. Senate). For example: there was a rush by health insurers to provide for coverage of Viagra use, but not of contraception pills for women. Enormous pressures remain for reproduction, from social and religious ones (the Vatican Rag) to the biological cues that inspired John Updike (The Witches of Eastwick) to put these words in the mouth of Jack Nicholson: What a Bait They Set Up. In a roundabout way the fallibilities of being human were covered ad nauseam (Leader of Free World Impeached for Thong Thing) and maybe the reason that too will pass is because we would prefer to deny the power of these urges, on par with drugs and greed, plus ego, i.e. parking one's off-road 4-WD Range Rover in front of the Hard Rock Café. So I'm with Bugs Bunny, who said people are the strangest animals, because we have this ability to reason and yet that base stem of the brain, wherever it is located on your anatomy, tends to rule the day. Hollywood might be the only medium that can rattle our cage on such issues of perspective, truly seeing ourselves in context; journalism no longer seems capable of delivering profound, incisive news, unless you dare to have canines as sharp as those of Maureen Dowd. DELTA WILLIS is the author of The Hominid Gang: Behind the Scenes in the Search for Human Origins; The Leakey Family: Makers of Modern Science; and The Sand Dollar & The Slide Rule: Drawing Blueprints from Nature.
_________________________________________________________________ Tor Norretranders A Dollar a Week Will End World Hunger It's now a billion to a billion: Of the six billion human beings currently alive on this planet, one billion live with a daily agenda of malnutrition, hunger and polluted drinking water, while another one billion -- including you and me -- live lives where hunger is never really an issue. The numbers of really rich and really poor people on the planet now match. That makes the following piece of arithmetic very simple indeed: If all of us who are rich (in the sense that starvation is out of the question and always has been) want to provide the economic resources necessary to end hunger, how much should we pay? We assume that all existing government and NGO aid programs continue, but will be supplemented by a world-wide campaign for private donations to end hunger (feed your antipode). The cost of providing one billion people with 250 kilograms of grain every year is approximately $40 billion a year. That would seem to be a lot of money, but with one billion people to pay, it is no big deal: $40 a year! An even more moderate estimate is provided by the organization Netaid: Just $13 billion a year and the basic health and food needs of the world's poorest people could be met. With $50 billion a year as an estimated cost of ending world hunger, the expense for each well-off person is one dollar a week. It is the growth in the number of rich people on the planet, while the number of poor has not grown, that results in this favorable situation, unprecedented in human history. The advent of the Internet makes this proposal practical and conceptually clear: Living in a global village makes it meaningful to help end global hunger, just as the populations of most industrialized countries have already done on a national scale.
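The essay's arithmetic can be checked in a few lines. This sketch uses only the figures given above (one billion well-off donors, an upper cost estimate of $50 billion a year); the variable names are illustrative, not from the original.

```python
# Back-of-the-envelope check of Norretranders's dollar-a-week claim:
# one billion donors splitting the estimated annual cost of ending hunger.
donors = 1_000_000_000          # the "rich billion"
annual_cost = 50_000_000_000    # upper estimate, dollars per year

per_person_year = annual_cost / donors
per_person_week = per_person_year / 52

print(f"${per_person_year:.0f} per person per year")   # prints "$50 per person per year"
print(f"${per_person_week:.2f} per person per week")   # prints "$0.96 per person per week"
```

At the lower $40 billion estimate the same division gives the essay's $40 a year, or about 77 cents a week; either way, under a dollar a week per donor.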
The Internet provides a simple way of collecting the money (this writer broke the embargo on his own unreported story and sent $100 to www.netaid.org to pay the global tax to end hunger for himself and one child). The money flowing through organizations such as netaid.org and hungersite.org will attract public attention and scrutiny of their efficiency in turning money into food for the hungry. Also, the Internet makes it perfectly clear who should consider herself or himself part of the rich billion on the planet and hence pay a dollar a week: every user of the Internet. In a few years' time the number of users will be one billion, and we could see the end of hunger on this planet. Obviously, once the money to end hunger is available, all sorts of obstructions will appear before those in need are fed: bureaucracies, mafias, corruption, waste. But is it not then time that we deal with them? A very important effect of annual donations from a billion people is the resulting global awareness of the embarrassment involved in the present unnecessary situation. TOR NORRETRANDERS is a Danish science writer, lecturer and consultant based in Copenhagen. He is the author of The User Illusion: Cutting Consciousness Down to Size. His latest book, in Danish, is Frem i Tiden. _________________________________________________________________ Denise Caruso Maybe Media Is the Real Opiate of the People One of today's great untold stories -- or, I should say, it keeps trying to get itself told and is usually mercilessly thrashed or ignored entirely -- is the degree to which our behavior is manipulated and conditioned by media. Most everyone has heard about the studies (more than 200 at last count) that show a direct correlation between increased aggression and exposure to violence portrayed in media.
The most compelling of this research suggests that the visual media in particular -- television, movies and even video games -- employ psychological techniques such as desensitization and Pavlovian conditioning which change how we think about and react to violent behavior. Of course, the entertainment and advertising industries dismiss these studies, saying it's impossible that their little ol' movie or TV show or 30-second ad or point-and-shooter could actually influence anyone's behavior. That's what they say to Congress, anyhow, when they get called on the carpet for irresponsible programming. But how do their protestations square with the gazillion-dollar business of TV advertising, in particular? This is an industry which is based entirely on the proposition that it can and does, in fact, impel people to buy a new car or a new pair of shoes, to drink more beer or get online -- to do something different than they've been doing, in some shape or form. So one of those statements has to be a lie, and if you follow the money, you can make a pretty good guess which one. Once you are willing to consider this premise (and if you've read the studies and/or are willing to honestly observe your own behavior, it's pretty hard not to), it becomes apparent that a whole lot more than our attitudes toward violence may be influenced by visual media. Not long ago, for example, it occurred to me that the rising obesity rate of our TV-addicted population might actually have something to do with the fact that any given hour of programming will yield an infinity of food porn -- sexy, slender women shoving two-pound dripping hamburgers into lipsticked mouths, or normal-sized families cheerily gorging themselves at tables piled with giant lobsters and steaks and all manner of things oozing fat and sugar. 
I mentioned this theory of mine to a colleague last November and, remarkably, the very next day, a blurb in The New York Times' science section announced that a Stanford University study had correlated children's obesity with television watching, and that the American Institute for Cancer Research found that most Americans overestimated a normal-sized portion of food by about 25 percent. Neither of these two studies directly linked TV's food bonanza with overeating, but they do suggest a connection between what our eyes see and what our brains subsequently do with that information. It's then no giant leap to wonder whether the constant barrage of TV "news" and political programming -- from the Clinton-Lewinsky extravaganza to Sunday morning's meet-the-pundits ritual to the "coverage" of the latest batch of presidential hopefuls -- is another case of media desensitization in action. Could TV itself, the place where most Americans get their daily fix of news, actually be causing America's vast political ennui and depressed voter turnout? Have we become so anesthetized by what we watch that we require the specter of Jesse Ventura or Donald Trump as president to engage, even superficially, in the political process? The studies that correlate media exposure with a flattened cultural affect about violence would support that general premise. But as we know, correlation does not prove causation. To prove that TV "causes" violence, for example, you'd have to conduct a controlled, double-blind experiment which, if successful, would result in someone committing a violent act. The human subjects committee at any responsible research lab or university would never approve such an experiment, and for good reason. But it must be possible to set up a sufficiently rigorous, violence-free experiment to measure the actual neurological and behavioral effects of visual media. 
Wouldn't we all like to know what really happens -- what happens in our brains, what humans can be impelled to do -- as a result of spending so many hours in front of TVs and computers and movie screens? Considering the massive amount of visual stimuli that is pumped into our brains every day -- and the astronomical profits made by the industries that keep the flow going -- this seems like a story eminently worth reporting. DENISE CARUSO is Digital Commerce/Technology Columnist, The New York Times LINK: "Digital Commerce" Column Archive _________________________________________________________________ Phil Leggiere Appropriation of the Internet as an Effective and Powerful Tool of Large Scale Global Social Protest Buried beneath the blitz of news coverage of the rise of e-commerce and the emergence of the World Wide Web as a new focal point of consumerism (the most ubiquitous stories of the moment) is a potentially just as significant, still unreported, story: the appropriation of the Internet as an effective and powerful tool of large-scale global social protest. Most major mainstream broadcast and cable news coverage and commentary rather jadedly treated the WTO protests in Seattle last month as a fluke, a nostalgic hippie flashback. Their cynicism reflects not only the blinders of their Beltway mindsets, but the bias of their own, now challenged, media formats. For most of the past forty years, since broadcast television emerged in about 1960 as the primary deliverer (and definer) of news, political activism evolved in a kind of dependent relationship (which superficially some took to be a symbiotic one) to television. Intuitively, sometimes by instinct, sometimes, as students of McLuhan, quite consciously, activists of the civil rights and anti-Vietnam War movements, attuned to the persuasive power of the mediated image, learned to cast and craft their political protests at least in part as media politics.
Grass-roots organizing remained, as always, the essential underpinning of a viable social movement, but angling for a dramatic, visually intense slot on the nightly news (what Abbie Hoffman called "Becoming an Advertisement for the Revolution" or "Media Freaking") became a primary tactic, if not a full-fledged strategy. The power relationship, however, was always ultimately one-sided. Those who lived by the televised image could be easily squashed by the image gatekeepers, cancelled like a burnt-out sitcom or cops-and-robbers show once their novelty effect ebbed. And when "The Whole World" was no longer watching, communication was pretty easily squelched. What the WTO protests represent, far from Luddite know-nothing-ism (despite the handful of brick-throwing John Zerzan/Theodore Kaczynski "anarcho-primitivists" whom broadcast TV reflexively and inevitably locked in on as the TV stars of the event), is the first social protest movement created largely through and communicating largely via the Web. Which is to say the first, potentially at least, able to bypass the gatekeepers of mainstream media while reaching hundreds of thousands, perhaps millions, of participants/observers/sympathizers and others, globally, on an ongoing basis. This suggests that the populist cyberpunk roots of Net BBS's are surviving and even flourishing alongside the corporate branding the Web is undergoing. With due apologies to the great writer Bruce Sterling (who advises us to retire cyber prefixes once and for all), I can't help thinking that, despite the apparent easy triumph of cyber-commercialization (the Web as global strip-mall), the next few years may also witness the blossoming of the first era of mass global populist cyber-protest. PHIL LEGGIERE is a free-lance journalist and book reviewer for Upside Magazine and several other publications. _________________________________________________________________ Leon M.
Lederman Survival Depends on the Race Between Education and Catastrophe A greatly underrated crisis looming over us was predicted by the futurist H. G. Wells. In about 1922 he commented that survival would depend on the race between education and catastrophe. The justification for this profound foresight can be seen in the incredible violence of this century we have survived, and the newfound capacity of mankind to obliterate the planet. Today, although political rhetoric extols education, the educational system we have cleverly devised, in part a product of the wisdom of our founding fathers, defies reform. It is a system incapable of learning from either our successes or our failures. How many parents and policy makers know that the system for teaching science in 99% of our high schools was installed over 100 years ago? A National Committee of Ten in 1893, chaired by the president of Harvard, recommended that high school children be instructed in science in the sequence Biology, then Chemistry, and then Physics. The logic was not wholly alphabetical, since Physics was thought to require a more thorough grounding in mathematics. Then came the 20th century, the most scientifically productive century in the history of mankind. Revolutions in all these and other disciplines have changed the fundamental concepts and have created a kind of hierarchy of sciences: the discovery of the atom, quantum mechanics, nuclear sciences, molecular structures, quantum chemistry, earth sciences and astrophysics, cellular structures and DNA. To all of this, the high school system was unmoved. These events and pleas to high school authorities from scientists and knowledgeable teachers went unheeded. The system defies change. We still teach the disciplines as unconnected subjects, with ninth grade biology a chore of memorizing more new words than 9th and 10th grade French together! This is only one dramatic example of the resistance of the system to change.
Our well-documented failure in science education is matched by failures in geography, history, literature and so forth. "So what?" critics say. "Look at our booming economy. If we can do so well economically, our educational system can't be all that important." Here is where appeal to H. G. Wells' insightful vision enters. The trend lines of our work force are ominous. Increasing numbers of our citizens are cut off from access to the technological components of society, are alienated, and are condemned to scientific and technological illiteracy. We have, by this process, solidified and increased the gap between the two classes of our culture. And the formative elements of culture outside of school -- TV, cinema, and radio -- strongly encourage this partition. Look at the social (as well as economic) status of teachers. Most parents want the best teachers for their children, but would bridle at the suggestion that their children become teachers. The penalties of continuing to graduate cultural illiterates (in science and the humanities) may not be evident in year-2000 Wall Street, but they are troubling the leaders of our economic success, the CEO's of major corporations, who see a grim future in our workforce. Can we continue to import educated workers? As the low-level service jobs continue to give way to robots and computers, the needs are increasingly for workers who have high-level reasoning skills, which a proper education can supply to the vast majority of students. But what is it that threatens "catastrophe" in the 21st century? Aside from the dark implication of a hardening two-class system, there is a world around us that poses global challenges to society, and their solutions require a large popular consensus.
Global climate change, population stabilization, the need for research to understand ourselves and our world, the need for extensive educational reform, support for the arts, preservation of natural resources, clean air and water, clean streets and city beautification, preservation of our wilderness areas and our biodiversity -- these and other elements make life worth living, and cannot sensibly be confined to enclaves of the rich. You don't have to be a rocket scientist to construct catastrophes out of a failed educational system. LEON M. LEDERMAN, the director emeritus of Fermi National Accelerator Laboratory, has received the Wolf Prize in Physics (1982), and the Nobel Prize in Physics (1988). In 1993 he was awarded the Enrico Fermi Prize by President Clinton. He is the author of several books, including (with David Schramm) From Quarks to the Cosmos: Tools of Discovery, and (with Dick Teresi) The God Particle: If the Universe Is the Answer, What Is the Question? LINKS: The Story of Leon; Leon M. Lederman Science Information Center _________________________________________________________________ Peter Schwartz The Dramatic Fall in the Rate of Growth in Global Population My candidate for most important unreported story is the dramatic fall in the rate of growth in global population. Instead of hitting 20, 30 or even 50 billion as was feared only a few years ago, with all the associated horror, it is likely to reach between ten and eleven billion by mid-century. The implications for the carrying capacity of the planet are profound. PETER SCHWARTZ is an internationally renowned futurist and business strategist. He is cofounder and chairman of Global Business Network, a unique membership organization and worldwide network of strategists, business executives, scientists, and artists based in Emeryville, California.
He is the author of The Art of the Long View: Planning for the Future in an Uncertain World and coauthor (with Peter Leyden & Joel Hyatt) of The Long Boom. LINK: Global Business Network; The Long Boom Home Page _________________________________________________________________ Carlo Rovelli The End of the Dream of a More Gentle World This unreported story is not very up to date. It is perhaps not even unreported (everything is reported -- does anything exist if it isn't reported?). And I do not know how "important" it is. Importance is determined by what is perceived to be important, and what is perceived as important is reported. What remains is what will be viewed as important by our descendants -- but let us leave them the burden of that choice -- and what is perceived as important by individuals or groups. I have been reading the other "most important unreported stories" in this fascinating collection of answers, and I have been surprised how much the perspective of importance matches the specific interests, or the personal history, of the writer. So, I shall allow myself to be absorbed by the same soft self-indulgence... The most important under-reported story I want to talk about is the old story of the great dream of a more gentle world. A more just world, not based on the competition of everybody against everybody, but based on sharing, the world as a collective adventure. The dream is dead. Killed by the simple fact that most of the time -- and this one is no exception -- the stronger and more aggressive one wins and the gentle one loses. Killed by the fears it generated in the privileged. Killed by its own naivete, by the horrors its unrealism generated, by its incapacity to defend itself from the thirst for power hidden inside. And killed by the thousands of reported and over-reported stories of its sins. The new century rises over a dreamless new world, with richer riches and with the desperate by the millions.
This is reality and we go with it, its imperfections and its promises, which are not small. But the dream is dead. It has moved generations, it has inspired peoples, made countries, it has filled with light the youth of so many of us, from Prague to San Francisco, from Paris to Beijing, from Mexico City to Bologna. It has led some to the country, some to take arms, some to let their minds fly, some to Africa, some to join the party, some to fight the party. But it was for humanity, for a better future for everybody; it was pure, beautiful and generous. And real. Stories do not go unreported because they are kept hidden, but simply because there are different perceptions of reality and of importance. Perception of reality changes continuously. What was there and big may then vanish like a strange morning dream, leaving nothing but confused undecoded traces and incomprehensible stories. A strong ideology believes itself to be realist, and calls its reported stories "reality". The old dreams are transformed into irrational monsters, then they go unreported, then they have never existed. CARLO ROVELLI is a theoretical physicist, working on quantum gravity and on the foundations of spacetime physics. He is professor of physics at the University of Pittsburgh, PA, and at the Centre de Physique Theorique in Marseille, France. LINK: Carlo Rovelli Home Page _________________________________________________________________ Timothy Taylor The sexual abuse of children by women While I was writing about libidinousness among female primates for my book The Prehistory of Sex: Four Million Years of Human Sexual Culture, a friend told me that she had been sexually abused by her mother. My research had helped her cut through the cultural myth that only men could be sexually violent. Since then five more people have told me that as children they were sexually abused by females (not all by their own mother -- adult relatives and unrelated persons figure too).
A seventh person believes she may have been abused by a female, but her memory is clouded by later abuse by a male. The seven come from a range of social and ethnic backgrounds; three are men and four are women. None of them has had any form of memory-tweaking therapy, such as hypnotic regression. Indeed only two of the seven have mentioned their abuse to a doctor or therapist (as compared with two out of the three people I know who were abused by males). Each abused child grew up in ignorance of others, in a culture in which their kind of story was not told. As with abuse by males, the psychological effects are profound and long-term. The ability to name what happened is thus won with difficulty and has come only recently to each of the victims I know: maturity and parenthood, supportive friends, and the simple realization that it can actually happen, have all played a part. Whatever its biological and cultural antecedents -- poor parent-infant bonding, the urge to control and dominate, repression, hidden traditions of perversion, etc. -- the truth of abuse by males has only recently been accepted, and the extent of it probably remains underestimated. By contrast, abuse by females is almost totally unreported outside specialist clinical literature. Successful criminal prosecutions, rare enough for the former, are almost unheard of for the latter except where they comprise part of more unusual psychopathic crimes (such as the torture and murder of children). But there is nothing inherently implausible about there being as many female as male paedophiles in any given human community. That women paedophiles have been a systemic part of recent social reality is, in my view, today's most important unreported story. TIMOTHY TAYLOR teaches in the Department of Archaeological Sciences, University of Bradford, UK, and conducts research on the later prehistoric societies of southeastern Europe. 
He is the author of The Prehistory of Sex: Four Million Years of Human Sexual Culture. _________________________________________________________________ Daniel Pink Maslow's America Time's "Person of the Year" should have been Abraham Maslow. The great psychologist is the key to understanding the biggest economic story of our day -- a story that's been obscured by stock tickers crawling across the bottom of every television screen, by breathless magazine covers about dot-com-fired insta-wealth, and by the endless decoding of Alan Greenspan's every emission. Deep into the middle class, Americans are enjoying a standard of living unmatched in world history and unthinkable to our ancestors just 100 years ago. This development goes well beyond today's high Dow and low unemployment rate. (Insert startling factoid about VCRs, longevity, car ownership, antibiotics, indoor plumbing, or computing power here.) And demographics are only deepening the significance of the moment. Roughly seven out of eight Americans were not alive during the Great Depression -- and therefore have no conscious memory of outright, widespread, hope-flattening economic privation. (Note: long gas lines and short recessions don't qualify as life-altering hardship.) As a result, the default assumption of middle-class American life has profoundly changed: the expectation of comfort has replaced the fear of privation. Enter Maslow and his hierarchy of needs. I imagine that at least a majority of Americans have satisfied the physiological, safety, and even social needs on the lower levels of Maslow's pyramid. And that means that well over 100 million people are on the path toward self-actualization, trying to fulfill what Maslow called "metaneeds." This is one reason why work has become our secular religion -- and why legions of people are abandoning traditional employment to venture out on their own.
(It's also why I guarantee that in the next twelve months we'll see newsmagazine stories about despondent, unfulfilled "What's It All About, Alfie?" Internet millionaires.) What happens when life for many (though, of course, not all) Americans -- and ever more people in the developed world -- ceases being a struggle for subsistence and instead becomes a search for meaning? It could herald an era of truth, beauty, and justice. Or it could get really weird. DAN PINK, a contributing editor at Fast Company and former chief speechwriter to Vice President Al Gore, is completing a book on the free-agent economy. LINK: Pink's Free Agent Nation Website _________________________________________________________________ Robert Hormats The Information Revolution Requires A Matching Education Revolution Today's most important unreported story is that for many millions of people in the industrialized and developing countries, education and training are not keeping up with the information technology revolution. As the world enters the 21st century we need more robust education and training, benchmarking to ensure that educational systems provide the skills needed for this new era, and resource commitments that recognize that educational investments are critical to economic prosperity and social stability in this new century. If the benefits of the information technology revolution are to be broadly shared, and its economic potential fully realized, a far greater effort is required during and after school years to enable larger numbers of people to utilize and benefit from new information technologies. Failure to do this will widen the digital divide and the income gap within and among nations, sowing seeds of social unrest and political instability.
It also will deprive our economies of the talents of many people who could make enormous contributions to science, medicine, business, the arts and many other fields of endeavor were they able to realize their full educational and professional potential. The goal of our societies should be not only to be sure schools and homes are wired and online -- itself a critical infrastructure challenge -- but to provide education and training programs so that larger and larger numbers of people at all income levels can use these new technologies to learn and create during their school years and throughout their lives. For the US, whose population is steadily aging, this means ensuring that older citizens have greater training in the use of these new technologies. And it means that younger Americans, especially minorities who will become an increasingly significant portion of the 21st century workforce, have far greater education and training in the use of information technologies than many do now. The better trained they are, the better positioned they will be to contribute productively to the US economy -- empowered by these new technologies. In the emerging economies, IT education is an important part of their evolution into dynamic participants in the global information economy, attracting more and more investment based not only on low labor costs or large domestic markets but also on their innovativeness and ability to adapt to a world where more and more high-quality jobs are knowledge-based. In much of Asia the financial crisis received so much attention that much of the world paid little attention to the dynamic changes in the information technology sector taking place in the region; impressive as that is, it can be even more impressive as greater investment in human capital expands the number of information-technology-savvy citizens in these countries and thus broadens the base of high-tech prosperity.
In the least developed economies, IT education should be a top priority. It is greatly in the world's interest that they be able to achieve their full economic potential. A substantial amount of international support from the private sector and governments will be needed. This can both prevent these nations from falling further behind and unlock the innovative potential of their peoples. An education revolution in industrialized, emerging and developing nations is needed to keep up with and realize the full potential of the information technology revolution. We should not become so enamoured of technology that we ignore the human dimension that is so critical to its success and to the social progress that these technologies have the potential to accelerate. ROBERT HORMATS is Vice-Chairman of Goldman Sachs International. LINK: Goldman Sachs _________________________________________________________________ Jaron Lanier The End of Clausewitzian War Prior to about twenty years ago, wars could almost always be understood as depressingly rational events perceived by instigators as being in their own self-interest. Certain recent wars and other acts of organized violence are astonishing in that they seem to break this age-old pattern. A striking example is the series of awful confrontations in the former Yugoslavia. If it were only an evil strongman, a Slobodan Milosevic, who instigated the bloodshed, events would have kept true to the old established pattern. Many a leader has instigated conflict, conjuring a demonized foreign or domestic enemy to rouse support and gain power. But that is not really what happened in the case of Yugoslavia. In the past, the demons were accused of posing a material threat. Hitler claimed the Jews had taken all the money, for example. Yes, he claimed they (we) were morally degenerate, etc., but that alone would perhaps not have roused a whole population to support war and genocide. The material rationale seemed indispensable.
By contrast, in Yugoslavia a large number of both middle-level leaders and ordinary citizens, not limited to the Serbs or any other single group, rather suddenly decided to knowingly lower their immediate standard of living, their material prospects for the foreseeable future, their security, and their effective long-term options and freedoms in order to reinforce a sense of ethnic identity. This is remarkably unusual. While ethnic, religious, and regional movements have throughout history sought political independence, they have almost never before resorted to large-scale violence unless economic or some other form of material degradation was a critical motivation. Had the English Crown been more generous in the matter of taxation, for instance, it might well have held on to the American Colonies. It is often pointed out that the cultural context for conflict in the Balkans is extraordinarily old and entrenched, but there are awful psychic wounds in collective memory all over the world. There are plenty of individuals who might under other circumstances be drawn once again into conflict over the proper placement of the border between Germany and Poland, for example, but there is absolutely no material incentive at this time to make an issue of it, and every material incentive to live with the situation as it is. Similarly, if an uninformed, uneducated population had burst into violent conflict on the basis of bizarre beliefs that the enemy posed a serious threat of some kind, perhaps abducting children to drink their blood, then that would have kept to the historical pattern as well. Neither Von Clausewitz nor any other theorist of war has claimed that war has always in fact been in the self-interest of perpetrators, only that it was perceived to be so. But Yugoslavia was a nation that was relatively prosperous, well educated, and informed. Yugoslav society was not closed or controlled to the extent of other contemporary nations formed upon related ideologies.
There were relatively open borders and extensive commerce, tourism, and cultural contact with the West. And Yugoslavia was not Germany between the wars. Yugoslavs were not humiliated or frustrated relative to other populations across their borders. The material conditions were critically different. There was no sense of hopeless economic disintegration, no reason to think, "Even war would be better than this, or at least a risk worth taking." Before Yugoslavia, war famously spared nations blessed with McDonald's hamburger franchises. The comforting common wisdom was that economic interdependence reduced the threat of war. Economic globalism was supposed to remove the material incentives for making war, and indeed it probably has done that. In former Yugoslavia, an upwelling of the need for absolute identity trumped rational, material self-interest. This phenomenon can also perhaps be seen in some instances of the rise of Islamic militancy. The recent rise of violent events perpetrated in the name of "traditional" identities, values, and beliefs is startling. Once again, such violence has always existed, but almost always before it has been coupled with a component of material motivation. The Biblical Israelites were enslaved and subjected to economic abuse, for example. The fundamentalists who attack abortion clinics seek no improved material prospects. Neither do the Taliban. Or the bombers of the Federal Building in Oklahoma City. In all these cases, identity has become more important than wealth, and that is new. Another possible explanation that haunts me is that the human spirit cannot cope with the changes technology makes to human identity. This can be as simple as MTV blasting into the lives of children who otherwise would never have known the meaning of spandex, piercing, or whatever is in fashion on a particular day.
Any thinking person, though, must know that the changes to the human condition wrought by such technologies as MTV, or even abortion and birth control, are mere whispers compared to the roar of changes that will soon come to pass. JARON LANIER, a computer scientist and musician, is a pioneer of virtual reality, and founder and former CEO of VPL. He is currently the lead scientist for the National Tele-Immersion Initiative. Further reading on Edge: Chapter 17, "The Prodigy," in Digerati LINKS: Jaron Lanier's Home Page; The National Tele-Immersion Initiative _________________________________________________________________ Steve Quartz The Coming Transformation in Human Life and Society in the Post-Genomic World Although there hasn't been any shortage of stories on genes in the press, public dialogue hasn't even begun to seriously consider how radically genetic technologies will alter human life and society -- and probably much sooner than we think. Forget cloning -- the pace of the Human Genome Project, combined with the emerging dominance of market forces in dictating how spinoff technologies from gene therapy to the engineering of novel genes will be utilized, suggests that we'll soon be able to retool human life (altering human traits from life history -- aging, reproduction -- to intelligence and personality). We haven't really begun to consider the enormous implications these technologies will have for the design of human society and social policy, from the family unit to education and work. My bet is that feasible technologies to retool human life will put us face to face with the basic dilemma of deciding what it means to be human within two decades. STEVEN QUARTZ is a professor in the division of Humanities and Social Sciences and the Computation and Neural Systems Program at Caltech. He is the author, with Terrence Sejnowski, of the forthcoming Who We Are: How Today's Revolutionary Understanding of the Brain is Rewriting Our Deepest Beliefs About Ourselves.
_________________________________________________________________ Robert R. Provine The Walkie-Talkie Theory: Bipedalism Was Necessary For Human Speech Evolution Speech is a byproduct of the respiratory adjustments associated with walking upright on two legs. With bipedalism came a secondary and unrecognized consequence, the respiratory plasticity necessary for speech. Quadrupedal species must synchronize their locomotion and respiratory cycles at a ratio of 1:1 (strides per breath), a coupling required by the shared, rhythmic use of the thoracic complex (sternum, ribs, and associated musculature), and the need to endure impacts of the forelimbs during running. Without such synchronization, running quadrupeds would fall face first into the dust because their thorax would be only a floppy air-filled bag that could not absorb the shock of forelimb strikes. Human bipedal runners, free of these mechanical constraints on the thorax, employ a wide variety of phase-locked patterns (4:1, 3:1, 2:1 [most common], 1:1, 5:2, and 3:2), evidence of a more plastic coupling between respiratory rhythm and gait. The relative emancipation of breathing from locomotion permitted by bipedality was necessary for the subsequent selection for the virtuosic acts of vocalization we know as speech. The contribution of bipedality to speech evolution has been neglected because linguists typically focus on higher-order cognitive and neurobehavioral events that occur from the neck up and overlook the neuromuscular processes that produce the modified respiratory movements known as speech. ROBERT R. PROVINE is Professor of Psychology and Neuroscience at the University of Maryland Baltimore County, where he studies the development and evolution of the nervous system. The walkie-talkie theory is presented in his forthcoming book Laughter: A Scientific Approach.
_________________________________________________________________ James Bailey The Spread of Universal Visual Literacy Beneath Keith Devlin's "Death of the Paragraph" lies a deeper and even less reported story: the spread of Universal Visual Literacy. Visual Literacy is the ability not just to understand knowledge in visual form but also to create it. Future generations of scientists (and poets) are growing up with Photoshop in their fingertips. To them, a conjunction is a video fade or wipe as much as a but or a yet. A modifier is a texture on a 3D model as much as an adverb. With the (under-reported) close of the Gutenberg Era, young scholars are no longer constrained to old textual modes of communication. With the aid of new electronic tools for expressing knowledge visually, they will go back and forth with a facility unknown since Leonardo. The reason this matters hugely is that visual modes of knowing can accurately apprehend and communicate realities that are parallel, whereas paragraphs and equations force us to pretend, with Descartes, that life really happens in the single-step-at-a-time sequences that the printing press demands. In the Gutenberg Era, the master scientific concept was the equality of two strings of symbols on a printed page. For young scientists growing up today, the master scientific concept is the all-at-once docking of one molecular shape onto the binding site of another on a computer screen. Perhaps the most egregious example of using the old sequential concepts of the Gutenberg Era to try to express parallel reality is our current enthusiasm for the lame assertion that life is speeding up. Here is a candidate for the most over-reported story of our time. As a culture we are stringing together whole bookloads of paragraphs trying to apply the centuries-old sequential concept of speed to whatever is going on right now, because that is the best that text seems to be able to do. 
Count on today's fourth-grader, with her iBook and her Chime plug-in in her backpack, to do a whole lot better some day. But by the time she finishes high school, she will still be able to understand our old ways, because, along with her daily biology and art classes, she will humor her physicist parents and take two days a week of Algebra Appreciation.

JAMES BAILEY is an independent scholar focusing on the impact of electronic computing on the overall history of ideas. He is the author of After Thought: The Computer Challenge to Human Intelligence.

_________________________________________________________________

Henry Warwick

My friend, John Brockman, asked me, "What is the most important unreported story of the year?" Deprived of sleep and somewhat dulled by holiday festivities, I had no reaction except a mumbled "Damned if I know..." I immediately set out to understand this question (which soon dominated My Every Waking Hour of my Holiday Vacation), and in order to get my mind around the question and all it implied, I would need to do some research beyond the most propitious mixture of rum and eggnog, and how to cook a turkey dinner for nine... I basically asked almost everyone I met, making a (typically) cheerful nuisance of myself. The results were most interesting, and I quickly found that the results of my research, like many of the previous responses, were also conditioned by the world views that obtained given the career choices and life objectives of the people I asked. Most of the people I asked, deprived of sleep and dulled by the holiday festivities, shrugged and said "Damned if I know..." I found such informal results less than satisfactory. Over more "eggnog" I came to the conclusion that I should ask people who might actually know. The next day, I talked to people in the news trade, figuring that, if they publish the stories that do get reported, they would certainly know what doesn't get reported.
I contacted a number of people in this regard, among them: an editor of a major San Francisco weekly newspaper, a writer for a major San Francisco daily newspaper, a photo editor for another daily, and a writer for a weekly newspaper and the internet, located near Seattle. Four people from completely different backgrounds, and they all said (basically) the same thing: "How can it be that there is incredible poverty amidst incomparable wealth, so often resulting in homelessness?" This threw me for a loop, because I didn't anticipate unanimity from such a diverse lot. What also struck me was that I felt they were wrong. While I do think poverty deserves a greater examination, and is certainly an important issue, I don't feel that it is particularly "unreported," much less unknown. Anyone who lives in an urban center in America (and many other countries, for that matter) knows about the reality that is poverty and homelessness. I also felt I had to discount their answer, to a certain degree. For one thing, they spend much of their time reporting on the headline-eating news: the acts, both dastardly and venal, of society's misfits, madmen, and squalid criminals, both elected and otherwise. These people are journalists, and journalists, especially American journalists, have a tradition, bordering on an archetype, of being the voice for the voiceless, the muckraker, the fourth estate, the ever critical conscience of a secular society. This would make their odd unanimity explainable, and, to a degree, underscore the value and gravity of their choice. But -- John didn't ask them, he asked me. And what do I know? Enough to make me a worthy opponent at most trivia games. Enough that I'm not homeless. Yet. Are the growing ranks of the homeless and poor amidst our ever deepening sense of prosperity and wealth the answer to Brockman's question? Or is it something broader and deeper?
Ever since there have been small privileged classes of the rich and/or powerful, there have been the endless ranks of peasants and proles, microserfs and burgerflippers, all of them struggling to feed their children, and then there have been those who look up to peasants and proles, microserfs and burgerflippers: the misfits, the madmen, and the squalid criminals, both elected and otherwise. Perhaps that's an important untold story: the grand parade of society's faceless "losers," the peasants and refugees fleeing some obscene tyrant and his witless army of cannon-fodder dupes and cruel henchmen, and why on earth do they all buy into this fiction we call "Civilisation"? Or is it less of a fiction than one might imagine, and simply the natural product of a status-conscious primate whose every activity is amplified and processed by its symbolic language center? Does reason in human relations extend only as far as the highly codified and ritualistic systems of voting and criminal justice? Can the objectivity of science ever be used to develop social and economic systems that will eliminate injustice and poverty? Or, I wondered, is such a quest based in an outmoded sociopolitical messianic teleology? Are we fated to forever step over the prone bodies of those less fortunate or healthy? If the answer to "When will the horror ever end?" is "Never," then the big unreported story of the year is the true loss of "Utopia" and the evisceration of the humanist's hope by the knives of history and a scientifically informed realism. Should we then also apply the logical conclusions of the Copernican revolution to our own human existence?
With a decentered earth, sun, galaxy, and now, if some theories are correct, a decentered Universe, it is now logical that we should apply the lens of decentering to ourselves, our civilisations and cultures, and to our actions, both collective and personal. Perhaps that's the most important unreported story of the year -- we're really not "The Story" any more, and what we do is likely of little, if any, consequence. Are we, as persons and a species, merely bit players in a peculiar performance, improvising before an empty house, with all of our preening culture and posturing civilised rhetoric but a vain and oddly comical conceit? On this tiny planet of water, trees, and concrete -- are we small participants in a giant multiverse that is less moving material incidentals expressing an equation of variables and constants, and more a growing, blooming, beautiful, if very slowly dying, flower? A Flower?

HENRY WARWICK is an artist, composer, and scientist whose formal education consists of a BFA from Rutgers University in visual systems studies, a major of his own invention. He lives in San Francisco, and his works can be appreciated at www.kether.com.

LINKS: Henry Warwick's Website: kether.com

_________________________________________________________________

Julian Barbour
a) The Extraordinary Proliferation of Jobs and Careers; b) James P. Carse's Jewel of a Book, Finite and Infinite Games

The very notion of the "most important unreported story" makes it inevitable that any answer will be highly subjective. For can anyone honestly say they know for sure what is the most important thing in life? One might say that the very essence of life is growth and uncertainty about the direction in which it happens. I feel forced to look for striking unreported stories whose ultimate importance is inevitably unknown. Then I find that no story is completely unreported, only underreported compared with the standards I would apply.
In this line, I find that the existence of serious (though not conclusive) scientific evidence for the complete nonexistence of time has been strikingly underreported. But having just published a book on the subject (and also undergone Edge edition No. 60 on this theme), I do not feel like returning to it. I offer two substitutes. Though certainly reported, there is a feature of modern life that I feel should be given more prominence. It is the extraordinary proliferation of jobs and careers that are now open to people, both young and old, compared with my childhood over 50 years ago. I can gauge this especially well because of a piece of research that is worth mentioning on Edge. In the midst of the Second World War, but when it was already clear that the Allies would win, the British war cabinet decided that it should start thinking about how life should be improved for both the urban and the rural population. Task forces were set to work, researching the existing state of affairs. As it happened, the village 20 miles north of Oxford where I grew up and still live (South Newington) was chosen as the centre of an oblong region (measuring 4 by 6 miles on the Ordnance Survey map of Britain) that was to be studied in detail as typical of rural Britain. Every village in it was surveyed, and almost every person living in this region was questioned about their lifestyle and occupations. The outcome was a book and a film called Twentyfour Square Miles, made just after the war, when I was a boy of eight. Every few years, this film is shown in the village. What is most striking is the subsequent incredible mushrooming of career opportunities (and the large degree of equalization of prospects between the sexes). Back in 1945, boys could look forward to jobs in agriculture (quite a lot), as motor mechanics (perhaps 20 such jobs in the entire region), working in the aluminum factory in the nearby town of Banbury, and not much else.
For girls the options, apart from motherhood, were essentially limited to domestic service, working as salespersons, and secretarial or other clerking jobs. Probably only about one child in 20 would go on to higher education. The transformation in 55 years has been amazing -- and it seems to me to be accelerating. In fact, any reasonably bright young (or even relatively old) person in the region can now choose to follow more or less any career. There is some concern today that we are all becoming more and more stereotyped. It seems to me that the mere fact that we now engage in such a huge variety of occupations should largely offset any such danger of stereotyping. Now to my second offering. A highly original book not widely known must be an important unreported story. (The President of Edge can hardly disagree!). It turns out that the book I have in mind is actually quite widely known (it has about half a million sales worldwide, as I learned recently on the phone from its author) but seems to be almost completely unknown in the UK. I am a sufficiently chauvinistic Brit to think that the UK reading world is important, so here goes. I learned about the book from Stewart Brand's The Clock of the Long Now: It was a professor of religion, James P. Carse of New York University, who came up with the idea of "the infinite game". His 1986 jewel of a book, Finite and Infinite Games, begins, "A finite game is played for the purpose of winning, an infinite game for the purpose of continuing the game." This struck me as an extremely novel idea. I immediately ordered the book and was not disappointed. A slim volume written in the addictive aphoristic manner of Leibniz's Monadology and Wittgenstein's Tractatus, with both of which it most certainly can be compared, I think it is perhaps the most original perspective on life and the world I have encountered. 
In fact, I was so delighted that I logged onto amazon.co.uk and ordered 50 copies to send to friends as Christmas and New Year presents (along with a copy of this entry to serve as explanation for the said slim volume). My order had a gratifying effect on the Amazon sales ranking of Finite and Infinite Games: before my order it stood at 25499; two hours later it was 4287. I mention this in the hope that some high Amazon executive reads Edge and will ponder a somewhat less erratic way to measure the rankings, which fluctuate so wildly as to be almost useless. I think they need a Longer Now. I am writing this on 31st December 1999 while listening to BBC Radio 3's monumental day-long 2000-year history of music called "The Unfinished Symphony". The subtitle of Carse's book seems especially apt in this connection -- "A Vision of Life As Play and Possibility". Perhaps the idea of life as an everlasting game or an ongoing symphony is the most important unreported story. On the subject of slim volumes, the British photographer David Parker, who specializes in science studies and monumental landscapes with a trace of the human element (Utah mainly), recently told me the ultimate traveling-light story. Years ago he was trekking in Bolivia with a friend who takes traveling light to the extreme. The friend allowed himself just one book, the Tractatus. To lighten even that slim load, he would read one page each evening, tear it out and then deposit it carefully wherever his bed happened to be: beneath a stone if in the open, or in the bedside table to rival the Gideon Bible if in a hotel. One can hardly call the Tractatus an unreported story, but the method of dissemination is worthy of Professor Emeritus James P. Carse.

JULIAN BARBOUR, a theoretical physicist, has specialized in the study of time and inertia. He is the author of Absolute or Relative Motion? and The End of Time.
Further Reading on Edge: "Time In Quantum Cosmology"; Julian Barbour comments in the Reality Club on "A Possible Solution For The Problem Of Time In Quantum Cosmology" by Stuart Kauffman and Lee Smolin

LINK: Julian Barbour's Website

_________________________________________________________________

Verena Huber-Dyson
The Classification Theorem for Finite Simple Groups

It has been a great century for mathematical groups of all shapes and sizes! They have been part and parcel of our daily lives for the last couple of millennia or more. Apart from ruling our bureaucracy, groups of transformations and symmetries are the keepers of law and order among mathematical structures, the sciences and the arts -- as well as the sources of beauty. Every finite group admits a decomposition into a finite sequence of simple ones, similar to the prime factorization of integers. But it has proven spectacularly difficult to obtain a systematic grasp of the set of all finite simple groups. The first step was a 254-page proof in 1961 of a key conjecture dating back to the last century. Then a team of over 50 group theorists started snowballing under the leadership of Daniel Gorenstein, who, in 1980, announced the result of their labors, recorded in some 15,000 highly technical journal pages: every finite simple group belongs either to one of two fairly well understood infinite families, or to one of 16 less tractable infinite lists, or is one of the 26 so-called sporadic groups -- a strange, elusive lot whose discovery was a hunt full of hunches, surprises and insights, reminiscent of the chase for elementary particles. It was an extraordinarily fruitful proof. Within two years of its completion a conference was dedicated to applications in nearly all branches of mathematics. But beyond merely establishing truth, a good proof must illuminate and explain. At first ascent everybody is just glad to have scaled the peak. Then comes the search for a more elegant, easier, snappier route.
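The analogy with prime factorization can be made concrete with a standard textbook example (not drawn from the essay itself): the symmetric group S4, of order 24, has a composition series S4 > A4 > V4 > C2 > {e} whose successive simple quotients are the cyclic groups C2, C3, C2, C2, mirroring the factorization 24 = 2 x 3 x 2 x 2. A minimal illustrative sketch:

```python
# Composition series of S4 (order 24): S4 > A4 > V4 > C2 > {e}.
# The successive quotients are simple cyclic groups, and their orders
# multiply back to |S4|, just as prime factors multiply to an integer.
from math import prod

subgroup_orders = [24, 12, 4, 2, 1]  # |S4|, |A4|, |V4|, |C2|, |{e}|
factor_orders = [a // b for a, b in zip(subgroup_orders, subgroup_orders[1:])]

print(factor_orders)  # the simple factors C2, C3, C2, C2 by order
assert prod(factor_orders) == 24
```

The Jordan-Hoelder theorem guarantees that these simple factors are the same (up to order and isomorphism) for every composition series of a given finite group, which is why the classification of the simple groups described above amounts to a periodic table for all finite groups.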
By 1985 that second stage of the project was well under way. The "enormous proof" has set a new trend in mathematics. It is a True Tale of a Tower of Babel with a Happy Ending! All those mathematicians toiling side by side, though spread all over the globe, each with his own outlook, language, bag of tricks: constructing geometries, permuting objects, calculating characters, centralizing, fusing, signalizing and inventing all sorts of new terms for situations that had been lying there waiting to be recognized, named and used. They were not treading on each other's toes but collaborating in a prolific way unprecedented in the history of mathematics! This wonderful happening did provoke lively professional discussions but not much attention from the popular media. It is difficult to advertise, unintelligible without technical explanations, and lacking in historical romance. Much of the friendliness of the sporadic monster is lost on an audience gaping at its size, incapable of appreciating its capricious charms. And there are no melodramatic side shows; group theorists -- especially finite ones -- make up just about the sanest and nicest species among mathematicians. Glitter and glamour are not engendered by the laborious toil that went into the quest for a classification of the finite simple groups. Finally, the "second generation" has not yet completed its task, a good reason for holding back the popular fanfares. But it is a great story in progress to be carried over into the new millennium!

VERENA HUBER-DYSON received her Ph.D. in mathematics at the University of Zurich with a thesis in finite group theory, then, in 1948, came as a postdoc to the Institute for Advanced Study in Princeton. After a short but fruitful marriage she resumed her career, at UC Berkeley in the early sixties, then at the University of Illinois at Chicago Circle, and finally retired from the University of Calgary.
Her research papers on the interface between Logic and Algebra concern decision problems in group theory. Her monograph Goedel's Theorem: A Workbook on Formalization (Teubner Texte, Leipzig, 1991) is an attempt at a self-contained interdisciplinary introduction to logic and the foundations of mathematics.

_________________________________________________________________

Joseph LeDoux
Educational Inequities

As a nation we pay lip service to the idea that we're all created equal. But the rich are getting richer and the poor are getting poorer. Is this because the poor have bad brains that can't learn to do better, or because their brains never get the opportunity to learn? We know that even the best of brains needs input from the environment to form and flourish. So why do we allow schools in poor neighborhoods, as a rule, to be so much worse? The difference is less about race than about class. Shouldn't education be more standardized from neighborhood to neighborhood, city to city, and state to state? Improvement of educational opportunities wouldn't solve all the problems the poor face, but it is an obvious place to start. Critics of liberal social policy often claim that pouring money into a situation doesn't help. I'm not suggesting that the poor get anything extra, just that they get what others get. A decent education is a right, not a privilege.

JOSEPH LEDOUX is a Professor at the Center for Neural Science, New York University. He has written the most comprehensive examination to date of how systems in the brain work in response to emotions, particularly fear. Among his fascinating findings is the role of the amygdala structure within the brain. He is the author of the recently published The Emotional Brain: The Mysterious Underpinnings of Emotional Life, coauthor (with Michael Gazzaniga) of The Integrated Mind, and editor (with W. Hirst) of Mind and Brain: Dialogues in Cognitive Neuroscience.
Further Reading on Edge: "Parallel Memories: Putting Emotions Back Into The Brain" -- A Talk With Joseph LeDoux on Edge

LINK: LeDoux Lab: Center for Neural Science Home Page

_________________________________________________________________

Todd Siler
Applying Our Lessons from the Nuclear Age to This Age of Molecular Biology and Genetic Engineering

The general public is wondering about the deep connection between these two Ages, in which we've glimpsed the power of atoms and are beginning to glean the power of genes. Welcome to the WISDOMillennium! There's no better time to tap our collective wisdom as we welcome new opportunities to work together toward advancing science, technology and society. In order to use our scientific insights into the nature of molecules and genes responsibly and thoughtfully, we need to hold in our conscious mind (and conscience) the key lessons from our more freewheeling experimental work on the atom, which has led to the development of increasingly sophisticated nuclear weapons. With some concerted effort, perhaps we can avoid repeating the mistake of mindlessly innovating, constructing and stockpiling weapons of mass destruction. Clearly, our boldest insights into the basic building blocks of biological matter can be abused in a similar manner, as we start to scratch that big itch of curiosity -- designing, for example, evermore exotic bioweapons with will-o'-the-wisp applications. Anyone with a sense of wonder can't help but marvel at those potentially mind-altering innovations in genetic engineering that keep rolling out of the laboratories and into our lives. All this excitement has a way of momentarily silencing our skepticism, as we tend to overlook the impact these innovative works may be having on the whole of human ecology. Somehow, we need to keep in the forefront of our wonderment "the thinking eye," as the Symbolist painter Paul Klee referred to that most essential element of creativity: higher awareness.
We need to look ahead -- cautiously and with full vision -- reviewing the times our eyes were "wide shut" to wanton exploits of scientific explorations. Given our penchant for reaching for the impossible -- and, occasionally, realizing "the unthinkable" (those worst case scenarios and experiences involving serious human error) -- perhaps the science community would be wise to do some collaborative, projective thinking and moral forecasting about the more questionable applications of experimental research in both Molecular Biology and Genetic Engineering; hopefully, this collaboration will take place before the world community is forced to face yet another flagrant act of poor judgment. Surely, there must be a way of supporting basic scientific research while safeguarding ourselves from what cynics gleefully call "the inevitable": you know the story (it's always the same story): some megalomaniac spearheads a group of clowns, or clones -- or cloned clowns -- who proudly announce their spectacular creation, a lethal new life-form. "Here, let us demonstrate how this new viral strain works...Oops! Gosh, we had no idea this creature would be quite so devastating." As the story goes -- some millions of deaths and apologies later -- another group of renegade researchers will try their hand at reengineering virtually every cell in our bodies, as they blindly attempt to morph the human spirit to fit an inhumane world. I mean, just imagine what would happen if we pushed aside bioethics and other governors of conscience, while envisioning what we want the human race to become as it "grows up"? Will we end up looking like amoebas, but thinking like gods? Or will we look like gods, but think like amoebas? "Inquiring minds want to know." There's an anxious public just waiting to be informed about the whole field of Molecular Biology and the prospects of The Genome Project. 
Instead of reporting on this field and Project in a straightforward manner -- treating the story as if it were merely another development in the history of ideas -- it needs to be seen laterally, against the backdrop and fallout of the Nuclear Age. Looking below the surface and politics of the Cold War that initially drove this Age, we need to drill down to see what lies at the core of human creativity in the service of its aggressive tendencies. Maybe this report could address one of the most perplexing aspects of human creative potential -- namely, how highly intelligent people can take good ideas and turn them into bad ones real fast, with really grave consequences. But what's a "good idea" anyway? I think The Genome Project is a great idea because of its enormous medical benefits. However, I fear the more radical possibilities of its broad applications -- particularly those that sway toward expanding the work on advanced weapon systems. I suspect we'll always be dealing with this dilemma of deciding what's a fair use of scientific knowledge, and what isn't; what's beneficial, and what isn't; what improves the human condition, and what doesn't. In responding to the public's escalating fears about our nuclear future, R. Buckminster Fuller once remarked: "There are five billion people on this planet and no one seems to know what to do." Have you heard those haunting words of Albert Schweitzer rumbling in the distance? "Man has lost the capacity to foresee and to forestall. He will end by destroying the earth," said Dr. Schweitzer. I gather from his pensive perspective that we're no longer opening Pandora's Box; we're living in it. The question remains: Do we always have to live in this Box? If so, what's the best way to live in it and flourish? In order to make our way with some peace of mind in this new Age of engineered atoms, molecules and genes, we need to quickly learn from the past.
That means figuring out how we're going to initiate policies in bioethics while encouraging greater social responsibility. Jacob Bronowski touched on this issue in his classic musing, Science and Human Values, prompting us always to consider the dimension of values in science that should never vanish from our view. We must take the time now -- in this fresh moment -- to envision ways of ensuring a healthy collective future. One step towards this end is to transfer our lessons from the Nuclear Age to this new Age, and then transform them. I trust we won't find this act of connection-making an exercise in futility. There's a dark line in Kurt Vonnegut's novel Hocus Pocus that may bleed into this new millennium, permanently staining it. In one sweeping definition, Vonnegut sums up the whole of 20th-century thought as "the complicated futility of ignorance." He proceeds to define high art as "making the most of futility." I'm wondering whether many scientists see how this edgy definition applies to high science as well. Is it so futile to summon our leading scientists and technological visionaries to strategize about the next steps for growing and applying our knowledge of molecules and genes? (This gathering might work if everyone checked their ego at the door and removed any Master of the Universe costume or attitude before "dancing naked in the mind field" [to borrow Kary Mullis's lively expression].) Finally, the following questions should be included in this unreported story, adding to the bonfire of challenging world questions: Is there one thing in particular that would help improve the state of the world? What is it, and who's working on it?

TODD SILER, artist, author, inventor, is the Founder and Director of Psi-Phi Communications, a company that provides creative catalysts and communication tools for breakthroughs and innovation in Fortune 500 companies and schools.
He recently presented his latest book, Thinking Like A Genius, at the World Economic Forum in Davos. He is the author of Breaking the Mind Barrier and co-author (with Patricia Ward Biederman) of "Creativity and A Civil Society," a commissioned report by the Institute for Civil Society.

_________________________________________________________________

David G. Myers
The Disconnect Between Wealth and Well-Being: It's Not the Economy, Stupid

Does money buy happiness? Few of us would agree. But would a little more money make us a little happier? Many of us smirk and nod. There is, we believe, some connection between fiscal fitness and emotional fulfillment. Most of us tell Gallup that, yes, we would like to be rich. Three in four entering American collegians -- nearly double the 1970 proportion -- now consider it "very important" or "essential" that they become "very well off financially." Money matters. Think of it as today's American dream: life, liberty, and the purchase of happiness. "Of course money buys happiness," writes Andrew Tobias. Wouldn't anyone be happier with the indulgences promised by the magazine sweepstakes: a 40-foot yacht, deluxe motor home, private housekeeper? Anyone who has seen Lifestyles of the Rich and Famous knows as much. "Whoever said money can't buy happiness isn't spending it right," proclaimed a Lexus ad. Well, are rich people happier? Researchers have found that in poor countries, such as Bangladesh, being relatively well off does make for greater well-being. We need food, rest, shelter, social contact. But -- the underreported story in our materialistic age -- in countries where nearly everyone can afford life's necessities, increasing affluence matters surprisingly little. The correlation between income and happiness is "surprisingly weak," observed University of Michigan researcher Ronald Inglehart in one 16-nation study of 170,000 people. Once comfortable, more money provides diminishing returns.
The second piece of pie, or the second $100,000, never tastes as good as the first. Even lottery winners, those whose income is much higher than it was 10 years ago, and the very rich -- the Forbes 100 wealthiest Americans, surveyed by University of Illinois psychologist Ed Diener -- are only slightly happier than the average American. Making it big brings temporary joy. But in the long run wealth is like health: its utter absence can breed misery, but having it doesn't guarantee happiness. Happiness seems less a matter of getting what we want than of wanting what we have. Has our collective happiness floated upward with the rising economic tide? In 1957, when economist John Kenneth Galbraith was about to describe the United States as the Affluent Society, Americans' per person income, expressed in today's dollars, was $8,700. Today it is $20,000. Compared to 1957, we are now "the doubly affluent society" -- with double what money buys. We have twice as many cars per person. We eat out two and a half times as often. Compared to the late 1950s, when few Americans had dishwashers, clothes dryers, or air conditioning, most do today. So, believing that it's very important to be very well off, are we now happier? We are not. Since 1957, the number of Americans who say they are "very happy" has declined from 35 to 32 percent. Meanwhile, the divorce rate has doubled, the teen suicide rate has nearly tripled, the violent crime rate has nearly quadrupled (even after the recent decline), and depression has mushroomed. These facts of life explode a bombshell underneath our society's materialism: economic growth has provided no boost to human morale. When it comes to psychological well-being, it is not the economy, stupid. We know it, sort of. Princeton sociologist Robert Wuthnow reports that 89 percent of people say "our society is much too materialistic." Other people are too materialistic, that is.
For 84 percent also wished they had more money, and 78 percent said it was "very or fairly important" to have "a beautiful home, a new car and other nice things." But one has to wonder: what's the point? "Why," wondered the prophet Isaiah, "do you spend your money for that which is not bread, and your labor for that which does not satisfy?" What's the point of accumulating stacks of unplayed CDs, closets full of seldom-worn clothes, garages with luxury cars -- all purchased in a vain quest for an elusive joy? And what's the point of leaving significant inherited wealth to one's heirs, as if it could buy them happiness, when that wealth could do so much good in a hurting world?

DAVID G. MYERS is the John Dirk Werkman Professor of Psychology at Hope College and, most recently, author of The American Paradox: Spiritual Hunger in an Age of Plenty.

Further Reading on Edge: "What Questions are on Psychologists' Minds Today?" David G. Myers.

LINK: David G. Myers Home Page.

_________________________________________________________________

Stephen H. Schneider
The Way Stories About Complex Scientific Controversies Are Often Unintentionally Misreported by the Mainstream Media

And since policy making to deal with such controversies calls for value judgments, and that in turn requires a scientifically literate public to telegraph their value preferences to leaders, miscommunication of the nature of scientific controversy has serious implications for democracy in a world of exploding complexity. In political reporting, it is both common and appropriate to "get the other side": if the Democrat gives a speech, the Republican gets comparable time/inches/prominence. This doctrine of "balance" -- which is still taught proudly in journalism schools in the U.S. -- is supposed to underlie the journalistic independence of the Fourth Estate. [And let's not forget that conflict packaged in sound-bite-sized chunks garners higher ratings than more circumspect reporting.]
But while journalists rightly defend the need for balance in truly bipolar stories, how many scientific controversies are really two-sided? More likely, there will be several competing paradigms and half a dozen marginal ideas kicking around scientific controversies. And when the issues have high-stakes political winners and losers -- like the global warming topic I work in -- it is to be expected that various special interests will compete for their spin. We've all seen media filled with the views of environmental catastrophists, technological cornucopians, ideological opponents of collective controls on entrepreneurial activities, or denial from industrial producers of products that pollute -- to name the usual prime players. And each often has its hired or favored PhDs handy with ready explanations and slick sound bites -- e.g., why carbon dioxide buildup in the air will be either catastrophic or good for you. Unfortunately, here is where a serious disjuncture occurs -- one largely unreported by the very people who bring us this daily show. For example, in the name of "balance", a 200-scientist, two-years-in-the-making refereed scientific assessment gets comparable space or airtime to a handful of "contrarian" scientists saying it "ain't so". When I challenge my media colleagues over this equal-time reporting, they accuse me of being against "balance". This parade of dueling scientists isn't remotely "balance", I respond, but utter distortion -- unless the journalist also reports the relative credibility of each quoted position. I call the latter doctrine "perspective" -- as opposed to what journalists label "balance". In science all opinions are decidedly not equal, and we spend the bulk of our effort winnowing the less from the more probable lines of inquiry. Moreover, when we are assessors, we are obligated to report whether our estimates of the likelihood of some set of hypothesized outcomes are based on objective rather than subjective odds. 
I don't have space to get into the "frequentist" versus "Bayesian" debate over what is ever "objective", but awareness of the issue is also part of what scientific literacy entails -- even for scientists. Nevertheless, I do agree it would be irresponsible not to cover minority opinions in media accounts of complex controversies. My concern comes when contradictory scientific opinions are offered without any attempt to report the relative credibility of these views. Then, the public -- and political leaders too for the most part -- are left to do that difficult assessment job themselves. More often than not the "dueling scientists" get equal time in the story, confusion sets in, and outlier opinions win equal status at the bar of public opinion with more widely accepted views. Of course, as Kuhn has taught us, once in a while someone comes along to overthrow the mainstream doctrine -- but we celebrate these paradigm busters primarily because they are rare, not commonplace. One well-known editor argued with me that to report scientific credibility "calls for a judgement on the part of the journalist, and that most reporters lack specialized qualifications to do that". "Without making judgments, how can they choose what to report and whom to quote?" I responded. "Why don't you get someone from the Flat Earth Society to 'balance' every space shot you cover -- isn't that a 'judgment' about their lack of credibility?" Of course, they could hire such specialists, but only a few major media outlets do -- and those reporters are decidedly not at the top of the respect hierarchy in corporate media. Science must always examine and test dissent, even if it takes a long time to reduce some uncertainties. But science policy needs to know where the mainstream is at the moment. 
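The frequentist-versus-Bayesian question Schneider alludes to can be made concrete with a toy sketch (hypothetical numbers, plain Python for illustration only): a frequentist reports the observed frequency as the probability, while a Bayesian folds the same data into a prior.

```python
# Toy illustration (hypothetical data): estimating a coin's bias
# after observing 7 heads in 10 tosses.
heads, tosses = 7, 10

# Frequentist answer: the observed relative frequency (maximum likelihood).
freq_estimate = heads / tosses

# Bayesian answer: start from a uniform Beta(1, 1) prior; the posterior is
# Beta(1 + heads, 1 + tails), whose mean is (1 + heads) / (2 + tosses).
bayes_estimate = (1 + heads) / (2 + tosses)

print(f"frequentist estimate: {freq_estimate:.3f}")  # 0.700
print(f"bayesian estimate:    {bayes_estimate:.3f}")  # 0.667
```

The two answers differ most when data are scarce -- exactly the situation in unsettled scientific controversies, which is why the choice of framework is itself a judgment worth reporting.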
My mantra to those seeking scientific literacy in order to address the implications of the debate is to remember to ask all competing claimants of scientific "truth" three questions: 1) "What can happen?", 2) "What are the odds?", and 3) "How do you know?" And if you intend to ask the third question, plan to have a pen and paper along and be willing to check references, for question 3) isn't a sound bite-length inquiry. In summary, most stories turn the doctrine of balance on its head by applying it too literally to complex, multi-faceted scientific debates. Then, the unreported story becomes that there actually are different probabilities that belong to each of the various positions covered, yet these conflicting positions appear in the story to be equally likely. STEPHEN H. SCHNEIDER is Professor in the Biological Sciences Department at Stanford University and former Department Director and Head of the Advanced Study Project at the National Center for Atmospheric Research in Boulder. He is internationally recognized as one of the world's leading experts in atmospheric research and its implications for environment and society. Dr. Schneider's books include The Genesis Strategy: Climate Change and Global Survival; The Coevolution of Climate and Life; Global Warming: Are We Entering the Greenhouse Century?; and Laboratory Earth. 
_________________________________________________________________ Eduard Punset The End of the Brain It is the best candidate for the major unreported event of the next century. Only people in need really do need a brain. Plants don't, and get along pretty well without it: photosynthesis alone largely fulfils all their requirements. Actually, now that we know that we share, for better or worse, similar DNA -- the instruction booklet that designs living organisms robust enough to ensure survival, but flexible enough to adapt to changing environments -- the missing brain is the only difference between plants and us. And as we'll quickly learn during the next century, it is nothing to be very proud of. Although it seems harder to define the differences within the brainy species themselves, including primates and other animals, it is rather surprising to find that -- in the history of human thought -- there has hardly been a single intellectual who has not condescended to share the collective and unending appraisal of the substantial differences between men and the rest. In fact, this debate has bored quite a few generations of learned hominids. Fortunately, it is about to end thanks to, above all other scientists, Lynn Margulis. Let me explain why. It has taken quite some time and arguing to show that most animals do indeed communicate and master reasonably evolved languages to that end. There is nothing terribly creative about the capacity to learn a language; as Steven Pinker pointed out, it is genetic, and could not be simpler, since it is digital. Men can do it; other mammals too. The only surprising thing about it is the sheer impotence of current scientific thought to unveil the basics of animal culture. Despite the fact that we share the same genes, tool-making also helped to substantiate the differences between hominids and chimps. Of course, a few of those genes are different, but we still don't know which of them actually makes the difference. 
The tool-making singularity, however, has not outlived the language exclusiveness. Like other people moved by curiosity, I have enjoyed looking at zoologist Sabater's collection of chimp sandals, hats, seed catchers and sticks for all sorts of widely different uses, such as beating, or carving the sand in search of fresh water during the dry season, instead of trying to drink from muddy soils. The identification of consciousness -- since scientists assumed twenty years ago that the scientific method could be extended to these domains, up to then left to superstition -- looked like the final argument. "We're conscious of ourselves. We know who we are. And they don't." It was the most serious argument ever put forward in our defense. It did not matter that chimps could also recognize themselves in a mirror; somehow they would not show the precise awareness, nor the same cognitive capacity to ruminate about oneself. Unfortunately, biologists like Lynn Margulis showed that bacteria -- as far back as two billion years ago -- could not manage their electric-like motors, nor their magnetic navigation systems, without some realization of what on earth they were building those ultramodern transport systems for. You just can't pretend any longer that bacteria are not conscious too. For those still interested in the old debate about the differences between the brainy species, let me remind you that the most avant-garde argument now runs something like this: only the descendants of Australopithecus have developed the capacity to generate symbols. Nobody can demonstrate when or how it happened; I myself am convinced that the whole thing started six thousand years ago, when people settled to work the land, and women had to leave their babies unattended, shouting all day long. But total allegiance to symbols like the San Francisco 49ers, the Serbian motherland, or the Manchester United colors is undeniably human. 
No chimpanzee would risk his life for these or similar symbols, nor for that matter leave his newborn unattended. Chimp mothers love to carry them. There at last is something which makes us really different from other animals. The capacity to generate symbols, and to blindly follow them, has indeed taken Homo sapiens a long way off from the brain's original purpose: to go in the right direction, and to anticipate a few questions. A very lucid New York physiologist attending a neuroscience congress last December at the birthplace of Ramon y Cajal actually told me he knows of a particular species that ends up eating its own brain once it settles in the right place and knows the basic answers. Could it not be that the brain has taken over a bunch of simple people who were only in need of a few addresses and of guessing what on earth was going to happen tomorrow? The World Health Organization is predicting that life expectancy will reach one hundred and twenty-five years very shortly. Neuroscientists should start worrying about the outcome of forty additional years with jammed brains immersed in the process of deepening their symbolic capacity, leaving at long last an unbridgeable and recognizable gap with plants and animals. Yet despite this distinctive capacity to generate symbols, some 25% of the population -- excluding criminals -- have serious brain dysfunctions, and most medical observers already agree that brain disorders will be the most serious health threat in the twenty-first century. The lucky 75% who will not be insane already know, according to the latest statistics, that more patients die as a result of practitioners' brains guessing wrongly about the nature and treatment of real or invented illness than people die on the roads and from heart failure combined. 
Thankfully, building a collective brain through the Internet should alleviate the stress of saturated individual brains, and help manage the lives of the great majority of people who have already been overwhelmed by too many choices regarding the path to follow and the answers to non-formulated questions, even under current life expectancy models. I'm afraid that quite a few of them will, however, regret the placid and constructive life of brainless plants. EDUARD PUNSET holds a Master of Science in Economics from the LSE and is Professor of Economic Policy at the Chemical Institute of Ramon Llull University in Barcelona. He was Chairman of the Bull Technological Institute, Professor of Innovation and Technology at Madrid University, and IMF Representative in the Caribbean. He actively participated in the Spanish political transition to democracy as Minister for Relations with the European Union, and Member of the European Parliament. He is currently Director and Producer of Networks, a weekly science programme on Spanish public television. His latest book is A Field Guide to Survive in the XXIst Century. _________________________________________________________________ Mehmet C. Oz, M.D. The True Nature of Much of Human Illness Eludes Mankind. We routinely tolerate concepts that challenge our perceived reality. Electrons are allowed to spiral through the air transmitting information and sprouting quarks, yet we insist on a very concrete, biomedical understanding of our body. The body is sacred, and our provincial understanding of its workings is imbued with cultural biases that are bred from birth and dominated by visceral rather than cerebral influences. The resulting theology of medicine insists that all citizens having heart attacks must have chest pain, or smoke, or have type A personalities, or have high cholesterol, or be obese. The observation that half the victims of mankind's largest killer do not fit this profile eludes the public. 
And how do these risk factors explain why heart attacks are most likely to occur on Monday mornings? We make the error of assuming that medicine, which describes tendencies rather than certainties, is a mathematical field rather than a statistical one. Even the act of observing disease, in the form of a patient-physician relationship, can alter the natural history of the illness -- the medical equivalent of the Heisenberg uncertainty principle in physics. This may explain the well-known placebo effect and its dark relative, the nocebo effect. Religious and scientific explanations for disease crisscross frequently in the brain. Consider the eyesight you use to read a sentence describing your grandmother's love for you. We can describe how you can see the words on the page, and even how you know what the words mean and how they fit together. We can understand the memory processes that allow you to recall your grandmother's face, but how do you know that the image you are seeing is your grandmother? Do you have an individual brain cell reserved for each object you ever see, with one reserved for a young grandmother and another for a more mature memory? At some point we push our scientific understanding into the abyss of art and become surrounded by the seeming darkness of theology. What makes this experience particularly frightening is the associated realization that we, as individuals, are so unique that cookie-cutter answers offered to humanity on losing weight or avoiding heart disease are unlikely to work. We each have to shoulder our own burden in seeking these answers on our journey through life. MEHMET C. OZ, M.D., is Irving Assistant Professor, Department of Surgery, Columbia-Presbyterian, and a leading cardiac surgeon who has pioneered both high-technology approaches (he was involved in the invention of the cardiac assist device) and the use of complementary medicine in surgery. He is author of Healing from the Heart. 
_________________________________________________________________ David Lykken The Reduction Since 1993 In American Crime Is An Illusion The much-touted reduction since 1993 in American crime is an illusion. The U.S. rate of violent crime today is still nearly four times what it was in 1960. The recent dip in crime is the predictable result of our segregating in our prisons more than six times the number who were inmates as recently as 1975. A few of these inmates are psychopaths, persons whose genetic temperaments made them too difficult for average parents to successfully socialize. A few others are mentally ill or retarded, or sheer victims of circumstance. But most are sociopaths, persons broadly normal in genetic endowment who matured unsocialized due to parental mis-, mal-, or non-feasance. As with our language talent, humans evolved an ability to acquire a conscience, to feel empathy and altruism, to accept the responsibilities of a member of the social group. But, like the language talent, this proclivity for socialization must be elicited, shaped, and reinforced during childhood. The epidemic of crime that began in the 1960s is due largely to the fact that, of males aged 15 to 24, the group responsible for at least half our violent crime, the proportion who were reared without fathers is now four times what it was in 1960. More than two-thirds of abused children, juvenile delinquents, school dropouts, pregnant teenagers, homeless persons, and adult criminals were reared without the participation of their biological fathers. Calculated separately for white and black youngsters, it can be shown that a fatherless boy is seven times more likely to become incarcerated as a delinquent than a boy raised by both biological parents. Judith Rich Harris argues that parents are fungible, that children are shaped mainly by their genes and their peers. 
I think she is 80% correct, but I think that there are a few super-parents who effectively nurture and cultivate their children (and largely determine their choice of peers). And I am certain that the bottom 10% of parents are truly malignant -- immature, or overburdened, or indifferent, or sociopathic themselves -- so that their children are almost certain to be robbed of their rights to life, liberty, and the pursuit of happiness. Suppose we were to require that those who wish to have -- and keep -- a baby be mature, married, self-supporting, and never convicted of a crime of violence. If the parents of the 1.3 million Americans currently in prison had met such simple licensure requirements, I believe that at least a million of those inmates would instead be tax-paying citizens and neighbors. Interfering with parental rights, even as modestly as this, is rather frightening, because the instinct to procreate is as strong in us as it is in all the birds and beasts. But Homo sapiens should be able to agree that the rights of the children outweigh those of parents who are unable or unwilling to grow up, get married, and get a job. DAVID LYKKEN is a behavioral geneticist at the University of Minnesota who recently published the results of a study of 1500 pairs of twins in the May issue of Psychological Science. He is the proponent of a set-point theory of happiness, the idea that one's sense of well-being is half determined by genetics and half determined by circumstances. His research illustrates that a person's baseline levels of cheerfulness, contentment, and psychological satisfaction are largely a matter of heredity. He is the author of Happiness: What Studies on Twins Show Us about Nature, Nurture, and the Happiness Set Point. Further Reading on Edge: "How Can Educated People Continue to be Radical Environmentalists?" A Talk by David Lykken. 
_________________________________________________________________ Stuart Hameroff The Imminent Paradigm Shift In Understanding The Conscious Mind Today's most important unreported story is an imminent paradigm shift in understanding consciousness. Quantum computation will soon replace our familiar classical computation as the primary metaphor for the brain/mind. The purported brain=mind=computer analogy promising robot/computer superiority and human/machine hybridization from near-future classical computers is a myth promulgated by the "silicon-industrial complex." Quantum computation was proposed in the 1980s by Feynman, Benioff, Deutsch and others to take advantage of the mysterious but well documented quantum phenomena of 1) superposition (particles existing in multiple states or locations simultaneously) and 2) entanglement (instantaneous, non-local communication among quantum states). Whereas classical computers represent information digitally as "bits" of either 1 OR 0, quantum computation utilizes "qubits" in which information exists in quantum superposition of both 1 AND 0. While in superposition, multiple entangled qubits may interact nonlocally, resulting in computation of near-infinite massive parallelism. In 1994 Peter Shor of Bell Labs proved that quantum computers (if they can be built) could factor large numbers into their primes (the key to modern cryptography, banking codes, etc.) with unprecedented efficiency, rendering conventional systems obsolete. Shor's work sparked major funding in the general area of quantum information (quantum computation, quantum cryptography, quantum teleportation). An apparent roadblock to quantum computation -- the problem of decoherence by environmental interactions -- was potentially solved in the mid-1990s by groups who developed quantum error correction codes which can detect and repair decoherence before quantum computation is destroyed. 
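The bit-versus-qubit distinction above can be sketched numerically. This is a minimal classical simulation in plain Python (no quantum hardware or library implied): a qubit is represented by two complex amplitudes, and an entangled pair by four, with measurement probabilities given by the squared magnitudes.

```python
import math

# A qubit: two amplitudes, one each for |0> and |1>. Measurement yields
# each value with probability equal to the squared magnitude of its
# amplitude. This state is an equal superposition of 0 AND 1.
qubit = [1 / math.sqrt(2), 1 / math.sqrt(2)]
probs = [round(abs(a) ** 2, 10) for a in qubit]
print(probs)  # [0.5, 0.5]

# Two entangled qubits (a Bell state): amplitudes over |00>, |01>, |10>, |11>.
# Only 00 and 11 can ever be observed, so the two measurements always agree
# -- the nonlocal correlation Hameroff calls entanglement.
bell = [1 / math.sqrt(2), 0.0, 0.0, 1 / math.sqrt(2)]
outcomes = {format(i, "02b"): round(abs(a) ** 2, 10) for i, a in enumerate(bell)}
print(outcomes)  # {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5}
```

Of course, simulating n qubits classically takes 2^n amplitudes, which is exactly why a physical quantum computer promises the "near-infinite massive parallelism" that a classical sketch like this cannot deliver.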
In the past several years numerous quantum computational prototypes have been developed, and various technologies for full blown, large scale quantum computers are being explored. It seems almost inevitable that quantum computation will have an enormous impact on information technology. The brain/mind has traditionally been compared to contemporary vanguards of information processing (dating from the Greeks' "seal ring in wax" as a metaphor for memory, to the telephone switching circuit, to the hologram, to the modern day classical computer in which consciousness "emerges" from complex computation among simple neurons). As quantum computation comes to the forefront of technology, human nature (and ego) will surely resist the notion that technology bears superior intellect, and search for quantum computation in the brain. There are cogent reasons for believing that quantum computation does indeed operate in the brain, and such suggestions have been made by theorists including Sir John Eccles and Sir Roger Penrose. However critics quickly point out that the warm, wet, noisy brain must be inhospitable to delicate quantum effects which (in the case of superconductors, Bose-Einstein condensates etc) seem to require complete isolation and temperatures near absolute zero to prevent decoherence. On the other hand "quantum-mind" advocates suggest that biological quantum coherence is metabolically "pumped", point to several lines of evidence suggesting that biological evolution has solved the decoherence problem, observe that only quantum computation can solve the enigmatic features of consciousness, and propose testable predictions of quantum-mind theories (on the contrary, experimental predictions regarding classical computational emergence of consciousness have not been put forth). The implication, and potential theme for the next century, is that we are not strictly emergent products of higher order complexity, but creatures connected to the basic level of the universe. 
STUART HAMEROFF, M.D., is Professor of Anesthesiology and Psychology and Associate Director of the Center for Consciousness Studies at The University of Arizona, Tucson. LINK: Stuart Hameroff Home Page. _________________________________________________________________ Ellis Rubinstein The Erosion of Traditional "Divides" I believe the world to be experiencing an unprecedented erosion of traditional "divides". Yes, we can all point to examples of horrific ideological conflict, but such tribalism surely seems anachronistic to most of us. And that is because many of us have grown accustomed in the latter decades of the 20th century to a kind of social enlightenment that stems from urbanization, globalization, and the sharing of common information disseminated by our extraordinary new communication tools. Now, it may seem obvious that nationalism and political and religious ideologies are having an increasingly hard time remaining "pure" in the face of increased face-to-face contact with those who see things differently from us. Moreover, we cannot easily cling to our most formative views when we increasingly find ourselves in conversation via phone and e-mail with others who see the world differently from us. And, finally, it must be ever more difficult to remain isolated in our views of others when we are surrounded by images of them -- often touching images -- on film and television. Still, all this may have been discussed somewhat in various media. What especially intrigues me is the apparent erosion among relatively educated families of a different "cultural divide": the generational divide. What are the drivers of this shift? And what are its effects? In my necessarily limited experience, I have observed that parents and children are increasingly "friends". It has been much noted that the Baby Boom generation and their children share many of the same interests in music. 
At formal events -- I think of my recent experience in Sweden attending the Nobel festivities -- teenagers and 20-somethings happily mingled with their elders who, if they were male, were dressed in cut-aways. Indeed, some of the young people were wearing special costumes in order to play roles in this highly traditional event. In my time, we would have seriously considered committing suicide before putting on costumes provided by our elders, then attending hours of events populated largely by our parents and grandparents, and finally dancing to "retro" music in the closest imaginable proximity to our parents and even grandparents. I conclude from this and similar experiences that as this new millennium begins, a sort of truce has taken place between generations, with parents and children attempting to bridge divides that, in my view, ought naturally to exist between them. If I am correct about this, then surely there are major ramifications on our culture...and I'm not at all sure they would only be for the good. I worry, for example, that some needed element of rebelliousness is being "bred out" of the system of growing up. I worry that this may have an effect on creative thought. And I worry that the potential lack of tension between generations might lead to a kind of stagnation in the arts, humanities and sciences. Am I alone in this concern? I personally haven't seen this topic addressed in the all-too-limited spectrum of publications I can personally scan. So maybe others have publicly shared this concern. If not, however, I vote for this as one of the most important underreported stories of our time. ELLIS RUBINSTEIN is Editor of Science LINKS: Science Magazine; Ellis Rubinstein's Science Page _________________________________________________________________ Hans Ulrich Obrist The Roads Not Taken I see unrealized projects as the most important unreported stories. As Bergson showed, realization is only one case of the possible. 
There are many amazing unrealized projects out there: forgotten projects, misunderstood projects, lost projects, realizable projects, poetic-utopian dream constructs, unrealizable projects, partially realized projects, censored projects and so on. The beginning of a new millennium seems like a good moment to remember certain roads not taken in an active and dynamic, not nostalgic, way. HANS ULRICH OBRIST has curated exhibitions at the Musee d'Art Moderne de la Ville de Paris, the Kunsthalle Wien and the Deichtorhallen, Hamburg, and the Serpentine Gallery in London, among other institutions. He currently divides his time between France, Switzerland and Austria. After an initial training in economics and politics, he switched to contemporary art and has organized a variety of exhibitions in such unlikely venues as his own house, a monastery library, an airplane and a hotel. _________________________________________________________________ Peter Cochrane Decoding the Human Genome is a Long Term Project We are currently being fed a series of snippets detailing the unraveling of the human blueprint -- our genome. If every isolated and obscure report were strung end to end, and edited thoroughly into a coherent whole, we might just have a comprehensible story -- and one of the most important to be told. But let me jump to the end point and make an educated guess! I suspect that when we decide that tweaking the odd gene or two seems like a good idea to cure this or that disease or shortcoming, or to get this or that eye color, etc., we will get a heck of a surprise. It seems to me that the likelihood that individual genes are singularly responsible for any one trait is pretty slim. So I am putting my money on a reasonably strong interdependency; that is, we will have to attend to complex combinations of genes to achieve some desired effect. 
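Cochrane's hunch about interdependent genes can be quantified with simple combinatorics. Assuming a round, hypothetical figure of 30,000 human genes (the count was unsettled at the time), the number of candidate gene combinations that might jointly control a trait grows explosively with combination size:

```python
from math import comb

genes = 30_000  # hypothetical round count of human genes

# comb(n, k) counts the distinct k-gene subsets among n genes -- the
# search space if a trait depends on k interacting genes rather than one.
for k in range(1, 5):
    print(f"{k}-gene combinations: {comb(genes, k):,}")
```

Already at two genes per trait the space approaches half a billion combinations, and at four it exceeds 10^16 -- a back-of-the-envelope hint at why decoding the "code book" should demand far more computing effort than the initial mapping.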
Realizing an adequate level of understanding of the complexity of the genome code book may well take considerably more effort than the initial mapping of all the raw elements. Like Morse code and the German Enigma machines of World War II, we will almost certainly require some hefty computing power to do the job. Now this really will be worth reporting! PETER COCHRANE is Chief Technologist at BT, Collier Chair for the Public Understanding of Science & Technology at @Bristol, and the author of Tips for Time Travelers. LINKS: Peter Cochrane's Home Page; C2G: Communications Consultants Group, BT Adastral Park _________________________________________________________________ Eberhard Zangger A Lost Civilization Beyköy is a hamlet sitting on the lonely edge of the world -- in the highlands of Phrygia in western Turkey, hundreds of kilometers away from either city or coast. In 1876, a local peasant made a remarkable discovery in these forsaken backwoods -- one that ranks today as the world's most important unreported story. This farmer's pasture extended along the foot of a low but extensive mound, believed to hold the remains of an ancient city. While working his ox and plow along the base of the hill, the farmer struck some objects in the furrow which gave off a metallic noise. After picking up and scrutinizing the large pieces of metal which he had accidentally unearthed, the peasant noticed they were covered with script. Several years later, the Scottish scholar William M. Ramsay passed through the village. Speaking with some local people, Ramsay inquired whether they possessed any coins or artifacts picked up off the ground, with the intention of finding a bargain. Among the pieces offered to him was a tiny fragment containing a short inscription. When he wondered aloud about its original location, the locals pointed towards the knoll. After further inquiry, Ramsay found out about the series of spectacular bronze tablets which had also been found there. 
Unfortunately, they were gone, and nobody was able to say where. As it turned out, the Beyköy bronze tablets had made their way into the archives of the Ottoman empire. Recognizing their possible significance, the curator in charge did not hesitate in contacting the world's foremost authority, the German-born Anatolia expert Albrecht Goetze, in order to publish the finds. In the late 1950s, this Yale University professor began to investigate the texts. For over ten years, he examined, translated and interpreted them. Then he informed another colleague of their existence. Goetze completed his investigations and manuscript shortly before he died in 1971. Owing to the subsequent death of his editor, however, the monograph has never been published, and the Beyköy tablets remain completely unknown. Goetze had found that they contained lists of Anatolian states, kings, and military actions from as early as the fourth millennium BC up until the eighth century BC. These texts proved conclusively that today's Turkey was once the home of a civilization older and more important to European history than Pharaonic Egypt. Yet this civilization has remained virtually uninvestigated to the present day. Now, why should the discovery of the Beyköy tablets be considered the world's most important unreported story? After all, archaeological discoveries -- like that of the man in the ice -- have often surprised the general public and upset established scholarship. The discovery of the Beyköy tablets, however, is of a different order of magnitude. It shows that the center of European history is to be found well outside the frontiers of the hitherto accepted Old World. Therefore, its impact equals a scientific revolution similar to those caused by Nicolaus Copernicus, Charles Darwin and Sigmund Freud. Copernicus' work generated an upheaval, since it demonstrated that humankind is not at the center of the universe. 
Darwin's research demonstrated that humans do not stand at the ultimate zenith of Creation; instead they are more or less an accidental product of an evolutionary process millions of years long. And Freud showed that we are not even in full command of our own mind, our inner selves. The publication of the Beyköy tablets will take this sequence of upheavals in western thought one step further. It will demonstrate -- once and for all -- that "western culture" is an arbitrary, abstract concept resting upon the wishful thinking of certain eighteenth-century members of the educated class. Old world civilizations, especially that of ancient Greece, evolved from frontier outposts of much older Asian civilizations in the adjacent Orient. The practices characterizing western civilization -- domestication of animals and plants, agriculture, husbandry, permanent homes, life in village communities, metallurgy, politics and cosmology -- all clearly originated in Asia. After this paradigm shift, only one even greater upheaval remains -- proving that intelligent life exists elsewhere in the universe. EBERHARD ZANGGER is an exploder of myths and the discoverer of the lost continent of Atlantis, which was never lost in the first place. His five books on ancient civilizations have been published internationally -- with more to come. Feeling a bit estranged from the great thinkers of the planet, he is impatiently waiting for the fourth culture to begin. _________________________________________________________________ Anne Fausto-Sterling The End of Gene Control As the 20th century draws to a close, biologists triumphantly announce the beginning of the end of the project to sequence the human genome. Metaphoric hyperbole runs rampant as we speak of "reading the book of life" and of "unraveling the essence of what it means to be human". But less noticed is the fact that developmental biologists who study the role of genes in development are busily dethroning the gene. 
When I was a young embryologist I lectured about genes in development. Following the dogma of the time, I told my students that there were two groups of genes. First, there were housekeeping genes -- those responsible for the mundane daily functions of the cell -- the feminine duties of maintenance. These genes supposedly kept the machinery running smoothly -- respiration and waste disposal went on quietly and demurely. But the really important genes were the development genes -- those masculine entities that pioneered new territory and wrought new form from undifferentiated plasm. The goal of any self-respecting developmental geneticist was to find those special genes for development and thus unravel the mystery of how genes control the formation of new organisms. The successes have been many and profound. Developmental biologists have uncovered myriad genes involved in embryo formation. They have found an amazing continuity of genetic structure and function across the phyla. We now understand in fabulous detail the function of many genes in development. But something funny happened on the way to the genetic forum. The distinction between housekeeping genes and development genes has become increasingly hard to maintain. Some development genes fall into the category of transcription regulators, as might be expected for genes that control genetic expression. But many turn out to be involved in cell communication and signaling. What is more, these genes don't control development. In a real sense, development controls the genes. The same genetic read-out can have a vastly different outcome depending upon when during development and in which cell the protein is produced. Indeed, most development genes seem to act at multiple times during development and in many tissue and cell types. The same gene can play a key role in quite a variety of developmental events. 
The important story is that the search for genes that control development has shown us that our initial idea that genes control processes within an organism is wrong. Instead, genes are one set of actors within a developmental system. The system itself contains all of the pre-existing contents of the cell, organ or organism. These include thousands of gene products, other chemicals such as ions, lipids, carbohydrates and more, all organized and compartmentalized in a highly structured physical setting (the cell and its substructures, the organ and its tissues, the organism and its organ systems). From before the turn of the century, embryologists debated whether the cytoplasm controlled the nucleus or vice versa. What the last decade of research on genes in development reveals is that both things are simultaneously true -- the system and its history control development. Genes are but one of many crucial components of the process. ANNE FAUSTO-STERLING is Professor of Biology and Women's Studies at Brown University. She is the author of Myths of Gender: Biological Differences Between Women and Men. LINK: Anne Fausto-Sterling Home Page _________________________________________________________________ Andy Clark That the Human Mind is Less and Less In the Head Human brains are making the human world smarter and smarter, so that they (the brains) can be dumb in peace. Or rather, we are progressively altering our environment so that our brains, which are good at pattern-matching, at sensory recognition, and at the manipulation of objects in the world, can support intelligent choice and action by their participation in much larger networks. Human development is a process in which the brain becomes deeply tuned to the available environmental surround, be it pen, paper and sketchpad, or PC, Palm Pilot and designer software. 
As the century closes, and our typical, reliable environmental props and supports become ever more sophisticated and interlinked, so the mental machinery that makes us who we are is becoming ever more extended, interanimated and networked. In the near future, software agents whose profiles have evolved alongside those of a specific individual will count as part of the individual person. To say that I use those software agents will be strictly false. Instead, my brain and the various personalized manifestations of new technology will constitute integrated thinking things. I will no more be the user of these close-knit technologies than I am the user of the various subsystems already existing within my biological brain. Instead, it is better to see both those subsystems and these close-knit external technologies as together constituting a spatially extended kind of user or person. The next step may be, as Rodney Brooks suggests, to put as much of that technology back inside the biological membrane as possible. This buys easier portability without changing the real state of affairs. We are already (mental) cyborgs. ANDY CLARK is Professor of Philosophy and Director of the Philosophy / Neuroscience / Psychology Program at Washington University in St. Louis. He has written extensively on Artificial Neural Networks, Robotics and Artificial Life. His latest book is Being There: Putting Brain, Body and World Together Again. LINK: Andy Clark's Home Page _________________________________________________________________ Keith Devlin The Death of the Paragraph The most significant unreported story? The mathematician in me screams that this is a paradox. The moment I write about it, it ceases to be unreported. That aside, there are so many reporters chasing stories today, it would be hard to point to something that has the status of being a "story" but has not yet been reported. 
On the other hand, as Noam Chomsky is fond of reminding us, there's a difference between being reported somewhere, by somebody, and being covered by the major news organizations. I'll settle for a trend. I'm not sure if it will turn out to be a story, but if it does it will be big. It's the death of the paragraph. We may be moving toward a generation that is cognitively unable to acquire information efficiently by reading a paragraph. They can read words and sentences -- such as the bits of text you find on a graphical display on a web page -- but they are not equipped to assimilate structured information that requires a paragraph to get across. To give just one example, a recent study of 10,000 community college students in California found that, in the 18-25-year age group, just 17% of the men could acquire information efficiently through reading text. For the remaining 83%, the standard college textbook is little more than dead weight to carry around in their bag! The figure for women in the same age group is a bit higher: just under 35% can learn well from textually presented information. These figures contrast with those for students aged 35 or over: 27% of males and over 42% of females find it natural to learn from reading. Of course, that's still less than half the student population, so any ideas we might fondly harbor of a highly literate older generation are erroneous. But if the difference between the figures for the two generations indicates the start of a steady decline in the ability to read text of paragraph length, then a great deal of our scientific and cultural heritage is likely to become highly marginalized. Half a century after the dawn of the television age, and a decade into the Internet, it's perhaps not surprising that the medium for acquiring information that both age groups find most natural is visual nonverbal: pictures, video, illustrations, and diagrams. 
According to the same college survey, among the 18-25 age group, 48% of males and 36% of females favor this method of acquiring information. The figures for the over-35s are almost identical: 46% and 39%. If these figures reflect the start of a story that has not been reported, then by the time somebody does write it, there may not be many people around able to read it. The social, cultural, scientific, and political consequences of that are likely to be major. KEITH DEVLIN, mathematician, is a Senior Researcher at Stanford University, and the Dean of Science at Saint Mary's College of California. He is the author of Goodbye, Descartes: The End of Logic and the Search for a New Cosmology of the Mind; Life by the Numbers; and The Language of Mathematics: Making the Invisible Visible. Link: Keith Devlin Home Page _________________________________________________________________ Dean Ornish The Globalization of Illness Many developing countries are copying the Western way of living, and they are now copying the Western way of dying. Illnesses like coronary heart disease that used to be very rare in Japan and other Asian countries are becoming epidemics, causing huge drains on their economies as well as enormous personal suffering -- much of which could be avoided. The same is true for prostate cancer, breast cancer, colon cancer, diabetes, hypertension, obesity, arthritis, and so on. Trillions of dollars in direct and indirect expenditures could be avoided, along with untold suffering. DEAN ORNISH has 22 years' experience directing clinical research demonstrating, for the first time, that comprehensive lifestyle changes may reverse even severe coronary heart disease without drugs or surgery. He is Founder, President, and Director of the non-profit Preventive Medicine Research Institute; Clinical Professor of Medicine, University of California, San Francisco; and physician consultant to President Clinton and several members of the U.S. Congress. 
He is the author of five best-selling books, including the New York Times bestsellers Dr. Dean Ornish's Program for Reversing Heart Disease; Eat More, Weigh Less; and Love & Survival: The Scientific Basis for the Healing Power of Intimacy. LINK: Dean Ornish, M.D. -- "Healthy Living" _________________________________________________________________ Bart Kosko The most important unreported story at the dawn of the Information Age has two parts: (1) the last Sunday of the 20th century passed and the United States Government still continued to outlaw first-class mail on Sunday, and (2) no one complained. BART KOSKO is Professor, Electrical Engineering Department, University of Southern California, and author of Fuzzy Thinking; Nanotime; and The Fuzzy Future: From Society and Science to Heaven in a Chip. LINK: Bart Kosko Home Page _________________________________________________________________ Terrence J. Sejnowski Exercise Can Make You Smarter A revolution recently occurred in neuroscience that has far-reaching implications for our future. According to all the textbooks in neuroscience, we are born with a full complement of around 100 billion neurons, and it is all downhill from there. This was a discouraging "fact". Fred Gage, a colleague of mine at the Salk Institute, has discovered that new neurons are born every day in the hippocampus, an important part of the brain for long-term memory of facts and events, even in adults. This was first found in rats, but has now been shown in monkeys and humans, and not just in the hippocampus, but also in the cerebral cortex, the storehouse of our experience and the fountainhead of our creativity. This was widely reported, but what has emerged since then is even more encouraging. First, the new neurons in the hippocampus don't survive unless the animal exercises; a running wheel in an otherwise standard lab cage is enough to keep new neurons alive and well in mice. 
Second, the increase in the strengths of connections between neurons in the hippocampus that occurs when they are strongly activated, called long-term potentiation, is twice as strong in mice that had access to a running wheel. Finally, the mice with exercise were smarter at memory tasks. We still do not know how all this happens, but the bottom line is that something as basic as exercise can make you smarter. Recess in schools and executive gyms help not only the body, but can also make the mind sharper. These results have implications for graceful aging. Until recently, the dominant view of how the brain develops assumed that experience selects neurons and connections from a fixed, pre-existing repertoire. This view had some plausibility when it was thought that all of the neurons you will ever have are present at birth, coupled with evidence of neuron death and pruning of connections during childhood. However, if the brain makes new neurons in adults then this cannot be the whole story, since the growth of new connections must also be occurring, and doing so in an experience-dependent way. This discovery, coupled with increasing evidence that new connections can grow even between old neurons as a consequence of an enriched environment, means that an active mind along with an active body predisposes us for a lifetime of learning. This is good news for the baby boomers who have embraced health clubs and challenging new experiences, but bad news for couch potatoes who are exercise-phobic. In short, "Use it or lose it". An active lifestyle and a rich social life are the best insurance against premature senility. We will learn much more about how the brain renews itself in the next century as neuroscientists reveal more about the mechanisms and circumstances that make new neurons survive and grow stronger connections. Ultimately, this will lead to greater productivity from the elderly, the fastest growing segment in western societies. 
TERRENCE J. SEJNOWSKI, a pioneer in Computational Neurobiology, is regarded by many as one of the world's foremost theoretical brain scientists. In 1988, he moved from Johns Hopkins University to the Salk Institute, where he is a Howard Hughes Medical Investigator and the director of the Computational Neurobiology Laboratory. In addition to co-authoring The Computational Brain, he has published over 250 scientific articles. LINKS: Terrence Sejnowski Home Page; Computational Neurobiology Lab (CNL) _________________________________________________________________ Philip Brockman How Great the Kids Are Today. The young people I work with at NASA's Langley Research Center are sharp and hard-working. We get some of the best, but they are almost all great. And what is rarely understood, never mind reported, is that they have to process about twice the information that I had to deal with starting out in research in 1959. Of course, the biggest untrue story told, one probably found in the first cave writings, is that "the younger generation is going to hell." PHILIP BROCKMAN, a physicist, has been at NASA LaRC (Langley Field, Virginia) since 1959. His research includes: shock tubes; plasma propulsion; diode laser spectroscopy; heterodyne remote sensing; laser research; laser injection seeding; and remote sensing of atmospheric species, winds, windshear and vortices. He is currently supporting all-solid-state laser development for aircraft and spaceborne remote sensing of species and winds and developing coherent lidars to measure wake vortices in airport terminal areas. He is a recipient of NASA's Exceptional Service Medal (ESM). LINK: Philip Brockman's NASA Home Page _________________________________________________________________ Daniel Goleman Hidden Consequences of Our Daily Choices as Consumers of Products and Services What is the biggest unreported story? 
The hidden consequences for our health and the environment, and for social and economic justice, of our daily choices as consumers of products and services. Our individual habits of consumption, when multiplied by our vast numbers, have devastating impacts -- but we are blind to the chain that links our individual choices with their vaster consequences. I'd like to know what these links are -- but they lack transparency. We need something akin to the nutritional labels on foods that would surface these hidden consequences of our own actions. A case in point: what is the environmental cost of choosing to buy a hamburger? How many acres for cattle to graze, how much erosion or degrading of land as a consequence, how much more greenhouse gas added to the atmosphere, how much water used for this purpose rather than other things? How does a burger made from beef compare in this regard to, say, one made from turkey, or from soybeans? Another case in point: since childhood, I've suffered from asthma. Asthma is becoming epidemic among children, especially in urban neighborhoods. One clear reason for the upsurge is the increase in airborne particulates that irritate and inflame respiratory passages. I live in the Connecticut River Valley of Massachusetts, which, because of prevailing winds, receives a large portion of its particulates from the pollution in the metro New York City area, as well as from the industrial Midwest states. One coal-powered electrical plant in Ohio, a notoriously bad offender, contributes almost half the airborne particulates that reach my area from the Midwest. Who are the largest industrial customers of that electrical plant? What products that I'm now buying are built using power from that plant? I might boycott them if I knew. Individually, the consequences of my choices are admittedly negligible. But if summed across millions of consumers making similarly informed choices, the impact could be quite great. 
We could 'vote' for more benign consequences if we had this missing information. I applaud, for instance, the mutual funds and other corporate citizens who are offering shareholders the option to get their reports via the internet, rather than wasting resources -- trees, power, etc. -- mailing thousands of hard copies. One fund informs me, for instance, that if all members sign up for internet reports, the savings in pulp amount to more than three hundred trees per year. I want more choices like that. So how about a new investigative beat for journalism: hidden consequences. DANIEL GOLEMAN is the author of the international bestsellers Emotional Intelligence and Working with Emotional Intelligence. For twelve years he covered the behavioral and brain sciences for the New York Times, and has also taught at Harvard. His previous books include Vital Lies, Simple Truths; The Meditative Mind; and, as coauthor, The Creative Spirit. Dr. Goleman is CEO of Emotional Intelligence Services. LINK: Emotional Intelligence Services (EI) _________________________________________________________________ Marc D. Hauser (1) The Health Crisis in Africa and (2) The Poor Level of Educational Distribution. Having lived for several years in East Africa, I am struck by two observations which seem to me to have escaped sufficient reporting. The first concerns the health crisis on the continent. Unlike anywhere else that I have ever lived, and especially over the past 5-10 years, I am overwhelmed by the illnesses. When one walks among villages in Uganda, or on the streets of cities such as Nairobi and Kampala, one sees AIDS. There is no need to go to the local newspapers and read the latest counts. It is right in front of one's eyes. Perhaps more depressing than the AIDS crisis is the problem with malaria. Unlike AIDS, for which we have only the weakest medicines, we certainly do have the medical technology to eradicate most of the problems with malaria. 
Cracking this problem will not, however, require medical expertise alone. Rather, it will require a creative team of doctors, anthropologists, sociologists, and economists working hand in hand, trying to understand the local mores, the local economy, and the struggles of daily life. It will also have to crack the problem of medical distribution, still a problem in much of Africa. A second problem comes from my experience teaching in Uganda. I had the great pleasure, and honor, of teaching students there. I have never seen such a thirst to learn. And yet, much of the educational system is lacking in the basic supplies and materials that would transform these students into highly educated scholars. Like the poor distribution of medicinal supplies, there is an equally poor level of educational distribution. The inequities here are dramatic. Those of us who write for the Edge should put our heads together to think of some way to help, to lend our minds to theirs. Any takers? MARC D. HAUSER is an evolutionary psychologist, and a professor at Harvard University where he is a fellow of the Mind, Brain, and Behavior Program. He is a professor in the departments of Anthropology and Psychology, as well as the Program in Neurosciences. He is the author of The Evolution of Communication. LINKS: Primate Cognitive Neuroscience Laboratory, Marc Hauser Home Page _________________________________________________________________ Kevin Kelly The Population Fizzle While the absolute numbers of humans on Earth will continue to rise for another generation at least, birth rates around the world are sinking, particularly in the developed countries. In the coming decades many emerging countries -- currently boosting the rise in absolute population numbers -- will themselves make the transition to lower birth rates. While most estimates see the global population of Earth peaking around 2040-50, what none of them show is what happens afterward. 
Unless some unknown counterforce arises, what happens after the peak is that the world's birth rate steadily sinks below replacement level. Increasing people's longevity can only slightly postpone this population fizzle. The main reason this is unreported is that it will be decades before it happens, and until it does, the population boomers are correct: our numbers swell exponentially. But the story is worth reporting for three reasons: 1) Long before the world's population fizzles, it will age. As it already is. The average age of a human on earth has been increasing since 1972. The developing countries are on track to become Geezer Countries in another twenty years or less. This will affect culture, politics, and business. 2) There is currently no cultural force in the modern societies to maintain human population levels over the long term. Population robustness these days comes primarily from undeveloped countries, and from cultural practices that are disappearing in developed countries. Once the developing countries become developed, there will be little at work to maintain an average of one offspring per person, with millions having more children to balance the millions with few or none. How many future readers will be willing to have three or more kids? 3) There is no evidence in history of declining population and increasing prosperity. In the past, rising prosperity has always accompanied rising populations. Prosperity riding upon a declining population *may* be possible through some wisdom or technology that we don't possess now, but it would require developing an entirely new skill set for society. This story could be reported by asking the non-obvious questions of the usual population pundits: what happens afterwards? What happens in China with a continually declining birth rate? What would happen to the US without immigration? Who is going to work in Europe? KEVIN KELLY is a founding editor of Wired magazine. 
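Kelly's "what happens afterward" question is at bottom a compounding argument, and it can be made concrete with a back-of-the-envelope sketch. The numbers below are illustrative assumptions, not his figures: a peak of 9 billion around 2050, a long-run total fertility rate of 1.6 children per woman against a replacement level of roughly 2.1, and 30-year generations.

```python
# Back-of-the-envelope sketch of sub-replacement compounding.
# NOT a demographic model: ignores longevity, migration, and age structure.
# All constants are assumptions chosen for illustration.

PEAK_POPULATION = 9.0e9   # assumed global peak, ~2050
FERTILITY = 1.6           # assumed long-run children per woman
REPLACEMENT = 2.1         # roughly the rate needed to hold steady
GENERATION_YEARS = 30     # assumed generation length

def project(generations):
    """Scale each successive generation by fertility / replacement,
    returning (year, population) pairs after the assumed 2050 peak."""
    ratio = FERTILITY / REPLACEMENT
    pop = PEAK_POPULATION
    out = []
    for g in range(1, generations + 1):
        pop *= ratio
        out.append((2050 + g * GENERATION_YEARS, pop))
    return out

for year, pop in project(5):
    print(f"{year}: {pop / 1e9:.1f} billion")
```

Under these assumed numbers the population roughly halves within three generations of the peak, which is the compounding logic behind Kelly's question: a fertility rate only modestly below replacement, held steady, is enough to shrink the population dramatically on a timescale of centuries.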
In 1993 and 1996, under his co-authorship, Wired won its industry's Oscar -- the National Magazine Award for General Excellence. Prior to the launch of Wired, Kelly was editor/publisher of the Whole Earth Review, a journal of unorthodox technical and cultural news. He is the author of New Rules for the New Economy; and Out of Control: The New Biology of Machines, Social Systems, and the Economic World. LINK: Kevin Kelly Bio Page _________________________________________________________________ Freeman Dyson The Enduring Vitality of the More Moderate Kinds of Religions Here is my not very original answer to this year's question: what is today's most important unreported story? The enduring vitality of the more moderate kinds of religion. Although the fanatical and fundamentalist versions of religion receive heavy attention from the media, the moderate versions of religion do not. The majority of religious people are neither fanatical nor fundamentalist. This is true of Christians, Jews and Moslems alike. In the modern world, with inequalities between rich and poor growing sharper, governments are increasingly unable or unwilling to take care of the poor. Organized religions increasingly provide the glue that holds societies together, giving equal respect to those who profit from economic prosperity and those who are left behind. This is true even in such a prosperous and technologically advanced community as Princeton, New Jersey, where I live. Princeton has more than twenty churches, all trying in their different ways to reach out to people in need, all bridging the gap between young and old as well as the gap between rich and poor. Religion plays this same role, holding communities together with bonds of fellowship, among the Arabs of the West Bank, the Jews of Brooklyn, and the African-Americans of Trenton, New Jersey. Religions that have endured for a thousand years and helped generations of oppressed people to survive are not about to disappear. 
FREEMAN DYSON is professor of physics at the Institute for Advanced Study, in Princeton. His professional interests are in mathematics and astronomy. Among his many books are Disturbing the Universe, Infinite in All Directions, Origins of Life, From Eros to Gaia, Imagined Worlds, and The Sun, the Genome, and the Internet. _________________________________________________________________ Piet Hut There Are No Things. That's right. No thing exists; there are only actions. We live in a world of verbs, and nouns are only shorthand for those verbs whose actions are sufficiently stationary to show some thing-like behavior. These statements may seem like philosophy or poetry, but in fact they are an accurate description of the material world, when we take into account the quantum nature of reality. Future historians will be puzzled by the fact that this interpretation has not been generally accepted, 75 years after the discovery of quantum mechanics. Most physics textbooks still describe the quantum world in largely classical terms. Consequently anything quantum seems riddled with paradoxes and weird behavior. One generally talks about the "state" of a particle, such as an electron, as if it really had an independent thing-like existence, as in classical mechanics. For example, the term "state vector" is used, even though its operational properties belie almost anything we normally associate with a state. Two voices have recently stressed this verb-like character of reality, those of David Finkelstein, in his book Quantum Relativity, and of David Mermin, in his article "What is quantum mechanics trying to tell us?" [1998, Amer. J. of Phys. 66, 753]. In the words of the second David: "Correlations have physical reality; that which they correlate does not." In other words, matter acts, but there are no actors behind the actions; the verbs are verbing all by themselves without a need to introduce nouns. Actions act upon other actions. 
The ontology of the world thus becomes remarkably simple, with no duality between the existence of a thing and its properties: properties are all there is. Indeed: there are no things. Two hundred years ago, William Blake scolded the physicists for their cold and limited view of the world, in terms of a clockwork mechanism, in which there was no room for spontaneity and wonder. Fortunately, physicists did not listen to the poet, and pushed on with their program. But to their own utter surprise, they realized with the discovery of quantum mechanics that nature exhibits a deeply fundamental form of spontaneity, undreamt of in classical physics. An understanding of matter as dissolving into a play of interactions, partly spontaneous, would certainly have pleased Blake. What will be next? While physics may still seem to lack a fundamental way of touching upon meaning and wonder, who is to say that those will remain forever outside the domain of physics? We simply do not know and cannot know what physics will look like, a mere few hundred years from now. There is an analogy with computer languages. Physicists have a traditional aversion to learning any other language than Fortran, with which they grow up, no matter how useful the other languages may be. But without ever parting from their beloved Fortran, it was Fortran that changed out from under them, incorporating many of the features that the other languages had pioneered. So, when asked how future physicists will program, a good answer is: we have not the foggiest idea, but whatever it is, it will still be called Fortran. Similarly, our understanding of the material world, including the very notion of what matter and existence is, is likely to keep changing radically over the next few hundred years. In what direction, we have no idea. The only thing we can safely predict is that the study of those wonderful new aspects of reality will still be called physics. 
PIET HUT is professor of astrophysics at the Institute for Advanced Study, in Princeton. He is involved in the project of building GRAPEs, the world's fastest special-purpose computers, at Tokyo University, and he is also a founding member of the Kira Institute. LINKS: Piet Hut's Home Page, GRAPEs; and Kira Institute. _________________________________________________________________ Jeff Jacobs The Lack of Services to Heal Abused Children The emotional, physical, sexual abuse of children in our society and the lack of services to heal them. JEFF JACOBS is the Founder of Civitas Initiative, and President of Harpo Entertainment Group. LINK: Civitas Initiative _________________________________________________________________ Lance Knobel Two Stories, One Encouraging, the Other Worrying First is the incredible dynamism, energy and economic hope found in the great cities of the developing world. Most coverage of developing world cities concentrates on the problems: environment, overcrowding, shanty towns. But these cities represent the best hope of the world's poor nations. Modern myth has it that these cities are growing at unprecedented rates in an increasingly urbanized world. In fact the cities of the north grew at even greater rates in the 19th century. Mexico City, Sao Paulo, Calcutta, Buenos Aires and Rio all had more people moving out than moving in over the decade of the '80s. More than two-thirds of the world's cities with populations of more than 1 million were important cities 200 years ago. Miami and Phoenix grew more rapidly than Nairobi in the 20th century; Los Angeles more rapidly than Calcutta. Urban growth is generally good: rising levels of urbanization are strongly associated with growing and diversifying economies, and most of the nations in the south whose economic performance over the last 10 to 15 years is so envied by others are also the nations with the most rapid increase in their levels of urbanization. 
The developing world's cities need to be celebrated and not lamented. Less encouraging is the largely ignored story of climate change. All media have great difficulty in covering stories that develop over generations. Who in 1900 would have written about the changing role of women or the spread of democracy, two of the extraordinary shifts of the last century? The failure of the Kyoto conference on climate change to get significant action by the advanced, industrialized countries means that the problems will get far worse before there is any chance of them getting better. LANCE KNOBEL is editor-in-chief of World Link, the magazine of the World Economic Forum, and is responsible for the program of the Forum's Annual Meeting in Davos, Switzerland. He also publishes their nascent Weblog. LINKS: The World Economic Forum, Worldlink: The Online Magazine of the World Economic Forum, and Weblog _________________________________________________________________ Philip Elmer-DeWitt O.J.'s Search for the Real Killer. PHILIP ELMER-DEWITT is science and technology editor of Time Magazine. He has been writing about science and technology for Time since he reported a cover story on computer "Whiz Kids" in 1982. He became a staff writer in 1983, a senior writer in 1993, a senior editor in 1994 and science editor in 1995. He started two new sections in the magazine -- Computers (1982) and Technology (1987) -- and in 1994 helped launch Time Online, America's first interactive newsmagazine. LINK: Philip Elmer-DeWitt's Home Page. _________________________________________________________________ John Horgan The Quiet Resurgence of Psychedelic Compounds as Instruments of Both Spiritual and Scientific Exploration The story that has gripped me lately is the quiet resurgence of psychedelic compounds as instruments of both spiritual and scientific exploration. This trend is unfolding worldwide. 
I just attended a conference in Switzerland at which scholars presented findings on the physiological and psychological effects of drugs such as psilocybin, LSD and MDMA (Ecstasy). At the meeting, I met an American chemist who had synthesized a new compound that seems to induce transcendent experiences as reliably as LSD does but with a greatly reduced risk of bad trips; a Russian psychiatrist who for more than 15 years has successfully treated alcoholics with the hallucinogen ketamine; and a German anthropologist who touts the spiritual benefits of a potent Amazonian brew called ayahuasca. Long a staple of Indian shamans, ayahuasca now serves as a sacrament for two fast-growing churches in Brazil. Offshoots of these churches are springing up in the U.S. and Europe. Several non-profit groups in the U.S. are attempting to rehabilitate the image of psychedelic drugs through public education and by supporting research on the drugs' clinical and therapeutic potential. They include the Heffter Institute, based in Santa Fe, New Mexico, and the Multidisciplinary Association for Psychedelic Studies (MAPS), based in Florida. The question is, will this new psychedelic movement founder, as its predecessor did in the 1960s? Or will it bring about the profound spiritual and social changes that advocates envision? JOHN HORGAN is a freelance writer and author of The End of Science and The Undiscovered Mind. A senior writer at Scientific American from 1986 to 1997, he has also written for the New York Times, Washington Post, New Republic, Slate, London Times, Times Literary Supplement and other publications. He is now writing a book on mysticism. Further Reading on Edge: "Why I Think Science Is Ending": A Talk With John Horgan; "The End of Horgan?" _________________________________________________________________ Hans Weise That There Are So Many Important Unreported Stories. 
Given the number of media outlets for independent voices to tell good stories, the vanilla quality of mainstream reporting is like the proverbial frog in a pot of water that doesn't notice the slow temperature increase and boils cozily. In a consumer-oriented culture under a booming economy, critical voices are marginalized and the questions "we" ask ourselves lose color and substance and become sensational and picayune. A news magazine won't tell you about record-setting U.S. arms sales, for instance, but it will tell you that Ricky Martin is the sexiest man alive. Consumers don't like to hear that things aren't what they seem, so advertisers won't support those who publish such nonsense. The same goes for education -- people are still uncomfortable with the idea that we've descended from apes and so, in the case of Kansas at least, students needn't be burdened with that knowledge. HANS WEISE is a filmmaker, writer, and Web developer living in Alexandria, VA. He is a home-schooled autodidact who majored in cinema studies at NYU and then studied archaeology and astronomy via Harvard extension. _________________________________________________________________ Clifford A. Pickover The Immortalization of Humanity The most unreported story deals with the evolution of human lifespans and intelligence. Although we hear news reports about how humans will live longer in the future, we rarely hear reports that our children or grandchildren will be immortal by the end of the next century. Given the tremendous advances in molecular biochemistry that will take place by 2100, we will certainly uncover the molecular and cellular mysteries of aging, and therefore many humans will live forever, assuming they don't suffer a fatal accident. I am amazed that this obvious concept is not discussed more often or taken more seriously. Of course, the ecological, economic, political, social, and religious implications will be extreme. 
Imagine an immortal Pope discussing the afterlife with his followers -- or the growth of two social classes, those that can afford immortality and those too poor to gain access to the required anti-aging "treatment." Similarly, most scientists and lay people seem to think that there is intelligent, space-faring life elsewhere in the universe. A related unreported story is just how special human intelligence is. Despite what we see in Star Wars and Star Trek, I don't expect intelligence to be an inevitable result of evolution on other worlds. Since the beginning of life on Earth, as many as 50 billion species have arisen, and only one of them has acquired technology. If intelligence has such high survival value, why are so few creatures intelligent? Mammals are not the most successful or plentiful of animals. Ninety-five percent of all animal species are invertebrates. Most of the worm species on our planet have not even been discovered yet, and there are a billion insects wandering the Earth. If humankind were destroyed in some great cataclysm, in my opinion there is very little possibility that our level of intelligence would ever be achieved on Earth again. If human intelligence is an evolutionary accident, and mathematical, linguistic, artistic, and technological abilities a very improbable bonus, then there is little reason to expect that life on other worlds will ever develop intelligence that allows it to explore the stars. Both intelligence and mechanical dexterity appear to be necessary to make radio transmitting devices for communication between the stars. How likely is it that we will find a race having both traits? Very few Earth organisms have much of either. As evolutionary biologist Jared Diamond has suggested, those that have acquired a little of one (smart dolphins, dexterous spiders) have acquired none of the other, and the only species to acquire a little of both (chimpanzees) has been rather unsuccessful. 
The most successful creatures on Earth are the dumb and clumsy rats and beetles, both of which found better routes to their current dominance. If we do receive a message from the stars, it will undermine much of our current thinking about evolutionary mechanisms. Despite the improbabilities, we must continue to scan the stars for signs of intelligence. I agree with the ancient Persian proverb, "The seeker is a finder," which suggests we must always search in order to understand our place in our universe. CLIFFORD A. PICKOVER is a research staff member at the IBM Watson Research Center in Yorktown Heights, New York. He received his Ph.D. from Yale University and is the author of over twenty books on such topics as computers and creativity, art, mathematics, black holes, human behavior and intelligence, time travel, and alien life. His web site, www.pickover.com, has received over 200,000 visits, and his latest book is Surfing Through Hyperspace: Understanding Higher Universes in Six Easy Lessons. LINK: Clifford A. Pickover's Home Page _________________________________________________________________ Howard Rheingold How Will The Internet Influence Democracy? The way we learn to use the Internet in the next few years (or fail to learn) will influence the way our grandchildren govern themselves. Yet only a tiny fraction of the news stories about the impact of the Net focus attention on the ways many-to-many communication technology might be changing democracy -- and those few stories that are published center on how traditional political parties are using the Web, not on how grassroots movements might be finding a voice. Democracy is not just about voting for our leaders. Democracy is about citizens who have the information and the freedom of communication they need to govern themselves. 
Although it would be illogical to say that the printing press created modern democratic nation-states, it would have been impossible to conceive, foment, or implement self-government without the widespread literacy made possible by printing technology. The more we know about the kind of literacy citizens are granted by the Internet, the better our chances of using that literacy to strengthen democracy. And what could be more important? What good is health and wealth and great personal home entertainment media without liberty? Every communication technology alters governance and political processes. Candidates and issues are packaged and sold on television by the very same professionals who package and sell other commodities. In the age of mass media, the amount of money a candidate can spend on television advertising is the single most important influence on electoral success. Now that the Internet has transformed every desktop into a printing press, broadcasting station, and place of assembly, will enough people learn to make use of this potential? Or will our lack of news, information, and understanding of the Net as a political tool leave us unable to counter the centralization of capital, power, and knowledge that modern media also make possible? The same tool that affords tremendous power to the grassroots, the broad citizenry, the cacophony of competing "factions" necessary for healthy democracy, also affords tremendous power to the elites who already have wealth and power. Guess who can best afford to apply the tool to further their ends? What's in it for big media interests to inform us about how we can compete with big media interests? The political power afforded to citizens by the Web is not a technology issue. Technology makes a great democratization of publishing, journalism, and public discourse possible, but does not determine whether or not that potential will be realized. 
Every computer connected to the Net can publish a manifesto, broadcast audio and video eyewitness reports of events in real time, and host a virtual community where people argue about those manifestos and broadcasts. Will only the cranks, the enthusiasts, the fringe groups take advantage of this communication platform? Or will many-to-many communication skills become a broader literacy, the way knowing and arguing about the issues of the day in print was the literacy necessary for the American revolution? The "public sphere" is what the German political philosopher Habermas called that part of public life where ordinary people exchange information and opinions regarding potholes on main street and national elections, school bonds and foreign policy. Habermas claimed that the democratic revolutions of the 18th century were incubated in the coffee houses and committees of correspondence, informed by the pamphlets and newspaper debates where citizens argued about how to govern themselves without a King. Public governance could only emerge from public opinion. Habermas wrote: "By 'public sphere,' we mean first of all a domain in our social life in which such a thing as public opinion can be formed." The public sphere is the reason why the modern coup d'etat requires paratroopers to capture television broadcast stations -- because those are the places where the power to influence public opinion is concentrated. The problem with the public sphere during the past sixty years of broadcast communications has been that a small number of people have wielded communication technology to mold the public opinion of entire populations. The means of creating and distributing the kind of media content that could influence public opinion -- magazines, newspapers, radio and television stations -- were too expensive for any but a few. Just as books were once too expensive for any but a few. The PC and the Internet changed that. 
Desktop video, desktop radio, desktop debates, and digicam journalism have drastically reduced the barriers to publishing and broadcasting. These technological capabilities have emerged only recently, and are evolving rapidly. While much attention is focused on how many-to-many audio technology is threatening the existing music industry, little attention is focused on political portals. While all eyes are on e-commerce, relatively few know about public opinion BBSs, cause-related marketing, web-accessible voting and finance data. Look at VoxCap, the Minnesota E-Democracy Project, the California Voter Foundation, and scores of other unreported experiments. Imagine what might happen if more people were told that the Web could help them remain free, as well as enhance their shopping experience. HOWARD RHEINGOLD is the author of Virtual Reality and The Virtual Community, and was the editor of Whole Earth Review and the Millennium Whole Earth Catalog. Further reading on Edge: Chapter 24, "The Citizen" in Digerati. LINK: rheingold's brainstorms. _________________________________________________________________ Ivan Amato The Planet Itself Is Becoming Self-Aware Living flesh is innervated with all kinds of sensors like taste buds, pressure sensors, photoreceptors and position sensors in muscle fibers, which monitor internal and external conditions. Brains analyze signals from these sensors using built-in and ever-evolving models of the world (models that include the owners of the brains) and then use these analyses to formulate plans of action. One of the most important un(der)reported stories today is the way the inanimate world built by humanity is becoming ever more innervated with sensors (cameras, microphones, strain gauges, magnetic sensors, GPS receivers, transponders, infrared sensors, satellite surveillance, etc.) as well as communications systems linking these sensors to computers that can store, analyze and act on those signals just like biological brains. 
What's more, all of these sensors are likely to ultimately link into a next-generation Internet via ultra-miniaturized, on-board, wireless connections (one of the main R&D thrusts of the microelectromechanical systems community). Millions of millions of thermometers, barometers, GPS transponders in vehicles, seismic monitors, radiation monitors, department store surveillance cameras, and thousands of other gadgets watching the world will all feed data into the system. This will amount to a global-scale, sensate infrastructure -- a planet-sized body, that is -- whereby myriad sensory signals will constantly feed into a global-scale cyberspace coursing with sophisticated pattern-recognition abilities, knowledge-discovery (data-mining) systems, and other artificial cognition tools. One consequence will be that Earth will have a new kind of planetary self-awareness akin to the bodily awareness living creatures have due to their sensory tissue. Debate about personal privacy will become almost moot since the entire world will constitute a glass house. On the up side, the complexity of this worldwide awareness -- and the new categories of data about the world that will become available -- is likely to lead to emergent phenomena as surprising as the way life emerges from molecules and consciousness from life. IVAN AMATO, freelance print and radio writer; editor of the Pathways of Discovery essay series for Science Magazine; author of Stuff: The Materials The World Is Made Of and Pushing the Horizon, which is an institutional history of the Naval Research Laboratory. _________________________________________________________________ Arnold Trehub How The Human Brain Models The World A plausible explanation of how the brain can create internal models of veridical and hypothetical worlds has long eluded theorists. But recently there has been significant progress in the theoretical understanding of this defining aspect of human cognition, and it has scarcely been reported. 
About a decade ago, I wrote in The Cognitive Brain that the capability for invention is arguably the most consequential characteristic that distinguishes humans from all other creatures. Our cognitive brain is especially endowed with neuronal mechanisms that can model within their biological structures all conceivable worlds, as well as the world we directly perceive or know to exist. External expressions of an unbounded diversity of brain-created models constitute the arts and sciences and all the artifacts and enterprises of human society. The newsworthy story is that we now have, for the first time, a biologically credible large-scale neuronal model that can explain in essential structural and dynamic detail how the human brain is able to create internal models of its intimate world and invent models of a wider universe. ARNOLD TREHUB is adjunct professor of psychology at the University of Massachusetts Amherst. He has been the director of a laboratory devoted to psychological and neurophysiological research and is the author of The Cognitive Brain. _________________________________________________________________ Brian Goodwin Quality Pigs My story is about pigs! How could anything connected with pigs possibly have significant cultural consequences? It comes from research that entails a fundamental change in the scope of scientific inquiry. To appreciate what is at stake, we need to recall a basic assumption in the practice of western science: reliable knowledge about nature depends upon measurement. We can be sure of the wavelength of light rays from the setting sun, but there's no way we can determine the beauty of a sunset. Or we can find out the weight of a pig, but we can never know if a pig is happy or sad. Western science is about quantities, which are regarded as 'objective' properties of the world that everyone using the same method of measurement can agree on. 
It is not about qualities such as pleasure, pain, honesty, happiness or grief, which are regarded as subjective states that are not objectively real, however important they may seem to us. But what if it could be shown that qualities can be evaluated just as reliably and consistently as quantities? And by essentially the same scientific procedures? This is what has been shown in studies by a research team working in Edinburgh. People were shown videos of individual pigs interacting in a standard pen with the team leader, Francoise Wemelsfelder. They were asked to write down for each pig any set of terms that they felt described the quality of its behavior. These included words such as bold, aggressive, playful for one animal; timid, shy, nervous for another; indifferent, independent, self-absorbed for a third, and so on. There was no limit to the number of descriptors that could be used for any pig. A routine procedure was then followed in which each pig was evaluated again by each observer using all their chosen pig-descriptive terms and the results compared over the whole group of observers to see if there was consistency of evaluation. This type of procedure is regularly used in evaluation of food quality and flavour, but it has never before been used to see if people agree about an animal's 'subjective' state in terms of its behavior. The results were startling: there was a high level of consensus among people about the quality of behavior shown by different pigs. Their assessments were not arbitrary, personally idiosyncratic descriptions, but evaluations with a high degree of intersubjective consistency. This is precisely the basis of scientific 'objectivity': agreement between different observers using an agreed method of observation. This opens the door to a science of qualities with startling implications. 
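The consensus check Goodwin describes can be sketched in a few lines of code. The Edinburgh team's actual analysis used a more sophisticated statistical procedure (Generalized Procrustes Analysis, which handles each observer's freely chosen terms directly); the sketch below substitutes a much simpler stand-in measure -- the average pairwise correlation between observers' collapsed per-pig profiles -- and all observer names, descriptive terms, and scores are invented for illustration.

```python
# Toy sketch of intersubjective consistency, in the spirit of the
# Edinburgh pig study. Each observer invents their own descriptive
# terms and scores every pig on each term (0-10). We collapse each
# observer's terms to one score per pig, then ask how well the
# observers' assessments of the pigs agree with one another.
# All data below are invented for illustration.

from itertools import combinations

def mean(xs):
    return sum(xs) / len(xs)

def pearson(a, b):
    """Pearson correlation between two equal-length score lists."""
    ma, mb = mean(a), mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# observer -> {term -> [score for pig 0, pig 1, pig 2, pig 3]}
observers = {
    "obs1": {"bold": [9, 2, 5, 7], "playful": [8, 1, 4, 6]},
    "obs2": {"confident": [8, 3, 4, 7], "lively": [9, 2, 5, 8],
             "curious": [7, 1, 3, 6]},
    "obs3": {"outgoing": [9, 1, 6, 7]},
}

# Collapse each observer's terms to a single per-pig score (the mean
# across that observer's own terms).
profiles = {
    name: [mean(scores) for scores in zip(*terms.values())]
    for name, terms in observers.items()
}

# Consensus = average pairwise correlation between observers' profiles.
pairs = list(combinations(profiles.values(), 2))
consensus = mean([pearson(a, b) for a, b in pairs])
print(f"mean pairwise correlation: {consensus:.2f}")
```

High agreement of this kind, replicated across many observers, is exactly what licenses treating qualitative judgements as 'objective' in the scientific sense: different observers, an agreed method of observation, convergent results.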
The most important aspects of our lives are connected with qualities: quality of relationships, quality of education, quality of our environment, quality of life generally. We spend a great deal of time evaluating the behavior of those on whom we depend and trying to sort out whether they are happy, angry, depressed, reliable, and so on; i.e., we get a lot of practice at evaluating others' internal states by reading their behavior. And on the whole we are pretty good at it, despite dramatic errors of judgement to which we are prone. So it isn't all that surprising that people with no familiarity with pigs should nevertheless be very consistent at evaluating the quality of their behavior. But what is most dramatically lacking in the lives of people in 'developed' countries at the moment is, by general consensus, quality of life. Quantities we have in abundance -- of food, technological gadgets of all kinds, cars, aircraft, information, and so on; the things that our science of measurement and quantities has been so successful at providing. But that science has degraded qualities such as beauty, love, joy, grief, and creativity to mere epiphenomenal subjectivity, regarding them as ephemeral shadows with no objective reality. We intuitively know better. But now we can actually explore this territory systematically, scientifically, and reinvest our world with the qualities that are essential for living full lives; not just for humans but also for pigs and cows and trees and cities and landscapes and watersheds and cultures and the biosphere. With a science of qualities we can start to recover the wisdom we lost when we restricted our search for reliable knowledge to measurable quantities and cut ourselves off from the qualitative half of the world without which we and all else must perish. 
BRIAN GOODWIN is a professor of biology at Schumacher College and the author of Temporal Organization in Cells and Analytical Physiology, How The Leopard Changed Its Spots: The Evolution of Complexity, and (with Gerry Webster) Form and Transformation: Generative and Relational Principles in Biology. Dr. Goodwin is a member of the Board of Directors of the Santa Fe Institute. Further reading on Edge: Chapter 4, The Third Culture; "A New Science of Qualities": A Talk With Brian Goodwin _________________________________________________________________ Stephen Grossberg How does a brain give rise to a mind? When we think about our conscious experiences of the world, we are aware of vivid colors, sounds, and feelings. These seem to occur in a world of three spatial dimensions evolving through time. When we look at a typical brain, we see an unbelievably intricate web of networks energized by electrical and chemical processes. These networks often have incredibly large numbers of components. Given such different levels of description, it has been difficult to comprehend how a brain can give rise to a mind. During the past decade, theorists of mind and brain have finally been able to model the detailed temporal dynamics of known brain cells and circuits...AND the observable behaviors that they control, using the same model. Thus, for the first time in history, there are theories powerful enough to begin to explicitly show how our minds arise from our brains. The exciting thing about this progress is that, in every case, it depends upon qualitatively new concepts and theories about how our brains adapt on their own, moment-by-moment, to a rapidly changing world. Many outstanding problems in technology also require that an intelligent agent adapt on its own, moment-by-moment, to a rapidly changing world. These new concepts and theories have therefore begun to find their way into new technological applications. 
Thus the new progress in understanding how our brains work promises to provide a more human technology. The story has not been adequately reported because the new concepts are qualitatively new -- for the same reason that brains seem to be so different from minds -- and many reporters have not taken the time to understand them. STEPHEN GROSSBERG is one of the principal founders of the fields of computational neuroscience, connectionist cognitive science, and artificial neural network research. He is Wang Professor of Cognitive and Neural Systems and Professor of Mathematics, Psychology, and Biomedical Engineering at Boston University. He is Co-Editor-in-Chief of the journal Neural Networks, which is the official journal of the three major neural modeling societies in the world. LINK: Stephen Grossberg Home Page _________________________________________________________________ Philip W. Anderson (1) The closing of the High Flux Beam Reactor at Brookhaven National Lab; (2) Eppawala I have two answers, which will certainly qualify because you may well never have heard of either. The first, and my primary answer because it is a local matter that the Third Culture can hope to affect, is the closing of the High Flux Beam Reactor at Brookhaven National Lab. Many high-profile media scientists proclaimed the Supercollider decision as the point at which the US definitively turned away from science. But it was nothing of the sort. It was a badly managed and unwisely promulgated project, immensely expensive and disconnected from the rest of science, about which many perfectly reasonable scientists had serious doubts -- not just me, though I seem to have taken the brunt of the blame. My nominee for this turning point is the HFBR. 
A coalition of pseudo-environmentalists and trendy New Agers, useful and well-heeled Clinton friends and campaign contributors with Long Island real estate, blew up a leak that amounted to no more tritium than is found in the exit signs of your local movie theatre into a major issue, and Bill Richardson, the Secretary of Energy, caved without listening to the scientists involved. It is reported that the coalition arranged an interview for the Secretary with a supermodel on the afternoon the scientists had asked for a last appeal. In any case the loss of the HFBR closes one of the world's most productive scientific instruments and sends the entire community off to our friendly competitors in Europe and Japan. Neutron scattering and diffraction is central to much of condensed matter physics and useful in biology, chemistry and several branches of technology. Approximately 300 experiments were run in the last year the HFBR was up. There was no conceivable economic reason for shutting it down -- it was a very inexpensive instrument relative to the projects which are replacing it. Its real problem is the anti-intellectual bias of the majority culture. If only we had been able to label it "organic" rather than "nuclear" it would have survived. I will be brief about the other. You will never have heard the name "Eppawala". This is a project of a US-Japan consortium to mine phosphate in gigantic quantities from a mountain in Sri Lanka, destroying a thousand-year-old irrigation system, numerous antiquities, and many villages, and compensating the locals on a typical Third World scale with a minute fraction of the profits -- profits which hardly exist if one were to count the true cost of the project. It is a staggering example of the misuse of economic reasoning which characterizes third world "development" projects. Not just third world, in my opinion! PHILIP W. ANDERSON is a Nobel laureate physicist at Princeton and one of the leading theorists on superconductivity. 
He is the author of A Career in Theoretical Physics and a co-editor of The Economy as an Evolving Complex System. LINK: Philip W. Anderson Home Page. _________________________________________________________________ James J. O'Donnell The Remaking Of Personality As Seen In Language. Until a few years ago, tracking the evolution of language was only possible for idle gentlemen: university scholars, amateur dilettantes (whence arose the Oxford English Dictionary), or pompous columnists (only one of whom has a weekly piece in the New York Times Magazine). It was carried on from an elite gentleman's club perspective and consisted of deploring decline and patronizing neologism. But now we have the tools to do a vastly better job of paying attention to what *we* are saying. Huge quantities of "real" language as it is spoken and written can be collected easily and subjected to sophisticated tracking analysis. Gender, class, nationality: all can be revealed and studied as never before. The ethical and political bases of society as it is (not as it imagines itself to be) can be displayed and analyzed. Why is this important? Because the way we talk about ourselves and others is the way we create ourselves and our society. The ethical and social revolutions of the last half century have been far reaching, but it is still possible for those who prefer nostalgia to justice to wish those revolutions away. The emergence of a serious journalism of language, supported by good science, would document the way in which all classes and social groups have changed and continue to change. It would tell us things about ourselves that we know in our hearts but have not had the self-awareness and the wisdom and the courage to say to ourselves aloud. I believe we would all be happier if we knew how far we have come, and I can think of no better way of measuring and showing it. JAMES J. 
O'DONNELL, Professor of Classical Studies and Vice Provost for Information Systems and Computing at the University of Pennsylvania, is the author of Avatars of the Word: From Papyrus to Cyberspace. LINK: The James J. O'Donnell Website. _________________________________________________________________ Sylvia Paull Women Are Still Considered Inferior To Men After two thousand years of "civilization," women are still considered inferior to men by most cultures, whether in developed, developing, or undeveloped nations. Although the media report on glass ceilings in the workplace, they do not penetrate beyond the economic discrimination women face into the culture itself: What is it that makes most men think they are superior to women? Why is the thought of electing a woman president of the United States so unthinkable to most of the population? Why is it surprising that most Fortune 1000 companies still lack a woman on their board of directors? Why do women athletes still lack funding and popular support on a scale that their male counterparts garner? Because after 2,000 years of recorded history, and 20,000 years of artifact-preserved history, women have generally been relegated to the role of breeders, not leaders. And even though technological and economic advances have allowed women to have children as well as professional careers, their multimillennial image as background breeders persists. This pervasive fallacy continues to limit the creative potential of half of the world's population. The underlying belief in women's inferiority seems to be so ingrained in our collective psyches that even the media don't seem motivated to investigate -- let alone challenge -- its roots. SYLVIA PAULL is founder of Gracenet, a group of women who work in computing, high-tech journalism, and related fields and who support the advancement of women in these fields. LINK: Gracenet _________________________________________________________________ Thomas Petzinger, Jr. 
The End Of Money No one, least of all in the press -- least of all in the business press -- has seen the beginnings of what may be the greatest revolution in the history of commerce: the end of money, and with it the concept of the customer. Until there was money, there was no such thing as a customer. It wasn't swapping tools for fish that turned a Polynesian islander from a trader into a customer. That's simply barter. The idea of "buyer" and "seller" emerged only when one party swapped something with a fixed use for something fungible. Often, the money received by the seller had a modest utilitarian purpose; gold, for instance, could be hammered into nose rings, false teeth or satellite solar arrays. But money became the foundation of economic life precisely because it had symbolic more than practical value. Then God gave us lawyers and accountants to prevent underweighing and overcharging, to make sure that every exchange of tangible things for intangible money was perfectly balanced, perfectly reciprocal. But this is a conceit of economists, accountants and lawyers, as everyday commercial life reveals. Because it can be turned into anything, money represents dreams unfulfilled, and unrequited dreams, at any price, are worth more than dreams realized. We all realize this intuitively. A buyer asks a seller to give up a mere thing; a seller asks a buyer to give up hopes and possibilities. For the same reason, it's more costly for sellers to recruit buyers than for buyers to recruit sellers: Sellers can exchange their stuff for only one thing (money), while buyers can exchange their money for anything. That's why, in the real world of purportedly balanced transactions, sellers invariably defer to buyers -- why we say "the customer is king" and "The customer is always right." But let's say it's 2000 and you're Time Inc. You own some of the best-known media properties in the world: Sports Illustrated, People magazine, etc. You want to leverage those properties. 
So you approach Yahoo!, say, or America Online. You propose to provide content to them. They propose to promote your brand. And as you sit down to the bargaining table to sort out the economics of all this, you throw up your hands and ask, "Are we paying you or are you paying us?" That's how these negotiations actually go. "Who's paying whom?" Asking a question like that signals that maybe nobody needs to pay anything to anybody. Lots of value is created, but "nobody's paying for it." It just happens because two (or more) business partners create something together. In these situations firms can't begin to account for the nickels and dimes in the deal and may not even bother trying. In these situations, relationships triumph over transactions. Money drastically diminishes as a factor in the deal. And the identity of the customer -- Are we paying you or are you paying us? -- becomes fuzzy. The very concept of the customer begins to disappear. Look at Silicon Valley. Every major firm there is a node in a complex network in which a huge fraction of the value creation could never be accounted for in monetary terms. Should Intel pay for Microsoft to optimize operating systems in a way that makes Intel chips ubiquitous? Or should Microsoft pay Intel to design chips that make Microsoft operating systems ubiquitous? The press and the pundits are clueless about the effects of these de-monetized value-added dealings. No wonder, because all their measurements are expressed as units of money. Unless some dough changes hands, even the biggest commercial developments are as unheard as trees falling in the distant forest. The data mavens at Commerce are blind to the value created when Yahoo! adds a new Web site listing or when MapQuest shaves 0.6 miles off my trip. 
When the Labor Department calculates the Consumer Price Index, it has no idea that its own Web pages are being dished out on free Linux source code, or that a building contractor in Bowie, Md., decided to eat a change order because he wanted to preserve the goodwill of his client -- and that more and more of the economy is being transacted on such a basis. When Dr. Greenspan and the pooh-bahs at the Fed deliberate over the "irrational exuberance" of the stock market, how much weight do you suppose they're giving to the fact that the marginal cost of a transaction in a world of e-commerce has essentially dropped to zero? More de-monetization. Today most of the money in the world isn't even made of paper, much less metal. It exists as binary digits. No wonder the central banks of the world are heaving their gold reserves into a collapsing market. Who needs gold when money sheds the slightest pretense of being anything but data? Say good-bye to gold. Gold is history. If you want currency backed by something tangible, sign up for 5,000 frequent-flier miles on a new Visa card. THOMAS PETZINGER, JR., has spent 21 years as an editor, reporter, and weekly columnist at The Wall Street Journal, where he served most recently as Millennium Editor. His latest book is The New Pioneers: The Men and Women Who Are Transforming the Workplace and Marketplace. LINK: Thomas Petzinger, Jr. Home Page _________________________________________________________________ Stephen Kellert The Concept Of Ecology As Aldo Leopold once suggested, the greatest invention of the 20th century might be the concept of ecology. It reminds us of the illusory nature of the concept of the autonomous, unitary, individual being. We are all -- ourselves as persons, the human animal, all species, in all likelihood, the universe -- a constant product of relational and transactional processes. STEPHEN R. KELLERT, a social ecologist at Yale and E.O. 
Wilson's collaborator, is recognized as the world's foremost authority on human relationships to animals. The New York Times has featured his work in various articles, and his work has been recognized by a number of awards, including the Distinguished Individual Achievement Award of the Society for Conservation Biology. Kellert is coeditor with E.O. Wilson of The Biophilia Hypothesis and author of Kinship to Mastery and The Value of Life. _________________________________________________________________ Eric J. Hall The World Is Losing Potable Water At A Rate That Is Unprecedented. Quite an interesting question, and one that required both research and introspection. I finally had to rely on what I have experienced in travelling to six of the earth's continents. The world is losing potable water at an unprecedented rate. Water supplies in Saharan and sub-Saharan Africa are contaminated with bilharzia (schistosomiasis), giardia, and other waterborne parasites and diseases. Russia has seen the Caspian Sea and other landlocked lakes decline rapidly over the past 25 years, due to evaporation from a warmer climate and drainage to irrigate arid lands. Populations are expanding into areas where water is scarce, and aquifers will be drained in less than 100 years at current consumption rates. Even my favorite Sierra streams are polluted with giardia and require treatment before you can drink the water. China, India, Asia, and South America are also faced with the same problems of waterborne diseases and decreasing supplies. In the 19th century we saw battles fought over water rights in New Mexico. What will happen in the next century, when hydro projects in one nation decrease the flow of water to a neighboring country? This is already happening in Turkey, where a massive hydro project threatens Iraq's water supply. Will this be Saddam's next war? 
Scientists still do not understand what impact the desalinization of sea water will have on the balance of the global ecosystem. The salt removed has to go somewhere, usually back into the water. Will increased salinity affect the polar ice caps or the temperature of the earth's oceans? We have already seen an increase in the salinity of water in Russia, the Middle East, and even in areas of the USA. In terms of consumption, we waste a tremendous amount of water as individuals. More water goes down the drain or the gutter as we continue to plant water-intensive lawns or keep the tap running while we brush our teeth. Recent droughts in California have highlighted how much water can be saved on a daily basis. Conversely, in many third-world countries more time is spent each day bringing the necessary water to the home as wells dry up. You can't be productive if you're always fetching water. The next global war will be fought over water, not imperialism or political ideology. Man can live without a lot of things, but not water. Mainstream media in all countries, as well as conservation groups like the Sierra Club, have been absent in their reporting. What will the world do when there's not enough rain to go around? ERIC J. HALL is President of The Archer Group, a consulting firm specializing in emerging technology companies. He has helped found companies including Yahoo!, Women.com, The ImagiNation Network, and rightscenter.com, Inc. _________________________________________________________________ John Gilmore The World Isn't Going to Hell Media sells your attention to advertisers using bad news. This makes people think the bad news is the real state of the world. Pollution is making the world a worse place, raw materials are being used up, bigger populations are overconsuming us into wretchedness, etc. They even want all of us to waste our time sorting garbage for fear that we'll run out of barren land to put it on! 
The real story is that human life has gotten better and better and better over the centuries. The world used to be a very polluted place -- if you count deadly infectious bacteria in the environment. Centuries of focus on clean drinking water, separating sewers from food and water supplies, medicine, and nutrition have resulted in human life span being literally doubled and tripled, first in "civilized" countries and then in "developing" countries. China has doubled its life span in this century. Everyone grows up taller and stronger than hundreds of years ago. There is much less pollution in London today than in any recorded century. There is much less pollution in the US today than in any recorded decade. There are more proven oil reserves than ever before, and in fifty years, when those have been used, another fifty or sixty years' worth will have become worth locating. There are more acres of forest in North America than a hundred years ago. (There's a market for growing trees now, and they can be transported to where people want to buy the wood! Two hundred years ago it was more work to move wood twenty miles in carts on mud roads than it was to take it across the Atlantic!) Resources of all types are getting cheaper and cheaper, as measured in decades and centuries. There is no reason to believe that these trends will change. (Remember the people like Paul Ehrlich who predicted world famine by 2000? They are still making predictions, but you shouldn't believe 'em any more, because it's 2000 and the starving hordes aren't here.) Prof. Julian Simon was a "liberal who got mugged" -- by the facts. He started off trying to prove the environment was getting worse, but whenever he found actual historical data, it contradicted that thesis. Eventually he changed his mind and started writing books about it. Ultimate Resource 2 is his updated book about how all resources except one-of-a-kind objects are becoming less scarce, except humans. 
(Human attention commands higher and higher prices over the decades, a trend easily visible in the price of labor, despite there being more humans around.) The State of Humanity is a very well documented (footnoted) survey of human life, health, and the environment. It points you to actual historical data showing the real long-term trends in human longevity, health, welfare, prices of materials, acres of forest, number of species known, pollution, smoke -- you name it, he's got it. Knee-jerk liberals beware! These are the kinds of books that you won't like to read. The interior feeling of a mind stretching is uncomfortable, though the result is well worth it. JOHN GILMORE is an entrepreneur disguised as a philanthropist. Or perhaps vice versa. He co-founded the Electronic Frontier Foundation, the "alt" newsgroups, the Cypherpunks, and, in 1989, the open source company Cygnus Solutions. He's been pushing encryption technology out of government spy agency control for 15 years. He's a big believer in civil rights, even for Internet users and those who like drugs. He thinks nation-states have killed more people than any other single force in history, and wishes to see them wither to a tiny fraction of their current size and power over their citizens. And he refuses to "recycle" his trash -- if it's worth it to you, you can recycle it. _________________________________________________________________ Sally M. Gall The Potentially Glorious, or Dangerous, Massive Cultural Impact of Mitochondrial DNA Studies. The increasing use of mitochondrial DNA in determining genetic relationships among human beings opens up the extraordinary possibility of a global registry in which every individual knows his or her antecedents and degree of genetic closeness to all other living human beings. What would be the result of such knowledge? 
A delight in finding out that we are all more or less brothers and sisters under the skin, leading -- one hopes -- to a decrease in hostilities between antagonistic groups? Or would a new clannishness emerge, in which anyone who is more than, say, six degrees of genetic separation from oneself is identified as a natural enemy? SALLY M. GALL holds degrees from Harvard/Radcliffe and NYU. A librettist, poet, critic, and scholar, she now specializes in writing texts for a broad range of music drama. _________________________________________________________________ Rodney A. Brooks People are Morphing into Machines. Since I work in building autonomous humanoid robots, reporters always ask me what will happen when the robots get really smart. Will they decide that we (us, people) are useless and stupid and take over the world from us? I have recently come to realize that this will never happen. Because there won't be any us (people) for them (pure robots) to take over from. Barring an asteroid-sized thwack that knocks humans back into pre-technological society, humankind has embarked on a journey of technological manipulation of our bodies. The first few decades of the new millennium will be a moral battleground as we question, reject, and accept these innovations. Different cultures will accept them at different rates (e.g., organ transplantation is currently routine in the United States but unacceptable in Japan), but our ultimate nature will lead to widespread adoption. And just what are these technologies? Already there are thousands of people walking around with cochlear implants, enabling the formerly deaf to hear again -- these implants include direct electronic-to-neural connections. Human trials have started with retina chips being inserted in blind people's eyes (for certain classes of blindness, such as macular degeneration), enabling simple perceptions. 
Recently I was confronted with a researcher in our lab, a double leg amputee, stepping off the elevator that I was waiting for -- from the knees up he was all human; from the knees down he was robot, and a prototype robot at that -- metal shafts, joints full of magnetorheological fluids, single-board computers, batteries, connectors, and wire harnesses flopping everywhere; not a hint of antiseptic packaging -- it was all hanging out for all to see. Many other researchers are placing chips in animal, and sometimes human, flesh and letting neurons grow and connect to them. The direct neural interface between man and machine is starting to happen. At the same time, surgery is becoming more acceptable for all sorts of body modifications -- I worry that I am missing the boat carrying these heavy glasses around on my nose when everyone else is going down to the mall and having direct laser surgery on their eyes to correct their vision. And at the same time, cellular-level manipulation of our bodies is becoming real through genetic therapies. Right now we ban Olympic athletes who have used steroids. Fairly soon we may have to start banning kids with neural Internet connection implants from having them switched on while taking the SATs. Not long after that, it may be virtually mandatory to have one in order to have a chance on the new ISATs (Internet SATs). We will become a merger between flesh and machines, and we (the robot-people) will be a step ahead of them (the pure robots). We won't have to worry about them taking over. RODNEY A. BROOKS is Director of the MIT Artificial Intelligence Laboratory and Fujitsu Professor of Computer Science. He is also Chairman and Chief Technical Officer of IS Robotics, an 85-person robotics company. Dr. 
Brooks also appeared as one of the four principals in the Errol Morris movie "Fast, Cheap, and Out of Control" (named after one of his papers in the Journal of the British Interplanetary Society) in 1997 (one of Roger Ebert's 10 best films of the year). Further reading on Edge: "The Deep Question," A Talk With Rodney Brooks. LINKS: Rodney A. Brooks Home Page; and Cog: A Humanoid Robot. _________________________________________________________________ John McWhorter The Transformation Of The American Musical Ear There are now two generations of Americans who have grown up after the rock revolution of the late 1960s, for whom classical music and the old-style Broadway/Hollywood songs are largely marginal. As a result, today's typical American ear is attuned more to rhythm and vocal emotion -- the strengths of rock and rap -- than to melody and harmony, the strengths of classical music and Golden Age pop. This is true not just of teenagers but of people roughly fifty and under, and has been the most seismic shift in musical sensibility since the advent of ragtime introduced the American ear to syncopation a century ago. A catchy beat is not just one element but the sine qua non of most pop today, opening most songs instead of the instrumental prelude of the old days. The increasing popularity of rhythm-centered Third World pop (pointedly called "World Beat") underscores this change in taste. Certainly folks liked a good beat before Elvis, but much of even the most crassly commercial dance music before the 1950s was couched in melody and harmony to a degree largely unknown in today's pop. Our expectations have so shifted that the rock music that critics today call "melodic" would sound like Gregorian chant to members of even the cheesiest little high school dance band in 1930. Yet what pop has lost in craft it has gained in psychological sophistication, and the focus on vocal emotion is part of this. 
In the old days, singers made their marks as individuals, no doubt, but, for example, Sinatra's artistry was in being able to suggest a range of emotions within the context of rather homogeneous lyrics. Modern pop singers like Alanis Morissette are freed from these constraints, and the variety and individuality of many modern pop lyrics have made them America's true poetry; indeed, many listeners relate to the lyrics of their favorite rock singers with an intensity our grandparents were more likely to devote to the likes of Robert Frost. Yet the fact remains that for the typical American of the future, melody and harmony will be as aesthetically marginal as they are to the African musician whose music is based on marvelously complex rhythms, with a vocal line serving largely rhythmic and/or decorative ends (notably, World Beat listeners are little concerned with not understanding most of the lyrics; it's the vocal texture that matters). Lyrics will continue to count, but their intimate linkage to musical line will be of no more concern than individual expression or complex rhythm was to pop listeners sixty years ago. I once attended a screening of a concert video from the mid-1960s in which Sammy Davis, Jr., who occupied the transitional point between the old and the current sensibility, sang Cole Porter's "I've Got You Under My Skin" first "straight," and then without accompaniment, eventually moving into scatting and riffing rhythmically on the merest suggestion of the written vocal line for a good few minutes, in a vein we would today call "performance art." Young hipsters behind me whispered "This is rad!"; a few seconds later I heard an elderly woman in the front row mumble "Enough of this is enough!" She would have been happy to hear Davis simply sing it through once with the orchestra; the hipsters wouldn't have minded Davis walking out and doing only the vocal riffs -- and they are the American musical ear of today and tomorrow. JOHN H. 
MCWHORTER is Assistant Professor of Linguistics at the University of California at Berkeley. He taught at Cornell University before entering his current position at Berkeley. He specializes in pidgin and creole languages, particularly of the Caribbean, and is the author of Toward a New Model of Creole Genesis and The Word on the Street: Fact and Fable About American English. He also teaches black musical theater history at Berkeley and is currently writing a musical biography of Adam Clayton Powell, Jr. Further reading on Edge: "The Demise of Affirmative Action at Berkeley": An Essay by John McWhorter. _________________________________________________________________ Stewart Brand The Peace Dividend Whatever happened to looking for the Peace Dividend? What if the rampant prosperity in America these years is it? Money not spent on Defense gets spent on something. Research not sequestered into Defense applications gets loose into the world faster, and in pace with other events in technology and science. Policies not organized around paranoia can be organized around judicious optimism. The more former enemies there are, the more new customers and suppliers. (Are Democrats getting credit for something that Republicans did -- win and end the Cold War? But maybe Democrats are exactly who you want running things when a long debilitating war ends.) Question: if the Peace Dividend is prosperity, is it a blip good for only a couple of years, or is it a virtuous circle that goes on and on? STEWART BRAND is founder of the Whole Earth Catalog, cofounder of The Well, cofounder of Global Business Network, and cofounder and president of The Long Now Foundation. He is the original editor of The Whole Earth Catalog, and author of The Media Lab: Inventing the Future at MIT, How Buildings Learn, and The Clock of the Long Now: Time and Responsibility (MasterMinds Series). Further reading on Edge: "The Clock of the Long Now," A Talk by Stewart Brand; and Chapter 3, "The Scout," in Digerati. 
LINKS: Global Business Network; and The Long Now Foundation. _________________________________________________________________ Judith Rich Harris Parenting Styles Have Changed But Children Have Not What stories are most likely to go unreported? Those that have to do with things that happen so gradually that they aren't noticed, or happen so commonly that they aren't news, and those that have politically incorrect implications. A story that has gone unreported for all three reasons is the gradual and pervasive change in parenting styles that has occurred in this country since the 1940s, and the consequences (or lack of consequences) of that change. In the early part of this century, parents didn't worry about shoring up their children's self-esteem or sense of autonomy, and they didn't feel called upon to provide them with "unconditional love." They worried that their children might become spoiled, self-centered, or disobedient. In those days, spankings were administered routinely, often with a weapon such as a belt or a ruler. Kisses were exchanged once a day, at bedtime. Declarations of parental love were made once a lifetime, from the deathbed. The gradual but dramatic change in parenting styles over the past 50 years occurred mainly because more and more parents were listening to the advice of the "experts," and the experts' advice gradually changed. Nowadays parents are told that spankings will make their children more aggressive, that criticism will destroy their self-esteem, and that children who feel loved will be kinder and more loving to others. As a result of this advice, most parents today are administering far fewer spankings and reprimands, and far more physical affection and praise, than their grandparents did. But that's only half the story. The other half is the results, or lack of results, of this change in parenting styles. Are today's children less aggressive, kinder, more self-confident, or happier than the children of two generations ago? 
If anything, the opposite is true. Rates of childhood depression and suicide, for example, have gone up, not down. And certainly there has been no decline in aggressiveness. The implications, whatever they are, are bound to be politically incorrect. Perhaps the "experts" don't know what they're talking about. Perhaps parenting styles are less important than people have been led to believe. Perhaps human nature is more robust than most people give it credit for -- perhaps children are designed to resist whatever their parents do to them. It's possible that being hit by a parent doesn't make children want to go right out and hit their playmates, any more than being kissed by a parent makes them want to go right out and kiss their playmates. It's even possible (dare I suggest it?) that those parents who are still doling out a lot of punishment have aggressive kids because aggressiveness is, in part, passed on genetically. But now I'm getting into a story that HAS been reported. JUDITH RICH HARRIS is a writer and developmental psychologist; co-author of The Child: A Contemporary View Of Development; winner of the 1997 George A. Miller Award for an outstanding article in general psychology, and author of The Nurture Assumption: Why Children Turn Out The Way They Do. Further reading on Edge: "Children Don't Do Things Halfway". A Talk with Judith Rich Harris; Judith Rich Harris Comments on Frank J. Sulloway's Talk "How Is Personality Formed?". LINK: The Nurture Assumption Web Site. _________________________________________________________________ Lee Smolin The Internationalization of the Third Culture The internationalization of the third culture, by which I mean the growth of a class of people who do creative work of some kind (science, arts, media, business, technology, finance, fashion...) who live and work in a country other than their own, are married to such a person, or both. 
This is not a new situation, but what is new is the extent to which the combination of inexpensive air travel, telephone, the Internet, and computer technologies makes living and working outside of one's native country not only easy but increasingly attractive for a growing proportion of people in these professions. This is a natural consequence of the internationalization of these areas, which has made frequent international travel and periods of studying and working abroad the norm rather than the exception. It is made possible by the ascendancy of English as a global language and the long period in which the developed world has been more or less at peace. With the end of the cold war, the growth of democracies in Latin America and the Far East, and the unification of Europe, there remain few significant political obstructions to the growth in size and influence of a denationalized community of people who work in exactly those areas which are most critical for shaping the human future. This class of people shares not only a common language and a common set of tastes in food, clothing, coffee, furniture, housing, entertainment, etc., but is increasingly coming to share a common political outlook, which is far more international than that of the old literary cultures, based as they are each on a national language and history. It is perhaps too early to characterize this outlook, but it involves a mix of traditional social democratic and environmental concerns with an interest (or perhaps self-interest) in the links between creative work, the international exchange of ideas and technologies, and economic growth. Moreover, they share an interest in the conditions which make their lives possible -- peace, stability, democracy, and economic prosperity -- and these are more important to them than the nationalist concerns of their native countries. 
It is not surprising that the daily experience of juggling different languages, identities and cultures gives these people a much more optimistic outlook concerning issues such as pluralism and multiculturalism than those from the literary cultures. Most of them feel an attachment and identification to their native culture, but they also feel alienated from the party politics and petty nationalisms of their home countries. When they move to a new country they do not immigrate in the traditional sense, rather they enter a denationalized zone in which their colleagues and neighbors come from an array of countries and the place where they happen to be is less important than the work they do. How they and their children will resolve these different loyalties is far from clear. One can meet young people whose parents each speak a different language, who grew up in a third country, did a university education in a fourth, and now work in a fifth. What the political loyalties of such people will be is impossible to predict, but it seems not impossible that the growing concentrations of such people in the areas of work that most influence public taste and economic growth may catalyze the evolution of nation states into local governments and the invention of a global political system. LEE SMOLIN is a theoretical physicist; professor of physics and member of the Center for Gravitational Physics and Geometry at Pennsylvania State University; author of The Life of The Cosmos. Further Reading on Edge: "A Theory of the Whole Universe" in The Third Culture; "A Possible Solution For The Problem Of Time In Quantum Cosmology" by Stuart Kauffman and Lee Smolin. _________________________________________________________________ Roger Schank The Politicians Who Are Running On Education Platforms Don't Actually Care About Education As each presidential candidate makes education the big issue for his campaign we need to understand that none of them actually wants change in education. 
There are two main reasons for this. The first is vested interests. Those who oppose real change in our schools include teachers' unions, who lobby heavily for the status quo; book publishers, who are afraid of losing the textbook investments they have made; testing services and test-preparation services, which have a significant investment in keeping things as they are; and parents, who really would be quite frightened if the schools changed in a way that made their own educations seem irrelevant. No politician wants to challenge a group like this, so no politician wants to do any more than pay lip service to the issue. The second reason is more insidious. When real reformers propose that everyone should be equally educated, there are gasps from the elitists who run our government. Their concern? If everyone were educated, who would do the menial jobs? Hard as it may be to believe, this issue is raised quite often in Washington. ROGER SCHANK is the Chairman and Chief Technology Officer of Cognitive Arts and has been the Director of the Institute for the Learning Sciences at Northwestern University since its founding in 1989. He holds three faculty appointments at Northwestern University, as John Evans Professor of Computer Science, Education, and Psychology. His books include Dynamic Memory: A Theory of Learning in Computers and People, Tell Me a Story: A New Look at Real and Artificial Memory, The Connoisseur's Guide to the Mind, Engines for Education, and Virtual Learning: A Revolutionary Approach to Building a Highly Skilled Workforce. Further reading on Edge: "Information is Surprises" -- Roger Schank in The Third Culture; and "The Disrespected Student -- or -- The Need for the Virtual University": A Talk with Roger Schank. LINKS: Cognitive Arts; and Institute for the Learning Sciences. 
_________________________________________________________________ Howard Gardner (1) African Civil Wars; (2) Evolutionary Theory Is Not Intuitive; Creationism Is Two stories, one of global importance, the other of importance in the areas in which I work: Global: At any one time in recent years, there have been civil wars raging in several countries in Africa. Thousands of individuals die each year. Last year, according to one authority, a third of the African countries were at war. Yet because the political aspects of these conflicts are no longer of interest to Americans (because the Cold War is over), the economic stakes have no global importance, and African populations do not capture the attention of well-off Westerners, one needs to be a specialist to find out the details of these conflicts. The contrast with the attention paid to the death of an American youth, particularly one from a middle-class family, is shocking and hard to justify sub specie aeternitatis. Of course, mere knowledge of these conflicts does not in itself solve anything; but it is a necessary step if we are to consider what might be done to halt this carnage. Local: In my own areas of psychology and education, there is plenty of interest nowadays in student achievement in schools. Yet the coverage of these matters in the press almost entirely leaves out knowledge which enjoys wide consensus among researchers. In the area of human development, it is recognized that youngsters pass through stages or phases, and it makes no sense to treat a four-year-old as if he or she were simply a "slower" or less informed middle school student or adult. In the area of cognitive studies, it is recognized that youngsters "construct" their own theories by which they attempt to make sense of the world; and that these intuitive theories often fly directly in the face of the theories and disciplines which we hope that they will ultimately master. 
Because these points are not well understood by journalists, policymakers, and the general public, we keep implementing policies that are doomed to fail. Efforts to teach certain materials in certain ways to youngsters who aren't ready to assimilate them will not only be ineffective but are likely to cause children to come to dislike formal education. And efforts to instruct that fail to take into account -- and challenge -- the often erroneous theories that youngsters have already developed will delude us into thinking that the students are actually understanding materials that remain opaque to them. I think that this happens because as humans we are predisposed to come up with this theory of learning: Our minds are initially empty and the job of education is to fill those vessels with information. It is very difficult for humans to appreciate that the actual situation is quite different: in our early years, we construct all kinds of explanations for things. Our scholarly disciplines can only be mastered if we get rid of these faulty explanations and construct, often slowly and painfully, better kinds of explanations. Put sharply, evolutionary theory is not intuitive; creationism is. And that is why eight-year-olds are invariably creationists, whether their parents are fundamentalists or atheists. HOWARD GARDNER, the major proponent of the theory of multiple intelligences, is Professor of Education at Harvard University and holds research appointments at the Boston Veterans Administration Medical Center and Boston University School of Medicine. His numerous books include Leading Minds, Frames of Mind, The Mind's New Science: A History of the Cognitive Revolution, To Open Minds, Multiple Intelligences, and Extraordinary Minds: Portraits of Four Exceptional Individuals. He has received both a MacArthur Prize Fellowship and the Louisville Grawemeyer Award. Further Reading on Edge: "Truth, Beauty, and Goodness: Education for All Human Beings." 
_________________________________________________________________ Douglas Rushkoff America's Descent Into Computer-Aided Unconsciousness And Consumer Fascism We have taught our machines to conduct propaganda. Web sites and other media are designed to be "sticky," using any means necessary to maintain our attention. Computers are programmed to elicit Pavlovian responses from human beings, using techniques like one-to-one marketing, collaborative filtering, and hypnotic information architecture. Computers then record our responses in order to refine these techniques, automatically and without the need for human intervention. The only metric used to measure the success of banner ads and web sites is the amount of economic activity - consumption and production - they are able to stimulate in their human user/subjects. As a result, the future content and structure of media will be designed by machines with no priority other than to induce spending. It amounts to a closed feedback loop between us and our computers, where - after their initial programming - the machines take the active role and human beings behave automatically. Programs adjust themselves in real time, based on their moment-to-moment success in generating the proper, mindless responses from us. In fact, computers and software are already charged with the design of their own successors. They are encouraged to evolve, while we are encouraged to devolve into impulsive, thoughtless passivity. Those who stand a chance of resisting - people who actually think - are rewarded handsomely for their compliance, and awarded favorable media representations such as "geek chic." These monikers are reserved for intelligent people who surrender their neural power to the enhancement of the machine, by becoming vested web programmers, for example. Those who refuse to suspend active thought are labeled communist, liberal, or simply "unfashionably pessimistic." 
Worse, they are unfaithful enemies of NASDAQ, and the divinely ordained expansion of the US economy. Ultimately, if such a story were actually reported, it would have to dress itself in irony, or appear as the result of an abstract intellectual exercise, so as not to attract too much attention. DOUGLAS RUSHKOFF, a Professor of Media Culture at New York University's Interactive Telecommunications Program, is an author, lecturer, and social theorist. His books include Free Rides, Cyberia: Life in the Trenches of Hyperspace, The GenX Reader (editor), Media Virus! Hidden Agendas in Popular Culture, Ecstasy Club (a novel), Playing the Future, and Coercion: Why We Listen to What "They" Say. Further reading on Edge: "The Think That I Call Doug: A Talk with Douglas Rushkoff". LINK: Doug Rushkoff Home Page. _________________________________________________________________ George B. Dyson The Size Of The Digital Universe Cosmologists have measured the real universe with greater precision than any reportable metric encompassing the extent of the digital universe -- even though much of it is sitting on our desks. Sales figures of all kinds are readily available, whereas absolute numbers (of processors, CPU cycles, addressable memory, disk space, total lines of code) are scarce. We are left to make rough approximations (skewed by volatility in prices) where there could and should be a precise, ongoing count. Have we reached Avogadro's number yet? GEORGE DYSON is a leading authority in the field of Russian Aleut kayaks -- the subject of his book Baidarka, numerous articles, and a segment of the PBS television show Scientific American Frontiers. His early life and work were portrayed in 1978 by Kenneth Brower in his classic dual biography, The Starship and the Canoe. Now ranging more widely as a historian of technology, Dyson's most recent book is Darwin Among the Machines. 
Further reading on Edge: "Darwin Among the Machines; or, The Origins of Artificial Life"; and "CODE - George Dyson & John Brockman: A Dialogue." _________________________________________________________________ Pattie Maes What Scientific Research Is Really Like The public still thinks of research as a very serious and lonely activity. The picture of a scientist that typically comes to mind is that of a person in a lab coat hunched over heavy books, locked up in an ivory tower. The truth is that scientists are more like a group of uninhibited, curious kids at play. Maybe teenagers would be more into science if they had a more accurate picture of what research is like and realized that one way to avoid growing up is to become a scientist. PATTIE MAES is an Associate Professor at MIT's Media Laboratory, where she founded and directs the Software Agents Group, and is principal investigator of the e-markets Special Interest Group. She currently holds the Sony Corporation Career Development Chair. Further reading on Edge: Intelligence Augmentation -- A Talk With Pattie Maes. LINK: Pattie Maes' Home Page. _________________________________________________________________ Mihaly Csikszentmihalyi The Reasons For Right-Wing Extremism In Europe And The U.S. Today (but I hope not tomorrow) I think the most important unreported story concerns the reasons for a return of right-wing extremism in Europe and, for the first time, in the U.S. Since I am not a journalist, I would not report such a story; rather, I would first find out if it is really true, and if true then study what its causes are. Is it that people are running out of hope and meaning? Have the Western democracies run out of believable goals? What conditions favor fascism and what can we do to prevent them from spreading? 
MIHALY CSIKSZENTMIHALYI (pronounced "chick-SENT-me high"), a Hungarian-born polymath and the Davidson Professor of Management at the Claremont Graduate University, in Claremont, California, has been thinking about the meaning of happiness since he was a child in wartime Europe. He is author of Flow: The Psychology Of Optimal Experience; The Evolving Self: A Psychology For The Third Millennium; Creativity; and Finding Flow. LINK: FlowNet. _________________________________________________________________ William H. Calvin Abrupt Climate Change That's easy: abrupt climate change, the sort of thing where most of the earth returns to ice-age temperatures in just a decade or two, accompanied by a major worldwide drought. Then, centuries later, it flips back just as quickly. This has happened hundreds of times in the past. The earth's climate has at least two modes of operation that it flips between, just as your window air-conditioner cycles between fan and cool with a shudder. And it doesn't just settle down into the alternate mode: the transition often has a flicker like an aging fluorescent light bulb. There are sometimes a half-dozen whiplash cycles between warm-and-wet and cool-and-dusty, all within one madhouse century. On a scale far larger than we saw in the El Niño of several years ago, major forest fires denude much of the human habitat. To the extent the geophysicists understand the mechanism, it's due to a rearrangement in the northern extension of the Gulf Stream. A number of computer simulations of the winds and ocean currents, dating back to 1987, have shown that gradual global warming can trigger such a mode switch within several centuries, mostly due to the increased rainfall into the northern North Atlantic Ocean (if the cold salty surface waters are diluted by fresh water, they won't flush in the usual manner that allows more warm water to flow north and lose its heat). 
Meltwater floods from Iceland and Greenland will do the job if tropical-warming-enhanced rainfall doesn't. This has been the major story in the geophysical sciences of the last decade. I've been puzzled since 1987 about why this story hasn't been widely reported. A few newspapers finally started reporting the story in some detail two years ago, but still almost no one knows about it, probably because editors and readers confuse it with gradual climate change via greenhouse gases. This longstanding gradual-warming story seems to cause the abrupt story to be sidetracked, even though another abrupt cooling is easily the most catastrophic outcome of gradual warming, far worse than the usual economic and ecological burden envisaged. How would I report it? Start with the three-million-year history of abrupt coolings and how they have likely affected prehuman evolution. Our ancestors lived through a lot of these abrupt climate changes, and some humans will survive the next one. It's our civilization that likely won't, just because the whiplashes happen so quickly that warfare over plummeting resources leaves a downsized world where everyone hates their neighbors for good reason. Fortunately, if we get our act together, there are a few things we might do to stabilize the patient, buying some extra time in the same manner as preventive medicine has extended the human lifespan. WILLIAM H. CALVIN is a theoretical neurophysiologist on the faculty of the University of Washington who writes about the brain and evolution. Among his many books are How Brains Think, The Cerebral Code, and Lingua ex Machina. He is the author of a cover story for The Atlantic Monthly, "The Great Climate Flip-flop," January 1998, and a forthcoming book, Cool, Crash and Burn: The Once and Future Climate of Human Evolution. Further reading on Edge: "Competing for Consciousness: How Subconscious Thoughts Cook on the Backburner: A Talk by William H. Calvin" LINK: William Calvin Home Page. 
_________________________________________________________________ Add your own suggestions for Today's Most Important Unreported Story at the Edge discussion forum. From checker at panix.com Thu Jan 12 18:47:49 2006 From: checker at panix.com (Premise Checker) Date: Thu, 12 Jan 2006 13:47:49 -0500 (EST) Subject: [Paleopsych] Economist: Obesity in France: Gross national product Message-ID: Obesity in France: Gross national product http://www.economist.com/World/europe/PrinterFriendly.cfm?story_id=5328398 Dec 20th 2005 | PARIS Contrary to popular myth, French people do get fat AS THE French sit down to their traditional Christmas eve feast of foie gras, oysters and dinde farcie aux marrons (turkey stuffed with chestnuts), they can do so with a clear conscience. For, as everybody knows, the famously svelte French somehow manage to combine gluttony with gastronomy--and still stay slim. Or do they? In fact, the rate of obesity in France has started to swell, rising from 8% of the adult population in 1997 to 11% by 2003. Over 40% of the French are now considered overweight. According to a recent Senate report, France has the same share of fat people today as America did in 1991--and an upward trend to match. And these numbers may understate the problem. The 2005 OECD health study says that obesity rates in Britain, at 23%, and America, at 31%, are higher. But it points out that the French figures, unlike British and American ones, are based on polls asking people if they are fat. Unsurprisingly, denial intrudes; self-reporting produces underestimates. Either way, France's politicians have started to notice. In October, a parliamentary report called for a public-health campaign. And a law has been passed to impose a 1.5% tax on the advertising budgets of food companies if they do not encourage healthy eating. What has happened to the French waistline? The short answer is that France has latched on to the fast-food culture. 
France is one of the biggest and most profitable European markets for McDonald's. Now KFC fast-food joints are spreading across the country. Frozen pizzas and fizzy drinks are also nibbling away at the traditional family meal, particularly in poorer households. There may be something else going on. Mireille Guiliano, a Frenchwoman based in America, caused a stir there with a book entitled "French Women Don't Get Fat". But in France, her fellow-citizens seem not only to be doing just that--but to have few hang-ups about it. Last weekend, 11.4m viewers watched Magalie, a singer with a voice as big as her build, being voted the winner of this year's Star Academy television talent show. Her voluptuous curves were all over the papers the next day. "It's the first time that a plump girl has won Star Ac," she told Le Parisien, a newspaper. "It's proof that, in order to succeed, physique no longer counts." From checker at panix.com Thu Jan 12 18:48:04 2006 From: checker at panix.com (Premise Checker) Date: Thu, 12 Jan 2006 13:48:04 -0500 (EST) Subject: [Paleopsych] Foreign Affairs: Robert M. Sapolsky: A Natural History of Peace Message-ID: Robert M. Sapolsky: A Natural History of Peace http://www.foreignaffairs.org/20060101faessay85110/robert-m-sapolsky/a-natural-history-of-peace.html From Foreign Affairs, January/February 2006 _________________________________________________________________ Summary: Humans like to think that they are unique, but the study of other primates has called into question the exceptionalism of our species. So what does primatology have to say about war and peace? Contrary to what was believed just a few decades ago, humans are not "killer apes" destined for violent conflict, but can make their own history. Robert M. Sapolsky is John A. and Cynthia Fry Gunn Professor of Biological Sciences and Professor of Neurology and Neurological Sciences at Stanford University. His most recent book is "Monkeyluv: And Other Essays on Our Lives as Animals." 
THE NAKED APE The evolutionary biologist Theodosius Dobzhansky once said, "All species are unique, but humans are uniquest." Humans have long taken pride in their specialness. But the study of other primates is rendering the concept of such human exceptionalism increasingly suspect. Some of the retrenchment has been relatively palatable, such as with the workings of our bodies. Thus we now know that a baboon heart can be transplanted into a human body and work for a few weeks, and human blood types are coded in Rh factors named after the rhesus monkeys that possess similar blood variability. More discomfiting is the continuum that has been demonstrated in the realm of cognition. We now know, for example, that other species invent tools and use them with dexterity and local cultural variation. Other primates display "semanticity" (the use of symbols to refer to objects and actions) in their communication in ways that would impress any linguist. And experiments have shown other primates to possess a "theory of mind," that is, the ability to recognize that different individuals can have different thoughts and knowledge. Our purported uniqueness has been challenged most, however, with regard to our social life. Like the occasional human hermit, there are a few primates that are typically asocial (such as the orangutan). Apart from those, however, it turns out that one cannot understand a primate in isolation from its social group. Across the 150 or so species of primates, the larger the average social group, the larger the cortex relative to the rest of the brain. The fanciest part of the primate brain, in other words, seems to have been sculpted by evolution to enable us to gossip and groom, cooperate and cheat, and obsess about who is mating with whom. 
Humans, in short, are yet another primate with an intense and rich social life -- a fact that raises the question of whether primatology can teach us something about a rather important part of human sociality, war and peace. It used to be thought that humans were the only savagely violent primate. "We are the only species that kills its own," one might have heard intoned portentously at the end of nature films several decades ago. That view fell by the wayside in the 1960s as it became clear that some other primates kill their fellows aplenty. Males kill; females kill. Some kill one another's infants with cold-blooded stratagems worthy of Richard III. Some use their toolmaking skills to fashion bigger and better cudgels. Some other primates even engage in what can only be called warfare -- organized, proactive group violence directed at other populations. As field studies of primates expanded, what became most striking was the variation in social practices across species. Yes, some primate species have lives filled with violence, frequent and varied. But life among others is filled with communitarianism, egalitarianism, and cooperative child rearing. Patterns emerged. In less aggressive species, such as gibbons or marmosets, groups tend to live in lush rain forests where food is plentiful and life is easy. Females and males tend to be the same size, and the males lack secondary sexual markers such as long, sharp canines or garish coloring. Couples mate for life, and males help substantially with child care. In violent species, on the other hand, such as baboons and rhesus monkeys, the opposite conditions prevail. The most disquieting fact about the violent species was the apparent inevitability of their behavior. Certain species seemed simply to be the way they were, fixed products of the interplay of evolution and ecology, and that was that. 
And although human males might not be inflexibly polygamous or come with bright red butts and six-inch canines designed for tooth-to-tooth combat, it was clear that our species had at least as much in common with the violent primates as with the gentle ones. "In their nature" thus became "in our nature." This was the humans-as-killer-apes theory popularized by the writer Robert Ardrey, according to which humans have as much chance of becoming intrinsically peaceful as they have of growing prehensile tails. That view always had little more scientific rigor than a Planet of the Apes movie, but it took a great deal of field research to figure out just what should supplant it. After decades' more work, the picture has become quite interesting. Some primate species, it turns out, are indeed simply violent or peaceful, with their behavior driven by their social structures and ecological settings. More important, however, some primate species can make peace despite violent traits that seem built into their natures. The challenge now is to figure out under what conditions that can happen, and whether humans can manage the trick themselves. PAX BONOBO Primatology has long been dominated by studies of the chimpanzee, due in large part to the phenomenally influential research of Jane Goodall, whose findings from her decades of observations in the wild have been widely disseminated. National Geographic specials based on Goodall's work would always include the reminder that chimps are our closest relatives, a notion underlined by the fact that we share an astonishing 98 percent of our DNA with them. And Goodall and other chimp researchers have carefully documented an endless stream of murders, cannibalism, and organized group violence among their subjects. Humans' evolutionary fate thus seemed sealed, smeared by the excesses of these first cousins. 
But all along there has been another chimp species, one traditionally ignored because of its small numbers; its habitat in remote, impenetrable rain forests; and the fact that its early chroniclers published in Japanese. These skinny little creatures were originally called "pygmy chimps" and were thought of as uninteresting, some sort of regressed subspecies of the real thing. Now known as bonobos, they are today recognized as a separate and distinct species that taxonomically and genetically is just as closely related to humans as the standard chimp. And boy, is this ever a different ape. Male bonobos are not particularly aggressive and lack the massive musculature typical of species that engage in a lot of fighting (such as the standard chimp). Moreover, the bonobo social system is female dominated, food is often shared, and there are well-developed means for reconciling social tensions. And then there is the sex. Bonobo sex is the prurient highlight of primatology conferences, and leads parents to shield their children's eyes when watching nature films. Bonobos have sex in every conceivable position and some seemingly inconceivable ones, in pairs and groups, between genders and within genders, to greet each other and to resolve conflicts, to work off steam after a predator scare, to celebrate finding food or to cajole its sharing, or just because. As the sound bite has it, chimps are from Mars and bonobos are from Venus. All is not perfect in the bonobo commune, and they still have hierarchies and conflict (why else invent conflict resolution?). Nonetheless, they are currently among the trendiest of species to analyze, a wonderful antidote to their hard-boiled relatives. The trouble is, while we have a pretty good sense of what bonobos are like, we have little insight into how they got that way. Furthermore, this is basically what all bonobos seem to be like -- a classic case of in-their-nature-ness. 
There is even recent evidence for a genetic component to the phenomenon, in that bonobos (but not chimps) possess a version of a gene that makes affiliative behavior (behavior that promotes group cohesion) more pleasurable to males. So -- a wondrous species (and one, predictably, teetering on the edge of extinction). But besides being useful for taking the wind out of we-be-chimps fatalists, the bonobo has little to say to us. We are not bonobos, and never can be. WARRIORS, COME OUT TO PLAY In contrast to the social life of bonobos, the social life of chimps is not pretty. Nor is that of rhesus monkeys, nor savanna baboons -- a species found in groups of 50 to 100 in the African grasslands and one I have studied for close to 30 years. Hierarchies among baboons are strict, as are their consequences. Among males, high rank is typically achieved by a series of successful violent challenges. Spoils, such as meat, are unevenly divided. Most males die of the consequences of violence, and roughly half of their aggression is directed at third parties (some high-ranking male in a bad mood takes it out on an innocent bystander, such as a female or a subordinate male). Male baboons, moreover, can fight amazingly dirty. I saw this happen a few years ago in one of the troops I study: Two males had fought, and one, having been badly trounced, assumed a crouching stance, with his rear end up in the air. This is universally recognized among savanna baboons as an abject gesture of subordination, signaling an end to the conflict, and the conventional response on the part of the victorious male is to subject the other to a ritualized gesture of dominance (such as mounting him). In this instance, however, the winner, approaching the loser as if to mount him, instead abruptly gave him a deep slash with his canines. A baboon group, in short, is an unlikely breeding ground for pacifists. Nevertheless, there are some interesting exceptions. 
In recent years, for example, it has been recognized that a certain traditional style of chest-thumping evolutionary thinking is wrong. According to the standard logic, males compete with one another aggressively in order to achieve and maintain a high rank, which will in turn enable them to dominate reproduction and thus maximize the number of copies of their genes that are passed on to the next generation. But although aggression among baboons does indeed have something to do with attaining a high rank, it turns out to have virtually nothing to do with maintaining it. Dominant males rarely are particularly aggressive, and those that are typically are on their way out: the ones that need to use it are often about to lose it. Instead, maintaining dominance requires social intelligence and impulse control -- the ability to form prudent coalitions, show some tolerance of subordinates, and ignore most provocations. Recent work, moreover, has demonstrated that females have something to say about which males get to pass on their genes. The traditional view was based on a "linear access" model of reproduction: if one female is in heat, the alpha male gets to mate with her; if two are in heat, the alpha male and the second-ranking male get their opportunity; and so on. Yet we now know that female baboons are pretty good at getting away from even champions of male-male competition if they want to and can sneak off instead with another male they actually desire. And who would that be? Typically, it is a male that has followed a different strategy of building affiliative relations with the female -- grooming her a lot, helping to take care of her kids, not beating her up. These nice-guy males seem to pass on at least as many copies of their genes as their more aggressive peers, not least because they can go like this for years, without the life-shortening burnout and injuries of the gladiators. 
And so the crude picture of combat as the sole path to evolutionary success is wrong. The average male baboon does opt for the combative route, but there are important phases of his life when aggression is less important than social intelligence and restraint, and there are evolutionarily fruitful alternative courses of action. Even within the bare-knuckle world of male-male aggression, we are now recognizing some surprising outposts of primate civility. For one thing, primates can make up after a fight. Such reconciliation was first described by Frans de Waal, of Emory University, in the early 1980s; it has now been observed in some 27 different species of primates, including male chimps, and it works as it is supposed to, reducing the odds of further aggression between the two ex-combatants. And various primates, including male baboons, will sometimes cooperate, for example by supporting one another in a fight. Coalitions can involve reciprocity and even induce what appears to be a sense of justice or fairness. In a remarkable study by de Waal and one of his students, capuchin monkeys were housed in adjacent cages. A monkey could obtain food on its own (by pulling a tray of food toward its cage) or with help from a neighbor (by pulling a heavier tray together); in the latter case, only one of the monkeys was given access to the food in question. The monkeys that collaborated proved more likely to share it with their neighbor. Even more striking are lifelong patterns of cooperation among some male chimps, such as those that form bands of brothers. Among certain primate species, all the members of one gender will leave their home troop around puberty, thus avoiding the possibility of genetically deleterious inbreeding. Among chimps, the females leave home, and as a result, male chimps typically spend their lives in the company of close male relatives. 
Animal behaviorists steeped in game theory spend careers trying to figure out how reciprocal cooperation gets started among nonrelatives, but it is clear that stable reciprocity among relatives emerges readily. Thus, even the violent primates engage in reconciliation and cooperation -- but only up to a point. For starters, as noted in regard to the bonobo, there would be nothing to reconcile without violence and conflict in the first place. Furthermore, reconciliation is not universal: female savanna baboons are good at it, for example, but males are not. Most important, even among species and genders that do reconcile, it is not an indiscriminate phenomenon: individuals are more likely to reconcile with those who can be useful to them. This was demonstrated in a brilliant study by Marina Cords, of Columbia University, in which the value of some relationships among a type of macaque monkey was artificially raised. Animals were again caged next to each other under conditions in which they could obtain food by themselves or through cooperation, and those pairs that developed the capacity for cooperation were three times as likely to reconcile after induced aggression as noncooperators. Tension-reducing reconciliation, in other words, is most likely to occur among animals who already are in the habit of cooperating and have an incentive to keep doing so. Some deflating points emerge from the studies of cooperation as well, such as the fact that coalitions are notoriously unstable. In one troop of baboons I studied in the early 1980s, male-male coalitions lasted less than two days on average before collapsing, and most cases of such collapse involved one partner failing to reciprocate or, even more dramatically, defecting to the other side during a fight. Finally, and most discouraging, is the use to which most coalitions are put. In theory, cooperation could trump individualism in order to, say, improve food gathering or defend against predators. 
In practice, two baboons that cooperate typically do so in order to make a third miserable. Goodall was the first to report the profoundly disquieting fact that bands of related male chimps carry out cooperative "border patrols" -- searching along the geographic boundary separating their group from another and attacking neighboring males they encounter, even to the point of killing other groups off entirely. In-group cooperation can thus usher in not peace and tranquility, but rather more efficient extermination. So primate species with some of the most aggressive and stratified social systems have been seen to cooperate and resolve conflicts -- but not consistently, not necessarily for benign purposes, and not in a cumulative way that could lead to some fundamentally non-Hobbesian social outcomes. The lesson appears to be not that violent primates can transcend their natures, but merely that the natures of these species are subtler and more multifaceted than previously thought. At least that was the lesson until quite recently. OLD PRIMATES AND NEW TRICKS To some extent, the age-old "nature versus nurture" debate is silly. The action of genes is completely intertwined with the environment in which they function; in a sense, it is pointless to even discuss what gene X does, and we should consider instead only what gene X does in environment Y. Nonetheless, if one had to predict the behavior of some organism on the basis of only one fact, one might still want to know whether the most useful fact would be about genetics or about the environment. The first two studies to show that primates were somewhat independent from their "natures" involved a classic technique in behavioral genetics called cross-fostering. Suppose some animal has engaged in a particular behavior for generations -- call it behavior A. We want to know if that behavior is due to shared genes or to a multigenerationally shared environment. 
Researchers try to answer the question by cross-fostering the animal, that is, switching the animal's mother at birth so that she is raised by one with behavior B, and then watching to see which behavior the animal displays when she grows up. One problem with this approach is that an animal's environment does not begin at birth -- a fetus shares a very intimate environment with its mother, namely the body's circulation, chock-full of hormones and nutrients that can cause lifelong changes in brain function and behavior. Therefore, the approach can be applied only asymmetrically: if a behavior persists in a new environment, one cannot conclude that genes are the cause, but if a behavior changes in a new environment, then one can conclude that genes are not the cause. This is where the two studies come in. In the early 1970s, a highly respected primatologist named Hans Kummer was working in Ethiopia, in a region containing two species of baboons with markedly different social systems. Savanna baboons live in large troops, with plenty of adult females and males. Hamadryas baboons, in contrast, have a more complex, multilevel society. Because they live in a much harsher, drier region, hamadryas have a distinctive ecological problem. Some resources are singular and scarce -- like a rare watering hole or a good cliff face to sleep on at night in order to evade predators -- and large numbers of animals are likely to want to share them. Other resources, such as the vegetation they eat, are sparse and widely dispersed, requiring animals to function in small, separate groups. As a result, hamadryas have evolved a "harem" structure -- a single adult male surrounded by a handful of adult females and their children -- with large numbers of discrete harems converging, peacefully, for short periods at the occasional desirable watering hole or cliff face. 
Kummer conducted a simple experiment, trapping an adult female savanna baboon and releasing her into a hamadryas troop and trapping an adult female hamadryas and releasing her into a savanna troop. Among hamadryas, if a male threatens a female, it is almost certainly this brute who dominates the harem, and the only way for the female to avoid injury is to approach him -- i.e., return to the fold. But among savanna baboons, if a male threatens a female, the way for her to avoid injury is to run away. In Kummer's experiment, the females who were dropped in among a different species initially carried out their species-typical behavior, a major faux pas in the new neighborhood. But gradually, they assimilated the new rules. How long did this learning take? About an hour. In other words, millennia of genetic differences separating the two species, a lifetime of experience with a crucial social rule for each female, and a minuscule amount of time to reverse course completely. The second experiment was set up by de Waal and his student Denise Johanowicz in the early 1990s, working with two macaque monkey species. By any human standards, male rhesus macaques are unappealing animals. Their hierarchies are rigid, those at the top seize a disproportionate share of the spoils, they enforce this inequity with ferocious aggression, and they rarely reconcile after fights. Male stump tail macaques, in contrast, which share almost all of their genes with their rhesus macaque cousins, display much less aggression, more affiliative behaviors, looser hierarchies, and more egalitarianism. Working with captive primates, de Waal and Johanowicz created a mixed-sex social group of juvenile macaques, combining rhesus and stump tails together. Remarkably, instead of the rhesus macaques bullying the stump tails, over the course of a few months, the rhesus males adopted the stump tails' social style, eventually even matching the stump tails' high rates of reconciliatory behavior. 
It so happens, moreover, that stump tails and rhesus macaques use different gestures when reconciling. The rhesus macaques in the study did not start using the stump tails' reconciliatory gestures, but rather increased the incidence of their own species-typical gestures. In other words, they were not merely imitating the stump tails' behavior; they were incorporating the concept of frequent reconciliation into their own social practices. When the newly warm-and-fuzzy rhesus macaques were returned to a larger, all-rhesus group, finally, their new behavioral style persisted. This is nothing short of extraordinary. But it brings up one last question: When those rhesus macaques were transferred back into the all-rhesus world, did they spread their insights and behaviors to the others? Alas, they did not. For that, we need to move on to our final case. LEFT BEHIND In the early 1980s, "Forest Troop," a group of savanna baboons I had been studying -- virtually living with -- for years, was going about its business in a national park in Kenya when a neighboring baboon group had a stroke of luck: its territory encompassed a tourist lodge that expanded its operations and consequently the amount of food tossed into its garbage dump. Baboons are omnivorous, and "Garbage Dump Troop" was delighted to feast on leftover drumsticks, half-eaten hamburgers, remnants of chocolate cake, and anything else that wound up there. Soon they had shifted to sleeping in the trees immediately above the pit, descending each morning just in time for the day's dumping of garbage. (They soon got quite obese from the rich diet and lack of exercise, but that is another story.) The development produced nearly as dramatic a shift in the social behavior of Forest Troop. Each morning, approximately half of its adult males would infiltrate Garbage Dump Troop's territory, descending on the pit in time for the day's dumping and battling the resident males for access to the garbage. 
The Forest Troop males that did this shared two traits: they were particularly combative (which was necessary to get the food away from the other baboons), and they were not very interested in socializing (the raids took place early in the morning, during the hours when the bulk of a savanna baboon's daily communal grooming occurs). Soon afterward, tuberculosis, a disease that moves with devastating speed and severity in nonhuman primates, broke out in Garbage Dump Troop. Over the next year, most of its members died, as did all of the males from Forest Troop who had foraged at the dump. [See Footnote #1] As a result, Forest Troop was left with males who were less aggressive and more social than average, and the troop now had double its previous female-to-male ratio. The social consequences of these changes were dramatic. There remained a hierarchy among the Forest Troop males, but it was far looser than before: compared with other, more typical savanna baboon groups, high-ranking males rarely harassed subordinates and occasionally even relinquished contested resources to them. Aggression was less frequent, particularly against third parties. And rates of affiliative behaviors, such as males and females grooming each other or sitting together, soared. There were even instances, now and then, of adult males grooming each other -- a behavior nearly as unprecedented as baboons sprouting wings. This unique social milieu did not arise merely as a function of the skewed sex ratio; other primatologists have occasionally reported on troops with similar ratios but without a comparable social atmosphere. What was key was not just the predominance of females, but the type of male that remained. The demographic disaster -- what evolutionary biologists term a "selective bottleneck" -- had produced a savanna baboon troop quite different from what most experts would have anticipated. But the largest surprise did not come until some years later. 
Female savanna baboons spend their lives in the troop into which they are born, whereas males leave their birth troop around puberty; a troop's adult males have thus all grown up elsewhere and immigrated as adolescents. By the early 1990s, none of the original low aggression/high affiliation males of Forest Troop's tuberculosis period was still alive; all of the group's adult males had joined after the epidemic. Despite this, the troop's unique social milieu persisted -- as it does to this day, some 20 years after the selective bottleneck. In other words, adolescent males that enter Forest Troop after having grown up elsewhere wind up adopting the unique behavioral style of the resident males. As defined by both anthropologists and animal behaviorists, "culture" consists of local behavioral variations, occurring for nongenetic and nonecological reasons, that last beyond the time of their originators. Forest Troop's low aggression/high affiliation society constitutes nothing less than a multigenerational benign culture. Continuous study of the troop has yielded some insights into how its culture is transmitted to newcomers. Genetics obviously plays no role, nor apparently does self-selection: adolescent males that transfer into the troop are no different from those that transfer into other troops, displaying on arrival similarly high rates of aggression and low rates of affiliation. Nor is there evidence that new males are taught to act in benign ways by the residents. One cannot rule out the possibility that some observational learning is occurring, but it is difficult to detect given that the distinctive feature of this culture is not the performance of a unique behavior but the performance of typical behaviors at atypically extreme rates. To date, the most interesting hint about the mechanism of transmission is the way recently transferred males are treated by Forest Troop's resident females. 
In a typical savanna baboon troop, newly transferred adolescent males spend years slowly working their way into the social fabric; they are extremely low ranking -- ignored by females and noted by adult males only as convenient targets for aggression. In Forest Troop, by contrast, new male transfers are inundated with female attention soon after their arrival. Resident females first present themselves sexually to new males an average of 18 days after the males arrive, and they first groom the new males an average of 20 days after they arrive (normal savanna baboons introduce such behaviors after 63 and 78 days, respectively). Furthermore, these welcoming gestures occur more frequently in Forest Troop during the early post-transfer period, and there is four times as much grooming of males by females in Forest Troop as elsewhere. From almost the moment they arrive, in other words, new males find out that in Forest Troop, things are done differently. At present, I think the most plausible explanation is that this troop's special culture is not passed on actively but simply emerges, facilitated by the actions of the resident members. Living in a group with half the typical number of males, and with the males being nice guys to boot, Forest Troop's females become more relaxed and less wary. As a result, they are more willing to take a chance and reach out socially to new arrivals, even if the new guys are typical jerky adolescents at first. The new males, in turn, finding themselves treated so well, eventually relax and adopt the behaviors of the troop's distinctive social milieu. NATURAL BORN KILLERS? Are there any lessons to be learned here that can be applied to human-on-human violence -- apart, that is, from the possible desirability of giving fatal cases of tuberculosis to aggressive people? 
Any biological anthropologist opining about human behavior is required by long-established tradition to note that for 99 percent of human history, humans lived in small, stable bands of related hunter-gatherers. Game theorists have shown that a small, cohesive group is the perfect setting for the emergence of cooperation: the identities of the other participants are known, there are opportunities for multiple iterations of games (and thus the ability to punish cheaters), and there is open-book play (players can acquire reputations). And so, those hunter-gatherer bands were highly egalitarian. Empirical and experimental data have also shown the cooperative advantages of small groups at the opposite human extreme, namely in the corporate world. But the lack of violence within small groups can come at a heavy price. Small homogenous groups with shared values can be a nightmare of conformity. They can also be dangerous for outsiders. Unconsciously emulating the murderous border patrols of closely related male chimps, militaries throughout history have sought to form small, stable units; inculcate them with rituals of pseudokinship; and thereby produce efficient, cooperative killing machines. Is it possible to achieve the cooperative advantages of a small group without having the group reflexively view outsiders as the Other? One way is through trade. Voluntary economic exchanges not only produce profits; they can also reduce social friction -- as the macaques demonstrated by being more likely to reconcile with a valued partner in food acquisition. Another way is through a fission-fusion social structure, in which the boundaries between groups are not absolute and impermeable. The model here is not the multilevel society of the hamadryas baboons, both because their basic social unit of the harem is despotic and because their fusion consists of nothing more than lots of animals occasionally coming together to utilize a resource peacefully. 
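The game-theoretic conditions listed above -- known identities, repeated play, and the ability to punish cheaters -- can be illustrated with a toy iterated prisoner's dilemma. The following sketch uses standard textbook payoffs and strategies, not anything from the essay itself: a reciprocating "tit-for-tat" player cooperates until cheated, which is enough to make mutual cooperation pay once the game repeats.

```python
# Toy iterated prisoner's dilemma: repeated play among known partners
# lets reciprocity punish cheating, illustrating why small stable
# groups favor cooperation. Payoffs are standard textbook values.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(my_hist, their_hist):
    # Cooperate first, then mirror the partner's previous move.
    return their_hist[-1] if their_hist else "C"

def always_defect(my_hist, their_hist):
    # An unconditional cheater.
    return "D"

def play(strat_a, strat_b, rounds=100):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a = strat_a(hist_a, hist_b)
        b = strat_b(hist_b, hist_a)
        pa, pb = PAYOFF[(a, b)]
        score_a += pa; score_b += pb
        hist_a.append(a); hist_b.append(b)
    return score_a, score_b

# Two reciprocators far outscore a reciprocator exploited once by a cheater.
print(play(tit_for_tat, tit_for_tat))    # (300, 300)
print(play(tit_for_tat, always_defect))  # (99, 104)
```

With only one round, defection dominates; with a hundred rounds against a known partner, mutual reciprocity earns triple what mutual defection does, which is the logic behind the hunter-gatherer band conditions described above.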
Human hunter-gatherers are a better example to follow, in that their small bands often merge, split, or exchange members for a while, with such fluidity helping to solve not only environmental resource problems but social problems as well. The result is that instead of the all-or-nothing world of male chimps, in which there is only one's own group and the enemy, hunter-gatherers can enjoy gradations of familiarity and cooperation stretching over large areas. The interactions among hunter-gatherers resemble those of other networks, where there are individual nodes (in this case, small groups) and where the majority of interactions between the nodes are local ones, with the frequency of interactions dropping off as a function of distance. Mathematicians have shown that when the ratios among short-, middle-, and long-distance interactions are optimal, networks are robust: they are dominated by highly cooperative clusters of local interactions, but they also retain the potential for less frequent, long-distance communication and coordination. Optimizing the fission-fusion interactions of hunter-gatherer networks is easy: cooperate within the band; schedule frequent joint hunts with the next band over; have occasional hunts with bands somewhat farther out; have a legend of a single shared hunt with a mythic band at the end of the earth. Optimizing the fission-fusion interactions in contemporary human networks is vastly harder, but the principles are the same. In exploring these subjects, one often encounters a pessimism built around the notion that humans, as primates, are hard-wired for xenophobia. Some brain-imaging studies have appeared to support this view in a particularly discouraging way. 
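The network property invoked here is the "small-world" effect: when ties are mostly local but a few reach far, paths across the whole network stay short. A rough sketch, loosely in the spirit of the Watts-Strogatz rewiring model (the network size, neighborhood width, and rewiring probability are arbitrary illustrative choices, not figures from the essay):

```python
import random
from collections import deque

def ring_lattice(n, k):
    """Each node tied to its k nearest neighbors on each side (purely local)."""
    return {i: {(i + d) % n for d in range(-k, k + 1) if d != 0}
            for i in range(n)}

def rewire(adj, p, rng):
    """With probability p, replace a local tie with a random long-distance one."""
    n = len(adj)
    for i in list(adj):
        for j in list(adj[i]):
            if j > i and rng.random() < p:
                new = rng.randrange(n)
                if new != i and new not in adj[i]:
                    adj[i].discard(j); adj[j].discard(i)
                    adj[i].add(new); adj[new].add(i)
    return adj

def avg_path_length(adj):
    """Mean shortest-path distance between all node pairs (BFS from each node)."""
    n = len(adj); total = 0
    for src in adj:
        dist = {src: 0}; q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1; q.append(v)
        total += sum(dist.values())
    return total / (n * (n - 1))

rng = random.Random(0)
local = ring_lattice(200, 3)                 # only short-distance ties
mixed = rewire(ring_lattice(200, 3), 0.05, rng)  # a few long-distance ties
print(round(avg_path_length(local), 1))   # mostly local ties: long paths
print(round(avg_path_length(mixed), 1))   # sparse shortcuts shrink them sharply
```

The point mirrors the hunter-gatherer case: the rewired network keeps its dense local clusters yet any two groups are suddenly only a few steps apart, without anyone maintaining many long-distance ties.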
There is a structure deep inside the brain called the amygdala, which plays a key role in fear and aggression, and experiments have shown that when subjects are presented with a face of someone from a different race, the amygdala gets metabolically active -- aroused, alert, ready for action. This happens even when the face is presented "subliminally," which is to say, so rapidly that the subject does not consciously see it. More recent studies, however, should mitigate this pessimism. Test a person who has a lot of experience with people of different races, and the amygdala does not activate. Or, as in a wonderful experiment by Susan Fiske, of Princeton University, subtly bias the subject beforehand to think of people as individuals rather than as members of a group, and the amygdala does not budge. Humans may be hard-wired to get edgy around the Other, but our views on who falls into that category are decidedly malleable. In the early 1960s, a rising star of primatology, Irven DeVore, of Harvard University, published the first general overview of the subject. Discussing his own specialty, savanna baboons, he wrote that they "have acquired an aggressive temperament as a defense against predators, and aggressiveness cannot be turned on and off like a faucet. It is an integral part of the monkeys' personalities, so deeply rooted that it makes them potential aggressors in every situation." Thus the savanna baboon became, literally, a textbook example of life in an aggressive, highly stratified, male-dominated society. Yet within a few years, members of the species demonstrated enough behavioral plasticity to transform a society of theirs into a baboon utopia. The first half of the twentieth century was drenched in the blood spilled by German and Japanese aggression, yet only a few decades later it is hard to think of two countries more pacific. Sweden spent the seventeenth century rampaging through Europe, yet it is now an icon of nurturing tranquility. 
Humans have invented the small nomadic band and the continental megastate, and have demonstrated a flexibility whereby uprooted descendants of the former can function effectively in the latter. We lack the type of physiology or anatomy that in other mammals determines their mating system, and have come up with societies based on monogamy, polygyny, and polyandry. And we have fashioned some religions in which violent acts are the entrée to paradise and other religions in which the same acts consign one to hell. Is a world of peacefully coexisting human Forest Troops possible? Anyone who says, "No, it is beyond our nature," knows too little about primates, including ourselves. [Footnote #1] Considerable sleuthing ultimately revealed that the disease had come from tainted meat in the garbage dump, which had been sold to the tourist lodge thanks to a corrupt meat inspector. The studies were the first of this kind of outbreak in a wild primate population and showed that, in contrast to what happens with humans and captive primates, there was little animal-to-animal transmission of the tuberculosis, and so the disease did not spread in Forest Troop beyond the garbage eaters. From checker at panix.com Thu Jan 12 18:48:18 2006 From: checker at panix.com (Premise Checker) Date: Thu, 12 Jan 2006 13:48:18 -0500 (EST) Subject: [Paleopsych] Live Science: Link Between Dancing Ability and Mating Quality Message-ID: Science: Link Between Dancing Ability and Mating Quality http://www.livescience.com/humanbiology/051221_dance_symmetry.html Symmetrical People Make Better Dancers By Ker Than posted: 21 December 2005 [Joel Garreau's new book, _Radical Evolution: The Promise and Peril of Enhancing Our Minds, Our Bodies--and What It Means to be Human_ (NY: Doubleday, 2005) has just arrived. I am signed up to review it for _The Journal of Evolution and Technology_ and commenced reading it at once. 
Accordingly, I have stopped grabbing articles to forward until I have written my review *and* have caught up on my reading, this last going on for however many weeks it takes. I have a backlog of articles to send and will exhaust them by the end of the year. After that, I have a big batch of journal articles I downloaded on my annual visit to the University of Virginia and will dole out conversions from PDF to TXT at the rate of one a day. I'll also participate in discussions and do up an occasional meme. But you'll be on your own in analyzing the news. I hope I have given you some of the tools to do so.] Many people are attracted to hot dancers, and a new study suggests part of the reason is that their bodies are more symmetrical than those of the less coordinated. The researchers found that men judged to be better dancers tended to have a higher degree of body symmetry, a factor that has been linked to overall attractiveness and health in other research. The new study involved 183 Jamaican teenagers, ranging from 14 to 19 years old, who danced while their movements were captured using motion-capture cameras similar to those used in video games and movies to give computer-generated characters fluid movements. Women watching the recordings preferred the dances of men who were more symmetrical, while men were more impressed by the dances of more symmetric females. Women are pickier Interestingly, the male preference for symmetric females was not as strong as the female preference for symmetric males. This seems to confirm the theory that women are pickier when selecting a mate, since they bear most of the burden of raising a child, the researchers say. According to the researchers, their study is the first of its type. "Regular videotape or film can't separate the dance from what the people look like," said study member Lee Cronk, an anthropologist at Rutgers University in New Jersey. 
"With motion capture, we can do that and get just pure dance movements." All of us have asymmetries in our bodies. The index finger on one hand might be longer than the other, for example, or the left foot may be slightly larger than the right. Researchers call these fluctuating asymmetries, or FA. According to one hypothesis, FA is an indicator of an individual's ability to cope with the stresses and pressures associated with body development. "As you're developing, all sorts of things come at you, like diseases and injury," Cronk told LiveScience. "If you're able to develop symmetry despite all of that, then that would indicate to others that you have what it takes to make a go of it in that environment." Body advertising A high degree of body symmetry serves as a subtle advertisement of genetic quality and health, the thinking goes. While most people don't go around measuring and comparing body parts of potential mates, it's thought that we pick up on these cues subconsciously. The idea that there is an association between body symmetry and health comes from various animal and human studies. Peahens and barn swallows prefer males with more symmetrical tails. One study found that women experience more orgasms during sex with male partners whose features are more symmetrical, regardless of the level of romantic attachment or the sexual experience of the guy. What's any of this got to do with dancing? The researchers speculate that higher body symmetry might also indicate better neuromuscular coordination. This may influence dance ability since attractive dances can be more rhythmic and more difficult to perform. The study, led by William Brown of Rutgers, was detailed in the Dec. 22 issue of the journal Nature. 
From checker at panix.com Thu Jan 12 18:48:30 2006 From: checker at panix.com (Premise Checker) Date: Thu, 12 Jan 2006 13:48:30 -0500 (EST) Subject: [Paleopsych] Live Science: Scientists Predict What You'll Think of Next Message-ID: Scientists Predict What You'll Think of Next http://www.livescience.com/humanbiology/051222_mental_brain.html By Ker Than LiveScience Staff Writer posted: 22 December 2005 02:00 pm ET To recall memories, your brain travels back in time via the ultimate Google search, according to a new study in which scientists found they can monitor the activity and actually predict what you'll think of next. The work bolsters the validity of a longstanding hypothesis that the human brain takes itself back to the state it was in when a memory was first formed. The psychologist Endel Tulving dubbed this process "mental time travel." How it works Researchers analyzed brain scans of people as the test subjects watched pictures on a computer screen. 
The images were divided into three categories: celebrities like Jack Nicholson and Halle Berry, places like the Taj Mahal and the Grand Canyon, and everyday objects like tweezers and a pocket mirror. To make sure the subjects were paying attention, they were asked a question about each image as it came up, like whether they liked a certain celebrity, how much they wanted to visit a certain place or how often they used a certain object. Later, without any images and while their brains were still being scanned, the subjects were asked to recall as many of the images as they could. The researchers found that the patterns of brain activity associated with each picture "reinstated" themselves seconds before the people could verbally recall the memories. On average, the time between beginning brain activity associated with the memory and the subjects verbally stating the memory was about 5.4 seconds. "When you have an experience, that experience is represented as a pattern of cortical activity," explained Sean Polyn, a postdoctoral researcher at the University of Pennsylvania and leader of the study. "The memory system, which we think lives in the hippocampus, forms a sort of summary representation of everything that's going on in your cortex." Googling your brain The process can be compared to the way web crawlers work to browse and catalogue web pages on the Internet. Web crawlers are automated programs that create copies of all visited pages. Search engines like Google then tag and index the pages. In the same way, as we're trying to remember something, our brains dredge up the memory by first recalling a piece of it, scientists say. When trying to remember a face you saw recently, for example, you might first think broadly about faces and then narrow your search from there, enlisting new details as you go, Polyn explained. It's like adding more and more specific keywords to a Google search, until finally you find what you want. 
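The category-level decoding described here can be caricatured as template matching: average the activity patterns recorded while a subject studies each category, then classify a recall-period pattern by its correlation with each template. A toy sketch with made-up "voxel" vectors -- the real study used fMRI data and more sophisticated classifiers, so everything below is illustrative:

```python
import math

def correlation(x, y):
    """Pearson correlation between two equal-length activity patterns."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def category_templates(encoding):
    """Average the 'voxel' patterns recorded while studying each category."""
    return {cat: [sum(vals) / len(vals) for vals in zip(*patterns)]
            for cat, patterns in encoding.items()}

def predict(recall_pattern, templates):
    """Classify a recall-period pattern by its best-matching template."""
    return max(templates, key=lambda c: correlation(recall_pattern, templates[c]))

# Toy 4-'voxel' patterns recorded while viewing images from each category.
encoding = {
    "celebrity": [[0.9, 0.1, 0.2, 0.1], [0.8, 0.2, 0.1, 0.2]],
    "place":     [[0.1, 0.9, 0.1, 0.2], [0.2, 0.8, 0.2, 0.1]],
    "object":    [[0.1, 0.2, 0.9, 0.1], [0.2, 0.1, 0.8, 0.2]],
}
templates = category_templates(encoding)
# A noisy pattern observed seconds before the subject names a place:
print(predict([0.2, 0.7, 0.3, 0.2], templates))  # place
```

The "reinstatement" finding amounts to this kind of match appearing several seconds before the verbal recall, which is what allowed the researchers to anticipate the category.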
Scientists call this process "contextual reinstatement." "The memories that came up would be hits and the ones that most match your queries would be the ones that came up first," Polyn told LiveScience. Reading your mind The researchers were even able to do a little mind-reading by watching the search in progress. By comparing the brain scans of the subjects while they tried to remember the images they'd seen with those collected when they first viewed the images, the researchers were able to correctly conclude whether the people were going to remember a celebrity, place or object. "We can see some evidence of what category the subject is trying to recall before they even say anything, so we think we're visualizing the search process itself," Polyn said. A similar mind-reading effort was announced earlier this year, when researchers found they could predict where a patient would move his hand based on brain activity the instant prior. Scientists think that contextual reinstatement is unique to memories that involve personal experiences, so-called "episodic" memories, but that similar processes might be at work in other forms of memory. The study was detailed in the Dec. 23 issue of the journal Science. From checker at panix.com Thu Jan 12 19:23:21 2006 From: checker at panix.com (Premise Checker) Date: Thu, 12 Jan 2006 14:23:21 -0500 (EST) Subject: [Paleopsych] Atlas Sphere: Indict the New York Times Message-ID: Indict the New York Times http://www.theatlasphere.com/columns/printer_050111-holzer-indict-nytimes.php [Go ahead and critique, folks. I'll send out your thoughts, unless you specifically ask me not to. It's not clear to me how our "national security" was compromised. Aren't the citizens part of this "nation," and didn't Amendment IV speak of the "right of the people to be SECURE in their persons, houses, papers, and effects, against unreasonable searches and seizures"? Or does the "nation's" security extend only to our rulers? 
[So it is unclear whether the Espionage Act violates the Bill of Rights, even if past Supreme Courts have allowed certain somewhat similar powers. And did Congress intend the Bush administration to do what it did? Or is the intention of Congress something the Executive Branch can determine for itself at will? Is the United States a nation of laws (Congress) or of Presidential decrees? It is also unclear to me whether the Bush administration violated the laws governing the NSA. [Share with me your other objections. Basically, domestic spying is open to serious abuses and it must be checked. I have not been following the details and so don't know just how various members of Congress have reacted, but the reaction has not been one of ringing endorsement. [Mr. Jefferson said if he had to choose between newspapers and no government or government and no newspapers, he would unhesitatingly choose the former. [I don't know what restrictions on the press I would allow in the unlikely event that the United States got into a war justifiably. Help!] [And just how did Hanoi Jane jeopardize our national security? Did it mean that North Vietnam was more likely to invade the United States? She was hardly the only critic of our foreign policy.] Opinion Editorial Indict the New York Times By Henry Mark Holzer Jan 10, 2006 It is an article of faith on the Left and among its fellow travelers that the Bush administration stole two elections, made war on Iraq for venal reasons, tortured hapless foreigners, and conducted illegal surveillance of innocent Americans. A corollary of this mindset is that the press, primarily the Washington Post and The New York Times, has a right, indeed a duty, to print whatever it wants about the administration even if the information compromises national security. Not true. The press is not exempt from laws that apply to everyone else. The press is not exempt from laws protecting our national security. 
The New York Times is not exempt from the Espionage Act, as we shall see in a moment. But first, it's necessary to understand what an indictment of the Times does not involve.

First, an Espionage Act indictment of The New York Times would not even remotely constitute an attack on a free press. As Justice White wrote in Branzburg v. Hayes, "[i]t would be frivolous to assert ... that the First Amendment, in the interest of securing news or otherwise, confers a license on either the reporter or his news sources to violate valid criminal laws."

Nor would an indictment of the Times constitute an attempt to restrain it from publishing news. The anti-anti-terrorists who seek to justify the Times' revealing the NSA's domestic surveillance program, and thus prevent their flagship paper from being indicted, rely on a Supreme Court decision entitled New York Times Company v. United States, better known as the Pentagon Papers Case. Their reliance is misplaced.

In 1971 a disgruntled anti-war activist delivered a classified study, "History of U.S. Decision-Making Process on Viet Nam Policy," to The New York Times and the Washington Post. The government sued to enjoin publication, seeking to impose a prior restraint. If there are any fundamental principles in modern First Amendment law, one is that the burden on government to restrain publication (as compared, for example, with later punishing its publication) is extremely heavy. Accordingly, in a 6-3 decision, the Court ruled for the newspapers, and the publication of the embarrassing Pentagon Papers went ahead.

Thus, New York Times Company v. United States, where the Court rejected a government-sought prior restraint on publication, would have no precedential value in a case where, after publication, the government sought to punish the Times for violating the Espionage Act.

Third, not only was there no legal impediment to the NSA's domestic surveillance program, there was abundant authority for it.
The President possesses broad powers as chief executive and Commander in Chief under Article II of the Constitution. Congress has repeatedly delegated to all presidents considerable war-related powers, and especially post-9/11 to President Bush. It was Congress that created and empowered the National Security Agency. The Executive Branch's NSA domestic surveillance program, aimed at obtaining intelligence about the foreign-based terrorist war on the United States, was/is an integral element of our national security policy and its implementation. No Supreme Court decision has ever held that the Presidential/Congressionally-sanctioned acquisition of that kind of intelligence was constitutionally or otherwise prohibited.

Accordingly, it is pointless to consider whether the NSA's domestic surveillance program was legal. It was! If a case involving that program ever reaches the Supreme Court, that's what its ruling will be.

Fourth, the interesting history of the Espionage Act is irrelevant to whether the Times may have violated it.

Finally, it is a waste of time to consider whether the Act is constitutional. It has been expressly and impliedly held constitutional more than once.

This brings us to whether The New York Times is indictable (and ultimately convictable) for violating the Espionage Act. The facts are clear. The NSA was engaged in highly classified warrantless wiretaps of domestic subjects in connection with the War on Terror, and the Times, a private newspaper, made that information public. It is to those facts that the Espionage Act either applies, or does not apply.

Title 18, Section 793 of the United States Code, provides that

(e) Whoever having unauthorized possession of ... any document ... or information relating to the national defense which information the possessor has reason to believe could be used to the injury of the United States or to the advantage of any foreign nation, willfully communicates ...
the same to any person not entitled to receive it ... (f) ... [s]hall be fined under this title or imprisoned not more than ten years, or both.

(g) If two or more persons conspire to violate any of the foregoing provisions of this section, and one or more of such persons do any act to effect the object of the conspiracy, each of the parties to such conspiracy shall be subject to the punishment provided for the offense which is the object of such conspiracy.

(Section 794 is inapplicable. It deals with gathering or delivering defense information to aid [a] foreign government.)

It is, said the United States Court of Appeals for the Fourth Circuit in assessing Section 793(e) in United States v. Morison, "difficult to conceive of any language more definite and clear." Let's break down the statute into its component parts.

Whoever: this would mean the New York Times company, publisher Arthur Sulzberger, Jr., editor Bill Keller, and anyone else privy to the information upon which the story was based.

Having unauthorized possession: the information was classified, and the Times was not authorized to have it.

Of any document ... or information: certainly the Times had information, because it published it; it is inconceivable that the newspaper did not have documents of some kind, because the newspaper would never have gone that far out on a limb without at least some corroboration beyond an oral report(s).

Relating to the national defense: no comment is necessary; indeed, the Times has conceded that targets of the warrantless wiretaps were persons who may have had some connection to terrorists.

Which information the possessor has reason to believe could be used to the injury of the United States or to the advantage of any foreign nation: obviously the Times had reason to believe, because it withheld the story for a year.

Willfully communicates ... the same: no comment is necessary; the story was front-page news.
To any person not entitled to receive it: even the Times can't argue that subway straphangers, or any other member of the public, was entitled to receive information about the classified operations of one of this country's most secret and highly protected agencies.

Several years ago Erika Holzer and I wrote a book entitled [1]Aid and Comfort: Jane Fonda in North Vietnam, which proved that her conduct in Hanoi made her indictable for, and convictable of, treason. We discovered that she was not indicted because of a political failure of will by the Nixon administration. To summarize a chapter of our book, suffice it to say that the government was afraid to indict a popular anti-war actress who had the support of the radical left.

Even today, three decades after Fonda's trip to North Vietnam and three years after the publication of our book, we receive countless letters lamenting that Hanoi Jane was never punished for her conduct. We tell them that it's too late, that any possibility of seeing justice done for Fonda's traitorous conduct is long gone.

That is all the more reason why those of us who remember the Fonda episode, and who understand the nature and importance of today's War on Terror, should not rest until the government calls to account The New York Times in a court of law, with an indictment and hopefully a conviction, under the Espionage Act.

[2]Henry Mark Holzer is a professor emeritus at Brooklyn Law School and a constitutional and appellate lawyer. He provided legal representation to Ayn Rand on a variety of matters in the 1960s. His latest book, Keeper of the Flame: The Supreme Court Jurisprudence of Justice Clarence Thomas, will be published later this year.

© Copyright 2004-5 by The Atlasphere LLC

References

1. http://www.amazon.com/exec/obidos/redirect?link_code=ur2&tag=theatlasphere-20&camp=1789&creative=9325&path=http%3A%2F%2Fwww.amazon.com%2Fgp%2Fproduct%2F078641247X
2.
http://www.theatlasphere.com/directory/profile.php?id=12095

From checker at panix.com Thu Jan 12 19:24:21 2006
From: checker at panix.com (Premise Checker)
Date: Thu, 12 Jan 2006 14:24:21 -0500 (EST)
Subject: [Paleopsych] BBS: (Mealey) The Sociobiology of Sociopathy
Message-ID:

The Sociobiology of Sociopathy
http://www.bbsonline.org/documents/a/00/00/05/20/bbs.mealey.html

Below is the unedited preprint (not a quotable final draft) of: Mealey, L. (1995). The sociobiology of sociopathy: An integrated evolutionary model. Behavioral and Brain Sciences 18 (3): 523-599. The final published draft of the target article, commentaries and Author's Response are currently available only in paper.

Linda Mealey
Department of Psychology
College of St. Benedict
St. Joseph, MN 56374

Keywords: sociobiology, sociopathy, psychopathy, antisocial personality, evolution, criminal behavior, game theory, emotion, moral development, facultative strategies

Abstract

Sociopaths are "outstanding" members of society in two senses: politically, they command attention because of the inordinate amount of crime they commit, and psychologically, they elicit fascination because most of us cannot fathom the cold, detached way they repeatedly harm and manipulate others. Proximate explanations from behavior genetics, child development, personality theory, learning theory, and social psychology describe a complex interaction of genetic and physiological risk factors with demographic and micro-environmental variables that predispose a portion of the population to chronic antisocial behavior. Recent evolutionary and game theoretic models have tried to present an ultimate explanation of sociopathy as the expression of a frequency-dependent life history strategy which is selected, in dynamic equilibrium, in response to certain varying environmental circumstances. This target article tries to integrate the proximate, developmental models with the ultimate, evolutionary ones.
Two developmentally different etiologies of sociopathy emerge from two different evolutionary mechanisms. Social strategies for minimizing the incidence of sociopathic behavior in modern society should consider the two different etiologies and the factors which contribute to them. _________________________________________________________________ Sociopaths, who comprise only 3-4% of the male population and less than 1% of the female population (Strauss & Lahey 1984, Davison and Neale 1994, Robins, Tipp & Przybeck 1991), are thought to account for approximately 20% of the United States' prison population (Hare 1993) and between 33% and 80% of the population of chronic criminal offenders (Mednick, Kirkegaard-Sorensen, Hutchings, Knop, Rosenberg & Schulsinger 1977, Hare 1980, Harpending & Sobus 1987). Furthermore, whereas the "typical" U.S. burglar is estimated to have committed a median five crimes per year before being apprehended, chronic offenders- those most likely to be sociopaths- report committing upward of fifty crimes per annum and sometimes as many as two or three hundred (Blumstein & Cohen 1987). Collectively, these individuals are thought to account for over 50% of all crimes in the U.S. (Loeber 1982; Mednick, Gabrielli & Hutchings 1987, Hare 1993). Whether criminal or not, sociopaths typically exhibit what is generally considered to be irresponsible and unreliable behavior; their attributes include egocentrism, an inability to form lasting personal commitments and a marked degree of impulsivity. Underlying a superficial veneer of sociability and charm, sociopaths are characterized by a deficit of the social emotions (love, shame, guilt, empathy, and remorse). On the other hand, they are not intellectually handicapped, and are often able to deceive and manipulate others through elaborate scams and ruses including fraud, bigamy, embezzlement, and other crimes which rely on the trust and cooperation of others. 
The sociopath is "aware of the discrepancy between his behavior and societal expectations, but he seems to be neither guided by the possibility of such a discrepancy, nor disturbed by its occurrence" (Widom 1976a, p 614). This cold-hearted and selfish approach to human interaction at one time garnered for sociopathy the moniker "moral insanity" (McCord 1983, Davison & Neale 1990).

Sociopaths are also sometimes known as psychopaths or antisocial personalities. Unfortunately, the literature reflects varied uses of these three terms (Hare 1970, Feldman 1977, McCord 1983, Wolf 1987, Eysenck 1987). Some authors use one or another term as a categorical label, as in psychiatric diagnosis or in defining distinct personality "types"; an example is the "antisocial personality" disorder described in the Diagnostic and Statistical Manual of the American Psychiatric Association (1987). Other authors use the terms to refer to individuals who exhibit, to a large degree, a set of behaviors or personality attributes which are found in a continuous, normal distribution among the population at large; an example of such usage is "sociopathy" as defined by high scores on all three scales of the Eysenck Personality Questionnaire- extraversion, neuroticism, and psychoticism (Eysenck 1977, 1987). Other authors make a distinction between "simple" and "hostile" (Allen, Lindner, Goldman & Dinitz 1971), or "primary" and "secondary" psychopaths or sociopaths (Fagan & Lira 1980), reserving the term "simple" or "primary" for those individuals characterized by a complete lack of the social emotions; individuals who exhibit antisocial behavior in the absence of this emotional deficit are called "hostile" or "secondary" psychopaths or sociopaths, or even "pseudopsychopaths" (McCord 1983).
Other authors also make a typological distinction, using the term "psychopath" to refer to anti-social individuals who are of relatively high intelligence and middle to upper socio-economic status and who express their aberrant behavior in impressive and sometimes socially skilled behavior which may or may not be criminal, such as insider trading on the stock market (e.g. Bartol 1984). These authors reserve the term "sociopath" for those antisocial persons who have relatively low intelligence and social skills or who come from the lower socio-economic stratum and express their antisocial nature in the repeated commission of violent crime or crimes of property.

I will begin by using the single term "sociopath" inclusively. However, by the end of the paper I hope to convince the reader that the distinction between primary and secondary sociopaths is an important one because there are two different etiological paths to sociopathy, with differing implications for prevention and treatment.

My basic premise is that sociopaths are designed for the successful execution of social deception and that they are the product of evolutionary pressures which, through a complex interaction of environmental and genetic factors, lead some individuals to pursue a life history strategy of manipulative and predatory social interactions. On the basis of game theoretic models this strategy is to be expected in the population at relatively low frequencies in a demographic pattern consistent with what we see in contemporary societies. It is also expected to appear preferentially under certain social, environmental, and developmental circumstances which I hope to delineate. In an effort to present an integrated model, I will use a variety of arguments and data from the literature in sociobiology, game theory, behavior genetics, child psychology, personality theory, learning theory, and social psychology.
I will argue that: (1) there is a genetic predisposition underlying sociopathy which is normally distributed in the population; (2) as the result of selection to fill a small, frequency-dependent, evolutionary niche, a small, fixed percentage of individuals- those at the extreme of this continuum- will be deemed "morally insane" in any culture; (3) a variable percentage of individuals who are less extreme on the continuum will sometimes, in response to environmental conditions during their early development, pursue a life-history strategy that is similar to that of their "morally insane" colleagues; and (4) a subclinical manifestation of this underlying genetic continuum is evident in many of us, becoming apparent only at those times when immediate environmental circumstances make an antisocial strategy more profitable than a prosocial one.

1. The Model:

1.1 The evolutionary role of emotion

As the presenting, almost defining characteristic of sociopaths is their apparent lack of sincere social emotions in the absence of any other deficit such as mental retardation or autism (Hare 1980), it seems appropriate to begin with an examination of some current models of emotion. Plutchik (1980) put forth an evolutionary model of emotion in which he posits eight basic or "primary" emotions (such as fear, anger and disgust) which predate human evolution and are clearly related to survival (1). According to the model, everyone (including sociopaths) experiences these primary emotions, which are cross-cultural and instinctively programmed. "Secondary" and "tertiary" emotions, on the other hand, are more complex, specifically human, cognitive interpretations of varying combinations and intensities of the primary emotions (2). Because they are partly dependent upon learning and socialization, secondary emotions, unlike primary emotions, can vary across individuals and cultures.
Thus, the social emotions (such as shame, guilt, sympathy, and love), which are secondary emotions, can be expected to exhibit greater variability. Griffiths (1990) points out that most of the important features of emotion argue for an evolutionary design: emotions are generally involuntary, and are often "intrusive" (p 176); they cause rapid, coordinated changes in the skeletal/muscular system, facial expression, vocalization, and the autonomic nervous system; they are to a large extent innate, or at least "prepared" (see Seligman 1971); and they do not seem as responsive to new information about the environment as do beliefs. Griffiths argues that emotional responses to stimuli (he calls them "affect-programs" after Ekman 1971) are informationally-encapsulated, complex, organized reflexes, which are "adaptive responses to events that have a particular ecological significance for the organism" (p 183). That is, they are likely to be highly specialized reflexive responses elicited spontaneously by the presence of certain critical stimuli, regardless of the presence of possible mediating contextual cues or cognitive assessments. Nesse (1990) likewise posits an evolutionary model in which emotions are: "specialized modes of operation, shaped by natural selection, to adjust the physiological, psychological, and behavioral parameters of the organism in ways that increase its capacity and tendency to respond to the threats and opportunities characteristic of specific kinds of situations" (p 268). He attributes a particular role to the social emotions, a role he couches in the language of reciprocity and game theory. 
Presenting a classic Prisoner's Dilemma matrix, he notes which emotions would be likely to be associated with the outcomes of each of the four cells: when both players cooperate, they experience friendship, love, obligation, or pride; when both cheat or defect, they feel rejection and hatred; when one player cooperates and the other defects, the cooperator feels anger while the defector feels anxiety and guilt. Given that these emotions are experienced AFTER a behavioral choice is made, how could they possibly be adaptive? Nesse's explanation is based on the models of Frank (1988) and Hirshleifer (1987), which posit that ex post facto feelings lead to behavioral expressions which are read by others and can be used to judge a person's likely future behavior. To the extent that the phenomenological experience of emotion serves to direct a person's future behavior (positive emotions reinforce the preceding behavior while negative emotions punish and, therefore, discourage repetition of the behavior), the outward expression of emotion will serve as a reliable indicator to others as to how a person is likely to behave in the future. Indeed, that there exist reliable, uncontrollable outward expressions of these inner experiences at all suggests that the expressions must be serving a communicative function (Dimberg 1988). But if, as in the case of the Prisoner's Dilemma, the most rational strategy is to be selfish and defect, why should the positive (reinforcing) emotions follow mutual cooperation rather than the seemingly more adaptive behavior of defection? Here lies the role of reputation. 
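Nesse's four-cell pairing of outcomes and emotions can be tabulated directly. The sketch below is only the passage above restated in executable form; the dictionary layout and key names are mine, while the emotion assignments are Nesse's as reported here:

```python
# Nesse's pairing of Prisoner's Dilemma outcomes with social emotions,
# keyed by (own move, partner's move). Layout is illustrative only.
PD_EMOTIONS = {
    ("cooperate", "cooperate"): "friendship, love, obligation, pride",
    ("defect",    "defect"):    "rejection and hatred",
    ("cooperate", "defect"):    "anger",              # the exploited cooperator
    ("defect",    "cooperate"): "anxiety and guilt",  # the successful defector
}

def felt_emotion(own_move, partner_move):
    return PD_EMOTIONS[(own_move, partner_move)]
```

Reading the mixed-outcome cells recovers the asymmetry Nesse stresses: the same interaction leaves the cooperator angry and the defector anxious and guilty.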
If a player is known, through direct experience or social reputation, always to play the "rational" move and defect, then in a group where repeated interactions occur and individuals are free to form their own associations, no rational player will choose to play with the known defector, who will thus no longer be provided the opportunity for any kind of gain-- cooperative or exploitative. To avoid this social "shunning" based on reputation (3) and hence, to be able to profit at all from long-term social interaction, players must be able to build a reputation for cooperation. To do so, most of them must in fact reliably cooperate, despite the fact that cooperation is not the "rational" choice for the short-term. Frank and Hirshleifer suggest that the social emotions have thus evolved as "commitment devices" (Frank) or "guarantors of threats and promises" (Hirshleifer)- they cause positive or negative feelings that act as reinforcers or punishers, molding our behavior in a way that is not economically rational for the short-term but profitable and adaptive in situations where encounters are frequent and reputation is important. Frank presents data from a variety of studies suggesting that people do often behave irrationally (emotionally) in many dyadic and triadic interactions- sometimes even when it is clear that there will be no future opportunity to interact again with the same partner. These studies support the suggestion that in social situations, one's emotional response will often prevail over logic, and that the reason is that such behavior is, in the long-term, adaptive under conditions when one's reputation can follow or precede one. (See also Farrington 1982, Caldwell 1986, Anawalt 1986, Axelrod 1986, Alexander 1987, Irons 1991, Dugatkin 1992, and Frank, Gilovich & Regan 1993 for more on the role of reputation.)
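A toy simulation can make the shunning argument concrete. Everything below is an illustrative assumption of mine rather than anything from the target article: the standard payoff ordering T > R > P > S, one unconditional defector among reputation-tracking cooperators, and the rule that a cooperator refuses all further play with anyone who has defected on him.

```python
# Toy repeated Prisoner's Dilemma with reputation-based partner choice.
# One unconditional defector plays among cooperators who each shun
# anyone who has defected on them.
T, R, P, S = 5, 3, 1, 0    # temptation, reward, punishment, sucker's payoff

def play(n_rounds=50, n_cooperators=10):
    defector_score = 0
    cooperator_scores = [0] * n_cooperators
    burned = set()         # cooperators who have already been exploited
    for _ in range(n_rounds):
        # Cooperators pair off among themselves; mutual cooperation pays R.
        for i in range(n_cooperators):
            cooperator_scores[i] += R
        # The defector finds a partner only among those not yet burned.
        naive = [i for i in range(n_cooperators) if i not in burned]
        if naive:
            victim = naive[0]
            defector_score += T               # one-time temptation payoff
            cooperator_scores[victim] += S    # victim gets the sucker's payoff
            burned.add(victim)                # and shuns the defector thereafter
        # Once everyone has been burned, the defector has no partners at all.
    return defector_score, sum(cooperator_scores) / n_cooperators

defector_total, cooperator_avg = play()
```

Under these assumed numbers the defector's income stops the moment every potential partner has been burned once, while cooperators keep collecting R every round; this is the sense in which defection is "rational" only in the short term.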
According to these models, emotion serves both as a motivator of adaptive behavior and as a type of communication: the phenomenological and physiological experience of emotion rewards, punishes, and motivates the individual toward or away from certain types of stimuli and social interactions, while the outward manifestations of emotion communicate probable intentions to others. Once such reliable communicative mechanisms have evolved, however, when communication of intent precedes interaction, or when one's reputation precedes one, the conditions of interaction become vulnerable to deception through false signalling or advance deployment of enhanced reputation (e.g. Caldwell 1986). Those who use a deceptive strategy and defect after signalling cooperation are usually referred to as "cheaters" and, as many authors have pointed out (e.g. Trivers 1971, Alexander 1987, Dennett 1988, Quiatt 1988), the presence of cheaters can lead to a coevolutionary "arms race" in which potential interactors evolve finely tuned sensitivities to likely evidence or cues of deception, while potential cheaters evolve equally fine-tuned abilities to hide those cues (4). As long as evolutionary pressures for emotions to be reliable communication and commitment devices leading to long-term, cooperative strategies coexist with counter-pressures for cheating, deception, and "rational" short-term selfishness, a mixture of phenotypes will result, such that some sort of statistical equilibrium will be approached. Cheating should thus be expected to be maintained as a low-level, frequency-dependent strategy, in dynamic equilibrium with changes in the environment which exist as counter-pressures against its success. This type of dynamic process has been modelled extensively by evolutionary biologists who use game theory- the topic I turn to next. 
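The dynamic equilibrium just described can be sketched with discrete replicator dynamics. The payoff functions below are invented for illustration (they are not from the article); they encode only the assumption that cheating pays well when cheaters are rare and is increasingly punished, through detection and shunning, as cheaters become common:

```python
# Discrete replicator dynamics for a cheater/cooperator mix.
def cheater_fitness(p):
    return 5.0 - 10.0 * p      # assumed: exploitation gain minus a detection
                               # cost that grows with cheater frequency p

def cooperator_fitness(p):
    return 3.0                 # assumed steady payoff from mutual cooperation

def equilibrium(p=0.01, steps=2000):
    for _ in range(steps):
        w_ch = cheater_fitness(p)
        w_co = cooperator_fitness(p)
        w_bar = p * w_ch + (1 - p) * w_co
        p = p * w_ch / w_bar   # cheaters grow in proportion to relative fitness
    return p
```

Under these assumed payoffs the cheater frequency climbs when rare, falls when common, and settles at the low interior equilibrium (here 20%) where both strategies do equally well; that statistical balance is what a frequency-dependent strategy "in dynamic equilibrium" amounts to.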
1.2 Game theory and evolutionarily stable strategies

Game theory was first introduced into the literature of evolutionary biology by Richard Lewontin (1961), who applied it to the analysis of speciation and extinction events. It was later taken up in earnest by John Maynard Smith and colleagues (e.g. Maynard Smith & Price 1973, Maynard Smith 1974, Maynard Smith 1978), who used it to model contests between individuals. Maynard Smith showed that the "evolutionarily stable strategies" (ESSs) that could emerge in such contests included individuals' use of mixed, as well as fixed, strategies. Alexander (1986) writes: "It would be the worst of all strategies to enter the competition and cooperativeness of social life, in which others are prepared to alter their responses, with only preprogrammed behaviors" (p 171).

The maintenance of mixed ESSs in a population can theoretically be accomplished in at least four ways (after Buss 1991): (1) through genetically based, individual differences in the use of single strategies (such that each individual, in direct relation to genotype, consistently uses the same strategy in every situation); (2) through statistical use by all individuals of a species-wide, genetically fixed, optimum mix of strategies (whereby every individual uses the same statistical mix of strategies, but does so randomly and unpredictably in relation to the situation); (3) through species-wide use by all individuals of a mix of environmentally-contingent strategies (such that every individual uses every strategy, but predictably uses each according to circumstances); (4) through the developmentally-contingent use of single strategies by individuals (such that each individual has an initial potential to utilize every type of strategy, but, after exposure to certain environmental stimuli in the course of development, is phenotypically canalized from that point on, to use only a fraction of the possible strategies).
To Buss's fourth mechanism can be added a differential effect of genotype on developmental response to the environment, thus adding another mechanism: (5) genetically based individual differences in response to the environment, resulting in differential use by individuals of environmentally-contingent strategies (such that individuals of differing genotypes respond differently to environmental stimuli in the course of development and are thus canalized to produce a different set of limited strategies given the same, later conditions).

Following the leads of Kenrick, Dantchik, & MacFarlane (1983), MacMillan & Kofoed (1984), Kofoed & MacMillan (1986), Harpending & Sobus (1987), and Cohen & Machalek (1988), I would like to suggest an evolutionary model in which sociopaths are a type of cheater-defector in our society of mixed-strategy interactionists. I will be arguing that sociopathy appears in two forms, according to mechanisms 1 and 5 (above): one version that is the outcome of frequency-dependent, genetically based individual differences in use of a single (antisocial) strategy (which I will refer to as "primary sociopathy") and another that is the outcome of individual differences in developmental response to the environment, resulting in the differential use of cooperative or deceptive social strategies (which I will refer to as "secondary sociopathy"). To support this model, I will provide evidence that there are predictable differences in the use of cheating strategies across individuals, across environments, and within individuals across environments; this evidence will integrate findings from the fields of behavior genetics, child psychology, personality theory, learning theory, and social psychology.

2. The Evidence:

2.1 Behavior genetics

For decades, evidence has been accumulating that both criminality and sociopathy have a substantial heritable component, and that this heritable component is to a large extent overlapping; that is, the heritable attributes that contribute to criminal behavior seem to be the same as those which contribute to sociopathy. While there is no one-to-one correspondence between those individuals identified as criminals and those identified as sociopaths (indeed, the definitions of both vary from study to study), it is clear that these two sets of individuals share a variety of characteristics and that a subset of individuals share both labels (Moffitt 1987, Ellis 1990b, Robins, Tipp & Przybeck 1991, Gottesman & Goldsmith 1993).

2.1.1 Studies of criminal behavior

The behavior-genetic literature on criminal behavior suggests a substantial effect of heredity across several cultures (5). Christiansen (1977a&b), Wilson & Hernnstein (1985), Cloninger & Gottesman (1987), Eysenck & Gudjonsson (1989), and Raine (1993) review studies of twins which, taken as a whole, suggest a heritability of approximately .60 for repeated commission of crimes of property. [Heritability is a measure of the proportion of variance of a trait, within a population, that can be explained by genetic variability within that population; it thus ranges theoretically from 0.00 to 1.00, with the remaining population variance explained by variance in individuals' environment.] Adoption studies (reviewed in Hutchings & Mednick 1977, Mednick & Finello 1983, Wilson & Hernnstein 1985, Cloninger & Gottesman 1987, Mednick, Gabrielli & Hutchings 1987, Eysenck & Gudjonsson 1989, and Raine 1993) arrive at a similar conclusion (but see footnote 6). Several adoption studies were also able to demonstrate significant interactive effects not discriminable using the twin methodology.
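As a concrete illustration of how a twin-based heritability figure like the .60 cited above can be produced, here is Falconer's classic approximation. The twin correlations in the example are hypothetical numbers chosen only to reproduce that figure; they are not data from the reviewed studies, which may use more elaborate model-fitting:

```python
# Falconer's textbook twin-design approximation. MZ twins share ~100% of
# segregating genes and DZ twins ~50% on average, so doubling the MZ-DZ
# correlation gap estimates the genetic share of trait variance.
def falconer_h2(r_mz, r_dz):
    return 2.0 * (r_mz - r_dz)

# Hypothetical twin correlations (my assumption) yielding the ~.60 figure:
r_mz, r_dz = 0.75, 0.45
h2 = falconer_h2(r_mz, r_dz)   # genetic variance share, ~0.60
c2 = 2 * r_dz - r_mz           # shared-environment share (ACE identity)
e2 = 1 - r_mz                  # nonshared environment plus measurement error
```

The three shares sum to 1.0 by construction, matching the definition of heritability in the bracketed note above: the genetic fraction of population variance, with the remainder attributed to environment.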
Crowe (1972, 1974), Cadoret, Cain & Crowe (1983), Mednick & Finello (1983), and Mednick, Gabrielli & Hutchings (1984) report significant gene-environment interactions, such that adoptive children with both a genetic risk (criminal biological parent) and an environmental risk (criminality, psychiatric illness, or other severe behavioral disturbance in an adoptive parent) have a far greater risk of expressing criminal behavior than do adoptees with no such risk or only one risk factor, and that increased risk is more than simply an additive effect of both risk factors. In addition, Baker, Mack, Moffitt & Mednick (1989) report an interaction based on sex, in which females are more likely to transmit a genetic risk to their offspring than are males.

2.1.2 Studies of sociopathy

The literature on sociopathy suggests a pattern similar to that on criminality: Schulsinger (1972/77), Cadoret (1978), Crowe (1974), Cadoret & Cain (1980), Cadoret, Troughton & O'Gorman (1987), and Cadoret & Stewart (1991) demonstrate a substantial heritability to sociopathy; Cadoret, Troughton, Bagford & Woodworth (1990) found a gene-environment interaction similar to the one found for criminal behavior; and Cadoret & Cain (1980, 1981) found an interaction involving sex, such that male adoptees were more sensitive to the influence of environmental risk factors than were female adoptees. The similarity of the patterns described in these two domains is to some extent due to the fact that the diagnosis of sociopathy is often based in part upon the existence of criminal activity in a subject's life history.
On the other hand, consider the following: (1) criminal behavior and other aspects of sociopathy are correlated (Eysenck 1977, Morrison & Stewart 1971, Cadoret, Cain & Crowe 1983, Wolf 1987, Cloninger & Gottesman 1987, Patterson, DeBaryshe & Ramsey 1989); (2) criminal activity is found with increased frequency among the adopted-away children of sociopaths (Moffitt 1987); and (3) sociopathy is found with increased frequency among the adopted-away children of criminals (Cadoret & Stewart 1991, Cadoret, Troughton, Bagford & Woodworth 1990). This all suggests that criminality and sociopathy may share some common heritable factors. For this reason, early researchers and clinicians (e.g. Schulsinger 1972/77 and Cadoret 1978) suggested using the term "antisocial spectrum" to incorporate a variety of phenotypes that are considered likely to be manifestations of closely related genotypes (7). The existence of this spectrum suggests a multifactorial, probably polygenic, basis for sociopathy and its related phenotypes. Using an analogy to "g", which is often used to refer to the common factor underlying the positive correlations between various aptitude measures, Rowe (1986) and Rowe and Rodgers (1989) use "d" to refer to the common factor underlying the various expressions of social deviance.

2.1.3 Sex differences and the "two-threshold" model

Cloninger put forth a "two-threshold" polygenic model to account for both the sex difference in sociopathy and its spectral nature (Cloninger, Reich & Guze 1975; Cloninger, Christiansen, Reich & Gottesman 1978). According to the model, sociopaths are individuals on the extreme end of a normal distribution whose genetic component is (1) polygenic and (2) to a large degree, sex-limited. [Sex-limited genes, not to be confused with sex-linked genes, are those which are located on the autosomes of both sexes but which are triggered into expression only within the chemical/hormonal microenvironment of one sex or the other.
Common examples include beard and mustache growth in men, and breast and hip development in women.] If a large number of the many genes underlying sociopathy are triggered by testosterone or some other androgen, many more men than women will pass the threshold number of active genes necessary for the trait's outward expression. According to the two-threshold model, those females who do express the trait must have a greater overall "dose" or "genetic load" (i.e., they are further out in the extreme of the normal distribution of genotypes) than most of the males who express the trait. This proposition has been supported by data showing that, in addition to the greater overall risk for males as opposed to females, there is also a greater risk for the offspring (and other relatives) of female sociopaths as compared to the offspring (and other relatives) of male sociopaths. This phenomenon cannot be accounted for either by sex-linkage or by the differential experiences of the sexes. Besides providing a proximate explanation for the greater incidence of male sociopathy and crime, the two-threshold model also explains on a proximate level the finding that males are more susceptible to environmental influences than females. Somewhat paradoxically, while a male will express sociopathy at a lower "genetic dose" than is required for expression in a female, the heritability of the trait is greater for females, meaning that the environmental component of the variance is greater for males (8). The two-threshold model thus explains in a proximate sense what sociobiologists would predict from a more ultimate perspective. The fact that males are more susceptible than females to the environmental conditions of their early years fits well with sociobiological theory, in that the greater variance in male reproductive capacity makes their "choice" of life strategy somewhat more risky and therefore more subject to selective pressures (Symons 1979, Buss 1988, Mealey & Segal 1993).
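The logic of the two-threshold model can be made concrete with a toy simulation. Everything in the sketch below (the number of loci, the risk-allele frequency, the two thresholds) is a hypothetical illustration, not an estimate from the literature; the point is only that the model's qualitative predictions follow from its assumptions.

```python
# Illustrative simulation of the two-threshold polygenic model.
# All parameter values are hypothetical; only the qualitative
# pattern matters, not the numbers themselves.
import random

random.seed(1)
N_LOCI = 50                  # hypothetical number of contributing loci
P_RISK = 0.5                 # population frequency of the "risk" allele
MALE_T, FEMALE_T = 58, 62    # males express the trait at a lower load

def load():
    """Polygenic liability: risk-allele count over 2 * N_LOCI alleles."""
    return sum(random.random() < P_RISK for _ in range(2 * N_LOCI))

males = [load() for _ in range(20000)]
females = [load() for _ in range(20000)]
aff_m = [x for x in males if x >= MALE_T]       # affected males
aff_f = [x for x in females if x >= FEMALE_T]   # affected females

def child_load(parent_load):
    """One allele per locus from the parent, one from the population."""
    from_parent = sum(random.random() < parent_load / (2 * N_LOCI)
                      for _ in range(N_LOCI))
    from_pop = sum(random.random() < P_RISK for _ in range(N_LOCI))
    return from_parent + from_pop

# Risk that a son exceeds the male threshold, by affected parent's sex:
sons_of_aff_f = [child_load(random.choice(aff_f)) for _ in range(5000)]
sons_of_aff_m = [child_load(random.choice(aff_m)) for _ in range(5000)]
risk_via_mother = sum(x >= MALE_T for x in sons_of_aff_f) / 5000
risk_via_father = sum(x >= MALE_T for x in sons_of_aff_m) / 5000

print(len(aff_m) > len(aff_f))        # more affected males than females
print(sum(aff_f) / len(aff_f) > sum(aff_m) / len(aff_m))
# affected females carry the higher mean genetic load
print(risk_via_mother > risk_via_father)
# offspring of affected females are at greater risk
```

With these arbitrary parameters all three comparisons come out true: more males than females exceed their own threshold, the affected females carry a higher mean load, and sons of affected mothers are at greater risk than sons of affected fathers, which is exactly the offspring pattern the text cites as support for the model.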
Sociobiological reasoning thus leads to the postulate that males should be more sensitive to environmental cues that (1) trigger environmentally-contingent or developmentally-canalized life history strategies or (2) are stimuli for which genetically based individual differences in response thresholds have evolved. (Recall mechanisms 3, 4 & 5 for the maintenance of mixed-strategy ESSs in a population.) If the evolutionary models apply, then when, specifically, would sociopathy be the best available strategy? And what would be the environmental cues which, especially for boys, would trigger its development? To answer these questions, I turn to the child psychology literature, with a special focus on studies of life history strategies, delinquency, and moral development.

2.2 Child psychology

2.2.1 Life history strategies

Beginning with Draper and Harpending's now-classic 1982 paper on the relationship between father absence and reproductive strategy in adolescents, there has been an increasing effort to view development as the unfolding of a particular life history strategy in response to evolutionarily relevant environmental cues (Draper & Harpending 1982, Surbey 1987, MacDonald 1988, Crawford & Anderson 1989, Draper & Belsky 1990, Gangestad & Simpson 1990, Mealey 1990, Belsky, Steinberg & Draper 1991, Moffitt, Caspi, Belsky & Silva 1992, Mealey & Segal 1993). These models are based either implicitly or explicitly on the assumption that there are multiple evolutionarily adaptive strategies and that the optimal strategy for particular individuals will depend both upon their genotype and upon their local environment. To date, most developmental life history models address variance in reproductive strategies (for example, age at menarche or first sexual activity, number of mating partners, and amount of parental investment), but this type of modeling can also be applied to the adoption of social strategies such as cheating versus cooperation.
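The game-theoretic idea invoked here, cheating maintained as a minority strategy of a mixed ESS by frequency-dependent selection, can be sketched with toy replicator dynamics. The payoff numbers below are invented for illustration: cheaters do well while rare (most partners are cooperators to exploit) and badly when common.

```python
# Toy replicator dynamics for a cheater/cooperator mix.
# Payoffs are hypothetical: a cheater earns B against a cooperator
# and loses C against another cheater; cooperators earn a flat W_COOP.
B, C, W_COOP = 2.5, 2.0, 2.0

def step(p, rate=0.1):
    """One generation of replicator dynamics; p = cheater frequency."""
    w_cheat = B * (1 - p) - C * p            # falls as cheaters spread
    w_bar = p * w_cheat + (1 - p) * W_COOP   # population mean fitness
    return p + rate * p * (w_cheat - w_bar)

p = 0.01                                     # cheaters start rare
for _ in range(500):
    p = step(p)

# Stable interior equilibrium where w_cheat == W_COOP:
# p* = (B - W_COOP) / (B + C) = 0.5 / 4.5 ≈ 0.111
print(round(p, 3))   # → 0.111
```

The stable minority of cheaters (about 11% under these payoffs) is precisely a mixed ESS: any perturbation of the cheater frequency is pushed back toward the equilibrium by the payoff structure itself.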
Perhaps the most oft-mentioned factor suggested as being relevant to the development of a cheating strategy, especially in males, is being competitively disadvantaged with respect to the ability to obtain resources and mating opportunities. Theoretically, those individuals who are the least likely to outcompete other males in a status hierarchy, or to achieve mates through female choice, are the ones most likely to adopt a cheating strategy. (See e.g. Thornhill & Alcock 1983, Daly & Wilson 1983 and Gould & Gould 1989 re: non-human animals, and Symons 1979, Kenrick et al 1983, MacMillan & Kofoed 1984, Kofoed & MacMillan 1986, Cohen & Machalek 1988, Tooke & Camire 1991 and Thornhill & Thornhill 1992 re: humans.) In humans, competitive disadvantage could be related to a variety of factors, including age, health, physical attractiveness, intelligence, socioeconomic status, and social skills. Criminal behavior, one kind of cheating strategy, is clearly related to these factors. Of the seven cross-cultural correlates of crime reported by Ellis (1988), three (large number of siblings, low socioeconomic status, and urban residency) seem directly related to resource competition; the other four (youth, maleness, being of black racial heritage, and coming from a single-parent or otherwise disrupted family background) can plausibly be argued to be related to competition as well (see e.g. Kenrick et al 1983, Wilson & Herrnstein 1985, Ellis 1988, and Cohen & Machalek 1988). Empirical data suggest that deficits in competitive ability due to psychosis (Hodgins 1992), intellectual handicap (Moffitt & Silva 1988, Quay 1990a, Stattin & Magnusson 1991, Hodgins 1992), or poor social skills (Hogan & Jones 1983, Simonian, Tarnowski & Gibbs 1991, Garmezy 1991, Dishion, Patterson, Stoolmiller & Skinner 1991) are also associated with criminal behavior.
Likewise, the competitive advantages conferred by high intelligence (Hirschi & Hindelang 1977, Wilson & Herrnstein 1985, Silverton 1988, Kandel, Mednick, Kirkegaard-Sorensen, Hutchings, Knop, Rosenberg, & Schulsinger 1988, White, Moffitt & Silva 1989) or by consistent external support (Garmezy 1991) can mitigate the development of criminal or delinquent behavior in those who are otherwise at high risk. Rape and spouse abuse, other forms of cheating strategy, appear to be related to the same life history factors as crime (Ellis 1989 & 1991a, Malamuth, Sockloskie, Koss & Tanaka 1991, Thornhill & Thornhill 1992). In fact, Huesmann, Eron, Lefkowitz & Walder (1984), Rowe and Rodgers (1989), and Rowe, Rodgers, Meseck-Bushey & St. John (1989) present evidence that there is a common genetic component to the expression of sexual and nonsexual antisocial behavior. Given the overlaps between rape, battering, and criminality in terms of life history circumstances, genetics, and apparent inability to empathize with one's victim, it would be parsimonious to postulate that they might be expressions of a single sociopathy spectrum. As such, these antisocial behaviors could be considered to be genetically influenced, developmentally- and environmentally-contingent cheating strategies, utilized when a male finds himself at a competitive disadvantage (see also Figueredo & McCloskey n.d.). Along these lines, MacMillan and Kofoed (1984) presented a model of male sociopathy based on the premise that sexual opportunism and manipulation are the key features driving both the individual sociopath and the evolution of sociopathy. Harpending and Sobus (1987) posited a similar basis for the evolution and behavioral manifestations of Briquet's Hysteria in women, suggesting that this syndrome of promiscuity, fatalistic dependency, and attention-getting is the female analogue, and homologue, of male sociopathy.
2.2.2 Delinquency

Childhood delinquency is a common precursor of adolescent delinquency and of adult criminal and sociopathic behavior (Robins & Wish 1977, Loeber 1982, Loeber & Dishion 1983, Loeber & Stouthamer-Loeber 1987, Patterson et al 1989); in fact, childhood conduct disorder is a prerequisite finding for a diagnosis of adult antisocial personality (APA 1987). Importantly, just as in the literature on adults, a distinction is frequently made between two subtypes of conduct disorder in children: Lytton (1990), for example, distinguishes between "solitary aggressive type" and "group type"; Loeber (1990) distinguishes between "versatiles" and "property offenders"; and Strauss & Lahey (1984) distinguish between "unsocialized" and "socialized". I will argue that these subtypes are precursors of two types of adulthood antisociality, with "solitary aggressive", "versatile", or "unsocialized" types leading to primary sociopathy and "group", "property offender", or "socialized" types presaging secondary sociopathy. I will also argue that the differing life history patterns of these two types of delinquents are reflections of two different evolutionary mechanisms for maintaining ESSs in a population: mechanism 1 and mechanism 5, respectively (see Section 1.2). Although more than half of juvenile delinquents outgrow their behavior (Lytton 1990, Robins, Tipp & Przybeck 1991, Gottesman & Goldsmith 1993), the frequency of juvenile antisocial behaviors is still the best predictor of adult antisocial behavior, and the earlier such behavior appears, the more likely it is to be persistent (Farrington 1986, Loeber & Stouthamer-Loeber 1987, Lytton 1990, Stattin & Magnusson 1991, White, Moffitt, Earls, Robins & Silva 1990, and Robins, Tipp & Przybeck 1991).
The mean age at which adult sociopaths exhibited their first significant symptom is between eight and nine years; 80% of all sociopaths exhibited their first symptom by age eleven (Robins, Tipp & Przybeck 1991); and over two-thirds of eventual chronic offenders are already distinguishable from other children by kindergarten (Loeber & Stouthamer-Loeber 1987). Thus, by evaluating the environments of juvenile delinquents, we can fairly reliably reconstruct the childhood environments of adult sociopaths. Studies of this sort consistently implicate several relevant environmental factors correlated with boyhood antisocial behavior: inconsistent discipline, parental use of punishment as opposed to rewards, disrupted family life (especially father absence, family violence, an alcoholic parent, or a mentally ill parent), and low socioeconomic status (Cadoret 1982, Loeber & Dishion 1983, van Dusen, Mednick, Gabrielli & Hutchings 1983, Wilson & Herrnstein 1985, Farrington 1986 & 1989, McCord 1986, Silverton 1988, Patterson et al 1989, Lytton 1990, Offord, Boyle & Racine 1991). Besides the fact that all of these variables are more likely to exist when one or the other parent is sociopathic, and the child is hence genetically predisposed to sociopathy, behaviorist and social learning models of the dynamics of early parent-child interactions (to be described in Section 2.4.2) have been fairly convincing in explaining how antisocial behaviors can be reinforced under such living conditions.
Interestingly, in line with the postulate that cheating strategies would be most likely to be used by individuals who are at a competitive disadvantage, McGarvey, Gabrielli, Bentler & Mednick (1981), Loeber & Dishion (1983), Hogan & Jones (1983), Magid & McKelvie (1987), Kandel et al (1988), Hartup (1989), and Patterson et al (1989) all suggest that the common way in which high-risk familial and environmental factors contribute to delinquency is by handicapping children with respect to their peers in terms of social skills, academic ability, and self-esteem. This noncompetitiveness then leads disadvantaged youths to seek alternative peer groups and social environments in which they can effectively compete (Dishion et al 1991). If they are successful in the estimation of their new peer group, adopting this strategy may lead to "local prestige" (Rowe, personal communication) sufficient to commandeer resources, deter rivals, or gain sexual opportunities within the new referent group (see also Moffitt 1993). In other words, competitively disadvantaged youth may be trying to "make the best of a bad job" (Dawkins 1980, Cohen & Machalek 1988) by seeking a social environment in which they may be less handicapped or even superior. The correlates of delinquency in girls are essentially the same as those for boys, although delinquency is less common in girls (Robins 1986, Lytton 1990, White, Moffitt, Earls, Robins & Silva 1990). Caspi, Lynam, Moffitt & Silva (1993) found that delinquency in girls, as in boys, is arrived at via two different developmental trajectories. One pattern includes a history of antisocial behavior throughout childhood and a tendency to seek out delinquent peers; based on previous research (White, Moffitt, Earls, Robins & Silva 1990), this life history trajectory is thought to lead to persistent antisocial behavior in adulthood.
The second pattern is exhibited by girls who have few behavior problems in childhood but who, upon reaching menarche, exhibit more and more frequent antisocial behaviors. The antisocial behavior of girls who show this latter pattern is thought to be more a product of environmental influence than that of girls who follow the first trajectory, as this pattern is selectively exhibited by girls who (a) have an early age of menarche and (b) are in coeducational school settings. These girls, upon reaching early sexual maturity, start associating with older male peers and exhibiting some of the antisocial behaviors that are more often displayed by older boys than by their younger female peers (see Maccoby 1986); girls who follow this trajectory are expected to "outgrow" their antisocial activities. Although the two subsets of delinquent girls would be difficult to differentiate using a cross-sectional methodology, in accordance with the model presented here, Caspi et al. consider their differing developmental histories to be of theoretical importance for longitudinal studies and of practical importance for early intervention. (See Moffitt 1993 for a similar scenario regarding boys.)

2.2.3 Moral development

Like the tendency to engage in antisocial behavior, an individual's tendency to engage in prosocial behavior seems to be fairly stable from an early age (Rushton 1982). Yet the development of individual differences in behavior has not been as well studied as the presumably universal stages of cognition that underlie changes in moral reasoning. Kohlberg's (1964) stage model of moral development, for example, ties advances in moral thinking to advances in reasoning ability and attributes individual differences largely to differences in cognitive ability. While it is clear that both moral reasoning and moral behavior covary with age (Rushton 1982) and may do so in a manner consistent with some evolutionists' thinking (e.g.
Alexander 1987), cognitive models alone cannot explain the absence of moral behavior in sociopaths, who are not intellectually handicapped with respect to the normal population. Other developmental models posit the emergence of empathy and the other social emotions as prerequisites for moral behavior (see Zahn-Waxler & Kochanska 1988 for a review). Even very young children, it seems, are in a sense biologically prepared to learn moral behavior, in that they are selectively attentive to emotions, especially distress, in others (Hoffman 1978, Zahn-Waxler & Radke-Yarrow 1982, Radke-Yarrow & Zahn-Waxler 1986). Hoffman (1975, 1977, 1982), for example, suggests that the observation of distress in others triggers an innate "empathic distress" response in the child, even before the child has the cognitive capacity to differentiate "other" from "self". Accordingly, any instrumental behavior which serves to reduce the distress of the other also serves to relieve the vicarious distress of the child. Thus, very young children might learn to exhibit prosocial behavior long before they are able to conceptualize its effect on others. In Hoffman's model, the motivation behind early prosocial behavior is the (egocentric) need to reduce one's own aversive feelings of arousal and distress. As the child ages, the range of cues and stimuli which can trigger this vicarious distress increases through both classical and operant conditioning. Eventually, when the child develops the cognitive ability to "role play" or take on another's perspective, empathic distress turns to "sympathetic distress", which motivates prosocial behavior that is more likely to be interpreted as intentional, altruistic, and moral.
Hoffman's model of prosocial behavior dovetails nicely with Hirshleifer's (1987) "Guarantor" and Frank's (1988) "Commitment" models of emotion (see section I.A.): the reduction in anxiety which follows cooperative or prosocial behavior reinforces such behavior, while the increase in anxiety which, through stimulus generalization, follows acts or thoughts of antisocial behavior punishes and therefore reduces those acts and thoughts. Dienstbier (1984) reported an interesting series of studies testing the role of anxiety and emotional arousal in cheating. As expected, high arousal levels were associated with low cheating levels (and vice versa), but the subjects' attribution of the cause of high arousal was also important. When subjects were able to attribute their arousal to a cause other than the temptation to cheat, they found it much easier to cheat than when they had no other explanation for their arousal level. Subjects were also less willing to work to avoid punishment when they were able to attribute their arousal to an external cause rather than to an internal source of anxiety associated with the threat of punishment. Dienstbier concluded that when a situation is perceived to be "detection-free", one's temptation to cheat is either resisted or not depending on the level of anxiety perceived to be associated with the temptation. The ability to act intentionally in either a prosocial or antisocial manner (or, in terms of game theory, cooperatively or deceptively) depends upon having reached a certain level of cognitive development at which it is possible to distinguish emotions of the self from emotions of others; i.e., the child must pass from empathic responses to sympathetic responses (Dunn 1982, Hoffman 1975, 1977, 1984, Mitchell 1986, Vasek 1986).
This transition begins to occur some time during the second year (Hoffman 1975, 1982, Leslie 1987, Dunn 1987, 1988, and Dunn, Brown, Slomkowski, Tesla & Youngblade 1991), when the child is beginning to develop what has come to be called a "theory of mind" (Premack & Woodruff 1978). Having a theory of mind allows one to impute mental states (thoughts, perceptions, and feelings) not only to oneself, but also to other individuals. Humphrey (1976, 1983) suggests that this kind of awareness evolved in humans because it was a successful tool for predicting the behavior of others. Humphrey claims that the best strategists in the human social game would be those who could use a theory of mind to empathize accurately with others and thereby predict the most adaptive strategy or play in a social interaction. (Byrne & Whiten 1988 call this aptitude "Machiavellian intelligence".) Humphrey's model is something of a cognitive equivalent of the evolutionary models of emotion discussed in section 1.1; the two can probably be considered complementary and mutually reinforcing. With regard to sociopathy, the question is whether a strategist can be successful using only the cognitive tool of a theory of mind, without access to the emotional, empathic information which, presumably, sociopaths lack (Mealey 1992). In the next section I will argue that this is exactly what a sociopath does.

2.3 Personality theory

The models of normative moral development presented above are helpful but clearly insufficient to explain sociopathy. Although some adoption studies and most longitudinal studies report significant effects of social and environmental risk factors on delinquency and criminality, the magnitude of that risk as a simple main effect is rather small.
Despite repeated exposure to inconsistent and confusing reinforcement and punishment, most children who grow up with these risk factors do not turn out to be sociopathic, whereas some children who do not experience such risk factors do. Studies have repeatedly shown that the effect of the environment is much more powerful for children at biological risk than for others. What is it that makes "high risk" environmental features particularly salient for those individuals who have a certain predisposing genotype?

2.3.1 The role of gene-environment interactions

Stimulated by the work of Rowe and Plomin (Rowe & Plomin 1981, Rowe 1983a&b, 1990a&b; Plomin & Daniels 1987, Dunn & Plomin 1990), evidence is accumulating that, contrary to what has traditionally been assumed, the most important environmental features and events influencing personal development are not those that are shared by siblings within a family (such as parenting style, socioeconomic status, and schooling), but rather idiosyncratic events and relationships which are difficult to study systematically with traditional methods. Despite a shared home, individual children will encounter different microenvironments: their individual relationships with their parents will differ, and their experiences on a day-to-day, minute-by-minute basis will not overlap significantly. In addition, there will be some environmental differences which are due to genetic differences; children with different personalities, aptitudes, and body types will not only seek out different experiences (Scarr & McCartney 1983, Caspi, Elder & Bem 1987, Rowe 1990a), but will also attribute different phenomenological interpretations to the same experiences (Rowe 1983a & 1990b, Dunn 1992).
For these reasons, any two children will experience an (objectively) identical environment in different ways; there is, in some sense, no real validity to some of the operational measures we currently use to describe a child's environment (Rowe & Plomin 1981, Plomin & Daniels 1987, Wachs 1992). Although this may sound discouraging for those who seek to apply psychological research to the prevention of crime and delinquency (and most such efforts have, in fact, been fairly unsuccessful; Feldman 1977, Gottschalk, Davidson, Gensheimer & Mayer 1987, Borowiak 1989, and Patterson et al 1989), there are reasons for optimism. Palmer (1983) suggests that the "nothing works" conclusion is valid only in the sense that no single intervention technique will be successful across the board, and that targeting different strategies to different individuals should prove more successful. More and more studies are suggesting that there are at least two developmental pathways to delinquency and sociopathy and that we need to address them separately (Quay 1990b, Lytton 1990, White et al 1990, Caspi et al 1993, Moffitt 1993, Dishion & Poe 1993, Patterson 1993, McCord 1993, Simons 1993). The evolutionary model presented here makes specific predictions about the likely differential success of different intervention and prevention strategies for individuals arriving at their antisocial behavior via different paths: while individuals of differing genotypes may end up with similar phenotypes, different environmental elements and experiences may be particularly salient for them. (This is a corollary of mechanism 5 for the maintenance of ESSs, presented in Section 1.2.) As will be argued below, primary and secondary sociopathy seem to provide an excellent illustration of the development of similar phenotypes from different genotype-environment interactions.
To the extent that we understand it now, primary sociopaths come from one extreme of a polygenic distribution and seem to have a genotype that disposes them "to acquire and be reinforced for displaying antisociality" (Rowe 1990a, p 122). That genotype results in a certain inborn temperament or personality, coupled with a particular pattern of autonomic arousal, that together seem to design the individual (1) to be selectively unresponsive to those environmental cues which are necessary for normal socialization and moral development and (2) to actively seek the more deviant and arousing stimuli within the environment. Secondary sociopaths, on the other hand, are not as genetically predisposed to their behavior; rather, they are more responsive to environmental cues and risk factors, becoming sociopathic "phenocopies" (after Raine 1993) or "mimics" (after Moffitt 1993) when the carrying capacity of the "cheater" niche grows. What are the predisposing constitutional factors that place some individuals at high risk?

2.3.2 The role of temperament

In a twin study, Rushton, Fulker, Neale, Nias & Eysenck (1986) found evidence of substantial heritability of self-reported measures of altruism, nurturance, aggressiveness, and empathy. Across twin pairs, altruism, nurturance, and empathy increased with age, while aggressiveness decreased; sex differences (in the expected direction) were found for nurturance, empathy, and aggression; and for all measures, the environmental contributions were determined to be individual rather than familial. Methodological considerations do not allow full confidence in the numerical heritability estimates of this study, but Eisenberg, Fabes & Miller (1990) conclude that it reports true individual differences which are likely to be a result of genetic differences in temperament, specifically sociability and emotionality. More recently, two additional twin studies have confirmed the findings of Rushton et al.
Emde, Plomin, Robinson, Corley, DeFries, Fulker, Reznick, Campos, Kagan & Zahn-Waxler (1992) reported significant heritabilities for empathy, behavioral inhibition, and expressions of negative affect, while Ghodsian-Carpey & Baker (1987) found significant heritabilities on four measures of aggressiveness in children. Like the Rushton et al study, both of these studies also reported sex differences, and both confirmed the relative importance of nonshared, as opposed to shared, environmental influences. A fourth twin study (Rowe 1986) used a different set of personality indices but went a step further in establishing the link between temperament and antisocial behavior. Rowe's analysis suggests that, especially for males, the inherited factors correlated with one's genetic risk of delinquency are the same as those that lead to the temperamental attributes of anger, impulsivity, and deceitfulness ("self-serving dishonesty with people with whom a person ordinarily has affectional bonds" p 528). Interestingly, while Rowe found that common genetic factors related temperament and delinquency, it was environmental factors which related academic nonachievement with delinquency. These findings provide evidence for the two-pathway model presented in Section 1.2, in that such a gene-environment interaction (1) would create at least two possible routes to sociopathy or criminality, one primarily heritable and one less so, and (2) in terms of the latter, less heritable pathway, would set the stage for developmentally- and environmentally-contingent individual differences in antisocial behavior. In addition, in line with previously mentioned studies and the proposed model, the environmental factors Rowe found to be statistically significant varied within families and were more significant for males than for females. 
Most of the research into the relationship between temperament, personality, and sociopathy has been based on the extensive work of Hans Eysenck (summarized in Eysenck 1977 & 1983, Eysenck & Gudjonsson 1989, and Zuckerman 1989). Eysenck first postulated and then convincingly documented that sociopathy in particular and antisocial behavior in general are correlated with high scores on all three of the major personality dimensions of the Eysenck Personality Questionnaire: 'extraversion' (contra introversion), 'neuroticism' (contra emotional stability), and 'psychoticism' (contra fluid and efficient superego functioning; it is not synonymous with psychotic mental illness, and Zuckerman (1989) suggests that this scale would be better called 'psychopathy'). All three of these dimensions exhibit substantial heritability, and since psychoticism is typically much higher in males than in females, it is a likely candidate for one of the relevant sex-limited traits that fit Cloninger's two-threshold model explaining the sex difference in the expression of sociopathy. In trying to explain the proximate connections between temperament, delinquency, sociopathy, and criminal behavior, Eysenck and colleagues devised the "General Arousal Theory of Criminality" (summarized in Eysenck & Gudjonsson 1989), according to which the common biological condition underlying all of these behavioral predispositions is the inheritance of a nervous system which is relatively insensitive to low levels of stimulation. Individuals with such a physiotype, it is argued, will be extraverted, impulsive, and sensation-seeking, because under conditions of relatively low stimulation they find themselves at a suboptimal level of arousal; to increase their arousal, many will participate in high-risk activities such as crime (see also Farley 1986 and Gove & Wilmoth 1990).
In general support of this model, Ellis (1987) performed a meta-analysis which found that both criminality and sociopathy were associated with a variety of indicators of suboptimal arousal, including childhood hyperactivity, recreational drug use, risk-taking, failure to persist on tasks, and preference for wide-ranging sexual activity. Additional confirmation of the arousal model comes from Zuckerman, who found a similar pattern of behaviors associated with his measure of sensation-seeking. (The following summary is derived from Zuckerman 1979, Zuckerman, Buchsbaum & Murphy 1980, Daitzman & Zuckerman 1980, and Zuckerman 1983, 1984, 1985, 1990 & 1991.) In addition to seeking thrills and novelty, sensation-seekers describe "a hedonistic pursuit of pleasure through extraverted activities including social drinking, parties, sex, and gambling", "an aversion to routine activities or work and to dull and boring people", and "a restlessness in an unchanging environment" (Zuckerman et al 1980, p 189). In college students, sensation-seeking is correlated with the Pd (Psychopathic Deviate) scale of the Minnesota Multiphasic Personality Inventory, and among prisoners it can be used to distinguish primary psychopaths from secondary psychopaths and non-psychopathic criminals (see also Fagan & Lira 1980). Zuckerman also shows that sensation-seeking as a temperament appears at an early age (3-4 years), exhibits a high degree of heritability, correlates negatively with age in adults, and exhibits sex differences, with males scoring higher. Because it shows a relationship with both sex and age, sensation-seeking (and its presumed underlying hypoarousal) may also be a good candidate for a trait which can explain the distribution and expression of sociopathy (see also Baldwin 1990).
Gray (1982, 1987) and Cloninger (Cloninger 1987a, Cloninger, Svrakic & Przybeck 1993) have proposed updated versions of the Eysenck model in which the three personality factors are rotated and renamed so as to correspond more clearly to known neural circuitry. Gray names the three systems the approach (or behavioral activation) system, the behavioral inhibition system, and the fight/flight system; Cloninger names them "novelty-seeking", "harm-avoidance", and "reward-dependence". The three factors explain the same variance in personality as Eysenck's original factors and have been shown to be independent and highly heritable (Cloninger 1987). In addition to mapping more closely onto known neural systems, these three factors are also proposed to correspond to differential activity of three neurochemicals: dopamine for behavioral activation (or novelty-seeking), serotonin for behavioral inhibition (or harm-avoidance), and norepinephrine for fight/flight (or reward-dependence); see Depue & Spoont 1986, Cloninger 1987, Charney, Woods, Krystal & Heninger 1990, Eysenck 1990, and Raine 1993 for partial reviews.

2.3.3 The role of physiology

Using Cloninger's terminology, sociopaths are individuals who are high on novelty-seeking, low on harm-avoidance, and low on reward-dependence. Thus, we should expect them to be high on measures of dopamine activity, low on measures of serotonin activity, and low on measures of norepinephrine activity; the data suggest that they are. Zuckerman (1989) reports that sensation-seeking is negatively correlated with levels of dopamine-beta-hydroxylase (DBH), the enzyme which breaks down dopamine, and that extremely low levels of DBH are associated with undersocialized conduct disorder and psychopathy. Importantly, with respect to the two-pathway model, boys with socialized conduct disorder (those with fewer, later-appearing symptoms, who are posited to be at risk for secondary, as opposed to primary, sociopathy) had high levels of DBH.
In addition, extraverts and delinquents are reported to have lower than average levels of adrenaline (epinephrine) and norepinephrine under baseline circumstances; Magnusson (1985, as cited by Zuckerman 1989) reports that urinary epinephrine measures of boys at age 13 significantly predicted criminality at ages 18-25. High sensation-seekers, criminals, and other individuals scoring high on measures of impulsivity and aggression also have significantly lower levels than others of the serotonin metabolite 5-HIAA (Brown, Goodwin, Ballenger, Goyer & Major 1979, Brown, Ebert, Goyer, Jimerson, Klein, Bunney & Goodwin 1982, Muhlbauer 1985, Depue & Spoont 1986, Zuckerman 1989, 1990, Kruesi, Hibbs, Zahn, Keysor, Hamburger, Bartko & Rapoport 1992, and Raine 1993). These are not small effects: Raine (1993) reports an average effect size (the difference between group means divided by the standard deviation) for serotonin of .75, and for norepinephrine of .41; Brown et al (1979) reported that 80% of the variance in aggression scores of their sample was explained by levels of 5-HIAA alone; and Kruesi et al reported that knowing 5-HIAA levels increased the explained variance of aggression at a two-year follow-up from 65% (using clinical measures only) to 91% (clinical measures plus 5-HIAA measures). Levels of monoamine oxidase (MAO)- an enzyme which breaks down the neurotransmitters serotonin, dopamine, epinephrine, and norepinephrine- are also low in antisocial and sensation-seeking individuals (Zuckerman 1989, 1990, Ellis 1991b). Individual differences in platelet MAO appear shortly after birth and are stable (Zuckerman 1989, 1990 and Raine 1993); Zuckerman reports an estimated heritability of .86. 
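The effect-size and variance arithmetic above can be sketched in a few lines of Python. The group means and standard deviation below are invented for illustration; only the .75 effect size and the 65%/91% variance figures come from the studies cited.

```python
# Sketch of the effect-size arithmetic described above. The group means
# and standard deviation are invented; only d = .75 and the 65% -> 91%
# variance-explained figures are taken from the text.

def effect_size(mean_a, mean_b, pooled_sd):
    """Effect size: difference between group means divided by the SD."""
    return (mean_a - mean_b) / pooled_sd

# Hypothetical 5-HIAA-like scores for controls vs. an antisocial group:
d = effect_size(mean_a=100.0, mean_b=85.0, pooled_sd=20.0)
print(f"effect size d = {d:.2f}")  # 0.75, the value Raine reports for serotonin

# Kruesi et al.'s incremental variance explained:
r2_clinical = 0.65   # clinical measures alone
r2_combined = 0.91   # clinical measures plus 5-HIAA
print(f"added variance explained = {r2_combined - r2_clinical:.0%}")  # 26%
```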
Recently, a mutant version of the gene coding for MAO-A, the form of MAO specific to serotonin, has been identified in an extended family in which the males show a history of repeated, unexplained outbursts of aggressive behavior (Brunner, Nelen, Breakefield, Ropers & van Oost 1993, Morrell 1993); urinalysis indicated that the MAO-A is not functioning normally in the affected men. Psychophysiological studies also report significant differences between sociopaths and others. [Reviews of this literature can be found in Mednick, Moffitt & Stack (1987), Trasler (1987), Raine (1989), Eysenck & Gudjohnsson (1989), Raine & Dunkin (1990), Zuckerman (1990), and Raine (1993).] Among the findings are that: high sensation-seekers and sociopaths are more likely than lows and normals to show orienting responses to novel stimuli of moderate intensity, whereas lows and normals are more likely to show defensive or startle responses; criminals and delinquents tend to exhibit a slower alpha (resting) frequency in their electroencephalogram (EEG) than age-matched controls; high sensation-seekers and delinquents differ from lows and nondelinquents in the amplitude and shape of cortical evoked potentials; extraverts and sociopaths show less physiological arousal than introverts and normals in response to threats of pain or punishment, and more tolerance of actual pain or punishment; and delinquents (though not necessarily adult criminals) tend to have lower baseline heart rates than nondelinquents. 
The case that these psychophysiological factors are significant causes, not just correlates, of sociopathy is strengthened by evidence (a) that these measures of autonomic reactivity are just as heritable as the temperament they are associated with (Zuckerman 1989, Gabbay 1992), and (b) that the same physiological variables which differentiate identified sociopaths, delinquents, and criminals from others can also significantly predict later levels of antisocial behavior in unselected individuals (Loeb & Mednick 1977, using skin conductance; Volavka, Mednick, Gabrielli, Matousek & Pollock 1984, using EEG; Satterfield 1987, using EEG; Raine, Venables & Williams 1990a, using EEG, heart rate, and skin conductance; and Raine, Venables & Williams 1990b, using evoked potentials). As with the reports on neurochemistry, these effects are not small; Raine (1993) reports that for heart rate, the average effect size across ten studies was .84. Another important physiological variable in the distribution of sociopathic behavior is testosterone. Testosterone (or one of its derivatives) is a likely candidate for the role of trigger of the sex-limited activation of genes required by the two-threshold model presented earlier. The mechanism of action of steroid hormones is to enter the nucleus of the cell and interact with the chromosomes, regulating gene expression. This differential activity of the genes leads to some of the individual, age, and sex differences we see in temperament, specifically, psychoticism, aggression, impulsivity, sensation-seeking, nurturance, and empathy (Zuckerman et al 1980, Zuckerman 1984, 1985, 1991 and Ellis 1991b). 
Variation in testosterone levels also parallels the age variation in the expression of sociopathic behavior and is correlated with such behavior in adolescent and adult males (Daitzman & Zuckerman 1980, Zuckerman 1985, Rubin 1987, Olweus 1986, 1987, Schalling 1987, Susman, Inoff-Germain, Nottelman, Loriaux, Cutler & Chrousos 1987, Ellis & Coontz 1990, Udry 1990, Dabbs & Morris 1990, Gladue 1991 and Archer 1991). Testosterone is thus likely to play a dual role in the development of sociopathy, just as it does in the development of other sex differences: one as an organizer (affecting traits) and one as an activator (affecting states). Udry (Drigotas & Udry 1993, Halpern, Udry, Campbell & Suchindran 1993), unable to replicate his own 1990 finding of an activating effect of testosterone, has suggested that the correlation between testosterone and aggression might instead reflect a physiosocial feedback loop; he posits that boys with high, early levels of testosterone mature faster and, being bigger, are more likely to get into fights. Since levels of testosterone, adrenaline, and serotonin have been shown to fluctuate in response to social conditions (McGuire, Raleigh & Johnson 1983, Raleigh, McGuire, Brammer & Yuwiler 1984, Schalling 1987, Olweus 1987, Raleigh, McGuire, Brammer, Pollack & Yuwiler 1991, Archer 1991, Kalat 1992), this sociophysiological interaction creates a positive feedback loop: those who start out with high levels of testosterone and sensation-seeking (and low levels of adrenaline, serotonin, and MAO) are (1) more likely than others to initiate aggressive behavior, and (2) more likely to experience success in dominance interactions, leading to (3) an increased probability of experiencing further increases in testosterone, which (4) further increases the likelihood of continued aggressive behavior. 
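The four-step loop just described can be rendered as a toy difference equation. Everything in this sketch (the gain parameter, the win-probability function) is an invented illustration of the loop's logic, not a fitted model of any of the studies cited.

```python
# Toy rendering of the positive feedback loop described above. The gain
# and the win-probability function are invented for illustration only.

def run_feedback(testosterone=1.0, gain=0.1, steps=5):
    """Iterate the loop: high T -> aggression -> dominance wins -> higher T."""
    history = [testosterone]
    for _ in range(steps):
        aggression = testosterone                 # (1) T raises odds of initiating aggression
        win_prob = aggression / (1 + aggression)  # (2) success in dominance interactions
        testosterone *= 1 + gain * win_prob       # (3) winning raises T further,
        history.append(testosterone)              # (4) feeding back into step (1)
    return history

levels = run_feedback()
print([round(t, 3) for t in levels])  # a strictly rising sequence
```

Because each win nudges the hormone level upward, the trajectory is monotonically increasing, which is the self-amplifying character the text attributes to the loop.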
Another example of a sociophysiological feedback loop comes from Dabbs and Morris (1990), who found significant correlations between testosterone levels and antisocial behavior in lower class men but not in upper class men. They explained this by positing that upper class men are more likely, because of differential socialization, to avoid individual confrontations. If this is true, it would mean that upper class men are, because of their socialization, specifically avoiding those types of social encounters which might raise their testosterone (and in turn, their antisocial behavior). This interpretation is supported by the finding (in the same study) that significantly fewer upper class than lower class men had high testosterone levels. Thus, it is possible that upper class socialization may mitigate the influence of testosterone. An alternative explanation- that the aggressive behavior associated with higher testosterone levels leads to downward social mobility- also suggests a recursive sociophysiological interaction. Raine (1988) has argued that since upper class children are less likely than lower class children to suffer the environmental risks predisposing one toward sociopathic behavior, when such behavior is seen in upper class individuals, it is likely to be the result of a particularly strong genetic predisposition. Evidence supporting this has been reported by three independent studies. Wadsworth (1976) found physiological indicators of hypoarousal amongst upper-class, but not lower-class, boys who subsequently became delinquent. Raine (Raine & Venables 1981, 1984; also reported in Raine 1988 and Raine & Dunkin 1990) found indicators of hypoarousal in his upper-class antisocial subjects, but the reverse in his lower-class subjects. 
Satterfield (1987) found that of his lower-class subjects, those in a biological high-risk group were seven times more likely to have been arrested than those in his control group, whereas among his middle- and upper-class subjects, the rates were 25 and 28 times, respectively. This outcome was a result of lower rates of criminal activity in the control groups of the middle- and upper-class subjects as compared to the lower-class controls; i.e., almost all of those who had been arrested from the middle and upper classes were biologically at high risk, but this was not true for the lower-class subjects. The implications of these findings are of tremendous import, as they suggest that (1) the effect of the social environment might be considerably larger than suggested by adoption studies, and (2) there might be different etiological pathways to sociopathy, and therefore different optimal strategies for its prevention or remediation, depending upon what kind of social and environmental background the person has experienced.

2.4 Learning Theory

Adoption studies show that the environment clearly plays an important role in the etiology of sociopathy, but that its effects are different for individuals of different genotype. As mentioned in section 2.3.1, some of this difference is likely to be a result of gene-environment correlations, in that different environments are sought by individuals of different genotypes; some will be a result of differences in interpretation of the same environment by individuals of different genotypes; and some will be a result of differences in environment impinging upon people because of differences in their genotype (e.g. discriminating parental treatment of two children differing in temperament). In nonadoptive families, gene-environment correlations will be even stronger because parents with certain personality types will provide certain environments for their children. 
These differential effects of environment on individuals of varying genetic risk for sociopathy become readily apparent when we examine the effect of the interaction between physiotype and conditioning on the process of socialization.

2.4.1 Conditioning

There is evidence that individuals with a hypoaroused nervous system are less sensitive than most people to the emotional expression of other individuals, and to social influences in general (Eliasz & Reykowski 1986, Eysenck 1967 as cited in Patterson & Newman 1993). They are also less responsive to levels and types of stimuli that are normally used for reinforcement and punishment (Eliasz 1987); as a result, they are handicapped in learning through autonomic conditioning, although they exhibit no general intellectual deficit (e.g. Hare & Quinn 1971, Eysenck 1977, Mednick 1977, Ziskind, Syndulko & Maltzman 1978, Gorenstein & Newman 1980, Newman, Widom & Nathan 1985, Raine 1988, Lytton 1990, Zuckerman 1991). One of the posited consequences of this learning deficit is a reduced ability to be socialized by the standard techniques of reward and punishment that are used (especially in the lower classes and by uneducated parents) on young children. In particular, hypoaroused individuals have difficulty inhibiting their behavior when both reward and punishment are possible outcomes (Newman, Widom & Nathan 1985, Newman & Kosson 1986, Newman 1987, Zuckerman 1991, Patterson & Newman 1993); in situations where most people would experience an approach-avoidance conflict, sociopaths and extraverts are more likely to approach (see also Dienstbier 1984). Because of their high levels of sensation-seeking, children with a hypoaroused nervous system will be more likely than other children to get into trouble, and when they do, will be less likely to be affected by, and learn from, the consequences, whether those consequences are a direct result of their behavior or an indirect result such as parental punishment. 
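The conditioning handicap described above can be caricatured with a simple delta-rule learner. The payoff values and the punishment-sensitivity parameter below are invented for illustration and are not drawn from the studies cited; the sketch only shows how an attenuated punishment signal can leave "approach" attractive despite identical objective outcomes.

```python
# Toy delta-rule learner illustrating the conditioning deficit described
# above. Payoffs and the punishment-sensitivity parameter are invented.

def learned_approach_value(punish_sensitivity, trials=100, lr=0.1):
    """Value of 'approach' when it brings reward +1 and punishment -2 equally often."""
    value = 0.0
    for t in range(trials):
        # Alternate reward and punishment; the hypoaroused learner feels
        # only a fraction of the punishment.
        outcome = 1.0 if t % 2 == 0 else -2.0 * punish_sensitivity
        value += lr * (outcome - value)   # simple delta-rule update
    return value

typical = learned_approach_value(punish_sensitivity=1.0)  # feels full punishment
hypo = learned_approach_value(punish_sensitivity=0.3)     # attenuated punishment signal

# Same objective outcomes, but the hypoaroused learner still values approach:
assert typical < 0 < hypo
```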
Despite continuing problems with operational definitions, recent research suggests that there might be distinguishable differences in learning between primary and secondary sociopaths, or children with unsocialized versus socialized conduct disorder (Newman et al 1985, Gray 1987, Quay 1990b, Newman, Kosson & Patterson 1992). Primary sociopaths, with their inability to experience the social emotions, exhibit deficits on tasks which typically induce anxiety in others, specifically, passive avoidance tasks, approach-avoidance tasks, and tasks involving punishment, but they can learn well under other conditions (Raine, Venables & Williams 1990b, Newman et al 1992, Patterson & Newman 1993, Raine 1993). Secondary sociopaths and extraverts, on the other hand, have normal levels of anxiety and responses to punishment, but they may be especially driven by high-reward conditions (Boddy, Carver & Rowley 1986, Derryberry 1987, Newman, Patterson, Howland & Nichols 1990). Primary sociopaths, with diminished ability to experience anxiety and to form conditioned associations between antisocial behavior and the consequent punishment, will be unable to progress through the normal stages of moral development. Unlike most children, who are biologically prepared to learn empathy, they are contraprepared to do so, and will remain egoistic- unable to acquire the social emotions of empathy, shame, guilt, and love. They present at an early age with "unsocialized" conduct disorder. Secondary sociopaths, with normal emotional capacities, will present, generally at a later age, with "socialized" conduct disorder (Loeber 1993, Patterson 1993, Simons 1993). What socialization processes contribute to their development?

2.4.2 Social learning

In Section 2.2.1, it was noted that a cheating strategy is predicted to develop when a male (especially) is competitively disadvantaged, and that criminal behavior (especially in males) is clearly related to factors associated with disadvantage. 
These are: large numbers of siblings, low socio-economic status, urban residency, low intelligence, and poor social skills. How, in a proximate sense, do these variables contribute to the development of secondary sociopathy? Path models suggest a two-stage process involving a variety of cumulative risk factors (McGarvey et al 1981, Snyder, Dishion & Patterson 1986, Snyder & Patterson 1990, Patterson, Capaldi & Bank 1991, Dishion et al 1991, Loeber 1993, Simons 1993, Tremblay 1993, Moffitt 1993) (9). In the first stage, disrupted family life, associated with parental neglect, abuse, inconsistent discipline, and the use of punishment as opposed to rewards, is critical (Feldman 1977, Wilson & Herrnstein 1985, Snyder et al 1986, McCord 1986, Patterson et al 1989, Luntz & Widom 1993, Conger 1993, Simons 1993). Poor parenting provides the child with inconsistent feedback and poor models of prosocial behavior, handicapping the child in the development of appropriate social, emotional, and problem-solving skills. This pattern is found most frequently in parents who are themselves criminal, mentally disturbed, undereducated, of low intelligence, or socioeconomically deprived (McGarvey et al 1981, McCord 1986, Farrington 1986), leading to a cross-generational cycle of increasing family dysfunction (e.g. Jaffe, Suderman & Reitzel 1992, Luntz & Widom 1993). In the second stage, children with poor social skills find themselves at a disadvantage in interactions with age-mates; rejected by the popular children, they consort with one another (Loeber & Dishion 1983, Snyder et al 1986, Kandel et al 1988, Hartup 1989, Patterson et al 1989, and Dishion et al 1991). In these socially unskilled peer groups, which will also include primary sociopathic (or unsocialized conduct disorder) children, delinquent, antisocial behavior is reinforced and new (antisocial) skills are learned (Maccoby 1986, Moffitt 1993). 
Antisocial behavior may then escalate in response to, or as a prerequisite for, social rewards provided by the group, or as an attempt to obtain the perceived social (and tangible) rewards which often accompany such behavior (Moffitt 1993). As the focus of the socialization process moves outside the home, parental monitoring becomes more important (Snyder et al 1986, Snyder & Patterson 1990, Forgatch 1991, Dishion et al 1991, Conger 1993, Forgatch, Stoolmiller & Patterson 1993, Simons 1993), as does the availability of prosocial alternatives for the socially unskilled adolescent (Farrington 1986, Apter 1992, Moffitt 1993). The development of secondary sociopathy appears to depend much more upon environmental contributions than does primary sociopathy. Since it is secondary sociopathy which, presumably, has increased so rapidly, so recently in our culture, what can social psychologists contribute to our understanding of the sociocultural factors involved in its development?

2.5 Social Psychology

2.5.1 Machiavellianism

First, the use of antisocial strategies is not restricted to sociopaths. The majority of people who are arrested are not sociopathic, and many people exhibit antisocial behavior that is infrequent enough or inoffensive enough to preclude arrest. Some antisocial behavior is even considered acceptable if it is expressed in socially approved circumstances. Person (1986), for example, relates entrepreneurism to psychopathy, while Christie (1970) notes that people who seek to control and manipulate others often become lawyers, psychiatrists, or behavioral scientists; Jenner (1980), too, claims that "subtle, cynical selfishness with a veneer of social skills is common among scientists" (p 128). Christie (see Christie & Geis 1970) developed a scale for measuring this subclinical variation in antisocial personality; he called it the "Machiavellianism" or "Mach" scale. 
One's Mach score is calculated by compiling answers to Likert-format queries of agreement or disagreement with statements like "Humility not only is of no service but is actually harmful," "Nature has so created men that they desire everything but are unable to attain it," and "The most important thing in life is winning". Adults who score high on the Mach scale express "a relative lack of affect in interpersonal relationships," "a lack of concern with conventional morality," "a lack of gross psychopathology," and "low ideological commitment" (Christie & Geis, pp 3-4); children who score high on Machiavellianism have lower levels of empathy than age-mates (Barnett & Thompson 1984). High Machs have an "instrumental cognitive attitude toward others" (Christie & Geis, p 277), and, because they are goal-oriented as opposed to person-oriented, they are more successful in face-to-face bargaining situations than low Machs. High Machs "are especially able communicators, regardless of the veracity of their message" (Kraut & Price 1976). In a related vein, high Machs, like sociopaths, are more resistant to confession after cheating than are low Machs, and they are rated as being more plausible liars (Christie & Geis 1970, Bradley & Klohn 1987); like sociopaths, high Machs are often referred to as "cool". According to Christie, "If Machiavellianism has any behavioral definition ...self-initiated manipulation of others should be at its core" (p 76). One can thus easily think of Machiavellianism as a low-level manifestation of sociopathy. It even shows a sex difference consistent with the two-threshold model (Christie & Geis 1970), an age pattern consistent with age variation in testosterone levels (Christie & Geis 1970), significant positive correlations with Eysenck's psychoticism and neuroticism scales (Allsopp, Eysenck & Eysenck 1991), and a correlation with serotonin levels (Madsen 1985). 
In one study, Geis & Levy (1970) found that high Machs (who were thought to use an "impersonal, cognitive, rational, cool" approach with others), were much more accurate than low Machs (who were thought to use a "more personal, empathizing" approach), at assessing how other "target" individuals answered a Machiavellian attitudes questionnaire. Even more interesting is the result (from the same study) that the high Machs achieved their accuracy by using a nomothetic or actuarial strategy: they guessed that everyone was at about the average level, without discriminating between individuals based on differences they had had an opportunity to observe during a previous experimental session. In addition, their errors tended to be random, which would fit with reports by Eliasz & Reykowski (1986) and Damasio, Tranel & Damasio (1990) that hypoaroused and antisocial individuals are less attentive to social and emotional cues than others. Low Machs, on the other hand, used an idiographic approach, and although they successfully differentiated between high scorers and low scorers, they grossly underestimated the scores of both, guessing at a level that was more reflective of their own scores than those of the population at large. This study suggests two things: (1) that basing one's playing strategies on an "impersonal, cognitive, rational, cool" approach to others might be more accurate in the long run than using a "personal, empathizing" approach (at least in those situations where cooperative long-term partnerships are not possible); and (2) the errors made by those who use the personal, empathizing approach, are of the kind more likely to result in playing the cooperation strategy when the cheating strategy would be more appropriate (rather than vice versa). 
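The contrast between the nomothetic (guess-the-average) and idiographic (anchor-on-oneself) strategies can be simulated in a few lines. The distributions, anchor score, and 0.5 weighting below are all invented; the sketch only illustrates why mean-guessing can produce lower average error even while ignoring individual differences, as in the Geis & Levy result.

```python
import random

# Toy simulation of the two prediction strategies described above. All
# parameters (population mean, SD, the low Mach's own score, the 0.5
# discrimination weight) are invented for illustration.
random.seed(0)

POP_MEAN = 100.0
targets = [random.gauss(POP_MEAN, 15) for _ in range(1000)]  # true Mach scores

# Nomothetic/actuarial strategy: guess the population average for everyone.
mse_nomothetic = sum((POP_MEAN - t) ** 2 for t in targets) / len(targets)

# Idiographic strategy anchored on the predictor's own low score: it
# discriminates between individuals but systematically underestimates.
OWN_SCORE = 70.0
mse_idiographic = sum(
    ((OWN_SCORE + 0.5 * (t - POP_MEAN)) - t) ** 2 for t in targets
) / len(targets)

assert mse_nomothetic < mse_idiographic  # mean-guessing wins on average error
```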
Thus, the personal, empathizing approach is likely to make one susceptible to being exploited by others who use the impersonal cognitive approach; indeed, high Machs outcompete low Machs in most experimental competitive situations (Terhune 1970, Christie & Geis 1970). As I have argued elsewhere (Mealey 1992), the common assumption that an empathy-based approach to predicting the behavior of others is better than a statistical approach is not necessarily correct; this belief may itself be an emotion-based cognitive bias. To have such a bias may be beneficial, however, for the same reason that emotional commitment biases are beneficial: in situations where voluntary, long-term coalitions can be formed, the personal, empathizing (and idealistic) low Machs might outperform the more impersonal, cognitive (and realistic) high Machs, since low Machs would be more successful than high Machs in selecting a cooperator as a partner. Although two studies (Hare & Craigen 1974 and Widom 1976a) report on the strategy of sociopaths in Prisoner's Dilemma-type settings, in both studies the sociopaths were paired with one another; thus, we do not have a measure of the strategy sociopaths use against partners of their own choosing or in situations with random, rotating partners (10). I would predict that in such settings, sociopaths (like Geis & Levy's high Mach subjects) would be less proficient than others in distinguishing between high and low Mach partners, and would thus be at a disadvantage in iterated games with a chosen partner; on the other hand (again like high Mach subjects), they should perform at better than average levels when playing with randomly assigned, rotating partners. Widom (1976b) found that when asked to guess how "people in general" would feel about different social situations, sociopaths guessed that others would feel essentially the same way that they do, whereas control subjects guessed that others would feel differently. 
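For readers unfamiliar with the game invoked here, a minimal iterated Prisoner's Dilemma can be sketched as follows. The payoff values and the two strategies are textbook conventions, not the protocols of the Hare & Craigen or Widom studies.

```python
# Minimal iterated Prisoner's Dilemma. Payoffs follow the conventional
# T > R > P > S ordering; the strategies are textbook illustrations,
# not claims about the cited studies.

PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def always_defect(partner_history):
    return 'D'  # the unconditional "cheating" strategy

def tit_for_tat(partner_history):
    # Cooperate first, then mirror the partner's previous move.
    return partner_history[-1] if partner_history else 'C'

def play(strategy_a, strategy_b, rounds=10):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a = strategy_a(hist_b)  # each player sees only the partner's history
        b = strategy_b(hist_a)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(always_defect, tit_for_tat))  # (14, 9): one exploitation, then mutual punishment
print(play(tit_for_tat, tit_for_tat))    # (30, 30): sustained mutual cooperation
```

The defector gains over any single exchange but forfeits the larger cumulative payoff of sustained cooperation, which is the frequency-dependent tension underlying the cheater-strategy argument of this paper.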
As in the Geis & Levy study, both groups were wrong, but in different ways: the sociopaths underestimated their differences from others, while the control subjects substantially overestimated their differences from others, suggesting that sociopaths (like high Machs) were using a nomothetic approach to prediction, while controls (like low Machs) were using an idiographic approach. Machiavellianism and the related propensity to use others in social encounters have generally been looked upon as a trait. An alternative perspective, however, acknowledges both the underlying variation in personality and the situational factors that are relevant to an individual's behavior at any given moment (e.g. Barber 1992). In line with mechanism 5 for maintaining ESSs (presented in Section 1.2), Terhune (1970) says "actors bring to the situation propensities to act in a certain general way, and within the situation their propensities interact with situational characteristics to determine their specific behavior" (p 229) (11). This brings us to the last question: Beyond the constitutional and environmental variables that contribute to the development of individual differences in personality and antisocial behavior, what can social psychology tell us about the within-individual situational factors which encourage or discourage cheating strategies, and how can these be explained?

2.5.2 The role of mood

Although mood and emotion are not identical concepts, they are clearly related (12). Mood might be thought of as a relative of emotion which clearly varies within individuals but is perhaps less an immediate response to concrete events and stimuli and more a generalized, short- to mid-term response to the environment. As such, the role of mood must be addressed by any model that relies so heavily on the concepts of emotion, emotionality, and emotionlessness as determinants of behavior. 
Positive mood and feelings of success have been demonstrated to enhance cooperative behavior (Mussen & Eisenberg-Berg 1977, Cialdini, Kenrick & Baumann 1982, Farrington 1982). If, as Nesse (1991) has argued, positive mood is a reflection not only of past success, but also of anticipation of future success, the facilitation of cooperation by positive mood could be seen as part of a long-term strategy by individuals who feel they can afford to pass up possible short-term gains for the sake of establishing a cooperative reputation. Sad affect and feelings of failure can also affect strategy in social interactions. To the extent that sadness and feelings of failure follow losses of various sorts, individuals in these circumstances should be expected to be egoistic and selfish. In children, this is typically what is found (Mussen & Eisenberg-Berg 1977, Baumann, Cialdini & Kenrick 1981). In some children, and more consistently in adults, on the other hand, sadness and feelings of failure can facilitate prosocial behavior. Mussen & Eisenberg-Berg (1977) suggest that this is a result of a deliberate effort to enhance one's (diminished) reputation among others; Baumann et al (1981) and Cialdini et al (1982) suggest that it is a result of a deliberate effort to relieve negative affect, based on prior experience that prosocial behavior often has a positive, self-gratifying effect. If sadness is profound, i.e., one is depressed and experiencing the cognitive biases and selective attention associated with depression (Nesse 1991, Sloman 1992, Mineka & Sutton 1992), one would be expected to desist from all social interaction, being neither antisocial nor prosocial, but asocial (Nesse 1991, Sloman 1992). In this view, the lethargy and anhedonia associated with depression could be considered to be facultative lapses in the emotions or moods which typically motivate a person toward social interaction. 
Hostility can also lead to cognitive biases and selective attention to relevant social stimuli. Dodge and Newman (1981) showed that aggressiveness in boys is associated with the over-attribution of hostile intent to others. The authors concluded that such attributions lead to increased "retaliatory" aggression by the hostile individuals, fueling a cycle of true hostility and retaliation by all parties. It is also abundantly clear that anger and hostility, once expressed, do not lead to catharsis, but to amplified feelings and outward expressions of that anger (Tavris 1982). Guilt, which often follows selfish behavior, typically results in an increase in subsequent prosocial behavior (Hoffman 1982, Cialdini et al 1982); Hoffman calls this "reparative altruism". Guilt can easily be seen as one of Hirshleifer's (1987) or Frank's (1988) emotional commitment devices, compelling one to perform prosocial behavior as a means of reestablishing one's tarnished reputation. Interestingly, Cialdini et al (1982) also report that prosocial behavior increases after observing another's transgression. They explain this phenomenon within the context of what they call the "Negative Relief" model: prosocial behavior is performed as a means of alleviating negative feelings in general (including direct or vicarious guilt, sympathy, distress, anxiety, or depression). Like Hoffman's, this model postulates that the reinforcing power of (relief provided by) prosocial behavior is learned during childhood. Since guilt, anxiety and sympathy are social emotions that primary sociopaths rarely, if ever, experience, there is no reason to expect that they might moderate their behavior so as to avoid them. On the other hand, there is no reason to expect that sociopaths don't experience fluctuations in mood (such as depression, optimism, or anger) in response to their changing evaluation of their prospects of success and failure. 
To the extent that we can manipulate the sociopath's mood, therefore, we might be able to influence his behavior.

2.5.3 Cultural variables

Competition, in addition to being one of the most important variables in determining long-term life strategy choices, is also one of the more important situational variables influencing the choice of immediate strategy. Competition increases the use of antisocial and Machiavellian strategies (Christie & Geis 1970) and can counteract the increase in prosocial behavior that generally results from feelings of success (Mussen & Eisenberg-Berg 1977). Some cultures encourage competitiveness more than others (Mussen & Eisenberg-Berg 1977, Shweder, Mahapatra & Miller 1987) and these differences in social values vary both temporally and crossculturally. Across both dimensions, high levels of competitiveness are associated with high crime rates (Wilson & Herrnstein 1985, see also Farley 1986) and Machiavellianism (Christie & Geis 1970). High population density, an indirect form of competition, is also associated with reduced prosocial behavior (Farrington 1982) and increased antisocial behavior (Wilson & Herrnstein 1985, Ellis 1988, Robins, Tipp & Przybeck 1991, U.S. Department of Justice 1992)- especially in males (Wachs 1992; see Section 3.2.1 and references therein for ultimate, game-theoretic explanations of why this might occur; see Draper 1978, Siegel 1986, Gold 1987, Foster 1991, and Wilson & Daly 1993 for a variety of proximate explanations). Fry (1988) reports large differences in the frequency of prosocial and antisocial behaviors in two Zapotec settlements equated for a variety of socio-ecological variables; the one major difference- thought possibly to be causal- was in land holdings per capita, with the higher levels of aggression found in the community with the smaller per capita land holdings. Last, but not least, is the relatedness or similarity of the actors/strategists to their partners in an interaction. 
Based on models of kin selection and inclusive fitness, individuals should be more cooperative and less deceptive when interacting with relatives who share their genes, or relatives who share investment in common descendants. Segal (1991) reported that identical twins cooperated more than fraternal twins when playing the Prisoner's Dilemma. Barber (1992) reported that responses on an altruism questionnaire were more altruistic when the questions were phrased so as to refer to relatives (as opposed to "people" in general), and that Machiavellian responses were thereby reduced. Rushton (Rushton, Russell and Wells 1984, Rushton 1989) presents evidence that people also cooperate more with others who are similar to them even though not genetically related. There are a variety of plausible evolutionary explanations for this behavior (see Pulliam 1982, Mealey 1984, and BBS commentary on Rushton 1989).

3. Integration, Implications, and Conclusions

3.1 Integration: Sociopathy as an ESS leads to two types of sociopaths

3.1.1 Primary sociopathy

I have thus far argued that some individuals seem to have a genotype that disposes them "to acquire and be reinforced for displaying antisociality" (Rowe 1990a, p 122). That genotype results in a certain inborn temperament or personality, coupled with a particular pattern of autonomic hypoarousal, that together design the child to be selectively unresponsive to the cues necessary for normal socialization and moral development. This scenario is descriptive of mechanism 1 (Section 1.2) for maintaining ESSs in the population; it describes the existence of frequency-dependent, genetically based individual differences in the employment of life history strategies. 
I suggest, accordingly, that there will always be a small, cross-culturally similar, and unchanging baseline frequency of sociopaths: a certain percentage of sociopaths - those individuals to whom I have referred as primary sociopaths - will always appear in every culture, no matter what the socio-cultural conditions. Those individuals will display chronic, pathologically emotionless antisocial behavior throughout most of their lifespan and across a variety of situations, a phenotype which is recognized (according to Robins, Tipp & Przybeck 1991) "by every society, no matter what its economic system, and in all eras" (13). Since it is a genetically determined strategy, primary sociopaths should be equally likely to come from all kinds of socio-economic backgrounds; on the other hand, since they constitute that small group of individuals whose physiotype makes them essentially impervious to the social environment, almost all sociopaths from the upper classes will be primary sociopaths (14). Of course, because they are not intellectually handicapped, these individuals will progress normally in terms of cognitive development and will acquire a theory of mind. Theirs, however, will be formulated purely in instrumental terms, without access to the empathic understanding that most of us rely on so much of the time. They may become excellent predictors of others' behavior, unhandicapped by the vagaries and "intrusiveness" of emotion, acting, as do professional gamblers, solely on nomothetic laws and actuarial data rather than on hunches and feelings. In determining how to "play" in the social encounters of everyday life, they will use a pure cost-benefit approach based on immediate personal outcomes, with no "accounting" for the emotional reactions of the others with whom they are dealing.
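That decision rule can be caricatured in a few lines of code (the payoff and "guilt" numbers are invented purely for illustration): the instrumental actor maximizes immediate material payoff alone, while a typical actor's utility also carries emotional terms.

```python
# Sketch of the decision rule described above: a purely instrumental
# actor weighs only immediate material payoffs, while a typical actor's
# utility also carries emotional costs (guilt, sympathy).  All numbers
# are hypothetical.

def choose(actions, utility):
    """Pick the action with the highest utility."""
    return max(actions, key=utility)

# Immediate material payoffs of two actions in a one-shot exchange.
material = {"cooperate": 3, "defect": 5}

# Emotional cost most people attach to defection; for the primary
# sociopath this weight is effectively zero.
guilt = {"cooperate": 0, "defect": 4}

instrumental = choose(material, lambda a: material[a])
typical = choose(material, lambda a: material[a] - guilt[a])

print(instrumental, typical)  # same material payoffs, different choices
```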
Without love to "commit" them to cooperation, anxiety to prevent "defection", or guilt to inspire repentance, they will remain free to continually play for the short-term benefit in the Prisoner's Dilemma.

3.1.2 Secondary sociopathy

At the same time, because changes in gene frequencies in the population would not be able to keep pace with the fast-changing parameters of social interactions, an additional, fluctuating proportion of sociopathy should be a result of mechanism 5 for maintaining ESSs, which allows for more flexibility in the ability of the population to track the frequency-dependent nature of the success of the cheating strategy. Mechanism 5 (genetically based individual differences in response to the environment, resulting in differential use by individuals of environmentally contingent strategies) would explain the development and distribution of what I have referred to as secondary sociopathy. Secondary sociopathy is expressed by individuals who are not extreme on the genetic sociopathy spectrum, but who, because of exposure to environmental risk factors, pursue a life history strategy that involves frequent, but not necessarily emotionless, cheating. Unlike primary sociopaths, secondary sociopaths will not necessarily exhibit chronic antisocial behavior, because their strategy choices will be more closely tied to age, fluctuation in hormone levels, their competitive status within their referent group, and changing environmental contingencies. Since secondary sociopathy is more closely tied to environmental factors than to genetic factors, secondary sociopaths will almost always come from lower-class backgrounds, and their numbers could vary substantially across cultures and time, tracking environmental conditions favoring or disfavoring the use of cheating strategies.
The existence of this second etiological pathway to sociopathy explains the fact that cultural differences are correlated with differences in the overall incidence of antisocial behavior (Wilson & Herrnstein 1985, Farley 1986, Gold 1987, Ellis 1988, Robins, Tipp & Przybeck 1991). It also explains why, as the overall incidence of sociopathy increases, the discrepancy in the ratio of male to female sociopaths decreases (Robins, Tipp & Przybeck 1991): since secondary sociopathy is less heritable than primary sociopathy (according to this model), the effect of sex-limited genes (like that of all the genes contributing to the spectrum) should be less important for the development of secondary sociopathy, resulting in less of a sex difference. Based on this model, I would predict that, unlike what we find for primary sociopathy (see Section 2.1.3), we should find no differential heritability between the sexes for secondary sociopathy (even though there will still be a sex difference in prevalence).

3.2 Implications of the two-pathways model

Terhune (1970) suggests that choice of strategy in experimental game situations (and, presumably, real-life settings as well) depends upon two things: (1) cognitive expectations regarding others (i.e., a theory of mind), and (2) motivational/emotional elements such as hopes and fears. Since primary sociopaths have a deficit in the realm of emotional motivation, they presumably act primarily upon their cognitive expectations of others; to the extent that they do act upon emotions, it is most likely to be upon mood and the primary emotions (like anger and fear) rather than upon the social and secondary emotions (like love and anxiety).
Thus, the extent to which a society will be able to diminish the antisocial behavior of primary sociopaths will depend upon two things: (1) its influence on the sociopath's cognitive evaluation of its own reputation as a player in the Prisoner's Dilemma, and (2) the primary emotion- or mood-inducing capacity of the stimuli it utilizes in establishing the costs and benefits of prosocial versus antisocial behavior. Manipulations of these two variables will also influence the numbers of secondary sociopaths by changing the size of the adaptive niche associated with antisocial behavior. In addition, since the development of secondary sociopathy is more influenced by the social environment than is the development of primary sociopathy, and since secondary sociopaths are not devoid of social emotions, changing patterns in the nurturing and socialization of children and in the socialization and rehabilitation of delinquents and adult criminals is an additional, viable possibility for reducing the overall prevalence of antisocial behavior.

3.2.1 Minimizing the impact of primary sociopaths: society as a player in the Prisoner's Dilemma

Sociopaths' immediate decisions are based partly on their ability to form a theory of mind, and to use those expectations of others' behavior in a cost-benefit analysis to assess what actions are likely to be in their own self-interest. (This is true for both primary and secondary sociopaths.) The outcome of such analyses is therefore partially dependent on the sociopath's expectations of the behavior of other players in the game. I would argue that an entire society can be seen as a player, and that the past behavior of that society will be used by the sociopath in forming the equivalent of a theory of mind, to predict the future behavior of that society.
Like an individual player, a society will have a certain probability of detecting deception, a more-or-less accurate memory of who has cheated in the past, and a certain proclivity to retaliate or not, based upon a cheater's past reputation and current behavior. Since the sociopath is using a rational and actuarial approach to assess the costs and benefits of different behaviors, it is the actual past behavior of the society which will go into his calculations, rather than risk assessments inflated by the exaggerated fears or anxieties that most people feel in anticipation of being caught or punished. Thus, to reduce antisocial behavior, a society must establish and enforce a reputation for high rates of detection of deception and identification of cheaters, and a willingness to retaliate. In other words, it must establish a successful strategy of deterrence. Game theory models by Axelrod and others have shown that the emergence, frequency, and stability of social cooperation are subject to an abundance of potential deterrent factors (Axelrod & Hamilton 1981, Axelrod 1984, Feldman & Thomas 1987, Axelrod & Dion 1988, Heckathorn 1988, Hirshleifer & Coll 1988, Boyd 1988, Dugatkin & Wilson 1991, Boyd & Richerson 1992, Nowak & Sigmund 1993, and Vila & Cohen 1993). Among these are: group size (as it decreases, cooperation increases); nonrandom association of individuals within the population (as it increases, cooperation increases); the probability of error in memory or recognition of an individual (as it decreases, cooperation increases); the effect of a loss on a cooperator (as it decreases, cooperation increases); the effect of a gain on a defector (as it decreases, cooperation increases); the frequency of punishment against defectors (as it increases, cooperation increases); the cost of punishment for the punished (as it increases, cooperation increases); and the cost of punishment for the punishers (as it decreases, cooperation increases) (15).
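Several of the deterrent factors listed above work by reducing the expected value of defection. A minimal expected-value sketch, with invented numbers, shows how detection probability and punishment cost interact:

```python
# Expected-value sketch of the deterrence logic described above
# (all numbers are hypothetical): defection pays only while the
# expected punishment stays below the defector's gain.

def expected_defection_payoff(gain, p_detect, punishment):
    """Gain from cheating minus the expected cost of being caught."""
    return gain - p_detect * punishment

gain = 2.0        # defector's gain relative to mutual cooperation
punishment = 5.0  # cost imposed on a detected cheater

for p_detect in (0.1, 0.4, 0.8):
    net = expected_defection_payoff(gain, p_detect, punishment)
    print(p_detect, net, "cheat" if net > 0 else "cooperate")
```

Raising either the detection probability or the punishment cost pushes the net payoff of defection below zero, which is the formal content of "establishing a reputation for deterrence."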
Recent game-theoretic models are coming closer and closer to the complexity of real-world, human social interactions on a large scale by examining the role of culture and technology in: expanding society's collective memory of individual players' past behavior; broadcasting the costs and benefits of cooperation and defection; and the development and application of new socialization, deception-detection, and punishment techniques (see esp. Hirshleifer & Rasmusen 1989, Machalek & Cohen 1991, Dugatkin 1992). These models begin to provide useful strategies for the real-world prediction and reduction of cheating strategies and antisocial behavior. (See also Feldman 1977, Farrington 1979, Bartol 1984, Wilson & Herrnstein 1985, Axelrod 1986, Eysenck & Gudjonsson 1989, Ellis 1990a, and Machalek & Cohen 1991 for some nonquantitative models and tests which incorporate some of these variables in their explanation of the socialization, punishment, and deterrence of crime.) Since neither secondary nor primary sociopaths have a deficit in the ability to perform accurate cost-benefit analyses, increasing the probabilities of criminal detection, identification, and punishment can also reduce crime; a society must therefore establish a reputation for willingness to retaliate. [The National Research Council (1993) reports that a 50% increase in the probability of incarceration for any single crime reduces subsequent crime twice as much as does doubling incarceration duration (p 294).] Harsher penalties can also be deterring, but only if they are reliably meted out. Another key is making the costs of cheating salient. Generally speaking, antisocial and uncooperative behaviors increase as the costs become more diffuse or removed in time, and prosocial and cooperative behaviors decrease as the benefits become more diffuse or removed in time (Bartol 1984, Ostrom 1990, Low 1993).
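The effect of costs "removed in time" can be illustrated with the standard hyperbolic discounting formula, V / (1 + k * delay). The discount rates below are hypothetical, chosen only to contrast a steep discounter with a more typical one: for the steep discounter, even a large delayed penalty shrinks toward irrelevance.

```python
# Hyperbolic discounting sketch (illustrative parameters only):
# a steep discounter barely feels a punishment that arrives months later.

def discounted(value, delay_days, k):
    """Subjective present value under hyperbolic discounting: V / (1 + k * delay)."""
    return value / (1.0 + k * delay_days)

penalty = 100.0
steep = 0.5    # strong present-bias, of the kind attributed to sensation-seekers
shallow = 0.01 # a more typical discount rate

for delay in (0, 7, 90):
    print(delay,
          round(discounted(penalty, delay, steep), 1),
          round(discounted(penalty, delay, shallow), 1))
```

An immediate penalty is felt at full strength by both; a 90-day-delayed penalty retains most of its force for the shallow discounter but almost none for the steep one.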
For primary sociopaths this is even more so, since their sensation-seeking physiotype makes them particularly unable to make decisions based on nonimmediate consequences. Although able to focus attention on interesting tasks for short periods, the sociopath cannot perform well under conditions of delayed gratification (Pulkkinen 1986) and is more motivated by the avoidance of immediate costs than by threats of future punishment (Christie & Geis 1970, McCord 1983, Raine 1988, 1989, Forth & Hare 1989). Costs associated with social retaliation must therefore not only be predictable, but be swift - and the swiftness itself must also be predictable. Another factor the sociopath will use to "compute" the potential value of an antisocial action is the cost-benefit ratio of the alternatives (Piliavin, Thornton, Gartner & Matsueda 1986). For the sociopath, money and other immediate tangible rewards are more motivating than social reinforcers (such as praise) or promises of future payoff, and visual stimuli are more salient than auditory stimuli (Chesno & Kilmann 1975, Raine & Venables 1987, Forth & Hare 1989, Raine 1989, Raine et al 1990b, Zuckerman 1990). Thus, alternatives to crime must be stimulating enough and rewarding enough to preferentially engage the chronically hypoaroused sensation-seeker. This will be a difficult task to achieve, but it will be more successful if we can effectively distinguish primary from secondary sociopaths. Primary sociopaths, with their emotional but not intellectual deficit, will be competent on some tasks on which secondary sociopaths, with deficits in social skills, emotion regulation, and problem solving, will not. Possibilities might include: novelist, screenplay writer, stunt man, talk show host, disk jockey, explorer, race car driver, or skydiving exhibitionist.
Given that primary sociopaths will always be with us in low numbers, it would be a wise social investment to create - even on an individual basis, if necessary - a number of exciting, high-payoff alternatives for them, in order to minimize the number who may otherwise cause pain and destruction. Distinguishing between primary and secondary sociopaths is also critical for decisions about confinement and rehabilitation. Quinsey & Walker (1992) cite examples where recidivism rates went up for psychopaths, but down for nonpsychopaths, after they were exposed to the same kind of "treatment". Recidivism is much greater in primary sociopaths than in secondary sociopaths (Hare, Forth & Strachan 1992), and sometimes the only response is prolonged incapacitation (until they literally "grow out of it"). A recent international meeting of experts concluded that "treatment" programs dealing with primary sociopaths should be "less concerned with attempts to develop empathy or conscience than with intensive efforts to convince them that their current attitudes and behavior (simply) are not in their own self-interest" (Hare 1993, p 204).

3.2.2 Minimizing the prevalence of secondary sociopathy: society as a socializing agent and mood setter

Given that secondary sociopaths have a different life history and are more responsive to environmental influences than primary sociopaths, social changes can be designed to minimize not only their impact, but their incidence. Loeber (1990) argues that each generation in our society is being raised with an increasing number of environmental risk factors, leading to increasing generation-wide deficits in impulse control. He makes specific suggestions to screen for high-risk children and institute early intervention, noting that different interventions are likely to be more or less effective given different risk factors in the child's or adolescent's life history. (See also U.S. Department of Justice 1993.)
One possible intervention is parent training (see Magid & McKelvey 1987 and Dumas, Blechman & Prinz 1992 for reviews and programmatic suggestions). Laboratory experiments show that antisocial behaviors can be reduced and prosocial behaviors reinforced by appropriate use of modelling, induction, and behavioral modification techniques (Feldman 1977, Mussen & Eisenberg-Berg 1977, Grusec 1982, Rushton 1982, Gelfand & Hartmann 1982, Radke-Yarrow & Zahn-Waxler 1986, Kochanska 1992 & 1993). Recent longitudinal studies in natural settings suggest that the positive effects of good parenting, especially parental warmth and predictability, may be long-lasting (McCord 1986, Kochanska 1992 & 1993, Kochanska & Murray 1992). The cause-and-effect relationship between parental behavior and child behavior, however, is not likely to be one-way. Children of different genders, temperaments, and even social classes respond differentially to different socialization techniques (Dienstbier 1984, Radke-Yarrow & Zahn-Waxler 1986, Lytton 1990, Kochanska 1991 & 1993, Kochanska & Murray 1992, McCord 1993), and to some extent, difficult children elicit poor parenting (Buss 1981, Lee & Bates 1985, Bell & Chapman 1986, Snyder & Patterson 1987, Lytton 1990, Eron, Huesmann & Zelli 1991). It is easy for parents of difficult children to lose heart, and in so doing, become even less effective (Patterson 1992). For example, studies cited in Landy & Peters (1992) found that mothers of aggressive children, like other mothers with a low sense of personal power, tend to give weak, ineffectual commands to their children. This lack of "goodness of fit" between parental style and the needs of the child is probably an important factor in the exacerbation of conduct disorder (Lee & Bates 1985, Landy & Peters 1992, Wachs 1992, Moffitt 1993).
Parents need help in identifying high-risk children, and they need instruction in how to take a practical, assertive approach with them (see Magid & McKelvey 1987, Garmezy 1991), while using a more inductive, empathic approach with their other children (see Kochanska 1991, Kochanska & Murray 1992, and Kochanska 1993). Social workers, health care providers, and employees of the criminal justice system also need to be able to distinguish between children with different risk factors and life histories and to respond accordingly. Palmer (1983) argues that agents should be individually matched with each client/offender based on style and personality characteristics, to prevent high-Mach and sociopathic offenders from taking advantage of low-Mach employees. At a broader level, many sociocultural aspects of modern society seem to contribute to antisocial behaviors and attitudes (National Research Council 1993, Moffitt 1993). As a society gets larger and more competitive, both theoretical models (Section 3.2.1) and empirical research (Section 2.4.2) show that individuals become more anonymous and more Machiavellian, leading to reductions in altruism and increases in crime. Social stratification and segregation can also lead to feelings of inferiority, pessimism, and depression among the less privileged, which can in turn promote the use of alternative competitive strategies, including antisocial behavior (Sanchez 1986, Magid & McKelvey 1987, Wilson & Daly 1993). Crime may be one response to the acquisition of an external locus of control (Raine, Roger & Venables 1982) or learned helplessness. Learned helplessness and other forms of depression have been associated with reduced levels of serotonin (Traskman, Asberg, Bertilsson & Sjostrand 1981); since reduced levels of serotonin have also been shown to be related to increased aggression, it is likely that physiological changes mediate these psychological and behavioral changes.
The neurochemical pathway involved in learned helplessness (identified by Petty & Arnold 1982) appears to be the same one that Gray (1982, 1987) and Cloninger (1987a) associate with behavioral inhibition/harm avoidance, and that Charney et al (1990) associate with anxiety-mediated inhibition. Crime may also function to obtain desirable resources, increase an individual's status in a local referent group, or provide the stimulation that the more privileged find in more socially acceptable physical and intellectual challenges (e.g., Farrington 1986, Farley 1986, Lyng 1990, Apter 1992, Moffitt 1993). According to Apter, "the vandal is a failed creative artist," a bored and frustrated sensation-seeker who "does not have the intellectual or other skills and capacities to amuse or occupy himself" (1992, p 198). Thus, in addition to making the costs of antisocial behavior greater, strong arguments can be made for providing early social support for those at risk, and for developing alternative, nonexploitative, sensation-seeking ventures that can meet the psychological needs of disadvantaged and low-skill individuals.

3.3 Conclusions

A review of the literature in several areas supports the concept of two pathways to sociopathy: (1) "Primary sociopaths" are individuals of a certain genotype, physiotype, and personality who are incapable of experiencing the secondary, "social" emotions that normally contribute to behavioral motivation and inhibition; they fill the ecological niche described by game theorists as the "cheater strategy" and, as the result of frequency-dependent selection, will be found at low frequency in every society. (1b) To minimize the damage caused by primary sociopaths, the appropriate social response is to modify the criminal justice system in ways that obviously reduce the benefits and increase the costs of antisocial behavior, while simultaneously creating alternatives to crime which could satisfy the psychophysiological arousal needs of the sociopath.
(2) "Secondary sociopaths" are individuals who use an environmentally contingent, facultative cheating strategy not as clearly tied to genotype; this strategy develops in response to social and environmental conditions related to disadvantage in social competition and will thus covary (across cultures, generations, and even within an individual lifetime) with variation in immediate social circumstances. (2b) To reduce the frequency of secondary sociopathy, the appropriate social response is to implement programs which reduce social stratification, anonymity, and competition; intervene in high-risk settings with specialized parent education and support; and increase the availability of rewarding, prosocial opportunities for at-risk youth. Since the genetics and life histories of primary and secondary sociopaths are so different, successful intervention will require differential treatment of different cases; we thus need to encourage the widespread adoption of common terminology and diagnostic criteria.

FOOTNOTES

1. Plutchik's eight primary emotions are: anger, fear, sadness, disgust, surprise, joy, acceptance, and anticipation. Others posit a few more (Izard 1977, 1991) or fewer (Ekman 1971, Panksepp 1982), but what is basically agreed is that primary emotions are those which can be found in other mammals, are hard-wired in the brain, are reflexively produced in response to certain stimuli, are associated with certain, sometimes species-specific, physiological responses (e.g., piloerection, changes in heart rate, facial expressions), and, in humans, are found cross-culturally and at an early age. (See Ortony & Turner 1990 for a dissenting opinion.) Note that the "social emotions", including love, guilt, shame, and remorse, do not meet the above criteria, and are not considered to be primary emotions by most authors (see Izard 1991 for another perspective).
Although distinctly human, the social emotions seem to involve a critical element of learning, and, central to the argument I will be making, are not panhuman.

2. According to Plutchik, cognitive processes themselves evolved "in the service of emotions... in order to make the evaluations of stimulus events more correct and the predictions more precise so that the emotional behavior that finally resulted would be adaptively related to the stimulus event" (p 303). This model of the relationship between emotion and cognition is somewhat similar to Bigelow's (1972), which postulates that intelligence evolved as a result of the need to control the emotions (especially the aggressive emotions) in the service of sociality, and Humphrey's (1976, 1983), which claims that self-awareness evolved because it was a successful tool for predicting the behavior of others.

3. See Draper (1978) and the 1986 special issue of Ethology and Sociobiology (vol. 7 #3/4) on ostracism for further discussion of the role of shunning with specific reference to human societies; see Hirshleifer & Rasmusen (1989) for a game-theoretic model of shunning; and see Nathanson (1992) for the importance of the social emotion, shame.

4. The wealth of literature on strategies that people use to detect deception in interpersonal interactions, as well as the technologies that have been developed in order to further enhance that ability in less-personal social interactions, is an indicator of the importance we bestow on such ability. (See Zuckerman, DePaulo & Rosenthal 1981, Mitchell & Thompson 1986, and especially Ekman 1992.)

5. Although the data are overwhelming, the particular articles cited in this section should not be considered to be independent reports, since most of the reviews cited overlap substantially in their coverage, and many authors or teams report their findings more than once in a series of updates.
While interested readers should direct themselves to the most recent publications, older publications do contain some information not presented in the updates, and thus are included for thoroughness and ease of reference.

6. Twin study methods yield estimates of what is termed "broad heritability", which includes both "additive" genetic factors (i.e., the summed effect of individual genes on the phenotype) and "non-additive" genetic factors (i.e., the phenotypic effects of dominance interactions between homologous alleles on paired chromosomes, and the epistatic interactions between non-homologous genes throughout the genome). Adoption study methods, on the other hand, yield estimates of what is termed "narrow heritability", which is only the additive genetic component. The additive component is that which can be selected for (or against) as it is transmitted from generation to generation, while the nonadditive effect is unique to each individual genotype and is broken and reshuffled with every episode of sexual recombination. Because of this difference, twin studies typically yield higher heritability estimates than adoption studies. Another difference between the twin methodology and the adoption methodology is that twin studies generally provide heritability coefficients which estimate the proportion of the total explained variance accounted for by genetic factors, whereas adoption studies provide heritability coefficients which estimate the proportion of the total variance (including measurement error) that is accounted for by additive genetic factors. Since measurement error is so large when assessing criminality, adoption studies tend to yield both smaller and more varied heritability estimates than do twin studies. A third difference is that twin studies yield heritability estimates for members of a particular generational cohort, usually tested at the same age, whereas adoption studies necessarily regress measures from one generation to another.
This leads to two problems in the interpretation of heritability estimates derived from adoption studies which are not germane to twin studies. The first is that heritability can change across generations - even in the absence of genetic change - due to changes in the environment; this effect cannot be assessed in either twin or adoption studies, but is only a limitation of generalizability for the former, whereas it is conflated in the latter. The second is that heritability can also be different at different ages. Huesmann et al (1984), for example, report that the correlation between children's aggression level and their parents' aggression level when measured at the same age is greater than the correlation between the child's own aggression level at one age and at a later age. This phenomenon, too, results simply in limited generalizability of heritability estimates derived from twin studies, but yields conflated estimates from adoption studies. The heritability of .6 reported herein is an estimate of broad heritability as derived directly from twin studies; similar estimates can also be calculated indirectly from adoption study data after accounting for measurement error, but cohort effects cannot be separated out. See Loehlin (1992) for a general discussion of twin and adoption methodologies, and Emde et al (1992) and Raine (1993) for further discussions of the relevance of methodological considerations as they pertain to interpretation of the specific studies summarized herein.

7. There is also evidence that at least one form of alcoholism belongs to the sociopathy spectrum: Type II alcoholism, which is also much more prevalent in men than women and seems to be transmitted in the same way (Cloninger, Christiansen, Reich & Gottesman 1978, Bohman, Sigvardsson & Cloninger 1981, Cloninger, Bohman & Sigvardsson 1981, Stabenau 1985, Cadoret 1986, Zucker & Gomberg 1986, and McGue, Pickens & Svikis 1992).
Type II alcoholism is characterized by early onset, frequent violent outbursts, EEG abnormalities, and several of the personality attributes that are often seen in sociopathy - impulsivity, extraversion, sensation-seeking, aggressiveness, and lack of concern for others (Cloninger 1987b, Tarter 1988).

8. The interesting phenomenon of differential heritability of traits across the sexes can occur, as in this case, as a result of differential (sex-limited) expression of the same genes, or, as it does with Type I alcoholism (a milder, nonviolent form), as a result of differential environmental experiences of the sexes (Cloninger, Christiansen, Reich & Gottesman 1978). Since heritability is measured as a proportion, the value of a heritability estimate can be changed by changes in either the numerator (variance in a trait due to genetic variance) or the denominator (total variance in the trait). Since the denominator (total variance) is composed of both genetic and environmental variance, changes in either will change the heritability. This method of defining heritability also explains some other apparent paradoxes, such as how two populations (e.g., racial groups or two successive generations of a single group) could have exactly the same genotypic variation with respect to a trait, but, because of differences in their environments, exhibit differential phenotypes and differential heritability of the trait.

9. Like the behavior genetic studies cited in Section 2.1, these studies provide overwhelming data, but should not be considered as independent reports, because many overlap or update earlier work. Methodologically, while path models and the longitudinal studies from which they are derived have excellent ecological validity, they are correlational, and while they improve upon cross-sectional designs by noting which factors precede others developmentally, they cannot completely sort out cause and effect - especially in the earliest stages of parent-child interactions.
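The proportion arithmetic behind footnote 8 can be made concrete (the variance figures below are invented for illustration): two populations with identical genetic variance show different heritabilities purely because their environmental variances differ. The last lines sketch the standard Falconer twin-correlation estimate of broad heritability, 2 * (r_MZ - r_DZ), of the kind underlying the twin studies discussed in footnote 6.

```python
# Heritability as a variance ratio (all numbers hypothetical).

def heritability(v_genetic, v_environment):
    """h^2 = genetic variance / total phenotypic variance."""
    return v_genetic / (v_genetic + v_environment)

v_g = 4.0                      # identical genetic variance in both groups
print(heritability(v_g, 2.0))  # homogeneous environment: higher h^2
print(heritability(v_g, 8.0))  # heterogeneous environment: lower h^2

# Falconer's broad-heritability estimate from twin correlations,
# h^2 = 2 * (r_MZ - r_DZ), with illustrative correlation values.
r_mz, r_dz = 0.60, 0.30
print(2 * (r_mz - r_dz))
```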
10. The strategy of sociopaths against one another, although not a test of the current model, is still interesting in its own right. In the Hare & Craigen (1974) modified Prisoner's Dilemma, the majority of sociopaths, on their turn, chose from amongst five "plays" that choice which minimized their own pain (an electric shock) for that trial, but which maximized their partner's. Since partners took turns in selecting from the same five "plays", this strategy actually maximized pain over the long run. The alternative, pain-minimizing strategy involved giving both oneself and one's partner a small shock - a choice that most subjects declined to use. This result seems to confirm the sociopath's inability to consider anything other than the immediate consequences of an act, as well as the ineffectiveness of delayed punishment or threat of punishment as a deterrent. In the Widom (1976a) study, sociopaths did not, in general, "defect" more often than the controls, but in the condition when subjects were informed of their partner's move on the previous trial, sociopaths were much more likely than controls to "defect" after a mutual cooperation. On this measure, at least, the sociopaths seemed to demonstrate an inability to "commit" to an ongoing cooperative relationship.

11. Terhune reports that personality is the most important factor for strategy choice within the setting of single-trial Prisoner's Dilemma interactions. In multiple-trial interactions, however, when players have the opportunity to learn one another's dispositions, situational factors are more important for determining play (see Frank, Gilovich & Regan 1993). This is consistent with the idea that the establishment of reputation is a key goal, even for players who on a single trial would choose not to cooperate.
For more on the idea that establishing a certain reputation within one's referent group is a conscious goal, and how that might play a role in the development of antisocial behavior, see Hogan & Jones (1983) and Irons (1991).

12. For some of the debate on this issue, see the series of comments and replies following Nesse (1991) in the electronic journal Psycoloquy. The comments specifically addressing the relationship between mood and emotion are: Morris (1992), Nesse (1992a), Plutchik (1992), and Nesse (1992b).

13. While searching for data to test this prediction, I came across only the Robins et al. (1991) reference in support of it, and one reference in an introductory psychology text (Wade & Tavris 1993) against it. The latter stated that antisocial personality disorder "is rare or unknown in small, tight-knit, cooperative communities, such as the Inuit, religious Hutterites, and Israelis raised on the communal plan of the kibbutz" (p. 584). Contact with Dr. Tavris allowed me to follow up on the sources from which the latter statement was derived (Eaton & Weil 1953, Montagu 1978, and Altrocchi 1980). My conclusion (which Dr. Tavris shares in a personal communication) is that the absence or rarity of sociopathy in these small, tightly knit societies is not a result of the creation of a social system in which sociopaths never develop; rather, it is that secondary sociopaths do not develop (keeping total numbers at the low baseline) and that primary sociopaths emigrate. Small, closely knit societies have all the properties that game-theoretic models indicate will reduce (but not eliminate) the incidence of the cheater strategy (see Section 3.2.1). One of the most important of these features is size per se; the cheater strategy cannot be used repeatedly against the same interactants and remain successful (see Section 1.2).
Thus, in small societies, sociopaths are likely to do their damage, acquire a reputation, and leave, to avoid punishment and move on to greener pastures. This "roving strategist" model (Harpending & Sobus 1987, Dugatkin & Wilson 1991, Dugatkin 1992) allows for both the evolution and the maintenance of a low baseline of successful sociopaths even in small groups (like those in which we presumably evolved).

14. Although primary sociopathy is a genetically based strategy, it is the end product of the additive and interactive effects of many genes, so we will not be able to predict or identify individual sociopaths from knowledge of their genotype alone. We will, however, be able to predict which children will be at risk, given their genetic background, the same way we predict which children will be at risk given their familial and sociocultural background. We will also be alerted to differentiate between diagnoses of primary sociopathy and secondary sociopathy (and our consequent approaches to them) based upon knowledge of an already identified sociopath's genetic and environmental background.

15. Axelrod (1986) and Boyd and Richerson (1992) also consider the extension of punishment not only to cheaters, but to those cooperators who do not, themselves, punish cheaters. The presence of this strategy can lead to an ESS of practically any behavior, regardless of whether there is any group benefit derived from such cooperation. Clearly this extension of the model has some analogues with totalitarian regimes and in-groups of a variety of sorts.

ACKNOWLEDGEMENTS

I would like to thank Mr. Rainer Link, who helped me get started on this project, and who collaborated with me on the first version and first public presentation of the model (Link & Mealey 1992). I would also like to extend thanks to the many individuals who provided useful comments during the revision process: J.D.
Baldwin, David Buss, Patricia Draper, Lee Dugatkin, Lee Ellis, Hans Eysenck, David Farrington, Hill Goldsmith, Henry Harpending, James Kalat, John Loehlin, Michael McGuire, Randy Nesse, Jaak Panksepp, David Rowe, Sandra Scarr, Nancy Segal, Chuck Watson, David S. Wilson, and four anonymous BBS reviewers.

REFERENCES

Alexander, R.D. (1986) Biology and law. Ethology and Sociobiology 7(3/4):329-337.
Alexander, R.D. (1987) The biology of moral systems. Aldine de Gruyter Pub.
Allen, H., Lindner, L., Goldman, H. & Dinitz, S. (1971) Hostile and simple sociopaths: An empirical typology. Criminology 9:27-47.
Allsopp, J., Eysenck, H.J. & Eysenck, S.B.G. (1991) Machiavellianism as a component in psychoticism and extraversion. Personality and Individual Differences 12(1):29-41.
Altrocchi, J. (1980) Abnormal behavior. Harcourt Brace Jovanovich Pub.
American Psychiatric Association (1987) Diagnostic and statistical manual, 3rd ed. (rev.). American Psychiatric Association.
Anawalt, H.C. (1986) Ostracism and the law of defamation. Ethology and Sociobiology 7(3/4):329-337.
Apter, M.J. (1992) The dangerous edge: The psychology of excitement. Free Press.
Archer, J. (1991) The influence of testosterone on human aggression. British Journal of Psychology 82:1-28.
Axelrod, R. (1984) The evolution of cooperation. Basic Books.
Axelrod, R. (1986) An evolutionary approach to norms. American Political Science Review 80(4):1095-1111.
Axelrod, R. & Dion, D. (1988) More on the evolution of cooperation. Science 242:1385-1390.
Axelrod, R. & Hamilton, W.D. (1981) The evolution of cooperation. Science 211:1390-1396.
Baker, L.A., Mack, W., Moffitt, T.E. & Mednick, S. (1989) Sex differences in property crime in a Danish adoption cohort. Behavior Genetics 19(3):355-370.
Baldwin, J.D. (1990) The role of sensory stimulation in criminal behavior, with special attention to the age peak in crime. In: Crime in biological, social, and moral contexts, eds. L. Ellis & H. Hoffman. Praeger Pub.
Barber, N.
(1992) Are interpersonal attitudes, such as Machiavellianism and altruism, modified by relatedness of their targets? Presented at the Fourth Annual Meeting of the Human Behavior and Evolution Society, Albuquerque, NM.
Bartol, C.R. (1984) Psychology and American law. Wadsworth Pub.
Bell, R.Q. & Chapman, M. (1986) Child effects in studies using experimental or brief longitudinal approaches to socialization. Developmental Psychology 22:595-603.
Belsky, J., Steinberg, L. & Draper, P. (1991) Childhood experience, interpersonal development and reproductive strategy: An evolutionary theory of socialization. Child Development 62(4):647-670.
Bigelow, R. (1972) The evolution of cooperation, aggression, and self-control. In: Nebraska symposium on motivation, vol. 20, eds. J.K. Cole & D.D. Jensen. University of Nebraska Press.
Blumstein, A. & Cohen, J. (1987) Characterizing criminal careers. Science 237:985-991.
Boddy, J., Carver, A. & Rowley, K. (1986) Effects of positive and negative verbal reinforcement on performance as a function of extraversion-introversion: Some tests of Gray's theory. Personality and Individual Differences 7(1):81-88.
Bohman, M., Sigvardsson, S. & Cloninger, C.R. (1981) Maternal inheritance of alcohol abuse. Archives of General Psychiatry 38:965-969.
Borowiak, M. (1989) The effectiveness of individual therapy, group therapy, family therapy, and outdoor therapy on adjudicated juvenile delinquents: A meta-analysis. Unpublished senior thesis, St. John's University, Collegeville, MN.
Boyd, R. (1988) Is the repeated Prisoner's Dilemma game a good model of reciprocal altruism? Ethology and Sociobiology 9:211-221.
Boyd, R. & Richerson, P.J. (1992) Punishment allows the evolution of cooperation (or anything else) in sizable groups. Ethology and Sociobiology 13(3):171-195.
Bradley, M.T. & Klohn, K.I. (1987) Machiavellianism, the control question test and the detection of deception. Perceptual and Motor Skills 64:747-757.
Brown, G.L., Goodwin, F.K., Ballenger, J.C., Goyer, P.F. & Major, L.F. (1979) Aggression in humans correlates with cerebrospinal fluid amine metabolites. Psychiatry Research 1:131-139.
Brunner, H.G., Nelen, M., Breakefield, X.O., Ropers, H.H. & van Oost, B.A. (1993) Abnormal behavior associated with a point mutation in the structural gene for monoamine oxidase A. Science 262:578-580.
Buss, D.M. (1981) Predicting parent-child interactions from children's activity level. Developmental Psychology 17:59-65.
Buss, D.M. (1988) The evolution of human intrasexual competition: Tactics of mate attraction. Journal of Personality and Social Psychology 54(4):616-628.
Buss, D.M. (1991) Evolutionary personality psychology. Annual Review of Psychology 42:459-491.
Byrne, R.W. & Whiten, A. (1988) Machiavellian intelligence: Social expertise and the evolution of intellect in monkeys, apes, and humans. Oxford Science Pub.
Cadoret, R.J. (1978) Psychopathology of adopted-away offspring of biologic parents with antisocial personality. Archives of General Psychiatry 35:176-184.
Cadoret, R.J. (1982) Genotype-environment interaction in antisocial behavior. Psychosomatic Medicine 12:235-239.
Cadoret, R. (1986) Epidemiology of antisocial personality. In: Unmasking the psychopath: Antisocial personality and related syndromes, eds. W.H. Reid, D. Dorr, J.I. Walker & J.W. Bonner, III. W.W. Norton Pub.
Cadoret, R.J. & Cain, C. (1980) Sex differences in predictors of antisocial behavior in adoptees. Archives of General Psychiatry 37:1171-1175.
Cadoret, R.J. & Cain, C. (1981) Environmental and genetic factors in predicting adolescent antisocial behavior. The Psychiatric Journal of the University of Ottawa 6(4):220-225.
Cadoret, R.J., Cain, C.A. & Crowe, R.R. (1983) Evidence for gene-environment interaction in the development of adolescent antisocial behavior. Behavior Genetics 13(3):301-310.
Cadoret, R.J. & Stewart, M.A.
(1991) An adoption study of attention deficit/hyperactivity/aggression and their relationship to adult antisocial personality. Comprehensive Psychiatry 32(1):73-82.
Cadoret, R.J., Troughton, E., Bagford, J. & Woodworth, G. (1990) Genetic and environmental factors in adoptee antisocial personality. European Archives of Psychiatry and Neurological Sciences 239:231-240.
Cadoret, R.J., Troughton, E. & O'Gorman, T.W. (1987) Genetic and environmental factors in alcohol abuse and antisocial personality. Journal of Studies on Alcohol 48(1):1-8.
Caldwell, R.L. (1986) The deceptive use of reputation by stomatopods. In: Deception: Perspectives on human and nonhuman deceit, eds. R.W. Mitchell & N.S. Thompson. SUNY Press.
Caspi, A., Elder, G.H. & Bem, D.J. (1987) Moving against the world: Life-course patterns of explosive children. Developmental Psychology 23:308-313.
Charney, D.S., Woods, S.W., Krystal, J.H. & Heninger, G.R. (1990) Serotonin function and human anxiety disorders. Annals of the New York Academy of Sciences 600:558-573.
Chesno, F.A. & Kilman, P.R. (1975) Effects of stimulation intensity on sociopathic avoidance learning. Journal of Abnormal Psychology 84:144-150.
Christiansen, K.O. (1977a) A review of studies of criminality among twins. In: Biosocial bases of criminal behavior, eds. S.A. Mednick & K.O. Christiansen. Gardner Press.
Christiansen, K.O. (1977b) A preliminary study of criminality among twins. In: Biosocial bases of criminal behavior, eds. S.A. Mednick & K.O. Christiansen. Gardner Press.
Christie, R. (1970) The Machiavellis among us. Psychology Today 4(6):82-86.
Christie, R. & Geis, F.L. (1970) Studies in Machiavellianism. Academic Press.
Cialdini, R.B., Kenrick, D.T. & Baumann, D.J. (1982) Effects of mood on prosocial behavior in children and adults. In: The development of prosocial behavior, ed. N. Eisenberg. Academic Press.
Cloninger, C.R. (1987a) A systematic method for clinical description and classification of personality variants.
Archives of General Psychiatry 44:573-588.
Cloninger, C.R. (1987b) Neurogenetic adaptive mechanisms in alcoholism. Science 236:410-416.
Cloninger, C.R., Bohman, M. & Sigvardsson, S. (1981) Inheritance of alcohol abuse. Archives of General Psychiatry 38:861-868.
Cloninger, C.R., Christiansen, K.O., Reich, T. & Gottesman, I.I. (1978) Implications of sex differences in the prevalences of antisocial personality, alcoholism, and criminality for familial transmission. Archives of General Psychiatry 35:941-951.
Cloninger, C.R. & Gottesman, I.I. (1987) Genetic and environmental factors in antisocial behavior disorders. In: The causes of crime: New biological approaches, eds. S.A. Mednick, T.E. Moffitt & S.A. Stack. Cambridge University Press.
Cloninger, C.R., Reich, T. & Guze, S.B. (1975) The multifactorial model of disease transmission: Sex differences in the familial transmission of sociopathy (antisocial personality). British Journal of Psychiatry 127:11-22.
Cloninger, C.R., Svrakic, D.M. & Przybeck, T.R. (1993) A psychobiological model of temperament. Archives of General Psychiatry 50:975-990.
Cohen, L.E. & Machalek, R. (1988) A general theory of expropriative crime: An evolutionary ecological approach. American Journal of Sociology 94(3):465-501.
Conger, R. (1993) Linking family processes to adolescent deviance. Presented at the 45th annual meeting of the American Society of Criminology, Phoenix, AZ.
Crawford, C.B. & Anderson, J.L. (1989) Sociobiology: An environmentalist discipline? American Psychologist 44(12):1449-1459.
Crowe, R.R. (1972) The adopted offspring of women criminal offenders: A study of their arrest records. Archives of General Psychiatry 27:600-603.
Crowe, R.R. (1974) An adoption study of antisocial personality. Archives of General Psychiatry 31:785-791.
Dabbs, J.M. & Morris, R. (1990) Testosterone, social class and antisocial behavior. Psychological Science 1:209-211.
Daitzman, R. & Zuckerman, M.
(1980) Disinhibitory sensation seeking, personality and gonadal hormones. Personality and Individual Differences 1:103-110.
Daly, M. & Wilson, M. (1983) Sex, evolution, and behavior, 2nd ed. Willard Grant Press.
Damasio, A.R., Tranel, D. & Damasio, H. (1990) Individuals with sociopathic behavior caused by frontal damage fail to respond autonomically to social stimuli. Behavioural Brain Research 41:81-94.
Davison, G.C. & Neale, J.M. (1994) Abnormal psychology, 6th ed. John Wiley & Sons Pub.
Dawkins, R. (1980) Good strategy or evolutionarily stable strategy? In: Sociobiology: Beyond nature/nurture?, eds. G.W. Barlow & J. Silverberg. Westview Press.
Dennett, D. (1988) Why creative intelligence is hard to find. Behavioral and Brain Sciences 11(2):253.
Depue, R.A. & Spoont, M.R. (1986) Conceptualizing a serotonin trait. Annals of the New York Academy of Sciences 487:47-62.
Derryberry, D. (1987) Incentive and feedback effects on target detection: A chronometric analysis of Gray's model of temperament. Personality and Individual Differences 8(6):855-865.
Dienstbier, R.A. (1984) The role of emotion in moral socialization. In: Emotions, cognition and behavior, eds. C.E. Izard, J. Kagan & R.B. Zajonc. Cambridge University Press.
Dimberg, U. (1988) Facial expressions and emotional reactions: A psychobiological analysis of human social behaviour. In: Social psychophysiology and emotion: Theory and clinical applications, ed. H.L. Wagner. John Wiley & Sons Pub.
Dishion, T.J., Patterson, G.R., Stoolmiller, M. & Skinner, M.L. (1991) Family, school, and behavioral antecedents to early adolescent involvement with antisocial peers. Developmental Psychology 27(1):172-180.
Dishion, T.J. & Poe, J. (1993) Parental antisocial behavior as an antecedent to deviancy training among adolescent boys and their peers. Presented at the 45th annual meeting of the American Society of Criminology, Phoenix, AZ.
Dodge, K.A. & Newman, J.P.
(1981) Biased decision-making processes in aggressive boys. Journal of Abnormal Psychology 90(4):375-379.
Draper, P. (1978) The learning environment for aggression and anti-social behavior among the !Kung (Kalahari Desert, Botswana, Africa). In: Learning non-aggression: The experience of non-literate societies, ed. A. Montagu. Oxford University Press.
Draper, P. & Belsky, J. (1990) Personality development in evolutionary perspective. Journal of Personality 58(1):141-161.
Draper, P. & Harpending, H. (1982) Father absence and reproductive strategy: An evolutionary perspective. Journal of Anthropological Research 38(3):255-273.
Dugatkin, L.A. (1992) The evolution of the "con artist". Ethology and Sociobiology 13(1):3-18.
Dugatkin, L.A. & Wilson, D.S. (1991) ROVER: A strategy for exploiting cooperators in a patchy environment. The American Naturalist 138(3):687-702.
Dumas, J.E., Blechman, E.A. & Prinz, R.J. (1992) Helping families with aggressive children and adolescents change. In: Aggression and violence throughout the lifespan, eds. R.DeV. Peters, R.J. McMahon & V.L. Quinsey. Sage Pub.
Dunn, J. (1987) The beginnings of moral understanding: Development in the second year. In: The emergence of morality in young children, eds. J. Kagan & S. Lamb. Chicago University Press.
Dunn, J. (1988) The beginnings of social understanding. Harvard University Press.
Dunn, J. (1992) Siblings and development. Current Directions in Psychological Science 1:6-9.
Dunn, J., Brown, J., Slomkowski, C., Tesla, C. & Youngblade, L. (1991) Young children's understanding of other people's feelings and beliefs: Individual differences and their antecedents. Child Development 62:1352-1366.
Dunn, J. & Plomin, R. (1990) Separate lives: Why siblings are so different. Basic Books.
van Dusen, K.T., Mednick, S.A., Gabrielli, W.F., Jr. & Hutchings, B. (1983) Social class and crime in an adoption cohort. The Journal of Criminal Law and Criminology 74(1):249-269.
Eaton, J.W. & Weil, R.J.
(1953) The mental health of the Hutterites. Scientific American 189(6):32-37.
Eisenberg, N., Fabes, R.A. & Miller, P.A. (1990) The evolutionary and neurological roots of prosocial behavior. In: Crime in biological, social, and moral contexts, eds. L. Ellis & H. Hoffman. Praeger Pub.
Ekman, P. (1971) Universals and cultural differences in facial expressions of emotion. In: Nebraska symposium on motivation, vol. 19, ed. J. Cole. University of Nebraska Press.
Ekman, P. (1992) Telling lies: Clues to deceit in the market place, politics and marriage, 2nd ed. Norton Pub.
Eliasz, A. (1987) Temperament-contingent cognitive orientation toward various aspects of reality. In: Personality dimensions and arousal, eds. J. Strelau & H.J. Eysenck. Plenum Press.
Eliasz, H. & Reykowski, J. (1986) Reactivity and empathic control of aggression. In: The biological bases of personality and behavior, vol. 2: Psychophysiology, performance, and application, eds. J. Strelau, F.H. Farley & A. Gale. Hemisphere Pub.
Ellis, L. (1987) Relationships of criminality and psychopathy with eight other apparent behavioral manifestations of sub-optimal arousal. Personality and Individual Differences 8:905-925.
Ellis, L. (1988) Criminal behavior and r/K selection: An extension of gene-based evolutionary theory. Personality and Individual Differences 9(4):697-708.
Ellis, L. (1989) Theories of rape: Inquiries into the causes of sexual aggression. Hemisphere Pub.
Ellis, L. (1990a) The evolution of collective counterstrategies to crime: From the primate control rule to the criminal justice system. In: Crime in biological, social, and moral contexts, eds. L. Ellis & H. Hoffman. Praeger Pub.
Ellis, L. (1990b) Conceptualizing criminal and related behavior from a biosocial perspective. In: Crime in biological, social, and moral contexts, eds. L. Ellis & H. Hoffman. Praeger Pub.
Ellis, L. (1991a) A synthesized (biosocial) theory of rape. Journal of Consulting and Clinical Psychology 59(5):631-642.
Ellis, L.
(1991b) Monoamine oxidase and criminality: Identifying an apparent biological marker for antisocial behavior. Journal of Research in Crime and Delinquency 28(2):227-251.
Ellis, L. & Coontz, P.D. (1990) Androgens, brain functioning, and criminality: The neurohormonal foundations of antisociality. In: Crime in biological, social, and moral contexts, eds. L. Ellis & H. Hoffman. Praeger Pub.
Emde, R.N., Plomin, R., Robinson, J., Corley, R., DeFries, J., Fulker, D.W., Reznick, J.S., Campos, J., Kagan, J. & Zahn-Waxler, C. (1992) Temperament, emotion, and cognition at fourteen months: The MacArthur longitudinal twin study. Child Development 63:1437-1455.
Eron, L.D., Huesmann, L.R. & Zelli, A. (1991) The role of parental variables in the learning of aggression. In: The development and treatment of aggression, eds. D.J. Pepler & K.H. Rubin. Lawrence Erlbaum Pub.
Eysenck, H.J. (1977) Crime and personality, 3rd ed. Routledge & Kegan Paul Pub.
Eysenck, H.J. (1983) Personality, conditioning, and antisocial behavior. In: Personality theory, moral development, and criminal behavior, eds. W.S. Laufer & J.M. Day. Lexington Books.
Eysenck, H.J. (1987) The definition of personality disorders and the criteria appropriate for their description. Journal of Personality Disorders 1(3):211-219.
Eysenck, H.J. (1990) Biological dimensions of personality. In: Handbook of personality theory and research, ed. L.A. Pervin. Guilford Pub.
Eysenck, H.J. & Gudjonsson, G.H. (1989) The causes and cures of criminality. Plenum Press.
Fagan, T.J. & Lira, F.T. (1980) The primary and secondary sociopathic personality: Differences in frequency and severity of antisocial behaviors. Journal of Abnormal Psychology 89(3):493-496.
Farley, F. (1986) The big T in personality. Psychology Today 20(5):44-52.
Farrington, D.P. (1979) Experiments on deviance with special reference to dishonesty. Advances in Experimental Social Psychology 12:207-252.
Farrington, D.P. (1982) Naturalistic experiments on helping behavior.
In: Cooperation and competition in humans and animals, ed. A.M. Colman. Van Nostrand Reinhold Pub.
Farrington, D.P. (1986) Stepping stones to adult criminal careers. In: Development of antisocial and prosocial behavior: Research, theories, and issues, eds. D. Olweus, J. Block & M. Radke-Yarrow. Academic Press.
Farrington, D.P. (1989) Early predictors of adolescent aggression and adult violence. Violence and Victims 4(2):79-100.
Feldman, M.P. (1977) Criminal behavior: A psychological analysis. John Wiley & Sons Pub.
Feldman, M.W. & Thomas, E.A.C. (1987) Behavior-dependent contexts for repeated plays of the Prisoner's Dilemma II: Dynamical aspects of the evolution of cooperation. Journal of Theoretical Biology 128:297-315.
Figueredo, A.J. & McCloskey, L.A. (n.d.) Sex, money, and paternity: The evolutionary psychology of domestic violence. Unpublished manuscript.
Forgatch, M.S. (1991) The clinical science vortex: A developing theory of antisocial behavior. In: The development and treatment of aggression, eds. D.J. Pepler & K.H. Rubin. Lawrence Erlbaum Pub.
Forgatch, M.S., Stoolmiller, M. & Patterson, G.R. (1993) Parental affect and adolescent delinquency: A mediational model. Presented at the 45th annual meeting of the American Society of Criminology, Phoenix, AZ.
Forth, A.E. & Hare, R.D. (1989) The contingent negative variation in psychopaths. Psychophysiology 26:676-682.
Foster, D. (1991) Social influence III: Crowds and collective violence. In: Social psychology in South Africa, eds. D. Foster & J. Louw-Potgieter. Lexicon Pub. (Johannesburg).
Frank, R.H. (1988) Passions within reason: The strategic role of the emotions. W.W. Norton Pub.
Frank, R.H., Gilovich, T. & Regan, D.T. (1993) The evolution of one-shot cooperation: An experiment. Ethology and Sociobiology 14:247-256.
Fry, D.P. (1988) Intercommunity differences in aggression among Zapotec children. Child Development 59:1008-1019.
Gabbay, F.H. (1992) Behavior-genetic strategies in the study of emotion.
Psychological Science 3(1):50-55.
Gangestad, S.W. & Simpson, J.A. (1990) Toward an evolutionary history of female sociosexual variation. Journal of Personality 58(1):69-95.
Garmezy, N. (1991) Resiliency and vulnerability to adverse developmental outcomes associated with poverty. American Behavioral Scientist 34(4):416-430.
Geis, F. & Levy, M. (1970) The eye of the beholder. In: Studies in Machiavellianism, eds. R. Christie & F.L. Geis. Academic Press.
Gelfand, D.M. & Hartmann, D.P. (1982) Response consequences and attributions: Two contributors to prosocial behavior. In: The development of prosocial behavior, ed. N. Eisenberg. Academic Press.
Ghodsian-Carpey, J. & Baker, L.A. (1987) Genetic and environmental influences on aggression in 4- to 7-year-old twins. Aggressive Behavior 13:173-186.
Gladue, B.A. (1991) Aggressive behavioral characteristics, hormones, and sexual orientation in men and women. Aggressive Behavior 17:313-326.
Gold, M. (1987) Social ecology. In: Handbook of juvenile delinquency, ed. H.C. Quay. Wiley-Interscience Pub.
Gordon, D.A. & Arbuthnot, J. (1987) Individual, group, and family interventions. In: Handbook of juvenile delinquency, ed. H.C. Quay. Wiley-Interscience Pub.
Gorenstein, E.E. & Newman, J.P. (1980) Disinhibitory psychopathology: A new perspective and a model for research. Psychological Review 87(3):301-315.
Gottesman, I.I. & Goldsmith, H.H. (1993) Developmental psychopathology of antisocial behavior: Inserting genes into its ontogenesis and epigenesis. In: Threats to optimal development: Integrating biological, psychological and social risk factors, ed. C.A. Nelson. Lawrence Erlbaum Pub.
Gottschalk, R., Davidson, W.S., II, Gensheimer, L.K. & Mayer, J.P. (1987) Community based interventions. In: Handbook of juvenile delinquency, ed. H.C. Quay. Wiley-Interscience Pub.
Gould, J.L. & Gould, C.G. (1989) Sexual selection. Scientific American Library, W.H. Freeman Pub.
Gove, W.R. & Wilmoth, C.
(1990) Risk, crime, and neurophysiologic highs: A consideration of brain processes that may reinforce delinquent and criminal behavior. In: Crime in biological, social, and moral contexts, eds. L. Ellis & H. Hoffman. Praeger Pub.
Gray, J.A. (1982) The neuropsychology of anxiety: An enquiry into the functions of the septo-hippocampal system. Oxford University Press.
Gray, J.A. (1987) Perspectives on anxiety and impulsivity: A commentary. Journal of Research in Personality 21:493-509.
Griffiths, P.E. (1990) Modularity, and the psychoevolutionary theory of emotion. Biology and Philosophy 5:175-196.
Grusec, J.E. (1982) The socialization of altruism. In: The development of prosocial behavior, ed. N. Eisenberg. Academic Press.
Halpern, C.T., Udry, J.R., Campbell, B. & Suchindran, C. (1993) Relationship between aggression and pubertal increases in testosterone: A panel analysis of adolescent males. Social Biology 40(1-2):8-24.
Hare, R.D. (1970) Psychopathy: Theory and research. John Wiley & Sons Pub.
Hare, R.D. (1980) A research scale for the assessment of psychopathy in criminal populations. Personality and Individual Differences 1:111-119.
Hare, R.D. (1993) Without conscience: The disturbing world of the psychopaths among us. Pocket Books.
Hare, R.D. & Craigen, D. (1974) Psychopathy and physiological activity in a mixed-motive game situation. Psychophysiology 11:197-206.
Hare, R.D., Forth, A.E. & Strachan, K.E. (1992) Psychopathy and crime across the life span. In: Aggression and violence throughout the lifespan, eds. R.DeV. Peters, R.J. McMahon & V.L. Quinsey. Sage Pub.
Hare, R.D. & Quinn, M.J. (1971) Psychopathy and autonomic conditioning. Journal of Abnormal Psychology 77:223-235.
Harpending, H. & Sobus, J. (1987) Sociopathy as an adaptation. Ethology and Sociobiology 8(3s):63s-72s.
Hartup, W.W. (1989) Social relationships and their developmental significance. American Psychologist 44(2):120-126.
Heckathorn, D.D.
(1988) Collective sanctions and the creation of Prisoner's Dilemma norms. American Journal of Sociology 94(3):535-562.
Hirschi, T. & Hindelang, M.J. (1977) Intelligence and delinquency: A revisionist review. American Sociological Review 42:571-587.
Hirshleifer, D. & Rasmusen, E. (1989) Cooperation in a repeated prisoner's dilemma with ostracism. Journal of Economic Behavior and Organization 12:87-106.
Hirshleifer, J. (1987) On the emotions as guarantors of threats and promises. In: The latest on the best: Essays on evolution and optimality, ed. J. Dupre. Bradford Books.
Hirshleifer, J. & Coll, J.C. (1988) What strategies can support the evolutionary emergence of cooperation? Journal of Conflict Resolution 32(2):367-398.
Hodgins, S. (1992) Mental disorder, intellectual deficiency, and crime: Evidence from a birth cohort. Archives of General Psychiatry 49:476-483.
Hoffman, M.L. (1975) Developmental synthesis of affect and cognition and its implications for altruistic motivation. Developmental Psychology 11(5):607-622.
Hoffman, M.L. (1977) Empathy, its development and prosocial implications. In: Nebraska symposium on motivation, vol. 25, ed. C.B. Keasey. University of Nebraska Press.
Hoffman, M.L. (1978) Psychological and biological perspectives on altruism. International Journal of Behavioral Development 1:323-339.
Hoffman, M.L. (1982) Development of prosocial motivation: Empathy and guilt. In: The development of prosocial behavior, ed. N. Eisenberg. Academic Press.
Hoffman, M.L. (1984) Interaction of affect and cognition in empathy. In: Emotions, cognition and behavior, eds. C.E. Izard, J. Kagan & R.B. Zajonc. Cambridge University Press.
Hogan, R. & Jones, W.H. (1983) A role-theoretical model of criminal behavior. In: Personality theory, moral development, and criminal behavior, eds. W.S. Laufer & J.M. Day. Lexington Books.
Huesmann, L.R., Eron, L.D., Lefkowitz, M.M. & Walder, L.O. (1984) Stability of aggression over time and generations.
Developmental Psychology 20(6):1120-1134.
Humphrey, N.K. (1976) The social function of intellect. In: Growing points in ethology, eds. P.P.G. Bateson & R.A. Hinde. Cambridge University Press.
Humphrey, N.K. (1983) Consciousness regained. Oxford University Press.
Hutchings, B. & Mednick, S.A. (1977) Criminality in adoptees and their adoptive and biological parents: A pilot study. In: Biosocial bases of criminal behavior, eds. S.A. Mednick & K.O. Christiansen. Gardner Press.
Irons, W. (1991) How did morality evolve? Zygon 26(1):49-89.
Izard, C.E. (1977) Human emotions. Plenum Press.
Izard, C.E. (1991) The psychology of emotions. Plenum Press.
Jaffe, P.G., Suderman, M. & Reotzel, D. (1992) Working with children and adolescents to end the cycle of violence: A social learning approach to intervention. In: Aggression and violence throughout the lifespan, eds. R.DeV. Peters, R.J. McMahon & V.L. Quinsey. Sage Pub.
Jenner, F.A. (1980) Psychiatry, biology and morals. In: Morality as a biological phenomenon: The presuppositions of sociobiological research, ed. G.S. Stent. University of California Press.
Kalat, J.W. (1992) Biological psychology, 4th ed. Wadsworth Pub.
Kandel, E., Mednick, S.A., Kirkegaard-Sorensen, L., Hutchings, B., Knop, J., Rosenberg, R. & Schulsinger, F. (1988) IQ as a protective factor for subjects at high risk for antisocial behavior. Journal of Consulting and Clinical Psychology 56(2):224-226.
Kenrick, D.T., Dantchik, A. & MacFarlane, S. (1983) Personality, environment, and criminal behavior: An evolutionary perspective. In: Personality theory, moral development, and criminal behavior, eds. W.S. Laufer & J.M. Day. Lexington Books.
Kochanska, G. (1991) Socialization and temperament in the development of guilt and conscience. Child Development 62:1379-1392.
Kochanska, G. (1993) Toward a synthesis of parental socialization and child temperament in early development of conscience. Child Development 64:325-347.
Kochanska, G. & Murray, K.
(1992) Temperament and conscience development. Presented at the Ninth Occasional Temperament Conference, Bloomington, IN, October 29-31, 1992.
Kofoed, L. & MacMillan, J. (1986) Alcoholism and antisocial personality: The sociobiology of an addiction. The Journal of Nervous and Mental Disease 174(6):332-335.
Kohlberg, L. (1964) Development of moral character and moral ideology. In: Review of child development research, eds. M. Hoffman & L.W. Hoffman. Russell Sage Foundation.
Kraut, R.E. & Price, J.D. (1976) Machiavellianism in parents and their children. Journal of Personality and Social Psychology 33(6):782-786.
Kruesi, M.J., Hibbs, E.D., Zahn, T.P., Keysor, C.S., Hamburger, S.D., Bartko, J.J. & Rapoport, J.L. (1992) A 2-year prospective follow-up study of children and adolescents with disruptive behavior disorders: Prediction by cerebrospinal fluid 5-hydroxyindoleacetic acid, homovanillic acid, and autonomic measures? Archives of General Psychiatry 49:429-435.
Landy, S. & Peters, R.DeV. (1992) Toward an understanding of a developmental paradigm for aggressive conduct problems during the pre-school years. In: Aggression and violence throughout the lifespan, eds. R.DeV. Peters, R.J. McMahon & V.L. Quinsey. Sage Pub.
Lazarus, R.S. (1991) Emotion and adaptation. Oxford University Press.
Lee, C.L. & Bates, J.E. (1985) Mother-child interactions at age two years and perceived difficult temperament. Child Development 56:1314-1325.
Leslie, A.M. (1987) Pretense and representation: The origins of 'Theory of Mind'. Psychological Review 94(4):412-426.
Lewontin, R.C. (1961) Evolution and the theory of games. Journal of Theoretical Biology 1:382-403.
Link, R. & Mealey, L. (1992) The sociobiology of sociopathy: An integrated evolutionary model. Presented at the Conference on the Biology of Morality, Bethel College, St. Paul, MN.
Littlepage, G. & Pineault, T. (1978) Verbal, facial, and paralinguistic cues to the detection of truth and lying.
Personality and Social Psychology Bulletin 4(3):461-464. Loeb, J. & Mednick, S.A. (1977) A prospective study of predictors of criminality, III: electrodermal response patterns. In: The causes of crime: New biological approaches, eds. S.A. Mednick, Terrie E. Moffitt, & S.A. Stack. Cambridge University Press. Loeber, R. (1982) The stability of antisocial and delinquent child behavior: A review. Child Development 53:1431-1446. Loeber, R. (1990) Development and risk factors of juvenile antisocial behavior and delinquency. Clinical Psychology Review 10:1-41. Loeber, R. (1993) Predictors of delinquency and violence: Longitudinal findings from the Pittsburgh youth study. Presented at the 45th annual meeting of the American Society of Criminology, Phoenix, AZ. Loeber, R. & Dishion, T. (1983) Early predictors of male delinquency: A review. Psychological Bulletin 94(1):68-99. Loeber, R. & Stouthamer-Loeber, M. (1987) Prediction. In: Handbook of juvenile delinquency, ed. H.C. Quay. Wiley-Interscience Pub. Loehlin, J.C. (1992) Genes and environment in personality development. Sage Pub. Low, B. (1993) Linking our evolutionary past and our ecological future: A behavioral ecological approach. Presented at the Fifth Annual Meeting of the Human Behavior and Evolution Society, August 4-8, 1993, Binghamton, NY. Luntz, B.K. & Widom, C.S. (1993) A comparison of antisocial personality diagnoses and psychopathy checklist scores in a sample of non-institutionalized young adults. Presented at the 45th annual meeting of the American Society of Criminology, Phoenix, AZ. Lyng, S. (1990) Edgework: A social psychological analysis of voluntary risk taking. American Journal of Sociology 95(4):851-856. Lytton, H. (1990) Child and parent effects in boys' conduct disorder: A reinterpretation. Developmental Psychology 26(5):683- 697. Maccoby, E.E. (1986) Social groupings in childhood: Their relationship to prosocial and antisocial behavior in boys and girls. 
In: Development of antisocial and prosocial behavior: Research, theories, and issues, eds. D. Olweus, J. Block & M. Radke-Yarrow. Academic Press. MacDonald, K.B., ed. (1988) Sociobiological perspectives on human development. Springer-Verlag Pub. Machalek, R. & Cohen, L.E. (1991) The nature of crime: Is cheating necessary for cooperation? Human Nature 2(3):215-233. MacMillan, J. & Kofoed, L. (1984) Sociobiology and antisocial personality: An alternative perspective. The Journal of Nervous and Mental Disease 172(12):701-706. Madsen, D. (1985) A biochemical property relating to power seeking in humans. American Political Science Review 79:448-457. Magid, K. & McKelvey, C.A. (1987) High risk: Children without a conscience. Bantam Books. Malamuth, N.M., Sockloskie, R.J., Koss, M.P. & Tanaka, J.S. (1991) Characteristics of aggressors against women: Testing a model using a national sample of college students. Journal of Counseling and Clinical Psychology 59(5):670-681. Maynard Smith, J. (1978) The evolution of behavior. Scientific American 239:176-192. Maynard Smith, J. (1974) The theory of games and the evolution of animal conflict. Journal of Theoretical Biology 47:209-221. Maynard Smith, J. & Price, G.R. (1973) The logic of animal conflict. Nature 246:15-18. McCord, J. (1983) Personality, moral development, and criminal behavior. In: Personality theory, moral development, and criminal behavior, eds. W.S. Laufer & J.M. Day. Lexington Books. McCord, J. (1986) Instigation and insulation: How families affect antisocial aggression. In: Development of antisocial and prosocial behavior: Research, theor8ies, and issues, eds. D. Olweus, J. Block & M. Radke-Yarrow. Academic Press. McCord, J. (1993) From family to peer group. Discussant at the 45th annual meeting of the American Society of Criminology, Phoenix, AZ. McGarvey, B., Gabrielli, W.F., Jr., Bentler, P.M. & Mednick, S.A. (1981) Rearing social class, education, and criminality: A multiple indicator model. 
Journal of Abnormal Psychology 90(4):354-364. McGue, M., Pickens, R.W. & Svikis, D.S. (1992) Sex and age effects on the inheritance of alcohol problems: A twin study. Journal of Abnormal Psychology 101(1):3-17. McGuire, M., Raleigh, M. & Johnson, C. (1983) Social dominance in adult male vervet monkeys II: Behavior-biochemical relationships. Social Science Information 22:311-328. Mealey, L. (1984) Comment on genetic similarity theory. Behavior Genetics 15(6):571-574. Mealey, L. (1990) Differential use of reproductive strategies by human groups? Psychological Science 1(6):385-387. Mealey, L. (1992) Are monkeys nomothetic or idiographic? Behavioral and Brain Sciences 15(1):161-162. Mealey, L. & Segal, N.L. (1993) Heritable and environmental variables affect reproduction-related behaviors, but not ultimate reproductive success. Personality and Individual Differences 14(6):783-794. Mednick, S.A. (1977) A biosocial theory of learning of law-abiding behavior. In: Biosocial bases of criminal behavior, eds. S.A. Mednick & K.O. Christiansen. Gardner Press. Mednick, S.A. & Finello, K.M. (1983) Biological factors and crime: Implications for forensic psychiatry. International Journal of Law and Psychiatry 6:1-15. Mednick, S.A., Gabrielli, W.F., Jr. & Hutchings, B. (1984) Genetic influences in criminal convictions: Evidence from an adoption cohort. Science 224:891-894. Mednick, S.A., Gabrielli, W.F., Jr. & Hutchings, B. (1987) Genetic factors in the etiology of criminal behavior. In: The causes of crime: New biological approaches, eds. S.A. Mednick, Terrie E. Moffitt, & S.A. Stack. Cambridge University Press. Mednick, S.A., Kirkegaard-Sorense, L., Hutchings, B., Knop, J., Rosenberg, R. & Schulsinger, F. (1977) An example of biosocial interaction research: The interplay of socioenvironmental and individual factors in the etiology of criminal behavior. In: Biosocial bases of criminal behavior, eds. S.A. Mednick & K.O. Christiansen. Gardner Press. Mednick, S.A., Moffitt, T.E. 
& Stack, S.A. (1987) The causes of crime: New biological approaches. Cambridge University Press. Mineka, S. & Sutton, S.K. (1992) Cognitive biases and emotional disorders. Psychological Science 3(1):65-69. Mitchell, R.W. (1986) A framework for discussing deception. In: Deception: Perspectives on human and nonhuman deceit, eds. R.W. Mitchell & N.S. Thompson. SUNY Press. Mitchell, R.W. & Thompson, N.S. (1986) Deception: Perspectives on human and nonhuman deceit. SUNY Press. Moffitt, T.E. (1987) Parental mental disorder and offspring criminal behavior: An adoption study. Psychiatry 50:346-360. Moffitt, T.E. (1993) Adolescent-limited and life-course-persistent antisocial behavior: A developmental taxonomy. Psychological Reports 100(4):674-701. Moffitt, T., Caspi, A., Belsky, J. & Silva, P.A. (1992) Childhood experience and the onset of menarche: A test of a sociobiological model. Child Development 63:47-58. Moffitt, T.E. & Silva, P.A. (1988) IQ and delinquency: A direct test of the differential detection hypothesis. Journal of Abnormal Psychology97(3):330-333. Montagu, A. (1978) Learning non-aggression: The experience of non- literate societies. Oxford University Press. Morell, V. (1993) Evidence found for a possible 'Aggression Gene'. Science 260:1722-1723. Morris, W.W. (1992) More on the mood-emotion distinction. Psycoloquy (an electronic journal) 3.2.1.1. Morrison, J.R. & Stewart, M.A. (1971) A family study of the hyperactive child syndrome. Biological Psychiatry 3:189-195. Muhlbauer, H.D. (1985) Human agression and the role of central serotonin. Pharmacopsychiatry 18:218-221. Mussen, P. & Eisenberg-Berg, N. (1977) Roots of caring, sharing, and helping. W.H. Freeman Pub. Nathanson, D.L. (1992) Shame and pride: Affect, sex, and the birth of the self. Norton Pub. National Research Council (1993) Understanding and preventing violence. National Academy Press. Nesse, R.M. (1990) Evolutionary explanations of emotions. Human Nature 1(3):261-289. Nesse, R. 
(1991) What is mood for? Psycoloquy (an electronic journal) 2.9.2.1. Nesse, R. (1992a) Overevaluation of the mood-emotion distinction. Psycoloquy (an electronic journal) 3.2.1.2. Nesse, R. (1992b) Ethology to the rescue (Reply to Plutchik) Psycoloquy (an electronic journal) 3.2.1.6. Newman, J.P (1987) Reaction to punishment in extraverts and psychopaths: Implications for the impulsive behavior of disinhibited individuals. Journal of Research in Personality 21:464-480. Newman, J.P. & Kosson, D.S. (1986) Passive avoidance learning in psychopathic and nonpsychopathic offenders. Journal of Abnormal Psychology 95(3):252-256. Newman, J.P, Patterson, C.M., Howland, E.W. & Nichols, S.L. (1990) Personality and Individual Differences 11(11):1101-1114. Newman, J.P., Widom, C.S. & Nathan, S. (1985) Passive avoidance in syndromes of disinhibition: Psychopathy and extraversion. Journal of Personality and Social Psychology 48(5):1316-1327. Nowak, M. & Sigmund, K. (1993) A strategy of win-stay, lose-shift that outperforms tit-for-tat in the Prisoner's Dilemma game. Nature 364:56-58. Offord, D.R., Boyle, M.H. & Racine, Y.A. (1991) The epidemiology of antisocial behavior in childhood and adolescence. In: The development and treatment of aggression, eds. D.J. Pepler & K.H. Rubin. Lawrence Erlbaum Pub. Olson, S.L. (1989) Assessment of impulsivity in preschoolers: Cross- measure convergences, longitudinal stability, and relevance to social competence. Journal of Clinical Child Psychology 18(2):176- 183. Olweus, D. (1986) Aggression and hormones: Behavioral relationship with testosterone and adrenaline. In: Development of antisocial and prosocial behavior: Research, theories, and issues, eds. D. Olweus, J. Block & M. Radke-Yarrow. Academic Press. Olweus, D. (1987) Testosterone and adrenaline: Aggressive antisocial behavior in normal adolescent males. In: The causes of crime: New biological approaches, eds. S.A. Mednick, Terrie E. Moffitt, & S.A. Stack. Cambridge University Press. 
Palmer, T. (1983) The "effectiveness" issue today: An overview. Federal Probation 26:3-10. Panskepp, J. (1982) Toward a general psychobiological theory of emotions. Behavioral and Brain Sciences 5:407-422. Patterson, G.R. (1992) Developmental changes in antisocial behavior. In: Aggression and violence throughout the lifespan, eds. R.DeV. Peters, R.J. McMahon & V.L. Quinsey. Sage Pub. Patterson, G.R. (1993) Determinants and outcomes for early and late onset of police arrest. Presented at the 45th annual meeting of the American Society of Criminology, Phoenix, AZ. Patterson, G.R., Capaldi, D. & Bank, L. (1991) An early starter model for predicting delinquency. In: The development and treatment of aggression, eds. D.J. Pepler & K.H. Rubin. Lawrence Erlbaum Pub. Patterson, C.M. & Newman, J.P. (1993) Reflectivity and learning from aversive events. Psychological Review 100(4):716-736. Petty, F. & Sherman, A.D. (1982) Serotonergic mediation of the learned helplessness animal model of depression. In: Serotonin in biological psychiatry, eds. B.T. Ho, J.C. Schoolar & E. Usdin. Raven Press. Piliavin, I., Thornton, C, Gartner, R. & Matsueda, R. (1986) Crime, deterrence, and rational choice. American Sociological Review 51:101-119. Pulkkinen, L. (1986) The role of impulse control in the development of antisocial and prosocial behavior. In: Development of antisocial and prosocial behavior: Research, theories, and issues, eds. D. Olweus, J. Block & M. Radke-Yarrow. Academic Press. Ortony, A. & Turner, T.J. (1990) What's basic about basic emotions? Psychological Review 97(3):315-331. Ostrom, E. (1990) Governing the commons. Cambridge University Press. Patterson, G.R. (1993) Determinants and outcomes for early and late onset of police arrest. Presented at the 45th annual meeting of the American Society of Criminology, Phoenix, AZ. Patterson, G.R., DeBaryshe, B.D. & Ramsey, E. (1989) A developmental perspective on antisocial behavior. American Psychologist 44(2):329- 335. Person, E. 
S. (1986) Manipulativeness in entrepreneurs and psychopaths. In: Unmasking the psychopath: Antisocial personality and related syndromes, eds. W.H. Reid, D. Dorr, J.I Walker & J.W. Bonner, III. W.W. Norton Pub. Plutchik, R. (1980) Emotion: A psychoevolutionary synthesis. Harper & Row Pub. Plutchik, R. (1992) What is mood for? A critique. Psycoloquy (an electronic journal) 3.2.1.5. Plomin, R. & Daniels, D. (1987) Why are children in the same family so different from each other? Behavioral and Brain Sciences 10:1-16. Premack, D. & Woodruff, G. Does a chimpanzee have a theory of mind? Behavioral and Brain Sciences 4:515-526. Pulliam, H.R. (1982) A social learning model of conflict and cooperation in human societies. Human Ecology 10(3):353-363. Quiatt, D. (1988) Which are more easily deceived, friends or strangers? Behavioral and Brain Sciences 11(2):260-261. Quinsey, V.L. & Walker, W.D. (1992) Dealing with dangerousness: Community risk management strategies with violent offenders. In: Aggression and violence throughout the lifespan, eds. R.DeV. Peters, R.J. McMahon & V.L. Quinsey. Sage Pub. Quay, H.C. (1990a) Intelligence. In: Handbook of juvenile delinquency, ed. H.C. Quay. Wiley-Interscience Pub. Quay, H.C. (1990b) Patterns of delinquent behavior. In: Handbook of juvenile delinquency, ed. H.C. Quay. Wiley-Interscience Pub. Radke-Yarrow, M. & Zahn-Waxler, C. (1986) The role of familial factors in the development of prosocial behavior: Research findings and questions. In: Development of antisocial and prosocial behavior: Research, theories, and issues, eds. D. Olweus, J. Block & M. Radke- Yarrow. Academic Press. Raine, A. (1988) Antisocial behavior and social psychophysiology. In: Social psychophysiology and emotion: Theory and clinical application, ed H.L. Wagner. John Wiley and Sons, Pub. Raine, A. (1989) Evoked potentials and psychopathy. International Journal of Psychophysiology 8:1-16. Raine, A. 
(1993) The psychopathology of crime: Criminal behavior as a clinical disorder. Academic Press. Raine, A. & Dunkin, J. (1990) The genetic and psychophysiological basis of antisocial behavior: Implications for counseling and therapy. Journal of Counseling and Development 68:637-644. Raine, A., Rogers, D.B. & Venables, P.H. (1982) Locus of control and socialization. Journal of Research in Personality 16:147-156. Raine, A. & Venables, P.H. (1981) Classical conditioning and socialization- a biosocial interaction. Personality and Individual Differences 2:273-283. Raine, A. & Venables, P.H. (1984) Tonic heart rate level, social class, and antisocial behavior in adolescents. Biological Psychology 18:123-132. Raine, A. & Venables, P.H. (1987) Contingent negative variation, P3 evoked potentials, and antisocial behavior. Psychophysiology 24(2):191-199. Raine, A., Venables, P.H. & Williams, M. (1990a) Relationships between central and autonomic measures of arousal at age 15 years and criminality at age 24 years. Archives of General Psychiatry 47:1003-1007. Raine, A., Venables, P.H. & Williams, M. (1990b) Relationships between N1, P300, and contingent negative variation recorded at age 15 and criminal behavior at age 24. Psychophysiology 27(5):567-574. Raleigh, M., McGuire, M., Brammer, G., Pollack, D.B. & Yuwiler, A. (1991) Serotonergic mechanisms promote dominance acquisition in adult male vervet monkeys. Brain Research 559:181-190. Raleigh, M., McGuire, M., Brammer, G. & Yuwiler, A. (1984) Social status and whole blood serotonin in vervets. Archives of General Psychiatry 41(4):405-410. Robins, L.N. (1986) The consequences of conduct dissorder in girls. In: Development of antisocial and prosocial behavior: Research, theories, and issues, eds. D. Olweus, J. Block & M. Radke-Yarrow. Academic Press. Robins, L.N., Tipp, J. & Przybeck, T. (1991) Antisocial personality. In: Psychiatric disorders in America, eds. L.N. Robins & D.A. Regier. Free Press. Robins, L.N. & Wish, E. 
Childhood deviance as a developmental process: A study of 223 urban black men from birth to 18. Social Forces 56(2):449-473. Rowe, D.C. (1983a) A biometrical analysis of perceptions of family environment: A study of twin and singleton sibling kinships. Child Development 54:416-423. Rowe, D.C. (1983b) Biometrical genetic models of self-reported delinquent behavior: A twin study. Behavior Genetics 13(5):473-489. Rowe, D.C. (1986) Genetic and environmental components of antisocial behavior: A study of 265 twin pairs. Criminology 24(3):513-532. Rowe, D.C. (1990a) Inherited dispositions toward learning delinquent and criminal behavior: New evidence. In: Crime in biological, social, and moral contexts, eds. L. Ellis & H. Hoffman. Praeger Pub. Rowe, D.C. (1990b) As the twig is bent? The myth of child-rearing influences on personality development. Journal of Counseling and Development 68:606-611. Rowe, D.C. & Plomin, R. (1981) The importance of nonshared environmental influences on behavioral development. Developmental Psychology 17:517-531. Rowe, D.C. & Rodgers, J.L. (1989) Behavioral genetics, adolescent deviance, and "d": Contributions and issues. In: Biology of adolescent behavior and development, eds. G.R. Adams, R. Montemayor, and T.P. Gullotta. Sage Pub. Rowe, D.C., Rodgers, J.L., Meseck-Bushey, S. & St. John, C. (1989) Sexual behavior and nonsexual deviance: A sibling study of their relationship. Developmental Psychology 25:61-69. Rubin, R.T. (1987) The neuroendocrinology and neurochemistry of antisocial behavior. In: The causes of crime: New biological approaches, eds. S.A. Mednick, Terrie E. Moffitt, & S.A. Stack. Cambridge University Press. Rushton, J.P. (1982) Social learning theory and the development of prosocial behavior. In: The development of prosocial behavior, ed. N. Eisenberg. Academic Press. Rushton, J.P. (1989) Genetic similarity, human altruism, and group selection. Behavioral and Brain Sciences 12(3):503-518. 
Rushton, J.P., Fulker, D.W., Neale, M.C., Nias, D.K.B. & Eysenck, H.J. (1986) Altruism and aggression: The heritability of individual differences. Journal of Personality and Social Psychology 50(6):1192-1198. Rushton, J.P., Russell, R.J.H. & Wells, P.A. (1984) Genetic similarity theory: An extension to sociobiology. Behavior Genetics 14:179-193. Sanchez, J. (1986) Social crises and psychopathy: Toward a sociology of the psychopath. In: Unmasking the psychopath: Antisocial personality and related syndromes, eds. W.H. Reid, D. Dorr, J.I Walker & J.W. Bonner, III. W.W. Norton Pub. Satterfeld, J.H. (1987) Childhood diagnostic and neurophysiological predictors of teenage arrest rates: an eight-year prospective study. In: The causes of crime: New biological approaches, eds. S.A. Mednick, Terrie E. Moffitt, & S.A. Stack. Cambridge University Press. Scarr, S. & McCartney, K. How people make their own environments: A theory of genotype-environment effects. Child Development 54:424- 435. Schalling, D. (1987) Personality correlates of plasma testosterone levels in young delinquents: An example of person-situation interaction? In: The causes of crime: New biological approaches, eds. S.A. Mednick, Terrie E. Moffitt, & S.A. Stack. Cambridge University Press. Schulsinger, F. (1972) Psychopathy: Heredity and environment. International Journal of Mental Health 1:190-206. (Reprinted in 1977 in: Biosocial bases of criminal behavior, eds. S.A. Mednick & K.O. Christiansen. Gardner Press. Segal, N. L. (1991) Cooperation and competition in adolescent MZ and DZ twins during a Prisoner's Dilemma game. Presented at the Society for Research on Child Development, Seattle, WA. Seligman, M.E.P. (1970) On the generality of the laws of learning. Psychological Review 77:407-418. Shweder, R.A., Mahapatra, M. & Miller, J.G. (1987) Culture and moral development. In: The emergence of morality in young children, eds. J. Kagan & S. Lamb. Chicago University Press. Siegel, L.J. 
(1986) Criminology (2nd edition). West Pub. Silverton, L. (1988) Crime and the schizophrenia spectrum: A diathesis-stress model. Acta Psychiatrica Scandanavia 78:72-81. Simonian, S.J., Tarnowski, K.J. & Gibbs, J.C. (1991) Social skills and antisocial conduct of delinquents. Child Psychiatry and Human Development 22(1):17-27. Simons, R.L. (1993) Family context and developmental timing of delinquent behaviors. Presented at the 45th annual meeting of the American Society of Criminology, Phoenix, AZ. Sloman, L. (1992) How mood variation regulates aggression. Psycoloquy (an electronic journal) 3.1.1.3. Snyder, J., Dishion, T.J. & Patterson, G.R. (1986) Determinants and consequences of associating with deviant peers during preadolescence and adolescence. Journal of Early Adolescence 6(1):29-43. Snyder, J. & Patterson, G. (1990) Family interaction and delinquent behavior. In: Handbook of juvenile delinquency, ed. H.C. Quay. Wiley-Interscience Pub. Stabenau, J.R. (1985) Basic research on heredity and alcohol: Implications for clinical application. Social Biology 32(3-4):297- 321. Stattin, H. & Magnusson, D. (1991) Stability and change in criminal behavior up to age 30. The British Journal of Criminology 31(4):327- 346 Strauss, C.C. & Lahey, B.B. (1984) Behavior disorders of children. In: Comprehensive handbook of psychopathology, eds. Adams, H.E. & Sutker, P.B. Plenum Press. Surbey, M.K. (1987) Anorexia nervosa, amenorrhea, and adaptation. Ethology and Sociobiology 8:47S-61S. Susman, E.J., Inoff-Germain, G., Nottelmann, E.D., Loriaux, D.L., Cutler, G.B., Jr. & Chrousos, G.P. (1987) Hormones, emotional dispositions, and aggressive attributes in young adolescents. Child Development 58:1114-1134. Symons, D. (1979) The evolution of human sexuality. Oxford University Press. Tarter, R.E. (1988) Are there inherited behavioral traits that predispose to substance abuse? Journal of Consulting and Clinical Psychology 56(2):189-196. Tavris, C.A. 
(1982) Anger: The misunderstood emotion. Simon & Schuster Pub. Terhune, K.W. (1970) The effects of personality in cooperation and conflict. In: The structure of conflict, ed. P. Swingle. Academic Press. Thornhill, R. & Alcock, J. (1983) The evolution of insect mating systems. Harvard University Press. Thornhill, R. & Thornhill, N.W. (1992) The evolutionary psychology of men's coercive sexuality. Behavioral and Brain Sciences 15(2):363-375. Tooke, W. & Camire, L. (1991) Patterns of deception in intersexual and intrasexual mating strategies. Ethology and Sociobiology 12:345- 364. Traskman, L., Asberg, M., Bertilsson, L. & Sjostrand, L. (1981) Monoamine metanolites in CSF and suicidal behavior. Archives of General Psychiatry 38:631-636. Trasler, G. (1987) Biogenetic factors. In: Handbook of juvenile delinquency, ed. H.C. Quay. Wiley-Interscience Pub. Tremblay, R.E. (1993) Cognitive deficits, school achievement, disruptive behavior, and juvenile delinquency: A longitudinal look at their developmental sequence. Presented at the 45th annual meeting of the American Society of Criminology, Phoenix, AZ. Trivers, R.L. (1971) The evolution of reciprocal altruism. Quarterly Review of Biology 46:35-57. U.S. Department of Justice (1992) Criminal victimization in the United States, 1991: A national crime victim survey report (NCJ- 139563). U.S. Department of Justice (1993) A comprehensive strategy for serious, violent, and chronic juvenile offenders: Program summary (NCJ-143453). Udry, J.R. (1990) Biosocial models of adolescent problem behaviors. Social Biology 37:1-10. Vasek, M.E. (1986) Lying as a skill: The development of deception in children. In: Deception: Perspectives on human and nonhuman deceit, eds. R.W. Mitchell & N.S. Thompson. SUNY Press. Vila, B.J. & Cohen, L.E. (1993) Crime as strategy: Testing an evolutionary ecological theory of expropriative crime. American Journal of Sociology 98(4):873-912. Volavka, J., Mednick, S.A., Gabrielli, W.F., Matousek, M. 
& Pollock, V.E. (1984) EEG and crime: Evidence from longitudinal prospective studies. Advances in Biological Psychiatry 15:97-101. Wachs, T.D. (1992) The nature of nurture, Sage Pub. Wade, C. & Tavris, C. (1993) Psychology (3rd edition). Harper/Collins Pub. White, J.L., Moffitt, T.E., Earls, F., Robins, l. & Silva, P.A. (1990) How early can we tell?: Predictors of childhood conduct disorder and adolescent delinquency. Criminology 28(4):507-533. White, J.L., Moffitt, T.E. & Silva, P.A. (1989) A prospective replication of the protective effects of IQ in subjects at high risk for juvenile delinquency. Journal of Consulting and Clinical Psychology 57(6):719-724. Widom, C.S. (1976a) Interpersonal and personal construct systems in psychopaths. Journal of consulting and clinical psychology 44(4):614-623. Widom, C.S. (1976b) Interpersonal conflict and cooperation in psychopaths. Journal of Abnormal Psychology 85(3):330-334. Wilson, M. & Daly, M. (1993) Homicide as a window on modulated risk- proneness in competitive interpersonal confrontations. Presented at the 45th annual meeting of the American Society of Criminology, Phoenix, AZ. Wilson, J.Q. & Herrnstein, R.J. (1985) Crime and human nature. Simon & Schuster Pub. Wolf, P. (1987) Definitions of antisocial behavior in biosocial research. In: The causes of crime: New biological approaches, eds. S.A. Mednick, T. E. Moffitt, & S.A. Stack. Cambridge University Press. Zahn-Waxler, C. & Kochanska, G. (1988) The origins of guilt. In: Nebraska symposium on motivation, vol. 36, ed. R.A. Thompson. University of Nebraska Press. Zahn-Waxler, C. & Radke-Yarrow, M. (1982) The development of altruism: Alternative research strategies. In: The development of prosocial behavior, ed. N. Eisenberg. Academic Press. Ziskind, E., Syndulko, K. & Maltzman, I. (1978) Aversive conditioning in the sociopath. Pavlovian Journal of Biological Science 13(4):199-205. Zucker, R.A. & Gomberg, E.S.L. (1986) Etiology of alcoholism reconsidered. 
American Psychologist 41(7):783-793. Zuckerman, M. (1979) Sensation seeking. Lawrence Erlbaum Pub. Zuckerman, M. (1983) Biological bases of sensation-seeking, impulsivity and anxiety. Lawrence Erlbaum Pub. Zuckerman, M. (1984) Sensation seeking: A comparative approach to a human trait. Behavioral and Brain Sciences 7:413-471. Zuckerman, M. (1985) Biological foundations of the sensation-seeking temperament. In: The biological bases of personality and behavior, vol.1: Theories, measurement techniques, and development, eds. J. Strelau, F.H. Farley & A. Gale. Hemisphere Pub. Zuckerman, M. (1989) Personality in the third dimension: A psychobiological approach. Personality and Individual Differences 10(4):391-418. Zuckerman, M. (1990) The psychophysiology of sensation seeking. Journal of Personality 58:313-345. Zuckerman, M. (1991) The psychobiology of personality. Cambridge University Press. Zuckerman, M., Buchsbaum, M.S. & Murphy, D.L. (1980) Sensation seeking and its biological correlates. Psychological Bulletin 88:187-214. Zuckerman, M., DePaulo, B., & Rosenthal, R. (1981) Verbal and nonverbal communication of deception. In: Advances in experimental social psychology, ed. L. Berkowitz. Academic Press. From ljohnson at solution-consulting.com Fri Jan 13 01:49:27 2006 From: ljohnson at solution-consulting.com (Lynn D. Johnson, Ph.D.) Date: Thu, 12 Jan 2006 18:49:27 -0700 Subject: [Paleopsych] Atlas Sphere: Indict the New York Times In-Reply-To: References: Message-ID: <43C70727.1090104@solution-consulting.com> Frank, Regarding the question of spying on ourselves: Let me provide a counter-intuitive argument, only partly tongue in cheek. Perhaps the left-wing fear that government spies on us is actually salutory. It encourages us to reflect on the current confirmation hearings for Alito: Everything you have ever said will someday be scrutinized. So live life with that clear understanding. Privacy is over-rated as a freedom. Knowledge about others is interesting. 
(I love gossip!) It is the constraints on behavior that we cannot tolerate. Recall that SCOTUS invented a right to privacy so people in Connecticut could buy birth control. Privacy was not the real issue. The issue was that government was trying to control things it shouldn't. To get a decision, the Supremes had to invent a new right. In the early 1970s, my father was puzzled about people objecting to Nixon's wiretaps. "Why does anyone object?" he asked. "You shouldn't oughta be sayin' things on the telephone that you are ashamed of anyway." Dad lived his life with that notion, that he had nothing to hide. It is a wonderfully freeing life. I think things like Holzer proposes are actually good for democracy, since they help the democracy have an open dialog about important issues, including the role of secrecy vs. openness. As David Horowitz says, dialog between disparate entities invigorates democracy. Such a prosecution would be exciting to watch. Lynn Lynn D. Johnson, Ph.D. Solutions Consulting Group 166 East 5900 South, Ste. B-108 Salt Lake City, UT 84107 Tel: (801) 261-1412; Fax: (801) 288-2269 Check out our webpage: www.solution-consulting.com Feeling upset? Order Get On The Peace Train, my new solution-oriented book on negative emotions. Premise Checker wrote: > Indict the New York Times > http://www.theatlasphere.com/columns/printer_050111-holzer-indict-nytimes.php > > > [Go ahead and critique, folks. I'll send out your thoughts, unless you > specifically ask me not to. It's not clear to me how our "national > security" was compromised. Aren't the citizens part of this "nation," > and didn't Amendment IV speak of the "right of the people to be SECURE > in their persons, houses, papers, and effects, against unreasonable > searches and seizures"? Or does the "nation's" security extend only to > our rulers? > > [So it is unclear whether the Espionage Act violates the Bill of > Rights, even if past Supreme Courts have allowed certain somewhat > similar powers. 
And did Congress intend the Bush administration to do > what it did? Or is the intention of Congress something the Executive > Branch can determine for itself at will? Is the United States a nation > of laws (Congress) or of Presidential decrees? It is also unclear to > me whether the Bush administration violated the laws governing the NSA. > > [Share with me your other objections. Basically, domestic spying is > open to serious abuses and it must be checked. I have not been > following the details and so don't know just how various members of > Congress have > reacted, but it has not been one of ringing endorsement. > > [Mr. Jefferson said if he had to choose between newspapers and no > government or government and no newspapers, he would unhesitatingly > choose the former. > > [I don't know what restrictions on the press I would allow in the > unlikely event that the United States got into a war justifiably. Help!] > > [And just how did Hanoi Jane jeopardize our national security? Did it > mean that North Vietnam was more likely to invade the United States? > She was hardly the only critic of our foreign policy.] > > Opinion Editorial > Indict the New York Times > By Henry Mark Holzer > Jan 10, 2006 > > It is an article of faith on the Left and among its fellow > travelers that the Bush administration stole two elections, made > war on Iraq for venal reasons, tortured hapless foreigners, and > conducted illegal surveillance of innocent Americans. > > A corollary of this mindset is that the press, primarily the > Washington Post and The New York Times, has a right, indeed a duty, > to print whatever they want about the administration even if the > information compromises national security. > > Not true. The press is not exempt from laws that apply to everyone > else. The press is not exempt from laws protecting our national > security. The New York Times is not exempt from the Espionage Act, > as we shall see in a moment. 
> > But first, it's necessary to understand what an indictment of the > Times does not involve. > > First, an Espionage Act indictment of The New York Times would not > even remotely constitute an attack on a free press. As Justice > White wrote in Branzburg v. Hayes, "[i]t would be frivolous to > assert ... that the First Amendment, in the interest of securing > news or otherwise, confers a license on either the reporter or his > news sources to violate valid criminal laws." > > Nor would an indictment of the Times constitute an attempt to > restrain it from publishing news. The anti-anti-terrorists who seek > to justify the Times' revealing the NSA's domestic surveillance > program and thus prevent their flagship paper from being indicted, > rely on a Supreme Court decision entitled New York Times Company v. > United States, better known as the Pentagon Papers Case. Their > reliance is misplaced. > > In 1971 a disgruntled anti-war activist delivered a classified > study, "History of U.S. Decision-Making Process on Viet Nam Policy," to > The New York Times and the Washington Post. The government sued to > enjoin publication, seeking to impose a prior restraint. > > If there are any fundamental principles in modern First Amendment > law, one is that the burden on government to restrain publication > (as compared, for example, with later punishing its publication) is > extremely heavy. Accordingly, in a 6-3 decision, the Court ruled > for the newspapers, and the publication of the embarrassing > Pentagon Papers went ahead. > > Thus, New York Times Company v. United States, where the Court > rejected a government-sought prior restraint on publication, would > have no precedential value in a case where, after publication, the > government sought to punish the Times for violating the Espionage > Act. > > Third, not only was there no legal impediment to the NSA's domestic > surveillance program, there was abundant authority for it. 
The > President possesses broad powers as chief executive and Commander > in Chief under Article II of the Constitution. Congress has > repeatedly delegated considerable war-related powers to all presidents, > and especially, post-9/11, to President Bush. > > It was Congress that created and empowered the National Security > Agency. The Executive Branch's NSA domestic surveillance program, > aimed at obtaining intelligence about the foreign-based terrorist > war on the United States, was/is an integral element of our > national security policy and its implementation. No Supreme Court > decision has ever held that the > Presidentially/Congressionally sanctioned acquisition of that kind of > intelligence was constitutionally or otherwise prohibited. > > Accordingly, it is pointless to consider whether the NSA's domestic > surveillance program was legal. It was! If a case involving that > program ever reaches the Supreme Court, that's what its ruling will > be. > > Fourth, the interesting history of the Espionage Act is irrelevant > to whether the Times may have violated it. > > Finally, it is a waste of time to consider whether the Act is > constitutional. It has been expressly and impliedly held > constitutional more than once. > > This brings us to whether The New York Times is indictable (and > ultimately convictable) for violating the Espionage Act. > > The facts are clear. The NSA was engaged in highly classified > warrantless wiretaps of domestic subjects in connection with the > War on Terror, and the Times, a private newspaper, made that > information public. > > It is to those facts that the Espionage Act either applies, or does > not apply. > > Title 18, Section 793 of the United States Code provides that (e) > Whoever having unauthorized possession of ... any document ...
or > information relating to the national defense which information the > possessor has reason to believe could be used to the injury of the > United States or to the advantage of any foreign nation, willfully > communicates ... the same to any person not entitled to receive it > ... (f) ... [s]hall be fined under this title or imprisoned not > more than ten years, or both. (g) If two or more persons conspire > to violate any of the foregoing provisions of this section, and one > or more of such persons do any act to effect the object of the > conspiracy, each of the parties to such conspiracy shall be subject > to the punishment provided for the offense which is the object of > such conspiracy. (Section 794 is inapplicable. It deals with > gathering or delivering defense information to aid [a] foreign > government.) > > It is, said the United States Court of Appeals for the Fourth > Circuit in assessing Section 793(e) in United States v. Morison, > "difficult to conceive of any language more definite and clear." > Let's break down the statute into its component parts. > > Whoever: this would mean The New York Times Company, publisher > Arthur Sulzberger, Jr., editor Bill Keller, and anyone else privy > to the information upon which the story was based. > > Having unauthorized possession: the information was classified, and > the Times was not authorized to have it. > > Of any document ... or information: certainly the Times had > information, because it published it; it is inconceivable that the > newspaper did not have documents of some kind, because the > newspaper would never have gone that far out on a limb without at > least some corroboration beyond an oral report(s). > > Relating to the national defense: no comment is necessary; indeed, > the Times has conceded that targets of the warrantless wiretaps > were persons who may have had some connection to terrorists.
> > Which information the possessor has reason to believe could be used > to the injury of the United States or to the advantage of any > foreign nation: obviously the Times had reason to believe, because > it withheld the story for a year. > > Willfully communicates ... the same: no comment is necessary; the > story was front-page news. > > To any person not entitled to receive it: even the Times can't argue > that subway straphangers, or any other member of the public, was > entitled to receive information about the classified operations > of one of this country's most secret and highly protected > agencies. > > Several years ago Erika Holzer and I wrote a book entitled [1]Aid > and Comfort: Jane Fonda in North Vietnam, which proved that her > conduct in Hanoi made her indictable for, and convictable of, > treason. We discovered that she was not indicted because of a > political failure of will by the Nixon administration. > > To summarize a chapter of our book, suffice it to say that the > government was afraid to indict a popular anti-war actress who had > the support of the radical left. Even today, three decades after > Fonda's trip to North Vietnam and three years after the publication > of our book, we receive countless letters lamenting that Hanoi Jane > was never punished for her conduct. > > We tell them that it's too late, that any possibility of seeing > justice done for Fonda's traitorous conduct is long gone. That is > all the more reason why those of us who remember the Fonda episode, > and who understand the nature and importance of today's War on > Terror, should not rest until the government calls to account The > New York Times in a court of law, with an indictment and hopefully > a conviction, under the Espionage Act. > > [2]Henry Mark Holzer is a professor emeritus at Brooklyn Law School > and a constitutional and appellate lawyer. He provided legal > representation to Ayn Rand on a variety of matters in the 1960s.
> His latest book, Keeper of the Flame: The Supreme Court > Jurisprudence of Justice Clarence Thomas, will be published later > this year. > © Copyright 2004-5 by The Atlasphere LLC > > References > > 1. > http://www.amazon.com/exec/obidos/redirect?link_code=ur2&tag=theatlasphere-20&camp=1789&creative=9325&path=http%3A%2F%2Fwww.amazon.com%2Fgp%2Fproduct%2F078641247X > > 2. http://www.theatlasphere.com/directory/profile.php?id=12095 > >------------------------------------------------------------------------ > >_______________________________________________ >paleopsych mailing list >paleopsych at paleopsych.org >http://lists.paleopsych.org/mailman/listinfo/paleopsych > > From checker at panix.com Fri Jan 13 16:28:48 2006 From: checker at panix.com (Premise Checker) Date: Fri, 13 Jan 2006 11:28:48 -0500 (EST) Subject: [Paleopsych] Meme 056: Friedrich Nietzsche: Thus Spake Zarathustra Message-ID: Meme 056: Friedrich Nietzsche: Thus Spake Zarathustra: A Book for All and None (1883-5) sent 6.1.13 Translated by Thomas Common http://www.gutenberg.org/dirs/etext99/spzar10.txt [Nietzsche is the founder of transhumanism. [I detect seven translations of Also Sprach Zarathustra. I give the earliest publications I can find: Alexander Tille, London: H. Henry & Co., 1896. Thomas Common. Edinburgh and London: T.N. Foulis, 1907. Volume 11 of the "complete" works, prepared by Oscar Levy and put out between 1906 and 1927. Walter Kaufmann, by 1948. R.J. Hollingdale. Harmondsworth: Penguin, 1961. Thomas Wayne. NY: Algora Press, 2003. Stanley Applebaum (selections, with facing German). Dover, 2004. Clancy Martin. Barnes and Noble, 2005. Graham Parkes. Oxford UP, 2005. Adrian Del Caro. Cambridge UP later in 2006. [I haven't seen the Tille translation, but the one below is admirable, in that it uses a thundering [King James] Biblical-like style.
Kaufmann, in the introduction to his own translation, finds lots of objections to Common's translation, and it does read more smoothly. But smoothness is not the point. Kaufmann, reeling from the use of Nietzsche by the National Socialists, translates Übermensch as "overman," not "superman," trying to rescue Nietzsche from charges of antisemitism. In fact, _The Antichrist_ is the second most antisemitic book ever written. The Jews, on Nietzsche's account, founded the Sklavenmoral (slave morality), and the Christians merely augmented it. The most antisemitic book is, of course, the Bible, both Testaments of which excoriate the Jews relentlessly, though occasionally intermixed with praise.] CONTENTS. INTRODUCTION BY MRS FORSTER-NIETZSCHE [1905, moved to after the text] THUS SPAKE ZARATHUSTRA. FIRST PART. Zarathustra's Prologue. Zarathustra's Discourses. I. The Three Metamorphoses. II. The Academic Chairs of Virtue. III. Backworldsmen. IV. The Despisers of the Body. V. Joys and Passions. VI. The Pale Criminal. VII. Reading and Writing. VIII. The Tree on the Hill. IX. The Preachers of Death. X. War and Warriors. XI. The New Idol. XII. The Flies in the Market-place. XIII. Chastity. XIV. The Friend. XV. The Thousand and One Goals. XVI. Neighbour-Love. XVII. The Way of the Creating One. XVIII. Old and Young Women. XIX. The Bite of the Adder. XX. Child and Marriage. XXI. Voluntary Death. XXII. The Bestowing Virtue. SECOND PART. XXIII. The Child with the Mirror. XXIV. In the Happy Isles. XXV. The Pitiful. XXVI. The Priests. XXVII. The Virtuous. XXVIII. The Rabble. XXIX. The Tarantulas. XXX. The Famous Wise Ones. XXXI. The Night-Song. XXXII. The Dance-Song. XXXIII. The Grave-Song. XXXIV. Self-Surpassing. XXXV. The Sublime Ones. XXXVI. The Land of Culture. XXXVII. Immaculate Perception. XXXVIII. Scholars. XXXIX. Poets. XL. Great Events. XLI. The Soothsayer. XLII. Redemption. XLIII. Manly Prudence. XLIV. The Stillest Hour. THIRD PART. XLV. The Wanderer. XLVI. The Vision and the Enigma. XLVII. Involuntary Bliss. XLVIII.
Before Sunrise. XLIX. The Bedwarfing Virtue. L. On the Olive-Mount. LI. On Passing-by. LII. The Apostates. LIII. The Return Home. LIV. The Three Evil Things. LV. The Spirit of Gravity. LVI. Old and New Tables. LVII. The Convalescent. LVIII. The Great Longing. LIX. The Second Dance-Song. LX. The Seven Seals. FOURTH AND LAST PART. LXI. The Honey Sacrifice. LXII. The Cry of Distress. LXIII. Talk with the Kings. LXIV. The Leech. LXV. The Magician. LXVI. Out of Service. LXVII. The Ugliest Man. LXVIII. The Voluntary Beggar. LXIX. The Shadow. LXX. Noon-Tide. LXXI. The Greeting. LXXII. The Supper. LXXIII. The Higher Man. LXXIV. The Song of Melancholy. LXXV. Science. LXXVI. Among Daughters of the Desert. LXXVII. The Awakening. LXXVIII. The Ass-Festival. LXXIX. The Drunken Song. LXXX. The Sign. APPENDIX. Notes on "Thus Spake Zarathustra" by Anthony M. Ludovici (1909) THUS SPAKE ZARATHUSTRA. FIRST PART. ZARATHUSTRA'S DISCOURSES. ZARATHUSTRA'S PROLOGUE. 1. When Zarathustra was thirty years old, he left his home and the lake of his home, and went into the mountains. There he enjoyed his spirit and solitude, and for ten years did not weary of it. But at last his heart changed,--and rising one morning with the rosy dawn, he went before the sun, and spake thus unto it: Thou great star! What would be thy happiness if thou hadst not those for whom thou shinest! For ten years hast thou climbed hither unto my cave: thou wouldst have wearied of thy light and of the journey, had it not been for me, mine eagle, and my serpent. But we awaited thee every morning, took from thee thine overflow and blessed thee for it. Lo! I am weary of my wisdom, like the bee that hath gathered too much honey; I need hands outstretched to take it. I would fain bestow and distribute, until the wise have once more become joyous in their folly, and the poor happy in their riches.
Therefore must I descend into the deep: as thou doest in the evening, when thou goest behind the sea, and givest light also to the nether-world, thou exuberant star! Like thee must I GO DOWN, as men say, to whom I shall descend. Bless me, then, thou tranquil eye, that canst behold even the greatest happiness without envy! Bless the cup that is about to overflow, that the water may flow golden out of it, and carry everywhere the reflection of thy bliss! Lo! This cup is again going to empty itself, and Zarathustra is again going to be a man. Thus began Zarathustra's down-going. 2. Zarathustra went down the mountain alone, no one meeting him. When he entered the forest, however, there suddenly stood before him an old man, who had left his holy cot to seek roots. And thus spake the old man to Zarathustra: "No stranger to me is this wanderer: many years ago passed he by. Zarathustra he was called; but he hath altered. Then thou carriedst thine ashes into the mountains: wilt thou now carry thy fire into the valleys? Fearest thou not the incendiary's doom? Yea, I recognise Zarathustra. Pure is his eye, and no loathing lurketh about his mouth. Goeth he not along like a dancer? Altered is Zarathustra; a child hath Zarathustra become; an awakened one is Zarathustra: what wilt thou do in the land of the sleepers? As in the sea hast thou lived in solitude, and it hath borne thee up. Alas, wilt thou now go ashore? Alas, wilt thou again drag thy body thyself?" Zarathustra answered: "I love mankind." "Why," said the saint, "did I go into the forest and the desert? Was it not because I loved men far too well? Now I love God: men, I do not love. Man is a thing too imperfect for me. Love to man would be fatal to me." Zarathustra answered: "What spake I of love! I am bringing gifts unto men." "Give them nothing," said the saint. "Take rather part of their load, and carry it along with them--that will be most agreeable unto them: if only it be agreeable unto thee! 
If, however, thou wilt give unto them, give them no more than an alms, and let them also beg for it!" "No," replied Zarathustra, "I give no alms. I am not poor enough for that." The saint laughed at Zarathustra, and spake thus: "Then see to it that they accept thy treasures! They are distrustful of anchorites, and do not believe that we come with gifts. The fall of our footsteps ringeth too hollow through their streets. And just as at night, when they are in bed and hear a man abroad long before sunrise, so they ask themselves concerning us: Where goeth the thief? Go not to men, but stay in the forest! Go rather to the animals! Why not be like me--a bear amongst bears, a bird amongst birds?" "And what doeth the saint in the forest?" asked Zarathustra. The saint answered: "I make hymns and sing them; and in making hymns I laugh and weep and mumble: thus do I praise God. With singing, weeping, laughing, and mumbling do I praise the God who is my God. But what dost thou bring us as a gift?" When Zarathustra had heard these words, he bowed to the saint and said: "What should I have to give thee! Let me rather hurry hence lest I take aught away from thee!"--And thus they parted from one another, the old man and Zarathustra, laughing like schoolboys. When Zarathustra was alone, however, he said to his heart: "Could it be possible! This old saint in the forest hath not yet heard of it, that GOD IS DEAD!" 3. When Zarathustra arrived at the nearest town which adjoineth the forest, he found many people assembled in the market-place; for it had been announced that a rope-dancer would give a performance. And Zarathustra spake thus unto the people: I TEACH YOU THE SUPERMAN. Man is something that is to be surpassed. What have ye done to surpass man? All beings hitherto have created something beyond themselves: and ye want to be the ebb of that great tide, and would rather go back to the beast than surpass man? What is the ape to man? A laughing-stock, a thing of shame. 
And just the same shall man be to the Superman: a laughing-stock, a thing of shame. Ye have made your way from the worm to man, and much within you is still worm. Once were ye apes, and even yet man is more of an ape than any of the apes. Even the wisest among you is only a disharmony and hybrid of plant and phantom. But do I bid you become phantoms or plants? Lo, I teach you the Superman! The Superman is the meaning of the earth. Let your will say: The Superman SHALL BE the meaning of the earth! I conjure you, my brethren, REMAIN TRUE TO THE EARTH, and believe not those who speak unto you of superearthly hopes! Poisoners are they, whether they know it or not. Despisers of life are they, decaying ones and poisoned ones themselves, of whom the earth is weary: so away with them! Once blasphemy against God was the greatest blasphemy; but God died, and therewith also those blasphemers. To blaspheme the earth is now the dreadfulest sin, and to rate the heart of the unknowable higher than the meaning of the earth! Once the soul looked contemptuously on the body, and then that contempt was the supreme thing:--the soul wished the body meagre, ghastly, and famished. Thus it thought to escape from the body and the earth. Oh, that soul was itself meagre, ghastly, and famished; and cruelty was the delight of that soul! But ye, also, my brethren, tell me: What doth your body say about your soul? Is your soul not poverty and pollution and wretched self-complacency? Verily, a polluted stream is man. One must be a sea, to receive a polluted stream without becoming impure. Lo, I teach you the Superman: he is that sea; in him can your great contempt be submerged. What is the greatest thing ye can experience? It is the hour of great contempt. The hour in which even your happiness becometh loathsome unto you, and so also your reason and virtue. The hour when ye say: "What good is my happiness! It is poverty and pollution and wretched self-complacency.
But my happiness should justify existence itself!" The hour when ye say: "What good is my reason! Doth it long for knowledge as the lion for his food? It is poverty and pollution and wretched self-complacency!" The hour when ye say: "What good is my virtue! As yet it hath not made me passionate. How weary I am of my good and my bad! It is all poverty and pollution and wretched self-complacency!" The hour when ye say: "What good is my justice! I do not see that I am fervour and fuel. The just, however, are fervour and fuel!" The hour when ye say: "What good is my pity! Is not pity the cross on which he is nailed who loveth man? But my pity is not a crucifixion." Have ye ever spoken thus? Have ye ever cried thus? Ah! would that I had heard you crying thus! It is not your sin--it is your self-satisfaction that crieth unto heaven; your very sparingness in sin crieth unto heaven! Where is the lightning to lick you with its tongue? Where is the frenzy with which ye should be inoculated? Lo, I teach you the Superman: he is that lightning, he is that frenzy!-- When Zarathustra had thus spoken, one of the people called out: "We have now heard enough of the rope-dancer; it is time now for us to see him!" And all the people laughed at Zarathustra. But the rope-dancer, who thought the words applied to him, began his performance. 4. Zarathustra, however, looked at the people and wondered. Then he spake thus: Man is a rope stretched between the animal and the Superman--a rope over an abyss. A dangerous crossing, a dangerous wayfaring, a dangerous looking-back, a dangerous trembling and halting. What is great in man is that he is a bridge and not a goal: what is lovable in man is that he is an OVER-GOING and a DOWN-GOING. I love those that know not how to live except as down-goers, for they are the over-goers. I love the great despisers, because they are the great adorers, and arrows of longing for the other shore.
I love those who do not first seek a reason beyond the stars for going down and being sacrifices, but sacrifice themselves to the earth, that the earth of the Superman may hereafter arrive. I love him who liveth in order to know, and seeketh to know in order that the Superman may hereafter live. Thus seeketh he his own down-going. I love him who laboureth and inventeth, that he may build the house for the Superman, and prepare for him earth, animal, and plant: for thus seeketh he his own down-going. I love him who loveth his virtue: for virtue is the will to down-going, and an arrow of longing. I love him who reserveth no share of spirit for himself, but wanteth to be wholly the spirit of his virtue: thus walketh he as spirit over the bridge. I love him who maketh his virtue his inclination and destiny: thus, for the sake of his virtue, he is willing to live on, or live no more. I love him who desireth not too many virtues. One virtue is more of a virtue than two, because it is more of a knot for one's destiny to cling to. I love him whose soul is lavish, who wanteth no thanks and doth not give back: for he always bestoweth, and desireth not to keep for himself. I love him who is ashamed when the dice fall in his favour, and who then asketh: "Am I a dishonest player?"--for he is willing to succumb. I love him who scattereth golden words in advance of his deeds, and always doeth more than he promiseth: for he seeketh his own down-going. I love him who justifieth the future ones, and redeemeth the past ones: for he is willing to succumb through the present ones. I love him who chasteneth his God, because he loveth his God: for he must succumb through the wrath of his God. I love him whose soul is deep even in the wounding, and may succumb through a small matter: thus goeth he willingly over the bridge. I love him whose soul is so overfull that he forgetteth himself, and all things are in him: thus all things become his down-going. 
I love him who is of a free spirit and a free heart: thus is his head only the bowels of his heart; his heart, however, causeth his down-going. I love all who are like heavy drops falling one by one out of the dark cloud that lowereth over man: they herald the coming of the lightning, and succumb as heralds. Lo, I am a herald of the lightning, and a heavy drop out of the cloud: the lightning, however, is the SUPERMAN.-- 5. When Zarathustra had spoken these words, he again looked at the people, and was silent. "There they stand," said he to his heart; "there they laugh: they understand me not; I am not the mouth for these ears. Must one first batter their ears, that they may learn to hear with their eyes? Must one clatter like kettledrums and penitential preachers? Or do they only believe the stammerer? They have something whereof they are proud. What do they call it, that which maketh them proud? Culture, they call it; it distinguisheth them from the goatherds. They dislike, therefore, to hear of 'contempt' of themselves. So I will appeal to their pride. I will speak unto them of the most contemptible thing: that, however, is THE LAST MAN!" And thus spake Zarathustra unto the people: It is time for man to fix his goal. It is time for man to plant the germ of his highest hope. Still is his soil rich enough for it. But that soil will one day be poor and exhausted, and no lofty tree will any longer be able to grow thereon. Alas! there cometh the time when man will no longer launch the arrow of his longing beyond man--and the string of his bow will have unlearned to whizz! I tell you: one must still have chaos in one, to give birth to a dancing star. I tell you: ye have still chaos in you. Alas! There cometh the time when man will no longer give birth to any star. Alas! There cometh the time of the most despicable man, who can no longer despise himself. Lo! I show you THE LAST MAN. "What is love? What is creation? What is longing? 
What is a star?"--so asketh the last man and blinketh. The earth hath then become small, and on it there hoppeth the last man who maketh everything small. His species is ineradicable like that of the ground-flea; the last man liveth longest. "We have discovered happiness"--say the last men, and blink thereby. They have left the regions where it is hard to live; for they need warmth. One still loveth one's neighbour and rubbeth against him; for one needeth warmth. Turning ill and being distrustful, they consider sinful: they walk warily. He is a fool who still stumbleth over stones or men! A little poison now and then: that maketh pleasant dreams. And much poison at last for a pleasant death. One still worketh, for work is a pastime. But one is careful lest the pastime should hurt one. One no longer becometh poor or rich; both are too burdensome. Who still wanteth to rule? Who still wanteth to obey? Both are too burdensome. No shepherd, and one herd! Every one wanteth the same; every one is equal: he who hath other sentiments goeth voluntarily into the madhouse. "Formerly all the world was insane,"--say the subtlest of them, and blink thereby. They are clever and know all that hath happened: so there is no end to their raillery. People still fall out, but are soon reconciled--otherwise it spoileth their stomachs. They have their little pleasures for the day, and their little pleasures for the night, but they have a regard for health. "We have discovered happiness,"--say the last men, and blink thereby.-- And here ended the first discourse of Zarathustra, which is also called "The Prologue": for at this point the shouting and mirth of the multitude interrupted him. "Give us this last man, O Zarathustra,"--they called out--"make us into these last men! Then will we make thee a present of the Superman!" And all the people exulted and smacked their lips. Zarathustra, however, turned sad, and said to his heart: "They understand me not: I am not the mouth for these ears.
Too long, perhaps, have I lived in the mountains; too much have I hearkened unto the brooks and trees: now do I speak unto them as unto the goatherds. Calm is my soul, and clear, like the mountains in the morning. But they think me cold, and a mocker with terrible jests. And now do they look at me and laugh: and while they laugh they hate me too. There is ice in their laughter." 6. Then, however, something happened which made every mouth mute and every eye fixed. In the meantime, of course, the rope-dancer had commenced his performance: he had come out at a little door, and was going along the rope which was stretched between two towers, so that it hung above the market-place and the people. When he was just midway across, the little door opened once more, and a gaudily-dressed fellow like a buffoon sprang out, and went rapidly after the first one. "Go on, halt-foot," cried his frightful voice, "go on, lazy-bones, interloper, sallow-face!--lest I tickle thee with my heel! What dost thou here between the towers? In the tower is the place for thee, thou shouldst be locked up; to one better than thyself thou blockest the way!"--And with every word he came nearer and nearer the first one. When, however, he was but a step behind, there happened the frightful thing which made every mouth mute and every eye fixed--he uttered a yell like a devil, and jumped over the other who was in his way. The latter, however, when he thus saw his rival triumph, lost at the same time his head and his footing on the rope; he threw his pole away, and shot downwards faster than it, like an eddy of arms and legs, into the depth. The market-place and the people were like the sea when the storm cometh on: they all flew apart and in disorder, especially where the body was about to fall. Zarathustra, however, remained standing, and just beside him fell the body, badly injured and disfigured, but not yet dead. 
After a while consciousness returned to the shattered man, and he saw Zarathustra kneeling beside him. "What art thou doing there?" said he at last, "I knew long ago that the devil would trip me up. Now he draggeth me to hell: wilt thou prevent him?" "On mine honour, my friend," answered Zarathustra, "there is nothing of all that whereof thou speakest: there is no devil and no hell. Thy soul will be dead even sooner than thy body: fear, therefore, nothing any more!" The man looked up distrustfully. "If thou speakest the truth," said he, "I lose nothing when I lose my life. I am not much more than an animal which hath been taught to dance by blows and scanty fare." "Not at all," said Zarathustra, "thou hast made danger thy calling; therein there is nothing contemptible. Now thou perishest by thy calling: therefore will I bury thee with mine own hands." When Zarathustra had said this the dying one did not reply further; but he moved his hand as if he sought the hand of Zarathustra in gratitude. 7. Meanwhile the evening came on, and the market-place veiled itself in gloom. Then the people dispersed, for even curiosity and terror become fatigued. Zarathustra, however, still sat beside the dead man on the ground, absorbed in thought: so he forgot the time. But at last it became night, and a cold wind blew upon the lonely one. Then arose Zarathustra and said to his heart: Verily, a fine catch of fish hath Zarathustra made to-day! It is not a man he hath caught, but a corpse. Sombre is human life, and as yet without meaning: a buffoon may be fateful to it. I want to teach men the sense of their existence, which is the Superman, the lightning out of the dark cloud--man. But still am I far from them, and my sense speaketh not unto their sense. To men I am still something between a fool and a corpse. Gloomy is the night, gloomy are the ways of Zarathustra. Come, thou cold and stiff companion! I carry thee to the place where I shall bury thee with mine own hands. 8. 
When Zarathustra had said this to his heart, he put the corpse upon his shoulders and set out on his way. Yet had he not gone a hundred steps, when there stole a man up to him and whispered in his ear--and lo! he that spake was the buffoon from the tower. "Leave this town, O Zarathustra," said he, "there are too many here who hate thee. The good and just hate thee, and call thee their enemy and despiser; the believers in the orthodox belief hate thee, and call thee a danger to the multitude. It was thy good fortune to be laughed at: and verily thou spakest like a buffoon. It was thy good fortune to associate with the dead dog; by so humiliating thyself thou hast saved thy life to-day. Depart, however, from this town,--or tomorrow I shall jump over thee, a living man over a dead one." And when he had said this, the buffoon vanished; Zarathustra, however, went on through the dark streets. At the gate of the town the grave-diggers met him: they shone their torch on his face, and, recognising Zarathustra, they sorely derided him. "Zarathustra is carrying away the dead dog: a fine thing that Zarathustra hath turned a grave-digger! For our hands are too cleanly for that roast. Will Zarathustra steal the bite from the devil? Well then, good luck to the repast! If only the devil is not a better thief than Zarathustra!--he will steal them both, he will eat them both!" And they laughed among themselves, and put their heads together. Zarathustra made no answer thereto, but went on his way. When he had gone on for two hours, past forests and swamps, he had heard too much of the hungry howling of the wolves, and he himself became a-hungry. So he halted at a lonely house in which a light was burning. "Hunger attacketh me," said Zarathustra, "like a robber. Among forests and swamps my hunger attacketh me, and late in the night. "Strange humours hath my hunger. Often it cometh to me only after a repast, and all day it hath failed to come: where hath it been?" 
And thereupon Zarathustra knocked at the door of the house. An old man appeared, who carried a light, and asked: "Who cometh unto me and my bad sleep?" "A living man and a dead one," said Zarathustra. "Give me something to eat and drink, I forgot it during the day. He that feedeth the hungry refresheth his own soul, saith wisdom." The old man withdrew, but came back immediately and offered Zarathustra bread and wine. "A bad country for the hungry," said he; "that is why I live here. Animal and man come unto me, the anchorite. But bid thy companion eat and drink also, he is wearier than thou." Zarathustra answered: "My companion is dead; I shall hardly be able to persuade him to eat." "That doth not concern me," said the old man sullenly; "he that knocketh at my door must take what I offer him. Eat, and fare ye well!"-- Thereafter Zarathustra again went on for two hours, trusting to the path and the light of the stars: for he was an experienced night-walker, and liked to look into the face of all that slept. When the morning dawned, however, Zarathustra found himself in a thick forest, and no path was any longer visible. He then put the dead man in a hollow tree at his head--for he wanted to protect him from the wolves--and laid himself down on the ground and moss. And immediately he fell asleep, tired in body, but with a tranquil soul. 9. Long slept Zarathustra; and not only the rosy dawn passed over his head, but also the morning. At last, however, his eyes opened, and amazedly he gazed into the forest and the stillness, amazedly he gazed into himself. Then he arose quickly, like a seafarer who all at once seeth the land; and he shouted for joy: for he saw a new truth. And he spake thus to his heart: A light hath dawned upon me: I need companions--living ones; not dead companions and corpses, which I carry with me where I will. But I need living companions, who will follow me because they want to follow themselves--and to the place where I will. 
A light hath dawned upon me. Not to the people is Zarathustra to speak, but to companions! Zarathustra shall not be the herd's herdsman and hound! To allure many from the herd--for that purpose have I come. The people and the herd must be angry with me: a robber shall Zarathustra be called by the herdsmen. Herdsmen, I say, but they call themselves the good and just. Herdsmen, I say, but they call themselves the believers in the orthodox belief. Behold the good and just! Whom do they hate most? Him who breaketh up their tables of values, the breaker, the lawbreaker:--he, however, is the creator. Behold the believers of all beliefs! Whom do they hate most? Him who breaketh up their tables of values, the breaker, the law-breaker--he, however, is the creator. Companions, the creator seeketh, not corpses--and not herds or believers either. Fellow-creators the creator seeketh--those who grave new values on new tables. Companions, the creator seeketh, and fellow-reapers: for everything is ripe for the harvest with him. But he lacketh the hundred sickles: so he plucketh the ears of corn and is vexed. Companions, the creator seeketh, and such as know how to whet their sickles. Destroyers, will they be called, and despisers of good and evil. But they are the reapers and rejoicers. Fellow-creators, Zarathustra seeketh; fellow-reapers and fellow-rejoicers, Zarathustra seeketh: what hath he to do with herds and herdsmen and corpses! And thou, my first companion, rest in peace! Well have I buried thee in thy hollow tree; well have I hid thee from the wolves. But I part from thee; the time hath arrived. 'Twixt rosy dawn and rosy dawn there came unto me a new truth. I am not to be a herdsman, I am not to be a grave-digger. Not any more will I discourse unto the people; for the last time have I spoken unto the dead. With the creators, the reapers, and the rejoicers will I associate: the rainbow will I show them, and all the stairs to the Superman. 
To the lone-dwellers will I sing my song, and to the twain-dwellers; and unto him who hath still ears for the unheard, will I make the heart heavy with my happiness. I make for my goal, I follow my course; over the loitering and tardy will I leap. Thus let my on-going be their down-going! 10. This had Zarathustra said to his heart when the sun stood at noon-tide. Then he looked inquiringly aloft,--for he heard above him the sharp call of a bird. And behold! An eagle swept through the air in wide circles, and on it hung a serpent, not like a prey, but like a friend: for it kept itself coiled round the eagle's neck. "They are mine animals," said Zarathustra, and rejoiced in his heart. "The proudest animal under the sun, and the wisest animal under the sun,--they have come out to reconnoitre. They want to know whether Zarathustra still liveth. Verily, do I still live? More dangerous have I found it among men than among animals; in dangerous paths goeth Zarathustra. Let mine animals lead me!" When Zarathustra had said this, he remembered the words of the saint in the forest. Then he sighed and spake thus to his heart: "Would that I were wiser! Would that I were wise from the very heart, like my serpent! But I am asking the impossible. Therefore do I ask my pride to go always with my wisdom! And if my wisdom should some day forsake me:--alas! it loveth to fly away!--may my pride then fly with my folly!" Thus began Zarathustra's down-going. ZARATHUSTRA'S DISCOURSES. I. THE THREE METAMORPHOSES. Three metamorphoses of the spirit do I designate to you: how the spirit becometh a camel, the camel a lion, and the lion at last a child. Many heavy things are there for the spirit, the strong load-bearing spirit in which reverence dwelleth: for the heavy and the heaviest longeth its strength. What is heavy? so asketh the load-bearing spirit; then kneeleth it down like the camel, and wanteth to be well laden. What is the heaviest thing, ye heroes? 
asketh the load-bearing spirit, that I may take it upon me and rejoice in my strength. Is it not this: To humiliate oneself in order to mortify one's pride? To exhibit one's folly in order to mock at one's wisdom? Or is it this: To desert our cause when it celebrateth its triumph? To ascend high mountains to tempt the tempter? Or is it this: To feed on the acorns and grass of knowledge, and for the sake of truth to suffer hunger of soul? Or is it this: To be sick and dismiss comforters, and make friends of the deaf, who never hear thy requests? Or is it this: To go into foul water when it is the water of truth, and not disclaim cold frogs and hot toads? Or is it this: To love those who despise us, and give one's hand to the phantom when it is going to frighten us? All these heaviest things the load-bearing spirit taketh upon itself: and like the camel, which, when laden, hasteneth into the wilderness, so hasteneth the spirit into its wilderness. But in the loneliest wilderness happeneth the second metamorphosis: here the spirit becometh a lion; freedom will it capture, and lordship in its own wilderness. Its last Lord it here seeketh: hostile will it be to him, and to its last God; for victory will it struggle with the great dragon. What is the great dragon which the spirit is no longer inclined to call Lord and God? "Thou-shalt," is the great dragon called. But the spirit of the lion saith, "I will." "Thou-shalt," lieth in its path, sparkling with gold--a scale-covered beast; and on every scale glittereth golden, "Thou shalt!" The values of a thousand years glitter on those scales, and thus speaketh the mightiest of all dragons: "All the values of things--glitter on me. All values have already been created, and all created values--do I represent. Verily, there shall be no 'I will' any more." Thus speaketh the dragon. My brethren, wherefore is there need of the lion in the spirit? Why sufficeth not the beast of burden, which renounceth and is reverent? 
To create new values--that, even the lion cannot yet accomplish: but to create itself freedom for new creating--that can the might of the lion do. To create itself freedom, and give a holy Nay even unto duty: for that, my brethren, there is need of the lion. To assume the right to new values--that is the most formidable assumption for a load-bearing and reverent spirit. Verily, unto such a spirit it is preying, and the work of a beast of prey. As its holiest, it once loved "Thou-shalt": now is it forced to find illusion and arbitrariness even in the holiest things, that it may capture freedom from its love: the lion is needed for this capture. But tell me, my brethren, what the child can do, which even the lion could not do? Why hath the preying lion still to become a child? Innocence is the child, and forgetfulness, a new beginning, a game, a self-rolling wheel, a first movement, a holy Yea. Aye, for the game of creating, my brethren, there is needed a holy Yea unto life: ITS OWN will, willeth now the spirit; HIS OWN world winneth the world's outcast. Three metamorphoses of the spirit have I designated to you: how the spirit became a camel, the camel a lion, and the lion at last a child.-- Thus spake Zarathustra. And at that time he abode in the town which is called The Pied Cow. II. THE ACADEMIC CHAIRS OF VIRTUE. People commended unto Zarathustra a wise man, as one who could discourse well about sleep and virtue: greatly was he honoured and rewarded for it, and all the youths sat before his chair. To him went Zarathustra, and sat among the youths before his chair. And thus spake the wise man: Respect and modesty in presence of sleep! That is the first thing! And to go out of the way of all who sleep badly and keep awake at night! Modest is even the thief in presence of sleep: he always stealeth softly through the night. Immodest, however, is the night-watchman; immodestly he carrieth his horn. 
No small art is it to sleep: it is necessary for that purpose to keep awake all day. Ten times a day must thou overcome thyself: that causeth wholesome weariness, and is poppy to the soul. Ten times must thou reconcile again with thyself; for overcoming is bitterness, and badly sleep the unreconciled. Ten truths must thou find during the day; otherwise wilt thou seek truth during the night, and thy soul will have been hungry. Ten times must thou laugh during the day, and be cheerful; otherwise thy stomach, the father of affliction, will disturb thee in the night. Few people know it, but one must have all the virtues in order to sleep well. Shall I bear false witness? Shall I commit adultery? Shall I covet my neighbour's maidservant? All that would ill accord with good sleep. And even if one have all the virtues, there is still one thing needful: to send the virtues themselves to sleep at the right time. That they may not quarrel with one another, the good females! And about thee, thou unhappy one! Peace with God and thy neighbour: so desireth good sleep. And peace also with thy neighbour's devil! Otherwise it will haunt thee in the night. Honour to the government, and obedience, and also to the crooked government! So desireth good sleep. How can I help it, if power like to walk on crooked legs? He who leadeth his sheep to the greenest pasture, shall always be for me the best shepherd: so doth it accord with good sleep. Many honours I want not, nor great treasures: they excite the spleen. But it is bad sleeping without a good name and a little treasure. A small company is more welcome to me than a bad one: but they must come and go at the right time. So doth it accord with good sleep. Well, also, do the poor in spirit please me: they promote sleep. Blessed are they, especially if one always give in to them. Thus passeth the day unto the virtuous. When night cometh, then take I good care not to summon sleep. 
It disliketh to be summoned--sleep, the lord of the virtues! But I think of what I have done and thought during the day. Thus ruminating, patient as a cow, I ask myself: What were thy ten overcomings? And what were the ten reconciliations, and the ten truths, and the ten laughters with which my heart enjoyed itself? Thus pondering, and cradled by forty thoughts, it overtaketh me all at once--sleep, the unsummoned, the lord of the virtues. Sleep tappeth on mine eye, and it turneth heavy. Sleep toucheth my mouth, and it remaineth open. Verily, on soft soles doth it come to me, the dearest of thieves, and stealeth from me my thoughts: stupid do I then stand, like this academic chair. But not much longer do I then stand: I already lie.-- When Zarathustra heard the wise man thus speak, he laughed in his heart: for thereby had a light dawned upon him. And thus spake he to his heart: A fool seemeth this wise man with his forty thoughts: but I believe he knoweth well how to sleep. Happy even is he who liveth near this wise man! Such sleep is contagious-- even through a thick wall it is contagious. A magic resideth even in his academic chair. And not in vain did the youths sit before the preacher of virtue. His wisdom is to keep awake in order to sleep well. And verily, if life had no sense, and had I to choose nonsense, this would be the desirablest nonsense for me also. Now know I well what people sought formerly above all else when they sought teachers of virtue. Good sleep they sought for themselves, and poppy-head virtues to promote it! To all those belauded sages of the academic chairs, wisdom was sleep without dreams: they knew no higher significance of life. Even at present, to be sure, there are some like this preacher of virtue, and not always so honourable: but their time is past. And not much longer do they stand: there they already lie. Blessed are those drowsy ones: for they shall soon nod to sleep.-- Thus spake Zarathustra. III. BACKWORLDSMEN. 
Once on a time, Zarathustra also cast his fancy beyond man, like all backworldsmen. The work of a suffering and tortured God, did the world then seem to me. The dream--and diction--of a God, did the world then seem to me; coloured vapours before the eyes of a divinely dissatisfied one. Good and evil, and joy and woe, and I and thou--coloured vapours did they seem to me before creative eyes. The creator wished to look away from himself,--thereupon he created the world. Intoxicating joy is it for the sufferer to look away from his suffering and forget himself. Intoxicating joy and self-forgetting, did the world once seem to me. This world, the eternally imperfect, an eternal contradiction's image and imperfect image--an intoxicating joy to its imperfect creator:--thus did the world once seem to me. Thus, once on a time, did I also cast my fancy beyond man, like all backworldsmen. Beyond man, forsooth? Ah, ye brethren, that God whom I created was human work and human madness, like all the Gods! A man was he, and only a poor fragment of a man and ego. Out of mine own ashes and glow it came unto me, that phantom. And verily, it came not unto me from the beyond! What happened, my brethren? I surpassed myself, the suffering one; I carried mine own ashes to the mountain; a brighter flame I contrived for myself. And lo! Thereupon the phantom WITHDREW from me! To me the convalescent would it now be suffering and torment to believe in such phantoms: suffering would it now be to me, and humiliation. Thus speak I to backworldsmen. Suffering was it, and impotence--that created all backworlds; and the short madness of happiness, which only the greatest sufferer experienceth. Weariness, which seeketh to get to the ultimate with one leap, with a death-leap; a poor ignorant weariness, unwilling even to will any longer: that created all Gods and backworlds. Believe me, my brethren! 
It was the body which despaired of the body--it groped with the fingers of the infatuated spirit at the ultimate walls. Believe me, my brethren! It was the body which despaired of the earth--it heard the bowels of existence speaking unto it. And then it sought to get through the ultimate walls with its head--and not with its head only--into "the other world." But that "other world" is well concealed from man, that dehumanised, inhuman world, which is a celestial naught; and the bowels of existence do not speak unto man, except as man. Verily, it is difficult to prove all being, and hard to make it speak. Tell me, ye brethren, is not the strangest of all things best proved? Yea, this ego, with its contradiction and perplexity, speaketh most uprightly of its being--this creating, willing, evaluing ego, which is the measure and value of things. And this most upright existence, the ego--it speaketh of the body, and still implieth the body, even when it museth and raveth and fluttereth with broken wings. Always more uprightly learneth it to speak, the ego; and the more it learneth, the more doth it find titles and honours for the body and the earth. A new pride taught me mine ego, and that teach I unto men: no longer to thrust one's head into the sand of celestial things, but to carry it freely, a terrestrial head, which giveth meaning to the earth! A new will teach I unto men: to choose that path which man hath followed blindly, and to approve of it--and no longer to slink aside from it, like the sick and perishing! The sick and perishing--it was they who despised the body and the earth, and invented the heavenly world, and the redeeming blood-drops; but even those sweet and sad poisons they borrowed from the body and the earth! From their misery they sought escape, and the stars were too remote for them. Then they sighed: "O that there were heavenly paths by which to steal into another existence and into happiness!" 
Then they contrived for themselves their by-paths and bloody draughts! Beyond the sphere of their body and this earth they now fancied themselves transported, these ungrateful ones. But to what did they owe the convulsion and rapture of their transport? To their body and this earth. Gentle is Zarathustra to the sickly. Verily, he is not indignant at their modes of consolation and ingratitude. May they become convalescents and overcomers, and create higher bodies for themselves! Neither is Zarathustra indignant at a convalescent who looketh tenderly on his delusions, and at midnight stealeth round the grave of his God; but sickness and a sick frame remain even in his tears. Many sickly ones have there always been among those who muse, and languish for God; violently they hate the discerning ones, and the latest of virtues, which is uprightness. Backward they always gaze toward dark ages: then, indeed, were delusion and faith something different. Raving of the reason was likeness to God, and doubt was sin. Too well do I know those godlike ones: they insist on being believed in, and that doubt is sin. Too well, also, do I know what they themselves most believe in. Verily, not in backworlds and redeeming blood-drops: but in the body do they also believe most; and their own body is for them the thing-in-itself. But it is a sickly thing to them, and gladly would they get out of their skin. Therefore hearken they to the preachers of death, and themselves preach backworlds. Hearken rather, my brethren, to the voice of the healthy body; it is a more upright and pure voice. More uprightly and purely speaketh the healthy body, perfect and square-built; and it speaketh of the meaning of the earth.-- Thus spake Zarathustra. IV. THE DESPISERS OF THE BODY. To the despisers of the body will I speak my word. I wish them neither to learn afresh, nor teach anew, but only to bid farewell to their own bodies,--and thus be dumb. "Body am I, and soul"--so saith the child. 
And why should one not speak like children? But the awakened one, the knowing one, saith: "Body am I entirely, and nothing more; and soul is only the name of something in the body." The body is a big sagacity, a plurality with one sense, a war and a peace, a flock and a shepherd. An instrument of thy body is also thy little sagacity, my brother, which thou callest "spirit"--a little instrument and plaything of thy big sagacity. "Ego," sayest thou, and art proud of that word. But the greater thing--in which thou art unwilling to believe--is thy body with its big sagacity; it saith not "ego," but doeth it. What the sense feeleth, what the spirit discerneth, hath never its end in itself. But sense and spirit would fain persuade thee that they are the end of all things: so vain are they. Instruments and playthings are sense and spirit: behind them there is still the Self. The Self seeketh with the eyes of the senses, it hearkeneth also with the ears of the spirit. Ever hearkeneth the Self, and seeketh; it compareth, mastereth, conquereth, and destroyeth. It ruleth, and is also the ego's ruler. Behind thy thoughts and feelings, my brother, there is a mighty lord, an unknown sage--it is called Self; it dwelleth in thy body, it is thy body. There is more sagacity in thy body than in thy best wisdom. And who then knoweth why thy body requireth just thy best wisdom? Thy Self laugheth at thine ego, and its proud prancings. "What are these prancings and flights of thought unto me?" it saith to itself. "A by-way to my purpose. I am the leading-string of the ego, and the prompter of its notions." The Self saith unto the ego: "Feel pain!" And thereupon it suffereth, and thinketh how it may put an end thereto--and for that very purpose it IS MEANT to think. The Self saith unto the ego: "Feel pleasure!" Thereupon it rejoiceth, and thinketh how it may ofttimes rejoice--and for that very purpose it IS MEANT to think. To the despisers of the body will I speak a word. 
That they despise is caused by their esteem. What is it that created esteeming and despising and worth and will? The creating Self created for itself esteeming and despising, it created for itself joy and woe. The creating body created for itself spirit, as a hand to its will. Even in your folly and despising ye each serve your Self, ye despisers of the body. I tell you, your very Self wanteth to die, and turneth away from life. No longer can your Self do that which it desireth most:--create beyond itself. That is what it desireth most; that is all its fervour. But it is now too late to do so:--so your Self wisheth to succumb, ye despisers of the body. To succumb--so wisheth your Self; and therefore have ye become despisers of the body. For ye can no longer create beyond yourselves. And therefore are ye now angry with life and with the earth. And unconscious envy is in the sidelong look of your contempt. I go not your way, ye despisers of the body! Ye are no bridges for me to the Superman!-- Thus spake Zarathustra. V. JOYS AND PASSIONS. My brother, when thou hast a virtue, and it is thine own virtue, thou hast it in common with no one. To be sure, thou wouldst call it by name and caress it; thou wouldst pull its ears and amuse thyself with it. And lo! Then hast thou its name in common with the people, and hast become one of the people and the herd with thy virtue! Better for thee to say: "Ineffable is it, and nameless, that which is pain and sweetness to my soul, and also the hunger of my bowels." Let thy virtue be too high for the familiarity of names, and if thou must speak of it, be not ashamed to stammer about it. Thus speak and stammer: "That is MY good, that do I love, thus doth it please me entirely, thus only do _I_ desire the good. Not as the law of a God do I desire it, not as a human law or a human need do I desire it; it is not to be a guide-post for me to superearths and paradises. 
An earthly virtue is it which I love: little prudence is therein, and the least everyday wisdom. But that bird built its nest beside me: therefore, I love and cherish it--now sitteth it beside me on its golden eggs." Thus shouldst thou stammer, and praise thy virtue. Once hadst thou passions and calledst them evil. But now hast thou only thy virtues: they grew out of thy passions. Thou implantedst thy highest aim into the heart of those passions: then became they thy virtues and joys. And though thou wert of the race of the hot-tempered, or of the voluptuous, or of the fanatical, or the vindictive; All thy passions in the end became virtues, and all thy devils angels. Once hadst thou wild dogs in thy cellar: but they changed at last into birds and charming songstresses. Out of thy poisons brewedst thou balsam for thyself; thy cow, affliction, milkedst thou--now drinkest thou the sweet milk of her udder. And nothing evil groweth in thee any longer, unless it be the evil that groweth out of the conflict of thy virtues. My brother, if thou be fortunate, then wilt thou have one virtue and no more: thus goest thou easier over the bridge. Illustrious is it to have many virtues, but a hard lot; and many a one hath gone into the wilderness and killed himself, because he was weary of being the battle and battlefield of virtues. My brother, are war and battle evil? Necessary, however, is the evil; necessary are the envy and the distrust and the back-biting among the virtues. Lo! how each of thy virtues is covetous of the highest place; it wanteth thy whole spirit to be ITS herald, it wanteth thy whole power, in wrath, hatred, and love. Jealous is every virtue of the others, and a dreadful thing is jealousy. Even virtues may succumb by jealousy. He whom the flame of jealousy encompasseth, turneth at last, like the scorpion, the poisoned sting against himself. Ah! my brother, hast thou never seen a virtue backbite and stab itself? 
Man is something that hath to be surpassed: and therefore shalt thou love thy virtues,--for thou wilt succumb by them.-- Thus spake Zarathustra. VI. THE PALE CRIMINAL. Ye do not mean to slay, ye judges and sacrificers, until the animal hath bowed its head? Lo! the pale criminal hath bowed his head: out of his eye speaketh the great contempt. "Mine ego is something which is to be surpassed: mine ego is to me the great contempt of man": so speaketh it out of that eye. When he judged himself--that was his supreme moment; let not the exalted one relapse again into his low estate! There is no salvation for him who thus suffereth from himself, unless it be speedy death. Your slaying, ye judges, shall be pity, and not revenge; and in that ye slay, see to it that ye yourselves justify life! It is not enough that ye should reconcile with him whom ye slay. Let your sorrow be love to the Superman: thus will ye justify your own survival! "Enemy" shall ye say but not "villain," "invalid" shall ye say but not "wretch," "fool" shall ye say but not "sinner." And thou, red judge, if thou wouldst say audibly all thou hast done in thought, then would every one cry: "Away with the nastiness and the virulent reptile!" But one thing is the thought, another thing is the deed, and another thing is the idea of the deed. The wheel of causality doth not roll between them. An idea made this pale man pale. Adequate was he for his deed when he did it, but the idea of it, he could not endure when it was done. Evermore did he now see himself as the doer of one deed. Madness, I call this: the exception reversed itself to the rule in him. The streak of chalk bewitcheth the hen; the stroke he struck bewitched his weak reason. Madness AFTER the deed, I call this. Hearken, ye judges! There is another madness besides, and it is BEFORE the deed. Ah! ye have not gone deep enough into this soul! Thus speaketh the red judge: "Why did this criminal commit murder? He meant to rob." 
I tell you, however, that his soul wanted blood, not booty: he thirsted for the happiness of the knife! But his weak reason understood not this madness, and it persuaded him. "What matter about blood!" it said; "wishest thou not, at least, to make booty thereby? Or take revenge?" And he hearkened unto his weak reason: like lead lay its words upon him-- thereupon he robbed when he murdered. He did not mean to be ashamed of his madness. And now once more lieth the lead of his guilt upon him, and once more is his weak reason so benumbed, so paralysed, and so dull. Could he only shake his head, then would his burden roll off; but who shaketh that head? What is this man? A mass of diseases that reach out into the world through the spirit; there they want to get their prey. What is this man? A coil of wild serpents that are seldom at peace among themselves--so they go forth apart and seek prey in the world. Look at that poor body! What it suffered and craved, the poor soul interpreted to itself--it interpreted it as murderous desire, and eagerness for the happiness of the knife. Him who now turneth sick, the evil overtaketh which is now the evil: he seeketh to cause pain with that which causeth him pain. But there have been other ages, and another evil and good. Once was doubt evil, and the will to Self. Then the invalid became a heretic or sorcerer; as heretic or sorcerer he suffered, and sought to cause suffering. But this will not enter your ears; it hurteth your good people, ye tell me. But what doth it matter to me about your good people! Many things in your good people cause me disgust, and verily, not their evil. I would that they had a madness by which they succumbed, like this pale criminal! Verily, I would that their madness were called truth, or fidelity, or justice: but they have their virtue in order to live long, and in wretched self-complacency. I am a railing alongside the torrent; whoever is able to grasp me may grasp me! 
Your crutch, however, I am not.-- Thus spake Zarathustra. VII. READING AND WRITING. Of all that is written, I love only what a person hath written with his blood. Write with blood, and thou wilt find that blood is spirit. It is no easy task to understand unfamiliar blood; I hate the reading idlers. He who knoweth the reader, doeth nothing more for the reader. Another century of readers--and spirit itself will stink. Every one being allowed to learn to read, ruineth in the long run not only writing but also thinking. Once spirit was God, then it became man, and now it even becometh populace. He that writeth in blood and proverbs doth not want to be read, but learnt by heart. In the mountains the shortest way is from peak to peak, but for that route thou must have long legs. Proverbs should be peaks, and those spoken to should be big and tall. The atmosphere rare and pure, danger near and the spirit full of a joyful wickedness: thus are things well matched. I want to have goblins about me, for I am courageous. The courage which scareth away ghosts, createth for itself goblins--it wanteth to laugh. I no longer feel in common with you; the very cloud which I see beneath me, the blackness and heaviness at which I laugh--that is your thunder-cloud. Ye look aloft when ye long for exaltation; and I look downward because I am exalted. Who among you can at the same time laugh and be exalted? He who climbeth on the highest mountains, laugheth at all tragic plays and tragic realities. Courageous, unconcerned, scornful, coercive--so wisdom wisheth us; she is a woman, and ever loveth only a warrior. Ye tell me, "Life is hard to bear." But for what purpose should ye have your pride in the morning and your resignation in the evening? Life is hard to bear: but do not affect to be so delicate! We are all of us fine sumpter asses and assesses. What have we in common with the rose-bud, which trembleth because a drop of dew hath formed upon it? 
It is true we love life; not because we are wont to live, but because we are wont to love. There is always some madness in love. But there is always, also, some method in madness. And to me also, who appreciate life, the butterflies, and soap-bubbles, and whatever is like them amongst us, seem most to enjoy happiness. To see these light, foolish, pretty, lively little sprites flit about--that moveth Zarathustra to tears and songs. I should only believe in a God that would know how to dance. And when I saw my devil, I found him serious, thorough, profound, solemn: he was the spirit of gravity--through him all things fall. Not by wrath, but by laughter, do we slay. Come, let us slay the spirit of gravity! I learned to walk; since then have I let myself run. I learned to fly; since then I do not need pushing in order to move from a spot. Now am I light, now do I fly; now do I see myself under myself. Now there danceth a God in me.-- Thus spake Zarathustra. VIII. THE TREE ON THE HILL. Zarathustra's eye had perceived that a certain youth avoided him. And as he walked alone one evening over the hills surrounding the town called "The Pied Cow," behold, there found he the youth sitting leaning against a tree, and gazing with wearied look into the valley. Zarathustra thereupon laid hold of the tree beside which the youth sat, and spake thus: "If I wished to shake this tree with my hands, I should not be able to do so. But the wind, which we see not, troubleth and bendeth it as it listeth. We are sorest bent and troubled by invisible hands." Thereupon the youth arose disconcerted, and said: "I hear Zarathustra, and just now was I thinking of him!" Zarathustra answered: "Why art thou frightened on that account?--But it is the same with man as with the tree. The more he seeketh to rise into the height and light, the more vigorously do his roots struggle earthward, downward, into the dark and deep--into the evil." "Yea, into the evil!" cried the youth. 
"How is it possible that thou hast discovered my soul?" Zarathustra smiled, and said: "Many a soul one will never discover, unless one first invent it." "Yea, into the evil!" cried the youth once more. "Thou saidst the truth, Zarathustra. I trust myself no longer since I sought to rise into the height, and nobody trusteth me any longer; how doth that happen? I change too quickly: my to-day refuteth my yesterday. I often overleap the steps when I clamber; for so doing, none of the steps pardons me. When aloft, I find myself always alone. No one speaketh unto me; the frost of solitude maketh me tremble. What do I seek on the height? My contempt and my longing increase together; the higher I clamber, the more do I despise him who clambereth. What doth he seek on the height? How ashamed I am of my clambering and stumbling! How I mock at my violent panting! How I hate him who flieth! How tired I am on the height!" Here the youth was silent. And Zarathustra contemplated the tree beside which they stood, and spake thus: "This tree standeth lonely here on the hills; it hath grown up high above man and beast. And if it wanted to speak, it would have none who could understand it: so high hath it grown. Now it waiteth and waiteth,--for what doth it wait? It dwelleth too close to the seat of the clouds; it waiteth perhaps for the first lightning?" When Zarathustra had said this, the youth called out with violent gestures: "Yea, Zarathustra, thou speakest the truth. My destruction I longed for, when I desired to be on the height, and thou art the lightning for which I waited! Lo! what have I been since thou hast appeared amongst us? It is mine envy of thee that hath destroyed me!"--Thus spake the youth, and wept bitterly. Zarathustra, however, put his arm about him, and led the youth away with him. And when they had walked a while together, Zarathustra began to speak thus: It rendeth my heart. Better than thy words express it, thine eyes tell me all thy danger. 
As yet thou art not free; thou still SEEKEST freedom. Too unslept hath thy seeking made thee, and too wakeful. On the open height wouldst thou be; for the stars thirsteth thy soul. But thy bad impulses also thirst for freedom. Thy wild dogs want liberty; they bark for joy in their cellar when thy spirit endeavoureth to open all prison doors. Still art thou a prisoner--it seemeth to me--who deviseth liberty for himself: ah! sharp becometh the soul of such prisoners, but also deceitful and wicked. To purify himself, is still necessary for the freedman of the spirit. Much of the prison and the mould still remaineth in him: pure hath his eye still to become. Yea, I know thy danger. But by my love and hope I conjure thee: cast not thy love and hope away! Noble thou feelest thyself still, and noble others also feel thee still, though they bear thee a grudge and cast evil looks. Know this, that to everybody a noble one standeth in the way. Also to the good, a noble one standeth in the way: and even when they call him a good man, they want thereby to put him aside. The new, would the noble man create, and a new virtue. The old, wanteth the good man, and that the old should be conserved. But it is not the danger of the noble man to turn a good man, but lest he should become a blusterer, a scoffer, or a destroyer. Ah! I have known noble ones who lost their highest hope. And then they disparaged all high hopes. Then lived they shamelessly in temporary pleasures, and beyond the day had hardly an aim. "Spirit is also voluptuousness,"--said they. Then broke the wings of their spirit; and now it creepeth about, and defileth where it gnaweth. Once they thought of becoming heroes; but sensualists are they now. A trouble and a terror is the hero to them. But by my love and hope I conjure thee: cast not away the hero in thy soul! Maintain holy thy highest hope!-- Thus spake Zarathustra. IX. THE PREACHERS OF DEATH. 
There are preachers of death: and the earth is full of those to whom desistance from life must be preached. Full is the earth of the superfluous; marred is life by the many-too-many. May they be decoyed out of this life by the "life eternal"! "The yellow ones": so are called the preachers of death, or "the black ones." But I will show them unto you in other colours besides. There are the terrible ones who carry about in themselves the beast of prey, and have no choice except lusts or self-laceration. And even their lusts are self-laceration. They have not yet become men, those terrible ones: may they preach desistance from life, and pass away themselves! There are the spiritually consumptive ones: hardly are they born when they begin to die, and long for doctrines of lassitude and renunciation. They would fain be dead, and we should approve of their wish! Let us beware of awakening those dead ones, and of damaging those living coffins! They meet an invalid, or an old man, or a corpse--and immediately they say: "Life is refuted!" But they only are refuted, and their eye, which seeth only one aspect of existence. Shrouded in thick melancholy, and eager for the little casualties that bring death: thus do they wait, and clench their teeth. Or else, they grasp at sweetmeats, and mock at their childishness thereby: they cling to their straw of life, and mock at their still clinging to it. Their wisdom speaketh thus: "A fool, he who remaineth alive; but so far are we fools! And that is the foolishest thing in life!" "Life is only suffering": so say others, and lie not. Then see to it that YE cease! See to it that the life ceaseth which is only suffering! And let this be the teaching of your virtue: "Thou shalt slay thyself! Thou shalt steal away from thyself!"-- "Lust is sin,"--so say some who preach death--"let us go apart and beget no children!" "Giving birth is troublesome,"--say others--"why still give birth? One beareth only the unfortunate!" 
And they also are preachers of death. "Pity is necessary,"--so saith a third party. "Take what I have! Take what I am! So much less doth life bind me!" Were they consistently pitiful, then would they make their neighbours sick of life. To be wicked--that would be their true goodness. But they want to be rid of life; what care they if they bind others still faster with their chains and gifts!-- And ye also, to whom life is rough labour and disquiet, are ye not very tired of life? Are ye not very ripe for the sermon of death? All ye to whom rough labour is dear, and the rapid, new, and strange--ye put up with yourselves badly; your diligence is flight, and the will to self-forgetfulness. If ye believed more in life, then would ye devote yourselves less to the momentary. But for waiting, ye have not enough of capacity in you--nor even for idling! Everywhere resoundeth the voices of those who preach death; and the earth is full of those to whom death hath to be preached. Or "life eternal"; it is all the same to me--if only they pass away quickly!-- Thus spake Zarathustra. X. WAR AND WARRIORS. By our best enemies we do not want to be spared, nor by those either whom we love from the very heart. So let me tell you the truth! My brethren in war! I love you from the very heart. I am, and was ever, your counterpart. And I am also your best enemy. So let me tell you the truth! I know the hatred and envy of your hearts. Ye are not great enough not to know of hatred and envy. Then be great enough not to be ashamed of them! And if ye cannot be saints of knowledge, then, I pray you, be at least its warriors. They are the companions and forerunners of such saintship. I see many soldiers; could I but see many warriors! "Uniform" one calleth what they wear; may it not be uniform what they therewith hide! Ye shall be those whose eyes ever seek for an enemy--for YOUR enemy. And with some of you there is hatred at first sight. 
Your enemy shall ye seek; your war shall ye wage, and for the sake of your thoughts! And if your thoughts succumb, your uprightness shall still shout triumph thereby! Ye shall love peace as a means to new wars--and the short peace more than the long. You I advise not to work, but to fight. You I advise not to peace, but to victory. Let your work be a fight, let your peace be a victory! One can only be silent and sit peacefully when one hath arrow and bow; otherwise one prateth and quarrelleth. Let your peace be a victory! Ye say it is the good cause which halloweth even war? I say unto you: it is the good war which halloweth every cause. War and courage have done more great things than charity. Not your sympathy, but your bravery hath hitherto saved the victims. "What is good?" ye ask. To be brave is good. Let the little girls say: "To be good is what is pretty, and at the same time touching." They call you heartless: but your heart is true, and I love the bashfulness of your goodwill. Ye are ashamed of your flow, and others are ashamed of their ebb. Ye are ugly? Well then, my brethren, take the sublime about you, the mantle of the ugly! And when your soul becometh great, then doth it become haughty, and in your sublimity there is wickedness. I know you. In wickedness the haughty man and the weakling meet. But they misunderstand one another. I know you. Ye shall only have enemies to be hated, but not enemies to be despised. Ye must be proud of your enemies; then, the successes of your enemies are also your successes. Resistance--that is the distinction of the slave. Let your distinction be obedience. Let your commanding itself be obeying! To the good warrior soundeth "thou shalt" pleasanter than "I will." And all that is dear unto you, ye shall first have it commanded unto you. Let your love to life be love to your highest hope; and let your highest hope be the highest thought of life! 
Your highest thought, however, ye shall have it commanded unto you by me-- and it is this: man is something that is to be surpassed. So live your life of obedience and of war! What matter about long life! What warrior wisheth to be spared! I spare you not, I love you from my very heart, my brethren in war!-- Thus spake Zarathustra. XI. THE NEW IDOL. Somewhere there are still peoples and herds, but not with us, my brethren: here there are states. A state? What is that? Well! open now your ears unto me, for now will I say unto you my word concerning the death of peoples. A state, is called the coldest of all cold monsters. Coldly lieth it also; and this lie creepeth from its mouth: "I, the state, am the people." It is a lie! Creators were they who created peoples, and hung a faith and a love over them: thus they served life. Destroyers, are they who lay snares for many, and call it the state: they hang a sword and a hundred cravings over them. Where there is still a people, there the state is not understood, but hated as the evil eye, and as sin against laws and customs. This sign I give unto you: every people speaketh its language of good and evil: this its neighbour understandeth not. Its language hath it devised for itself in laws and customs. But the state lieth in all languages of good and evil; and whatever it saith it lieth; and whatever it hath it hath stolen. False is everything in it; with stolen teeth it biteth, the biting one. False are even its bowels. Confusion of language of good and evil; this sign I give unto you as the sign of the state. Verily, the will to death, indicateth this sign! Verily, it beckoneth unto the preachers of death! Many too many are born: for the superfluous ones was the state devised! See just how it enticeth them to it, the many-too-many! How it swalloweth and cheweth and recheweth them! "On earth there is nothing greater than I: it is I who am the regulating finger of God"--thus roareth the monster. 
And not only the long-eared and short-sighted fall upon their knees! Ah! even in your ears, ye great souls, it whispereth its gloomy lies! Ah! it findeth out the rich hearts which willingly lavish themselves! Yea, it findeth you out too, ye conquerors of the old God! Weary ye became of the conflict, and now your weariness serveth the new idol! Heroes and honourable ones, it would fain set up around it, the new idol! Gladly it basketh in the sunshine of good consciences,--the cold monster! Everything will it give YOU, if YE worship it, the new idol: thus it purchaseth the lustre of your virtue, and the glance of your proud eyes. It seeketh to allure by means of you, the many-too-many! Yea, a hellish artifice hath here been devised, a death-horse jingling with the trappings of divine honours! Yea, a dying for many hath here been devised, which glorifieth itself as life: verily, a hearty service unto all preachers of death! The state, I call it, where all are poison-drinkers, the good and the bad: the state, where all lose themselves, the good and the bad: the state, where the slow suicide of all--is called "life." Just see these superfluous ones! They steal the works of the inventors and the treasures of the wise. Culture, they call their theft--and everything becometh sickness and trouble unto them! Just see these superfluous ones! Sick are they always; they vomit their bile and call it a newspaper. They devour one another, and cannot even digest themselves. Just see these superfluous ones! Wealth they acquire and become poorer thereby. Power they seek for, and above all, the lever of power, much money--these impotent ones! See them clamber, these nimble apes! They clamber over one another, and thus scuffle into the mud and the abyss. Towards the throne they all strive: it is their madness--as if happiness sat on the throne! Ofttimes sitteth filth on the throne,--and ofttimes also the throne on filth. Madmen they all seem to me, and clambering apes, and too eager. 
Badly smelleth their idol to me, the cold monster: badly they all smell to me, these idolaters. My brethren, will ye suffocate in the fumes of their maws and appetites! Better break the windows and jump into the open air! Do go out of the way of the bad odour! Withdraw from the idolatry of the superfluous! Do go out of the way of the bad odour! Withdraw from the steam of these human sacrifices! Open still remaineth the earth for great souls. Empty are still many sites for lone ones and twain ones, around which floateth the odour of tranquil seas. Open still remaineth a free life for great souls. Verily, he who possesseth little is so much the less possessed: blessed be moderate poverty! There, where the state ceaseth--there only commenceth the man who is not superfluous: there commenceth the song of the necessary ones, the single and irreplaceable melody. There, where the state CEASETH--pray look thither, my brethren! Do ye not see it, the rainbow and the bridges of the Superman?-- Thus spake Zarathustra. XII. THE FLIES IN THE MARKET-PLACE. Flee, my friend, into thy solitude! I see thee deafened with the noise of the great men, and stung all over with the stings of the little ones. Admirably do forest and rock know how to be silent with thee. Resemble again the tree which thou lovest, the broad-branched one--silently and attentively it o'erhangeth the sea. Where solitude endeth, there beginneth the market-place; and where the market-place beginneth, there beginneth also the noise of the great actors, and the buzzing of the poison-flies. In the world even the best things are worthless without those who represent them: those representers, the people call great men. Little do the people understand what is great--that is to say, the creating agency. But they have a taste for all representers and actors of great things. Around the devisers of new values revolveth the world:--invisibly it revolveth. 
But around the actors revolve the people and the glory: such is the course of things. Spirit, hath the actor, but little conscience of the spirit. He believeth always in that wherewith he maketh believe most strongly--in HIMSELF! Tomorrow he hath a new belief, and the day after, one still newer. Sharp perceptions hath he, like the people, and changeable humours. To upset--that meaneth with him to prove. To drive mad--that meaneth with him to convince. And blood is counted by him as the best of all arguments. A truth which only glideth into fine ears, he calleth falsehood and trumpery. Verily, he believeth only in Gods that make a great noise in the world! Full of clattering buffoons is the market-place,--and the people glory in their great men! These are for them the masters of the hour. But the hour presseth them; so they press thee. And also from thee they want Yea or Nay. Alas! thou wouldst set thy chair betwixt For and Against? On account of those absolute and impatient ones, be not jealous, thou lover of truth! Never yet did truth cling to the arm of an absolute one. On account of those abrupt ones, return into thy security: only in the market-place is one assailed by Yea? or Nay? Slow is the experience of all deep fountains: long have they to wait until they know WHAT hath fallen into their depths. Away from the market-place and from fame taketh place all that is great: away from the market-place and from fame have ever dwelt the devisers of new values. Flee, my friend, into thy solitude: I see thee stung all over by the poisonous flies. Flee thither, where a rough, strong breeze bloweth! Flee into thy solitude! Thou hast lived too closely to the small and the pitiable. Flee from their invisible vengeance! Towards thee they have nothing but vengeance. Raise no longer an arm against them! Innumerable are they, and it is not thy lot to be a fly-flap. 
Innumerable are the small and pitiable ones; and of many a proud structure, rain-drops and weeds have been the ruin. Thou art not stone; but already hast thou become hollow by the numerous drops. Thou wilt yet break and burst by the numerous drops. Exhausted I see thee, by poisonous flies; bleeding I see thee, and torn at a hundred spots; and thy pride will not even upbraid. Blood they would have from thee in all innocence; blood their bloodless souls crave for--and they sting, therefore, in all innocence. But thou, profound one, thou sufferest too profoundly even from small wounds; and ere thou hadst recovered, the same poison-worm crawled over thy hand. Too proud art thou to kill these sweet-tooths. But take care lest it be thy fate to suffer all their poisonous injustice! They buzz around thee also with their praise: obtrusiveness, is their praise. They want to be close to thy skin and thy blood. They flatter thee, as one flattereth a God or devil; they whimper before thee, as before a God or devil. What doth it come to! Flatterers are they, and whimperers, and nothing more. Often, also, do they show themselves to thee as amiable ones. But that hath ever been the prudence of the cowardly. Yea! the cowardly are wise! They think much about thee with their circumscribed souls--thou art always suspected by them! Whatever is much thought about is at last thought suspicious. They punish thee for all thy virtues. They pardon thee in their inmost hearts only--for thine errors. Because thou art gentle and of upright character, thou sayest: "Blameless are they for their small existence." But their circumscribed souls think: "Blamable is all great existence." Even when thou art gentle towards them, they still feel themselves despised by thee; and they repay thy beneficence with secret maleficence. Thy silent pride is always counter to their taste; they rejoice if once thou be humble enough to be frivolous. What we recognise in a man, we also irritate in him. 
Therefore be on your guard against the small ones! In thy presence they feel themselves small, and their baseness gleameth and gloweth against thee in invisible vengeance. Sawest thou not how often they became dumb when thou approachedst them, and how their energy left them like the smoke of an extinguishing fire? Yea, my friend, the bad conscience art thou of thy neighbours; for they are unworthy of thee. Therefore they hate thee, and would fain suck thy blood. Thy neighbours will always be poisonous flies; what is great in thee--that itself must make them more poisonous, and always more fly-like. Flee, my friend, into thy solitude--and thither, where a rough strong breeze bloweth. It is not thy lot to be a fly-flap.-- Thus spake Zarathustra. XIII. CHASTITY. I love the forest. It is bad to live in cities: there, there are too many of the lustful. Is it not better to fall into the hands of a murderer, than into the dreams of a lustful woman? And just look at these men: their eye saith it--they know nothing better on earth than to lie with a woman. Filth is at the bottom of their souls; and alas! if their filth hath still spirit in it! Would that ye were perfect--at least as animals! But to animals belongeth innocence. Do I counsel you to slay your instincts? I counsel you to innocence in your instincts. Do I counsel you to chastity? Chastity is a virtue with some, but with many almost a vice. These are continent, to be sure: but doggish lust looketh enviously out of all that they do. Even into the heights of their virtue and into their cold spirit doth this creature follow them, with its discord. And how nicely can doggish lust beg for a piece of spirit, when a piece of flesh is denied it! Ye love tragedies and all that breaketh the heart? But I am distrustful of your doggish lust. Ye have too cruel eyes, and ye look wantonly towards the sufferers. Hath not your lust just disguised itself and taken the name of fellow-suffering? 
And also this parable give I unto you: Not a few who meant to cast out their devil, went thereby into the swine themselves. To whom chastity is difficult, it is to be dissuaded: lest it become the road to hell--to filth and lust of soul. Do I speak of filthy things? That is not the worst thing for me to do. Not when the truth is filthy, but when it is shallow, doth the discerning one go unwillingly into its waters. Verily, there are chaste ones from their very nature; they are gentler of heart, and laugh better and oftener than you. They laugh also at chastity, and ask: "What is chastity? Is chastity not folly? But the folly came unto us, and not we unto it. We offered that guest harbour and heart: now it dwelleth with us--let it stay as long as it will!"-- Thus spake Zarathustra. XIV. THE FRIEND. "One, is always too many about me"--thinketh the anchorite. "Always once one--that maketh two in the long run!" I and me are always too earnestly in conversation: how could it be endured, if there were not a friend? The friend of the anchorite is always the third one: the third one is the cork which preventeth the conversation of the two sinking into the depth. Ah! there are too many depths for all anchorites. Therefore, do they long so much for a friend, and for his elevation. Our faith in others betrayeth wherein we would fain have faith in ourselves. Our longing for a friend is our betrayer. And often with our love we want merely to overleap envy. And often we attack and make ourselves enemies, to conceal that we are vulnerable. "Be at least mine enemy!"--thus speaketh the true reverence, which doth not venture to solicit friendship. If one would have a friend, then must one also be willing to wage war for him: and in order to wage war, one must be CAPABLE of being an enemy. One ought still to honour the enemy in one's friend. Canst thou go nigh unto thy friend, and not go over to him? In one's friend one shall have one's best enemy. 
Thou shalt be closest unto him with thy heart when thou withstandest him. Thou wouldst wear no raiment before thy friend? It is in honour of thy friend that thou showest thyself to him as thou art? But he wisheth thee to the devil on that account! He who maketh no secret of himself shocketh: so much reason have ye to fear nakedness! Aye, if ye were Gods, ye could then be ashamed of clothing! Thou canst not adorn thyself fine enough for thy friend; for thou shalt be unto him an arrow and a longing for the Superman. Sawest thou ever thy friend asleep--to know how he looketh? What is usually the countenance of thy friend? It is thine own countenance, in a coarse and imperfect mirror. Sawest thou ever thy friend asleep? Wert thou not dismayed at thy friend looking so? O my friend, man is something that hath to be surpassed. In divining and keeping silence shall the friend be a master: not everything must thou wish to see. Thy dream shall disclose unto thee what thy friend doeth when awake. Let thy pity be a divining: to know first if thy friend wanteth pity. Perhaps he loveth in thee the unmoved eye, and the look of eternity. Let thy pity for thy friend be hid under a hard shell; thou shalt bite out a tooth upon it. Thus will it have delicacy and sweetness. Art thou pure air and solitude and bread and medicine to thy friend? Many a one cannot loosen his own fetters, but is nevertheless his friend's emancipator. Art thou a slave? Then thou canst not be a friend. Art thou a tyrant? Then thou canst not have friends. Far too long hath there been a slave and a tyrant concealed in woman. On that account woman is not yet capable of friendship: she knoweth only love. In woman's love there is injustice and blindness to all she doth not love. And even in woman's conscious love, there is still always surprise and lightning and night, along with the light. As yet woman is not capable of friendship: women are still cats, and birds. Or at the best, cows. 
As yet woman is not capable of friendship. But tell me, ye men, who of you are capable of friendship? Oh! your poverty, ye men, and your sordidness of soul! As much as ye give to your friend, will I give even to my foe, and will not have become poorer thereby. There is comradeship: may there be friendship! Thus spake Zarathustra. XV. THE THOUSAND AND ONE GOALS. Many lands saw Zarathustra, and many peoples: thus he discovered the good and bad of many peoples. No greater power did Zarathustra find on earth than good and bad. No people could live without first valuing; if a people will maintain itself, however, it must not value as its neighbour valueth. Much that passed for good with one people was regarded with scorn and contempt by another: thus I found it. Much found I here called bad, which was there decked with purple honours. Never did the one neighbour understand the other: ever did his soul marvel at his neighbour's delusion and wickedness. A table of excellencies hangeth over every people. Lo! it is the table of their triumphs; lo! it is the voice of their Will to Power. It is laudable, what they think hard; what is indispensable and hard they call good; and what relieveth in the direst distress, the unique and hardest of all,--they extol as holy. Whatever maketh them rule and conquer and shine, to the dismay and envy of their neighbours, they regard as the high and foremost thing, the test and the meaning of all else. Verily, my brother, if thou knewest but a people's need, its land, its sky, and its neighbour, then wouldst thou divine the law of its surmountings, and why it climbeth up that ladder to its hope. "Always shalt thou be the foremost and prominent above others: no one shall thy jealous soul love, except a friend"--that made the soul of a Greek thrill: thereby went he his way to greatness. 
"To speak truth, and be skilful with bow and arrow"--so seemed it alike pleasing and hard to the people from whom cometh my name--the name which is alike pleasing and hard to me. "To honour father and mother, and from the root of the soul to do their will"--this table of surmounting hung another people over them, and became powerful and permanent thereby. "To have fidelity, and for the sake of fidelity to risk honour and blood, even in evil and dangerous courses"--teaching itself so, another people mastered itself, and thus mastering itself, became pregnant and heavy with great hopes. Verily, men have given unto themselves all their good and bad. Verily, they took it not, they found it not, it came not unto them as a voice from heaven. Values did man only assign to things in order to maintain himself--he created only the significance of things, a human significance! Therefore, calleth he himself "man," that is, the valuator. Valuing is creating: hear it, ye creating ones! Valuation itself is the treasure and jewel of the valued things. Through valuation only is there value; and without valuation the nut of existence would be hollow. Hear it, ye creating ones! Change of values--that is, change of the creating ones. Always doth he destroy who hath to be a creator. Creating ones were first of all peoples, and only in late times individuals; verily, the individual himself is still the latest creation. Peoples once hung over them tables of the good. Love which would rule and love which would obey, created for themselves such tables. Older is the pleasure in the herd than the pleasure in the ego: and as long as the good conscience is for the herd, the bad conscience only saith: ego. Verily, the crafty ego, the loveless one, that seeketh its advantage in the advantage of many--it is not the origin of the herd, but its ruin. Loving ones, was it always, and creating ones, that created good and bad. Fire of love gloweth in the names of all the virtues, and fire of wrath. 
Many lands saw Zarathustra, and many peoples: no greater power did Zarathustra find on earth than the creations of the loving ones--"good" and "bad" are they called. Verily, a prodigy is this power of praising and blaming. Tell me, ye brethren, who will master it for me? Who will put a fetter upon the thousand necks of this animal? A thousand goals have there been hitherto, for a thousand peoples have there been. Only the fetter for the thousand necks is still lacking; there is lacking the one goal. As yet humanity hath not a goal. But pray tell me, my brethren, if the goal of humanity be still lacking, is there not also still lacking--humanity itself?-- Thus spake Zarathustra. XVI. NEIGHBOUR-LOVE. Ye crowd around your neighbour, and have fine words for it. But I say unto you: your neighbour-love is your bad love of yourselves. Ye flee unto your neighbour from yourselves, and would fain make a virtue thereof: but I fathom your "unselfishness." The THOU is older than the _I_; the THOU hath been consecrated, but not yet the _I_: so man presseth nigh unto his neighbour. Do I advise you to neighbour-love? Rather do I advise you to neighbour-flight and to furthest love! Higher than love to your neighbour is love to the furthest and future ones; higher still than love to men, is love to things and phantoms. The phantom that runneth on before thee, my brother, is fairer than thou; why dost thou not give unto it thy flesh and thy bones? But thou fearest, and runnest unto thy neighbour. Ye cannot endure it with yourselves, and do not love yourselves sufficiently: so ye seek to mislead your neighbour into love, and would fain gild yourselves with his error. Would that ye could not endure it with any kind of near ones, or their neighbours; then would ye have to create your friend and his overflowing heart out of yourselves. Ye call in a witness when ye want to speak well of yourselves; and when ye have misled him to think well of you, ye also think well of yourselves. 
Not only doth he lie, who speaketh contrary to his knowledge, but more so, he who speaketh contrary to his ignorance. And thus speak ye of yourselves in your intercourse, and belie your neighbour with yourselves. Thus saith the fool: "Association with men spoileth the character, especially when one hath none." The one goeth to his neighbour because he seeketh himself, and the other because he would fain lose himself. Your bad love to yourselves maketh solitude a prison to you. The furthest ones are they who pay for your love to the near ones; and when there are but five of you together, a sixth must always die. I love not your festivals either: too many actors found I there, and even the spectators often behaved like actors. Not the neighbour do I teach you, but the friend. Let the friend be the festival of the earth to you, and a foretaste of the Superman. I teach you the friend and his overflowing heart. But one must know how to be a sponge, if one would be loved by overflowing hearts. I teach you the friend in whom the world standeth complete, a capsule of the good,--the creating friend, who hath always a complete world to bestow. And as the world unrolled itself for him, so rolleth it together again for him in rings, as the growth of good through evil, as the growth of purpose out of chance. Let the future and the furthest be the motive of thy to-day; in thy friend shalt thou love the Superman as thy motive. My brethren, I advise you not to neighbour-love--I advise you to furthest love!-- Thus spake Zarathustra. XVII. THE WAY OF THE CREATING ONE. Wouldst thou go into isolation, my brother? Wouldst thou seek the way unto thyself? Tarry yet a little and hearken unto me. "He who seeketh may easily get lost himself. All isolation is wrong": so say the herd. And long didst thou belong to the herd. The voice of the herd will still echo in thee. And when thou sayest, "I have no longer a conscience in common with you," then will it be a plaint and a pain. 
Lo, that pain itself did the same conscience produce; and the last gleam of that conscience still gloweth on thine affliction. But thou wouldst go the way of thine affliction, which is the way unto thyself? Then show me thine authority and thy strength to do so! Art thou a new strength and a new authority? A first motion? A self-rolling wheel? Canst thou also compel stars to revolve around thee? Alas! there is so much lusting for loftiness! There are so many convulsions of the ambitions! Show me that thou art not a lusting and ambitious one! Alas! there are so many great thoughts that do nothing more than the bellows: they inflate, and make emptier than ever. Free, dost thou call thyself? Thy ruling thought would I hear of, and not that thou hast escaped from a yoke. Art thou one ENTITLED to escape from a yoke? Many a one hath cast away his final worth when he hath cast away his servitude. Free from what? What doth that matter to Zarathustra! Clearly, however, shall thine eye show unto me: free FOR WHAT? Canst thou give unto thyself thy bad and thy good, and set up thy will as a law over thee? Canst thou be judge for thyself, and avenger of thy law? Terrible is aloneness with the judge and avenger of one's own law. Thus is a star projected into desert space, and into the icy breath of aloneness. To-day sufferest thou still from the multitude, thou individual; to-day hast thou still thy courage unabated, and thy hopes. But one day will the solitude weary thee; one day will thy pride yield, and thy courage quail. Thou wilt one day cry: "I am alone!" One day wilt thou see no longer thy loftiness, and see too closely thy lowliness; thy sublimity itself will frighten thee as a phantom. Thou wilt one day cry: "All is false!" There are feelings which seek to slay the lonesome one; if they do not succeed, then must they themselves die! But art thou capable of it--to be a murderer? Hast thou ever known, my brother, the word "disdain"? 
And the anguish of thy justice in being just to those that disdain thee? Thou forcest many to think differently about thee; that, charge they heavily to thine account. Thou camest nigh unto them, and yet wentest past: for that they never forgive thee. Thou goest beyond them: but the higher thou risest, the smaller doth the eye of envy see thee. Most of all, however, is the flying one hated. "How could ye be just unto me!"--must thou say--"I choose your injustice as my allotted portion." Injustice and filth cast they at the lonesome one: but, my brother, if thou wouldst be a star, thou must shine for them none the less on that account! And be on thy guard against the good and just! They would fain crucify those who devise their own virtue--they hate the lonesome ones. Be on thy guard, also, against holy simplicity! All is unholy to it that is not simple; fain, likewise, would it play with the fire--of the fagot and stake. And be on thy guard, also, against the assaults of thy love! Too readily doth the recluse reach his hand to any one who meeteth him. To many a one mayest thou not give thy hand, but only thy paw; and I wish thy paw also to have claws. But the worst enemy thou canst meet, wilt thou thyself always be; thou waylayest thyself in caverns and forests. Thou lonesome one, thou goest the way to thyself! And past thyself and thy seven devils leadeth thy way! A heretic wilt thou be to thyself, and a wizard and a sooth-sayer, and a fool, and a doubter, and a reprobate, and a villain. Ready must thou be to burn thyself in thine own flame; how couldst thou become new if thou have not first become ashes! Thou lonesome one, thou goest the way of the creating one: a God wilt thou create for thyself out of thy seven devils! Thou lonesome one, thou goest the way of the loving one: thou lovest thyself, and on that account despisest thou thyself, as only the loving ones despise. To create, desireth the loving one, because he despiseth! 
What knoweth he of love who hath not been obliged to despise just what he loved! With thy love, go into thine isolation, my brother, and with thy creating; and late only will justice limp after thee. With my tears, go into thine isolation, my brother. I love him who seeketh to create beyond himself, and thus succumbeth.-- Thus spake Zarathustra. XVIII. OLD AND YOUNG WOMEN. "Why stealest thou along so furtively in the twilight, Zarathustra? And what hidest thou so carefully under thy mantle? Is it a treasure that hath been given thee? Or a child that hath been born thee? Or goest thou thyself on a thief's errand, thou friend of the evil?"-- Verily, my brother, said Zarathustra, it is a treasure that hath been given me: it is a little truth which I carry. But it is naughty, like a young child; and if I hold not its mouth, it screameth too loudly. As I went on my way alone to-day, at the hour when the sun declineth, there met me an old woman, and she spake thus unto my soul: "Much hath Zarathustra spoken also to us women, but never spake he unto us concerning woman." And I answered her: "Concerning woman, one should only talk unto men." "Talk also unto me of woman," said she; "I am old enough to forget it presently." And I obliged the old woman and spake thus unto her: Everything in woman is a riddle, and everything in woman hath one solution--it is called pregnancy. Man is for woman a means: the purpose is always the child. But what is woman for man? Two different things wanteth the true man: danger and diversion. Therefore wanteth he woman, as the most dangerous plaything. Man shall be trained for war, and woman for the recreation of the warrior: all else is folly. Too sweet fruits--these the warrior liketh not. Therefore liketh he woman;--bitter is even the sweetest woman. Better than man doth woman understand children, but man is more childish than woman. In the true man there is a child hidden: it wanteth to play. 
Up then, ye women, and discover the child in man! A plaything let woman be, pure and fine like the precious stone, illumined with the virtues of a world not yet come. Let the beam of a star shine in your love! Let your hope say: "May I bear the Superman!" In your love let there be valour! With your love shall ye assail him who inspireth you with fear! In your love be your honour! Little doth woman understand otherwise about honour. But let this be your honour: always to love more than ye are loved, and never be the second. Let man fear woman when she loveth: then maketh she every sacrifice, and everything else she regardeth as worthless. Let man fear woman when she hateth: for man in his innermost soul is merely evil; woman, however, is mean. Whom hateth woman most?--Thus spake the iron to the loadstone: "I hate thee most, because thou attractest, but art too weak to draw unto thee." The happiness of man is, "I will." The happiness of woman is, "He will." "Lo! now hath the world become perfect!"--thus thinketh every woman when she obeyeth with all her love. Obey, must the woman, and find a depth for her surface. Surface, is woman's soul, a mobile, stormy film on shallow water. Man's soul, however, is deep, its current gusheth in subterranean caverns: woman surmiseth its force, but comprehendeth it not.-- Then answered me the old woman: "Many fine things hath Zarathustra said, especially for those who are young enough for them. Strange! Zarathustra knoweth little about woman, and yet he is right about them! Doth this happen, because with women nothing is impossible? And now accept a little truth by way of thanks! I am old enough for it! Swaddle it up and hold its mouth: otherwise it will scream too loudly, the little truth." "Give me, woman, thy little truth!" said I. And thus spake the old woman: "Thou goest to women? Do not forget thy whip!"-- Thus spake Zarathustra. XIX. THE BITE OF THE ADDER. 
One day had Zarathustra fallen asleep under a fig-tree, owing to the heat, with his arms over his face. And there came an adder and bit him in the neck, so that Zarathustra screamed with pain. When he had taken his arm from his face he looked at the serpent; and then did it recognise the eyes of Zarathustra, wriggled awkwardly, and tried to get away. "Not at all," said Zarathustra, "as yet hast thou not received my thanks! Thou hast awakened me in time; my journey is yet long." "Thy journey is short," said the adder sadly; "my poison is fatal." Zarathustra smiled. "When did ever a dragon die of a serpent's poison?"--said he. "But take thy poison back! Thou art not rich enough to present it to me." Then fell the adder again on his neck, and licked his wound. When Zarathustra once told this to his disciples they asked him: "And what, O Zarathustra, is the moral of thy story?" And Zarathustra answered them thus: The destroyer of morality, the good and just call me: my story is immoral. When, however, ye have an enemy, then return him not good for evil: for that would abash him. But prove that he hath done something good to you. And rather be angry than abash any one! And when ye are cursed, it pleaseth me not that ye should then desire to bless. Rather curse a little also! And should a great injustice befall you, then do quickly five small ones besides. Hideous to behold is he on whom injustice presseth alone. Did ye ever know this? Shared injustice is half justice. And he who can bear it, shall take the injustice upon himself! A small revenge is humaner than no revenge at all. And if the punishment be not also a right and an honour to the transgressor, I do not like your punishing. Nobler is it to own oneself in the wrong than to establish one's right, especially if one be in the right. Only, one must be rich enough to do so. I do not like your cold justice; out of the eye of your judges there always glanceth the executioner and his cold steel. 
Tell me: where find we justice, which is love with seeing eyes? Devise me, then, the love which not only beareth all punishment, but also all guilt! Devise me, then, the justice which acquitteth every one except the judge! And would ye hear this likewise? To him who seeketh to be just from the heart, even the lie becometh philanthropy. But how could I be just from the heart! How can I give every one his own! Let this be enough for me: I give unto every one mine own. Finally, my brethren, guard against doing wrong to any anchorite. How could an anchorite forget! How could he requite! Like a deep well is an anchorite. Easy is it to throw in a stone: if it should sink to the bottom, however, tell me, who will bring it out again? Guard against injuring the anchorite! If ye have done so, however, well then, kill him also!-- Thus spake Zarathustra. XX. CHILD AND MARRIAGE. I have a question for thee alone, my brother: like a sounding-lead, cast I this question into thy soul, that I may know its depth. Thou art young, and desirest child and marriage. But I ask thee: Art thou a man ENTITLED to desire a child? Art thou the victorious one, the self-conqueror, the ruler of thy passions, the master of thy virtues? Thus do I ask thee. Or doth the animal speak in thy wish, and necessity? Or isolation? Or discord in thee? I would have thy victory and freedom long for a child. Living monuments shalt thou build to thy victory and emancipation. Beyond thyself shalt thou build. But first of all must thou be built thyself, rectangular in body and soul. Not only onward shalt thou propagate thyself, but upward! For that purpose may the garden of marriage help thee! A higher body shalt thou create, a first movement, a spontaneously rolling wheel--a creating one shalt thou create. Marriage: so call I the will of the twain to create the one that is more than those who created it. The reverence for one another, as those exercising such a will, call I marriage. 
Let this be the significance and the truth of thy marriage. But that which the many-too-many call marriage, those superfluous ones--ah, what shall I call it? Ah, the poverty of soul in the twain! Ah, the filth of soul in the twain! Ah, the pitiable self-complacency in the twain! Marriage they call it all; and they say their marriages are made in heaven. Well, I do not like it, that heaven of the superfluous! No, I do not like them, those animals tangled in the heavenly toils! Far from me also be the God who limpeth thither to bless what he hath not matched! Laugh not at such marriages! What child hath not had reason to weep over its parents? Worthy did this man seem, and ripe for the meaning of the earth: but when I saw his wife, the earth seemed to me a home for madcaps. Yea, I would that the earth shook with convulsions when a saint and a goose mate with one another. This one went forth in quest of truth as a hero, and at last got for himself a small decked-up lie: his marriage he calleth it. That one was reserved in intercourse and chose choicely. But one time he spoilt his company for all time: his marriage he calleth it. Another sought a handmaid with the virtues of an angel. But all at once he became the handmaid of a woman, and now would he need also to become an angel. Careful, have I found all buyers, and all of them have astute eyes. But even the astutest of them buyeth his wife in a sack. Many short follies--that is called love by you. And your marriage putteth an end to many short follies, with one long stupidity. Your love to woman, and woman's love to man--ah, would that it were sympathy for suffering and veiled deities! But generally two animals alight on one another. But even your best love is only an enraptured simile and a painful ardour. It is a torch to light you to loftier paths. Beyond yourselves shall ye love some day! Then LEARN first of all to love. And on that account ye had to drink the bitter cup of your love. 
Bitterness is in the cup even of the best love: thus doth it cause longing for the Superman; thus doth it cause thirst in thee, the creating one! Thirst in the creating one, arrow and longing for the Superman: tell me, my brother, is this thy will to marriage? Holy call I such a will, and such a marriage.-- Thus spake Zarathustra. XXI. VOLUNTARY DEATH. Many die too late, and some die too early. Yet strange soundeth the precept: "Die at the right time!" Die at the right time: so teacheth Zarathustra. To be sure, he who never liveth at the right time, how could he ever die at the right time? Would that he might never be born!--Thus do I advise the superfluous ones. But even the superfluous ones make much ado about their death, and even the hollowest nut wanteth to be cracked. Every one regardeth dying as a great matter: but as yet death is not a festival. Not yet have people learned to inaugurate the finest festivals. The consummating death I show unto you, which becometh a stimulus and promise to the living. His death, dieth the consummating one triumphantly, surrounded by hoping and promising ones. Thus should one learn to die; and there should be no festival at which such a dying one doth not consecrate the oaths of the living! Thus to die is best; the next best, however, is to die in battle, and sacrifice a great soul. But to the fighter equally hateful as to the victor, is your grinning death which stealeth nigh like a thief,--and yet cometh as master. My death, praise I unto you, the voluntary death, which cometh unto me because _I_ want it. And when shall I want it?--He that hath a goal and an heir, wanteth death at the right time for the goal and the heir. And out of reverence for the goal and the heir, he will hang up no more withered wreaths in the sanctuary of life. Verily, not the rope-makers will I resemble: they lengthen out their cord, and thereby go ever backward. 
Many a one, also, waxeth too old for his truths and triumphs; a toothless mouth hath no longer the right to every truth. And whoever wanteth to have fame, must take leave of honour betimes, and practise the difficult art of--going at the right time. One must discontinue being feasted upon when one tasteth best: that is known by those who want to be long loved. Sour apples are there, no doubt, whose lot is to wait until the last day of autumn: and at the same time they become ripe, yellow, and shrivelled. In some ageth the heart first, and in others the spirit. And some are hoary in youth, but the late young keep long young. To many men life is a failure; a poison-worm gnaweth at their heart. Then let them see to it that their dying is all the more a success. Many never become sweet; they rot even in the summer. It is cowardice that holdeth them fast to their branches. Far too many live, and far too long hang they on their branches. Would that a storm came and shook all this rottenness and worm-eatenness from the tree! Would that there came preachers of SPEEDY death! Those would be the appropriate storms and agitators of the trees of life! But I hear only slow death preached, and patience with all that is "earthly." Ah! ye preach patience with what is earthly? This earthly is it that hath too much patience with you, ye blasphemers! Verily, too early died that Hebrew whom the preachers of slow death honour: and to many hath it proved a calamity that he died too early. As yet had he known only tears, and the melancholy of the Hebrews, together with the hatred of the good and just--the Hebrew Jesus: then was he seized with the longing for death. Had he but remained in the wilderness, and far from the good and just! Then, perhaps, would he have learned to live, and love the earth--and laughter also! Believe it, my brethren! He died too early; he himself would have disavowed his doctrine had he attained to my age! Noble enough was he to disavow! 
But he was still immature. Immaturely loveth the youth, and immaturely also hateth he man and earth. Confined and awkward are still his soul and the wings of his spirit. But in man there is more of the child than in the youth, and less of melancholy: better understandeth he about life and death. Free for death, and free in death; a holy Naysayer, when there is no longer time for Yea: thus understandeth he about death and life. That your dying may not be a reproach to man and the earth, my friends: that do I solicit from the honey of your soul. In your dying shall your spirit and your virtue still shine like an evening after-glow around the earth: otherwise your dying hath been unsatisfactory. Thus will I die myself, that ye friends may love the earth more for my sake; and earth will I again become, to have rest in her that bore me. Verily, a goal had Zarathustra; he threw his ball. Now be ye friends the heirs of my goal; to you throw I the golden ball. Best of all, do I see you, my friends, throw the golden ball! And so tarry I still a little while on the earth--pardon me for it! Thus spake Zarathustra. XXII. THE BESTOWING VIRTUE. 1. When Zarathustra had taken leave of the town to which his heart was attached, the name of which is "The Pied Cow," there followed him many people who called themselves his disciples, and kept him company. Thus came they to a crossroad. Then Zarathustra told them that he now wanted to go alone; for he was fond of going alone. His disciples, however, presented him at his departure with a staff, on the golden handle of which a serpent twined round the sun. Zarathustra rejoiced on account of the staff, and supported himself thereon; then spake he thus to his disciples: Tell me, pray: how came gold to the highest value? Because it is uncommon, and unprofiting, and beaming, and soft in lustre; it always bestoweth itself. Only as image of the highest virtue came gold to the highest value. Goldlike, beameth the glance of the bestower. 
Gold-lustre maketh peace between moon and sun. Uncommon is the highest virtue, and unprofiting, beaming is it, and soft of lustre: a bestowing virtue is the highest virtue. Verily, I divine you well, my disciples: ye strive like me for the bestowing virtue. What should ye have in common with cats and wolves? It is your thirst to become sacrifices and gifts yourselves: and therefore have ye the thirst to accumulate all riches in your soul. Insatiably striveth your soul for treasures and jewels, because your virtue is insatiable in desiring to bestow. Ye constrain all things to flow towards you and into you, so that they shall flow back again out of your fountain as the gifts of your love. Verily, an appropriator of all values must such bestowing love become; but healthy and holy, call I this selfishness.-- Another selfishness is there, an all-too-poor and hungry kind, which would always steal--the selfishness of the sick, the sickly selfishness. With the eye of the thief it looketh upon all that is lustrous; with the craving of hunger it measureth him who hath abundance; and ever doth it prowl round the tables of bestowers. Sickness speaketh in such craving, and invisible degeneration; of a sickly body, speaketh the larcenous craving of this selfishness. Tell me, my brother, what do we think bad, and worst of all? Is it not DEGENERATION?--And we always suspect degeneration when the bestowing soul is lacking. Upward goeth our course from genera on to super-genera. But a horror to us is the degenerating sense, which saith: "All for myself." Upward soareth our sense: thus is it a simile of our body, a simile of an elevation. Such similes of elevations are the names of the virtues. Thus goeth the body through history, a becomer and fighter. And the spirit--what is it to the body? Its fights' and victories' herald, its companion and echo. Similes, are all names of good and evil; they do not speak out, they only hint. A fool who seeketh knowledge from them! 
Give heed, my brethren, to every hour when your spirit would speak in similes: there is the origin of your virtue. Elevated is then your body, and raised up; with its delight, enraptureth it the spirit; so that it becometh creator, and valuer, and lover, and everything's benefactor. When your heart overfloweth broad and full like the river, a blessing and a danger to the lowlanders: there is the origin of your virtue. When ye are exalted above praise and blame, and your will would command all things, as a loving one's will: there is the origin of your virtue. When ye despise pleasant things, and the effeminate couch, and cannot couch far enough from the effeminate: there is the origin of your virtue. When ye are willers of one will, and when that change of every need is needful to you: there is the origin of your virtue. Verily, a new good and evil is it! Verily, a new deep murmuring, and the voice of a new fountain! Power is it, this new virtue; a ruling thought is it, and around it a subtle soul: a golden sun, with the serpent of knowledge around it. 2. Here paused Zarathustra awhile, and looked lovingly on his disciples. Then he continued to speak thus--and his voice had changed: Remain true to the earth, my brethren, with the power of your virtue! Let your bestowing love and your knowledge be devoted to be the meaning of the earth! Thus do I pray and conjure you. Let it not fly away from the earthly and beat against eternal walls with its wings! Ah, there hath always been so much flown-away virtue! Lead, like me, the flown-away virtue back to the earth--yea, back to body and life: that it may give to the earth its meaning, a human meaning! A hundred times hitherto hath spirit as well as virtue flown away and blundered. Alas! in our body dwelleth still all this delusion and blundering: body and will hath it there become. A hundred times hitherto hath spirit as well as virtue attempted and erred. Yea, an attempt hath man been. 
Alas, much ignorance and error hath become embodied in us! Not only the rationality of millenniums--also their madness, breaketh out in us. Dangerous is it to be an heir. Still fight we step by step with the giant Chance, and over all mankind hath hitherto ruled nonsense, the lack-of-sense. Let your spirit and your virtue be devoted to the sense of the earth, my brethren: let the value of everything be determined anew by you! Therefore shall ye be fighters! Therefore shall ye be creators! Intelligently doth the body purify itself; attempting with intelligence it exalteth itself; to the discerners all impulses sanctify themselves; to the exalted the soul becometh joyful. Physician, heal thyself: then wilt thou also heal thy patient. Let it be his best cure to see with his eyes him who maketh himself whole. A thousand paths are there which have never yet been trodden; a thousand salubrities and hidden islands of life. Unexhausted and undiscovered is still man and man's world. Awake and hearken, ye lonesome ones! From the future come winds with stealthy pinions, and to fine ears good tidings are proclaimed. Ye lonesome ones of to-day, ye seceding ones, ye shall one day be a people: out of you who have chosen yourselves, shall a chosen people arise:--and out of it the Superman. Verily, a place of healing shall the earth become! And already is a new odour diffused around it, a salvation-bringing odour--and a new hope! 3. When Zarathustra had spoken these words, he paused, like one who had not said his last word; and long did he balance the staff doubtfully in his hand. At last he spake thus--and his voice had changed: I now go alone, my disciples! Ye also now go away, and alone! So will I have it. Verily, I advise you: depart from me, and guard yourselves against Zarathustra! And better still: be ashamed of him! Perhaps he hath deceived you. The man of knowledge must be able not only to love his enemies, but also to hate his friends. 
One requiteth a teacher badly if one remain merely a scholar. And why will ye not pluck at my wreath? Ye venerate me; but what if your veneration should some day collapse? Take heed lest a statue crush you! Ye say, ye believe in Zarathustra? But of what account is Zarathustra! Ye are my believers: but of what account are all believers! Ye had not yet sought yourselves: then did ye find me. So do all believers; therefore all belief is of so little account. Now do I bid you lose me and find yourselves; and only when ye have all denied me, will I return unto you. Verily, with other eyes, my brethren, shall I then seek my lost ones; with another love shall I then love you. And once again shall ye have become friends unto me, and children of one hope: then will I be with you for the third time, to celebrate the great noontide with you. And it is the great noontide, when man is in the middle of his course between animal and Superman, and celebrateth his advance to the evening as his highest hope: for it is the advance to a new morning. At such time will the down-goer bless himself, that he should be an over-goer; and the sun of his knowledge will be at noontide. "DEAD ARE ALL THE GODS: NOW DO WE DESIRE THE SUPERMAN TO LIVE."--Let this be our final will at the great noontide!-- Thus spake Zarathustra. THUS SPAKE ZARATHUSTRA. SECOND PART. "--and only when ye have all denied me, will I return unto you. Verily, with other eyes, my brethren, shall I then seek my lost ones; with another love shall I then love you."--ZARATHUSTRA, I., "The Bestowing Virtue." XXIII. THE CHILD WITH THE MIRROR. After this Zarathustra returned again into the mountains to the solitude of his cave, and withdrew himself from men, waiting like a sower who hath scattered his seed. His soul, however, became impatient and full of longing for those whom he loved: because he had still much to give them. For this is hardest of all: to close the open hand out of love, and keep modest as a giver. 
Thus passed with the lonesome one months and years; his wisdom meanwhile increased, and caused him pain by its abundance. One morning, however, he awoke ere the rosy dawn, and having meditated long on his couch, at last spake thus to his heart: Why did I startle in my dream, so that I awoke? Did not a child come to me, carrying a mirror? "O Zarathustra"--said the child unto me--"look at thyself in the mirror!" But when I looked into the mirror, I shrieked, and my heart throbbed: for not myself did I see therein, but a devil's grimace and derision. Verily, all too well do I understand the dream's portent and monition: my DOCTRINE is in danger; tares want to be called wheat! Mine enemies have grown powerful and have disfigured the likeness of my doctrine, so that my dearest ones have to blush for the gifts that I gave them. Lost are my friends; the hour hath come for me to seek my lost ones!-- With these words Zarathustra started up, not however like a person in anguish seeking relief, but rather like a seer and a singer whom the spirit inspireth. With amazement did his eagle and serpent gaze upon him: for a coming bliss overspread his countenance like the rosy dawn. What hath happened unto me, mine animals?--said Zarathustra. Am I not transformed? Hath not bliss come unto me like a whirlwind? Foolish is my happiness, and foolish things will it speak: it is still too young--so have patience with it! Wounded am I by my happiness: all sufferers shall be physicians unto me! To my friends can I again go down, and also to mine enemies! Zarathustra can again speak and bestow, and show his best love to his loved ones! My impatient love overfloweth in streams,--down towards sunrise and sunset. Out of silent mountains and storms of affliction, rusheth my soul into the valleys. Too long have I longed and looked into the distance. Too long hath solitude possessed me: thus have I unlearned to keep silence. 
Utterance have I become altogether, and the brawling of a brook from high rocks: downward into the valleys will I hurl my speech. And let the stream of my love sweep into unfrequented channels! How should a stream not finally find its way to the sea! Forsooth, there is a lake in me, sequestered and self-sufficing; but the stream of my love beareth this along with it, down--to the sea! New paths do I tread, a new speech cometh unto me; tired have I become--like all creators--of the old tongues. No longer will my spirit walk on worn-out soles. Too slowly runneth all speaking for me:--into thy chariot, O storm, do I leap! And even thee will I whip with my spite! Like a cry and an huzza will I traverse wide seas, till I find the Happy Isles where my friends sojourn;-- And mine enemies amongst them! How I now love every one unto whom I may but speak! Even mine enemies pertain to my bliss. And when I want to mount my wildest horse, then doth my spear always help me up best: it is my foot's ever ready servant:-- The spear which I hurl at mine enemies! How grateful am I to mine enemies that I may at last hurl it! Too great hath been the tension of my cloud: 'twixt laughters of lightnings will I cast hail-showers into the depths. Violently will my breast then heave; violently will it blow its storm over the mountains: thus cometh its assuagement. Verily, like a storm cometh my happiness, and my freedom! But mine enemies shall think that THE EVIL ONE roareth over their heads. Yea, ye also, my friends, will be alarmed by my wild wisdom; and perhaps ye will flee therefrom, along with mine enemies. Ah, that I knew how to lure you back with shepherds' flutes! Ah, that my lioness wisdom would learn to roar softly! And much have we already learned with one another! My wild wisdom became pregnant on the lonesome mountains; on the rough stones did she bear the youngest of her young. 
Now runneth she foolishly in the arid wilderness, and seeketh and seeketh the soft sward--mine old, wild wisdom! On the soft sward of your hearts, my friends!--on your love, would she fain couch her dearest one!-- Thus spake Zarathustra. XXIV. IN THE HAPPY ISLES. The figs fall from the trees, they are good and sweet; and in falling the red skins of them break. A north wind am I to ripe figs. Thus, like figs, do these doctrines fall for you, my friends: imbibe now their juice and their sweet substance! It is autumn all around, and clear sky, and afternoon. Lo, what fullness is around us! And out of the midst of superabundance, it is delightful to look out upon distant seas. Once did people say God, when they looked out upon distant seas; now, however, have I taught you to say, Superman. God is a conjecture: but I do not wish your conjecturing to reach beyond your creating will. Could ye CREATE a God?--Then, I pray you, be silent about all Gods! But ye could well create the Superman. Not perhaps ye yourselves, my brethren! But into fathers and forefathers of the Superman could ye transform yourselves: and let that be your best creating!-- God is a conjecture: but I should like your conjecturing restricted to the conceivable. Could ye CONCEIVE a God?--But let this mean Will to Truth unto you, that everything be transformed into the humanly conceivable, the humanly visible, the humanly sensible! Your own discernment shall ye follow out to the end! And what ye have called the world shall but be created by you: your reason, your likeness, your will, your love, shall it itself become! And verily, for your bliss, ye discerning ones! And how would ye endure life without that hope, ye discerning ones? Neither in the inconceivable could ye have been born, nor in the irrational. But that I may reveal my heart entirely unto you, my friends: IF there were gods, how could I endure it to be no God! THEREFORE there are no Gods. 
Yea, I have drawn the conclusion; now, however, doth it draw me.-- God is a conjecture: but who could drink all the bitterness of this conjecture without dying? Shall his faith be taken from the creating one, and from the eagle his flights into eagle-heights? God is a thought--it maketh all the straight crooked, and all that standeth reel. What? Time would be gone, and all the perishable would be but a lie? To think this is giddiness and vertigo to human limbs, and even vomiting to the stomach: verily, the reeling sickness do I call it, to conjecture such a thing. Evil do I call it and misanthropic: all that teaching about the one, and the plenum, and the unmoved, and the sufficient, and the imperishable! All the imperishable--that's but a simile, and the poets lie too much.-- But of time and of becoming shall the best similes speak: a praise shall they be, and a justification of all perishableness! Creating--that is the great salvation from suffering, and life's alleviation. But for the creator to appear, suffering itself is needed, and much transformation. Yea, much bitter dying must there be in your life, ye creators! Thus are ye advocates and justifiers of all perishableness. For the creator himself to be the new-born child, he must also be willing to be the child-bearer, and endure the pangs of the child-bearer. Verily, through a hundred souls went I my way, and through a hundred cradles and birth-throes. Many a farewell have I taken; I know the heart-breaking last hours. But so willeth it my creating Will, my fate. Or, to tell you it more candidly: just such a fate--willeth my Will. All FEELING suffereth in me, and is in prison: but my WILLING ever cometh to me as mine emancipator and comforter. Willing emancipateth: that is the true doctrine of will and emancipation-- so teacheth you Zarathustra. No longer willing, and no longer valuing, and no longer creating! Ah, that that great debility may ever be far from me! 
And also in discerning do I feel only my will's procreating and evolving delight; and if there be innocence in my knowledge, it is because there is will to procreation in it. Away from God and Gods did this will allure me; what would there be to create if there were--Gods! But to man doth it ever impel me anew, my fervent creative will; thus impelleth it the hammer to the stone. Ah, ye men, within the stone slumbereth an image for me, the image of my visions! Ah, that it should slumber in the hardest, ugliest stone! Now rageth my hammer ruthlessly against its prison. From the stone fly the fragments: what's that to me? I will complete it: for a shadow came unto me--the stillest and lightest of all things once came unto me! The beauty of the Superman came unto me as a shadow. Ah, my brethren! Of what account now are--the Gods to me!-- Thus spake Zarathustra. XXV. THE PITIFUL. My friends, there hath arisen a satire on your friend: "Behold Zarathustra! Walketh he not amongst us as if amongst animals?" But it is better said in this wise: "The discerning one walketh amongst men AS amongst animals." Man himself is to the discerning one: the animal with red cheeks. How hath that happened unto him? Is it not because he hath had to be ashamed too oft? O my friends! Thus speaketh the discerning one: shame, shame, shame--that is the history of man! And on that account doth the noble one enjoin upon himself not to abash: bashfulness doth he enjoin on himself in presence of all sufferers. Verily, I like them not, the merciful ones, whose bliss is in their pity: too destitute are they of bashfulness. If I must be pitiful, I dislike to be called so; and if I be so, it is preferably at a distance. Preferably also do I shroud my head, and flee, before being recognised: and thus do I bid you do, my friends! May my destiny ever lead unafflicted ones like you across my path, and those with whom I MAY have hope and repast and honey in common! 
Verily, I have done this and that for the afflicted: but something better did I always seem to do when I had learned to enjoy myself better. Since humanity came into being, man hath enjoyed himself too little: that alone, my brethren, is our original sin! And when we learn better to enjoy ourselves, then do we unlearn best to give pain unto others, and to contrive pain. Therefore do I wash the hand that hath helped the sufferer; therefore do I wipe also my soul. For in seeing the sufferer suffering--thereof was I ashamed on account of his shame; and in helping him, sorely did I wound his pride. Great obligations do not make grateful, but revengeful; and when a small kindness is not forgotten, it becometh a gnawing worm. "Be shy in accepting! Distinguish by accepting!"--thus do I advise those who have naught to bestow. I, however, am a bestower: willingly do I bestow as friend to friends. Strangers, however, and the poor, may pluck for themselves the fruit from my tree: thus doth it cause less shame. Beggars, however, one should entirely do away with! Verily, it annoyeth one to give unto them, and it annoyeth one not to give unto them. And likewise sinners and bad consciences! Believe me, my friends: the sting of conscience teacheth one to sting. The worst things, however, are the petty thoughts. Verily, better to have done evilly than to have thought pettily! To be sure, ye say: "The delight in petty evils spareth one many a great evil deed." But here one should not wish to be sparing. Like a boil is the evil deed: it itcheth and irritateth and breaketh forth--it speaketh honourably. "Behold, I am disease," saith the evil deed: that is its honourableness. But like infection is the petty thought: it creepeth and hideth, and wanteth to be nowhere--until the whole body is decayed and withered by the petty infection. To him however, who is possessed of a devil, I would whisper this word in the ear: "Better for thee to rear up thy devil! 
Even for thee there is still a path to greatness!"-- Ah, my brethren! One knoweth a little too much about every one! And many a one becometh transparent to us, but still we can by no means penetrate him. It is difficult to live among men because silence is so difficult. And not to him who is offensive to us are we most unfair, but to him who doth not concern us at all. If, however, thou hast a suffering friend, then be a resting-place for his suffering; like a hard bed, however, a camp-bed: thus wilt thou serve him best. And if a friend doeth thee wrong, then say: "I forgive thee what thou hast done unto me; that thou hast done it unto THYSELF, however--how could I forgive that!" Thus speaketh all great love: it surpasseth even forgiveness and pity. One should hold fast one's heart; for when one letteth it go, how quickly doth one's head run away! Ah, where in the world have there been greater follies than with the pitiful? And what in the world hath caused more suffering than the follies of the pitiful? Woe unto all loving ones who have not an elevation which is above their pity! Thus spake the devil unto me, once on a time: "Even God hath his hell: it is his love for man." And lately, did I hear him say these words: "God is dead: of his pity for man hath God died."-- So be ye warned against pity: FROM THENCE there yet cometh unto men a heavy cloud! Verily, I understand weather-signs! But attend also to this word: All great love is above all its pity: for it seeketh--to create what is loved! "Myself do I offer unto my love, AND MY NEIGHBOUR AS MYSELF"--such is the language of all creators. All creators, however, are hard.-- Thus spake Zarathustra. XXVI. THE PRIESTS. And one day Zarathustra made a sign to his disciples, and spake these words unto them: "Here are priests: but although they are mine enemies, pass them quietly and with sleeping swords! Even among them there are heroes; many of them have suffered too much--: so they want to make others suffer. 
Bad enemies are they: nothing is more revengeful than their meekness. And readily doth he soil himself who toucheth them. But my blood is related to theirs; and I want withal to see my blood honoured in theirs."-- And when they had passed, a pain attacked Zarathustra; but not long had he struggled with the pain, when he began to speak thus: It moveth my heart for those priests. They also go against my taste; but that is the smallest matter unto me, since I am among men. But I suffer and have suffered with them: prisoners are they unto me, and stigmatised ones. He whom they call Saviour put them in fetters:-- In fetters of false values and fatuous words! Oh, that some one would save them from their Saviour! On an isle they once thought they had landed, when the sea tossed them about; but behold, it was a slumbering monster! False values and fatuous words: these are the worst monsters for mortals-- long slumbereth and waiteth the fate that is in them. But at last it cometh and awaketh and devoureth and engulfeth whatever hath built tabernacles upon it. Oh, just look at those tabernacles which those priests have built themselves! Churches, they call their sweet-smelling caves! Oh, that falsified light, that mustified air! Where the soul--may not fly aloft to its height! But so enjoineth their belief: "On your knees, up the stair, ye sinners!" Verily, rather would I see a shameless one than the distorted eyes of their shame and devotion! Who created for themselves such caves and penitence-stairs? Was it not those who sought to conceal themselves, and were ashamed under the clear sky? And only when the clear sky looketh again through ruined roofs, and down upon grass and red poppies on ruined walls--will I again turn my heart to the seats of this God. They called God that which opposed and afflicted them: and verily, there was much hero-spirit in their worship! And they knew not how to love their God otherwise than by nailing men to the cross! 
As corpses they thought to live; in black draped they their corpses; even in their talk do I still feel the evil flavour of charnel-houses. And he who liveth nigh unto them liveth nigh unto black pools, wherein the toad singeth his song with sweet gravity. Better songs would they have to sing, for me to believe in their Saviour: more like saved ones would his disciples have to appear unto me! Naked, would I like to see them: for beauty alone should preach penitence. But whom would that disguised affliction convince! Verily, their Saviours themselves came not from freedom and freedom's seventh heaven! Verily, they themselves never trod the carpets of knowledge! Of defects did the spirit of those Saviours consist; but into every defect had they put their illusion, their stop-gap, which they called God. In their pity was their spirit drowned; and when they swelled and o'erswelled with pity, there always floated to the surface a great folly. Eagerly and with shouts drove they their flock over their foot-bridge; as if there were but one foot-bridge to the future! Verily, those shepherds also were still of the flock! Small spirits and spacious souls had those shepherds: but, my brethren, what small domains have even the most spacious souls hitherto been! Characters of blood did they write on the way they went, and their folly taught that truth is proved by blood. But blood is the very worst witness to truth; blood tainteth the purest teaching, and turneth it into delusion and hatred of heart. And when a person goeth through fire for his teaching--what doth that prove! It is more, verily, when out of one's own burning cometh one's own teaching! Sultry heart and cold head; where these meet, there ariseth the blusterer, the "Saviour." Greater ones, verily, have there been, and higher-born ones, than those whom the people call Saviours, those rapturous blusterers! 
And by still greater ones than any of the Saviours must ye be saved, my brethren, if ye would find the way to freedom! Never yet hath there been a Superman. Naked have I seen both of them, the greatest man and the smallest man:-- All-too-similar are they still to each other. Verily, even the greatest found I--all-too-human!-- Thus spake Zarathustra. XXVII. THE VIRTUOUS. With thunder and heavenly fireworks must one speak to indolent and somnolent senses. But beauty's voice speaketh gently: it appealeth only to the most awakened souls. Gently vibrated and laughed unto me to-day my buckler; it was beauty's holy laughing and thrilling. At you, ye virtuous ones, laughed my beauty to-day. And thus came its voice unto me: "They want--to be paid besides!" Ye want to be paid besides, ye virtuous ones! Ye want reward for virtue, and heaven for earth, and eternity for your to-day? And now ye upbraid me for teaching that there is no reward-giver, nor paymaster? And verily, I do not even teach that virtue is its own reward. Ah! this is my sorrow: into the basis of things have reward and punishment been insinuated--and now even into the basis of your souls, ye virtuous ones! But like the snout of the boar shall my word grub up the basis of your souls; a ploughshare will I be called by you. All the secrets of your heart shall be brought to light; and when ye lie in the sun, grubbed up and broken, then will also your falsehood be separated from your truth. For this is your truth: ye are TOO PURE for the filth of the words: vengeance, punishment, recompense, retribution. Ye love your virtue as a mother loveth her child; but when did one hear of a mother wanting to be paid for her love? It is your dearest Self, your virtue. The ring's thirst is in you: to reach itself again struggleth every ring, and turneth itself. And like the star that goeth out, so is every work of your virtue: ever is its light on its way and travelling--and when will it cease to be on its way? 
Thus is the light of your virtue still on its way, even when its work is done. Be it forgotten and dead, still its ray of light liveth and travelleth. That your virtue is your Self, and not an outward thing, a skin, or a cloak: that is the truth from the basis of your souls, ye virtuous ones!-- But sure enough there are those to whom virtue meaneth writhing under the lash: and ye have hearkened too much unto their crying! And others are there who call virtue the slothfulness of their vices; and when once their hatred and jealousy relax the limbs, their "justice" becometh lively and rubbeth its sleepy eyes. And others are there who are drawn downwards: their devils draw them. But the more they sink, the more ardently gloweth their eye, and the longing for their God. Ah! their crying also hath reached your ears, ye virtuous ones: "What I am NOT, that, that is God to me, and virtue!" And others are there who go along heavily and creakingly, like carts taking stones downhill: they talk much of dignity and virtue--their drag they call virtue! And others are there who are like eight-day clocks when wound up; they tick, and want people to call ticking--virtue. Verily, in those have I mine amusement: wherever I find such clocks I shall wind them up with my mockery, and they shall even whirr thereby! And others are proud of their modicum of righteousness, and for the sake of it do violence to all things: so that the world is drowned in their unrighteousness. Ah! how ineptly cometh the word "virtue" out of their mouth! And when they say: "I am just," it always soundeth like: "I am just--revenged!" With their virtues they want to scratch out the eyes of their enemies; and they elevate themselves only that they may lower others. And again there are those who sit in their swamp, and speak thus from among the bulrushes: "Virtue--that is to sit quietly in the swamp. 
We bite no one, and go out of the way of him who would bite; and in all matters we have the opinion that is given us." And again there are those who love attitudes, and think that virtue is a sort of attitude. Their knees continually adore, and their hands are eulogies of virtue, but their heart knoweth naught thereof. And again there are those who regard it as virtue to say: "Virtue is necessary"; but after all they believe only that policemen are necessary. And many a one who cannot see men's loftiness, calleth it virtue to see their baseness far too well: thus calleth he his evil eye virtue.-- And some want to be edified and raised up, and call it virtue: and others want to be cast down,--and likewise call it virtue. And thus do almost all think that they participate in virtue; and at least every one claimeth to be an authority on "good" and "evil." But Zarathustra came not to say unto all those liars and fools: "What do YE know of virtue! What COULD ye know of virtue!"-- But that ye, my friends, might become weary of the old words which ye have learned from the fools and liars: That ye might become weary of the words "reward," "retribution," "punishment," "righteous vengeance."-- That ye might become weary of saying: "That an action is good is because it is unselfish." Ah! my friends! That YOUR very Self be in your action, as the mother is in the child: let that be YOUR formula of virtue! Verily, I have taken from you a hundred formulae and your virtue's favourite playthings; and now ye upbraid me, as children upbraid. They played by the sea--then came there a wave and swept their playthings into the deep: and now do they cry. But the same wave shall bring them new playthings, and spread before them new speckled shells! Thus will they be comforted; and like them shall ye also, my friends, have your comforting--and new speckled shells!-- Thus spake Zarathustra. XXVIII. THE RABBLE. 
Life is a well of delight; but where the rabble also drink, there all fountains are poisoned. To everything cleanly am I well disposed; but I hate to see the grinning mouths and the thirst of the unclean. They cast their eye down into the fountain: and now glanceth up to me their odious smile out of the fountain. The holy water have they poisoned with their lustfulness; and when they called their filthy dreams delight, then poisoned they also the words. Indignant becometh the flame when they put their damp hearts to the fire; the spirit itself bubbleth and smoketh when the rabble approach the fire. Mawkish and over-mellow becometh the fruit in their hands: unsteady, and withered at the top, doth their look make the fruit-tree. And many a one who hath turned away from life, hath only turned away from the rabble: he hated to share with them fountain, flame, and fruit. And many a one who hath gone into the wilderness and suffered thirst with beasts of prey, disliked only to sit at the cistern with filthy camel-drivers. And many a one who hath come along as a destroyer, and as a hailstorm to all cornfields, wanted merely to put his foot into the jaws of the rabble, and thus stop their throat. And it is not the mouthful which hath most choked me, to know that life itself requireth enmity and death and torture-crosses:-- But I asked once, and suffocated almost with my question: What? is the rabble also NECESSARY for life? Are poisoned fountains necessary, and stinking fires, and filthy dreams, and maggots in the bread of life? Not my hatred, but my loathing, gnawed hungrily at my life! Ah, ofttimes became I weary of spirit, when I found even the rabble spiritual! And on the rulers turned I my back, when I saw what they now call ruling: to traffic and bargain for power--with the rabble! Amongst peoples of a strange language did I dwell, with stopped ears: so that the language of their trafficking might remain strange unto me, and their bargaining for power. 
And holding my nose, I went morosely through all yesterdays and to-days: verily, badly smell all yesterdays and to-days of the scribbling rabble! Like a cripple become deaf, and blind, and dumb--thus have I lived long; that I might not live with the power-rabble, the scribe-rabble, and the pleasure-rabble. Toilsomely did my spirit mount stairs, and cautiously; alms of delight were its refreshment; on the staff did life creep along with the blind one. What hath happened unto me? How have I freed myself from loathing? Who hath rejuvenated mine eye? How have I flown to the height where no rabble any longer sit at the wells? Did my loathing itself create for me wings and fountain-divining powers? Verily, to the loftiest height had I to fly, to find again the well of delight! Oh, I have found it, my brethren! Here on the loftiest height bubbleth up for me the well of delight! And there is a life at whose waters none of the rabble drink with me! Almost too violently dost thou flow for me, thou fountain of delight! And often emptiest thou the goblet again, in wanting to fill it! And yet must I learn to approach thee more modestly: far too violently doth my heart still flow towards thee:-- My heart on which my summer burneth, my short, hot, melancholy, over-happy summer: how my summer heart longeth for thy coolness! Past, the lingering distress of my spring! Past, the wickedness of my snowflakes in June! Summer have I become entirely, and summer-noontide! A summer on the loftiest height, with cold fountains and blissful stillness: oh, come, my friends, that the stillness may become more blissful! For this is OUR height and our home: too high and steep do we here dwell for all uncleanly ones and their thirst. Cast but your pure eyes into the well of my delight, my friends! How could it become turbid thereby! It shall laugh back to you with ITS purity. On the tree of the future build we our nest; eagles shall bring us lone ones food in their beaks! 
Verily, no food of which the impure could be fellow-partakers! Fire, would they think they devoured, and burn their mouths! Verily, no abodes do we here keep ready for the impure! An ice-cave to their bodies would our happiness be, and to their spirits! And as strong winds will we live above them, neighbours to the eagles, neighbours to the snow, neighbours to the sun: thus live the strong winds. And like a wind will I one day blow amongst them, and with my spirit, take the breath from their spirit: thus willeth my future. Verily, a strong wind is Zarathustra to all low places; and this counsel counselleth he to his enemies, and to whatever spitteth and speweth: "Take care not to spit AGAINST the wind!"-- Thus spake Zarathustra. XXIX. THE TARANTULAS. Lo, this is the tarantula's den! Would'st thou see the tarantula itself? Here hangeth its web: touch this, so that it may tremble. There cometh the tarantula willingly: Welcome, tarantula! Black on thy back is thy triangle and symbol; and I know also what is in thy soul. Revenge is in thy soul: wherever thou bitest, there ariseth black scab; with revenge, thy poison maketh the soul giddy! Thus do I speak unto you in parable, ye who make the soul giddy, ye preachers of EQUALITY! Tarantulas are ye unto me, and secretly revengeful ones! But I will soon bring your hiding-places to the light: therefore do I laugh in your face my laughter of the height. Therefore do I tear at your web, that your rage may lure you out of your den of lies, and that your revenge may leap forth from behind your word "justice." Because, FOR MAN TO BE REDEEMED FROM REVENGE--that is for me the bridge to the highest hope, and a rainbow after long storms. Otherwise, however, would the tarantulas have it. "Let it be very justice for the world to become full of the storms of our vengeance"--thus do they talk to one another. "Vengeance will we use, and insult, against all who are not like us"--thus do the tarantula-hearts pledge themselves. 
"And 'Will to Equality'--that itself shall henceforth be the name of virtue; and against all that hath power will we raise an outcry!" Ye preachers of equality, the tyrant-frenzy of impotence crieth thus in you for "equality": your most secret tyrant-longings disguise themselves thus in virtue-words! Fretted conceit and suppressed envy--perhaps your fathers' conceit and envy: in you break they forth as flame and frenzy of vengeance. What the father hath hid cometh out in the son; and oft have I found in the son the father's revealed secret. Inspired ones they resemble: but it is not the heart that inspireth them-- but vengeance. And when they become subtle and cold, it is not spirit, but envy, that maketh them so. Their jealousy leadeth them also into thinkers' paths; and this is the sign of their jealousy--they always go too far: so that their fatigue hath at last to go to sleep on the snow. In all their lamentations soundeth vengeance, in all their eulogies is maleficence; and being judge seemeth to them bliss. But thus do I counsel you, my friends: distrust all in whom the impulse to punish is powerful! They are people of bad race and lineage; out of their countenances peer the hangman and the sleuth-hound. Distrust all those who talk much of their justice! Verily, in their souls not only honey is lacking. And when they call themselves "the good and just," forget not, that for them to be Pharisees, nothing is lacking but--power! My friends, I will not be mixed up and confounded with others. There are those who preach my doctrine of life, and are at the same time preachers of equality, and tarantulas. That they speak in favour of life, though they sit in their den, these poison-spiders, and withdrawn from life--is because they would thereby do injury. To those would they thereby do injury who have power at present: for with those the preaching of death is still most at home. 
Were it otherwise, then would the tarantulas teach otherwise: and they themselves were formerly the best world-maligners and heretic-burners. With these preachers of equality will I not be mixed up and confounded. For thus speaketh justice UNTO ME: "Men are not equal." And neither shall they become so! What would be my love to the Superman, if I spake otherwise? On a thousand bridges and piers shall they throng to the future, and always shall there be more war and inequality among them: thus doth my great love make me speak! Inventors of figures and phantoms shall they be in their hostilities; and with those figures and phantoms shall they yet fight with each other the supreme fight! Good and evil, and rich and poor, and high and low, and all names of values: weapons shall they be, and sounding signs, that life must again and again surpass itself! Aloft will it build itself with columns and stairs--life itself: into remote distances would it gaze, and out towards blissful beauties-- THEREFORE doth it require elevation! And because it requireth elevation, therefore doth it require steps, and variance of steps and climbers! To rise striveth life, and in rising to surpass itself. And just behold, my friends! Here where the tarantula's den is, riseth aloft an ancient temple's ruins--just behold it with enlightened eyes! Verily, he who here towered aloft his thoughts in stone, knew as well as the wisest ones about the secret of life! That there is struggle and inequality even in beauty, and war for power and supremacy: that doth he here teach us in the plainest parable. How divinely do vault and arch here contrast in the struggle: how with light and shade they strive against each other, the divinely striving ones.-- Thus, steadfast and beautiful, let us also be enemies, my friends! Divinely will we strive AGAINST one another!-- Alas! There hath the tarantula bit me myself, mine old enemy! Divinely steadfast and beautiful, it hath bit me on the finger! 
"Punishment must there be, and justice"--so thinketh it: "not gratuitously shall he here sing songs in honour of enmity!" Yea, it hath revenged itself! And alas! now will it make my soul also dizzy with revenge! That I may NOT turn dizzy, however, bind me fast, my friends, to this pillar! Rather will I be a pillar-saint than a whirl of vengeance! Verily, no cyclone or whirlwind is Zarathustra: and if he be a dancer, he is not at all a tarantula-dancer!-- Thus spake Zarathustra. XXX. THE FAMOUS WISE ONES. The people have ye served and the people's superstition--NOT the truth!-- all ye famous wise ones! And just on that account did they pay you reverence. And on that account also did they tolerate your unbelief, because it was a pleasantry and a by-path for the people. Thus doth the master give free scope to his slaves, and even enjoyeth their presumptuousness. But he who is hated by the people, as the wolf by the dogs--is the free spirit, the enemy of fetters, the non-adorer, the dweller in the woods. To hunt him out of his lair--that was always called "sense of right" by the people: on him do they still hound their sharpest-toothed dogs. "For there the truth is, where the people are! Woe, woe to the seeking ones!"--thus hath it echoed through all time. Your people would ye justify in their reverence: that called ye "Will to Truth," ye famous wise ones! And your heart hath always said to itself: "From the people have I come: from thence came to me also the voice of God." Stiff-necked and artful, like the ass, have ye always been, as the advocates of the people. And many a powerful one who wanted to run well with the people, hath harnessed in front of his horses--a donkey, a famous wise man. And now, ye famous wise ones, I would have you finally throw off entirely the skin of the lion! The skin of the beast of prey, the speckled skin, and the dishevelled locks of the investigator, the searcher, and the conqueror! Ah! 
for me to learn to believe in your "conscientiousness," ye would first have to break your venerating will. Conscientious--so call I him who goeth into God-forsaken wildernesses, and hath broken his venerating heart. In the yellow sands and burnt by the sun, he doubtless peereth thirstily at the isles rich in fountains, where life reposeth under shady trees. But his thirst doth not persuade him to become like those comfortable ones: for where there are oases, there are also idols. Hungry, fierce, lonesome, God-forsaken: so doth the lion-will wish itself. Free from the happiness of slaves, redeemed from Deities and adorations, fearless and fear-inspiring, grand and lonesome: so is the will of the conscientious. In the wilderness have ever dwelt the conscientious, the free spirits, as lords of the wilderness; but in the cities dwell the well-foddered, famous wise ones--the draught-beasts. For, always, do they draw, as asses--the PEOPLE'S carts! Not that I on that account upbraid them: but serving ones do they remain, and harnessed ones, even though they glitter in golden harness. And often have they been good servants and worthy of their hire. For thus saith virtue: "If thou must be a servant, seek him unto whom thy service is most useful! The spirit and virtue of thy master shall advance by thou being his servant: thus wilt thou thyself advance with his spirit and virtue!" And verily, ye famous wise ones, ye servants of the people! Ye yourselves have advanced with the people's spirit and virtue--and the people by you! To your honour do I say it! But the people ye remain for me, even with your virtues, the people with purblind eyes--the people who know not what SPIRIT is! Spirit is life which itself cutteth into life: by its own torture doth it increase its own knowledge,--did ye know that before? And the spirit's happiness is this: to be anointed and consecrated with tears as a sacrificial victim,--did ye know that before? 
And the blindness of the blind one, and his seeking and groping, shall yet testify to the power of the sun into which he hath gazed,--did ye know that before? And with mountains shall the discerning one learn to BUILD! It is a small thing for the spirit to remove mountains,--did ye know that before? Ye know only the sparks of the spirit: but ye do not see the anvil which it is, and the cruelty of its hammer! Verily, ye know not the spirit's pride! But still less could ye endure the spirit's humility, should it ever want to speak! And never yet could ye cast your spirit into a pit of snow: ye are not hot enough for that! Thus are ye unaware, also, of the delight of its coldness. In all respects, however, ye make too familiar with the spirit; and out of wisdom have ye often made an almshouse and a hospital for bad poets. Ye are not eagles: thus have ye never experienced the happiness of the alarm of the spirit. And he who is not a bird should not camp above abysses. Ye seem to me lukewarm ones: but coldly floweth all deep knowledge. Ice-cold are the innermost wells of the spirit: a refreshment to hot hands and handlers. Respectable do ye there stand, and stiff, and with straight backs, ye famous wise ones!--no strong wind or will impelleth you. Have ye ne'er seen a sail crossing the sea, rounded and inflated, and trembling with the violence of the wind? Like the sail trembling with the violence of the spirit, doth my wisdom cross the sea--my wild wisdom! But ye servants of the people, ye famous wise ones--how COULD ye go with me!-- Thus spake Zarathustra. XXXI. THE NIGHT-SONG. 'Tis night: now do all gushing fountains speak louder. And my soul also is a gushing fountain. 'Tis night: now only do all songs of the loving ones awake. And my soul also is the song of a loving one. Something unappeased, unappeasable, is within me; it longeth to find expression. A craving for love is within me, which speaketh itself the language of love. Light am I: ah, that I were night! 
But it is my lonesomeness to be begirt with light! Ah, that I were dark and nightly! How would I suck at the breasts of light! And you yourselves would I bless, ye twinkling starlets and glow-worms aloft!--and would rejoice in the gifts of your light. But I live in mine own light, I drink again into myself the flames that break forth from me. I know not the happiness of the receiver; and oft have I dreamt that stealing must be more blessed than receiving. It is my poverty that my hand never ceaseth bestowing; it is mine envy that I see waiting eyes and the brightened nights of longing. Oh, the misery of all bestowers! Oh, the darkening of my sun! Oh, the craving to crave! Oh, the violent hunger in satiety! They take from me: but do I yet touch their soul? There is a gap 'twixt giving and receiving; and the smallest gap hath finally to be bridged over. A hunger ariseth out of my beauty: I should like to injure those I illumine; I should like to rob those I have gifted:--thus do I hunger for wickedness. Withdrawing my hand when another hand already stretcheth out to it; hesitating like the cascade, which hesitateth even in its leap:--thus do I hunger for wickedness! Such revenge doth mine abundance think of: such mischief welleth out of my lonesomeness. My happiness in bestowing died in bestowing; my virtue became weary of itself by its abundance! He who ever bestoweth is in danger of losing his shame; to him who ever dispenseth, the hand and heart become callous by very dispensing. Mine eye no longer overfloweth for the shame of suppliants; my hand hath become too hard for the trembling of filled hands. Whence have gone the tears of mine eye, and the down of my heart? Oh, the lonesomeness of all bestowers! Oh, the silence of all shining ones! Many suns circle in desert space: to all that is dark do they speak with their light--but to me they are silent. Oh, this is the hostility of light to the shining one: unpityingly doth it pursue its course. 
Unfair to the shining one in its innermost heart, cold to the suns:--thus travelleth every sun. Like a storm do the suns pursue their courses: that is their travelling. Their inexorable will do they follow: that is their coldness. Oh, ye only is it, ye dark, nightly ones, that extract warmth from the shining ones! Oh, ye only drink milk and refreshment from the light's udders! Ah, there is ice around me; my hand burneth with the iciness! Ah, there is thirst in me; it panteth after your thirst! 'Tis night: alas, that I have to be light! And thirst for the nightly! And lonesomeness! 'Tis night: now doth my longing break forth in me as a fountain,--for speech do I long. 'Tis night: now do all gushing fountains speak louder. And my soul also is a gushing fountain. 'Tis night: now do all songs of loving ones awake. And my soul also is the song of a loving one.-- Thus sang Zarathustra. XXXII. THE DANCE-SONG. One evening went Zarathustra and his disciples through the forest; and when he sought for a well, lo, he lighted upon a green meadow peacefully surrounded with trees and bushes, where maidens were dancing together. As soon as the maidens recognised Zarathustra, they ceased dancing; Zarathustra, however, approached them with friendly mien and spake these words: Cease not your dancing, ye lovely maidens! No game-spoiler hath come to you with evil eye, no enemy of maidens. God's advocate am I with the devil: he, however, is the spirit of gravity. How could I, ye light-footed ones, be hostile to divine dances? Or to maidens' feet with fine ankles? To be sure, I am a forest, and a night of dark trees: but he who is not afraid of my darkness, will find banks full of roses under my cypresses. And even the little God may he find, who is dearest to maidens: beside the well lieth he quietly, with closed eyes. Verily, in broad daylight did he fall asleep, the sluggard! Had he perhaps chased butterflies too much? 
Upbraid me not, ye beautiful dancers, when I chasten the little God somewhat! He will cry, certainly, and weep--but he is laughable even when weeping! And with tears in his eyes shall he ask you for a dance; and I myself will sing a song to his dance: A dance-song and satire on the spirit of gravity, my supremest, powerfulest devil, who is said to be "lord of the world."-- And this is the song that Zarathustra sang when Cupid and the maidens danced together: Of late did I gaze into thine eye, O Life! And into the unfathomable did I there seem to sink. But thou pulledst me out with a golden angle; derisively didst thou laugh when I called thee unfathomable. "Such is the language of all fish," saidst thou; "what THEY do not fathom is unfathomable. But changeable am I only, and wild, and altogether a woman, and no virtuous one: Though I be called by you men the 'profound one,' or the 'faithful one,' 'the eternal one,' 'the mysterious one.' But ye men endow us always with your own virtues--alas, ye virtuous ones!" Thus did she laugh, the unbelievable one; but never do I believe her and her laughter, when she speaketh evil of herself. And when I talked face to face with my wild Wisdom, she said to me angrily: "Thou willest, thou cravest, thou lovest; on that account alone dost thou PRAISE Life!" Then had I almost answered indignantly and told the truth to the angry one; and one cannot answer more indignantly than when one "telleth the truth" to one's Wisdom. For thus do things stand with us three. In my heart do I love only Life--and verily, most when I hate her! But that I am fond of Wisdom, and often too fond, is because she remindeth me very strongly of Life! She hath her eye, her laugh, and even her golden angle-rod: am I responsible for it that both are so alike? And when once Life asked me: "Who is she then, this Wisdom?"--then said I eagerly: "Ah, yes! Wisdom! One thirsteth for her and is not satisfied, one looketh through veils, one graspeth through nets. 
Is she beautiful? What do I know! But the oldest carps are still lured by her. Changeable is she, and wayward; often have I seen her bite her lip, and pass the comb against the grain of her hair. Perhaps she is wicked and false, and altogether a woman; but when she speaketh ill of herself, just then doth she seduce most." When I had said this unto Life, then laughed she maliciously, and shut her eyes. "Of whom dost thou speak?" said she. "Perhaps of me? And if thou wert right--is it proper to say THAT in such wise to my face! But now, pray, speak also of thy Wisdom!" Ah, and now hast thou again opened thine eyes, O beloved Life! And into the unfathomable have I again seemed to sink.-- Thus sang Zarathustra. But when the dance was over and the maidens had departed, he became sad. "The sun hath been long set," said he at last, "the meadow is damp, and from the forest cometh coolness. An unknown presence is about me, and gazeth thoughtfully. What! Thou livest still, Zarathustra? Why? Wherefore? Whereby? Whither? Where? How? Is it not folly still to live?-- Ah, my friends; the evening is it which thus interrogateth in me. Forgive me my sadness! Evening hath come on: forgive me that evening hath come on!" Thus sang Zarathustra. XXXIII. THE GRAVE-SONG. "Yonder is the grave-island, the silent isle; yonder also are the graves of my youth. Thither will I carry an evergreen wreath of life." Resolving thus in my heart, did I sail o'er the sea.-- Oh, ye sights and scenes of my youth! Oh, all ye gleams of love, ye divine fleeting gleams! How could ye perish so soon for me! I think of you to-day as my dead ones. From you, my dearest dead ones, cometh unto me a sweet savour, heart-opening and melting. Verily, it convulseth and openeth the heart of the lone seafarer. Still am I the richest and most to be envied--I, the lonesomest one! For I HAVE POSSESSED you, and ye possess me still. Tell me: to whom hath there ever fallen such rosy apples from the tree as have fallen unto me? 
Still am I your love's heir and heritage, blooming to your memory with many-hued, wild-growing virtues, O ye dearest ones! Ah, we were made to remain nigh unto each other, ye kindly strange marvels; and not like timid birds did ye come to me and my longing--nay, but as trusting ones to a trusting one! Yea, made for faithfulness, like me, and for fond eternities, must I now name you by your faithlessness, ye divine glances and fleeting gleams: no other name have I yet learnt. Verily, too early did ye die for me, ye fugitives. Yet did ye not flee from me, nor did I flee from you: innocent are we to each other in our faithlessness. To kill ME, did they strangle you, ye singing birds of my hopes! Yea, at you, ye dearest ones, did malice ever shoot its arrows--to hit my heart! And they hit it! Because ye were always my dearest, my possession and my possessedness: ON THAT ACCOUNT had ye to die young, and far too early! At my most vulnerable point did they shoot the arrow--namely, at you, whose skin is like down--or more like the smile that dieth at a glance! But this word will I say unto mine enemies: What is all manslaughter in comparison with what ye have done unto me! Worse evil did ye do unto me than all manslaughter; the irretrievable did ye take from me:--thus do I speak unto you, mine enemies! Slew ye not my youth's visions and dearest marvels! My playmates took ye from me, the blessed spirits! To their memory do I deposit this wreath and this curse. This curse upon you, mine enemies! Have ye not made mine eternal short, as a tone dieth away in a cold night! Scarcely, as the twinkle of divine eyes, did it come to me--as a fleeting gleam! Thus spake once in a happy hour my purity: "Divine shall everything be unto me." Then did ye haunt me with foul phantoms; ah, whither hath that happy hour now fled! "All days shall be holy unto me"--so spake once the wisdom of my youth: verily, the language of a joyous wisdom! 
But then did ye enemies steal my nights, and sold them to sleepless torture: ah, whither hath that joyous wisdom now fled? Once did I long for happy auspices: then did ye lead an owl-monster across my path, an adverse sign. Ah, whither did my tender longing then flee? All loathing did I once vow to renounce: then did ye change my nigh ones and nearest ones into ulcerations. Ah, whither did my noblest vow then flee? As a blind one did I once walk in blessed ways: then did ye cast filth on the blind one's course: and now is he disgusted with the old footpath. And when I performed my hardest task, and celebrated the triumph of my victories, then did ye make those who loved me call out that I then grieved them most. Verily, it was always your doing: ye embittered to me my best honey, and the diligence of my best bees. To my charity have ye ever sent the most impudent beggars; around my sympathy have ye ever crowded the incurably shameless. Thus have ye wounded the faith of my virtue. And when I offered my holiest as a sacrifice, immediately did your "piety" put its fatter gifts beside it: so that my holiest suffocated in the fumes of your fat. And once did I want to dance as I had never yet danced: beyond all heavens did I want to dance. Then did ye seduce my favourite minstrel. And now hath he struck up an awful, melancholy air; alas, he tooted as a mournful horn to mine ear! Murderous minstrel, instrument of evil, most innocent instrument! Already did I stand prepared for the best dance: then didst thou slay my rapture with thy tones! Only in the dance do I know how to speak the parable of the highest things:--and now hath my grandest parable remained unspoken in my limbs! Unspoken and unrealised hath my highest hope remained! And there have perished for me all the visions and consolations of my youth! How did I ever bear it? How did I survive and surmount such wounds? How did my soul rise again out of those sepulchres? 
Yea, something invulnerable, unburiable is with me, something that would rend rocks asunder: it is called MY WILL. Silently doth it proceed, and unchanged throughout the years. Its course will it go upon my feet, mine old Will; hard of heart is its nature and invulnerable. Invulnerable am I only in my heel. Ever livest thou there, and art like thyself, thou most patient one! Ever hast thou burst all shackles of the tomb! In thee still liveth also the unrealisedness of my youth; and as life and youth sittest thou here hopeful on the yellow ruins of graves. Yea, thou art still for me the demolisher of all graves: Hail to thee, my Will! And only where there are graves are there resurrections.-- Thus sang Zarathustra. XXXIV. SELF-SURPASSING. "Will to Truth" do ye call it, ye wisest ones, that which impelleth you and maketh you ardent? Will for the thinkableness of all being: thus do _I_ call your will! All being would ye MAKE thinkable: for ye doubt with good reason whether it be already thinkable. But it shall accommodate and bend itself to you! So willeth your will. Smooth shall it become and subject to the spirit, as its mirror and reflection. That is your entire will, ye wisest ones, as a Will to Power; and even when ye speak of good and evil, and of estimates of value. Ye would still create a world before which ye can bow the knee: such is your ultimate hope and ecstasy. The ignorant, to be sure, the people--they are like a river on which a boat floateth along: and in the boat sit the estimates of value, solemn and disguised. Your will and your valuations have ye put on the river of becoming; it betrayeth unto me an old Will to Power, what is believed by the people as good and evil. It was ye, ye wisest ones, who put such guests in this boat, and gave them pomp and proud names--ye and your ruling Will! Onward the river now carrieth your boat: it MUST carry it. A small matter if the rough wave foameth and angrily resisteth its keel! 
It is not the river that is your danger and the end of your good and evil, ye wisest ones: but that Will itself, the Will to Power--the unexhausted, procreating life-will. But that ye may understand my gospel of good and evil, for that purpose will I tell you my gospel of life, and of the nature of all living things. The living thing did I follow; I walked in the broadest and narrowest paths to learn its nature. With a hundred-faced mirror did I catch its glance when its mouth was shut, so that its eye might speak unto me. And its eye spake unto me. But wherever I found living things, there heard I also the language of obedience. All living things are obeying things. And this heard I secondly: Whatever cannot obey itself, is commanded. Such is the nature of living things. This, however, is the third thing which I heard--namely, that commanding is more difficult than obeying. And not only because the commander beareth the burden of all obeyers, and because this burden readily crusheth him:-- An attempt and a risk seemed all commanding unto me; and whenever it commandeth, the living thing risketh itself thereby. Yea, even when it commandeth itself, then also must it atone for its commanding. Of its own law must it become the judge and avenger and victim. How doth this happen! so did I ask myself. What persuadeth the living thing to obey, and command, and even be obedient in commanding? Hearken now unto my word, ye wisest ones! Test it seriously, whether I have crept into the heart of life itself, and into the roots of its heart! Wherever I found a living thing, there found I Will to Power; and even in the will of the servant found I the will to be master. That to the stronger the weaker shall serve--thereto persuadeth he his will who would be master over a still weaker one. That delight alone he is unwilling to forego. 
And as the lesser surrendereth himself to the greater that he may have delight and power over the least of all, so doth even the greatest surrender himself, and staketh--life, for the sake of power. It is the surrender of the greatest to run risk and danger, and play dice for death. And where there is sacrifice and service and love-glances, there also is the will to be master. By by-ways doth the weaker then slink into the fortress, and into the heart of the mightier one--and there stealeth power. And this secret spake Life herself unto me. "Behold," said she, "I am that WHICH MUST EVER SURPASS ITSELF. To be sure, ye call it will to procreation, or impulse towards a goal, towards the higher, remoter, more manifold: but all that is one and the same secret. Rather would I succumb than disown this one thing; and verily, where there is succumbing and leaf-falling, lo, there doth Life sacrifice itself--for power! That I have to be struggle, and becoming, and purpose, and cross-purpose--ah, he who divineth my will, divineth well also on what CROOKED paths it hath to tread! Whatever I create, and however much I love it,--soon must I be adverse to it, and to my love: so willeth my will. And even thou, discerning one, art only a path and footstep of my will: verily, my Will to Power walketh even on the feet of thy Will to Truth! He certainly did not hit the truth who shot at it the formula: 'Will to existence': that will--doth not exist! For what is not, cannot will; that, however, which is in existence--how could it still strive for existence! Only where there is life, is there also will: not, however, Will to Life, but--so teach I thee--Will to Power! Much is reckoned higher than life itself by the living one; but out of the very reckoning speaketh--the Will to Power!"-- Thus did Life once teach me: and thereby, ye wisest ones, do I solve you the riddle of your hearts. Verily, I say unto you: good and evil which would be everlasting--it doth not exist! 
Of its own accord must it ever surpass itself anew. With your values and formulae of good and evil, ye exercise power, ye valuing ones: and that is your secret love, and the sparkling, trembling, and overflowing of your souls. But a stronger power groweth out of your values, and a new surpassing: by it breaketh egg and egg-shell. And he who hath to be a creator in good and evil--verily, he hath first to be a destroyer, and break values in pieces. Thus doth the greatest evil pertain to the greatest good: that, however, is the creating good.-- Let us SPEAK thereof, ye wisest ones, even though it be bad. To be silent is worse; all suppressed truths become poisonous. And let everything break up which--can break up by our truths! Many a house is still to be built!-- Thus spake Zarathustra. XXXV. THE SUBLIME ONES. Calm is the bottom of my sea: who would guess that it hideth droll monsters! Unmoved is my depth: but it sparkleth with swimming enigmas and laughters. A sublime one saw I to-day, a solemn one, a penitent of the spirit: Oh, how my soul laughed at his ugliness! With upraised breast, and like those who draw in their breath: thus did he stand, the sublime one, and in silence: O'erhung with ugly truths, the spoil of his hunting, and rich in torn raiment; many thorns also hung on him--but I saw no rose. Not yet had he learned laughing and beauty. Gloomy did this hunter return from the forest of knowledge. From the fight with wild beasts returned he home: but even yet a wild beast gazeth out of his seriousness--an unconquered wild beast! As a tiger doth he ever stand, on the point of springing; but I do not like those strained souls; ungracious is my taste towards all those self-engrossed ones. And ye tell me, friends, that there is to be no dispute about taste and tasting? But all life is a dispute about taste and tasting! 
Taste: that is weight at the same time, and scales and weigher; and alas for every living thing that would live without dispute about weight and scales and weigher! Should he become weary of his sublimeness, this sublime one, then only will his beauty begin--and then only will I taste him and find him savoury. And only when he turneth away from himself will he o'erleap his own shadow--and verily! into HIS sun. Far too long did he sit in the shade; the cheeks of the penitent of the spirit became pale; he almost starved on his expectations. Contempt is still in his eye, and loathing hideth in his mouth. To be sure, he now resteth, but he hath not yet taken rest in the sunshine. As the ox ought he to do; and his happiness should smell of the earth, and not of contempt for the earth. As a white ox would I like to see him, which, snorting and lowing, walketh before the plough-share: and his lowing should also laud all that is earthly! Dark is still his countenance; the shadow of his hand danceth upon it. O'ershadowed is still the sense of his eye. His deed itself is still the shadow upon him: his doing obscureth the doer. Not yet hath he overcome his deed. To be sure, I love in him the shoulders of the ox: but now do I want to see also the eye of the angel. Also his hero-will hath he still to unlearn: an exalted one shall he be, and not only a sublime one:--the ether itself should raise him, the will-less one! He hath subdued monsters, he hath solved enigmas. But he should also redeem his monsters and enigmas; into heavenly children should he transform them. As yet hath his knowledge not learned to smile, and to be without jealousy; as yet hath his gushing passion not become calm in beauty. Verily, not in satiety shall his longing cease and disappear, but in beauty! Gracefulness belongeth to the munificence of the magnanimous. His arm across his head: thus should the hero repose; thus should he also surmount his repose. 
But precisely to the hero is BEAUTY the hardest thing of all. Unattainable is beauty by all ardent wills. A little more, a little less: precisely this is much here, it is the most here. To stand with relaxed muscles and with unharnessed will: that is the hardest for all of you, ye sublime ones! When power becometh gracious and descendeth into the visible--I call such condescension, beauty. And from no one do I want beauty so much as from thee, thou powerful one: let thy goodness be thy last self-conquest. All evil do I accredit to thee: therefore do I desire of thee the good. Verily, I have often laughed at the weaklings, who think themselves good because they have crippled paws! The virtue of the pillar shalt thou strive after: more beautiful doth it ever become, and more graceful--but internally harder and more sustaining--the higher it riseth. Yea, thou sublime one, one day shalt thou also be beautiful, and hold up the mirror to thine own beauty. Then will thy soul thrill with divine desires; and there will be adoration even in thy vanity! For this is the secret of the soul: when the hero hath abandoned it, then only approacheth it in dreams--the superhero.-- Thus spake Zarathustra. XXXVI. THE LAND OF CULTURE. Too far did I fly into the future: a horror seized upon me. And when I looked around me, lo! there time was my sole contemporary. Then did I fly backwards, homewards--and always faster. Thus did I come unto you, ye present-day men, and into the land of culture. For the first time brought I an eye to see you, and good desire: verily, with longing in my heart did I come. But how did it turn out with me? Although so alarmed--I had yet to laugh! Never did mine eye see anything so motley-coloured! I laughed and laughed, while my foot still trembled, and my heart as well. "Here forsooth, is the home of all the paintpots,"--said I. With fifty patches painted on faces and limbs--so sat ye there to mine astonishment, ye present-day men! 
And with fifty mirrors around you, which flattered your play of colours, and repeated it! Verily, ye could wear no better masks, ye present-day men, than your own faces! Who could--RECOGNISE you! Written all over with the characters of the past, and these characters also pencilled over with new characters--thus have ye concealed yourselves well from all decipherers! And though one be a trier of the reins, who still believeth that ye have reins! Out of colours ye seem to be baked, and out of glued scraps. All times and peoples gaze divers-coloured out of your veils; all customs and beliefs speak divers-coloured out of your gestures. He who would strip you of veils and wrappers, and paints and gestures, would just have enough left to scare the crows. Verily, I myself am the scared crow that once saw you naked, and without paint; and I flew away when the skeleton ogled at me. Rather would I be a day-labourer in the nether-world, and among the shades of the by-gone!--Fatter and fuller than ye, are forsooth the nether-worldlings! This, yea this, is bitterness to my bowels, that I can neither endure you naked nor clothed, ye present-day men! All that is unhomelike in the future, and whatever maketh strayed birds shiver, is verily more homelike and familiar than your "reality." For thus speak ye: "Real are we wholly, and without faith and superstition": thus do ye plume yourselves--alas! even without plumes! Indeed, how would ye be ABLE to believe, ye divers-coloured ones!--ye who are pictures of all that hath ever been believed! Perambulating refutations are ye, of belief itself, and a dislocation of all thought. UNTRUSTWORTHY ONES: thus do _I_ call you, ye real ones! All periods prate against one another in your spirits; and the dreams and pratings of all periods were even realer than your awakeness! Unfruitful are ye: THEREFORE do ye lack belief. 
But he who had to create, had always his presaging dreams and astral premonitions--and believed in believing!-- Half-open doors are ye, at which grave-diggers wait. And this is YOUR reality: "Everything deserveth to perish." Alas, how ye stand there before me, ye unfruitful ones; how lean your ribs! And many of you surely have had knowledge thereof. Many a one hath said: "There hath surely a God filched something from me secretly whilst I slept? Verily, enough to make a girl for himself therefrom! "Amazing is the poverty of my ribs!" thus hath spoken many a present-day man. Yea, ye are laughable unto me, ye present-day men! And especially when ye marvel at yourselves! And woe unto me if I could not laugh at your marvelling, and had to swallow all that is repugnant in your platters! As it is, however, I will make lighter of you, since I have to carry what is heavy; and what matter if beetles and May-bugs also alight on my load! Verily, it shall not on that account become heavier to me! And not from you, ye present-day men, shall my great weariness arise.-- Ah, whither shall I now ascend with my longing! From all mountains do I look out for fatherlands and motherlands. But a home have I found nowhere: unsettled am I in all cities, and decamping at all gates. Alien to me, and a mockery, are the present-day men, to whom of late my heart impelled me; and exiled am I from fatherlands and motherlands. Thus do I love only my CHILDREN'S LAND, the undiscovered in the remotest sea: for it do I bid my sails search and search. Unto my children will I make amends for being the child of my fathers: and unto all the future--for THIS present-day!-- Thus spake Zarathustra. XXXVII. IMMACULATE PERCEPTION. When yester-eve the moon arose, then did I fancy it about to bear a sun: so broad and teeming did it lie on the horizon. But it was a liar with its pregnancy; and sooner will I believe in the man in the moon than in the woman. 
To be sure, little of a man is he also, that timid night-reveller. Verily, with a bad conscience doth he stalk over the roofs. For he is covetous and jealous, the monk in the moon; covetous of the earth, and all the joys of lovers. Nay, I like him not, that tom-cat on the roofs! Hateful unto me are all that slink around half-closed windows! Piously and silently doth he stalk along on the star-carpets:--but I like no light-treading human feet, on which not even a spur jingleth. Every honest one's step speaketh; the cat however, stealeth along over the ground. Lo! cat-like doth the moon come along, and dishonestly.-- This parable speak I unto you sentimental dissemblers, unto you, the "pure discerners!" You do _I_ call--covetous ones! Also ye love the earth, and the earthly: I have divined you well!--but shame is in your love, and a bad conscience--ye are like the moon! To despise the earthly hath your spirit been persuaded, but not your bowels: these, however, are the strongest in you! And now is your spirit ashamed to be at the service of your bowels, and goeth by-ways and lying ways to escape its own shame. "That would be the highest thing for me"--so saith your lying spirit unto itself--"to gaze upon life without desire, and not like the dog, with hanging-out tongue: To be happy in gazing: with dead will, free from the grip and greed of selfishness--cold and ashy-grey all over, but with intoxicated moon-eyes! That would be the dearest thing to me"--thus doth the seduced one seduce himself,--"to love the earth as the moon loveth it, and with the eye only to feel its beauty. And this do I call IMMACULATE perception of all things: to want nothing else from them, but to be allowed to lie before them as a mirror with a hundred facets."-- Oh, ye sentimental dissemblers, ye covetous ones! Ye lack innocence in your desire: and now do ye defame desiring on that account! Verily, not as creators, as procreators, or as jubilators do ye love the earth! Where is innocence? 
Where there is will to procreation. And he who seeketh to create beyond himself, hath for me the purest will. Where is beauty? Where I MUST WILL with my whole Will; where I will love and perish, that an image may not remain merely an image. Loving and perishing: these have rhymed from eternity. Will to love: that is to be ready also for death. Thus do I speak unto you cowards! But now doth your emasculated ogling profess to be "contemplation!" And that which can be examined with cowardly eyes is to be christened "beautiful!" Oh, ye violators of noble names! But it shall be your curse, ye immaculate ones, ye pure discerners, that ye shall never bring forth, even though ye lie broad and teeming on the horizon! Verily, ye fill your mouth with noble words: and we are to believe that your heart overfloweth, ye cozeners? But MY words are poor, contemptible, stammering words: gladly do I pick up what falleth from the table at your repasts. Yet still can I say therewith the truth--to dissemblers! Yea, my fish-bones, shells, and prickly leaves shall--tickle the noses of dissemblers! Bad air is always about you and your repasts: your lascivious thoughts, your lies, and secrets are indeed in the air! Dare only to believe in yourselves--in yourselves and in your inward parts! He who doth not believe in himself always lieth. A God's mask have ye hung in front of you, ye "pure ones": into a God's mask hath your execrable coiling snake crawled. Verily ye deceive, ye "contemplative ones!" Even Zarathustra was once the dupe of your godlike exterior; he did not divine the serpent's coil with which it was stuffed. A God's soul, I once thought I saw playing in your games, ye pure discerners! No better arts did I once dream of than your arts! Serpents' filth and evil odour, the distance concealed from me: and that a lizard's craft prowled thereabouts lasciviously. But I came NIGH unto you: then came to me the day,--and now cometh it to you,--at an end is the moon's love affair! 
See there! Surprised and pale doth it stand--before the rosy dawn! For already she cometh, the glowing one,--HER love to the earth cometh! Innocence and creative desire, is all solar love! See there, how she cometh impatiently over the sea! Do ye not feel the thirst and the hot breath of her love? At the sea would she suck, and drink its depths to her height: now riseth the desire of the sea with its thousand breasts. Kissed and sucked WOULD it be by the thirst of the sun; vapour WOULD it become, and height, and path of light, and light itself! Verily, like the sun do I love life, and all deep seas. And this meaneth TO ME knowledge: all that is deep shall ascend--to my height!-- Thus spake Zarathustra. XXXVIII. SCHOLARS. When I lay asleep, then did a sheep eat at the ivy-wreath on my head,--it ate, and said thereby: "Zarathustra is no longer a scholar." It said this, and went away clumsily and proudly. A child told it to me. I like to lie here where the children play, beside the ruined wall, among thistles and red poppies. A scholar am I still to the children, and also to the thistles and red poppies. Innocent are they, even in their wickedness. But to the sheep I am no longer a scholar: so willeth my lot--blessings upon it! For this is the truth: I have departed from the house of the scholars, and the door have I also slammed behind me. Too long did my soul sit hungry at their table: not like them have I got the knack of investigating, as the knack of nut-cracking. Freedom do I love, and the air over fresh soil; rather would I sleep on ox-skins than on their honours and dignities. I am too hot and scorched with mine own thought: often is it ready to take away my breath. Then have I to go into the open air, and away from all dusty rooms. But they sit cool in the cool shade: they want in everything to be merely spectators, and they avoid sitting where the sun burneth on the steps. 
Like those who stand in the street and gape at the passers-by: thus do they also wait, and gape at the thoughts which others have thought. Should one lay hold of them, then do they raise a dust like flour-sacks, and involuntarily: but who would divine that their dust came from corn, and from the yellow delight of the summer fields? When they give themselves out as wise, then do their petty sayings and truths chill me: in their wisdom there is often an odour as if it came from the swamp; and verily, I have even heard the frog croak in it! Clever are they--they have dexterous fingers: what doth MY simplicity pretend to beside their multiplicity! All threading and knitting and weaving do their fingers understand: thus do they make the hose of the spirit! Good clockworks are they: only be careful to wind them up properly! Then do they indicate the hour without mistake, and make a modest noise thereby. Like millstones do they work, and like pestles: throw only seed-corn unto them!--they know well how to grind corn small, and make white dust out of it. They keep a sharp eye on one another, and do not trust each other the best. Ingenious in little artifices, they wait for those whose knowledge walketh on lame feet,--like spiders do they wait. I saw them always prepare their poison with precaution; and always did they put glass gloves on their fingers in doing so. They also know how to play with false dice; and so eagerly did I find them playing, that they perspired thereby. We are alien to each other, and their virtues are even more repugnant to my taste than their falsehoods and false dice. And when I lived with them, then did I live above them. Therefore did they take a dislike to me. They want to hear nothing of any one walking above their heads; and so they put wood and earth and rubbish betwixt me and their heads. Thus did they deafen the sound of my tread: and least have I hitherto been heard by the most learned. 
All mankind's faults and weaknesses did they put betwixt themselves and me:--they call it "false ceiling" in their houses. But nevertheless I walk with my thoughts ABOVE their heads; and even should I walk on mine own errors, still would I be above them and their heads.

For men are NOT equal: so speaketh justice. And what I will, THEY may not will!--

Thus spake Zarathustra.

XXXIX. POETS.

"Since I have known the body better"--said Zarathustra to one of his disciples--"the spirit hath only been to me symbolically spirit; and all the 'imperishable'--that is also but a simile."

"So have I heard thee say once before," answered the disciple, "and then thou addedst: 'But the poets lie too much.' Why didst thou say that the poets lie too much?"

"Why?" said Zarathustra. "Thou askest why? I do not belong to those who may be asked after their Why. Is my experience but of yesterday? It is long ago that I experienced the reasons for mine opinions. Should I not have to be a cask of memory, if I also wanted to have my reasons with me? It is already too much for me even to retain mine opinions; and many a bird flieth away. And sometimes, also, do I find a fugitive creature in my dovecote, which is alien to me, and trembleth when I lay my hand upon it. But what did Zarathustra once say unto thee? That the poets lie too much?--But Zarathustra also is a poet. Believest thou that he there spake the truth? Why dost thou believe it?"

The disciple answered: "I believe in Zarathustra." But Zarathustra shook his head and smiled.--

Belief doth not sanctify me, said he, least of all the belief in myself. But granting that some one did say in all seriousness that the poets lie too much: he was right--WE do lie too much. We also know too little, and are bad learners: so we are obliged to lie. And which of us poets hath not adulterated his wine? Many a poisonous hotchpotch hath evolved in our cellars: many an indescribable thing hath there been done.
And because we know little, therefore are we pleased from the heart with the poor in spirit, especially when they are young women! And even of those things are we desirous, which old women tell one another in the evening. This do we call the eternally feminine in us. And as if there were a special secret access to knowledge, which CHOKETH UP for those who learn anything, so do we believe in the people and in their "wisdom." This, however, do all poets believe: that whoever pricketh up his ears when lying in the grass or on lonely slopes, learneth something of the things that are betwixt heaven and earth. And if there come unto them tender emotions, then do the poets always think that nature herself is in love with them: And that she stealeth to their ear to whisper secrets into it, and amorous flatteries: of this do they plume and pride themselves, before all mortals! Ah, there are so many things betwixt heaven and earth of which only the poets have dreamed! And especially ABOVE the heavens: for all Gods are poet-symbolisations, poet-sophistications! Verily, ever are we drawn aloft--that is, to the realm of the clouds: on these do we set our gaudy puppets, and then call them Gods and Supermen:-- Are not they light enough for those chairs!--all these Gods and Supermen?-- Ah, how I am weary of all the inadequate that is insisted on as actual! Ah, how I am weary of the poets! When Zarathustra so spake, his disciple resented it, but was silent. And Zarathustra also was silent; and his eye directed itself inwardly, as if it gazed into the far distance. At last he sighed and drew breath.-- I am of to-day and heretofore, said he thereupon; but something is in me that is of the morrow, and the day following, and the hereafter. I became weary of the poets, of the old and of the new: superficial are they all unto me, and shallow seas. They did not think sufficiently into the depth; therefore their feeling did not reach to the bottom. 
Some sensation of voluptuousness and some sensation of tedium: these have as yet been their best contemplation. Ghost-breathing and ghost-whisking, seemeth to me all the jingle-jangling of their harps; what have they known hitherto of the fervour of tones!--

They are also not pure enough for me: they all muddle their water that it may seem deep. And fain would they thereby prove themselves reconcilers: but mediaries and mixers are they unto me, and half-and-half, and impure!--

Ah, I cast indeed my net into their sea, and meant to catch good fish; but always did I draw up the head of some ancient God. Thus did the sea give a stone to the hungry one. And they themselves may well originate from the sea. Certainly, one findeth pearls in them: thereby they are the more like hard molluscs. And instead of a soul, I have often found in them salt slime.

They have learned from the sea also its vanity: is not the sea the peacock of peacocks? Even before the ugliest of all buffaloes doth it spread out its tail; never doth it tire of its lace-fan of silver and silk. Disdainfully doth the buffalo glance thereat, nigh to the sand with its soul, nigher still to the thicket, nighest, however, to the swamp. What is beauty and sea and peacock-splendour to it!

This parable I speak unto the poets. Verily, their spirit itself is the peacock of peacocks, and a sea of vanity! Spectators, seeketh the spirit of the poet--should they even be buffaloes!--

But of this spirit became I weary; and I see the time coming when it will become weary of itself. Yea, changed have I seen the poets, and their glance turned towards themselves. Penitents of the spirit have I seen appearing; they grew out of the poets.--

Thus spake Zarathustra.

XL. GREAT EVENTS.
There is an isle in the sea--not far from the Happy Isles of Zarathustra-- on which a volcano ever smoketh; of which isle the people, and especially the old women amongst them, say that it is placed as a rock before the gate of the nether-world; but that through the volcano itself the narrow way leadeth downwards which conducteth to this gate. Now about the time that Zarathustra sojourned on the Happy Isles, it happened that a ship anchored at the isle on which standeth the smoking mountain, and the crew went ashore to shoot rabbits. About the noontide hour, however, when the captain and his men were together again, they saw suddenly a man coming towards them through the air, and a voice said distinctly: "It is time! It is the highest time!" But when the figure was nearest to them (it flew past quickly, however, like a shadow, in the direction of the volcano), then did they recognise with the greatest surprise that it was Zarathustra; for they had all seen him before except the captain himself, and they loved him as the people love: in such wise that love and awe were combined in equal degree. "Behold!" said the old helmsman, "there goeth Zarathustra to hell!" About the same time that these sailors landed on the fire-isle, there was a rumour that Zarathustra had disappeared; and when his friends were asked about it, they said that he had gone on board a ship by night, without saying whither he was going. Thus there arose some uneasiness. After three days, however, there came the story of the ship's crew in addition to this uneasiness--and then did all the people say that the devil had taken Zarathustra. His disciples laughed, sure enough, at this talk; and one of them said even: "Sooner would I believe that Zarathustra hath taken the devil." But at the bottom of their hearts they were all full of anxiety and longing: so their joy was great when on the fifth day Zarathustra appeared amongst them. 
And this is the account of Zarathustra's interview with the fire-dog: The earth, said he, hath a skin; and this skin hath diseases. One of these diseases, for example, is called "man." And another of these diseases is called "the fire-dog": concerning HIM men have greatly deceived themselves, and let themselves be deceived. To fathom this mystery did I go o'er the sea; and I have seen the truth naked, verily! barefooted up to the neck. Now do I know how it is concerning the fire-dog; and likewise concerning all the spouting and subversive devils, of which not only old women are afraid. "Up with thee, fire-dog, out of thy depth!" cried I, "and confess how deep that depth is! Whence cometh that which thou snortest up? Thou drinkest copiously at the sea: that doth thine embittered eloquence betray! In sooth, for a dog of the depth, thou takest thy nourishment too much from the surface! At the most, I regard thee as the ventriloquist of the earth: and ever, when I have heard subversive and spouting devils speak, I have found them like thee: embittered, mendacious, and shallow. Ye understand how to roar and obscure with ashes! Ye are the best braggarts, and have sufficiently learned the art of making dregs boil. Where ye are, there must always be dregs at hand, and much that is spongy, hollow, and compressed: it wanteth to have freedom. 'Freedom' ye all roar most eagerly: but I have unlearned the belief in 'great events,' when there is much roaring and smoke about them. And believe me, friend Hullabaloo! The greatest events--are not our noisiest, but our stillest hours. Not around the inventors of new noise, but around the inventors of new values, doth the world revolve; INAUDIBLY it revolveth. And just own to it! Little had ever taken place when thy noise and smoke passed away. What, if a city did become a mummy, and a statue lay in the mud! 
And this do I say also to the o'erthrowers of statues: It is certainly the greatest folly to throw salt into the sea, and statues into the mud. In the mud of your contempt lay the statue: but it is just its law, that out of contempt, its life and living beauty grow again! With diviner features doth it now arise, seducing by its suffering; and verily! it will yet thank you for o'erthrowing it, ye subverters! This counsel, however, do I counsel to kings and churches, and to all that is weak with age or virtue--let yourselves be o'erthrown! That ye may again come to life, and that virtue--may come to you!--" Thus spake I before the fire-dog: then did he interrupt me sullenly, and asked: "Church? What is that?" "Church?" answered I, "that is a kind of state, and indeed the most mendacious. But remain quiet, thou dissembling dog! Thou surely knowest thine own species best! Like thyself the state is a dissembling dog; like thee doth it like to speak with smoke and roaring--to make believe, like thee, that it speaketh out of the heart of things. For it seeketh by all means to be the most important creature on earth, the state; and people think it so." When I had said this, the fire-dog acted as if mad with envy. "What!" cried he, "the most important creature on earth? And people think it so?" And so much vapour and terrible voices came out of his throat, that I thought he would choke with vexation and envy. At last he became calmer and his panting subsided; as soon, however, as he was quiet, I said laughingly: "Thou art angry, fire-dog: so I am in the right about thee! And that I may also maintain the right, hear the story of another fire-dog; he speaketh actually out of the heart of the earth. Gold doth his breath exhale, and golden rain: so doth his heart desire. What are ashes and smoke and hot dregs to him! Laughter flitteth from him like a variegated cloud; adverse is he to thy gargling and spewing and grips in the bowels! 
The gold, however, and the laughter--these doth he take out of the heart of the earth: for, that thou mayst know it,--THE HEART OF THE EARTH IS OF GOLD."

When the fire-dog heard this, he could no longer endure to listen to me. Abashed did he draw in his tail, said "bow-wow!" in a cowed voice, and crept down into his cave.--

Thus told Zarathustra. His disciples, however, hardly listened to him: so great was their eagerness to tell him about the sailors, the rabbits, and the flying man.

"What am I to think of it!" said Zarathustra. "Am I indeed a ghost? But it may have been my shadow. Ye have surely heard something of the Wanderer and his Shadow? One thing, however, is certain: I must keep a tighter hold of it; otherwise it will spoil my reputation."

And once more Zarathustra shook his head and wondered. "What am I to think of it!" said he once more. "Why did the ghost cry: 'It is time! It is the highest time!' For WHAT is it then--the highest time?"--

Thus spake Zarathustra.

XLI. THE SOOTHSAYER.

"--And I saw a great sadness come over mankind. The best turned weary of their works. A doctrine appeared, a faith ran beside it: 'All is empty, all is alike, all hath been!' And from all hills there re-echoed: 'All is empty, all is alike, all hath been!'

To be sure we have harvested: but why have all our fruits become rotten and brown? What was it fell last night from the evil moon? In vain was all our labour, poison hath our wine become, the evil eye hath singed yellow our fields and hearts. Arid have we all become; and fire falling upon us, then do we turn dust like ashes:--yea, the fire itself have we made aweary.

All our fountains have dried up, even the sea hath receded. All the ground trieth to gape, but the depth will not swallow! 'Alas! where is there still a sea in which one could be drowned?' so soundeth our plaint--across shallow swamps. Verily, even for dying have we become too weary; now do we keep awake and live on--in sepulchres."
Thus did Zarathustra hear a soothsayer speak; and the foreboding touched his heart and transformed him. Sorrowfully did he go about and wearily; and he became like unto those of whom the soothsayer had spoken.--

Verily, said he unto his disciples, a little while, and there cometh the long twilight. Alas, how shall I preserve my light through it! That it may not smother in this sorrowfulness! To remoter worlds shall it be a light, and also to remotest nights!

Thus did Zarathustra go about grieved in his heart, and for three days he did not take any meat or drink: he had no rest, and lost his speech. At last it came to pass that he fell into a deep sleep. His disciples, however, sat around him in long night-watches, and waited anxiously to see if he would awake, and speak again, and recover from his affliction. And this is the discourse that Zarathustra spake when he awoke; his voice, however, came unto his disciples as from afar:

Hear, I pray you, the dream that I dreamed, my friends, and help me to divine its meaning! A riddle is it still unto me, this dream; the meaning is hidden in it and encaged, and doth not yet fly above it on free pinions.

All life had I renounced, so I dreamed. Night-watchman and grave-guardian had I become, aloft, in the lone mountain-fortress of Death. There did I guard his coffins: full stood the musty vaults of those trophies of victory. Out of glass coffins did vanquished life gaze upon me. The odour of dust-covered eternities did I breathe: sultry and dust-covered lay my soul. And who could have aired his soul there!

Brightness of midnight was ever around me; lonesomeness cowered beside her; and as a third, death-rattle stillness, the worst of my female friends. Keys did I carry, the rustiest of all keys; and I knew how to open with them the most creaking of all gates. Like a bitterly angry croaking ran the sound through the long corridors when the leaves of the gate opened: ungraciously did this bird cry, unwillingly was it awakened.
But more frightful even, and more heart-strangling was it, when it again became silent and still all around, and I alone sat in that malignant silence. Thus did time pass with me, and slip by, if time there still was: what do I know thereof! But at last there happened that which awoke me.

Thrice did there peal peals at the gate like thunders, thrice did the vaults resound and howl again: then did I go to the gate. Alpa! cried I, who carrieth his ashes unto the mountain? Alpa! Alpa! who carrieth his ashes unto the mountain? And I pressed the key, and pulled at the gate, and exerted myself. But not a finger's-breadth was it yet open:

Then did a roaring wind tear the folds apart: whistling, whizzing, and piercing, it threw unto me a black coffin. And in the roaring, and whistling, and whizzing the coffin burst up, and spouted out a thousand peals of laughter. And a thousand caricatures of children, angels, owls, fools, and child-sized butterflies laughed and mocked, and roared at me. Fearfully was I terrified thereby: it prostrated me. And I cried with horror as I ne'er cried before. But mine own crying awoke me:--and I came to myself.--

Thus did Zarathustra relate his dream, and then was silent: for as yet he knew not the interpretation thereof. But the disciple whom he loved most arose quickly, seized Zarathustra's hand, and said:

"Thy life itself interpreteth unto us this dream, O Zarathustra! Art thou not thyself the wind with shrill whistling, which bursteth open the gates of the fortress of Death? Art thou not thyself the coffin full of many-hued malices and angel-caricatures of life? Verily, like a thousand peals of children's laughter cometh Zarathustra into all sepulchres, laughing at those night-watchmen and grave-guardians, and whoever else rattleth with sinister keys. With thy laughter wilt thou frighten and prostrate them: fainting and recovering will demonstrate thy power over them.
And when the long twilight cometh and the mortal weariness, even then wilt thou not disappear from our firmament, thou advocate of life! New stars hast thou made us see, and new nocturnal glories: verily, laughter itself hast thou spread out over us like a many-hued canopy. Now will children's laughter ever from coffins flow; now will a strong wind ever come victoriously unto all mortal weariness: of this thou art thyself the pledge and the prophet!

Verily, THEY THEMSELVES DIDST THOU DREAM, thine enemies: that was thy sorest dream. But as thou awokest from them and camest to thyself, so shall they awaken from themselves--and come unto thee!"

Thus spake the disciple; and all the others then thronged around Zarathustra, grasped him by the hands, and tried to persuade him to leave his bed and his sadness, and return unto them. Zarathustra, however, sat upright on his couch, with an absent look. Like one returning from long foreign sojourn did he look on his disciples, and examined their features; but still he knew them not. When, however, they raised him, and set him upon his feet, behold, all on a sudden his eye changed; he understood everything that had happened, stroked his beard, and said with a strong voice:

"Well! this hath just its time; but see to it, my disciples, that we have a good repast; and without delay! Thus do I mean to make amends for bad dreams! The soothsayer, however, shall eat and drink at my side: and verily, I will yet show him a sea in which he can drown himself!"--

Thus spake Zarathustra. Then did he gaze long into the face of the disciple who had been the dream-interpreter, and shook his head.--

XLII. REDEMPTION.

When Zarathustra went one day over the great bridge, then did the cripples and beggars surround him, and a hunchback spake thus unto him:

"Behold, Zarathustra!
Even the people learn from thee, and acquire faith in thy teaching: but for them to believe fully in thee, one thing is still needful--thou must first of all convince us cripples! Here hast thou now a fine selection, and verily, an opportunity with more than one forelock! The blind canst thou heal, and make the lame run; and from him who hath too much behind, couldst thou well, also, take away a little;--that, I think, would be the right method to make the cripples believe in Zarathustra!" Zarathustra, however, answered thus unto him who so spake: When one taketh his hump from the hunchback, then doth one take from him his spirit--so do the people teach. And when one giveth the blind man eyes, then doth he see too many bad things on the earth: so that he curseth him who healed him. He, however, who maketh the lame man run, inflicteth upon him the greatest injury; for hardly can he run, when his vices run away with him--so do the people teach concerning cripples. And why should not Zarathustra also learn from the people, when the people learn from Zarathustra? It is, however, the smallest thing unto me since I have been amongst men, to see one person lacking an eye, another an ear, and a third a leg, and that others have lost the tongue, or the nose, or the head. I see and have seen worse things, and divers things so hideous, that I should neither like to speak of all matters, nor even keep silent about some of them: namely, men who lack everything, except that they have too much of one thing--men who are nothing more than a big eye, or a big mouth, or a big belly, or something else big,--reversed cripples, I call such men. And when I came out of my solitude, and for the first time passed over this bridge, then I could not trust mine eyes, but looked again and again, and said at last: "That is an ear! An ear as big as a man!" I looked still more attentively--and actually there did move under the ear something that was pitiably small and poor and slim. 
And in truth this immense ear was perched on a small thin stalk--the stalk, however, was a man! A person putting a glass to his eyes, could even recognise further a small envious countenance, and also that a bloated soullet dangled at the stalk. The people told me, however, that the big ear was not only a man, but a great man, a genius. But I never believed in the people when they spake of great men--and I hold to my belief that it was a reversed cripple, who had too little of everything, and too much of one thing.

When Zarathustra had spoken thus unto the hunchback, and unto those of whom the hunchback was the mouthpiece and advocate, then did he turn to his disciples in profound dejection, and said:

Verily, my friends, I walk amongst men as amongst the fragments and limbs of human beings! This is the terrible thing to mine eye, that I find man broken up, and scattered about, as on a battle- and butcher-ground. And when mine eye fleeth from the present to the bygone, it findeth ever the same: fragments and limbs and fearful chances--but no men!

The present and the bygone upon earth--ah! my friends--that is MY most unbearable trouble; and I should not know how to live, if I were not a seer of what is to come. A seer, a purposer, a creator, a future itself, and a bridge to the future--and alas! also as it were a cripple on this bridge: all that is Zarathustra.

And ye also asked yourselves often: "Who is Zarathustra to us? What shall he be called by us?" And like me, did ye give yourselves questions for answers. Is he a promiser? Or a fulfiller? A conqueror? Or an inheritor? A harvest? Or a ploughshare? A physician? Or a healed one? Is he a poet? Or a genuine one? An emancipator? Or a subjugator? A good one? Or an evil one?

I walk amongst men as the fragments of the future: that future which I contemplate. And it is all my poetisation and aspiration to compose and collect into unity what is fragment and riddle and fearful chance.
And how could I endure to be a man, if man were not also the composer, and riddle-reader, and redeemer of chance! To redeem what is past, and to transform every "It was" into "Thus would I have it!"--that only do I call redemption! Will--so is the emancipator and joy-bringer called: thus have I taught you, my friends! But now learn this likewise: the Will itself is still a prisoner. Willing emancipateth: but what is that called which still putteth the emancipator in chains? "It was": thus is the Will's teeth-gnashing and lonesomest tribulation called. Impotent towards what hath been done--it is a malicious spectator of all that is past. Not backward can the Will will; that it cannot break time and time's desire--that is the Will's lonesomest tribulation. Willing emancipateth: what doth Willing itself devise in order to get free from its tribulation and mock at its prison? Ah, a fool becometh every prisoner! Foolishly delivereth itself also the imprisoned Will. That time doth not run backward--that is its animosity: "That which was": so is the stone which it cannot roll called. And thus doth it roll stones out of animosity and ill-humour, and taketh revenge on whatever doth not, like it, feel rage and ill-humour. Thus did the Will, the emancipator, become a torturer; and on all that is capable of suffering it taketh revenge, because it cannot go backward. This, yea, this alone is REVENGE itself: the Will's antipathy to time, and its "It was." Verily, a great folly dwelleth in our Will; and it became a curse unto all humanity, that this folly acquired spirit! THE SPIRIT OF REVENGE: my friends, that hath hitherto been man's best contemplation; and where there was suffering, it was claimed there was always penalty. "Penalty," so calleth itself revenge. With a lying word it feigneth a good conscience. And because in the willer himself there is suffering, because he cannot will backwards--thus was Willing itself, and all life, claimed--to be penalty! 
And then did cloud after cloud roll over the spirit, until at last madness preached: "Everything perisheth, therefore everything deserveth to perish!"

"And this itself is justice, the law of time--that he must devour his children:" thus did madness preach.

"Morally are things ordered according to justice and penalty. Oh, where is there deliverance from the flux of things and from the 'existence' of penalty?" Thus did madness preach.

"Can there be deliverance when there is eternal justice? Alas, unrollable is the stone, 'It was': eternal must also be all penalties!" Thus did madness preach.

"No deed can be annihilated: how could it be undone by the penalty! This, this is what is eternal in the 'existence' of penalty, that existence also must be eternally recurring deed and guilt! Unless the Will should at last deliver itself, and Willing become non-Willing--:" but ye know, my brethren, this fabulous song of madness!

Away from those fabulous songs did I lead you when I taught you: "The Will is a creator." All "It was" is a fragment, a riddle, a fearful chance--until the creating Will saith thereto: "But thus would I have it."--Until the creating Will saith thereto: "But thus do I will it! Thus shall I will it!"

But did it ever speak thus? And when doth this take place? Hath the Will been unharnessed from its own folly? Hath the Will become its own deliverer and joy-bringer? Hath it unlearned the spirit of revenge and all teeth-gnashing? And who hath taught it reconciliation with time, and something higher than all reconciliation? Something higher than all reconciliation must the Will will which is the Will to Power--: but how doth that take place? Who hath taught it also to will backwards?

--But at this point in his discourse it chanced that Zarathustra suddenly paused, and looked like a person in the greatest alarm. With terror in his eyes did he gaze on his disciples; his glances pierced as with arrows their thoughts and arrear-thoughts.
But after a brief space he again laughed, and said soothedly: "It is difficult to live amongst men, because silence is so difficult--especially for a babbler."--

Thus spake Zarathustra.

The hunchback, however, had listened to the conversation and had covered his face during the time; but when he heard Zarathustra laugh, he looked up with curiosity, and said slowly: "But why doth Zarathustra speak otherwise unto us than unto his disciples?"

Zarathustra answered: "What is there to be wondered at! With hunchbacks one may well speak in a hunchbacked way!"

"Very good," said the hunchback; "and with pupils one may well tell tales out of school. But why doth Zarathustra speak otherwise unto his pupils--than unto himself?"--

XLIII. MANLY PRUDENCE.

Not the height, it is the declivity that is terrible! The declivity, where the gaze shooteth DOWNWARDS, and the hand graspeth UPWARDS. There doth the heart become giddy through its double will. Ah, friends, do ye divine also my heart's double will?

This, this is MY declivity and my danger, that my gaze shooteth towards the summit, and my hand would fain clutch and lean--on the depth! To man clingeth my will; with chains do I bind myself to man, because I am pulled upwards to the Superman: for thither doth mine other will tend. And THEREFORE do I live blindly among men, as if I knew them not: that my hand may not entirely lose belief in firmness.

I know not you men: this gloom and consolation is often spread around me. I sit at the gateway for every rogue, and ask: Who wisheth to deceive me? This is my first manly prudence, that I allow myself to be deceived, so as not to be on my guard against deceivers. Ah, if I were on my guard against man, how could man be an anchor to my ball! Too easily would I be pulled upwards and away! This providence is over my fate, that I have to be without foresight.
And he who would not languish amongst men, must learn to drink out of all glasses; and he who would keep clean amongst men, must know how to wash himself even with dirty water. And thus spake I often to myself for consolation: "Courage! Cheer up! old heart! An unhappiness hath failed to befall thee: enjoy that as thy--happiness!"

This, however, is mine other manly prudence: I am more forbearing to the VAIN than to the proud. Is not wounded vanity the mother of all tragedies? Where, however, pride is wounded, there there groweth up something better than pride. That life may be fair to behold, its game must be well played; for that purpose, however, it needeth good actors.

Good actors have I found all the vain ones: they play, and wish people to be fond of beholding them--all their spirit is in this wish. They represent themselves, they invent themselves; in their neighbourhood I like to look upon life--it cureth of melancholy. Therefore am I forbearing to the vain, because they are the physicians of my melancholy, and keep me attached to man as to a drama.

And further, who conceiveth the full depth of the modesty of the vain man! I am favourable to him, and sympathetic on account of his modesty. From you would he learn his belief in himself; he feedeth upon your glances, he eateth praise out of your hands. Your lies doth he even believe when you lie favourably about him: for in its depths sigheth his heart: "What am _I_?" And if that be the true virtue which is unconscious of itself--well, the vain man is unconscious of his modesty!--

This is, however, my third manly prudence: I am not put out of conceit with the WICKED by your timorousness. I am happy to see the marvels the warm sun hatcheth: tigers and palms and rattle-snakes. Also amongst men there is a beautiful brood of the warm sun, and much that is marvellous in the wicked. In truth, as your wisest did not seem to me so very wise, so found I also human wickedness below the fame of it.
And oft did I ask with a shake of the head: Why still rattle, ye rattle-snakes? Verily, there is still a future even for evil! And the warmest south is still undiscovered by man. How many things are now called the worst wickedness, which are only twelve feet broad and three months long! Some day, however, will greater dragons come into the world.

For that the Superman may not lack his dragon, the superdragon that is worthy of him, there must still much warm sun glow on moist virgin forests! Out of your wild cats must tigers have evolved, and out of your poison-toads, crocodiles: for the good hunter shall have a good hunt!

And verily, ye good and just! In you there is much to be laughed at, and especially your fear of what hath hitherto been called "the devil!" So alien are ye in your souls to what is great, that to you the Superman would be FRIGHTFUL in his goodness! And ye wise and knowing ones, ye would flee from the solar-glow of the wisdom in which the Superman joyfully batheth his nakedness!

Ye highest men who have come within my ken! this is my doubt of you, and my secret laughter: I suspect ye would call my Superman--a devil! Ah, I became tired of those highest and best ones: from their "height" did I long to be up, out, and away to the Superman! A horror came over me when I saw those best ones naked: then there grew for me the pinions to soar away into distant futures. Into more distant futures, into more southern souths than ever artist dreamed of: thither, where Gods are ashamed of all clothes!

But disguised do I want to see YOU, ye neighbours and fellowmen, and well-attired and vain and estimable, as "the good and just;"--

And disguised will I myself sit amongst you--that I may MISTAKE you and myself: for that is my last manly prudence.--

Thus spake Zarathustra.

XLIV. THE STILLEST HOUR.

What hath happened unto me, my friends? Ye see me troubled, driven forth, unwillingly obedient, ready to go--alas, to go away from YOU!
Yea, once more must Zarathustra retire to his solitude: but unjoyously this time doth the bear go back to his cave! What hath happened unto me? Who ordereth this?--Ah, mine angry mistress wisheth it so; she spake unto me. Have I ever named her name to you? Yesterday towards evening there spake unto me MY STILLEST HOUR: that is the name of my terrible mistress. And thus did it happen--for everything must I tell you, that your heart may not harden against the suddenly departing one! Do ye know the terror of him who falleth asleep?-- To the very toes he is terrified, because the ground giveth way under him, and the dream beginneth. This do I speak unto you in parable. Yesterday at the stillest hour did the ground give way under me: the dream began. The hour-hand moved on, the timepiece of my life drew breath--never did I hear such stillness around me, so that my heart was terrified. Then was there spoken unto me without voice: "THOU KNOWEST IT, ZARATHUSTRA?"-- And I cried in terror at this whispering, and the blood left my face: but I was silent. Then was there once more spoken unto me without voice: "Thou knowest it, Zarathustra, but thou dost not speak it!"-- And at last I answered, like one defiant: "Yea, I know it, but I will not speak it!" Then was there again spoken unto me without voice: "Thou WILT not, Zarathustra? Is this true? Conceal thyself not behind thy defiance!"-- And I wept and trembled like a child, and said: "Ah, I would indeed, but how can I do it! Exempt me only from this! It is beyond my power!" Then was there again spoken unto me without voice: "What matter about thyself, Zarathustra! Speak thy word, and succumb!" And I answered: "Ah, is it MY word? Who am _I_? I await the worthier one; I am not worthy even to succumb by it." Then was there again spoken unto me without voice: "What matter about thyself? Thou art not yet humble enough for me. Humility hath the hardest skin."-- And I answered: "What hath not the skin of my humility endured! 
At the foot of my height do I dwell: how high are my summits, no one hath yet told me. But well do I know my valleys." Then was there again spoken unto me without voice: "O Zarathustra, he who hath to remove mountains removeth also valleys and plains."-- And I answered: "As yet hath my word not removed mountains, and what I have spoken hath not reached man. I went, indeed, unto men, but not yet have I attained unto them." Then was there again spoken unto me without voice: "What knowest thou THEREOF! The dew falleth on the grass when the night is most silent."-- And I answered: "They mocked me when I found and walked in mine own path; and certainly did my feet then tremble. And thus did they speak unto me: Thou forgottest the path before, now dost thou also forget how to walk!" Then was there again spoken unto me without voice: "What matter about their mockery! Thou art one who hast unlearned to obey: now shalt thou command! Knowest thou not who is most needed by all? He who commandeth great things. To execute great things is difficult: but the more difficult task is to command great things. This is thy most unpardonable obstinacy: thou hast the power, and thou wilt not rule."-- And I answered: "I lack the lion's voice for all commanding." Then was there again spoken unto me as a whispering: "It is the stillest words which bring the storm. Thoughts that come with doves' footsteps guide the world. O Zarathustra, thou shalt go as a shadow of that which is to come: thus wilt thou command, and in commanding go foremost."-- And I answered: "I am ashamed." Then was there again spoken unto me without voice: "Thou must yet become a child, and be without shame. The pride of youth is still upon thee; late hast thou become young: but he who would become a child must surmount even his youth."-- And I considered a long while, and trembled. At last, however, did I say what I had said at first. "I will not." Then did a laughing take place all around me. 
Alas, how that laughing lacerated my bowels and cut into my heart! And there was spoken unto me for the last time: "O Zarathustra, thy fruits are ripe, but thou art not ripe for thy fruits! So must thou go again into solitude: for thou shalt yet become mellow."-- And again was there a laughing, and it fled: then did it become still around me, as with a double stillness. I lay, however, on the ground, and the sweat flowed from my limbs. --Now have ye heard all, and why I have to return into my solitude. Nothing have I kept hidden from you, my friends. But even this have ye heard from me, WHO is still the most reserved of men--and will be so! Ah, my friends! I should have something more to say unto you! I should have something more to give unto you! Why do I not give it? Am I then a niggard?-- When, however, Zarathustra had spoken these words, the violence of his pain, and a sense of the nearness of his departure from his friends came over him, so that he wept aloud; and no one knew how to console him. In the night, however, he went away alone and left his friends. THIRD PART. "Ye look aloft when ye long for exaltation, and I look downward because I am exalted. "Who among you can at the same time laugh and be exalted? "He who climbeth on the highest mountains, laugheth at all tragic plays and tragic realities."--ZARATHUSTRA, I., "Reading and Writing." XLV. THE WANDERER. Then, when it was about midnight, Zarathustra went his way over the ridge of the isle, that he might arrive early in the morning at the other coast; because there he meant to embark. For there was a good roadstead there, in which foreign ships also liked to anchor: those ships took many people with them, who wished to cross over from the Happy Isles. So when Zarathustra thus ascended the mountain, he thought on the way of his many solitary wanderings from youth onwards, and how many mountains and ridges and summits he had already climbed.
I am a wanderer and mountain-climber, said he to his heart, I love not the plains, and it seemeth I cannot long sit still. And whatever may still overtake me as fate and experience--a wandering will be therein, and a mountain-climbing: in the end one experienceth only oneself. The time is now past when accidents could befall me; and what COULD now fall to my lot which would not already be mine own! It returneth only, it cometh home to me at last--mine own Self, and such of it as hath been long abroad, and scattered among things and accidents. And one thing more do I know: I stand now before my last summit, and before that which hath been longest reserved for me. Ah, my hardest path must I ascend! Ah, I have begun my lonesomest wandering! He, however, who is of my nature doth not avoid such an hour: the hour that saith unto him: Now only dost thou go the way to thy greatness! Summit and abyss--these are now comprised together! Thou goest the way to thy greatness: now hath it become thy last refuge, what was hitherto thy last danger! Thou goest the way to thy greatness: it must now be thy best courage that there is no longer any path behind thee! Thou goest the way to thy greatness: here shall no one steal after thee! Thy foot itself hath effaced the path behind thee, and over it standeth written: Impossibility. And if all ladders henceforth fail thee, then must thou learn to mount upon thine own head: how couldst thou mount upward otherwise? Upon thine own head, and beyond thine own heart! Now must the gentlest in thee become the hardest. He who hath always much-indulged himself, sickeneth at last by his much-indulgence. Praises on what maketh hardy! I do not praise the land where butter and honey--flow! To learn TO LOOK AWAY FROM oneself, is necessary in order to see MANY THINGS:--this hardiness is needed by every mountain-climber. He, however, who is obtrusive with his eyes as a discerner, how can he ever see more of anything than its foreground!
But thou, O Zarathustra, wouldst view the ground of everything, and its background: thus must thou mount even above thyself--up, upwards, until thou hast even thy stars UNDER thee! Yea! To look down upon myself, and even upon my stars: that only would I call my SUMMIT, that hath remained for me as my LAST summit!-- Thus spake Zarathustra to himself while ascending, comforting his heart with harsh maxims: for he was sore at heart as he had never been before. And when he had reached the top of the mountain-ridge, behold, there lay the other sea spread out before him: and he stood still and was long silent. The night, however, was cold at this height, and clear and starry. I recognise my destiny, said he at last, sadly. Well! I am ready. Now hath my last lonesomeness begun. Ah, this sombre, sad sea, below me! Ah, this sombre nocturnal vexation! Ah, fate and sea! To you must I now GO DOWN! Before my highest mountain do I stand, and before my longest wandering: therefore must I first go deeper down than I ever ascended: --Deeper down into pain than I ever ascended, even into its darkest flood! So willeth my fate. Well! I am ready. Whence come the highest mountains? so did I once ask. Then did I learn that they come out of the sea. That testimony is inscribed on their stones, and on the walls of their summits. Out of the deepest must the highest come to its height.-- Thus spake Zarathustra on the ridge of the mountain where it was cold: when, however, he came into the vicinity of the sea, and at last stood alone amongst the cliffs, then had he become weary on his way, and eagerer than ever before. Everything as yet sleepeth, said he; even the sea sleepeth. Drowsily and strangely doth its eye gaze upon me. But it breatheth warmly--I feel it. And I feel also that it dreameth. It tosseth about dreamily on hard pillows. Hark! Hark! How it groaneth with evil recollections! Or evil expectations? 
Ah, I am sad along with thee, thou dusky monster, and angry with myself even for thy sake. Ah, that my hand hath not strength enough! Gladly, indeed, would I free thee from evil dreams!-- And while Zarathustra thus spake, he laughed at himself with melancholy and bitterness. What! Zarathustra, said he, wilt thou even sing consolation to the sea? Ah, thou amiable fool, Zarathustra, thou too-blindly confiding one! But thus hast thou ever been: ever hast thou approached confidently all that is terrible. Every monster wouldst thou caress. A whiff of warm breath, a little soft tuft on its paw--: and immediately wert thou ready to love and lure it. LOVE is the danger of the lonesomest one, love to anything, IF IT ONLY LIVE! Laughable, verily, is my folly and my modesty in love!-- Thus spake Zarathustra, and laughed thereby a second time. Then, however, he thought of his abandoned friends--and as if he had done them a wrong with his thoughts, he upbraided himself because of his thoughts. And forthwith it came to pass that the laugher wept--with anger and longing wept Zarathustra bitterly. XLVI. THE VISION AND THE ENIGMA. 1. When it got abroad among the sailors that Zarathustra was on board the ship--for a man who came from the Happy Isles had gone on board along with him,--there was great curiosity and expectation. But Zarathustra kept silent for two days, and was cold and deaf with sadness; so that he neither answered looks nor questions. On the evening of the second day, however, he again opened his ears, though he still kept silent: for there were many curious and dangerous things to be heard on board the ship, which came from afar, and was to go still further. Zarathustra, however, was fond of all those who make distant voyages, and dislike to live without danger. And behold! when listening, his own tongue was at last loosened, and the ice of his heart broke. 
Then did he begin to speak thus: To you, the daring venturers and adventurers, and whoever hath embarked with cunning sails upon frightful seas,-- To you the enigma-intoxicated, the twilight-enjoyers, whose souls are allured by flutes to every treacherous gulf: --For ye dislike to grope at a thread with cowardly hand; and where ye can DIVINE, there do ye hate to CALCULATE-- To you only do I tell the enigma that I SAW--the vision of the lonesomest one.-- Gloomily walked I lately in corpse-coloured twilight--gloomily and sternly, with compressed lips. Not only one sun had set for me. A path which ascended daringly among boulders, an evil, lonesome path, which neither herb nor shrub any longer cheered, a mountain-path, crunched under the daring of my foot. Mutely marching over the scornful clinking of pebbles, trampling the stone that let it slip: thus did my foot force its way upwards. Upwards:--in spite of the spirit that drew it downwards, towards the abyss, the spirit of gravity, my devil and arch-enemy. Upwards:--although it sat upon me, half-dwarf, half-mole; paralysed, paralysing; dripping lead in mine ear, and thoughts like drops of lead into my brain. "O Zarathustra," it whispered scornfully, syllable by syllable, "thou stone of wisdom! Thou threwest thyself high, but every thrown stone must--fall! O Zarathustra, thou stone of wisdom, thou sling-stone, thou star-destroyer! Thyself threwest thou so high,--but every thrown stone--must fall! Condemned of thyself, and to thine own stoning: O Zarathustra, far indeed threwest thou thy stone--but upon THYSELF will it recoil!" Then was the dwarf silent; and it lasted long. The silence, however, oppressed me; and to be thus in pairs, one is verily lonesomer than when alone! I ascended, I ascended, I dreamt, I thought,--but everything oppressed me. 
A sick one did I resemble, whom bad torture wearieth, and a worse dream reawakeneth out of his first sleep.-- But there is something in me which I call courage: it hath hitherto slain for me every dejection. This courage at last bade me stand still and say: "Dwarf! Thou! Or I!"-- For courage is the best slayer,--courage which ATTACKETH: for in every attack there is sound of triumph. Man, however, is the most courageous animal: thereby hath he overcome every animal. With sound of triumph hath he overcome every pain; human pain, however, is the sorest pain. Courage slayeth also giddiness at abysses: and where doth man not stand at abysses! Is not seeing itself--seeing abysses? Courage is the best slayer: courage slayeth also fellow-suffering. Fellow-suffering, however, is the deepest abyss: as deeply as man looketh into life, so deeply also doth he look into suffering. Courage, however, is the best slayer, courage which attacketh: it slayeth even death itself; for it saith: "WAS THAT life? Well! Once more!" In such speech, however, there is much sound of triumph. He who hath ears to hear, let him hear.-- 2. "Halt, dwarf!" said I. "Either I--or thou! I, however, am the stronger of the two:--thou knowest not mine abysmal thought! IT--couldst thou not endure!" Then happened that which made me lighter: for the dwarf sprang from my shoulder, the prying sprite! And it squatted on a stone in front of me. There was however a gateway just where we halted. "Look at this gateway! Dwarf!" I continued, "it hath two faces. Two roads come together here: these hath no one yet gone to the end of. This long lane backwards: it continueth for an eternity. And that long lane forward--that is another eternity. They are antithetical to one another, these roads; they directly abut on one another:--and it is here, at this gateway, that they come together. The name of the gateway is inscribed above: 'This Moment.' 
But should one follow them further--and ever further and further on, thinkest thou, dwarf, that these roads would be eternally antithetical?"-- "Everything straight lieth," murmured the dwarf, contemptuously. "All truth is crooked; time itself is a circle." "Thou spirit of gravity!" said I wrathfully, "do not take it too lightly! Or I shall let thee squat where thou squattest, Haltfoot,--and I carried thee HIGH!" "Observe," continued I, "This Moment! From the gateway, This Moment, there runneth a long eternal lane BACKWARDS: behind us lieth an eternity. Must not whatever CAN run its course of all things, have already run along that lane? Must not whatever CAN happen of all things have already happened, resulted, and gone by? And if everything have already existed, what thinkest thou, dwarf, of This Moment? Must not this gateway also--have already existed? And are not all things closely bound together in such wise that This Moment draweth all coming things after it? CONSEQUENTLY--itself also? For whatever CAN run its course of all things, also in this long lane OUTWARD--MUST it once more run!-- And this slow spider which creepeth in the moonlight, and this moonlight itself, and thou and I in this gateway whispering together, whispering of eternal things--must we not all have already existed? --And must we not return and run in that other lane out before us, that long weird lane--must we not eternally return?"-- Thus did I speak, and always more softly: for I was afraid of mine own thoughts, and arrear-thoughts. Then, suddenly did I hear a dog HOWL near me. Had I ever heard a dog howl thus? My thoughts ran back. Yes! When I was a child, in my most distant childhood: --Then did I hear a dog howl thus. And saw it also, with hair bristling, its head upwards, trembling in the stillest midnight, when even dogs believe in ghosts: --So that it excited my commiseration. 
For just then went the full moon, silent as death, over the house; just then did it stand still, a glowing globe--at rest on the flat roof, as if on some one's property:-- Thereby had the dog been terrified: for dogs believe in thieves and ghosts. And when I again heard such howling, then did it excite my commiseration once more. Where was now the dwarf? And the gateway? And the spider? And all the whispering? Had I dreamt? Had I awakened? 'Twixt rugged rocks did I suddenly stand alone, dreary in the dreariest moonlight. BUT THERE LAY A MAN! And there! The dog leaping, bristling, whining--now did it see me coming--then did it howl again, then did it CRY:--had I ever heard a dog cry so for help? And verily, what I saw, the like had I never seen. A young shepherd did I see, writhing, choking, quivering, with distorted countenance, and with a heavy black serpent hanging out of his mouth. Had I ever seen so much loathing and pale horror on one countenance? He had perhaps gone to sleep? Then had the serpent crawled into his throat--there had it bitten itself fast. My hand pulled at the serpent, and pulled:--in vain! I failed to pull the serpent out of his throat. Then there cried out of me: "Bite! Bite! Its head off! Bite!"--so cried it out of me; my horror, my hatred, my loathing, my pity, all my good and my bad cried with one voice out of me.-- Ye daring ones around me! Ye venturers and adventurers, and whoever of you have embarked with cunning sails on unexplored seas! Ye enigma-enjoyers! Solve unto me the enigma that I then beheld, interpret unto me the vision of the lonesomest one! For it was a vision and a foresight:--WHAT did I then behold in parable? And WHO is it that must come some day? WHO is the shepherd into whose throat the serpent thus crawled? WHO is the man into whose throat all the heaviest and blackest will thus crawl? --The shepherd however bit as my cry had admonished him; he bit with a strong bite!
Far away did he spit the head of the serpent--: and sprang up.-- No longer shepherd, no longer man--a transfigured being, a light-surrounded being, that LAUGHED! Never on earth laughed a man as HE laughed! O my brethren, I heard a laughter which was no human laughter,--and now gnaweth a thirst at me, a longing that is never allayed. My longing for that laughter gnaweth at me: oh, how can I still endure to live! And how could I endure to die at present!-- Thus spake Zarathustra. XLVII. INVOLUNTARY BLISS. With such enigmas and bitterness in his heart did Zarathustra sail o'er the sea. When, however, he was four day-journeys from the Happy Isles and from his friends, then had he surmounted all his pain--: triumphantly and with firm foot did he again accept his fate. And then talked Zarathustra in this wise to his exulting conscience: Alone am I again, and like to be so, alone with the pure heaven, and the open sea; and again is the afternoon around me. On an afternoon did I find my friends for the first time; on an afternoon, also, did I find them a second time:--at the hour when all light becometh stiller. For whatever happiness is still on its way 'twixt heaven and earth, now seeketh for lodging a luminous soul: WITH HAPPINESS hath all light now become stiller. O afternoon of my life! Once did my happiness also descend to the valley that it might seek a lodging: then did it find those open hospitable souls. O afternoon of my life! What did I not surrender that I might have one thing: this living plantation of my thoughts, and this dawn of my highest hope! Companions did the creating one once seek, and children of HIS hope: and lo, it turned out that he could not find them, except he himself should first create them. Thus am I in the midst of my work, to my children going, and from them returning: for the sake of his children must Zarathustra perfect himself. 
For in one's heart one loveth only one's child and one's work; and where there is great love to oneself, then is it the sign of pregnancy: so have I found it. Still are my children verdant in their first spring, standing nigh one another, and shaken in common by the winds, the trees of my garden and of my best soil. And verily, where such trees stand beside one another, there ARE Happy Isles! But one day will I take them up, and put each by itself alone: that it may learn lonesomeness and defiance and prudence. Gnarled and crooked and with flexible hardness shall it then stand by the sea, a living lighthouse of unconquerable life. Yonder where the storms rush down into the sea, and the snout of the mountain drinketh water, shall each on a time have his day and night watches, for HIS testing and recognition. Recognised and tested shall each be, to see if he be of my type and lineage:--if he be master of a long will, silent even when he speaketh, and giving in such wise that he TAKETH in giving:-- --So that he may one day become my companion, a fellow-creator and fellow-enjoyer with Zarathustra:--such a one as writeth my will on my tables, for the fuller perfection of all things. And for his sake and for those like him, must I perfect MYSELF: therefore do I now avoid my happiness, and present myself to every misfortune--for MY final testing and recognition. And verily, it were time that I went away; and the wanderer's shadow and the longest tedium and the stillest hour--have all said unto me: "It is the highest time!" The word blew to me through the keyhole and said "Come!" The door sprang subtlely open unto me, and said "Go!" But I lay enchained to my love for my children: desire spread this snare for me--the desire for love--that I should become the prey of my children, and lose myself in them. Desiring--that is now for me to have lost myself. I POSSESS YOU, MY CHILDREN! In this possessing shall everything be assurance and nothing desire.
But brooding lay the sun of my love upon me, in his own juice stewed Zarathustra,--then did shadows and doubts fly past me. For frost and winter I now longed: "Oh, that frost and winter would again make me crack and crunch!" sighed I:--then arose icy mist out of me. My past burst its tomb, many pains buried alive woke up--: fully slept had they merely, concealed in corpse-clothes. So called everything unto me in signs: "It is time!" But I--heard not, until at last mine abyss moved, and my thought bit me. Ah, abysmal thought, which art MY thought! When shall I find strength to hear thee burrowing, and no longer tremble? To my very throat throbbeth my heart when I hear thee burrowing! Thy muteness even is like to strangle me, thou abysmal mute one! As yet have I never ventured to call thee UP; it hath been enough that I--have carried thee about with me! As yet have I not been strong enough for my final lion-wantonness and playfulness. Sufficiently formidable unto me hath thy weight ever been: but one day shall I yet find the strength and the lion's voice which will call thee up! When I shall have surmounted myself therein, then will I surmount myself also in that which is greater; and a VICTORY shall be the seal of my perfection!-- Meanwhile do I sail along on uncertain seas; chance flattereth me, smooth-tongued chance; forward and backward do I gaze--, still see I no end. As yet hath the hour of my final struggle not come to me--or doth it come to me perhaps just now? Verily, with insidious beauty do sea and life gaze upon me round about: O afternoon of my life! O happiness before eventide! O haven upon high seas! O peace in uncertainty! How I distrust all of you! Verily, distrustful am I of your insidious beauty! Like the lover am I, who distrusteth too sleek smiling. As he pusheth the best-beloved before him--tender even in severity, the jealous one--, so do I push this blissful hour before me. Away with thee, thou blissful hour!
With thee hath there come to me an involuntary bliss! Ready for my severest pain do I here stand:--at the wrong time hast thou come! Away with thee, thou blissful hour! Rather harbour there--with my children! Hasten! and bless them before eventide with MY happiness! There, already approacheth eventide: the sun sinketh. Away--my happiness!-- Thus spake Zarathustra. And he waited for his misfortune the whole night; but he waited in vain. The night remained clear and calm, and happiness itself came nigher and nigher unto him. Towards morning, however, Zarathustra laughed to his heart, and said mockingly: "Happiness runneth after me. That is because I do not run after women. Happiness, however, is a woman." XLVIII. BEFORE SUNRISE. O heaven above me, thou pure, thou deep heaven! Thou abyss of light! Gazing on thee, I tremble with divine desires. Up to thy height to toss myself--that is MY depth! In thy purity to hide myself--that is MINE innocence! The God veileth his beauty: thus hidest thou thy stars. Thou speakest not: THUS proclaimest thou thy wisdom unto me. Mute o'er the raging sea hast thou risen for me to-day; thy love and thy modesty make a revelation unto my raging soul. In that thou camest unto me beautiful, veiled in thy beauty, in that thou spakest unto me mutely, obvious in thy wisdom: Oh, how could I fail to divine all the modesty of thy soul! BEFORE the sun didst thou come unto me--the lonesomest one. We have been friends from the beginning: to us are grief, gruesomeness, and ground common; even the sun is common to us. We do not speak to each other, because we know too much--: we keep silent to each other, we smile our knowledge to each other. Art thou not the light of my fire? Hast thou not the sister-soul of mine insight? 
Together did we learn everything; together did we learn to ascend beyond ourselves to ourselves, and to smile uncloudedly:-- --Uncloudedly to smile down out of luminous eyes and out of miles of distance, when under us constraint and purpose and guilt steam like rain. And wandered I alone, for WHAT did my soul hunger by night and in labyrinthine paths? And climbed I mountains, WHOM did I ever seek, if not thee, upon mountains? And all my wandering and mountain-climbing: a necessity was it merely, and a makeshift of the unhandy one:--to FLY only, wanteth mine entire will, to fly into THEE! And what have I hated more than passing clouds, and whatever tainteth thee? And mine own hatred have I even hated, because it tainted thee! The passing clouds I detest--those stealthy cats of prey: they take from thee and me what is common to us--the vast unbounded Yea- and Amen-saying. These mediators and mixers we detest--the passing clouds: those half-and-half ones, that have neither learned to bless nor to curse from the heart. Rather will I sit in a tub under a closed heaven, rather will I sit in the abyss without heaven, than see thee, thou luminous heaven, tainted with passing clouds! And oft have I longed to pin them fast with the jagged gold-wires of lightning, that I might, like the thunder, beat the drum upon their kettle-bellies:-- --An angry drummer, because they rob me of thy Yea and Amen!--thou heaven above me, thou pure, thou luminous heaven! Thou abyss of light!--because they rob thee of MY Yea and Amen. For rather will I have noise and thunders and tempest-blasts, than this discreet, doubting cat-repose; and also amongst men do I hate most of all the soft-treaders, and half-and-half ones, and the doubting, hesitating, passing clouds. And "he who cannot bless shall LEARN to curse!"--this clear teaching dropt unto me from the clear heaven; this star standeth in my heaven even in dark nights.
I, however, am a blesser and a Yea-sayer, if thou be but around me, thou pure, thou luminous heaven! Thou abyss of light!--into all abysses do I then carry my beneficent Yea-saying. A blesser have I become and a Yea-sayer: and therefore strove I long and was a striver, that I might one day get my hands free for blessing. This, however, is my blessing: to stand above everything as its own heaven, its round roof, its azure bell and eternal security: and blessed is he who thus blesseth! For all things are baptized at the font of eternity, and beyond good and evil; good and evil themselves, however, are but fugitive shadows and damp afflictions and passing clouds. Verily, it is a blessing and not a blasphemy when I teach that "above all things there standeth the heaven of chance, the heaven of innocence, the heaven of hazard, the heaven of wantonness." "Of Hazard"--that is the oldest nobility in the world; that gave I back to all things; I emancipated them from bondage under purpose. This freedom and celestial serenity did I put like an azure bell above all things, when I taught that over them and through them, no "eternal Will"--willeth. This wantonness and folly did I put in place of that Will, when I taught that "In everything there is one thing impossible--rationality!" A LITTLE reason, to be sure, a germ of wisdom scattered from star to star--this leaven is mixed in all things: for the sake of folly, wisdom is mixed in all things! A little wisdom is indeed possible; but this blessed security have I found in all things, that they prefer--to DANCE on the feet of chance. O heaven above me! thou pure, thou lofty heaven! This is now thy purity unto me, that there is no eternal reason-spider and reason-cobweb:-- --That thou art to me a dancing-floor for divine chances, that thou art to me a table of the Gods, for divine dice and dice-players!-- But thou blushest? Have I spoken unspeakable things? Have I abused, when I meant to bless thee?
Or is it the shame of being two of us that maketh thee blush!--Dost thou bid me go and be silent, because now--DAY cometh? The world is deep:--and deeper than e'er the day could read. Not everything may be uttered in presence of day. But day cometh: so let us part! O heaven above me, thou modest one! thou glowing one! O thou, my happiness before sunrise! The day cometh: so let us part!-- Thus spake Zarathustra. XLIX. THE BEDWARFING VIRTUE. 1. When Zarathustra was again on the continent, he did not go straightway to his mountains and his cave, but made many wanderings and questionings, and ascertained this and that; so that he said of himself jestingly: "Lo, a river that floweth back unto its source in many windings!" For he wanted to learn what had taken place AMONG MEN during the interval: whether they had become greater or smaller. And once, when he saw a row of new houses, he marvelled, and said: "What do these houses mean? Verily, no great soul put them up as its simile! Did perhaps a silly child take them out of its toy-box? Would that another child put them again into the box! And these rooms and chambers--can MEN go out and in there? They seem to be made for silk dolls; or for dainty-eaters, who perhaps let others eat with them." And Zarathustra stood still and meditated. At last he said sorrowfully: "There hath EVERYTHING become smaller! Everywhere do I see lower doorways: he who is of MY type can still go therethrough, but--he must stoop! Oh, when shall I arrive again at my home, where I shall no longer have to stoop--shall no longer have to stoop BEFORE THE SMALL ONES!"--And Zarathustra sighed, and gazed into the distance.-- The same day, however, he gave his discourse on the bedwarfing virtue. 2. I pass through this people and keep mine eyes open: they do not forgive me for not envying their virtues. 
They bite at me, because I say unto them that for small people, small virtues are necessary--and because it is hard for me to understand that small people are NECESSARY! Here am I still like a cock in a strange farm-yard, at which even the hens peck: but on that account I am not unfriendly to the hens. I am courteous towards them, as towards all small annoyances; to be prickly towards what is small, seemeth to me wisdom for hedgehogs. They all speak of me when they sit around their fire in the evening--they speak of me, but no one thinketh--of me! This is the new stillness which I have experienced: their noise around me spreadeth a mantle over my thoughts. They shout to one another: "What is this gloomy cloud about to do to us? Let us see that it doth not bring a plague upon us!" And recently did a woman seize upon her child that was coming unto me: "Take the children away," cried she, "such eyes scorch children's souls." They cough when I speak: they think coughing an objection to strong winds--they divine nothing of the boisterousness of my happiness! "We have not yet time for Zarathustra"--so they object; but what matter about a time that "hath no time" for Zarathustra? And if they should altogether praise me, how could I go to sleep on THEIR praise? A girdle of spines is their praise unto me: it scratcheth me even when I take it off. And this also did I learn among them: the praiser doeth as if he gave back; in truth, however, he wanteth more to be given him! Ask my foot if their lauding and luring strains please it! Verily, to such measure and ticktack, it liketh neither to dance nor to stand still. To small virtues would they fain lure and laud me; to the ticktack of small happiness would they fain persuade my foot. I pass through this people and keep mine eyes open; they have become SMALLER, and ever become smaller:--THE REASON THEREOF IS THEIR DOCTRINE OF HAPPINESS AND VIRTUE. For they are moderate also in virtue,--because they want comfort. 
With comfort, however, moderate virtue only is compatible. To be sure, they also learn in their way to stride on and stride forward: that, I call their HOBBLING.--Thereby they become a hindrance to all who are in haste. And many of them go forward, and look backwards thereby, with stiffened necks: those do I like to run up against. Foot and eye shall not lie, nor give the lie to each other. But there is much lying among small people. Some of them WILL, but most of them are WILLED. Some of them are genuine, but most of them are bad actors. There are actors without knowing it amongst them, and actors without intending it--, the genuine ones are always rare, especially the genuine actors. Of man there is little here: therefore do their women masculinise themselves. For only he who is man enough, will--SAVE THE WOMAN in woman. And this hypocrisy found I worst amongst them, that even those who command feign the virtues of those who serve. "I serve, thou servest, we serve"--so chanteth here even the hypocrisy of the rulers--and alas! if the first lord be ONLY the first servant! Ah, even upon their hypocrisy did mine eyes' curiosity alight; and well did I divine all their fly-happiness, and their buzzing around sunny window-panes. So much kindness, so much weakness do I see. So much justice and pity, so much weakness. Round, fair, and considerate are they to one another, as grains of sand are round, fair, and considerate to grains of sand. Modestly to embrace a small happiness--that do they call "submission"! and at the same time they peer modestly after a new small happiness. In their hearts they want simply one thing most of all: that no one hurt them. Thus do they anticipate every one's wishes and do well unto every one. That, however, is COWARDICE, though it be called "virtue."-- And when they chance to speak harshly, those small people, then do _I_ hear therein only their hoarseness--every draught of air maketh them hoarse. 
Shrewd indeed are they, their virtues have shrewd fingers. But they lack fists: their fingers do not know how to creep behind fists. Virtue for them is what maketh modest and tame: therewith have they made the wolf a dog, and man himself man's best domestic animal. "We set our chair in the MIDST"--so saith their smirking unto me--"and as far from dying gladiators as from satisfied swine." That, however, is--MEDIOCRITY, though it be called moderation.-- 3. I pass through this people and let fall many words: but they know neither how to take nor how to retain them. They wonder why I came not to revile venery and vice; and verily, I came not to warn against pickpockets either! They wonder why I am not ready to abet and whet their wisdom: as if they had not yet enough of wiseacres, whose voices grate on mine ear like slate-pencils! And when I call out: "Curse all the cowardly devils in you, that would fain whimper and fold the hands and adore"--then do they shout: "Zarathustra is godless." And especially do their teachers of submission shout this;--but precisely in their ears do I love to cry: "Yea! I AM Zarathustra, the godless!" Those teachers of submission! Wherever there is aught puny, or sickly, or scabby, there do they creep like lice; and only my disgust preventeth me from cracking them. Well! This is my sermon for THEIR ears: I am Zarathustra the godless, who saith: "Who is more godless than I, that I may enjoy his teaching?" I am Zarathustra the godless: where do I find mine equal? And all those are mine equals who give unto themselves their Will, and divest themselves of all submission. I am Zarathustra the godless! I cook every chance in MY pot. And only when it hath been quite cooked do I welcome it as MY food. 
And verily, many a chance came imperiously unto me: but still more imperiously did my WILL speak unto it,--then did it lie imploringly upon its knees-- --Imploring that it might find home and heart with me, and saying flatteringly: "See, O Zarathustra, how friend only cometh unto friend!"-- But why talk I, when no one hath MINE ears! And so will I shout it out unto all the winds: Ye ever become smaller, ye small people! Ye crumble away, ye comfortable ones! Ye will yet perish-- --By your many small virtues, by your many small omissions, and by your many small submissions! Too tender, too yielding: so is your soil! But for a tree to become GREAT, it seeketh to twine hard roots around hard rocks! Also what ye omit weaveth at the web of all the human future; even your naught is a cobweb, and a spider that liveth on the blood of the future. And when ye take, then is it like stealing, ye small virtuous ones; but even among knaves HONOUR saith that "one shall only steal when one cannot rob." "It giveth itself"--that is also a doctrine of submission. But I say unto you, ye comfortable ones, that IT TAKETH TO ITSELF, and will ever take more and more from you! Ah, that ye would renounce all HALF-willing, and would decide for idleness as ye decide for action! Ah, that ye understood my word: "Do ever what ye will--but first be such as CAN WILL. Love ever your neighbour as yourselves--but first be such as LOVE THEMSELVES-- --Such as love with great love, such as love with great contempt!" Thus speaketh Zarathustra the godless.-- But why talk I, when no one hath MINE ears! It is still an hour too early for me here. Mine own forerunner am I among this people, mine own cockcrow in dark lanes. But THEIR hour cometh! And there cometh also mine! Hourly do they become smaller, poorer, unfruitfuller,--poor herbs! poor earth! And SOON shall they stand before me like dry grass and prairie, and verily, weary of themselves--and panting for FIRE, more than for water! 
O blessed hour of the lightning! O mystery before noontide!--Running fires will I one day make of them, and heralds with flaming tongues:-- --Herald shall they one day with flaming tongues: It cometh, it is nigh, THE GREAT NOONTIDE! Thus spake Zarathustra. L. ON THE OLIVE-MOUNT. Winter, a bad guest, sitteth with me at home; blue are my hands with his friendly hand-shaking. I honour him, that bad guest, but gladly leave him alone. Gladly do I run away from him; and when one runneth WELL, then one escapeth him! With warm feet and warm thoughts do I run where the wind is calm--to the sunny corner of mine olive-mount. There do I laugh at my stern guest, and am still fond of him; because he cleareth my house of flies, and quieteth many little noises. For he suffereth it not if a gnat wanteth to buzz, or even two of them; also the lanes maketh he lonesome, so that the moonlight is afraid there at night. A hard guest is he,--but I honour him, and do not worship, like the tenderlings, the pot-bellied fire-idol. Better even a little teeth-chattering than idol-adoration!--so willeth my nature. And especially have I a grudge against all ardent, steaming, steamy fire-idols. Him whom I love, I love better in winter than in summer; better do I now mock at mine enemies, and more heartily, when winter sitteth in my house. Heartily, verily, even when I CREEP into bed--: there, still laugheth and wantoneth my hidden happiness; even my deceptive dream laugheth. I, a--creeper? Never in my life did I creep before the powerful; and if ever I lied, then did I lie out of love. Therefore am I glad even in my winter-bed. A poor bed warmeth me more than a rich one, for I am jealous of my poverty. And in winter she is most faithful unto me. With a wickedness do I begin every day: I mock at the winter with a cold bath: on that account grumbleth my stern house-mate. Also do I like to tickle him with a wax-taper, that he may finally let the heavens emerge from ashy-grey twilight. 
For especially wicked am I in the morning: at the early hour when the pail rattleth at the well, and horses neigh warmly in grey lanes:-- Impatiently do I then wait, that the clear sky may finally dawn for me, the snow-bearded winter-sky, the hoary one, the white-head,-- --The winter-sky, the silent winter-sky, which often stifleth even its sun! Did I perhaps learn from it the long clear silence? Or did it learn it from me? Or hath each of us devised it himself? Of all good things the origin is a thousandfold,--all good roguish things spring into existence for joy: how could they always do so--for once only! A good roguish thing is also the long silence, and to look, like the winter-sky, out of a clear, round-eyed countenance:-- --Like it to stifle one's sun, and one's inflexible solar will: verily, this art and this winter-roguishness have I learnt WELL! My best-loved wickedness and art is it, that my silence hath learned not to betray itself by silence. Clattering with diction and dice, I outwit the solemn assistants: all those stern watchers, shall my will and purpose elude. That no one might see down into my depth and into mine ultimate will--for that purpose did I devise the long clear silence. Many a shrewd one did I find: he veiled his countenance and made his water muddy, that no one might see therethrough and thereunder. But precisely unto him came the shrewder distrusters and nut-crackers: precisely from him did they fish his best-concealed fish! But the clear, the honest, the transparent--these are for me the wisest silent ones: in them, so PROFOUND is the depth that even the clearest water doth not--betray it.-- Thou snow-bearded, silent, winter-sky, thou round-eyed whitehead above me! Oh, thou heavenly simile of my soul and its wantonness! And MUST I not conceal myself like one who hath swallowed gold--lest my soul should be ripped up? MUST I not wear stilts, that they may OVERLOOK my long legs--all those enviers and injurers around me? 
Those dingy, fire-warmed, used-up, green-tinted, ill-natured souls--how COULD their envy endure my happiness! Thus do I show them only the ice and winter of my peaks--and NOT that my mountain windeth all the solar girdles around it! They hear only the whistling of my winter-storms: and know NOT that I also travel over warm seas, like longing, heavy, hot south-winds. They commiserate also my accidents and chances:--but MY word saith: "Suffer the chance to come unto me: innocent is it as a little child!" How COULD they endure my happiness, if I did not put around it accidents, and winter-privations, and bear-skin caps, and enmantling snowflakes! --If I did not myself commiserate their PITY, the pity of those enviers and injurers! --If I did not myself sigh before them, and chatter with cold, and patiently LET myself be swathed in their pity! This is the wise waggish-will and good-will of my soul, that it CONCEALETH NOT its winters and glacial storms; it concealeth not its chilblains either. To one man, lonesomeness is the flight of the sick one; to another, it is the flight FROM the sick ones. Let them HEAR me chattering and sighing with winter-cold, all those poor squinting knaves around me! With such sighing and chattering do I flee from their heated rooms. Let them sympathise with me and sigh with me on account of my chilblains: "At the ice of knowledge will he yet FREEZE TO DEATH!"--so they mourn. Meanwhile do I run with warm feet hither and thither on mine olive-mount: in the sunny corner of mine olive-mount do I sing, and mock at all pity.-- Thus sang Zarathustra. LI. ON PASSING-BY. Thus slowly wandering through many peoples and divers cities, did Zarathustra return by round-about roads to his mountains and his cave. And behold, thereby came he unawares also to the gate of the GREAT CITY. Here, however, a foaming fool, with extended hands, sprang forward to him and stood in his way. 
It was the same fool whom the people called "the ape of Zarathustra:" for he had learned from him something of the expression and modulation of language, and perhaps liked also to borrow from the store of his wisdom. And the fool talked thus to Zarathustra: O Zarathustra, here is the great city: here hast thou nothing to seek and everything to lose. Why wouldst thou wade through this mire? Have pity upon thy foot! Spit rather on the gate of the city, and--turn back! Here is the hell for anchorites' thoughts: here are great thoughts seethed alive and boiled small. Here do all great sentiments decay: here may only rattle-boned sensations rattle! Smellest thou not already the shambles and cookshops of the spirit? Steameth not this city with the fumes of slaughtered spirit? Seest thou not the souls hanging like limp dirty rags?--And they make newspapers also out of these rags! Hearest thou not how spirit hath here become a verbal game? Loathsome verbal swill doth it vomit forth!--And they make newspapers also out of this verbal swill. They hound one another, and know not whither! They inflame one another, and know not why! They tinkle with their pinchbeck, they jingle with their gold. They are cold, and seek warmth from distilled waters: they are inflamed, and seek coolness from frozen spirits; they are all sick and sore through public opinion. All lusts and vices are here at home; but here there are also the virtuous; there is much appointable appointed virtue:-- Much appointable virtue with scribe-fingers, and hardy sitting-flesh and waiting-flesh, blessed with small breast-stars, and padded, haunchless daughters. There is here also much piety, and much faithful spittle-licking and spittle-backing, before the God of Hosts. "From on high," drippeth the star, and the gracious spittle; for the high, longeth every starless bosom. 
The moon hath its court, and the court hath its moon-calves: unto all, however, that cometh from the court do the mendicant people pray, and all appointable mendicant virtues. "I serve, thou servest, we serve"--so prayeth all appointable virtue to the prince: that the merited star may at last stick on the slender breast! But the moon still revolveth around all that is earthly: so revolveth also the prince around what is earthliest of all--that, however, is the gold of the shopman. The God of the Hosts of war is not the God of the golden bar; the prince proposeth, but the shopman--disposeth! By all that is luminous and strong and good in thee, O Zarathustra! Spit on this city of shopmen and return back! Here floweth all blood putridly and tepidly and frothily through all veins: spit on the great city, which is the great slum where all the scum frotheth together! Spit on the city of compressed souls and slender breasts, of pointed eyes and sticky fingers-- --On the city of the obtrusive, the brazen-faced, the pen-demagogues and tongue-demagogues, the overheated ambitious:-- Where everything maimed, ill-famed, lustful, untrustful, over-mellow, sickly-yellow and seditious, festereth pernicious:-- --Spit on the great city and turn back!-- Here, however, did Zarathustra interrupt the foaming fool, and shut his mouth.-- Stop this at once! called out Zarathustra, long have thy speech and thy species disgusted me! Why didst thou live so long by the swamp, that thou thyself hadst to become a frog and a toad? Floweth there not a tainted, frothy, swamp-blood in thine own veins, when thou hast thus learned to croak and revile? Why wentest thou not into the forest? Or why didst thou not till the ground? Is the sea not full of green islands? I despise thy contempt; and when thou warnedst me--why didst thou not warn thyself? 
Out of love alone shall my contempt and my warning bird take wing; but not out of the swamp!-- They call thee mine ape, thou foaming fool: but I call thee my grunting-pig,--by thy grunting, thou spoilest even my praise of folly. What was it that first made thee grunt? Because no one sufficiently FLATTERED thee:--therefore didst thou seat thyself beside this filth, that thou mightest have cause for much grunting,-- --That thou mightest have cause for much VENGEANCE! For vengeance, thou vain fool, is all thy foaming; I have divined thee well! But thy fools'-word injureth ME, even when thou art right! And even if Zarathustra's word WERE a hundred times justified, thou wouldst ever--DO wrong with my word! Thus spake Zarathustra. Then did he look on the great city and sighed, and was long silent. At last he spake thus: I loathe also this great city, and not only this fool. Here and there--there is nothing to better, nothing to worsen. Woe to this great city!--And I would that I already saw the pillar of fire in which it will be consumed! For such pillars of fire must precede the great noontide. But this hath its time and its own fate.-- This precept, however, give I unto thee, in parting, thou fool: Where one can no longer love, there should one--PASS BY!-- Thus spake Zarathustra, and passed by the fool and the great city. LII. THE APOSTATES. 1. Ah, lieth everything already withered and grey which but lately stood green and many-hued on this meadow! And how much honey of hope did I carry hence into my beehives! Those young hearts have already all become old--and not old even! only weary, ordinary, comfortable:--they declare it: "We have again become pious." Of late did I see them run forth at early morn with valorous steps: but the feet of their knowledge became weary, and now do they malign even their morning valour! Verily, many of them once lifted their legs like the dancer; to them winked the laughter of my wisdom:--then did they bethink themselves. 
Just now have I seen them bent down--to creep to the cross. Around light and liberty did they once flutter like gnats and young poets. A little older, a little colder: and already are they mystifiers, and mumblers and mollycoddles. Did perhaps their hearts despond, because lonesomeness had swallowed me like a whale? Did their ear perhaps hearken yearningly-long for me IN VAIN, and for my trumpet-notes and herald-calls? --Ah! Ever are there but few of those whose hearts have persistent courage and exuberance; and in such remaineth also the spirit patient. The rest, however, are COWARDLY. The rest: these are always the great majority, the common-place, the superfluous, the far-too many--those all are cowardly!-- Him who is of my type, will also the experiences of my type meet on the way: so that his first companions must be corpses and buffoons. His second companions, however--they will call themselves his BELIEVERS,--will be a living host, with much love, much folly, much unbearded veneration. To those believers shall he who is of my type among men not bind his heart; in those spring-times and many-hued meadows shall he not believe, who knoweth the fickly faint-hearted human species! COULD they do otherwise, then would they also WILL otherwise. The half-and-half spoil every whole. That leaves become withered,--what is there to lament about that! Let them go and fall away, O Zarathustra, and do not lament! Better even to blow amongst them with rustling winds,-- --Blow amongst those leaves, O Zarathustra, that everything WITHERED may run away from thee the faster!-- 2. "We have again become pious"--so do those apostates confess; and some of them are still too pusillanimous thus to confess. Unto them I look into the eye,--before them I say it unto their face and unto the blush on their cheeks: Ye are those who again PRAY! It is however a shame to pray! Not for all, but for thee, and me, and whoever hath his conscience in his head. For THEE it is a shame to pray! 
Thou knowest it well: the faint-hearted devil in thee, which would fain fold its arms, and place its hands in its bosom, and take it easier:--this faint-hearted devil persuadeth thee that "there IS a God!" THEREBY, however, dost thou belong to the light-dreading type, to whom light never permitteth repose: now must thou daily thrust thy head deeper into obscurity and vapour! And verily, thou choosest the hour well: for just now do the nocturnal birds again fly abroad. The hour hath come for all light-dreading people, the vesper hour and leisure hour, when they do not--"take leisure." I hear it and smell it: it hath come--their hour for hunt and procession, not indeed for a wild hunt, but for a tame, lame, snuffling, soft-treaders', soft-prayers' hunt,-- --For a hunt after susceptible simpletons: all mouse-traps for the heart have again been set! And whenever I lift a curtain, a night-moth rusheth out of it. Did it perhaps squat there along with another night-moth? For everywhere do I smell small concealed communities; and wherever there are closets there are new devotees therein, and the atmosphere of devotees. They sit for long evenings beside one another, and say: "Let us again become like little children and say, 'good God!'"--ruined in mouths and stomachs by the pious confectioners. Or they look for long evenings at a crafty, lurking cross-spider, that preacheth prudence to the spiders themselves, and teacheth that "under crosses it is good for cobweb-spinning!" Or they sit all day at swamps with angle-rods, and on that account think themselves PROFOUND; but whoever fisheth where there are no fish, I do not even call him superficial! Or they learn in godly-gay style to play the harp with a hymn-poet, who would fain harp himself into the heart of young girls:--for he hath tired of old girls and their praises. Or they learn to shudder with a learned semi-madcap, who waiteth in darkened rooms for spirits to come to him--and the spirit runneth away entirely! 
Or they listen to an old roving howl--and growl-piper, who hath learnt from the sad winds the sadness of sounds; now pipeth he as the wind, and preacheth sadness in sad strains. And some of them have even become night-watchmen: they know now how to blow horns, and go about at night and awaken old things which have long fallen asleep. Five words about old things did I hear yester-night at the garden-wall: they came from such old, sorrowful, arid night-watchmen. "For a father he careth not sufficiently for his children: human fathers do this better!"-- "He is too old! He now careth no more for his children,"--answered the other night-watchman. "HATH he then children? No one can prove it unless he himself prove it! I have long wished that he would for once prove it thoroughly." "Prove? As if HE had ever proved anything! Proving is difficult to him; he layeth great stress on one's BELIEVING him." "Ay! Ay! Belief saveth him; belief in him. That is the way with old people! So it is with us also!"-- --Thus spake to each other the two old night-watchmen and light-scarers, and tooted thereupon sorrowfully on their horns: so did it happen yester-night at the garden-wall. To me, however, did the heart writhe with laughter, and was like to break; it knew not where to go, and sunk into the midriff. Verily, it will be my death yet--to choke with laughter when I see asses drunken, and hear night-watchmen thus doubt about God. Hath the time not LONG since passed for all such doubts? Who may nowadays awaken such old slumbering, light-shunning things! With the old Deities hath it long since come to an end:--and verily, a good joyful Deity-end had they! They did not "begloom" themselves to death--that do people fabricate! On the contrary, they--LAUGHED themselves to death once on a time! That took place when the unGodliest utterance came from a God himself--the utterance: "There is but one God! 
Thou shalt have no other Gods before me!"-- --An old grim-beard of a God, a jealous one, forgot himself in such wise:-- And all the Gods then laughed, and shook upon their thrones, and exclaimed: "Is it not just divinity that there are Gods, but no God?" He that hath an ear let him hear.-- Thus talked Zarathustra in the city he loved, which is surnamed "The Pied Cow." For from here he had but two days to travel to reach once more his cave and his animals; his soul, however, rejoiced unceasingly on account of the nighness of his return home. LIII. THE RETURN HOME. O lonesomeness! My HOME, lonesomeness! Too long have I lived wildly in wild remoteness, to return to thee without tears! Now threaten me with the finger as mothers threaten; now smile upon me as mothers smile; now say just: "Who was it that like a whirlwind once rushed away from me?-- --Who when departing called out: 'Too long have I sat with lonesomeness; there have I unlearned silence!' THAT hast thou learned now--surely? O Zarathustra, everything do I know; and that thou wert MORE FORSAKEN amongst the many, thou unique one, than thou ever wert with me! One thing is forsakenness, another matter is lonesomeness: THAT hast thou now learned! And that amongst men thou wilt ever be wild and strange: --Wild and strange even when they love thee: for above all they want to be TREATED INDULGENTLY! Here, however, art thou at home and house with thyself; here canst thou utter everything, and unbosom all motives; nothing is here ashamed of concealed, congealed feelings. Here do all things come caressingly to thy talk and flatter thee: for they want to ride upon thy back. On every simile dost thou here ride to every truth. Uprightly and openly mayest thou here talk to all things: and verily, it soundeth as praise in their ears, for one to talk to all things--directly! Another matter, however, is forsakenness. For, dost thou remember, O Zarathustra? 
When thy bird screamed overhead, when thou stoodest in the forest, irresolute, ignorant where to go, beside a corpse:-- --When thou spakest: 'Let mine animals lead me! More dangerous have I found it among men than among animals:'--THAT was forsakenness! And dost thou remember, O Zarathustra? When thou sattest in thine isle, a well of wine giving and granting amongst empty buckets, bestowing and distributing amongst the thirsty: --Until at last thou alone sattest thirsty amongst the drunken ones, and wailedst nightly: 'Is taking not more blessed than giving? And stealing yet more blessed than taking?'--THAT was forsakenness! And dost thou remember, O Zarathustra? When thy stillest hour came and drove thee forth from thyself, when with wicked whispering it said: 'Speak and succumb!'-- --When it disgusted thee with all thy waiting and silence, and discouraged thy humble courage: THAT was forsakenness!"-- O lonesomeness! My home, lonesomeness! How blessedly and tenderly speaketh thy voice unto me! We do not question each other, we do not complain to each other; we go together openly through open doors. For all is open with thee and clear; and even the hours run here on lighter feet. For in the dark, time weigheth heavier upon one than in the light. Here fly open unto me all being's words and word-cabinets: here all being wanteth to become words, here all becoming wanteth to learn of me how to talk. Down there, however--all talking is in vain! There, forgetting and passing-by are the best wisdom: THAT have I learned now! He who would understand everything in man must handle everything. But for that I have too clean hands. I do not like even to inhale their breath; alas! that I have lived so long among their noise and bad breaths! O blessed stillness around me! O pure odours around me! How from a deep breast this stillness fetcheth pure breath! How it hearkeneth, this blessed stillness! But down there--there speaketh everything, there is everything misheard. 
If one announce one's wisdom with bells, the shopmen in the market-place will out-jingle it with pennies! Everything among them talketh; no one knoweth any longer how to understand. Everything falleth into the water; nothing falleth any longer into deep wells. Everything among them talketh, nothing succeedeth any longer and accomplisheth itself. Everything cackleth, but who will still sit quietly on the nest and hatch eggs? Everything among them talketh, everything is out-talked. And that which yesterday was still too hard for time itself and its tooth, hangeth to-day, outchamped and outchewed, from the mouths of the men of to-day. Everything among them talketh, everything is betrayed. And what was once called the secret and secrecy of profound souls, belongeth to-day to the street-trumpeters and other butterflies. O human hubbub, thou wonderful thing! Thou noise in dark streets! Now art thou again behind me:--my greatest danger lieth behind me! In indulging and pitying lay ever my greatest danger; and all human hubbub wisheth to be indulged and tolerated. With suppressed truths, with fool's hand and befooled heart, and rich in petty lies of pity:--thus have I ever lived among men. Disguised did I sit amongst them, ready to misjudge MYSELF that I might endure THEM, and willingly saying to myself: "Thou fool, thou dost not know men!" One unlearneth men when one liveth amongst them: there is too much foreground in all men--what can far-seeing, far-longing eyes do THERE! And, fool that I was, when they misjudged me, I indulged them on that account more than myself, being habitually hard on myself, and often even taking revenge on myself for the indulgence. Stung all over by poisonous flies, and hollowed like the stone by many drops of wickedness: thus did I sit among them, and still said to myself: "Innocent is everything petty of its pettiness!" 
Especially did I find those who call themselves "the good," the most poisonous flies; they sting in all innocence, they lie in all innocence; how COULD they--be just towards me! He who liveth amongst the good--pity teacheth him to lie. Pity maketh stifling air for all free souls. For the stupidity of the good is unfathomable. To conceal myself and my riches--THAT did I learn down there: for every one did I still find poor in spirit. It was the lie of my pity, that I knew in every one, --That I saw and scented in every one, what was ENOUGH of spirit for him, and what was TOO MUCH! Their stiff wise men: I call them wise, not stiff--thus did I learn to slur over words. The grave-diggers dig for themselves diseases. Under old rubbish rest bad vapours. One should not stir up the marsh. One should live on mountains. With blessed nostrils do I again breathe mountain-freedom. Freed at last is my nose from the smell of all human hubbub! With sharp breezes tickled, as with sparkling wine, SNEEZETH my soul--sneezeth, and shouteth self-congratulatingly: "Health to thee!" Thus spake Zarathustra. LIV. THE THREE EVIL THINGS. 1. In my dream, in my last morning-dream, I stood to-day on a promontory--beyond the world; I held a pair of scales, and WEIGHED the world. Alas, that the rosy dawn came too early to me: she glowed me awake, the jealous one! Jealous is she always of the glows of my morning-dream. Measurable by him who hath time, weighable by a good weigher, attainable by strong pinions, divinable by divine nut-crackers: thus did my dream find the world:-- My dream, a bold sailor, half-ship, half-hurricane, silent as the butterfly, impatient as the falcon: how had it the patience and leisure to-day for world-weighing! Did my wisdom perhaps speak secretly to it, my laughing, wide-awake day-wisdom, which mocketh at all "infinite worlds"? For it saith: "Where force is, there becometh NUMBER the master: it hath more force." 
How confidently did my dream contemplate this finite world, not new-fangledly, not old-fangledly, not timidly, not entreatingly:-- --As if a big round apple presented itself to my hand, a ripe golden apple, with a coolly-soft, velvety skin:--thus did the world present itself unto me:-- --As if a tree nodded unto me, a broad-branched, strong-willed tree, curved as a recline and a foot-stool for weary travellers: thus did the world stand on my promontory:-- --As if delicate hands carried a casket towards me--a casket open for the delectation of modest adoring eyes: thus did the world present itself before me to-day:-- --Not riddle enough to scare human love from it, not solution enough to put to sleep human wisdom:--a humanly good thing was the world to me to-day, of which such bad things are said! How I thank my morning-dream that I thus at to-day's dawn, weighed the world! As a humanly good thing did it come unto me, this dream and heart-comforter! And that I may do the like by day, and imitate and copy its best, now will I put the three worst things on the scales, and weigh them humanly well.-- He who taught to bless taught also to curse: what are the three best cursed things in the world? These will I put on the scales. VOLUPTUOUSNESS, PASSION FOR POWER, and SELFISHNESS: these three things have hitherto been best cursed, and have been in worst and falsest repute-- these three things will I weigh humanly well. Well! Here is my promontory, and there is the sea--IT rolleth hither unto me, shaggily and fawningly, the old, faithful, hundred-headed dog-monster that I love!-- Well! Here will I hold the scales over the weltering sea: and also a witness do I choose to look on--thee, the anchorite-tree, thee, the strong-odoured, broad-arched tree that I love!-- On what bridge goeth the now to the hereafter? By what constraint doth the high stoop to the low?
And what enjoineth even the highest still--to grow upwards?-- Now stand the scales poised and at rest: three heavy questions have I thrown in; three heavy answers carrieth the other scale.

2.

Voluptuousness: unto all hair-shirted despisers of the body, a sting and stake; and, cursed as "the world," by all backworldsmen: for it mocketh and befooleth all erring, misinferring teachers. Voluptuousness: to the rabble, the slow fire at which it is burnt; to all wormy wood, to all stinking rags, the prepared heat and stew furnace. Voluptuousness: to free hearts, a thing innocent and free, the garden-happiness of the earth, all the future's thanks-overflow to the present. Voluptuousness: only to the withered a sweet poison; to the lion-willed, however, the great cordial, and the reverently saved wine of wines. Voluptuousness: the great symbolic happiness of a higher happiness and highest hope. For to many is marriage promised, and more than marriage,-- --To many that are more unknown to each other than man and woman:--and who hath fully understood HOW UNKNOWN to each other are man and woman! Voluptuousness:--but I will have hedges around my thoughts, and even around my words, lest swine and libertine should break into my gardens!-- Passion for power: the glowing scourge of the hardest of the heart-hard; the cruel torture reserved for the cruellest themselves; the gloomy flame of living pyres. Passion for power: the wicked gadfly which is mounted on the vainest peoples; the scorner of all uncertain virtue; which rideth on every horse and on every pride. Passion for power: the earthquake which breaketh and upbreaketh all that is rotten and hollow; the rolling, rumbling, punitive demolisher of whited sepulchres; the flashing interrogative-sign beside premature answers.
Passion for power: before whose glance man creepeth and croucheth and drudgeth, and becometh lower than the serpent and the swine:--until at last great contempt crieth out of him--, Passion for power: the terrible teacher of great contempt, which preacheth to their face to cities and empires: "Away with thee!"--until a voice crieth out of themselves: "Away with ME!" Passion for power: which, however, mounteth alluringly even to the pure and lonesome, and up to self-satisfied elevations, glowing like a love that painteth purple felicities alluringly on earthly heavens. Passion for power: but who would call it PASSION, when the height longeth to stoop for power! Verily, nothing sick or diseased is there in such longing and descending! That the lonesome height may not for ever remain lonesome and self-sufficing; that the mountains may come to the valleys and the winds of the heights to the plains:-- Oh, who could find the right prenomen and honouring name for such longing! "Bestowing virtue"--thus did Zarathustra once name the unnamable. And then it happened also,--and verily, it happened for the first time!-- that his word blessed SELFISHNESS, the wholesome, healthy selfishness, that springeth from the powerful soul:-- --From the powerful soul, to which the high body appertaineth, the handsome, triumphing, refreshing body, around which everything becometh a mirror: --The pliant, persuasive body, the dancer, whose symbol and epitome is the self-enjoying soul. Of such bodies and souls the self-enjoyment calleth itself "virtue." With its words of good and bad doth such self-enjoyment shelter itself as with sacred groves; with the names of its happiness doth it banish from itself everything contemptible. Away from itself doth it banish everything cowardly; it saith: "Bad--THAT IS cowardly!" Contemptible seem to it the ever-solicitous, the sighing, the complaining, and whoever pick up the most trifling advantage.
It despiseth also all bitter-sweet wisdom: for verily, there is also wisdom that bloometh in the dark, a night-shade wisdom, which ever sigheth: "All is vain!" Shy distrust is regarded by it as base, and every one who wanteth oaths instead of looks and hands: also all over-distrustful wisdom,--for such is the mode of cowardly souls. Baser still it regardeth the obsequious, doggish one, who immediately lieth on his back, the submissive one; and there is also wisdom that is submissive, and doggish, and pious, and obsequious. Hateful to it altogether, and a loathing, is he who will never defend himself, he who swalloweth down poisonous spittle and bad looks, the all-too-patient one, the all-endurer, the all-satisfied one: for that is the mode of slaves. Whether they be servile before Gods and divine spurnings, or before men and stupid human opinions: at ALL kinds of slaves doth it spit, this blessed selfishness! Bad: thus doth it call all that is spirit-broken, and sordidly-servile-- constrained, blinking eyes, depressed hearts, and the false submissive style, which kisseth with broad cowardly lips. And spurious wisdom: so doth it call all the wit that slaves, and hoary-headed and weary ones affect; and especially all the cunning, spurious-witted, curious-witted foolishness of priests! The spurious wise, however, all the priests, the world-weary, and those whose souls are of feminine and servile nature--oh, how hath their game all along abused selfishness! And precisely THAT was to be virtue and was to be called virtue--to abuse selfishness! And "selfless"--so did they wish themselves with good reason, all those world-weary cowards and cross-spiders! But to all those cometh now the day, the change, the sword of judgment, THE GREAT NOONTIDE: then shall many things be revealed! And he who proclaimeth the EGO wholesome and holy, and selfishness blessed, verily, he, the prognosticator, speaketh also what he knoweth: "BEHOLD, IT COMETH, IT IS NIGH, THE GREAT NOONTIDE!"
Thus spake Zarathustra.

LV. THE SPIRIT OF GRAVITY.

1.

My mouthpiece--is of the people: too coarsely and cordially do I talk for Angora rabbits. And still stranger soundeth my word unto all ink-fish and pen-foxes. My hand--is a fool's hand: woe unto all tables and walls, and whatever hath room for fool's sketching, fool's scrawling! My foot--is a horse-foot; therewith do I trample and trot over stick and stone, in the fields up and down, and am bedevilled with delight in all fast racing. My stomach--is surely an eagle's stomach? For it preferreth lamb's flesh. Certainly it is a bird's stomach. Nourished with innocent things, and with few, ready and impatient to fly, to fly away--that is now my nature: why should there not be something of bird-nature therein! And especially that I am hostile to the spirit of gravity, that is bird-nature:--verily, deadly hostile, supremely hostile, originally hostile! Oh, whither hath my hostility not flown and misflown! Thereof could I sing a song--and WILL sing it: though I be alone in an empty house, and must sing it to mine own ears. Other singers are there, to be sure, to whom only the full house maketh the voice soft, the hand eloquent, the eye expressive, the heart wakeful:-- those do I not resemble.--

2.

He who one day teacheth men to fly will have shifted all landmarks; to him will all landmarks themselves fly into the air; the earth will he christen anew--as "the light body." The ostrich runneth faster than the fastest horse, but it also thrusteth its head heavily into the heavy earth: thus is it with the man who cannot yet fly. Heavy unto him are earth and life, and so WILLETH the spirit of gravity! But he who would become light, and be a bird, must love himself:--thus do _I_ teach. Not, to be sure, with the love of the sick and infected, for with them stinketh even self-love! One must learn to love oneself--thus do I teach--with a wholesome and healthy love: that one may endure to be with oneself, and not go roving about.
Such roving about christeneth itself "brotherly love"; with these words hath there hitherto been the best lying and dissembling, and especially by those who have been burdensome to every one. And verily, it is no commandment for to-day and to-morrow to LEARN to love oneself. Rather is it of all arts the finest, subtlest, last and patientest. For to its possessor is all possession well concealed, and of all treasure-pits one's own is last excavated--so causeth the spirit of gravity. Almost in the cradle are we apportioned with heavy words and worths: "good" and "evil"--so calleth itself this dowry. For the sake of it we are forgiven for living. And therefore suffereth one little children to come unto one, to forbid them betimes to love themselves--so causeth the spirit of gravity. And we--we bear loyally what is apportioned unto us, on hard shoulders, over rugged mountains! And when we sweat, then do people say to us: "Yea, life is hard to bear!" But man himself only is hard to bear! The reason thereof is that he carrieth too many extraneous things on his shoulders. Like the camel kneeleth he down, and letteth himself be well laden. Especially the strong load-bearing man in whom reverence resideth. Too many EXTRANEOUS heavy words and worths loadeth he upon himself--then seemeth life to him a desert! And verily! Many a thing also that is OUR OWN is hard to bear! And many internal things in man are like the oyster--repulsive and slippery and hard to grasp;-- So that an elegant shell, with elegant adornment, must plead for them. But this art also must one learn: to HAVE a shell, and a fine appearance, and sagacious blindness! Again, it deceiveth about many things in man, that many a shell is poor and pitiable, and too much of a shell. Much concealed goodness and power is never dreamt of; the choicest dainties find no tasters! Women know that, the choicest of them: a little fatter a little leaner-- oh, how much fate is in so little!
Man is difficult to discover, and unto himself most difficult of all; often lieth the spirit concerning the soul. So causeth the spirit of gravity. He, however, hath discovered himself who saith: This is MY good and evil: therewith hath he silenced the mole and the dwarf, who say: "Good for all, evil for all." Verily, neither do I like those who call everything good, and this world the best of all. Those do I call the all-satisfied. All-satisfiedness, which knoweth how to taste everything,--that is not the best taste! I honour the refractory, fastidious tongues and stomachs, which have learned to say "I" and "Yea" and "Nay." To chew and digest everything, however--that is the genuine swine-nature! Ever to say YE-A--that hath only the ass learnt, and those like it!-- Deep yellow and hot red--so wanteth MY taste--it mixeth blood with all colours. He, however, who whitewasheth his house, betrayeth unto me a whitewashed soul. With mummies, some fall in love; others with phantoms: both alike hostile to all flesh and blood--oh, how repugnant are both to my taste! For I love blood. And there will I not reside and abide where every one spitteth and speweth: that is now MY taste,--rather would I live amongst thieves and perjurers. Nobody carrieth gold in his mouth. Still more repugnant unto me, however, are all lickspittles; and the most repugnant animal of man that I found, did I christen "parasite": it would not love, and would yet live by love. Unhappy do I call all those who have only one choice: either to become evil beasts, or evil beast-tamers. Amongst such would I not build my tabernacle. Unhappy do I also call those who have ever to WAIT,--they are repugnant to my taste--all the toll-gatherers and traders, and kings, and other landkeepers and shopkeepers. Verily, I learned waiting also, and thoroughly so,--but only waiting for MYSELF. And above all did I learn standing and walking and running and leaping and climbing and dancing. 
This however is my teaching: he who wisheth one day to fly, must first learn standing and walking and running and climbing and dancing:--one doth not fly into flying! With rope-ladders learned I to reach many a window, with nimble legs did I climb high masts: to sit on high masts of perception seemed to me no small bliss;-- --To flicker like small flames on high masts: a small light, certainly, but a great comfort to cast-away sailors and ship-wrecked ones! By divers ways and wendings did I arrive at my truth; not by one ladder did I mount to the height where mine eye roveth into my remoteness. And unwillingly only did I ask my way--that was always counter to my taste! Rather did I question and test the ways themselves. A testing and a questioning hath been all my travelling:--and verily, one must also LEARN to answer such questioning! That, however,--is my taste: --Neither a good nor a bad taste, but MY taste, of which I have no longer either shame or secrecy. "This--is now MY way,--where is yours?" Thus did I answer those who asked me "the way." For THE way--it doth not exist!

Thus spake Zarathustra.

LVI. OLD AND NEW TABLES.

1.

Here do I sit and wait, old broken tables around me and also new half-written tables. When cometh mine hour? --The hour of my descent, of my down-going: for once more will I go unto men. For that hour do I now wait: for first must the signs come unto me that it is MINE hour--namely, the laughing lion with the flock of doves. Meanwhile do I talk to myself as one who hath time. No one telleth me anything new, so I tell myself mine own story.

2.

When I came unto men, then found I them resting on an old infatuation: all of them thought they had long known what was good and bad for men. An old wearisome business seemed to them all discourse about virtue; and he who wished to sleep well spake of "good" and "bad" ere retiring to rest.
This somnolence did I disturb when I taught that NO ONE YET KNOWETH what is good and bad:--unless it be the creating one! --It is he, however, who createth man's goal, and giveth to the earth its meaning and its future: he only EFFECTETH it THAT aught is good or bad. And I bade them upset their old academic chairs, and wherever that old infatuation had sat; I bade them laugh at their great moralists, their saints, their poets, and their Saviours. At their gloomy sages did I bid them laugh, and whoever had sat admonishing as a black scarecrow on the tree of life. On their great grave-highway did I seat myself, and even beside the carrion and vultures--and I laughed at all their bygone and its mellow decaying glory. Verily, like penitential preachers and fools did I cry wrath and shame on all their greatness and smallness. Oh, that their best is so very small! Oh, that their worst is so very small! Thus did I laugh. Thus did my wise longing, born in the mountains, cry and laugh in me; a wild wisdom, verily!--my great pinion-rustling longing. And oft did it carry me off and up and away and in the midst of laughter; then flew I quivering like an arrow with sun-intoxicated rapture: --Out into distant futures, which no dream hath yet seen, into warmer souths than ever sculptor conceived,--where gods in their dancing are ashamed of all clothes: (That I may speak in parables and halt and stammer like the poets: and verily I am ashamed that I have still to be a poet!) 
Where all becoming seemed to me dancing of Gods, and wantoning of Gods, and the world unloosed and unbridled and fleeing back to itself:-- --As an eternal self-fleeing and re-seeking of one another of many Gods, as the blessed self-contradicting, recommuning, and refraternising with one another of many Gods:-- Where all time seemed to me a blessed mockery of moments, where necessity was freedom itself, which played happily with the goad of freedom:-- Where I also found again mine old devil and arch-enemy, the spirit of gravity, and all that it created: constraint, law, necessity and consequence and purpose and will and good and evil:-- For must there not be that which is danced OVER, danced beyond? Must there not, for the sake of the nimble, the nimblest,--be moles and clumsy dwarfs?--

3.

There was it also where I picked up from the path the word "Superman," and that man is something that must be surpassed. --That man is a bridge and not a goal--rejoicing over his noontides and evenings, as advances to new rosy dawns: --The Zarathustra word of the great noontide, and whatever else I have hung up over men like purple evening-afterglows. Verily, also new stars did I make them see, along with new nights; and over cloud and day and night, did I spread out laughter like a gay-coloured canopy. I taught them all MY poetisation and aspiration: to compose and collect into unity what is fragment in man, and riddle and fearful chance;-- --As composer, riddle-reader, and redeemer of chance, did I teach them to create the future, and all that HATH BEEN--to redeem by creating. The past of man to redeem, and every "It was" to transform, until the Will saith: "But so did I will it! So shall I will it--" --This did I call redemption; this alone taught I them to call redemption.-- Now do I await MY redemption--that I may go unto them for the last time. For once more will I go unto men: AMONGST them will my sun set; in dying will I give them my choicest gift!
From the sun did I learn this, when it goeth down, the exuberant one: gold doth it then pour into the sea, out of inexhaustible riches,-- --So that the poorest fisherman roweth even with GOLDEN oars! For this did I once see, and did not tire of weeping in beholding it.-- Like the sun will also Zarathustra go down: now sitteth he here and waiteth, old broken tables around him, and also new tables--half-written.

4.

Behold, here is a new table; but where are my brethren who will carry it with me to the valley and into hearts of flesh?-- Thus demandeth my great love to the remotest ones: BE NOT CONSIDERATE OF THY NEIGHBOUR! Man is something that must be surpassed. There are many divers ways and modes of surpassing: see THOU thereto! But only a buffoon thinketh: "man can also be OVERLEAPT." Surpass thyself even in thy neighbour: and a right which thou canst seize upon, shalt thou not allow to be given thee! What thou doest can no one do to thee again. Lo, there is no requital. He who cannot command himself shall obey. And many a one CAN command himself, but still sorely lacketh self-obedience!

5.

Thus wisheth the type of noble souls: they desire to have nothing GRATUITOUSLY, least of all, life. He who is of the populace wisheth to live gratuitously; we others, however, to whom life hath given itself--we are ever considering WHAT we can best give IN RETURN! And verily, it is a noble dictum which saith: "What life promiseth US, that promise will WE keep--to life!" One should not wish to enjoy where one doth not contribute to the enjoyment. And one should not WISH to enjoy! For enjoyment and innocence are the most bashful things. Neither like to be sought for. One should HAVE them,--but one should rather SEEK for guilt and pain!--

6.

O my brethren, he who is a firstling is ever sacrificed. Now, however, are we firstlings! We all bleed on secret sacrificial altars, we all burn and broil in honour of ancient idols. Our best is still young: this exciteth old palates.
Our flesh is tender, our skin is only lambs' skin:--how could we not excite old idol-priests! IN OURSELVES dwelleth he still, the old idol-priest, who broileth our best for his banquet. Ah, my brethren, how could firstlings fail to be sacrifices! But so wisheth our type; and I love those who do not wish to preserve themselves, the down-going ones do I love with mine entire love: for they go beyond.--

7.

To be true--that CAN few be! And he who can, will not! Least of all, however, can the good be true. Oh, those good ones! GOOD MEN NEVER SPEAK THE TRUTH. For the spirit, thus to be good, is a malady. They yield, those good ones, they submit themselves; their heart repeateth, their soul obeyeth: HE, however, who obeyeth, DOTH NOT LISTEN TO HIMSELF! All that is called evil by the good, must come together in order that one truth may be born. O my brethren, are ye also evil enough for THIS truth? The daring venture, the prolonged distrust, the cruel Nay, the tedium, the cutting-into-the-quick--how seldom do THESE come together! Out of such seed, however--is truth produced! BESIDE the bad conscience hath hitherto grown all KNOWLEDGE! Break up, break up, ye discerning ones, the old tables!

8.

When the water hath planks, when gangways and railings o'erspan the stream, verily, he is not believed who then saith: "All is in flux." But even the simpletons contradict him. "What?" say the simpletons, "all in flux? Planks and railings are still OVER the stream! "OVER the stream all is stable, all the values of things, the bridges and bearings, all 'good' and 'evil': these are all STABLE!"-- Cometh, however, the hard winter, the stream-tamer, then learn even the wittiest distrust, and verily, not only the simpletons then say: "Should not everything--STAND STILL?" "Fundamentally standeth everything still"--that is an appropriate winter doctrine, good cheer for an unproductive period, a great comfort for winter-sleepers and fireside-loungers.
"Fundamentally standeth everything still"--: but CONTRARY thereto, preacheth the thawing wind! The thawing wind, a bullock, which is no ploughing bullock--a furious bullock, a destroyer, which with angry horns breaketh the ice! The ice however--BREAKETH GANGWAYS! O my brethren, is not everything AT PRESENT IN FLUX? Have not all railings and gangways fallen into the water? Who would still HOLD ON to "good" and "evil"? "Woe to us! Hail to us! The thawing wind bloweth!"--Thus preach, my brethren, through all the streets!

9.

There is an old illusion--it is called good and evil. Around soothsayers and astrologers hath hitherto revolved the orbit of this illusion. Once did one BELIEVE in soothsayers and astrologers; and THEREFORE did one believe, "Everything is fate: thou shalt, for thou must!" Then again did one distrust all soothsayers and astrologers; and THEREFORE did one believe, "Everything is freedom: thou canst, for thou willest!" O my brethren, concerning the stars and the future there hath hitherto been only illusion, and not knowledge; and THEREFORE concerning good and evil there hath hitherto been only illusion and not knowledge!

10.

"Thou shalt not rob! Thou shalt not slay!"--such precepts were once called holy; before them did one bow the knee and the head, and take off one's shoes. But I ask you: Where have there ever been better robbers and slayers in the world than such holy precepts? Is there not even in all life--robbing and slaying? And for such precepts to be called holy, was not TRUTH itself thereby--slain? --Or was it a sermon of death that called holy what contradicted and dissuaded from life?--O my brethren, break up, break up for me the old tables!

11.

It is my sympathy with all the past that I see it is abandoned,-- --Abandoned to the favour, the spirit and the madness of every generation that cometh, and reinterpreteth all that hath been as its bridge!
A great potentate might arise, an artful prodigy, who with approval and disapproval could strain and constrain all the past, until it became for him a bridge, a harbinger, a herald, and a cock-crowing. This however is the other danger, and mine other sympathy:--he who is of the populace, his thoughts go back to his grandfather,--with his grandfather, however, doth time cease. Thus is all the past abandoned: for it might some day happen for the populace to become master, and drown all time in shallow waters. Therefore, O my brethren, a NEW NOBILITY is needed, which shall be the adversary of all populace and potentate rule, and shall inscribe anew the word "noble" on new tables. For many noble ones are needed, and many kinds of noble ones, FOR A NEW NOBILITY! Or, as I once said in parable: "That is just divinity, that there are Gods, but no God!"

12.

O my brethren, I consecrate you and point you to a new nobility: ye shall become procreators and cultivators and sowers of the future;-- --Verily, not to a nobility which ye could purchase like traders with traders' gold; for little worth is all that hath its price. Let it not be your honour henceforth whence ye come, but whither ye go! Your Will and your feet which seek to surpass you--let these be your new honour! Verily, not that ye have served a prince--of what account are princes now! --nor that ye have become a bulwark to that which standeth, that it may stand more firmly. Not that your family have become courtly at courts, and that ye have learned--gay-coloured, like the flamingo--to stand long hours in shallow pools: (For ABILITY-to-stand is a merit in courtiers; and all courtiers believe that unto blessedness after death pertaineth--PERMISSION-to-sit!)
Nor even that a Spirit called Holy, led your forefathers into promised lands, which I do not praise: for where the worst of all trees grew--the cross,--in that land there is nothing to praise!-- --And verily, wherever this "Holy Spirit" led its knights, always in such campaigns did--goats and geese, and wryheads and guyheads run FOREMOST!-- O my brethren, not backward shall your nobility gaze, but OUTWARD! Exiles shall ye be from all fatherlands and forefather-lands! Your CHILDREN'S LAND shall ye love: let this love be your new nobility,-- the undiscovered in the remotest seas! For it do I bid your sails search and search! Unto your children shall ye MAKE AMENDS for being the children of your fathers: all the past shall ye THUS redeem! This new table do I place over you!

13.

"Why should one live? All is vain! To live--that is to thrash straw; to live--that is to burn oneself and yet not get warm."-- Such ancient babbling still passeth for "wisdom"; because it is old, however, and smelleth mustily, THEREFORE is it the more honoured. Even mould ennobleth.-- Children might thus speak: they SHUN the fire because it hath burnt them! There is much childishness in the old books of wisdom. And he who ever "thrasheth straw," why should he be allowed to rail at thrashing! Such a fool one would have to muzzle! Such persons sit down to the table and bring nothing with them, not even good hunger:--and then do they rail: "All is vain!" But to eat and drink well, my brethren, is verily no vain art! Break up, break up for me the tables of the never-joyous ones!

14.

"To the clean are all things clean"--thus say the people. I, however, say unto you: To the swine all things become swinish! Therefore preach the visionaries and bowed-heads (whose hearts are also bowed down): "The world itself is a filthy monster." For these are all unclean spirits; especially those, however, who have no peace or rest, unless they see the world FROM THE BACKSIDE--the backworldsmen!
TO THOSE do I say it to the face, although it sound unpleasantly: the world resembleth man, in that it hath a backside,--SO MUCH is true! There is in the world much filth: SO MUCH is true! But the world itself is not therefore a filthy monster! There is wisdom in the fact that much in the world smelleth badly: loathing itself createth wings, and fountain-divining powers! In the best there is still something to loathe; and the best is still something that must be surpassed!-- O my brethren, there is much wisdom in the fact that much filth is in the world!--

15.

Such sayings did I hear pious backworldsmen speak to their consciences, and verily without wickedness or guile,--although there is nothing more guileful in the world, or more wicked. "Let the world be as it is! Raise not a finger against it!" "Let whoever will choke and stab and skin and scrape the people: raise not a finger against it! Thereby will they learn to renounce the world." "And thine own reason--this shalt thou thyself stifle and choke; for it is a reason of this world,--thereby wilt thou learn thyself to renounce the world."-- --Shatter, shatter, O my brethren, those old tables of the pious! Tatter the maxims of the world-maligners!--

16.

"He who learneth much unlearneth all violent cravings"--that do people now whisper to one another in all the dark lanes. "Wisdom wearieth, nothing is worth while; thou shalt not crave!"--this new table found I hanging even in the public markets. Break up for me, O my brethren, break up also that NEW table! The weary-o'-the-world put it up, and the preachers of death and the jailer: for lo, it is also a sermon for slavery:-- Because they learned badly and not the best, and everything too early and everything too fast; because they ATE badly: from thence hath resulted their ruined stomach;-- --For a ruined stomach, is their spirit: IT persuadeth to death! For verily, my brethren, the spirit IS a stomach!
Life is a well of delight, but to him in whom the ruined stomach speaketh, the father of affliction, all fountains are poisoned. To discern: that is DELIGHT to the lion-willed! But he who hath become weary, is himself merely "willed"; with him play all the waves. And such is always the nature of weak men: they lose themselves on their way. And at last asketh their weariness: "Why did we ever go on the way? All is indifferent!" TO THEM soundeth it pleasant to have preached in their ears: "Nothing is worth while! Ye shall not will!" That, however, is a sermon for slavery. O my brethren, a fresh blustering wind cometh Zarathustra unto all way-weary ones; many noses will he yet make sneeze! Even through walls bloweth my free breath, and in into prisons and imprisoned spirits! Willing emancipateth: for willing is creating: so do I teach. And ONLY for creating shall ye learn! And also the learning shall ye LEARN only from me, the learning well!--He who hath ears let him hear!

17.

There standeth the boat--thither goeth it over, perhaps into vast nothingness--but who willeth to enter into this "Perhaps"? None of you want to enter into the death-boat! How should ye then be WORLD-WEARY ones! World-weary ones! And have not even withdrawn from the earth! Eager did I ever find you for the earth, amorous still of your own earth-weariness! Not in vain doth your lip hang down:--a small worldly wish still sitteth thereon! And in your eye--floateth there not a cloudlet of unforgotten earthly bliss? There are on the earth many good inventions, some useful, some pleasant: for their sake is the earth to be loved. And many such good inventions are there, that they are like woman's breasts: useful at the same time, and pleasant. Ye world-weary ones, however! Ye earth-idlers! You, shall one beat with stripes! With stripes shall one again make you sprightly limbs.
For if ye be not invalids, or decrepit creatures, of whom the earth is weary, then are ye sly sloths, or dainty, sneaking pleasure-cats. And if ye will not again RUN gaily, then shall ye--pass away! To the incurable shall one not seek to be a physician: thus teacheth Zarathustra:--so shall ye pass away! But more COURAGE is needed to make an end than to make a new verse: that do all physicians and poets know well.-- 18. O my brethren, there are tables which weariness framed, and tables which slothfulness framed, corrupt slothfulness: although they speak similarly, they want to be heard differently.-- See this languishing one! Only a span-breadth is he from his goal; but from weariness hath he lain down obstinately in the dust, this brave one! From weariness yawneth he at the path, at the earth, at the goal, and at himself: not a step further will he go,--this brave one! Now gloweth the sun upon him, and the dogs lick at his sweat: but he lieth there in his obstinacy and preferreth to languish:-- --A span-breadth from his goal, to languish! Verily, ye will have to drag him into his heaven by the hair of his head--this hero! Better still that ye let him lie where he hath lain down, that sleep may come unto him, the comforter, with cooling patter-rain. Let him lie, until of his own accord he awakeneth,--until of his own accord he repudiateth all weariness, and what weariness hath taught through him! Only, my brethren, see that ye scare the dogs away from him, the idle skulkers, and all the swarming vermin:-- --All the swarming vermin of the "cultured," that--feast on the sweat of every hero!-- 19. I form circles around me and holy boundaries; ever fewer ascend with me ever higher mountains: I build a mountain-range out of ever holier mountains.-- But wherever ye would ascend with me, O my brethren, take care lest a PARASITE ascend with you! A parasite: that is a reptile, a creeping, cringing reptile, that trieth to fatten on your infirm and sore places. 
And THIS is its art: it divineth where ascending souls are weary, in your trouble and dejection, in your sensitive modesty, doth it build its loathsome nest. Where the strong are weak, where the noble are all-too-gentle--there buildeth it its loathsome nest; the parasite liveth where the great have small sore-places. What is the highest of all species of being, and what is the lowest? The parasite is the lowest species; he, however, who is of the highest species feedeth most parasites. For the soul which hath the longest ladder, and can go deepest down: how could there fail to be most parasites upon it?-- --The most comprehensive soul, which can run and stray and rove furthest in itself; the most necessary soul, which out of joy flingeth itself into chance:-- --The soul in Being, which plungeth into Becoming; the possessing soul, which SEEKETH to attain desire and longing:-- --The soul fleeing from itself, which overtaketh itself in the widest circuit; the wisest soul, unto which folly speaketh most sweetly:-- --The soul most self-loving, in which all things have their current and counter-current, their ebb and their flow:--oh, how could THE LOFTIEST SOUL fail to have the worst parasites? 20. O my brethren, am I then cruel? But I say: What falleth, that shall one also push! Everything of to-day--it falleth, it decayeth; who would preserve it! But I--I wish also to push it! Know ye the delight which rolleth stones into precipitous depths?--Those men of to-day, see just how they roll into my depths! A prelude am I to better players, O my brethren! An example! DO according to mine example! And him whom ye do not teach to fly, teach I pray you--TO FALL FASTER!-- 21. I love the brave: but it is not enough to be a swordsman,--one must also know WHEREON to use swordsmanship! And often is it greater bravery to keep quiet and pass by, that THEREBY one may reserve oneself for a worthier foe! 
Ye shall only have foes to be hated; but not foes to be despised: ye must be proud of your foes. Thus have I already taught. For the worthier foe, O my brethren, shall ye reserve yourselves: therefore must ye pass by many a one,-- --Especially many of the rabble, who din your ears with noise about people and peoples. Keep your eye clear of their For and Against! There is there much right, much wrong: he who looketh on becometh wroth. Therein viewing, therein hewing--they are the same thing: therefore depart into the forests and lay your sword to sleep! Go YOUR ways! and let the people and peoples go theirs!--gloomy ways, verily, on which not a single hope glinteth any more! Let there the trader rule, where all that still glittereth is--traders' gold. It is the time of kings no longer: that which now calleth itself the people is unworthy of kings. See how these peoples themselves now do just like the traders: they pick up the smallest advantage out of all kinds of rubbish! They lay lures for one another, they lure things out of one another,--that they call "good neighbourliness." O blessed remote period when a people said to itself: "I will be--MASTER over peoples!" For, my brethren, the best shall rule, the best also WILLETH to rule! And where the teaching is different, there--the best is LACKING. 22. If THEY had--bread for nothing, alas! for what would THEY cry! Their maintainment--that is their true entertainment; and they shall have it hard! Beasts of prey, are they: in their "working"--there is even plundering, in their "earning"--there is even overreaching! Therefore shall they have it hard! Better beasts of prey shall they thus become, subtler, cleverer, MORE MAN-LIKE: for man is the best beast of prey. All the animals hath man already robbed of their virtues: that is why of all animals it hath been hardest for man. Only the birds are still beyond him. And if man should yet learn to fly, alas! TO WHAT HEIGHT--would his rapacity fly! 23. 
Thus would I have man and woman: fit for war, the one; fit for maternity, the other; both, however, fit for dancing with head and legs. And lost be the day to us in which a measure hath not been danced. And false be every truth which hath not had laughter along with it! 24. Your marriage-arranging: see that it be not a bad ARRANGING! Ye have arranged too hastily: so there FOLLOWETH therefrom--marriage-breaking! And better marriage-breaking than marriage-bending, marriage-lying!--Thus spake a woman unto me: "Indeed, I broke the marriage, but first did the marriage break--me!" The badly paired found I ever the most revengeful: they make every one suffer for it that they no longer run singly. On that account want I the honest ones to say to one another: "We love each other: let us SEE TO IT that we maintain our love! Or shall our pledging be blundering?" --"Give us a set term and a small marriage, that we may see if we are fit for the great marriage! It is a great matter always to be twain." Thus do I counsel all honest ones; and what would be my love to the Superman, and to all that is to come, if I should counsel and speak otherwise! Not only to propagate yourselves onwards but UPWARDS--thereto, O my brethren, may the garden of marriage help you! 25. He who hath grown wise concerning old origins, lo, he will at last seek after the fountains of the future and new origins.-- O my brethren, not long will it be until NEW PEOPLES shall arise and new fountains shall rush down into new depths. For the earthquake--it choketh up many wells, it causeth much languishing: but it bringeth also to light inner powers and secrets. The earthquake discloseth new fountains. In the earthquake of old peoples new fountains burst forth. And whoever calleth out: "Lo, here is a well for many thirsty ones, one heart for many longing ones, one will for many instruments":--around him collecteth a PEOPLE, that is to say, many attempting ones. 
Who can command, who must obey--THAT IS THERE ATTEMPTED! Ah, with what long seeking and solving and failing and learning and re-attempting! Human society: it is an attempt--so I teach--a long seeking: it seeketh however the ruler!-- --An attempt, my brethren! And NO "contract"! Destroy, I pray you, destroy that word of the soft-hearted and half-and-half! 26. O my brethren! With whom lieth the greatest danger to the whole human future? Is it not with the good and just?-- --As those who say and feel in their hearts: "We already know what is good and just, we possess it also; woe to those who still seek thereafter! And whatever harm the wicked may do, the harm of the good is the harmfulest harm! And whatever harm the world-maligners may do, the harm of the good is the harmfulest harm! O my brethren, into the hearts of the good and just looked some one once on a time, who said: "They are the Pharisees." But people did not understand him. The good and just themselves were not free to understand him; their spirit was imprisoned in their good conscience. The stupidity of the good is unfathomably wise. It is the truth, however, that the good MUST be Pharisees--they have no choice! The good MUST crucify him who deviseth his own virtue! That IS the truth! The second one, however, who discovered their country--the country, heart and soil of the good and just,--it was he who asked: "Whom do they hate most?" The CREATOR, hate they most, him who breaketh the tables and old values, the breaker,--him they call the law-breaker. For the good--they CANNOT create; they are always the beginning of the end:-- --They crucify him who writeth new values on new tables, they sacrifice UNTO THEMSELVES the future--they crucify the whole human future! The good--they have always been the beginning of the end.-- 27. O my brethren, have ye also understood this word? And what I once said of the "last man"?-- With whom lieth the greatest danger to the whole human future? 
Is it not with the good and just? BREAK UP, BREAK UP, I PRAY YOU, THE GOOD AND JUST!--O my brethren, have ye understood also this word? 28. Ye flee from me? Ye are frightened? Ye tremble at this word? O my brethren, when I enjoined you to break up the good, and the tables of the good, then only did I embark man on his high seas. And now only cometh unto him the great terror, the great outlook, the great sickness, the great nausea, the great sea-sickness. False shores and false securities did the good teach you; in the lies of the good were ye born and bred. Everything hath been radically contorted and distorted by the good. But he who discovered the country of "man," discovered also the country of "man's future." Now shall ye be sailors for me, brave, patient! Keep yourselves up betimes, my brethren, learn to keep yourselves up! The sea stormeth: many seek to raise themselves again by you. The sea stormeth: all is in the sea. Well! Cheer up! Ye old seaman-hearts! What of fatherland! THITHER striveth our helm where our CHILDREN'S LAND is! Thitherwards, stormier than the sea, stormeth our great longing!-- 29. "Why so hard!"--said to the diamond one day the charcoal; "are we then not near relatives?"-- Why so soft? O my brethren; thus do _I_ ask you: are ye then not--my brethren? Why so soft, so submissive and yielding? Why is there so much negation and abnegation in your hearts? Why is there so little fate in your looks? And if ye will not be fates and inexorable ones, how can ye one day--conquer with me? And if your hardness will not glance and cut and chip to pieces, how can ye one day--create with me? For the creators are hard. And blessedness must it seem to you to press your hand upon millenniums as upon wax,-- --Blessedness to write upon the will of millenniums as upon brass,--harder than brass, nobler than brass. Entirely hard is only the noblest. This new table, O my brethren, put I up over you: BECOME HARD!-- 30. O thou, my Will! 
Thou change of every need, MY needfulness! Preserve me from all small victories! Thou fatedness of my soul, which I call fate! Thou In-me! Over-me! Preserve and spare me for one great fate! And thy last greatness, my Will, spare it for thy last--that thou mayest be inexorable IN thy victory! Ah, who hath not succumbed to his victory! Ah, whose eye hath not bedimmed in this intoxicated twilight! Ah, whose foot hath not faltered and forgotten in victory--how to stand!-- --That I may one day be ready and ripe in the great noontide: ready and ripe like the glowing ore, the lightning-bearing cloud, and the swelling milk-udder:-- --Ready for myself and for my most hidden Will: a bow eager for its arrow, an arrow eager for its star:-- --A star, ready and ripe in its noontide, glowing, pierced, blessed, by annihilating sun-arrows:-- --A sun itself, and an inexorable sun-will, ready for annihilation in victory! O Will, thou change of every need, MY needfulness! Spare me for one great victory!--- Thus spake Zarathustra. LVII. THE CONVALESCENT. 1. One morning, not long after his return to his cave, Zarathustra sprang up from his couch like a madman, crying with a frightful voice, and acting as if some one still lay on the couch who did not wish to rise. Zarathustra's voice also resounded in such a manner that his animals came to him frightened, and out of all the neighbouring caves and lurking-places all the creatures slipped away--flying, fluttering, creeping or leaping, according to their variety of foot or wing. Zarathustra, however, spake these words: Up, abysmal thought out of my depth! I am thy cock and morning dawn, thou overslept reptile: Up! Up! My voice shall soon crow thee awake! Unbind the fetters of thine ears: listen! For I wish to hear thee! Up! Up! There is thunder enough to make the very graves listen! And rub the sleep and all the dimness and blindness out of thine eyes! Hear me also with thine eyes: my voice is a medicine even for those born blind. 
And once thou art awake, then shalt thou ever remain awake. It is not MY custom to awake great-grandmothers out of their sleep that I may bid them--sleep on! Thou stirrest, stretchest thyself, wheezest? Up! Up! Not wheeze, shalt thou,--but speak unto me! Zarathustra calleth thee, Zarathustra the godless! I, Zarathustra, the advocate of living, the advocate of suffering, the advocate of the circuit--thee do I call, my most abysmal thought! Joy to me! Thou comest,--I hear thee! Mine abyss SPEAKETH, my lowest depth have I turned over into the light! Joy to me! Come hither! Give me thy hand--ha! let be! aha!--Disgust, disgust, disgust--alas to me! 2. Hardly, however, had Zarathustra spoken these words, when he fell down as one dead, and remained long as one dead. When however he again came to himself, then was he pale and trembling, and remained lying; and for long he would neither eat nor drink. This condition continued for seven days; his animals, however, did not leave him day nor night, except that the eagle flew forth to fetch food. And what it fetched and foraged, it laid on Zarathustra's couch: so that Zarathustra at last lay among yellow and red berries, grapes, rosy apples, sweet-smelling herbage, and pine-cones. At his feet, however, two lambs were stretched, which the eagle had with difficulty carried off from their shepherds. At last, after seven days, Zarathustra raised himself upon his couch, took a rosy apple in his hand, smelt it and found its smell pleasant. Then did his animals think the time had come to speak unto him. "O Zarathustra," said they, "now hast thou lain thus for seven days with heavy eyes: wilt thou not set thyself again upon thy feet? Step out of thy cave: the world waiteth for thee as a garden. The wind playeth with heavy fragrance which seeketh for thee; and all brooks would like to run after thee. All things long for thee, since thou hast remained alone for seven days-- step forth out of thy cave! 
All things want to be thy physicians! Did perhaps a new knowledge come to thee, a bitter, grievous knowledge? Like leavened dough layest thou, thy soul arose and swelled beyond all its bounds.--" --O mine animals, answered Zarathustra, talk on thus and let me listen! It refresheth me so to hear your talk: where there is talk, there is the world as a garden unto me. How charming it is that there are words and tones; are not words and tones rainbows and seeming bridges 'twixt the eternally separated? To each soul belongeth another world; to each soul is every other soul a back-world. Among the most alike doth semblance deceive most delightfully: for the smallest gap is most difficult to bridge over. For me--how could there be an outside-of-me? There is no outside! But this we forget on hearing tones; how delightful it is that we forget! Have not names and tones been given unto things that man may refresh himself with them? It is a beautiful folly, speaking; therewith danceth man over everything. How lovely is all speech and all falsehoods of tones! With tones danceth our love on variegated rainbows.-- --"O Zarathustra," said then his animals, "to those who think like us, things all dance themselves: they come and hold out the hand and laugh and flee--and return. Everything goeth, everything returneth; eternally rolleth the wheel of existence. Everything dieth, everything blossometh forth again; eternally runneth on the year of existence. Everything breaketh, everything is integrated anew; eternally buildeth itself the same house of existence. All things separate, all things again greet one another; eternally true to itself remaineth the ring of existence. Every moment beginneth existence, around every 'Here' rolleth the ball 'There.' The middle is everywhere. Crooked is the path of eternity."-- --O ye wags and barrel-organs! 
answered Zarathustra, and smiled once more, how well do ye know what had to be fulfilled in seven days:-- --And how that monster crept into my throat and choked me! But I bit off its head and spat it away from me. And ye--ye have made a lyre-lay out of it? Now, however, do I lie here, still exhausted with that biting and spitting-away, still sick with mine own salvation. AND YE LOOKED ON AT IT ALL? O mine animals, are ye also cruel? Did ye like to look at my great pain as men do? For man is the cruellest animal. At tragedies, bull-fights, and crucifixions hath he hitherto been happiest on earth; and when he invented his hell, behold, that was his heaven on earth. When the great man crieth--: immediately runneth the little man thither, and his tongue hangeth out of his mouth for very lusting. He, however, calleth it his "pity." The little man, especially the poet--how passionately doth he accuse life in words! Hearken to him, but do not fail to hear the delight which is in all accusation! Such accusers of life--them life overcometh with a glance of the eye. "Thou lovest me?" saith the insolent one; "wait a little, as yet have I no time for thee." Towards himself man is the cruellest animal; and in all who call themselves "sinners" and "bearers of the cross" and "penitents," do not overlook the voluptuousness in their plaints and accusations! And I myself--do I thereby want to be man's accuser? Ah, mine animals, this only have I learned hitherto, that for man his baddest is necessary for his best,-- --That all that is baddest is the best POWER, and the hardest stone for the highest creator; and that man must become better AND badder:-- Not to THIS torture-stake was I tied, that I know man is bad,--but I cried, as no one hath yet cried: "Ah, that his baddest is so very small! Ah, that his best is so very small!" 
The great disgust at man--IT strangled me and had crept into my throat: and what the soothsayer had presaged: "All is alike, nothing is worth while, knowledge strangleth." A long twilight limped on before me, a fatally weary, fatally intoxicated sadness, which spake with yawning mouth. "Eternally he returneth, the man of whom thou art weary, the small man"--so yawned my sadness, and dragged its foot and could not go to sleep. A cavern, became the human earth to me; its breast caved in; everything living became to me human dust and bones and mouldering past. My sighing sat on all human graves, and could no longer arise: my sighing and questioning croaked and choked, and gnawed and nagged day and night: --"Ah, man returneth eternally! The small man returneth eternally!" Naked had I once seen both of them, the greatest man and the smallest man: all too like one another--all too human, even the greatest man! All too small, even the greatest man!--that was my disgust at man! And the eternal return also of the smallest man!--that was my disgust at all existence! Ah, Disgust! Disgust! Disgust!--Thus spake Zarathustra, and sighed and shuddered; for he remembered his sickness. Then did his animals prevent him from speaking further. "Do not speak further, thou convalescent!"--so answered his animals, "but go out where the world waiteth for thee like a garden. Go out unto the roses, the bees, and the flocks of doves! Especially, however, unto the singing-birds, to learn SINGING from them! For singing is for the convalescent; the sound ones may talk. And when the sound also want songs, then want they other songs than the convalescent." --"O ye wags and barrel-organs, do be silent!" answered Zarathustra, and smiled at his animals. "How well ye know what consolation I devised for myself in seven days! That I have to sing once more--THAT consolation did I devise for myself, and THIS convalescence: would ye also make another lyre-lay thereof?" 
--"Do not talk further," answered his animals once more; "rather, thou convalescent, prepare for thyself first a lyre, a new lyre! For behold, O Zarathustra! For thy new lays there are needed new lyres. Sing and bubble over, O Zarathustra, heal thy soul with new lays: that thou mayest bear thy great fate, which hath not yet been any one's fate! For thine animals know it well, O Zarathustra, who thou art and must become: behold, THOU ART THE TEACHER OF THE ETERNAL RETURN,--that is now THY fate! That thou must be the first to teach this teaching--how could this great fate not be thy greatest danger and infirmity! Behold, we know what thou teachest: that all things eternally return, and ourselves with them, and that we have already existed times without number, and all things with us. Thou teachest that there is a great year of Becoming, a prodigy of a great year; it must, like a sand-glass, ever turn up anew, that it may anew run down and run out:-- --So that all those years are like one another in the greatest and also in the smallest, so that we ourselves, in every great year, are like ourselves in the greatest and also in the smallest. And if thou wouldst now die, O Zarathustra, behold, we know also how thou wouldst then speak to thyself:--but thine animals beseech thee not to die yet! Thou wouldst speak, and without trembling, buoyant rather with bliss, for a great weight and worry would be taken from thee, thou patientest one!-- 'Now do I die and disappear,' wouldst thou say, 'and in a moment I am nothing. Souls are as mortal as bodies. But the plexus of causes returneth in which I am intertwined,--it will again create me! I myself pertain to the causes of the eternal return. 
I come again with this sun, with this earth, with this eagle, with this serpent--NOT to a new life, or a better life, or a similar life: --I come again eternally to this identical and selfsame life, in its greatest and its smallest, to teach again the eternal return of all things,-- --To speak again the word of the great noontide of earth and man, to announce again to man the Superman. I have spoken my word. I break down by my word: so willeth mine eternal fate--as announcer do I succumb! The hour hath now come for the down-goer to bless himself. Thus--ENDETH Zarathustra's down-going.'"-- When the animals had spoken these words they were silent and waited, so that Zarathustra might say something to them: but Zarathustra did not hear that they were silent. On the contrary, he lay quietly with closed eyes like a person sleeping, although he did not sleep; for he communed just then with his soul. The serpent, however, and the eagle, when they found him silent in such wise, respected the great stillness around him, and prudently retired. LVIII. THE GREAT LONGING. O my soul, I have taught thee to say "to-day" as "once on a time" and "formerly," and to dance thy measure over every Here and There and Yonder. O my soul, I delivered thee from all by-places, I brushed down from thee dust and spiders and twilight. O my soul, I washed the petty shame and the by-place virtue from thee, and persuaded thee to stand naked before the eyes of the sun. With the storm that is called "spirit" did I blow over thy surging sea; all clouds did I blow away from it; I strangled even the strangler called "sin." O my soul, I gave thee the right to say Nay like the storm, and to say Yea as the open heaven saith Yea: calm as the light remainest thou, and now walkest through denying storms. O my soul, I restored to thee liberty over the created and the uncreated; and who knoweth, as thou knowest, the voluptuousness of the future? 
O my soul, I taught thee the contempt which doth not come like worm-eating, the great, the loving contempt, which loveth most where it contemneth most. O my soul, I taught thee so to persuade that thou persuadest even the grounds themselves to thee: like the sun, which persuadeth even the sea to its height. O my soul, I have taken from thee all obeying and knee-bending and homage-paying; I have myself given thee the names, "Change of need" and "Fate." O my soul, I have given thee new names and gay-coloured playthings, I have called thee "Fate" and "the Circuit of circuits" and "the Navel-string of time" and "the Azure bell." O my soul, to thy domain gave I all wisdom to drink, all new wines, and also all immemorially old strong wines of wisdom. O my soul, every sun shed I upon thee, and every night and every silence and every longing:--then grewest thou up for me as a vine. O my soul, exuberant and heavy dost thou now stand forth, a vine with swelling udders and full clusters of brown golden grapes:-- --Filled and weighted by thy happiness, waiting from superabundance, and yet ashamed of thy waiting. O my soul, there is nowhere a soul which could be more loving and more comprehensive and more extensive! Where could future and past be closer together than with thee? O my soul, I have given thee everything, and all my hands have become empty by thee:--and now! Now sayest thou to me, smiling and full of melancholy: "Which of us oweth thanks?-- --Doth the giver not owe thanks because the receiver received? Is bestowing not a necessity? Is receiving not--pitying?"-- O my soul, I understand the smiling of thy melancholy: thine over-abundance itself now stretcheth out longing hands! Thy fulness looketh forth over raging seas, and seeketh and waiteth: the longing of over-fulness looketh forth from the smiling heaven of thine eyes! And verily, O my soul! Who could see thy smiling and not melt into tears? 
The angels themselves melt into tears through the over-graciousness of thy smiling. Thy graciousness and over-graciousness, is it which will not complain and weep: and yet, O my soul, longeth thy smiling for tears, and thy trembling mouth for sobs. "Is not all weeping complaining? And all complaining, accusing?" Thus speakest thou to thyself; and therefore, O my soul, wilt thou rather smile than pour forth thy grief-- --Than in gushing tears pour forth all thy grief concerning thy fulness, and concerning the craving of the vine for the vintager and vintage-knife! But wilt thou not weep, wilt thou not weep forth thy purple melancholy, then wilt thou have to SING, O my soul!--Behold, I smile myself, who foretell thee this: --Thou wilt have to sing with passionate song, until all seas turn calm to hearken unto thy longing,-- --Until over calm longing seas the bark glideth, the golden marvel, around the gold of which all good, bad, and marvellous things frisk:-- --Also many large and small animals, and everything that hath light marvellous feet, so that it can run on violet-blue paths,-- --Towards the golden marvel, the spontaneous bark, and its master: he, however, is the vintager who waiteth with the diamond vintage-knife,-- --Thy great deliverer, O my soul, the nameless one--for whom future songs only will find names! And verily, already hath thy breath the fragrance of future songs,-- --Already glowest thou and dreamest, already drinkest thou thirstily at all deep echoing wells of consolation, already reposeth thy melancholy in the bliss of future songs!-- O my soul, now have I given thee all, and even my last possession, and all my hands have become empty by thee:--THAT I BADE THEE SING, behold, that was my last thing to give! That I bade thee sing,--say now, say: WHICH of us now--oweth thanks?-- Better still, however: sing unto me, sing, O my soul! And let me thank thee!-- Thus spake Zarathustra. LIX. THE SECOND DANCE-SONG. 1. 
"Into thine eyes gazed I lately, O Life: gold saw I gleam in thy night-eyes,--my heart stood still with delight: --A golden bark saw I gleam on darkened waters, a sinking, drinking, reblinking, golden swing-bark! At my dance-frantic foot, dost thou cast a glance, a laughing, questioning, melting, thrown glance: Twice only movedst thou thy rattle with thy little hands--then did my feet swing with dance-fury.-- My heels reared aloft, my toes they hearkened,--thee they would know: hath not the dancer his ear--in his toe! Unto thee did I spring: then fledst thou back from my bound; and towards me waved thy fleeing, flying tresses round! Away from thee did I spring, and from thy snaky tresses: then stoodst thou there half-turned, and in thine eye caresses. With crooked glances--dost thou teach me crooked courses; on crooked courses learn my feet--crafty fancies! I fear thee near, I love thee far; thy flight allureth me, thy seeking secureth me:--I suffer, but for thee, what would I not gladly bear! For thee, whose coldness inflameth, whose hatred misleadeth, whose flight enchaineth, whose mockery--pleadeth: --Who would not hate thee, thou great bindress, inwindress, temptress, seekress, findress! Who would not love thee, thou innocent, impatient, wind-swift, child-eyed sinner! Whither pullest thou me now, thou paragon and tomboy? And now foolest thou me fleeing; thou sweet romp dost annoy! I dance after thee, I follow even faint traces lonely. Where art thou? Give me thy hand! Or thy finger only! Here are caves and thickets: we shall go astray!--Halt! Stand still! Seest thou not owls and bats in fluttering fray? Thou bat! Thou owl! Thou wouldst play me foul? Where are we? From the dogs hast thou learned thus to bark and howl. Thou gnashest on me sweetly with little white teeth; thine evil eyes shoot out upon me, thy curly little mane from underneath! This is a dance over stock and stone: I am the hunter,--wilt thou be my hound, or my chamois anon? Now beside me! 
And quickly, wickedly springing! Now up! And over!--Alas! I have fallen myself overswinging! Oh, see me lying, thou arrogant one, and imploring grace! Gladly would I walk with thee--in some lovelier place! --In the paths of love, through bushes variegated, quiet, trim! Or there along the lake, where gold-fishes dance and swim! Thou art now a-weary? There above are sheep and sun-set stripes: is it not sweet to sleep--the shepherd pipes? Thou art so very weary? I carry thee thither; let just thine arm sink! And art thou thirsty--I should have something; but thy mouth would not like it to drink!-- --Oh, that cursed, nimble, supple serpent and lurking-witch! Where art thou gone? But in my face do I feel through thy hand, two spots and red blotches itch! I am verily weary of it, ever thy sheepish shepherd to be. Thou witch, if I have hitherto sung unto thee, now shalt THOU--cry unto me! To the rhythm of my whip shalt thou dance and cry! I forget not my whip?--Not I!"-- 2. Then did Life answer me thus, and kept thereby her fine ears closed: "O Zarathustra! Crack not so terribly with thy whip! Thou knowest surely that noise killeth thought,--and just now there came to me such delicate thoughts. We are both of us genuine ne'er-do-wells and ne'er-do-ills. Beyond good and evil found we our island and our green meadow--we two alone! Therefore must we be friendly to each other! And even should we not love each other from the bottom of our hearts,--must we then have a grudge against each other if we do not love each other perfectly? And that I am friendly to thee, and often too friendly, that knowest thou: and the reason is that I am envious of thy Wisdom. Ah, this mad old fool, Wisdom! If thy Wisdom should one day run away from thee, ah! then would also my love run away from thee quickly."-- Thereupon did Life look thoughtfully behind and around, and said softly: "O Zarathustra, thou art not faithful enough to me! 
Thou lovest me not nearly so much as thou sayest; I know thou thinkest of soon leaving me. There is an old heavy, heavy, booming-clock: it boometh by night up to thy cave:--

--When thou hearest this clock strike the hours at midnight, then thinkest thou between one and twelve thereon--

--Thou thinkest thereon, O Zarathustra, I know it--of soon leaving me!"--

"Yea," answered I, hesitatingly, "but thou knowest it also"--And I said something into her ear, in amongst her confused, yellow, foolish tresses.

"Thou KNOWEST that, O Zarathustra? That knoweth no one--"

And we gazed at each other, and looked at the green meadow o'er which the cool evening was just passing, and we wept together.--Then, however, was Life dearer unto me than all my Wisdom had ever been.--

Thus spake Zarathustra.

3.

One!
O man! Take heed!
Two!
What saith deep midnight's voice indeed?
Three!
"I slept my sleep--
Four!
"From deepest dream I've woke and plead:--
Five!
"The world is deep,
Six!
"And deeper than the day could read.
Seven!
"Deep is its woe--
Eight!
"Joy--deeper still than grief can be:
Nine!
"Woe saith: Hence! Go!
Ten!
"But joys all want eternity--
Eleven!
"Want deep profound eternity!"
Twelve!

LX. THE SEVEN SEALS.

(OR THE YEA AND AMEN LAY.)

1.

If I be a diviner and full of the divining spirit which wandereth on high mountain-ridges, 'twixt two seas,--

Wandereth 'twixt the past and the future as a heavy cloud--hostile to sultry plains, and to all that is weary and can neither die nor live:

Ready for lightning in its dark bosom, and for the redeeming flash of light, charged with lightnings which say Yea! which laugh Yea! ready for divining flashes of lightning:--

--Blessed, however, is he who is thus charged! And verily, long must he hang like a heavy tempest on the mountain, who shall one day kindle the light of the future!--

Oh, how could I not be ardent for Eternity and for the marriage-ring of rings--the ring of the return?
Never yet have I found the woman by whom I should like to have children, unless it be this woman whom I love: for I love thee, O Eternity!

FOR I LOVE THEE, O ETERNITY!

2.

If ever my wrath hath burst graves, shifted landmarks, or rolled old shattered tables into precipitous depths:

If ever my scorn hath scattered mouldered words to the winds, and if I have come like a besom to cross-spiders, and as a cleansing wind to old charnel-houses:

If ever I have sat rejoicing where old Gods lie buried, world-blessing, world-loving, beside the monuments of old world-maligners:--

--For even churches and Gods'-graves do I love, if only heaven looketh through their ruined roofs with pure eyes; gladly do I sit like grass and red poppies on ruined churches--

Oh, how could I not be ardent for Eternity, and for the marriage-ring of rings--the ring of the return?

Never yet have I found the woman by whom I should like to have children, unless it be this woman whom I love: for I love thee, O Eternity!

FOR I LOVE THEE, O ETERNITY!

3.

If ever a breath hath come to me of the creative breath, and of the heavenly necessity which compelleth even chances to dance star-dances:

If ever I have laughed with the laughter of the creative lightning, to which the long thunder of the deed followeth, grumblingly, but obediently:

If ever I have played dice with the Gods at the divine table of the earth, so that the earth quaked and ruptured, and snorted forth fire-streams:--

--For a divine table is the earth, and trembling with new creative dictums and dice-casts of the Gods:

Oh, how could I not be ardent for Eternity, and for the marriage-ring of rings--the ring of the return?

Never yet have I found the woman by whom I should like to have children, unless it be this woman whom I love: for I love thee, O Eternity!

FOR I LOVE THEE, O ETERNITY!

4.
If ever I have drunk a full draught of the foaming spice- and confection-bowl in which all things are well mixed:

If ever my hand hath mingled the furthest with the nearest, fire with spirit, joy with sorrow, and the harshest with the kindest:

If I myself am a grain of the saving salt which maketh everything in the confection-bowl mix well:--

--For there is a salt which uniteth good with evil; and even the evilest is worthy, as spicing and as final over-foaming:--

Oh, how could I not be ardent for Eternity, and for the marriage-ring of rings--the ring of the return?

Never yet have I found the woman by whom I should like to have children, unless it be this woman whom I love: for I love thee, O Eternity!

FOR I LOVE THEE, O ETERNITY!

5.

If I be fond of the sea, and all that is sealike, and fondest of it when it angrily contradicteth me:

If the exploring delight be in me, which impelleth sails to the undiscovered, if the seafarer's delight be in my delight:

If ever my rejoicing hath called out: "The shore hath vanished,--now hath fallen from me the last chain--

The boundless roareth around me, far away sparkle for me space and time,--well! cheer up! old heart!"--

Oh, how could I not be ardent for Eternity, and for the marriage-ring of rings--the ring of the return?

Never yet have I found the woman by whom I should like to have children, unless it be this woman whom I love: for I love thee, O Eternity!

FOR I LOVE THEE, O ETERNITY!

6.
If my virtue be a dancer's virtue, and if I have often sprung with both feet into golden-emerald rapture:

If my wickedness be a laughing wickedness, at home among rose-banks and hedges of lilies:

--For in laughter is all evil present, but it is sanctified and absolved by its own bliss:--

And if it be my Alpha and Omega that everything heavy shall become light, every body a dancer, and every spirit a bird: and verily, that is my Alpha and Omega!--

Oh, how could I not be ardent for Eternity, and for the marriage-ring of rings--the ring of the return?

Never yet have I found the woman by whom I should like to have children, unless it be this woman whom I love: for I love thee, O Eternity!

FOR I LOVE THEE, O ETERNITY!

7.

If ever I have spread out a tranquil heaven above me, and have flown into mine own heaven with mine own pinions:

If I have swum playfully in profound luminous distances, and if my freedom's avian wisdom hath come to me:--

--Thus however speaketh avian wisdom:--"Lo, there is no above and no below! Throw thyself about,--outward, backward, thou light one! Sing! speak no more!

--Are not all words made for the heavy? Do not all words lie to the light ones? Sing! speak no more!"--

Oh, how could I not be ardent for Eternity, and for the marriage-ring of rings--the ring of the return?

Never yet have I found the woman by whom I should like to have children, unless it be this woman whom I love: for I love thee, O Eternity!

FOR I LOVE THEE, O ETERNITY!

FOURTH AND LAST PART.

Ah, where in the world have there been greater follies than with the pitiful? And what in the world hath caused more suffering than the follies of the pitiful? Woe unto all loving ones who have not an elevation which is above their pity! Thus spake the devil unto me, once on a time: "Ever God hath his hell: it is his love for man." And lately did I hear him say these words: "God is dead: of his pity for man hath God died."--ZARATHUSTRA, II., "The Pitiful."

LXI. THE HONEY SACRIFICE.
--And again passed moons and years over Zarathustra's soul, and he heeded it not; his hair, however, became white. One day when he sat on a stone in front of his cave, and gazed calmly into the distance--one there gazeth out on the sea, and away beyond sinuous abysses,--then went his animals thoughtfully round about him, and at last set themselves in front of him.

"O Zarathustra," said they, "gazest thou out perhaps for thy happiness?"--"Of what account is my happiness!" answered he, "I have long ceased to strive any more for happiness, I strive for my work."--"O Zarathustra," said the animals once more, "that sayest thou as one who hath overmuch of good things. Liest thou not in a sky-blue lake of happiness?"--"Ye wags," answered Zarathustra, and smiled, "how well did ye choose the simile! But ye know also that my happiness is heavy, and not like a fluid wave of water: it presseth me and will not leave me, and is like molten pitch."--

Then went his animals again thoughtfully around him, and placed themselves once more in front of him. "O Zarathustra," said they, "it is consequently FOR THAT REASON that thou thyself always becometh yellower and darker, although thy hair looketh white and flaxen? Lo, thou sittest in thy pitch!"--"What do ye say, mine animals?" said Zarathustra, laughing; "verily I reviled when I spake of pitch. As it happeneth with me, so is it with all fruits that turn ripe. It is the HONEY in my veins that maketh my blood thicker, and also my soul stiller."--"So will it be, O Zarathustra," answered his animals, and pressed up to him; "but wilt thou not to-day ascend a high mountain? The air is pure, and to-day one seeth more of the world than ever."--"Yea, mine animals," answered he, "ye counsel admirably and according to my heart: I will to-day ascend a high mountain! But see that honey is there ready to hand, yellow, white, good, ice-cool, golden-comb-honey.
For know that when aloft I will make the honey-sacrifice."-- When Zarathustra, however, was aloft on the summit, he sent his animals home that had accompanied him, and found that he was now alone:--then he laughed from the bottom of his heart, looked around him, and spake thus: That I spake of sacrifices and honey-sacrifices, it was merely a ruse in talking and verily, a useful folly! Here aloft can I now speak freer than in front of mountain-caves and anchorites' domestic animals. What to sacrifice! I squander what is given me, a squanderer with a thousand hands: how could I call that--sacrificing? And when I desired honey I only desired bait, and sweet mucus and mucilage, for which even the mouths of growling bears, and strange, sulky, evil birds, water: --The best bait, as huntsmen and fishermen require it. For if the world be as a gloomy forest of animals, and a pleasure-ground for all wild huntsmen, it seemeth to me rather--and preferably--a fathomless, rich sea; --A sea full of many-hued fishes and crabs, for which even the Gods might long, and might be tempted to become fishers in it, and casters of nets,-- so rich is the world in wonderful things, great and small! Especially the human world, the human sea:--towards IT do I now throw out my golden angle-rod and say: Open up, thou human abyss! Open up, and throw unto me thy fish and shining crabs! With my best bait shall I allure to myself to-day the strangest human fish! --My happiness itself do I throw out into all places far and wide 'twixt orient, noontide, and occident, to see if many human fish will not learn to hug and tug at my happiness;-- Until, biting at my sharp hidden hooks, they have to come up unto MY height, the motleyest abyss-groundlings, to the wickedest of all fishers of men. 
For THIS am I from the heart and from the beginning--drawing, hither-drawing, upward-drawing, upbringing; a drawer, a trainer, a training-master, who not in vain counselled himself once on a time: "Become what thou art!"

Thus may men now come UP to me; for as yet do I await the signs that it is time for my down-going; as yet do I not myself go down, as I must do, amongst men. Therefore do I here wait, crafty and scornful upon high mountains, no impatient one, no patient one; rather one who hath even unlearnt patience,--because he no longer "suffereth." For my fate giveth me time: it hath forgotten me perhaps? Or doth it sit behind a big stone and catch flies? And verily, I am well-disposed to mine eternal fate, because it doth not hound and hurry me, but leaveth me time for merriment and mischief; so that I have to-day ascended this high mountain to catch fish.

Did ever any one catch fish upon high mountains? And though it be a folly what I here seek and do, it is better so than that down below I should become solemn with waiting, and green and yellow--

--A posturing wrath-snorter with waiting, a holy howl-storm from the mountains, an impatient one that shouteth down into the valleys: "Hearken, else I will scourge you with the scourge of God!"

Not that I would have a grudge against such wrathful ones on that account: they are well enough for laughter to me! Impatient must they now be, those big alarm-drums, which find a voice now or never! Myself, however, and my fate--we do not talk to the Present, neither do we talk to the Never: for talking we have patience and time and more than time. For one day must it yet come, and may not pass by.

What must one day come and may not pass by? Our great Hazar, that is to say, our great, remote human-kingdom, the Zarathustra-kingdom of a thousand years--

How remote may such "remoteness" be? What doth it concern me?
But on that account it is none the less sure unto me--with both feet stand I secure on this ground;

--On an eternal ground, on hard primary rock, on this highest, hardest, primary mountain-ridge, unto which all winds come, as unto the storm-parting, asking Where? and Whence? and Whither?

Here laugh, laugh, my hearty, healthy wickedness! From high mountains cast down thy glittering scorn-laughter! Allure for me with thy glittering the finest human fish!

And whatever belongeth unto ME in all seas, my in-and-for-me in all things--fish THAT out for me, bring THAT up to me: for that do I wait, the wickedest of all fish-catchers.

Out! out! my fishing-hook! In and down, thou bait of my happiness! Drip thy sweetest dew, thou honey of my heart! Bite, my fishing-hook, into the belly of all black affliction!

Look out, look out, mine eye! Oh, how many seas round about me, what dawning human futures! And above me--what rosy red stillness! What unclouded silence!

LXII. THE CRY OF DISTRESS.

The next day sat Zarathustra again on the stone in front of his cave, whilst his animals roved about in the world outside to bring home new food,--also new honey: for Zarathustra had spent and wasted the old honey to the very last particle. When he thus sat, however, with a stick in his hand, tracing the shadow of his figure on the earth, and reflecting--verily! not upon himself and his shadow,--all at once he startled and shrank back: for he saw another shadow beside his own. And when he hastily looked around and stood up, behold, there stood the soothsayer beside him, the same whom he had once given to eat and drink at his table, the proclaimer of the great weariness, who taught: "All is alike, nothing is worth while, the world is without meaning, knowledge strangleth." But his face had changed since then; and when Zarathustra looked into his eyes, his heart was startled once more: so much evil announcement and ashy-grey lightnings passed over that countenance.
The soothsayer, who had perceived what went on in Zarathustra's soul, wiped his face with his hand, as if he would wipe out the impression; the same did also Zarathustra. And when both of them had thus silently composed and strengthened themselves, they gave each other the hand, as a token that they wanted once more to recognise each other. "Welcome hither," said Zarathustra, "thou soothsayer of the great weariness, not in vain shalt thou once have been my messmate and guest. Eat and drink also with me to-day, and forgive it that a cheerful old man sitteth with thee at table!"--"A cheerful old man?" answered the soothsayer, shaking his head, "but whoever thou art, or wouldst be, O Zarathustra, thou hast been here aloft the longest time,--in a little while thy bark shall no longer rest on dry land!"--"Do I then rest on dry land?" --asked Zarathustra, laughing.--"The waves around thy mountain," answered the soothsayer, "rise and rise, the waves of great distress and affliction: they will soon raise thy bark also and carry thee away."--Thereupon was Zarathustra silent and wondered.--"Dost thou still hear nothing?" continued the soothsayer: "doth it not rush and roar out of the depth?"--Zarathustra was silent once more and listened: then heard he a long, long cry, which the abysses threw to one another and passed on; for none of them wished to retain it: so evil did it sound. "Thou ill announcer," said Zarathustra at last, "that is a cry of distress, and the cry of a man; it may come perhaps out of a black sea. But what doth human distress matter to me! My last sin which hath been reserved for me,--knowest thou what it is called?" --"PITY!" answered the soothsayer from an overflowing heart, and raised both his hands aloft--"O Zarathustra, I have come that I may seduce thee to thy last sin!"-- And hardly had those words been uttered when there sounded the cry once more, and longer and more alarming than before--also much nearer. "Hearest thou? 
Hearest thou, O Zarathustra?" called out the soothsayer, "the cry concerneth thee, it calleth thee: Come, come, come; it is time, it is the highest time!"-- Zarathustra was silent thereupon, confused and staggered; at last he asked, like one who hesitateth in himself: "And who is it that there calleth me?" "But thou knowest it, certainly," answered the soothsayer warmly, "why dost thou conceal thyself? It is THE HIGHER MAN that crieth for thee!" "The higher man?" cried Zarathustra, horror-stricken: "what wanteth HE? What wanteth HE? The higher man! What wanteth he here?"--and his skin covered with perspiration. The soothsayer, however, did not heed Zarathustra's alarm, but listened and listened in the downward direction. When, however, it had been still there for a long while, he looked behind, and saw Zarathustra standing trembling. "O Zarathustra," he began, with sorrowful voice, "thou dost not stand there like one whose happiness maketh him giddy: thou wilt have to dance lest thou tumble down! But although thou shouldst dance before me, and leap all thy side-leaps, no one may say unto me: 'Behold, here danceth the last joyous man!' In vain would any one come to this height who sought HIM here: caves would he find, indeed, and back-caves, hiding-places for hidden ones; but not lucky mines, nor treasure-chambers, nor new gold-veins of happiness. Happiness--how indeed could one find happiness among such buried-alive and solitary ones! Must I yet seek the last happiness on the Happy Isles, and far away among forgotten seas? But all is alike, nothing is worth while, no seeking is of service, there are no longer any Happy Isles!"-- Thus sighed the soothsayer; with his last sigh, however, Zarathustra again became serene and assured, like one who hath come out of a deep chasm into the light. "Nay! Nay! Three times Nay!" exclaimed he with a strong voice, and stroked his beard--"THAT do I know better! There are still Happy Isles! Silence THEREON, thou sighing sorrow-sack! 
Cease to splash THEREON, thou rain-cloud of the forenoon! Do I not already stand here wet with thy misery, and drenched like a dog? Now do I shake myself and run away from thee, that I may again become dry: thereat mayest thou not wonder! Do I seem to thee discourteous? Here however is MY court. But as regards the higher man: well! I shall seek him at once in those forests: FROM THENCE came his cry. Perhaps he is there hard beset by an evil beast. He is in MY domain: therein shall he receive no scath! And verily, there are many evil beasts about me."--

With those words Zarathustra turned around to depart. Then said the soothsayer: "O Zarathustra, thou art a rogue! I know it well: thou wouldst fain be rid of me! Rather wouldst thou run into the forest and lay snares for evil beasts! But what good will it do thee? In the evening wilt thou have me again: in thine own cave will I sit, patient and heavy like a block--and wait for thee!"

"So be it!" shouted back Zarathustra, as he went away: "and what is mine in my cave belongeth also unto thee, my guest! Shouldst thou however find honey therein, well! just lick it up, thou growling bear, and sweeten thy soul! For in the evening we want both to be in good spirits;

--In good spirits and joyful, because this day hath come to an end! And thou thyself shalt dance to my lays, as my dancing-bear. Thou dost not believe this? Thou shakest thy head? Well! Cheer up, old bear! But I also--am a soothsayer."

Thus spake Zarathustra.

LXIII. TALK WITH THE KINGS.

1.

Ere Zarathustra had been an hour on his way in the mountains and forests, he saw all at once a strange procession. Right on the path which he was about to descend came two kings walking, bedecked with crowns and purple girdles, and variegated like flamingoes: they drove before them a laden ass. "What do these kings want in my domain?" said Zarathustra in astonishment to his heart, and hid himself hastily behind a thicket.
When however the kings approached to him, he said half-aloud, like one speaking only to himself: "Strange! Strange! How doth this harmonise? Two kings do I see--and only one ass!" Thereupon the two kings made a halt; they smiled and looked towards the spot whence the voice proceeded, and afterwards looked into each other's faces. "Such things do we also think among ourselves," said the king on the right, "but we do not utter them." The king on the left, however, shrugged his shoulders and answered: "That may perhaps be a goat-herd. Or an anchorite who hath lived too long among rocks and trees. For no society at all spoileth also good manners." "Good manners?" replied angrily and bitterly the other king: "what then do we run out of the way of? Is it not 'good manners'? Our 'good society'? Better, verily, to live among anchorites and goat-herds, than with our gilded, false, over-rouged populace--though it call itself 'good society.' --Though it call itself 'nobility.' But there all is false and foul, above all the blood--thanks to old evil diseases and worse curers. The best and dearest to me at present is still a sound peasant, coarse, artful, obstinate and enduring: that is at present the noblest type. The peasant is at present the best; and the peasant type should be master! But it is the kingdom of the populace--I no longer allow anything to be imposed upon me. The populace, however--that meaneth, hodgepodge. Populace-hodgepodge: therein is everything mixed with everything, saint and swindler, gentleman and Jew, and every beast out of Noah's ark. Good manners! Everything is false and foul with us. No one knoweth any longer how to reverence: it is THAT precisely that we run away from. They are fulsome obtrusive dogs; they gild palm-leaves. This loathing choketh me, that we kings ourselves have become false, draped and disguised with the old faded pomp of our ancestors, show-pieces for the stupidest, the craftiest, and whosoever at present trafficketh for power. 
We ARE NOT the first men--and have nevertheless to STAND FOR them: of this imposture have we at last become weary and disgusted.

From the rabble have we gone out of the way, from all those bawlers and scribe-blowflies, from the trader-stench, the ambition-fidgeting, the bad breath--: fie, to live among the rabble;

--Fie, to stand for the first men among the rabble! Ah, loathing! Loathing! Loathing! What doth it now matter about us kings!"--

"Thine old sickness seizeth thee," said here the king on the left, "thy loathing seizeth thee, my poor brother. Thou knowest, however, that some one heareth us."

Immediately thereupon, Zarathustra, who had opened ears and eyes to this talk, rose from his hiding-place, advanced towards the kings, and thus began: "He who hearkeneth unto you, he who gladly hearkeneth unto you, is called Zarathustra. I am Zarathustra who once said: 'What doth it now matter about kings!' Forgive me; I rejoiced when ye said to each other: 'What doth it matter about us kings!' Here, however, is MY domain and jurisdiction: what may ye be seeking in my domain? Perhaps, however, ye have FOUND on your way what _I_ seek: namely, the higher man."

When the kings heard this, they beat upon their breasts and said with one voice: "We are recognised! With the sword of thine utterance severest thou the thickest darkness of our hearts. Thou hast discovered our distress; for lo! we are on our way to find the higher man--

--The man that is higher than we, although we are kings. To him do we convey this ass. For the highest man shall also be the highest lord on earth.

There is no sorer misfortune in all human destiny, than when the mighty of the earth are not also the first men. Then everything becometh false and distorted and monstrous. And when they are even the last men, and more beast than man, then riseth and riseth the populace in honour, and at last saith even the populace-virtue: 'Lo, I alone am virtue!'"--

What have I just heard? answered Zarathustra.
What wisdom in kings! I am enchanted, and verily, I have already promptings to make a rhyme thereon:--

--Even if it should happen to be a rhyme not suited for every one's ears. I unlearned long ago to have consideration for long ears. Well then! Well now!

(Here, however, it happened that the ass also found utterance: it said distinctly and with malevolence, Y-E-A.)

'Twas once--methinks year one of our blessed Lord,--
Drunk without wine, the Sybil thus deplored:--
"How ill things go! Decline! Decline! Ne'er sank the world so low!
Rome now hath turned harlot and harlot-stew,
Rome's Caesar a beast, and God--hath turned Jew!

2.

With those rhymes of Zarathustra the kings were delighted; the king on the right, however, said: "O Zarathustra, how well it was that we set out to see thee! For thine enemies showed us thy likeness in their mirror: there lookedst thou with the grimace of a devil, and sneeringly: so that we were afraid of thee. But what good did it do! Always didst thou prick us anew in heart and ear with thy sayings. Then did we say at last: What doth it matter how he look! We must HEAR him; him who teacheth: 'Ye shall love peace as a means to new wars, and the short peace more than the long!'

No one ever spake such warlike words: 'What is good? To be brave is good. It is the good war that halloweth every cause.'

O Zarathustra, our fathers' blood stirred in our veins at such words: it was like the voice of spring to old wine-casks. When the swords ran among one another like red-spotted serpents, then did our fathers become fond of life; the sun of every peace seemed to them languid and lukewarm, the long peace, however, made them ashamed. How they sighed, our fathers, when they saw on the wall brightly furbished, dried-up swords! Like those they thirsted for war.
For a sword thirsteth to drink blood, and sparkleth with desire."--

--When the kings thus discoursed and talked eagerly of the happiness of their fathers, there came upon Zarathustra no little desire to mock at their eagerness: for evidently they were very peaceable kings whom he saw before him, kings with old and refined features. But he restrained himself. "Well!" said he, "thither leadeth the way, there lieth the cave of Zarathustra; and this day is to have a long evening! At present, however, a cry of distress calleth me hastily away from you. It will honour my cave if kings want to sit and wait in it: but, to be sure, ye will have to wait long! Well! What of that! Where doth one at present learn better to wait than at courts? And the whole virtue of kings that hath remained unto them--is it not called to-day: ABILITY to wait?"

Thus spake Zarathustra.

LXIV. THE LEECH.

And Zarathustra went thoughtfully on, further and lower down, through forests and past moory bottoms; as it happeneth, however, to every one who meditateth upon hard matters, he trod thereby unawares upon a man. And lo, there spurted into his face all at once a cry of pain, and two curses and twenty bad invectives, so that in his fright he raised his stick and also struck the trodden one. Immediately afterwards, however, he regained his composure, and his heart laughed at the folly he had just committed.

"Pardon me," said he to the trodden one, who had got up enraged, and had seated himself, "pardon me, and hear first of all a parable. As a wanderer who dreameth of remote things on a lonesome highway, runneth unawares against a sleeping dog, a dog which lieth in the sun:

--As both of them then start up and snap at each other, like deadly enemies, those two beings mortally frightened--so did it happen unto us. And yet! And yet--how little was lacking for them to caress each other, that dog and that lonesome one! Are they not both--lonesome ones!"
--"Whoever thou art," said the trodden one, still enraged, "thou treadest also too nigh me with thy parable, and not only with thy foot! Lo! am I then a dog?"--And thereupon the sitting one got up, and pulled his naked arm out of the swamp. For at first he had lain outstretched on the ground, hidden and indiscernible, like those who lie in wait for swamp-game.

"But whatever art thou about!" called out Zarathustra in alarm, for he saw a deal of blood streaming over the naked arm,--"what hath hurt thee? Hath an evil beast bit thee, thou unfortunate one?"

The bleeding one laughed, still angry, "What matter is it to thee!" said he, and was about to go on. "Here am I at home and in my province. Let him question me whoever will: to a dolt, however, I shall hardly answer."

"Thou art mistaken," said Zarathustra sympathetically, and held him fast; "thou art mistaken. Here thou art not at home, but in my domain, and therein shall no one receive any hurt. Call me however what thou wilt--I am who I must be. I call myself Zarathustra. Well! Up thither is the way to Zarathustra's cave: it is not far,--wilt thou not attend to thy wounds at my home? It hath gone badly with thee, thou unfortunate one, in this life: first a beast bit thee, and then--a man trod upon thee!"--

When however the trodden one had heard the name of Zarathustra he was transformed. "What happeneth unto me!" he exclaimed, "WHO preoccupieth me so much in this life as this one man, namely Zarathustra, and that one animal that liveth on blood, the leech? For the sake of the leech did I lie here by this swamp, like a fisher, and already had mine outstretched arm been bitten ten times, when there biteth a still finer leech at my blood, Zarathustra himself! O happiness! O miracle! Praised be this day which enticed me into the swamp!
Praised be the best, the livest cupping-glass, that at present liveth; praised be the great conscience-leech Zarathustra!"--

Thus spake the trodden one, and Zarathustra rejoiced at his words and their refined reverential style. "Who art thou?" asked he, and gave him his hand, "there is much to clear up and elucidate between us, but already methinketh pure clear day is dawning."

"I am THE SPIRITUALLY CONSCIENTIOUS ONE," answered he who was asked, "and in matters of the spirit it is difficult for any one to take it more rigorously, more restrictedly, and more severely than I, except him from whom I learnt it, Zarathustra himself.

Better know nothing than half-know many things! Better be a fool on one's own account, than a sage on other people's approbation! I--go to the basis:

--What matter if it be great or small? If it be called swamp or sky? A handbreadth of basis is enough for me, if it be actually basis and ground!

--A handbreadth of basis: thereon can one stand. In the true knowing-knowledge there is nothing great and nothing small."

"Then thou art perhaps an expert on the leech?" asked Zarathustra; "and thou investigatest the leech to its ultimate basis, thou conscientious one?"

"O Zarathustra," answered the trodden one, "that would be something immense; how could I presume to do so! That, however, of which I am master and knower, is the BRAIN of the leech:--that is MY world!

And it is also a world! Forgive it, however, that my pride here findeth expression, for here I have not mine equal. Therefore said I: 'here am I at home.'

How long have I investigated this one thing, the brain of the leech, so that here the slippery truth might no longer slip from me! Here is MY domain!

--For the sake of this did I cast everything else aside, for the sake of this did everything else become indifferent to me; and close beside my knowledge lieth my black ignorance.
My spiritual conscience requireth from me that it should be so--that I should know one thing, and not know all else: they are a loathing unto me, all the semi-spiritual, all the hazy, hovering, and visionary. Where mine honesty ceaseth, there am I blind, and want also to be blind. Where I want to know, however, there want I also to be honest--namely, severe, rigorous, restricted, cruel and inexorable. Because THOU once saidest, O Zarathustra: 'Spirit is life which itself cutteth into life';--that led and allured me to thy doctrine. And verily, with mine own blood have I increased mine own knowledge!" --"As the evidence indicateth," broke in Zarathustra; for still was the blood flowing down on the naked arm of the conscientious one. For there had ten leeches bitten into it. "O thou strange fellow, how much doth this very evidence teach me--namely, thou thyself! And not all, perhaps, might I pour into thy rigorous ear! Well then! We part here! But I would fain find thee again. Up thither is the way to my cave: to-night shalt thou there be my welcome guest! Fain would I also make amends to thy body for Zarathustra treading upon thee with his feet: I think about that. Just now, however, a cry of distress calleth me hastily away from thee." Thus spake Zarathustra. LXV. THE MAGICIAN. 1. When however Zarathustra had gone round a rock, then saw he on the same path, not far below him, a man who threw his limbs about like a maniac, and at last tumbled to the ground on his belly. "Halt!" said then Zarathustra to his heart, "he there must surely be the higher man, from him came that dreadful cry of distress,--I will see if I can help him." When, however, he ran to the spot where the man lay on the ground, he found a trembling old man, with fixed eyes; and in spite of all Zarathustra's efforts to lift him and set him again on his feet, it was all in vain.
The unfortunate one, also, did not seem to notice that some one was beside him; on the contrary, he continually looked around with moving gestures, like one forsaken and isolated from all the world. At last, however, after much trembling, and convulsion, and curling-himself-up, he began to lament thus:

Who warm'th me, who lov'th me still?
Give ardent fingers!
Give heartening charcoal-warmers!
Prone, outstretched, trembling,
Like him, half dead and cold, whose feet one warm'th--
And shaken, ah! by unfamiliar fevers,
Shivering with sharpened, icy-cold frost-arrows,
By thee pursued, my fancy!
Ineffable! Recondite! Sore-frightening!
Thou huntsman 'hind the cloud-banks!
Now lightning-struck by thee,
Thou mocking eye that me in darkness watcheth:
--Thus do I lie,
Bend myself, twist myself, convulsed
With all eternal torture,
And smitten
By thee, cruellest huntsman,
Thou unfamiliar--GOD...

Smite deeper!
Smite yet once more!
Pierce through and rend my heart!
What mean'th this torture
With dull, indented arrows?
Why look'st thou hither,
Of human pain not weary,
With mischief-loving, godly flash-glances?
Not murder wilt thou,
But torture, torture?
For why--ME torture,
Thou mischief-loving, unfamiliar God?--

Ha! Ha!
Thou stealest nigh
In midnight's gloomy hour?...
What wilt thou?
Speak!
Thou crowdst me, pressest--
Ha! now far too closely!
Thou hearst me breathing,
Thou o'erhearst my heart,
Thou ever jealous one!
--Of what, pray, ever jealous?
Off! Off!
For why the ladder?
Wouldst thou GET IN?
To heart in-clamber?
To mine own secretest
Conceptions in-clamber?
Shameless one! Thou unknown one!--Thief!
What seekst thou by thy stealing?
What seekst thou by thy hearkening?
What seekst thou by thy torturing?
Thou torturer!
Thou--hangman-God!
Or shall I, as the mastiffs do,
Roll me before thee?
And cringing, enraptured, frantical,
My tail friendly--waggle!
In vain!
Goad further!
Cruellest goader!
No dog--thy game just am I,
Cruellest huntsman!
Thy proudest of captives,
Thou robber 'hind the cloud-banks...
Speak finally!
Thou lightning-veiled one! Thou unknown one! Speak!
What wilt thou, highway-ambusher, from--ME?
What WILT thou, unfamiliar--God?
What? Ransom-gold?
How much of ransom-gold?
Solicit much--that bid'th my pride!
And be concise--that bid'th mine other pride!
Ha! Ha!
ME--wantst thou? me?
--Entire?...

Ha! Ha!
And torturest me, fool that thou art,
Dead-torturest quite my pride?
Give LOVE to me--who warm'th me still?
Who lov'th me still?--
Give ardent fingers,
Give heartening charcoal-warmers,
Give me, the lonesomest,
The ice (ah! seven-fold frozen ice
For very enemies,
For foes, doth make one thirst).
Give, yield to me,
Cruellest foe,
--THYSELF!--

Away!
There fled he surely,
My final, only comrade,
My greatest foe,
Mine unfamiliar--
My hangman-God!...

--Nay!
Come thou back!
WITH all of thy great tortures!
To me the last of lonesome ones,
Oh, come thou back!
All my hot tears in streamlets trickle
Their course to thee!
And all my final hearty fervour--
Up-glow'th to THEE!
Oh, come thou back,
Mine unfamiliar God! my PAIN!
My final bliss!

2.

--Here, however, Zarathustra could no longer restrain himself; he took his staff and struck the wailer with all his might. "Stop this," cried he to him with wrathful laughter, "stop this, thou stage-player! Thou false coiner! Thou liar from the very heart! I know thee well! I will soon make warm legs to thee, thou evil magician: I know well how--to make it hot for such as thou!" --"Leave off," said the old man, and sprang up from the ground, "strike me no more, O Zarathustra! I did it only for amusement! That kind of thing belongeth to mine art. Thee thyself, I wanted to put to the proof when I gave this performance. And verily, thou hast well detected me! But thou thyself--hast given me no small proof of thyself: thou art HARD, thou wise Zarathustra! Hard strikest thou with thy 'truths,' thy cudgel forceth from me--THIS truth!"
--"Flatter not," answered Zarathustra, still excited and frowning, "thou stage-player from the heart! Thou art false: why speakest thou--of truth! Thou peacock of peacocks, thou sea of vanity; WHAT didst thou represent before me, thou evil magician; WHOM was I meant to believe in when thou wailedst in such wise?" "THE PENITENT IN SPIRIT," said the old man, "it was him--I represented; thou thyself once devisedst this expression-- --The poet and magician who at last turneth his spirit against himself, the transformed one who freezeth to death by his bad science and conscience. And just acknowledge it: it was long, O Zarathustra, before thou discoveredst my trick and lie! Thou BELIEVEDST in my distress when thou heldest my head with both thy hands,-- --I heard thee lament 'we have loved him too little, loved him too little!' Because I so far deceived thee, my wickedness rejoiced in me." "Thou mayest have deceived subtler ones than I," said Zarathustra sternly. "I am not on my guard against deceivers; I HAVE TO BE without precaution: so willeth my lot. Thou, however,--MUST deceive: so far do I know thee! Thou must ever be equivocal, trivocal, quadrivocal, and quinquivocal! Even what thou hast now confessed, is not nearly true enough nor false enough for me! Thou bad false coiner, how couldst thou do otherwise! Thy very malady wouldst thou whitewash if thou showed thyself naked to thy physician. Thus didst thou whitewash thy lie before me when thou saidst: 'I did so ONLY for amusement!' There was also SERIOUSNESS therein, thou ART something of a penitent-in-spirit! I divine thee well: thou hast become the enchanter of all the world; but for thyself thou hast no lie or artifice left,--thou art disenchanted to thyself! Thou hast reaped disgust as thy one truth. No word in thee is any longer genuine, but thy mouth is so: that is to say, the disgust that cleaveth unto thy mouth."-- --"Who art thou at all!" 
cried here the old magician with defiant voice, "who dareth to speak thus unto ME, the greatest man now living?"--and a green flash shot from his eye at Zarathustra. But immediately after he changed, and said sadly: "O Zarathustra, I am weary of it, I am disgusted with mine arts, I am not GREAT, why do I dissemble! But thou knowest it well--I sought for greatness! A great man I wanted to appear, and persuaded many; but the lie hath been beyond my power. On it do I collapse. O Zarathustra, everything is a lie in me; but that I collapse--this my collapsing is GENUINE!"-- "It honoureth thee," said Zarathustra gloomily, looking down with sidelong glance, "it honoureth thee that thou soughtest for greatness, but it betrayeth thee also. Thou art not great. Thou bad old magician, THAT is the best and the honestest thing I honour in thee, that thou hast become weary of thyself, and hast expressed it: 'I am not great.' THEREIN do I honour thee as a penitent-in-spirit, and although only for the twinkling of an eye, in that one moment wast thou--genuine. But tell me, what seekest thou here in MY forests and rocks? And if thou hast put thyself in MY way, what proof of me wouldst thou have?-- --Wherein didst thou put ME to the test?" Thus spake Zarathustra, and his eyes sparkled. But the old magician kept silence for a while; then said he: "Did I put thee to the test? I--seek only. O Zarathustra, I seek a genuine one, a right one, a simple one, an unequivocal one, a man of perfect honesty, a vessel of wisdom, a saint of knowledge, a great man! Knowest thou it not, O Zarathustra? I SEEK ZARATHUSTRA." --And here there arose a long silence between them: Zarathustra, however, became profoundly absorbed in thought, so that he shut his eyes. But afterwards coming back to the situation, he grasped the hand of the magician, and said, full of politeness and policy: "Well! Up thither leadeth the way, there is the cave of Zarathustra. 
In it mayest thou seek him whom thou wouldst fain find. And ask counsel of mine animals, mine eagle and my serpent: they shall help thee to seek. My cave however is large. I myself, to be sure--I have as yet seen no great man. That which is great, the acutest eye is at present insensible to it. It is the kingdom of the populace. Many a one have I found who stretched and inflated himself, and the people cried: 'Behold; a great man!' But what good do all bellows do! The wind cometh out at last. At last bursteth the frog which hath inflated itself too long: then cometh out the wind. To prick a swollen one in the belly, I call good pastime. Hear that, ye boys! Our to-day is of the populace: who still KNOWETH what is great and what is small! Who could there seek successfully for greatness! A fool only: it succeedeth with fools. Thou seekest for great men, thou strange fool? Who TAUGHT that to thee? Is to-day the time for it? Oh, thou bad seeker, why dost thou--tempt me?"-- Thus spake Zarathustra, comforted in his heart, and went laughing on his way. LXVI. OUT OF SERVICE. Not long, however, after Zarathustra had freed himself from the magician, he again saw a person sitting beside the path which he followed, namely a tall, black man, with a haggard, pale countenance: THIS MAN grieved him exceedingly. "Alas," said he to his heart, "there sitteth disguised affliction; methinketh he is of the type of the priests: what do THEY want in my domain? What! Hardly have I escaped from that magician, and must another necromancer again run across my path,-- --Some sorcerer with laying-on-of-hands, some sombre wonder-worker by the grace of God, some anointed world-maligner, whom, may the devil take! But the devil is never at the place which would be his right place: he always cometh too late, that cursed dwarf and club-foot!"-- Thus cursed Zarathustra impatiently in his heart, and considered how with averted look he might slip past the black man. But behold, it came about otherwise. 
For at the same moment had the sitting one already perceived him; and not unlike one whom an unexpected happiness overtaketh, he sprang to his feet, and went straight towards Zarathustra. "Whoever thou art, thou traveller," said he, "help a strayed one, a seeker, an old man, who may here easily come to grief! The world here is strange to me, and remote; wild beasts also did I hear howling; and he who could have given me protection--he is himself no more. I was seeking the pious man, a saint and an anchorite, who, alone in his forest, had not yet heard of what all the world knoweth at present." "WHAT doth all the world know at present?" asked Zarathustra. "Perhaps that the old God no longer liveth, in whom all the world once believed?" "Thou sayest it," answered the old man sorrowfully. "And I served that old God until his last hour. Now, however, am I out of service, without master, and yet not free; likewise am I no longer merry even for an hour, except it be in recollections. Therefore did I ascend into these mountains, that I might finally have a festival for myself once more, as becometh an old pope and church-father: for know it, that I am the last pope!--a festival of pious recollections and divine services. Now, however, is he himself dead, the most pious of men, the saint in the forest, who praised his God constantly with singing and mumbling. He himself found I no longer when I found his cot--but two wolves found I therein, which howled on account of his death,--for all animals loved him. Then did I haste away. Had I thus come in vain into these forests and mountains? Then did my heart determine that I should seek another, the most pious of all those who believe not in God--, my heart determined that I should seek Zarathustra!" Thus spake the hoary man, and gazed with keen eyes at him who stood before him. Zarathustra however seized the hand of the old pope and regarded it a long while with admiration. "Lo! 
thou venerable one," said he then, "what a fine and long hand! That is the hand of one who hath ever dispensed blessings. Now, however, doth it hold fast him whom thou seekest, me, Zarathustra. It is I, the ungodly Zarathustra, who saith: 'Who is ungodlier than I, that I may enjoy his teaching?'"-- Thus spake Zarathustra, and penetrated with his glances the thoughts and arrear-thoughts of the old pope. At last the latter began: "He who most loved and possessed him hath now also lost him most--: --Lo, I myself am surely the most godless of us at present? But who could rejoice at that!"-- --"Thou servedst him to the last?" asked Zarathustra thoughtfully, after a deep silence, "thou knowest HOW he died? Is it true what they say, that sympathy choked him; --That he saw how MAN hung on the cross, and could not endure it;--that his love to man became his hell, and at last his death?"-- The old pope however did not answer, but looked aside timidly, with a painful and gloomy expression. "Let him go," said Zarathustra, after prolonged meditation, still looking the old man straight in the eye. "Let him go, he is gone. And though it honoureth thee that thou speakest only in praise of this dead one, yet thou knowest as well as I WHO he was, and that he went curious ways." "To speak before three eyes," said the old pope cheerfully (he was blind of one eye), "in divine matters I am more enlightened than Zarathustra himself--and may well be so. My love served him long years, my will followed all his will. A good servant, however, knoweth everything, and many a thing even which a master hideth from himself. He was a hidden God, full of secrecy. Verily, he did not come by his son otherwise than by secret ways. At the door of his faith standeth adultery. Whoever extolleth him as a God of love, doth not think highly enough of love itself. Did not that God want also to be judge? But the loving one loveth irrespective of reward and requital.
When he was young, that God out of the Orient, then was he harsh and revengeful, and built himself a hell for the delight of his favourites. At last, however, he became old and soft and mellow and pitiful, more like a grandfather than a father, but most like a tottering old grandmother. There did he sit shrivelled in his chimney-corner, fretting on account of his weak legs, world-weary, will-weary, and one day he suffocated of his all-too-great pity."-- "Thou old pope," said here Zarathustra interposing, "hast thou seen THAT with thine eyes? It could well have happened in that way: in that way, AND also otherwise. When Gods die they always die many kinds of death. Well! At all events, one way or other--he is gone! He was counter to the taste of mine ears and eyes; worse than that I should not like to say against him. I love everything that looketh bright and speaketh honestly. But he--thou knowest it, forsooth, thou old priest, there was something of thy type in him, the priest-type--he was equivocal. He was also indistinct. How he raged at us, this wrath-snorter, because we understood him badly! But why did he not speak more clearly? And if the fault lay in our ears, why did he give us ears that heard him badly? If there was dirt in our ears, well! who put it in them? Too much miscarried with him, this potter who had not learned thoroughly! That he took revenge on his pots and creations, however, because they turned out badly--that was a sin against GOOD TASTE. There is also good taste in piety: THIS at last said: 'Away with SUCH a God! Better to have no God, better to set up destiny on one's own account, better to be a fool, better to be God oneself!'" --"What do I hear!" said then the old pope, with intent ears; "O Zarathustra, thou art more pious than thou believest, with such an unbelief! Some God in thee hath converted thee to thine ungodliness. Is it not thy piety itself which no longer letteth thee believe in a God? 
And thine over-great honesty will yet lead thee even beyond good and evil! Behold, what hath been reserved for thee? Thou hast eyes and hands and mouth, which have been predestined for blessing from eternity. One doth not bless with the hand alone. Nigh unto thee, though thou professest to be the ungodliest one, I feel a hale and holy odour of long benedictions: I feel glad and grieved thereby. Let me be thy guest, O Zarathustra, for a single night! Nowhere on earth shall I now feel better than with thee!"-- "Amen! So shall it be!" said Zarathustra, with great astonishment; "up thither leadeth the way, there lieth the cave of Zarathustra. Gladly, forsooth, would I conduct thee thither myself, thou venerable one; for I love all pious men. But now a cry of distress calleth me hastily away from thee. In my domain shall no one come to grief; my cave is a good haven. And best of all would I like to put every sorrowful one again on firm land and firm legs. Who, however, could take THY melancholy off thy shoulders? For that I am too weak. Long, verily, should we have to wait until some one re-awoke thy God for thee. For that old God liveth no more: he is indeed dead."-- Thus spake Zarathustra. LXVII. THE UGLIEST MAN. --And again did Zarathustra's feet run through mountains and forests, and his eyes sought and sought, but nowhere was he to be seen whom they wanted to see--the sorely distressed sufferer and crier. On the whole way, however, he rejoiced in his heart and was full of gratitude. "What good things," said he, "hath this day given me, as amends for its bad beginning! What strange interlocutors have I found! At their words will I now chew a long while as at good corn; small shall my teeth grind and crush them, until they flow like milk into my soul!"-- When, however, the path again curved round a rock, all at once the landscape changed, and Zarathustra entered into a realm of death. Here bristled aloft black and red cliffs, without any grass, tree, or bird's voice. 
For it was a valley which all animals avoided, even the beasts of prey, except that a species of ugly, thick, green serpent came here to die when they became old. Therefore the shepherds called this valley: "Serpent-death." Zarathustra, however, became absorbed in dark recollections, for it seemed to him as if he had once before stood in this valley. And much heaviness settled on his mind, so that he walked slowly and always more slowly, and at last stood still. Then, however, when he opened his eyes, he saw something sitting by the wayside shaped like a man, and hardly like a man, something nondescript. And all at once there came over Zarathustra a great shame, because he had gazed on such a thing. Blushing up to the very roots of his white hair, he turned aside his glance, and raised his foot that he might leave this ill-starred place. Then, however, became the dead wilderness vocal: for from the ground a noise welled up, gurgling and rattling, as water gurgleth and rattleth at night through stopped-up water-pipes; and at last it turned into human voice and human speech:--it sounded thus: "Zarathustra! Zarathustra! Read my riddle! Say, say! WHAT IS THE REVENGE ON THE WITNESS? I entice thee back; here is smooth ice! See to it, see to it, that thy pride do not here break its legs! Thou thinkest thyself wise, thou proud Zarathustra! Read then the riddle, thou hard nut-cracker,--the riddle that I am! Say then: who am _I_!" --When however Zarathustra had heard these words,--what think ye then took place in his soul? PITY OVERCAME HIM; and he sank down all at once, like an oak that hath long withstood many tree-fellers,--heavily, suddenly, to the terror even of those who meant to fell it. But immediately he got up again from the ground, and his countenance became stern. "I know thee well," said he, with a brazen voice, "THOU ART THE MURDERER OF GOD! Let me go. Thou couldst not ENDURE him who beheld THEE,--who ever beheld thee through and through, thou ugliest man.
Thou tookest revenge on this witness!" Thus spake Zarathustra and was about to go; but the nondescript grasped at a corner of his garment and began anew to gurgle and seek for words. "Stay," said he at last-- --"Stay! Do not pass by! I have divined what axe it was that struck thee to the ground: hail to thee, O Zarathustra, that thou art again upon thy feet! Thou hast divined, I know it well, how the man feeleth who killed him,--the murderer of God. Stay! Sit down here beside me; it is not to no purpose. To whom would I go but unto thee? Stay, sit down! Do not however look at me! Honour thus--mine ugliness! They persecute me: now art THOU my last refuge. NOT with their hatred, NOT with their bailiffs;--Oh, such persecution would I mock at, and be proud and cheerful! Hath not all success hitherto been with the well-persecuted ones? And he who persecuteth well learneth readily to be OBSEQUENT--when once he is--put behind! But it is their PITY-- --Their pity is it from which I flee away and flee to thee. O Zarathustra, protect me, thou, my last refuge, thou sole one who divinedst me: --Thou hast divined how the man feeleth who killed HIM. Stay! And if thou wilt go, thou impatient one, go not the way that I came. THAT way is bad. Art thou angry with me because I have already racked language too long? Because I have already counselled thee? But know that it is I, the ugliest man, --Who have also the largest, heaviest feet. Where _I_ have gone, the way is bad. I tread all paths to death and destruction. But that thou passedst me by in silence, that thou blushedst--I saw it well: thereby did I know thee as Zarathustra. Every one else would have thrown to me his alms, his pity, in look and speech. But for that--I am not beggar enough: that didst thou divine. For that I am too RICH, rich in what is great, frightful, ugliest, most unutterable! Thy shame, O Zarathustra, HONOURED me! 
With difficulty did I get out of the crowd of the pitiful,--that I might find the only one who at present teacheth that 'pity is obtrusive'-- thyself, O Zarathustra! --Whether it be the pity of a God, or whether it be human pity, it is offensive to modesty. And unwillingness to help may be nobler than the virtue that rusheth to do so. THAT however--namely, pity--is called virtue itself at present by all petty people:--they have no reverence for great misfortune, great ugliness, great failure. Beyond all these do I look, as a dog looketh over the backs of thronging flocks of sheep. They are petty, good-wooled, good-willed, grey people. As the heron looketh contemptuously at shallow pools, with backward-bent head, so do I look at the throng of grey little waves and wills and souls. Too long have we acknowledged them to be right, those petty people: SO we have at last given them power as well;--and now do they teach that 'good is only what petty people call good.' And 'truth' is at present what the preacher spake who himself sprang from them, that singular saint and advocate of the petty people, who testified of himself: 'I--am the truth.' That immodest one hath long made the petty people greatly puffed up,--he who taught no small error when he taught: 'I--am the truth.' Hath an immodest one ever been answered more courteously?--Thou, however, O Zarathustra, passedst him by, and saidst: 'Nay! Nay! Three times Nay!' Thou warnedst against his error; thou warnedst--the first to do so--against pity:--not every one, not none, but thyself and thy type. Thou art ashamed of the shame of the great sufferer; and verily when thou sayest: 'From pity there cometh a heavy cloud; take heed, ye men!' --When thou teachest: 'All creators are hard, all great love is beyond their pity:' O Zarathustra, how well versed dost thou seem to me in weather-signs! Thou thyself, however,--warn thyself also against THY pity! 
For many are on their way to thee, many suffering, doubting, despairing, drowning, freezing ones-- I warn thee also against myself. Thou hast read my best, my worst riddle, myself, and what I have done. I know the axe that felleth thee. But he--HAD TO die: he looked with eyes which beheld EVERYTHING,--he beheld men's depths and dregs, all his hidden ignominy and ugliness. His pity knew no modesty: he crept into my dirtiest corners. This most prying, over-intrusive, over-pitiful one had to die. He ever beheld ME: on such a witness I would have revenge--or not live myself. The God who beheld everything, AND ALSO MAN: that God had to die! Man cannot ENDURE it that such a witness should live." Thus spake the ugliest man. Zarathustra however got up, and prepared to go on: for he felt frozen to the very bowels. "Thou nondescript," said he, "thou warnedst me against thy path. As thanks for it I praise mine to thee. Behold, up thither is the cave of Zarathustra. My cave is large and deep and hath many corners; there findeth he that is most hidden his hiding-place. And close beside it, there are a hundred lurking-places and by-places for creeping, fluttering, and hopping creatures. Thou outcast, who hast cast thyself out, thou wilt not live amongst men and men's pity? Well then, do like me! Thus wilt thou learn also from me; only the doer learneth. And talk first and foremost to mine animals! The proudest animal and the wisest animal--they might well be the right counsellors for us both!"-- Thus spake Zarathustra and went his way, more thoughtfully and slowly even than before: for he asked himself many things, and hardly knew what to answer. "How poor indeed is man," thought he in his heart, "how ugly, how wheezy, how full of hidden shame! They tell me that man loveth himself. Ah, how great must that self-love be! How much contempt is opposed to it! Even this man hath loved himself, as he hath despised himself,--a great lover methinketh he is, and a great despiser. 
No one have I yet found who more thoroughly despised himself: even THAT is elevation. Alas, was THIS perhaps the higher man whose cry I heard? I love the great despisers. Man is something that hath to be surpassed."-- LXVIII. THE VOLUNTARY BEGGAR. When Zarathustra had left the ugliest man, he was chilled and felt lonesome: for much coldness and lonesomeness came over his spirit, so that even his limbs became colder thereby. When, however, he wandered on and on, uphill and down, at times past green meadows, though also sometimes over wild stony couches where formerly perhaps an impatient brook had made its bed, then he turned all at once warmer and heartier again. "What hath happened unto me?" he asked himself, "something warm and living quickeneth me; it must be in the neighbourhood. Already am I less alone; unconscious companions and brethren rove around me; their warm breath toucheth my soul." When, however, he spied about and sought for the comforters of his lonesomeness, behold, there were kine there standing together on an eminence, whose proximity and smell had warmed his heart. The kine, however, seemed to listen eagerly to a speaker, and took no heed of him who approached. When, however, Zarathustra was quite nigh unto them, then did he hear plainly that a human voice spake in the midst of the kine, and apparently all of them had turned their heads towards the speaker. Then ran Zarathustra up speedily and drove the animals aside; for he feared that some one had here met with harm, which the pity of the kine would hardly be able to relieve. But in this he was deceived; for behold, there sat a man on the ground who seemed to be persuading the animals to have no fear of him, a peaceable man and Preacher-on-the-Mount, out of whose eyes kindness itself preached. "What dost thou seek here?" called out Zarathustra in astonishment. "What do I here seek?" answered he: "the same that thou seekest, thou mischief-maker; that is to say, happiness upon earth. 
To that end, however, I would fain learn of these kine. For I tell thee that I have already talked half a morning unto them, and just now were they about to give me their answer. Why dost thou disturb them? Except we be converted and become as kine, we shall in no wise enter into the kingdom of heaven. For we ought to learn from them one thing: ruminating. And verily, although a man should gain the whole world, and yet not learn one thing, ruminating, what would it profit him! He would not be rid of his affliction, --His great affliction: that, however, is at present called DISGUST. Who hath not at present his heart, his mouth and his eyes full of disgust? Thou also! Thou also! But behold these kine!"-- Thus spake the Preacher-on-the-Mount, and turned then his own look towards Zarathustra--for hitherto it had rested lovingly on the kine--: then, however, he put on a different expression. "Who is this with whom I talk?" he exclaimed frightened, and sprang up from the ground. "This is the man without disgust, this is Zarathustra himself, the surmounter of the great disgust, this is the eye, this is the mouth, this is the heart of Zarathustra himself." And whilst he thus spake he kissed with o'erflowing eyes the hands of him with whom he spake, and behaved altogether like one to whom a precious gift and jewel hath fallen unawares from heaven. The kine, however, gazed at it all and wondered. "Speak not of me, thou strange one; thou amiable one!" said Zarathustra, and restrained his affection, "speak to me firstly of thyself! Art thou not the voluntary beggar who once cast away great riches,-- --Who was ashamed of his riches and of the rich, and fled to the poorest to bestow upon them his abundance and his heart? But they received him not." "But they received me not," said the voluntary beggar, "thou knowest it, forsooth. So I went at last to the animals and to those kine." 
"Then learnedst thou," interrupted Zarathustra, "how much harder it is to give properly than to take properly, and that bestowing well is an ART--the last, subtlest master-art of kindness." "Especially nowadays," answered the voluntary beggar: "at present, that is to say, when everything low hath become rebellious and exclusive and haughty in its manner--in the manner of the populace. For the hour hath come, thou knowest it forsooth, for the great, evil, long, slow mob-and-slave-insurrection: it extendeth and extendeth! Now doth it provoke the lower classes, all benevolence and petty giving; and the overrich may be on their guard! Whoever at present drip, like bulgy bottles out of all-too-small necks:--of such bottles at present one willingly breaketh the necks. Wanton avidity, bilious envy, careworn revenge, populace-pride: all these struck mine eye. It is no longer true that the poor are blessed. The kingdom of heaven, however, is with the kine." "And why is it not with the rich?" asked Zarathustra temptingly, while he kept back the kine which sniffed familiarly at the peaceful one. "Why dost thou tempt me?" answered the other. "Thou knowest it thyself better even than I. What was it drove me to the poorest, O Zarathustra? Was it not my disgust at the richest? --At the culprits of riches, with cold eyes and rank thoughts, who pick up profit out of all kinds of rubbish--at this rabble that stinketh to heaven, --At this gilded, falsified populace, whose fathers were pickpockets, or carrion-crows, or rag-pickers, with wives compliant, lewd and forgetful:-- for they are all of them not far different from harlots-- Populace above, populace below! What are 'poor' and 'rich' at present! That distinction did I unlearn,--then did I flee away further and ever further, until I came to those kine." Thus spake the peaceful one, and puffed himself and perspired with his words: so that the kine wondered anew. 
Zarathustra, however, kept looking into his face with a smile, all the time the man talked so severely--and shook silently his head. "Thou doest violence to thyself, thou Preacher-on-the-Mount, when thou usest such severe words. For such severity neither thy mouth nor thine eye have been given thee. Nor, methinketh, hath thy stomach either: unto IT all such rage and hatred and foaming-over is repugnant. Thy stomach wanteth softer things: thou art not a butcher. Rather seemest thou to me a plant-eater and a root-man. Perhaps thou grindest corn. Certainly, however, thou art averse to fleshly joys, and thou lovest honey." "Thou hast divined me well," answered the voluntary beggar, with lightened heart. "I love honey, I also grind corn; for I have sought out what tasteth sweetly and maketh pure breath: --Also what requireth a long time, a day's-work and a mouth's-work for gentle idlers and sluggards. Furthest, to be sure, have those kine carried it: they have devised ruminating and lying in the sun. They also abstain from all heavy thoughts which inflate the heart." --"Well!" said Zarathustra, "thou shouldst also see MINE animals, mine eagle and my serpent,--their like do not at present exist on earth. Behold, thither leadeth the way to my cave: be to-night its guest. And talk to mine animals of the happiness of animals,-- --Until I myself come home. For now a cry of distress calleth me hastily away from thee. Also, shouldst thou find new honey with me, ice-cold, golden-comb-honey, eat it! Now, however, take leave at once of thy kine, thou strange one! thou amiable one! though it be hard for thee. For they are thy warmest friends and preceptors!"-- --"One excepted, whom I hold still dearer," answered the voluntary beggar. "Thou thyself art good, O Zarathustra, and better even than a cow!" "Away, away with thee! thou evil flatterer!" cried Zarathustra mischievously, "why dost thou spoil me with such praise and flattery-honey? "Away, away from me!" 
cried he once more, and heaved his stick at the fond beggar, who, however, ran nimbly away. LXIX. THE SHADOW. Scarcely however was the voluntary beggar gone in haste, and Zarathustra again alone, when he heard behind him a new voice which called out: "Stay! Zarathustra! Do wait! It is myself, forsooth, O Zarathustra, myself, thy shadow!" But Zarathustra did not wait; for a sudden irritation came over him on account of the crowd and the crowding in his mountains. "Whither hath my lonesomeness gone?" spake he. "It is verily becoming too much for me; these mountains swarm; my kingdom is no longer of THIS world; I require new mountains. My shadow calleth me? What matter about my shadow! Let it run after me! I--run away from it." Thus spake Zarathustra to his heart and ran away. But the one behind followed after him, so that immediately there were three runners, one after the other--namely, foremost the voluntary beggar, then Zarathustra, and thirdly, and hindmost, his shadow. But not long had they run thus when Zarathustra became conscious of his folly, and shook off with one jerk all his irritation and detestation. "What!" said he, "have not the most ludicrous things always happened to us old anchorites and saints? Verily, my folly hath grown big in the mountains! Now do I hear six old fools' legs rattling behind one another! But doth Zarathustra need to be frightened by his shadow? Also, methinketh that after all it hath longer legs than mine." Thus spake Zarathustra, and, laughing with eyes and entrails, he stood still and turned round quickly--and behold, he almost thereby threw his shadow and follower to the ground, so closely had the latter followed at his heels, and so weak was he. For when Zarathustra scrutinised him with his glance he was frightened as by a sudden apparition, so slender, swarthy, hollow and worn-out did this follower appear. "Who art thou?" asked Zarathustra vehemently, "what doest thou here? And why callest thou thyself my shadow? 
Thou art not pleasing unto me." "Forgive me," answered the shadow, "that it is I; and if I please thee not --well, O Zarathustra! therein do I admire thee and thy good taste. A wanderer am I, who have walked long at thy heels; always on the way, but without a goal, also without a home: so that verily, I lack little of being the eternally Wandering Jew, except that I am not eternal and not a Jew. What? Must I ever be on the way? Whirled by every wind, unsettled, driven about? O earth, thou hast become too round for me! On every surface have I already sat, like tired dust have I fallen asleep on mirrors and window-panes: everything taketh from me, nothing giveth; I become thin--I am almost equal to a shadow. After thee, however, O Zarathustra, did I fly and hie longest; and though I hid myself from thee, I was nevertheless thy best shadow: wherever thou hast sat, there sat I also. With thee have I wandered about in the remotest, coldest worlds, like a phantom that voluntarily haunteth winter roofs and snows. With thee have I pushed into all the forbidden, all the worst and the furthest: and if there be anything of virtue in me, it is that I have had no fear of any prohibition. With thee have I broken up whatever my heart revered; all boundary-stones and statues have I o'erthrown; the most dangerous wishes did I pursue,-- verily, beyond every crime did I once go. With thee did I unlearn the belief in words and worths and in great names. When the devil casteth his skin, doth not his name also fall away? It is also skin. The devil himself is perhaps--skin. 'Nothing is true, all is permitted': so said I to myself. Into the coldest water did I plunge with head and heart. Ah, how oft did I stand there naked on that account, like a red crab! Ah, where have gone all my goodness and all my shame and all my belief in the good! Ah, where is the lying innocence which I once possessed, the innocence of the good and of their noble lies! 
Too oft, verily, did I follow close to the heels of truth: then did it kick me on the face. Sometimes I meant to lie, and behold! then only did I hit--the truth. Too much hath become clear unto me: now it doth not concern me any more. Nothing liveth any longer that I love,--how should I still love myself? 'To live as I incline, or not to live at all': so do I wish; so wisheth also the holiest. But alas! how have _I_ still--inclination? Have _I_--still a goal? A haven towards which MY sail is set? A good wind? Ah, he only who knoweth WHITHER he saileth, knoweth what wind is good, and a fair wind for him. What still remaineth to me? A heart weary and flippant; an unstable will; fluttering wings; a broken backbone. This seeking for MY home: O Zarathustra, dost thou know that this seeking hath been MY home-sickening; it eateth me up. 'WHERE is--MY home?' For it do I ask and seek, and have sought, but have not found it. O eternal everywhere, O eternal nowhere, O eternal--in-vain!" Thus spake the shadow, and Zarathustra's countenance lengthened at his words. "Thou art my shadow!" said he at last sadly. "Thy danger is not small, thou free spirit and wanderer! Thou hast had a bad day: see that a still worse evening doth not overtake thee! To such unsettled ones as thou, seemeth at last even a prisoner blessed. Didst thou ever see how captured criminals sleep? They sleep quietly, they enjoy their new security. Beware lest in the end a narrow faith capture thee, a hard, rigorous delusion! For now everything that is narrow and fixed seduceth and tempteth thee. Thou hast lost thy goal. Alas, how wilt thou forego and forget that loss? Thereby--hast thou also lost thy way! Thou poor rover and rambler, thou tired butterfly! wilt thou have a rest and a home this evening? Then go up to my cave! Thither leadeth the way to my cave. And now will I run quickly away from thee again. Already lieth as it were a shadow upon me. 
I will run alone, so that it may again become bright around me. Therefore must I still be a long time merrily upon my legs. In the evening, however, there will be--dancing with me!"-- Thus spake Zarathustra. LXX. NOONTIDE. --And Zarathustra ran and ran, but he found no one else, and was alone and ever found himself again; he enjoyed and quaffed his solitude, and thought of good things--for hours. About the hour of noontide, however, when the sun stood exactly over Zarathustra's head, he passed an old, bent and gnarled tree, which was encircled round by the ardent love of a vine, and hidden from itself; from this there hung yellow grapes in abundance, confronting the wanderer. Then he felt inclined to quench a little thirst, and to break off for himself a cluster of grapes. When, however, he had already his arm out-stretched for that purpose, he felt still more inclined for something else--namely, to lie down beside the tree at the hour of perfect noontide and sleep. This Zarathustra did; and no sooner had he laid himself on the ground in the stillness and secrecy of the variegated grass, than he had forgotten his little thirst, and fell asleep. For as the proverb of Zarathustra saith: "One thing is more necessary than the other." Only that his eyes remained open:--for they never grew weary of viewing and admiring the tree and the love of the vine. In falling asleep, however, Zarathustra spake thus to his heart: "Hush! Hush! Hath not the world now become perfect? What hath happened unto me? As a delicate wind danceth invisibly upon parqueted seas, light, feather-light, so--danceth sleep upon me. No eye doth it close to me, it leaveth my soul awake. Light is it, verily, feather-light. It persuadeth me, I know not how, it toucheth me inwardly with a caressing hand, it constraineth me. Yea, it constraineth me, so that my soul stretcheth itself out:-- --How long and weary it becometh, my strange soul! Hath a seventh-day evening come to it precisely at noontide? 
Hath it already wandered too long, blissfully, among good and ripe things? It stretcheth itself out, long--longer! it lieth still, my strange soul. Too many good things hath it already tasted; this golden sadness oppresseth it, it distorteth its mouth. --As a ship that putteth into the calmest cove:--it now draweth up to the land, weary of long voyages and uncertain seas. Is not the land more faithful? As such a ship huggeth the shore, tuggeth the shore:--then it sufficeth for a spider to spin its thread from the ship to the land. No stronger ropes are required there. As such a weary ship in the calmest cove, so do I also now repose, nigh to the earth, faithful, trusting, waiting, bound to it with the lightest threads. O happiness! O happiness! Wilt thou perhaps sing, O my soul? Thou liest in the grass. But this is the secret, solemn hour, when no shepherd playeth his pipe. Take care! Hot noontide sleepeth on the fields. Do not sing! Hush! The world is perfect. Do not sing, thou prairie-bird, my soul! Do not even whisper! Lo--hush! The old noontide sleepeth, it moveth its mouth: doth it not just now drink a drop of happiness-- --An old brown drop of golden happiness, golden wine? Something whisketh over it, its happiness laugheth. Thus--laugheth a God. Hush!-- --'For happiness, how little sufficeth for happiness!' Thus spake I once and thought myself wise. But it was a blasphemy: THAT have I now learned. Wise fools speak better. The least thing precisely, the gentlest thing, the lightest thing, a lizard's rustling, a breath, a whisk, an eye-glance--LITTLE maketh up the BEST happiness. Hush! --What hath befallen me? Hark! Hath time flown away? Do I not fall? Have I not fallen--hark! into the well of eternity? --What happeneth to me? Hush! It stingeth me--alas--to the heart? To the heart! Oh, break up, break up, my heart, after such happiness, after such a sting! --What? Hath not the world just now become perfect? Round and ripe? 
Oh, for the golden round ring--whither doth it fly? Let me run after it! Quick! Hush--" (and here Zarathustra stretched himself, and felt that he was asleep.) "Up!" said he to himself, "thou sleeper! Thou noontide sleeper! Well then, up, ye old legs! It is time and more than time; many a good stretch of road is still awaiting you-- Now have ye slept your fill; for how long a time? A half-eternity! Well then, up now, mine old heart! For how long after such a sleep mayest thou --remain awake?" (But then did he fall asleep anew, and his soul spake against him and defended itself, and lay down again)--"Leave me alone! Hush! Hath not the world just now become perfect? Oh, for the golden round ball!-- "Get up," said Zarathustra, "thou little thief, thou sluggard! What! Still stretching thyself, yawning, sighing, falling into deep wells? Who art thou then, O my soul!" (and here he became frightened, for a sunbeam shot down from heaven upon his face.) "O heaven above me," said he sighing, and sat upright, "thou gazest at me? Thou hearkenest unto my strange soul? When wilt thou drink this drop of dew that fell down upon all earthly things,--when wilt thou drink this strange soul-- --When, thou well of eternity! thou joyous, awful, noontide abyss! when wilt thou drink my soul back into thee?" Thus spake Zarathustra, and rose from his couch beside the tree, as if awakening from a strange drunkenness: and behold! there stood the sun still exactly above his head. One might, however, rightly infer therefrom that Zarathustra had not then slept long. LXXI. THE GREETING. It was late in the afternoon only when Zarathustra, after long useless searching and strolling about, again came home to his cave. When, however, he stood over against it, not more than twenty paces therefrom, the thing happened which he now least of all expected: he heard anew the great CRY OF DISTRESS. And extraordinary! this time the cry came out of his own cave. 
It was a long, manifold, peculiar cry, and Zarathustra plainly distinguished that it was composed of many voices: although heard at a distance it might sound like the cry out of a single mouth. Thereupon Zarathustra rushed forward to his cave, and behold! what a spectacle awaited him after that concert! For there did they all sit together whom he had passed during the day: the king on the right and the king on the left, the old magician, the pope, the voluntary beggar, the shadow, the intellectually conscientious one, the sorrowful soothsayer, and the ass; the ugliest man, however, had set a crown on his head, and had put round him two purple girdles,--for he liked, like all ugly ones, to disguise himself and play the handsome person. In the midst, however, of that sorrowful company stood Zarathustra's eagle, ruffled and disquieted, for it had been called upon to answer too much for which its pride had not any answer; the wise serpent however hung round its neck. All this did Zarathustra behold with great astonishment; then however he scrutinised each individual guest with courteous curiosity, read their souls and wondered anew. In the meantime the assembled ones had risen from their seats, and waited with reverence for Zarathustra to speak. Zarathustra however spake thus: "Ye despairing ones! Ye strange ones! So it was YOUR cry of distress that I heard? And now do I know also where he is to be sought, whom I have sought for in vain to-day: THE HIGHER MAN--: --In mine own cave sitteth he, the higher man! But why do I wonder! Have not I myself allured him to me by honey-offerings and artful lure-calls of my happiness? But it seemeth to me that ye are badly adapted for company: ye make one another's hearts fretful, ye that cry for help, when ye sit here together? There is one that must first come, --One who will make you laugh once more, a good jovial buffoon, a dancer, a wind, a wild romp, some old fool:--what think ye? 
Forgive me, however, ye despairing ones, for speaking such trivial words before you, unworthy, verily, of such guests! But ye do not divine WHAT maketh my heart wanton:-- --Ye yourselves do it, and your aspect, forgive it me! For every one becometh courageous who beholdeth a despairing one. To encourage a despairing one--every one thinketh himself strong enough to do so. To myself have ye given this power,--a good gift, mine honourable guests! An excellent guest's-present! Well, do not then upbraid when I also offer you something of mine. This is mine empire and my dominion: that which is mine, however, shall this evening and tonight be yours. Mine animals shall serve you: let my cave be your resting-place! At house and home with me shall no one despair: in my purlieus do I protect every one from his wild beasts. And that is the first thing which I offer you: security! The second thing, however, is my little finger. And when ye have THAT, then take the whole hand also, yea, and the heart with it! Welcome here, welcome to you, my guests!" Thus spake Zarathustra, and laughed with love and mischief. After this greeting his guests bowed once more and were reverentially silent; the king on the right, however, answered him in their name. "O Zarathustra, by the way in which thou hast given us thy hand and thy greeting, we recognise thee as Zarathustra. Thou hast humbled thyself before us; almost hast thou hurt our reverence--: --Who however could have humbled himself as thou hast done, with such pride? THAT uplifteth us ourselves; a refreshment is it, to our eyes and hearts. To behold this, merely, gladly would we ascend higher mountains than this. For as eager beholders have we come; we wanted to see what brighteneth dim eyes. And lo! now is it all over with our cries of distress. Now are our minds and hearts open and enraptured. Little is lacking for our spirits to become wanton. 
There is nothing, O Zarathustra, that groweth more pleasingly on earth than a lofty, strong will: it is the finest growth. An entire landscape refresheth itself at one such tree. To the pine do I compare him, O Zarathustra, which groweth up like thee-- tall, silent, hardy, solitary, of the best, supplest wood, stately,-- --In the end, however, grasping out for ITS dominion with strong, green branches, asking weighty questions of the wind, the storm, and whatever is at home on high places; --Answering more weightily, a commander, a victor! Oh! who should not ascend high mountains to behold such growths? At thy tree, O Zarathustra, the gloomy and ill-constituted also refresh themselves; at thy look even the wavering become steady and heal their hearts. And verily, towards thy mountain and thy tree do many eyes turn to-day; a great longing hath arisen, and many have learned to ask: 'Who is Zarathustra?' And those into whose ears thou hast at any time dripped thy song and thy honey: all the hidden ones, the lone-dwellers and the twain-dwellers, have simultaneously said to their hearts: 'Doth Zarathustra still live? It is no longer worth while to live, everything is indifferent, everything is useless: or else--we must live with Zarathustra!' 'Why doth he not come who hath so long announced himself?' thus do many people ask; 'hath solitude swallowed him up? Or should we perhaps go to him?' Now doth it come to pass that solitude itself becometh fragile and breaketh open, like a grave that breaketh open and can no longer hold its dead. Everywhere one seeth resurrected ones. Now do the waves rise and rise around thy mountain, O Zarathustra. And however high be thy height, many of them must rise up to thee: thy boat shall not rest much longer on dry ground. 
And that we despairing ones have now come into thy cave, and already no longer despair:--it is but a prognostic and a presage that better ones are on the way to thee,-- --For they themselves are on the way to thee, the last remnant of God among men--that is to say, all the men of great longing, of great loathing, of great satiety, --All who do not want to live unless they learn again to HOPE--unless they learn from thee, O Zarathustra, the GREAT hope!" Thus spake the king on the right, and seized the hand of Zarathustra in order to kiss it; but Zarathustra checked his veneration, and stepped back frightened, fleeing as it were, silently and suddenly into the far distance. After a little while, however, he was again at home with his guests, looked at them with clear scrutinising eyes, and said: "My guests, ye higher men, I will speak plain language and plainly with you. It is not for YOU that I have waited here in these mountains." ("'Plain language and plainly?' Good God!" said here the king on the left to himself; "one seeth he doth not know the good Occidentals, this sage out of the Orient! But he meaneth 'blunt language and bluntly'--well! That is not the worst taste in these days!") "Ye may, verily, all of you be higher men," continued Zarathustra; "but for me--ye are neither high enough, nor strong enough. For me, that is to say, for the inexorable which is now silent in me, but will not always be silent. And if ye appertain to me, still it is not as my right arm. For he who himself standeth, like you, on sickly and tender legs, wisheth above all to be TREATED INDULGENTLY, whether he be conscious of it or hide it from himself. My arms and my legs, however, I do not treat indulgently, I DO NOT TREAT MY WARRIORS INDULGENTLY: how then could ye be fit for MY warfare? With you I should spoil all my victories. And many of you would tumble over if ye but heard the loud beating of my drums. Moreover, ye are not sufficiently beautiful and well-born for me. 
I require pure, smooth mirrors for my doctrines; on your surface even mine own likeness is distorted. On your shoulders presseth many a burden, many a recollection; many a mischievous dwarf squatteth in your corners. There is concealed populace also in you. And though ye be high and of a higher type, much in you is crooked and misshapen. There is no smith in the world that could hammer you right and straight for me. Ye are only bridges: may higher ones pass over upon you! Ye signify steps: so do not upbraid him who ascendeth beyond you into HIS height! Out of your seed there may one day arise for me a genuine son and perfect heir: but that time is distant. Ye yourselves are not those unto whom my heritage and name belong. Not for you do I wait here in these mountains; not with you may I descend for the last time. Ye have come unto me only as a presage that higher ones are on the way to me,-- --NOT the men of great longing, of great loathing, of great satiety, and that which ye call the remnant of God; --Nay! Nay! Three times Nay! For OTHERS do I wait here in these mountains, and will not lift my foot from thence without them; --For higher ones, stronger ones, triumphanter ones, merrier ones, for such as are built squarely in body and soul: LAUGHING LIONS must come! O my guests, ye strange ones--have ye yet heard nothing of my children? And that they are on the way to me? Do speak unto me of my gardens, of my Happy Isles, of my new beautiful race--why do ye not speak unto me thereof? This guests'-present do I solicit of your love, that ye speak unto me of my children. For them am I rich, for them I became poor: what have I not surrendered, --What would I not surrender that I might have one thing: THESE children, THIS living plantation, THESE life-trees of my will and of my highest hope!" Thus spake Zarathustra, and stopped suddenly in his discourse: for his longing came over him, and he closed his eyes and his mouth, because of the agitation of his heart. 
And all his guests also were silent, and stood still and confounded: except only that the old soothsayer made signs with his hands and his gestures. LXXII. THE SUPPER. For at this point the soothsayer interrupted the greeting of Zarathustra and his guests: he pressed forward as one who had no time to lose, seized Zarathustra's hand and exclaimed: "But Zarathustra! One thing is more necessary than the other, so sayest thou thyself: well, one thing is now more necessary UNTO ME than all others. A word at the right time: didst thou not invite me to TABLE? And here are many who have made long journeys. Thou dost not mean to feed us merely with discourses? Besides, all of you have thought too much about freezing, drowning, suffocating, and other bodily dangers: none of you, however, have thought of MY danger, namely, perishing of hunger--" (Thus spake the soothsayer. When Zarathustra's animals, however, heard these words, they ran away in terror. For they saw that all they had brought home during the day would not be enough to fill the one soothsayer.) "Likewise perishing of thirst," continued the soothsayer. "And although I hear water splashing here like words of wisdom--that is to say, plenteously and unweariedly, I--want WINE! Not every one is a born water-drinker like Zarathustra. Neither doth water suit weary and withered ones: WE deserve wine--IT alone giveth immediate vigour and improvised health!" On this occasion, when the soothsayer was longing for wine, it happened that the king on the left, the silent one, also found expression for once. "WE took care," said he, "about wine, I, along with my brother the king on the right: we have enough of wine,--a whole ass-load of it. So there is nothing lacking but bread." "Bread," replied Zarathustra, laughing when he spake, "it is precisely bread that anchorites have not. 
But man doth not live by bread alone, but also by the flesh of good lambs, of which I have two: --THESE shall we slaughter quickly, and cook spicily with sage: it is so that I like them. And there is also no lack of roots and fruits, good enough even for the fastidious and dainty,--nor of nuts and other riddles for cracking. Thus will we have a good repast in a little while. But whoever wish to eat with us must also give a hand to the work, even the kings. For with Zarathustra even a king may be a cook." This proposal appealed to the hearts of all of them, save that the voluntary beggar objected to the flesh and wine and spices. "Just hear this glutton Zarathustra!" said he jokingly: "doth one go into caves and high mountains to make such repasts? Now indeed do I understand what he once taught us: 'Blessed be moderate poverty!' And why he wisheth to do away with beggars." "Be of good cheer," replied Zarathustra, "as I am. Abide by thy customs, thou excellent one: grind thy corn, drink thy water, praise thy cooking,-- if only it make thee glad! I am a law only for mine own; I am not a law for all. He, however, who belongeth unto me must be strong of bone and light of foot,-- --Joyous in fight and feast, no sulker, no John o' Dreams, ready for the hardest task as for the feast, healthy and hale. The best belongeth unto mine and me; and if it be not given us, then do we take it:--the best food, the purest sky, the strongest thoughts, the fairest women!"-- Thus spake Zarathustra; the king on the right however answered and said: "Strange! Did one ever hear such sensible things out of the mouth of a wise man? And verily, it is the strangest thing in a wise man, if over and above, he be still sensible, and not an ass." Thus spake the king on the right and wondered; the ass however, with ill-will, said YE-A to his remark. This however was the beginning of that long repast which is called "The Supper" in the history-books. 
At this there was nothing else spoken of but THE HIGHER MAN. LXXIII. THE HIGHER MAN. 1. When I came unto men for the first time, then did I commit the anchorite folly, the great folly: I appeared on the market-place. And when I spake unto all, I spake unto none. In the evening, however, rope-dancers were my companions, and corpses; and I myself almost a corpse. With the new morning, however, there came unto me a new truth: then did I learn to say: "Of what account to me are market-place and populace and populace-noise and long populace-ears!" Ye higher men, learn THIS from me: On the market-place no one believeth in higher men. But if ye will speak there, very well! The populace, however, blinketh: "We are all equal." "Ye higher men,"--so blinketh the populace--"there are no higher men, we are all equal; man is man, before God--we are all equal!" Before God!--Now, however, this God hath died. Before the populace, however, we will not be equal. Ye higher men, away from the market-place! 2. Before God!--Now however this God hath died! Ye higher men, this God was your greatest danger. Only since he lay in the grave have ye again arisen. Now only cometh the great noontide, now only doth the higher man become--master! Have ye understood this word, O my brethren? Ye are frightened: do your hearts turn giddy? Doth the abyss here yawn for you? Doth the hell-hound here yelp at you? Well! Take heart! ye higher men! Now only travaileth the mountain of the human future. God hath died: now do WE desire--the Superman to live. 3. The most careful ask to-day: "How is man to be maintained?" Zarathustra however asketh, as the first and only one: "How is man to be SURPASSED?" The Superman, I have at heart; THAT is the first and only thing to me--and NOT man: not the neighbour, not the poorest, not the sorriest, not the best.-- O my brethren, what I can love in man is that he is an over-going and a down-going. And also in you there is much that maketh me love and hope. 
In that ye have despised, ye higher men, that maketh me hope. For the great despisers are the great reverers. In that ye have despaired, there is much to honour. For ye have not learned to submit yourselves, ye have not learned petty policy. For to-day have the petty people become master: they all preach submission and humility and policy and diligence and consideration and the long et cetera of petty virtues. Whatever is of the effeminate type, whatever originateth from the servile type, and especially the populace-mishmash:--THAT wisheth now to be master of all human destiny--O disgust! Disgust! Disgust! THAT asketh and asketh and never tireth: "How is man to maintain himself best, longest, most pleasantly?" Thereby--are they the masters of to-day. These masters of to-day--surpass them, O my brethren--these petty people: THEY are the Superman's greatest danger! Surpass, ye higher men, the petty virtues, the petty policy, the sand-grain considerateness, the ant-hill trumpery, the pitiable comfortableness, the "happiness of the greatest number"--! And rather despair than submit yourselves. And verily, I love you, because ye know not to-day how to live, ye higher men! For thus do YE live--best! 4. Have ye courage, O my brethren? Are ye stout-hearted? NOT the courage before witnesses, but anchorite and eagle courage, which not even a God any longer beholdeth? Cold souls, mules, the blind and the drunken, I do not call stout-hearted. He hath heart who knoweth fear, but VANQUISHETH it; who seeth the abyss, but with PRIDE. He who seeth the abyss, but with eagle's eyes,--he who with eagle's talons GRASPETH the abyss: he hath courage.-- 5. "Man is evil"--so said to me for consolation, all the wisest ones. Ah, if only it be still true to-day! For the evil is man's best force. "Man must become better and eviler"--so do _I_ teach. The evilest is necessary for the Superman's best. 
It may have been well for the preacher of the petty people to suffer and be burdened by men's sin. I, however, rejoice in great sin as my great CONSOLATION.-- Such things, however, are not said for long ears. Every word, also, is not suited for every mouth. These are fine far-away things: at them sheep's claws shall not grasp! 6. Ye higher men, think ye that I am here to put right what ye have put wrong? Or that I wished henceforth to make snugger couches for you sufferers? Or show you restless, miswandering, misclimbing ones, new and easier footpaths? Nay! Nay! Three times Nay! Always more, always better ones of your type shall succumb,--for ye shall always have it worse and harder. Thus only-- --Thus only groweth man aloft to the height where the lightning striketh and shattereth him: high enough for the lightning! Towards the few, the long, the remote go forth my soul and my seeking: of what account to me are your many little, short miseries! Ye do not yet suffer enough for me! For ye suffer from yourselves, ye have not yet suffered FROM MAN. Ye would lie if ye spake otherwise! None of you suffereth from what _I_ have suffered.-- 7. It is not enough for me that the lightning no longer doeth harm. I do not wish to conduct it away: it shall learn--to work for ME.-- My wisdom hath accumulated long like a cloud, it becometh stiller and darker. So doeth all wisdom which shall one day bear LIGHTNINGS.-- Unto these men of to-day will I not be LIGHT, nor be called light. THEM-- will I blind: lightning of my wisdom! put out their eyes! 8. Do not will anything beyond your power: there is a bad falseness in those who will beyond their power. Especially when they will great things! For they awaken distrust in great things, these subtle false-coiners and stage-players:-- --Until at last they are false towards themselves, squint-eyed, whited cankers, glossed over with strong words, parade virtues and brilliant false deeds. Take good care there, ye higher men! 
For nothing is more precious to me, and rarer, than honesty. Is this to-day not that of the populace? The populace however knoweth not what is great and what is small, what is straight and what is honest: it is innocently crooked, it ever lieth. 9. Have a good distrust to-day, ye higher men, ye enheartened ones! Ye open-hearted ones! And keep your reasons secret! For this to-day is that of the populace. What the populace once learned to believe without reasons, who could--refute it to them by means of reasons? And on the market-place one convinceth with gestures. But reasons make the populace distrustful. And when truth hath once triumphed there, then ask yourselves with good distrust: "What strong error hath fought for it?" Be on your guard also against the learned! They hate you, because they are unproductive! They have cold, withered eyes before which every bird is unplumed. Such persons vaunt about not lying: but inability to lie is still far from being love to truth. Be on your guard! Freedom from fever is still far from being knowledge! Refrigerated spirits I do not believe in. He who cannot lie, doth not know what truth is. 10. If ye would go up high, then use your own legs! Do not get yourselves CARRIED aloft; do not seat yourselves on other people's backs and heads! Thou hast mounted, however, on horseback? Thou now ridest briskly up to thy goal? Well, my friend! But thy lame foot is also with thee on horseback! When thou reachest thy goal, when thou alightest from thy horse: precisely on thy HEIGHT, thou higher man,--then wilt thou stumble! 11. Ye creating ones, ye higher men! One is only pregnant with one's own child. Do not let yourselves be imposed upon or put upon! Who then is YOUR neighbour? Even if ye act "for your neighbour"--ye still do not create for him! Unlearn, I pray you, this "for," ye creating ones: your very virtue wisheth you to have naught to do with "for" and "on account of" and "because."
Against these false little words shall ye stop your ears. "For one's neighbour," is the virtue only of the petty people: there it is said "like and like," and "hand washeth hand":--they have neither the right nor the power for YOUR self-seeking! In your self-seeking, ye creating ones, there is the foresight and foreseeing of the pregnant! What no one's eye hath yet seen, namely, the fruit--this, sheltereth and saveth and nourisheth your entire love. Where your entire love is, namely, with your child, there is also your entire virtue! Your work, your will is YOUR "neighbour": let no false values impose upon you! 12. Ye creating ones, ye higher men! Whoever hath to give birth is sick; whoever hath given birth, however, is unclean. Ask women: one giveth birth, not because it giveth pleasure. The pain maketh hens and poets cackle. Ye creating ones, in you there is much uncleanliness. That is because ye have had to be mothers. A new child: oh, how much new filth hath also come into the world! Go apart! He who hath given birth shall wash his soul! 13. Be not virtuous beyond your powers! And seek nothing from yourselves opposed to probability! Walk in the footsteps in which your fathers' virtue hath already walked! How would ye rise high, if your fathers' will should not rise with you? He, however, who would be a firstling, let him take care lest he also become a lastling! And where the vices of your fathers are, there should ye not set up as saints! He whose fathers were inclined for women, and for strong wine and flesh of wildboar swine; what would it be if he demanded chastity of himself? A folly would it be! Much, verily, doth it seem to me for such a one, if he should be the husband of one or of two or of three women. And if he founded monasteries, and inscribed over their portals: "The way to holiness,"--I should still say: What good is it! it is a new folly! He hath founded for himself a penance-house and refuge-house: much good may it do! 
But I do not believe in it. In solitude there groweth what any one bringeth into it--also the brute in one's nature. Thus is solitude inadvisable unto many. Hath there ever been anything filthier on earth than the saints of the wilderness? AROUND THEM was not only the devil loose--but also the swine. 14. Shy, ashamed, awkward, like the tiger whose spring hath failed--thus, ye higher men, have I often seen you slink aside. A CAST which ye made had failed. But what doth it matter, ye dice-players! Ye had not learned to play and mock, as one must play and mock! Do we not ever sit at a great table of mocking and playing? And if great things have been a failure with you, have ye yourselves therefore--been a failure? And if ye yourselves have been a failure, hath man therefore--been a failure? If man, however, hath been a failure: well then! never mind! 15. The higher its type, always the seldomer doth a thing succeed. Ye higher men here, have ye not all--been failures? Be of good cheer; what doth it matter? How much is still possible! Learn to laugh at yourselves, as ye ought to laugh! What wonder even that ye have failed and only half-succeeded, ye half- shattered ones! Doth not--man's FUTURE strive and struggle in you? Man's furthest, profoundest, star-highest issues, his prodigious powers--do not all these foam through one another in your vessel? What wonder that many a vessel shattereth! Learn to laugh at yourselves, as ye ought to laugh! Ye higher men, Oh, how much is still possible! And verily, how much hath already succeeded! How rich is this earth in small, good, perfect things, in well-constituted things! Set around you small, good, perfect things, ye higher men. Their golden maturity healeth the heart. The perfect teacheth one to hope. 16. What hath hitherto been the greatest sin here on earth? Was it not the word of him who said: "Woe unto them that laugh now!" Did he himself find no cause for laughter on the earth? Then he sought badly. 
A child even findeth cause for it. He--did not love sufficiently: otherwise would he also have loved us, the laughing ones! But he hated and hooted us; wailing and teeth-gnashing did he promise us. Must one then curse immediately, when one doth not love? That--seemeth to me bad taste. Thus did he, however, this absolute one. He sprang from the populace. And he himself just did not love sufficiently; otherwise would he have raged less because people did not love him. All great love doth not SEEK love:--it seeketh more. Go out of the way of all such absolute ones! They are a poor sickly type, a populace-type: they look at this life with ill-will, they have an evil eye for this earth. Go out of the way of all such absolute ones! They have heavy feet and sultry hearts:--they do not know how to dance. How could the earth be light to such ones! 17. Tortuously do all good things come nigh to their goal. Like cats they curve their backs, they purr inwardly with their approaching happiness,-- all good things laugh. His step betrayeth whether a person already walketh on HIS OWN path: just see me walk! He, however, who cometh nigh to his goal, danceth. And verily, a statue have I not become, not yet do I stand there stiff, stupid and stony, like a pillar; I love fast racing. And though there be on earth fens and dense afflictions, he who hath light feet runneth even across the mud, and danceth, as upon well-swept ice. Lift up your hearts, my brethren, high, higher! And do not forget your legs! Lift up also your legs, ye good dancers, and better still, if ye stand upon your heads! 18. This crown of the laughter, this rose-garland crown: I myself have put on this crown, I myself have consecrated my laughter. No one else have I found to-day potent enough for this. 
Zarathustra the dancer, Zarathustra the light one, who beckoneth with his pinions, one ready for flight, beckoning unto all birds, ready and prepared, a blissfully light-spirited one:-- Zarathustra the soothsayer, Zarathustra the sooth-laugher, no impatient one, no absolute one, one who loveth leaps and side-leaps; I myself have put on this crown! 19. Lift up your hearts, my brethren, high, higher! And do not forget your legs! Lift up also your legs, ye good dancers, and better still if ye stand upon your heads! There are also heavy animals in a state of happiness, there are club-footed ones from the beginning. Curiously do they exert themselves, like an elephant which endeavoureth to stand upon its head. Better, however, to be foolish with happiness than foolish with misfortune, better to dance awkwardly than walk lamely. So learn, I pray you, my wisdom, ye higher men: even the worst thing hath two good reverse sides,-- --Even the worst thing hath good dancing-legs: so learn, I pray you, ye higher men, to put yourselves on your proper legs! So unlearn, I pray you, the sorrow-sighing, and all the populace-sadness! Oh, how sad the buffoons of the populace seem to me to-day! This to-day, however, is that of the populace. 20. Do like unto the wind when it rusheth forth from its mountain-caves: unto its own piping will it dance; the seas tremble and leap under its footsteps. That which giveth wings to asses, that which milketh the lionesses:-- praised be that good, unruly spirit, which cometh like a hurricane unto all the present and unto all the populace,-- --Which is hostile to thistle-heads and puzzle-heads, and to all withered leaves and weeds:--praised be this wild, good, free spirit of the storm, which danceth upon fens and afflictions, as upon meadows! 
Which hateth the consumptive populace-dogs, and all the ill-constituted, sullen brood:--praised be this spirit of all free spirits, the laughing storm, which bloweth dust into the eyes of all the melanopic and melancholic! Ye higher men, the worst thing in you is that ye have none of you learned to dance as ye ought to dance--to dance beyond yourselves! What doth it matter that ye have failed! How many things are still possible! So LEARN to laugh beyond yourselves! Lift up your hearts, ye good dancers, high! higher! And do not forget the good laughter! This crown of the laughter, this rose-garland crown: to you my brethren do I cast this crown! Laughing have I consecrated; ye higher men, LEARN, I pray you--to laugh! LXXIV. THE SONG OF MELANCHOLY. 1. When Zarathustra spake these sayings, he stood nigh to the entrance of his cave; with the last words, however, he slipped away from his guests, and fled for a little while into the open air. "O pure odours around me," cried he, "O blessed stillness around me! But where are mine animals? Hither, hither, mine eagle and my serpent! Tell me, mine animals: these higher men, all of them--do they perhaps not SMELL well? O pure odours around me! Now only do I know and feel how I love you, mine animals." --And Zarathustra said once more: "I love you, mine animals!" The eagle, however, and the serpent pressed close to him when he spake these words, and looked up to him. In this attitude were they all three silent together, and sniffed and sipped the good air with one another. For the air here outside was better than with the higher men. 2. Hardly, however, had Zarathustra left the cave when the old magician got up, looked cunningly about him, and said: "He is gone! 
And already, ye higher men--let me tickle you with this complimentary and flattering name, as he himself doeth--already doth mine evil spirit of deceit and magic attack me, my melancholy devil, --Which is an adversary to this Zarathustra from the very heart: forgive it for this! Now doth it wish to conjure before you, it hath just ITS hour; in vain do I struggle with this evil spirit. Unto all of you, whatever honours ye like to assume in your names, whether ye call yourselves 'the free spirits' or 'the conscientious,' or 'the penitents of the spirit,' or 'the unfettered,' or 'the great longers,'-- --Unto all of you, who like me suffer FROM THE GREAT LOATHING, to whom the old God hath died, and as yet no new God lieth in cradles and swaddling clothes--unto all of you is mine evil spirit and magic-devil favourable. I know you, ye higher men, I know him,--I know also this fiend whom I love in spite of me, this Zarathustra: he himself often seemeth to me like the beautiful mask of a saint, --Like a new strange mummery in which mine evil spirit, the melancholy devil, delighteth:--I love Zarathustra, so doth it often seem to me, for the sake of mine evil spirit.-- But already doth IT attack me and constrain me, this spirit of melancholy, this evening-twilight devil: and verily, ye higher men, it hath a longing-- --Open your eyes!--it hath a longing to come NAKED, whether male or female, I do not yet know: but it cometh, it constraineth me, alas! open your wits! The day dieth out, unto all things cometh now the evening, also unto the best things; hear now, and see, ye higher men, what devil--man or woman-- this spirit of evening-melancholy is!" Thus spake the old magician, looked cunningly about him, and then seized his harp. 3. 
In evening's limpid air,
What time the dew's soothings
Unto the earth downpour,
Invisibly and unheard--
For tender shoe-gear wear
The soothing dews, like all that's kind-gentle--:
Bethinkst thou then, bethinkst thou, burning heart,
How once thou thirstedest
For heaven's kindly teardrops and dew's down-droppings,
All singed and weary thirstedest,
What time on yellow grass-pathways
Wicked, occidental sunny glances
Through sombre trees about thee sported,
Blindingly sunny glow-glances, gladly-hurting?

"Of TRUTH the wooer? Thou?"--so taunted they--
"Nay! Merely poet!
A brute insidious, plundering, grovelling,
That aye must lie,
That wittingly, wilfully, aye must lie:
For booty lusting,
Motley masked,
Self-hidden, shrouded,
Himself his booty--
HE--of truth the wooer?
Nay! Mere fool! Mere poet!
Just motley speaking,
From mask of fool confusedly shouting,
Circumambling on fabricated word-bridges,
On motley rainbow-arches,
'Twixt the spurious heavenly,
And spurious earthly,
Round us roving, round us soaring,--
MERE FOOL! MERE POET!

HE--of truth the wooer?
Not still, stiff, smooth and cold,
Become an image,
A godlike statue,
Set up in front of temples,
As a God's own door-guard:
Nay! hostile to all such truthfulness-statues,
In every desert homelier than at temples,
With cattish wantonness,
Through every window leaping
Quickly into chances,
Every wild forest a-sniffing,
Greedily-longingly, sniffing,
That thou, in wild forests,
'Mong the motley-speckled fierce creatures,
Shouldest rove, sinful-sound and fine-coloured,
With longing lips smacking,
Blessedly mocking, blessedly hellish, blessedly bloodthirsty,
Robbing, skulking, lying--roving:--

Or unto eagles like which fixedly,
Long adown the precipice look,
Adown THEIR precipice:--
Oh, how they whirl down now,
Thereunder, therein,
To ever deeper profoundness whirling!--
Then,
Sudden,
With aim aright,
With quivering flight,
On LAMBKINS pouncing,
Headlong down, sore-hungry,
For lambkins longing,
Fierce 'gainst all lamb-spirits,
Furious-fierce all that look
Sheeplike, or lambeyed, or crisp-woolly,
--Grey, with lambsheep kindliness!

Even thus,
Eaglelike, pantherlike,
Are the poet's desires,
Are THINE OWN desires 'neath a thousand guises,
Thou fool! Thou poet!
Thou who all mankind viewedst--
So God, as sheep--:
The God TO REND within mankind,
As the sheep in mankind,
And in rending LAUGHING--

THAT, THAT is thine own blessedness!
Of a panther and eagle--blessedness!
Of a poet and fool--the blessedness!--

In evening's limpid air,
What time the moon's sickle,
Green, 'twixt the purple-glowings,
And jealous, steal'th forth:
--Of day the foe,
With every step in secret,
The rosy garland-hammocks
Downsickling, till they've sunken
Down nightwards, faded, downsunken:--

Thus had I sunken one day
From mine own truth-insanity,
From mine own fervid day-longings,
Of day aweary, sick of sunshine,
--Sunk downwards, evenwards, shadowwards:
By one sole trueness
All scorched and thirsty:
--Bethinkst thou still, bethinkst thou, burning heart,
How then thou thirstedest?--
THAT I SHOULD BANNED BE
FROM ALL THE TRUENESS!
MERE FOOL! MERE POET!

LXXV. SCIENCE.
Thus sang the magician; and all who were present went like birds unawares into the net of his artful and melancholy voluptuousness. Only the spiritually conscientious one had not been caught: he at once snatched the harp from the magician and called out: "Air! Let in good air! Let in Zarathustra! Thou makest this cave sultry and poisonous, thou bad old magician! Thou seducest, thou false one, thou subtle one, to unknown desires and deserts. And alas, that such as thou should talk and make ado about the TRUTH! Alas, to all free spirits who are not on their guard against SUCH magicians! It is all over with their freedom: thou teachest and temptest back into prisons,-- --Thou old melancholy devil, out of thy lament soundeth a lurement: thou resemblest those who with their praise of chastity secretly invite to voluptuousness!" Thus spake the conscientious one; the old magician, however, looked about him, enjoying his triumph, and on that account put up with the annoyance which the conscientious one caused him. "Be still!" said he with modest voice, "good songs want to re-echo well; after good songs one should be long silent. Thus do all those present, the higher men. Thou, however, hast perhaps understood but little of my song? In thee there is little of the magic spirit." "Thou praisest me," replied the conscientious one, "in that thou separatest me from thyself; very well! But, ye others, what do I see? Ye still sit there, all of you, with lusting eyes--: Ye free spirits, whither hath your freedom gone! Ye almost seem to me to resemble those who have long looked at bad girls dancing naked: your souls themselves dance! In you, ye higher men, there must be more of that which the magician calleth his evil spirit of magic and deceit:--we must indeed be different. And verily, we spake and thought long enough together ere Zarathustra came home to his cave, for me not to be unaware that we ARE different. We SEEK different things even here aloft, ye and I.
For I seek more SECURITY; on that account have I come to Zarathustra. For he is still the most steadfast tower and will-- --To-day, when everything tottereth, when all the earth quaketh. Ye, however, when I see what eyes ye make, it almost seemeth to me that ye seek MORE INSECURITY, --More horror, more danger, more earthquake. Ye long (it almost seemeth so to me--forgive my presumption, ye higher men)-- --Ye long for the worst and dangerousest life, which frighteneth ME most,--for the life of wild beasts, for forests, caves, steep mountains and labyrinthine gorges. And it is not those who lead OUT OF danger that please you best, but those who lead you away from all paths, the misleaders. But if such longing in you be ACTUAL, it seemeth to me nevertheless to be IMPOSSIBLE. For fear--that is man's original and fundamental feeling; through fear everything is explained, original sin and original virtue. Through fear there grew also MY virtue, that is to say: Science. For fear of wild animals--that hath been longest fostered in man, inclusive of the animal which he concealeth and feareth in himself:--Zarathustra calleth it 'the beast inside.' Such prolonged ancient fear, at last become subtle, spiritual and intellectual--at present, methinketh, it is called SCIENCE."-- Thus spake the conscientious one; but Zarathustra, who had just come back into his cave and had heard and divined the last discourse, threw a handful of roses to the conscientious one, and laughed on account of his "truths." "Why!" he exclaimed, "what did I hear just now? Verily, it seemeth to me, thou art a fool, or else I myself am one: and quietly and quickly will I put thy 'truth' upside down. For FEAR--is an exception with us. Courage, however, and adventure, and delight in the uncertain, in the unattempted--COURAGE seemeth to me the entire primitive history of man. The wildest and most courageous animals hath he envied and robbed of all their virtues: thus only did he become--man.
THIS courage, at last become subtle, spiritual and intellectual, this human courage, with eagle's pinions and serpent's wisdom: THIS, it seemeth to me, is called at present--" "ZARATHUSTRA!" cried all of them there assembled, as if with one voice, and burst out at the same time into a great laughter; there arose, however, from them as it were a heavy cloud. Even the magician laughed, and said wisely: "Well! It is gone, mine evil spirit! And did I not myself warn you against it when I said that it was a deceiver, a lying and deceiving spirit? Especially when it showeth itself naked. But what can _I_ do with regard to its tricks! Have _I_ created it and the world? Well! Let us be good again, and of good cheer! And although Zarathustra looketh with evil eye--just see him! he disliketh me--: --Ere night cometh will he again learn to love and laud me; he cannot live long without committing such follies. HE--loveth his enemies: this art knoweth he better than any one I have seen. But he taketh revenge for it--on his friends!" Thus spake the old magician, and the higher men applauded him; so that Zarathustra went round, and mischievously and lovingly shook hands with his friends,--like one who hath to make amends and apologise to every one for something. When however he had thereby come to the door of his cave, lo, then had he again a longing for the good air outside, and for his animals, --and wished to steal out. LXXVI. AMONG DAUGHTERS OF THE DESERT. 1. "Go not away!" said then the wanderer who called himself Zarathustra's shadow, "abide with us--otherwise the old gloomy affliction might again fall upon us. Now hath that old magician given us of his worst for our good, and lo! the good, pious pope there hath tears in his eyes, and hath quite embarked again upon the sea of melancholy. Those kings may well put on a good air before us still: for that have THEY learned best of us all at present! 
Had they however no one to see them, I wager that with them also the bad game would again commence,-- --The bad game of drifting clouds, of damp melancholy, of curtained heavens, of stolen suns, of howling autumn-winds, --The bad game of our howling and crying for help! Abide with us, O Zarathustra! Here there is much concealed misery that wisheth to speak, much evening, much cloud, much damp air! Thou hast nourished us with strong food for men, and powerful proverbs: do not let the weakly, womanly spirits attack us anew at dessert! Thou alone makest the air around thee strong and clear! Did I ever find anywhere on earth such good air as with thee in thy cave? Many lands have I seen, my nose hath learned to test and estimate many kinds of air: but with thee do my nostrils taste their greatest delight! Unless it be,--unless it be--, do forgive an old recollection! Forgive me an old after-dinner song, which I once composed amongst daughters of the desert:-- For with them was there equally good, clear, Oriental air; there was I furthest from cloudy, damp, melancholy Old-Europe! Then did I love such Oriental maidens and other blue kingdoms of heaven, over which hang no clouds and no thoughts. Ye would not believe how charmingly they sat there, when they did not dance, profound, but without thoughts, like little secrets, like beribboned riddles, like dessert-nuts-- Many-hued and foreign, forsooth! but without clouds: riddles which can be guessed: to please such maidens I then composed an after-dinner psalm." Thus spake the wanderer who called himself Zarathustra's shadow; and before any one answered him, he had seized the harp of the old magician, crossed his legs, and looked calmly and sagely around him:--with his nostrils, however, he inhaled the air slowly and questioningly, like one who in new countries tasteth new foreign air. Afterward he began to sing with a kind of roaring. 2.

THE DESERTS GROW: WOE HIM WHO DOTH THEM HIDE!

--Ha! Solemnly!
In effect solemnly!
A worthy beginning!
Afric manner, solemnly!
Of a lion worthy,
Or perhaps of a virtuous howl-monkey--
--But it's naught to you,
Ye friendly damsels dearly loved,
At whose own feet to me,
The first occasion,
To a European under palm-trees,
A seat is now granted. Selah.

Wonderful, truly!
Here do I sit now,
The desert nigh, and yet I am
So far still from the desert,
Even in naught yet deserted:
That is, I'm swallowed down
By this the smallest oasis--:
--It opened up just yawning,
Its loveliest mouth agape,
Most sweet-odoured of all mouthlets:
Then fell I right in,
Right down, right through--in 'mong you,
Ye friendly damsels dearly loved! Selah.

Hail! hail! to that whale, fishlike,
If it thus for its guest's convenience
Made things nice!--(ye well know,
Surely, my learned allusion?)
Hail to its belly,
If it had e'er
A such loveliest oasis-belly
As this is: though however I doubt about it,
--With this come I out of Old-Europe,
That doubt'th more eagerly than doth any
Elderly married woman.
May the Lord improve it!
Amen!

Here do I sit now,
In this the smallest oasis,
Like a date indeed,
Brown, quite sweet, gold-suppurating,
For rounded mouth of maiden longing,
But yet still more for youthful, maidlike,
Ice-cold and snow-white and incisory
Front teeth: and for such assuredly,
Pine the hearts all of ardent date-fruits. Selah.

To the there-named south-fruits now,
Similar, all-too-similar,
Do I lie here; by little
Flying insects
Round-sniffled and round-played,
And also by yet littler,
Foolisher, and peccabler
Wishes and phantasies,--
Environed by you,
Ye silent, presentientest
Maiden-kittens,
Dudu and Suleika,
--ROUNDSPHINXED, that into one word
I may crowd much feeling:
(Forgive me, O God,
All such speech-sinning!)
--Sit I here the best of air sniffling,
Paradisal air, truly,
Bright and buoyant air, golden-mottled,
As goodly air as ever
From lunar orb downfell--
Be it by hazard,
Or supervened it by arrogancy?
As the ancient poets relate it.
But doubter, I'm now calling it
In question: with this do I come indeed
Out of Europe,
That doubt'th more eagerly than doth any
Elderly married woman.
May the Lord improve it!
Amen.

This the finest air drinking,
With nostrils out-swelled like goblets,
Lacking future, lacking remembrances
Thus do I sit here, ye
Friendly damsels dearly loved,
And look at the palm-tree there,
How it, to a dance-girl, like,
Doth bow and bend and on its haunches bob,
--One doth it too, when one view'th it long!--
To a dance-girl like, who as it seem'th to me,
Too long, and dangerously persistent,
Always, always, just on SINGLE leg hath stood?
--Then forgot she thereby, as it seem'th to me,
The OTHER leg?
For vainly I, at least,
Did search for the amissing
Fellow-jewel
--Namely, the other leg--
In the sanctified precincts,
Nigh her very dearest, very tenderest,
Flapping and fluttering and flickering skirting.
Yea, if ye should, ye beauteous friendly ones,
Quite take my word:
She hath, alas! LOST it!
Hu! Hu! Hu! Hu! Hu!
It is away!
For ever away!
The other leg!
Oh, pity for that loveliest other leg!
Where may it now tarry, all-forsaken weeping?
The lonesomest leg?
In fear perhaps before a
Furious, yellow, blond and curled
Leonine monster? Or perhaps even
Gnawed away, nibbled badly--
Most wretched, woeful! woeful! nibbled badly! Selah.

Oh, weep ye not,
Gentle spirits!
Weep ye not, ye
Date-fruit spirits! Milk-bosoms!
Ye sweetwood-heart
Purselets!
Weep ye no more,
Pallid Dudu!
Be a man, Suleika! Bold! Bold!
--Or else should there perhaps
Something strengthening, heart-strengthening,
Here most proper be?
Some inspiring text?
Some solemn exhortation?--

Ha! Up now! honour!
Moral honour! European honour!
Blow again, continue,
Bellows-box of virtue!
Ha!
Once more thy roaring,
Thy moral roaring!
As a virtuous lion
Nigh the daughters of deserts roaring!
--For virtue's out-howl,
Ye very dearest maidens,
Is more than every
European fervour, European hot-hunger!
And now do I stand here,
As European,
I can't be different, God's help to me!
Amen!

THE DESERTS GROW: WOE HIM WHO DOTH THEM HIDE!

LXXVII. THE AWAKENING. 1. After the song of the wanderer and shadow, the cave became all at once full of noise and laughter: and since the assembled guests all spake simultaneously, and even the ass, encouraged thereby, no longer remained silent, a little aversion and scorn for his visitors came over Zarathustra, although he rejoiced at their gladness. For it seemed to him a sign of convalescence. So he slipped out into the open air and spake to his animals. "Whither hath their distress now gone?" said he, and already did he himself feel relieved of his petty disgust--"with me, it seemeth that they have unlearned their cries of distress! --Though, alas! not yet their crying." And Zarathustra stopped his ears, for just then did the YE-A of the ass mix strangely with the noisy jubilation of those higher men. "They are merry," he began again, "and who knoweth? perhaps at their host's expense; and if they have learned of me to laugh, still it is not MY laughter they have learned. But what matter about that! They are old people: they recover in their own way, they laugh in their own way; mine ears have already endured worse and have not become peevish. This day is a victory: he already yieldeth, he fleeth, THE SPIRIT OF GRAVITY, mine old arch-enemy! How well this day is about to end, which began so badly and gloomily! And it is ABOUT TO end. Already cometh the evening: over the sea rideth it hither, the good rider! How it bobbeth, the blessed one, the home-returning one, in its purple saddles! The sky gazeth brightly thereon, the world lieth deep. Oh, all ye strange ones who have come to me, it is already worth while to have lived with me!" Thus spake Zarathustra.
And again came the cries and laughter of the higher men out of the cave: then began he anew: "They bite at it, my bait taketh, there departeth also from them their enemy, the spirit of gravity. Now do they learn to laugh at themselves: do I hear rightly? My virile food taketh effect, my strong and savoury sayings: and verily, I did not nourish them with flatulent vegetables! But with warrior-food, with conqueror-food: new desires did I awaken. New hopes are in their arms and legs, their hearts expand. They find new words, soon will their spirits breathe wantonness. Such food may sure enough not be proper for children, nor even for longing girls old and young. One persuadeth their bowels otherwise; I am not their physician and teacher. The DISGUST departeth from these higher men; well! that is my victory. In my domain they become assured; all stupid shame fleeth away; they empty themselves. They empty their hearts, good times return unto them, they keep holiday and ruminate,--they become THANKFUL. THAT do I take as the best sign: they become thankful. Not long will it be ere they devise festivals, and put up memorials to their old joys. They are CONVALESCENTS!" Thus spake Zarathustra joyfully to his heart and gazed outward; his animals, however, pressed up to him, and honoured his happiness and his silence. 2. All on a sudden however, Zarathustra's ear was frightened: for the cave which had hitherto been full of noise and laughter, became all at once still as death;--his nose, however, smelt a sweet-scented vapour and incense-odour, as if from burning pine-cones. "What happeneth? What are they about?" he asked himself, and stole up to the entrance, that he might be able unobserved to see his guests. But wonder upon wonder! what was he then obliged to behold with his own eyes! "They have all of them become PIOUS again, they PRAY, they are mad!"--said he, and was astonished beyond measure. And forsooth! 
all these higher men, the two kings, the pope out of service, the evil magician, the voluntary beggar, the wanderer and shadow, the old soothsayer, the spiritually conscientious one, and the ugliest man--they all lay on their knees like children and credulous old women, and worshipped the ass. And just then began the ugliest man to gurgle and snort, as if something unutterable in him tried to find expression; when, however, he had actually found words, behold! it was a pious, strange litany in praise of the adored and censed ass. And the litany sounded thus:

Amen! And glory and honour and wisdom and thanks and praise and strength be to our God, from everlasting to everlasting!
--The ass, however, here brayed YE-A.

He carrieth our burdens, he hath taken upon him the form of a servant, he is patient of heart and never saith Nay; and he who loveth his God chastiseth him.
--The ass, however, here brayed YE-A.

He speaketh not: except that he ever saith Yea to the world which he created: thus doth he extol his world. It is his artfulness that speaketh not: thus is he rarely found wrong.
--The ass, however, here brayed YE-A.

Uncomely goeth he through the world. Grey is the favourite colour in which he wrappeth his virtue. Hath he spirit, then doth he conceal it; every one, however, believeth in his long ears.
--The ass, however, here brayed YE-A.

What hidden wisdom it is to wear long ears, and only to say Yea and never Nay! Hath he not created the world in his own image, namely, as stupid as possible?
--The ass, however, here brayed YE-A.

Thou goest straight and crooked ways; it concerneth thee little what seemeth straight or crooked unto us men. Beyond good and evil is thy domain. It is thine innocence not to know what innocence is.
--The ass, however, here brayed YE-A.

Lo! how thou spurnest none from thee, neither beggars nor kings. Thou sufferest little children to come unto thee, and when the bad boys decoy thee, then sayest thou simply, YE-A.
--The ass, however, here brayed YE-A. Thou lovest she-asses and fresh figs, thou art no food-despiser. A thistle tickleth thy heart when thou chancest to be hungry. There is the wisdom of a God therein. --The ass, however, here brayed YE-A. LXXVIII. THE ASS-FESTIVAL. 1. At this place in the litany, however, Zarathustra could no longer control himself; he himself cried out YE-A, louder even than the ass, and sprang into the midst of his maddened guests. "Whatever are you about, ye grown-up children?" he exclaimed, pulling up the praying ones from the ground. "Alas, if any one else, except Zarathustra, had seen you: Every one would think you the worst blasphemers, or the very foolishest old women, with your new belief! And thou thyself, thou old pope, how is it in accordance with thee, to adore an ass in such a manner as God?"-- "O Zarathustra," answered the pope, "forgive me, but in divine matters I am more enlightened even than thou. And it is right that it should be so. Better to adore God so, in this form, than in no form at all! Think over this saying, mine exalted friend: thou wilt readily divine that in such a saying there is wisdom. He who said 'God is a Spirit'--made the greatest stride and slide hitherto made on earth towards unbelief: such a dictum is not easily amended again on earth! Mine old heart leapeth and boundeth because there is still something to adore on earth. Forgive it, O Zarathustra, to an old, pious pontiff-heart!--" --"And thou," said Zarathustra to the wanderer and shadow, "thou callest and thinkest thyself a free spirit? And thou here practisest such idolatry and hierolatry? Worse verily, doest thou here than with thy bad brown girls, thou bad, new believer!" "It is sad enough," answered the wanderer and shadow, "thou art right: but how can I help it! The old God liveth again, O Zarathustra, thou mayst say what thou wilt. The ugliest man is to blame for it all: he hath reawakened him. 
And if he say that he once killed him, with Gods DEATH is always just a prejudice." --"And thou," said Zarathustra, "thou bad old magician, what didst thou do! Who ought to believe any longer in thee in this free age, when THOU believest in such divine donkeyism? It was a stupid thing that thou didst; how couldst thou, a shrewd man, do such a stupid thing!" "O Zarathustra," answered the shrewd magician, "thou art right, it was a stupid thing,--it was also repugnant to me." --"And thou even," said Zarathustra to the spiritually conscientious one, "consider, and put thy finger to thy nose! Doth nothing go against thy conscience here? Is thy spirit not too cleanly for this praying and the fumes of those devotees?" "There is something therein," said the spiritually conscientious one, and put his finger to his nose, "there is something in this spectacle which even doeth good to my conscience. Perhaps I dare not believe in God: certain it is however, that God seemeth to me most worthy of belief in this form. God is said to be eternal, according to the testimony of the most pious: he who hath so much time taketh his time. As slow and as stupid as possible: THEREBY can such a one nevertheless go very far. And he who hath too much spirit might well become infatuated with stupidity and folly. Think of thyself, O Zarathustra! Thou thyself--verily! even thou couldst well become an ass through superabundance of wisdom. Doth not the true sage willingly walk on the crookedest paths? The evidence teacheth it, O Zarathustra,--THINE OWN evidence!" --"And thou thyself, finally," said Zarathustra, and turned towards the ugliest man, who still lay on the ground stretching up his arm to the ass (for he gave it wine to drink). "Say, thou nondescript, what hast thou been about! Thou seemest to me transformed, thine eyes glow, the mantle of the sublime covereth thine ugliness: WHAT didst thou do? Is it then true what they say, that thou hast again awakened him? And why? 
Was he not for good reasons killed and made away with? Thou thyself seemest to me awakened: what didst thou do? why didst THOU turn round? Why didst THOU get converted? Speak, thou nondescript!" "O Zarathustra," answered the ugliest man, "thou art a rogue! Whether HE yet liveth, or again liveth, or is thoroughly dead--which of us both knoweth that best? I ask thee. One thing however do I know,--from thyself did I learn it once, O Zarathustra: he who wanteth to kill most thoroughly, LAUGHETH. 'Not by wrath but by laughter doth one kill'--thus spakest thou once, O Zarathustra, thou hidden one, thou destroyer without wrath, thou dangerous saint,--thou art a rogue!" 2. Then, however, did it come to pass that Zarathustra, astonished at such merely roguish answers, jumped back to the door of his cave, and turning towards all his guests, cried out with a strong voice: "O ye wags, all of you, ye buffoons! Why do ye dissemble and disguise yourselves before me! How the hearts of all of you convulsed with delight and wickedness, because ye had at last become again like little children--namely, pious,-- --Because ye at last did again as children do--namely, prayed, folded your hands and said 'good God'! But now leave, I pray you, THIS nursery, mine own cave, where to-day all childishness is carried on. Cool down, here outside, your hot child-wantonness and heart-tumult! To be sure: except ye become as little children ye shall not enter into THAT kingdom of heaven." (And Zarathustra pointed aloft with his hands.) "But we do not at all want to enter into the kingdom of heaven: we have become men,--SO WE WANT THE KINGDOM OF EARTH." 3. And once more began Zarathustra to speak. "O my new friends," said he,--"ye strange ones, ye higher men, how well do ye now please me,-- --Since ye have again become joyful! Ye have, verily, all blossomed forth: it seemeth to me that for such flowers as you, NEW FESTIVALS are required. 
--A little valiant nonsense, some divine service and ass-festival, some old joyful Zarathustra fool, some blusterer to blow your souls bright. Forget not this night and this ass-festival, ye higher men! THAT did ye devise when with me, that do I take as a good omen,--such things only the convalescents devise! And should ye celebrate it again, this ass-festival, do it from love to yourselves, do it also from love to me! And in remembrance of me!" Thus spake Zarathustra. LXXIX. THE DRUNKEN SONG. 1. Meanwhile one after another had gone out into the open air, and into the cool, thoughtful night; Zarathustra himself, however, led the ugliest man by the hand, that he might show him his night-world, and the great round moon, and the silvery water-falls near his cave. There they at last stood still beside one another; all of them old people, but with comforted, brave hearts, and astonished in themselves that it was so well with them on earth; the mystery of the night, however, came nigher and nigher to their hearts. And anew Zarathustra thought to himself: "Oh, how well do they now please me, these higher men!"--but he did not say it aloud, for he respected their happiness and their silence.-- Then, however, there happened that which in this astonishing long day was most astonishing: the ugliest man began once more and for the last time to gurgle and snort, and when he had at length found expression, behold! there sprang a question plump and plain out of his mouth, a good, deep, clear question, which moved the hearts of all who listened to him. "My friends, all of you," said the ugliest man, "what think ye? For the sake of this day--_I_ am for the first time content to have lived mine entire life. And that I testify so much is still not enough for me. It is worth while living on the earth: one day, one festival with Zarathustra, hath taught me to love the earth. 'Was THAT--life?' will I say unto death. 'Well! Once more!' My friends, what think ye? 
Will ye not, like me, say unto death: 'Was THAT--life? For the sake of Zarathustra, well! Once more!'"-- Thus spake the ugliest man; it was not, however, far from midnight. And what took place then, think ye? As soon as the higher men heard his question, they became all at once conscious of their transformation and convalescence, and of him who was the cause thereof: then did they rush up to Zarathustra, thanking, honouring, caressing him, and kissing his hands, each in his own peculiar way; so that some laughed and some wept. The old soothsayer, however, danced with delight; and though he was then, as some narrators suppose, full of sweet wine, he was certainly still fuller of sweet life, and had renounced all weariness. There are even those who narrate that the ass then danced: for not in vain had the ugliest man previously given it wine to drink. That may be the case, or it may be otherwise; and if in truth the ass did not dance that evening, there nevertheless happened then greater and rarer wonders than the dancing of an ass would have been. In short, as the proverb of Zarathustra saith: "What doth it matter!" 2. When, however, this took place with the ugliest man, Zarathustra stood there like one drunken: his glance dulled, his tongue faltered and his feet staggered. And who could divine what thoughts then passed through Zarathustra's soul? Apparently, however, his spirit retreated and fled in advance and was in remote distances, and as it were "wandering on high mountain-ridges," as it standeth written, "'twixt two seas, --Wandering 'twixt the past and the future as a heavy cloud." Gradually, however, while the higher men held him in their arms, he came back to himself a little, and resisted with his hands the crowd of the honouring and caring ones; but he did not speak. All at once, however, he turned his head quickly, for he seemed to hear something: then laid he his finger on his mouth and said: "COME!" 
And immediately it became still and mysterious round about; from the depth however there came up slowly the sound of a clock-bell. Zarathustra listened thereto, like the higher men; then, however, laid he his finger on his mouth the second time, and said again: "COME! COME! IT IS GETTING ON TO MIDNIGHT!"--and his voice had changed. But still he had not moved from the spot. Then it became yet stiller and more mysterious, and everything hearkened, even the ass, and Zarathustra's noble animals, the eagle and the serpent,--likewise the cave of Zarathustra and the big cool moon, and the night itself. Zarathustra, however, laid his hand upon his mouth for the third time, and said: COME! COME! COME! LET US NOW WANDER! IT IS THE HOUR: LET US WANDER INTO THE NIGHT! 3. Ye higher men, it is getting on to midnight: then will I say something into your ears, as that old clock-bell saith it into mine ear,-- --As mysteriously, as frightfully, and as cordially as that midnight clock-bell speaketh it to me, which hath experienced more than one man: --Which hath already counted the smarting throbbings of your fathers' hearts--ah! ah! how it sigheth! how it laugheth in its dream! the old, deep, deep midnight! Hush! Hush! Then is there many a thing heard which may not be heard by day; now however, in the cool air, when even all the tumult of your hearts hath become still,-- --Now doth it speak, now is it heard, now doth it steal into overwakeful, nocturnal souls: ah! ah! how the midnight sigheth! how it laugheth in its dream! --Hearest thou not how it mysteriously, frightfully, and cordially speaketh unto THEE, the old deep, deep midnight? O MAN, TAKE HEED! 4. Woe to me! Whither hath time gone? Have I not sunk into deep wells? The world sleepeth-- Ah! Ah! The dog howleth, the moon shineth. Rather will I die, rather will I die, than say unto you what my midnight-heart now thinketh. Already have I died. It is all over. Spider, why spinnest thou around me? Wilt thou have blood? Ah! Ah! 
The dew falleth, the hour cometh-- --The hour in which I frost and freeze, which asketh and asketh and asketh: "Who hath sufficient courage for it? --Who is to be master of the world? Who is going to say: THUS shall ye flow, ye great and small streams!" --The hour approacheth: O man, thou higher man, take heed! this talk is for fine ears, for thine ears--WHAT SAITH DEEP MIDNIGHT'S VOICE INDEED? 5. It carrieth me away, my soul danceth. Day's-work! Day's-work! Who is to be master of the world? The moon is cool, the wind is still. Ah! Ah! Have ye already flown high enough? Ye have danced: a leg, nevertheless, is not a wing. Ye good dancers, now is all delight over: wine hath become lees, every cup hath become brittle, the sepulchres mutter. Ye have not flown high enough: now do the sepulchres mutter: "Free the dead! Why is it so long night? Doth not the moon make us drunken?" Ye higher men, free the sepulchres, awaken the corpses! Ah, why doth the worm still burrow? There approacheth, there approacheth, the hour,-- --There boometh the clock-bell, there thrilleth still the heart, there burroweth still the wood-worm, the heart-worm. Ah! Ah! THE WORLD IS DEEP! 6. Sweet lyre! Sweet lyre! I love thy tone, thy drunken, ranunculine tone!--how long, how far hath come unto me thy tone, from the distance, from the ponds of love! Thou old clock-bell, thou sweet lyre! Every pain hath torn thy heart, father-pain, fathers'-pain, forefathers'-pain; thy speech hath become ripe,-- --Ripe like the golden autumn and the afternoon, like mine anchorite heart--now sayest thou: The world itself hath become ripe, the grape turneth brown, --Now doth it wish to die, to die of happiness. Ye higher men, do ye not feel it? There welleth up mysteriously an odour, --A perfume and odour of eternity, a rosy-blessed, brown, gold-wine-odour of old happiness, --Of drunken midnight-death happiness, which singeth: the world is deep, AND DEEPER THAN THE DAY COULD READ! 7. Leave me alone! Leave me alone! 
I am too pure for thee. Touch me not! Hath not my world just now become perfect? My skin is too pure for thy hands. Leave me alone, thou dull, doltish, stupid day! Is not the midnight brighter? The purest are to be masters of the world, the least known, the strongest, the midnight-souls, who are brighter and deeper than any day. O day, thou gropest for me? Thou feelest for my happiness? For thee am I rich, lonesome, a treasure-pit, a gold chamber? O world, thou wantest ME? Am I worldly for thee? Am I spiritual for thee? Am I divine for thee? But day and world, ye are too coarse,-- --Have cleverer hands, grasp after deeper happiness, after deeper unhappiness, grasp after some God; grasp not after me: --Mine unhappiness, my happiness is deep, thou strange day, but yet am I no God, no God's-hell: DEEP IS ITS WOE. 8. God's woe is deeper, thou strange world! Grasp at God's woe, not at me! What am I! A drunken sweet lyre,-- --A midnight-lyre, a bell-frog, which no one understandeth, but which MUST speak before deaf ones, ye higher men! For ye do not understand me! Gone! Gone! O youth! O noontide! O afternoon! Now have come evening and night and midnight,--the dog howleth, the wind: --Is the wind not a dog? It whineth, it barketh, it howleth. Ah! Ah! how she sigheth! how she laugheth, how she wheezeth and panteth, the midnight! How she just now speaketh soberly, this drunken poetess! hath she perhaps overdrunk her drunkenness? hath she become overawake? doth she ruminate? --Her woe doth she ruminate over, in a dream, the old, deep midnight--and still more her joy. For joy, although woe be deep, JOY IS DEEPER STILL THAN GRIEF CAN BE. 9. Thou grape-vine! Why dost thou praise me? Have I not cut thee! I am cruel, thou bleedest--: what meaneth thy praise of my drunken cruelty? "Whatever hath become perfect, everything mature--wanteth to die!" so sayest thou. Blessed, blessed be the vintner's knife! But everything immature wanteth to live: alas! Woe saith: "Hence! Go! 
Away, thou woe!" But everything that suffereth wanteth to live, that it may become mature and lively and longing, --Longing for the further, the higher, the brighter. "I want heirs," so saith everything that suffereth, "I want children, I do not want MYSELF,"-- Joy, however, doth not want heirs, it doth not want children,--joy wanteth itself, it wanteth eternity, it wanteth recurrence, it wanteth everything eternally-like-itself. Woe saith: "Break, bleed, thou heart! Wander, thou leg! Thou wing, fly! Onward! upward! thou pain!" Well! Cheer up! O mine old heart: WOE SAITH: "HENCE! GO!" 10. Ye higher men, what think ye? Am I a soothsayer? Or a dreamer? Or a drunkard? Or a dream-reader? Or a midnight-bell? Or a drop of dew? Or a fume and fragrance of eternity? Hear ye it not? Smell ye it not? Just now hath my world become perfect, midnight is also mid-day,-- Pain is also a joy, curse is also a blessing, night is also a sun,--go away! or ye will learn that a sage is also a fool. Said ye ever Yea to one joy? O my friends, then said ye Yea also unto ALL woe. All things are enlinked, enlaced and enamoured,-- --Wanted ye ever once to come twice; said ye ever: "Thou pleasest me, happiness! Instant! Moment!" then wanted ye ALL to come back again! --All anew, all eternal, all enlinked, enlaced and enamoured, Oh, then did ye LOVE the world,-- --Ye eternal ones, ye love it eternally and for all time: and also unto woe do ye say: Hence! Go! but come back! FOR JOYS ALL WANT--ETERNITY! 11. All joy wanteth the eternity of all things, it wanteth honey, it wanteth lees, it wanteth drunken midnight, it wanteth graves, it wanteth grave-tears' consolation, it wanteth gilded evening-red-- --WHAT doth not joy want! 
it is thirstier, heartier, hungrier, more frightful, more mysterious, than all woe: it wanteth ITSELF, it biteth into ITSELF, the ring's will writheth in it,-- --It wanteth love, it wanteth hate, it is over-rich, it bestoweth, it throweth away, it beggeth for some one to take from it, it thanketh the taker, it would fain be hated,-- --So rich is joy that it thirsteth for woe, for hell, for hate, for shame, for the lame, for the WORLD,--for this world, Oh, ye know it indeed! Ye higher men, for you doth it long, this joy, this irrepressible, blessed joy--for your woe, ye failures! For failures, longeth all eternal joy. For joys all want themselves, therefore do they also want grief! O happiness, O pain! Oh break, thou heart! Ye higher men, do learn it, that joys want eternity. --Joys want the eternity of ALL things, they WANT DEEP, PROFOUND ETERNITY! 12. Have ye now learned my song? Have ye divined what it would say? Well! Cheer up! Ye higher men, sing now my roundelay! Sing now yourselves the song, the name of which is "Once more," the signification of which is "Unto all eternity!"--sing, ye higher men, Zarathustra's roundelay! O man! Take heed! What saith deep midnight's voice indeed? "I slept my sleep--, "From deepest dream I've woke, and plead:-- "The world is deep, "And deeper than the day could read. "Deep is its woe--, "Joy--deeper still than grief can be: "Woe saith: Hence! Go! "But joys all want eternity-, "-Want deep, profound eternity!" LXXX. THE SIGN. In the morning, however, after this night, Zarathustra jumped up from his couch, and, having girded his loins, he came out of his cave glowing and strong, like a morning sun coming out of gloomy mountains. "Thou great star," spake he, as he had spoken once before, "thou deep eye of happiness, what would be all thy happiness if thou hadst not THOSE for whom thou shinest! 
And if they remained in their chambers whilst thou art already awake, and comest and bestowest and distributest, how would thy proud modesty upbraid for it! Well! they still sleep, these higher men, whilst _I_ am awake: THEY are not my proper companions! Not for them do I wait here in my mountains. At my work I want to be, at my day: but they understand not what are the signs of my morning, my step--is not for them the awakening-call. They still sleep in my cave; their dream still drinketh at my drunken songs. The audient ear for ME--the OBEDIENT ear, is yet lacking in their limbs." --This had Zarathustra spoken to his heart when the sun arose: then looked he inquiringly aloft, for he heard above him the sharp call of his eagle. "Well!" called he upwards, "thus is it pleasing and proper to me. Mine animals are awake, for I am awake. Mine eagle is awake, and like me honoureth the sun. With eagle-talons doth it grasp at the new light. Ye are my proper animals; I love you. But still do I lack my proper men!"-- Thus spake Zarathustra; then, however, it happened that all on a sudden he became aware that he was flocked around and fluttered around, as if by innumerable birds,--the whizzing of so many wings, however, and the crowding around his head was so great that he shut his eyes. And verily, there came down upon him as it were a cloud, like a cloud of arrows which poureth upon a new enemy. But behold, here it was a cloud of love, and showered upon a new friend. "What happeneth unto me?" thought Zarathustra in his astonished heart, and slowly seated himself on the big stone which lay close to the exit from his cave. But while he grasped about with his hands, around him, above him and below him, and repelled the tender birds, behold, there then happened to him something still stranger: for he grasped thereby unawares into a mass of thick, warm, shaggy hair; at the same time, however, there sounded before him a roar,--a long, soft lion-roar. 
"THE SIGN COMETH," said Zarathustra, and a change came over his heart. And in truth, when it turned clear before him, there lay a yellow, powerful animal at his feet, resting its head on his knee,--unwilling to leave him out of love, and doing like a dog which again findeth its old master. The doves, however, were no less eager with their love than the lion; and whenever a dove whisked over its nose, the lion shook its head and wondered and laughed. When all this went on Zarathustra spake only a word: "MY CHILDREN ARE NIGH, MY CHILDREN"--, then he became quite mute. His heart, however, was loosed, and from his eyes there dropped down tears and fell upon his hands. And he took no further notice of anything, but sat there motionless, without repelling the animals further. Then flew the doves to and fro, and perched on his shoulder, and caressed his white hair, and did not tire of their tenderness and joyousness. The strong lion, however, licked always the tears that fell on Zarathustra's hands, and roared and growled shyly. Thus did these animals do.-- All this went on for a long time, or a short time: for properly speaking, there is NO time on earth for such things--. Meanwhile, however, the higher men had awakened in Zarathustra's cave, and marshalled themselves for a procession to go to meet Zarathustra, and give him their morning greeting: for they had found when they awakened that he no longer tarried with them. When, however, they reached the door of the cave and the noise of their steps had preceded them, the lion started violently; it turned away all at once from Zarathustra, and roaring wildly, sprang towards the cave. The higher men, however, when they heard the lion roaring, cried all aloud as with one voice, fled back and vanished in an instant. Zarathustra himself, however, stunned and strange, rose from his seat, looked around him, stood there astonished, inquired of his heart, bethought himself, and remained alone. "What did I hear?" 
said he at last, slowly, "what happened unto me just now?" But soon there came to him his recollection, and he took in at a glance all that had taken place between yesterday and to-day. "Here is indeed the stone," said he, and stroked his beard, "on IT sat I yester-morn; and here came the soothsayer unto me, and here heard I first the cry which I heard just now, the great cry of distress. O ye higher men, YOUR distress was it that the old soothsayer foretold to me yester-morn,-- --Unto your distress did he want to seduce and tempt me: 'O Zarathustra,' said he to me, 'I come to seduce thee to thy last sin.' To my last sin?" cried Zarathustra, and laughed angrily at his own words: "WHAT hath been reserved for me as my last sin?" --And once more Zarathustra became absorbed in himself, and sat down again on the big stone and meditated. Suddenly he sprang up,-- "FELLOW-SUFFERING! FELLOW-SUFFERING WITH THE HIGHER MEN!" he cried out, and his countenance changed into brass. "Well! THAT--hath had its time! My suffering and my fellow-suffering--what matter about them! Do I then strive after HAPPINESS? I strive after my WORK! Well! The lion hath come, my children are nigh, Zarathustra hath grown ripe, mine hour hath come:-- This is MY morning, MY day beginneth: ARISE NOW, ARISE, THOU GREAT NOONTIDE!"-- Thus spake Zarathustra and left his cave, glowing and strong, like a morning sun coming out of gloomy mountains. ---------------- INTRODUCTION BY MRS FORSTER-NIETZSCHE. HOW ZARATHUSTRA CAME INTO BEING. "Zarathustra" is my brother's most personal work; it is the history of his most individual experiences, of his friendships, ideals, raptures, bitterest disappointments and sorrows. Above it all, however, there soars, transfiguring it, the image of his greatest hopes and remotest aims. My brother had the figure of Zarathustra in his mind from his very earliest youth: he once told me that even as a child he had dreamt of him. 
At different periods in his life, he would call this haunter of his dreams by different names; "but in the end," he declares in a note on the subject, "I had to do a PERSIAN the honour of identifying him with this creature of my fancy. Persians were the first to take a broad and comprehensive view of history. Every series of evolutions, according to them, was presided over by a prophet; and every prophet had his 'Hazar,'--his dynasty of a thousand years." All Zarathustra's views, as also his personality, were early conceptions of my brother's mind. Whoever reads his posthumously published writings for the years 1869-82 with care, will constantly meet with passages suggestive of Zarathustra's thoughts and doctrines. For instance, the ideal of the Superman is put forth quite clearly in all his writings during the years 1873-75; and in "We Philologists", the following remarkable observations occur:-- "How can one praise and glorify a nation as a whole?--Even among the Greeks, it was the INDIVIDUALS that counted." "The Greeks are interesting and extremely important because they reared such a vast number of great individuals. How was this possible? The question is one which ought to be studied. "I am interested only in the relations of a people to the rearing of the individual man, and among the Greeks the conditions were unusually favourable for the development of the individual; not by any means owing to the goodness of the people, but because of the struggles of their evil instincts. "WITH THE HELP OF FAVOURABLE MEASURES GREAT INDIVIDUALS MIGHT BE REARED WHO WOULD BE BOTH DIFFERENT FROM AND HIGHER THAN THOSE WHO HERETOFORE HAVE OWED THEIR EXISTENCE TO MERE CHANCE. Here we may still be hopeful: in the rearing of exceptional men." 
The notion of rearing the Superman is only a new form of an ideal Nietzsche already had in his youth, that "THE OBJECT OF MANKIND SHOULD LIE IN ITS HIGHEST INDIVIDUALS" (or, as he writes in "Schopenhauer as Educator": "Mankind ought constantly to be striving to produce great men--this and nothing else is its duty.") But the ideals he most revered in those days are no longer held to be the highest types of men. No, around this future ideal of a coming humanity--the Superman--the poet spread the veil of becoming. Who can tell to what glorious heights man can still ascend? That is why, after having tested the worth of our noblest ideal--that of the Saviour, in the light of the new valuations, the poet cries with passionate emphasis in "Zarathustra": "Never yet hath there been a Superman. Naked have I seen both of them, the greatest and the smallest man:-- All-too-similar are they still to each other. Verily even the greatest found I--all-too-human!"-- The phrase "the rearing of the Superman," has very often been misunderstood. By the word "rearing," in this case, is meant the act of modifying by means of new and higher values--values which, as laws and guides of conduct and opinion, are now to rule over mankind. In general the doctrine of the Superman can only be understood correctly in conjunction with other ideas of the author's, such as:--the Order of Rank, the Will to Power, and the Transvaluation of all Values. He assumes that Christianity, as a product of the resentment of the botched and the weak, has put in ban all that is beautiful, strong, proud, and powerful, in fact all the qualities resulting from strength, and that, in consequence, all forces which tend to promote or elevate life have been seriously undermined. 
Now, however, a new table of valuations must be placed over mankind--namely, that of the strong, mighty, and magnificent man, overflowing with life and elevated to his zenith--the Superman, who is now put before us with overpowering passion as the aim of our life, hope, and will. And just as the old system of valuing, which only extolled the qualities favourable to the weak, the suffering, and the oppressed, has succeeded in producing a weak, suffering, and "modern" race, so this new and reversed system of valuing ought to rear a healthy, strong, lively, and courageous type, which would be a glory to life itself. Stated briefly, the leading principle of this new system of valuing would be: "All that proceeds from power is good, all that springs from weakness is bad." This type must not be regarded as a fanciful figure: it is not a nebulous hope which is to be realised at some indefinitely remote period, thousands of years hence; nor is it a new species (in the Darwinian sense) of which we can know nothing, and which it would therefore be somewhat absurd to strive after. But it is meant to be a possibility which men of the present could realise with all their spiritual and physical energies, provided they adopted the new values. The author of "Zarathustra" never lost sight of that egregious example of a transvaluation of all values through Christianity, whereby the whole of the deified mode of life and thought of the Greeks, as well as strong Romedom, was almost annihilated or transvalued in a comparatively short time. Could not a rejuvenated Graeco-Roman system of valuing (once it had been refined and made more profound by the schooling which two thousand years of Christianity had provided) effect another such revolution within a calculable period of time, until that glorious type of manhood shall finally appear which is to be our new faith and hope, and in the creation of which Zarathustra exhorts us to participate? 
In his private notes on the subject the author uses the expression "Superman" (always in the singular, by-the-bye), as signifying "the most thoroughly well-constituted type," as opposed to "modern man"; above all, however, he designates Zarathustra himself as an example of the Superman. In "Ecce Homo" he is careful to enlighten us concerning the precursors and prerequisites to the advent of this highest type, in referring to a certain passage in the "Gay Science":-- "In order to understand this type, we must first be quite clear in regard to the leading physiological condition on which it depends: this condition is what I call GREAT HEALTHINESS. I know not how to express my meaning more plainly or more personally than I have done already in one of the last chapters (Aphorism 382) of the fifth book of the 'Gaya Scienza'." "We, the new, the nameless, the hard-to-understand,"--it says there,--"we firstlings of a yet untried future--we require for a new end also a new means, namely, a new healthiness, stronger, sharper, tougher, bolder and merrier than all healthiness hitherto. 
He whose soul longeth to experience the whole range of hitherto recognised values and desirabilities, and to circumnavigate all the coasts of this ideal 'Mediterranean Sea', who, from the adventures of his most personal experience, wants to know how it feels to be a conqueror, and discoverer of the ideal--as likewise how it is with the artist, the saint, the legislator, the sage, the scholar, the devotee, the prophet, and the godly non-conformist of the old style:--requires one thing above all for that purpose, GREAT HEALTHINESS--such healthiness as one not only possesses, but also constantly acquires and must acquire, because one unceasingly sacrifices it again, and must sacrifice it!--And now, after having been long on the way in this fashion, we Argonauts of the ideal, more courageous perhaps than prudent, and often enough shipwrecked and brought to grief, nevertheless dangerously healthy, always healthy again,--it would seem as if, in recompense for it all, that we have a still undiscovered country before us, the boundaries of which no one has yet seen, a beyond to all countries and corners of the ideal known hitherto, a world so over-rich in the beautiful, the strange, the questionable, the frightful, and the divine, that our curiosity as well as our thirst for possession thereof, have got out of hand--alas! that nothing will now any longer satisfy us!-- "How could we still be content with THE MAN OF THE PRESENT DAY after such outlooks, and with such a craving in our conscience and consciousness? Sad enough; but it is unavoidable that we should look on the worthiest aims and hopes of the man of the present day with ill-concealed amusement, and perhaps should no longer look at them. 
Another ideal runs on before us, a strange, tempting ideal full of danger, to which we should not like to persuade any one, because we do not so readily acknowledge any one's RIGHT THERETO: the ideal of a spirit who plays naively (that is to say involuntarily and from overflowing abundance and power) with everything that has hitherto been called holy, good, intangible, or divine; to whom the loftiest conception which the people have reasonably made their measure of value, would already practically imply danger, ruin, abasement, or at least relaxation, blindness, or temporary self-forgetfulness; the ideal of a humanly superhuman welfare and benevolence, which will often enough appear INHUMAN, for example, when put alongside of all past seriousness on earth, and alongside of all past solemnities in bearing, word, tone, look, morality, and pursuit, as their truest involuntary parody--and WITH which, nevertheless, perhaps THE GREAT SERIOUSNESS only commences, when the proper interrogative mark is set up, the fate of the soul changes, the hour-hand moves, and tragedy begins..." Although the figure of Zarathustra and a large number of the leading thoughts in this work had appeared much earlier in the dreams and writings of the author, "Thus Spake Zarathustra" did not actually come into being until the month of August 1881 in Sils Maria; and it was the idea of the Eternal Recurrence of all things which finally induced my brother to set forth his new views in poetic language. In regard to his first conception of this idea, his autobiographical sketch, "Ecce Homo", written in the autumn of 1888, contains the following passage:-- "The fundamental idea of my work--namely, the Eternal Recurrence of all things--this highest of all possible formulae of a Yea-saying philosophy, first occurred to me in August 1881. I made a note of the thought on a sheet of paper, with the postscript: 6,000 feet beyond men and time! 
That day I happened to be wandering through the woods alongside of the lake of Silvaplana, and I halted beside a huge, pyramidal and towering rock not far from Surlei. It was then that the thought struck me. Looking back now, I find that exactly two months previous to this inspiration, I had had an omen of its coming in the form of a sudden and decisive alteration in my tastes--more particularly in music. It would even be possible to consider all 'Zarathustra' as a musical composition. At all events, a very necessary condition in its production was a renaissance in myself of the art of hearing. In a small mountain resort (Recoaro) near Vicenza, where I spent the spring of 1881, I and my friend and Maestro, Peter Gast--also one who had been born again--discovered that the phoenix music that hovered over us, wore lighter and brighter plumes than it had done theretofore." During the month of August 1881 my brother resolved to reveal the teaching of the Eternal Recurrence, in dithyrambic and psalmodic form, through the mouth of Zarathustra. Among the notes of this period, we found a page on which is written the first definite plan of "Thus Spake Zarathustra":-- "MIDDAY AND ETERNITY." "GUIDE-POSTS TO A NEW WAY OF LIVING." Beneath this is written:-- "Zarathustra born on lake Urmi; left his home in his thirtieth year, went into the province of Aria, and, during ten years of solitude in the mountains, composed the Zend-Avesta." "The sun of knowledge stands once more at midday; and the serpent of eternity lies coiled in its light--: It is YOUR time, ye midday brethren." In that summer of 1881, my brother, after many years of steadily declining health, began at last to rally, and it is to this first gush of the recovery of his once splendid bodily condition that we owe not only "The Gay Science", which in its mood may be regarded as a prelude to "Zarathustra", but also "Zarathustra" itself. 
Just as he was beginning to recuperate his health, however, an unkind destiny brought him a number of most painful personal experiences. His friends caused him many disappointments, which were the more bitter to him, inasmuch as he regarded friendship as such a sacred institution; and for the first time in his life he realised the whole horror of that loneliness to which, perhaps, all greatness is condemned. But to be forsaken is something very different from deliberately choosing blessed loneliness. How he longed, in those days, for the ideal friend who would thoroughly understand him, to whom he would be able to say all, and whom he imagined he had found at various periods in his life from his earliest youth onwards. Now, however, that the way he had chosen grew ever more perilous and steep, he found nobody who could follow him: he therefore created a perfect friend for himself in the ideal form of a majestic philosopher, and made this creation the preacher of his gospel to the world. Whether my brother would ever have written "Thus Spake Zarathustra" according to the first plan sketched in the summer of 1881, if he had not had the disappointments already referred to, is now an idle question; but perhaps where "Zarathustra" is concerned, we may also say with Master Eckhardt: "The fleetest beast to bear you to perfection is suffering." My brother writes as follows about the origin of the first part of "Zarathustra":--"In the winter of 1882-83, I was living on the charming little Gulf of Rapallo, not far from Genoa, and between Chiavari and Cape Porto Fino. My health was not very good; the winter was cold and exceptionally rainy; and the small inn in which I lived was so close to the water that at night my sleep would be disturbed if the sea were high. 
These circumstances were surely the very reverse of favourable; and yet in spite of it all, and as if in demonstration of my belief that everything decisive comes to life in spite of every obstacle, it was precisely during this winter and in the midst of these unfavourable circumstances that my 'Zarathustra' originated. In the morning I used to start out in a southerly direction up the glorious road to Zoagli, which rises aloft through a forest of pines and gives one a view far out into the sea. In the afternoon, as often as my health permitted, I walked round the whole bay from Santa Margherita to beyond Porto Fino. This spot was all the more interesting to me, inasmuch as it was so dearly loved by the Emperor Frederick III. In the autumn of 1886 I chanced to be there again when he was revisiting this small, forgotten world of happiness for the last time. It was on these two roads that all 'Zarathustra' came to me, above all Zarathustra himself as a type;--I ought rather to say that it was on these walks that these ideas waylaid me." The first part of "Zarathustra" was written in about ten days--that is to say, from the beginning to about the middle of February 1883. "The last lines were written precisely in the hallowed hour when Richard Wagner gave up the ghost in Venice." With the exception of the ten days occupied in composing the first part of this book, my brother often referred to this winter as the hardest and sickliest he had ever experienced. He did not, however, mean thereby that his former disorders were troubling him, but that he was suffering from a severe attack of influenza which he had caught in Santa Margherita, and which tormented him for several weeks after his arrival in Genoa. As a matter of fact, however, what he complained of most was his spiritual condition--that indescribable forsakenness--to which he gives such heartrending expression in "Zarathustra". 
Even the reception which the first part met with at the hands of friends and acquaintances was extremely disheartening: for almost all those to whom he presented copies of the work misunderstood it. "I found no one ripe for many of my thoughts; the case of 'Zarathustra' proves that one can speak with the utmost clearness, and yet not be heard by any one." My brother was very much discouraged by the feebleness of the response he was given, and as he was striving just then to give up the practice of taking hydrate of chloral--a drug he had begun to take while ill with influenza,--the following spring, spent in Rome, was a somewhat gloomy one for him. He writes about it as follows:-- "I spent a melancholy spring in Rome, where I only just managed to live,-- and this was no easy matter. This city, which is absolutely unsuited to the poet-author of 'Zarathustra', and for the choice of which I was not responsible, made me inordinately miserable. I tried to leave it. I wanted to go to Aquila--the opposite of Rome in every respect, and actually founded in a spirit of enmity towards that city (just as I also shall found a city some day), as a memento of an atheist and genuine enemy of the Church--a person very closely related to me,--the great Hohenstaufen, the Emperor Frederick II. But Fate lay behind it all: I had to return again to Rome. In the end I was obliged to be satisfied with the Piazza Barberini, after I had exerted myself in vain to find an anti-Christian quarter. I fear that on one occasion, to avoid bad smells as much as possible, I actually inquired at the Palazzo del Quirinale whether they could not provide a quiet room for a philosopher. In a chamber high above the Piazza just mentioned, from which one obtained a general view of Rome and could hear the fountains plashing far below, the loneliest of all songs was composed--'The Night-Song'. 
About this time I was obsessed by an unspeakably sad melody, the refrain of which I recognised in the words, 'dead through immortality.'" We remained somewhat too long in Rome that spring, and what with the effect of the increasing heat and the discouraging circumstances already described, my brother resolved not to write any more, or in any case, not to proceed with "Zarathustra", although I offered to relieve him of all trouble in connection with the proofs and the publisher. When, however, we returned to Switzerland towards the end of June, and he found himself once more in the familiar and exhilarating air of the mountains, all his joyous creative powers revived, and in a note to me announcing the dispatch of some manuscript, he wrote as follows: "I have engaged a place here for three months: forsooth, I am the greatest fool to allow my courage to be sapped from me by the climate of Italy. Now and again I am troubled by the thought: WHAT NEXT? My 'future' is the darkest thing in the world to me, but as there still remains a great deal for me to do, I suppose I ought rather to think of doing this than of my future, and leave the rest to THEE and the gods." The second part of "Zarathustra" was written between the 26th of June and the 6th July. "This summer, finding myself once more in the sacred place where the first thought of 'Zarathustra' flashed across my mind, I conceived the second part. Ten days sufficed. Neither for the second, the first, nor the third part, have I required a day longer." He often used to speak of the ecstatic mood in which he wrote "Zarathustra"; how in his walks over hill and dale the ideas would crowd into his mind, and how he would note them down hastily in a note-book from which he would transcribe them on his return, sometimes working till midnight. 
He says in a letter to me: "You can have no idea of the vehemence of such composition," and in "Ecce Homo" (autumn 1888) he describes as follows with passionate enthusiasm the incomparable mood in which he created Zarathustra:-- "--Has any one at the end of the nineteenth century any distinct notion of what poets of a stronger age understood by the word inspiration? If not, I will describe it. If one had the smallest vestige of superstition in one, it would hardly be possible to set aside completely the idea that one is the mere incarnation, mouthpiece or medium of an almighty power. The idea of revelation in the sense that something becomes suddenly visible and audible with indescribable certainty and accuracy, which profoundly convulses and upsets one--describes simply the matter of fact. One hears-- one does not seek; one takes--one does not ask who gives: a thought suddenly flashes up like lightning, it comes with necessity, unhesitatingly--I have never had any choice in the matter. There is an ecstasy such that the immense strain of it is sometimes relaxed by a flood of tears, along with which one's steps either rush or involuntarily lag, alternately. There is the feeling that one is completely out of hand, with the very distinct consciousness of an endless number of fine thrills and quiverings to the very toes;--there is a depth of happiness in which the painfullest and gloomiest do not operate as antitheses, but as conditioned, as demanded in the sense of necessary shades of colour in such an overflow of light. There is an instinct for rhythmic relations which embraces wide areas of forms (length, the need of a wide-embracing rhythm, is almost the measure of the force of an inspiration, a sort of counterpart to its pressure and tension). Everything happens quite involuntarily, as if in a tempestuous outburst of freedom, of absoluteness, of power and divinity. 
The involuntariness of the figures and similes is the most remarkable thing; one loses all perception of what constitutes the figure and what constitutes the simile; everything seems to present itself as the readiest, the correctest and the simplest means of expression. It actually seems, to use one of Zarathustra's own phrases, as if all things came unto one, and would fain be similes: 'Here do all things come caressingly to thy talk and flatter thee, for they want to ride upon thy back. On every simile dost thou here ride to every truth. Here fly open unto thee all being's words and word-cabinets; here all being wanteth to become words, here all becoming wanteth to learn of thee how to talk.' This is MY experience of inspiration. I do not doubt but that one would have to go back thousands of years in order to find some one who could say to me: It is mine also!--" In the autumn of 1883 my brother left the Engadine for Germany and stayed there a few weeks. In the following winter, after wandering somewhat erratically through Stresa, Genoa, and Spezia, he landed in Nice, where the climate so happily promoted his creative powers that he wrote the third part of "Zarathustra". "In the winter, beneath the halcyon sky of Nice, which then looked down upon me for the first time in my life, I found the third 'Zarathustra'--and came to the end of my task; the whole having occupied me scarcely a year. Many hidden corners and heights in the landscapes round about Nice are hallowed to me by unforgettable moments. That decisive chapter entitled 'Old and New Tables' was composed in the very difficult ascent from the station to Eza--that wonderful Moorish village in the rocks. My most creative moments were always accompanied by unusual muscular activity. The body is inspired: let us waive the question of the 'soul.' I might often have been seen dancing in those days. Without a suggestion of fatigue I could then walk for seven or eight hours on end among the hills. 
I slept well and laughed well--I was perfectly robust and patient." As we have seen, each of the three parts of "Zarathustra" was written, after a more or less short period of preparation, in about ten days. The composition of the fourth part alone was broken by occasional interruptions. The first notes relating to this part were written while he and I were staying together in Zurich in September 1884. In the following November, while staying at Mentone, he began to elaborate these notes, and after a long pause, finished the manuscript at Nice between the end of January and the middle of February 1885. My brother then called this part the fourth and last; but even before, and shortly after it had been privately printed, he wrote to me saying that he still intended writing a fifth and sixth part, and notes relating to these parts are now in my possession. This fourth part (the original MS. of which contains this note: "Only for my friends, not for the public") is written in a particularly personal spirit, and those few to whom he presented a copy of it, he pledged to the strictest secrecy concerning its contents. He often thought of making this fourth part public also, but doubted whether he would ever be able to do so without considerably altering certain portions of it. At all events he resolved to distribute this manuscript production, of which only forty copies were printed, only among those who had proved themselves worthy of it, and it speaks eloquently of his utter loneliness and need of sympathy in those days, that he had occasion to present only seven copies of his book according to this resolution. Already at the beginning of this history I hinted at the reasons which led my brother to select a Persian as the incarnation of his ideal of the majestic philosopher. 
His reasons, however, for choosing Zarathustra of all others to be his mouthpiece, he gives us in the following words:-- "People have never asked me, as they should have done, what the name Zarathustra precisely means in my mouth, in the mouth of the first Immoralist; for what distinguishes that philosopher from all others in the past is the very fact that he was exactly the reverse of an immoralist. Zarathustra was the first to see in the struggle between good and evil the essential wheel in the working of things. The translation of morality into the metaphysical, as force, cause, end in itself, was HIS work. But the very question suggests its own answer. Zarathustra CREATED the most portentous error, MORALITY, consequently he should also be the first to PERCEIVE that error, not only because he has had longer and greater experience of the subject than any other thinker--all history is the experimental refutation of the theory of the so-called moral order of things:--the more important point is that Zarathustra was more truthful than any other thinker. In his teaching alone do we meet with truthfulness upheld as the highest virtue--i.e.: the reverse of the COWARDICE of the 'idealist' who flees from reality. Zarathustra had more courage in his body than any other thinker before or after him. To tell the truth and TO AIM STRAIGHT: that is the first Persian virtue. Am I understood?...The overcoming of morality through itself--through truthfulness, the overcoming of the moralist through his opposite--THROUGH ME--: that is what the name Zarathustra means in my mouth." ELIZABETH FORSTER-NIETZSCHE. Nietzsche Archives, Weimar, December 1905. ------------- APPENDIX. NOTES ON "THUS SPAKE ZARATHUSTRA" BY ANTHONY M. LUDOVICI. 
I have had some opportunities of studying the conditions under which Nietzsche is read in Germany, France, and England, and I have found that, in each of these countries, students of his philosophy, as if actuated by precisely similar motives and desires, and misled by the same mistaken tactics on the part of most publishers, all proceed in the same happy-go-lucky style when "taking him up." They have had it said to them that he wrote without any system, and they very naturally conclude that it does not matter in the least whether they begin with his first, third, or last book, provided they can obtain a few vague ideas as to what his leading and most sensational principles were. Now, it is clear that the book with the most mysterious, startling, or suggestive title, will always stand the best chance of being purchased by those who have no other criteria to guide them in their choice than the aspect of a title-page; and this explains why "Thus Spake Zarathustra" is almost always the first and often the only one of Nietzsche's books that falls into the hands of the uninitiated. The title suggests all kinds of mysteries; a glance at the chapter-headings quickly confirms the suspicions already aroused, and the sub-title: "A Book for All and None", generally succeeds in dissipating the last doubts the prospective purchaser may entertain concerning his fitness for the book or its fitness for him. And what happens? "Thus Spake Zarathustra" is taken home; the reader, who perchance may know no more concerning Nietzsche than a magazine article has told him, tries to read it and, understanding less than half he reads, probably never gets further than the second or third part,--and then only to feel convinced that Nietzsche himself was "rather hazy" as to what he was talking about. 
Such chapters as "The Child with the Mirror", "In the Happy Isles", "The Grave-Song," "Immaculate Perception," "The Stillest Hour", "The Seven Seals", and many others, are almost utterly devoid of meaning to all those who do not know something of Nietzsche's life, his aims and his friendships. As a matter of fact, "Thus Spake Zarathustra", though it is unquestionably Nietzsche's opus magnum, is by no means the first of Nietzsche's works that the beginner ought to undertake to read. The author himself refers to it as the deepest work ever offered to the German public, and elsewhere speaks of his other writings as being necessary for the understanding of it. But when it is remembered that in Zarathustra we not only have the history of his most intimate experiences, friendships, feuds, disappointments, triumphs and the like, but that the very form in which they are narrated is one which tends rather to obscure than to throw light upon them, the difficulties which meet the reader who starts quite unprepared will be seen to be really formidable. Zarathustra, then,--this shadowy, allegorical personality, speaking in allegories and parables, and at times not even refraining from relating his own dreams--is a figure we can understand but very imperfectly if we have no knowledge of his creator and counterpart, Friedrich Nietzsche; and it were therefore well, previous to our study of the more abstruse parts of this book, if we were to turn to some authoritative book on Nietzsche's life and works and to read all that is there said on the subject. Those who can read German will find an excellent guide, in this respect, in Frau Foerster-Nietzsche's exhaustive and highly interesting biography of her brother: "Das Leben Friedrich Nietzsche's" (published by Naumann); while the works of Deussen, Raoul Richter, and Baroness Isabelle von Unger-Sternberg, will be found to throw useful and necessary light upon many questions which it would be difficult for a sister to touch upon. 
In regard to the actual philosophical views expounded in this work, there is an excellent way of clearing up any difficulties they may present, and that is by an appeal to Nietzsche's other works. Again and again, of course, he will be found to express himself so clearly that all reference to his other writings may be dispensed with; but where this is not the case, the advice he himself gives is after all the best to be followed here, viz.:--to regard such works as: "Joyful Science", "Beyond Good and Evil", "The Genealogy of Morals", "The Twilight of the Idols", "The Antichrist", "The Will to Power", etc., etc., as the necessary preparation for "Thus Spake Zarathustra". These directions, though they are by no means simple to carry out, seem at least to possess the quality of definiteness and straightforwardness. "Follow them and all will be clear," I seem to imply. But I regret to say that this is not really the case. For my experience tells me that even after the above directions have been followed with the greatest possible zeal, the student will still halt in perplexity before certain passages in the book before us, and wonder what they mean. Now, it is with the view of giving a little additional help to all those who find themselves in this position that I proceed to put forth my own personal interpretation of the more abstruse passages in this work. In offering this little commentary to the Nietzsche student, I should like it to be understood that I make no claim as to its infallibility or indispensability. It represents but an attempt on my part--a very feeble one perhaps--to give the reader what little help I can in surmounting difficulties which a long study of Nietzsche's life and works has enabled me, partially I hope, to overcome. ... 
Perhaps it would be as well to start out with a broad and rapid sketch of Nietzsche as a writer on Morals, Evolution, and Sociology, so that the reader may be prepared to pick out for himself, so to speak, all passages in this work bearing in any way upon Nietzsche's views in those three important branches of knowledge. (A.) Nietzsche and Morality. In morality, Nietzsche starts out by adopting the position of the relativist. He says there are no absolute values "good" and "evil"; these are mere means adopted by all in order to acquire power to maintain their place in the world, or to become supreme. It is the lion's good to devour an antelope. It is the dead-leaf butterfly's good to tell a foe a falsehood. For when the dead-leaf butterfly is in danger, it clings to the side of a twig, and what it says to its foe is practically this: "I am not a butterfly, I am a dead leaf, and can be of no use to thee." This is a lie which is good to the butterfly, for it preserves it. In nature every species of organic being instinctively adopts and practises those acts which most conduce to the prevalence or supremacy of its kind. Once the most favourable order of conduct is found, proved efficient and established, it becomes the ruling morality of the species that adopts it and bears them along to victory. All species must not and cannot value alike, for what is the lion's good is the antelope's evil and vice versa. Concepts of good and evil are therefore, in their origin, merely a means to an end, they are expedients for acquiring power. Applying this principle to mankind, Nietzsche attacked Christian moral values. He declared them to be, like all other morals, merely an expedient for protecting a certain type of man. In the case of Christianity this type was, according to Nietzsche, a low one. 
Conflicting moral codes have been no more than the conflicting weapons of different classes of men; for in mankind there is a continual war between the powerful, the noble, the strong, and the well-constituted on the one side, and the impotent, the mean, the weak, and the ill-constituted on the other. The war is a war of moral principles. The morality of the powerful class, Nietzsche calls NOBLE- or MASTER-MORALITY; that of the weak and subordinate class he calls SLAVE-MORALITY. In the first morality it is the eagle which, looking down upon a browsing lamb, contends that "eating lamb is good." In the second, the slave-morality, it is the lamb which, looking up from the sward, bleats dissentingly: "Eating lamb is evil." (B.) The Master- and Slave-Morality Compared. The first morality is active, creative, Dionysian. The second is passive, defensive,--to it belongs the "struggle for existence." Where attempts have not been made to reconcile the two moralities, they may be described as follows:--All is GOOD in the noble morality which proceeds from strength, power, health, well-constitutedness, happiness, and awfulness; for, the motive force behind the people practising it is "the struggle for power." The antithesis "good and bad" to this first class means the same as "noble" and "despicable." "Bad" in the master-morality must be applied to the coward, to all acts that spring from weakness, to the man with "an eye to the main chance," who would forsake everything in order to live. With the second, the slave-morality, the case is different. There, inasmuch as the community is an oppressed, suffering, unemancipated, and weary one, all THAT will be held to be good which alleviates the state of suffering. 
Pity, the obliging hand, the warm heart, patience, industry, and humility--these are unquestionably the qualities we shall here find flooded with the light of approval and admiration; because they are the most USEFUL qualities--; they make life endurable, they are of assistance in the "struggle for existence" which is the motive force behind the people practising this morality. To this class, all that is AWFUL is bad, in fact it is THE evil par excellence. Strength, health, superabundance of animal spirits and power, are regarded with hate, suspicion, and fear by the subordinate class. Now Nietzsche believed that the first or the noble-morality conduced to an ascent in the line of life; because it was creative and active. On the other hand, he believed that the second or slave-morality, where it became paramount, led to degeneration, because it was passive and defensive, wanting merely to keep those who practised it alive. Hence his earnest advocacy of noble-morality. (C.) Nietzsche and Evolution. Nietzsche as an evolutionist I shall have occasion to define and discuss in the course of these notes (see Notes on Chapter LVI., par. 10, and on Chapter LVII.). For the present let it suffice for us to know that he accepted the "Development Hypothesis" as an explanation of the origin of species: but he did not halt where most naturalists have halted. He by no means regarded man as the highest possible being which evolution could arrive at; for though his physical development may have reached its limit, this is not the case with his mental or spiritual attributes. If the process be a fact; if things have BECOME what they are, then, he contends, we may describe no limit to man's aspirations. If he struggled up from barbarism, and still more remotely from the lower Primates, his ideal should be to surpass man himself and reach Superman (see especially the Prologue). (D.) Nietzsche and Sociology. Nietzsche as a sociologist aims at an aristocratic arrangement of society. 
He would have us rear an ideal race. Honest and truthful in intellectual matters, he could not even think that men are equal. "With these preachers of equality will I not be mixed up and confounded. For thus speaketh justice unto ME: 'Men are not equal.'" He sees precisely in this inequality a purpose to be served, a condition to be exploited. "Every elevation of the type 'man,'" he writes in "Beyond Good and Evil", "has hitherto been the work of an aristocratic society--and so will it always be--a society believing in a long scale of gradations of rank and differences of worth among human beings." Those who are sufficiently interested to desire to read his own detailed account of the society he would fain establish, will find an excellent passage in Aphorism 57 of "The Antichrist". ... PART I. THE PROLOGUE. In Part I. including the Prologue, no very great difficulties will appear. Zarathustra's habit of designating a whole class of men or a whole school of thought by a single fitting nickname may perhaps lead to a little confusion at first; but, as a rule, when the general drift of his arguments is grasped, it requires but a slight effort of the imagination to discover whom he is referring to. In the ninth paragraph of the Prologue, for instance, it is quite obvious that "Herdsmen" in the verse "Herdsmen, I say, etc., etc.," stands for all those to-day who are the advocates of gregariousness--of the ant-hill. And when our author says: "A robber shall Zarathustra be called by the herdsmen," it is clear that these words may be taken almost literally from one whose ideal was the rearing of a higher aristocracy. Again, "the good and just," throughout the book, is the expression used in referring to the self-righteous of modern times,-- those who are quite sure that they know all that is to be known concerning good and evil, and are satisfied that the values their little world of tradition has handed down to them, are destined to rule mankind as long as it lasts. 
In the last paragraph of the Prologue, verse 7, Zarathustra gives us a foretaste of his teaching concerning the big and the little sagacities, expounded subsequently. He says he would he were as wise as his serpent; this desire will be found explained in the discourse entitled "The Despisers of the Body", which I shall have occasion to refer to later. ... THE DISCOURSES. Chapter I. The Three Metamorphoses. This opening discourse is a parable in which Zarathustra discloses the mental development of all creators of new values. It is the story of a life which reaches its consummation in attaining to a second ingenuousness or in returning to childhood. Nietzsche, the supposed anarchist, here plainly disclaims all relationship whatever to anarchy, for he shows us that only by bearing the burdens of the existing law and submitting to it patiently, as the camel submits to being laden, does the free spirit acquire that ascendancy over tradition which enables him to meet and master the dragon "Thou shalt,"--the dragon with the values of a thousand years glittering on its scales. There are two lessons in this discourse: first, that in order to create one must be as a little child; secondly, that it is only through existing law and order that one attains to that height from which new law and new order may be promulgated. Chapter II. The Academic Chairs of Virtue. Almost the whole of this is quite comprehensible. It is a discourse against all those who confound virtue with tameness and smug ease, and who regard as virtuous only that which promotes security and tends to deepen sleep. Chapter IV. The Despisers of the Body. Here Zarathustra gives names to the intellect and the instincts; he calls the one "the little sagacity" and the latter "the big sagacity." Schopenhauer's teaching concerning the intellect is fully endorsed here. "An instrument of thy body is also thy little sagacity, my brother, which thou callest 'spirit,'" says Zarathustra. 
From beginning to end it is a warning to those who would think too lightly of the instincts and unduly exalt the intellect and its derivatives: Reason and Understanding. Chapter IX. The Preachers of Death. This is an analysis of the psychology of all those who have the "evil eye" and are pessimists by virtue of their constitutions. Chapter XV. The Thousand and One Goals. In this discourse Zarathustra opens his exposition of the doctrine of relativity in morality, and declares all morality to be a mere means to power. Needless to say that verses 9, 10, 11, and 12 refer to the Greeks, the Persians, the Jews, and the Germans respectively. In the penultimate verse he makes known his discovery concerning the root of modern Nihilism and indifference,--i.e., that modern man has no goal, no aim, no ideals (see Note A). Chapter XVIII. Old and Young Women. Nietzsche's views on women have either to be loved at first sight or they become perhaps the greatest obstacle in the way of those who otherwise would be inclined to accept his philosophy. Women especially, of course, have been taught to dislike them, because it has been rumoured that his views are unfriendly to themselves. Now, to my mind, all this is pure misunderstanding and error. German philosophers, thanks to Schopenhauer, have earned rather a bad name for their views on women. It is almost impossible for one of them to write a line on the subject, however kindly he may do so, without being suspected of wishing to open a crusade against the fair sex. Despite the fact, therefore, that all Nietzsche's views in this respect were dictated to him by the profoundest love; despite Zarathustra's reservation in this discourse, that "with women nothing (that can be said) is impossible," and in the face of other overwhelming evidence to the contrary, Nietzsche is universally reported to have mis son pied dans le plat (put his foot in it) where the female sex is concerned.
And what is the fundamental doctrine which has given rise to so much bitterness and aversion?--Merely this: that the sexes are at bottom ANTAGONISTIC--that is to say, as different as blue is from yellow, and that the best possible means of rearing anything approaching a desirable race is to preserve and to foster this profound hostility. What Nietzsche strives to combat and to overthrow is the modern democratic tendency which is slowly labouring to level all things--even the sexes. His quarrel is not with women--what indeed could be more undignified?--it is with those who would destroy the natural relationship between the sexes, by modifying either the one or the other with a view to making them more alike. The human world is just as dependent upon women's powers as upon men's. It is women's strongest and most valuable instincts which help to determine who are to be the fathers of the next generation. By destroying these particular instincts, that is to say by attempting to masculinise woman, and to feminise men, we jeopardise the future of our people. The general democratic movement of modern times, in its frantic struggle to mitigate all differences, is now invading even the world of sex. It is against this movement that Nietzsche raises his voice; he would have woman become ever more woman and man become ever more man. Only thus, and he is undoubtedly right, can their combined instincts lead to the excellence of humanity. Regarded in this light, all his views on woman appear not only necessary but just (see Note on Chapter LVI., par. 21.) It is interesting to observe that the last line of the discourse, which has so frequently been used by women as a weapon against Nietzsche's views concerning them, was suggested to Nietzsche by a woman (see "Das Leben F. Nietzsche's"). Chapter XXI. Voluntary Death. In regard to this discourse, I should only like to point out that Nietzsche had a particular aversion to the word "suicide"--self-murder. 
He disliked the evil it suggested, and in rechristening the act Voluntary Death, i.e., the death that comes from no other hand than one's own, he was desirous of elevating it to the position it held in classical antiquity (see Aphorism 36 in "The Twilight of the Idols"). Chapter XXII. The Bestowing Virtue. An important aspect of Nietzsche's philosophy is brought to light in this discourse. His teaching, as is well known, places the Aristotelian man of spirit, above all others in the natural divisions of man. The man with overflowing strength, both of mind and body, who must discharge this strength or perish, is the Nietzschean ideal. To such a man, giving from his overflow becomes a necessity; bestowing develops into a means of existence, and this is the only giving, the only charity, that Nietzsche recognises. In paragraph 3 of the discourse, we read Zarathustra's healthy exhortation to his disciples to become independent thinkers and to find themselves before they learn any more from him (see Notes on Chapters LVI., par. 5, and LXXIII., pars. 10, 11). ... PART II. Chapter XXIII. The Child with the Mirror. Nietzsche tells us here, in a poetical form, how deeply grieved he was by the manifold misinterpretations and misunderstandings which were becoming rife concerning his publications. He does not recognise himself in the mirror of public opinion, and recoils terrified from the distorted reflection of his features. In verse 20 he gives us a hint which it were well not to pass over too lightly; for, in the introduction to "The Genealogy of Morals" (written in 1887) he finds it necessary to refer to the matter again and with greater precision. The point is this, that a creator of new values meets with his surest and strongest obstacles in the very spirit of the language which is at his disposal. Words, like all other manifestations of an evolving race, are stamped with the values that have long been paramount in that race. 
Now, the original thinker who finds himself compelled to use the current speech of his country in order to impart new and hitherto untried views to his fellows, imposes a task upon the natural means of communication which it is totally unfitted to perform,--hence the obscurities and prolixities which are so frequently met with in the writings of original thinkers. In the "Dawn of Day", Nietzsche actually cautions young writers against THE DANGER OF ALLOWING THEIR THOUGHTS TO BE MOULDED BY THE WORDS AT THEIR DISPOSAL. Chapter XXIV. In the Happy Isles. While writing this, Nietzsche is supposed to have been thinking of the island of Ischia which was ultimately destroyed by an earthquake. His teaching here is quite clear. He was among the first thinkers of Europe to overcome the pessimism which godlessness generally brings in its wake. He points to creating as the surest salvation from the suffering which is a concomitant of all higher life. "What would there be to create," he asks, "if there were--Gods?" His ideal, the Superman, lends him the cheerfulness necessary to the overcoming of that despair usually attendant upon godlessness and upon the apparent aimlessness of a world without a god. Chapter XXIX. The Tarantulas. The tarantulas are the Socialists and Democrats. This discourse offers us an analysis of their mental attitude. Nietzsche refuses to be confounded with those resentful and revengeful ones who condemn society FROM BELOW, and whose criticism is only suppressed envy. "There are those who preach my doctrine of life," he says of the Nietzschean Socialists, "and are at the same time preachers of equality and tarantulas" (see Notes on Chapter XL. and Chapter LI.). Chapter XXX. The Famous Wise Ones. This refers to all those philosophers hitherto, who have run in the harness of established values and have not risked their reputation with the people in pursuit of truth. 
The philosopher, however, as Nietzsche understood him, is a man who creates new values, and thus leads mankind in a new direction. Chapter XXXIII. The Grave-Song. Here Zarathustra sings about the ideals and friendships of his youth. Verses 27 to 31 undoubtedly refer to Richard Wagner (see Note on Chapter LXV.). Chapter XXXIV. Self-Surpassing. In this discourse we get the best exposition in the whole book of Nietzsche's doctrine of the Will to Power. I go into this question thoroughly in the Note on Chapter LVII. Nietzsche was not an iconoclast from choice. Those who hastily class him with the anarchists (or the Progressivists of the last century) fail to understand the high esteem in which he always held both law and discipline. In verse 41 of this most decisive discourse he truly explains his position when he says: "...he who hath to be a creator in good and evil--verily he hath first to be a destroyer, and break values in pieces." This teaching in regard to self-control is evidence enough of his reverence for law. Chapter XXXV. The Sublime Ones. These belong to a type which Nietzsche did not altogether dislike, but which he would fain have rendered more subtle and plastic. It is the type that takes life and itself too seriously, that never surmounts the camel-stage mentioned in the first discourse, and that is obdurately sublime and earnest. To be able to smile while speaking of lofty things and NOT TO BE OPPRESSED by them, is the secret of real greatness. He whose hand trembles when it lays hold of a beautiful thing, has the quality of reverence, but lacks the artist's unembarrassed friendship with the beautiful. Hence the mistakes which have arisen from confounding Nietzsche with his extreme opposites, the anarchists and agitators.
For what they dare to touch and break with the impudence and irreverence of the unappreciative, he seems likewise to touch and break,--but with other fingers--with the fingers of the loving and unembarrassed artist who is on good terms with the beautiful and who feels able to create it and to enhance it with his touch. The question of taste plays an important part in Nietzsche's philosophy, and verses 9, 10 of this discourse exactly state Nietzsche's ultimate views on the subject. In the "Spirit of Gravity", he actually cries:--"Neither a good nor a bad taste, but MY taste, of which I have no longer either shame or secrecy." Chapter XXXVI. The Land of Culture. This is a poetical epitome of some of the scathing criticism of scholars which appears in the first of the "Thoughts out of Season"--the polemical pamphlet (written in 1873) against David Strauss and his school. He reproaches his former colleagues with being sterile and shows them that their sterility is the result of their not believing in anything. "He who had to create, had always his presaging dreams and astral premonitions--and believed in believing!" (See Note on Chapter LXXVII.) In the last two verses he reveals the nature of his altruism. How far it differs from that of Christianity we have already read in the discourse "Neighbour-Love", but here he tells us definitely the nature of his love to mankind; he explains why he was compelled to assail the Christian values of pity and excessive love of the neighbour, not only because they are slave-values and therefore tend to promote degeneration (see Note B.), but because he could only love his children's land, the undiscovered land in a remote sea; because he would fain retrieve the errors of his fathers in his children. Chapter XXXVII. Immaculate Perception. An important feature of Nietzsche's interpretation of Life is disclosed in this discourse. 
As Buckle suggests in his "Influence of Women on the Progress of Knowledge", the scientific spirit of the investigator is both helped and supplemented by the latter's emotions and personality, and the divorce of all emotionalism and individual temperament from science is a fatal step towards sterility. Zarathustra abjures all those who would fain turn an IMPERSONAL eye upon nature and contemplate her phenomena with that pure objectivity to which the scientific idealists of to-day would so much like to attain. He accuses such idealists of hypocrisy and guile; he says they lack innocence in their desires and therefore slander all desiring. Chapter XXXVIII. Scholars. This is a record of Nietzsche's final breach with his former colleagues-- the scholars of Germany. Already after the publication of the "Birth of Tragedy", numbers of German philologists and professional philosophers had denounced him as one who had strayed too far from their flock, and his lectures at the University of Bale were deserted in consequence; but it was not until 1879, when he finally severed all connection with University work, that he may be said to have attained to the freedom and independence which stamp this discourse. Chapter XXXIX. Poets. People have sometimes said that Nietzsche had no sense of humour. I have no intention of defending him here against such foolish critics; I should only like to point out to the reader that we have him here at his best, poking fun at himself, and at his fellow-poets (see Note on Chapter LXIII., pars. 16, 17, 18, 19, 20). Chapter XL. Great Events. Here we seem to have a puzzle. Zarathustra himself, while relating his experience with the fire-dog to his disciples, fails to get them interested in his narrative, and we also may be only too ready to turn over these pages under the impression that they are little more than a mere phantasy or poetical flight. Zarathustra's interview with the fire-dog is, however, of great importance. 
In it we find Nietzsche face to face with the creature he most sincerely loathes--the spirit of revolution, and we obtain fresh hints concerning his hatred of the anarchist and rebel. "'Freedom' ye all roar most eagerly," he says to the fire-dog, "but I have unlearned the belief in 'Great Events' when there is much roaring and smoke about them. Not around the inventors of new noise, but around the inventors of new values, doth the world revolve; INAUDIBLY it revolveth." Chapter XLI. The Soothsayer. This refers, of course, to Schopenhauer. Nietzsche, as is well known, was at one time an ardent follower of Schopenhauer. He overcame Pessimism by discovering an object in existence; he saw the possibility of raising society to a higher level and preached the profoundest Optimism in consequence. Chapter XLII. Redemption. Zarathustra here addresses cripples. He tells them of other cripples--the GREAT MEN in this world who have one organ or faculty inordinately developed at the cost of their other faculties. This is doubtless a reference to a fact which is too often noticeable in the case of so many of the world's giants in art, science, or religion. In verse 19 we are told what Nietzsche called Redemption--that is to say, the ability to say of all that is past: "Thus would I have it." The inability to say this, and the resentment which results therefrom, he regards as the source of all our feelings of revenge, and all our desires to punish--punishment meaning to him merely a euphemism for the word revenge, invented in order to still our consciences.
He who can be proud of his enemies, who can be grateful to them for the obstacles they have put in his way; he who can regard his worst calamity as but the extra strain on the bow of his life, which is to send the arrow of his longing even further than he could have hoped;--this man knows no revenge, neither does he know despair, he truly has found redemption and can turn on the worst in his life and even in himself, and call it his best (see Notes on Chapter LVII.). Chapter XLIII. Manly Prudence. This discourse is very important. In "Beyond Good and Evil" we hear often enough that the select and superior man must wear a mask, and here we find this injunction explained. "And he who would not languish amongst men, must learn to drink out of all glasses: and he who would keep clean amongst men, must know how to wash himself even with dirty water." This, I venture to suggest, requires some explanation. At a time when individuality is supposed to be shown most tellingly by putting boots on one's hands and gloves on one's feet, it is somewhat refreshing to come across a true individualist who feels the chasm between himself and others so deeply, that he must perforce adapt himself to them outwardly, at least, in all respects, so that the inner difference should be overlooked. Nietzsche practically tells us here that it is not he who intentionally wears eccentric clothes or does eccentric things who is truly the individualist. The profound man, who is by nature differentiated from his fellows, feels this difference too keenly to call attention to it by any outward show. He is shamefast and bashful with those who surround him and wishes not to be discovered by them, just as one instinctively avoids all lavish display of comfort or wealth in the presence of a poor friend. Chapter XLIV. The Stillest Hour. 
This seems to me to give an account of the great struggle which must have taken place in Nietzsche's soul before he finally resolved to make known the more esoteric portions of his teaching. Our deepest feelings crave silence. There is a certain self-respect in the serious man which makes him hold his profoundest feelings sacred. Before they are uttered they are full of the modesty of a virgin, and often the oldest sage will blush like a girl when this virginity is violated by an indiscretion which forces him to reveal his deepest thoughts. ... PART III. This is perhaps the most important of all the four parts. If it contained only "The Vision and the Enigma" and "The Old and New Tables" I should still be of this opinion; for in the former of these discourses we meet with what Nietzsche regarded as the crowning doctrine of his philosophy and in "The Old and New Tables" we have a valuable epitome of practically all his leading principles. Chapter XLVI. The Vision and the Enigma. "The Vision and the Enigma" is perhaps an example of Nietzsche in his most obscure vein. We must know how persistently he inveighed against the oppressing and depressing influence of man's sense of guilt and consciousness of sin in order fully to grasp the significance of this discourse. Slowly but surely, he thought the values of Christianity and Judaic traditions had done their work in the minds of men. What were once but expedients devised for the discipline of a certain portion of humanity, had now passed into man's blood and had become instincts. This oppressive and paralysing sense of guilt and of sin is what Nietzsche refers to when he speaks of "the spirit of gravity." This creature half-dwarf, half-mole, whom he bears with him a certain distance on his climb and finally defies, and whom he calls his devil and arch-enemy, is nothing more than the heavy millstone "guilty conscience," together with the concept of sin which at present hangs round the neck of men. 
To rise above it--to soar--is the most difficult of all things to-day. Nietzsche is able to think cheerfully and optimistically of the possibility of life in this world recurring again and again, when he has once cast the dwarf from his shoulders, and he announces his doctrine of the Eternal Recurrence of all things great and small to his arch-enemy and in defiance of him. That there is much to be said for Nietzsche's hypothesis of the Eternal Recurrence of all things great and small, nobody who has read the literature on the subject will doubt for an instant; but it remains a very daring conjecture notwithstanding and even in its ultimate effect, as a dogma, on the minds of men, I venture to doubt whether Nietzsche ever properly estimated its worth (see Note on Chapter LVII.). What follows is clear enough. Zarathustra sees a young shepherd struggling on the ground with a snake holding fast to the back of his throat. The sage, assuming that the snake must have crawled into the young man's mouth while he lay sleeping, runs to his help and pulls at the loathsome reptile with all his might, but in vain. At last, in despair, Zarathustra appeals to the young man's will. Knowing full well what a ghastly operation he is recommending, he nevertheless cries, "Bite! Bite! Its head off! Bite!" as the only possible solution of the difficulty. The young shepherd bites, and far away he spits the snake's head, whereupon he rises, "No longer shepherd, no longer man--a transfigured being, a light-surrounded being, that LAUGHED! Never on earth laughed a man as he laughed!" In this parable the young shepherd is obviously the man of to-day; the snake that chokes him represents the stultifying and paralysing social values that threaten to shatter humanity, and the advice "Bite! Bite!" is but Nietzsche's exasperated cry to mankind to alter their values before it is too late. Chapter XLVII. Involuntary Bliss. 
This, like "The Wanderer", is one of the many introspective passages in the work, and is full of innuendos and hints as to the Nietzschean outlook on life. Chapter XLVIII. Before Sunrise. Here we have a record of Zarathustra's avowal of optimism, as also the important statement concerning "Chance" or "Accident" (verse 27). Those who are familiar with Nietzsche's philosophy will not require to be told what an important role his doctrine of chance plays in his teaching. The Giant Chance has hitherto played with the puppet "man,"--this is the fact he cannot contemplate with equanimity. Man shall now exploit chance, he says again and again, and make it fall on its knees before him! (See verse 33 in "On the Olive Mount", and verses 9-10 in "The Bedwarfing Virtue"). Chapter XLIX. The Bedwarfing Virtue. This requires scarcely any comment. It is a satire on modern man and his belittling virtues. In verses 23 and 24 of the second part of the discourse we are reminded of Nietzsche's powerful indictment of the great of to-day, in the Antichrist (Aphorism 43):--"At present nobody has any longer the courage for separate rights, for rights of domination, for a feeling of reverence for himself and his equals,--FOR PATHOS OF DISTANCE...Our politics are MORBID from this want of courage!--The aristocracy of character has been undermined most craftily by the lie of the equality of souls; and if the belief in the 'privilege of the many,' makes revolutions and WILL CONTINUE TO MAKE them, it is Christianity, let us not doubt it, it is CHRISTIAN valuations, which translate every revolution merely into blood and crime!" (see also "Beyond Good and Evil", pages 120, 121). Nietzsche thought it was a bad sign of the times that even rulers have lost the courage of their positions, and that a man of Frederick the Great's power and distinguished gifts should have been able to say: "Ich bin der erste Diener des Staates" (I am the first servant of the State.) 
To this utterance of the great sovereign, verse 24 undoubtedly refers. "Cowardice" and "Mediocrity" are the names with which he labels modern notions of virtue and moderation. Chapter L. On the Olive Mount. Here we get the sentiments of the discourse "In the Happy Isles", but perhaps in stronger terms. Once again we find Nietzsche thoroughly at ease, if not cheerful, as an atheist, and speaking with vertiginous daring of making chance go on its knees to him. In verse 20, Zarathustra makes yet another attempt at defining his entirely anti-anarchical attitude, and unless such passages have been completely overlooked or deliberately ignored hitherto by those who will persist in laying anarchy at his door, it is impossible to understand how he ever became associated with that foul political party. The last verse introduces the expression, "THE GREAT NOONTIDE!" In the poem to be found at the end of "Beyond Good and Evil", we meet with the expression again, and we shall find it occurring time and again in Nietzsche's works. It will be found fully elucidated in the fifth part of "The Twilight of the Idols"; but for those who cannot refer to this book, it were well to point out that Nietzsche called the present period--our period--the noon of man's history. Dawn is behind us. The childhood of mankind is over. Now we KNOW; there is now no longer any excuse for mistakes which will tend to botch and disfigure the type man. "With respect to what is past," he says, "I have, like all discerning ones, great toleration, that is to say, GENEROUS self-control...But my feeling changes suddenly, and breaks out as soon as I enter the modern period, OUR period. Our age KNOWS..." (See Note on Chapter LXX.). Chapter LI. On Passing-by. Here we find Nietzsche confronted with his extreme opposite, with him therefore for whom he is most frequently mistaken by the unwary. "Zarathustra's ape" he is called in the discourse.
He is one of those at whose hands Nietzsche had to suffer most during his life-time, and at whose hands his philosophy has suffered most since his death. In this respect it may seem a little trivial to speak of extremes meeting; but it is wonderfully apt. Many have adopted Nietzsche's mannerisms and word-coinages, who had nothing in common with him beyond the ideas and "business" they plagiarised; but the superficial observer and a large portion of the public, not knowing of these things,--not knowing perhaps that there are iconoclasts who destroy out of love and are therefore creators, and that there are others who destroy out of resentment and revengefulness and who are therefore revolutionists and anarchists,--are prone to confound the two, to the detriment of the nobler type. If we now read what the fool says to Zarathustra, and note the tricks of speech he has borrowed from him; if we carefully follow the attitude he assumes, we shall understand why Zarathustra finally interrupts him. "Stop this at once," Zarathustra cries, "long have thy speech and thy species disgusted me...Out of love alone shall my contempt and my warning bird take wing; BUT NOT OUT OF THE SWAMP!" It were well if this discourse were taken to heart by all those who are too ready to associate Nietzsche with lesser and noisier men,--with mountebanks and mummers. Chapter LII. The Apostates. It is clear that this applies to all those breathless and hasty "tasters of everything," who plunge too rashly into the sea of independent thought and "heresy," and who, having miscalculated their strength, find it impossible to keep their heads above water. "A little older, a little colder," says Nietzsche. They soon clamber back to the conventions of the age they intended reforming. The French then say "le diable se fait hermite" (the devil turns hermit), but these men, as a rule, have never been devils, neither do they become angels; for, in order to be really good or evil, some strength and deep breathing is required.
Those who are more interested in supporting orthodoxy than in being over nice concerning the kind of support they give it, often refer to these people as evidence in favour of the true faith. Chapter LIII. The Return Home. This is an example of a class of writing which may be passed over too lightly by those whom poetasters have made distrustful of poetry. From first to last it is extremely valuable as an autobiographical note. The inevitable superficiality of the rabble is contrasted with the peaceful and profound depths of the anchorite. Here we first get a direct hint concerning Nietzsche's fundamental passion--the main force behind all his new values and scathing criticism of existing values. In verse 30 we are told that pity was his greatest danger. The broad altruism of the law-giver, thinking over vast eras of time, was continually being pitted by Nietzsche, in himself, against that transient and meaner sympathy for the neighbour which he more perhaps than any of his contemporaries had suffered from, but which he was certain involved enormous dangers not only for himself but also for the next and subsequent generations (see Note B., where "pity" is mentioned among the degenerate virtues). Later in the book we shall see how his profound compassion leads him into temptation, and how frantically he struggles against it. In verses 31 and 32, he tells us to what extent he had to modify himself in order to be endured by his fellows whom he loved (see also verse 12 in "Manly Prudence"). Nietzsche's great love for his fellows, which he confesses in the Prologue, and which is at the root of all his teaching, seems rather to elude the discerning powers of the average philanthropist and modern man. He cannot see the wood for the trees.
A philanthropy that sacrifices the minority of the present-day for the majority constituting posterity, completely evades his mental grasp, and Nietzsche's philosophy, because it declares Christian values to be a danger to the future of our kind, is therefore shelved as brutal, cold, and hard (see Note on Chapter XXXVI.). Nietzsche tried to be all things to all men; he was sufficiently fond of his fellows for that: in the Return Home he describes how he ultimately returns to loneliness in order to recover from the effects of his experiment. Chapter LIV. The Three Evil Things. Nietzsche is here completely in his element. Three things hitherto best-cursed and most calumniated on earth, are brought forward to be weighed. Voluptuousness, thirst of power, and selfishness,--the three forces in humanity which Christianity has done most to garble and besmirch,--Nietzsche endeavours to reinstate in their former places of honour. Voluptuousness, or sensual pleasure, is a dangerous thing to discuss nowadays. If we mention it with favour we may be regarded, however unjustly, as the advocate of savages, satyrs, and pure sensuality. If we condemn it, we either go over to the Puritans or we join those who are wont to come to table with no edge to their appetites and who therefore grumble at all good fare. There can be no doubt that the value of healthy innocent voluptuousness, like the value of health itself, must have been greatly discounted by all those who, resenting their inability to partake of this world's goods, cried like St Paul: "I would that all men were even as I myself."
Now Nietzsche's philosophy might be called an attempt at giving back to healthy and normal men innocence and a clean conscience in their desires--NOT to applaud the vulgar sensualists who respond to every stimulus and whose passions are out of hand; not to tell the mean, selfish individual, whose selfishness is a pollution (see Aphorism 33, "Twilight of the Idols"), that he is right, nor to assure the weak, the sick, and the crippled, that the thirst of power, which they gratify by exploiting the happier and healthier individuals, is justified;--but to save the clean healthy man from the values of those around him, who look at everything through the mud that is in their own bodies,--to give him, and him alone, a clean conscience in his manhood and the desires of his manhood. "Do I counsel you to slay your instincts? I counsel to innocence in your instincts." In verse 7 of the second paragraph (as in verse 1 of paragraph 19 in "The Old and New Tables") Nietzsche gives us a reason for his occasional obscurity (see also verses 3 to 7 of "Poets"). As I have already pointed out, his philosophy is quite esoteric. It can serve no purpose with the ordinary, mediocre type of man. I, personally, can no longer have any doubt that Nietzsche's only object, in that part of his philosophy where he bids his friends stand "Beyond Good and Evil" with him, was to save higher men, whose growth and scope might be limited by the too strict observance of modern values, from foundering on the rocks of a "Compromise" between their own genius and traditional conventions. The only possible way in which the great man can achieve greatness is by means of exceptional freedom--the freedom which assists him in experiencing HIMSELF. Verses 20 to 30 afford an excellent supplement to Nietzsche's description of the attitude of the noble type towards the slaves in Aphorism 260 of the work "Beyond Good and Evil" (see also Note B.) Chapter LV. The Spirit of Gravity. (See Note on Chapter XLVI.)
In Part II. of this discourse we meet with a doctrine not touched upon hitherto, save indirectly;--I refer to the doctrine of self-love. We should try to understand this perfectly before proceeding; for it is precisely views of this sort which, after having been cut out of the original context, are repeated far and wide as internal evidence proving the general unsoundness of Nietzsche's philosophy. Already in the last of the "Thoughts out of Season" Nietzsche speaks as follows about modern men: "...these modern creatures wish rather to be hunted down, wounded and torn to shreds, than to live alone with themselves in solitary calm. Alone with oneself!--this thought terrifies the modern soul; it is his one anxiety, his one ghastly fear" (English Edition, page 141). In his feverish scurry to find entertainment and diversion, whether in a novel, a newspaper, or a play, the modern man condemns his own age utterly; for he shows that in his heart of hearts he despises himself. One cannot change a condition of this sort in a day; to become endurable to oneself an inner transformation is necessary. Too long have we lost ourselves in our friends and entertainments to be able to find ourselves so soon at another's bidding. "And verily, it is no commandment for to-day and to-morrow to LEARN to love oneself. Rather is it of all arts the finest, subtlest, last, and patientest." In the last verse Nietzsche challenges us to show that our way is the right way. In his teaching he does not coerce us, nor does he overpersuade; he simply says: "I am a law only for mine own, I am not a law for all. This --is now MY way,--where is yours?" Chapter LVI. Old and New Tables. Par. 2. Nietzsche himself declares this to be the most decisive portion of the whole of "Thus Spake Zarathustra". It is a sort of epitome of his leading doctrines. 
In verse 12 of the second paragraph, we learn how he himself would fain have abandoned the poetical method of expression had he not known only too well that the only chance a new doctrine has of surviving, nowadays, depends upon its being given to the world in some kind of art-form. Just as prophets, centuries ago, often had to have recourse to the mask of madness in order to mitigate the hatred of those who did not and could not see as they did; so, to-day, the struggle for existence among opinions and values is so great, that an art-form is practically the only garb in which a new philosophy can dare to introduce itself to us.

Pars. 3 and 4.

Many of the paragraphs will be found to be merely reminiscent of former discourses. For instance, par. 3 recalls "Redemption". The last verse of par. 4 is important. Freedom which, as I have pointed out before, Nietzsche considered a dangerous acquisition in inexperienced or unworthy hands, here receives its death-blow as a general desideratum. In the first Part we read under "The Way of the Creating One", that freedom as an end in itself does not concern Zarathustra at all. He says there: "Free from what? What doth that matter to Zarathustra? Clearly, however, shall thine eye answer me: free FOR WHAT?" And in "The Bedwarfing Virtue": "Ah that ye understood my word: 'Do ever what ye will--but first be such as CAN WILL.'"

Par. 5.

Here we have a description of the kind of altruism Nietzsche exacted from higher men. It is really a comment upon "The Bestowing Virtue" (see Note on Chapter XXII.).

Par. 6.

This refers, of course, to the reception pioneers of Nietzsche's stamp meet with at the hands of their contemporaries.

Par. 8.

Nietzsche teaches that nothing is stable,--not even values,--not even the concepts good and evil. He likens life unto a stream. But foot-bridges and railings span the stream, and they seem to stand firm.
Many will be reminded of good and evil when they look upon these structures; for thus these same values stand over the stream of life, and life flows on beneath them and leaves them standing. When, however, winter comes and the stream gets frozen, many inquire: "Should not everything--STAND STILL? Fundamentally everything standeth still." But soon the spring cometh and with it the thaw-wind. It breaks the ice, and the ice breaks down the foot-bridges and railings, whereupon everything is swept away. This state of affairs, according to Nietzsche, has now been reached. "Oh, my brethren, is not everything AT PRESENT IN FLUX? Have not all railings and foot-bridges fallen into the water? Who would still HOLD ON to 'good' and 'evil'?"

Par. 9.

This is complementary to the first three verses of par. 2.

Par. 10.

So far, this is perhaps the most important paragraph. It is a protest against reading a moral order of things in life. "Life is something essentially immoral!" Nietzsche tells us in the introduction to the "Birth of Tragedy". Even to call life "activity," or to define it further as "the continuous adjustment of internal relations to external relations," as Spencer has it, Nietzsche characterises as a "democratic idiosyncrasy." He says to define it in this way "is to mistake the true nature and function of life, which is Will to Power...Life is ESSENTIALLY appropriation, injury, conquest of the strange and weak, suppression, severity, obtrusion of its own forms, incorporation and at the least, putting it mildest, exploitation." Adaptation is merely a secondary activity, a mere re-activity (see Note on Chapter LVII.).

Pars. 11, 12.

These deal with Nietzsche's principle of the desirability of rearing a select race. The biological and historical grounds for his insistence upon this principle are, of course, manifold. Gobineau in his great work, "L'Inegalite des Races Humaines", lays strong emphasis upon the evils which arise from promiscuous and inter-social marriages.
He alone would suffice to carry Nietzsche's point against all those who are opposed to the other conditions, to the conditions which would have saved Rome, which have maintained the strength of the Jewish race, and which are strictly maintained by every breeder of animals throughout the world. Darwin, in his remarks relative to the degeneration of CULTIVATED types of animals through the action of promiscuous breeding, brings Gobineau support from the realm of biology. The last two verses of par. 12 were discussed in the Notes on Chapters XXXVI. and LIII.

Par. 13.

This, like the first part of "The Soothsayer", is obviously a reference to the Schopenhauerian Pessimism.

Pars. 14, 15, 16, 17.

These are supplementary to the discourse "Backworld's-men".

Par. 18.

We must be careful to separate this paragraph, in sense, from the previous four paragraphs. Nietzsche is still dealing with Pessimism here; but it is the pessimism of the hero--the man most susceptible of all to desperate views of life, owing to the obstacles that are arrayed against him in a world where men of his kind are very rare and are continually being sacrificed. It was to save this man that Nietzsche wrote. Heroism foiled, thwarted, and wrecked, hoping and fighting until the last, is at length overtaken by despair, and renounces all struggle for sleep. This is not the natural or constitutional pessimism which proceeds from an unhealthy body--the dyspeptic's lack of appetite; it is rather the desperation of the netted lion that ultimately stops all movement, because the more it moves the more involved it becomes.

Par. 20.

"All that increases power is good, all that springs from weakness is bad. The weak and ill-constituted shall perish: first principle of our charity. And one shall also help them thereto." Nietzsche partly divined the kind of reception moral values of this stamp would meet with at the hands of the effeminate manhood of Europe.
Here we see that he had anticipated the most likely form their criticism would take (see also the last two verses of par. 17).

Par. 21.

The first ten verses, here, are reminiscent of "War and Warriors" and of "The Flies in the Market-Place." Verses 11 and 12, however, are particularly important. There is a strong argument in favour of the sharp differentiation of castes and of races (and even of sexes; see Note on Chapter XVIII.) running all through Nietzsche's writings. But sharp differentiation also implies antagonism in some form or other--hence Nietzsche's fears for modern men. What modern men desire above all is peace and the cessation of pain. But neither great races nor great castes have ever been built up in this way. "Who still wanteth to rule?" Zarathustra asks in the "Prologue". "Who still wanteth to obey? Both are too burdensome." This is rapidly becoming everybody's attitude to-day. The tame moral reading of the face of nature, together with such democratic interpretations of life as those suggested by Herbert Spencer, are signs of a physiological condition which is the reverse of that bounding and irresponsible healthiness in which harder and more tragic values rule.

Par. 24.

This should be read in conjunction with "Child and Marriage". In the fifth verse we shall recognise our old friend "Marriage on the ten-years system," which George Meredith suggested some years ago. This, however, must not be taken too literally. I do not think Nietzsche's profoundest views on marriage were ever intended to be given over to the public at all, at least not for the present. They appear in the biography by his sister, and although their wisdom is unquestionable, the nature of the reforms he suggests renders it impossible for them to become popular just now.

Pars. 26, 27.

See Note on "The Prologue".

Par. 28.

Nietzsche was not an iconoclast from predilection.
No bitterness or empty hate dictated his vituperations against existing values and against the dogmas of his parents and forefathers. He knew too well what these things meant to the millions who profess them, to approach the task of uprooting them with levity or even with haste. He saw what modern anarchists and revolutionists do NOT see--namely, that man is in danger of actual destruction when his customs and values are broken. I need hardly point out, therefore, how deeply he was conscious of the responsibility he threw upon our shoulders when he invited us to reconsider our position. The lines in this paragraph are evidence enough of his earnestness.

Chapter LVII. The Convalescent.

We meet with several puzzles here. Zarathustra calls himself the advocate of the circle (the Eternal Recurrence of all things), and he calls this doctrine his abysmal thought. In the last verse of the first paragraph, however, after hailing his deepest thought, he cries: "Disgust, disgust, disgust!" We know Nietzsche's ideal man was that "world-approving, exuberant, and vivacious creature, who has not only learnt to compromise and arrange with that which was and is, but wishes to have it again, AS IT WAS AND IS, for all eternity insatiably calling out da capo, not only to himself, but to the whole piece and play" (see Note on Chapter XLII.). But if one ask oneself what the conditions to such an attitude are, one will realise immediately how utterly different Nietzsche was from his ideal. The man who insatiably cries da capo to himself and to the whole of his mise-en-scene must be in a position to desire every incident in his life to be repeated, not once, but again and again eternally. Now, Nietzsche's life had been too full of disappointments, illness, unsuccessful struggles, and snubs, to allow of his thinking of the Eternal Recurrence without loathing--hence probably the words of the last verse.
In verses 15 and 16, we have Nietzsche declaring himself an evolutionist in the broadest sense--that is to say, that he believes in the Development Hypothesis as the description of the process by which species have originated. Now, to understand his position correctly we must show his relationship to the two greatest of modern evolutionists--Darwin and Spencer. As a philosopher, however, Nietzsche does not stand or fall by his objections to the Darwinian or Spencerian cosmogony. He never laid claim to a very profound knowledge of biology, and his criticism is far more valuable as the attitude of a fresh mind than as that of a specialist towards the question. Moreover, in his objections many difficulties are raised which are not settled by an appeal to either of the men above mentioned. We have given Nietzsche's definition of life in the Note on Chapter LVI., par. 10. Still, there remains a hope that Darwin and Nietzsche may some day become reconciled by a new description of the processes by which varieties occur. The appearance of varieties among animals and of "sporting plants" in the vegetable kingdom, is still shrouded in mystery, and the question whether this is not precisely the ground on which Darwin and Nietzsche will meet, is an interesting one. The former says in his "Origin of Species", concerning the causes of variability: "...there are two factors, namely, the nature of the organism, and the nature of the conditions. THE FORMER SEEMS TO BE MUCH THE MORE IMPORTANT (The italics are mine.), for nearly similar variations sometimes arise under, as far as we can judge, dissimilar conditions; and on the other hand, dissimilar variations arise under conditions which appear to be nearly uniform." 
Nietzsche, recognising this same truth, would ascribe practically all the importance to the "highest functionaries in the organism, in which the life-will appears as an active and formative principle," and except in certain cases (where passive organisms alone are concerned) would not give such a prominent place to the influence of environment. Adaptation, according to him, is merely a secondary activity, a mere re-activity, and he is therefore quite opposed to Spencer's definition: "Life is the continuous adjustment of internal relations to external relations." Again in the motive force behind animal and plant life, Nietzsche disagrees with Darwin. He transforms the "Struggle for Existence"--the passive and involuntary condition--into the "Struggle for Power," which is active and creative, and much more in harmony with Darwin's own view, given above, concerning the importance of the organism itself. The change is one of such far-reaching importance that we cannot dispose of it in a breath, as a mere play upon words. "Much is reckoned higher than life itself by the living one." Nietzsche says that to speak of the activity of life as a "struggle for existence," is to state the case inadequately. He warns us not to confound Malthus with nature. There is something more than this struggle between the organic beings on this earth; want, which is supposed to bring this struggle about, is not so common as is supposed; some other force must be operative. The Will to Power is this force, "the instinct of self-preservation is only one of the indirect and most frequent results thereof." A certain lack of acumen in psychological questions and the condition of affairs in England at the time Darwin wrote, may both, according to Nietzsche, have induced the renowned naturalist to describe the forces of nature as he did in his "Origin of Species". 
In verses 28, 29, and 30 of the second portion of this discourse we meet with a doctrine which, at first sight, seems to be merely "le manoir a l'envers"; indeed one English critic has actually said of Nietzsche that "Thus Spake Zarathustra" is no more than a compendium of modern views and maxims turned upside down. Examining these heterodox pronouncements a little more closely, however, we may possibly perceive their truth. Regarding good and evil as purely relative values, it stands to reason that what may be bad or evil in a given man, relative to a certain environment, may actually be good if not highly virtuous in him relative to a certain other environment. If this hypothetical man represent the ascending line of life--that is to say, if he promise all that which is highest in a Graeco-Roman sense--then it is likely that he will be condemned as wicked if introduced into the society of men representing the opposite and descending line of life. By depriving a man of his wickedness--more particularly nowadays--therefore, one may unwittingly be doing violence to the greatest in him. It may be an outrage against his wholeness, just as the lopping-off of a leg would be. Fortunately, the natural so-called "wickedness" of higher men has in a certain measure been able to resist this lopping process which successive slave-moralities have practised; but signs are not wanting which show that the noblest wickedness--the wickedness of courage and determination--is fast vanishing from society, and that Nietzsche had good reasons for crying: "Ah, that (man's) baddest is so very small! Ah, that his best is so very small. What is good? To be brave is good! It is the good war which halloweth every cause!" (see also par. 5, "Higher Man").

Chapter LX. The Seven Seals.

This is a final paean which Zarathustra sings to Eternity and the marriage-ring of rings, the ring of the Eternal Recurrence.

...

PART IV.
In my opinion this part is Nietzsche's open avowal that all his philosophy, together with all his hopes, enthusiastic outbursts, blasphemies, prolixities, and obscurities, were merely so many gifts laid at the feet of higher men. He had no desire to save the world. What he wished to determine was: Who is to be master of the world? This is a very different thing. He came to save higher men;--to give them that freedom by which, alone, they can develop and reach their zenith (see Note on Chapter LIV., end). It has been argued, and with considerable force, that no such philosophy is required by higher men; that, as a matter of fact, higher men, by virtue of their constitutions, always do stand Beyond Good and Evil, and never allow anything to stand in the way of their complete growth. Nietzsche, however, was evidently not so confident about this. He would probably have argued that we only see the successful cases. Being a great man himself, he was well aware of the dangers threatening greatness in our age. In "Beyond Good and Evil" he writes: "There are few pains so grievous as to have seen, divined, or experienced how an exceptional man has missed his way and deteriorated..." He knew "from his painfullest recollections on what wretched obstacles promising developments of the highest rank have hitherto usually gone to pieces, broken down, sunk, and become contemptible." Now in Part IV. we shall find that his strongest temptation to descend to the feeling of "pity" for his contemporaries is the "cry for help" which he hears from the lips of the higher men exposed to the dreadful danger of their modern environment.

Chapter LXI. The Honey Sacrifice.

In the fourteenth verse of this discourse Nietzsche defines the solemn duty he imposed upon himself: "Become what thou art."
Surely the criticism which has been directed against this maxim must all fall to the ground when it is remembered, once and for all, that Nietzsche's teaching was never intended to be other than an esoteric one. "I am a law only for mine own," he says emphatically, "I am not a law for all." It is of the greatest importance to humanity that its highest individuals should be allowed to attain to their full development; for only by means of its heroes can the human race be led forward step by step to higher and yet higher levels. "Become what thou art," applied to all, of course becomes a vicious maxim; it is to be hoped, however, that we may learn in time that the same action performed by a given number of men loses its identity precisely that same number of times.--"Quod licet Jovi, non licet bovi."

At the last eight verses many readers may be tempted to laugh. In England we almost always laugh when a man takes himself seriously at anything save sport. And there is of course no reason why the reader should not be hilarious.--A certain greatness is requisite, both in order to be sublime and to have reverence for the sublime. Nietzsche earnestly believed that the Zarathustra-kingdom--his dynasty of a thousand years--would one day come; if he had not believed it so earnestly, if every artist in fact had not believed so earnestly in his Hazar, whether of ten, fifteen, a hundred, or a thousand years, we should have lost all our higher men; they would have become pessimists, suicides, or merchants. If the minor poet and philosopher has made us shy of the prophetic seriousness which characterised an Isaiah or a Jeremiah, it is surely our loss and the minor poet's gain.

Chapter LXII. The Cry of Distress.

We now meet with Zarathustra in extraordinary circumstances. He is confronted with Schopenhauer and tempted by the old Soothsayer to commit the sin of pity. "I have come that I may seduce thee to thy last sin!" says the Soothsayer to Zarathustra.
It will be remembered that in Schopenhauer's ethics, pity is elevated to the highest place among the virtues, and very consistently too, seeing that the Weltanschauung is a pessimistic one. Schopenhauer appeals to Nietzsche's deepest and strongest sentiment--his sympathy for higher men. "Why dost thou conceal thyself?" he cries. "It is THE HIGHER MAN that calleth for thee!" Zarathustra is almost overcome by the Soothsayer's pleading, as he had been once already in the past, but he resists him step by step. At length he can withstand him no longer, and, on the plea that the higher man is on his ground and therefore under his protection, Zarathustra departs in search of him, leaving Schopenhauer--a higher man in Nietzsche's opinion--in the cave as a guest.

Chapter LXIII. Talk with the Kings.

On his way Zarathustra meets two more higher men of his time; two kings cross his path. They are above the average modern type; for their instincts tell them what real ruling is, and they despise the mockery which they have been taught to call "Reigning." "We ARE NOT the first men," they say, "and have nevertheless to STAND FOR them: of this imposture have we at last become weary and disgusted." It is the kings who tell Zarathustra: "There is no sorer misfortune in all human destiny than when the mighty of the earth are not also the first men. There everything becometh false and distorted and monstrous." The kings are also asked by Zarathustra to accept the shelter of his cave, whereupon he proceeds on his way.

Chapter LXIV. The Leech.

Among the higher men whom Zarathustra wishes to save is also the scientific specialist--the man who honestly and scrupulously pursues his investigations, as Darwin did, in one department of knowledge. "I love him who liveth in order to know, and seeketh to know in order that the Superman may hereafter live. Thus seeketh he his own down-going." "The spiritually conscientious one," he is called in this discourse.
Zarathustra steps on him unawares, and the slave of science, bleeding from the violence he has done to himself by his self-imposed task, speaks proudly of his little sphere of knowledge--his little hand's breadth of ground on Zarathustra's territory, philosophy. "Where mine honesty ceaseth," says the true scientific specialist, "there am I blind and want also to be blind. Where I want to know, however, there want I also to be honest--namely, severe, rigorous, restricted, cruel, and inexorable." Zarathustra, greatly respecting this man, invites him too to the cave, and then vanishes in answer to another cry for help.

Chapter LXV. The Magician.

The Magician is of course an artist, and Nietzsche's intimate knowledge of perhaps the greatest artist of his age rendered the selection of Wagner, as the type in this discourse, almost inevitable. Most readers will be acquainted with the facts relating to Nietzsche's and Wagner's friendship and ultimate separation. As a boy and a youth Nietzsche had shown such a remarkable gift for music that it had been a question at one time whether he should not perhaps give up everything else in order to develop this gift; but he became a scholar notwithstanding, although he never entirely gave up composing and playing the piano. While still in his teens, he became acquainted with Wagner's music and grew passionately fond of it. Long before he met Wagner he must have idealised him in his mind to an extent which only a profoundly artistic nature could have been capable of. Nietzsche always had high ideals for humanity. If one were asked whether, throughout his many changes, there was yet one aim, one direction, and one hope to which he held fast, one would be forced to reply in the affirmative and declare that aim, direction, and hope to have been "the elevation of the type man."
Now, when Nietzsche met Wagner he was actually casting about for an incarnation of his dreams for the German people, and we have only to remember his youth (he was twenty-one when he was introduced to Wagner), his love of Wagner's music, and the undoubted power of the great musician's personality, in order to realise how very uncritical his attitude must have been in the first flood of his enthusiasm. Again, when the friendship ripened, we cannot well imagine Nietzsche, the younger man, being anything less than intoxicated by his senior's attention and love, and we are therefore not surprised to find him pressing Wagner forward as the great Reformer and Saviour of mankind. "Wagner in Bayreuth" (English Edition, 1909) gives us the best proof of Nietzsche's infatuation, and although signs are not wanting in this essay which show how clearly and even cruelly he was sub-consciously "taking stock" of his friend--even then, the work is a record of what great love and admiration can do in the way of endowing the object of one's affection with all the qualities and ideals that a fertile imagination can conceive. When the blow came it was therefore all the more severe. Nietzsche at length realised that the friend of his fancy and the real Richard Wagner-- the composer of Parsifal--were not one; the fact dawned upon him slowly; disappointment upon disappointment, revelation after revelation, ultimately brought it home to him, and though his best instincts were naturally opposed to it at first, the revulsion of feeling at last became too strong to be ignored, and Nietzsche was plunged into the blackest despair. Years after his break with Wagner, he wrote "The Case of Wagner", and "Nietzsche contra Wagner", and these works are with us to prove the sincerity and depth of his views on the man who was the greatest event of his life. 
The poem in this discourse is, of course, reminiscent of Wagner's own poetical manner, and it must be remembered that the whole was written subsequent to Nietzsche's final break with his friend. The dialogue between Zarathustra and the Magician reveals pretty fully what it was that Nietzsche grew to loathe so intensely in Wagner--viz., his pronounced histrionic tendencies, his dissembling powers, his inordinate vanity, his equivocalness, his falseness. "It honoureth thee," says Zarathustra, "that thou soughtest for greatness, but it betrayeth thee also. Thou art not great." The Magician is nevertheless sent as a guest to Zarathustra's cave; for, in his heart, Zarathustra believed until the end that the Magician was a higher man broken by modern values.

Chapter LXVI. Out of Service.

Zarathustra now meets the last pope, and, in a poetical form, we get Nietzsche's description of the course Judaism and Christianity pursued before they reached their final break-up in Atheism, Agnosticism, and the like. The God of a strong, warlike race--the God of Israel--is a jealous, revengeful God. He is a power that can be pictured and endured only by a hardy and courageous race, a race rich enough to sacrifice and to lose in sacrifice. The image of this God degenerates with the people that appropriate it, and gradually He becomes a God of love--"soft and mellow," a lower middle-class deity, who is "pitiful." He can no longer be a God who requires sacrifice, for we ourselves are no longer rich enough for that. The tables are therefore turned upon Him; HE must sacrifice to us. His pity becomes so great that he actually does sacrifice something to us--His only begotten Son. Such a process carried to its logical conclusion must ultimately end in His own destruction, and thus we find the pope declaring that God was one day suffocated by His all-too-great pity. What follows is clear enough.
Zarathustra recognises another higher man in the ex-pope and sends him too as a guest to the cave.

Chapter LXVII. The Ugliest Man.

This discourse contains perhaps the boldest of Nietzsche's suggestions concerning Atheism, as well as some extremely penetrating remarks upon the sentiment of pity. Zarathustra comes across the repulsive creature sitting on the wayside, and what does he do? He manifests the only correct feelings that can be manifested in the presence of any great misery--that is to say, shame, reverence, embarrassment. Nietzsche detested the obtrusive and gushing pity that goes up to misery without a blush either on its cheek or in its heart--the pity which is only another form of self-glorification. "Thank God that I am not like thee!"--only this self-glorifying sentiment can lend a well-constituted man the impudence to SHOW his pity for the cripple and the ill-constituted. In the presence of the ugliest man Nietzsche blushes,--he blushes for his race; his own particular kind of altruism--the altruism that might have prevented the existence of this man--strikes him with all its force. He will have the world otherwise. He will have a world where one need not blush for one's fellows--hence his appeal to us to love only our children's land, the land undiscovered in the remotest sea.

Zarathustra calls the ugliest man the murderer of God! Certainly, this is one aspect of a certain kind of Atheism--the Atheism of the man who reveres beauty to such an extent that his own ugliness, which outrages him, must be concealed from every eye lest it should not be respected as Zarathustra respected it. If there be a God, He too must be evaded. His pity must be foiled. But God is ubiquitous and omniscient. Therefore, for the really GREAT ugly man, He must not exist. "Their pity IS it from which I flee away," he says--that is to say: "It is from their want of reverence and lack of shame in presence of my great misery!"
The ugliest man despises himself; but Zarathustra said in his Prologue: "I love the great despisers because they are the great adorers, and arrows of longing for the other shore." He therefore honours the ugliest man: sees height in his self-contempt, and invites him to join the other higher men in the cave.

Chapter LXVIII. The Voluntary Beggar.

In this discourse, we undoubtedly have the ideal Buddhist, if not Gautama Buddha himself. Nietzsche had the greatest respect for Buddhism, and almost wherever he refers to it in his works, it is in terms of praise. He recognised that though Buddhism is undoubtedly a religion for decadents, its decadent values emanate from the higher and not, as in Christianity, from the lower grades of society. In Aphorism 20 of "The Antichrist", he compares it exhaustively with Christianity, and the result of his investigation is very much in favour of the older religion. Still, he recognised a most decided Buddhistic influence in Christ's teaching, and the words in verses 29, 30, and 31 are very reminiscent of his views in regard to the Christian Saviour. The figure of Christ has been introduced often enough into fiction, and many scholars have undertaken to write His life according to their own lights, but few perhaps have ever attempted to present Him to us bereft of all those characteristics which a lack of the sense of harmony has attached to His person through the ages in which His doctrines have been taught.
Now Nietzsche disagreed entirely with Renan's view that Christ was "le grand maitre en ironie"; in Aphorism 31 of "The Antichrist", he says that he (Nietzsche) always purged his picture of the Humble Nazarene of all those bitter and spiteful outbursts which, in view of the struggle the first Christians went through, may very well have been added to the original character by Apologists and Sectarians who, at that time, could ill afford to consider nice psychological points, seeing that what they needed, above all, was a wrangling and abusive deity. These two conflicting halves in the character of the Christ of the Gospels, which no sound psychology can ever reconcile, Nietzsche always kept distinct in his own mind; he could not credit the same man with sentiments sometimes so noble and at other times so vulgar, and in presenting us with this new portrait of the Saviour, purged of all impurities, Nietzsche rendered military honours to a foe which far exceed in worth all that His most ardent disciples have ever claimed for Him. In verse 26 we are vividly reminded of Herbert Spencer's words: "'Le mariage de convenance' is legalised prostitution."

Chapter LXIX. The Shadow.

Here we have a description of that courageous and wayward spirit that literally haunts the footsteps of every great thinker and every great leader; sometimes with the result that it loses all aims, all hopes, and all trust in a definite goal. It is the case of the bravest and most broad-minded men of to-day. These literally shadow the most daring movements in the science and art of their generation; they completely lose their bearings and actually find themselves, in the end, without a way, a goal, or a home. "On every surface have I already sat!...I become thin, I am almost equal to a shadow!" At last, in despair, such men do indeed cry out: "Nothing is true; all is permitted," and then they become mere wreckage. "Too much hath become clear unto me: now nothing mattereth to me any more.
Nothing liveth any longer that I love,--how should I still love myself! Have I still a goal? Where is MY home?" Zarathustra realises the danger threatening such a man. "Thy danger is not small, thou free spirit and wanderer," he says. "Thou hast had a bad day. See that a still worse evening doth not overtake thee!" The danger Zarathustra refers to is precisely this, that even a prison may seem a blessing to such a man. At least the bars keep him in a place of rest; a place of confinement, at its worst, is real. "Beware lest in the end a narrow faith capture thee," says Zarathustra, "for now everything that is narrow and fixed seduceth and tempteth thee."

Chapter LXX. Noontide.

At the noon of life Nietzsche said he entered the world; with him man came of age. We are now held responsible for our actions; our old guardians, the gods and demi-gods of our youth, the superstitions and fears of our childhood, withdraw; the field lies open before us; we lived through our morning with but one master--chance; let us see to it that we MAKE our afternoon our own (see Note XLIX., Part III.).

Chapter LXXI. The Greeting.

Here I think I may claim that my contention in regard to the purpose and aim of the whole of Nietzsche's philosophy (as stated at the beginning of my Notes on Part IV.) is completely upheld. He fought for "all who do not want to live, unless they learn again to HOPE--unless THEY learn (from him) the GREAT hope!" Zarathustra's address to his guests shows clearly enough how he wished to help them: "I DO NOT TREAT MY WARRIORS INDULGENTLY," he says: "how then could ye be fit for MY warfare?" He rebukes and spurns them; no word of love comes from his lips. Elsewhere he says a man should be a hard bed to his friend; thus alone can he be of use to him. Nietzsche would be a hard bed to higher men. He would make them harder; for, in order to be a law unto himself, man must possess the requisite hardness.
"I wait for higher ones, stronger ones, more triumphant ones, merrier ones, for such as are built squarely in body and soul." He says in par. 6 of "Higher Man":-- "Ye higher men, think ye that I am here to put right what ye have put wrong? Or that I wished henceforth to make snugger couches for you sufferers? Or show you restless, miswandering, misclimbing ones new and easier footpaths?" "Nay! Nay! Three times nay! Always more, always better ones of your type shall succumb--for ye shall always have it worse and harder." Chapter LXXII. The Supper. In the first seven verses of this discourse, I cannot help seeing a gentle allusion to Schopenhauer's habits as a bon-vivant. For a pessimist, be it remembered, Schopenhauer led quite an extraordinary life. He ate well, loved well, played the flute well, and I believe he smoked the best cigars. What follows is clear enough. Chapter LXXIII. The Higher Man. Par. 1. Nietzsche admits, here, that at one time he had thought of appealing to the people, to the crowd in the market-place, but that he had ultimately to abandon the task. He bids higher men depart from the market-place. Par. 3. Here we are told quite plainly what class of men actually owe all their impulses and desires to the instinct of self-preservation. The struggle for existence is indeed the only spur in the case of such people. To them it matters not in what shape or condition man be preserved, provided only he survive. The transcendental maxim that "Life per se is precious" is the ruling maxim here. Par. 4. In the Note on Chapter LVII. (end) I speak of Nietzsche's elevation of the virtue, Courage, to the highest place among the virtues. Here he tells higher men the class of courage he expects from them. Pars. 5, 6. These have already been referred to in the Notes on Chapters LVII. (end) and LXXI. Par. 7. 
I suggest that the last verse in this paragraph strongly confirms the view that Nietzsche's teaching was always meant by him to be esoteric and for higher man alone. Par. 9. In the last verse, here, another shaft of light is thrown upon the Immaculate Perception or so-called "pure objectivity" of the scientific mind. "Freedom from fever is still far from being knowledge." Where a man's emotions cease to accompany him in his investigations, he is not necessarily nearer the truth. Says Spencer, in the Preface to his Autobiography:--"In the genesis of a system of thought, the emotional nature is a large factor: perhaps as large a factor as the intellectual nature" (see pages 134, 141 of Vol. I., "Thoughts out of Season"). Pars. 10, 11. When we approach Nietzsche's philosophy we must be prepared to be independent thinkers; in fact, the greatest virtue of his works is perhaps the subtlety with which they impose the obligation upon one of thinking alone, of scoring off one's own bat, and of shifting intellectually for oneself. Pars. 13, 14. "I am a railing alongside the torrent; whoever is able to grasp me, may grasp me! Your crutch, however, I am not." These two paragraphs are an exhortation to higher men to become independent. Par. 15. Here Nietzsche perhaps exaggerates the importance of heredity. As, however, the question is by no means one on which we are all agreed, what he says is not without value. A very important principle in Nietzsche's philosophy is enunciated in the first verse of this paragraph. "The higher its type, always the seldomer doth a thing succeed" (see page 82 of "Beyond Good and Evil"). Those who, like some political economists, talk in a business-like way about the terrific waste of human life and energy, deliberately overlook the fact that the waste most to be deplored usually occurs among higher individuals. Economy was never precisely one of nature's leading principles. 
All this sentimental wailing over the larger proportion of failures than successes in human life, does not seem to take into account the fact that it is the rarest thing on earth for a highly organised being to attain to the fullest development and activity of all its functions, simply because it is so highly organised. The blind Will to Power in nature therefore stands in urgent need of direction by man. Pars. 16, 17, 18, 19, 20. These paragraphs deal with Nietzsche's protest against the democratic seriousness (Pöbelernst) of modern times. "All good things laugh," he says, and his final command to the higher men is, "LEARN, I pray you--to laugh." All that is GOOD, in Nietzsche's sense, is cheerful. To be able to crack a joke about one's deepest feelings is the greatest test of their value. The man who does not laugh, like the man who does not make faces, is already a buffoon at heart. "What hath hitherto been the greatest sin here on earth? Was it not the word of him who said: 'Woe unto them that laugh now!' Did he himself find no cause for laughter on the earth? Then he sought badly. A child even findeth cause for it." Chapter LXXIV. The Song of Melancholy. After his address to the higher men, Zarathustra goes out into the open to recover himself. Meanwhile the magician (Wagner), seizing the opportunity in order to draw them all into his net once more, sings the Song of Melancholy. Chapter LXXV. Science. The only one to resist the "melancholy voluptuousness" of his art, is the spiritually conscientious one--the scientific specialist of whom we read in the discourse entitled "The Leech". He takes the harp from the magician and cries for air, while reproving the musician in the style of "The Case of Wagner". When the magician retaliates by saying that the spiritually conscientious one could have understood little of his song, the latter replies: "Thou praisest me in that thou separatest me from thyself." 
The speech of the scientific man to his fellow higher men is well worth studying. By means of it, Nietzsche pays a high tribute to the honesty of the true specialist, while, in representing him as the only one who can resist the demoniacal influence of the magician's music, he elevates him at a stroke, above all those present. Zarathustra and the spiritually conscientious one join issue at the end on the question of the proper place of "fear" in man's history, and Nietzsche avails himself of the opportunity in order to restate his views concerning the relation of courage to humanity. It is precisely because courage has played the most important part in our development that he would not see it vanish from among our virtues to-day. "...courage seemeth to me the entire primitive history of man." Chapter LXXVI. Among the Daughters of the Desert. This tells its own tale. Chapter LXXVII. The Awakening. In this discourse, Nietzsche wishes to give his followers a warning. He thinks he has so far helped them that they have become convalescent, that new desires are awakened in them and that new hopes are in their arms and legs. But he mistakes the nature of the change. True, he has helped them, he has given them back what they most need, i.e., belief in believing--the confidence in having confidence in something, but how do they use it? This belief in faith, if one can so express it without seeming tautological, has certainly been restored to them, and in the first flood of their enthusiasm they use it by bowing down and worshipping an ass! When writing this passage, Nietzsche was obviously thinking of the accusations which were levelled at the early Christians by their pagan contemporaries. 
It is well known that they were supposed not only to be eaters of human flesh but also ass-worshippers, and among the Roman graffiti, the most famous is the one found on the Palatine, showing a man worshipping a cross on which is suspended a figure with the head of an ass (see Minucius Felix, "Octavius" IX.; Tacitus, "Historiae" v. 3; Tertullian, "Apologia", etc.). Nietzsche's obvious moral, however, is that great scientists and thinkers, once they have reached the wall encircling scepticism and have thereby learned to recover their confidence in the act of believing, as such, usually manifest the change in their outlook by falling victims to the narrowest and most superstitious of creeds. So much for the introduction of the ass as an object of worship. Now, with regard to the actual service and Ass-Festival, no reader who happens to be acquainted with the religious history of the Middle Ages will fail to see the allusion here to the asinaria festa which were by no means uncommon in France, Germany, and elsewhere in Europe during the thirteenth, fourteenth, and fifteenth centuries. Chapter LXXVIII. The Ass-Festival. At length, in the middle of their feast, Zarathustra bursts in upon them and rebukes them soundly. But he does not do so long; in the Ass-Festival, it suddenly occurs to him, that he is concerned with a ceremony that may not be without its purpose, as something foolish but necessary--a recreation for wise men. He is therefore highly pleased that the higher men have all blossomed forth; they therefore require new festivals,--"A little valiant nonsense, some divine service and ass-festival, some old joyful Zarathustra fool, some blusterer to blow their souls bright." He tells them not to forget that night and the ass-festival, for "such things only the convalescent devise! And should ye celebrate it again," he concludes, "do it from love to yourselves, do it also from love to me! And in remembrance of ME!" Chapter LXXIX. The Drunken Song. 
It were the height of presumption to attempt to fix any particular interpretation of my own to the words of this song. With what has gone before, the reader, while reading it as poetry, should be able to seek and find his own meaning in it. The doctrine of the Eternal Recurrence appears for the last time here, in an art-form. Nietzsche lays stress upon the fact that all happiness, all delight, longs for repetitions, and just as a child cries "Again! Again!" to the adult who happens to be amusing him; so the man who sees a meaning, and a joyful meaning, in existence must also cry "Again!" and yet "Again!" to all his life. Chapter LXXX. The Sign. In this discourse, Nietzsche disassociates himself finally from the higher men, and by the symbol of the lion, wishes to convey to us that he has won over and mastered the best and the most terrible in nature. That great power and tenderness are kin, was already his belief in 1875--eight years before he wrote this speech, and when the birds and the lion come to him, it is because he is the embodiment of the two qualities. All that is terrible and great in nature, the higher men are not yet prepared for; for they retreat horror-stricken into the cave when the lion springs at them; but Zarathustra makes not a move towards them. He was tempted to them on the previous day, he says, but "That hath had its time! My suffering and my fellow suffering,--what matter about them! Do I then strive after HAPPINESS? I strive after my work! Well! the lion hath come, my children are nigh. Zarathustra hath grown ripe. MY day beginneth: ARISE NOW, ARISE, THOU GREAT NOONDAY!" ... The above I know to be open to much criticism. I shall be grateful to all those who will be kind enough to show me where and how I have gone wrong; but I should like to point out that, as they stand, I have not given to these Notes by any means their final form. ANTHONY M. LUDOVICI. London, February 1909. 
From checker at panix.com Fri Jan 13 16:53:04 2006 From: checker at panix.com (Premise Checker) Date: Fri, 13 Jan 2006 11:53:04 -0500 (EST) Subject: [Paleopsych] Thomas Jansen, ed.: Reflections on European Identity Message-ID:

Reflections on European Identity
Edited by Thomas Jansen
EUROPEAN COMMISSION FORWARD STUDIES UNIT
WORKING PAPER, 1999

The contents of this publication do not necessarily reflect the opinion or position of the European Commission.

Table of contents

Preface, by Jean-Claude Thébault
The dimensions of the historical and cultural core of a European identity, by Heinrich Schneider
Consciousness of European identity after 1945, by Gilbert Trausch
European Identity and/or the Identity of the European Union, by Thomas Jansen
A contribution from political psychology, by Tom Bryder
What is it? Why do we need it? Where do we find it?, by Korthals Altes
European identity and political experience, by Mario Soares
How to define the European identity today and in the future?, by Ingmar Karlsson
European identity - A perspective from a Norwegian European, or a European Norwegian, by Truls Frogner
European identity - an anthropological approach, by Maryon McDonald
European identity and citizenship, by Massimo La Torre
From poetic citizenship to European citizenship, by Claire Lejeune
L'identité européenne comme engagement transnational dans la société, by Rüdiger Stephan
Security and a common area, by Adriano Moreira
Neither Reich nor Nation - another future for the European Union, by Roger De Weck
What does it mean to be a European? Preliminary conclusions, by Jérôme Vignon
Annex: A dialogue on unemployment between Truls Frogner and his Neighbour
List of contributors

Preface

The texts that have been gathered in the following pages were written or delivered during the "Carrefour Européen des sciences et de la culture"
which was held in 1996 in Coimbra. This event had been organised by the Forward Studies Unit in cooperation with the ancient University of Coimbra, whose academic excellence made this small Portuguese town so famous. The Carrefours Européens aim to provide a forum where personalities from the worlds of science and culture can discuss and exchange their views with Commission officials. Participants come from different European countries to propound ideas on issues that are particularly important for the future of our continent. Each of them brings different experience and sensibilities and thus contributes to the openness and the richness of the reflection. The debates that took place in Coimbra focused on understanding how the European identity expresses itself. Their richness is reflected in the following texts, which are at long last submitted to our readers in the deep conviction that neither their relevance nor their topicality has been lost. A characteristic of European identity is that it facilitates, fosters and stimulates variety in modes of expression, form, content and approach. And it is clear that this same principle can be applied to the definition of this identity itself: several paths may lead to the recognition and the assertion of a European identity which is itself made up of a plurality of ethnic, religious, cultural, national, or local identities. Each of the discussions that took place in Coimbra has, in its own way, reflected this approach. Both the University's rector and Marcelino Oreja Aguirre (the Commissioner in charge of communication, information, culture and institutional questions from 1995 to 1999) highlighted three constituent poles of European identity. First, Europe is steeped in humanism and all the values that make up its heritage today. 
The second is European diversity: even if the construction of the Community seems to be a harmonisation process, this harmonisation is just a necessary step towards the realisation of a European market-place which should allow underlying diversity to flourish. Diversity is truly Europe's richness. Finally, universalism is a European value and an obligation. At a time when Europe is sometimes tempted by the idea of becoming a "fortress Europe", this founding principle has to be constantly remembered and revived. The debates gave further opportunities to put forward some key issues linked to identity, memory or nation. Thus, identity appears as two-sided: on the one hand memory and heritage, and on the other hand voluntarism and a project to be achieved. Contrary to what is usually thought, identity seems to be constantly evolving and changeable. All these reflections ended in a discussion on the theme of "Europe and its role in the World", and of its contribution to the promotion of peace and progress. Marcelino Oreja had expressed the initial interest in a meeting such as this and had encouraged the Forward Studies Unit to organise it. The Commissioner's active participation contributed greatly to the intellectual and human success of the event. We now offer our readers these collected thoughts, for which we most warmly thank the participants, with the wish that they will cast light on a question that reaches right to the heart of the European political project.

Jean-Claude Thébault
Director, Forward Studies Unit

-----------------

The dimensions of the historical and cultural core of a European identity

Heinrich Schneider

Preliminary remarks

The topic "dimensions of the historical and cultural core of a European identity" may appear to be a historical and theoretical one. However, it is political in its nature. It stands in the context of a political discussion. 
Obviously, it is a contribution to the assessment of new political projects of the European Community: on the one hand, a discussion of the role of the cultural heritage and the historical traditions of Europe in the formation of a political identity which will and should necessarily arise if the projects of "deepening" are to be successful; and, on the other hand, a discussion about the question: what is the significance of the common cultural and historical roots of those nations which belong to Europe, in view of the "widening" of the Community? Historical reflections, theoretical reasoning, and scholars' analyses can help with the orientation of opinion and decision-making processes, but they cannot replace decisions about political goals. What we are really dealing with is the political identity of a European Union. What it should be has to be decided politically.

Problems of clarifying the terms: what constitutes identity?

Every now and then, politicians have talked about "European identity", but mostly without ever trying to explain its meaning!1 The term "identity" is used in the context of discussions on European identity as psychologists, sociologists, and students of civilisation apply it--not in the sense in which philosophers deal with the concept "identity" in logic or metaphysics. Primarily, one talks about the identity, or the formation of identity, of an individual. Can we construct a concept of collective identity just as well? Perhaps as an analogy. But we must be careful in doing so. For all that, one does speak of the identity of social groups, and there is also the concept of the identity of larger social or historical units, for instance nations. However, we cannot possibly construct the concept of "European identity" in the same fashion as we conceive the group identity of Boy Scouts or national identity. These models are not adequate, and thus we have to search for a more general definition. 
Anyone in search of her or his identity will pose the question: "Who am I?" With regard to collective identity the questions are: "Who are we? Where do we come from? Where do we go? What do we expect? What awaits us?"2 But these questions really serve to clarify another, more fundamental one: why and how can we (or must we) talk in the first person plural? There are two common answers; one of them sounds as follows: "Because we want it that way!" The other one refers to certain things that we have in common: a common history, common views about our present situation, common projects for our future and the tasks that are facing us there... In the lingo of sociologists, this means: it is the common "definition of a situation" which serves as a mutual link and creates solidarity.3 Identity is thus founded on "spiritual ties"; it can be grasped in a "core of shared meanings",4 in sharing consensually a common universe of symbols and relevancies.5 We do not only speak a common language; we also agree about the things that must be talked about as well as the things that are important without words. This sharing of common values is not hanging somewhere in mid-air over our actual everyday life. Normally there are common societal conditions of life as well. Therefore, we also have to deal with the "sociological dimension" of the European common cause. Our common "world of meanings" ("knowing about life") is one thing that we need in order to find our collective identity. Another one is delimitation as an element of identity. Knowing about myself also implies that I distinguish myself from others; identity is always based on negations, as Niklas Luhmann shows.6 Collective identity as well needs the distinction between "Us" and "Them". 1 Cf., for instance, the "Document on European Identity", adopted by the Ministers of Foreign Affairs of the member states of the European Community in Copenhagen, 14 December 1973. 
Nothing leads more effectively to the formation of group identity than a common enemy, according to those who do research on small groups. An analysis of nationalism shows that national identity is mostly defined through relating to "counter-identities".7 A third element is needed to constitute collective identity in the full sense of the word: the ability to act and to be responsible for one's action. Personal identity includes the capacity for independent action. Collective identity calls for, and implies, authorisation, which enables the collectivity to conduct collective action.8 2 This, by the way, is how Ernst Bloch begins his book "Das Prinzip Hoffnung", Vol. 1, Berlin 1954, p. 13. 3 In this context, the present situation also has a historical depth-dimension, and there is a perspective into the future. 4 Cf. Talcott Parsons, Politics and Social Structure, New York 1969, p. 292ff. This concept of collective identity is in accordance with Parsons's concept of individual identity as "the core system of meanings of an individual personality"; cf. Talcott Parsons, The Position of Identity in the General Theory of Action, in: Chad Gordon and Kenneth J. Gergen (eds.), The Self in Social Interaction, New York 1968, p. 14. 5 Cf. Peter L. Berger and Thomas Luckmann, The Social Construction of Reality. A treatise in the sociology of knowledge, Garden City and New York: Doubleday 1967. 6 "Alle Identität konstituiert sich über Negationen"; cf. Niklas Luhmann, Sinn als Grundbegriff der Soziologie, in: Jürgen Habermas and Niklas Luhmann, Theorie der Gesellschaft oder Sozialtechnologie, Frankfurt am Main: Suhrkamp 1971, p. 60. 7 Cf. Orest Ranum, Counter-identities of Western European Nations in the Early-Modern Period. Definitions and Points of Departure, in: Peter Boerner (ed.). 8 Cf. Burkart Holzner and Roland Robertson, Identity and Authority. 
A Problem Analysis of Processes of Identification and Authorisation, in: Roland Robertson and Burkart Holzner (eds.), Identity and Authority, Oxford: Blackwell 1980, pp. 5ff., 10f., 18f., 22ff.

Aristotle already knew that, by the way: the identity of a polis is primarily a constitutional identity, the "politeia", through which a community becomes a political subject, so to speak. It is founded on the "koinonia" of knowing about right and wrong (the "dikaion") as well as about what is beneficial or not (the "sympheron"). It rests on the solidarity ("philia") of people, and its political manifestation is a general consensus, "homonoia" as "philia politike".9 Therefore, collective identity in the full sense of the concept implies a political dimension: collective identity formation tends towards the establishment of a polity. Only against the background of this differentiation between the requirements and dimensions of collective identity does it make sense to pose more exact and detailed questions in order to find out what European identity is, what it can be, and what the possible impact of historical, cultural, and sociological components looks like. Some theses and problems have to be introduced, and the aspects worth considering in our context are to be pointed out.

The primacy of politics

The first task we have to deal with is to find out whether "the European Community will be able to build up a 'European identity'", namely under the present "new circumstances, now that the 'old' historical frontiers of the continent are reappearing". This language sounds clear enough; but the matter itself is rather complicated. The "reappearance" of the "old historical frontiers of the continent"--do we know what we are talking about? To quote Oskar Köhler: "Neither in a geographical sense nor in a historical view is there a 'static' definition of Europe."10 A lot has been said about the validity of that formula, "Europe goes from the Atlantic to the Urals". 
But Willem van Eekelen, the Secretary General of the Western European Union, has recently stated that "the whole of Europe..." ("Gesamteuropa") reaches "...from Vladivostok to San Francisco", and he is not the only one to say that.11 Statements of this kind do sound as if inspired by the experience of the CSCE process. But the most famous German XIXth-century historiographer of European politics, Leopold von Ranke, had already pointed out that America belongs to Europe; "indeed do New York and Lima concern us much more than Kiev and Smolensk"12--and we must bear in mind that Ranke, of course, saw the Russian Empire as part of the European system. Other authors took the same attitude; there is for example the definition of the European system of states as "the connection and interdependence of all European states and empires ... including the independent states that have arisen from the colonies of Europeans in America".13 9 Aristotle, Politics, book I chapter 2 and book III chap. 3. 10 This is the introductory sentence of Oskar Köhler's article "Europa", in: Josef Hofer and Karl Rahner (eds.), Lexikon für Theologie und Kirche, 2nd ed., 2nd printing, Freiburg/Br. 1986, col. 1187. 11 Ambassador Henri Froment-Meurice joined him in sharing this opinion, cf. "Europa im Aufbruch. Auf dem Wege zu einer neuen Friedensordnung", Protokoll des 91. Bergedorfer Gesprächskreises 199 , p. 29 and p. 34. 12 Leopold von Ranke, Geschichte der germanischen und romanischen Völker (1824), p. XXXIX, cf. Heinz Gollwitzer, Europabild und Europagedanke, München: Beck 1951, p. 279. 13 Karl Heinrich Pölitz, Die Staatswissenschaften im Lichte unserer Zeit (1824), cf. Gollwitzer, op. cit., p. 443. On the other hand, there are much narrower definitions. 
When Winston Churchill gave his famous speech at the University of Zurich in 1946, in which he called for the creation of a kind of United States of Europe, he entertained no doubts that Great Britain must naturally be a friend and supporter of this new political entity, but of course not a member. And the author of a well-known book about "The Limits and Divisions of European History" stated that "the eastern border of the European community, both in earlier times and today, has always been the Western frontier of Russia".14 This, of course, refers to modern times; in the Middle Ages, Europe's eastern borderlines were located much further westward. Where do we find those "reappearing 'old' frontiers of the continent"? The controversy on how Europe is to be defined geographically is, nowadays, hardly touched by the question whether America ought to be included in the European identity; however, there is dissent on whether Europe coincides with the occidental part of the continent, that is, whether the border between Latin and Byzantine civilisation can serve to delimit it, or should do so. Now we have a whole series of problems: It cannot be denied that the schism between "East" and "West Rome" appears to be a symbol for a cultural demarcation. In the West, there was the struggle for supremacy between political and religious authorities, and in the blind spot between the two of them the freedoms of the estates and urban autonomy could develop. As a consequence, "civil society" had more of a chance to spread out than in the East, where church government was integrated into the Empire, thus perpetuating ecclesiastical rule in the political order, i.e. Caesaropapism. 
This had further outcomes; but there had also been other preconditions that contributed to the different course of social and societal history, like the small-scale geography and the harbour-like landscapes of many of the regions of Western Europe,15 as against the massive geographical structures of the East, and others.

- Surely, there was the great schism; but there was also suffering that arose from a common consciousness of a fundamental unity--up to the Ecumenical Movement of our days.
- Even in the days of Peter the Great, Russians reached out for Europe. Were the East European Westerners of his days and of later times erring in their illusions? Can we deny that cultural and political identities are open to historical change, and that there have been, already, processes of "widening" of the extent and range of European civilisation?
- And, with respect to social and mental differences between different parts of what has been called our continent, is it not a constituent feature of the cultural uniqueness of Europe that opposites meet here, time and again, turning the task of ever-renewed conciliation into the principle of productive dynamic development?

14 Quoted from the German edition: Oskar Halecki, Europa: Grenzen und Gliederung seiner Geschichte, Darmstadt: Gentner 1957, p. 79. 15 Hans-Georg Gadamer speaks of "einer einzigen großen Hafenlandschaft, die für die Entdeckungsfahrten zu neuen Weiten förmlich aufgetan war"; cf. Hans-Georg Gadamer, Das Erbe Europas, Frankfurt am Main: Suhrkamp 1989, p. 40.

I do not want to say that such "old frontiers" as those between the Latin and the Byzantine tradition are irrelevant. But how far Europe will reach tomorrow, or the day after tomorrow, or in the next century and later, cannot be looked up in a historical atlas of Antiquity, the Middle Ages, the XIXth century, or the Cold War period in our century either. 
Besides, the supreme representatives of the CSCE participating states adopted, in 1990, the Paris "Charter for a New Europe", and we can read in this charter that the new Europe extends as far as the reality of human rights and democracy, the rule of law and pluralism, economic freedom, social justice, and the commitment to peace reaches on European soil. We all know, and have only recently again become painfully aware, that there is that discrepancy between what is and what should be, between what we want to do and what we achieve. But should it not be our common cause to realise and safeguard these principles of a European political order for all nations whose representatives have stood up for them? Can we deny this solidarity to those who wish to subscribe to this common European order--wherever they may live in Europe? And is it possible to denounce the declaration of thirty-four heads of state and government in favour of a new "united democratic Europe" as mere emptiness, a proclamation that cannot be other than untrue--in view of the fact that even Albania has now joined these 34? Certainly, there may be reasons for a narrower concept of the uniting efforts that have to be carried on during the years ahead of us--such as political prudence may suggest. In Western Europe, governments and people might ask themselves whether the chances for organising European security within European borders may not be better if one denies responsibility for certain regions. It can be argued that the political and structural requirements for a certain kind of economic or political integration may indeed call for a restriction in certain areas, in order to be optimal. And there are many more such questions and considerations. Just one of these questions is the one we are dealing with: what would be the most favourable historical and cultural conditions for including parts of Europe in the Union-to-be: soon, or later? 
But, in my opinion, it would be unjustifiable to try to avoid all these reflections, not to discuss their ramifications, and to shun--or to disguise--political decisions by pointing to old historical and cultural borders. And indeed, if one were to stress that cleavage between ancient Latin and Byzantine culture, then the motherland of European political thought, Greece, ought not to have been accepted into the Community--and the definite stand the Community had recently taken in favour of Yugoslavian unity would have been absurd... Hence, the primacy of politics should not be denied.

Options of the European Community

When speaking about the "reappearance of old frontiers" in Europe, some other aspects come to mind. What is really new in the European situation is the disappearance of "less old" frontiers. What allows the states of Central and Eastern Europe to "return to Europe"--as they call it--is in the first place the fact that the fatal barriers, the wall in Berlin, the barbed wire obstacles and iron curtains, have been removed, and that people have been successful in overcoming totalitarian systems. But along with the end of East-West polarisation, with the termination of the antagonisms of political organisation, some other "old frontiers" and controversies have reappeared. We face again the situation about which Karl Jaspers said, some decades ago, that Europe has to make a choice between "Balkanisation" and "Helvetisation". "Balkanisation" means a tangle of conflicts and hostilities, whereas "Helvetisation" points to the attainment of a political identity across a multitude of national heritages and languages. The beginnings of the formation of the European Community, restricted to the six founding nations of the Coal and Steel Community and later the EEC, had been initiated as such a process of "Helvetisation", as a first step towards a confederation with an identity of its own. However, this policy was determined by some quite specific options.
At first, things were started with a small community of states that intended integration ; but it was clear that this community could not identify itself with "Europe". The Community of the "European Six" was regarded by that organisation which considered itself as maid-servant to a union of European states--i.e. the Council of Europe--as a case of establishing "specialised authorities" for specific functional areas. In Strasbourg they thought that all such endeavours should always take place "within the frame of the Council of Europe" and thus be securely bound to the "proper European policy" (as the Council had conceived it). And yet this Council of Europe was in itself limited to only a part of the European states. As a representative of a European identity, it was some "pars pro toto", and the Community of the Six was some "pars partis". This changed in the course of time. In the Treaty of Rome, "the foundations of an ever closer union among the European peoples" (and not only those peoples that are directly involved) are mentioned. And in the Single European Act, the parliament of the Twelve is called the instrument of expression for the endeavours of "the European peoples" ; as simply as that. This implies that the political identity of the Community is to be further developed to become the political identity of Europe as a whole. If this is wanted, one cannot deny any European nation the right to participate in that political identity. Now, if today some 81 percent of the Hungarians, 79 percent of the citizens of the CSFR, and still 68 percent of the Poles have a positive attitude towards the creation of a "United States of Europe", and affirm that their own nation belongs to this future polity16, then the Community of the Twelve will have to reconsider what is to be done about the Community's own identity. Another decision of the "founding fathers" has been quite important.
What the Community was originally all about was to form an administrative union to manage common coal and steel production as well as distribution, notwithstanding the idea to use this union as a lever to promote political integration by creating an interdependence of interests. Later, a widening was achieved in more than one dimension : the Community was extended to nine at first and then, step by step, to twelve member states. And the area of functions and policy fields was expanded, comprising now the whole of national economies and more and more common tasks, up to a common foreign and security policy. The reason for this widening of functions and interests lies in the interdependence of policy areas. There is hardly a problem area which is not to be treated on the EC level. On the other hand, the states have not given up their spheres of responsibility, and they are still thinking (or dreaming) of their complete autonomy and "sovereignty". Thus, they try to keep under control what is happening. As a result, political processes on community, national and "mixed" levels intertwine. Complex procedures of mediation and grey zones of responsibility evolve. There was talk about "traps of policy tangles"17 and "Eurosclerosis". The reforms initiated by Jacques Delors were aimed at breaking up these entanglements and the sclerosis of the Community. Since the EC is supposed to gain more freedom of action rather than simply retain its status, this will hopefully end in a strengthening of its political identity. This becomes particularly clear in view of the goal to form a European Union of a federal character. Under this perspective, the Community can no longer be regarded as a system to co-ordinate just the problem management of its member states, which so far try to push their own interests rather than jointly bear the common consequences of their interdependence.

16 Cf. "Mehrheit im Osten für Vereinigte Staaten von Europa", in : Die Presse (23 April 1991), p. 22.
A federal union cannot be achieved without an established supranational authority to determine a common policy. And this raises not only the problem of democratic legitimacy but also the question of political identity. Thus, it is not surprising that the question of the political identity of the Community, and in particular of the European-Union-to-be, is posed anew. In the first place, the upheavals in Eastern Europe raise the problem of how the Community intends to define its own purpose with regard to the identity of the whole of Europe--even more than, for example, the intentions of EFTA states to join the Community. Slogans like "centre of gravitation" or "anchor of stability" are no adequate answers to that. And secondly, "deepening"--strengthening the polity character of the Community, transforming it into a "European Union"--also implies the necessity to clarify identity problems.

In search of a definition of European identity

We have to find out what Europe has in common, historically and culturally, in order to define, to articulate and to strengthen its identity. If we are to do that, we should remember what the fundamental dimensions of a possible European identity are, according to the conceptual and theoretical explications I tried to give in the first part of this contribution :
-the "spiritual ties" as they are manifested in a common "world of meanings" (a "universe of symbols and relevancies"), as they allow us to achieve a consensual "definition of the situation", including the three dimensions of a shared "today", "past", and "future" ;
-the "delimitation", knowing what is special about "our thing" as compared to other people's things ("nostra res agitur"--not some "res alienorum") ;
-the ability to act and bear responsibility through authorisation and, thus, institutionalisation (which means, in consequence, polity building).

What is primarily called for is obviously a "political identity" in the concise sense of the term--a capacity which makes it possible to institutionalise common action, and a quality which provides an adequately wide and massive basis of consensus and loyalty. It may well be that remembering common historical and cultural roots, and activating consciousness of them, helps to strengthen this basis. Yet one wonders why this historical dimension must be shoved into the foreground when the real issue is what Aristotle calls "homonoia" and what in our context might be called "European spirit" or "consciousness of a European common cause". To translate this into educational terms : Can we, should we, make our efforts to form a European consciousness only by looking at the past, at our common history ? Would it not be equally important to recall what Europe means today and will mean in the future ? A hypothetical answer is at hand : the matter is seen in the same way as it was seen in the last century, when national identity had to be formed. The formation of a national consciousness, however, came about under remarkably different circumstances.18 When the nation decided to take over the power of government--as in the typical case of France--the main thing was to create the political will and to keep it alive (in the "plébiscite de tous les jours", to cite the famous formula Ernest Renan found). "Res publica" was to replace "res regis". The case was different if an ethnic or national group wanted to emancipate itself from a supra-national or foreign regime (as in the case of "secessionist nationalism"), or if peoples which were convinced that they belong together wanted to break up the barriers between constituent states (as in the case of "integrational nationalism").

17 Cf. Fritz W. Scharpf, Die Politikverflechtungs-Falle. Europäische Integration und deutscher Föderalismus im Vergleich, in : Politische Vierteljahresschrift, vol. 26 (1985), p. 323ff.
Whereas, in the first of the three typical cases, the state that shall be taken over by the nation does already exist, in both of the latter cases a state shall be created which does not yet exist. The representatives of the people's political will need a "meta-political" justification. It must be explained why this state should exist. This explanation refers to the existence of a "cultural nation" that now wants and deserves to constitute itself politically. Usually, the "meta-political" justification is given with a reference to history : in the past we, or our ancestors, descended from one family or tribe ; or we grew together as a spiritual community ; and we shared a common fate even in earlier times. Or even this : history has uncovered a common metaphysical substance which unites us in national identity--Herder's doctrine of the "Volksgeist". In political reality, this idea serves efforts of make-believe in the service of a political will.19 It derives from religious doctrines and concepts which are given a new interpretation by transferring them into socio-political thinking. To give an outstanding example : the originally theological concept of the "corpus mysticum", that is, the community of the faithful who find their identity in Christ's "pneuma", in which they eucharistically and spiritually participate, is transferred to the nation, whose members are spiritually bound together by their participation in some metaphysical substance, which Herder called "Volksgeist". It is only later that such notions lose their "mystical" (or mythological, or pseudo-theological) character, so that the nation then (and, we might say, only then) becomes a "community by common culture and disposition through having shared a common fate".20

If today a political unification is to be attempted--for instance, a European Union--and if we all, perhaps without much reflection, still see the paradigm for the creation of a political identity in the way nation states were formed, then we must suspect that the idea of a "cultural Europe", which would have the same function as the idea of a "cultural nation", will here be conjured up. I do not want to say that one might dismiss the idea of a European cultural identity and the quest for its historical roots as nothing but ideology, as a mere construction to serve a political purpose, as, for example, Geoffrey Barraclough did.21 Indeed, there is a "fundamentum in re" : there is a European spiritual and cultural identity ; it would lead too far astray if I were to quote the witnesses for that--from Ernst Robert Curtius to Denis de Rougemont, from Arnold Toynbee to Hendrik Brugmans.22 But reminding ourselves of the names of such authoritative scholars does not dispense us from the effort to identify at least some substantial contributions to what we might call the "European spirit". What is meant to be represented by these centres of experience and of thought ? And what has been further developed from the achievements those keywords refer to ? It is difficult to answer such questions, for several reasons.

18 The following remarks make reference to Theodor Schieder's triple typology of nation state building in Europe, namely (I) the process of assumption of power over an existing state by the "nation", (II) the process of secession or separation of a "nation" from a multinational empire or state, and (III) the unification of--up to then independent--states, whose peoples regard themselves as parts of one single "nation". This triadic typology is, in my opinion, more revealing than Friedrich Meinecke's famous distinction between "Staatsnation" and "Kulturnation" ; but Schieder's idea is able to explain Meinecke's comparison. Cf. Theodor Schieder, Typologie und Erscheinungsformen des Nationalstaates in Europa, in : Historische Zeitschrift, vol. 202 (1966), p. 58ff.
19 Cf. Raymond Grew, The Constitution of National Identity, in : Boerner (ed.), op. cit., p. 31ff.
One of them is the fact that the "fundamentum in re" of European spiritual and cultural identity is characterised by an agreement to disagree, a "concordantia discors", as Jacob Burckhardt called it, a common cause with sometimes lots of antagonism. Yet there are achievements and experiences imprinted in a common memory that constitute common understandings and stand in the background of such political declarations as the "Charter of Paris", which conjures up so emphatically an identity of spirit and will. There are problems both of principle and of method which have to be faced if one tries to reconstruct and to explain them : that of the "hermeneutic circle" and of the inevitably subjective and specific perspective, as well as that of the criteria for an adequate selection of sources, etc. We cannot deal with these problems here in extenso. So we just turn to the "authorities", to the specialists of information. There is plenty of general agreement about the most important and significant issues--maybe not perfect, but considerable, consensus. After all, the historical and cultural identity of Europe has been an interesting topic for a long time, and many have taken part in this discussion. At least, there is an agreement about the most important historical eras, what their message is today, and what should be kept alive in the "collective memory" of Europeans. In this context, phenomena, issues, and essentials like the following ones are named23 :
-Extra-European and "pre-European" achievements that were significant stimulators of European culture, i.e. the impact of ancient Egypt on pre-classical and classical Antiquity, and above all the tradition of the Old Testament.
-Classical Hellas : the Greek tradition of the "polis", the "civilisation" of social life and the Greek understanding of politics, which was to have such a deep influence all over Europe ; the "discovery of the mind" ; the idea of "paideia" and thus of humanness ; the evolution of philosophy--the beginnings of critical cognition of reality, that is, the Pre-Socratic thinkers, the classical philosophers Plato and Aristotle--and the creation of the various genres of European literature.
-Rome as Republic and Empire : the idea of the "res publica", Roman law, the "virtutes", the Roman answer to Greek philosophy (Cicero, for example).
-Christendom as a creative power in Europe : the surpassing of reality through God's salvatory work ; the idea of the "corpus mysticum" ; the several types of Christian attitudes in the mundane world ; the relativity of secular power ; the construction (or discovery) of the concept of "person" in christological thought and dispute ; the interrelation of religious orientation and secular order, of political power and church authority--with a view to the different developments in the Latin and Byzantine empires and their consequences for the forming of their societies--and the importance of Christian social doctrine.
-The laying of the foundations of "Occidental culture" after the "Völkerwanderung", the role of Benedictine monkhood, the "Regnum Europae" of Charlemagne.
-The "Second Awakening of Europe" (Albert Mirgeler) in the Middle Ages ; the controversy between "regnum" and "sacerdotium" ; the struggle for "Libertas Ecclesiae" ; the intellectual disputes over the recognition of authorities (the establishment of the "studium" as an institution, the rise of scholastic philosophy and of universities) ; and the rediscovery of the "inner mind" (mysticism). The inclusion of Middle and Eastern Europe in Western European culture.
-The dawn of modern times : schism, the growth of towns and municipal self-government ; Renaissance and Reformation ; the striving for religious freedom ; the building-up of the territorial state ; the development of a bourgeois economy ; the construction of a European state system and the growth of its dynamics of power ; the expansion of Europe into other continents.
-The Enlightenment, the emancipation of the middle classes, the great revolutions in England, America, and France, and their intellectual foundations : human rights, basic freedoms, civil society, and representative government.
-The political ideas and movements of the XIXth century : liberal and democratic progressism, conservatism, socialism, and imperialism ; idealistic and materialistic philosophies as well as the new critics of civilisation, society, and the inner life (Marx, Nietzsche, Freud). Finally, the movements for emancipation in the dynastic empires.
-The age of world wars, totalitarianism, and the efforts to overcome it.

Once more, there are many questions with respect to such an outline. Do we recognise in this landscape summits of the first, second, and other orders ? Are there essentials that are either continuously effective or slowly rising in an evolutionary process ? Maybe with regard to the concept of man (personality, the call to freedom and solidarity). Further, in view of the productive collision of involvement and distance, mundane responsibilities and transcendental calling, harmony and antagonism. And also in ranking the individual before the cause ; in the development of attitudes of "critical loyalty", broken affirmation, the combination of tolerance with firmness of conscience, and so on...

20 Otto Bauer regards the nation as a "Kultur- und Charaktergemeinschaft", based on common historical experiences ("Erleben und Erleiden des Schicksals") ; cf. Otto Bauer, Die Nationalitätenfrage und die Sozialdemokratie, Wien 1924.
21 Cf. Geoffrey Barraclough, Die Einheit Europas als Gedanke und Tat, Göttingen : Vandenhoeck & Ruprecht 1964.
22 See the contribution by Hendrik Brugmans in this volume.
23 The list of phenomena, issues and essentials is in particular influenced by the author's subjective view. But as it shall be nothing more than an impulse for discussion, it can be done without references--which would have to be very extensive--to the corresponding literature.
But is it possible at all to present more than subjective opinions or convictions as far as such questions are concerned ? Furthermore : Is it possible to draw a precise and adequate picture of the relations between transnational developments, structures and movements on the one hand, and the particular contributions of nations, ethnic or religious groups, and regions on the other ? Does, in this sense, a truly "European" "historical image" exist, reflecting indeed the contributions of all the nations and groups that make up the community of Europe, and will this image continue to be understood (at least by the more sensible contemporary minds) as a common cultural obligation ? I think nobody would be able to present a definite answer to such and similar questions. The meaning of the European heritage and of the living European spirit can only be actualised and made effective through a permanent effort of intellectual realisation of its components and elements. This effort must take place in the form of a dialogue and discourse, through which we expose ourselves to the impact of what we are affected by and called on, in order to widen and deepen our understanding and to activate motivational strength.

Integration : colonisation of the world we live in as subversion of identity ?

Posing once again the question of the meaning, the function and the importance of a "meta-political" identity of Europe today and tomorrow, we do so now in a different perspective. This is the case because matters might be taken too easily by simply identifying common heritages and then leaving the business to the mediators of a European consciousness--say teachers, textbook authors, or journalists. Is all that we have recalled perhaps only a heritage losing its formative power, as some contemporary theoreticians want us to believe ?
Jürgen Habermas has asked whether "complex societies" can form "a reasonable identity" at all.24 He says that this is only possible in a process of communication taking place under the conditions of an "ideal form of life", free of any domination. All other "knowledge" about identity would be unreasonable and could only be a mystification of conditions with which one does not have to identify. Niklas Luhmann disputes Habermas' question. The "intersubjectivity of cognition, experience, and action created by symbolic interpretation and value systems" is, in his opinion, not apt to integrate modern societies. It cannot satisfy the "requirements for the control of highly differentiated societal sub-systems".25 The idea that political order has anything to do with spiritual sharing, that politics receives meaning from the conception of a common cultural heritage, is, in his eyes, a totally outmoded notion (a case of "false consciousness"). Habermas insists that a humane life must be governed by "communicative reason". But he diagnoses a fatal discrepancy between the demand for a reasonable identity and such trends in modern development as he assumes are manifest especially in the process of European integration. Along with the increasing rationalisation of social life, the integration of societies is more and more carried on "through the systemic interaction of specified functions".26 The control over social processes works through "speechless media of communication", through exchange mechanisms like money in the economy and through mechanisms of power in the sphere of politics.

24 Jürgen Habermas, Können komplexe Gesellschaften eine vernünftige Identität ausbilden ?, in : Jürgen Habermas, Zur Rekonstruktion des Historischen Materialismus, 3rd ed., Frankfurt am Main : Suhrkamp 1982, p. 144ff.
And while these control systems were embedded for a long time in a normative framework according to the "Old-European" tradition of a common weal, where there was communication about necessary and appropriate actions in terms of common sense and philosophy, it then came to a "mediatisation" and, finally, the "colonisation of the life-world".27 Those spheres in which individual and collective identity may find themselves and may realise themselves are now occupied and exploited by the politico-economic control organisations, using power and monetary incentives in order to keep societal life going. Morality and culture are being robbed of their substance and, thus, cultural identity becomes obsolete. Seen through such glasses, European integration as it has been in process for the past forty years would appear as a gigantic and typical example of the deliberate promotion and acceleration of just such a development : the take-over of power by a rationally functioning macro-organisation that combines governmental and economic interests to control interdependencies. Habermas would be able to formulate his diagnosis--primarily made about the modern state--more precisely with respect to the EC system : the utilisation and instrumentalisation of conceptions of cultural identity and of public political discussion in order to legitimise what will be done anyway through calculated interest and power bargaining ; the substitution of democratic decision-making by relations between welfare administrations and their clients ; the transformation of the rule of law into an instrument for organising interest-controlled systems of regulation ; and finally, the "make-believe of communicative relations" in the form of rituals in which "the system is draped as the life-world".28

This might appear as a caricature, and Habermas has indeed met with decided protest. His thesis of the reduction of politics to systems control is shrewd, but rests on rather fundamentalist premises. If it has been brought to attention here and now, it is primarily because our discussion may well need a thorn in the flesh, so that we do not take things too easily on the subject of cultural identity and the building of a polity out of the EC system. But there is still another reason for taking such theses and discussions into consideration. In spite of all exaggeration, a very sensible question arises, making our special topic particularly relevant : How is it possible, while the European Community is developing, to secure the political identity through which the "meta-political" components and dimensions of identity alone obtain their full significance as well as their motivational relevance ? It looks as if political actors or political scientists have asked us to find the historical and cultural potential, so that we produce and promote European consciousness, because they expect some contribution to the progress of political community-building and polity-formation for the benefit of a European Union which shall be deepened and widened. But a complementary perspective exists, too. In the framework of European integration, it is necessary to strengthen the structures and the processes for the articulation of a truly political self-understanding, and for a process of conceiving and comprehending what the tasks are with which Europe is confronted. Only if these processes take place will our "spiritual and cultural properties" play a significant role in our joint endeavours to solve problems and to meet the challenges of our time and of the days to come. Therefore, we need efforts to create a political identity of a uniting Europe.

25 Niklas Luhmann, quoted by Habermas, op. cit.
26 Jürgen Habermas, Theorie des kommunikativen Handelns, vol. II, Frankfurt am Main : Suhrkamp 1981, p. 175.
27 Ibid., p. 240, p. 470f.
28 Ibid., p. 472, p. 476, p. 536ff., p. 567.
If not for other reasons, then at least in order to counter trends which tend to make the content and the substance of our meta-political traditions politically irrelevant. The reality of politics and policies is more than a complex system of functionalist management of socio-economic interdependencies and power relations. It is also a field of communication and interaction between human beings, groups, communities, regions, and nations, on what is important, what is meaningful, and what should be done and pursued. By this process of communication and interaction, a common identity is being formed. This is also true in the field of European co-operation and integration. In the humanistic tradition of our European civilisation, it has been passed on from the philosophers of the Greek "polis" to the outstanding thinkers of our time that politics always means two things : to make possible what is necessary (Paul Valéry), and to find agreement on what is real (Hugo von Hofmannsthal). Both of these will help to create, to keep alive, and to perform a European identity.

Consciousness of European identity after 1945

Gilbert Trausch

The question of Europe's identity can be looked at from many angles within the perspective of this Forum--that of post-1945 Europe and, even more specifically, that of the European Community. Sociologists, political scientists and philosophers have all made interesting contributions--highly theoretical, as can be expected, given the academic disciplines in which they work. A theoretical approach is particularly apt for the question of European identity because, in the final analysis, Europe is a 'construction of the mind' (J. B. Duroselle). However, we must not stifle the voice of history. This is a discipline that is kept in check by two rigorous parameters--time and space. What is true for one region is not necessarily true for another, and what is acceptable at one time is not always acceptable at another.
I mention this because historians construct facts from documents of all kinds. The constant need to bear this in mind sometimes clips their wings and stops them getting carried away. Marc Bloch called them 'those nasty little facts which ruin the best hypotheses'. A historical approach to European identity after 1945 inevitably brings us to the conditions in which the European Community was born. No reasonable person would deny that the sense of a shared identity was, and still is, a major stimulus in the quest for a closer union. However, the disturbing fact remains that European integration only became a reality after 1945, with the creation of the OEEC, the Council of Europe, the Brussels Treaty Organisation and, above all, the European Communities (from 1950). Robert Schuman's appeal on 9 May 1950 in Paris was translated into action, while Aristide Briand's in Geneva on 7 September 1929 fell on deaf ears. Both were French Foreign Ministers and therefore influential men, and both addressed their appeals to German politicians at the highest level who were very open to Europe, Gustav Stresemann and Konrad Adenauer. So why did Europe take off in 1950 and not in 1929?

The philosopher Jean-Marie Domenach hints at an answer when he says that the European Community was born not of Charlemagne but of European nihilism. He uses Charlemagne to symbolise Europe's identity. Many historians think that we can speak of Europe from the time of Charlemagne, who is referred to in certain documents of that time as 'Pater Europae'. But for Domenach, the jolt which finally induced the Europeans to unite more closely was the havoc wreaked by the two great totalitarian systems of the 20th century: Marxism-Leninism and National Socialism. The Gulag and Auschwitz were seen as the last warnings before the final catastrophe. The figures are clear and chilling. First World War: 10 million dead; Second World War: 55 million dead (including 45 million Europeans).
If this geometrical progression were to continue, the next step would be an apocalyptic Third World War. In other words, the European Community emerged in response to the challenge posed by two ideologies which were born in Europe from a shared cultural heritage.

How can Europeans be united? Basically, there are only two possible approaches: political and economic. And where should we start? This was a question that already exercised Aristide Briand. When, in 1929, he called for the creation of a United States of Europe, he proposed to start with economic unification. One year later, in a memorandum submitted to 26 European governments for their opinion, he shifted his stance and backed a political approach, the reason being the Wall Street Crash, which had changed the situation. Briand thus played it by ear, without a precise idea of the path to be taken or the objective to be attained. In this he differed from Jean Monnet, who had clearer ideas on both the end and the means. The same questions arose after 1945. Although it was clear that the two approaches should be separate, it was felt that there was no reason why progress should not be made on both fronts simultaneously. This is what the Europeans did in the years 1947-49 with the OEEC and the Council of Europe. The result was hardly encouraging, even though the two organisations did manage to group together almost all the states of Western Europe, because they were confined to the framework of simple cooperation between countries without any transfer of sovereignty. An attempt to move forward on the economic front--negotiations for an economic union between two countries (France and Italy) or five countries (with Benelux) under the name of Finebel--was to fail (1948-50). In the spring of 1950, Jean Monnet realised that the political path was closed, because the European countries remained strongly attached to their political sovereignty.
Having learnt his lesson from the failure of Finebel, and not impressed by Adenauer's proposal for a Franco-German economic union (23 March 1950), Monnet opted for the economic approach, but on a smaller scale: a common market in coal and steel. This option had a number of consequences. Jean Monnet expected that this first 'pool' (coal and steel) would lead to others (agriculture, energy, transport) and hence, gradually, to a genuine common market. This prediction did eventually come true, but only after forty years or so, which is probably longer than Monnet reckoned. Monnet also believed that this economic approach would eventually be followed by political unification. In this respect, events proved his hopes wrong. The attachment to national sovereignty in the world of politics (security and foreign policy) has turned out to be more tenacious than anticipated in 1950. By launching the process of European integration through the economy, Jean Monnet – no doubt unwittingly – defined its identity over several decades. The European Community which, with its fifteen countries, is starting to represent Europe as a whole, is perceived essentially as an economic entity. However, men and women, being creatures of flesh and blood, do not easily identify with economic indicators, quotas and compensatory amounts. The failure of all attempts to create a common foreign and security policy (the European Defence Community and the planned European Political Community, 1951-54; the Fouchet Plan, 1961-62) and the less than binding nature of the Maastricht Treaty provisions explain why the European Union continues to be perceived by ordinary people as an economic machine. It is difficult, in these circumstances, to see it as the expression of a common destiny. Jean Monnet's proposal for a coal and steel community, put forward by Robert Schuman, was a response to a multi-faceted challenge.
Like everyone else, he was aware that Europe could not continue to tear itself apart, or it would end up disappearing completely. Also, Europe's difficulties over the last hundred years had always started in the form of a Franco-German conflict, so it was here that action needed to be taken: to make war between France and Germany 'not merely unthinkable but physically impossible' (declaration of 9 May 1950). This is why the French appeal of 9 May was addressed first and foremost to Germany. The two world wars were to some extent Franco-German wars, at least when they started, and can thus be seen from a similar angle to the 1870 war. This explains the determination of many Europeans to reconcile the French and Germans and bring them closer together. Jean Monnet understood more clearly than others that Europe's future depended on France and Germany. Like it or not, the European Community has been built around France and Germany. If monetary union comes to fruition in the next few years, it will happen again around these two countries. Jean Monnet's game plan – to make the Franco-German axis the motor of Europe – could not be achieved unless Germany played along too, in other words unless it aligned itself with the western political model for good. It had to be kept from the 'temptation to swing between West and East' (Jean Monnet, 16 September 1950) and therefore had to be solidly attached to a host organisation. Neither the OEEC nor the Council of Europe, with their loose structures, could take on this role, but the ECSC fitted the bill. The European Community, along with other organisations such as NATO and the WEU, thus became a way of resolving the German question.

The effects of the Cold War

The appeal of 9 May 1950 was also a response to the challenge of the Cold War, which created a new situation in which Europe was not so much a player as an object manipulated by non-European players (the USA and the USSR).
Jean Monnet had no difficulty in accepting the Atlantic Alliance, which was essential in order to ensure Western Europe's security. However, he felt that it had helped to fossilise mindsets and create a 'rigidity of thought'. Thus 'any proposal, any action is interpreted by public opinion as contributing to the Cold War' (note of 1 May 1950). Monnet believed that a Community as he conceived it could break out of the Cold War mould, which was not the case for the Atlantic Alliance. He thought that the ECSC could incorporate West Germany without raising the question of rearming it, which he still felt (beginning of May 1950) would provoke the Russians. The Korean War (25 June 1950) was responsible for overturning this kind of thinking. German rearmament was put on the agenda. Very rapidly, the ECSC became the model for a European Defence Community. In fact, throughout the first phase of European integration, from the OEEC through the ECSC to the EEC, Western Europe was subjected to a whole set of Cold War-related pressures which had a direct impact on the integration process. There was American pressure, which could be described as positive in that it encouraged the Europeans to unite. American diplomacy pushed the Europeans to come closer together economically and politically, though it was understood that a united Europe must remain open to American influences and products. The pressure was also positive in the sense that it did not impose any specific solution on the Europeans. In the case of the OEEC, for example, the United States would have preferred a more integrated solution than the one finally chosen on Britain's initiative. Similarly, the first British application to join the EEC (1961) owed a great deal to American encouragement. The same cannot be said for pressure from the USSR. It felt it was not in its interest for the Europeans to unite opposite it. Its policy thus aimed to divide the Europeans and to separate Europe from the United States.
Thanks to its impressive military apparatus, which its acquisition of atomic weapons in 1949 rendered credible, it was able to put pressure on Europe – indeed virtually blackmail it. In the Cold War climate which set in from spring 1947, the Europeans lived in fear of the USSR, a fear which Paul-Henri Spaak gave full rein to in a famous speech. The Brussels Treaty, the Atlantic Alliance and the WEU, and also the ECSC and the EDC, were a response to the negative pressure from the USSR. The process of European integration is inseparable from the climate created by the Cold War. Throughout its history, the European Community has been very sensitive to international developments. The Korean War had a positive effect on the ECSC negotiations and the beginnings of the EDC, but the death of Stalin and the ensuing détente affected the EDC negatively. In the autumn of 1956, the preparatory negotiations for the Treaties of Rome were heading for an impasse, after wide-ranging last-minute demands made by France, when they were finally saved by the events of Suez and Budapest, which reminded Europeans how weak they were. In periods of tension, the Europeans close ranks, and in periods of détente they loosen their ties. Overall, the process of European integration has to be seen in the Cold War context. To push the image to its provocative extreme, one could say that the European Community is Stalin's baby. Only when they were forced to did the European countries agree to the surrender of sovereignty which characterises the Community. One can imagine only too clearly the consequences that the end of the Cold War may have on European integration. The effects of the Cold War can also be seen in many other areas, particularly that of political institutions. Between the wars, democratic countries suffered a period of profound crisis, which explains the rise of fascist dictatorships and authoritarian regimes (central Europe and the Baltic and Balkan countries).
Where democracies did survive, they were weakened and discredited by major scandals. After 1945, however, western-style democracy became the political system par excellence, fully adopted by the nations of Western Europe. The last bastions of authoritarian regimes – fascist or semi-fascist – fell one after the other (Greece, Spain, Portugal). The rule of law and respect for human rights which became established in Western Europe contrasted with the communist model. Confronted by a regime which claimed to have history on its side and to be both politically and economically more successful, European democracy was obliged to furnish daily proof of its excellence and superiority. The example of the Federal Republic of Germany in its face-off with the other Germany illustrates this situation. The East German regime became a foil for the resounding success of the Bonn democracy. The flourishing health of western democracy is not unconnected to the creation of the welfare state after 1945. The social insurance system goes back to the 19th century, with considerable differences between countries. However, it is the English model, developed during the Second World War, which was to become the source of inspiration for the other countries of Western Europe. Within one generation it had become the norm, and the differences between countries diminished, even though the extent of provision was not the same for all. The welfare state model stopped at the iron curtain. Beyond it, social protection was certainly well developed, but the philosophy underlying the system was different. The weakness of the command economy explains the mediocrity of the services provided. Basically, the welfare state is a characteristic of Western Europe, different from both the communist system and the American system.
The fact that this model is now under threat, and that some are arguing for the American model, has particular historical significance in view of Western Europe's identity as it has been constructed, in particular through the European Community, over the course of the last forty years.

The Carolingian image

In its quest to unify in the aftermath of the war, Western Europe was to take various forms based on different institutional approaches and different concepts. There would be the European Community, EFTA, etc. Opposite, there was another Europe: the Europe of Comecon and the Warsaw Pact. However, it was the smallest of these configurations, the six countries which formed the ECSC, which was to dominate. Gradually, slowly but inexorably, the Community took on – or usurped, depending on the point of view – the name of Europe. It is easy to understand the irritation of some, such as the Scandinavians or the Swiss, on seeing the word 'Europe' increasingly applied to the Community during the 1960s, a usage which successive enlargements have only reinforced. The Community is thus at the root of one of the concepts of Europe. For 22 years, until the first enlargement in 1972, it was this little Europe of six countries which incarnated Europe's identity. Right from the start one could see the historical imagination set in motion. Very quickly, commentators and journalists started talking about a Carolingian or Lotharingian Europe. It is true that the map of the six founding countries of Europe covered exactly the same area as Charlemagne's empire. In both cases, the Elbe formed a border, even a barrier, against the barbarian tribes – or the communist countries. Of course there was no causal link between the two constructions, separated by eleven centuries. This was a mythological projection, but one that was popular for a long time because the historical connection seemed so irresistible.
Clearly, calling it a Carolingian Europe stresses western Christianity's role in founding Europe. The force of the image led some people to speak of the Community as a Europe of the Vatican. Be that as it may, the fact remains that the six countries which were the first to launch themselves into the European adventure are still seen as the spearhead or core of the European Union. They seem more committed than the others, and they are destined to be the heart of a future monetary union. It is all the more distressing, therefore, that one of them (Italy) has to stay on the sidelines, forced to do so by the Maastricht criteria. This essay deliberately leaves aside the question of the European identity in terms of culture and civilisation. Few observers contest the fact that Europe has a cultural identity, formed over the centuries, encompassing the diversity of national cultures. But this identity may not be as clear-cut as some would see it, and it is blurred at the edges: Europe's borders have always been problematic. Beyond this cultural identity, which the elites have recognised since the Middle Ages, but which has not stopped the Europeans constantly and mercilessly tearing each other apart, the period since 1945 has seen the emergence of several Europes, born of the convulsions of the First and Second World Wars. Only one of these Europes has managed to establish a public image – the European Community – and even that took four decades. The Community only really entered into public consciousness in the member countries with the Maastricht Treaty and the public controversy which it generated.

European Identity and/or the Identity of the European Union
Thomas Jansen

When speaking of "European identity" one needs to state what exactly is meant, as each of these words taken individually may be ambiguous and confusing.
The "European" identity we are seeking to outline here is that of the European Union, the word "identity" being understood to mean the spirit of this community, indeed, the very source of its cohesion. In so doing, we assume that both the European Union as an organisation and its tangible manifestations, policies and achievements are expressions of that identity. It is incumbent on the European Union as a political and democratic organisation to ensure that its citizens and peoples not only understand but actually espouse the spirit of the Union if they are ultimately to identify with it. Indeed, the Union's very ability to survive, grow, act and succeed in its endeavours depends on it.

The factors of European identity

Let me first recall the basic factors of European identity in a broader sense, which even a precise definition cannot dissociate from that of the European Union. For, even if since its inception the European Union has never embraced more than a part of Europe, its vocation still relates to Europe in its entirety. And the historical, cultural, social and political components and factors of European identity which bind the continent together, east, west, north and south, will certainly increase in importance as the Union grows larger.

Historical Factors

Ever since the early Middle Ages, all political processes in Europe have been interconnected. There gradually arose a complex system of relations between tribes and peoples, dynasties and classes, states and empires, which, in a context of constant change, became ever more intricate and refined. Systems of domination and counterbalance arose and collapsed as a result of recurrent wars, only to be followed by fresh attempts to build empires or peace settlements.
Just as nations are defined as communities of destiny, it can also be said of Europe as a whole that a shared history over many centuries has given rise to a differentiated yet in many respects interconnected and mutually dependent community of destiny. Proximity and the shared nature of both individual and collective experience have fashioned a special relationship between the peoples of Europe which, whether consciously or unconsciously, has had the effect of forging an identity. Even in places where togetherness gave way to antagonism, where proximity resulted in demarcation or where coexistence deteriorated into rivalry and ultimately war, shared experience has left a deep imprint on Europeans. Likewise, the very causes of the wars in this as in previous centuries sprang from intellectual currents simultaneously at work everywhere in Europe.

Cultural Factors

The shared historical experience is underpinned by a considerable degree of cultural unity of which, paradoxically, diversity has been a constituent part. This diversity has common roots, i.e. it is the outcome of a combination of the Mediterranean Greco-Roman culture, which contributed the sum experience of the ancient world as a conservative and stabilising element on the one hand, and the continental Germanic-Slavonic culture, which contributed the dynamic, youthful and forward-looking component on the other. The decisive catalyst in this synthesis was Christianity. The European world which emerged from this process during the Middle Ages never lacked awareness of its unity. Likewise, in modern times and even very recently, this awareness has always survived despite the bloodiest of wars waged in the name of national differentiation or opposing nationalist or ideological aims.
Social Factors

Not least because of its cultural unity, in which any differences can be seen as so many aspects or individual expressions of a shared background, Europe developed into a single area in social and economic terms as well. Despite all the typical differences between its diverse regions, a similar pattern of economic development served as the basis on which social life progressed along similar lines everywhere. A significant part was played here by a highly developed trading system involving large-scale exchange of goods, labour and know-how. It formed a large internal market which, despite the restrictions imposed by the upsurge of nationalism in the 19th century, flourished up until the First World War. Symmetrical social development in the regions of Europe was matched by a simultaneity of social crisis and radical change, and then in turn the formation of social groupings or classes predisposed towards transnational identification, thus creating the conditions in which the integration rooted in historical developments and a common culture could take hold. A radical break in this movement towards social integration occurred only with the division of Europe into two fundamentally different economic and social systems after the Second World War, a period from which Europe is only now beginning to recover.

Political Factors

History since the Second World War has shown that the intellectual and cultural strengths of the Old World are far from exhausted. The fact that the Europeans adopted a critical stance towards their history but at the same time opened up to stimuli from the new worlds of America, Asia and Africa, and the fact that they ultimately responded to the challenge of Communism, also impelled them to develop a new self-awareness.
The European identity expressed in that new self-awareness is characterised by a marked drive for organised action which, now that the Central and Eastern European nations in an act of self-liberation are reuniting with the nations of western Europe, is confronted with new challenges. The open democratic societies did not succumb to the threats or enticements of Socialist revolution and its claims to march in step with history. On the contrary, they succeeded in maintaining and developing their attractiveness. They emerged strengthened from all economic, social and cultural crises. In the North Atlantic Alliance, they were able to organise their security jointly. Lastly, in the European Community, a significant group of democratic states created a model of peaceful cooperation, peaceful change and unity which exerts an extraordinary power of attraction throughout the world.

National unity of the states and political unity of Europe

The European Union is a young and still incomplete community composed nonetheless of old communities. Its Member States still possess a fairly strong identity. It is therefore only natural that, in seeking to define an appropriate way of expressing the European identity that appeals to the public, we should ask how the identity of the Member States expressed itself when (in the 19th century or before) they were still in their infancy.
The unity of the Member States as they came into existence was based mainly on:
- a common language and culture, or common cultural and linguistic bases;
- a common experience of history, which could even encompass the experience of mutual antagonism between different sections of what has now become one nation;
- one economic area, with neighbourhood markets developing right across the region;
- a shared need for security against external threats.

Similar factors go to explain the process of European integration and the emergence of a supranational European Union:
- the experience of history acquired by the peoples and states of Europe, both in war and in peaceful exchange;
- common cultural bases, even if their expression has been diverse;
- economic necessity and shared practical interest within a market which transcends the national and continental framework;
- the setting of limits in relation to an enemy power which poses a threat to freedom and integrity (the USSR, with its aggressive ideology and totalitarian regime).

Just as the factors referred to with regard to the formation of the nation state did not all affect all participants in equal measure, not all of the population feels equally inspired or convinced by the foregoing justifications with regard to the European Union. It will nonetheless be observed that it is these common factors which, now as then, influence the decisions of the political, social and intellectual elites. And now as then we see amid those same elites sizeable minorities and occasionally even majorities of Luddites who, unwilling to relinquish the past, reject any identification with new contexts and find arguments for their ideas which are heard and believed by a certain section of the population.
These are all socio-psychologically explainable transitional phenomena which arise in the definition of a new European identity (including the difficulty of expressing this identity in an appropriate fashion) or in the search for a European awareness which transcends the national awareness. To see them as problems specific to European unification would be to approach them from the wrong angle. For it is clear that changes in political and social circumstances do not always immediately result in a change in awareness. Only when new circumstances are perceived as realities do we adapt our thinking and planning accordingly. The time lapse between the appearance of the new and its perception is attributable to the fact that the old continues to coexist in parallel with the new for a while, or even permanently. As a result, awareness continues to revolve around the old and therefore barely notices the new. The debate on the feasibility or non-feasibility of supranational/transnational statehood or democracy offers prime examples here. The lack of identity of the young, new and constitutionally not yet established community known as the "European Union" is also accompanied by certain problems of legitimacy which its institutions in particular have in projecting and asserting themselves. However, if one compares these problems with similar problems of the Member States and their constitutional situation, they can be seen to be quite obviously commonplace phenomena with which all communities have to contend, regardless of the level at which they are established. In this respect, the problems at the various levels may perhaps be connected:
- the weaker a nation's self-awareness, the less problematic is its European awareness?
- the weaker the confidence in the system of the nation state, the greater the hope placed in the European institutions?

The absence of a consensus on the constitution

This is a practical problem and one which confronts politicians with practical tasks.
It manifests itself in the deficit of legitimacy with which the authorities have to contend every time they want to make innovations whose advantages are not always immediately apparent, given the time it may well take for results to be produced, whereas the disadvantages, whether short-term or medium-term, real or imaginary, have to be taken into account. For any political project to gain acceptance it is therefore important, indeed indispensable, for its meaning to be clear, its components visible, and its effects foreseeable. If the European project is to succeed, then it is crucially important for it to be understood. But what does the European project entail? A Union organised on federal principles and endowed with a democratic political system which, through its institutions and laws, guarantees internal and external security and which takes on major tasks, beyond the capabilities of individual Member States, in a manner accepted by the public as serving its interests. However, in defining the project, we see at once that the project thus defined does not enjoy the support of all the participants. There are governments, parties, parliamentary factions and important social and cultural groupings which want to achieve a different project. Their European project is based on another idea. For example: cooperation between a group of states which agree on institutions and procedures to perform jointly defined tasks, case by case, but without submitting to the discipline of a democratic and federal system. In other words, there is no consensus on the "finalité politique" of European integration, and this above all makes it difficult to establish and give expression to the European identity. For the European Union remains the unfinished practical expression of an ultimately undefined project. It is therefore more process than project; it is the blueprint for a product, the real shape of which remains undecided. Equally undecided is the geography of the Union.
Where does it place its borders? There is no consensus here either. The dilatory treatment of Turkey's desire for Union membership is proof of this, as are the difficulties in agreeing an enlargement strategy with respect to Central and Eastern Europe. And then there is the fact that we have become accustomed to seeing certain challenges as the most important motives for the unification of Europe: the establishment of an enduring peace between the participant nations, the reconstruction of a devastated continent, the reacquisition of a role in international decision-making, the defence of freedom against totalitarian Communism, the guaranteeing of a democratic future, and greater and more widespread prosperity. As European integration policy achieved results, so these motives gradually faded into the background; and since the watershed year of 1989, it has become clear that the European Union needs new motivation. This does not mean that all the original reasons and motives for the policy of European unification have become obsolete. They retain, albeit in a different context from before, a certain reality content. This is true even if they no longer carry the same weight as in the 1950s and up until the 1980s, because:
- the process of rebuilding Europe from the ruins of the war has long been completed;
- the peace between those nations of Europe which took part in the integration process is today guaranteed by the existing set of institutions;
- the Soviet-Communist regime has collapsed;
- democracy has established itself in all European countries and can be regarded as secure;
- the aim of a more widespread prosperity has been achieved to an unparalleled degree;
- Europe can regard itself once again as a leading player and partner on the world stage.
The question as to what makes it necessary to take integration further, now that the most important goals have been achieved, is therefore warranted; it challenges us to define and explain the new objectives and motives in order thereby to give appropriate and perceptive expression also to the identity of the European Union.

The new tasks

The new challenges confronting Europeans now and in the future arise from various developments:
- the process of unification itself, which has generated a dynamic through which the responsibilities of the European Union have increased substantially and certain reforms of its political system have become indispensable, since it will otherwise be incapable of performing the tasks entrusted to it;
- the collapse of the Soviet Union and the accompanying end of a bipolar world order based on two mutually opposed superpowers;
- the technological and industrial developments which are giving rise to new ways of living, working and operating all over the world.

Many of the individual measures enacted in the decades since the Second World War can be seen as preludes and pointers to the changes of recent years. However, we are only now becoming gradually aware of their full implications. New situations are arising, which we are attempting to conceptualise when we talk, for example, of the "globalisation of the economy" or the "information society". In coming to terms with the new situation, Europe will above all have to face up to the following challenges:
- the renewal of European society;
- the development of a democratic and workable constitutional order;
- the enlargement of the Union to include the countries of Central and Eastern Europe;
- the creation of a new world order in line with technological, scientific and social change.

The European nation states cannot rely on their own discretion and devices to carry out these tasks alone. For the challenges involved are directed at the entire Union.
They can therefore only be properly addressed through the combined effect of contributions by the individual states to the united action of the Union of European states and the added value of joint effort.

The Renewal of European Society

There are in Europe various competing models for the most effective and fairest social order. They are inspired by differing national concepts and traditions of social organisation and social life; even regional characteristics can be discerned, finding expression for example in the differences between the Northern European (more Germanic and Protestant) and Southern European (more Roman Catholic) societies. And neither must we ignore the influence which ideological and political convictions have exerted on societies in the individual European countries: Conservative and Liberal, Socialist and Christian-Social ideas have all left clear, distinguishable traces. And yet, we can now ascertain that over the decades, thanks to a common cultural foundation, a broad consensus has formed on a model which corresponds more closely than others to the vital needs and circumstances of Europeans. The differences between this European model and that of American society are striking, not to mention the models which underlie the societies of certain East and Southeast Asian industrialised market economies. What are the main features of this European model of society? Its central feature is what in Germany is called the "Soziale Marktwirtschaft", i.e. a "social market economy" which allows market forces full scope whilst subjecting them to a framework of rules designed to prevent abuse, satisfy basic social needs and provide a minimum of social security. The consequent solidarity and stability also make for greater freedom of the market; the efficiency gained as a result makes it possible to supply the necessary resources for social welfare and security. This model is being called into question and is now in jeopardy.
More precisely, the excessive growth of the social security system over the years has disrupted the balance between individual responsibility for the whole and society's responsibility for the individual. On the other hand, the pressure of competition accompanying the globalisation of the economy and communication has meant that, to safeguard jobs in "Enterprise Europe", substantial cutbacks have had to be made in the social security system, together with radical reforms in the way it operates. Ultimately, this twofold threat to the European model represents a virulent attack on the philosophy which underlies it; the motives behind the attack are partly ideological, partly conditioned by interests, and its aim is to eliminate the social dimension. The European Union would lose an essential component of its identity if it failed to withstand this attack. The agreement on social policy between the Member States (with the exception of the United Kingdom) appended to the Maastricht Treaty was a first important step. The Commission White Paper entitled "Growth, Competitiveness, Employment" endorsed by the Union in the autumn of 1994 contains a programme for the safeguarding and reshaping of the social and economic order of the Union. The aims of this programme are likewise served by the proposal for an Economic and Monetary Union, in particular its establishment in stages and the definition of a sound financial situation as a preliminary requirement for the introduction of a single currency and the consolidation of the single market in the large frontier-free European economic area.
The reform programme which underlies the policy of the European Union is sustained, moreover, by the confidence that the peoples of the old world, who have emerged from the tribulations of repeated fratricidal wars and the humiliation of totalitarian repression, have lost neither their capacity for innovation and creativity nor their historical and cultural experience, and therefore possess all the assets needed to remain competitive in the global context.

The Development of a Workable Constitutional Order

The identity of a political community finds its noblest expression in its internal order, i.e. in its constitution. However, it is precisely in this respect that the European Union is defective. The first item on the agenda for the years to come is therefore the revision of the treaties in which the institutions, procedures and rules of the Union are rooted. It is generally agreed that the Intergovernmental Conference entrusted with the reform (of the treaties or constitution) should serve to bring the European Union closer to the people by making it operate more efficiently and openly. The Union should raise its profile, and its activities should become more understandable. It is clear that the expectations placed in the Intergovernmental Conference, which must be measured in terms of the major developments dependent upon its outcome (enlargement, monetary union, etc.), can only be fulfilled if the conference aims at the establishment of a federal and democratically legitimate structure. Federation could give expression to what is inherent in the European Union: namely, unity in diversity. At the same time, as a prerequisite for the definition of identity, this would answer the unresolved question of the "finalité politique".
Given the complex circumstances of the integration process in the Union, only a democratic order offers the possibility of tackling the pressing practical and political problems with any hope of success on the one hand, and of giving meaning to what we call Union citizenship on the other.

The Enlargement of the European Union

The historical watershed of 1989 confronted the European Union with a new task which will keep it occupied until well into the next millennium. After initial reticence, attributable to widespread unease about the new uncertainties as well as to misunderstandings and a resultant distrust between the partners, there is now a general consensus that every effort must be made to incorporate the states and peoples of Central and Eastern Europe as Members of the Union as soon as possible. There are many justifications for this: historical, moral, social and, not least, the fact that this is the only way of ensuring lasting economic and political stability and peace in this region. The Union already treats the states of Central and Eastern Europe as future members, and more and more systematic efforts are being made to achieve what in previous decades was no more than a dream: namely, the unification of all of Europe in peace and freedom. Indeed, the establishment of the conditions for the enlargement of the Union is in full swing, in the individual applicant countries as well as in the Union itself. A strategy of preparation for membership has been drawn up in cooperation with the governments concerned. Important stages on this road of standardisation and harmonisation are the association agreements with the Central and Eastern European countries which, through these agreements, have moved politically closer to the Union. The economic and trade provisions and the connected assistance arrangements afford them the material and practical wherewithal needed to prepare for membership.
If, however, the future members of the Union have to be capable of accession, then the Union itself must become capable of enlargement. Thus, if it is to remain open to all European nations which can claim a historical and cultural right to belong to it, it must also solve the problems connected with a major enlargement from 15 to foreseeably 27 and perhaps even 30 Member States: the political and institutional problems, the economic and social problems, and also the financial problems, the solution of which will demand substantial additional solidarity on the part of the Union's present Members. A considerable leap in self-awareness could be made if this process of political deepening and geographical enlargement were handled successfully, also because the name "European Community/Union" has always suggested the encompassing and representation of Europe in its entirety. The closer this ideal comes to being achieved, the easier it will be to bridge the credibility gap.

The Establishment of a New World Order

Lasting economic and social stability is also vitally important for the European Union from the point of view of the Mediterranean area. It is therefore in the Union's interest, indeed it is its duty, to help create the conditions for peaceful development in this region. The Mediterranean Conference of November 1995 provided the impetus for a new inter-relationship based on partnership, which not only satisfies present requirements but marks a fresh start compared with the centuries of cultural and religious conflict which have characterised relations between Europe and the Mediterranean region in the past. The readiness of the European Union to face up to its responsibilities with regard to the Mediterranean area and Central and Eastern Europe (and moreover in relation to Russia and the Commonwealth of Independent States) is substantiated by large-scale development aid and development cooperation in the Third World.
It indicates a growing role for the Union as an actor in the international order. It has the capacity to play this role thanks to:
o its success in establishing its own order, representing, historically and structurally, an international order pacified in a lasting manner by democracy and federalism;
o the strength which it derives from the united action of its Members.
More unity, and above all more unity deriving from democratic decision-making procedures, will lend the Union greater weight and greater credibility in this role; to achieve such unity, further advances need to be made in the establishment of its internal order and the strengthening of its capacity for external action. The establishment of the European Community nearly fifty years ago was also a contribution to the creation of a more just and peaceful world order. Its endowment with democratic institutions and instruments for the common definition and implementation of policies in an increasing number of areas, and in particular its development into a European Union with a common foreign and security policy and a single currency, only becomes really meaningful if it is understood as a structural component of a "world federation", i.e. of a process which leads, via the organisation of large continental groups of states and a radical reform of the United Nations, to a world order based on subsidiarity. That is not to say that the integration of the European states and societies is not in itself also a high-ranking objective, for in the past it has led to the pacification and reshaping of Europe, increased economic prosperity and guaranteed social progress; in the future, through the corresponding effects of enlargement to Central and Eastern Europe, it will foster the same development in those parts of Europe which have hitherto been unable to take part in it. At the same time, European integration remains the basis for the effective discharge of all the major cross-border tasks entrusted to Europe.
However, in the context of world history, the unification process in Europe aims further than the construction of a Union. More precisely, the stability achieved through the process of building the Community, together with the instruments of peace devised in this process and the prosperity existing here, are all factors which oblige Europeans to assume responsibility in and for the world. This involves more than development aid and active concern for human rights or the protection of the global environment. It also involves the shaping of an institutional and legal framework for world progress: a worldwide economy, worldwide transport, worldwide communication, the ecology of the world and worldwide politics in its various branches. The European Union will be in a privileged situation, able to submit and implement proposals to this effect on the basis of its own experience, if in the years to come it succeeds in giving expression to its identity by successfully defending its societal model through renewal, giving an effective form to its political system and at the same time finding optimum solutions for its geographical enlargement.

A Contribution from Political Psychology
Tom Bryder

"Europeanisation", meaning the political unification or integration of Europe as we have recently come to think of it, is a relatively new phenomenon. More precisely, it refers to attempts at creating a European federal union, a distinct entity in relation to its surroundings. To those surroundings, such as people in the former colonies or in the United States, "Europeanisation" has a different meaning from that revealed by the integration perspective. As Edgar Morin (1990, p. 20) says, "it is difficult to perceive Europe from within Europe" ("Il est difficile de percevoir l'Europe depuis l'Europe").
From the outside it is often associated with expansive tendencies such as "European cultural imperialism" (in the former colonies) or "cultural snobbism" (in the United States); that is, a colonisation of the minds of people outside Europe, in Africa, Asia and America alike. Somewhat paradoxically, it is difficult to distinguish "Europeanisation" as such from what we in Europe sometimes call "Americanisation" or "American cultural imperialism". The difference for the political order, however, seems to be a matter of quantity and authenticity. Critics of "Europeanisation" so conceived, such as the francophone and German visionary intellectuals like T.W. Adorno, search for a European identity free of such connotations. Apart from this ingroup-outgroup aspect of "Europeanisation", we must deal with the ongoing processes by which European identity evolves, if it exists or is emerging. How is it created, sustained and dispersed? To what extent, and in what respect, can we characterise the formation of a European political identity as an outcome of learning, memorisation and information-retrieval processes? To some people, particularly the contributors to the French intellectual debate on the future of Europe, the contradiction between technocracy and meritocracy on the one hand, and democracy on the other ("Eurocrats" versus "Europe des citoyens"), poses the major challenge to the process of a politically unified Europe.29 It is, for example, presented as the end of minority rule in general by Wolton, who says (1993, p. 95) that "the passage from technocratic Europe to democratic Europe marks the end of the reign of the minority" ("Le passage de l'Europe technocratique à l'Europe démocratique signe la fin du règne de la minorité"). It is an expectation resembling the classless society of Marxism. Wolton (1993, p. 232) adds that this debate is more widespread than claimed here: "The theme of the 'European technocracy' is omnipresent in every country" ("Le thème de la 'technocratie européenne' est omniprésent dans tous les pays").
Conceptualisations and Definitions

Let me first mention some definitional issues that might be helpful in a search for appropriate conceptualisations of identity. According to Webster's: 1a. sameness of essential or generic character in different instances; or 1b. sameness in all that constitutes the objective reality of a thing; or 2. unity and persistence of personality; or 3. the condition of being the same with something described or asserted. Le Nouveau Petit Robert (1993, p. 1122) is somewhat more exhaustive (here in translation): 1. The character of two identical objects of thought; qualitative or specific identity; similitude. The identity of one thing with another, of one thing and another. Identity of views; community. 2. The character of that which is one; unity. 3. PSYCHOL. Personal identity: the character of that which remains identical to itself. The psychological problem of the identity of the self. Identity crisis. Cultural identity: the set of cultural traits proper to an ethnic group (language, religion, art, etc.) which confer on it its individuality; an individual's feeling of belonging to this group; acculturation, deculturation. Psychologists and psychoanalysts say that identity equals "the sense of one's continued being an entity distinguishable from all others" (Rycroft, p. 68). As Rycroft also says (ibid.): "The sense of identity is lost in fugues and perverted in schizophrenic delusions of identity in which, typically, an underlying sense of nonentity is compensated for by delusions of grandeur." A fugue designates a process by which an individual loses her or his sense of destiny and location. In psychoanalysis, fugues are classified as instances of hysterical behaviour and cited as examples of dissociation of consciousness. They typically arise out of role confusion, when an individual cannot cognitively handle the information she or he faces.
A transposition of psychoanalytical concepts to a figurative political language may, I believe, create some fruitful associations which can assist us when we try to explain, for example, disintegrative processes in central and south-eastern Europe, or integrative processes in Western Europe. Taking a preliminary view of what identity is from the psychoanalytic description, we may consequently look at "identification" as: "The process by which a person either (a) extends his identity into someone else, (b) borrows his identity from someone else, or (c) fuses or confuses his identity with someone else. In analytical writings, it never means establishing the identity of oneself or someone else." (Rycroft, p. 67) The expression "to identify with" bridges an individual identity and a shared identity ("I", "me" and "we", "us"), that is, some kind of "social" or "political" identity.

The Place of Identity in Modern Political Research

In modern political science (cf. Lasswell, 1965), identity is usually treated as an element in a "political perspective", the other major components being "demands" and "expectations". Probably influenced by sociological role theory (which is wider in scope than psychological identity theories, since it incorporates behaviour as well as thought and emotional processes), some authors seek a solution to identity uncertainty in the concept of multiple identities. But who should determine what these identities should be like? The concept of identity cannot be patented by any traditional political-sociological group. It is not part of the traditional ideological quest for a distinct political vocabulary, as revolutionary socialists tended to believe before World War I. As Wolton says (1993, p. 48): "Identity, nation and tradition are not values of the 'right'; they belong to all political families, and there is a Eurocratic conformism in demonising these words." ("L'identité, la nation, la tradition ne sont pas des valeurs de 'droite', elles appartiennent à toutes les familles politiques et il y a un conformisme eurocratique à diaboliser ces mots.")
As a matter of fact, the dynamism of a pluralistic and democratic conception of political identity presupposes that multiple-identity pragmatism need not be present at the individual level of analysis at all, but only at the social level, in the form of choice options (Wildavsky, 1987). From a theoretical point of view, the lack of hierarchical priorities among identity objects may lead to the kind of psychological state called fugue, described previously. Mixed or uncertain political role conceptions are not the same as cultural pluralism and may eventually lead to hypervigilance (psychological distress), decision evasion and paralysis. Territory, language, ideas, culture and history may all serve as objects with which we wish to establish notions of political identity. But which objects are of primary, secondary or lesser importance to the citizens of Europe? Which objects are necessary, and which are sufficient, for the establishment of a notion of European identity? In the French debate, the opposition between objects of identity is basically seen as a conflict between "modernism" and "voluntarism", not between social classes or party alignments. Modernism is seen as creating a link between identity and nationalism, and "voluntarism" is seen as creating a link between identity and history. Moreover, the construction of the new Europe, according to the French debate, does not simply mean a democratisation of the technocratic Europe which has been the foundation of previous attempts to integrate Europe politically, economically and culturally, but a radical break away from both the modernistic and the voluntaristic "paradigms" (Wolton, 1993, p. 67). The cardinal issue revolves around the opposition between democracy and totalitarianism. This issue re-emerged when the Communist menace disappeared around 1990.
Which, then, are the attitudes of the general public towards the European Common Market of yesterday, as it was usually referred to in the 1980s, and the European Union of today and tomorrow? Should decision-making in Europe be confined to the approximately 50,000 Eurocrats, or opened to the 343 million citizens? If the Eurocrats, as a caste, are indispensable in the process of European integration, how do we ensure that they are made accountable to democratic institutions and that they take considerate attitudes towards the citizens of Europe? What should the role of national parliaments and the European Parliament be in the future? With the present tendency to transfer power from government(s) to markets, what will the scope, weight and domain of political power in the political system of Europe be in the future? Let us first take a look at the objects of identification, and see if they provide us with adequate criteria for choice and commitment.

Geographical Criteria

What first comes to mind when trying to outline what it means to be a European is, perhaps, Europe as a geographical unit. Political systems such as the Italian, the French or the Danish all embrace a notion of territory. So important is this that Max Weber made territory a major component of his definition of what a state is. But how do we establish where the boundaries of Europe are? Should Greenland be included, if we look at the map from before it gained autonomy (Hjemmestyre)? The Faeroe Islands? Madeira? The Canary Islands? Cyprus? Malta? Uzbekistan?

Linguistic Criteria

In France it is sometimes maintained that "linguistic fractioning is... constitutive of European identity" ("Le fractionnement linguistique est... constitutif de l'identité européenne", Wolton, 1993, p. 84). At the same time, the practical problems of the language barriers are recognised (ibid.): "The principal problem of Europe is the absence of a common language, with insoluble problems of communication, notably in Brussels and at the Parliament. Indeed, of the 13,000 officials at the Commission, 1,700 are translators, i.e. 2 translators for every 13 officials." Many people see this lack of linguistic unity as an indication of how difficult it is to unify Europe: "Europe is also a crossroads of languages, since forty-three languages are spoken there, to varying degrees" (Wolton, 1993, p. 17). What about English? Many people in most European countries, however defined, speak English. But so do many people in America and Australia, and as a native language of a European state, English is not spoken by as many people as is, for example, German. Moreover, French, Italian and Spanish are strong competitors within the European context. So language cannot easily be used as a common denominator for establishing a unified sense of European identity. Still, as Edgar Morin points out (1990, pp. 232-33), English may very well be used as a working language without the creation of an Anglo-Saxon cultural hegemony: "Europe runs no cultural risk in English becoming its principal language of communication. Has English not served as the language of communication between the various Indian cultures and ethnic groups without corrupting them, without devaluing the regional languages, without superimposing English identity on Indian identity? The use of English, accompanied by the knowledge of two other European languages, would moreover have the advantage of facilitating communication with the rest of the planet."30

Cultural-Ideational Criteria

One can, of course, assume that life styles, traditions and behavioural patterns within some European territory, more or less arbitrarily defined, constitute a "European culture". But even within nation states it is dubious to speak of specific political cultures, since other criteria, such as class, urban versus rural, north versus south, and the like, tend to give more explanatory power to the notion of "political culture".
The political culture of the British working class is definitely different from that of the middle class and the gentry; the political outlook of farmers in rural Holland definitely differs from that of city dwellers in The Hague, Amsterdam and Rotterdam; and northern Italian conceptions of politics are very different from those held by the population of Sicily and Naples. And as the two World Wars in this century have shown, Marx was definitely wrong in believing that the working classes of the world had so much in common that they would prefer class to nation as a chief object of identification.

30. Others, like Wolton (1993, p. 162), are more cautious and less optimistic: "Post-national identity is the means of constructing this identity, resting on adherence to democratic, communicational political cultures, which attribute a certain influence to exchange and notably sidestep the problem of language. How can experiences be communicated without a common language?" ("L'identité postnationale est le moyen de construire cette identité, reposant sur l'adhésion à des cultures politiques démocratiques, communicationnelles, qui attribuent une influence certaine à l'échange et font notamment l'impasse sur le problème de la langue. Comment communiquer des expériences sans langage commun ?")

Analytical Criteria

If a political perspective reflects aspects of political cultures, and if identity is a necessary element of a political perspective, then it follows that we must give further consideration to political culture. At a somewhat high level of analytical abstraction, Wolton argues that one can intuitively speak of culture in three senses. In the first place, as an opposition to nature, that is, as the result of human labour. In the second place, culture can be seen as that which unifies a people or ethnic group and which allows us to distinguish cultures from each other. In the third place, finally, culture can be seen as "high culture", as implied when we speak of being cultivated, familiar with literary traditions and art, etc. In Europe, all three notions have always co-existed at the same time (Wolton, 1993, p. 312). Yet there were dynamisms and developments, as Laqueur has pointed out (1970, p.
344): "With all its vitality, post-war European culture faced grave problems. The stultifying effects of mass culture, the standardisation of the mass media, the commercial production of cultural goods, constituted an insidious danger which in this form had never existed before. At the other extreme there were the futilities of an esoteric, precious, often sterile 'high culture', divorced from real life and from people, a dead-end rather than a narrow pass on the road to new cultural peaks. Culture had become less spontaneous and far more costly..." Trying to relate these common-sense notions to the debate on European political culture, Wolton says that empirically there are three national approaches with ingredients borrowed from these notions (Wolton, 1993, p. 312, here in translation):
v The first, "French" sense insists on the idea of the work, of creation. It presupposes an identification of what is considered cultural, in terms of heritage and creation, of knowledge and learning.
v The second, "German" sense is close to the idea of civilisation. It is the totality of works and values, of representations and symbols, of heritage and memory, as they are shared by a community at a given moment of its history.
v The third, "Anglo-Saxon" sense is more anthropological, in that it insists on ways of life, everyday practices, day-to-day history, everyday styles and knowledge, images and myths.

Historical Criteria

To the extent that we wish to speak of a common European historical destiny, we would find that there is more competition, rivalry, strife, war and other non-cooperative behaviour than there is co-operative behaviour. In an attempt to summarise the results of a historical survey of Europe's origins, Morin (1990, pp. 22-23) says that "Europe dissolves as soon as one tries to think of it in a clear and distinct way; it fragments as soon as one tries to recognise its unity."
"As soon as we try to find for it a founding origin or an untransmissible originality, we discover that there is nothing that belonged to it alone at its origins, and nothing of which it today has the exclusivity." In this sense, it seems inappropriate to speak of the long-term historical origins of a European identity, which, according to Webster's, Le Petit Robert and the psychoanalytical definition alike, would have to denote a form of sameness. In the period before World War II, the term Europeanisation tended to express the effects on Australian, Asiatic, American and African cultures and civilisations of the peculiar civilisation that grew up in modern Europe, including what we today call Eastern and Central Europe, as a consequence of the Renaissance, the Calvinist and Lutheran Reformation and, later on, the industrial revolution. As George Young wrote in the 1934 edition of The International Encyclopedia of the Social Sciences (1937, p. 623): "Europeanisation may be expressed politically by imposing the idea of democracy, in the sense of parliamentary and party government, or of sovereignty, in the sense of suppression or subordination of all government organs to the sovereign state, or of nationality, by creating a semi-religious solidarity in support of that sovereignty. It may be expressed economically by imposing ideas of individualistic capitalism, competition and control on communities enjoying more elaborate and equitable, but less productive and progressive, collectivistic or communal civilisations; or industrially by substituting the factory and the foundry for the hand loom and home craft."

Subjective versus Objective Criteria

Should we satisfy ourselves with just noting that "European" is whatever one is if one says so? If we reason along this line, National Socialists and Arab Socialists would be "socialists", and National Democrats (that is, neo-Nazis of the 1960s) and representatives of the former "People's Democracies" would be democrats.
If political science equals the creation of political clarity rather than confusion, a purely subjective approach seems inappropriate. For reasons of expediency, I would suggest that we opt for something like a minimalist objective approach. For a person to be "European" she or he would at least have to:
o be a citizen of a state located, by stipulation, within a geographical entity called Europe;
o speak a language which is officially accepted as one of the official languages of that state;
o share a historical destiny with other people, within that state, speaking the aforementioned language;
o share a cultural pattern with other such people, where the cultural pattern is seen as consisting of similar cognitive, evaluative and emotional elements.
Citizenship is a legal criterion. An Australian citizen would not qualify even if he had lived for a long time in a European state; neither would aspiring immigrants or refugees. Language is somewhat weaker as a criterion variable, as I have already mentioned. Shared history is also a weak criterion: what about people living in territories that have historically been contested, such as South Tyrol, Alsace-Lorraine, Schleswig-Holstein, parts of the former Habsburg empire, or the former USSR? What about the Basque separatists and Catalonian nationalists, not to forget the Balkan states? With respect to a notion of European identity, as opposed to the national identities of Europe's constituent states, peripheral territories will constitute problems, since Europe is a peninsula rather than a continent. Hence we have had problematic notions such as the old "cordon sanitaire", which was invented between the two World Wars to define a buffer zone between the Soviet "dictatorship of the proletariat" and the rest of Europe, and the "Partnership for Peace" within the new world security order.
Shared culture also seems insufficient when we wish to create a distinction between European and non-European identities; besides, cultural criteria seem to overlap with the other criteria, as I have already mentioned. Since culture can be based on any of the three previously mentioned elements of a political perspective (identification, demands and expectations), we run the risk of exposing ourselves to definitional circularity if we use it as an exclusive criterion.

Three Kinds of Motives

Some people tend to perceive themselves ("to identify") on the basis of what they think they are and have been, and draw their political conclusions on this basis: "I am a Danish farmer, or a Danish farmer's son, so I must vote for the agrarian party." They are characterised by their "because-of" motives. Other people tend to conceive of themselves in terms of what they want: "In order to promote a free society, I will vote for the liberal party." These people are characterised by their "in-order-to" motives. Still others perceive themselves on the basis of what they expect: "Activism is required if I wish to gain what I want, or preserve what must be preserved, in order to live a good life"; or "Fatalism or free-riding will be better for me than activism." This third group can be characterised by their "optional-choice" motives. The first requirement for a political identification to occur is the recognition of a "self" distinct from others, i.e. "them". This is "identification" proper. What is distinctive about being European today, if we compare it with being, say, Australian, Canadian or Mexican? What are the significant characteristics of being European today in comparison with being European before and immediately after the Second World War? The accumulated efforts of Schuman, Adenauer, de Gaulle, Monnet and Delors have all made a difference, but will it continue?
In the second place, there must be a recognition that this "self", this "identification", stands in opposition to "them". This is regrettable for those who advocate world federalism and continued responsibility toward the Third World, but in order for an identity to thrive there must be a challenge, a recognised competitive edge or a conflict of interests. Political self-recognition and the recognition of opposition between the "self" and "others" tend to reinforce each other, as in Marxist theory, which claims that the class in itself (Klasse an sich) becomes more distinct as it fights for its interests against other classes, so as to emerge as a class for itself (Klasse für sich). As the social psychologists Hans Gerth and C. Wright Mills say in Character and Social Structure (1979, p. 288), "It is in controversies that symbol systems are tightened up". Although we may recognise a competitive edge and a conflict of interest with "non-Europeans" with respect to, say, economic issues, Europe is still integrated in a wider global community through GATT, the United Nations, NATO, etc. So despite attempts by the European Union to create a separate identity for Europeans, not unlike the Marxist notion of a "Klasse für sich", there are other centripetal and centrifugal forces at work creating wider as well as narrower political identities. The third step in the establishment of a separate political identity involves a cognitive simplification of the world, in which most events are interpreted in dual categories such as "European" versus "non-European". The cognitive simplification process has two explanations, each of which is equally valid. Man faces great and complex problems but has limited capabilities to process information. In order to focus attention and regain perceptual control, some aspects have to be disregarded; otherwise chaos follows.
Politically this is also necessary, because the audience of the politically active must be reached with simplified images that get through to everyone. When it comes to the identification of Europeans, such a simplified "black-and-white" perspective is probably (and hopefully) not an enduring characteristic of the electorates of Europe. Black-and-white thinking and stereotyping tendencies seem to have more in common with the kind of totalitarianism propagated within the ranks of the German Republikaner, the French Front National, the Vlaams Blok in Belgium and a few more marginal groups, perhaps inadequately described as "totalitarian", such as the Danish Fremskridtspartiet and the Ulster nationalists. Not even the neo-fascist Italian MSI (now calling itself "the National Alliance") and its sub-organisations can be accused of such xenophobia and single-mindedness as that which goes into simple cognitive dualisms. Lowell Dittmer describes the process of identification when he says (1977, p. 573) that "The process of political identification involves generalisation from objective perception to subjective wish-fulfilment...". However, Wolton (1993, p. 82) says that it is possible and even desirable to accept the old distinction of out-groups versus in-groups, but that it must be given a new content: "Europe thus faces the same challenge today: to rediscover a figure of counter-identification, or to invent a new mode of structuring identity." This new figure of counter-identification, according to the French intellectuals, should be anti-democratic political tendencies and sentiments. The fourth and final requirement concerns expected and desired goals. Such goals can be elaborated as utopian systems or models, like the federalist and confederalist conceptions of a new European political, economic or security order, or as partial working solutions to pragmatically felt needs, such as those postulated by neo-functionalists. 
There are at least six more or less overlapping, contradictory and/or mutually supportive models to be discerned in the current debate on the integration of Europe and the development of a European political identity:
- The great Europe model: a confederal model, with an emphasis on external relations;
- The united nations of Europe: a federal model, with an emphasis on internal relations;
- The community model: a model for taking stock of what has already been achieved as a result of so-called neo-functionalist initiatives;
- The Europe of the nations (de Gaulle): a model which focuses on definitions of what should be included and excluded, and which would not necessarily include all European states in their geographical extensions;
- The minimal Europe: a liberal model in which market forces are given priority, but in which political and monetary issues are played down;
- The Europe of the "espace public": a democratic model for a Europe yet to be shaped, which ignores the traditional cultural cleavages and focuses on democratic versus totalitarian modes of identity.
Dominique Wolton says that these models have the quality of "ideal types" about them, but that (p. 218): "In fact, Europe is for the moment, and doubtless will be for a long time yet, neither a Europe of the regions nor a Europe of the nations, but a mosaic of models and of governmental responsibilities: supranational, national, regional, local, municipal, in which sovereignty is shared between the different levels of government." This is a reasonably pragmatic conclusion, since it allows the theoretical debate about European political identity to continue, and this debate is in itself a major source of political identification.

Conclusion and some practical proposals

It makes a difference whether we speak about plural identities or a plurality of choices when we look at the fears and hopes for the new Europe to be built. 
Plural identities are not necessarily "good" from the point of view of psychology, since they may cause distress, paralysis and confusion. The French intellectuals seem to believe that, when using different criteria as identity objects, one should not focus exclusively on geographical units, since the nation state is unlikely to perish anyway. When they advocate multiple perspectives they say that political criteria must be used, and in that way the debate is transformed into a debate about the future of European democracy, a debate with firm roots in European federalism. Since the establishment of the European Coal and Steel Community and the other European Union "pillars" there has been a change in the extent to which people regard themselves as European. This can be seen in the Eurobarometer surveys, which show that the sense of being European is greater among citizens of Member States that have been members of the EEC from the beginning than among the "newcomers". But even so, this may be misleading, because such "identification" may be based on parochial expectations of economic and other gains for the national unit to which one belongs, as for example in the case of Belgium, where European integration is demanded, but on the understanding that European politicians will further Belgian interests in the first place, rather than common European interests. What, then, can be done to further the idea of a common European identity tomorrow, if the pace up till now has been slow and uncertain? The answer to this question will greatly affect the future of the European Union. Since it is impossible to mention all the projects that might contribute to a greater inner strength of the European project, I will confine my attention to some rather basic ideas which are within the scope of practical realisation. It is now more than half a century since the end of the Second World War, and we have since seen the downfall of totalitarian Communism. 
But we still have traces of totalitarianism among us everywhere, in the form of racism, bureaucratic arrogance, and leftover sentiments of Communism, Fascism and even National Socialism in Europe. We have concerns about sustainable environmental development and about corruption among politicians, irresponsible bankers, and remote representatives in the Europe to which we belong. These are just a few of the issues to which many young people pay attention, though far from all of them do. If we can support those young people who feel concerned, and give them reasons to be grateful for what the European Union does to combat totalitarianism, racism and economic fraud, we may win over the next generation for the European project and make them feel more European than the older generations have felt. As the President of the European Commission, Jacques Santer, pointed out in his speech at a previous carrefour arranged by the Cellule de Prospective at the University of Lund in 1995, the great change in attitudes towards Europe will come with the next generations, those who know foreign languages and those who have lived abroad. This leads me to the practical conclusion that all of us who wish to strengthen European identity should promote travelling in all its forms all over Europe, especially by subsidising continued Inter-rail travel among the young during the summer holidays and whenever else it is possible. Since the birth of the European Union, through the implementation of the Single European Act in the early 1990s, many airlines have shown their goodwill and launched cheap travel programmes for both adults and young people. But more can be done in this area. For example, arrangements can be made with the youth hostel organisations in Europe so that travel and accommodation are not confined to those who are well off, have employment, or have received grants from various study programmes. 
Efforts can be made to maintain and enlarge the existing exchange programmes for students and teachers that are already effective, and an effort can be made to establish summer camps where young people from all over Europe can come together for three to four weeks to learn more and discuss problems of concern to them, including their immediate concerns about youth unemployment. If possible, they could even work directly on projects of common concern to us all, such as the rebuilding of roads and villages in the former Yugoslavia, when it is safe to do so again. The positive role of such initiatives in strengthening a European identity will depend upon the role played by the European Union. This role need not be too directly linked with our European institutions as they are today, and the most important thing is not to pour a lot of money into such projects, but to let the beneficiaries know where the support comes from. I envisage that the European Union could play the role of an empowering agent for institutions which already exist. We could awaken an interest in a European youth hostel movement, in a European Inter-rail travel system, and in European summer camps for young people. Such projects could send a positive signal to all European adolescents, employed or unemployed, students, trainees and working-class youngsters, a signal which says: "If you wish to know more about life in other European countries and if you wish to participate in furthering the goals of the new Europe, we are there to support you." Through such measures we can not only strengthen and build a future European identity, we can also make sure that the achievements of the past are safeguarded.

Reference literature

Bryder, Tom (1989). 'Political Culture, Language and Change'. Paper presented at the ECPR, Paris.
Christiansen, Bjørn (1959). Attitudes Toward Foreign Affairs as a Function of Personality. Oslo: Oslo University Press.
Deutsch, Karl W. (1953). 
Nationalism and Social Communication. An Inquiry Into the Foundations of Nationality. London: Chapman & Hall.
Dittmer, Lowell (1977). 'Political Culture and Political Symbolism: Toward a Theoretical Synthesis'. World Politics, Vol. XXIX, No. 4, July 1977, pp. 552-583.
Eckstein, Harry (1988). 'A Culturalist Theory of Political Change'. American Political Science Review, 82, 3, pp. 789-804.
Guéhenno, Jean-Marie (1993). La Fin de la Démocratie. Paris: Flammarion.
Inglehart, Ronald (1990). Culture Shift in Advanced Industrial Society. Princeton: Princeton University Press.
Keohane, Robert O. & Stanley Hoffmann (Eds.) (1991). The New European Community. Decisionmaking and Institutional Change. Boulder: Westview Press.
Laqueur, Walter (1970). Europe Since Hitler. Harmondsworth: Pelican Books.
Lasswell, H. D. (1965). World Politics and Personal Insecurity. New York: The Free Press (orig. 1935).
Mackenzie, W. J. M. (1978). Political Identity. Harmondsworth: Penguin Books.
Minc, Alain (1993). Le Nouveau Moyen Age. Paris: Gallimard.
Morin, Edgar (1990). Penser l'Europe. Paris: Gallimard.
Nettl, J. P. (1967). Political Mobilization. London: Faber and Faber.
Rycroft, Charles (1972). A Critical Dictionary of Psychoanalysis. Harmondsworth: Penguin Books.
Simon, H. A. (1985). 'Human Nature in Politics: The Dialogue of Psychology with Political Science'. American Political Science Review, 79, pp. 293-304.
Weiler, J. H. H. (1983). 'The Genscher-Colombo Draft European Act: The Politics of Indecision'. Journal of European Integration, 6, pp. 129-154.
Wildavsky, Aaron (1987). 'Choosing Preferences by Constructing Institutions: A Cultural Theory of Preferences'. American Political Science Review, 81, 1, pp. 3-21.
Wolton, Dominique (1993). La Dernière Utopie. Naissance de l'Europe démocratique. Paris: Flammarion.
Young, George (1937). 'Europeanization'. In Encyclopedia of the Social Sciences, Vol. 5. 
New York: The Macmillan Company, pp. 623-635.

What is it? Why do we need it? Where do we find it?

Edy Korthals Altes

Identity has to do with the individuality of a person or, in this case, of the European Union. What are its specific characteristics? In what ways does the European Union distinguish itself from other international or national agents? Identity in the sense of 'being yourself' is closely connected with the relation to others, 'seeing the other'. In this sense, Cardinal Lustiger could state that 'solidarity with those who die for lack of bread is an essential condition for Europe to stay alive'.31 The classic response to the question of European identity is: unity in diversity. Ethnic background, culture, religion and history are certainly important factors in the European identity. Decisive at this stage of the European integration process, however, is the question: what do we want to do together? The answer depends on the perception of the need for a common response to the challenges of today's world. This is not an academic question but a matter of survival! Identity is subject to change. It is not something 'static', given for all time. It is something that grows or withers away. Just as with individuals, there is a process of development (circumstances, events, inner growth). The present identity of the European Union is not robust but rather confused. It resembles a Picasso portrait: conflicting lines and different levels, not the unity of a human face. Or, to put it in diplomatic language: "the European Union is going through an identity crisis". It is still uncertain about its place in the surrounding world.

Internal and external aspects of European identity

Structure (decision-making process: efficient/democratic/transparent); Policies (agricultural, regional, social; just/unjust, greater inequality, exclusion); Economy: what are its objectives? 
To serve man and society, enabling all people to live a decent existence? Or is it just the other way round: man and society serving the economy? Accepting the tyranny of the iron laws of economics as an absolute, a given reality, something that cannot be changed. The economy as a goal in itself (growth, maximisation of profits and power, etc.). The environment: something to respect and to manage with great care and responsibility, or something to exploit whenever we feel like it?

31 Jean-Marie Lustiger, Nous avons rendez-vous avec l'Europe, 1991, Paris, Mame.

If we consider the economic aspects, the European Union looks quite impressive: a large internal market, a major trading partner on a world scale, a strong industrial base, great financial power, among the highest GNP per capita, good infrastructure, seat of multinationals, an impressive number of cars and TVs, etc. A realistic view is, however, obscured by the highly unreliable way of assessing what is really going on (inadequate measuring instruments, a poor definition of GNP). We are counting ourselves rich at the expense of well-being. The European Union is one of the greatest polluters in the world, and among the greatest consumers of energy and raw materials. About 20 million are unemployed, and many are poor. There is an increasing commercialisation of society, a progressive deterioration of social and medical care, and a degradation of education and universities (result-oriented, relevant for the economy but to the detriment of education). The opinion Europeans have of their own identity does not necessarily correspond with the perception of non-Europeans. While we may be indulging in thoughts of the 'civilising role' of Europe in a largely 'underdeveloped world', other nations, e.g. in the south of the Sahel or in the Pacific, may be inclined to curse the European Union (or some of its member states) for its selfishness (Common Agricultural Policy) or arrogance (nuclear tests). 
For the perception of our identity, deeds (actions, policies) are more relevant than words (declarations). African cattle-growers and local food producers suffer more from the negative effects of the dumping of the European Union's agricultural surpluses than they benefit from fine words about the vocation of the European Union in this world. The same applies to import restrictions, policies on debt, etc. And what about the striking contrast between the commitments made in Rio and Copenhagen and the slowness of the European Union's action? It should be clear by now that a drastic revision of extravagant production and consumption levels in the highly industrialised nations is a prerequisite for a sustainable world society. There will be no hope of effective control of the environmental crisis without far-reaching adjustments in the modern world. The position of the European Union is of particular relevance here.

A common foreign and security policy: Picasso's portraits provide a good illustration of the present chaotic state of affairs. The unity of a well-integrated external policy is still a long way off. Several Commissioners are responsible for different aspects. Efficient policy-making is not possible with the present set-up, under which the hands of External Affairs are strictly tied by the Council members! A common security policy is not just around the corner! And what will a common defence policy ultimately look like? Will the European Union adopt an offensive stance with a nuclear component and a large military establishment, or will it be content with a police function, preferably in the context of the UN? In the latter case, much more emphasis than is to be found in the preparatory notes for the Intergovernmental Conference should be given to conflict prevention, also in the economic sphere. This would also lead to a strict limitation of the production of and trade in arms, and ensure careful scrutiny of R&D activities in the military industry. 
In search of the 'heart and soul' of Europe

Reflecting on the structure of the European Union, its capabilities and policies is certainly very important as a preparation for practical propositions on how to express the European identity. But at this critical moment of a deep existential crisis, both in our nations and in the world, it seems to me that serious attention should also be paid to the deepest motivation of our actions. In other words, we should tackle the fundamental spiritual crisis so manifest in modern society. Jacques Delors' pertinent question as to the 'heart and soul' of Europe has to be answered. Indeed, what are we doing with this huge Brussels machinery wielding so much power? What are we heading for? More power and material well-being, especially for the stronger elements? Or do we have a vision of a sustainable and just society? A society in which all people, whatever their background (religious, cultural, national), have the right and the possibility to lead a decent life? A society in which people have respect for life in all its forms. A European Union with a balanced relation between the individual and the community, sustained by citizens who realise that each individual has a unique value that may never be reduced to an object of exploitation. Man, freed from the yoke of a one-sided fixation on economics, rediscovering that he is infinitely more than the homo economicus to which he is now being reduced by the apostles of greed in a materialistic culture. Man, part of a greater whole, knowing that he does not live by bread alone! Man, with a destiny, a meaning in life, sharing life and goods in a responsible way with other human beings in a global world. Of crucial importance for identity is the spirit providing the deepest motivation for action. One rightly stresses the importance of Christianity for Europe. But what is this spirit now? If we look at today's reality, it will be difficult to maintain that we are living in a Christian Europe. 
Secularisation, materialism, hedonism and individualism dominate modern culture. For many people, the sense of the transcendent has evaporated. The 'horizontal' approach, with its emphasis on a so-called autonomous 'I', has taken its place. This has far-reaching consequences for our relations with mankind (ourselves, fellow beings, the other and future generations), towards things and towards nature. In all three fields, man has lost his orientation, his bearings. Václav Havel has made some relevant observations on this loss of the sense of the transcendent and on the many problems of today, as well as the incapacity of politicians to solve them. Spiritually, the European Union is in a rather poor, desolate state. Impressive technological and economic achievements abound, but on a very meagre spiritual basis. A crisis of meaning is widespread: psychological problems, crime, drug abuse, a lack of respect for life with an annual death toll of 50,000 in road accidents, although this number could be drastically reduced through the implementation of certain measures. Television programmes of deplorable quality, etc.

We need a European identity

Only an effectively structured European Union (internally and externally) will be a relevant factor on the international scene, where the final real decisions directly affecting the life of all Europeans will be taken.
- No European state is any longer in a position to meet the challenges of the modern world (ecological crisis, unemployment, poverty, the rise of the world population, armed conflicts, the spectacular increase in the destructive power of modern arms).
- The dynamics of power relationships (nations as well as multinational companies): affected are therefore not only countries like the USA, Japan, Russia, China, East-Russia, etc., but also the major international players in finance and business. 
- The serious threat to a 'social market economy' caused by overwhelming global forces calls for a common answer.32

We cannot go on with our present rate of production, consumption and destruction of the environment. If we want a sustainable and just society, we must make progress in the direction of 'enough is enough'. We need to accept an upper limit and pay much more attention to the unsustainability of the present economy. We know that our planet cannot cope with a similar rate of economic expansion on the part of all other nations. We know that four fifths of mankind is in urgent need of development (aid?) in order to enjoy a decent standard of living and to escape from hunger and starvation. We must therefore strive for a reduction of our impact on the environment if we are serious about a basic sense of humanity. This cannot be achieved by technological means and fiscal and other measures alone. A fundamental change in mentality, in basic orientation, is needed. The obvious response to the global challenge would be a worldwide decision to set course towards a sustainable future: heading off a collective disaster by managing the planet's scarce resources and environment in a responsible way. This will however take time, indeed too much time. But why shouldn't the European Union, with its considerable economic leverage, take the initiative with a step-by-step approach, making it clear to the world that the one-sided emphasis on 'unlimited material growth' at the expense of real well-being is a fatal error? Recognising that other areas may be in need of further economic development, but that we have reached the stage of 'enough is enough'. That we are no longer victims of the false ideology that man has endless, unlimited material needs which have to be satisfied. After all, it was from Europe that the industrial revolution and the expansion of our economic system started. 
32 Michel Albert, Capitalisme contre capitalisme, 1991, Paris, Seuil.

A convincing European Union signal, illustrating a decisive turn in our economic approach, might trigger similar reactions in the US and Japan. Politically speaking, this deliberate change of course will not be easy. It could be greatly furthered if the European Commission entered into a creative relationship with those NGOs that promote a similar course of action. There may be greater concern among many people about the loss of 'quality of life' than many politicians think. One of the challenges is, as we have seen before, the rediscovery of the great spiritual resources that lie at the origin of European civilisation. There will be no renewal of European society without a fundamental reappraisal of man's place in the Universe, of the relation with the Ultimate. As we live in a multireligious Europe, this is a shared responsibility, not only for Christianity but also for other religions. In the present situation of a morally disoriented Europe, a simple appeal for 'norms and values' will not be enough. Much more is needed. Values without deep spiritual roots will not stand up in the present harsh reality. Consider, for example, the threat to the social model: it would be an illusion to think that it will be possible to maintain the 'social market', now under great pressure, without a strong spiritual basis. Europe urgently needs a radical change from its one-sided materialistic, horizontal approach to an attitude towards life which opens up towards transcendence. Christians throughout the ages have discovered in the cross of Jesus Christ the ultimate symbol, and reality, of this meeting of the horizontal and vertical lines. Jews and Muslims have other ways of expressing the reality of the transcendental experience.

Where to find it?

The great temptation is to look for 'identity' in the structure of the European Union, its institutions, regulations, acts and policies. 
And maybe even in its declarations. Ultimately, the European Union's identity depends on the political will of the member states and the way the European Union uses its competencies. But the political action of states is highly dependent on public support. Whether there will be sufficient understanding for necessary 'painful policies' depends on the motivation of citizens. It is thus a question of the spirit. What moves (activates?) people nowadays? The spiritual desert in which many people live is well illustrated by the statement of a Dutch cabinet minister (environment) that 'the car cannot be touched because it is an essential element of the identity of a person'! I doubt whether the European Union could ever develop its identity on the basis of this narrow materialistic concept of human nature. The European Union's identity will not be found in wonderful words about our common history and common sources of inspiration; not in digging up long-forgotten treasures of the past, but in acting together, on the basis of adequate policies meeting the present challenges. Just three examples of missed opportunities, all in areas on our doorstep:
1. The end of the Cold War and the breakdown of the communist system provided a unique occasion for a visionary approach to the new reality: a large-scale, well-integrated economic co-operation programme addressing the actual needs.
2. The handling of the crisis in ex-Yugoslavia.
3. The creation of an all-European security system in the spirit of the Paris Charter.
On these historic occasions, action would have given a greater impulse to the development of a European Union identity than a thousand seminars and numerous solemn declarations by politicians. Unless the European Union develops an adequate structure enabling it to deal effectively with the challenges of the modern world, we will not discover our common identity. 
It is up to the member states to take a hard look at reality and decide to break the impasse of the present "Impossible Status Quo"!33

Some practical and some more fundamental suggestions

- Continue and expand the excellent initiative of the Carrefours d'Europe, if necessary even under more modest circumstances!
- Bring spiritual and cultural leaders together with politicians, managers, journalists, etc. Strive for an equilibrium between the bureaucrats of institutions and 'independent' Europeans.
- Consider the possibility of a substantial increase in inter-European exchange programmes for students and scholars.
- Bring forcefully to the attention of Council members and public opinion that the European Union has now really arrived at a crucial point which will be decisive for its future: whether it will develop an identity or become a non-entity, making clear that the latter option will inevitably lead the proud member states down the same road towards oblivion.
- Deepening of the European Union should have absolute priority over enlargement. The danger of further diluting the identity is great.
- Translate the recognition that the spiritual factor is crucial for the European identity into active support for all those religious and cultural forces that can contribute to a spiritual revival in Europe. A new spirituality will liberate us from the dominance of economics, breaking the spell of the golden calf! This would pave the way for a humane and just society, offering the possibility of leading a full human life in which values such as love, beauty, truth and goodness, together with human rights, solidarity and justice, are guaranteed for us and for coming generations!

33 Club de Florence, Europe: L'impossible Status Quo, 1996, Editions Stock.

European identity and political experience

Mario Soares

Let me make two points clear to start with. Firstly, Europe is not just the European Union; secondly, I have no doubt that a European identity does exist. 
When my country embarked on the process of joining the European Community it did so for very specific reasons, namely to consolidate our newfound freedoms. Portugal, like Spain, had just emerged from nearly half a century of dictatorship, and it was essential to consolidate our democracy to prevent any resurgence of military power. We could only counter this threat by turning to Europe. This is not to say that we did not consider ourselves to be a European country before. Let me remind you that the Portuguese were the first Europeans to export the culture of our continent to the Indies, Japan and America. We were also the first to bring back to Europe the riches of the civilisations and cultures we discovered there, which were still completely unknown here. We have always regarded ourselves as Europeans, even if our country is on the periphery of the continent and faces the Atlantic and Africa. I have mentioned the importance of sporting European colours to consolidate democratic institutions that were still in their infancy. But there was another reason for Portuguese membership : we were very late in embarking on decolonisation. Having been the first colonial empire in the world, Portugal was also the last. But once our colonial empire had finally disappeared, fifteen years later than those of our neighbours and in difficult circumstances, and we found ourselves face to face with new sovereign states such as Cape Verde, Guinea, Angola and Mozambique, we felt that integration in the European Community was the natural counterweight to this change. We joined the European Community at the same time as Spain, in June 1985. At that point, it was not yet the European Union. Since then, we have seen the collapse of the Communist world and many profound changes. The European Community had two objectives : the most obvious, founded on Franco-German friendship, was to preserve peace on the continent. The second was to keep up with the United States and the Soviet bloc. 
With the end of bipolarity, the Community found itself plunged into a completely new situation. This was when Europe rediscovered its own values and escaped from the geographical and historical confines imposed by the Cold War. We realised that Europe was much larger and started to ask ourselves what we should do with the "rest" of the continent. We realised that we had a duty to reintegrate this "other Europe" into our Community, now a Union. But of course it is no easy matter: what will become of a Europe that was difficult enough to run with just 10 or 12 or 15 members when it expands to include 21 or 22 members in a few years' time? This is a problem for the European institutions, but it also touches on the very future of the concept of the European Union. Europe cannot just be the European Union within its frontiers as they stand today. Hungary, Poland and Bulgaria have the right to join our Community: their history and their contribution to the European identity fully entitle them to membership. They have contributed as much to the European ideal as we have. From these countries, I hear the same arguments that Spain and Portugal used for joining the European Community: we have freed ourselves from dictatorship, we have become democratic countries through our own efforts, without Europe's help. We also had the right to democracy at the end of the Second World War, because Great Britain and France had defeated the dictatorships and Germany counted for nothing in the immediate post-war period. Who allowed the dictatorships to re-emerge in our country, if not "democratic Europe"? Though it pains me as a socialist, I have to say that if there was one champion of the rehabilitation of the dictatorships at that time, it was the British Foreign Secretary. 
Driven by fear of Soviet pressure and fear of Communism in Western Europe - in both France and Italy - the democratic states of Europe took the view that it was more sensible and served their own interests better to overlook the fact that there were two dictatorships on their doorsteps. From 1945 to 1974, we continued to live under a dictatorship because of this sort of indulgence, because of the treachery of the democracies. They did everything to perpetuate the dictatorship in our country. It was the easier option : it was either that or risk letting Communism in through Spain or Portugal or somewhere else. This was the main consideration. Once we had rid ourselves of these dictatorships, our first concern was to assert that we were democracies and that you bore a share of the responsibility for our period of fascist or authoritarian rule. This gave us every right to sit at the same table as you, particularly as our contribution to Europe has been every bit as important as yours in the past. This is what we said to the European states. It is what our friends in Central Europe are saying to us today and it remains equally valid. They too can claim the right to sit at the Europeans' table. Economic reasons cannot stand in the way of this right. This is why it is our duty to find ways of dealing with the current situation and welcoming these states into the Union. The question of greater Europe is not confined to Central Europe alone : Europe is also linked to the Mediterranean Sea and the Mediterranean basin. It is linked to what happens in Eastern Europe. Where does Europe end ? On the Russian steppes ? Is Turkey part of Europe ? I was in Turkey quite recently and found that those who want to modernise the country proclaim their Europeanness. And rightly so. They have reasons for doing so. Is the European Union to be a club reserved exclusively for Christian countries - Protestant, Catholic and Orthodox ? 
Are countries with a predominantly Muslim population not allowed to join ? Is there some sort of religious bar to membership ? I do not think so. But the problem of Turkey is a serious one for Europe. How are we supposed to deal with an unprecedented situation like this on the institutional level ? On this point, my mind is made up : I agree with Chancellor Kohl that the construction of Europe is a vital matter for the next century, a matter of war and peace. Even if we had no problems of identity, if we fail to move towards a stronger European Union, if we fail to move rapidly along that road, Europe will find itself without a voice and will lose the importance it once had in the world. It is not just a matter of being heard throughout the world, but of having the strength to impose certain models which we believe encapsulate so many of the ideas which this ancient continent has produced over the years. The European model reflects serious humanist concerns based on fundamental human values : values of liberty and reason, solidarity and social justice. They are values without which the human race cannot successfully enter the XXIst century. In other words, Europe's interests are not limited to Europe alone. It is not a matter of simply asserting Europe's position in the world, but of going further and making a contribution to the world as a whole. If we fail to make this contribution, something will be missing, and we will fail to explore the paths that are most rational and most conducive to human happiness. This is how I see the situation and why I believe in Europe. I may sometimes criticise Europe with other pro-Europeans, but I do so because of my love for Europe. I do so because I am not afraid of the march of European progress, quite the contrary. I do not think there can be a solution which would unite 20 or 30 European countries but leave these essential values as the individual concern of each State. 
They must be pooled and managed collectively - and this is true of security policy and foreign policy as much as anything else. But it cannot be done without supranational European institutions. It cannot be done unless we move towards a united Europe, towards a measure of European federalism. I know that this word makes some people uneasy. But I have no qualms about using it. Like the founding fathers of Europe, I favour a structure which does not have to be identical to the one that already exists. It should be a new and original design. Others have already said as much in this seminar. It should evolve towards a United States of Europe, along more or less federal lines, perhaps with its own original touches, but basically federal, with a certain common direction. This requires the sacrifice of certain elements of states' traditional sovereignty, the pooling of national sovereignties. Without this, we cannot build Europe. There may come a moment when we have to say to those Member States that do not want to go all the way that they have no right to stop the others from going further. This path offers the best solution available in the short term. When I speak of Europe in such glowing terms, it should be clear that I do not mean Europe to be simply Europe of the free market, the single market, economic and monetary union and the single currency. Of course I support this, but only if we build a genuinely political Europe as well. Because if it is to be only an economic and monetary Europe I will withdraw my support. That is not the sort of Europe I am interested in. I am in favour of an economic and monetary Europe if it goes hand in hand with a political Europe and coordinated foreign policy : a Europe which defines its own security collectively, a Europe that is also a social union, a Europe of the people, a Europe with popular participation. I want to talk about the participation of Europe's regions, which is as important as that of the states. 
I want to talk about the participation of the cities, the people, the NGOs, the general public. This pluralism, this diversity, is the key to achieving a multi-faceted Europe capable of fulfilling its role in the world. This role is essential for maintaining equilibrium in the world and creating a new international order, without which disaster beckons. We are concerned about human destiny, about the environment, drug abuse, unemployment, the problems that preoccupy the younger generation. These are very serious problems which also affect the United States and, even more acutely, Japan, to mention just two important countries. But when we look at the countries of Southeast Asia, it is clear that their prosperity is based quite simply on slave labour. I was in China a few months ago, where I had the opportunity to meet various leading Chinese figures. My impression is that China is heading for an explosion that will be completely out of control, an explosion even more dramatic than the one that tore apart the Soviet Union, because things cannot go on as they are. You cannot maintain such a level of capitalist exploitation ; you cannot have a city like Shanghai with a very high level of development and staggering wealth and at the same time have public officials earning a pittance. A street trader in Europe would not accept such a meagre salary. Such inequality can only be sustained by high levels of corruption or crazy distortions which I am convinced will lead to social upheaval. As I see it, the world is completely deregulated at the moment. We are all well aware of this. The United States cannot run the world on their own, even if they want to. This is why it is important that Europe carries out its allotted task. It is a major challenge for Europe and for us Europeans. We must be ready to respond boldly. Unfortunately, we have not seen any great leaders stand up to defend this sort of point of view loudly and clearly. 
For electoral reasons, political leaders find themselves conditioned, tied by the rules of normal democracy, the rules of parliamentary democracy. They want to please and respond to the immediate present, with the result that they cannot provide the responses they are called on to make. They cannot provide answers to a much more serious problem which touches on the deepest aspirations of the individuals and societies of today. This is why we sometimes find ourselves deadlocked. We can see that concern is becoming widespread in Europe : there is disenchantment with Europe in the countries which joined the European Union most recently. It is clearest in Sweden, but is not limited to Sweden. The same disappointment is to be found in Germany, France, Spain and Portugal, not to mention Great Britain. What is the cause ? There is a mistaken idea that the European Union is a bureaucracy based in Brussels which concerns itself with the details and tries to regulate the life of the ordinary citizen instead of allowing him a voice and the chance to do something for himself. I believe this to be completely false, but this is how things are perceived. Matters are made worse by the fact that the situation for young people is very difficult : unemployment, delinquency, drugs, social exclusion, AIDS are all problems which particularly affect young people. The solutions proposed tend to have an economic or technocratic slant : they are not the answer to these human problems. This is what young people feel and this is why people are pessimistic and suspicious about Europe. Europe has to be relaunched. The European identity has been described as a changing concept and this is true. It has to incorporate the great values and aspirations of the different nations. 
If we could do that, if we had the courage to do it and to speak the truth when dealing with the big problems, if we were able to resolve these problems by taking steps towards closer European integration, Europe would begin to respond positively to the great challenges of the day. These great challenges may be stated in very simple terms : either we are able to understand and create a true political, economic, social and cultural union, which, for all its diversity and pluralism, remains a Union on a grand scale, or we are unable to do this and we take a step backwards into the outdated nationalism and disorder of the past. For me, this is one of the most worrying prospects, not only for Europe, but for humanity as a whole.

How to define the European identity today and in the future?
Ingmar Karlsson

The European identity is often described in a somewhat high-flown manner as having its foundations in antiquity ; free thought, individualism, humanism and democracy had their cradle in Athens and Rome. On the other hand, neither Greek nor Roman civilisation can be described as European. Both were Mediterranean cultures with centres of influence in Asia Minor, Africa and the Middle East. When Alexander the Great set out to conquer the civilised world of his time, Egypt, Persia and India, he had no idea that he was acting on behalf of Europe. Christianity, with its roots in Judaism, was also a Mediterranean, non-European religion. Byzantium was a Christian power which marked the limit to Roman claims of sovereignty, as did a large part of post-Reformation Europe. The result of the schism between Rome and Byzantium was the development of another culture in Russia and south-eastern Europe. Following the Reformation, a large part of continental Europe was preoccupied for several centuries with religious wars and rivalry between Protestants and Catholics. More recently, historians have played down our heritage from antiquity. 
European ideals are traced back instead to the Renaissance and to the concept of the individual as the smallest, inviolable element of society. The Enlightenment and the French Revolution contributed to the demand for freedom, equality, fraternity, democracy, self-determination, equal opportunities for all, clearly defined government powers, separation between the powers of church and state, freedom of the press and human rights. The ideas that are triumphant in Europe today are those of market economy and democracy. By definition, this also includes the USA, Canada, New Zealand and Australia as European powers. However, Europe does not only represent modernity and tolerance but religious persecution, not only democracy but fascist dictatorship as well - Hitler was the first to use the idea of a "European house" - as well as the collectivist ideals of Communism, colonialism and racism disguised in scientific terms. In other words, European identity cannot be defined on grounds of cultural heritage and history, and even less can it be used as the basis for European domestic and foreign policies. The explanation is as simple as it is obvious. Economic and political integration between European nation-states has not yet progressed so far that it is possible to speak of coinciding interests. It is possible that they have diminished somewhat with the collapse of communism and the disappearance of a common threat. Instead, there is a growing need for a national identity and sovereignty in proportion to the increased levelling of European politics and economy. The greater the sense that diversity is under threat and that standardisation is rising, the greater the antipathy to projects that promote integration. The European Community is already a reality as far as production and consumption are concerned, but there is popular opposition to a culturally standardised community. 
The more blurred and controversial the future of a common Europe appears to the common man, the more the nations will mobilise themselves against Europe. If efforts are not to become counterproductive, a balance must be struck between enthusiasm for the European project and awareness that European Union legitimacy will be in short supply in the foreseeable future. This view need not paralyse efforts towards integration, however. The phrase "an ever closer union between the peoples of Europe" could instead be useful in its general vagueness. There may also be some validity for European integration in Edmund Burke's wise words that political order cannot be created at a drawing board but has to emerge gradually. This, in turn, means that politicians and bureaucrats must concentrate on immediately essential and clear issues and on measures the consequences of which can be judged by citizens themselves. Every new European competence must therefore be explained in concrete terms in order to achieve acceptance. Consequently, careful thought should be given to which issues require a European solution and which cannot bear centralised interference, particularly because an incorrect decision on, say, agricultural policy can have far-reaching consequences and undermine the credibility of Union projects. A stable foundation of legitimacy for the European Union will only be achieved when Europeans perceive a European political identity. This does not imply that they would no longer feel themselves to be Swedes, Finns, Frenchmen or Portuguese, but that the sense of a common European destiny is added to these identities. Even after four decades of European integration, this development is still in its infancy. Nation-states evolved over a long period, often filled with conflict. They are ideological constructions and a national identity is ultimately a political standpoint. 
A prerequisite for a strong national identity is that citizens have a sense of loyalty to the state because it redistributes social resources and provides education, infrastructure, a legal system etc. The same prerequisites hold true for the creators of Europe as well. As in the process that led to the creation of European nation-states, the European Union will also be an elite project for the foreseeable future and the European identity an elite phenomenon. To be sure, the technocrats and bureaucrats in Brussels are a new European elite, but are they representatives of European culture or merely an international "civil service" which, with the passing of time, increasingly alienates itself from the people whose interests it is meant to serve ? Is there not a danger that institutional loyalty will become stronger than the "European awareness" which may spread among the elites of member nations ? The problem is aggravated when these people arouse negative stereotype reactions among citizens. Eurocrats are not regarded as the first among Europeans, but as overpaid bureaucrats interfering in matters that do not concern them. The creation of national symbols and myths and the rewriting of history were also part of the process by which European nations were formed. First came the state, followed by the formation of a national community within the territorial framework by means of gradual integration and cultural standardisation. The architects of nations emerging in the XIXth century used such means as national conscription, compulsory education and the supra-regional spread of the growing mass media to create contact between the centre and periphery and seemingly natural boundaries on the basis of geography, language, ethnicity or religion. Above all, the arrival of national educational systems and mass media contributed to the sense of belonging to a national community, expanding cultural horizons and drawing people away from provincial narrow-mindedness. 
Efforts to create a European identity

Brussels appears to have had this in mind when taking the decision in 1984 that the EC would improve contact with its citizens and, so to speak, create a European identity, centrally and from above. At a summit meeting in Fontainebleau, the European Council found it "absolutely essential that the Community fulfil the expectations of the European people and take measures to strengthen and promote the identity and image of the Community vis-à-vis its citizens and the rest of the world". The Adonnino Committee was set up for this purpose, with the task of starting a campaign on the theme of "A people's Europe". This work would be based on a quotation from the preamble to the Rome Treaty on "an ever closer union among the peoples of Europe", and on the Tindemans Report of 1975, which recommended that Europe must be close to its citizens and that a European Union could only become reality if people supported the idea. An outcome of the work of this committee was the decision that the EC should have its own flag. When the flag was raised for the first time at Berlaymont on 29 May 1986, the EC hymn - the "Ode to Joy" from the fourth movement of Beethoven's ninth symphony - was played for the first time. Thus, by means of a flag and a European hymn, the Union acquired the attributes of a nation-state. A European Day was also established. The choice fell on 9 May, the date on which Robert Schuman gave the speech in 1950 that resulted in the first community, the European Coal and Steel Community. Consequently, the Adonnino Committee appears to have assumed that a European identity could be created on the initiative of politicians and bureaucrats. In 1988, the European Council decided to introduce a European dimension into school subjects such as literature, history, civics, geography, languages and music. Legitimacy for future integration would be created by invoking a common history and cultural heritage. 
This has resulted in a book, "Europe - a history of its peoples", written by the French history professor Jean-Baptiste Duroselle, which, to quote the author, covers a period from 5,000 years ago to tomorrow's news. The European Union is thus attempting to create a European identity from above. A common European frame of reference is being created by means of a standardised set of symbols and myths. A European driving licence already exists, as does a European Union passport, although it took ten years to agree on its colour and appearance. The Maastricht Treaty introduced the new concept of a citizen of the Union, although his/her rights and obligations have still to be defined. These activities are incompatible with the often-recurring theme that European integration must be a natural process and not imposed from above. Every European people has its more or less genuine historical myths, experiences and view of history. There is no European equivalent to the Académie Française, the Bastille, the Escorial, La Scala, the Brandenburger Tor or the opening of Parliament at Westminster. There is no European Unknown Soldier. Jean Monnet rests at the Panthéon in Paris. The fame of Robert Schuman's resting place at Scy-Chazelles cannot compete with Colombey-les-Deux-Eglises, where General de Gaulle lies buried. Much of our common history has been experienced against, and not with, each other in the great European wars. The main task of the "Europe-makers" cannot therefore be to provide Europeans with a common identity originating in ancient or medieval times but to develop the political self-confidence and ability to act that are in line with the role of Europe in the XXIst century. This will not happen by reducing the European Union to a free trade zone in accordance with British ideas, or by elevating it into some kind of American-style United States of Europe which is imposed on people against their will. 
Basis for European patriotism and identity

Only long-term, patient growing together will provide the basis for a democratic Europe comprised of its citizens. For many decades, the EC was a practical community. We are only now en route towards a community of destiny and experience. If anything is to be learnt from European history it is that Europe as an entity can only be completed in agreement with, and not against, the will of the nation-states and what they consider to be their legitimate interests. At present, regionalism and nationalism undoubtedly have greater strength than pan-Europeanism. Perhaps Europe needs some "multi-national shocks" in the form of an aggressive Russia, a new Chernobyl catastrophe or a Gulf crisis to show our total dependency on the USA in conflicts that affect vital European interests. Other problems will also arise that call for joint action and which in due course will aid the establishment of an identity, such as for example :
- the necessity to use our common strength to meet the technological challenge from Japan and the USA and, in the not too distant future, the "new tigers" ;
- common action to overcome environmental problems and pressure from immigration, and to handle international organised crime.
A successful European policy in these and other areas could help in the development of a "constitutional European patriotism" in the same way that "loyalty to the Constitution" ("Verfassungspatriotismus") became a reality in the Federal Republic of Germany, replacing the nationalism that no German was able to feel after the terrors of the Hitler period. An absolute precondition for developing a common political culture and constitutional patriotism in the European Union is that its citizens are informed about and participate in the supra-national decision-making process. A European public opinion must emerge before there can be talk of a European citizenship. As stated above, the European identity has no historical reference. 
European trade unions do not exist at present, nor other interest groups nor, above all, trans-boundary European parties and a European general public. The Maastricht Treaty brought this deficiency into focus, negotiated as it was by experts in a European code incomprehensible to its citizens. As a result, the reputation of the European Union was further diminished. A prerequisite for a solid European identity is therefore the development of European parties, or at least a party network, and political debate on trans-boundary issues. When employer organisations and trade unions begin to meet at a European level to look after their members' common interests, we will have taken the first steps, because politics will have reached beyond the national level. The optimum we can achieve at the end of such a process would be a European "constitutional state" and a European Union citizenship that is felt to be genuine and not an artificial construction. The way is both difficult and long, however, and more likely to be slowed than speeded up by enlargement eastwards. It has proved difficult enough to bridge the cultural and linguistic differences between Catholics and Protestants, Latins, Germans, Anglo-Saxons and Scandinavians in Europe. The task of integrating the Baltic, Slav and Orthodox Europeans will be infinitely more difficult. The larger and more heterogeneous membership becomes, the greater the need to differentiate between various member states and a Europe moving at different speeds, where the political union, monetary union, common security and defence policy and inner market will not extend over the same geographical areas. A union of up to 30 members at varying stages of economic development can only function if it is organised along multiple tracks and at different levels. Efforts to create a Europe around the hard core of a monetary union with the Euro as a magnet could be counterproductive. 
Magnets work in two ways, either drawing particles towards them or pushing them away. There is a clear risk that a monetary union will not only have a magnetic effect but the reverse as well.

Cultural diversity - obstacle or prerequisite for a European identity ?

European political oratory often maintains that Europe can only be defined through its unique heritage of diversity and lack of conformity and that, paradoxically, its very diversity has been its unifying principle and strength. However, European linguistic diversity is probably the greatest obstacle standing in the way of the emergence of a European political identity and thus the European democratic project. While multilingual European democracies certainly exist, the prime example is Switzerland, which has elected to remain outside the European Union. A democracy is non-existent if most of its citizens cannot make themselves understood to each other. Rhetoric apart, not even leading European politicians are today able to socialise with each other without an interpreter, and very few can make themselves understood to a majority of European voters in their own language. No genuinely European newspaper exists, apart from elitist titles such as "The European". There is no European television programme apart from Eurosport, and most of its viewers watch matches between nations. In short, there is no public European debate, no European political discourse, because the political process is still tied to language. The question of language is basically one of democracy. Political discussion would be divided between A and B teams, with many excluded because of their lack of linguistic knowledge, if only English and French were the official European Union languages. At the same time, the problem of interpreting is becoming insurmountable. Over 40% of the European Union administrative budget is already spent on language services. Eleven languages make 110 combinations possible in the translator booths. 
The addition of another 10 Eastern and Central European languages brings this figure to 420, and to 462 if Maltese is added. Some form of functional differentiation will therefore be necessary, making some languages more equal than others, although this would have a negative effect on European public opinion in the small member nations. At present, an average 66% of European Union citizens are monolingual while 10% speak at least two foreign languages. Ireland is at one extreme, with 80% and 3% respectively, while only 1% of the population in Luxembourg is monolingual and no less than 80% speak at least two foreign languages. In order to function as Europeans and safeguard our interests, we Swedes must become tolerably fluent in at least one other major European language apart from English. Swedish remains the basis of our cultural heritage and domestic political discussions, but in order to play a constructive part in Europe we must develop into citizens of Luxembourg as far as language is concerned. Consequently, Europe is neither a communication-based nor an experience-based community, to use German expressions. Both factors are indispensable in the development of a collective political identity. This is created by sharing experience, myths and memories, often in contradiction to those held in other collective identities. They are, moreover, often strengthened by comparison with those that are distinctly different. Not just Robert Schuman, Alcide De Gasperi, Jean Monnet and Konrad Adenauer should be counted among the fathers of European integration, but Josef Stalin as well. The Cold War enabled a sense of unity to be mobilised among Western Europeans, but who can play the role of opposition now in order to provide Europeans with a common identity ? The USA is part of the same circle of culture. Japan is of course a homogeneous and different society but is too far away and does not constitute any political or military threat. 
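The language-pair arithmetic above is simply the count of ordered source-target pairs: with n official languages there are n(n-1) interpreting directions. A minimal sketch (the function name is my own):

```python
def interpreting_pairs(n: int) -> int:
    """Number of ordered source-target pairs among n official languages."""
    return n * (n - 1)

# 11 languages give 110 directions; the 10 new Eastern and Central
# European languages bring the total to 21 languages and 420 directions,
# and Maltese as a 22nd language raises it to 462.
print(interpreting_pairs(11))  # 110
print(interpreting_pairs(21))  # 420
print(interpreting_pairs(22))  # 462
```

The quadratic growth of n(n-1) is why the text calls the interpreting problem insurmountable: each added language adds roughly 2n new booth combinations.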
And its economic strength is directed primarily at the USA. There is an inherent danger that Europe will choose to define itself vis-à-vis its surrounding third-world neighbours and that the Mediterranean will become the moat around the European fort. The creation of a pan-European identity risks being accompanied by a cultural exclusion mechanism. The search for a European identity could easily take the form of demarcation against "the others", a policy which leads to a racial cul-de-sac while at the same time the mixing of races continues to rise in Europe. A European identity must therefore be distinctive and all-embracing, differentiate and assimilate at the same time. It is a question of integrating the nations of Europe, with their deeply rooted national and, often, regional identities, and of persuading citizens to feel part of a supra-national community and identity. Can half a continent with 370 million citizens and 11 official languages really be provided with a democratic constitution ? In the ideal scenario for the emergence of a European political union, the European Parliament must first be "de-nationalised", and this assumes a European party system. Secondly, it must have the classic budgetary and legislative powers. The Council of Ministers must be turned into a second chamber and the Commission should be led by a "head of government" appointed by Parliament. National parliaments would consequently lose their functions. They could be transformed into federal parliaments in smaller states, as in Germany, and would thus have the same position vis-à-vis Brussels that they have today. It is easier said than done to abolish the democratic deficit by giving greater powers to the parliament in Strasbourg, because the dilemma of representation versus effectiveness would immediately come to a head. 
If every parliamentarian represented about 25,000 citizens, as is the case in Sweden, the gathering at Strasbourg, with about 15 member nations, would have to be increased to 15,000. If, in the name of effectiveness, the number were reduced to 500, with constituencies of more than a million citizens and everyone guaranteed an equal European vote, Luxembourg would not be represented and Sweden would have a maximum of 13 representatives in the European Parliament. Such a parliament might be capable of functioning but could not by any means claim to represent a European electorate. The democratic deficit would continue. Europe as an entity can only be achieved with the help of, and not against, the nations and their special characteristics. European integration will not be completed because of some natural necessity but only if enough political energy is brought to bear. The future of the European Union rests therefore on the common interests of member states and not on the political will of a European people, for the simple reason that such a thing does not exist. Regional and national identities will grow in importance in a world that is becoming ever more difficult to oversee and ever more rapidly changing. Citizens will be living more and more in a state of tension between several loyalties - their home district, state, nation, Europe and the international community - increasingly required to think globally but act locally. New ancient regimes and new regions are emerging everywhere in Europe. By actively supporting the process of regionalisation, Brussels and individual capital cities can show that the European Union is taking its institutions closer to its citizens and thereby creating greater scope for cultural and linguistic diversity than the nation-states have been capable of doing. By contributing to a new vision - the Europe of diversity and regional government based on subsidiarity - the idea of Europe can be made more comprehensible and attractive. 
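The representation-versus-effectiveness dilemma described above comes down to simple division. A sketch, taking the text's round figure of 370 million Union citizens (function and variable names are my own):

```python
def seats_needed(population: int, citizens_per_seat: int) -> int:
    """Seats required if each member represents a fixed number of citizens."""
    return round(population / citizens_per_seat)

EU_POPULATION = 370_000_000  # the text's figure for the Union of the mid-1990s

# At the Swedish ratio of roughly one parliamentarian per 25,000 citizens,
# a European Parliament on the same basis would need about 15,000 seats.
print(seats_needed(EU_POPULATION, 25_000))  # 14800
```

Capping the chamber at a workable 500 seats instead forces each seat to represent hundreds of thousands of citizens, which is why the smallest states drop below the threshold of representation.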
In this way, regional identity can strengthen the emergent European identity. Now that regions are increasingly turning to the European Union in their fight for resources for regional development and to attract investment, Brussels and the European Union will be seen as the friends of the regions rather than their national capitals. The nation-state is thus being nibbled at from two directions. At the same time, we will experience a renaissance for nation-states and regions and their gradual merger in a transnational community. Those who support the region and the nation need not necessarily reject Europe, but the traditional nation-state, with its community-based traditions, identity and loyalty, will remain indispensable as a strength and source of political stability. Nation-states are therefore essential in order to legitimise a new European order, but structural asymmetry, conflicting interests and unexpected courses of development will lead to relations between the nation-state and European integration that are difficult to manage and oversee. Europe will therefore continue even in the future to be squeezed between what the German philosopher Karl Jaspers called "Balkan and Helvetian tendencies", i.e., between Yugoslav and Swiss development models. Nations are not given once and for all, but are created. They are what Benedict Anderson called "imagined communities". The idea of a European community cannot arise from the German concept of "Blut und Boden", or from the idea of a European "Volk" or a European "cultural nation". Nor can a European identity be created through central directives from Brussels or member nations' capital cities, or by being conjured forth at seminars and conferences, but rather through the citizens of individual European states knowing that they personally have something to gain from integration and thereby saying yes to the European Union in their daily referendum.
As we have already experienced, a forced unifying process produces counter-reactions in all the member countries. A European identity is possible only where there is a community of interests among the citizens. If this is missing or not felt to be sufficiently strong, the European Union will have a democratic deficit irrespective of what new competence is given to the European Parliament. The single market will bring about trans-boundary mobility and thereby, albeit slowly, contribute to the emergence of a European identity, but it will be one of many, relativised by different national and regional identities (such as, for example, Benelux, Ibero-Europe, the Nordic countries). Immigration will strengthen the multicultural component that is indispensable for a new sense of identity. At the same time, it will nourish social tensions and racist and nationalist currents, but can also lead to political mobilisation and the insight that these problems can only be solved at European level. A European 'supra-nationality' will first be accepted where there is no hierarchy of national, regional and supranational identities, but where every individual knows them as self-evident and as part of daily life. A policy for preserving diversity will thus be a precondition for creating a European identity that neither should nor would become a replacement for a national identity, but which can create support and strength for political institutions that are neither national nor the framework of a European superstate. Questions of cultural policy, education and a historically deep-rooted social system and values must therefore remain the concern of nation-states. It is thus a case of rendering unto the nation-state that which belongs to it and to the European Union that which is the European Union's: a security and foreign policy structure, the single market, and a common crime, asylum and immigration policy.
The hitherto clear links between state and nation will thus grow looser. European integration from this point of view will not mean that a new superstate appears but that power is spread out. Cultural identities will remain rooted at national level but will also spread further down to ever more distinctive regional identities. We will have neither a new European superstate nor sovereign nation-states. Nations will not disappear, but we will have nations with less state and national cultures with softer outer casings. Relations between European and national identities could take the shape of a foreign and security policy in the wide sense as the foundation of a common European political identity, a "nation" to which one feels a sense of political belonging without the need to feel part of a European "Volk" or a European "cultural nation". The German concept of a nation would endure at national level, although in its original form as conceived by Johann Gottfried von Herder, in which a nation need not necessarily express itself as a state. By standing on secure and solid cultural ground, every people, with its own distinctive character and cultural achievements, can contribute to an international community. Cultural nations will thus become divorced from territory. People will have a sense of belonging to a special area and its cultural and political history, but this area need not necessarily be linked to a nation-state with defined territorial boundaries. A European political identity could emerge in this way while leaving cultural, national or regional identities intact, and European diversity would not only remain in place but grow as well. The democratic deficit can never be abolished unless this kind of development takes place, nor would the project of a European Union be realised.
European identity - A perspective from a Norwegian European, or a European Norwegian

Truls Frogner

Norway is a part of Europe, but not a member of the European Union. We are integrated in many ways, and for practical and economic purposes (the EEA Treaty) we are close to membership. The road to full and political membership is to be found in our visions and roots, both part of our European identity. In this respect, the Norwegian challenge is similar to that of all other Europeans. Since Europe has many countries on its fringe, the approach towards European identity could start from one of them. Even the opponents in Norway said before the referendum in 1994: "YES to Europe, but no to the Union"... Membership of the European Union can never be more than the means to achieve other and higher goals. Integration as an instrument of cooperation is necessary, yet not sufficient. Institutions should reflect the dreams and needs of the population, and transform them into practical solutions of which they can approve. The forthcoming "Citizens First" campaign may succeed in reminding the people of Europe of what has already been achieved during the four decades since the Rome Treaty was signed. Still, it seems to many people that politics on the European level is something different and remote from national politics at home. And worse, sometimes national voices blame "Brussels" for unpopular measures, without giving credit for the positive impact of European decisions. Does the European Union suffer from a scapegoat syndrome? There are at least two answers. It is necessary to normalise European politics. To work for European solutions is part of a general struggle for values and visions at the individual, local, regional and global level. In this perspective Europe is not something special, but the bridge between near and far. Europe is the gate to the big unknown world and the port when coming back.
The second answer is to develop a consciousness of our own European identity and the common ground of European values and history. The point is not to cultivate something European that is different from the national, local or global, but to compose certain ideas, sentiments and values as a platform, and an inspiration, for taking part in facing common challenges. The Norwegian "naysayers" cleverly connected their opposition to the European Union with a combat for positive ideals. But the supporters also fight for higher values, a better society and sustainable development, though they do not yet communicate this message with the same one-sided self-confidence and conviction. Maybe because real Europeans have ambivalent minds? Belonging to the European community is often said to be the major reason why supporters are in favour of membership. It has to do with a cultural and geographical identity, also shared by many Norwegians. (Remember that rejection may be an indirect affirmation, as the 5-year-old boy asked his mother: "Do you think God knows we don't believe in him?") Identity is not free from contradictions. A lot of people are fond of their village or party without accepting all its aspects. The alternative to a poor marriage is not necessarily divorce, but a better marriage. European identity does not exclude criticising the European Union. The next question is always: what is your proposal or alternative? The dual critical and constructive approach represents the dialectical dynamism of European history - compromise after crises. Safety is related to belonging. It is a positive feeling of security in oneself and with others, in contrast to lacking individual faith and confidence in a greater community. The security of NATO, which almost all Norwegians rely on, is an example of a historical acknowledgement that no nation can or should stand alone to protect peace and prevent war.
Only binding international cooperation can offer the security of being treated equally in accordance with common rules and of avoiding occasional infringements. Security in Europe is an idea which pervades our approach to political, economic, social, cultural, environmental and other issues. Solidarity is, according to André Malraux, an intelligent form of egoism. In a European context this means it is in our own interest that outside countries, groups, regions etc. should be helped to develop their human, social and economic resources. We should have learned that too deep differences create instability, with the potential for upheaval, conflict and war. Solidarity in Europe is about taking care of each other across national borders, demonstrated in practice by supranational measures for cohesion. European solidarity includes the rest of the world. The next debate in Norway may illustrate a shift from the last campaign. A possible 'yes' to the European Union next time cannot mean better prospects of economic benefit for a prosperous nation in a Europe enlarged with poorer countries. It would demand an obligation and commitment to higher values, a safer society and sustainable development in a broad Europe. Then, as part of European identity, we find the classical political values. Democracy was invented and developed in Europe, and further developed in America, where the most democratic constitution of its time was established in 1776. Thereafter, new democratic reforms emerged and the idea of government by and for the people spread throughout the world and gradually, or after revolutions, unfolded in a variety of forms within the framework of the nation-state. But democracy is still not fulfilled anywhere, due to the fact that the idea of democracy is a relative concept, a complex concept and a political concept. The relative concept of democracy implies that it is related to something outside the reach of voters and their representatives.
Those who oppose federal, supranational democratic initiatives have no proper answers to the challenges of transnational companies, international capital movements, cross-border pollution and abuses of national sovereignty, for instance nuclear tests, suppression of ethnic groups and aggression against neighbouring states. From this we realise that identity is closer to interdependence than to self-determination. Identity is more a social than an individual phenomenon, but both exist in Europe, where the (im)balance between collective and personal responsibility has been a driving force in society. National independence does not have the same importance and impact as before. Now and in the future, nation-states have to find democratic ways of cooperation which preserve the positive dimensions of independence and limit its negative elements. Paradoxically, the notion of supranationality was accepted even by major parts of the opposition movement in Norway, in spite of their exaggerated belief in national self-determination. They approved of supranational regulation of national independence linked to peace, defence and security matters in the UN and NATO, while at the same time refusing supranational regulation of national independence with almost the same countries in the European Union on civilian and political issues! The Union is not in opposition to the nation. Supranationality is the striving to unfold the potential of nations which they are unable to fulfil within their own borders. A strong Union cannot depend on weak nations. A strong Union strengthens its parts. Identity is not only unity in itself, but also a unity of contradictions. A political Union is how contradictions are bridged, and the arena where different forces can do so. Identity and democracy are both complex concepts, and consist of an inner power balance between different components. Both include dynamic processes. Neither identity nor democracy can stand still.
It is a question of living or dying. For a European it is important that each political institution has limited power, yet is capable of achieving political goals, while simultaneously securing an appropriate balance among representatives of Member States and the people of Europe. Democratic cooperation among many countries, some hundred parties and 280 million voters is more complicated. However, democracy must not depend only on small-scale communities in order to function. Large-scale democracy becomes increasingly important to avoid close political bodies becoming local theatre. On the other hand, distant democracy presupposes information and dialogue, transparency and control mechanisms, in order to avoid the danger of its living a life of its own. Nothing is more fitted to stimulate attention to a distant political structure than conflicts stemming from disagreement on how to solve the real problems of today and the future. From this fact follows a need to abolish unanimity and expand qualified majority voting (QMV). This will not contradict the need for consensus and respect for vital national interests. The European Union has, in place of final goals, some common visions of peace, prosperity, social cohesion and partnership with nature. The Union is nurtured by the struggle between and among the various interest groups working for their visions. Europe is indivisibly connected to its cultural, Christian, humanistic, scientific, social and professional values - the identity of Europe in our hearts and minds. Europe has a magnificent heritage of art and science, architecture and philosophy, and an abundance of ideas and religious schools within a system of tolerance and legal protection, which make our continent attractive, exciting and challenging.
Without dismissing the tragedies and catastrophes Europe has brought upon herself and other continents, it should be permitted to remember that its cultural and political ideas have conquered, and will continue to overcome, prejudice, xenophobia, racism and other discriminating and suppressing powers. And without degrading anyone, it is also fair to note that the Nordic and European model of cooperation and conflict-solving in the labour market is advanced from a global point of view. Of course, there will always be nuances between various interest groups concerning the balance between politics and market, labour and capital, public and private sector, tradition and modernisation, men and women etc. But nobody should claim their interests to be superior to those of others or seek to suppress fundamental democratic and human rights. A European House should be built on pluralism and equality, such as the European wants for him- or herself. And as we change and enlarge this house, we strive for the good life today and tomorrow. European identity must be found in something we already know. Identity is recognition. To be a European is to come home to my own house.

European identity - an anthropological approach

Maryon McDonald

Questions about European identity and about the future symbolic and practical content of 'Europe' are questions about the meaning of Europe: what does Europe mean, and what could it mean, to those who are its citizens? Questions or worries of this kind were not paramount when the EEC began. Between the late 1960s and the present day, however, questions of 'legitimacy' and 'identity' have come increasingly to the fore.

Legitimacy

There have been two principal periods during which questions of legitimacy have been raised.
First of all, concerns were voiced in the late 1960s - a period when it was first noticed that the original, self-evident legitimacy of the Community, defined against a past of war, was losing relevance for a new generation. Amidst demographic changes, increased studentification and the re-invention of the category of 'youth', a new 'generation' was self-consciously establishing itself in contra-distinction to its parents. Old certainties such as modernisation, progress, reason and positivism, many of which had informed the EEC project, were put in question. This was a time when cultural diversity was invented, a time of civil rights marches in the US, a time of decolonisation and counter-cultures, a time when the alternative worlds of regionalism, particularism and relativism appealed. The world was de-naturalised, and the 'West' was re-invented as a category that the young might affect to despise. For this new generation, 'Europe', far from being the triumph of civilisation over irrationality, tyranny and violence, easily slipped into synonymy with this new 'West' to become another metaphor for post-imperial castigation and self-castigation, or one from which the authenticity and difference of alternative realities might be measured. The response of the EEC at this time was to try to draw young people, against the prevailing current, back into the 'European' fold through youth programmes, largely exchange schemes, and then much later through the active 'conscientisation' programmes of the 'People's Europe' project. The structural funds also developed, partly in response to the economism of the EEC. The second period, which launched new worries about legitimacy, has come about since the launch of the Internal Market. This unprecedented flurry of perceived 'interference' from Brussels (however sound the original intention), with more directives in a shorter time than ever before, was bolstered and coloured by two other sets of events.
On the one hand, the Berlin Wall fell, and many old certainties fell afresh with it. On the other hand, the Maastricht Treaty was negotiated and seemed to threaten national identities in a context in which, with the Internal Market, Brussels 'interference' already appeared as established fact. Beyond this, nationalism, which had once seemed morally right in the years after the Second World War, was now widely perceived as a moral and political threat. Not surprisingly, referenda results sent any certainties still surviving in Brussels diving for cover.

Identity

The 'People's Europe' project of the 1980s enlisted the old package of nineteenth-century nationalism to try to re-create Europe and European identity - to make people feel European. But this old package is heavy with problems. Firstly, the package that nationalism used to invent nations, a package of language-culture-history-people-territory, is not available in all its elements to Europe. Europe therefore cannot easily construct itself, or be imagined, through this package and be convincing. It will also seem to be competing with nation-states. Secondly, the time span for the construction of a European identity has been relatively short (mere decades where some nations have had two hundred years) and the construction process highly visible. Where the nation may feel 'natural', Europe is inevitably going to feel 'artificial'. And for those from national backgrounds which lack a historiography of self-conscious construction of the nation (Britain and Denmark, for example), some aspects of the self-conscious construction of Europe easily appear to be little more than propaganda. Thirdly, the old package for identity construction was born of certainties that no longer pertain in a world of diversity and relativism. Europe is now often more easily identified with a capacity to question apparent certainties than with the old certainties themselves.
Fourthly, the old package assumes identity to be monolithic and culture to be a homogeneous, clearly bounded entity. However, identity is contextual and relational - and self and other, or sameness and difference, are constructed relationally in the context of daily imaginings and encounters. And fifthly, it is easy to lapse back into the full racial force of this old package - with the boundaries of Europe unrelativised and read as the boundaries of ethnic flesh. The freedom of movement of 'persons' is then rightly confronted with more uneasy reflection on the definition of a 'person'.

History

History was an important element in the nationalism package, and many histories of Europe have been encouraged as part of the 'People's Europe' project - apparently in the hope of appropriating the tool of history for the creation of European identity. However, we might say that there could, within current models of historiography, be two main ways of writing the history of Europe. Firstly, there would be the old, historicist model, in which Europe might be assumed to exist from Ancient Greece, say, up to the contemporary European Union. This is the historiography that nationalism used and that the histories of Europe now tend to use also. All the ethnological bric-a-brac of the classical world becomes virtual flag-wavers before the Berlaymont, and contemporary ideals are read back into the classical world and onwards to the present day. This historicism, which worked for nationalism, is the style of the vast majority of officially sanctioned 'History of Europe' texts (whether sanctioned by the Council of Europe or by EC funding). In this history, a continuous litany of features deemed to be inherent to Europe is paraded: this would include Christianity, democracy, citizens' rights and the rule of law, for example.
This litany was especially important when it was first constructed, after the Second World War and then during the Cold War, in opposition to the East, but its appeal is not always self-evident now. The second kind of history of Europe would involve a history of the category of Europe. If we were to trace the history of the category of 'Europe' from, say, Ancient Greece to the present, we would find 'Europe' travelling through different conceptual systems, finding new meanings, becoming a different reality as it did so. The geographical boundaries expand and contract, the salient conceptual relations change, the moral and political frontiers and content shift considerably, and Europe is invented and re-invented accordingly. This is the kind of historiography that postmodernism would readily encourage, and it is one in which - unlike in the historiography of nationalism - the simple clarity of being on the right side of history is ideally and deliberately lacking. Moreover, this historiography would not allow any simple continuity to be read back into the past - whether of territory, culture or ethnic flesh, for example. The advent of postmodernism does not mean that we have to throw out the old history altogether. We can put certain key aspects of these two kinds of history together in a productive way. Elements from the old historiography which gave Europe its moral and political content - such as democracy, citizens' rights and the rule of law, for instance - can become important elements in a new understanding of both 'Europe' and identity as relative or relational. Without lapsing into any old historicism, such historical elements - or any one of them - can simply be drawn on or cited as the occasion or context demands. In other words, history becomes self-consciously part of the present, and the history of Europe is no longer historicist litany but part of our critical self-awareness.
History is then an awareness of the changing and discontinuous contexts in which 'Europe' has been created in the past, and offers elements in the present that we might now choose to assume relationally in order to assert things 'European'.

Europe in action

If identity is constructed relationally, the clearest identity is in conceptual opposition. You know most clearly who you are through what you are not. It is relatively easy to feel 'European' when visiting Japan, for example. External relations might seem the obvious area in which a European identity can be constructed and expressed. However, this is also an area in which national identities are deeply embedded. Nation-states have in many ways been defined by their external relations, and Europe does not have the now dubious advantages of war and empire, or of clear external threat, to help to define itself. It is perhaps readily understood that international linking systems help to avoid old fault-lines reappearing, but steps towards some notion of European representation in this area, or of more fundamental institutional reform, have to carry with them the same critical self-awareness that there is no better way to re-create and re-invigorate national identities and differences internally than to be seen to impose decisions from outside ('Brussels'). For most purposes, we are now in a Europe that can, in an important sense, be more relaxed about its identity. The stuff of a European identity is available in the policies and issues which the EU (whether all of it or part of it) creates: in environmental questions, in equal opportunities, in the market (where it most obviously both follows and creates globalisation), in the social arena, in Trans-European networks, in food and health, and so on. Many of these areas have been re-thought (equal opportunities is no longer about the 'women's rights' of the 1970s, for instance, but about issues such as gender and the new family, etc.)
and others still await re-juggling and rethinking. Any one area of policy can, for better or worse, contextually enlist people to a 'European' self-consciousness (as we have seen recently in the BSE scare, with different sections of the British population suddenly calling for European compensation and solidarity). It is in its policies, in practice, that European identification comes alive. No one is 'European' all the time, just as no one is Spanish, Portuguese, British or French, and so on, all the time. There are moments when being a father, being a businesswoman, being a tennis-player, or being from Coimbra (etc.) are the salient identifications, and these identities would normally occupy much of one's waking life. The overarching ambitions of an older European-identity-construction-kit do not take this into account. Just as the certainties once inherent in the symbolism and narratives of large political parties are having to change and even give way to single-issue politics, so a post-federal Europe has to look for recruitment through the contexts of issues and practice. So, Europe exists. 'Europe' and 'European' exist as categories, and people are contextually recruited into them, and there have been many successes of identification. Europe exists in action - in the contextual identification of people with specific policy-areas. Bargaining and compromise are acknowledged as the means to achieve desired policies internally, and the achievement of desired policies makes people feel better about being European, and more ready to compromise elsewhere. And so on. Europe, for many, is not a project, and the old narratives can be alienating.
The future symbolic content of European identity resides in practice and action - requiring carefully re-thought policies, and the very European capacities for questioning and reflection, for self-criticism, and now the acceptance - without any naive federal model of a Europe des ethnies and without cultural fundamentalism - of diversity both at home and elsewhere.

European identity and citizenship

Massimo La Torre

I do not intend here to deal, even tangentially, with the questions of God, the meaning of human life, transcendence, or universalism. What we are concerned with - if I am not wrong - is not "identity" in metaphysical terms, nor in anthropological or merely cultural terms. Nor - I must confess - do I think that Europe without further qualification is a useful category for political thought. Besides, the question "what is European identity?" is also a trifle too broad and vague to find an appropriate answer. I assume that what interests and intrigues us is that identity which is relevant and needed for the construction of a political community at the European level. The identity to which I shall refer will therefore be that which derives from, or is equivalent to, membership of a polity. It has been said that an identity can be built either from above or from below. This is, I think, quite correct. But I have some problems if one starts identifying top-down procedures with any legal measure whatsoever, with law, and democratic bottom-up mechanisms with historical processes. Now, I think that the opposite is often the case, i.e. that history has an authoritarian character and law a libertarian one. History, if seen as a collective process, something driven by an intrinsic, immanent "telos" of human events, excludes the reflective intervention of individuals in the direction of their social life. Destiny, even if shared in a community, is never democratic. On the other hand, law is not necessarily a sum of authoritarian or repressive provisions.
Law is conventional, whatever legal doctrine says or affirms about it; it is made by reflective and more or less explicit processes: as a matter of fact, a custom becomes a legal practice only when it is contested and is reaffirmed either by collective majoritarian behaviour or by judicial decisions. Law should be contestable in order to direct human conduct. But if the law is made, the real question will be whether it is made by the one, the few, or the many. We are then called to choose the system of law we prefer. If we are liberal-minded, we will certainly have to opt for rule by the many, so that the law will no longer be authoritarian, that is, elitist, the artefact of the one or of the few, but will become the solid pillar of a democratic polity. I therefore dare to suggest that there is no political identity from below without democratic law. Once the question of identity is reformulated in terms of political identity, that is, in terms of membership of a European polity, the main problem for us will be that of a European citizenship. In fact, it is citizenship that marks political belonging, membership of a polity.

European citizenship and democracy

I would like to argue for a strong concept of European citizenship. This is fully justified from an internal legal point of view, since Article B of the Treaty of Maastricht holds as one of the main purposes of the Union "to strengthen the protection of the rights and interests of the nationals of its Member States through the introduction of a citizenship of the Union". We may also recall a decision of the European Court of Justice in Commission v. Council (30 May 1989), confirming the full legality of the Erasmus Programme, which is justified with reference to the "objectifs généraux de la Communauté, tels que la réalisation d'une Europe des citoyens" (the general objectives of the Community, such as the achievement of a citizens' Europe).
A strong concept of European citizenship, characterised by a wide and rich range of rights ascribed through it, independently of national citizenships, could powerfully contribute to solving - at least partly, but nevertheless effectively - the democratic deficiencies of the European Union. A democracy is not only a representative or parliamentary political regime, but also and above all an association of equal citizens who are defined as such directly, that is, without reference to intermediate social and political groups; democracy is constituted not only, or even mainly, by the majority rule applied to political decisions, but eminently by the existence of a public domain of free discussion. But in order to have this, some requirements have to be satisfied: a feeling and a sphere of common concern, first of all. One could and should decide on matters which can affect more or less directly one's own life. Autonomy, which is an ideal principle presupposed by democracy, and expanded by it into a collective practice, makes sense only if it is exercised within the individual's scope of interests and action. Beyond this scope there is no right of autonomy; worse, autonomy, as individual decision and action, can be transformed into its opposite: heteronomy, the disruption of others' private sphere and life plans. This holds a fortiori for an extension of the principle to collective entities, that is, for democracy. A democratic decision cannot go beyond the area of interests which are at stake within a specific scope of (collective) action, that is, beyond the area constituted by those individuals who are the holders of the right of democratic decision. Now, citizenship as membership in a body politic, even if conceived only in formal legal terms, can contribute to creating the idea of a common concern, the concern which is common to persons who bear the same legal and political status.
To have a public sphere of discussion another requisite must be fulfilled: that of having procedures which allow a fair discussion. But in order to have a fair public discussion we need to assume that people entering into that discussion share at least a few "thin" principles: contra negantem principia non est disputandum.34 We need to assume that people reciprocally recognise the autonomy (the possibility of rational and independent action, in this case discussion itself) and therefore the sincerity and dignity of their opponents or fellow discussants. We should thus assume that in a public discussion discussants have equal rights.35 Citizenship (and European citizenship is no exception) is just the sum of rights which allow subjects to take part in a political deliberation and to discuss in order to arrive at a reasonable and well-pondered decision. This can mean that, in order to promote democratic progress in a society, we can first create statuses granting common and equal rights among its members, and then proceed to find a viable institutional device to render visible and effective the public discourse which has started with the ascription of those statuses. Given the present political and institutional situation in the European Union, we can therefore plausibly believe that we can have European democratic citizens even before having, at the supranational level, institutions endowed with effective powers of political direction governed by democratic procedures.

34 See A. SCHOPENHAUER, Die Kunst, Recht zu behalten. In achtunddreißig Kunstgriffen dargestellt, ed. by F. Volpi, Insel, Frankfurt am Main 1995, p. 38.
35 See R. ALEXY, Theorie der juristischen Argumentation, 2nd ed., Suhrkamp, Frankfurt am Main 1991, pp. 238 ff.
If we have a European citizenship as an independent status granting rights such as political rights (the rights to vote and to be elected) at both the supranational and the infranational level (see articles 8b and 8c of the Treaty on European Union), or the right of an alien not to be discriminated against vis-à-vis nationals (see article 6 of the Maastricht Treaty), or freedom of movement to and through any member State and freedom of residence there (see article 8a), then, even if the European Parliament is not a fully developed democratic institution (because of the limited range of its current powers), we shall have a society of democratic citizens which will represent a better condition for developing democratic decision-making at the supranational level. Of course, to this purpose the rights which we have mentioned should be fully deployed in all their potentiality, breaking the limitations which articles 8a-8c still impose upon them. When democratic institutions are deficient, democracy can also be developed through democratic citizens. In particular, in the European Union, whose member States are in fact all democratic regimes, what is fundamental is not to maintain a nationalist or ethnic view of democracy. We need a free sphere of public concern and the sense of participating in a fair cooperative scheme. A stronger and richer concept of European citizenship can be extremely helpful in this direction.

Citizenship and 'demos'

"Es gibt keine Demokratie ohne Demos" - says Josef Isensee, a well-known German constitutional lawyer36 - whereby he means that democracy is built upon a collective subject pre-existing it, endowed with a proper, intense life, that is, a people seen as a homogeneous cultural and ethnic body.
Moving from this premise the German lawyer then draws the conclusion that there is no possible basis of legitimisation for a European democracy (that is, for the European Union), since there is no European "demos", that is, no European folk. It may also be remembered that the same author successfully fought against the introduction, in the Freie und Hansestadt Hamburg, of an aliens' right to vote in elections to district councils endowed with indeed rather modest competencies, with the argument that State officials and representative bodies (at whatever level and of whatever size) enjoy democratic legitimisation if and only if they receive their mandate from the "People" in its entirety, that is, from the "German People". The German Federal Constitutional Court unfortunately accepted Isensee's argument,37 thus reformulating the concept of "people" mentioned in article 20 of the Grundgesetz ("Alle Staatsgewalt geht vom Volke aus") into that of the German people38 and twisting this into an ethnically defined community of fate39 which has constitutional relevance even before the drafting of the constitution itself.

36 J. ISENSEE, Europa - die politische Erfindung eines Erdteils, in Europa als politische Idee und als rechtliche Form, ed. by J. Isensee, Duncker & Humblot, Berlin 1993, p. 133. For a more sophisticated but in its core quite similar view, cf. D. GRIMM, Does Europe Need a Constitution?, in "European Law Journal", 1995, p. 295: "Here, then, is the biggest obstacle to Europeanisation of the political substructure, on which the functioning of a democratic system and the performance of a parliament depends: language". According to Grimm the European Parliament, even reformed and fully empowered as a legislative assembly, could not be considered a European popular representative body, "since there is as yet no European people" (ibid., p. 293).
Democracy - says the German Court - should not be seen as "freie Selbstbestimmung aller", the free self-determination of all (as was formerly held by the Court itself),40 but as a power which derives from a unique and unitary entity whose individual members as such have no constitutional right of participation in collective political decisions; they can exercise democratic self-determination only jointly, only if considered as an indivisible group.41 The idea that democracy means the right of the people (in the plural) concerned by the laws to contribute to their deliberation and enactment is dismissed.42 Now, this is indeed a peculiar concept of democracy. It is based on a romantic idea of "people" or "nation", which represented a reaction against the originally liberal concept of democracy, based on two pillars: individuality and public reason.43 In the romantic protest against liberal democracy, the very concept of political representation is deeply modified: representation is no longer the expression of the concrete will of concrete individuals, but rather the expression of the existence of a community. In this second acceptation of representation, connected with a people idealised as a compact, tight and uniform ethnic entity, which was cherished by "democrats" such as Carl Schmitt,44 even a dictator can "represent" a community, and in the end even a dictatorship may legitimately be considered a... democracy. If, to have democracy, what is required is on the one side a folk and on the other a special existential (ethnic) link between the folk and its leaders (this being the proper Repräsentation of the folk), then it is not at all contradictory to have an authoritarian or even a totalitarian leader and nevertheless "democracy".45

37 "Das Volk, welches das Grundgesetz als Legitimations- und Kreationssubjekt der verfaßten Staatlichkeit bestimme, sei das deutsche Volk" (BVerfGE 83, 60 [65]).
38 See also BVerfGE 83, 37: "Das Staatsvolk, von dem die Staatsgewalt in der Bundesrepublik Deutschland ausgeht, wird nach dem Grundgesetz von den Deutschen, also den deutschen Staatsangehörigen und den ihnen nach Art. 116 Abs. 1 GG gleichgestellten Personen, gebildet".
39 Cf. BVerfGE 83, 37 [40]: "Das Bild des Staatsvolkes, das dem Staatsangehörigkeitsrecht zugrunde liege, sei die politische Schicksalsgemeinschaft, in welche die einzelnen Bürger eingebunden seien. Ihre Solidarhaftung und ihre Verstrickung in das Schicksal ihres Heimatstaates, der sie nicht entrinnen könnten, seien auch Rechtfertigung dafür, das Wahlrecht den Staatsangehörigen vorzubehalten" (italics mine).
40 See, for instance, BVerfGE 44, 125 [142].
41 "Das demokratische Prinzip läßt es nicht beliebig zu, anstelle des Gesamtstaatsvolkes jeweils einer durch örtlichen Bezug verbundenen, gesetzlich gebildeten kleineren Gesamtheit von Staatsbürgern Legitimationskraft zuzuerkennen" (BVerfGE 83, 60).
42 See BVerfGE 83, 60 [72]. See also BVerfGE 83, 37 [42].
43 Cf. D. GAUTHIER, Public Reason, in "Social Philosophy and Policy", 1995, pp. 19 ff.
44 See C. SCHMITT, Verfassungslehre, 3rd ed., Duncker & Humblot, Berlin 1957, p. 209: "Repräsentation ist kein normativer Vorgang, kein Verfahren und keine Prozedur, sondern etwas Existentielles. Repräsentation heißt, ein unsichtbares Sein durch ein öffentlich anwesendes Sein sichtbar machen und vergegenwärtigen" (emphasis in original).
45 "According to this view, democracy and dictatorship are not essentially antagonistic; rather, dictatorship is a kind of democracy if the dictator successfully claims to incarnate the identity of people" (U. K. PREUSS, Constitutional Powermaking for the New Polity: Some Deliberations on the Relations Between Constituent Power and the Constitution, in Constitutionalism, Identity, Difference,

Indeed, in a democracy the people is not given by an "authentic" demos, but by its citizens, that is, by those individuals who publicly share a common concern and adhere to the fundamental principles by which the democracy defines and builds itself. In a democratic perspective "people is rather only a summary formula for human beings".46 As a matter of fact, there is no "demos" without democracy, that is, without individuals who recognise each other's rights and duties. A people in political and legal terms (a "demos") is a normative product: "populus dicitur a polis" - wrote Baldus de Ubaldis in the XIVth century; it is not there to be found before one starts the difficult enterprise of building up a polity. A people in political and legal terms is the outcome of political and legal institutions: it crystallises around them ("civitas sibi faciat civem" - said Baldus' master, the great Bartolus de Sassoferrato). A people in democratic terms, a demos, the people of a democratic polity, thus makes itself insofar as it aggregates along the rules of democracy. We can recall a famous phrase in which Kant defines a constitution as the act of the general will whereby a multitude becomes a people ("den Akt des allgemeinen Willens, wodurch die Menge ein Volk wird").47 The story of people and democracy is more or less that of the chicken and the egg. Which came first: the egg or the chicken, demos or democracy? Now, as far as the latter pair is concerned, we can confidently solve the enigma: they were just born together! In short, es gibt kein Demos ohne Demokratie. This is another reason, and a fundamental one, why European citizenship is so important: because it is a stone, and a founding one, in the building of a European democracy.
Democracy needs at least two poles: decision-making authorities and citizens to whom those authorities are called to account for their decisions and the corresponding behaviour. If we have democratic citizens, persons endowed with a rich patrimony of rights, we should then have democratic political authorities. If we have democratic citizens, we already have a demos. And to have citizens in legal and political terms is only a question of common rights and duties. In the organic view of democracy, we are confronted with a dangerous confusion of the notion of public opinion with that of ethnic and cultural homogeneity. This confusion unfortunately seems to be perpetuated by the German Federal Constitutional Court in the "Maastricht Urteil": "Demokratie, soll sie nicht lediglich formales Zurechnungsprinzip bleiben, ist vom Vorhandensein bestimmter vorrechtlicher Voraussetzungen abhängig, wie einer ständigen freien Auseinandersetzung zwischen sich begegnenden sozialen Kräften, Interessen und Ideen, in der sich auch politische Ziele klären und wandeln und aus der heraus eine öffentliche Meinung den politischen Willen vorformt. Dazu gehört auch, daß die Entscheidungsverfahren der Hoheitsgewalt ausübenden Organe und die jeweils verfolgten politischen Zielvorstellungen allgemein sichtbar und verstehbar sind, und ebenso daß der wahlberechtigte Bürger mit der Hoheitsgewalt, der er unterworfen ist, in seiner Sprache kommunizieren kann".48 I find it correct to affirm that democracy, in the sense of majority rule, presupposes some fundamental pre-legal conditions as much as some fundamental normative (moral and political) principles, a vigorous and open public discussion and an influential public opinion. Democracy as a political institution needs, in other words, a civil society. But first, a civil society does not necessarily coincide with some Schicksalsgemeinschaft, a homogeneous ethnic and linguistic community. (Suggestively enough, when the German Court tries to establish a clear-cut separation between national citizenship and European citizenship, it finds nothing better than recourse to their different degree of existential tightness: "Mit der durch den Vertrag von Maastricht begründeten Unionsbürgerschaft wird zwischen den Staatsangehörigen der Mitgliedstaaten ein auf Dauer angelegtes rechtliches Band geknüpft, das zwar nicht eine der gemeinsamen Zugehörigkeit zu einem Staat vergleichbare Dichte besitzt".)49 And, second, a civil society becomes a "people", in the sense of the sum of a polity's citizens, only by interacting with constitutional rules and institutions. This point is clearly expressed in the following statement by Ulrich Preuss: "Neither pre-political feelings of commonness - like descent, ethnicity, language, race - nor representative institutions as such are able to create a polity, be it a nation-state, a multinational state or a supranational entity. Rather, what is required is a dynamic process in which the will to form a polity is shaped and supported through institutions which in their turn symbolise and foster the idea of such a polity".50 Sure, a common language among citizens and between civil society and political institutions is needed in order to have public discussion and thus public reason.

and Legitimacy. Theoretical Perspectives, ed. by M. Rosenfeld, Duke University Press, Durham and London 1994, p. 155).
46 B.O. BRYDE, Die bundesrepublikanische Volksdemokratie als Irrweg der Demokratietheorie, in "Staatswissenschaften und Staatspraxis", 1994, p. 322.
47 I. KANT, Zum Ewigen Frieden. Ein philosophischer Entwurf, in ID., Kleinere Schriften zur Geschichtsphilosophie, Ethik und Politik, ed. by K. Vorländer, Meiner, Hamburg 1959, p. 128.
However, a common language can be a conventional or an artificial one. To be citizens, individuals should be able to communicate with the political authorities: they should be able to understand each other. But this does not at all imply that to this purpose individuals should use their own mother tongue. Any other language will do, provided it is common to the parties. It may be the case that in the European Union we do not yet have such a common language. Nonetheless, such a language can be found. We can think of a lingua franca emerging in the ongoing process of European integration, or of a net of various national or regional languages employed each at a different level and for a given occasion but allowing a continuous flux of information.51 Moreover, the common language does not need to be the same on every occasion. We could perhaps apply a kind of subsidiarity principle to the use of the different languages, choosing the one or the other according to the context, the dimensions of the issue at stake and the people (and the languages) concerned.

48 BVerfGE 89, 155 [185], italics mine. For a powerful criticism of the constitutional Weltanschauung of the German court as expressed in this decision, see J. H. H. WEILER, Does Europe Need a Constitution? Reflections on Demos, Telos, and the German Maastricht Decision, in "European Law Journal", 1995, pp. 219 ff.
49 BVerfGE 89, 155 [184]. Italics mine.
50 U.K. PREUSS, Problems of a Concept of European Citizenship, in "European Law Journal", 1995, pp. 277-278. Italics in the text.
51 See what Jürgen Habermas opposes to Dieter Grimm's defence of cultural homogeneity as legitimation for democracy (J. HABERMAS, Comment on the paper by Dieter Grimm: 'Does Europe Need a Constitution?', in "European Law Journal", 1995, pp. 303 ff.).
"Zweitens" - as was pointed out by Edmund Bernatzik, a leading public lawyer of Austria felix - "kann man ja eine fremde Sprache lernen".52 In any case, successful European experiences such as the Erasmus Programme or the European University Institute in Florence (a university being an institution for which communication is of the utmost relevance) show that it is possible at least to have a European university even without a European folk. Europe admittedly is not a nation, and European citizens as such are not one either. It is perhaps high time that the one (Europe) and the others (European citizens) combine their plans, leaving the nation to its old-fashioned nightmares of blood and soil.

Belonging to a European polity

I am not so much concerned about the sociological evidence supporting the romantic thesis according to which peoples and nations are homogeneous ethnic and cultural entities. My stance towards this thesis is quite radical. Even if it were true, even if nations were Volksgemeinschaften, that would still not be a ground of legitimisation for a genuine democratic polity. Since democracy is based on intersubjective discourses and representation, any process which worked without an explicit reference to individual and interindividual will formation would not be apt to offer any democratic legitimisation to a polity. The demos of democracy certainly is not ethnos. Yet, in order to defeat this foolish resistance, we might recall a historical fact: that in most cases the so-called Schicksalsgemeinschaft is the outcome, an artificial product, of the State or of other reflective political processes.53 This was recognised in 1933 by Hermann Heller, himself a strong defender of nations as Schicksalsgemeinschaften (and therefore quoted in the "Maastricht Urteil"),54 when he was confronted with the rise of the Nazi regime.
"Weder das Volk noch die Nation dürfen als die gleichsam natürliche Einheit angesehen werden, die der staatlichen Einheit vorgegeben wäre und sie selbsttätig konstituierte. Oft genug war es [...] umgekehrt die staatliche Einheit, welche die 'natürliche' Einheit des Volkes und der Nation erst gezüchtet hat. Der Staat ist mit seinen Machtmitteln durchaus im Stande, selbst aus sprachlich und anthropologisch verschiedenen Völkern ein einziges zu machen."55 Peoples in the cultural sense, in some cases at least, are not prior but posterior to the State's (sometimes brutal) intervention. The "ethnic" homogeneity of Pale in Bosnia could never be claimed as the outcome of an organic process of communitarian growth.

52 E. BERNATZIK, Die Ausgestaltung des Nationalgefühls im 19. Jahrhundert, in ID., Die Ausgestaltung des Nationalgefühls im 19. Jahrhundert - Rechtsstaat und Kulturstaat. Zwei Vorträge gehalten in der Vereinigung für staatswissenschaftliche Fortbildung in Köln im April 1912, Helwingsche Verlagsbuchhandlung, Hanover 1912, p. 27.
53 Cf. what is said by Oswald Spengler, an author certainly not to be suspected of any "abstract", "formal", "thin", universalist liberal political views: "Die 'Muttersprache' ist bereits ein Produkt dynastischer Geschichte. Ohne die Capetinger würde es keine französische Sprache geben [...]; die italienische Schriftsprache ist ein Verdienst der deutschen Kaiser, vor allem Friedrichs II. Die modernen Nationen sind zunächst die Bevölkerungen alter dynastischer Gebiete" (O. SPENGLER, Der Untergang des Abendlandes. Umrisse einer Morphologie der Weltgeschichte, DTV, München 1986, p. 779).
54 See BVerfGE 89, 155 [186]. Cf. the sharp critical comments by Brun-Otto Bryde (B.O. BRYDE, op. cit., p. 326, note 37).
55 H. HELLER, Staatslehre, 6th rev. ed., ed. by G. Niemeyer, Mohr, Tübingen 1983, p. 186.
On the other side, as far as a European "demos" is concerned, we might affirm that, in spite of the lack of one (and only one) common language, there is something like a common European cultural identity. A common history, common tragedies and suffering, common values, common "myths" - if you like56 - have made of the French, the Italians, the Germans, etc., a common "people". Though a Sicilian may feel some perplexity in front of a guy dressed in leather pants and a feathery hat drinking litres of beer, she will still identify him as a European like herself, with more things uniting than dividing them. In a democracy, to be a citizen, to develop a sense of belonging to a democratic polity, one should overcome one's own rooting in unreflective communities, and be for a moment naked, a mere human being. Moving from this nakedness, one can then freely decide whether and how one wishes to cooperate. Only from this nowhere will persons be able to build up fair terms of co-operation, since in that hypothetical condition there will be no room for discriminatory grounds. Democracy, as a polity of equals, presupposes a kind of "transcendental" nakedness: "Democracy is a system of government according to which every member of society is considered as a man, and nothing more".57 European identity, meant as membership in a European polity, can only be the outcome of a reflective adhesion to an institutional body ruled by democratic rules and offering a rich, comprehensive set of rights. Thus, the European identity we are in search of passes through the consolidation of a meaningful European citizenship.

56 Cf. F. CHABOD, Storia dell'idea d'Europa, Laterza, Bari 1995.
57 W. GODWIN, Enquiry Concerning Political Justice and Its Influence on Morals and Happiness, ed. by I. Kramnick, Penguin, Harmondsworth 1976, p. 486.
From poetic citizenship to European citizenship
Claire Lejeune

If I dwell from the outset on the fact that this reflection proceeds from motives peculiar to a woman, and a poet moreover, it is precisely because the citizenship of women and that of poets has been, at the very least since Plato, an object of exclusion. This is to say that both have, on European culture and identity - or simply on identity - the other view, a different discourse, which has difficulty making itself heard in public debates. And yet, without it, there is no possible democratic dialogue. Should we not begin by calling into question the very sense of the word "culture"? The globalisation of the market economy has made culture a commodity, an object of production and consumption. To revitalise it is to give it back its function of well-thought-out action. In the situation of rupture that we are living through at the end of the XXth century, cultivating one's mind means not only enriching one's knowledge of the heritage (patrimony) so as to be able to enjoy it and to transmit that enjoyment, but chiefly becoming capable of generating a societal project, of giving birth to the future; literally, delivering our mentality of the XXIst century that it bears so painfully but which it is, however, alone in bearing.
I am not one of those who believe that "History has more imagination than men": to trust history to invent the future is necessarily to go in its direction, to let oneself be carried from upstream to downstream, in other words to abandon it to its fatality, its determinism; whereas any creation supposes that thought resists the force of the mainstream, that it climbs back against the current of History, that it thinks itself from downstream to upstream, that it returns to the sources of patriarchal History - not to re-immerse itself in them nostalgically, in search of an ideal original purity, but with the ethical intention of bringing to light the foundations of this fratricidal civilisation whose endless agony we are living through.

A desire for Europe

All those, men and women, who are led to ask about the future of the planet by the collapse of socio-political systems and the sudden growth of fundamentalist and nationalistic perils agree that nothing will happen without genuine transformations of our mentality and behaviour. We know that the only power capable of undermining and undoing - from the inside - the supreme reign of money can only be born from the intensive development of consciousness, which lags frightfully behind that of science and technology; in other words, from making each citizen aware of his or her responsibilities. That said, what remains is to set this awareness to work: to create and organise the network of resistance to generalised mercantilism which culture will necessarily have to be in the XXIst century. European citizenship does not exist when it is legitimised by a Treaty only, when it has no other body than the lifeless body of law. For it to become alive and active it must be desired; it must be rooted in memory's emotional depths, where desire reigns.
Creative citizenship can no more do without the order that comes from law than without the energy that comes from desire: it falls to our daily aptitude to embody dynamically the conflictual relationship that the logic of reason and the logic of passion keep alive in us. From the beginning of our reflection, lucidity obliges us to recognise that if Europe does not lack a body that makes law, it generally lacks the desire that makes sense - in other words passion, the emotional motivation that it needs to build itself. Without wanting to psychoanalyse our relationship with Europe, we shall have, for it to come to life, to feel it, to think it out in terms of feeling, and this feeling will have to find the words to spread. Between the murderous hatred of Europe to which nationalism testifies and the platonic love which inspires its bigots, it is a matter of qualifying, embodying, humanising this European citizenship which is as yet only an indispensable fiction. The question is that of the existence in us - real or virtual, latent or revealed - of a "desire for Europe" (I say "a desire for Europe" as one would say "a desire for a child") with which the legislator has hardly been concerned up to now. If this desire exists, what does it correspond to in our imagination? Does there exist among the strata of the individual unconscious a European unconscious, as one says that there exists an African unconscious? Does an initiation into European citizenship necessarily come through discovering the places of this unconscious, through a recalling of the European mythological sources (Judeo-Christian, Greco-Roman, Celtic, Germanic)? We know today that myths are, in the memory, where high human energy is focused: the complex and very real places of a violence capable of both destruction and creation.
Knowing the deep psychic manipulations that fundamentalism and nationalism, religious and political totalitarianism, operate through the unequivocal and dogmatic interpretation of myths and symbols, what work of searching and critical analysis must be undertaken to become aware of these occult sources of history? What work of deciphering and enlightenment, to turn them into sources of creative energy for a transnational, transpolitical, transcultural Europe? How shall we go about it so that traditions cease to be the prisons of thought? How are we to open them? What must we do for them to become the very sources of freedom, of the fertility of thought, i.e. of something truly new? Everything leads us to believe that it is to the deep level of myths and symbols that we must go back, with open eyes, to free the European imagination - the entire imagination - from its historical conditioning, to put it in a position to desire both the diversity and the community of its destiny, in other words to motivate consciousness to invest its high energies in the construction of a decompartmentalised society. It is clear that if this work of deconditioning, of genuine secularisation of mentalities, is not done, planetary solidarity is doomed to remain a utopia. Humanisation is not self-evident: it is the fruit of continual work on oneself, in other words of a culture in depth, in the most down-to-earth sense of the term. No doubt it could even be said that one is not born human, but becomes so. It is from this unremitting work of thought on the contents of memory that a society of persons with unlimited responsibility can come into being. At a time when women allow themselves - how painfully - to speak on the scene of political disaster, one must know that this speech is new, that its forms of legitimacy are still to be invented.
A woman's citizenship does not have to recreate a political-cultural space, but quite simply to create it for herself, for the first time, from the ruins of a History in which she has only ever had the right to speak in the name of the Father and of the Son, in the name of a sex which is not hers. What seems vital to me today is to rethink the very concept of identity, to understand that the identity principle is also the principle of exclusion of the third party. The logic of identity is that which invalidates any truth stemming from the crossing of the thought of the I and that of the other. It is on the rejection of disturbing strangeness, of impurity, of all that is not white or black, masculine or feminine, dominating subject or dominated object, that the xenophobic History of Nation-States was based. To want to rebuild on the same foundations would be wholly irresponsible. The human resources on which the history to come can rely are those which were inhibited, doomed, sacrificed by patriarchal History to ensure the stability of its Order. If life on the planet stands a chance of saving itself, of rising above itself, that chance lies buried in the repressed part of patriarchal History. What twenty-first-century thought is going to have to set free, i.e. cultivate in order to regenerate itself, is precisely what the Patriarchate has excluded, gagged, burnt to ensure the continuity of its domination. This treasure of the possible is buried in our memory. It is up to each of us (woman or man) to work on our own mental field, to cultivate it so that it becomes an oasis of true life, a space of creation - both of projection and reflection - in the pervading cultural wilderness.

To sex the question of identity

Faced with the ravages of the evil of exclusion, the remedy most universally prescribed in this fin de siècle is, of course, communication: between people, sexes, ethnic groups, and cultures.
The term "communication" enjoys a theoretical fortune without precedent and, on the other hand, the technical means that are available to us to attempt to communicate are today fabulous. But never, no doubt, has the hiatus between the virtual and the actual of communication been more gaping than today. It is in the daily passage to the act of communicating that communication breakdowns are the most flagrant, that powerlessness is the most tangible, the most dramatic. But it is nevertheless there where we strongly experience the difficulty of communicating, where we suffer from it personally in our flesh and in our heart, that desire lies, i.e. the chance of delimiting and overcoming it. If communication is overabundantly provided with theory and technology, we are obliged to note that its ordinary practice is still in its infancy, that its language is still to be invented. The logos of communication are not the mono-logos, they are the dialogos, i.e. the language that is conceived and developed from the interactive knowledge of I and of the other. Dialogue is the communicating language, the interacting language, that which foils the principle of exclusion towards the impure third party. Now, dialogue o the dia-logic of included third parties o forms the subject of no initiation, no learning at school. It is increasingly manifest that a male or female citizen's dialogic capacity o his/her capacity of opening to the other o is the b?te noire of all forms of religious and political fundamentalism since the latter can only reign through exclusion, through division, to begin with, between the sexes. To create spaces where this aptitude to opening, exchange, dialogue which is responsible citizenship is to create conditions indispensable to the advent of a real democracy. It is no longer defensible today not to sex the identity question, to stay deaf to the nascent speech of the other subject who is the I of feminine gender. 
In the light of recrudescent fundamentalism it becomes impossible to ignore that the very matrix of any xenophobia is gyno-phobia, and not the reverse. Now, gyno-phobia is not only the work of men. We have to note that women themselves can be the patriarchate's worst accomplices. The fear – alas, understandable – that they have of being themselves, i.e. different, of daring to think, say and act otherwise than men, is still far from overcome! If the common stake is the advent of a society of persons with unlimited responsibility, this supposes that an end be put to the childish moral codes based on making one another guilty. An adult feminism can only be a matter of solidarity not only of women among themselves, but between lucid women and men, in search of a happy outcome from the patriarchal impasse. It has not been well enough seen how much the health of the political depends on abolishing the rule of linguistic clichés, in other words on the citizen's aptitude to form one body, a sexed body, with his or her language. Everyone agrees that the great remedy for the ravages of hatred and indifference is love. Yes, but how can love be reinvented? How can Eros be freed from the murderous empire of Thanatos? It must first be understood that the eroticisation of the social body necessarily passes through the eroticisation of the body of the language, and that the eroticisation of the body of the language necessarily passes through its sexuation... In the architecture of a construction there is always a logic at work – more or less conscious: a logic of closing or a logic of opening. A socio-cultural space is built either according to the identity principle (xenophobic exclusion of the impure third party) or according to the solidarity principle (logic of inclusion of the impure third party); according to "cliquishness" or "workshop spirit".
This is to say that an effective democracy can only come from an unprecedented alliance of consciences in which the humanity of the solidarity principle has prevailed over the inhumanity of the identity logic. It is at the level of the founding principles that the true political cleavage lies. We shall come out of this societal crisis alive – a crisis which most agree to qualify as structural – only if thought about the political becomes deeper, not without vertigo, till it gets right to the bottom of its rational and irrational foundations. A cultural act, if ever there was one! In the current debates on the European Union's political structure, attention is often focused on the word nation. How can we revive a form of nation which is not a people's defensive withdrawal into itself, which is not undermined in advance by the demon nationalism? The rights and duties of the European Union citizen will not be those defined by the Nation-States' patriarchal History. The idea of a European fatherland must be given up. Europe will be a brotherland or will not be. The passage from a closed nation to an open one, from the fatherland-nation to the brotherland-nation, will only be achieved through recognising the effective existence of two equal and different human genders, irreducible to each other. We are no longer unaware that we are all bisexual, all impure, all half-castes. A woman's femininity is not a man's, as a man's virility is not a woman's, which means that in the relationship between a man and a woman there are four sexes in continuous interaction. If women's thought proves other than men's, it is due not only to the difference in cultural memory (the feminine has crossed history not as dominating subject but as dominated object) but also to the difference in body memory.
While a man keeps indelible the physical and emotional memory of having had a mother, he is deprived of the memory of having been the belly required for the generation of the other, that place not only of conception and gestation, but of the expulsion of another. The link to otherness fostered by feminine identity is undeniably different from that fostered by masculine identity. To recognise this difference in natural and cultural memory between men and women, to find words and images to make it noticeable and intelligible instead of continuing to ignore it, is to give thought what it needs to regenerate itself.

A postscript

In deciding to develop, as a postscript to my introductory text, the words that I spoke during the "Carrefour", I want to testify to that wave which passed among the participants and which Marcelino Oreja called "the Coimbra spirit". For the European that I am, clearly there will henceforth be a pre-Coimbra and a post-Coimbra. In the reflection document that Thomas Jansen drafted, we are reminded that determining the "political finality" of the European Union must be done in the perspective of a project of "world federation". I feel my European citizenship as an interface, as the indispensable mediator between my awareness of belonging to a local, regional community and that of belonging to the world community. European identity can make sense for an individual who wonders what it means only in relation to a project of planetary citizenship. Even then this individual must of course be motivated to do so, which first assumes that he is wondering about the meaning of his own existence, in other words that he has reached a certain degree of maturity. The concrete forms of the we can only be implemented if desired, imagined, thought up and meant by a multiplicity of I. A pluralistic world can only be built by communities of responsible singulars.
Real inter-nationality and inter-culturality can only be conceived and expanded on the basis of a well-thought-out and theorised practice of intersubjectivity which would be at the very basis of education. The mental revolution which can be expected to lead to the advent of a world democracy is occurring at this moment within the family, school and university. The numerous signs given by the current mutations are still being perceived and interpreted as negative signs of disarray, signs of the collapse of the monological order of the patriarchate. The high psychic energies – repressed by history – which these mutations are releasing will be translated into acts of destructive violence as long as they do not have available the tools of thought which enable them to be transmuted into creative power, as long as they have not found the language that actualises the strangeness of each man and each woman (the real object of xenophobia) as the very essence of their universality. Over a century ago, Arthur Rimbaud wrote in La Lettre du voyant: "to find a language; – besides, every word being an idea, the time of a universal language will come! [...] This language will be soul for the soul, epitomising everything, scents, sounds, colours, thought catching thought and pulling. The poet would define the amount of unknown waking in his time in the universal soul: he would give more – than the formula of his thought, than the annotation of his march to Progress! Enormity becoming norm, absorbed by everyone, he would truly be a multiplier of Progress." The first condition of the advent of an adult Europe, responsible both for her own future and for that of the planet, is that she worries not only about informing but about forming her citizens, not only about their access to the multiplicity of knowledge, but about their initiation into the act of thinking for oneself.
Finding a language to express the strangeness, the continual newness of this self-generating thought necessarily passes through the capacity of imagining, through the development of the resources of the personal field and the collective field of our imagination. One is not born a creator, one becomes one. Even then one must discover the logics, the dia-logics, of creation and communication, the tools of interactive thought, and learn to use them. Only a culture of intersubjectivity will enable us to overcome the spiritual and affective handicap of a modern mentality distorted by the exclusive reign of scientific objectivity. What the thought of our times most evidently lacks is neither faith nor reason; it is vision. But visionary speech is the fruit of that logic of creation – the logic of the included third – which poietics is (poiein: to make). To Hölderlin's question, "Why poets in times of distress?", I would answer first: because the poets who think the world pre-see how a utopia destroys itself, an "ideal City" which excludes them, and how a "real City" which integrates their turbulent presence can be built. "Poetry will be in front", and "the poet will be a citizen", writes Rimbaud when prophesying the real democracy he longs for. Blind faith in a "lendemain qui chante" (a singing tomorrow), without that visionary lucidity of which René Char tells us that it is "the wound closest to the sun", can never lead humanity to anything other than a utopia doomed sooner or later to collapse. I hasten to say with Lautréamont, another prophet of real democracy, that "poetry will be made by all", which means that everyone will have to awaken in themselves the poet whom western civilisation has excluded in order to found its order. Only a vision of the future can pull us out of the belly of the past and project us ahead of ourselves.
We must be able to imagine the future; to see it revealing itself (in the photographic sense of the word) on our inner screens; we must be able to give birth to a picture of the common future which is specific to us, which is particular to us, for it to mobilise our deepest energies.

Spiritual foundation

The re-enchanting of the world at which a pedagogy of creation and communication aims – a pedagogy of the inter- and of the trans- – passes through questioning thought about its tools. What makes the difference between the logic of the divisional (of the excluded third) and that of a visionary thought (of the included third) is the coordinating conjunction of opposites. For the logic of knowledge and power I can only be I or the other (the inter is interdicted, i.e. unsaid); but when I enter the field of creation and communication, I am both I and the other, co-existing in an analogical relation which underlies their dialogical link: I am to you as you are to me. According to the principle of pure reason, A is A and B is B: I cannot be another. Between identity and alterity, all impurity, all ambiguity, all common ownership – all strangeness – has to be deprived of active citizenship. Europe, said Husserl, cannot forget her spiritual foundation, which takes root in the Greek soil of philosophy. I believe that the very notion of "poetic citizenship" can only be grasped and shared by the double reference to Plato, who excludes it, and to Rimbaud, who predicts its resurgence. Let us first remember that Plato, in the name of the principle of reason, sees it as his duty to put the poet out of his republic. Like women, children and lunatics, the poet is excluded from taking part in the business of the "ideal City"; his (magical) thought is deprived of legitimacy, i.e. of citizenship.
This is what Plato writes in The Republic: That was, I went on to say, what I meant, returning to poetry, to justify myself for previously banning from our republic so frivolous an art: reason made it a duty for us to do so. Let us also say to it, so that it may not accuse us of harshness and rusticity, that the dispute between philosophy and poetry does not date from today. Notwithstanding, let us protest strongly that if imitative poetry, which has pleasure as its object, can prove for some reason that it must have its place in a well-ordered society, we will bring it back into it wholeheartedly. As for Rimbaud, he predicts the return of the poet in that prophetic letter which came to be called la Lettre du voyant: Eternal art would have its functions, as poets are citizens. Poetry will no longer punctuate action; it will be ahead. These poets will be! When the infinite bondage of woman is broken, when she lives by herself and for herself, man – so far abominable – having given her her release, she too will be a poet! Woman will find things unknown! Will her worlds of ideas differ from ours? – She will find strange, unsoundable, repulsive, delicious things; we will take them, we will understand them. Cross-checking these two texts, twenty-three centuries apart, the one founding our civilisation, the other predicting its end, is, in the current sociopolitical context, prodigiously enlightening. That you invited me to speak among you, I, poet and woman, both delights me and makes me feel hugely responsible. I must find the images and words capable of expressing my own vision of the world being born, knowing that this risks disturbing yours, but at this cost only does it have a chance of acting, of inciting you to find whatever words and images will express yours and contest or meet mine. He who comes into the world to trouble nothing, says René Char also, deserves neither consideration nor patience.
For a real dialogue, an effective democratic game, to occur, there have to be at play two different speeches and two different listeners, who affect, respect, greet one another, who cease being indifferent to one another. A conflictual relation can start generating a trans-personal, trans-cultural, trans-national, trans-political thought only by means of this quadrivocal dialectic, which prevents communication from getting bogged down in the rut of consensus, from being trapped in the homogenisation where what it is today agreed to call "la pensée unique" thrives.

Perception of oneself (conceiving of oneself)

I attempted to make you see in my speeches how the links between my poetic citizenship and my European citizenship are woven; these speeches are thus of the order of testimony. Thirty-five years ago there occurred in me the illumination – the poetic experience – in which I was initiated into my own existence and into the vital need to find a language to express that disturbing strangeness which suddenly served me as identity in a basically xenophobic and misogynous society. The instant before this literally apocalyptic instant (of revelation), I was present neither to myself nor to the world. The instant after, my patriarchal imagination was in ruins; I had, on pain of death or madness, to build another one, a dynamic, self-generating imagination. Starting from the desire to become who I really am, I had to re-create for myself a love imagination, a family imagination, and a social imagination. To put it in other words, I had, by means of visionary thought and of the work of writing, to save myself from chaos: no saviour would do it in my place. Let us say, briefly, that my spiritual dimension – a verticality made up of height and depth – was born of this wild initiation into the genesis of consciousness.
Conceiving oneself is experiencing the primeval consubstantiality of space and time, of I and the other, of both the woman and the man that I am; it is reaching the lightning nucleus of SELF, of which André Breton said that it is the POINT of the mind from where life and death, the real and the imaginary, past and future, what can and cannot be communicated, top and bottom, cease being perceived contradictorily. He added that the point in question is a fortiori the one where construction and destruction can no longer be brandished one against the other. I understand that the Europe of today too is seeking to know herself, to know her soul, to become aware of who she really is in relation to the world and to make her presence felt in it; to express her project of a post-modern future. In other words, Europe is more or less confusedly seeking to become an adult we, i.e. a community of persons and nations with full and entire responsibility. It is indispensable for us to produce symbols, images, metaphors, said Jérôme Vignon. We must provide tools of communication other than conceptual ones, but which can be linked to the conceptual to revive it, to re-nature it, to re-humanise it. We must bring into the world – beyond the great hardships of History – a new understanding of the real. Of this post-modern thought born of the reconciliation of poetry and philosophy, I like to say to myself that it is post-Socratic, in the sense that it recalls that prodigious pre-Socratic thought which was current before the two split. Thinking as a poet is being able to put oneself in the other's place, being SELF (consubstantially I and the other), but also being able at the same time to embody, from the smallest to the largest, all the circles of collectivity I belong to. What would be the soul of a people other than the one their poets gave them?
If I think Europe as a poet, I identify with her, I espouse her cause, I form one body with her present, I lend her the strength of my visceral resistance to all forms of totalitarianism; thus I commit myself personally to her quest for a non-fatal outcome to the unprecedented impasse in which she finds herself at the end of this century. Let us say that I see my own experience of emancipation as an illuminating metaphor of the trying search for herself which Europe is pursuing today. From this analogy I draw not only my motivation but the daily energy needed to provide this project of a transnational, trans-cultural Europe with a body of writing radically other than her "body of laws", which will never have anything but a set language; that is to say, with a poetic existence without which it will remain a dead letter. Only the influence of an adult poetry – in the sense that it has freed itself from the condition of minor thought to which western philosophy confined it – irresistibly confident in its real power to change life, could truly re-enchant the world. If, like Ariadne, I undertake to pursue the metaphor the better to understand all that was exchanged thanks to the Coimbra forum, I say to myself that Europe will get out of her crisis of growth, will become a big adult woman, only if she dares to call into question the dogma of economism which threatens at any moment to "topple her over from the market economy to the market society": a striking formula which Zaki Laïdi gave us of the peril threatening us. Put differently, Europe will not recover from her disaster unless she appropriates the freedom of self-determination, the freedom to choose the model of globalisation to which she wishes to belong.
We heard Mario Soares tell us forcefully that he "does not want a Europe exclusively determined by economic and monetary demands but a political, social Europe, a Europe of citizens, a Europe of participation"; not a Europe that we would have to suffer, but a Europe that we have to make happen. From the moment that the European Union knows not only what she does not want, but what she wants to be, she must change her history, i.e. her relational logic; she must pass from the identity principle based on the exclusion of the third's strangeness, which determined the building of xenophobic Europe, to an interactive dia-logic based on the integration of this strangeness, on the actualisation of all mediation between identity and alterity. Only the development of such a citizenship in the process of building the Union can save Europe from the twofold peril which threatens her: homogenisation or atomisation. It is urgent to understand that a democratic space can only be built from a mentality structured by the "solidarity principle", a principle of the interactivity of opposites. Subject A is to object B what subject B is to object A: I am to you what you are to me; a logical translation of the principle of Christian charity: love (respect) the other as you love (respect) yourself; a universal, secular formula to which any religion of love can rally without betraying itself.

The game of interactivity

As soon as we understand that in any human relationship there are really at play at least two subjectivities and two objectivities, two identities and two alterities, i.e. four elementary truths, the problem of thought is completely transformed. "Telling the truth" supposes from that moment that our four truths recognise one another, interfere, interact, and that a dialogical language is invented, able to translate no longer the duplicity but the quadruplicity of the real.
In this great dynamic game of interactivity, all horizontal, vertical and diagonal relations are authorised. What disappears in the dynamic structure of real democracy is the inevitability of exclusion. All aesthetic, ethical and political revival, all possible regeneration of the social body, will proceed from this metamorphosis of the structures of our relational imagination. The identity of the European Union should appear as that of a societal model which succeeds not only in safeguarding entitlements, but in integrating the great historical, political, economic, technological and ethical upheavals. Which supposes the conception and implementation of a logic of construction which is a logic of the integration of differences. The image that I have of the Europe to come is less that of a continent in search of an intellectual leadership able to face up to the rise of the (economic, political and religious) fundamentalisms than that of a living and thinking organism, capable of metabolising what has happened to it and what continues to happen to it for better or worse; so as to be able to build for itself a great contagious health, the influence of which works not only to relieve but to heal the extreme misery from which the world is suffering. I see the European building site as the main, if not the only, chance our planet currently has of saving itself from the perils which threaten it, of building with new tools of thought its first "real City", its first adult democracy, its first trans-national phratry on the ruins of the xenophobic patriarchate.

L'identité européenne comme engagement transnational dans la société
Rüdiger Stephan

European identity as a transnational commitment in society

A terminology debate is of no value in the face of the real challenge. The term "identity" is merely a starting point.
Psychologists would say that it covers both continuous identification with oneself and permanent adherence to certain traits of character which are specific to a group. On the one hand, identity appears as a criterion for acts intended to provide a synthesis of the self, while on the other hand it signifies a feeling of solidarity with a group. Thus, there are two aspects:
- identity is linked to the individual, the person;
- identity reflects a state of existence, an outcome, the end of a path.
On this basis, what is the answer to the original question: how can we express this identity, which must take on a European dimension? First of all, it is the individual, the European citizen, who must both give and receive the reply, in the context of his relationship with himself and his environment. The citizen should be able to express this identity, which in turn must be developed together with the citizen. Furthermore, if the identity of the individual is a fulfilment, the sum of a personal history, then European identity is made up of a huge and varied heritage. European identity appears here as being linked to the past, and the future is not a factor. To express European identity through heritage only, however rich this may be, would be to limit oneself to a conservatism without a future. Europe needs visions which relate to the future. The development of a European identity can come about only through a European consciousness, bringing in itself movement and evolution, a European consciousness which captures the national identities in their diversity and conceives them as having a common future. Expressing this identity – a forward-looking European consciousness – implies the abolition of the antagonism between national and European identities. European identity-consciousness is founded on national identities, and finds its expression in cooperation and interaction.
We need this European identity-consciousness in order to avoid wars among ourselves or with others, to pool our resources, and to join forces in the face of the challenges of our time, which transcend national and continental boundaries. We draw this identity-consciousness from a heritage which expresses what is common to us, or what we recognise as being common to us. We draw it from history, the common European traits of which we are rediscovering after two centuries of nationalism and nationalist interpretation. We draw it from the memory of the past, our memory banks – and what are our European memory banks? We draw it from the symbols which we have succeeded in creating and which we shall be capable of creating in the future. We find it in the democratic institutions and rules which structure and define life within our societies, the relationships of the individual and society, and the rights and duties of the citizen. The European Union has neither a political nor a social structure which would give it an "identity" and allow it to develop a citizen's European consciousness, or which would allow the citizen to develop a European consciousness. The Council of Ministers is not European, but inter-governmental. The European Commission acts as if it were inter-governmental. In order to have our voice heard, we must use the channels of the national representations or even national bodies. The European Parliament, the political representation of the citizen, is not truly recognised as such, because its powers, responsibilities and image do not correspond to what the European citizen, accustomed to the role of his national parliament, can or wants to expect from it. Nevertheless, it is the European institution with which the citizen can identify most easily, because it is supranational, or European, and because Parliament fights to give legitimacy to Europe, which also gives it symbolic value.
However, there are forces within society which are not representative of national interests and are non-governmental, non-State and transnational by nature. First of all, there is the economic sector, or at least the bulk of it. Industrialists spend all day telling us that their vision is no longer national or even European, but global. The economy creates its own identities – corporate identities, which are neither national nor European. This is what is called the "IBM identity". The question remains as to whether, with the single currency, the economic sector can also help boost European identity. There is, however, another sector of society which is developing rapidly. It is known as the "Third Sector", a term which is both vague (in that it takes in the most diverse forms of organisation) and precise (in the sense that it refers to non-governmental organisations). Civil society translates the will and aspirations of the citizen and quite naturally goes beyond the national context – in fact increasingly so. As every domain of society is affected by, or is open to, international pressures, these organisations nowadays all engage in activities which to a greater or lesser extent go beyond national confines. This "Third Sector" – the expression has come to represent organisation, solidarity and community – represents the commitment of the citizen within society and through society. The Third Sector is not a third country, but a sector of present-day society which should become an increasingly important communication partner, a forum for proposals and for implementing new solutions needed to resolve the major problems facing us today. If we wish to develop European identity-consciousness, this movement towards more Europe, in and with the citizen, if we wish to organise participation and interaction, we must find, or in my view create, a way of organising relations between the Third Sector and the European institutions.
As the social and cultural organisations of the Third Sector reflect this commitment on the part of the citizen to non-State and non-public forms of organisation and institution, it is necessary to create a space in society giving the citizen a voice outside the national framework, a European space in which the citizen's commitment to society can be expressed. This societal space should be able to communicate regularly with the European Parliament, whose powers would have to be extended, and with the European Commission. Participation and interaction could be expressed and organised around major subjects of civilisation, such as work and integration into society, national and European memory banks, European citizenship training for the younger generation, and the development of a European language policy.

Security and a common area
Adriano Moreira

One way of analysing comparable transitions from unity among nations to a united Europe is to see it in peaceful terms as the setting of a boundary against a hostile power which threatens freedom and integrity. Let us say that as a general rule the fact of being subjected to the same climate of aggression generates a common defence system and the emergence of an identity through the feeling of security experienced in relation to the threat. Although Toynbee regards the West as the present-day aggressors, identified as such from outside by the peoples of the former colonial territories, the fact of being surrounded by a common threat has more than once united Europe. This was the situation in Western Europe for more than half of a century dominated by the military pacts (NATO and the Warsaw Pact), until it came to an end in 1989 with the fall of the Berlin Wall; and yet Western Europe was involved in defending a plan to unite the area from the Atlantic to the Urals. It thus firmly refused to be "le petit cap au bout de l'Asie" (the little cape at the tip of Asia), as Valéry used to put it.
A political area

Identity implies a common area which has a geographical form, but this will only have a border if, for unambiguous reasons of security, solidarity among the peoples involved and well-established sociological proximity, that identity is assumed. Article O of the Maastricht Treaty lays down that any European State may become a member, but it does not attempt to define a European State. In fact the supposed common area is divided by various formal frontiers which do not coincide but were laid down for pragmatic reasons with a view to achieving the overriding objective. The European Union has had 15 members since 1995 (when Austria, Sweden and Finland joined) and is considering admitting another 12 at the beginning of the next century, which is nearly upon us. The Council of Europe has 39, including Russia since 1996, which should make us wonder whether the area is broadening out or joining up. Meanwhile the Organisation for Security and Cooperation in Europe (OSCE) has 54, which raises the very same question in a more complicated form. The sea frontier is extremely long, from the North Sea down the Atlantic seaboard to the Mediterranean, a fact which raises the opposite question to the one raised by the Council of Europe, i.e. whether the European identity has the effect of fragmenting the Atlantic identity which is still best expressed through NATO. This multiplicity of formal frontiers, outlining areas which do not coincide, points towards the definition of a political area demarcated by a series of common threats facing it and a common determination to confront them. In Europe's experience, historical internal conflicts are identifiable as such and are not to be confused with external threats.
Title V of the Treaty of Maastricht defines as one pillar of the European Union a common foreign and security policy, leading eventually to a common defence policy, but it does not distinguish between the internal frontier formed by the threat of the recent past and the external panorama constituted by a world context in flux. It should be remembered that the founding fathers of the new Europe, Jean Monnet, Adenauer and Schuman, intended to free Europe forever from the spectre of civil war, with Germany and France in the leading roles, and in the area of security it is WEU which reflects that rivalry most clearly: the United States came over to Europe to fight twice in the same generation because of that historical conflict, and the object of WEU was to define a restrictive arrangement for the entry of the Federal Republic of Germany into NATO. The external threat is a different issue, and that was reflected in the Atlantic Alliance for the half-century when the world was divided into two opposing camps. 
Europe and the Atlantic Alliance 

At the present time, when diplomacy conducted as a Nixon-style strategy, with the three pillars formed by the United States, Russia and China, seems once more to be to the fore, the question of a European identity in the political field of security and defence, which will be responsible for defining whatever geographical frontier is eventually adopted, seems to be couched in the following terms: 
- a return by Russia to the historical nation-based strategic concept, with the idea of a "near-abroad" (the former satellite countries), and an attempt to reconstitute the geographical borders prior to 1989, in response to the creation by the Atlantic Alliance and Europe of "near friends" from the Baltic to the Mediterranean, a development made very clear by the Barcelona Conference this year; 
- the Western security organisation, to which the former Eastern European bloc is applying, and NATO, a group of countries which all aspire to be admitted to the European Union as well, thereby showing that they treat the two frontiers, the economic and political frontier on the one hand and the security frontier on the other, as autonomous; 
- in the Mediterranean region, NATO is also being pressed to provide a security frontier by the countries in the North African corridor, while it is from the EU that they also seek support for their political, economic and social development; 
- NATO is consequently being forced to give thought to adopting a new profile: 
  - in addition to the collective defence objective, it now provides logistical and military support for the peace-keeping operations flowing from the UN's Agenda for Peace of 31 January 1992; 
  - it has laid bridges for cooperation with the East, such as the North Atlantic Cooperation Council of 1991 and the Partnership for Peace of 1994; 
  - against this background, WEU has once again been brought into action as a point of reference for the Europeanisation of defence. 
Here it seems that the ongoing overhaul of the United States' strategic concept and the process of formulating a European strategic concept which is now under way are bound to acknowledge that the European security frontier and the NATO security frontier still tend to coincide. In that case, the common defence policy, or any common defence system for the European Union which emerges, is clearly first and foremost an internal question for the Atlantic Alliance, following half a century of solidarity. 

Neither Reich nor Nation: another future for the European Union 
Roger De Weck 

What is it that keeps us Europeans together? What is it that links the British, who so love to rage against the Continent, to the Poles or the Portuguese? What do we have in common? What are the differences which not only divide but also unite us? Is there a European identity? The very fact that we raise the question of identity betrays the European in us. The French philosopher Edgar Morin speaks of Europe's "manifold unity" or "unitas multiplex". For all the variety of North America, its binding forces are obvious to the observer. The most striking feature of our continent is its diversity. Certainly, the whole of Europe shares the inheritance of Christianity; indeed for centuries awareness of "Christendom" was much stronger than the notion of "Europe". But as Europeans desert the Christian churches in droves, this last vestige of the Western heritage loses its relevance. Christianity no longer unites Europeans, but nor does it divide them. As time goes by, our other great legacy, the Enlightenment, becomes less and less specifically European. Other regions of the world have long drawn on this inheritance (just as other continents have become more Christian than our own). But more importantly, if the spirit of the Enlightenment forms part of European identity, then this particular part has been damaged since the Holocaust. 
The interplay of nationalism, imperialism and totalitarianism, which, sad to say, is all too European, brought disaster. Europe proved incapable of saving itself by its own efforts. We had to be liberated. Our fate hung on the United States, and that has undermined our self-confidence. In a century that has seen the most terrible of wars, the North Americans too have often gone astray. But they have always rejected totalitarianism. The United States is not only stronger as a result, but also more decisive. There was no American Voltaire, but nor was there an American Hitler. 

What is not European 

All of us in Europe have at least one identity, which we experience again and again and which can sometimes break right through to the surface; I'm talking here of a "negative identity". We may not know exactly what it is to be European, but we are quite sure of what is not European. We Europeans have never had hard and fast criteria for determining what counts as Europe. Our continent is ill defined both politically and culturally. Not even geography can help us: does our Eastern border really run along the Urals? During the Cold War, many Westerners forgot that the "far-off" countries of Central and Eastern Europe were utterly European in character. Despite all the anti-American feeling prevailing at that time, they felt much closer to America and still do. Even so, seldom do we feel as European as when we watch an American television series, which may explain why they are so successful. They are foreign to us and yet familiar. According to one of the classic interpretative models used by psychologists, identity stems from negation. Europeans are hardly ever as united as in their determination to marginalize others. But there must be more to Europe than that, for in the long run negation is not enough: it offers a weak identity in which we protect our own egos by demonising others. 
For example, the British make a habit of "splendid isolation" and the Swiss nurture their "hedgehog" mentality. It is as if the Confederation would collapse were it not surrounded by enemies: the rabble-rousers on the Swiss right brand the EU as the "Fourth Reich" and one Green politician has waffled on about the "Empire of Evil". 

Expressing a common sense of purpose 

Europe is in fact made up of former enemies. When British Prime Minister John Major picks a fight with the European Union, his crisis team in London is immediately dubbed the "war cabinet", proving that the past is still close at hand. And yet wherever Europeans have finally come together, they now live in peace. New wars in Western Europe are virtually unthinkable and the Cold War is history. However, war and civil wars will remain a distinct possibility in Eastern Europe until its countries are able to join the European Union. The German Chancellor Helmut Kohl was right when he observed that ultimately the European question is still a question of war and peace. And just as Switzerland sees itself as a nation created by an act of will, there is in Europe a growing identity, both in the literal and in the figurative sense, that is also based on an effort of will. The vast majority of Europeans share an "identical" and hence "identity-forming" will to establish a peaceful, united Europe. What is at work here is a positive identity: the twin concepts of will and reason are very much European. No doubt the European Union will face many setbacks in future, but it will hardly sink to a point so low that disintegration could mean destabilisation and even lead to war. This is because of the workings of what the French call "le sens de l'histoire", in both senses of the term: Europe is moving in a certain "direction" and in so doing is giving itself a "purpose". Individuals object to having an identity foisted on them. 
Identity cannot be decreed from above by nation states or by the European Union, for it is something organic, which develops from small beginnings and either thrives or withers away. The EU is simply a powerful expression of a common sense of purpose shared by many Europeans, who, after centuries of war, have finally become aware of their responsibility for their own continent. A Europe of the nations may be the rallying-cry for some, but Europe is first and foremost a warning against the hubris of these same nations. "Verfassungspatriotismus", or loyalty to the constitution, is a familiar concept in Germany. Underpinning the European idea is a kind of "loyalty to peace", which, however, is now fading away fifty years after the end of the Second World War. As time goes by, the younger generation which was spared those horrors has less and less sense of purpose and, in this respect, resembles the directionless and disoriented Swiss, since they too escaped the heavy toll in human lives. The EU Member States were not far enough down the road to a common security policy to prevent the carnage following the break-up of Yugoslavia. If Europe had been up to the task, the question of identity would hardly be raised any more. Identity is also a matter of success. 

Competition between world regions 

Is success at all possible in an era of mass unemployment, where the virus of social disintegration infects everything which is not already geared to out-and-out economic warfare? Globalisation (internationalisation) threatens both national and European identities, as if one day the only remaining form of identification will be that of the worker with the mega-firm that employs him. Yet the EU is not perceived as a force for order and moderation which is striving (for example through monetary union) to control the forces of globalisation and, logically, to steer in the opposite direction, something the nation states have long been incapable of. 
On the contrary, the EU is seen, albeit unjustifiably in many cases, as one of the mainsprings of the globalisation process which is oppressing countless individuals. This provokes national resentment. National politicians heighten the mistrust by claiming for themselves the credit for all political successes and laying the blame for failures at the EU's door. However, Europe is not merely a scapegoat, but at the same time the exact opposite: the hopelessly overburdened standard-bearer of hope, which is bound to disappoint, because so many people would like it to disappoint. Europe acts as a blank screen on to which the Frenchman can project his yearning for "grandeur", the German his deep-seated need to belong, the Briton his uncompromising cries of "I want my money back", and the Eastern European his desire for stability and a guarantee of democracy, the rule of law and human rights. While we are on the subject of human rights: in the vast globalisation process now under way, the old European claim to universal values is rebounding on Europe itself. Now that our continent is no longer at the centre of world events, Europeans must face up to the competition of values and identities. Just as the Swiss always feel the urge to retreat into their little corner, many Europeans also tend to withdraw into themselves in order to protect their own egos. Yet if there is one single characteristic that defines Europe, it is that curious capacity for openness which our continent displays time and again, and which has contributed to the "infinite riches in a little room" that so delighted Marlowe. Europe has left its mark over the whole globe, but it has also proved to have a voracious appetite itself, being perfectly capable of absorbing influences from all over the world and positively devouring foreign ideas, without surrendering any of its own identity. However, globalisation unleashes the forces of homogenisation. 
It also throws open the question of the balance of power between continents. Must Europe, indeed can Europe, summon up the will to compete as a united force against other regions of the world? Since the passing of Charlemagne, the diversity of Europe has been ranged against the concept of a single European power. Our instinct is not to concentrate, but to divide, spread out and split up. Our logic is not that of a single centre, but of multiple centres. The concept of a "European nation", which is ultimately bound up with power politics, is a contradiction in terms. Balkanisation is the real danger. The European Union lies somewhere in between. For far too long, Europe has swung between Scylla and Charybdis, between the Reich and the nation. The EU does not fit into this pattern; it breaks the vicious circle. It is neither Reich nor nation, and hence truly modern. Perhaps European identity is actually to be found in the new and lasting phenomenon of networks, which was first developed by the generation of '68 and took off with the electronic revolution. In many ways the European Union is, and is at its best as, a network. What the Swiss fail to understand, as outsiders with little first-hand experience, is that the EU has something more important than its institutions: the network of connections, the day-to-day working relationships remote from diplomatic channels, the exchanges. And these exchanges give rise to the "manifold unity" which, according to Edgar Morin, is the life-blood of Europe. 

Identity is a process 

Our generation has experienced both the integration of Western Europe and the disintegration of Eastern Europe. In the West, the decades-long enthusiasm for the unification process, identification with the EU, has been somewhat dampened, particularly where closer union has degenerated into homogenisation. In the East, many people see Europe as providing an ersatz identity. 
This is just one of many examples showing that identity is not something static and does not always remain what it was. Identity is more of a process, and processes have driving forces, restraining forces and opposing forces. Identity always springs from contradictions and never becomes fully, and inhumanly, coherent. On the contrary, identity contains within it crisis in the original Greek sense of "krisis": decision. That is one of the reasons why the European Union often cuts a poor figure, just as the Swiss Confederation presented an unflattering picture for most of the 550 years before the founding of the Federal State: civil war, treachery, pacts with foreign powers, intrigue and ineffective parliaments. It is actually growth which prompts the outbreak of identity crises. In a brilliant essay for the literary supplement of the "Weltwoche", Adolf Muschg recently asked, "How much identity does Switzerland need?". Similar questions on the quantity and in particular the quality of identity could be asked about Europe. However, Muschg also went on to ask, "What is it that Switzerland still has to protect from Europe?" Perhaps the difference is that Europe is looking for a new identity, while Switzerland is trying not to lose its old one. 

What does it mean to be a European? Preliminary conclusions 
Jérôme Vignon 

From the very outset, at the preparatory meeting for the Coimbra Seminar, the historian Gilbert Trausch warned us that the task we faced was one fraught with difficulties and risks. "Though the search for a European identity is a classic exercise, indeed almost a commonplace for the social science disciplines, the quest for an identity specific to that very new arrival among the ranks of political animals, the European Union, is a much tougher proposition." In other words, to the historian's mind, the shaping of a collective identity is a long process, in contrast to the brief span of time occupied by the integration of Europe so far. 
Let there be no misunderstandings on that score. With this caveat ringing in its ears, the Coimbra Seminar proceeded to business. Advancing in stages, it started with what it means to be European as a general concept, then moved on to the challenges raised by the political unification of the European continent in the here and now. The discussion progressed by way of the idea of a "European project", which arose spontaneously as participants made their contributions. Around the central political necessity of 'the European project', four main themes emerged: legitimacy, necessity, the project and interactivity. 

Legitimacy 

Was it proper, for the proponents of an integrated Europe, to seek to mobilise the many facets of a European identity (history, culture, values and so on) to their own advantage, so as to construct some kind of political legitimacy for themselves? In so doing, were they not falling into a double trap? 
- A collective identity was the outcome of an approach which needed to be seen in context and in proportion. If it was supposed to appeal to "ordinary people", then it could only be from the standpoint of their particular perceptions and experiences where we stand now, at the end of the 20th century. 
- To seek to exploit the material traditionally used to forge national identities was to ignore the special qualities of openness and multiculturalism which are the marks of a truly European identity. 
José Vidal Beneyto disposed elegantly of these two posers. Reminding his listeners of the academic achievements chalked up by the sociology of knowledge, he stressed that there was no going back on what the experts now agreed on: "Like individual identities, collective identities exist de facto. It is not improper to refer to them, provided we recognise that the European identity evolves in step with whatever age we live in: it is a moving thing, not a thing established once and for all. 
And it goes much further than that: a collective European identity is bound to encompass not just variations but, especially, contradictions, contradictions which must be managed, and that is the job of politics. The purpose of a 'project' is just that: to reconcile contradictions, at the same time using the lessons we have learnt from the past and from a shared culture." 

Necessity 

The bond between the identity of the European Union and a common project is not something which has come about in a void, simply through the inspiration of a few founding fathers, or a historical accident. It also owes its being to necessity, and to the will to which necessity gives rise. Here, the Coimbra Seminar brought out a telling parallel between the 1950s and the 1990s. We are, in a sense, entitled to say that there was more to the setting up of a community of countries belonging to the Western European camp from the time of the Hague Conference onwards than a deliberate plan by the Fathers of Europe. This community of belonging also sprang up and developed under pressure from a political necessity, the necessity created by the East-West dispute. An economic integration process, one might say, was a way of responding to a geopolitical necessity, in which case the brainwave of the pioneers of European integration was to harness this economic vehicle to a prior objective which went much deeper: a plan for solidarity and reconciliation which went beyond the immediate geopolitical challenges. This was the sense in which Filippo Pandolfi was able to say that "it was only after 1989 that the full scope of the European project could be seen, its raison d'être, if you like." Marcelino Oreja reminded us that today it was economic constraints, bringing with them the nagging challenges of competitiveness, which were the driving forces in integration. 
The progress made from 1985 to 1991 led to a political leap, Economic and Monetary Union, which was itself reinforced by the geopolitical demands of enlargement. The Intergovernmental Conference now under way ought to graft onto this a collective project adapted to meet the challenges of the present day. To put it another way, in the 1990s as in the 1950s, the pressure of necessity created an opportunity for a new collective departure. If there was a secret behind the identity of a Political Union, it was that it should be capable of giving a generally accepted sense to the sweeping changes occurring in the European continent, over and above the geopolitical momentum behind them. 

The project 

What should such a project consist of, "now and for the future", if that shared sense was to unfold? What, in other words, was to be the telos, the ultimate objective? Are we not entitled to expect an answer to this question from those responsible for European integration, from those who govern, but also from the intellectual elite? 
- Some speakers stressed the importance of overhauling the European social model, threatened as it now was by its inability to reconcile opening out to the world with maintaining social cohesion (José Vidal Beneyto). Bonaventura Sousa Santos, in fact, proposed focusing our efforts back on restoring the State and the community once the other pillar of the European social model, the market, had outgrown itself. 
- Others wanted to go still further along the path of reshaping the model. Defining their stance in relation to the global challenges of the environment and population growth, they saw a contemporary European identity as an awareness of the urgent need for changes in lifestyles and patterns of consumption. Edy Korthals Altes, for example, saw it as a moral awareness with the capacity to answer questions about the meaning of life. 
The same global view of developments in Europe today would, in the eyes of Zaki Laïdi, seek to identify Europe with efforts to act as an effective mediator for the world. President Mario Soares went so far as to say that the world needed a Europe capable of translating the spirit of democracy, the only foundation it had at the present time, into acts of international solidarity. 
- Those who identified the European Union with a way of giving a deeper dimension to democracy alluded to a project which was as much a cultural as a political exercise. In the words of Massimo La Torre, it was a matter of establishing, by law, a genuine European citizenship. Freed of any ties to the prior possession of a particular nationality, it would be the seedbed of an identity linked directly to democratic ideals, a sort of constitutional patriotism in the pure state. For Claire Lejeune, the Political Union should be one in which the implicit subjection of women to men would have been overthrown. 
While invoking the urgent need for the European project to have a telos, those attending the Seminar stressed that the demos must be involved in the work of putting such a project together. In other words, to give expression to a European identity today meant embarking on a process of exchange, of listening and of interaction. 

Interactivity 

Warnings against the risk of overintellectualising came from intellectuals themselves. Heinrich Schneider pointed to the risk of totalitarianism lurking behind the concept of an avant-garde, if it were one enlightened not by reason but by a moral consciousness. Truls Frogner spoke of what the most deprived groups in Europe really expected in terms of jobs and unemployment. Maryon McDonald insisted on what made sense to people. This brought the meeting back, when it came to what it meant to be a European, to the sphere of "communicating", to "how to share, listen and receive", to "how to inspire and deserve trust". 
This was the point in the Seminar at which speakers' contributions became more specific and closer to the work being done by the European institutions. Under the subject heading of an interactive identity, four aspects were discussed: the institutions in the strict sense of the word; communication; new forms of mediation; and, lastly, the need to foster interaction between the Member States and the Union. 
1. Heinrich Schneider, a veteran of the battle for federalism, thought it was time to build something new out of the old federal mould. The institutions should be judged less against the yardstick of unity than on the basis of new criteria: whether the executive inspired confidence, whether joint action was effective, whether someone was visibly answerable for the exercise of power. It would have been hard to find a better definition of some of the challenges facing the IGC. 
2. In the view of Elemer Hankiss, who was Head of Hungarian Television from 1991 to 1992, what the European Commission needed to overhaul was not so much its messages (though these, he said, were still not getting across strongly enough in his country) as its methods. Opportunities for working out what European integration meant in the present day needed to be provided in the shape of hundreds of forums like the Coimbra Seminar, where intellectuals, people from cultural and scientific backgrounds and journalists would debate the underlying issue, the raison d'être which Filippo Pandolfi had referred to. One was reminded of Denis de Rougemont saying that the search for Europe was itself Europe. 
3. Many participants felt that the Commission did not allow enough space for mediation by associations acting as relays to develop, meaning the many hundreds of NGOs already structured into European networks which were capable of expressing the European sense of an operation carried out at local level, not to mention acting as the expression of a moral consciousness. 
Edy Korthals Altes spoke for them when he spoke of the practice of dialogue between religions at the European and Mediterranean levels. 
4. We should stop acting and talking as if the Union and the nations in it were in competition. Nations were part of what it meant to be European, Maryon McDonald maintained. Bearing in mind the immense symbolic challenges posed by a single currency, we should leave it up to the national apparatuses, with their huge capacity to influence and respond, to talk to European people about Europe. Nor should we forget that farmers, students, textile workers, bosses of small businesses, doctors, trade unionists and people in the publishing business experienced Europe in the first instance through their day-to-day occupations. 
When the debates were over, some self-criticism emerged. Perhaps our group had taken too much of a consensus view. Had it allowed enough space for the anti-Maastricht protest voice to be heard? Did it reflect the doubts and bewilderment in the minds of some grassroots voters? The unconscious temptation to preach to the converted was certainly there, and we should bear it in mind when later Seminars came up. But a Seminar on Science and Culture was not there to do the work of a parliament: what it aspired to do was to think matters through and go back over the experience of the past. In that sense, Coimbra was a great help to us. 

Annex: A dialogue on unemployment between Truls Frogner and his Neighbour 

You have not yet heard the trade union voice. Some people think that trade unions are fading away. Well, in Europe we have the ETUC, the European Trade Union Confederation, with member organisations from 33 countries after the enlargement eastwards last December. Now some 55 national organisations, representing more than 50 million members, come together in the ETUC to discuss and decide on common matters and then take care of our joint interests in the European Union and the European Economic Area (EEA). 
Do you know any other, more representative non-governmental European organisation? In the European Union's search for its identity, a trade union has a relevant message. In my context, to be in a union means to take care of each other, knowing that acting together may give better results for all than acting individually. Let me also add that in Norway, community has a more positive connotation than union, since my country, for many years, was the weaker part in unions with other countries. A union in Norway is also associated with foreign rule. In our discussions today, I have heard that the magic words "European identity" contain the concepts of diversity, legitimacy and transcendence. My neighbour in Norway does not understand this and seldom speaks of identity. But he lost his job some months ago, and I can see this is doing something to his identity. I told my neighbour last week that I was going to Coimbra to discuss the "European identity". 
-What is that? he said. 
-Well, we are supposed to find out, I replied. 
-Do you have to go to Coimbra to find that out? Why not here? 
-No, it is easier to see what you are from the outside. In Sweden, I feel Norwegian. In Brussels, I feel Scandinavian, and in Tokyo, I feel European. When I'm in a pub in Boston, I'm still in Europe. 
-I understand. As an unemployed person, I feel the importance of a job... 
-So, my friend, what is the European identity to you? 
-Nothing! Does it create jobs? 
-It depends... 
-What do you mean? Does it or does it not? 
-It creates peace. What kind of employment policy is possible in Bosnia? 
-Stop! The European Union did not prevent war in ex-Yugoslavia. 
-Agreed, but in the old days, local war spread through all of Europe. The European Union, together with NATO, made this impossible. 
-OK, peace is a natural thing now. War will not happen in Europe again. 
-Are you sure? 
-To be honest, no. I'm not sure of anything. Without a job, I don't know where I belong. 
How could I identify with the European Union if it does not create jobs? 
-The European Union made a report on "Growth, competitiveness and employment"... 
-Reports are not reality. The European Union is a marketplace. Growth and competitiveness yes, jobs no! 
-With 20 million unemployed in Europe, it seems you are right. On the other hand, the European Union may change its treaty and enshrine employment in it. 
-Interesting, but paragraphs don't create jobs. Moreover, national governments don't follow up. 
-Should the European Union be the scapegoat if national governments fail in their economic policy? 
-I admit you have a point. Moreover, unemployment is high outside the European Union, too. Except in Norway, where it is 4% and the inflation rate is below 1%. But still, these positive figures don't help me. 
-We take you seriously. Within a short time, you will be offered a job, a labour market (professional training?) course or another active alternative. And this is not mainly thanks to oil and gas, but to our social model and cooperation for employment. 
-Why can't the European Union do the same? Isn't cooperation a part of what you call the European identity? 
-Good question. Maybe because... eh... maybe... 
-Well, Truls, come on! 
-I'm not really sure why the European Union has not used its potential. 
-Can't you ask them in Coimbra? 
-I will. 
-Do you know what I think? I think the European Union pays too little attention to the social dimension and too much to economic matters, or they have too narrow a concept of economy. 
-Yes and no. Where else in the world will you find such close relations between the social partners and politicians? 
-Now you're talking me around again. It doesn't help me if you, on the one hand, speak of a fine European social model in a global context, and on the other hand, you have welfare cutbacks and rising unemployment. 
-It is a part of the European political identity to say one thing and do something else. 
-Ah! 
Now I know what the European identity is : contradiction over unity. -It's true, but it could also be unity over contradiction. -Please tell me, Truls, why should I o being unemployed o identify with the European Union ? -The answer is both simple and complicated ; at one and the same time, the European Union identifies with you and with 20 million more people without jobs. -In that case, I will wait and see. -Oh no, this time I will challenge you. Why should you wait to see what the community can do for you ? Shouldn't you also ask yourself what you can do for the community ? -Hmm... let's make a deal. I will, in spite of unemployment and a poor private economy, keep my trade union membership and join the European Movement. But you should take an initiative to strengthen the European Union with what is important to my identity o employment. In practice ! Not only in fine words. -Agreed. You have a deal. -Not quite. Only a temporary deal. -Of course. Europe is not finished yet. Identity is something moving and invisible o an Unidentified Flying Object ! List of contributors Tom Bryder, Senior Research Fellow, Institute of Political Science, University of Copenhagen Truls Forgner, Director of Political Affairs, Federation of Professional Associations in Norway, Oslo Thomas Jansen, Adviser, Forward Studies Unit, European Commission, Brussels Ingmar Karlsson, Ambassador and Head of Policy Planning Unit, Swedish Ministry for Foreign Affais, Stockholm Edy Korthals Altes, former Ambassador of the Netherlands; President, World Conference on Religion and Peace (WCPR), New York Claire Lejeune, Poet; Secretary General of the Interdisciplinary Centre for Philosophical Studies at the University of Mons-Hainaut, Cl?phum, Belgique Maryon McDonald, Appointed Senior Fellow, Department of Social Anthropology, Cambridge University, Cambridge. 
Adriano Moreira, former Minister; Professor, Technical University of Lisbon
Heinrich Schneider, Professor Emeritus, University of Vienna
Mario Soares, former President of Portugal
Rüdiger Stephan, Secretary General of the European Cultural Foundation in Amsterdam
Massimo La Torre, Professor, Department of Law, European University Institute, Florence
Gilbert Trausch, Professor Emeritus, University of Liège
Jérôme Vignon, former Director of the Forward Studies Unit, European Commission, Brussels (1989-1998); Director for Strategy, Délégation à l'Aménagement du Territoire et à l'Action Régionale (DATAR), Paris
Roger de Weck, Editor of "Tages-Anzeiger", Zürich

From checker at panix.com Fri Jan 13 16:53:26 2006
From: checker at panix.com (Premise Checker)
Date: Fri, 13 Jan 2006 11:53:26 -0500 (EST)
Subject: [Paleopsych] Technology Review: The Internet Is Broken
Message-ID:

The Internet Is Broken
http://www.technologyreview.com/infotech/wtr_16051,258,p1.html

[This is confusing. There are three parts to the article. First is part 1, which is fine. Then comes part 2, with some feedback that is not in the dead-tree version. Then part 3. But the dead-tree version had additional paragraphs. I typed in the URL above, changing it to p4. It consisted of yet more comments. Changing it to p5 results in the very same additional comments. I follow it all with a short article referenced in the main one, "Click, 'Oh Yeah?'"]

Monday, December 19, 2005

The Net's basic flaws cost firms billions, impede innovation, and threaten national security. It's time for a clean-slate approach, says MIT's David D. Clark.

By David Talbot

In his office within the gleaming-stainless-steel and orange-brick jumble of MIT's Stata Center, Internet elder statesman and onetime chief protocol architect David D. Clark prints out an old PowerPoint talk. Dated July 1992, it ranges over technical issues like domain naming and scalability.
But in one slide, Clark points to the Internet's dark side: its lack of built-in security. In others, he observes that sometimes the worst disasters are caused not by sudden events but by slow, incremental processes -- and that humans are good at ignoring problems. "Things get worse slowly. People adjust," Clark noted in his presentation. "The problem is assigning the correct degree of fear to distant elephants."

[Graphic: David D. Clark's four goals for a new Internet architecture.]

Today, Clark believes the elephants are upon us. Yes, the Internet has wrought wonders: e-commerce has flourished, and e-mail has become a ubiquitous means of communication. Almost one billion people now use the Internet, and critical industries like banking increasingly rely on it. At the same time, the Internet's shortcomings have resulted in plunging security and a decreased ability to accommodate new technologies. "We are at an inflection point, a revolution point," Clark now argues. And he delivers a strikingly pessimistic assessment of where the Internet will end up without dramatic intervention. "We might just be at the point where the utility of the Internet stalls -- and perhaps turns downward."

Indeed, for the average user, the Internet these days all too often resembles New York's Times Square in the 1980s. It was exciting and vibrant, but you made sure to keep your head down, lest you be offered drugs, robbed, or harangued by the insane. Times Square has been cleaned up, but the Internet keeps getting worse, both at the user's level, and -- in the view of Clark and others -- deep within its architecture. Over the years, as Internet applications proliferated -- wireless devices, peer-to-peer file-sharing, telephony -- companies and network engineers came up with ingenious and expedient patches, plugs, and workarounds. The result is that the originally simple communications technology has become a complex and convoluted affair.
For all of the Internet's wonders, it is also difficult to manage and more fragile with each passing day. That's why Clark argues that it's time to rethink the Internet's basic architecture, to potentially start over with a fresh design -- and equally important, with a plausible strategy for proving the design's viability, so that it stands a chance of implementation. "It's not as if there is some killer technology at the protocol or network level that we somehow failed to include," says Clark. "We need to take all the technologies we already know and fit them together so that we get a different overall system. This is not about building a technology innovation that changes the world but about architecture -- pulling the pieces together in a different way to achieve high-level objectives."

Just such an approach is now gaining momentum, spurred on by the National Science Foundation. NSF managers are working to forge a five-to-seven-year plan estimated to cost $200 million to $300 million in research funding to develop clean-slate architectures that provide security, accommodate new technologies, and are easier to manage. They also hope to develop an infrastructure that can be used to prove that the new system is really better than the current one. "If we succeed in what we are trying to do, this is bigger than anything we, as a research community, have done in computer science so far," says Guru Parulkar, an NSF program manager involved with the effort. "In terms of its mission and vision, it is a very big deal. But now we are just at the beginning. It has the potential to change the game. It could take it to the next level in realizing what the Internet could be that has not been possible because of the challenges and problems."

http://www.technologyreview.com/InfoTech/wtr_16051,258,p2.html

Firewall Nation

By David Talbot

When AOL updates its software, the new version bears a number: 7.0, 8.0, 9.0. The most recent version is called AOL 9.0 Security Edition. These days, improving the utility of the Internet is not so much about delivering the latest cool application; it's about survival. In August, IBM released a study reporting that "virus-laden e-mails and criminal driven security attacks" leapt by 50 percent in the first half of 2005, with government and the financial-services, manufacturing, and health-care industries in the crosshairs. In July, the Pew Internet and American Life Project reported that 43 percent of U.S. Internet users -- 59 million adults -- reported having spyware or adware on their computers, thanks merely to visiting websites. (In many cases, they learned this from the sudden proliferation of error messages or freeze-ups.) Fully 91 percent had adopted some defensive behavior -- avoiding certain kinds of websites, say, or not downloading software. "Go to a neighborhood bar, and people are talking about firewalls. That was just not true three years ago," says Susannah Fox, associate director of the Pew project.

Then there is spam. One leading online security company, Symantec, says that between July 1 and December 31, 2004, spam surged 77 percent at companies that Symantec monitored. The raw numbers are staggering: weekly spam totals on average rose from 800 million to more than 1.2 billion messages, and 60 percent of all e-mail was spam, according to Symantec. But perhaps most menacing of all are "botnets" -- collections of computers hijacked by hackers to do remote-control tasks like sending spam or attacking websites. This kind of wholesale hijacking -- made more potent by wide adoption of always-on broadband connections -- has spawned hard-core crime: digital extortion.
Hackers are threatening destructive attacks against companies that don't meet their financial demands. According to a study by a Carnegie Mellon University researcher, 17 of 100 companies surveyed had been threatened with such attacks. Simply put, the Internet has no inherent security architecture -- nothing to stop viruses or spam or anything else. Protections like firewalls and antispam software are add-ons, security patches in a digital arms race. The President's Information Technology Advisory Committee, a group stocked with a who's who of infotech CEOs and academic researchers, says the situation is bad and getting worse. "Today, the threat clearly is growing," the council wrote in a report issued in early 2005. "Most indicators and studies of the frequency, impact, scope, and cost of cyber security incidents -- among both organizations and individuals -- point to continuously increasing levels and varieties of attacks." And we haven't even seen a real act of cyberterror, the "digital Pearl Harbor" memorably predicted by former White House counterterrorism czar Richard Clarke in 2000 (see "[35]A Tangle of Wires"). Consider the nation's electrical grid: it relies on continuous network-based communications between power plants and grid managers to maintain a balance between production and demand. A well-placed attack could trigger a costly blackout that would cripple part of the country. The conclusion of the advisory council's report could not have been starker: "The IT infrastructure is highly vulnerable to premeditated attacks with potentially catastrophic effects." The system functions as well as it does only because of "the forbearance of the virus authors themselves," says Jonathan Zittrain, who cofounded the Berkman Center for Internet and Society at Harvard Law School and holds the Chair in Internet Governance and Regulation at the University of Oxford. 
"With one or two additional lines of code...the viruses could wipe their hosts' hard drives clean or quietly insinuate false data into spreadsheets or documents. Take any of the top ten viruses and add a bit of poison to them, and most of the world wakes up on a Tuesday morning unable to surf the Net -- or finding much less there if it can."

Discuss

Hogwash
by artMonster, 12/19/2005 11:43:06 AM
The internet is not broken; M.S. Windows is. The issue of unwanted email (spam) warrants some changes in the underlying structure, but the other problems are really OS problems, and Windows bears the brunt of responsibility for this. Major structural changes to how the internet works would be unwise, and would probably open up more control by either the government or Microsoft. Neither is desirable or beneficial for the end user. So who really benefits from this FUD about the internet being broken? Not too difficult to figure out...

Spam proliferation
by Bellinghamster, 12/19/2005 4:52:05 PM
Despite my ISP's efforts to filter emailed spam, my inbasket is typically less than one-quarter legitimate message traffic. But purging spam isn't my greatest inefficiency. The time I spend maintaining firewall, virus and malware software is the truly significant inefficiency.

New protocols -- we don't use current ones!
by Matej, 12/19/2005 9:11:11 PM
Hi, when this article was mentioned on "The World" (WGBH), they mentioned that NSF is planning to release $300M for "development of new protocols which would make Internet safe" (and another $300M later for implementation). Why in the world do we need other protocols when we are not using the current ones? My Linux here has support for IPv6, S/MIME, etc. etc., but no one in the world uses them, because the problem with an unsafe Internet is not in the technology, but in organizational and social problems (like how to make everybody identifiable over the Internet, when the US public doesn't want to be identified in the first place). Matej

Great sales pitch
by Mike, 12/20/2005 1:30:05 AM
Isn't one of the best ways to get someone to spend money to instill fear? Some people would argue that's how congress is duped into appropriating funds -- How close is Cambridge to DC? :-) If they want to spend $200M, send it my way and I'll demonstrate a cool solution to make it easier to deploy new web-based services, to any device, saving major corporations Billions in the process. Cheers!

The Internet is in need of repair
by Owen N. Martinez, 12/20/2005 5:47:24 AM
Like any system, the Internet needs to be tuned up or repaired as things get out of control. Who is qualified to determine what to do, and who should control the system? Preferably the same entity, or two very close ones, that have the confidence of the majority of the users. The US government need not apply.

hogwash
by Si, 12/20/2005 4:31:01 AM
I'm a day late on this and notice that artMonster has hit it perfectly. Big brother wants control. I would hate to think what the internet would be like if they redesigned it along the lines suggested.

Hogwash
by Fergus Doyle, 12/20/2005 5:39:59 AM
I agree with the other two guys here: the problems are down to MS software -- specifically that MS cannot/will not keep up with changing circumstances by releasing SW. I have no spyware on my (Windows) system and no viruses. E.g., use Firefox, not Internet Explorer; use Thunderbird, not Outlook Express; and most of your problems with Windows are solved. Use Linux and you don't even have to worry this much.

It's the infrastructure that needs changing
by E Feustel, 12/20/2005 6:30:49 AM
It's the routers and the protocols that need changing to permit secure, higher-speed operation, including authentication of the traffic on the net -- no more fake IP addresses, and if the packet says that X sent it, then X did actually send it. No more DNS hacking -- if you ask for X's address, you get X's address, not Y's.
And you get it with the minimum computation, in a reliable manner, even with pieces of the net going down.

Hogwash indeed.
by mrxsmb, 12/28/2005 4:30:12 AM
Although hopefully grown-ups don't need more of an alert than "PowerPoint presentation" and "$400 million research funding" in close proximity to know that. The issues highlighted with MS [the debilitating Operating System, not the debilitating Physical Affliction] and its usability-over-functionality approach are all valid, but other OSs and applications have their own issues. Of course business could actually pony up the money to build their own networks and not use the internet, but then how would that save them money? I believe some already do, as do Governments, and sensibly so. One bank in Australia has actually got with the program and realised they should issue their on-line banking customers with a swipe-and-PIN security system, the same as on an ATM, at each and every house. How much of the "problems" discussed would be solved by this simple change in attitude?

future of the internet
by p, 12/20/2005 8:31:26 AM
The network (as opposed to the endpoints) doesn't need major new security features. I admit larger TCP ISNs would be good, and SMTP should have a way to reject mail per-user after the mail server has read all of it. Apart from that, what you need is security in execution environments (where some of those EEs are OSs and some are browsers etc.). This is one of several similar approaches -- it's no longer adequate to let a program do anything it chooses. The programs can't be trusted while handling suspect data. This is a different threat model from most computer security work historically. http://www.google.co.uk/url?sa=U&start=5&q=http://www.cs.columbia.edu/~smb/papers/subos.pdf&e=42 Extensions to existing OS s/w are effective at providing this kind of security. http://whitepapers.zdnet.co.uk/0,39025945,60150583p-39000584q,00.htm

Hogwash Support
by Dr Hacker, 12/20/2005 10:35:07 AM
artMonster is right on. The royalists from MaBell refuse to give up their 100+ year monopoly. I say give it up and become Americans instead of British-like thugs. We don't want another 1776, but it looks like we may need one!

Designers did it
by Sundararajan Srinivasan, 12/28/2005 5:47:34 AM
Some of the internet bugs we have now have nothing to do with the OS. They come from the way the internet was designed. For instance, SMTP does not provide authentication by default. I can pose as bill.gates at microsoft.com with an SMTP server, without any problem. This is because SMTP does not check the "from" address. The solution can be the use of digital signatures. The Internet and all the related protocols could have been designed to be more secure. But then it would not have become as popular as it is now. That is why we are now paying security experts to build layers of security.

Impact of Emerging Technologies: The Internet Is Broken -- Part 3
http://www.technologyreview.com/InfoTech/wtr_16056,258,p1.html

Wednesday, December 21, 2005

The Internet Is Broken -- Part 3

Researchers are working to make the Internet smarter -- but that could make it even slower, warn experts like Google's Vinton Cerf.

By David Talbot

This article -- the cover story in Technology Review's December-January print issue -- was divided into three parts for presentation online. This is part 3; [34]part 1 appeared on December 19 and [35]part 2 on December 20. In part 1, we argued (with the help of one of the Internet's "elder statesmen," MIT's David D. Clark) that the Internet has become a vast patchwork of firewalls, antispam programs, and software add-ons, with no overall security plan. Part 2 dealt with how we might design a far-reaching new Web architecture, with, for instance, software that detects and reports emerging problems and authenticates users.
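One of the comments above notes that SMTP accepts any "from" address and names digital signatures as the remedy. A minimal sketch of that verify-before-trust idea, using Python's standard-library hmac with a shared secret as a stand-in (a real deployment would use asymmetric, public-key signatures, as in DKIM-style mail signing; the secret and addresses here are hypothetical):

```python
import hashlib
import hmac

# Hypothetical shared secret; a public-key scheme would avoid having to share it.
SECRET = b"example-secret-key"

def sign(message: bytes) -> str:
    """Compute an authentication tag binding the sender's key to the message."""
    return hmac.new(SECRET, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Accept the message only if the tag matches; a forged 'from' line fails."""
    return hmac.compare_digest(sign(message), tag)

msg = b"From: bill.gates@example.com\r\n\r\nHello"
tag = sign(msg)
assert verify(msg, tag)                                            # genuine message passes
assert not verify(b"From: forged@example.com\r\n\r\nHello", tag)   # spoofed header fails
```

Unlike SMTP's unchecked "from" field, any change to the signed headers or body invalidates the tag, so a receiver can reject the forgery automatically.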
In this third part, we examine differing views on how to deal with weaknesses in the Internet, ranging from an effort at the National Science Foundation to launch a $300 million research program on future Internet architectures to concerns that "smarter" networks will be more complicated and therefore error-prone.

The Devil We Know

It's worth remembering that despite all of its flaws, all of its architectural kluginess and insecurity and the costs associated with patching it, the Internet still gets the job done. Any effort to implement a better version faces enormous practical problems: all Internet service providers would have to agree to change all their routers and software, and someone would have to foot the bill, which will likely come to many billions of dollars. But NSF isn't proposing to abandon the old network or to forcibly impose something new on the world. Rather, it essentially wants to build a better mousetrap, show that it's better, and allow a changeover to take place in response to user demand.

To that end, the NSF effort envisions the construction of a sprawling infrastructure that could cost approximately $300 million. It would include research labs across the United States and perhaps link with research efforts abroad, where new architectures can be given a full workout. With a high-speed optical backbone and smart routers, this test bed would be far more elaborate and representative than the smaller, more limited test beds in use today. The idea is that new architectures would be battle tested with real-world Internet traffic. "You hope that provides enough value added that people are slowly and selectively willing to switch, and maybe it gets enough traction that people will switch over," Parulkar says. But he acknowledges, "Ten years from now, how things play out is anyone's guess. It could be a parallel infrastructure that people could use for selective applications."

[Graphic: David D. Clark's four goals for a new Internet architecture.]

Still, skeptics claim that a smarter network could be even more complicated and thus failure-prone than the original bare-bones Internet. Conventional wisdom holds that the network should remain dumb, but that the smart devices at its ends should become smarter. "I'm not happy with the current state of affairs. I'm not happy with spam; I'm not happy with the amount of vulnerability to various forms of attack," says Vinton Cerf, one of the inventors of the Internet's basic protocols, who recently joined Google with a job title created just for him: chief Internet evangelist. "I do want to distinguish that the primary vectors causing a lot of trouble are penetrating holes in operating systems. It's more like the operating systems don't protect themselves very well. An argument could be made, 'Why does the network have to do that?'"

According to Cerf, the more you ask the network to examine data -- to authenticate a person's identity, say, or search for viruses -- the less efficiently it will move the data around. "It's really hard to have a network-level thing do this stuff, which means you have to assemble the packets into something bigger and thus violate all the protocols," Cerf says. "That takes a heck of a lot of resources." Still, Cerf sees value in the new NSF initiative. "If Dave Clark...sees some notions and ideas that would be dramatically better than what we have, I think that's important and healthy," Cerf says. "I sort of wonder about something, though. The collapse of the Net, or a major security disaster, has been predicted for a decade now." And of course no such disaster has occurred -- at least not by the time this issue of Technology Review went to press.
The Impact of Emerging Technologies: The Internet Is Broken -- Part 3
http://www.technologyreview.com/InfoTech/wtr_16056,258,p4.html

Discuss

slowing down of Internet
by H.M. Hubey, 12/21/2005 10:56:22 AM
Long shift registers (multiple streams if needed for speed) at routers to catch worms, viruses, Trojan horses, etc. will not slow down the Internet. The bits will be XORed as they speed along at their normal speed. The other end of the XOR will be registers that can be loaded with bit-images of unwanted pgms (e.g. viruses, etc). It will be a combination of HW and SW. Since it will be expensive, it will be best to implement at the routers. If the routers "surrounding" a country known for spamming can catch these, it will be harder for this kind of SW to spread all over the Internet. In effect, one can quarantine a country so that spam and viruses do not infect the rest of the Internet.

new internet?
by Erik Karl Sorgatz, 12/21/2005 1:02:00 PM
If all the spam and cookies, virus and worm code were cut, we'd have 50% more bandwidth! Then a little blacklist to keep the spammers from gaining access after a 3rd strike, and we might find that the existing internet is fairly responsive. Tax it? Nah. Regulate it? Yes; perhaps put all the porn garbage on its own backbone, with its own domain, and start fresh. It might be a good idea if the college kids were only allowed read-only access to USENET for the first six months too. The commercial interests should be blocking the known spam-friendly domains, and the pill-vendors could be held responsible for their commercial spams too. It's a slippery slope, but the end user shouldn't be required to support the scum that perpetrate scams and spam.

Long shift registers in routers
by Jesse, 12/27/2005 5:53:19 PM
Will not work.
1. You don't always have access to the contents (encrypted).
2. You don't always have access to the entire message (incomplete messages).
3. You don't even necessarily have access to the entire packet (out-of-order fragment delivery).
Check the Security Focus web site, and read the white paper on router hacking... You just CANNOT validate the contents at routers.

The Internet is Broken
by Grant Callaghan, 12/21/2005 11:09:25 AM
It's all software -- even the hardware -- and the only question seems to be, "Where do we put the fixes?" I think they belong at the end of the process rather than at the beginning or in the middle. Charging a small amount per message would cut down on the spam, say a fraction of a penny, and it would generate enough money to police the system, free up bandwidth and catch bad hackers simply because the volume of traffic is so large. The only danger I see to this is that the government tends to want to feed its cash cows with ever larger increases in taxation of any kind. If you let them start taxing the internet, there will be no end to it.

Encryption?
by Aaron, 12/21/2005 12:52:35 PM
I think it is odd that an article about the future of the internet makes no mention of encryption. Public key encryption, the ability to know who is saying what, has existed for longer than I have been alive. It also seems that a lot of the original ideas that made the internet popular, decentralization and anonymous communication, are lost on its current inhabitants. My mother could care less that emails from me are signed; she just wants less spam in her mailbox.

Interesting idea about access charges
by Dmitry Afanasiev, 12/26/2005 6:34:07 AM
http://blog.tomevslin.com/2005/01/voip_spam_and_a.html
Here access means access to the user. Obviously, this needs sender authentication, automatic charging or balance verification, and probably some sort of rule-based message cost negotiation (e.g. I want to deliver this message, but only if this costs me less than $xy.z). But it makes a lot of sense, since (thanks to Moore's law) human time and attention are now the most scarce and expensive resources on the Net.

Email postage, not so good
by B. Curtis, 12/21/2005 1:04:06 PM
Although it seems simple, Mr. Callaghan's concept of a small fee per email is no good in reality. It would equally penalize legitimate mass-email systems (newsletters, discussion lists, etc.) as well as spammers. E.g., there has been talk about sending tsunami warnings to people's cell phones via email; I'd hardly want to charge the organization millions of dollars right when they're trying to save my life. If the postage were optional (the recipient chooses if the sender pays), then you're talking about needing to positively identify both sender and receiver of an email, which amounts to SSL in every home. Some have posited using a difficult puzzle to extract a "cost" of sending emails; even though no real money is involved, the same counter-arguments apply. No, postage on email is just one of those fun ideas that just won't work.

Tariffing email
by Jim Hayes, 12/21/2005 1:54:45 PM
B. Curtis seems not to be aware that postage of about $0.20 per letter, and who knows what per catalog, does not keep my mailbox at the end of my driveway from getting filled with junk mail on paper, especially in the last month. Legit emailers would gladly pay a penny per email to interested recipients, while spammers sending out tens of millions of messages a day to random addresses -- many of whom seem to illegally use some of my email addresses as return addresses, by the way -- would be put to rest. By law, 911 calls are toll-free. The issue of billing is easy: include 1000 emails per month in an account from an ISP, so only the excess is billed, and few users will even need to be billed. BTW, I do know companies who have limited access to the Internet for employees because of overloads of viruses and spam, as well as abuses in downloading inappropriate material -- I fired an employee myself for storing his downloaded porn on a company computer.

Parallel Internet
by Khushnood Naqvi, 12/28/2005 3:27:48 AM
The idea of having a parallel Internet is good. The parallel Internet can be implemented on the next generation of protocols -- all with authentication (through digital certificates) and the like. And it will have no spam. Commercial sites would perhaps like to have a presence on the more secure Internet. Users also won't mind connecting to a different Internet for things like banking, or any business transaction for that matter. Even if users have to pay a slightly higher amount for that one, it will be a success. But the only problem I see with that one is that the Internet in the current form will be abandoned and so become more hazardous for people who continue to rely on this one.

Press Re-Start button
by 666, 12/21/2005 3:04:41 PM
The core problem is that the Internet, like its underlying software, is becoming legacy and is an institution. The problem with all software is that underlying software is hard and unmaintainable instead of being soft and flexible. This will be rectified by my chosen acronym.

Security vs privacy
by Jose I. Icaza, 12/23/2005 9:40:04 PM
Can we trust a government (NSF) initiative to design a more secure internet that nevertheless makes government and corporate tracking of individual users and their data at least as difficult as the present internet?

The Impact of Emerging Technologies: Click "Oh yeah?"
http://www.technologyreview.com/InfoTech/wtr_15999,258,p1.html

Dec. 2005/Jan. 2006

How the Web's inventor viewed security issues a decade ago.

By Katherine Bourzac

As part of a larger proposed effort to rethink the Internet's architecture (see "[27]The Internet Is Broken"), Internet elders such as MIT's David D.
Clark argue that authentication -- verification of the identity of a person or organization you are communicating with -- should be part of the basic architecture of a new Internet. Authentication technologies could, for example, make it possible to determine if an e-mail asking for account information was really from your bank, and not from a scam artist trying to steal your money. Back in 1996, as the popularity of the World Wide Web was burgeoning, Tim Berners-Lee, the Web's inventor, was already thinking about authentication. In an article published in July of that year, Technology Review spoke with him about his creation. The talk was wide ranging; Berners-Lee described having to convince people to put information on the Web in its early years and expressed surprise at people's tolerance for typing code. But he also addressed complaints about the Web's reliability and safety. He proposed a simple authentication tool -- a browser button labeled "Oh, yeah?" that would verify identities -- and suggested that Web surfers take responsibility for avoiding junk information online. Two responses are excerpted here. >From Technology Review, July 1996: TR: The Web has a reputation in some quarters as more sizzle than steak -- you hear people complain that there's no way of judging the authenticity or reliability of the information they find there. What would you do about this? Berners-Lee: People will have to learn who they can trust on the Web. One way to do this is to put what I call an "Oh, yeah?" button on the browser. Say you're going into uncharted territory on the Web and you find some piece of information that is critical to the decision you're going to make, but you're not confident that the source of the information is who it is claimed to be. You should be able to click on "Oh, yeah?" 
and the browser program would tell the server computer to get some authentication -- by comparing encrypted digital signatures, for example -- that the document was in fact generated by its claimed author. The server could then present you with an argument as to why you might believe this document or why you might not. ...Another common gripe is that the Web is drowning in banal and useless material. After a while, some people get fed up and stop bothering with it. To people who complain that they have been reading junk, I suggest they think about how they got there. A link implies things about quality. A link from a quality source will generally be only to other quality documents. A link to a low-quality document reduces the effective quality of the source document. The lesson for people who create Web documents is that the links are just as important as the other content because that is how you give quality to the people who read your article. That's how paper publications establish their credibility -- they get their information from credible sources....You don't go down the street, after all, picking up every piece of paper blowing in the breeze. If you find that a search engine gives you garbage, don't use it. From checker at panix.com Fri Jan 13 16:53:58 2006 From: checker at panix.com (Premise Checker) Date: Fri, 13 Jan 2006 11:53:58 -0500 (EST) Subject: [Paleopsych] Sigma Xi: Naming Names Message-ID: Naming Names http://www.americanscientist.org/template/AssetDetail/assetid/39138?&print=yes [Best to click on the URL.] COMPUTING SCIENCE [31]Brian Hayes Adam's only chore in the Garden of Eden was naming the beasts and birds. The book of Genesis doesn't tell us whether he found this task difficult or burdensome, but today the need to name and number things has become a major nuisance. When you try to choose a name for a new Internet domain or an e-mail account, you're likely to discover that your first choice was taken long ago.
One Internet service tells me the name "brian" is unavailable and suggests "brian13311" as an alternative. Perhaps I should think of this appellation in the same category as Louis the 18th or John the 23rd, but being Brian the 13,311th seems a dubious distinction. The challenge of inventing original names is particularly acute when the name has to fit into a format that allows only a finite number of possibilities. For example, the ticker symbols that identify securities on the New York Stock Exchange can be no more than three characters long, and only the 26 letters of the English alphabet are allowed. The scheme imposes an upper limit of 18,278 symbols. If the day ever comes that 18,279 companies want to be listed on the exchange, the format will have to be expanded. And long before that absolute limit is reached, companies could have a hard time finding a symbol that bears any resemblance to the company name. [33]Constraints on the size... It's not just names that are scarce; we're even running out of numbers. A few years ago telephone numbers were in short supply, and so were the numbers that identify computers on the Internet. Those crises have abated, but now attention has turned to the Universal Product Code, the basis of the barcode labels found on virtually everything sold in the United States and Canada. It seems the universe has more products than the UPC has code numbers. For that reason and others, the 12-digit UPC standard is being supplanted by a 13-digit code, with provisions for adding a 14th digit. The "sunrise" date for this transition is January 1, 2005. The old 12-digit codes will continue to be recognized, so you may not notice an immediate change on product labels, but every supermarket and drug store has had to modify its database software to accommodate the extra digits. Some commentators have drawn parallels with the year 2000 rollover, when software had to be patched to deal with four-digit year numbers. 
That event was a fizzle, anxiously anticipated but with little real disruption on January 1, 2000. This time there has been little advance publicity, so perhaps we should brace for turmoil in the checkout line. Finishing Adam's Job Names and numbers were causing trouble long before the Internet age. Biology had a naming crisis in the 17th and 18th centuries. The problem wasn't so much a shortage of names but a surfeit of them: Plants and animals were known by many different names in different places. Then came the great reform of Carolus Linnaeus and his system of Latin binomials, identifying each organism by genus and species. The new scheme revolutionized taxonomy, not because there is any magic in Latin or in two-part names but because Linnaeus and his followers labored to preserve a strict one-to-one mapping between names and organisms. Official codes of nomenclature continue to enforce this rule--one name, one species--although rooting out synonyms and homonyms is a constant struggle. Linnaeus himself named some 6,000 species, and by now the number of living things in the biological literature is approaching two million. But there could be another 10 million species--or, who knows, even 100 million--yet to be catalogued. Might we run out of names before all the species are described? If we were to insist that every binomial consist of two real Latin words--words known to the Romans--then perhaps there might be trouble ahead. But in practice Linnaean names only have to look like Latin, and the only limit on their proliferation is the ingenuity of the biologist. A dictionary of classical Latin will not help you understand the terms Nerocila and Conilera, which designate two genera of isopods; more helpful is knowing that the biologist who invented the terms was fond of someone named Caroline. Among all the sciences, the one with the most remarkable system of nomenclature is organic chemistry. 
Names in most other realms are opaque labels, which identify a concept or object but tell you little about it. For most of us, a Linnaean name such as Upupa epops doesn't even reveal whether the organism is animal or vegetable (this one's a bird). In contrast, the full name of an organic compound specifies the structure of the molecule in great detail. "1,1-dichloro-2,2-difluoro-ethane" is a prescription for drawing a picture of a Freon molecule. The mapping from name to structural diagram is so direct that it can be done by a computer program. The reverse transformation, from diagram to name, is trickier; in other words, it's easier to draw the diagram from the name than to recover the name from the diagram. Exhausting the supply of names for organic compounds is not something we need to worry about: By the very nature of the notational system, there is a name for every molecule. On the other hand, the names can get so long and intricate that only a computer can parse them. Namespace Although difficulties with names are nothing new, the nature of name-giving changed with the introduction of computer technology. First, there is greater emphasis now on making names uniform and unique. Second, many names and identifying numbers must conform to a rigid format, with a specified number of letters or digits drawn from a fixed alphabet. Place names--and abbreviations for them--offer a good example of how names have changed. In the old days, a letter from overseas addressed to the "U.S." or the "U.S.A." or even the "EE.UU." would stand a chance of being delivered, but e-mail for the corresponding geographic domain must have the exact designation "US"; no variation is tolerated (except that upper case and lower case are not distinguished). The list of acceptable country codes for Internet addresses is maintained by the Internet Assigned Numbers Authority (IANA). Each code consists of exactly two characters, drawn from an alphabet of 26 letters.
Thus the number of available codes--the total namespace--is 26 x 26, or 676. The current IANA list has 247 entries, so the filling factor--the fraction of the space that's occupied--is 0.365. That leaves room for growth if a few more nations decide to deconstruct themselves the way Yugoslavia and the Soviet Union did. But not every nation can get its first-choice code. Consider the case of the Åland Islands, which, according to the Web site www.aland.fi, "form an autonomous, demilitarized and unilingually Swedish province of Finland." The islands are sufficiently autonomous to have persuaded IANA to issue them a country code of their own--but which code? Perhaps the first choice would have been AL, but Albania already had that one. Or maybe AI, if Anguilla hadn't claimed it. Why isn't Anguilla AN? Because that's the code for the Netherlands Antilles (which might have been NA if it weren't for Namibia). The preemption of AN also leads to less-than-obvious assignments for Andorra, Angola, Antigua and even Antarctica. In the end, the Ålanders have wound up with the code AX (although, as the address www.aland.fi indicates, not everyone uses it). There is more to say about the difficulty of finding an unused name as a namespace fills up. But first some more examples of finite namespaces: Stock market ticker symbols. Ticker symbols began as telegraphers' informal shorthand, but today they are registered with the various exchanges. The New York Stock Exchange and the American Stock Exchange share a namespace; no symbol is allowed to have a different meaning in the two markets. Ignoring certain minutiae, the symbols consist of one, two or three letters; thus the size of the namespace is 26^3+26^2+26=18,278. The listing I consulted (at www.commerce-database.com) had 3,926 active symbols, for a filling factor of about 0.22. Stocks traded on the NASDAQ market use four-letter symbols.
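All of the alphabetic namespace counts quoted above come from one formula: with a 26-letter alphabet and codes of one to k letters, the space holds 26^1 + 26^2 + ... + 26^k entries. A minimal sketch (the entry counts are the ones given in the article):

```python
def alpha_namespace(max_len, alphabet=26, min_len=1):
    """Number of codes of min_len..max_len letters over the alphabet."""
    return sum(alphabet ** k for k in range(min_len, max_len + 1))

# Two-letter country codes: exactly two letters, 26 x 26 slots.
country_space = alpha_namespace(2, min_len=2)   # 676
# NYSE/AMEX tickers: one, two or three letters.
nyse_space = alpha_namespace(3)                 # 18278

print(country_space, 247 / country_space)   # filling factor ~0.365
print(nyse_space, 3926 / nyse_space)        # filling factor (the article rounds to 0.22)
```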
There are fewer of these stocks (about 3,400) and a much larger namespace (456,976), so it should be considerably easier to find a symbol for a new company there. (The most notable recent addition is Google, which chose the symbol GOOG.) Telephone numbers. Telephone numbers in North America have 10 decimal digits (including the area code), which suggests that the capacity of the namespace should be 10 billion numbers. Under the rules prevailing through the 1980s, however, fewer than a tenth of those combinations were valid telephone numbers. The format of a phone number in those days was expressed as NZX-NNN-XXXX, where N represents the digits 2-9, Z the digits 0-1 and X any digit in the full range 0-9. That works out to about 819 million numbers. Even that quantity should be plenty; there are roughly 300 million telephones in use in the United States. Nevertheless, during the early 1990s the supply of numbers within many area codes came close to exhaustion. Although the crisis was often blamed on the proliferation of modems, fax machines and cellular telephones, the real culprit was an inefficient scheme of allocation: If a telephone company had even one subscriber within a region, the company was assigned a block of 10,000 numbers. The main remedy was allocating numbers in smaller blocks, but along the way the grammatical rules defining a telephone number were relaxed, and the namespace expanded. Any combination of digits of the form NXX-NXX-XXXX is now a valid phone number, allowing some 6.4 billion possibilities. With careful conservation, the supply is expected to last until sometime in the 2030s. [35]Simulation of the filling of a namespace... Product codes. As in the telephone system, the shortage of Universal Product Codes is partly a matter of allocation policy. 
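The two telephone-number grammars just described can be counted mechanically. In the sketch below, each pattern letter maps to the number of digits it admits (N is 2-9, Z is 0-1, X is 0-9) and the hyphens are ignored; the function name is mine, not part of any numbering-plan standard:

```python
from math import prod

DIGIT_CHOICES = {'N': 8, 'Z': 2, 'X': 10}

def pattern_count(pattern):
    """Number of phone numbers matching a pattern such as 'NZX-NNN-XXXX'."""
    return prod(DIGIT_CHOICES[c] for c in pattern if c != '-')

old_plan = pattern_count('NZX-NNN-XXXX')   # "about 819 million"
new_plan = pattern_count('NXX-NXX-XXXX')   # "some 6.4 billion"
print(old_plan, new_plan)
```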
Although a UPC number has 12 digits (implying a maximum capacity of a trillion items), the first digit is a category code that in practice is almost always 0, and the final digit is a checksum used for detecting errors. Of the remaining 10 digits, 5 identify the manufacturer and 5 the individual product. Because of this fixed structure, every manufacturer automatically gets a block of 100,000 item numbers, even though most companies need far fewer. The new 13-digit standard coming into force on the first day of 2005 not only expands the total namespace by a factor of 10 but also allows a more flexible division of resources. In particular, some companies will be given a longer manufacturer code and fewer item codes. The new product-code standard isn't really new. The United States and Canada are merely acceding to another standard, called the European Article Number, that is already in use almost everywhere else in the world. (How quaint that the scheme known only in part of North America is the one labeled "Universal.") After the merger, the entire suite of product codes will be renamed the Global Trade Item Number. Most of the barcode scanning devices at checkout counters have long been able to read the 13-digit EAN format, but in many cases the database in the back office could not handle the extra digit. While making the necessary conversions, retailers have been urged to allow space for a 14-digit version of the GTIN. In 2007 publishers and libraries will get their turn to renumber their world as the International Standard Book Number is expanded to 13 digits and brought under the GTIN umbrella. Social Security numbers. With nine-digit decimal numbers, there should be a billion possibilities. The Social Security Administration has excluded only a few of them ("No SSNs with an area number of '666' have been or will be assigned"), so that the actual size of the namespace appears to be 987,921,198. 
Some 415 million numbers have been issued since 1936, for a filling factor of about 0.4. The supply of numbers may well outlast the supply of funds to pay benefits. Other countries have quite different systems for allocating numbers analogous to the U.S. Social Security number. In particular, the Italian codice fiscale is not an arbitrary number assigned to a person but rather a string of alphanumeric symbols calculated from personal data such as name and date and place of birth. This scheme eliminates all concerns over running out of numbers, but it has another potential hazard: If the algorithm for calculating the codici is not chosen very carefully, two individuals may wind up with the same number. Radio station call signs. Broadcast radio stations in the United States have call signs of either three or four letters, but the first letter is always either K or W. These rules create a namespace with room for 36,504 entries. I was surprised to discover how densely filled this space is. Combining the AM and FM bands (many stations broadcast on both), there are 12,560 call signs currently registered with the Federal Communications Commission, a filling factor of more than one-third. Airport codes. When you check a bag at the airport, the luggage tag is marked with a three-letter code that indicates where, if all goes well, you'll eventually retrieve your belongings. The codes are administered by the International Air Transport Association (IATA). There's a code for every airport that has airline service, not to mention a few bus and train stations. Surprisingly, the IATA codes are the most densely packed of all the naming schemes I have encountered. Out of 17,576 possible codes, 10,678 are taken, a filling factor of 0.6. This may be why some of the codes are less than obvious (YYC for Calgary?), although many such minor mysteries have historical explanations. Chicago's O'Hare airport is ORD because it was once called Orchard Field.
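Stepping back to the product codes discussed earlier: the final UPC digit is a checksum, and the standard UPC-A rule (a property of the barcode specification, not spelled out in the article) is a weighted sum mod 10 -- triple the digits in the odd positions, add the even-position digits, and pick the check digit that brings the total to a multiple of 10. A sketch:

```python
def upc_check_digit(first11):
    """Check digit (12th digit) for an 11-digit UPC-A prefix."""
    digits = [int(c) for c in first11]
    # Odd positions (1st, 3rd, ...) are weighted 3; even positions weighted 1.
    total = 3 * sum(digits[0::2]) + sum(digits[1::2])
    return (10 - total % 10) % 10

def upc_valid(code12):
    """A 12-digit UPC is valid when its last digit matches the checksum."""
    return upc_check_digit(code12[:11]) == int(code12[11])

print(upc_check_digit("03600029145"))  # -> 2, so 036000291452 scans as valid
```

The same weighted-sum idea, extended by one more position, carries over to the 13-digit EAN codes the article describes.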
Making Hash of a Name [37]Figure 3. Names are distributed nonrandomly... Suppose you've just built a new airport or radio station or founded a sovereign nation, and you want to register an identifying code with the appropriate agency. What is the likelihood that your first choice will be available? Or your second or third choice? How do these probabilities change as the namespace fills up? If we can make the assumption that preferences for codes are distributed randomly throughout the namespace, then the question is easily answered. The probability that your first choice is already taken is just the filling factor of the namespace. The probability that both your first choice and your second choice are taken is the square of the filling factor, and so on. For example, if the namespace is two-thirds filled, then in two-thirds of the cases a randomly chosen code will already be present; four-ninths of the time, two randomly generated codes will both be taken. Searching at random for an unused name is related to the process known in computer science as hashing. The idea of hashing is to store data items for quick retrieval by scattering them seemingly at random throughout a table in computer memory. The arrangement isn't truly random; each item's position is set by a deterministic "hash function." Sometimes the hash function sends two data items to the same location; the collision must be resolved by putting one of the items elsewhere. This is analogous to requesting your favored name or code and finding that someone else has already claimed it. The resemblance between name search and hashing is worth noting because the performance of various hashing algorithms has been carefully analyzed and documented. Much depends on the strategy for resolving collisions, or, in the context of name search, the policy for choosing an alternative when a desired name is not available. 
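Under the random-preference assumption above, the chance that your first k choices are all taken is the filling factor raised to the k-th power, and (a standard fact about geometric trials, not stated in the article) a player who keeps drawing uniformly at random needs 1/(1 - f) attempts on average. A quick check against the filling factors quoted for country codes and airport codes:

```python
def expected_probes(filling):
    """Expected attempts to find a free name, retrying uniformly at random."""
    return 1.0 / (1.0 - filling)

def prob_first_k_taken(filling, k):
    """Chance that your first k independent random choices are all taken."""
    return filling ** k

print(expected_probes(247 / 676))      # IANA country codes: about 1.6 attempts
print(expected_probes(10678 / 17576))  # IATA airport codes: about 2.5 attempts
print(prob_first_k_taken(2 / 3, 2))    # two-thirds full: 4/9 for two misses
```

These analytic values match the "1.6 probes" and "2.5 trials" reported below for the purely random player.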
Figure 2 reports the results of a simulation of a name search equivalent to one of the simplest hashing methods. The rule here is to generate a first-choice name at random; if that choice is taken, try the next name in alphabetical order and continue until an opening is found. Naturally, the number of collisions increases as the namespace fills up, but the increase is not linear; the shape of the curve is concave upward. Thus at any filling factor below about one-half, there is a reasonable chance you will get one of your first few choices. At higher filling factors, the average number of attempts before you find an available name rises steeply. But there is a flaw in this analysis: The assumption that preferences for names are random is obviously bogus. People prefer names that appear to mean something or that have some trait that distinguishes them from random strings of symbols. In the stock market, the rare one-letter ticker symbols carry much prestige; radio call signs that spell a pronounceable word (WARM, KOOL) are in demand. It would be difficult to codify or quantify these biases, but as a simple way of estimating their effect I tried looking at the first-order statistics of the code words in various data sets. The first-order statistics are simply the letter frequencies at each position within a word. (Higher-order statistics take into account correlations between the letters.) My experiments compared the success of two players--one who chooses names utterly at random and another whose random choices are biased to match the statistics of the names already in the data set. In other words, the latter player tends to favor names that are like those already present. Not surprisingly, the random player has an easier time finding an available name. The magnitude of the effect can be quite large. 
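The simulation behind Figure 2 is easy to re-create. The sketch below fills a two-letter namespace by the same rule -- a random first choice, then alphabetical successors until an opening appears -- and records how many attempts each new name costs; the average climbs steeply as the space fills:

```python
import itertools
import random
import string

def fill_namespace(word_len=2, seed=1):
    """Fill an all-uppercase namespace to capacity; return the number of
    probes each insertion needed (random start, then alphabetical successor)."""
    random.seed(seed)
    codes = [''.join(p) for p in
             itertools.product(string.ascii_uppercase, repeat=word_len)]
    size = len(codes)
    taken = [False] * size
    probes_per_insert = []
    for _ in range(size):
        i = random.randrange(size)
        probes = 1
        while taken[i]:
            i = (i + 1) % size   # try the next name in alphabetical order
            probes += 1
        taken[i] = True
        probes_per_insert.append(probes)
    return probes_per_insert

probes = fill_namespace()
tenth = len(probes) // 10
print(sum(probes[:tenth]) / tenth)    # earliest insertions: about 1 attempt each
print(sum(probes[-tenth:]) / tenth)   # last insertions: many attempts each
```

This is exactly linear probing with first-come-first-served collision resolution, which is why the hashing literature predicts the concave-upward curve.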
In the case of IANA country codes, random choices succeed after an average of 1.6 probes, but finding a name with letter frequencies similar to the existing population takes 2.5 trials on average. For IATA airport codes, the statistical bias raises the average number of attempts from 2.5 to 3.9. These results suggest that some namespaces may become impractically full much sooner than would be expected from an analysis based on hashing algorithms. [39]Statistical bias within a namespace... The experiment itself has a curious bias. Using an existing data set to infer people's preferences neglects the fact that many of the code words may not have been anyone's first choice; they may have been selected merely because the real first choice was already taken. Furthermore, the statistical bias varies with the filling factor. If there are only a few names in the data set, the letter frequencies will be strongly biased. Indeed, some letters may not appear at all, and so the algorithm used in the experiment would assign them a probability of zero. At the opposite end of the spectrum, variations in letter frequencies inevitably diminish as the namespace fills up. Once almost all the code words are taken, all letters must have nearly the same frequency. Horse Sense As namespaces get larger, analyses based on random character strings become less illuminating. A case in point is the naming of thoroughbred horses. Under rules enforced by the Jockey Club, a horse's name can have from 2 to 18 characters, drawn from an alphabet consisting of the usual 26 letters plus the space, the period and the apostrophe. This is an enormous namespace, with room for more than 2 x 10^26 entries. At any one time there are about 450,000 names assigned to active or recently retired horses. Most of these names will eventually become available for reuse, and so the pool of active names stays at roughly constant size. 
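The thoroughbred count above is a sum over name lengths: 29 allowed characters (26 letters plus the space, period and apostrophe) and lengths 2 through 18. A sketch:

```python
horse_alphabet = 26 + 3   # letters plus space, period and apostrophe
horse_space = sum(horse_alphabet ** k for k in range(2, 19))

print(f"{horse_space:.3g}")       # a little over 2 x 10^26 names
print(450_000 / horse_space)      # filling factor: effectively zero
```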
(Only the names of very famous steeds are permanently withdrawn; there will never be another Kelso or Secretariat.) With just 450,000 of 2 x 10^26 slots occupied, the filling factor of this namespace might as well be zero. Generating strings of characters at random, you would have to try 10^21 of them before you would have much chance of stumbling on a name in use. And yet real-world experience gives a very different impression. Of all the names submitted by horse breeders, the fraction rejected is not 1 in 10^21 but close to 1 in 4. According to a spokesperson for the Jockey Club, the most common reason for rejection is that the proposed name is too close to an existing one. In this context names can clash even if they are not spelled identically--mere phonetic similarity is enough to bar a name. But even allowing for this broader criterion of uniqueness, the thoroughbred namespace is not nearly as empty as it would seem from a naive counting of character strings. A fair estimate of the true filling factor would probably have to be based not on the combinatorics of random letters but on combinations of words or some other higher linguistic unit. The same is surely true for Internet domain names. Each component of a domain name--each part between dots--can have up to 63 characters, and the acceptable characters include both letters and numbers as well as the hyphen. The size of the namespace is nearly 10^100; we won't use them all up anytime soon. But meaningful, pithy, clever domain names--that's another matter. Even outside the confines of finite namespaces, the sheer onomastic challenge of modern life sometimes gets to be a burden. Where's Adam when we need him? Years ago, I could save a clipping from the newspaper without any need to name it. Now, for every document I create or choose to keep, I must enact a little ceremony of naming: I dub thee "FILE-037.TXT."
The workload has gotten serious enough that consultants make a living out of nothing more than dreaming up names. (One firm named itself A Hundred Monkees--well named!) When my daughter was a voluble three-year-old, she would greet passersby with the enthusiastic salute: "Hi! My name is named Amy. What is your name named?" A dizzying recursion yawns before us. Once we start naming names, and then the names of names of names, where do we ever stop? Bibliography * The Airline Codes Web Site. [40]http://www.airlinecodes.co.uk/ * Book Industry Study Group. 2004. The evolution in product identification: Sunrise 2005 and the ISBN-13. http://www.bisg.org/docs/The_Evolution_in_Product_ID.pdf * Federal Communications Commission. Undated. Index of Media Bureau CDBS public database files. http://www.fcc.gov/mb/databases/cdbs * Garfield, Eugene. 1961. An Algorithm for Translating Chemical Names to Molecular Formulas. Doctoral dissertation, University of Pennsylvania. http://www.garfield.library.upenn.edu/essays/v7p441y1984.pdf * Jeffrey, Charles. 1973. Biological Nomenclature. New York: Crane, Russak & Co. * The Jockey Club. 2003. The American Stud Book: Principal Rules and Requirements. Lexington, Ky.: The Jockey Club. http://www.jockeyclub.com/pdfs/RULES_2003_PRINT.pdf * Knuth, Donald E. 1973. The Art of Computer Programming. Vol. 3: Sorting and Searching. Section 6.4, Hashing. Reading, Mass.: Addison-Wesley. * McNamee, Joe. 2003. Why do we care about names and numbers? http://www.circleid.com/article/336_0_1_0/ * Mockapetris, P. 1987. Domain names: implementation and specification. Network Working Group Request for Comments 1035. http://www.ietf.org/rfc/rfc1035.txt * NeuStar, Inc. 2003. North American Numbering Plan Administration Annual Report, January 1-December 31, 2003. http://www.nanpa.com/reports/2003_NANPA_Annual_Report.pdf * Savory, Theodore. 1962. Naming the Living World: An Introduction to the Principles of Biological Nomenclature.
London: The English Universities Press. * Uniform Code Council, Inc. Undated. 2005 Sunrise: Executive summary. http://www.uc-council.org/ean_ucc_system/stnds_and_tech/2005_sunrise.html References 31. http://www.americanscientist.org/template/AuthorDetail/authorid/490 33. http://www.americanscientist.org/template/AssetDetail/assetid/39138?&print=yes#39415 35. http://www.americanscientist.org/template/AssetDetail/assetid/39138?&print=yes#39420 37. http://www.americanscientist.org/template/AssetDetail/assetid/39138?&print=yes#39536 39. http://www.americanscientist.org/template/AssetDetail/assetid/39138?&print=yes#39426 From checker at panix.com Fri Jan 13 16:53:42 2006 From: checker at panix.com (Premise Checker) Date: Fri, 13 Jan 2006 11:53:42 -0500 (EST) Subject: [Paleopsych] Sigma Xi: On the Threshold Message-ID: On the Threshold http://www.americanscientist.org/template/AssetDetail/assetid/18577?&print=yes January-February 2003 COMPUTING SCIENCE [Best to click the URL.] [31]Brian Hayes Last night I called technical support for the universe to report a bug. They kept me on hold for eternity, but finally I lodged my complaint: Some things in this world take entirely too long to compute--exponentially so, in the worst cases. "That's not a bug, that's a feature," was the inevitable reply. "It keeps the universe from running down too fast. Besides, NP-complete calculations are an unsupported option, which voids your warranty. And where is it written that anything at all is guaranteed to be efficiently computable? Count yourself lucky that 1+1 is a polynomial-time calculation." Perhaps cosmic tech support is right: Quick and easy answers to computational questions are not something we are entitled to expect in this world. Still, it's puzzling that some calculations are so much harder than others. The classic example is multiplication versus factoring. If you are given two prime numbers, it's easy to multiply them, yielding a bigger number as the product.
But trying to undo this process--to take the product and recover the two unknown factors--seems to be much more difficult. We have fast algorithms for multiplying but not for factoring. Why is that? Although such questions stump the help desk, there has been some progress lately in understanding the sources of difficulty in at least one family of computational tasks, those known as constraint-satisfaction problems. The new line of inquiry doesn't quite explain why some of these problems are hard and others are easy, but it traces the boundary between the two classes in considerable detail. Furthermore, a better map of the problem-solving landscape has led to a novel algorithm that pushes back a little further the frontier of intractability. The algorithm, called survey propagation, could well have important practical applications. Where the Hard Problems Are The new algorithm weaves together threads from at least three disciplines: mathematics, computer science and physics. The theme that binds them all together is the presence of sudden transitions from one kind of behavior to another. The mathematical thread begins in the 1960s with the study of random graphs, initiated by Paul Erdős and Alfréd Rényi. In this context a graph is not a chart or plot but a more abstract mathematical structure--a collection of vertices and edges, generally drawn as a network of dots (the vertices) and connecting lines (the edges). To draw a random graph, start by sprinkling n vertices on the page, then consider all possible pairings of the vertices, choosing randomly with probability p whether or not to draw an edge connecting each pair. When p is near 0, edges are rare, and the graph consists of many small, disconnected pieces, or components. As p increases, the graph comes to be dominated by a single "giant" component, which includes most of the vertices. The existence of this giant component is hardly a surprise, but the manner in which it develops is not obvious.
The component does not evolve gradually as p increases but emerges suddenly when a certain threshold is crossed. The threshold is defined by a parameter I'll call α, which is the number of edges divided by the number of vertices. The giant component is born when α is about 1/2. [33]Figure 1. Graph coloring . . . In computer science, a similar threshold phenomenon came to widespread attention in the early 1990s. In this case the threshold governs the likelihood that certain computational problems have a solution. One of these problems comes straight out of graph theory: It is the k-coloring problem, which asks you to paint each vertex of a graph with one of k colors, under the rule that two vertices joined by an edge may not have the same color. Finding an acceptable coloring gets harder as α increases, because there are more edges imposing constraints on each vertex. Again, the threshold is sharp: Below a certain ratio, almost all graphs are k-colorable, and above this threshold almost none are. Moreover, the threshold affects not only the existence of solutions but also the difficulty of finding them. The computational effort needed to decide whether a graph is k-colorable has a dramatic peak near the critical value of α. (An influential paper about this effect was aptly titled "Where the really hard problems are.") Physicists also know something about threshold phenomena; they call them phase transitions. But are the changes of state observed in random graphs and in constraint-satisfaction problems truly analogous to physical events such as the freezing of water and the onset of magnetization in iron? Or is the resemblance a mere coincidence? For a time there was controversy over this issue, but it's now clear that the threshold phenomena in graphs and other mathematical structures are genuine phase transitions. The tools and techniques of statistical physics are ideally suited to them.
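The giant-component threshold is easy to see in a simulation. The sketch below (my own union-find implementation, not the article's code) builds a random graph with a given edge-to-vertex ratio and measures the largest connected component; below a ratio of about 1/2 the biggest piece is a sliver, while above it a single component holds most of the vertices:

```python
import random
from collections import Counter

def largest_component_fraction(n, ratio, seed=0):
    """Fraction of vertices in the largest component of a random graph
    with n vertices and ratio * n random edges (union-find, path halving)."""
    rng = random.Random(seed)
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for _ in range(int(ratio * n)):
        a, b = find(rng.randrange(n)), find(rng.randrange(n))
        if a != b:
            parent[a] = b                   # merge the two components

    sizes = Counter(find(v) for v in range(n))
    return max(sizes.values()) / n

print(largest_component_fraction(5000, 0.25))  # subcritical: a few percent at most
print(largest_component_fraction(5000, 1.00))  # supercritical: most of the vertices
```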
In particular, the k-coloring problem can be mapped directly onto a model of a magnetic system in solid-state physics. The survey-propagation algorithm draws on ideas developed originally to describe such physical models. Where the Hard Problems Aren't Survey propagation is really a family of algorithms, which could be applied in many different realms. So far, the method has been tested on two specific problems. The first of these is Boolean satisfiability, or SAT, where the aim is to solve a large formula in symbolic logic, assigning values of true or false to all the variables in such a way that the entire formula evaluates to true. The second problem is k-coloring. Because I have written about satisfiability on an earlier occasion, I shall adopt k-coloring as the main example here. I focus on three-coloring, where the palette of available colors has just three entries. Three-coloring is a hard problem, but not an impossible one. The question "Is this graph three-colorable?" can always be answered, at least in principle. Since each vertex can be assigned any of three colors, and there are n vertices, there must be exactly 3^n ways of coloring the graph. To decide whether a specific graph is three-colorable, just work through all the combinations one by one. If you find an assignment that satisfies the constraint--that is, where no edges yoke together like-colored vertices--then the answer to the question is yes. If you exhaust all the possibilities without finding a proper coloring, you can be certain that none exists. This algorithm is simple and sure. Unfortunately, it's also useless, because enumerating 3^n colorings is beyond the realm of practicality for any n larger than 15 or 20. Some more-sophisticated procedures can retain the guarantee of an exact and exhaustive search while reducing the number of operations to fewer than 1.5^n. This is a dramatic improvement, but it is still an exponential function, and it merely raises the limit to n=50 or so.
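The exhaustive search just described fits in a few lines; in this sketch (the function name is mine) a graph is an edge list over vertices numbered 0 to n-1:

```python
from itertools import product

def three_colorable_brute_force(n, edges):
    """Try all 3**n colorings; True iff some assignment leaves no edge
    with matching endpoint colors. Feasible only for small n."""
    for coloring in product(range(3), repeat=n):
        if all(coloring[u] != coloring[v] for u, v in edges):
            return True
    return False

triangle = [(0, 1), (1, 2), (0, 2)]
k4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
print(three_colorable_brute_force(3, triangle))  # True
print(three_colorable_brute_force(4, k4))        # False: a four-clique needs four colors
```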
For large graphs, with thousands of vertices, all such brute-force methods are hopeless. On the other hand, if you could somehow peek at the solution to a large three-coloring problem, you could check its correctness with much less labor. All you would have to do is go through the list of edges, verifying that the vertices at the ends of each edge carry different colors. The number of edges in a graph cannot be greater than n^2, which is a polynomial rather than an exponential function and which therefore grows much more slowly. Problems with answers that are hard to find but easy to check are the characteristic signature of the class called NP (which stands for "nondeterministic polynomial"). Three-coloring is a charter member of NP and also belongs to the more-elite group of problems described as NP-complete; the same is true of satisfiability. Barring a miracle, there will be no polynomial-time algorithms for NP-complete problems. Having thus established the credentials of three-coloring as a certifiably hard problem, it is now time to reveal that most three-coloring problems on random graphs are actually quite easy. Given a typical graph, you have a good chance of quickly finding a three-coloring or proving that none exists. There is no real paradox in this curious situation. The classification of three-coloring as NP-complete is based on a worst-case analysis. It could be overturned only by an algorithm that is guaranteed to produce the correct answer and to run in polynomial time on every possible graph. No one has discovered such an algorithm. But there are many algorithms that run quickly most of the time, if you are willing to tolerate an occasional failure. One popular strategy for graph-coloring algorithms is backtracking. It is similar to the way most people would attack the problem if they were to try coloring a graph by hand.
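The easy-to-check half of that observation amounts to a single pass over the edge list (a sketch; function and variable names are my own):

```python
def is_proper(coloring, edges):
    # Polynomial-time verification: no edge may join like-colored vertices.
    return all(coloring[u] != coloring[v] for u, v in edges)

triangle = [(0, 1), (1, 2), (0, 2)]
```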
You start by assigning an arbitrary color to an arbitrary vertex, then go on to the neighboring vertices, giving them any colors that do not cause a conflict. Continuing in this way, you may eventually reach a vertex where no color is legal; at that point you must back up, undoing some of your previous choices, and try again.

[35]Figure 2. Transition between solvable and unsolvable phases . . .

Showing that a graph cannot be three-colored calls for another kind of algorithm. The basic approach is to search for a small cluster of vertices that--even in isolation from the rest of the graph--cannot be three-colored. For example, a "clique" made up of four vertices that are all linked to one another has this property. If you can find just one such cluster, it settles the question for the entire graph. Algorithms like these are very different from the brute-force, exhaustive-search methods. The simple enumeration of all 3^n colorings may be impossibly slow, but at least it's consistent; the running time is the same on all graphs of the same size. This is not true for backtracking and other inexact or incomplete algorithms; their performance varies widely depending on the nature of the graph. In particular, the algorithms are sensitive to the value of γ, the ratio of edges to vertices, which again is the parameter that controls the transition between colorable and uncolorable phases. Well below the critical value of γ, where edges are sparse, there are so many ways to color the graph successfully that any reasonable strategy is likely to stumble onto one of them. At the opposite extreme, far above the threshold, graphs are densely interconnected, and it's easy to find a subgraph that spoils the chance of a three-coloring. The troublesome region is between these poles, near the threshold. In that middle ground there may be just a few proper colorings, or there may be none at all.
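A minimal backtracking colorer along the lines just described might look like this (my own sketch, with no cleverness about vertex ordering):

```python
def color_backtracking(n, edges, k=3):
    """Return a proper k-coloring as a list of colors, or None."""
    neighbors = [[] for _ in range(n)]
    for u, v in edges:
        neighbors[u].append(v)
        neighbors[v].append(u)
    colors = [None] * n

    def extend(v):
        if v == n:                       # every vertex colored: success
            return True
        for c in range(k):
            if all(colors[w] != c for w in neighbors[v]):
                colors[v] = c            # tentative choice
                if extend(v + 1):
                    return True
        colors[v] = None                 # back up; undo this choice
        return False

    return colors if extend(0) else None
```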
Distinguishing between these two situations can require checking almost every possible assignment.

Where the Solutions Are

The critical value of γ is about 2.35. In other words, if a random graph with n vertices has fewer than 2.35n edges, it can almost surely be three-colored; if it has more than 2.35n edges, a three-coloring is unlikely. Moreover, the transition between these two regimes is known to be sharp; it is a true discontinuity, a sudden jump rather than a smooth gradation. To put this idea more formally, the width of the transitional region tends to zero as n tends to infinity. The sharpness of the phase transition could be taken as encouraging news. If algorithms for deciding colorability bog down only in the transitional region, and if that region is vanishingly narrow, then the probability of encountering a hard-to-classify graph is correspondingly small. But it seems the universe has another bug (or feature). In the first place, the sharpness of the colorability transition is assured only for infinitely large graphs; at finite n, the corners of the transition curve are rounded. And there is another disrupting factor, which has been recognized only recently. It has to do not with the structure of the graph itself but with the structure of the set of all solutions to the coloring problem. Although the uncolorable phase does not begin until γ ≈ 2.35, experiments have shown that algorithms begin slowing down somewhat earlier, at values of γ around 2.2. The discrepancy may seem inconsequential, but it is too large to be explained merely by the blurring of the phase transition at finite n. Something else is going on.

[37]Figure 3. Computational effort . . .

To understand the cause, it helps to think of all the possible three-colorings of a graph spread out over a surface. The height of the surface at any point represents the number of conflicts in the corresponding coloring.
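The transition can be probed empirically, though only crudely at small n, where the blurring just mentioned is severe. A sketch of such an experiment (brute-force checking, toy sizes, and parameter choices of my own):

```python
# Measure the fraction of random graphs that are three-colorable at a
# sparse and a dense value of gamma (edges per vertex).  At n=10 the
# transition is heavily rounded, but the two regimes are still visible.
import random
from itertools import product

def three_colorable(n, edges):
    return any(all(c[u] != c[v] for u, v in edges)
               for c in product(range(3), repeat=n))

def colorable_fraction(n, gamma, trials, rng):
    # Random graph model: round(gamma*n) distinct edges chosen uniformly.
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    hits = sum(three_colorable(n, rng.sample(pairs, round(gamma * n)))
               for _ in range(trials))
    return hits / trials

rng = random.Random(0)
low = colorable_fraction(10, 1.0, 20, rng)    # well below the threshold
high = colorable_fraction(10, 4.0, 20, rng)   # well above the threshold
```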
Thus the perfect colorings (those with zero conflicts) all lie at sea level, while the worst colorings create high-altitude peaks or plateaus. Of course the topography of this landscape depends on the particular graph. Consider how the surface evolves as γ gradually increases. At low values of γ there are broad basins and valleys, representing the many ways to color the graph perfectly. At high γ the landscape is alpine, and even the lowest point is well above sea level, implying a complete absence of perfect colorings. The transitional value γ ≈ 2.35 marks the moment when the last extensive areas of land at sea level disappear. What happens in this "solution space" at γ ≈ 2.2? It turns out this is the moment when a broad expanse of bottomland begins breaking up into smaller isolated basins. Below 2.2, nearly all the perfect colorings form a single giant connected cluster. They are connected in the sense that you can convert one solution into another by making relatively few changes, and without introducing too many conflicts in any of the intermediate stages. Above 2.2, each basin represents an isolated cluster of solutions. Colorings that lie in separate basins are substantially different, and converting one into another would require climbing over a ridge formed by colorings that have large numbers of conflicts. Algorithms that work by conducting a local search are unlikely to cross such ridge lines, and so they remain confined for long periods to whichever basin they first wander into. As γ increases above 2.2, the number of perfect colorings within any one basin dwindles away to zero, and so the algorithms may fail to find a solution, even though many proper colorings still exist elsewhere on the solution surface. This vision of solutions spread out over an undulating landscape is a familiar conceptual device in many areas of physics. Often the landscape is interpreted as an energy surface, and physical systems are assumed to run downhill toward states of minimum energy.
This analogy can be pursued further, setting up a direct correspondence between the k-coloring of graphs and a model of magnetic materials.

Where the Spins Are

Models of magnetism come in baffling varieties. The basic components are vectors that represent atomic spins. Usually the spins are arranged in a regular lattice, as in a crystalline solid, and the vectors are constrained to point in only a few possible directions. In a model of a ferromagnet, nearby spins have positive couplings, meaning that the energy of the system is lower when the spins line up in parallel. An antiferromagnet has negative couplings, favoring spins that point in different directions. The problem of three-coloring a graph can be seen as a model of an antiferromagnet in which each spin has three possible directions, corresponding to the three colors. It is antiferromagnetic because the favored state is one where the colors or the spins differ.

[39]Figure 4. Random walks through the space of graph colorings . . .

Most spin-system models focus on the effects of thermal fluctuations and the countervailing imperatives to minimize energy and to maximize entropy. In this respect the graph-coloring model is simpler than most, because the condition of interest is at zero temperature, where entropy can be neglected. On the other hand, the model is more complicated in another way: The spins are embedded in a graph with random interconnections, more like a glass than the geometrically regular lattice of a crystal. Having translated the coloring problem into the language of spin physics, the aim is to identify the ground state--the spin configuration of minimum energy. If the ground-state energy is zero, then at least one perfect coloring exists. If the energy of the spins cannot be reduced to zero, then the corresponding graph is not three-colorable. The minimum energy indicates how many unavoidable conflicts exist in the colored graph.
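In this correspondence the "energy" of a coloring is simply its number of conflicts, that is, of monochromatic edges (a one-line sketch, with names of my own):

```python
def energy(coloring, edges):
    # Zero-temperature antiferromagnetic Potts energy: one unit per edge
    # whose two endpoint "spins" carry the same color.  Energy zero means
    # the coloring is proper.
    return sum(coloring[u] == coloring[v] for u, v in edges)

triangle = [(0, 1), (1, 2), (0, 2)]
```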
Of course recasting the problem in a new vocabulary doesn't make the fundamental difficulty go away. In graph coloring, when you resolve a conflict by changing the color of one vertex, you may create a new conflict elsewhere in the graph. Likewise in the spin system, when you lower the energy of one pair of coupled spins, you may raise it for a different pair. Physicists refer to this effect as "frustration." Interactions between adjacent spins can be viewed as a kind of message-passing, in which each spin tells its neighbors what they ought to do (or, since the coupling is antiferromagnetic, what they ought not to do). Translating back into the language of graph coloring, a green vertex broadcasts a signal to its neighbors saying "Don't be green." The neighbors send back messages of their own--"Don't be red," "Don't be blue." The trouble is, every edge is carrying messages in both directions, some of which may be contradictory. And feedback loops could prevent the network from ever settling down into a stable state. A remedy for this kind of frustration is known in condensed-matter physics as the cavity method. It prescribes the following sequence of actions: First, choose a single spin and temporarily remove it from the system (thereby creating a "cavity"). Now, from among the neighbors surrounding the cavity, choose one node to regard as an output and consider the rest to be inputs. Sum up the signals arriving on all the input edges, and pass along the result to the output. The effect is to break open loops and enforce one-way communication. Finally, repeat the entire procedure with another spin, and continue until the system converges on some steady state. The cavity method was first applied to constraint-satisfaction problems by Marc Mézard of the Université de Paris Sud, Giorgio Parisi of the Università di Roma "La Sapienza" and Riccardo Zecchina of the Abdus Salam International Centre for Theoretical Physics in Trieste.
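A toy message-passing update in this spirit can be written down directly. This is a plain belief-propagation-style sweep for coloring, not the full cavity or survey-propagation machinery, and all names here are my own:

```python
def sweep(messages, neighbors, k=3):
    """One synchronous update of all directed messages.
    messages[(i, j)] is a length-k distribution: how likely vertex i is
    to take each color, in the "cavity" where the edge to j is removed."""
    new = {}
    for (i, j) in messages:
        weights = []
        for c in range(k):
            w = 1.0
            for nb in neighbors[i]:
                if nb != j:
                    # each of i's other neighbors says "don't be color c"
                    w *= 1.0 - messages[(nb, i)][c]
            weights.append(w)
        z = sum(weights)
        new[(i, j)] = [w / z for w in weights] if z else [1.0 / k] * k
    return new

# Usage: a triangle, starting from uniform messages.
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
messages = {(i, j): [1 / 3] * 3 for i in neighbors for j in neighbors[i]}
for _ in range(10):
    messages = sweep(messages, neighbors)
```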
Initially it was a tool for calculating the average properties of statistical ensembles of many spin systems. About a year ago, Mézard and Zecchina realized that it could also be adapted to work with individual problem instances. But a significant change was needed. Instead of simple messages such as "Don't be green," the information transmitted from node to node consists of entire probability distributions, which give a numerical rating to each possible spin state or vertex color. Mézard and Zecchina named the algorithm survey propagation. They got the "propagation" part from another algorithm that also inspired their work: a technique called belief propagation, which is used in certain error-correcting codes. "Survey" is meant in the sense of opinion poll: The sites surrounding a cavity are surveyed for the advice they would offer to their neighbors.

Where the Bugs Are

Over the past year the concept of survey propagation has been further refined and embodied in a series of computer programs by Mézard and Zecchina and a group of coworkers. Contributors include Alfredo Braunstein, Silvio Franz, Michele Leone, Andrea Montanari, Roberto Mulet, Andrea Pagnani, Federico Ricci-Tersenghi and Martin Weigt. To solve a three-coloring problem on a graph of size n, the algorithm first finds the vertex that is most highly biased toward one color or another, and permanently sets the color of that vertex accordingly. Then the algorithm is invoked recursively on the remaining graph of n-1 vertices, so that another vertex color is fixed. Obviously this process has to terminate after no more than n repetitions. In practice it usually stops sooner, when all the signals propagating through the network have become messages of indifference, putting no constraints on neighboring nodes. At this point survey propagation has nothing more to offer, but the graph that remains has been reduced to a trivial case for other methods.
As with other algorithms for NP-complete problems, survey propagation comes with no guarantees, and it does sometimes fail. The process of deciding which vertex to fix next is not infallible, and when a wrong choice is made, there may be no later opportunity to recover from it. (Adding some form of backtracking or randomized restarting might alleviate this problem.) In its present form the algorithm is also strictly one-sided: It can usually color a colorable graph, but it cannot prove a graph to be uncolorable. Nevertheless, the algorithm has already had some impressive successes, particularly in the hard-to-solve region near the phase transition. The version for satisfiability has solved problems with 8 million variables. The graph-coloring program handles graphs of a million vertices. Both of these numbers are two orders of magnitude beyond what is routine practice for other methods. Graph coloring and satisfiability are not just toy problems for theorists. They are at the core of various practical tasks in scheduling, in the engineering of silicon circuits and in optimizing computer programs. Having an algorithm capable of solving much larger instances could open up still more applications. Ironically, although survey propagation works well on enormous problems, it sometimes stalls on much smaller instances, such as random graphs with only a few hundred vertices. This is not a pressing practical concern, since other methods work well in this size range, but it's annoying, and there's the worry that the same failures might show up in larger nonrandom graphs. The cause of these small-graph failures is not yet clear. It may have to do with an abundance of densely nested loops and other structures in the graphs. Then again, it may be just another bug in the universe. 
Brian Hayes

Acknowledgment

This article had its genesis during a 10-week residence at the Abdus Salam International Centre for Theoretical Physics, where I benefitted from discussions with Riccardo Zecchina, Muli Safra, Roberto Mulet, Marc Mézard, Stephan Mertens, Alfredo Braunstein, Johannes Berg and others.

Bibliography

* Achlioptas, Dimitris, and Ehud Friedgut. 1999. A sharp threshold for k-colorability. Random Structures and Algorithms 14:63-70. [[40]CrossRef]
* Cheeseman, Peter, Bob Kanefsky and William M. Taylor. 1991. Where the really hard problems are. In Proceedings of the International Joint Conference on Artificial Intelligence, Vol. 1, pp. 331-337.
* Dubois, O., R. Monasson, B. Selman and R. Zecchina (eds). 2001. Special issue on phase transitions in combinatorial problems. Theoretical Computer Science 265(1).
* Erdős, P., and A. Rényi. 1960. On the evolution of random graphs. Publications of the Mathematical Institute of the Hungarian Academy of Sciences 5:17-61.
* Friedgut, Ehud, and Gil Kalai. 1996. Every monotone graph property has a sharp threshold. Proceedings of the American Mathematical Society 124:2993-3002.
* Gent, Ian P., and Toby Walsh (eds). 2002. Satisfiability in the year 2000. Journal of Automated Reasoning 28(2).
* Hayes, Brian. 1997. Computing science: Can't get no satisfaction. American Scientist 85:108-112. [[41]CrossRef]
* Johnson, David S., and Michael A. Trick (eds). 1996. Cliques, Coloring, and Satisfiability: Second DIMACS Implementation Challenge. Providence, R.I.: American Mathematical Society.
* Martin, Olivier C., Rémi Monasson and Riccardo Zecchina. 2001. Statistical mechanics methods and phase transitions in optimization problems. Theoretical Computer Science 265:3-67.
* Mézard, Marc, Giorgio Parisi and Miguel Angel Virasoro (eds). 1987. Spin Glass Theory and Beyond. Philadelphia: World Scientific.
* Mézard, Marc, and Giorgio Parisi. 2002. The cavity method at zero temperature. Journal of Statistical Physics (in press).
[[42]CrossRef]
* Mézard, M., G. Parisi and R. Zecchina. 2002. Analytic and algorithmic solution of random satisfiability problem. Science 297:812-815.
* Mézard, Marc, and Riccardo Zecchina. 2002. Random 3-SAT: From an analytic solution to a new efficient algorithm. Physical Review E (in press).
* Mulet, R., A. Pagnani, M. Weigt and R. Zecchina. 2002. Coloring random graphs. Physical Review Letters (in press). [[43]CrossRef]
* Turner, Jonathan S. 1988. Almost all k-colorable graphs are easy to color. Journal of Algorithms 9:63-82.

References

31. http://www.americanscientist.org/template/AuthorDetail/authorid/490
33. http://www.americanscientist.org/template/AssetDetail/assetid/18577?&print=yes#17741
35. http://www.americanscientist.org/template/AssetDetail/assetid/18577?&print=yes#17742
37. http://www.americanscientist.org/template/AssetDetail/assetid/18577?&print=yes#17743
39. http://www.americanscientist.org/template/AssetDetail/assetid/18577?&print=yes#17746
40. http://dx.doi.org/10.1002/%28SICI%291098-2418%281999010%2914%3A1%3C63%3A%3AAID-RSA3%3E3.0.CO%3B2-7
41. http://dx.doi.org/10.1090/S0002-9939-96-03732-X
42. http://dx.doi.org/10.1016/S0304-3975(01)00149-9
43. http://dx.doi.org/10.1126/science.1073287

From checker at panix.com Fri Jan 13 16:54:13 2006
From: checker at panix.com (Premise Checker)
Date: Fri, 13 Jan 2006 11:54:13 -0500 (EST)
Subject: [Paleopsych] Sigma Xi: Rumours and Errours
Message-ID:

Rumours and Errours
http://www.americanscientist.org/template/AssetDetail/assetid/42368?&print=yes
[Best to click the URL. End of articles from Sigma Xi.]

COMPUTING SCIENCE
[31]Brian Hayes

The confessional essay is not a popular genre in mathematics and the sciences; few of us wish to dwell on our mistakes or call attention to them. An inspiring exception is Donald E. Knuth of Stanford University. During a decade's labor on the TeX typesetting system, he kept a meticulous log of all his errors, and then he published the list with a detailed commentary.
I have long admired Knuth's act of public bravery, and this column is my attempt to follow his example. I took courage from the thought that if there is any realm of life in which I might hope to surpass Don Knuth, it's in making mistakes; but, alas, I've fallen short even in this dubious department. Knuth's published error log runs to more than 900 entries, whereas here I am going to confess to only a paltry handful of mistakes. Then again, Knuth needed 10 years' work on a major software project to accumulate his budget of errors, but I was able to commit some really serious howlers in a program of a dozen lines. Knuth remarks that keeping an error log not only helped in debugging the program but also "helped me to get to know myself." I would like to think that I too have acquired some self-knowledge from the experience of confronting my own fallibility. And it would be gratifying to suggest that by telling my story I might save others from making the same mistakes--but I don't quite believe that, and I'm not even sure it would be a good idea.

Start Spreading the News

[33]Figure 1. The birth and death of a rumor...

The story begins with a loose end from my column on the Lambert W function in the March-April issue of American Scientist. I had been looking for a paper with the curious title "Rumours with general initial conditions," by Selma Belen and C. E. M. Pearce of the University of Adelaide, published in The ANZIAM Journal, which is also known as The Australia and New Zealand Industrial and Applied Mathematics Journal. By the time I found the paper, my column had already gone to press. This was a disappointment, because Belen and Pearce describe an illuminating application of the W function in a context that I found interesting in its own right. Here is how they begin:

The stochastic theory of rumours, with interacting subpopulations of ignorants, spreaders and stiflers, began with the seminal paper of Daley and Kendall.
The most striking result in the area--that if there is one spreader initially, then the proportion of the population never to hear the rumour converges almost surely to a proportion 0.203188 of the population size as the latter tends to infinity--was first signalled in that article. This result occurs also in the variant stochastic model of Maki and Thompson, although a typographic error has resulted in the value 0.238 being cited in a number of consequent papers.

I was intrigued and a little puzzled to learn that a rumor would die out while "almost surely" leaving a fifth of the people untouched. Why wouldn't it reach everyone eventually? And that number 0.203188, with its formidable six decimal places of precision--where did that come from? I read on far enough to get the details of the models. The premise, I discovered, is that rumor-mongering is fun only if you know the rumor and your audience doesn't; there's no thrill in passing on stale news. In terms of the three subpopulations, people remain spreaders of a rumor as long as they continue to meet ignorants who are eager to receive it; after that, the spreaders become stiflers, who still know the rumor but have lost interest in propagating it. The Daley-Kendall and Maki-Thompson models simplify and formalize this social process. Both models assume a thoroughly mixed population, so that people encounter each other at random, with uniform probability. Another simplifying assumption is that people always meet two-by-two, never in larger groups. The pairwise interactions are governed by a rigid set of rules:

o Whenever a spreader meets an ignorant, the ignorant becomes a spreader, while the original spreader continues spreading.

o When a spreader meets a stifler, the spreader becomes a stifler.

o In the case where two spreaders meet, the models differ. In the Daley-Kendall version, both spreaders become stiflers. The Maki-Thompson rules convert only one spreader into a stifler; the other continues spreading.
o All other interactions (ignorant-ignorant, ignorant-stifler, stifler-stifler) have no effect on either party.

The rules begin to explain why rumors are self-limiting in these models. Initially, spreaders are recruited from the large reservoir of ignorants, and the rumor ripples through part of the population. But as the spreaders proliferate, they start running into one another and thereby become stiflers. Because the progression from ignorant to spreader to stifler is irreversible, it's clear the rumor must eventually die out, as all spreaders wind up as stiflers in the end. What's not so obvious is why the last spreader should disappear before the supply of ignorants is exhausted, or why the permanently clueless fraction is equal to 0.203188 of the original population. The rumor models are closely related to well-known models of epidemic disease, where the three subpopulations are usually labeled susceptibles, infectives and removed cases. But there's a difference between rumors and epidemics. In the rumor models, it's not only the disease that's contagious but also the cure, since both spreading and stifling are communicable traits.

The Rumor Mill

I was curious to see the rumor models in action, and so I wrote a little program. I set up a population of 1,000 individuals, each of whom could be an ignorant, a spreader or a stifler. Initially there was just one spreader and all the rest were ignorants.
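The four rules can be collected into a single transition function (a sketch with names of my own; for Maki-Thompson the order of the pair matters, so x is taken to be the initiating party):

```python
def meet(x, y, model="DK"):
    """Outcome when x meets y; states are 'ignorant', 'spreader', 'stifler'.
    model is 'DK' (Daley-Kendall) or 'MT' (Maki-Thompson)."""
    if x == "spreader" and y == "ignorant":
        return "spreader", "spreader"      # the ignorant is converted
    if x == "ignorant" and y == "spreader":
        return "spreader", "spreader"
    if x == "spreader" and y == "stifler":
        return "stifler", "stifler"        # the spreader is silenced
    if x == "stifler" and y == "spreader":
        return "stifler", "stifler"
    if x == "spreader" and y == "spreader":
        if model == "DK":
            return "stifler", "stifler"    # both spreaders are silenced
        return "stifler", "spreader"       # MT: only the initiator
    return x, y                            # all other meetings: no effect
```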
At the heart of the program was the following procedure, meant to implement the Daley-Kendall model (the one in which pairs of spreaders annihilate each other):

    repeat
      choose X at random from among the spreaders in the population;
      choose Y at random from the entire population;
      if Y is an ignorant then
        make Y a spreader
      else if Y is a spreader then
        make both X and Y stiflers
      else if Y is a stifler then
        make X a stifler
    until there are no more spreaders

When all the spreaders are gone, nothing more can change, so the program ends and reports the fraction of the population still oblivious of the rumor. This fraction, designated θ, should be 0.203188. But the result from my program, averaged over a few thousand runs, was 0.28 or 0.29--a considerable discrepancy. At this point, let me pause to say that my big boo-boo had already been committed. Before reading on, you might want to try debugging my algorithm, or even write a program of your own.

Typos and Thinkos

Computer programming teaches humility, or at least that's my experience. In principle, the discrepancy I observed might have pointed to an error in the published result, but that wasn't my first hypothesis. I checked my own code, fully expecting to find some careless mistake--running through a loop one time too few or too many, failing to update a variable, miscalculating an array index. Nothing leapt out at me. The problem, I began to suspect, was not a typo but a thinko. I did know of one soft spot in the program. The individuals X and Y were chosen in such a way that they could both turn out to be the same person, suggesting the strange spectacle of spreading a rumor to oneself. ("Pssst. Have I heard about...?") When I went to fix this oddity, I discovered another bug. A variable named spreader-count was incremented or decremented on each passage through the loop, according to the outcome of the encounter; when this variable reached zero, the program ended.
After each spreader-spreader interaction, I decreased spreader-count by 2--with potentially disastrous results if X and Y were identical. This was a serious flaw, which needed to be repaired; however, the change had no discernible effect on the value of θ, which remained stuck at 0.285. I had another thought. Belen and Pearce were careful to state that their result holds only when the population size tends to infinity. Perhaps my discrepancy would go away in a larger sample. I tried a range of populations, with these results:

    population    θ
            10    0.354
           100    0.296
         1,000    0.286
        10,000    0.285
       100,000    0.285

The trend was in the right direction--a smaller proportion of residual ignorants as population increased--but the curve seemed to flatten out beyond 1,000, and θ looked unlikely ever to reach 0.203. Even so, it seemed worthwhile to test still larger populations, but for that I would need a faster program. I wrote a new and simpler version, dispensing with the array of individuals and merely keeping track of the number of persons in each of the three categories. With this strategy I was able to test populations up to 100 million. The value of θ remained steady at 0.285.

[35]Figure 2. The dynamics of rumors...

Looking at the distribution of θ values from single runs of the program (rather than averages over many runs) suggested another idea. Most of the results were clustered between θ=0.25 and θ=0.35, but there were a few outliers--runs in which 99 percent of the population never heard the rumor. I could see what must be going on. Suppose on the very first interaction X spreads the rumor to Y, and then in the second round the random selection happens to settle on X and Y again. The rumor dies in infancy, having reached only two people. Could it be that excluding these outliers would bring the average value of θ down to 0.203? I gave it a try; the answer was no.

Mea Culpa

I was stumped.
I had reached the point in a debugging session where you begin to doubt your random-number generator, or even your compiler. As it happens, Knuth found a few compiler bugs during his work on TeX, but for me that road has always led nowhere. For lack of a better idea, I decided to look at the other scheme of rumor propagation, the Maki-Thompson model. As indicated above, this model differs from the Daley-Kendall one in that an encounter between two spreaders converts just one of them into a stifler. Modifying my program took only a second. When I ran it, the answer came back θ=0.203. Now I was not just stumped but also thoroughly confused. Here's where confession becomes a test of character. There was a moment--it came in the dark of the night--when I allowed myself to entertain the notion that maybe I was right after all, and the rest of the world had a screw loose. I looked back at the opening paragraph of the paper by Belen and Pearce. I realized that I could make sense of it all, and reconcile their numbers with mine, merely by assuming that the Australian authors had everything umop-episdn. The number 0.203, which they identified as the result of the Daley-Kendall model, really belonged with Maki-Thompson. As for 0.238, which they called a typographical error--well, yes, that's just what it was, a transposition of 0.283, which seemed close enough to 0.285, the value of θ I calculated for Daley-Kendall....

[37]Figure 3. The longevity of a rumor...

By morning this madness had abated, but the impasse remained. I knew I could probably settle the matter by going back to the library and looking up the sources cited by Belen and Pearce, but that seemed less than sporting. I could have tried to prove the correctness of one result or the other, but if I can't trust myself to write a correct program, how can I be trusted to write a correct proof?
Then there's the experimental method: I might have assembled a thousand volunteers, carefully instructed them on the Daley-Kendall rules, and set a rumor loose in their midst. In the end, what I tried was yet another computer simulation. I decided to write a program that would mimic a real experiment as closely as possible, reproducing all the basic events of the underlying model with no shortcuts or optimizations. The image I had in mind was a crowd of people milling about like molecules in a gas, bumping into each other at random and passing on rumors during these chance collisions. This was the system I wanted to simulate. My first program, with its explicit representation of each member of the population, was already fairly close to the goal. But I had made one refinement for the sake of computational efficiency. Because interactions in which neither party is a spreader could never affect the fate of a rumor, it seemed wasteful to include them; I avoided that waste by always choosing a spreader as the first party to an encounter. This seemed a totally harmless bit of streamlining, but now I went back and removed it. In the third version of the program, I selected two individuals at random from the entire population, checked to make sure they were not actually the same person, and then allowed them to interact according to the Daley-Kendall rules. It seemed a futile exercise, which would surely yield exactly the same result as the other programs, only slower. To my astonishment, the new program reported θ=0.203.

My Bad

If you have already figured out where my reasoning went astray, I offer my congratulations. My own belated enlightenment came when I finally drew the matrix of all nine possible encounters of ignorants, spreaders and stiflers. As shown in Figure 4, this diagram can serve as more than just an enumeration of possible outcomes; it encodes the entire structure and operation of the model.
If we make the widths of the columns and rows proportional to the sizes of the three subpopulations, then the area of each of the nine boxes gives the probability of the corresponding two-person encounter. Choosing two participants at random is equivalent to choosing a point at random within the diagram; the outcome of the interaction is then decided by which of the nine boxes the chosen point lies within. (I am again glossing over the issue of spreading a rumor to oneself; it's a minor correction.) [39]Figure 4. Matrix of all possible events... To understand where I went wrong, it's enough to analyze the simplest case, where the three subpopulations are of equal size and all nine kinds of encounters have the same probability, namely 1/9. The boxes at the four corners of the diagram correspond to events that do not involve a spreader and that change no one's status. Two other boxes describe ignorant-spreader encounters, which therefore have a total probability of 2/9. Two more boxes correspond to spreader-stifler meetings, so those events also occur with probability 2/9. But there is only one box representing spreader-spreader interactions, which accordingly must be assigned a probability of 1/9. The key point is that ignorant-spreader and spreader-stifler events are each twice as likely as spreader-spreader meetings. Now consider what happens in my first program for the Daley-Kendall model. By always choosing a spreader first, I confined all events to the middle row of the matrix, and the three boxes in this row were selected with equal probability. As a result, spreader-spreader interactions were twice as frequent as they should have been, and the rumor was extinguished prematurely. From the point of view of probability theory, the error is an elementary one of failing to count cases properly. Perhaps a professional programmer would cite a different root cause: I had violated the old adage, "Don't start optimizing your program until you've finished writing it."
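The two selection schemes are easy to compare directly. Below is a minimal sketch in Python (my own reconstruction, not the column's original code); the `spreader_first` flag reproduces the flawed shortcut of always choosing a spreader as the first party, while the default picks both parties uniformly at random:

```python
import random

# Status codes for the Daley-Kendall model.
IGNORANT, SPREADER, STIFLER = 0, 1, 2

def daley_kendall(n=10000, spreader_first=False, seed=1):
    """Run the rumor to extinction; return the final ignorant fraction.

    spreader_first=False picks both parties uniformly at random (the
    correct procedure); spreader_first=True always picks a spreader as
    the first party (the biased shortcut described in the text).
    """
    rng = random.Random(seed)
    status = [IGNORANT] * n
    status[0] = SPREADER            # a single initial spreader
    spreaders = {0}                 # indices of current spreaders
    while spreaders:
        if spreader_first:
            a = rng.choice(tuple(spreaders))
        else:
            a = rng.randrange(n)
        b = rng.randrange(n)
        if a == b:
            continue                # nobody spreads a rumor to herself
        pair = sorted((status[a], status[b]))
        if pair == [IGNORANT, SPREADER]:
            # The ignorant party becomes a spreader.
            for i in (a, b):
                if status[i] == IGNORANT:
                    status[i] = SPREADER
                    spreaders.add(i)
        elif pair == [SPREADER, SPREADER]:
            # Two spreaders meet: both become stiflers.
            status[a] = status[b] = STIFLER
            spreaders -= {a, b}
        elif pair == [SPREADER, STIFLER]:
            # The spreader becomes a stifler.
            for i in (a, b):
                if status[i] == SPREADER:
                    status[i] = STIFLER
                    spreaders.discard(i)
        # Encounters with no spreader change nothing.
    return status.count(IGNORANT) / n

print(daley_kendall())                     # close to 0.203
print(daley_kendall(spreader_first=True))  # close to 0.285
```

A single run with ten thousand people fluctuates by a percent or so, but the gap between the two schemes is unmistakable.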
Back to the Stacks

Not all of my confusion was cleared up by the discovery of this error. In particular, the algorithm that I now knew to be incorrect for the Daley-Kendall model still seemed to give the right answer for the Maki-Thompson rules. To make sense of this situation, I finally went back to the library to find out what the original authors had said. Daley and Kendall are Daryl J. Daley, now of the Australian National University, and David G. Kendall, a distinguished statistician and probabilist at the University of Cambridge. Their paper, published in 1965, is a model of lucid exposition, which would have spared me all my stumbling in the dark--and for that reason I'm glad I didn't see it sooner. The correct calculation of probabilities is stated very clearly (there's a factor of 1/2 in the expression for the spreader-spreader interaction). Furthermore, the origin of the mysteriously precise number 0.203188 is made plain. Those six digits come not from a discrete-event simulation like the ones I had designed but from a continuous, differential-equation version of the model. The number th is a solution of the equation:

    th e^(2(1 - th)) = 1

(This equation brings us back almost to the Lambert W function, W e^W.) Maki and Thompson are Daniel P. Maki and Maynard Thompson of Indiana University, who discussed rumors in a 1973 textbook, Mathematical Models and Applications. They described the rumor-passing process in terms of telephone calls, and they limited their attention to calls placed by spreaders; because of this asymmetry, only the middle row of the matrix in Figure 4 enters into the calculation, and my first program was in fact a correct implementation of their model. (At least I got something right.) It is almost a coincidence that Maki and Thompson arrive at the same value of th as Daley and Kendall: Their spreader-spreader interactions are twice as likely but have only half the effect.
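The final-size equation above is easy to solve numerically; this sketch uses plain bisection (any root-finder would do) and recovers the six quoted digits:

```python
import math

# Find th in (0, 1) with th * e^(2(1 - th)) = 1.  Note that th = 1 is
# also a root, so the search is confined to the open interval below it.
def f(th):
    return th * math.exp(2.0 * (1.0 - th)) - 1.0

lo, hi = 1e-9, 0.999          # f(lo) < 0 and f(hi) > 0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if f(mid) < 0.0:
        lo = mid
    else:
        hi = mid

print(round(0.5 * (lo + hi), 6))   # 0.203188
```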
The paper by Belen and Pearce that launched me on this adventure also deserves a further comment. The phrase "general initial conditions" in their title refers to rumors initiated not by a single spreader but by many. One might guess that with enough spreaders, the rumor must surely permeate the entire society, but Belen and Pearce show otherwise. Measuring the fraction of those originally ignorant who remain ignorant when the rumor has finished, they find that this fraction actually increases along with the number of initial spreaders, tending to a maximum of 1/e, or about 0.368. In other words, as more people spread the news, more people fail to hear it. The reason is simply that the multitude of spreaders quickly stifle one another. By now the mathematics of rumors has acquired a vast literature. Variant models track competing rumors and counter-rumors or allow people to meet more than two at a time. Still more models study the progress of rumors through networks or lattices rather than structureless mixed populations. I have not yet had a chance to make any errors in exploring these systems.

Gnothi Seauton

Mistakes bring the gift of self-knowledge--a gift that is not always welcome. Looking back on this episode, I could summarize it as follows: I wrote a program that gave a wrong answer, and then I fiddled and fudged until I finally got the output I wanted, and then I stopped. This is not a protocol to be recommended. What's most troubling is the uncomfortable thought that if the textbook answer had not been given to me at the outset, I would surely have been content with the result of my first, fallacious, program. Still, for most of us, the only way we'll never err is if we never try. My fellow-columnist Henry Petroski has written eloquently about the necessary role of error and failure in all worthy undertakings; as he says, falling down is part of growing up.
And if we are going to make mistakes, it seems salutary to bring them out in the open and discuss their causes. Staring them in the face makes them seem a little less mortifying. Only a little, though. A confession of this kind is not followed by absolution. And instead of "Go and err no more," Knuth quotes Piet Hein's advice: "Err and err and err again but less and less and less." I take my own motto from the novelist and playwright Samuel Beckett: "Fail again. Fail better."

Bibliography

* Belen, Selma, and C. E. M. Pearce. 2004. Rumours with general initial conditions. The ANZIAM Journal 45:393-400.
* Daley, D. J., and D. G. Kendall. 1965. Stochastic rumours. Journal of the Institute of Mathematics and Its Applications 1:42-55.
* Knuth, Donald E. 1989. The errors of TeX. Software--Practice & Experience 19:607-685. Reprinted with additions in Literate Programming, 1992, Stanford, Calif.: Center for the Study of Language and Information, pp. 243-339.
* Maki, Daniel P., and Maynard Thompson. 1973. Mathematical Models and Applications, with Emphasis on the Social, Life, and Management Sciences. Englewood Cliffs, N.J.: Prentice-Hall.

References

31. http://www.americanscientist.org/template/AuthorDetail/authorid/490
33. http://www.americanscientist.org/template/AssetDetail/assetid/42368?&print=yes#42605
35. http://www.americanscientist.org/template/AssetDetail/assetid/42368?&print=yes#42606
37. http://www.americanscientist.org/template/AssetDetail/assetid/42368?&print=yes#42607
39.
http://www.americanscientist.org/template/AssetDetail/assetid/42368?&print=yes#42956 From checker at panix.com Fri Jan 13 16:55:42 2006 From: checker at panix.com (Premise Checker) Date: Fri, 13 Jan 2006 11:55:42 -0500 (EST) Subject: [Paleopsych] Seth Lloyd: Ultimate physical limits to computation Message-ID: Seth Lloyd: Ultimate physical limits to computation http://puhep1.princeton.edu/~mcdonald/examples/QM/lloyd_nature_406_1047_00.pdf d'Arbeloff Laboratory for Information Systems and Technology, MIT Department of Mechanical Engineering, Massachusetts Institute of Technology 3-160, Cambridge, Massachusetts 02139, USA (slloyd at mit.edu) NATURE 406 (2000 August 31):1047-56 An insight review article Abstract: Computers are physical systems: the laws of physics dictate what they can and cannot do. In particular, the speed with which a physical device can process information is limited by its energy and the amount of information that it can process is limited by the number of degrees of freedom it possesses. Here I explore the physical limits of computation as determined by the speed of light c, the quantum scale h-bar and the gravitational constant G. As an example, I put quantitative bounds to the computational power of an 'ultimate laptop' with a mass of one kilogram confined to a volume of one litre. [I did some different calculations on an upper bound for the number of possible computations since the Big Bang. This was simply the number of photons times the number of instants since the Big Bang, where an instant is the time it takes a photon to cross the Planck distance. [a. The Planck length is sqrt(h-bar G/c^3) = 1.62 x 10^-35 meters. [b. The Planck time is the length divided by c, or sqrt(h-bar G/c^5) = 5.4 x 10^-44 seconds. [c. The universe is 4 x 10^17 seconds old, or 10^61 Planck time units old. [d. There are 10^89 photons in the universe. [I cannot easily recall where I got these numbers, but I don't think they are in serious doubt.
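The bracketed constants can be checked in a few lines. In this sketch the values of h-bar, G and c are standard rounded figures, while the photon count and the age of the universe are the ones assumed in the text:

```python
import math

# Quick check of the back-of-the-envelope constants above.
hbar = 1.055e-34   # J s
G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s

planck_length = math.sqrt(hbar * G / c**3)   # about 1.6e-35 m
planck_time = planck_length / c              # about 5.4e-44 s

age = 4e17                                   # seconds, as in item c
instants = age / planck_time                 # about 10^61 Planck times
photons = 1e89                               # item d

events = photons * instants                  # about 10^150
print(f"{planck_length:.2e}  {planck_time:.2e}  {events:.1e}")
print(math.log2(events))                     # a bit under 500
```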
So the number of photons having moved across a Planck distance since the Big Bang is the product of c and d and is 10^150. [10^150 = (10^3)^50 ~ (2^10)^50 = 2^500. [The number of actual calculations is vastly smaller than this, since particles do not communicate their movements instantaneously. How much vaster doesn't really matter right now. What does matter is that cracking a truth table puzzle that has 512 = 2^9 variables is impossible and that brute-forcing a 512-bit encryption scheme is likewise impossible. [All this sets sharp limits on the ability of anything whatever to calculate. It would not matter much, philosophically, if the figure were 128 or 256 or 1024 or 2048 or even 1024^2. It is a humanly comprehensible number. We know from the results of Gödel, Church, Turing, and so on about the limitations of any finite calculators.^ But when we come to people the limits on reason are much sharper, since our brains weigh only three pounds. ^[Recall that only finite sentences of finite length are admitted in the "formal systems" of the title of Gödel's 1931 paper, which, not so by the way, means that finite has suddenly become an undefined term! This is true of Raymond Smullyan's _Formal Systems_ and all similar books I have seen. Of course, there are plenty of definitions of finite *inside* mathematics, logic, and set theory, definitions that are not equivalent, but every one of these theories is handled only by formal systems, with their undefined finite sentences of finite length. And so in mathematics, one of the core terms is left as a KNOW-HOW term and never actually gets defined. The same is true of cost in economics. I could go on.] [The Enlightenment optimism of using reason to solve all problems is dead. No one in the Enlightenment thought about solving 512-variable truth tables, and all, except maybe Laplace, would have admitted to limits on reason. Socially, though, the weakening of this optimism took place on a different plane.
World War I, with its senseless trench warfare, destroyed much optimism about human nature, as far as governing itself goes. Relativity theory and quantum mechanics did their part, too, in the realm of science. [The Enlightenment was killed for good during six seconds on Dealey Plaza. [Before the article, here is a talk by Lloyd with Edge.] HOW FAST, HOW SMALL, AND HOW POWERFUL? http://www.edge.org/3rd_culture/lloyd/lloyd_print.html REBOOTING CIVILIZATION "Something else has happened with computers. What's happened with society is that we have created these devices, computers, which already can register and process huge amounts of information, which is a significant fraction of the amount of information that human beings themselves, as a species, can process. When I think of all the information being processed there, all the information being communicated back and forth over the Internet, or even just all the information that you and I can communicate back and forth by talking, I start to look at the total amount of information being processed by human beings -- and their artifacts -- we are at a very interesting point of human history, which is at the stage where our artifacts will soon be processing more information than we physically will be able to process." SETH LLOYD -- HOW FAST, HOW SMALL, AND HOW POWERFUL?: MOORE'S LAW AND THE ULTIMATE LAPTOP [7.23.01] Introduction "Lloyd's Hypothesis" states that everything that's worth understanding about a complex system can be understood in terms of how it processes information. This is a new revolution that's occurring in science. Part of this revolution is being driven by the work and ideas of Seth Lloyd, a Professor of Mechanical Engineering at MIT. Last year, Lloyd published an article in the journal Nature -- "Ultimate Physical Limits to Computation" (vol. 406, no. 6788, 31 August 2000, pp. 1047-1054) -- in which he sought to determine the limits the laws of physics place on the power of computers.
"Over the past half century," he wrote, "the amount of information that computers are capable of processing and the rate at which they process it has doubled every 18 months, a phenomenon known as Moore's law. A variety of technologies -- most recently, integrated circuits -- have enabled this exponential increase in information processing power. But there is no particular reason why Moore's law should continue to hold: it is a law of human ingenuity, not of nature. At some point, Moore's law will break down. The question is, when?" His stunning conclusion? "The amount of information that can be stored by the ultimate laptop, 10 to the 31st bits, is much higher than the 10 to the 10th bits stored on current laptops. This is because conventional laptops use many degrees of freedom to store a bit whereas the ultimate laptop uses just one. There are considerable advantages to using many degrees of freedom to store information, stability and controllability being perhaps the most important. Indeed, as the above calculation indicates, to take full advantage of the memory space available, the ultimate laptop must turn all its matter into energy. A typical state of the ultimate laptop's memory looks like a plasma at a billion degrees Kelvin -- like a thermonuclear explosion or a little piece of the Big Bang! Clearly, packaging issues alone make it unlikely that this limit can be obtained, even setting aside the difficulties of stability and control." Ask Lloyd why he is interested in building quantum computers and you will get a two part answer. The first, and obvious one, he says, is "because we can, and because it's a cool thing to do." The second concerns some interesting scientific implications. "First," he says, "there are implications in pure mathematics, which are really quite surprising, that is that you can use quantum mechanics to solve problems in pure math that are simply intractable on ordinary computers." 
The second scientific implication, a use for quantum computers first suggested by Richard Feynman in 1982, is that one quantum system could simulate another quantum system. Lloyd points out that "if you've ever tried to calculate Feynman diagrams and do quantum dynamics, simulating quantum systems is hard. It's hard for a good reason, which is that classical computers aren't good at simulating quantum systems." Lloyd notes that Feynman suggested the possibility of making one quantum system simulate another. He conjectured that it might be possible to do this using something like a quantum computer. In the '90s Lloyd showed that in fact Feynman's conjecture was correct, and that not only could you simulate virtually any other quantum system if you had a quantum computer, but you could do so remarkably efficiently. So by using quantum computers, even quite simple ones, once again you surpass the limits of classical computers when you get down to, say, 30 or 40 bits in your quantum computer. You don't need a large quantum computer to get a huge speedup over classical simulations of physical systems. "A salt crystal has around 10 to the 17 possible bits in it," he points out. "As an example, let's take your own brain. If I were to use every one of those spins, the nuclear spins, in your brain that are currently being wasted and not being used to store useful information, we could probably get about 10 to the 28 bits there." Sitting with Lloyd in the Ritz Carlton Hotel in Boston, overlooking the tranquil Boston Public Gardens, I am suddenly flooded with fantasies of licensing arrangements regarding the nuclear spins of my brain. No doubt this would be a first in distributed computing. "You've got a heck of a lot of nuclear spins in your brain," Lloyd says. "If you've ever had magnetic resonance imaging, MRI, done on your brain, then they were in fact tickling those spins. What we're talking about in terms of quantum computing is just sophisticated 'spin tickling'."
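As a sanity check on the headline figures quoted from the Nature article, the speed limit for the one-kilogram "ultimate laptop" follows from a single formula, the Margolus-Levitin bound of roughly 2E/(pi h-bar) operations per second with all the laptop's mass converted to energy:

```python
import math

# Hedged sanity check of Lloyd's 'ultimate laptop' speed limit.
hbar = 1.055e-34          # J s
c = 2.998e8               # m/s
m = 1.0                   # kg, the one-kilogram laptop

E = m * c**2                           # about 9e16 joules
ops_per_second = 2 * E / (math.pi * hbar)
print(f"{ops_per_second:.2e}")         # ~5.4e50 operations per second
```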
This leads me to wonder how "spin tickling" fits into intellectual property law. How about remote access? Can you in theory designate and exploit people who would have no idea that their brains were being used for quantum computation? Lloyd points out that so far as we know, our brains don't pay any attention to these nuclear spins. "You could have a whole parallel computational universe going on inside your brain. This is, of course, fantasy. But hey, it might happen." "But it's not a fantasy to explore this question about making computers that are much, much, more powerful than the kind that we have sitting around now -- in which a grain of salt has all the computational powers of all the computers in the world. Having the spins in your brain have all the computational power of all the computers in a billion worlds like ours raises another question which is related to the other part of the research that I do." In the '80s, Lloyd began working on how large complex systems process information. How things process information at a very small scale, and how to make ordinary stuff, like a grain of salt or a cube of sugar, process information, relates to the complex systems work in his thesis that he did with the late physicist Heinz Pagels, his advisor at Rockefeller University. "Understanding how very large complex systems process information is the key to understanding how they behave, how they break down, how they work, what goes right and what goes wrong with them," he says. Science is being done in new and different ways, and the changes accelerate the exchange of ideas and the development of new ideas. Until a few years ago, it was very important for a young scientist to be "in the know" -- that is, to know the right people, because results were distributed primarily by preprints, and if you weren't on the right mailing list, then you weren't going to get the information, and you wouldn't be able to keep up with the field.
"Certainly in my field, and fundamental physics, and quantum mechanics, and physics of information," Lloyd notes, "results are distributed electronically, via electronic preprint servers, and they're available to everybody via the World Wide Web. Anybody who wants to find out what's happening right now in the field can go to [13]http://xxx.lanl.gov and find out. So this is an amazing democratization of knowledge which I think most people aren't aware of, and its effects are only beginning to be felt." "At the same time," he continues, "a more obvious way in which science has become public is that major newspapers such as The New York Times have all introduced weekly science sections in the last ten years. Now it's hard to find a newspaper that doesn't have a weekly section on science. People are becoming more and more interested in science, and that's because they realize that science impacts their daily lives in important ways." A big change in science is taking place, and that's that science is becoming more public -- that is, belonging to the people. In some sense, it's a realization of the capacity of science. Science in some sense is fundamentally public. "A scientific result is a result that can be duplicated by anybody who has the right background and the right equipment, be they a professor at M.I.T. like me," he points out, "or be they from an underdeveloped country, or be they an alien from another planet, or a robot, or an intelligent rat. Science consists exactly of those forms of knowledge that can be verified and duplicated by anybody. So science is basically, at its most fundamental level, a public form of knowledge, a form of knowledge that is in principle accessible to everybody. Of course, not everybody's willing to go out and do the experiments, but for the people who are willing to go out and do that -- if the experiments don't work, then it means it's not science.
"This democratization of science, this making it public, is in a sense the realization of a promise that science has held for a long time. Instead of having to be a member of the Royal Society to do science, the way you had to be in England in the 17th and 18th centuries, today pretty much anybody who wants to do it can, and the information that they need to do it is there. This is a great thing about science. That's why ideas about the third culture are particularly apropos right now, as you are concentrating on scientists trying to take their case directly to the public. Certainly, now is the time to do it." --JB SETH LLOYD is an Associate Professor of Mechanical Engineering at MIT and a principal investigator at the Research Laboratory of Electronics. He is also adjunct assistant professor at the Santa Fe Institute. He works on problems having to do with information and complex systems from the very small -- how do atoms process information, how can you make them compute, to the very large -- how does society process information? And how can we understand society in terms of its ability to process information? [7.23.01] SETH LLOYD: Computation is pervading the sciences. I believe it began about 400 years ago, if you look at the first paragraph of Hobbes's famous book Leviathan. He says that just as we consider the human body to be like a machine, like a clock where you have sinews and muscles to move energy about, a pulse beat like a pendulum, and a heart that pumps energy in, similar to the way a weight supplies energy to a clock's pendulum, then we can consider the state to be analogous to the body, since the state has a prince at its head, people who form its individual portions, legislative bodies that form its organs, etc. In that case, Hobbes asked, couldn't we consider the state itself to have an artificial life? To my knowledge that was the first use of the phrase artificial life in the form that we use it today.
If we have a physical system that's evolving in a physical way, according to a set of rules, couldn't we consider it to be artificial and yet living? Hobbes wasn't talking about information processing explicitly, but the examples he used were, in fact, examples of information processing. He used the example of the clock as something that is designed to process information, as something that gives you information about time. Most pieces of the clock that he described are devices not only for transforming energy, but actually for providing information. For example, the pendulum gives you regular, temporal information. When he next discusses the state and imagines it having an artificial life, he first talks about the brain, the seat of the state's thought processes, and that analogy, in my mind, accomplishes two things. First, Hobbes is implicitly interested in information. Second, he is constructing the fundamental metaphor of scientific and technological inquiry. When we think of a machine as possessing a kind of life in and of itself, and when we think of machines as doing the same kinds of things that we ourselves do, we are also thinking the corollary, that is, we are doing the same kinds of things that machines do. This metaphor, one of the most powerful of the Enlightenment, in some sense pervaded the popular culture of that time. Eventually, one could argue, that metaphor gave rise to Newton's notions of creating a dynamical picture of the world. The metaphor also gave rise to the great inquiries into thermodynamics and heat, which came 150 years later, and, in some ways, became the central mechanical metaphor that has informed all of science up to the 20th century. The real question is, when did people first start talking about information in such terms that information processing rather than clockwork became the central metaphor for our times? 
Because until the 20th century, this Enlightenment mode of thinking of physical things such as mechanical objects with their own dynamics as being similar to the body or the state was really the central metaphor that informed much scientific and technological inquiry. People didn't start thinking about this mechanical metaphor until they began building machines, until they had some very good examples of machines, like clocks for instance. The 17th century was a fantastic century for clockmaking, and in fact, the 17th and 18th centuries were fantastic centuries for building machines, period. Just as people began conceiving of the world using mechanical metaphors only when they had themselves built machines, people began to conceive of the world in terms of information and information-processing only when they began dealing with information and information processing. All the mathematical and theoretical materials for thinking of the world in terms of information, including all the basic formulas, were available at the end of the 19th century, because all these basic formulas had been created by Maxwell, Boltzmann and Gibbs for statistical mechanics. The formula for information was known back in the 1880s, but people didn't know that it dealt with information. Instead, because they were familiar with things like heat and mechanical systems that processed heat, they called information in its mechanical or thermodynamic manifestation, entropy. It wasn't until the 1930s that people like Claude Shannon and Norbert Wiener, and before them Harry Nyquist, started to think about information processing for the primary purpose of communication, or for the purposes of controlling systems so that the role of information and feedback could be controlled. Then came the notion of constructing machines that actually processed information.
Babbage tried to construct one back in the early 19th century, which was a spectacular and expensive failure, and one which did not enter into the popular mainstream. Another failure concerns the outgrowth of the wonderful work regarding Cybernetics in other fields such as control theory, back in the late 1950s, early 1960s, when there was this notion that cybernetics was going to solve all our problems and allow us to figure out how social systems work, etc. That was a colossal failure -- not because that idea was necessarily wrong, but because the techniques for doing so didn't exist at that point -- and, if we're realistic, may in fact never exist. The applications of Cybernetics that were spectacularly successful are not even called Cybernetics because they're so ingrained in technology, in fields like control theory, and in the aerospace techniques that were used to put men on the moon. Those were the great successes of Cybernetics, remarkable successes, but in a more narrow technological field. This brings us to the Internet, which in some sense is almost like Anti-Cybernetics, the evil twin of Cybernetics. The word Cybernetics comes from the Greek word kybernetes, which means a governor -- helmsman, actually; the kybernetes was the pilot of a ship. Cybernetics, as initially conceived, was about governing, or controlling, or guiding. The great thing about the Internet, as far as I'm concerned, is that it's completely out of control. In some sense the fact of the Internet goes way beyond and completely contradicts the Cybernetic ideal. But, in another sense -- the way in which the Internet and cybernetics are related, Cybernetics was fundamentally on the right track. As far as I'm concerned what's really going on in the world is that there's a physical world where things happen. I'm a physicist by training and I was taught to think of the world in terms of energy, momentum, pressure, entropy.
You've got all this energy, things are happening, things are pushing on other things, things are bouncing around. But that's only half the story. The other half of the story, its complementary half, is the story about information. In one way you can think about what's going on in the world as energy, stuff moving around, bouncing off each other -- that's the way people have thought about the world for over 400 years, since Galileo and Newton. But what was missing from that picture was what that stuff was doing: how, why, what? These are questions about information. What is going on? It's a question about information being processed. Thinking about the world in terms of information is complementary to thinking about it in terms of energy. To my mind, that is where the action is, not just thinking about the world as information on its own, or as energy on its own, but looking at the confluence of information and energy and how they play off against each other. That's exactly what Cybernetics was about. Wiener, who is the real father of the field of Cybernetics, conceived of Cybernetics in terms of information, things like feedback control. How much information, for example, do you need to make something happen? The first people studying these problems were scientists who happened to be physicists, and the first person who was clearly aware of the connection between information, entropy, and physical quantities like energy was Maxwell. Maxwell, in the 1850s and 60s, was the first person to write down formulas that related what we would now call information -- ideas of information -- to things like energy and entropy. He was also the first person to make such an explicit connection. He also had this wonderfully evocative, far-out, William Gibsonesque notion of a demon. "Maxwell's Demon" is this hypothetical being that was able to look very closely at the molecules of gas whipping around in a room, and then rearrange them.
Maxwell even came up with a model in which the demon was sitting at a partition, a tiny door, between two rooms and he could open and shut this door very rapidly. If he saw fast molecules coming from the right and slow molecules coming from the left, then he'd open the door and let the fast molecules go in the lefthand side, and let the slow molecules go into the righthand side. And since Maxwell already knew about this connection between the average speed of molecules and entropy, and he also knew that entropy had something to do with the total number of configurations, the total number of states a system can have, he pointed out that if the demon continues to do this, the stuff on the lefthand side will get hot, and the stuff on the righthand side will get cold, because the molecules over on the left are fast, and the molecules on the right are slow. He also pointed out that there is something screwy about this because the demon is doing something that shouldn't take a lot of effort since the door can be as light as you want, the demon can be as small as you want, the amount of energy you use to open and shut the door can be as small as you desire, and yet somehow the demon is managing to make something hot on one side. Maxwell pointed out that this was in violation of all the laws of thermodynamics -- in particular the second law of thermodynamics which says that if you've got a hot thing over here and a cold thing over there, then heat flows from the hot thing to the cold thing, and the hot thing gets cooler and the cold thing gets hotter, until eventually they end up the same temperature. And it never happens the opposite way. You never see something that's all the same temperature spontaneously separate into hot and cold. Maxwell pointed out that there was something funny going on, that there was this connection between entropy and this demon who was capable of processing information.
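The thought experiment is easy to dramatize in code. The sketch below is a toy model of my own (the "door" logic is a simplification): the demon uses the median speed as its cutoff and sorts molecules between two rooms, and the only thing it ever does is process information about each molecule's speed:

```python
import random

rng = random.Random(0)

# Molecules with random speeds, initially well mixed between two rooms.
speeds = [rng.expovariate(1.0) for _ in range(2000)]
left = speeds[0::2]
right = speeds[1::2]
threshold = sorted(speeds)[len(speeds) // 2]   # the demon's speed cutoff

mean = lambda room: sum(room) / len(room)
print("before:", round(mean(left), 2), round(mean(right), 2))

# The demon watches molecules approach the door and opens it only for
# fast ones moving left and slow ones moving right.
for _ in range(40000):
    if rng.random() < 0.5 and right:
        i = rng.randrange(len(right))
        if right[i] > threshold:       # fast: admit it to the left room
            left.append(right.pop(i))
    elif left:
        i = rng.randrange(len(left))
        if left[i] < threshold:        # slow: push it to the right room
            right.append(left.pop(i))

print("after:", round(mean(left), 2), round(mean(right), 2))
# The left room ends up hot and the right room cold, even though the
# demon never added energy; it only sorted.
```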
To put it all in perspective, as far as I can tell, the main thing that separates humanity from most other living things is the way that we deal with information. Somewhere along the line we developed sophisticated mechanisms for communicating via speech. Somewhere along the line we developed natural language, which is a universal method for processing information: anything that can be imagined can be described in language, and anything that can be said can be said using it. That probably happened around a hundred thousand years ago, and since then the history of human beings has been the development of ever more sophisticated ways of registering, processing, transforming, and dealing with information. Through this, society has arrived at forms of organization that are totally wild compared with the organizational structures of most other species, and that is what makes the human species distinctive, if anything does. In some sense we're just like any ordinary species out there; the extent to which we are different has to do with having more sophisticated methods for processing information. Something else has happened with computers. We have created these devices, which can already register and process huge amounts of information -- a significant fraction of the amount of information that human beings themselves, as a species, can process. When I think of all the information being processed there, all the information being communicated back and forth over the Internet, or even just all the information that you and I can communicate back and forth by talking, and I look at the total amount of information being processed by human beings and their artifacts, I conclude that we are at a very interesting point of human history: the stage where our artifacts will soon be processing more information than we physically will be able to process ourselves.
So I have to ask, how many bits am I processing per second in my head? I can estimate it: with around ten billion neurons, it comes to something like 10 to the 15 bits per second -- around a million billion bits per second. Hell if I know what it all means -- we're going to find out. That's the great thing. We're going to be around to find out some of what this means. If you think that information processing is where the action is, it may mean in fact that human beings are not going to be where the action is anymore. On the other hand, given that we are the people who created the devices that are doing this mass of information processing, we, as a species, are uniquely poised to make our lives interesting and fun in completely unforeseen ways. Every physical system, just by existing, can register information. And every physical system, just by evolving according to its own peculiar dynamics, can process that information. I'm interested in how the world registers information and how it processes it. Of course, one way of thinking about all of life and civilization is as being about how the world registers and processes information. Certainly that's what sex is about; that's what history is about. But since I'm a scientist who deals with the physics of how things process information, I'm interested in that notion in a more specific way. I want to figure out not only how the world processes information, but how much information it's processing. I've recently been working on methods to assign numerical values to how much information is being processed, just by ordinary physical dynamics. This is very exciting for me, because I've been working in this field for a long time trying to come up with mathematical techniques for characterizing how things process information, and how much information they're processing.
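One crude way to reach the ten-to-the-15 figure is a back-of-the-envelope multiplication. The per-neuron numbers below are illustrative assumptions made up for this sketch, not measurements, and other breakdowns reach the same order of magnitude.

```python
# Back-of-the-envelope route to the ~1e15 bits/s figure in the text.
# The per-neuron numbers are illustrative assumptions, not measurements.
neurons = 1e10              # "around ten billion neurons"
synapses_per_neuron = 1e4   # assumed
firings_per_second = 1e1    # assumed average rate, ~10 Hz
bits_per_second = neurons * synapses_per_neuron * firings_per_second
print(f"{bits_per_second:.0e} bits/s")  # 1e+15 bits/s
```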
About a year or two ago, I got the idea of asking the question, given the fundamental limits on how the world is put together -- (1) the speed of light, which limits how fast information can get from one place to another, (2) Planck's constant, which tells you what the quantum scale is, how small things can actually get before they disappear altogether, and finally (3) the last fundamental constant of nature, which is the gravitational constant, which essentially tells you how large things can get before they collapse on themselves -- how much information can possibly be processed. It turned out that the difficult part of this question was thinking it up in the first place. Once I'd managed to pose the question, it only took me six months to a year to figure out how to answer it, because the basic physics involved was pretty straightforward. It involved quantum mechanics, gravitation, perhaps a bit of quantum gravity thrown in, but not enough to make things too difficult. The other motivation for trying to answer this question was to analyze Moore's Law. Many of our society's prized objects are the products of this remarkable law of miniaturization -- people have been getting extremely good at making the components of systems extremely small. This is what's behind the incredible increase in the power of computers, what's behind the amazing increase in information technology and communications, such as the Internet, and it's what's behind pretty much every advance in technology you can possibly think of -- including fields like material science. I like to think of this as the most colossal land grab that's ever been done in the history of mankind. 
From an engineering perspective, there are two ways to make something bigger. One is to make it physically bigger (and human beings have spent a lot of time making things physically bigger: working out ways to deliver more power to systems, to build bigger buildings, to expand territory, to invade other cultures and take over their territory, and so on). But there's another way to make things bigger, and that's to make things smaller. Because the real size of a system is not how big it actually is; the real size is the ratio between the biggest part of a system and the smallest part of a system -- or really, the smallest part that you can actually put to use in doing things. For instance, the reason that computers are so much more powerful today than they were ten years ago is that every year and a half or so, the basic components of computers -- the basic wires, logic chips, and so on -- have gone down in size by a factor of two. This is known as Moore's Law, which is just a historical fact about the development of technology. Every time a component's size goes down by a factor of two, you can cram twice as many of them into a box, and so every couple of years or so the power of computers doubles, and over the course of fifty years the power of computers has gone up by a factor of a million or more. The world has gotten a million times bigger because we've been able to make the smallest parts of the world a million times smaller. This makes it an exciting time to live in, but a reasonable question to ask is, where is all this going to end? Since Gordon Moore proposed it in 1965, Moore's Law has been written off numerous times. It was written off in the early 1970s because people thought that fabrication techniques for integrated circuits were going to break down and you wouldn't be able to get things smaller than a scale size of ten microns.
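The compounding above is easy to check. A minimal sketch, taking the text's "every year and a half or so" and "every couple of years" as the two bracketing assumptions:

```python
# Compound doubling: halve the component size every 1.5-2 years and the
# power of the machine doubles; over fifty years that compounds enormously.
for period_years in (1.5, 2.0):
    doublings = 50 / period_years
    growth = 2 ** doublings
    print(f"doubling every {period_years} yr -> factor of {growth:.1e} in 50 yr")
```

Even on the slower two-year schedule the fifty-year factor is over thirty million, consistent with the text's "a million or more".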
Now Moore's Law is being written off again because people say that the insulating barriers between wires in computers are getting to be only a few atoms thick, and when an insulator is only a few atoms thick, electrons can tunnel through it and it's not a very good insulator. Well, perhaps that will stop Moore's Law, but so far nothing has stopped it. Must Moore's Law stop at some point? This question involves the ultimate physical limits to computation: you can't send signals faster than the speed of light, you can't make things smaller than the laws of quantum mechanics allow, and if you make things too big, they just collapse into one giant black hole. As far as we know, it's impossible to fool Mother Nature. I thought it would be interesting to see what the basic laws of physics said about how fast, how small, and how powerful computers can get. Actually these two questions -- given the laws of physics, how powerful can computers be, and where must Moore's Law eventually stop -- turn out to be exactly the same, because they stop at the same place, which is where every available physical resource is used to perform computation. So every little subatomic particle, every ounce of energy, every photon in your system -- everything is being devoted to performing a computation. The question is, how much computation is that? In order to investigate this, I thought that a reasonable form of comparison would be to look at what I call the ultimate laptop, and ask just how powerful this computer could be. The idea here is that we can relate the laws of physics and the fundamental limits of computation to something that we are familiar with -- something of human scale that has a mass of about a kilogram, like a nice laptop computer, and about a liter in volume, because a kilogram and a liter are a reasonable size to hold in your lap, to look at, to put in your briefcase, et cetera.
After working on this for nearly a year, what I was able to show was that the laws of physics give absolute answers to how much information you could process with a kilogram of matter confined to a volume of one liter. Not only that: surprisingly, or perhaps not so surprisingly, the amount of information that can be processed -- the number of bits that you could register in the computer and the number of operations per second that you could perform on these bits -- is related to basic physical quantities and to the aforementioned constants of nature, the speed of light, Planck's constant, and the gravitational constant. In particular, you can show without much trouble that the number of ops per second -- the number of basic logical operations per second that you can perform using a certain amount of matter -- is proportional to the energy of this matter. For those readers who are technically minded, it's not very difficult to whip out the famous formula E = mc^2 and show, using work of Norm Margolus and Lev Levitin here in Boston, that the total number of elementary logical operations that you can perform per second using a kilogram of matter is the amount of energy, mc^2, times two, divided by pi times h-bar, Planck's reduced constant. Well, you don't have to be Einstein to do the calculation: the mass is one kilogram, the speed of light is 3 times ten to the eighth meters per second, so mc^2 is about ten to the 17th joules -- quite a lot of energy (I believe it's roughly the amount of energy used by all the world's nuclear power plants in the course of a week or so), but let's suppose you could use it to do a computation. So you've got ten to the 17th joules, and h-bar, the quantum scale, is ten to the minus 34 joule-seconds, roughly. So there you go: I take ten to the 17th joules, divide by ten to the minus 34 joule-seconds, and I have the number of ops: ten to the 51 ops per second.
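The arithmetic above fits in a few lines. This just evaluates the Margolus-Levitin expression 2E/(pi h-bar) quoted in the text for a one-kilogram mass:

```python
import math

# Margolus-Levitin bound: ops per second <= 2 * E / (pi * hbar).
c = 2.9979e8       # speed of light, m/s
hbar = 1.0546e-34  # reduced Planck constant, J*s
m = 1.0            # mass of the ultimate laptop, kg

E = m * c ** 2                       # ~9e16 joules locked up in one kilogram
ops_per_sec = 2 * E / (math.pi * hbar)
print(f"E = {E:.4e} J, max ops/s = {ops_per_sec:.4e}")  # ~5.43e50 ops per second
```

Dropping the factor of 2/pi, as the text does when it divides ten to the 17th by ten to the minus 34, gives the same round figure of ten to the 51.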
So you can perform 10 to the 51 operations per second, and ten to the 51 is about a million billion billion billion billion billion ops per second -- a lot faster than conventional laptops. And this is the answer: you can't do any better than that, so far as the laws of physics are concerned. Of course, since publication of the Nature article, people keep calling me up to order one of these laptops; unfortunately the fabrication plant to build it has not yet been constructed. You might then ask why our conventional laptops are so slow by comparison when we've been on this Moore's Law track for 50 years now. The answer is that they make the mistake, which could be regarded as a safety feature of the laptop, of locking up most of their energy in the form of matter, so that rather than using that energy to manipulate information and transform it, most of it goes into making the laptop sit around and be a laptop. As you can tell, if I were to take a week's energy output of all the world's nuclear power plants and liberate it all at once, I would have something that looked a lot like a thermonuclear explosion, because a thermonuclear explosion is essentially what you get when you take roughly a kilogram of matter and turn it into energy. So you can see right away that the ultimate laptop would have some relatively severe packaging problems. Since I am a professor of mechanical engineering at MIT, I think packaging problems are where it's at. We're talking about some very severe material and fabrication problems to prevent this thing from taking not only you but the entire city of Boston out with it when you boot it up the first time. Needless to say, we didn't actually figure out how we were going to put this thing into a package, but that's part of the fun of doing calculations according to the ultimate laws of physics. We decided to figure out how many ops per second we could perform, and to worry about the packaging afterwards.
Now that we've got 10 to the 51 ops per second, the next question is: what's the memory space of this laptop? When I go out to buy a new laptop, I first ask how many ops per second it can perform. If it's something like a hundred megahertz, it's pretty slow by current standards; if it's a gigahertz, that's pretty fast, though we're still very far away from the 10 to the 51 ops per second. With a gigahertz, we're approaching 10 to the 10th, 10 to the 11th, 10 to the 12th ops per second, depending on how ops per second are currently counted. Next, how many bits do I have -- how big is the hard drive for this computer, or how big is its RAM? We can also use the laws of physics to calculate that figure. And computing memory capacity is something that people could have done back in the early decades of the twentieth century. We know how to count bits. We take the number of states, and the number of states is two raised to the power of the number of bits. Ten bits, two to the tenth states, 1024 states. Twenty bits, two to the 20th states, roughly a million states. You keep on going and you find that with about 300 bits -- two to the 300, which is about ten to the 90 -- you have more states than there are particles in the observable universe, which is commonly estimated to contain something like ten to the 80 particles. If you had 300 bits, you could assign every particle in the universe a serial number, which is a powerful use of information: you can use a very small number of bits to label a huge number of things. How many bits does this ultimate laptop have? I have a kilogram of matter confined to the volume of a liter; how many possible states for matter confined to the volume of a liter can there be? This happened to be a calculation that I knew how to do, because I had studied cosmology, and in cosmology there's this event, called the Big Bang, which happened about 13 billion years ago, and during the Big Bang, matter was at extremely high densities and pressures.
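The bits-and-states bookkeeping above can be checked directly. This sketch just exercises the rule that states = 2 to the number of bits, and that logarithms convert back:

```python
import math

# 2 to the (number of bits) states; log base 2 of the states gives bits back.
assert 2 ** 10 == 1024                     # ten bits
assert round(math.log10(2 ** 300)) == 90   # 2^300 is about 10^90

# ~300 bits suffice to serial-number every particle in the observable
# universe, taking the commonly quoted estimate of ~1e80 to 1e90 particles.
bits_needed = math.ceil(math.log2(10 ** 90))
print(bits_needed)  # 299
```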
I learned from cosmology how to calculate the number of states for matter at very high densities and pressures. In actuality, the density here is not that great -- I have a kilogram of matter in a liter, a density similar to what we might normally encounter today. However, if you want to ask what the number of states is for this matter in a liter, I've got to count every possible configuration, every possible elementary quantum state, for this kilogram of matter in a liter of volume. It turns out, when you count most of these states, that this matter looks like it's in the midst of a thermonuclear explosion -- like a little piece of the Big Bang, a few seconds after the universe was born, when the temperature was around a billion degrees. If you ask what most of the states for this matter look like when it's completely liberated and able to do whatever it wants, you'll find that it looks a lot like a plasma at a billion degrees Kelvin: electrons and positrons are forming out of nothing and going back into photons again, there are lots of elementary particles whizzing about, and it's very hot. Lots of stuff is happening, but you can still count the number of possible states using the conventional methods that people use to count states in the early universe. You take the logarithm of the number of states and get a quantity that's normally thought of as the entropy of the system (the entropy is simply the logarithm of the number of states; taking the logarithm base 2 gives you the number of bits, since 2 raised to the power of the number of bits is the number of states). When we count them, we find that there are roughly 10 to the 31 bits available. That means that there are 2 to the 10 to the 31 possible states that this matter could be in. That's a lot of states -- but we can count them.
The interesting thing is this: we've got 10 to the 31 bits, and we're performing 10 to the 51 ops per second, so each bit can perform about 10 to the 20 ops per second. What does this quantity mean? It turns out that the number of ops per second per bit is essentially set by the temperature of this plasma. I take the temperature, multiply it by Boltzmann's constant, and what I get is essentially the energy per bit; that's what temperature is -- it tells you how much energy is available for a bit to perform a logical operation. And since a certain amount of energy lets me perform a certain number of operations per second, dividing that energy by Planck's constant tells me how many ops per bit per second I can perform. Then I know not only the number of ops per second and the number of bits, but also the number of ops per bit per second that can be performed by this ultimate laptop, a kilogram of matter in a liter volume; it's the number of ops per bit per second that could be performed by these elementary particles back at the beginning of time, at the Big Bang; it's just the total number of ops that each bit can perform per second -- the number of times it can flip, the number of times it can interact with its neighboring bits, the number of elementary logical operations. And it's a number, right? 10 to the 20. Just the way that the total number of bits, 10 to the 31, is a number -- a physical parameter that characterizes a kilogram of matter in a liter of volume. Similarly, 10 to the 51 ops per second is the number of ops per second that characterizes a kilogram of matter, whether it's in a liter volume or not. We've gone a long way down this road, so there's no point in stopping -- at least in these theoretical exercises where nobody gets hurt.
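The two routes to the ten-to-the-20 figure can be compared numerically. A sketch, taking the text's round numbers for the total ops and bits, and assuming the billion-degree plasma temperature quoted earlier:

```python
import math

k_B = 1.3805e-23   # Boltzmann's constant, J/K
hbar = 1.0546e-34  # reduced Planck constant, J*s

total_ops = 1e51   # ops per second for the 1-kg laptop (from the text)
total_bits = 1e31  # memory in bits (from the text)

ops_per_bit = total_ops / total_bits       # 1e20 flips per bit per second
# k_B * T is roughly the energy per bit, so the Margolus-Levitin rate
# 2 * k_B * T / (pi * hbar) at T ~ 1e9 K lands on the same ~1e20 figure.
T = 1e9
rate_from_T = 2 * k_B * T / (math.pi * hbar)
print(f"{ops_per_bit:.0e} vs {rate_from_T:.1e} ops per bit per second")
```

The two numbers agree to within an order-unity factor, which is the sense in which temperature "tells you the number of ops per bit per second".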
So far all we've used are the elementary constants of nature. The speed of light tells us the rate of converting matter into energy, via E = mc^2 -- how much energy we get from a particular mass. Then we use the Planck scale, the quantum scale, because the quantum scale tells you both how many operations per second you can get from a certain amount of energy, and how to count the number of states available for a certain amount of energy. So by taking the speed of light and the quantum scale, we are able to calculate the number of ops per second that a certain amount of matter can perform, and the amount of memory space that we have available for our ultimate computer. Then we can also calculate all sorts of interesting quantities, like the possible input-output rate for all these bits in a liter of volume. That can actually be calculated quite easily from what I've just described, because to get all this information into and out of a liter volume -- take a laptop computer -- you can say, okay, here are all these bits sitting in a liter volume; let's move this liter volume over, by its own distance, at the speed of light. You're not going to be able to get the information in or out faster than that. We can find out how many bits per second we can get into and out of our ultimate laptop, and we find we can get around 10 to the 40, or 10 to the 41, or perhaps, in honor of Douglas Adams and his mystical number 42, even 10 to the 42 bits per second in and out. So you can calculate all these different parameters that you might think are interesting, and that tells you how good a modem you could possibly have for this ultimate laptop -- how many bits per second you could get in and out over the ultimate Internet, whatever the ultimate Internet would be. I guess the ultimate Internet is just spacetime itself in this picture.
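The move-the-box-over-at-light-speed argument is a one-line estimate. A sketch, using the text's round figure of ten to the 31 bits and a ten-centimeter box:

```python
# Input-output bound: all 1e31 bits must cross the laptop's own size
# (a liter box is ~0.1 m on a side) at no more than the speed of light.
c = 2.9979e8   # m/s
L = 0.1        # m
bits = 1e31

crossing_time = L / c              # ~3.3e-10 s
io_rate = bits / crossing_time     # ~3e40 bits per second
print(f"{io_rate:.1e} bits/s")
```

The estimate lands at about 3 times ten to the 40; order-unity geometric factors account for the spread between ten to the 40 and ten to the 42 quoted in the text.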
I noted that you can't possibly do better than this, right? These are the laws of physics. But you might be able to do better in other ways. For example, let's think about the architecture of this computer. I've got this computer that's doing 10 to the 51 ops per second on 10 to the 31 bits. Each bit can flip 10 to the 20 times per second. That's pretty fast. The next question is how long it takes a bit on this side of the computer to send a signal to a bit on that side of the computer in the course of the time it takes to do an operation. As we've established, it has a liter volume, which is about ten centimeters on each side, so it takes about ten to the minus ten seconds for light to go from one side to the other -- a few ten-billionths of a second. These bits are flipping 10 to the 20 times per second -- a hundred billion billion times per second. So a bit flips ten billion times in the course of the time it takes a signal to go from one side of the computer to the other. This is not a very serial computation: a lot of action is taking place over here in the time it takes this side to communicate with the action taking place over on the other side of the computer. This is what's called a parallel computation. You could say that at the kinds of densities of matter that we're familiar with -- like a kilogram per liter, which is the density of water -- we can only perform a very parallel computation if we operate at the ultimate limits of computation; lots of computational action takes place over here during the time it takes a signal to go from here to there and back again. How can we do better? How could we make the computation more serial? Let's suppose that we want our machine to do a more serial computation, so that in the time it takes to send a signal from one side of the computer to the other, fewer ops are being done.
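The ten-billion-flips figure comes straight out of the two numbers already in hand. A sketch of the arithmetic:

```python
# How parallel is the ultimate laptop? Count the flips one bit performs
# in the time a light-speed signal needs to cross the machine.
c = 2.9979e8        # m/s
L = 0.1             # m, one side of the liter box
ops_per_bit = 1e20  # flips per bit per second (from the text)

crossing_time = L / c                            # ~3.3e-10 s
flips_per_crossing = ops_per_bit * crossing_time
print(f"{flips_per_crossing:.1e} flips per crossing")  # ~3e10: tens of billions
```

The bigger this number, the more parallel the computation is forced to be; shrinking the box drives it down.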
The obvious solution is to make the computer smaller, because if I make the computer smaller by a factor of two, it only takes half the time for light -- for a signal, for information -- to go from one side of the computer to the other. If I make it smaller by a factor of ten billion, it only takes one ten-billionth of the time for a signal to cross it. You also find that when you make it smaller, the pieces of the computer tend to speed up, because you tend to have more energy available per bit. If you go through the calculation you find that as the computer gets smaller and smaller, as all the mass is compressed into a smaller and smaller volume, you can do a more serial computation. When does this process stop? When can every bit in the computer talk with every other bit in the course of the time it takes a bit to flip? When can everybody talk with everybody else in the same amount of time it takes them to talk with their neighbors? As you make the computer smaller and smaller, it gets denser and denser, a kilogram of matter in an ever smaller volume. Eventually the matter assumes stranger and stranger configurations, and it takes a very high pressure to keep the system down at this very small volume; it tends to get hotter and hotter, until at a certain point a bad thing happens: it's no longer possible for light to escape from it -- it becomes a black hole. What happens to our computation at this point? This is probably very bad for the computation, right? Or rather, it's going to be bad for input-output. Input is fine, because stuff still goes in, but output is bad, because nothing comes out of a black hole.
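The size at which the shrinking stops is the Schwarzschild radius, 2Gm/c^2. A quick check for the one-kilogram laptop:

```python
# Schwarzschild radius of a one-kilogram mass: compress the laptop below
# this size and light can no longer escape -- it is a black hole.
G = 6.673e-11  # gravitational constant, m^3 kg^-1 s^-2
c = 2.9979e8   # speed of light, m/s
m = 1.0        # kg

r_s = 2 * G * m / c ** 2   # ~1.5e-27 m, the 1e-27 m scale quoted later
volume_scale = r_s ** 3    # ~3e-81 m^3, the text's 1e-81 up to a geometric factor
print(f"r_s = {r_s:.2e} m, volume ~ {volume_scale:.0e} m^3")
```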
Luckily, however, we're saved by the very laws of quantum mechanics that we were using to calculate how much information a physical system can register, how fast it can perform computations, and how much information it can process. Stephen Hawking showed in the 1970s that black holes, if you treat them quantum-mechanically, actually radiate. There's an interesting controversy as to whether that radiation carries any information about what went in. Stephen Hawking and John Preskill have a famous bet: Preskill says yes -- the information that comes out of a black hole reflects the information that went in. Hawking says no -- the information that comes out of a black hole when it radiates doesn't have anything to do with the information that went in; the information that went in goes away. I don't know the answer to this. But let's suppose for a moment that Hawking is wrong and Preskill is right. Let's suppose that the information that comes out of a black hole when it evaporates is radiated at a wavelength comparable to the radius of the black hole. This kilogram black hole is really radiating at a whopping rate; it's radiating out photons with wavelengths of 10 to the minus 27 meters. This is not something you would wish to be close to -- it would be very dangerous; in fact it would look a lot like a huge explosion. But let's suppose that the information being radiated out by the black hole is in fact the information that went in to construct it, simply transformed in a particular way. What you then see is that the black hole can be thought of, in some sense, as performing a computation.
You take the information about the matter that's used to form the black hole, you program it in the sense that you give it a particular configuration -- you put this electron here, you put that electron there, you make that thing vibrate like this -- and then you collapse it into a black hole. Around 10 to the minus 19 seconds later, in a ten billion billionth of a second, the thing goes kablooey, and you get all this information out again, but now the information has been transformed by some dynamics into a new form -- and we don't know what this dynamics is; we would need something like string theory or quantum gravity to figure out how it's been transformed. But you can imagine that this could in fact function as a computer. We don't know how to make it compute, but indeed, it's taking in information, it's transforming it in a systematic way according to the laws of physics, and then, poof! It spits it out again. It's a dangerous thing -- the ultimate laptop was already pretty dangerous, because it looked like a thermonuclear explosion inside a liter bottle of Coca-Cola. This is even worse, because it starts out at a radius of 10 to the minus 27 meters -- one billion billion billionth of a meter -- so it's radiating at a truly massive rate. But suppose you could somehow read the information coming out of the black hole. You would then have performed the ultimate computation that you could perform using a kilogram of matter, in this case confined to a volume of 10 to the minus 81 cubic meters. Pretty minuscule, but we're allowed to imagine this happening. Is there anything more to the story?
After writing my paper on the ultimate laptop in Nature, I realized this was insufficiently ambitious; the obvious question to ask at this point is not what is the ultimate computational capacity of a kilogram of matter, but what is the ultimate computational capacity of the universe as a whole? After all, the universe is processing information, right? Just by existing, all physical systems register information, and just by evolving according to their own natural physical dynamics, they transform that information -- they process it. So the question then is: how much information has the universe processed since the Big Bang?

[Now to the famous article by Seth Lloyd.]

-------------------

Over the past half century, the amount of information that computers are capable of processing and the rate at which they process it have doubled every 18 months, a phenomenon known as Moore's law. A variety of technologies--most recently, integrated circuits--have enabled this exponential increase in information processing power. But there is no particular reason why Moore's law should continue to hold: it is a law of human ingenuity, not of nature. At some point, Moore's law will break down. The question is, when? The answer to this question will be found by applying the laws of physics to the process of computation (1-85). Extrapolation of current exponential improvements over two more decades would result in computers that process information at the scale of individual atoms. Although an Avogadro-scale computer that can act on 10^23 bits might seem implausible, prototype quantum computers that store and process information on individual atoms have already been demonstrated (64,65,76-80).
Existing quantum computers may be small and simple, and able to perform only a few hundred operations on fewer than ten quantum bits or 'qubits', but the fact that they work at all indicates that there is nothing in the laws of physics that forbids the construction of an Avogadro-scale computer. The purpose of this article is to determine just what limits the laws of physics place on the power of computers. At first, this might seem a futile task: because we do not know the technologies by which computers 1000, 100, or even 10 years in the future will be constructed, how can we determine the physical limits of those technologies? In fact, I will show that a great deal can be determined concerning the ultimate physical limits of computation simply from knowledge of the speed of light, c = 2.9979 x 10^8 ms^-1, Planck's reduced constant, h-bar = h/2pi = 1.0545 x 10^-34 Js, and the gravitational constant, G = 6.673 x 10^-11 m^3 kg^-1 s^-2. Boltzmann's constant, k-sub-B = 1.3805 x 10^-23 J K^-1, will also be crucial in translating between computational quantities such as memory space and operations per bit per second, and thermodynamic quantities such as entropy and temperature. In addition to reviewing previous work on how physics limits the speed and memory of computers, I present results--which are new except as noted--of the derivation of the ultimate speed limit to computation, of trade-offs between memory and speed, and of the analysis of the behaviour of computers at physical extremes of high temperatures and densities. Before presenting methods for calculating these limits, it is important to note that there is no guarantee that these limits will ever be attained, no matter how ingenious computer designers become. Some extreme cases such as the black-hole computer described below are likely to prove extremely difficult or impossible to realize.
Human ingenuity has proved great in the past, however, and before writing off physical limits as unattainable, we should realize that certain of these limits have already been attained within a circumscribed context in the construction of working quantum computers. The discussion below will note obstacles that must be sidestepped or overcome before various limits can be attained. Energy limits speed of computation To explore the physical limits of computation, let us calculate the ultimate computational capacity of a computer with a mass of 1 kg occupying a volume of 1 litre, which is roughly the size of a conventional laptop computer. Such a computer, operating at the limits of speed and memory space allowed by physics, will be called the 'ultimate laptop' (Fig. 1). [Figure 1 The ultimate laptop. The 'ultimate laptop' is a computer with a mass of 1 kg and a volume of 1 liter operating at the fundamental limits of speed and memory capacity fixed by physics. The ultimate laptop performs 2mc^2/pi h-bar = 5.4258 x 10^50 logical operations per second on ~10^31 bits. Although its computational machinery is in fact in a highly specified physical state with zero entropy, while it performs a computation that uses all its resources of energy and memory space it appears to an outside observer to be in a thermal state at ~10^9 degrees Kelvin. The ultimate laptop looks like a small piece of the Big Bang.] First, ask what limits the laws of physics place on the speed of such a device. As I will now show, to perform an elementary logical operation in time Delta-t requires an average amount of energy E greater than or equal to pi h-bar/2Delta-t. As a consequence, a system with average energy E can perform a maximum of 2E/pi h-bar logical operations per second. A 1-kg computer has average energy E = mc^2 = 8.9874 x 10^16 J. Accordingly, the ultimate laptop can perform a maximum of 5.4258 x 10^50 operations per second. 
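The speed figure just quoted can be checked with a few lines of arithmetic, a minimal sketch using the constants given in the text:

```python
import math

# Physical constants (SI units), as quoted in the article
c = 2.9979e8        # speed of light, m/s
hbar = 1.0545e-34   # reduced Planck constant, J s

m = 1.0             # mass of the 'ultimate laptop', kg
E = m * c**2        # total energy, ~8.9874e16 J

# Maximum logical operations per second for average energy E
ops_per_second = 2 * E / (math.pi * hbar)
print(f"{ops_per_second:.4e}")  # ~5.43e50 operations per second
```

The result agrees with the 5.4258 x 10^50 operations per second claimed for the 1-kg laptop.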
Maximum speed per logical operation For the sake of convenience, the ultimate laptop will be taken to be a digital computer. Computers that operate on nonbinary or continuous variables obey similar limits to those that will be derived here. A digital computer performs computation by representing information in terms of binary digits or bits, which can take the value 0 or 1, and then processes that information by performing simple logical operations such as AND, NOT and FANOUT. The operation AND, for instance, takes two binary inputs X and Y and returns the output 1 if and only if both X and Y are 1; otherwise it returns the output 0. Similarly, NOT takes a single binary input X and returns the output 1 if X = 0 and 0 if X = 1. FANOUT takes a single binary input X and returns two binary outputs, each equal to X. Any boolean function can be constructed by repeated application of AND, NOT and FANOUT. A set of operations that allows the construction of arbitrary boolean functions is called universal. The actual physical device that performs a logical operation is called a logic gate. How fast can a digital computer perform a logical operation? During such an operation, the bits in the computer on which the operation is performed go from one state to another. The problem of how much energy is required for information processing was first investigated in the context of communications theory by Levitin (11-16), Bremermann (17-19), Bekenstein (20-22) and others, who showed that the laws of quantum mechanics determine the maximum rate at which a system with spread in energy Delta-E can move from one distinguishable state to another.
In particular, the correct interpretation of the time-energy Heisenberg uncertainty principle Delta-E Delta-t is greater than or equal to h-bar is not that it takes time Delta-t to measure energy to an accuracy Delta-E (a fallacy that was put to rest by Aharonov and Bohm (23,24)), but rather that a quantum state with spread in energy Delta-E takes time at least Delta-t = pi h-bar/2Delta-E to evolve to an orthogonal (and hence distinguishable) state (23-26). More recently, Margolus and Levitin (15,16) extended this result to show that a quantum system with average energy E takes time at least Delta-t = pi h-bar/2E to evolve to an orthogonal state. Performing quantum logic operations As an example, consider the operation NOT performed on a qubit with logical states |0> and |1>. (For readers unfamiliar with quantum mechanics, the 'bracket' notation |> signifies that whatever is contained in the bracket is a quantum-mechanical variable; |0> and |1> are vectors in a two-dimensional vector space over the complex numbers.) To flip the qubit, one can apply a potential H = E-sub-0|E-sub-0><E-sub-0| + E-sub-1|E-sub-1><E-sub-1| with energy eigenstates |E-sub-0> = (1/sqrt2)(|0> + |1>) and |E-sub-1> = (1/sqrt2)(|0> - |1>). Because |0> = (1/sqrt2)(|E-sub-0> + |E-sub-1>) and |1> = (1/sqrt2)(|E-sub-0> - |E-sub-1>), each logical state |0>, |1> has spread in energy Delta-E = (E-sub-1 - E-sub-0)/2. It is easy to verify that after a length of time Delta-t = pi h-bar/2Delta-E the qubit evolves so that |0> -> |1> and |1> -> |0>. That is, applying the potential effects a NOT operation in a time that attains the limit given by quantum mechanics. Note that the average energy E of the qubit in the course of the logical operation is <0|H|0> = <1|H|1> = (E-sub-0 + E-sub-1)/2 = E-sub-0 + Delta-E. Taking the ground-state energy E-sub-0 = 0 gives E = Delta-E. So the amount of time it takes to perform a NOT operation can also be written as Delta-t = pi h-bar/2E.
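The "easy to verify" step can be carried out explicitly by evolving the qubit under the stated potential; here is a minimal sketch in natural units (h-bar = 1), with the ground-state energy set to zero as in the text:

```python
import cmath, math

# Energy eigenstates of the flipping potential H:
# |E0> = (|0>+|1>)/sqrt2 with energy E0, |E1> = (|0>-|1>)/sqrt2 with energy E1
E0, E1 = 0.0, 1.0
delta_E = (E1 - E0) / 2          # spread in energy of each logical state
t = math.pi / (2 * delta_E)      # the claimed time to reach an orthogonal state

# Evolve |0> = (|E0> + |E1>)/sqrt2 under e^{-iHt}:
a0 = cmath.exp(-1j * E0 * t) / math.sqrt(2)   # coefficient of |E0>
a1 = cmath.exp(-1j * E1 * t) / math.sqrt(2)   # coefficient of |E1>

# Transform back to the logical basis
amp_0 = (a0 + a1) / math.sqrt(2)   # amplitude on |0>
amp_1 = (a0 - a1) / math.sqrt(2)   # amplitude on |1>

print(abs(amp_0), abs(amp_1))   # ~0 and ~1: |0> has evolved to |1>
```

At t = pi h-bar/2Delta-E the amplitude has moved entirely from |0> to |1> (up to a global phase), attaining the Margolus-Levitin limit.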
It is straightforward to show (15,16) that no quantum system with average energy E can move to an orthogonal state in a time less than Delta-t. That is, the speed with which a logical operation can be performed is limited not only by the spread in energy, but also by the average energy. This result will prove to be a key component in deriving the speed limit for the ultimate laptop. AND and FANOUT can be enacted in a way that is analogous to the NOT operation. A simple way to perform these operations in a quantum-mechanical context is to enact a so-called Toffoli or controlled-controlled-NOT operation (31). This operation takes three binary inputs, X, Y and Z, and returns three outputs, X', Y' and Z'. The first two inputs pass through unchanged, that is, X' = X, Y' = Y. The third input passes through unchanged unless both X and Y are 1, in which case it is flipped. This is universal in the sense that suitable choices of inputs allow the construction of AND, NOT and FANOUT. When the third input is set to zero, Z = 0, then the third output is the AND of the first two: Z' = X AND Y. So AND can be constructed. When the first two inputs are 1, X = Y = 1, the third output is the NOT of the third input, Z' = NOT Z. Finally, when the second input is set to 1, Y = 1, and the third to zero, Z = 0, the first and third output are the FANOUT of the first input, X' = X, Z' = X. So arbitrary boolean functions can be constructed from the Toffoli operation alone. By embedding a controlled-controlled-NOT gate in a quantum context, it is straightforward to see that AND and FANOUT, like NOT, can be performed at a rate 2E/pi h-bar times per second, where E is the average energy of the logic gate that performs the operation. More complicated logic operations that cycle through a larger number of quantum states (such as those on non-binary or continuous quantum variables) can be performed at a rate E/pi h-bar--half as fast as the simpler operations (15,16). 
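The universality of the Toffoli operation described above can be made concrete with a classical truth-table sketch (function names are illustrative, not from the text):

```python
def toffoli(x, y, z):
    """Controlled-controlled-NOT: flip z if and only if x and y are both 1."""
    return x, y, z ^ (x & y)

def AND(x, y):
    # Set the third input to 0; the third output is X AND Y
    return toffoli(x, y, 0)[2]

def NOT(z):
    # Set the first two inputs to 1; the third output is NOT Z
    return toffoli(1, 1, z)[2]

def FANOUT(x):
    # Set Y = 1, Z = 0; the first and third outputs both equal X
    out = toffoli(x, 1, 0)
    return out[0], out[2]

assert AND(1, 1) == 1 and AND(1, 0) == 0 and AND(0, 1) == 0 and AND(0, 0) == 0
assert NOT(0) == 1 and NOT(1) == 0
assert FANOUT(1) == (1, 1) and FANOUT(0) == (0, 0)
```

Since AND, NOT and FANOUT form a universal set, arbitrary boolean functions follow from the Toffoli operation alone, exactly as the text claims.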
Existing quantum logic gates in optical-atomic and nuclear magnetic resonance (NMR) quantum computers actually attain this limit. In the case of NOT, E is the average energy of interaction of the qubit's dipole moment (electric dipole for optical-atomic qubits and nuclear magnetic dipole for NMR qubits) with the applied electromagnetic field. In the case of multiqubit operations such as the Toffoli operation, or the simpler two-bit controlled-NOT operation, which flips the second bit if and only if the first bit is 1, E is the average energy in the interaction between the physical systems that register the qubits. Ultimate limits to speed of computation We are now in a position to derive the first physical limit to computation, that of energy. Suppose that one has a certain amount of energy E to allocate to the logic gates of a computer. The more energy one allocates to a gate, the faster it can perform a logic operation. The total number of logic operations performed per second is equal to the sum over all logic gates of the operations per second per gate. That is, a computer can perform no more than BigSigma-over-l 1/Delta-t-sub-l is less than or equal to BigSigma-over-l 2E-sub-l/pi h-bar = 2E/pi h-bar operations per second. In other words, the rate at which a computer can compute is limited by its energy. (Similar limits have been proposed by Bremermann in the context of the minimum energy required to communicate a bit (17-19), although these limits have been criticized for misinterpreting the energy-time uncertainty relation (21), and for failing to take into account the role of degeneracy of energy eigenvalues (13,14) and the role of nonlinearity in communications (7-9).) Applying this result to a 1-kg computer with energy E = mc^2 = 8.9874 x 10^16 J shows that the ultimate laptop can perform a maximum of 5.4258 x 10^50 operations per second. Parallel and serial operation An interesting feature of this limit is that it is independent of computer architecture.
It might have been thought that a computer could be speeded up by parallelization, that is, by taking the energy and dividing it up among many subsystems computing in parallel. But this is not the case. If the energy E is spread among N logic gates, each one operates at a rate 2E/pi h-bar N, and the total number of operations per second, N x 2E/pi h-bar N = 2E/pi h-bar, remains the same. If the energy is allocated to fewer logic gates (more serial operation), the rate at which they operate and the spread in energy per gate increase; if the energy is allocated to more logic gates (more parallel operation), then the rate at which they operate and the spread in energy per gate decrease. Note that in this parallel case, the overall spread in energy of the computer as a whole is considerably smaller than the average energy: in general Delta-E = sqrt(BigSigma-over-l Delta-E-sub-l^2) ~ sqrt(N) Delta-E-sub-l, whereas E = BigSigma-over-l E-sub-l = N E-sub-l. Spreading the energy out among many logic gates allows certain computations to be performed more efficiently, but it does not alter the total number of operations per second. As I will show below, the degree of parallelizability of the computation to be performed determines the most efficient distribution of energy among the parts of the computer. Computers in which energy is relatively evenly distributed over a larger volume are better suited for performing parallel computations. More compact computers and computers with an uneven distribution of energy are better suited for performing serial computations. Comparison with existing computers Conventional laptops operate much more slowly than the ultimate laptop. There are two reasons for this inefficiency. First, most of the energy is locked up in the mass of the particles of which the computer is constructed, leaving only an infinitesimal fraction for performing logic. Second, a conventional computer uses many degrees of freedom (billions and billions of electrons) for registering a single bit. From the physical perspective, such a computer operates in a highly redundant fashion. There are, however, good technological reasons for such redundancy, with conventional designs depending on it for reliability and manufacturability.
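The architecture-independence of the limit is easy to check numerically, a toy sketch using the constants from the text:

```python
import math

hbar = 1.0545e-34   # reduced Planck constant, J s
E = 8.9874e16       # total energy budget of the 1-kg computer, J

# Split the energy among N gates: each gate runs slower, but the
# total number of operations per second is unchanged.
totals = []
for N in (1, 10, 10**6, 10**23):
    rate_per_gate = 2 * (E / N) / (math.pi * hbar)
    totals.append(N * rate_per_gate)

# Every total equals 2E/pi h-bar, independent of the architecture
print(totals[0])
```

However the energy is divided, the aggregate rate stays pinned at 2E/pi h-bar; only the serial depth versus parallel width of the computation changes.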
But in the present discussion, the subject is not what computers are but what they might be, and in this context the laws of physics do not require redundancy to perform logical operations--recently constructed quantum microcomputers use one quantum degree of freedom for each bit and operate at the Heisenberg limit Delta-t = pi h-bar/2 Delta-E for the time needed to flip a bit (64,65,76-80). Redundancy is, however, required for error correction, as will be discussed below. In sum, quantum mechanics provides a simple answer to the question of how fast information can be processed using a given amount of energy. Now it will be shown that thermodynamics and statistical mechanics provide a fundamental limit to how many bits of information can be processed using a given amount of energy confined to a given volume. Available energy necessarily limits the rate at which a computer can process information. Similarly, the maximum entropy of a physical system determines the amount of information it can process. Energy limits speed. Entropy limits memory. Entropy limits memory space The amount of information that a physical system can store and process is related to the number of distinct physical states that are accessible to the system. A collection of m two-state systems has 2^m accessible states and can register m bits of information. In general, a system with N accessible states can register log (base 2) N bits of information. But it has been known for more than a century that the number of accessible states of a physical system, W, is related to its thermodynamic entropy by the formula S = k-sub-B lnW, where k-sub-B is Boltzmann's constant. (Although this formula is inscribed on Boltzmann's tomb, it is attributed originally to Planck; before the turn of the century, k-sub-B was often known as Planck's constant.)
The amount of information that can be registered by a physical system is I = S(E)/k-sub-B ln2 where S(E) is the thermodynamic entropy of a system with expectation value for the energy E. Combining this formula with the formula 2E/pi h-bar for the number of logical operations that can be performed per second, we see that when it is using all its memory, the number of operations per bit per second that our ultimate laptop can perform is 2ln(2)k-sub-B E/pi h-bar S ~ k-sub-B T/h-bar, where T = (partialS/partialE)^-1 is the temperature of 1 kg of matter in a maximum entropy state in a volume of 1 liter. The entropy governs the amount of information the system can register and the temperature governs the number of operations per bit per second that it can perform. Because thermodynamic entropy effectively counts the number of bits available to a physical system, the following derivation of the memory space available to the ultimate laptop is based on a thermodynamic treatment of 1 kg of matter confined to a volume of 1 liter in a maximum entropy state. Throughout this derivation, it is important to remember that although the memory space available to the computer is given by the entropy of its thermal equilibrium state, the actual state of the ultimate laptop as it performs a computation is completely determined, so that its entropy remains always equal to zero. As above, I assume that we have complete control over the actual state of the ultimate laptop, and are able to guide it through its logical steps while insulating it from all uncontrolled degrees of freedom. But as the following discussion will make clear, such complete control will be difficult to attain (see Box 1). [Box 1: The role of thermodynamics in computation [The fact that entropy and information are intimately linked has been known since Maxwell introduced his famous 'demon' well over a century ago (1).
Maxwell's demon is a hypothetical being that uses its information-processing ability to reduce the entropy of a gas. The first results in the physics of information processing were derived in attempts to understand how Maxwell's demon could function (1-4). The role of thermodynamics in computation has been examined repeatedly over the past half century. In the 1950s, von Neumann (10) speculated that each logical operation performed in a computer at temperature T must dissipate energy k-sub-B T ln2, thereby increasing entropy by k-sub-B ln2. This speculation proved to be false. The precise, correct statement of the role of entropy in computation is due to Landauer (5), who showed that reversible, that is, one-to-one, logical operations such as NOT can be performed, in principle, without dissipation, but that irreversible, many-to-one operations such as AND or ERASE require dissipation of at least k-sub-B ln2 for each bit of information lost. (ERASE is a one-bit logical operation that takes a bit, 0 or 1, and restores it to 0.) The argument behind Landauer's principle can be readily understood (37). Essentially, the one-to-one dynamics of hamiltonian systems implies that when a bit is erased the information that it contains has to go somewhere. If the information goes into observable degrees of freedom of the computer, such as another bit, then it has not been erased but merely moved; but if it goes into unobservable degrees of freedom such as the microscopic motion of molecules it results in an increase of entropy of at least k-sub-B ln2. [In 1973, Bennett (28-30) showed that all computations could be performed using only reversible logical operations. Consequently, by Landauer's principle, computation does not require dissipation. (Earlier work by Lecerf (27) had anticipated the possibility of reversible computation, but not its physical implications. Reversible computation was discovered independently by Fredkin and Toffoli (31).)
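Landauer's bound is small in everyday units; a quick numerical illustration (the choice of room temperature, 300 K, is mine, not the text's):

```python
import math

k_B = 1.3805e-23   # Boltzmann's constant, J/K (value quoted in the article)
T = 300            # an illustrative environment temperature, K

# Landauer's principle: erasing one bit dissipates at least k_B T ln2 of energy
E_erase = k_B * T * math.log(2)
print(f"{E_erase:.3e} J per erased bit")   # ~2.9e-21 J at room temperature
```

This is roughly eight orders of magnitude below the switching energies of present-day logic, which is why Landauer's limit has not yet constrained practical hardware.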
The energy used to perform a logical operation can be 'borrowed' from a store of free energy such as a battery, 'invested' in the logic gate that performs the operation, and returned to storage after the operation has been performed, with a net 'profit' in the form of processed information. Electronic circuits based on reversible logic have been built and exhibit considerable reductions in dissipation over conventional irreversible circuits (33-35). [Under many circumstances it may prove useful to perform irreversible operations such as erasure. If our ultimate laptop is subject to an error rate of epsilon bits per second, for example, then error-correcting codes can be used to detect those errors and reject them to the environment at a dissipative cost of epsilon k-sub-B T-sub-E ln2 J s^-1, where T-sub-E is the temperature of the environment. (k-sub-B T ln2 is the minimal amount of energy required to send a bit down an information channel with noise temperature T (ref. 14).) Such error-correcting routines in our ultimate computer function as working analogues of Maxwell's demon, getting information and using it to reduce entropy at an exchange rate of k-sub-B T ln2 joules per bit. In principle, computation does not require dissipation. In practice, however, any computer--even our ultimate laptop--will dissipate energy. [The ultimate laptop must reject errors to the environment at a high rate to maintain reliable operation. To estimate the rate at which it can reject errors to the environment, assume that the computer encodes erroneous bits in the form of black-body radiation at the characteristic temperature 5.87 x 10^8 K of the computer's memory (21). The Stefan-Boltzmann law for black-body radiation then implies that the number of bits per unit area that can be sent out to the environment is B = pi^2 k-sub-B^3 T^3/60ln(2)h-bar^3 c^2 = 7.195 x 10^42 bits per square meter per second.
As the ultimate laptop has a surface area of 10^-2 m^2 and is performing ~10^50 operations per second, it must have an error rate of less than 10^-10 per operation in order to avoid overheating. Even if it achieves such an error rate, it must have an energy throughput (free energy in and thermal energy out) of 4.04 x 10^26 W--turning over its own resting mass energy of mc^2 ~ 10^17 J in a nanosecond! The thermal load of correcting large numbers of errors clearly indicates the necessity of operating at a slower speed than the maximum allowed by the laws of physics. [End of Box 1] Entropy, energy and temperature To calculate the number of operations per second that could be performed by our ultimate laptop, I assume that the expectation value of the energy is E. Accordingly, the total number of bits of memory space available to the computer is S(E,V)/k-sub-B ln2 where S(E,V) is the thermodynamic entropy of a system with expectation value of the energy E confined to volume V. The entropy of a closed system is usually given by the so-called microcanonical ensemble, which fixes both the average energy and the spread in energy Delta-E, and assigns equal probability to all states of the system within a range [E, E + Delta-E]. In the case of the ultimate laptop, however, I wish to fix only the average energy, while letting the spread in energy vary according to whether the computer is to be more serial (fewer, faster gates, with larger spread in energy) or parallel (more, slower gates, with smaller spread in energy). Accordingly, the ensemble that should be used to calculate the thermodynamic entropy and the memory space available is the canonical ensemble, which maximizes S for fixed average energy with no constraint on the spread in energy Delta-E. The canonical ensemble shows how many bits of memory are available for all possible ways of programming the computer while keeping its average energy equal to E.
In any given computation with average energy E, the ultimate laptop will be in a pure state with some fixed spread of energy, and will explore only a small fraction of its memory space. In the canonical ensemble, a state with energy E-sub-i has probability p-sub-i = (1/Z(T)) e^(-E-sub-i/k-sub-B T) where Z(T) = BigSigma-over-i e^(-E-sub-i/k-sub-B T) is the partition function, and the temperature T is chosen so that E = BigSigma-over-i p-sub-i E-sub-i. The entropy is S = -k-sub-B BigSigma-over-i p-sub-i ln p-sub-i = E/T + k-sub-B lnZ. The number of bits of memory space available to the computer is S/k-sub-B ln2. The difference between the entropy as calculated using the canonical ensemble and that calculated using the microcanonical ensemble is minimal. But there is some subtlety involved in using the canonical ensemble rather than the more traditional microcanonical ensemble. The canonical ensemble is normally used for open systems that interact with a thermal bath at temperature T. In the case of the ultimate laptop, however, it is applied to a closed system to find the maximum entropy given a fixed expectation value for the energy. As a result, the temperature T = (partialS/partialE)^-1 has a somewhat different role in the context of physical limits of computation than it does in the case of an ordinary thermodynamic system interacting with a thermal bath. Integrating the relationship T = (partialS/partialE)^-1 over E yields T = CE/S, where C is a constant of order unity (for example, C = 4/3 for black-body radiation, C = 3/2 for an ideal gas, and C = 1/2 for a black hole). Accordingly, the temperature governs the number of operations per bit per second, 2ln(2)k-sub-B E/pi h-bar S ~ k-sub-B T/h-bar, that a system can perform. As I will show later, the relationship between temperature and operations per bit per second is useful in investigating computation under extreme physical conditions.
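The relation T = CE/S ties together the memory and speed figures quoted elsewhere in the article; a minimal sketch, taking C = 4/3 (the black-body value above) and the temperature 5.87 x 10^8 K given in Box 1:

```python
import math

k_B = 1.3805e-23    # Boltzmann's constant, J/K
hbar = 1.0545e-34   # reduced Planck constant, J s
c = 2.9979e8        # speed of light, m/s

E = 1.0 * c**2      # energy of 1 kg of matter, J
C = 4.0 / 3.0       # T = CE/S for black-body (radiation-dominated) matter
T = 5.87e8          # temperature of the ultimate laptop quoted in the text, K

S = C * E / T                      # entropy, J/K
bits = S / (k_B * math.log(2))     # memory space in bits
ops_per_bit = k_B * T / hbar       # operations per bit per second

print(f"{bits:.2e} bits, {ops_per_bit:.1e} ops/bit/s")
```

The result, ~2 x 10^31 bits and ~10^19 operations per bit per second, matches the ~10^31 bits of Figure 1 and the ~10^19 bit-flips per second of Figure 2.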
Calculating the maximum memory space To calculate exactly the maximum entropy for a kilogram of matter in a litre volume would require complete knowledge of the dynamics of elementary particles, quantum gravity, and so on. Although we do not possess such knowledge, the maximum entropy can readily be estimated by a method reminiscent of that used to calculate thermodynamic quantities in the early Universe (86). The idea is simple: model the volume occupied by the computer as a collection of modes of elementary particles with total average energy E. The maximum entropy is obtained by calculating the canonical ensemble over the modes. Here, I supply a simple derivation of the maximum memory space available to the ultimate laptop. A more detailed discussion of how to calculate the maximum amount of information that can be stored in a physical system can be found in the work of Bekenstein (19-21). For this calculation, assume that the only conserved quantities other than the computer's energy are angular momentum and electric charge, which I take to be zero. (One might also ask that the number of baryons be conserved, but as will be seen below, one of the processes that could take place within the computer is black-hole formation and evaporation, which does not conserve baryon number.) At a particular temperature T, the entropy is dominated by the contributions from particles with mass less than k-sub-B T/2c^2. The lth such species of particle contributes energy E = r-sub-l pi^2 V(k-sub-B T)^4/30 h-bar^3 c^3 and entropy S = 2r-sub-l pi^2 k-sub-B V(k-sub-B T)^3/45 h-bar^3 c^3 = 4E/3T, where r-sub-l counts the number of particle and antiparticle states and polarizations of the species, with an additional factor of 7/8 for fermions. Evaluating these formulae for 1 kg of matter with energy E = mc^2 = 8.9874 x 10^16 J confined to a volume of 1 liter gives a temperature T = 5.87 x 10^8 K and a maximum memory space of S/k-sub-B ln2 ~ 2.13 x 10^31 bits. Because the Bekenstein bound (20-22) implies that k-sub-B RE/h-bar cS can be no smaller than 1/2pi, as the computer is compressed, t-sub-com/t-sub-flip ~ k-sub-B RE/h-bar cS will remain greater than one, that is, the operation will still be somewhat parallel. Only at the ultimate limit of compression--a black hole--is the computation entirely serial. Compressing the computer allows more serial computation Suppose that we want to perform a highly serial computation on a few bits.
Then it is advantageous to compress the size of the computer so that it takes less time to send signals from one side of the computer to the other at the speed of light. As the computer gets smaller, keeping the energy fixed, the energy density inside the computer increases. As this happens, different regimes in high-energy physics are necessarily explored in the course of the computation. First the weak unification scale is reached, then the grand unification scale. Finally, as the linear size of the computer approaches its Schwarzschild radius, the Planck scale is reached (Fig. 2). (No known technology could possibly achieve such compression.) At the Planck scale, gravitational effects and quantum effects are both important: the Compton wavelength of a particle of mass m, l-sub-C = 2pi h-bar/mc, is on the order of its Schwarzschild radius, 2Gm/c^2. In other words, to describe behaviour at length scales of the size l-sub-P = sqrt(h-bar G/c^3) = 1.616 x 10^-35 m, timescales t-sub-P = sqrt(h-bar G/c^5) = 5.391 x 10^-44 s, and mass scales of m-sub-P = sqrt(h-bar c/G) = 2.177 x 10^-8 kg, a unified theory of quantum gravity is required. We do not currently possess such a theory. Nonetheless, although we do not know the exact number of bits that can be registered by a 1-kg computer confined to a volume of 1 l, we do know the exact number of bits that can be registered by a 1-kg computer that has been compressed to the size of a black hole (87). This is because the entropy of a black hole has a well-defined value. [Figure 2. Computing at the black-hole limit. The rate at which the components of a computer can communicate is limited by the speed of light. In the ultimate laptop, each bit can flip ~10^19 times per second, whereas the time taken to communicate from one side of the 1-liter computer to the other is on the order of 10^-9 s--the ultimate laptop is highly parallel. The computation can be speeded up and made more serial by compressing the computer.
But no computer can be compressed to smaller than its Schwarzschild radius without becoming a black hole. A 1-kg computer that has been compressed to the black-hole limit of R-sub-S = 2Gm/c^2 = 1.485 x 10^-27 m can perform 5.4258 x 10^50 operations per second on its I = 4piGm^2/ln(2)h-barc = 3.827 x 10^16 bits. At the black-hole limit, computation is fully serial: the time it takes to flip a bit and the time it takes a signal to communicate around the horizon of the hole are the same.] In the following discussion, I use the properties of black holes to place limits on the speed, memory space and degree of serial computation that could be approached by compressing a computer to the smallest possible size. Whether or not these limits could be attained, even in principle, is a question whose answer will have to await a unified theory of quantum gravity (see Box 2). [Box 2: Can a black hole compute? [No information can escape from a classical black hole: what goes in does not come out. But the quantum mechanical picture of a black hole is different. First of all, black holes are not quite black; they radiate at the Hawking temperature T given above. In addition, the well-known statement that 'a black hole has no hair'--that is, from a distance all black holes with the same charge and angular momentum look essentially alike--is now known to be not always true (89-91). Finally, research in string theory (92-94) indicates that black holes may not actually destroy the information about how they were formed, but instead process it and emit the processed information as part of the Hawking radiation as they evaporate: what goes in does come out, but in an altered form. 
[If this picture is correct, then black holes could in principle be 'programmed': one forms a black hole whose initial conditions encode the information to be processed, lets that information be processed by the planckian dynamics at the hole's horizon, and extracts the answer to the computation by examining the correlations in the Hawking radiation emitted when the hole evaporates. Despite our lack of knowledge of the precise details of what happens when a black hole forms and evaporates (a full account must await a more exact treatment using whatever theory of quantum gravity and matter turns out to be the correct one), we can still provide a rough estimate of how much information is processed during this computation (95-96). Using Page's results on the rate of evaporation of a black hole (95), we obtain a lifetime for the hole t-sub-life = G^2 m^3 / 3C h-bar c^4, where C is a constant that depends on the number of species of particles with a mass less than k-sub-B T, where T is the temperature of the hole. For O(10^1 to 10^2) such species, C is on the order of 10^-3 to 10^-2, leading to a lifetime for a black hole of mass 1 kg of ~10^-19 s, during which time the hole can perform ~10^32 operations on its ~10^16 bits. As the actual number of effectively massless particles at the Hawking temperature of a 1-kg black hole is likely to be considerably larger than 10^2, this number should be regarded as an upper bound on the actual number of operations that could be performed by the hole. Although this hypothetical computation is performed at ultra-high densities and speeds, the total number of bits available to be processed is not far from the number available to current computers operating in more familiar surroundings. [End of Box 2] The Schwarzschild radius of a 1-kg computer is R-sub-S = 2Gm/c^2 = 1.485 x 10^-27 m. The entropy of a black hole is Boltzmann's constant multiplied by its area divided by 4, as measured in Planck units.
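The black-hole figures quoted in this section follow directly from the formulas above; a minimal sketch (C = 10^-2 is taken from the upper end of the stated 10^-3 to 10^-2 range):

```python
import math

G = 6.673e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.9979e8        # speed of light, m/s
hbar = 1.0545e-34   # reduced Planck constant, J s
m = 1.0             # mass of the computer, kg

# Schwarzschild radius R_S = 2Gm/c^2
R_S = 2 * G * m / c**2                                  # ~1.485e-27 m

# Information content I = 4 pi G m^2 / (ln2 hbar c)
I = 4 * math.pi * G * m**2 / (math.log(2) * hbar * c)   # ~3.83e16 bits

# Lifetime t_life = G^2 m^3 / (3 C hbar c^4), with C ~ 1e-2
C = 1e-2
t_life = G**2 * m**3 / (3 * C * hbar * c**4)            # ~1e-19 s

print(R_S, I, t_life)
```

These reproduce the 1.485 x 10^-27 m radius, 3.827 x 10^16 bits and ~10^-19 s lifetime cited in the text.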
Accordingly, the amount of information that can be stored in a black hole is I = 4piGm^2/ln(2)h-bar c = 4pim^2/ln(2)m-sub-P^2. The amount of information that can be stored by the 1-kg computer in the black-hole limit is 3.827 x 10^16 bits. A computer compressed to the size of a black hole can perform 5.4258 x 10^50 operations per second, the same as the 1-l computer. In a computer that has been compressed to its Schwarzschild radius, the energy per bit is E/I = mc^2/I = ln(2)h-bar c^3 / 4pimG = 2ln(2)k-sub-B T, where T = (partialS/partialE)^-1 = h-bar c/4pi k-sub-B R-sub-S is the temperature of the Hawking radiation emitted by the hole. As a result, the time it takes to flip a bit on average is t-sub-flip = pi h-bar I/2E = pi^2 R-sub-S/ln(2)c. In other words, according to a distant observer, the amount of time it takes to flip a bit, t-sub-flip, is on the same order as the amount of time t-sub-com = pi R-sub-S/c it takes to communicate from one side of the hole to the other by going around the horizon: t-sub-com/t-sub-flip = ln2/pi. In contrast to computation at lesser densities, which is highly parallel, computation at the horizon of a black hole is highly serial: every bit is essentially connected to every other bit over the course of a single logic operation. As noted above, the serial nature of computation at the black-hole limit can be deduced from the fact that black holes attain the Bekenstein bound (20-22), k-sub-B RE/h-bar cS = 1/2pi. Constructing ultimate computers Throughout this entire discussion of the physical limits to computation, no mention has been made of how to construct a computer that operates at those limits. In fact, contemporary quantum 'microcomputers' such as those constructed using NMR (76-80) do indeed operate at the limits of speed and memory space described above. Information is stored on nuclear spins, with one spin registering one bit.
The time it takes a bit to flip from a state |uparrow> to an orthogonal state |downarrow> is given by pi h-bar / (2 mu B) = pi h-bar / 2E, where mu is the spin's magnetic moment, B is the magnetic field, and E = mu B is the average energy of interaction between the spin and the magnetic field. To perform a quantum logic operation between two spins takes a time pi h-bar / (2 E-sub-gamma), where E-sub-gamma is the energy of interaction between the two spins. Although NMR quantum computers already operate at the limits to computation set by physics, they are nonetheless much slower and process much less information than the ultimate laptop described above. This is because their energy is locked up largely in mass, thereby limiting both their speed and their memory. Unlocking this energy is of course possible, as a thermonuclear explosion indicates. But controlling such an 'unlocked' system is another question. In discussing the computational power of physical systems in which all energy is put to use, I assumed that such control is possible in principle, although it is certainly not possible in current practice. All current designs for quantum computers operate at low energy levels and temperatures, exactly so that precise control can be exerted on their parts. As the above discussion of error correction indicates, the rate at which errors can be detected and rejected to the environment by error-correction routines places a fundamental limit on the rate at which errors can be committed. Suppose that each logical operation performed by the ultimate computer has a probability epsilon of being erroneous. The total number of errors committed by the ultimate computer per second is then 2 epsilon E / (pi h-bar). The maximum rate at which information can be rejected to the environment is, up to a geometric factor, ln(2) c S / R (all bits in the computer moving outward at the speed of light).
Accordingly, the maximum error rate that the ultimate computer can tolerate is epsilon <= pi ln(2) h-bar c S / (2 E R) = 2 t-sub-flip / t-sub-com. That is, the maximum error rate that can be tolerated by the ultimate computer is the inverse of its degree of parallelization. Suppose that control of highly energetic systems were to become possible. Then how might these systems be made to compute? As an example of a 'computation' that might be performed at extreme conditions, consider a heavy-ion collision that takes place in the heavy-ion collider at Brookhaven (S. H. Kahana, personal communication). If one collides 100 nucleons on 100 nucleons (that is, two nuclei with 100 nucleons each) at 200 GeV per nucleon, the operation time is pi h-bar / 2E ~ 10^-29 s. The maximum entropy can be estimated to be ~4 k-sub-B per relativistic pion (to within a factor of less than 2 associated with the overall entropy production rate per meson), and there are ~10^4 relativistic pions per collision. Accordingly, the total amount of memory space available is S / (k-sub-B ln 2) ~ 10^4 to 10^5 bits. The collision time is short: in the centre-of-mass frame the two nuclei are Lorentz-contracted to D/gamma, where D = 12-13 fermi and gamma = 100, giving a total collision time of ~10^-25 s. During the collision, then, there is time to perform approximately 10^4 operations on 10^4 bits--a relatively simple computation. (The fact that only one operation per bit is performed indicates that there is insufficient time to reach thermal equilibrium, an observation that is confirmed by detailed simulations.) The heavy-ion system could be programmed by manipulating and preparing the initial momenta and internal nuclear states of the ions. Of course, we would not expect to be able to do word processing on such a 'computer'. Rather it would be used to uncover basic knowledge about nuclear collisions and quark-gluon plasmas: in the words of Heinz Pagels, the plasma 'computes itself' (88).
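The order-of-magnitude estimates for the Brookhaven collision can be reproduced with a few lines of arithmetic (again a sketch rather than anything from the paper; the collision parameters are exactly the ones quoted above):

```python
import math

hbar = 1.055e-34   # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s
GeV = 1.602e-10    # 1 GeV in joules

# 100 nucleons on 100 nucleons at 200 GeV per nucleon: 200 nucleons in all
E = 200 * 200 * GeV                 # total energy, ~6.4e-6 J

# Margolus-Levitin operation time: pi hbar / 2E
t_op = math.pi * hbar / (2 * E)     # ~2.6e-29 s

# Lorentz-contracted crossing time: D / (gamma c), with D ~ 12.5 fm
D, gamma = 12.5e-15, 100
t_coll = D / (gamma * c)            # ~4.2e-25 s

print(f"t_op ~ {t_op:.1e} s, t_coll ~ {t_coll:.1e} s")
print(f"~{t_coll / t_op:.0e} operations during the collision")
```

The ratio t_coll/t_op comes out at roughly 10^4, matching the statement that the collision performs about 10^4 operations on its ~10^4 bits, i.e. about one operation per bit.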
At the greater extremes of a black-hole computer, I assumed that whatever theory (for example, string theory) turns out to be the correct theory of quantum matter and gravity, it is possible to prepare initial states of such systems that cause their natural time evolution to carry out a computation. What assurance do we have that such preparations exist, even in principle? Physical systems that can be programmed to perform arbitrary digital computations are called computationally universal. Although computational universality might at first seem to be a stringent demand on a physical system, a wide variety of physical systems--ranging from nearest-neighbour Ising models (52) to quantum electrodynamics (84) and conformal field theories (M. Freedman, unpublished results)--are known to be computationally universal (51-53,55-65). Indeed, computational universality seems to be the rule rather than the exception. Essentially any quantum system that admits controllable nonlinear interactions can be shown to be computationally universal (60,61). For example, the ordinary electrostatic interaction between two charged particles can be used to perform universal quantum logic operations between two quantum bits. A bit is registered by the presence or absence of a particle in a mode. The strength of the interaction between the particles, e^2/r, determines the amount of time t-sub-flip = pi h-bar r / (2 e^2) it takes to perform a quantum logic operation such as a controlled-NOT on the two particles. The time it takes to perform such an operation divided by the amount of time it takes to send a signal at the speed of light between the bits, t-sub-com = r/c, is a universal constant: t-sub-flip/t-sub-com = pi h-bar c / (2 e^2) = pi / (2 alpha), where alpha = e^2 / h-bar c ~ 1/137 is the fine-structure constant. This example shows the degree to which the laws of physics and the limits to computation are entwined.
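Because both t-sub-flip and t-sub-com scale linearly with the separation r, their ratio drops out as a pure number; a two-line check (a sketch using Gaussian units, in which alpha = e^2/h-bar c can be used directly):

```python
import math

alpha = 1 / 137.036  # fine-structure constant (dimensionless)

# t_flip / t_com = pi hbar c / (2 e^2) = pi / (2 alpha), independent of r
ratio = math.pi / (2 * alpha)
print(f"t_flip/t_com = {ratio:.1f}")
```

The result is about 215: a single electrostatic logic operation between two charges takes roughly 215 light-crossing times of their separation, whatever that separation is.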
In addition to the theoretical evidence that most systems are computationally universal, the computer on which I am writing this article provides strong experimental evidence that whatever the correct underlying theory of physics is, it supports universal computation. Whether or not it is possible to make computation take place in the extreme regimes envisaged in this paper is an open question. The answer to this question lies in future technological development, which is difficult to predict. If, as seems highly unlikely, it is possible to extrapolate the exponential progress of Moore's law into the future, then it will take only 250 years to make up the 40 orders of magnitude in performance between current computers that perform 10^10 operations per second on 10^10 bits and our 1-kg ultimate laptop that performs 10^51 operations per second on 10^31 bits.
Notes
1. Maxwell, J. C. Theory of Heat (Appleton, London, 1871). 2. Smoluchowski, F. Vorträge über die kinetische Theorie der Materie und Elektrizität (Leipzig, 1914). 3. Szilard, L. Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen. Z. Physik 53, 840-856 (1929). 4. Brillouin, L. Science and Information Theory (Academic Press, New York, 1953). 5. Landauer, R. Irreversibility and heat generation in the computing process. IBM J. Res. Dev. 5, 183-191 (1961). 6. Keyes, R. W. & Landauer, R. Minimal energy dissipation in logic. IBM J. Res. Dev. 14, 152-157 (1970). 7. Landauer, R. Dissipation and noise-immunity in computation and communication. Nature 335, 779-784 (1988). 8. Landauer, R. Information is physical. Phys. Today 44, 23-29 (1991). 9. Landauer, R. The physical nature of information. Phys. Lett. A 217, 188-193 (1996). 10. von Neumann, J. Theory of Self-Reproducing Automata Lect. 3 (Univ. Illinois Press, Urbana, IL, 1966). 11. Lebedev, D. S. & Levitin, L. B. Information transmission by electromagnetic field. Inform. Control 9, 1-22 (1966). 12. Levitin, L. B.
in Proceedings of the 3rd International Symposium on Radio Electronics part 3, 1-15 (Varna, Bulgaria, 1970). 13. Levitin, L. B. Physical limitations of rate, depth, and minimum energy in information processing. Int. J. Theor. Phys. 21, 299-309 (1982). 14. Levitin, L. B. Energy cost of information transmission (along the path to understanding). Physica D 120, 162-167 (1998). 15. Margolus, N. & Levitin, L. B. in Proceedings of the Fourth Workshop on Physics and Computation--PhysComp96 (eds Toffoli, T., Biafore, M. & Leão, J.) (New England Complex Systems Institute, Boston, MA, 1996). 16. Margolus, N. & Levitin, L. B. The maximum speed of dynamical evolution. Physica D 120, 188-195 (1998). 17. Bremermann, H. J. in Self-Organizing Systems (eds Yovits, M. C., Jacobi, G. T. & Goldstein, G. D.) 93-106 (Spartan Books, Washington DC, 1962). 18. Bremermann, H. J. in Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability (eds LeCam, L. M. & Neyman, J.) Vol. 4, 15-20 (Univ. California Press, Berkeley, CA, 1967). 19. Bremermann, H. J. Minimum energy requirements of information transfer and computing. Int. J. Theor. Phys. 21, 203-217 (1982). 20. Bekenstein, J. D. Universal upper bound on the entropy-to-energy ratio for bounded systems. Phys. Rev. D 23, 287-298 (1981). 21. Bekenstein, J. D. Energy cost of information transfer. Phys. Rev. Lett. 46, 623-626 (1981). 22. Bekenstein, J. D. Entropy content and information flow in systems with limited energy. Phys. Rev. D 30, 1669-1679 (1984). 23. Aharonov, Y. & Bohm, D. Time in the quantum theory and the uncertainty relation for the time and energy domain. Phys. Rev. 122, 1649-1658 (1961). 24. Aharonov, Y. & Bohm, D. Answer to Fock concerning the time-energy indeterminacy relation. Phys. Rev. B 134, 1417-1418 (1964). 25. Anandan, J. & Aharonov, Y. Geometry of quantum evolution. Phys. Rev. Lett. 65, 1697-1700 (1990). 26. Peres, A. Quantum Theory: Concepts and Methods (Kluwer, Hingham, 1995). 27.
Lecerf, Y. Machines de Turing réversibles. C.R. Acad. Sci. 257, 2597-2600 (1963). 28. Bennett, C. H. Logical reversibility of computation. IBM J. Res. Dev. 17, 525-532 (1973). 29. Bennett, C. H. Thermodynamics of computation--a review. Int. J. Theor. Phys. 21, 905-940 (1982). 30. Bennett, C. H. Demons, engines and the second law. Sci. Am. 257, 108 (1987). 31. Fredkin, E. & Toffoli, T. Conservative logic. Int. J. Theor. Phys. 21, 219-253 (1982). 32. Likharev, K. K. Classical and quantum limitations on energy consumption in computation. Int. J. Theor. Phys. 21, 311-325 (1982). 33. Seitz, C. L. et al. in Proceedings of the 1985 Chapel Hill Conference on VLSI (ed. Fuchs, H.) (Computer Science Press, Rockville, MD, 1985). 34. Merkle, R. C. Reversible electronic logic using switches. Nanotechnology 34, 21-40 (1993). 35. Younis, S. G. & Knight, T. F. in Proceedings of the 1993 Symposium on Integrated Systems, Seattle, Washington (eds Berrielo, G. & Ebeling, C.) (MIT Press, Cambridge, MA, 1993). 36. Lloyd, S. & Pagels, H. Complexity as thermodynamic depth. Ann. Phys. 188, 186-213 (1988). 37. Lloyd, S. Use of mutual information to decrease entropy--implications for the Second Law of Thermodynamics. Phys. Rev. A 39, 5378-5386 (1989). 38. Zurek, W. H. Thermodynamic cost of computation, algorithmic complexity and the information metric. Nature 341, 119-124 (1989). 39. Leff, H. S. & Rex, A. F. Maxwell's Demon: Entropy, Information, Computing (Princeton Univ. Press, Princeton, 1990). 40. Lloyd, S. Quantum mechanical Maxwell's demon. Phys. Rev. A 56, 3374-3382 (1997). 41. Benioff, P. The computer as a physical system: a microscopic quantum mechanical Hamiltonian model of computers as represented by Turing machines. J. Stat. Phys. 22, 563-591 (1980). 42. Benioff, P. Quantum mechanical models of Turing machines that dissipate no energy. Phys. Rev. Lett. 48, 1581-1585 (1982). 43. Feynman, R. P. Simulating physics with computers. Int. J. Theor. Phys. 21, 467 (1982). 44. Feynman, R. P.
Quantum mechanical computers. Optics News 11, 11 (1985); reprinted in Found. Phys. 16, 507 (1986). 45. Zurek, W. H. Reversibility and stability of information-processing systems. Phys. Rev. Lett. 53, 391-394 (1984). 46. Peres, A. Reversible logic and quantum computers. Phys. Rev. A 32, 3266-3276 (1985). 47. Deutsch, D. Quantum theory, the Church-Turing principle, and the universal quantum computer. Proc. R. Soc. Lond. A 400, 97-117 (1985). 48. Margolus, N. Quantum computation. Ann. N.Y. Acad. Sci. 480, 487-497 (1986). 49. Deutsch, D. Quantum computational networks. Proc. R. Soc. Lond. A 425, 73-90 (1989). 50. Margolus, N. in Complexity, Entropy, and the Physics of Information, Santa Fe Institute Studies in the Sciences of Complexity Vol. VIII (ed. Zurek, W. H.) 273-288 (Addison Wesley, Redwood City, 1991). 51. Lloyd, S. Quantum-mechanical computers and uncomputability. Phys. Rev. Lett. 71, 943-946 (1993). 52. Lloyd, S. A potentially realizable quantum computer. Science 261, 1569-1571 (1993). 53. Lloyd, S. Necessary and sufficient conditions for quantum computation. J. Mod. Opt. 41, 2503-2520 (1994). 54. Shor, P. in Proceedings of the 35th Annual Symposium on Foundations of Computer Science (ed. Goldwasser, S.) 124-134 (IEEE Computer Society, Los Alamitos, CA, 1994). 55. Lloyd, S. Quantum-mechanical computers. Sci. Am. 273, 140-145 (1995). 56. DiVincenzo, D. Quantum computation. Science 270, 255-261 (1995). 57. DiVincenzo, D. P. 2-Bit gates are universal for quantum computation. Phys. Rev. A 51, 1015-1022 (1995). 58. Sleator, T. & Weinfurter, H. Realizable universal quantum logic gates. Phys. Rev. Lett. 74, 4087-4090 (1995). 59. Barenco, A. et al. Elementary gates for quantum computation. Phys. Rev. A 52, 3457-3467 (1995). 60. Lloyd, S. Almost any quantum logic gate is universal. Phys. Rev. Lett. 75, 346-349 (1995). 61. Deutsch, D., Barenco, A. & Ekert, A. Universality in quantum computation. Proc. R. Soc. Lond. A 449, 669-677 (1995). 62. Cirac, J. I. & Zoller, P.
Quantum computation with cold ion traps. Phys. Rev. Lett. 74, 4091-4094 (1995). 63. Pellizzari, T., Gardiner, S. A., Cirac, J. I. & Zoller, P. Decoherence, continuous observation, and quantum computing--a cavity QED model. Phys. Rev. Lett. 75, 3788-3791 (1995). 64. Turchette, Q. A., Hood, C. J., Lange, W., Mabuchi, H. & Kimble, H. J. Measurement of conditional phase-shifts for quantum logic. Phys. Rev. Lett. 75, 4710-4713 (1995). 65. Monroe, C., Meekhof, D. M., King, B. E., Itano, W. M. & Wineland, D. J. Demonstration of a fundamental quantum logic gate. Phys. Rev. Lett. 75, 4714-4717 (1995). 66. Grover, L. K. in Proceedings of the 28th Annual ACM Symposium on the Theory of Computing 212-218 (ACM Press, New York, 1996). 67. Lloyd, S. Universal quantum simulators. Science 273, 1073-1078 (1996). 68. Zalka, C. Simulating quantum systems on a quantum computer. Proc. R. Soc. Lond. A 454, 313-322 (1998). 69. Shor, P. W. A scheme for reducing decoherence in quantum memory. Phys. Rev. A 52, R2493-R2496 (1995). 70. Steane, A. M. Error correcting codes in quantum theory. Phys. Rev. Lett. 77, 793-797 (1996). 71. Laflamme, R., Miquel, C., Paz, J. P. & Zurek, W. H. Perfect quantum error correcting code. Phys. Rev. Lett. 77, 198-201 (1996). 72. DiVincenzo, D. P. & Shor, P. W. Fault-tolerant error correction with efficient quantum codes. Phys. Rev. Lett. 77, 3260-3263 (1996). 73. Shor, P. in Proceedings of the 37th Annual Symposium on the Foundations of Computer Science 56-65 (IEEE Computer Society Press, Los Alamitos, CA, 1996). 74. Preskill, J. Reliable quantum computers. Proc. R. Soc. Lond. A 454, 385-410 (1998). 75. Knill, E., Laflamme, R. & Zurek, W. H. Resilient quantum computation. Science 279, 342-345 (1998). 76. Cory, D. G., Fahmy, A. F. & Havel, T. F. in Proceedings of the Fourth Workshop on Physics and Computation--PhysComp96 (eds Toffoli, T., Biafore, M. & Leão, J.) 87-91 (New England Complex Systems Institute, Boston, MA, 1996). 77. Gershenfeld, N. A. & Chuang, I. L.
Bulk spin-resonance quantum computation. Science 275, 350-356 (1997). 78. Chuang, I. L., Vandersypen, L. M. K., Zhou, X., Leung, D. W. & Lloyd, S. Experimental realization of a quantum algorithm. Nature 393, 143-146 (1998). 79. Jones, J. A., Mosca, M. & Hansen, R. H. Implementation of a quantum search algorithm on a quantum computer. Nature 393, 344-346 (1998). 80. Chuang, I. L., Gershenfeld, N. & Kubinec, M. Experimental implementation of fast quantum searching. Phys. Rev. Lett. 80, 3408-3411 (1998). 81. Kane, B. A silicon-based nuclear-spin quantum computer. Nature 393, 133 (1998). 82. Nakamura, Y., Pashkin, Yu. A. & Tsai, J. S. Coherent control of macroscopic quantum states in a single-Cooper-pair box. Nature 398, 786-788 (1999). 83. Mooij, J. E. et al. Josephson persistent-current qubit. Science 285, 1036-1039 (1999). 84. Lloyd, S. & Braunstein, S. Quantum computation over continuous variables. Phys. Rev. Lett. 82, 1784-1787 (1999). 85. Abrams, D. & Lloyd, S. Nonlinear quantum mechanics implies polynomial-time solution for NP-complete and #P problems. Phys. Rev. Lett. 81, 3992-3995 (1998). 86. Zel'dovich, Ya. B. & Novikov, I. D. Relativistic Astrophysics (Univ. of Chicago Press, Chicago, 1971). 87. Novikov, I. D. & Frolov, V. P. Black Holes (Springer, Berlin, 1986). 88. Pagels, H. The Cosmic Code: Quantum Physics as the Language of Nature (Simon and Schuster, New York, 1982). 89. Coleman, S., Preskill, J. & Wilczek, F. Growing hair on black holes. Phys. Rev. Lett. 67, 1975-1978 (1991). 90. Preskill, J. Quantum hair. Phys. Scr. T 36, 258-264 (1991). 91. Fiola, T. M., Preskill, J. & Strominger, A. Black-hole thermodynamics and information loss in 2 dimensions. Phys. Rev. D 50, 3987-4014 (1994). 92. Susskind, L. & Uglum, J. Black-hole entropy in canonical quantum-gravity and superstring theory. Phys. Rev. D 50, 2700-2711 (1994). 93. Strominger, A. & Vafa, C. Microscopic origin of the Bekenstein-Hawking entropy. Phys. Lett. B 379, 99-104 (1996). 94. Das, S. R.
& Mathur, S. D. Comparing decay rates for black holes and D-branes. Nucl. Phys. B 478, 561-576 (1996). 95. Page, D. N. Particle emission rates from a black hole: massless particles from an uncharged, non-rotating black hole. Phys. Rev. D 13, 198 (1976). 96. Thorne, K. S., Zurek, W. H. & Price, R. H. in Black Holes: The Membrane Paradigm Ch. VIII (eds Thorne, K. S., Price, R. H. & Macdonald, D. A.) 280-340 (Yale Univ. Press, New Haven, CT, 1986). From shovland at mindspring.com Sat Jan 14 04:42:39 2006 From: shovland at mindspring.com (Steve Hovland) Date: Fri, 13 Jan 2006 20:42:39 -0800 Subject: [Paleopsych] Choppers going down in Iraq Message-ID: SMALL ARMS AIR DEFENSE ---------------------------------------------------------------------------- ---- "The power of an air force is terrific when there is nothing to oppose it." - - Winston Churchill "If it flies it dies." - - Anon. ADA NCO COMBAT EXPERIENCE When facing an enemy that has air superiority or air parity, air attacks are not just a probability--they are a certainty. The following battlefield experiences relate some success by ground units engaging enemy aircraft with small arms. Lesson Learned Fire at a coordinated selected aiming point (football field method) in front of the aircraft so that it will fly into the "Curtain of Lead." FALKLANDS British troops were preparing to move out of the beachhead at San Carlos Bay when four Argentine jets flying at a low level appeared without warning and headed out over the water. Forces on the ground firing small arms and automatic weapons placed a "curtain of lead" in front of the flight path of the aircraft. As the four aircraft exited from the area, pieces of the tail section from one of the Mirages began to fall off and smoke appeared to be coming from out of its side just before it crashed. [15] Lesson Learned The use of a higher proportion of tracer rounds can disturb an enemy pilot's concentration enough to cause him to miss the target or abort his attack plan.
FALKLANDS One British officer said that everything fired at attacking aircraft had good effect. If the aircraft was not shot down, the tracers and pyrotechnics intimidated the pilot into using his weapons prematurely, changing his interest, or aborting the mission. To ensure that the Argentine pilots knew they were being engaged by ground forces, the British relinked their machine gun ammunition to add more tracers. British ground forces were credited with downing three Argentine jet aircraft with small arms. [16] Lesson Learned Helicopters are especially vulnerable to ground fire. FALKLANDS Helicopter losses to ground fire on both sides were minimal. The British lost four helicopters to ground fire and the Argentines lost one. However, these low losses can be attributed to the support mission (resupply) assigned to the helicopters and to the relatively few helicopters used by both sides. Attack and air assault missions would have exposed these critical assets to more small arms and shoulder-fired surface-to-air missiles. [17] GRENADA U.S. forces in Grenada lost two helicopters to ground fire from Cubans at Edgmont military barracks. The Rangers used four UH-60s to conduct an air mobile assault on the Cuban stronghold. The landing zone was tight and surrounded by a high barbed wire fence. In the last wave, one helicopter was hit by small arms fire. The pilot, wounded in the arm and leg, lost control of his aircraft and it tumbled into another UH-60 already on the ground. Several soldiers on the ground were killed by the falling aircraft. [18] AFGHANISTAN The Mujahideen, Afghan freedom fighters, proved to be a major threat to Soviet aircraft during combat operations. An Afghan pilot of the Communist Afghan Army, who defected in 1984, disclosed that the Soviet-built Mi-24 (Hind) was extremely vulnerable to machine gun fire, especially when it is engaged from elevated positions in the mountains.
The Mujahideens' effectiveness in engaging Soviet aircraft with all weapons systems forced the Soviets to adopt the technique of engaging the Mujahideen at maximum range. For example, the Soviets began to drop their bombs from 5000 feet and fire their rockets beyond maximum effective ranges at the Mujahideen. As of 1984, the Mujahideen were credited with shooting down close to 300 Soviet helicopters using small arms and anti-tank weapons. After the arrival of Stingers in great numbers, the combined use of small-arms fire and anti-aircraft missiles reportedly brought down an average of one Soviet aircraft a day during 1987. [19] Lesson Learned A coordinated team effort using all organic weapons can help win the airspace battle over friendly forces. IRAN / IRAQ Air defense in the Iranian and Iraqi armies relies on air defense artillery, SA-7s, and the concentrated fire of automatic weapons. Both sides use the "curtain of lead" technique to focus the firepower of their small arms. However, this technique works best as a supplement to other ADA systems. Small arms fire and SAMs used together caused both Iran and Iraq to change their close air support tactics. Aircraft from both sides now have to engage their targets at the weapons system's maximum effective standoff range rather than overflying the target. This makes them less accurate and provides some measure of relief to the beleaguered infantrymen. [20] References These publications will help you train your unit to defeat attacking enemy aircraft. FM 44-8, Small Unit Self-Defense Against Air Attack, Dec 1981, is the primary doctrine for defense against enemy air attacks. FM 7-8, The Infantry Platoon and Squad (Infantry, Airborne, Assault, Ranger), 31 Dec 1980. Appendix H describes the procedures that small units should use in engaging enemy aircraft with small arms. FM 7-10, The Infantry Rifle Company (Infantry, Airborne, Assault, Ranger), Jan 1982.
Appendix P provides doctrine at the company level on how to engage enemy aircraft. FM 7-20, The Infantry Battalion (Infantry, Airborne, and Air Assault), Dec 1981. Chapter 5 provides doctrine at the battalion level on engaging enemy aircraft. Bottom Line Coordinated fire from small arms is effective against enemy aircraft. Even if an aircraft is not hit or destroyed, it can intimidate the pilot and cause him to fly higher or seek an easier target. Lesson Learned Live fire training teaches the importance of accurate and disciplined fire and helps prepare the soldier for the shock and noise of combat. From shovland at mindspring.com Sat Jan 14 04:46:46 2006 From: shovland at mindspring.com (Steve Hovland) Date: Fri, 13 Jan 2006 20:46:46 -0800 Subject: [Paleopsych] MILITARY OPERATIONS IN URBAN TERRAIN Message-ID: ---------------------------------------------------------------------------- ---- "What is the position about London? I have a very clear view that we should fight every inch of it, and that it would devour quite a large invading Army." - -Winston Churchill COMBAT EXPERIENCE While armor and mechanized infantry dominate the open terrain, dismounted infantry dominate the fight in built-up areas. Maneuver of mechanized forces in the city is restricted to the streets and alleys. Therefore, roles are reversed with mech and armor supporting dismounted infantry. MOUT is a combined arms effort but it is the dismounted infantry that clears and controls the cities. Lesson Learned MOUT demands the use of decentralized small unit operations and superb leadership. WW II: AACHEN Some of the fiercest fighting in WW II in the European Theater occurred during the Battle for Aachen in October 1944. The 26th Infantry Regiment of the 1st Infantry Division had the mission to clear the city of Aachen against stubborn German resistance. The soldiers were not prepared for urban combat and had to learn "on the job." 
Small Unit Operations Most of the burden for the fighting fell on the infantryman. Small unit actions were the rule. Infantry with hand grenades fought "from attic to attic and from sewer to sewer." Occasionally some self-propelled 155mm artillery was brought in to demolish a particularly hard target such as a bunker or fortified building. However, small teams of infantry cleared out house after house to seize the city. They used grenades and bazookas to blow a hole in one of the inside walls and moved into the adjoining building without having to brave the dangerous streets. Assault teams were formed which combined fire support, armor and infantry actions, but the city was only taken by the slow, careful and methodical advance of the infantrymen. [11] Lesson Learned Dense urban areas are easily fortified and require combined arms cooperation especially between the infantry and the engineers, to overcome major obstacles. ATTACK ON BREST The problems of fighting in an urban environment are compounded when the city in question has a number of easily fortified areas. This fact quickly became apparent to the soldiers of the 8th, 29th and 2nd Infantry Divisions of VIII Corps when they moved to attack Brest after the breakout from the Normandy Beachhead. The Germans had fortified the city and harbor area and were ordered to defend to the last man. The infantry had to infiltrate through German minefields to clear a path for the tanks and the "crocodiles" (flame-throwing tanks). They used self-propelled guns to blow down the walls of houses as they moved from street to street. FT. MONTBAREY However, the men of the 29th division were stopped by Ft. Montbarey. Built in the 17th century, Ft. Montbarey had walls up to 35 feet high, 25 feet thick and was surrounded by a deep dry moat. 
Artillery shells, even those from the 240 mm howitzers brought in especially for this operation, did little more than destroy a few exposed artillery positions and disturb the sleep of the Germans in their deep bunkers. The assault on the fort was led by the 116th Infantry Regiment. Under cover of smoke, they cleared minefields to allow the "crocodiles" to neutralize some exposed gun positions and then the 121st Engineer Combat Battalion went forward to fill in the moat. Finally, a 105 mm howitzer was brought in to fire against the main gate of the fort. After this pounding the 80 German defenders surrendered. [12] Lesson Learned Units must be trained to fight in urban areas. IRAQ Iraqi forces were neither prepared nor trained for urban warfare. In the first year of the war, the Iraqi Army attacked Iranian cities such as Khorramshahr with armored forces without dismounted infantry. These forces were repeatedly destroyed at short distances by antitank weapons and homemade explosives. The Iraqis soon discovered that fighting in built-up areas deprives armor of its advantages of mobility and firepower. The Iraqis also discovered that massing of artillery fires against the city was largely ineffective due to the cover which the buildings provided the enemy. The Iraqis were completely bogged down in Khorramshahr and had to bring in a special forces brigade to fight its way through the city to assist the stranded Iraqi units. Iraq virtually halted all offensive operations for three weeks to give special MOUT training to units before finally taking the city. Even then it took a total of 15 days and some 5000 casualties to secure the city. Iraqi losses in the city of Khorramshahr were so great they renamed it "Khunishar, The City of Blood." [13] IRAN The effectiveness of dismounted troops fighting in urban terrain was reinforced by the Iranians in the same battle.
The Iranian Revolutionary Guards, a highly motivated organization with minimal military training, were in command of the defenses of Khorramshahr. Under their control was a hodgepodge of elements made up of police, customs agents, armed forces trainees and volunteers. Fighting in small but organized teams, the Iranians took advantage of the cover provided by the buildings and rubble to move close enough to engage the Iraqi tanks at point blank range. A successful tactic used by the Iranians was to knock out the lead and rear tanks with antitank weapons at close range to stop the advance of an armored column. Then, using all available weapons (to include homemade explosives and fire bombs), they systematically destroyed the remaining armored vehicles. The makeshift army of the Revolutionary Guard successfully defended the city until the Iraqis changed their tactics to counter dismounted infantry in the city with dismounted infantry of their own. [14] References The following references will assist commanders to train for MOUT. FM 90-10, Military Operations on Urbanized Terrain (MOUT), Aug 79 FM 90-10-1, An Infantryman's Guide to Urban Combat, Sep 82 FM 7-70, 71, 72, Light Infantry Platoon/Squad, Company, and Battalion, Sep 86, Aug 87, and Mar 87. Bottom Line Urban combat is primarily a dismounted infantry action that requires different tactics, techniques and organization. ---------------------------------------------------------------------------- ---- From checker at panix.com Sat Jan 14 10:24:18 2006 From: checker at panix.com (Premise Checker) Date: Sat, 14 Jan 2006 05:24:18 -0500 (EST) Subject: [Paleopsych] Edge Annual Question 2001: What Questions Have Disappeared? Message-ID: Edge Annual Question 2001: What Questions Have Disappeared?
For its fourth anniversary edition--"The World Question Center 2001"--Edge has reached out to a wide group of individuals distinguished by their significant achievements and asked them to respond to the following question: "What Questions Have Disappeared?" At publication, 83 responses (34,000 words plus) have been posted. Additional responses are expected in the coming weeks and will be posted on Edge as they are received. Happy New Year! John Brockman Publisher & Editor [Simultaneously published in German by Frankfurter Allgemeine Zeitung--Frank Schirrmacher, Publisher.] Join the Edge public forum at New Minds Meet Online to Offer New Perspectives on Old Questions January 9, 2001 By THE NEW YORK TIMES (free registration required) Once a year, John Brockman of New York, a writer and literary agent who represents many scientists, poses a question in his online journal, The Edge, and invites the thousand or so people on his mailing list to answer it. At the end of 1998, for example, he asked readers to name the most important invention in 2,000 years; the question generated 117 responses as diverse as hay and birth control pills. This year, Mr. Brockman offered a question about questions: "What questions have disappeared, and why?" Here are edited excerpts from some of the answers, to be posted today at www.edge.org..... "Which questions have disappeared?" The Sphinx in the New Economy: a survey of leading scientists NEW YORK, January 8. Even the future cannot do without traditions. Even a Mme. de Sévigné travelling through time with Mlle. de Scudéry would not be too surprised to come across, while surfing the net, an Internet magazine that unabashedly gives itself the precious name "Salon". But wherever a salon invites one to linger, muse and shine, a prize question cannot be far away. Electronically it works much as it did in the days of the Enlightenment and its debating circles.
In his Internet salon (www.edge.org), publisher and literary agent John Brockman likes to tempt learned luminaries into answering such questions at the start of each year. This time he has made the ritual itself his theme and asks about questions that no one asks anymore. Around a hundred scientists, philosophers and writers of the so-called "Third Culture" are taking part in the game, though not all of them have understood the rules in the same way. A question can, after all, disappear for many reasons. Perhaps it has been answered; perhaps it cannot be answered, which as a rule only stimulates intellectual zeal all the more; or perhaps the question was never worth asking in the first place...... Izumi Aizu · Alun Anderson · Philip W. Anderson · Robert Aunger · John Barrow · Thomas A. Bass · David Berreby · Susan Blackmore · Stewart Brand · Rodney A. Brooks · David M. Buss · Jason McCabe Calacanis · William H. Calvin · Andy Clark · Ann Crittenden · Paul Davies · Richard Dawkins · Stanislas Dehaene · David Deutsch · Keith Devlin · Denis Dutton · George B. Dyson · J. Doyne Farmer · Kenneth Ford · Howard Gardner · Joel Garreau · David Gelernter · Brian Goodwin · David Haig · Judy Harris · Marc D. Hauser · Geoffrey Hinton · John Horgan · Verena Huber-Dyson · Nicholas Humphrey · Mark Hurst · Piet Hut · Raphael Kasper · Kevin Kelly · Lance Knobel · Marek Kohn · Stephen M. Kosslyn · Kai Krause · Lawrence M. Krauss · Leon Lederman · Joseph Le Doux · Pamela McCorduck · Dan McNeill · John H. McWhorter · Geoffrey Miller · David Myers · Randolph M. Nesse · Tor Norretranders · Rafael E. Núñez · James J. O'Donnell · Jay Ogilvy · Sylvia Paull · John Allen Paulos · Christopher Phillips · Cliff Pickover · Steven Pinker · Jordan Pollack · David G. Post · Rick Potts · Robert Provine · Eduardo Punset · Tracy Quan · Martin Rees · Howard Rheingold · Douglas Rushkoff · Karl Sabbagh · Roger Schank · Stephen H. Schneider · Al Seckel · Terrence J. Sejnowski ·
Michael Shermer · Lee Smolin · Dan Sperber · Tom Standage · Timothy Taylor · Joseph Traub · Colin Tudge · Sherry Turkle · Henry Warwick · Margaret Wertheim · Dave Winer · Naomi Wolf · Milford Wolpoff · Eberhard Zangger · Carl Zimmer Izumi Aizu "Who should make the truly global decisions, and how?" As we all use the global medium, the Internet, the people who run it behind the scenes are making the decisions about how this medium is run. So far so good. But not anymore. With the whole ICANN process, the commercialization of Domain Name registration, and the expansion of the new gTLDs, one can ask: who is entitled to make these decisions, and how come they can decide that way? Despite the growing digital divide, the number of people who use the Net is still exploding, even in the developing part of the world. What is fair, what is democratic, what kind of principles can we all agree on for this single global complex system, from all corners of the world: that is my question for the year to come. IZUMI AIZU, a researcher and promoter of the Net in Asia since the mid-'80s, is principal, Asia Network Research and Senior Research Fellow at GLOCOM (Center for Global Communications), at the International University of Japan. Alun Anderson "Why are humans smarter than other animals?" Such a simple question. Many of you might think "Has that question really disappeared?" Some questions disappear for ever because they have been answered. Some questions go extinct because they were bad questions to begin with. But there are others that appear to vanish but then we find that they are back with us again in a slightly different guise. They are questions that are just too close to our hearts for us to let them die completely. For millennia, human superiority was taken for granted. From the lowest forms of life up to humans and then on to the angels and God, all living things were seen as arranged in the Great Chain of Being. Ascend the chain and perfection grows.
It is a hierarchical philosophy that conveniently allows for the exploitation of dumber beasts--of other species or races--as a right by their superiors. We dispose of them as God disposes of us. The idea of human superiority should have died when Darwin came on the scene. Unfortunately, the full implications of what he said have been difficult to take in: there is no Great Chain of Being, no higher and no lower. All creatures have adapted effectively to their own environments in their own way. Human "smartness" is just a particular survival strategy among many others, not the top of a long ladder. It took a surprisingly long time for scientists to grasp this. For decades, comparative psychologists tried to work out the learning abilities of different species so that they could be arranged on a single scale. Animal equivalents of intelligence tests were used and people seriously asked whether fish were smarter than birds. It took the new science of ethology, created by Nobel-prize winners Konrad Lorenz, Niko Tinbergen and Karl von Frisch, to show that each species had the abilities it needed for its own lifestyle and that they could not be arranged on a universal scale. Human smartness is no smarter than anyone else's smartness. The question should have died for good. Artificial intelligence researchers came along later but they too could not easily part from medieval thinking. The most important problems to tackle were agreed to be those that represented our "highest" abilities. Solve them and everything else would be easy. As a result, we have ended up with computer programs that can play chess as well as a grandmaster. But unfortunately we have none that can make a robot walk as well as a 2-year-old, let alone run like a cat. The really hard problems turn out to be those that we share with "lower" animals. Strangely enough, even evolutionary biologists still get caught up with the notion that humans stand at the apex of existence.
There are endless books from evolutionary biologists speculating on the reasons why humans evolved such wonderful big brains, but a complete absence of those which ask whether a big brain is a really useful organ to have. The evidence is far from persuasive. If you look at a wide range of organisms, those with bigger brains are generally no more successful than those with smaller brains--they go extinct just as fast. Of course, it would be really nice to sample a large range of different planets where life is to be found and see if big-brained creatures do better over really long time scales (the Earth is quite a young place). Unfortunately, we cannot yet do that, although the fact that we have never been contacted by any intelligent life from older parts of the Universe suggests that it usually comes to a bad end. Still, as we are humans it's just so hard not to be seduced by the question "What makes us so special?", which is just the same as the question above but in a different form. When you switch on a kitchen light and see a cockroach scuttle for safety you can't help seeing it as a lower form of life. Unfortunately, there are a lot more of them than there are of us and they have been around far, far longer. Cockroach philosophers doubtless entertain their six-legged friends by asking "What makes us so special?" ALUN ANDERSON is Editor-in-Chief of New Scientist. Philip W. Anderson "A question no longer: what is the Theory of Every Thing?" My colleagues in the fashionable fields of string theory and quantum gravity advertise themselves as searching desperately for the "Theory of Everything", while their experimental colleagues are gravid with the "God Particle", the marvelous Higgson which is the somewhat misattributed source of all mass. (They are also after an understanding of the earliest few microseconds of the Big Bang.) As Bill Clinton might remark, it depends on what the meaning of "everything" is.
To these savants, "everything" means a list of some two dozen numbers which are the parameters of the Standard Model. This is a set of equations which already exists and does describe very well what you and I would be willing to settle for as "everything". This is why, following Bob Laughlin, I make the distinction between "everything" and "every thing". Every thing that you and I have encountered in our real lives, or are likely to interact with in the future, is no longer outside of the realm of a physics which is transparent to us: relativity, special and general; electromagnetism; the quantum theory of ordinary, usually condensed, matter; and, for a few remote phenomena, hopefully rare here on earth, our almost equally cut-and-dried understanding of nuclear physics. [Two parenthetic remarks: 1) I don't mention statistical mechanics only because it is a powerful technique, not a body of facts; 2) our colleagues have done only a sloppy job so far of deriving nuclear physics from the Standard Model, but no one really doubts that they can.] I am not arguing that the search for the meaning of those two dozen parameters isn't exciting, interesting, and worthwhile: yes, it's not boring to wonder why the electron is so much lighter than the proton, or why the proton is stable at least for another 35 powers of ten years, or whether quintessence exists. But learning why can have no real effect on our lives, spiritually inspiring as it would indeed be, even to a hardened old atheist like myself. When I was learning physics, half a century ago, the motivation for much of what was being done was still "is quantum theory really right?" Not just QED, though the solution of that was important, but there were still great mysteries in the behavior of ordinary matter--like superconductivity, for instance. 
It was only some twenty years later that I woke up to the fact that the battle had been won, probably long before, and that my motivation was no longer to test the underlying equations and ideas, but to understand what is going on. Within the same few years, the molecular biology pioneers convinced us we needed no mysterious "life force" to bring all of life under the same umbrella. Revolutions in geology, in astrophysics, and the remarkable success of the Standard Model in sorting out the fundamental forces and fields, leave us in the enviable position I described above: given any problematic phenomenon, we know where to start, at least. And nothing uncovered in string theory or quantum gravity will make any difference to that starting point. Is this Horgan's End of Science? Absolutely not. It's just that the most exciting frontier of science no longer lies at the somewhat sophomoric--or quasi-religious--level of the most "fundamental" questions of "what are we made of?" and the like; what needs to be asked is "how did all this delightful complexity arise from the stark simplicity of the fundamental theory?" We have the theory of every thing in any field of science you care to name, and that's about as far as it gets us. If you like, science is now almost universally at the "software" level; the fundamental physicists have given us all the hardware we need, but that doesn't solve the problem, in physics as in every other field. It's a different game, probably a much harder one in fact, as it has often been in the past; but the game is only begun. PHILIP W. ANDERSON is a Nobel laureate physicist at Princeton and one of the leading theorists on superconductivity. He is the author of A Career in Theoretical Physics, and Economy as a Complex Evolving System. Robert Aunger "Is the Central Dogma of biology inviolate?" In 1957, a few years after he co-discovered the double helix, Francis Crick proposed a very famous hypothesis.
It states that "once 'information' has passed into protein it cannot get out again. In more detail, the transfer of information from nucleic acid to nucleic acid, or from nucleic acid to protein may be possible, but transfer from protein to protein, or from protein to nucleic acid is impossible." After it had proven to form the foundation of molecular biology, he later called this hypothesis the "Central Dogma" of biology. In the last years of the last millennium, Crick's dogma fell. The reason? Direct protein-to-protein information transfer was found to be possible in a class of proteins called "prions." With the aid of a catalyst, prions (short for "proteinaceous infectious particles") cause another molecule of the same class to adopt an infectious shape like their own simply through contact. Thus, prions are an important and only recently discovered mechanism for the inheritance of information through means other than DNA. Such an important discovery merited a recent Nobel Prize for Stanley Prusiner, who doggedly pursued the possibility of a rogue biological entity replicating without the assistance of genes against a backdrop of resistance and disbelief among most of his colleagues. Further testimony to the significance of prions comes from the current BSE crisis in Europe. Now that we know how they work, prions--and the diseases they cause--may begin popping up all over the place. ROBERT AUNGER is an anthropologist studying cultural evolution, both through the now much-maligned method of fieldwork in nonwestern societies, and the application of theory adapted from evolutionary biology. He is at the Department of Biological Anthropology at the University of Cambridge, and the editor of Darwinizing Culture: The Status of Memetics as a Science. John Barrow "How does a slide rule work?" My vanished question is: "How does a slide rule work?" Slide rules were once ubiquitous in labs, classrooms, and the pockets of engineers.
They are now as common as dinosaurs; totally replaced by electronic calculators and computers. The interesting question to ponder is: what is it that in the future will do to computers what computers did to slide rules? JOHN BARROW is a physicist at Cambridge University. He is the author of The World Within the World, Pi in the Sky, Theories of Everything, The Origins of the Universe (Science Masters Series), The Left Hand of Creation, The Artful Universe, and Impossibility: The Limits of Science and the Science of Limits. Thomas A. Bass "The questions that have disappeared are eschatological." The twentieth century will be remembered as one of the most violent in history. There were two world wars, numerous genocides, and millions of murders conducted in the name of progress. Driving this violence was the urge to find truth or purity. The violence was lit by the refining fire of belief. The redemptive ideal was called national socialism, communism, Stalinism, Maoism. Today, these gods have feet of clay, and we mock their pretensions. Global consumerism is the new world order, but global consumerism is not a god. Market capitalism does not ask questions about transcendent meaning. Western democracies, nodding into the sleep of reason, have grown numb with self-congratulation about having won the hot, the cold, and, now, the star wars. The questions that have disappeared are eschatological. But they have not really disappeared. They are a chthonic force, waiting underground, searching for a new language in which to express themselves. This observation sprang to mind while I was standing in the Place de la Revolution, now known as the Place de la Concorde, awaiting the arrival of the third millennium. During the French revolution this square was so soaked in blood that oxen refused to cross it. On New Year's Eve it was soaked in rain and champagne, as we counted down to a display of fireworks that never materialized.
Instead, there was a Ferris wheel, lit alternately in mauve and chartreuse, and some lasers illuminating the Luxor obelisk which today is the square's secular center. No one staring at it knew how to read the hieroglyphics carved on its face, but this obelisk was once a transcendent object, infused with meaning, and so, too, was the guillotine that formerly stood in its place. THOMAS A. BASS, who currently lives in Paris, is the author of The Eudaemonic Pie, Vietnamerica, The Predictors, and other books. David Berreby "How does [fill in the blank] in human affairs relate to the great central theory?" I do not, of course, mean any particular Great Central Theory. I am referring to the once-pervasive habit of relating everything that had human scale--Chinese history, the Odyssey, your mother's fear of heights--to an all-explaining principle. This principle was set forth in a short shelf of classic works and then worked to a fine filigree by close-minded people masquerading as open-minded people. The precise Great Central Theory might be, as it was in my childhood milieu, the theories of Freud. It might be Marx. It might be Levi-Strauss or, more recently, Foucault. At the turn of the last century, there was a Darwinist version going, promulgated by Francis Galton, Herbert Spencer and their ilk. These monolithic growths had begun, I suppose, as the answers to specific questions, but then they metastasized; their adherents would expect the great central theory to answer any question. Commitment to a Great Central Theory thus became more a religious act than an intellectual one. And, as with all religions, the worldview of the devout crept into popular culture. (When I was in high school we'd say So-and-So was really anal about his locker or that What's-his-name's parents were really bourgeois.) For decades, this was what intellectual life appeared to be: Commit to an overarching explanation, relate it to everything you experienced, defend it against infidels.
Die disillusioned, or, worse, die smug. So why has this sort of question vanished? My guess is that, broadly speaking, it was a product of the Manichean worldview of the last century. Depression, dictators, war, genocide, nuclear terror--all of these lend themselves to a Yes-or-No, With-Us-or-With-Them, Federation vs. Klingons mindset. We were, to put it simply, all a little paranoid. And paranoids love a Great Key: Use this and see the single underlying cause for what seems to be unrelated and random! Nowadays the world, though no less dangerous, seems to demand attention to the separateness of things, the distinctiveness of questions. "Theories of everything" are terms physicists use to explain their near-theological concerns, but at the human scale most people care about, where we ask questions like "Why can't we dump the Electoral College?" or "How come Mom likes my sister better?", the Great Central Theory question has vanished with the black-or-white arrangement of the human world. What's next? Three possibilities. One, some new Great Central Theory slouches in; some of the Darwinians think they've got the candidate, and they certainly evince signs of quasi-religious commitment. (For example, as a Freudian would say you doubted Freud because of your neuroses, I have heard Darwinians say I doubted their theories because of an evolved predisposition not to believe the truth. I call this quasi-religious because this move makes the theory impregnable to evidence or new ideas.) Two, the notion that overarching theory is impossible becomes, itself, a new dogma. I lean toward this prejudice myself but I recognize its dangers. An intellectual life that was all boutiques could be, in its way, as stultifying as a giant one-product factory. Three, we learn from the mistakes of the last two centuries and insist that our answers always match our questions, and that the distinction between theory and religious belief be maintained.
DAVID BERREBY'S writing about science and culture has appeared in The New York Times Magazine, The New Republic, Slate, The Sciences and many other publications. Susan Blackmore "Do we survive death?" This question was long considered metaphysical, briefly became a scientific question, and has now disappeared again. Victorian intellectuals such as Frederic Myers, Henry Sidgwick and Edmund Gurney founded the Society for Psychical Research in 1882 partly because they realised that the dramatic claims of spiritualist mediums could be empirically tested. They hoped to prove "survival" and thus overturn the growing materialism of the day. Some, like Faraday, convinced themselves by experiment that the claims were false, and lost interest. Others, like Myers, devoted their entire lives to ultimately inconclusive research. The Society continues to this day, but survival research has all but ceased. I suggest that no one asks the question any more because the answer seems too obvious. To most scientists it is obviously "No", while to most New Agers and religious people it is obviously "Yes". But perhaps we should. The answer may be obvious (it's "No"--I'm an unreligious scientist) but its implications for living our lives and dealing compassionately with other people are profound. SUSAN BLACKMORE is a psychologist and ex-parapsychologist, who--when she found no evidence of psychic phenomena--turned her attention to why people believe in them. She is author of several skeptical books on the paranormal and, more recently, The Meme Machine. Stewart Brand "How will Americans handle a surplus of leisure?" "Can the threat of recombinant DNA possibly be contained?" "How will Americans handle a surplus of leisure?" That was a brow-furrower in the late '50s and early '60s for social observers and forecasters. Whole books addressed the problem, most of them opining that Americans would have to become very interested in the arts. 
Turned out the problem never got around to existing, and the same kind of people are worrying now about how Americans will survive the stress of endless multi-tasking. "Can the threat of recombinant DNA possibly be contained?" That was the brand new bogey of the mid-'70s. At a famous self-regulating conference at the Asilomar conference center in California, genetic researchers debated the question and imposed rules (but not "relinquishment") on the lab work. The question was answered: the threat was handily contained, and it was not as much of a threat as feared anyway. Most people retrospectively applaud the original caution. Similar fears and debate now accompany the introduction of genetically modified foods and organisms. Maybe it's the same question rephrased, and it will keep being rephrased as long as biotech is making news. Can the threat of frankenfoods possibly be contained? Can the threat of gene-modified children possibly be contained? Can the threat of bioweapons possibly be contained? Can the threat of human life extension possibly be contained? It won't be a new question until it reaches reflexivity: "Are GM humans really human?" STEWART BRAND is founder of the Whole Earth Catalog, cofounder of The Well, cofounder of Global Business Network, and cofounder and president of The Long Now Foundation. He is the original editor of The Whole Earth Catalog, author of The Media Lab: Inventing the Future at MIT, How Buildings Learn, and The Clock of the Long Now: Time and Responsibility (MasterMinds Series). Rodney A. Brooks "What is it that makes something alive?" With the success of molecular biology in explaining the mechanisms of life we have lost sight of the question one level up. We do not have any good answers at a more systems level of what it takes for something to be alive. We can list general necessities for a system to be alive, but we cannot predict whether a given configuration of molecules will be alive or not.
As evidence that we really do not understand what it takes for something to be alive, we have not been able to build machines that are alive. Everything else that we understand leads to machines that capitalize on that understanding--machines that fly, machines that run, machines that calculate, machines that make polymers, machines that communicate, machines that listen, machines that play games. We have not built any machines that live. RODNEY A. BROOKS is director of the MIT Artificial Intelligence Laboratory and Chairman of iRobot Corporation. He builds robots. David M. Buss "Do Men and Women Differ Psychologically?" Psychology for much of the 20th century was dominated by the view that men and women were psychologically identical. So pervasive was this assumption that research articles in psychology journals prior to the 1970s rarely bothered to report the sex of their study participants. Women and men were understood to be interchangeable. Findings for one sex were presumed to be applicable to the other. Once the American Psychological Association required the sex of participants to be reported in published experiments, controversy erupted over whether men and women were psychologically different. The past three decades of empirical research have resolved this issue, at least in delimited domains. Although women and men show great psychological similarity, they also differ in profound ways. They diverge in the sexual desires they express and the mating strategies they pursue. They differ in the time they allocate to friends and the relentlessness with which they pursue status. They display distinct abilities in reading others' minds, feeling others' feelings, and responding emotionally to specific traumas in their lives. Men opt for a wider range of risky activities, are more prone to violence against others, make sharper in-group versus out-group distinctions, and commit the vast majority of homicides worldwide. The question 'Do men and women differ psychologically?'
has been replaced with more interesting questions. In what ways do these sex differences create conflict between men and women? Have the selection pressures that created these differences vanished in the modern world? How can societies premised on equality grapple with the profound psychological divergences of the sexes? DAVID M. BUSS is Professor of Psychology at the University of Texas, Austin, and author of several books, most recently The Dangerous Passion: Why Jealousy is as Necessary as Love, Evolutionary Psychology: The New Science of the Mind, and The Evolution of Desire: Strategies of Human Mating. Jason McCabe Calacanis "How long before all nations obey the basic principles of human rights as outlined in the Universal Declaration of Human Rights of December 10th, 1948?" The distinctive Amnesty International arched sticker, with a burning candle surrounded by a swoosh of barbed wire, seemed to adorn every college dorm-room door, beat-up Honda Accord, and office bulletin board when I started college in the late '80s at Fordham University. Human rights was the "in" cause. So, we all joined Amnesty and watched our heroes, including Bruce Springsteen, Sting, and Peter Gabriel, sing on the "Human Rights Now" tour (brought to you, of course, by Reebok). As quickly as it took center stage, however, human rights seemed to fall off the map. Somewhere in the mid-'90s, something stole our fire and free time; perhaps it was the gold rush years of the Internet or the end of the Cold War. The wild spread of entrepreneurship and capitalism may have carried some democracy along with it. Yet just because people are starting companies and economic markets are opening up doesn't mean that there are fewer tortures, rapes, and murders for political beliefs. (These kinds of false perceptions may stem from giving places like China "Most Favored Nation" status.)
Youth inspired by artists created the foundation of Amnesty's success in the '80s, so maybe a vacuum of activist artists is to blame for human rights disappearing from the collective consciousness. Would a homophobic, misogynistic, and violent artist like Eminem ever take a stand for anyone other than himself? Could anyone take him seriously if he did? Britney Spears' fans might not have a problem with her dressing in a thong at the MTV Music Awards, but how comfortable would they be if she addressed the issue of the rape, kidnapping, and torture of young women in Sierra Leone? Of course, you don't have to look around the world to find human-rights abuses. Rodney King and Abner Louima taught us that human rights is an important and pressing issue right in our backyard. (Because of these examples, some narrow-minded individuals may see it as only a race-specific issue.) One bright spot in all of this, however, is that the technology that was supposed to create a Big Brother state, like video cameras, is now being used to police Big Brother himself. (Check out witness.org and send them a check--or a video camera--if you have the means.) Eleanor Roosevelt considered her fight to create the Universal Declaration of Human Rights her greatest accomplishment. How ashamed would she be that 50 years have elapsed since her battle, and now, no one seems to care. JASON McCABE CALACANIS is Editor and Publisher of Silicon Alley Daily, The Digital Coast Weekly, and Silicon Alley Reporter, and Chairman and CEO of Rising Tide Studios. William H. Calvin "Where did the moon go?" When, every few years, you see a bite taken out of the sun or moon, you ought to remember just how frightening that question used to be. It became clockwork when the right viewpoint was eventually discovered by science (imagining yourself high above the north pole, looking at the shadows cast by the earth and the moon).
But there was an intermediate stage of empirical knowledge, when the shaman discovered that the sixth full moon after a prior eclipse had a two-thirds chance of being associated with another eclipse. And so when the shaman told people to pray hard the night before, he was soon seen as being on speaking terms with whoever ran the heavens. This helped convert part-time shamans into full-time priests, supported by the community. This can be seen as the entry-level job for philosophers and scientists, who prize the discoveries they can pass on to the next generation, allowing us to see farther, always opening up new questions while retiring old ones. It's like climbing a mountain that keeps providing an even better viewpoint. WILLIAM H. CALVIN is a neurobiologist at the University of Washington, who writes about brains, evolution, and climate. His recent books are The Cerebral Code, How Brains Think, and (with the linguist Derek Bickerton) Lingua ex Machina. Andy Clark "Why Is There Something Instead of Nothing?" This is a question that the ancients asked, and one that crops up a few times in 20th century philosophical discussions. When it is mentioned, it is usually as an example of a problem that looks to be both deep and in principle insoluble. Unsurprisingly, then, it seems to have fallen by the scientific, cosmological and philosophical waysides. But sometimes I wonder whether it really is insoluble (or senseless), or whether science may one day surprise us by finding an answer. ANDY CLARK is Professor of Philosophy and Cognitive Science at the University of Sussex, UK. He was previously Director of the Philosophy/Neuroscience/Psychology Program at Washington University in St. Louis. He is the author of Microcognition: Philosophy, Cognitive Science and Parallel Distributed Processing, Associative Engines, and Being There: Putting Brain, Body and World Together Again. Ann Crittenden "Is human nature innately good or evil?"
Another question that has fallen into the dustbin of history is this: Is human nature innately good or evil? This became a gripping topic in the late 17th century, as Enlightenment thinkers began to challenge the Christian assumption that man was born a fallen creature. It was a great debate while it lasted: original sin vs. tabula rasa and the perfectibility of man; Edmund Burke vs. Tom Paine; Dostoyevsky vs. the Russian reformers. But Darwin and Freud undermined the foundations of both sides, by discrediting the very possibility of discussing human nature in moral or teleological terms. Now the debate has been recast as "nature vs. nurture" and in secular scientific circles at least, man is the higher primate--a beast with distinctly mixed potential. ANN CRITTENDEN is an award-winning journalist and author. She was a reporter for The New York Times from 1975 to 1983, where her work on a broad range of economic issues was nominated for the Pulitzer Prize. She is the author of several books including The Price of Motherhood: Why the Most Important Job in the World is Still the Least Valued. Her articles have appeared in numerous magazines, including The Nation, Foreign Affairs, McCall's, Lear's, and Working Woman. Paul Davies "How fast is the Earth moving?" A hundred years ago, one of the most fundamental questions in physical science was: How fast is the Earth moving? Many experiments had been performed to measure the speed of the Earth through space as it orbits the sun, and as the solar system orbits the galaxy. The most famous was conducted in 1887 by Albert Michelson and Edward Morley using an optical interferometer. The result they obtained was... zero. Today, scientists regard the question of the Earth's speed through space as meaningless and misconceived, although many non-scientists still refer to the concept. Why has the question disappeared?
Einstein's theory of relativity, published in 1905, denied any absolute frame of rest in the universe; speed is meaningful only relative to other bodies or physical systems. Ironically, some decades later, it was discovered there is a special frame of reference in the universe defined by the cosmic microwave background radiation, the fading afterglow of the big bang. The Earth sweeps through this radiation at roughly 600 km per second (over a million miles per hour) in the direction of the constellation Leo. This is the closest that modern astronomy gets to the notion of an absolute cosmic velocity. PAUL DAVIES is an internationally acclaimed physicist, writer and broadcaster, now based in South Australia. Professor Davies is the author of some twenty books, including Other Worlds, God and the New Physics, The Edge of Infinity, The Mind of God, The Cosmic Blueprint, Are We Alone? and About Time. He is the recipient of a Glaxo Science Writers' Fellowship, an Advance Australia Award and a Eureka prize for his contributions to Australian science, and in 1995 he won the prestigious Templeton Prize for his work on the deeper meaning of science. Richard Dawkins "As William Blake might have written to a coelacanth: Did he who made the haplochromids make thee?" Different people on the Edge list seem to have chosen to understand 'questions that have disappeared' in three very different senses: 1. Questions that were once popular but have now been answered 2. Questions that should never have been asked in the first place 3. Questions that have disappeared although they never received a satisfactory answer. This third meaning is, I suspect, the one intended by the organizer of the forum. It is the most interesting of the three since it suggests real science that we should now be doing, rather than just raking over the historical coals. The three meanings are too disparate to bring together easily, but I'll try. 
The popular question 'Has there been enough time for evolution to take place?' can now confidently be answered in the affirmative. It should never have been put in the first place since, self-evidently, we are here. But what is more interesting is that the real question that faces us is almost the exact opposite. Why is evolution so slow, given that natural selection is so powerful? Far from there being too little time for evolution to play with, there seems to be too much. Ledyard Stebbins did a theoretical calculation about an extremely weak selection pressure, acting on a population of mouse-sized animals to favor the largest individuals. His hypothetical selection pressure was so weak as to be below the threshold of detectability in field sampling studies. Yet the calculated time to evolve elephant-sized descendants from mouse-sized ancestors was only a few tens of thousands of generations: too short to be detected under most circumstances in the fossil record. To exaggerate somewhat, evolution could be yo-yo-ing from mouse to elephant, and back again, so fast that the changes could seem instantaneous in the fossil record. Worse, Stebbins's calculation assumed an exceedingly weak selection pressure. The real selection pressures measured in the field by Ford and his colleagues on lepidoptera and snails, by Endler and his colleagues on guppies, and by the Grants and their colleagues on the Galapagos finches, are orders of magnitude stronger. If we fed into the Stebbins calculation a selection pressure as strong as the Grants have measured in the field, it is positively worrying to contemplate how fast evolution could go. The same conclusion is indirectly suggested by domestic breeding. We have gone from wolf to Yorkshire terrier in a few centuries, and could presumably go back to something like a wolf in as short a time. 
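Stebbins's conclusion is easy to reproduce with back-of-the-envelope arithmetic. A minimal sketch (the body masses and the per-generation increment here are my own illustrative assumptions, chosen to be far below field detectability, not Stebbins's published parameters):

```python
import math

# Illustrative masses (assumed, not from the text): a mouse-sized
# ancestor of ~40 g evolving into an elephant-sized descendant of
# ~5,000,000 g.
mouse_g = 40.0
elephant_g = 5_000_000.0

# Assume selection so weak that mean body mass rises only 0.02% per
# generation -- a change no field sampling study could detect.
growth_per_gen = 0.0002

# Compound growth: generations needed for the full transition.
generations = math.log(elephant_g / mouse_g) / math.log(1 + growth_per_gen)
print(f"{generations:,.0f} generations")
```

Even at this undetectably weak pressure, the mouse-to-elephant transition takes on the order of tens of thousands of generations -- effectively instantaneous at the resolution of most fossil records, which is the point of the calculation.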
It is indeed the case that evolution on the Galapagos archipelago has been pretty fast, though still nothing like as fast as the measured selection pressures might project. The islands have been in existence for five million years at the outside, and the whole of their famous endemic fauna has evolved during that time. But even the Galapagos islands are old compared to Lake Victoria. In the less than one million years of the lake's brief lifetime, more than 170 species of the genus Haplochromis alone have evolved. Yet the Coelacanth Latimeria, and the three genera of lungfish, have scarcely changed in hundreds of millions of years. Surviving Lingula ('lamp shells') are classified in the same genus as their ancestors of 400 million years ago, and could conceivably interbreed with them if introduced through a time machine. The question that still faces us is this. How can evolution be both so fast and so leadenly slow? How can there be so much variance in rates of evolution? Is stasis just due to stabilizing selection and lack of directional selection? Or is there something remarkably special going on in the (non) evolution of living fossils? As William Blake might have written to a coelacanth: Did he who made the haplochromids make thee? RICHARD DAWKINS is an evolutionary biologist and the Charles Simonyi Professor For The Understanding Of Science at Oxford University; Fellow of New College; author of The Selfish Gene, The Extended Phenotype, The Blind Watchmaker, River out of Eden (ScienceMasters Series), Climbing Mount Improbable, and Unweaving the Rainbow. Stanislas Dehaene "The definition of life and consciousness?" Some scientific questions cannot be resolved, but rather are dissolved, and vanish once we begin to better understand their terms. This is often the case for "definitional questions". For instance, what is the definition of life? Can we trace a sharp boundary between what is living and what is not living? Is a virus living?
Is the entire earth a living organism? It seems that our brain predisposes us to ask questions that require a yes or no answer. Moreover, as scientists, we'd like to keep our mental categories straight and, therefore, we would like to have neat and tidy definitions of the terms we use. However, especially in the biological sciences, the objects of reality do not conform nicely to our categorical expectations. As we delve into research, we begin to realize that what we naively conceived of as an essential category is, in fact, a cluster of loosely bound properties that each need to be considered in turn (in the case of life: metabolism, reproduction, autonomy, homeostasis, etc.). Thus, what was initially considered as a simple question, requiring a straightforward answer, becomes a complex issue or even a whole domain of research. We begin to realize that there is no single answer, but many different answers depending on how one frames the terms of the question. And eventually, the question is simply dropped. It is no longer relevant. I strongly suspect that one of today's hottest scientific questions, the definition of consciousness, is of this kind. Some scientists seem to believe that what we call consciousness is an essence of reality, a single coherent phenomenon that can be reduced to a single level such as a quantum property of microtubules. Another possibility, however, is that consciousness is a cluster of properties that, most of the time, cohere together in awake adult humans. A minimal list probably includes the ability to attend to sensory inputs or internal thoughts, to make them available broadly to multiple cerebral systems, to store them in working memory and in episodic memory, to manipulate them mentally, to act intentionally based on them, and in particular to report them verbally. As we explore the issue empirically, we begin to find many situations (such as visual masking or specific brain lesions) in which those properties break down.
The neat question "what is consciousness" dissolves into a myriad of more precise and more fruitful research avenues. Any biological theory of consciousness, which assumes that consciousness has evolved, implies that "having consciousness" is not an all-or-none property. The biological substrates of consciousness in human adults are probably also present, but only in partial form, in other species, in young children or brain-lesioned patients. It is therefore a partially arbitrary question whether we want to extend the use of the term "consciousness" to them. For instance, several mammals, and even very young human children, show intentional behavior, partially reportable mental states, some working memory ability--but perhaps no theory of mind, and more "encapsulated" mental processes that cannot be reported verbally or even non-verbally. Do they have consciousness, then? My bet is that once a detailed cognitive and neural theory of the various aspects of consciousness is available, the vacuity of this question will become obvious. STANISLAS DEHAENE, researcher at the Institut National de la Santé, studies cognitive neuropsychology of language and number processing in the human brain; author of The Number Sense: How Mathematical Knowledge Is Embedded In Our Brains. David Deutsch "And why?" "What Questions Have Disappeared...And Why?" Funny you should ask that. "And why?" could itself be the most important question that has disappeared from many fields. "And why?": in other words, "what is the explanation for what we see happening?" "What is it in reality that brings about the outcome that we predict?" Whenever we fail to take that question seriously enough, we are blinded to gaps in our favoured explanation. And so, when we use that explanation to interpret regularities that we may observe, instead of understanding that the explanation was an assumption in our analysis, we regard it as the inescapable implication of our observations.
"I just can't feel myself split", complained Bryce DeWitt when he first encountered the many-universes interpretation of quantum theory. Then Hugh Everett convinced him that this was the same circular reasoning that Galileo rejected when he explained how the Earth can be in motion even though we observe it to be at rest. The point is, both theories are consistent with that observation. Thanks to Everett, DeWitt and others, the "and why" question began gradually to return to quantum theory, whence it had largely disappeared during the 1930s. I believe that its absence did great harm both in impeding progress and in encouraging all sorts of mystical fads and pseudo-science. But elsewhere, especially in the human philosophies (generally known as social sciences), it is still largely missing. Although behaviourism--the principled refusal to ask "and why?"--is no longer dominant as an explicit ideology, it is still widespread as a psychological attitude in the human philosophies. Suppose you identified a gene G, and a human behaviour B, and you undertook a study with 1000 randomly chosen people, and the result was that of the 500 people who had G in their genome, 499 did B, while of the 500 who lacked G, 499 failed to do B. You'd conclude, wouldn't you, that G is the predominant cause of B? Obviously there must be other mechanisms involved, but they have little influence on whether a person does B or not. You'd inform the press that all those once-trendy theories that tried to explain B through people's upbringing or culture, or attributed it to the exercise of free will or the logic of the situation or any combination of such factors--were just wrong. You've proved that when people choose to do B, they are at the very least responding to a powerful influence from their genes. 
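The tallies in this thought experiment can be laid out as a 2x2 table. The odds-ratio computation below is just an illustration of how overwhelming the statistical association is, while leaving the causal question entirely open:

```python
# Hypothetical data from the thought experiment: 1000 people,
# cross-classified by gene G and behaviour B.
#                (does B, does not do B)
with_G = (499, 1)      # the 500 people who carry G
without_G = (1, 499)   # the 500 people who lack G

# Odds ratio: a standard measure of association in a 2x2 table.
odds_ratio = (with_G[0] * without_G[1]) / (with_G[1] * without_G[0])
print(odds_ratio)  # 249001.0 -- an enormous association

# Yet this table, by itself, is equally consistent with G causing B,
# with G merely accompanying the true cause, or even with G opposing
# B while some confounded factor swamps its effect. The numbers alone
# cannot distinguish these explanations.
```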
And if someone points out that your results are perfectly consistent with B being 100% caused by something other than G (or any other gene), or with G exerting an influence in the direction of not doing B, you will shrug momentarily, and then forget that possibility. Won't you? DAVID DEUTSCH's research in quantum physics has been influential and highly acclaimed. His papers on quantum computation laid the foundations for that field, breaking new ground in the theory of computation as well as physics, and have triggered an explosion of research efforts worldwide. He is a member of the Centre for Quantum Computation at the Clarendon Laboratory, Oxford University and the author of The Fabric of Reality. Keith Devlin "Why can't girls/women do math?" Heavens, I take a couple of days off from reading email over Christmas and when I next log on already there are over twenty responses to the Edge question! Maybe the question we should all be asking is "Doesn't anyone take time off any more?" As to questions that have disappeared, as a mathematician I hope we've seen the last of the question "Why can't girls/women do math?" With women now outnumbering men in mathematics programs in most US colleges and universities, that old wives' tale (old husbands' tale?) has surely been consigned to the garbage can. Some recent research at Brown University confirmed what most of us had long suspected: that past (and any remaining present) performance differences were based on cultural stereotyping. (The researchers found that women students performed worse at math tests when they were given in a mixed gender class than when no men were present. No communication was necessary to cause the difference. The sheer presence of men was enough.) While I was enjoying my offline Christmas, Roger Schank already raised the other big math question: Why do we make such a big deal of math performance and of teaching math to everyone in the first place? 
But with the educational math wars still raging, I doubt we've seen the last of that one! KEITH DEVLIN is a mathematician, writer, and broadcaster living in California. His latest book is The Math Gene: How Mathematical Thinking Evolved and Why Numbers Are Like Gossip. Denis Dutton "When will overpopulation create worldwide starvation?" They cordoned off the area and brought in disposal experts to defuse the bomb, but it turned out to be full of--sawdust. The Population Bomb is truly a dud, although this news and its implications have yet fully to sink into the general consciousness. Ideas can become so embedded in our outlook that they are hard to shake by rational argument. As a Peace Corps Volunteer working in rural India in the 1960s, I vividly remember being faced with multiple uncertainties about what might work for the modernization of India. There was only one thing I and my fellow development workers could all agree on: India unquestionably would experience mass famine by the 1980s at the latest. For us at the time this notion was an eschatological inevitability and an article of faith. For 35 years since those days, India has grown in population by over a million souls a month, never failing to feed itself or earn enough to buy the food it needs (sporadic famine conditions in isolated areas, which still happen in India, are always a matter of communications and distribution breakdown). Like so many of the doomsayers of the twentieth century, we left crucial factors out of our glib calculations. First, we failed to appreciate that people in developing countries will behave exactly like people in the rest of the world: as they improve their standard of living, they have fewer children. In India, the rate of population increase began to turn around in the 1970s, and it has declined since. More importantly, we underestimated the capacity of human intelligence to adapt to changing situations.
Broadly speaking, instead of a world population of 25 or 30 billion, which some prophets of the 1960s were predicting, it now looks as though the peak of world population growth might be reached within 25 to 40 years at a maximum of 8.5 billion (just 2.5 billion above the present world population). Even without advances in food technology, the areas of land currently out of agricultural production in the United States and elsewhere will prevent starvation. But genetic technologies will increase the quantities and healthfulness of food, while at the same time making food production much more environmentally friendly. For example, combining gene modification with herbicides will make it possible to produce crops that induce no soil erosion. New varieties will require less intensive application of nitrogen fertilizers and pesticides. If genetic techniques can control endemic pests, vast areas of Africa could be brought into productive cultivation. There will be no way to add 2.5 billion people to the planet without environmental costs. Some present difficulties, such as limited supplies of fresh water in Third World localities, will only get worse. But these problems will not be insoluble. Moreover, there is not the slightest chance that population growth will in itself cause famine. What will be fascinating to watch, for those who live long enough to witness it, will be how the world copes with an aging, declining population, once the high-point has been reached. The steady evaporation of the question, "When will overpopulation create worldwide starvation?", has left a gaping hole in the mental universe of the doomsayers. They have been quick to fill it with anxieties about global warming, cellphones, the ozone hole, and McDonaldization. There appears to be a hard-wired human propensity to invent threats where they cannot clearly be discovered.
Historically, this has been applied to foreign ethnic groups or odd individuals in a small-scale society (the old woman whose witchcraft must have caused village children to die). Today's anxieties focus on broader threats to mankind, where activism can mix fashionable politics with dubious science. In this respect alone, the human race is not about to run out of problems. Fortunately, it also shows no sign of running out of solutions. DENIS DUTTON, founder and editor of the innovative Web page Arts & Letters Daily (www.cybereditions.com/aldaily/), teaches the philosophy of art at the University of Canterbury, New Zealand and writes widely on aesthetics. He is editor of the journal Philosophy and Literature, published by the Johns Hopkins University Press. Professor Dutton is a director of Radio New Zealand, Inc. George B. Dyson "What does the other side of the moon look like?" This can be elaborated by the following anecdote, from an interview (2.99) with Herbert York: "Donald Hornig, who was head of PSAC [President's Science Advisory Committee, during the Johnson Administration] was not imaginative. I can give you an example of this. I was very enthusiastic about getting a picture of the other side of the moon. And there were various ways of doing it, sooner or later. And I argued with Hornig about it and he said, 'Why? It looks just like this side.' And it turned out it didn't. But nevertheless, that was it, and that's the real Hornig. 'Why are you so enthused about the other side of the moon? The other side of the moon looks just like this side, why would you be so interested to see it?'" GEORGE DYSON, a historian among futurists, has been excavating the history and prehistory of the digital revolution going back 300 years. His most recent book is Darwin Among the Machines. J. Doyne Farmer "What do these discarded questions tell us?" The road of knowledge is littered with old questions, but by their very nature, none of them stands out above all others. 
The diversity of thoughtful responses given on the Edge forum, which just begin to scratch the surface, illustrates how progress happens. The evolution of knowledge is a Schumpeterian process of creative destruction, in which weeding out the questions that no longer merit attention is an integral part of formulating better questions that do. Forgetting is a vital part of creation. Maxwell once worried that the second law of thermodynamics could be violated by a demon who could measure the velocity of individual particles and separate the fast ones from the slow ones, and use this to do work. Charlie Bennett showed that this is impossible, because to make a measurement the demon has to first put her instruments in a known state. This involves erasing information. The energy needed to do this is more than can be gained. Thus, the fact that forgetting takes work is essential to the second law of thermodynamics. Why is this relevant? As Gregory Bateson once said, the second law of thermodynamics is the reason that it is easier to mess up a room than it is to clean it. Forgetting is an essential part of the process of creating order. Asking the right questions is the most important part of the creative process. There are lots of people who are good at solving problems, fewer who are good at asking questions. Around the time I took my qualifying examination in physics, someone showed me the test that Lord Rayleigh took when he graduated as senior wrangler from Cambridge in 1865. I would have failed it. There were no questions on thermodynamics, statistical mechanics, quantum mechanics, nuclear physics, particle physics, condensed matter, or relativity, i.e. no questions covering most of what I had learned. However, the classical mechanics questions, which comprised most of the test, were diabolically hard. Their solution involved techniques that are no longer taught, and that a modern physicist would have to work hard to recreate.
Of course, in a field like philosophy this would not have surprised me--it just hadn't occurred to me that this was just as true of physics. The physicists in Rayleigh's generation presumably worked just as hard, and knew just as many things. They just knew different things. After overcoming the shock of how much had seemingly been lost, I rationalized my ignorance with the belief that what I was taught was more useful than what Rayleigh was taught. Whether as a culture or as individuals, to learn new things, we have to forget old things. The notion of what is useful is constantly evolving. The most important questions evolve through time as people understand little bits and pieces, and view them from different angles in the attempt to solve them. Each question is replaced by a new one that is (hopefully) better framed than its antecedent. Reflecting on those that have been cast aside is like sifting through flotsam on a beach, and asking what it tells us. Is there a common thread that might give us a clue to posing better questions in the future? When we examine questions such as "What is a vital force?", "How fast is the earth moving?", "Does God exist?", "Have we seen the end of science?", "Has history ended?", "Can machines think?", there are some common threads. One is that we never really understood what these questions meant in the first place. But these questions (to varying degrees) have been useful in helping us to formulate better, more focused questions. We just have to turn loose of our pet ideas, and make a careful distinction between what we know and what we only think we know, and try to be more precise about what we are really asking. I would be curious to hear more discussion about the common patterns and the conclusions to be drawn from the questions that have disappeared. J.
DOYNE FARMER, one of the pioneers of what has come to be called chaos theory, is McKinsey Professor at the Santa Fe Institute, and the co-founder and former co-president of Prediction Company in Santa Fe, New Mexico. Kenneth Ford "When will we face another energy crisis, and how will we cope with it?" This question (or pair of questions) was on everyone's lips in the 1970s, following the oil shortage and lines at gas stations. It stimulated a lot of good thinking and good work on alternative energy sources, renewable energy sources, and energy efficiency. Although this question is still asked by many knowledgeable and concerned people, it has disappeared from the public's radar screen (or, better, television screen). Even the recent escalation of fuel prices and the electricity shortage in California have not lent urgency to thinking ahead about energy. But we should be asking, we should be worrying, and we should be planning. A real energy crisis is closer now than it was when the question had high currency. The energy-crisis question is only part of a larger question: How is humankind going to deal in the long term with its impact on the physical world we inhabit (of which the exhaustion of fossil fuels is only a part)? Another way to phrase the larger question: Are we going to manage more or less gracefully a transition to a sustainable world, or will eventual sustainability be what's left, willy-nilly, after the chaos of unplanned, unanticipated change? Science will provide no miracles (as the Wall Street Journal, in its justification of inaction, would have us believe), but science can do a lot to ameliorate the dislocations that this century will bring. We need to encourage our public figures to lift their eyes beyond the two-, four-, and six-year time horizons of their jobs. KENNETH FORD is a retired physicist who teaches at Germantown Friends School in Philadelphia.
He is the co-author, with John Wheeler, of Geons, Black Holes, and Quantum Foam: A Life in Physics. Howard Gardner "Has History Ended?" I am going to take slight liberty with your question. With the publication a decade ago of Francis Fukuyama's justly acclaimed article The End Of History, many pundits and non-pundits assumed that historical forces and trends had been spent. The era of the "isms" was at an end; liberal democracy, market forces, and globalization had triumphed; the heavy weight of the past was attenuating around the globe. At the start of 2001, we are no longer asking "Has History Ended?" History seems all too alive. The events of Seattle challenged the globalization behemoth; the world is no longer beating a path to internet startups; Communist and fascist revivals have emerged in several countries; the historical legacies in areas like the Balkans and the Middle East are as vivid as ever; and, as I noted in response to last year's question, much of Africa is at war. As if to remind us of our naivete, Fidel Castro and Saddam Hussein have been in "office" as long as most Americans can remember. If George II is ignorant of this history, he is likely to see it repeated. HOWARD GARDNER, the major proponent of the theory of multiple intelligences, is Professor of Education at Harvard University and author of numerous books including The Mind's New Science and Extraordinary Minds: Portraits of Four Exceptional Individuals. Joel Garreau "What can government do to help create a better sort of human?" The moral, intellectual, physical and social improvement of the human race was a hot topic of the Enlightenment. It helped shape the American and French revolutions. Creating the "New Soviet Man" was at the heart of the Russian revolution--that's what justified the violence. A central theme of Lyndon Johnson's Great Society was not just that human misery could be alleviated. 
It was that core human problems like crime could be fixed by the government eliminating root causes like want. That's all gone. We now barely trust government to teach kids to read. JOEL GARREAU, the cultural revolution correspondent of The Washington Post, is a student of global culture, values, and change whose current interests range from human networks and the transmission of ideas to the hypothesis that the '90s--like the '50s--set the stage for a social revolution to come. He is the author of the best-selling books Edge City: Life on the New Frontier and The Nine Nations of North America, and a principal of The Edge City Group, which is dedicated to the creation of more liveable and profitable urban areas worldwide. David Gelernter "How should adult education work? How do we educate the masses? (That's right, The Masses.)...." How should adult education work? How do we educate the masses? (That's right, The Masses.) How do we widen the circle of people who love and support great art, great music, great literature? How do we widen the circle of adults who understand the science and engineering that our modern world is built on? How do we rear good American citizens? Or for that matter good German citizens, or Israeli or Danish or Chilean? And if this is the information age, why does the population at large grow worse-informed every year? (Sorry--that last one isn't a question people have stopped asking; they never started.) These questions have disappeared because in 2001, the "educated elite" never goes anywhere without its quote-marks. Here in America's fancy universities, we used to believe that everyone deserved and ought to have the blessings of education. Today we believe our children should have them--and to make up for that fact, to even the score, we have abolished the phrase. No more "blessings of education." That makes us feel better. Many of us can't say "truth and beauty" without snickering like 10-year-old boys. 
But the situation will change, as soon as we regain the presence of mind to start asking these questions again. We have the raw materials on hand for the greatest cultural rebirth in history. We have the money and the technical means. We tend to tell our children nowadays (implicitly) that their goal in life is to get rich, get famous and lord it over the world. We are ashamed to tell them that what they really ought to be is good, brave and true. (In fact I am almost ashamed to type it.) This terrible crisis of confidence we're going through was probably inevitable; at any rate it's temporary, and if we can't summon the courage to tell our children what's right, my guess is that they will figure it out for themselves, and tell us. I'm optimistic. DAVID GELERNTER, Professor of Computer Science at Yale University and author of Mirror Worlds, The Muse in the Machine, 1939: The Lost World of the Fair, and Drawing Life: Surviving the Unabomber. Brian Goodwin "Where Does Love Come From?" What does science have to say about the origins of love in the scheme of things? Not a lot. In fact, it is still virtually a taboo subject, just as consciousness was until very recently. However, since feelings are a major component of consciousness, it seems likely that the ontology of love will now emerge as a significant question in science. Within Christian culture, as in many other religious traditions, love has its origin as a primal quality of God and so is co-eternal with Him. His creation is an outpouring of this love in shared relationship with beings that participate in the essential creativity of the cosmos. As in the world of Shakespeare and the Renaissance Magi, it is love that makes the world go round and animates all relationships. This magical view of the world did not satisfy the emerging perspective of Galilean science, which saw relationships in nature as law-like, obeying self-consistent logical principles of order.
God may well have created the world, but he did so according to intelligible principles. It is the job of the scientist to identify these and describe them in mathematical form. And so with Newton, love turned into gravity. The rotation of the earth around the sun, and the moon around the earth, was a result of the inverse square law of gravitational attraction. It was not a manifestation of love as an attractive principle between animated beings, however much humanity remained attached to romantic feelings about the full moon. Love was henceforth banished from scientific discourse and the mechanical world-view took over. Now science itself is changing and mechanical principles are being replaced by more subtle notions of interaction and relationships. Quantum mechanics was the first harbinger of a new holistic world of non-local connectedness in which causality operates in a much more intricate way than conventional mechanism. We now have complexity theory as well, which seeks to understand how emergent properties arise in complex systems such as developing organisms, colonies of social insects, and human brains. Often these properties are not reducible to the behavior of their component parts and their interactions, though there is always consistency between levels: that is, there are no contradictions between the properties of the parts of a complex system and the order that emerges from them. Consciousness appears to be one of these emergent properties. With this recognition, science enters a new realm. Consciousness involves feelings, or more generally what are called qualia, the experience of qualities such as pain, pleasure, beauty, and... love. This presents us with a major challenge. The scientific principle of consistency between levels in systems requires that feelings emerge from some property of the component parts (e.g., neurones) that is consistent with feeling, experience.
But if matter is 'dead', without any feeling, and neurones are just made of this dead matter, even though organized in a complex way, then where do feelings come from? This is the crunch question which presents us with a hard choice. We can either say that feelings are epiphenomena, illusions that evolution has invented because they are useful for survival. Or we can change our view of matter and ascribe to the basic stuff of reality some elementary component of feeling, sentience, however rudimentary. Of course, we could also take the view that nature is not self-consistent and that miracles are possible; that something can come from nothing, such as feeling from dead, insentient matter, thus returning to the magical world-view of the early Renaissance. But if we are to remain scientific, then the choice is between the other two alternatives. The notion that evolution has invented feelings because they are useful for survival is not a scientific explanation, because it gives no account of how feelings are possible as properties that emerge in the complex systems we call organisms (i.e., consistent emergent properties of life). So we are left with the other hard choice: matter must have some rudimentary property of sentience. This is the conclusion that the mathematician/philosopher A.N. Whitehead came to in his classic, Process and Reality, and it is being proposed as a solution to the Cartesian separation of mind and matter by some contemporary philosophers and scientists. It involves a radical reappraisal of what we call 'reality'. But it does suggest a world in which love exists as something real, in accord with most people's experience. And goodness knows, we could do with a little more of it in our fragmented world.
BRIAN GOODWIN is a professor of biology at Schumacher College, Milton Keynes, and the author of Temporal Organization in Cells and Analytical Physiology, How The Leopard Changed Its Spots: The Evolution of Complexity, and (with Gerry Webster) Form and Transformation: Generative and Relational Principles in Biology. Dr. Goodwin is a member of the Board of Directors of the Santa Fe Institute. David Haig "questions that were asked in extinct languages" All those questions that were asked in extinct languages for which there is no written record. DAVID HAIG is an evolutionary geneticist/theorist at Harvard who is interested in conflicts and conflict resolution within the genome, with a particular interest in genomic imprinting and relations between parents and offspring. His current interests include the evolution of linkage groups and the evolution of viviparity. Judy Harris "Do genes influence human behavior?" This question bit the dust after a brief but busy life; it is entirely a second-half-of-the-20th-century question. Had it been asked before the 20th century, it would have been phrased differently: "heredity" instead of "genes." But it wasn't asked back then, because the answer was obvious to everyone. Unfortunately, the answer everyone gave--yes!--was based on erroneous reasoning about ambiguous evidence: the difference in behavior between the pauper and the prince was attributed entirely to heredity. The fact that the two had been reared in very different circumstances, and hence had had very different experiences, was overlooked. Around the middle of the 20th century, it became politically incorrect and academically unpopular to use the word "heredity"; if the topic came up at all, a euphemism, "nature," was used in its place. The fact that the pauper and the prince had been reared in very different circumstances now came to the fore, and the behavioral differences between them were now attributed entirely to the differences in their experiences.
The observation that the prince had many of the same quirks as the king was now blamed entirely on his upbringing. Unfortunately, this answer, too, was based on erroneous reasoning about ambiguous evidence. That children tend to resemble their biological parents is ambiguous evidence; the fact that such evidence is plentiful--agreeable parents tend to have agreeable kids, aggressive parents tend to have aggressive kids, and so on--does not make it any less ambiguous. The problem is that most kids are reared by their biological parents. The parents have provided both the genes and the home environment, so the kids' heredity and environment are correlated. The prince has inherited not only his father's genes but also his father's palace, his father's footmen, and his father's Lord High Executioner (no reference to living political figures is intended). To disambiguate the evidence, special techniques are required--ways of teasing apart heredity and environment by controlling the one and varying the other. Such techniques didn't begin to be widely used until the 1970s; their results didn't become widely known and widely accepted until the 1990s. By then so much evidence had piled up that the conclusion (which should have been obvious all along) was incontrovertible: yes, genes do influence human behavior, and so do the experiences children have while growing up. (I should point out, in response to David Deutsch's contribution to the World Question Center, that no one study, and no one method, can provide an answer to a question of this sort. In the case of genetic influences on behavior, we have converging evidence--studies using a variety of methods all led to the same conclusion and even agreed pretty well on the quantitative details.) Though the question has been answered, it has left behind a cloud of confusion that might not disappear for some time. 
The biases of the second half of the 20th century persist: when "dysfunctional" parents are found to have dysfunctional kids, the tendency is still to blame the environment provided by the parents and to overlook the fact that the parents also provided the genes. Some would argue that this bias makes sense. After all, they say, we know how the environment influences behavior. How the genes influence behavior is still a mystery--a question for the 21st century to solve. But they are wrong. They know much less than they think they know about how the environment influences behavior. The 21st century has two important questions to answer. How do genes influence human behavior? How is human behavior influenced by the experiences a child has while growing up? JUDITH RICH HARRIS is a writer and developmental psychologist; co-author of The Child: A Contemporary View Of Development; winner of the 1997 George A. Miller Award for an outstanding article in general psychology, and author of The Nurture Assumption: Why Children Turn Out The Way They Do. Marc D. Hauser "Do animals have thoughts?" The reason this question is dead is that traditional Skinnerianism, which viewed rats and pigeons as furry and feathered black boxes, guided by simple principles of reinforcement and punishment, is theoretically kaput. It can no longer account for the extraordinary things that animals do, spontaneously. Thus, we now know that animals form cognitive maps of their environment, compute numerosities, represent the relationships among individuals in their social group, and most recently, have some understanding of what others know. The questions for the future, then, are not "Do animals think?", but "What precisely do they think about, and to what extent do their thoughts differ from our own?" MARC D. HAUSER is an evolutionary psychologist, and a professor at Harvard University where he is a fellow of the Mind, Brain, and Behavior Program.
He is a professor in the departments of Anthropology and Psychology, as well as the Program in Neurosciences. He is the author of The Evolution of Communication, and Wild Minds: What Animals Think. Geoffrey Hinton "What is 'vital force'?" Nobody asks what "vital force" is anymore. Organisms still have just as much vital force as they had before, but as understanding of biological mechanisms increased, the idea of a single essence evaporated. Hopefully the same will happen with "consciousness". GEOFFREY HINTON is an AI researcher at the Gatsby Computational Neuroscience Unit, University College London, where he does research on ways of using neural networks for learning, memory, perception and symbol processing and has over 100 publications in these areas. He was one of the researchers who introduced the back-propagation algorithm that is now widely used for practical applications. His other contributions to neural network research include Boltzmann machines, distributed representations, time-delay neural nets, mixtures of experts, and Helmholtz machines. His current main interest is in unsupervised learning procedures for neural networks with rich sensory input. John Horgan "Is enlightenment a myth or a reality?" Is enlightenment a myth or a reality? I mean the enlightenment of the East, not the West: the state of supreme mystical awareness also known as nirvana, satori, cosmic consciousness, awakening. Enlightenment is the telos of the great Eastern religions, Buddhism and Hinduism, and it crops up occasionally in Western religions, too, although in a more marginal fashion. Enlightenment once preoccupied such prominent Western intellectuals as William James, Aldous Huxley and Joseph Campbell, and there was a surge of scientific interest in mysticism in the 1960s and 1970s.
Then mysticism became tainted by its association with the human potential and New Age movements and the psychedelic counterculture, and for the last few decades it has for the most part been banished from serious scientific and intellectual discourse. Recently a few scholars have written excellent books that examine mysticism in the light of modern psychology and neuroscience--Zen and the Brain by the neurologist James Austin; Mysticism, Mind, Consciousness by the philosopher Robert Forman; The Mystical Mind by the late psychiatrist Eugene d'Aquili and the radiologist Andrew Newberg--but their work has received scant attention in the scientific mainstream. My impression is that many scientists are privately fascinated by mysticism but fear being branded as fuzzy-headed by disclosing their interest. If more scientists revealed their interest in mystical consciousness, perhaps it could become a legitimate subject for investigation once again. JOHN HORGAN is a freelance writer and author of The End of Science and The Undiscovered Mind. A senior writer at Scientific American from 1986 to 1997, he has also written for the New York Times, Washington Post, New Republic, Slate, London Times, Times Literary Supplement and other publications. Verena Huber-Dyson "Did Fermat's question, 'is it true that there are no positive integers x, y and z satisfying x^n + y^n = z^n for any integer n greater than 2?', F? for short, raised in the 17th century, disappear when Andrew Wiles answered it affirmatively by a proof of Fermat's theorem F in 1995?" Did Fermat's question, "is it true that there are no positive integers x, y and z satisfying x^n + y^n = z^n for any integer n greater than 2?", F? for short, raised in the 17th century, disappear when Andrew Wiles answered it affirmatively by a proof of Fermat's theorem F in 1995? The answer is no. The question F?
can be explained to every child, but the proof of F is extremely sophisticated, requiring techniques and results way beyond the reach of elementary arithmetic, thus raising the quest for conceptually simpler proofs. What is going on here, why do such elementary theorems require such intricate machinery for their proof? The fact of the truth of F itself is hardly of vital interest. But, in the wake of Goedel's incompleteness proof of 1931, F? finds its place in a sequence of elementary number theoretic questions for which there provably cannot exist any algorithmic proof procedure! Or take the question D? raised by the gut feeling that there are more points on a straight line segment than there are integers in the infinite sequence 1,2,3,4,.... Before it can be answered, the question of what is meant by "more" must be dealt with. Once this was done, by way of the 19th century's progress in the foundations, D? became amenable to Cantor's diagonal argument, establishing theorem D. But this was by no means the end of the question! The proof gave rise to new fields of investigation and new ideas. In particular, the continuum hypothesis C?, a direct descendant of D?, was shown to be "independent" of the accepted formal system of set theory. A whole new realm of questions sprang up; questions X? that are answered by proofs of independence, bluntly by: "that depends"--on what you are talking about, what system you are using, on your definition of the word "is" and so forth. With this they give rise to comparative studies of systems without as well as with the assumption X added. Euclid's parallel axiom in geometry is the most popular early example. Or consider the question of the nature of infinitesimals, which has plagued us ever since Leibniz. Euler and his colleagues had used them with remarkable success, boldly following their intuition. But in the 19th century mathematicians became self-conscious.
By the time we were teaching our calculus classes by means of epsilons, deltas and Dedekind cuts, some of us might have thought that Cauchy, Weierstrass and Dedekind had chased the question away. But then along came logicians like Abraham Robinson with a new take on it with so-called nonstandard quantities--another favorite of the popular science press. Finally, turning to a controversial issue: the question of the existence of God can neither be dismissed by a rational "No" nor by a politically expedient "Yes". Actually as a plain yes-or-no question it ought to have disappeared long ago. Nietzsche, in particular, did his very best over a hundred years ago to make it go away. But the concept of God persists and keeps a maze of questions afloat, such as "Who means what by Him?", "Do we need a boogie man to keep us in line?", "Do we need a crutch to hold despair at bay?" and so forth, all questions concerning human nature. Good questions do not disappear; they mature, mutate and spawn new questions. VERENA HUBER-DYSON is a mathematician who taught at UC Berkeley in the early sixties, then at the University of Illinois at Chicago Circle, before retiring from the University of Calgary. Her research papers on the interface between Logic and Algebra concern decision problems in group theory. Her monograph Goedel's Theorems: A Workbook on Formalization is an attempt at a self-contained interdisciplinary introduction to logic and the foundations of mathematics. Nicholas Humphrey "...a set of questions that ought to have disappeared; questions that seek reasons for patterns that in reality are due to chance" There is a set of questions that ought to have disappeared, but--given human psychology--probably never will: questions that seek reasons for patterns that in reality are due to chance. Why is "one plus twelve" an anagram of "two plus eleven"? Why do the moon and the sun take up exactly the same area in the sky as seen from Earth?
Why did my friend telephone me just as I was going to telephone her? Whose face is it in the clouds? The truth is that not everything has a reason behind it. We should not assume there is someone or something to be blamed for every pattern that strikes us as significant. But we have evolved to have what the psychologist Bartlett called an "effort after meaning". We have always done better to find meaning where there was none than to miss meaning where there was. We're human. When we win the lottery by betting on the numbers of our birthday, the question Why? will spring up, no matter what. NICHOLAS HUMPHREY is a theoretical psychologist at the Centre for Philosophy of Natural and Social Sciences, London School of Economics, and the author of Consciousness Regained, The Inner Eye, A History of the Mind, and Leaps of Faith: Science, Miracles, and the Search for Supernatural Consolation. Mark Hurst "Do I have e-mail?" The sudden increase in digital information, or bits, in our everyday lives has destroyed any question of permanence or scarcity of those bits. Just consider the example of e-mail. Years ago when you first got online, you were excited to get e-mail, right? So every day, the big question when you logged in was, Will I have any e-mail? The chirpy announcement that "You've got mail!" actually meant something, since sometimes you didn't have mail. Today there's no question. There's no such thing as no mail. You don't have to ask; you DO have mail. If it's not the mail you want (from friends or family), it's work-related mail or, worse, spam. Our inboxes may soon be so flooded with spam that we look for entirely different ways to use e-mail. The death of that question, "Do I have e-mail?" has brought us a new, more interesting question as a result: "What do I do with all this e-mail?" More generally, what do we do with all these bits (e-mail, wireless messages, websites, Palm Pilot files, Napster downloads)? 
This is the question that will define our relationship with digital technology in coming years. MARK HURST, founder of Internet consulting firm Creative Good, is widely credited for popularizing the term "customer experience" and the methodology around it. Hurst has worked since the birth of the Web to make Internet technology easier and more relevant to its "average" users. In 1999, InfoWorld magazine named Hurst "Netrepreneur of the Year", saying that "Mark Hurst has done more than any other individual to make Web-commerce sites easier to use." Over 39,000 people subscribe to his Good Experience newsletter, available for free at goodexperience.com. Piet Hut "What is Reality?" It has become unfashionable to ask about the structure of reality without already having chosen a framework in which to ponder the answer, be it scientific, religious or sceptical. A sense of wonder at the sheer appearance of the world, moment by moment, has been lost. To look at the world in wonder, and to stay with that sense of wonder without jumping straight past it, has become almost impossible for someone taking science seriously. The three dominant reactions are: to see science as the only way to get at the truth, at what is really real; to accept science but to postulate a more encompassing reality around or next to it, based on an existing religion; or to accept science as one useful approach in a plurality of many approaches, none of which has anything to say about reality in any ultimate way. The first reaction leads to a sense of wonder scaled down to the question of wonder about the underlying mathematical equations of physics, their interpretation, and the complexity of the phenomena found on the level of chemistry and biology. The second reaction tends to allow wonder to occur only within the particular religious framework that is accepted on faith.
The third reaction allows no room for wonder about reality, since there is no ultimate reality to wonder about. Having lost our ability to ask what reality is like means having lost our innocence. The challenge is to regain a new form of innocence, by accepting all that we can learn from science, while simultaneously daring to ask 'what else is true?' In each period of history, the greatest philosophers struggled with the question of how to confront skepticism and cynicism, from Socrates and Descartes to Kant and Husserl in Europe, and Nagarjuna and many others in Asia and elsewhere. I hope that the question "What is Reality?" will reappear soon, as a viable intellectual question and at the same time as an invitation to try to put all our beliefs and frameworks on hold. Looking at reality without any filter may or may not be possible, but without at least trying to do so we will have given up too soon. PIET HUT is professor of astrophysics at the Institute for Advanced Study, in Princeton. He is involved in the project of building GRAPEs, the world's fastest special-purpose computers, at Tokyo University, and he is also a founding member of the Kira Institute. Raphael Kasper "What does all the information mean?" The ubiquity of upscale coffee houses has eliminated the need to ask "Where can I get a cup of coffee?" But I suspect that the question "What questions have disappeared?" is meant to elicit even deeper and [perhaps] more meaningful responses. The coffee house glut has been accompanied--although with no necessarily causal link--by an avalanche of information [or, at least, of data] and in the rush to obtain [or "to access"--groan] that information we've stopped asking "What does it all mean?" It is as though raw data, in and of itself, has real value, indeed all of the value, and thus there is no need to stop, to assimilate, to ponder. We grasp for faster computers, greater bandwidth, non-stop connectivity.
We put computers in every classroom, rewire schools. But, with the exception of a great deal of concern about the business and marketing uses of the new "information age," we pay precious little attention to how the information can be used to change, or improve, our lives, nor do we seem to take the time to slow down and deliberate upon its meaning. We wire the schools, but never ask what all those computers in classrooms will be used for, or whether teachers know what to do with them, or whether we can devise ways to employ the technology to help people learn in new or better ways. We get cable modems, or high-speed telephone lines, but don't think about what we can do with them beyond getting more information faster. [Really, does being able to watch the trailer for "Chicken Run" in a 2-inch square window on the computer screen after a several-minute-long download constitute a major advance--and if we could cut the download time to several seconds, would that qualify?] Most insidious, I think, is that the rush to get more information faster almost forces people to avoid the act of thinking. Why stop and try to make sense of the information we've obtained when we can click on that icon and get still more data? And more. RAPHAEL KASPER, a physicist, is Associate Vice Provost for Research at Columbia University and was Associate Director of the Superconducting Super Collider Laboratory. Kevin Kelly "What is the nature of our creator?" This question was once entertained by the educated and non-educated alike, but is now out of fashion among the learned, except in two small corners of intellectual life. One corner is religious theology, which many scientists would hardly consider a legitimate form of inquiry at this time. In fact it would not be an exaggeration to say that modern thinking considers this question as fit only for the religious, and that it has no part in the realm of science at all.
But even among the religious this question has lost favor because, to be honest, theology hasn't provided very many satisfactory answers for modern sensibilities, and almost no new answers in recent times. It feels like a dead end. A question that cannot be answered merely by musing in a book-lined room. The other corner where this question is asked--but only indirectly--is in particle physics and cosmology. We get hints of answers here and there mainly as by-products of other more scientifically specific questions, but very few scientists set out to answer this question primarily. The problem here is that because the question of the nature of our creator is dismissed as a religious question, and both of these sciences require some of the most expensive equipment in the world, paid for by democracies committed to separation of church and state, it won't do to address the question directly. But there is a third way of thinking emerging that may provide a better way to ask this question. This is the third culture of technology. Instead of asking this question starting from the human mind contemplating the mysteries of God, as humanists and theologians do, or starting from experiment, observation, and testing as scientists do, the third way investigates the nature of our creator by creating creations. This is the approach of nerds and technologists. Technologists are busy creating artificial worlds, virtual realities, artificial life, and eventually perhaps, parallel universes, and in this process they explore the nature of godhood. When we make worlds, what are the various styles of being god? What is the relation between the creator and the created? How does one make laws that unfold creatively? How much of what is created can be created without a god? Where is god essential? Sometimes there are theories (theology) but more often this inquiry is driven by pure pragmatic engineering: "We are as gods and may as well get good at it," to quote Stewart Brand.
While the third way offers a potential for new answers, more than the ways of the humanities or science, the truth is that even here this question--of the nature of our creator--is not asked directly very much. This really is a question that has disappeared from public discourse, although of course, it is asked every day by billions of people silently. KEVIN KELLY is a founding editor of Wired magazine. In 1993 and 1996, under his co-authorship, Wired won its industry's Oscar--The National Magazine Award for General Excellence. Prior to the launch of Wired, Kelly was editor/publisher of the Whole Earth Review, a journal of unorthodox technical and cultural news. He is the author of New Rules for the New Economy; and Out of Control: The New Biology of Machines, Social Systems, and the Economic World. Lance Knobel "Are you hoping for a girl or a boy?" The moment of birth used to be attended by an answer to a nine-month mystery: girl or boy? Now, to anyone with the slightest curiosity and no mystical scruples, simple, non-invasive technology can provide the answer from an early stage of pregnancy. With both of our children, we chose to know the answer (in the UK about half of parents want to know), and I suspect the likelihood of the question continuing to be asked will diminish rapidly. What's interesting is that this is the first of many questions about the anticipated child that will soon not be asked. These will range from the trivial (eye colour, mature height) to the important (propensity to certain diseases and illnesses). The uneasiness many people still have about knowing the sex of the child suggests that society is vastly unprepared for the pre-birth answers to a wide range of questions. LANCE KNOBEL is a managing director of Vesta Group, an Internet and wireless investment company based in London. He was formerly head of the programme of the World Economic Forum's Annual Meeting in Davos and Editor-in-Chief of World Link. Marek Kohn "What about the workers?"
It may have been uttered as often in caricature as in anger, but the voice from the crowd asked a question that was accepted as reasonable even by those who winced at the jeering tone. Until fifteen or twenty years ago, the interest-earning and brain-working classes generally felt that they owed something to the workers, for doing the drudgery needed to keep an industrialised society going. And it was taken for granted that the workers were a class, with collective interests of their own. In some instances--British miners, for example--they enjoyed considerable respect and a romantic aura. Even in the United States, where perhaps the question was not put quite the same way, the sentiments were there. Now there is an underclass of dangerous and hopeless folk, an elite of the fabulous and beautiful, and everybody in between is middle class. Meritocracy is taken for granted, bringing with it a perspective that sees only individuals, not groups. There are no working classes, only low-grade employees. In a meritocracy, respect is due according to the rank that an individual has attained. And since achievement is an individual matter, those at the upper levels see no reason to feel they owe anything to those at lower ones. This state of affairs will probably endure until such time that people cease to think of their society as a meritocracy, with its upbeat tone of progress and fairness, and start to feel that they are living in a Red Queen world, where they have to run ever faster just to stay in the same place. MAREK KOHN'S most recent book, published last year, is As We Know It: Coming to Terms with an Evolved Mind. His other books include The Race Gallery: The Return of Racial Science and Dope Girls: The Birth of the British Drug Underground. He writes a weekly column on digital culture, Second Site, for the London Independent on Sunday. Stephen M. Kosslyn "How do people differ in the ways they think and learn?" 
Most Americans, even (or, perhaps, especially) educated Americans, seem to believe that all people are basically the same--we have the same innate abilities and capacities, and only hard work and luck separate those who are highly skilled from those who are not. But this idea is highly implausible. People differ along every other dimension, from the size of their stomachs and shoes to the length of their toes and tibias. They even differ in the sizes of their brains. So, why shouldn't they also differ in their abilities and capacities? Of course, the answer is that they do. It's time to acknowledge this fact and take advantage of it. In my view, the 21st century is going to be the "Century of Personalization." No more off-the-rack drugs: Gene and proteomic chips will give readouts for each person, allowing drugs to be tailored to their individual physiologies. No more off-the-rack clothes: For example, you'll stick your feet in a box, lasers will measure every aspect of them, and shoes will be custom-made according to your preferred style. Similarly, no more off-the-rack teaching. Specifically, the first step is to diagnose individual differences in cognitive abilities and capacities, so we can play to a given person's strengths and avoid falling prey to his or her weaknesses. But in order to characterize these differences, we first need to understand at least the broad outlines of general mechanisms that are common to the species. All of us have biceps and triceps, but these muscles differ in their strength. So too with our mental muscles. All of us have a short-term memory, for example (in spite of how it may sometimes feel at the end of the day), and all of us are capable of storing information in long-term memory. Differences among people in part reflect differences in the efficacy of such mechanisms.
For example, there are at least four distinct ways that visual/spatial information can be processed (which I'm not going to go into here), and people differ in their relative abilities on each one. Presenting the same content in different ways will invite different sorts of processing, which will be more or less congenial for a given person. But there's more to it than specifying mechanisms and figuring out how well people can use them (as daunting as that is). Many of the differences in cognitive abilities and capacities probably reflect how mechanisms work together and when they are recruited. Understanding such differences will tell us how to organize material so that it goes down smoothly. For example, how, for a given person, should examples and general principles be intermixed? And, yet more. We aren't bloodless brains floating in vats, soaking up information pumped into us. Rather, it's up to us to decide what to pay attention to, and what to think about. Thus, it's no surprise that people learn better when they are motivated. We need to know how a particular student should be led to use the information during learning. For example, some people may "get" physics only when it's taught in the context of auto mechanics. All of this implies that methods of teaching in the 21st century will be tightly tied to research in cognitive psychology and cognitive neuroscience. At present, the study of individual differences is almost entirely divorced from research on general mechanisms. Even if this is remedied, it's going to be a challenge to penetrate the educational establishment and have this information put to use. So, the smart move will probably be to do an end-run around this establishment, using computers to tutor children individually outside of school. This in turn raises the specter of another kind of Digital Divide. Some of us may in fact still get off-the-rack education.
Finally, I'll leave aside another set of questions no one seems to be seriously asking: What should be taught? And should the same material be taught to everyone? You can imagine why this second question isn't being asked, but it's high time we seriously considered making the curriculum relevant for the 21st century.

STEPHEN M. KOSSLYN, a full professor of psychology at Harvard at age 34, is a researcher focusing primarily on the nature of visual mental imagery. His books include Image and Mind, Ghosts in the Mind's Machine, Wet Mind: The New Cognitive Neuroscience, Image and Brain: The Resolution of the Imagery Debate, and Psychology: The Brain, the Person, the World.

Kai Krause "What is the difference between men and pigs?" Questions... We ask many questions... ...about our species, our gender, our friends, lovers and ourselves, the mystery of who each of us is and where in the ant hill the intelligence lies, if each ant has no clue. We search for the variations amongst a set, try to define the set in its limits and borders against other sets, look for analogies, anomalies and statistical outliers.... There is in fact an almost universal algorithm, like cats stalking their prey, to make sense of our nature by boundary conditions, alas compiled with spotty statistics and messy heuristics, gullible souls, political machinations, cheats, lies and video tape, in short: human nature. We search and probe, the literate digerati confer virtually, each wondering about the other, each looking at their unique sets of parents, and the impossibility of imagining them in the act of procreation. In other words, we still have no idea whatsoever who we really are, what mankind as a whole is all about. We have mild inclinations on where we have been, sort of, and contradictory intentions on where we may be headed, kind of, but all in all, we are remarkably clue-free.
But that question at least need no longer be asked and has indeed vanished after this: Even when they had way too much to drink, pigs don't turn into men.

KAI KRAUSE is currently building a research lab dubbed "Byteburg" in a thousand-year-old castle above the Rhein river in the geometric center of Europe. He asked not to be summed up by previous accomplishments, titles or awards.

Lawrence M. Krauss "Does God Exist?" In the 1960's and 70's it seemed just a matter of time before antiquated notions of god, heaven, and divine intervention would disappear from the intellectual spectrum, at least in the US. Instead, we find ourselves in an era when God appears to be on the lips of all politicians, creationism is rampant in our schools, and the separation of church and state seems more fragile than ever. What is the cause of this regression, and what can we do to combat it? Surely, one of the legacies of science is to learn to accept the Universe for what it is, rather than imposing our own belief systems on it. We should be prepared to offend any sensibilities, even religious ones, when they disagree with the evidence of experiment. Should scientists be more vocal in order to combat the born-again evangelists who are propagating ill-founded notions about the cosmos?

LAWRENCE M. KRAUSS is Ambrose Swasey Professor of Physics, Professor of Astronomy, and Chair of the Physics Department at Case Western Reserve University. He is the recipient of the AAAS Award for Public Understanding of Science, and this year's Lilienfeld Prize from the American Physical Society. He is the author of numerous books, including The Physics of Star Trek.

Leon Lederman "Does God play dice?" (...first asked by Albert Einstein some time in the 30's.) Like mathematics, whose symbols can represent physical properties when applied to some scientific problem, God is a convenient symbol for nature, for the way the world works.
Einstein's reaction of utter incredulity to the quantum theory, from its development in the late 20's until his death in 1955, was echoed by colleagues who had participated in the early creation of the quantum revolution, which Richard Feynman had termed the most radical theory ever. Well, does she? The simplest example of what sure looks like God playing dice happens when you walk past a store window on a sunny day. Of course you are not just admiring your posture and checking your attire; you are probably watching the guy undressing the manikin, but that is another story. So how do you see yourself, albeit dimly, while the manikin abuser sees you very clearly? Everyone knows that light is a stream of photons, here from the sun, some striking your nose, then reflected in all directions. We focus on two photons heading for the window. We'll need thousands to get a good picture, but two will do for a start. One penetrates the window and impacts the eye of the manikin dresser. The second is reflected from the store window and hits your eye, a fine picture of a good-looking pedestrian! What determines what the photons will do? The photons are identical... trust me. Philosophers of science assure us that identical experiments give identical results. Not so! The only rational conclusion would seem to be that she plays dice at each impact of the photon. Using a die with ten faces, good enough for managing this bit of the world, numbers one to nine determine that the photon goes through; a ten, and the photon is reflected. It's random... a matter of probability. Dress this concept up in shiny mathematics and we have quantum science, which underlies physics, most of chemistry and molecular biology. It now accounts for 43.7% of our GNP. (This is consistent with 87.1% of all numbers being made up.) So what was wrong with Einstein and his friends? Probabilistic nature, which is applicable to the world of atoms and smaller, has implications which are bizarre, spooky, weird.
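Lederman's ten-sided die amounts to a tiny Monte Carlo experiment. The sketch below (the function name and the exact 10% figure are illustrative, taken from his die, not from any physical measurement) shows how identical photons with identical odds still produce an unpredictable split between reflection and transmission:

```python
import random

def simulate_photons(n_photons, p_reflect=0.1, seed=42):
    """Roll Lederman's ten-sided die once per photon: faces one to
    nine transmit, a ten reflects, i.e. each identical photon is
    independently reflected with probability p_reflect."""
    rng = random.Random(seed)
    reflected = sum(rng.random() < p_reflect for _ in range(n_photons))
    return reflected, n_photons - reflected

reflected, transmitted = simulate_photons(100_000)
print("reflected:", reflected)      # roughly 10,000, never exactly predictable
print("transmitted:", transmitted)  # roughly 90,000
```

Which individual photon bounces is never determined, only the long-run proportions are, which is exactly the point Einstein found so hard to accept.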
Granting that it works, Einstein could not accept it and hoped for a deeper explanation. Today, many really smart physicists are seeking a kinder, gentler formulation, but 99.3% of working physicists go along with the notion that she is one hell of a crap shooter.

LEON M. LEDERMAN, the director emeritus of Fermi National Accelerator Laboratory, has received the Wolf Prize in Physics (1982), and the Nobel Prize in Physics (1988). In 1993 he was awarded the Enrico Fermi Prize by President Clinton. He is the author of several books, including (with David Schramm) From Quarks to the Cosmos: Tools of Discovery, and (with Dick Teresi) The God Particle: If the Universe Is the Answer, What Is the Question?

Joseph Le Doux "How do our brains become who we are?" Many neuroscientists, myself included, went into brain research because of an interest in the fact that our brains make us who we are. But the topics we end up working on are typically more mundane. It's much easier to research the neural basis of perception, memory or emotion than the way perceptual, memory, and emotion systems are integrated in the process of encoding who we are. Questions about the neural basis of personhood, the self, have never been at the forefront of brain science, and so are not, strictly speaking, lost questions to the field. But they are lost questions for those of us who were drawn to neuroscience by an interest in them, and then settled for less when overcome with frustration over the magnitude of the problem relative to the means we have for solving it. But questions about the self and the brain may not be as hard to address as they seem. A simple shift in emphasis from issues about the way the brain typically works in all of us to the way it works in individuals would be an important entry point.
This would then necessitate that research on cognitive processes, like perception or memory, take subjects' motivations and emotions into consideration, rather than doing everything possible to eliminate them. Eventually, researchers would study perception, memory, or emotion less as isolated brain functions than as activities that, when integrated, contribute to the real function of the brain-- the creation and maintenance of the self. JOSEPH LEDOUX is a Professor of Neural Science at New York University. He is author of The Emotional Brain. Pamela McCorduck "Can machines think?" It burned through the sixties, seventies and even eighties, until the answer was, Of course. It was replaced with a different, less emotionally fraught question: How can we make them think smarter/better/deeper? The central issue is the social, not scientific, definition of "thinking". A generation of Western intellectuals who took their identity mainly from their intelligence has grown too old to ask the question with any conviction, and anyway, machines are all around them thinking up a storm. Machines don't yet think like Einstein, but then neither do most people, and we don't question their humanity on that account. PAMELA McCORDUCK is the author or coauthor of seven books, among them Machines Who Think, and coauthor with Nancy Ramsey of The Futures Of Women: Scenarios for the 21st Century. Dan McNeill "Where is the Great American Novel?" This question haunted serious writers in the early 20th century, when critics sought a product that measured up to the European standard. Now it is dead, and the underlying notion is in ICU. What happened? Well, the idea itself was never a very good one. It had breathtakingly hazy contours. It ignored the work of authors like Melville, Hawthorne, Wharton, and Twain. And it seemed to assume that a single novel could sum up this vast and complex nation. I'd like to think its disappearance reflects these problems. 
But technology also helped shelve the question. As media proliferated, literature grew less central. If the Great American Novel appeared tomorrow, how many people would actually read it? My guess: Most would wait for the movie. DANIEL McNEILL is the author of The Face, and principal author of the best-selling Fuzzy Logic, which won the Los Angeles Times Book Prize in Science and Technology, and was a New York Times "Notable Book of the Year". John H. McWhorter "Are subordinate clauses more typical of languages with a long literary tradition than integral features of human speech?" Contemporary linguists tend to assume in their work that subordinate clauses, such as "The boy that I saw yesterday" or "I knew what happened when she came down the steps", are an integral part of the innate linguistic endowment, and/or central features of "human speech" writ large. Most laymen would assume the same thing. However, the fact is that when we analyze a great many strictly spoken languages with no written tradition, subordinate clauses are rare to nonexistent. In many Native American languages, for example, the only way to express something like the men who were members is a clause which parses approximately as "The 'membering' men"; the facts are similar in thousands of other languages largely used orally. In fact, even in earlier documents in today's "tall building" literary languages, one generally finds a preference for stringing simple main clauses together--she came down the steps, and I knew what happened rather than embedding them in one another along the lines of when she came down the steps, I knew what happened. 
The guilty sense we often have when reading English of the first half of the last millennium--that the writing is stylistically somewhat "clunky"--is due largely to the marginality of the subordinate clause. Here is Thomas Malory in the late fifteenth century:

And thenne they putte on their helmes and departed and recommaunded them all wholly unto the Quene and there was wepynge and grete sorowe Thenne the Quene departed in to her chamber and helde her that no man shold perceyue here grete sorowes

Early Russian parses similarly, and crucially, so do the Hebrew Bible and the Greek of Homer. At the time that these documents were written, writing conventions had yet to develop, and thus written language hewed closer to the way language is actually spoken on the ground. Over time, subordinate clauses, a sometime thing in speech, were developed as central features in written speech, their economy being aesthetically pleasing, and more easily manipulated via the conscious activity of writing than the spontaneous "on-line" activity of speaking. Educated people, exposed richly to written speech via education, tended to incorporate the subordinate clause mania into their spoken varieties. Hence today we think of subordinate clauses as "English", as the French do "French", and so on--even though if we listen to a tape recording of ourselves speaking casually, even we tend to embrace main clauses strung together rather than the layered sentential constructions of Cicero. But the "natural" state of language persists in the many which have had no written tradition. In the 1800s, various linguists casually speculated as to whether subordinate clauses were largely artifactual rather than integral to human language, with one (Karl Brugmann) even going as far as to assert that originally, humans spoke only with main clauses.
Today, however, linguistics operates under the sway of our enlightened valuation of "undeveloped" cultures, which has, healthily, included an acknowledgment of the fact that the languages of "primitive" peoples are as richly complex as written Western languages. (In fact, the more National Geographic the culture, the more fearsomely complex the language tends to be overall.) However, this sense has discouraged most linguists from treading into the realm of noting that one aspect of "complexity", subordinate clauses, is in fact not central to expression in unwritten languages and is most copiously represented in languages with a long written tradition. In general, the idea that First World written languages might exhibit certain complexities atypical of languages spoken by preliterate cultures has largely been tacitly taboo for decades in linguistics, generally only treated in passing in obscure venues. The problem is that this could be argued to bode ill for investigations of the precise nature of Universal Grammar, which will certainly require a rigorous separation of the cultural and contingent from the encoded. JOHN H. MCWHORTER is Assistant Professor of Linguistics at the University of California at Berkeley. He taught at Cornell University before entering his current position at Berkeley. He specializes in pidgin and creole languages, particularly of the Caribbean, and is the author of Toward a New Model of Creole Genesis and The Word on the Street : Fact and Fable About American English. He also teaches black musical theater history at Berkeley and is currently writing a musical biography of Adam Clayton Powell, Jr. Geoffrey Miller "Three Victorian questions about potential sexual partners: 'Are they from a good family?'; 'What are their accomplishments?'; 'Was their money and status acquired ethically?' " To our "Sex and the City" generation, these three questions sound shamefully Victorian and bourgeois. 
Yet they were not unique to 19th-century England: they obsessed the families of eligible young men and women in every agricultural and industrial civilization. Only with our socially-atomized, late-capitalist society have these questions become tasteless, if not taboo. Worried parents ask them only in the privacy of their own consciences, in the sleepless nights before a son or daughter's ill-considered marriage. The "good family" question always concerned genetic inheritance as much as financial inheritance. Since humans evolved in bands of closely-related kin, we probably evolved an intuitive appreciation of the genetics relevant to mate choice--taking into account the heritable strengths and weaknesses that we could observe in each potential mate's relatives, as well as their own qualities. Recent findings in medical genetics and behavior genetics demonstrate the wisdom of taking a keen interest in such relatives: one can tell a lot about a young person's likely future personality, achievements, beliefs, parenting style, and mental and physical health by observing their parents, siblings, uncles, and aunts. Yet the current American anti-genetic ideology demands that we ignore such cues of genetic quality--God forbid anyone should accuse us of eugenics. Consider the possible reactions a woman might have to hearing that a potential husband was beaten as a child by parents who were alcoholic, aggressive religious fundamentalists. Twin and adoption studies show that alcoholism, aggressiveness, and religiosity are moderately heritable, so such a man is likely to become a rather unpleasant father. Yet our therapy-cures-all culture says the woman should offer only non-judgmental sympathy to the man, ignoring the inner warning bells that may be going off about his family and thus his genes. Arguably, our culture alienates women and men from their own genetic intuitions, and thereby puts their children at risk. The question "What are their accomplishments?"
refers not to career success, but to the constellation of hobbies, interests, and skills that would have adorned most educated young people in previous centuries. Things like playing pianos, painting portraits, singing hymns, riding horses, and planning dinner parties. Such accomplishments have been lost through time pressures, squeezed out between the hyper-competitive domain of school and work, and the narcissistic domain of leisure and entertainment. It is rare to find a young person who does anything in the evening that requires practice (as opposed to study or work)--anything that builds skills and self-esteem, anything that creates a satisfying, productive "flow" state, anything that can be displayed with pride in public. Parental hot-housing of young children is not the same: after the child's resentment builds throughout the French and ballet lessons, the budding skills are abandoned with the rebelliousness of puberty--or continued perfunctorily only because they will look good on college applications. The result is a cohort of young people whose only possible source of self-esteem is the school/work domain--an increasingly winner-take-all contest where only the brightest and most motivated feel good about themselves. (And we wonder why suicidal depression among adolescents has doubled in one generation.) This situation is convenient for corporate recruiting--it channels human instincts for self-display and status into an extremely narrow range of economically productive activities. Yet it denies young people the breadth of skills that would make their own lives more fulfilling, and their potential lovers more impressed. Their identities grow one-dimensionally, shooting straight up towards career success without branching out into the variegated skill sets which could soak up the sunlight of respect from flirtations and friendships, and which could offer shelter, and alternative directions for growth, should the central shoot snap. 
The question "Was their money and status acquired ethically?" sounds even quainter, but its loss is even more insidious. As the maximization of shareholder value guides every decision in contemporary business, individual moral principles are exiled to the leisure realm. They can be manifest only in the Greenpeace membership that reduces one's guilt about working for Starbucks or Nike. Just as hip young consumers justify the purchase of immorally manufactured products as "ironic" consumption, they justify working for immoral businesses as "ironic" careerism. They aren't "really" working in an ad agency that handles the Philip Morris account for China; they're just interning for the experience, or they're really an aspiring screenwriter or dot-com entrepreneur. The explosion in part-time, underpaid, high-turnover service industry jobs encourages this sort of amoral, ironic detachment on the lower rungs of the corporate ladder. At the upper end, most executives assume that shareholder value trumps their own personal values. And in the middle, managers dare not raise issues of corporate ethics for fear of being down-sized. The dating scene is complicit in this corporate amorality. The idea that Carrie Bradshaw or Ally McBeal would stop seeing a guy just because he works for an unethical company doesn't even compute. The only relevant morality is personal--whether he is kind, honest, and faithful to them. Who cares about the effect his company is having on the Filipino girls working for his sub-contractors? "Sisterhood" is so Seventies. Conversely, men who question the ethics of a woman's career choice risk sounding sexist: how dare he ask her to handicap herself with a conscience, when her gender is already enough of a handicap in getting past the glass ceiling? In place of these biologically, psychologically, ethically grounded questions, marketers encourage young people to ask questions only about each other's branded identities. Armani or J. Crew clothes?
Stanford or U.C.L.A. degree? Democrat or Republican? Prefer "The Matrix" or "You've Got Mail"? Eminem or Sophie B. Hawkins? Been to Ibiza or Cool Britannia? Taking Prozac or Wellbutrin for depression? Any taste that doesn't lead to a purchase, any skill that doesn't require equipment, any belief that doesn't lead to supporting a non-profit group with an aggressive P.R. department, doesn't make any sense in the current mating market. We are supposed to consume our way into an identity, and into our most intimate relationships. But after all the shopping is done, we have to face, for the rest of our lives, the answers that the Victorians sought: what genetic propensities, fulfilling skills, and moral values do our sexual partners have? We might not have bothered to ask, but our children will find out sooner or later.

GEOFFREY MILLER is an evolutionary psychologist at the London School of Economics and at U.C.L.A. His first book was The Mating Mind: How Sexual Choice Shaped the Evolution of Human Nature.

David G. Myers "Does money buy happiness?" Three in four entering collegians today deem it "very important" or "essential" that they become "very well-off financially." Most adults believe "more money" would boost their quality of life. And today's "luxury fever" suggests that affluent Americans and Europeans are putting their money where their hearts are. "Whoever said money can't buy happiness isn't spending it right," proclaimed a Lexus ad. But the facts of life have revealed otherwise. Although poverty and powerlessness often bode ill for body and spirit, wealth fails to elevate well-being. Surveys reveal that even lottery winners and the super rich soon adapt to their affluence. Moreover, those who strive most for wealth tend, ironically, to live with lower well-being than those focused on intimacy and communal bonds.
And consider post-1960 American history: Average real income has doubled, so we own twice the cars per person, eat out two and a half times as often, and live and work in air-conditioned spaces. Yet, paradoxically, we are a bit less likely to say we're "very happy." We are more often seriously depressed. And we are just now, thankfully, beginning to pull out of a serious social recession that was marked by doubled divorce, tripled teen suicide, quadrupled juvenile violence, quintupled prison population, and a sextupled proportion of babies born to unmarried parents. The bottom line: Economic growth has not improved psychological morale or communal health.

DAVID G. MYERS is a social psychologist at Hope College (Michigan) and author, most recently, of The American Paradox: Spiritual Hunger in an Age of Plenty and of A Quiet World: Living with Hearing Loss.

Randolph M. Nesse "Why is life so full of suffering?" Questions disappear when they seem to be answered or unanswerable. The interesting missing questions are the apparently answered ones that are not, and the apparently unanswerable ones that are. One of life's most profound questions has been thought to be unanswerable. That question is, "Why is life so full of suffering?" Impatience with centuries of theological and philosophical speculation has led many to give up on the big question, and to ask instead only how brain mechanisms work, and why people differ in their experiences of suffering. But the larger question has an answer, an evolutionary answer. The capacities for suffering--pain, hunger, cough, anxiety, sadness, boredom and all the rest--have been shaped by natural selection. They seem to be problems because they are so painful and because they are aroused only in adverse circumstances, but they are, in fact, solutions.
The illusion that they are problems is further fostered by the smoke-detector principle--selection has shaped control mechanisms that express defensive responses whenever the costs of responding are less than the protection they provide. Such responses are therefore expressed often--indeed, much more often than is absolutely necessary. Thus, while the capacities for suffering are useful and generally well-regulated, most individual instances are excessive or entirely unnecessary. It has not escaped notice that this principle has profound implications for the power and limits of pharmacology to relieve human suffering.

RANDOLPH M. NESSE is Professor of Psychiatry and Director of the Evolution and Human Adaptation Program at the University of Michigan. He is the author, with George Williams, of Why We Get Sick: The New Science of Darwinian Medicine.

Tor Norretranders "Who are we?" In the not so distant future we will have to revive the question about who and what we are. We will have to, not because we choose to do so, but because the question will be posed to us by Others or Otherness: Aliens, robots, mutants, and the like. New phenomena like information-processing artifacts, computational life forms, bioengineered humans, upgraded animals and pen-pals in space will force us to consider ourselves and our situation: Why didn't we end hunger on this planet? Are we evil or just idiots? Why do we really want to rebuild ourselves? Do we regain our soul when the TV set is turned off? It's going to happen like this: We build a robot. It turns towards us and says: "If technology is the answer, then what was the question?"

TOR NORRETRANDERS is a science writer, consultant, lecturer and organizer based in Copenhagen, Denmark. He was recently appointed Chairman of the National Council for Competency.

Rafael E. Núñez "Do computers think?" This question was at the heart of heated debates for decades during the century just past, and it was at the ambitious origins of the Artificial Intelligence adventure.
It had profound implications not only for science, but also for philosophy, technology, business, and even theology. In the 50's and 60's, for instance, it made a lot of sense to ask whether one day a computer could defeat an international chess master, and if it did, it was assumed that we would learn a great deal about how human thought works. Today we know that building such a machine is possible, but the reach of the issue has dramatically changed. Nowadays not many would claim that building such a computer actually informs us in an interesting way about what human thought is and how it works. Beyond the (indeed impressive) engineering achievements involved in building such machines, we got from them little (if any) insight into the mysteries, variability, depth, plasticity, and richness of human thought. Today, the question "do computers think?" has become completely uninteresting and it has disappeared from the cutting-edge academic circus, remaining mainly in the realm of pop science, Hollywood films, and video games. And why did it disappear? It disappeared because it was finally answered with categorical responses that stopped generating fruitful work. The question became useless and uninspiring... boring. What is interesting, however, is that the question disappeared with no single definitive answer! It disappeared with categorical "of-course-yes" and "of-course-not" responses. Of-course-yes people, in general motivated by a technological goal (i.e., "to design and to build something") and implicitly based on functionalist views, built their arguments on the amazing ongoing improvement in the design and development of hardware and software technologies. For them the question became uninteresting because it didn't help to design or to build anything anymore.
What became relevant for of-course-yes people was mainly the engineering challenge, that is, to actually design and build computers capable of processing algorithms in a faster, cheaper, and more flexible manner. (And also, for many, what became relevant was to build computers for human activities and purposes.) Now when of-course-yes people are presented with serious problems that challenge their view, they provide the usual response: "just wait until we get better computers" (once known as the wait-until-the-year-2000 argument). On the other hand there were the of-course-not people, who were mainly motivated by a scientific task (i.e., "to describe, explain, and predict a phenomenon"), which was not necessarily technology-driven. They mainly dealt with real-time and real-world biological, psychological, and cultural realities. These people understood that most of the arrogant predictions made by Artificial Intelligence researchers in the 60's and 70's hadn't been realized because of fundamental theoretical problems, not because of the lack of powerful enough machines. They observed that even the simplest everyday aspects of human thought, such as common sense, sense of humor, spontaneous metaphorical thought, and use of counterfactuals in natural language, to mention only a few, were in fact intractable for the most sophisticated machines. They also observed that the nature of the brain and other bodily mechanisms that make thinking and the mind possible were, by several orders of magnitude, more complex than what was thought during the heyday of Artificial Intelligence. Thus for of-course-not people the question whether computers think became uninteresting, since it didn't provide insights into a genuine understanding of the intricacies of human thinking. Today the question is dead. The answer had become a matter of faith.

RAFAEL E.
NÚÑEZ, currently at the Department of Psychology of the University of Freiburg, is a research associate of the University of California, Berkeley. He has worked for more than a decade on the foundations of embodied cognition, with special research into the nature and origin of mathematical concepts. He has published in several languages in a variety of areas, and has taught in leading academic institutions in Europe, the United States, and South America. He is the author (with George Lakoff) of Where Mathematics Comes From: How the Embodied Mind Brings Mathematics into Being; and co-editor (with Walter Freeman) of Reclaiming Cognition: The Primacy of Action, Intention, and Emotion.

James J. O'Donnell "The old Platonic questions about the nature of the good and the form of beauty" Metaphysical questions. Metaphysical answers haven't disappeared: the new agers are full of them, and so are the old religionists. Cosmological questions haven't disappeared: but scientists press them as real questions about the very physical universe. But the old Platonic questions about the nature of the good and the form of beauty--they went away when we weren't looking. They won't be back.

JAMES J. O'DONNELL, Professor of Classical Studies and Vice Provost for Information Systems and Computing at the University of Pennsylvania, is the author of Avatars of the Word: From Papyrus to Cyberspace.

Jay Ogilvy "What will life be like after the revolution?" The disappearance of this question isn't only a trace of the deletion of the left. It is also a measure of our loss of faith in secular redemption. We don't look forward anymore to radical transformation. Perhaps it's a result of a century of disappointments: from the revolution of 1917 to Stalin and the fall of communism; from the Spanish Civil War to Franco; from Mao's long march to Deng's proclamation that to get rich is glorious. Perhaps it's a result of political history.
But there was more that had to do with psychological transformation. Remember Norman O. Brown's essay, "The place of apocalypse in the life of the mind"? Remember R. D. Laing's turn on breakdown as breakthrough? Remember the fascination with words like 'metamorphosis' and 'metanoia'? Maybe we're just getting older and all too used to being the people we are. But I'd like to think we're getting wiser and less naive about the possibility of shedding our pasts overnight. It's important to distinguish between political liberalism on the one hand and a faith in discontinuous transformation on the other. If we fail to make this distinction, then forgetting about the revolution turns (metanoically) into the familiar swing to the right. Old radicals turn reactionary. If we're less dramatic about our beliefs, if we're more cautious about distinguishing between revolutionary politics and evolutionary psychology, then we'll retain our faith in the dream that we can do better. Just not overnight. p.s. Part of the passion for paradigms and their shiftings may derive from displaced revolutionary fervor. If you yearn for transfiguration, but can't find it in religion or politics, then you'll seek it elsewhere, like the history of science. p.p.s. There is one place where talk of transformation is alive and kicking, if not well: the executive suite. The business press is full of books about corporate transformation, re-engineering from a blank sheet of paper, reinvention from scratch. Yes, corporate America is feeling the influence of the sixties as boomers reach the board room. And this is not a bad thing. For, just as the wisdom to distinguish between revolutionary politics and evolutionary psychology can help us keep the faith in marginal improvements in the human condition, so the tension between greying warriors for change and youthful stalwarts of the status quo will keep us from lurching left or right. 
JAMES OGILVY is co-founder and managing director of Global Business Network; taught philosophy at Yale and Williams; served as director of research for the Values and Lifestyles Program at SRI International; author of Many Dimensional Man, and Living without a Goal. Sylvia Paull "What do women want?" People in the Western world assume women have it all: education, job opportunities, birth control, love control, and financial freedom. But women still lack the essential freedom--equality--they lacked a century ago. Women are minorities in every sector of our government and economy, and women are still expected to raise families while at the same time earning incomes that are comparatively lower than what males earn. And in our culture, women are still depicted as whores, bimbos, or bloodsuckers by advertisers to sell everything from computers to cars. Will it take another century or another millennium before the biological differences between men and women are no longer taken as carte blanche justification for the unequal treatment of women? SYLVIA PAULL is founder of Gracenet (www.gracenet.net), serving women in high-tech and business media. John Allen Paulos "Don't reckon that I know." The question that has appeared this year is "What questions (plural) have disappeared and why?" Countless questions have disappeared, of course, but for relatively few reasons. The most obvious vanishings are connected to the passing of time. No one asks anymore "Who's pitching tomorrow for the Brooklyn Dodgers?" or "Who is Princess Diana dating now?" Other disappearances are related to the advance of science and mathematics. People no longer seriously inquire whether Jupiter has moons, whether DNA has two or three helical strands, or whether there might be positive integers a, b, and c such that a^3 + b^3 = c^3. Still other vanished queries are the result of changes in our ontology, scientific or otherwise. We've stopped wondering, "What happened to the phlogiston?" 
or "How many witches live in this valley?" The most interesting lacunae in the erotetic landscape, however, derive from lapsed assumptions, untenable distinctions, incommensurable mindsets, or superannuated worldviews that in one way or another engender questions that are senseless or, at least, much less compelling than formerly. "What are the election's exact vote totals?" comes to mind. Now that I've clarified to myself the meaning of "What questions have disappeared and why?" I have to confess that I don't have any particularly telling examples. (Reminds me of the joke about the farmer endlessly elucidating the lost tourist's query about the correct road to some hamlet before admitting, "Don't reckon that I know.") JOHN ALLEN PAULOS, bestselling author, mathematician, and public speaker, is professor of mathematics at Temple University in Philadelphia. In addition to being the author of a number of scholarly papers on mathematical logic, probability, and the philosophy of science, Dr. Paulos's books include Innumeracy: Mathematical Illiteracy and Its Consequences, A Mathematician Reads the Newspaper, and Once Upon a Number. Christopher Phillips "None." Or at least, certainly not the ones that have so far been submitted to this list, since the questions posted are proof positive that they have not disappeared at all, or at least, not altogether. Sure, some questions have their heyday for a while, and then they may disappear for many a moon. But the great question you posed -- what questions have disappeared? -- shows that they were just waiting for a question like this to remind us just how much emptier our existence would be without certain questions. But I also think that some questions certainly have gone by the wayside for a long time, though not necessarily the ones that so far have been posed. We may ask, for instance, questions like, Has history ended?, and then go on to offer up a response of one sort or another. 
But when is the last time we asked, what *is* history? What different types of history are there? What makes history history, regardless of which type it is? Or we may ask: Why have certain questions been discarded? But when's the last time anyone has asked, What is a question? What does a question do? What does a question do to us, and what do we do to it? We may ask: How do people differ in how they think and learn? But do we still ask: What is thinking? What is learning? Instead, we seem to take for granted that we know what history is, that we know what thinking is, that we know what learning is, when in fact if we delved a little more into these questions, we may well find that none of us hold the same views on what these rich concepts mean and how they function. Which leads me to this perspective: What *has* all but disappeared, I think, is a way of answering questions, regardless of which one is being posed, regardless of how seemingly profound or off-beat or mundane it is. I'm speaking of the kind of rigorous, exhaustive, methodical yet highly imaginative scrutiny of a Socrates or a Plato that challenged all assumptions embedded in a question, and that revealed breathtakingly new vistas and hidden likenesses between seemingly disparate entities. Who these days takes the time and effort, much less has the critical and creative acumen, to answer questions such as those I've already posed, much less such questions as "What is human good?" or "What is a good human?" in the soul-stirringly visionary yet at the same time down-to-earth way they did? We need a new generation of questioners in the mold of Plato and Socrates, people who dare to think a bit outside the lines, who take nothing for granted when a question is posed, and who subject their scrutiny to continual examination and consideration of cogent objections and alternative ways of seeing. 
CHRISTOPHER PHILLIPS is the author of "Socrates Cafe: A Fresh Taste of Philosophy", and founder-executive director of the nonprofit Society for Philosophical Inquiry. Cliff Pickover "Did Noah Really Collect all Species of Earthly Organism on his Ark?" People who interpret the Bible literally may believe that a man named Noah collected all species of Earthly organisms on his ark. However, scientists no longer ask this question. Let me put the problem in a modern perspective by considering what it means to have animals from every species on an ark. Consider that siphonapterologists (experts in fleas) recognize 1,830 varieties of fleas. Incredible as it may seem, there are around 300,000 species of beetles, making beetles one of the most diverse groups of organisms on earth. When biologist J.B.S. Haldane was asked by a religious person what message the Lord conveyed through His creations, he responded, "an inordinate fondness for beetles." One of my favorite books on beetles is Ilkka Hanski's Dung Beetle Ecology, which points out that a large number (about 7000 species) of the 300,000 species of beetles live off animal dung. Did Noah bring these species on the ark? If he did, did he concern himself with the fact that animal dung is often fiercely contested? On the African savanna, up to 4000 beetles have been observed to converge on 500 grams of fresh elephant dung within 15 minutes after it is deposited. Did Noah or his family also take kleptoparasitic beetles on the ark? These are dung beetles known to steal dung from others. Did Noah need to take into consideration that insect dung communities involve hundreds of complex ecological interactions between coprophagous flies and their parasites, insects, mites, and nematodes (an ecology probably difficult to manage on the ark!)? In South Africa, more than 100 species of dung beetle occur together in a single cow pat. One gigantic species, Heliocopris dilloni, resides exclusively in elephant dung. 
A few species of beetles are so specialized that they live close to the source of dung, in the hairs near an animal's anus. You get my point! It's quite a mystery as to what the Biblical authors meant when they called for Noah taking pairs of every animal on the Earth. Incidentally, scientists very roughly estimate the weight of animals in the hypothetical ark to be 1,000 tons. You can use a value of 10 million for the number of species and assume an average mass of 100 grams. (Insects decrease this figure for average mass because of the huge number of insect species.) There would be some increase in mass if plants were used in the computation. (How would this change if extinct species were included?) Even if Noah took ten or twenty of each kind of mammal, very few would be alive after a thousand years because approximately 50 individuals of a single species are needed to sustain genetic health. Any small population is subject to extinction from disease, environmental changes, and genetic risks--the gradual accumulation of traits with small but harmful effects. There is also the additional problem of making sure that both male and female offspring survive. Today, species are considered endangered well before their numbers drop below fifty. (Interestingly, there's a conflicting Biblical description in the story of Noah indicating that God wanted Noah to take "seven pairs of clean animals... and a pair of the animals that are not clean... and seven pairs of the birds of the air also.") The Biblical flood would probably kill most of the plant life on Earth. Even if the waters were to recede, the resultant salt deposits would prevent plants from growing for many years. Of additional concern is the ecological effect of the numerous dead carcasses caused by the initial flood. 
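Pickover's back-of-envelope mass figure can be reproduced in a few lines. This is a minimal sketch using only the essay's own assumed numbers (10 million species, 100 grams average mass, and one representative animal per species); none of these figures are independently verified here.

```python
# Rough check of the ark-mass estimate described above.
# Assumed figures, taken from the essay: 10 million species, and an
# average animal mass of 100 grams (dragged down by the huge number
# of insect species), counting one representative animal per species.
SPECIES_COUNT = 10_000_000
AVG_MASS_GRAMS = 100

total_grams = SPECIES_COUNT * AVG_MASS_GRAMS
total_metric_tons = total_grams / 1_000_000  # 1 metric ton = 1,000,000 g

print(total_metric_tons)  # 1000.0 -- the essay's "roughly 1,000 tons"
```

Note that taking a pair of each species would simply double the figure to 2,000 tons; the estimate is only meant to fix the order of magnitude.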
Various authors have noted that if, in forty days and nights, the highest mountains on Earth were covered, the required incredible rate of rainfall of fifteen feet per hour would sink the ark. All of these cogitations lead me to believe that most scientifically trained people no longer ask whether an actual man named Noah collected all species of Earthly organism on his ark. By extension, most scientifically trained people no longer ask if the Bible is literal truth. CLIFF PICKOVER is author of over 20 books, his latest being Wonders of Numbers: Adventures in Math, Mind, and Meaning. His web site, www.pickover.com, has received over 300,000 visits. Steven Pinker "What are the implications of human nature for political systems?" This question was openly discussed in two historical periods. The first was the Enlightenment. Hobbes claimed the brutishness of man in a state of nature called for a governmental Leviathan. Rousseau's concept of the noble savage led him to call for the abolition of property and the predominance of the "general will." Adam Smith justified market capitalism by saying that it is not the generosity but the self-interest of the baker that leads him to give us bread. Madison justified constitutional government by saying that if people were angels, no government would be necessary, and if angels were to govern people, no controls on government would be necessary. The young Marx's notion of a "species character" for creativity and self-expression led to "From each according to his ability"; his later belief that human nature is transformed throughout history justified revolutionary social change. The second period was the 1960s and its immediate aftermath, when Enlightenment romanticism was revived. Here is an argument by US Attorney General Ramsey Clark against criminal punishment: "Healthy, rational people will not injure others ... 
they will understand that the individual and his society are best served by conduct that does not inflict injury. ... Rehabilitated, an individual will not have the capacity--cannot bring himself--to injure another or take or destroy property." This is, of course, an empirical claim about human nature, with significant consequences for policy. The discussion came to an end in the 1970s, when even the mildest non-romantic statements about human nature were met with angry denunciations and accusations of Nazism. At the century's turn we have an unprecedented wealth of data from social psychology, ethnography, behavioral economics, criminology, behavioral genetics, cognitive neuroscience, and so on, that could inform (though of course, not dictate) policies in law, political decision-making, welfare, and so on. But they are seldom brought to bear on the issues. In part this is a good thing, because academics have been known to shoot off their mouths with half-baked or crackpot policy proposals. But since all policy decisions presuppose some hypothesis about human nature, wouldn't it make sense to bring the presuppositions into the open so they can be scrutinized in the light of our best data? STEVEN PINKER is professor in the Department of Brain and Cognitive Sciences at MIT; director of the McDonnell-Pew Center for Cognitive Neuroscience at MIT; author of Language Learnability and Language Development, Learnability and Cognition, The Language Instinct, How the Mind Works, and Words and Rules. Jordan Pollack "Should the right to own property be preserved?" For all of history, humans traded objects, then traded currency for objects, with the idea that when you buy some thing, you own it. This most fundamental human right--the right to own--is under attack again. Only this time, the software industry, not the followers of Karl Marx, is responsible. 
In the last 15 years of the information age, we discovered that every object really has three separable components: the informational content; the physical medium it is delivered on; and the license, a social or legal contract governing the rights to use the thing. It used to be that the tangible media token (the thing) both held the content, and (via possession) enforced the common understanding of the "ownership" license. Owners have rights to a thing--to trade it, sell it, loan it, rent it, destroy it, donate it, paint it, photograph it, or even chop it up to sell in pieces. For a book, the content is the sequence of words themselves, which may be rendered as ink scratches on a medium of paper bound inside cardboard covers. For a song it is a reproduction of the audio pattern pressed into vinyl or plastic to be released by a reading device. The license--to own all rights but copy rights--was enforced simply by possession of the media token, the physical copy of the book or disk itself. If you wanted to, you could buy a book, rip it into separate pages, and sell each page individually. You can slice a vinyl record into individual song rings for trade. For software, content is the evolving bits of program and data arranged to operate on some computer. Software can be delivered in paper, magnetic, optical, or silicon form, or can be downloaded from the internet, even wirelessly. Software publishers know the medium is completely irrelevant, except to give the consumer the feeling of a purchase. Even though you had the feeling of trading money for something, you really don't own the software you paid for. The license clearly states that you don't. You are merely granted a right to use the information, and the real owner can terminate your license at will, if you criticize him in public. Moreover, you cannot resell the software, and you cannot take a "suite" apart into working products to sell each one separately. You cannot rent it or loan it to a friend. 
You cannot look under the hood to try to fix it when it breaks. You don't actually own anything; you merely exchanged your money for a "right to use," and those rights can be arbitrarily dictated, and then rendered worthless by the very monopoly you got them from, forcing you to pay again for something you felt you had acquired last year. There is no fundamental difference between software, recordings, and books. E-books are not sold, but licensed, and Secure Music will be available in a pay-per-download format. Driven inexorably by the more lucrative profits from rentals, the industry will, I predict, ensure that within a couple of decades you will no longer be able to "buy" a new book or record. You will not be able to "own" copies. This may not seem so nefarious, as long as you have easy access to the "celestial jukebox" and can temporarily download a "read-once" license to any entertainment from private satellites. Your children will have more room in their homes and offices without the weight of a lifetime collection of books and recordings. What are humans when stripped of our libraries? And it won't stop with books. For an automobile, the content is the blueprint, the organization of mechanisms into stylistic and functional patterns which move, built out of media such as metals, pipes, hoses, leather, plastic, rubber, and fluids. Because of the great expense of cloning a car, Ford doesn't have to spell out the licensing agreement: You own it until it is lost, sold, or stolen. You can rent it, loan it, sell it, take it apart, and sell the radio, tires, engine, carburetor, etc. individually. But the license agreement can be changed! And when Ford discovers the power of UCITA, you will have to pay an annual fee for a car you don't own, which will blow up if you fail to bring it in or pay your renewal fee. And you will find that you cannot resell your car on an open secondary market, but can only trade it in to the automobile publisher for an upgrade. 
Without an effort to protect the right to own, we may wake up to find that there is nothing left to buy. JORDAN POLLACK, a computer science and complex systems professor at Brandeis, works on AI, Artificial Life, Neural Networks, Evolution, Dynamical Systems, Games, Robotics, Machine Learning, and Educational Technology. He is a prolific inventor, advises several startup companies and incubators, and in his spare time runs Thinmail, a service designed to enhance the usefulness of wireless email. David G. Post "... can there really be fossil sea-shells in the mountains of Kentucky, hundreds of miles from the Atlantic coast? " This question about questions may be a useful way to differentiate "science" from "not-science"; questions really do disappear in the former in a way, or at least at a rate, that they don't in the latter. A question that has disappeared: can there really be fossil sea-shells in the mountains of Kentucky, hundreds of miles from the Atlantic coast? I came across this particular question recently when reading Thomas Jefferson's 'Notes on the State of Virginia'; he devotes several pages to speculation about whether the finds in Kentucky really were sea-shells, and, if so, how they could have ended up there. Geologists could, today, tell him. "...from what source do governments get their legitimate power?" Perhaps another question dear to Jefferson's heart has also disappeared: from what source do governments get their legitimate power? In 1780, this was a real question, concerning which reasonable people gave different answers: 'God,' or 'the divine right of Kings,' or 'heredity,' or 'the need to protect its citizens.' By declaring as 'self evident' the 'truth' that 'governments derive their just power from the consent of the governed,' Jefferson was trying to declare that this question had, in fact, disappeared. I think he may have been right. 
DAVID POST is Professor of Law at Temple University, and Senior Fellow at The Tech Center at George Mason University, with an interest in questions of (and inter-connections between) Internet law, complexity theory, and the ideas of Thomas Jefferson. Rick Potts "How do societies function and change?" The general question of how human societies operate, and how they change, was once a central feature of theory-building in anthropology. The question--at least significant progress in answering the question--has largely disappeared at the emergent, large-scale level ("society" or "culture") originally defined by the social sciences. Over the past three decades, behavioral biology and studies of gene-culture coevolution have made some important theoretical advances in the study of human social behavior. However, the concepts of inclusive fitness, reciprocal altruism, memes, coevolution, and related ideas have yet to effectively penetrate the question of how large-scale cultural institutions--political, economic, religious, legal, and other systems--function, stay the same, or change. The strong inclination toward bottom-up explanations--which account for human social phenomena in terms of lower-level individual behaviors and epigenetic rules--implies either that social institutions (and thus how they function and change) are only epiphenomena, thus less worthy of investigating than the genetic bases of behavior and evolutionary psychology; or that cultural systems and institutions do exist--they function and change at a level of complexity above human psychology, decision-making, and epigenetic rules--but have largely been forgotten by certain fields purporting to study and to explain human social behavior. RICHARD POTTS is Director of The Human Origins Program, Department of Anthropology, National Museum of Natural History, Smithsonian Institution. 
He is the author of Humanity's Descent: The Consequences of Ecological Instability and a presenter, with Stephen Jay Gould, of a videotape, Tales of the Human Dawn. Robert Provine "Obsolete and Inappropriate Metaphors" The detection of "questions that are no longer asked" is difficult. Old questions, like MacArthur's old soldiers, just fade away. Scientists and scholars in hot pursuit of new questions neither note nor mourn their passing. I regularly face a modest form of the disappearing question challenge when a textbook used in one of my classes is revised. Deletions are hard to find; they leave no voids and are more stealthy than black holes, not even affecting their surrounds. New text content stands out, while missing material must be established through careful line-by-line reading. Whether in textbooks or in life, we don't think much about what is no longer relevant. My response to the inquiry about questions that are no longer asked is to reframe it and suggest instead a common class of missing questions, those associated with obsolete and inappropriate metaphors. Metaphor is a powerful cognitive tool, which, like all models, clarifies thinking when appropriate, but constrains it when inappropriate. Science is full of them. My professional specialties of neuroscience and biopsychology have mind/brain metaphors ranging from Locke's ancient blank slate (tabula rasa), to the more technologically advanced switchboard, and the metaphor du jour, the computer. None do justice to the brain as a soggy lump of wetware, but linger as cognitive/linguistic models. Natural selection in the realm of metaphors is slow and imperfect. Witness the reference to DNA as a "blueprint" for an organism, when Dawkins' "recipe" metaphor more accurately reflects DNA's encoding of instructions for organismic assembly. ROBERT R. PROVINE is Professor of Psychology and Neuroscience at the University of Maryland, Baltimore County, and author of Laughter: A Scientific Investigation. 
Eduardo Punset "Looking at the world upside down: what are we enhancing or what is vanishing in our brains while flat and dormant views of the universe are slowly disappearing?" Wrapped like hotdogs full of mustard, snorting in search of air to breathe from beneath the blanket--like dendrites looking for the first time for new contacts--the skull plunged in a floppy pillow and the eyes allowed only to stare at the grey sky, most of the time too flat and low to enjoy a more diversified life in three dimensions. What has been the impact on the newly born brain, positioned mummy-like and wrapped tight for generations in the pram, of this upside-down perception of the Universe? We do have a point of reference to imagine what life was like during the first eight or nine months after birth, before the invention of the anatomically shaped infant car seat that makes our youngest travel and look around from their earliest age. I'll come to that later. First let me insist, for those unaware of radical innovations in evolutionary psychology, that no baby has ever been found--there are plenty of very reliable tests for that--who, after having experienced the glamour of looking at the Universe face to face, right and left, backwards and forward, has regretted the odd way of being carried around by previous generations. Not only that; no newly born would ever accept now to look at the Universe from other vantage points than the high-tech pushchairs, carriages, and travelling systems for children aged birth to four years, developed in the mid-80's, out of the original baby car seat invented in America. Just as monkeys become quickly aware of new inventions and adopt them without second thoughts, our youngest no longer accept being carried in prams where they lay flat and dormant. They have suddenly become aware that they can be taken around in efficiently designed traveling engines, from where they can look at the world in movement practically as soon as they open their eyes. 
If somebody thinks that the end of looking upside-down at the Universe during the first eight or nine months of life is not important enough to be quoted as the end of anything, think of what neuroscientists are discovering about what happens during the first five months of the unborn just after conception. Professor Beckman at Würzburg University (Germany) has at last convinced his fellow psychiatrists that neurons' mistakes in their migration from the limbic to the upper layers of the brain of the unborn are responsible, to a very large extent, for the 1% of epileptics and schizophrenics in the world's population. By the way, the 1% is fixed, no matter how many neuroscientists join the battle against mental illness. It is like a sort of cosmic radiation background. The only exception that shows up is whenever deep malnutrition or feverish influenza in expectant mothers pushes the rate significantly up. Likewise, very few scientists would refuse to acknowledge today that what happens during the first five months of the embryo is not only relevant in the case of malformations and mental disorders, but also in the case of levels of intelligence and other reasonable behavior patterns. How could anybody then discard the tremendous impact on the newly born brain of interacting with the Universe face to face during the first eight to nine months? Surely, if we continue searching for the missing link between a single gene and a bark--and I deeply hope that we do, now that molecular biology and genetics have joined forces--everybody should care about the end of the upside-down perception of the Universe, and the silent revolution led by babies nurtured in the latest high-tech travelling system's interactive culture. Professor EDUARDO PUNSET teaches Economics at the Sarrià Chemical Institute of Ramon Llull University (Barcelona). He is Chairman of Planetary Agency, an audiovisual concern for the public understanding of Science. 
He was IMF Representative in the Caribbean, Professor of Innovation & Technology at Madrid University, and Minister for Relations with the EU. Tracy Quan "Who does your bleeding?" Recently, I was relaxing in my hotel room with a biography of Queen Elizabeth I. Her biographer noted that when Elizabeth R wasn't feeling quite herself she would call for a good "bleeding." I wondered about this practice, which now seems so destructive and dangerous, especially given the hygienic possibilities of 16th-century Britain. Even for the rich and famous. But Elizabeth R survived numerous bleedings and, I imagine, lots of other strange treatments that were designed to make her look and feel like her very best self--by the standards of her time. (Did she have a great immune system? Probably.) As dotty and unclean as "bleedings" now seem to a 21st-century New Yorker, I realized with a jolt that Elizabeth was pampering, not punishing, herself--and I was going to be late for my reflexology appointment. I had scheduled a two-hour orgy of relaxation and detoxification at a spa. I imagine that the ladies at court asked each other, in the manner of ladies-who-lunch, "Who does your bleeding?"--trading notes on price, ambiance and service, just as ladies today discuss their facials, massages and other personal treatments. Some skeptics assume that the beauty and spa treatments of today are as ineffective or dangerous as those of the Renaissance period. In fact, there have been inroads. Germ theory helped--as did a host of other developments, including a fascination in the West with things Eastern. The kind of people who would once have gone in for bleeding now go in for things like reflexology and shiatsu. That urge to cleanse and detoxify the body has long been around, but we've actually figured out how to do it because we better understand the body. The pampered are prettier and healthier today than were their 16th-century European counterparts. 
I wonder whether, another thousand or so years into the future, we will all look prettier and healthier in ways that we can't yet fathom. This kind of query might seem irresponsible, shallow, even immoral--given the real health crises facing human beings in 2001. But the way we look has everything to do with how we live and how we think. And I'm glad that bleedings are no longer the rage. TRACY QUAN, a writer and working girl living in New York, is the author of "Nancy Chan: Diary of a Manhattan Call Girl", a serial novel about the life and loves of Nancy Chan, a turn-of-the-millennium call girl. Excerpts from the novel--which began running in July, 2000 in the online magazine, Salon--have attracted a wide readership as well as the attention of The New York Times and other publications. Martin Rees "Was Einstein Right?" Einstein's theory of gravity--general relativity--transcended Newton's by offering deeper insights. It accounted naturally, in a way that Newton's didn't, for why everything falls at the same rate, and why the force obeys an inverse square law. The theory dates from 1916, and was famously corroborated by the measured deflection of starlight during eclipses, and by the anomalies in Mercury's orbit. But it took more than 50 years before there were any tests that could measure the distinctive effects of the theory with better than 10 percent accuracy. In the 1960s and 1970s, there was serious interest in tests that could decide between general relativity and alternative theories that were still in the running. But now these tests have improved so much, and yielded such comprehensive and precise support for Einstein, that it would require very compelling evidence indeed to shake our belief that general relativity is the correct "classical" theory of gravity. New and different experiments are nonetheless currently being planned. 
But the expectation that they'll corroborate the theory is now so strong that we'd demand a high burden of proof before accepting a contrary result. For instance, NASA plans to launch an ultra-precise gyroscope ("Gravity Probe B") to measure tiny precession effects. If the results confirm Einstein, nobody will be surprised or excited--though they would have been if the experiment had flown 30 years ago, when it was first devised. On the other hand, if this very technically-challenging experiment revealed seeming discrepancies, I suspect that most scientists would suspend judgment until it had been corroborated. So the most exciting result of Gravity Probe B would be a request to NASA for another vast sum, in order to repeat it. But Einstein himself raised other deep questions that are likely to attract more interest in the 21st century than they ever did in the 20th. He spent his last 30 years in a vain (and, as we now recognize, premature) quest for a unified theory. Will such a theory--reconciling gravity with the quantum principle, and transforming our conception of space and time--be achieved in coming decades? And, if it is, what answer will it offer to another of Einstein's questions: "Did God have any choice in the creation of the world?" Is our universe--and the physical laws that govern it--the unique outcome of a fundamental theory, or are the underlying laws more "permissive", in the sense that they could allow other very different universes as well?

MARTIN REES is Royal Society Professor at Cambridge University and a leading researcher in astrophysics and cosmology. His books include Before the Beginning, Gravity's Fatal Attraction and (most recently) Just Six Numbers.

Douglas Rushkoff "Is nothing sacred?"

It seems to me we've surrendered the notion of the sacred to those who only mean to halt the evolution of culture. Things we call "sacred" are simply ideologies and truths so successfully institutionalized that they seem unquestionable.
For example, the notion that sexual imagery is bad for young people to see--a claim never established by any psychological or anthropological study I've come across--is accepted as God-ordained fact, and used as a fundamental building block to justify censorship. (Meanwhile, countless sitcoms in which parents lie to one another are considered wholesome enough to earn "G" television ratings.) A politician's claim to be "God-fearing" is meant to signify that he has priorities greater than short-term political gain. What most people don't realize is that, in the Bible anyway, God-fearing is a distant second to God-loving. People who were God-fearing only behaved ethically because they were afraid of the Hebrew God's wrath. This wasn't a sacred relationship at all, but the self-interested avoidance of retaliation. Today, it seems that no place, and--more importantly--no time is truly sacred. Our mediating technologies render us available to our business associates at any hour, day or night. Any moment spent thinking instead of spending, or laughing instead of working, is an opportunity missed. And the more time we sacrifice to production and consumption, the less any alternative seems available to us. One radical proposal to combat the contraction of sacred time was suggested in the book of Exodus, and it's called the Sabbath. What if we all decided that for one day each week, we would refrain from buying or selling anything? Would it throw America into a recession? Maybe the ancients didn't pick the number seven out of a hat. Perhaps they understood that human beings can only immerse themselves in commerce for six days at a stretch before losing touch with anything approaching the civil, social, or sacred.

DOUGLAS RUSHKOFF is the author of Coercion, Media Virus, Playing the Future, and Ecstasy Club, and Professor of Virtual Culture at New York University.

Howard Rheingold "Why can't we use technology to solve social problems?"
Not long after the Apollo landing, a prevalent cliche for a few years was "If we can put humans on the moon, why can't we...[insert prominent social problem such as starvation, epidemic, radical inequalities, etc.]?" In 1980, in his book "Critical Path," Buckminster Fuller wrote: "We are blessed with technology that would be indescribable to our forefathers. We have the wherewithal, the know-it-all to feed everybody, clothe everybody, and give every human on Earth a chance. We know now what we could never have known before--that we now have the option for all humanity to "make it" successfully on this planet in this lifetime." In the contemporary zeitgeist, Fuller's claims seem naively utopian. The past century saw too much misery resulting from the attempts to build utopias. But without the belief that human civilization can improve, how could we have arrived at the point where we can formulate questions like these and exchange them around the world at the speed of light? There are several obvious choices for answers to the question of why this question isn't asked any more:

1. We might have been able to put humans on the moon in 1969, but not today. Good point. And the reason for this circumstance--lack of political will, and of the community of know-how it took NASA a decade to assemble, not lack of technical capabilities--is instructive.

2. Technology actually has solved enormous social problems--antibiotics, hygienic plumbing, immunization, the green revolution. I agree with this, and see it as evidence that new inventions can relieve twice as much, a thousand times as much human misery as previous inventions did.

3. Human use of technologies has created even greater social problems--antibiotics were misused and supergerms evolved; nuclear wastes and weapons are threats, not enhancements; the green revolution swelled the slums of the world as agricultural productivity rose and global agribiz emerged.

4.
There is no market for solving social problems, and it isn't the business of government to get into the technology business or any other kind of business. This is the fallacy of the excluded middle. Some technologies such as the digital computer and the Internet were jump-started by governments, evolved through grassroots enthusiasms, and later became industries and "new economies."

5. Throwing technology at problems can be helpful, but the fundamental problems are political and economic and rooted in human nature. This answer should not be ignored. A tool is not the task, and often the invisible, social, non-physical aspects of a technological regime make all the difference.

There's some truth to each of these answers, yet they all fall short because they all assume that we know how to think about technology. Just because we know how to make things doesn't guarantee that we know what those things will do to us. Or what kind of things we ought to make. What if we don't know how to think about the tools we are so skilled at creating? What if we could learn? Perhaps knowing how to think about technology is a skill we will have to teach ourselves the way we taught ourselves previous new ways of thinking such as mathematics, logic, and science. A few centuries ago, a few people began questioning the assumption that people knew how to think about the physical world. Neither philosophy nor religion seemed to be able to stave off famine and epidemic. The Enlightenment was about a new method for thinking. Part of that new method was the way of asking and testing questions known as science, which provided the knowledge needed to create new medicines, new tools, new weapons, new economic systems. We learned how to think very well about the physical world, and how to unleash the power in that knowledge. But perhaps we have yet to learn how to think about what to do with our tools.

HOWARD RHEINGOLD is author of The Virtual Community, Virtual Reality, and Tools for Thought.
Founder of Electric Minds, named by Time magazine one of the ten best web sites of 1996. Editor of The Millennium Whole Earth Catalog.

Karl Sabbagh "How many angels can dance on the point of a pin?"

This question is no longer asked, not because the question has been answered (although I happen to believe the answer is e to the i pi) but because the search for knowledge about the spiritual world has shifted focus as a result of science and maths cornering all the physical and numerical answers. Along with "Did Adam have a navel?" and "Did Jesus' mother give birth parthenogenetically?", this question is no longer asked by anyone of reasonable intelligence. Those who would in the past have searched for scientific support for their spiritual ideas have finally been persuaded that this is a demeaning use for human brainpower and that by moving questions about the reality of spiritual and religious ideas into the same category as questions about mind (as opposed to brain) they will retain the respect of unbelievers and actually get nearer to an understanding of the sources of their preoccupations. As an addendum, although I wasn't asked, I would also like to answer the question "What questions should disappear and why?" The one question that should disappear as soon as possible--and to a certain extent scientists are to blame for the fact that it is still asked--is: What is the explanation for astrology/UFOs/clairvoyance/telepathy/any other 'paranormal' phenomenon you care to name? This question is still asked because scientists and science educators have failed to get over to the public the fact that there is only one method of explaining phenomena--the scientific method. Therefore, anything that people are puzzled by that has not been explained either doesn't exist or there isn't yet enough evidence to prove that it does. But still I get the impression that for believers in these phenomena there are two types of explanatory system--
science and nonscience (you can pronounce the latter 'nonsense' if you like). When you try to argue with these people by pointing out that there isn't sufficient repeatable evidence even to begin to attempt an explanation in scientific terms, they just say that this particular phenomenon doesn't require that degree of stringency. When the evidence is strong enough to puzzle scientists as well as nonscientists, they'll begin to devise explanations--scientific explanations. There's a good example of how this works currently with the interest taken in St John's wort as a possible treatment for depression. Once there was enough consistent evidence to suggest that there might be an effect, clinical trials were planned and are now under way. Interestingly, an indication that there might be a genuine effect comes from a substantial body of information suggesting that there are adverse drug interactions between St John's wort and immunosuppressive drugs taken by transplant patients. Once an 'alternative' remedy actually causes harm as well as having alleged benefits, its claimed effects are more likely to be genuine. One argument against most of the quack remedies around, such as homoeopathy, is that they are entirely safe (although this is seen as a recommendation by the gullible). The demand that phenomena that are not explainable in scientific terms should be accepted on the basis of some other explanation is similar to the argument you might care to use with your bank manager that there is more than one type of arithmetic. Using his conventional accounting methods he might think your account is overdrawn, but you would argue that, although the evidence isn't as strong as his method might require, you believe you still have lots of money in your account and therefore will continue writing cheques. (As like as not, this belief in a positive balance in your account will be based on some erroneous assumption--
for example, that you still have a lot of blank cheques left in your chequebook.)

KARL SABBAGH is a television producer who has turned to writing. Among his television programs are "Skyscraper"--a four-hour series about the design and construction of a New York skyscraper; "Race for the Top"--a documentary about the hunt for the top quark; and "21st Century Jet"--a five-part series following Boeing's new 777 airliner from computer design to entry into passenger service. He is the author of six books including Skyscraper, 21st Century Jet, and A Rum Affair.

Roger Schank "Why Teach Mathematics?"

Some questions are so rarely asked that we are astonished anyone would ask them at all. The entire world seems to agree that knowing mathematics is the key to something important; they just forget what. Benjamin Franklin asked this question in 1749 while thinking about what American schools should be like and concluded that only practical mathematics should be taught. The famous mathematician G.H. Hardy asked this question (in A Mathematician's Apology) and concluded that while he loved the beauty of mathematics there was no real point teaching it to children. Today, we worry about the Koreans and Lithuanians doing better than us in math tests and every "education president" asserts that we will raise math scores, but no one asks why this matters. Vague utterances about how math teaches reasoning are belied by the fact that mathematicians do not do everyday reasoning particularly better than anyone else. To anyone who reads this and is still skeptical, I ask: what is the Quadratic Formula? You learned it in ninth grade; you couldn't graduate high school without it. When was the last time you used it? What was the point of learning it?

ROGER SCHANK is the Chairman and Chief Technology Officer for Cognitive Arts and has been the Director of the Institute for the Learning Sciences at Northwestern University since its founding in 1989.
One of the world's leading Artificial Intelligence researchers, his books include Dynamic Memory: A Theory of Learning in Computers and People, Tell Me a Story: A New Look at Real and Artificial Memory, The Connoisseur's Guide to the Mind, and Engines for Education.

Stephen H. Schneider "Will the free market finally triumph?"

Despite Seattle and the French farmers, free market advocates of globalization have largely won--even China is signing up to be a major player in the international trading and growth-oriented global political economy. So it is rare to hear this question anymore, even from so-called "enterprise institutes" dedicated to protecting property rights. The problem is, what has been won? My concern is not with the question no longer asked in this context, but rather with the companion question not often enough asked: "Is there any such thing as a free market?" To be sure, markets are generally efficient ways of allocating resources and accomplishing economic goals. However, markets are notorious for leaving out much of what people really value. In different words, the market price of doing business simply excludes much of the full costs or benefits of doing business because many effects aren't measured in traditional monetary units. For example, the cost of a ton of coal isn't just the extraction costs plus transportation costs plus profit, but also real expenses to real people (or real creatures) who happen to be external to the energy market. Such "externalities" are very real to coastal dwellers trying to cope with sea level rises likely to be induced by the global warming driven by massive coal burning.
I recall a discussion at the recent international negotiations to limit emissions of greenhouse gases in which a chieftain from the tiny Pacific island of Kiribati was being told by an OPEC supporter opposed to international controls on emissions from fossil fuels that the summed economies of all the small island states were only a trivial fraction of the global GDP, and thus even if sea level rise were to drive them out of national existence, this was "not sufficient reason to hold back the economic progress of the planet by constricting the free use of energy markets". "We are not ungenerous", he said, so in the "unlikely event" that you were a victim of sea level rise, "we'll just pay to relocate all of you and your people to even better homes and jobs than you have now", and this, he went on, will be much cheaper than to "halt industrial growth". (This isn't the forum to refute the nonsense that controls on emissions will halt industrial growth.) After hearing this offer, the aging and stately chieftain paused, scratched his flowing hair, politely thanked the OPEC man for his thoughtfulness and simply said, "we may be able to move, but what do I do with the buried bones of my grandfather?" Economists refer to the units of value in cost-benefit analyses as "numeraires"--dollars per ton of carbon emitted, in the climate example, is the numeraire of choice for "free market" advocates. But what of lives lost per ton of emissions from intensified hurricanes, or species driven off mountain tops to extinction per ton, or heritage sites lost per ton? Or what if global GDP indeed goes up fastest by free markets but 25% of the world gets left further behind as globally economically efficient markets expand? Is equity a legitimate numeraire too?
Therefore, while market systems seem indeed to have triumphed, it is time to phase in a new, multi-part question: "How can free markets be adjusted to value what is left out of private cost-benefit calculus but represents real value, so that price signals in markets reflect all the costs and benefits to society across all the numeraires, rather than being rigged to preserve the status quo in which monetary costs to private parties are the primary consideration?" I hope the new US president soon transcends all that obligatory free market rhetoric of the campaign and learns much more about what constitutes a full market price. It is very likely he'll get an earful as he jets about the planet in Air Force One catching up on the landscapes--political and physical--of the vastly diverse countries in the world that it is time for him to visit. Many world leaders are quite worried about just what we will have won as currently defined free markets triumph.

STEPHEN H. SCHNEIDER is Professor in the Biological Sciences Department at Stanford University and the former Department Director and Head of the Advanced Study Project at the National Center for Atmospheric Research, Boulder. He is internationally recognized as one of the world's leading experts in atmospheric research and its implications for environment and society. Dr. Schneider's books include The Genesis Strategy: Climate Change and Global Survival; The Coevolution of Climate and Life; Global Warming: Are We Entering the Greenhouse Century?; and Laboratory Earth.

Al Seckel "Why is our sense of beauty and elegance such a useful tool for discriminating between a good theory and a bad theory?"

During the early 1980s, I had the wonderful fortune to spend a great deal of time with Richard Feynman, and our innumerable conversations extended over a very broad range of topics (not always physics!).
At that time, I had just finished re-reading his wonderful book, The Character of Physical Law, and wanted to discuss an interesting question with him, not directly addressed by his book: Why is our sense of beauty and elegance such a useful tool for discriminating between a good theory and a bad theory? And a related question: Why are the fundamental laws of the universe self-similar? Over lunch, I put the questions to him. "It's goddam useless to discuss these things. It's a waste of time," was Dick's initial response. Dick always had an immediate gut-wrenching approach to philosophical questions. Nevertheless, I persisted, because it had to be admitted that he had a strong intuitive sense of the elegance of fundamental theories, and might be able to provide some insight rather than just philosophizing. It was also true that this notion was a successful guiding principle for many great physicists of the twentieth century, including Einstein, Bohr, Dirac, Gell-Mann, etc. Why this was so was interesting to me. We spent several hours trying to get at the heart of the problem and, indeed, trying to determine if it was even a true notion rather than some romantic representation of science. We did agree that it was impossible to explain honestly the beauties of the laws of nature in a way that people can feel, without their having some deep understanding of mathematics. It wasn't that mathematics was just another language for physicists; it was a tool for reasoning by which you could connect one statement with another. The physicist attaches meaning to all his phrases. He needs to have a connection of words to the real world. Certainly, a beautiful theory meant being able to describe it very simply in terms of fundamental mathematical quantities. "Simply" meant compression into a small mathematical expression with tremendous explanatory powers, which required only a finite amount of interpretation.
In other words, a huge number of relationships between data are concisely fit into a single statement. Later, Murray Gell-Mann expressed this point well, when he wrote, "The complexity of what you have to learn in order to be able to read the statement of the law is not really very great compared to the apparent complexity of the data that are being summarized by that law. That apparent complexity is partly removed when the law is formed." Another driving principle was that the laws of the universe are self-similar, in that there are connections between two sets of phenomena previously thought to be distinct. There seemed to be a beauty in the inter-relationships, fed perhaps by a prejudice that at the bottom of it all was a simple unifying law. It was easy to find numerous examples from the history of modern science that fit within this framework (Maxwell's equations for electromagnetism, Einstein's general-relativistic equations for gravitation, Dirac's relativistic quantum mechanics, etc.), but Dick and I were still working away at the fringes of the problem. So far, all we could do was describe the problem and find numerous examples, but we could not answer what provided the feeling for great intuitive guesses. Perhaps our love of symmetries and patterns is an integral part of why we would embrace certain theories and not others. For example, for every conservation law, there was a corresponding symmetry, albeit sometimes these symmetries would be broken. But this led us to another question: Is symmetry inherent in nature or is it something we create? When we spoke of symmetries, we were referring to the symmetry of the mathematical laws of physics, not to the symmetry of objects commonly found in nature. We felt that symmetry was inherent in nature, because it was not something that we expected to find in physics. Another psychological prejudice was our love for patterns. The simplicity of the patterns in physics was beautiful.
This does not mean simple in action--the motion of the planets and of atoms can be very complex, but the basic patterns underneath are simple. This is what is common to all of our fundamental laws. It should be noted that we could also come up with numerous examples where one's sense of elegance and beauty led to beautiful theories that were wrong. A perfect example of a mathematically elegant theory that turned out to be wrong is Francis Crick's 1957 attempt at working out the genetic coding problem (Codes without Commas). It was also true that there were many examples of physical theories that were pursued on the basis of lovely symmetries and patterns, and that these also turned out to be false. Usually, these were false because of some logical inconsistency or the crude fact that they did not agree with experiment. The best that Dick and I could come up with was an unscientific response, which is, given our fondness for patterns and symmetry, we have a prejudice--that nature is simple and therefore beautiful. Since that time, the question has disappeared from my mind, and it is fun thinking about it again, but in doing scientific research, I now have to concern myself with more pragmatic questions.

AL SECKEL is acknowledged as one of the world's leading authorities on illusions. He has given invited lectures on illusions at Caltech, Harvard, MIT, Berkeley, Oxford University, University of Cambridge, UCLA, UCSD, University of Lund, University of Utrecht, and many other fine institutions. Seckel is currently under contract with the Brain and Cognitive Division of the MIT Press to author a comprehensive treatise on illusions, perception, and cognitive science.

Terrence J. Sejnowski "Is God Dead?"

On April 8, 1966, the cover of Time Magazine asked "Is God Dead?" in bold red letters on a jet black background. This is an arresting question that no one asks anymore, but back in 1966 it was a hot issue that received serious comment.
In 1882, Friedrich Nietzsche in The Gay Science had a character called "the madman" running through the marketplace shouting "God is dead!", but in the book, no one took the madman seriously. The Time Magazine article reported that a group of young theologians calling themselves Christian atheists, led by Thomas J. J. Altizer at Emory University, had claimed God was dead. This hit a cultural nerve, and in an appearance on "The Merv Griffin Show" Altizer was greeted by shouts of "Kill him! Kill him!" Today Altizer continues to develop an increasingly apocalyptic theology but has not received a grant or much attention since 1966. The lesson here is that the impact of a question very much depends on the cultural moment. Questions disappear not because they are answered but because they are no longer interesting.

TERRENCE J. SEJNOWSKI, a pioneer in Computational Neurobiology, is regarded by many as one of the world's foremost theoretical brain scientists. In 1988, he moved from Johns Hopkins University to the Salk Institute, where he is a Howard Hughes Medical Investigator and the director of the Computational Neurobiology Laboratory. In addition to co-authoring The Computational Brain, he has published over 250 scientific articles.

Michael Shermer "Can science answer moral and ethical questions?"

From the time of the Enlightenment, philosophers have speculated that the remarkable advances of science would one day spill over into the realm of moral philosophy, and that scientists would be able to discover answers to previously insoluble moral dilemmas and ethical conundrums. One of the reasons Ed Wilson's book Consilience was so successful was that he attempted to revive this Enlightenment dream. Alas, we seem no closer than we were when Voltaire, Diderot, and company first encouraged scientists to go after moral and ethical questions.
Are such matters truly insoluble and thus out of the realm of science (since, as Peter Medawar noted, "science is the art of the soluble")? Should we abandon Ed Wilson's Enlightenment dream of applying evolutionary biology to the moral realm? Most scientists agree that moral questions are scientifically insoluble and they have abandoned the Enlightenment dream. But not all. We shall see.

MICHAEL SHERMER is the founding publisher of Skeptic magazine, the host of the acclaimed public science lecture series at Caltech, and a monthly columnist for Scientific American. His books include Why People Believe Weird Things, How We Believe, and Denying History.

Lee Smolin "What is the next step in the evolution of democracy?"

A question no longer being asked is how to make the next step in the evolution of a democratic society. Until very recently it was widely understood that democracy was a project with many steps, whose goal was the eventual construction of a perfectly just and egalitarian society. But recently, with the well-deserved collapse of Marxism, it has begun to seem that the highest stage of civilization we humans can aspire to is global capitalism leavened by some version of a bureaucratic welfare state, all governed badly by an unwieldy and corrupt representative democracy. This is better than many of the alternatives, but it is hardly egalitarian and often unjust; those of us who care about these values must hope that human ingenuity is up to the task of inventing something still better. It is proper that the nineteenth-century idea of utopia has finally been put to rest, for that was based on a paradox, which is that any predetermined blueprint for an ideal society could only be imposed by force. It is now almost universally acknowledged that there is no workable alternative to the democratic ideal that governments get their authority by winning the consent of the governed.
This means that if we are to change society, it must be by a process of evolution rather than revolution. But why should this mean that big changes are impossible? What is missing are new ideas, and a context in which to debate them. There are at least four issues facing the future of the democratic project. First, while democracy in the world's most powerful country is perceived by many of its citizens as corrupted, there is little prospect for serious reform. The result is alienation so severe that around half of our citizens do not participate in politics. At what point, we may ask, will so few vote that the government of the United States ceases to have a valid claim to have won the consent of the governed? As the political and journalistic classes have largely lost the trust of the population, where will leadership to begin the reform that is so obviously needed come from? A second point of crisis and opportunity is in the newly democratized states. In many of these countries intellectuals played a major role in the recent establishment of democracy. These people are not likely to go to sleep and let the World Bank tell them what democracy is. The third opportunity is in Europe, where a rather successful integration of capitalism and democratic socialism has been achieved. These societies suffer much less from poverty and the other social and economic ills that appear so unsolvable in the US context. (And it is not coincidental that the major means of funding political campaigns in the US are illegal in most of Europe.) Walking the streets in Denmark or Holland, it is possible to wonder what a democratic society that evolved beyond social democracy might look like. European integration may be only the first step towards a new kind of nation state which will give up much of its sovereignty to multinational entities, a kind of nation-as-local-government.
Another challenge for democracy is the spread of the bureaucratic mode of organization, which in most countries has taken over the administration of education, science, health and other vital areas of public interest. As anyone who works for a modern university or hospital can attest, bureaucratic organizations are inherently undemocratic. Debate amongst knowledgeable, responsible individuals is replaced by the management of perceptions and the manipulation of supposedly objective indices. As the politics of the academy begins to look more like nineteenth-century Russia than fifth-century BC Athens, we intellectuals need to do some serious work to invent more democratic modes of organization for ourselves and for others who work in the public interest. Is it not then time we "third culture intellectuals" began to attack the problem of democracy, both in our workplaces and in our societies? Perhaps, with all of our independence, creativity, intelligence and edginess, we may find we really have something of value to contribute.

LEE SMOLIN is a theoretical physicist; professor of physics and member of the Center for Gravitational Physics and Geometry at Pennsylvania State University; author of The Life of the Cosmos.

Dan Sperber "Are women and men equal?"

No doubt, there are differences between women and men, some obvious and others more contentious. But arguments for inequality of worth or rights between the sexes have wholly lost intellectual respectability. Why? Because they were grounded in biologically evolved dispositions and culturally transmitted prejudices that, however strongly entrenched, could not withstand the kind of rational scrutiny to which they have been submitted in the past two centuries. Also because, more recently, the feminist movement has given so many of us the motivation and the means to look into ourselves and recognize and fight lingering biases. Still, the battle against sexism is not over--and it may never be.
DAN SPERBER is a social and cognitive scientist at the French Centre National de la Recherche Scientifique (CNRS) in Paris. His books include Rethinking Symbolism, On Anthropological Knowledge, Explaining Culture: A Naturalistic Approach, and, with Deirdre Wilson, Relevance: Communication and Cognition. Tom Standage "Are there planets around other stars?" Speculation about the possibility of a "plurality of worlds" goes back at least as far as Epicurus in the fourth century BC. Admittedly, Epicurus' definition of a "world" was closer to what we would currently regard as a solar system--but he imagined innumerable such spheres, each containing a system of planets, packed together. "There are," he declared, "infinite worlds both like and unlike this world [i.e., solar system] of ours." The same question was subsequently considered by astronomers and philosophers over the course of many centuries; within the past half century, the idea of the existence of other planets has become a science fiction staple, and people have started looking for evidence of extraterrestrial civilizations. Like Epicurus, many people have concluded that there must be other solar systems out there, consisting of planets orbiting other stars. But they didn't know for sure. Today, we do. The first "extrasolar" planet (i.e., beyond the solar system) was found in 1995 by two Swiss astronomers, and since then another 48 planets have been found orbiting dozens of nearby sun-like stars. This figure is subject to change, because planets are now being found at an average rate of more than one per month; more planets are now known to exist outside the solar system than within it. Furthermore, one star is known to have at least two planets, and another has at least three. We can, in other words, now draw maps of alien solar systems--maps that were previously restricted to the realm of science fiction. 
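These first extrasolar planets were detected indirectly, through the Doppler "wobble" they induce in the motion of their host stars. As a rough illustration of the scale involved, here is a sketch of the standard radial-velocity semi-amplitude formula; the particular numbers (roughly those of the 1995 discovery, 51 Pegasi b) are illustrative assumptions, not figures from the essay:

```python
import math

# Physical constants (SI units)
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
M_JUP = 1.898e27       # Jupiter mass, kg
DAY = 86400.0          # seconds per day

def rv_semi_amplitude(period_days, m_planet_mjup, m_star_msun,
                      sin_i=1.0, ecc=0.0):
    """Amplitude (m/s) of the star's radial-velocity 'wobble' caused
    by an orbiting planet -- the standard Doppler-method formula."""
    P = period_days * DAY
    mp = m_planet_mjup * M_JUP
    ms = m_star_msun * M_SUN
    return ((2 * math.pi * G / P) ** (1.0 / 3.0)
            * mp * sin_i
            / (ms + mp) ** (2.0 / 3.0)
            / math.sqrt(1.0 - ecc ** 2))

# Illustrative values close to 51 Pegasi b: a roughly half-Jupiter-mass
# planet on a 4.2-day orbit around a sun-like star.
print(rv_semi_amplitude(4.23, 0.47, 1.06))
```

The result is a few tens of metres per second -- a tiny sway for a whole star, but well within the reach of the spectrographs that made the 1995 detection and its successors possible.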
The discovery that there are other planets out there has not, however, caused as much of a fuss as might have been expected, for two reasons. First, decades of Star Trek and its ilk meant that the existence of other worlds was assumed; the discovery has merely confirmed what has lately become a widely-held belief. And second, none of these new planets has actually been seen. Instead, their existence has been inferred through the tiny wobbles that they cause in the motion of their parent stars. The first picture of an extrasolar planet is, however, probably just a few years away. Like the first picture of Earth from space, it is likely to become an iconic image that once again redefines the way we as humans think about our place in the universe. Incidentally, none of these new planets has a name yet, because the International Astronomical Union, the body which handles astronomical naming, has yet to rule on the matter. But the two astronomers who found the first extrasolar planet have proposed a name for it anyway, and one that seems highly appropriate: they think it should be called Epicurus. TOM STANDAGE is technology correspondent at The Economist in London and author of the books The Victorian Internet and The Neptune File, both of which draw parallels between episodes in the history of science and modern events. He has also written for the Daily Telegraph, The Guardian, Prospect, and Wired. He is married and lives in Greenwich, England, just down the hill from the Old Royal Observatory. Timothy Taylor "How can I stop the soul of the deceased reanimating the body?" At a particular point (yet to be clearly defined) in human cultural evolution, a specific idea took hold that there were two, partially separable, elements present in a living creature: the material body and the force that animated it. 
On the death of the body the animating force would, naturally, desire the continuation of this-worldly action and struggle to reassert itself (just as one might strive to retrieve a flint axe one had accidentally dropped). If the soul (or spirit) succeeded, it would also seek to repossess its property, including its spouse, and reassert its material appetites. The desire of the disembodied soul was viewed as dangerous by the living, who had by all means to enchant, cajole, fight off, sedate, or otherwise distract and disable it. This requirement to keep the soul from the body after death did not last forever, only so long as the flesh lay on the bones. For the progress of the body's decomposition was seen as analogous to the slow progress the soul made toward the threshold of the Otherworld. When the bones were white (or were sent up in smoke, or whatever the rite in that community was), then it was deemed that the person had finally left this life and was no longer a danger to the living. Thus it was that, for most of recent human history (roughly the last 35,000 years), funerary rites were twofold: the primary rites zoned off the freshly dead and instantiated the delicate ritual powers designed to keep the unquiet soul at bay; the secondary rites, occurring after weeks or months (or, sometimes--in the case of people who had wielded tremendous worldly power--years), firmly and finally incorporated the deceased into the realm of the ancestors. Since the rise of science and scepticism, the idea of the danger of the disembodied soul has, for an increasing number of communities, simply evaporated. But there is a law of conservation of questions. "How can I stop the soul of the deceased reanimating the body?" is now being replaced with "How can I live so long that my life becomes indefinite?," a question previously only asked by the most arrogant pharaohs and emperors. TIMOTHY TAYLOR lectures in the Department of Archaeological Sciences, University of Bradford, UK. 
He is the author of The Prehistory of Sex. Joseph Traub "Have We Seen the End of Science?" "Will the Internet Stock Bubble Burst?" I am taking the liberty of sending two questions. (After all, the people on the list like to push the boundaries.) "Have We Seen the End of Science?" John Horgan announced the end of science in his book of the same title. Almost weekly the most spectacular advances are being announced and intriguing questions being asked in fields such as biology and physics. The answer was always a resounding no; now nobody asks the question. "Will the Internet Stock Bubble Burst?" We certainly know the answer to that one now. On February 22, 2000 I gave a talk at the Century Association in NYC titled "Modern Finance and Computers". One of the topics that I covered was "Will the internet stock bubble burst?" I said it was a classic bubble and would end in the usual way. I cited the example of a Fall 1999 IPO for VA Linux. This was a company that had a market capitalization of 10 billion dollars at the end of the first day even though it had never shown a profit and was up against competitors such as Dell and IBM. The NASDAQ reached its high on March 10, 2000, and the internet sector collapsed a couple of weeks later. The high for VA Linux in 2000 was $247; yesterday it closed below $10. JOSEPH F. TRAUB is the Edwin Howard Armstrong Professor of Computer Science at Columbia University and External Professor at the Santa Fe Institute. He was founding Chairman of the Computer Science Department at Columbia University from 1979 to 1989, and founding chair of the Computer Science and Telecommunications Board of the National Academy of Sciences from 1986 to 1992. From 1971 to 1979 he was Head of the Computer Science Department at Carnegie-Mellon University. Traub is the founding editor of the Journal of Complexity and an associate editor of Complexity. A Festschrift in celebration of his sixtieth birthday was recently published. 
He is the author of nine books including the recently published Complexity and Information. Colin Tudge "The Great Idea That's Disappeared" The greatest idea that's disappeared from mainstream science these past 400 years is surely that of God. The greats who laid the foundations of modern science in the 17th century (Galileo, Newton, Leibniz, Descartes) and the significant-but-not-quite-so-greats (Robert Boyle, John Ray, etc.) were theologians as much as they were scientists and philosophers. They wanted to know how things are, of course--but also what God had in mind when he made them this way. They took it for granted, or contrived to prove to their own satisfaction, that unless there is a God, omniscient and mindful, then there could be no Universe at all. Although David Hume did much to erode such argument, it persisted well into the 19th century. Recently I have been intrigued to find James Hutton--who, as one of the founders of modern geology, is one of the boldest and most imaginative of all scientists--earnestly wondering in a late 18th century essay what God could possibly have intended when he made volcanoes. The notion that there could be no complex and adapted beings at all without a God to create them was effectively the default position in orthodox biology until (as Dan Dennett has so succinctly explained) Charles Darwin showed how natural selection could produce complexity out of simplicity, and adaptation out of mere juxtaposition. Today, very obviously, no Hutton-style musing would find its way into a refereed journal. In Nature, God features only as the subject of (generally rather feeble) sociological and sometimes evolutionary speculation. Religion obviously flourishes still, but are religion and science now condemned to mortal conflict? Fundamentalist-atheists would have it so, but I think not. 
The greatest ideas in philosophy and science never really go away, even if they do change their form or go out of fashion, but they do take a very long time to unfold. For at least 300 years--from the 16th to the 19th centuries--emergent science and post-medieval theology were deliberately intertwined, in many ingenious ways. Through the past 150 years, they have been just as assiduously disentangled. But the game is far from over. Cosmologists and metaphysicians continue to eye and circle each other. Epistemology--how we know what's true--is of equal interest to scientists and theologians, and each would be foolish to suppose that the other has nothing to offer. How distant is the religious notion of revelation from Dirac's--or Keats's?--perception of truth as beauty? Most intriguingly of all, serious theologians are now discussing the role of religion in shaping emotional response, while modern aficionados of artificial intelligence acknowledge (as Hume did) that emotion is an essential component of thought itself. Lastly, the ethics of science and technology--how we should use our new-found power--are the key discussions of our age, and it is destructive to write religion out of the act, even if the priests, rabbis and mullahs who so far have been invited to take part have often proved disappointing. I don't share the modern enthusiasm for over-extended life, but I would like to see how the dialogue unfolds in the centuries to come. COLIN TUDGE is a Research Fellow at the Centre for Philosophy, London School of Economics. His two latest books are The Variety of Life and In Mendel's Footnotes. Sherry Turkle "Can you have an artificial intelligence?" 
Progress in the domain that Marvin Minsky once characterized as "making machines do things that would be considered intelligent if done by people" has not been as dramatic as its founders might once have hoped, but the penetration of machine cognition into everyday life (from the computer that plays chess to the computer that determines if your toast is done) has been broad and deep. We now use the term "intelligent" to refer to the kind of helpful smartness embedded in such objects. So the language has shifted and the question has disappeared. But until recently, there was a tendency to limit appreciation of machine mental prowess to the realm of the cognitive. In other words, acceptance of artificial intelligence came with a certain "romantic reaction." People were willing to accept that simulated thinking might well be deemed thinking, but simulated feeling was not feeling. Simulated love could never be love. These days, however, the realm of machine emotion has become a contested terrain. There is research in "affective computing" and in robotics which produces virtual pets and digital dolls--objects that present themselves as experiencing subjects. In artificial intelligence's "essentialist" past, researchers tried to argue that the machines they had built were "really" intelligent. In the current business of building machines that self-present as "creatures," the work of inferring emotion is left in large part to the user. The new artificial creatures are designed to push our evolutionary buttons so that we respond to their speech, their gestures, and their demands for nurturance by experiencing them as sentient, even emotional. And people are indeed inclined to respond to creatures they teach and nurture by caring about them, often in spite of themselves. People tell themselves that the robot dog is a program embodied in plastic, but they become fond of it all the same. They want to care for it and they want it to care for them. 
In cultural terms, old questions about machine intelligence have given way to a question not about the machines but about us: What kind of relationships is it appropriate to have with a machine? It is significant that this question has become relevant in a day-to-day sense during a period of unprecedented human redefinition through genomics and psychopharmacology, fields that, along with robotics, encourage us to ask not only whether machines will be able to think like people, but whether people have always thought like machines. SHERRY TURKLE is a professor of the sociology of science at MIT. She is the author of Life on the Screen: Identity in the Age of the Internet; The Second Self: Computers and the Human Spirit; and Psychoanalytic Politics: Jacques Lacan and Freud's French Revolution. Henry Warwick "None." Why: 1. If Barbour's theory of Platonia is even roughly correct, then everything exists in a timeless universe, and therefore doesn't actually "disappear". Therefore, all questions are always asked, as everything is actually happening at once. I know that doesn't help much, and it dodges the main thrust of the question, but it's one support for my answer, if oblique. 2. Other than forgotten questions that disappear of their own accord, or are in some dead language, or are too personal/particular/atomised (i.e., What did you think of the latest excretion from Hollywood? Is it snowing now? Why is that weirdo across the library reading room looking at me?!?! When will I lose these 35 "friends" who are perched on my belt buckle? etc.) questions don't really disappear. They are asked again and again and are answered again and again, and this is a very good thing. Three-year-olds will always ask "Daddy, where do the stars come fwum?" And daddies will always answer as best they can. Eventually, some little three-year-old will grow into an adult astronomer and might find even better answers than their daddy supplied them on a cold Christmas night. 
And they will answer the same simple question with a long involved answer, or possibly, a better and simpler answer. In this way, questions come up again and again, but over time they spin out in new directions with new answers. 3. It's important to not let questions disappear. By doubting the obvious, examining the same ground with fresh ideas, and questioning received ideas, great strides in the collected knowledge of this human project can be (and historically, have been) gained. When we consign a question to the scrap heap of history we run many risks--risks of blind arrogance, deaf self-righteousness, and finally choking on the bland pablum of unquestioned dogma. 4. It's important to question the questions. It keeps the question alive, as it refines the question. Question the questions, and then reverse the process--question the questioning of questions. Permit the mind everything, even if it seems repetitive. If you spin your wheels long enough you'll blow a bearing or snap a spring, and the question is re-invented, re-asked, and re-known, but in a way not previously understood. In this way, questions don't disappear, they evolve into other questions. For a while they might bloat up in the sun and smell really weird, but it's all part of the process... HENRY WARWICK sometimes works as a scientist in the computer industry. He always works as an artist, composer, and writer. He lives in San Francisco, California. Margaret Wertheim "...the old question of whether our categories of reality are discovered or constructed." One question that has almost disappeared, but which I think should not, is the old question about whether our categories of reality are discovered or constructed. In medieval times this was the debate about realism versus nominalism. 
Earlier this century the question flared up again in the debates about the relativistic nature of knowledge and has more recently given rise to the whole "science wars" debacle, but reading the science press today one would think the question had been finally resolved--on the side of realism. Reading the science press now one gets a strong impression that for most scientists our categories of reality are Platonic, almost God-given entities just waiting for the right mind to uncover them. This hard-nosed realist trend is evident across the spectrum of the sciences, but is particularly strong in physics, where the search is currently on for the supposed "ultimate" category of reality--strings being a favored candidate. What gets lost in all this is any analysis of the role that language plays in our pictures of reality. We literally cannot see things that we have no words for. As Einstein once said, "we can only see what our theories allow us to see." I would argue that the question of what role language plays in shaping our picture of reality is one of the most critical questions in science today--and one that should be back on the agenda of every thoughtful scientist. Just one example should suffice to illustrate what is at stake here: MIT philosopher of science Evelyn Fox Keller has shown in her book Secrets of Life, Secrets of Death (and elsewhere) the primary role played by language in shaping theories of genetics. Earlier this century physicists like Max Delbruck and Erwin Schrodinger started to have a philosophical impact on the biological sciences, which henceforth became increasingly "physicized." Just as atoms were seen as the ultimate constituents of matter, so genes came to be seen as the ultimate constituents of life--the entities in which all power and control over living organisms resided. What this metaphor of the "master molecule" obscured was the role played by the cytoplasm in regulating the function and activation of genes. 
For half a century study of the cytoplasm was virtually ignored because geneticists were so fixed on the idea of the gene as the "master molecule." Sure, much good work on genetics was done, but important areas of biological function were also ignored. And are still being ignored by the current "master molecule" camp--the evolutionary psychologists, who cannot seem to see anything but through the prism of genes. Scientists (like all other humans) can only see reality as their language and their metaphors allow them to see it. This is not to say that scientists "make up" their discoveries, only to point out that language plays a critical role in shaping the way we categorize, and hence theorize, the world around us. Revolutions in science are not just the result of revolutions in the laboratory or at theorists' blackboards, they are also linguistic revolutions. Think of words like inertia, energy, momentum--words which did not have any scientific meaning before the seventeenth century. Or words like quantum, spin, charm and strange, which have only had scientific meaning since the quantum revolution of the early twentieth century. Categories of reality are not merely discovered--they are also constructed by the words we use. Understanding more deeply the interplay between the physical world and human language is, I believe, one of the major tasks for the future of science. MARGARET WERTHEIM is the author of Pythagoras' Trousers, a history of the relationship between physics and religion; and The Pearly Gates of Cyberspace: A History of Space from Dante to the Internet. She is a research associate to the American Museum of Natural History in NY and a fellow of the Los Angeles Institute for the Humanities. She is currently working on a documentary about "outsider physics." Dave Winer "What's your business model?" Until this summer this was the most common question at Silicon Valley parties, at bus stops, conferences and grocery stores. 
Everyone had a business model, none planned to make money, all focused on the exit strategy. The euphoria attracted a despicable kind of carpetbagger, one who wanted nothing more than money, and had a disdain for technology. All of a sudden technology was out of fashion in Silicon Valley. Now that the dotcom crash seems permanent, entrepreneurs are looking for real ways to make money. No more vacuous "business models." VCs are hunkering down for a long haul. The average IQ of Silicon Valley entrepreneurs is zooming to its former stratospheric levels. There's a genuine excitement here now, but if you ask what the business model is you're going to get a boring answer. Silicon Valley goes in cycles. Downturns are a perfect time to dig in, listen to users, learn what they want, create the technology that scratches the itch, and plan on selling it for money. DAVE WINER, CEO UserLand Software, Inc. http://www.userland.com/ Naomi Wolf "...the narrative shifted and ...the female sense of identity in the West, for the first time ever, no longer hinges on the identity of her mate ..." The question disappeared in most of Europe and North America, of course, because of the great movement toward women's employment and career advancement even after marrying and bearing children. Feminist historians have long documented how the "story" of the female heroine used to end with marriage; indeed, this story was so set in stone as late as the 1950's and early 60's in this country that Sylvia Plath's heroine in The Bell Jar had to flirt with suicide in order to try to find a way out of it. Betty Friedan noted in The Feminine Mystique that women (meaning middle class white women; the narrative was always different for women of color and working class women) couldn't "think their way past" marriage and family in terms of imagining a future that had greater dimension. 
But the narrative shifted and it's safe to say that the female sense of identity in the West, for the first time ever, no longer hinges on the identity of her mate--which is a truly new story in terms of our larger history. NAOMI WOLF, author, feminist, and social critic, is an outspoken and influential voice for women's rights and empowerment. She is the author of The Beauty Myth, Fire with Fire, and Promiscuities. Milford H. Wolpoff "Where has Darwin gone?" Darwinism is alive and well in academic discussions and in pop thinking. Natural selection is a key element in explaining just about everything we encounter today, from the origin and spread of AIDS to the realization that our parents didn't "make us do it," our ancestors did. Ironically, though, Darwinism has disappeared from the area where it was first and most firmly seated: the evolution of life, and especially the evolution of humanity. Human evolution was once pictured as a series of responses to changing environments coordinated by differences in reproduction and survivorship, as opportunistic changes taking advantage of the new possibilities opened up by the cultural inheritance of social information, as the triumph of technology over brute force, as the organization of intelligence by language. Evolutionary psychologists and other behavioralists still view it this way, but this is no longer presented as the mainstream view of human paleontologists and geneticists who address paleodemographic problems. Human evolution is now commonly depicted as the consequence of species replacements, where there are a series of species emanating from different, but usually African, homelands, each sooner or later replacing the earlier ones. It is not the selection process that provides the source of human superiority in each successive replacement, but the random accidents that take place when new species are formed from small populations of old ones. 
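The outsized role of chance in small populations that Wolpoff describes here is the textbook phenomenon of genetic drift, whose strength depends on population size. A minimal Wright-Fisher sketch (a standard population-genetics model, not anything from the essay; the population sizes and generation counts are arbitrary illustrative choices) makes the dependence visible:

```python
import random

def wright_fisher(n, p0=0.5, generations=100, seed=0):
    """Allele frequency after pure drift: each generation, the n gene
    copies are resampled at random from the previous generation's
    frequency, with no selection acting at all."""
    rng = random.Random(seed)
    p = p0
    for _ in range(generations):
        p = sum(rng.random() < p for _ in range(n)) / n
    return p

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Run many replicate populations. Drift scatters small populations
# toward fixation (frequency 0 or 1) far faster than large ones,
# since the per-generation variance in frequency is p(1 - p) / n.
small = [wright_fisher(20, seed=s) for s in range(50)]
large = [wright_fisher(500, seed=s) for s in range(50)]
print(variance(small) > variance(large))  # True: small populations drift further
```

This is why, as the essay notes, geneticists who assume mutation and drift are the only forces at work can read population size out of genetic variation -- and why that reading silently excludes selection.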
The process is seen as being driven by random extinctions, opening up unexpected opportunities for those fortunate new species lucky to be at the right time and place. The origin and evolution of human species are now also addressed by geneticists studying the variation and distribution of human genes today (and in a few cases ancient genes from Neandertals). They use this information to estimate the history of human population size and the related questions of when the human population might have been small, where it might have originated, and when it might have been expanding. It is possible to do this if one can assume that mutation and genetic drift are the only driving forces of genetic change, because the effect of drift depends on population size. But this assumption means that Darwinian selection did not play any significant role in genetic evolution. Similarly, interpreting the distribution of ancient DNA as reflecting population history (rather than the history of the genes studied--the histories are not necessarily the same) also assumes that selection on the DNA studied did not play a role in its evolution. In fact, the absence of Darwinian selection is the underlying assumption for these types of genetic studies. Human paleontology has taken a giant step away from Darwin; will it have the courage to follow the lead of evolutionary behavior and step back? MILFORD H. WOLPOFF is Professor of Anthropology and Adjunct Associate Research Scientist, Museum of Anthropology at the University of Michigan. His work and theories on a "multiregional" model of human development challenge the popular "Eve" theory. His work has been covered in The New York Times, New Scientist, Discover, and Newsweek, among other publications. He is the author (with Rachel Caspari) of Race and Human Evolution: A Fatal Attraction. Eberhard Zangger "Where Was Lost Atlantis?" 
Two journalists once ranked the discovery of lost Atlantis as potentially the most spectacular sensation of all time. Now, the question of what or where Atlantis might have been has disappeared. Why? The Greek philosopher Plato, the only source for Atlantis, incorporated an extensive description of this legendary city into a mundane summary of contemporary (4th century BC) scientific achievements and knowledge of prehistory. Nobody paid much attention to the account during subsequent centuries. In Medieval times, scholarly interest focussed on Aristotle, while Plato was neglected. When archaeology and history finally assumed the shape of scientific disciplines--after the middle of the 18th century AD--science was still under the influence of Christian theology, its Medieval mother discipline. The first art historians, who were brought up in a creationist world, consequently interpreted western culture as an almost divine concept which first materialized in ancient Greece, without having had any noticeable predecessors. Accordingly, any ancient texts referring to high civilizations much older than Classical Greece had to be fictitious by definition. During the 20th century, dozens of palaces dating to a golden age a thousand years older than Plato's Athens have been excavated around the eastern Mediterranean. Atlantis can now be placed in a historical context. It is an Egyptian recollection of Bronze Age Troy and its awe-inspiring war against the Greek kingdoms. Plato's account and the end of the Bronze Age around 1200 BC can now be seen in a new light. Why was this connection not made earlier? Four Egyptian words, describing location and size, were mistranslated, because at the time Egypt and Greece used different calendars and scales. And, in contrast to biology, where, after Darwin, the idea of creationism was dropped in favor of evolutionism, Aegean prehistory has never questioned its basic premises. 
Geoarchaeologist EBERHARD ZANGGER is Director of Corporate Communications at KPNQwest (Switzerland) and the author of The Flood from Heaven: Deciphering the Atlantis Legend and Geoarchaeology of the Argolid. Zangger has written a monograph, published by the German Archaeological Institute, as well as more than seventy scholarly articles, which have appeared in the American Journal of Archaeology, Hesperia, the Oxford Journal of Archaeology, and the Journal of Field Archaeology. Carl Zimmer "When will disease be eradicated?" By the middle of the twentieth century, scientists and doctors were sure that it was just a matter of time, and not much time at that, before most diseases would be wiped from the face of the Earth. Antibiotics would get rid of bacterial infections; vaccines would get rid of viruses; DDT would get rid of malaria. Now one drug after the next is becoming useless against resistant parasites, and new plagues such as AIDS are sweeping through our species. Except for a handful of diseases like smallpox and Guinea worms, eradication now looks like a fantasy. There are three primary reasons that this question is no longer asked. First, parasite evolution is far faster and more sophisticated than anyone previously appreciated. Second, scientists don't understand the complexities of the immune system well enough to design effective vaccines for many diseases yet. Third, the cures that have been discovered are often useless because the global public health system is a mess. The arrogant dream of eradication has been replaced by much more modest goals of trying to keep diseases in check. CARL ZIMMER is the author of Parasite Rex and writes a column about evolution for Natural History. 
From checker at panix.com Sat Jan 14 10:29:42 2006 From: checker at panix.com (Premise Checker) Date: Sat, 14 Jan 2006 05:29:42 -0500 (EST) Subject: [Paleopsych] NYTBR: 'Joseph Smith: Rough Stone Rolling, ' by Richard Lyman Bushman Message-ID: 'Joseph Smith: Rough Stone Rolling,' by Richard Lyman Bushman http://www.nytimes.com/2006/01/15/books/review/15kirn.html Review by WALTER KIRN JOSEPH SMITH Rough Stone Rolling. By Richard Lyman Bushman, with the assistance of Jed Woodworth. Illustrated. 740 pp. Alfred A. Knopf. $35. Most men who go searching for signs from God look skyward, but Joseph Smith, the youthful Mormon prophet, distinguished himself from his visionary cohort by hunting for sacred wisdom in the ground. In 1827, this barely literate 21-year-old dug in a hillside in rural upstate New York and unearthed a set of golden plates whose unfamiliar characters he translated with the aid of magical "seer stones." The result was the Book of Mormon, a second Bible whose elaborate tale of interracial warfare between two ancient American peoples - the so-called Lamanites and Nephites - was dismissed by Mark Twain as "chloroform in print" but today forms the basis of a worldwide church with a still-growing membership of some six million in the United States and another six million overseas. The mystery of the scripture's origins (was it really translated from "reformed Egyptian" or was it made up or borrowed from other sources?) is just one of the burning questions about Smith that Richard Lyman Bushman, his latest biographer, examines from every conceivable rational angle before declaring it to be unanswerable - unanswerable in a way that vaguely suggests such puzzles were divinely intended to stay that way. Bushman, a retired Columbia history professor who also happens to be a practicing Mormon, has a tricky dual agenda, it turns out: to depict Smith both as the prophet he claimed to be and as the man of his times that he most certainly was. 
"The efforts to situate the Book of Mormon in history, whether ancient or modern, run up against baffling complexities," Bushman writes, seemingly closing the door on the whole matter while slyly leaving it open a crack for a faith. "The Book of Mormon resists conventional analysis, whether sympathetic or critical." As refracted through Bushman's intellectual bifocals - one lens is skeptical and clear, the other reverent and rosy - most of the rest of Smith's remarkable story is shown to resist such analysis as well. So why make the effort in the first place? By showing the inadequacy of reason in the face of spiritual phenomena, Bushman seems to be playing a Latter-Day-Saint Aquinas. It appears he wants to usher in a subtle, mature new age of Mormon thought - rigorous yet not impious - akin to what smart Roman Catholics have had for centuries. Once the reader despairs of ever finding out whether Smith was God's own spokesman or the L. Ron Hubbard of his day, it's possible to enjoy a tale that's as colorful, suspenseful and unlikely as any in American history. Operating on the margins of society, out where the traveled roads turned into paths, Smith managed to build a major religion from scratch. What's more, unlike other 19th-century utopian faiths, Smith's parade kept lengthening over time rather than dispersing from the start. Despite bloody harassment from all sides, a chronic shortage of funds and almost nonstop challenges to his authority, he did Moses one better by leading an exodus and amassing a tribe at the same time. Bushman's Smith, whatever else he was, comes off as a singularly brilliant motivator whose method - call it Dynamic Overextension - modern students of management would do well to study. By perpetually promising the world to a mixed bag of followers that included preachers picked off from other sects, Smith not only captured hundreds, then thousands, of minds, he harnessed their muscles, too. 
From New York he led his pilgrims to Ohio, only to tell them once they'd settled down that Zion lay in Missouri, much farther west, and that many of them would have to pack their things again. To make things yet more strenuous for everyone, he dispatched bands of missionaries to Europe to convert enough souls to populate the place. The upshot of always demanding the impossible was chronic disappointment and disaffection when Smith's followers walked face first into the actual and found it painfully solid, not made of cloud. Zion (near modern Kansas City) wasn't the promised land Smith had promised but a turbulent, insecure frontier whose residents tarred and feathered the hopeful interlopers, torched their houses and not infrequently murdered them - all with the tacit permission of politicians who feared the swelling Mormon vote and the liberal views of the prophet they believed controlled it. Though Mormons today tend to be social conservatives, their founder was something of a wild-eyed radical, opposing slavery, preaching kindness to animals and even promoting an economic order based on distributing wealth according to need. His dreams and schemes came in cascades of revelation, and when they evaporated, fresh visions arrived, many of them pinning blame for the misfortunes on the failings of those whom they befell while promising yet grander glories if the erring Mormons straightened up. The violent confrontations in Missouri and the humiliating retreat that brought the Mormons to Illinois (a haven they would have to flee after their prophet was assassinated in 1844) turned Smith from a primarily religious figure into a full-blown political leader. His idealistic pacifism gave way to a practical doctrine of self-defense. His mild-mannered tolerance for dissent became a cranky insistence on discipline. 
In tracing this fateful shift from seer to czar and oracle to general, Bushman earns a place for his biography on the very short shelf reserved for books on Mormonism with appeal to initiates and outsiders, too. Bushman marks the prophet's time in prison, where the Missourians had locked him up on a dubious charge of treason, as the dawn of his historical self-consciousness, when he recast the Mormon experience as myth and situated his people in a narrative that would give them a durable identity, not just a debatable theology: "Joseph had conceived a strategy. For the Saints to claim their rights, the story of persecution had to be told." Bushman, who seems to believe that psychoanalysis can shed a partial light on the miraculous, relates Smith's new emphasis on sacred suffering to the humiliations of his youth as the son of a scorned and struggling farmer. By uniting his private traumas with the public tribulations of his church, the first Mormon became the essential Mormon, too. The split personality diagnosed in Smith by his best-known modern biographer, Fawn Brodie, has no place in Bushman's study, whose aim is not really to get inside the prophet but to show him from so many angles that he achieves a lifelike roundness while retaining an impenetrable core, which Bushman suggests is where the holiness goes. But as Smith becomes increasingly ambitious about personally building and peopling God's kingdom in the American Midwest, it's hard not to wonder whether the forces driving him ran on chemical and neurological fuels. Bushman may find such medical hindsight trivial - a meaningless, anachronistic autopsy - but it might help make his behavior seem less ghostly, less unremittingly remote. We hear Smith's words but we can't quite picture him speaking them, can't quite imagine their flavor and their tone. "Awake, O Kings of the earth!" 
the prophet cried from Nauvoo, Ill., the half-built Mississippi river town that he'd designed to accommodate immigrant Saints who would include but not be limited to "the polished European, the degraded Hottentot and the shivering Laplander." "Come ye, O! come ye with your gold and your silver," he urged. The magnificent gathering Smith craved would also, on some level, include the dead, whom living Mormons would baptize in absentia and rescue from the spiritual darkness that had descended after the time of Christ and hung on until the teenage Smith, sitting up late in his parents' house, came face to face with the angel Moroni, whose strangely luminous white feet hovered several inches above the floor. That's where the journey that Bushman chronicles started - with the prayers of a boy of rudimentary schooling and no conspicuous talent other than a self-proclaimed ability to pinpoint buried caches of gems and gold, whose contentious local religious culture crackled with rumors of signs and wonders while ceaselessly arguing over salvation's fine points until former congregants could no longer stand each other and drifted off to join more agreeable sects that often disbanded more quickly than they'd been formed. The prayers from Smith that sliced through all the bickering constitute the inspiring story that Mormon missionaries first lavish on their converts, but as the faith of budding Mormons matures they're told the story of Smith's murder in 1844 - a date with destruction he may have made inevitable when his forces smashed the printing press of a newspaper Smith branded libelous and dangerous. 
The incident led to his jailing in nearby Carthage, where the governor guaranteed his safety from the latest in a series of mobs that saw the prophet as a monstrous devil and were repulsed by his polygamous household, disgusted with the uniform that he affected while drilling his militia, and perhaps most disturbed by his confident declaration that his followers would hold sway over the world someday, and perhaps quite soon. They rushed the jail and shot him. He staggered to the sill of a high window, tipped forward, and toppled to his death while calling out "O Lord my God." This cry to heaven drew no recorded response. Perhaps the realm that Smith had long conversed with had nothing more to say to him, or perhaps he'd fabricated his whole career. For Bushman, the fact that his church continues to grow is proof that he was onto something big, though. For logicians, this is tantamount to arguing that Santa Claus probably exists because he gets millions of letters each year from children. But since logic played almost no part in Joseph Smith's life, it may be fitting that it's largely absent from this respectful biography as well. Walter Kirn is a regular contributor to the Book Review. His most recent novel is "Mission to America." 
If you want to retrieve any of the articles that are linked, send me an e-message. From checker at panix.com Sat Jan 14 10:32:40 2006 From: checker at panix.com (Premise Checker) Date: Sat, 14 Jan 2006 05:32:40 -0500 (EST) Subject: [Paleopsych] Discover Mag: We Find Out Why Stupid People Usually Die Young Message-ID: We Find Out Why Stupid People Usually Die Young DISCOVER Vol. 27 No. 01 | January 2006 | Mind & Brain At Last: We Find Out Why Stupid People Usually Die Young In 2001 researchers in Great Britain were surprised to discover that people with low IQs live shorter lives. But a more startling finding came this year with a report that reaction time proved an even stronger predictor of life span than IQ. Ian Deary, a psychologist at the University of Edinburgh, and Geoff Der, a statistician at the MRC Social and Public Health Sciences Unit in Glasgow, suspected that higher IQ might lead to healthier habits like not smoking or healthier environments like safer office jobs. So they looked at data on 898 people first tested at about age 56, and then tracked their survival until age 70. They found that the link between IQ and mortality held strong even after adjusting for education, occupation, social class, and smoking. But the group had also taken button-pressing reaction-time tests, which measure how quickly and accurately a person repeatedly makes a simple decision. Deary and Der wanted to find out whether there was still a relationship between mental ability and survival once a person's reaction time was taken into account. "And there wasn't," Deary says. The upshot: "We could explain the association between mental ability and survival with reaction time." 
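The adjustment Deary and Der describe (checking whether the IQ-survival link survives once reaction time is controlled for) can be sketched with simulated data. A minimal sketch, assuming invented numbers: the sample size, coefficients, and noise levels below are made up for illustration, and simple residualizing stands in for the survival models the researchers actually used.

```python
import random
import statistics

random.seed(0)

def pearson(xs, ys):
    """Sample Pearson correlation between two equal-length lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Simulate a world where reaction time (RT) is the real driver:
# IQ tracks RT, and the survival score depends on RT alone.
n = 5000
rt = [random.gauss(0, 1) for _ in range(n)]           # higher = slower
iq = [-0.6 * r + random.gauss(0, 0.8) for r in rt]    # IQ inversely tied to RT
surv = [-0.5 * r + random.gauss(0, 1) for r in rt]    # survival depends on RT only

r_iq = pearson(iq, surv)  # raw IQ-survival association (spurious here)

# Crude adjustment: residualize IQ and survival on RT, then re-correlate.
b_iq = pearson(iq, rt) * statistics.stdev(iq) / statistics.stdev(rt)
b_sv = pearson(surv, rt) * statistics.stdev(surv) / statistics.stdev(rt)
iq_res = [i - b_iq * r for i, r in zip(iq, rt)]
sv_res = [s - b_sv * r for s, r in zip(surv, rt)]
r_adj = pearson(iq_res, sv_res)  # association after adjusting for RT

print(round(r_iq, 3), round(r_adj, 3))  # raw link is sizable; adjusted link near zero
```

In this toy setup the raw IQ-survival correlation is clearly positive, but it collapses toward zero once reaction time is partialled out, which is the shape of the result Deary reports.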
In this study, the reaction times tested in subjects at age 56 appeared to have about as strong a link to chances of survival over the next 14 years as did being a smoker. Why is still uncertain. One possibility is that reaction time slows because an undetected disease has begun to compromise performance. Another hypothesis is that the differences result from more fundamental, lifelong variations in the speed at which people process information. Both factors could be at work. For clues, Deary hopes to track a younger sample over several years. The study has sparked interest in what its authors call "cognitive epidemiology," the study of associations between mental ability tests and health outcomes. "One of the indicators of whether mental ability tests are useful is whether they predict things about real life," Deary says, and these findings suggest they do. Marina Krakovsky From checker at panix.com Sat Jan 14 10:38:30 2006 From: checker at panix.com (Premise Checker) Date: Sat, 14 Jan 2006 05:38:30 -0500 (EST) Subject: [Paleopsych] NYT: New Light on Origins of Ashkenazi in Europe Message-ID: New Light on Origins of Ashkenazi in Europe http://www.nytimes.com/2006/01/14/science/14gene.html By NICHOLAS WADE A new look at the DNA of the Ashkenazi Jewish population has thrown light on its still mysterious origins. Until now, it had been widely assumed by geneticists that the Ashkenazi communities of Northern and Central Europe were founded by men who came from the Middle East, perhaps as traders, and by the women from each local population whom they took as wives and converted to Judaism. But the new study, published online this week in The American Journal of Human Genetics, suggests that the men and their wives migrated to Europe together. 
The researchers, Doron Behar and Karl Skorecki of the Technion and Ramban Medical Center in Haifa, and colleagues elsewhere, report that just four women, who may have lived 2,000 to 3,000 years ago, are the ancestors of 40 percent of Ashkenazis alive today. The Technion team's analysis was based on mitochondrial DNA, a genetic element that is separate from the genes held in the cell's nucleus and that is inherited only through the female line. Because of mutations - the switch of one DNA unit for another - that build up on the mitochondrial DNA, people can be assigned to branches that are defined by which mutations they carry. In the case of the Ashkenazi population, the researchers found that many branches coalesced to single trees, and so were able to identify the four female ancestors. Looking at other populations, the Technion team found that some people in Egypt, Arabia and the Levant also carried the set of mutations that defines one of the four women. They argue that all four probably lived originally in the Middle East. A study by Michael Hammer of the University of Arizona showed five years ago that the men in many Jewish communities around the world bore Y chromosomes that were Middle Eastern in origin. This finding is widely accepted by geneticists, but there is less consensus about the women's origins. David Goldstein, now of Duke University, reported in 2002 that the mitochondrial DNA of women in Jewish communities around the world did not seem to be Middle Eastern, and indeed each community had its own genetic pattern. But in some cases the mitochondrial DNA was closely related to that of the host community. Dr. Goldstein and his colleagues suggested that the genesis of each Jewish community, including the Ashkenazis, was that Jewish men had arrived from the Middle East, taken wives from the host population and converted them to Judaism, after which there was no further intermarriage with non-Jews. 
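The branch-assignment step described above, placing a person on a mitochondrial tree according to which defining mutations they carry, reduces to subset matching. A minimal sketch under stated assumptions: the branch names and mutation labels below are placeholders, not the study's actual markers.

```python
# Toy table of branches and their defining mutations; the names and
# mutation labels here are invented placeholders, not real haplogroup data.
HAPLOGROUP_DEFINING_MUTATIONS = {
    "branch-A": {"C114T", "A497T"},
    "branch-B": {"G145A", "T8251A"},
}

def assign_branches(observed):
    """Return every branch whose defining mutations all appear in the sample."""
    observed = set(observed)
    return [name for name, defs in HAPLOGROUP_DEFINING_MUTATIONS.items()
            if defs <= observed]

# A sample carrying both of branch-A's markers plus an unrelated mutation.
sample = ["C114T", "A497T", "T16224C"]
print(assign_branches(sample))  # ['branch-A']
```

When many sampled lineages coalesce onto a single branch, as the Technion team found, that branch's root corresponds to one founding female ancestor.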
The Technion team suggests a different origin for the Ashkenazi community: if the women too are Middle Eastern in origin, they would presumably have accompanied their husbands. At least the Ashkenazi Jewish community might have been formed by families migrating together. Dr. Hammer said the new study "moves us forward in trying to understand Jewish population history." His own recent research, he said, suggests that the Ashkenazi population expanded through a series of bottlenecks - events that squeeze a population down to small numbers - perhaps as it migrated from the Middle East after the destruction of the Second Temple in A.D. 70 to Italy, reaching the Rhine Valley in the 10th century. But Dr. Goldstein said the new report did not alter his previous conclusion. The mitochondrial DNA's of a small, isolated population tend to change rapidly as some lineages fall extinct and others become more common, a process known as genetic drift. In his view, the Technion team has confirmed that genetic drift has played a major role in shaping Ashkenazi mitochondrial DNA. But the linkage with Middle Eastern populations is not statistically significant, he said. Because of genetic drift, Ashkenazi mitochondrial DNA's have developed their own pattern, which makes it very hard to tell their source. This differs from the patrilineal case, Dr. Goldstein said, where there is no question of a Middle Eastern origin. From checker at panix.com Sat Jan 14 10:39:26 2006 From: checker at panix.com (Premise Checker) Date: Sat, 14 Jan 2006 05:39:26 -0500 (EST) Subject: [Paleopsych] Reuters: Study Finds Why Jewish Mothers Are so Important Message-ID: Study Finds Why Jewish Mothers Are so Important http://www.nytimes.com/reuters/news/news-science-jews.html By REUTERS Filed at 12:21 p.m. ET WASHINGTON (Reuters) - Four Jewish mothers who lived 1,000 years ago in Europe are the ancestors of 40 percent of all Ashkenazi Jews alive today, an international team of researchers reported on Friday. 
The genetic study of DNA paints a vivid picture of human evolution and survival, and correlates with the well-established written and oral histories of Jewish migrations, said Dr. Doron Behar of the Technion-Israel Institute of Technology, who worked on the study. The study, published in the American Journal of Human Genetics, suggests that some 3.5 million Jews alive today all descended from four women. For their study, Behar and geneticist Karl Skorecki, with collaborators in Finland, France, Estonia, Portugal, Russia and the United States sampled DNA from 11,452 people from 67 populations. ``All subjects reported the birthplace of their mothers, grandmothers, and, in most cases, great-grandmothers,'' they wrote in their report. They looked at mitochondrial DNA, which is found in cells, outside the nucleus and away from the DNA that carries most genetic instructions. Mitochondrial DNA is passed down virtually unchanged from mother to daughter, but it does occasionally mutate, at a known rate. Researchers can use this molecular clock to track genetic changes through time, and used it, for instance, to compute when the ``ancestral Eve'' of all living humans lived -- in Africa, about 180,000 years ago. Now they have found four ancestral Jewish mothers. ``I think there was some kind of genetic pool that was in the Near East,'' Behar said in a telephone interview. ``Among this genetic pool there were four maternal lineages, four real women, that carried the exact specific mitochondrial DNA markers that we can find in mitochondrial DNA today.'' SETTLING EUROPE They, or their direct descendants, moved into Europe. ``Then at a certain period, most probably in the 13th century, simply by demographic matters, they started to expand dramatically,'' Behar said. 
``Maybe it was because of Jewish tradition, the structure of the family that might have been characterized by a high number of children.'' But these four families gave rise to much of the population of European Jews -- which exploded from 30,000 people in the 13th century to ``something like 9 million just prior to World War II,'' Behar said. The Nazis and their allies killed 6 million Jews during the war, but there are now an estimated 8 million Ashkenazi Jews, defined by their common northern and central European ancestry, cultural traditions and Yiddish language. Behar said as they sampled people from Ashkenazi communities around the world, the same mitochondrial genetic markers kept popping up. They did not find the markers in most of the non-Jewish people they sampled, and only a very few were shared with Jews of other origin. This particular study does not provide a direct explanation for some of the inherited diseases that disproportionately affect Jews of European descent, such as breast and colon cancer, because most diseases are caused by mutations in nuclear DNA, not the DNA studied by Behar's group. These genes are believed to date from a ``bottleneck'' phenomenon, when populations were squeezed down from large to small and then expanded again. Behar and Skorecki's team have found what is known as a ``founder effect'' -- when one or a small number of people have a huge number of descendants. What the study also shows, Behar said, is that Jewish mothers are highly valued for a good reason. ``This I could tell you even without the paper,'' he said. 
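Two pieces of arithmetic underlie the account above: the molecular clock converts accumulated mutations into elapsed time, and the post-13th-century expansion implies a sustained growth rate. A minimal sketch with round illustrative numbers; the mutation count, per-site rate, and dates below are assumptions for the example, not the study's parameters.

```python
import math

def clock_age_years(mutations, rate_per_site_per_year, sites):
    """Years back to a common ancestor implied by accumulated mutations
    along one maternal lineage (a toy, single-lineage version of the clock)."""
    return (mutations / sites) / rate_per_site_per_year

# e.g. ~30 differences across ~16,500 mtDNA sites at ~1e-8 subs/site/year
age = clock_age_years(30, 1e-8, 16500)  # on the order of 180,000 years

def implied_growth_rate(start, end, years):
    """Continuous annual growth rate that takes `start` people to `end`."""
    return math.log(end / start) / years

# ~30,000 Ashkenazi Jews circa 1300 growing to ~9 million by 1939
r = implied_growth_rate(30_000, 9_000_000, 639)  # a bit under 1% per year

print(round(age), round(r * 100, 2))
```

The point of the second computation is that the "dramatic" expansion Behar describes needs only a modest sustained growth rate, well under one percent a year, compounded over six centuries.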
From checker at panix.com Sat Jan 14 10:42:21 2006 From: checker at panix.com (Premise Checker) Date: Sat, 14 Jan 2006 05:42:21 -0500 (EST) Subject: [Paleopsych] Hermenaut: Philosophy Hits the Newsstands Message-ID: Philosophy Hits the Newsstands http://www.hermenaut.com/a125.shtml FEATURE | Joshua Glenn | 8/23/0 Philosophy Hits the Newsstands Recently, in a Wicked Pavilion conference on ex-Harper's editor Willie Morris, Hermenaut's Josh Glenn invited all present to help him dream up the perfect periodical. One question that arose in the ensuing discussion was this: How to publish a magazine, for a general audience, which takes difficult ideas seriously, and refuses to dumb them down--without gaining a reputation for being too esoteric? This conundrum reminded me of Glenn's write-up of two English "popular philosophy" magazines whose editors he'd interviewed earlier this year for Britannica.com (as part of a short series in which he reported on the so-called applied philosophy movement in Europe and the United States). Hoping, then, to fan the flames of the discussion on intelligent magazines, we've decided to reprint "Philosophy Hits the Newsstands: A Tale of Two Magazines" (February 9th, 2000) here, on our site. Please join us in the Wicked Pavilion to discuss the prospect of a magazine staking its claim in what is called, below, "the inhospitable no-man's land between the academy and the real world." --C. Ingoglia, ed. Philosophy Hits the Newsstands by Joshua Glenn There is, by all accounts, an "applied philosophy" movement afoot in the world today. In times past philosophers from Aristotle and Lao-tzu to Bacon, Descartes, Hobbes, Locke, and Hume applied philosophical insights and methods to quotidian matters of diplomacy and public policy. Over the course of the 20th century, however, philosophical inquiry--particularly in the United States--grew largely abstract and detached from public affairs. 
While a handful of the century's more public philosophical figures, such as Bertrand Russell, John Dewey, Jean-Paul Sartre, Simone de Beauvoir, and Herbert Marcuse, actively participated in--indeed, played monumental roles in influencing--the events of their times, they stand distinctly apart from the prevailing tendency among their contemporaries to practice philosophy as if in a cave, impervious to the world-historical upheavals and transformations taking place around them. Worse, philosophy has been confined almost exclusively to the isolated (and isolating) groves of academe. All that seems to be changing, however. In the past few years men and women who once seemed burdened with doctorates in philosophy and consigned to the hermetic margins of society have begun engaging in what's being described as "philosophical practice." Besides practicing individual client counseling, these brave new philosophical practitioners are engaging in philosophical discourse with children, working with schools on educational programming, and consulting with American corporations on everything from dilemma training to designing, building, implementing, and maintaining codes of ethics. Philosopher-led Socratic dialogues, or just plain old philosophical forums, have been sprouting up in American cafés, bookstores, senior centers, libraries, hospices, prisons, and shopping malls. For philosophers and interested laypeople alike it's an exciting time. There's just one thing missing, though. What's a movement without a magazine? There are plenty of scholarly and professional journals and newsletters devoted to philosophical practice, but where's the Rolling Stone, the Wired, the Utne Reader of the popular philosophy movement? 
Where is the independently-published yet glossy magazine--which supports itself through subscriptions and ad sales but maintains a sense of mission--that tackles the burning issues of everyday life in an informative, enjoyable, and thought-provoking manner, but with all the clarity, rigor, and precision that define philosophy? Is the very idea that such a magazine could survive in today's marketplace just a pipe-dream? There are, in fact, two contenders. Philosophy Now and The Philosophers' Magazine, both quarterlies published in London and distributed in the United Kingdom and the United States, have staked their claim in what was previously regarded as the inhospitable no-man's land between the academy and the real world. Which raises another question: is there room enough, in that peculiar new "media space," for both magazines? On closer inspection, it turns out that these two periodicals aren't as similar as they first appear. In 1990 Rick Lewis was working as a physicist for British Telecom Laboratories. A couple of years earlier, having decided that he wanted to be able to look back from his old age and see that he'd spent his time well, he'd returned to school and earned a master's degree in philosophy. In 1991, when the British government began slashing funding for many philosophy departments without a squeak of public protest, Lewis decided to take action: he started Philosophy Now, a magazine devoted to what he calls "popular philosophy." ("You can't expect people to pay taxes for something if they don't understand it and can't see its importance," he told me.) Today, thanks in part to the "applied philosophy" movement marching across Europe and the United States, the magazine's circulation is at 8,000 and growing rapidly. Lewis remarks that "one way you can tell that an idea is successful is that sooner or later people take it up and try to develop variations on the theme. For almost seven years this didn't happen and I was starting to worry why not!" 
In 1997 Julian Baggini, a disaffected philosophy Ph.D., relieved Lewis of his worries on this matter. Baggini told his friend Jeremy Stangroom, a Ph.D. in sociology, that he was dissatisfied that no one was publishing a philosophy magazine "that was both genuinely a magazine," as he put it to me recently, "and genuinely philosophy." Stangroom, who'd become fascinated with the Internet, launched The Philosophers' Web Magazine that spring, and Baggini launched its sister publication, The Philosophers' Magazine, a few months later. Today, their Web site receives between 2,500 and 3,000 unique visitors per week, and their magazine's circulation is already at 3,500--double what it was a year ago. Both magazines publish theme issues (on heavy-duty topics like "The Meaning of Life" and "The New Problem of Evil"), but they leaven the gravity of their feature articles with humorous cartoons, titles, and short features. Commenting on this, Lewis argues that The Philosophers' Magazine has followed in his publication's footsteps. Baggini and Stangroom respond that Philosophy Now's redesign in the spring of 1998 was clearly an attempt to reposition itself in response to the challenge of its upstart rival. It's true: at the level of format, it's extremely difficult to distinguish between the two publications. Their covers and layouts are nearly identical, right down to the typefaces--one of the perils of desktop publishing, perhaps. Lewis insists that Philosophy Now's redesign was strictly market-driven. "There is little point in producing a magazine to popularize philosophy if nobody has a chance to read it," he explained in an editorial at the time, "and to be displayed by the big retail chains Philosophy Now really does have to look as slick as the style-obsessed citizens of the '90s have come to expect." Although the two popular philosophy magazines are eerily similar in means, they diverge sharply when it comes to the question of ends. 
Stangroom likes to say that the point of The Philosophers' Magazine, both online and off, is to "break down some of the barriers that exist between academia and the intelligent world outside." And Lewis might describe Philosophy Now in similar terms. It's clear, however, that whereas Philosophy Now wants to lead academics out of their ivory towers into the light of day, The Philosophers' Magazine is more concerned with inviting interested laypeople inside for a guided tour. Philosophy Now, for example, invites prominent (mostly British) philosophers--including Antony Flew, J.J.C. Smart, Richard Taylor, and Mary Midgley--to write original articles on matters of everyday importance, in a manner accessible to an intelligent magazine-reading public. The Philosophers' Magazine, on the other hand, leans toward the interview form: leading philosophers are encouraged to be clear, but not necessarily un-esoteric. Thus, in the Winter 1999/2000 issue of Philosophy Now, we find Midgley weighing in on the "selfish gene" debate with the elegant proposition that although everyone has "a Don Giovanni somewhere inside him," we might ask why "there is usually also, for a start, at least a Figaro, a Tamino, and a Sarastro, as well as a whole crowd of unexpected characters whom the tide of life is continually producing?" In the Winter 2000 issue of The Philosophers' Magazine, on the other hand, we find Baggini asking the eminent English philosopher Stuart Hampshire, "Aren't the procedures by which you undertake [the conflict resolution mechanism that you advocate] in some sense dependent upon the kinds of principles established in deductive reasoning, such as the law of non-contradiction?" A scholarly philosophy journal might get even more technical than this, of course, but it isn't "popular philosophy." 
It's also indicative of the difference between the magazines that whereas Philosophy Now has gone to a good deal of trouble and expense to get itself shelved at ordinary newsstands around the world, The Philosophers' Magazine can for the most part only be found in specialty bookstores. Dr. Charles Echelbarger, professor of philosophy at the State University of New York-Oswego and a member of Philosophy Now's U.S. editorial board since 1998, expresses what could be the magazine's mission statement when he admits to having become "increasingly frustrated over the fact that it is so difficult to explain to people outside academic life what I do." Perhaps this explains the helpful sidebar to each issue's editorial, which provides a guide to "philosophy in a nutshell." The Philosophers' Magazine turns up its nose at crutches like this; it's edited for people who already know exactly what Echelbarger does. "Bertrand's Break," its regular crossword puzzle--named after Bertrand Russell--challenges readers to name, for example, a "Proponent of Theory of Relativity" or a "Follower of Plotinus." This is fun, but decidedly in-crowd, stuff. After reading half-a-dozen of the most recent issues of Philosophy Now and The Philosophers' Magazine, I felt a burning need to ask the editors of both magazines about the ends of philosophy itself. "What's it for, anyway?" I asked in an e-mail. "And are academic philosophers doing it right?" Lewis replied that, although we may be living in the golden age of democracy, this is also the golden age of spin-doctors and lobbyists. So "clear thinking and a questioning, skeptical (rather than cynical) attitude to life are not merely relevant--they are vital survival skills!" Lewis doesn't want to be lumped in with those in the "applied philosophy" movement who insist that academic philosophy is useless. 
Although he's highly critical of professional philosophers "who've forgotten how to communicate in ordinary English," Lewis's editorial in Philosophy Now's Winter 1999/2000 issue insists that, by training their students to have "a clear head and the courage to think fundamental things through for [themselves]," dedicated, scholarly teachers of philosophy are an asset to society. In their responses to my question, Baggini and Stangroom were careful to note that The Philosophers' Magazine covers philosophy as it is, not as it ought to be. "I don't think people like me who have effectively given up doing original philosophy for writing about it have much right to comment on what the academy is up to," Baggini insists. That said, however, he agrees with Lewis that philosophy can be useful to everyday living, but notes that "a great deal of philosophy has no relevance whatsoever to how we as individuals live our lives, and if we don't accept that we're going to falsely accuse philosophers of letting us down." Stangroom admits to having a more jaundiced view about academic philosophy. "It seems to me that its strength is its scholarship--philosophers almost invariably know the minutiae of the philosophical canon," he offers. "However, that is also its weakness, in that they frequently seem to know little else." Stangroom and Baggini agree, though, that the greatest contribution philosophy can make to society is "to introduce clearer, sharper thinking on the issues which bother us, whether they're moral, existential, legal, or epistemological." In the end the editors of both publications insist that it would be a fantasy to suggest that they're bitter rivals. Dr. Timothy J. Madigan, another member of Philosophy Now's U.S. editorial board, sums up the situation by noting that "it is important to try to return to the Socratic tradition of philosophy in the marketplace, while not unduly denigrating the Platonic tradition of academic philosophy per se. 
There's no need to perpetuate a false dilemma--there's enough food for thought at all levels for everyone to join the banquet." And, after all, success for either publication is undoubtedly good for the other, because it expands the circle of people interested in philosophy. "Maybe," Madigan suggests hopefully, "the western world is ready for a full-blown philosophical explosion." _________________________________________________________________ A version of this article originally appeared on the Web site Britannica.com. From checker at panix.com Sat Jan 14 10:42:02 2006 From: checker at panix.com (Premise Checker) Date: Sat, 14 Jan 2006 05:42:02 -0500 (EST) Subject: [Paleopsych] SW: Dinosaurs and Grass Message-ID: Paleontology: Dinosaurs and Grass http://scienceweek.com/2005/sw051223-4.htm [And God said, Let the earth bring forth grass, the herb yielding seed, and the fruit tree yielding fruit after his kind, whose seed is in itself, upon the earth: and it was so. And the earth brought forth grass, and herb yielding seed after his kind, and the tree yielding fruit, whose seed was in itself, after his kind: and God saw that it was good. And the evening and the morning were the third day. --Genesis 1:11-13 [And God created great whales, and every living creature that moveth, which the waters brought forth abundantly, after their kind, and every winged fowl after his kind: and saw that it was good. And God blessed them, saying, Be fruitful and multiply, and fill the waters in the seas, and let fowl multiply in the earth. And the evening and the morning were the fifth day. --Genesis 1:21-23 [Paleontologists, however, hold that there were sea animals long before there was grass. In either case, the role of grass is generally overlooked.] The following points are made by D.R. Piperno and H-D. 
Sues (Science 2005 310:1126): 1) Grasses (family Poaceae or Gramineae), with about 10,000 extant species, are among the largest and most ecologically dominant families of flowering plants, and today provide staple foods for much of humankind. Dinosaurs, the dominant mega-herbivores during most of the Mesozoic Era (65 to 251 million years ago), are similarly one of the largest and best known groups of organisms. However, the possible coevolution of grasses and dinosaurs has never been studied. New work[1] reports analysis of phytoliths -- microscopic pieces of silica formed in plant cells -- in coprolites that the authors attribute to titanosaurid sauropods that lived in central India about 65 to 71 million years ago. Their data indicate that those dinosaurs ate grasses. 2) Part of the difficulty in studying the question of dinosaur-grass coevolution results from the poor quality of the fossil record for early grasses. The earliest unequivocal grass fossils date to the Paleocene-Eocene boundary, about 56 million years ago [2,3], well after the demise of nonavian dinosaurs at the end of the Cretaceous Period. Pollen and macrofossils of Poaceae are uncommon in sedimentary strata until the middle Miocene, about 11 to 16 million years ago, when the family is thought to have undergone considerable evolutionary diversification and ecological expansion [2]. Thus, dioramas in museums have long depicted dinosaurs as grazing on conifers, cycads, and ferns in landscapes without grasses. The work of Prasad et al [1] is the first unambiguous evidence that the Poaceae originated and had already diversified during the Cretaceous. The research shows that phytoliths, which have become a major topic of study in Quaternary research over the last 20 years [4,5], can provide a formidable means for reconstructing vegetation and animal diets for much earlier time periods when early angiosperms were diversifying. 
These results will force reconsideration of many long-standing assumptions about grass evolution, dinosaurian ecology, and early plant-herbivore interactions. 3) Researchers have long known that grasses make distinctive kinds of phytoliths in the epidermis of their leaves and leaflike coverings that surround their flowers. More recent work has examined in greater detail phytolith characteristics from a large set of grasses comprising taxa representing the entire range of diversification within the family, showing that discriminations at the subfamily, tribe, and genus levels are often possible [1,4,5]. In addition, publication of a well-resolved consensus phylogeny of the Poaceae by the Grass Phylogeny Working Group (GPWG) considerably advances our overall understanding of the evolutionary history of grasses and leads to improved interpretations of the early grass fossil record. For example, by mapping the phytolith characters that discriminate clades and subfamilies of extant taxa onto this phylogenetic tree, we can infer how phytolith morphology changed at the origin of major clades and lineages. References (abridged): 1. V. Prasad, C. A. E. Strömberg, H. Alimohammadian, A. Sahni, Science 310, 1177 (2005) 2. B. F. Jacobs, J. D. Kingston, L. L. Jacobs, Ann. Mo. Bot. Gard. 86, 590 (1999) 3. E. A. Kellogg, Plant Physiol. 125, 1198 (2001) 4. D. R. Piperno, Phytolith Analysis: An Archaeological and Geological Perspective (Academic Press, San Diego, CA, 1988) 5. G. G. Fredlund, L. T. Tieszen, J. Biogeogr. 21, 321 (1994) Science http://www.sciencemag.org -------------------------------- Related Material: DINOSAURS, DRAGONS, AND DWARFS: THE EVOLUTION OF MAXIMAL BODY SIZE The following points are made by G.P. Burness et al (Proc. Nat. Acad. Sci. 2001 98:14518): 1) The size and taxonomic affiliation of the largest locally present species ("top species") of terrestrial vertebrate vary greatly among faunas, raising many unsolved questions. 
Why are the top species on continents bigger than those on even the largest islands, bigger in turn than those on small islands? Why are the top mammals marsupials on Australia but placentals on the other continents? Why is the world's largest extant lizard (the Komodo dragon) native to a modest-sized Indonesian island, of all unlikely places? Why is the top herbivore larger than the top carnivore at most sites? Why were the largest dinosaurs bigger than any modern terrestrial species? 2) A useful starting point is the observation of Marquet and Taper (1998), based on three data sets (Great Basin mountaintops, Sea of Cortez islands, and the continents), that the size of a landmass's top mammal increases with the landmass's area. To explain this pattern, they noted that populations numbering less than some minimum number of individuals are at high risk of extinction, but larger individuals require more food and hence larger home ranges, thus only large landmasses can support at least the necessary minimum number of individuals of larger-bodied species. If this reasoning were correct, one might expect body size of the top species also to depend on other correlates of food requirements and population densities, such as trophic level and metabolic rate. Hence the authors assembled a data set consisting of the top terrestrial herbivores and carnivores on 25 oceanic islands and the 5 continents to test 3 quantitative predictions: a) Within a trophic level, body mass of the top species will increase with land area, with a slope predictable from the slope of the relation between body mass and home range area. b) For a given land area, the top herbivore will be larger than the top carnivore by a factor predictable from the greater amounts of food available to herbivores than to carnivores. 
c) Within a trophic level and for a given area of landmass, top species that are ectotherms will be larger than ones that are endotherms, by a factor predictable from ectotherms' lower food requirements. 3) The authors point out that on reflection, one can think of other factors likely to perturb these predictions, such as environmental productivity, over-water dispersal, evolutionary times required for body size changes, and changing landmass area with geological time. Indeed, the database of the authors does suggest effects of these other factors. The authors point out they propose their three predictions not because they expect them always to be correct, but because they expect them to describe broad patterns that must be understood in order to be able to detect and interpret deviations from those patterns. Proc. Nat. Acad. Sci. http://www.pnas.org -------------------------------- Related Material: ECOLOGY: ON FOOD-WEB INTERACTIONS The following points are made by A.R. Ives and B.J. Cardinale (Nature 2004 429:174): 1) Growing concern about how loss of biodiversity will affect ecosystems has stimulated numerous studies(1-5). Although most studies have assumed that species go extinct randomly, species often go extinct in order of their sensitivity to a stress that intensifies through time (such as climate change). 2) For two reasons, interactions among species make it difficult to predict how ecological communities will respond to environmental degradation. First, the sensitivity of an individual species to environmental degradation depends not only on the direct impact of degradation on that species, but also on the indirect effects on that species caused by changes in densities of other species. For example, environmental degradation may decrease the density of competitors and/or predators of a species, thereby causing a compensatory increase in the density of that species. 
Second, as species go extinct, links within the food web are severed, changing the pathways through which indirect effects operate. Changes in food-web structure depend on the order in which species go extinct, making it difficult to extrapolate from studies that assume extinctions are random to real communities facing progressively intensifying stress from environmental degradation. 3) To disentangle the effects of species interactions on the ability of communities to tolerate environmental degradation, the authors used mathematical simulations to compare how communities resist changes in abundance as species go extinct randomly versus going extinct in order of their sensitivity to an environmental stress. The authors considered communities with three trophic topologies that span a range of community types: tritrophic communities with plants, herbivores, and predators; monotrophic communities comprising just competitors; and communities with arbitrary topology containing prey and predators, competitors, and mutualists. For each topology, the authors constructed 1000 communities in which interaction strengths were chosen from random distributions under the constraints of the specified topology. The direct effects of the stressor on each species were also selected at random, but constrained so that the stressor had a negative effect on all species by decreasing their population growth rates. 4) The authors demonstrate that the consequences of random and ordered extinctions differ. Both depend on food-web interactions that create compensation; that is, the increase of some species when their competitors and/or predators decrease in density due to environmental stress. Compensation makes communities as a whole more resistant to stress by reducing changes in combined species densities. As extinctions progress, the potential for compensation is depleted, and communities become progressively less resistant. 
For ordered extinctions, however, this depletion is offset and communities retain their resistance, because the surviving species have greater average resistance to the stress. Despite extinctions being ordered, changes in the food web with successive extinctions make it difficult to predict which species will show compensation in the future. This unpredictability argues for "whole-ecosystem" approaches to biodiversity conservation, as seemingly insignificant species may become important after other species go extinct. References (abridged): 1. Sala, O. E. et al. Global biodiversity scenarios for the year 2100. Science 287, 1170-1174 (2000) 2. Chapin, F. S. I. et al. Consequences of changing biodiversity. Nature 405, 234-242 (2000) 3. Ehrlich, P. & Ehrlich, A. Extinction (Random House, New York, 1981) 4. Terborgh, J. et al. Ecological meltdown in predator-free forest fragments. Science 294, 1923-1926 (2001) 5. Naeem, S., Thompson, L. J., Lawler, S. P., Lawton, J. H. & Woodfin, R. M. Declining biodiversity can alter the performance of ecosystems. Nature 368, 734-737 (1994) Nature http://www.nature.com/nature From checker at panix.com Sun Jan 15 23:47:15 2006 From: checker at panix.com (Premise Checker) Date: Sun, 15 Jan 2006 18:47:15 -0500 (EST) Subject: [Paleopsych] Albert Hofmann: LSD - My Problem Child Message-ID: Albert Hofmann: LSD - My Problem Child http://www.flashback.se/archive/my_problem_child/ et seq.
Foreword
Translator's Preface
1. How LSD Originated
1.1. First Chemical Explorations
1.2. Ergot
1.3. Lysergic Acid and Its Derivatives
1.4. Discovery of the Psychic Effects of LSD
1.5. Self-Experiments
2. LSD in Animal Experiments and Biological Research
2.1. How Toxic Is LSD?
2.2. Pharmacological Properties of LSD
3. Chemical Modifications of LSD
4. Use of LSD in Psychiatry
4.1. First Self-Experiment by a Psychiatrist
4.2. The Psychic Effects of LSD
5. From Remedy to Inebriant
5.1. Nonmedical Use of LSD
5.2. Sandoz Stops LSD Distribution
5.3. Dangers of Nonmedicinal LSD Experiments
5.4. Psychotic Reactions
5.5. LSD from the Black Market
5.6. The Case of Dr. Leary
5.7. Meeting with Timothy Leary
5.8. Travels in the Universe of the Soul
5.9. Dance of the Spirits in the Wind
5.10. Polyp from the Deep
5.11. LSD Experience of a Painter
5.12. A Joyous Song of Being
6. The Mexican Relatives of LSD
6.1. The Sacred Mushroom Teonanacatl
6.2. Psilocybin and Psilocin
6.3. A Voyage into the Universe of the Soul with Psilocybin
6.4. Where Time Stands Still
6.5. The "Magic Morning Glory" Ololiuhqui
6.6. In Search of the Magic Plant "Ska Maria Pastora" in the Mazatec Country
6.7. Ride through the Sierra Mazateca
6.8. A Mushroom Ceremony
7. Radiance from Ernst Jünger
7.1. Ambivalence of Drug Use
7.2. An Experiment with Psilocybin
7.3. Another LSD Session
8. Meeting with Aldous Huxley
9. Correspondence with the Poet-Physician Walter Vogt
10. Various Visitors
11. LSD Experience and Reality
11.1. Various Realities
11.2. Mystery and Myth
______________________________________________________________
Formatted in HTML by kk at sci.fi
_________________________________________________________________ Foreword There are experiences that most of us are hesitant to speak about, because they do not conform to everyday reality and defy rational explanation. These are not particular external occurrences, but rather events of our inner lives, which are generally dismissed as figments of the imagination and barred from our memory. Suddenly, the familiar view of our surroundings is transformed in a strange, delightful, or alarming way: it appears to us in a new light, takes on a special meaning. Such an experience can be as light and fleeting as a breath of air, or it can imprint itself deeply upon our minds. One enchantment of that kind, which I experienced in childhood, has remained remarkably vivid in my memory ever since. It happened on a May morning - I have forgotten the year - but I can still point to the exact spot where it occurred, on a forest path on Martinsberg above Baden, Switzerland. As I strolled through the freshly greened woods filled with bird song and lit up by the morning sun, all at once everything appeared in an uncommonly clear light. Was this something I had simply failed to notice before? Was I suddenly discovering the spring forest as it actually looked? It shone with the most beautiful radiance, speaking to the heart, as though it wanted to encompass me in its majesty. 
I was filled with an indescribable sensation of joy, oneness, and blissful security. I have no idea how long I stood there spellbound. But I recall the anxious concern I felt as the radiance slowly dissolved and I hiked on: how could a vision that was so real and convincing, so directly and deeply felt - how could it end so soon? And how could I tell anyone about it, as my overflowing joy compelled me to do, since I knew there were no words to describe what I had seen? It seemed strange that I, as a child, had seen something so marvelous, something that adults obviously did not perceive - for I had never heard them mention it. While still a child, I experienced several more of these deeply euphoric moments on my rambles through forest and meadow. It was these experiences that shaped the main outlines of my world view and convinced me of the existence of a miraculous, powerful, unfathomable reality that was hidden from everyday sight. I was often troubled in those days, wondering if I would ever, as an adult, be able to communicate these experiences; whether I would have the chance to depict my visions in poetry or paintings. But knowing that I was not cut out to be a poet or artist, I assumed I would have to keep these experiences to myself, important as they were to me. Unexpectedly - though scarcely by chance - much later, in middle age, a link was established between my profession and these visionary experiences from childhood. Because I wanted to gain insight into the structure and essence of matter, I became a research chemist. Intrigued by the plant world since early childhood, I chose to specialize in research on the constituents of medicinal plants. In the course of this career I was led to the psychoactive, hallucination-causing substances, which under certain conditions can evoke visionary states similar to the spontaneous experiences just described. The most important of these hallucinogenic substances has come to be known as LSD. 
Hallucinogens, as active compounds of considerable scientific interest, have gained entry into medicinal research, biology, and psychiatry, and later - especially LSD - also obtained wide diffusion in the drug culture. In studying the literature connected with my work, I became aware of the great universal significance of visionary experience. It plays a dominant role, not only in mysticism and the history of religion, but also in the creative process in art, literature, and science. More recent investigations have shown that many persons also have visionary experiences in daily life, though most of us fail to recognize their meaning and value. Mystical experiences, like those that marked my childhood, are apparently far from rare. There is today a widespread striving for mystical experience, for visionary breakthroughs to a deeper, more comprehensive reality than that perceived by our rational, everyday consciousness. Efforts to transcend our materialistic world view are being made in various ways, not only by the adherents to Eastern religious movements, but also by professional psychiatrists, who are adopting such profound spiritual experiences as a basic therapeutic principle. I share the belief of many of my contemporaries that the spiritual crisis pervading all spheres of Western industrial society can be remedied only by a change in our world view. We shall have to shift from the materialistic, dualistic belief that people and their environment are separate, toward a new consciousness of an all-encompassing reality, which embraces the experiencing ego, a reality in which people feel their oneness with animate nature and all of creation. Everything that can contribute to such a fundamental alteration in our perception of reality must therefore command earnest attention. 
Foremost among such approaches are the various methods of meditation, either in a religious or a secular context, which aim to deepen the consciousness of reality by way of a total mystical experience. Another important, but still controversial, path to the same goal is the use of the consciousness-altering properties of hallucinogenic psychopharmaceuticals. LSD finds such an application in medicine, by helping patients in psychoanalysis and psychotherapy to perceive their problems in their true significance. Deliberate provocation of mystical experience, particularly by LSD and related hallucinogens, in contrast to spontaneous visionary experiences, entails dangers that must not be underestimated. Practitioners must take into account the peculiar effects of these substances, namely their ability to influence our consciousness, the innermost essence of our being. The history of LSD to date amply demonstrates the catastrophic consequences that can ensue when its profound effect is misjudged and the substance is mistaken for a pleasure drug. Special internal and external advance preparations are required; with them, an LSD experiment can become a meaningful experience. Wrong and inappropriate use has caused LSD to become my problem child. It is my desire in this book to give a comprehensive picture of LSD, its origin, its effects, and its dangers, in order to guard against increasing abuse of this extraordinary drug. I hope thereby to emphasize possible uses of LSD that are compatible with its characteristic action. I believe that if people would learn to use LSD's vision-inducing capability more wisely, under suitable conditions, in medical practice and in conjunction with meditation, then in the future this problem child could become a wonder child. _________________________________________________________________ Translator's Preface Numerous accounts of the discovery of LSD have been published in English; none, unfortunately, have been completely accurate. 
Here, at last, the father of LSD details the history of his "problem child" and his long and fruitful career as a research chemist. In a real sense, this book is the inside story of the birth of the Psychedelic Age, and it cannot be denied that we have here a highly candid and personal insight into one of the most important scientific discoveries of our time, the significance of which has yet to dawn on mankind. Surpassing its historical value is the immense philosophical import of this work. Never before has a chemist, an expert in the most materialistic of the sciences, advanced a Weltanschauung of such a mystical and transcendental nature. LSD, psilocybin, and the other hallucinogens do indeed, as Albert Hofmann asserts, constitute "cracks" in the edifice of materialistic rationality, cracks we would do well to explore and perhaps widen. As a writer, it gives me great satisfaction to know that by this book the American reader interested in hallucinogens will be introduced to the work of Rudolf Gelpke, Ernst Jünger, and Walter Vogt, writers who are all but unknown here. With the notable exceptions of Huxley and Wasson, English and American writers on the hallucinogenic experience have been far less distinguished and eloquent than they. This translation has been carefully overseen by Albert Hofmann, which made my task both simpler and more enjoyable. I am beholden to R. Gordon Wasson for checking the chapters on LSD's "Mexican relatives" and on "Ska Maria Pastora" for accuracy and style. Two chapters of this book - "How LSD Originated" and "LSD Experience and Reality" - were presented by Albert Hofmann as a paper before the international conference "Hallucinogens, Shamanism and Modern Life" in San Francisco on the afternoon of Saturday, September 30, 1978. As a part of the conference proceedings, the first chapter has been published in the Journal of Psychedelic Drugs, Vol. 11 (1-2), 1979. 
Jonathan Ott Vashon Island, Washington _________________________________________________________________ 1. How LSD Originated In the realm of scientific observation, luck is granted only to those who are prepared. Louis Pasteur Time and again I hear or read that LSD was discovered by accident. This is only partly true. LSD came into being within a systematic research program, and the "accident" did not occur until much later: when LSD was already five years old, I happened to experience its unforeseeable effects in my own body - or rather, in my own mind. Looking back over my professional career to trace the influential events and decisions that eventually steered my work toward the synthesis of LSD, I realize that the most decisive step was my choice of employment upon completion of my chemistry studies. If that decision had been different, then this substance, which has become known the world over, might never have been created. In order to tell the story of the origin of LSD, then, I must also touch briefly on my career as a chemist, since the two developments are inextricably interrelated. In the spring of 1929, on concluding my chemistry studies at the University of Zurich, I joined the Sandoz Company's pharmaceutical-chemical research laboratory in Basel, as a co-worker with Professor Arthur Stoll, founder and director of the pharmaceutical department. I chose this position because it afforded me the opportunity to work on natural products, whereas two other job offers from chemical firms in Basel had involved work in the field of synthetic chemistry. First Chemical Explorations My doctoral work at Zurich under Professor Paul Karrer had already given me one chance to pursue my interest in plant and animal chemistry. Making use of the gastrointestinal juice of the vineyard snail, I accomplished the enzymatic degradation of chitin, the structural material of which the shells, wings, and claws of insects, crustaceans, and other lower animals are composed. 
I was able to derive the chemical structure of chitin from the cleavage product, a nitrogen-containing sugar, obtained by this degradation. Chitin turned out to be an analogue of cellulose, the structural material of plants. This important result, obtained after only three months of research, led to a doctoral thesis rated "with distinction." When I joined the Sandoz firm, the staff of the pharmaceutical-chemical department was still rather modest in number. Four chemists with doctoral degrees worked in research, three in production. In Stoll's laboratory I found employment that completely agreed with me as a research chemist. The objective that Professor Stoll had set for his pharmaceutical-chemical research laboratories was to isolate the active principles (i.e., the effective constituents) of known medicinal plants to produce pure specimens of these substances. This is particularly important in the case of medicinal plants whose active principles are unstable, or whose potency is subject to great variation, which makes an exact dosage difficult. But if the active principle is available in pure form, it becomes possible to manufacture a stable pharmaceutical preparation, exactly quantifiable by weight. With this in mind, Professor Stoll had elected to study plant substances of recognized value such as the substances from foxglove (Digitalis), Mediterranean squill (Scilla maritima), and ergot of rye (Claviceps purpurea or Secale cornutum), which, owing to their instability and uncertain dosage, had nevertheless been little used in medicine. My first years in the Sandoz laboratories were devoted almost exclusively to studying the active principles of Mediterranean squill. Dr. Walter Kreis, one of Professor Stoll's earliest associates, launched me in this field of research. The most important constituents of Mediterranean squill already existed in pure form. 
Their active agents, as well as those of woolly foxglove (Digitalis lanata), had been isolated and purified, chiefly by Dr. Kreis, with extraordinary skill. The active principles of Mediterranean squill belong to the group of cardioactive glycosides (glycoside = sugar-containing substance) and serve, as do those of foxglove, in the treatment of cardiac insufficiency. The cardiac glycosides are extremely active substances. Because the therapeutic and the toxic doses differ so little, it becomes especially important here to have an exact dosage, based on pure compounds. At the beginning of my investigations, a pharmaceutical preparation with Scilla glycosides had already been introduced into therapeutics by Sandoz; however, the chemical structure of these active compounds, with the exception of the sugar portion, remained largely unknown. My main contribution to the Scilla research, in which I participated with enthusiasm, was to elucidate the chemical structure of the common nucleus of Scilla glycosides, showing on the one hand their differences from the Digitalis glycosides, and on the other hand their close structural relationship with the toxic principles isolated from skin glands of toads. In 1935, these studies were temporarily concluded. Looking for a new field of research, I asked Professor Stoll to let me continue the investigations on the alkaloids of ergot, which he had begun in 1917 and which had led directly to the isolation of ergotamine in 1918. Ergotamine, discovered by Stoll, was the first ergot alkaloid obtained in pure chemical form. Although ergotamine quickly took a significant place in therapeutics (under the trade name Gynergen) as a hemostatic remedy in obstetrics and as a medicament in the treatment of migraine, chemical research on ergot in the Sandoz laboratories was abandoned after the isolation of ergotamine and the determination of its empirical formula. 
Meanwhile, at the beginning of the thirties, English and American laboratories had begun to determine the chemical structure of ergot alkaloids. They had also discovered a new, water-soluble ergot alkaloid, which could likewise be isolated from the mother liquor of ergotamine production. So I thought it was high time that Sandoz resumed chemical research on ergot alkaloids, unless we wanted to risk losing our leading role in a field of medicinal research that was already becoming so important.

Professor Stoll granted my request, with some misgivings: "I must warn you of the difficulties you face in working with ergot alkaloids. These are exceedingly sensitive, easily decomposed substances, less stable than any of the compounds you have investigated in the cardiac glycoside field. But you are welcome to try."

And so the switches were thrown, and I found myself engaged in a field of study that would become the main theme of my professional career. I have never forgotten the creative joy, the eager anticipation I felt in embarking on the study of ergot alkaloids, at that time a relatively uncharted field of research.

Ergot

It may be helpful here to give some background information about ergot itself. [For further information on ergot, readers should refer to the monographs of G. Barger, Ergot and Ergotism (Gurney and Jackson, London, 1931) and A. Hofmann, Die Mutterkornalkaloide (F. Enke Verlag, Stuttgart, 1964). The former is a classical presentation of the history of the drug, while the latter emphasizes the chemical aspects.] It is produced by a lower fungus (Claviceps purpurea) that grows parasitically on rye and, to a lesser extent, on other species of grain and on wild grasses. Kernels infested with this fungus develop into light-brown to violet-brown curved pegs (sclerotia) that push forth from the husk in place of normal grains. Ergot is described botanically as a sclerotium, the form that the ergot fungus takes in winter.
Ergot of rye (Secale cornutum) is the variety used medicinally. Ergot, more than any other drug, has a fascinating history, in the course of which its role and meaning have been reversed: once dreaded as a poison, in the course of time it has changed to a rich storehouse of valuable remedies. Ergot first appeared on the stage of history in the early Middle Ages, as the cause of outbreaks of mass poisonings affecting thousands of persons at a time. The illness, whose connection with ergot was for a long time obscure, appeared in two characteristic forms, one gangrenous (ergotismus gangraenosus) and the other convulsive (ergotismus convulsivus). Popular names for ergotism - such as "mal des ardents," "ignis sacer," "heiliges Feuer," or "St. Anthony's fire" - refer to the gangrenous form of the disease. The patron saint of ergotism victims was St. Anthony, and it was primarily the Order of St. Anthony that treated these patients. Until recent times, epidemic-like outbreaks of ergot poisoning have been recorded in most European countries including certain areas of Russia. With progress in agriculture, and since the realization, in the seventeenth century, that ergot-containing bread was the cause, the frequency and extent of ergotism epidemics diminished considerably. The last great epidemic occurred in certain areas of southern Russia in the years 1926-27. [The mass poisoning in the southern French city of Pont-St. Esprit in the year 1951, which many writers have attributed to ergot-containing bread, actually had nothing to do with ergotism. It rather involved poisoning by an organic mercury compound that was utilized for disinfecting seed.] The first mention of a medicinal use of ergot, namely as an ecbolic (a medicament to precipitate childbirth), is found in the herbal of the Frankfurt city physician Adam Lonitzer (Lonicerus) in the year 1582. 
Although ergot, as Lonitzer stated, had been used since olden times by midwives, it was not until 1808 that this drug gained entry into academic medicine, on the strength of a work by the American physician John Stearns entitled Account of the Pulvis Parturiens, a Remedy for Quickening Childbirth. The use of ergot as an ecbolic did not, however, endure. Practitioners became aware quite early of the great danger to the child, owing primarily to the uncertainty of dosage, which when too high led to uterine spasms. From then on, the use of ergot in obstetrics was confined to stopping postpartum hemorrhage (bleeding after childbirth).

It was not until ergot's recognition in various pharmacopoeias during the first half of the nineteenth century that the first steps were taken toward isolating the active principles of the drug. However, of all the researchers who assayed this problem during the first hundred years, not one succeeded in identifying the actual substances responsible for the therapeutic activity. In 1907, the Englishmen G. Barger and F. H. Carr were the first to isolate an active alkaloidal preparation, which they named ergotoxine because it produced more of the toxic than the therapeutic properties of ergot. (This preparation was not homogeneous, but rather a mixture of several alkaloids, as I was able to show thirty-five years later.) Nevertheless, the pharmacologist H. H. Dale discovered that ergotoxine, besides the uterotonic effect, also had an antagonistic activity on adrenaline in the autonomic nervous system that could lead to the therapeutic use of ergot alkaloids. Only with the isolation of ergotamine by A. Stoll (as mentioned previously) did an ergot alkaloid find entry and widespread use in therapeutics.

The early 1930s brought a new era in ergot research, beginning with the determination of the chemical structure of ergot alkaloids, as mentioned, in English and American laboratories. By chemical cleavage, W. A. Jacobs and L. C.
Craig of the Rockefeller Institute of New York succeeded in isolating and characterizing the nucleus common to all ergot alkaloids. They named it lysergic acid. Then came a major development, both for chemistry and for medicine: the isolation of the specifically uterotonic, hemostatic principle of ergot, which was published simultaneously and quite independently by four institutions, including the Sandoz laboratories. The substance, an alkaloid of comparatively simple structure, was named ergobasine (syn. ergometrine, ergonovine) by A. Stoll and E. Burckhardt. By the chemical degradation of ergobasine, W. A. Jacobs and L. C. Craig obtained lysergic acid and the amino alcohol propanolamine as cleavage products. I set as my first goal the problem of preparing this alkaloid synthetically, through chemical linking of the two components of ergobasine, lysergic acid and propanolamine (see structural formulas in the appendix). The lysergic acid necessary for these studies had to be obtained by chemical cleavage of some other ergot alkaloid. Since only ergotamine was available as a pure alkaloid, and was already being produced in kilogram quantities in the pharmaceutical production department, I chose this alkaloid as the starting material for my work. I set about obtaining 0.5 gm of ergotamine from the ergot production people. When I sent the internal requisition form to Professor Stoll for his countersignature, he appeared in my laboratory and reproved me: "If you want to work with ergot alkaloids, you will have to familiarize yourself with the techniques of microchemistry. I can't have you consuming such a large amount of my expensive ergotamine for your experiments." The ergot production department, besides using ergot of Swiss origin to obtain ergotamine, also dealt with Portuguese ergot, which yielded an amorphous alkaloidal preparation that corresponded to the aforementioned ergotoxine first produced by Barger and Carr. 
I decided to use this less expensive material for the preparation of lysergic acid. The alkaloid obtained from the production department had to be purified further before it would be suitable for cleavage to lysergic acid. Observations made during the purification process led me to think that ergotoxine could be a mixture of several alkaloids, rather than one homogeneous alkaloid. I will speak later of the far-reaching sequelae of these observations.

Here I must digress briefly to describe the working conditions and techniques that prevailed in those days. These remarks may be of interest to the present generation of research chemists in industry, who are accustomed to far better conditions. We were very frugal. Individual laboratories were considered a rare extravagance. During the first six years of my employment with Sandoz, I shared a laboratory with two colleagues. We three chemists, plus an assistant each, worked in the same room on three different fields: Dr. Kreis on cardiac glycosides; Dr. Wiedemann, who joined Sandoz around the same time as I, on the leaf pigment chlorophyll; and I ultimately on ergot alkaloids. The laboratory was equipped with two fume hoods (compartments supplied with outlets), in which ventilation was provided, rather ineffectively, by gas flames. When we requested that these hoods be equipped with ventilators, our chief refused on the ground that ventilation by gas flame had sufficed in Willstätter's laboratory.

During the last years of World War I, Professor Stoll had been an assistant in Berlin and Munich to the world-famous chemist and Nobel laureate Professor Richard Willstätter, and with him had conducted the fundamental investigations on chlorophyll and the assimilation of carbon dioxide. There was scarcely a scientific discussion with Professor Stoll in which he did not mention his revered teacher Professor Willstätter and his work in Willstätter's laboratory.
The working techniques available to chemists in the field of organic chemistry at that time (the beginning of the thirties) were essentially the same as those employed by Justus von Liebig a hundred years earlier. The most important development achieved since then was the introduction of microanalysis by F. Pregl, which made it possible to ascertain the elemental composition of a compound with only a few milligrams of specimen, whereas earlier a few centigrams were needed. Of the other physical-chemical techniques at the disposal of the chemist today - techniques which have changed his way of working, making it faster and more effective, and created entirely new possibilities, above all for the elucidation of structure - none yet existed in those days.

For the investigations of Scilla glycosides and the first studies in the ergot field, I still used the old separation and purification techniques from Liebig's day: fractional extraction, fractional precipitation, fractional crystallization, and the like. The introduction of column chromatography, the first important step in modern laboratory technique, was of great value to me only in later investigations. For structure determination, which today can be conducted rapidly and elegantly with the help of spectroscopic methods (UV, IR, NMR) and X-ray crystallography, we had to rely, in the first fundamental ergot studies, entirely on the old laborious methods of chemical degradation and derivatization.

Lysergic Acid and Its Derivatives

Lysergic acid proved to be a rather unstable substance, and its rebonding with basic radicals posed difficulties. In the technique known as Curtius' synthesis, I ultimately found a process that proved useful for combining lysergic acid with amines. With this method I produced a great number of lysergic acid compounds. By combining lysergic acid with the amino alcohol propanolamine, I obtained a compound that was identical to the natural ergot alkaloid ergobasine.
With that, the first synthesis - that is, artificial production - of an ergot alkaloid was accomplished. This was not only of scientific interest, as confirmation of the chemical structure of ergobasine, but also of practical significance, because ergobasine, the specifically uterotonic, hemostatic principle, is present in ergot only in very trifling quantities. With this synthesis, the other alkaloids existing abundantly in ergot could now be converted to ergobasine, which was valuable in obstetrics.

After this first success in the ergot field, my investigations went forward on two fronts. First, I attempted to improve the pharmacological properties of ergobasine by variations of its amino alcohol radical. My colleague Dr. J. Peyer and I developed a process for the economical production of propanolamine and other amino alcohols. Indeed, by substitution of the propanolamine contained in ergobasine with the amino alcohol butanolamine, an active principle was obtained that even surpassed the natural alkaloid in its therapeutic properties. This improved ergobasine has found worldwide application as a dependable uterotonic, hemostatic remedy under the trade name Methergine, and is today the leading medicament for this indication in obstetrics.

I further employed my synthetic procedure to produce new lysergic acid compounds for which uterotonic activity was not prominent, but from which, on the basis of their chemical structure, other types of interesting pharmacological properties could be expected. In 1938, I produced the twenty-fifth substance in this series of lysergic acid derivatives: lysergic acid diethylamide, abbreviated LSD-25 (Lysergsäure-diäthylamid) for laboratory usage. I had planned the synthesis of this compound with the intention of obtaining a circulatory and respiratory stimulant (an analeptic).
Such stimulating properties could be expected for lysergic acid diethylamide, because it shows similarity in chemical structure to the analeptic already known at that time, namely nicotinic acid diethylamide (Coramine). During the testing of LSD-25 in the pharmacological department of Sandoz, whose director at the time was Professor Ernst Rothlin, a strong effect on the uterus was established. It amounted to some 70 percent of the activity of ergobasine. The research report also noted, in passing, that the experimental animals became restless during the narcosis. The new substance, however, aroused no special interest in our pharmacologists and physicians; testing was therefore discontinued. For the next five years, nothing more was heard of the substance LSD-25. Meanwhile, my work in the ergot field advanced further in other areas. Through the purification of ergotoxine, the starting material for lysergic acid, I obtained, as already mentioned, the impression that this alkaloidal preparation was not homogeneous, but was rather a mixture of different substances. This doubt as to the homogeneity of ergotoxine was reinforced when in its hydrogenation two distinctly different hydrogenation products were obtained, whereas the homogeneous alkaloid ergotamine under the same condition yielded only a single hydrogenation product (hydrogenation = introduction of hydrogen). Extended, systematic analytical investigations of the supposed ergotoxine mixture led ultimately to the separation of this alkaloidal preparation into three homogeneous components. One of the three chemically homogeneous ergotoxine alkaloids proved to be identical with an alkaloid isolated shortly before in the production department, which A. Stoll and E. Burckhardt had named ergocristine. The other two alkaloids were both new. The first I named ergocornine; and for the second, the last to be isolated, which had long remained hidden in the mother liquor, I chose the name ergokryptine (kryptos = hidden). 
Later it was found that ergokryptine occurs in two isomeric forms, which were differentiated as alpha- and beta-ergokryptine.

The solution of the ergotoxine problem was not merely scientifically interesting, but also had great practical significance. A valuable remedy arose from it. The three hydrogenated ergotoxine alkaloids that I produced in the course of these investigations, dihydroergocristine, dihydroergokryptine, and dihydroergocornine, displayed medicinally useful properties during testing by Professor Rothlin in the pharmacological department. From these three substances, the pharmaceutical preparation Hydergine was developed, a medicament for improvement of peripheral circulation and cerebral function in the control of geriatric disorders. Hydergine has proven to be an effective remedy in geriatrics for these indications. Today it is Sandoz's most important pharmaceutical product. Dihydroergotamine, which I likewise produced in the course of these investigations, has also found application in therapeutics as a circulation- and blood-pressure-stabilizing medicament, under the trade name Dihydergot.

While today research on important projects is almost exclusively carried out as teamwork, the investigations on ergot alkaloids described above were conducted by myself alone. Even the further chemical steps in the evolution of commercial preparations remained in my hands - that is, the preparation of larger specimens for the clinical trials, and finally the perfection of the first procedures for mass production of Methergine, Hydergine, and Dihydergot. This even included the analytical controls for the development of the first galenical forms of these three preparations: the ampules, liquid solutions, and tablets. My aides at that time included a laboratory assistant, a laboratory helper, and later in addition a second laboratory assistant and a chemical technician.
Discovery of the Psychic Effects of LSD

The solution of the ergotoxine problem had led to fruitful results, described here only briefly, and had opened up further avenues of research. And yet I could not forget the relatively uninteresting LSD-25. A peculiar presentiment - the feeling that this substance could possess properties other than those established in the first investigations - induced me, five years after the first synthesis, to produce LSD-25 once again so that a sample could be given to the pharmacological department for further tests. This was quite unusual; experimental substances, as a rule, were definitively stricken from the research program once found to be lacking in pharmacological interest.

Nevertheless, in the spring of 1943, I repeated the synthesis of LSD-25. As in the first synthesis, this involved the production of only a few centigrams of the compound. In the final step of the synthesis, during the purification and crystallization of lysergic acid diethylamide in the form of a tartrate (tartaric acid salt), I was interrupted in my work by unusual sensations. The following description of this incident comes from the report that I sent at the time to Professor Stoll:

Last Friday, April 16, 1943, I was forced to interrupt my work in the laboratory in the middle of the afternoon and proceed home, being affected by a remarkable restlessness, combined with a slight dizziness. At home I lay down and sank into a not unpleasant intoxicated-like condition, characterized by an extremely stimulated imagination. In a dreamlike state, with eyes closed (I found the daylight to be unpleasantly glaring), I perceived an uninterrupted stream of fantastic pictures, extraordinary shapes with intense, kaleidoscopic play of colors. After some two hours this condition faded away.

This was, altogether, a remarkable experience - both in its sudden onset and its extraordinary course.
It seemed to have resulted from some external toxic influence; I surmised a connection with the substance I had been working with at the time, lysergic acid diethylamide tartrate. But this led to another question: how had I managed to absorb this material? Because of the known toxicity of ergot substances, I always maintained meticulously neat work habits. Possibly a bit of the LSD solution had contacted my fingertips during crystallization, and a trace of the substance was absorbed through the skin. If LSD-25 had indeed been the cause of this bizarre experience, then it must be a substance of extraordinary potency. There seemed to be only one way of getting to the bottom of this. I decided on a self-experiment.

Exercising extreme caution, I began the planned series of experiments with the smallest quantity that could be expected to produce some effect, considering the activity of the ergot alkaloids known at the time: namely, 0.25 mg (mg = milligram = one thousandth of a gram) of lysergic acid diethylamide tartrate. Quoted below is the entry for this experiment in my laboratory journal of April 19, 1943.

Self-Experiments

4/19/43
16:20: 0.5 cc of 1/2 promil aqueous solution of diethylamide tartrate orally = 0.25 mg tartrate. Taken diluted with about 10 cc water. Tasteless.
17:00: Beginning dizziness, feeling of anxiety, visual distortions, symptoms of paralysis, desire to laugh.
Supplement of 4/21: Home by bicycle. From 18:00 to ca. 20:00 most severe crisis. (See special report.)

Here the notes in my laboratory journal cease. I was able to write the last words only with great effort. By now it was already clear to me that LSD had been the cause of the remarkable experience of the previous Friday, for the altered perceptions were of the same type as before, only much more intense. I had to struggle to speak intelligibly. I asked my laboratory assistant, who was informed of the self-experiment, to escort me home.
We went by bicycle, no automobile being available because of wartime restrictions on their use. On the way home, my condition began to assume threatening forms. Everything in my field of vision wavered and was distorted as if seen in a curved mirror. I also had the sensation of being unable to move from the spot. Nevertheless, my assistant later told me that we had traveled very rapidly. Finally, we arrived at home safe and sound, and I was just barely capable of asking my companion to summon our family doctor and request milk from the neighbors. In spite of my delirious, bewildered condition, I had brief periods of clear and effective thinking - and chose milk as a nonspecific antidote for poisoning.

The dizziness and sensation of fainting became so strong at times that I could no longer hold myself erect, and had to lie down on a sofa. My surroundings had now transformed themselves in more terrifying ways. Everything in the room spun around, and the familiar objects and pieces of furniture assumed grotesque, threatening forms. They were in continuous motion, animated, as if driven by an inner restlessness. The lady next door, whom I scarcely recognized, brought me milk - in the course of the evening I drank more than two liters. She was no longer Mrs. R., but rather a malevolent, insidious witch with a colored mask.

Even worse than these demonic transformations of the outer world were the alterations that I perceived in myself, in my inner being. Every exertion of my will, every attempt to put an end to the disintegration of the outer world and the dissolution of my ego, seemed to be wasted effort. A demon had invaded me, had taken possession of my body, mind, and soul. I jumped up and screamed, trying to free myself from him, but then sank down again and lay helpless on the sofa. The substance with which I had wanted to experiment had vanquished me. It was the demon that scornfully triumphed over my will. I was seized by the dreadful fear of going insane.
I was taken to another world, another place, another time. My body seemed to be without sensation, lifeless, strange. Was I dying? Was this the transition? At times I believed myself to be outside my body, and then perceived clearly, as an outside observer, the complete tragedy of my situation. I had not even taken leave of my family (my wife, with our three children, had traveled that day to visit her parents in Lucerne). Would they ever understand that I had not experimented thoughtlessly, irresponsibly, but rather with the utmost caution, and that such a result was in no way foreseeable? My fear and despair intensified, not only because a young family should lose its father, but also because I dreaded leaving my chemical research work, which meant so much to me, unfinished in the midst of fruitful, promising development. Another reflection took shape, an idea full of bitter irony: if I was now forced to leave this world prematurely, it was because of this lysergic acid diethylamide that I myself had brought forth into the world.

By the time the doctor arrived, the climax of my despondent condition had already passed. My laboratory assistant informed him about my self-experiment, as I myself was not yet able to formulate a coherent sentence. He shook his head in perplexity, after my attempts to describe the mortal danger that threatened my body. He could detect no abnormal symptoms other than extremely dilated pupils. Pulse, blood pressure, breathing were all normal. He saw no reason to prescribe any medication. Instead he conveyed me to my bed and stood watch over me. Slowly I came back from a weird, unfamiliar world to reassuring everyday reality. The horror softened and gave way to a feeling of good fortune and gratitude, the more normal perceptions and thoughts returned, and I became more confident that the danger of insanity was conclusively past.
Now, little by little I could begin to enjoy the unprecedented colors and plays of shapes that persisted behind my closed eyes. Kaleidoscopic, fantastic images surged in on me, alternating, variegated, opening and then closing themselves in circles and spirals, exploding in colored fountains, rearranging and hybridizing themselves in constant flux. It was particularly remarkable how every acoustic perception, such as the sound of a door handle or a passing automobile, became transformed into optical perceptions. Every sound generated a vividly changing image, with its own consistent form and color. Late in the evening my wife returned from Lucerne. Someone had informed her by telephone that I was suffering a mysterious breakdown. She had returned home at once, leaving the children behind with her parents. By now, I had recovered myself sufficiently to tell her what had happened. Exhausted, I then slept, to awake next morning refreshed, with a clear head, though still somewhat tired physically. A sensation of well-being and renewed life flowed through me. Breakfast tasted delicious and gave me extraordinary pleasure. When I later walked out into the garden, in which the sun shone now after a spring rain, everything glistened and sparkled in a fresh light. The world was as if newly created. All my senses vibrated in a condition of highest sensitivity, which persisted for the entire day. This self-experiment showed that LSD-25 behaved as a psychoactive substance with extraordinary properties and potency. There was to my knowledge no other known substance that evoked such profound psychic effects in such extremely low doses, that caused such dramatic changes in human consciousness and our experience of the inner and outer world. What seemed even more significant was that I could remember the experience of LSD inebriation in every detail. 
This could only mean that the conscious recording function was not interrupted, even in the climax of the LSD experience, despite the profound breakdown of the normal world view. For the entire duration of the experiment, I had even been aware of participating in an experiment, but despite this recognition of my condition, I could not, with every exertion of my will, shake off the LSD world. Everything was experienced as completely real, as alarming reality; alarming, because the picture of the other, familiar everyday reality was still fully preserved in the memory for comparison. Another surprising aspect of LSD was its ability to produce such a far-reaching, powerful state of inebriation without leaving a hangover. Quite the contrary, on the day after the LSD experiment I felt myself to be, as already described, in excellent physical and mental condition. I was aware that LSD, a new active compound with such properties, would have to be of use in pharmacology, in neurology, and especially in psychiatry, and that it would attract the interest of concerned specialists. But at that time I had no inkling that the new substance would also come to be used beyond medical science, as an inebriant in the drug scene. Since my self-experiment had revealed LSD in its terrifying, demonic aspect, the last thing I could have expected was that this substance could ever find application as anything approaching a pleasure drug. I failed, moreover, to recognize the meaningful connection between LSD inebriation and spontaneous visionary experience until much later, after further experiments, which were carried out with far lower doses and under different conditions. The next day I wrote to Professor Stoll the abovementioned report about my extraordinary experience with LSD-25 and sent a copy to the director of the pharmacological department, Professor Rothlin. As expected, the first reaction was incredulous astonishment. 
Instantly a telephone call came from the management; Professor Stoll asked: "Are you certain you made no mistake in the weighing? Is the stated dose really correct?" Professor Rothlin also called, asking the same question. I was certain of this point, for I had executed the weighing and dosage with my own hands. Yet their doubts were justified to some extent, for until then no known substance had displayed even the slightest psychic effect in fraction-of-a-milligram doses. An active compound of such potency seemed almost unbelievable. Professor Rothlin himself and two of his colleagues were the first to repeat my experiment, with only one-third of the dose I had utilized. But even at that level, the effects were still extremely impressive, and quite fantastic. All doubts about the statements in my report were eliminated.

2. LSD in Animal Experiments and Biological Research

After the discovery of its extraordinary psychic effects, the substance LSD-25, which five years earlier had been excluded from further investigation after the first trials on animals, was again admitted into the series of experimental preparations. Most of the fundamental studies on animals were carried out by Dr. Aurelio Cerletti in the Sandoz pharmacological department, headed by Professor Rothlin.

Before a new active substance can be investigated in systematic clinical trials with human subjects, extensive data on its effects and side effects must be determined in pharmacological tests on animals. These experiments must assay the assimilation and elimination of the particular substance in organisms, and above all its tolerance and relative toxicity. Only the most important reports on animal experiments with LSD, and those intelligible to the layperson, will be reviewed here.
It would greatly exceed the scope of this book if I attempted to mention all the results of several hundred pharmacological investigations, which have been conducted all over the world in connection with the fundamental work on LSD in the Sandoz laboratories.

Animal experiments reveal little about the mental alterations caused by LSD because psychic effects are scarcely determinable in lower animals, and even in the more highly developed, they can be established only to a limited extent. LSD produces its effects above all in the sphere of the higher and highest psychic and intellectual functions. It is therefore understandable that specific reactions to LSD can be expected only in higher animals. Subtle psychic changes cannot be established in animals because, even if they should be occurring, the animal could not give them expression. Thus, only relatively heavy psychic disturbances, expressing themselves in the altered behavior of research animals, become discernible. Quantities that are substantially higher than the effective dose of LSD in human beings are therefore necessary, even in higher animals like cats, dogs, and apes.

While the mouse under LSD shows only motor disturbances and alterations in licking behavior, in the cat we see, besides vegetative symptoms like bristling of the hair (piloerection) and salivation, indications that point to the existence of hallucinations. The animals stare anxiously into the air, and instead of attacking the mouse, the cat leaves it alone or will even stand in fear before the mouse. One could also conclude that the behavior of dogs under the influence of LSD involves hallucinations. A caged community of chimpanzees reacts very sensitively if a member of the tribe has received LSD. Even though no changes appear in this single animal, the whole cage gets in an uproar because the LSD chimpanzee no longer observes the laws of its finely coordinated hierarchic tribal order.
Of the remaining animal species on which LSD was tested, only aquarium fish and spiders need be mentioned here. In the fish, unusual swimming postures were observed, and in the spiders, alterations in web building were apparently produced by LSD. At very low optimum doses the webs were even better proportioned and more exactly built than normally; however, with higher doses, the webs were badly and rudimentarily made.

How Toxic Is LSD?

The toxicity of LSD has been determined in various animal species. A standard for the toxicity of a substance is the LD50, or the median lethal dose, that is, the dose with which 50 percent of the treated animals die. In general it fluctuates broadly, according to the animal species, and so it is with LSD. The LD50 for the mouse amounts to 50-60 mg/kg i.v. (that is, 50 to 60 thousandths of a gram of LSD per kilogram of animal weight upon injection of an LSD solution into the veins). In the rat the LD50 drops to 16.5 mg/kg, and in rabbits to 0.3 mg/kg. One elephant given 0.297 g of LSD died after a few minutes. The weight of this animal was determined to be 5,000 kg, which corresponds to a lethal dose of 0.06 mg/kg (0.06 thousandths of a gram per kilogram of body weight). Because this involves only a single case, this value cannot be generalized, but we can at least deduce from it that the largest land animal reacts proportionally very sensitively to LSD, since the lethal dose in elephants must be some 1,000 times lower than in the mouse. Most animals die from a lethal dose of LSD by respiratory arrest. The minute doses that cause death in animal experiments may give the impression that LSD is a very toxic substance. However, if one compares the lethal dose in animals with the effective dose in human beings, which is 0.0003-0.001 mg/kg (0.0003 to 0.001 thousandths of a gram per kilogram of body weight), this shows an extraordinarily low toxicity for LSD.
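The per-kilogram figures here, and the relative-toxicity comparisons that follow, are simple ratios: total dose divided by body weight, and animal lethal dose divided by the effective human dose. A minimal sketch of the arithmetic, using only the values quoted in the text (the human figure is taken at the upper end, 0.001 mg/kg, which reproduces the lower bounds of the quoted ranges):

```python
# Elephant case from the text: 0.297 g of LSD, body weight 5,000 kg.
elephant_dose_mg = 0.297 * 1000          # grams to milligrams
elephant_per_kg = elephant_dose_mg / 5000
print(round(elephant_per_kg, 2))         # 0.06 mg/kg, as quoted

# LD50 values (mg/kg, i.v.) quoted for each species.
ld50 = {"mouse": 50.0, "rat": 16.5, "rabbit": 0.3}
human_effective = 0.001                  # upper end of 0.0003-0.001 mg/kg

# How many human effective doses fit into one animal lethal dose:
for species, dose in ld50.items():
    print(species, round(dose / human_effective))
# rabbit -> 300 (lower bound of the 300- to 600-fold figure)
# mouse  -> 50000 (lower bound of the 50,000- to 100,000-fold figure)
```

As the text itself cautions, these cross-species ratios are order-of-magnitude estimates only, not a therapeutic index for humans.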
Only a 300- to 600-fold overdose of LSD, compared to the lethal dose in rabbits, or fully a 50,000- to 100,000-fold overdose, in comparison to the toxicity in the mouse, would have fatal results in human beings. These comparisons of relative toxicity are, to be sure, only understandable as estimates of orders of magnitude, for the determination of the therapeutic index (that is, the ratio between the effective and the lethal dose) is only meaningful within a given species. Such a procedure is not possible in this case because the lethal dose of LSD for humans is not known. To my knowledge, there have not as yet occurred any casualties that are a direct consequence of LSD poisoning. Numerous episodes of fatal consequences attributed to LSD ingestion have indeed been recorded, but these were accidents, even suicides, that may be attributed to the mentally disoriented condition of LSD intoxication. The danger of LSD lies not in its toxicity, but rather in the unpredictability of its psychic effects. Some years ago reports appeared in the scientific literature and also in the lay press, alleging that damage to chromosomes or the genetic material had been caused by LSD. These effects, however, have been observed in only a few individual cases. Subsequent comprehensive investigations of a large, statistically significant number of cases, however, showed that there was no connection between chromosome anomalies and LSD medication. The same applies to reports about fetal deformities that had allegedly been produced by LSD. In animal experiments, it is indeed possible to induce fetal deformities through extremely high doses of LSD, which lie well above the doses used in human beings. But under these conditions, even harmless substances produce such damage. Examination of reported individual cases of human fetal deformities reveals, again, no connection between LSD use and such injury.
If there had been any such connection, it would long since have attracted attention, for several million people by now have taken LSD.

Pharmacological Properties of LSD

LSD is absorbed easily and completely through the gastrointestinal tract. It is therefore unnecessary to inject LSD, except for special purposes. Experiments on mice with radioactively labeled LSD have established that intravenously injected LSD disappeared very rapidly from the bloodstream, down to a small vestige, and was distributed throughout the organism. Unexpectedly, the lowest concentration is found in the brain. There it is concentrated in certain centers of the midbrain that play a role in the regulation of emotion. Such findings give indications as to the localization of certain psychic functions in the brain. The concentration of LSD in the various organs attains maximum values 10 to 15 minutes after injection, then falls off again swiftly. The small intestine, in which the concentration attains the maximum within two hours, constitutes an exception. The elimination of LSD is conducted for the most part (up to some 80 percent) through the intestine via liver and bile. Only 1 to 10 percent of the elimination product exists as unaltered LSD; the remainder is made up of various transformation products. As the psychic effects of LSD persist even after it can no longer be detected in the organism, we must assume that LSD is not active as such, but that it rather triggers certain biochemical, neurophysiological, and psychic mechanisms that provoke the inebriated condition and continue in the absence of the active principle. LSD stimulates centers of the sympathetic nervous system in the midbrain, which leads to pupillary dilatation, increase in body temperature, and rise in the blood-sugar level. The uterine-constricting activity of LSD has already been mentioned. An especially interesting pharmacological property of LSD, discovered by J. H. Gaddum in England, is its serotonin-blocking effect.
Serotonin is a hormone-like substance, occurring naturally in various organs of warm-blooded animals. Concentrated in the midbrain, it plays an important role in the propagation of impulses in certain nerves and therefore in the biochemistry of psychic functions. The disruption of the natural functioning of serotonin by LSD was for some time regarded as an explanation of its psychic effects. However, it was soon shown that even certain derivatives of LSD (compounds in which the chemical structure of LSD is slightly modified) that exhibit no hallucinogenic properties inhibit the effects of serotonin just as strongly as, or even more strongly than, unaltered LSD. The serotonin-blocking effect of LSD thus does not suffice to explain its hallucinogenic properties. LSD also influences neurophysiological functions that are connected with dopamine, which is, like serotonin, a naturally occurring hormone-like substance. Most of the brain centers receptive to dopamine become activated by LSD, while the others are depressed. As yet we do not know the biochemical mechanisms through which LSD exerts its psychic effects. Investigations of the interactions of LSD with brain factors like serotonin and dopamine, however, are examples of how LSD can serve as a tool in brain research, in the study of the biochemical processes that underlie the psychic functions.

_________________________________________________________________

3. Chemical Modifications of LSD

When a new type of active compound is discovered in pharmaceutical-chemical research, whether by isolation from a plant drug or from animal organs, or through synthetic production as in the case of LSD, then the chemist attempts, through alterations in its molecular structure, to produce new compounds with similar, perhaps improved activity, or with other valuable active properties. We call this process a chemical modification of this type of active substance.
Of the approximately 20,000 new substances that are produced annually in the pharmaceutical-chemical research laboratories of the world, the overwhelming majority are modification products of proportionally few types of active compounds. The discovery of a really new type of active substance - new with regard to chemical structure and pharmacological effect - is a rare stroke of luck. Soon after the discovery of the psychic effects of LSD, two coworkers were assigned to join me in carrying out the chemical modification of LSD on a broader basis and in further investigations in the field of ergot alkaloids. The work on the chemical structure of ergot alkaloids of the peptide type, to which ergotamine and the alkaloids of the ergotoxine group belong, continued with Dr. Theodor Petrzilka. Working with Dr. Franz Troxler, I produced a great number of chemical modifications of LSD, and we attempted to gain further insights into the structure of lysergic acid, for which the American researchers had already proposed a structural formula. In 1949 we succeeded in correcting this formula and specifying the valid structure of this common nucleus of all ergot alkaloids, including of course LSD. The investigations of the peptide alkaloids of ergot led to the complete structural formulas of these substances, which we published in 1951. Their correctness was confirmed through the total synthesis of ergotamine, which was realized ten years later in collaboration with two younger coworkers, Dr. Albert J. Frey and Dr. Hans Ott. Another coworker, Dr. Paul A. Stadler, was largely responsible for the development of this synthesis into a process practicable on an industrial scale. The synthetic production of peptide ergot alkaloids using lysergic acid obtained from special cultures of the ergot fungus in tanks has great economic importance. This procedure is used to produce the starting material for the medicaments Hydergine and Dihydergot. 
Now we return to the chemical modifications of LSD. Since 1945, many LSD derivatives were produced in collaboration with Dr. Troxler, but none proved hallucinogenically more active than LSD. Indeed, the very closest relatives proved themselves essentially less active in this respect. There are four different possibilities of spatial arrangement of atoms in the LSD molecule. They are differentiated in technical language by the prefix iso- and the letters D and L. Besides LSD, which is more precisely designated as D-lysergic acid diethylamide, I have also produced and likewise tested in self-experiments the three other spatially different forms, namely D-isolysergic acid diethylamide (iso-LSD), L-lysergic acid diethylamide (L-LSD), and L-isolysergic acid diethylamide (L-iso-LSD). The last three forms of LSD showed no psychic effects up to a dose of 0.5 mg, which corresponds to a 20-fold quantity of a still distinctly active LSD dose. A substance very closely related to LSD, the monoethylamide of lysergic acid (LAE-32), in which an ethyl group is replaced by a hydrogen atom on the diethylamide residue of LSD, proved to be some ten times less psychoactive than LSD. The hallucinogenic effect of this substance is also qualitatively different: it is characterized by a narcotic component. This narcotic effect is yet more pronounced in lysergic acid amide (LA-111), in which both ethyl groups of LSD are replaced by hydrogen atoms. These effects, which I established in comparative self-experiments with LA-111 and LAE-32, were corroborated by subsequent clinical investigations. Fifteen years later we encountered lysergic acid amide, which had been produced synthetically for these investigations, as a naturally occurring active principle of the Mexican magic drug ololiuhqui. In a later chapter I shall deal more fully with this unexpected discovery.
Certain results of the chemical modification of LSD proved valuable to medicinal research; LSD derivatives were found that were only weakly or not at all hallucinogenic, but instead exhibited other effects of LSD to an increased extent. Such an effect of LSD is its blocking effect on the neurotransmitter serotonin (referred to previously in the discussion of the pharmacological properties of LSD). As serotonin plays a role in allergic-inflammatory processes and also in the generation of migraine, a specific serotonin-blocking substance was of great significance to medicinal research. We therefore searched systematically for LSD derivatives without hallucinogenic effects, but with the highest possible activity as serotonin blockers. The first such active substance was found in bromo-LSD, which has become known in medicinal-biological research under the designation BOL-148. In the course of our investigations on serotonin antagonists, Dr. Troxler subsequently produced even stronger and more specifically active compounds. The most active entered the medicinal market as a medicament for the treatment of migraine, under the trademark "Deseril" or, in English-speaking countries, "Sansert."

_________________________________________________________________

4. Use of LSD in Psychiatry

Soon after LSD was tried on animals, the first systematic investigation of the substance was carried out on human beings, at the psychiatric clinic of the University of Zurich. Werner A. Stoll, M.D. (a son of Professor Arthur Stoll), who led this research, published his results in 1947 in the Schweizer Archiv für Neurologie und Psychiatrie, under the title "Lysergsäure-diäthylamid, ein Phantastikum aus der Mutterkorngruppe" [Lysergic acid diethylamide, a phantasticum from the ergot group]. The tests involved healthy research subjects as well as schizophrenic patients.
The dosages - substantially lower than in my first self-experiment with 0.25 mg LSD tartrate - amounted to only 0.02 to 0.13 mg. The emotional state during the LSD inebriation was here predominantly euphoric, whereas in my experiment the mood was marked by grave side effects resulting from overdosage and, of course, fear of the uncertain outcome. This fundamental publication, which gave a scientific description of all the basic features of LSD inebriation, classified the new active principle as a phantasticum. However, the question of therapeutic application of LSD remained unanswered. On the other hand, the report emphasized the extraordinarily high activity of LSD, which corresponds to the activity of trace substances occurring in the organism that are considered to be responsible for certain mental disorders. Another subject discussed in this first publication was the possible application of LSD as a research tool in psychiatry, which follows from its tremendous psychic activity.

First Self-Experiment by a Psychiatrist

In his paper, W. A. Stoll also gave a detailed description of his own personal experiment with LSD. Since this was the first self-experiment published by a psychiatrist, and since it describes many characteristic features of LSD inebriation, it is interesting to quote extensively from the report. I warmly thank the author for kind permission to republish this extract.

At 8 o'clock I took 60 mcg (0.06 milligrams) of LSD. Some 20 minutes later, the first symptoms appeared: heaviness in the limbs, slight atactic (i.e., confused, uncoordinated) symptoms. A subjectively very unpleasant phase of general malaise followed, in parallel with the drop in blood pressure registered by the examiners. A certain euphoria then set in, though it seemed weaker to me than experiences in an earlier experiment. The ataxia increased, and I went "sailing" around the room with large strides. I felt somewhat better, but was glad to lie down.
Afterward the room was darkened (dark experiment); there followed an unprecedented experience of unimaginable intensity that kept increasing in strength. It was characterized by an unbelievable profusion of optical hallucinations that appeared and vanished with great speed, to make way for countless new images. I saw a profusion of circles, vortices, sparks, showers, crosses, and spirals in constant, racing flux. The images appeared to stream in on me predominantly from the center of the visual field, or out of the lower left edge. When a picture appeared in the middle, the remaining field of vision was simultaneously filled up with a vast number of similar visions. All were colored: bright, luminous red, yellow, and green predominated. I never managed to linger on any picture. When the supervisor of the experiment emphasized my great fantasies, the richness of my statements, I could only react with a sympathetic smile. I knew, in fact, that I could not retain, much less describe, more than a fraction of the pictures. I had to force myself to give a description. Terms such as "fireworks" or "kaleidoscopic" were poor and inadequate. I felt that I had to immerse myself more and more deeply into this strange and fascinating world, in order to allow the exuberance, the unimaginable wealth, to work on me. At first, the hallucinations were elementary: rays, bundles of rays, rain, rings, vortices, loops, sprays, clouds, etc. Then more highly organized visions also appeared: arches, rows of arches, a sea of roofs, desert landscapes, terraces, flickering fire, starry skies of unbelievable splendor. The original, more simple images continued in the midst of these more highly organized hallucinations. I remember the following images in particular: A succession of towering, Gothic vaults, an endless choir, of which I could not see the lower portions.
A landscape of skyscrapers, reminiscent of pictures of the entrance to New York harbor: house towers staggered behind and beside one another with innumerable rows of windows. Again the foundation was missing. A system of masts and ropes, which reminded me of a reproduction of a painting seen the previous day (the inside of a circus tent). An evening sky of an unimaginable pale blue over the dark roofs of a Spanish city. I had a peculiar feeling of anticipation, was full of joy and decidedly ready for adventure. All at once the stars flared up, amassed, and turned to a dense rain of stars and sparks that streamed toward me. City and sky had disappeared. I was in a garden, saw brilliant red, yellow, and green lights falling through a dark trelliswork, an indescribably joyous experience. It was significant that all the images consisted of countless repetitions of the same elements: many sparks, many circles, many arches, many windows, many fires, etc. I never saw isolated images, but always duplications of the same image, endlessly repeated. I felt myself one with all romanticists and dreamers, thought of E. T. A. Hoffmann, saw the maelstrom of Poe (even though, at the time I had read Poe, his description seemed exaggerated). Often I seemed to stand at the pinnacle of artistic experience; I luxuriated in the colors of the altar of Isenheim, and knew the euphoria and exultation of an artistic vision. I must also have spoken again and again of modern art; I thought of abstract pictures, which all at once I seemed to understand. Then again, there were impressions of an extreme trashiness, both in their shapes and their color combinations. The most garish, cheap modern lamp ornaments and sofa pillows came into my mind. The train of thought was quickened. But I had the feeling the supervisor of the experiment could still keep up with me. Of course I knew, intellectually, that I was rushing him. At first I had descriptions rapidly at hand. 
With the increasingly frenzied pace, it became impossible to think a thought through to the end. I must have only started many sentences. When I tried to restrict myself to specific subjects, the experiment proved most unsuccessful. My mind would even focus, in a certain sense, on contrary images: skyscrapers instead of a church, a broad desert instead of a mountain. I assumed that I had accurately estimated the elapsed time, but did not take the matter very seriously. Such questions did not interest me in the slightest. My state of mind was consciously euphoric. I enjoyed the condition, was serene, and took a most active interest in the experience. From time to time I opened my eyes. The weak red light seemed mysterious, much more than before. The busily writing research supervisor appeared to me to be very far away. Often I had peculiar bodily sensations: I believed my hands to be attached to some distant body, but was not certain whether it was my own. After termination of the first dark experiment, I strolled about in the room a bit, was unsure on my legs, and again felt less well. I became cold and was thankful that the research supervisor covered me with a blanket. I felt unkempt, unshaven, and unwashed. The room seemed strange and broad. Later I squatted on a high stool, thinking all the while that I sat there like a bird on the roost. The supervisor emphasized my own wretched appearance. He seemed remarkably graceful. I myself had small, finely formed hands. As I washed them, it was happening a long way from me, somewhere down below on the right. It was questionable, but utterly unimportant, whether they were my own hands. In the landscape outside, well known to me, many things appeared to have changed. Besides the hallucinations, I could now see the real as well. Later this was no longer possible, although I remained aware that reality was otherwise. 
A barracks, and the garage standing before it to the left, suddenly changed to a landscape of ruins, shattered to pieces. I saw wall wreckage and projecting beams, inspired undoubtedly by the memory of the war events in this region. In a uniform, extensive field, I kept seeing figures, which I tried to draw, but could get no farther than the crudest beginnings. I saw an extremely opulent sculptural ornamentation in constant metamorphosis, in continuous flux. I was reminded of every possible foreign culture, saw Mexican, Indian motifs. Between a grating of small beams and tendrils appeared little caricatures, idols, masks, strangely mixed all of a sudden with childish drawings of people. The tempo was slackened compared to the dark experiment. The euphoria had now vanished. I became depressed, especially during the second dark experiment, which followed. Whereas during the first dark experiment, the hallucinations had alternated with great rapidity in bright and luminous colors, now blue, violet, and dark green prevailed. The movement of larger images was slower, milder, quieter, although even these were composed of finely raining "elemental dots," which streamed and whirled about quickly. During the first dark experiment, the commotion had frequently intruded upon me; now it often led distinctly away from me into the center of the picture, where a sucking mouth appeared. I saw grottoes with fantastic erosions and stalactites, reminding me of the child's book Im Wunderreiche des Bergkönigs [In the wondrous realm of the mountain king]. Serene systems of arches rose up. On the right-hand side, a row of shed roofs suddenly appeared; I thought of an evening ride homeward during military service. Significantly it involved a homeward ride: there was no longer anything like departure or love of adventure. I felt protected, enveloped by motherliness, was in peace. The hallucinations were no longer exciting, but instead mild and attenuated.
Somewhat later I had the feeling of possessing the same motherly strength. I perceived an inclination, a desire to help, and behaved then in an exaggeratedly sentimental and trashy manner, where medical ethics are concerned. I realized this and was able to stop. But the depressed state of mind remained. I tried again and again to see bright and joyful images. But to no avail; only dark blue and green patterns emerged. I longed to imagine bright fire as in the first dark experiment. And I did see fires; however, they were sacrificial fires on the gloomy battlement of a citadel on a remote, autumnal heath. Once I managed to behold a bright ascending multitude of sparks, but at half-altitude it transformed itself into a group of silently moving spots from a peacock's tail. During the experiment I was very impressed that my state of mind and the type of hallucinations harmonized so consistently and uninterruptedly. During the second dark experiment I observed that random noises, and also noises intentionally produced by the supervisor of the experiment, provoked simultaneous changes in the optical impressions (synesthesia). In the same manner, pressure on the eyeball produced alterations of visual perceptions. Toward the end of the second dark experiment, I began to watch for sexual fantasies, which were, however, totally absent. In no way could I experience sexual desire. I wanted to imagine a picture of a woman; only a crude modern-primitive sculpture appeared. It seemed completely unerotic, and its forms were immediately replaced by agitated circles and loops. After the second dark experiment I felt benumbed and physically unwell. I perspired, was exhausted. I was thankful not to have to go to the cafeteria for lunch. The laboratory assistant who brought us the food appeared to me small and distant, of the same remarkable daintiness as the supervisor of the experiment. Sometime around 3:00 P.M. I felt better, so that the supervisor could pursue his work. 
With some effort I managed to take notes myself. I sat at the table, wanted to read, but could not concentrate. Once I seemed to myself like a shape from a surrealistic picture, whose limbs were not connected with the body, but were rather painted somewhere close by.... I was depressed and thought with interest of the possibility of suicide. With some terror I apprehended that such thoughts were remarkably familiar to me. It seemed singularly self-evident that a depressed person commits suicide.... On the way home and in the evening I was again euphoric, brimming with the experiences of the morning. I had experienced unexpected, impressive things. It seemed to me that a great epoch of my life had been crowded into a few hours. I was tempted to repeat the experiment. The next day I was careless in my thinking and conduct, had great trouble concentrating, was apathetic. . . . The casual, slightly dream-like condition persisted into the afternoon. I had great trouble reporting in any organized way on a simple problem. I felt a growing general weariness, an increasing awareness that I had now returned to everyday reality. The second day after the experiment brought an irresolute state.... Mild, but distinct depression was experienced during the following week, a feeling which of course could be related only indirectly to LSD.

The Psychic Effects of LSD

The picture of the activity of LSD obtained from these first investigations was not new to science. It largely matched the commonly held view of mescaline, an alkaloid that had been investigated as early as the turn of the century. Mescaline is the psychoactive constituent of the Mexican cactus Lophophora williamsii (syn. Anhalonium lewinii). This cactus has been eaten by American Indians ever since pre-Columbian times, and is still used today as a sacred drug in religious ceremonies. In his monograph Phantastica (Verlag Georg Stilke, Berlin, 1924), L.
Lewin has amply described the history of this drug, called peyotl by the Aztecs. The alkaloid mescaline was isolated from the cactus by A. Heffter in 1896, and in 1919 its chemical structure was elucidated and it was produced synthetically by E. Spath. It was the first hallucinogen or phantasticum (as this type of active compound was described by Lewin) to become available as a pure substance, permitting the study of chemically induced changes of sensory perceptions, mental illusions (hallucinations), and alterations of consciousness. In the 1920s extended experiments with mescaline were carried out on animal and human subjects and described comprehensively by K. Beringer in his book Der Meskalinrausch (Verlag Julius Springer, Berlin, 1927). Because these investigations failed to indicate any applications of mescaline in medicine, interest in this active substance waned. With the discovery of LSD, hallucinogen research received a new impetus. The novelty of LSD as opposed to mescaline was its high activity, lying in a different order of magnitude. The active dose of mescaline, 0.2 to 0.5 g, is comparable to 0.00002 to 0.0001 g of LSD; in other words, LSD is some 5,000 to 10,000 times more active than mescaline. LSD's unique position among the psychopharmaceuticals is not only due to its high activity, in a quantitative sense. The substance also has qualitative significance: it manifests a high specificity, that is, an activity aimed specifically at the human psyche. It can be assumed, therefore, that LSD affects the highest control centers of the psychic and intellectual functions. The psychic effects of LSD, which are produced by such minimal quantities of material, are too meaningful and too multiform to be explained by toxic alterations of brain function. If LSD acted only through a toxic effect on the brain, then LSD experiences would be entirely psychopathological in meaning, without any psychological or psychiatric interest. 
On the contrary, it is likely that alterations of nerve conductivity and influence on the activity of nerve connections (synapses), which have been experimentally demonstrated, play an important role. This could mean that an influence is being exerted on the extremely complex system of cross-connections and synapses between the many billions of brain cells, the system on which the higher psychic and intellectual functions depend. This would be a promising area to explore in the search for an explanation of LSD's radical efficacy. The nature of LSD's activity could lead to numerous possibilities of medicinal-psychiatric uses, as W. A. Stoll's ground-breaking studies had already shown. Sandoz therefore made the new active substance available to research institutes and physicians as an experimental drug, giving it the trade name Delysid (D-Lysergsäure-diäthylamid), which I had proposed. The printed prospectus below describes possible applications of this kind and voices the necessary precautions.

Delysid (LSD 25)
D-lysergic acid diethylamide tartrate
Sugar-coated tablets containing 0.025 mg. (25 microg.)
Ampoules of 1 ml. containing 0.1 mg. (100 microg.) for oral administration
The solution may also be injected s.c. or i.v. The effect is identical with that of oral administration but sets in more rapidly.

PROPERTIES
The administration of very small doses of Delysid (1/2-2 microg./kg. body weight) results in transitory disturbances of affect, hallucinations, depersonalization, reliving of repressed memories, and mild neurovegetative symptoms. The effect sets in after 30 to 90 minutes and generally lasts 5 to 12 hours. However, intermittent disturbances of affect may occasionally persist for several days.

METHOD OF ADMINISTRATION
For oral administration the contents of 1 ampoule of Delysid are diluted with distilled water, a 1% solution of tartaric acid or halogen-free tap water.
The absorption of the solution is somewhat more rapid and more constant than that of the tablets. Ampoules which have not been opened, which have been protected against light and stored in a cool place are stable for an unlimited period. Ampoules which have been opened or diluted solutions retain their effectiveness for 1 to 2 days, if stored in a refrigerator.

INDICATIONS AND DOSAGE
a) Analytical psychotherapy, to elicit release of repressed material and provide mental relaxation, particularly in anxiety states and obsessional neuroses. The initial dose is 25 microg. (1/4 of an ampoule or 1 tablet). This dose is increased at each treatment by 25 microg. until the optimum dose (usually between 50 and 200 microg.) is found. The individual treatments are best given at intervals of one week.
b) Experimental studies on the nature of psychoses: By taking Delysid himself, the psychiatrist is able to gain an insight into the world of ideas and sensations of mental patients. Delysid can also be used to induce model psychoses of short duration in normal subjects, thus facilitating studies on the pathogenesis of mental disease. In normal subjects, doses of 25 to 75 microg. are generally sufficient to produce a hallucinatory psychosis (on an average 1 microg./kg. body weight). In certain forms of psychosis and in chronic alcoholism, higher doses are necessary (2 to 4 microg./kg. body weight).

PRECAUTIONS
Pathological mental conditions may be intensified by Delysid. Particular caution is necessary in subjects with a suicidal tendency and in those cases where a psychotic development appears imminent. The psycho-affective lability and the tendency to commit impulsive acts may occasionally last for some days. Delysid should only be administered under strict medical supervision. The supervision should not be discontinued until the effects of the drug have completely worn off.

ANTIDOTE
The mental effects of Delysid can be rapidly reversed by the i.m. administration of 50 mg.
chlorpromazine. Literature available on request. SANDOZ LTD., BASLE, SWITZERLAND

The use of LSD in analytical psychotherapy is based mainly on the following psychic effects. In LSD inebriation the accustomed world view undergoes a deep-seated transformation and disintegration. Connected with this is a loosening or even suspension of the I-you barrier. Patients who are bogged down in an egocentric problem cycle can thereby be helped to release themselves from their fixation and isolation. The result can be an improved rapport with the doctor and a greater susceptibility to psychotherapeutic influence. The enhanced suggestibility under the influence of LSD works toward the same goal. Another significant, psychotherapeutically valuable characteristic of LSD inebriation is the tendency of long forgotten or suppressed contents of experience to appear again in consciousness. Traumatic events, which are sought in psychoanalysis, may then become accessible to psychotherapeutic treatment. Numerous case histories tell of experiences from even the earliest childhood that were vividly recalled during psychoanalysis under the influence of LSD. This does not involve an ordinary recollection, but rather a true reliving; not a reminiscence, but rather a reviviscence, as the French psychiatrist Jean Delay has formulated it. LSD does not act as a true medicament; rather it plays the role of a drug aid in the context of psychoanalytic and psychotherapeutic treatment and serves to channel the treatment more effectively and to shorten its duration. It can fulfill this function in two particular ways. In one procedure, which was developed in European clinics and given the name psycholytic therapy, moderately strong doses of LSD are administered in several successive sessions at regular intervals. Subsequently the LSD experiences are worked out in group discussions, and in expression therapy by drawing and painting. The term psycholytic therapy was coined by Ronald A.
Sandison, an English therapist of Jungian orientation and a pioneer of clinical LSD research. The root -lysis or -lytic signifies the dissolution of tension or conflicts in the human psyche. In a second procedure, which is the favored treatment in the United States, a single, very high LSD dose (0.3 to 0.6 mg) is administered after correspondingly intensive psychological preparation of the patients. This method, described as psychedelic therapy, attempts to induce a mystical-religious experience through the shock effects of LSD. This experience can then serve as a starting point for a restructuring and curing of the patient's personality in the accompanying psychotherapeutic treatment. The term psychedelic, which can be translated as "mind-manifesting" or "mind-expanding," was introduced by Humphry Osmond, a pioneer of LSD research in the United States. LSD's apparent benefits as a drug auxiliary in psychoanalysis and psychotherapy are derived from properties diametrically opposed to the effects of tranquilizer-type psychopharmaceuticals. Whereas tranquilizers tend to cover up the patient's problems and conflicts, reducing their apparent gravity and importance, LSD, on the contrary, makes them more exposed and more intensely experienced. This clearer recognition of problems and conflicts makes them, in turn, more susceptible to psychotherapeutic treatment. The suitability and success of LSD in psychoanalysis and psychotherapy are still a subject of controversy in professional circles. The same could be said, however, of other procedures employed in psychiatry such as electroshock, insulin therapy, or psychosurgery, procedures that entail, moreover, a far greater risk than the use of LSD, which under suitable conditions can be considered practically safe. Because forgotten or repressed experiences, under the influence of LSD, may become conscious with considerable speed, the treatment can be correspondingly shortened.
To some psychiatrists, however, this reduction of the therapy's duration is a disadvantage. They are of the opinion that this precipitation leaves the patient insufficient time for psychotherapeutic working-through. The therapeutic effect, they believe, persists for a shorter time than when there is a gradual treatment, including a slow process of becoming conscious of the traumatic experiences. Psycholytic and especially psychedelic therapy require thorough preparation of the patient for the LSD experience, to avoid his or her being frightened by the unusual and the unfamiliar. Only then is a positive interpretation of the experience possible. The selection of patients is also important, since not all types of psychic disturbance respond equally well to these methods of treatment. Successful use of LSD-assisted psychoanalysis and psychotherapy presupposes specific knowledge and experience. In this respect self-examination by psychiatrists, as W. A. Stoll has pointed out, can be most useful. Such self-experiments provide the doctors with direct insight, based on firsthand experience, into the strange world of LSD inebriation, and make it possible for them truly to understand these phenomena in their patients, to interpret them properly, and to take full advantage of them. The following pioneers in the use of LSD as a drug aid in psychoanalysis and psychotherapy deserve to be named in the front rank: A. K. Busch and W. C. Johnson, S. Cohen and B. Eisner, H. A. Abramson, H. Osmond, and A. Hoffer in the United States; R. A. Sandison in England; W. Frederking and H. Leuner in Germany; and G. Roubicek and S. Grof in Czechoslovakia. The second indication for LSD cited in the Sandoz prospectus on Delysid concerns its use in experimental investigations on the nature of psychoses. This arises from the fact that extraordinary psychic states experimentally produced by LSD in healthy research subjects are similar to many manifestations of certain mental disturbances.
In the early days of LSD research, it was often claimed that LSD inebriation represents a type of "model psychosis." This idea was dismissed, however, because extended comparative investigations showed that there were essential differences between the manifestations of psychosis and the LSD experience. With the LSD model, nevertheless, it is possible to study deviations from the normal psychic and mental condition, and to observe the biochemical and electrophysiological alterations associated with them. Perhaps we shall thereby gain new insights into the nature of psychoses. According to certain theories, various mental disturbances could be produced by psychotoxic metabolic products that have the power, even in minimal quantities, to alter the functions of brain cells. LSD represents a substance that certainly does not occur in the human organism, but whose existence and activity make it seem possible that abnormal metabolic products could exist that, even in trace quantities, could produce mental disturbances. As a result, the conception of a biochemical origin of certain mental disturbances has received broader support, and research in this direction has been stimulated. One medicinal use of LSD that touches on fundamental ethical questions is its administration to the dying. This practice arose from observations in American clinics that especially severe painful conditions of cancer patients, which no longer respond to conventional pain-relieving medication, could be alleviated or completely abolished by LSD. Of course, this does not involve an analgesic effect in the true sense. The diminution of pain sensitivity may rather occur because patients under the influence of LSD are psychologically so dissociated from their bodies that physical pain no longer penetrates their consciousness.
In order for LSD to be effective in such cases, it is especially crucial that patients be prepared and instructed about the kind of experiences and transformations that await them. In many cases it has proved beneficial for either a member of the clergy or a psychotherapist to guide the patient's thoughts in a religious direction. Numerous case histories tell of patients who gained meaningful insights about life and death on their deathbeds as, freed from pain in LSD ecstasy and reconciled to their fate, they faced their earthly demise fearlessly and in peace. The hitherto existing knowledge about the administration of LSD to the terminally ill has been summarized and published by S. Grof and J. Halifax in their book The Human Encounter with Death (E. P. Dutton, New York, 1977). The authors, together with E. Kast, S. Cohen, and W. A. Pahnke, are among the pioneers of this application of LSD. The most recent comprehensive publication on the use of LSD in psychiatry, Realms of the Human Unconscious: Observations from LSD Research (The Viking Press, New York, 1975), likewise comes from S. Grof, the Czech psychiatrist who has emigrated to the United States. This book offers a critical evaluation of the LSD experience from the viewpoint of Freud and Jung, as well as of existential analysis.

_________________________________________________________________

5. From Remedy to Inebriant

During the first years after its discovery, LSD brought me the same happiness and gratification that any pharmaceutical chemist would feel on learning that a substance he or she produced might possibly develop into a valuable medicament. For the creation of new remedies is the goal of a pharmaceutical chemist's research activity; therein lies the meaning of his or her work.
Nonmedical Use of LSD

This joy at having fathered LSD was tarnished after more than ten years of uninterrupted scientific research and medicinal use when LSD was swept up in the huge wave of an inebriant mania that began to spread over the Western world, above all the United States, at the end of the 1950s. It was strange how rapidly LSD adopted its new role as inebriant and, for a time, became the number-one inebriating drug, at least as far as publicity was concerned. The more its use as an inebriant was disseminated, bringing an upsurge in the number of untoward incidents caused by careless, medically unsupervised use, the more LSD became a problem child for me and for the Sandoz firm. It was obvious that a substance with such fantastic effects on mental perception and on the experience of the outer and inner world would also arouse interest outside medical science, but I had not expected that LSD, with its unfathomably uncanny, profound effects, so unlike the character of a recreational drug, would ever find worldwide use as an inebriant. I had expected curiosity and interest on the part of artists outside of medicine - performers, painters, and writers - but not among people in general. After the scientific publications around the turn of the century on mescaline - which, as already mentioned, evokes psychic effects quite like those of LSD - the use of this compound remained confined to medicine and to experiments within artistic and literary circles. I had expected the same fate for LSD. And indeed, the first non-medicinal self-experiments with LSD were carried out by writers, painters, musicians, and other intellectuals. LSD sessions had reportedly provoked extraordinary aesthetic experiences and granted new insights into the essence of the creative process. Artists were influenced in their creative work in unconventional ways. A particular type of art developed that has become known as psychedelic art.
It comprises creations produced under the influence of LSD and other psychedelic drugs, whereby the drugs acted as stimulus and source of inspiration. The standard publication in this field is the book by Robert E. L. Masters and Jean Houston, Psychedelic Art (Balance House, 1968). Works of psychedelic art are not created while the drug is in effect, but only afterward, the artist being inspired by these experiences. As long as the inebriated condition lasts, creative activity is impeded, if not completely halted. The influx of images is too great and is increasing too rapidly to be portrayed and fashioned. An overwhelming vision paralyzes activity. Artistic productions arising directly from LSD inebriation, therefore, are mostly rudimentary in character and deserve consideration not because of their artistic merit, but because they are a type of psychogram, which offers insight into the deepest mental structures of the artist, activated and made conscious by LSD. This was demonstrated later in a large-scale experiment by the Munich psychiatrist Richard P. Hartmann, in which thirty famous painters took part. He published the results in his book Malerei aus Bereichen des Unbewussten: Künstler experimentieren unter LSD [Painting from spheres of the unconscious: artists experiment with LSD] (Verlag M. Du Mont Schauberg, Cologne, 1974). LSD experiments also gave new impetus to exploration into the essence of religious and mystical experience. Religious scholars and philosophers discussed the question whether the religious and mystical experiences often discovered in LSD sessions were genuine, that is, comparable to spontaneous mystico-religious enlightenment.
This nonmedicinal yet earnest phase of LSD research, at times in parallel with medicinal research, at times following it, was increasingly overshadowed at the beginning of the 1960s, as LSD use spread with epidemic-like speed through all social classes, as a sensational inebriating drug, in the course of the inebriant mania in the United States. The rapid rise of drug use, which had its beginning in this country about twenty years ago, was not, however, a consequence of the discovery of LSD, as superficial observers often declared. Rather it had deep-seated sociological causes: materialism, alienation from nature through industrialization and increasing urbanization, lack of satisfaction in professional employment in a mechanized, lifeless working world, ennui and purposelessness in a wealthy, saturated society, and lack of a religious, nurturing, and meaningful philosophical foundation of life. The existence of LSD was even regarded by the drug enthusiasts as a predestined coincidence - it had to be discovered precisely at this time in order to bring help to people suffering under the modern conditions. It is not surprising that LSD first came into circulation as an inebriating drug in the United States, the country in which industrialization, urbanization, and mechanization, even of agriculture, are most broadly advanced. These are the same factors that have led to the origin and growth of the hippie movement that developed simultaneously with the LSD wave. The two cannot be dissociated. It would be worth investigating to what extent the consumption of psychedelic drugs furthered the hippie movement, and conversely. The spread of LSD from medicine and psychiatry into the drug scene was introduced and expedited by publications on sensational LSD experiments that, although they were carried out in psychiatric clinics and universities, were not then reported in scientific journals, but rather in magazines and daily papers, greatly elaborated.
Reporters made themselves available as guinea pigs. Sidney Katz, for example, participated in an LSD experiment in the Saskatchewan Hospital in Canada under the supervision of noted psychiatrists; his experiences, however, were not published in a medical journal. Instead, he described them in an article entitled "My Twelve Hours as a Madman" in his magazine MacLean's Canada National Magazine, colorfully illustrated in fanciful fullness of detail. The widely distributed German magazine Quick, in its issue number 12 of 21 March 1954, reported a sensational eyewitness account of "Ein kühnes wissenschaftliches Experiment" [a daring scientific experiment] by the painter Wilfried Zeller, who took "a few drops of lysergic acid" in the Viennese University Psychiatric Clinic. Of the numerous publications of this type that have made effective lay propaganda for LSD, it is sufficient to cite just one more example: a large-scale, illustrated article in Look magazine of September 1959. Entitled "The Curious Story Behind the New Cary Grant," it must have contributed enormously to the diffusion of LSD consumption. The famous movie star had received LSD in a respected clinic in California, in the course of a psychotherapeutic treatment. He informed the Look reporter that he had sought inner peace his whole life long, but yoga, hypnosis, and mysticism had not helped him. Only the treatment with LSD had made a new, self-strengthened man out of him, so that after three frustrating marriages he now believed himself really able to love and make a woman happy. The evolution of LSD from remedy to inebriating drug was, however, primarily promoted by the activities of Dr. Timothy Leary and Dr. Richard Alpert of Harvard University. In a later section I will come to speak in more detail about Dr. Leary and my meetings with this personage who has become known worldwide as an apostle of LSD. Books also appeared on the U.S. market in which the fantastic effects of LSD were reported more fully.
Here only two of the most important will be mentioned: Exploring Inner Space by Jane Dunlap (Harcourt Brace and World, New York, 1961) and My Self and I by Constance A. Newland (N.A.L. Signet Books, New York, 1963). Although in both cases LSD was used within the scope of a psychiatric treatment, the authors addressed their books, which became bestsellers, to the broad public. In her book, subtitled "The Intimate and Completely Frank Record of One Woman's Courageous Experiment with Psychiatry's Newest Drug, LSD 25," Constance A. Newland described in intimate detail how she had been cured of frigidity. After such avowals, one can easily imagine that many people would want to try the wondrous medicine for themselves. The mistaken opinion created by such reports - that it would be sufficient simply to take LSD in order to accomplish such miraculous effects and transformations in oneself - soon led to broad diffusion of self-experimentation with the new drug. Objective, informative books about LSD and its problems also appeared, such as the excellent work by the psychiatrist Dr. Sidney Cohen, The Beyond Within (Atheneum, New York, 1967), in which the dangers of careless use are clearly exposed. This had, however, no power to put a stop to the LSD epidemic. As LSD experiments were often carried out in ignorance of the uncanny, unforeseeable, profound effects, and without medical supervision, they frequently came to a bad end. With increasing LSD consumption in the drug scene, there came an increase in "horror trips" - LSD experiments that led to disoriented conditions and panic, often resulting in accidents and even crime. The rapid rise of nonmedicinal LSD consumption at the beginning of the 1960s was also partly attributable to the fact that the drug laws then current in most countries did not include LSD. For this reason, drug habitues changed from the legally proscribed narcotics to the still-legal substance LSD.
Moreover, the last of the Sandoz patents for the production of LSD expired in 1963, removing a further hindrance to illegal manufacture of the drug. The rise of LSD in the drug scene caused our firm a nonproductive, laborious burden. National control laboratories and health authorities requested statements from us about chemical and pharmacological properties, stability and toxicity of LSD, and analytical methods for its detection in confiscated drug samples, as well as in the human body, in blood and urine. This brought a voluminous correspondence, which expanded in connection with inquiries from all over the world about accidents, poisonings, criminal acts, and so forth, resulting from misuse of LSD. All this meant enormous, unprofitable difficulties, which the business management of Sandoz regarded with disapproval. Thus it happened one day that Professor Stoll, managing director of the firm at the time, said to me reproachfully: "I would rather you had not discovered LSD." At that time, I was now and again assailed by doubts whether the valuable pharmacological and psychic effects of LSD might be outweighed by its dangers and by possible injuries due to misuse. Would LSD become a blessing for humanity, or a curse? This I often asked myself when I thought about my problem child. My other preparations, Methergine, Dihydroergotamine, and Hydergine, caused me no such problems and difficulties. They were not problem children; lacking extravagant properties leading to misuse, they have developed in a satisfying manner into therapeutically valuable medicines. The publicity about LSD attained its high point in the years 1964 to 1966, not only with regard to enthusiastic claims about the wondrous effects of LSD by drug fanatics and hippies, but also to reports of accidents, mental breakdowns, criminal acts, murders, and suicide under the influence of LSD. A veritable LSD hysteria reigned. 
Sandoz Stops LSD Distribution

In view of this situation, the management of Sandoz was forced to make a public statement on the LSD problem and to publish accounts of the corresponding measures that had been taken. The pertinent letter, dated 23 August 1965, by Dr. A. Cerletti, at the time director of the Pharmaceutical Department of Sandoz, is reproduced below:

Decision Regarding LSD 25 and Other Hallucinogenic Substances

More than twenty years have elapsed since the discovery by Albert Hofmann of LSD 25 in the SANDOZ Laboratories. Whereas the fundamental importance of this discovery may be assessed by its impact on the development of modern psychiatric research, it must be recognized that it placed a heavy burden of responsibility on SANDOZ, the owner of this product. The finding of a new chemical with outstanding biological properties, apart from the scientific success implied by its synthesis, is usually the first decisive step toward profitable development of a new drug. In the case of LSD, however, it soon became clear that, despite the outstanding properties of this compound, or rather because of the very nature of these qualities, even though LSD was fully protected by SANDOZ-owned patents since the time of its first synthesis in 1938, the usual means of practical exploitation could not be envisaged. On the other hand, all the evidence obtained following the initial studies in animals and humans carried out in the SANDOZ research laboratories pointed to the important role that this substance could play as an investigational tool in neurological research and in psychiatry. It was therefore decided to make LSD available free of charge to qualified experimental and clinical investigators all over the world. This broad research approach was assisted by the provision of any necessary technical aid and in many instances also by financial support.
An enormous amount of scientific documentation, published mainly in the international biochemical and medical literature and systematically listed in the "SANDOZ Bibliography on LSD" as well as in the "Catalogue of Literature on Delysid" periodically edited by SANDOZ, gives vivid proof of what has been achieved by following this line of policy over nearly two decades. By exercising this kind of "nobile officium" in accordance with the highest standards of medical ethics, with all kinds of self-imposed precautions and restrictions, it was possible for many years to avoid the danger of abuse (i.e., use by people neither competent nor qualified), which is always inherent in a compound with exceptional CNS activity. In spite of all our precautions, cases of LSD abuse have occurred from time to time in varying circumstances completely beyond the control of SANDOZ. Very recently this danger has increased considerably and in some parts of the world has reached the scale of a serious threat to public health. This state of affairs has now reached a critical point for the following reasons: (1) A worldwide spread of misconceptions of LSD has been caused by an increasing amount of publicity aimed at provoking an active interest in laypeople by means of sensational stories and statements; (2) In most countries no adequate legislation exists to control and regulate the production and distribution of substances like LSD; (3) The problem of availability of LSD, once limited on technical grounds, has fundamentally changed with the advent of mass production of lysergic acid by fermentation procedures. Since the last patent on LSD expired in 1963, it is not surprising to find that an increasing number of dealers in fine chemicals are offering LSD from unknown sources at the high price known to be paid by LSD fanatics.
Taking into consideration all the above-mentioned circumstances and the flood of requests for LSD which has now become uncontrollable, the pharmaceutical management of SANDOZ has decided to stop immediately all further production and distribution of LSD. The same policy will apply to all derivatives or analogues of LSD with hallucinogenic properties as well as to Psilocybin, Psilocin, and their hallucinogenic congeners. For a while the distribution of LSD and psilocybin was stopped completely by Sandoz. Most countries had subsequently proclaimed strict regulations concerning possession, distribution, and use of hallucinogens, so that physicians, psychiatric clinics, and research institutes, if they could produce a special permit to work with these substances from the respective national health authorities, could again be supplied with LSD and psilocybin. In the United States the National Institute of Mental Health (NIMH) undertook the distribution of these agents to licensed research institutes. All these legislative and official precautions, however, had little influence on LSD consumption in the drug scene, yet on the other hand hindered and continue to hinder medicinal-psychiatric use and LSD research in biology and neurology, because many researchers dread the red tape that is connected with the procurement of a license for the use of LSD. The bad reputation of LSD - its depiction as an "insanity drug" and a "satanic invention" - constitutes a further reason why many doctors shunned use of LSD in their psychiatric practice. In the course of recent years the uproar of publicity about LSD has quieted, and the consumption of LSD as an inebriant has also diminished, as far as that can be concluded from the rare reports about accidents and other regrettable occurrences following LSD ingestion. It may be that the decrease of LSD accidents, however, is not simply due to a decline in LSD consumption. 
Possibly the recreational users, with time, have become more aware of the particular effects and dangers of LSD and more cautious in their use of this drug. Certainly LSD, which was for a time considered in the Western world, above all in the United States, to be the number-one inebriant, has relinquished this leading role to other inebriants such as hashish and the habituating, even physically destructive drugs like heroin and amphetamine. The last-mentioned drugs represent an alarming sociological and public health problem today.

Dangers of Nonmedicinal LSD Experiments

While professional use of LSD in psychiatry entails hardly any risk, the ingestion of this substance outside of medical practice, without medical supervision, is subject to multifarious dangers. These dangers reside, on the one hand, in external circumstances connected with illegal drug use and, on the other hand, in the peculiarity of LSD's psychic effects. The advocates of uncontrolled, free use of LSD and other hallucinogens base their attitude on two claims: (1) this type of drug produces no addiction, and (2) until now no danger to health from moderate use of hallucinogens has been demonstrated. Both are true. Genuine addiction, characterized by the fact that psychic and often severe physical disturbances appear on withdrawal of the drug, has not been observed, even in cases in which LSD was taken often and over a long period of time. No organic injury or death as a direct consequence of an LSD intoxication has yet been reported. As discussed in greater detail in the chapter "LSD in Animal Experiments and Biological Research," LSD is actually a relatively nontoxic substance in proportion to its extraordinarily high psychic activity.

Psychotic Reactions

Like the other hallucinogens, however, LSD is dangerous in an entirely different sense.
While the psychic and physical dangers of the addicting narcotics, the opiates, amphetamines, and so forth, appear only with chronic use, the possible danger of LSD exists in every single experiment. This is because severe disoriented states can appear during any LSD inebriation. It is true that through careful preparation of the experiment and the experimenter such episodes can largely be avoided, but they cannot be excluded with certainty. LSD crises resemble psychotic attacks with a manic or depressive character. In the manic, hyperactive condition, the feeling of omnipotence or invulnerability can lead to serious casualties. Such accidents have occurred when inebriated persons confused in this way - believing themselves to be invulnerable - walked in front of a moving automobile or jumped out a window in the belief that they were able to fly. This type of LSD casualty, however, is not so common as one might be led to think on the basis of reports that were sensationally exaggerated by the mass media. Nevertheless, such reports must serve as serious warnings. On the other hand, a report that made the rounds worldwide, in 1966, about an alleged murder committed under the influence of LSD, cannot be true. The suspect, a young man in New York accused of having killed his mother-in-law, explained at his arrest, immediately after the fact, that he knew nothing of the crime and that he had been on an LSD trip for three days. But an LSD inebriation, even with the highest doses, lasts no longer than twelve hours, and repeated ingestion leads to tolerance, which means that extra doses are ineffective. Besides, LSD inebriation is characterized by the fact that the person remembers exactly what he or she has experienced. Presumably the defendant in this case expected leniency for extenuating circumstances, owing to unsoundness of mind. The danger of a psychotic reaction is especially great if LSD is given to someone without his or her knowledge.
This was demonstrated in an episode that took place soon after the discovery of LSD, during the first investigations with the new substance in the Zurich University Psychiatric Clinic, when people were not yet aware of the danger of such jokes. A young doctor, whose colleagues had slipped LSD into his coffee as a lark, wanted to swim across Lake Zurich during the winter at -20°C (-4°F) and had to be prevented by force. There is a different danger when the LSD-induced disorientation exhibits a depressive rather than manic character. In the course of such an LSD experiment, frightening visions, death agony, or the fear of becoming insane can lead to a threatening psychic breakdown or even to suicide. Here the LSD trip becomes a "horror trip." The demise of a Dr. Olson, who had been given LSD without his knowledge in the course of U.S. Army drug experiments, and who then committed suicide by jumping from a window, caused a particular sensation. His family could not understand how this quiet, well-adjusted man could have been driven to this deed. Not until fifteen years later, when the secret documents about the experiments were published, did they learn the true circumstances, whereupon the president of the United States publicly apologized to the dependents. The conditions for the positive outcome of an LSD experiment, with little possibility of a psychotic derailment, reside on the one hand in the individual and on the other hand in the external milieu of the experiment. The internal, personal factors are called set; the external conditions, setting. The beauty of a living room or of an outdoor location is perceived with particular force because of the highly stimulated sense organs during LSD inebriation, and such an amenity has a substantial influence on the course of the experiment. The persons present, their appearance, their traits, are also part of the setting that determines the experience. The acoustic milieu is equally significant.
Even harmless noises can turn to torment, and conversely lovely music can develop into a euphoric experience. With LSD experiments in ugly or noisy surroundings, however, there is greater danger of a negative outcome, including psychotic crises. The machine- and appliance-world of today offers much scenery and all types of noise that could very well trigger panic during enhanced sensibility. Just as meaningful as the external milieu of the LSD experience, if not even more important, is the mental condition of the experimenters, their current state of mind, their attitude to the drug experience, and their expectations associated with it. Even unconscious feelings of happiness or fear can have an effect. LSD tends to intensify the actual psychic state. A feeling of happiness can be heightened to bliss, a depression can deepen to despair. LSD is thus the most inappropriate means imaginable for curing a depressive state. It is dangerous to take LSD in a disturbed, unhappy frame of mind, or in a state of fear. The probability that the experiment will end in a psychic breakdown is then quite high. Among persons with unstable personality structures, tending to psychotic reactions, LSD experimentation ought to be completely avoided. Here an LSD shock, by releasing a latent psychosis, can produce a lasting mental injury. The psyche of very young persons should also be considered as unstable, in the sense of not yet having matured. In any case, the shock of such a powerful stream of new and strange perceptions and feelings, such as is engendered by LSD, endangers the sensitive, still-developing psycho-organism. Even the medicinal use of LSD in youths under eighteen years of age, in the scope of psychoanalytic or psychotherapeutic treatment, is discouraged in professional circles, correctly so in my opinion. Juveniles for the most part still lack a secure, solid relationship to reality. 
Such a relationship is needed before the dramatic experience of new dimensions of reality can be meaningfully integrated into the world view. Instead of leading to a broadening and deepening of reality consciousness, such an experience in adolescents will lead to insecurity and a feeling of being lost. Because of the freshness of sensory perception in youth and the still-unlimited capacity for experience, spontaneous visionary experiences occur much more frequently than in later life. For this reason as well, psychostimulating agents should not be used by juveniles. Even in healthy, adult persons, even with adherence to all of the preparatory and protective measures discussed, an LSD experiment can fail, causing psychotic reactions. Medical supervision is therefore earnestly to be recommended, even for nonmedicinal LSD experiments. This should include an examination of the state of health before the experiment. The doctor need not be present at the session; however, medical help should at all times be readily available. Acute LSD psychoses can be cut short and brought under control quickly and reliably by injection of chlorpromazine or another sedative of this type. The presence of a familiar person, who can request medical help in the event of an emergency, is also an indispensable psychological assurance. Although the LSD inebriation is characterized mostly by an immersion in the individual inner world, a deep need for human contact sometimes arises, especially in depressive phases.

LSD from the Black Market

Nonmedicinal LSD consumption can bring dangers of an entirely different type than hitherto discussed: for most of the LSD offered in the drug scene is of unknown origin. LSD preparations from the black market are unreliable when it comes to both quality and dosage. They rarely contain the declared quantity, but mostly have less LSD, often none at all, and sometimes even too much. In many cases other drugs or even poisonous substances are sold as LSD.
These observations were made in our laboratory upon analysis of a great number of LSD samples from the black market. They coincide with the experiences of national drug control departments. The unreliability in the strength of LSD preparations on the illicit drug market can lead to dangerous overdosage. Overdoses have often proved to be the cause of failed LSD experiments that led to severe psychic and physical breakdowns. Reports of alleged fatal LSD poisoning, however, have yet to be confirmed. Close scrutiny of such cases invariably established other causative factors. The following case, which took place in 1970, is cited as an example of the possible dangers of black market LSD. We received for investigation from the police a drug powder distributed as LSD. It came from a young man who was admitted to the hospital in critical condition and whose friend had also ingested this preparation and died as a result. Analysis showed that the powder contained no LSD, but rather the very poisonous alkaloid strychnine. If most black market LSD preparations contain less than the stated quantity, and often no LSD at all, the reason is either deliberate falsification or the great instability of this substance. LSD is very sensitive to air and light. It is oxidatively destroyed by the oxygen in the air and is transformed into an inactive substance under the influence of light. This must be taken into account during the synthesis and especially during the production of stable, storable forms of LSD. Claims that LSD may easily be prepared, or that every chemistry student in a half-decent laboratory is capable of producing it, are untrue. Procedures for synthesis of LSD have indeed been published and are accessible to everyone. With these detailed procedures in hand, chemists would be able to carry out the synthesis, provided they had pure lysergic acid at their disposal; its possession today, however, is subject to the same strict regulations as LSD.
In order to isolate LSD in pure crystalline form from the reaction solution, and in order to produce stable preparations, however, special equipment and not easily acquired specific experience are required, owing (as stated previously) to the great instability of this substance. Only in completely oxygen-free ampules protected from light is LSD absolutely stable. Such ampules, containing 100 µg (= 0.1 mg) LSD-tartrate (tartaric acid salt of LSD) in 1 cc of aqueous solution, were produced for biological research and medicinal use by the Sandoz firm. LSD in tablets prepared with additives that inhibit oxidation, while not absolutely stable, at least keeps for a longer time. But LSD preparations often found on the black market - LSD that has been applied in solution onto sugar cubes or blotting paper - decompose in the course of weeks or a few months. With such a highly potent substance as LSD, the correct dosage is of paramount importance. Here the tenet of Paracelsus holds good: the dose determines whether a substance acts as a remedy or as a poison. A controlled dosage, however, is not possible with preparations from the black market, whose active strength is in no way guaranteed. One of the greatest dangers of nonmedicinal LSD experiments lies, therefore, in the use of such preparations of unknown provenience.

The Case of Dr. Leary

Dr. Timothy Leary, who has become known worldwide in his role as drug apostle, had an extraordinarily strong influence on the diffusion of illegal LSD consumption in the United States. On the occasion of a vacation in Mexico in the year 1960, Leary had eaten the legendary "sacred mushrooms," which he had purchased from a shaman. During the mushroom inebriation he entered into a state of mystico-religious ecstasy, which he described as the deepest religious experience of his life. From then on, Dr.
Leary, who at the time was a lecturer in psychology at Harvard University in Cambridge, Massachusetts, dedicated himself totally to research on the effects and possibilities of the use of psychedelic drugs. Together with his colleague Dr. Richard Alpert, he started various research projects at the university, in which LSD and psilocybin, isolated by us in the meantime, were employed. The reintegration of convicts into society, the production of mystico-religious experiences in theologians and members of the clergy, and the furtherance of creativity in artists and writers with the help of LSD and psilocybin were tested with scientific methodology. Even persons like Aldous Huxley, Arthur Koestler, and Allen Ginsberg participated in these investigations. Particular consideration was given to the question of how far the mental preparation and expectations of the subjects, along with the external milieu of the experiment, are able to influence the course and character of states of psychedelic inebriation. In January 1963, Dr. Leary sent me a detailed report of these studies, in which he enthusiastically imparted the positive results obtained and gave expression to his beliefs in the advantages and very promising possibilities of such use of these active compounds. At the same time, the Sandoz firm received an inquiry about the supply of 100 g of LSD and 25 kg of psilocybin, signed by Dr. Timothy Leary, from the Harvard University Department of Social Relations. The request for such enormous quantities (the stated amounts correspond to 1 million doses of LSD and 2.5 million doses of psilocybin) was based on the planned extension of investigations to tissue, organ, and animal studies. We made the supply of these substances contingent upon the production of an import license from the U.S. health authorities.
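The dose equivalence stated in the parenthesis follows from simple unit arithmetic, taking the 0.1-mg LSD dose mentioned earlier in the chapter; the 10-mg psilocybin dose used below is not stated explicitly but is implied by the ratio of 25 kg to 2.5 million doses:

```latex
\[
\frac{100~\mathrm{g}}{0.1~\mathrm{mg/dose}}
  = \frac{100{,}000~\mathrm{mg}}{0.1~\mathrm{mg/dose}}
  = 1{,}000{,}000~\text{doses of LSD}
\]
\[
\frac{25~\mathrm{kg}}{10~\mathrm{mg/dose}}
  = \frac{25{,}000{,}000~\mathrm{mg}}{10~\mathrm{mg/dose}}
  = 2{,}500{,}000~\text{doses of psilocybin}
\]
```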
Immediately we received the order for the stated quantities of LSD and psilocybin, along with a check for $10,000 as deposit, but without the required import license. Dr. Leary signed for this order, but no longer as lecturer at Harvard University, rather as president of an organization he had recently founded, the International Federation for Internal Freedom (IFIF). Because, in addition, our inquiry to the appropriate dean of Harvard University had shown that the university authorities did not approve of the continuation of the research project by Leary and Alpert, we canceled our offer upon return of the deposit. Shortly thereafter, Leary and Alpert were discharged from the teaching staff of Harvard University because the investigations, at first conducted in an academic milieu, had lost their scientific character. The experiments had turned into LSD parties. The LSD trip - LSD as a ticket to an adventurous journey into new worlds of mental and physical experience - became the latest exciting fashion among academic youth, spreading rapidly from Harvard to other universities. Leary's doctrine - that LSD not only served to find the divine and to discover the self, but indeed was the most potent aphrodisiac yet discovered - surely contributed quite decisively to the rapid propagation of LSD consumption among the younger generation. Later, in an interview with the monthly magazine Playboy, Leary said that the intensification of sexual experience and the potentiation of sexual ecstasy by LSD was one of the chief reasons for the LSD boom. After his expulsion from Harvard University, Leary was completely transformed from a psychology lecturer pursuing research into the messiah of the psychedelic movement. He and his friends of the IFIF founded a psychedelic research center in lovely, scenic surroundings in Zihuatanejo, Mexico. I received a personal invitation from Dr.
Leary to participate in a top-level planning session on psychedelic drugs, scheduled to take place there in August 1963. I would gladly have accepted this grand invitation, in which I was offered reimbursement for travel expenses and free lodging, in order to learn from personal observation the methods, operation, and the entire atmosphere of such a psychedelic research center, about which contradictory, to some extent very remarkable, reports were then circulating. Unfortunately, professional obligations kept me at that moment from flying to Mexico to get a picture at first hand of the controversial enterprise. The Zihuatanejo Research Center did not last long. Leary and his adherents were expelled from the country by the Mexican government. Leary, however, who had now become not only the messiah but also the martyr of the psychedelic movement, soon received help from the young New York millionaire William Hitchcock, who made a manorial house on his large estate in Millbrook, New York, available to Leary as a new home and headquarters. Millbrook was also the home of another foundation for the psychedelic, transcendental way of life, the Castalia Foundation. On a trip to India in 1965 Leary was converted to Hinduism. In the following year he founded a religious community, the League for Spiritual Discovery, whose initials give the abbreviation "LSD." Leary's proclamation to youth, condensed in his famous slogan "Turn on, tune in, drop out!", became a central dogma of the hippie movement. Leary is one of the founding fathers of the hippie cult. The last of these three precepts, "drop out," was the challenge to escape from bourgeois life, to turn one's back on society, to give up school, studies, and employment, and to dedicate oneself wholly to the true inner universe, the study of one's own nervous system, after one has turned on with LSD. This challenge above all went beyond the psychological and religious domain to assume social and political significance.
It is therefore understandable that Leary not only became the enfant terrible of the university and among his academic colleagues in psychology and psychiatry, but also earned the wrath of the political authorities. He was, therefore, placed under surveillance, followed, and ultimately locked in prison. The harsh sentences - ten years' imprisonment each for convictions in Texas and California concerning possession of LSD and marijuana, and a conviction (later overturned) with a sentence of thirty years' imprisonment for marijuana smuggling - show that the punishment of these offenses was only a pretext: the real aim was to put under lock and key the seducer and instigator of youth, who could not otherwise be prosecuted. On the night of 13-14 September 1970, Leary managed to escape from the California prison in San Luis Obispo. After a detour through Algeria, where he made contact with Eldridge Cleaver, a leader of the Black Panther movement living there in exile, Leary came to Switzerland and petitioned for political asylum.

Meeting with Timothy Leary

Dr. Leary lived with his wife, Rosemary, in the resort town Villars-sur-Ollon in western Switzerland. Through the intercession of Dr. Mastronardi, Dr. Leary's lawyer, contact was established between us. On 3 September 1971, I met Dr. Leary in the railway station snack bar in Lausanne. The greeting was cordial, a symbol of our fateful relationship through LSD. Leary was medium-sized, slender, resiliently active, his brown face surrounded with slightly curly hair mixed with gray, youthful, with bright, laughing eyes. This gave Leary somewhat the mark of a tennis champion rather than that of a former Harvard lecturer. We traveled by automobile to Buchillons, where in the arbor of the restaurant A la Grande Foret, over a meal of fish and a glass of white wine, the dialogue between the father and the apostle of LSD finally began.
I voiced my regret that the investigations with LSD and psilocybin at Harvard University, which had begun promisingly, had degenerated to such an extent that their continuance in an academic milieu became impossible. My most serious remonstrance to Leary, however, concerned the propagation of LSD use among juveniles. Leary did not attempt to refute my opinions about the particular dangers of LSD for youth. He maintained, however, that I was unjustified in reproaching him for the seduction of immature persons to drug consumption, because teenagers in the United States, with regard to information and life experience, were comparable to adult Europeans. Maturity, with satiation and intellectual stagnation, would be reached very early in the United States. For that reason, he deemed the LSD experience significant, useful, and enriching, even for people still very young in years. In this conversation, I further objected to the great publicity that Leary sought for his LSD and psilocybin investigations, since he had invited reporters from daily papers and magazines to his experiments and had mobilized radio and television. Emphasis was thereby placed on publicity rather than on objective information. Leary defended this publicity program because he felt it had been his fateful historic role to make LSD known worldwide. The overwhelmingly positive effects of such dissemination, above all among America's younger generation, would make any trifling injuries, any regrettable accidents as a result of improper use of LSD, unimportant in comparison, a small price to pay. During this conversation, I ascertained that one did Leary an injustice by indiscriminately describing him as a drug apostle. He made a sharp distinction between psychedelic drugs - LSD, psilocybin, mescaline, hashish - of whose salutary effects he was persuaded, and the addicting narcotics morphine, heroin, etc., against whose use he repeatedly cautioned. My impression of Dr. 
Leary in this personal meeting was that of a charming personage, convinced of his mission, who defended his opinions with humor yet uncompromisingly; a man who truly soared high in the clouds pervaded by beliefs in the wondrous effects of psychedelic drugs and the optimism resulting therefrom, and thus a man who tended to underrate or completely overlook practical difficulties, unpleasant facts, and dangers. Leary also showed carelessness regarding charges and dangers that concerned his own person, as his further path in life emphatically showed. During his Swiss sojourn, I met Leary by chance once more, in February 1972, in Basel, on the occasion of a visit by Michael Horowitz, curator of the Fitz Hugh Ludlow Memorial Library in San Francisco, a library specializing in drug literature. We traveled together to my house in the country near Burg, where we resumed our conversation of the previous September. Leary appeared fidgety and detached, probably owing to a momentary indisposition, so that our discussions were less productive this time. That was my last meeting with Dr. Leary. He left Switzerland at the end of the year, having separated from his wife, Rosemary, now accompanied by his new friend Joanna Harcourt-Smith. After a short stay in Austria, where he assisted in a documentary film about heroin, Leary and friend traveled to Afghanistan. At the airport in Kabul he was apprehended by agents of the American secret service and brought back to the San Luis Obispo prison in California. After nothing had been heard from Leary for a long time, his name again appeared in the daily papers in summer 1975 with the announcement of a parole and early release from prison. But he was not set free until early in 1976. 
I learned from his friends that he was now occupied with psychological problems of space travel and with the exploration of cosmic relationships between the human nervous system and interstellar space - that is, with problems whose study would bring him no further difficulties on the part of governmental authorities.

Travels in the Universe of the Soul

Thus the Islamic scholar Dr. Rudolf Gelpke entitled his accounts of self-experiments with LSD and psilocybin, which appeared in the publication Antaios for January 1962, and this title could also be used for the following descriptions of LSD experiments. LSD trips and the space flights of the astronauts are comparable in many respects. Both enterprises require very careful preparations, as regards both safety measures and objectives, in order to minimize dangers and to derive the most valuable results possible. The astronauts cannot remain in space, nor the LSD experimenters in transcendental spheres; they have to return to earth and everyday reality, where the newly acquired experiences must be evaluated. The following reports were selected in order to demonstrate how varied the experiences of LSD inebriation can be. The particular motivation for undertaking the experiments was also decisive in their selection. Without exception, this selection involves only reports by persons who have tried LSD not simply out of curiosity or as a sophisticated pleasure drug, but who rather experimented with it in the quest for expanded possibilities of experience of the inner and outer world; who attempted, with the help of this drug key, to unlock new "doors of perception" (William Blake); or, to continue with the comparison chosen by Rudolf Gelpke, who employed LSD to surmount the force of gravity of space and time in the accustomed world view, in order to arrive thereby at new outlooks and understandings in the "universe of the soul."
The first two of the following research records are taken from the previously cited report by Rudolf Gelpke in Antaios.

Dance of the Spirits in the Wind (0.075 mg LSD on 23 June 1961, 13:00 hours)

After I had ingested this dose, which could be considered average, I conversed very animatedly with a professional colleague until approximately 14:00 hours. Following this, I proceeded alone to the Werthmuller bookstore, where the drug now began to act most unmistakably. I discerned, above all, that the subjects of the books in which I rummaged peacefully in the back of the shop were indifferent to me, whereas random details of my surroundings suddenly stood out strongly and somehow appeared to be "meaningful." . . . Then, after some ten minutes, I was discovered by a married couple known to me, and had to let myself become involved in a conversation with them that, I admit, was by no means pleasant to me, though not really painful either. I listened to the conversation (even to myself) "as from far away." The things that were discussed (the conversation dealt with Persian stories that I had translated) "belonged to another world": a world about which I could indeed express myself (I had, after all, recently still inhabited it myself and remembered the "rules of the game"!), but to which I no longer possessed any emotional connection. My interest in it was obliterated - only I did not dare to let myself observe that. After I managed to dismiss myself, I strolled farther through the city to the marketplace. I had no "visions," saw and heard everything as usual, and yet everything was also altered in an indescribable way; "imperceptible glassy walls" everywhere. With every step that I took, I became more and more like an automaton. It especially struck me that I seemed to lose control over my facial musculature - I was convinced that my face had grown stiff, completely expressionless, empty, slack, and masklike.
The only reason I could still walk and put myself in motion was because I remembered that, and how, I had "earlier" gone and moved myself. But the farther back the recollection went, the more uncertain I became. I remember that my own hands somehow were in my way: I put them in my pockets, let them dangle, entwined them behind my back . . . like burdensome objects that must be dragged around with us and that no one knows quite how to stow away. I had the same reaction concerning my whole body. I no longer knew why it was there, and where I should go with it. All sense for decisions of that kind had been lost. They could only be reconstructed laboriously, taking a detour through memories from the past. It took a struggle of this kind to enable me to cover the short distance from the marketplace to my home, which I reached at about 15:10. In no way had I had the feeling of being inebriated. What I experienced was rather a gradual mental extinction. It was not at all frightening; but I can imagine that in the transition to certain mental disturbances - naturally dispersed over a greater interval - a very similar process happens: as long as the recollection of the former individual existence in the human world is still present, the patient who has become unconnected can still (to some extent) find his way about in the world; later, however, when the memories fade and ultimately die out, he completely loses this ability. Shortly after I had entered my room, the "glassy stupor" gave way. I sat down, with a view out of a window, and was at once enraptured: the window was opened wide, the diaphanous gossamer curtains, on the other hand, were drawn, and now a mild wind from the outside played with these veils and with the silhouettes of potted plants and leafy tendrils on the sill behind, which the sunlight delineated on the curtains breathing in the breeze. This spectacle captivated me completely.
I "sank" into it, saw only this gentle and incessant waving and rocking of the plant shadows in the sun and the wind. I knew what "it" was, but I sought after the name for it, after the formula, after the "magic word" that I knew - and already I had it: Totentanz, the dance of the dead. . . . This was what the wind and the light were showing me on the screen of gossamer. Was it frightening? Was I afraid? Perhaps - at first. But then a great cheerfulness infiltrated me, and I heard the music of silence, and even my soul danced with the redeemed shadows to the whistle of the wind. Yes, I understood: this is the curtain, and this curtain itself IS the secret, the "ultimate" that it concealed. Why, therefore, tear it up? He who does that only tears up himself. Because "there behind," behind the curtain, is "nothing." . . .

Polyp from the Deep (0.150 mg LSD on 15 April 1961, 9:15 hours)

Beginning of the effect already after about 30 minutes, with strong inner agitation, trembling hands, skin chills, taste of metal on the palate. 10:00: The environment of the room transforms itself into phosphorescent waves, running up from the feet through my body. The skin - and above all the toes - feels as if electrically charged; a still constantly growing excitement hinders all clear thoughts. . . . 10:20: I lack the words to describe my current condition. It is as if an "other," a complete stranger, were seizing possession of me bit by bit. Have greatest trouble writing ("inhibited" or "uninhibited"? - I don't know!). This sinister process of an advancing self-estrangement aroused in me the feeling of powerlessness, of being helplessly delivered up. Around 10:30, through closed eyes I saw innumerable, self-intertwining threads on a red background. A sky as heavy as lead appeared to press down on everything; I felt my ego compressed in itself, and I felt like a withered dwarf. . . .
Shortly before 13:00 I escaped the increasingly oppressive atmosphere of the company in the studio, in which we only hindered one another from unfolding completely into the inebriation. I sat down in a small, empty room, on the floor, with my back to the wall, and saw through the only window, on the narrow frontage opposite me, a bit of gray-white cloudy sky. This, like the whole environment in general, appeared to be hopelessly normal at this moment. I was dejected, and my self seemed so repulsive and hateful to me that I had not dared (and on this day had actually repeatedly and desperately avoided) to look in a mirror or in the face of another person. I very much wished this inebriation were finally finished, but it still had my body totally in its possession. I imagined that I perceived, deep within its stubborn oppressive weight, how it held my limbs surrounded with a hundred polyp arms - yes, I actually experienced this in a mysterious rhythm: electrified contacts, as of a real, indeed imperceptible, but sinister omnipresent being, which I addressed with a loud voice, reviled, bid, and challenged to open combat. "It is only the projection of evil in your self," another voice assured me. "It is your soul monster!" This perception was like a flashing sword. It passed through me with redeeming sharpness. The polyp arms fell away from me - as if cut through - and simultaneously the hitherto dull and gloomy gray-white of the sky behind the open window suddenly scintillated like sunlit water. As I stared at it, so enchanted, it changed (for me!) to real water: a subterranean spring overran me, which had ruptured there all at once and now boiled up toward me, wanted to become a storm, a lake, an ocean, with millions and millions of drops - and on all of these drops, on every single one of them, the light danced. . . .
As the room, window, and sky came back into my consciousness (it was 13:25 hours), the inebriation was certainly not at an end - not yet - but its rearguard, which passed by me during the ensuing two hours, very much resembled the rainbow that follows the storm. Both the estrangement from the environment and the estrangement from the individual body, experienced in both of the preceding experiments described by Gelpke - as well as the feeling of an alien being, a demon, seizing possession of oneself - are features of LSD inebriation that, in spite of all the other diversity and variability of the experience, are cited in most research reports. I have already described the possession by the LSD demon as an uncanny experience in my first planned self-experiment. Anxiety and terror then affected me especially strongly, because at that time I had no way of knowing that the demon would again release his victim. The adventures described in the following report, by a painter, belong to a completely different type of LSD experience. This artist visited me in order to obtain my opinion about how the experience under LSD should be understood and interpreted. He feared that the profound transformation of his personal life, which had resulted from his experiment with LSD, could rest on a mere delusion. My explanation - that LSD, as a biochemical agent, only triggered his visions but had not created them, and that these visions rather originated from his own soul - gave him confidence in the meaning of his transformation.

LSD Experience of a Painter

. . . Therefore I traveled with Eva to a solitary mountain valley. Up there in nature, I thought it would be particularly beautiful with Eva. Eva was young and attractive. Twenty years older than she, I was already in the middle of life.
Despite the sorrowful consequences that I had experienced previously, as a result of erotic escapades, despite the pain and the disappointments that I inflicted on those who loved me and had believed in me, I was drawn again with irresistible power to this adventure, to Eva, to her youth. I was under the spell of this girl. Our affair indeed was only beginning, but I felt this seductive power more strongly than ever before. I knew that I could no longer resist. For the second time in my life I was again ready to desert my family, to give up my position, to break all bridges. I wanted to hurl myself uninhibitedly into this lustful inebriation with Eva. She was life, youth. Over and over it cried out in me: again and again to drain the cup of lust and life to the last drop, until death and perdition. Let the Devil fetch me later on! I had indeed long ago done away with God and the Devil. They were for me only human inventions, which came to be utilized by a skeptical, unscrupulous minority, in order to suppress and exploit a believing, naive majority. I wanted to have nothing to do with this mendacious social morality. I wished to enjoy at all costs - et après nous le déluge (after us, the flood). "What is wife to me, what is child to me - let them go begging, if they are hungry." I also perceived the institution of marriage as a social lie. The marriage of my parents and the marriages of my acquaintances seemed to confirm that sufficiently for me. Couples remained together because it was more convenient; they were accustomed to it, and "yes, if it weren't for the children . . ." Under the pretense of a good marriage, each tormented the other emotionally, to the point of rashes and stomach ulcers, or each went his own way. Everything in me rebelled against the thought of having to love only one and the same woman my whole life long. I frankly perceived that as repugnant and unnatural. Thus stood my inner disposition on that portentous summer evening at the mountain lake.
At seven o'clock in the evening both of us took a moderately strong dose of LSD, some 0.1 milligrams. Then we strolled along the lake and sat on the bank. We threw stones in the water and watched the wave circles that formed. We felt a slight inner restlessness. Around eight o'clock we entered the hotel lounge and ordered tea and sandwiches. Some guests still sat there, telling jokes and laughing loudly. They winked at us. Their eyes sparkled strangely. We felt strange and distant and had the feeling that they would notice something in us. Outside it slowly became dark. We decided only reluctantly to go to our hotel room. A street without lights led along the black lake to the distant guest house. As I switched on the light, the granite staircase, leading from the shore road to the house, appeared to flame up from step to step. Eva quivered all at once, frightened. "Hellish" went through my mind, and all of a sudden horror passed through my limbs, and I knew: now it's going to turn out badly. From afar, from the village, a clock struck nine. Scarcely were we in our room, when Eva threw herself on the bed and looked at me with wide eyes. It was not in the least possible to think of love. I sat down on the edge of the bed and held both of Eva's hands. Then came the terror. We sank into a deep, indescribable horror, which neither of us understood. "Look in my eyes, look at me," I implored Eva, yet again and again her gaze was averted from me, and then she cried out loud in terror and trembled all over her body. There was no way out. Outside was only gloomy night and the deep, black lake. In the public house all the lights were extinguished; the people had probably gone to sleep. What would they have said if they could see us now? Possibly they would summon the police, and then everything would become still much worse. A drug scandal - intolerable, agonizing thoughts. We could no longer move from the spot.
We sat there surrounded by four wooden walls whose board joints shone infernally. It became more unbearable all the time. Suddenly the door was opened and "something dreadful" entered. Eva cried out wildly and hid herself under the bed covers. Once again a cry. The horror under the covers was yet worse. "Look straight in my eyes!" I called to her, but she rolled her eyes back and forth as though out of her mind. She is becoming insane, I realized. In desperation I seized her by the hair so that she could no longer turn her face away from me. I saw dreadful fear in her eyes. Everything around us was hostile and threatening, as if everything wanted to attack us in the next moment. You must protect Eva, you must bring her through until morning, then the effects will discontinue, I said to myself. Then again, however, I plunged into nameless horror. There was no more time or reason; it seemed as if this condition would never end. The objects in the room were animated to caricatures; everything on all sides sneered scornfully. I saw Eva's yellow-black striped shoes, which I had found so stimulating, appearing as two large, evil wasps crawling on the floor. The water piping above the washbasin changed to a dragon head, whose eyes, the two water taps, observed me malevolently. My first name, George, came into my mind, and all at once I felt like Knight George, who must fight for Eva. Eva's cries tore me from these thoughts. Bathed in perspiration and trembling, she fastened herself to me. "I am thirsty," she moaned. With great effort, without releasing Eva's hand, I succeeded in getting a glass of water for her. But the water seemed slimy and viscous, was poisonous, and we could not quench our thirst with it. The two night-table lamps glowed with a strange brightness, in an infernal light. The clock struck twelve. This is hell, I thought. 
There is indeed no Devil and no demons, and yet they were perceptible in us, filled up the room, and tormented us with unimaginable terror. Imagination, or not? Hallucinations, projections? - insignificant questions when confronted with the reality of fear that was fixed in our bodies and shook us: the fear alone, it existed. Some passages from Huxley's book The Doors of Perception came to me and brought me brief comfort. I looked at Eva, at this whimpering, horrified being in her torment, and felt great remorse and pity. She had become strange to me; I scarcely recognized her any longer. She wore a fine golden chain around her neck with the medallion of the Virgin Mary. It was a gift from her younger brother. I noticed how a benevolent, comforting radiation, which was connected with pure love, emanated from this necklace. But then the terror broke loose again, as if to our final destruction. I needed my whole strength to constrain Eva. Loudly I heard the electrical meter ticking weirdly outside of the door, as if it wanted to make a most important, evil, devastating announcement to me in the next moment. Disdain, derision, and malignity again whispered out of all nooks and crevices. There, in the midst of this agony, I perceived the ringing of cowbells from afar as a wonderful, promising music. Yet soon it became silent again, and renewed fear and dread once again set in. As a drowning man hopes for a rescuing plank, so I wished that the cows would yet again want to draw near the house. But everything remained quiet, and only the threatening tick and hum of the current meter buzzed round us like an invisible, malevolent insect. Morning finally dawned. With great relief I noticed how the chinks in the window shutters lit up. Now I could leave Eva to herself; she had quieted down. Exhausted, she closed her eyes and fell asleep. Shocked and deeply sad, I still sat on the edge of the bed.
Gone was my pride and self-assurance; all that remained of me was a small heap of misery. I examined myself in the mirror and started: I had become ten years older in the course of the night. Downcast, I stared at the light of the night-table lamp with the hideous shade of intertwined plastic cords. All at once the light seemed to become brighter, and in the plastic cords it began to sparkle and to twinkle; it glowed like diamonds and gems of all colors, and an overwhelming feeling of happiness welled up in me. All at once, lamp, room, and Eva disappeared, and I found myself in a wonderful, fantastic landscape. It was comparable to the interior of an immense Gothic church nave, with infinitely many columns and Gothic arches. These consisted, however, not of stone, but rather of crystal. Bluish, yellowish, milky, and clearly transparent crystal columns surrounded me like trees in an open forest. Their points and arches became lost in dizzying heights. A bright light appeared before my inner eye, and a wonderful, gentle voice spoke to me out of the light. I did not hear it with my external ear, but rather perceived it, as if it were clear thoughts that arise in one. I realized that in the horror of the passing night I had experienced my own individual condition: selfishness. My egotism had kept me separated from mankind and had led me to inner isolation. I had loved only myself, not my neighbor; loved only the gratification that the other offered me. The world had existed only for the satisfaction of my greed. I had become tough, cold, and cynical. Hell, therefore, had signified that: egotism and lovelessness. Therefore everything had seemed strange and unconnected to me, so scornful and threatening. Amid flowing tears, I was enlightened with the knowledge that true love means surrender of selfishness and that it is not desires but rather selfless love that forms the bridge to the heart of our fellow man. Waves of ineffable happiness flowed through my body.
I had experienced the grace of God. But how could it be possible that it was radiating toward me, particularly out of this cheap lampshade? Then the inner voice answered: God is in everything. The experience at the mountain lake has given me the certainty that beyond the ephemeral, material world there also exists an imperishable, spiritual reality, which is our true home. I am now on my way home. For Eva everything remained just a bad dream. We broke up a short time thereafter.

The following notes kept by a twenty-five-year-old advertising agent are contained in The LSD Story by John Cashman (Fawcett Publications, Greenwich, Conn., 1966). They were included in this selection of LSD reports, along with the preceding example, because the progression that they describe - from terrifying visions to extreme euphoria, a kind of death-rebirth cycle - is characteristic of many LSD experiments.

A Joyous Song of Being

My first experience with LSD came at the home of a close friend who served as my guide. The surroundings were comfortably familiar and relaxing. I took two ampuls (200 micrograms) of LSD mixed in half a glass of distilled water. The experience lasted for close to eleven hours, from 8 o'clock on a Saturday evening until very nearly 7 o'clock the next morning. I have no firm point of comparison, but I am positive that no saint ever saw more glorious or joyously beautiful visions or experienced a more blissful state of transcendence. My powers to convey the miracles are shabby and far too inadequate to the task at hand. A sketch, and an artless one at that, must suffice where only the hand of a great master working from a complete palette could do justice to the subject. I must apologize for my own limitations in this feeble attempt to reduce the most remarkable experience of my life to mere words.
My superior smile at the fumbling, halting attempts of others to explain the heavenly visions to me has been transformed into the knowing smile of a conspirator - the common experience requires no words. My first thought after drinking the LSD was that it was having absolutely no effect. They had told me thirty minutes would produce the first sensation, a tingling of the skin. There was no tingling. I commented on this and was told to relax and wait. For lack of anything else to do I stared at the dial light of the table radio, nodding my head to a jazz piece I did not recognize. I think it was several minutes before I realized that the light was changing color kaleidoscopically with the different pitch of the musical sounds, bright reds and yellows in the high register, deep purple in the low. I laughed. I had no idea when it had started. I simply knew it had. I closed my eyes, but the colored notes were still there. I was overcome by the remarkable brilliance of the colors. I tried to talk, to explain what I was seeing, the vibrant and luminous colors. Somehow it didn't seem important. With my eyes open, the radiant colors flooded the room, folding over on top of one another in rhythm with the music. Suddenly I was aware that the colors were the music. The discovery did not seem startling. Values, so cherished and guarded, were becoming unimportant. I wanted to talk about the colored music, but I couldn't. I was reduced to uttering one-syllable words while polysyllabic impressions tumbled through my mind with the speed of light. The dimensions of the room were changing, now sliding into a fluttering diamond shape, then straining into an oval shape as if someone were pumping air into the room, expanding it to the bursting point. I was having trouble focusing on objects. They would melt into fuzzy masses of nothing or sail off into space, self-propelled, slow-motion trips that were of acute interest to me.
I tried to check the time on my watch, but I was unable to focus on the hands. I thought of asking for the time, but the thought passed. I was too busy seeing and listening. The sounds were exhilarating, the sights remarkable. I was completely entranced. I have no idea how long this lasted. I do know the egg came next. The egg, large, pulsating, and a luminous green, was there before I actually saw it. I sensed it was there. It hung suspended about halfway between where I sat and the far wall. I was intrigued by the beauty of the egg. At the same time I was afraid it would drop to the floor and break. I didn't want the egg to break. It seemed most important that the egg should not break. But even as I thought of this, the egg slowly dissolved and revealed a great multihued flower that was like no flower I have ever seen. Its incredibly exquisite petals opened on the room, spraying indescribable colors in every direction. I felt the colors and heard them as they played across my body, cool and warm, reedlike and tinkling. The first tinge of apprehension came later when I saw the center of the flower slowly eating away at the petals, a black, shiny center that appeared to be formed by the backs of a thousand ants. It ate away the petals at an agonizingly slow pace. I wanted to scream for it to stop or to hurry up. I was pained by the gradual disappearance of the beautiful petals as if being swallowed by an insidious disease. Then in a flash of insight I realized to my horror that the black thing was actually devouring me. I was the flower and this foreign, creeping thing was eating me! I shouted or screamed, I really don't remember. I was too full of fear and loathing. I heard my guide say: "Easy now. Just go with it. Don't fight it. Go with it." I tried, but the hideous blackness caused such repulsion that I screamed: "I can't! For God's sake help me! Help me!" The voice was soothing, reassuring: "Let it come. Everything is all right. Don't worry. Go with it.
Don't fight." I felt myself dissolving into the terrifying apparition, my body melting in waves into the core of blackness, my mind stripped of ego and life and, yes, even death. In one great crystal instant I realized that I was immortal. I asked the question: "Am I dead?" But the question had no meaning. Meaning was meaningless. Suddenly there was white light and the shimmering beauty of unity. There was light everywhere, white light with a clarity beyond description. I was dead and I was born and the exultation was pure and holy. My lungs were bursting with the joyful song of being. There was unity and life and the exquisite love that filled my being was unbounded. My awareness was acute and complete. I saw God and the devil and all the saints and I knew the truth. I felt myself flowing into the cosmos, levitated beyond all restraint, liberated to swim in the blissful radiance of the heavenly visions. I wanted to shout and sing of miraculous new life and sense and form, of the joyous beauty and the whole mad ecstasy of loveliness. I knew and understood all there is to know and understand. I was immortal, wise beyond wisdom, and capable of love, of all loves. Every atom of my body and soul had seen and felt God. The world was warmth and goodness. There was no time, no place, no me. There was only cosmic harmony. It was all there in the white light. With every fiber of my being I knew it was so. I embraced the enlightenment with complete abandonment. As the experience receded I longed to hold onto it and tenaciously fought against the encroachment of the realities of time and place. For me, the realities of our limited existence were no longer valid. I had seen the ultimate realities and there would be no others. As I was slowly transported back to the tyranny of clocks and schedules and petty hatreds, I tried to talk of my trip, my enlightenment, the horrors, the beauty, all of it. I must have been babbling like an idiot.
My thoughts swirled at a fantastic rate, but the words couldn't keep pace. My guide smiled and told me he understood. The preceding collection of reports on "travels in the universe of the soul," even though it encompasses such dissimilar experiences, is still not able to establish a complete picture of the broad spectrum of all possible reactions to LSD, which extends from the most sublime spiritual, religious, and mystical experiences, down to gross psychosomatic disturbances. Cases of LSD sessions have been described in which the stimulation of fantasy and of visionary experience, as expressed in the LSD reports assembled here, is completely absent, and the experimenter was for the whole time in a state of ghastly physical and mental discomfort, or even felt severely ill. Reports about the modification of sexual experience under the influence of LSD are also contradictory. Since stimulation of all sensory perception is an essential feature of LSD effects, the sensual orgy of sexual intercourse can undergo unimaginable enhancements. Cases have also been described, however, in which LSD led not to the anticipated erotic paradise, but rather to a purgatory or even to the hell of frightful extinction of every perception and to a lifeless vacuum. Such a variety and contradiction of reactions is found only with LSD and the related hallucinogens. The explanation for this lies in the complexity and variability of the conscious and subconscious minds of people, which LSD is able to penetrate and to bring to life as experienced reality.

_________________________________________________________________

6. The Mexican Relatives of LSD

The Sacred Mushroom Teonanacatl

Late in 1956 a notice in the daily paper caught my interest. Among some Indians in southern Mexico, American researchers had discovered mushrooms that were eaten in religious ceremonies and that produced an inebriated condition accompanied by hallucinations.
Since, outside of the mescaline cactus found also in Mexico, no other drug was known at the time that, like LSD, produced hallucinations, I would have liked to establish contact with these researchers, in order to learn details about these hallucinogenic mushrooms. But there were no names and addresses in the short newspaper article, so that it was impossible to get further information. Nevertheless, the mysterious mushrooms, whose chemical investigation would be a tempting problem, stayed in my thoughts from then on. As it later turned out, LSD was the reason that these mushrooms found their way into my laboratory, without my assistance, at the beginning of the following year. Through the mediation of Dr. Yves Dunant, at the time director of the Paris branch of Sandoz, an inquiry came to the pharmaceutical research management in Basel from Professor Roger Heim, director of the Laboratoire de Cryptogamie of the Museum National d'Histoire Naturelle in Paris, asking whether we were interested in carrying out the chemical investigation of the Mexican hallucinogenic mushrooms. With great joy I declared myself ready to begin this work in my department, in the laboratories for natural product research. That was to be my link to the exciting investigations of the Mexican sacred mushrooms, which were already broadly advanced in the ethnomycological and botanical aspects. For a long time the existence of these magic mushrooms had remained an enigma. The history of their rediscovery is presented at first hand in the magnificent two-volume standard work of ethnomycology, Mushrooms, Russia and History (Pantheon Books, New York, 1957), for the authors, the American researchers Valentina Pavlovna Wasson and her husband, R. Gordon Wasson, played a decisive role in this rediscovery. The following descriptions of the fascinating history of these mushrooms are taken from the Wassons' book.
The first written evidence of the use of inebriating mushrooms on festival occasions, or in the course of religious ceremonies and magically oriented healing practices, is found among the Spanish chroniclers and naturalists of the sixteenth century, who entered the country soon after the conquest of Mexico by Hernan Cortes. The most important of these witnesses is the Franciscan friar Bernardino de Sahagun, who mentions the magic mushrooms and describes their effects and their use in several passages of his famous historical work, Historia General de las Cosas de Nueva Espana, written between the years 1529 and 1590. Thus he describes, for example, how merchants celebrated the return home from a successful business trip with a mushroom party: Coming at the very first, at the time of feasting, they ate mushrooms when, as they said, it was the hour of the blowing of the flutes. Not yet did they partake of food; they drank only chocolate during the night. And they ate mushrooms with honey. When already the mushrooms were taking effect, there was dancing, there was weeping.... Some saw in a vision that they would die in war. Some saw in a vision that they would be devoured by wild beasts.... Some saw in a vision that they would become rich, wealthy. Some saw in a vision that they would buy slaves, would become slave owners. Some saw in a vision that they would commit adultery [and so] would have their heads bashed in, would be stoned to death.... Some saw in a vision that they would perish in the water. Some saw in a vision that they would pass to tranquility in death. Some saw in a vision that they would fall from the housetop, tumble to their death. . . . All such things they saw.... And when [the effects of] the mushroom ceased, they conversed with one another, spoke of what they had seen in the vision.
In a publication from the same period, Diego Duran, a Dominican friar, reported that inebriating mushrooms were eaten at the great festivity on the occasion of the accession to the throne of Moctezuma II, the famed emperor of the Aztecs, in the year 1502. A passage in the seventeenth-century chronicle of Don Jacinto de la Serna refers to the use of these mushrooms in a religious framework: And what happened was that there had come to [the village] an Indian . . . and his name was Juan Chichiton . . . and he had brought the red-colored mushrooms that are gathered in the uplands, and with them he had committed a great idolatry.... In the house where everyone had gathered on the occasion of a saint's feast . . . the teponastli [an Aztec percussion instrument] was playing and singing was going on the whole night through. After most of the night had passed, Juan Chichiton, who was the priest for that solemn rite, to all of those present at the fiesta gave the mushrooms to eat, after the manner of Communion, and gave them pulque to drink. . . so that they all went out of their heads, a shame it was to see. In Nahuatl, the language of the Aztecs, these mushrooms were described as teonanacatl, which can be translated as "sacred mushroom." There are indications that ceremonial use of such mushrooms reaches far back into pre-Columbian times. So-called mushroom stones have been found in El Salvador, Guatemala, and the contiguous mountainous districts of Mexico. These are stone sculptures in the form of a pileate mushroom, on whose stem the face or the form of a god or an animallike demon is carved. Most are about 30 cm high. The oldest examples, according to archaeologists, date back to before 500 B.C. R. G. Wasson argues, quite convincingly, that there is a connection between these mushroom stones and teonanacatl. If true, this means that the mushroom cult, the magico-medicinal and religious-ceremonial use of the magic mushrooms, is more than two thousand years old.
To the Christian missionaries, the inebriating, vision- and hallucination-producing effects of these mushrooms seemed to be Devil's work. They therefore tried, with all the means in their power, to extirpate their use. But they succeeded only partially, for the Indians have continued secretly down to our time to utilize the mushroom teonanacatl, which was sacred to them. Strange to say, the reports in the old chronicles about the use of magic mushrooms remained unnoticed during the following centuries, probably because they were considered products of the imagination of a superstitious age. All traces of the existence of "sacred mushrooms" were in danger of becoming obliterated once and for all, when, in 1915, an American botanist of repute, Dr. W. E. Safford, in an address before the Botanical Society in Washington and in a scientific publication, advanced the thesis that no such thing as magic mushrooms had ever existed at all: the Spanish chroniclers had taken the mescaline cactus for a mushroom! Even if false, this proposition of Safford's served nevertheless to direct the attention of the scientific world to the riddle of the mysterious mushrooms. It was the Mexican physician Dr. Blas Pablo Reko who first openly disagreed with Safford's interpretation and who found evidence that mushrooms were still employed in medicinal-religious ceremonies even in our time, in remote districts of the southern mountains of Mexico. But not until the years 1936-38 did the anthropologist Robert J. Weitlaner and Dr. Richard Evans Schultes, a botanist from Harvard University, find actual mushrooms in that region, which were used there for this ceremonial purpose; and only in 1938 could a group of young American anthropologists, under the direction of Jean Bassett Johnson, attend a secret nocturnal mushroom ceremony for the first time. This was in Huautla de Jimenez, the capital of the Mazatec country, in the State of Oaxaca.
But these researchers were only spectators; they were not permitted to partake of the mushrooms. Johnson reported on the experience in a Swedish journal (Ethnological Studies 9, 1939). Then exploration of the magic mushrooms was interrupted. World War II broke out. Schultes, at the behest of the American government, had to occupy himself with rubber production in the Amazon territory, and Johnson was killed after the Allied landing in North Africa. It was the American researchers, the married couple Dr. Valentina Pavlovna Wasson and her husband, R. Gordon Wasson, who again took up the problem from the ethnographic aspect. R. G. Wasson was a banker, vice-president of the J. P. Morgan Co. in New York. His wife, who died in 1958, was a pediatrician. The Wassons began their work in 1953, in the Mazatec village Huautla de Jimenez, where fifteen years earlier J. B. Johnson and others had established the continued existence of the ancient Indian mushroom cult. They received especially valuable information from an American missionary who had been active there for many years, Eunice V. Pike, member of the Wycliffe Bible Translators. Thanks to her knowledge of the native language and her ministerial association with the inhabitants, Pike had information about the significance of the magic mushrooms that nobody else possessed. During several lengthy sojourns in Huautla and environs, the Wassons were able to study the present use of the mushrooms in detail and compare it with the descriptions in the old chronicles. This showed that the belief in the "sacred mushrooms" was still prevalent in that region. However, the Indians kept their beliefs a secret from strangers. It took great tact and skill, therefore, to gain the confidence of the indigenous population and to receive insight into this secret domain. In the modern form of the mushroom cult, the old religious ideas and customs are mingled with Christian ideas and Christian terminology.
Thus the mushrooms are often spoken of as the blood of Christ, because they will grow only where a drop of Christ's blood has fallen on the earth. According to another notion, the mushrooms sprout where a drop of saliva from Christ's mouth has moistened the ground, and it is therefore Jesus Christ himself who speaks through the mushrooms. The mushroom ceremony follows the form of a consultation. The seeker of advice or a sick person or his or her family questions a "wise man" or a "wise woman," a sabio or sabia, also named curandero or curandera, in return for a modest payment. Curandero can best be translated into English as "healing priest," for his function is that of a physician as well as that of a priest, both being found only rarely in these remote regions. In the Mazatec language the healing priest is called co-ta-ci-ne, which means "one who knows." He eats the mushroom in the framework of a ceremony that always takes place at night. The other persons present at the ceremony may sometimes receive mushrooms as well, yet a much greater dose always goes to the curandero. The performance is executed with the accompaniment of prayers and entreaties, while the mushrooms are incensed briefly over a basin, in which copal (an incense-like resin) is burned. In complete darkness, at times by candlelight, while the others present lie quietly on their straw mats, the curandero, kneeling or sitting, prays and sings before a type of altar bearing a crucifix, an image of a saint, or some other object of worship. Under the influence of the sacred mushrooms, the curandero counsels in a visionary state, in which even the inactive observers more or less participate. In the monotonous song of the curandero, the mushroom teonanacatl gives its answers to the questions posed.
It says whether the diseased person will live or die, which herbs will effect the cure; it reveals who has killed a specific person, or who has stolen the horse; or it makes known how a distant relative fares, and so forth. The mushroom ceremony not only has the function of a consultation of the type described; for the Indians it also has a meaning in many respects similar to the Holy Communion for the believing Christian. From many utterances of the natives it could be inferred that they believe that God has given the Indians the sacred mushroom because they are poor and possess no doctors and medicines; and also, because they cannot read, in particular the Bible, God can therefore speak directly to them through the mushroom. The missionary Eunice V. Pike even alluded to the difficulties that result from explaining the Christian message, the written word, to a people who believe they possess a means - the sacred mushrooms of course - to make God's will known to them in a direct, clear manner: yes, the mushrooms permit them to see into heaven and to establish communication with God himself. The Indians' reverence for the sacred mushrooms is also evident in their belief that they can be eaten only by a "clean" person. "Clean" here means ceremonially clean, and that term among other things includes sexual abstinence at least four days before and after ingestion of the mushrooms. Certain rules must also be observed in gathering the mushrooms. With nonobservance of these commandments, the mushrooms can make the person who eats them insane, or can even kill. The Wassons had undertaken their first expedition to the Mazatec country in 1953, but not until 1955 did they succeed in overcoming the shyness and reserve of the Mazatec friends they had managed to make, to the point of being admitted as active participants in a mushroom ceremony. R.
Gordon Wasson and his companion, the photographer Allan Richardson, were given sacred mushrooms to eat at the end of June 1955, on the occasion of a nocturnal mushroom ceremony. They thereby became in all likelihood the first outsiders, the first whites, ever permitted to take teonanacatl. In the second volume of Mushrooms, Russia and History, in enraptured words, Wasson describes how the mushroom seized possession of him completely, although he had tried to struggle against its effects, in order to be able to remain an objective observer. First he saw geometric, colored patterns, which then took on architectural characteristics. Next followed visions of splendid colonnades, palaces of supernatural harmony and magnificence embellished with precious gems, triumphal cars drawn by fabulous creatures as they are known only from mythology, and landscapes of fabulous luster. Detached from the body, the spirit soared timelessly in a realm of fantasy among images of a higher reality and deeper meaning than those of the ordinary, everyday world. The essence of life, the ineffable, seemed to be on the verge of being unlocked, but the ultimate door failed to open. This experience was the final proof, for Wasson, that the magical powers attributed to the mushrooms actually existed and were not merely superstition. In order to introduce the mushrooms to scientific research, Wasson had earlier established an association with mycologist Professor Roger Heim of Paris. Accompanying the Wassons on further expeditions into the Mazatec country, Heim conducted the botanical identification of the sacred mushrooms. He showed that they were gilled mushrooms from the family Strophariaceae, about a dozen different species not previously described scientifically, the greatest part belonging to the genus Psilocybe. Professor Heim also succeeded in cultivating some of the species in the laboratory. The mushroom Psilocybe mexicana turned out to be especially suitable for artificial cultivation. 
Chemical investigations ran parallel with these botanical studies on the magic mushrooms, with the goal of extracting the hallucinogenically active principle from the mushroom material and preparing it in chemically pure form. Such investigations were carried out at Professor Heim's instigation in the chemical laboratory of the Museum National d'Histoire Naturelle in Paris, and work teams were occupied with this problem in the United States in the research laboratories of two large pharmaceutical companies: Merck, and Smith, Kline and French. The American laboratories had obtained some of the mushrooms from R. G. Wasson and had gathered others themselves in the Sierra Mazateca. As the chemical investigations in Paris and in the United States turned out to be ineffectual, Professor Heim addressed this matter to our firm, as mentioned at the beginning of this chapter, because he felt that our experimental experience with LSD, related to the magic mushrooms by similar activity, could be of use in the isolation attempts. Thus it was LSD that showed teonanacatl the way into our laboratory. As director of the department of natural products of the Sandoz pharmaceutical-chemical research laboratories at that time, I wanted to assign the investigation of the magic mushrooms to one of my coworkers. However, nobody showed much eagerness to take on this problem, because it was known that LSD and everything connected with it were scarcely popular subjects with the top management. Because the enthusiasm necessary for successful endeavors cannot be commanded, and because the enthusiasm was already present in me as far as this problem was concerned, I decided to conduct the investigation myself. Some 100 g of dried mushrooms of the species Psilocybe mexicana, cultivated by Professor Heim in the laboratory, were available for the beginning of the chemical analysis.
My laboratory assistant, Hans Tscherter, who, during our decade-long collaboration, had developed into a very capable helper, completely familiar with my manner of work, aided me in the extraction and isolation attempts. Since there were no clues at all concerning the chemical properties of the active principle we sought, the isolation attempts had to be conducted on the basis of the effects of the extract fractions. But none of the various extracts showed an unequivocal effect, either in the mouse or the dog, that could have pointed to the presence of hallucinogenic principles. It therefore became doubtful whether the mushrooms cultivated and dried in Paris were still active at all. That could only be determined by experimenting with this mushroom material on a human being. As in the case of LSD, I made this fundamental experiment myself, since it is not appropriate for researchers to ask anyone else to perform self-experiments that they require for their own investigations, especially if they entail, as in this case, a certain risk. In this experiment I ate 32 dried specimens of Psilocybe mexicana, which together weighed 2.4 g. This amount corresponded to an average dose, according to the reports of Wasson and Heim, as used by the curanderos. The mushrooms displayed a strong psychic effect, as the following extract from the report on that experiment shows: Thirty minutes after my taking the mushrooms, the exterior world began to undergo a strange transformation. Everything assumed a Mexican character. As I was perfectly well aware that my knowledge of the Mexican origin of the mushroom would lead me to imagine only Mexican scenery, I tried deliberately to look on my environment as I knew it normally. But all voluntary efforts to look at things in their customary forms and colors proved ineffective. Whether my eyes were closed or open, I saw only Mexican motifs and colors.
When the doctor supervising the experiment bent over me to check my blood pressure, he was transformed into an Aztec priest and I would not have been astonished if he had drawn an obsidian knife. In spite of the seriousness of the situation, it amused me to see how the Germanic face of my colleague had acquired a purely Indian expression. At the peak of the intoxication, about 1 1/2 hours after ingestion of the mushrooms, the rush of interior pictures, mostly abstract motifs rapidly changing in shape and color, reached such an alarming degree that I feared that I would be torn into this whirlpool of form and color and would dissolve. After about six hours the dream came to an end. Subjectively, I had no idea how long this condition had lasted. I felt my return to everyday reality to be a happy return from a strange, fantastic but quite real world to an old and familiar home. This self-experiment showed once again that human beings react much more sensitively than animals to psychoactive substances. We had already reached the same conclusion in experimenting with LSD on animals, as described in an earlier chapter of this book. It was not inactivity of the mushroom material, but rather the deficient reaction capability of the research animals vis-a-vis such a type of active principle, that explained why our extracts had appeared inactive in the mouse and dog. Because the assay on human subjects was the only test at our disposal for the detection of the active extract fractions, we had no other choice than to perform the testing on ourselves if we wanted to carry on the work and bring it to a successful conclusion. In the self-experiment just described, a strong reaction lasting several hours was produced by 2.4 g dried mushrooms. Therefore, in the sequel we used samples corresponding to only one-third of this amount, namely 0.8 g dried mushrooms. 
If these samples contained the active principle, they would provoke only a mild effect, impairing the ability to work for a short time, but this effect would still be so distinct that the inactive fractions and those containing the active principle could unequivocally be differentiated from one another. Several coworkers and colleagues volunteered as guinea pigs for this series of tests.

Psilocybin and Psilocin

With the help of this reliable test on human subjects, the active principle could be isolated, concentrated, and transformed into a chemically pure state by means of the newest separation methods. Two new substances, which I named psilocybin and psilocin, were thereby obtained in the form of colorless crystals. These results were published in March 1958 in the journal Experientia, in collaboration with Professor Heim and with my colleagues Dr. A. Brack and Dr. H. Kobel, who had provided greater quantities of mushroom material for these investigations after they had essentially improved the laboratory cultivation of the mushrooms. Some of my coworkers at the time - Drs. A. J. Frey, H. Ott, T. Petrzilka, and F. Troxler - then participated in the next steps of these investigations, the determination of the chemical structure of psilocybin and psilocin and the subsequent synthesis of these compounds, the results of which were published in the November 1958 issue of Experientia. The chemical structures of these mushroom factors deserve special attention in several respects. Psilocybin and psilocin belong, like LSD, to the indole compounds, a biologically important class of substances found in the plant and animal kingdoms. Particular chemical features common to both the mushroom substances and LSD show that psilocybin and psilocin are closely related to LSD, not only with regard to psychic effects but also in their chemical structures.
Psilocybin is the phosphoric acid ester of psilocin and, as such, is the first and hitherto only phosphoric-acid-containing indole compound discovered in nature. The phosphoric acid residue does not contribute to the activity, for the phosphoric-acid-free psilocin is just as active as psilocybin, but it makes the molecule more stable. While psilocin is readily decomposed by the oxygen in air, psilocybin is a stable substance. Psilocybin and psilocin possess a chemical structure very similar to the brain factor serotonin. As was already mentioned in the chapter on animal experiments and biological research, serotonin plays an important role in the chemistry of brain functions. The two mushroom factors, like LSD, block the effects of serotonin in pharmacological experiments on different organs. Other pharmacological properties of psilocybin and psilocin are also similar to those of LSD. The main difference consists in the quantitative activity, in animal as well as human experimentation. The average active dose of psilocybin or psilocin in human beings amounts to 10 mg (0.01 g); accordingly, these two substances are more than 100 times less active than LSD, of which 0.1 mg constitutes a strong dose. Moreover, the effects of the mushroom factors last only four to six hours, much shorter than the effects of LSD (eight to twelve hours). The total synthesis of psilocybin and psilocin, without the aid of the mushrooms, could be developed into a technical process, which would allow these substances to be produced on a large scale. Synthetic production is more rational and cheaper than extraction from the mushrooms. Thus with the isolation and synthesis of the active principles, the demystification of the magic mushrooms was accomplished. The compounds whose wondrous effects led the Indians to believe for millennia that a god was residing in the mushrooms had their chemical structures elucidated and could be produced synthetically in flasks. 
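The potency and duration comparisons above can be checked with a little arithmetic. The following sketch simply restates the chapter's own figures (10 mg average active dose of psilocybin or psilocin versus 0.1 mg as a strong dose of LSD, and four to six hours of action versus eight to twelve); no new pharmacological data is introduced.

```python
# Dose and duration figures as quoted in the text (not new data).
psilocybin_dose_mg = 10.0   # average active human dose of psilocybin/psilocin
lsd_dose_mg = 0.1           # a strong human dose of LSD

# Weight-for-weight potency ratio: how much more substance is needed
# to produce a comparable effect with the mushroom factors.
potency_ratio = psilocybin_dose_mg / lsd_dose_mg
print(f"Psilocybin requires about {potency_ratio:.0f} times the dose of LSD")

# Duration of effects, in hours, again per the text.
psilocybin_duration_h = (4, 6)
lsd_duration_h = (8, 12)
print(f"Psilocybin acts for {psilocybin_duration_h[0]}-{psilocybin_duration_h[1]} h, "
      f"LSD for {lsd_duration_h[0]}-{lsd_duration_h[1]} h")
```

Running this reproduces the figure of roughly a hundredfold difference in activity stated in the text.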
Just what progress in scientific knowledge was accomplished by natural products research in this case? Essentially, when all is said and done, we can only say that the mystery of the wondrous effects of teonanacatl was reduced to the mystery of the effects of two crystalline substances - since these effects cannot be explained by science either, but can only be described.

A Voyage into the Universe of the Soul with Psilocybin

The relationship between the psychic effects of psilocybin and those of LSD, their visionary-hallucinatory character, is evident in the following report from Antaios of a psilocybin experiment by Dr. Rudolf Gelpke. He has characterized his experiences with LSD and psilocybin, as already mentioned in a previous chapter, as "travels in the universe of the soul."

Where Time Stands Still (10 mg psilocybin, 6 April 1961, 10:20)

After ca. 20 minutes, beginning effects: serenity, speechlessness, mild but pleasant dizzy sensation, and "pleasureful deep breathing."
10:50 Strong dizziness, can no longer concentrate.
10:55 Excited, intensity of colors: everything pink to red.
11:05 The world concentrates itself there on the center of the table. Colors very intense.
11:10 A divided being, unprecedented - how can I describe this sensation of life? Waves, different selves, must control me.
Immediately after this note I went outdoors, leaving the breakfast table, where I had eaten with Dr. H. and our wives, and lay down on the lawn. The inebriation pushed rapidly to its climax. Although I had firmly resolved to make constant notes, it now seemed to me a complete waste of time, the motion of writing infinitely slow, the possibilities of verbal expression unspeakably paltry - measured by the flood of inner experience that inundated me and threatened to burst me. It seemed to me that 100 years would not be sufficient to describe the fullness of experience of a single minute.
At the beginning, optical impressions predominated: I saw with delight the boundless succession of rows of trees in the nearby forest. Then the tattered clouds in the sunny sky rapidly piled up with silent and breathtaking majesty to a superimposition of thousands of layers - heaven on heaven - and I waited then, expecting that up there in the next moment something completely powerful, unheard of, not yet existing, would appear or happen - would I behold a god? But only the expectation remained, the presentiment, this hovering, "on the threshold of the ultimate feeling." . . . Then I moved farther away (the proximity of others disturbed me) and lay down in a nook of the garden on a sun-warmed woodpile - my fingers stroked this wood with overflowing, animal-like sensual affection. At the same time I was submerged within myself; it was an absolute climax: a sensation of bliss pervaded me, a contented happiness - I found myself behind my closed eyes in a cavity full of brick-red ornaments, and at the same time in the "center of the universe of consummate calm." I knew everything was good - the cause and origin of everything was good.
But at the same moment I also understood the suffering and the loathing, the depression and misunderstanding of ordinary life: there one is never "total," but instead divided, cut in pieces, and split up into the tiny fragments of seconds, minutes, hours, days, weeks, and years; there one is a slave of Moloch time, which devours one piecemeal; one is condemned to stammering, bungling, and patchwork; one must drag about with oneself the perfection and the absolute, the togetherness of all things, the eternal moment of the golden age, this original ground of being - which indeed has always endured and will endure forever - there in the weekday of human existence, as a tormenting thorn buried deeply in the soul, as a memorial of a claim never fulfilled, as a fata morgana of a lost and promised paradise; through this feverish dream "present" to a condemned "past" in a clouded "future." I understood. This inebriation was a spaceflight, not of the outer but rather of the inner man, and for a moment I experienced reality from a location that lies somewhere beyond the force of gravity of time. As I began again to feel this force of gravity, I was childish enough to want to postpone the return by taking a new dose of 6 mg of psilocybin at 11:45, and once again 4 mg at 14:30. The effect was trifling, and in any case not worth mentioning. Mrs. Li Gelpke, an artist, also participated in this series of investigations, carrying out three self-experiments with LSD and psilocybin. She wrote of the drawing she made during the experiment: Nothing on this page is consciously fashioned. While I worked on it, the memory (of the experience under psilocybin) was again reality, and led me at every stroke. For that reason the picture is as many-layered as this memory, and the figure at the lower right is really the captive of its dream.... When books about Mexican art came into my hands three weeks later, I again found the motifs of my visions there with a sudden start....
I have also mentioned the occurrence of Mexican motifs in psilocybin inebriation during my first self-experiment with dried Psilocybe mexicana mushrooms, as described in the section on the chemical investigation of these mushrooms. The same phenomenon has also struck R. Gordon Wasson. Proceeding from such observations, he has advanced the conjecture that ancient Mexican art could have been influenced by visionary images, as they appear in mushroom inebriation.

The "Magic Morning Glory" Ololiuhqui

After we had managed to solve the riddle of the sacred mushroom teonanacatl in a relatively short time, I also became interested in the problem of another Mexican magic drug not yet chemically elucidated, ololiuhqui. Ololiuhqui is the Aztec name for the seeds of certain climbing plants (Convolvulaceae) that, like the mescaline cactus peyotl and the teonanacatl mushrooms, were used in pre-Columbian times by the Aztecs and neighboring peoples in religious ceremonies and magical healing practices. Ololiuhqui is still used even today by certain Indian tribes such as the Zapotec, Chinantec, Mazatec, and Mixtec, who until a short time ago still led a genuinely isolated existence, little influenced by Christianity, in the remote mountains of southern Mexico. An excellent study of the historical, ethnological, and botanical aspects of ololiuhqui was published in 1941 by Richard Evans Schultes, director of the Harvard Botanical Museum in Cambridge, Massachusetts. It is entitled "A Contribution to Our Knowledge of Rivea corymbosa, the Narcotic Ololiuqui of the Aztecs." The following statements about the history of ololiuhqui derive chiefly from Schultes's monograph. [Translator's note: As R. Gordon Wasson has pointed out, "ololiuhqui" is a more precise orthography than the more popular spelling used by Schultes. See Botanical Museum Leaflets Harvard University 20: 161-212, 1963.]
The earliest records about this drug were written by Spanish chroniclers of the sixteenth century, who also mentioned peyotl and teonanacatl. Thus the Franciscan friar Bernardino de Sahagun, in his already cited famous chronicle Historia General de las Cosas de Nueva Espana, writes about the wondrous effects of ololiuhqui: "There is an herb, called coatl xoxouhqui (green snake), which produces seeds that are called ololiuhqui. These seeds stupefy and deprive one of reason; they are taken as a potion." We obtain further information about these seeds from the physician Francisco Hernandez, whom Philip II sent from Spain to Mexico, from 1570 to 1575, in order to study the medicaments of the natives. In the chapter "On Ololiuhqui" of his monumental work entitled Rerum Medicarum Novae Hispaniae Thesaurus seu Plantarum, Animalium Mineralium Mexicanorum Historia, published in Rome in 1651, he gives a detailed description and the first illustration of ololiuhqui. An extract from the Latin text accompanying the illustration reads in translation: "Ololiuhqui, which others call coaxihuitl or snake plant, is a climber with thin, green, heart-shaped leaves.... The flowers are white, fairly large.... The seeds are roundish. . . . When the priests of the Indians wanted to visit with the gods and obtain information from them, they ate of this plant in order to become inebriated. Thousands of fantastic images and demons then appeared to them...." Despite this comparatively good description, the botanical identification of ololiuhqui as the seeds of Rivea corymbosa (L.) Hall. f. occasioned many discussions in specialist circles. Recently preference has been given to the synonym Turbina corymbosa (L.) Raf. When I decided in 1959 to attempt the isolation of the active principles of ololiuhqui, only a single report on chemical work with the seeds of Turbina corymbosa was available. It was the work of the pharmacologist C. G. Santesson of Stockholm, from the year 1937.
Santesson, however, was not successful in isolating an active substance in pure form. Contradictory findings had been published about the activity of the ololiuhqui seeds. The psychiatrist H. Osmond conducted a self-experiment with the seeds of Turbina corymbosa in 1955. After the ingestion of 60 to 100 seeds, he entered into a state of apathy and emptiness, accompanied by enhanced visual sensitivity. After four hours, there followed a longer-lasting period of relaxation and well-being. This report was contradicted by the results of V. J. Kinross-Wright, published in England in 1958, in which eight volunteer research subjects who had taken up to 125 seeds perceived no effects at all. Through the mediation of R. Gordon Wasson, I obtained two samples of ololiuhqui seeds. In his accompanying letter of 6 August 1959 from Mexico City, he wrote of them: . . . The parcels that I am sending you are the following: . . . A small parcel of seeds that I take to be Rivea corymbosa, otherwise known as ololiuqui, the well-known narcotic of the Aztecs, called in Huautla "la semilla de la Virgen." This parcel, you will find, consists of two little bottles, which represent two deliveries of seeds made to us in Huautla, and a larger batch of seeds delivered to us by Francisco Ortega "Chico," the Zapotec guide, who himself gathered the seeds from the plants at the Zapotec town of San Bartolo Yautepec.... The first-named, round, light brown seeds from Huautla proved in the botanical determination to have been correctly identified as Rivea (Turbina) corymbosa, while the black, angular seeds from San Bartolo Yautepec were identified as Ipomoea violacea L. While Turbina corymbosa thrives only in tropical or subtropical climates, one also finds Ipomoea violacea as an ornamental plant dispersed over the whole earth in the temperate zones. It is the morning glory that delights the eye in our gardens in diverse varieties with blue or blue-red striped calyxes.
The Zapotec, besides the original ololiuhqui (that is, the seeds of Turbina corymbosa, which they call badoh), also utilize badoh negro, the seeds of Ipomoea violacea. T. MacDougall, who furnished us with a second larger consignment of the last-named seeds, made this observation. My capable laboratory assistant Hans Tscherter, with whom I had already carried out the isolation of the active principles of the mushrooms, participated in the chemical investigation of the ololiuhqui drug. We advanced the working hypothesis that the active principles of the ololiuhqui seeds could be representatives of the same class of chemical substances, the indole compounds, to which LSD, psilocybin, and psilocin belong. Considering the very great number of other groups of substances that, like the indoles, were under consideration as active principles of ololiuhqui, it was indeed extremely improbable that this assumption would prove true. It could, however, very easily be tested. The presence of indole compounds, of course, may simply and rapidly be determined by colorimetric reactions. Thus even traces of indole substances, with a certain reagent, give an intense blue-colored solution. We had luck with our hypothesis. Extracts of ololiuhqui seeds with the appropriate reagent gave the blue coloration characteristic of indole compounds. With the help of this colorimetric test, we succeeded in a short time in isolating the indole substances from the seeds and in obtaining them in chemically pure form. Their identification led to an astonishing result. What we found appeared at first scarcely believable. Only after repetition and the most careful scrutiny of the operations was our suspicion concerning the peculiar findings eliminated: the active principles from the ancient Mexican magic drug ololiuhqui proved to be identical with substances that were already present in my laboratory. 
They were identical with alkaloids that had been obtained in the course of the decades-long investigations of ergot; partly isolated as such from ergot, partly obtained through chemical modification of ergot substances. Lysergic acid amide, lysergic acid hydroxyethylamide, and alkaloids closely related to them chemically were established as the main active principles of ololiuhqui. (See formulae in the appendix.) Also present was the alkaloid ergobasine, whose synthesis had constituted the starting point of my investigations on ergot alkaloids. Lysergic acid amide and lysergic acid hydroxyethylamide, active principles of ololiuhqui, are chemically very closely related to lysergic acid diethylamide (LSD), as even the nonchemist can see from the names. Lysergic acid amide was described for the first time by the English chemists S. Smith and G. M. Timmis as a cleavage product of ergot alkaloids, and I had also produced this substance synthetically in the course of the investigations in which LSD originated. Certainly, nobody at the time could have suspected that this compound synthesized in the flask would be discovered twenty years later as a naturally occurring active principle of an ancient Mexican magic drug. After the discovery of the psychic effects of LSD, I had also tested lysergic acid amide in a self-experiment and established that it likewise evoked a dreamlike condition, but only with about a tenfold to twentyfold greater dose than LSD. This effect was characterized by a sensation of mental emptiness and the unreality and meaninglessness of the outer world, by enhanced sensitivity of hearing, and by a not unpleasant physical lassitude, which ultimately led to sleep. This picture of the effects of LA-111, as lysergic acid amide was called as a research preparation, was confirmed in a systematic investigation by the psychiatrist Dr. H. Solms.
When I presented the findings of our investigations on ololiuhqui at the Natural Products Congress of the International Union for Pure and Applied Chemistry (IUPAC) in Sydney, Australia, in the fall of 1960, my colleagues received my talk with skepticism. In the discussions following my lecture, some persons voiced the suspicion that the ololiuhqui extracts could well have been contaminated with traces of lysergic acid derivatives, with which so much work had been done in my laboratory. There was another reason for the doubt in specialist circles concerning our findings. The occurrence in higher plants (i.e., in the morning glory family) of ergot alkaloids that hitherto had been known only as constituents of lower fungi, contradicted the experience that certain substances are typical of and restricted to respective plant families. It is indeed a very rare exception to find a characteristic group of substances, in this case the ergot alkaloids, occurring in two divisions of the plant kingdom broadly separated in evolutionary history. Our results were confirmed, however, when different laboratories in the United States, Germany, and Holland subsequently verified our investigations on the ololiuhqui seeds. Nevertheless, the skepticism went so far that some persons even considered the possibility that the seeds could have been infected with alkaloid-producing fungi. That suspicion, however, was ruled out experimentally. These studies on the active principles of ololiuhqui seeds, although they were published only in professional journals, had an unexpected sequel. We were apprised by two Dutch wholesale seed companies that their sale of seeds of Ipomoea violacea, the ornamental blue morning glory, had reached unusual proportions in recent times. They had heard that the great demand was connected with investigations of these seeds in our laboratory, about which they were eager to learn the details. 
It turned out that the new demand derived from hippie circles and other groups interested in hallucinogenic drugs. They believed they had found in the ololiuhqui seeds a substitute for LSD, which was becoming less and less accessible. The morning glory seed boom, however, lasted only a comparatively short time, evidently because of the undesirable experiences that those in the drug world had with this "new" ancient inebriant. The ololiuhqui seeds, which are taken crushed with water or another mild beverage, taste very bad and are difficult for the stomach to digest. Moreover, the psychic effects of ololiuhqui differ from those of LSD in that the euphoric and the hallucinogenic components are less pronounced, while a sensation of mental emptiness, often anxiety and depression, predominates. Furthermore, weariness and lassitude are hardly desirable traits in an inebriant. These could all be reasons why the drug culture's interest in the morning glory seeds has diminished. Only a few investigations have considered the question of whether the active principles of ololiuhqui could find a useful application in medicine. In my opinion, it would be worthwhile to clarify above all whether the strong narcotic, sedative effect of certain ololiuhqui constituents, or of chemical modifications of these, is medicinally useful. My studies in the field of hallucinogenic drugs reached a kind of logical conclusion with the investigations of ololiuhqui. They now formed a circle, one could almost say a magic circle: the starting point had been the synthesis of lysergic acid amides, among them the naturally occurring ergot alkaloid ergobasine. This led to the synthesis of lysergic acid diethylamide, LSD. The hallucinogenic properties of LSD were the reason why the hallucinogenic magic mushroom teonanacatl found its way into my laboratory.
The work with teonanacatl, from which psilocybin and psilocin were isolated, proceeded to the investigation of another Mexican magic drug, ololiuhqui, in which hallucinogenic principles in the form of lysergic acid amides were again encountered, including ergobasine - with which the magic circle closed.

In Search of the Magic Plant "Ska Maria Pastora" in the Mazatec Country

R. Gordon Wasson, with whom I had maintained friendly relations since the investigations of the Mexican magic mushrooms, invited my wife and me to take part in an expedition to Mexico in the fall of 1962. The purpose of the journey was to search for another Mexican magic plant. Wasson had learned on his travels in the mountains of southern Mexico that the expressed juice of the leaves of a plant called hojas de la Pastora or hojas de Maria Pastora, in Mazatec ska Pastora or ska Maria Pastora (leaves of the shepherdess or leaves of Mary the shepherdess), was used among the Mazatec in medico-religious practices, like the teonanacatl mushrooms and the ololiuhqui seeds. The question now was to ascertain from what sort of plant the "leaves of Mary the shepherdess" derived, and then to identify this plant botanically. We also hoped, if at all possible, to gather sufficient plant material to conduct a chemical investigation of the hallucinogenic principles it contained.

Ride through the Sierra Mazateca

On 26 September 1962, my wife and I accordingly flew to Mexico City, where we met Gordon Wasson. He had made all the necessary preparations for the expedition, so that within two days we had already set out on the next leg of the journey to the south. Mrs. Irmgard Weitlaner Johnson (widow of Jean B. Johnson, a pioneer of the ethnographic study of the Mexican magic mushrooms, who was killed in the Allied landing in North Africa) had joined us. Her father, Robert J. Weitlaner, had emigrated to Mexico from Austria and had likewise contributed toward the rediscovery of the mushroom cult. Mrs.
Johnson worked at the National Museum of Anthropology in Mexico City as an expert on Indian textiles. After a two-day journey in a spacious Land Rover, which took us over the plateau, along the snow-capped Popocatepetl, passing Puebla, down into the Valley of Orizaba with its magnificent tropical vegetation, then by ferry across the Papaloapan (Butterfly River), on through the former Aztec garrison Tuxtepec, we arrived at the starting point of our expedition, the Mazatec village of Jalapa de Diaz, lying on a hillside. There we were in the midst of the environment and among the people that we would come to know over the succeeding 2 1/2 weeks. There was an uproar upon our arrival in the marketplace, the center of this village widely dispersed in the jungle. Old and young men, who had been squatting and standing around in the half-opened bars and shops, pressed suspiciously yet curiously about our Land Rover; they were mostly barefoot, but all wore a sombrero. Women and girls were nowhere to be seen. One of the men gave us to understand that we should follow him. He led us to the local president, a fat mestizo who had his office in a one-story house with a corrugated iron roof. Gordon showed him our credentials from the civil authorities and from the military governor of Oaxaca, which explained that we had come here to carry out scientific investigations. The president, who probably could not read at all, was visibly impressed by the large-sized documents equipped with official seals. He had lodgings assigned to us in a spacious shed, in which we could place our air mattresses and sleeping bags. I looked around the region somewhat. The ruins of a large church from colonial times, which must once have been very beautiful, rose almost ghostlike in the direction of an ascending slope at the side of the village square. Now I could also see women looking out of their huts, venturing to examine the strangers.
In their long, white dresses, adorned with red borders, and with their long braids of blue-black hair, they offered a picturesque sight. We were fed by an old Mazatec woman, who directed a young cook and two helpers. She lived in one of the typical Mazatec huts. These are simple rectangular structures with thatched gabled roofs and walls of wooden poles joined together, windowless, the chinks between the poles offering sufficient opportunity to look out. In the middle of the hut, on the stamped clay floor, was an elevated, open fireplace, built up out of dried clay or made of stones. The smoke escaped through large openings in the walls under the two ends of the roof. Bast mats that lay in a corner or along the walls served as beds. The huts were shared with the domestic animals: black swine, turkeys, and chickens. There was roasted chicken to eat, black beans, and also, in place of bread, tortillas, a type of cornmeal pancake that is baked on the hot stone slab of the hearth. Beer and tequila, an Agave liquor, were served. The next morning our troop formed for the ride through the Sierra Mazateca. Mules and guides were engaged from the horsekeeper of the village. Guadelupe, the Mazatec familiar with the route, took charge of guiding the lead animal. Gordon, Irmgard, my wife, and I were stationed on our mules in the middle. Teodosio and Pedro, called Chico, two young fellows who trotted along barefoot beside the two mules laden with our baggage, brought up the rear. It took some time to get accustomed to the hard wooden saddles. Then, however, this mode of locomotion proved to be the most ideal type of travel that I know of. The mules followed the leader, single file, at a steady pace. They required no direction at all from the rider. With surprising dexterity, they sought out the best spots along the almost impassable, partly rocky, partly marshy paths, which led through thickets and streams or onto precipitous slopes.
Relieved of all travel cares, we could devote all our attention to the beauty of the landscape and the tropical vegetation. There were tropical forests with gigantic trees overgrown with twining plants, then again clearings with banana groves or coffee plantations, between light stands of trees, flowers at the edge of the path, over which wondrous butterflies bustled about.... We made our way upstream along the broad riverbed of Rio Santo Domingo, with brooding heat and steamy air, now steeply ascending, then again falling. During a short, violent tropical downpour, the long broad ponchos of oilcloth, with which Gordon had equipped us, proved quite useful. Our Indian guides had protected themselves from the cloudburst with gigantic, heart-shaped leaves that they nimbly chopped off at the edge of the path. Teodosio and Chico gave the impression of great, green hay ricks as they ran, covered with these leaves, beside their mules. Shortly before nightfall we arrived at the first settlement, La Providencia ranch. The patron, Don Joaquin Garcia, the head of a large family, welcomed us hospitably and full of dignity. It was impossible to determine how many children, in addition to the grown-ups and the domestic animals, were present in the large living room, feebly illuminated by the hearth fire alone. Gordon and I placed our sleeping bags outdoors under the projecting roof. I awoke in the morning to find a pig grunting over my face. After another day's journey on the backs of our worthy mules, we arrived at Ayautla, a Mazatec settlement spread across a hillside. En route, among the shrubbery, I had delighted in the blue calyxes of the magic morning glory Ipomoea violacea, the mother plant of the ololiuhqui seeds. It grew wild there, whereas among us it is found only in gardens as an ornamental plant. We remained in Ayautla for several days. We had lodging in the house of Dona Donata Sosa de Garcia. 
Dona Donata was in charge of a large family, which included her ailing husband. In addition, she presided over the coffee cultivation of the region. The collection center for the freshly picked coffee beans was in an adjacent building. It was a lovely picture, the young Indian women and girls returning home from the harvest toward evening, in their bright garments adorned with colored borders, the coffee sacks carried on their backs by headbands. Dona Donata also managed a type of grocery store, in which her husband, Don Eduardo, stood behind the counter. In the evening by candlelight, Dona Donata, who besides Mazatec also spoke Spanish, told us about life in the village; one tragedy or another had already struck nearly every one of the seemingly peaceful huts that lay surrounded by this paradisiacal scenery. A man who had murdered his wife, and who was now in prison for life, had lived in the house next door, which stood empty. The husband of a daughter of Dona Donata, after an affair with another woman, was murdered out of jealousy. The president of Ayautla, a young bull of a mestizo, to whom we had made our formal visit in the afternoon, never made the short walk from his hut to his "office" in the village hall (with the corrugated iron roof) unless accompanied by two heavily armed men. Because he exacted illegal taxes, he was afraid of being shot to death. Since no higher authority sees to justice in this remote region, people have recourse to self-defense of this type. Thanks to Dona Donata's good connections, we received the first sample of the sought-after plant, some leaves of hojas de la Pastora, from an old woman. Since the flowers and roots were missing, however, this plant material was not suitable for botanical identification. Our efforts to obtain more precise information about the habitat of the plant and its use were also fruitless. 
The continuation of our journey from Ayautla was delayed, as we had to wait until our boys could bring the mules, which they had taken to pasture on the other side of the Rio Santo Domingo, back over the river, swollen by intense downpours. After a two-day ride, on which we had passed the night in the high mountain village of San Miguel Huautla, we arrived at Rio Santiago. Here we were joined by Dona Herlinda Martinez Cid, a teacher from Huautla de Jimenez. She had ridden over on the invitation of Gordon Wasson, who had known her since his mushroom expeditions, and was to serve as our Mazatec and Spanish-speaking interpreter. Moreover, she could help us, through her numerous relatives scattered in the region, to pave the way to contacts with curanderos and curanderas who used the hojas de la Pastora in their practice. Because of our delayed arrival in Rio Santiago, Dona Herlinda, who was acquainted with the dangers of the region, had been apprehensive about us, fearing we might have plunged down a rocky path or been attacked by robbers. Our next stop was in San Jose Tenango, a settlement lying deep in a valley, in the midst of tropical vegetation with orange and lemon trees and banana plantations. Here again was the typical village picture: in the center, a marketplace with a half-ruined church from the colonial period, with two or three stands, a general store, and shelters for horses and mules. We found lodging in a corrugated iron barracks, with the special luxury of a cement floor, on which we could spread out our sleeping bags. In the thick jungle on the mountainside we discovered a spring, whose magnificent fresh water in a natural rocky basin invited us to bathe. That was an unforgettable pleasure after days without opportunities to wash properly. In this grotto I saw a hummingbird for the first time in nature, a blue-green, metallic, iridescent gem, which whirred over great liana blossoms. 
The desired contact with persons skilled in medicine came about thanks to the kindred connections of Dona Herlinda, beginning with the curandero Don Sabino. But he refused, for some reason, to receive us in a consultation and to question the leaves. From an old curandera, a venerable woman in a strikingly magnificent Mazatec garment, with the lovely name Natividad Rosa, we received a whole bundle of flowering specimens of the sought-after plant, but even she could not be prevailed upon to perform a ceremony with the leaves for us. Her excuse was that she was too old for the hardship of the magical trip; she could never cover the long distance to certain places: a spring where the wise women gather their powers, a lake on which the sparrows sing, and where objects get their names. Nor would Natividad Rosa tell us where she had gathered the leaves. They grew in a very, very distant forest valley. Wherever she dug up a plant, she put a coffee bean in the earth as thanks to the gods. We now possessed ample plants with flowers and roots, which were suitable for botanical identification. It was apparently a representative of the genus Salvia, a relative of the well-known meadow sage. The plants had blue flowers crowned with a white dome, which were arranged on a panicle 20 to 30 cm long, whose stem looked blue. Several days later, Natividad Rosa brought us a whole basket of leaves, for which she was paid fifty pesos. The business seemed to have been discussed, for two other women brought us further quantities of leaves. As it was known that the expressed juice of the leaves is drunk in the ceremony, and this must therefore contain the active principle, the fresh leaves were crushed on a stone plate, squeezed out in a cloth, the juice diluted with alcohol as a preservative, and decanted into flasks in order to be studied later in the laboratory in Basel. 
I was assisted in this work by an Indian girl, who was accustomed to dealing with the stone plate, the metate, on which the Indians since ancient times have ground their corn by hand. On the day before the journey was to continue, having given up all hope of being able to attend a ceremony, we suddenly made another contact with a curandera, one who was ready "to serve us." A confidante of Herlinda's, who had produced this contact, led us after nightfall along a secret path to the hut of the curandera, lying solitary on the mountainside above the settlement. No one from the village was to see us or discover that we were received there. It was obviously considered a betrayal of sacred customs, worthy of punishment, to allow strangers, whites, to take part in this. That indeed had also been the real reason why the other healers whom we asked had refused to admit us to a leaf ceremony. Strange birdcalls from the darkness accompanied us on the ascent, and the barking of dogs was heard on all sides. The dogs had detected the strangers. The curandera Consuela Garcia, a woman of some forty years, barefoot like all Indian women in this region, timidly admitted us to her hut and immediately closed up the doorway with a heavy bar. She bid us lie down on the bast mats on the stamped mud floor. As Consuela spoke only Mazatec, Herlinda translated her instructions into Spanish for us. The curandera lit a candle on a table covered with some images of saints, along with a variety of rubbish. Then she began to bustle about busily, but in silence. All at once we heard peculiar noises and a rummaging in the room. Did the hut harbor some hidden person, whose shape and proportions could not be made out in the candlelight? Visibly disturbed, Consuela searched the room with the burning candle. It appeared to be merely rats, however, who were working their mischief. In a bowl the curandera now kindled copal, an incense-like resin, which soon filled the whole hut with its aroma. 
Then the magic potion was ceremoniously prepared. Consuela inquired which of us wished to drink of it with her. Gordon announced himself. Since I was suffering from a severe stomach upset at the time, I could not join in. My wife substituted for me. The curandera laid out six pairs of leaves for herself. She apportioned the same number to Gordon. Anita received three pairs. Like the mushrooms, the leaves are always dosed in pairs, a practice that, of course, has a magical significance. The leaves were crushed with the metate, then squeezed out through a fine sieve into a cup, and the metate and the contents of the sieve were rinsed with water. Finally, the filled cups were incensed over the copal vessel with much ceremony. Consuela asked Anita and Gordon, before she handed them their cups, whether they believed in the truth and the holiness of the ceremony. After they answered in the affirmative and the very bitter-tasting potion was solemnly imbibed, the candles were extinguished and, lying in darkness on the bast mats, we awaited the effects. After some twenty minutes Anita whispered to me that she saw striking, brightly bordered images. Gordon also perceived the effect of the drug. The voice of the curandera sounded from the darkness, half speaking, half singing. Herlinda translated: Did we believe in Christ's blood and the holiness of the rites? After our "creemos" ("We believe"), the ceremonial performance continued. The curandera lit the candles, moved them from the "altar table" onto the floor, sang and spoke prayers or magic formulas, placed the candles again under the images of the saints; then again silence and darkness. Thereupon the true consultation began. Consuela asked for our request. Gordon inquired after the health of his daughter, who immediately before his departure from New York had to be admitted prematurely to the hospital in expectation of a baby. He received the comforting information that mother and child were well. 
Then again came singing and prayer and manipulations with the candles on the "altar table" and on the floor, over the smoking basin. When the ceremony was at an end, the curandera asked us to rest yet a while longer in prayer on our bast mats. Suddenly a thunderstorm burst out. Through the cracks of the beam walls, lightning flashed into the darkness of the hut, accompanied by violent thunderbolts, while a tropical downpour raged, beating on the roof. Consuela voiced apprehension that we would not be able to leave her house unseen in the darkness. But the thunderstorm let up before daybreak, and we went down the mountainside to our corrugated iron barracks, as noiselessly as possible by the light of flashlights, unnoticed by the villagers, but dogs again barked from all sides. Participation in this ceremony was the climax of our expedition. It brought confirmation that the hojas de la Pastora were used by the Indians for the same purpose and in the same ceremonial milieu as teonanacatl, the sacred mushrooms. Now we also had authentic plant material, not only sufficient for botanical identification, but also for the planned chemical analysis. The inebriated state that Gordon Wasson and my wife had experienced with the hojas had been shallow and only of short duration, yet it had exhibited a distinctly hallucinogenic character. On the morning after this eventful night we took leave of San Jose Tenango. The guide, Guadelupe, and the two fellows Teodosio and Pedro appeared before our barracks with the mules at the appointed time. Soon packed up and mounted, our little troop then moved uphill again, through the fertile landscape glittering in the sunlight from the night's thunderstorm. Returning by way of Santiago, toward evening we reached our last stop in Mazatec country, the capital Huautla de Jimenez. From here on, the return trip to Mexico City was made by automobile. 
With a final supper in the Posada Rosaura, at the time the only inn in Huautla, we took leave of our Indian guides and of the worthy mules that had carried us so surefootedly and in such a pleasant way through the Sierra Mazateca. The Indians were paid off, and Teodosio, who also accepted payment for his chief in Jalapa de Diaz (where the animals were to be returned afterward), gave a receipt with his thumbprint colored by a ballpoint pen. We took up quarters in Dona Herlinda's house. A day later we made our formal visit to the curandera Maria Sabina, a woman made famous by the Wassons' publications. It had been in her hut that Gordon Wasson became the first white man to taste of the sacred mushrooms, in the course of a nocturnal ceremony in the summer of 1955. Gordon and Maria Sabina greeted each other cordially, as old friends. The curandera lived out of the way, on the mountainside above Huautla. The house in which the historic session with Gordon Wasson had taken place had been burned, presumably by angered residents or an envious colleague, because she had divulged the secret of teonanacatl to strangers. In the new hut in which we found ourselves, an incredible disorder prevailed, as had probably also prevailed in the old hut, in which half-naked children, hens, and pigs bustled about. The old curandera had an intelligent face, exceptionally changeable in expression. She was obviously impressed when it was explained that we had managed to confine the spirit of the mushrooms in pills, and she at once declared herself ready to "serve us" with these, that is, to grant us a consultation. It was agreed that this should take place the coming night in the house of Dona Herlinda. In the course of the day I took a stroll through Huautla de Jimenez, which led along a main street on the mountainside. Then I accompanied Gordon on his visit to the Instituto Nacional Indigenista. 
This governmental organization had the duty of studying and helping to solve the problems of the indigenous population, that is, the Indians. Its leader told us of the difficulties that the "coffee policy" had caused in the area at that time. The president of Huautla, in collaboration with the Instituto Nacional Indigenista, had tried to eliminate middlemen in order to shape the coffee prices favorably for the producing Indians. His body was found, mutilated, the previous June. Our stroll also took us past the cathedral, from which Gregorian chants resounded. Old Father Aragon, whom Gordon knew well from his earlier stays, invited us into the vestry for a glass of tequila. 

A Mushroom Ceremony 

As we returned home to Herlinda's house toward evening, Maria Sabina had already arrived there with a large company, her two lovely daughters, Apolonia and Aurora (two prospective curanderas), and a niece, all of whom brought children along with them. Whenever her child began to cry, Apolonia would offer her breast to it. The old curandero Don Aurelio also appeared, a mighty man, one-eyed, in a black-and-white patterned serape (cloak). Cacao and sweet pastry were served on the veranda. I was reminded of the report from an ancient chronicle which described how chocolatl was drunk before the ingestion of teonanacatl. After the fall of darkness, we all proceeded into the room in which the ceremony would take place. It was then locked up; that is, the door was obstructed with the only bed available. Only an emergency exit into the back garden remained unlatched for absolute necessity. It was nearly midnight when the ceremony began. Until that time the whole party lay in the darkness on the bast mats spread on the floor, sleeping or awaiting the night's events. Maria Sabina threw a piece of copal on the embers of a brazier from time to time, whereby the stuffy air in the crowded room became somewhat bearable. 
I had explained to the curandera through Herlinda, who was again with the party as interpreter, that one pill contained the spirit of two pairs of mushrooms. (The pills contained 5.0 mg synthetic psilocybin apiece.) When all was ready, Maria Sabina apportioned the pills in pairs among the grown-ups present. After solemn smoking, she herself took two pairs (corresponding to 20 mg psilocybin). She gave the same dose to Don Aurelio and her daughter Apolonia, who would also serve as curandera. Aurora received one pair, as did Gordon, while my wife and Irmgard got only one pill each. One of the children, a girl of about ten, under the guidance of Maria Sabina, had prepared for me the juice of five pairs of fresh leaves of hojas de la Pastora. I wanted to experience this drug that I had been unable to try in San Jose Tenango. The potion was said to be especially active when prepared by an innocent child. The cup with the expressed juice was likewise incensed and conjured by Maria Sabina and Don Aurelio, before it was delivered to me. All of these preparations and the following ceremony progressed in much the same way as the consultation with the curandera Consuela Garcia in San Jose Tenango. After the drug was apportioned and the candle on the "altar" was extinguished, we awaited the effects in the darkness. Before a half hour had elapsed, the curandera murmured something; her daughter and Don Aurelio also became restless. Herlinda translated and explained to us what was wrong. Maria Sabina had said that the pills lacked the spirit of the mushrooms. I discussed the situation with Gordon, who lay beside me. For us it was clear that absorption of the active principle from the pills, which must first dissolve in the stomach, occurs more slowly than from the mushrooms, in which some of the active principle already becomes absorbed through the mucous membranes during chewing. But how could we give a scientific explanation under such conditions? 
Rather than try to explain, we decided to act. We distributed more pills. Both curanderas and the curandero each received another pair. They had now each taken a total dosage of 30 mg psilocybin. After about another quarter of an hour, the spirit of the pills did begin to yield its effects, which lasted until the crack of dawn. The daughters, and Don Aurelio with his deep bass voice, fervently answered the prayers and singing of the curandera. Blissful, yearning moans of Apolonia and Aurora, between singing and prayer, gave the impression that the religious experience of the young women in the drug inebriation was combined with sensual-sexual feelings. In the middle of the ceremony Maria Sabina asked for our request. Gordon inquired again after the health of his daughter and grandchild. He received the same good information as from the curandera Consuela. Mother and child were in fact well when he returned home to New York. Obviously, however, this still represents no proof of the prophetic abilities of both curanderas. Evidently as an effect of the hojas, I found myself for some time in a state of mental sensitivity and intense experience, which, however, was not accompanied by hallucinations. Anita, Irmgard, and Gordon experienced a euphoric condition of inebriation that was influenced by the strange, mystical atmosphere. My wife was impressed by the vision of very distinct strange line patterns. She was astonished and perplexed, later, on discovering precisely the same images in the rich ornamentation over the altar in an old church near Puebla. That was on the return trip to Mexico City, when we visited churches from colonial times. These admirable churches offer great cultural and historical interest because the Indian artists and workmen who assisted in their construction smuggled in elements of Indian style. 
Klaus Thomas, in his book Die kunstlich gesteuerte Seele [The artificially steered mind] (Ferdinand Enke Verlag, Stuttgart, 1970), writes about the possible influence of visions from psilocybin inebriation on Meso-American Indian art: "Surely a cultural-historical comparison of the old and new creations of Indian art . . . must convince the unbiased spectator of the harmony with the images, forms and colors of a psilocybin inebriation." The Mexican character of the visions seen in my first experience with dried Psilocybe mexicana mushrooms and the drawing of Li Gelpke after a psilocybin inebriation could also point to such an association. As we took leave of Maria Sabina and her clan at the crack of dawn, the curandera said that the pills had the same power as the mushrooms, that there was no difference. This was a confirmation from the most competent authority, that the synthetic psilocybin is identical with the natural product. As a parting gift I let Maria Sabina have a vial of psilocybin pills. She radiantly explained to our interpreter Herlinda that she could now give consultations even in the season when no mushrooms grow. How should we judge the conduct of Maria Sabina, the fact that she allowed strangers, white people, access to the secret ceremony, and let them try the sacred mushroom? To her credit it can be said that she had thereby opened the door to the exploration of the Mexican mushroom cult in its present form, and to the scientific, botanical, and chemical investigation of the sacred mushrooms. Valuable active substances, psilocybin and psilocin, resulted. Without this assistance, the ancient knowledge and experience that was concealed in these secret practices would possibly, even probably, have disappeared without a trace, without having borne fruit, in the advancement of Western civilization. From another standpoint, the conduct of this curandera can be regarded as a profanation of a sacred custom, even as a betrayal. 
Some of her countrymen were of this opinion, which was expressed in acts of revenge, including the burning of her house. The profanation of the mushroom cult did not stop with the scientific investigations. The publication about the magic mushrooms unleashed an invasion of hippies and drug seekers into the Mazatec country, many of whom behaved badly, some even criminally. Another undesirable consequence was the beginning of true tourism in Huautla de Jimenez, whereby the originality of the place was eradicated. Such statements and considerations are, for the most part, the concern of ethnographical research. Wherever researchers and scientists trace and elucidate the remains of ancient customs that are becoming rarer, their primitiveness is lost. This loss is only more or less counterbalanced when the outcome of the research represents a lasting cultural gain. From Huautla de Jimenez we proceeded first to Teotitlan, in a breakneck truck ride along a half-paved road, and from there went on a comfortable car trip back to Mexico City, the starting point of our expedition. I had lost several kilograms in body weight, but was overwhelmingly compensated in enchanting experiences. The herbarium samples of hojas de la Pastora, which we had brought with us, were subjected to botanical identification by Carl Epling and Carlos D. Jativa at the Botanical Institute of Harvard University in Cambridge, Massachusetts. They found that this plant was a hitherto undescribed species of Salvia, which was named Salvia divinorum by these authors. The chemical investigation of the juice of the magic sage in the laboratory in Basel was unsuccessful. The psychoactive principle of this drug seems to be a rather unstable substance, since the juice prepared in Mexico and preserved with alcohol proved in self-experiments to be no longer active. Where the chemical nature of the active principle is concerned, the problem of the magic plant ska Maria Pastora still awaits solution. 
So far in this book I have mainly described my scientific work and matters relating to my professional activity. But this work, by its very nature, had repercussions on my own life and personality, not least because it brought me into contact with interesting and important contemporaries. I have already mentioned some of them: Timothy Leary, Rudolf Gelpke, Gordon Wasson. Now, in the pages that follow, I would like to emerge from the natural scientist's reserve, in order to portray encounters which were personally meaningful to me and which helped me solve questions posed by the substances I had discovered. 

_________________________________________________________________ 

7. Radiance from Ernst Junger 

Radiance is the perfect term to express the influence that Ernst Junger's literary work and personality have had on me. In the light of his perspective, which stereoscopically comprises the surfaces and depths of things, the world I knew took on a new, translucent splendor. That happened a long time before the discovery of LSD and before I came into personal contact with this author in connection with hallucinogenic drugs. My enchantment with Ernst Junger began with his book Das Abenteuerliche Herz [The adventurous heart]. Again and again in the last forty years I have taken up this book. Here more than ever, in themes that weigh more lightly and lie closer to me than war and a new type of human being (subjects of Junger's earlier books), the beauty and magic of Junger's prose was opened to me: descriptions of flowers, of dreams, of solitary walks; thoughts about chance, the future, colors, and about other themes that have direct relation to our personal lives. Everywhere in his prose the miracle of creation became evident, in the precise description of the surfaces and, in translucence, of the depths; and the uniqueness and the imperishable in every human being was touched upon. No other writer has thus opened my eyes. 
Drugs were also mentioned in Das Abenteuerliche Herz. Many years passed, however, before I myself began to be especially interested in this subject, after the discovery of the psychic effects of LSD. My first correspondence with Ernst Junger had nothing to do with the context of drugs; rather I once wrote to him on his birthday, as a thankful reader. 

Bottmingen, 29 March 1947 

Dear Mr. Junger, 

As one richly endowed by you for years, I wished to send a jar of honey to you for your birthday. But I did not have this pleasure, because my export license has been refused in Bern. The gift was intended less as a greeting from a country in which milk and honey still flow, than as a reminiscence of the enchanting sentences in your book Auf den Marmorklippen (On the Marble Cliffs), where you speak of the "golden bees." 

The book mentioned here had appeared in 1939, just shortly before the outbreak of World War II. Auf den Marmorklippen is not only a masterpiece of German prose, but also a work of great significance because in this book the characteristics of tyrants and the horror of war and nocturnal bombardment are described prophetically, in poetic vision. In the course of our correspondence, Ernst Junger also inquired about my LSD studies, of which he had learned through a friend. Thereupon I sent him the pertinent publications, which he acknowledged with the following comments: 

Kirchhorst, 3/3/1948 

. . . together with both enclosures concerning your new phantasticum. It seems indeed that you have entered a field that contains so many tempting mysteries. Your consignment came together with the Confessions of an English Opium Eater, that has just been published in a new translation. The translator writes me that his reading of Das Abenteuerliche Herz stimulated him to do his work. As far as I am concerned, my practical studies in this field are far behind me. 
These are experiments in which one sooner or later embarks on truly dangerous paths, and may be considered lucky to escape with only a black eye. What interested me above all was the relationship of these substances to productivity. It has been my experience, however, that creative achievement requires an alert consciousness, and that it diminishes under the spell of drugs. On the other hand, conceptualization is important, and one gains insights under the influence of drugs that indeed are not possible otherwise. I consider the beautiful essay that Maupassant has written about ether to be such an insight. Moreover, I had the impression that in fever one also discovers new landscapes, new archipelagos, and a new music, that becomes completely distinct when the "customs station" ["An der Zollstation" (At the customs station), the title heading of a section in Das Abenteuerliche Herz (2d ed.) that concerns the transition from life to death.] appears. For geographic description, on the other hand, one must be fully conscious. What productivity means to the artist, healing means to the physician. Accordingly, it also may suffice for him that he sometimes enters the regions through the tapestries that our senses have woven. Moreover, I seem to perceive in our time less of a taste for the phantastica than for the energetica; amphetamine, which has even been furnished to fliers and other soldiers by the armies, belongs to this group. Tea is in my opinion a phantasticum, coffee an energeticum; tea therefore possesses a disproportionately higher artistic rank. I notice that coffee disrupts the delicate lattice of light and shadows, the fruitful doubts that emerge during the writing of a sentence. One exceeds his inhibitions. With tea, on the other hand, the thoughts climb genuinely upward. So far as my "studies" are concerned, I had a manuscript on that topic, but have since burned it. 
My excursions terminated with hashish, which led to very pleasant, but also to manic states, to oriental tyranny.... 

Soon afterward, in a letter from Ernst Junger I learned that he had inserted a discourse about drugs in the novel Heliopolis, on which he was then working. He wrote to me about the drug researcher who figures in the novel: 

Among the trips in the geographical and metaphysical worlds, which I am attempting to describe there, are those of a purely sedentary man, who explores the archipelagos beyond the navigable seas, for which he uses drugs as a vehicle. I give extracts from his log book. Certainly, I cannot allow this Columbus of the inner globe to end well; he dies of a poisoning. Avis au lecteur. 

The book that appeared the following year bore the subtitle Ruckblick auf eine Stadt [Retrospective on a city], a retrospective on a city of the future, in which technical apparatus and the weapons of the present time were developed still further in magic, and in which power struggles between a demonic technocracy and a conservative force took place. In the figure of Antonio Peri, Junger depicted the mentioned drug researcher, who resided in the ancient city of Heliopolis. 

He captured dreams, just like others appear to chase after butterflies with nets. He did not travel to the islands on Sundays and holidays and did not frequent the taverns on Pagos beach. He locked himself up in his studio for trips into the dreamy regions. He said that all countries and unknown islands were woven into the tapestry. The drugs served him as keys to entry into the chambers and caves of this world. In the course of the years he had gained great knowledge, and he kept a log book of his excursions. A small library adjoined this studio, consisting partly of herbals and medicinal reports, partly of works by poets and magicians. Antonio tended to read there while the effect of the drug itself developed. . . . He went on voyages of discovery in the universe of his brain.... 
In the center of this library, which was pillaged by mercenaries of the provincial governor during the arrest of Antonio Peri, stood the great inspirers of the nineteenth century: De Quincey, E.T.A. Hoffmann, Poe, and Baudelaire. Yet there were also books from the ancient past: herbals, necromancy texts, and demonology of the medieval world. They included the names Albertus Magnus, Raimundus Lullus, and Agrippa of Nettesheym.... Moreover, there was the great folio De Praestigiis Daemonum by Wierus, and the very unique compilations of Medicus Weckerus, published in Basel in 1582.... In another part of his collection, Antonio Peri seemed to have cast his attention principally "on ancient pharmacology books, formularies and pharmacopoeias, and to have hunted for reprints of journals and annals. Among others was found a heavy old volume by the Heidelberg psychologists on the extract of mescal buttons, and a paper on the phantastica of ergot by Hofmann-Bottmingen...." In the same year in which Heliopolis came out, I made the personal acquaintance of the author. I went to meet Ernst Junger in Ravensburg for a Swiss sojourn. On a wonderful fall journey in southern Switzerland, together with mutual friends, I experienced the radiant power of his personality. Two years later, at the beginning of February 1951, came the great adventure, an LSD trip with Ernst Junger. Since, up until that moment, there were only reports of LSD experiments in connection with psychiatric inquiries, this experiment especially interested me, because this was an opportunity to observe the effects of LSD on the artistic person, in a nonmedical milieu. That was still somewhat before Aldous Huxley, from the same perspective, began to experiment with mescaline, about which he then reported in his two books The Doors of Perception and Heaven and Hell. In order to have medical aid on hand if necessary, I invited my friend, the physician and pharmacologist Professor Heribert Konzett, to participate. 
The trip took place at 10:00 in the morning, in the living room of our house in Bottmingen. Since the reaction of such a highly sensitive man as Ernst Junger was not foreseeable, a low dose was chosen for this first experiment as a precaution, only 0.05 mg. The experiment, then, did not lead into great depths. The beginning phase was characterized by the intensification of aesthetic experience. Red-violet roses were of unknown luminosity and radiated in portentous brightness. The concerto for flute and harp by Mozart was perceived in its celestial beauty as heavenly music. In mutual astonishment we contemplated the haze of smoke that ascended with the ease of thought from a Japanese incense stick. As the inebriation became deeper and the conversation ended, we came to fantastic reveries while we lay in our easy chairs with closed eyes. Ernst Junger enjoyed the color display of oriental images; I was on a trip among Berber tribes in North Africa, saw colored caravans and lush oases. Heribert Konzett, whose features seemed to me to be transfigured, Buddha-like, experienced a breath of timelessness, liberation from the past and the future, blessedness through being completely here and now. The return from the altered state of consciousness was associated with strong sensitivity to cold. Like freezing travelers, we enveloped ourselves in covers for the landing. The return to everyday reality was celebrated with a good dinner, in which Burgundy flowed copiously. This trip was characterized by the mutuality and parallelism of our experiences, which were perceived as profoundly joyful. All three of us had drawn near the gate to an experience of mystical being; however, it did not open. The dose we had chosen was too low. Misunderstanding the reason, Ernst Junger, who had earlier been thrust into deeper realms by a high dose of mescaline, remarked: "Compared with the tiger mescaline, your LSD is, after all, only a house cat." 
After later experiments with higher doses of LSD, he revised this estimation. Junger has assimilated the mentioned spectacle of the incense stick into literature, in his story Besuch auf Godenholm [Visit to Godenholm], in which deeper experiences of drug inebriation also play a part: Schwarzenberg burned an incense stick, as he sometimes did, to clear the air. A blue plume ascended from the tip of the stick. Moltner looked at it first with astonishment, then with delight, as if a new power of the eyes had come to him. It revealed itself in the play of this fragrant smoke, which ascended from the slender stick and then branched out into a delicate crown. It was as if his imagination had created it: a pallid web of sea lilies in the depths, that scarcely trembled from the beat of the surf. Time was active in this creation; it had circled it, whirled about it, wreathed it, as if imaginary coins rapidly piled up one on top of another. The abundance of space revealed itself in the fiber work, the nerves, which stretched and unfolded in the height, in a vast number of filaments. Now a breath of air affected the vision, and softly twisted it about the shaft like a dancer. Moltner uttered a shout of surprise. The beams and lattices of the wondrous flower wheeled around in new planes, in new fields. Myriads of molecules observed the harmony. Here the laws no longer acted under the veil of appearance; matter was so delicate and weightless that it clearly reflected them. How simple and cogent everything was. The numbers, masses and weights stood out from matter. They cast off the raiments. No goddess could inform the initiates more boldly and freely. The pyramids with their weight did not reach up to this revelation. That was Pythagorean luster. No spectacle had ever affected him with such a magic spell. 
This deepened experience in the aesthetic sphere, as it is described here in the example of contemplation of a haze of blue smoke, is typical of the beginning phase of LSD inebriation, before deeper alterations of consciousness begin. I visited Ernst Junger occasionally in the following years, in Wilflingen, Germany, where he had moved from Ravensburg; or we met in Switzerland, at my place in Bottmingen, or in Bundnerland in southeastern Switzerland. Through the shared LSD experience our relations had deepened. Drugs and problems connected with them constituted a major subject of our conversation and correspondence, without our having made further practical experiments in the meantime. We exchanged literature about drugs. Ernst Junger thus let me have for my drug library the rare, valuable monograph of Dr. Ernst Freiherrn von Bibra, Die Narkotischen Genussmittel und der Mensch [Narcotic pleasure drugs and man], printed in Nuremberg in 1855. This book is a pioneering, standard work of drug literature, a source of the first order, above all as relates to the history of drugs. What von Bibra embraces under the designation "Narkotischen Genussmittel" are not only substances like opium and thorn apple, but also coffee, tobacco, and khat, which do not fall under the present conception of narcotics, any more than do drugs such as coca, fly agaric, and hashish, which he also described. 
Noteworthy, and today still as topical as at the time, are the general opinions about drugs that von Bibra contrived more than a century ago: The individual who has taken too much hashish, and then runs frantically about in the streets and attacks everyone who confronts him, sinks into insignificance beside the numbers of those who after mealtime pass calm and happy hours with a moderate dose; and the number of those who are able to overcome the heaviest exertions through coca, yes, who were possibly rescued from death by starvation through coca, by far exceeds the few coqueros who have undermined their health by immoderate use. In the same manner, only a misplaced hypocrisy can condemn the vinous cup of old father Noah, because individual drunkards do not know how to observe limit and moderation. From time to time I advised Ernst Junger about actual and entertaining events in the field of inebriating drugs, as in my letter of September 1955: . . . Last week the first 200 grams of a new drug arrived, whose investigation I wish to take up. It involves the seeds of a mimosa (Piptadenia peregrina Benth.) that is used as a stimulating intoxicant by the Indians of the Orinoco. The seeds are ground, fermented, and then mixed with the powder of burned snail shells. This powder is sniffed by the Indians with the help of a hollow, forked bird bone, as already reported by Alexander von Humboldt in Reise nach den Aequinoctial-Gegenden des Neuen Kontinents [Voyage to the equinoctial regions of the new continent] (Book 8, Chapter 24). The warlike tribe, the Otomaco, especially uses this drug, called niopo, yupa, nopo or cojoba, to an extensive degree, even today. It is reported in the monograph by P. J. Gumilla, S. J. (El Orinoco Ilustrado, 1741): "The Otomacos sniffed the powder before they went to battle with the Caribes, for in earlier times there existed savage wars between these tribes.... This drug robs them completely of reason, and they frantically seize their weapons. 
And if the women were not so adept at holding them back and binding them fast, they would daily cause horrible devastation. It is a terrible vice.... Other benign and docile tribes that also sniff the yupa do not get into such a fury as the Otomacos, who through self-injury with this agent made themselves completely cruel before combat, and marched into battle with savage fury." I am curious how niopo would act on people like us. Should a niopo session one day come to pass, then we should on no account send our wives away, as on that early spring reverie [The LSD trip of February 1951 is meant here.], that they may bind us fast if necessary.... Chemical analysis of this drug led to isolation of active principles that, like the ergot alkaloids and psilocybin, belong to the group of indole alkaloids, but which were already described in the technical literature, and were therefore not investigated further in the Sandoz laboratories. [Translator's note: The active principles of niopo are DMT (N,N-dimethyltryptamine) and its congeners. DMT was first prepared in 1931 by Manske.] The fantastic effects described above appeared to occur only with the particular manner of use as snuff powder, and also seemed to be related, in all probability, to the psychic structure of the Indian tribes concerned.

Ambivalence of Drug Use

Fundamental questions of drug problems were dealt with in the following correspondence.

Bottmingen, 16 December 1961

Dear Mr. Junger,

On the one hand, I would have the great desire, besides the natural-scientific, chemical-pharmacological investigation of hallucinogenic substances, also to research their use as magic drugs in other regions.... On the other hand, I must admit that the fundamental question very much occupies me, whether the use of these types of drugs, namely of substances that so deeply affect our minds, could not indeed represent a forbidden transgression of limits. 
As long as any means or methods are used, which provide only an additional, newer aspect of reality, surely there is nothing to object to in such means; on the contrary, the experience and the knowledge of further facets of the reality only makes this reality ever more real to us. The question exists, however, whether the deeply affecting drugs under discussion here will in fact only open an additional window for our senses and perceptions, or whether the spectator himself, the core of his being, undergoes alterations. The latter would signify that something is altered that in my opinion should always remain intact. My concern is addressed to the question, whether the innermost core of our being is actually unimpeachable, and cannot become damaged by whatever happens in its material, physical-chemical, biological and psychic shells, or whether matter in the form of these drugs displays a potency that has the ability to attack the spiritual center of the personality, the self. The latter would have to be explained by the fact that the effect of magic drugs happens at the borderline where mind and matter merge; that these magic substances are themselves cracks in the infinite realm of matter, in which the depth of matter, its relationship with the mind, becomes particularly obvious. This could be expressed by a modification of the familiar words of Goethe:

"Were the eye not sunny,
It could never behold the sun;
If the power of the mind were not in matter,
How could matter disturb the mind."

This would correspond to cracks which the radioactive substances constitute in the periodic system of the elements, where the transition of matter into energy becomes manifest. Indeed, one must ask whether the production of atomic energy likewise represents a transgression of forbidden limits. A further disquieting thought, which follows from the possibility of influencing the highest intellectual functions by traces of a substance, concerns free will. 
The highly active psychotropic substances like LSD and psilocybin possess in their chemical structure a very close relationship with substances inherent in the body, which are found in the central nervous system and play an important role in the regulation of its functions. It is therefore conceivable that through some disturbance in the metabolism of the normal neurotransmitters, a compound like LSD or psilocybin is formed, which can determine and alter the character of the individual, his world view and his behavior. A trace of a substance, whose production or nonproduction we cannot control with our wills, has the power to shape our destiny. Such biochemical considerations could have led to the sentence that Gottfried Benn quoted in his essay "Provoziertes Leben" [Provoked life]: "God is a substance, a drug!" On the other hand, it is well known that substances like adrenaline, for example, are formed or set free in our organism by thoughts and emotions, which for their part determine the functions of the nervous system. One may therefore suppose that our material organism is susceptible to and shaped by our mind, in the same way that our intellectual essence is shaped by our biochemistry. Which came first can indeed no better be determined than the question, whether the chicken came before the egg. In spite of my uncertainty with regard to the fundamental dangers that could lie in the use of hallucinogenic substances, I have continued investigations on the active principles of the Mexican magic morning glories, of which I wrote you briefly once before. In the seeds of this morning glory, which were called ololiuhqui by the ancient Aztecs, we found as active principles lysergic acid derivatives chemically very closely related to LSD. That was an almost unbelievable finding. I have all along had a particular love for the morning glories. They were the first flowers that I grew myself in my little child's garden. 
Their blue and red cups belong to the first memories of my childhood. I recently read in a book by D. T. Suzuki, Zen and Japanese Culture, that the morning glory plays a great role in Japan, among the flower lovers, in literature, and in graphic arts. Its fleeting splendor has given the Japanese imagination rich stimulus. Among others, Suzuki quotes a three-line poem of the poetess Chiyo (1702-75), who one morning went to fetch water from a neighbor's house, because . . .

"My trough is captivated by a morning glory blossom,
So I ask after water."

The morning glory thus shows both possible ways of influencing the mind-body-essence of man: in Mexico it exerts its effects in a chemical way as a magic drug, while in Japan it acts from the spiritual side, through the beauty of its flower cups.

Wilflingen, 17 December 1961

Dear Mr. Hofmann,

I give you my thanks for your detailed letter of 16 December. I have reflected on your central question, and may probably become occupied with it on the occasion of the revision of An der Zeitmauer [At the wall of time]. There I intimated that, in the field of physics as well as in the field of biology, we are beginning to develop procedures that are no longer to be understood as advances in the established sense, but that rather intervene in evolution and lead forth in the development of the species. Certainly I turn the glove inside out, for I suppose that it is a new world age, which begins to act evolutionarily on the prototypes. Our science with its theories and discoveries is therefore not the cause, rather one of the consequences of evolution, among others. Animals, plants, the atmosphere and the surfaces of planets will be concerned simultaneously. We do not progress from point to point, rather we cross over a line. The risk that you indicated is well to be considered. However, it exists in every aspect of our existence. The common denominator appears now here, now there. In mentioning radioactivity, you use the word crack. 
Cracks are not merely points of discovery, but also points of destruction. Compared to the effects of radiation, those of the magical drugs are more genuine and much less rough. In classical manner they lead us beyond the humane. Gurdjieff has already seen that to some extent. Wine has already changed much, has brought new gods and a new humanity with it. But wine is to the new substances as classical physics is to modern physics. These things should only be tried in small circles. I cannot agree with the thoughts of Huxley, that possibilities for transcendence could here be given to the masses. Indeed, this does not involve comforting fictions, but rather realities, if we take the matter earnestly. And few contacts will suffice here for the setting of courses and guidance. It also transcends theology and belongs in the chapter of theogony, as it necessarily entails entry into a new house, in the astrological sense. At first, one can be satisfied with this insight, and should above all be cautious with the designations. Heartfelt thanks also for the beautiful picture of the blue morning glory. It appears to be the same that I cultivate year after year in my garden. I did not know that it possesses specific powers; however, that is probably the case with every plant. We do not know the key to most. Besides this, there must be a central viewpoint from which not only the chemistry, the structure, the color, but rather all attributes become significant....

An Experiment with Psilocybin

Such theoretical discussions about the magic drugs were supplemented by practical experiments. One such experiment, which served as a comparison between LSD and psilocybin, took place in the spring of 1962. The proper occasion for it presented itself at the home of the Jungers, in the former head forester's house of Stauffenberg's Castle in Wilflingen. My friends, the pharmacologist Professor Heribert Konzett and the Islamic scholar Dr. 
Rudolf Gelpke, also took part in this mushroom symposium. The old chronicles described how the Aztecs drank chocolatl before they ate teonanacatl. Thus Mrs. Liselotte Junger likewise served us hot chocolate, to set the mood. Then she abandoned the four men to their fate. We had gathered in a fashionable living room, with a dark wooden ceiling, white tile stove, period furniture, old French engravings on the walls, a gorgeous bouquet of tulips on the table. Ernst Junger wore a long, broad, dark blue striped kaftan-like garment that he had brought from Egypt; Heribert Konzett was resplendent in a brightly embroidered mandarin gown; Rudolf Gelpke and I had put on housecoats. The everyday reality should be laid aside, along with everyday clothing. Shortly before sundown we took the drug, not the mushrooms, but rather their active principle, 20 mg psilocybin each. That corresponded to some two-thirds of the very strong dose that was taken by the curandera Maria Sabina in the form of Psilocybe mushrooms. After an hour I still noticed no effect, while my companions were already very deeply into the trip. I had come with the hope that in the mushroom inebriation I could manage to allow certain images from euphoric moments of my childhood, which remained in my memory as blissful experiences, to come alive: a meadow covered with chrysanthemums lightly stirred by the early summer wind; the rosebush in the evening light after a rain storm; the blue irises hanging over the vineyard wall. Instead of these bright images from my childhood home, strange scenery emerged, when the mushroom factor finally began to act. Half stupefied, I sank deeper, passed through totally deserted cities with a Mexican type of exotic, yet dead splendor. Terrified, I tried to detain myself on the surface, to concentrate alertly on the outer world, on the surroundings. For a time I succeeded. I then observed Ernst Junger, colossal in the room, pacing back and forth, a powerful, mighty magician. 
Heribert Konzett in the silky lustrous housecoat seemed to be a dangerous, Chinese clown. Even Rudolf Gelpke appeared sinister to me; long, thin, mysterious. With the increasing depth of inebriation, everything became yet stranger. I even felt strange to myself. Weird, cold, foolish, deserted, in a dull light, were the places I traversed when I closed my eyes. Emptied of all meaning, the environment also seemed ghostlike to me whenever I opened my eyes and tried to cling to the outer world. The total emptiness threatened to drag me down into absolute nothingness. I remember how I seized Rudolf Gelpke's arm as he passed by my chair, and held myself to him, in order not to sink into dark nothingness. Fear of death seized me, and illimitable longing to return to the living creation, to the reality of the world of men. After timeless fear I slowly returned to the room. I saw and heard the great magician lecturing uninterruptedly with a clear, loud voice, about Schopenhauer, Kant, Hegel, and speaking about the old Gaa, the beloved little mother. Heribert Konzett and Rudolf Gelpke were already completely on the earth again, while I could only regain my footing with great effort. For me this entry into the mushroom world had been a test, a confrontation with a dead world and with the void. The experiment had developed differently from what I had expected. Nevertheless, the encounter with the void can also be appraised as a gain. Then the existence of the creation appears so much more wondrous. Midnight had passed, as we sat together at the table that the mistress of the house had set in the upper story. We celebrated the return with an exquisite repast and with Mozart's music. The conversation, during which we exchanged our experiences, lasted almost until morning. 
Ernst Junger has described how he had experienced this trip, in his book Annaherungen. Drogen und Rausch [Approaches: Drugs and Inebriation] (published by Ernst Klett Verlag, Stuttgart, 1970), in the section "Ein Pilz-Symposium" [A mushroom symposium]. The following is an extract from the work: As usual, a half hour or a little more passed in silence. Then came the first signs: the flowers on the table began to flare up and sent out flashes. It was time for leaving work; outside the streets were being cleaned, like on every weekend. The brush strokes invaded the silence painfully. This shuffling and brushing, now and again also a scraping, pounding, rumbling, and hammering, has random causes and is also symptomatic, like one of the signs that announces an illness. Again and again it also plays a role in the history of magic practices. By this time the mushroom began to act; the spring bouquet glowed darker. That was no natural light. The shadows stirred in the corners, as if they sought form. I became uneasy, even chilled, despite the heat that emanated from the tiles. I stretched myself on the sofa, drew the covers over my head. Everything became skin and was touched, even the retina; there the contact was light. This light was multicolored; it arranged itself in strings, which gently swung back and forth; in strings of glass beads of oriental doorways. They formed doors, like those one passes through in a dream, curtains of lust and danger. The wind stirred them like a garment. They also fell down from the belts of dancers, opened and closed themselves with the swing of the hips, and from the beads a rippling of the most delicate sounds fluttered to the heightened senses. The chime of the silver rings on the ankles and wrists is already too loud. It smells of sweat, blood, tobacco, chopped horse hairs, cheap rose essence. Who knows what is going on in the stables? It must be an immense palace, Mauritanian, not a good place. 
At this ballroom flights of adjoining rooms lead into the lower stratum. And everywhere the curtains with their glitter, their sparkling, radioactive glow. Moreover, the rippling of glassy instruments with their beckoning, their wooing solicitation: "Will you go with me, beautiful boy?" Now it ceased, now it repeated, more importunate, more intrusive, almost already assured of agreement. Now came forms: historical collages, the vox humana, the call of the cuckoo. Was it the whore of Santa Lucia, who stuck her breasts out of the window? Then the play was ruined. Salome danced; the amber necklace emitted sparks and made the nipples erect. What would one not do for one's Johannes? [Translator's note: "Johannes" here is slang for penis, as in English "Dick" or "Peter."] Damned, that was a disgusting obscenity, which did not come from me, but was whispered through the curtain. The snakes were dirty, scarcely alive, they wallowed sluggishly over the floor mats. They were garnished with brilliant shards. Others looked up from the floor with red and green eyes. It glistened and whispered, hissed and sparkled like diminutive sickles at the sacred harvest. Then it quieted, and came anew, more faintly, more forward. They had me in their hand. "There we immediately understood ourselves." Madam came through the curtain: she was busy, passed by me without noticing me. I saw the boots with the red heels. Garters constricted the thick thighs in the middle, the flesh bulged out there. The enormous breasts, the dark delta of the Amazon, parrots, piranhas, semiprecious stones everywhere. Now she went into the kitchen; or are there still cellars here? The sparkling and whispering, the hissing and twinkling could no longer be differentiated; it seemed to become concentrated, now proudly rejoicing, full of hope. It became hot and intolerable; I threw the covers off. 
The room was faintly illuminated; the pharmacologist stood at the window in the white mandarin frock, which had served me shortly before in Rottweil at the carnival. The orientalist sat beside the tile stove; he moaned as if he had a nightmare. I understood; it had been a first round, and it would soon start again. The time was not yet up. I had already seen the beloved little mother under other circumstances. But even excrement is earth, belongs like gold to transformed matter. One must come to terms with it, without getting too close. These were the earthy mushrooms. More light was hidden in the dark grain that burst from the ear, more yet in the green juice of the succulents on the glowing slopes of Mexico. . . . [Translator's note: Junger is referring to LSD, a derivative of ergot, and mescaline, derived from the Mexican peyotl cactus.] The trip had run awry; possibly I should address the mushrooms once more. Yet indeed the whispering returned, the flashing and sparkling; the bait pulled the fish close behind itself. Once the motif is given, then it engraves itself, like on a roller: each new beginning, each new revolution repeats the melody. The game did not get beyond this kind of dreariness. I don't know how often this was repeated, and prefer not to dwell upon it. Also, there are things which one would rather keep to oneself. In any case, midnight was past.... We went upstairs; the table was set. The senses were still heightened and the Doors of Perception were opened. The light undulated from the red wine in the carafe; a froth surged at the brim. We listened to a flute concerto. It had not turned out better for the others: "How beautiful, to be back among men." Thus Albert Hofmann. The orientalist on the other hand had been in Samarkand, where Timur rests in a coffin of nephrite. He had followed the victorious march through cities, whose dowry on entry was a cauldron filled with eyes. 
There he had long stood before one of the skull pyramids that terrible Timur had erected, and in the multitude of severed heads had perceived even his own. It was encrusted with stones. A light dawned on the pharmacologist when he heard this: "Now I know why you were sitting in the armchair without your head." I was astonished; I knew I wasn't dreaming. I wonder whether I should not strike out this detail since it borders on the area of ghost stories. The mushroom substance had carried all four of us off, not into luminous heights, rather into deeper regions. It seems that the psilocybin inebriation is more darkly colored in the majority of cases than the inebriation produced by LSD. The influence of these two active substances is sure to differ from one individual to another. Personally, for me, there was more light in the LSD experiments than in the experiments with the earthy mushroom, just as Ernst Junger remarks in the preceding report.

Another LSD Session

The next and last thrust into the inner universe together with Ernst Junger, this time again using LSD, led us very far from everyday consciousness. We came close to the ultimate door. Of course this door, according to Ernst Junger, will in fact only open for us in the great transition from life into the hereafter. This last joint experiment occurred in February 1970, again at the head forester's house in Wilflingen. In this case there were only the two of us. Ernst Junger took 0.15 mg LSD, I took 0.10 mg. Ernst Junger has published without commentary the log book, the notes he made during the experiment, in Approaches, in the section "Nochmals LSD" [LSD once again]. They are scanty and tell the reader little, just like my own records. The experiment lasted from morning just after breakfast until darkness fell. 
At the beginning of the trip, we again listened to the concerto for flute and harp by Mozart, which always made me especially happy, but this time, strange to say, seemed to me like the turning of porcelain figures. Then the intoxication led quickly into wordless depths. When I wanted to describe the perplexing alterations of consciousness to Ernst Junger, no more than two or three words came out, for they sounded so false, so unable to express the experience; they seemed to originate from an infinitely distant world that had become strange; I abandoned the attempt, laughing hopelessly. Obviously, Ernst Junger had the same experience, yet we did not need speech; a glance sufficed for the deepest understanding. I could, however, put some scraps of sentences on paper, such as at the beginning: "Our boat tosses violently." Later, upon regarding expensively bound books in the library: "Like red-gold pushed from within to without-exuding golden luster." Outside it began to snow. Masked children marched past and carts with carnival revelers passed by in the streets. With a glance through the window into the garden, in which snow patches lay, many-colored masks appeared over the high walls bordering it, embedded in an infinitely joyful shade of blue: "A Breughel garden-I live with and in the objects." Later: "At present-no connection with the everyday world." Toward the end, deep, comforting insight expressed: "Hitherto confirmed on my path." This time LSD had led to a blessed approach. _________________________________________________________________ 8. Meeting with Aldous Huxley In the mid-1950s, two books by Aldous Huxley appeared, The Doors of Perception and Heaven and Hell, dealing with inebriated states produced by hallucinogenic drugs. The alterations of sensory perceptions and consciousness, which the author experienced in a self-experiment with mescaline, are skillfully described in these books. The mescaline experiment was a visionary experience for Huxley. 
He saw objects in a new light; they disclosed their inherent, deep, timeless existence, which remains hidden from everyday sight. These two books contained fundamental observations on the essence of visionary experience and about the significance of this manner of comprehending the world-in cultural history, in the creation of myths, in the origin of religions, and in the creative process out of which works of art arise. Huxley saw the value of hallucinogenic drugs in that they give people who lack the gift of spontaneous visionary perception belonging to mystics, saints, and great artists, the potential to experience this extraordinary state of consciousness, and thereby to attain insight into the spiritual world of these great creators. Hallucinogens could lead to a deepened understanding of religious and mystical content, and to a new and fresh experience of the great works of art. For Huxley these drugs were keys capable of opening new doors of perception; chemical keys, in addition to other proven but laborious "door openers" to the visionary world like meditation, isolation, and fasting, or like certain yoga practices. At the time I already knew the earlier work of this great writer and thinker, books that meant much to me, like Point Counter Point, Brave New World, After Many a Summer, Eyeless in Gaza, and a few others. In The Doors of Perception and Heaven and Hell, Huxley's newly published works, I found a meaningful exposition of the experience induced by hallucinogenic drugs, and I thereby gained a deepened insight into my own LSD experiments. I was therefore delighted when I received a telephone call from Aldous Huxley in the laboratory one morning in August 1961. He was passing through Zurich with his wife. He invited me and my wife to lunch in the Hotel Sonnenberg. A gentleman with a yellow freesia in his buttonhole, a tall and noble appearance, who exuded kindness - this is the image I retained from this first meeting with Aldous Huxley.
The table conversation revolved mainly around the problem of magic drugs. Both Huxley and his wife, Laura Archera Huxley, had also experimented with LSD and psilocybin. Huxley would have preferred not to designate these two substances and mescaline as "drugs," because in English usage, as also by the way with Droge in German, that word has a pejorative connotation, and because it was important to differentiate the hallucinogens from the other drugs, even linguistically. He believed in the great importance of agents producing visionary experience in the modern phase of human evolution. He considered experiments under laboratory conditions to be insignificant, since in the extraordinarily intensified susceptibility and sensitivity to external impressions, the surroundings are of decisive importance. He recommended to my wife, when we spoke of her native place in the mountains, that she take LSD in an alpine meadow and then look into the blue cup of a gentian flower, to behold the wonder of creation. As we parted, Aldous Huxley gave me, as a remembrance of this meeting, a tape recording of his lecture "Visionary Experience," which he had delivered the week before at an international congress on applied psychology in Copenhagen. In this lecture, Aldous Huxley spoke about the meaning and essence of visionary experience and compared this type of world view to the verbal and intellectual comprehension of reality as its essential complement. In the following year, the newest and last book by Aldous Huxley appeared, the novel Island. This story, set on the utopian island Pala, is an attempt to blend the achievements of natural science and technical civilization with the wisdom of Eastern thought, to achieve a new culture in which rationalism and mysticism are fruitfully united. The moksha medicine, a magical drug prepared from a mushroom, plays a significant role in the life of the population of Pala (moksha is Sanskrit for "release," "liberation"). 
The drug could be used only in critical periods of life. The young men on Pala received it in initiation rites; it is dispensed to the protagonist of the novel during a life crisis, in the scope of a psychotherapeutic dialogue with a spiritual friend; and it helps the dying to relinquish the mortal body, in the transition to another existence. In our conversation in Zurich, I had already learned from Aldous Huxley that he would again treat the problem of psychedelic drugs in his forthcoming novel. Now he sent me a copy of Island, inscribed "To Dr. Albert Hofmann, the original discoverer of the moksha medicine, from Aldous Huxley." The hopes that Aldous Huxley placed in psychedelic drugs as a means of evoking visionary experience, and the uses of these substances in everyday life, are subjects of a letter of 29 February 1962, in which he wrote me: . . . I have good hopes that this and similar work will result in the development of a real Natural History of visionary experience, in all its variations, determined by differences of physique, temperament and profession, and at the same time of a technique of Applied Mysticism - a technique for helping individuals to get the most out of their transcendental experience and to make use of the insights from the "Other World" in the affairs of "This World." Meister Eckhart wrote that "what is taken in by contemplation must be given out in love." Essentially this is what must be developed-the art of giving out in love and intelligence what is taken in from vision and the experience of self-transcendence and solidarity with the Universe.... Aldous Huxley and I were together often at the annual convention of the World Academy of Arts and Sciences (WAAS) in Stockholm during late summer 1963. His suggestions and contributions to discussions at the sessions of the academy, through their form and importance, had a great influence on the proceedings.
WAAS had been established in order to allow the most competent specialists to consider world problems in a forum free of ideological and religious restrictions and from an international viewpoint encompassing the whole world. The results - proposals and thoughts in the form of appropriate publications - were to be placed at the disposal of the responsible governments and executive organizations. The 1963 meeting of WAAS had dealt with the population explosion and the raw material reserves and food resources of the earth. The corresponding studies and proposals were collected in Volume II of WAAS under the title The Population Crisis and the Use of World Resources. A decade before birth control, environmental protection, and the energy crisis became catchwords, these world problems were examined there from the most serious point of view, and proposals for their solution were made to governments and responsible organizations. The catastrophic events since that time in the aforementioned fields make evident the tragic discrepancy between recognition, desire, and feasibility. Aldous Huxley made the proposal, as a continuation and complement of the theme "World Resources" at the Stockholm convention, to address the problem "Human Resources," the exploration and application of capabilities hidden in humans yet unused. A human race with more highly developed spiritual capacities, with expanded consciousness of the depth and the incomprehensible wonder of being, would also have greater understanding of and better consideration for the biological and material foundations of life on this earth. Above all, for Western people with their hypertrophied rationality, the development and expansion of a direct, emotional experience of reality, unobstructed by words and concepts, would be of evolutionary significance. Huxley considered psychedelic drugs to be one means to achieve education in this direction. The psychiatrist Dr.
Humphry Osmond, likewise participating in the congress, who had created the term psychedelic (mind-expanding), assisted him with a report about significant possibilities of the use of hallucinogens. The convention in Stockholm in 1963 was my last meeting with Aldous Huxley. His physical appearance was already marked by a severe illness; his intellectual personage, however, still bore the undiminished signs of a comprehensive knowledge of the heights and depths of the inner and outer world of man, which he had displayed with so much genius, love, goodness, and humor in his literary work. Aldous Huxley died on 22 November of the same year, on the same day President Kennedy was assassinated. From Laura Huxley I obtained a copy of her letter to Julian and Juliette Huxley, in which she reported to her brother- and sister-in-law about her husband's last day. The doctors had prepared her for a dramatic end, because the terminal phase of cancer of the throat, from which Aldous Huxley suffered, is usually accompanied by convulsions and choking fits. He died serenely and peacefully, however. In the morning, when he was already so weak that he could no longer speak, he had written on a sheet of paper: "LSD-try it-intramuscular-100 mmg." Mrs. Huxley understood what was meant by this, and ignoring the misgivings of the attending physician, she gave him, with her own hand, the desired injection-she let him have the moksha medicine. _________________________________________________________________ 9. Correspondence with the Poet-Physician Walter Vogt My friendship with the physician, psychiatrist, and writer Walter Vogt, M.D., is also among the personal contacts that I owe to LSD. As the following extract from our correspondence shows, it was less the medicinal aspects of LSD, important to the physician, than the consciousness-altering effects on the depth of the psyche, of interest to the writer, that constituted the theme of our correspondence. 
Muri/Bern, 22 November 1970 Dear Mr. Hofmann, Last night I dreamed that I was invited to tea in a cafe by a friendly family in Rome. This family also knew the pope, and so the pope sat at the same table to tea with us. He was all in white and also wore a white miter. He sat there so handsome and was silent. And today I suddenly had the idea of sending you my Vogel auf dem Tisch [Bird on the table]-as a visiting card if you so wish-a book that remained a little apocryphal, which upon reflection I do not regret, although the Italian translator is firmly convinced that it is my best. (Ah yes, the pope is also an Italian. So it goes. . . .) Possibly this little work will interest you. It was written in 1966 by an author who at that time still had not had any shred of experience with psychedelic substances and who read the reports about medicinal experiments with these drugs devoid of understanding. However, little has changed since, except that now the misgiving comes from the other side. I suppose that your discovery has caused a hiatus (not directly a Saul-to-Paul conversion, as Roland Fischer says . . .) in my work (also a large word) - and indeed, that which I have written since has become rather realistic or at least less expressive. In any case I could not have brought off the cool realism of my TV piece "Spiele der Macht" [Games of power] without it. The different drafts attest to it, in case they are still lying around somewhere. Should you have interest and time for a meeting, it would delight me very much to visit you sometime for a conversation. W. V. Burg, i.L. 28 November 1970 Dear Mr. Vogt, If the bird that alighted on my table was able to find its way to me, this is one more debt I owe to the magical effect of LSD. I could soon write a book about all of the results that derive from that experiment in 1943.... A. H. Muri/Bern, 13 March 1971 Dear Mr.
Hofmann, Enclosed is a critique of Junger's Annaherungen [Approaches], from the daily paper, that will presumably interest you.... It seems to me that to hallucinate-to dream-to write, stands at all times in contrast to everyday consciousness, and their functions are complementary. Here I can naturally speak only for myself. This could be different with others - it is also truly difficult to speak with others about such things, because people often speak altogether different languages.... However, since you are now gathering autographs, and do me the honor of incorporating some of my letters in your collection, I enclose for you the manuscript of my "testament" - in which your discovery plays a role as "the only joyous invention of the twentieth century...." W. V. dr. walter vogts most recent testament 1969 I wish to have no special funeral only expensive and obscene orchids innumerable little birds with gay names no naked dancers but psychedelic garments loudspeaker in every corner and nothing but the latest beatles record [Abbey Road] one hundred thousand million times and do what you like ["Blind Faith"] on an endless tape nothing more than a popular Christ with a halo of genuine gold and a beloved mourning congregation that pumped themselves full with acid [acid = LSD] till they go to heaven [From Abbey Road, side two] one two three four five six seven possibly we will encounter one another there most cordially dedicated to Dr. Albert Hofmann Beginning of Spring 1971 Burg i.L., 29 March 1971 Dear Mr. Vogt, You have again presented me with a lovely letter and a very valuable autograph, the testament 1969.... Very remarkable dreams in recent times induce me to test a connection between the composition (chemical) of the evening meal and the quality of dreams. Yes, LSD is also something that one eats.... A. H. Muri/Bern, 5 September 1971 Dear Mr. Hofmann, Over the weekend at Murtensee [On that Sunday, I (A. H.)
hovered over the Murtensee in the balloon of my friend E. I., who had taken me along as passenger.] I often thought of you-a most radiant autumn day. Yesterday, Saturday, thanks to one tablet of aspirin (on account of a headache or mild flu), I experienced a very comical flashback, like with mescaline (of which I have had only a little, exactly once).... I have read a delightful essay by Wasson about mushrooms; he divides mankind into mycophobes and mycophiles.... Lovely fly agarics must now be growing in the forest near you. Sometime shouldn't we sample some? W. V. Muri/Bern, 7 September 1971 Dear Mr. Hofmann, Now I feel I must write briefly to tell you what I have done outside in the sun, on the dock under your balloon: I finally wrote some notes about our visit in Villars-sur-Ollon (with Dr. Leary), then a hippie bark went by on the lake, self-made as if from a Fellini film, which I sketched, and over and above it I drew your balloon. W. V. Burg i.L., 15 April 1972 Dear Mr. Vogt, Your television play "Spiele der Macht" [Games of power] has impressed me extraordinarily. I congratulate you on this magnificent piece, which allows mental cruelty to become conscious, and therefore also acts in its way as "consciousness-expanding", and can thereby prove itself therapeutic in a higher sense, like ancient tragedy. A. H. Burg i.L., 19 May 1973 Dear Mr. Vogt, Now I have already read your lay sermon three times, the description and interpretation of your Sinai Trip. [Walter Vogt: Mein Sinai Trip. Eine Laienpredigt [My Sinai trip: A lay sermon] (Verlag der Arche, Zurich, 1972). This publication contains the text of a lay sermon that Walter Vogt gave on 14 November 1971 on the invitation of Parson Christoph Mohl, in the Protestant church of Vaduz (Liechtenstein), in the course of a series of sermons by writers, and in addition contains an afterword by the author and by the inviting parson.
It involves the description and interpretation of an ecstatic-religious experience evoked by LSD, which the author is able to "place in a distant, if you will superficial, analogy to the great Sinai Trip of Moses." It is not only the "patriarchal atmosphere" that is to be traced out of these descriptions that constitutes this analogy; there are deeper references, which are more to be read between the lines of this text.] Was it really an LSD trip? . . . It was a courageous deed, to choose such a notorious event as a drug experience as the theme of a sermon, even a lay sermon. But the questions raised by hallucinogenic drugs do actually belong in the church-in a prominent place in the church, for they are sacred drugs (peyotl, teonanacatl, ololiuhqui, with which LSD is most closely related by chemical structure and activity). I can fully agree with what you say in your introduction about modern ecclesiastical religiosity: the three sanctioned states of consciousness (the waking condition of uninterrupted work and performance of duty, alcoholic intoxication, and sleep), the distinction between two phases of psychedelic inebriation (the first phase, the peak of the trip, in which the cosmic relationship is experienced, or the submersion into one's own body, in which everything that is, is within; and the second phase, characterized as the phase of enhanced comprehension of symbols), and the allusion to the candor that hallucinogens bring about in consciousness states. These are all observations that are of fundamental importance in the judgement of hallucinogenic inebriation. The most worthwhile spiritual benefit from LSD experiments was the experience of the inextricable intertwining of the physical and spiritual. "Christ in matter" (Teilhard de Chardin). Did the insight first come to you also through your drug experiences, that we must descend "into the flesh, which we are," in order to get new prophecies?
A criticism of your sermon: you allow the "deepest experience that there is" - "The kingdom of heaven is within you" - to be uttered by Timothy Leary. This sentence, quoted without the indication of its true source, could be interpreted as ignorance of a, or rather the, principal truth of Christian belief. One of your statements deserves universal recognition: "There is no non-ecstatic religious experience." . . . Next Monday evening I shall be interviewed on Swiss television (about LSD and the Mexican magic drugs, on the program "At First Hand"). I am curious about the sort of questions that will be asked. . . A. H. Muri/Bern, 24 May 1973 Dear Mr. Hofmann, Of course it was LSD - only I did not want to write about it explicitly, I really do not know just why myself.... The great emphasis I placed on the good Leary, who now seems to me to be somewhat flipped out, as the prime witness, can indeed only be explained by the special context of the talk or sermon. I must admit that the perception that we must descend "into the flesh, which we are" actually first came to me with LSD. I still ruminate on it; possibly it even came "too late" for me in fact, although more and more I advocate your opinion that LSD should be taboo for youth (taboo, not forbidden, that is the difference . . .). The sentence that you like, "there is no non-ecstatic religious experience," was apparently not liked so much by others, for example by my (almost only) literary friend and minister-lyric poet Kurt Marti. . . . But in any case, we are practically never of the same opinion about anything, and notwithstanding, we constitute, when we occasionally communicate by phone and arrange little activities together, the smallest minimafia of Switzerland. W. V. Burg i.L., 13 April 1974 Dear Mr. Vogt, Full of suspense, we watched your TV play "Pilate before the Silent Christ" yesterday evening. . . .
as a representation of the fundamental man-God relationship: man, who comes to God with his most difficult questions, which finally he must answer himself, because God is silent. He does not answer them with words. The answers are contained in the book of his creation (to which the questioning man himself belongs). True natural science is a deciphering of this text. A. H. Muri/Bern, 11 May 1974 Dear Mr. Hofmann, I have composed a "poem" in half twilight, which I dare to send to you. At first I wanted to send it to Leary, but this would make no sense. Leary in jail Gelpke is dead Treatment in the asylum is this your psychedelic revolution? Had we taken seriously something with which one only ought to play or vice-versa . . . W. V. _________________________________________________________________ 10. Various Visitors The diverse aspects, the multi-faceted emanations of LSD are also expressed in the variety of cultural circles with which this substance has brought me into contact. On the scientific plane, this has involved colleagues-chemists, pharmacologists, physicians, and mycologists-whom I met at universities, congresses, or lectures, or with whom I came into association through publication. In the literary-philosophical field there were contacts with writers. In the preceding chapters I have reported on the relationships of this type that were most significant for me. LSD also provided me with a variegated series of personal acquaintances from the drug scene and from hippie circles, which will briefly be described here. Most of these visitors came from the United States and were young people, often in transit to the Far East in search of Eastern wisdom or of a guru, or else hoping to come by drugs more easily there. Prague also was sometimes the goal, because LSD of good quality could at the time easily be acquired there. [Translator's Note: When Sandoz's patents on LSD expired in 1963, the Czech pharmaceutical firm Spofa began to manufacture the drug.]
Once arrived in Europe, they wanted to take advantage of the opportunity to see the father of LSD, "the man who made the famous LSD bicycle trip." But more serious concerns sometimes motivated a visit. There was the desire to report on personal LSD experiences and to debate the purport of their meaning, at the source, so to speak. Only rarely did a visit prove to be inspired by the desire to obtain LSD, as when a visitor hinted that he or she wished for once to experiment with assuredly pure material, with original LSD. Visitors of various types and with diverse desires also came from Switzerland and other European countries. Such encounters have become rarer in recent times, which may be related to the fact that LSD has become less important in the drug scene. Whenever possible, I have welcomed such visitors or agreed to meet somewhere. This I considered to be an obligation connected with my role in the history of LSD, and I have tried to help by instructing and advising. Sometimes no true conversation occurred, for example with the inhibited young man who arrived on a motorbike. I was not clear about the objective of his visit. He stared at me, as if asking himself: can the man who has made something so weird as LSD really look so completely ordinary? With him, as with other similar visitors, I had the feeling that he hoped, in my presence, the LSD riddle would somehow solve itself. Other meetings were completely different, like the one with the young man from Toronto. He invited me to lunch at an exclusive restaurant-impressive appearance, tall, slender, a businessman, proprietor of an important industrial firm in Canada, brilliant intellect. He thanked me for the creation of LSD, which had given his life another direction. He had been 100 percent a businessman, with a purely materialistic world view. LSD had opened his eyes to the spiritual aspect of life.
Now he possessed a sense for art, literature, and philosophy and was deeply concerned with religious and metaphysical questions. He now desired to make the LSD experience accessible in a suitable milieu to his young wife, and hoped for a similarly fortunate transformation in her. Not as profound, yet still liberating and rewarding, were the results of LSD experiments which a young Dane described to me with much humor and fantasy. He came from California, where he had been a houseboy for Henry Miller in Big Sur. He moved on to France with the plan of acquiring a dilapidated farm there, which he, a skilled carpenter, then wanted to restore himself. I asked him to obtain an autograph of his former employer for my collection, and after some time I actually received an original piece of writing from Henry Miller's hand. A young woman sought me out to report on LSD experiences that had been of great significance to her inner development. As a superficial teenager who pursued all sorts of entertainments, and quite neglected by her parents, she had begun to take LSD out of curiosity and love of adventure. For three years she took frequent LSD trips. They led to an astonishing intensification of her inner life. She began to seek after the deeper meaning of her existence, which eventually revealed itself to her. Then, recognizing that LSD had no further power to help her, without difficulty or exertion of will she was able to abandon the drug. Thereafter she was in a position to develop herself further without artificial means. She was now a happy intrinsically secure person-thus she concluded her report. This young woman had decided to tell me her history, because she supposed that I was often attacked by narrow-minded persons who saw only the damage that LSD sometimes caused among youths. The immediate motive of her testimony was a conversation that she had accidentally overheard on a railway journey. 
A man complained about me, finding it disgraceful that I had spoken on the LSD problem in an interview published in the newspaper. In his opinion, I ought to denounce LSD as primarily the devil's work and should publicly admit my guilt in the matter. Persons in LSD delirium, whose condition could have given rise to such indignant condemnation, have never personally come into my sight. Such cases, attributable to LSD consumption under irresponsible circumstances, to overdosage, or to psychotic predisposition, always landed in the hospital or at the police station. Great publicity always came their way. A visit by one young American girl stands out in my memory as an example of the tragic effects of LSD. It was during the lunch hour, which I normally spent in my office under strict confinement-no visitors, secretary's office closed up. Knocking came at the door, discreetly but firmly repeated, until eventually I went to open it. I scarcely believed my eyes: before me stood a very beautiful young woman, blond, with large blue eyes, wearing a long hippie dress, headband, and sandals. "I am Joan, I come from New York-you are Dr. Hofmann?" Before I inquired what brought her to me, I asked her how she had got through the two checkpoints, at the main entrance to the factory area and at the door of the laboratory building, for visitors were admitted only after telephone query, and this flower child must have been especially noticeable. "I am an angel, I can pass everywhere," she replied. Then she explained that she came on a great mission. She had to rescue her country, the United States; above all she had to direct the president (at the time L. B. Johnson) onto the correct path. This could be accomplished only by having him take LSD. Then he would receive the good ideas that would enable him to lead the country out of war and internal difficulties. Joan had come to me hoping that I would help her fulfill her mission, namely to give LSD to the president.
Her name would indicate she was the Joan of Arc of the USA. I don't know whether my arguments, advanced with all consideration of her holy zeal, were able to convince her that her plan had no prospects of success on psychological, technical, internal, and external grounds. Disappointed and sad she went away. Next day I received a telephone call from Joan. She again asked me to help her, since her financial resources were exhausted. I took her to a friend in Zurich who provided her with work, and with whom she could live. Joan was a teacher by profession, and also a nightclub pianist and singer. For a while she played and sang in a fashionable Zurich restaurant. The good bourgeois clients of course had no idea what sort of angel sat at the grand piano in a black evening dress and entertained them with sensitive playing and a soft and sensuous voice. Few paid attention to the words of her songs; they were for the most part hippie songs, many of them containing veiled praise of drugs. The Zurich performance did not last long; within a few weeks I learned from my friend that Joan had suddenly disappeared. He received a greeting card from her three months later, from Israel. She had been committed to a psychiatric hospital there. For the conclusion of my assortment of LSD visitors, I wish to report about a meeting in which LSD figured only indirectly. Miss H. S., head secretary in a hospital, wrote to ask me for a personal interview. She came to tea. She explained her visit thus: in a report about an LSD experience, she had read the description of a condition she herself had experienced as a young girl, which still disturbed her today; possibly I could help her to understand this experience. She had gone on a business trip as a commercial apprentice. They spent the night in a mountain hotel. H. S. awoke very early and left the house alone in order to watch the sunrise. 
As the mountains began to light up in a sea of rays, she was suffused by an unprecedented feeling of happiness, which persisted even after she joined the other participants of the trip at morning service in the chapel. During the Mass everything appeared to her in a supernatural luster, and the feeling of happiness intensified to such an extent that she had to cry loudly. She was brought back to the hotel and treated as someone with a mental disorder. This experience largely determined her later personal life. H. S. feared she was not completely normal. On the one hand, she feared this experience, which had been explained to her as a nervous breakdown; on the other hand, she longed for a repetition of the condition. Internally split, she had led an unstable life. In repeated vocational changes and in varying personal relationships, consciously or unconsciously she again sought this ecstatic outlook, which once made her so deeply happy. I was able to reassure my visitor. It was no psychopathological event, no nervous breakdown that she had experienced at the time. What many people seek to attain with the help of LSD, the visionary experience of a deeper reality, had come to her as spontaneous grace. I recommended a book by Aldous Huxley to her, The Perennial Philosophy (Harper, New York & London, 1945), a collection of reports of spontaneous blessed visions from all times and cultures. Huxley wrote that not only mystics and saints, but also many more ordinary people than one generally supposes, experience such blessed moments, but that most do not recognize their importance and, instead of regarding them as promising rays of hope, repress them, because they do not fit into everyday rationality. _________________________________________________________________ 11. LSD Experience and Reality Was kann ein Mensch im Leben mehr gewinnen Als dass sich Gott-Natur ihm offenbare? What more can a person gain in life Than that God-Nature reveals himself to him?
Goethe I am often asked what has made the deepest impression upon me in my LSD experiments, and whether I have arrived at new understandings through these experiences. Various Realities Of greatest significance to me has been the insight that I attained as a fundamental understanding from all of my LSD experiments: what one commonly takes as "the reality," including the reality of one's own individual person, by no means signifies something fixed, but rather something that is ambiguous-that there is not only one, but that there are many realities, each comprising also a different consciousness of the ego. One can also arrive at this insight through scientific reflections. The problem of reality is and has been from time immemorial a central concern of philosophy. It is, however, a fundamental distinction, whether one approaches the problem of reality rationally, with the logical methods of philosophy, or whether one confronts this problem emotionally, through an existential experience. The first planned LSD experiment was therefore so deeply moving and alarming, because everyday reality and the ego experiencing it, which I had until then considered to be the only reality, dissolved, and an unfamiliar ego experienced another, unfamiliar reality. The problem concerning the innermost self also appeared, which, itself unmoved, was able to record these external and internal transformations. Reality is inconceivable without an experiencing subject, without an ego. It is the product of the exterior world, of the sender, and of a receiver, an ego in whose deepest self the emanations of the exterior world, registered by the antennae of the sense organs, become conscious. If one of the two is lacking, no reality happens, no radio music plays, the picture screen remains blank.
If one continues with the conception of reality as a product of sender and receiver, then the entry of another reality under the influence of LSD may be explained by the fact that the brain, the seat of the receiver, becomes biochemically altered. The receiver is thereby tuned to a different wavelength from the one corresponding to normal, everyday reality. Since the endless variety and diversity of the universe correspond to infinitely many different wavelengths, depending on the adjustment of the receiver, many different realities, including the respective ego, can become conscious. These different realities, more correctly designated as different aspects of reality, are not mutually exclusive but are complementary, and together form a portion of the all-encompassing, timeless, transcendental reality, in which even the unimpeachable core of self-consciousness, which has the power to record the different egos, is located. The true importance of LSD and related hallucinogens lies in their capacity to shift the wavelength setting of the receiving "self," and thereby to evoke alterations in reality consciousness. This ability to allow different, new pictures of reality to arise, this truly cosmogonic power, makes the cultish worship of hallucinogenic plants as sacred drugs understandable. What constitutes the essential, characteristic difference between everyday reality and the world picture experienced in LSD inebriation? Ego and the outer world are separated in the normal condition of consciousness, in everyday reality; one stands face-to-face with the outer world; it has become an object. In the LSD state the boundaries between the experiencing self and the outer world more or less disappear, depending on the depth of the inebriation. Feedback between receiver and sender takes place. A portion of the self overflows into the outer world, into objects, which begin to live, to have another, a deeper meaning. 
This can be perceived as a blessed, or as a demonic transformation imbued with terror, proceeding to a loss of the trusted ego. In an auspicious case, the new ego feels blissfully united with the objects of the outer world and consequently also with its fellow beings. This experience of deep oneness with the exterior world can even intensify to a feeling of the self being one with the universe. This condition of cosmic consciousness, which under favorable conditions can be evoked by LSD or by another hallucinogen from the group of Mexican sacred drugs, is analogous to spontaneous religious enlightenment, with the unio mystica. In both conditions, which often last only for a timeless moment, a reality is experienced that exposes a gleam of the transcendental reality, in which universe and self, sender and receiver, are one. [The relationship of spontaneous to drug-induced enlightenment has been most extensively investigated by R. C. Zaehner, Mysticism Sacred and Profane (The Clarendon Press, Oxford, 1957).] Gottfried Benn, in his essay "Provoziertes Leben" [Provoked life] (in Ausdruckswelt, Limes Verlag, Wiesbaden, 1949), characterized the reality in which self and world are separated as "the schizoid catastrophe, the Western entelechy neurosis." He further writes: . . . In the southern part of our continent this concept of reality began to be formed. The Hellenistic-European agonistic principle of victory through effort, cunning, malice, talent, force, and later, European Darwinism and "superman," was instrumental in its formation. The ego emerged, dominated, fought; for this it needed instruments, material, power. It had a different relationship to matter, more removed sensually, but closer formally. It analyzed matter, tested, sorted: weapons, object of exchange, ransom money. It clarified matter through isolation, reduced it to formulas, took pieces out of it, divided it up. 
[Matter became] a concept which hung like a disaster over the West, with which the West fought, without grasping it, to which it sacrificed enormous quantities of blood and happiness; a concept whose inner tension and fragmentations it was impossible to dissolve through a natural viewing or methodical insight into the inherent unity and peace of prelogical forms of being . . . instead the cataclysmic character of this idea became clearer and clearer . . . a state, a social organization, a public morality, for which life is economically usable life and which does not recognize the world of provoked life, cannot stop its destructive force. A society, whose hygiene and race cultivation as a modern ritual is founded solely on hollow biological statistics, can only represent the external viewpoint of the mass; for this point of view it can wage war, incessantly, for reality is simply raw material, but its metaphysical background remains forever obscured. [This excerpt from Benn's essay was taken from Ralph Metzner's translation "Provoked Life: An Essay on the Anthropology of the Ego," which was published in Psychedelic Review I (1): 47-54, 1963. Minor corrections in Metzner's text have been made by A. H.] As Gottfried Benn formulates it in these sentences, a concept of reality that separates self and the world has decisively determined the evolutionary course of European intellectual history. Experience of the world as matter, as object, to which man stands opposed, has produced modern natural science and technology, creations of the Western mind that have changed the world. With their help human beings have subdued the world. Its wealth has been exploited in a manner that may be characterized as plundering, and the sublime accomplishment of technological civilization, the comfort of Western industrial society, stands face-to-face with a catastrophic destruction of the environment. 
Even to the heart of matter, to the nucleus of the atom and its splitting, this objective intellect has progressed and has unleashed energies that threaten all life on our planet. A misuse of knowledge and understanding, the products of searching intelligence, could not have emerged from a consciousness of reality in which human beings are not separated from the environment but rather exist as part of living nature and the universe. All attempts today to make amends for the damage through environmentally protective measures must remain only hopeless, superficial patchwork, if no curing of the "Western entelechy neurosis" ensues, as Benn has characterized the objective reality conception. Healing would mean existential experience of a deeper, self-encompassing reality. The experience of such a comprehensive reality is impeded in an environment rendered dead by human hands, such as is present in our great cities and industrial districts. Here the contrast between self and outer world becomes especially evident. Sensations of alienation, of loneliness, and of menace arise. It is these sensations that impress themselves on everyday consciousness in Western industrial society; they also take the upper hand everywhere that technological civilization extends itself, and they largely determine the production of modern art and literature. There is less danger of a cleft reality experience arising in a natural environment. In field and forest, and in the animal world sheltered therein, indeed in every garden, a reality is perceptible that is infinitely more real, older, deeper, and more wondrous than everything made by people, and that will yet endure when the inanimate, mechanical, and concrete world again vanishes, becomes rusted, and falls into ruin. 
In the sprouting, growth, blooming, fruiting, death, and regermination of plants, in their relationship with the sun, whose light they are able to convert into chemically bound energy in the form of organic compounds, out of which all that lives on our earth is built; in the being of plants the same mysterious, inexhaustible, eternal life energy is evident that has also brought us forth and takes us back again into its womb, and in which we are sheltered and united with all living things. We are not leading up to a sentimental enthusiasm for nature, to "back to nature" in Rousseau's sense. That romantic movement, which sought the idyll in nature, can also be explained by a feeling of humankind's separation from nature. What is needed today is a fundamental reexperience of the oneness of all living things, a comprehensive reality consciousness that ever more infrequently develops spontaneously, the more the primordial flora and fauna of our mother earth must yield to a dead technological environment. Mystery and Myth The notion of reality as the self juxtaposed to the world, in confrontation with the outer world, began to form itself, as reported in the citation from Benn, in the southern portion of the European continent in Greek antiquity. No doubt people at that time knew the suffering that was connected with such a cleft reality consciousness. The Greek genius tried the cure, by supplementing the multiformed and richly colored, sensual as well as deeply sorrowful Apollonian world view created by the subject/object cleavage, with the Dionysian world of experience, in which this cleavage is abolished in ecstatic inebriation. 
Nietzsche writes in The Birth of Tragedy: It is either through the influence of narcotic potions, of which all primitive peoples and races speak in hymns, or through the powerful approach of spring, penetrating with joy all of nature, that those Dionysian stirrings arise, which in their intensification lead the individual to forget himself completely.... Not only does the bond between man and man come to be forged once again by the magic of the Dionysian rite, but alienated, hostile, or subjugated nature again celebrates her reconciliation with her prodigal son, man. The Mysteries of Eleusis, which were celebrated annually in the fall, over an interval of approximately 2,000 years, from about 1500 B.C. until the fourth century A.D., were intimately connected with the ceremonies and festivals in honor of the god Dionysus. These Mysteries were established by the goddess of agriculture, Demeter, as thanks for the recovery of her daughter Persephone, whom Hades, the god of the underworld, had abducted. A further thank offering was the ear of grain, which was presented by the two goddesses to Triptolemus, the first high priest of Eleusis. They taught him the cultivation of grain, which Triptolemus then disseminated over the whole globe. Persephone, however, was not always allowed to remain with her mother, because she had taken nourishment from Hades, contrary to the order of the highest gods. As punishment she had to return to the underworld for a part of the year. During this time, it was winter on the earth, the plants died and were withdrawn into the ground, to awaken to new life early in the year with Persephone's journey to earth. The myth of Demeter, Persephone, Hades, and the other gods, which was enacted as a drama, formed, however, only the external framework of events. The climax of the yearly ceremonies, which began with a procession from Athens to Eleusis lasting several days, was the concluding ceremony with the initiation, which took place in the night. 
The initiates were forbidden by penalty of death to divulge what they had learned, beheld, in the innermost, holiest chamber of the temple, the telesterion (goal). Not one of the multitude that were initiated into the secret of Eleusis has ever done this. Pausanias, Plato, many Roman emperors like Hadrian and Marcus Aurelius, and many other known personages of antiquity were party to this initiation. It must have been an illumination, a visionary glimpse of a deeper reality, an insight into the true basis of the universe. That can be concluded from the statements of initiates about the value, about the importance of the vision. Thus it is reported in a Homeric Hymn: "Blissful is he among men on Earth, who has beheld that! He who has not been initiated into the holy Mysteries, who has had no part therein, remains a corpse in gloomy darkness." Pindar speaks of the Eleusinian benediction with the following words: "Blissful is he, who after having beheld this enters on the way beneath the Earth. He knows the end of life as well as its divinely granted beginning." Cicero, also a famous initiate, likewise put in first position the splendor that fell upon his life from Eleusis, when he said: "Not only have we received the reason there, that we may live in joy, but also, besides, that we may die with better hope." How could the mythological representation of such an obvious occurrence, which runs its course annually before our eyes (the seed grain that is dropped into the earth dies there, in order to allow a new plant, new life, to ascend into the light), prove to be such a deep, comforting experience as that attested by the cited reports? It is traditional knowledge that the initiates were furnished with a potion, the kykeon, for the final ceremony. It is also known that barley extract and mint were ingredients of the kykeon. 
Religious scholars and scholars of mythology, like Karl Kerenyi, from whose book on the Eleusinian Mysteries (Rhein-Verlag, Zurich, 1962) the preceding statements were taken, and with whom I was associated in relation to the research on this mysterious potion [In the English publication of Kerenyi's book Eleusis (Schocken Books, New York, 1977) a reference is made to this collaboration.], are of the opinion that the kykeon was mixed with an hallucinogenic drug. [In The Road to Eleusis by R. Gordon Wasson, Albert Hofmann, and Carl A. P. Ruck (Harcourt Brace Jovanovich, New York, 1978) the possibility is discussed that the kykeon could have acted through an LSD-like preparation of ergot.] That would make understandable the ecstatic-visionary experience of the Demeter-Persephone myth, as a symbol of the cycle of life and death in both a comprehensive and timeless reality. When the Gothic king Alarich, coming from the north, invaded Greece in 396 A.D. and destroyed the sanctuary of Eleusis, it was not only the end of a religious center, but it also signified the decisive downfall of the ancient world. With the monks that accompanied Alarich, Christianity penetrated into the country that must be regarded as the cradle of European culture. The cultural-historical meaning of the Eleusinian Mysteries, their influence on European intellectual history, can scarcely be overestimated. Here suffering humankind found a cure for its rational, objective, cleft intellect, in a mystical totality experience, that let it believe in immortality, in an everlasting existence. This belief had survived in early Christianity, although with other symbols. It is found as a promise, even in particular passages of the Gospels, most clearly in the Gospel according to John, as in Chapter 14:16-20. 
Jesus speaks to his disciples, as he takes leave of them: And I will pray the Father, and he shall give you another Comforter, that he may abide with you forever; Even the Spirit of truth; whom the world cannot receive, because it seeth him not, neither knoweth him: but ye know him; for he dwelleth with you, and shall be in you. I will not leave you comfortless: I will come to you. Yet a little while, and the world seeth me no more; but ye see me: because I live, ye shall live also. At that day ye shall know that I am in my Father, and ye in me, and I in you. This promise constitutes the heart of my Christian beliefs and my call to natural-scientific research: we will attain to knowledge of the universe through the spirit of truth, and thereby to understanding of our being one with the deepest, most comprehensive reality, God. Ecclesiastical Christianity, determined by the duality of creator and creation, has, however, with its nature-alienated religiosity largely obliterated the Eleusinian-Dionysian legacy of antiquity. In the Christian sphere of belief, only special blessed men have attested to a timeless, comforting reality, experienced in a spontaneous vision, an experience to which in antiquity the elite of innumerable generations had access through the initiation at Eleusis. The unio mystica of Catholic saints and the visions that the representatives of Christian mysticism (Jakob Boehme, Meister Eckhart, Angelus Silesius, Thomas Traherne, William Blake, and others) describe in their writings are obviously essentially related to the enlightenment that the initiates to the Eleusinian Mysteries experienced. The fundamental importance of a mystical experience, for the recovery of people in Western industrial societies who are sickened by a one-sided, rational, materialistic world view, is today given primary emphasis, not only by adherents to Eastern religious movements like Zen Buddhism, but also by leading representatives of academic psychiatry. 
Of the appropriate literature, we will here refer only to the books of Balthasar Staehelin, the Basel psychiatrist working in Zurich. [Haben und Sein (1969), Die Welt als Du (1970), Urvertrauen und zweite Wirklichkeit (1973), and Der finale Mensch (1976); all published by Theologischer Verlag, Zurich.] They make reference to numerous other authors who deal with the same problem. Today a type of "metamedicine," "metapsychology," and "metapsychiatry" is beginning to call upon the metaphysical element in people, which manifests itself as an experience of a deeper, duality-surmounting reality, and to make this element a basic healing principle in therapeutic practice. In addition, it is most significant that not only medicine but also wider circles of our society consider the overcoming of the dualistic, cleft world view to be a prerequisite and basis for the recovery and spiritual renewal of occidental civilization and culture. This renewal could lead to the renunciation of the materialistic philosophy of life and the development of a new reality consciousness. As a path to the perception of a deeper, comprehensive reality, in which the experiencing individual is also sheltered, meditation, in its different forms, occupies a prominent place today. The essential difference between meditation and prayer in the usual sense, which is based upon the duality of creator-creation, is that meditation aspires to the abolishment of the I-you-barrier by a fusing of object and subject, of sender and receiver, of objective reality and self. Objective reality, the world view produced by the spirit of scientific inquiry, is the myth of our time. It has replaced the ecclesiastical-Christian and mythical-Apollonian world view. But this ever broadening factual knowledge, which constitutes objective reality, need not be a desecration. 
On the contrary, if it only advances deep enough, it inevitably leads to the inexplicable, primal ground of the universe: the wonder, the mystery of the divine, in the microcosm of the atom, in the macrocosm of the spiral nebula; in the seeds of plants, in the body and soul of people. Meditation begins at the limits of objective reality, at the farthest point yet reached by rational knowledge and perception. Meditation thus does not mean rejection of objective reality; on the contrary, it consists of a penetration to deeper dimensions of reality. It is not escape into an imaginary dream world; rather it seeks after the comprehensive truth of objective reality, by simultaneous, stereoscopic contemplation of its surfaces and depths. It could become of fundamental importance, and not merely a transient fashion of the present, if more and more people today made a daily habit of devoting an hour, or at least a few minutes, to meditation. As a result of the meditative penetration and broadening of the natural-scientific world view, a new, deepened reality consciousness would have to evolve, which would increasingly become the property of all humankind. This could become the basis of a new religiosity, which would not be based on belief in the dogmas of various religions, but rather on perception through the "spirit of truth." What is meant here is a perception, a reading and understanding of the text at first hand, "out of the book that God's finger has written" (Paracelsus), out of the creation. The transformation of the objective world view into a deepened and thereby religious reality consciousness can be accomplished gradually, by continuing practice of meditation. It can also come about, however, as a sudden enlightenment; a visionary experience. It is then particularly profound, blessed, and meaningful. Such a mystical experience may nevertheless "not be induced even by decade-long meditation," as Balthasar Staehelin writes. 
Also, it does not happen to everyone, although the capacity for mystical experience belongs to the essence of human spirituality. Nevertheless, at Eleusis, the mystical vision, the healing, comforting experience, could be arranged in the prescribed place at the appointed time, for all of the multitudes who were initiated into the holy Mysteries. This could be accounted for by the fact that an hallucinogenic drug came into use; this, as already mentioned, is something that religious scholars believe. The characteristic property of hallucinogens, to suspend the boundaries between the experiencing self and the outer world in an ecstatic, emotional experience, makes it possible with their help, and after suitable internal and external preparation, as it was accomplished in a perfect way at Eleusis, to evoke a mystical experience according to plan, so to speak. Meditation is a preparation for the same goal that was aspired to and was attained in the Eleusinian Mysteries. Accordingly it seems feasible that in the future, with the help of LSD, the mystical vision, crowning meditation, could be made accessible to an increasing number of practitioners of meditation. I see the true importance of LSD in the possibility of providing material aid to meditation aimed at the mystical experience of a deeper, comprehensive reality. Such a use accords entirely with the essence and working character of LSD as a sacred drug. From he at psychology.su.se Mon Jan 16 17:15:12 2006 From: he at psychology.su.se (Hannes Eisler) Date: Mon, 16 Jan 2006 18:15:12 +0100 Subject: [Paleopsych] Bush's real motive Message-ID: There has been quite a discussion about Bush's true incentives to start the war in Iraq. I am surprised that what I consider his real main motive has never been mentioned, as far as I know. It is that he wanted to surpass his father who did not dare to invade Iraq. 
As a European I have sometimes wondered about US motives being family reasons, intending to please parents or wives or boost the pride of children, rather than the objective ones. In American literature I remember my astonishment about two examples: the policeman or superintendent trying to stop the fight between the two groups of hooligans in "West Side Story," stating some private motive, and an officer at the landing in Normandy in WW2 in Irwin Shaw's "The Young Lions" who, armed with a private gun with an ivory handle, bragged about the impression he would make on his wife (or was it his parents?). ------------------------------------- Prof. Hannes Eisler Department of Psychology Stockholm University S-106 91 Stockholm Sweden e-mail: he at psychology.su.se fax: +46-8-15 93 42 phone: +46-8-163967 (university) +46-8-6409982 (home) internet: http://www.psychology.su.se/staff/he -------------- next part -------------- An HTML attachment was scrubbed... URL: From shovland at mindspring.com Mon Jan 16 23:37:19 2006 From: shovland at mindspring.com (Steve Hovland) Date: Mon, 16 Jan 2006 15:37:19 -0800 Subject: [Paleopsych] Bush's real motive In-Reply-To: Message-ID: If true, he has failed miserably. Is that often true? Boys wanting to show their father how smart they are and making a mess instead? -----Original Message----- From: paleopsych-bounces at paleopsych.org [mailto:paleopsych-bounces at paleopsych.org]On Behalf Of Hannes Eisler Sent: Monday, January 16, 2006 9:15 AM To: The new improved paleopsych list Cc: Anna Eisler Subject: [Paleopsych] Bush's real motive -------------- next part -------------- An HTML attachment was scrubbed... URL: From checker at panix.com Tue Jan 17 16:10:26 2006 From: checker at panix.com (Premise Checker) Date: Tue, 17 Jan 2006 11:10:26 -0500 (EST) Subject: [Paleopsych] Ben Franklin 300 Package Message-ID: Ben Franklin 300 Package [Today is the tricentennial of the birth of the most polymathic of the Founding Fathers. First, an appreciation of his work on electricity, then news items for today and reflections, then reviews of books about him over the past few years. Mr. Jefferson is, of course, the visionary I feel closest to, and Washington the man most instrumental to our independence, but Franklin is surely the most complex.] Benjamin Franklin and Lightning Rods Physics Today January 2006 http://www.physicstoday.org/servlet/PrintPT American Institute of Physics Franklin's work on electricity and lightning earned him worldwide fame and respect--ideal assets for brokering aid from France during the American Revolution. E. 
Philip Krider [Figure 1: sketch of the "sentry-box" experiment] On 10 May 1752, as a thunderstorm passed over the village of Marly-la-Ville, a retired French dragoon, acting on instructions from naturalist Thomas-François Dalibard, drew sparks from a tall iron rod that had been carefully insulated from ground (see figure 1). The sparks showed that thunderclouds are electrified and that lightning is an electrical discharge. In the mid-18th century, such an observation was sensational and was soon verified by Delor, Dalibard's collaborator in Paris. Within weeks of hearing the news, many others throughout Europe had successfully repeated the experiment.^1,2 When Dalibard and Delor reported their results to the Académie des Sciences in Paris three days later, they acknowledged that they had merely followed a path that Benjamin Franklin had traced for them. In June 1752, shortly after the experiment at Marly-la-Ville but before he knew about it, Franklin drew sparks himself from a key attached to the conducting string of his famous electrical kite that was insulated from ground by a silk ribbon. The French results were important because they called attention to Franklin's small pamphlet, Experiments and Observations on Electricity, Made at Philadelphia in America,^3 that helped to stimulate other work in electricity and contributed to the beginning of modern physics.^4 The observations also validated the key assumptions that lay behind Franklin's supposition that tall, grounded rods can protect buildings from lightning damage. A Philadelphia story Franklin performed his initial experiments on electricity in collaboration with friends and neighbors, including Thomas Hopkinson, a lawyer and judge; Ebenezer Kinnersley, a clergyman and teacher; and Philip Syng Jr, a master silversmith. Franklin described the experiments and their results in five formal letters to Peter Collinson, a fellow of the Royal Society of London, during the years from 1747 to 1750. 
Collinson in turn communicated those letters to the Society and published them in April 1751. In his first letter,^5 Franklin described "the wonderful Effect of Points, both in drawing off and throwing off the Electrical Fire." He showed that points work quickly at "a considerable Distance," that sharp points work better than blunt ones, that metal points work better than dry wood, and that the pointed object should be touched--that is, grounded--to obtain the maximum draw effect. Next, Franklin introduced the idea that rubbing glass with wool or silk does not actually create electricity; rather, at the moment of friction, the glass simply takes "the Electrical Fire" out of the rubbing material. Whatever amount is added to the glass, an equal amount is lost by the wool or silk. The terms plus and minus were used to describe those electrical states; the glass was assumed to be electrified positively and the rubbing material negatively. The idea that electricity is a single fluid that is never created or destroyed, but simply transferred from one place to another, was profound, and it greatly simplified the interpretation of many observations. In his second letter,^5 Franklin described the behavior of a Leiden jar capacitor by combining the concept of equal positive and negative states with an assumption that glass is a perfect insulator. "So wonderfully are these two States of Electricity, the plus and minus combined and ballanced in this miraculous Bottle!" He also made an analogy between electricity and lightning when he described a discharge through the gold trim on the cover of a book that produced "a vivid Flame, like the sharpest Lightning." In his third letter,^5 Franklin began to use terms such as "charging" and "discharging" when describing how a Leiden jar works, and he noted the importance of grounding when charging and discharging the jar. 
He also showed that the electricity in such a device resides entirely in the glass and not on the conductors that are inside and outside the jar. Franklin described how several capacitors could be charged in series "with the same total Labour" as charging one, and he constructed an "Electrical Battery"--a capacitor bank in today's parlance--using panes of window glass sandwiched between thin lead plates, and then discharged them together so that they provided the "Force of all the Plates of Glass at once thro' the Body of any Animal forming the Circle with them." Later, Franklin used discharges from large batteries to simulate the effects of lightning in a variety of materials. In the fourth letter,^5 he applied his knowledge of electricity to lightning by introducing the concept of the sparking or striking distance: If two electrified gun barrels "will strike at two Inches Distance, and make a loud Snap; to what great a Distance may 10 000 Acres of Electrified Cloud strike and give its Fire, and how loud must be that Crack!" Based on his previous experiments with sharp points, Franklin then postulated that when an electrified cloud passes over a region, it might draw electricity from, or discharge electricity to, high hills and trees, lofty towers, spires, masts of ships, and chimneys. That supposition then led to some practical advice against taking shelter under a single, isolated tree during a thunderstorm; crouching in an open field is less dangerous. Franklin also noted that out in the open during a thunderstorm, clothing tends to become wet, thereby providing a conducting path outside the body. His laboratory analogy was that "a wet Rat can not be kill'd by the exploding electrical Bottle, when a dry Rat may." In the fifth letter,^5 Franklin described how discharges between smooth or blunt conductors occur with a "Stroke and Crack," whereas sharp points discharge silently and produce large effects at greater distances. 
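Franklin's "Electrical Battery" of glass panes amounts, in modern terms, to a bank of capacitors charged together and discharged at once. The sketch below restates that bookkeeping with today's series/parallel formulas; the capacitance and voltage values, and the function names, are purely illustrative assumptions, not measurements of Franklin's apparatus:

```python
# Modern restatement of a capacitor bank, with hypothetical values.

def series_capacitance(caps):
    """Equivalent capacitance of capacitors in series: 1/C = sum of 1/Ci."""
    return 1.0 / sum(1.0 / c for c in caps)

def parallel_capacitance(caps):
    """Equivalent capacitance of capacitors in parallel: C = sum of Ci."""
    return sum(caps)

def stored_energy(c, v):
    """Energy stored in capacitance c charged to voltage v: E = C * V**2 / 2."""
    return 0.5 * c * v * v

jars = [1e-9] * 4   # four identical "jars" of 1 nF each (illustrative)
v = 20e3            # charged to 20 kV (illustrative)

# Charged in series, the bank presents a quarter of one jar's capacitance;
# discharged together (in parallel), it delivers four times one jar's energy
# at the same voltage: the "Force of all the Plates of Glass at once."
print("series:  ", series_capacitance(jars), "F")
print("parallel:", parallel_capacitance(jars), "F")
print("energy of the bank vs. one jar:",
      stored_energy(parallel_capacitance(jars), v) / stored_energy(jars[0], v))
```

This is only a back-of-envelope analogy; Franklin had neither the farad nor the volt, but the bookkeeping of combining identical jars is the same.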
He then introduced what he viewed to be a "Law of Electricity, That Points as they are more or less acute, both draw on and throw off the electrical fluid with more or less Power, and at greater or less Distances, and in larger or smaller Quantities in the same Time." Given his interest in lightning and the effects of metallic points, it was a short step to the lightning rod: I say, if these Things are so, may not the Knowledge of this Power of Points be of Use to Mankind; in preserving Houses, Churches, Ships, etc. from the Stroke of Lightning; by Directing us to fix on the highest Parts of those Edifices upright Rods of Iron, made sharp as a Needle and gilt to prevent Rusting, and from the Foot of those Rods a Wire down the outside of the Building into the Ground; or down round one of the Shrouds of a Ship and down her Side, till it reaches the Water? Would not these pointed Rods probably draw the Electrical Fire silently out of a Cloud before it came nigh enough to strike, and thereby secure us from that most sudden and terrible Mischief! Clearly, Franklin supposed that silent discharges from one or more sharp points might reduce or eliminate the electricity in the clouds above and thereby reduce or eliminate the chances of the structure being struck by lightning. From his earlier observations, he knew that point discharges work best when the conductor is grounded and that lightning tends to strike tall objects. Therefore, even if the point discharges did not neutralize the cloud, a tall conductor would provide a preferred place for the lightning to strike, and the grounded conductor would provide a safe path for the lightning current to flow into the ground. Franklin also stated in his fifth letter,^5 To determine the Question, whether the Clouds that contain Lightning are electrified or not, I would propose an Experiment to be try'd where it may be done conveniently. 
On the Top of some high Tower or Steeple, place a Kind of Sentry Box [see Figure 1] big enough to contain a Man and an electrical Stand. From the Middle of the Stand let an Iron Rod rise, and pass bending out of the Door, and then upright 20 or 30 feet, pointed very sharp at the End. If the Electrical Stand be kept clean and dry, a Man standing on it when such Clouds are passing low, might be electrified, and afford Sparks, the Rod drawing Fire to him from the Cloud. Franklin was not the first person to compare sparks with lightning or to hypothesize that lightning might be an electrical discharge. In fact, almost every experimenter who had previously described electric sparks had, at one time or another, mentioned an analogy to lightning. Franklin's seminal contributions were his suggestions that tall, insulated rods could be used to determine if thunderclouds are, in fact, electrified and that tall, grounded rods would protect against lightning damage.

The French connection

Shortly after Collinson published the first edition of Experiments and Observations, he sent a copy to the famous French naturalist, the Comte de Buffon, who asked Dalibard to translate it from English into French. While he did that, Dalibard asked Delor to help him repeat many of the Philadelphia experiments. In March 1752, Buffon arranged for the pair to show the experiments to King Louis XV. The king's delight inspired Dalibard to try the sentry-box experiment at Marly-la-Ville. At the time of the sentry-box experiment, Abbé Jean-Antoine Nollet was the leading "electrician" in France and was known throughout Europe for his skill in making apparatus and in performing demonstrations. Unfortunately, because of personal rivalries, Buffon and Dalibard completely ignored Nollet's work in a short history that preceded their translation of Franklin's book.
After Dalibard read an account of the sentry-box experiment to the Académie des Sciences on 13 May 1752, Nollet suppressed publication of the results.^6 News reached the Paris newspapers, however, and from there spread very rapidly. After Louis XV saw the experiment, he sent a personal message of congratulations to Franklin, Collinson, and the Royal Society of London for communicating "the useful Discoveries in Electricity, and Application of Pointed Rods to prevent the terrible Effects of Thunderstorms."^7 Nollet was both surprised and chagrined by the experiment at Marly-la-Ville. He acknowledged that insulated rods or "electroscopes" did verify that thunderclouds are electrified, but for the rest of his life he steadfastly opposed the use of grounded rods as "preservatives." In 1753, he published a series of letters attacking Franklin's Experiments and Observations and suggested other methods of lightning protection. On 6 August 1753, the Swedish scientist Georg Wilhelm Richmann was electrocuted in St. Petersburg while trying to quantify the response of an insulated rod to a nearby storm. The incident, reported worldwide, underscored the dangers inherent in experimenting with insulated rods and in using protective rods with faulty ground connections. Nollet used Richmann's death to heighten the public's fears and to generate opposition to both types of rods.^8 In London, members of the Royal Society were amused when Franklin's letter about lightning conductors was read to the Society, and they did not publish it in their Philosophical Transactions. In 1753, however, they awarded Franklin their highest scientific honor, the Copley Gold Medal.
In his 1767 history of electricity, Joseph Priestley described the kite experiment as drawing "lightning from the heavens," and said it was "the greatest, perhaps, in the whole compass of philosophy since the time of Sir Isaac Newton."^9

Experiments in colonial America

[Figure 2: modeled after a 1762 painting]

After Franklin learned about the success of the sentry-box experiment in France, he installed a tall, insulated rod on the roof of his house to study the characteristics of thunderstorm electricity. The conductor ran down a stairwell to ground but had a gap in the middle, as illustrated on the left side of figure 2. A small ball suspended between chimes mounted on each end of the gap would ring the chimes whenever an electrified cloud passed overhead. Franklin used this apparatus to compare the properties of atmospheric electricity with the electricity generated by friction and to measure the polarity of thunderclouds. He found that both types of electricity were the same and "that the Clouds of a Thunder Gust are most commonly in a negative State of Electricity, but sometimes in a positive State,"^10 a result that was regarded as definitive for the next 170 years. At that time, Franklin thought that all discharges went from positive to negative, so he concluded "that for the most part in Thunder Strokes, 'tis the Earth that strikes into the Clouds, and not the Clouds that strike into the Earth." Judging by his later correspondence, Franklin was fascinated by this discovery, and he postulated that the effects of lightning would be very nearly the same regardless of the direction of the current flow.

First protection system

In the 1753 issue of Poor Richard's Almanack, Franklin published a method for protecting houses from lightning damage: It has pleased God in his Goodness to Mankind, at length to discover to them the Means of securing their Habitations and other Buildings from Mischief by Thunder and Lightning.
The Method is this: Provide a small Iron Rod (it may be made of the Rod-iron used by the Nailers) but of such a Length, that one End being three or four Feet in the moist Ground, the other may be six or eight Feet above the highest Part of the Building. To the upper End of the Rod fasten about a Foot of Brass Wire, the Size of a common Knitting-needle, sharpened to a fine Point; the Rod may be secured to the House by a few small Staples. If the House or Barn be long, there may be a Rod and Point at each End, and a middling Wire along the Ridge from one to the other. A House thus furnished will not be damaged by Lightning, it being attracted by the Points, and passing thro the Metal into the Ground without hurting any Thing. Vessels also, having a sharp pointed Rod fix'd on the Top of their Masts, with a Wire from the Foot of the Rod reaching down, round one of the Shrouds, to the Water, will not be hurt by Lightning.

[Figure 3: Independence Hall]

The opening phrase of this description anticipated a religious objection to protective rods that would soon appear in America and Europe. In the late summer or fall of 1752, grounded conductors were installed on the Academy of Philadelphia (later the University of Pennsylvania) and the Pennsylvania State House (later Independence Hall). Figures 3 and 4 show fragments of the original grounding conductors that were installed inside the tower of Independence Hall and on the Gloria Dei (Old Swedes') Church in Philadelphia, respectively.

[Figure 4a: David B. Rivers]

Three key elements made up Franklin's protection system. Metallic rods, or air terminals as they're now called, were mounted on the roof of a structure and connected by horizontal roof conductors and vertical down conductors to a ground connection. Because Franklin initially thought point discharges might provide protection, the first air terminals were thin, sharp needles mounted on top of an iron rod.
The first down conductors were chains of iron rods, each several feet long, that were mechanically linked or hooked together as shown in figures 3 and 4. Because the current in point discharges is usually less than a few hundred microamperes, the roof and down conductors could be mechanically hooked together and attached to the inside walls of towers and steeples without creating a hazard. Because Franklin wanted to verify that lightning would actually follow the path of a metallic conductor and determine what size that conductor should be, in June 1753 he published a "Request for Information on Lightning" in the Pennsylvania Gazette and other newspapers: Those of our Readers in this and the neighboring Provinces, who may have an Opportunity of observing, during the present Summer, any of the Effects of Lightning on Houses, Ships, Trees, Etc. are requested to take particular Notice of its Course, and Deviation from a strait Line, in the Walls or other Matter affected by it, its different Operations or Effects on Wood, Stone, Bricks, Glass, Metals, Animal Bodies, Etc. and every other Circumstance that may tend to discover the Nature, and compleat the History of that terrible Meteor. Such Observations being put in Writing, and communicated to Benjamin Franklin, in Philadelphia, will be very thankfully accepted and gratefully acknowledged. In the summer of 1753, Dr. John Lining, a physician with many scientific interests, verified Franklin's kite experiment in Charleston, South Carolina, but when he tried to install a rod on his house, the local populace objected. They thought that the rod was presumptuous--that it would interfere with the will of God--or that it might attract lightning and be dangerous.^11 In April of that year, Franklin commented on that issue, [Nollet] speaks as if he thought it Presumption in Man to propose guarding himself against Thunders of Heaven! 
Surely the Thunder of Heaven is no more supernatural than the Rain, Hail, or Sunshine of Heaven, against the Inconvenience of which we guard by Roofs and Shades without Scruple. But I can now ease the Gentleman of this Apprehension; for by some late Experiments I find, that it is not Lightning from the Clouds that strikes the Earth, but Lightning from the Earth that Strikes the Clouds.^12

Improvements

In the following years, Franklin continued to gather information about lightning, and in 1757 he traveled to London as an agent of the Pennsylvania Assembly. In March 1761, Kinnersley sent Franklin a detailed description of a lightning flash that struck a Philadelphia house equipped with a protective rod. An observer had reported at the time that "the Lightning diffused over the Pavement, which was then very wet with Rain, the Distance of two or three Yards from the Foot of the Conductor." Further investigation showed that the lightning had melted a few inches of the brass air terminal and Kinnersley concluded, "Surely it will now be thought as expedient to provide Conductors for the Lightning as for the Rain."^13

[Figure 4b: mechanical link]

Before Kinnersley's letter, Franklin had received reports of two similar strikes to protected houses in South Carolina. In one case, the points and a length of the brass down conductor had melted. In the other, three brass points, each about seven inches long and mounted on top of an iron rod, had evaporated. Moreover, several sections of the iron down conductor, each about a half-inch in diameter and hooked together, had become unhooked by the discharge (see figure 4b). Nearly all the staples that held the conductor to the outside of the house had also been loosened. "Considerable cavities" had been made in the earth near the rod, sunk about three feet underground, and the lightning had produced several furrows in the ground "some yards in length."
Franklin was pleased by these reports, and replied to Kinnersley that "a conductor formed of nail rods, not much above a quarter of an inch thick, served well to convey the lightning" but "when too small, may be destroyed in executing its office." Franklin sent the reports from South Carolina to Kinnersley with a recommendation to use larger, more substantial conductors and a deeper, more extensive grounding system to protect the foundation of the house against the effects of surface arcs and explosions in the soil. Because all reports from North America showed that grounded rods did indeed protect houses from lightning damage, in January 1762 Franklin sent an improved design for "the shortest and simplest Method of securing Buildings, Etc. from the Mischiefs of Lightning," together with excerpts from Kinnersley's letter and the reports from South Carolina, to Scottish philosopher David Hume. That letter was subsequently read to Edinburgh's philosophical society, which published it in 1771.

[Figure 5: 18th-century house with lightning rod]

In the letter to Hume, Franklin recommended large, steel air terminals, 5 to 6 feet long and tapered to a sharp point. He said that any building with a dimension greater than about 100 feet should have a pointed rod mounted on each end with a conductor between them. All roof and down conductors should be at least a half-inch in diameter, continuous, and routed outside the building--the earlier design allowed routing the conductors inside a building's walls. Any links or joints in these conductors should be filled with lead solder to ensure a good connection. The grounding conductor should be a one-inch-diameter iron bar driven 10 to 12 feet into the earth, and if possible, kept at least 10 feet away from the foundation. Franklin also recommended that the ground rods be painted to minimize rust and connected to a well, if one happened to be nearby. Figure 5 illustrates an implementation of Franklin's 1762 design.
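The dimensional rules in the letter to Hume read like a checklist, and they can be encoded as one. The toy function below is my own illustration of those rules (the function name, argument names, and return convention are invented), not anything Franklin wrote:

```python
# Toy checker encoding the 1762 recommendations described in the text:
# a rod at each end of buildings longer than ~100 ft, conductors at least
# half an inch thick, and a ground rod 10-12 ft deep kept ~10 ft from the
# foundation. Purely illustrative; thresholds are from the article's prose.

def check_1762_design(building_length_ft, n_air_terminals,
                      conductor_diameter_in, ground_rod_depth_ft,
                      ground_rod_offset_ft):
    """Return a list of rule violations (empty list means the design passes)."""
    problems = []
    if building_length_ft > 100 and n_air_terminals < 2:
        problems.append("long building needs a pointed rod at each end")
    if conductor_diameter_in < 0.5:
        problems.append("roof/down conductor thinner than half an inch")
    if ground_rod_depth_ft < 10:
        problems.append("ground rod driven less than 10 ft into the earth")
    if ground_rod_offset_ft < 10:
        problems.append("ground rod closer than 10 ft to the foundation")
    return problems

print(check_1762_design(120, 2, 0.5, 12, 10))   # a compliant installation
print(check_1762_design(120, 1, 0.25, 6, 3))    # violates all four rules
```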
In the 1769 edition of Experiments and Observations, Franklin published his reply to Kinnersley and the reports from South Carolina together with some "Remarks" on the construction and use of protective rods. After repeating his recommendations for an improved design, he also noted a psychological benefit of having protection against lightning: Those who calculate chances may perhaps find that not one death (or the destruction of one house) in a hundred thousand happens from that cause, and that therefore it is scarce worth while to be at any expense to guard against it. But in all countries there are particular situations of buildings more exposed than others to such accidents, and there are minds so strongly impressed with the apprehension of them, as to be very unhappy every time a little thunder is within their hearing; it may therefore be well to render this little piece of new knowledge as general and well understood as possible, since to make us safe is not all its advantage, it is some to make us easy. And as the stroke it secures us from might have chanced perhaps but once in our lives, while it may relieve us a hundred times from those painful apprehensions, the latter may possibly on the whole contribute more to the happiness of mankind than the former.^14 Today, most authorities agree that lightning rods define and control the points where lightning will strike the structure and then guide the current safely into ground. As Franklin noted in 1761, "Indeed, in the construction of an instrument so new, and of which we could have so little experience, it is rather lucky that we should at first be so near the truth as we seem to be, and commit so few errors." 
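The gap between the two operating regimes is worth putting in rough numbers. A back-of-envelope sketch with assumed, order-of-magnitude values (a few hundred microamperes for point discharge, as the article notes earlier; tens of kiloamperes for a direct strike; a nominal 0.1-ohm contact resistance, my own assumption) shows why a hooked joint that is harmless under point discharge can arc explosively under a strike:

```python
# Order-of-magnitude sketch (assumed values, not measurements from the
# article): Joule heating in a resistive joint scales as I^2 * R, so a
# ~10^8 ratio in current becomes a ~10^16 ratio in instantaneous power.

POINT_DISCHARGE_A = 300e-6   # point discharge: a few hundred microamperes
STRIKE_PEAK_A = 30e3         # direct strike: tens of kiloamperes
JOINT_R_OHM = 0.1            # assumed contact resistance of a hooked joint

def joint_power_w(current_a, resistance_ohm=JOINT_R_OHM):
    """Instantaneous Joule heating in a resistive joint: P = I^2 * R."""
    return current_a ** 2 * resistance_ohm

ratio = STRIKE_PEAK_A / POINT_DISCHARGE_A
print(f"strike current is ~{ratio:.0e} times the point-discharge current")
print(f"joint heating: {joint_power_w(POINT_DISCHARGE_A):.1e} W "
      f"vs {joint_power_w(STRIKE_PEAK_A):.1e} W")
```

Nanowatts of heating in one case, tens of megawatts in the other, which is the quantitative sense in which Franklin's original 1752 design was "lucky."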
Franklin was truly lucky: His original 1752 design was based on the low current levels of point discharges, but direct lightning strikes deliver tens of kiloamperes of current, enough to produce explosive arcs across any imperfect mechanical connections; and those arcs can produce momentary overpressures of several hundred atmospheres and enough heat to ignite flammable materials. The early applications of lightning rods could have been disastrous. Franklin's 1762 design, however, has stood the test of time and remains the basis for all modern lightning protection codes in the world today.

'Snatching lightning from the sky'

It is difficult for us living in an electrical age to appreciate how important lightning conductors were in the 18th century. The discovery that thunderclouds contain electricity and that lightning is an electrical discharge revolutionized human perceptions of the natural world, and the invention of protective rods was a clear example of how basic, curiosity-driven research can lead to significant practical benefits. In his later years, Franklin devoted most of his time to public service, but he did continue to follow the work of others and conduct occasional experiments. He also participated on scientific advisory boards and panels that reviewed methods of lightning protection, and made recommendations for protecting cathedrals and facilities for manufacturing and storing gunpowder. Eventually, Franklin became a leader of the American Revolution. When he embarked for France in November 1776 to seek aid for the newly declared United States of America in the war against Great Britain, he took with him a unique asset--his worldwide fame. By then his work on lightning and electricity had called attention to his other writings in science, politics, and moral philosophy,^15 and the intellectuals of France and Europe viewed Franklin as one of their own.
In 1811, John Adams, the first vice president and second president of the US, who served with Franklin in France in the 1770s (and who actually hated him), summarized Franklin's reputation: Nothing, perhaps, that ever occurred upon this earth was so well calculated to give any man an extensive and universal celebrity as the discovery of the efficacy of iron points and the invention of lightning rods. The idea was one of the most sublime that ever entered a human imagination, that a mortal should disarm the clouds of heaven, and almost "snatch from his hand the sceptre and the rod!" The ancients would have enrolled him with Bacchus and Ceres, Hercules and Minerva. His Paratonnerres erected their heads in all parts of the world, on temples and palaces no less than on cottages of peasants and the habitations of ordinary citizens. These visible objects reminded all men of the name and character of their inventor; and, in the course of time, have not only tranquilized the minds and dissipated the fears of the tender sex and their timorous children, but have almost annihilated that panic terror and superstitious horror which was once almost universal in violent storms of thunder and lightning. . . . His reputation was more universal than that of Leibnitz or Newton, Frederick or Voltaire, and his character more beloved and esteemed than any or all of them. Newton had astonished perhaps forty or fifty men in Europe; for not more than that number, probably, at any one time had read him and understood him by his discoveries and demonstrations. And these being held in admiration in their respective countries as at the head of the philosophers, had spread among scientific people a mysterious wonder at the genius of this perhaps the greatest man that ever lived. But this fame was confined to men of letters. The common people knew little and cared nothing about such a recluse philosopher. Leibnitz's name was more confined still. . . . But Franklin's fame was universal. 
His name was familiar to government and people, to kings, courtiers, nobility, clergy, and philosophers, as well as plebeians, to such a degree that there was scarcely a peasant or a citizen, a valet de chambre, coachman or footman, a lady's chambermaid or a scullion in a kitchen, who was not familiar with it, and who did not consider him as a friend to human kind. When they spoke of him, they seemed to think he was to restore the golden age.^16 In June 1776, the celebrated economist and former comptroller-general of France, Anne-Robert Jacques Turgot, composed a prophetic epigram in Latin that captures Franklin's legacy in a single sentence: "Eripuit caelo fulmen, sceptrumque tyrannis" ("He snatched lightning from the sky and the scepter from tyrants").^17

I am grateful to Penelope Hartshorne Batcheler for calling my attention to the photograph in figure 3.

Philip Krider is a professor in the Institute of Atmospheric Physics at the University of Arizona in Tucson.

References
1. Portions of this paper are based on the author's presentation at the Inaugural Symposium of the International Commission on History of Meteorology, International Congress of History of Science, Mexico City, 11-12 July 2001, and on E. P. Krider, in Benjamin Franklin: In Search of a Better World, P. Talbott, ed., Yale U. Press, New Haven, CT (2005), chap. 5.
2. I. B. Cohen, Benjamin Franklin's Science, Harvard U. Press, Cambridge, MA (1990), chap. 6.
3. I. B. Cohen, Benjamin Franklin's Experiments: A New Edition of Franklin's Experiments and Observations on Electricity, Harvard U. Press, Cambridge, MA (1941).
4. J. L. Heilbron, Electricity in the 17th and 18th Centuries: A Study of Early Modern Physics, U. of Calif. Press, Berkeley (1979).
5. Franklin's letters and associated quotations can be found in The Papers of Benjamin Franklin, L. W. Labaree et al., eds., Yale U. Press, New Haven, CT, vol. 1 (1959) to vol. 37 (2003).
In the following citations, these volumes will be referred to as Franklin Papers. The first letter is in vol. 3, p. 126, and the remaining four letters are in vol. 3, pp. 156, 352, and 365 and vol. 4, p. 9.
6. Ref. 4, chap. 15.
7. Franklin Papers, vol. 4, p. 465.
8. Ref. 2, chap. 8.
9. J. Priestley, History and Present State of Electricity, with Original Experiments, printed for J. Dodsley, J. Johnson, B. Davenport, T. Cadell, London (1767), p. 179.
10. Franklin Papers, vol. 5, p. 71.
11. J. A. L. Lemay, Ebenezer Kinnersley: Franklin's Friend, U. of Penn. Press, Philadelphia (1964), p. 78.
12. Franklin Papers, vol. 4, p. 463.
13. Franklin Papers, vol. 4, p. 293.
14. Franklin Papers, vol. 10, p. 52.
15. J. A. L. Lemay, ed., Benjamin Franklin Writings, The Library of America, New York (1987).
16. C. F. Adams, ed., The Works of John Adams, vol. 1, Little Brown and Co, Boston (1856), p. 660.
17. For further details on the influence of Franklin's science on his fame and diplomacy, see P. Dray, Stealing God's Thunder: Benjamin Franklin's Lightning Rod and the Invention of America, Random House, New York (2005) and S. Schiff, A Great Improvisation: Franklin, France, and the Birth of America, Henry Holt, New York (2005).

Figure 1. This sketch of the "sentry-box" experiment conducted at Marly-la-Ville, France, in 1752 was based on Benjamin Franklin's proposal to determine whether thunderclouds are electrified. Silk ropes (g) and wine bottles (e) insulated a 13-meter iron rod (a) from ground, and covers (h) sheltered the ropes from rain. A person standing on the ground could draw sparks from the rod or charge a Leiden jar when a storm was in the area. (From B. Franklin, Expériences et Observations sur l'Électricité . . . , 2nd ed., vol. 2, an extended translation from English by T. F. Dalibard, Chez Durand, Paris, 1756.)

Figure 2. Modeled after a 1762 painting by Mason Chamberlain, this etching depicts Benjamin Franklin looking at electrostatic bells he used to study cloud electricity. Two chimes, separated from each other by a small gap, are connected to rods that go up through the roof and to ground. A thundercloud charges the right-hand bell, either by induction or point discharge; the bell then alternately attracts or repels a small ball suspended between the chimes on a silk thread. The ball rattles between the bells, ringing an alarm when a storm approaches. The electroscope hanging from the right-hand bell was used to measure the cloud's polarity. A grounded rod of Franklin's 1762 design can be seen through the window. (Frontispiece from Oeuvres de M. Franklin, translated by J. B. Dubourg, Chez Quillau, Paris, 1773.)

Figure 3. Independence Hall, Philadelphia. During a restoration in 1960, fragments of the original grounding conductor were found under paneling and plaster on the inside wall of the northwest corner of the tower stairwell. (From the Independence National Historical Park Collection.)

Figure 4. David B. Rivers, pastor of the Gloria Dei (Old Swedes') Church in Philadelphia, holds a section of the original iron conductor that protected the church. The upper links in the chain were stapled to the inside of a wooden steeple. The inset shows how a mechanical link may have been ruptured, its hook forced open by an explosive arc during a lightning strike. (Courtesy of E. Philip Krider.)

Figure 5. An 18th-century house with a lightning rod of Franklin's 1762 design. The thick, continuous rod can carry tens of kiloamperes of current to ground without harming the house or its foundation.
------------------
http://www.washingtonpost.com/wp-dyn/content/article/2006/01/16/AR2006011600986_pf.html
Scientist, Diplomat And Wit: Franklin's Birth Merits a Toast
By Hillel Italie
Associated Press
Tuesday, January 17, 2006; A15

At the Smithsonian, a tribute to his statesmanship is planned. In London, an exhibit hails his medical contributions. But at McGillin's Olde Ale House in Philadelphia, they know best how to honor Benjamin Franklin on his 300th birthday: with a celebratory toast. "He was a very jovial fellow who would meet at the taverns, discussing the latest John Locke book or scientific breakthrough over a nice pint of beer," McGillin's owner Chris Mullins said. Franklin was a businessman, inventor, revolutionary, athlete (he is a member of the United States Swim School Association Hall of Fame), diplomat, publisher, humorist, sage and regular guy. "He certainly is a multiplicity of persona, so one never knows which one is the real Franklin," says Gordon Wood, a Pulitzer Prize-winning historian whose books include "The Americanization of Benjamin Franklin." Franklin's approachability begins with his background. Unlike George Washington or Thomas Jefferson, he did not grow up a landed "gentleman." His rise, as Franklin himself later boasted, was "from the Poverty and Obscurity in which I was born and bred, to a state of affluence and some degree of Reputation in the World." He was born in Boston on Jan. 17, 1706, the 10th son of a soap- and candle-maker. Starting at age 12, he worked five years as an apprentice at his brother James's newspaper, the New England Courant, establishing himself as a prankster and satirist, and, not for the last time, as "a little obnoxious to the governing party." Over the next 30 years and beyond, he advanced himself as a printer, publisher and humorist, composing such lasting epigrams as "Fish and visitors stink in three days" and "Eat to live, and not live to eat."
For many, he is the founding American wit, grounded in plain talk, a tradition carried on by Mark Twain and Will Rogers. Franklin's greatest public triumph was probably as a diplomat, persuading France to aid the colonies in their fight against the British. But he needed no revolution to be a revolutionary, for he changed the world by living in it. "The things which hurt, instruct," he observed. Middle-aged eyesight led him to design a single, all-purpose set of glasses -- bifocals. A struggle to raise money for a public hospital led to a plan by which private contributions would be equaled by government funds, the "matching grant" formula in use to this day. Among other credits: modernized street lights, volunteer firefighters, fire insurance, lending libraries, odometers, daylight saving time and lightning rods (inspired by a kite excursion). "His demonstration that lightning was not supernatural had huge impact," says Dudley Herschbach, a Nobel Prize-winning chemist. "Since lightning had long been considered a prerogative of the Almighty, Franklin was attacked for presumption, vigorously but in vain." Herschbach, a Harvard University professor who has lectured frequently on Franklin, says: "Franklin's scientific curiosity extended far beyond his adventures with electricity. He made important discoveries and observations concerning the motion of storms, heat conduction, the path of the Gulf Stream, bioluminescence, the spreading of oil films, and also advanced prescient ideas about conservation of matter and the wave nature of light." Franklin was an innovator, but, unlike Jefferson, not a poet; ideas didn't matter unless they were useful. He was the country's original pragmatist, an early practitioner of the classic American art of learning through experience rather than theory, an art later refined and adopted by William James and John Dewey.
Franklin now seems the safest of the founders to celebrate, but when he died, in 1790, he was mistrusted by many in power as a Francophile synonymous with the excesses of the French Revolution. The Senate rejected a proposal to wear badges of mourning in his honor. A year passed before an official eulogy was delivered, by a longtime detractor, Anglican minister William Smith, who belittled Franklin as "ignorant of his own strength." Condemned as a Jacobin upon his death, he would be satirized as a middlebrow member of the booboisie for more than a century after. Sociologist Max Weber believed Franklin stood for the "earning of more and more money combined with the strict avoidance of all spontaneous engagement in life." Poet John Keats disliked "his mean and thrifty maxims." Historian Charles Angoff labeled him "the father of all the Kiwanians." "It was elitism, sort of a condescending elitism that looked down on Franklin for having basic middle class values," says Walter Isaacson, author of a 2003 bestseller about Franklin. "For a long time, most intellectuals saw him as a spokesman for capitalism and for making money and getting ahead, a view of America many have had," says historian Gordon Wood. The denigration of Franklin was partly his own doing. His "Autobiography," unfinished at his death but published posthumously, immortalized him as a crafty self-made man for whom all virtue was but a means to success. But the "Autobiography" underplays other sides of Franklin: the statesman, dissident and man of conscience, the former slaveholder who eventually called for abolition, the belated rebel who overcame his reverence for the British crown and helped coin one of the era's immortal phrases: "We hold these truths to be self-evident." Franklin is praised now by both the left and right. 
"He was a defender of limited government, and he was very much opposed to taking on excessive debt," says Mark Skousen, an author and economist whose edition of the "Autobiography" includes a Franklin quote of appeal to conservatives: "A virtuous and industrious people may be cheaply governed." David Koepsell, executive director of the Council for Secular Humanism, said he believes Franklin "would have been dismayed by religious fundamentalism in government. He was a free thinker about many things and at least a skeptic about the afterlife and the divinity of Jesus. He was a scientist, a man of letters and a man of Earth."

------------------
http://www.thestate.com/mld/state/news/opinion/13642080.htm
What would Ben Franklin do?
Posted on Tue, Jan. 17, 2006
By SHERRY BEASLEY
Guest Columnist

There is a controversial war going on. Political scandal involving a big-name lobbyist looms large in the nation's capital. The issue of individual freedom versus government surveillance is once again rearing its ugly head. Partisan politics are raging over a nominee to the nation's highest court. The price of fuel continues to rise, and for these and a myriad of other reasons, Americans are experiencing a season of discontent early on in 2006. So on this day, the 300th anniversary of his birth, there is only one question to ask: What would Benjamin Franklin do? We, as a nation, have long lauded Ben Franklin as the original American Renaissance man. We have loved his larger-than-life persona of diplomat, scientist, inventor, statesman, writer, humorist --- and the legendary exploits that took him from Boston and Philadelphia to all over the European continent. The stories of his pragmatic, yet playful, spirit are the stuff of our American history pantheon. Recent biographies have let us know that Ben wasn't exactly an avuncular guy all the time, and that he generated his share of controversy during his long life.
However, as a New York Times editorial recently put it, Ben Franklin, with all his inconsistencies, was the "founder not only of American institutions, but of an idea of America itself." During this tercentenary year of Franklin's birth, there are celebrations planned nationwide, and we will have many opportunities to review the lists of this Founding Father's remarkable contributions to America. We will be reminded that his inventions included items as disparate as bifocals, swim fins, a urinary catheter, the lightning rod, the odometer and the Franklin stove. He also originated the idea of the first public lending library, the first public hospital, the first fire company and fire and property insurance. He also suggested the idea of daylight-saving time and was one of the first to chart the Gulf Stream on his many voyages to Europe. Sometimes we forget that Ben Franklin was the only American to sign all four documents that helped to create the United States: the Declaration of Independence, the Treaty of Alliance with France, the Treaty of Peace with Great Britain and the U.S. Constitution. He was a lifelong believer in collaborative thinking, and is credited with instilling the importance of community service and devotion to compromise in many of his fellow Founding Fathers. Franklin's own belief in the "mixture of cynicism and idealism abounding in human beings" shaped his realistic viewpoints. These viewpoints had ample forums in Franklin's writing. He was an author who commented wryly on his life and times in Poor Richard's Almanac as well as in his own autobiography, both sources of the rich proverbial sayings for which Franklin has become famous. How many of us, for example, have not been told that "Early to bed and early to rise make a man healthy, wealthy and wise"? We all know that a "Penny saved is a penny earned" and that "Three may keep a secret if two of them are dead." 
Franklin's bon mots of wisdom and insight find remarkable application in 2006, three centuries after he originated them. For example, if we consider the current lobbying scandal in Washington, we should remember that Ben told us to "Sell not virtue to purchase wealth, nor liberty to purchase power." The current controversy over the war in Iraq finds a voice in Franklin as well. Although Franklin supported the war with Great Britain, he exhausted diplomatic means before the fighting began. He later offered a succinct critique of war: "There was never a good war or a bad peace." Franklin might address the policy of government surveillance of citizens to thwart terrorism with words he wrote in a letter in 1778: "They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety." His statement "When the well's dry, we know the worth of water" might apply indirectly to the global fuel supply and the rising cost of gasoline. And Franklin would probably address our contemporary fascination with vacuous celebrities such as the Paris Hiltons of the world with his pithy statement that "People who are wrapped up in themselves make small packages" or that "It's hard for an empty bag to stand upright." And to politicians and pundits who fill our airwaves with constant chatter, Ben could offer his famous line, "Well done is better than well said." Ben Franklin also knew how to describe the human quality of charity and generosity. Global response to tragedies like the tsunami, Katrina or the earthquake in Pakistan would find fitting descriptors in Franklin's pronouncements that "A good example is the best sermon" and "What is serving God? Tis doing Good to Man."
As the tricentennial of his birth year gets under way, and we remember how uniquely Benjamin Franklin shaped the character of America, a final quote of his seems to be profound policy for all of us: "If you would not be forgotten, as soon as you are dead and rotten, either write things worth reading, or do things worth the writing." From presidents to politicians to pundits to us peons, this is very sound advice indeed. Thanks, Ben, and happy 300th birthday. Ms. Beasley is grants director for Clemson's Institute for Economic and Community Development. ___________________________________________________________________ Making the best use of thrift http://www.philly.com/mld/inquirer/news/editorial/13642496.htm Posted on Tue, Jan. 17, 2006 By David Blankenhorn How should we celebrate the 300th birthday of Benjamin Franklin of Philadelphia, born this day in 1706? Today, we as a society may be unsure of the answer. But as recently as the 1920s, millions of Americans were quite sure. They honored Franklin by publicly extolling the virtue of thrift, a character trait that Franklin tirelessly championed. Yes, thrift. Thrift is a complex idea. It includes, but has never been merely, the habit of saving money. Thrift is much more than sound approaches to managing one's finances, and the main goal of thrift has never been the accumulation of wealth as an end in itself. The word thrift comes from the verb thrive. Thrift is the ethic and practice of best use. Being thrifty means making the wisest use of all that we have - time, money, our possessions, our health, and our society's natural resources - to promote both our own flourishing and the social good. To use Franklin's favorite terms, thrift's core ideas are "industry" (that is, diligence) and "frugality" (that is, conservation). The ideas most contrary to thrift are idleness and waste. Despite what you may have heard, thrift embraces the pleasure principle. 
When Franklin, in Poor Richard's Almanac, writes, "Fly pleasures, and they'll follow you," he is offering a strategy for pleasure. When he advises that "industry need not wish," he is offering a strategy for getting one's wishes. That strategy, he tells us, "consists very much in Thrift." Franklin openly proclaimed that "wealth is not his that has it, but his that enjoys it." In the 1920s, the slogan of Thrift Week - which began on Jan. 17, Franklin's birthday - was "For Success and Happiness." Thrift is therefore flatly inconsistent with miserliness, or hoarding, or seeking wealth for wealth's sake. Franklin refused to accept money for any of his many inventions and spent much of his life performing public services for which he was not paid. One of the 10 planks of National Thrift Week was "Share with Others." The idea is that being thrifty enables us to be generous. More broadly, thrift is a pathway to, not a rejection of, social awareness and humane moral values. As Franklin earnestly put it, "The noblest question in the world is, What good may I do in it?" Franklin was an unabashed moral and civic reformer who viewed the thrift ethic as essential to improving the national character and ensuring American progress. In almost identical ways, the leaders of the National Thrift Movement of the 1920s believed that their movement was vital to the broad goals of moral reform, character education, and civic progress. They conceived of thrift in broad, progressive terms. They wanted parents to teach thrift to children as a part of character education. They were pioneers in the science of home economics. They wanted Americans to take better care of their health. They wanted farmers and businesses to become more efficient. They were consistently critical of American materialism and consumerism. They were also early environmentalists, strongly supporting the protection of our natural resources. 
In all of their efforts, they regularly invoked the legacy of Ben Franklin, whom they called in their literature "the American Apostle of Thrift." Today, of course, this movement is hardly remembered. What a pity. These men and women did good work. In so many ways, we are in their debt - sometimes debt is good! - just as they were in Franklin's and others' debt. Much of what they fought for is still quite relevant to our lives. Yes, the word thrift today has a quaint, old-fashioned sound. Again, what a pity. Our government budget deficits are ballooning out of control. We Americans don't save much at all, even though most economists agree that more savings and investment relative to consumer spending would be good for us, both as individuals and as a society. We waste a lot. We sometimes seem to think that buying more stuff will make us happy. We sometimes seem confused about the relationship of private gain to the public good. What to do? Instead of inventing a new philosophy to help us wrestle with these issues, we might consider dusting off an old one for recycling. That would be the thrifty thing to do. And for what it's worth, Ben Franklin would certainly approve. _________________________________________________________________ David Blankenhorn (blankenhorn at americanvalues.org) is president of the Institute for American Values and codirects a research project on thrift. ___________________________________________________________________ For Franklin, generosity and prudence were partners http://www.philly.com/mld/inquirer/news/editorial/13642494.htm Posted on Tue, Jan. 17, 2006 For Franklin, generosity and prudence were partners A call to continue his legacy of public service By Amy Gutmann Benjamin Franklin's genius harnessed entrepreneurial energy and intellectual creativity in the service of civic improvement. Philadelphians were the lucky beneficiaries; their city became his laboratory for imagining, inventing, and creating a better world. 
For example, while other colleges were founded to train a privileged class for the clergy and courts, here Franklin founded an academy to teach practical subjects, discover new knowledge, and prepare great future leaders. Today the academy Franklin founded is Philadelphia's largest private employer and one of the world's great research institutions: the University of Pennsylvania. By linking higher education to self-improvement and to serving society, Benjamin Franklin created the very framework for American philanthropy and civic-mindedness. Today, Philadelphians enjoy the benefits of Franklin's drive to create America's first modern city. Concerned for safety and security, he proposed fire-fighting and insurance associations. The nation's first hospital grew out of his campaign to promote better health care. The first free library system came from Franklin's desire to bring the benefits of reading to everyone. How can we draw on his legacy to help us forward into the 21st century? One way is to improve education. Americans should concentrate on providing students of all backgrounds with access to education that prepares them for full participation in our democracy. An education at most selective private colleges and universities, unfortunately, remains beyond the reach of many qualified middle- and low-income students. The students I know at Penn from underrepresented groups include a gifted writer who is the daughter of an auto mechanic and the first in her family to attend college; the son of a truck driver who has become a standout at the Wharton School and a campus leader; and the son of a grocery store clerk who wants to pursue both a doctorate in philosophy and a law degree. One of our highest priorities is increasing accessibility, making our excellent undergraduate, graduate and professional educations more accessible and affordable to the sons and daughters of our country's middle- and low- income families. 
Accessibility is one of the guiding principles of the Penn Compact, a statement of purpose I first put forward in the fall of 2004. We work hard to attract students of diverse socioeconomic backgrounds from the Philadelphia region; although the numbers are climbing, we can do even better. At the same time Americans must continue to push for greater opportunity in elementary and secondary schools - to bring higher education to a more diverse group of students whose contributions will be vital to our country's future. At our university-assisted neighborhood K-8 public school, the Penn Alexander School, many students come from low-income households. Seventy-two percent of last spring's graduating eighth graders are now enrolled in magnet high schools in Philadelphia, greatly improving their chances of getting accepted to selective colleges or universities. Such are the future citizens Benjamin Franklin had in mind. But how to assure that they will have a thriving community to support? Franklin's success was rooted as much in his low-key style as in the brilliance of his ideas. Franklin would float a proposal (sometimes under a pseudonym), then step back while others took credit. He jumped back into action when the moment came to drum up broader support. Let's follow Franklin's example. Put the greater good ahead of parochial interests. Collaborate on solutions, and share the credit. Relearn the art of compromise, which Franklin gracefully modeled in engineering the adoption of the U.S. Constitution. In Philadelphia, opportunities are many to embrace these principles. For years, Penn has been collaborating with our neighbors on initiatives to improve public education, public health, economic development, employment opportunities, and the physical landscape of West Philadelphia. 
From such collaborations will grow efforts in the next decade to begin converting 24 acres near 30th Street Station into parks and recreational facilities; shops and restaurants; arts venues; buildings for teaching, research, and technology transfer, and gateways along the Schuylkill that better connect the university and West Philadelphia to Center City. Franklin loved to say, "People who are wrapped up in themselves make small packages." His 300th birthday is the perfect occasion to follow his inspiring example and open the package for as many people as possible. _________________________________________________________________ Amy Gutmann (president at pobox.upenn.edu) is president of the University of Pennsylvania. ___________________________________________________________________ AP: Boston, city of Franklin's birth, barely celebrates 300th B-Day http://www.boston.com/news/local/massachusetts/articles/2006/01/16/boston_city_of_franklins_birth_barely_celebrates_300th_b_day/ By Andrew Ryan, Associated Press Writer | January 16, 2006 BOSTON --On Benjamin Franklin's 300th birthday on Tuesday, Philadelphia will be the celebratory hub, boasting a 42-page full-color guide to 85 events that include Ben's Birthday Bash at the National Liberty Museum and a gala parade to his grave. And what about in Boston, where the founding father was born and his intellect and character nurtured? "We are going to have a party, cake and everything," said Jessica Kriley, a manager at the Old South Meeting House, where Franklin was baptized having been born across the street on Jan. 17, 1706. They expect several dozen revelers to attend a lecture. How about the mayor's office? No events were scheduled. Maybe something's planned at the Boston Public Library, as Franklin opened the country's first public library? "Hmmm," said a woman in the communications office. "The 17th. Nothing comes to mind." 
"It's pathetic," said Bill Meikle, 70, a Boston-based Franklin impersonator for 20-plus years who won an Emmy Award playing the founding father on television. "I can't explain it. I don't want to sound like I'm ... moaning but there is massive indifference. It starts with the top at city hall and extends to the cultural institutions." The problem could be that when Franklin was 17 years old in September 1723, he left for Philadelphia. He fled from an abusive older brother and a provincial, Puritan-controlled town. "The great runaway," chuckled Walter Isaacson, author of "Benjamin Franklin: An American Life." "I think Boston probably unfairly feels left out because he ran away from town." It could be more than that. Franklin did, after all, quip that Harvard University was only able to teach "blockheads" and "dunces" how to enter a room "genteelly," which is something that they could have learned at a dance class. Maybe that's why Boston's Museum of Science passed on a chance to host a traveling $6 million exhibition marking Franklin's big 3-0-0. Curators opted for a Star Wars exhibit instead. "He did turn his back on the place," said Peter Drummey, librarian at the Massachusetts Historical Society in Boston. "Still, I am struck by the lack of local enthusiasm." For its part the historical society launched an exhibition displaying satirical essays that Franklin wrote for The New England Courant in 1722. On its Web site, the society has posted a nostalgic letter Franklin penned 61 years after he turned his back on his birthplace. "I long much to See again my native Place," he wrote in 1784, "and once hoped to lay my Bones there." Franklin changed his mind. His bones rest near the corner of 5th and Arch streets -- in Philadelphia. "We are doing something for his birthday," said Michael Taylor, president of Boston's Benjamin Franklin Institute of Technology, a college started with money from its namesake's will after his 200th birthday. 
The school is planning a forum hosted by Meikle in full character, looking just as Franklin looked a few years before his death. "The Franklin Institute is throwing me a party and I'm thrilled about it," said Meikle, already playing the part. "I'm not going to get sour grapes. I'm going to focus on the positive." But for many serious Franklin-philes, Boston is not where it's at. "I'll be in Philadelphia," said Isaacson, the author. "They are throwing a huge party for him. Hundreds and hundreds of people are coming." Philadelphia is, after all, the town where Franklin became a famous scientist, inventor, statesman, philosopher, musician and economist. He served as the city's postmaster, and is its most famous -- albeit adopted -- son. That doesn't mean there isn't enough Franklin lore for Boston too. "He loved the city, and it's a shame that nobody up there is paying much attention," Isaacson said, speaking during a telephone interview from his office in Washington D.C. A computer chimed in the background. "I'm getting e-mails from all the people in Philadelphia about what I am supposed to do at the January 17th dinner," he said dismissively. "Let me get back to work." ------ On the Net: Massachusetts Historical Society: http://www.masshist.org/welcome/ Benjamin Franklin Institute of Technology: http://www.bfit.edu/
___________________________________________________________________
Put your Franklinania to the test http://www.philly.com/mld/philly/news/13642520.htm [No answers, but maybe in the dead-tree version.] Posted on Tue, Jan. 17, 2006 Put your Franklinania to the test
1 Which of these foods is Franklin NOT credited with bringing to the United States? a) tofu b) Parmesan cheese c) Parmesan cake d) deep-fried Twinkies
2 What exactly is "Benergy"? a) A corrupt Texas oil company b) The coupling of actor Ben Affleck with Czech supermodel Jenergy Lopascz c) A Philadelphia tourism marketing slogan. d) A Pittsburgh football marketing slogan
3 Franklin was born in Boston. His father, Josiah Franklin, was a: a) rolling stone.
b) butcher. c) baker. d) candlestick maker.
4 Franklin attended Boston Latin School, but he left: a) when his first child was on the way. b) when he was struck by lightning. c) at the age of 10. d) at the age of 13.
5 Franklin ran away from Boston and came to Philadelphia at age 17. The only thing he had in his pocket was: a) a $100 bill, hence the name. b) A Dutch dollar and 20 pence. c) Parmesan cheese. d) three-day-old fish.
6 When Franklin was just a teenager, he wrote articles for the New England Courant under the pseudonym of: a) Dudley Doright. b) Silence Dogood. c) Doright Woman. d) Doright Man.
7 Franklin is said to have enjoyed drinking a spicy beer brewed with molasses and pieces of spruce tree, known as: a) Michelob Ultra. b) Spruce beer. c) A White Hessian. d) A microbrew.
8 In 1731, Franklin fathered an illegitimate son, William Franklin, who grew up to become: a) a spoilt ingrate. b) Colonial governor of New Jersey. c) A gay American. d) A kite manufacturer.
9 Complete the following Ben Franklin quotation: "By failing to prepare, you are preparing to ----." a) invade Iraq. b) father an illegitimate child. c) fail. d) die.
10 In 1758, Franklin conducted early experiments in refrigeration, and later wrote: "one may see the possibility of... " a) ice-cold, refreshing Bud Light! b) freezing a man to death on a warm summer's day. c) spending one's sunset years in the humid territories recently explored by Ponce de Leon. d) cryogenics.
11 A known ladies' man, how many times did Franklin walk down the aisle? (Careful, there could be a trick!) a) 7 b) 2 c) 1 d) 0
12 Franklin was the founder of a legendary political discussion club known as the: a) Eschaton. b) Junto. c) Pinto. d) Tonto.
13 From 1785 to 1790, Franklin held a job similar to that of governor of Pennsylvania, although his actual title was: a) president of the state Supreme Executive Council. b) Knight Commander of the Most Exalted Order of Pennsylvanians. c) Master of the Domain.
d) Leader of the Supremes.
14 One of Franklin's projects was to improve an 18th-century musical instrument comprising water-filled glass bowls known as: a) a harmonica. b) an armonica. c) a monica. d) a honika.
15 In 1730, Franklin began publishing the legendary newspaper known as: a) the Metro. b) The Pennsylvania Gazette. c) The Police Gazette. d) American Colonies Today.
16 Franklin's wife, and mother of two of his children, was named Deborah: a) Kerr. b) Reynolds. c) Read. d) Harry.
17 Complete this famous Franklin statement, which is NOT being used by Philadelphia tourism officials: "Fish and -------- stink in three days." a) tofu b) Jefferson c) visitors d) spruce beer
18 In 1749, Franklin was appointed president of the Academy and College of Philadelphia. In 1791, it was merged with the University of the State of Pennsylvania, to become: a) Penn State. b) Penn. c) The Connecticut School of Broadcasting. d) Drexel.
19 The inner-city action flick "It's All About the Benjamins" takes its name from the fact that Franklin's picture is on: a) The most popular type of crack vial b) A $100 bill c) A $3 bill d) A Snoop Dogg CD
20 Franklin didn't really write, "A penny saved is a penny earned." He wrote, "A penny saved is... " a) worth two in the bush. b) for your thoughts. c) two pence clear. d) a waste of time.
21 In 1731, Franklin also started the first public: a) bathhouse. b) ale house. c) library. d) golf course.
22 On June 15, 1752, Franklin performed his famous electricity experiment with a kite. Shortly after, a Russian scientist named Georg Wilhelm Richmann tried the same thing, and was: a) electrocuted. b) awarded the first Nobel Prize in physics. c) blown 60 feet into the air. d) intoxicated.
23 It's hard to believe, but Franklin apparently was the first person to figure out that: a) the pope is Catholic. b) pigs can't really fly. c) most storms travel. d) one size doesn't really fit all.
24 An intrepid reporter, Franklin is said to have gained a lot of his information from: a) senior administration officials. b) hanging around farmers' markets. c) hanging around popular taverns. d) various and sundry girlfriends.
25 In 1757, Franklin was dispatched to England to protest the influence of this in Pennsylvania politics: a) Pay-to-play municipal contracting b) The Penn family c) The Casey family d) The Gambino family
26 In 1768 in London, Franklin developed "A Scheme for a New Alphabet and a Reformed Mode of Spelling" that: a) proved he had way too much spare time on his hands. b) eliminated the letter "W" because Franklin did not like Washington. c) would have eliminated six existing letters and added six new ones. d) prompted the Battle of Lexington and Concord.
27 Franklin didn't write the Declaration of Independence, but he did: a) add the part about "the pursuit of happiness." b) edit it. c) the penmanship. d) draw the seldom-seen cover art.
28 Although he had owned two slaves, Franklin later became president of the Society for the Relief of Free Negroes Unlawfully Held in: a) Camden. b) bondage. c) chains. d) contempt.
29 Franklin owned two slaves named George and: a) Dick. b) Laura. c) King. d) Queen.
30 The Pennsylvania Gazette frequently carried ads for: a) go-go taverns in lower Bucks County. b) Parmesan cheese. c) the sale and purchase of slaves. d) Franklin stoves.
___________________________________________________________________
Happy 300th, Ben! http://www.philly.com/mld/philly/news/13642465.htm Posted on Tue, Jan. 17, 2006 Essay Happy 300th, Ben! By Tom Ferrick Jr. Inquirer Columnist If Benjamin Franklin were alive, he might skip the events planned around town today for his 300th birthday. Too much folderol for his taste, too much speechifying. Too much Franklin, Franklin, Franklin. It would offend his sense of modesty, and while Franklin wasn't humble (he knew he was smarter than most), he worked hard at being modest.
It was a virtue he cultivated, aware of its value in everyday life. To be a leader of men, he realized, it was best to be one of the guys: generous in praise, respectful of divergent opinions, quick to give credit to others, slow to take it himself. In short, Franklin was a genius with a first-class disposition, a rare thing. His brainpower, his energy, and his high emotional IQ made him the de facto civic leader of Philadelphia, its go-to guy, while still in his 30s. The story of the founding of Pennsylvania Hospital is one example of his uncanny ability to get things done. It wasn't Franklin's idea. It was Dr. Thomas Bond, a London-trained physician, who wanted a hospital for the poor and indigent. As Bond pitched his idea around town, people invariably asked: Have you talked to Franklin? Franklin embraced the plan. But how to raise the 4,000 pounds? Franklin had an idea. (He always had an idea.) He had a citizen petition presented to the Colonial Assembly, asking it to create a hospital. When - as he knew they would - rural legislators objected to such a large expenditure for Philadelphia, Franklin, a member of the Assembly, rose and asked it to put forward half the money - but only if the other 2,000 could be first raised privately. Assembly members agreed, thinking that the private appeal would fail but that they could collect political credits for their generosity. Franklin then organized the fund-raising, the 2,000 pounds was raised, the Assembly put up the other 2,000 pounds, and America's first hospital was erected at Eighth and Pine Streets, where it stands today. Thus did Ben Franklin invent one of the mainstays of modern philanthropy: matching funds. As Franklin wrote later: "I do not remember any of my political manoeuvres, the success of which gave me at the time more pleasure..." The hospital was chartered in 1751, three years after Franklin retired to give time to civic and scientific pursuits. 
To friends who asked why he would give up a lucrative printing business, Franklin explained that when he died, he would rather have people say that "he was useful" than "he was rich." Even before his retirement at 42, Franklin found the time to run a printing business, raise a family, publish a newspaper, write Poor Richard's Almanack, help create the colonies' first fire department, organize the city's town watch, start America's first lending library, found what would become the American Philosophical Society, start the college that became the University of Pennsylvania, lead the militia that drove hostile Indian tribes from the Lehigh Valley, serve in the colonial legislature, invent the Franklin stove, and begin his groundbreaking experiments on electricity. What to do for an encore? After his retirement, Franklin completed his experiments in electricity, served as representative of the colonies in England, returned to Philadelphia to help draft the Declaration of Independence, served as minister of the new nation in France, invented the lightning rod and bifocals, charted the Gulf Stream, and helped write the U.S. Constitution. One problem with discussing Franklin in brief is that his life ends up sounding like a list. But it's important to recall what an amazing life it was, because Franklin has devolved into a caricature in popular culture. This birthday celebration gives us a chance to remember what an astonishing man he was and how lucky we are that Benjamin Franklin decided to devote his life to being "useful." _________________________________________________________________ Contact Tom Ferrick at 215-854-2714 or tferrick at phillynews.com. ___________________________________________________________________ The Grandfather of our Country - Gallagher http://www.opinioneditorials.com/freedomwriters/pgallagher_20060117.html January 17, 2006 Phil Gallagher January 17th marks the 300th birthday of Benjamin Franklin. 
Many volumes have been written in the 216 years since his death exploring his accomplishments in a wide variety of endeavors. Despite the passage of so much time, Franklin's list of achievements and his life's work still stand up against those of the many generations of Americans who have followed him. More impressive than any one achievement was his versatility, his ability to contribute in so many areas of daily 18th-century life. If you lived in the colonies during that period, more than likely your home was heated by a Franklin stove, your church was protected by Ben's lightning rod, and your money was printed by him. If you lived in Philadelphia, you might have read books printed by him, or taken them out of a library he started. Your buildings were protected by his fire brigade, his fire insurance company or the night watch that he started. In his spare time he led a militia that protected your frontier. Many a young man over the last two centuries has been lectured on Franklin's industriousness and his axioms on how to lead a productive life. Today, however, it is another group of Americans that can take a lesson from Franklin. The first of the baby boomers are entering their sixties and beginning the endgame of their lives. As in every phase of their lives, this demographic crowd will have a major impact on the future of the United States. How will this aging population contribute in a productive way, for themselves and for our country? It is useful to look back at Franklin and observe that his achievements in old age are the ones likely to be most indelibly etched in American history as more centuries go by. In an era when the average life span was somewhere in the forties, Franklin lived to be 84. It wasn't an easy 84, as he was beset with maladies common to the times; despite these setbacks, however, he was undaunted.
Just after his seventieth birthday, Franklin was busy as a revolutionary: presiding at Pennsylvania's constitutional convention, editing Jefferson's draft of the Declaration of Independence and becoming its oldest signer. To put Franklin's age in context, it is useful to compare him with some of the other founding luminaries of the time: John Adams, 41; Sam Adams, 54; Thomas Jefferson, 33; George Washington, 44; John Jay, 31; and John Hancock, 39. At 70, Franklin only BEGAN his contribution to the birth and defense of the fledgling nation. He spent the next nine years in Europe tirelessly working to provide financial and material support for the battle and the subsequent peace back home. In 1782, he, along with John Jay and John Adams, negotiated the Treaty of Peace with Great Britain. In 1785 he made his last voyage home; however, he still wasn't done with his contributions. In 1787, he was elected president of the Pennsylvania Society for Promoting the Abolition of Slavery, as well as serving as a delegate to the Constitutional Convention. As we all contemplate the aging process and where we fit in the scheme of things, we might stop and once again take a very hard look at the old Ben Franklin. We can't be him, nor can we duplicate his achievements, but we certainly can try to think as he did in his approach to all aspects of our lives. _________________________________________________________________ AP: In Year of Franklin's 300th Birthday, It's Good to Be Ben http://www.nytimes.com/2006/01/16/national/16franklin.html By THE ASSOCIATED PRESS PHILADELPHIA, Jan. 15 (AP) - It is clear to anyone attending a convention or visiting historic sites here that Ben Franklin is not only alive 300 years after he was born but that he has also been cloned. Several times. Franklin has always had a big presence in Philadelphia. Mayor John F. Street and others have recently endorsed renaming the 30th Street train station in his honor.
And the city's yearlong celebration of his 300th birthday has created a huge demand for the small cadre of people who portray him. Backstage at one recent convention, Franklin could be found with a cellphone to his ear, making notes in his appointment book. Yes, he said somewhat wearily, he can make it to the afternoon tea. But is it all right if he comes late to the breakfast? He would really like that extra half-hour of sleep. Such tight scheduling is not uncommon for Ralph Archbold, perhaps the city's best-known Franklin, who does around 500 events a year. Things have been especially hectic lately, with some three dozen appearances planned in the 10 days before Franklin's birthday bash on Tuesday at the National Constitution Center. But there is always a role for Franklin here, whether it is talking to tourists, cutting ribbons, giving lectures, filming documentaries or visiting local schools. At first, young students are not sure what to make of the gentleman with the waistcoat and cane, said Bill Robling, who has played Franklin for about four years. "Aren't you dead?" they ask. And Mr. Robling said adults who meet Franklin at Independence Hall are equally blunt in their inquiries about the sex life of a man who was a famous flirt. Mr. Robling said he did not mind those kinds of questions, because they provided an opening for him to discuss other aspects of Franklin's life. "Being Ben Franklin in his own words, in his own spaces, is probably as rewarding as anything," said Mr. Robling, 61. "Plus, I love to educate people." Another impersonator, Bill Ochester, said he was always glad to see how those who were initially skeptical of his portrayal ended up being fascinated by the conversation they had with his character. "If I continue to play the role, you'd be amazed how much they buy into it," said Mr. Ochester, 55. The costume helps. Mr. Ochester, who also participates in Revolutionary War re-enactments, wears replicas of 18th-century clothing and shoes. Mr. 
Robling wears a custom-made wig. And Mr. Archbold, who turns 64 on Franklin's birthday, has business cards based on the 1781 design of Franklin's calling card. The impersonators all say they have read extensively about their alter ego, and keep up with the latest research. Each can put his own stamp on the portrayal. "All of us bring a different personality, a different approach," Mr. Robling said. "We've all come from different backgrounds." Mr. Archbold was an industrial photographer before falling into the role 32 years ago; Mr. Ochester was a physician's assistant in cardiothoracic surgery; and Mr. Robling has been an actor for more than 30 years. They decline to talk about how much they get paid. And while they acknowledge they face some competition, the market seems to be big enough for all of them. Philadelphia is not the only market for Franklin - he can also be found in Boston, the city where he was born, though he is sometimes overshadowed there by Paul Revere and Samuel Adams. Mr. Robling said if there was ever a need for more Franklins in Boston, he would be interested in reprising his role: "Have wig, will travel." The Register-Guard, Eugene, Oregon, USA http://www.registerguard.com/news/2006/01/15/printable/ed.col.dennis.0115.f718GrS9.phtml Ben Franklin offers lessons for our time By Matthew Dennis For The Register-Guard Published: Sunday, January 15, 2006 In Paris, in 1784, Benjamin Franklin witnessed a landmark event in aeronautical history - an unprecedented hot-air balloon ascent. When some Parisians asked, what's the point? Franklin famously responded, "What good is a newborn baby?" Like few others, Franklin could imagine the future, and like few Americans, he worked to secure that future - for himself, for his country and for humanity. Today, on the occasion of Franklin's 300th birthday, we are more apt to ask, what good is a long-deceased historical figure? The answer in Franklin's case: plenty. 
We should celebrate Franklin on his birthday this Tuesday, not only because he fundamentally shaped the future that has become our present, but because his life continues to offer lessons for our time. How appropriate for the man whose alter ego was "Poor Richard" - the most famous advice-giver in American history, predating Dear Abby by 200 years. Poor Richard said, "Fools need Advice most, but wise Men only are the better for it." Which are we, fools or wise men? We can still agree that "a penny saved is a penny earned," that "an apple a day keeps the doctor away," that "without pain there is no gain," that "honesty is the best policy," and that "nothing is certain but death and taxes." We might also consider Poor Richard's injunction: "Silence is not always a Sign of Wisdom, but Babbling is ever a Mark of Folly." Read on then, dear reader, and consider not these words mere babbling. Franklin was born in Boston on Jan. 17, 1706 - only a decade removed from the Salem Witch Trials - one of 17 children, the youngest son of a candle and soap maker. His fate was to become a workingman, and he was accordingly bound as an apprentice to his brother James, a printer. He showed great aptitude but little enthusiasm for his unfree state and the domination of his brother. At the age of 17 in 1723, Ben absconded to Philadelphia, violating the terms of his indenture (he literally stole himself), and built a new life as journeyman and then master printer and man of business. By 1748 (at age 42) he was financially secure enough to retire to devote himself more fully to science and public service. Franklin's experiments with electricity made him America's greatest scientist and brought him international fame. Franklin embodied American geographic and social mobility - moving to seek freedom and opportunity and pulling himself up by the proverbial bootstraps, going from near rags to riches. But he was more than a model of self-reliance, hard work and individual ambition. 
He believed in community, responsibility, equality and the common good, in reason and the life of the mind, in peace more than war, and in a better future through technological innovation and public investment. Franklin recognized that not all shared the same possibility of success. Many lacked Ben's luck, pluck and native brilliance, and some were systematically held back, as apprentices, indentured servants or even slaves. Franklin sought to level the playing field and to become other people's luck through his philanthropy and by creating (sometimes inventing new) community institutions - public libraries, secular public schools, research institutions and hospitals. In our own age of corporate greed and unseemly battles over intellectual property rights, Franklin is a beacon of integrity and generosity, as he refused to patent any of his many inventions (the lightning rod, the Franklin stove or bifocals, for example) or profit from his civic improvements. Instead, he shared his knowledge and innovations freely to improve the world and the lives of his fellow citizens. In the 1760s, Franklin denounced frontier violence against American Indians and advocated respectful relations with native people. As Poor Richard said, "Savages we call them because their manners differ from ours." An advocate of equality and opportunity, he served on the committee that drafted the Declaration of Independence. If "all men are created equal," he knew from personal experience that some labored in states of bondage. In 1787, he became president of Pennsylvania's abolition society and argued strenuously against the institution of slavery, which would linger in America for almost another hundred years. Franklin, a son of Puritan Boston, was a deist and skeptic, but he tolerated and supported various faiths and advocated freedom of religion, based on the separation of church and state. He appreciated institutions - religious or secular - that promoted morality and community. 
"What is serving God?" he asked. "Tis doing Good to Man." As the Treaty of Paris (which he helped negotiate) ended the War for Independence, Franklin wrote, "There was never a good War, or a bad Peace." He wondered, "What vast additions to the Conveniences and Comforts of Living might Mankind have acquired, if the Money spent in Wars had been employed in Works of public utility!" Ben expressed concern (legitimately, as it turned out) that the new hot-air balloon might be used for military purposes. Even in times of war, he urged citizens to preserve the basic principles and arrangements that defined them as a free people: "Any society that would give up a little liberty to gain a little security will deserve neither and lose both." Similarly apropos today, Ben observed, "Wars are not paid for in wartime, the bill comes later." Finally, Franklin enthusiastically supported pure and applied research, and he saw education as the way to enlightenment, social improvement, individual opportunity and success. As Poor Richard said, "An investment in knowledge pays the best interest," and "Genius without education is like silver in the mine." Franklin challenged Americans to fund public education, including higher education. He wrote to the president of Princeton College in 1784, "I am persuaded we are fully able to furnish our Colleges amply with every Means of public Instruction, and I cannot but wonder that our Legislatures have generally paid so little Attention to a Business of so great Importance." Oregon legislators should take heed. Franklin's words arrest our attention and continue to speak to us, not merely because they are wise but because they are human and funny. Late in life he wrote, "I guess I don't so much mind being old, as I mind being fat and old." At the beginning of 2006, the rotund, 300-year-old Ben Franklin might give us this sage advice: "Be at war with your vices, at peace with your neighbors, and let every new year find you a better man." 
HoustonChronicle.com - Our intelligent design flap would astonish Franklin http://www.chron.com/cs/CDA/printstory.mpl/editorial/outlook/3590974 Jan. 16, 2006, 7:10PM Surely, our first scientist would want us to use brains By NEAL LANE TODAY marks the 300th anniversary of the birth of Ben Franklin, one of America's most famous founding fathers and the first American scientist. Franklin's ideals and his wisdom are as fresh today as they were during the troubled years of our nation's founding. I cannot help but wonder how Franklin, a disciplined scientist and religious man, would react to the idea of teaching "intelligent design" as an alternative to the science of evolution in our schools. Would he be surprised that the current president of the United States, a self-proclaimed "education president," and the majority leader of the Senate, a cardiovascular surgeon, have advocated such a change in what our schools teach as science? For his part, Franklin never had a problem reconciling his devotion both to God and to the pursuit of scientific truth. While he was unquestionably religious - calling for regular morning prayers at the Constitutional Convention - he was insatiably curious, always questioning, and he rejected all forms of intolerance. He believed that science and mankind's understanding of nature, far from questioning the existence of God, were ways to gain a deeper appreciation of the nature, indeed the wonder, of God and His works. How far we have come in 300 years. Scientists have built upon Franklin's basic understanding of the nature of electricity to create sophisticated electronics, cell phones, lasers, medical imaging devices capable of resolving single molecules inside a living cell, miniaturized computers and the global Internet, satellites and even robotic explorers of distant planets. In contrast to the many amazing advances made in science and technology, we seem to have lost ground when it comes to religion. 
How far apart we seem to be - at least in time and understanding, if not in geographical distance (only 62 miles) - between Franklin's Philadelphia and today's Dover, Pa., where a deeply divided community recently awaited a court judgment concerning intelligent design. Even before the judge ruled that it would be unconstitutional to teach intelligent design as science, the majority of Dover's citizens had already made their will known by tossing out the members of the school board who favored intelligent design. But to be going through this at all in a new millennium is truly remarkable. Were Franklin alive today, he would undoubtedly attest that evolution is a fact in the same way that gravitation and electromagnetism are facts. The same scientific method was used to understand all three aspects of nature. Early hypotheses become theories. Theories are subjected to rigorous experimental testing, and factual descriptions emerge. For centuries, that's how people, including Franklin, have advanced their understanding of how nature works. Notice that I do not say why nature works or what, ultimately, might be behind the workings of nature; instead, science is our best description of how nature works. I know many scientists who are religious, and none of them considers their faith in God to be in conflict with their faith in science or the scientific method of discovery. The belief that God created the universe, hence all of nature, is fully consistent, in their minds, with the belief that science is how we learn about nature. Why is this so difficult for some to accept? I think the majority of the people who feel that children should be exposed to alternative ideas to evolution are not expressing irreconcilable religious beliefs but their own lack of understanding of biological science. The dismal quality of science education we provide in this country is largely responsible for this lack of understanding. 
And all of us are at fault for continuing to give such a low priority to educating today's youth and tomorrow's leaders, even as our children's test scores lag behind much of the rest of the world's. As for Benjamin Franklin, we don't know, of course, what he might make of all this. But given what we know about his views and philosophies and his penchant for plain talk, I believe Franklin might express the opinion that God created the heavens and Earth, the laws of nature, as well as humans with brains, and that God might want us to use them. Lane, a physicist, is a senior fellow in science and technology at Rice University's Baker Institute for Public Policy and the Malcolm Gillis University Professor at Rice. He is a former director of the National Science Foundation and served as assistant to the president for Science and Technology during the Clinton administration. _________________________________________________________________ America's First Self-Made Man: Ben Franklin - 17 Jan 2006 http://www.accountingweb.com/cgi-bin/item.cgi?id=101673 AccountingWEB.com - Jan-17-2006 - January 17 is the 300th birthday of Benjamin Franklin. While Franklin lived to the advanced age of 84, he has not survived three centuries. His wisdom, however, has. Today, he stands as a shining example of what it means to be an American business man or woman. Franklin was a citizen of the world, well-traveled, well-read, multi-lingual and always striving to improve himself. He learned his trade, printing, by working for someone else but eventually went out on his own. His famous work ethic - early to bed and early to rise - helped him succeed to the point of driving his competition out of business. Probably his greatest success was Poor Richard's Almanack, from which many of his most well-known quotes, wit and advice are taken. 
He was so successful as a printer, in fact, that he retired at 42, bringing in a partner to run his printing business while he devoted himself to public service and the study of philosophy, which in those days included scientific experiments and inquiries. Franklin is often cited as the prototype of the American capitalist. And it is true that he was very successful in business. "His Pennsylvania Gazette was essentially The New York Times of his day. The Poor Richard's Almanac, which ran for 25 years, was a great commercial success," Professor Brands told National Public Radio's Talk of the Nation. His printing business made him a wealthy man. But by the age of 42, he had as much money as he needed. He found an able managing partner in whose hands he put the business, and he retired to study philosophy. And philosophy in those days, of course, encompassed science and the general study of the natural world. He thought that was a better use of his time at that point. And so he had had enough of business. Dr. Blaine McCormick, a professor of business at Baylor University, has distilled and updated Franklin's prolific writings into 12 rules of management, including:
* Finish better than your beginnings.
* All education is self-education.
* Seek first to manage yourself, then to manage others.
* Influence is more important than victory.
* Work hard and watch your costs.
* Everybody wants to appear reasonable.
* Create your own set of values to guide your actions.
* Incentive is everything.
* Create solutions for seemingly impossible problems.
* Sometimes it's better to do 1,001 small things right than only one large thing right.
* Deliberately cultivate your reputation and legacy.
These rules are explained in greater detail in the book Ben Franklin's 12 Rules of Management: The Founding Father of American Business Solves Your Toughest Business Problems. In recent months, several new books about Franklin have been published, giving the impression that he is the clear favorite among the Founding Fathers. There are several reasons for this. His long life, open mind and extensive writings have left a wealth of material for us to study, covering many topics still facing us today, often from different perspectives. For accountants, however, in the wake of scandals and increased scrutiny, his best advice may be: "Think of these things, whence you came, where you are going, and to whom you must account." _________________________________________________________________ 'Benjamin Franklin': The Reluctant Revolutionary http://www.nytimes.com/2002/10/20/books/review/20DUNNLT.html By SUSAN DUNN _________________________________________________________________ BENJAMIN FRANKLIN By Edmund S. Morgan. Illustrated. 339 pp. New Haven: Yale University Press. $24.95. _________________________________________________________________ ''I love Company, Chat, a Laugh, a Glass, and even a Song, as well as ever,'' Benjamin Franklin wrote when he was in his 50's. He delighted in others and wanted to be liked by them. ''I never saw a man who was, in every respect, so perfectly agreeable to me,'' an English friend said about him, adding that ''some are amiable in one view, some in another, he in all.'' But Franklin was more than a magnetic extrovert: he channeled his sociability and extraordinary curiosity about the natural world and its inhabitants into decades of energetic commitment to his Philadelphia community and his revolutionary nation. In this engaging and readable book, Edmund S. 
Morgan, the Sterling professor of history emeritus at Yale and the author of ''Inventing the People'' and ''American Slavery, American Freedom,'' among other works, does more than recount the colorful and gripping story of Franklin's long, action- and idea-filled life; he also skillfully dissects the man's personality and mind, his social self and political beliefs, deftly exploring in ''Benjamin Franklin'' how the two halves of his being lived together, not always in harmony. Franklin had strong convictions, though in public he bowed to the ideas of others, diplomatically avoiding confrontation. He was ''the least doctrinaire of men,'' Morgan remarks. His recipe for power was to be inconspicuous. ''He was not without ambition,'' Morgan writes, ''he was simply too shrewd to show it.'' In his missions abroad, he endeavored to promote the policies of those he spoke for, keeping his own opinions to himself. And yet, when Franklin represented American interests in England in the early 1770's, there was often a disconnect between the moderate, conciliatory tack he took and the more aggressive stance of his counterparts back home. He was a latecomer, Morgan underscores, to their intransigent assertion of American rights, sharing neither their horror at British taxation of the colonies nor their appetite for independence. For him, the future lay in an Anglo-American empire of equals. As late as 1774 he was still trying to hold the empire together -- or at least his vision of it. Then, the transformation. Upon his return to America, Franklin swiftly became a firebrand. ''He does not hesitate at our boldest measures,'' John Adams noted, ''but rather seems to think us too irresolute, and backward.'' Even so, if Franklin became a fervent patriot, it was by default, because of England's self-defeating, arrogant rejection of American rights. 
As American minister to France, he brilliantly charmed and maneuvered the French into helping to finance -- and win -- the American War of Independence, at the cost of emptying their treasury and precipitating their own revolution. Still, as Morgan points out, Franklin preferred, to the waging of war, the simple, rational idea that territory could be had through cash purchase, however high the price. But if a war had to be fought, he wanted Americans to fight it themselves, without foreign allies. There was a different kind of disconnect between Franklin's democratic, majoritarian political ideas and the checking-and-balancing caution of framers like James Madison. At the Constitutional Convention, Franklin pushed for a unicameral legislature and a weak executive council with no veto power. His proposals were treated, Madison wrote, with great respect for their ''author'' rather than for their ''practicability.'' Franklin's political ideas were to the populist left of the mainstream, and the Constitution was not entirely his cup of tea. But, understanding its critical importance, he accepted it and, in the self-effacing, tolerant language of an Enlightenment gentleman and diplomat, urged others to do the same. ''The older I grow, the more apt I am to doubt my own judgment, and to pay more respect to the judgment of others,'' he remarked as the convention drew to a close. ''Thus I consent, Sir, to this Constitution, because I expect no better, and because I am not sure, that it is not the best.'' For Franklin, public service -- and virtue -- meant doing what the people wanted, not simply what he wanted, Morgan concludes in this illuminating work. When a man's life was over, Franklin once wrote, it should be said that he lived usefully, not that he died rich. Useful -- to others -- Franklin was. And rich, too, in the respect of history. 
Susan Dunn is the author of ''Sister Revolutions: French Lightning, American Light'' and the forthcoming ''Showdown: The Revolution of 1800.'' ------------------ 'To Begin the World Anew': The Founding Yokels http://www.nytimes.com/2003/02/16/books/review/16BROOKHT.html By RICHARD BROOKHISER If the storms of fashion that have pounded the humanities during the last 30 years have spared the study of early American history, one of the scholars we have most to thank is Bernard Bailyn. Bailyn's 1967 classic, ''The Ideological Origins of the American Revolution,'' kept the eyes of a generation of historians on the subjects that early Americans themselves eyed so obsessively: the ideas and the politics of a highly intellectual and political time. There were battles to be fought and money to be made during the American Revolution, and without victory in the first, or the lure of the second, the Revolution would never have been won. But the thoughts of even soldiers and speculators kept returning to politics, and to the ideals that they believed politicians lived to defend, or to threaten. Bailyn made the founders comprehensible, and lively -- for their ideas still march through our minds. ''To Begin the World Anew,'' a slim and handsome volume, is a collection of what Bailyn calls ''sketches'' on issues arising out of his lifework: the thought of Thomas Jefferson and the pizazz of Benjamin Franklin; the fear that the Constitution provoked in so many Americans when it was first presented to them, and the lessons that the rest of the world took from it after it had been ratified. 
The most important sketch is the first, which began life as a Jefferson Lecture for the National Endowment for the Humanities, and which explains the influence on the founders of the fact that they were ''provincials -- marginal, borderland people.'' The best of these essays are like after-dinner speeches by a guest of honor who knows his subject so well that he can treat it lightly, and who has brought excellent slides with him. The essay on Jefferson is the slightest. Bailyn draws attention to the ambiguities in his thought -- his glimpse of ''what a wholly enlightened world might be'' versus the compromises he made as a politician and an administrator to advance his agenda of the day. Basically, though, the essay is hero worship -- Ken Burns, one more time. This will no longer do. Jefferson's reputation has been taking on water at an alarming rate, from the twin leaks of Sally Hemings and the larger question of slavery. Federalist sympathizers, disgusted with his coldness, his cant and his many deceptions, may be tempted to view Jefferson's posthumous troubles with glee. But if Americans commit parricide on him, they commit suicide. Jefferson must be defended by those who love him toughly -- who know him well enough to dislike him, but who know themselves well enough to know what they owe him. Bailyn's essay on The Federalist Papers begins with a wonderful line of Talleyrand (who had spent two years of exile in the United States, where he befriended Alexander Hamilton, one of The Federalist's main authors). When a Spanish diplomat admitted that he did not know the book, Talleyrand ''wasted no sympathy on him: 'Then read it,' he told the envoy curtly, 'read it.' '' People have been reading it ever since -- quite a tribute to a collection of hastily written newspaper essays that were often ''under the pen,'' as James Madison, the other main author, wrote, ''whilst the printer was putting into type'' the lead paragraphs. 
Bailyn places The Federalist in a four-step process of the Constitution's creation. The document was drafted in Philadelphia in the summer of 1787; the state ratifying debates, which lasted through July of 1788, supplied the first authoritative glosses; the First Congress, which began meeting in the spring of 1789, wrote the Bill of Rights and the Judiciary Act; and the first term of the Washington administration (1789-93) defined aspects of the executive branch like the cabinet and the president's treaty-making power. The Federalist belonged to the blizzard of polemics that accompanied Stage 2. Most of the polemics came from the anticonstitutional side, and were motivated by fear. Some were silly -- what was there, North Carolinians asked, to prevent the pope from becoming president? Some were prescient -- ''Brutus,'' a New York writer, thought the power to tax ''will introduce itself into every corner of the city and country.'' America had thrown off one onerous government; was it saddling itself with another? The Federalist tirelessly addressed those fears. Its ultimate answer was a modest measure of hope: ''There is a portion of virtue and honor among mankind,'' Hamilton wrote, ''which may be a reasonable foundation of confidence.'' In a short but surprising essay (surprising to Americans, at any rate), Bailyn follows those hopes into the world, where the American example had a mixed run. English radicals looked to America as a beacon; French radicals toyed with its institutions at the beginning of their revolution, before turning onto a more populist path. America's strongest impact was on Switzerland and Argentina -- two turbulent countries that profited from the idea of federalism. Hamilton's modesty was as well judged as his confidence. In the misleadingly titled ''Realism and Idealism in American Diplomacy,'' Bailyn hits top form. The real subject is the protean genius of Benjamin Franklin at recreating himself and his image. 
We meet the shape-shifter in his first portrait, painted when he was 40, as a middle-class man. As Franklin becomes a famous scientist, he poses with experimental paraphernalia. By the time he is 60, he sits beside a bust of Newton, in a blue velvet suit with gold trim -- a picture of intellectual and worldly success. Ten years later, in 1776, his newborn country sends him as its minister to France, where Franklin adopts a new look -- a plain dark suit, a cap of marten fur and long straight hair. The French went wild. Franklin seemed like a 70-year-old child of nature, or of Rousseau (Rousseau, Bailyn notes, had worn a similar fur cap in a famous portrait). Franklin's face appeared on prints, medallions, busts and teacups. The apotheosis came in a 1778 portrait by Joseph Siffred Duplessis. Bailyn writes that this face -- hatless now -- ''is worn, the skin pouched, the eyes somewhat puffed and tired.'' Yet it ''radiates experience, wisdom, patience, tolerance . . . unconstrained by nationality, occupation or rank.'' Franklin had become identified ''with humanity itself, its achievements, hopes and possibilities.'' All these images were propaganda -- by boosting himself, Franklin boosted the United States. But he hit his grandest note when he employed the fewest artifices. How did Americans come to entertain such aspirations? Bailyn's lead essay, ''Politics and the Creative Imagination,'' takes its framework from an essay by the art critic Kenneth Clark entitled ''Provincialism.'' When the culture of the metropolis becomes stale, provincials can bring to it, as Bailyn gives us Clark's argument, ''the vigor of fresh energies.'' Colonial America, Bailyn reminds us, was a small and remote place. Bailyn makes the point by comparing the houses of rich Americans -- the Byrds and Carters of Virginia, the Van Cortlandts of New York -- with houses in England. Blenheim Palace is obviously from a different world, a ducal Brasilia. 
But even the stately homes of the lesser English gentry, which resemble their American counterparts in scale, reveal, in their interiors, riches of art and ornament that were unknown across the Atlantic. Similarly, the English country squires and their ladies painted by Gainsborough comport themselves with an assurance that makes the farmer-lawyer Roger Sherman of Connecticut, painted by Ralph Earl, look ''rustic'' and ''clumsy in manner'' -- which indeed Sherman was. Yet, Bailyn reminds us, Sherman ''was one of the most innovative political thinkers of his age.'' Sherman and his fellow rustics were innovative because they challenged received opinions, such as Montesquieu's belief that republics must be small, or the almost universal belief that dual sovereignties -- states within nations -- could not coexist. ''I ask,'' Bailyn quotes Oliver Ellsworth, another Connecticut rustic, ''why can they not? It is not enough to say they cannot. I wish for some reason.'' The answers to their pert questioning included The Federalist, the Constitution and the Declaration of Independence. Bailyn's picture of feisty provincials raises a doubt. Do we have too much money, too many weapons and too many professors to be as intelligent as the founding rubes? Has our ''constitutional establishment'' become ''self-absorbed, self-centered and . . . distant from the ordinary facts of life''? Anxiety about our own decadence is also a very old feature of American life. Hold on to that thought. The slob you see may be your savior. 
Richard Brookhiser is the author, most recently, of ''America's First Dynasty: The Adamses, 1735-1918.''

----------------

The Founding Fathers Were NOT Christians
http://www.dimensional.com/~randl/founders.htm

Excerpts from: The Founding Fathers Were Not Christians by Steven Morris, in Free Inquiry, Fall 1995 (If you want to complain about this article, complain to Steven Morris, who wrote it)

"The Christian right is trying to rewrite the history of the United States as part of its campaign to force its religion on others. They try to depict the founding fathers as pious Christians who wanted the United States to be a Christian nation, with laws that favored Christians and Christianity. This is patently untrue. The early presidents and patriots were generally Deists or Unitarians, believing in some form of impersonal Providence but rejecting the divinity of Jesus and the absurdities of the Old and New Testaments. Thomas Paine was a pamphleteer whose manifestos encouraged the faltering spirits of the country and aided materially in winning the War of Independence: "I do not believe in the creed professed by the Jewish church, by the Roman church, by the Greek church, by the Turkish church, by the Protestant church, nor by any church that I know of...Each of those churches accuse the other of unbelief; and for my own part, I disbelieve them all." From: The Age of Reason by Thomas Paine, pp. 8-9 (Republished 1984, Prometheus Books, Buffalo, NY) George Washington, the first president of the United States, never declared himself a Christian according to contemporary reports or in any of his voluminous correspondence. Washington championed the cause of freedom from religious intolerance and compulsion. When John Murray (a universalist who denied the existence of hell) was invited to become an army chaplain, the other chaplains petitioned Washington for his dismissal. Instead, Washington gave him the appointment. 
On his deathbed, Washington uttered no words of a religious nature and did not call for a clergyman to be in attendance. From: George Washington and Religion by Paul F. Boller Jr., pp. 16, 87, 88, 108, 113, 121, 127 (1963, Southern Methodist University Press, Dallas, TX) John Adams, the country's second president, was drawn to the study of law but faced pressure from his father to become a clergyman. He wrote that he found among the lawyers "noble and gallant achievements" but among the clergy, the "pretended sanctity of some absolute dunces." Late in life he wrote: "Twenty times in the course of my late reading, have I been upon the point of breaking out, 'This would be the best of all possible worlds, if there were no religion in it!'" It was during Adams's administration that the Senate ratified the Treaty of Peace and Friendship, which states in Article XI that "the government of the United States of America is not in any sense founded on the Christian Religion." From: The Character of John Adams by Peter Shaw, p. 17 (1976, North Carolina Press, Chapel Hill, NC) Quoting a letter by JA to Charles Cushing Oct 19, 1756, and John Adams, A Biography in his Own Words, edited by James Peabody, p. 403 (1973, Newsweek, New York, NY) Quoting letter by JA to Jefferson April 19, 1817, and in reference to the treaty, Thomas Jefferson, Passionate Pilgrim by Alf Mapp Jr., p. 311 (1991, Madison Books, Lanham, MD) quoting letter by TJ to Dr. Benjamin Waterhouse, June 1814. Thomas Jefferson, third president and author of the Declaration of Independence, said: "I trust that there is not a young man now living in the United States who will not die a Unitarian." He referred to the Revelation of St. 
John as "the ravings of a maniac" and wrote: "The Christian priesthood, finding the doctrines of Christ levelled to every understanding and too plain to need explanation, saw, in the mysticisms of Plato, materials with which they might build up an artificial system which might, from its indistinctness, admit everlasting controversy, give employment for their order, and introduce it to profit, power, and pre-eminence. The doctrines which flowed from the lips of Jesus himself are within the comprehension of a child; but thousands of volumes have not yet explained the Platonisms engrafted on them: and for this obvious reason that nonsense can never be explained." From: Thomas Jefferson, an Intimate History by Fawn M. Brodie, p. 453 (1974, W.W. Norton and Co., Inc., New York, NY) Quoting a letter by TJ to Alexander Smyth Jan 17, 1825, and Thomas Jefferson, Passionate Pilgrim by Alf Mapp Jr., p. 246 (1991, Madison Books, Lanham, MD) quoting letter by TJ to John Adams, July 5, 1814. "The day will come when the mystical generation of Jesus, by the supreme being as his father in the womb of a virgin, will be classed with the fable of the generation of Minerva in the brain of Jupiter." -- Thomas Jefferson (letter to J. Adams, April 11, 1823) James Madison, fourth president and father of the Constitution, was not religious in any conventional sense. "Religious bondage shackles and debilitates the mind and unfits it for every noble enterprise." "During almost fifteen centuries has the legal establishment of Christianity been on trial. What have been its fruits? More or less in all places, pride and indolence in the Clergy, ignorance and servility in the laity, in both, superstition, bigotry and persecution." From: The Madisons by Virginia Moore, p. 43 (1979, McGraw-Hill Co., New York, NY) quoting a letter by JM to William Bradford April 1, 1774, and James Madison, A Biography in his Own Words, edited by Joseph Gardner, p. 
93 (1974, Newsweek, New York, NY) Quoting Memorial and Remonstrance against Religious Assessments by JM, June 1785. Ethan Allen, whose capture of Fort Ticonderoga while commanding the Green Mountain Boys helped inspire Congress and the country to pursue the War of Independence, said, "That Jesus Christ was not God is evidence from his own words." In the same book, Allen noted that he was generally "denominated a Deist, the reality of which I never disputed, being conscious that I am no Christian." When Allen married Fanny Buchanan, he stopped his own wedding ceremony when the judge asked him if he promised "to live with Fanny Buchanan agreeable to the laws of God." Allen refused to answer until the judge agreed that the God referred to was the God of Nature, and the laws those "written in the great book of nature." From: Religion of the American Enlightenment by G. Adolph Koch, p. 40 (1968, Thomas Crowell Co., New York, NY) quoting preface and p. 352 of Reason, the Only Oracle of Man and A Sense of History compiled by American Heritage Press Inc., p. 103 (1985, American Heritage Press, Inc., New York, NY) Benjamin Franklin, delegate to the Continental Congress and the Constitutional Convention, said: "As to Jesus of Nazareth, my Opinion of whom you particularly desire, I think the System of Morals and his Religion...has received various corrupting Changes, and I have, with most of the present dissenters in England, some doubts as to his Divinity; tho' it is a question I do not dogmatize upon, having never studied it, and think it needless to busy myself with it now, when I expect soon an opportunity of knowing the Truth with less trouble." He died a month later, and historians consider him, like so many great Americans of his time, to be a Deist, not a Christian. From: Benjamin Franklin, A Biography in his Own Words, edited by Thomas Fleming, p. 404 (1972, Newsweek, New York, NY) quoting letter by BF to Ezra Stiles March 9, 1790. 
______________________________________________________________

The words "In God We Trust" were not consistently on all U.S. currency until 1956, during the [1]McCarthy Hysteria. The Treaty of Tripoli, passed by the U.S. Senate in 1797, read in part: "The government of the United States is not in any sense founded on the Christian religion." The treaty was written during the Washington administration, and sent to the Senate during the Adams administration. It was read aloud to the Senate, and each Senator received a printed copy. This was the 339th time that a recorded vote was required by the Senate, but only the third time a vote was unanimous (the next time was to honor George Washington). There is no record of any debate or dissension on the treaty. It was reprinted in full in three newspapers - two in Philadelphia, one in New York City. There is no record of public outcry or complaint in subsequent editions of the papers.

______________________________________________________________

[2]Contradictions in the Bible
[3]The Flat Earth
[4]AMERICA - Not A Christian Nation (another site with more quotes of the founders)
[5]Debunking Fundamentalism
[6]The Nuclear Family Meltdown
[7]Mother Earth on the Chopping Block (The Coming Corporate World Government)
[8]Look into the eyes of the advertising demon!
[9]Pat Buchanan: Pit Bull in Wolf's Clothing

References
1. http://www.dimensional.com/~randl/mccart.htm
2. http://www.dimensional.com/~randl/tcont.htm
3. http://www.dimensional.com/~randl/tcreat.htm
4. http://www.postfun.com/pfp/worbois.html
5. http://www.dimensional.com/~randl/tview.htm
6. http://www.dimensional.com/~randl/tfamly.htm
7. http://www.dimensional.com/~randl/tcorps.htm
8. http://www.dimensional.com/~randl/telvision.htm
9. http://www.dimensional.com/~randl/buch.htm

----------------

'Benjamin Franklin': The Many-Minded Man
New York Times Book Review, 3.7.6
http://www.nytimes.com/2003/07/06/books/review/06ELLIST.html

BENJAMIN FRANKLIN: An American Life. By Walter Isaacson. 
Illustrated. 590 pp. New York: Simon & Schuster. $30. By JOSEPH J. ELLIS For reasons that no one has adequately explained, those prominent Americans often mythologized and capitalized as Founding Fathers, or alternatively demonized as the deadest-whitest-males in American history, have surged into vogue over the past decade. John Adams, Alexander Hamilton and Thomas Jefferson appeared to be the chief beneficiaries of this trend until recently, when Benjamin Franklin moved into contention. First H. W. Brands produced a well-received cradle-to-grave life of Franklin, then Edmund Morgan came forward with a beguilingly Boswellian character study of the great American sage. Now Walter Isaacson joins the list with a full-length portrait virtually assured to bring Franklin's remarkable career before a sizable readership. Isaacson wrote this book while serving as managing editor at Time and then as head of CNN, both full-time jobs that presumably left little opportunity for travels back to the 18th century. But anyone assuming that ''Benjamin Franklin: An American Life'' is aimed at the coffee table would be dead wrong. It is a thoroughly researched, crisply written, convincingly argued chronicle that is also studded with little nuggets of fresh information. Among the items that were new to me: that Franklin investigated ways to make flatulence less odorous, and that Davy Crockett went down at the Alamo carrying a copy of Franklin's ''Autobiography'' in his jacket. Instead of Franklin's Boswell, Isaacson comes across as his Edward R. Murrow, diligently and often deftly interrogating the man while sifting through the veritable mountain of scholarship that has accumulated around him over the past two centuries. 
If anything, Isaacson engages in a bit of scholarly overkill at the end, providing a separate conclusion and epilogue on Franklin's legacy, a chronology of important dates, brief biographies of all the supporting characters in the story, conversion tables that provide modern dollar equivalents for British and colonial currency, an annotated bibliography and about 50 pages of endnotes. The erudition is somewhat conspicuous here, but it is also impressive. The long arc of Franklin's life (1706-90) presents a major challenge to all biographers. As a boy he traded anecdotes with Cotton Mather about Puritan theology; as an elder statesman he compared notes with Thomas Jefferson on the likely course of the French Revolution. Franklin was also ubiquitous, the only person present at all three founding moments of American independence: the drafting of the Declaration of Independence; the negotiations that produced the Treaty of Paris, which ended the Revolutionary War; and the great debate that resulted in the Constitution. And beneath his folksy facade, Franklin was a world-class scientist, an accomplished prose stylist and a brilliant diplomat. (Think of Jonas Salk, Mark Twain and Henry Kissinger all rolled into one.) To top it off, he was a man of multiple masks, protean in his personas as well as his talents, gliding effortlessly from Poor Richard to promiscuous London bon vivant to backwoods Voltaire. Chronologically, intellectually and psychologically, he is a stretch. Isaacson recognizes from the start that the character portrayed in the ''Autobiography'' is one of Franklin's most artful inventions. He argues persuasively that Franklin's sharpest critics, from Max Weber to D. H. Lawrence, have directed their fire more at his masks than at the man beneath them. 
Rather than deploy elaborate psychological theories to explain Franklin's interior agility, Isaacson prefers an old-fashioned kind of explanation -- a narrative account of his career accompanied by interpretive assessments at the most salient moments of Franklinesque magic. If I read him right, Isaacson thinks it is both futile and misguided to search for the core Franklin among the ever-shuffling selves, since Franklin's orchestration of his different voices became the central feature of his personality. The earliest chapters cover Franklin's ascent in Philadelphia from a penniless adolescent to its leading citizen. Isaacson complicates the familiar Horatio Alger theme of the ''Autobiography'' by noticing Franklin's early tendency to adopt fictional pseudonyms (for example, Silence Dogood) and to present his convictions obliquely in satirical fables, giving his public image a flirtatious and forever flickering quality. He also lingers over the oddly aloof version of intimacy he fashioned with Deborah Read, his semiliterate wife, and his illegitimate son, William, whom he would raise but eventually disown. While a famous advocate of family values in theory, Franklin was in practice an elusive husband and father whose most sustained expressions of affection came late in life with his grandchildren. During the 1750's and 60's Franklin spent the bulk of his time in London seeking a royal charter for Pennsylvania to replace the proprietary government of the Penn family. Even within the most rarefied chambers of the American academy, scholars disagree about this phase of Franklin's career, in part because the political context in both Philadelphia and London is so tangled, in part because Franklin's allegiance to the British Empire does not fit comfortably with his subsequent commitment to independence. 
As Isaacson sees it, Franklin misread the mounting American opposition to British rule because he harbored a vision of the empire as a trans-Atlantic community of equal partners, an international theater in which British glory and his own burgeoning reputation would rise together. Though he was destined to become the prototypical American, Franklin came to American independence late in the game, and abandoned his identity as a Briton reluctantly. Isaacson does not say so explicitly, but he suggests that Franklin's dramatic renunciation of his son, who remained a Loyalist, had its roots in his agonized struggle with his own political allegiance. William, in effect, was the British side of Franklin that had to be forsaken. Isaacson's most impressive chapter, a little tour de force of historical synthesis, focuses on Franklin's role during the Paris peace negotiations that ended the War of Independence. Again, this is bloody and well-trampled ground, littered with the bodies of several generations of historians. Isaacson's previous work as a student of American foreign policy and biographer of Henry Kissinger serves him well here. He somehow manages to sift his way through the diplomatic debris and recover Franklin's exquisite sense of the competing objectives among the American, British and French delegations, all the while recognizing that John Adams and John Jay were correct to insist, against Franklin's instincts, on a separate bargain with the British that left the French marooned and unrewarded. The most recent French-American diplomatic minuet, it would seem, has a long history. Whatever the source of our current fascination with the founding generation, Isaacson's life of Franklin exemplifies the interpretive trend that defines the best of the recent biographies: namely, a flair for finding flaws within greatness. 
My sense is that Isaacson's own career as an executive within the media world has given him an affinity for Franklin's self-conscious manipulation of multiple versions of himself, an empathy for Franklin's always enlightened and often playful duplicities. For all these reasons, a definitive biography of Franklin is a contradiction in terms. Isaacson's intuitive understanding of those terms, and his prodigious appetite for research, combine to make this biography a prime candidate for the authoritative Franklin of our time.

Joseph J. Ellis is the author of ''Founding Brothers: The Revolutionary Generation.''

-----------------

'The Americanization of Benjamin Franklin': A Folksy Aristocrat
New York Times Book Review, 4.8.8
By BARRY GEWEN

THE AMERICANIZATION OF BENJAMIN FRANKLIN By Gordon S. Wood. Illustrated. 299 pp. The Penguin Press. $25.95.

IT'S Benjamin Franklin's time. Two years ago Edmund S. Morgan gave us a fine character sketch with ''Benjamin Franklin.'' Then Walter Isaacson's ''Benjamin Franklin: An American Life'' planted itself on the New York Times best-seller list for a long stay. H. W. Brands has chimed in with ''The First American,'' a more commodious biography than Isaacson's, if a less fluent one. And now we have Gordon S. Wood's engaging book ''The Americanization of Benjamin Franklin.'' Wood has some tough acts to follow, but he is no slouch. A skilled writer with both Pulitzer and Bancroft prizes to his credit, he possesses as profound a grasp of the early days of the Republic as anyone currently working. He is the author of two books -- ''The Creation of the American Republic, 1776-1787'' and ''The Radicalism of the American Revolution'' -- that are essential for understanding the United States from its founding down to the present. This study is not a biography, at least not a conventional one. Wood focuses on Franklin's personal development and constructs his narrative around various turning points in the life, almost like a bildungsroman. 
We learn the choices Franklin made, the conflicts he had to resolve. This is the most dramatic of the recent Franklin books. One of Wood's major topics is Franklin's reputation, then and now. A reader today cannot fail to be astonished by Franklin's remarkable modernity. Isaacson calls him ''the founding father who winks at us.'' Wood echoes this judgment: ''He seems to be the one we would most like to spend an evening with.'' Washington was too solemn, Jefferson too lofty, Hamilton too driven, Madison too lawyerly, Adams too difficult, a royal pain. With his disdain of powdered wigs and the other formalities of his very formal age, Franklin comes across as the most recognizably human of them all, a man for our time. His immediacy compresses centuries. He is not even Franklin, he's just Ben -- witty, ironic, plain-spoken, shrewd, congenial, devious, visionary, lusty, magnanimous, hardheaded, manipulative, brilliant Ben. Not everyone today would enjoy his company, be seduced by this consummately seductive man. The politically correct would most likely hector him if they could. For Franklin was a slaveholder. It's true he turned against slavery, and ardently so, at the very end of his life, but he took a long time getting there. He could be a bigot as well. He wrote nativist diatribes against the large German population in his own colony of Pennsylvania. In 1751 he argued for excluding everyone from Pennsylvania except the English; Morgan calls him ''the first spokesman for a lily-white America.'' Franklin loved the company of women, but he was no feminist. He treated his wife miserably, and he admonished young brides to attend to the word ''obey'' in their vows. He worried that handouts to the poor would encourage laziness, and he was a fervent supporter of a strong military. An 18th-century Jesse Helms? Modern right-wingers would probably be even more uncomfortable with him than left-wingers. Take his religious views. 
Franklin was a deist; God, in his opinion, was a distant presence in the affairs of men. He was no churchgoer. He accepted neither the sacredness of the Bible nor the divinity of Jesus. His ideas about property rights were similarly unorthodox. Beyond basic necessities, he said, all property belonged to ''the public, who by their laws have created it.'' Brands calls such remarks ''strikingly socialistic.'' What most sets Franklin apart from contemporary conservatives, however, is his attitude toward that panoply of issues gathered under the heading of ''family values.'' As a young man he consorted with ''low women,'' and fathered an illegitimate child. In 1745 he wrote a letter to a youthful friend -- long suppressed -- offering advice on choosing a lover. (Older women, he declared, were preferable to younger ones.) Franklin was always an incorrigible flirt. How much actual sex was involved is anybody's guess, but one incident stands out among the rest. When he was in his 70's and living in Paris, he became enamored of the captivating 33-year-old Mme. Anne-Louise Brillon, one of the leading lights of Parisian society. Even the puritanical John Adams was enchanted by her. She was no less taken with Franklin, and their vivacious correspondence consisted of a determined campaign on his part to bed her and her equally stalwart resistance, based on the customs of the day and what was proper between a widower and a married woman. Their bantering give-and-take, as quoted by Brands, constitutes one of the most charming episodes in early American history and -- since as far as the historians can tell they never did sleep together -- also one of the most poignant. Moral zealots of his own era -- Adams, for example, and the Lees of Virginia -- didn't like him, and our own zealots of both the left and the right wouldn't like him now. 
In these overheated, bipolar times, if a decision had to be made about our currency, it's a safe bet that a slaveholding lecher would not be gracing our $100 bills. But Franklin was a hero of moderation throughout his life, and he is a hero for moderates today. He took the world as he found it, accepted people for what they were and didn't try to make them over. He had no axes to grind. His code of conduct began in sociability, with a firm commitment to the practical. Franklin has been criticized for not being a dreamer. He wasn't; he wanted to get things done. He was devoted to public service, the public good. Thus, the library, fire company, insurance company, hospital and university he founded in Philadelphia; thus, the inventions and scientific experiments that won him fame on both sides of the Atlantic; and thus, the magnificent political and diplomatic achievements. Franklin, as Isaacson points out, was the only person to sign the four major documents establishing the country: the Declaration of Independence, the treaties with France and Britain, and the Constitution. Wood calls him ''the greatest diplomat America has ever had.'' So extraordinary was the multifaceted Franklin that it's all too easy to sentimentalize him, and here Wood's book can serve as a useful corrective. Two themes in particular lend themselves to fuzzy effusiveness. The first is that Franklin was some kind of tribune of the masses, the populist among the founding fathers. But no less than Thomas Jefferson, Franklin believed in the idea of a natural aristocracy, and well understood where he was positioned within that hierarchy. He could interact enjoyably with anyone, commoners as well as kings, yet as Morgan observes, he preferred associating with those ''on the same wavelength'' -- which meant neither commoners nor kings but Hume, Burke, Condorcet, Boswell, Beaumarchais, Adam Smith. He hated the rabble, feared mobs. 
When it came to decision making, he held that wisdom resided with the wise. ''The Americanization of Benjamin Franklin'' makes clear just how much of an elitist our folksiest founder was. The other problematic theme concerns Franklin's ''Americanness.'' He seems almost a checklist for those national qualities Americans take pride in -- and others despise us for. Yet Wood alerts us to be careful in how we think about this aspect of his character. For he was the most cosmopolitan of the founders, at home anywhere. Twenty-five of the last 33 years of his life were spent abroad, and those years were anything but a hardship for him. He was wined and dined and celebrated by the Europeans more than he ever was by his own countrymen. Soon after arriving in London he was complaining about the provinciality and vulgarity of Americans. In Paris he was quite simply a superstar, acclaimed as the equal of Voltaire, and he gave thought to settling permanently in ''the civilest Nation upon Earth.'' These sentiments did not go unnoticed back home, and Franklin fell under suspicion of being a foreign agent, first for the British, then for the French. When he returned to Philadelphia for the last time in 1785, it was in part to clear his name. So what does it mean to speak of his ''Americanization''? What changed him from a citizen of the world to a citizen of the United States? As Wood shows, these aren't easy questions to answer. But it should be said that in one way Franklin never really did change. He turned against England because it had become smaller in his mind, oppressive and corrupt, no longer the center of civilization that he had come to love. Now it was America that seemed to be civilization's future. The Revolution was not a conflict over taxation or home rule, not even a dispute over the rights of Englishmen. 
For him it represented something universal, a world-historical event, ''a miracle in human affairs.'' That is, Franklin never stopped being the urbane cosmopolitan, the ultimate sophisticate. He stayed true to himself. But by 1776 he had concluded that the only way to remain a citizen of the world was to become an American.

Barry Gewen is an editor at the Book Review.
http://www.nytimes.com/2004/08/08/books/review/08GEWENL.html

----------------

Excerpts from 'The Americanization of Benjamin Franklin'
By GORDON S. WOOD

BECOMING A GENTLEMAN

BOSTON BEGINNINGS

Franklin was born in Boston on January 17, 1706 (January 6, 1705, in the old-style calendar), of very humble origins, origins that always struck Franklin himself as unusually poor. Franklin's father, Josiah, was a nonconformist from Northamptonshire who as a young man had immigrated to the New World and had become a candle and soap maker, one of the lowliest of the artisan crafts. Josiah fathered a total of seventeen children, ten, including Benjamin, by his second wife, Abiah Folger, from Nantucket. Franklin was number fifteen of these seventeen and the youngest son. In a hierarchical age that favored the firstborn son, Franklin was, as he ruefully recounted in his Autobiography, "the youngest Son of the youngest Son for 5 Generations back." In the last year of his life the bitterness was still there, undisguised by Franklin's usual irony. In a codicil to his will written in 1789 he observed that most people, having received an estate from their ancestors, felt obliged to pass on something to their posterity. "This obligation," he wrote with some emotion, "does not lie on me, who never inherited a shilling from any ancestor or relation." Because the young Franklin was unusually precocious ("I do not remember when I could not read," he recalled), his father initially sent the eight-year-old boy to grammar school in preparation for the ministry. 
But his father soon had second thoughts about the expenses involved in a college education, and after a year he pulled the boy out of grammar school and sent him for another year to an ordinary school that simply taught reading, writing, and arithmetic. These two years of formal education were all that Franklin was ever to receive. Not that this was unusual: most boys had little more than this, and almost all girls had no formal schooling at all. Although most of the Revolutionary leaders were college graduates -- usually being the first in their families to attend college -- some, including Washington, Robert Morris, Patrick Henry, Nathanael Greene, and Thomas Paine, had not much more formal schooling than Franklin. Apprenticeship in a trade or skill was still the principal means by which most young men prepared for the world. Franklin's father chose that route of apprenticeship for his son and began training Franklin to be a candle and soap maker. But since cutting wicks and smelling tallow made Franklin very unhappy, his father finally agreed that the printing trade might better suit the boy's "Bookish Inclination." Printing, after all, was the most cerebral of the crafts, requiring the ability to read, spell, and write. Nevertheless, it still involved heavy manual labor and was a grubby, messy, and physically demanding job, without much prestige. In fact, printing had little more respectability than soap and candle making. It was in such "wretched Disrepute" that, as one eighteenth-century New York printer remarked, no family "of Substance would ever put their Sons to such an Art," and, as a consequence, masters were "obliged to take of the lowest People" for apprentices. But Franklin fit the trade. Not only was young Franklin bookish, but he was also nearly six feet tall and strong with broad shoulders -- ideally suited for the difficult tasks of printing. 
His father thus placed him under the care of an older son, James, who in 1717 had returned from England to set himself up as a printer in Boston. When James saw what his erudite youngest brother could do with words and type, he signed up the twelve-year-old boy to an unusually long apprenticeship of nine years. That boy, as Franklin later recalled in his Autobiography, was "extremely ambitious" to become a "tolerable English Writer." Although literacy was relatively high in New England at this time -- perhaps 75 percent of males in Boston could read and write and the percentage was rapidly growing -- books were scarce and valuable, and few people read books the way Franklin did. He read everything he could get his hands on, including John Bunyan's Pilgrim's Progress, Plutarch's Lives, Daniel Defoe's Essay on Projects, the "do good" essays of the prominent Boston Puritan divine Cotton Mather, and more books of "polemic Divinity" than Franklin wanted to remember. He even befriended the apprentices of booksellers in order to gain access to more books. One of these apprentices allowed him secretly to borrow his master's books to read after work. "Often," Franklin recalled, "I sat up in my Room reading the greatest Part of the Night, when the Book was borrow'd in the Evening & to be return'd early in the Morning lest it should be miss'd or wanted." He tried his hand at writing poetry and other things but was discouraged with the poor quality of his attempts. He discovered a volume of Joseph Addison and Richard Steele's Spectator papers and saw in it a tool for self-improvement. He read the papers over and over again and copied and recopied them and tried to recapitulate them from memory. He turned them into poetry and then back again into prose. He took notes on the Spectator essays, jumbled the notes, and then attempted to reconstruct the essays in order to understand the way Addison and Steele had organized them. 
All this painstaking effort was designed to improve and polish his writing, and it succeeded; "prose Writing" became, as Franklin recalled in his Autobiography, "of great Use to me in the Course of my Life, and was a principal Means of my Advancement." In fact, writing competently was such a rare skill that anyone who could do it well immediately acquired importance. All the Founders, including Washington, first gained their reputations by something they wrote. In 1721 Franklin's brother, after being the printer for another person's newspaper, decided to establish his own paper, the New England Courant. It was only the fourth newspaper in Boston; the first, published in 1690, had been closed down by the Massachusetts government after only one issue. The second, the Boston News-Letter, was founded in 1704; it became the first continuously published newspaper not only in Boston but in all of the North American colonies. The next Boston paper, begun in 1719 and printed by James Franklin for the owner, was the Boston Gazette. These early newspapers were small, simple, and bland affairs, two to four pages published weekly and containing mostly reprints of old European news, ship sailings, and various advertisements, together with notices of deaths, political appointments, court actions, fires, piracies, and such matters. Although the papers were expensive and numbered only in the hundreds of copies, they often passed from hand to hand and could reach beneath the topmost ranks of the city's population of twelve thousand, including even into the ranks of artisans and other "middling sorts." These early papers were labeled "published by authority." Remaining on the good side of government was not only wise politically, it was wise economically. Most colonial printers in the eighteenth century could not have survived without government printing contracts of one sort or another. Hence most sought to avoid controversy and to remain neutral in politics. 
They tried to exclude from their papers anything that smacked of libel or personal abuse. Such material was risky. Much safer were the columns of dull but innocuous foreign news that they used to fill their papers, much to Franklin's later annoyance. It is hard to know what colonial readers made of the first news item printed in the newly created South Carolina Gazette of 1732: "We learn from Caminica, that the Cossacks continue to make inroads onto polish Ukrania." James Franklin did not behave as most colonial printers did. When he decided to start his own paper, he was definitely not publishing it by authority. In fact, the New England Courant began by attacking the Boston establishment, in particular the program of inoculating people for smallpox that was being promoted by the Puritan ministers Cotton Mather and his father. When this inoculation debate died down, the paper turned to satirizing other subjects of Boston interest, including pretended learning and religious hypocrisy, some of which provoked the Mathers into replies. Eager to try his own hand at satire, young Benjamin in 1722 submitted some essays to his brother's newspaper under the name of Silence Dogood, a play on Cotton Mather's Essays to Do Good, the name usually given to the minister's Bonifacius, published in 1710. For a sixteen-year-old boy to assume the persona of a middle-aged woman was a daunting challenge, and young Franklin took "exquisite Pleasure" in fooling his brother and others into thinking that only "Men of some Character among us for Learning and Ingenuity" could have written the newspaper pieces. These Silence Dogood essays lampooned everything from funeral eulogies to "that famous Seminary of Learning," Harvard College. Although Franklin's satire was generally and shrewdly genial, there was often a bite to it and a good deal of social resentment behind it, especially when it came to his making fun of Harvard. 
Most of the students who attended "this famous Place," he wrote, "were little better than Dunces and Blockheads." This was not surprising, since the main qualification for entry, he said, was having money. Once admitted, the students "learn little more than how to carry themselves handsomely, and enter a Room genteely, (which might as well be acquire'd at a Dancing-School,) and from whence they return, after Abundance of Trouble and Charge, as great Blockheads as ever, only more proud and self-conceited." One can already sense an underlying anger in this precocious and rebellious teenager, an anger with those who claimed an undeserved social superiority that would become an important spur to his ambition. When Franklin's brother found out who the author of the Silence Dogood pieces was, he was not happy, "as he thought, probably with reason," that all the praise the essays were receiving tended to make the young teenager "too vain." Franklin, as he admitted, was probably "too saucy and provoking" to his brother, and the two brothers began squabbling. James was only nine years older than his youngest brother, but he nonetheless "considered himself as my Master & me as his Apprentice." Consequently, as master he "expected the same Services from me as he would from another; while I thought he demean'd me too much in some he requir'd of me, who from a Brother expected more Indulgence." Since the fraternal relationship did not fit the extreme hierarchical relationship of master and apprentice, the situation became impossible, especially when James began exercising his master's prerogative of beating his apprentice. Indentured apprentices were under severe contractual obligations in the eighteenth century and were part of the large unfree population that existed in all the colonies. 
In essence they belonged to their masters: their contracts were inheritable, and they could not marry, play cards or gamble, attend taverns, or leave their masters' premises day or night without permission. With such restraints it is understandable that Franklin was "continually wishing for some Opportunity" to shorten or break his apprenticeship. In 1723 that opportunity came when the Massachusetts government-like all governments in that pre-modern age, acutely sensitive to libels and any suggestion of disrespect-finally found sufficient grounds to forbid James to publish his paper. James sought to evade the restriction by publishing the paper under Benjamin's name. But it would not do to have a mere apprentice as editor of the paper, and James had to return the old indenture of apprenticeship to his brother. Although James drew up a new and secret contract for the remainder of the term of apprenticeship, Franklin realized his brother would not dare to reveal what he had done, and he thus took "Advantage" of the situation "to assert my Freedom." His situation with his brother had become intolerable, and his own standing in the Puritan-dominated community of Boston was little better. Since Franklin had become "a little obnoxious to the governing Party" and "my indiscreet Disputations about Religion began to make me pointed at with Horror by good People, as an Infidel or Atheist," he determined to leave Boston. But because he still had some years left of his apprenticeship and his father opposed his leaving, he had to leave secretly. With a bit of money and a few belongings, the headstrong and defiant seventeen-year-old boarded a ship and fled the city, a move that was much more common in the mobile eighteenth-century Atlantic world than we might imagine. Thus Franklin began the career that would lead him "from the Poverty & Obscurity in which I was born & bred, to a State of Affluence & some Degree of Reputation in the World." 
PHILADELPHIA

Franklin arrived in the Quaker city renowned for its religious freedom in 1723, hungry, tired, dirty, and bedraggled in his "Working Dress," his "Pockets stuffed out with Shirts and Stockings," with only a Dutch dollar and copper shilling to his name. He bought three rolls, and "with a Roll under each Arm, and eating the other," he wandered around Market, Chestnut, and Walnut Streets, and in his own eyes, and the eyes of his future wife, Deborah Read, who watched him from her doorway, made "a most awkward ridiculous Appearance." He finally stumbled into a Quaker meetinghouse on Second Street, and "hearing nothing said," promptly "fell fast asleep, and continu'd so till the Meeting broke up, when one was kind enough to wake me." Franklin tells us in his Autobiography that he offers us such a "particular"-and unforgettable-description of his "first Entry" into the city of Philadelphia so "that you may in your Mind compare such unlikely Beginnings with the Figure I have since made there." Although he tried in his Autobiography to play down and mock his achievements, Franklin was nothing if not proud of his extraordinary rise. He always knew that it was the enormous gap between his very obscure beginnings and his later worldwide eminence that gave his story its heroic appeal. Philadelphia in the 1720s numbered about six thousand people, but it was growing rapidly and would soon surpass the much older city of Boston. The city, and the colony of Pennsylvania, had begun in the late seventeenth century as William Penn's "Holy Experiment" for poor persecuted members of the Society of Friends. But by the time Franklin arrived, many of the Quaker families, such as the Norrises, Shippens, Dickinsons, and Pembertons, had prospered, and this emerging Quaker aristocracy had come to dominate the mercantile affairs and politics of the colony. 
http://www.nytimes.com/2004/08/08/books/chapters/0808-1st-wood.html

-------------

Sunday Book Review > 'A Great Improvisation': Our Man in Paris
http://www.nytimes.com/2005/04/03/books/review/03ISAACSO.html
5.4.3
By WALTER ISAACSON

A GREAT IMPROVISATION: Franklin, France, and the Birth of America. By Stacy Schiff. Illustrated. 489 pp. Henry Holt & Company. $30.

IN 1776, after he had helped edit Thomas Jefferson's draft of the Declaration of Independence, the 70-year-old Benjamin Franklin was sent on a wartime Atlantic crossing deemed necessary to make that document a reality. America had to get France on its side in the Revolution, and, even back then, France was a bit of a handful. Franklin was an ideal choice for the mission, as Stacy Schiff shows in this meticulously researched and judicious account of his eight years as a diplomatic dazzler and charmer in Paris. ''He happened to do a fine imitation of a French courtier,'' she writes, showing her astute feel for both Franklin and 18th-century court life at Versailles. ''He knew better than to confuse straightforwardness with candor; he was honest, but not too honest, which qualifies in France as a failure of imagination.'' What made Franklin such a great diplomat was that he could quote Cervantes's maxim about honesty being the best policy without trying to apply it in the Hall of Mirrors, where a more oblique approach had its advantages. He had a ''majestic suppleness'' that was rare, especially in a man of his age. Today's stewards of America's foreign policy could learn much from the wily and seductive Franklin. He was as adroit as a Richelieu or Metternich at the practice of balance-of-power realism; he wrote memos to the French foreign minister Vergennes that showed a fine feel for the national interests of France and its Bourbon-pact allies; and he played the French off against the English envoys who came secretly suitoring for a back-channel truce. 
But he also wove in the idealism that was to make America's worldview exceptional both then and now; he realized that the appeal of the values of democracy and an attention to winning hearts and minds through public diplomacy would be sources of the new nation's global influence as much as its military might. After a year of playing both seductive and coy, Franklin was able to negotiate a set of treaties with France that would, so the signers declared, bond the countries in perpetuity. One French participant expressed the hope that the Americans ''would not inherit the pretensions and the greedy and bold character of their mother country, which had made itself detested.'' As a result of the arrangements made by Franklin, the French supplied most of America's guns and nearly all of its gunpowder, and had almost as many troops at the decisive battle of Yorktown as the Americans did. Schiff scrupulously researches the details of Franklin's mission and skillfully spices up the tale with the colorful spies, stock manipulators, war profiteers and double-dealers who swarmed around him. Most delightful are the British spy Paul Wentworth, so graceful even as he is outmaneuvered by Franklin, and the flamboyant playwright and secret agent Beaumarchais (''The Barber of Seville'' and ''The Marriage of Figaro''), so eager to capitalize on the news of the American victory at Saratoga that he was injured when his carriage overturned while speeding with a banker from Franklin's home to central Paris. Least delightful is the priggish and petulant John Adams, ''a man to whom virtue and unpopularity were synonymous'' and whom Schiff merrily tries to knock from the pedestal upon which he was placed by David McCullough. Schiff is somewhat less successful at capturing the sweep and excitement of Franklin's diplomatic achievements. 
She never offers up much of a theory of how he enticed the French into an alliance, what role the military victory at Saratoga played, how he really felt about the British, what games he was playing when he juggled two rival British envoys vying to be his interlocutor in the final peace talks or why he agreed with his fellow commissioners to negotiate that treaty with Britain behind the backs of the French. Nor does Schiff convey the brilliance of his writing and the exuberance of his flirtations with his two mistresses. Franklin, oddly enough, sometimes comes across as rather distant and lifeless, which is a shame. In her two previous biographical studies -- ''Véra (Mrs. Vladimir Nabokov)'' and ''Saint-Exupéry'' -- Schiff displayed her mastery as a literary stylist. This time, she occasionally lapses into clichés (in one section Franklin ''dragged his feet'' and then ''led Vergennes down the primrose path'' from which position he ''backed them'' -- the Spaniards -- ''into a corner''), and some of her phrases read as if she wrote them first in period French (''Franklin paid a call of which he could not have overestimated the symbolic value''). Nevertheless, her research is so convincing and her feel for the subject so profound that ''A Great Improvisation'' becomes both an enjoyable narrative and the most important recent addition to original Franklin scholarship. When he embarked on his final voyage back home to America after his triumphant years in France, Franklin made a short stop in England, at Southampton, where he met with his illegitimate and prodigal son, William, who had remained loyal to the British crown. There William's own illegitimate son, Temple, who had sided with and worked for his grandfather Benjamin, tried to effect a reconciliation. Alas, the reunion was cold and bitter. It was a vivid reminder of how personality and character and emotion and diplomacy can become dramatically interwoven. 
That was one of the great themes of Franklin's life, one of the many that resonate today. Walter Isaacson, president of the Aspen Institute, is the author of ''Benjamin Franklin: An American Life'' and ''Kissinger: A Biography.'' He is writing a biography of Albert Einstein.

-------------

Charming Paris
http://www.washingtonpost.com/ac2/wp-dyn/A17329-2005Mar31
Reviewed by Isabelle de Courtivron
Sunday, April 3, 2005; Page BW04

A GREAT IMPROVISATION: Franklin, France, and the Birth of America. By Stacy Schiff. Henry Holt. 489 pp. $30

At the beginning of his February trip to Europe, President Bush quipped that he hoped for a reception similar to the one Benjamin Franklin received two centuries earlier, when he "arrived on this continent to great acclaim." (Secretary of State Condoleezza Rice told him he "should be a realist.") This was tongue in cheek, of course -- an attempt to smooth over the "Punish France!" pronouncements from the heated debate over Iraq and subsequent Francophobic actions such as renaming fries and dumping Beaujolais. But Bush probably did not realize what price Franklin had actually paid for retaining his extraordinary popularity in France and for surmounting political and personal obstacles on both sides of the Atlantic. The story of the eight and a half years he spent in Paris, persuading the French to support the fledgling American army in concrete as well as symbolic ways, is the subject of Stacy Schiff's engaging new book. A Great Improvisation has many levels. It is a factual, historical and meticulously detailed recounting of the travails, vexations, negotiations, complexities and setbacks of the political and diplomatic maneuvers that ultimately led France to support the young American cause. It is also an enlightening discussion of the vexed and complex beginnings of the transatlantic alliance. 
Finally, it is an entertaining story, bringing alive a cast of colorful characters, strange plot twists and bizarre anecdotes, which sometimes reads like a movie script replete with intrigues, ultimatums, cabals, swindles and vendettas. In 1776, the 70-year-old Franklin landed in France, sent by a Congress that had declared independence without the means to achieve it. The very idea of foreign help was unpalatable to some in Congress and considered suspect by many even after the court of young Louis XVI had come through. But these widely diverging opinions did not deter Franklin from his unwavering faith in the American Revolution and his steady conviction that every measure should be taken to sustain the new republic and win the war against the British. Franklin had the daunting task of advertising rebellion in an absolute monarchy; he did so doggedly, all the while underplaying what was often a desperate military situation. When he arrived in France, he was already well known and widely respected as a statesman, philosopher and scientist. But what allowed him to succeed when all other emissaries charged with the same task had fallen into the deep Franco-American political and cultural divide? Schiff attributes it in large part to his ability to marshal "a great improvisation." She points to Franklin's laissez-faire attitude, his ability to be logical without being encumbered by exaggerated honesty, his voluble, genial and ruthless approach, and his calculated innocence. He was also a hit with the French because he knew how to adapt to the codes of the European nobility -- not to mention possessing a heroic and seemingly unlimited patience for people's exasperating foibles, French, British and Americans alike. Indeed, as thorny as Franklin's encounters with various French characters may have been, they seem tame next to his relations with members of his own mission and with his compatriots -- from the early tension between the original U.S. 
emissaries to France, William Lee and Silas Deane (who fought not only over strategy but over the colors of the American army uniforms), all the way to the uncompromising John Adams (who considered every laurel bestowed upon Franklin a personal affront). Marshaling so much original information -- drawn from diplomatic archives, family papers, spy reports and the archives of the French foreign service -- could have made for a tedious read were it not for Schiff's storytelling skills. The author of a Pulitzer Prize-winning biography of Vera Nabokov, Schiff introduces us to a cast of unique characters, whom she captures in a few vivid and incisive traits. They range from Pierre-Augustin Caron de Beaumarchais, the flamboyant, irrepressible, swashbuckling secret agent and playwright who became an important early arms dealer; to the recipient of those weapons, the dashing young marquis de Lafayette, who sailed to America against the king's order, wracked with violent seasickness, speaking not a word of English and leaving behind a pregnant wife; to the excitable, stubborn Viscount Stormont, the British ambassador to Versailles; and to the chevalier d'Eon, a cross-dressing dragoon officer who became a notable supporter of the young republic's cause. Schiff does not forget the ladies with whom Franklin flirted so copiously, in person and by correspondence: for instance, the thirtysomething, married Anne-Louise Brillon de Jouy, who had frail nerves, called him Papa and eventually promised to become his wife, but only in the afterlife, or Anne-Catherine Helvétius, the philosopher's widow, a hostess with a powerful salon who was at the "center of Franklin's social life" in France. They figure prominently in Schiff's narrative, not simply because of Franklin's fraught infatuations with several of them but also because, in 18th-century French society, their salons were the places where important people could meet and network. 
Completing this tableau are members of the somewhat dysfunctional Franklin family: his illegitimate son William, a Loyalist leader in London with whom he was on terrible terms, and William's own illegitimate son, Temple, who worked for his grandfather in Paris and whose taste for Europe left him incapable of readapting to America. "For his service abroad," Schiff wryly notes, Franklin "wound up with an English son and a French grandson." Schiff's allusions to the French-American misunderstandings and mutual suspicion will regale readers. Some of these lead to hilarious anecdotes; for example, Bostonians welcomed the French squadron in 1778 with a dinner of cooked green Massachusetts frogs. The French militiamen found American coffee undrinkable, the food inedible, the people "overly familiar and bizarrely peripatetic" and the women graceless and unshapely; the Americans felt that the French talked too fast and all at the same time without really saying much, opined on subjects they knew nothing about and considered that business consisted primarily of ceremony and pleasure. Despite the undeniable impact on U.S.-French relations of two tumultuous centuries, A Great Improvisation reminds us that profound cultural differences between the two societies have not changed all that much -- and thus remain at the root of their conflicting visions of the world. Plus ça change . . .

Isabelle de Courtivron is Friedlaender Professor of the Humanities at MIT and the editor of "Lives in Translation: Bilingual Writers on Identity and Creativity."

-------------

Forget the Founding Fathers
New York Times Book Review, 5.6.5
http://www.nytimes.com/2005/06/05/books/review/05GEWE01.html
By BARRY GEWEN

THE founding fathers were paranoid hypocrites and ungrateful malcontents. What was their cherished Declaration of Independence but empty political posturing? 
They groaned about the burden of taxation, but it was the English who were shouldering the real burden, paying taxes on everything from property to beer, from soap to candles, tobacco, paper, leather and beeswax. The notorious tea tax, which had so inflamed the people of Massachusetts, was only one-fourth of what the English paid at home; even Benjamin Franklin labeled the Boston Tea Party an act of piracy. Meanwhile, smugglers, with the full connivance of the colonists, were getting rich at the expense of honest tax-paying citizens. The recent French and Indian War had doubled Britain's national debt, but the Americans, who were the most immediate beneficiaries, were refusing to contribute their fair share. The revolutionaries complained about a lack of representation in Parliament, but in this they were no different from the majority of Englishmen. What was more, the God-given or nature-given rights they claimed for themselves included the right to hold Africans in bondage. Edward Gibbon, who knew something about the ups and downs of history, opposed the rebels from the House of Commons. Samuel Johnson called them ''a race of convicts'' who ''ought to be thankful for any thing we allow them short of hanging.'' Observed from across the Atlantic, the story of the Revolution looks very different from the one every American child grows up with. To see that story through British eyes, as Stanley Weintraub's ''Iron Tears: America's Battle for Freedom, Britain's Quagmire: 1775-1783'' enables us to do, is to see an all-too-familiar tale reinvigorated. Weintraub reminds us that justice did not necessarily reside with the rebels, that the past can always be viewed from multiple perspectives. And he confronts us with the fact that an American triumph was anything but inevitable. History of course belongs to the victors. 
If Britain's generals had been more enterprising, if the French had failed to supply vital military and financial assistance, George Washington, Thomas Jefferson, Alexander Hamilton and the rest would be known to us not as political and philosophical giants but as reckless (and hanged) losers, supporting players in a single act of Britain's imperial drama. We would all be Canadians now, with lower prescription drug costs and an inordinate fondness for winter sports. But Weintraub's book does more than add a fresh dimension to a tired subject. By giving the war a genuinely international flavor, it points the way to a new understanding of American history. Instead of looking out at the rest of the world from an American perspective, it rises above national boundaries to place the past in a global context. This is a significant undertaking. At a time when the role of the United States in the world has never been more dominant, or more vulnerable, it is crucially important for us to see how the United States fits into the jigsaw of international relations. Weintraub indicates how American history may come to be written in the future. A globalized history of the United States would be only the latest twist in a constantly changing narrative. Broadly speaking, since the end of World War II there have been three major schools of American history; each reflected and served the mood of the country at a particular time. In the 1940's and 50's, that mood was triumphal. As Frances FitzGerald explains in ''America Revised: History Schoolbooks in the 20th Century,'' the United States was routinely presented in those years as ''perfect: the greatest nation in the world, and the embodiment of democracy, freedom and technological progress.'' The outside world may have been intruding on the slumbering nation through the cold war, the United Nations, NATO and the rise of Communist China, but the textbooks' prevailing narrative remained resolutely provincial. 
''The United States had been a kind of Salvation Army to the rest of the world,'' the books taught. ''Throughout history, it had done little but dispense benefits to poor, ignorant and diseased countries. . . . American motives were always altruistic.'' The histories of that time, FitzGerald says, were ''seamless,'' a word that applied not only to schoolbooks but also to the work of the period's most sophisticated scholars and writers, men like Richard Hofstadter and Louis Hartz. Reacting against the challenge of totalitarianism, they went looking for consensus or, in Hofstadter's phrase, ''the central faith'' of America, and they found it in the national commitment to bourgeois individualism and egalitarianism. Americans clustered around a democratic, capitalist middle. Uniquely among major nations, the United States had avoided serious ideological conflict and political extremes; even its radicals and dissenters adhered to what Hofstadter called the ''Whiggish center'' and Hartz termed ''the liberal tradition.'' Arthur M. Schlesinger Jr. wrote about ''the vital center.'' Daniel Bell spoke of ''the end of ideology.'' Because they emphasized unity at the expense of division and dissent -- Hartz referred to ''the shadow world'' of American social conflict -- these consensus historians later were criticized for being conservative and complacent. There is some truth to this charge, but only some. As a group, they were reformers, even liberal Democrats, but their liberalism was pragmatic and incremental. Mindful of the leftist extremism of the 1930's, they looked upon idealism as something to be distrusted; grand visions, they had come to understand, could do grand damage. Taken too far, this viewpoint could lead to a defense of the status quo, or at least to a preference for the way things were to the way visionaries said they could be. Down that road, neoconservatism beckoned. 
Hofstadter, for one, was discomforted by some of his critics, and admitted to having ''serious misgivings of my own about what is known as consensus history.'' It had never been his purpose, he explained, to deny the very real conflicts that existed within the framework he and others were attempting to outline. Hofstadter acknowledged that his writing ''had its sources in the Marxism of the 1930's,'' and an alert reader could detect a residual Marxism, or at least an old-fashioned radicalism, in some of his comments in ''The American Political Tradition.'' Though the book appeared in the late 1940's, at the onset of one of the greatest economic booms in American history, Hofstadter was still complaining about ''bigness and corporate monopoly,'' misguidedly declaring that ''competition and opportunity have gone into decline.'' Similarly, in ''The Liberal Tradition in America,'' Hartz brilliantly but, it seemed, ruefully, analyzed why socialism had failed to take root in the United States. However much these thinkers had been disappointed by Marxism, they were hardly ready to embrace straightforward majoritarian democracy. Indeed, with the exception of Henry Adams, there has probably never been a historian more suspicious of ''the people'' than Richard Hofstadter. For him vox populi conjured up images of racism, xenophobia, paranoia, anti-intellectualism. The more congenial Hartz described Americans as possessing ''a vast and almost charming innocence of mind''; his hope was that the postwar encounter with the rest of the world would awaken his countrymen from their sheltered, basically oafish naivete. But if the consensus historians were not Marxists and not majoritarian democrats, what, during the cold war era, could they be? What other choice was there? The answer is that they were ironists who stood beyond political debate, beyond their own narratives. 
Hartz urged scholars to get ''outside the national experience''; ''instead of recapturing our past, we have got to transcend it,'' he said. One became an anthropologist of one's own society. How better to understand the national character, what made America America? Yet the outsider approach had real limitations, as became apparent once the tranquil 50's turned into the tumultuous 60's. The consensus historian, Hartz wrote, ''finds national weaknesses and he can offer no absolute assurance on the basis of the past that they will be remedied. He tends to criticize and then shrug his shoulders.'' This preference for the descriptive over the prescriptive, with its mix of resignation and skepticism, its simultaneous enjoyment and rejection of the spectacle of American life, was at bottom ''aesthetic.'' In retrospect, one can even begin to see certain links between the consensus generation's aesthetic irony and the distancing attitude Susan Sontag described in her 1964 essay, ''Notes on Camp.'' In any event, the work of these historians was drastically undermined by the upheavals of the 60's and early 70's -- the Kennedy assassination and the other political murders, the Vietnam War, the urban riots, the student revolts, Watergate and the kulturkampf of sex, drugs and rock 'n' roll. As division and conflict consumed the country, the emphasis on American unity seemed misguided. And the ironic stance itself looked irresponsible. The times demanded not distance but engagement, not anthropologists but activists, not a shrug but a clenched fist. Everyone was being forced to make choices, and those choices presented themselves with an almost melodramatic starkness, especially on the campuses that were the homes of the consensus historians. It was the blacks against the bigots, the doves against the hawks, the Beatles against Rodgers and Hammerstein. For historians, too, the choice was easy: for the neglected minorities and against the dominant dead white males. 
As postwar seamlessness faded in the 1960's, a school of multicultural historians emerged to take the place of the consensus historians. This school has been subjected to a lot of criticism of late, but in fact it brought forth a golden age of social history. Blacks, American Indians, immigrants, women and gays had been ignored in the national narrative, or, more precisely, treated as passive objects rather than active subjects. The Civil War may have been fought over slavery, but the slaves were rarely heard from. Who knew anything about the Indians at Custer's Last Stand? The immigrants' story was told not through their own cultures but through their assimilation into the mainstream. But now, the neglected and powerless were gaining their authentic voices. New studies increased our knowledge, enlarging and transforming the picture of America, even when the multiculturalists worked in very restricted areas. Judith A. Carney's ''Black Rice: The African Origins of Rice Cultivation in the Americas,'' for example, describes how the South Carolina rice industry was built not only on slave labor but on the agricultural and technological knowledge brought over by the Africans. The book has not found many readers outside the academy, but it nonetheless changes our understanding of the black contribution to American life. At its best, multiculturalism illuminated the niches and byways of American history. It investigated smaller and smaller subjects in greater and greater detail: gays in the military during World War II, black laundresses in the postbellum South. But this specialization created a problem of its own. 
In 1994, when the Journal of American History asked historians about the state of their profession, they bemoaned its ''narrowness,'' its ''divorce from the public.'' The editor of the journal wrote that ''dazzling people with the unfamiliar and erudite'' had become ''more highly prized than telling a good story or distilling wisdom.'' Yet what story, exactly, did the multiculturalists want to tell? Could all those detailed local and ethnic studies be synthesized into a grand narrative? Unfortunately, the answer was yes. There was a unifying vision, but it was simplistic. Since the victims and losers were good, it followed that the winners were bad. From the point of view of downtrodden blacks, America was racist; from the point of view of oppressed workers, it was exploitative; from the point of view of conquered Hispanics and Indians, it was imperialistic. There was much to condemn in American history, little or nothing to praise. Perhaps it was inevitable that multiculturalism curdled into political correctness. Exhibit A, Howard Zinn's ''People's History of the United States,'' has sold more than a million copies. From the start, Zinn declared that his perspective was that of the underdog. In ''a world of victims and executioners, it is the job of thinking people . . . not to be on the side of the executioners.'' Whereas the Europeans who arrived in the New World were genocidal predators, the Indians who were already there believed in sharing and hospitality (never mind the profound cultural differences that existed among them), and raped Africa was a continent overflowing with kindness and communalism (never mind the profound cultural differences that existed there). American history was a story of cruel domination by the wealthy and privileged. The founding fathers ''created the most effective system of national control devised in modern times,'' Zinn stated. 
The Civil War was a conflict of elites, and World War II was fought not to stop fascism but to extend America's empire. The United States and the Soviet Union both sought to control their oppressed populations, ''each country with its own techniques.'' The Vietnam War was a clash between organized modern technology and organized human beings, ''and the human beings won.'' We have traveled a long way from the sophisticated ironies of the consensus historians. A reaction against distortions and exaggerations of this kind was sure to come. Battered by political correctness, basking in Reaganesque optimism and victory in the cold war, the country in the 1980's and 90's was ready for a reaffirmation of its fundamental values. After all, democracy was spreading around the world and history itself (treated as a conflict of ideologies) was declared at an end. One of the first historians to take heart from the cold war's conclusion and to see the value of re-examining the formative years of the republic was the early-American scholar Joseph J. Ellis. In ''Founding Brothers'' he wrote: ''all alternative forms of political organization appear to be fighting a futile rearguard action against the liberal institutions and ideas first established in the United States.'' Ellis was a major figure in the new school of founding fathers historians that emerged in the 1990's. But as an academic, he was exceptional. Most were amateur and freelance historians, since the universities had become hostile to the kind of ''great man'' history they were interested in doing. A National Review editor, Richard Brookhiser, taking Plutarch as his model, explained that his goal was to write ''moral biography,'' a phrase unlikely to endear him to postmodernist academics; in rapid succession he produced brief, deft studies of Washington, Hamilton and the Adams family. 
Ellis, the biographer of John Adams, Thomas Jefferson and George Washington, saw himself engaged in retro battle against his own profession, and observed that his work was ''a polite argument against the scholarly grain, based on a set of presumptions that are so disarmingly old-fashioned that they might begin to seem novel in the current climate.'' George Washington, Ellis joked, was ''the deadest, whitest male in American history.'' But if the academy was hostile to these books, the larger world was not. The volumes by Brookhiser and Ellis, not to mention works by David McCullough, Ron Chernow and Walter Isaacson, were widely praised. Some won National Book Awards and Pulitzer Prizes. And in sharp contrast to the restricted monographs of the multiculturalists, they sold by the truckload. Here was genuinely popular history, written with a public purpose and designed to capture a large audience. Ellis's ''Founding Brothers'' was a best seller in hardback for almost a year, and a best seller in paperback for more than a year. Isaacson's ''Benjamin Franklin'' spent 26 weeks on the best-seller list; McCullough's ''John Adams'' entered the list at No.1, staying there for 13 weeks, rivaling for a while the popularity of novels by the likes of John Grisham and Danielle Steel. Chernow's ''Alexander Hamilton'' and Ellis's ''His Excellency: George Washington'' both made the best-seller list last year. And yet there are reasons to believe the popularity of the school is peaking. For one thing, it is running out of founding fathers. The only major figure still awaiting his Chernow or McCullough is the thoughtful but unexciting James Madison. No doubt the principal author of the Constitution will have his day, but the founding fathers school is facing the choice of reaching down into the second ranks, or going over ground already covered by others. 
Brookhiser's most recent biography was of the less-than-great Gouverneur Morris, whom he teasingly describes as ''the rake who wrote the Constitution.'' Meanwhile, another formidable biography of Adams has just come out, and Benjamin Franklin has been turned into an industry unto himself, the subject of an apparently endless flood of books. There's always room for different interpretations, but the bigger picture is in the process of being lost. A school that arose in reaction to the excesses of the multiculturalists has started feeding on itself. Most important, however, 9/11 has changed the way Americans relate to their past. The war on terror, the invasions of Afghanistan and Iraq, the apparently insoluble problem of nuclear proliferation and the ominous but real potential for a ''clash of civilizations'' -- all these are compelling us to view history in a new way, to shed the America-centered perspective of the founding fathers school and look at the American past as a single stream in a larger global current. Stanley Weintraub will never equal the best of the founding fathers authors in the felicity of his prose, and ''Iron Tears'' is unlikely to reach far beyond the campuses. But by embedding the American Revolution in British history, by internationalizing it, his book speaks more directly to the needs of our time than do biographies of Adams and Hamilton. Weintraub is hardly alone. Another book that gains immediacy by giving a global spin to an old subject is Alonzo L. Hamby's ''For the Survival of Democracy: Franklin Roosevelt and the World Crisis of the 1930s.'' The New Deal is as overdiscussed as the Revolution, yet by internationalizing it, Hamby is able to raise provocative, revealing questions, even disturbing ones. 
The Great Depression, he points out, was a crisis that ''begged for international solutions.'' The Western governments, however, pursued beggar-thy-neighbor policies, including protective tariffs and competitive currency devaluations, that ''frequently made things worse.'' And the United States, he says, was the worst offender of all, ''the most isolationist of the major world powers.'' Roosevelt was an economic nationalist who mistakenly treated his country as a self-contained unit, even actively sabotaging the feeble efforts at international cooperation. Whatever economic successes he had domestically -- and Hamby, following other recent historians, shows that those successes were modest indeed -- his actions contributed to the nation-against-nation, Hobbesian atmosphere of the world arena. Hamby does not go so far as to blame Roosevelt for Hitler's growing strength in the mid-1930's, but it would not be difficult to take his argument in that direction. Roosevelt was an ''impressive'' figure, Hamby writes. But from a global perspective, the New Deal record was ''hardly impressive.'' As if to signal to historians the kind of reassessment that needs to be done, the National Endowment for the Humanities will sponsor a four-week institute at the Library of Congress later this month on ''Rethinking America in Global Perspective.'' And one group of professional historians has already begun submerging the United States within a broader identity. The growing field of ''Atlantic history,'' connecting Europe, Africa and the Americas through economics, demography and politics, has become a recognized academic specialty, taught not only in the United States but also in Britain and Germany. It is generating books, conferences, prizes and, of course, a Web site. 
No less a figure than the eminent Harvard historian Bernard Bailyn has devoted his most recent book to this ''very large subject'' that is ''now coming into focus.'' Bailyn writes that Atlantic history is ''peculiarly relevant for understanding the present.'' It may be that for general readers trying to understand the present (as opposed to scholars), Atlantic history goes too far in dissolving the United States into a blurry, ill-defined transoceanic entity -- the might and power of the nation are not about to disappear, nor is the threat posed by its enemies. But because the post-9/11 globalization of American history is really just now taking shape, there is sufficient flexibility at the moment to accommodate a wide range of approaches. Three recent books, for example, offer starkly contrasting visions of America's past and, correspondingly, of its present world role. They are of varying quality but in their different approaches, they point to the kind of intellectual debates we can expect in the future from historians who speak to our current condition. In ''A Patriot's History of the United States,'' Larry Schweikart, a professor of history at the University of Dayton, and Michael Allen, a professor of history at the University of Washington, Tacoma, self-consciously return to 50's triumphalism, though with a very different purpose from that of the consensus historians. Not interested in irony or in standing outside of history, they are full-blooded participants, self-assured and robust moralists, who argue that the United States is a uniquely virtuous country, with a global mission to spread American values around the world. ''An honest evaluation of the history of the United States,'' they declare, ''must begin and end with the recognition that, compared to any other nation, America's past is a bright and shining light. 
America was, and is, the city on the hill, the fountain of hope, the beacon of liberty.'' Theirs is a frankly nationalistic -- often blatantly partisan -- text in which the United States is presented as having a duty to lead while other countries, apparently, have an obligation to follow. ''In the end,'' they write, ''the rest of the world will probably both grimly acknowledge and grudgingly admit that, to paraphrase the song, God has 'shed His grace on thee.' '' This is a point of view with few adherents in the academy these days (let alone in other nations), but it's surely one that enjoys warm support among many red-state conservatives, and in the halls of the White House. Critics of the Bush administration will find more to agree with in the perspective of '' 'A Problem from Hell': America and the Age of Genocide,'' Samantha Power's Pulitzer Prize-winning history of 20th-century mass murder. Unlike Schweikart and Allen, she does not see virtue inhering, almost divinely, in American history. Instead, she judges that history against a larger moral backdrop, asking how the country has responded to the most dire of international crimes, genocide. The record is hardly inspiring. Power reveals that throughout the 20th century, whenever genocide occurred, whether the victims were Armenians, Jews, Cambodians, Kurds or Tutsis, the American government stood by and did nothing. Worse, in some instances, it sided with the murderers. '' 'A Problem from Hell' '' exhorts Americans to learn from their history of failure and dereliction, and to live up to their professed values; we have ''a duty to act.'' Whereas Schweikart and Allen believe American history shows that the United States is already an idealistic agent in world affairs, Power contends that our history shows it is not -- but that it should become one. 
A third book, Margaret MacMillan's ''Paris 1919: Six Months That Changed the World,'' is in effect an answer to Schweikart, Allen and Power -- an object lesson in the ways American idealism can go wrong. MacMillan's focus is on Woodrow Wilson at the end of World War I. A visionary, an evangelist, an inspiration, an earth-shaker, a holy fool, Wilson went to Paris in 1919 with grand ambitions: to hammer out a peace settlement and confront a wretched world with virtue, to reconfigure international relations and reform mankind itself. Freedom and democracy were ''American principles,'' he proclaimed. ''And they are also the principles and policies of forward-looking men and women everywhere, of every modern nation, of every enlightened community. They are the principles of mankind and they must prevail.'' Other leaders were less sure. David Lloyd George, the British prime minister, liked Wilson's sincerity and straightforwardness, but also found him obstinate and vain. France's prime minister, the acerbic and unsentimental Georges Clemenceau, said that talking to him was ''something like talking to Jesus Christ.'' (He didn't mean that as a compliment.) As a committed American democrat, Wilson affirmed his belief in the principle of self-determination for all peoples, but in Paris his convictions collided with reality. Eastern Europe was ''an ethnic jumble,'' the Middle East a ''myriad of tribes,'' with peoples and animosities so intermingled they could never be untangled into coherent polities. In the Balkans, leaders were all for self-determination, except when it applied to others. The conflicting parties couldn't even agree on basic facts, making neutral mediation impossible. Ultimately, the unbending Wilson compromised -- on Germany, China, Africa and the South Pacific. He yielded to the force majeure of Turks and Italians. In the end, he left behind him a volcano of dashed expectations and festering resentments. 
MacMillan's book is a detailed and painful record of his failure, and of how we continue to live with his troublesome legacy in the Balkans, the Middle East and elsewhere. Yet the idealists -- nationalists and internationalists alike -- do not lack for responses. Wilsonianism, they might point out, has not been discredited. It always arises from its own ashes; it has even become the guiding philosophy of the present administration. Give George W. Bush key passages from Wilson's speeches to read, and few would recognize that almost a century had passed. Nor should this surprise us. For while the skeptics can provide realism, they can't provide hope. As MacMillan says, the Treaty of Versailles, particularly the League of Nations, was ''a bet placed on the future.'' Who, looking back over the rubble, would have wanted to bet on the past? Little has changed in our new century. Without the dreams of the idealists, all that is on offer is more of the same -- more hatred, more bloodshed, more war, and eventually, now, nuclear war. Anti-Wilsonian skeptics tend to be pessimistic about the wisdom of embarking on moral crusades but, paradoxically, it is the idealists, the hopeful ones, who, in fact, should be painting in Stygian black. They are the ones who should be reminding us that for most of the world, history is not the benign story of inexorable progress Americans like to believe in. Rather, it's a record of unjustified suffering, irreparable loss, tragedy without catharsis. It's a gorgon: stare at it too long and it turns you to stone. Fifty years ago, Louis Hartz expressed the hope that the cold war would bring an end to American provincialism, that international responsibility would lead to ''a new level of consciousness.'' It hasn't happened. 
In the 1950's, two wide oceans and a nuclear stockpile allowed Americans to continue living blithely in their imagined city on a hill, and the student revolts of the 60's and 70's, if anything, fed the notion that the rest of the world was ''out there.'' ''Bring the troops home'' was the protesters' idea of a foreign policy. But the disaster of 9/11 proved that the oceans do not protect us and that our nuclear arsenal, no matter how imposing, will not save our cities from terrorists armed with weapons of mass destruction. Today, there is no retreating into the provincialism and innocence of the past. And because withdrawal is not an option, the work of the globalizing American historians possesses an urgency unknown to scholars of previous generations. The major lesson the new historians must teach is that there is no longer any safe haven from history's horror story. Looking forward is unnerving, but looking backward is worse. The United States has no choice. Like it or not, it is obliged to take a leading role in an international arena that is unpredictable and dangerous, hopeful perhaps, but also potentially catastrophic. Barry Gewen is an editor at the Book Review.

Benjamin Franklin
c/o Time Warp Mail Service

Bob's Blog - WhitakerOnline.org - 7/30/05 Insider Letter
http://whyjohnny.com/blog/?p=639

Dear Mr. Franklin,

You are facing extremely serious legal problems.

1) Your invention of the bifocal lens. You have no qualifications whatsoever in the fields of optometry or ophthalmology. You are ordered to cease and desist from the use or discussion of this product. Lawsuits have been lodged against you by people whose bifocals have broken and gashed their skin. Others say that they confuse the eyes and cause double vision.

2) Your invention of the stove. Your Franklin Stove has caused serious injury to a very large number of people. Children playing have bumped into it and been burned by it. 
You have no Federally-approved set of directions for its use, so you are personally responsible for every accident that occurs in using your product.

3) Your discovery of the Gulf Stream. As with optometry in the case of your invention of bifocals, you are practicing meteorology with no degree or other qualifications in the subject. While no one has yet been able to formulate an actual lawsuit against you on this subject, you have made a laughing-stock of yourself by going outside the field of printing, where you do have some actual credentials.

You are in deep trouble in other areas. Your comments about Quakers, Indians and other minority groups were definitely Hate Speech. You are charged with manslaughter and armed robbery in aiding and abetting in the robbery of America from the Native Americans. Other charges are pending.

Yours Indignantly,
The Association of Experts, Lawyers, Professors and Other Authorities in the Year 2005

Knowing a Man (Ben Franklin), but Not Melons
http://www.nytimes.com/2005/12/19/arts/design/19fran.html

Exhibitions Review
By EDWARD ROTHSTEIN

PHILADELPHIA, Dec. 14 - There was something insufferable about Benjamin Franklin, and many of his contemporaries knew it. John Adams wrote, "Had he been an ordinary man, I should never have taken the trouble to expose the turpitude of his intrigues, or to vindicate my reputation against his vilifications and calumnies." "Dr. Franklin's Profile," by Red Grooms, is on view in Philadelphia. "Benjamin Franklin: In Search of a Better World" is at the National Constitution Center in Philadelphia through April 30 and then travels to St. Louis, Houston, Denver, Atlanta and finally, in December 2007, to Paris. "Benjamin Franklin: In His Own Words" is at the Southwest Gallery of the Thomas Jefferson Building of the Library of Congress in Washington through June 17. 
Franklin could change positions when they seemed unpopular, compromise on principles and turn statecraft into a matter of personality. When he achieved any post of power, he stuffed relatives into remunerative positions while proclaiming that public servants should expect no payment at all. In other contexts, Franklin's treatment of family could have made Poor Richard blush through his almanack: He began a three-generation tradition of siring illegitimate children; he made sure to spend 15 of the last 17 years of his marriage away from his wife in foreign lands, making no effort to see her in her final years; to his children and heirs he was capable of stunning callousness mixed with bouts of devotion. Nor was his later reputation sterling among literary figures. Melville referred to Franklin's "primeval orientalness." Mark Twain, only partly in jest, accused him of "animosity toward boys" with his pert maxims about propriety. D. H. Lawrence, who could have been Franklin in a fun-house mirror, called him a "dry, moral, utilitarian little democrat." No, Franklin, the middle-class materialist and moralist, has not had an easy time of it, particularly during much of the 20th century when he was often considered annoyingly bourgeois. It is even difficult to clearly define his contribution to the founding of the United States. Unlike Jefferson, he was not a devotee of high principle and a practitioner of high prose. Unlike Washington, he could not have led an army through adversity or channeled a fledgling country through birth pangs. Unlike Madison or even Hamilton, he was no theoretician. But none of this really matters compared with what Franklin did achieve. Nor can it dampen the celebratory impact of an exciting new exhibition about Franklin's life and achievements at the National Constitution Center in Philadelphia, or slight the imposing sobriety of the 10 display cases stocked with Franklin documents at the Library of Congress in Washington. On Jan. 
17, the 300th anniversary of Franklin's birth will be celebrated, notably in Philadelphia, which he was instrumental in establishing as a modern city by helping to found its major institutions: America's first nonsectarian college (ancestor to the University of Pennsylvania), its first public hospital, its first subscription library and its first property-insurance company. He will also be celebrated for his exploration of electricity (saving cities with his invention of the lightning rod); for slyly courting the French during the Revolutionary War (yielding a treaty that helped turn the tide against the British); and for spurring a stalled Constitutional Convention toward compromise and a bicameral legislature. In fact, the difficulty we find in placing Franklin or in defining him is inseparable from the complexity of his achievements. In the last five years he has been the subject of at least four biographies - a gracefully intelligent survey by Walter Isaacson, a forceful and meticulous re-creation of his French years by Stacy Schiff, and major scholarly books by H. W. Brands and Edmund S. Morgan. Franklin emerges in these reconstructions as a founder not only of American institutions but of an idea of America itself. Like Whitman, he contained multitudes. And in his refusal to devote himself to a single dominating theory, in his skepticism about sweeping universals, in his devotion to compromise, in his forthright embrace of material prosperity, in his belief in community organization, and in his distinctive mixture of cynicism and idealism about humanity, he shaped a pragmatic temperament that can still be associated with the country he helped create. If Franklin were to mount a museum exhibition about himself, it might very well resemble - in its variety, intelligence and pleasures - "Benjamin Franklin: In Search of a Better World"; the curator of this Philadelphia show is Page Talbott, a specialist in decorative and fine arts. 
It contains more than 250 artifacts, ranging from one of Jefferson's drafts of the Declaration of Independence (his originally held truths to be "sacred and undeniable"; Franklin transformed them into truths held to be "self-evident") to Franklin's fossilized mastodon tooth (found near the ruins of his Market Street home) to important paintings portraying the senior statesman. Almost no aspect of Franklin's enterprise is left untouched: printer (the only surviving copy of Franklin's first Poor Richard's Almanack from 1733), civic leader (a suggestion box from the subscription library), scientist (a modern version of his electrical apparatus, producing sparks by turning a handle), diplomat (including a life-size diorama of Franklin facing British parliamentary accusations of fomenting American rebellion), inventor (of bifocals, for instance, or the glass armonica, in which spinning bowls dampened by water created ethereal sounds that inspired compositions by Mozart and Beethoven). And scattered throughout the exhibition are usable replicas of an armchair Franklin invented; slight pressure on a foot pedal waves a fan above one's head. In creating such a show, Franklin might also have done as this one does, and elided the shadows of his life and temperament, leaving behind only slight hints. "Did Franklin himself listen to Poor Richard's advice?" the exhibition asks about Franklin's proverbs. "Sometimes. Sometimes not." There is also a tendency here at times to seek a kind of sensation that Franklin would have looked askance at, simply because it adds so little to understanding. One example is a giant model of a tree veined with colored fibers meant to symbolize Franklin's Junto Society, made up of 12 citizens who met weekly for debate and civic planning. Press a button next to each member's name and the tree's fibers light up, leading to hanging signs displaying the Philadelphia institutions that that member helped establish. 
But one walks through the 8,000 square feet of this exhibition astonished at the fecundity of Franklin's imagination and the range of his inventions. Something is reproduced, too, of his practical and playful spirit. Visitors will be challenged to flip numbers on a giant magic square to make all rows add to 15 - the kind of puzzle that Franklin turned into a specialty. And just as Franklin was remarkably attentive to younger children (though he tended to become almost cavalier about their needs as they aged), the displays aimed at the younger set are whimsical without a hint of condescension. ("Men and melons" is the first half of a maxim displayed in a participatory exhibit about Poor Richard; visitors try to match the second half: "stink in three days," "are hard to know" or "should not go barefoot.") Missing in this chronological survey, though, is the kind of complication that accompanies darker shadows. We learn, for example, that Franklin's initial support of the British Stamp Act in 1765 led to serious problems with his American reputation, and we learn too that he hoped for some sort of reconciliation with the British Empire when many of his compatriots had already committed to independence. But it is at the Library of Congress that visitors can see one of the bald propaganda exercises Franklin used to rescue his reputation when he returned to Philadelphia from England in 1775: a furious letter to a British friend ("You are now my enemy") that he never sent, but just showed around town. (The Washington exhibition also shows the handwritten "personal liturgy" that Franklin wrote for himself at the age of 22 as a substitute for attending church.) 
There could be more explanation in the Philadelphia show, too, of what was at stake when Franklin spent eight years in France as a representative of the Continental Congress, and of how difficult his task was - persuading the Continent's most tradition-bound court that this creditless assemblage of colonies should be taken seriously, while also persuading representatives of the colonies that the court in Paris could not simply be arm-wrestled into a treaty. Franklin used a chess metaphor, echoed in the Philadelphia exhibition, to explain his actions, but we don't really learn enough of the moves to understand. Toward the end, this extraordinary exhibition almost peters out into generalities and gimmicky display. A real exploration of Franklin's impact would have meant showing just how controversial a character he had become, partly because of his long tenure in France; even the newly formed United States Senate (as one of its number reported) refused to wear "crape on their arms for a Month" as the House did after Franklin's death in 1790. "Upon the whole," Franklin wrote in 1771, "I am much disposed to like the World as I find it, and to doubt my own Judgment as to what would mend it." That made him a pragmatist and a compromiser, a nonutopian, a man with bifocals. But what could he hope for? "The greatest Political Structure Human Wisdom ever yet erected." That made him a visionary. He is celebrated for being both.

From checker at panix.com Tue Jan 17 18:48:29 2006
From: checker at panix.com (Premise Checker)
Date: Tue, 17 Jan 2006 13:48:29 -0500 (EST)
Subject: [Paleopsych] Edge Annual Question 2002: What Is Your Question? ... Why?
Message-ID:

Edge Annual Question 2002: What Is Your Question? ... Why?
http://www.edge.org/q2002/question.02_print.html
[links omitted]

"I can repeat the question, but am I bright enough to ask it?" 
________________________________________________________________ The 5th Annual Edge Question reflects the spirit of the Edge motto: "To arrive at the edge of the world's knowledge, seek out the most complex and sophisticated minds, put them in a room together, and have them ask each other the questions they are asking themselves." The 2002 Edge Question is: "WHAT IS YOUR QUESTION? ... WHY?" I have asked Edge contributors for "hard-edge" questions, derived from empirical results or experience specific to their expertise, that render visible the deeper meanings of our lives, redefine who and what we are. The goal is a series of interrogatives in which "thinking smart prevails over the anaesthesiology of wisdom." Happy New Year! John Brockman Publisher & Editor [1.14.02] ________________________________________________________________ Responses (in order received): Kevin Kelly o Paul Davies o Stuart A. Kauffman o Alison Gopnik o John Horgan o Daniel C. Dennett o Derrick De Kerkhove o Clifford A. Pickover o John McCarthy o Douglas Rushkoff o William Calvin o Timothy Taylor o Marc D. Hauser o Roger Schank o James J. O'Donnell o Robert Aunger o Lawrence Krauss o Jaron Lanier o Freeman Dyson o Lance Knobel o Robert Sapolsky o Mark Stahlman o Andy Clark o Sylvia Paull o Todd Feinberg, MD o Nicholas Humphrey o Terrence Sejnowski o Howard Lee Morgan o Judith Rich Harris o Martin Rees o Paul Bloom o Margaret Wertheim o George Dyson o Todd Siler o Chris Anderson o Gerd Stern o Alan Alda o Henry Warwick o Delta Willis o John Skoyles o Paul Davies o Piet Hut o Julian Barbour o Antony Valentini o Stephen Grossberg o Rodney Brooks o Karl Sabbagh o David G. Myers o John D. Barrow o Milford H. Wolpoff o Richard Dawkins o David Deutsch o Joel Garreau o Gregory Benford o Eduardo Punset o Gary F. Marcus o Steve Grand o Seth Lloyd o John Markoff o Michael Shermer o Jordan B. Pollack o Steven R. 
Quartz o David Gelernter o Samuel Barondes o Steven Pinker o Frank Schirrmacher o Leon Lederman o Howard Gardner o Esther Dyson o Keith Devlin o Richard Nisbett o Stephen Schneider o Robert Provine o Sir John Maddox o Carlo Rovelli o Tor Nørretranders o David Buss o John Allen Paulos o Dan Sperber o W. Daniel Hillis o Brian Eno o Anton Zeilinger o Eberhard Zangger o Mark Hurst o Stuart Pimm o James Gilligan o Brian Greene o Rafael Núñez o J. Doyne Farmer o Ray Kurzweil o Randolph Nesse o Adrian Scott o Tracy Quan o Xeni Jardin o Stanislas Dehaene o Paul Ewald o George Lakoff o David Berreby o Jared Diamond ________________________________________________________________ New Vordenker der "Dritten Kultur": Fragen für das Jahr 2002: "Wer Nicht Fragt, Bleibt Dumm" THOSE WHO DON'T ASK REMAIN DUMB The haze of ignorance still has not disappeared: Whoever wants real answers has to know what he's looking for -- A poll of scientists and artists for the year 2002. In a time when culture was still not numbered, the Count of Thüringen invited his nobles to the "Singers' War at the Wartburg," where he asked questions (if we are to believe Richard Wagner) that would bring glory, the most famous of which queried, "Could you explain to me the nature of love?" The publisher and literary agent, John Brockman, who now organizes singers' wars on the Internet, enjoys latching on to this tradition at the beginning of every year. (FAZ, January 9, 2001). His Tannhäuser may be named Steven Pinker, and his Wolfram von Eschenbach may go by Richard Dawkins, but it would do us well to trust that they and their compatriots could also turn out speculation on the count's favorite theme. Brockman's thinkers of the "Third Culture," whether they, like Dawkins, study evolutionary biology at Oxford or, like Alan Alda, portray scientists on Broadway, know no taboos. Everything is permitted, and nothing is excluded from this intellectual game.
But in the end, as it takes place in its own Wartburg, reached electronically at www.edge.org, it concerns us and our unexplained and evidently inexplicable fate. In this new year Brockman himself doesn't ask, but rather once again facilitates the asking of questions. The contributions can be found from today onwards on the Internet. In conjunction with the start of the forum we are printing a selection of questions and commentary, at times in somewhat abridged form, in German translation. .... F.A.Z. --Frankfurter Allgemeine Zeitung, 14.01.2002, Nr. 11 / Seite 38 ________________________________________________________________ 99 contributors 59,000 words In order received ________________________________________________________________ "What is your heresy?" I've noticed that the more scientifically educated a person is, the more likely they are to harbor a quiet heresy. This is a strongly held belief that goes against the grain of their peers, something not in the accepted canon of their friends and colleagues. Often the person finds it difficult to fully justify their own belief. It may or may not be believed by others outside their circle; that doesn't matter. What is important is that this view is not held by people they respect and admire. It's become almost a game for me to uncover a person's heresy because I've found that this unconventional view -- held with much effort against the tide of their peers' views -- tells me more about them than does the bulk of their well-thought-out, well-reasoned, and well-argued conventional views. The more unexpected the belief is, the more I like them. Kevin Kelly is Editor-At-Large for Wired Magazine and author of New Rules for the New Economy. ________________________________________________________________ "Universe or multiverse, that is the question?"
Of late, it is fashionable among leading physicists and cosmologists to suppose that alongside the physical world we see lies a stupendous array of alternative realities, some resembling our universe, others very different. The multiverse theory comes in several varieties, but in the most ambitious the "other universes" have different physical laws. Only in a tiny fraction of universes will the laws come out just right, by pure accident, for conscious beings such as ourselves to emerge and marvel at how bio-friendly their world appears. The multiverse has replaced God as an explanation for the appearance of design in the structure of the physical world. Like God, the agency concerned lies beyond direct observation, inferred by inductive reasoning from the properties of the one universe we do see. The meta-question is, does the existence of these other universes amount to more than an intellectual exercise? Can we ever discover that the hypothesized alternative universes are really there? If not, is the multiverse not simply theology dressed up in techno jargon? And finally, could there be a Third Way, in which the ingenious features of the universe are explained neither by an Infinite Designer Mind, nor by an Infinite Invisible Multiverse, but by an entirely new principle of explanation? Paul Davies, a physicist, writer and broadcaster, now based in South Australia, is author of How to Build a Time Machine. ________________________________________________________________ "What must a physical system be to be able to act on its own behalf?" In our ordinary life, we ascribe action and doing to other humans, and to lower organisms, even bacteria swimming up a glucose gradient to get food. Yet physics has no "doings," only happenings, and the bacterium is just a physical system. I have struggled with the question "What must a physical system be to be able to act on its own behalf?" Call such a system an autonomous agent.
I may have found an answer: such systems must be able to replicate and do a thermodynamic work cycle. But of course I'm not sure of my answer. I am sure the question is of fundamental importance, for all free-living organisms are autonomous agents, and with them, doing, not just happenings, enters the universe. We do manipulate the universe on our own behalf. Is there a better definition of autonomous agents? And what does their existence mean for science, particularly physics? Stuart A. Kauffman, an emeritus professor of biochemistry at UPenn, is a theoretical biologist and author of Investigations. ________________________________________________________________ "Why do we ask questions?" We all take for granted the fact that human beings ask questions and seek explanations, and that the questions they ask go far beyond their immediate practical concerns. But this insatiable human curiosity is actually quite puzzling. No other animal devotes as much time, energy and brain area to the pursuit of knowledge for its own sake. Why? Is this drive for explanation restricted to the sophisticated professional questioners on this site? Or is it a deeper part of human nature? Developmental research suggests that this drive for explanation is, in fact, in place very early in human life. We've all experienced the endless "whys?" of three-year-olds and the downright dangerous two-year-old determination to seek out strange new worlds and boldly go where no toddler has gone before. More careful analyses and experiments show that children's questions and explorations are strategically designed, in quite clever ways, to get the right kind of answers. In the case of human beings, evolution seems to have discovered that it's cost-effective to support basic research, instead of just funding directed applications. Human children are equipped with extremely powerful learning mechanisms, and a strong intrinsic drive to seek explanations.
Moreover, they come with a support staff -- parents and other caregivers -- who provide both lunch and references to the results of previous generations of human researchers. But this preliminary answer prompts yet more questions. Why is it that in adult life, the same quest for explanatory truth so often seems to be satisfied by the falsehoods of superstition and religion? (Maybe we should think of these institutions as the cognitive equivalent of fast food. Fast food gives us the satisfying tastes of fat and sugar that were once evolutionary markers of good food sources, without the nourishment. Religion gives us the illusion of regularity and order, evolutionary markers of truth, without the substance.) Why does this intrinsic truth-seeking drive seem to vanish so dramatically when children get to school? And, most important, how is it possible for children to get the right answers to so many questions so quickly? What are the mechanisms that allow human children to be the best learners in the known universe? Answering this question would not only tell us something crucial about human nature, it might give us new technologies that would allow even dumb adults to get better answers to our own questions. Alison Gopnik is a professor of psychology at the University of California at Berkeley and coauthor of The Scientist In The Crib. ________________________________________________________________ "Do we want the God machine?" The God machine is the name that journalists have given to a device invented by the Canadian psychologist Michael Persinger. It consists of a bunch of solenoids that, when strapped around the head, deliver pulses of electromagnetic radiation to specific regions of the brain. Persinger claims he can induce mystical visions by stimulating the temporal lobes, which have also been linked to religious experiences by other scientists, notably V.S. Ramachandran of the University of California at San Diego.
Persinger's machine is actually quite crude. It induces peculiar perceptual distortions but no classic mystical experiences. But what if, through further advances in neuroscience and other fields, scientists invent a God machine that actually works, that delivers satori, nirvana, to anyone on command, without any negative side effects? It doesn't have to be an electromagnetic brain-stimulating device. It could be a drug, a type of brain surgery, a genetic modification, or some combination thereof. One psychedelic researcher recently suggested to me that enlightenment could be spread around the world by an infectious virus that boosts the brain's production of dimethyltryptamine, an endogenous psychedelic that the Nobel laureate Julius Axelrod of the National Institutes of Health detected in trace amounts in human brain tissue in 1972. But whatever form the God machine takes, it would be powerful enough to transform the world into what Robert Thurman, an authority on Tibetan Buddhism (and father of Uma), calls the "Buddhaverse," a mystical utopia in which everyone is enlightened. The obvious follow-up question: Would the invention of a genuine God machine spell our salvation or doom? John Horgan is a freelance writer and author of The Undiscovered Mind. ________________________________________________________________ "What kind of system of 'coding' of semantic information does the brain use?" My question now is actually a version of the question I was asking myself in the first year, and I must confess that I've had very little time to address it properly in the intervening years, since I've been preoccupied with other, more tractable issues. I've been mulling it over in the back of my mind, though, and I do hope to return to it in earnest in 2002. What kind of system of "coding" of semantic information does the brain use?
We have many tantalizing clues but no established model that comes close to exhibiting the molar behavior that is apparently being seen in the brain. In particular, we see plenty of evidence of a degree of semantic localization -- neural assemblies over here are involved in cognition about faces and neural assemblies over there are involved in cognition about tools or artifacts, etc. -- and yet we also have evidence (unless we are misinterpreting it) that shows the importance of "spreading activation," in which neighboring regions are somehow enlisted to assist with currently active cognitive projects. But how could a region that specializes in, say, faces contribute at all to a task involving, say, food, or transportation or . . . . ? Do neurons have two (or more) modes of operation -- specialized, "home territory" mode, in which their topic plays a key role, and generalized, "helping hand" mode, in which they work on other regions' topics? Alternatively, is the semantic specialization we have observed an illusion -- are these regions only circumstantially implicated in these characteristic topics because of some as-yet-unanalyzed generalized but idiosyncratic competence that happens to be invoked usually when those topics are at issue? (The mathematician's phone rings whenever the topic is budgets, but he knows nothing about money; he's just good at arithmetic.) Or, to consider another alternative, is "spreading activation" mainly just noisy leakage, playing no contributing role in the transformation of content? Or is it just "political" support, contributing no content but helping to keep competing projects suppressed for a while? And finally, the properly philosophical question: what's wrong with these questions and what would better questions be? Daniel C. Dennett is Distinguished Arts and Sciences Professor at Tufts University and author of Darwin's Dangerous Idea.
________________________________________________________________ "'To be or not to be' remains the question" The fact is that "To be or not to be" is both a simple, perhaps the simplest, and a complex question, the hardest to sustain, let alone to ask. I ask it myself often -- maybe as many times as five or six a week -- and it is the asking, not any hope for an answer, that yields the most searing and immediate insight. I don't get it right every time, but when I do, I am thrown for a split second at the other side of being, the place where it begins. But I can never retain that amazing feeling for long. What is required is a kind of radical pull-back of oneself from the most banal evidence of life and reality. Jean-Paul Sartre, after Shakespeare, was probably the thinker who framed the question best in his novels and philosophical treatises. The issue, however, is that this question is profoundly existential, not merely philosophical. It can be asked, and should be, by any living, thinking, sentient being, but cannot be answered. There is huge energy and cognitive release to expect from it when it is properly framed. You have to somehow imagine that everything, absolutely everything has disappeared, or never was, that you have just happened upon your own circumstances by accident, the first accident of being. Another approach is to imagine sharply that anything that is, is a result of a warp, a blip in nothingness. It is not even a matter of finding out why or how, those demands are already far too elaborate. It is a crude, raw, brutal question followed by absolute, lightning-speed amazement. And then the ordinary familiarity of all things known and named takes over, slipping your whole being into the stream of life, of being, with its attending problems and felicities. I feel strongly that there is a fundamental need for Shakespeare's question in everyday life, but that is not what you and I were taught in school.
Derrick de Kerckhove is Director of the McLuhan Program at the University of Toronto and author of Connected Intelligence. ________________________________________________________________ "Would you choose universe Omega or Upsilon?" Consider two universes. Universe Omega is a universe in which God does not exist, but the inhabitants of the universe believe God exists. Universe Upsilon is a universe in which God does exist, but no inhabitant believes God exists. In which universe would you prefer to live? In which universe do you think most people would prefer to live? I recently posed this question to scientists, philosophers, and lay people. Some respondents suggested that if people think God exists, then God is sufficiently "real." A few individuals suggested that people would behave more humanely in a Universe where people believed in God. Yet others countered that an ethical system dependent on faith in a watchful, omniscient, or vengeful God is fragile and prone to collapse when doubt begins to undermine faith. A fuller listing of responses is in the book. To me, the biggest challenge to answering this question is understanding what is meant by "God." Scientists sometimes think of God as the God of mathematical and physical laws and the underpinnings of the universe. Other people believe in a God who intervenes in our affairs, turns water into wine, answers prayers, and smites the wicked. The Koran implies that God lives outside of time, and, thus, our brains are not up to the task of understanding Him. Some theologians have suggested that only especially sensitive individuals can glimpse God, but that we ordinary folk shouldn't deny His existence in the same way that a blind man shouldn't deny the existence of a rainbow. In modern times, many scientists ponder the amazing panoply of chemical and physical constants that control the expansion of the universe and seem tuned to permit the formation of stars and the synthesis of carbon-based life.
Questions about God's omniscience are particularly mind-numbing, yet we can still ask if it is rational to believe in an omniscient God. As Steven J. Brams points out in his book Superior Beings, "The rationality of theistic belief is separate from its truth -- a belief need not be true or even verifiable to be rational." However, if we posit the existence of an omniscient God, His omniscience may require him to know the history of all quarks in the universe, the states of all electrons, the vibrations of every string, and the ripples of the quantum foam. Is this the same God, who in Exodus 21 gave Moses laws describing when one should stone an ox to death? Is the God of Gluons and Galaxies the same God concerned with Israeli oxen dung? But what about the Bible itself? Today, the Bible -- especially the Old Testament -- may serve as an alternate reality device. It gives its readers a glimpse of other ways of thinking and of other worlds. It is also the most mysterious book ever written. We don't know the ratio of myth to history. We don't know all the authors. We are not always sure of the intended message. We don't fully understand the Old Testament's Nephilim or its Bridegroom of Blood. We only know that the Bible reflects some of humankind's most ancient and deep feelings. For some unknown reason, it is a bell that has resonated through the centuries. It lets us reach across cultures, see visions, and better understand what we have held sacred. Because the Bible is a hammer that shatters the ice of our unconscious, it thus provides one of many mechanisms in our quest for transcendence. Clifford A. Pickover is a researcher at IBM's T. J. Watson Research Center and author of The Paradox of God and the Science of Omniscience. ________________________________________________________________ "How are behaviors encoded in DNA?" Many animals have quite substantial hereditary behavior.
Moreover, these behaviors are subject to evolution on fairly short time scales, so they probably have straightforward DNA encodings on which mutations can act. Mostly the behaviors seem to be sequences of actions, but perhaps there are some of the form "do X until Y is true". John McCarthy is Professor of Computer Science at Stanford University. ________________________________________________________________ "Why do we tell stories?" Why a story? Human beings can't help but understand their world in terms of narratives. Although the theory of evolution effectively dismantled our creationist myths over a century ago, most thinking humans still harbor an attachment to the notion that we were put here, with purpose, by something. New understandings of emergence, as well as new tools for perceiving the order underlying chaos, seem to hold the promise of a wide-scale liberation from the constructed myths we use to organize our experience, as well as from the dangers that over-dependence on such narratives brings forth. At least I hope so. At the very least, narratives are less dangerous when we are free to participate in their writing. I'll venture that it is qualitatively better for human beings to take an active role in the unfolding of our collective story than it is to adhere blindly to the testament of our ancestors or authorities. But what of moving out of the narrative altogether? Is it even possible? Is our predisposition for narrative physiological, psychological, or cultural? Is it an outmoded form of cognition that yields only bloody clashes when competing myths are eventually mistaken for irreconcilable realities? Or are stories the only way we have of interpreting our world -- meaning that the forging of a collective set of mutually tolerant narratives is the only route to a global civilization?
Douglas Rushkoff is a Professor of Media Culture at New York University's Interactive Telecommunications Program and author of Coercion: Why We Listen to What "They" Say. ________________________________________________________________ "Eureka: What makes coherence so important to us?" When something is missing, it bothers us that things don't hang together. Consider: "Give him." In any language, that is a bothersome sentence. Something essential is missing, and it rings an alarm bell in our brains. We go in search of an implied "what" and try to guess what will make the words all hang together into a complete thought. We ask questions in search of satisfying incompletes, again hoping to create some coherence. No other animal does such things. It even forms the basis of many of our recreations such as jigsaw and crossword puzzles, all those little eurekas along the way. Guessing a hidden pattern fascinates us. It's part of our pleasure in complex ritual or listening to Bach, to be able to guess what comes next some of the time. It's boring when it is completely predictable, however; it's the search for how things all hang together that is so much fun. Of course, we make a lot of mistakes. Every other winter, I get fooled into thinking that a radio has been left on, somewhere in the house, and I go in search of it -- only to realize that it was just the wind whistling around the house. My brain tried to make coherence out of chaos by trying out familiar word patterns on it. Astrology, too, seems to make lots of things "all hang together." Often in science, we commit such initial errors but we are now fairly systematic about discovering and discarding them. We go on to find much better explanations for how things hang together. Finding coherence is one of our great pleasures. It would be nice to know what predisposes our brain to seek out hidden coherence. For one thing, it might help illuminate the power of an idea -- and with it, how fanaticism works.
Fundamentalist schemes that seem to make everything hang together can easily override civilization's prohibitions against murder. Inferring an enveloping coherence can create an "other" who is outside the bounds of "us." Because it seems so whole, so right, it may become okay to beat up on unbelievers -- say, fans of an opposing football team, or of another religion. For scientists and crossword fans, it's finding the coherence that is important. Then we move on. But many people, especially in the generation which follows its inventors, get trapped by a seemingly coherent worldview. Things get set in concrete; the coherent framework provides comfort, but it also creates dangerous us-and-them boundaries. William Calvin is a theoretical neurobiologist at the University of Washington and author of How Brains Think. ________________________________________________________________ "Is morality relative or absolute?" Humans spread out from a common origin into many different global environments. It was a triumph of our unique adaptability, for we display the broadest range of behaviours -- nutritional, social, sexual and reproductive -- of any animal. We also have classes of behaviour -- religious, scientific, artistic, gendered, and philosophical, each underpinned by special languages -- that animals lack. Paradoxically, success also came through conformity. Prehistorians track archaeological cultures by recognizing the physical symbolic codes (art styles, burial rites, settlement layouts) that channelled local routines. Each culture constrained diversity and could punish it with ostracism and death. Isolation bred idiosyncrasy, and there was a shock when we began regional reintegration. Early empires created state religions which, although sometimes refracting species-wide instincts for a common good, tended to elevate chosen peoples and their traditional ways. Now that we can monitor all of our cultures, there is a need to adjudicate on conduct at a global level.
But my question is not understood in the same way by everyone. To fundamentalists, it is heretical, because morality is God-given. Social theorists, on the other hand, often interpret absolute morality as imperialist -- no more than local ethics metastasized by (for example) the United Nations. But appeals to protect cultural diversity are typically advanced without regard to the reality of individual suffering in particular communities. A third position, shared by many atheistic scientists and traditional Marxists, is based on ideas of utility, happiness and material truth: what is right is understood as being what is good for the species. But no one agrees on what this is, or how competing claims for access to it should be settled. The 'ethics of care', first developed within feminist philosophy, moves beyond these positions. Instead of connecting morals either to religious rules and principles or reductive natural laws, it values shared human capacities, such as intimacy, sympathy, trust, fidelity, and compassion. Such an ethics might elide the distinction between relative and absolute by promoting species-wide common sense. Before we judge the prospect of my question vanishing as either optimistic or naïve, we must scrutinize the alternatives carefully. Timothy Taylor is an archaeologist at University of Bradford, UK, and author of The Prehistory of Sex: Four Million Years of Human Sexual Culture. ________________________________________________________________ "How will the sciences of the mind constrain our theories and policies of education?" In several recent meetings that I have attended, I have been overwhelmed by the rift between what the sciences of mind, brain and behavior have uncovered over the past decade, and both how and what science educators teach. In many arenas, educators hold on to a now dated view of the child's cognitive development, failing to appreciate the innate biases that our species has been equipped with.
These biases constrain not only what the child can learn, but when it might most profitably learn such things. Take, for instance, the acquisition of mathematical knowledge. Educators aim for the acquisition of precise computations. There is now, however, evidence for an innately available approximate number system, one that operates spontaneously without education. One might imagine that if educators attempted to push this system first -- teaching children that 40 is a better answer to 25 + 12 than is 60 -- that it might well facilitate the acquisition of the more precise system later in development. Similar issues arise in attempting to teach children about physics and biology. At some level, then, there must be a way for those in the trenches to work together with those in the ivory tower to advance the process of learning, building on what we have discovered from the sciences of the mind. Marc D. Hauser is an evolutionary psychologist, a professor at Harvard University and author of Wild Minds: What Animals Think. ________________________________________________________________ "What does it mean to have an educated mind in the 21st century?" While education is on every politician's agenda as an item of serious importance, it is astonishing that the notion of what it means to be educated never seems to come up. Our society, which is undergoing massive transformations almost on a daily basis, never seems to transform its notion of what it means to be educated. We all seem to agree that an educated mind certainly entails knowing literature and poetry, appreciating history and social issues, being able to deal with matters of economics, being versatile in more than one language, understanding scientific principles and the basics of mathematics. What I was doing in my last sentence was detailing the high school curriculum set down in 1892 by a committee chaired by the President of Harvard that was mandated for anyone who might want to enter a university.
The curriculum they decided upon has not changed at all since then. Our implicit notions of an educated mind are the same as they were in the nineteenth century. No need to teach anything new, no need to reconsider how a world where a university education was offered solely to the elite might be different from a world in which a university degree is commonplace. For a few years, in the early 90's, I was on the Board of Editors of the Encyclopedia Britannica. Almost everyone else on the board was an octogenarian -- the foremost of them, since he seemed to have everyone's great respect, was Clifton Fadiman, a literary icon of the 40's. When I tried to explain to this board the technological changes that were about to come that would threaten the very existence of the Encyclopedia, there was a general belief that technology would not really matter much. There would always be a need for the encyclopedia and the job of the board would always be to determine what knowledge was the most important to have. Only Clifton Fadiman seemed to realize that my predictions about the internet might have some effect on the institution they guarded. He concluded sadly, saying: "I guess we will just have to accept the fact that minds less well educated than our own will soon be in charge." Note that he didn't say "differently educated," but "less well educated." For some years the literati have held sway over the commonly accepted definition of education. No matter how important science and technology seem to industry or government or indeed to the daily life of the people, as a society we believe that those educated in literature and history and other humanities are in some way better informed, more knowing, and somehow more worthy of the descriptor "well educated." Now if this were an issue confined to those who run the elite universities and prep schools or those whose bible is the New York Review of Books, this really wouldn't matter all that much to anybody.
But this nineteenth century conception of the educated mind weighs heavily on our notions of how we educate our young. We are not educating our young to work or to live in the nineteenth century, or at least we ought not be doing so. Yet when universities graduate thousands of English and history majors, it can only be because we imagine that such fields form the basis of the educated mind. When we choose to teach our high schoolers trigonometry instead of, say, basic medicine or business skills, it can only be because we think that trigonometry is somehow more important to an educated mind or that education is really not about preparation for the real world. When we focus on intellectual and scholarly issues in high school as opposed to more human issues like communications, or basic psychology, or child raising, we are continuing to rely upon outdated notions of the educated mind that come from elitist notions of who is to be educated. We argue that an educated mind can reason, but curiously there are no courses in our schools that teach reasoning. When we say that an educated mind can see more than one side of an argument, we go against the school system which holds that there are right answers to be learned and that tests can reveal who knows them and who doesn't. Now obviously telecommunications is more important than basic chemistry and HTML is more significant than French in today's world. These are choices that have to be made, but they never will be made until our fundamental conception of erudition changes or until we realize that the schools of today must try to educate the students who actually attend them as opposed to the students who attended them in 1892. The 21st century conception of an educated mind is based upon old notions of erudition and scholarship not germane to this century. The curriculum of the school system bears no relation to the finished products we seek.
We need to rethink what it means to be educated and begin to focus on a new conception of the very idea of education. Roger Schank is Distinguished Career Professor, School of Computer Science, Carnegie-Mellon University and author of Virtual Learning: A Revolutionary Approach to Building a Highly Skilled Workforce. ________________________________________________________________ "Do the benefits accruing to humankind (leaving aside questions of afterlife) from the belief and practice of organized religions outweigh the costs?" Given the political sensitivities of the topic, it is hard to imagine that a suitably rigorous attempt to answer this question could be organized or its results published and discussed soberly, but it is striking that there is no serious basis on which to conduct such a conversation. Religion brings peace and solace to many; religion kills people, divides societies, diverts energy and resources. How to assess the net impact in some meaningfully quantitative way? Even to imagine the possibility of such an inquiry and to think through some of the categories you would use could be very enlightening. James J. O'Donnell is Professor of Classical Studies and Vice Provost at UPenn and author of Avatars of the Word: From Papyrus to Cyberspace. ________________________________________________________________ "Is technology going to 'wake up' or 'come alive' anytime in the future?" Bill Joy, the prominent computer scientist, argued in a Wired article last year that "the future doesn't need us" because other creatures, artificial or just post-human, are going to take over the world in the 21st century. 
He is worried that various technologies -- particularly robotics, genetic engineering and nanotechnology -- are soon going to be capable of generating either a self-conscious machine (something like the Internet "waking up") or one capable of self-replication (nanotechnologists inspired by the vision of Eric Drexler are currently attempting to create a nano-scaled "universal assembler"). If either of these events came to pass, it would surely introduce major changes in the planetary ecology, and humans would have to find a new role to play in such a world. But is Joy right? Do we have to worry about mad scientists producing some invention that inadvertently renders us second-class citizens to machines in the next couple of decades? (Joy is so distraught by this prospect that he would have everyone stop working in these areas.) This is a difficult question to answer, mostly because we don't currently have a very good idea of how technology evolves, so it's hard to predict future developments. But I believe that we can get some way toward an answer by adopting an approach currently being developed by some of our best evolutionary thinkers, such as John Maynard Smith, Eörs Szathmáry, and others. This "major transition" theory is concerned with determining the conditions under which new kinds of agents emerge in some evolutionary lineage. Examples of such transitions occurred when prokaryotes became eukaryotes, or single-celled organisms became multicellular. In each case, previously independent biological agents evolved new methods of cooperation, with the result that a new level of organization and agency appeared in the world. This theory hasn't yet been applied to the evolution of technology, but it could help to pinpoint important issues. In effect, what I want to investigate is whether the futures that disturb Bill Joy can be appropriately analyzed as major transitions in the evolution of technology.
Given current trends in science and technology, can we say that a global brain is around the corner, or that nano-robots are going to conquer the Earth? That, at least, is my current project. Robert Aunger is an evolutionary theorist and editor of Darwinizing Culture: The Status of Memetics as a Science. ________________________________________________________________ "Was there any choice in the creation of the Universe?" Here I paraphrase Einstein's famous question: "Did God have any choice in the creation of the Universe?" I get rid of the God part, which Einstein only added to make it seem more whimsical, I am sure, because that just confuses the issue. The important question, perhaps the most important question facing physics today, is whether there is only one consistent set of physical laws that allows a working universe, or whether the constants of nature are arbitrary and could take any set of values. Namely, if we continue to probe into the structure of matter and the nature of elementary forces, will we find that mathematical consistency is possible only for one unique theory of the Universe, or not? In the former case, of course, there is hope for an exactly predictive "theory of everything". In the latter case, we might expect that our Universe is merely one of an infinite set of Universes within some grand multiverse, in each of which the laws of physics differ, and in which anthropic arguments may govern why we live in the Universe we do. The goal of physics throughout the ages has been to explain exactly why the universe is the way it is, but as we push closer and closer to the ultimate frontier, we may find that the ultimate laws of nature generically produce a universe quite different from the one we live in. This would force a dramatic shift in our concept of natural law.
Some may suggest that this question is mere philosophical nonsense, akin to asking how many angels may sit on the head of a pin. However, I think that if we are lucky it may be empirically possible to address it. If, for example, we do come up with some fundamental theory that predicts the values of many fundamental quantities correctly, but that predicts that other mysterious quantities, like the energy of empty space, are generically different from the values we measure, or perhaps are determined probabilistically, this will add strong ammunition to the notion that our universe is not unique, but arose from an ensemble of causally disconnected parts, each with randomly varying values of the vacuum energy. In any case, answerable or not, I think this is the ultimate question in science. Lawrence Krauss is Professor of Physics at Case Western Reserve University and the author of Atom. ________________________________________________________________ "How much can we handle?" We've got fundamental scientific theories (such as quantum theory and relativity) that test out superbly, even if we don't quite know how they all fit into a whole, but we're hung up trying to understand complicated phenomena, like living things. How much complexity can we handle? We ought to be able to use computers to model complicated things, but we can't as yet write software that's complicated enough to take advantage of the ever-bigger computers we are learning to build. Complexity, side effects, legacy. How much can we handle? That's the question of the new century. There's a social variant of the same problem: In the twentieth century we became powerful enough to destroy ourselves, but we seemed to be able to handle that. Now technology and information flow have improved to the point that a small number of us might be able to destroy us all. Can we handle that? Jaron Lanier, computer scientist and musician, is currently the lead scientist for the National Tele-Immersion Initiative.
________________________________________________________________ "Why am I me?" This question was asked by my eight-year-old grandson George. In eight letters it summarizes the conundrum of personal existence in an impersonal universe. How does it happen that a couple of liters of grey matter organizes itself into the unique stream of self-awareness that calls itself George? If we could answer this question, we would be on the way toward an understanding of brain structure and function at a deep level. We would probably have in our hands the key to a more rational and discriminating treatment of mental illnesses. We might also have the key to the design of a genuine artificial intelligence. Every human being must have asked this question in one way or another. For most of us, the question expresses only a general philosophical curiosity about our place in the order of nature. But for George the question has a more specific technical meaning. He has an identical twin brother Donald, and he understands the distinction between monozygotic and fraternal twins. He knows that he and Donald not only have the same genes but also have the same environment and upbringing. When George asks the question, he is asking how it happens that two people with identical genes and identical nurture are nevertheless different. What are the non-genetic and non-environmental processes in the brain that cause George to be George and cause Donald to be Donald? If we could answer this question, we would have a powerful new tool for the investigation of cognitive development. The conventional wisdom says that mental differences between George and Donald arise from local randomness of neural connections, undetermined either by genes or by sensory input. But to say that the connections are random only means that we do not yet understand how they came about. Freeman Dyson is professor of physics at the Institute for Advanced Study and author of The Sun, the Genome, and the Internet. 
________________________________________________________________ "Do we want to live in one world, or two?" One of the great achievements of recent history has been a dramatic reduction in absolute poverty in the world. In 1820 about 85% of the world's population lived on the equivalent of a dollar a day (converted to today's purchasing power). By 1980, that percentage had dropped to 30%, and it is now down to 20%. But that still means 1 billion people live in absolute poverty. A further 2 billion are little better off, living on $2 a day. A quarter of the world's people never get a cup of clean water. Part of what globalisation means is that we have a reasonable chance of assuring that a majority of the world's people will benefit from continuing economic growth, improvements in health and education, and the untapped potential of the extraordinary technologies about which most of the Edge contributors write so eloquently. We currently lack the political will to make sure that a vast number of people are not fenced off from this optimistic future. So my question poses a simple choice. Are we content to have two increasingly estranged worlds? Or do we want to find the path to a unified, healthy world? Lance Knobel is Adviser, Prime Minister's Forward Strategy Unit, London, and the former head of the program of the World Economic Forum's Annual Meeting in Davos. ________________________________________________________________ "What's the neurobiology of doing good and being good?" I've spent most of my career as a neurobiologist working on an area of the brain called the hippocampus. It's a fairly useful region -- it plays a critical role in learning and memory. It's the area that's damaged in Alzheimer's, in alcoholic dementia, during prolonged seizures or cardiac arrest. You want to have your hippocampus functioning properly. So I've spent all these years trying to figure out why hippocampal neurons die so easily and what you can do about it.
That's fine, might even prove useful some day. But as of late, it's been striking me that I'm going to be moving in the direction of studying a part of the brain called the prefrontal cortex (PFC). It's a fascinating part of the brain, the part of the brain that most defines us as humans. There are endless technical ways to describe what the PFC does, but as an informal definition that works pretty well, it's the closest thing we have to a superego. The PFC is what allows us to become potty trained early on. And it is responsible for squeezing our psychic sphincters closed as well. It keeps us from belching loudly at the quiet moment in the wedding ceremony, prevents us from telling our host just what we really think of the inedible meal they've served. It keeps us from having our murderous thoughts turn into murderous acts. And it plays a similar role in the cognitive realm -- the PFC stops us from solving a problem with an answer that, while easier and more reflexive, is wrong. The PFC is what makes us do the right thing, even if it's harder. Not surprisingly, it's one of the last parts of the brain to fully develop (technical jargon -- to fully myelinate). But what is surprising is just how long it is before the PFC comes fully on line -- astonishingly, around age 30. And this is where my question comes in. It is best framed in the context of young kids, and this is probably what has prompted me to begin to think about the PFC, as I have two young children. Kids are wildly "frontally disinhibited," the term for having a PFC that hasn't quite matured yet into keeping its foot firmly on the brake. Play hide and seek with a three-year-old, loudly, plaintively call, "Where are you?" and their lack of frontal function does them in -- they can't stop themselves from calling out, "Here I am, under the table," giving away their hiding spot.
I suspect that there is a direct, near linear correlation between the number of fully myelinated frontal neurons in a small child's brain and how many dominoes you can line up in front of him before he must MUST knock them over. So my question comes to the forefront in a scenario that came up frequently for me a few years ago: my then three-year-old who, while a wonderful child, was distinctly three, would do something reasonably appalling to his younger sister -- take some stuffed animal away, grab some contested food item, whatever. A meltdown then ensues. My wife or I intervene, strongly reprimanding our son for mistreating his sister. And then the other parent would say, "Well, is this really fair, coming down on him like this? After all, he has no frontal function yet; he can't stop himself" (my wife is a neuropsychologist so, pathetically, we actually speak this way to each other). And the other would retort -- "Well, how else is he going to develop that frontal function?" That's the basic question -- how does the world of empathy, theory of mind, gratification postponement, Kohlberg stages of moral development, etc., combine with the world of neurotrophic growth factors stimulating neurons to grow fancier connections? How do they produce a PFC that makes you do the harder thing because it's right? How does this become a life-long pattern of PFC function? Robert Sapolsky is a professor of biological sciences at Stanford University and author of A Primate's Memoir. ________________________________________________________________ "Is humanity in the midst of a cognitive 'Fourth Transition?' Or, why doesn't the Encyclopedia Britannica matter any more?" It feels to me like something very important is going on. Clearly our children aren't quite like us. They don't learn about the world as we did. They don't storehouse knowledge about the world as we have. They don't "sense" the world as we do.
Could humanity possibly already be in the middle of a next stage of cognitive transition? Merlin Donald has done a fine job of summarizing hundreds of inquiries into the evolution of culture and cognition in his Origins of the Modern Mind. Here, as in his other work, he posits a series of "layered" morphological, neurological and external technological stages in this evolutionary path. What he refers to as the "Third Transition" (from "Mythic" to "Theoretic" culture) appears to have begun 2500 (or so) years ago and has now largely completed its march to "mental" dominance worldwide. While this last "transition" did not require biological adaptation (or speciation), it nonetheless changed us -- neurologically and psycho-culturally. The shift from the "primary orality" of "Mythic culture" to literacy and the reliance on what Donald calls an "External Symbolic Storage" network has resulted in a new sort of mind. The "modern" mind. Could we be "evolving" towards an even newer sort of mind as a result of our increasing dependence on newer sorts of symbolic networks and newer environments of technologies? Literacy (while still taught and used) doesn't have anywhere near the clout it once had. Indeed, as fanatical "literalism" (aka "fundamentalism") thrashes its way to an early grave (along with the decline of the reciprocal fascination of the past 50 years with "deconstructing" everything as "texts"), how much will humanity care about and rely upon the encyclopedic storage of knowledge in alphabetic warehouses? Perhaps we are already "learning," "knowing" and "sensing" the world in ways that presage something very different from the "modern" mind. Should we ask the children? Mark Stahlman, a venture capitalist who has been focused on next generation computer/networking platforms, is co-founder of the Newmedia Laboratory, NYNMA.
________________________________________________________________ "What are minds, that they are both essentially mental yet inextricably intertwined with body (and world)?" We thought we had this one nailed. Believing (rightly) that the physical world is all there is, the sciences of the mind re-invented thought and reason (and feeling) as information-processing events in the human brain. But this vision turns out to be either incomplete or fatally flawed. The neat and tidy division between a level of information processing (software) and of physicality (implementation) is useful when we deal with humanly engineered systems. We build such systems, as far as possible, to keep the levels apart. But nature was not guided by any such neat and tidy design principles. The ways that evolved creatures solve problems of anticipation, response, reasoning and perceiving seem to involve endless leakage and interweaving between motion, action, visceral (gut) response, and somewhat more detached contemplation. When we solve a jigsaw puzzle, we look, think, and categorise: but we also view the scene and pieces from new angles, moving head and body. And we pick pieces up and try them out. Real on-the-hoof human reason is like that through and through. Even the use of pen and paper to construct arguments displays the same complex interweaving of embodied action, perceptual re-encountering, and neural activity. Mind and body (and world) emerge as messily and continuously coupled partners in the construction of rational action. But this leads to a very real problem, an impasse that is currently the single greatest roadblock in the attempts to construct a mature science of the mind. We cannot, despite the deep and crucial roles of body and world, understand the mind in quite the same terms as, say, an internal combustion engine. Where minds are concerned, it is the flow of contents (and feelings) that seems to matter. 
Yet if we prescind from the body and world, pitching our stories and models at the level of the information flows, we again lose sight of the distinctively human mind. We need the information-and-content based story to see the mind as, precisely, a mind. Yet we cannot do justice to minds like ours without including body, world (cognitive tools and other people) and motion in roles which are both genuinely cognitive yet thoroughly physical. What we lack is a framework, picture, or model in terms of which to understand this larger system as the cognitive engine. All current stories are forced to one side (information flows) or the other (physical dynamics). Cognitive Science thus stands in a position similar to that of Physics in the early decades of the 20th century. What we lack is a kind of 'quantum theory' of the mind: a new framework that displays mind as mind, yet as body in action too. Andy Clark is Professor of Philosophy and Cognitive Science at the University of Sussex, UK and the author of Being There: Putting Brain, Body and World Together Again. ________________________________________________________________ "At what age should women say, 'No,' to first-time pregnancy?" Scientific advances now make it possible for a woman past normal child-bearing years to bear a child. Some of my high-tech friends who range from age 43 to almost 50 are either bearing children or plan to, using in-vitro techniques. These women have postponed childbearing because of their careers, but they want to experience the joys of family that their male counterparts were able to share while still pursuing their professional goals -- an option far more difficult for the childbearer and primary care provider. Many successful men start first, second, or third families later in their lives, so why should we criticize women who want to bear a first child, when, thanks to science, it is no longer "too late"? Sylvia Paull is the founder of Gracenet (www.gracenet.net).
________________________________________________________________ "What is the relationship between being alive and having a mind?" Last year, Steven Spielberg directed a film, based upon a Stanley Kubrick project, entitled "A.I. Artificial Intelligence". The film depicts a robotic child who develops human emotions. Is such a thing possible? Could a sufficiently complex and appropriately designed computer embody human emotions? Or is this simply a fanciful notion that the public and some scientists who specialize in artificial intelligence just wish could be true? I don't think that computers will ever become conscious, and I view Spielberg's depiction of a conscious, feeling robot as a good example of what might be called "The Spielberg Principle," which states: "When a Steven Spielberg film depicts a world-changing scientific event, the likelihood of that event actually occurring approaches zero." In other words, our wishes and imagination often have little to do with what is scientifically likely or possible. For example, although we might wish for contact with other beings in the universe as portrayed in the Spielberg movie "E.T.", the astronomical distances between our solar system and the rest of the universe make an E.T.-like visit extremely unlikely. The film A.I. and the idea contained within it that robots could someday become conscious is another case in which our wishes exceed reality. Despite enormous advances in artificial intelligence, no computer is able to experience a pin prick like a simple frog, or get hungry like a rat, or become happy or sad like all of us carbon-based units. But why is this the case? It is my conjecture that this is because there are some features of being alive that make mind, consciousness, and feelings possible. That is, only living things are capable of the markers of mind such as intentionality, subjectivity, and self-awareness.
But the important question of the link between life and the creation of consciousness remains a great scientific mystery, and the answer will go a long way toward our understanding of what a mind actually is. Todd E. Feinberg, MD is Chief, Yarmon Neurobehavior and Alzheimer's Disease Center, Beth Israel Medical Center ________________________________________________________________ "To be or not to be?" Old questions don't go away (at least while they remain unanswered). Suppose Edge were to have asked Hamlet for his 2002 question. We can guess the answer. "Sorry, John, I know it's a bit of a cliché, but it's the same question it has always been." Suppose Edge turned next to Albert Camus. "John, I said it in 1942 and I'm still waiting. 'There is but one truly serious philosophical problem and that is suicide. Judging whether life is or is not worth living amounts to answering the fundamental question of philosophy. All the rest -- whether or not the world has three dimensions, whether the mind has nine or twelve categories -- comes afterwards.'" Clichés they may be. But I'd say there's every reason for students of human nature to continue to treat these questions with due seriousness: and in particular to think further about who has been asking them, when, and why, and with what consequences. It may seem a paradox that human beings should have evolved to have a love-hate relationship with their own existence. But in fact there may be a simple Darwinian story to be told about how it has come to be so. Let's accept the stark truth that individual human beings have been designed by natural selection to be, in Dawkins' famous phrase, "survival machines" whose primary function is to help the genes they carry to make it into future generations. We should admit, then, that, from this evolutionary viewpoint, an individual human life cannot be considered an end in itself but only a means to promoting the success of genes.
Yet the fact is that in the human case (and maybe the human case alone) natural selection has devised a peculiarly effective trick for persuading individual survival machines to fulfill this seemingly bleak role. Every human being is endowed with the mental programs for developing a "conscious self" or "soul": a soul which not only values its own survival but sees itself as very much an end in its own right (in fact a soul which, in a fit of solipsism, may even consider itself the one and only source of all the ends there are!). Such a soul, besides doing all it can to ensure its own basic comfort and security, will typically strive for self-development: through learning, creativity, spiritual growth, symbolic expression, consciousness-raising, and so on. These activities redound to the advantage of mind and body. The result is that such "selfish souls" do indeed make wonderful agents for "selfish genes". There has, however, always been a catch. Naturally-designed "survival machines" are not, as the name might imply, machines designed to go on and on surviving: instead they are machines designed to survive only up to a point -- this being the point where the genes they carry have nothing more to gain (or even things to lose) from continued life. For it's a sobering fact that genes are generally better off taking passage and propagating themselves in younger machines than older ones (the older ones will have begun to accumulate defects, to have become set in their ways, to have acquired more than enough dependents, etc.). It suits genes therefore that their survival machines should have a limited life-time, after which they can be scrapped.
Thus, in a scenario that has all the makings of tragedy (if not a tragic farce), natural selection has, on the one hand, been shaping up individual human beings at the level of their souls to believe in themselves and their intrinsic worth, while on the other hand taking steps to ensure that these same individuals on the level of their bodies grow old and die -- and, most likely, since by this stage of a life the genes no longer have any interest in preventing it, to die miserably, painfully and in a state of dreadful disillusion. However, here's the second catch. In order for this double-game that the genes are playing to be successful, it's essential that the soul they've designed does not see what's coming and realise the extent to which it has been duped, at least until too late. But this means preventing the soul, or at any rate cunningly diverting it, from following some of the very lines of inquiry on which it has been set up to place its hopes: looking to the future, searching for eternal truths, and so on. In Camus' words, "Beginning to think is beginning to be undermined." The history of human psychology and culture has revolved around this contradiction built into human nature. Science has not had much to say about it. But it may yet. Nicholas Humphrey is a theoretical psychologist at the London School of Economics, and author of Leaps of Faith. ________________________________________________________________ "Why Sleep?" We need to sleep every day. Why do we spend a third of our lives in a dormant state? Sleep deprivation leads to loss of judgment, failure of health, and eventually to death. The cycle of sleep and alertness is controlled by circadian rhythms, which also affect body temperature, digestion and other regulatory systems. Despite the importance of sleep its purpose is a mystery. The brain remains highly active during sleep, so the simple explanation that we sleep in order to rest cannot be the whole story.
Activity in the sleeping brain is largely hidden from us because very little that occurs during sleep directly enters consciousness. However, electrical recordings and more recently brain imaging experiments during slow-wave sleep have revealed highly ordered patterns of activity that are much more spatially and temporally coherent than brain activity during states of alertness. Slow-wave sleep alternates during the night with rapid eye movement (REM) sleep, during which dreams occur and muscles are paralyzed. For the last 10 years my colleagues and I have been building computer models of interacting neurons that can account for rhythmic brain activity during sleep. Computer models of the sleeping brain and recent experimental evidence point toward slow-wave sleep as a time during which brain cells undergo extensive structural reorganization. It takes many hours for the information acquired during the day to be integrated into long-term memory through biochemical reactions. Could it be that we go to sleep every night in order to remember better and think more clearly? Introspection is misleading in trying to understand the brain, in part because much of the processing that takes place to support seeing, hearing and decision-making is subconscious. In studying the brain during sleep, when we are aware of almost nothing, we may get a better understanding of the brain's secret life and uncover some of the principles that make the mind so elusive. Terrence Sejnowski, a computational neurobiologist and Professor at the Salk Institute for Biological Studies, is a coauthor of Thalamocortical Assemblies: How Ion Channels, Single Neurons and Large-Scale Networks Organize Sleep Oscillations. ________________________________________________________________ "What makes a genius, and how can we have more of them?" As any software developer will tell you, one great programmer is easily worth ten average ones.
The great strides in knowledge have most often come from those we label "genius." Newton, Gauss, Einstein, Feynman, de Morgan, Crick all seemed to be able to make connections or see patterns that others had ignored. They often visualized the world differently, or with fewer constraints than most of us have on our imagination. There are many great problems of science and society to be solved, and applying genius to them could help speed the solutions. Perhaps the analysis of Einstein's brain done by Professor Diamond at Berkeley, which seems to show differences in structure in the inferior parietal region and a higher proportion of glial cells, can lead to some physiological answers. Perhaps there are chemical enhancers which can be used (legally, one would hope) to increase oxygen flow to neurons. Perhaps behavioral conditioning when we're young can help create more of the right type of structures, just as musicians who begin training in early childhood have larger portions of the brain devoted to their skills. Whatever the answer, mankind might be better off for some more genius directed at the environmental, social and scientific fields. Howard Morgan is Vice-Chairman, Idealab. ________________________________________________________________ "Why do people -- even identical twins -- differ from one another in personality?" This question needs to be asked because of the widely held conviction that we already know the answer to it. We don't. Okay, we know half of the answer: one of the reasons why people differ from each other is that they have different genes. That's the easy half. The hard half is the part that isn't genetic. Even people who have identical genes, like Freeman Dyson's twin grandsons (see his question), differ in personality. I am not asking about the feeling each twin has of being "me": George and Donald could be identical in personality, and yet each could have a sense of me-ness.
But if George and Donald are like most identical twins, they aren't identical in personality. Identical twins are more alike than fraternal twins or ordinary siblings, but less alike than you would expect. One might be more meticulous than the other, or more outgoing, or more emotional. The weird thing is that the degree of similarity is the same, whether twins are reared together or apart. George and Donald, according to their grandfather, "not only have the same genes but also have the same environment and upbringing." And yet they are no more alike in personality than twins reared by two different sets of parents in two different homes. We know that something other than genes is responsible for some of the variation in human personality, but we are amazingly ignorant about what it is and how it works. Well-designed research has repeatedly failed to confirm commonly held beliefs about which aspects of a child's environment are important. The evidence indicates that neither those aspects of the environment that siblings have in common (such as the presence or absence of a caring father) nor those that supposedly widen the differences between siblings (such as parental favoritism or competition between siblings) can be responsible for the non-genetic variation in personality. Nor can the vague idea of an "interaction" between genes and environment save the day. George and Donald have the same genes, so how can an interaction between genes and environment explain their differences? Only two hypotheses are compatible with the existing data. One, which I proposed in my book The Nurture Assumption, is that the crucial experiences that shape personality are those that children have outside their home. Unfortunately, there is as yet insufficient evidence to support (or disconfirm) this hypothesis. The remaining possibility is that the unexplained variation in personality is random. Even for reared-together twins, there are minor, random differences in their experiences. 
I find it implausible, however, that minor, random differences in experiences could be so potent, given the ineffectiveness of substantial, systematic differences. If randomness affects personality, the way it probably works is through biological means -- not genetic but biological. The human genome is smallish and the human brain is vast; the genome couldn't possibly contain precise specifications for every neuron and synapse. Identical twins don't have identical brains for the same reason that they don't have identical freckles or fingerprints. If these random physical differences in the brain are responsible for some or all of the personality differences between identical twins, they must also be responsible for some or all of the non-genetic variation in personality among the rest of us. "All" is highly unlikely; "some" is almost certainly true. What remains in doubt is not whether, but how much. The bottom line is that scientists will probably never be able to predict human behavior with anything close to certainty. Next question: Is this discouraging news or cause for celebration? Judith Rich Harris is a developmental psychologist and author of The Nurture Assumption: Why Children Turn Out The Way They Do. ________________________________________________________________ "Many Universes?" Preliminaries We do not know whether there are other universes. Perhaps we never shall. But I want to respond to Paul Davies' questions by arguing that "do other universes exist?" can be a genuine scientific question. Moreover, I shall outline why it is an interesting question; and why, indeed, I already suspect that the answer may be "yes". First, a pre-emptive and trivial comment: if you define the universe as "everything there is", then by definition there cannot be others. I shall, however, follow the convention among physicists and astronomers, and define the "universe" as the domain of space-time that encompasses everything that astronomers can observe. 
Other "universes", if they existed, could differ from ours in size, content, dimensionality, or even in the physical laws governing them. It would be neater, if other "universes" existed, to redefine the whole enlarged ensemble as "the universe", and then introduce some new term -- for instance "the metagalaxy" -- for the domain that cosmologists and astronomers have access to. But so long as these concepts remain so conjectural, it is best to leave the term "universe" undisturbed, with its traditional connotations, even though this then demands a new word, the "multiverse", for a (still hypothetical) ensemble of "universes." Ontological Status Of Other Universes Science is an experimental or observational enterprise, and it's natural to be troubled by assertions that invoke something inherently unobservable. Some might regard the other universes as being in the province of metaphysics rather than physics. But I think they already lie within the proper purview of science. It is not absurd or meaningless to ask "Do unobservable universes exist?", even though no quick answer is likely to be forthcoming. The question plainly can't be settled by direct observation, but relevant evidence can be sought, which could lead to an answer. There is actually a blurred transition between the readily observable and the absolutely unobservable, with a very broad grey area in between. To illustrate this, one can envisage a succession of horizons, each taking us further than the last from our direct experience: (i) Limit of present-day telescopes There is a limit to how far out into space our present-day instruments can probe. Obviously there is nothing fundamental about this limit: it is constrained by current technology. Many more galaxies will undoubtedly be revealed in the coming decades by bigger telescopes now being planned. We would obviously not demote such galaxies from the realm of proper scientific discourse simply because they haven't been seen yet. 
When ancient navigators speculated about what existed beyond the boundaries of the then known world, or when we speculate now about what lies below the oceans of Jupiter's moons Europa and Ganymede, we are speculating about something "real" -- we are asking a scientific question. Likewise, conjectures about remote parts of our universe are genuinely scientific, even though we must await better instruments to check them. (ii) Limit in principle at present era Even if there were absolutely no technical limits to the power of telescopes, our observations are still bounded by a horizon, set by the distance that any signal, moving at the speed of light, could have travelled since the big bang. This horizon demarcates the spherical shell around us at which the redshift would be infinite. There is nothing special about the galaxies on this shell, any more than there is anything special about the circle that defines your horizon when you're in the middle of an ocean. On the ocean, you can see farther by climbing up your ship's mast. But our cosmic horizon can't be extended unless the universe changes, so as to allow light to reach us from galaxies that are now beyond it. If our universe were decelerating, then the horizon of our remote descendants would encompass extra galaxies that are beyond our horizon today. It is, to be sure, a practical impediment if we have to await a cosmic change taking billions of years, rather than just a few decades (maybe) of technical advance, before a prediction about a particular distant galaxy can be put to the test. But does that introduce a difference of principle? Surely the longer waiting-time is a merely quantitative difference, not one that changes the epistemological status of these faraway galaxies? (iii) Never-observable galaxies from "our" Big Bang But what about galaxies that we can never see, however long we wait? It's now believed that we inhabit an accelerating universe. 
As in a decelerating universe, there would be galaxies so far away that no signals from them have yet reached us; but if the cosmic expansion is accelerating, we are now receding from these remote galaxies at an ever-increasing rate, so if their light hasn't yet reached us, it never will. Such galaxies aren't merely unobservable in principle now -- they will be beyond our horizon forever. But if a galaxy is now unobservable, it hardly seems to matter whether it remains unobservable for ever, or whether it would come into view if we waited a trillion years. (And I have argued, under (ii) above, that the latter category should certainly count as "real".) (iv) Galaxies in disjoint universes The never-observable galaxies in (iii) would have emerged from the same Big Bang as we did. But suppose that, instead of causally-disjoint regions emerging from a single Big Bang (via an episode of inflation), we imagine separate Big Bangs. Are space-times completely disjoint from ours any less real than regions that never come within our horizon in what we'd traditionally call our own universe? Surely not -- so these other universes should count as real parts of our cosmos, too. This step-by-step argument (those who don't like it might dub it a slippery slope argument!) suggests that whether other universes exist or not is a scientific question. But it is of course speculative science. The next question is, can we put it on a firmer footing? What could it explain? Scenarios For A Multiverse At first sight, nothing seems more conceptually extravagant -- more grossly in violation of Ockham's Razor -- than invoking multiple universes. But this concept is a natural consequence of several different theories (albeit all speculative). Andrei Linde, Alex Vilenkin and others have performed computer simulations depicting an "eternal" inflationary phase where many universes sprout from separate big bangs into disjoint regions of spacetime. 
Alan Guth and Lee Smolin have, from different viewpoints, suggested that a new universe could sprout inside a black hole, expanding into a new domain of space and time inaccessible to us. And Lisa Randall and Raman Sundrum suggest that other universes could exist, separated from us in an extra spatial dimension; these disjoint universes may interact gravitationally, or they may have no effect whatsoever on each other. There could be another universe just a few millimetres away from us. But if those millimetres were measured in some extra spatial dimension then to us (imprisoned in our 3-dimensional space) the other universe would be inaccessible. In the hackneyed analogy where the surface of a balloon represents a two-dimensional universe embedded in our three-dimensional space, these other universes would be represented by the surfaces of other balloons: any bugs confined to one, and with no conception of a third dimension, would be unaware of their counterparts crawling around on another balloon. Variants of such ideas have been developed by Paul Steinhardt, Neil Turok and others. Guth and Edward Harrison have even conjectured that universes could be made in some far-future laboratory, by imploding a lump of material to make a small black hole. Could our entire universe perhaps then be the outcome of some experiment in another universe? If so, the theological arguments from design could be resuscitated in a novel guise. Smolin speculates that the daughter universe may be governed by laws that bear the imprint of those prevailing in its parent universe. If that new universe were like ours, then stars, galaxies and black holes would form in it; those black holes would in turn spawn another generation of universes; and so on, perhaps ad infinitum. Parallel universes are also invoked as a solution to some of the paradoxes of quantum mechanics, in the "many worlds" theory, first advocated by Hugh Everett and John Wheeler in the 1950s. 
This concept was prefigured by Olaf Stapledon, in his 1937 novel, as one of the more sophisticated creations of his Star Maker: "Whenever a creature was faced with several possible courses of action, it took them all, thereby creating many ... distinct histories of the cosmos. Since in every evolutionary sequence of this cosmos there were many creatures and each was constantly faced with many possible courses, and the combinations of all their courses were innumerable, an infinity of distinct universes exfoliated from every moment of every temporal sequence". None of these scenarios has been simply dreamed up out of the air: each has a serious, albeit speculative, theoretical motivation. However, one of them, at most, can be correct. Quite possibly none is: there are alternative theories that would lead just to one universe. Firming up any of these ideas will require a theory that consistently describes the extreme physics of ultra-high densities, how structures on extra dimensions are configured, etc. But consistency is not enough: there must be grounds for confidence that such a theory isn't a mere mathematical construct, but applies to external reality. We would develop such confidence if the theory accounted for things we can observe that are otherwise unexplained. At the moment, we have an excellent framework, called the standard model, that accounts for almost all subatomic phenomena that have been observed. But the formulae of the "standard model" involve numbers which can't be derived from the theory but have to be inserted from experiment. Perhaps, in the 21st century, physicists will develop a theory that yields insight into (for instance) why there are three kinds of neutrinos, and the nature of the nuclear and electric forces. Such a theory would thereby acquire credibility. 
If the same theory, applied to the very beginning of our universe, were to predict many big bangs, then we would have as much reason to believe in separate universes as we now have for believing inferences from particle physics about quarks inside atoms, or from relativity theory about the unobservable interior of black holes. Universal Laws, Or Mere Bylaws? "Are the laws of physics unique?" is a less poetic version of Einstein's famous question "Did God have any choice in the creation of the Universe?" The answer determines how much variety the other universes -- if they exist -- might display. If there were something uniquely self-consistent about the actual recipe for our universe, then the aftermath of any big bang would be a re-run of our own universe. But a far more interesting possibility (which is certainly tenable in our present state of ignorance of the underlying laws) is that the underlying laws governing the entire multiverse may allow variety among the universes. Some of what we call "laws of nature" may in this grander perspective be local bylaws, consistent with some overarching theory governing the ensemble, but not uniquely fixed by that theory. As an analogy (one which I owe to Paul Davies) consider the form of snowflakes. Their ubiquitous six-fold symmetry is a direct consequence of the properties and shape of water molecules. But snowflakes display an immense variety of patterns because each is moulded by its micro-environments: how each flake grows is sensitive to the fortuitous temperature and humidity changes during its downward drift. If physicists achieved a fundamental theory, it would tell us which aspects of nature were direct consequences of the bedrock theory (just as the symmetrical template of snowflakes is due to the basic structure of a water molecule) and which are (like the distinctive pattern of a particular snowflake) the outcome of accidents. 
The accidental features could be imprinted during the cooling that follows the big bang -- rather as a piece of red-hot iron becomes magnetised when it cools down, but with an alignment that may depend on chance factors. It may turn out (though this would be a disappointment to many physicists if it did) that the key numbers describing our universe, and perhaps some of the so-called constants of laboratory physics as well, are mere "environmental accidents", rather than being uniquely fixed throughout the multiverse by some final theory. This is relevant to some now-familiar arguments (explored further in my book Our Cosmic Habitat) about the surprisingly fine-tuned nature of our universe. Fine Tuning -- A Motivation For Suspecting That Our "Universe" Is One Of Many. The nature of our universe depended crucially on a recipe encoded in the big bang, and this recipe seems to have been rather special. A degree of fine tuning -- in the expansion speed, the material content of the universe, and the strengths of the basic forces -- seems to have been a prerequisite for the emergence of the hospitable cosmic habitat in which we live. Here are some prerequisites for a universe containing organic life of the kind we find on Earth: First of all, it must be very large compared to individual particles, and very long-lived compared with basic atomic processes. Indeed this is surely a requirement for any hypothetical universe that a science fiction writer could plausibly find interesting. If atoms are the basic building blocks, then clearly nothing elaborate could be constructed unless there were huge numbers of them. Nothing much could happen in a universe that was too short-lived: an expanse of time, as well as space, is needed for evolutionary processes. Even a universe as large and long-lived as ours could be very boring: it could contain just black holes, or inert dark matter, and no atoms at all; it could even be completely uniform and featureless. 
Moreover, unless the physical constants lie in a rather narrow range, there would not be the variety of atoms required for complex chemistry. If our existence depends on a seemingly special cosmic recipe, how should we react to the apparent fine tuning? There seem to be three lines to take: we can dismiss it as happenstance; we can acclaim it as the workings of providence; or (my preference) we can conjecture that our universe is a specially favoured domain in a still vaster multiverse. Some seemingly "fine tuned" features of our universe could then only be explained by "anthropic" arguments, which are analogous to what any observer or experimenter does when they allow for selection effects in their measurements: if there are many universes, most of which are not habitable, we should not be surprised to find ourselves in one of the habitable ones. Testing Specific Multiverse Theories Here And Now We may one day have a convincing theory that tells us whether a multiverse exists, and whether some of the so-called laws of nature are just parochial by-laws in our cosmic patch. But while we're waiting for that theory -- and it could be a long wait -- the "ready-made clothes shop" analogy can already be checked. It could even be refuted: this would happen if our universe turned out to be even more specially tuned than our presence requires. Let me give two quite separate examples of how this style of reasoning can be used to refute specific hypotheses. (i) Ludwig Boltzmann argued that our entire universe was an immensely rare "fluctuation" within an infinite and eternal time-symmetric domain. There are now many arguments against this hypothesis, but even when it was proposed one could already have noted that fluctuations in large volumes are far more improbable than in smaller volumes. 
So, it would be overwhelmingly more likely, if Boltzmann were right, that we would be in the smallest fluctuation compatible with our existence. (Indeed, the most probable fluctuation would be a disembodied brain that merely simulated the sensations of the external world.) Whatever our initial assessment of Boltzmann's theory, its probability would plummet if we came to accept the extravagant scale of the cosmos. (ii) Even if we knew nothing about how stars and planets formed, we would not be surprised to find that our Earth's orbit wasn't highly eccentric: if it had been, water would boil when the Earth was at perihelion and freeze at aphelion -- a harsh environment unconducive to our emergence. However, a modest orbital eccentricity (certainly up to 0.1) is plainly not incompatible with life. If it had turned out that the earth moved in a near-perfect circle (with eccentricity, say, less than 0.00001), this would be a strong argument against a theory that postulated anthropic selection from orbits whose eccentricities had a "Bayesian prior" that was uniform in the range from zero to one. We could apply this style of reasoning to the important numbers of physics (for instance, the cosmological constant lambda) to test whether our universe is typical of the subset that could harbour complex life. Lambda has to be below a threshold to allow protogalaxies to pull themselves together by gravitational forces before gravity is overwhelmed by cosmical repulsion (which happens earlier if lambda is large). An unduly fierce cosmic repulsion would prevent galaxies from forming. Suppose, for instance, that (contrary to current indications) lambda was thousands of times smaller than it needed to be merely to ensure that galaxy formation wasn't prevented. This would raise suspicions that it was indeed zero for some fundamental reason. (Or that it had a discrete set of possible values, and all the others were well above the threshold.) 
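The selection-effect logic in the eccentricity example can be made concrete with a toy Monte Carlo. Everything here is taken from the text's hypothetical, not from real planetary data: a prior on eccentricity uniform on [0, 1), and a habitability cut at roughly 0.1. Under pure anthropic selection, an observed eccentricity should be unsurprising anywhere in the habitable range; finding one below 0.00001 would be strong evidence against the hypothesis.

```python
import random

random.seed(0)

# Toy model of Rees's example (illustration only): eccentricities
# drawn from a uniform "Bayesian prior" on [0, 1); life requires
# an eccentricity below about 0.1, as stated in the text.
N = 1_000_000
prior_draws = [random.random() for _ in range(N)]
habitable = [e for e in prior_draws if e < 0.1]

# Under anthropic selection alone, observed values should be spread
# across the whole habitable range, so a near-perfect circle
# (e < 0.00001) would be a roughly one-in-ten-thousand surprise.
frac_below_tiny = sum(e < 1e-5 for e in habitable) / len(habitable)
print(f"habitable fraction of prior: {len(habitable) / N:.3f}")
print(f"P(e < 1e-5 | habitable):     {frac_below_tiny:.1e}")
```

The analytic answer is simply 1e-5 / 0.1 = 1e-4; the simulation just makes the selection step explicit.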
The methodology requires us to decide what values of a particular physical parameter are compatible with our emergence. It also requires a specific theory that gives the relative Bayesian priors for any particular value. For instance, in the case of lambda, are all values equally probable? Are low values favoured by the physics? Or is there a finite number of discrete possible values, depending on how the extra dimensions "roll up"? With this information, one can then ask if our actual universe is "typical" of the subset in which we could have emerged. If it is a grossly atypical member even of this subset (not merely of the entire multiverse) then we would need to abandon our hypothesis. By applying similar arguments to the other numbers, we could check whether our universe is typical of the subset that could harbour complex life. If so, the multiverse concept would be corroborated. As another example of how "multiverse" theories can be tested, consider Smolin's conjecture that new universes are spawned within black holes, and that the physical laws in the daughter universe retain a memory of the laws in the parent universe: in other words there is a kind of heredity. Smolin's concept is not yet bolstered by any detailed theory of how any physical information (or even an arrow of time) could be transmitted from one universe to another. It has, however, the virtue of making a prediction about our universe that can be checked. If Smolin were right, universes that produce many black holes would have a reproductive advantage, which would be passed on to the next generation. Our universe, if an outcome of this process, should therefore be near-optimum in its propensity to make black holes, in the sense that any slight tweaking of the laws and constants would render black hole formation less likely. 
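The typicality test Rees describes can be sketched as a small function: given draws from a prior, a habitability condition, and an observed value, compute where the observed value sits among the habitable draws. The prior, the threshold, and both observed values below are invented purely for illustration; a real application would need the physics-derived prior he says is still missing.

```python
import random

random.seed(1)

def typicality_percentile(prior_sample, is_habitable, observed):
    """Percentile rank of an observed value among habitable prior draws."""
    subset = [x for x in prior_sample if is_habitable(x)]
    below = sum(x <= observed for x in subset)
    return 100.0 * below / len(subset)

# Hypothetical numbers: lambda in arbitrary units with a uniform
# prior on [0, 1); galaxies form only if lambda < 0.2.
draws = [random.random() for _ in range(200_000)]
is_habitable = lambda lam: lam < 0.2

# An observed lambda near the threshold is "typical" of the
# habitable subset (it lands mid-distribution)...
print(typicality_percentile(draws, is_habitable, observed=0.15))
# ...while one thousands of times smaller than needed sits far out
# in the tail, which on this reasoning would count against the
# multiverse hypothesis and hint at a deeper explanation.
print(typicality_percentile(draws, is_habitable, observed=0.0001))
```

A grossly atypical percentile (very close to 0 or 100) is what would, in Rees's phrase, force us to abandon the hypothesis.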
(I personally think Smolin's prediction is unlikely to be borne out, but he deserves our thanks for presenting an example that illustrates how a multiverse theory can in principle be vulnerable to disproof.) These examples show that some claims about other universes may be refutable, as any good hypothesis in science should be. We cannot confidently assert that there were many big bangs -- we just don't know enough about the ultra-early phases of our own universe. Nor do we know whether the underlying laws are "permissive": settling this issue is a challenge to 21st century physicists. But if they are, then so-called anthropic explanations would become legitimate -- indeed they'd be the only type of explanation we'll ever have for some important features of our universe. A Keplerian Argument The multiverse concept might seem arcane, even by cosmological standards, but it affects how we weigh the observational evidence in some current debates. Our universe doesn't seem to be quite as simple as it might have been. About 5 percent of its mass is in ordinary atoms; about 25 percent is in dark matter (probably a population of particles that survived from the very early universe); and the remaining 70 percent is latent in empty space itself. Some theorists have a strong prior preference for the simplest universe and are upset by these developments. It now looks as though a craving for such simplicity will be disappointed. Perhaps we can draw a parallel with debates that occurred 400 years ago. Kepler discovered that planets moved in ellipses, not circles. Galileo was upset by this. In his "Dialogues concerning the two chief systems of the world" he wrote "For the maintenance of perfect order among the parts of the Universe, it is necessary to say that movable bodies are movable only circularly". 
To Galileo, circles seemed more beautiful; and they were simpler -- they are specified just by one number, the radius, whereas an ellipse needs an extra number to define its shape (the "eccentricity"). Newton later showed, however, that all elliptical orbits could be understood by a single unified theory of gravity. Had Galileo still been alive when Principia was published, Newton's insight would surely have joyfully reconciled him to ellipses. The parallel is obvious. A universe with at least three very different ingredients may seem ugly and complicated. But maybe this is our limited vision. Our Earth traces out just one ellipse out of an infinity of possibilities, its orbit being constrained only by the requirement that it allows an environment conducive for evolution (not getting too close to the Sun, nor too far away). Likewise, our universe may be just one of an ensemble of all possible universes, constrained only by the requirement that it allows our emergence. So I'm inclined to go easy with Occam's razor: a bias in favour of "simple" cosmologies may be as short-sighted as was Galileo's infatuation with circles. What we've traditionally called "the universe" may be the outcome of one big bang among many, just as our Solar System is merely one of many planetary systems in the Galaxy. Just as the pattern of ice crystals on a freezing pond is an accident of history, rather than being a fundamental property of water, so some of the seeming constants of nature may be arbitrary details rather than being uniquely defined by the underlying theory. The quest for exact formulas for what we normally call the constants of nature may consequently be as vain and misguided as was Kepler's quest for the exact numerology of planetary orbits. And other universes will become part of scientific discourse, just as "other worlds" have been for centuries. 
We may one day have a convincing theory that accounts for the very beginning of our universe, tells us whether a multiverse exists, and (if so) whether some so-called laws of nature are just parochial by-laws in our cosmic patch, which may be vastly larger than the domain we can now (or, indeed, can ever) observe. Most physicists hope to discover a fundamental theory that will offer unique formulae for all the constants of nature. But perhaps what we've traditionally called our universe is just an atom in an ensemble -- a multiverse punctuated by repeated big bangs, where the underlying physical laws permit diversity among the individual universes. Even though some physicists still foam at the mouth at the prospect of being "reduced" to these so-called anthropic explanations, such explanations may turn out to be the best we can ever discover for some features of our universe (just as they are the best explanations we can offer for the shape and size of Earth's orbit). Cosmology will have become more like the science of evolutionary biology. Nonetheless (and here physicists should gladly concede to the philosophers), any understanding of why anything exists -- why there is a universe (or multiverse) rather than nothing -- remains in the realm of metaphysics. Sir Martin Rees, a cosmologist, is Royal Society Professor at King's College, Cambridge. He directs a research program at Cambridge's Institute of Astronomy. His most recent book is Our Cosmic Habitat. ________________________________________________________________ "How will people think about the soul?" Cognitive scientists believe that emotions, memories, and consciousness are the result of physical processes. But almost nobody else does. Common sense tells us that our mental life is the product of an immaterial soul, one that can survive the destruction of the body and brain. The physical basis of thought is, as Francis Crick put it, "an astonishing hypothesis", one that few take seriously. 
You might think that this will soon change. After all, people once thought the earth was flat and that mental illness was caused by demonic possession. But the belief in the immaterial soul is different. It is rooted in our experience -- our gut feeling, after all, is not that we are bodies; it is that we occupy them. Even young children are dualists -- they appreciate and enjoy tales in which a person leaves his body and goes to faraway lands, or when the frog turns into a prince. And when they come to think about death, they readily accept that the soul lives on, drifting into another body or ascending to another world. When the public hears about research into the neural basis of thought, they learn about specific findings: this part of the brain is involved in risk taking, that part is active when someone thinks about music, and so on. But the bigger picture is not yet generally appreciated, and it is an interesting question how people will react when it is. (We are seeing the first signs now, much of it in the recent work of novelists such as Jonathan Franzen, David Lodge, and Ian McEwan). It might be that non-specialists will learn to live with the fact that their gut intuitions are mistaken, just as non-physicists accept that apparently solid objects are composed of tiny moving particles. But this may be optimistic. The notion that our souls are flesh is profoundly troubling, in large part because of what it means for the idea of life after death. The same sorts of controversies that raged over the study and teaching of evolution in the 20th century might well spill over to the cognitive sciences in the years to follow. Paul Bloom is Professor of Psychology at Yale and author of How Children Learn the Meanings of Words (Learning, Development, and Conceptual Change). ________________________________________________________________ "How can we understand the fact that such complex and precise mathematical relations inhere in nature?" 
Of course this is one of the oldest philosophical questions in science but still one of the most mysterious. For most of Western history the canonical answer has been some version of Platonism, some variation on the essentially Pythagorean idea that the material universe has been formed according to a set of transcendent and a priori mathematical relations or laws. These relations/laws Pythagoras himself called the divine harmonia of the cosmos, and have often been referred to since as the "cosmic harmonies" or the "music of the spheres". For Pythagoras numbers were actually gods, and the quest for mathematical relations in nature was a quest for the divine archetypes by which he believed that matter had literally been in-formed. Throughout the age of science, and even today, most physicists seem to be Platonists. Many are even Pythagoreans, implicitly (if not always with much conscious reflection) making an association between the mathematical laws of nature and a transcendent being. The common association today of a "theory of everything" with "the mind of God" is simply the latest efflorescence of a two and a half millennia-old tradition which has always viewed physics as a quasi-religious activity. Can we get beyond Platonism in our understanding of nature's undeniable propensity to realize extraordinarily sophisticated mathematical relations? Although I began my own life in science as a Platonist I have come to believe that this philosophical position is insupportable. It is not a rationally justifiable position at all, but simply a faith. Which is fine if one is prepared to admit as much, something few physicists seem willing to do. To believe in an a priori set of laws (perhaps even a single law) by which physical matter had to be informed seems to me just a disguised version of deism -- an outgrowth of Judeo-Christianity wrapped up in scientific language. 
I believe we should do better than this, that we should articulate (and need to articulate) a post-Platonist understanding of the so-called "laws of nature." It is a far from easy task, but not an impossible one. Just as mathematician Brian Rotman has put forward a post-Platonist account of mathematics, we need to achieve a similar move for physics and our mathematical description of the world itself. Margaret Wertheim is a science writer and commentator and the author of The Pearly Gates of Cyberspace: A History of Space from Dante to the Internet. ________________________________________________________________ "Where Are They?" When Enrico Fermi asked his famous question (now known as the Fermi Paradox) more than fifty years ago -- if there is advanced extraterrestrial life, intelligence, and technology, why don't we see unmistakable evidence of it? -- it was the era of 60-megaton atmospheric bomb tests and broadcast television, with unlimited fusion power in plain sight. Now, we don't even have underground testing, TV has gone cable, wireless is going spread-spectrum, technology has grown microscopic, our children encrypt text with PGP and swap audio via MP3, and Wolfman Jack no longer broadcasts across the New Mexico desert at 50,000 watts. Fermi's question is still worth asking -- and may not be the paradox we once thought. George Dyson is a historian among futurists and the author of Darwin Among the Machines. ________________________________________________________________ "What is the nature of learning?" That question strikes me as being as infinitely perplexing and personal as, What's the meaning of life? But that's the beauty of its ambiguity, and the challenge: I enjoy grasping at its slippery complexity. Recent insights into the neural basis of memory have provided a couple of key pieces to the puzzle of learning.
The neuropsychological research on "elaborative encoding," for example, has shown that the long-term retention of information involves a spontaneous, connection-making process that produces web-like associative linkages of evocative images, words, objects, events, ideas, sensory impressions and experiences. Parallel insights have emerged from the exploratory work on learning that's being conducted in the fields of education and business, which involves constructing multi-dimensional symbolic models. The symbolic modeling process enables people to give form to their thoughts, ideas, knowledge, and viewpoints. By making tangible the unconscious creative process by which we use our tacit and explicit knowledge, the symbolic models help reveal what we think, how we think and what we remember. They represent our thought processes in a deep and comprehensive way, showing the different ways we use our many intelligences, styles of learning, and creative inquiry. In effect, the models demonstrate how people create things to remember, and remember things by engaging in a form of physical thinking. Underneath our layers of individuality lives a core of universal emotions that comprise a "global common language." This language of feelings and sensory impressions not only unites us as human beings, but also connects our creative process. It also enables us to generate ideas together, create new knowledge and transfer it, come to some deep shared understanding of ourselves or a given subject, and communicate this understanding across the various cultural, social and educational barriers that divide us. The studies on elaborative encoding provide some basic insights into how these symbolic models work as a kind of global common language, which people use to freely build on the things they already know and have an emotional connection with.
In short: the symbolic models open up other pathways to understanding the brainwork behind learning, remembering and the process by which we selectively apply what we learn when we create. As Dr. Barry Gordon of Johns Hopkins School of Medicine states, "What we think of as memories are ultimately patterns of connection among nerve cells." The Harvard psychologist Daniel Schacter arrived at a similar conclusion when examining the 'unconscious processes of implicit knowledge' and their relation to memory. Clearly, when our brains are engaged by information that, literally and figuratively speaking, "connects with us" (in more ways than one), we not only remember it better, but tend to creatively act on it as well. Symbolic modeling makes this fact self-evident. How can we improve the way we learn, and foster the learning process over a lifetime? How can we make the information we absorb daily more personally meaningful, purposeful and memorable? The answers lie in our connection-making process. This private act of creation is becoming increasingly public and apparent through functional MRI studies and other medical imaging techniques. Perhaps a more productive strategy for illuminating this connection-making process would be to combine these high-tech "windows" on the world of the mind with low-tech imaging tools, such as symbolic modeling. The combination of these tools would provide a more comprehensive picture of learning. Our ability, or inability, to learn seems to determine our happiness and well-being, not to mention the success we experience from realizing our potential. Understanding the conditions that galvanize great, memorable learning experiences will move us closer to understanding the creative engine that powers our individual and collective growth: learning. Todd Siler is the founder and director of Psi-Phi Communications and author of Think Like A Genius.
________________________________________________________________ "Will humankind be able to use its growing self-knowledge to overcome the biologically programmed instincts that could otherwise destroy it?" I am intrigued by the interplay between the following: 1) People always want a little bit more than they have. 2) The economic and political systems built on this instinct are conquering the world. 3) Yet there is no correlation between owning a little bit more and happiness. Instead, the long-term effect of everyone seeking to own a little bit more could be calamitous. Historically, religious figures have appealed to people to overrule their greed with a concern for some higher good. In our supposed scientific age, these arguments have lost their force. Instead our public affairs are governed by the idea that people should just be free as much as possible to choose what they want. But what if people are programmed to make choices that are not in their own best long-term interest? Suppose we discovered that what we instinctively thought would bring us happiness is an illusion created by our human-gene-built brains to induce human-gene-spreading behavior? Today's evolutionary psychologists provide compelling arguments why this picture might be accurate. A species programmed to acquire stuff might well spread itself successfully across the globe. But evolution is blind. It has no plan regarding what might happen to that species when the globe has been conquered. And in the meantime our genes don't give a damn about our happiness. For them it's just another propagation technology... perhaps made doubly efficient by ensuring the carrot is yanked away each time it comes within reach. To achieve true happiness we may need to be a great deal wiser than the loudest demons in our head would suggest. 
Will the new model of "Why We Are The Way We Are" finally convince us that our political and economic systems, and the assumptions on which they are based, are dangerously flawed? (The problem isn't just the economists' assumption that "greed is good", or the politicians' assumption that "growth is good". We've all been brought up to believe that "natural is good" -- as if it weren't the most natural thing in the world for a planet to self-destruct.) And how long will it take for the new ideas to have any impact? (What if it were to take 50 years? In an era of exponential growth, and accelerating technological change, can we afford even 10?) More generally, can memes that have evolved in a single generation countermand the influence of genes that evolved over millions of years? Chris Anderson is the incoming Chairman and Host of the TED Conference (Technology, Entertainment, Design) held each February in Monterey, California, and formerly a magazine publisher (Future Publishing). ________________________________________________________________ "If the medium is indeed the message, does (or can) the message define the medium?" (As a poet, I don't think I need to explicate the question.) Gerd Stern is a poet, media artist and cheese maven and the author of an oral history, From Beat Scene Poet to Psychedelic Multimedia Artist 1948-1978. ________________________________________________________________ "What is the nature of fads, fashions, crazes, and financial manias? Do they share a structure that can in turn be found at the core of more substantial changes in a culture? In other words, is there an engine of change to be found in the simple fad that can explain and possibly predict or accelerate broader changes that we regard as less trivial than "mere" fads? And more importantly, can we quantify the workings of this engine if we decide that it exists?"
I have shelves of books and papers by smart people who have brushed up against the edge of this question but who have seldom attacked it head on. I'm drawn to the question, and have been obsessed with it for years, because I think it's one of the big ones. It touches on everything humans do. Fashions and fads are everywhere: in things as diverse as food, furnishings, clothes, flowers, children's names, haircuts, body image, even disease symptoms and surgical operations. Apparently, even the way we see Nature and frame questions about it is affected to some extent by fashion; at least according to those who would like to throw cold water on somebody else's theory. (In the current discussion, Paul Davies says, "Of late, it is fashionable among leading physicists and cosmologists to suppose that alongside the physical world we see lies a stupendous array of alternative realities...") But the ubiquity of fads has not led to deep understanding, even though there are serious uses to which a working knowledge of fads could be put. A million children each year die of dehydration, often where rehydration remedies are available. What if rehydration became fashionable among those children's mothers? Public health officials have many times tried to make various behaviors fashionable. In promoting the use of condoms in the Philippines or encouraging girls in Africa to remain in school, they've reached for popular songs and comic books to deliver the message, hoping to achieve some kind of liftoff. Success has been real, but too often temporary or sporadic. Would a richer understanding of fads have helped them create better ones? In trying to understand these phenomena, writers have been engaged in a conversation that has spanned more than a hundred years. In 1895 Gustave Le Bon's speculations on "The Crowd" contained some cockeyed notions, and some that are still in use today.
Ludwik Fleck, writing on "The Evolution of a Scientific Fact" in the thirties, in part inspired Thomas Kuhn's writings on the structure of scientific revolutions in the sixties. Everett Rogers's books on the "Diffusion of Innovations" led to hundreds of other books on the subject and made terms like early adopters and agents of change part of the language. For several decades positive social change has been attempted through a practice called Social Marketing, derived in part from advertising techniques. Diffusion and social marketing models have been used extensively in philanthropy, often with success. But to my knowledge these techniques have not yet led to a description of the fad that's detailed and testable. Malcolm Gladwell was stimulating in identifying elements of the fad in The Tipping Point, but we are still left with a recipe that calls for a pinch of this and a bit, but not too much, of that. Richard Dawkins made a dazzling frontal assault on the question when he introduced the idea of memes in The Selfish Gene. The few pages he devoted to the idea have inspired a number of books and articles in which the meme is considered to be a basic building block of social change, including fads. But as far as I can tell, the meme is still a fascinating idea that urges us toward experiments that are yet to be done. Whether memes or some other formulation turns out to be the engine of fads, the process seems to go like this: a signal of some kind produces a response that in turn acts as a signal to the next person, with the human propensity for imitation possibly playing a role. This process of signal-response-signal might then spread with growing momentum, looking something like biological contagion. But other factors may also apply, as in Steven Strogatz's examination of how things sync up with one another. Or Duncan Watts's exploration of how networks of all kinds follow certain rules of efficiency. Or the way crowds panic in a football stadium or a riot.
Or possibly even the studies on the way traffic flows, including the backward-generated waves that cause mysterious jams. The patterns of propagation may turn out to be more interesting than anything else. Fads and fashions have not been taken very seriously, I think, for at least three reasons. They seem short-lived, they're often silly, and they seem like a break with normal, rational behavior. But as for being short-lived, the history of fads gives plenty of examples of fads that died out only to come back again and again, eventually becoming customary, including the use of coffee, tomatoes and hot chocolate. As for silliness, some fashions are not as silly as they seem. Fashions having to do with the length of one's hair seem trivial; yet political and religious movements have often relied on the prominence or absence of hair as a rallying symbol. And fads are far from aberrational. There are probably very few people alive who, at any one time, are not under the sway of a fad or fashion, if not dozens of them. And this is not necessarily a vacation from rational behavior on our part. On the contrary, it might be essential to the way we maximize the effectiveness of our choices. Two economists in California have developed a mathematical model suggesting that in following the lead of others we may be making use of other people's experience in a way that gives us a slightly higher chance of success in adopting a new product. The economists say this may explain a burst of popularity in a new product and possibly throw light on fads themselves. But another reason fads may not have been examined in more detail, and this could be the killer, is that at least for the moment they just seem too complicated. Trying to figure out how to track and explain change is one of the oldest and toughest of questions. Explaining change among people in groups is perhaps complex beyond measure, and may turn out to be undoable. It may forever be an art and not a science.
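The sequential-choice idea attributed to the two economists -- that imitating earlier adopters can be individually rational yet produce sudden bursts of popularity -- can be sketched as a toy simulation. The adoption rule, thresholds, and parameters below are my own simplifications for illustration, not the published model:

```python
import random

def simulate_fad(n_agents=50, p=0.7, seed=1):
    """Toy sequential-adoption model in the spirit of information-cascade
    explanations of fads: each agent receives a private signal that a new
    product is good (correct with probability p), observes all earlier
    choices, and imitates once the margin of adopters over rejecters
    exceeds 1 -- at which point private signals stop mattering and the
    fad locks in (or dies)."""
    random.seed(seed)
    adopts, rejects, choices = 0, 0, []
    for _ in range(n_agents):
        signal = 1 if random.random() < p else 0  # private evidence
        margin = adopts - rejects
        if margin > 1:
            choice = 1          # up-cascade: imitate, ignore own signal
        elif margin < -1:
            choice = 0          # down-cascade: reject, ignore own signal
        else:
            choice = signal     # no cascade yet: follow own signal
        choices.append(choice)
        adopts += choice
        rejects += 1 - choice
    return choices

choices = simulate_fad()
print("adoption rate:", sum(choices) / len(choices))
```

The point of the sketch is that once a cascade starts, the outcome is driven by the first few choices rather than by the quality of later agents' information -- which is one candidate explanation for why fads can be both sudden and fragile.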
But still, the humble fad is too tantalizing to ignore. We take it for granted and dismiss it, even while we're in the rapture of it. This commonplace thing that sits there like the purloined letter may or may not turn out to contain a valuable message for us, but it is staring us in the face. Alan Alda, an actor, writer and director, is currently playing Richard Feynman in the stage play QED at Lincoln Center in New York. ________________________________________________________________ "What comes after Science? When?" Questions? I don't ask questions. I ask answers, and then make up the questions as I see fit. I assemble vast collections of answers, and while finding the questions, I make connections in the process. These connections are new answers, and depending on my mood and how much time I have at my disposal, I set about finding questions for them as well. Often, if not usually, the question I find is: "Why on earth am I wasting my time on this (project du jour)?" Once in a great while, I'll find that something I've cooked up in my multi-media cauldron "fits" just right -- an appropriate gesture at a propitious moment, and it arrives with no explanation, no equation, no excuse, no reason, nothing -- it just sits there -- absolutely correct to itself in every possible way. The paradigm of Question/Answer doesn't really work in my world, as I've never really found Life, The Universe, and Everything (LU&E) and most (but not all) of its constituent parts and systems to be fundamentally amenable to it. From my research, I've come to a general conclusion that LU&E and most of its parts are fundamentally not knowable, or even humanly understandable in any linguistic or mathematical sense, except when framed in a more narrow set of terms, like "metaphor" or "pretend" or "just so". A dear friend of mine once noted: "Nobody knows and you can't find out" and I largely agree with him.
However, I can also say that, like being in the presence of a bucket of bricks, this is all more an experiential thing, more like a synchronistic aesthetic moment and less like a diachronistic or ahistorically definitive mathematical proposition or linguistically intelligible conclusion. So, one can't "know" it, nor can one "find out", but one can come to a sensibility that is convincing at the time and creatively informs one's behaviour and choices. Hence, the only justice in this life is poetic, and everything else is just some tweaky form of petty revenge or (more typically in this life of entertainment and cultural anaesthesiology) dodging bullets while one waits for the big storm to blow over. It can be infuriating (to me and most everyone else, it seems) when my work or research comes to such conclusions, but since when has there been some big carved-in-stone guarantee that it's supposed to make sense in the first place? Isn't a rational conclusion a bit presumptuous and arrogant? From what I can gather, it seems that the complete object of study fundamentally doesn't and shouldn't make sense (as sense seems to be a tiny subset surrounded by a vast multitude of complex forms of "nonsense"), and I see that not as a shortcoming on the part of the Universe so much as an indication of the limitations of human reason and the short time we get to spend on this planet. But all this is probably not what you wanted to hear, so here's a good question that's been bugging me for years, and if anyone wants to submit an answer, let me know -- I'm all ears... Mister Warwick asks: "What comes after Science? When?" Henry Warwick is an artist, composer, and scientist. ________________________________________________________________ "What is time, and what is the right language to describe change, in a closed system like the universe, which contains all of its observers?"
This is, I believe, the key question on which the quantum theory of gravity and our understanding of cosmology depend. We have made tremendous progress in recent years towards each goal, and we have come to the point where we need a new answer to this question to proceed further. The basic reason for this problem is that most notions of time, change and dynamics which physics, and science more generally, have used are background dependent. This means that they define time and change in terms of fixed points of reference which are outside the system under study and do not themselves change or evolve. These external points of reference usually include the observer and the clocks used to measure time. They constitute a fixed background against which time and change are defined. Other aspects of nature usually assumed to be part of the background are the properties of space, such as its dimensionality and geometry. General relativity taught us that time and space are parts of the dynamical system of the world, that do themselves change and evolve in time. Furthermore, in cosmology we are interested in the study of a system that by definition contains everything that exists, including all possible observers. However, in quantum theory, observers seem to play a special role, which only makes sense if they are outside the system. Thus, to discover the right quantum theory of gravity and cosmology we must find a new way to formulate quantum theory, as well as the notions of time and change, to apply to a system with no fixed background, which contains all its possible observers. Such a theory is called background independent. The transition from background dependent theories to background independent ones is a basic theme of contemporary science.
Related to it is the change from describing things in terms of absolute properties intrinsic to a given elementary particle, to describing things in terms of relational properties, which define and describe any part of the universe only through its relationships to the rest. In loop quantum gravity we have succeeded in constructing a background independent quantum theory of space and time. But we have not yet understood completely how to put the observer inside the universe. String theory, while it solves some problems, has not helped here, as it is so far a purely background dependent theory. Indeed, string theory is unable to describe closed universes with a positive cosmological constant, such as observations now favor. Among the ideas now in play that address this issue are Julian Barbour's proposal that time does not exist; Fotini Markopoulou's proposal to replace the single quantum theory relevant for observing a system from the outside with a whole family of quantum theories, each a description of what an observer might see from a particular event in the history of the universe; and 't Hooft's and Susskind's holographic principle. This last idea says that physics cannot describe precisely what is happening inside a region of space; instead we can only talk about information passing through the boundary of the region. I believe these are relevant, but none go far enough, and that we need a radical reformulation of our ideas of time and change. As the philosopher Peirce said over a century ago, it is fundamentally irrational to believe in laws of nature that are absolute and unchanging, and have themselves no origin or explanation. This is an even more pressing issue now, because we have strong evidence that the universe, or at least the part in which we live, came into existence just a few billion years ago. Were the laws of nature waiting around eternally for a universe to be created to which they could apply?
To resolve this problem we need an evolutionary notion of law itself, where the laws themselves evolve as the universe does. This was the motivation for the cosmological natural selection idea that Martin Rees was so kind as to mention. That is, as Peirce understood, the notions of evolution and self-organization must apply not just to living things in the universe, but to the structure of the universe and the laws themselves. Lee Smolin, a theoretical physicist, is a founding member of and research physicist at the Perimeter Institute in Waterloo, Canada, and author of Three Roads to Quantum Gravity. ________________________________________________________________ "Why doesn't conservation click?" Three decades ago I began my first career working on a British television series called "Survival". Unlike the current "Survivor" series (about the politics of rejection while camping out), these were natural history documentaries on a par with the best of National Geographic and Sir David Attenborough: early recordings of humpback whales, insights on elephant behavior, the diminishing habitats of mountain gorillas and orangutans, a sweeping essay on the wildebeest migration, and my favorite, an innovative look at the ancient baobab tree. In 2001 the "Survival" series died. It was a year when conservation efforts lagged across the board, along with other failures to take the long view. Survival programs may have told people what they could no longer bear to hear (that the human species is soiling its own den) without demonstrating constructive solutions. For example, there are precious few incentives to develop alternate energy sources despite the profound vulnerabilities that our dependence on foreign energy revealed yet again. We have no "Vision Thing," despite the many clues. "It's global warming, dude," a 28-year-old auto mechanic told The New York Times as he fished in the Hudson River; "I don't care if the whole planet burns up in a hundred years.
If I can get me a fish today, it's cool by me." Happily this provides a continuum to the question I posed at this forum in 1998: "If tragedy + time = comedy, what is the formula for equally therapeutic music? Do (Blues) musicians reach a third person perspective similar to that found in meditation, mind-altering drugs, and genius?" What I was reaching for with that third person perspective was a selfless overview. What I've since found is that healing dances of Native Americans and some African peoples follow the saga of a hero or heroine, much the way you or I listen to Bob Dylan or Bonnie Raitt and identify with their lyrics. While Carl Jung delved into the healing ritual archetype among many cultures, a new science called Biomusicology suggests even more ancient origins, tracing the inspiration for human music to natural sounds (the rhythm of waves lapping at the shore, rain and waterfalls, bird song, breathing, and our mother's heartbeat when we were floating in the womb). Songs of birds certainly influenced classical music, and the call and response patterns of birds were imitated in congregations and cotton fields, with shouts, which led to the Delta blues. The salubrious influence of music, including research by Oliver Sacks, is featured in a Discovery Channel program that I helped research. "The Power of Music" will be broadcast in 2002, as will Sir David Attenborough's new series on a similar theme, "Songs of the Earth." But will these programs inspire viewers to relinquish their SUVs for a hydrogen-powered car? How does one convince people to address global warming when most minds are focused on the economy or terrorism? Part one of this answer must include "An Ounce of Prevention." Richard A. Clarke, former White House director of counterterrorism, explained our ill preparedness for September 11 this way: "Democracies don't prepare well for things that have never happened before." Another senior analyst said,
"Unfortunately, it takes a dramatic event to focus the government's and public's attention." Finally, efforts to prevent hijackings have been responsive, rarely proactive. As we devise our New Year's Resolutions, how many of us will wait for a scare (a positive diagnosis) before we quit smoking, drinking or sitting on our duff? Year 2002 should be the time when conservationists not only demand action, but persuade people everywhere that the demise of wild places can and should be stopped, and that some of our forces of habit (unneeded air conditioning, for example) will eventually affect our quality of life in far more devastating ways. We need people to identify with the song lyrics of others, who may live in distant lands, and feel the brunt of global warming long before we do. But first we must learn to understand their language. In The Unbearable Lightness of Being, Milan Kundera wrote, "True human goodness, in all its purity and freedom, can come to the fore only when its recipient has no power. Mankind's true moral test, its fundamental test (which lies deeply buried from view), consists of its attitude toward those who are at its mercy: animals. And in this respect mankind has suffered a fundamental debacle, a debacle so fundamental that all others stem from it." Survival indeed. Delta Willis has searched for fossils alongside Meave and Richard Leakey, profiled physicists and paleontologists who draw inspiration from nature, and serves as chief contributor to the Fodor's Guide to Kenya & Tanzania. ________________________________________________________________ Why is it only amongst adults in the Western world that tradition has been so insistently and constantly challenged by the raising of Edge questions? Why do we ask Edge questions? Why do we ask Edge questions that challenge the "anesthesiology" of accepted wisdom and so the traditional answers we are given as to who and what we are?
In most societies, accepted wisdom is to be respected, not questioned, and who and what we are have long been decided by custom, elders, social betters and the sacred word of God. Moreover, why is it that the asking of Edge questions has only thrived and been encouraged in Western societies (with the help of such individuals as Socrates and the contributors to this Edge project)? Children, it should be noted, readily ask Edge-type questions. The problem is that they stop when they become adults -- except in the civilization (with a few ups and downs) that started in Classical Greece: Western civilization. "Are all our beliefs in gods, a myth, a lie foolishly cherished, while blind hazard rules the world?" That perhaps is the first Edge question (Euripides, Hecabe, lines 490-491) -- and importantly a question not raised safely in private but before a large audience. Indeed, Euripides raised it to gain public reward. Greek playwrights wrote plays for competitions that were judged by ten randomly selected members of the audience -- and given that Euripides wanted to win, he must have believed that the average Greek would be receptive to hearing this Edge question raised about the Gods. The public exploring of Edge questions is rare outside Western societies. Instead, "what was finally persuasive was appeal to established authority", and "the authority of tradition came to have more convincing effect than even direct observation and personal experience" (Robert Oliver, Communication And Culture In Ancient India And China, 1971). And as the Japanese scholar Hajime Nakamura noted, the Chinese "insisted that the traditional sacred books are more authoritative than knowledge based upon sense and inference" (Ways Of Thinking Of Eastern Peoples, 1964). Job might seem to be asking the Edge question "Why do the just suffer and the wicked flourish?"
But the story of Job is not about rewarding Edge questioning but about faith in the wisdom of God: "Who is this that darkens my counsel with words without knowledge?" This Edge question might be criticized as Eurocentric. But it was Western intellectuals who first asked the Edge question about whether one's own culture might be privileged falsely over others, and so invented the idea of ethnocentricity. So my Edge question is this: why is it only amongst adults in the Western world that tradition has been so insistently and constantly challenged by the raising of Edge questions? John R. Skoyles is a researcher in the evolution of human intelligence in the light of recent discoveries about the brain, who, while a first-year student at LSE, published a theory of the origins of Western Civilization in Nature. ________________________________________________________________ Paul Davies Responds Response to John McCarthy: John McCarthy asks how animal behavior is encoded in DNA. May I sharpen the question? One of the most remarkable manifestations of inherited behavior is the way birds navigate accurately whilst migrating over vast distances. I understand that part of this skill lies with the bird's ability to use the positions of stars as beacons. Does this imply that some avian DNA contains a map of the sky? Could a scientist in principle sequence the DNA and reconstruct the constellations? Response to Martin Rees's response to my question: Sir Martin Rees has eloquently outlined the key issues concerning the status of multiverse theories. I should like to make a brief response followed by a suggestion for further research. Sir Martin raises the question of whether what we consider to be fundamental laws of physics are in fact merely local bylaws applicable to the universe we perceive. Implicit in this assumption is the fact that there are laws of some sort anyway. By definition, a law is a property of nature that is independent of time.
We still need to explain why universes come with such time-independent lawlike features, even if a vast and random variety of laws is on offer. One might try to counter this by invoking an extreme version of the anthropic theory in which there are no laws, just chaos. The apparent day-by-day lawfulness of the universe would then itself be anthropically selected: if a crucial regularity of nature suddenly failed, observers would die and cease to observe. But this theory seems to be rather easily falsified. As Sir Martin points out, if a particular remarkable aspect of the laws is anthropically selected from a truly random set, then we would expect, on statistical grounds, the aspect concerned to be just sufficient to permit biological observers. Consider, then, the law of conservation of electric charge. At the atomic level, this law is implied by the assumed constancy of the fine-structure constant. (I shall sidestep recent claims that this number might vary over cosmological time scales.) Suppose there were no such fundamental law, and the unit of electric charge varied randomly from moment to moment. Would that be life-threatening? Not if the variations were small enough. The fine-structure constant affects atomic fine structure, not gross structure, so most chemical properties on which life as we know it depends are not very sensitive to the actual value of this number. In fact, the fine-structure constant is known to be constant to better than one part in a hundred million. A related quantity, the anomalous magnetic moment of the electron, is known to be constant to even greater accuracy. Variations several orders of magnitude larger than this would not render the universe hostile to carbon-based life. So the constancy of electric charge at the atomic level is an example of a regularity of nature far in excess of what is demanded by anthropic considerations. 
Even a multiverse theory that treated this regularity as a bylaw would need to explain why such a bylaw exists. I now turn to my meta-question of whether the multiverse might be no better than theism recast in modern scientific language. It is possible that this claim can be tested using a branch of mathematics known as algorithmic information theory, developed by Kolmogorov and Chaitin. This formalism offers a means to make Occam's Razor precise by quantifying the complexity of explanations. (Occam's Razor suggests that, all else being equal, we should prefer the simplest explanation of the facts.) On the question of how to explain certain fine-tuned, bio-friendly aspects of the universe, the crude response "God made it that way" is infinitely complex (and therefore very unsatisfying), because God might have made any one of an infinite number of alternative universes. Put differently, the selection set -- the "shopping list" of universes available to an omnipotent Deity -- contains an infinite amount of information, so the act of selection from this set involves discarding an infinite quantity of information. In the same way, the multiverse contains an infinite amount of information. In this case we observers are the selectors, but we still discard an infinite quantity of information by failing to observe the other universes. A proper mathematical parameterization of various multiverse theories and various theological models should enable this comparison to be made precise. Even if the two modes of explanation -- theistic and multiverse -- turned out to be mathematically equivalent, one might still argue (as Sir Martin has done) for the superiority of the multiverse theory on the grounds that the other universes, whilst not directly observable, are nevertheless strongly implied by extrapolation from the structure of our physical theories. 
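Davies's appeal to algorithmic information theory can be given a toy illustration. Kolmogorov complexity itself is uncomputable, but the length of a losslessly compressed description is a standard crude upper bound on it. The "universe descriptions" below are invented stand-ins chosen only to show the principle: a lawful, regular world admits a short description, while one whose charge varied randomly from moment to moment must spell out every value.

```python
import random
import zlib

def complexity_proxy(description: str) -> int:
    """Crude upper bound on Kolmogorov complexity:
    the byte length of the zlib-compressed description."""
    return len(zlib.compress(description.encode("utf-8"), 9))

# A lawful "universe description": the same regularity repeated.
regular = "electron charge = -1e " * 1000

# A lawless one: the charge differs at every moment, so the
# description must list each value explicitly.
random.seed(0)
noisy = " ".join(f"electron charge = {random.random():.6f}e" for _ in range(1000))

# The lawful description compresses far better.
print(complexity_proxy(regular) < complexity_proxy(noisy))  # True
```

This is only a compression heuristic, not a parameterization of multiverse or theological models, but it shows the sense in which "simpler explanation" can be made quantitative.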
But a theist would readily counter that the existence of God, whilst not directly observable, is nevertheless strongly implied by extrapolation from the nature of the world, human wisdom, mystical revelation, moral awareness, etc. I argued in my book The Mind of God that most attempts at ultimate explanations run into this "tower of turtles" problem: one has to start somewhere in the chain of reasoning, with a certain unproved given, be it God, mathematics, a physical principle, revelation, or something else. That is because of an implied dualism common to scientific and theistic explanations alike. In science the dualism is between states of the world and abstract laws. In theism it is between creature (i.e. the physical universe) and Creator. But is this too simplistic? Might the physical world and its explanation be ultimately indecomposable? Should we consider modes of description other than one based on linear reasoning from an unproved given, which after all amounts to invoking a magical levitating superturtle at the base of the tower? That is what I meant by the "Third Way" in my original question. ________________________________________________________________ Could our lack of theoretical insight into some of the most basic questions in biology in general, and consciousness in particular, be related to our having missed a third aspect of reality, which upon discovery will be seen to have always been there, as ordinary as space and time, but so far somehow overlooked in scientific descriptions? Is the arena of physics, constructed out of space and time, with matter/energy tightly interwoven with both, sufficient to fully describe all of our material world? The most fundamental debates in cognitive science take a firm "yes" for granted. The question of the nature of mind then leaves open only two options: either a form of reductionism, or a form of escapism. 
The latter option, a dualist belief in a separate immaterial mental realm, has fallen out of favor, largely because of the astounding successes of natural science. The former, reductionism, is all that is left, whether it is presented in a crude form (denial of consciousness as real or important) or in a fancier form (using terms like emergence, as if that had any additional explanatory power). The question I ask myself is whether there could not be another equally fundamental aspect to reality, on a par with space and time, and just as much part of the material world. Imagine that some tribe had no clear concept of time. Thinking only in terms of space, they would have a neat way to locate everything in space, and they would scoff at superstitious notions that somehow there is "something else", wholly other than space and the material objects contained therein. Of course they would see things change, but both during and after each change everything has its location, and the change would be interpreted as a series of purely spatial configurations. Yet such a geometric view of the world is not very practical. In physics and in daily life we use time just as fundamentally as we use space. Even though everything is already "filled up" with space, everything similarly participates in time. Trying to explain that to the people of the no-time tribe may be difficult. They will see the attempt at introducing time as trying to sneak in a second type of space, perhaps a spooky, ethereal space, more refined in some way, imbued with different powers and possibilities, but still a geometric something, since it is in these terms that they are trained to think. And they probably would see no need for such a parallel pseudo-space. In contrast, we do not consider time to be in any way less "physical" than space. Neither time nor space can be measured as such, but only through what they make possible: distances, durations, motion. 
While space and time are in some sense abstractions, and not perceivable as such, they are enormously helpful concepts in ordering everything that is perceivable into a coherent picture. Perhaps our problems in coming up with a coherent picture of mental phenomena tell us that we need another abstraction, another condition of possibility for phenomena in this world, this very material world we have always lived in. Could it be that we are like that tribe of geometers, and that we have so far overlooked a third aspect of reality, even though it may be staring us in the face? Greek mathematicians used time to make their mathematical drawings and construct their theories, yet they disregarded time as nonessential in favor of a Platonic view of unchanging eternal truths. It took two thousand years until Newton and Leibniz invented infinitesimal calculus, which opened the door for time to finally enter mathematics, thus making mathematical physics possible. To reframe my question: could our lack of theoretical insight into some of the most basic questions in biology in general, and consciousness in particular, be related to our having missed a third aspect of reality, which upon discovery will be seen to have always been there, as ordinary as space and time, but so far somehow overlooked in scientific descriptions? Although I don't know the answer, I suspect we will stumble upon it through a trigger that will come from engineering. Newton did not work in a vacuum. He built upon what Galileo, Descartes, Huygens and others had discovered before him, and many of those earlier investigations were triggered by concrete applications, in particular the construction of powerful cannons, which called for better ways to compute ballistic orbits. Another example is the invention of thermodynamics. It took almost two centuries for Newtonian mechanics to come to grips with time irreversibility. 
Of course, every physicist had seen how stirring sugar in a cup of tea is not reversible, but until thermodynamics and statistical mechanics came along, that aspect of reality had mostly been ignored. The engineering problems posed by the invention of steam engines were what forced a deeper thinking about time reversibility. Perhaps current engineering challenges, from quantum computers to robotics to attempts to simulate large-scale neural interactions, will trigger a fresh way of looking at the arena of space and time, perchance finding that we have been overlooking an aspect of material reality that has been quietly with us all along. Piet Hut, professor of astrophysics at the Institute for Advanced Study, in Princeton, is involved in the project of building GRAPEs, the world's fastest special-purpose computers. ________________________________________________________________ "Is the universe really expanding? Or: Did Einstein get it exactly right?" As I prepare to head for Cambridge (the Brits' one) for the conference to mark Stephen Hawking's 60th birthday, I know that the suggestion I am just about to make will strike the great and the good who are assembling for the event as my scientific suicide note. Suggesting time does not exist is not half as dangerous for one's reputation as questioning the expansion of the universe. That is currently believed as firmly as terrestrial immobility in the happy pre-Copernican days. Yet the idea that the universe in its totality is expanding is odd to say the least. Surely things like size are relative? With respect to what can one say the universe expands? When I put this question to the truly great astrophysicists of our day like Martin Rees, the kind of answer I get is that what is actually happening is that the intergalactic separations are increasing compared with the atomic scales. That's relative, so everything is fine. 
Some theoreticians give a quite different answer and refer to the famous failed attempt of Hermann Weyl in 1917 to create a genuinely scale-invariant theory of gravity and unify it with electromagnetism at the same time. That theory, beautiful though it was, never made it out of its cot. Einstein destroyed it before it was even published with the simple remark that Weyl's theory would make the spectral lines emitted by atoms depend on their prior histories, in flagrant contradiction to observation. Polite in public, Einstein privately called Weyl's theory 'geistreicher Unfug' [inspired nonsense]. Ever since that time it seems to have been agreed that, for some inscrutable reason, the quantum mechanics of atoms and elementary particles puts an absolute scale into physics. Towards the end of his life, still smarting from Einstein's rebuke, Weyl wrote ruefully "the facts of atomism teach us that length is not relative but absolute" and went on to bury his own cherished ambition with the words "physics can never be reduced to geometry as Descartes had hoped". I am not sure the Cartesian dream is dead, even though the current observational evidence for expansion from a Big Bang is rather impressive. The argument from quantum mechanics, which leads to the identification of the famous Planck length as an absolute unit, seems to me inconclusive. It must be premature to attempt definitive statements in the present absence of a theory of quantum gravity or quantum cosmology. And the argument about the relativity of scale being reflected in the changing ratio of the atomic dimensions to the Hubble scale is vulnerable. To argue this last point is the purpose of my contribution, which I shall do by a much simpler example, for which, however, the principle is just the same. Consider N point particles in Euclidean space. If N is greater than three, the standard Newtonian description of this system is based on 3N + 1 numbers. 
The 3N are used to locate the particles in space, and the extra 1 is the time. For an isolated dynamical system, such as we might reasonably conjecture the universe to be, three of the numbers are actually superfluous. This is because no meaning attaches to the three coordinates that specify the position of the centre of mass. This is a consequence of the relativity principle attributed to Galileo, although it was actually first cleanly formulated by Christiaan Huygens (and then, of course, brilliantly generalized by Einstein). The remaining 3N - 2 numbers constitute an oddly heterogeneous lot. One is the time, three describe orientation in space (but how can the complete universe have an orientation?), one describes the overall scale, and the remaining 3N - 7 describe the intrinsic shape of the system. The only numbers that are not suspect are the last: the shape variables. Developing further ideas first put forward by the great French mathematician Poincaré in his Science and Hypothesis of 1902, I have been advocating for a while a dynamics of pure shape. The idea is that the instantaneous intrinsic shape of the universe and the sense in which it is changing should be enough to specify a dynamical history of the universe. Let me spell this out for the celebrated three-body problem of Newtonian celestial mechanics. At each instant, the triangle formed by the three bodies has a shape that can be specified by two angles, i.e., just two numbers. These numbers are coordinates on the space of possible shapes of the system. By the 'sense' in which the shape is changing I mean the direction of change of the shape in this two-dimensional shape space. That needs only one number to specify it. So a dynamics of pure shape, one that satisfies what I call the Poincaré criterion, should need only three essential numbers to set up initial conditions. 
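Barbour's bookkeeping can be tabulated. The sketch below simply encodes the counts stated in the text (3N + 1 Newtonian numbers, minus 3 for the centre of mass, split into time, orientation, scale, and shape); the function name and dictionary keys are invented for illustration.

```python
def newtonian_dof(n: int) -> dict:
    """Barbour's count of the numbers describing N point particles.

    Newton uses 3N coordinates plus 1 time variable; dropping the 3
    centre-of-mass coordinates (Galilean relativity) leaves 3N - 2:
    1 time, 3 orientation, 1 overall scale, and 3N - 7 shape variables.
    """
    if n < 3:
        raise ValueError("shape variables only appear once N reaches 3")
    return {
        "newtonian_total": 3 * n + 1,
        "after_centre_of_mass": 3 * n - 2,
        "time": 1,
        "orientation": 3,
        "scale": 1,
        "shape": 3 * n - 7,
    }

# Three-body problem: the triangle's shape needs just two angles, and a
# dynamics of pure shape would add only the direction of change in shape
# space -- three essential numbers in all.
print(newtonian_dof(3)["shape"])  # 2
```

Note that for any N the four suspect kinds of number (time, orientation, scale) plus the shape variables add back up to the 3N - 2 total, which is the consistency the text relies on.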
That's the only ideal that, in Poincaré's words, would give the mind satisfaction. It's the ideal that inspired Weyl (though he attacked the problem rather differently). Now how does Newtonian dynamics fare in the light of the Poincaré criterion? Oddly enough, despite centuries of dynamical studies, this question hardly seems to have been addressed by anyone. However, during the last year, working with some N-body specialists, I have established that Newtonian mechanics falls short of the ideal of a dynamics of pure shape by no fewer than five numbers. Seen from the rational perspective of shape, Newtonian dynamics is very complicated. This is why the study of the Moon (which forms part of the archetypal Earth-Moon-Sun three-body problem) gave Newton headaches. Among the five troublemakers (which I won't list in full or discuss here), the most obstreperous is the one that determines the scale or size. The same five troublemakers are present for all systems of N point particles for N equal to or greater than 3. Incidentally, the reason why 3-body dynamics is so utterly different from 2-body dynamics is that shape only enters the picture when N reaches 3. Most theoretical physicists get their intuition for dynamics from the study of Newtonian 2-body dynamics (the Kepler problem). It's a poor guide to the real world. The point of adding up the number of the variables that count in the initial value problem is this. The Newtonian three-body problem can be expressed perfectly well in terms of ratios. One can consider how the ratios of the individual sides to the perimeter of the triangle change during the evolution. This is analogous to following the evolution of the ratio of the atomic radii to the Hubble radius in cosmology. To see if scale truly plays no role, one must go further. One must ask: do the observable ratios change in the simplest way possible, as dictated by a dynamics of pure shape, or is the evolution more complicated? That is the acid test. 
If it is failed, absolute scale is playing its pernicious role. The Poincaré criterion is an infallible test of purity. Both Newtonian dynamics and Einstein's general relativity fail it. The fault is not in quantum mechanics but in the most basic structure of both theories. Scale counts. In fact, seen from this dynamical perspective, Einstein's theory is truly odd. As James York, one of John Wheeler's students in Princeton, showed 30 years ago (in a beautiful piece of work that I regard as the highest point achieved to date in dynamical studies), the most illuminating way to characterize Einstein's theory is that it describes the mutual interaction of infinitely many degrees of freedom representing the pure shape of the universe with one single solitary extra variable that describes the instantaneous size of the universe (i.e., its 3-dimensional volume in the case of a closed universe). From Poincaré's perspective, this extra variable, to put it frankly, stinks, but the whole of modern cosmology hangs on it: it is used to explain the Hubble red shift. There, I have stuck my neck out in good Popperian fashion. Current observations suggest I will have my head chopped off and Einstein will be vindicated. Certainly all the part of his theory to do with pure shape is philosophically highly pleasing and is supported by wonderful data. But even if true dynamical expansion is the correct explanation of the Hubble red shift, why did nature do something so unaesthetic? As I hope to show very shortly on the Los Alamos bulletin board, a dynamics of pure shape can mimic a true Hubble expansion. The fact is that Einstein's theory allows red shifts of two kinds: one is due to stretching (expansion) of space, while the other is the famous gravitational red shift that makes clocks on the Earth run slower than clocks in satellites by a now observable amount. It is possible to eliminate scale from Einstein's theory, as Niall O'Murchadha and I have shown. 
This kills the stretching red shift but leaves the other intact. It is just possible that this could explain the Hubble red shift. Let me conclude this possibly premature (but, I feel, justified, since all dogmas need to be challenged) contribution by pointing out that according to the standard Big-Bang scenario two things have been happening simultaneously since something lit the fuse: the universe has been expanding from an extraordinarily uniform and isotropic compressed state, and it has simultaneously been getting more and more clumpy. Inflationists claim to have explained why we observe such a uniform Big Bang, but sceptics (myself included) have the uncomfortable feeling that an observational cosmic coincidence is merely being described, rather than explained, by theoretical fine tuning of an adjustable parameter. In a self-respecting universe that dismisses size, as opposed to shape, as a fiction, sharper predictions must be possible. In a dynamics of pure shape, the only thing that can happen is change of shape. That must explain the Hubble red shift. Merely by observing the rate at which matter and the universe in general become more clumpy, above all the rate of formation of gravitationally collapsed objects, astronomers ought to be able to predict the value of the Hubble constant. So my challenge to the theoreticians is this: Are you absolutely sure Einstein got it exactly right? Prove me wrong in my hunch that the universe obeys a dynamics of pure shape subtly different from Einstein's theory. If size does count, why should nature do something so puzzling to the rational mind? Julian Barbour is an independent theoretical physicist and author of The End of Time. ________________________________________________________________ "When will we emerge from the quantum tunnel of obscurity?" Can contradictory things happen at the same time? Does the universe continue about its business when we're not looking at it? 
These questions have been raised in the context of quantum mechanics ever since the theory was formulated in the 1920s. While most physicists dismissed these issues as "just philosophical", a small minority (inspired by the examples of Louis de Broglie, Albert Einstein and Erwin Schroedinger) continued to question the meaning of the most successful theory of science, and often suffered marginalisation and even ridicule. It is one thing to apply quantum mechanics to calculate atomic energy levels or the rate at which atoms emit light. But as soon as one asks what is actually happening during an atomic transition, quantum mechanics gives no clear answer. The Copenhagen interpretation, forged by Niels Bohr and Werner Heisenberg, emphasises the subjective experience of "observers" and avoids any description of an objective reality; it talks about the chances of different outcomes occurring in a measurement, but does not say what causes a particular outcome to occur. For decades, students have been taught to avoid asking probing questions. An attitude of "shut up and calculate" has dominated the field. The result is widespread confusion, and a strange unwillingness to ask clear and direct questions. As the late cosmologist Dennis Sciama once put it, whenever the subject of the interpretation of quantum mechanics comes up, "the standard of discussion drops to zero". The publication of John Bell's book Speakable and Unspeakable in Quantum Mechanics in 1987 provided a point of reference for a change in attitude that gained real momentum in the 1990s. Bell spearheaded a movement to purge physics of some inherently vague notions inherited from the founding fathers of quantum mechanics. For instance, the "measurement apparatus" was treated by Bohr and Heisenberg as something fundamentally distinct from the "system being measured": the latter was subject to the laws of quantum mechanics whereas the former was not. 
But if everything -- including our equipment -- is made of atoms, how can such a distinction be anything more than an approximation? In reality everything -- "system", "apparatus", even human "observers" -- should obey the same laws of physics. The clarity of Bell's writings forced many people to confront the uncomfortable fact that quantum mechanics as usually formulated had a problem explaining why we see definite events taking place. Bell advertised what he saw as two promising avenues to resolve the quantum paradoxes: the theory must be supplemented either with a new random process that selects outcomes (the "dynamical reduction of the state vector") or with extra "hidden variables" whose unknown values select outcomes. Theories of both types have been constructed. Indeed, a correct hidden-variables theory was written down by Louis de Broglie as long ago as 1927, and was shown by David Bohm in 1952 to account completely for quantum phenomena. The de Broglie-Bohm theory gave an objective account of quantum physics; yet, until about 10 years ago, most physicists had not heard of it. Today, many have heard of it, but still very few understand it or work on it. And it is still not taught to students (even though in my experience many students would love to know more about this theory). One wonders where things will go from here. On the one hand, in the last five years the subject of the interpretation of quantum mechanics has suddenly become more respectable thanks to the rising technology of quantum information and computation, which has shown that something of practical use -- novel forms of communication and computation -- can emerge from thoughts about the meaning of quantum mechanics. But on the other hand, there is a danger that the problem of the interpretation of quantum mechanics will be pushed aside in the rush to develop "real" technological applications of the peculiarities of quantum phenomena. 
The rise of quantum information theory has also generated a widespread feeling that "information" is somehow the basic building block of the universe. But information about what? About information itself? As noted by P.W. Anderson in a recent Edge comment on Seth Lloyd, not only does it seem unjustified to claim that "information" is the basic stuff of the universe: worse, an unfortunate tendency has developed in some quarters to regard the theory of information as the only really fundamental area of research. Personally, I find quantum information theory very interesting, and it has without doubt enriched our understanding of the quantum world; but I fear that in the long run its most enthusiastic practitioners may lead us back to the vague subjectivist thinking from which we were only just emerging. Antony Valentini is a theoretical physicist at Imperial College in London. ________________________________________________________________ "How does being able to learn about a changing world endow our minds with expectations, imagination, creativity, and the ability to perceive illusions?" When you open your eyes in the morning, you usually see what you expect to see. Often it will be your bedroom, with things where you left them before you went to sleep. What if you opened your eyes and found yourself in a steaming tropical jungle? Or a dark cold dungeon? What a shock that would be! Why do we have expectations about what is about to happen to us? Why do we get surprised when something unexpected happens to us? More generally, why are we Intentional Beings who are always projecting our expectations into the future? How does having such expectations help us to fantasize and plan events that have not yet occurred? How do they help us to pay attention to events that are really important to us, and spare us from being overwhelmed by the blooming buzzing confusion of daily life? 
Without this ability, all creative thought would be impossible, and we could not imagine different possible futures for ourselves, or our hopes and fears for them. What is the difference between having a fantasy and experiencing what is really there? What is the difference between illusion and reality? What goes wrong when we lose control over our fantasies and hallucinate objects and events that are not really there? Given that vivid hallucinations are possible, especially in mental disorders like schizophrenia, how can we ever be sure that an experience is really happening and is not just a particularly vivid hallucination? If there is a fundamental difference between reality, fantasy, and illusion, then what is it? Recent models of how the brain controls behavior have begun to clarify how the mechanisms that enable us to learn quickly about a changing world throughout life also embody properties of expectation, intention, attention, illusion, fantasy, hallucination, and even consciousness. I never thought that during my own life such models would develop to the point that the dynamics of identified nerve cells in known anatomies could be quantitatively simulated, along with the behaviors that they control. During the last five years, ever more precise models of such brain processes have been discovered, including detailed answers to why the cerebral cortex, which is the seat of all our higher intelligence, is organized into layers of cells that interact with each other in characteristic ways. Although an enormous amount of work still remains to be done before such insights are fully developed, tested, and accepted, the outlines already seem clear of an emerging theory of biological intelligence, and with it, the scaffold for a more humane form of artificial intelligence. 
Getting a better understanding of how our minds learn about a changing world, and of how to embody their best features in more intelligent technologies, should ultimately have a transforming effect on many aspects of human civilization. Stephen Grossberg is a Professor of Cognitive and Neural Systems, Mathematics, Psychology, and Engineering at Boston University. ________________________________________________________________ "How will computation and communication change our everyday lives, again?" The actual day-to-day things that we do have been changed drastically for many people in the world over the last twenty years by the arrival of personal computers. We spend hours each day in front of a screen, typing. This was not the norm twenty years ago (although a few of us did it even then), and no one had access to the vast stores of information that are available to us on our laps now. We no longer ask for reprints or go to the library, but instead download PDF versions of papers that interest us. We no longer need to go to reference works but instead retrieve them directly on our PCs. The number of people that we correspond with has increased dramatically -- granted, the medium has changed too. And chatting on the phone to people on the other side of the world is no longer expensive or an event -- it is just as common and cheap as calling someone a hundred miles away. Our interaction with media is changing too -- it is becoming more and more pull rather than push, even for TV and radio entertainment -- we choose when and where we want to receive it, and how we will store it. Surprisingly, neither the book, nor the movie, nor the documentary is dead. There are more of them, in fact, although the method of delivery is slowly changing. We have increased our number of options rather than supplanted the old ones. Moore's law and the increase of telecommunications infrastructure are both continuing. 
What new options should we expect, and how will they change the way we work? What will be the next "web", as unimagined by most educated people today as our current one was in 1988? And what will be the impact of the new methods of delivery we can expect to be developed in the next 20 years? Already tens of thousands of people have cochlear implants with direct electronic-to-neural connections to restore their hearing. Multiple groups are working on retinal implants, either into the eyeball, or interfacing to V1 at the back of the head, again to replace lost capabilities such as those resulting from macular degeneration. A few quadriplegics have direct neural connections to computer interfaces so that they can control a mouse and even type. As progress is made with these silicon/neural interfaces, pushed along by clinical pressures to cure those who are impaired, we can expect more and more "plastic surgery" applications. A direct neural typing interface first perhaps, and later data going the other way, directly from the network into our brains. There are considerable challenges to be met in understanding neural "coding" to do this, but the clinical imperative is pushing this work along. How will we all be in the world then, 20 years from now say, when we all have direct wireless connections to the Internet of that time, with information services as yet unimaginable? How will our grandchildren's interaction with information change the way they work and think, in the same way that instant messaging and vast numbers of web pages have changed the way our children in elementary and high school operate today? Rodney Brooks is Director of the MIT Artificial Intelligence Laboratory, and Fujitsu Professor of Computer Science. He is also Chairman and Chief Technical Officer of IS Robotics. ________________________________________________________________ "Would an extra-terrestrial civilization develop the same mathematics as ours? 
If not, how could theirs possibly be different?"

In writing my next book, about maths, I have been led to ponder this question by the fact that there are philosophers, and a few mathematicians, who believe that it is conceivable that there could be intelligences with a fully developed mathematics that does not, for example, recognize the integers or the primes, let alone Fermat's Last Theorem or the Riemann Hypothesis. And yet whole numbers seem to us such a basic property of "things" that, unless there were intelligences that were not embodied in any way (and/or couldn't "see" the discrete stars, for example), they would be bound to come across number and all that follows. But then, I suppose you could imagine intelligent beings which consisted, say, of density differences in a gas but lacked boundaries separating one from another. In any case, if such creatures do exist, it rather pours cold water on the use by SETI of maths (e.g. prime x prime pictorial grids) to communicate with them.

Karl Sabbagh is a writer and television producer and author of A Rum Affair: A True Story of Botanical Fraud.

________________________________________________________________

"Why do we fear the wrong things?"

A mountain of research shows that our fears correlate only modestly with reality. With images of September 11th lingering in their mind's eye, many people dread flying to Florida for Spring break, but will instead drive there with confidence -- though, mile per mile, driving during the last half of the 1990s was 37 times more dangerous than flying. Will yesterday's safety statistics predict the future? Even if not, terrorists could have taken down 50 more planes with 60 passengers each and -- if we'd kept flying -- we'd still have ended last year safer on commercial flights than on the road. Flying may be scary, but driving the same distance should be many times scarier.
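The mile-for-mile comparison is just a ratio of two fatality rates, each normalised by distance travelled. A minimal sketch of that arithmetic, using purely illustrative figures (not the essay's actual data sources -- the numbers below are hypothetical and chosen only to show the computation):

```python
def deaths_per_billion_miles(deaths: float, billions_of_miles: float) -> float:
    """Fatality rate normalised by distance travelled."""
    return deaths / billions_of_miles

# Hypothetical annual figures for illustration only:
driving_rate = deaths_per_billion_miles(deaths=41_000, billions_of_miles=2_600)
flying_rate = deaths_per_billion_miles(deaths=90, billions_of_miles=210)

# Relative risk, mile for mile
print(round(driving_rate / flying_rate, 1))  # -> 36.8
```

The point of the normalisation is that raw death counts mislead: driving kills more people partly because vastly more miles are driven, so only the per-mile rates are comparable.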
Our perilous intuitions about risks lead us to spend in ways that value some lives hundreds of times more than other lives. We'll now spend tens of billions to calm our fears about flying, while subsidizing tobacco, which claims more than 400,000 lives a year. It's perfectly normal to fear purposeful violence from those who hate us. But with our emotions now calming a bit, perhaps it's time to check our fears against facts. To be prudent is to be mindful of the realities of how humans suffer and die.

David G. Myers is a social psychologist at Hope College (Michigan) and author of The Pursuit of Happiness.

________________________________________________________________

"Are the laws of nature a form of computer code that needs and uses error correction?"

John D. Barrow is Research Professor of Mathematical Sciences, University of Cambridge and author of Between Inner Space and Outer Space.

________________________________________________________________

"Can we ever escape our past, or are we doomed to a future of biobabble?"

In mid-November 1999, New Yorker writer Rebecca Mead published a commentary on the candidacy of Al Gore, and in it she gave us a new word. In the old days, candidates were advised in a pseudo-Freudian frame. Clinton, in pre-Monica times, was told to emphasize his role as "strong, assertive, and a good father." Now, however, this psychobabble has been eclipsed by what she called biobabble, and Mead recommended that Gore's advice might best be based on evolutionary psychology instead of Freud. In other words, it wasn't your parents who screwed you up, it was the ancient environment. Mead cites Sarah Hrdy, a primatologist, as suggesting that the ideal presidential leader would be a grandma whose grandchildren were taken away and scattered across the country in secret locations.
Then the president could be expected to act on behalf of the general good, to maximize her reproductive fitness. No wonder Gore wasn't appointed. This is déjà vu all over again, and after the last century of biopolicy in action, can we still afford to be here? Somehow we can't get away from a fixation on the link between biology and behavior. A causal relationship was long championed by the Mendelian Darwinians of the Western World, as breeding and sterilization programs to get rid of the genes for mental deficiencies became programs to get rid of the genes for all sorts of undesirable social behaviors, and then programs to get rid of the undesirable races with the imagined objectionable social behaviors. Science finally stepped back from the abyss of human tragedy that inevitably ensued, and one result was to break this link by questioning whether human races are valid biological entities. By now, generations of biological anthropologists have denied the biology of race. Arguing that human races are socially constructed categories and not biologically defined ones, biological anthropologists have been teaching that if we must make categories for people, "ethnic group" should replace "race" in describing them. The public has been listening. This is how the U.S. census came to combine categories that Americans base on skin color ("African-American," delineated by "one drop of blood") with categories based on language ("Latino"). However, ethnic groups revitalize the behavioral issue, because ethnicity and behavior are indeed related -- not by biology, but by culture. This relationship is implicitly accepted as the grounds for the profiling we have heard so much about of late, but here is the rub. Profiling has accomplished more than just making it easier to predict behaviors; it has actually revitalized the issue of biology and behavior by bringing back "race" as a substitute for "ethnic group."
This might well have been an unintended consequence of using "race" and "ethnic group" interchangeably, because this usage forged a replacement link between human biology and human culture. Yet however it happened, we are back where we started, toying with the notion that human groups defined by their biology differ in their behavior. And so, how do we get out of this? Can we? Or does the programming that comes shrink-wrapped with our state-of-the-art hardware continue to return our thinking to this point because of some past adaptive advantage it brought? It doesn't seem very advantageous right now.

Milford H. Wolpoff is Professor of Anthropology at the University of Michigan and author (with Rachel Caspari) of Race and Human Evolution: A Fatal Attraction.

________________________________________________________________

"How different could life have been?"

Physicists, including several in this group, are fond of asking, "What if the universe had been different?" Are the fundamental constants just numbers we accept as given, but which could have been different? Or is there some deeper rationale, which we shall eventually discover, that renders them unfree to change? Is our universe the way universes have to be? Or is it one of a huge ensemble of universes? Given present company, I would not aspire to this question, fascinating as it is. Mine is its biological little brother. Is the life that we observe the way life has to be? Or could we imagine other kinds of life? Long the stock in trade of science fiction, this question is one I want to move closer to science's domain. Unfortunately it is a question for a chemist -- which I am not. My hope is that chemists will listen, and work on it. Life as we know it is far more uniform than it superficially appears. The differences between an elephant and an amoeba are superficial. Biochemically speaking, we are all playing most of the same tricks. At this level, most of the variation in life is to be found among the bacteria.
We large animals and plants have just specialised in a few of the tricks that bacterial R & D developed in the Precambrian. But all living things, bacteria included, practise the same fundamental tricks. Using the universal DNA code, the one-dimensional sequence of DNA codons specifies the one-dimensional sequence of amino acids in proteins. This determines the proteins' three-dimensional coiling, which specifies their enzymatic activity, and this, in turn, specifies almost everything else. So, I'm not talking about whether living things on other planets will look like us, or will have television aerials sticking out of their heads. It is easy to predict that heavy planets with high gravitational fields will breed elephants the size of flies (or flies built like elephants); light planets will grow elephant-sized flies with spindly legs. It is easy to predict that, where there is light, there will be eyes. This is not what I am talking about. I want the answer to a more fundamental question. My question, which is for chemists, is this. Can you devise a fundamentally different, alternative biochemistry? Given that, as I firmly believe, life all over the universe must have evolved by the differential survival of something corresponding to genes -- self-replicating codes whose nature influences their own long-term survival -- do they have to be strung along polynucleotides? The genetic code itself almost certainly didn't have to be the one we actually have -- plenty of other codes would have done the job. Ours is a frozen accident which, once crystallised, could not change. But can you think of a completely different kind of molecule, not a polynucleotide at all, perhaps not even organic, which could do the coding? Does it have to be digital like the DNA/RNA code, or could some kind of analogue code be accurate and stable enough to mediate evolution? Does it even have to be a one-dimensional code?
And is there any other class of molecules that could step into the shoes of proteins? Biochemists, please stop focusing exclusively on the way life actually is. Think about how life might have been, or how life could be on other worlds. Channel your creativity to devising a complete, alternative biochemistry, whose components are radically different from the ones we know, but are at the same time mutually compatible -- participants in a wholly consistent system which your chemical calculations show could actually work. Why should we want this? I wanted to ask the question, "Is there life on other worlds, and how similar is it to the life we know?" But there is no immediate prospect of our receiving direct answers to these questions, and I am pessimistic about our ever doing so. Life has probably arisen more than once, but on islands in space too widely scattered to make a meeting likely. Theoretical calculations may be our best hope, and are certainly our most immediate hope, of at least estimating the probabilities. There's also the point, which hardly needs making on Edge, that to seek the unfamiliar is a good way to illuminate oneself.

Reply to Paul Davies's response to John McCarthy:

Paul Davies notes that some night-migrating birds navigate by the stars, and asks whether avian DNA contains a map of the sky. "Could a scientist in principle sequence the DNA and reconstruct the constellations?" Alas, no. Stephen Emlen, of Cornell University, researched the matter in 1975. He placed Indigo Buntings in a circular cage in the centre of a planetarium, and measured their fluttering against different sides of the cage as an indicator of their preferred migratory direction. By manipulating the star patterns in the planetarium, blotting out patches of sky and so on, Emlen showed that the buntings did indeed use Polaris as their North, and that they recognized it by the surrounding pattern of constellations. So far so good. Now comes the interesting part.
Is the pattern of stars built into the birds' DNA, or is there some other, more general way to define the north (or south) pole of the heavens? Put it like that, and the point jumps out at you: the polar position in the sky can be defined as the centre of rotation! It is the hub that stays still, while the rest of the heavens turn. Did the birds use this as a rule for learning? Emlen reared young buntings in the planetarium, giving them experience of different artificial "night skies". Half of them, the controls, experienced a night sky that rotated about Polaris, as usual. The other half, the experimental birds, experienced a night sky in which the centre of rotation was Betelgeuse. The control birds ended up steering by Polaris, as usual. But the experimental birds, mirabile dictu, came to treat Betelgeuse as though it were due north. Clever, or what?

Richard Dawkins is an evolutionary biologist and the Charles Simonyi Professor For The Understanding Of Science at Oxford University. He is the author of Unweaving the Rainbow.

________________________________________________________________

"How are moral assertions connected with the world of facts?"

Unlike many ancient philosophical problems, this one has, paradoxically, been made both more urgent and less tractable by the gradual triumph of scientific rationality. Indeed, the prevailing modern attitude towards it is a sort of dogmatic despair: "you can't get an ought from an is, therefore morality must be outside the domain of reason". Having fallen for that non-sequitur, one has only two options: either to embrace unreason, or to try living without ever making a moral judgement. In either case, one becomes a menace to oneself and everyone else. On the tape of the bin Laden dinner party, a participant states his belief that during the September 11 attack, Americans were afraid that a coup d'état was under way. Worldwide, tens of millions of people believe that the Israeli secret service carried out the attack.
These are factual misconceptions, yet they bear the imprint of moral wrongness just as clearly as a fossil bears the imprint of life. This illustrates an important strand in the fabric of reality: although factual and moral assertions are logically independent (one cannot deduce either from the other), factual and moral explanations are not. There is an explanatory link between ought and is, and this provides one of the ways in which reason can indeed address moral issues. Jacob Bronowski pointed out that a commitment to discovering scientific truth entails a commitment to certain values, such as tolerance, integrity, and openness to ideas and to change. But there's more to it than that. Not only scientific discovery, but scientific understanding itself can depend on one's moral stance. Just look at the difficulty that creationists have in understanding what the theory of evolution says. Look at the prevalence of conspiracy theories among the supporters of bad causes, and how such people are systematically blind to rational argument about the facts of the matter. And, conversely, look at Galileo, whose factual truth-seeking forced him to question the Church's moral authority. Why does this happen? We should not be surprised -- at least, no more surprised than we are that, say, scientific and mathematical explanations are connected. The truth has structural unity as well as logical consistency, and I guess that no true explanation is entirely disconnected from any other. In particular, in order to understand the moral landscape in terms of a given set of values, one needs to understand some facts as being a certain way too, and vice versa. Moreover, I think it is a general principle that morally right values are connected in this way with true factual theories, and morally wrong values with false theories. What sort of principle is this? Though it refers to morality, at root it is epistemological.
It is about the structure of true explanations, and about the circumstances under which knowledge can or cannot grow. This, in turn, makes it ultimately a physical fact -- but that is another story.

David Deutsch, a physicist, is a member of the Centre for Quantum Computation at the Clarendon Laboratory, Oxford University, and author of The Fabric of Reality.

________________________________________________________________

"Why is beauty making a comeback now?"

My hypothesis is that the modernist/post-modernist idea that beauty is a social construct (with no deep bedrock in reality) is dead. There are an increasing number of books coming out propounding the notion that beauty is real and crosses all sorts of cultural and historic lines. In their view, that which unites us as a species in the perception of beauty is way larger than what divides us. My big question is whether, in a disjointed world in which the search for meaning is becoming ever more important, the existence of widely agreed-upon ideas of beauty will increasingly become a quick and useful horseback way of determining whether or not *any* complex system, human or technological, is coherent. This idea draws in part from pre-industrial-age definitions of beauty that held that "Beauty is truth, truth beauty -- that is all ye know on earth, and all ye need to know" (Keats, 1820), and, most important, "The most general definition of beauty....multeity in unity" (Coleridge, 1814). Interestingly enough, the idea that I view as increasingly dumb, "Beauty is in the eye of the beholder," dates, according to Bartlett's, only to 1878 -- which is about when the trouble started, in my view.

Joel Garreau is the cultural revolution correspondent of The Washington Post and author of Edge City.

________________________________________________________________

"Do wormholes exist?"

Two startling ideas about wholly different classes of objects emerged from general relativity: black holes and wormholes.
For over half a century black holes have grown in importance, with many convincing candidates in the sky and a vast range of theoretical support. Wormholes, first called "Einstein bridges" when they emerged in the 1930s, have not met with nearly the attention they deserve. We still don't know if any were made in the early universe. That seems by far the easiest way to find one