From checker at panix.com Thu Dec 1 23:46:37 2005 From: checker at panix.com (Premise Checker) Date: Thu, 1 Dec 2005 18:46:37 -0500 (EST) Subject: [Paleopsych] NYT: Hooked on the Web: Help Is on the Way Message-ID: Hooked on the Web: Help Is on the Way http://www.nytimes.com/2005/12/01/fashion/thursdaystyles/01addict.html

[Checklist appended. I really don't meet that many criteria for addiction, but I have been spending way too much time on the Net and find it difficult to stop. Since Sunday, that is, before this article (Thursday), I have been sending only articles that I have read and added comments on. Several people have told me that my comments are better than the articles themselves. I've also worked up some memes. This is why you've gotten very little from me lately.

[I have three more memes planned for the near future: 1. What it would take for me to give up my three most cherished hypotheses (non-design, importance of gene-culture co-evolution in man right up to the present, inability to nail down our most important concepts). The last is proving very difficult for me to write up. Please send me your own answer to my question. I suggested it to the World Question Center at http://edge.org. 2. Why I am not a Christian (follow-up to Bertrand Russell's essay). This will be easy, since I only need to organize it. 3. A piece on conspiracies (including the by-far most talked-about one of the last century, which is never called one). The major problem is that I'd like to say that conspiracy thinking is our default Stone Age sociology, since back then the chain of consequences was short enough that what resulted came about by the actions of just a few people. However, I've also read that looking for an active agent is a Western particularity. And again, that conspiracies today are viewed as ones without conspirators. I need to reconcile these viewpoints. And I'll talk about how one might judge the plausibility of various conspiracy candidates.

[For those of you on my lists, I *can* still send articles that I don't read or read and don't comment on. Almost anything that I might send and comment on does go onto my disk space on the UNIX mainframe in Manhattan that houses Panix.com. But I'd rather have you all start working for *me* and help me with my two main projects, namely, "deep culture change" and "persistence of difference."]

------------

By SARAH KERSHAW

REDMOND, Wash. THE waiting room for Hilarie Cash's practice has the look and feel of many a therapist's office, with soothing classical music, paintings of gentle swans and colorful flowers and, on the bookshelves, stacks of brochures on how to get help. But along with her patients, Dr. Cash, who runs Internet/Computer Addiction Services here in the city that is home to Microsoft, is a pioneer in a growing niche in mental health care and addiction recovery. The patients, including Mike, 34, are what Dr. Cash and other mental health professionals call onlineaholics. They even have a diagnosis: Internet addiction disorder. These specialists estimate that 6 percent to 10 percent of the approximately 189 million Internet users in this country have a dependency that can be as destructive as alcoholism and drug addiction, and they are rushing to treat it. Yet some in the field remain skeptical that heavy use of the Internet qualifies as a legitimate addiction, and one academic expert called it a fad illness.
Skeptics argue that even obsessive Internet use does not exact the same toll on health or family life as conventionally recognized addictions. But, mental health professionals who support the diagnosis of Internet addiction say, a majority of obsessive users are online to further addictions to gambling or pornography or have become much more dependent on those vices because of their prevalence on the Internet. But other users have a broader dependency and spend hours online each day, surfing the Web, trading stocks, instant messaging or blogging, and a fast-rising number are becoming addicted to Internet video games. Dr. Cash and other professionals say that people who abuse the Internet are typically struggling with other problems, like depression and anxiety. But, they say, the Internet's omnipresent offer of escape from reality, affordability, accessibility and opportunity for anonymity can also lure otherwise healthy people into an addiction. Dr. Cash's patient Mike, who was granted anonymity to protect his privacy, was at high risk for an Internet addiction, having battled alcohol and drug abuse and depression. On a list of 15 symptoms of Internet addiction used for diagnosis by Internet/Computer Addiction Services, Mike, who is unemployed and living with his mother, checked off 13, including intense cravings for the computer, lying about how much time he spends online, withdrawing from hobbies and social interactions, back pain and weight gain. A growing number of therapists and inpatient rehabilitation centers are treating Web addicts with the same approaches, including 12-step programs, used to treat chemical addictions. Because the condition is not recognized in psychiatry as a disorder, insurance companies do not reimburse for treatment. So patients either pay out of pocket, or therapists and treatment centers bill for other afflictions, including the nonspecific impulse control disorder. There is at least one inpatient program, at Proctor Hospital in Peoria, Ill., which admits patients to recover from obsessive computer use. Experts there said they see signs of withdrawal in those patients similar to those in alcoholics or drug addicts, including profuse sweating, severe anxiety and paranoid symptoms. And the prevalence of other technologies - like BlackBerry wireless e-mail devices, sometimes called CrackBerries because they are considered so addictive; the Treo cellphone-organizer; and text messaging - has created a more generalized technology addiction, said Rick Zehr, the vice president of addiction and behavioral services at Proctor Hospital. The hospital's treatment program places all its clients together for group therapy and other recovery work, whether the addiction is to cocaine or the computer, Mr. Zehr said. "I can't imagine it's going to go away," he said of technology and Internet addiction. "I can only imagine it's going to continue to become more and more prevalent." There are family therapy programs for Internet addicts, and interventionists who specialize in confronting computer addicts. Among the programs offered by the Center for Online Addiction in Bradford, Pa., founded in 1994 by Dr. Kimberly S. Young, a leading researcher in Internet addiction, are cyberwidow support groups for the spouses of those having online affairs, treatment for addiction to eBay and intense behavioral counseling - in person, by telephone and online - to help clients get Web sober. Another leading expert in the field is Dr.
Maressa Hecht Orzack, the director of the Computer Addiction Study Center at McLean Hospital in Belmont, Mass., and an assistant professor at Harvard Medical School. She opened a clinic for Internet addicts at the hospital in 1996, when, she said, "everybody thought I was crazy." Dr. Orzack said she got the idea after she discovered she had become addicted to computer solitaire, procrastinating and losing sleep and time with her family. When she started the clinic, she saw two patients a week at most. Now she sees dozens and receives five or six calls daily from those seeking treatment elsewhere in the country. More and more of those calls, she said, are coming from people concerned about family members addicted to Internet video games like EverQuest, Doom 3 and World of Warcraft. Still, there is little hard science available on Internet addiction. "I think using the Internet in certain ways can be quite absorbing, but I don't know that it's any different from an addiction to playing the violin and bowling," said Sara Kiesler, professor of computer science and human-computer interaction at Carnegie Mellon University. "There is absolutely no evidence that spending time online, exchanging e-mail with family and friends, is the least bit harmful. We know that people who are depressed or anxious are likely to go online for escape and that doing so helps them." It was Professor Kiesler who called Internet addiction a fad illness. In her view, she said, television addiction is worse. She added that she was completing a study of heavy Internet users, which showed the majority had sharply reduced their time on the computer over the course of a year, indicating that even problematic use was self-corrective. She said calling it an addiction "demeans really serious illnesses, which are things like addiction to gambling, where you steal your family's money to pay for your gambling debts, drug addictions, cigarette addictions." She added, "These are physiological addictions." But Dr. Cash, who began treating Internet addicts 10 years ago, said that Internet addiction was a potentially serious illness. She said she had treated suicidal patients who had lost jobs and whose marriages had been destroyed because of their addictions. She said she was seeing more patients like Mike, who acknowledges struggling with an addiction to online pornography but who also said he was obsessed with logging on to the Internet for other reasons. He said that he became obsessed with using the Internet during the 2000 presidential election and that now he feels anxious if he does not check several Web sites, mostly news and sports sites, daily. "I'm still wrestling with the idea that it's a problem because I like it so much," Mike said. Three hours straight on the Internet, he said, is a minor dose. The Internet seemed to satisfy "whatever urge crosses my head." Several counselors and other experts said time spent on the computer was not important in diagnosing an addiction to the Internet. The question, they say, is whether Internet use is causing serious problems, including the loss of a job, marital difficulties, depression, isolation and anxiety, and still the user cannot stop. "The line is drawn with Internet addiction," said Mr. Zehr of Proctor Hospital, "when I'm no longer controlling my Internet use. It's controlling me." Dr. Cash and other therapists say they are seeing a growing number of teenagers and young adults as patients, who grew up spending hours on the computer, playing games and sending instant messages. 
These patients appear to have significant developmental problems, including attention deficit disorder and a lack of social skills. A report released during the summer by the Pew Internet and American Life Project found that teenagers did spend an increasing amount of time online: 51 percent of teenage Internet users are online daily, up from 42 percent in 2000. But the report did not find a withering of social skills. Most teenagers "maintain robust networks of friends," it noted. Some therapists and Internet addiction treatment centers offer online counseling, including at least one 12-step program for video game addicts, which is controversial. Critics say that although it may be a way to catch the attention of someone who needs face-to-face treatment, it is akin to treating an alcoholic in a brewery, mostly because Internet addicts need to break the cycle of living in cyberspace. A crucial difference from treating alcoholics and drug addicts, however, is that total abstinence is usually recommended for recovery from substance abuse, whereas moderate and manageable use is the goal for behavioral addictions. Sierra Tucson in Arizona, a psychiatric hospital and behavioral health center, which treats substance and behavioral addictions, has begun admitting a rising number of Internet addicts, said Gina Ewing, its intake manager. Ms. Ewing said that when such a client left treatment, the center's counselors helped plan ways to reduce time on the computer or asked those who did not need to use the Web for work to step away from the computer entirely. Ms. Ewing said the Tucson center encouraged its Internet-addicted clients when they left treatment to attend open meetings of Alcoholics Anonymous or Narcotics Anonymous, which are not restricted to alcoholics and drug addicts, and simply to listen. Or perhaps, if they find others struggling with the same problem, and if those at the meeting are amenable, they might be able to participate. "It's breaking new ground," Ms. Ewing said. "But an addiction is an addiction."

Danger Signs for Too Much of a Good Thing http://www.nytimes.com/2005/12/01/fashion/thursdaystyles/01aside.html

FIFTEEN signs of an addiction to using the Internet and computers, according to Internet/Computer Addiction Services in Redmond, Wash., follow:
1. Inability to predict the amount of time spent on the computer.
2. Failed attempts to control personal use for an extended period of time.
3. Having a sense of euphoria while on the computer.
4. Craving more computer time.
5. Neglecting family and friends.
6. Feeling restless, irritable and discontent when not on the computer.
7. Lying to employers and family about computer activity.
8. Problems with school or job performance as a result of time spent on the computer.
9. Feelings of guilt, shame, anxiety or depression as a result of time spent on the computer.
10. Changes in sleep patterns.
11. Health problems like carpal tunnel syndrome, eye strain, weight changes, backaches and chronic sleep deprivation.
12. Denying, rationalizing and minimizing adverse consequences stemming from computer use.
13. Withdrawal from real-life hobbies and social interactions.
14. Obsessing about sexual acting out through the use of the Internet.
15. Creation of enhanced personae to find cyberlove or cybersex.

From checker at panix.com Fri Dec 2 02:54:38 2005 From: checker at panix.com (Premise Checker) Date: Thu, 1 Dec 2005 21:54:38 -0500 (EST) Subject: [Paleopsych] NYT: Do Babies Dream? Message-ID: Do Babies Dream?
http://www.nytimes.com/2005/11/22/science/22qna.html Q & A By C. CLAIBORNE RAY Q. Do babies dream? A. "Yes, as far as we can tell," said Dr. Charles P. Pollak, director of the Center for Sleep Medicine at NewYork-Presbyterian/Weill Cornell hospital in New York. Most dreaming occurs during a type of sleep called REM sleep, for rapid eye movement, which Dr. Pollak explains is "an evolutionarily old type of sleep that occurs at all life stages, including infancy, and even before infancy, in fetal life." There is no question that newborn infants have REM sleep, he said, and the rapid eye movements can be observed as they sleep. The two eyes move together, mostly side to side, but sometimes up and down. It is a well-based inference that babies are dreaming in REM sleep, he said. As for the content of any dreams, Dr. Pollak said: "That is like asking whether your pet dog or cat is dreaming, because they can't communicate, and you can't ask. We presume that infants dream infantile things, but we don't really know what it is that they dream." "There is some evidence in adults that the direction of eye movement corresponds in a crude way to the content of the dream," Dr. Pollak continued. "If they are dreaming about walking in a field," he said, "the movement is most likely horizontal. If they dream of looking up at a building or climbing stairs, vertical eye movement is more likely to predominate. We can't go further than that." Readers are invited to submit questions by mail to Question, Science Times, The New York Times, 229 West 43rd Street, New York, N.Y. 10036-3959, or by e-mail to question at nytimes.com. From checker at panix.com Fri Dec 2 02:54:51 2005 From: checker at panix.com (Premise Checker) Date: Thu, 1 Dec 2005 21:54:51 -0500 (EST) Subject: [Paleopsych] CHE: C.P. Snow: Bridging the Two-Cultures Divide Message-ID: C.P. Snow: Bridging the Two-Cultures Divide The Chronicle of Higher Education, 5.11.25 http://chronicle.com/weekly/v52/i14/14b01001.htm By DAVID P. BARASH The year 2005 is the centenary of the birth -- and the 25th anniversary of the death -- of C.P. Snow, British physicist, novelist, and longtime denizen of the "corridors of power" (a phrase he coined). It is also 45 years since the U.S. publication of his best-known work, a highly influential polemic that generated another phrase with a life of its own, and that warrants revisiting today: The Two Cultures. Actually, the full title was The Two Cultures and the Scientific Revolution, presented by Snow as the prestigious Rede Lecture at the University of Cambridge in 1959 before being published as a brief book shortly thereafter. Since then his basic point has seeped into public consciousness as metaphor for a kind of dialogue of the deaf. Snow's was perhaps the first -- and almost certainly the most influential -- public lamentation over the extent to which the sciences and the humanities have drifted apart. Snow concerned himself with "literary intellectuals" on the one hand and physicists on the other, although each can be seen as representing their "cultures" more generally: "Between the two," he wrote, there is "a gulf ... of hostility and dislike, but most of all lack of understanding. They have a curious distorted image of each other. Their attitudes are so different that, even on the level of emotion, they can't find much common ground." 
"A good many times," Snow pointed out, in an oft-cited passage, "I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold: it was also negative. Yet I was asking something which is about the scientific equivalent of: 'Have you read a work of Shakespeare's?'" F.R. Leavis -- reigning don of British literary humanists at the time -- reacted with particular anger and (according to many) unseemly venom, denouncing Snow as a "public relations man" for science. Leavis mocked "the preposterous and menacing absurdity of C.P. Snow's consecrated public standing," scorned his "embarrassing vulgarity of style," his "panoptic pseudo-cogencies," his "complete ignorance" of literature, history, or civilization generally, and of the dehumanizing side of "progress" as science offers it. "It is ridiculous," thundered Leavis, "to credit him with any capacity for serious thinking about the problems on which he offers to advise the world. ... Not only is he not a genius, he is intellectually as undistinguished as it is possible to be." In fact, Charles Percy Snow is not widely (or even narrowly) read as a novelist these days, despite -- or, as critics like Leavis might suggest, because of -- his 11-volume opus, collectively titled Strangers and Brothers, a roman-fleuve written over a period of three decades, depicting the public life of Britain refracted especially through the sensibilities of Snow's semiautobiographic academic/politician, Lewis Eliot. If Waiting for Godot is a two-act play in which nothing happens, twice, in Strangers and Brothers nothing happens, 11 times. The Two Cultures, however, is a different creature altogether: brief, lively, controversial, insightful, albeit perhaps a tad misbegotten. Thus, today's readers will be surprised by Snow's conflation of "literary intellectuals" with backward-looking conservatives, notably right-wing Fascist sympathizers such as Yeats, Wyndham Lewis, and Ezra Pound, and his cheerful, optimistic portrayal of scientists as synonymous with progress and social responsibility. After all, for every D.H. Lawrence and T.S. Eliot there were a dozen luminaries of the literary left, just as for every Leo Szilard, an Edward Teller. Snow himself was an establishment liberal, suitably worried about nuclear war, overpopulation, and the economic disparities between rich and poor countries. He lamented the influence of those who, he feared, were likely to turn their backs on human progress; in turn, Snow may have been na?vely optimistic and even downright simplistic about the potential of science to solve the world's problems. The Two Cultures is generous in criticizing both cultures for their intellectual isolationism, and Snow -- being both novelist and physicist -- was himself criticized for immodestly holding himself forth (albeit implicitly) as the perfect embodiment of what an educated person should be. Indeed, someone once commented about Snow that he was "so well-rounded as to be practically spherical." But Snow's gentle curses do not fall evenhandedly on both houses, which doubtless raised the ire of Leavis and his ilk. 
The "culture of science," Snow announced, "contains a great deal of argument, usually much more rigorous, and almost always at a higher conceptual level, than the literary persons' arguments." Scientists "have the future in their bones" whereas literary intellectuals are "natural Luddites" who "wish the future did not exist." Snow's proposed solution? Broaden the educational system. More significant for our time, however, are not Snow's recommendations, the tendentious reception of his thesis, how he couched it, or even, perhaps, whether he got it right, so much as whether, as widely construed, it currently applies. And whether it matters. Science may be even more prominent in 2005 than it was half a century ago. But just as people can reside at the foot of a mountain without ever climbing it, the fact that science looms conspicuously over modern life does not mean that it has been widely mastered, just as the existence of profound humanistic insights does not guarantee their universal appreciation. Progress in the humanities typically does not threaten science, whereas the more science advances, the more the humanities seem at risk. Yet, paradoxically, scientific achievement only makes humanistic wisdom more important, as technology not only threatens the planet, but even -- in a world of cloning, stem-cell possibilities, genetic engineering, robotics, cyber-human hybrids, xenotransplants -- raises questions about what it is to be human. At the same time, with political ideologues and "faith based" zealots literally seeking to redefine reality to meet their preconceptions, we need the objective, empirical power of science more than ever. Whereas in Snow's day, science was nearly synonymous with physics, the early 21st century has seen a resurgence of biology; rocket science has been eclipsed by genomic science. But the more things change, the more they remain the same: "The more that the results of science are frankly accepted, the more that poetry and eloquence come to be received and studied as what in truth they really are -- the criticism of life by gifted men, alive and active with extraordinary power." Thus spoke Matthew Arnold, in an earlier (1882) Rede Lecture titled "Literature and Science," itself a response to "Darwin's bulldog," T.H. Huxley, who had conspicuously -- and wrongly -- prophesied that science would some day supplant literature. Rather than defending their discipline, many among the literati have mourned its imminent demise. Thus, in his book The Literary Mind: Its Place in an Age of Science, Max Eastman concluded that science was on the verge of answering "every problem that arises," and that literature, therefore, "has no place in such a world." And in 1970 the playwright Eugene Ionesco wondered "if art hasn't reached a dead-end, if indeed in its present form, it hasn't already reached its end. ... For some time now, science has been making enormous progress, whereas the empirical revelations of writers have been making very little. ... Can literature still be considered a means to knowledge?" Balancing Eastman and Ionesco -- humanists pessimistic about the humanities -- Noam Chomsky is a scientist radically distrustful of science: "It is quite possible -- overwhelmingly probable, one might guess -- that we will always learn more about human life and human personality from novels than from scientific psychology." Should we see the two cultures, instead, the way Stephen Jay Gould used to describe science and religion: as "nonoverlapping magisteria"? 
But in fact, they do overlap, most obviously when practitioners of either seek to enlarge their domain into the other. And when this happens, there have inevitably been cries of outrage, reminiscent of the Snow-Leavis squabble. Thus Edward O. Wilson's effort at "consilience" evoked strenuous opposition, mostly from humanists. Reciprocally, more than a few scientists -- Alan Sokal most prominently -- have been outraged by postmodernist efforts to "transgress the boundaries" by "privileging" a kind of poly-syllabic verbal hijinks over scientific theory building, empirical validation, and careful thought. It is bad enough when certain key words are hijacked, as with the literary community's use of "theory" to mean "literary theory." (Rumor has it that there exist some other theories, including gravitational, quantum, number, and evolutionary.) Imagine if scientists were to appropriate "significance" to mean only "statistical significance." A gulf clearly exists. But is that a problem? Scientists would doubtless be better people if they were culturally literate, and ditto for humanists if they were scientifically informed. Which is worse, the antiscientific nincompoopery of a Tom DeLay, who announced in Congress that the killings at Columbine High School took place "because our school systems teach our children that they are nothing but glorified apes who have evolutionized [sic] out of some primordial mud," or the antihumanist arrogance of a scientific Strangelove, ignorant of, say, the deeper meaning of personhood as explored by Aquinas, Milton, or Whitman? When the cultures are effectively bridged, the results, if not always admirable, are at least likely to be thought provoking: Witness the plays of Michael Frayn, or Leon Kass's incorporation of humanistic sensibility into the deliberations of the President's Committee on Bioethics. One can reformulate the "two cultures" problem as a lament about overspecialization, partly captured by the quip that higher education -- especially at the graduate level -- involves learning more and more about less and less until one knows everything about nothing. On the other hand, there is something to be said for specialization insofar as it bespeaks admirable expertise. In medicine, it used to be that "specialists" were rare; not so today, when even general practitioners specialize in "family medicine." And we are almost certainly better off for it. I'd rather have a colonoscopy from a gastroenterologist than from a general practitioner, and would trust a psychiatrist more than a family doctor to prescribe the most suitable antidepressant. At the same time, something is lost when physicians are more comfortable reading MRI's or analyzing arcane lab results than talking with patients. We might also ask whether scientists are doing a better job of communicating with the public, crossing the Snow bridge and thereby constituting a Third Culture, as John Brockman has claimed. The late Carl Sagan was a master at this art, as are Richard Dawkins, Jared Diamond, Brian Greene, and many others. But there is nothing new in scientists reaching out to hoi polloi; Arthur Eddington and Bertrand Russell weren't slouches, nor was T.H. Huxley, and yet they couldn't prevent Snow's "gap." And it is not obvious that Stephen Hawking's A Brief History of Time bridged the cultures so much as confirmed their mutual incomprehensibility. Within academe, there is eager lip service to bridge building between humanities and science, but has there been any progress?
We have numerous interdisciplinary degree programs, undergraduate as well as graduate, but are the sciences and humanities any more integrated? The options of "general studies" degrees for undergraduates or "special individual Ph.D. programs," although admirably intended, often end up isolating would-be bridge crossers from traditional departments where their presence might otherwise encourage genuine traffic across disciplinary boundaries. And despite the proliferation of numerous centers and institutes for interdisciplinary study, I suggest that, if anything, academic cultures are less mutually interpenetrating now than in Snow's day, perhaps because the institutionalization of bridge builders serves, ironically, to marginalize them, and keep them out of the main academic thoroughfares. Society scarcely benefits from those who achieve renown in Mongolian metaphysics by speaking only Mongolian to the metaphysicians, and only metaphysics to the Mongolians. It seems that higher education -- like politics -- is more polarized than ever. Anthropology departments, increasingly, are subdivided into cultural or biological, the two often barely on speaking terms. Many biology departments have split into "skin in" (cellular, molecular, biochemical) and "skin out" (ecology, evolution, organismal), increasingly becoming distinct administrative entities to match their intellectual incompatibility. At my institution, the University of Washington, psychology cherishes its place in the natural sciences, with no one pursuing a humanistic, existential, or even Freudian agenda. There are other universities at which, by contrast, "scientific psychology" is condemned as a kind of sin. Everyone claims to love boundary-busting scholarship, but virtually no one would advise a graduate student or even a faculty member lacking tenure to hitch his or her career to it. There are exceptions -- individuals who are so brave, determined, gifted, foolish or indifferent to professional consequences that they have persevered on one bridge or another. Thanks to them, we have the nascent field of eco-criticism, which links ecology and literature, as well as evolutionary psychology, bioethics, and a growing band of philosophers, neurobiologists, and physicists trying to make sense of consciousness. Many other linkages remain unconsummated, lacking only suitable scholars or maybe -- and here is a heretical notion -- any legitimate basis for them. Geo-poetics, anyone? Or astro-dramaturgy? Most of us would settle for something less abstruse, broader, more natural, yet probably more difficult: increased old-fashioned intellectual traffic between humanists and scientists, as Snow called for. When he was knighted, C.P. Snow chose for his crest (it's a Brit thing), the motto Aut Inveniam Aut Faciam -- "I will either find a way or make one." As we acknowledge his hundredth birthday, maybe someone will find a way to link his two cultures, or at least make a few high-traffic bridges. David P. Barash is a professor of psychology at the University of Washington. He is co-author of Madame Bovary's Ovaries: a Darwinian Look at Literature (Delacorte, 2005), which endeavors to bridge two subcultures: evolutionary biology and literary criticism. 
From checker at panix.com Fri Dec 2 02:54:58 2005 From: checker at panix.com (Premise Checker) Date: Thu, 1 Dec 2005 21:54:58 -0500 (EST) Subject: [Paleopsych] BBC: Horizon: Homeopathy: The Test Message-ID: Horizon: Homeopathy: The Test http://www.bbc.co.uk/cgi-bin/education/betsie/parser.pl [Transcript appended.] Will James Randi be out of pocket after this week's Horizon? First shown: BBC Two, Tuesday 26 November, 9pm Homeopathy: The Test Homeopathy: The Test - programme summary Homeopathy was pioneered over 200 years ago. Practitioners and patients are convinced it has the power to heal. Today, some of the most famous and influential people in the world, including pop stars, politicians, footballers and even Prince Charles, all use homeopathic remedies. Yet according to traditional science, they are wasting their money. "Unusual claims require unusually good proof" James Randi The Challenge Sceptic James Randi is so convinced that homeopathy will not work that he has offered $1m to anyone who can provide convincing evidence of its effects. For the first time in the programme's history, Horizon conducts its own scientific experiment, to try and win his money. If they succeed, they will not only be $1m richer - they will also force scientists to rethink some of their fundamental beliefs. Homeopathy and conventional science The basic principle of homeopathy is that like cures like: that an ailment can be cured by small quantities of substances which produce the same symptoms. For example, it is believed that onions, which produce streaming, itchy eyes, can be used to relieve the symptoms of hay fever. However, many of the ingredients of homeopathic cures are poisonous if taken in large enough quantities. So homeopaths dilute the substances they are using in water or alcohol. This is where scientists become sceptical - because homeopathic solutions are diluted so many times they are unlikely to contain any of the original ingredients at all. Yet many of the people who take homeopathic medicines are convinced that they work. Has science missed something, or could there be a more conventional explanation? The Placebo Effect The placebo effect is a well-documented medical phenomenon. Often, a patient taking pills will feel better, regardless of what the pills contain, simply because they believe the pills will work. Doctors studying the placebo effect have noticed that large pills work better than small pills, and that coloured pills work better than white ones. Could the beneficial effects of homeopathy be entirely due to the placebo effect? If so, then homeopathy ought not to work on babies or animals, who have no knowledge that they are taking a medicine. Yet many people are convinced that it does. Can science prove that homeopathy works? In 1988, Jacques Benveniste was studying how allergies affected the body. He focussed on a type of blood cell known as a basophil, which activates when it comes into contact with a substance you're allergic to. As part of his research, Benveniste experimented with very dilute solutions. To his surprise, his research showed that even when the allergic substance was diluted down to homeopathic quantities, it could still trigger a reaction in the basophils. Was this the scientific proof that homeopathic medicines could have a measurable effect on the body? The memory of water In an attempt to explain his results, Benveniste suggested a startling new theory. He proposed that water had the power to 'remember' substances that had been dissolved in it.
This startling new idea would force scientists to rethink many fundamental ideas about how liquids behave. Unsurprisingly, the scientific community greeted this idea with scepticism. The then editor of Nature, Sir John Maddox, agreed to publish Benveniste's paper - but on one condition. Benveniste must open his laboratory to a team of independent referees, who would evaluate his techniques. "Scientists are human beings. Like anyone else, they can fool themselves" James Randi Enter James Randi When Maddox named his team, he took everyone by surprise. Included on the team was a man who was not a professional scientist: magician and paranormal investigator James Randi. Randi and the team watched Benveniste's team repeat the experiment. They went to extraordinary lengths to ensure that none of the scientists involved knew which samples were the homeopathic solutions, and which ones were the controls - even taping the sample codes to the ceiling for the duration of the experiment. This time, Benveniste's results were inconclusive, and the scientific community remained unconvinced by Benveniste's memory of water theory. Homeopathy undergoes more tests Since the Benveniste case, more scientists have claimed to see measurable effects of homeopathic medicines. In one of the most convincing tests to date, Dr. David Reilly conducted clinical trials on patients suffering from hay fever. Using hundreds of patients, Reilly was able to show a noticeable improvement in patients taking a homeopathic remedy over those in the control group. Tests on different allergies produced similar results. Yet the scientific community called these results into question because they could not explain how the homeopathic medicines could have worked. Then Professor Madeleine Ennis attended a conference in which a French researcher claimed to be able to show that water had a memory. Ennis was unimpressed - so the researcher challenged her to try the experiment for herself. When she did so, she was astonished to find that her results agreed. Horizon takes up the challenge Although many researchers have now offered proof that the effects of homeopathy can be measured, none have yet applied for James Randi's million dollar prize. For the first time in the programme's history, Horizon decided to conduct their own scientific experiment. The programme gathered a team of scientists from among the most respected institutes in the country. The Vice-President of the Royal Society, Professor John Enderby, oversaw the experiment, and James Randi flew in from the United States to watch. As with Benveniste's original experiment, Randi insisted that strict precautions be taken to ensure that none of the experimenters knew whether they were dealing with homeopathic solutions, or with pure water. Two independent scientists performed tests to see whether their samples produced a biological effect. Only when the experiment was over was it revealed which samples were real. To Randi's relief, the experiment was a total failure. The scientists were no better at deciding which samples were homeopathic than pure chance would have been. Read more questions and answers about homeopathy:
http://www.bbc.co.uk/cgi-bin/education/betsie/parser.pl/0005/www.bbc.co.uk/science/horizon/2002/homeopathyqa.shtml

Homeopathy: The Test - transcript http://www.bbc.co.uk/cgi-bin/education/betsie/parser.pl/0005/www.bbc.co.uk/science/horizon/2002/homeopathytrans.shtml BBC Two, Tuesday 26 November, 9pm

NARRATOR (NEIL PEARSON): This week Horizon is doing something completely different. For the first time we are conducting our own experiment. We are testing a form of medicine which could transform the world. Should the results be positive this man will have to give us $1m. JAMES RANDI (Paranormal Investigator): Do the test, prove that it works and win a million dollars. NARRATOR: But if the results are negative then millions of people, including some of the most famous and influential in the world, may have been wasting their money. The events that would lead to Horizon's million dollar challenge began with Professor Madeleine Ennis, a scientist who may have found the impossible. PROF. MADELEINE ENNIS (Queen's University, Belfast): I was incredibly surprised and really had great feelings of disbelief. NARRATOR: Her work concerns a type of medicine which defies the laws of science. WALTER STEWART (Research Chemist): If Madeleine Ennis turns out to be right it means that science has missed a huge chunk of something. NARRATOR: She has reawakened one of the most bitter controversies of recent years. PROF. BOB PARK (University of Maryland): Madeleine Ennis's experiments cannot be right. I mean it's, they're, they're, preposterous. MADELEINE ENNIS: I have no explanation for what happened. However, this is science. If we knew the answers to the questions we wouldn't bother doing the experiments. NARRATOR: It's all about something you can find on every high street in Britain: homeopathy. Homeopathy isn't some wacky, fringe belief. It's over 200 years old and is used by millions of people, including Presidents and pop stars. It's even credited with helping David Beckham get over his foot injury and the Royals have been keen users since the days of Queen Victoria, but it's also a scientific puzzle. What makes it so mysterious is its two guiding principles, formulated in the 18th century. The first principle is that to find a cure you look for a substance that actually causes the symptoms you're suffering from. It's the principle that like cures like. DR PETER FISHER (Homeopath to The Queen): For instance in colds and hay fever something we often use is allium cepa which is onion and of course we all know the effects of chopping an onion, you know the sore streaming eyes, streaming nose, sneezing and so we would use allium cepa, onion, for a cold with similar sorts of features. NARRATOR: This theory that like cures like led to thousands of different substances being used, some of them truly bizarre. DR LIONEL MILGROM (Homeopath): In principle you can make a homeopathic remedy out of absolutely anything that's plant. PETER FISHER: Deadly nightshade. LIONEL MILGROM: Animal. PETER FISHER: Snake venom. LIONEL MILGROM: Mineral. PETER FISHER: Calcium carbonate, which is of course chalk. LIONEL MILGROM: Disease product. PETER FISHER: Tuberculous gland of a cow. LIONEL MILGROM: Radiation. NARRATOR: But then homeopaths found that many of these substances were poisonous, so they started to dilute them.
This led to the extraordinary second principle of homeopathy: the more you dilute a remedy the more effective it becomes, provided it's done in a special way. The method homeopaths use to this day is called serial dilution. A drop of the original substance, whether it's snake venom or sulphuric acid, is added to 99 drops of water or alcohol. Then the mixture is violently shaken. Here it's done by machine, but traditionally homeopaths would hit the tube against a hard surface. Either way, homeopaths believe this is a vital stage. It somehow transfers the healing powers from the original substance into the water itself. The result is a mixture diluted 100 times. LIONEL MILGROM: That will give you what's called a 1C solution, that's one part in 100. You then take that 1C solution and dissolve it in another 99 parts and now you end up with a 2C solution. NARRATOR: At 2C the medicine is one part in 10,000, but the homeopaths keep diluting and this is where the conflict with science begins. At 6C the medicine is diluted a million million times. This is equivalent to one drop in 20 swimming pools. Another six dilutions gives you 12C. This is equivalent to one drop in the Atlantic Ocean, but even this is not enough for most homeopathic medicines. The typical dilution is 30C, a truly astronomical level of dilution. BOB PARK: One drop in all of the oceans on Earth would be much more concentrated than that. I would have to go off the planet to make that kind of dilution. NARRATOR: But homeopaths believe that a drop of this ultra dilute solution placed onto sugar pills can cure you. That's why homeopathy is so controversial because science says that makes no sense whatsoever. BOB PARK: There is a limit to how much we can dilute any substance. We can only dilute it down to the point that we have one molecule left. The next dilution we probably won't even have that one molecule. WALTER STEWART: It's possible to go back and count how many molecules are present in a homeopathic dose and the astonishing answer is absolutely none. There's less than a chance in a million, less than a chance in a billion that there's a single molecule. NARRATOR: A molecule is the smallest piece of a substance you can have, so for something to have any effect at all conventional science says you need one molecule of it at the very least. WALTER STEWART: Science has through many, many different experiments shown that when a drug works it's always through the way the molecule interacts with the body and, so the discovery that there's no molecules means absolutely there's no effect. NARRATOR: That's why science and homeopathy have been at war for over 100 years. The homeopaths say that their remedies have healing powers. Science says there's nothing but water.
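[The dilution arithmetic above is easy to check for yourself. Below is a minimal back-of-the-envelope sketch in Python; the assumption that the starting drop holds about one millimole of active substance (~6 x 10^20 molecules) is mine, for illustration, not a figure from the programme.

AVOGADRO = 6.022e23  # molecules per mole
# Hypothetical starting amount: one millimole of the active substance in the first drop.
starting_molecules = 1e-3 * AVOGADRO  # ~6.0e20 molecules

for c in (1, 2, 6, 12, 30):
    # Each "C" step dilutes by a factor of 100, so cC means one part in 100**c.
    expected = starting_molecules / 100 ** c
    print(f"{c:>2}C: one part in 10^{2 * c}; expected molecules per drop ~ {expected:.2g}")

# Approximate output:
#  1C: one part in 10^2;  expected molecules per drop ~ 6e+18
#  2C: one part in 10^4;  expected molecules per drop ~ 6e+16
#  6C: one part in 10^12; expected molecules per drop ~ 6e+08
# 12C: one part in 10^24; expected molecules per drop ~ 0.0006
# 30C: one part in 10^60; expected molecules per drop ~ 6e-40

At 12C the expected count already falls below one molecule per drop, and at 30C the chance that even a single molecule of the original substance survives is roughly 1 in 10^39; changing the assumed starting amount by several orders of magnitude does not change that conclusion. That is Walter Stewart's point above.]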
Then one scientist claimed the homeopaths were right after all. Jacques Benveniste was one of France's science superstars. He had a string of discoveries to his name and some believed he was on his way to earning a Nobel Prize. DR JACQUES BENVENISTE (National Institute for Medical Research): I was considered as, well in French we have a word which says Nobel is nobelisable, which means we can have a Nobel Prize because I started from scratch the whole field of research. I was the head of a very large team, had a lot of money and so I was a very successful person. NARRATOR: Benveniste was an expert in the field of allergy, in particular he was studying a type of blood cell involved in allergic reactions - the basophil. When basophils come into contact with something you're sensitive to they become activated causing the telltale symptoms. Benveniste had developed a test that could tell if a person was allergic to something or not. He added a kind of dye that only turns inactive basophils blue, so by counting the blue cells he could work out whether there had been a reaction, but then something utterly unexpected started to happen. JACQUES BENVENISTE: A technician told me one day I don't understand because I have diluted a substance that is activating basophils to a point where it shouldn't work and it still works. NARRATOR: The researcher had taken the chemical and added water, just like homeopaths do. The result should have been a solution so dilute it had absolutely no effect and yet, bizarrely, there was a reaction. The basophils had been activated. Benveniste knew this shouldn't have been possible. JACQUES BENVENISTE: I remember saying to this, to her, this is water so it cannot work. NARRATOR: Benveniste's team was baffled. They needed to find out what was going on, so they carried out hundreds of experiments and soon realised that they'd made an extraordinary discovery. It seemed that when a chemical was diluted to homeopathic levels the result was a special kind of water. It didn't behave like ordinary water, it acted like it still contained the original substance. It was as if the water was remembering the chemical it had once contained, so Benveniste called the phenomenon the 'memory of water'. At last here was scientific evidence that homeopathy could work. Benveniste knew this was a radical suggestion, but there was a way to get his results taken seriously. He had to get them published in a scientific journal. JACQUES BENVENISTE: A result doesn't exist until it is admitted by the scientific community. It's like, like being a good opera singer but singing in your bathroom. That's fine, but it's not Scala, Milan or the Met, Met or the Opera at Paris, what-have-you. NARRATOR: So he sent his work to the most prestigious journal in the world, a journal which for over 100 years has reported the greatest of scientific discoveries: Nature. SIR JOHN MADDOX (Nature Editor 1980-1995): Nature is the place that everyone working in science recognises to be a way of getting publicity of the best kind. NARRATOR: Benveniste's research ended up with one of the most powerful figures in science, the then Editor of Nature, Sir John Maddox. Maddox knew that the memory of water made no scientific sense, but he couldn't just ignore work from such a respected scientist, so he agonised about what to do. Eventually he reached a decision. SIR JOHN MADDOX: I said OK, we'll publish your paper if you let us come and inspect your lab and he agreed, to my astonishment. NARRATOR: So in June 1988 Benveniste's research appeared in the pages of Nature. It caused a scientific sensation. Benveniste became a celebrity. His memory of water made news across the world. He seemed to have found the evidence that made homeopathy scientifically credible, but the story wasn't quite over. Benveniste had agreed to let in a team from Nature. It was a decision he would live to regret. Maddox set about assembling his team of investigators and his choices revealed his true suspicions. First, he chose Walter Stewart, a scientist and fraud-buster, but his next choice would really cause a stir: James Randi. JACQUES BENVENISTE: I looked in my books and I said who are, who is Randi and couldn't find any scientist called Randi.
NARRATOR: That was because the amazing Randi isn't a scientist, he's a magician, but he's no ordinary conjuror. He's also an arch sceptic, a fierce opponent of all things supernatural. JACQUES BENVENISTE: I called John Maddox and I said what, what is this? I mean I thought you were coming with, with scientists to discuss science. NARRATOR: But Randi felt he was just the man for the job. On one occasion he had fooled even experienced scientists with his spoon bending tricks. JAMES RANDI: Scientists don't always think rationally and in a direct fashion. They're human beings like anyone else. They can fool themselves. NARRATOR: So Randi became the second investigator. JAMES RANDI: Astonishing. NARRATOR: On 4th July 1988 the investigative team arrived in Paris ready for the final showdown. SIR JOHN MADDOX: The first thing we did was to sit round the table in Benveniste's lab. Benveniste himself struck us all as looking very much like a film star. JAMES RANDI: I found him to be a charming, very continental gentleman. He's a great personality. He was very much in control. JACQUES BENVENISTE: We were quite relaxed because there was no reason why things should not go right. NARRATOR: The first step was for Benveniste and his team to perform their experiment under Randi's watchful gaze. They had to prepare two sets of tubes containing homeopathic water and ordinary water. If the homeopathic water was having a real effect different from ordinary water then homeopathy would be vindicated. (ACTUALITY EXPERIMENT CHAT) As they plotted the results it was clear the experiment had worked. JAMES RANDI: There were huge peaks coming up out of it and that was very active results, I mean very, very positive results. WALTER STEWART: The astonishing thing about these results is that they repeated the claim, they demonstrated the claim that a homeopathic dilution, a dilution where there were no molecules, could actually have some sort of an effect. NARRATOR: But Maddox had seen that the experimenters knew which tubes contained the homeopathic water and which contained the ordinary water; perhaps unconsciously, this might have influenced the results, so he asked them to repeat the experiment. This time the tubes would be relabelled with a secret code so that no-one knew which tube was which. JAMES RANDI: We went into a sealed room and we actually taped newspapers over the windows to the room that were accessible to the hall. WALTER STEWART: We recorded in handwriting which tube was which and we put this into an envelope and sealed it so that nobody could open it or change it. NARRATOR: At this point the investigation took a turn for the surreal as they went to extraordinary lengths to keep the code secret. JAMES RANDI: Walter and I got up on the stepladder and stuck it to the ceiling of the lab. WALTER STEWART: There it was taped above us as all of this work went on. JACQUES BENVENISTE: Sticking an envelope to the ceiling was utterly ridiculous. There is no way you can associate that with science. NARRATOR: With the codes out of reach the final experiment could begin. By now Benveniste had lost control of events. JACQUES BENVENISTE: It was a madhouse. Randi was doing magician tricks. JAMES RANDI: Yes I was doing perhaps a little bit of sleight-of-hand with an object or something like that, just to lighten the atmosphere. NARRATOR: Soon the analysis was complete. It was time to break the code to see if the experiment had worked. Benveniste and his team were brimming with optimism.
JAMES RANDI: Oh my goodness it was party-time, cracked crabs legs and magnums, literally, of champagne packed in ice. WALTER STEWART: We were going to be treated to a wonderful dinner. The press, many members of the press were there. JAMES RANDI: John and Walter and I were looking at one another as if to say wow, if this doesn't work it's going to be a downer. WALTER STEWART: Finally came the actual work of decoding the result. JAMES RANDI: There was much excitement at the table. Everyone was gathered around. NARRATOR: Benveniste felt sure that the results would support homeopathy and that he would be vindicated. JAMES RANDI: That didn't happen. It was just a total failure. SIR JOHN MADDOX: We said well nothing here is there? WALTER STEWART: And immediately the mood in the laboratory switched, people burst into tears. JAMES RANDI: It was general gloom. NARRATOR: The team wrote a report accusing Benveniste of doing bad science and branding the claims for the memory of water a delusion. Benveniste's scientific reputation was ruined. JACQUES BENVENISTE: Everybody believed that I am totally wrong. It's simply dismissed. Your phone call doesn't ring anymore. Just like actresses, or actress that have no, are no more in fashion the phone suddenly is silent. NARRATOR: For now the memory of water was forgotten. Science declared homeopathy impossible once more, but strangely that didn't cause homeopathy to disappear. Instead it grew. Since the Benveniste affair sales of homeopathic medicines have rocketed. Homeopathy has become a trendy lifestyle choice, one of the caring, all natural medicines, more popular in the 21st century than ever before. Despite the scepticism of science millions of people use it and believe it has helped them, like Marie Smith. Fifteen years ago Marie was diagnosed with a life-threatening blood disorder. MARIE SMITH: I was more concerned for me children. I used to look at them thinking I may, may not be here one day for yous. That was the worst part of it. NARRATOR: She'd tried everything that conventional medicine could offer, including drugs and surgery. Nothing seemed to work. Then she tried homeopathy. She took a remedy made from common salt. MARIE SMITH: It's like somebody putting me in a coffin and taking me back out again. That's just the way I felt and the quality of my life changed completely. NARRATOR: Since then Marie has been healthy and she has no doubt it's homeopathy that's helped her. MARIE SMITH: I know it saved my life and it's made my life a lot different, yeah and I'm just glad I'm enjoying these grandchildren which I never thought I would do. NARRATOR: There are thousands of cases like Marie's and they do present science with a problem. If homeopathy is scientific nonsense then why are so many people apparently being cured by it? The answer may lie in the strange and powerful placebo effect. The placebo effect is one of the most peculiar phenomena in all science. Doctors have long known that some patients can be cured with pills that contain no active ingredient at all, just plain sugar, what they call the placebo, and they've noticed an even greater puzzle: that larger placebo pills work better than small ones, coloured pills work better than white pills. The key is simply believing that the pill will help you. This releases the powers in our minds that reduce stress and that alone can improve your health. BOB PARK: Stress hormones make you feel terribly uncomfortable.
The minute you relieve the anxiety, relieve the stress hormones people do feel better, but that's a true physiological effect. NARRATOR: Scientists believe the mere act of taking a homeopathic remedy can make people feel better and homeopathy has other ways of reducing stress. LIONEL MILGROM: And is there any particular time of day that you will, you'll, you'll have that feeling? PATIENT: No. NARRATOR: A crucial part of homeopathic care is the consultation. LIONEL MILGROM: The stress that you have at work, is that, are those around issues that make you feel quite emotional? PATIENT: No. LIONEL MILGROM: The main thing about a homeopathic interview is that we do spend a lot of time talking and listening to the patient. We would ask questions of how they eat, how they sleep, how much worry and tension there is in their lives, hopefully give them some advice about how to actually ease problems of stress. PATIENT: I just feel I want to have something more natural. LIONEL MILGROM: Yeah... NARRATOR: So most scientists believe that when homeopathy works it must be because of the placebo effect. BOB PARK: As far as I know it's the only thing that is really guaranteed to be a perfect placebo. There is no medicine in the medicine at all. NARRATOR: It seems like a perfect explanation, except that homeopathy appears to work when a placebo shouldn't - when the patient doesn't even know they're taking a medicine. All over the country animals are being treated with homeopathic medicines. Pregnant cows are given dilute cuttlefish ink, sheep receive homeopathic silver to treat eye infections, piglets get sulphur to fatten them up. A growing number of vets believe it's the medicine of the future, like Mark Elliot who's used homeopathy his whole career, on all sorts of animals. MARK ELLIOT (Homeopathic Vet): Primarily it's dogs and horses, but we also treat cats, small rodents, rabbits, guinea pigs, even reptiles, but I have treated an elephant with arthritis and I've heard of colleagues recently who treated giraffes. It works on any species exactly the same as in the human field. NARRATOR: Mark made it his mission to prove that homeopathy works. He decided to study horses with Cushing's, a disease caused by cancer. He treated them all with the same homeopathic remedy. The results were impressive. MARK ELLIOT: We achieved an overall 80% success rate which is great because that is comparable with, with modern medical drugs. NARRATOR: To Mark this was clear proof that homeopathy can't be the placebo effect. MARK ELLIOT: You can't explain to this animal why the treatment it's being given is going to ben, to benefit it, or how it's potentially going to benefit it and as a result, when you see a positive result in a horse or a dog that to me is the ultimate proof that homeopathy is not placebo, homeopathy works. NARRATOR: But Mark's small trial doesn't convince the sceptics. They need far more evidence before they'll believe that homeopathic medicines are anything more than plain water. JAMES RANDI: I've heard it said that unusual claims require unusually good proof. That's true. For example, if I tell you that at my home in Florida in the United States I have a goat in my garden. You could easily check that out. Yeah, looks like a goat, smells like a goat, so the case is essentially proven, but if I say I have a unicorn, that's a different matter. That's an unusual claim. NARRATOR: To scientists the claim that homeopathic water can cure you is as unlikely as finding a unicorn. JAMES RANDI: Yes, there is a unicorn.
That is called homeopathy. NARRATOR: Homeopathy needed the very highest standards of proof. In science the best evidence there can be is a rigorous trial comparing a medicine against a placebo, and in recent years such trials have been done with homeopathy. David Reilly is a conventionally trained doctor who became intrigued by the claims of the homeopaths. He wanted to put homeopathy to the test and decided to look at hay fever. Both homeopathy and conventional medicine use pollen as a treatment for hay fever. What's different about homeopathy is the dilution. DR DAVID REILLY (Glasgow Homeopathic Hospital): The single controversial element is that preparing this pollen by the homeopathic method takes it to a point that there's not a single molecule of the starting material present. I confidently assumed that these diluted medicines were placebos. NARRATOR: David Reilly recruited 35 patients with hay fever. Half of them were given a homeopathic medicine made from pollen, half were given placebo, just sugar pills. No one knew which was which. For four weeks they filled in a diary measuring how bad their symptoms were. The question was: would there be a difference? DAVID REILLY: To our collective shock a result came out that was very clear: those on the active medication had a substantially greater reduction in symptoms than those receiving the placebo medicine. According to that data the medicine worked. NARRATOR: But to be absolutely rigorous Reilly decided to repeat the study, and he got the same result. Then he went further and tested a different type of allergy. Again the result was positive, but despite all these studies, most scientists refuse to believe his research. DAVID REILLY: It became obvious that in certain minds 100 studies, 200 studies would not change the mental framework, and so I'm sceptical that if 200 haven't changed it I don't think 400 would change it. NARRATOR: The reason Reilly's research was dismissed was that his conclusion had no scientific explanation. Sceptics pointed to the glaring problem: there was still no evidence as to how something that was pure water could actually work. BOB PARK: If you design a medication to take advantage of what we know about physiology we're not surprised when it works. When, when you come up with no explanation at all for how it could work and then claim it works we're not likely to take it seriously. NARRATOR: To convince science, homeopathy had to find a mechanism, something that could explain how homeopathic water could cure you. That meant proving that water really does have a memory. Then a scientist appeared to find that proof. Madeleine Ennis has never had much time for homeopathy. As a professor of pharmacology she knows it's scientifically impossible. MADELEINE ENNIS: I'm a completely conventional scientist. I have had no experience of using non-conventional medications and have no intention really of starting to use them. NARRATOR: But at a conference Ennis heard a French scientist present some puzzling results, results that seemed to show that water has a memory. MADELEINE ENNIS: Many of us were incredibly sceptical about the findings. We told him that something must have gone wrong in the experiments and that we didn't believe what he had presented. NARRATOR: He replied with a challenge. MADELEINE ENNIS: I was asked whether, if I really believed my viewpoint, would I test the hypothesis that the data were wrong? 
NARRATOR: Ennis knew that the memory of water breaks the laws of science, but she believed that a scientist should always be willing to investigate new ideas, so the sceptical Ennis ended up testing the central claim of homeopathy. She performed an experiment almost identical to Benveniste's, using the same kind of blood cell. Then she added a chemical, histamine, which had been diluted down to homeopathic levels. The crucial question: would it have any effect on the cells? To find out she had to count the cells one by one to see whether they had been affected by the homeopathic water. The results were mystifying. The homeopathic water couldn't have had a single molecule of histamine, yet it still had an effect on the cells. MADELEINE ENNIS: They certainly weren't the results that I wanted to see and they definitely weren't the results that I would have liked to have seen. NARRATOR: Ennis wondered whether counting by hand had introduced an error, so she repeated the experiment using an automated system to count the cells, and astonishingly, the result was still positive. MADELEINE ENNIS: I was incredibly surprised and really had great feelings of disbelief, but I know how the experiments were performed and I couldn't see an error in what we had done. NARRATOR: These results seemed to prove that water does have a memory after all. It's exactly what the homeopaths have been hoping for. PETER FISHER: If these results become generally accepted it will revolutionise the view of homeopathy. Homeopathy will suddenly become this idea that was perhaps born before its time. LIONEL MILGROM: It's particularly exciting because it does seem to suggest that Benveniste was correct. NARRATOR: At last here is evidence from a highly respected researcher that homeopathic water has a real biological effect. The claims of homeopathy might be true after all. However, the arch sceptic Randi is unimpressed. JAMES RANDI: There are so many ways that errors or purposeful interference can take place. NARRATOR: As part of his campaign to test bizarre claims Randi has decided to put his money where his mouth is. On his website is a public promise: to anyone who can prove the scientifically impossible Randi will pay $1m. JAMES RANDI: This is not a cheap theatrical stunt. It's theatrical, yes, but it's a million dollars' worth. NARRATOR: Proving the memory of water would certainly qualify for the million dollars. To win the prize someone would simply have to repeat Ennis's experiments under controlled conditions, yet no-one has applied. JAMES RANDI: Where are the homeopathic labs, the biological labs around the world, who say that this is the real thing, who would want to make a million dollars and aren't doing it? NARRATOR: So Horizon decided to take up Randi's challenge. We gathered experts from some of Britain's leading scientific institutions to help us repeat Ennis's experiments. Under the most rigorous of conditions they'll see whether they can find any evidence for the memory of water. We brought James Randi over from the United States to witness the experiment and we came to the world's most august scientific institution, the Royal Society. The Vice-President of the Society, Professor John Enderby, agreed to oversee the experiment for us. PROF. JOHN ENDERBY: ...but they'll, of course as far as the experimenters are concerned they'll have totally different numbers... NARRATOR: And with a million dollars at stake James Randi wants to make sure there's no room for error. 
JAMES RANDI: ...keeping the original samples, so I'm very happy with that provision. I'm willing to accept a positive result for homeopathy or for astrology or for anything else. I may not like it, but I have to be willing to accept it. NARRATOR: The first stage is to prepare the homeopathic dilutions. We came to the laboratories of University College London where Professor Peter Mobbs agreed to produce them for us. He's going to make a homeopathic solution of histamine by repeatedly diluting one drop of solution into 99 drops of water. PETER MOBBS: OK, now I'm transferring the histamine into 9.9 ml of distilled water and then we'll discard the tip. NARRATOR: For comparison we also need control tubes, tubes that have never had histamine in them. For these Peter starts with plain water. PETER MOBBS: In it goes. NARRATOR: This stage dilutes the solutions down to one in 100 - that's 1C. We now have 10 tubes. Half are just water diluted with more water, the control tubes; half are histamine diluted in water. These are all shaken, the crucial homeopathic step. Now he dilutes each of the tubes again, to 2C. Then to 3C, all the way to 5C. PETER MOBBS: The histamine's now been diluted ten thousand million times. Still a few molecules left in there, but not very many. NARRATOR: Then we asked Professor of Electrical Engineering, Hugh Griffiths, to randomly relabel each of our 10 tubes. Now only he has the code for which tubes contain the homeopathic dilutions and which tubes contain water. HUGH GRIFFITHS: OK, so there's the record of which is which. I'm going to encase it in aluminium foil and then seal it in this envelope here. NARRATOR: Next the time-consuming task of taking these solutions down to true homeopathic levels. UCL scientist Rachel Pearson takes each of the tubes and dilutes them down further - to 6C. That's one drop in 20 swimming pools. To 12C - a drop in the Atlantic. Then to 15C - one drop in all the world's oceans. The tubes have now been diluted one million million million million million times. Some are taken even further down, to 18C. Every tube, whether it contains histamine or water, goes through exactly the same procedure. To guard against any possibility of fraud, Professor Enderby himself recodes every single tube. The result is 40 tubes, none of which should contain any molecules of histamine at all. Conventional science says they are all identical, but if Madeleine Ennis is right her methods should tell which ones contain the real homeopathic dilutions. Now we repeat Ennis's procedure. We take a drop of water from each of the tubes and add a sample of living human cells. Then it's time for Wayne Turnbull at Guy's Hospital to analyse the cells to see whether the homeopathic water has had any effect. He'll be using the most sophisticated system available: a flow cytometer. WAYNE TURNBULL: Loading it up, bringing it up to pressure. Essentially the technology allows us to take individual cells and push them past a focused laser beam. A single stream of cells will be pushed along through the nozzle head and come straight down through the machine. The laser lights will be focused at each individual cell as it goes past. Reflected laser light is then being picked up by these electronic detectors here. NARRATOR: By measuring the light reflected off each cell the computer can tell whether they've reacted or not. WAYNE TURNBULL: This is actually a very fast machine. I can run up to 100 million cells an hour. JAMES RANDI: Whoa. 
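The centesimal arithmetic the narrator is quoting is easy to check. Here is a minimal sketch of the expected molecule counts; the 1:100 step per C is from the transcript, but the 1 mM histamine stock and the 0.1 ml sample volume are my assumptions for illustration:

    # Back-of-the-envelope check of the centesimal (C) dilution arithmetic.
    # Assumed, not from the transcript: a 1 mM histamine stock and a 0.1 ml
    # sample volume. Only the 1:100 dilution per step is from the programme.

    AVOGADRO = 6.022e23                  # molecules per mole

    molarity = 1e-3                      # assumed 1 mM stock
    sample_litres = 1e-4                 # assumed 0.1 ml sample
    molecules = molarity * sample_litres * AVOGADRO   # ~6e16 to start

    for c in range(1, 19):               # 1C .. 18C
        molecules /= 100                 # each C step is a 1:100 dilution
        if c in (5, 12, 15, 18):
            print(f"{c}C: total dilution 10^{2 * c}, "
                  f"~{molecules:.2g} histamine molecules expected per sample")

Under these assumptions 5C still leaves millions of molecules (Mobbs's "still a few molecules left"), the expected count drops below a single molecule around 8C or 9C, and 15C is the 10^30-fold dilution the narrator describes as one million million million million million times, so the histamine tubes should be chemically indistinguishable from the water controls.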
NARRATOR: But to be absolutely rigorous we asked a second scientist, Marion Macey at the Royal London Hospital, to perform the analysis in parallel. Our two labs get to work. Using a flow cytometer they measure how many of the cells are being activated by the different test solutions. Some tubes do seem to be having more of an effect than others. The question is: are they the homeopathic ones? At last the analysis is complete. We gather all the participants at the Royal Society to find out the results. First, everyone confirms that the experiment has been conducted in a rigorous fashion. MARION MACEY: I applied my own numbering system to the... RACHEL PEARSON: ...5, 5.4 millimolar solution... WAYNE TURNBULL: ...we eventually did arrive at a protocol that we were happy with. NARRATOR: Then there's the small matter of the million dollars. JOHN ENDERBY: James, is the cheque in your pocket ready now? JAMES RANDI: We don't actually carry a cheque around. It's in the form of negotiable bonds which will be immediately sep, separated from our account and given to whoever should win the prize. NARRATOR: We asked the firm to fax us confirmation that the million dollar prize is there. JOHN ENDERBY: OK, now look, I'm going to open this envelope. NARRATOR: Now at last it's time to break the code. On hand to analyse the results is statistician Martin Bland. JOHN ENDERBY: 59. NARRATOR: We've divided the tubes into those that did and didn't seem to have an effect in our experiment. JOHN ENDERBY: 62. NARRATOR: Each tube is either a D, for the homeopathic dilutions, or a C, for the plain water controls. JOHN ENDERBY: 52 and 75 were Cs. NARRATOR: Rachel Pearson identifies the tubes with a C or D. If the memory of water is real, each column should have either mostly Cs or mostly Ds. This would show that the homeopathic dilutions are having a real effect, different from ordinary water. There's a hint that the letters are starting to line up. JOHN ENDERBY: Column 1 we've got 5 Cs and a D. Column 3 we've got 4 Cs and a D, so let's press on. 148 and 9, 28 and... NARRATOR: But as more codes are read out the true result becomes clear: the Cs and Ds are completely mixed up. The results are just what you'd expect by chance. A statistical analysis confirms it. The homeopathic water hasn't had any effect. PROF. MARTIN BLAND (St. George's Hospital Medical School): There's absolutely no evidence at all to say that there is any difference between the solution that started off as pure water and the solution that started off with the histamine. JOHN ENDERBY: What this has convinced me is that water does not have a memory. NARRATOR: So Horizon hasn't won the million dollars. It's another triumph for James Randi. His reputation and his money are safe, but even he admits this may not be the final word. JAMES RANDI: Further investigation needs to be done. This may sound a little strange coming from me, but if there is any possibility that there's a reality here I want to know about it, all of humanity wants to know about it. NARRATOR: Homeopathy is back where it started, without any credible scientific explanation. That won't stop millions of people putting their faith in it, but science is confident. Homeopathy is impossible. 
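Martin Bland's verdict - that the Cs and Ds came out "just what you'd expect by chance" - can be illustrated with a small simulation. This is my sketch, not the programme's statistics: the 40-tube, half-and-half design follows the transcript, but the rule of flagging half the tubes as "active-looking" is an invented simplification:

    import random

    # Null hypothesis behind the decoded Cs and Ds: if all 40 blinded tubes
    # are identical, any split into "seemed active" and "seemed inactive"
    # scatters the labels at random.

    def chance_overlap(n_trials=10_000, n_tubes=40, n_flagged=20):
        """Average number of D (dilution) tubes among those flagged as
        active when the labels carry no information at all."""
        labels = ["C"] * (n_tubes // 2) + ["D"] * (n_tubes // 2)
        total = 0
        for _ in range(n_trials):
            random.shuffle(labels)
            total += labels[:n_flagged].count("D")
        return total / n_trials

    print(f"Ds among 20 flagged tubes under pure chance: ~{chance_overlap():.1f}")

A real memory-of-water effect would push that figure toward 20 out of 20; the Horizon decode hovered near the chance value of about 10, which is what the statistical analysis confirmed.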
From checker at panix.com Fri Dec 2 02:55:06 2005
From: checker at panix.com (Premise Checker)
Date: Thu, 1 Dec 2005 21:55:06 -0500 (EST)
Subject: [Paleopsych] Prospect: Chomsky as the world's top public intellectual
Message-ID:

The Chronicle of Higher Education: Magazine & journal reader
http://chronicle.com/daily/2005/11/2005111801j.htm
5.11.18 [articles appended]

A glance at the November issue of Prospect: Chomsky as the world's top public intellectual

Noam Chomsky, the controversial author and professor of linguistics at the Massachusetts Institute of Technology, has been voted the world's leading public intellectual from a list of 100 prominent thinkers compiled by the British magazine. Mr. Chomsky first won acclaim for his transformational-grammar theory, which holds that the ability to form language is an innate human trait. But he is better-known for his outspokenness on political issues. He was a major voice against the Vietnam War and continues to argue against American policies that he finds immoral. He falls into a line of "oppositional intellectuals," writes David Herman, a contributing editor for the magazine, in an explanation of the poll. Mr. Chomsky's selection, he adds, proves that "we still yearn for such figures." More than 20,000 people participated in the magazine's poll. The vote for Mr. Chomsky came as no surprise to Robin Blackburn, a visiting professor of historical studies at the New School, in New York. Mr. Chomsky, he writes, is a "brilliant thinker" who has stepped outside his own field of study in order to lambaste corrupt government policies. Oliver Kamm, a columnist for The Times of London, does not share in the adoration. For starters, he writes, Mr. Chomsky combines elaborate rhetoric with thin evidence to support "dubious arguments." Mr. Kamm particularly criticizes Mr. Chomsky's opposition to American military interventions and arguments that equate American foreign policy with the actions of Nazi Germany. "If this is your judgment of the U.S.," writes Mr. Kamm, "then it will be difficult to credit that its intervention might ever serve humanitarian ends." That's not necessarily so, says Mr. Blackburn, who notes that neither apartheid in South Africa nor Stalinism in Russia was eradicated by "bombardment and invasion." Mr. Chomsky simply opposes putting American soldiers in harm's way, he writes, where they can "do harm and acquire a taste for it." Mr. Blackburn's and Mr. Kamm's essays are contained in the article "For and Against Chomsky," which is available at [54]http://www.prospect-magazine.co.uk/article_details.php?id=7110&AuthKey=fea7d83f56a70abc8c07b819492523e1&issue=512 Mr. Herman's analysis, Global public intellectuals poll, is available at [55]http://www.prospect-magazine.co.uk/article_details.php?id=7078&AuthKey=fea7d83f56a70abc8c07b819492523e1&issue=512 A tally of the votes for all 100 candidates is available at [56]http://www.prospect-magazine.co.uk/intellectuals/results

--Jason M. Breslow

-----------

http://www.prospect-magazine.co.uk/article_details.php?id=7078&AuthKey=fea7d83f56a70abc8c07b819492523e1&issue=512

[No. 116 / Nov 2005]

The Prospect/Foreign Policy list of 100 global public intellectuals suggested that the age of the great oppositional thinker was over, but Noam Chomsky's emphatic victory shows many remain nostalgic for it

David Herman
_________________________________________________________________

The two most striking things about this [40]poll are the number of people who took part and the age of the winners. 
Over 20,000 people voted for their top five names from our longlist of 100, and they tended to reinforce the trends of the original list. More than half of the top 30 are based in North America. Europe, by contrast, is surprisingly under-represented--a cluster of well-known names in the top 20 (Eco, Havel, Habermas) but then it is a long way down to Kristeva (48) and Negri (50). The most striking absence is France--one name in the top 40, fewer than Iran or Peru. There is not one woman in the top ten, and only three in the top 20. The big names of the left did well (Chomsky, Habermas, Hobsbawm) but there weren't many of them. Scientists, literary critics, philosophers and psychologists all fared badly. And voters did not use the "bonus ball" to champion new faces. The top two names, Milton Friedman and Stephen Hawking, do not represent new strands of thought. (In fact, Friedman was specifically named in last month's "criteria for inclusion"--along with other ancient greats like Solzhenitsyn--as an example of someone who had been deliberately left off the longlist on the grounds that they were no longer actively contributing to their discipline.) The poll was in one sense a victim of its own success. Word spread around the internet very quickly, and at least three of our top 20 (Chomsky, Hitchens and Soroush), or their acolytes, decided to draw attention to their presence on the list by using their personal websites to link to Prospect's voting page. In Hitchens's and Soroush's case, the votes then started to flood in. Although it is hard to tell exactly where voters came from, it is likely that a clear majority were from Britain and America, with a fair sprinkling from other parts of Europe and the English-speaking world. There was also a huge burst from Iran, although very little voting from the far east, which may explain why four of the bottom five on the list were thinkers from Japan and China. What is most interesting about the votes, though, is the age of the top names. Chomsky won by a mile, with over 4,800 votes. Then Eco, with just under 2,500, Dawkins and Havel. Only two in the top nine--Hitchens and Rushdie--were born after the second world war. And of the top 20, only Klein and Lomborg are under 50. This may reflect the age of the voters, choosing familiar names. However, surely it also tells us something about the radically shifting nature of the public intellectual in the west. Who are the younger equivalents to Habermas, Chomsky and Havel? Great names are formed by great events. But there has been no shortage of terrible events in the last ten years and some names on the list (Ignatieff, Fukuyama, Hitchens) are so prominent precisely because of what they have said about them. Only one of these, though, is European, and he lives in Washington DC. You can read more elsewhere in this issue about Chomsky. Even if you disagree with his attacks on US foreign policy, there are two reasons why few would be surprised to see him at the top of the poll. First, his intellectual range. Like a number of other figures in the top ten, he is prominent in a number of areas. Havel was a playwright and statesman; Eco a literary critic and bestselling author; Diamond was a professor of physiology and now has a chair in geography at UCLA, and writes on huge issues ranging over a great time span. 
Second, and more important, Chomsky belongs to a tradition which goes back to Zola, Russell and Sartre: a major thinker or writer who speaks out on the great public issues of his time, opposing his government on questions of conscience rather than the fine print of policy. I said last month in my commentary on the original Prospect/Foreign Policy list of 100 names that it seemed to represent the death of that grand tradition of oppositional intellectuals. The overwhelming victory for Noam Chomsky suggests that we still yearn for such figures--we just don't seem to be able to find any under the age of 70.

http://www.prospect-magazine.co.uk/intellectuals/results

The Prospect/FP Global public intellectuals poll--results

Over 20,000 people voted for their top names from our original longlist of 100. The final results are below; click [10]here for David Herman's analysis, and [11]here for brief biographies of the top names.

Position  Name  Total votes
1  Noam Chomsky  4827
2  Umberto Eco  2464
3  Richard Dawkins  2188
4  Václav Havel  1990
5  Christopher Hitchens  1844
6  Paul Krugman  1746
7  Jürgen Habermas  1639
8  Amartya Sen  1590
9  Jared Diamond  1499
10  Salman Rushdie  1468
11  Naomi Klein  1378
12  Shirin Ebadi  1309
13  Hernando De Soto  1202
14  Bjørn Lomborg  1141
15  Abdolkarim Soroush  1114
16  Thomas Friedman  1049
17  Pope Benedict XVI  1046
18  Eric Hobsbawm  1037
19  Paul Wolfowitz  1028
20  Camille Paglia  1013
21  Francis Fukuyama  883
22  Jean Baudrillard  858
23  Slavoj Zizek  840
24  Daniel Dennett  832
25  Freeman Dyson  823
26  Steven Pinker  812
27  Jeffrey Sachs  810
28  Samuel Huntington  805
29  Mario Vargas Llosa  771
30  Ali al-Sistani  768
31  EO Wilson  742
32  Richard Posner  740
33  Peter Singer  703
34  Bernard Lewis  660
35  Fareed Zakaria  634
36  Gary Becker  630
37  Michael Ignatieff  610
38  Chinua Achebe  585
39  Anthony Giddens  582
40  Lawrence Lessig  565
41  Richard Rorty  562
42  Jagdish Bhagwati  561
43  Fernando Cardoso  556
44=  JM Coetzee  548
44=  Niall Ferguson  548
46  Ayaan Hirsi Ali  546
47  Steven Weinberg  507
48  Julia Kristeva  487
49  Germaine Greer  471
50  Antonio Negri  452
51  Rem Koolhaas  429
52  Timothy Garton Ash  428
53  Martha Nussbaum  422
54  Orhan Pamuk  393
55  Clifford Geertz  388
56  Yusuf al-Qaradawi  382
57  Henry Louis Gates Jr.  379
58  Tariq Ramadan  372
59  Amos Oz  358
60  Larry Summers  351
61  Hans Küng  344
62  Robert Kagan  339
63  Paul Kennedy  334
64  Daniel Kahneman  312
65  Sari Nusseibeh  297
66  Wole Soyinka  296
67  Kemal Dervis  295
68  Michael Walzer  279
69  Gao Xingjian  277
70  Howard Gardner  273
71  James Lovelock  268
72  Robert Hughes  259
73  Ali Mazrui  251
74  Craig Venter  244
75  Martin Rees  242
76  James Q Wilson  229
77  Robert Putnam  221
78  Peter Sloterdijk  217
79  Sergei Karaganov  194
80  Sunita Narain  186
81  Alain Finkielkraut  185
82  Fan Gang  180
83  Florence Wambugu  159
84  Gilles Kepel  156
85  Enrique Krauze  144
86  Ha Jin  129
87  Neil Gershenfeld  120
88  Paul Ekman  118
89  Jaron Lanier  117
90  Gordon Conway  90
91  Pavol Demes  88
92  Elaine Scarry  87
93  Robert Cooper  86
94  Harold Varmus  85
95  Pramoedya Ananta Toer  84
96  Zheng Bijian  76
97  Kenichi Ohmae  68
98=  Wang Jisi  59
98=  Kishore Mahbubani  59
100  Shintaro Ishihara  57

We asked voters to select a "bonus ball" nomination--a name they believe we should have included on our original longlist. Hundreds of people were chosen--from Bob Dylan to Kofi Annan. 
Here are the top 20 names:

Position  Name  Total votes
1  Milton Friedman  98
2  Stephen Hawking  81
3  Arundhati Roy  78
4  Howard Zinn  72
5  Bill Clinton  67
6  Joseph Stiglitz  57
7  Johan Norberg  48
8=  Dalai Lama  45
8=  Thomas Sowell  45
10=  Cornel West  39
10=  Nelson Mandela  39
12  Gore Vidal  37
13  Mohammad Khatami  35
14  John Ralston Saul  33
15=  George Monbiot  26
15=  Judith Butler  26
17  Victor Davis Hanson  25
18  Gabriel García Márquez  24
19=  Bono  23
19=  Harold Bloom  23

http://www.prospect-magazine.co.uk/article_details.php?id=7110&AuthKey=fea7d83f56a70abc8c07b819492523e1&issue=512

[No. 116 / Nov 2005]

For and against Chomsky

Is the world's top public intellectual a brilliant expositor of linguistics and the US's duplicitous foreign policy? Or a reflexive anti-American, cavalier with his sources?

Robin Blackburn / Oliver Kamm
_________________________________________________________________

Robin Blackburn teaches at the New School for Social Research, New York. Oliver Kamm is a "Times" columnist.

For Chomsky

Robin Blackburn celebrates a courageous truth-teller to power

The huge [40]vote for Noam Chomsky as the world's leading "public intellectual" should be no surprise at all. Who could match him for sheer intellectual achievement and political courage? Very few transform an entire field of enquiry, as Chomsky has done in linguistics. Chomsky's scientific work is still controversial, but his immense achievement is not in question, as may be easily confirmed by consulting the recent Cambridge Companion to Chomsky. He didn't only transform linguistics in the 1950s and 1960s; he has remained in the forefront of controversy and research. The huge admiration for Chomsky evident in Prospect's poll is obviously not only, or even mainly, a response to intellectual achievement. Rather it goes to a brilliant thinker who is willing to step outside his study and devote himself to exposing the high crimes and misdemeanours of the most powerful country in the world and its complicity with venal and brutal rulers across four continents over half a century or more. Some believe--as Paul Robinson, writing in the New York Times Book Review, once put it--that there is a "Chomsky problem." On the one hand, he is the author of profound, though forbiddingly technical, contributions to linguistics. On the other, his political pronouncements are often "maddeningly simple-minded." In fact, it is not difficult to spot connections between the intellectual strategies Chomsky has adopted in science and in politics. Chomsky's approach to syntax stressed the economy of explanation that could be achieved if similarities in the structure of human languages were seen as stemming from biologically rooted, innate capacities of the human mind, above all the recursive ability to generate an infinite number of statements from a finite set of words and symbols. Many modern critics of the radical academy are apt to bemoan its disregard for scientific method and evidence. This is not a reproach that can be aimed at Chomsky, who has pursued a naturalistic and reductionist standpoint in what he calls, in the title of his 1995 volume, The Minimalist Programme. Chomsky's political analyses also strive to keep it simple, but not at the expense of the evidence, which he can abundantly cite if challenged. But it is "maddening" none the less, just as the minimalist programme may be to some of his scientific colleagues. 
The apparent straightforwardness of Chomsky's political judgements--his "predictable" or even "kneejerk" opposition to western, especially US, military intervention--could seem simplistic. Yet they are based on a mountain of evidence and an economical account of how power and information are shared, distributed and denied. Characteristically, Chomsky begins with a claim of stark simplicity which he elaborates into an intricate account of the different roles of government, military, media and business in the running of the world. Chomsky's apparently simple political stance is rooted in an anarchism and collectivism which generates its own sense of individuality and complexity. He was drawn to the study of language and syntax by a mentor, Zellig Harris, who also combined libertarianism with linguistics. Chomsky's key idea of an innate, shared linguistic capacity for co-operation and innovation is a positive, rather than purely normative, rebuttal of the Straussian argument that natural human inequality vitiates democracy. Andersen's tale of the little boy who, to the fury of the courtiers, pointed out that the emperor was naked, has a Chomskian flavour, not simply because it told of speaking truth to power but also because the simple childish eye proved keener than the sophisticated adult eye. I was present when Chomsky addressed Karl Popper's LSE seminar in the spring of 1969 and paid tribute to children's intellectual powers (Chomsky secured my admittance to the seminar at a time when my employment at the LSE was suspended). As I recall, Chomsky explained how the vowel shift that had occurred in late medieval English was part of a transformation that resulted from a generational dynamic. The parent generation spoke using small innovations of their own, arrived at in a spontaneous and ad hoc fashion. Growing youngsters, because of their innate syntactical capacity, ordered the language they heard their parents using by means of a more inclusive grammatical structure, which itself made possible more systematic change. In politics, the child's eye might see right through the humanitarian and democratic claptrap to the dismal results of western military interventions--shattered states, gangsterism, narco-traffic, elite competition for the occupiers' favour, vicious communal and religious hatred. Chomsky openly admits he prefers "pacifist platitudes" to belligerent mendacity. This makes some wrongly charge that he is "passive in the face of evil." But neither apartheid in South Africa, nor Stalinism in Russia, nor military rule in much of Latin America were defeated or dismantled by bombardment and invasion. Chomsky had no difficulty supporting the ultimately successful campaign against apartheid, or for the Indonesian withdrawal from East Timor. He simply opposes putting US soldiers in harm's way--also meaning where they will do harm and acquire a taste for it. Chomsky's victory in a parlour game should not be overpitched. But, like Marx's win earlier this year in the BBC Radio 4 competition for "greatest philosopher," it shows that thinking people are still attracted by the critical impulse, above all when it is directed with consistency at the trend towards a global pensée unique. The Prospect/FP list was sparing in its inclusion of critics of US foreign policy, which may have increased Chomsky's lead a little. But no change in the list would have made a difference to the outcome. The editors had misjudged the mood and discernment of their own readers. 
_________________________________________________________________

Against Chomsky

Oliver Kamm deplores his crude and dishonest arguments

In his book Public Intellectuals: A Study of Decline, Richard Posner noted that "a successful academic may be able to use his success to reach the general public on matters about which he is an idiot." Judging by caustic remarks elsewhere in the book, he was thinking of Noam Chomsky. He was not wrong. Chomsky remains the most influential figure in theoretical linguistics, known to the public for his ideas that language is a cognitive system and the realisation of an innate faculty. While those ideas enjoy a wide currency, many linguists reject them. His theories have come under criticism from those, such as the cognitive scientist Steven Pinker, who were once close to him. Paul Postal, one of Chomsky's earliest colleagues, stresses the tendency for the grandiloquence of Chomsky's claims to increase as he addresses non-specialist audiences. Frederick Newmeyer, a supporter of Chomsky's ideas until the mid-1990s, notes: "One is left with the feeling that Chomsky's ever-increasingly triumphalistic rhetoric is inversely proportional to the actual empirical results that he can point to." Prospect readers who voted for Chomsky will know his prominence in linguistics, but are more likely to have read his numerous popular critiques of western foreign policy. The connection, if any, between Chomsky's linguistics and his politics is a matter of debate, but one obvious link is that in both fields he deploys dubious arguments leavened with extravagant rhetoric--which is what makes the notion of Chomsky as pre-eminent public intellectual untimely as well as unwarranted. Chomsky's first book on politics, American Power and the New Mandarins (1969), grew from protest against the Vietnam war. But Chomsky went beyond the standard left critique of US imperialism to the belief that "what is needed [in the US] is a kind of denazification." This diagnosis is central to Chomsky's political output. While he does not depict the US as an overtly repressive society--instead, it is a place where "money and power are able to filter out the news fit to print and marginalise dissent"--he does liken America's conduct to that of Nazi Germany. In his newly published Imperial Ambitions, he maintains that "the pretences for the invasion [of Iraq] are no more convincing than Hitler's." If this is your judgement of the US then it will be difficult to credit that its interventionism might ever serve humanitarian ends. Even so, Chomsky's political judgements have only become more startling over the past decade. In The Prosperous Few and the Restless Many (1994), Chomsky considered whether the west should bomb Serb encampments to stop the dismemberment of Bosnia, and by an absurdly tortuous route concluded "it's not so simple." By the time of the Kosovo war, this prophet of the amoral quietism of the Major government had progressed to depicting Milosevic's regime as a wronged party: "Nato had no intention of living up to the scraps of paper it had signed, and moved at once to violate them." After 9/11, Chomsky deployed fanciful arithmetic to draw an equivalence between the destruction of the twin towers and the Clinton administration's bombing of Sudan--in which a pharmaceutical factory, wrongly identified as a bomb factory, was destroyed and a nightwatchman killed. 
When the US-led coalition bombed Afghanistan, Chomsky depicted mass starvation as a conscious choice of US policy, declaring that "plans are being made and programmes implemented on the assumption that they may lead to the death of several million people in the next couple of weeks... very casually, with no particular thought about it." His judgement was offered without evidence. In A New Generation Draws the Line: Kosovo, East Timor and the Standards of the West (2000), Chomsky wryly challenged advocates of Nato intervention in Kosovo to urge also the bombing of Jakarta, Washington and London in protest at Indonesia's subjugation of East Timor. If necessary, citizens should be encouraged to do the bombing themselves, "perhaps joining the Bin Laden network." Shortly after 9/11, the political theorist Jeffrey Isaac wrote of this thought experiment that, while it was intended metaphorically, "One wonders if Chomsky ever considered the possibility that someone lacking in his own logical rigour might read his book and carelessly draw the conclusion that the bombing of Washington is required." This episode gives an indication of the destructiveness of Chomsky's advocacy even on issues where he has been right. Chomsky was an early critic of Indonesia's brutal annexation of East Timor in 1975 in the face of the indolence, at best, of the Ford administration. The problem is not these criticisms, but Chomsky's later use of them to rationalise his opposition to western efforts to halt genocide elsewhere. (Chomsky buttresses his argument, incidentally, with a peculiarly dishonest handling of source material. He manipulates a self-mocking reference in the memoirs of the then US ambassador to the UN, Daniel Patrick Moynihan, by running separate passages together as if they are sequential and attributing to Moynihan comments he did not make, to yield the conclusion that Moynihan took pride in Nazi-like policies. The victims of cold war realpolitik are real enough without such rhetorical expedients.) If Chomsky's political writings expressed merely an idée fixe, they would be a footnote in his career as a public intellectual. But Chomsky has a dedicated following among those of university education, and especially of university age, for judgements that have the veneer of scholarship and reason yet verge on the pathological. He once described the task of the media as "to select the facts, or to invent them, in such a way as to render the required conclusions not too transparently absurd--at least for properly disciplined minds." There could scarcely be a nicer encapsulation of his own practice.

The author is grateful for the advice of Bob Borsley and Paul Postal.

From checker at panix.com Fri Dec 2 02:55:19 2005
From: checker at panix.com (Premise Checker)
Date: Thu, 1 Dec 2005 21:55:19 -0500 (EST)
Subject: [Paleopsych] CHE: Placebos Could Play a Role in Treating Some Conditions, Scientists Say
Message-ID:

Placebos Could Play a Role in Treating Some Conditions, Scientists Say
News bulletin from the Chronicle of Higher Education, 5.11.21
http://chronicle.com/daily/2005/11/2005112103n.htm

The placebo effect -- it's all in your head. When you swallow sugar pills instead of powerful medicine and your symptoms disappear, it's all thanks to the power of your mind. How does the brain perform that parlor trick? In the past, scientists suspected that any apparent health benefits from placebos had little more basis in biology than did sleight of hand. 
In studies of new drugs, patients might tell their doctors they feel better because they think that is what their doctors want to hear. Or perhaps they would have recovered without any treatment, real or sham. But researchers now know that the placebo effect is real and grounded in the physiology of the brain. Using techniques to peer inside the skull, they have begun to find regions of the brain that respond to placebos, and they have even watched a single nerve cell react to a sham medicine. Those studies show that placebos affect the brain in much the same way that actual treatments do, researchers reported here last week at the annual meeting of the Society for Neuroscience. In other words, the power to treat several troublesome disorders may be wrapped up in the three-pound spongy lump of tissue protected by the skull. The research points to the power of positive thinking -- even at the unconscious level. When the brain expects relief, it can manufacture some on its own. "The things you can change with a positive outlook are profound," said Tor D. Wager, an assistant professor of psychology at Columbia University. "They are deeper physiologically than we have previously appreciated." None of the researchers who study the mechanism of the placebo effect suggest that doctors should prescribe dummy pills instead of real medicine. But they say that the study of the placebo effect could change how scientists perform clinical trials of new treatments and could even alter how we understand and treat pain, Parkinson's disease, and depression. By studying placebos, said Christian S. Stohler, dean of the school of dentistry at the University of Maryland at Baltimore, "you crack into disease mechanisms that might be very important for improving the lives of many pain patients."

Fooling the Patient

Researchers gained their first glimpse at the causes of the placebo effect in the late 1970s, when scientists discovered that under certain conditions they could cancel the effect. In a study of pain relievers, a drug called naloxone prevented patients on placebo pills from experiencing the usual benefit. Since naloxone blocks the action of painkillers called opioids, researchers figured that placebos must stimulate the brain to produce its own opioids. In the 1990s, another set of experiments provided more evidence that the placebo effect was a real physiological phenomenon. Fabrizio Benedetti, a professor of neuroscience at the University of Turin, and others studied the effect without using a placebo. Dr. Benedetti judged that a placebo's effect comes from the patient's psychosocial context: talking to a doctor, observing the treatment, and expecting improved health. So he took away that context by giving study participants real drugs, but on the sly. Patients were told that they would receive an active drug, a placebo, or nothing through intravenous needles, and consented to get any of the different treatments without knowing when any treatment would be supplied. The scientists compared the results when a doctor overtly gave the patient the drug and when a computer supplied the drug without the patient's knowledge. Bedside manner, it turned out, made a difference: Patients required far more painkiller if they unknowingly received the medicine from a computer. When the doctor gives a drug in full view, Dr. Benedetti said at the neuroscience conference, "there is an additive effect of the drug and of the placebo, the psychosocial component." 
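Benedetti's additive account lends itself to a toy calculation. The sketch below is my illustration only, with invented numbers: relief is modelled as a drug term plus a psychosocial term that is present only when the dose is given openly, so reaching the same relief covertly takes more drug:

    # Toy version of the additive model: total relief = drug effect +
    # psychosocial (expectation) effect, the latter present only with
    # overt dosing. All parameter values are invented for illustration.

    def dose_needed(target, relief_per_mg, expectation, overt):
        """Smallest dose whose combined relief reaches the target."""
        boost = expectation if overt else 0.0
        drug_share = max(target - boost, 0.0)  # relief the drug itself must supply
        return drug_share / relief_per_mg

    for overt in (True, False):
        mg = dose_needed(target=10.0, relief_per_mg=1.0, expectation=3.0, overt=overt)
        label = "overt (doctor in view)" if overt else "covert (hidden line)"
        print(f"{label}: {mg:.0f} mg for the same relief")

On these made-up numbers the hidden infusion needs 10 mg where the open injection needs only 7 - the direction of Benedetti's finding - and the covert arm alone isolates the purely pharmacological effect, which is the logic of the trial design he goes on to propose.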
He suggested that his experimental setup could be extended to become part of the testing procedure for new drugs. Clinical trials could then compare covert and overt administration, rather than comparing the active drug to a placebo. That way, none of the volunteers would go through the trouble of participating without receiving the real experimental treatment, and researchers could still demonstrate that the drug was effective by showing that it reduced symptoms when given covertly.

Peering at the Brain

With the recent advent of modern brain-scanning techniques, scientists gained the ability to look directly at the regions of the brain involved in the placebo effect. In 2002 researchers in Finland and Sweden published in Science the first brain images of the effect, using a technique called positron emission tomography, better known as PET. The researchers pressed a hot surface onto the hands of nine male volunteers, and then a doctor gave them injections of either a painkiller or a placebo. When the researchers performed PET scans on the men, both the drug and the dummy induced high blood flow -- indicating brain activity -- in an area of the brain called the rostral anterior cingulate cortex. That area plays a key role in the painkilling effects of opioid drugs. Then in 2004, also in Science, Mr. Wager reported using functional magnetic resonance imaging, or fMRI, to show that a placebo that relieved pain also decreased activity in the brain's pain-sensing areas. Different people felt varying amounts of pain relief from the placebo. The amount of pain reduction a volunteer experienced went hand in hand with the amount of change in activity in the brain. "Part of the effect of a drug," Mr. Wager said at the conference, "is it changes the way you think about drugs." Jon-Kar Zubieta, an associate professor of psychiatry and radiology at the University of Michigan at Ann Arbor, and several colleagues, including Dr. Stohler, of the University of Maryland, peered deeper into the brain's workings by finding out where the brain produces opioids in response to placebo treatment. They used PET scans along with a stain that marks opioid activity in the brain. When the researchers gave male volunteers a painful injection of saline solution into their jaw muscles, the scans showed an increase of opioids in the brain. Most of the regions where the brain produced painkillers coincided with the ones that Mr. Wager identified as important. "Expectation releases substances, molecules, in your brain that ultimately change your experience," said Dr. Stohler. "Our brain is on drugs. It's on our own drugs."

Relief for Parkinson's

The placebo effect helps not only people in pain but also patients with diseases. In fact, scientists got their most detailed look at the placebo effect by studying how single neurons responded to sham drugs given to Parkinson's patients. Parkinson's disease is a motor disorder caused by the loss of brain cells that produce dopamine. Some patients experience temporary relief of symptoms from a placebo, and a previous study showed that the relief occurred because the brain produced dopamine in response. Patients who have Parkinson's disease sometimes receive surgery to implant electrodes deep within the brain. The electrodes can stimulate a neuron or record its activity. Dr. Benedetti, of the University of Turin, and his colleagues enrolled 11 patients who underwent surgery for that type of treatment. 
They gave the patients a placebo injection, telling them it was a powerful drug that should improve their motor control. The researchers then compared the activity of a single neuron before and after injection of the placebo. In the six patients who responded to the placebo -- who demonstrated less arm rigidity and said they felt better -- the rate of firing of the neuron went down. (Nerve cells "fire," or generate electrical impulses, in order to send signals to neighboring neurons.) The neurons' firing rate did not change for people who experienced no placebo effect. Another disorder that shows clinical improvement with placebos is depression. Depressed patients' moods often lift when they take a placebo, although the effect does not last, and they normally need to seek real treatment, according to Helen S. Mayberg, a professor of neurology and of psychiatry and behavioral sciences at Emory University. Dr. Mayberg became immersed in placebo research a few years ago, when she did a PET study of the brain's response to an antidepressant and to a placebo. In her study of 15 depressed men, four who had taken Prozac and four who had received a placebo experienced a remission of their symptoms. At the end of six weeks, after allowing the drug sufficient time to take effect, Dr. Mayberg took PET scans. For patients whose symptoms improved, the regions where the brain activity increased after a patient took a placebo formed a subset of the regions that increased after a patient took the true drug. "Drug is placebo plus," she said at the conference. In patients whose symptoms did not improve, whether they were on Prozac or on the placebo, the brain activity did not increase in those regions. She published the results of that study in 2002, but at the conference she reported a new analysis of her data. In the study, she had also collected brain scans one week after patients had begun receiving their treatments, even though the drug had not yet taken its full effect. Still, people whose symptoms later improved, whether they took the placebo or Prozac, again had increased brain activity in similar areas. One week into treatment, she said, the men's state of mind could be interpreted as a "heightened state of expectation" since they were anticipating clinical improvements. Nonresponders did not show those patterns, so such expectation could be key to whether a depressed patient will recover.

Raising Expectations

Dr. Mayberg would like to find ways to help those who do not respond to antidepressant drugs, and she surmises that expectation could make the difference. Such patients, she said, perhaps should imagine themselves getting well. "What is expectation?" she asked. "How do you cultivate it?" Those are questions that all of the scientists involved in this research would like to answer. Patients with chronic pain, said Dr. Zubieta, of Michigan, perhaps have lost the ability to produce the brain's natural painkillers. "If you are able to recruit mechanisms that help you cope with stress or pain, that's a good thing," he said. "The question is, How do things like this, or meditation, or biofeedback, work? We don't know." Dr. Stohler, of Maryland, agrees. "Getting a person to boost their own machinery to improve health -- that's something that medicine needs to know," he said. It may be especially urgent for patients with dementia, according to Dr. Benedetti. At the conference, he reported preliminary results that patients with Alzheimer's disease may not experience placebo effects at all. 
He found that Alzheimer's patients felt no difference between overt and hidden administration of painkillers. To Dr. Benedetti, that suggests that the psychological components of treatments -- the expectation of health improvements, and the circuits that such expectations create in the brain -- are absent. Perhaps, he said at the conference, doctors need to take that loss into account when prescribing any drug for Alzheimer's patients. Those patients may need higher doses of many drugs, such as painkillers, if their brain has stopped aiding the drug's action. The mind, it seems, may play a critical role in treating diseases. And its services come free of charge, with no co-payments or deductibles.
_________________________________________________________________

Background articles from The Chronicle:
* [68]Take 2 Herbal Remedies and Call Me in the Morning (11/18/2005)
* [69]Pray and Bear It (2/11/2005)
Magazine & Journal Reader:
* [70]A Glance at 'Current Directions in Psychological Science': Why Placebos Work (10/19/2005)
68. http://chronicle.com/weekly/v52/i13/13a01001.htm
69. http://chronicle.com/weekly/v51/i23/23a00703.htm
70. http://chronicle.com/daily/2005/10/2005101901j.htm

E-mail me if you have problems getting the referenced articles.

From checker at panix.com Fri Dec 2 02:55:26 2005
From: checker at panix.com (Premise Checker)
Date: Thu, 1 Dec 2005 21:55:26 -0500 (EST)
Subject: [Paleopsych] CHE: So Happy Together
Message-ID:

So Happy Together
The Chronicle of Higher Education, 5.11.25
http://chronicle.com/weekly/v52/i14/14c00301.htm

CATALYST

If you're a scientist who is not used to collaborating with nonscientists, you'd better get used to it

By KAREN M. MARKIN

Perhaps you love your scientific work because it allows you to spend lots of time outdoors, taking water samples in all kinds of weather. Or perhaps your scientific work allows you to hole up with your computer and run calculation after calculation as you seek solutions to problems. Either way, it's you and your intellectual pursuits, shielded from the day-to-day irritations of dealing with people. So what could you possibly gain from collaborating with others as you pursue your scientific goals, exposing yourself to interpersonal conflict like a lab rat to a pathogen? More money. Whether we like it or not, collaboration is becoming the norm for much federally financed research. Sometimes the complexity of today's scientific questions requires investigators from a variety of disciplines to work together. In other cases, agencies seek multiple payoffs from their grant dollars: educational innovations and societal benefits as well as advances in basic science. Either way, the multiyear, multimillion-dollar awards increasingly are reserved for collaborative work. Some investigators may now be thinking, "I can play that game. I'll collect a bunch of individual research proposals, slap them together, and send them in under one title. More money for the same amount of effort on my part." It doesn't work that way. Over and over, I have heard program officers say that in the collaborative proposals they see, scientific excellence is usually a given. What makes or breaks those proposals are the nonscience aspects, such as management and leadership. Here are some things to consider in preparing a competitive collaborative grant application.

Thinking Outside the College.
First, understand that your idea of multidisciplinary and the grant agency's idea of that concept may be very different, and it is the agency's view that matters in grant writing. Faculty members often focus narrowly on their area of expertise, so that anything just a little different seems exotic. For example, a physical oceanographer might view a partnership with a biological oceanographer as multidisciplinary work. While that may be so in the rarefied world of oceanography journals, it typically is not enough for a large grant agency. Such agencies want you to do more than think outside the department. They want you to think outside the college. When I say outside the college, I don't just mean chemists joining hands with chemical engineers. In some instances, it can mean scientists engaging with social scientists and humanists. Check past awards in the program that interests you to see what has been considered multidisciplinary. Let's use the hypothetical example of a center for the study of natural disasters such as earthquakes to consider what a multidisciplinary collaboration might look like. The team will naturally include a seismologist, a geophysicist, and an earthquake engineer. But a comprehensive center also might include social scientists. One might explore how groups of people behave when faced with an imminent threat. Another might be a public-policy expert who studies obstacles to effective emergency planning. Those social scientists will have to be an integral part of the team. If you have an underlying disdain for what you view as "soft" scientists, it will come through loud and clear. For a truly collaborative project, you will need to accept them as equal partners rather than people whose biographical sketches you throw in merely to satisfy the program requirement for investigators outside your discipline. You show that they are partners by providing resources for them in the budget. It is also wise to include them in development of the proposal to ensure it is sound from a scholarly standpoint. If you are a biologist and you write your conception of what your political-science colleagues will contribute instead of their conception, you will weaken your case for collaboration. You might make a fatal mistake, such as calling psychology one of the humanities. (It has happened.) Reviewers of collaborative proposals will be drawn from the array of disciplines represented in the proposal. A political scientist from another institution will quickly notice if your social scientists are mere window dressing. In planning your collaboration, think about an orchestra. If the violinist is fiddling away at a bluegrass melody, the clarinetist is tootling a klezmer tune, and the pianist is banging out Billy Joel, it's cacophony, no matter how good they are individually. But put them together for Rhapsody in Blue, and they're making music.

They Also Serve Who Only Push Paper.

The entire scholarly team will have to accept that a large collaborative grant requires the services of people who aren't scientists but must be adequately paid. Some researchers find it anguishing to spend their scarce grant dollars on anything but lab equipment and scientific personnel. But part of the challenge of a large collaborative grant is to manage it efficiently after you receive the award. That takes time, and you probably have firsthand experience with it. Do you complain when you have to submit annual and final reports for your grants? 
Think of that kind of work multiplied by a factor of 10 or 15, and you will begin to see the value of a project manager. Previous recipients of collaborative grants say that bad management, rather than bad science, is usually the reason that a renewal application is rejected. Staffing needs will vary from program to program, but all collaborations need a manager and someone to oversee the budget. In some collaborative projects, agencies expect diversity and education efforts. Some investigators have hired full-time individuals for each of those duties. Although these administrative tasks may sound like punishment to you, some people enjoy them and perform them well.

Follow the Leader.

A collaborative grant requires strong leadership. The impetus must come from faculty members who are excited about pursuing the area of scientific inquiry at the heart of the project. The principal investigator should be a prominent scientist with a long record of extramural grants and publications. But the project also needs someone to serve as its prime mover, and that person does not necessarily have to be the senior scientist. That individual has to be willing to put time and energy into pulling together the collaboration. He or she needs to be organized, a good time manager, a team builder, and able to take criticism in stride. The project leader also must be able to persuade top institutional officials that multidisciplinary work is valuable and rewarded in tenure and promotion decisions. Those tasks are clearly not science, but they're essential to the success of the project. If you scoff at them as mere management clichés, find someone who takes them seriously. Those who have formed collaborations emphasize that they take a lot of time. It is common to spend a year developing a collaborative proposal that is based on a decade of less formal interactions with other scientists. As with any proposal, it may take two or three submissions before you get any money. But look on the bright side: The additional years you spend revising the proposal allow you to develop better relationships with your collaborators -- and to jettison the ones you don't want.

Karen M. Markin is director of research development at the University of Rhode Island's research office.

From checker at panix.com Sat Dec 3 02:27:51 2005
From: checker at panix.com (Premise Checker)
Date: Fri, 2 Dec 2005 21:27:51 -0500 (EST)
Subject: [Paleopsych] Meme 052: The Inverted Demographic Pyramid
Message-ID:

Meme 052: The Inverted Demographic Pyramid
by Frank Forman
sent 5.12.2

The inverted demographic pyramid, those richer and more able having fewer children, has been a problem for evolutionary theory ever since Francis Galton. My solution is that the decision about whether to have more or fewer children is determined by a trade-off function set in the Stone Age. What parents consider to be adequate support for a child is determined more by what their peers do than by the objective facts of the situation today, which would indicate a much larger advantage for the better off than in the Stone Age, where incomes were far more equal. But we listen to the "whispering genes within" rather than accept any factual studies that back up the Ninety-Six Percent Rule, namely that 96% of parents don't matter much one way or the other. An article in the New York Times, shown below, about a surprising 26% increase in the number of children age 0-5 in Manhattan between 2000 and 2004 induced these reflections.
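Forman's "trade-off function" can be given a concrete form. The following is my formalisation, not his: a standard quantity-quality model in which parents split a resource budget among n children, each child's chance of reproducing rises steeply up to an adequacy threshold and then saturates, and fitness is the expected number of reproducing children. The sigmoid survival curve and the one-unit threshold are invented for illustration:

    # Quantity-quality sketch of the trade-off function (my formalisation,
    # not Forman's): fitness = n * s(budget / n), where s() is a child's
    # chance of reproducing given its share of parental investment.

    def survival(invest):
        """Toy sigmoid: poor odds below about one unit of investment per
        child, saturating returns above it."""
        return invest ** 2 / (invest ** 2 + 1.0)

    def best_family_size(budget, max_n=30):
        """Family size maximising the expected number of reproducing children."""
        return max(range(1, max_n + 1), key=lambda n: n * survival(budget / n))

    for budget in (2, 5, 12):  # arbitrary resource units
        print(f"budget {budget}: fitness-optimal family size = {best_family_size(budget)}")

With the adequacy threshold held at its Stone Age level, the optimum grows with the budget, so the rich should have more children, exactly as Forman argues. Replace the fixed threshold with one indexed to what the neighbors spend, and the optimum for high-budget parents stops growing with income: that substitution is his peer-comparison story in miniature.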
This increase is probably just an effect of greater income inequality in recent years, not a sudden reversal of the inverted demographic pyramid. This paradox, as we all know, has caused some to question the whole selfish gene-sociobiological paradigm, and with good reason, though I try to make a good crack at saving the paradigm here. Animals in any species can choose, within limits, whether to pursue an r strategy (mnemonic: reproduce, reproduce, reproduce) of many offspring with little parental investment per child or the K (mnemonic: Kwality) strategy of few children but high investment per child. The trade-off *function* was mostly set in the Stone Age. Conditions have changed, and rich parents should be able to have far, far more children than the poor, since income inequality is far greater today than then by, I think, every anthropological account. But when you ask rich parents why they don't pack them in like they do in the barrios, you get told that that would be indecent and inadequate, with a vehemence that befits moral absolutes. What's going on is that one's standard of decency or adequacy is not set by thinking about Stone Age environments, nor by comparison with those in the barrios and ghettos and whatever quarters Asian immigrants cram themselves into, who lead far longer lives than Stone Age man ever did, but by comparison with one's peers. Your neighbors surround their children with a big house, give them an expensive education, and so on. The Stone Age genes within you whisper to you that if you don't do these things for your kids, they will not have their own children and you will have no grandchildren. You will ignore any studies by Judith Rich Harris that affirm the Ninety-Six Percent Rule, that only the worst and best two percent of parents make a measurable difference in how your kids turn out. You will reject showings by economists that educational credentials count for little beyond helping your children get their first jobs. You look at only a small slice of the population, namely your peers, in which effort does seem to matter more than innate factors. Indeed, the big brains of primates are geared more to getting along with one's fellows (thus allowing for greater and more complex social cooperation) than to maneuvering the physical environment by finding out what is really out there. It's an accidental byproduct of blue eyes and flaming red (or blond) hair (my "Maureen Dowd Theory of Western Civilization") that triggered off a larger regard for objectivity. Mr. Mencken was often given to noting how weak this regard is, even in America, especially in America, but he did not know the rest of the world. There are other factors involved in the inverted demographic pyramid. Our drives work only remotely, and there is no drive for maximizing inclusive reproductive fitness directly. (I don't need to beat yet another drum for group selection here.) Of the Sixteen Basic Drives Steven Reiss has identified through factor analysis, Romance certainly seems closely related, this drive (no. 2 on my personal list) including acts of coitus and also having aesthetic experiences. (I can't logic out the connection, but these three are correlated so much on questionnaires that they cluster into a single drive.) The desire to raise one's own children (NOT clustered with the drive to raise adopted children) would also seem to weigh heavily in the selfish gene model. (It's no.
6 on my list, ranked that high, not because I have spent a great deal of time, Kwality or not, with my children, but because I chose to give up the teaching job I really would have much preferred.) Spending lots of time with them does not satisfy my no. 1 drive, Curiosity, all that well. I'd rather read books! Indeed, Curiosity, which is so much more satisfiable today than in the Stone Age (a supply-side change), could well be responsible for a large part of the inverted demographic pyramid. I suggest that those having higher incomes (correlated 0.5 with intelligence but making a huge difference between populations then and now) will purchase relatively more satisfaction of this drive today, with the result of having relatively fewer children, than they would have back in the Stone Age. There's also the drive for Status (no. 14 out of 16 on my list, which explains why we chose to live in an inexpensive apartment in a high-toned neighborhood and let the neighbors snub as they may, as some did), which means that parents will spend on their children to impress their peers as well as to actually help their kids. This may also be more readily satisfiable today than then. I don't know. And so on, through the rest of the Reiss list. I resend my meme on them at the end by the simple expedient of typing ctrl-R||enter. Neat, isn't it? That's what a UNIX shell account gives you. I'm just giving a framework for speculation. The hard work of empirically weighing the changes in supply and demand for the drives, which as I said are only loosely connected to reproductive success, begins. It will be a nearly impossible task to do with full scientific rigor, since we don't know all that much about the EEA. But, once again, don't compare your findings against a perfectionist model but merely with *competing* explanations, any more than you should compare the actual workings of the market with an ideal government that would correct market defects. P.S. I'm not a Reissian fundamentalist: it's just that he has provided me with one of my many filters with which to view the world.] Some of the respondents to Dan Vining's 1986 Behavioral and Brain Sciences target article, "Social versus Reproductive Success: The Central Theoretical Problem of Human Sociobiology" (9:167-216), did hint at the trade-offs among desires, but only indirectly, as none were economists. My own alleged expertise in the subject at least urges me to look at a trade-off function that may have changed not inconsiderably on the demand side: that for curiosity and objectivity caused by the Maureen Dowd factor may be hugely important for the West versus the Rest. But the biggest changes are in the supply of ways to satisfy the Reiss desires. It is the changes on the supply side that apparently outweigh the changes in demand, since the inverted demographic pyramid is common to rich countries and not just to the West. In any case, I hope I've managed to introduce some economic reasoning to more fully explain the inverted demographic pyramid. Enthusiastic eugenicists will have a terrific task ahead to change the demand and supply curves. One of Reiss' drives is Idealism (no. 7 on my list), but the sorts of questions he asked were heavy into redistribution. We know, or should know, that the enthusiasm for redistribution is hyped up by the huge influence of 20th-century leftists in the education business.
Issues were--and still are, there being a lot of momentum (a/k/a culture lag)--largely framed in these terms, much like debates in the Middle Ages were framed in Christian terms. Hauling in manufactured emotions will be easier than changing underlying biologies, at least until Designer Children come along.

------

Manhattan's Little Ones Come in Bigger Numbers
http://www.nytimes.com/2005/12/01/nyregion/01kids.html
By EDUARDO PORTER

The sidewalks crowded with strollers, the panoply of new clubs catering to the toddler set and the trail of cupcake crumbs that seems to crisscross Manhattan are proof: The children are back. After a decade of steady decline, the number of children under 5 in Manhattan increased more than 26 percent from 2000 to 2004, census estimates show, surpassing the 8 percent increase in small children citywide during the same period and vastly outstripping the slight overall growth in population in the borough and city. Even as soaring house prices have continued to lift the cost of raising a family beyond the means of many Americans, the borough's preschool population reached almost 97,000 last year, the most since the 1960's. This increase has perplexed social scientists, who have grown used to seeing Manhattan families disappear into Brooklyn and New Jersey, and it has pushed the borough into 11th place among New York State counties ranked by percentage of population under 5. In 2000, fewer than one in 20 Manhattan residents were under 5, putting the borough in 58th place.

"Potentially this is very good news for New York," said Kathleen Gerson, a professor of sociology at New York University. "It depends on whether this is a short-term blip or a long-term trend. We must understand what explains the rise." Indeed, nobody can say for sure what caused the baby boom, but several factors clearly played a part. The city's growing cohorts of immigrants may have contributed, as the number of children in Manhattan born to foreign-born parents has risen slightly since the 1990's. But other social scientists say that the number of births is growing at the other end of the income scale. "I wouldn't be surprised if it had to do with more rich families having babies and staying in Manhattan," said Andrew A. Beveridge, a professor of sociology at Queens College. According to census data, 16.4 percent of Manhattan families earned more than $200,000 last year, up from 13.7 percent in 2000. Kathryne Lyons, 40, a mother of two who left her job as a vice president of a commercial real estate firm when her second daughter was born three years ago, acknowledges that having children in the city is a tougher proposition if one cannot afford nannies, music lessons and other amenities, which, as the wife of an investment banker, she can. "It's much more difficult to be here and not be well-to-do."

Over the past few years, New York has become more family-friendly, clearly benefiting from the perception that the city's quality of life is improving. Test scores in public schools have improved, and according to F.B.I. statistics, New York is the nation's safest large city. Sociologists and city officials believe that these improvements in the quality of life in Manhattan may have stanched the suburban flight that occurred in the 1990's. And while Manhattan lacks big backyards for children to play in, it offers a packed selection of services, which can be especially useful for working mothers.
In fact, the baby boomlet also may pose challenges to a borough that in many ways struggles to serve its young. According to Childcare Inc., day care centers in the city have enough slots for only one in five babies under age 3 who need them. And while census figures show that children over 5 have continued to decline as a percentage of the Manhattan population, if the children under 5 stay, they could well put extra stress on the city's public and private school systems, already strained beyond capacity in some neighborhoods. Private preschools and kindergartens "are already more difficult to get into than college," said Amanda Uhry, who owns Manhattan Private School Advisors.

So who are these children? Robert Smith, a sociologist at Baruch College who is an expert on the growing Mexican immigration to the city, argued that the children of Mexican immigrants - many of whom live in the El Barrio neighborhood in East Harlem - are a big part of the story. But this is unlikely to account for all of the increase. For example, in 2003, fewer than 1,000 babies were born to Mexican mothers living in Manhattan. And births to Dominicans, the largest immigrant group in the city, have fallen sharply. Some scholars suspect that a substantial part of Manhattan's surge is being driven by homegrown forces: namely, the decision by professionals to raise their families here.

Consider the case of Tim and Lucinda Karter. Despite the cost of having a family in the city, Ms. Karter, a 38-year-old literary agent, and her husband, an editor at a publishing house, stayed in Manhattan to have their two daughters, Eleanor and Sarah. They had Eleanor seven and a half years ago while living in a one-bedroom apartment near Gracie Mansion on the Upper East Side. Then they bought the apartment next door and completed an expansion of their home into a four-bedroom apartment two years ago. A little less than a year ago, they had Sarah. "Manhattan is a fabulous, stimulating place to raise a child," Ms. Karter said. "We didn't plan it but we just delayed the situation. We were just carving away and then there was room."

The city's businesses and institutions are responding to the rising toddler population. Three years ago, the Metropolitan Museum of Art began a family initiative including programs geared to children 3 and older. The Museum of Modern Art has programs for those as young as 4. In January, Andy Stenzler and a group of partners opened Kidville, a 16,000-square-foot smorgasbord of activities for children under 5 - and their parents - on the Upper East Side. "We were looking for a concentration of young people," Mr. Stenzler said. "There are 16,000 kids under 5 between 60th and 96th Streets." Many of the new offerings reflect the wealth of the parents who have decided to call Manhattan home. Citibabes, which opened in TriBeCa last month, provides everything from a gym and workplaces with Internet connections for parents, to science lessons, language classes and baby yoga for their children. It charges families $6,250 for unlimited membership for three months. Manhattan preschools can charge $23,000 a year. Ms. Uhry, with Private School Advisors, charges parents $6,000 a year just to coach them through the application process to get their children in. Yet in spite of the high costs, small spaces and infuriating extras that seem unique to Manhattan - like the preschools that require an I.Q. test - many parents would never live anywhere else.
"Manhattan has always been a great place for raising your children," said Lori Robinson, the president of the New Mommies Network, a networking project for mothers on the Upper West Side. "It's easier to be in the city with a baby. It's less isolation. You feel you are part of society." ------------- Meme 023: Steven Reiss' 16 Basic Desires 3.9.21 Here's the results of research into the basic human desires. I've ordered them by what I think is my own hierarchy and invite you to do the same for yourself and for historical personages, like Ayn Rand. This list is not only important in its own right but has great implications for one's political concerns. Curiosity being my highest desire, I am an advocate of what I call the "information state," whereby the major function of the central government is the production of information and reserach. (Currently, it occpies at most two percent of U.S. federal spending.) And since independence is no. 3 for me, I am close to being a libertarian, in the sense that I'd vote with Ron Paul on most issues. But someone for whom independence is his most basic desire, he'd be advocating a full liberatarian order and impose it on states and counties. On the other hand, an idealist could advocate massive redistribution programs from rich to poor and military intervention in foreign countries that do not live up to his standards. I simply care much less than he does about such matters. The task of designing a state, or a world federal order, that reflects the diversity of desires and not just "this is what I want the world to be" continues. STEVEN REISS' 16 BASIC DESIRES Curiosity. The desire to explore and learn. End: knowledge, truth. Romance. The desire for love and sex. Includes a desire for aesthetic experiences. End: beauty, sex. Independence. The desire for self-reliance. End: freedom, ego integrity. Saving. Includes the desire to collect things as well as to accumulate wealth. End: collection, property. Order. The desire for organization and for a predictable environment. End: cleanliness, stability, organization. Family. The desire to raise one's own children. Does not include the desire to raise other people's children. End: children. Idealism. The desire to improve society. Includes a desire for social justice. End: fairness, justice. Exercise. The desire to move one's muscles. End: fitness. Acceptance. The desire for inclusion. Includes reaction to criticism and rejection. End: positive self-image, self-worth. Social Contact. The desire for companionship. Includes the desire for fun and pleasure. End: friendship, fun. Honor. The desire to be loyal to one's parents and heritage. End: morality, character, loyalty. Power. The desire for influence including mastery, leadership, and dominance. End: achievement, competence, mastery. Vengeance. The desire to get even. Includes the joy of competition. End: winning, aggression. Status. The desire for social standing. Includes a desire for attention. End: wealth, titles, attention, awards. Tranquility. The desire for emotional calm, to be free of anxiety, fear, and pain. End: relaxation, safety. Eating. The desire to consume food. End: food, dining, hunting. Source: Steven Reiss, _Who am I?: the 16 basic desires that motivate our actions and define our personalities. NY: Penguin Putnam: Jeremy P. Tarcher/Putman, 2000. I have changed his exact wordings in a few places, based upon the fuller descriptions in his book and upon his other writings. The ends given in the table are taken directly from page 31. 
The desires are directed to the psychological (not immediately biological) ends of actions, not to actions as means toward other actions. He has determined the basic ends by the use of factor analysis, a technique pioneered by Raymond B. Cattell. Spirituality, for example, he finds is distributed over the other desires and is not an end statistically independent of the other ends. And he finds that the desire for aesthetic experiences is so closely correlated with romance that he subsumes it thereunder. Reiss' list is in no particular order, and so, after much reflection, not only upon my thinking but upon my actual behavior, I have ranked the desires by what I think is my own hierarchy. A few remarks, directed to those who know me, are in order:

Saving: Not much good at keeping within my budget, I have a relatively big pension coming and have a large collection of recordings of classical music and books.
Order: While my office and home are a mess, I have written a number of extremely well-organized discographies.
Family: Not always an attentive father, I have kept at a job I've not always liked, instead of starting over again as an assistant professor.
Idealism: I took the description from an earlier article by Reiss, so as not to restrict it to state redistribution of income.
Exercise: I am well-known for my running and having entered (legally) the Boston Marathon, but I usually just set myself to a daily routine and don't go canoeing, for example, when on vacations. In high school, I was notorious for avoiding exercise.
Acceptance: I can be rather sensitive to being ignored, though I don't do much about it in fact.
Social Contact: Fun, for me, is intellectual discussion, often with playful allusions on words and ideas.
Honor: I'm very low on patriotism, but I do like to think of myself as having good character.
Vengeance: I've been told I love to win arguments for their own sake, but I have only a small desire ever to get even and never act upon it.

[I am sending forth these memes, not because I agree wholeheartedly with all of them, but to impregnate females of both sexes. Ponder them and spread them.]

From checker at panix.com Sun Dec 4 01:19:15 2005
From: checker at panix.com (Premise Checker)
Date: Sat, 3 Dec 2005 20:19:15 -0500 (EST)
Subject: [Paleopsych] SF Area Independent Media Center: Government Accounting Office takes a big bite out of the Bush clique's pretense of legitimacy
Message-ID:

Government Accounting Office takes a big bite out of the Bush clique's pretense of legitimacy
http://www.indybay.org/print.php?id=1783843
San Francisco Bay Area Independent Media Center

[This, and the related articles I have appended, are about how the Republicans stole the 2004 election by manipulating electronic voting machines, esp. in Ohio. I don't have the full URL for the GAO report handy, but it's easy to get from http://www.gao.gov. It deals with the potential for fraud but does not claim actual fraud. The other articles argue for actual fraud.

[Now, elections do get stolen. I'm convinced that Lyndon Johnson and Mayor Daley got Kennedy elected. Nixon thought so too, but he decided, for the good of the country, not to contest it. And it's not unlikely that Tilden's victory was taken from him by fraud. I don't think fraud was responsible for Bush's victory in Florida in 2000, since the margin was well below the difference fraudulent (illegal) voters made between Algore and Bush (thousands of them, with a 2:1 tilt toward Algore).
I'm not so sure about Ohio, if what the articles below say is plausible.

[Algore's presidency would have made a difference: he's been around Washington long enough to know when pressure is being brought to bear on him. Rather than listen to neocon arguments for war in Iraq, he would have tuned them out. We're less than a year into Bush's second term. Kerry would have (or rather will have) bowed to political pressure to pull out of Iraq, and the media would turn away from whatever disaster happens as a result of the pullout.

[What's worth thinking about is why the media are ignoring the GAO report. (And also why the isolationist right is ignoring it. Nothing on LewRockwell.com. Maybe they don't much care and certainly think Kerry would have been at most a slightly lesser evil than Bush, while the left at least hopes that Kerry would have been a significantly lesser evil.) The reason is that the mainstream media is, above all else, Establishment. Most of them are Democrats, it is true, and some of them make noises about how awful Bush is. (They say Reagan "ended" welfare, you know, whereas there was at best a decrease in the rate of increase.) But they believe mostly in the System. They most definitely do not want there to be a widespread belief that something as illegitimate as stealing a Presidential election can happen north of the Rio Grande.

[Nor a President assassinated by any other than a lone nut. During the 40th anniversary of the JFK assassination, talking heads were unanimous in upholding the lone-nut hypothesis, even though only 20% of the citizens do.

[When you think about elites, ask what beliefs are mandated of their members. Some must be publicly affirmed. Others may be only privately doubted. In what sense is Gary Bauer, the evangelist who gets on lots of talk shows, a member of the elite? Would it hurt him to come out against the lone-nut theory of the JFK assassination? Is it best for a respectable dissident to have only a small number of disagreements? Can Bill Gates come out against modern art? Can anyone besides Tom Wolfe do so?

[How big is the elite? How many different elites are there? They interlock, like corporate board members serving on art museum boards. Ponder these questions as you read what seems to me a plausible case of massive voter fraud yet a studious ignoring by the mainstream media.]

by Joe Baker
Wednesday, Nov. 16, 2005 at 12:03 PM

How much will it take before people rise up and force the Repugs in Congress to impeach this dangerous thief? Secret CIA torture gulags, now secret Shiite torture chambers in Iraq. At least two stolen US elections. It boggles the mind.

GAO report upholds Ohio vote fraud claims
By Joe Baker, Senior Editor

As if the indictment of Lewis "Scooter" Libby wasn't enough to give the White House some heavy concerns, a report from the Government Accounting Office takes a big bite out of the Bush clique's pretense of legitimacy. This powerful and probing report takes a hard look at the election of 2004 and supports the contention that the election was stolen. The report has received almost no coverage in the national media. The GAO is the government's lead investigative agency, and is known for rock-solid integrity and its penetrating and thorough analysis. The agency's agreement with what have been brushed aside as conspiracy theories adds even more weight to the conclusion that the Bush regime has no business in the White House whatever. Almost a year ago, Rep.
John Conyers, senior Democrat on the House Judiciary Committee, asked the GAO to investigate the use of electronic voting machines in the Nov. 2, 2004, presidential election. That request was made as a flood of protests from Ohio and elsewhere deluged Washington with claims that shocking irregularities were common in that vote and were linked to the machines. CNN said the Judiciary Committee got more than 57,000 complaints after Bush's claimed re-election. Many were made under oath in a series of statements and affidavits in public hearings and investigations carried out in Ohio by the Free Press and other groups seeking to maintain transparent elections. OnlineJournal.com reported that the GAO report stated that "some of [the] concerns about electronic voting machines have been realized and have caused problems with recent elections, resulting in the loss and miscount of votes." This is the only democratic nation that permits private partisan companies to count and tabulate the vote in secret, using privately-held software. The public is excluded from the process. Rev. Jesse Jackson and others have declared that public elections must not be conducted on privately-owned machines. The makers of nearly all electronic voting machines are owned by conservative Republicans. The chief executive of Diebold, one of the major suppliers of electronic voting machines, Walden "Wally" O'Dell, went on record in the 2004 campaign vowing to deliver Ohio and the presidency to George W. Bush. In Ohio, Bush won by only 118,775 votes out of more than 5.6 million cast. Honest election advocates contend that O'Dell's statement to hand Ohio's vote to Bush still stands as a clear indictment of an apparently successful effort to steal the White House.

Some of the GAO's findings are:

1. Some electronic voting machines did not encrypt cast ballots or system audit logs, and it was possible to alter both without being detected. In short, the machines provided a way to manipulate the outcome of the election. In Ohio, more than 800,000 votes were cast on electronic voting machines, a figure some seven times Bush's official margin of victory.

2. The report further stated that "it was possible to alter the files that define how a ballot looks and works, so that the votes for one candidate could be recorded for a different candidate." Many sworn statements and affidavits claim that this did happen in Ohio in 2004.

Next, the report says, "Vendors installed uncertified versions of voting system software at the local level." The GAO found that falsifying election results by using altered memory cards, without leaving evidence of doing so, could easily be done. The GAO additionally found that access to the voting network was very easy to compromise because not all electronic voting systems had supervisory functions protected by password. That meant access to one machine gave access to the whole network. That critical finding showed that rigging the election did not take a widespread conspiracy but simply the cooperation of a small number of operators with the power to tap into the networked machines. They could thus alter the vote totals at will. It therefore was no big task for a single programmer to flip vote numbers to give Bush the 118,775 votes. Another factor in the Ohio election was that access to the voting network was also compromised by repeated use of the same user ID, coupled with easy-to-guess passwords. Even amateur hackers could have gotten into the network and changed the vote.
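[A minimal sketch in Python may make finding 2 concrete. Everything here is hypothetical: the two-table layout, the candidate names, and the function are my illustration of the kind of ballot-definition file the GAO describes, not any vendor's actual format.

    # Toy ballot definition: one table controls what the touch screen
    # shows the voter, another controls which running total is credited.
    # Normally the two tables agree.
    display     = {1: "Kerry", 2: "Bush"}
    recorded_as = {1: "Kerry", 2: "Bush"}
    totals      = {"Kerry": 0, "Bush": 0}

    def cast(button):
        # The voter sees display[button]; the machine silently credits
        # recorded_as[button] instead.
        totals[recorded_as[button]] += 1
        return "You voted for " + display[button]

    recorded_as[1] = "Bush"   # one edited entry in the definition file

    for _ in range(100):      # a hundred voters touch the Kerry button
        cast(1)               # each is told "You voted for Kerry"
    print(totals)             # {'Kerry': 0, 'Bush': 100}

The point of the sketch is that the screen and the tally are driven by separate data, so a one-line edit diverts votes while every voter walks away reassured, and, per the GAO, without any trace in the audit log.]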
System locks were easily picked, and keys were easy to copy, so gaining access to the system was a snap. One digital machine model was shown to have been networked in such a rudimentary manner that if one machine experienced a power failure, the entire network would go down. That is too fragile a system to decide the presidency of the United States. Problems obviously exist with security protocols and screening methods for vendor personnel. The GAO study clearly shows that no responsible business would operate with a computer system as flimsy, fragile and easily manipulated as the one used in the 2004 election. These findings are even more damning when we understand the election in Ohio was run by a secretary of state who also was co-chairman of Bush's Ohio campaign. Far from supporting the conclusions of fraud skeptics, the GAO's findings confirm that the network, which handled 800,000 Ohio votes, was vulnerable enough to permit a handful of purposeful operatives to turn the entire election by means of personal computers using comparatively simple software.

One Ohio campaign operative, Tom Noe, a coin dealer, was indicted Oct. 27 for illegally funneling $45,400 to Bush by writing checks to others, who then wrote checks to Bush's re-election campaign, allegedly dodging the $2,000 limit on contributions by an individual. "It's one of the most blatant and excessive finance schemes we have encountered," said Noel Hillman, section chief of the U.S. Department of Justice's public integrity section, as quoted in the Kansas City Star. In the 2000 election, Florida was the key; in the 2004 election, Ohio was the key.

From the Nov. 2-8, 2005, issue

_________________________________________________________________

The Free Press -- Independent News Media - Election Issues
http://www.freepress.org/departments/display/19/2005/1556
The Free Press: Speaking Truth to Power
Thu Dec 01 2005

What John Kerry definitely said about 2004's stolen election and why it's killing American democracy
by Bob Fitrakis & Harvey Wasserman
November 10, 2005

The net is abuzz about what John Kerry may or may not be saying now about the stolen election of 2004. But we can definitively report what he has said about New Mexico and electronic voting machines soon after his abrupt "abandon ship" with 250,000 Ohio votes still uncounted. And we must also report that what he's not saying is having a catastrophic effect on what's left of American democracy, including what has just happened (again) in Ohio 2005. In recent days Mark Crispin Miller has reported that he heard from Kerry personally that Kerry believes the election was stolen. The dialog has been widely reported on the internet. Kerry has since seemed to deny it. We have every reason to believe Miller. His recent book, FOOLED AGAIN, has been making headlines along with our own HOW THE GOP STOLE AMERICA'S 2004 ELECTION & IS RIGGING 2008. As in his campaign for president, Kerry has been ambivalent and inconsistent about Ohio's stolen vote count. Soon after the presidential election, Kerry was involved in a conference call with Rev. Jesse Jackson and a number of attorneys, including co-author Bob Fitrakis. In the course of the conversation, Kerry said, "You know, wherever they used those [e-voting] machines, I lost, regardless if the precinct was Democratic or Republican." Kerry was referring to New Mexico. But he might just as well have been talking about Ohio, where the election was decided, as well as about Iowa and Nevada.
All four of those "purple" states switched from Democratic "blue" in the exit polls as late as 12:20 a.m. to Republican "red" a few hours later, giving Bush the White House. A scant few hours after that, Kerry left tens of thousands of volunteers and millions of voters hanging. With Bush apparently leading by some 130,000 votes in Ohio, but with a quarter-million votes still uncounted here, Kerry abruptly conceded. He was then heard from primarily through attorneys from Republican law firms attacking grassroots election protection activists who dared question the Ohio outcome. In the year since that abrupt surrender, Teresa Heinz Kerry has made insinuations that she thought the election might have been stolen. But there has been no follow-up. Now we have this report from M. C. Miller that Kerry said he knew the election was stolen, and then denied saying it. Coming from Kerry, the inconsistency would be entirely consistent. But those committed to democracy and horrified by the on-going carnage of the Bush catastrophe still have no credible explanation as to why Kerry abandoned ship so abruptly. He had raised many millions specifically dedicated to "counting every vote," which clearly never happened in Ohio. More than a year after the election, more than 100,000 votes are STILL uncounted in the Buckeye state.

And now, tragically, we have had another set of stolen elections. Four statewide referenda aimed at reforming Ohio's electoral process have been defeated in a manner that is (again) totally inconsistent with polling data. One statewide referendum, aimed at handing the corrupt Taft Administration a $2 billion windfall, has allegedly passed, again in a manner totally inconsistent with polling data, or even a rudimentary assessment of Ohio politics. We will write more about this tomorrow. But suffice it to say these latest "official" vote counts make sense only in the context of a powerful recent report issued by the Government Accounting Office confirming that electronic voting machines like those used in Ohio can be easily hacked by a very few players to deliver a vote count totally at odds with the will of the electorate. We have seen it in the presidential elections of 2000 and 2004, in at least three Senatorial races in 2002, and now in the referenda in Ohio 2005, and possibly elsewhere.

How could this have happened? By and large, the nation is in denial, including much of the left. Miller recently debated Mark Hertsgaard over a Mother Jones review of both our books. The idea that the 2004 election could have been stolen has also been attacked by others on the left. Some reporters have briefly visited here or made calls from the coasts and then taken as gospel anything that mainstream Democratic regulars utter, even if it's totally implausible and counter-factual. For example, they would have you believe that, in direct contradiction to how elections have gone in Ohio for decades, it's now routine for boards of elections to record that 100% of the precincts are reporting, and then suddenly add 18,615 more votes at 1:43 a.m. after the polls have been closed since 7:30 p.m. and 100% of the precincts had been reporting since approximately 9 p.m. Or that 18,615 Miami County votes could come in late with an impossibly consistent 33.92% for Kerry, as if somebody had pushed a button on a computer with a pre-set percentage---just as the GAO says it can be done.
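[The arithmetic behind "impossibly consistent" deserves a gloss. Below is a toy Python tabulator with a hard-coded split; the 33.92% figure comes from the paragraph above, while the function and the extra batch sizes are invented for illustration and have nothing to do with any real tabulator software.

    # Toy central tabulator that allocates any late batch of ballots
    # by a pre-set fraction instead of counting them.
    KERRY_SHARE = 0.3392

    def add_late_batch(total_votes):
        kerry = round(total_votes * KERRY_SHARE)
        return kerry, total_votes - kerry

    for batch in (18615, 5000, 12345):
        kerry, bush = add_late_batch(batch)
        print(batch, round(kerry / batch, 4))   # ~0.3392 every time

Real precincts fluctuate from batch to batch; a late dump of 18,615 ballots landing on one fixed percentage to the second decimal place is the statistical signature the authors are pointing to.]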
Or that it's OK for a Democratic county election official, with a lucrative contract from the Republican-controlled Board of Elections (BOE), to admit he doesn't really know whether the vote count had been doctored. Or it's fine for BOE officials to take election data home to report on from their personal PCs. Or for central tabulators to run on corporate-owned proprietary software with no public access. Or for BOE officials to hold up vote counts late into the night that time and again miraculously provide sufficient margins for GOP victories, as with Paul Hackett's recent failed Congressional race in southwestern Ohio. Or for one precinct to claim a 97.55% turnout when a Free Press/Pacifica canvass quickly found far too many people who didn't vote to make that possible.

There is clearly no end to this story, and there is no indication the dialog on the net will diminish, even though the mainstream media---like the mainstream Democratic Party---absolutely refuses to touch this issue. But ultimately, whatever John Kerry or the bloviators or even the left press say about these stolen elections, America is very close to crossing the line that permanently defines the loss of our democracy. As we will show tomorrow, this week's theft of five referendum issues in Ohio is yet another tragic by-product of the unwillingness of John Kerry and so many others to stand up for a fair and reliable electoral process in this country.

-- Bob Fitrakis and Harvey Wasserman are co-authors of HOW THE GOP STOLE AMERICA'S 2004 ELECTION & IS RIGGING 2008, available at www.freepress.org, and, with Steve Rosenfeld, of WHAT HAPPENED IN OHIO, to be published this spring by The New Press.

A Discussion with Mark Crispin Miller - Democratic Underground
http://www.democraticunderground.com/articles/05/11/05_mcm.html
November 5, 2005

On November 3-4, 2005, Mark Crispin Miller, author of Fooled Again, took part in an online discussion at Democratic Underground, answering questions from members of our message board. This is a lightly edited transcript of that discussion. The original discussion thread can be found here. Mark might even return to continue the discussion.

Skinner: Today we are very excited to host an online discussion with Mark Crispin Miller. Mark is a professor of media ecology at New York University. Some of you may remember him from our online discussion on Democratic Underground in May of 2002. He is well known for his writings on all aspects of the media and for his activism on behalf of democratic media reform. He has written a number of books, including Boxed In: The Culture of TV, The Bush Dyslexicon: Observations on a National Disorder, and Cruel and Unusual: Bush/Cheney's New World Order. He writes regularly on his blog, News From Underground. Mark has a brand new book about the 2004 election, Fooled Again: How the Right Stole the 2004 Election, and Why They'll Steal the Next One Too (Unless We Stop Them). This discussion is going to be pretty informal. Mark has some book signings and other events today, so he might be checking in a few different times throughout the day, and he is not going to be able to answer every question that is posted here. He will pick the questions that he considers most relevant and answer those. All DU members are welcome to participate. If you have a question or topic that you would like to discuss with Mark, just click "Reply" on this message to post it. Mark, thank you so much for being with us.
The first question is an easy one. Please tell us about your book.

Mark Crispin Miller: Hi, everyone. It's a pleasure to be here. Many warm thanks, Stephanie, for making this happen. Why I wrote Fooled Again: The scandal of last year's election never resonated as it should have done, because the national Democrats AND "the liberal media" refused to face, or even to discuss, the facts. We very badly need electoral reform, but we won't get it if that mammoth scandal doesn't finally resonate. My aim in writing Fooled Again was to lay out the evidence that Bush & Co. stole their so-called "mandate," so that the scandal might at last resound, so that we'll all be motivated to repair the system. If we don't, it seems to me, we're really cooked. Let me add that I myself am not a Democrat but a proud independent. This is not a partisan endeavor but a crucial civic issue. There's evidence that many a Republican did NOT vote for Bush/Cheney in 2004. Those folks too were disenfranchised, along with countless voters on the other side. I await your questions/comments.

Bruce McAuley: Hi Mark! Given what has happened so far with the Bush administration, what do you foresee for the future? Will the neo-conservatives have a final triumph, or will liberalism make a resurgence? Or neither of the above? Best guesses, please, and thanks for participating!

Mark Crispin Miller: Bruce, liberalism will make a resurgence; or, rather, it is resurgent already, although many liberals out there don't know they're liberals. It's an odd situation. The word itself is now pejorative, thanks to the far-right propaganda drive that's overwhelmed our politics and culture for the last few decades. So folks are often quick to say that they're not liberals, but their politics, on nearly every score, ARE, by any definition, liberal: economically, environmentally, on foreign policy, on healthcare, abortion rights, you name it. Because the word has been so badly tarnished, I'd prefer to say that our Enlightenment ideals will re-assert themselves. I deem myself a follower of Jefferson and Paine. The world-view of those framers will prevail, if we promote it and defend it just as zealously as BushCo has attacked it.

underpants: How susceptible to the "first story out there" is MSM? I haven't read your work so excuse me if this has been covered. It appears that the news media in this country all follow the very first wire report written on an event or an issue. Is it really that cut and dried? And does the right really have packaged, ready-to-go versions of what I just saw (it would appear that they do)?

Mark Crispin Miller: The right has the propaganda thing down cold. The MSM, moreover, will certainly not follow any story that it's disinclined to follow, however hot it may appear. Every day amazing pieces come from the Associated Press, with no follow-up whatsoever. AP did a good story on the GAO report on electronic touchscreen voting machines. There was no follow-up at all.

mzmolly: What is the first thing we CAN/should do to secure our voting system? Just want some tangible ideas for Democrats and other concerned citizens.

Mark Crispin Miller: The first thing to do is to campaign relentlessly, in every way at hand, to get the scandal of last year's election on the national radar screen. Unless we do that, all our policy suggestions will mean nothing.
As we do that, though, we should also be resisting the proliferation of touch-screen voting machines sold by private vendors (Diebold, ES&S, Sequoia) and agitating on behalf of paper ballots, unless and until we learn about a tamper-proof computer-based system (if such is possible). That would be a local matter, by and large. We also should be working very hard to get the Voting Rights Act renewed completely. (The Busheviks want to remove certain provisions from it, so that it can then be junked by the Scalito Court.) And we must support Rep. Jesse Jackson Jr.'s call for a constitutional amendment formally confirming every adult American's right to vote, and establishing a uniform federal voting system. We should also enable same-day registration, extend the voting period to, say, a week, advocate for Instant Run-off Voting (IRV), and do whatever else it takes to make the system truly democratic.

sfexpat2000: I'd like to ask Mark, was there a moment, or an event, that you can identify as the one that spurred you to write on this topic? Thanks.

Mark Crispin Miller: That moment was Election Day, and the huge screaming gap between the propaganda ("It's all gone really well!") and what was really happening coast to coast.

IndyOp: Praise: Loved the Harper's article! Thank you! My Question: Do we have the votes? Are you convinced that all of the fraudulent actions stole the election from Kerry? Do you estimate numbers of votes stolen in your book? Also, MCM - Stephanie said that you wanted the link for the Petition to keep Marc Maron on Morning Sedition: Petition to Keep Marc Maron on Morning Sedition. More: Contact Information for Danny Goldberg at AAR - A call from Mark Crispin Miller might get Goldberg's attention.

Mark Crispin Miller: It's very hard to come up with precise figures. That's the problem. But consider, for example, that the Census Bureau came out in late May with an astounding revelation. According to their survey, 3.4 million more Americans claimed to have cast ballots in 2004 than the official toll of those who voted. So maybe some of them were lying. OK, let's say half were lying. That still leaves some 1.7 million votes that somehow never got recorded. And that number does not include those (countless) voters who knew very well that they could not vote, or even register. And neither of those sums includes those US citizens abroad who tried and failed to vote. (The last chapter of Fooled Again is all about Bush/Cheney's interference with the expatriate vote, which includes up to 7 million ballots.) Put it all together, and what does it spell? "IT CAN HAPPEN HERE, AND DID." And EVERYONE out there, PLEASE contact Air America, and urge the board NOT to allow the cancellation of Marc Maron's show!!!

wrathofkahn: Now I'm confused... "And that number does not include those (countless) voters who knew very well that they could not vote, or even register." Umm... If they knew that they could not vote or register (vs. simply choosing not to do so), then I must assume that they were ineligible to vote. How is it that someone who couldn't have voted anyway could have affected the outcome of the election (other than campaigning, etc.)?

Mark Crispin Miller: Those who tried to register and/or vote and couldn't. Not because they were ineligible. They were eligible, and yet could not register or vote.

Arkana: Mr. Miller, I read "The Bush Dyslexicon" and loved it, BTW.
I wanted to ask you: What is your proposal to deal with companies such as Diebold, ES&S, Triad, and others that "hack the vote"?

Mark Crispin Miller: All private vendors should be outlawed.

nashville_brook: Can you please speak to the importance of EXIT POLLS in our case for ELECTION FRAUD? What's the appropriate weight to give exit poll discrepancies in the on-going debate? And (follow-up) Do you have a response to EXIT POLL DISCREPANCY deniers: those who claim we either don't have all the information yet, or that we don't understand the numbers? Thank you -- and I just have to say... everyone we've loaned your Patriot Act video to has compared you to the late Spalding Gray. We look forward to more monologues.

Mark Crispin Miller: The exit poll/"official" count discrepancies are certainly significant, although the issue is extremely complicated. Let me recommend the writings of Steve Freeman at the University of Pennsylvania. He has a book, co-written with Joel Bleifuss, coming out from Seven Stories in a month or so. A must-read. Steve is expert on the subject of those polls. He's debated Warren Mitofsky, who came off the worse for it.

Just Me: Numerous states have enacted "paper trail" laws. Will such laws be sufficient to protect our votes? If not, what other actions do you suggest should be taken?

Mark Crispin Miller: Paper trails per se are not enough. Certainly it's better to have paper trails than none, but the mere existence of such disparate slips of paper is no panacea. I think that there should be a paper BALLOT, so that the ballots can be stored indefinitely and counted or recounted as required. The TruVote machine looks like a very good idea. (That's the company whose CEO was evidently Silkwooded last year.)

ignatzmouse: Many Threads Into an Unmistakable Case: I'm sorry that I can't stay long, but I wanted to at least give a high recommendation to "Fooled Again." As with all of Mark's books, it is exhaustively researched, insightful, and has teeth. Mark does cite a couple of my studies including the "Unofficial Audit of the NC Election" that initially appeared here at DU. I'm deeply honored to be included, but it makes it even better for I have read and respected Mark for years. To paraphrase Van Morrison (the way I like to hear it), "If you pull your punches, you don't push the river." Mark pushes the river. "Cruel and Unusual" is the benchmark for me in getting at who these people are and what partially concealed agenda they seek. It's an important book. Likewise, "Fooled Again" pulls together the many-pronged RNC attack on the election process and exposes it in a way that is hard to marginalize. That is critically important because the culprits utilize marginalization of facts to elude media focus and cover their trail. They'll say... "but there were reported electronic discrepancies that favored Kerry too. See, it all amounts to much ado about a few electronic glitches." But, in fact, if you look at the EIRS data, the electronic vote switching favors Bush by a ridiculously large percentage. It is also interesting to see how often these reports are centered in minority districts. To have someone of Mark Crispin Miller's credentials who is not fooled by the marginalizations and does not carry the comfortable disdain for populism that seems embedded in most of the national media is necessary and validating if the story of what happened in the 2004 election is to reach out and enable reform in the future.
Absentees and aborting votes: I've read the accounts of the missing absentee ballots in Florida (which you also nicely document) and have likewise noted in several states the unlawful collection of absentee ballots by mysterious persons and groups. In Georgia, we've just had the Republican legislature attempt to restrict minority voting by creating a voter ID requirement where no documented fraud has occurred. Interestingly, however, the voter ID restrictions would not apply to absentee ballots. To me, that's a tip that one method of rigging is to either create phantom absentee voters or revote for "captured" absentee ballots. Something very fishy is going on with absentees. I noted this particularly in Nevada where they have verified voting. Absentee fraud could be a way to circumvent all other measures of safeguarding the vote. Did you get a sense of rank in the types and methods of vote fraud -- electronic, vote-switching DREs, absentee, various types of disenfranchisement, etc.?

And finally, at the old Kerry-Edwards forum on election day, one of the regulars posted an odd firsthand account that I have not seen since. While on the phone to Blackwell's Secretary of State headquarters, she was put on hold and could hear a phone bank of numerous people in the SoS's office making phone calls to voters stating that they were calling from Planned Parenthood and asking that they vote for John Kerry in order to keep abortion legal. My take was that they were calling identified Catholic voters in order to anger them to the polls to vote for Bush. That sort of illegal and underhanded tactic is Rovian by nature (or Mehlman-esque as the case may be) and I would guess prosecution-worthy. The old Kerry-Edwards forum is long gone, and I have no way of researching it further. Have you heard of similar Ohio accounts, or is that state so awash in corruption that it almost gets lost in the mix?

Mark Crispin Miller: I salute you, ignatzmouse. A thousand thanks for your kind words. I think your work is indispensable, and was delighted to be able to include it in my book, which seems all the stronger for your research. I had not heard anything about that phone bank. If you find it, could you send it to me?

cry baby: Thank you for coming online with us! Do you think that the states will actually entertain the idea of replacing the voting machines that they just purchased to be in compliance with HAVA? Can those machines be retrofitted with a "proof of vote" certificate and would that keep our elections from being stolen? How likely do you believe it is that states will actually go to a voter-verified paper ballot (which is what I'd like to see)?

Mark Crispin Miller: The states will do what their residents demand they do, if the demand is long and loud enough. HAVA, furthermore, should be repealed ASAP.

SteppingRazor: I haven't read the book -- yet -- but from what I understand... you rely fairly heavily on anecdotal evidence, which -- while certain to stir the proper response -- doesn't carry much weight in scientific or (more importantly) legal analysis. I ran into the same problem while looking into the 2000 election here in Florida -- plenty of people willing to talk, but little direct evidence of willful manipulation. My question is, do you believe that, if given to a prosecutor with subpoena power, real evidence not relying on circumstance or anecdote could be found, such that either this administration and/or the leaders of the Republican Party could be held criminally liable?
If so, what would it mean for both parties in the long term (the short term conclusions being fairly obvious)? In other words, if taken into the ostensibly objective realm of the courtroom, could this dog hunt?

Mark Crispin Miller: I have far more than anecdotal evidence, which, as you note, works better in a narrative composed for broad consumption than it would in court. If you'd like a good example of non-anecdotal evidence, please let me recommend the section of the book that deals with Sproul and Associates. There is solid evidence of fraud committed by the GOP, and also evidence of a bald effort by the party to conceal all trace of that wrongdoing.

Fly by night: A few more questions. But first, thanks kindly for all you do. I would like to know your impressions of how your piece has been received, both among other journalists and among the general public. Any feel for the impact on sales of Harper's at the newsstand, LTTEs or hits on Harper's and your web-sites? (I'm trying to gauge the legs of this story.) What evidence from other states besides Ohio (or the behavior of the Rethugs in Congress and elsewhere) during and after the election confirms your suspicions that the election was stolen? What are your reactions to the recent piece o' shit article in Mother Jones or the older piece o' shit article in TomPaine which dismissed the election fraud evidence? Why do you believe there is still such resistance (even among progressives) to acknowledging that our elections are being stolen these days? Any responses to any of these questions would be appreciated. Thanks again from Tennessee. We're not a red state or a blue state -- we're an Orange State.

Mark Crispin Miller: That issue of Harper's broke a lot of records for newsstand sales. It sold more than any prior issue since the one that published Norman Mailer's Prisoner of Sex in 1972, and may well have outsold that one too. (We don't know yet.) In any case, the response was exhilarating. The evidence of nationwide vote theft is vast. It's in the book. (In large part, it IS the book.) I was disappointed in Mark Hertsgaard's piece, especially as he's a friend of mine, and generally a very good reporter. He really blew it there. For one thing, my book is not based largely on the Conyers Report: a characterization that implies that my focus is Ohio. In fact, I devote only ten pages to the Conyers Report, and another five to scandals in Ohio NOT discussed by Conyers et al. The book is nearly 300 pages long, with copious evidence from many states. And more generally, Mark's piece badly distorted not just my book but the Conyers Report (WHAT WENT WRONG IN OHIO?) and the excellent compilation of documents put together by Bob Fitrakis and Harvey Wasserman (DID GEORGE W. BUSH STEAL THE ELECTION IN 2004?). The evidence speaks for itself. I wish that Mark had worked a little harder on that piece. The resistance is based partly on corruption, in some cases, and careerism, and very largely on denial. The implications of the theft last year are very grave. Better to deny them categorically. The whole red state/blue state dichotomy is pure crapola.

ms liberty: Hi Mark! Thanks for chatting with us... The GAO report on miselection '04 came out last week to virtual silence from the mainstream media, but BushCo is (finally) getting a more critical look from them, thanks to Mr. Fitzgerald. Isn't this the perfect time to push this issue, with this corrupt regime already vulnerable? How can we get this issue more attention from the MSM?
Are you going to be on The Daily Show, or do you have any MSM interviews scheduled? What I would really enjoy is to see you on Washington Journal! Loved the Dyslexicon, and Cruel and Unusual. I'm looking forward to reading your new one! Anything you can do to help us save Marc Maron is REALLY appreciated!

Mark Crispin Miller: These questions are terrific; I wish I had the time to answer all of them in detail! The GAO report is an important document. The press's silence on it is appalling, and, I'm afraid, revealing. The Daily Show said they would have me on if a relevant "big story" should break sometime soon. I'm not sure what that means. The GAO report is such a story, except that, as you noticed, it was not a story. So what would such a story be, I wonder? Anyway, I'd love to be invited on. (Feel free to pester them on my behalf!) I think the MSM will be a tough sell for this book, although not as tough as it was a few months ago. The Florida Sun-Sentinel gave me a pretty good review, and I got good reviews as well in Publishers Weekly and Kirkus Reviews. The people at Basic Books are working overtime to get the word out, so we'll see.

readmoreoften: Professor Miller, There are SO MANY wildly outrageous events occurring simultaneously -- the death of our democracy through stolen elections, the prospect of never-ending war, an unprotected and abused labor force (yes, I will be striking next week), the normalization of torture and rape, the loss of civil liberties, the suspicion surrounding the Bush Administration's culpability in 9-11 -- why is the public so RESOUNDINGLY SILENT? I believe this resounding silence would have been unthinkable 20 years ago. I went to the World Can't Wait rally at Union Square yesterday and I was perplexed that so few New Yorkers were willing to take to the streets to protest this regime. As undergrads in the late 80s/early 90s, we occupied the administration building because of a slight increase in tuition for low-income students. At this point, I swear I can't imagine undergraduates taking action to stop a college administration from forcing low-income students to sell their organs to pay for tuition. It seems to be more than just a chilling effect. We are living in a media bubble -- a bubble of disinformation. As a people under undemocratic rule, who currently have no ability to manage or confront our mediated environment... How can we cut through the apathy? How do we debrief our fellow Americans? How do we address the fact that even those who are critical of the Bush administration will not confront the gravity of the situation? Do you have any ideas on how to burst the media bubble? Can you share with us any particular strategies you have used to cut through normally thoughtful people's overwhelming desire to pull the covers over their heads and go back to bed? And thank you for signing the faculty democracy statement in support of TAs' freedom to strike!

Mark Crispin Miller: It isn't necessarily apathy. Discontent is more widespread than we are generally led to think. BushCo's popularity among the military, for example, and among military families, is not at all impressive; and he has lost a lot of ground even among his own erstwhile constituents. His current "STRONG approval" rating is now around 22%, with a four-point margin of error, which means that it could be as low as 18%, the same percentage of Americans who did not disapprove of how Bush/Cheney and their Congress tried to meddle in the Terri Schiavo case.
We tend to think of many of our fellow-citizens as apathetic because, let's face it, we too live inside "the media bubble," which represents us to ourselves (and to the whole wide world) as far less discontented than we really are. Now, it is surely true that people should be more than discontented. They should be actively protesting and resisting. (Although there too the media tunes out what protest and resistance HAS welled up.) On the other hand, the system has radically depoliticized us, training us to watch and, if we can afford it, shop, and little else. We've therefore long since lost our civic virtue, and the necessary habit of saying NO when things become oppressive. Just remember that the situation is a lot more fluid, and potentially explosive, than it appears to be on CNN and in the New York Times. The elites have fallen out with one another, a clash that now provides us with a most important opportunity to say things that have been verboten for too long. The iron is hot. It's therefore crucial that we not despair, or paralyze ourselves with undue worries vis-à-vis the seeming or alleged indifference of "the masses." DUBYASCREWEDUS: I live in Cleveland, Ohio - Land of Blackwell the Evil. I know he stole the 2004 election. How can we - as ordinary citizens - stop them from doing it again? Are you familiar with State Issues 2, 3, 4 and 5? We have been receiving conflicting views on whether or not to vote for them. Do you know of them, and if so, do you have an opinion? Mark Crispin Miller: I don't know about those issues. What do Bob Fitrakis and Harvey Wasserman say? Free Press is terrific. I trust them all implicitly re: all electoral issues in Ohio. Bill Bored: Why do you favor Early Voting? You say we should "extend the voting period to, say, a week." If we are concerned about security, early voting is not advisable. The longer the machines are available to accept votes, the greater the temptation and opportunity to screw around with them. Also, early voting gives any potential fraudster the knowledge of how the election is going so that vote rigging can be targeted to areas on election day in which the early results were "disappointing" and in need of reversal. Wouldn't it be safer/better to have an Election Day holiday to allow everyone to get to the polls? Mark Crispin Miller: I don't think we should be using those machines. althecat: Hi Mark.... Alastair from Scoop NZ here.... I will have to buy your book ASAP. And am delighted you have decided to come and chat here in DU. Is Volusia County in the book? I was always very disappointed after we followed up your story ([15]Diebold Memos Disclose Florida 2000 E-Voting Fraud) that Dana Milbank didn't go back and dig a bit deeper into this. For me I thought this discovery was a bit of a breakthrough in terms of indicating that fraud had quite probably occurred at a fairly high level in the 2000 election. P.S. I will have a scout around the thread to figure out the best place to buy the book. Mark Crispin Miller: Alastair, I'm honored by your praise. Scoop.co.nz is indispensable! 1,000 thanks. Yes, in Fooled Again I do deal with Volusia County, especially with the fact that Fox News called the race for Bush just at that moment when those 16,000+ Democratic votes had temporarily zipped down the rabbit hole. You can get the book from my own blog, at [16]markcrispinmiller.com, or from Buzzflash.
------------ Mark Crispin Miller interview http://www.stayfreemagazine.org/archives/19/mcm.html Mark Crispin Miller on conspiracies, media, and mad scientists Interview by Carrie McLaren | [8]Issue #19 After years of dropping Mark Crispin Miller's name in Stay Free!, I figured it was time to interview him. Miller is, after all, one of the sharpest thinkers around. His writings on television predicted the cult of irony, or whatever you call it when actual Presidential candidates mock themselves on Saturday Night Live, when sitcoms ridicule sitcoms, and when advertisements attack advertising. More recently, he has authored The Bush Dyslexicon, aided by his humble and ever-devoted assistant (me). Miller works at New York University in the Department of Media Ecology. Though he bristles at being called an academic, Miller is exactly the sort of person who should be leading classrooms. He's an excellent speaker, with a genius for taking cultural products, be they Jell-O commercials or George W. Bush press conferences, and teasing out hidden meanings. (He's also funny, articulate, and knows how to swear.) I talked to Mark at his home in November, between NPR appearances and babysitting duty. He is currently writing about the Marlboro Man for American Icons, a Yale University Press series that he also happens to be editing. His book Mad Scientists: Paranoid Delusion and the Craft of Propaganda (W. W. Norton) is due out in 2004. -CM STAY FREE: Let's start with a simple one: Why are conspiracy theories so popular? MCM: People are fascinated by the fundamental evil that seems to explain everything. Lately, this is why we've had the anomaly of, say, Rupert Murdoch's Twentieth Century Fox releasing films that feature media moguls as villains out to rule the world, villains much like Rupert Murdoch. Who's a bigger conspirator than he is? And yet he's given us The X-Files. Another example: Time Warner released Oliver Stone's JFK, that crackpot-classic statement of the case that American history was hijacked by a great cabal of devious manipulators. It just so happens that Stone himself, with Time Warner behind him, was instrumental in suppressing two rival projects on the Kennedy assassination. These are trivial examples of a genuine danger, which is that those most convinced that there is an evil world conspiracy tend to be the most evil world conspirators. STAY FREE: Because they know what's inside their own heads? MCM: Yes and no. The evil that they imagine is inside their heads, but they can't be said to know it, at least not consciously. What we're discussing is the tendency to paranoid projection. Out of your own deep hostility you envision a conspiracy so deep and hostile that you're justified in using any tactics to shatter it. If you look at those who have propagated the most noxious doctrines of the twentieth century, you will find that they've been motivated by the fierce conviction that they have been the targets of a grand conspiracy against them. Hitler believed he was fighting back, righteously, against "the Jewish world conspiracy." [See pp. 30-31] Lenin and Stalin both believed they were fighting back against the capitalist powers, a view that had some basis in reality, of course, but one that those Bolsheviks embraced to an insane degree. (In 1941, for example, Stalin actually believed that England posed a greater danger to the Soviet Union than the Nazis did.) We see the same sort of paranoid projection among many of the leading lights of our Cold War: the first U.S.
Secretary of Defense, James Forrestal, who was in fact clinically insane; the CIA's James Angleton; Richard Nixon; J. Edgar Hoover; Frank Wisner, who was in charge of the CIA's propaganda operations worldwide. Forrestal and Wisner both committed suicide because they were convinced the Communists were after them. Now, there was a grain of truth to this since the Soviet Union did exist and it was a hostile power. But it wasn't on the rise, and it wasn't trying to take over the world, and it certainly wasn't trying to destroy James Forrestal personally. We have to understand that there was just as much insanity in our own government as there was with the Nazis and the Bolsheviks. This paranoid dynamic did not vanish when the Cold War ended. The U.S. is now dominated, once again, by rightists who believe themselves besieged. And the same conviction motivates Osama bin Laden and his followers. They see themselves as the victims of an expansionist Judeo-Christianity. STAY FREE: Al Qaeda is itself a conspiracy. MCM: Yes. We have to realize that the wildest notions of a deliberate plot are themselves tinged with the same dangerous energy that drives such plots. What we need today, therefore, is not just more alarmism, but a rational appraisal of the terrorist danger, a clear recognition of our own contribution to that danger, and a realistic examination of the weak spots in our system. Unfortunately, George W. Bush is motivated by an adolescent version of the same fantasy that drives the terrorists. He divides the whole world into Good and Evil, and has no doubt that God is on his side, just like bin Laden. So how can Bush guide the nation through this danger, when he himself sounds dangerous? How can he oversee the necessary national self-examination, when he's incapable of looking critically within? In this sense the media merely echoes him. Amid all the media's fulminations against al Qaeda, there has been no sober accounting of how the FBI and CIA screwed up. Those bureaucracies have done a lousy job, but that fact hasn't been investigated because too many of us are very comfortably locked into this hypnotic narrative of ourselves as the good victims and the enemy as purely evil. STAY FREE: There's so much contradictory information out there. Tommy Thompson was on 60 Minutes the other night saying that we were prepared for biological warfare, that there was nothing to worry about. Yet The New York Times and The Wall Street Journal have quoted experts saying the exact opposite. Do you think this kind of confusion contributes to conspiratorial thinking? I see some conspiratorial thinking as a normal function of getting along in the world. When, on September 11th, the plane in Pennsylvania went down, there was lots of speculation that the U.S. military shot it down. MCM: Which I tend to think is true, by the way. I've heard from some folks in the military that that plane was shot down. STAY FREE: But we have no real way of knowing, no expertise. MCM: Yes, conspiratorial thinking is a normal response to a world in which information is either missing or untrustworthy. I think that quite a few Americans subscribe to some pretty wild notions of what's going on. There's nothing new in this, of course. There's always been a certain demented plurality that's bought just about any explanation that comes along. That explains the centuries-old mythology of anti-Semitism. There will always be people who believe that kind of thing.
To a certain extent, religion itself makes people susceptible to such theorizing. STAY FREE: How so? MCM: Because it tends to propagate that Manichean picture of the universe as split between the good people and "the evil-doers." Christianity has spread this vision, even though it's considered a heresy to believe that evil is an active force in God's universe. According to orthodox Christianity, evil is not a positive force but the absence of God. STAY FREE: A lot of religious people believe what they want to believe, anyway. Christianity is negotiable. MCM: Absolutely. But when it comes to the paranoid world view, all ethical and moral tenets are negotiable, just as all facts are easily disposable. Here we need to make a distinction. On the one hand, there have been, and there are, conspiracies. Since the Cold War, our government has been addicted to secrecy and dangerously fixated on covert action all around the world. So it would be a mistake to dismiss all conspiracy theory. At the same time, you can't accept everything; that's just as naïve and dangerous as dismissing everything. Vincent Bugliosi, who wrote The Betrayal of America, is finishing up a book on the conspiracy theories of the Kennedy assassination. He has meticulously gone through the case and has decided that the Warren Report is right. Now, Bugliosi is no knee-jerk debunker. He recognizes that a big conspiracy landed George W. Bush in the White House. STAY FREE: So I take it you don't buy the conspiracy theories about JFK? MCM: I think there's something pathological about the obsession with JFK's death. Some students of the case have raised legitimate questions, certainly, but people like Stone are really less concerned with the facts than with constructing an idealized myth. STAY FREE: Critics of the war in Afghanistan have called for more covert action as an alternative to bombing. That's an unusual thing for the left to be advocating, isn't it? MCM: It is. On the one hand, any nation would appear to be within its rights to try to track down and kill these mass murderers. I would personally prefer to see the whole thing done legally, but that may not be realistic. So, if it would work as a covert program without harm to any innocents, I wouldn't be against it. But that presumes a level of right-mindedness and competence that I don't see in our government right now. I don't think that we can trust Bush/Cheney to carry out such dirty business. Because they have a paranoid world-view, just like the terrorists, they must abuse their mandate to "do what it takes" to keep us safe. By now they have bombed more innocents than perished in the World Trade Center, and they're also busily trashing many of our rights. The "intelligence community" itself, far from being chastened by its failure, has used the great disaster to empower itself. That bureaucracy has asked for still more money, but that request is wholly disingenuous. They didn't blow it because they didn't have enough money; they blew it because they're inept! They coasted along for years in a cozy symbiosis with the Soviet Union. The two superpowers needed one another to justify all this military and intelligence spending, and it made them complacent. Also, they succumbed to the fatal tendency to emphasize technological intelligence while de-emphasizing human intelligence. STAY FREE: Yeah, the Green Berets sent to Afghanistan are equipped with all sorts of crazy equipment.
They each wear gigantic puffy suits with pockets fit to carry a GPS, various hi-tech gizmos, and arms. MCM: That's just terrific. Meanwhile, the terrorists used boxcutters! STAY FREE: Did you see that the U.S. Army has asked Hollywood to come up with possible terrorist scenarios to help prepare the military for attack? MCM: Yeah, it sent a chill right through me. If that's what they're reduced to doing to protect us from the scourge of terrorism, they're completely clueless. They might as well be hiring psychics, which, for all we know, they are! STAY FREE: The Bush administration also asked Al Jazeera, the Arab TV station, to censor its programming. MCM: Right. And, you know, every oppressive move we make, from trying to muzzle that network to dropping bombs all over Afghanistan, is like a gift to the terrorists. Al Jazeera is the only independent TV network in the Arab world. It has managed to piss off just about every powerful interest in the Middle East, which is a sign of genuine independence. In 1998, the network applied for membership in the Arab Press Union, and the application was rejected because Al Jazeera refused to abide by the stricture that it would do everything it could to champion "Arab brotherhood." STAY FREE: What do you think our government should have done instead of bombing? MCM: I rather wish they had responded with a little more imagination. Doing nothing was not an option. But bombing the hell out of Afghanistan was not the only alternative, and it was a very big mistake, however much it may have gratified a lot of anxious TV viewers in this country. By bombing, the U.S. quickly squandered its advantage in the propaganda war. We had attracted quite a lot of sympathy worldwide, but that lessened markedly once we killed Afghan civilians by the hundreds, then the thousands. Americans have tended not to want to know about those foreign victims. But elsewhere in the world, where 9/11 doesn't resonate as much, the spectacle of all those people killed by us can only build more sympathy for our opponents. That is, the bombing only helps the terrorists in the long run. And so has our government's decision to define the 9/11 crimes as acts of war. That definition has served only to exalt the perpetrators, who should be treated as mass murderers, not as soldiers. But the strongest argument against our policy is this: it is exactly what the terrorists were hoping for. Eager to accelerate the global split between the faithful and the infidels, they wanted to provoke us into a response that might inflame the faithful to take arms against us. I think we can agree that, if they wanted it, we should have done something else. STAY FREE: You've written that, before the Gulf War, Bush the elder's administration made the Iraqi army sound a lot more threatening than it really was. Bush referred to Iraq's scanty, dwindling troops as the "elite Republican guard." Do you think that kind of exaggeration could happen with this war? MCM: No, because the great given in this case is that we are rousing ourselves from our stupor and dealing an almighty and completely righteous blow against those who have hurt us. Now we have to seem invincible, whereas ten years ago, they wanted to make us very scared that those Iraqi troops might beat us. By terrorizing us ahead of time, the Pentagon and White House made our rapid, easy victory seem like a holy miracle. STAY FREE: Let's get back to conspiracy theories. Do people ever call you a conspiracy theorist? MCM: Readers have accused me of paranoia.
People who attacked me for The Bush Dyslexicon seized on the fact that my next book is subtitled Paranoid Delusion and the Craft of Propaganda, and they said, "He's writing about himself!" But I don't get that kind of thing often because most people see that there's a lot of propaganda out there. I don't write as if people are sitting around with sly smiles plotting evil; they're just doing their jobs. The word propaganda has an interesting history, you know. It was coined by the Vatican. It comes from propagare, which means grafting a shoot onto a plant to make it grow. It's an apt derivation, because propaganda only works when there is fertile ground for it. History's first great propagandist was St. Paul, who saw himself as bringing the word of God to people who needed to hear it. The word wasn't pejorative until the First World War, when the Allies used it to refer to what the Germans did, while casting their own output as "education," or "information." There was a promising period after the war when it got out that our government had done a lot of lying. The word propaganda came to connote domestic propaganda, and there were a number of progressive efforts to analyze and debunk it. But with the start of World War II, propaganda analysis disappeared. Since we were fighting Nazi propaganda with our own, it wasn't fruitful to be criticizing propaganda. STAY FREE: I read that the word "propaganda" fell out of fashion among academics around that time, so social scientists started referring to their work as "communications." It was no longer politically safe to study how to improve propaganda. MCM: Experts in propaganda started doing "communications" studies after the war. Since then, "communication" has been the most common euphemism used for "propaganda," as in "political communication." There's also "psychological warfare" and, of course, "spin." The Cold War was when "propaganda" became firmly linked to Communism. "Communist propaganda" was like "tax-and-spend Democrats" or "elite Republican guard." The two elements were inseparable. If the Communists said it, it was considered propaganda; and if it was propaganda, there were Communists behind it. Only now that the Cold War is over is it possible to talk about U.S. propaganda without running the risk of people looking at you funny. The word does still tend to be used more readily in reference to liberals or Democrats. The right was always quick to charge Bill Clinton (that leftist!) with doing propaganda. In fact, his right-wing enemies, whose propaganda skills were awesome, would routinely fault him for his "propaganda." You never heard anybody say Ronald Reagan was a master propagandist, though. He was "the Great Communicator." STAY FREE: Talk a bit about how conspiracy is used to delegitimize someone who's doing critical analysis. I've heard you on TV saying, "I don't mean to sound like a conspiracy theorist, but . . . " People even do this in regular conversation. A friend of mine was telling me about going to Bush's inauguration in D.C. He was stunned that none of the protests were covered by the media but prefaced his comments by saying, "I don't want to sound like a conspiracy theorist, but [the press completely ignored the protests]." It's almost as if people feel the need to apologize if they don't follow some party line. MCM: I wouldn't say that, because there are people who are conspiracy theorists. And I think the emphasis there should not be on the conspiracy but on the theory. A theorist is a speculator.
It's always much easier to construct a convincing conspiracy theory if you don't bother looking at reality. The web is filled with stuff like this. So, if you want to cover yourself, you should say something like: "I don't subscribe to every crackpot notion that comes along, but in this case there's something funny going on, and here's the evidence." It really is a rhetorical necessity. Especially when you're on TV. STAY FREE: Maybe it's more of a necessity, too, when you're talking about propaganda. MCM: I'll tell you something: it's necessary when you're talking about real conspiracies. You know who benefited big time from the cavalier dismissal of certain conspiracies? The Nazis. The Nazis were expert at countering true reports of their atrocities by recalling the outrageous lies the Allies had told about the Germans back in World War I. The Allies had spread insane rumors about Germans bayoneting Belgian babies, and crucifying Canadian soldiers on barn doors, and on and on. So, when it first got out that the Nazis were carrying out this horrible scheme, their flacks would roll their eyes and say, "Oh yeah, just like the atrocity stories we heard in WWI, right?" STAY FREE: I once attended a lecture on Channel One [an advertising-funded, in-school "news" program], where a professor dissected several broadcasts. He talked about how Channel One stories always emphasize "oneness" and individuality. Collective effort or activism is framed negatively, while business and governmental sources are portrayed positively and authoritatively. Now, someone listening to this lecture might say, "That's just your reading into it. You sound conspiratorial." So where do you think this sort of media or literary analysis and conspiracy-mongering intersect? MCM: That's a very good question. For years I've encountered the same problem as a professor. You've got to make the point that any critical interpretation has to abide by the rules of evidence; it must be based on a credible argument. If you think I'm "reading into it," tell me where my reading's weak. Otherwise, grant that, since the evidence that I adduce supports my point, I might be onto something. Where it gets complicated with propaganda is around the question of intention, because an intention doesn't have to be entirely conscious. The people who make ads, for example, are embedded in a larger system; they've internalized its imperatives. So they may not be conscious intellectually of certain moves they make. If you said to somebody at Channel One, "You're hostile to the collective and you insult the individual," he'd say, reasonably, "What are you talking about? I'm just doing the news." So you have to explain what ideology is. I'm acutely sensitive to this whole problem. When I teach advertising, for example, I proceed by using as many examples as possible, to show that there is a trend, whatever any individual art director or photographer might insist about his or her own deliberate aims. Take liquor advertising, which appeals to the infant within every alcoholic by associating drink with mother's milk. This is clearly a deliberate strategy because we see it in ad after ad: some babe holding a glass of some brew right at nipple level. She's invariably small-breasted so that the actual mammary does not upstage the all-important product. If that's an accident, it's a pretty amazing accident.
Now, does this mean that the ad people sit down and study the pathology of alcoholics, or is it something they've discovered through trial and error? My point is that it ultimately makes no difference. We see it over and over, and if I can show you that, according to experts, this visual association speaks to a desire in alcoholics, a regressive impulse, then you have to admit I have a point. Of course, there are going to be people who'll accuse you of "reading into it" no matter what you say because they don't want to hear the argument. This is where we come up against the fundamental importance of anti-intellectualism on the right. They hate any kind of explanation. They feel affronted by the very act of thinking. I ran into this when I promoted The Bush Dyslexicon on talk shows, which I could do before 9/11. Bush's partisans would fault me just for scrutinizing what he'd said. STAY FREE: I recently read Richard Hofstadter's famous essay about political paranoia. He argued that conspiracy thinking is not specific to any culture or country. Would you agree with that, or do you think there is something about America that makes it particularly hospitable to conspiracy theories? MCM: Well, there's a lot of argument about this. There's a whole school of thought that holds that England's Civil War brought about a great explosion of paranoid partisanship. Bernard Bailyn's book The Ideological Origins of the American Revolution includes a chapter on the peculiar paranoid orientation of the American revolutionaries. But I think paranoia is universal. It's an eternal, regressive impulse, and it poses a special danger to democracy. STAY FREE: Why, specifically, is it dangerous to democracy? MCM: Because democracies have always been undone by paranoia. You cannot have a functioning democracy where everyone is ruled by mutual distrust. A democratic polity requires a certain degree of rationality, a tolerance of others, and a willingness to listen to opposing views without assuming people are out to kill you. There's a guy named Eli Sagan who wrote a book on the destructive effect of paranoia on Athenian democracy. And I think that the American experiment may also fail; America has always come closest to betraying its founding principles at moments of widespread xenophobic paranoia. In wartime, people want to sink to their knees and feel protected. They give up thinking for themselves, an impulse fatal to democracy but quite appropriate for fascism and Stalinism. The question now is whether paranoia can remain confined to that thirty-or-so percent of the electorate who are permanently crazy. That's what Nixon himself said, by the way: that "one third of the American electorate is nuts." About a third of the German people voted for the Nazis. I think there's something to that. It's sort of a magic number. STAY FREE: Come to think of it, public opinion polls repeatedly show that 70% of the public are skeptical of advertising claims. I guess that means about 30% believe anything. MCM: Wow. I wonder if that lack of skepticism toward advertising correlates in any way with this collective paranoia. That would be interesting to know. STAY FREE: Well, during the Gulf War, a market research firm conducted a study that found that the more hawkish people were, the more likely they were to be rampant consumers. Warmongers, in other words, consumed more than peaceniks. Why do you think these two reactions might be correlated? MCM: One could argue that this mild, collective paranoia often finds expression in promiscuous consumption.
Eli Sagan talks about the "paranoidia of greed" as well as the "paranoidia of domination." Both arise out of suspicion of the enemy. You either try to take over all his territory forcibly, or you try to buy everything up and wall yourself within the fortress of your property. STAY FREE: Those two reactions also practically dominate American culture. When people from other countries think of America, they think of us being materialistic and violent. We buy stuff and kill people. Do you think there's any positive form of paranoia? Any advantage to it? MCM: No, I don't, because paranoids have a fatal tendency to look for the enemy in the wrong place. James Angleton of the CIA was so very destructive because he was paranoid. I mean, he should have been in a hospital, and I'm not being facetious. Just like James Forrestal, our first defense secretary. These people were unable to protect themselves, much less serve their country. I think paranoia is only useful if you're in combat and need to be constantly ready to kill. Whether it's left-wing or right-wing paranoia, the drive is ultimately suicidal. STAY FREE: Our government is weak compared to the corporations that run our country. What role do you see for corporations in the anti-terrorist effort? MCM: Well, corporations do largely run the country, and yet we can't trust them with our security. The private sector wants to cut costs, so you don't trust them with your life. Our welfare is not uppermost in their minds; our money is. So what role can the corporations play? STAY FREE: They can make the puffy suits! MCM: The puffy suits and whatever else the Pentagon claims to need. Those players have a vested interest in eternal war. STAY FREE: Did you read that article about Wal-Mart? After September 11, sales shot up for televisions, guns, and canned goods. MCM: Paranoia can be very good for business. STAY FREE: Have you ever watched one of those television news shows that interpret current events in terms of Christian eschatology? They analyze everyday events as signs of the Second Coming. MCM: No. I bet they're really excited now, though. I wonder what our president thinks of that big Happy Ending, since he's a born-again. You know, Reagan thought it was the end times. STAY FREE: But those are minority beliefs, even among born-again Christians. MCM: It depends on what you mean by "minority." Why are books by Tim LaHaye selling millions? He's a far-right fundamentalist, co-author of a series of novels all about the end times, the Rapture and so on. And Pat Robertson's best-seller, The New World Order, sounds the same apocalyptic note. STAY FREE: He's crazy. He can't really believe all that stuff. MCM: No, he's crazy and therefore he can believe that stuff. His nurse told him years ago that he was showing symptoms of paranoid schizophrenia. STAY FREE: I recently read a chapter from Empire of Conspiracy, an intelligent book about conspiracy theories. But it struck me that the author considered Vance Packard, who wrote The Hidden Persuaders, a conspiracy theorist. Packard's book was straightforward journalism. He interviewed advertising psychologists and simply reported their claims. There was very little that was speculative about it. MCM: The author should have written about Subliminal Seduction and the other books by Wilson Brian Key. STAY FREE: Exactly! That nonsense about subliminal advertising was a perfect example of paranoid conspiracy. Yet he picked on Vance Packard, who conducted his research as any good journalist would.
MCM: Again, we must distinguish between idle, lunatic conspiracy theorizing and well-informed historical discussion. There have been quite a few conspiracies in U.S. history, and if you don't know that, you're either ignorant or in denial. Since 1947, for example, we have conspiratorially fomented counter-revolutions and repression the world over. That's not conspiracy theory. That's fact, which is precisely why it is met with the charge of speculation. How better to discredit someone than to say she's chasing phantoms, or that she has an axe to grind? When James Loewen's book Lies Across America was reviewed in The New York Times, for example, the reviewer said it revealed an ideological bias because it mentions the bombing of civilians in Vietnam. Loewen wrote back a killer letter to the editor pointing out that he had learned about those bombings from The New York Times. Simply to mention such inconvenient facts is to be dismissed as a wild-eyed leftist. When someone tells me I'm conspiracy-mongering, I usually reply, "It isn't a conspiracy, it's just business as usual." STAY FREE: That's like what Noam Chomsky says about his work: "This is not conspiracy theory, it is institutional analysis." Institutions do what is necessary to assure the survival of the institution. It's built into the process. MCM: That's true. There's a problem with Chomsky's position, though, and I say this with all due respect because I really love Chomsky. When talking about U.S. press coverage, Chomsky will say that reporters have internalized the bias of the system. He says this, but the claim is belied by the moralistic tone of Chomsky's critique: he charges journalists with telling "lies" and lying "knowingly." There is an important contradiction here. Either journalists believe they're reporting truthfully, which is what Chomsky suggests when he talks about internalizing institutional bias. Or they're lying, and that, I think, is what Chomsky actually believes, because his prose is most energetic when he's calling people liars. One of the purposes of my next book, Mad Scientists, will be to suggest that all the best-known and most edifying works on propaganda are slightly flawed by their assumption that the propagandist is a wholly rational, detached, and calculating player. Most critics (not just Chomsky, but Jacques Ellul and Hannah Arendt, among others) tend to project their own rationality onto the propagandist. But you can't study the Nazis or the Bolsheviks or the Republicans without noticing the crucial strain of mad sincerity that runs throughout their work, even at its most cynical. STAY FREE: You have written that even worse than the possibility that a conspiracy exists may be the possibility that no conspiracy is needed. What do you mean by that? MCM: The fantasy of one big, bad cabal out there is terrifying but also comforting. Not only does it help make sense of a bewildering reality, but it also suggests a fairly neat solution. If we could just find all the members of the network and kill them, everything would be okay. It's more frightening to me that there are no knowing authors. No one is at the top handling the controls. Rather, the system is on auto-pilot, with cadres just going about their business, vaguely assuming that they're doing good and telling truths, when in fact they are carrying out what could objectively be considered evil. What do you do, then? Who is there to kill? How do you expose the perpetrators? Whom do you bring before the bar of justice, and who believes in "justice"?
And yet I do think that a lot of participants in this enterprise know they're doing wrong. One reason people who work for the tobacco companies make so much money, for example, is to still the voice of conscience, to make them feel they're doing something valuable. But the voice is very deeply buried. Ultimately, though, it is the machine itself that's in command, acting through those workers. They let themselves become the media's own media, the instruments whereby the system does its thing. I finally learned this when I studied the Gulf War, or rather, the TV spectacle that we all watched in early 1991. There was a moment on the war's first night when Ron Dellums was just about to speak against the war. He was on the Capitol steps, ready to be interviewed on ABC, and then he disappeared. They cut to something else. I was certain that someone, somewhere, had ordered them to pull the plug because the congressman was threatening to spoil the party. But it wasn't that at all. We looked into it and found the guy who'd made that decision, which was a split-second thing based on the gut instinct that Dellums' comments would make bad TV. So that was that: a quick, unconscious act of censorship, effected not by any big conspiracy but by one eager employee. No doubt many of his colleagues would have done the same. And that, I think, is scarier than any interference from on high. From checker at panix.com Sun Dec 4 03:21:38 2005 From: checker at panix.com (Premise Checker) Date: Sat, 3 Dec 2005 22:21:38 -0500 (EST) Subject: [Paleopsych] Die Zeit: Washing Weber's dirty laundry Message-ID: Washing Weber's dirty laundry http://print.signandsight.com/features/445.html 2005-11-14 Joachim Radkau's compendious new biography deals in detail with Max Weber's personal - and sexual - life. A critical review by Robert Leicht. Why are we interested in the lives of people whose works lie before us like an open book? If we didn't know the first thing about Johann Sebastian Bach, for example, would his music sound any different? The case of Martin Luther is different: would we be able to comprehend the full impact of his reformist breakthrough on the idea of justification according to Saint Paul if we knew nothing of the biographical, and above all autobiographical, accounts of how he tortured himself with his awareness of his sin, and with his (as his paternal friend [1]Johann von Staupitz shouted out to him) "superficial sins"? Which of these two perspectives applies to the relationship between the life and work of Max Weber, the German [2]founder of sociology, whose opus doesn't even exist in a coherent edition? (The eminent complete edition of his works is still awaiting completion.) "During his lifetime", as [3]political scientist Wilhelm Hennis observed back in 1982, "Weber only published two 'real books', the ones indispensable to his academic career: a dissertation and a habilitation. All other works consist of enquiry-type reports and rapidly thrown together essays, which were only published in book form after his death." How does the incompleteness of his work relate to his widespread influence as a thinker? Is it even possible to explain the fragmentary nature of his work with reference to fragments from his life - in other words to the "suffering" which caused him years of writer's block and forced him to give up teaching for most of his life?
At the time Hennis wrote forebodingly: "We are going to have to postpone all wishes for a fitting biography, one that replaces [4]Marianne Weber's, until the vast treasure of letters has been published in its entirety... In any event, only the letters can provide a deeper and more accurate understanding of Weber's life." This must have been understood as an indication from those in the know that the letters contained biographical details that could not yet be made public. Certainly, sketchy facts about the complicated love triangle between Else Jaffé, Max Weber and his brother Alfred were in circulation, but less was known about Weber's love affair with the pianist Mina Tobler. These affairs certainly provide material for speculation and chatter, but also for serious interrogation. Paradoxically, Wilhelm Hennis both argued for and warned against a new biography: "The 'derivation' of Weber's work from his psyche has turned out to be as questionable as the effort to separate his life from his work. He was a genius, a man sensitive to the world in which we live. Both his genius and his sensitivity were invested in a body of work that attempted to be social science." Twenty-three years after Hennis wrote these words, Joachim Radkau has published a monumental [5]biography of Max Weber. As Hennis predicted, the book depends heavily on the letters. And it owes much to the fact that Radkau has a thorough knowledge of many areas significant for understanding Max Weber's time and states of mind, for example his study on [6]"Das Zeitalter der Nervosität" (The age of nervousness). Seldom has a biography dealt with sources in such a detailed way. Seldom has a work given such a full picture of the protagonist's intellectual context and social milieu (for example his description of the university environment, especially in Heidelberg). Over and above Weber's biography, the volume provides a rich overview of an entire epoch. Yet it remains an open question whether this monumental, overwhelming, ultimately tiring study not only extends but also deepens our knowledge of Max Weber. To put it bluntly: we may learn more about Max Weber's person, but only a limited amount about his work and influence. As far as Weber's work is concerned, Radkau's biography is the polar opposite of Wilhelm Hennis' interests. In Hennis' view, Weber's entire work must be approached from an Archimedean, or rather anthropological, perspective: "What is man becoming in 'mental', 'qualitative' terms?" For Hennis, Weber's entire work is concerned with this central question, from the inaugural Freiburg address in 1895 to its unfinished end. Radkau, on the other hand, separates Weber's life and work into two clearly distinct phases, each of which reveals an entirely different personality with a correspondingly different body of work. Certainly, Radkau defends himself against the deadly charge of "biographical reductionism", as if Hennis' warning were still ringing in his ears. But one can say without exaggeration that in this two-phase biography, Radkau connects not only Weber's scholarly creativity, but also the direction of his thinking, very closely to the emotional and erotic (psycho-physical, in fact psycho-sexual) sensitivity of his hero. There might be a certain plausibility in saying: how you feel is how you think - or write. But how many artworks have been wrested from an artist's naked desperation that fail to shed light on the artist's life?
The very - at first sight oppressive - burden of evidence amassed by Radkau to establish a connection between emotion and creation gives one pause for thought, both for reasons of fact and method. To sum up Radkau roughly, Weber's first phase, leading up to his psycho-physical breakdown in 1898/99 which it took him years to recover from (at his own request he was finally relieved of teaching duties in 1903), is obsessively determined by his sexually unfulfilled, allegedly unconsummated marriage with Marianne Weber, by his impotence, and by his masochistic tendencies. Attendant to these are Weber's continual pollutions, or nocturnal ejaculations, which he saw as extremely detrimental to his creative powers. It's bad enough that Marianne Weber wrote innumerable letters on the subject to Weber's mother behind his back, thus providing the relevant source material for this biography. Reading the work, one is led to regard the Indian [7]custom of widow-burning with a certain, of course entirely politically incorrect, indulgence. But it is even worse that Radkau goes into such painful details. The word "pollution" or its German equivalent appears 29 times in five pages. Fine, the Webers were evidently deeply troubled by what is for us an incomprehensible pseudo-problem. But must we really have our noses dragged through this evidence? So Max Weber appears as a particularly severe case of what was then called neurasthenia. And this first period of what one might term pathologically inflicted sexual asceticism corresponds with that part of Weber's work which deals with the inner asceticism of the Protestant ethic and the spirit of capitalism, with its strictly regimented lifestyle. In the autumn of 1909, Max Weber falls in love with Else Jaffé. But two months later they separate again because Else has started up an affair with Weber's brother Alfred. Then in the summer of 1912 Max starts a love affair with the pianist Mina Tobler. Radkau sums up: "The relapses into his suffering now come to an end." Seven years later, Max falls in love with Else Jaffé once again, in what the letters suggest was a deeply servile love. Hardly a year later he dies. However this last decade of his short life is marked not only by an immense literary output, but also by a change in direction. Max Weber, now erotically uninhibited, extra-marital and sexually fulfilled, busies himself with the religions of redemption and charisma. True, Radkau notes: "The new era is not, as far as we know, initiated by his love experience, but by an intellectual mood swing and a new feeling of physical well-being." Wouldn't it have been a good idea to ask whether Weber's neurasthenic suffering was not simply the cause, but also the consequence, of his lack of productivity? And one could also ask whether his newfound productivity was not caused by his newfound sexual potency. Perhaps their interdependence was ultimately even more complex than that. The irritating, even maddening thing about Radkau's indiscreet inroads into Weber's private sphere is the countless number of times that something is apparent, that the supposition is justified, that one is entitled to assume... Assumption follows assumption. Some may be plausible, some entirely misleading. Wouldn't biographers do better to stick to what can be conclusively supported, rather than go out on conjectural limbs - or even repudiate their sources? Radkau points to evidence that Max Weber felt a sexual thrill when spanked by the family maid as a child.
On two occasions in the book he then feels entitled to correct Weber on this point. In fact, he argues, it must have been Weber's mother (with the long-term consequences one might expect), because in such an upper-middle class household the maid would never dare punish the young master in such a way. This is nonsense of course, as the present writer can attest: he himself comes from no lower-middle class background, and as a boy was occasionally chastised by both maid and mother (the mother comparatively the milder of the two) - without thrills, without long-term negative consequences, and, it should be said, without it triggering off or repressing a major body of scholarly work. Some of the more intimate details of Weber's world could even be instructive for understanding the conditions of his intellectual production, if not its consequences. But their obsessive dissemination here - although it is precisely this information that will cause a sensation - is interfering, embarrassing and questionable in terms of whether it aids an understanding of the work. Whereas the unity of Weber's life and work is essential to Hennis' central approach from the outset, Radkau tries to constitute a kind of unity by letting the "true" Max Weber, "his" Weber, only appear in the second phase of his biography. It is here, in the last decade of Weber's life, after the productivity crisis and his erotic awakening with Else Jaffé, Mina Tobler and once again Else Jaffé, that Weber finds himself. But once you've adopted such a sceptical reading of Radkau's biography - which as I said is as imposing, entertaining and ingenious in its goals as in its method - a second reading will not fail to reveal many points where one is unsure whether to side with Weber or his biographer. One example is the word charisma. This cardinal term is treated in a twofold fashion. On the one hand it stands for the redemption of man - from his neurasthenic suffering too - through God's grace. On the other, it stands for the pre-conditions for a certain type of leadership. But the one has nothing to do with the other. The liberation from guilt vis-à-vis God has nothing to do with enforcing one's power on non-liberated subjects. Appealing to Weber's preference for the prophets of the Old Testament, Radkau parades these figures as prototypes of charismatic leadership. These people, however, didn't feel they had been freed by God, rather they felt constrained by him against their will. In addition, they didn't have a chance to demand the people's allegiance (the essence of leadership according to Weber). They were severe critics of leadership, but unsuccessful ones, and in later epochs they were self-critically presented as such in the writings of the people of Israel. Radkau is right to put so much emphasis on theology. But when he only cites [8]Karl Barth's critique of the "liberal theology" before and during World War I from the "Lectures on 19th Century Theology", he misses Barth's real theological-political polemic, which is expressed far more clearly in his many pamphlets and "open letters". But Radkau's deficiencies are most apparent in his treatment of Weber's music sociology. No one who plays an instrument with a brass-type mouthpiece would ever think - with Helmholtz, and following him Weber and Radkau - of seeing a complete harmony in the natural series of tones, which are primarily impressive for their purely mathematical proportions. And one would be even less inclined to draw wide-reaching consequences from them.
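[A rough calculation, added here and not from Leicht's review, makes the point about the natural tone series concrete. The nth partial of a fundamental with frequency f sounds at n times f, and intervals can be measured in cents (hundredths of an equal-tempered semitone). The seventh partial, for instance, falls well flat of the minor seventh that any keyboard would supply:

\[ 1200 \log_2\left(\frac{7}{4}\right) \approx 968.8 \text{ cents} \qquad \text{vs.} \qquad 1000 \text{ cents (equal-tempered minor seventh)} \]

A gap of roughly 31 cents is plainly audible to any brass player who has had to "lip" that partial into tune, which is the point the reviewer makes next.]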
At the seventh partial at the very latest, such an approach becomes very hard to justify. All these irritating factors are exacerbated by Radkau's innumerable side-swipes at traditional Max Weber research, and by Radkau's critique of everything Weber's traditional "admirers" praise. And conversely, it is clear Radkau believes he is the only one to really do justice to Weber's music sociology, for example. His explanations, by contrast, are often based on sentences which, in his view, the traditional Weber researchers have either not read carefully enough, or not understood correctly. Certainly, a measured lack of respect not only makes for amusing reading, it is also entirely justified. Weber's Freiburg address, for example, and many of his political judgements, can only be seen as embarrassing and borderline. But a biographer who can't stop poking fun at Weber scholarship in a work of a good 1,000 pages neither does justice to how Weber's work has been received, nor to its enduring legacy. Here the book would have needed a good edit, one that removes the superfluous and supplies what is lacking. At the end of the book, Radkau justifies his washing Weber's dirty laundry in public by saying that by now even those mentioned only briefly are dead. Objection, your honour! Even long after death a taboo remains which protects people from having their innermost secrets revealed. Especially when the revelations are not aimed at satisfying our thirst for knowledge, but our idle curiosity. Joachim Radkau: "[9]Max Weber. Die Leidenschaft des Denkens" is published by C. Hanser Verlag, Munich, 2005. 1,008 pages, 45.00 euros. * The article [10]originally appeared in German in the October 2005 literary supplement of Die Zeit. Robert Leicht served as editor in chief of Die Zeit from 1992 - 1997, and is now political correspondent for the paper. Translation: [11]jab.
References
1. http://www.newadvent.org/cathen/14283a.htm
2. http://www.faculty.rsu.edu/%7Efelwell/Theorists/Weber/Whome.htm
3. http://en.wikipedia.org/wiki/Wilhelm_Hennis
4. http://www.webster.edu/%7Ewoolflm/weber.html
5. http://www.hanser.de/buch.asp?isbn=3-446-20675-2&area=Literatur
6. http://www.hanser.de/buch.asp?isbn=3-446-19310-3&area=Literatur
7. http://www.kamat.com/kalranga/hindu/sati.htm
8. http://www.faithnet.org.uk/Theology/barth.htm
9. http://www.hanser.de/buch.asp?isbn=3-446-20675-2&area=Literatur
10. http://www.zeit.de/2005/42/P-Weber
11. http://www.signandsight.com/service/35.html
From checker at panix.com Sun Dec 4 03:21:49 2005 From: checker at panix.com (Premise Checker) Date: Sat, 3 Dec 2005 22:21:49 -0500 (EST) Subject: [Paleopsych] TCS: Internet Killed the Alien Star Message-ID: Internet Killed the Alien Star http://www.techcentralstation.com/110905A.html By Douglas Kern Published 11/09/2005 If you're looking for one of those famous, big-eyed alien abductors, try looking on the sides of milk cartons. The UFO cultural moment in America is long since over, having gone out with the Clintons and grunge rock in the 90s. Ironically, the force that killed the UFO fad is the same force that catapulted it to super-stardom: the Internet. And therein hangs a tale about how the Internet can conceal and reveal the truth. It's hard to remember just how large UFOs loomed in the public mind a mere ten years ago.
The X-Files was one of the hottest shows on television; [26]Harvard professors solemnly intoned that the alien abduction phenomenon was a real, objective fact; and Congressmen made serious inquiries about a downed alien spacecraft in [27]Roswell, New Mexico. Still not enough? You could see the "Roswell" movie on Showtime; you could play "Area 51" at the arcade; you could gawk at stunning pictures of [28]crop circles in any number of magazines; and you could watch any number of lurid UFO specials on Fox or the Discovery Channel. And USENET! Egad! In the days when USENET was something other than a spam swap, UFO geeks hit "send" to exchange myths, sightings, speculations, secret documents, lies, truths, and even occasionally facts about those strange lights in the sky. The modern UFO era began with [29]Kenneth Arnold's 1947 UFO sighting near Mount Rainier, Washington. National interest in the subject waxed and waned in the following years -- sometimes spiking dramatically, as during the Washington, D.C. "flap" of 1952 or the Michigan sightings in 1966 (which captured the attention of [30]Gerald Ford). Steven Spielberg popularized the modern mythology of UFOs in 1977's "[31]Close Encounters of the Third Kind." And with the publication of [32]Whitley Strieber's "Communion" in 1987, alien abduction moved from a freakish, nutty concern to a mainstream phenomenon. Eccentrics had claimed to be in [33]mental contact with aliens since the fifties, and alien abductions had been a part of the American UFO scene since the [34]Betty and Barney Hill case of 1961, but Strieber's runaway bestseller fused the traditional alien abduction tale to a chilling narrative and a modern spiritual sensibility -- thus achieving huge credibility for our friends with the wraparound peepers. Yet in recent years, interest in the UFO phenomenon has withered. Oh, the websites are still up, the odd UFO picture is still taken, and the usual hardcore UFO advocates make the same tired arguments about the same tired cases, but the thrill is gone. What happened? Why did the saucers crash? The Internet showed this particular emperor to be lacking in clothes. If UFOs and alien visitations were genuine, tangible, objective realities, the Internet would be an unstoppable force for detecting them. How long could the vast government conspiracy last, when intrepid UFO investigators could post their prized pictures on the Internet seconds after taking them? How could the Men in Black shut down every website devoted to scans of secret government UFO documents? How could marauding alien kidnappers remain hidden in a nation with millions of webcams? Just as our technology for finding and understanding UFOs improved dramatically, the manifestations of UFOs dwindled away. Despite forty-plus years of alleged alien abductions, not one scrap of physical evidence supports the claim that mysterious visitors are conducting unholy experiments on hapless victims. The technology for sophisticated photograph analysis can be found in every PC in America, and yet, oddly, recent UFO pictures are rare. Cell phones and instant messaging could summon throngs of people to witness a paranormal event, and yet such paranormal events don't seem to happen very often these days. For an allegedly real phenomenon, UFOs sure do a good job of acting like the imaginary friend of the true believers. How strange, that they should disappear just as we develop the ability to see them clearly. Or perhaps it isn't so strange. 
The Internet taught the public many tricks of the UFO trade. For years, hucksters and mental cases played upon the credulity of UFO investigators. Bad science, shabby investigation, and dubious tales from unlikely witnesses characterized far too many UFO cases. But the rise of the Internet taught the world to be more skeptical of unverified information -- and careful skepticism is the bane of the UFO phenomenon. It took UFO experts over a decade to determine that the [35]"Majestic-12" documents of the eighties were a hoax, rather than actual government documents proving the reality of UFOs. Contrast that decade to the mere days in which the blogosphere disproved the Mary Mapes Memogate documents. Similarly, in the nineties, UFO enthusiasts were stunned when they learned that [36]a leading investigator of the Roswell incident had fabricated much of his research, as well as his credentials. Today, a Google search and a few e-mails would expose such shenanigans in minutes. Thus, the rise of the Internet in the late nineties corresponded with the fall of many famous UFO cases. Roswell? A crashed, top-secret weather balloon, misrepresented by dreamers and con men. [37]The Mantell Incident? A pilot misidentified a balloon, with tragic consequences. Majestic-12? Phony documents with a demonstrably false signature. [38]The Alien Autopsy movie? Please. As access to critical evidence and verifiable facts increased, the validity of prominent UFO cases melted away. Far-fetched theories and faulty evidence collapsed under the weight of their provable absurdity. What the Internet gave, the Internet took away. The Internet processes all truth and falsehood in just this fashion. Wild rumors and dubious pieces of evidence are quick to circulate, but quickly debunked. The Internet gives liars and rumor mongers a colossal space in which to bamboozle dolts of every stripe -- but it also provides a forum for wise men from all across the world to speak the truth. Over the long run, the truth tends to win. This fact is lost on critics of the blogosphere, who can only see the exaggerated claims and gossip. These critics often fail to notice that, on the 'net, the truth follows closely behind the lies. A great many of us accept Internet rumors and hoaxes in exchange for fast access to the truth. But is there any validity to the UFO phenomenon? Perhaps, but so what? The need for weird is hard-coded into the human condition. In every society, a few unlikely souls appear to make contact with an invisible world, communing with goblins or ghosts or aliens or gods or monsters. And in every society, some fool always tries to gather scales from the dragon tracks, or droppings from the goblins, or pictures of the aliens. The dream world is always too elusive to be captured, and yet too tantalizingly close to be dismissed. And so the ancient game continues, with weirdness luring us to introspection and subjectivity, even as reality beckons us to exploration and objectivity. The appeal of chimerical mysteries and esoteric knowledge tends to diminish when the need for moral clarity and direction grows acute. And our need for such guidance is acute indeed. We're at war now. We don't have the time for games. The weird ye shall have with you always. But right now, the introspection of weirdness isn't needed. I'm quite happy to leave the aliens in the nineties, and on the milk cartons.
References
26. http://en.wikipedia.org/wiki/John_Edward_Mack
27. http://en.wikipedia.org/wiki/Roswell_incident
http://en.wikipedia.org/wiki/Crop_circles 29. http://en.wikipedia.org/wiki/Kenneth_Arnold 30. http://www.ufoevidence.org/documents/doc883.htm 31. http://www.imdb.com/title/tt0075860 32. http://en.wikipedia.org/wiki/Whitley_Strieber 33. http://en.wikipedia.org/wiki/Contactees 34. http://en.wikipedia.org/wiki/Betty_Hill 35. http://en.wikipedia.org/wiki/Majestic_12 36. http://www.roswellfiles.com/storytellers/RandleSchmitt.htm 37. http://en.wikipedia.org/wiki/Mantell_Incident 38. http://en.wikipedia.org/wiki/Alien_autopsy From checker at panix.com Sun Dec 4 03:21:56 2005 From: checker at panix.com (Premise Checker) Date: Sat, 3 Dec 2005 22:21:56 -0500 (EST) Subject: [Paleopsych] FRB of Richmond: Interview with James M. Buchanan Message-ID: FRB of Richmond: Interview with James M. Buchanan Interview - Federal Reserve Bank of Richmond http://www.richmondfed.org/publications/economic_research/region_focus/spring_2004/interview.cfm [Biography appended.] Region Focus Spring 2004 Interview James Buchanan --- Economists have long treated people in the marketplace as rational actors pursuing their own self-interest. But, until the mid-20th century, it was common to view people in government in a very different light. They were perceived as selfless public servants who acted on behalf of the general interest. Such a distinction, argued James Buchanan, was unnecessary and incorrect. People in the public sector are self-interested just like everybody else. Using this basic assumption, Buchanan and others were able to apply the tools of economics to politics. This line of inquiry soon became known as "public choice" and spread rapidly throughout the United States, Europe, and Asia. The majority of public choice theorists are trained as economists, but more and more come from the ranks of political science. Most of Buchanan's academic career has been spent in Virginia: first at the University of Virginia in Charlottesville, then at the Virginia Polytechnic Institute in Blacksburg, and later at George Mason University in Fairfax. As a result, he and his colleagues are often referred to as members of the "Virginia School." In the early 1960s, Buchanan was one of the founders of the Public Choice Society (PCS). The PCS holds annual meetings where papers are presented and discussed. It is also loosely affiliated with the academic journal Public Choice, which was long edited by Gordon Tullock, one of Buchanan's most frequent collaborators. Buchanan was awarded the Nobel Prize in Economics in 1986. Although he is now in his mid-80s, he still pursues an active research agenda and continues to lecture regularly. Aaron Steelman interviewed Buchanan at George Mason University on February 2, 2004. RF: Public choice is often described as "politics without romance." Could you please describe what this phrase means? Buchanan: I actually used that as the title of a lecture I gave at the Institute for Advanced Studies in Vienna in 1978. I think that if you had to boil public choice down to three words, that's a pretty good description, but on the other hand it's not complete either. The phrase captures the idea that public choice does not look at politics through rose-colored glasses -- it is skeptical that the actions of people in politics are necessarily focused on promoting the public interest. Instead, it takes a more hard-nosed, realistic view of government.
But what it leaves out is that we must have a legitimizing argument that politics is worthwhile -- that politics is an exchange in the sense that we give up something but we also get back something. RF: Public choice is now a recognized subdiscipline within economics. But when you first started doing work in public choice, how was that research greeted by the profession? Buchanan: It was certainly outside the mainstream. I think many of my colleagues at the University of Virginia didn't particularly like using economics to analyze politics. But I have to say that when Gordon Tullock and I published The Calculus of Consent in 1962, the book received quite warm reviews by both economists and political scientists. And, between the two groups, I think the book's impact was greater among political scientists in the following respect: They had further to go. Economists were familiar with the tools we were using and the basic assumptions about rationality that we were making, but to many political scientists, these ideas were rather novel. Also, I think you can't leave personalities out of this either. Bill Riker was very active in introducing public choice and positive political economy to other political scientists and to his students at the University of Rochester. The fact that he came onboard very early was extremely important. RF: People working in the public choice tradition are often referred to as members of the "Virginia School." Could you please explain how and when that term came into being? Buchanan: Mancur Olson came up with that term. He was the one who first characterized us as the Virginia School -- I don't know exactly when but it was probably sometime in the mid-1970s, after we had already moved from Charlottesville to Blacksburg. It was fine by us. So we went with it, as did other people. But we didn't coin the term ourselves. RF: Richard Wagner, who was one of your students at the University of Virginia and has been your colleague at both the Virginia Polytechnic Institute (VPI) and George Mason University, has written that VPI was the most fertile place for public choice scholarship. Do you agree? Buchanan: I think you have to look at this on different dimensions. The public choice program originated at the University of Virginia from 1956 to 1968. Warren Nutter and I set up the Thomas Jefferson Center for Studies in Political Economy. The research program at the Center was broader in scope -- it wasn't confined to public choice per se. That was a very productive and exciting time. We had a great group of people there: Ronald Coase, Leland Yeager, Tullock, and Nutter were all on the faculty. And, without question, we had the best graduate students I have ever worked with -- really top-notch kids. We were never that productive in terms of producing good graduate students at VPI. But the public choice program became more developed there. We enjoyed tremendous support from the university administration, which in some ways had been lacking at Virginia. And Tullock, who had left Virginia a few years before I did, came to VPI. He and I started collaborating on a lot of projects, and we set up the Center for the Study of Public Choice along with Charlie Goetz. One of the things that I think was really important about VPI was the unique atmosphere and geography: We were all located close to each other and had constant interaction. 
Plus, at VPI there was a young man named Winston Bush whose enthusiasm and intellect really inspired a lot of interesting projects, such as our work on the political economy of anarchy. Winston was a great mathematical economist, who unfortunately died quite young in a car accident, but for a few years was a real live wire who really kept things going. We also had a great visiting fellow program. It wasn't unusual for us to have eight or nine visitors at one time. So, in the sense of sheer output, I think Wagner is right: VPI was the most productive place. RF: At last year's meetings of the Public Choice Society in Nashville, I was struck by the large percentage of participants from continental Europe. Did public choice take off internationally during the period you were at VPI? Buchanan: Yes. Many of the visiting fellows who came to Blacksburg were from Europe or Asia. It was also around this time that they set up their own organizations: the European Public Choice Society and the Japanese Public Choice Society. In some ways, the Europeans were more eager to work on constitutional political economy issues than were the Americans. In fact, I think that if the Nobel Prize were decided by American economists, I never would have been awarded it. My work has been much more warmly received in Europe than in the United States. RF: Could you describe how Frank Knight and Knut Wicksell have affected your thinking and career? Buchanan: They were certainly the two most important influences on my work. Knight's influence was more as a role model than as someone whose work I tried to build on, although he certainly made very important contributions of his own. Knight and I had very similar backgrounds: He was a farm boy from central Illinois who spent some time in school in Tennessee and who ultimately rejected the religious milieu in which he had been raised. I really liked his attitude toward the world and his willingness to question anything and anybody. He had a real passion for ideas. Wicksell, on the other hand, was more of an accidental discovery. I was going through the stacks of the old University of Chicago library after I had finished my dissertation and I ran across his dissertation, which had never been translated from the very difficult German. In that book, he was saying things that I felt inside me but I never dared to say. He really reinforced a lot of things that were sort of inchoate in my thinking. The central idea I got from Wicksell is that we can't improve politics by simply expecting politicians to do good. There are no interests other than those of individuals, and politicians will pursue their own interests just like anyone else, by trying to get re-elected, advance their careers, and so on. This means that economists ought to stop acting as if they were advising benevolent despots. If you want to improve government, you must try to improve the rules of the game rather than the individual players. RF: Looking back over the past 40 years, what do you think are some of the most important contributions that public choice theorists have made? Buchanan: I think that the most important contribution, by far, is to simply change the way that people look at politics. I often have been asked if public choice had a causal influence in the decline of confidence in politics and politicians compared to, say, 40 years ago. My answer is: yes and no. 
Once governments began, in the 1960s and 1970s, to overstep their bounds and take on projects that ultimately proved to be great failures -- and this is true not only in the socialist states but also in the democratic states of the West -- public choice came along and gave people a systematic way to analyze and explain these failures. So public choice wasn't the cause of distrust in government but it did help us understand the deficiencies of the political process. It changed the way that we look at collective action. RF: "Rent seeking" is one of the more common terms one encounters in articles written by public choice theorists. Could you give a basic description of what that term means? Buchanan: "Rent seeking" is a very basic concept: If there is value out there, someone is going to invest time and effort trying to attain it. The same is true with "profit seeking" -- if there is profit to be had, people will go after it. But the term rent seeking applies to a special kind of value -- value that is created artificially through the political process. Gordon Tullock has a great ability to take personal experiences and translate them into ideas. He had spent some time in China and while he was there he noticed that the Chinese imperial bureaucracy had these very severe standards that people had to pass in order to be admitted to the civil service. Candidates would spend a tremendous number of hours studying and learning this stuff. But most of the effort was completely wasted, because only a few could obtain a government position. This was a prime example that Gordon used. Likewise, let's say that the government can issue a monopoly on the production of playing cards. Then a lot of people are going to spend time courting the government to get that privilege. It may be rational but it's socially wasteful. The point was so obvious, but also so important, that once it was made it became a standard term used by economists and especially by public choice economists. RF: Public choice scholars, of course, are quite concerned with procedural issues, and have done important work explaining how various constitutional rules affect political and economic outcomes. Yet it seems that public choice theorists have been less successful explaining the conditions necessary to sustain those rules. Consider the United States, for instance. In the area of economic regulation, Congress' authority is virtually plenary. What accounts for the breakdown of the constitutional order in the United States? Buchanan: I think that we have had a breakdown in the traditional role of the judiciary and how the judiciary views itself as part of the larger political structure. We began to get that with the post-New Deal courts, which let the legislative branch do pretty much whatever it saw fit. Why did that happen? I'm not sure. Part of it is ideology. Law schools started to teach students that the Constitution was malleable -- that it said whatever judges claimed it said. The judiciary then became much more activist, as judges began to use their own political views as a basis for making decisions. This process has turned us much more toward a simple majoritarian-type political order. So I think that's part of the reason for the breakdown. But as for a more generalizable explanation, I don't have one. RF: Many commentators frequently decry voter turnout rates of, say, 50 percent as "too low." But, actually, it's surprising that this many people go to the polls because the chance of being instrumental is virtually zero. 
Does public choice have a good explanation for why people vote? Buchanan: That is one of the central puzzles we have faced since Anthony Downs and Gordon Tullock raised the question in the 1950s. From a purely rational standpoint, people don't have much of an incentive to vote but, as you said, about half of them do. Why? I think this gets us into social psychology. People may vote simply as a means of expression rather than as a way of influencing the outcome of an election. They also may feel some sort of duty is involved. But, given the framework within which economists would traditionally look at this sort of question, it's hard to come up with a satisfactory answer. RF: How would public choice explain political outliers -- people who get elected to Congress even though they run on quite radical platforms, either from the right or the left? According to median voter theory, it seems, these people shouldn't be chosen by the electorate. Buchanan: This is another good question to which we don't have an adequate answer. It may just be that these people act very differently in Washington than they do in their own districts. The average voter is not going to pay much attention to what politicians say in front of certain activist groups, but they may pay attention to what these politicians have to say when they come home to campaign. RF: Many people who have done important academic work in the public choice tradition have subsequently gone on to hold high-level appointed offices in the federal government. Is there something ironic about this, in your view? Or is this training useful? Buchanan: I'm not sure that it helps much. If you're on the inside, maybe you don't want to be trained in public choice. For instance, if you are going into the bureaucracy, perhaps you wouldn't want to have read the public choice literature on bureaucracy. I certainly wouldn't get excited about more public choice people filling government positions. Absorbing and doing are quite different things in this context. I think that there is little doubt that public choice has been enriched by people who have used government experience to inform their academic work. But I don't know that public choice has done much to influence the way that government officials actually behave. RF: How would you describe the differences between the allocationist-maximization paradigm, within which many neoclassical economists work, and the catallactic-coordination paradigm, within which most of your research has been done? Buchanan: Economics, as it was transformed by Paul Samuelson into a mathematical discipline, required practitioners to have something to maximize subject to certain constraints. This contrasts with the catallactic-coordination paradigm, which starts out with individuals simply trading with each other. You examine this process and build up into a system of how markets emerge and become integrated. It's a very different conceptualization of the whole economic process. I have argued, at least in the last three or four years, that the really big contributions to come will be from game theory. For a long time, I think economists didn't really understand what game theory was all about. The core insight, it seems to me, is that people choose among strategies and out of that emerge outcomes that are not part of anyone's choice set. It is a different way of looking at economics and it gets us to focus on fundamental issues of economic coordination that have been neglected.
This, I think, is the direction that formal economic theory ought to take. RF: A recent article in PS: Political Science and Politics titled "Are Public Choice Scholars Different?" discussed the results of a survey given to members of the Public Choice Society (PCS), American Economic Association (AEA), and American Political Science Association (APSA). The survey asked for opinions on a wide variety of economic issues. The differences between PCS and AEA members were relatively small on most questions, but in a few cases, they were statistically significant. For instance, PCS members found the following proposition substantially more agreeable: "Government does more to protect and create monopoly power than it does to prevent it." Does this, in your mind, confirm the widely held notion that public choice theorists are more suspicious of government action and more friendly toward market solutions than economists generally? Buchanan: Yes, to some degree. But a continuing critique of public choice is that the whole research program is ideologically driven. I think that is completely wrong. It all goes back to the first question you asked about public choice being described as "politics without romance." If you look at politics in a realistic way, no matter your underlying ideological preferences, you are going to come out more negative than you started. There are many public choice people whose normative views are not at all market-oriented. But, as scientists, they reach conclusions that may not particularly support those normative preferences. RF: What do you think of the various "heterodox" schools of economics that are challenging the basic assumptions of neoclassical economics? Buchanan: For more than 20 years, I have predicted that you would see more collaboration between psychologists and economists. That prediction is finally being realized with the widespread emergence of "behavioral economics," as characterized by the work of Dick Thaler, Bob Frank, and others. They pick out particular anomalies and use them to try to chip away at the neoclassical edifice. Many of those anomalies are interesting, but they are just that -- anomalies and thus not very generalizable. I don't think that behavioral economics is a spent force yet, but I don't know how much further they can go with it, because what they have to offer are critiques rather than an alternative program of inquiry. Still, I'm sympathetic to the idea that economists have pushed this homo-economicus model too much. RF: In a series of articles on what he calls "rational irrationality," Bryan Caplan has tried to reorient public choice to focus more on voter-driven political failure and less on the perverse influence of special interests. What do you think of this line of inquiry? Buchanan: I don't know Caplan's work very well. But I think there is something to what he is trying to argue. For instance, I think there is the following bifurcation in the choice process: We may want to do things collectively that we are not willing to sustain privately. It may be true that the welfare state represents what people actually want. They may want the government to take care of everybody and so they vote for candidates who run on such a platform, including the higher tax rates needed to pay for it. At the same time, given those high levels of taxation, they may decide to quit working, like the Swedes, and spend time at their summer home.
So even though they voted for the whole program -- on both the spending and taxation sides -- they are not willing to support it through their private actions. RF: What, in your view, is the proper role of government? Buchanan: Well, I think the state should fund the classic public goods and you could probably do that with government spending at a level of roughly 15 percent of gross domestic product (GDP). But I'm not willing to say that that is all government should do. As long as government grows within a proper set of rules, then I would rather not put limits on its size. I am reluctant to say, for instance, that having public spending at 40 percent of GDP -- which is about what we have now -- is necessarily wrong. RF: Why do so many voters hold views that are at odds with mainstream economic theory? Buchanan: Part of the blame falls on economists. As scientists, we are incredibly attracted to grappling with interesting puzzles that may have little immediate practical application. And, indeed, we are rewarded for doing that through academic promotions and greater prestige within the profession. So that type of work has a lot of private value to economists. Contrast that with making basic economic truths -- such as the benefits of free trade -- accessible to a wider audience. Economists gain very little from doing that -- for instance, it probably won't get you tenure. But there is an enormous public value associated with having an economically literate society. We need more Bastiats who are willing to talk to the public. As it stands, economists are losing the battle.

Biography - Federal Reserve Bank of Richmond
http://www.richmondfed.org/publications/economic_research/region_focus/spring_2004/interview_biography.cfm

Biography: James Buchanan
Present Position: Distinguished Professor Emeritus of Economics, George Mason University, and Distinguished Professor Emeritus of Economics and Philosophy, Virginia Polytechnic Institute and State University
Previous Faculty Appointments: Virginia Polytechnic Institute and State University (1969-1983); University of California at Los Angeles (1968-1969); University of Virginia (1956-1968); Florida State University (1951-1956); University of Tennessee (1948-1951)
Education: B.S., Middle Tennessee State College (1940); M.A., University of Tennessee (1941); Ph.D., University of Chicago (1948)
Selected Publications: Author or co-author of more than 20 books, including The Calculus of Consent: Logical Foundations of Constitutional Democracy (1962); Cost and Choice: An Inquiry in Economic Theory (1969); The Limits of Liberty: Between Anarchy and Leviathan (1975); and Better than Plowing: And Other Personal Essays (1992)
Awards and Offices: Winner, 1986 Nobel Memorial Prize in Economic Sciences; Fellow, American Academy of Arts and Sciences; Former President of the Southern Economic Association, Western Economic Association, Mont Pelerin Society, and Public Choice Society

From checker at panix.com Sun Dec 4 03:22:03 2005 From: checker at panix.com (Premise Checker) Date: Sat, 3 Dec 2005 22:22:03 -0500 (EST) Subject: [Paleopsych] Science Daily: Mildly Depressed People More Perceptive Than Others Message-ID: Mildly Depressed People More Perceptive Than Others http://www.sciencedaily.com/print.php?url=/releases/2005/11/051121164438.htm Source: Queen's University Date: 2005-11-22 Surprisingly, people with mild depression are actually more tuned
into the feelings of others than those who aren't depressed, a team of Queen's psychologists has discovered. "This was quite unexpected because we tend to think that the opposite is true," says lead researcher Kate Harkness. "For example, people with depression are more likely to have problems in a number of social areas." The researchers were so taken aback by the findings that they decided to replicate the study with another group of participants. The second study produced the same results: People with mild symptoms of depression pay more attention to details of their social environment than those who are not depressed. Their report on what is known as "mental state decoding" - or identifying other people's emotional states from social cues such as eye expressions - is published today in the international journal Cognition and Emotion. Also on the research team from the Queen's Psychology Department are Professors Mark Sabbagh and Jill Jacobson, and students Neeta Chowdrey and Tina Chen. Drs. Roumen Milev and Michela David at Providence Continuing Care Centre, Mental Health Services, collaborated on the study as well. Previous related research by the Queen's investigators has been conducted on people diagnosed with clinical depression. In this case, the clinically depressed participants performed much worse on tests of mental state decoding than people who weren't depressed. To explain the apparent discrepancy between those with mild and clinical depression, the researchers suggest that becoming mildly depressed (dysphoric) can heighten concern about your surroundings. "People with mild levels of depression may initially experience feelings of helplessness, and a desire to regain control of their social world," says Dr. Harkness. "They might be specially motivated to scan their environment in a very detailed way, to find subtle social cues indicating what others are thinking and feeling." The idea that mild depression differs from clinical depression is a controversial one, the psychologist adds. Although it is often viewed as a continuum, she believes that depression may also contain thresholds such as the one identified in this study. "Once you pass the threshold, you're into something very different," she says. Funding for this study comes from a New Opportunities Grant from the Canada Foundation for Innovation. Editor's Note: The original news release can be found [3]here. This story has been adapted from a news release issued by Queen's University. References 3. http://qnc.queensu.ca/story_loader.php?id=4381d1aa783bb From checker at panix.com Mon Dec 5 02:45:09 2005 From: checker at panix.com (Premise Checker) Date: Sun, 4 Dec 2005 21:45:09 -0500 (EST) Subject: [Paleopsych] Review of Business: Transforming a University from a Teaching Organization to a Learning Organization Message-ID: Transforming a University from a Teaching Organization to a Learning Organization Review of Business Volume 26, Number 3 [I'm one of the peer reviewers of this publication.] Fall 2005 (Special Issue: Applications of Computer Information Systems and Decision Sciences) Hershey H. Friedman, Brooklyn College of the City University of New York Linda W. Friedman, Baruch College of the City University of New York Simcha Pollack, The Peter J. Tobin College of Business, St.
John's University

Abstract

Successful 21st-century universities will have to be lean, flexible and nimble. In fact, Peter Drucker claims that 30 years from now the "big universities will be relics" and will not survive. In the corporate world, businesses are becoming learning organizations in order to survive and prosper. This paper explains why it is necessary for universities to become learning organizations and provides ideas on how to make the transformation.

Introduction

Peter Drucker noted in an interview that: "Thirty years from now the big university campuses will be relics. Universities won't survive. It's as large a change as when we first got the printed book" [18]. This may be an exaggeration, but there is no question that universities that refuse to change may not survive. The rise of for-profit universities (e.g., the University of Phoenix), decreased government support for universities, the rising costs of education, the globalization of education, technological change, the growing number of working adults who need continuing education to avoid obsolescence, and distance education are forcing universities to transform themselves. In fact, Andrews et al. [1] urge academia to respond to the "wake-up call" and recognize that inflexibility and the failure to respond quickly and decisively to environmental change can be dangerous. For colleges to change, they not only have to learn to run their organizations in a more business-like fashion, they also have to be willing, when necessary, to add and shrink programs quickly. This is not easy when the organizational structure of today's university has more to do with the convenience of establishing accounting budgets than with the demands of intellectual growth and education [12,14]. Edwards [9] notes that "the actual elimination of departments is extremely rare and usually generates a wave of unflattering national news, so the substitution strategy is driven toward less visible, more surreptitious methods." It is becoming quite apparent that being inflexible and resistant to change in an extremely fast-moving environment is a prescription for disaster, whether we are dealing with a business or an academic institution. Several visionaries believe that the university of the future will be very different from the university of today, with more interdisciplinary programs and substantial modification of the currently prevalent academic organizational structure. Duderstadt [7] suggests that the university of the future will be divisionless, i.e., there will be many more interdisciplinary programs. There will also be "a far more intimate relationship between basic academic disciplines and the professions." He asks "whether the concept of the disciplinary specialist is relevant to a future in which the most interesting and significant problems will require 'big think' rather than 'small think'" [8]. Kolodny [16:40-41] asserts that the antiquated way of organizing colleges -- by departments -- will have to "evolve into collaborative and flexible units." Students with narrowly defined majors will have great difficulty comprehending a world in which the knowledge required of them is complex, interconnected and, by its very nature, draws from many areas. Edwards [9] maintains that "in so many cases, the most provocative and interesting work is done at the intersections where disciplines meet, or by collaborators blending several seemingly disparate disciplines to attack real problems afresh."
The Learning Organization

Clearly, there are great changes ahead for higher education, but changing the culture of an organization is a daunting task. Forward-thinking institutions have to consider what can be done to make their organizations more responsive to change. In the corporate world, many firms are recognizing that the ability of an organization to learn is the key to survival and growth, and "organizational learning" has become the mantra of many companies [3,21]. What is organizational learning? Organizational learning has been defined in many ways: Stata [24] asserts that: "organizational learning occurs through shared insights, knowledge and mental models [and] builds on past knowledge and experience." Senge [21] writes: "learning organizations are not only adaptive, which is to cope, but generative, which is to create." Pedler et al. [20] state: "A learning company is an organization that facilitates the learning of all its members and continually transforms itself." Garvin [11] believes that a learning organization is "an organization skilled at creating, acquiring, and transferring knowledge, and at modifying its behavior to reflect new knowledge and insights." What should we find in a learning organization? The following briefly summarizes what one would expect:

o Awareness of the external environment. Knowing what the competition is doing.
o Belief that individuals can change their environment. A learning culture.
o Shared vision. One that encourages individuals to take risks.
o Learning from past experience and mistakes -- experience is the best teacher.
o Learning from the experiences of others in the organization. Organizational memory in order to know what worked in the past and what did not.
o Willingness to experiment and take chances. Tolerance for failure.
o Double-loop or generative learning. With double-loop, as opposed to single-loop, learning, assumptions are questioned. "Double loop learning asks questions not only about objective facts but also about the reasons and motives behind those facts" [2].
o Concern for people. Respect for employees. Diversity is seen as a plus since it allows for new ideas. Empowerment of employees.
o Infrastructure allowing the free flow of knowledge, ideas and information. Open lines of communication. Sharing of knowledge, not just information. Team learning where colleagues respect and trust each other. An organization where one employee will compensate for another's weaknesses, as in a successful sports team.
o Utilization of shared knowledge. Emphasis on cooperation, not turf.
o Commitment to lifelong learning. Constant learning and growth.
o Ability to adapt to changing conditions. Ability to renew, regenerate and revitalize an organization.

Knowledge sharing is a necessary condition for having a learning organization. To foster the sharing of knowledge, computer software has been developed to make it easy for coworkers to share their expertise. For instance, the AskMe Corporation (http://www.askmecorp.com/) claims that it is "the leading provider of software solutions that enable global 2000 companies to create and manage Employee Knowledge Networks (EKNs)." AskMe notes on its website that creating EKNs helps ensure that employees do not have to solve problems that others have already solved, i.e., "reinventing the wheel." It also enables employees in a firm to quickly find the individual with the appropriate expertise to solve a problem.
One thing the AskMe company discovered is that knowledge sharing is difficult in pyramid-shaped organizations with tall organizational structures, i.e., those characterized by numerous layers of management. Knowledge sharing works much better where there is a flat organizational structure with a relatively short chain of command. However, managers have to be willing to accept suggestions, ideas and answers from their employees. When information flows in all directions -- even from the bottom of the organizational pyramid to the top -- some managers might feel that they are losing some of the status and authority of their position. After all, it is conceivable that someone in the mailroom might be able to answer a question that stumps top management. Knowledge can be found anywhere and everywhere. The power of knowledge sharing should not be underestimated. Linux, the extremely successful computer operating system, was developed by the collaboration of programmers all over the globe.

Are Universities Learning Organizations?

Before discussing universities, it might be instructive to examine whether schools -- especially primary and secondary ones -- are learning organizations. The evidence, albeit limited, indicates that they are not. Shields and Newton [22] examined schools that participated in the Saskatchewan School Improvement Program and found that they were not learning organizations. Isaacson and Bamburg [15] also came to the same conclusion. Schools rarely have visions, teachers rarely share knowledge with colleagues, and schools are managed with a top-down approach. Many others agree that schools have not functioned as learning organizations [5,10]. When Senge was asked by O'Neil [19] whether or not schools were learning organizations, he replied: "definitely not." Universities are not run like high schools or elementary schools and stress research/learning as much as (or more than) teaching. Despite this, it seems that very few universities would qualify as learning organizations. It is quite ironic that teaching organizations do not know how to learn. Most universities have little knowledge sharing, and Smith [23] asserts that: "Academic departments serve as organizations that exhibit all the segmentary politics described by anthropologists: segmentation for largely demographic reasons, balanced opposition among themselves, and unitary resistance to a superordinate entity, usually the college or university as a whole." Harrington [13] believes that departments encourage loyalty to the discipline rather than to the university. Apparently, most universities are not learning organizations.

Transforming the University into a Learning Organization

The following are some suggestions that can be used to help transform the university into a learning organization. 1. Establish a message board to function as a research matchmaking service. As noted above, the most exciting research is often at the interface of two disciplines. Furthermore, researchers with expertise in one area (e.g., biology) might need to collaborate with a faculty member with expertise in another area (e.g., computer simulation or geology) in order to write a paper. Universities could provide a central message board where faculty members could state the area(s) in which they are doing research and the kind of co-author, if any, they seek. This site could also be used to find ideas for research. Senior faculty members might be willing to provide ideas for research in return for a byline on any resulting article.
If successful, this service can be extended to include faculty in other colleges. Universities have to understand that discouraging professors from writing co-authored papers is counterproductive. It is certainly not consistent with a key philosophy of a learning organization: sharing knowledge. Moreover, working with scholars from other disciplines creates a synergy that can result in truly innovative research. It is not uncommon in academe to find professors who continue to write essentially the same paper over and over with very little new information. There is nothing wrong with collaboration if it produces exciting research. One wonders whether James Watson and Francis Crick would have been as successful if they had worked alone. The Human Genome Project took 13 years and involved researchers from at least 18 countries. 2. Establish an online archive where faculty can post papers for review by colleagues before submitting them to journals. If the faculty at a university work together as a team and want their institution to flourish, they are more likely to provide helpful criticism. The late OpenTextProject (www.opentextproject.org) was an international site that allowed individuals to post their papers for pre-submission review. 3. There could be a Web site for every course, especially multiple-section courses taught by a number of different faculty. Faculty could submit their best ideas on how to teach the course and their best lectures. This site would then be a resource for students who have difficulties with the course and would also be a resource for faculty teaching the course. Most professors teaching a course have gotten useful ideas from other faculty teaching the same course. For instance, suppose we have a site for elementary statistics. This might be a course taught by 10 different instructors. Faculty teaching the course would be encouraged to post material dealing with statistics. This might take the form of syllabi, lectures, interesting examples, humorous ways to illustrate difficult concepts, computer programs to solve statistics problems, solved exercises, etc. One of the authors has a site for his corporate finance class and has heard that students taking the course with other instructors go to the site since it contains dozens of problems with solutions in the area of mathematics of finance. Professor N. Duru Ahanotu created a Web site (http://www.drduru.com/knowledge.html) for anyone interested in learning organizations and knowledge management. The corporate world is learning the value of the Web for e-training. The type of Web site described above can be especially useful to faculty teaching a course for the first time. Rather than learning the best way to teach a course through trial and error, they can go to the Web site for a particular course and see how colleagues have been teaching it. Many professors do indeed go to the Web to examine syllabi and course material from the same courses taught at schools all over the country. The problem is that the caliber of student may not be exactly the same. While it is still a good idea to see how a particular course is taught at other colleges, it will often be more useful to examine the materials used by colleagues in the same school. Interestingly, Dill [6] notes that a major weakness of universities has been in the "internal transfer of new knowledge." 
This is why it is not uncommon to find that "within most universities there are better performing units that have knowledge about improving teaching and learning from which other units could learn." 4. Knowledge sharing should not be limited to a university. Knowledge should be shared with the public. A Web site could be created providing helpful information for the general public. For instance, this site could have links to subjects such as small business management, marketing, personal finance, ESL, etc. Outsiders could learn these subjects online for free. Brew and Boud [4] assert that work-based learning is important for students. This means that there has to be a partnership between educators and workplace supervisors, especially with professional education. This, of course, requires knowledge sharing between academics and practitioners. 5. University administrators have to realize that the pyramid-shaped organizational structure makes little sense for an academic institution. Information should not only flow from the top to the bottom, i.e., president to provost to dean to chairs to faculty. The biggest impediments to the creation of learning organizations are the twin fears of change and of things that are new. Senior faculty often resist change. Indeed, Kuhn [17] found a similar phenomenon in the sciences. Kuhn described "normal science" as a state in which scientists who adhere to the old dominant paradigm resist the adoption of a new paradigm. Kuhn [17:52] notes that "normal science does not aim at novelties of fact or theory and, when successful, finds none." Some of the best ideas might originate from junior faculty who often have a new perspective. Universities that want to be innovative have to allow information to flow from the bottom to the top, otherwise they will stagnate. Knowledge-sharing software could be used by administrators to get fresh ideas from the entire faculty. 6. Students have to be part of the knowledge sharing for a university to become a true learning organization. Many faculty members resist providing students with e-mail addresses, and brick-and-mortar office hours of three hours per week are ludicrous in the age of asynchronous communication. How many faculty members today would deal with a bank that was only open from 9 a.m. to 3 p.m., had no ATMs and no online banking? Information about majors can be automated. There could be a Web site where students can find out about any major, including requirements for the major and opportunities in the field. Sites consisting of FAQs (frequently asked questions) could be provided for students. Expert systems could be used to advise students as to whether they have the necessary prerequisites for a course. When you purchase a book at Amazon.com, the next time you come back you are greeted by name and other books are recommended to you based on your purchase history. Students could also automatically receive recommendations for courses based on their major and their registration history. 7. As noted above, many futurists believe that interdisciplinary majors will be vital to the future of universities. Many of the newer programs being developed at colleges all over the country are interdisciplinary. It is often very difficult to get academic departments to create interdisciplinary majors when each department is interested in protecting its own turf. Learning organizations stress cooperation, not protection of turf, and this might require a new organizational structure not based on departments.
Alternatively, department chairs could report to a "super" chair or dean with the responsibility for an entire school. The job of the "super" chair or dean would be to ensure that departments work together to create interdisciplinary programs and focus on what is best for the university as a whole, not just their own department. A discussion group in which faculty members could provide ideas for new programs could be established. Administrators could reward faculty and departments that create successful programs. 8. A learning organization cannot last long if members of the organization have no interest in learning. Unfortunately, a significant number of faculty (one number often quoted is 60%) never publish an article after they receive tenure and become associate professors. Incentives must be put in place to ensure that faculty continue to learn even after being promoted to full professor. Lifelong learning is now necessary in many professions, including medicine and law. It should also be encouraged in academe.

Conclusion

Establishing a paradigm of knowledge sharing and continuous growth through lifelong learning is not easy even, or perhaps especially, in academe. Interestingly, in these very turbulent times, many academicians are complacent and feel that there is no compelling need to make any serious changes. This is definitely a myopic way of thinking. Transforming colleges into learning organizations will not solve all problems, but it is certainly an important first step.

References

1. Andrews, R. L., M. Flanigan, and D. S. Woundy. "Are Business Schools Sleeping Through a Wake-Up Call?" Decision Sciences Institute 2000 Proceedings, 1, 2000, 194-196.
2. Argyris, C. "Good Communication that Blocks Learning." Harvard Business Review, Vol. 72, No. 4, 1994, 77-85.
3. Argyris, C. and D. Schoen. Organizational Learning II: Theory, Method, and Practice. Reading, MA: Addison-Wesley, 1996.
4. Brew, A. and D. Boud. "Preparing for New Academic Roles: A Holistic Approach to Development." The International Journal for Academic Development, Vol. 1, No. 2, 17-25.
5. Conzemius, A. and W. Conzemius. "Transforming Schools into Learning Organizations." Adult Learning, Vol. 7, No. 4, 1996, 23-25.
6. Dill, D. D. "Academic Accountability and University Adaptation: The Architecture of the Academic Learning Organization." Higher Education, 38, 1999, 127-154.
7. Duderstadt, J. J. "A Choice of Transformations for the 21st-Century University." The Chronicle of Higher Education, 46, Feb. 4, 2000, B6-B7.
8. Duderstadt, J. J. "The Future of the University in an Age of Knowledge." The Journal of Asynchronous Learning Networks, 1, 1997, 78-88.
9. Edwards, R. "The Academic Department: How Does it Fit into the University Reform Agenda?" Change, 31, 1999, 17-27.
10. Fullan, M. "The School as Learning Organization: Distant Dreams." Theory into Practice, Vol. 34, No. 4, 1995, 230-235.
11. Garvin, D. A. "Building a Learning Organization." Harvard Business Review, Vol. 71, No. 4, 1993, 78-91.
12. Gazzaniga, M. "How to Change the University." Science, 1998, 237.
13. Harrington, F. H. "Shortcomings of Conventional Departments." In D. E. McHenry (Ed.), Academic Departments: Problems, Variations, and Alternatives. San Francisco: Jossey-Bass, 1977, 53-62.
14. Hollander, S. "Second Class Subjects? Interdisciplinary Studies at Princeton." The Daily Princetonian, April 24, 2000, 3.
15. Isaacson, N. and J. Bamburg. "Can Schools Become Learning Organizations?" Educational Leadership, Vol. 50, No. 3, 1992, 42-44.
16. Kolodny, A.
Failing the Future: A Dean Looks at Higher Education in the Twenty-First Century. Durham, NC: Duke University Press, 1998.
17. Kuhn, T. The Structure of Scientific Revolutions, 2nd ed. Chicago: University of Chicago Press, 1970.
18. Lenzner, R. and S. S. Johnson. "Seeing Things as They Really Are." Forbes, March 10, 1997, 122-131.
19. O'Neil, J. "On Schools as Learning Organizations." Educational Leadership, Vol. 52, No. 7, April 1995, 20-23.
20. Pedler, M., J. Burgoyne and T. Boydell. The Learning Company: A Strategy for Sustainable Development. New York: McGraw-Hill, 1991.
21. Senge, P. M. The Fifth Discipline. New York: Doubleday, 1990.
22. Shields, C. and E. E. Newton. "Empowered Leadership: Realizing the Good News." Journal of School Leadership, Vol. 4, No. 2, 1994, 171-196.
23. Smith, J. Z. "To Double Business Bound." In C. G. Schneider and W. S. Green (Eds.), Strengthening the College Major. San Francisco, CA: Jossey-Bass Inc. Publishers, 1993, 13-23.
24. Stata, R. "Organizational Learning -- The Key to Management Innovation." Sloan Management Review, Spring 1989, 63-74.

From checker at panix.com Mon Dec 5 02:45:15 2005 From: checker at panix.com (Premise Checker) Date: Sun, 4 Dec 2005 21:45:15 -0500 (EST) Subject: [Paleopsych] J. Philippe Rushton: Ethnic nationalism, evolutionary psychology and Genetic Similarity Theory Message-ID: J. Philippe Rushton: Ethnic nationalism, evolutionary psychology and Genetic Similarity Theory Nations and Nationalism 11 (4), 2005, 489-507. © ASEN 2005 Department of Psychology, University of Western Ontario, London, Ontario, Canada This article builds on a paper prepared for the American Psychological Association (APA) and the Canadian Psychological Association (CPA) joint 'Initiative on Ethno-Political Warfare' (APA/CPA, 1997). I thank Aurelio J. Figueredo, Henry Harpending, Frank Salter and Pierre L. van den Berghe for comments on an earlier draft.

ABSTRACT. Genetic Similarity Theory extends Anthony D. Smith's theory of ethno-symbolism by anchoring ethnic nepotism in the evolutionary psychology of altruism. Altruism toward kin and similar others evolved in order to help replicate shared genes. Since ethnic groups are repositories of shared genes, xenophobia is the 'dark side' of human altruism. A review of the literature demonstrates the pull of genetic similarity in dyads such as marriage partners and friendships, and even large groups, both national and international. The evidence that genes incline people to prefer others who are genetically similar to themselves comes from studies of social assortment, differential heritabilities, the comparison of identical and fraternal twins, blood tests, and family bereavements. DNA sequencing studies confirm some origin myths and disconfirm others; they also show that in comparison to the total genetic variance around the world, random co-ethnics are related to each other on the order of first cousins.

Introduction

Most theories of ethno-political conflict and nationalism focus on cultural, cognitive and economic factors, often with the assumption that modernisation will gradually reduce the effect of local antagonisms and promote the growth of more universalistic societies (Smith 1998).
However, purely socio-economic explanations seem inadequate to account for the rapid rise of nationalism in the former Soviet Bloc and too weak to explain the lethality of the conflicts between Tutsis and Hutus in Rwanda, Hindus, Muslims and Sikhs in the Indian subcontinent, and Croats, Serbs, Bosnians and Albanians in the former Yugoslavia, or even the level of animosity between Blacks, Whites and Hispanics in the US. Typically, analysts have also failed to consider the ethno-political repercussions of the unprecedented movement of peoples taking place in the world today (van den Berghe 2002). One of the hallmarks of true science is what Edward O. Wilson (1998) termed the unity of knowledge through the principle of consilience, in which the explanations of phenomena at one level are grounded in those at a lower level. Two prominent examples are the understanding of genetics in terms of biochemistry once the structure of the DNA molecule was worked out and, in turn, of chemistry in terms of atomic physics. Anthony D. Smith's theory of ethno-symbolism unifies knowledge in the consilient manner through its integration of history and psychology, thereby solving the problem that nationalism poses for purely socio-economic theories - the phenomena of mass devotion and the belief that one's own group is favourably unique, even 'chosen' (e.g. Smith 2000 and 2004; Guibernau and Hutchinson 2004; Hutchinson 2000). With its emphasis on a group's preexisting kinship, religious and belief systems fashioned into a sense of common identity and shared culture, however mythologised, Smith's theory explains what purely socio-economic theories do not: why the 'glorious dead' fought and died for their country. It is more robust than other theories because its research analyses show that myths, memories and especially symbols foment and maintain a sense of common identity among the people unified in a nation. The ethno-symbolic perspective further unifies knowledge by highlighting interactions between ethnicity and nationhood. For example, Hutchinson (2000) described the episodic element in the history of countries, as when national pride is augmented by events such as sudden new archaeological discoveries. By studying the ethnic character of modern nations over the long term, it is possible to identify recurring causes of national revivals, the role of cultural differences within nations, and the salience of national identities with respect to other allegiances. The current article presents 'Genetic Similarity Theory' to explain ethnic nepotism and people's need to identify and be with their 'own kind' (Rushton et al. 1984 and 1986; Rushton 1989a, 1995, 2004; Rushton and Bons 2005). Nationalists often claim that their nation has organic continuity and 'ties of blood' that make them 'special' and different from outsiders, a view not fully explained by ethno-symbolism. Genetic Similarity Theory extends Smith's theory and the unity of knowledge by providing the next link, the necessary biological mooring. Patriotism is almost always seen as a virtue and an extension of family loyalty and is typically preached using kinship terms. Countries are called the 'motherland' or the 'fatherland'. Ethnic identity builds on real as well as putative similarity.
At the core of human nature, people are genetically motivated to prefer others genetically similar to themselves. I will support this contention with current findings from evolutionary psychology and population genetics.

The evolutionary background

Starting with Charles Darwin's The Origin of Species (1859) and The Descent of Man (1871), evolutionary explanations of the moral sentiments have been offered for both humans and other animals. Nineteenth-century evolutionists such as Herbert Spencer and William Graham Sumner built on the concepts of in-group amity and out-group enmity, group competition and group replacement. Tribes, ethnic groups, even nations were seen as extended families (see van der Dennen 1987, for a review). However, evolutionary explanations went out of favour during the 1920s and 1930s with the rise of fascism in Europe, largely because they were seen as providing a justification for racially based politics (Degler 1991). During the 1960s and 1970s, most biologists eschewed theories of group competition in favour of the mathematically 'cleaner' theories of individual adaptation, since the genetic mechanisms necessary for ethnocentrism to evolve remained quantitatively problematic. After several decades of neglect, evolutionary psychology has now regained scientific respectability (e.g. Badcock 2000; Buss 2003; Pinker 2002; Wilson 1998). In The Descent of Man (1871: 489-90), Darwin proposed the radical and far-reaching hypothesis that human morality rested on the same evolutionary basis as did the behaviour of other animals - reproductive success - described as the 'general good': The term, general good, may be defined as the rearing of the greatest number of individuals in full vigour and health, with all their faculties perfect, under the conditions to which they are subjected. As the social instincts both of man and the lower animals have no doubt been developed by nearly the same steps, it would be advisable, if found practicable, to use the same definition in both cases, and to take as the standard of morality, the general good or welfare of the community, rather than the general happiness; but this definition would perhaps require some limitation on account of political ethics. Historian Carl Degler (1991) observed that Darwin's equating of human and animal morality with the reproductive success of the community had the effect of biologising ethics. Suddenly, far-flung notions of economics, demographics, politics and philosophy, some of which had been centuries in the making, now revolved around a Darwinian centre, capturing the nineteenth century imagination and inspiring new analyses of the way society worked. The philosophy termed 'Social Darwinism', with its emphasis on the reproductive success of groups as well as of individuals, was taken up at every point along the political spectrum - from laissez-faire capitalism to communist collectivism to National Socialism (again see van der Dennen 1987, for a review). It was crucial for Darwin to emphasise the moral continuity between humans and other animals because the opponents of human evolution had argued for their discontinuity in both the moral and the intellectual spheres. Darwin departed from utilitarian philosophers such as John Stuart Mill and Jeremy Bentham, who believed that human morality was based on making informed choices about the greatest happiness for the greatest number. As Darwin pointedly observed, that basis was rational rather than instinctive.
Since human beings alone were said to be capable of following such a rational standard, Darwin took exception to it. In The Descent, Darwin provided numerous examples of how animal morality led to reproductive success. All animals fight by nature in some circumstances but are altruistic in others. Acts of altruism include parental care, mutual defence, rescue behaviour, co-operative hunting, food sharing and self-sacrificial altruism. Darwin described how leaders of monkey troops act as sentinels and utter cries of danger or safety to their fellows; how male chimpanzees might rush to the aid of infants that cried out under attack, even though the infants were not their own. Animal altruism - even to the point of self-sacrifice - has been massively confirmed since Darwin wrote The Descent (see E. O. Wilson 1975, for extended discussion). Altruism involves self-sacrifice. Sometimes the altruist dies. For example, when bees defend their hive and sting intruders, the entire stinger is torn from the bee's body. Stinging an intruder is an act of altruistic self-sacrifice. In ants, if nest walls are broken open, soldiers pour out to combat foragers from other nests; at the same time, worker ants repair the broken walls, leaving the soldiers outside to die in the process. Human warfare appears to be rooted in the evolved behaviour of our nearest primate relatives. Male chimpanzees patrol their territories in groups to keep the peace within the group and to repel invaders. Such patrols, of up to twenty bonded males at a time, raid rival groups, kidnap females and annex territory, sometimes fighting pitched battles in the process (Wrangham and Peterson 1996).

Solving the paradox of altruism

In The Origin, Darwin (1859) saw that altruism posed a major enigma for his theory of evolution. How could altruism evolve through 'survival of the fittest' if altruism means self-sacrifice? If the most altruistic members of a group sacrifice themselves for others, they will have fewer offspring to pass on the genes that made them altruistic. Altruism should not evolve, but selfishness should. Darwin was unable to resolve the paradox of altruism to his satisfaction because to do so required greater knowledge of how heredity worked than he had available (the word 'genetics' was not coined until 1905). Nonetheless, in The Descent, Darwin (1871) intuited the solution when he wrote, 'sympathy is directed solely towards members of the same community, and therefore towards known, and more or less loved members, but not all the individuals of the same species' (Vol. 1: 163). In 1964, evolutionary biologist William Hamilton finally provided a generally accepted solution to the problem of altruism based on the concept of inclusive fitness, not just individual fitness. It is the genes that survive and are passed on. Some of the individual's most distinctive genes will be found in siblings, nephews, cousins and grandchildren as well as in offspring. Siblings share fifty per cent, nephews and nieces twenty-five per cent, and cousins about twelve and a half per cent of their distinctive genes. So when an altruist sacrifices its life for its kin, it ensures the survival of these common genes. The vehicle has been sacrificed to preserve copies of its precious cargo. From an evolutionary point of view, an individual organism is only a vehicle, part of an elaborate device, which ensures the survival and reproduction of genes with the least possible biochemical alteration.
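To make the arithmetic concrete, the fractions just quoted follow from halving expected gene sharing at each genealogical link and summing over independent paths through common ancestors. The short Python sketch below is illustrative only and is not from the article; the function name and the path counts are mine.

    # Sketch: Wright's coefficient of relatedness, as used in kin selection.
    # Each genealogical link halves expected gene sharing; independent paths
    # through two common ancestors (e.g. both parents) are summed.

    def relatedness(links_per_path, n_paths=1):
        return n_paths * 0.5 ** links_per_path

    print(relatedness(1))     # parent to offspring: 0.5
    print(relatedness(2, 2))  # full siblings (paths via mother and father): 0.5
    print(relatedness(3, 2))  # aunt or uncle to nephew/niece: 0.25
    print(relatedness(4, 2))  # first cousins: 0.125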
'Hamilton's Rule' states that across all species, altruism (or, conversely, reduced aggression) is favoured when rb - c > 0, where r is the genetic relatedness between two individuals, b is the (genetic) fitness benefit to the beneficiary, and c is the fitness cost to the altruist. Evolutionary biologists have used Hamilton's 'gene's eye' point of view to carry out research on a wide range of social interactions including altruism, aggression, selfishness and spite. The formulation was dubbed 'kin selection theory' by John Maynard Smith (1964) and became widely known through influential books such as The Selfish Gene by Richard Dawkins (1976) and Sociobiology: the New Synthesis by Edward O. Wilson (1975). In 1971, Hamilton extended his formulation and hypothesised that altruism would result from any degree of genetic relatedness, not just that based on immediate kin. Hamilton equated his genetic relatedness variable r to Sewall Wright's FST measure of within-group variance (typically r ≈ 2FST), and cited an experimental study of semi-isolated groups of mice where even random mating produced an FST of 0.18. Hamilton concluded that the within-group mice should therefore favour each other over those in the out-group, treating 'the average individual encountered as a relative closer than a grandchild (or half-sib) but more distant than an offspring (or full-sib)'. In order to favour near kin over distant kin and distant kin over non-relatives, the organism must be able to detect degrees of genetic similarity in others. Hamilton (1964 and 1971) proposed several mechanisms by which detection could occur: (1) location or proximity to self, as in the rule 'if it's in the nest, it's yours'; (2) familiarity, which is learning through social interaction; (3) similarity-to-self through imprinting on self, parents or nest mates, as in the rule 'look for physical features that are similar to self' - dubbed the 'armpit effect' by Dawkins (1976); and (4) 'recognition alleles' or innate feature detectors that allow detection of genetic similarity in strangers in the absence of any mechanism of learning - dubbed the 'green beard effect' by Dawkins (1976). In the latter, a gene produces two effects: (a) creating a unique trait such as a green beard, and (b) preferring others who also have that trait. Hamilton and Dawkins both favoured an imprinting mechanism, which Hamilton (1971) suggested would be most effective if it occurred on the more heritable traits because these best indicate the underlying genotype. There is dramatic evidence that many animal species do detect and then act on genetic similarity (Fletcher and Michener 1987; Hauber and Sherman 2001). In a classic study of bees, Greenberg (1979) bred bees representing fourteen degrees of genetic closeness to a guard bee, which blocks the nest against intruders. Only the more genetically similar intruders got through. A classic study of frog tadpoles separated before hatching and reared in isolation found that the tadpoles moved to the end of the tank where their siblings had been placed, even though they had never encountered them previously, rather than to the end of the tank with non-siblings (Blaustein and O'Hara 1981). Squirrels produce litters that contain both full-siblings and half-siblings. Even though they have the same mother, share the same womb, and inhabit the same nest, full-siblings fight less often than do half-siblings. Full-siblings also come to each other's aid more often (Hauber and Sherman 2001).
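The quantitative core of the preceding paragraphs - Hamilton's Rule and the r ≈ 2FST conversion - can be restated in a few lines of Python. The sketch below uses the mouse FST of 0.18 quoted above; the benefit and cost figures in the last line are hypothetical, chosen only to show the inequality in use.

    # Hamilton's Rule: altruism toward another is favoured when r*b - c > 0.
    def altruism_favoured(r, b, c):
        return r * b - c > 0

    # Hamilton's mouse example: random mating within semi-isolated groups
    # still produced FST = 0.18, so r toward an average group mate is about
    # 2 * 0.18 = 0.36 - between a half-sib (0.25) and an offspring (0.5).
    r_groupmate = 2 * 0.18
    print(r_groupmate)                               # 0.36
    print(altruism_favoured(r_groupmate, 3.0, 1.0))  # True: 0.36*3 - 1 > 0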
Similarity detection is also required for assortative mating, which occurs in insects, birds, mammals and even plants. Optimal outbreeding in some plants is promoted by acceptance of pollen from source plants that are neither too similar nor too dissimilar molecularly from the host plant's own pollen (see Hauber and Sherman 2001, for review). Even in species that disperse, the offspring typically show strong aversion to mating with close relatives. One study of wild baboons showed that paternal kin recognition occurs as frequently as maternal kin recognition, even though identifying paternal kin is much more difficult in species where the mother mates with more than one male (Alberts 1999). Although in 1975 Hamilton extrapolated his ideas to human warfare, his formulations have only seldom been taken beyond immediate kin. In The Selfish Gene, Dawkins (1976) argued that the mathematics of kin selection soon made coefficients of relatedness, even between kin, vanishingly small. One example he offered was that Queen Elizabeth II, while a direct descendant of William the Conqueror (1066), is unlikely to share a single one of her ancestor's genes. In a 1981 editorial for Nature, Dawkins used similar arguments to rebut claims made by Britain's far-right National Front that kin selection theory provided a genetic justification for ethnocentrism. Perhaps feeling a moral obligation to condemn racism, some evolutionists minimised the theoretical possibility of a biological underpinning to ethnic or national favouritism. Hamilton himself (1987: 426) pithily commented, 'in civilized cultures, nepotism has become an embarrassment'. These qualifications turn out to have been overstated. Through assortative mating and other cultural practices, the selfish gene's capacity to replicate itself in combination with those clusters of other genes with which it works well may be extended for hundreds of generations, not three. Elizabeth II is considerably more genetically similar to William the Conqueror than she is to an average person alive today.

Genetic Similarity Theory

In 1984, the current author, along with Robin Russell and Pamela Wells, began to apply the Hamiltonian perspective to human dyads, small groups and even larger national and international entities (Rushton et al. 1984; Rushton 1986, 1989a, 2004; Rushton and Bons 2005). We dubbed our approach 'Genetic Similarity Theory' and reasoned that if genes produced effects that allowed bearers to recognise and favour each other, then altruistic behaviour could evolve well beyond 'kin selection'. By matching across the entire genome, people can maximise their inclusive fitness by marrying others similar to themselves, by liking, befriending and helping the most similar of their neighbours, and by engaging in ethnic nepotism. As the English language makes clear, 'likeness goes with liking'.

Social-assortment studies

Of all the decisions people make that affect their environment, choosing friends and spouses are among the most important. Genetic Similarity Theory was first applied to assortative mating, which kin-selection theory sensu stricto does not readily explain, since individuals seldom mate with 'kin'. Yet the evidence for assortative mating is pervasive in other animals as well as in humans.
For humans, both spouses and best friends are most similar on socio-demographic variables such as age, ethnicity and educational level (r = 0.60), next most on opinions and attitudes (r = 0.50), then on cognitive ability (r = 0.40), and least, but still significantly, on personality (r = 0.20) and physical traits (r = 0.20). Even marrying across ethnic lines 'proves the rule'. In Hawaii, men and women who married cross-ethnically were more similar in personality than those marrying within their group, suggesting that couples 'make up' for ethnic dissimilarity by choosing spouses more similar to themselves in other respects (Ahern et al. 1981). Evolution has also set an upper limit on 'like marrying like' - incest avoidance (van den Berghe 1983). Too close genetic similarity between mates increases the probability of 'double doses' of harmful recessive genes. The ideal mate is one who is genetically similar but not a close relative. Several studies have shown that people prefer genetic similarity in social partners, and assort on the more heritable components of traits, rather than on the most intuitively obvious ones, just as Hamilton (1971) predicted they would if genetic mechanisms were involved. This occurs because more heritable components better reflect the underlying genotype. These studies have used homogeneous sets of anthropometric, cognitive, personality and attitudinal traits measured within the same ethnic group. Examples of varying heritabilities are: for physical attributes, eighty per cent for middle-finger length vs. fifty per cent for upper-arm circumference; for intelligence, eighty per cent for the general factor vs. less than fifty per cent for specific abilities; for personality items, seventy-six per cent for 'enjoying meeting people' vs. twenty per cent for 'enjoying being unattached'; and for social attitudes, fifty-one per cent for agreement with the 'death penalty' vs. twenty-five per cent for agreement with 'Bible truth'. In a study of married couples, Russell et al. (1985) found that across thirty-six physical traits, spousal similarity was greater on attributes with higher heritability such as wrist circumference (seventy-one per cent heritable) than it was on attributes with lower heritability such as neck circumference (forty-eight per cent heritable). On fifty-four indices of personality and leisure time pursuits, Rushton and Russell (1985) found that spousal similarity was greater on items such as 'enjoying reading' (forty-one per cent heritable) than on items such as 'having many hobbies' (twenty per cent heritable). On twenty-six cognitive ability tests, Rushton and Nicholson (1988) found that spousal resemblance was greater on the more heritable subtests from the Hawaii Family Study of Cognition and the Wechsler Adult Intelligence Scale (WAIS). When spouses assort on more heritable items, they report greater marital satisfaction (Russell and Wells 1991). In a study of best friends, Rushton (1989b) found that across a wide range of anthropometric and social attitude measures, such as agreement with 'military drill' (forty per cent heritable) and with 'church authority' (twenty-five per cent heritable), the similarity of the friends was more pronounced on the more heritable measures.
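The prediction being tested in these studies reduces to a correlation, computed across traits, between a trait's heritability and how similar partners are on it. A minimal sketch follows; the heritabilities are the ones quoted above, but the similarity column is hypothetical, since the article reports the direction of the effect rather than raw trait-by-trait figures.

    # Sketch: Genetic Similarity Theory predicts a positive correlation,
    # across traits, between heritability and partner similarity; culturalist
    # theory predicts none. Similarity values here are HYPOTHETICAL.

    heritability = [0.71, 0.48, 0.41, 0.20]  # wrist, neck, 'enjoys reading', 'many hobbies'
    similarity = [0.35, 0.22, 0.28, 0.12]    # hypothetical spousal correlations

    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    print(round(pearson(heritability, similarity), 2))  # positive, as predicted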
These results were extended to liking in acquaintances by Tesser (1993), who manipulated people's beliefs about how similar they were to others on attitudes pre-selected as being either high or low in heritability. Tesser found that people liked others more when their similarity had been chosen (by him) on the more heritable items. The above results cannot be explained by culturalist theories. Genetic Similarity Theory and culturalist theory make opposite predictions about social assortment. Cultural theory predicts that phenotype matching by spouses will be greater on those traits that spouses have become more similar on through the shared experiences that shape attitudes, leisure time activities and waist and bicep size (e.g. through diet and exercise). Genetic Similarity Theory, on the other hand, predicts greater matching on the more heritable traits (e.g. wrist size and middle finger length, not easily changed).

Twin and adoption studies

Several twin and adoption studies show that the preference for genetic similarity is heritable, that is, people are genetically inclined to prefer similar partners. In one of these studies, Rowe and Osgood (1984) analysed data on delinquency from several hundred adolescent monozygotic (MZ) twin pairs, who share one hundred per cent of their genes, and dizygotic (DZ) twin pairs, who share fifty per cent of their genes. They found that adolescents genetically inclined to delinquency were also genetically inclined to seek out similar others as friends. Dovetailing with these results, Daniels and Plomin (1985) examined friendships in several hundred pairs of siblings from both adoptive and non-adoptive homes, and found that whereas biological siblings (who share genes as well as environments) had friends who resembled each other, adoptive siblings (who share only their environment) had friends who were not at all similar to each other. These results show that shared genes lead to similar friends. Rushton and Bons (2005) analysed a 130-item questionnaire on personality and social attitudes gathered from several hundred pairs of identical twins, fraternal twins, their spouses and their best friends. They found that: (a) spouses and best friends are about as similar as siblings, a level of similarity not previously recognised; and (b) identical twins choose spouses and best friends who are more similar to their co-twin than do non-identical twins. The preference for similarity is about thirty per cent heritable. Moreover, once again, matching for similarity was greater on the more heritable items, showing that social assortment is based on the underlying genotype. Similarity was greater on items such as preferring 'business to science' (sixty per cent heritable) than on liking to 'travel the world alone' (twenty-four per cent heritable).

Blood group studies

Yet another way of testing the hypothesis that humans typically choose mates and friends who are genetically similar is to examine blood antigens. In one study, Rushton (1988) analysed seven polymorphic marker systems at ten blood loci across six chromosomes (ABO, Rhesus [Rh], MNSs, Kell, Duffy [Fy], Kidd [Jk] and HLA) in a study of 1,000 cases of disputed paternity, limited to people of North European appearance (judged by photographs). Couples who produced a child together were fifty-two per cent similar, but those who did not were only forty-three per cent similar. Subsequently, Rushton (1989b) used these blood tests with pairs of male best friends of similar background and found the friends were significantly more similar to each other than they were to randomly matched pairs from the same database.
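The blood-group comparison above boils down to counting shared markers across the ten loci and expressing the count as a percentage. The sketch below is a hypothetical illustration of that arithmetic; the marker profiles are invented, and the actual study's scoring rules were more detailed than a simple per-locus match.

    # Sketch: per cent similarity across blood-group loci for one couple.
    # Profiles are INVENTED for illustration; the seven systems named in
    # the text (ABO, Rh, MNSs, Kell, Duffy, Kidd, HLA) span ten loci.

    def percent_similarity(profile_a, profile_b):
        shared = sum(a == b for a, b in zip(profile_a, profile_b))
        return 100.0 * shared / len(profile_a)

    father = ['A', '+', 'MN', 'k', 'Fya', 'Jka', 'A2', 'B7', 'O', 'M']
    mother = ['A', '+', 'MM', 'k', 'Fyb', 'Jka', 'A2', 'B8', 'O', 'N']
    print(percent_similarity(father, mother))  # 60.0 in this invented example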
Bereavement studies

Within-family bereavement studies show just how fine-tuned human preferences for genetic similarity can be. One study of 263 child bereavements found that (a) spouses agreed seventy-four per cent of the time on which side of the family a child 'took after' the most, their own or that of their spouse, and (b) the grief intensity reported by mothers, fathers and grandparents was greater for children who resembled their side of the family than it was for children who resembled the other side of the family (Littlefield and Rushton 1986). A study of bereavement in twins found that MZ twins, who share one hundred per cent of their genes, compared to DZ twins, who share fifty per cent of their genes: (a) work harder for their co-twin; (b) show more physical proximity to their co-twin; (c) express more affection to their co-twin; and (d) show greater loss when their co-twin dies (Segal 2000).

Other lines of research

Women prefer the bodily scents of men with genes similar to their own more than they do those of men with nearly identical genes or genes totally dissimilar to their own (Jacob et al. 2002). Each woman's choice was based upon the human leukocyte antigen (HLA) gene sequence - the basis for personal odours and olfactory preferences - inherited from her father but not her mother. Another study found that both men and women rated versions of their own face as the most attractive after they had been computer-morphed into faces of the opposite sex, even though they did not recognise the photos as images of themselves (Penton-Voak et al. 1999). Similarly, people trusted strangers most when the stranger's face had been morphed with their own (DeBruine 2002). Familiarity was ruled out by using morphs of celebrities; only self-resemblance mattered.

The gravity of groups

The pull of genetic similarity does not stop at family and friends. Group members move into ethnic neighbourhoods and join together in clubs and societies. Since people of the same ethnic group are genetically more similar to each other than to members of other groups, they favour members of their own group over outsiders. In his groundbreaking book, The Ethnic Phenomenon, van den Berghe (1981) applied kin-selection theory to explain why people everywhere are prone to develop ethnocentric attitudes toward those who differ in dress, dialect and appearance, and how even relatively open and assimilative ethnic groups 'police' their boundaries against invasion by strangers by using 'badges' as markers of group membership. Van den Berghe hypothesised that these 'badges' would typically be cultural, such as scarification, linguistic accent and clothing style, rather than physical. He agreed that shared traits of high heritability could provide more reliable indicators than cultural, flexible ones, but he thought such heritable markers would be relevant only in modern times, when they could be used to discriminate between widely differing groups such as the Boers and Xhosa. The studies I reviewed above on kin recognition in animals and social assortment in humans show that the preference for similarity is fine-tuned. It takes place within ethnic groups, even families, and it occurs on the more heritable items from sets of homogeneous traits. As such, the process is considerably more variegated, subtle and powerful than van den Berghe (1981) conjectured. (His 1989 position paper went further toward acknowledging the more 'primordial' elements involved.)
The reviewed data confirm Hamilton's (1971) prediction that kin-recognition systems would use the more heritable attributes of others if they were based on mechanisms such as imprinting-on-self (Dawkins's 'armpit effect') and recognition alleles (Dawkins's 'green beard effect'). Detecting degrees of genetic similarity is much more fine-tuned than simply determining whether someone is a Boer or a Xhosa. The question is: how similar to one is the particular Boer (or Xhosa)? In his 2003 book On Genetic Interests, Frank Salter, a political ethologist at the Max Planck Institute in Munich, extrapolated genetic similarity theory and the logic of taking all shared genes into account to also explain ethnic nepotism. He showed how Hamilton's (1964, 1971, 1975) coefficient of relatedness (r) equated to the FST estimates of genetic variance (on average r ≈ 2FST) that had become available (e.g. Cavalli-Sforza et al. 1994). Since FST provides both a measure of genetic distance between populations and of kinship within them, it followed that, in comparison to the total genetic variance around the world, random members of any one population group are related to each other on the order of r ≈ 0.25, or 1/4 - about the same as half-siblings. (A general rule would be: if a fellow ethnic looks like you, then on average he or she is genetically equivalent to a cousin.) Salter's analysis of Cavalli-Sforza's FST data showed that if the world population were wholly English, then the kinship between any random pair of Englishmen would be zero. But if the world population consisted of both English people and Germans, then two random English people (or Germans) would have a kinship of 0.0044, or that of 1/32 of a cousin. As genetic distances between populations become larger, the kinship coefficient between random co-ethnics within a population increases. Two English people become the equivalent of 3/8 cousin by comparison with people from the Near East; 1/2 cousin by comparison with people from India; half-sibs by comparison with people from China or East Africa; and like full-sibs (or children) compared with people from South Africa. Since people have many more co-ethnics than relatives, the aggregate of genes they share with their fellow ethnics dwarfs those they share with their extended families. Rather than being a mere poor relation of family nepotism, ethnic nepotism is virtually a proxy for it. In two other books, Salter (2002 and 2004) and his colleagues found that ethnic bonds are central to explaining such diverse phenomena as ethnic mafias, minority middlemen networks, heroic freedom fighters, the welfare state, generous foreign aid and charity in all its more unstinting manifestations. One study examined street beggars in Moscow. Some were ethnic Russians, just like the vast majority of the pedestrians. Others were dressed in the distinctive garb of Moldova, a small former Soviet republic ethnically and linguistically kin to Romania. Finally, some of the beggars were darker-skinned Roma (Gypsies). The Russian pedestrians preferred to give to their fellow Russians, with the fellow Eastern European Moldovans second. The Gypsies were viewed so negatively that they had to resort to a wide variety of tactics, ranging from singing and dancing, to importuning tightwads, to sending out groups of young children to beg.
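Salter's kinship arithmetic above can be restated in a few lines. The sketch below applies the r ≈ 2FST relation and maps the result onto familiar kin categories; the FST input is back-calculated from the English-versus-German kinship of 0.0044 quoted in the text, so the numbers illustrate the logic rather than any genetic dataset.

    # Sketch of Salter's 'ethnic kinship': relative to a chosen reference
    # population, two random co-ethnics are related by roughly r = 2 * FST.

    KIN = [(0.5, 'full sibling'), (0.25, 'half sibling'),
           (0.125, 'first cousin'), (0.0, 'unrelated stranger')]

    def ethnic_kinship(fst):
        return 2 * fst

    def nearest_kin(r):
        return min(KIN, key=lambda pair: abs(pair[0] - r))[1]

    print(ethnic_kinship(0.0022))              # 0.0044: about 1/32 of a cousin
    print(nearest_kin(ethnic_kinship(0.125)))  # wider reference: 'half sibling'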
In an earlier study, anthropologist Colin J. Irwin (1987) tested formulations of in-group co-operation in inbred populations by calculating coefficients of consanguinity within and between various Eskimo tribes and sub-tribes in the western Hudson's Bay region of Canada. He found that prosocial behaviour, such as wife exchange, and anti-social behaviour, such as the genocidal killing of women and children during warfare, followed lines of genetic distance, albeit mediated by ethnic badging such as dialect and appearance. Even very young children typically show a clear preference for others of their own ethnic heritage (Aboud 1988). In fact, the process of making racial groupings has been shown to result from a natural tendency to classify people into 'kinds'. Children quickly begin to sort people into 'basic kinds' by sex, age, size and occupation. Experiments show that at an early age children clearly expect race to run in families (Hirschfield 1996). Very early in life, a child knows which race it belongs to, and which ones it does not.

The whisper of the genes

The history of the Jewish people provides a well-documented example of how genetic similarity theory intersects with Anthony D. Smith's (2000 and 2004) ethno-symbolic approach. As shown by Batsheva Bonne-Tamir at Tel Aviv University (e.g. Bonne-Tamir and Adam 1992) and by others (e.g. Thomas et al. 2002), Jewish groups are genetically similar to each other even though they have been scattered around the world for two millennia. Jews from Iraq and Libya share more genes with Jews from Germany, Poland and Russia than either group shares with the non-Jewish populations among whom they have lived for centuries. Although the Ethiopian Jews turn out not to be 'genetically Jewish', many other far-removed Jewish communities share a similar genetic profile despite large geographic distances between the communities and the passage of hundreds of years. Genetic Similarity Theory predicts that many other seemingly purely cultural divides are, in fact, rooted in the underlying population genetics. Recent DNA studies of the ancient Hindu caste system have confirmed that higher castes are more genetically related to Europeans than are lower castes, who are genetically closer to other south Asians (Bamshad et al. 2001). Although outlawed in 1960, the caste system continues to be the main feature of Indian society, with powerful political repercussions. Genetic studies can thus confirm (or disconfirm) people's ideas about their origins. In the case of Jews and the Indian caste system, traditional views have been confirmed. Israel is a new state, yet one which is built on an ancient tradition of ethnicity and nationhood. Much recent analysis of Israeli society, however, has tended to downplay connections between modern Israel and pre-modern Jewish identity, seeing Israel rather as an unambiguously modern phenomenon (cf. Smith 2000). Some Jews have greeted the genetic 'validation' positively because it affirms the organic nature of the Jewish people. However, it is also recognised as a two-edged sword that could be invoked to support claims from certain quarters that a 'Jewish race' is working to dominate the world. Hindu nationalists have expressed similar mixtures of feelings. While pleased to confirm 'Aryan' origins, they fear a backlash over elitism and exclusivity.
In other cases, genetic evidence refutes origin myths, such as the claims that the Chinese gene pool goes back a quarter of a million years to Beijing Man, or that Amerindians have always existed on the American continent rather than being only the most ancient of 'immigrants' (Rushton 1995). Genetic distance studies are likely to play an increasing role in debates about ancestral custodial rights over disputed territory. People can be predicted to adopt ideologies that work in their genetic self-interest. Examples of ideologies that have been shown, on analysis, to increase genetic fitness are religious beliefs that regulate dietary habits, sexual practices, marital customs, infant care and child rearing (Lumsden and Wilson 1981). Amerindian tribes that cooked maize with alkali had higher population densities and more complex social organisations than tribes that did not, partly because alkali releases the most nutritious parts of the cereal, enabling more people to grow to reproductive maturity. The Amerindians did not know the biochemical reasons for the benefits of alkali cooking, but their cultural beliefs had evolved for good reason, enabling them to replicate their genes more effectively than would otherwise have been the case. Political interests are typically presented in terms of high ethical standards, no matter how transparent these appear to opponents. Consider the competing claims of Palestinians and Israelis, or of the Afrikaners and the Bantus. Psychological explanation is made especially difficult since the rival groups construct very different histories of the conflict and all parties tend to see themselves as victims whose story has not been told. Because ethnic aspirations are rarely openly justified in terms of naked self-interest, analyses need to go deeper than surface ideology. Political issues are especially explosive when survival and reproduction are at stake. Consider the rise of Middle Eastern suicide bombers. Polls conducted among Palestinian adults from the Gaza Strip and the West Bank show that about seventy-five per cent support suicidal attacks, whereas only about twelve per cent are opposed (Margalit 2003). Many families state that they are proud of their kin who become martyrs. Most analyses of the motives of suicide bombings emphasise unique aspects such as the Palestinian or Iraqi political situation, the teachings of radical Islam, or a popular culture saturated with the glorification of martyrs. These political factors play an indispensable role, but an evolutionary perspective that aspires to universality suggests that people have evolved a 'cognitive module' for altruistic self-sacrifice that benefits their gene pool. In an ultimate rather than proximate sense, suicide bombing can be viewed as a strategy to increase inclusive fitness. What reasons do suicide bombers themselves give for their actions? Many invoke the rhetoric of Islam while others appeal to political and economic grievances. Mahmoud Ahmed Marmash, a twenty-one-year-old bachelor from Tulkarm who blew himself up near Tel Aviv in May 2001, said in a videocassette recorded before he went on his mission (cited in Margalit 2003):

I want to avenge the blood of the Palestinians, especially the blood of the women, of the elderly, and of the children, and in particular the blood of the baby girl Iman Hejjo, whose death shook me to the core.

Many other national groups have produced suicide warriors. The term 'zealot' originates with a Jewish sect that existed for about seventy years in the first century CE.
According to the classical historian Flavius Josephus (1981), an extreme revolutionary faction among them assassinated Romans and Jewish collaborators with daggers, a practice that sharply reduced the assassins' own chances of staying alive. A group of about 1,000 Zealots, including women and children, chose to commit suicide at the fortress of Masada rather than surrender to the Romans. Masada today is one of the Jewish people's greatest symbols. Israeli soldiers take an oath there: 'Masada shall not fall again'. Regular armies - the Japanese kamikaze and the Iranian basaji - have also carried out suicide attacks against enemy combatants. Winston Churchill contemplated the use of suicide bombers against the Germans if they invaded Britain (see Cornwell 2003). Some of the Tamil Tigers of Sri Lanka, who are Hindus, have killed themselves in attacks on politicians and army installations, and they have done so with utter disregard for the lives of civilians who happened to be around. Genes, of course, typically only 'whisper' their wishes rather than shout. They keep cultures on a long rather than a short leash (to use Lumsden and Wilson's 1981 metaphor). This allows for pragmatism and flexibility in the strategies that groups adopt to serve their aspirations. For example, Zubaida (2004) noted that the ideological weapons Arabs have employed to further their cause against political dominance by the Ottoman Turks (who were fellow Muslims), the Western Great Powers, the United States and now Israel have alternated between Islam and nationalism, with all the continuities and contradictions in between. Zubaida (2004) also noted that Turkish, Egyptian and Iranian Islamisms (and sometimes anti-Islamisms) have often been national, and often nationalistic. Across the Muslim world, Arabs have often seen themselves as the mainstay of Islam, and Islam as the national culture of the Arabs. Nationalism became unpopular when it failed to satisfy Arab aspirations and is now often seen as an import from the West to 'divide and conquer'. Although fundamentalism is typically seen as subversive by Arab regimes, ethnic nationalists often celebrate it as a demonstration of revolutionary power. The Shi'ite Revolution in the non-Arabic but Islamic Republic of Iran, for example, served as an example not only for Islamists, but also for many nationalists and leftists in the Arab world. The political pull of ethnic identity and genetic similarity also explains voting behaviour. The re-election victory of George W. Bush in the 2004 US presidential election was largely attributed to White voters and to the higher value these voters placed on 'values' than on the economy. A closer look at the demographics reveals that 'values' may be, at least in part, a proxy for ethnic identity and genetic similarity. The majority of White Americans voted for the candidate whom - together with his family - they believed looked, spoke and acted most like them (Brownstein and Rainey 2004). Another timely example is the growth of Christian fundamentalism in the United States. Analyses show that it represents a reaction to what is perceived as the moral breakdown of society (Marty and Appleby 1994). Because of trends in the mass media and education system, many religious people believe they now live in a hostile culture where their core values are under siege. The issue on which they are most politically active is opposition to abortion.
One hypothesis to be investigated is that, if estimates of genetic similarity could be obtained, fundamentalists would prove close to each other and to the basic Anglo-Saxon gene pool. If so, it would be informative to know what percentage of the estimated fifty million women who have had legal abortions in the United States since 1973 were of that ethnic background.

Conclusion

Genetic similarity, of course, is only one of many possible influences operating on political alliances. Causation is complex and there is no value in reducing relationships between ethnic groups to a single factor. Fellow ethnics will not always stick together, nor is conflict inevitable between groups any more than it is between genetically distinct individuals. In addition to reproductive success, individuals also work for motives such as economic success. However, as van den Berghe (1981) pointed out, from an evolutionary perspective, the ultimate measure of human success is not production but reproduction. Behavioural outcomes are always mediated by multiple causes. Nonetheless, genetic similarity can be expected to play a clear role in the social behaviour of small groups and even of large ones, both national and international. The hypothesis presented here is that because fellow ethnics carry copies of the same genes, ethnic consciousness is rooted in the biology of altruism and mutual reciprocity. Thus ethnic nationalism, xenophobia and genocide can become the 'dark side' of altruism. Moreover, shared genes can govern the degree to which an ideology is adopted (e.g. Rushton 1986 and 1989a). Some genes will replicate better in some cultures than in others. Religious, political and class conflicts become heated because they affect genetic fitness. Karl Marx did not take his analysis far enough: ideology may be the servant of economic interest, but genes influence both. Since individuals have a greater concentration of genetic interest (inclusive fitness) in their own ethnic group than they do in other ethnic groups, they can be expected to adopt ideas that promote their group over others. Political ethologist Frank Salter (2003) refers to ideologies as 'fitness portfolios', and psychologist Kevin MacDonald (2001) has described co-ethnics as engaging in 'group evolutionary strategies'. It is because genetic interests are a powerful force in human affairs that ethnic insults so easily lead to violence. Although social scientists and historians have been quick to condemn the extent to which political leaders or would-be leaders have been able to manipulate ethnic identity, the questions they never ask, let alone attempt to answer, are: 'Why is it always so easy?' and 'Why can a relatively uneducated political outsider set off a riot simply by uttering a few well-delivered ethnic epithets?' Many caveats to the theoretical approach described here must be noted. Thus, Salter (2003) concluded that although (a) ethnic bonds can be adaptive because they unite people in defence of shared interests, and (b) down-sizing ethnicity through multiculturalism might change the competitive advantage of particular groups for dominance but is unlikely to eliminate ethnic identity from our nature as social beings, nonetheless (c) there are many examples of how maladapted modern humans are for defending their ethnic interests, owing to the competing demands of family and immediate kin and the sheer complexity of modern societies, including the impact of cultural factors (see his Chapter 6).
It would be incorrect to over-generalise findings on genetic similarity and reify primordialism or resurrect ideas of organic nationalism. Rather, these findings provide the potential for an even more nuanced ethno-symbolic approach to the forces operating both within and between countries, many of which can otherwise seem irrational. Although the modern idea of citizenship has replaced the bond of ethnicity ('people who look and talk like us') with that of values ('people who think and behave like us'), the politics of ethnic identity are increasingly replacing the politics of class as the major threat to the stability of nations. Patriotic feeling is much more than a delusion constructed by elites for their own purpose. The ethno-symbolic approach anchors the psychology of social identity in national identities and in previously existing ethnicities and their 'sacred' traditions and customs (e.g. Smith 2000 and 2004). Ethnic communities have been present in every period and have played an important role in all societies on every continent. The sense of common ethnicity remains a major focus of identification for individuals today. Genetic Similarity Theory helps to explain why.

References

Aboud, Frances. 1988. Children and Prejudice. London: Blackwell.
Ahern, F. M., R. E. Cole, R. C. Johnson and B. Wong. 1981. 'Personality attributes of males and females marrying within vs. across racial/ethnic groups', Behavior Genetics 11: 181-94.
Alberts, Susan C. 1999. 'Paternal kin discrimination in wild baboons', Proceedings of the Royal Society of London, B 266: 1501-6.
APA/CPA. 1997. Ethnopolitical Warfare: Origins, Intervention, and Prevention. A Joint Presidential Initiative of the Presidents-Elect of the American Psychological Association and the Canadian Psychological Association. Washington, DC: American Psychological Association.
Badcock, Christopher. 2000. Evolutionary Psychology: a Critical Introduction. Cambridge: Polity Press.
Bamshad, Michael, Toomas Kivisild, W. Scott Watkins, Mary E. Dixon, Chris E. Ricker, Baskara B. Rao, J. Mastan Naidu, B. V. Ravi Prasad, P. Govinda Reddy, Arani Rasanayagam, Surinder S. Papiha, Richard Villems, Alan J. Redd, Michael F. Hammer, Son V. Nguyen, Marion L. Carroll, Mark A. Batzer and Lynne B. Jorde. 2001. 'Genetic evidence on the origins of Indian caste populations', Genome Research 11(6): 994-1004.
Blaustein, A. R. and R. K. O'Hara. 1981. 'Genetic control for sibling recognition?', Nature 290: 246-8.
Bonne-Tamir, Batsheva and Avinoam Adam (eds.). 1992. New Perspectives on Genetic Markers and Diseases among Jewish People. Oxford: Oxford University Press.
Brownstein, R. and R. Rainey. 2004. 'Bush's huge victory in the fast-growing areas beyond the suburbs alters the political map', Los Angeles Times 22 November: A1, A14-A15.
Buss, David M. 2003. Evolutionary Psychology: the New Science of the Mind. Needham Heights, MA: Allyn & Bacon.
Cavalli-Sforza, Luigi L., Paolo Menozzi and Albert Piazza. 1994. The History and Geography of Human Genes. Princeton, NJ: Princeton University Press.
Cornwell, John. 2003. Hitler's Scientists: Science, War, and the Devil's Pact. New York: Penguin.
Daniels, Denise and Robert Plomin. 1985. 'Differential experience of siblings in the same family', Developmental Psychology 21: 747-60.
Darwin, Charles. 1859. The Origin of Species. London: Murray.
Darwin, Charles. 1871. The Descent of Man. London: Murray.
Dawkins, Richard. 1976. The Selfish Gene. Oxford: Oxford University Press.
Dawkins, Richard. 1981. 'Selfish genes in race or politics', Nature 289: 528.
DeBruine, Lisa M. 2002. 'Facial resemblance enhances trust', Proceedings of the Royal Society of London, B 269: 1307-12.
Degler, Carl N. 1991. In Search of Human Nature. New York: Oxford University Press.
Fletcher, D. J. C. and C. D. Michener. 1987. Kin Recognition in Animals. New York: Wiley.
Greenberg, L. 1979. 'Genetic component of bee odor in kin recognition', Science 206: 1095-7.
Guibernau, Montserrat and John Hutchinson (eds.). 2004. History and National Destiny: Ethnosymbolism and its Critics. London: Blackwell.
Hamilton, William D. 1964. 'The genetical evolution of social behavior. I and II', Journal of Theoretical Biology 7: 1-52.
Hamilton, William D. 1971. 'Selection of selfish and altruistic behaviour in some extreme models' in J. F. Eisenberg and W. S. Dillon (eds.), Man and Beast: Comparative Social Behavior. Washington, DC: Smithsonian Press, 57-91.
Hamilton, William D. 1975. 'Innate social aptitudes of man: an approach from evolutionary genetics' in R. Fox (ed.), Biosocial Anthropology. London: Malaby Press, 133-55.
Hamilton, William D. 1987. 'Discriminating nepotism: expectable, common, overlooked' in D. J. C. Fletcher and C. D. Michener (eds.), Kin Recognition in Animals. New York: Wiley, 417-37.
Hauber, Mark E. and Paul W. Sherman. 2001. 'Self-referent phenotype matching: theoretical considerations and empirical evidence', Trends in Neuroscience 24(10): 609-16.
Hirschfield, Lawrence A. 1996. Race in the Making: Cognition, Culture, and the Child's Construction of Human Kinds. Cambridge, MA: MIT Press.
Hutchinson, John. 2000. 'Ethnicity and modern nations', Ethnic and Racial Studies 23(4): 651-69.
Hutchinson, John and Anthony D. Smith (eds.). 1996. Ethnicity. Oxford: Oxford University Press.
Irwin, Colin J. 1987. 'A study in the evolution of ethnocentrism' in V. Reynolds, V. S. E. Falger and I. Vine (eds.), The Sociobiology of Ethnocentrism: Evolutionary Dimensions of Xenophobia, Discrimination, Racism, and Nationalism. London: Croom Helm, 131-56.
Jacob, Suma, Martha K. McLintock, Bethanne Zelano and Carole Ober. 2002. 'Paternally inherited HLA alleles are associated with women's choice of male odor', Nature Genetics 30(2): 175-9.
Josephus, Flavius. 1981. The Jewish War (revised edn by E. M. Smallwood of G. A. Williamson translation). New York: Penguin.
Littlefield, Christine H. and J. Philippe Rushton. 1986. 'When a child dies: the sociobiology of bereavement', Journal of Personality and Social Psychology 51: 797-802.
Lumsden, Charles J. and Edward O. Wilson. 1981. Genes, Mind, and Culture: the Coevolutionary Process. Cambridge, MA: Harvard University Press.
MacDonald, Kevin. 2001. 'An integrative evolutionary perspective on ethnicity', Politics and the Life Sciences 20(1): 67-80.
Margalit, Avishai. 2003. 'The suicide bombers', The New York Review of Books 50(1), 16 January.
Marty, M. and R. S. Appleby. 1994. Fundamentalism Observed: the Fundamentalism Project. Chicago, IL: University of Chicago Press.
Maynard Smith, John. 1964. 'Group selection and kin selection', Nature 201: 1145-7.
Penton-Voak, I. S., D. I. Perrett and J. W. Pierce. 1999. 'Computer graphic studies of the role of facial similarity in judgements of attractiveness', Current Psychology 18: 104-17.
Pinker, Steven. 2002. The Blank Slate: the Modern Denial of Human Nature. New York: Viking.
Rowe, David C. and D. W. Osgood. 1984. 'Heredity and sociological theories of delinquency: a reconsideration', American Sociological Review 49: 526-40.
Rushton, J. Philippe. 1986. 'Gene-culture coevolution and genetic similarity theory: implications for ideology, ethnic nepotism, and geopolitics', Politics and the Life Sciences 4(2): 144-8.
Rushton, J. Philippe. 1988. 'Genetic similarity, mate choice, and fecundity in humans', Ethology and Sociobiology 9(6): 329-33.
Rushton, J. Philippe. 1989a. 'Genetic similarity, human altruism, and group selection', Behavioral and Brain Sciences 12(3): 503-59.
Rushton, J. Philippe. 1989b. 'Genetic similarity in male friendships', Ethology and Sociobiology 10(5): 361-73.
Rushton, J. Philippe. 1995. Race, Evolution, and Behavior. New Brunswick, NJ: Transaction.
Rushton, J. Philippe. 2004. 'Genetic and environmental contributions to prosocial attitudes: a twin study of social responsibility', Proceedings of the Royal Society of London, B 271: 2583-5.
Rushton, J. Philippe and Trudy A. Bons. 2005. 'Mate choice and friendship in twins: evidence for genetic similarity', Psychological Science 16(7): 555-9.
Rushton, J. Philippe and Ian R. Nicholson. 1988. 'Genetic similarity theory, intelligence, and human mate choice', Ethology and Sociobiology 9(1): 45-57.
Rushton, J. Philippe and Robin J. H. Russell. 1985. 'Genetic similarity theory: a reply to Mealey and new evidence', Behavior Genetics 15: 575-82.
Rushton, J. Philippe, Robin J. H. Russell and Pamela A. Wells. 1984. 'Genetic similarity theory: beyond kin selection', Behavior Genetics 14: 179-93.
Rushton, J. Philippe, Christine H. Littlefield and Charles J. Lumsden. 1986. 'Gene-culture coevolution of complex social behavior: human altruism and mate choice', Proceedings of the National Academy of Sciences, USA 83(19): 7340-3.
Russell, Robin J. H. and Pamela A. Wells. 1991. 'Personality similarity and quality of marriage', Personality and Individual Differences 12: 406-12.
Russell, Robin J. H., Pamela A. Wells and J. Philippe Rushton. 1985. 'Evidence for genetic similarity detection in human marriage', Ethology and Sociobiology 6(3): 183-7.
Salter, Frank. 2002. Risky Transactions: Trust, Kinship and Ethnicity. London: Berghahn.
Salter, Frank. 2003. On Genetic Interests: Family, Ethny and Humanity in an Age of Mass Migration. Frankfurt, Germany: Peter Lang.
Salter, Frank (ed.). 2004. Welfare, Ethnicity, and Altruism: New Findings and Evolutionary Theory. New York: Frank Cass.
Segal, Nancy L. 2000. Entwined Lives: Twins and What They Tell Us About Human Behavior. New York: Plume.
Smith, Anthony D. 1998. Nationalism and Modernism: a Critical Survey of Recent Theories of Nations and Nationalism. London: Routledge.
Smith, Anthony D. 2000. The Nation in History: Historiographical Debates about Ethnicity and Nationalism. Hanover, NH: University Press of New England.
Smith, Anthony D. 2004. Chosen Peoples: Sacred Sources of National Identity. Oxford: Oxford University Press.
Tesser, Abraham. 1993. 'The importance of heritability in psychological research: the case of attitudes', Psychological Review 93(1): 129-42.
Thomas, Mark G., Michael E. Weale, Abigail L. Jones, Martin Richards, Alice Smith, Nicola Redhead, Antonio Torroni, Rosaria Scozzari, Fiona Gratix, Ayele Tarakegn, James F. Wilson, Christian Capelli, Neil Bradman and David B. Goldstein. 2002. 'Founding mothers of Jewish communities: geographically separated Jewish groups were independently founded by very few female ancestors', American Journal of Human Genetics 70(6): 1411-20.
van den Berghe, Pierre L. 1981. The Ethnic Phenomenon. New York: Elsevier.
van den Berghe, Pierre L. 1983. 'Human inbreeding avoidance', Behavioral and Brain Sciences 6: 91-123.
van den Berghe, Pierre L. 1989. 'Heritable phenotypes and ethnicity', Behavioral and Brain Sciences 12: 544-55.
van den Berghe, Pierre L. 2002. 'Multicultural democracy: can it work?', Nations and Nationalism 8(4): 433-49.
van der Dennen, Johan M. G. 1987. 'Ethnocentrism and in-group/out-group differentiation: a review and interpretation of the literature' in V. Reynolds, V. S. E. Falger and I. Vine (eds.), The Sociobiology of Ethnocentrism: Evolutionary Dimensions of Xenophobia, Discrimination, Racism, and Nationalism. London: Croom Helm, 1-47.
Wilson, Edward O. 1975. Sociobiology: the New Synthesis. Cambridge, MA: Harvard University Press.
Wilson, Edward O. 1998. Consilience: the Unity of Knowledge. New York: Knopf.
Wrangham, R. and D. Peterson. 1996. Demonic Males: Apes and the Origins of Human Violence. Boston, MA: Houghton Mifflin.
Zubaida, Sami. 2004. 'Islam and nationalism: continuities and contradictions', Nations and Nationalism 10(4): 407-20.

From checker at panix.com Mon Dec 5 02:45:24 2005
From: checker at panix.com (Premise Checker)
Date: Sun, 4 Dec 2005 21:45:24 -0500 (EST)
Subject: [Paleopsych] NS: Why we cannot rely on firearm forensics
Message-ID:

Why we cannot rely on firearm forensics
http://www.newscientist.com/article.ns?id=mg18825274.300&print=true
* 23 November 2005
* Robin Mejia

TYRONE JONES is serving a life sentence, in part because of a microscopic particle that Baltimore police found on his left hand. At his trial for murder in 1998, the crime-lab examiner gave evidence that the particle was residue from a gunshot. He claimed Jones must have held or fired a gun shortly before his arrest. Jones denies this and still protests his innocence. His defence team is appealing the conviction, claiming that the science of gunshot residue (GSR) analysis is not as robust as the prosecution claims. Now, a New Scientist investigation has found that someone who has never fired a gun could be contaminated by someone who has, and that different criminal investigators use contradictory standards. What's more, particles that are supposedly unique to GSR can be produced in other ways. Forensic scientists often testify that finding certain particle types means the suspect handled or fired a weapon. Janine Arvizu, an independent lab auditor based in New Mexico, reviewed the Baltimore county police department's procedures relating to GSR. Her report concludes: "The BCPD lab routinely reported that gunshot residue collected from a subject's hands 'most probably' arose from proximity to a discharging firearm, despite the fact that comparable levels of gunshot residue were detected in the laboratory's contamination studies." The BCPD did not return calls requesting comment. Some specialists argue for a more cautious approach. "None of what we do can establish if anybody discharged a firearm," says Ronald Singer, former president of the American Academy of Forensic Sciences and chief criminalist at the Tarrant county medical examiner's office in Fort Worth, Texas. Peter De Forest of John Jay College of Criminal Justice in New York goes further. "I don't think it's a very valuable technique to begin with. It's great chemistry. It's great microscopy. The question is, how did [the particle] get there?" GSR analysis is commonly used by forensic scientists around the world. In Baltimore alone, it has been used in almost 1000 cases over the past decade.
It is based on identifying combinations of heavy metals in microscopic particles that are formed when the primer in a cartridge ignites. The primer sets off the main charge, which expels the bullet. There is no standardised procedure to test for GSR, but the organisation ASTM International, which develops standards that laboratories can look to for guidance, has developed a guide for performing the technique that was approved in 2001. This states that particles made only of lead, barium and antimony, or of antimony and barium, are "unique" to gunshot residue. The particles are identified using a scanning electron microscope and their composition analysed using energy-dispersive spectrometry. But recent studies have shown that a non-shooter can become contaminated without going near a firearm. Lubor Fojtášek and Tomáš Kmjec at the Institute of Criminalistics in Prague, Czech Republic, fired test shots in a closed room and attempted to recover particles 2 metres away from the shooter. They detected "unique" particles up to 8 minutes after a shot was fired, suggesting that someone entering the scene after a shooting could have more particles on them than a shooter who runs away immediately (Forensic Science International, vol 153, p 132). A separate study, reported in 2000 by Debra Kowal and Steven Dowell at the Los Angeles county coroner's department, found that it was also possible to be contaminated by police vehicles. Of 50 samples from the back seats of patrol cars, they found 45 contained particles "consistent" with GSR and four had "highly specific" GSR particles. What's more, they showed that "highly specific" particles could be transferred from the hands of someone who had fired a gun to someone who had not. This doesn't surprise Arvizu. "If I was going to go out and look for gunshot residue, police stations are the places I'd look," she says. Scientists using the technique are aware of the potential contamination problem, but how they deal with it varies. In Baltimore, for example, the police department crime lab's protocol calls for at least one lead-barium-antimony particle and a few "consistent" particles to be found to call the sample positive for GSR. The FBI is more cautious. Its protocol states: "Because the possibility of secondary transfer exists, at least three unique particles must be detected...in order to report the subject/object/surface 'as having been in an environment of gunshot primer residue'." So a person could be named as a potential shooter in Baltimore, but given the benefit of the doubt by the FBI. Even worse, it is possible to pick up a so-called "unique" particle from an entirely different source. Industrial tools and fireworks are both capable of producing particles with a similar composition to GSR. And several studies have suggested that car mechanics are particularly at risk of being falsely accused, because some brake linings contain heavy metals and can form GSR-like particles at the temperatures reached during braking. In one recent study, Bruno Cardinetti and colleagues at the Scientific Investigation Unit of the Carabinieri (the Italian police force) in Rome found that composition alone was not enough to tell true GSR particles from particles formed in brake linings (Forensic Science International, vol 143, p 1).
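The disagreement between laboratories described above is easy to state as code. The sketch below encodes the particle compositions the ASTM guide calls "unique" (as summarised in this article) and the two reporting thresholds the article attributes to the Baltimore lab and the FBI. Function names and the sample are mine, not any lab's actual software, and "a few consistent particles" is interpreted here, as an assumption, to mean two or more.

    # Sketch: the same sample can be 'positive' under one lab's GSR protocol
    # and negative under another's. Composition rules and thresholds follow
    # the article's description; names and the example sample are invented.

    def classify(particle_elements):
        elements = frozenset(particle_elements)
        if elements in ({'Pb', 'Ba', 'Sb'}, {'Ba', 'Sb'}):
            return 'unique'
        return 'consistent' if elements & {'Pb', 'Ba', 'Sb'} else 'other'

    def report(particles):
        unique = sum(classify(p) == 'unique' for p in particles)
        consistent = sum(classify(p) == 'consistent' for p in particles)
        baltimore_positive = unique >= 1 and consistent >= 2
        fbi_positive = unique >= 3
        return baltimore_positive, fbi_positive

    # One unique particle plus two consistent ones: positive under the
    # Baltimore protocol, negative under the FBI's three-unique rule.
    sample = [('Pb', 'Ba', 'Sb'), ('Pb',), ('Ba',)]
    print(report(sample))  # (True, False)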
At an FBI symposium last June, GSR experts discussed ways to improve and standardise the tests. The bureau would not discuss the meeting, but special agent Ann Todd says the FBI's laboratory is preparing a paper for publication that "will make recommendations to the scientific community regarding accepting, conducting and interpreting GSR exams". Singer maintains that the technique is useful if used carefully. "I think it's important as part of the investigative phase," he says, though not necessarily to be presented in court. But he adds: "There are people who are going to be a bit more, shall we say, enthusiastic. That's where you're going to run into trouble."

From checker at panix.com Mon Dec 5 02:45:31 2005
From: checker at panix.com (Premise Checker)
Date: Sun, 4 Dec 2005 21:45:31 -0500 (EST)
Subject: [Paleopsych] Independent: Revealed: the chemistry of love
Message-ID:

Revealed: the chemistry of love
http://news.independent.co.uk/world/science_technology/article329619.ece

The good news: they've discovered the love chemical inside us all. The bad news: it only lasts a year. The very source of love has been found. And is it that smouldering look exchanged across a crowded room? Those limpid eyes into which you feel you could gaze for ever? No. It's NGF, say unromantic spoilsport scientists who have made the discovery - that's short for nerve growth factor. And now, the really deflating news: its potent, life-enhancing, brain-scrambling effect doesn't last. It subsides within the year of first falling in love - presumably within the same period it takes lovers to notice that the object of their affections can't get the lid on the toothpaste. "We have demonstrated for the first time that circulating levels of NGF are elevated among subjects in love, suggesting an important role for this molecule in the social chemistry of human beings," says Enzo Emanuele of the University of Pavia in Italy. Dr Emanuele and his researchers compared 58 men and women, aged 18 to 31, who had recently fallen in love with people in established relationships and with singles. "Potential participants were required to be truly, deeply and madly in love," said the researchers. Only people whose relationships had begun within the previous six months were studied. The "in love" group had to be spending at least four hours a day thinking about their partner. When the levels of blood chemicals were measured, it was found that both men and women who had recently fallen in love showed very high levels of NGF - 227 units compared with 123 units recorded in those in long-lasting relationships. The study also found that those who reported the most intense feelings had the highest NGF levels. However, when researchers revisited people from the "in love" group who were still in the same relationship more than a year later, the levels of NGF had declined to the same levels as the established relationship and singles groups. Love is a neglected area of research and little work has been done on its mechanisms.
Dr Emanuele and his team believe they have conducted the first investigation into the peripheral levels of neurotrophins in people in love. While the role of NGF in falling in love remains unclear, the researchers suggest that some behavioural or psychological features associated with falling in love could be related to the higher chemical levels. "The raised NGF levels when falling in love could be related to specific emotions typically associated with intense early-stage romantic love, such as emotional dependency and euphoria," the researchers say. "The specificity of NGF increase during early-stage love seems to suggest that it could be involved in the formation of novel bonds, whereas it does not appear to play a major role in their maintenance." Rocketing NGF, however, could be a necessary step on the way to an enduring love because NGF is thought to play an important part in the release of another chemical which plays a pivotal role in social bonding. In a report about to be published in the journal Psychoneuroendocrinology, the research team ends with a justification for more love research that seems quintessentially Italian: "Given the complexity of the sentiment of romantic love, and its capacity to exhilarate, arouse, disturb, and influence so profoundly our behaviour, further investigations on the neurochemistry and neuroendocrinology of this unique emotional state are warranted." From checker at panix.com Mon Dec 5 02:45:36 2005 From: checker at panix.com (Premise Checker) Date: Sun, 4 Dec 2005 21:45:36 -0500 (EST) Subject: [Paleopsych] NYTBR: The Capitalist Manifesto Message-ID: The Capitalist Manifesto http://www.nytimes.com/2005/11/27/books/review/27easterbrook.html THE MORAL CONSEQUENCES OF ECONOMIC GROWTH By Benjamin M. Friedman. 570 pp. Alfred A. Knopf. $35. Review by GREGG EASTERBROOK ECONOMIC growth has gotten a bad name in recent decades - seen in many quarters as a cause of resource depletion, stress and sprawl, and as an excuse for pro-business policies that mainly benefit plutocrats. Some have described growth as a false god: after all, the spending caused by car crashes and lawsuits increases the gross domestic product. One nonprofit organization, Redefining Progress, proposes tossing out growth as the first economic yardstick and substituting a "Genuine Progress Indicator" that, among other things, weighs volunteer work as well as the output of goods and services. By this group's measure, American society peaked in 1976 and has been declining ever since. Others think ending the fascination with economic growth would make Western life less materialistic and more fulfilling. Modern families "work themselves to exhaustion to pay for stuff that sits around not being used," Thomas Naylor, a professor emeritus of economics at Duke University, has written. If economic growth were no longer the goal, there would be less anxiety and more leisurely meals. But would there be more social justice? No, says Benjamin Friedman, a professor of economics at Harvard University, in "The Moral Consequences of Economic Growth." Friedman argues that economic growth is essential to "greater opportunity, tolerance of diversity, social mobility, commitment to fairness and dedication to democracy." During times of expansion, he writes, nations tend to liberalize - increasing rights, reducing restrictions, expanding benefits for the needy. During times of stagnation, they veer toward authoritarianism.
Economic growth not only raises living standards and makes liberal social policies possible, it causes people to be optimistic about the future, which improves human happiness. "It is simply not true that moral considerations argue wholly against economic growth," Friedman contends. Instead, moral considerations argue that large-scale growth must continue at least for several generations, both in the West and the developing world. Each American, the World Wildlife Fund calculates, demands more than four times as much of the earth as the global average for all men and women, most of this demand being resource consumption. Some think such figures mean American resource consumption must go down; to Friedman's thinking, any reduction would only harm the rest of the world by slowing global growth. What the statistic actually tells you, he would say, is that overall global resource consumption must go up, up, up - to bring reasonable equality of living standards to the developing world and to encourage the liberalization and increased human rights that accompany economic expansion. If by the middle of the 21st century everyone on earth were to realize the living standard of present-day Portugal (taking into account expected population expansion), Friedman calculates, global economic output must quadruple. That's a lot of growth. "The Moral Consequences of Economic Growth" is an impressive work: commanding, insistent and meticulously researched. Much of it is devoted to showing that in the last two centuries, periods of growth have in most nations coincided with progress toward fairness, social mobility, openness and other desirable goals, while periods of stagnation have coincided with retreat from progressive goals. These sections sometimes have a history-lesson quality, discoursing on period novels, music and other tangential matters. And sometimes the history lesson gets out of hand, as when the author pauses to inform readers that the Federal Republic of Germany was commonly known as West Germany. More important, Friedman's attempt to argue that there is something close to an inevitable link between economic growth and social advancement is not entirely successful, a troublesome point since such a link is essential to his thesis. For example, Friedman contends that economic growth aided American, French and English social reforms of the second half of the 19th century. Probably, but there was also a recession in the United States beginning in 1893, yet pressure for liberal reforms continued: the suffrage, good-government and social-gospel movements strengthened during that time. It was in the midst of a depression, in 1935, that Social Security, a huge progressive leap, was enacted. Economic growth has often been weak in the United States over the last three decades, yet in this period American society has become significantly more open and tolerant - discrimination appears at an all-time low. On the flip side, the 20's were the heyday of the Klan in the United States, though the "roaring" economy of the decade was growing briskly. None of this disproves Friedman's hypothesis, only clouds its horizon. Surely liberalization works better where there is growth, while growth works better where there is liberalization - as China is learning.
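Friedman's "quadruple" figure can be sanity-checked with one line of arithmetic: scale today's world output by population growth and by the gap between Portuguese and average world output per person. The inputs below are rough 2005-era assumptions of mine, not Friedman's own.

# Back-of-envelope check of the "quadruple" claim; all inputs are assumptions.
world_pop_now = 6.5e9              # assumed mid-2000s population
world_pop_2050 = 9.1e9             # assumed UN mid-range projection
world_gdp_per_capita = 7_000       # rough world output per person, USD
portugal_gdp_per_capita = 18_000   # rough Portuguese output per person, USD

multiplier = (world_pop_2050 * portugal_gdp_per_capita) / (world_pop_now * world_gdp_per_capita)
print(f"required growth in world output: about {multiplier:.1f}x")
# ~3.6x with these inputs - the same order as Friedman's "quadruple";
# the review does not give his exact assumptions.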
But the relationship between the two forces may always be fuzzy; the modern era might have seen movement toward greater personal freedom and social fairness regardless of whether high-output industrial economies replaced low-growth agrarian systems. Repressive forces, from skinheads to Nazis and Maoists, may spring more from evil in the human psyche than from any economic indicator. Friedman's thesis is now being tested in China, home of the world's most impressive economic growth. If he's right, China will rapidly become more open, gentle and democratic. Let's hope he's right. Though "The Moral Consequences of Economic Growth" may not quite succeed in showing an iron law of growth and liberalization, Friedman is surely correct when he contends that economic expansion must remain the world's goal, at least for the next few generations. Growth, he notes, has already placed mankind on a course toward the elimination of destitution. Despite the popular misconception of worsening developing-world misery, the fraction of people in poverty is in steady decline. Thirty years ago 20 percent of the planet lived on $1 or less a day; today, even adjusting for inflation, only 5 percent does, despite a much larger global population. Probably one reason democracy is taking hold is that living standards are rising, putting men and women in a position to demand liberty. And with democracy spreading and rising wages giving ever more people a stake in the global economic system, it could be expected that war would decline. It has. Even taking Iraq into account, a study by the Center for International Development and Conflict Management, at the University of Maryland, found that the extent and intensity of combat in the world is only about half what it was 15 years ago. Friedman concludes his book by turning to psychology, which shows that people's assumptions about whether their lives will improve are at least as important as whether their lives are good in the present. Right now, American living standards and household income are the highest they have ever been; but because middle-class income has been stagnant for more than two decades, while the wealthy hoard society's gains, many Americans have negative expectations. "America's greatest need today is to restore the reality. . . that our people are moving ahead," Friedman writes. How? He recommends lower government spending (freeing money for private investment), repealing upper-income tax cuts (to shrink the federal deficit), higher Social Security retirement ages, choice-based Medicare and big improvements in the educational system (educated workers are more productive, which accelerates growth). Friedman doesn't worry that we will run out of petroleum, trees or living space. What he does worry about is that we will run out of growth. Gregg Easterbrook is a visiting fellow at the Brookings Institution, a contributing editor of The Atlantic Monthly and the author, most recently, of "The Progress Paradox." From checker at panix.com Mon Dec 5 02:45:43 2005 From: checker at panix.com (Premise Checker) Date: Sun, 4 Dec 2005 21:45:43 -0500 (EST) Subject: [Paleopsych] ABC (au): Ancient Germans weren't so fair, 2004.7.16 Message-ID: Ancient Germans weren't so fair, 2004.7.16 http://www.abc.net.au/science/news/stories/s1154815.htm [I found this when looking for articles on red hair for the meme on The Maureen Dowd Theory of Western Civilization I sent yesterday.
I'm not expert enough to comment on this, but the Maureen Dowd theory might suggest that light hair and eyes, in the proportions of, say, 1900, could be recent indeed.] Anna Salleh in Brisbane Friday, 16 July 2004 Researchers may be able to make more accurate reconstructions of what ancient humans looked like with the first ever use of ancient DNA to determine hair and skin colour from skeletal remains. The research was presented today at an [4]international ancient DNA conference in Brisbane, Australia, by German anthropologist, Dr Diane Schmidt of the [5]University of Göttingen. She said her research may also help to identify modern-day murderers and their victims. "Three thousand years ago, nobody was doing painting and there was no photography. We do not know what people looked like," Schmidt told ABC Science Online. She said most images in museums and books were derived from comparisons with living people from the same regions. "For example, when we make a reconstruction of people from Africa we think that they had dark skin or dark hair," she said. "But there's no real scientific information. It's just a guess. It's mostly imagination." She said this had meant, for example, that the reconstruction of Neanderthals had changed over time. "In the 1920s, the Neanderthals were reconstructed as wild people with dark hair and dumb, not really clever," she said. "Today, with the same fossil record, with the same bones and no other information - just a change in ideology - you see reconstructions of people with blue eyes and quite light skin colour, looking intelligent and using tools. "Most of the reconstructions you see in museums are a thing of the imagination of the reconstructor. Our goal is to make this reconstruction less subjective and give them an objective basis with scientific data." Genetic markers for hair colour In research for her recently completed PhD, Schmidt built on research from the fields of dermatology and skin cancer that have found genetic markers for traits such as skin and hair colour in modern humans. In particular, Schmidt relied on the fact that different mutations (known as single nucleotide polymorphisms, or SNPs) in the melanocortin receptor 1 gene are responsible for skin and hair colour. [Image: Redhead - DNA analysis showed this skull belonged to someone with red hair (Image: Susanne Hummel)] "There is a set of SNPs that tells you that a person was a redhead and a different set of markers tell you they were fair skinned." She extracted DNA from ancient human bones up to 3000 years old from three different locations in Germany and looked for these SNPs. Her findings suggest that red hair and fair skin were very uncommon among ancient Germans. Out of a total of 26 people analysed, Schmidt found only one person with red hair and fair skin, a man from the Middle Ages. All the other people had more UV-tolerant skin that tans easily. She said she was excited when she "coloured in" the faces that once covered the skulls, and had even developed "a kind of a personal relationship" with one of them. "It's not so anonymous," she said. "I think this is the reason why people in museums can do reconstruction because our ancestors are not so anonymous any more; they have a face you can look into." Unfortunately the genetic markers Schmidt used could not distinguish which of the ancient humans had blond versus black hair, and she could not determine eye colour. But she said she was confident that this will be possible in a few years.
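Schmidt's inference step - from detected MC1R variants to a coloured-in face - is essentially a lookup against known marker sets. A schematic sketch follows; the variant names are invented placeholders, not the SNPs her thesis actually typed.

# Schematic only: maps sets of detected MC1R variants to phenotype calls.
RED_HAIR_MARKERS = {"mc1r_variant_A", "mc1r_variant_B"}   # placeholders, not real SNP IDs
FAIR_SKIN_MARKERS = {"mc1r_variant_C"}                    # placeholder

def call_phenotype(detected_snps):
    calls = []
    if detected_snps & RED_HAIR_MARKERS:
        calls.append("red hair")
    if detected_snps & FAIR_SKIN_MARKERS:
        calls.append("fair skin")
    # Default matches the study's commonest finding for ancient Germans:
    return calls or ["UV-tolerant skin that tans easily"]

print(call_phenotype({"mc1r_variant_A", "mc1r_variant_C"}))  # ['red hair', 'fair skin']
print(call_phenotype(set()))                                 # ['UV-tolerant skin that tans easily']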
Schmidt said that such research could also be used to build up identikit pictures to help identify skeletons or criminals. The research has been submitted for publication. References 4. http://www.ansoc.uq.edu.au/index.html?page=15259 5. http://www.uni-goettingen.de/?lang=en&PHPSESSID=73500f612e2d0c6d193256491f49401e From shovland at mindspring.com Mon Dec 5 05:13:13 2005 From: shovland at mindspring.com (Steve Hovland) Date: Sun, 4 Dec 2005 21:13:13 -0800 Subject: [Paleopsych] Neurosphere- book of interest Message-ID: Neurosphere: The Convergence of Evolution, Group Mind, and the Internet (Paperback) by Donald P. Dulchinos Editorial Reviews From Publishers Weekly Dulchinos, a manager in the cable television industry and longtime participant in the WELL, one of the first online communities, sees communication technology leading humanity toward global consciousness. Questions of whether the Internet might constitute a "group mind" have been newsgroup fodder for years, supplying a range of online material excerpted here. Dulchinos is also inspired by Teilhard de Chardin, whose concept of "noosphere" has been reworked into "neurosphere" to represent "a mature religious view commensurate with the evolutionary stage at which we find ourselves." Rather than developing a single line of argument, the text presents a collage of metaphysical speculation, punctuated with a touch of whimsy: "We may very well be on the verge of a consistent and simultaneous human experience... the ability to act with a single will. Hitler, among others, exploited this. Consider, on the other hand, that perhaps apparently benign 'personalities' like Madonna and Barney the Dinosaur likewise wield a perverse influence on large populations." Yet Dulchinos maintains the courage of his convictions, hoping to convince others "that each of them, even the most miserable and destitute, is an equally important part of this massively parallel, loosely affiliated, but still cohesive 6-billion-parts-strong Being. All of us together, we are God." From aandrews at hvc.rr.com Mon Dec 5 11:34:28 2005 From: aandrews at hvc.rr.com (Alice Andrews) Date: Mon, 5 Dec 2005 06:34:28 -0500 Subject: [Paleopsych] Re: ginger gene/Neanderthal References: Message-ID: <009f01c5f98f$da389cf0$6401a8c0@callastudios> Hi Frank-- For your Dowd (but not dowdy) meme: http://www.aulis.com/news13.htm all the best! -alice ----- Original Message ----- From: "Premise Checker" To: Sent: Sunday, December 04, 2005 9:45 PM Subject: [Paleopsych] ABC (au): Ancient Germans weren't so fair, 2004.7.16
"Three thousand years ago, nobody was doing painting and there was no photography. We do not know what people looked like," Schmidt told ABC Science Online. She said most images in museums and books were derived from comparisons with living people from the same regions. "For example, when we make a reconstruction of people from Africa we think that they had dark skin or dark hair," she said. "But there's no real scientific information. It's just a guess. It's mostly imagination." She said this had meant, for example, that the reconstruction of Neanderthals had changed over time. "In the 1920s, the Neanderthals were reconstructed as wild people with dark hair and dumb, not really clever," she said. "Today, with the same fossil record, with the same bones and no other information - just a change in ideology - you see reconstructions of people with blue eyes and quite light skin colour, looking intelligent and using tools. "Most of the reconstructions you see in museums are a thing of the imagination of the reconstructor. Our goal is to make this reconstruction less subjective and give them an objective basis with scientific data." Genetic markers for hair colour In research for her recently completed PhD, Schmidt built on research from the fields of dermatology and skin cancer that have found genetic markers for traits such as skin and hair colour in modern humans. In particular, Schmidt relied on the fact that different mutations (known as single nucleotide polymorphisms, or SNPs) in the melanocortin receptor 1 gene are responsible for skin and hair colour. Redhead DNA analysis showed this skull belonged to someone with red hair (Image: Sussane Hummel) "There is a set of SNPs that tells you that a person was a redhead and a different set of markers tell you they were fair skinned." She extracted DNA from ancient human bones as old as 3000 years old from three different locations in Germany and looked for these SNPs. Her findings suggest that red hair and fair skin was very uncommon among ancient Germans. Out of a total of 26 people analysed, Schmidt found only one person with red hair and fair skin, a man from the Middle Ages. All the other people had more UV-tolerant skin that tans easily. She said she was excited when she "coloured in" the faces that once covered the skulls, and had even developed "a kind of a personal relationship" with one of them. "It's not so anonymous," she said. "I think this is the reason why people in museums can do reconstruction because our ancestors are not so anonymous any more; they have a face you can look into." Unfortunately the genetic markers Schmidt used could not distinguish which of the ancient humans had blond versus black hair, and she could not determine eye colour. But, she said she was confident that this will be possible in a few years. Schmidt said that such research could also be used to help build up identikit pictures to help identify skeletons or criminals. The research has been submitted for publication. References 4. http://www.ansoc.uq.edu.au/index.html?page=15259 5. 
From checker at panix.com Tue Dec 6 23:43:54 2005 From: checker at panix.com (Premise Checker) Date: Tue, 6 Dec 2005 18:43:54 -0500 (EST) Subject: [Paleopsych] Thought for Today Message-ID: The immovable truths that are there [in the Eroica Symphony]--and there are truths in the arts as well as in theology--became truths when Beethoven formulated them. They did not exist before. They cannot perish hereafter. --H.L. Mencken, "Brahms," Baltimore Evening Sun, 1926.8.2 From checker at panix.com Wed Dec 7 01:24:42 2005 From: checker at panix.com (Premise Checker) Date: Tue, 6 Dec 2005 20:24:42 -0500 (EST) Subject: [Paleopsych] ABC: Ancient hair gives up its DNA secrets Message-ID: News in Science - Ancient hair gives up its DNA secrets - 22/06/2004 http://www.abc.net.au/science/news/stories/s1135104.htm [This and several following are just articles that the one I just sent linked to. They should also be of interest, though I can't comment on them.] Anna Salleh ABC Science Online Tuesday, 22 June 2004 Analysing DNA from ancient strands of hair is a new tool for learning about the past, molecular archaeologists say, including whether hair samples belonged to Sir Isaac Newton. Dr Tom Gilbert of the [4]University of Arizona led an international team that reported its work in the latest issue of the journal [5]Current Biology. The researchers said they had extracted and sequenced mitochondrial DNA from 12 hair samples, 60 to 64,800 years old, from ancient bison, horses and humans. The researchers said their results confirmed that hair samples previously thought to belong to Sir Isaac Newton were not his, a finding that backed previous isotopic analysis. But the focus of their research was to explore the potential of extracting ancient DNA from hair samples. The most common samples used for ancient DNA analyses are taken from bone, teeth and mummified tissue. Until now, when the hair root hadn't been available for analysis, scientists had thought analysing the hair shaft was of relatively little use as it contained so little DNA. But isolated strands of hair are often the only clues to human habitation in ancient times. Now Gilbert's team said it had developed a method to extract and sequence ancient DNA from hair shafts. The researchers said the ancient DNA in hair was much less degraded than DNA from other tissues. They argued this was because it was protected from water by the hair's hydrophobic keratin, the protein polymer that gives hair its structure. The team also found that hair DNA had a low level of contamination and argued that keratin may protect the DNA from contamination with modern DNA sequences, like DNA from human sweat. The scientists also said that analysing hair DNA, and potentially DNA from other keratin-containing samples like ancient feathers and scales, would minimise the destruction of valuable archaeological samples caused by sampling teeth or bones. Hairy development "It's a nice development," said Dr Tom Loy, an Australian expert in ancient DNA from the [6]University of Queensland.
He said that molecular archaeologists had generally ignored extracting DNA from hair. "[But] on the basis of their article it looks as if it's quite, quite feasible," he told ABC Science Online. He said the method may be useful in shedding light on the origin of strands of ancient hair discovered a decade ago at the Pendejo Cave site in New Mexico. "It would be very important to find out whose hair it was," said Loy, who said previous attempts had been unsuccessful. He was enthusiastic about the idea of being able to extract ancient DNA from feathers. "Often times feathers are found in caves and in some cases as residues on artefacts," he said. But Loy was sceptical about using the method to extract ancient DNA from scales and was not convinced by the argument that keratin protected ancient DNA from contamination. "People still don't fully understand how things get contaminated," he said. References 4. http://www.arizona.edu/ 5. http://www.current-biology.com/ 6. http://www.uq.edu.au/ From checker at panix.com Wed Dec 7 01:24:48 2005 From: checker at panix.com (Premise Checker) Date: Tue, 6 Dec 2005 20:24:48 -0500 (EST) Subject: [Paleopsych] ABC: Ancient DNA may be misleading scientists Message-ID: News in Science - Ancient DNA may be misleading scientists - 18/02/2003 http://www.abc.net.au/science/news/stories/s786146.htm Tuesday, 18 February 2003 Ancient DNA in skeletons has a tendency to show damage in a particular region, resulting in misleading genetic data and mistaken conclusions about the origin of the skeleton, British scientists said. A group of researchers at the [4]Henry Wellcome Ancient Biomolecules Centre of the University of Oxford, in Britain, made the finding while studying Viking specimens. They found that about half of the specimens had DNA that suggested they were of Middle Eastern origin. But more detailed analysis revealed that many of the genetic sequences in the double helix molecule, which carries the genetic information of every individual, were damaged at a key base that separates European sequences from Middle Eastern genetic types - damage which made the skeletons appear to have originated in the Levant. The results are published in the February 2003 issue of the [5]American Journal of Human Genetics. Damage events appear to be concentrated in specific 'hotspots', indicating that a high proportion of DNA molecules can be modified at the same point. These hotspots appear to be in positions that also differ between different human groups. In other words, the DNA damage discovered affects the same genetic positions as evolutionary change. "Now that this phenomenon has been recognised, it is possible to survey the ancient sequences for damage more accurately, and determine the correct original genetic type - opening the way for more reliable future studies," said Professor Alan Cooper, director of the centre. Cooper hopes the finding may have implications for future research. "It also appears that we can use damage caused after death to examine how DNA damage occurs during life - a completely unanticipated, and somewhat ironic result," he said. "Potentially this allows us to get uniquely separate views of the two major evolutionary processes, mutation and selection." Danny Kingsley - ABC Science Online References 4. http://abc.zoo.ox.ac.uk/ 5.
http://www.journals.uchicago.edu/AJHG/home.html From checker at panix.com Wed Dec 7 01:24:52 2005 From: checker at panix.com (Premise Checker) Date: Tue, 6 Dec 2005 20:24:52 -0500 (EST) Subject: [Paleopsych] ABC (au): A faster evolutionary clock? Message-ID: A faster evolutionary clock? http://www.abc.net.au/science/news/stories/s510398.htm [Analogy: when carbon 14 dating was first employed, it put Stonehenge later than the Egyptian pyramids, though archeologists knew in their hearts that this couldn't be true. When the Unchecked Premise - that the cosmic rays which create carbon 14 come in at a steady rate - was Checked, new tables were calibrated by cross-dating from overlapping sets of tree rings. The result was that Stonehenge turned out to be earlier than the pyramids after all. [So maybe these out-of-Africa theories will get considerably revised and theories of raciation before speciation into homo sapiens, like Carleton Coon's, will get revisited. Stay tuned.] Friday, 22 March 2002 A discovery by scientists studying ancient DNA from Antarctic penguins may change our understanding of how fast the tree of life grew. New Zealand scientist, Dr David M. Lambert, and colleagues report in this week's [4]Science on a new method of measuring the rate of DNA evolution. They believe their method of using numerous samples of ancient DNA is much more accurate than the current method of "calibrating" the "molecular clock". The team studied over 20 colonies of Adélie penguins whose home is the ice-free areas of Antarctica. "This is the best source of ancient DNA found yet," said Dr Lambert, of the Institute of Molecular BioSciences at Massey University in Palmerston North. By taking blood samples, Dr Lambert and colleagues were able to analyse a particular segment of genetic material in the mitochondria of the penguins and find two different groups whose DNA differed from each other by 8%. The team then set out to find when the two lineages diverged. Conveniently, right beneath the very feet of the living penguins lay the bones of their long gone ancestors - dating back 7,000 years. The researchers analysed equivalent DNA segments from carbon-dated ancestral penguin bones of nearly 100 different ages ranging from 88 years to around 7,000 years old. By plotting the degree of change in the DNA over time, they estimated a rate of evolution equivalent to 96% per million years. This meant the two groups of penguins diverged 60,000 years ago, in the middle of the last ice age. "This rate is 2 to 7 times faster than previous estimates for this particular segment of mitochondrial DNA," said Dr Lambert. "According to the standard rate of evolution, the penguins diverged 300,000 years ago, which is more than two ice ages ago." The conventional method of calibrating the molecular clock involves measuring the percentage difference between the DNA of two living creatures and comparing it to DNA from a fossil counterpart of one particular age. "This only gives you one data point - a datum - not a distribution of points," said Dr Lambert. "It is not statistically reliable, whereas in our method there is greater confidence in the numbers arrived at." "We believe we've got a more accurate way of measuring the rate of evolution," he said. The findings may or may not have implications for other species. "Maybe the Adélie penguins have evolved particularly fast," speculates Dr Lambert. "We won't know until we apply the method to other species."
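Lambert's calibration amounts to regressing sequence divergence against radiocarbon age across many dated bones, with the slope giving the rate. A minimal sketch with invented data points (not the penguin measurements), which shows why a distribution of points beats a single fossil datum:

# Minimal sketch of the calibration idea: slope of divergence vs bone age.
import numpy as np

ages_yr = np.array([88, 500, 1200, 3000, 5000, 7000])                    # invented carbon-dated ages
divergence = np.array([0.0001, 0.0005, 0.0011, 0.0029, 0.0049, 0.0066])  # invented fraction of sites changed

rate_per_yr, intercept = np.polyfit(ages_yr, divergence, 1)
print(f"estimated rate: {rate_per_yr * 1e6 * 100:.0f}% per million years")
# ~95% per million years with these made-up points, near the 96% the team
# reported for the real data; many dated points give a slope with genuine
# statistical support, unlike a single-fossil calibration.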
Dr Lambert and team now intend to test kiwis, Antarctic fish, the Tuatara (a NZ reptile), and even humans. [Image caption: Cape Adare in Antarctica is the largest colony of Adélie penguins (Pic: J. Macdonald)] However, it won't necessarily be easy since the conditions required for such an approach are quite particular. The penguins in Antarctica were a perfect opportunity because they provided a living population at the same location as dead ancestors, the location was undisturbed by human influence, and the environment was optimal for preserving DNA. "Antarctica is not only cold, but it's drier than a desert," said Dr Lambert. "It's not surprising it was the best source of ancient DNA." "It'll be harder to do it for the other species but we've learnt a lot, and we're going to give it our best shot," he said. If the new faster rate of evolution proves correct for other organisms, it will change our understanding of when different organisms evolved, how fast the tree of life grew and even how different animals responded to environmental change. Anna Salleh - ABC Science Online From checker at panix.com Wed Dec 7 01:24:57 2005 From: checker at panix.com (Premise Checker) Date: Tue, 6 Dec 2005 20:24:57 -0500 (EST) Subject: [Paleopsych] ABC (au): "Out of Africa" in Asia Message-ID: "Out of Africa" in Asia http://www.abc.net.au/science/news/stories/s293857.htm [Now this is earlier than the piece about changing the rate of mutations. It may be stale but perhaps worth revisiting. I append two other articles.] Friday, 11 May 2001 The origins of man debate continues The hotly-debated notion that modern humans arose from Africa and replaced all other populations of early humans across the globe has been bolstered by new research. A genetic study led by Yuehai Ke from Fudan University in China, of more than 12,000 men from 163 populations in East Asia, strongly suggests the so-called "Out of Africa" theory of modern human origins is correct, according to a report in today's [18]Science. The "Out of Africa" model states that anatomically modern humans originated in Africa about 100,000 years ago and then spread out to other parts of the world, where they completely replaced local archaic populations. Among the evidence for this notion are recent DNA tests which ruled out the contribution of primitive humans or Neanderthals to modern Europeans. But others argue that the distribution and morphology of ancient human fossils found in China and other regions of East Asia support a competing theory - that modern humans evolved independently in many regions of the world. Now Yuehai Ke and his team from China, Indonesia, the United States and Britain tested the Y chromosomes of 12,127 East Asian men for the presence of three specific mutations - types of genetic markers. The three mutations are derived from a single earlier mutation in African humans. The team found every individual they tested carried one of the three later mutations and no ancient non-African DNA was found. They therefore rule out even a "minimal contribution" from the primitive East Asian humans to their anatomically modern counterparts. Dr Colin Groves, from the anthropology department at the Australian National University, described the new data from such a large sample of men as "absolutely decisive". "I'm a supporter [of the Out of Africa model] but I can't for the life of me think how any multi-regional model could fit this," he told ABC Science Online.
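The decisive check in the study is a simple screen: does any of the 12,127 Y chromosomes lack all three derived markers? A toy version follows; the marker names reflect published accounts of the study (YAP, M89 and M130, all descended from the African M168 mutation), but the genotypes below are invented.

# Toy screen in the spirit of the study's test: flag any Y chromosome that
# carries none of the three African-derived markers. Sample data invented.
DERIVED_MARKERS = {"YAP", "M89", "M130"}  # per published accounts; treat as labels only

samples = [
    {"YAP"},             # invented genotypes
    {"M89", "other"},
    {"M130"},
]

archaic = [i for i, s in enumerate(samples) if not (s & DERIVED_MARKERS)]
print("Y chromosomes with no derived marker:", archaic or "none")
# "none" is the article's result: no trace of an archaic, non-African Y lineage.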
The new data analysing male genes was "telling exactly the same story" as previously reported data analysing genes in cell structures called mitochondria, passed from one generation to the next via females, he added. Dr Groves' ANU colleague and well-known opponent of the Out of Africa model, Dr Alan Thorne, was not available to comment on the new research. Further controversy on human origins Tuesday, 16 January 2001 [Image caption: Mungo man - analysis of DNA from this fossil announced last week reignited a controversy over the origins of modern humans.] New research supports the theory that the ancestors of modern humans came from many different regions of the world, not just a single area -- but critics remain far from convinced. The study, published in the current issue of [18]Science by University of Michigan anthropologist Milford H. Wolpoff and colleagues, is the second study in a week to fuel the debate on the origin of the human species. Australian researchers set off a storm last week when they announced that their analysis of mitochondrial DNA from 'Mungo Man' also supported the so-called 'regional continuity theory'. Their study is due to be published this month in the Proceedings of the National Academy of Sciences. The study presented in this week's Science comes to the same conclusion following a comparison of early modern and archaic fossil skulls from around the world. "Ancient humans shared genes and behaviours across wide regions of the world, and were not rendered extinct by one 'lucky group' that later evolved into us," says Wolpoff. "The fossils clearly show that more than one ancient group survived and thrived." The researchers analysed the similarities and differences between fossil skulls from Australia and Central Europe - peripheral regions far from Africa, where, according to the dominant "Out of Africa" theory -- also known as the "Eve" or "Replacement" theory -- modern humans evolved. "Basically we wanted to see if this comparison could disprove the theory of multiple ancestry for the early European and Australian moderns," said Wolpoff. The researchers said they found that the most recent European and Australian skulls shared characteristics with the ancient African and Near Eastern population and with the older fossils from within their own regions. They also found there were many more similarities than could be explained by chance alone -- a finding which amounted to "a smoking gun" for the regional continuity theory. The findings are the latest evidence in the continuing scientific controversy about the origin of modern humans (Homo sapiens). Most scientists believe that all living humans can trace their ancestry exclusively to a small group of ancient humans, probably Africans, living around 100,000 years ago. If this theory were true it would mean that all other early human groups, whose fossils date from this time back to almost two million years ago, must have become extinct, possibly wiped out in a prehistoric genetic holocaust. Other scientists, including Wolpoff and Australian National University anthropologist Dr Alan Thorne, maintain that there is little evidence that a small group originating in a single geographic region replaced the entire population of early humans. "In asking the question a different way, and directly addressing the fossils, this study provides compelling evidence that replacement is the wrong explanation," says Wolpoff. "Instead, the findings support the theory of multi-regional evolution.
Modern humans are the present manifestation of an older worldwide species with populations connected by gene flow and the exchange of ideas." Palaeoanthropologist Associate Professor Peter Brown of the University of New England disputes the findings. "I'm amazed that Science has published this article. If it had been submitted to me by a third year student I would have failed them," he told ABC Science Online. Professor Brown said that Wolpoff and colleagues had chosen an Australian fossil that was unrepresentative of the skulls of that time. "It's pathologically different. It has a skull as thick as a bike helmet," he said. "They've just chosen a fossil that suits their theory". He said that the authors had also ignored literature that was contrary to their theory. Dr Alan Thorne, however, insists that the evidence is on his and Wolpoff's side. "What we've found is mitochondrial DNA in an Australian fossil that is much more primitive than anything that's been found in Africa," he said. "And there is no archaeological or physical evidence to support the idea that Aboriginal Australians originated from Africa." Anna Salleh - ABC Science Online Another blow for Out of Africa? 23/02/2001 http://www.abc.net.au/science/news/stories/s250390.htm Friday, 23 February 2001 [Image: Nanjing man] Another Australian study - this time of Chinese fossils - has weighed into the controversy over the origins of modern humans, supporting the theory they evolved in many different regions of the world. Dr Jian-xin Zhao and Professor Ken Collerson from the [18]University of Queensland (UQ) have dated ancient human fossils in China as being at least 620,000 years old - much older than previously thought. The researchers from the Earth Sciences Department say the findings support the theory that Asian populations evolved directly from Asian Homo erectus, rather than evolving from populations out of Africa. A major argument against this regional continuity theory, Dr Zhao told ABC Science Online, is that the age of Homo erectus fossils found in Asia did not allow enough time for Homo sapiens to evolve. "This new date gives plenty of time for Homo erectus to evolve into Homo sapiens," he says. The researchers measured the decay of uranium into thorium in samples of calcite flowstone directly above recently discovered Homo erectus fossils called Nanjing Man, in the Tangshan Cave 250 kilometres northwest of Shanghai. In the past this method has been used to date teeth and bones; however, the researchers say applying it to calcite flowstone samples has provided much more accurate dates and challenged the reliability of using fossil teeth for the purposes of dating. "Age estimates derived from teeth or bones depend very much on how and when uranium was taken up during the fossilisation process, and are often younger than the true ages," Professor Collerson says.
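Uranium-thorium dating exploits 230Th growing into a closed system from its parent 234U. Under the simplest assumptions - no initial thorium, and 234U in secular equilibrium with 238U - the age follows directly from the decay law. The sketch below is that textbook case only; the UQ team's flowstone calculation also corrects for 234U excess and is more involved.

# Simplified U-Th age: activity ratio R = [230Th/234U] = 1 - exp(-lam * t),
# assuming no initial 230Th and 234U/238U secular equilibrium.
import math

TH230_HALF_LIFE_YR = 75_690                 # assumed ~75.7 kyr half-life of thorium-230
LAM = math.log(2) / TH230_HALF_LIFE_YR

def u_th_age(activity_ratio):
    """Age from the [230Th/234U] activity ratio, idealised closed system."""
    if not 0 <= activity_ratio < 1:
        raise ValueError("ratio at or above 1 means equilibrium: beyond the method's range")
    return -math.log(1 - activity_ratio) / LAM

print(f"{u_th_age(0.5):,.0f} years")  # a ratio of 0.5 is one 230Th half-life, ~75,690 yr
# As the ratio nears 1 the age estimate blows up, which is why dates around
# 600,000 years (like Nanjing Man's) sit near the practical limit of the method.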
"In contrast, the University of Queensland dates were derived from dense and pure crystalline flowstone that was closed to uranium and thorium mobility and are therefore more reliable." The findings, developed in collaboration with Dr Kai Hu and Dr Han-kui Xu from Nanjing, were published recently in the international journal Geology. Anna Salleh - ABC Science Online From checker at panix.com Wed Dec 7 01:25:02 2005 From: checker at panix.com (Premise Checker) Date: Tue, 6 Dec 2005 20:25:02 -0500 (EST) Subject: [Paleopsych] CHE: Duping the Brain Into Healing the Body Message-ID: Duping the Brain Into Healing the Body The Chronicle of Higher Education, 5.12.2 http://chronicle.com/weekly/v52/i15/15a01201.htm [I read somewhere that placebos work on dogs, a surprising result, since dogs are immune from the sort of verbal propaganda humans are subject to. One way it could work is this: a dog is given a medicine that has actual medicinal effects. But most medicines don't directly go to whatever part of the body is causing the difficulty. Rather the medicine triggers off a chain of brain and nerve events. If this has happened a good many times, the grooves down the nerve chain (so to speak: I like something more medically-correct) repeatedly the nerve chains deepen. After a while, a smaller dose or even no dose could trigger off the chain and thus work on the dog. I recall a Russian guy named Pavlov who did something like this. [Another thing: I read in _Science_ years and years ago that a major anomaly had been discovered, namely that the placebo effect tends to be proportional to the medicinal effect of the actual medicine, whereas one would think the two would be random with respect to each other. I failed to follow up on the controversy. Can anyone cue me in?] Researchers analyze the mysterious placebo effect By LILA GUTERMAN Washington The placebo effect -- it's all in your head. When you swallow sugar pills instead of powerful medicine and your symptoms disappear, it's all thanks to the power of your mind. How does the brain perform this parlor trick? In the past, scientists suspected that any apparent health benefits from placebos had little more basis in biology than did sleight of hand. In studies of new drugs, patients might tell their doctors they feel better because they think that is what their doctor wants to hear. Or perhaps they would have recovered without any treatment, real or sham. But researchers now know that the placebo effect is real and grounded in the physiology of the brain. Using techniques to peer inside the skull, they have begun to find regions of the brain that respond to placebos, and they have even watched a single nerve cell react to a sham medicine. Those studies show that placebos affect the brain in much the same way that actual treatments do, researchers reported here in November at the annual meeting of the Society for Neuroscience. In other words, the power to treat several troublesome disorders may be wrapped up in the three-pound spongy lump of tissue protected by the skull. The research points to the power of positive thinking -- even at the unconscious level. When the brain expects relief, it can manufacture some on its own. "The things you can change with a positive outlook are profound," says Tor D. Wager, an assistant professor of psychology at Columbia University. "They are deeper physiologically than we have previously appreciated." 
None of the researchers who study the mechanism of the placebo effect suggest that doctors should prescribe dummy pills instead of real medicine. But they say that the study of the placebo effect could change the way scientists perform clinical trials of new treatments and could even alter how we understand and treat pain, Parkinson's disease, and depression. By studying placebos, says Christian S. Stohler, dean of the school of dentistry at the University of Maryland at Baltimore, "you crack into disease mechanisms that might be very important for improving the lives of many pain patients." Fooling the Patient Researchers gained their first glimpse of the causes of the placebo effect in the late 1970s, when scientists discovered that under certain conditions they could cancel the effect. In a study of pain relievers, a drug called naloxone prevented patients on placebo pills from experiencing the usual benefit. Since naloxone blocks the action of painkillers called opioids, researchers figured that placebos must stimulate the brain to produce its own opioids. In the 1990s, another set of experiments provided more evidence that the placebo effect was a real physiological phenomenon. Fabrizio Benedetti, a professor of neuroscience at the University of Turin, and others studied the effect without using a placebo. Dr. Benedetti judged that a placebo's effect comes from the patient's psychosocial context: talking to a doctor, observing the treatment, and expecting improved health. So he took away that context by giving study participants real drugs, but on the sly. Patients were told that they would receive an active drug, a placebo, or nothing through intravenous needles, and consented to get any of the different treatments without knowing when any treatment would be supplied. The scientists compared the results when a doctor overtly gave the patient the drug and when a computer supplied the drug without the patient's knowledge. Bedside manner, it turned out, made a difference: Patients required far more painkiller if they unknowingly received the medicine from a computer. When the doctor gives a drug in full view, Dr. Benedetti said at the neuroscience conference, "there is an additive effect of the drug and of the placebo, the psychosocial component." He suggests that his experimental setup could be extended to become part of the testing procedure for new drugs. Clinical trials could then compare covert and overt administration, rather than comparing the active drug to a placebo. That way, none of the volunteers would go through the trouble of participating without receiving the real experimental treatment, and researchers could still demonstrate that the drug was effective by showing that it reduced symptoms when given covertly. Peering at the Brain With the recent advent of modern brain-scanning techniques, scientists gained the ability to look directly at the regions of the brain involved in the placebo effect. In 2002 researchers in Finland and Sweden published in Science the first brain images of the effect, using a technique called positron emission tomography, better known as PET. The researchers pressed a hot surface onto the hands of nine male volunteers, and then a doctor gave them injections of either a painkiller or a placebo. When the researchers performed PET scans on the men, both the drug and the dummy induced high blood flow -- indicating increased brain activity -- in an area of the brain called the rostral anterior cingulate cortex. 
That area plays a key role in the painkilling effects of opioid drugs. Then in 2004, also in Science, Mr. Wager reported using functional magnetic resonance imaging, or fMRI, to show that a placebo that relieved pain also decreased activity in the brain's pain-sensing areas. Different people felt varying amounts of pain relief from the placebo. The amount of pain reduction a volunteer experienced went hand in hand with the amount of change in activity in the brain. "Part of the effect of a drug," Mr. Wager said at the conference, "is it changes the way you think about drugs." Jon-Kar Zubieta, an associate professor of psychiatry and radiology at the University of Michigan at Ann Arbor, and several colleagues, including Dr. Stohler of the University of Maryland, peered deeper into the brain's workings by finding out where the brain produces opioids in response to placebo treatment. They used PET scans along with a stain that marks opioid activity in the brain. When the researchers gave male volunteers a painful injection of saline solution into their jaw muscles, the scans showed an increase of opioids in the brain. Most of the regions where the brain produced painkillers coincided with the ones that Mr. Wager identified as important. "Expectation releases substances, molecules, in your brain, that ultimately change your experience," says Dr. Stohler. "Our brain is on drugs. It's on our own drugs." The placebo effect helps not only people in pain but also patients with diseases. In fact, scientists got their most detailed look at the placebo effect by studying how single neurons responded to sham drugs given to Parkinson's patients. Parkinson's disease is a motor disorder caused by loss of brain cells that produce dopamine. Some patients experience temporary relief of symptoms from a placebo, and a previous study showed that the relief occurred because the brain produced dopamine in response. Patients who have Parkinson's disease sometimes receive surgery to implant electrodes deep within the brain. The electrodes can stimulate a neuron or record its activity. Dr. Benedetti, of the University of Turin, and his colleagues enrolled 11 patients who underwent surgery for this type of treatment. They gave the patients a placebo injection, telling them it was a powerful drug that should improve their motor control. The researchers then compared the activity of a single neuron before and after injection of the placebo. In the six patients who responded to the placebo -- who demonstrated less arm rigidity and said they felt better -- the rate of firing of the neuron went down. (Nerve cells "fire," or generate electrical impulses, in order to send signals to neighboring neurons.) The neurons' firing rate did not change for people who experienced no placebo effect. Another disorder that shows clinical improvement with placebos is depression. Depressed patients' moods often lift when they take a placebo, although the effect does not last, and they normally need to seek real treatment, according to Helen S. Mayberg, a professor of neurology and of psychiatry and behavioral sciences at Emory University. Dr. Mayberg became immersed in placebo research a few years ago, when she did a PET study of the brain's response to an antidepressant and to a placebo. In her study of 15 depressed men, four who had taken Prozac and four who had received a placebo experienced a remission of their symptoms. At the end of six weeks, after allowing the drug sufficient time to take effect, Dr. Mayberg took PET scans. 
For patients whose symptoms improved, the regions where the brain activity increased after a patient took a placebo formed a subset of the regions that increased after a patient took the true drug. "Drug is placebo plus," she said at the conference. In patients whose symptoms did not improve, whether they were on Prozac or on the placebo, the brain activity did not increase in those regions. She had published the results of that study in 2002, but at the conference she reported a new analysis of her data. In the study, she had also collected brain scans one week after patients had begun receiving their treatments, even though the drug hadn't yet taken its full effect. Still, people whose symptoms later improved, whether they took the placebo or Prozac, again had increased brain activity in similar areas. One week into treatment, she says, the men's state of mind could be interpreted as a "heightened state of expectation" since they were anticipating clinical improvements. Nonresponders did not show those patterns, so such expectation could be key to whether a depressed patient will recover. Raising Expectations Dr. Mayberg would like to find ways to help those who do not respond to antidepressant drugs, and she surmises that expectation could make the difference. Such patients, she says, perhaps should imagine themselves getting well. "What is expectation?" she asks. "How do you cultivate it?" Those are questions that all of the scientists involved in this research would like to answer. Patients with chronic pain, says Dr. Zubieta of Michigan, perhaps have lost the ability to produce the brain's natural painkillers. "If you are able to recruit mechanisms that help you cope with stress or pain, that's a good thing," he says. "The question is, how do things like this, or meditation, or biofeedback, work? We don't know." Dr. Stohler of Maryland agrees: "Getting a person to boost their own machinery to improve health -- that's something that medicine needs to know." It may be especially urgent for patients with dementia, according to Dr. Benedetti. At the conference, he reported preliminary results that patients with Alzheimer's disease may not experience placebo effects at all. He found that Alzheimer's patients felt no difference between overt and hidden administration of painkillers. To Dr. Benedetti, that suggests that the psychological components of treatments -- the expectation of health improvements, and the circuits that such expectations create in the brain -- are absent. Perhaps, he said at the conference, doctors need to take that loss into account when prescribing any drug for Alzheimer's patients. Those patients may need higher doses of many drugs, such as painkillers, if their brain has stopped aiding the drug's action. The mind, it seems, may play a critical role in treating diseases. And its services come free of charge, with no co-payment or deductible. From checker at panix.com Wed Dec 7 01:36:08 2005 From: checker at panix.com (Premise Checker) Date: Tue, 6 Dec 2005 20:36:08 -0500 (EST) Subject: [Paleopsych] NYT: Mapmakers and Mythmakers Message-ID: Mapmakers and Mythmakers http://www.nytimes.com/2005/12/01/business/01maps.html [John Ralston Saul, in his great book Voltaire's Bastards, showed that far from the Enlightenment dream of knowledge for all, knowledge is held secretly, as something to be traded. Lots of big-wheel bureaucrats play it "close to the chest" and are secretive, even when it is patently unnecessary.
So being a Voltaire's bastard is far from rare outside the Soviet Union and, as the story shows, continues in Russia. [The Moscow police back in the bad old days kept CIA maps of their city on their walls, since those publicly available were nearly useless. This was an open secret: at least I heard about it. I also heard that the Soviets could not feed their army, but the Cold Warriors would not report this fact, nor would even the New York Times, which was a critic of a large part of the Cold War. It's amazing what doesn't get reported here, but anyone can now turn to foreign sources on the Web. Not so many as to matter vote-wise, though. And these foreign sources have their own biases. [Pilate's question remains. And he raised it two thousand years before postmodernism! [A good article!] By ANDREW E. KRAMER MOSCOW, Nov. 30 - Bruce Morrow worked for three years on the shores of Lake Samotlor, a tiny dot of water in a maze of oil wells and roads covering more than a thousand square miles of icy tundra in Siberia. From the maps the Russians gave Mr. Morrow, he could never really know where he was, a misery for him as an oil engineer at a joint venture between BP and Russian investors. The latitude and longitude had been blotted out from his maps and the grid diverged from true north. "It was like a game," Mr. Morrow said of trying to make sense of the officially doctored maps, holdovers from the cold war era provided by secretive men who worked in a special department of his company. Unofficially, anyone with Internet access can take a good look at the Samotlor field by zooming down through free satellite-imaging programs like Google Earth, to the coordinates 61 degrees 7 minutes north latitude and 76 degrees 45 minutes east longitude. Mr. Morrow's plight illustrates how some practices that once governed large regions of the former Soviet Union may still lurk in the hallways where bureaucrats from the Communist past cling to power. Not only do they carry over a history of secrecy, but they also serve to continue a tradition of keeping foreigners at bay while employing plenty of people made dependent on Moscow. The misleading maps also reflect the Kremlin's tightening grip on Russian oil, one of the world's critical supplies, and one that is to become even more important in the future with plans for direct shipments to the United States by 2010 from ports in the Far East and the Arctic. The secrecy rule over maps is enforced by the Federal Security Service, or F.S.B., a successor to the old K.G.B. It was written at a time when the Russians were suspicious of virtually all foreign businesses and fearful of a missile strike on their Siberian wells. Those days are gone. But as the Russian government reasserts its control over strategic industries - particularly oil - it is not letting up on the rule. The doctored maps belong to a deep-rooted Russian tradition of deceiving outsiders, going back to the days of Potemkin villages in the 18th century and perhaps earlier. During the cold war it was called maskirovka, Soviet military parlance for deception, disinformation and deceit. For decades, government bureaucrats created false statistics and misleading place names. For instance, Baikonur, the Russian space center, was named for a village hundreds of miles away. Accurate maps of old Moscow's warren of back alleys appeared only after the breakup of the Soviet Union. Even now, Mr. Morrow and his colleagues can use only Russian digital map files that encrypt and hide the coordinates of his location.
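The symptoms Mr. Morrow describes - grids that drift off true north and refuse to tie into neighbouring sheets - are exactly what a per-sheet affine scrambling of coordinates produces. A toy illustration of that idea follows; the actual F.S.B. scheme is not public and is surely more elaborate.

# Toy coordinate obfuscation: each map sheet gets its own secret rotation
# and offset, so sheets no longer share a grid or point to true north.
# Purely illustrative - not the real F.S.B. encryption scheme.
import math

def obfuscate(x_km, y_km, rotation_deg, dx_km, dy_km):
    """Apply a secret per-sheet rotation and offset to true map coordinates."""
    th = math.radians(rotation_deg)
    xr = x_km * math.cos(th) - y_km * math.sin(th)
    yr = x_km * math.sin(th) + y_km * math.cos(th)
    return xr + dx_km, yr + dy_km

# One true point as it would appear on two differently keyed sheets:
print(obfuscate(10.0, 20.0, rotation_deg=3.0, dx_km=120.0, dy_km=-45.0))
print(obfuscate(10.0, 20.0, rotation_deg=-2.0, dx_km=80.0, dy_km=60.0))
# Without the per-sheet keys neither sheet can be tied to latitude and
# longitude, and the two grids will not line up with each other - Mr.
# Morrow's "game".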
Officially, only Russians with security clearances are permitted to see oil field maps with real coordinates at scales greater than 1:2,500.

"It was totally futile," Mr. Morrow said of the false coordinates on his F.S.B. maps, created through an encrypting system. "None of us was particularly keen on pushing it. There were rumors if you do that, you end up in the slammer."

A spokeswoman for the F.S.B. confirmed that it controls maps around sites deemed important for national security, including oil fields. Asked whether the easy availability of accurate maps on the Internet made such continued secrecy obsolete, she said the agency was interested only in national security and would not elaborate on its practices.

Foreign business executives, though, say there is a secret behind the secret maps, and it has little to do with national security. The rules are not only a way to maintain control over a strategic industry, but also form a subtle trade barrier and are a convenient way to increase Russian employment. After all, TNK-BP, the 50-50 joint venture where Mr. Morrow works, pays scores of cartographers to encode and decode the maps, said Frank Rieber, a former engineer there.

The rules cover all oil companies, but are particularly pressing for TNK-BP. They provide a livelihood to hundreds of F.S.B.-licensed cartographers. Oil companies either outsource the work of stripping and restoring coordinates to independent institutes, or employ Russians with security clearances to do the work, as TNK-BP does. The map orientations are shifted from true north - the top of the map could be pointing slightly east, for example - and the grid does not correspond to larger maps. "It makes us pull our hair out," Mr. Rieber said.

Yevgenia M. Albats, author of a 1994 book on the K.G.B., "The State Within a State," said the spy agency's interest in oil field mapping is just another way of asserting its influence on society and business here, though one increasingly made obsolete by the Internet. "The F.S.B. knows about Google Earth as well as anybody," she said. "This doesn't have anything to do with national security. It's about control of the cash flow." The agency is guarding the wells as much from foreign business executives as from foreign missiles these days, she said.

The laws about oil field secrets are used to persuade TNK-BP to replace foreign managers with Russians, more susceptible to pressure from the authorities, Ms. Albats said. "Russians are easier to manipulate," she continued. "They don't want to end up in Khodorkovsky's shoes," she said, referring to the former chief executive of the Yukos oil company, Mikhail B. Khodorkovsky, now in a Siberian penal colony serving an eight-year sentence. He was convicted of fraud and tax evasion after falling out with the Kremlin over taxes, oil-export routes and politics.

The F.S.B. has also pursued scientists who cooperate with foreign companies in other industries. Last winter it charged a physicist, Oskar A. Kaibyshev, with exporting dual-use metal alloy technology to a South Korean company. Mr. Kaibyshev objected in vain that the technology had already been discussed in foreign journals. The case is pending. On Oct. 26, F.S.B. agents arrested three scientists at a Moscow aerospace company and accused them of passing secrets to the Chinese. Another physicist, Valentin V. Danilov, was convicted of selling technology for manned space flights to the same Chinese company last year, though he also protested that the information was available from published sources.
At the same time, the Kremlin is using oil to recapture status lost with the collapse of the Soviet Union, which explains the close attention paid to the industry by the security services. Foreign Minister Sergey V. Lavrov told a Parliament committee in October that energy exports were Russia's most powerful diplomatic tool in relations with other nations, according to a report in the newspaper Nezavisimaya Gazeta.

BP bought into the Tyumen oil company, or TNK, in 2003. Friction over the use of oil field maps existed from early on, geologists at the company said, but intensified this year. The issue has risen to high levels in the government, with a faction that embraces foreign investment protesting that the F.S.B. is hobbling the work of Western engineers who come to help this country drill for oil, providing technology and expertise in the process.

In October, Andrei V. Sharonov, a deputy economic and trade minister, said F.S.B. pressure on the oil venture over the classification of maps had disrupted production in western Siberia, an article in Vedomosti reported. It quoted Mr. Sharonov as saying that the agency was pressing TNK-BP to replace Western managers with Russians. A spokeswoman for Mr. Sharonov declined to comment. An F.S.B. spokeswoman denied any ulterior motives in policing oil field maps.

Engineers call the practice a nuisance, but say it has not disrupted production. The licensed cartographers are skilled in accurately translating between real and false coordinates, and so far, they say, they do not know of any major mistakes.

In a telephone interview from his home in Santa Barbara, Calif., Mr. Morrow, who worked as an engineer for TNK-BP from 2002 until May, said he left partly because he became frustrated with the police controls. He guided a reporter to Lake Samotlor on Google Earth. The lake lies just north of Nizhnevartovsk, a city on the Ob River, as it loops in silvery ribbons through a background of dark green Siberian wilderness. In the middle of the lake is an island, like a bull's-eye.

"That was the folly of it," Mr. Morrow said. "You could get this information anywhere. The bureaucracy got in the way of common sense. But that didn't make it any less illegal, or any less inconvenient."

From shovland at mindspring.com Wed Dec 7 05:07:12 2005
From: shovland at mindspring.com (Steve Hovland)
Date: Tue, 6 Dec 2005 21:07:12 -0800
Subject: [Paleopsych] Francis Crick and panspermia
Message-ID: 

Life on a Meteor Ride

[Image caption: Artist's depiction of the Chicxulub impact crater. The total number of objects a kilometer in diameter or larger, a size that could cause global catastrophe upon Earth impact, is now estimated to range between 900 and 1,230. Credit: NASA]

The British molecular biologist Francis Harry Crick died on Wednesday at the age of 88. Crick changed our understanding of life when, in 1953, he and James Watson announced that DNA came packaged in an elegant double helix structure. Crick reportedly claimed they had found 'the secret of life,' and many scientists agree. The double-helix structure explained how genetic material replicated through nitrogenous base pair bonds. Some see this as the most important development in biology in the 20th century, and Watson and Crick were awarded the Nobel Prize in Medicine for their discovery in 1962.
Crick was not content to rest on his laurels after winning one of the top prizes in science, however. He continued to study the mysteries of life, such as the nature of consciousness, or the possibility that RNA preceded the development of DNA. In 1973, he and the chemist Leslie Orgel published a paper in the journal Icarus suggesting that life may have arrived on Earth through a process called 'Directed Panspermia.'

The Panspermia hypothesis suggests that the seeds of life are common in the universe and can be spread between worlds. This idea originated with the Greek philosopher Anaxagoras, and was later promoted by the Swedish physicist Svante Arrhenius and the British astronomer Fred Hoyle. Versions of this hypothesis have survived to the present day, with the discovery of proposed 'fossil structures' in the martian meteorite ALH84001. In a related project conducted by members of NASA's Astrobiology Institute, scientists have created primitive organic cell-like structures. They did it in their laboratory by duplicating the harsh conditions of cold interstellar space! Did comets carry such protocells to Earth?

'Directed Panspermia' suggests that life may be distributed by an advanced extraterrestrial civilization. Crick and Orgel argued that DNA encapsulated within small grains could be fired in all directions by such a civilization in order to spread life within the universe. Their abstract in the 1973 Icarus paper reads:

"It now seems unlikely that extraterrestrial living organisms could have reached the earth either as spores driven by the radiation pressure from another star or as living organisms imbedded in a meteorite. As an alternative to these nineteenth-century mechanisms, we have considered Directed Panspermia, the theory that organisms were deliberately transmitted to the earth by intelligent beings on another planet. We conclude that it is possible that life reached the earth in this way, but that the scientific evidence is inadequate at the present time to say anything about the probability. We draw attention to the kinds of evidence that might throw additional light on the topic."

[Image caption: The Miller-Urey experiment generated electric sparks -- meant to model lightning -- in a mixture of gases thought to resemble Earth's early atmosphere. Credit: AccessExcellence.org]

Crick and Orgel further expanded on this idea in their 1981 book, 'Life Itself.' They believed there was little chance that microorganisms could be transported between planets and across interstellar distances by random accident. But a technological civilization could direct panspermia by stocking a spacecraft with a genetic starter kit. They suggested that a large sample of different microorganisms with minimal nutritional needs could survive the long journey between worlds.

Many scientists are critical of the Panspermia hypothesis, because it does not try to answer the question of how life first originated. Instead, it passes the responsibility on to another place and another time, offering at best a partial solution to the question.

Crick and Orgel suggested that Directed Panspermia might help resolve some mysteries about life's biochemistry. For instance, it could be the reason why the biological systems of Earth are dependent on molybdenum, when the chemically similar metals chromium and nickel are far more abundant. They suggested that the seeds for life on Earth could have originated from a location far richer in molybdenum.
Other scientists have noted, however, that in seawater molybdenum is more abundant than either chromium or nickel.

Coming full circle to his groundbreaking discovery of DNA's structure, Crick wondered, if life began in the great "primeval soup" suggested by the Miller-Urey experiment, why there wouldn't be a multitude of genetic materials among the different life forms. Instead, all life on Earth shares the same basic DNA structure. Crick and Orgel wrote in their book 'Life Itself': "an honest man, armed with all the knowledge available to us now, could only state that in some sense, the origin of life appears at the moment to be almost a miracle, so many are the conditions which would have had to have been satisfied to get it going."

From checker at panix.com Thu Dec 8 02:20:50 2005
From: checker at panix.com (Premise Checker)
Date: Wed, 7 Dec 2005 21:20:50 -0500 (EST)
Subject: [Paleopsych] Newsweek: Fighting Anorexia: No One to Blame
Message-ID: 

Fighting Anorexia: No One to Blame
http://www.msnbc.msn.com/id/10219756/site/newsweek/

[Interview and an article on "pro-ana" groups appended.]

[It's fascinating how the causes of and blame for this disease get moved around, more so than with most other events and processes that have multiple causes. For me, anorexia is the best current example of a "socially constructed" disease. I do not deny that it is also a medical condition, but we must not think that the brain cannot play an active role. Well, we know that. What right-wingers do not want to admit is that the verbiage we take in shapes these diseases. Their medical model is germs to disease, or bottom-up causation. The wilder pomos say it's strictly mind to disease or, more strictly, verbiage to disease.

[But it's plain that anorexia did not exist, in anything remotely like its current prevalence, until a few decades ago. To blame it on girls aping fashion models is a verbiage account, but that's English-major metaphorism, and the author of the main article knows that. She does not hide the huge hereditary component, but this only means that, *at present*, the heritability is high. The larger historical problem is why the sudden increase.

["Socially constructed" connotes English-major metaphorism in the minds of right-wingers, but it is almost certain, in the case of anorexia, that changes in *society* have far outweighed genetic changes or environmental changes like contaminants. "Constructed," though, implies a constructor or a Social Planner. I'd like a better term.

[Multiple personality disorder is an earlier example of a socially constructed disease. Its entire existence spanned a few decades in the last century. Marianne Noble's _The Masochistic Values of Sentimental Literature_ convinced me that masochism was socially constructed. She's not merely an English major but an English Professor (American U.)! I met her at a party for Sarah's choir and got the book. It was one of the first pomo books I had read, and I found it rough going, though today I've picked up enough of the jargon to sail through it much more quickly.

[I've decided to alter the meme I'm preparing on what it would take for me to abandon my three most cherished hypotheses. The first two, non-creation and co-evolution, will remain, but I'm going to expand the third from the inability to precisely nail down our basic concepts to postmodernism, which includes that. It will be hard enough for me to describe what *I* mean by that term, and harder still to specify what it would take for me to abandon it.
All three, as I work out my thoughts, are part of the broad movement from Western (mechanistic) to Darwinian (stochastic) Civilization.

[As you wait impatiently for my meme, tell me what it would take for you to abandon your three most cherished hypotheses.]

--------------------

The age of their youngest patients has slipped to 9 years old, and doctors have begun to research the roots of this disease. Anorexia is probably hard-wired, the new thinking goes, and the best treatment is a family affair.

By Peg Tyre
Newsweek

Dec. 5, 2005 issue - Emily Krudys can pinpoint the moment her life fell apart. It was a fall afternoon in the Virginia suburbs, and she was watching her daughter Katherine perform in the school play. Katherine had always been a happy girl, a slim beauty with a megawatt smile, but recently, her mother noticed, she'd been losing weight. "She's battling a virus," Emily kept on telling herself, but there, in the darkened auditorium, she could no longer deny the truth. Under the floodlights, Katherine looked frail, hollow-eyed and gaunt. At that moment, Emily had to admit to herself that her daughter had a serious eating disorder. Katherine was 10 years old.

Who could help their daughter get better? It was a question Emily and her husband, Mark, would ask themselves repeatedly over the next five weeks, growing increasingly frantic as Katherine's weight slid from 48 to 45 pounds. In the weeks after the school play, Katherine put herself on a brutal starvation diet, and no one - not the school psychologist, the private therapist, the family pediatrician or the high-powered internist - could stop her. Emily and Mark tried everything. They were firm. Then they begged their daughter to eat. Then they bribed her. We'll buy you a pony, they told her. But nothing worked. At dinnertime, Katherine ate portions that could be measured in tablespoons. "When I demanded that she eat some food - any food - she'd just shut down," Emily recalls. By Christmas, the girl was so weak she could barely leave the couch. A few days after New Year's, Emily bundled her eldest child into the car and rushed her to the emergency room, where she was immediately put on IV. Home again the following week, Katherine resumed her death march. It took one more hospitalization for the Krudyses to finally make the decision they now believe saved their daughter's life. Last February, they enrolled her in a residential clinic halfway across the country in Omaha, Neb. - one of the few facilities nationwide that specialize in young children with eating disorders. Emily still blames herself for not acting sooner. "It was right in front of me," she says, "but I just didn't realize that children could get an eating disorder this young."

Most parents would forgive Emily Krudys for not believing her own eyes. Anorexia nervosa, a mental illness defined by an obsession with food and acute anxiety over gaining weight, has long been thought to strike teens and young women on the verge of growing up - not kids performing in the fourth-grade production of "The Pig's Picnic." But recently researchers, clinicians and mental-health specialists say they're seeing the age of their youngest anorexia patients decline to 9 from 13. Administrators at Arizona's Remuda Ranch, a residential treatment program for anorexics, received so many calls from parents of young children that last year they launched a program for kids 13 years old and under; so far, they've treated 69 of them.
Six months ago the eating-disorder program at Penn State began to treat the youngest ones, too - 20 of them so far, some as young as 8. Elementary schools in Boston, Manhattan and Los Angeles are holding seminars for parents to help them identify eating disorders in their kids, and the parents, who have watched Mary-Kate Olsen morph from a child star into a rail-thin young woman, are all too ready to listen. At a National Institute of Mental Health conference last spring, anorexia's youngest victims were a small part of the official agenda - but they were the only thing anyone talked about in the hallways, says David S. Rosen, a clinical faculty member at the University of Michigan and an eating-disorder specialist. Seven years ago "the idea of seeing a 9- or 10-year-old anorexic would have been shocking and prompted frantic calls to my colleagues. Now we're seeing kids this age all the time," Rosen says.

There's no single explanation for the declining age of onset, although greater awareness on the part of parents certainly plays a role. Whatever the reason, these littlest patients, combined with new scientific research on the causes of anorexia, are pushing the clinical community - and families, and victims - to come up with new ways of thinking about and treating this devastating disease.

Not many years ago, the conventional wisdom held that adolescent girls "got" anorexia from the culture they lived in. Intense young women, mostly from white, wealthy families, were overwhelmed by pressure to be perfect from their suffocating parents, their demanding schools, their exacting coaches. And so they chose extreme dieting as a way to control their lives, to act out their frustration at never being perfect enough. In the past decade, though, psychiatrists have begun to see surprising diversity among their anorexic patients. Not only are anorexia's victims younger, they're also more likely to be black, Hispanic or Asian, more likely to be boys, more likely to be middle-aged. All of which caused doctors to question their core assumption: if anorexia isn't a disease of type-A girls from privileged backgrounds, then what is it?

Although no one can yet say for certain, new science is offering tantalizing clues. Doctors now compare anorexia to alcoholism and depression, potentially fatal diseases that may be set off by environmental factors such as stress or trauma, but have their roots in a complex combination of genes and brain chemistry. In other words, many kids are affected by pressure-cooker school environments and a culture of thinness promoted by magazines and music videos, but most of them don't secretly scrape their dinner into the garbage. The environment "pulls the trigger," says Cynthia Bulik, director of the eating-disorder program at the University of North Carolina at Chapel Hill. But it's a child's latent vulnerabilities that "load the gun."

Parents do play a role, but most often it's a genetic one. In the last 10 years, studies of anorexics have shown that the disease often runs in families. In a 2000 study published in The American Journal of Psychiatry, researchers at Virginia Commonwealth University studied 2,163 female twins and found that 77 of them suffered from symptoms of anorexia. By comparing the number of identical twins who had anorexia with the significantly smaller number of fraternal twins who had it, scientists concluded that more than 50 percent of the risk for developing the disorder could be attributed to an individual's genetic makeup.
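[A note on that twin arithmetic: the article doesn't give the concordance rates behind the "more than 50 percent" figure, but the standard back-of-the-envelope tool for twin studies is Falconer's formula - heritability is roughly twice the gap between identical-twin and fraternal-twin concordance, h2 = 2(rMZ - rDZ). A minimal sketch in Python; the two rates below are hypothetical placeholders, not the Virginia Commonwealth study's numbers, which the article omits:

    # Falconer's rough heritability estimate from twin concordance.
    # Both rates are assumed values, chosen only to illustrate how a
    # "more than 50 percent" figure can fall out of such a comparison.
    def falconer_h2(r_mz, r_dz):
        return 2.0 * (r_mz - r_dz)

    r_mz = 0.55   # assumed concordance among identical twins
    r_dz = 0.27   # assumed concordance among fraternal twins
    print(falconer_h2(r_mz, r_dz))   # 0.56, i.e. "more than 50 percent"

The real studies fit likelihood models rather than this shortcut, but the intuition is the same: the more the identical twins outmatch the fraternal ones, the more of the risk gets ascribed to genes.]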
A few small studies have even isolated a specific area on the human genome where some of the mutations that may influence anorexia exist, and now a five-year, $10 million NIMH study is underway to further pinpoint the locations of those genes.

Amy Nelson, 14, a ninth grader from a Chicago suburb, thinks that genes played a role in her disease. Last year Amy's weight dropped from 105 to a skeletal 77 pounds, and her parents enrolled her in the day program at the Alexian Brothers Behavioral Health Hospital outside Chicago. Over the summer, as Amy was getting better, her father found the diary of his younger sister, who died at 18 of "unknown causes." In it, the teenager had calculated that she could lose 13 pounds in less than a month by restricting herself to less than 600 calories a day. No salt, no butter, no sugar, "not too many bananas," she wrote in 1980. "Depression can run in families," says Amy, "and an eating disorder is like depression. It's something wrong with your brain." These days, Amy is healthier and, though she doesn't weigh herself, thinks she's around 100. She has a part in the school play and is more casual about what she eats, even to the point of enjoying ice cream with friends.

Scientists are tracking important differences in the brain chemistry of anorexics. Using brain scans, researchers at the University of Pittsburgh, led by professor of psychiatry Dr. Walter Kaye, discovered that the level of serotonin activity in the brains of anorexics is abnormally high. Although normal levels of serotonin are believed to be associated with feelings of well-being, these pumped-up levels of the neurotransmitter may be linked to feelings of anxiety and obsessional thinking, classic traits of anorexia. Kaye hypothesizes that anorexics use starvation as a mode of self-medication. How? Starvation prevents tryptophan, an essential amino acid that produces serotonin, from getting into the brain. By eating less, anorexics reduce the serotonin activity in their brains, says Kaye, "creating a sense of calm," even as they are about to die of malnutrition.

Almost everyone knows someone who has trouble with food: extremely picky eating, obsessive dieting, body-image problems, even voluntary vomiting are well known. But in the spectrum of eating disorders, anorexia, which affects about 2.5 million Americans, stands apart. For one thing, anorexics are often delusional. They can be weak with hunger while they describe physical sensations of overfullness that make it physically uncomfortable for them to swallow. They hear admonishing voices in their heads when they do manage to choke down a few morsels. They exercise compulsively, and even when they can count their ribs, their image in the mirror tells them to lose more. When 12-year-old Erin Phillips, who lives outside Baltimore, was in her downward spiral, she stopped eating butter, then started eating with chopsticks, then refused solid food altogether, says her mother, Joann. Within two months, Erin's weight had slipped from 70 to 50 pounds. "Every day, I'd watch her melt away," Joann says. Before it struck her daughter, Joann had been dismissive about the disease. "I used to think the person should just eat something and get over it. But when you see it up close, you can't believe your eyes. They just can't." (Her confusion is natural: the term anorexia comes from a Greek word meaning "loss of appetite.")
Anorexia is a killer - it has the highest mortality rate of any mental illness, including depression. About half of anorexics get better. About 10 percent of them die. The rest remain chronically ill - exhausting, then bankrupting, their parents, retreating from jobs and school, alienating friends as they struggle to manage the symptoms of their condition. Hannah Hartney of Tulsa, Okla., was first hospitalized with anorexia when she was 10. After eight weeks, she was returned to her watchful parents. For the last few years, she was able to maintain a normal weight, but now, at 16, she's been battling her old demons again. "She's not out of the woods," says her mother, Kathryn.

While adults can drift along in a state of semi-starvation for years, the health risks for children under the age of 13 are dire. In their preteen years, kids should be gaining weight. During that critical period, their bones are thickening and lengthening, their hearts are getting stronger in order to pump blood to their growing bodies and their brains are adding mass, laying down new neurological pathways and pruning others - part of the explosion of mental and emotional development that occurs in those years. When children with eating disorders stop consuming sufficient calories, their bodies begin to conserve energy: heart function slows, blood pressure drops and they have trouble staying warm. Whatever estrogen or testosterone they have in their bodies drops. The stress hormone cortisol becomes elevated, preventing their bones from hardening. Their hair becomes brittle and falls out in patches. Their bodies begin to consume muscle tissue. The brain, which depends at least in part on dietary fat to grow, begins to atrophy. Unlike adult anorexics, children with eating disorders can develop these debilitating symptoms within months.

Lori Cornwell says her son's descent was horrifyingly fast. In the summer of 2004, 9-year-old Matthew Cornwell of Quincy, Ill., weighed a healthy 49 pounds. Always a picky eater, he began restricting his food intake until all he would eat was a carrot smeared with a tablespoon of peanut butter. Within three months, he was down to 39 pounds. When the Cornwells and their doctor finally located a clinic that would accept a 10-year-old boy, Lori tucked his limp body under blankets in the back seat of her car and drove all night across the country. Matthew was barely conscious when he arrived at the Children's Hospital in Omaha. "I knew that I had to get there before he slipped away," she says.

With stakes this high, how do you treat a malnourished third grader who is so ill she insists five Cheerios make a meal? First, say a growing number of doctors and patients, you have to let parents back into the treatment process. For more than a hundred years, parents have been regarded as an anorexic's biggest problem, and in 1978, in her book "The Golden Cage," psychoanalyst Hilde Bruch suggested that narcissistic, cold and unloving parents (or, alternatively, hypercritical, overambitious and overinvolved ones) actually caused the disease by discouraging their children's natural maturation to adulthood. Thirty years ago standard treatment involved helping the starving and often delusional adolescents or young women to separate psychologically - and sometimes physically - from their toxic parents. "We used to talk about performing a parental-ectomy," says Dr. Ellen Rome, head of adolescent medicine at the Cleveland Clinic. Too often these days, parents aren't so much banished from the treatment process as sidelined, watching powerlessly as doctors take what can be extreme measures to make their children well.
In hospitals, severely malnourished anorexics are treated with IV drips and nasogastric tubes. In long-term residential treatment centers, an anorexic's food intake is weighed and measured, bite by bite. In individual therapy, an anorexic tries to uncover the roots of her obsession and her resistance to treatment. Most doctors use a combination of these approaches to help their patients get better. Although parents are no longer overtly blamed for their child's condition, says Marlene Schwartz, codirector of the Yale eating-disorder clinic, doctors and therapists "give parents the impression that eating disorders are something the parents did that the doctors are now going to fix."

Worse, the state-of-the-art protocols don't work for many young children. A prolonged stay in a hospital or treatment center can be traumatic. Talk therapy can help some kids, but many others are too young for it to be effective. Back at home, family mealtimes become a nightmare. Parents, advised not to badger their child about food, say nothing - and then they watch, helpless and heartbroken, as their child pushes the food away.

In the last three years, some prominent hospitals and clinics around the country have begun adopting a new treatment model in which families help anorexics get better. The most popular of the home-based models, the Maudsley approach, was developed in the 1980s at the Maudsley Hospital in London. Two doctors there noticed that when severely malnourished, treatment-resistant anorexics were put in the hospital and fed by nurses, they gradually gained weight and began to participate in their own recovery. They decided that given the right support, family members could get anorexics to eat in the same way the nurses did.

These days, family-centered therapy works like this: A team of doctors, therapists and nutritionists meets with parents and the child. The team explains that while the causes of anorexia are unclear, it is a severe, life-threatening disease like cancer or diabetes. Food, the family is told, is the medicine that will help the child get better. Like oncologists prescribing chemotherapy, the team provides parents with a schedule of calories, lipids, carbohydrates and fiber that the patient must eat every day and instructs them on how to monitor the child's intake. It coaches siblings and other family members on how to become a sympathetic support team. After a few practice meals in the hospital or doctor's office, the whole family is sent home for a meal.

"I told my daughter, 'You're going to hate this'," says Mitzi Miles, whose daughter Kaleigh began struggling with anorexia at 10. "She said, 'I could never hate you, Mom.' And I said, 'We'll see'." The first dinner at the Miles home outside Harrisburg, Pa., was a battle - but Mitzi, convinced by Kaleigh's doctor she was doing the right thing, didn't back down. After 45 minutes of yelling and crying, Kaleigh began to eat. Over the next 20 weeks, Kaleigh attended weekly therapy sessions, and Mitzi got support from the medical team, which instructed her to allow Kaleigh to make more food choices on her own. Eleven months later, Kaleigh is able to maintain a normal weight. Mitzi no longer measures out food portions or keeps a written log of her daily food intake.

Critics point out that the Maudsley approach won't work well for adults who won't submit to other people's making their food choices. And they charge that in some children, parental oversight can do more harm than good.
Young anorexics and their parents are already locked in a battle for control, says Dr. Alexander Lucas, an eating-disorder specialist and professor emeritus at the Mayo Clinic in Minnesota. The Maudsley approach, he says, "may backfire" by making meals into a battleground. "The focus on weight gain," he says, "has to be between the physician and the child." Even proponents say that family-centered treatment isn't right for everyone: families where there is violence, sexual abuse, alcoholism or drug addiction aren't good candidates. But several studies both in clinics at the Maudsley Hospital and at the University of Chicago show promising results: five years after treatment, more than 70 percent of patients recover using the family-centered method, compared with 50 percent who recover by themselves or using the old approaches. Currently, a large-scale NIH study of the Maudsley approach is underway. Mental-health specialists say the success of the family-centered approach is finally putting the old stigmas to rest. "An 8-year-old with anorexia isn't in a flight from maturity," says Dr. Julie O'Toole, medical director of the Kartini Clinic in Portland, Ore., a family-friendly eating-disorder clinic. "These young patients are fully in childhood." Most young anorexics, O'Toole says, have wonderful, thoughtful, terribly worried parents. These days, when a desperately sick child enters the Kartini Clinic, O'Toole tries to set parents straight. "I tell them it's a brain disorder. Children don't choose to have it and parents don't cause it." Then she gives the parents a little pep talk. She reminds them that mothers were once blamed for causing schizophrenia and autism until that so-called science was debunked. And that the same will soon be true for anorexia. At the conclusion of O'Toole's speech, she says, parents often weep. Ironically, family dinners are one of the best ways to prevent a vulnerable child from becoming anorexic. Too often, dinner is eaten in the back seat of an SUV on the way to soccer practice. Parents who eat regular, balanced meals with their children model good eating practices. Family dinners also help parents spot any changes in their child's eating habits. Dieting, says Dr. Craig Johnson, director of the eating-disorder program at Laureate Psychiatric Hospital in Tulsa, triggers complex neurobiological reactions. If you have anorexia in the family and your 11-year-old tells you she's about to go on a diet and is thinking about joining the track team, says Johnson, "you want to be very careful about how you approach her request." For some kids, innocent-seeming behavior carries enormous risks. Children predisposed to eating disorders are uniquely sensitive to media messages about dieting and health. And their interpretation can be starkly literal. When Ignatius Lau of Portland, Ore., was 11 years old, he decided that 140 pounds was too much for his 5-foot-2 frame. He had heard that oils and carbohydrates were fattening, so he became obsessed with food labels, cutting out all fats and almost all carbs. He lost 32 pounds in six months and ended up in a local hospital. "I told myself I was eating healthier," Ignatius says. He recovered, but for the next three years suffered frequent relapses. "I'd lose weight again and it would trigger some of my old behaviors, like reading food labels," he says. These days he knows what healthy feels like. Ignatius, now 17, is 5 feet 11, 180 pounds, and plays basketball. Back in Richmond, Va., Emily Krudys says her family has changed. 
For two months Katherine stayed at the Omaha Children's Hospital, and slowly gained weight. Emily stayed nearby - attending the weekly therapy sessions designed to help integrate her into Katherine's treatment. After Katherine returned home, Emily home-schooled her while she regained her strength. This fall, Katherine entered sixth grade. She's got the pony, and she's become an avid horsewoman, sometimes riding five or six times a week. She's still slight, but she's gaining weight normally by eating three meals and three or four snacks a day. But the anxiety still lingers. When Katherine says she's hungry, Emily has been known to drop everything and whip up a three-course meal. The other day she was startled to see her daughter spreading sour cream on her potato. "I thought, 'My God, that's how regular kids eat all the time'," she recalls. Then she realized that her daughter was well on the way to becoming one of those kids.

With Karen Springen, Ellise Pierce, Joan Raymond and Dirk Johnson

Live Talk Transcript: Fighting Anorexia - Newsweek Society - MSNBC.com
http://www.msnbc.msn.com/id/10216848/site/newsweek/

NEWSWEEK general editor Peg Tyre joined us for a Live Talk on this week's anorexia cover story on Thursday, Dec. 1. Anorexia, which affects 2.5 million Americans, isn't simply an eating disorder - it's a mental illness with a higher mortality rate than even depression. Patients who starve and deny themselves essential nutrients can cause long-term damage to their bodies. The disease's youngest victims, who are getting younger and younger, are also its most vulnerable. NEWSWEEK's Peg Tyre reports that the face of anorexia is no longer just the "type-A girls from privileged backgrounds" who confront pressures from parents, schools or coaches. Instead, they are more likely to be minorities, boys or middle-aged. There's also a genetic link to this disease, much like alcoholism and depression. As for treatment, researchers are saying parents need to be part of the process, instead of being viewed as contributing to the disease. Tyre, a NEWSWEEK general editor, will answer your questions on anorexia during a Live Talk on Thursday, Dec. 1, at noon ET.

Peg Tyre: Hi All, Peg Tyre here. I'm the author of No One To Blame - Newsweek's cover story on anorexia. I'll try and answer your questions in the next hour. P.

_______________________

Brooklyn, NY: When an individual gets anorexia, is it a disease that just comes up all of a sudden, or is it a disease that they have had for years but that had not turned up until something triggered it?

Peg Tyre: What I learned is that many people seem to have a latent vulnerability to the disease that is triggered by environmental factors. In terms of symptoms, many victims I talked to reported that it seemed to "come out of the blue." Others said it had been building for a long time.

_______________________

Midland, GA: I was anorexic, in and out of hospitals and doctors' offices for numerous years. Though it was not easy, I have now learned how not to obsess about food and weight. Actually, I am now trying valiantly to gain a few pounds! I have an 18-month-old daughter. What I would like to know is if there are any behaviors that we as parents need to avoid in raising her as a healthy, happy girl, and how I can start teaching her how to love herself and have a healthy body image.

Peg Tyre: Congratulations! It sounds like you have done what many anorexics long to do - put it in their past! And congrats, too, on starting a family.
We all worry about our children and their eating. It's such a primal concern. But for you, it will be a bit more fraught. Anorexia, as you probably know, tends to run in families. So you're going to have to keep a sharp eye on her. But, and here's the hard part, you are going to have to find a way to be normal (at least in front of her) about food. If I were you, I'd find a good therapist who you can discuss this with - you'll have so many questions as your daughter grows and goes through different phases. Good luck!

_______________________

Indialantic, FL: Hi, do you have any sense of how funding for this terrible disease compares to other disorders such as AIDS?

Peg Tyre: I was astonished at how few research dollars are actually being spent on eating disorders. I think because of the heavy stigma that is placed on families, most families of anorexics tend to lie low and suffer in silence instead of coming out and trying to raise money. These families often think (and are sometimes told) it is something they caused!

_______________________

Oklahoma City, OK: What percentage of teens are affected with anorexia?

Peg Tyre: Good question. The answer is that there are no good numbers for eating disorders. There is no central reporting on eating disorders and very little follow-up over time. That said, the rate of anorexia is and always has been low - less than 1%. For eating disorders in general, the rates are much higher.

_______________________

St. George's, Grenada: Is it likely for a person suffering from anorexia to die?

Peg Tyre: Anorexia can be a fatal disease for many people. Some studies say 10% of them die, some studies say 20%, some say 5% every decade. Mostly they die of suicide or starvation.

_______________________

Honolulu, HI: Which treatment centers in the US use the Maudsley approach?

Peg Tyre: There are very good programs at the Univ. of Chicago, at the Comprehensive Eating Disorder Program at the Lucile Packard Children's Hospital in Palo Alto, Calif., at Columbia University in NY and at Mount Sinai, also in NY.

_______________________

Austin, TX: In covering this story, did you encounter any information about the insurance industry and its willingness to cover the expense of treatment for eating disorders? In my experience, which was years ago, there was almost no coverage. Just wondering if the new information about biological connections has helped with this.

Peg Tyre: Many families shared their struggles with their insurance companies, which, by and large, don't recognize this disease and don't pay for treatment the way they might.

_______________________

Summerville, SC: What advice can you give to parents of an anorexic who is no longer a teenager and refuses to go to doctors' appointments or therapy? My daughter went through treatment at 14 and went into recovery in about 6 months. After a relatively healthy 3 years, she is struggling and dipping in and out of relapse. It is just so hard when she is making her own decisions now, and isn't open to my parental advice.

Peg Tyre: I'm sorry. That sounds like a very difficult situation. My advice to you would be to get in touch with Cynthia Bulik, a professor at UNC in Chapel Hill, and ask her for advice. She is an ED specialist.

_______________________

Charlotte, NC: How come it's nobody's fault if a kid is anorexic, but parents, society, and supersized sandwiches and biggie fries are responsible for childhood obesity? These are symptoms of the same thing, a whacked-out relationship with food.
Obesity occurs in families, too, and starts before 10 years old. The people with "eating disorders" as described in this article are just the skinny victims. Clearly the implication is that there is blame to go around for fat.

Peg Tyre: You raise some interesting points. I'm not sure, though, about connecting anorexia to obesity in this way. If you had a kid who ate without stopping until they died - who heard voices telling them to eat more - who refused to move so that they wouldn't burn a calorie - that might be the flip side of anorexia. Obesity is a different animal than what we are talking about with anorexia.

_______________________

Columbia, PA: I eat a meal a day and have for years, and I always thought I may have anorexia, but I'm not hungry; that is why I eat one meal. Is this anorexia, and can it be involuntary?

Peg Tyre: I think most anorexics would tell you that it is involuntary. It is not something they are doing. I don't know you or your medical history, and I'm not a doctor. I also can't see you, so I don't know if your bones are showing. But if you are worried about it, ask your physician. Describe your eating patterns. He or she should be able to tell you quickly enough.

_______________________

Houston, TX: Did you find anyone investigating anorexia possibly being linked to PANDAS (pediatric autoimmune neuropsychiatric disorders associated with streptococcal infections)? Some groups have been investigating sudden and dramatic onset.

Peg Tyre: Glad you brought this up. This is a really interesting area of research that I simply didn't have space for. There are doctors who have made links between kids getting strep or a bacterial infection and then coming down with anorexia. They have also tied PANDAS, as it is called, to the onset of obsessive-compulsive disorder. It is really outside the box to suggest that a bacterial infection (or perhaps its treatment) may be causing these profound behavioral and neurological changes. But I think it is an exciting avenue of inquiry. It is about time doctors started to take a fresh look at it!

_______________________

Pittsburgh, PA: I am in my 30s and suffered severe anorexia. I was treated at Remuda Ranch. Although this is an outstanding article, it is important to note that the family situations described by Hilde Bruch in The Golden Cage, e.g. controlling, narcissistic parents, ARE still relevant for some patients. In the opinion of my doctors and therapists, incl. those at RR, my ED was caused in large part by my family situation. All of the points made in the article, e.g. genetic susceptibility, are valid. However, I would caution that in some patients a family-based treatment approach, e.g. the Maudsley method, is not suitable. My father hit me with a belt when I would not eat. Clearly the method of parental control of meals that is described would have been completely inappropriate in my case, and undoubtedly in others also. Thank you for the good information in the article.

Peg Tyre: I'm very sorry you had to endure what you did. It is heartbreaking to hear about it. You make a good point - and one I tried to emphasize in the article - the family-based method is clearly not right for every family, especially for those with a history of addiction or violence. However, it does offer some new hope for an old, intractable problem. Good luck to you!
_______________________

Minneapolis, MN: Since anorexia is a brain disorder, likened to depression and alcoholism for its genetic predetermination to some degree, has there been any research on the use of anti-depressants, mood stabilizers, and/or anti-psychotics as a way to aid the symptomatology of this disorder?

Peg Tyre: I haven't found any good long-term studies that suggest that anti-depressants or other psychoactive drugs are helpful. That said, I know clinicians often prescribe anti-depressants/anti-anxiety drugs to anorexics. Often, anorexics suffer from depression or anxiety, and I guess some doctors are trying to treat both.

_______________________

Ft. Worth, TX: After suffering from and overcoming anorexia, I still face severe anxiety and depression. Is this just because of my genetic makeup? What can I do to combat these issues and feel accepted by my family? (I am currently 19 years old and attend a university.)

Peg Tyre: There are good studies that show that anorexics often suffer from depression and anxiety as well. Both of those conditions are treatable with the right drugs and a good therapist. Find a good doctor. (University health services should be able to refer you.)

_______________________

Indialantic, FL: A follow-up question, please. Who are the people to contact to get involved in a serious fundraising effort, including corporations that may want to consider sponsorship?

Peg Tyre: I think the National Eating Disorder Association is probably your best bet.

_______________________

Rochester, NY: Let's say there are two girls. One has been anorexic for 15+ years, the other girl for six years. Both try to get better but always fall back into their old habits. Would the 1st girl be considered chronic and the other one not ready? Or would they both be considered chronic? There is no definition of what constitutes chronic anorexia, so if you could answer my question, it would be great.

Peg Tyre: I think they would both be considered chronic.

_______________________

Greensboro, NC: Why are there only a handful of clinics worldwide to help those with this disease? And why are they so expensive?

Peg Tyre: There are more than a handful, but you are right, most of them are very, very expensive. It is a difficult disease to treat - many parents end up re-mortgaging the house to get their kids in treatment. I'm surprised more families don't lobby for better coverage from their insurance carriers.

_______________________

Peg Tyre: I want to put in a plug here for the ongoing NIMH study which is looking at the role genetics plays in anorexia. If you have anorexia, and you think it might run in your family, and you want to be part of an important study that will cost you nothing and may help future generations - you can go to www.angenetics.org or phone 412-647-9794 to get more information about it.

_______________________

Oklahoma City, OK: This is a 9th grade class called Basic Life Skills. We have been studying eating disorders. We wrote questions to ask; here is one: Do anorexics still feel hungry, or do they become immune to the pain of not eating?

Peg Tyre: Good question: some say that they are not hungry. In fact, they feel full - one said she felt "like I just ate two Thanksgiving dinners" almost all the time. Others feel hungry but ignore it until their body stops sending them the "I'm hungry" message all the time.

_______________________

Philadelphia, PA: I have a friend of the family whose 19-year-old daughter has an eating disorder due to anxiety and compulsive behavior.
The mother's problem is finding adequate care for her age group and then fighting with the insurance companies to pay for an extensive period of time in a facility. Right now she has upgraded her insurance to $1,200 a month to pay for another 30 days of treatment. This has hit the family hard financially because they have a co-pay on top of this. They plan on taking a second mortgage to pay to keep their daughter well. Is there any help for these kids and families?

Peg Tyre: This is such a big problem. Why don't you get in touch with NEDA? They might have resources to help you. There is also a small foundation I've heard of called the Freed Foundation which may have some $$.

_______________________

Silver Spring, MD: The "pro-ana" movement appears to be flourishing among various Internet communities, often with about 4-6 new "members" per day. Given that children are fairly technologically advanced, is there any research on what impact this peer support network is having on treatment?

Peg Tyre: I don't know if I'd call it a movement. I guess you are talking about those websites where very sick, delusional anorexics write defiantly about wanting to be thin. What most people don't realize is that for most people, anorexia isn't a lifestyle choice. It is a mental illness, and like alcoholism, it is often characterized by denial. And yes, for many young people (and older as well) denial feeds denial. What most people fail to remember, though, is that these "pro-ana" types are just in the throes of a terrible, debilitating disease.

_______________________

Rockville, MD: Your article states that "Not only are anorexia's victims younger, they're also more likely to be black, Hispanic or Asian, more likely to be boys, more likely to be middle-aged." What documentation or statistical information do you have to back up this statement?

Peg Tyre: There aren't a lot of good surveys on this - but I spoke to about two dozen clinicians around the country. What I found is that their patient base has really changed - younger, less white, sometimes older as well.

_______________________

Greenfield, IN: I was anorexic and bulimic when I was in middle and high school. I bottomed out at 59 pounds. I got therapy and seemed to be doing better. But as I gained the weight back it almost killed me, and I would eat and then feel so guilty that I would force myself to throw up. I still have the urge to throw up every time I eat something. I have stopped eating when I can get away with it, and if I can't, I want to throw up afterwards. Sometimes I still go to the bathroom, turn on the water and throw up. I don't know what to do, and I don't want to tell my boyfriend for fear he will be upset and worry. I can't do that to him.

Peg Tyre: Thanks for writing. As you know, eating disorders can be a chronic problem, and it sounds like you're still doing battle with yours. You must feel very isolated and alone. Why don't you get back in touch with that therapist - or get in touch with one of the clinics or experts I mentioned earlier. They might be able to help you. Good luck.

_______________________

Worcester, MA: I know that these articles on anorexia are focusing on biological predispositions to it. It somewhat bothered me how strong the point was, mostly in the Berrien article, that parents didn't cause the anorexia. It bothers me because it seems that parents could read this and feel relieved of any responsibility and not examine their own behaviors.
I am currently trying to recover from anorexia, which I'm pretty sure surfaced when I was more an adult than a child. And I do believe that I was predisposed to it. However, I have come to see how having a mentally ill sibling, and his outbursts toward me and my parents' responses to both him and me, really ignited this. It's not meant to blame others or take responsibility off of me, but it helps me see that it's not such a shock my anorexia surfaced. So basically, my question to you is: shouldn't parents not just focus on "fixing" the child and seeing the child as the problem, but also examine whether the child is an indication of a larger family problem? Like I said, I'm just afraid these articles will foster misunderstanding and further the stigma that the child is somehow "defective" all on his/her own.

Peg Tyre: You raise a really good point here. Eating disorders are often a result of genetic vulnerabilities, but there are often environmental triggers that set them off. And families can pull those triggers (heck, they MAKE the triggers). Saying that there may be a genetic component doesn't let families off the hook. The point I want to make is that scientists don't believe this is something that you are choosing to have. And they don't believe that this is something most parents gave you on purpose. And like any serious mental or physical disease, your entire family can play some role in helping you get better.

_______________________

Vernal, UT: I am asking several questions. I am devastated: I just started suspecting something was going on with my daughter's eating habits. I went to the grocery store and saw the Newsweek cover yesterday. I bought the magazine. My husband and I read it last night. Now we are sure there is something going on. I am frozen in fear about confronting her and knowing where to go from here.

Peg Tyre: Please get in touch with some of the experts and facilities quoted in the story and in this live chat. My thoughts are with you.

_______________________

Washington, DC: Is the current treatment environment beginning to adapt to the changing trends mentioned in this article? As a 26-year-old female with anorexia, I see a lack of specialized treatment programs that serve individuals outside of the "common onset" age/gender group. Do you know of any programs that are specifically serving younger or older individuals, or males with this disorder?

Peg Tyre: The children's hospital in Omaha treats younger kids; so does the ED program at Penn State. Remuda Ranch, a residential program, treats younger kids now, too.

_______________________

Effingham, IL: How much influence has "Hollywood" had on the younger children afflicted with anorexia? Since such stars as Lindsay Lohan parade the fact that they are thinner, does that say to the younger fans that they should do the same?

Peg Tyre: These are not good role models for our children. Do they give them unhealthy ideas about the body? Yes. Do they give kids unhealthy ideas about eating? Yes. Do they cause eating disorders? In some cases yes, but in other cases, the causes are more deep-seated.

_______________________

Evanston, IL: Hi Peg. First, congratulations on a fabulous, well-researched and -written story. I'm not given to crying (being a guy), but at several points tears welled up in my eyes. My question is - and I was appalled to read this - why do you and the experts think this is such an intractable condition, with, at 10%, the highest mortality rate of any mental disorder?

Peg Tyre: Thanks very much.
I think anorexics are difficult to treat because the disease affects their brain chemistry and ultimately their ability to think logically about themselves. (Starvation does that.) Death rate? For all the boasting on the pro-ana websites about it being a lifestyle choice, it is really a miserable life filled with isolation and loneliness and frustration. My heart goes out to the sufferers.

_______________________

Alexandria, VA: I nearly died of anorexia in 1995. I recovered only after being sent to Remuda Ranch in Arizona. Now, ten years later, I still fight the disease every single day. I am five months pregnant and wonder if there is any help for recovered anorexics who are pregnant? Gaining weight for the baby has been a constant battle. What are your suggestions?

Peg Tyre: There are good support groups online - also, a good therapist might help.

_______________________

Kansas City, MO: I'm 19 and have struggled with anorexia for eight years. I was first hospitalized when I was 12 and have altogether been in inpatient treatment five times. I got home from treatment two months ago, which I left against medical advice. Now I feel as though I am doing well with food: I eat three meals a day (that are about half of what my dietitian's meal plan for me calls for) and drink two Ensure Plus. Everyone around me is saying that I need to be back inpatient based solely on my weight right now, though, and I'm desperately confused. Is being 5'4" and 90 pounds really that much in need of help?

Peg Tyre: I'm not qualified to say what you should weigh, but your doctor is. Part of your disease is not being able to make good judgments about how much food you should eat and what you should weigh. Find professionals you trust. Then trust them.

_______________________

New York, NY: Are there any programs you know of that treat boys?

Peg Tyre: The children's hospital in Omaha may be able to help you.

_______________________

Silver Spring, MD: Please, please recommend Lock and Le Grange's book Help Your Teenager Beat an Eating Disorder - it's a great resource for information on family-based treatment. Laura Collins's book Eating With Your Anorexic and her website www.eatingwithyouranorexic.com are also wonderful resources. Thanks so much for this important article. My 14-year-old daughter recovered using family-based treatment, and it is such a joy to see her happy and healthy again.

Peg Tyre: Right - if you are interested in the Maudsley method, please check out Laura Collins's book Eating With Your Anorexic. It is terrific, brave, heartwarming and very helpful in understanding what families of anorexics go through. She also has a website for family support, www.eatingwithyouranorexic.com

AP: Pro-anorexia movement has cult-like appeal
Experts alarmed by Web sites that promote self-starvation
http://www.msnbc.msn.com/id/8045047/

Updated: 1:38 p.m. ET May 31, 2005

CHICAGO - They call her "Ana." She is a role model to some, a goddess to others - the subject of drawings, prayers and even a creed. She tells them what to eat and mocks them when they don't lose weight. And yet, while she is a very real presence in the lives of many of her followers, she exists only in their minds.

Ana is short for anorexia, and - to the alarm of experts - many who suffer from the potentially fatal eating disorder are part of an underground movement that promotes self-starvation and, in some cases, has an almost cult-like appeal.
Followers include young women and teens who wear red Ana bracelets and offer one another encouraging words of "thinspiration" on Web pages and blogs. They share tips for shedding pounds and faithfully report their "cw" and "gw" - current weight and goal weight, which often falls into the double digits. They also post pictures of celebrity role models, including teen stars Lindsay Lohan and Mary-Kate Olsen, who last year set aside the acting career and merchandising empire she shares with her twin sister to seek help for her own eating disorder. "Put on your Ana bracelet and raise your skinny fist in solidarity!" one "pro-Ana" blogger wrote shortly after Olsen entered treatment. The movement has flourished on the Web and eating disorder experts say that, despite attempts to limit Ana's online presence, it has now grown to include followers - many of them young - in many parts of the world. No one knows just how many of the estimated 8 million to 11 million Americans afflicted with eating disorders have been influenced by the pro-Ana movement. But experts fear its reach is fairly wide. A preliminary survey of teens who've been diagnosed with eating disorders at the Lucile Packard Children's Hospital at Stanford University, for instance, found that 40 percent had visited Web sites that promote eating disorders. "The more they feel like we - 'the others' - are trying to shut them down, the more united they stand," says Alison Tarlow, a licensed psychologist and supervisor of clinical training at the Renfrew Center in Coconut Creek, Fla., a residential facility that focuses on eating disorders. Experts say the Ana movement also plays on the tendency people with eating disorders have toward "all or nothing thinking." "When they do something, they tend to pursue it to the fullest extent. In that respect, Ana may almost become a religion for them," says Carmen Mikhail, director of the eating disorders clinic at Texas Children's Hospital in Houston. She and others point to the "Ana creed," a litany of beliefs about control and starvation, that appears on many Web sites and blogs. At least one site encourages followers to make a vow to Ana and sign it in blood. People with eating disorders who've been involved in the movement confirm its cult-like feel. "People pray to Ana to make them skinny," says Sara, a 17-year-old in Columbus, Ohio, who was an avid organizer of Ana followers until she recently entered treatment for her eating disorder. She spoke on the condition that her last name not be used. 'Helping girls kill themselves' Among other things, Sara was the self-proclaimed president of Beta Sigma Kappa, dubbed the official Ana sorority and "the most talked about, nearly illegal group" on a popular blog hosting service that Sara still uses to communicate with friends. She also had an online Ana "boot camp" and told girls what they could and couldn't eat. "I guess I was attention-starved," she now says of her motivation. "I really liked being the girl that everyone looked up to and the one they saw as their 'thinspiration.' But then I realized I was helping girls kill themselves." For others, Ana is a person - a voice that directs their every move when it comes to food and exercise. "She's someone who's perfect. It's different for everyone - but for me, she's someone who looks totally opposite to the way I do," says Kasey Brixius, a 19-year-old college student from Hot Springs, S.D.
To Brixius - athletic with brown hair and brown eyes - Ana is a wispy, blue-eyed blonde. "I know I could never be that," she says, "but she keeps telling me that if I work hard enough, I CAN be that." Treatment often fails Dr. Mae Sokol often treats young patients in her Omaha, Neb., practice who personify their eating disorder beyond just Ana. To them, bulimia is "Mia." And an eating disorder often becomes "Ed." "A lot of times they're lonely and they don't have a lot of friends. So Ana or Mia become their friend. Or Ed becomes their boyfriend," says Sokol, who is director of the eating disorders program run by Children's Hospital and Creighton University. In the end, treatment can include writing "goodbye" letters to Ana, Mia and Ed in order to gain control over them. But it often takes a long time to get to that point - and experts agree that, until someone with an eating disorder wants to help themselves, treatment often fails. Tarlow, at the Renfrew Center, says it's also easy for patients to fall back into the online world of Ana after they leave treatment. "Unfortunately," she says, "with all people who are in recovery, it's so much about who you surround yourself with." Some patients, including Brixius, the 19-year-old South Dakotan, have had trouble finding counselors who truly understand their struggle with Ana. "I'd tell them about Ana and how she's a real person to me. And they'd just look at me like I'm nuts," Brixius says of the counselors she's seen at college and in her hometown. "They wouldn't address her ever again, so it got very frustrating. Half the time I'm, like, 'You know what? I give up.'" Other days, she's more hopeful. "I gotta snap out of this eventually if I want to have kids and get a job. One day, I'll get to that point," she says, pausing. "But I'll always obsess about food." From checker at panix.com Thu Dec 8 02:20:55 2005 From: checker at panix.com (Premise Checker) Date: Wed, 7 Dec 2005 21:20:55 -0500 (EST) Subject: [Paleopsych] NYT: Snared in the Web of a Wikipedia Liar Message-ID: Snared in the Web of a Wikipedia Liar http://www.nytimes.com/2005/12/04/weekinreview/04seelye.html [An excellent summary of the issues. What the article didn't say is that votes can be taken on articles on issues that others would not like to see the light of day, such as Jewish ethnocentrism. I should think it unlikely for Jews not to have assimilated themselves out of existence without being ethnocentric, and indeed there have been books by Jews urging their co-religionists to have more children. The suitability of the article was discussed at length on a Wikipedia forum and was nixed, on the grounds that the topic should be handled in a general one on ethnocentrism. [As regards Mr. Seigenthaler's alleged role in the Kennedy assassination, I would never take this to be an established fact and may have come to doubt the rest of the article as well. But, looking at it just now, the Kennedy reference having been excised, I see no reason to doubt its facts. [What's really great is that I can get a good summary of reigning theories. I failed to find an article that answered a question I often ask - why there are emotions - but I just glanced at some entries. Nor was I successful in getting a rundown of the various theories of elites. Many articles there, and one of them may do the trick. But there are other cases where Wikipedia had just what I wanted. [On the other hand, standard reference sources have their biases, too.
I praise Jimbo for his work!] Rewriting History By KATHARINE Q. SEELYE ACCORDING to Wikipedia, the online encyclopedia, John Seigenthaler Sr. is 78 years old and the former editor of The Tennessean in Nashville. But is that information, or anything else in Mr. Seigenthaler's biography, true? The question arises because Mr. Seigenthaler recently read about himself on Wikipedia and was shocked to learn that he "was thought to have been directly involved in the Kennedy assassinations of both John and his brother Bobby." "Nothing was ever proven," the biography added. Mr. Seigenthaler discovered that the false information had been on the site for several months and that an unknown number of people had read it, and possibly posted it on or linked it to other sites. If any assassination was going on, Mr. Seigenthaler (who is 78 and did edit The Tennessean) wrote last week in an op-ed article in USA Today, it was of his character. The case triggered extensive debate on the Internet over the value and reliability of Wikipedia, and more broadly, over the nature of online information. Wikipedia is a kind of collective brain, a repository of knowledge, maintained on servers in various countries and built by anyone in the world with a computer and an Internet connection who wants to share knowledge about a subject. Literally hundreds of thousands of people have written Wikipedia entries. Mistakes are expected to be caught and corrected by later contributors and users. The whole nonprofit enterprise began in January 2001, the brainchild of Jimmy Wales, 39, a former futures and options trader who lives in St. Petersburg, Fla. He said he had hoped to advance the promise of the Internet as a place for sharing information. It has, by most measures, been a spectacular success. Wikipedia is now the biggest encyclopedia in the history of the world. As of Friday, it was receiving 2.5 billion page views a month, and offering at least 1,000 articles in 82 languages. The number of articles, already close to two million, is growing by 7 percent a month. And Mr. Wales said that traffic doubles every four months. Still, the question of Wikipedia, as of so much of what you find online, is: Can you trust it? And beyond reliability, there is the question of accountability. Mr. Seigenthaler, after discovering that he had been defamed, found that his "biographer" was anonymous. He learned that the writer was a customer of BellSouth Internet, but that federal privacy laws shield the identity of Internet customers, even if they disseminate defamatory material. And the laws protect online corporations from libel suits. He could have filed a lawsuit against BellSouth, he wrote, but only a subpoena would compel BellSouth to reveal the name. In the end, Mr. Seigenthaler decided against going to court, instead alerting the public, through his article, "that Wikipedia is a flawed and irresponsible research tool." Mr. Wales said in an interview that he was troubled by the Seigenthaler episode, and noted that Wikipedia was essentially in the same boat. "We have constant problems where we have people who are trying to repeatedly abuse our sites," he said. Still, he said, he was trying to make Wikipedia less vulnerable to tampering. He said he was starting a review mechanism by which readers and experts could rate the value of various articles. The reviews, which he said he expected to start in January, would show the site's strengths and weaknesses and perhaps reveal patterns to help them address the problems. 
In addition, he said, Wikipedia may start blocking unregistered users from creating new pages, though they would still be able to edit them. The real problem, he said, was the volume of new material coming in; it is so overwhelming that screeners cannot keep up with it. All of this struck close to home for librarians and researchers. On an electronic mailing list for them, J. Stephen Bolhafner, a news researcher at The St. Louis Post-Dispatch, wrote, "The best defense of the Wikipedia, frankly, is to point out how much bad information is available from supposedly reliable sources." Jessica Baumgart, a news researcher at Harvard University, wrote that there were librarians voluntarily working behind the scenes to check information on Wikipedia. "But, honestly," she added, "in some ways, we're just as fallible as everyone else in some areas because our own knowledge is limited and we can't possibly fact-check everything." In an interview, she said that her rule of thumb was to double-check everything and to consider Wikipedia as only one source. "Instead of figuring out how to 'fix' Wikipedia - something that cannot be done to our satisfaction," wrote Derek Willis, a research database manager at The Washington Post, who was speaking for himself and not The Post, "we should focus our energies on educating the Wikipedia users among our colleagues." Some cyberexperts said Wikipedia already had a good system of checks and balances. Lawrence Lessig, a law professor at Stanford and an expert in the laws of cyberspace, said that contrary to popular belief, true defamation was easily pursued through the courts because almost everything on the Internet was traceable and subpoenas were not that hard to obtain. (For real anonymity, he advised, use a pay phone.) "People will be defamed," he said. "But that's the way free speech is. Think about the gossip world. It spreads. There's no way to correct it, period. Wikipedia is not immune from that kind of maliciousness, but it is, relative to other features of life, more easily corrected." Indeed, Esther Dyson, editor of Release 1.0 and a longtime Internet analyst, said Wikipedia may, in that sense, be better than real life. "The Internet has done a lot more for truth by making things easier to discuss," she said. "Transparency and sunlight are better than a single point of view that can't be questioned." For Mr. Seigenthaler, whose biography on Wikipedia has since been corrected, the lesson is simple: "We live in a universe of new media with phenomenal opportunities for worldwide communications and research, but populated by volunteer vandals with poison-pen intellects." From checker at panix.com Thu Dec 8 02:20:59 2005 From: checker at panix.com (Premise Checker) Date: Wed, 7 Dec 2005 21:20:59 -0500 (EST) Subject: [Paleopsych] Stay Free: Mark Crispin Miller on conspiracies, media, and mad scientists Message-ID: Mark Crispin Miller on conspiracies, media, and mad scientists http://www.stayfreemagazine.org/archives/19/mcm.html [I had included this at the end of a posting on the theft of the 2004 election. But this is so thought-provoking that I'm sending it out separately. The author, being a "leftist," of course does not see the statist propaganda that underlies public education, propaganda that is so relentless and continuous that it is not even noticed as such. [Jacques Ellul's _Propaganda: The Forming of Men's Attitudes_ (1962, English translation, 1965) should be reread, for it argued the necessity that propaganda be relentless. 
He was speaking more specifically of the propaganda for "The American Way." We all know the ironic Depression-era photograph of men and women in bread lines underneath a huge poster with a happy couple and a car that read "There is no way like the American Way." But why beat a drum for what is obviously beneficial (which it wasn't during the Depression), and why did the propaganda continue through the Eisenhower years? And why is there propaganda to "celebrate diversity," whose stated benefits include only ethnic cooking and folk dancers jumping up and down, when people go to ethnic restaurants on their own initiative without any prompting whatsoever?] Interview by Carrie McLaren | Issue #19 After years of dropping Mark Crispin Miller's name in Stay Free!, I figured it was time to interview him. Miller is, after all, one of the sharpest thinkers around. His writings on television predicted the cult of irony-or whatever you call it when actual Presidential candidates mock themselves on Saturday Night Live, when sitcoms ridicule sitcoms, and when advertisements attack advertising. More recently, he has authored The Bush Dyslexicon, aided by his humble and ever-devoted assistant (me). Miller works at New York University in the Department of Media Ecology. Though he bristles at being called an academic, Miller is exactly the sort of person who should be leading classrooms. He's an excellent speaker, with a genius for taking cultural products-be they Jell-O commercials or George W. Bush press conferences-and teasing out hidden meanings. (He's also funny, articulate, and knows how to swear.) I talked to Mark at his home in November, between NPR appearances and babysitting duty. He is currently writing about the Marlboro Man for American Icons, a Yale University Press series that he also happens to be editing. His book Mad Scientists: Paranoid Delusion and the Craft of Propaganda (W. W. Norton) is due out in 2004.-CM STAY FREE: Let's start with a simple one: Why are conspiracy theories so popular? MCM: People are fascinated by the fundamental evil that seems to explain everything. Lately, this is why we've had the anomaly of, say, Rupert Murdoch's Twentieth Century Fox releasing films that feature media moguls as villains out to rule the world-villains much like Rupert Murdoch. Who's a bigger conspirator than he is? And yet he's given us The X-Files. Another example: Time Warner released Oliver Stone's JFK, that crackpot-classic statement of the case that American history was hijacked by a great cabal of devious manipulators. It just so happens that Stone himself, with Time Warner behind him, was instrumental in suppressing two rival projects on the Kennedy assassination. These are trivial examples of a genuine danger, which is that those most convinced that there is an evil world conspiracy tend to be the most evil world conspirators. STAY FREE: Because they know what's inside their own heads? MCM: Yes and no. The evil that they imagine is inside their heads-but they can't be said to know it, at least not consciously. What we're discussing is the tendency to paranoid projection. Out of your own deep hostility you envision a conspiracy so deep and hostile that you're justified in using any tactics to shatter it. If you look at those who have propagated the most noxious doctrines of the twentieth century, you will find that they've been motivated by the fierce conviction that they have been the targets of a grand conspiracy against them.
Hitler believed he was fighting back, righteously, against "the Jewish world conspiracy." Lenin and Stalin both believed they were fighting back against the capitalist powers-a view that had some basis in reality, of course, but that those Bolsheviks embraced to an insane degree. (In 1941, for example, Stalin actually believed that England posed a greater danger to the Soviet Union than the Nazis did.) We see the same sort of paranoid projection among many of the leading lights of our Cold War-the first U.S. Secretary of Defense, James Forrestal, who was in fact clinically insane; the CIA's James Angleton; Richard Nixon; J. Edgar Hoover; Frank Wisner, who was in charge of the CIA's propaganda operations worldwide. Forrestal and Wisner both committed suicide because they were convinced the Communists were after them. Now, there was a grain of truth to this since the Soviet Union did exist and it was a hostile power. But it wasn't on the rise, and it wasn't trying to take over the world, and it certainly wasn't trying to destroy James Forrestal personally. We have to understand that there was just as much insanity in our own government as there was with the Nazis and the Bolsheviks. This paranoid dynamic did not vanish when the Cold War ended. The U.S. is now dominated, once again, by rightists who believe themselves besieged. And the same conviction motivates Osama bin Laden and his followers. They see themselves as the victims of an expansionist Judeo-Christianity. STAY FREE: Al Qaeda is itself a conspiracy. MCM: Yes. We have to realize that the wildest notions of a deliberate plot are themselves tinged with the same dangerous energy that drives such plots. What we need today, therefore, is not just more alarmism, but a rational appraisal of the terrorist danger, a clear recognition of our own contribution to that danger, and a realistic examination of the weak spots in our system. Unfortunately, George W. Bush is motivated by an adolescent version of the same fantasy that drives the terrorists. He divides the whole world into Good and Evil, and has no doubt that God is on his side-just like bin Laden. So how can Bush guide the nation through this danger, when he himself sounds dangerous? How can he oversee the necessary national self-examination, when he's incapable of looking critically within? In this sense the media merely echoes him. Amid all the media's fulminations against al Qaeda, there has been no sober accounting of how the FBI and CIA screwed up. Those bureaucracies have done a lousy job, but that fact hasn't been investigated because too many of us are very comfortably locked into this hypnotic narrative of ourselves as the good victims and the enemy as purely evil. STAY FREE: There's so much contradictory information out there. Tommy Thompson was on 60 Minutes the other night saying that we were prepared for biological warfare, that there was nothing to worry about. Yet The New York Times and The Wall Street Journal have quoted experts saying the exact opposite. Do you think this kind of confusion contributes to conspiratorial thinking? I see some conspiratorial thinking as a normal function of getting along in the world. When, on September 11th, the plane in Pennsylvania went down, there was lots of speculation that the U.S. military shot it down. MCM: Which I tend to think is true, by the way. I've heard from some folks in the military that that plane was shot down. STAY FREE: But we have no real way of knowing, no expertise.
MCM: Yes, conspiratorial thinking is a normal response to a world in which information is either missing or untrustworthy. I think that quite a few Americans subscribe to some pretty wild notions of what's going on. There's nothing new in this, of course. There's always been a certain demented plurality that's bought just about any explanation that comes along. That explains the centuries-old mythology of anti-Semitism. There will always be people who believe that kind of thing. To a certain extent, religion itself makes people susceptible to such theorizing. STAY FREE: How so? MCM: Because it tends to propagate that Manichean picture of the universe as split between the good people and "the evil-doers." Christianity has spread this vision-even though it's considered a heresy to believe that evil is an active force in God's universe. According to orthodox Christianity, evil is not a positive force but the absence of God. STAY FREE: A lot of religious people believe what they want to believe, anyway. Christianity is negotiable. MCM: Absolutely. But when it comes to the paranoid world view, all ethical and moral tenets are negotiable, just as all facts are easily disposable. Here we need to make a distinction. On the one hand, there have been, and there are, conspiracies. Since the Cold War, our government has been addicted to secrecy and dangerously fixated on covert action all around the world. So it would be a mistake to dismiss all conspiracy theory. At the same time, you can't accept everything-that's just as naïve and dangerous as dismissing everything. Vincent Bugliosi, who wrote The Betrayal of America, is finishing up a book on the conspiracy theories of the Kennedy assassination. He has meticulously gone through the case and has decided that the Warren Report is right. Now, Bugliosi is no knee-jerk debunker. He recognizes that a big conspiracy landed George W. Bush in the White House. STAY FREE: So I take it you don't buy the conspiracy theories about JFK? MCM: I think there's something pathological about the obsession with JFK's death. Some students of the case have raised legitimate questions, certainly, but people like Stone are really less concerned with the facts than with constructing an idealized myth. STAY FREE: Critics of the war in Afghanistan have called for more covert action as an alternative to bombing. That's an unusual thing for the left to be advocating, isn't it? MCM: It is. On the one hand, any nation would appear to be within its rights to try to track down and kill these mass murderers. I would personally prefer to see the whole thing done legally, but that may not be realistic. So, if it would work as a covert program without harm to any innocents, I wouldn't be against it. But that presumes a level of right-mindedness and competence that I don't see in our government right now. I don't think that we can trust Bush/Cheney to carry out such dirty business. Because they have a paranoid world-view-just like the terrorists-they must abuse their mandate to "do what it takes" to keep us safe. By now they have bombed more innocents than perished in the World Trade Center, and they're also busily trashing many of our rights. The "intelligence community" itself, far from being chastened by its failure, has used the great disaster to empower itself. That bureaucracy has asked for still more money, but that request is wholly disingenuous. They didn't blow it because they didn't have enough money-they blew it because they're inept!
They coasted along for years in a cozy symbiosis with the Soviet Union. The two superpowers needed one another to justify all this military and intelligence spending, and it made them complacent. Also, they succumbed to the fatal tendency to emphasize technological intelligence while de-emphasizing human intelligence. STAY FREE: Yeah, the Green Berets sent to Afghanistan are equipped with all sorts of crazy equipment. They each wear gigantic puffy suits with pockets fit to carry a GPS, various hi-tech gizmos, and arms. MCM: That's just terrific. Meanwhile, the terrorists used boxcutters! STAY FREE: Did you see that the U.S. Army has asked Hollywood to come up with possible terrorist scenarios to help prepare the military for attack? MCM: Yeah, it sent a chill right through me. If that's what they're reduced to doing to protect us from the scourge of terrorism, they're completely clueless. They might as well be hiring psychics-which, for all we know, they are! STAY FREE: The Bush administration also asked Al Jazeera, the Arab TV station, to censor its programming. MCM: Right. And, you know, every oppressive move we make, from trying to muzzle that network to dropping bombs all over Afghanistan, is like a gift to the terrorists. Al Jazeera is the only independent TV network in the Arab world. It has managed to piss off just about every powerful interest in the Middle East, which is a sign of genuine independence. In 1998, the network applied for membership in the Arab Press Union, and the application was rejected because Al Jazeera refused to abide by the stricture that it would do everything it could to champion "Arab brotherhood." STAY FREE: What do you think our government should have done instead of bombing? MCM: I rather wish they had responded with a little more imagination. Doing nothing was not an option. But bombing the hell out of Afghanistan was not the only alternative-and it was a very big mistake, however much it may have gratified a lot of anxious TV viewers in this country. By bombing, the U.S. quickly squandered its advantage in the propaganda war. We had attracted quite a lot of sympathy worldwide, but that lessened markedly once we killed Afghan civilians by the hundreds, then the thousands. Americans have tended not to want to know about those foreign victims. But elsewhere in the world, where 9/11 doesn't resonate as much, the spectacle of all those people killed by us can only build more sympathy for our opponents. That is, the bombing only helps the terrorists in the long run. And so does our government's decision to define the 9/11 crimes as acts of war. That definition has served only to exalt the perpetrators, who should be treated as mass murderers, not as soldiers. But the strongest argument against our policy is this-that it is exactly what the terrorists were hoping for. Eager to accelerate the global split between the faithful and the infidels, they wanted to provoke us into a response that might inflame the faithful to take arms against us. I think we can agree that, if they wanted it, we should have done something else. STAY FREE: You've written that, before the Gulf War, Bush the elder's administration made the Iraqi army sound a lot more threatening than it really was. Bush referred to Iraq's scanty, dwindling troops as the "elite Republican guard." Do you think that kind of exaggeration could happen with this war?
MCM: No, because the great given in this case is that we are rousing ourselves from our stupor and dealing an almighty and completely righteous blow against those who have hurt us. Now we have to seem invincible, whereas ten years ago, they wanted to make us very scared that those Iraqi troops might beat us. By terrorizing us ahead of time, the Pentagon and White House made our rapid, easy victory seem like a holy miracle. STAY FREE: Let's get back to conspiracy theories. Do people ever call you a conspiracy theorist? MCM: Readers have accused me of paranoia. People who attacked me for The Bush Dyslexicon seized on the fact that my next book is subtitled Paranoid Delusion and the Craft of Propaganda, and they said, "He's writing about himself!" But I don't get that kind of thing often because most people see that there's a lot of propaganda out there. I don't write as if people are sitting around with sly smiles plotting evil-they're just doing their jobs. The word propaganda has an interesting history, you know. It was coined by the Vatican. It comes from propagare, which means grafting a shoot onto a plant to make it grow. It's an apt derivation, because propaganda only works when there is fertile ground for it. History's first great propagandist was St. Paul, who saw himself as bringing the word of God to people who needed to hear it. The word wasn't pejorative until the First World War, when the Allies used it to refer to what the Germans did, while casting their own output as "education," or "information." There was a promising period after the war when it got out that our government had done a lot of lying. The word propaganda came to connote domestic propaganda, and there were a number of progressive efforts to analyze and debunk it. But with the start of World War II, propaganda analysis disappeared. Since we were fighting Nazi propaganda with our own, it wasn't fruitful to be criticizing propaganda. STAY FREE: I read that the word "propaganda" fell out of fashion among academics around that time, so social scientists started referring to their work as "communications." It was no longer politically safe to study how to improve propaganda. MCM: Experts in propaganda started doing "communications" studies after the war. Since then, "communication" has been the most common euphemism used for "propaganda," as in "political communication." There's also "psychological warfare" and, of course, "spin." The Cold War was when "propaganda" became firmly linked to Communism. "Communist propaganda" was like "tax-and-spend Democrats" or "elite Republican guard." The two elements were inseparable. If the Communists said it, it was considered propaganda; and if it was propaganda, there were Communists behind it. Only now that the Cold War is over is it possible to talk about U.S. propaganda without running the risk of people looking at you funny. The word does still tend to be used more readily in reference to liberals or Democrats. The right was always quick to charge Bill Clinton-that leftist!-with doing propaganda. In fact, his right-wing enemies, whose propaganda skills were awesome, would routinely fault him for his "propaganda." You never heard anybody say Ronald Reagan was a master propagandist, though. He was "the Great Communicator." STAY FREE: Talk a bit about how conspiracy is used to delegitimize someone who's doing critical analysis. I've heard you on TV saying, "I don't mean to sound like a conspiracy theorist, but . . . " People even do this in regular conversation.
A friend of mine was telling me about going to Bush's inauguration in D.C. He was stunned that none of the protests were covered by the media but prefaced his comments by saying, "I don't want to sound like a conspiracy theorist, but [the press completely ignored the protests]." It's almost as if people feel the need to apologize if they don't follow some party line. MCM: I wouldn't say that, because there are people who are conspiracy theorists. And I think the emphasis there should not be on the conspiracy but on the theory. A theorist is a speculator. It's always much easier to construct a convincing conspiracy theory if you don't bother looking at reality. The web is filled with stuff like this. So, if you want to cover yourself, you should say something like: "I don't subscribe to every crackpot notion that comes along, but in this case there's something funny going on-and here's the evidence." It really is a rhetorical necessity. Especially when you're on TV. STAY FREE: Maybe it's more of a necessity, too, when you're talking about propaganda. MCM: I'll tell you something: it's necessary when you're talking about real conspiracies. You know who benefited big time from the cavalier dismissal of certain conspiracies? The Nazis. The Nazis were expert at countering true reports of their atrocities by recalling the outrageous lies the Allies had told about the Germans back in World War I. The Allies had spread insane rumors about Germans bayoneting Belgian babies, and crucifying Canadian soldiers on barn doors, and on and on. So, when it first got out that the Nazis were carrying out this horrible scheme, their flacks would roll their eyes and say, "Oh yeah-just like the atrocity stories we heard in WWI, right?" STAY FREE: I once attended a lecture on Channel One [an advertising-funded, in-school "news" program], where a professor dissected several broadcasts. He talked about how Channel One stories always emphasize "oneness" and individuality. Collective efforts or activism are framed in a negative sense, while business and governmental sources are portrayed positively and authoritatively. Now, someone listening to this lecture might say, "That's just your reading into it. You sound conspiratorial." So where do you think this sort of media analysis or literary analysis and conspiracy-mongering intersect? MCM: That's a very good question. For years I've encountered the same problem as a professor. You've got to make the point that any critical interpretation has to abide by the rules of evidence-it must be based on a credible argument. If you think I'm "reading into it," tell me where my reading's weak. Otherwise, grant that, since the evidence that I adduce supports my point, I might be onto something. Where it gets complicated with propaganda is around the question of intention, because an intention doesn't have to be entirely conscious. The people who make ads, for example, are embedded in a larger system; they've internalized its imperatives. So they may not be conscious intellectually of certain moves they make. If you said to somebody at Channel One, "You're hostile to the collective and you insult the individual," he'd say, reasonably, "What are you talking about? I'm just doing the news." So you have to explain what ideology is. I'm acutely sensitive to this whole problem.
When I teach advertising, for example, I proceed by using as many examples as possible, to show that there is a trend, whatever any individual art director or photographer might insist about his or her own deliberate aims. Take liquor advertising, which appeals to the infant within every alcoholic by associating drink with mother's milk. This is clearly a deliberate strategy because we see it in ad after ad-some babe holding a glass of some brew right at nipple level. She's invariably small-breasted so that the actual mammary does not upstage the all-important product. If that's an accident, it's a pretty amazing accident. Now, does this mean that the ad people sit down and study the pathology of alcoholics, or is it something they've discovered through trial and error? My point is that it ultimately makes no difference. We see it over and over-and if I can show you that, according to experts, this visual association speaks to a desire in alcoholics, a regressive impulse, then you have to admit I have a point. Of course, there are going to be people who'll accuse you of "reading into it" no matter what you say because they don't want to hear the argument. This is where we come up against the fundamental importance of anti-intellectualism on the right. They hate any kind of explanation. They feel affronted by the very act of thinking. I ran into this when I promoted The Bush Dyslexicon on talk shows-which I could do before 9/11. Bush's partisans would fault me just for scrutinizing what he'd said. STAY FREE: I recently read Richard Hofstadter's famous essay about political paranoia. He argued that conspiracy is not specific to any culture or country. Would you agree with that, or do you think there is something about America that makes it particularly hospitable to conspiracy theories? MCM: Well, there's a lot of argument about this. There's a whole school of thought that holds that England's Civil War brought about a great explosion of paranoid partisanship. Bernard Bailyn's book The Ideological Origins of the American Revolution includes a chapter on the peculiar paranoid orientation of the American revolutionaries. But I think paranoia is universal. It's an eternal, regressive impulse, and it poses a special danger to democracy. STAY FREE: Why, specifically, is it dangerous to democracy? MCM: Because democracies have always been undone by paranoia. You cannot have a functioning democracy where everyone is ruled by mutual distrust. A democratic polity requires a certain degree of rationality, a tolerance of others, and a willingness to listen to opposing views without assuming people are out to kill you. There's a guy named Eli Sagan who wrote a book on the destructive effect of paranoia on Athenian democracy. And I think that the American experiment may also fail; America has always come closest to betraying its founding principles at moments of widespread xenophobic paranoia. In wartime, people want to sink to their knees and feel protected. They give up thinking for themselves-an impulse fatal to democracy but quite appropriate for fascism and Stalinism. The question now is whether paranoia can remain confined to that thirty-or-so percent of the electorate who are permanently crazy. That's what Nixon himself said, by the way-that "one third of the American electorate is nuts." About a third of the German people voted for the Nazis. I think there's something to that. It's sort of a magic number.
STAY FREE: Come to think of it, public opinion polls repeatedly show that 70% of the public are skeptical of advertising claims. I guess that means about 30% believe anything. MCM: Wow. I wonder if that lack of skepticism toward advertising correlates in any way with this collective paranoia. That would be interesting to know. STAY FREE: Well, during the Gulf War, a market research firm conducted a study that found that the more hawkish people were, the more likely they were to be rampant consumers. Warmongers, in other words, consumed more than peaceniks. Why do you think these two reactions might be correlated? MCM: One could argue that this mild, collective paranoia often finds expression in promiscuous consumption. Eli Sagan talks about the "paranoidia of greed" as well as the "paranoidia of domination." Both arise out of suspicion of the enemy. You either try to take over all his territory forcibly, or you try to buy everything up and wall yourself within the fortress of your property. STAY FREE: Those two reactions also practically dominate American culture. When people from other countries think of America, they think of us being materialistic and violent. We buy stuff and kill people. Do you think there's any positive form of paranoia? Any advantage to it? MCM: No, I don't, because paranoids have a fatal tendency to look for the enemy in the wrong place. James Angleton of the CIA was so very destructive because he was paranoid. I mean, he should have been in a hospital-and I'm not being facetious. Just like James Forrestal, our first defense secretary. These people were unable to protect themselves, much less serve their country. I think paranoia is only useful if you're in combat and need to be constantly ready to kill. Whether it's left-wing or right-wing paranoia, the drive is ultimately suicidal. STAY FREE: Our government is weak compared to the corporations that run our country. What role do you see for corporations in the anti-terrorist effort? MCM: Well, corporations do largely run the country, and yet we can't trust them with our security. The private sector wants to cut costs, so you don't trust them with your life. Our welfare is not uppermost in their minds; our money is. So what role can the corporations play? STAY FREE: They can make the puffy suits! MCM: The puffy suits and whatever else the Pentagon claims to need. Those players have a vested interest in eternal war. STAY FREE: Did you read that article about Wal-Mart? After September 11, sales shot up for televisions, guns, and canned goods. MCM: Paranoia can be very good for business. STAY FREE: Have you ever watched one of those television news shows that interpret current events in terms of Christian eschatology? They analyze everyday events as signs of the Second Coming. MCM: No. I bet they're really excited now, though. I wonder what our president thinks of that big Happy Ending, since he's a born-again. You know, Reagan thought it was the end times. STAY FREE: But those are minority beliefs, even among born-again Christians. MCM: It depends on what you mean by "minority." Why are books by Tim LaHaye selling millions? He's a far-right fundamentalist, co-author of a series of novels all about the end times-the Rapture and so on. And Pat Robertson's best-seller, The New World Order, sounds the same apocalyptic note. STAY FREE: He's crazy. He can't really believe all that stuff. MCM: No, he's crazy and therefore he can believe that stuff.
His nurse told him years ago that he was showing symptoms of paranoid schizophrenia. STAY FREE: I recently read a chapter from Empire of Conspiracy-an intelligent book about conspiracy theories. But it struck me that the author considered Vance Packard, who wrote The Hidden Persuaders, a conspiracy theorist. Packard's book was straightforward journalism. He interviewed advertising psychologists and simply reported their claims. There was very little that was speculative about it. MCM: The author should have written about Subliminal Seduction and the other books by Wilson Bryan Key. STAY FREE: Exactly! That nonsense about subliminal advertising was a perfect example of paranoid conspiracy. Yet he picked on Vance Packard, who conducted his research as any good journalist would. MCM: Again, we must distinguish between idle, lunatic conspiracy theorizing and well-informed historical discussion. There have been quite a few conspiracies in U.S. history-and if you don't know that, you're either ignorant or in denial. Since 1947, for example, we have conspiratorially fomented counter-revolutions and repression the world over. That's not conspiracy theory. That's fact-which is precisely why it meets the charge of speculation. How better to discredit someone than to say she's chasing phantoms-or that she has an axe to grind? When James Loewen's book Lies Across America was reviewed in The New York Times, for example, the reviewer said it revealed an ideological bias because it mentions the bombing of civilians in Vietnam. Loewen wrote back a killer letter to the editor pointing out that he had learned about those bombings from The New York Times. Simply to mention such inconvenient facts is to be dismissed as a wild-eyed leftist. When someone tells me I'm conspiracy-mongering, I usually reply, "It isn't a conspiracy, it's just business as usual." STAY FREE: That's like what Noam Chomsky says about his work: "This is not conspiracy theory, it is institutional analysis." Institutions do what is necessary to assure the survival of the institution. It's built into the process. MCM: That's true. There's a problem with Chomsky's position, though-and I say this with all due respect because I really love Chomsky. When talking about U.S. press coverage, Chomsky will say that reporters have internalized the bias of the system. He says this, but the claim is belied by the moralistic tone of Chomsky's critique-he charges journalists with telling "lies" and lying "knowingly." There is an important contradiction here. Either journalists believe they're reporting truthfully, which is what Chomsky suggests when he talks about internalizing institutional bias. Or they're lying-and that, I think, is what Chomsky actually believes because his prose is most energetic when he's calling people liars. One of the purposes of my next book, Mad Scientists, will be to suggest that all the best-known and most edifying works on propaganda are slightly flawed by their assumption that the propagandist is a wholly rational, detached, and calculating player. Most critics-not just Chomsky, but Jacques Ellul and Hannah Arendt, among others-tend to project their own rationality onto the propagandist. But you can't study the Nazis or the Bolsheviks or the Republicans without noticing the crucial strain of mad sincerity that runs throughout their work, even at its most cynical. STAY FREE: You have written that even worse than the possibility that a conspiracy exists may be the possibility that no conspiracy is needed.
What do you mean by that? MCM: The fantasy of one big, bad cabal out there is terrifying but also comforting. Not only does it help make sense of a bewildering reality, but it also suggests a fairly neat solution. If we could just find all the members of the network and kill them, everything would be okay. It's more frightening to me that there are no knowing authors. No one is at the top handling the controls. Rather, the system is on auto-pilot, with cadres just going about their business, vaguely assuming that they're doing good and telling truths-when in fact they are carrying out what could objectively be considered evil. What do you do, then? Who is there to kill? How do you expose the perpetrators? Whom do you bring before the bar of justice-and who believes in "justice"? And yet I do think that a lot of participants in this enterprise know they're doing wrong. One reason people who work for the tobacco companies make so much money, for example, is to still the voice of conscience and make them feel like they're doing something valuable. But the voice is very deeply buried. Ultimately, though, it is the machine itself that's in command, acting through those workers. They let themselves become the media's own media-the instruments whereby the system does its thing. I finally learned this when I studied the Gulf War, or rather, the TV spectacle that we all watched in early 1991. There was a moment on the war's first night when Ron Dellums was just about to speak against the war. He was on the Capitol steps, ready to be interviewed on ABC-and then he disappeared. They cut to something else. I was certain that someone, somewhere, had ordered them to pull the plug because the congressman was threatening to spoil the party. But it wasn't that at all. We looked into it and found the guy who'd made that decision, which was a split-second thing based on the gut instinct that Dellums' comments would make bad TV. So that was that-a quick, unconscious act of censorship, effected not by any big conspiracy but by one eager employee. No doubt many of his colleagues would have done the same. And that, I think, is scarier than any interference from on high. From checker at panix.com Thu Dec 8 02:21:04 2005 From: checker at panix.com (Premise Checker) Date: Wed, 7 Dec 2005 21:21:04 -0500 (EST) Subject: [Paleopsych] NYTBR: Something We Ate? Message-ID: Something We Ate? http://www.nytimes.com/2005/12/04/books/review/04stern.html [Click the URL to view the graphic.] [Given that we are omnivores, it's hard to think the effects of different diets, caloric intake constant, would matter all that much. Even so, I am by and large a grazer and eat the diet recommended in _The Paleolithic Prescription_. However, we are believing animals, and specific diets can have strong placebo-like effects. The record for all of them for long-term weight reduction is pretty poor. Obesity is a mystery, and calling it a public health MENACE is most likely to be a full employment act for health bureaucrats.] Review by JANE AND MICHAEL STERN AGREED: Good health depends on a good diet. But which good diet? Experts and pretenders have offered countless schemes for salubrity, from the cabbage regime propounded by Cato the Elder to the chopped-meat-and-water plan of the 19th-century physician James Salisbury (whose name lives on via the Salisbury steak). Formal theorizing began in the second century, when Galen codified nutrition as a matter of correctly balanced humors.
By the first millennium, the Byzantine Dietary Calendar advised sipping aromatic wine in January to avert the dangers of sweet phlegm; in 19th-century America, the phony physician Sylvester Graham and, later, the cereal guru John Harvey Kellogg inspired corn-flake crusades based on the proposition that constipation causes death. Our own bookshelves hold such off-the-wall 20th-century treatises as "Man Alive: You're Half Dead! (How to Eat Your Way to Glowing Health and Stay There)" and a pamphlet titled "Woman 80 Never Tired Eats and Sleeps Well," which turned upside down and around is labeled "What Causes Gas on the Stomach?" To eat is basic instinct; how to do it correctly worries humans more than sex. So "Terrors of the Table" is a perfect title for this story of nutritional doctrine's tyranny up to modern times when, in Walter Gratzer's words, fear of cholesterol has "supplanted the Devil as the roaring lion who walketh about, seeking whom he may devour." Gratzer, a biophysicist at King's College, London, who previously put a human face on science in "Eurekas and Euphorias: The Oxford Book of Scientific Anecdotes," reels out a historical pageant of science and pseudoscience teeming with remarkable characters who have advanced (and retarded) knowledge about what makes humans thrive. The faddists on soapboxes are especially amusing, including vegetarians who denounce eating meat as ungodly and an anti-vegetarian cleric who answers that God attached white tails to rabbits to make them easier targets. Gratzer asserts that fashion, not science, rules contemporary diet advice, and he enjoys eviscerating the "gruesome" Duke rice diet, the "probably dangerous" Scarsdale diet and the "grossly unbalanced" Atkins diet. "The history of nutritional science is full of fascination and drama," he writes, a point borne out by various accounts of forced hunger during World War II. A Nazi program to euthanize children deemed unworthy of living was carried out in hospital buildings called Hungerhäuser, where a diet of potatoes, turnips and cabbage was designed to cause death in three months. In 1940, when the Germans decided to eradicate the Jewish population of Warsaw by starvation, Dr. Israel Milejkowski and a group of ghetto physicians conducted research on the effects of malnutrition, figuring that some good might come of their suffering. "It was . . . the first study of the kind ever made," Gratzer notes. Some of the papers were smuggled to a non-Jewish professor, who buried them until after liberation. Learning exactly what happens when people starve was crucial in the progress of nutritional science because it focused on sickness caused not by pathogens but by what was missing from the diet. Since Galen, disease had been blamed on something bad invading the body and putting it out of balance. The paradigm shift occurred after it became unavoidably clear that the lack of essential nutrients could also be at fault. Even well into the 19th century, when it was already known that citrus fruits and vegetables prevented scurvy, conventional wisdom asserted they were effective because they contained an antidote to bad air and unwholesome food. "The notion that they contained a constituent essential for health," Gratzer writes, "lay beyond the reach of man's imagination." Nowhere was the stubborn resistance to this idea more apparent than in the insufferably slow recognition of what caused pellagra.
Known as a disease of squalor and poverty, it was widespread during and after the Civil War in the southern United States, where the mortality rate among those suffering from it was 40 percent. Some blamed insect bites; others were convinced it was a contagious disease brought into the country by Italian immigrants. When the epidemiologist Joseph Goldberger went south in 1915 and noted that in asylums holding pellagra sufferers none of the staff members were affected, he concluded that it could not be infectious. On the other hand, the employees ate well while inmates were fed fatback and cornbread. To see if inadequate nutrition was the culprit, Goldberger served balanced meals to children in two orphanages where, after only a few weeks, pellagra disappeared. The logical conclusion - that pellagra resulted from a deficient diet (specifically, lack of nicotinic acid) - was obscured by the prevalence of eugenics, whose proponents contended that the institutions where Goldberger conducted his studies held inferior people who were especially susceptible to disease. "Willful obduracy," Gratzer calls the resistance, going on to describe Goldberger's outrageous strategy to put the infection theory to rest: "filth parties." At a pellagra hospital in Spartanburg, S.C., Goldberger and seven others "injected themselves with blood from severely affected victims . . . rubbed secretions from their mucous sores into their nose and mouth, and after three days swallowed pellets consisting of the urine, feces and skin scabs from several diseased subjects." None contracted pellagra. But despite these irrefutable findings, little was done initially to improve the diets of the poor. The disease finally began to disappear in the 1930's, thanks in part to federal soup kitchens and the introduction of enriched flour. Goldberger's audacity, and the pig-headedness of those who refused to believe him, are vivid evidence of Gratzer's promise that the history of nutritional dogma "encompasses every virtue, defect and foible of human nature." Jane and Michael Stern are the authors of the restaurant guide "Roadfood" and the cookbooks "Square Meals" and "Blue Plate Specials and Blue Ribbon Chefs." From checker at panix.com Thu Dec 8 02:21:12 2005 From: checker at panix.com (Premise Checker) Date: Wed, 7 Dec 2005 21:21:12 -0500 (EST) Subject: [Paleopsych] NYTBR: Merchandise of Venice Message-ID: Merchandise of Venice http://www.nytimes.com/2005/12/04/books/review/04schillinger.html [First chapter appended.] [The first chapter is better than the review, for it invites comparison with Colin Campbell's _The Romantic Ethic and the Spirit of Modern Consumerism_ (Oxford: Basil Blackwell, 1987). The book extended Max Weber's _The Protestant Ethic and the Spirit of Capitalism_ by going beyond the Protestant theology of predestination that Weber invoked to later developments in Protestantism that morphed into Sentimentalism and Romanticism. These later developments fostered the idea of the new and hence (though as unintended as money-making was to Luther and Calvin) of buying and buying and buying in later eighteenth-century England and America. [Keep this in mind as you read the review and the first chapter, and try to avoid conflating shopping in the Renaissance with shopping in England and America from the late eighteenth century through today. [So "rampant consumerism" is not something foisted onto us by wicked capitalists during just the past twenty years.
I had somehow thought the meme "Fashion wears out clothes faster than women do" went back to Shakespeare. Googling <"wears out clothes"> and <"faster than women do"> turns up nothing. So let this be a meme of mine!] SHOPPING IN THE RENAISSANCE Consumer Cultures in Italy, 1400-1600. By Evelyn Welch. Illustrated. 403 pp. Yale University Press. $45. By LIESL SCHILLINGER CAN'T afford to pay your Visa bill this month? Why not mail in a pair of socks? If two are hard to fit in an envelope, one might do. After all, there's rich precedent. In "Shopping in the Renaissance," her meticulously researched and elegantly illustrated book about spending habits in 15th- and 16th-century Italy, Evelyn Welch, a professor at Queen Mary University of London, explains that Bolognese debtors commonly used household items as collateral: "old hoes, hammers, cooking pots, a brass cup, a pair of scissors, or in one case, a single white stocking." Explain to your creditors that money is the root of all evil and see if fear for their souls prompts leniency. Long before the gold standard was dreamed up, before the invention of credit cards and before shopping had come to be recognized as a vital form of therapy, Italian shoppers had considerable difficulty grasping the notion of conspicuous consumption. In the minds of moralists, Welch explains, "Any exchange of merchandise for money was potentially tainted." In the 16th century, the humanist Paolo Cortesi moaned that "gluttony and lust are fostered by perfumers, vendors of delicacies, poultry-sellers, money-vendors and cooks and savory foods," while the Venetian writer Tomaso Garzoni bewailed the "detestable occupations" of "eating, drinking and enjoying oneself" shown by day-trippers who wandered the piazzas, "looking at glassware, mirrors and rattles," gossiping at barber shops and, worse, reading the news. Such indulgence smacked to Renaissance Italians of what Professor Harold Hill called "fritterin' " as he stirred the inhabitants of River City to rise up against idle youth. In a similar vein, the Sienese preacher San Bernardino lambasted shop owners for contributing to the delinquency of minors. "You know well to whom you sell pine-nut biscuits, candies, marzipans and sugar cake," he scolded. "Your conscience cannot rest easy unless you have no sense of guilt in turning boys bad." Nonetheless, sometime after the Black Death winnowed the population in 1348, ushering in a period of plenty, new generations of Italians acquired a taste for the material pleasures of this earth, which ensuing spates of disease, famine and jeremiad did little to curb. But the learning curve was slow. While bountiful harvests were considered a good thing, and poor harvests were rued - as can be seen in illuminations by the Florentine corn-chandler Domenico Lenzi, which picture angels rejoicing above scenes of abundance and devils with bat wings flapping above meager crops - to profit from the sale of staples was a no-no. The butcher, the baker and the candlestick maker who bartered their wares and services for tablecloths and cooking pots avoided criticism, but lowly retailers - rivenditrice - who sold produce they had not grown themselves were compelled to carry banners or tablets bearing the shameful letter "R" to indicate their stigmatized trade. 
The Florentine poet Antonio Pucci derided peasant women who hawked vegetables, eggs and poultry in the Mercato Vecchio, declaring, "I speak of them with harsh words, / Those who fight throughout the day over two dried chestnuts / Calling each other whores / And they are always filling their baskets with fruit to their advantage." Decent women did not rove city streets, bickering with strangers about the price of garlic. They were expected to "either remain indoors or to move through the city with deliberate purpose." The question arises - who was buying the nuts and chickens if respectable ladies weren't? The answer was personal shoppers (although at the time they were known as servants, spenditore and courtiers), usually men, who were entrusted with purchases great and small by the bourgeois or ducal houses that employed them. They might go a-marketing for onions and haunches of veal, or they might be sent on quests for luxury goods. And the purse strings for all but sundry purchases were in the hands of the man of the house - unless the woman had ample resources of her own, both monetary and intellectual. In such cases, she could be more demanding and capricious than J. Lo before a concert. In a shopping list the teenage Marchioness of Mantua, Isabella d'Este, wrote out for a courtier named Zigliolo in 1491, she imperiously instructed, "These are the kind of things that I wish to have - engraved amethysts, rosaries of black, amber and gold, blue cloth for a camora, black cloth for a mantle, such as shall be without a rival in the world." Apparently, Zigliolo correctly anticipated her tastes, but a few years later, when a Ferrarese courtier provided the wrong sort of gloves from Spain, she complained that "he has sent us 12 dozen of the saddest gloves that had he searched all of Spain in order to find such poor quality I don't believe he could have found as many. . . . We would be ashamed to give them to people whom we love and they would never wear them. Can you please send them back." The marchioness was exercising her hyperdeveloped shopping muscle for a nation of women who mostly couldn't. Yet. From time to time, thrill-seeking nobles went out on the town to conduct their own treasure hunts, but such journeys were fraught with peril. In 1491, when Beatrice d'Este and her cousin, Isabella of Aragon, visited the markets of Milan wearing woolen headdresses, they were mocked by local women for their fashion sense. Beatrice's husband, Ludovico Maria Sforza, wrote to his sister-in-law in Mantua: "Since it is not the custom for women to go about with such cloths on their heads here, it seems that some of the women in the street began to make villainous remarks, upon which my wife fired up and began to curse them in return, in such a manner that they expected to come to blows." Even 500 years ago, shopping was not always pretty. But making purchases was tricky, even for people who had figured out the dress code, because Italian coins varied from city to city and political leaders minted their own vanity coins, much as today's celebrities brew their own signature perfumes. Political figures frequently banned the use of their opponents' coins. All in all, it was wiser to throw your socks on the counter and start haggling.
When Isabella d'Este went to buy antiquities from the Medici collection, she offered Mantuan cloth in payment, and a large part of her 30,000-ducat dowry consisted not of gold pieces but of jewels, silverware and elaborate gowns - all of which could be pawned and pledged, whether to raise armies, buy art or pay for luxurious holiday trips. Her hope chest doubled as a bank vault, "enabling her, like any other wealthy Italian, to turn material wealth into ready cash." All this "expensive clothing, jewels and plate," Welch explains, "could be mortgaged over and over again, allowing men and women with possessions to spend in ways that far exceeded their immediate means." If they went too far, however, and couldn't redeem their goods in time, they might see their valuables auctioned on the piazza or risk other forms of public humiliation: being barred from the Rialto in Venice or forced to wear the debtor's crown of shame, the green beret, in Rome. It may be a pity we can't live in the style of Renaissance Italians anymore, swapping our clothes and casserole dishes for priceless antiquities, but it's no small consolation that we can incur debt the modern way, by charging it, and shop on the Rialto, even if we can't afford it. Liesl Schillinger, a New York-based arts writer, is a regular contributor to the Book Review. First chapter of 'Shopping in the Renaissance' http://www.nytimes.com/2005/12/04/books/chapters/1204-1st-welch.html By EVELYN WELCH In February 2001 the British artist Michael Landy took over an empty department store in central London. Before a fascinated and occasionally distraught audience of friends, fellow-artists and strangers drawn in from the streets, he and his assistants placed all his personal possessions on a conveyor belt. Looping round the complex, mechanised route, Landy's furniture, record collection, clothing and even his car were first inventoried and then systematically dismembered, divided and shredded. The work attracted considerable press attention and provoked a powerful public response. Landy's emphasis on destruction was seen as a challenge to the champions of consumerism and as a strong commentary on the seductions of acquisition and ownership. The setting, the bare interior of a store stripped of its lighting, counters and displays, was central to the work's meaning (Figure 1). As shoppers moved on from the performance into the still-functioning department stores and shops nearby, they were invited to reflect on the ultimate purposelessness of their purchases. Commenting after the event, Landy described his surprise when a number of onlookers equated his actions with those of a holy figure or a saint. Yet the disposal or dispersal of possessions has been a fundamental part of religious asceticism since early Christianity. But unlike the powerful image of Saint Francis of Assisi giving away his cloak to a beggar before stripping off all his clothes in order to refuse his father's wealth, Landy had no intention of forming a new religious order (Figure 2). Landy's attack on human attachment to material possessions was a secular act of artistic performance, a counterpart to contemporary celebrations of affluence and prosperity. As such he was, and is, part of a growing debate. 
Today, shopping, the process of going out to special sites to exchange earnings for consumable objects, is seen as both a force for good (consumer spending is saving Western domestic economies) and as a danger to society (consumer spending is destroying the environment and local diversity). Given its current importance, such behaviour has been closely scrutinised by anthropologists and sociologists who have often argued that the purchase of mass-produced items is a defining characteristic of modernity. In their turn, economists have looked for rational patterns of consumer spending, while an equally weighty literature has grown up to evaluate the emotive and psychological impulses that lie behind modern consumerism, culminating in a focus on the 'shopaholic' or kleptomaniac, usually a woman who, for complex reasons, is unable to control her desire to either buy or steal from stores. Following in this wake, historians and art historians are using concepts such as the emergence of a public sphere and the agency of the consumer to map out a new narrative linking this changing social behaviour to the development of new architectural spaces. Some have found the origins of contemporary shopping practices in the American malls of the 1930s or in the opening of the first department stores, such as Whiteley's in London in 1863 or the Bon Marché in Paris in 1869 (Figure 3). These purpose-built buildings, with their fixed prices and large body of salaried personnel, radically changed the nature of shopping. Buying became a leisure activity as well as a chore, one that women were increasingly able to enjoy. But while some have insisted that this was a distinctive feature of the late nineteenth and twentieth centuries, others have pushed back the transformation to the coffee-houses of eighteenth-century London, the mercers' shops of eighteenth-century Paris, or to the market halls and commercial chambers of seventeenth-century Amsterdam (Figure 4). As new social rituals developed, such as reading the paper, listening to public concerts or discussing scientific innovations, so too did a demand for new products such as coffee, tea, chocolate, porcelain and printed chintzes. Here bow-shaped glass shop windows, with their displays of exotic, imported goods, are thought to have tempted buyers, sparking off a capitalist revolution and eventually liberating women from the home. In the search for the first modern shopping trip, these eighteenth- and nineteenth-century developments are often set against the backdrop of an undifferentiated late medieval past. The story of temporal progression requires more distant periods to be perceived as lacking in sophistication. The pre-industrial world is presented as having had relatively limited access to a smaller range of regionally produced goods and a minimum of disposable income. Most of a family's earnings would have been spent on food. Little was left over for non-essentials, and most goods were produced within the home itself. These assumptions have meant that while many studies have looked for a growing mass-market for consumer goods in the eighteenth century, Renaissance scholarship has focused on elite patronage or international trade. Recently, however, there has been a tendency to argue that the supposed consumer boom of the enlightenment period started much earlier and that this revolution took place, not in London or Paris, but in fifteenth-century Italy.
In 1993, for example, the economic historian Richard Goldthwaite argued that 'the material culture of the Renaissance generated the very first stirring of the consumerism that was to reach a veritable revolutionary stage in the eighteenth century and eventually to culminate in the extravagant throw-away, fashion-ridden, commodity-culture of our own times'. But the question arises whether Italian Renaissance consumerism was really the embryo of contemporary expenditure, a defining moment in the transition from the medieval to the modern. Does the detail from the 1470 Ferrarese frescoes of Palazzo Schifanoia depicting elegant shops with their customers represent a new form of activity or an ongoing tradition (Figure 5)? Is it in any way, however marginal, indicative of, or evidence for, a new form of consumer behaviour? While there will be much in this book that seems familiar, such as the pleasure that teenage girls took in trips to the market, there is a great deal that is very different. Indeed, far from pinpointing the start of 'ourselves' in fifteenth- and sixteenth-century Florence, the experience of the Italian Renaissance challenges rather than reinforces a sense of linear transfer from past to present. In particular, it threatens some basic assumptions concerning the connections between architecture and consumer behaviour. In the English language the links could not be closer. A standard dictionary defines shopping as 'the action of visiting a shop or shops for the purpose of inspecting or buying goods'. A shopper is 'one who frequents a shop or shops for the purpose of inspecting or buying goods'. But this correlation has no parallel in other European languages, where there is little, if any, verbal connection between 'the shop' and the activity, 'shopping'. This is an important distinction because the impact of this assumed association between the architecture of commerce and modernity goes far beyond semantics. Early twentieth-century sociologists and economists who defined concepts of consumption relied on models of social development that considered shopping in stores as a far more sophisticated form of exchange than gift-trade or administered-trade. The latter were only phases that societies went through before finally emerging as fully developed (and hence more effective and efficient) market economies. This was not simply a theory. It was put into practice in countries such as Italy, which only became a nation in the 1860s. From that point onwards, defining an Italian city as a modern urban society involved constructing new commercial and social spaces, particularly those modelled on the more seemingly advanced English and French examples. The so-called 'Liberty' or Art Nouveau style was adopted for some shop fronts while glass and iron proved popular for new shopping areas (Figure 6). When in 1864, for example, the city of Florence began demolishing its walls, gates and medieval market centre, it was to mark the town's transformation into the first capital of the new nation (Figures 7 and 8). Florence was not to stop, as one protagonist put it, 'in the lazy contemplation of our past glories but fight gallantly on the road to progress'. In 1865, it was even suggested that the entire market areas of the city centre should be transformed into a glass gallery on the model of the English Great Exhibition Hall before it was agreed to tear it down and rebuild the densely packed centre in a more piecemeal fashion.
Likewise, in 1864, the city of Milan marked its entry into the Italian nation with major urban renewal plans. This included a galleried arcade, whose construction contract was awarded to the British-based 'City of Milan Improvement Company Limited'. As Vittorio Emanuele II, the first King of united Italy, laid the foundation stones of the Galleria, the new glass and iron mall was presented as a symbol of the new country's future prosperity and a rejection of its backwards past (Figure 9). But these nineteenth-century debates reveal a more complex and contradictory set of attitudes than a simple embrace of British engineering. Photographers using advanced technologies for the period captured the emptied spaces of the old Florentine market while graphic artists produced postcard images of what was to be destroyed. Londoners who had visited the city wrote to The Times to decry the destruction of the old town centre and city walls. A sense of the need to preserve an attractive 'local' culture for the tourist market vied with the political desire to be accepted as the equal of the economically advanced countries of Europe and the United States. The issues raised by the Milanese Galleria and the destruction of Florence's old market centre have resonances that go far beyond the Italian peninsula and the nineteenth century. The competing values of preservation and nostalgia versus modernity and progress continue to have serious consequences today. Planners eager to impose change have tended to describe developing countries as having 'medieval' types of exchange. Open markets in Africa and Asia, systems of barter and supposedly informal networks of credit, have been presented as either backwards, or, conversely, as more romantic and natural than contemporary North American and British supermarkets and shopping malls. As in nineteenth-century Florence, seemingly unregulated and potentially unhygienic markets have been driven from city centres in places such as Hong Kong and Singapore by officials hoping to exclude elements perceived as old-fashioned from their growing economies. In contrast, highly developed urban areas such as New York and London have re-introduced 'farmers' markets'. These evoke traditional street fairs in order to reassure customers that produce sold from stalls and served up in brown bags is somehow more genuine than shrink-wrapped goods removed from a refrigerated cabinet.

Shopping in the Renaissance

Given this context, it is difficult to step back and assess how men and women actually went out to shop in the past without falling into a narrative of either progress or decline. This is particularly acute for the Renaissance. During the period between 1400 and 1600, the daily business of buying and selling was an act of embedded social behaviour, not a special moment for considered reflection. While international merchants' manuals do survive both in manuscript and in print, the ordinary consumer's ability to assess value, select goods, bargain, obtain credit and finally to pay, was learnt primarily through observation, practice and experience rather than through any form of written instruction. This means that any study of Renaissance buying practices, where exchanges were transitory and verbal, has to rely on scattered and often problematic evidence. The images, literary sources, criminal records, statutes, auction and price lists, family accounts and diaries used in this book all had their own original purposes and formats.
Their meanings were rarely fixed and the same item might be perceived in different ways in different times and places. For example, a poem such as Antonio Pucci's fourteenth-century description of the Mercato Vecchio in Florence might carry one meaning for its audience when heard during a time of famine and yet another when read in a period of prosperity. But despite its slippery nature, it is still important to set such 'soft' evidence against the seemingly more stable facts and figures that make up grain prices and daily wage rates. This book takes, therefore, the approach of a cultural historian in an attempt to gain an insight into the experience of the Renaissance marketplace. While some of the material goes over the immediate boundaries of the title, the book focuses primarily on central and northern Italy between 1400 and 1600. This is, in part, because of the wealth of documentation available for this period and region. Venice, an entrepôt whose retailers served both an international and local clientele, was exceptional in its commercial sophistication and specialisation. But the entire northern and central Italian peninsula, with its multiplicity of large and medium-sized towns and distribution networks of ports, canals and roads that reached far into the countryside, was much more urbanised than the rest of Europe. Unlike England, where the inhabitants of villages and hamlets gravitated to larger market towns to buy and sell produce, even the smaller and more isolated of Italy's rural and urban communities housed permanent shops and regular markets. For example, sixteenth-century Altopascio, a Tuscan mountain village with a population of 700 inhabitants, had five shoemakers, two grocers and a ceramic seller, a bottegaio di piatti, as well as a blacksmith. The slightly larger Tuscan town of Poppi in the Casentino had a population of 1,450. In 1590, its inhabitants benefited from nine grocery stores, two bakeries, two butchers, three drugstores, a mercer's shop, a barber, a tailor and a shoemaker along with workshops for wool, leather and iron as well as kilns producing ceramic wares. These amenities served the wider locality as well as the small town, a relationship noted when the municipal council allowed complete immunity for debtors on market days, 'for the good and benefit and maintenance of Poppi, considering its location on a dry hill and in need of being frequented and visited by other men and people'. Of equal importance was the diversity and competition between these urban centres, both large and small. Italy's political fragmentation had considerable cultural consequences. By the mid-fifteenth century, power on the peninsula was roughly divided between the Kingdom of Naples, the Papal States, the Duchy of Milan and the city-states of Florence and Venice. By the end of the century, however, the fragile balance had been disrupted as the growing powers of France, Spain and the Habsburg empire attempted to gain control. After 1530, Italy's two major territorial states, Lombardy and Naples, were ruled by viceroys who drew on local urban structures but answered to Spain. These multiple boundaries - local, regional and international - allowed for the coexistence of legal systems as well as for the circulation of different forms of currencies, dress, codes of conduct, gesture and language. The diversity had real material meanings.
Velvets permitted to butchers' wives in Milan might be forbidden to those in Venice; hats that seemed desirable in Naples might have been rejected in Genoa. Although the costume books from the second half of the sixteenth century, such as those of Cesare Vecellio and Pietro Bertelli, often exaggerated the differences, the fashions forged in Rome were quite distinct from those in Mantua or Ferrara (Figures 10-12). Even women living under the same jurisdiction, such as those in Vicenza and Venice, might wear different garments (Figures 13-14). This created issues around novelty that were very different from those of nation-states such as France and England, where the major contrasts were between a single capital city like Paris or London and the provincial towns and rural communities. . . . From checker at panix.com Fri Dec 9 01:53:36 2005 From: checker at panix.com (Premise Checker) Date: Thu, 8 Dec 2005 20:53:36 -0500 (EST) Subject: [Paleopsych] NYT: 350 Years of What the Kids Heard Message-ID: 350 Years of What the Kids Heard http://www.nytimes.com/2005/12/05/books/05nort.html [Connections column by Edward Rothstein appended.] [I must have read children's books when I was a child, but beyond the Winnie the Pooh books and the Alice books, I can't recall any. Speaking of Alice, one of the five requirements I have for a wife is that she agree to name our first daughter Alice. Sarah agreed instantly, and indeed our first daughter is named Alice. Our other daughter is named Adelaide, and both names have the same Teutonic root meaning truth. [The other four requirements are having the same ethnic background (which ensures deep commonalities), a love of classical music, a realistic view of the world (does not believe in Bronze Age creator gods or political shibboleths about planning and equality), and an honesty of appearance (no makeup!). The woman I married has all five in spades, AND she remains the most feminine person I have ever met. [She does have a shortcoming and a defect, though. She comes up four inches shorter than I do, and her brown eyes are so enormous that her eyelids don't completely close when she is asleep. I can watch her rapid eye movements when she is dreaming.] By DINITIA SMITH Before Harry Potter there was "Slovenly Peter." Written by Heinrich Hoffmann and published in Germany in 1845, it is one of the best-selling children's books ever, translated into more than 100 languages. And what a piece of work it is. A girl plays with matches and suffers horrendous burns, on all her clothes "And arms, and hands, and eyes, and nose;/ Till she had nothing more to lose/ Except her little scarlet shoes." A little boy who sucks his thumb has his thumbs cut off by the Scissor Man. And in the difference between Harry and Peter lies the lesson of children's literature, said Jack Zipes, general editor of the new Norton Anthology of Children's Literature, published this month by W. W. Norton & Company. "These works reflect how we view children, and something about us," said Mr. Zipes, 68, a professor of German and comparative literature at the University of Minnesota, in a telephone interview from Minneapolis. The anthology joins the 11 other definitive compendiums by Norton. It is one of the first modern, comprehensive, critical collections of children's literature. And it is intended not for children, but for scholars.
"It's a huge event, a real arrival of children's literature in academic studies," said John Cech, director of the Center for Children's Literature and Culture at the University of Florida in Gainesville. Although the academic study of children's literature is an exploding field, there are only a handful of Ph.D. programs in children's literature in English departments. One purpose of the anthology, said Mr. Zipes, is to encourage departments to add courses. The anthology, 2,471 pages long and weighing three pounds, covers 350 years of alphabet books, fairy tales, animal fables and the like, and took Mr. Zipes and four other editors four years to compile. Some stories are reprinted in full, sometimes with illustrations; others are excerpted. In it, the editors trace the history of juvenile literature from what is probably the first children's book, "Orbis Sensualium Pictus," an illustrated Latin grammar by Johann Amos Comenius published in 1658, up through works as recent as "Last Talk With Jim Hardwick," by Marilyn Nelson, which came out in 2001. Most early children's books were didactic and had a religious flavor, intended to civilize and save potential sinners - albeit upper-class ones, since they were more likely to be literate. As today, publishers were shrewd marketers of their wares. When John Newbery published "A Little Pretty Pocket-Book," in 1744, he included toys with the books - balls for boys, pincushions for girls. It is striking in the anthology to see the way certain forms cross cultures. Lullabies, for instance, have a nearly universal form, with elongated vowels, long pauses and common themes of separation, hunger, bogeymen, death - as if singing of these terrors could banish them from a child's dream world. One stunning entry is "Lullaby of a Female Convict to Her Child, the Night Previous to Execution," from 1807. "Who then will sooth thee, when thy mother's sleeping," the mother sings. "In her low grave of shame and infamy!/ Sleep, baby mine! - to-morrow I must leave thee." The book traces the evolution of various works, including "Hush-a-bye, baby, on the tree top" from its origins as an African-American slave song, "All the Pretty Horses." That version ends with the horrifying image, "Way down yonder in the meadow lays a poor little lambie/ The bees and the butterflies peckin' out his eyes/ The poor little thing cries, 'Mammy.' " The editors write that attitudes toward children began to change in the mid-18th century. In 1762, in his revolutionary work, "?mile; or, On Education," Rousseau wrote that children are intrinsically innocent and should be educated apart from corrupt society, a view later taken up by the Romantics. In the mid- to late-19th century, with the rise of the "isms," as Mr. Zipes put it - Darwinism, Freudianism, communism, Shavian socialism - children were recognized as people, and their literature became less heavily didactic. Schools were established for the lower classes, and increased literacy created new markets for books. This was the golden age of children's literature, of Robert Louis Stevenson, Rudyard Kipling, Louisa May Alcott, Mark Twain and Lewis Carroll. Throughout the text, in editors' notes and introductions, are tidbits about the hidden messages in the literature. "London Bridge Is Falling Down," say the editors, contains coded references to the medieval custom of burying people alive in the foundations of bridges. But children's stories, especially fairy tales, have always been hiding places for the subversive. 
"The Griffin and the Minor Canon" by Frank Stockton is a condemnation of cowardice and social hypocrisy; "The Happy Prince" by Oscar Wilde, a critique of the aristocracy. In the late 1960's and early 70's, as the anthology demonstrates, children's stories began to be rewritten and children's literature was approached in a different way. Black writers like Julius Lester and Mildred Taylor came to prominence along with Latino and Native Americans authors. Nowadays, the boundaries between adult and children's fiction are disappearing. Nothing is taboo. Included in the anthology are both Francesca Lia Block's story "Wolf" (2000), about rape, and "The Bleeding Man" (1974), a story about torture by Craig Kee Strete, a Native American writer. There is also a hefty selection of illustrations that parents may remember fondly - Sendak's wild things, Dr. Seuss's goofy animals, Babar the elephant king - as well as comics and science fiction, officially bringing those genres into the canon. The book also includes the full text of the play "Peter Pan," never before published in the United States, as far Mr. Zipes knows. Notably absent, however, is Harry Potter. That was because the cost of excerpting the Potter books was too high, Mr. Zipes said. Besides that, he said, "the Harry Potter books are very conventional and mediocre." "The plots are in the tradition of the schoolboy novel," he said, citing "Tom Brown's School Days," which was published in 1857. Mr. Zipes called the Potter books, "the ideological champions of patriarchal society," adding: "They celebrate the magical powers of a boy, with a girl - Hermione - cheerleading him. You can predict the outcome." Never mind, though. Harry Potter is doing just fine. Reading Kids' Books Without the Kids http://www.nytimes.com/2005/12/05/books/05conn.html Connections By EDWARD ROTHSTEIN I confess: for me, it's partly personal. I am in a local Barnes & Noble, looking at a table spread with new releases of books; behind me are four or five bookcases lined with similar books, all published in the last few years. I am reading jacket copy. "Life has not been easy lately for Walker," reads one. "His father has died, his girlfriend has moved away." And now, his "mother is going to work as a stripper. What if his friends find out? What if Rachel finds out?" Another introduces a clique of high-school girls, one of whom is "smart, hardworking and will insult you to tears faster than you can say, 'My haircut isn't ugly!' " And a third shows a photograph of an eighth-grade girl, eyes open in shock as she examines a piece leopard-skin lingerie. But the problem she faces going to a "lingerie shower" for her brother's ex-girlfriend doesn't compare with the problem of a 12th grader in another book who is so attracted to her 35-year-old English teacher that the two "tumble headlong into a passionate romance." What, I wonder, would Heidi have done in similar circumstances, or Anne of Green Gables? What would Eleanor Estes's Moffat children have said if their mom, instead of working as a seamstress making clothes for others, decided to strip her own clothes off instead? Did even Judy Blume dream how far her vision of a frank new form of children's fiction might go? It isn't just the plots of these books that are jarring. Teen pulp, which evolved out of children's books and rebelled against their supposed strictures, appears to take up far more real estate on the shelves of bookstores than books of more subtle literary bent for the pre-adult set. 
The genre also reflects a different set of expectations about how books are read and why. Hoping to be reminded of what is being missed, I turn to the opposite end of the cultural spectrum, to the newly released Norton Anthology of Children's Literature. It contains, it promises, "350 years of literary works for children" including nursery rhymes, primers, fairy tales, fables, legends, myths, fantasy, picture books, science fiction, comics, poetry, plays and adventure stories by 170 authors and illustrators, all tightly stuffed into 2,471 pages. But here, too, crankiness gets the better of me as I slip the book out of its case. Only my wariness is not caused by the content. It has to do with this book's purpose. The jacket calls the anthology a celebration of literary "richness and variety" in which "readers will find beloved works." But it is not really designed for readers in the usual sense. It was edited to be used in college courses. Childhood, the preface points out, is "a time saturated with narratives," but this is not a book whose selections are meant to be read to a child as bedtime narratives, let alone as bedtime stories. In fact, the binding is too floppy and the book too weighty to hold up without resting it on a table, and turning its tissue-thin pages requires mature surgical finesse. That's fine, of course. Children's literature does need to be studied; its ideas and evolution need to be understood, and the greatness of its achievements needs to be recognized. But then something else needs to be understood, and this is connected to the problems with teen pulp as well. It has to do with the function of children's books and the way pre-adult fiction grows out of them. We can anthologize short stories or philosophical works or essays, and their purpose and meaning will remain relatively unchanged. But when children's literature is placed in an anthology that is not for children, something is altered. The texts are read in a different way. Why, in fact, do children read, and why are they read to? Why are books specifically written for readers who are not yet adults? Children's books have a sense of multiple perspectives built into them because of how they are encountered. When a parent reads "Where the Wild Things Are" aloud, for example, the anger of the child, Max, his fantasy of mastery and revenge, and finally, his relief at his welcome home, are given another twist: his personal drama is not a private drama. The parent reading - the voice of the story itself - is precisely the authority with whom the child has waged similar battles. Everything is intensified; the resolution is also made more comforting, because in the calm moments of bedtime, the parent's voice reassures. Even for older children, the parent becomes a textual presence, an inescapable alternate voice. And by the time the child reads alone, the books themselves become multi-voiced. Literature for those-who-are-not-yet-adults is often proposing alternatives, refusing to settle into a single version of the "real." Lewis Carroll allows neither Alice nor the child reader - nor, for that matter, the adult - to settle into a single interpretation of what she sees. Last week at the New York Public Library, Adam Gopnik, who has just written a fantasy novel for children, "The King in the Window" (Hyperion), spoke with his sister, Dr. Alison Gopnik, a cognitive scientist who has studied children's learning. Dr.
Gopnik argued that children read the way scientists work: they experiment with different ways of ordering the world, exploring alternate modes of understanding. But in an academic reading of children's books this can be forgotten. An adult may read to discern political and economic interests, to see what lessons are latent in the text, to analyze how narrative works, to make connections. Norton has a Web site (www.wwnorton.com/nrl) in which course curriculums are proposed based on the anthology. They tend to use phrases like "ideological constructions" in discussing children's books. One course aims to "destabilize the totalizing idea of 'the child' and set up contrasts between male and female, urban and rural, rich and poor." In other words, it aims to splinter the category of childhood and focus attention on social strata, gender, locale. The risk is that literature ends up becoming univocal: each work is seen as an expression of the particular, and not much more. But this happens only in mediocre literature, like teen pulp, where narrow-casting is the marketing norm. Those books are meant to be close reflections of their readers, mirrors of their fantasies. The characters are just different enough from the readers to spur curiosity and sexual interest, and just similar enough to guarantee identification. A great children's book, though, does not reflect the world or its reader. It plays within the world. It explores possibilities. It confounds expectations. That is why the anthology's academic function makes me wary. The child, with the adult near at hand, never has a single perspective. Almost anything can happen. And usually does. Connections, a critic's perspective on arts and ideas, appears every other Monday. From checker at panix.com Fri Dec 9 01:54:10 2005 From: checker at panix.com (Premise Checker) Date: Thu, 8 Dec 2005 20:54:10 -0500 (EST) Subject: [Paleopsych] NYT: Instant Millions Can't Halt Winners' Grim Slide Message-ID: Instant Millions Can't Halt Winners' Grim Slide http://www.nytimes.com/2005/12/05/national/05winnings.html [I've read in many places that a great happy event (marriage, promotion, having lunch with Hillary) raises one's happiness level for only about a year and that a terrible event (death of spouse, being fired, being forced to have lunch with Hillary) lowers it also for only about a year. The sad tale below is an exception.] By JAMES DAO CORBIN, Ky., Nov. 30 - For Mack W. Metcalf and his estranged second wife, Virginia G. Merida, sharing a $34 million lottery jackpot in 2000 meant escaping poverty at breakneck speed. Years of blue-collar struggle and ramshackle apartment life gave way almost overnight to limitless leisure, big houses and lavish toys. Mr. Metcalf bought a Mount Vernon-like estate in southern Kentucky, stocking it with horses and vintage cars. Ms. Merida bought a Mercedes-Benz and a modernistic mansion overlooking the Ohio River, surrounding herself with stray cats. But trouble came almost as fast. And though there have been many stories of lottery winners turning to drugs or alcohol, and of lottery fortunes turning to dust, the tale of Mr. Metcalf and Ms. Merida stands out as a striking example of good luck - the kind most people only dream about - rapidly turning fatally bad. Mr. Metcalf's first wife sued him for $31,000 in unpaid child support, a former girlfriend wheedled $500,000 out of him while he was drunk, and alcoholism increasingly paralyzed him. Ms. 
Merida's boyfriend died of a drug overdose in her hilltop house, a brother began harassing her, she said, and neighbors came to believe her once welcoming home had turned into a drug den. Though they were divorced by 2001, it was as if their lives as rich people had taken on an eerie symmetry. So did their deaths. In 2003, just three years after cashing in his winning ticket, Mr. Metcalf died of complications relating to alcoholism at the age of 45. Then on the day before Thanksgiving, Ms. Merida's partly decomposed body was found in her bed. Authorities said they have found no evidence of foul play and are looking into the possibility of a drug overdose. She was 51. Ms. Merida's death remains under investigation, and large parts of both her and Mr. Metcalf's lives remain wrapped in mystery. But some of their friends and relatives said they thought the moral of their stories was clear. "Any problems people have, money magnifies it so much, it's unbelievable," said Robert Merida, one of Ms. Merida's three brothers. Mr. Metcalf's first wife, Marilyn Collins, said: "If he hadn't won, he would have worked like regular people and maybe had 20 years left. But when you put that kind of money in the hands of somebody with problems, it just helps them kill themselves." As a young woman, Ms. Merida lived with her family in Houston where her father, Dempsey Merida, ran a major drug-trafficking organization, law enforcement officials say. He and two of his sons, David and John, were indicted in 1983 and served prison sentences on drug-related convictions. John Murphy, the first assistant United States attorney for the western district of Texas, who helped prosecute the case, said the organization smuggled heroin and cocaine into Texas using Mr. Merida's chain of auto transmission shops as fronts. Mr. Murphy described Mr. Merida as a gruff, imposing man who tried to intimidate witnesses by muttering loudly in court. Mr. Merida received a 30-year sentence but was released in 2004 because of a serious illness, Mr. Murphy said. He died just months later in Kentucky at age 76. When Dempsey Merida and his two sons went to prison, his wife moved the family to northern Kentucky. Virginia Merida married, had a son, was divorced and married again, to Mack Metcalf, a co-worker at a plastics factory. But he drank too much and disappeared for long stretches of time, friends of Ms. Merida said, leaving her alone to care for her son and mother. She worked a succession of low-paying jobs, lived in cramped apartments, drove decrepit cars and struggled to pay rent. For his part, Mr. Metcalf drifted from job to job, living at one point in an abandoned bus. Then one July day in 2000, a friend called Ms. Merida and gave her some startling news: Mr. Metcalf had the winning $3 ticket for a $65 million Powerball jackpot. Ms. Merida had refused to answer his calls, thinking he was drunk. "Mack kept calling here, asking me to go tell Ginny that he had won the lottery," said Carolyn Keckeley, a friend of Ms. Merida. "She wouldn't believe him." At the time, both were barely scraping by, he by driving a forklift and she by making corrugated boxes. But in one shot, they walked away with a cash payout of $34 million, which they split 60-40: he received about $14 million after taxes, while she got more than $9 million. In a statement released by the lottery corporation, Mr. Metcalf said he planned to move to Australia. "I'm going to totally get away," he said. But problems arrived almost immediately. 
A caseworker in Northern Kentucky saw Mr. Metcalf's photograph and recognized him as having been delinquent in child support payments to a daughter from his first marriage. The county contacted Mr. Metcalf's first wife and they took legal action that resulted in court orders that he pay $31,000 in child support and create a $500,000 trust fund for the girl, Amanda, his only child. Ms. Collins, his first wife, said Mr. Metcalf abandoned the family when Amanda, now 21, was an infant, forcing them into near destitution. "I cooked dinner and set the table for six months for him, but he never came back," said Ms. Collins, 38. They were divorced in 1986. Even as he was battling Ms. Collins in court, Mr. Metcalf was filing his own lawsuit to protect his winnings. In court papers, he asserted that a former girlfriend, Deborah Hodge, had threatened and badgered him until he agreed, while drunk, to give her $500,000. Ms. Hodge vowed to call witnesses to testify that Mr. Metcalf had given money to other women as well. Mr. Metcalf's suit was dismissed after he walked out of a deposition, according to court papers. Still, there were moments of happiness. Shortly after winning the lottery, he took Amanda shopping in Cincinnati, giving her $500 to buy clothing and have her nails done. "I had never held that kind of money before," Ms. Metcalf said. "That was the best day ever." Pledging to become a good father, he moved to Corbin to be near Amanda, buying a 43-acre estate with a house modeled after Mount Vernon for $1.1 million. He collected all-terrain vehicles, vintage American cars and an eccentric array of pets: horses, Rottweilers, tarantulas and a 15-foot boa constrictor. He also continued to give away cash. Neighbors recall him buying goods at a convenience store with $100 bills, then giving the change to the next person in line. Ms. Metcalf said she discovered boxes filled with scraps of paper in his home recording money he had given away, debts he would never collect. His drinking got worse, and he became increasingly afraid that people were plotting to kill him, installing surveillance cameras and listening devices around his house, Ms. Metcalf said. Then in early 2003, he spent a month in the hospital for treatment of cirrhosis and hepatitis. After being released from the hospital, he married for the third time, but died just months later, in December. Virginia Merida seemed to handle her money better. She repaid old debts, including $1,000 to a landlord who had evicted her years earlier. She told a friend she had set aside $1 million for retirement. But she splurged enough to buy a Mercedes and a geodesic-dome house designed by a local architect in Cold Spring for $559,000. She kept the furnishings simple, neighbors said, but bought several arcade-quality video games for her son, Jason. For a time, Ms. Merida's mother lived with her as well. "I was at her house a year after she moved in, and she said she hadn't even unpacked," said Mary Jo Watkins, a neighbor. "It was as if she didn't know how to move up." Then in January, a live-in boyfriend, Fred Hill, died of an overdose of an opiate-related drug, according to a police report. No charges were filed, and officials said it was not clear if the opiate was heroin or a prescription drug. But neighbors began to believe that the house had become a haven for drug use or trafficking. "I think we all suspected that some drug problems were going on there because so many people were coming and going," Ms. Watkins said. In May, Ms. 
Merida filed a complaint in Campbell County Circuit Court against one of her brothers, David, saying that he had been harassing her. On June 16, a circuit court judge ordered both brother and sister to keep away from each other. It was unclear why she filed the complaint, and David Merida would not comment. When Ms. Merida's son found her body on Nov. 23, she had been dead for several days, the county coroner's office said. There was no evidence of a break-in, or that she had been attacked, officials said. Toxicological studies on her remains will not be completed for several weeks. It is unclear how much of Ms. Merida's estate remains, but it appears she saved some of it. That may not have been the case with Mr. Metcalf, his daughter said. Six months after his death, his house in Corbin was sold for $657,000, about half of what Mr. Metcalf had paid for it. In a brief obituary in The Kentucky Enquirer, Ms. Merida's family described her simply as "a homemaker." On a black tombstone, Ms. Metcalf had this inscribed for her father, "Loving father and brother, finally at rest." Al Salvato contributed reporting from Cold Spring, Bellevue and Dayton, Ky. From checker at panix.com Fri Dec 9 01:54:36 2005 From: checker at panix.com (Premise Checker) Date: Thu, 8 Dec 2005 20:54:36 -0500 (EST) Subject: [Paleopsych] The Week: The Father of Natural Selection Message-ID: The Father of Natural Selection http://www.theweekmagazine.com/article.aspx?id=1228 [This is a nice, brief summation. I think Darwin became an agnostic, as he confessed in his diaries if not to his confidants.] The work of Charles Darwin, the British naturalist whose ideas form the basis of modern evolutionary theory, is under attack by religious conservatives. What did Darwin actually say? 12/2/2005 How did Darwin become a biologist? Born in 1809, the son of a physician in Shrewsbury, England, Darwin was a bookish youngster but a poor student. He attended the University of Edinburgh to study medicine, but dropped out because he couldn't stand the sight of blood. He did like studying living things, though, and indulged his interest by hiking, collecting beetles, and attending botany lectures. When Darwin was 21, he learned that Robert FitzRoy, captain of the HMS Beagle, was looking for a hardy companion on a trip to chart the South American coast. Although FitzRoy thought Darwin had a "lack of energy and determination," he took him on. The Beagle set sail on Dec. 27, 1831. How did he spend the voyage? As the Beagle made various landfalls, Darwin disembarked to observe, sketch, and collect plants, animals, and fossils. He sometimes traveled as much as 400 miles over mountains, through jungles, and up and down rivers, before meeting up with the ship. By the time the Beagle returned to England on Oct. 2, 1836, Darwin had accumulated an 800-page diary; 1,700 pages of zoology and geology notes; 4,000 skins, bones, and other dried specimens; and another 1,500 pickled plants and animals. What did he do with all this stuff? Exhausted from the journey, Darwin holed up in London with his collection to prepare his journal for publication. But as he did so, he began thinking about some of the inconsistencies and anomalies he had observed. He was particularly intrigued by the 12 previously unknown kinds of finches he had discovered. Darwin realized they were separate species, distinguished mainly by the shapes of their beaks. Each was suited to a particular task--crushing seeds, eating insects, poking into flowers for nectar, and so on.
How, Darwin wondered, had such similar birds wound up with different beaks? And why were the birds' beaks so well-suited to the food supply on islands where they were found? Darwin could think of no good answers until 1838, when he came upon a book by Thomas Malthus, An Essay on the Principle of Population. How did this book affect him? It was like a strike of lightning. Malthus, a minister and professor, argued that human populations would always grow beyond their ability to feed themselves unless they were checked by disease, catastrophe, or some other restraint. The idea sparked a recognition in Darwin that all living things, including plants, animals, and human beings, were constantly struggling to survive in a world of limited resources and myriad dangers. Those species that thrived, he reasoned, had adapted to circumstances by some sort of biological mechanism. That is, they had evolved. Was evolution Darwin's idea? Far from it. By Darwin's time, most respected scientists believed that living things tended to improve as the need to do so arose. But they had the details wrong. The most famous mistake was made by the French naturalist Jean-Baptiste Lamarck (1744-1829). Lamarck proposed that individuals developed specific characteristics by exercising them, while losing others through disuse. He thought, for example, that giraffes got their long necks by stretching for leaves that were out of reach, then passed their elongated necks onto their offspring. Darwin rejected this approach completely. Acquired characteristics, he argued, are not inherited. Rather, random chance had favored individuals or species with traits that allowed them to flourish in their environments. Successful organisms reproduced, and came to dominate their environments, while less successful organisms perished and disappeared. In 1859, after two decades of thought, analysis, and research, Darwin published his conclusions in a book, On the Origin of Species by Means of Natural Selection, or The Preservation of Favoured Races in the Struggle for Life. Why did he wait so long to publish? An introverted, nervous man, Darwin hated attention. He knew that his findings would arouse the wrath of millions who believed in the biblical creation story. Publishing his theories, he once told a friend, would be "like confessing a murder." Darwin decided to publish Origin of Species only when he discovered that a competitor, Alfred Russel Wallace, was about to go public with his own version of evolutionary theory. What was the public reaction? It was immediate and explosive. Origin of Species' entire first print run of 1,250 copies sold out, necessitating a second printing of 3,000 copies just six weeks later. Eminent scientists, philosophers, and liberal theologians recognized it as a groundbreaking work. The botanist Hewett Watson wrote Darwin, "You are the greatest revolutionist in natural history, if not of all centuries." But others, including many intellectuals, were appalled at the notion that man had evolved from lower life forms. What did his critics say? The astronomer Sir John Herschel openly derided Origin of Species as nonsensical, calling it "the law of higgledy-piggledy." The geologist William Whewell, master of Trinity College, Cambridge, refused to allow it into the college library on the grounds that it would threaten the moral fiber of England. In June 1860, the first great public debate about Darwin took place at the annual meeting of the British Association for the Advancement of Science. 
In a spontaneous exchange, Samuel "Soapy Sam" Wilberforce, the bishop of Oxford, clashed with biologist Thomas Huxley, one of Darwin's strongest defenders. Wilberforce asked Huxley if he was descended from apes on his grandmother's or his grandfather's side of the family. Huxley replied that if given the choice between being descended from apes, or from "a man highly endowed by nature" who used those gifts "for the mere purpose of introducing ridicule into a grave scientific discussion, I unhesitatingly affirm my preference for the ape." The gale of laughter that followed Huxley's remark heralded a storm over Darwin's ideas that continues, 145 years later. From checker at panix.com Fri Dec 9 01:55:02 2005 From: checker at panix.com (Premise Checker) Date: Thu, 8 Dec 2005 20:55:02 -0500 (EST) Subject: [Paleopsych] Newsweek: Charles Darwin: Evolution of a Scientist Message-ID: Charles Darwin: Evolution of a Scientist http://www.msnbc.msn.com/id/10118787/site/newsweek/ [Arts and Letters Daily pointed to several articles on the evolution controversy. Here are most of them. Like the summary in The Week of the great man, this is also very good.] He had planned to enter the ministry, but his discoveries on a fateful voyage 170 years ago shook his faith and changed our conception of the origins of life. By Jerry Adler Newsweek Nov. 28, 2005 issue - On a December night in 1831, HMS Beagle, on a mission to chart the coast of South America, sailed from Plymouth, England, straight into the 21st century. Onboard was a 22-year-old amateur naturalist, Charles Darwin, the son of a prosperous country doctor, who was recruited for the voyage largely to provide company for the Beagle's aloof and moody captain, Robert FitzRoy. For the next five years, the little ship - just 90 feet long and eight yards wide - sailed up and down Argentina, through the treacherous Strait of Magellan and into the Pacific, before returning home by way of Australia and Cape Town. Toward the end of the voyage, the Beagle spent five weeks at the remote archipelago of the Galapagos, home to giant tortoises, black lizards and a notable array of finches. Here Darwin began to formulate some of the ideas about evolution that would appear, a quarter-century later, in "The Origin of Species," which from the day it was written to the present has been among the most influential books ever published. Of the revolutionary thinkers who have done the most to shape the intellectual history of the past century, two - Sigmund Freud and Karl Marx - are in eclipse today, and one - Albert Einstein - has been accepted into the canon of modern thought, even if most people still don't understand what he was thinking. Darwin alone remains unassimilated, provocative, even threatening to some - like Pat Robertson, who recently warned the citizenry of Dover, Pa., that they risked divine wrath for siding with Darwin in a dispute over high-school biology textbooks. Could God still be mad after all this time? Unintentionally, but inescapably, that is the question raised by a compelling new show that opened Saturday at the American Museum of Natural History in New York. Here are the beetles Darwin collected fanatically, the fossils and ferns he studied obsessively, two live Galapagos tortoises like the ones he famously rode bareback, albeit these were hatched in the Milwaukee County Zoo.
And here are the artifacts of his life: his tiny single-shot pistol, his magnifying glass and rock hammer - and the Bible that traveled around the world with him, a reminder that before his voyage he had been studying for the ministry. (Indeed, in a letter to his father, who opposed the trip, he listed all the latter's objections, starting with "disreputable to my character as a clergyman hereafter." Little did he imagine.) The show, which will travel to Boston, Chicago and Toronto before ending its tour in London in Darwin's bicentennial year of 2009, coincides by chance with the publication of two major Darwin anthologies as well as a novel by best-selling author John Darnton, "The Darwin Conspiracy," which playfully inverts history by portraying Darwin as a schemer who dispatched a rival into a volcano and stole the ideas that made him famous. Visitors to Britain will note that Darwin has replaced that other bearded Victorian icon, Charles Dickens, on the British 10-pound note. "Even people who aren't comfortable with Darwin's ideas," says Niles Eldredge, the museum's curator of paleontology, "are fascinated by the man." In part, the fascination with the man is being driven by his enemies, who say they're fighting "Darwinism," rather than evolution or natural selection. "It's a rhetorical device to make evolution seem like a kind of faith, like 'Maoism'," says Harvard biologist E. O. Wilson, editor of one of the two Darwin anthologies just published. (James D. Watson, codiscoverer of DNA, edited the other, but both include the identical four books.) "Scientists," Wilson adds, "don't call it 'Darwinism'." But the man is, in fact, fascinating. His own life exemplifies the painful journey from moral certainty to existential doubt that is the defining experience of modernity. He was an exuberant outdoorsman who embarked on one of the greatest adventures in history, but then never again left England. He lived for a few years in London before marrying his first cousin Emma, and moving to a country house where he spent the last 40 years of his life, writing, researching and raising his 10 children, to whom he was extraordinarily devoted. Eldredge demonstrates, in his book accompanying the museum show, "Darwin: Discovering the Tree of Life," how the ideas in "The Origin of Species" took shape in Darwin's notebooks as far back as the 1830s. But he held off publishing until 1859, and then only because he learned that a younger scientist, Alfred Russel Wallace, had come up with a similar theory. Darwin was afflicted throughout his later life by intestinal distress and heart palpitations, which kept him from working for more than a few hours at a time. There are two theories about this mysterious illness: a parasite he picked up in South America, or, as Eldredge believes, anxiety over where his intellectual journey was leading him, and the world. It appeared to many, including his own wife, that the destination was plainly hell. Emma, who had other plans for herself, was tormented to think they would spend eternity apart. Darwin knew full well what he was up to; as early as 1844, he famously wrote to a friend that to publish his thoughts on evolution would be akin to "confessing a murder." To a society accustomed to searching for truth in the pages of the Bible, Darwin introduced the notion of evolution: that the lineages of living things change, diverge and go extinct over time, rather than appear suddenly in immutable form, as Genesis would have it.
A corollary is that most of the species alive now are descended from one or at most a few original forms (about which he - like biologists even today - has little to say). By itself this was not a wholly radical idea; Darwin's own grandfather, the esteemed polymath Erasmus Darwin, had suggested a variation on that idea decades earlier. But Charles Darwin was the first to muster convincing evidence for it. He had the advantage that by his time geologists had concluded that the Earth was millions of years old (today we know it's around 4.5 billion); an Earth created on Bishop Ussher's Biblically calculated timetable in 4004 B.C. wouldn't provide the scope necessary to come up with all the kinds of beetles in the world, or even the ones Darwin himself collected. And Darwin had his notebooks and the trunkloads of specimens he had shipped back to England. In Argentina he unearthed the fossil skeleton of a glyptodont, an extinct armored mammal that resembled the common armadillos he enjoyed hunting. The armadillos made, he wrote, "a most excellent dish when roasted in [their] shell," although the portions were small. The glyptodont, by contrast, was close to the size of a hippopotamus. Was it just a coincidence that both species were found in the same place - or could the smaller living animal be descended from the extinct larger one? But the crucial insights came from the islands of the Galapagos, populated by species that bore obvious similarities to animals found 600 miles away in South America - but differences as well, and smaller differences from one island to another. To Darwin's mind, the obvious explanation was that the islands had been colonized from the mainland by species that then evolved along diverging paths. He learned that it was possible to tell on which island a tortoise was born from its shell. Did God, the supreme intelligence, deign to design distinctive shell patterns for the tortoises of each island? Darwin's greater, and more radical, achievement was to suggest a plausible mechanism for evolution. To a world taught to see the hand of God in every part of Nature, he suggested a different creative force altogether, an undirected, morally neutral process he called natural selection. Others characterized it as "survival of the fittest," although the phrase has taken on connotations of social and economic competition that Darwin never intended. But he was very much influenced by Thomas Malthus, and his idea that predators, disease and a finite food supply place a limit on populations that would otherwise multiply indefinitely. Animals are in a continuous struggle to survive and reproduce, and it was Darwin's insight that the winners, on average, must have some small advantage over those who fall behind. His crucial insight was that organisms which by chance are better adapted to their environment - a faster wolf, or deer - have a better chance of surviving and passing those characteristics on to the next generation. (In modern terms, we would say pass on their genes, but Darwin wrote long before the mechanisms of heredity were understood.) Of course, it's not as simple as a one-dimensional contest to outrun the competition. If the climate changes, a heavier coat might represent the winning edge. For a certain species, intelligence has been a useful trait. Evolution is driven by the accumulation of many such small changes, culminating in the emergence of an entirely new species.
"[F]rom the war of nature, from famine and death, the most exalted object which we are capable of conceiving, namely, the production of the higher animals, directly follows," Darwin wrote. And there was an even more troubling implication to his theory. To a species that believed it was made in the image of God, Darwin's great book addressed only this one cryptic sentence: "Much light will be thrown on the origin of man and his history." That would come 12 years later, in "The Descent of Man," which explicitly linked human beings to the rest of the animal kingdom by way of the apes. "Man may be excused for feeling some pride at having risen, though not through his own exertions, to the very summit of the organic scale," Darwin wrote, offering a small sop to human vanity before his devastating conclusion: "that man with all his noble qualities ... still bears in his bodily frame the indelible stamp of his lowly origin." So it was apparent to many even in 1860?when the Anglican Bishop Samuel Wilberforce debated Darwin's defender Thomas Huxley at Oxford?that Darwin wasn't merely contradicting the literal Biblical account of a six-day creation, which many educated Englishmen of his time were willing to treat as allegory. His ideas, carried to their logical conclusion, appeared to undercut the very basis of Christianity, if not indeed all theistic religion. Was the entire panoply of life stretching back millions of years to its single-celled origins, with its innumerable extinctions and branchings, really just a prelude and backdrop to the events of the Bible? When did Homo sapiens, descended by a series of tiny changes in an unbroken line from earlier species of apes, develop a soul? The British biologist Richard Dawkins, an outspoken defender of Darwin and a nonbeliever, famously wrote that evolution "made it possible to be an intellectually fulfilled atheist." Although Darwin struggled with questions of faith his whole life, he ultimately described himself as an "Agnostic." But he reached that conclusion through a different, although well-traveled, route. William Howarth, an environmental historian who teaches a course at Princeton called "Darwin in Our Time," dates Darwin's doubts about Christianity to his encounters with slave-owning Christians?some of them no doubt citing Scripture as justification?which deeply offended Darwin, an ardent abolitionist. More generally, Darwin was troubled by theodicy, the problem of evil: how could a benevolent and omnipotent God permit so much suffering in the world he created? Believers argue that human suffering is ennobling, an agent of "moral improvement," Darwin acknowledged. But with his intimate knowledge of beetles, frogs, snakes and the rest of an omnivorous, amoral creation, Darwin wasn't buying it. Was God indifferent to "the suffering of millions of the lower animals throughout almost endless time"? In any case, it all changed for him after 1851. In that year Darwin's beloved eldest daughter, Annie, died at the age of 10?probably from tuberculosis?an instance of suffering that only led him down darker paths of despair. A legend has grown up that Darwin experienced a deathbed conversion and repentance for his life's work, but his family has always denied it. He did, however, manage to pass through the needle's eye of Westminster Abbey, where he was entombed with honor in 1882. 
So it's not surprising that, down to the present day, fundamentalist Christians have been suspicious of Darwin and his works - or that in the United States, where 80 percent of the population believe God created the universe, less than half believe in evolution. Some believers have managed to square the circle by mapping out separate realms for science and religion. "Science's proper role is to explore natural explanations for the material world," says the biologist Francis Collins, director of the Human Genome Project and an evangelical Christian. "Science provides no answers to the question 'Why are we here, anyway?' That is the role of philosophy and theology." The late Stephen Jay Gould, a prolific writer on evolution and a religious agnostic, took the same approach. But, as Dawkins tirelessly observes, religion makes specific metaphysical claims that appear to conflict with those of evolution. Dealing with those requires some skill in Biblical interpretation. In mainstream Christian seminaries the dominant view, according to Holmes Rolston III, a philosopher at Colorado State University and author of "Genes, Genesis and God," is that the Biblical creation story is a poetic version of the scientific account, with vegetation and creatures of the sea and land emerging in the same basic order. In this interpretation, God gives his creation a degree of autonomy to develop on its own. Rolston points to Genesis 1:11, where God, after creating the heavens and the Earth, says, "Let the Earth put forth vegetation ..." "You needed a good architect at the big bang to get the universe set up right," he says. "But the account describes a God who opens up possibilities in which creatures are generated in an Earth that has these rich capacities." Collins identifies the soul with the moral law, the uniquely human sense of right and wrong. "The story of Adam and Eve can thus be interpreted as the description of the moment at which this moral law entered the human species," he says. "Perhaps a certain threshold of brain development had to be reached before this became possible - but in my view the moral law itself defies a purely biological explanation." The Darwin exhibit was conceived in 2002, when the current round of Darwin-bashing was still over the horizon, but just in those three years' time museum officials found they had to greatly expand their treatment of the controversy - in particular, the rise of "intelligent design" as an alternative to natural selection. ID posits a supernatural force behind the emergence of complex biological systems - such as the eye - composed of many interdependent parts. Although ID advocates have struggled to achieve scientific respectability, biologists overwhelmingly dismiss it as nonsense. Collins comments, in a video that is part of the museum show: "[ID] says, if there's some part of science that you can't understand, that must be where God is. Historically, that hasn't gone well. And if science does figure out [how the eye evolved] - and I believe it's very likely that science will ... then where is God?" Where is God? It is the mournful chorus that has accompanied every new scientific paradigm over the last 500 years, ever since Copernicus declared him unnecessary to the task of getting the sun up into the sky each day. The church eventually reconciled itself to the reality of the solar system, which Darwin, perhaps intentionally, invoked in the stirring conclusion to the "Origin": "There is grandeur in this view of life ...
that whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved." For all his nets and guns and glasses, Darwin never found God; by the same token, the Bible has nothing to impart about the genetic relationships among the finches he did find. But it is human nature to seek both kinds of knowledge. Perhaps after a few more cycles of the planet, we will find a way to pursue them both in peace. With Anne Underwood and William Lee Adams From checker at panix.com Fri Dec 9 01:55:17 2005 From: checker at panix.com (Premise Checker) Date: Thu, 8 Dec 2005 20:55:17 -0500 (EST) Subject: [Paleopsych] Jerusalem Post: Are Jews born smart? Message-ID: Are Jews born smart? http://www.jpost.com/servlet/Satellite?cid=1132475650155&pagename=JPost%2FJPArticle%2FPrinter [The idea that we use only 5-7% of our brain flies in the face of evolutionary logic. The brain uses about 20% of the body's calories yet constitutes only 2% of its mass.] JEREMY MAISSEL, THE JERUSALEM POST Nov. 29, 2005 Ashkenazi Jews are genetically intellectually superior to everyone else. This is the conclusion of a recent "scientific" study entitled Natural History of Ashkenazi Intelligence that triggered several articles in popular publications such as New York Magazine, The New York Times and the Economist. In this study, Gregory Cochran, Jason Hardy and Henry Harpending of the University of Utah's anthropology department suggest a genetic explanation to account for this remarkable intellectual achievement. They base their hypothesis on four observations. First, that Ashkenazi Jews have the highest average IQ of any ethnic grouping. Second, Ashkenazim have a very low inward gene flow (low intermarriage). Third, historic restrictions on professions allowed to Jews, such as money-lending, banking and tax farming, in which higher intelligence strongly favored economic success, in turn led to increased reproductive success. Low intermarriage acted as a selective process genetically favoring these abilities. Fourth, genetic mutations responsible for diseases commonly found in Ashkenazi Jews, such as Tay-Sachs, are responsible for improved intelligence. My initial reaction to a theory like this is suspicion laced with a healthy dose of skepticism. Undoubtedly Ashkenazim have made a disproportionate contribution to Western intellectual and cultural life - think Freud, Einstein, Mahler, or Woody Allen and Jerry Seinfeld, to name but a few. But saying that Ashkenazi genes are different calls into question the motivation behind the research. SHOULD 'RACE' be dignified as a subject of scientific study? To refuse to investigate a subject, however objectionable, would in itself be unscientific. Yet the attention of the scientific community alone lends it credibility. This study is to be published in The Journal of Biosocial Science in 2006 by Cambridge University Press. The paper drew considerable criticism for both its aims and methods from geneticists, historians, social scientists and other academics as "poor science" - condemning its polemical style and the lack of usual rigor and dispassion of scientific texts. But what do we do with the conclusions of the thesis? Maybe file them with Jewish conspiracies such as The Protocols of the Elders of Zion? Claiming we are a race genetically differentiated from the rest of humanity could provide excellent material for anti-Semites.
It could share a shelf with other "scientific" works on race and intelligence such as those of Arthur Jensen or The Bell Curve by Charles Murray and Richard Herrnstein - which questioned affirmative action in the US, claiming that African-Americans are genetically inferior in intellectual abilities. Is the Harpending and Cochran study any less odious for the fact that it portrays Jews in a positive light? JUDAISM HAS never advocated Jewish racial superiority. Indeed, the Talmud (Sanhedrin 38a) explains that Adam, the biblical first man, was created singly as the common forebear of all mankind so that future families would not quarrel over claims of superiority in their respective ancestry. If racial purity were important, the Jewish people would not have accepted converts, or would maybe reconsider the status of their offspring. Yet we have the biblical story of Ruth, a convert who is not only accepted into the Jewish people, but whose descendants include King David and, ultimately, the Messiah. Down the centuries, reluctance to accept converts was based on concerns about the smooth transmission of family traditions, religious observances, history and culture, and not the watering-down of blood, diluting DNA, or contamination of the Jewish gene pool. Being "the chosen people" does not make Jews superior either. The idea of chosenness first appears in the book of Exodus (19:5-6) where, contingent on complying with and keeping the Divine covenant, the Jewish people is singled out to become "a kingdom of priests and a holy nation." In the words of Henri Atlan: "Election does not imply superiority or inherent sanctity, since the correct reading of the Bible in fact implies conditional chosenness. The election is one of duty, not of rights or attributes." IF JEWS aren't racially superior, then, how does one account for the undeniably disproportionate achievements of Jews (numbering 0.2% of the world population) at winning Nobel prizes, for example? There is a "self-fulfilling prophecy" explanation. Nobel prizes are awarded according to a set of culturally-rooted values - extolling the virtues of Western civilization and rewarding its paradigms - and we should bear in mind that Judaism made a significant contribution to that civilization. Jews have always been literate, and historically the professional restrictions on Ashkenazi Jews encouraged them to promote "exile-proof" skills. They valued and encouraged learning, hard work and achievement. These were a cultural legacy, not innate qualities. If race is the source of those achievements, where does hard work or personal endeavor enter the equation? If I am an Ashkenazi Jew, is it my destiny to achieve? And what do we do with this within the Jewish world? We really don't need another source of divisiveness along the Ashkenazi/Sephardi rift. My own view as an educator is that everyone has the same intellectual potential, regardless of lineage. Psychologists maintain that the average person uses only 5-7% of that potential. Differing levels of achievement among people are accounted for by the amount of their potential they have managed to exploit. If there is any common factor accounting for the achievement of some exceptional Ashkenazi Jews it may be their cultural legacy that has enabled them to make more of themselves. Their achievements are not predestined by an accident of birth. The writer, a member of Kibbutz Alumim, is senior educator in Melitz Centers for Jewish-Zionist Education.
From checker at panix.com Fri Dec 9 01:56:14 2005 From: checker at panix.com (Premise Checker) Date: Thu, 8 Dec 2005 20:56:14 -0500 (EST) Subject: [Paleopsych] Physics World: Does God play dice? Message-ID: Does God play dice? (December 2005) - Physics World - PhysicsWeb http://www.physicsweb.org/articles/world/18/12/2/1 [I like Seth Lloyd's 10^120 calculation of the maximum number of calculations since the universe began. I did something similar, namely to take 1. The number of photons, 2. The Planck distance divided by the speed of light, as the number of movements a photon can make per unit of time, and 3. The number of units of time since the Big Bang. I multiplied them together and got iirc just this 10^120. Now since 2^10 is approx. 10^3, 10^120 is approx. (2^10)^40 = 2^400. I've seen the 10^120 figure elsewhere, but this may just have been a repeat of Lloyd's reasoning. [And so a key of about 400 bits would be absolutely unbreakable in the next 13.5 billion years, if the entire universe were devoted to breaking it AND there were no barriers, like the speed of light, to slow down the calculations coming together. [Yet I've been told that it is possible to crack a 600-bit encryption. Please reconcile this! (One reconciliation is sketched in the note below.) And while you are at it, tell me how much communication is slowed down as the number of bits increases. [A different point: I can hardly think that superstring theory should be stopped just because we NOW have no means of testing it. This is like Dr. Michael Behe saying there must be an intelligent designer because Dr. Michael Behe cannot figure out how life evolved. It is immoral ever to stop inquiry. Unless you run out of grant money, of course.] Forum: December 2005 Einstein was one of the founders of quantum mechanics, yet he disliked the randomness that lies at the heart of the theory. God does not, he famously said, play dice. However, quantum theory has survived a century of experimental tests, although it has yet to be reconciled with another of Einstein's great discoveries - the general theory of relativity. Below four theorists - Gerard 't Hooft, Edward Witten, Fay Dowker and Paul Davies - outline their views on the current status of quantum theory and the way forward. Gerard 't Hooft argues that the problems we face in reconciling quantum mechanics with general relativity could force us to reconsider the basic principles of both theories. If there is any preconceived notion concerning the laws of nature - one that we can rely on without any further questioning - it is the assumption that they are controlled by strict logic. Under all conceivable circumstances, the laws of nature should dictate how the universe evolves. Curiously, however, quantum mechanics has given a new twist to this adage. It does not allow a precise sequence of events to be predicted, only statistical averages. All statistical averages can be predicted - in principle with infinite accuracy - but nothing more than that. Einstein was one of the first people to protest against this impoverishment of the concept of logic. It has turned out, however, to be a fact of life. Quantum mechanics is the only known realistic description of the microscopic parts of our universe like atoms and molecules, and it works just fine. Logically impoverished or not, quantum mechanics appears to be completely self-consistent. But how does quantum mechanics tie in with particles that are much smaller than atoms?
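[A sketch of that key-size arithmetic in Python. The script is my own illustration - the GNFS cost formula is standard cryptography, not anything from Lloyd or the article. The reconciliation: exhaustive search of a ~400-bit key space really is blocked by the 10^120-operation budget, but "600-bit encryption" in practice means a 600-bit RSA modulus, and RSA falls to factoring, which is subexponential (the general number field sieve), not to brute force; indeed a 640-bit RSA challenge modulus was factored in November 2005. As for communication, key and ciphertext sizes grow only linearly with bit length, so transmission cost is minor; the slowdown is in the arithmetic, roughly cubic in modulus size for RSA private-key operations.

```python
import math

# Lloyd's bound: ~10^120 elementary operations since the Big Bang.
OPS_BOUND = 10 ** 120

# Bit length of a key whose exhaustive search would exceed that bound
# (illustrative; assumes one key trial per elementary operation).
print(math.log2(OPS_BOUND))          # ~398.6, i.e. a ~400-bit key suffices

# Heuristic cost of factoring an n-bit RSA modulus N with the general
# number field sieve: exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3)).
def gnfs_log2_cost(n_bits):
    ln_n = n_bits * math.log(2)      # ln N for an n-bit modulus
    work = math.exp((64 / 9) ** (1 / 3)
                    * ln_n ** (1 / 3)
                    * math.log(ln_n) ** (2 / 3))
    return math.log2(work)

for n in (600, 1024, 2048):
    print(n, round(gnfs_log2_cost(n)))   # ~2^69, ~2^87, ~2^117 operations
```

So cracking a 600-bit modulus takes on the order of 2^69 operations, comfortably inside the cosmic budget - no contradiction with the 2^400 exhaustion bound.]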
The Standard Model is the beautiful solution to two fundamental problems: one, how to combine quantum mechanics with Einstein's theory of special relativity; and two, how to explain numerous experimental observations concerning the behaviour of sub-atomic particles in terms of a concise theory. This model tells us how far we can go with quantum mechanics. Provided that we adhere strictly to the principles of quantum field theory, nature obeys both quantum mechanics and special relativity up to arbitrarily small distance and time scales. Just like all other successful theories of nature, the Standard Model obeys the notions of locality and causality, which makes this theory completely comprehensible. In other words, the physical laws of this theory describe in a meaningful way what happens under all conceivable circumstances. The standard theory of general relativity, which describes the gravitational forces in the macroscopic world, approaches a similar degree of perfection. Einstein's field equations are local, and here, cause also precedes effect in a local fashion. These laws, too, are completely unambiguous. But how can we combine the Standard Model with general relativity? Many theorists appear to think that this is just a technical problem. But if I say something like "quantum general relativity is not renormalizable", this is much more than just a technicality. Renormalizability has made the Standard Model possible, because it lets us answer the question of what happens at extremely tiny distance scales. Or, more precisely, how can we see that cause precedes effect there? If cause did not precede effect, we would have no causality or locality - and no theory at all. Asking both questions in quantum gravity does not appear to make sense. At distance scales small compared with the Planck scale, some 10^-33 cm, there seems to be no such thing as a space-time continuum. That is because gravity causes space-time to be highly curved at very small distances. And at small distance scales, this curvature exceeds all bounds. But what exactly does this mean? Are space and time discrete? What then do concepts such as causality and locality mean? Without proper answers to such questions, there is no logically consistent formalism, not even a quantum-mechanical one. One ambitious attempt to combine quantum mechanics with general relativity is superstring theory. However, I am unhappy with the answers that this theory seems to suggest to us. String theory seems to be telling us to believe in "magic": it is claimed that "duality theorems", which are not properly understood, will allow us to predict features without reference to locality or causality. To me such magic is synonymous with "deceit". People only rely on magic if they do not understand what is really going on. This is not acceptable in physics. In thinking about these matters, I have reached a conclusion that few other researchers have adopted: the problem lies with quantum mechanics, possibly with general relativity, or conceivably with both. Quantum mechanics could well relate to micro-physics the same way that thermodynamics relates to molecular physics: it is formally correct, but it may well be possible to devise deterministic laws at the micro scale. However, many researchers say that the mathematical nature of quantum mechanics does not allow this - a claim deduced from what are known as "Bell inequalities". 
In 1964 John Bell showed that a deterministic theory should, under all circumstances, obey mathematical inequalities that are actually violated by the quantum laws. This contradiction, however, arises if one assumes that the particles we talk about, and their properties, are real, existing entities. But if we assume that objects are only real if they have been precisely defined, including all oscillations as small as the Planck scale - and that only our measurements of the properties of particles are real - then there is no blatant contradiction. One might assume that all macroscopic phenomena, such as particle positions, momenta, spins and energies, relate to microscopic variables in the same way thermodynamic concepts such as entropy and temperature relate to local, mechanical variables. Particles, and their properties, are not (or not entirely) real in the ontological sense. The only realities in this theory are the things that happen at the Planck scale. The things we call particles are chaotic oscillations of these Planckian quantities. What exactly these Planckian degrees of freedom are, however, remains a mystery. This leads me to an even more daring proposition. Perhaps general relativity does not appear in the formalism of the ultimate equations of nature. In making the transition from a deterministic theory to a statistical - i.e. quantum mechanical - treatment, one may find that the quantum description develops many more symmetries than the deeper deterministic description. Let me try to clarify what I mean. If, according to the deterministic theory, two different states evolve into the same final state, then quantum mechanically these states will be indistinguishable. We call such a feature "information loss". In quantum field theories such as the Standard Model, we often work with fields that are not directly observable, because of "gauge invariance", which is a symmetry. Now, I propose to turn this around. In a deterministic theory with information loss, certain states are unobservable (because information about them has disappeared). When one uses a quantum-mechanical language to describe such a situation, gauge symmetries naturally arise. These symmetries are not present in the initial laws. The "general co-ordinate covariance" of general relativity could be just such a symmetry. This is indeed an unusual view on the concept of symmetries in nature. Nature provides us with one indication that perhaps points in this direction: the unnatural, tiny value of the cosmological constant Λ. It indicates that the universe has a propensity to stay flat. Why this happens is a mystery that cannot be explained in any theory in which gravitation is subject to quantum mechanics. If, however, an underlying, deterministic description naturally features some preferred flat co-ordinate frame, the puzzle will cease to perplex us. There might be another example, which is the preservation of the symmetry between the quarks in the subatomic world, called charge-parity (CP) symmetry - a symmetry that one would have expected to be destroyed by their strong interactions. The problem of the cosmological constant has always been a problem of quantum gravity. I am convinced that the small value of Λ cannot be reconciled with the standard paradigms of quantized fields and general relativity. It is obvious that drastic modifications in our way of thinking, such as the ones hinted at in this text, are required to solve the problems addressed here.
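[Bell's 1964 result, invoked above, has a compact quantitative form, the CHSH inequality: any local hidden-variable model obeys |S| <= 2 for a certain combination S of correlations, while quantum mechanics for the spin-singlet state, whose correlation at analyser angles a and b is E(a,b) = -cos(a-b), reaches 2*sqrt(2). A minimal check with the standard optimal angles - my illustration, not from the article:

```python
import math

def E(a, b):
    """Singlet-state correlation for analyser angles a, b (radians)."""
    return -math.cos(a - b)

# Standard optimal CHSH angles (radians).
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # 2*sqrt(2) ~ 2.828, above the local-hidden-variable bound of 2
```
]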
Edward Witten thinks that one of the most perplexing aspects of quantum mechanics is how to apply it to the whole universe. Quantum mechanics is perplexing, and likely to remain so. The departure from ordinary classical intuition that came with the emergence of quantum mechanics is almost surely irrevocable. An improved future theory, if there is one, will probably only lead us farther afield. Is there any hint of a clue that might lead to a more complete theory? Experimental physicists are increasingly able to perform experiments that used to be called thought experiments in textbooks. Quantum mechanics has held up brilliantly. If there is a cloud on the horizon, it is that it is hard to see what it means to apply quantum mechanics to the whole universe. I suppose that there are two aspects to this. Quantum-mechanical probabilities do not seem to make much sense when applied to the whole universe, which appears to happen only once. And we all find it confusing to include ourselves in a quantum description of the whole universe. Yet applying quantum mechanics to something less than the whole universe - to an experimental system that is observed by a classical human observer - is precisely what forces us to interpret quantum mechanics in terms of probabilities. If we had a good understanding of what quantum mechanics means when applied to the whole universe, we might ultimately say that the notion that "God plays dice" results from trying to describe a quantum reality in classical terms. Fay Dowker thinks that the puzzles of quantum mechanics could be solved by considering what are known as the "histories" of a system, as introduced by Richard Feynman. The development of quantum mechanics was a major advance in our understanding of the physical world. However, quantum mechanics has not yet come fully to fruition because it has not replaced classical mechanics in the way that general relativity has replaced Newtonian gravity. In the latter case, we can start from general relativity and derive the laws of Newtonian gravity as an approximation; we can also predict when - and quantitatively to what extent - that approximation is valid. But we cannot yet derive classical mechanics from quantum mechanics in the same way. The reason is that, in its standard textbook formulation, quantum mechanics requires us to assume we have classical measuring equipment. Predictions about the measurements that are recorded, or observed, by this equipment form the scientific output of the theory. But without a classical observer, we cannot make any predictions. While many physicists have been content with quantum mechanics in its textbook form, others - beginning with Einstein - have sought to complete the quantum revolution and make it a truly universal theory, independent of any classical crutch. One attempt to sort out quantum mechanics is to view it as a generalization of classical "stochastic" theories, such as Brownian motion. In Brownian motion, a particle moves along one of a number of possible trajectories, or "histories". The notion of a history is crucial here: it is a complete description of the system at each time between some initial and final times. A history is an a priori possibility for the complete evolution of the system, and the collection of all the histories is called the "sample space". The system will have only one actual history from the sample space but any one is an a priori possibility.
The actual history is chosen from the sample space at random according to the "law of motion" for a Brownian particle. This law is a probability distribution, or "measure", on the sample space that outlines, roughly, how likely each history is for the actual evolution. Quantum mechanics also has a sample space of possible histories - trajectories of a particle, say - but on this occasion the sample space has a "quantal measure" associated with it. As with Brownian motion, the quantal measure gives a non-negative number for each subset of the sample space. However, this quantal measure cannot now be interpreted as a probability because of the phenomenon of quantum interference, which means that the numbers cannot be added together like probabilities. For example, when electrons pass through a Young's double-slit set-up, the quantal measure of the set of all histories for the electron that ends up at a particular region on the screen is not just the quantal measure of the set of histories that goes through one slit added to the quantal measure of the set of histories that goes through the other. Essentially, this is due to the phenomenon we call quantum interference between histories, which is due, in turn, to the way we calculate the quantum measure of a bunch of histories as the square of the sum of the amplitudes of the histories in the bunch. When you add some numbers and then square the result, you do not get the sum of the squares - there are also cross terms, which are the expression of the interference that spoils the interpretation as probabilities. The challenge is to find the right interpretation of this quantal measure, one that explains the textbook rules by predicting objectively when classical "measurement" situations arise. This includes the struggle to understand quantum mechanics as a theory that respects relativistic causality in the face of experimental evidence that widely separated particles can be correlated in ways that seem incompatible with special relativity. It is no coincidence that those physicists who are at the forefront of developing this histories approach to quantum mechanics - people like James Hartle from the University of California at Santa Barbara, Chris Isham at Imperial College, London and Rafael Sorkin at Syracuse University - all work on the problem of quantum gravity, which is the attempt to bring gravity within the framework of a universal quantum theory. In histories quantum gravity, each history in the sample space of possibilities is not in space-time; rather, each history is a space-time. If a theory of quantum gravity of this sort can be achieved, it would embody Einstein's hopes for a unification in which matter and space-time, observer and observed, are all treated on an equal footing. Paul Davies believes that the complexity of a system could define the boundary between the quantum and classical worlds. Despite its stunning success in describing a wide range of phenomena in the micro-world, quantum mechanics remains a source of puzzlement. The trouble stems from meshing the quantum to the classical world of familiar experience. A quantum particle can be in a superposition of states - for example it may be in many places at once - whereas the "classical" world of observation reveals a single reality. This conundrum is famously captured by the paradox of Schrödinger's cat, in which a quantum superposition is amplified in order to put an animal into an apparently live-dead hybrid state.
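[Two bits of arithmetic behind this stretch of the article. First, Dowker's cross-term point: since the quantal measure of a bunch of histories is the squared magnitude of the summed amplitudes, measures of disjoint bunches do not add. Second, the dimension count behind the "about 400 electrons" threshold Davies invokes just below. The amplitudes here are made up for illustration; only the algebra is the article's:

```python
import math

# Toy double slit: one made-up complex amplitude per bunch of histories.
psi_1 = 0.6 + 0.0j      # histories through slit 1 (assumed value)
psi_2 = 0.3 + 0.5j      # histories through slit 2 (assumed value)

mu_1 = abs(psi_1) ** 2               # quantal measure of slit-1 bunch: 0.36
mu_2 = abs(psi_2) ** 2               # quantal measure of slit-2 bunch: 0.34
mu_union = abs(psi_1 + psi_2) ** 2   # measure of both together: 1.06

cross = 2 * (psi_1 * psi_2.conjugate()).real   # interference term: 0.36
print(mu_union, mu_1 + mu_2 + cross)           # equal - measures don't add

# Davies' criterion: n spin-1/2 particles span a 2^n-dimensional Hilbert
# space, which first exceeds Lloyd's 10^120 at n = 399 - his "about 400".
print(math.ceil(math.log2(10 ** 120)))         # 399
```
]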
Physicists divide into those who believe quantum mechanics is a complete theory that applies to the universe as a whole, regardless of scale, and those who think it must break down at some level between atom and observer. The former group subscribe to the "many universes" interpretation, according to which all branches of a quantum superposition are equally valid and describe parallel realities. Though many physicists reject this interpretation as unacceptably bizarre, there is no consensus on the alternative. Quantum mechanics does not seem to fail at any obvious scale of size or mass, as the phenomenon of superconductivity attests. So perhaps some other property of a physical system signals the emergence of classicality from the quantum realm? I want to suggest that complexity may be the appropriate quantity. Just how complex must a system be to qualify for the designation "classical"? A cat is, I submit, a classical object because it is complex enough to be either alive or dead, and not both at the same time. But specifying a precise measure of complexity is difficult. Many definitions on offer are based on information theory or computing. There is, however, a natural measure of complexity that derives from the very nature of the universe. This is defined by the maximum amount of information that the universe can possibly have processed since its origin in a Big Bang. Seth Lloyd of the Massachusetts Institute of Technology has computed this to be about 10^120 bits (2000 Nature 406 1047 and 2002 Phys. Rev. Lett. 88 237901). A system that requires more than this quantity of information to describe it in detail is so complex that the normal mathematical laws of physics cannot be applied to arbitrary precision without exceeding the information capacity of the universe. Cosmology thus imposes a small but irreducible uncertainty, or fuzziness, in the operation of physical laws. For most systems the Lloyd limit is irrelevantly large. But quantum systems are described by vectors in a so-called Hilbert space, which may have a great - indeed infinite - number of dimensions. According to my maximum-complexity criterion, quantum mechanics will break down when the dimensionality of the Hilbert space exceeds about 10^120. A simple example is an entangled state of many electrons. This is a special form of superposition in which up and down spin orientations co-exist in all possible combinations. Once there are about 400 electrons in such a state, the Lloyd limit is exceeded, suggesting that it is at this level of complexity that classicality emerges. Although such a state is hard to engineer, it lies firmly within the design specifications of the hoped-for quantum computer. This is a machine that would harness quantum systems to achieve an exponentially greater level of computing power than a conventional computer. If my ideas are right, then this eagerly awaited technology will never achieve its full promise. About the authors Gerard 't Hooft is in the Institute for Theoretical Physics, University of Utrecht, the Netherlands, e-mail g.thooft at phys.uu.nl; Edward Witten is in the Institute for Advanced Study, Princeton, US, e-mail witten at ias.edu; Fay Dowker is at Imperial College, London, UK, e-mail f.dowker at imperial.ac.uk; and Paul Davies is professor of natural philosophy in the Australian Centre for Astrobiology, Macquarie University, Sydney, Australia, e-mail pdavies at els.mq.edu.au From guavaberry at earthlink.net Fri Dec 9 18:17:33 2005 From: guavaberry at earthlink.net (K.E.)
Date: Fri, 09 Dec 2005 13:17:33 -0500 Subject: [Paleopsych] Shhhhh is this another Ur Strategy? In-Reply-To: <95.31905577.2c76c886@aol.com> References: <95.31905577.2c76c886@aol.com> Message-ID: <7.0.0.16.0.20051209130336.04a4c578@earthlink.net> hi everyone, sorry to interrupt the present conversation . . . but i've been wondering about this . . . . What is shhhhh? and does this fall under another Ur Strategy, a western custom, or is it a worldwide "instinct" we all have to use shhhhhh for shushing a baby to stop crying or to calm a crying baby or crying child? Is this another Ur Strategy? Do all human babies recognize this as the signal to be quiet? Do all cultures use this? I imagine it sounding like a snake's rattle but that doesn't mean much. I've heard that the same sound calms horses and sounds similar to the word for thank you in Mandarin. Do we know anything about shhhhh? Appreciate any thoughts you might have. thanks, Karen Ellis Archive 8/16/03 Re: Ur strategies and the moods of cats and dogs hb to pavel kurakin: I've had an adventure that will force me to stop for the night. One of my cats attacked me and tore several holes in my face, nearly removing one of my eyes. <>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<> The Educational CyberPlayGround http://www.edu-cyberpg.com/ National Children's Folksong Repository http://www.edu-cyberpg.com/NCFR/ Hot List of Schools Online and Net Happenings, K12 Newsletters, Network Newsletters http://www.edu-cyberpg.com/Community/ 7 Hot Site Awards New York Times, USA Today, MSNBC, Earthlink, USA Today Best Bets For Educators, Macworld Top Fifty <>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<> From checker at panix.com Fri Dec 9 21:34:09 2005 From: checker at panix.com (Premise Checker) Date: Fri, 9 Dec 2005 16:34:09 -0500 (EST) Subject: [Paleopsych] Christianity Today: Slim for Him Message-ID: Slim for Him http://www.christianitytoday.com/bc/2005/006/14.16.html [A lot of silly AntiRacism here, but it bears out what I said in my meme, "Imitate Christ, not Elvis!", reproduced at the end. That meme has not spread, at least not to Google.] Born Again Bodies: Flesh and Spirit in American Christianity by R. Marie Griffith Univ. of California Press, 2004 323 pp. $21.95 Several weeks ago my wife and I were driving home from Atlanta to Chapel Hill. A few miles out of the city, my eye caught a billboard featuring a lean young white woman pointing to her bare midriff. The caption read, "Look hon, no scars." The logo at the bottom directed viewers to the website of a local birth control clinic. Baffled (as usual) by the subtleties of modern advertising, I asked my wife what it meant. She patiently explained that it was an ad for tubal ligation. I drove on, thinking something deep like, "Oh." After reading R. Marie Griffith's Born Again Bodies this past weekend, I saw the billboard in a new light. It is not often that a work of first-rate historical scholarship opens our eyes to the unspoken assumptions regnant in the world around us. But this one--written by a Princeton University religion professor--does. And no wonder. The book is exhaustively researched, elegantly crafted, methodologically self-conscious, and argued with moral passion. The volume marks a worthy successor to Griffith's influential Harvard dissertation on Women's Aglow, published in 1997 as God's Daughters: Evangelical Women and the Power of Submission. The scope of Born Again Bodies is intimidating.
Though focused on the American story, it begins deep in the Middle Ages and ends yesterday. In the process, Griffith ranges back and forth across the Atlantic, lingers among Puritans and their evangelical successors, delves into the intricacies of 19th-century New Thought partisans, ventures into the hermetic realm of body purging and fasting zealots, surveys a plethora of Christian-inspired sexual prescriptions and proscriptions, investigates the largely unknown sideroads of phrenology, physiognomy, and soma typing, and finally ends up in the vast subculture of the contemporary Christian diet industry. Griffith's main argument can be stated in two sentences. Between the early 17th and the late 20th centuries, American body discipline practices evolved from a ritual of repentance to an exercise in self-gratification. Though a wide range of more or less secular forces propelled the process, Christians in general and white middle-class Protestants in particular pioneered that development. A closely related sub-argument is that Christians perennially have viewed the body as a window into the soul. Occasionally the process worked the other way around. A few body disciplinarians--usually New Thought advocates--felt that they could change the mind by manipulating the body. Either way, everyone, it seems, perceived an intimate connection between the spirit and the flesh. When it came to eternal matters, Christians saw through a glass darkly, but when it came to temporal matters, they saw clearly. The body told no lies. Historians generally have interpreted the evolving meanings associated with rigorous dieting (and other kinds of physical denial) as a process of secularization. What started as mortification for sin, they say, turned into purposeful renunciation to compensate for the guilt of affluence and leisure. Griffith disagrees. She argues instead that religion has been involved in those cultural protocols from beginning to end. The story is an evangelical one, centered on the good news of abstemious eating: go out, bring the (obese) sinners in, give them the (lo-cal) salvation message, hear their (before-and-after) testimonies, urge them to stay the (one-course) course, offer a helping (though never a second helping) hand to the weak-willed. If it is a New Thought story of gnostic discernment (there are bariatric secrets to be known), it is also a Wesleyan story of entire sanctification (permanent deliverance from the temptations of the palate), and a Reformed story of divine sovereignty (God's nutritional laws are non-negotiable). Above all, it is a millennialist story of can-do achievement. Our destiny lies within our hands. Just put down the fork and push yourself away from the table. The meat in the sandwich lies in chapter 4, aptly titled "Pray the Weight Away: Shaping Devotional Fitness Culture." In this chapter, inner grace manifests itself most forcefully and unequivocally in a lean, firm body, which stands as a mark of a disciplined life, a holy people and, above all, right standing with God. A sampler of the titles produced by the Christian--mostly evangelical-fundamentalist--diet industry in the past half century tells the tale. In alphabetical order:

Devotions for Dieters
God's Answer to Fat: Lose It!
Help Lord--The Devil Wants Me Fat!
I Prayed Myself Slim
More of Jesus, Less of Me
Pray Your Weight Away
Slim for Him

These and scores of other works, which sell in the millions, have been complemented by religiously based support groups such as Body and Soul, Overeaters Victorious, and 3D (Diet, Discipline, Discipleship), not to mention a cornucopia of dieting paraphernalia--including exercise tapes, work-out clothes, and training manuals--and a never-ending schedule of seminars. In her analysis of the postwar Christian diet industry, Griffith isolates at least two problems that ought to trouble the conscience. The first is the redefinition of the sins of the mouth. Where once the emphasis was on gluttony (enslavement of the appetites) or disordered desires (longing for a good less than God), now it was on fat. Just that, fat. And so it was that Gwen Shamblin, CEO and founder of Weigh Down Workshop, could say, "Grace ... does not go down into the pigpen." For Griffith, the second and more troubling problem is the diet industry's race and class pretensions, intended or not. She shows that the presumed audience was white, sustained by a "racialized ideal of whiteness, purged of the excesses associated with nonwhite cultures." It also was middle- or upper-middle-class, sustained by the affluence and leisure that made costly diet foods and gear (and for women, cosmetic enhancements) affordable. Those presumptions were not value-neutral. Rather they carried a normative edge that made the firm, angular bodies of an idealized white middle class the rule of thumb for all. Admittedly, a few African Americans, such as T. D. Jakes and Jawanza Kunjufu, joined the crusade. But most of the leaders were white, and most seemed unable to imagine that there might be a difference between good health and (their notion of) good looks, or that economic deprivation and ethnic tradition might play a role in diet options and choices. "In ways both explicit and implicit," she tells us, "white Christian men and women exchanged ideas about how to uphold their image in the world, to sustain their place at the top of the racialized class hierarchy embedded in American society and the Anglo-American Christian tradition." (How the firm, angular bodies of black athletes--ubiquitous in advertising for Nike et al.--figure in this narrative is not entirely clear.) Though Griffith does not say much about it, there is a Giant Pooka in the story, and it keeps popping up in unexpected places. Bluntly put, the diet industry, Christian and otherwise, is fighting an unwinnable battle. Sociologists, she tells us, have found that religious practice correlates positively with obesity. Christians in general and Southern Baptists in particular are the heaviest. Yet who is surprised? Whole Foods-style supermarkets might be growing, but so are McDonald's. Indeed, I do not recall ever seeing a fast-food franchise boarded up for keeps. What's not to like about this brilliant and deeply earnest book? Only this: Griffith makes Protestantism the chief protagonist. To be sure, she shows that similar attitudes about body politics crop up among Mormons and, from time to time, Catholics, Jews, and secularists too. Yet she insists that "Protestantism--as the tradition that has most comprehensively influenced the course of American history--takes center stage in this story." This claim raises more questions than it resolves. That a majority of leaders in the diet crusade happened to be Protestants is undeniable.
And that the crusade often looked like a Protestant revival also is undeniable. But I see little evidence that the sleek-body promoters drew upon historic Protestant principles, or that they represented the actual life of faith practiced Sunday-by-Sunday in countless Protestant churches. By my lights, the true culprit in this sorry tale is not Protestantism but consumer capitalism gone off the rails. Once upon a time Protestantism had something to do with capitalism's birth, but it should not be forced forever to bear the guilt for capitalism's excesses. I close on a personal note. Many decades ago one of my U.S. history professors--a scholar well-known for his high-minded support of progressive causes--casually remarked in class that President Taft, "being corpulent, was prone to be lazy." Neither he nor the 200-plus students in the lecture hall noted anything amiss. But I winced. As a lifelong battler of the scales, I suspected that Taft had felt the same desperations I have felt. And since then I have wondered about the many ways that I too might have diminished my students' lives. Marie Griffith's marvelous book will make a lot of people think twice. She has done what many historians aspire to do but few actually manage to accomplish: make this world a more humane place. Grant Wacker is professor of Church History at Duke University's Divinity School. He is the author of Heaven Below: Early Pentecostalism and American Culture (Harvard Univ. Press). -------------- Meme 033: Imitate Christ, Not Elvis! sent 4.10.18 We went to a wedding this Summer for a daughter of an Evangelical friend I have known since my college days. (It was I who got him to meet his future wife, so I was indirectly responsible for not only his wedding but for the very existence of his four children.) To my horror, the music at the reception was played by an Elvis impersonator by the name of Rick Spruill. Whether he is a true heir of Presley, I neither know nor care. Elvis was obnoxious when he was alive; his impersonator is obnoxious now. (I wish I had had the foresight to have brought along a CD of Bach's Orchestral Suites, called "Suites for Dancing," played in dance tempos on the theory that Bach intended his suites to be danced to. After the impersonator had left, I could have asked those remaining at the console to slip in the Bach, to delight, I hope, the audience.) Anyhow, I protested to the Evangelicals there that Christ is King, not Elvis, but to no effect. Later I was talking to a Mormon friend, who is hugely overweight, in a conversation about Mormonism in which he repeatedly stressed the importance of winning souls for Christ. I asked him if Christ was King and if he believed in the imitation of Christ. Yes, on both counts. Then I suggested to him that he was in fact imitating Elvis by eating himself out to Elvis proportions. He said I had a point, and I suggested to him that he think about this the next time he reaches for seconds. Sadly, to all appearances, he continues to imitate Elvis. But here's hoping that the meme, "Imitate Christ, Not Elvis" will spread among the Christian community and become the first diet method in all of history that works. [I am sending forth these memes, not because I agree wholeheartedly with all of them, but to impregnate females of both sexes. Ponder them and spread them.]
From checker at panix.com Fri Dec 9 21:34:22 2005 From: checker at panix.com (Premise Checker) Date: Fri, 9 Dec 2005 16:34:22 -0500 (EST) Subject: [Paleopsych] Telegraph: Umberto Eco: God isn't big enough for some people Message-ID: Umberto Eco: God isn't big enough for some people http://www.arts.telegraph.co.uk/opinion/main.jhtml;jsessionid=WIFVKH3A4A0YHQFIQMFCFFWAVCBQYIV0?xml=/opinion/2005/11/27/do2701.xml&sSheet=/portal/2005/11/27/ixportal.html [This completely backs up what Lene told me, namely that the secularization thesis (that modernization brings secularism) has failed in Europe, too: Christianity's decline has been replaced by the rise of New Age religions.] (Filed: 27/11/2005) We are now approaching the critical time of the year for shops and supermarkets: the month before Christmas is the four weeks when stores of all kinds sell their products fastest. Father Christmas means one thing to children: presents. He has no connection with the original St Nicholas, who performed a miracle in providing dowries for three poor sisters, thereby enabling them to marry and escape a life of prostitution. Human beings are religious animals. It is psychologically very hard to go through life without the justification, and the hope, provided by religion. You can see this in the positivist scientists of the 19th century. They insisted that they were describing the universe in rigorously materialistic terms - yet at night they attended seances and tried to summon up the spirits of the dead. Even today, I frequently meet scientists who, outside their own narrow discipline, are superstitious - to such an extent that it sometimes seems to me that to be a rigorous unbeliever today, you have to be a philosopher. Or perhaps a priest. And we need to justify our lives to ourselves and to other people. Money is an instrument. It is not a value - but we need values as well as instruments, ends as well as means. The great problem faced by human beings is finding a way to accept the fact that each of us will die. Money can do a lot of things - but it cannot help reconcile you to your own death. It can sometimes help you postpone your own death: a man who can spend a million pounds on personal physicians will usually live longer than someone who cannot. But he can't make himself live much longer than the average life-span of affluent people in the developed world. And if you believe in money alone, then sooner or later, you discover money's great limitation: it is unable to justify the fact that you are a mortal animal. Indeed, the more you try to escape that fact, the more you are forced to realise that your possessions can't make sense of your death. It is the role of religion to provide that justification. Religions are systems of belief that enable human beings to justify their existence and which reconcile us to death. We in Europe have faced a fading of organised religion in recent years. Faith in the Christian churches has been declining. The ideologies such as communism that promised to supplant religion have failed in spectacular and very public fashion. So we're all still looking for something that will reconcile each of us to the inevitability of our own death. G K Chesterton is often credited with observing: "When a man ceases to believe in God, he doesn't believe in nothing. He believes in anything." Whoever said it - he was right. We are supposed to live in a sceptical age. In fact, we live in an age of outrageous credulity.
The "death of God", or at least the dying of the Christian God, has been accompanied by the birth of a plethora of new idols. They have multiplied like bacteria on the corpse of the Christian Church -- from strange pagan cults and sects to the silly, sub-Christian superstitions of The Da Vinci Code. It is amazing how many people take that book literally, and think it is true. Admittedly, Dan Brown, its author, has created a legion of zealous followers who believe that Jesus wasn't crucified: he married Mary Magdalene, became the King of France, and started his own version of the order of Freemasons. Many of the people who now go to the Louvre are there only to look at the Mona Lisa, solely and simply because it is at the centre of Dan Brown's book. The pianist Arthur Rubinstein was once asked if he believed in God. He said: "No. I don't believe in God. I believe in something greater." Our culture suffers from the same inflationary tendency. The existing religions just aren't big enough: we demand something more from God than the existing depictions in the Christian faith can provide. So we revert to the occult. The so-called occult sciences do not ever reveal any genuine secret: they only promise that there is something secret that explains and justifies everything. The great advantage of this is that it allows each person to fill up the empty secret "container" with his or her own fears and hopes. As a child of the Enlightenment, and a believer in the Enlightenment values of truth, open inquiry, and freedom, I am depressed by that tendency. This is not just because of the association between the occult and fascism and Nazism - although that association was very strong. Himmler and many of Hitler's henchmen were devotees of the most infantile occult fantasies. The same was true of some of the fascist gurus in Italy - Julius Evola is one example - who continue to fascinate the neo-fascists in my country. And today, if you browse the shelves of any bookshop specialising in the occult, you will find not only the usual tomes on the Templars, Rosicrucians, pseudo-Kabbalists, and of course The Da Vinci Code, but also anti-semitic tracts such as the Protocols of the Elders of Zion. I was raised as a Catholic, and although I have abandoned the Church, this December, as usual, I will be putting together a Christmas crib for my grandson. We'll construct it together - as my father did with me when I was a boy. I have profound respect for the Christian traditions - which, as rituals for coping with death, still make more sense than their purely commercial alternatives. I think I agree with Joyce's lapsed Catholic hero in A Portrait of the Artist as a Young Man: "What kind of liberation would that be to forsake an absurdity which is logical and coherent and to embrace one which is illogical and incoherent?" The religious celebration of Christmas is at least a clear and coherent absurdity. The commercial celebration is not even that. o Umberto Eco's latest book is The Mysterious Flame of Queen Loana (Secker & Warburg, ?17.99) From checker at panix.com Fri Dec 9 21:34:28 2005 From: checker at panix.com (Premise Checker) Date: Fri, 9 Dec 2005 16:34:28 -0500 (EST) Subject: [Paleopsych] Atlas Sphere: Good Happens, Too Message-ID: Good Happens, Too http://www.theatlasphere.com/columns/printer_051105-perren-good-happens.php [Increased attention to Frank Sinatra as a sign of things getting better??] 
By Jeffrey Perren Nov 30, 2005

In response to a recent comment of mine, someone asked me for some examples of good things that have happened in the last thirty-five years. So here's a partial inventory. Some of the things listed are personal, some are global, with lots in between. Of course, all of the categories listed below are tightly interrelated.

Personal

The opportunity to meet like-minded, reasonable, and good people is greater today than it was in prior decades. Just as one example, I would've been very unlikely thirty-five years ago to have 'met' and 'conversed' with some of the fine Ayn Rand admirers I've corresponded with recently. Many could say the same.

Intellectually

Despite abysmally poor U.S. K-12 (and even college) education, more people are sharing more good ideas and useful information than ever before in history. The opportunity for this kind of cross-fertilization simply didn't exist even as recently as ten years ago. Obviously, the Internet is one major factor, but there are others. The Internet made sharing ideas easy and cheap, but even in the print world there are more magazines now to satisfy every possible interest than ever before. In addition, we've now been the recipients of decades of beneficial influences: Rand, some conservative thinkers (Sowell, for example), a general rise in the number of large bookstore chains, and the failure of grand social experiments. These provide helpful theoretical guidance and useful empirical evidence, allowing us to lead wiser lives.

Socially

Evolving mores have driven to historically unprecedented low levels the amount and severity of sexual and racial prejudice, rigid adherence to restrictive social behavior, etc. (These are a couple of the few good effects of the 60s.) This 'moral anarchy' creates an opportunity for better, and better-grounded, practices to emerge. The near monolithic thinking that characterized the intellectual atmosphere of the first several decades of the twentieth century is gone, probably for good. Yes, there certainly has been produced far too much post-modern, nihilistic, irrationalist garbage from some of the same causes, but this article is about the good things.

Politically

In my lifetime alone the Soviet Union has morphed, and is no longer an active threat to the U.S. and the rest of the world. The Berlin Wall has been dismantled and Germany re-united. These are not small things. Many formerly socialist countries, India and Argentina to choose only two examples, have moved considerably toward greater freedom. The Middle East, so very troublesome now, is being actively dealt with instead of being left to stew into an even bigger problem later. (Yes, this one is in the nature of a prediction, but the present good is that the U.S. is no longer standing idly by.)

The current heated controversies about foreign policy, domestic policy, and the debates about the character of politicians on both major sides of the aisle are actually good. Just as two examples, no one would've been willing to so much as seriously discuss radical changes to tax codes and Social Security until recently. Thirty-five years ago there was plenty of complaining about all these things, but much more uniform opinion and much less real debate. We now have considerable historical experience with socialism and the welfare state, much better grounded arguments for various desired outcomes, and much more substantial disagreements and clearly distinguishable views. This is a necessary prelude to improving the present situation.
And there is much more divided opinion within the two major U.S. parties, with more viable potential alternatives to both than ever before.

Artistically

Ayn Rand's novels continue to sell phenomenally well. Tom Clancy, Michael Crichton, and Ken Follett continue to write bestsellers. J.K. Rowling's recently released novel made her $36 million in one day, and, to date, her books have sold almost 270 million units. I'm not arguing that these latter writers are anywhere near being in the same league artistically or philosophically; but their novels are not full of degraded people whining about their miserable lives. Quite the reverse. Yes, plenty of the opposite still dominates the publishing industry. Again, this article is about the good things.

The dreck produced too often by Hollywood from the 70s to the present is lately accompanied by offerings such as Braveheart, Air Force One, What Women Want, Titanic, Patriot Games, and others. (I'm not making the case that these are great movies, but they're much more reflective of the spirit of the 40s and 50s than those produced during the late 60s to early 80s, after which the trend began to reverse. And none of them would likely have been produced during that time.)

In fine painting, Jack Vettriano, Chen Yifei, and a score of other 'romantic realist' painters have been making a living. In some cases, doing very well, thank you. This is not something you would've been likely to see thirty-five years ago.

Most popular 'music' continues to be as bad as ever. But with improved distribution mechanisms young people are being (re)introduced to Frank Sinatra, Puccini, and many others who are more popular than they were twenty years ago. This can't help but encourage composers to actually write good new music. Post-modernism is rapidly coming to a close as an active artistic force. This, along with a much wider variety of much less expensive distribution channels, creates an opportunity for more art that is consistently good to be commercially successful.

Materially

The improvements in this area are pretty obvious. Today we have internet-enabled cell phones, faster and smaller computers, the Internet, satellite TV and radio, artificially increased tree production, a larger average home size, and more efficient heating and air conditioning systems. In the area of biotech products, there are genetically altered foods as well as enhanced agriculture in general, improved pharmaceutical products, and medical technology (e.g., CATs, NMRs, artificial organs). All these have either been introduced or substantially improved in the last thirty-five years.

Spiritually

There has been a fairly recent widespread revival of concern for ethics in everyday life. (Granted, many of the answers to such concerns have been wrong-headed. For the last time, I'm focusing on the good here.) There's much more discussion today about authentic values and non-Nietzschean, non-Pragmatist style self-interest than was the case before. The general atmosphere up until the last few years was that people didn't think much about the harm they did to themselves or to others. Theories of rational self-interest and other positive intellectual forces are definitely having an effect. It's up to us to make sure the right side wins.

The need to solve serious problems is hardly gone; likely it never will be.
But a sense of perspective, and a recognition of the positive changes of the last few decades, may help counter-balance the tendency to despair or cynicism that too often colors many people's enthusiasm for life. Personally, I'm looking forward to the next fifty years.

[1]Jeffrey Perren is a novelist with a background in Physics and Philosophy. His latest novel, The Geisha Hummingbird (in progress), is the story of a ship designer whose fiancée disappears on the eve of her wedding, amidst a whirlpool of industrial espionage.

References 1. http://www.theatlasphere.com/directory/profile.php?id=1488

From checker at panix.com Fri Dec 9 21:34:34 2005 From: checker at panix.com (Premise Checker) Date: Fri, 9 Dec 2005 16:34:34 -0500 (EST) Subject: [Paleopsych] On Academic Boredom by Amir Baghdadchi Message-ID:

On Academic Boredom by Amir Baghdadchi Arts and Humanities in Higher Education 4(3) University of Cambridge, UK

[This is a lovely article! I'd like to know more about the emergence of boredom in the 18th century. I do not deny that people were bored in a broad sense of the term, or that other animals can be bored. But, in a sense so specific that a word had to be coined for it, boredom goes back only so far. (Words are not coined at random.) It is a "socially constructed" emotion, a specific narrowing down of (or mixture of) basic emotions.]

First, the summary from the "Magazine and Journal Reader" feature of the daily bulletin from the Chronicle of Higher Education, 5.11.29 http://chronicle.com/daily/2005/11/2005112901j.htm

A glance at the current issue of Arts and Humanities in Higher Education: Academic bore

Confronting boredom in higher education can help academics to eradicate a system that survives by being dull, writes Amir Baghdadchi, a Ph.D. candidate at the University of Cambridge who is studying argument and literary form in 18th-century literature. Such boredom is "corrosive," writes Mr. Baghdadchi. He says that it occurs when academics are unable to make use of another person's findings, and that "the boring work is one which provides us with nothing to make use of."

While boredom is normally considered the result of a situation gone bad, Mr. Baghdadchi writes that, in academe, it is actually the product of things gone right. He says that uninteresting work creates a "defensive moat around a paper" because people are rarely apt to scrutinize a boring topic. Because it is free from any inquiry, lackluster work can survive criticism. "Sometimes it even seems as if we have a Mutually Assured Boredom pact," he writes. "I get up and bore you, you get up and bore me, and at the end of the day we are all left standing."

He writes that while the system has worked well so far, changes are worth considering. Researchers should not be wholly concerned with simply avoiding "academic battles," he says, but rather with solving society's problems. After all, he asks, "do we want a system that promotes not the graduate students who are the most vivaciously interested, but the ones who are the most contentedly bored?"

The article, "On Academic Boredom," is available for a limited time at http://ahh.sagepub.com/cgi/content/abstract/4/3/319 --Jason M. Breslow
_________________________________________________________________

abstract

The kind of boredom experienced in academia is unique.
Neither a purely subjective nor objective phenomenon, it is the product of the way research is organized into papers, seminars, and conferences, as well as of a deep implicit metaphor that academic argument is a form of warfare. In this respect, the concepts of boredom and rigour are closely linked, since there is a kind of rigour in the Humanities that stresses the war metaphor, and structures scholarship defensively. This is opposed to a different kind of rigour that eschews the war metaphor altogether, and considers rigorousness in the light of a work's usefulness to its audience.
--------------------------

While few would deny that some kind of boredom is part of the culture of research and teaching in the Humanities, there are, however, two reasons why it is worth considering academic boredom as a species of boredom in its own right.

First, it is not at all clear that the word 'boredom' refers to a coherent topic with an essential character. Since the word first gained currency in the 18th century, 'boredom' has come to be used to describe circumstances as various as the restlessness of a child on a car trip, the sense of monotony in assembly-line work, a crippling sense that the universe has no purpose, and there being nothing worth watching on television. These may not all be the same. The kind of boredom experienced in university departments, by contrast, is of a very particular kind. It is most easily identified in terms of affect: the sense that the seminar is never going to end, that the speaker will never get to the point, that the articles one is reading are proceeding at a glacial pace, that one simply cannot get into a discussion, that one dreads getting into it in the first place. The talk, the seminar, the conference; these are all contexts particular to us, with their own rules, etiquettes, and expectations. They are a set of practices. To treat the topic of our boredom without reference to these is not only to miss the peculiar shape of academic boredom, but to ignore the shape of ourselves inside it.

The second reason is practical. If we think that boredom is a problem that we ought to do something about, then it makes sense to consider how it relates to our practices and the structure of our discourse. We have no power over abstractions; but we can alter practices.

That some may not see academic boredom as a real problem at all, I will readily admit. To those, I can only offer my observations from some five years as a graduate student in the United States and the United Kingdom, in a variety of institutions. In my experience, boredom is corrosive. I have seen my classmates begin their graduate work with great vivacity and curiosity, and I have seen them slowly ground down into duller, quieter, less omnivorously interested people. I have seen it in myself. I have observed this change over years and even in microcosm over the course of a single seminar. I know that graduate students are extremely reluctant to discuss it out loud, since that would be akin to admitting weakness. But it is the case nevertheless. If anyone believes there is a counterexample, then one may attempt the following thought experiment: try to imagine someone who, after several years of graduate work, became, at the end of it, more vivacious.

Rather than trying to pin down what academic boredom is in the abstract, a better way will be to treat the words 'boredom', 'boring' and so on as available descriptions.
Thus, some of the questions we might instead ask are: in what kinds of academic contexts and circumstances do we describe ourselves as 'bored'? What other kinds of perceptions accompany, or precede, our judgment that we are bored? What kinds of things can be called 'boring'?

Our commonsense answers are very illuminating here. One popular answer to the first question (at least among graduate students) is that one's sense of boredom in, for example, a seminar, is derived from personal inadequacy. One finds oneself unable to concentrate on an argument, and concludes that this is because one does not know enough, has not studied enough, is not up to this level of discourse. There are in fact two different propositions here. First, that boredom is a subjective state, and second, that it is one's own fault or responsibility. The first is uncontroversial enough; and indeed, most writers on any kind of boredom assume that it is some kind of mental state. The second proposition is not so obvious. The dictum 'boredom is your own fault', like its cousin, 'only boring people are ever bored', seems closer to the kind of rule one tells children to make them behave. That this belief should be so prevalent, at least in an implied form, in graduate studies, is not surprising if we take one of the aims of graduate work to be the moulding of the student into a docile, well-behaved, academic subject. However, as a representation of what actually happens when we are bored, it is not very informative. To say that one's experience of boredom is the product of internal causes is very much like saying that the pain one feels from a splinter is caused by one's nervous system. That is surely correct. But it ignores the splinter.

So we move from internal to external causes. And, sure enough, just as often as we blame ourselves, we blame the speaker or the topic for being boring. And we even say of certain topics, or speakers, that they are just inherently boring. But consider an extreme case, the case of the switched papers. Imagine that at the last English conference, there was a paper on Wycherley that engaged its audience fully, while, at the last meeting of the European Society of Industrial Chemical Producers, there was a chemical engineering paper that similarly was a great success. Let us suppose that no one in the respective audiences thought the papers in the least bit boring. But then let us suppose that through some accident, the next time the speakers give their papers, they switch audiences. And we could easily imagine that the audience of engineers would not be so enraptured with an analysis of The Country Wife, and that the English students, however skilled at reading any text in the most interesting way possible, would become very fidgety very quickly.

Hence, it seems that, if we are careful, we cannot say that 'boringness' is a quality that is indisputably attached to a topic or a speaker. Moreover, it seems that neither the internal account of boredom--which says it is purely a state of mind--nor the external one--which says it is purely the fault of the topic or speaker--can stand on its own. But consider this: If we think of what happens when we are bored as an event--as an occasion when we are supposed to do something--then the two accounts can be complementary. Thus, our complaint, 'I'm just not getting this. I can't follow it', can be rephrased as saying, 'there is nothing I can do with this material. I can't make use of it.'
Likewise, when the English student is boring the chemical engineers, we may say that the English student has given the engineers nothing that they can make use of. The English paper is so designed as to give them nothing to do. And, if we are willing to take that on board, we can consolidate our observations thus: Boredom occurs when we are unable to make use of a work, and the boring work is one which provides us with nothing to make use of.

Thus far, the assumption has been that boredom is wholly bad. This might seem uncontroversial, but it is worth asking whether boredom is not some malfunction, what happens when things go wrong, but is perhaps something adaptive, which happens in order to succeed in some situation. There is a strong reason to think so. Consider that when one is bored by a paper, one does not ask questions. Boredom--whether caused by massive amounts of unfamiliar data, or impenetrable syntax--creates a defensive moat around a paper. It protects it. It guarantees that, even if it does not win everyone over, it survives the war, and that is good enough.

The underlying metaphor here is that an argument is war. This idea is brilliantly discussed by the linguists George Lakoff and Mark Johnson (1980) in their book Metaphors We Live By. They point out that the metaphor 'argument is war' not only describes what we do when we argue, but structures it: that is to say, when we argue, we behave as if we were at war: we fortify our positions, we attack weak points, there is a winner and loser, and so on. And because we actually do behave as if we were at war, the metaphor seems perfectly apt. However, as Lakoff and Johnson point out, the metaphor conceals as much as it explains, for the person with whom we are arguing is actually giving us their time, which is hardly a characteristic of warfare, and often when we disagree we are collaborating on the same problem. Collaboration, dialogue, the sense of a common discipline--these are elements of academic discourse left out by the war metaphor.

Now if we set that beside the description of boredom we arrived at earlier, it appears that we have two models of academic discourse that sit very ill together. One can either say that argument is war, and therefore must be waged offensively and defensively, or one can see scholarship as producing objects intended for manipulation. These positions are contrary; or, at least, one cannot maximize the one without minimizing the other. Of the two models, boredom feeds on the metaphor 'argument is war'. One can succeed in the war by virtue of boredom because it is a defensive tactic. Sometimes it even seems as if we have a Mutually Assured Boredom pact. I get up and bore you, you get up and bore me, and, at the end of the day, we are all left standing. It would not be hard to find graduate students whose measure of a successful conference paper lies entirely in whether they were 'shot down' or not. In this situation, being boring is a very good policy indeed.

At the outset I stated a concern with boredom as something detrimental to academic discourse. But it is not necessary to think in these terms at all. Indeed, I believe one of the reasons that academic boredom has not been an important topic is because of a very robust and practical objection that could be made. It is an extremely persuasive objection, and I would like to deal with it now.
It argues that while it is all well and good to decry things that are boring and to think about what counts as interesting, the real business of academic work has nothing to do with 'being interesting' at all: rather, it has to do with the construction of rigorous arguments that can withstand attack. Whether or not the audience is interested is a consideration always second to the strength of the research. If an audience finds a rigorously argued piece of scholarship boring, that is their problem, since they cannot expect that it was written for their enjoyment. As I say, a very robust and, perhaps, very familiar objection. It is built around an opposition of 'rigorous' vs. 'interesting'. However, I do not think this has to be the case. I think we can see this by interrogating the concept of 'rigorousness'.

It will be helpful to take an uncontroversial example of something that must be done rigorously. Let us suppose, purely hypothetically, that a graduate student, owing to an inability of the department to offer any funding whatsoever, finds a job working in a fish restaurant, in which he has the task of cleaning out the industrial walk-in refrigerator. As I say, the example is purely hypothetical. But if I had to guess, I would think that he would have to see to it that this was done with extreme rigour: the temperatures would have to be precisely maintained, the fish would have to be separated and rotated for freshness, the floors and walls and shelves would have to be scrubbed meticulously to avoid any kind of health risk. Here then is a paradigm of rigour, since: (a) it must be done to an external standard; (b) the work is meant to be examined and approved by an inspector; and (c) everything must be such that it can be easily used and manipulated by others.

Now contrast this kind of rigour--which resembles a scientific experiment in that it wants others to see what happened, wants others to follow the reasoning, and wants the scrutiny--with the rigour that is purely defensive: the rigour of endless authorities trotted in, of obscure language, of massive amounts of information deployed to scare off inquiry. The very fact that we are often willing to declare a work to be rigorous without claiming actually to understand it points to these two types of rigour being different, if not contrary. Perhaps because both kinds of rigour are commonly signified by the same word, they are not usually distinguished. But if we were to make the distinction, it is fortunate that we do have a ready-made phrase for this latter kind of bellicose, deadly rigour: we may call it rigor mortis, literally the 'rigor, the stiffness of death'. Rigor mortis shuts us up, it closes off inquiry, it digs the moat, it wants to bore us to death.

Again, we might call the other kind of rigour--for lack of a better term--a 'living rigour'. Living rigour, if we wish to carry on with the martial metaphor, takes risks, seeks risks, is designed to be vulnerable. But it is better to do without the martial metaphor, as it will tempt us into thinking of argument still as confrontation, and instead to think of living rigour as a kind of rigour that constructs things to be used, inspected, evaluated.

However, there is a consequence of thinking this way, which, depending on one's predisposition, either threatens the very possibility of ever being rigorous, or provides the only way in which rigour might be a meaningful concept. Consider that we now have an idea of rigour in terms of usefulness in a broad sense.
But, because how useful a work is depends both on the shape of the discourse and on what the audience knows or wants to do, we cannot therefore determine how rigorous a work is, once and for all, just by looking at its shape or content. Rather, we are called on to think of the word 'rigorous' as operating in a way similar to the word 'shocking'. You might think you have written the most shocking piece of literary criticism ever, yet, if no one in your audience of veterinary surgeons is actually shocked, you cannot really maintain that it was shocking, absolutely. Likewise, rigour: if rigour demands that your audience can manipulate your idea, and no one cares to manipulate it, then you lose the right to boast that you have been perfectly, objectively, and in the mind of God rigorous.

To refer to the 'mind of God' may seem like a rhetorical gesture, but it is in fact what one has to do by the logic of the objection. If the rigorousness of a scholarly work can exist without reference to any imaginable mortal audience (and anyone who thinks being interesting is a separate matter from being rigorous is implying this), then to whom else is the work addressed, if not to some all-hearing deity who understands every point and can never be bored?

On the other hand, if one prefers the idea of a living rigour, this is not without dangers, since, along with removing the certainty of being rigorous in every situation, it removes the authority we arrogate to ourselves based on a reputation for rigorous work. (To make an observation from the point of view of a graduate student: rigour is the most frequent stick with which we are beaten. In researching a topic about which you necessarily become more informed than your supervisor, what other kind of authority can a supervisor wield?) Indeed, living rigour compels us to think of our work as not complete once the paper is polished, but as only occurring the moment the paper is being received.

By this account, then, a concern with being rigorous in the best way possible indeed justifies, rather than detracts from, a concern with academic boredom. Academic boredom, which occurs when one is unable to make use of a work and cannot find anything in it with which to engage, is the consequence of rigor mortis, the kind of rigour deployed for winning academic battles rather than solving problems. Boredom, because it feels like a lack of something, may seem trivial and unimportant. It is not a thing to be reckoned with because no thing appears to be there. But as I have tried to show, this is not the case. Boredom is a sign that our system is not functioning the way we think it is, that we are not always being rigorous when we think we are. Of course, there is no need to change a system that has served us very well so far. But it is worth considering whether we want a system that promotes not the graduate students who are the most vivaciously interested, but the ones who are the most contentedly bored.

reference

Lakoff, G. and Johnson, M. (1980) Metaphors We Live By. Chicago, IL: University of Chicago Press.

biographical note

Amir Baghdadchi is currently a PhD student in the Faculty of English at Cambridge University. He is working on the idea of argument and literary form in 18th-century literature.
[Email: ab490 at cam.ac.uk]

From checker at panix.com Fri Dec 9 21:34:40 2005 From: checker at panix.com (Premise Checker) Date: Fri, 9 Dec 2005 16:34:40 -0500 (EST) Subject: [Paleopsych] New Yorker: Baudrillard on Tour Message-ID:

Baudrillard on Tour
The New Yorker: The Talk of the Town
http://www.newyorker.com/talk/content/articles/051128ta_talk_macfarquhar

[Baudrillard is most definitely not an academic bore.]

MEN OF LETTERS Issue of 2005-11-28 Posted 2005-11-21

There may never again be a year in Jean Baudrillard's life quite like 1999. Baudrillard, the French philosopher, is best known for his theory that consumer society forms a kind of code that gives individuals the illusion of choice while in fact entrapping them in a vast web of simulated reality. In 1999, the movie "The Matrix," which was based on this theory, transformed him from a cult figure into an extremely famous cult figure. But Baudrillard was ambivalent about the film--he declined an invitation to participate in the writing of its sequels--and these days he is still going about his usual French-philosopher business, scandalizing audiences with the grandiloquent sweep of his gnomic pronouncements and his post-Marxian pessimism.

Earlier this month, he gave a reading at the Tilton Gallery, on East Seventy-sixth Street, in order to promote "The Conspiracy of Art," his new book. The audience was too big for the room--some people had to stand. A tall, Nico-esque blond woman in a shiny white raincoat leaned against the mantelpiece, next to a tall man with chest-length dreadlocks. A middle-aged woman with red-and-purple hair sat nearby. There was a brief opening act: Arto Lindsay, the onetime Lounge Lizard, whose broad forehead, seventies-style eyeglasses, and sturdy teeth seemed precariously supported by his reedy frame, played a thunderous cadenza on a pale-blue electric guitar.

Baudrillard opened his book and began to read in a careful tone. He is a small man with large facial features. He wore a brown jacket and a blue shirt. (Some years ago, he appeared on the stage of Whiskey Pete's, near Las Vegas, wearing a gold lamé suit with mirrored lapels, and read a poem, "Motel-Suicide," which he wrote in the nineteen-eighties. But there was no trace of the lamé Baudrillard at the Tilton Gallery.)

" 'The illusion of desire has been lost in the ambient pornography and contemporary art has lost the desire of illusion,' " he began. " 'After the orgies and the liberation of all desires, we have moved into the transsexual, the transparency of sex, with signs and images erasing all its secrets and ambiguity.' "

After he read, Baudrillard expanded on his theme. "We say that Disneyland is not, of course, the sanctuary of the imagination, but Disneyland as hyperreal world masks the fact that all America is hyperreal, all America is Disneyland," he said. "And the same for art. The art scene is but a scene, or obscene"--he paused for chuckles from the audience--"mask for the reality that all the world is trans-aestheticized. We have no more to do with art as such, as an exceptional form. Now the banal reality has become aestheticized, all reality is trans-aestheticized, and that is the very problem. Art was a form, and then it became more and more no more a form but a value, an aesthetic value, and so we come from art to aesthetics--it's something very, very different. And as art becomes aesthetics it joins with reality, it joins with the banality of reality.
Because all reality becomes aesthetical, too, then it's a total confusion between art and reality, and the result of this confusion is hyperreality. But, in this sense, there is no more radical difference between art and realism. And this is the very end of art. As form."

Sylvère Lotringer, Baudrillard's longtime publisher, who was there to interview him, added, "Yes, this is what I was saying when I was quoting Roland Barthes saying that in America sex is everywhere except in sex, and I was adding that art is everywhere but also in art."

"Even in art," Baudrillard corrected.

"Even in art, yes. The privilege of art in itself as art in itself has disappeared, so art is not what it thinks it is."

Many people in the room wished to ask Baudrillard a question. A gray-haired man wearing a denim cap and a green work shirt, an acolyte of the philosopher Bernard Stiegler, wanted to know whether, even if art was no longer art, as such, it might not still function as useful therapy for the wounded narcissism of artists. A middle-aged man in the second row who had been snapping photographs of Baudrillard with a tiny camera raised his hand. "I don't know how to ask this question, because it's so multifaceted," he said. "You're Baudrillard, and you were able to fill a room. And what I want to know is: when someone dies, we read an obituary--like Derrida died last year, and is a great loss for all of us. What would you like to be said about you? In other words, who are you? I would like to know how old you are, if you're married and if you have kids, and since you've spent a great deal of time writing a great many books, some of which I could not get through, is there something you want to say that can be summed up?"

"What I am, I don't know," Baudrillard said, with a Gallic twinkle in his eye. "I am the simulacrum of myself." The audience giggled.

"And how old are you?" the questioner persisted.

"Very young."

-- Larissa MacFarquhar
From checker at panix.com Fri Dec 9 21:34:48 2005 From: checker at panix.com (Premise Checker) Date: Fri, 9 Dec 2005 16:34:48 -0500 (EST) Subject: [Paleopsych] TCS: Why People Hate Economics Message-ID:

Why People Hate Economics
http://www.techcentralstation.com/112105A.html

[This is good: the distinction between those who reason from the consequences of a proposal and those who reason from the supposed motives of its proponents. But Mr. Mencken said one thing much better than the author does:

["The whole aim of practical politics is to keep the populace alarmed (and hence clamorous to be led to safety) by menacing it with an endless series of hobgoblins, all of them imaginary" (_In Defence of Women_).]

By Arnold Kling Published 11/21/2005

"the separateness of these two mechanisms, one for understanding the physical world and one for understanding the social world, gives rise to a duality of experience. We experience the world of material things as separate from the world of goals and desires. ...We have what the anthropologist Pascal Boyer has called a hypertrophy of social cognition. We see purpose, intention, design, even when it is not there." -- Paul Bloom, writing in The Atlantic

Paul Bloom's essay "Is God an Accident?" in the latest issue of The Atlantic, suggests that humans' belief in God, Intelligent Design, and the afterlife is an artifact of brain structure. In this essay, I am going to suggest that the same artifact that explains why people are instinctively anti-Darwin explains why they are instinctively anti-economic.

Bloom says that we use one brain mechanism to analyze the physical world, as when we line up a shot on the billiard table. We use another brain mechanism to interact socially, as when we try to get a date for the prom. The analytical brain uses the principles of science. It learns to make predictions of the form, "When an object is dropped, it will fall toward the earth." The social brain uses empathy. It learns to guess others' intentions and motives in order to predict their reactions and behavior.

The difference between analytical and social reasoning strikes me as similar to the difference that I once drew between Type C and Type M arguments. I wrote, "Type C arguments are about the consequences of policies. Type M arguments are about the alleged motives of individuals who advocate policies." Type C arguments about policy come from the analytical brain and reflect impersonal analysis. Type M arguments come from the social brain. In my view, they inject emotion, demagoguery, and confusion into discussions of economic policy. As a shortcut, I will refer to the analytical, scientific mental process as the type C brain, and the emotional, empathic mental process as the type M brain.

What I take from Bloom's essay is the suggestion that our type M brain seeks a motive and intention behind the events that take place in our lives. This type M brain leads to irrational religious beliefs and superstitions, as when we attribute emotions and intentions to inanimate objects. We need our type M brains, but in moderation. Without a type M brain, one is socially underdeveloped. In extreme cases, someone with a weak type M brain will be described by Asperger's Syndrome or autism.
On the other hand, as Bloom suggests, there are many cases in which we over-use our type M brains. For example, social psychologists have long noted the fundamental attribution error, in which we see people's actions as derived from their motives or dispositions when in fact the actions result from context.

Economics is an attempt to use a type C brain to understand market processes in impersonal terms. We do not assess one person's motives as better than another's. We assume that everyone is out for their own gain, and we try to predict what will happen when people trade on that basis. Perhaps one of the reasons that economics is taught using math is that mathematics engages the type C brain. By getting students to look at equations represented in graphs, the instructor steers them away from thinking in terms of motives. The down side of this is that when they go back to looking at the real world, many people who have taken economics courses simply revert to using their type M brains.

Explaining Higher Gas Prices

For example, consider the run-up in gasoline prices that occurred after Hurricane Katrina. Looking for the cause of higher gas prices, the type M brain asks, "Who?" The type C brain asks, "What?" Some Senators, appealing to the type M brains among their constituents, hauled oil company executives into a hearing to ask them to explain why they raised prices so high. One might just as well imagine hauling people before a Senate hearing and holding them personally responsible for gravity or inertia.

No one sets the price of gasoline. If they could, oil company executives would charge $10 a gallon or more. However, because of competition, they have to charge an amount that will allow them to sell the gasoline that they are able to produce. After Katrina, they were able to produce less gasoline, so that at $2 a gallon they would have run out. They raised their prices to the point where they could not raise them further without losing most of their business to competitors. If an oil company had decided magnanimously to sell gasoline at low prices, it would have run out of gasoline. If enough companies had done so, there would have been so little gasoline left that by October the public would have been at the mercy of those few suppliers that held any inventories. If gasoline had cost $2 a gallon in September, the shortage in October might have pushed the price up to $5 a gallon.

If a monopolist were in charge of the oil industry, he would shut down some refineries in order to reduce the availability of gasoline. A monopolist would rather produce less gasoline and charge $3 per gallon than produce more gasoline but have to charge $2 a gallon to sell it all. Fortunately, the oil industry is not run by a monopolist, and we do not have to face $3 a gallon all the time. A competitive firm will not shut down its refinery capacity to keep supply off the market, because that only benefits its competitors. Hurricane Katrina temporarily did for the industry what a monopolist would do permanently. The hurricane shut down refinery capacity. As a result, oil companies earned high short-term profits. But those high profits did not reflect a sudden outbreak of greed among the oil company executives. Profits are explained by type C analysis of context, not by type M attributions of motive.
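[Kling's monopolist-versus-competition comparison is just arithmetic on a demand curve, and a toy calculation makes it concrete. The following minimal Python sketch is not from Kling's article: the linear demand curve, the $1 marginal cost, and the 300-unit capacity are invented, chosen only so that full capacity sells at $2 a gallon and the restricted quantity sells at $3, echoing his figures.

def market_price(q):
    """Highest price at which q (million gallons per day) all finds buyers,
    under an assumed linear demand curve (hypothetical numbers)."""
    return 5.0 - 0.01 * q

COST = 1.0      # assumed marginal cost per gallon
CAPACITY = 300  # assumed industry-wide refinery capacity

def profit(q):
    return (market_price(q) - COST) * q

# Competitive firms run flat out: a firm that held back output would only
# hand sales to rivals, so the industry produces at CAPACITY and the price
# falls to whatever clears the market.
print(f"competitive: q={CAPACITY}, price=${market_price(CAPACITY):.2f}, profit={profit(CAPACITY):.0f}")

# A monopolist instead searches for the output level that maximizes profit.
q_mono = max(range(CAPACITY + 1), key=profit)
print(f"monopoly:    q={q_mono}, price=${market_price(q_mono):.2f}, profit={profit(q_mono):.0f}")

# Printed output:
#   competitive: q=300, price=$2.00, profit=300
#   monopoly:    q=200, price=$3.00, profit=400

[The point of the toy numbers is only that the same $3 price can arise either from deliberate restriction (a monopolist) or from destroyed capacity (a hurricane); no motive is needed to explain the higher profits in either case.]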
Politics and Government

Type M thinking views government as a parent. Conservatives want their government/parent to police moral behavior. Liberals want their government/parent to provide nurturance. Type C thinking instead thinks of government as an institutional arrangement. Rather than anthropomorphize government as a parent, type C thinking leads me to prefer the Separation of Family and State.

Type M thinking treats political conflicts as battles between good and evil. "Our" side is wise and sincerely motivated. The "other" side is stupid and evil. Many economists revert to type M thinking when they look at politics. See my Challenge for Brad DeLong. Type C thinking treats political conflict as an inevitable competition among various interest groups. Actors in the political sphere respond to incentives, just as they do in other spheres.

Politicians try to exploit the type M brain. Politicians appeal to people's fears. Their message is, "You are in danger. Fortunately, I care about you, and I will save you." The many political crusades against Wal-Mart reflect type M thinking. For example, the state of Maryland, where I live, is considering legislation forcing Wal-Mart to provide expensive health insurance to its employees. The type M brain sees Wal-Mart management as Scrooge, and Maryland's politicians as the ghosts that are going to get the company to see the evil of its ways. However, basic type C economics says that forcing the company to provide more health insurance benefits would lead to lower wages for Wal-Mart workers.

International Trade

Economists view international trade as equivalent to the discovery of a more efficient production process. As Alan Blinder put it recently, "It has long been a mystery to economists why so many people view creative destruction that stems from technology as okay, while similar creative destruction that stems from international trade is something to be opposed." Hardly anyone feels guilty about using tax preparation software rather than paying an accountant to handle their tax returns. Yet many people would tell you that there is something wrong with outsourcing tax preparation to accountants in India. Neither economists nor non-economists tend to think of tax preparation software as an alien outsider trying to steal our jobs. However, many non-economists' type M brains instinctively think of Indian accountants as trying to do us harm. Economists are trained to look at international trade through the same type C eyes with which we view technological innovation, and we are constantly amazed by the general public's hostility toward it.

Paul Bloom offers extensive evidence that the majority of people do not accept the type C approach to evolution, death, and other matters. If biologists have been unable to get people to change their type M minds, then perhaps economists should not feel so bad.

Arnold Kling is author of Learning Economics.

From HowlBloom at aol.com Sat Dec 10 06:41:12 2005 From: HowlBloom at aol.com (HowlBloom at aol.com) Date: Sat, 10 Dec 2005 01:41:12 EST Subject: [Paleopsych] Shhhhh is this another Ur Strategy? Message-ID: <24c.351f908.30cbd288@aol.com>

Hi, Karen. Good question.

All I can add to this is the bark, the coo, and the growl.

Animals use low noises--the growl--to make themselves look big--big to rivals and big to females who, even in frogs, go for bigness. The bigger the woofer the lower the sound, so us animals go real low to make our woofers sound huge. Low rumbles are our musical dominance gestures.

Animals use mid-range noises--the bark--to say hello, how are you, or to introduce themselves to others they feel are equals.
The mid-range is a music we sing to each other to connect without slipping into anger or intimacy.

And animals use high-pitched, soft sounds to make themselves sound small, unthreatening, adorably appealing, and intimate. We use high-pitched soft sounds--coos--when we baby-talk to our young ones or to our lovers. Tweeters make high sounds. The smaller the tweeter, the higher the sound. Coos are musical submission and seduction gestures.

Shhhhh falls into the coo category, but so do lots of other sounds. I suspect that shhh isn't cross-cultural--that it isn't replicated in Chinese, Japanese, or Mayan. But I'm not at all sure. Or should that be shhhure? Howard

ps take a look at this paleopsych conversation from 1998 in which Martha Sherwood added something intriguing:

Martha Sherwood writes: Subj: Re: Language as display Date: 98-02-23 13:01:14 EST From: msherw at oregon.uoregon.edu (Martha Sherwood) To: HowlBloom at aol.com Regarding your query to Gordon Burghart about geckos, it might be relevant that the vocalizations accompanying vampire bat threat displays are within the human auditory range whereas their other signals are not. Martha

hb: very nifty, Martha. This would fit in with the coo, bark and growl research, since the bats are conceivably descending into what for them is a basso profundo growl to maximize their menace. Howard

In a message dated 12/9/2005 1:21:22 PM Eastern Standard Time, guavaberry at earthlink.net writes:

hi everyone, sorry to interrupt the present conversation . . . but i've been wondering about this . . . .

What is shhhhh? And does this fall under another Ur strategy, a western custom, or is it a worldwide "instinct" we all have, to use shhhhhh for shushing a baby to stop crying or to calm a crying baby or crying child? Is this another Ur Strategy? Do all human babies recognize this as the signal to be quiet? Do all cultures use this?

I imagine it sounding like a snake's rattle but that doesn't mean much. I've heard the same sound calms horses, and it sounds similar to the word for thank you in Mandarin. Do we know anything about shhhhh? Appreciate any thoughts you might have.

thanks, Karen Ellis

Archive 8/16/03 Re: Ur strategies and the moods of cats and dogs hb to pavel kurakin: I've had an adventure that will force me to stop for the night. One of my cats attacked me and tore several holes in my face, nearly removing one of my eyes.
<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<> The Educational CyberPlayGround http://www.edu-cyberpg.com/ National Children's Folksong Repository http://www.edu-cyberpg.com/NCFR/ Hot List of Schools Online and Net Happenings, K12 Newsletters, Network Newsletters http://www.edu-cyberpg.com/Community/ 7 Hot Site Awards New York Times, USA Today, MSNBC, Earthlink, USA Today Best Bets For Educators, Macworld Top Fifty <>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>~~~~~<>

----------

Howard Bloom Author of The Lucifer Principle: A Scientific Expedition Into the Forces of History and Global Brain: The Evolution of Mass Mind From The Big Bang to the 21st Century Recent Visiting Scholar-Graduate Psychology Department, New York University; Core Faculty Member, The Graduate Institute www.howardbloom.net www.bigbangtango.net Founder: International Paleopsychology Project; founding board member: Epic of Evolution Society; founding board member, The Darwin Project; founder: The Big Bang Tango Media Lab; member: New York Academy of Sciences, American Association for the Advancement of Science, American Psychological Society, Academy of Political Science, Advanced Technology Working Group, Human Behavior and Evolution Society, International Society for Human Ethology; advisory board member: Institute for Accelerating Change; executive editor -- New Paradigm book series.

For information on The International Paleopsychology Project, see: www.paleopsych.org for two chapters from The Lucifer Principle: A Scientific Expedition Into the Forces of History, see www.howardbloom.net/lucifer For information on Global Brain: The Evolution of Mass Mind from the Big Bang to the 21st Century, see www.howardbloom.net

From guavaberry at earthlink.net Sat Dec 10 22:16:22 2005 From: guavaberry at earthlink.net (K.E.) Date: Sat, 10 Dec 2005 17:16:22 -0500 Subject: [Paleopsych] Shhhhh is this another Ur Strategy? In-Reply-To: <24c.351f908.30cbd288@aol.com> References: <24c.351f908.30cbd288@aol.com> Message-ID: <7.0.0.16.0.20051210164551.0328fce0@earthlink.net>

hey howard,

But I'm not at all sure. Or should that be shhhure? Howard

:-) your info helps to frame out the bigger story, loved reading it - thanks so much

and didn't Skoyles use to write to the list? http://www.hinduonnet.com/2001/10/11/stories/08110007.htm suggests music is hardwired (which i believe) & i think Steven Pinker is totally wrong - i think Trehub has it goin on

the other day one of my husband's co-workers said he was reading a book about calming babies called best baby on the block which gave parents an ordered 5-step to-do list (& daddy said it worked) one of the steps included saying shhhhh in the baby's ear & to do it at a volume to match the loudness of the baby's cry - loud cry = loud shhhhh directly into the kid's ear - and the book also says no worries, you can't hurt the kid's eardrum.

the whole thing got me thinking about the shhhhh - Ur thing cause it's gotta be full of overtones & that falls under the music brain wiring idea. http://www.annalsnyas.org/cgi/content/abstract/930/1/1 http://www.edu-cyberpg.com/Literacy/whatresearch4.asp

the coo, bark and growl research - does that include the chirp? if dr. provine can tickle rats and get them to laugh, is laughing called a chirp? i just can't stop wondering about this.
best, k

At 01:41 AM 12/10/2005, you wrote:
>Hi, Karen. Good question.
>[...]
From checker at panix.com Sun Dec 11 03:05:45 2005 From: checker at panix.com (Premise Checker) Date: Sat, 10 Dec 2005 22:05:45 -0500 (EST) Subject: [Paleopsych] Glimpse Abroad: Smelling the Roses Message-ID:

Smelling the Roses Glimpse Abroad, 2005 http://www.glimpsefoundation.org/downloads/Spectrum-Winter2005.pdf

[These observations are not particularly profound, in and of themselves, but they do say something about life in the United States.]

First, the summary from the "Magazine and Journal Reader" feature of the daily bulletin from the Chronicle of Higher Education, 5.12.6 http://chronicle.com/daily/2005/12/2005120601j.htm

A glance at the winter issue of glimpse: Readjusting to the States

Catching up again to the fast-paced American lifestyle is one of the hardest challenges for American students returning home from study abroad, according to surveys conducted by the travel magazine. Kerala Goodkin, the magazine's editor in chief, writes that Americans obsess over productivity and efficiency, and that such fixations have led to "on the go" eating mentalities, excessive sugar consumption, a reliance on cars, and poor health. That "hurried" lifestyle, she explains, is all the more difficult for students to adjust to after living in countries with more-relaxed outlooks. "Life in the United States seemed so demanding and fast-paced that I just wanted to say 'slow down!' when I got home," she quotes a student who studied in Spain as saying.

A common observation in the survey was how devoted Americans are to their work. In the United States, nearly 68 percent of the people in the labor force work a 40-hour week -- a rate that is second only to Japan, notes Ms. Goodkin. Despite that work ethic, foreigners and students returning from abroad consider Americans lazy for relying so heavily on cars. The United States is second to none in the ratio of cars to people, she points out, with 765 automobiles for every 1,000 Americans.

Ms. Goodkin considers whether the American lifestyle is worth the costs. Being constantly on the move is unhealthy, she says, adding that the stress of our daily routines can cause organ failure, cancer, accidents, and suicide. She notes that the United States is among the top 10 countries in the world in terms of the proportion of its population that lives in ill health. "When I returned home," she quotes another student as saying, "I wanted to encourage others to love what they have and to smile at traffic jams and long lines. They might just notice something or someone new."

A copy of the article, "Smelling the Roses," is available at http://www.glimpsefoundation.org/downloads/Spectrum-Winter2005.pdf Information about the magazine is available at http://www.glimpseabroad.org/ --Jason M. Breslow
--------------

FAST FOOD, FRAPPUCCINOS, EXPRESSWAYS, POWER LUNCHES, EVEN POWER NAPS ... WHY ARE AMERICANS SO SINGULARLY OBSESSED WITH GETTING THINGS DONE IN A HURRY?

In a recent survey, Glimpse asked over 400 study abroad students about the central cultural differences between their home and host countries.
One theme surfaced again and again: the challenge of readjusting to the comparatively hurried pace of U.S. life upon returning home from abroad. Says Janna Stansell of California State University, who studied in Spain, "Life in the United States seemed so demanding and fast-paced that I just wanted to say 'slow down!' when I got home." Brian Dolan of University of Colorado at Boulder echoes this sentiment: "In Russia," he says, "everyone was more relaxed and didn't stick to tight schedules as in America. Many people just did things in their own time."

We obsess over productivity and efficiency, but are the benefits always worth the cost? This special report examines key cultural trends in the United States--for example, "on-the-go" eating mentalities, excessive caffeine and sugar consumption, reliance on cars, and poor mental and physical health--and compares them to trends in other countries, where residents take more time to stop and smell the roses.

MEALS ON WHEELS

While the infamous Golden Arches continue to crop up in countries around the world, Americans still reign supreme when it comes to our copious consumption of those greasy, ready-made morsels we qualify as "food." Americans who have lived abroad frequently comment that other countries do not share our "on-the-go" eating mentality and devote much more time to leisurely, multi-course meals, shared in the company of family and friends.

"When I returned to the United States I had horrible reverse culture shock. People seemed so rude and Americans' love of fast food disgusted me." Student, University of Cincinnati Studied in United Arab Emirates through American University

"After coming home from Florence, Italy, I missed not having a market to walk to every day and buy fresh food for my meals. I had become used to the slow pace of Italy, so when I came back to the States, everyone and everything seemed so rushed! I knew that the States is fast-paced, but I actually felt the difference more than I thought I would." Jenna Tonet, Stonehill College Studied in Italy through API

"I think the biggest difference I felt coming home was the pace of life. In London, things moved a lot slower to some extent. It was overwhelming to watch my family rush around all day. I remember the first time I went out to dinner, I was shocked by how quickly the waiter pushed us out. In Europe, you can sit there all night and no one cares." Elizabeth Conner, University of Missouri at Columbia Studied in England

TOP 5 COUNTRIES WITH THE MOST MCDONALD'S: U.S. (12,804), Japan (3,598), Canada (1,154), UK (1,115), Germany (1,093)

SANTA GIUSTINA, ITALY Excerpted from "The Sweet Life" by Nicole Graziano

There I was, in Santa Giustina, a speck-of-a-city in the northern region of Italy known as Veneto. My supple nature welcomed this shift in setting, and I took comfort in my days with my Italian friend Elisa--waking to tea at the apartment or cappuccinos at the bar a street above. The town was grey, hushed and sealed tightly in a late December chill that hunched our backs and shrunk our necks as we shuffled around open markets and humble piazzas. My favorite part of the day, however, was lunchtime. Each and every afternoon, after Elisa and I had returned from a light trek through town or had finished up a movie, Salvatore, her boyfriend, would return to the apartment from his job at a nearby gas station for a nearly two-hour lunch break.
Elisa routinely prepared a feast in preparation for his arrival--minestrone one day, pumpkin gnocchi another, complementing the meals with bread, cheese, nuts and wine. In contrast to the half-hour and 50-minute lunch breaks to which so many Americans are accustomed, these afternoons weren't hurried affairs. Salvatore's mind actually seemed to escape the duties of his workplace--he rarely regarded the clock, relaxing for another 45 minutes or so after eating until he finally lifted his tall, lanky body up from the sofa, draped himself in a black, button-up wool coat and departed once again for work. Outside the rambunctious pant and seduction of Florence, I lived as a shadow of my young Italian counterparts--cooking spaghetti carbonara beside them, sharing their anxieties about electricity and water bills. I came to understand the sweet life, la dolce vita, as I witnessed it through the lives of its inhabitants.

YOUR 3 P.M. WAKE-UP CALL

All the rushing around we do can sure wear us out. Should we combat our fatigue with a mid-afternoon nap? A leisurely stroll in the park? Forget it. According to Dunkin' Donuts, what we need is a "3 p.m. wake-up call"--a.k.a. a gargantuan iced coffee that promises to snap us out of our exhaustion with its sugary, caffeinated goodness. Perhaps unsurprisingly, the United States lags behind many Western European countries when it comes to coffee consumption, but if it's not a cup o' Joe we're chugging, it's probably coffee's carbonated cousin: soda. In the realm of soda intake, the United States has everyone else beat, hands down.

"I had traveled to Latin America before, so I was already familiar with the slower pace of life there, but it still affected me in Mexico. As Americans, we always feel like we have to be doing something productive, and when we aren't, we get down on ourselves. Yet in Mexico it was okay to sit for an hour after you had already finished eating and talk, or to nap in the middle of the day after lunch, or to go out with friends for a beer on a Tuesday."
Christina Shaw, American University
Studied in Mexico through SIT

"French culture was big on observing things: window shopping, browsing stores, walking around town, eating at cafes on the streets and watching the passersby. The leisurely atmosphere made for a wonderful experience."
Molly Mullican, Rice University
Studied in France through API

"I found the Spanish schedule allowed for a not-so-busy workday and accommodated the relaxation and social needs that every good Spaniard appreciates."
Phil Ramirez, Texas State University
Studied in Spain through API

Wuhu village, China: A man takes a nap in a lotus garden and wilderness park on a quiet summer afternoon. PHOTO by Kate Peterson.

ALL WORK AND NO PLAY

In a country where we still abide by the "pull-yourself-up-by-your-bootstraps" mentality, where we like to believe that our economic success is defined by how hard we work, we are fairly gung-ho when it comes to putting in the hours. Whereas France recently mandated a 35-hour work week, in the United States, almost 68 percent of our workforce puts in more than 40 hours per week, trailing only Japan (76 percent).

"Before I left for France, I was enrolled in 18 credit hours a semester and working two jobs 15 to 25 hours a week. I re-evaluated the whole way I was living when I got home and now take more time to slow down and enjoy myself rather than rushing around to stay busy all the time."
Anna Romanosky, University of South Carolina
Studied in France through API

"In Macedonia, much more time is spent drinking coffee and talking than actually doing work."
Andrew C., University of Pittsburgh
Studied in Macedonia through IAESTE

"Upon returning home, I found that many of my views about the United States had changed. I noticed so many more 'negative' things about our culture, like excessiveness, wastefulness and laziness. Italians essentially 'work to live,' whereas here in the United States we 'live to work.' There are still so many things I compare between our culture and Italian culture."
Laura Basil, Ohio University
Studied in Italy through API

BEHIND THE WHEEL

Ironically, while many other countries acknowledge the United States' strong work ethic, they simultaneously view us as "lazy." Maybe that's because for all the hurrying we do, we sure spend a lot of time sitting on our butts. Yes, in the United States, the car reigns supreme; other ways of getting from here to there--for example, walking, biking or taking public transportation--are viewed as grossly inefficient. We want to get there fast--why wait around at a bus stop or rely on the meager power of our own two legs? Unsurprisingly, the United States ranks the highest when it comes to the ratio of motor vehicles to people.

"American culture is very much dependent on use of cars. In Sevilla I walked everywhere. I miss taking a 30-minute stroll to school."
Student, Agnes Scott College
Studied in Spain through API

"The United States does not slow down, and it was hard to come back to rushing cars, people everywhere, and the overall feeling that there was always something going on. I hated not having a public transportation system and wished we could have a train system in the United States like Europe has."
Holly Murdoch, Texas A&M
Studied in Italy

"I had just spent five months without a car and without the rushing of American life. Wherever I needed to go in Spain, I could walk or take public transportation. When I arrived back in Detroit, I became a bit disgusted at how everything was so impending, everything was an 'emergency.'"
Lauren Zakalik, University of Michigan
Studied in Spain through API

BANGKOK, THAILAND
Excerpted from "Yoda and the Skytrain" by Molly Angstman

My route to work funnels me, along with crowds of commuters, onto the "sky train"--Bangkok's new elevated transportation system. Every day, without fail, the first 50 people off the escalator in the station see a new train pull up and start running, all the while smiling ear-to-ear like they are doing something really ridiculous. The little uniformed girls with their pigtails and giant backpacks, some barely taller than my waist, treat the 20-meter run as a hilarious adventure, holding hands and giggling, arriving at the train with flushed faces. As the doors quickly close, swallowing their giggles, they leave me to wait for the next train. I think they smile because even the youngest commuters know how inherently silly it is for a Thai person to run for a train. Although Bangkok is now home to big-money transnationals, the pace of life is still traditionally slower than other cities at a similar level of development. Being in a hurry is almost unseemly, but business is still profitable and the trains run like clockwork. Patience is the lauded quality, not promptness. As Buddhists, they will get another go at it anyway. Why rush?
If I'm not at work every morning with a comfortable ten minutes to spare, I feel I have failed in my responsibilities as an efficient intern. This is why the second grader with the Winnie the Pooh backpack will always be wiser than me. She hurries because it is funny and exciting, not because she thinks being early makes life better. Despite their glittering efficiency, these trains might never be fast enough for me. So I am taking cultural orientation classes from these mobile philosophers. Lessons learned so far: 1) Spend rush hour with friends, 2) Enjoy the ride, and 3) Never hurry in paradise.

WOLLONGONG, AUSTRALIA
Excerpted from "Taking Your Time" by Heather Magalski

Soon after arriving in Wollongong, Australia, my roommate and I decided to brave the train system to see the Gay and Lesbian Mardi Gras in nearby Sydney. As American city-dwellers, used to following strict schedules and being constantly on-the-go, we made our way to the station about a half-hour before our train was due to arrive. We were in for a surprise. When our train rumbled into the station 30 minutes after its scheduled arrival, we learned that we would have to transfer to a connection, which also ended up being late. When our train came to a sudden halt midway to Sydney, I went into a frenzy. I wanted to know what went wrong, how long it would take to fix and how late we would be. I began to think of my own life in the United States and how I try to cram so many things in at once. Before I had come to Australia, I thought that having a successful life meant being involved in several activities, as I had been pressured to do in order to be accepted at a college. Yet in the time it ended up taking my roommate and me to get to Sydney, I became aware that the true joy in life was taking my time, something the Australian culture has successfully mastered. Realizing this skill, I applied it to the rest of my stay in Australia. I no longer became stressed when it took hours to be served at a restaurant, miles to walk to town for groceries, or several hours to again travel to Sydney. No longer concerned with doing a specific "something," I went on long walks by the beach and sat and listened to many an Australian tell me his or her life story. Instead of always actively participating in something, I now understand that just sitting back and taking in my surroundings has its time and place.

ILL EFFECTS

Maybe we Americans get a lot done, but is it worth the cost? Our obsession with convenience and efficiency leads to many unhealthy practices, including poor nutritional habits and sedentary lifestyles. Being in constant states of stimulation and frenzy isn't so great for us either--in fact, stress is linked to a number of the leading causes of death in the United States, including heart, liver and lung disease; cancer; accidents; and suicide. Despite our high standards of living and advanced (though not universal) system of medical care, the United States ranks 48th in a comparative study of countries' life expectancies: 77.14 years. Furthermore, it ranks within the top ten for the proportion of its male and female populations who live in ill health.

"I had a hard time getting used to the speed of life again in the United States. I liked feeling relaxed and laid back and not worried about getting places on time. I also missed the sense of community I felt in Ecuador. Back in the United States, I noticed how separate and selfish people can be at times."
Maret Kane-Panchana, University of Washington
Studied in Ecuador through Fundación CIMAS

"When I returned home, I had a hard time feeling compassion for those who just rush through their days, who go through the motions without understanding that their connection to the work/people/food/sex/nightlife they experience every day is worth more than a spot on a day-planner. I wanted to encourage others to love what they have and to smile at traffic jams and long lines. They might just notice something or someone new."
Jordan Santoni, Appalachian State University
Studied in Spain through API

SAN JOSE, COSTA RICA
Excerpted from "GPS, Costa Rica Style" by Patricia Jempty

When I moved to Costa Rica with my family (chastened by the fact that after endless years of study, I had perfected French, not Spanish), I was in for quite a shock. Costa Rica is fairly well developed as far as "Third World" countries go: you can dine at any number of North American chain restaurants and stay only in U.S. chain hotels. (Why you'd want to do this is another question!) The veneer of familiarity may fool you into thinking that, except for language, Costa Rica is just like home. I can assure you, it's not.

Let's talk physical. Costa Rica has a rainy season and a dry season. When it rains, the landscape is obliterated and the roads become rivers of mud. When it's dry, the dust permeates your pores and the wind plays catch with any object not nailed down. We're talking extremes here, and they happen every day. But physical aspects aside, it's the country's culture that can truly blind-side you, if you're paying attention.

Patience is not just a lofty virtue, it's a necessity if you live in Costa Rica. The locals have it in their blood, or at least in their upbringing. Visitors must learn to adapt. The power grid fails. The water stops running. You can't travel quickly anywhere because most of the roads are notoriously potholed and must be shared with four-legged creatures of all sizes. You'll get there when you get there, which can be a hard lesson for a gringo.

Over the years, my way of thinking has slowly adapted and shaped itself to the local manner of doing things. I've grown calmer, less demanding. I've learned to take life as it is offered to me; and in the process, my frustrations with the sometimes maddening aspects of Costa Rica have taken wing like so many butterflies.

From checker at panix.com Sun Dec 11 03:05:50 2005
From: checker at panix.com (Premise Checker)
Date: Sat, 10 Dec 2005 22:05:50 -0500 (EST)
Subject: [Paleopsych] NYT Mag: (Freakonomics) The Economy of Desire
Message-ID:

The Economy of Desire
http://select.nytimes.com/preview/2005/12/11/magazine/1124989462701.html

[A good primer.]

By STEPHEN J. DUBNER and STEVEN D. LEVITT

Analyzing a Sex Survey

What is a price? Unless you're an economist, you probably think of a price as simply the amount you pay for a given thing - the number of dollars you surrender for, let's say, Sunday brunch at your favorite neighborhood restaurant. But to an economist, price is a much broader concept. The 20 minutes you spend waiting for a table is part of the price. So, too, is any nutritional downside of the meal itself: a cheeseburger, as the economist Kevin Murphy has calculated, costs $2.50 more than a salad in long-term health implications. There are moral and social costs to tally as well - for instance, the look of scorn delivered by your vegan dining partner as you order the burger.
While the restaurant's menu may list the price of the cheeseburger at $7.95, that is clearly just the beginning. The most fundamental rule of economics is that a rise in price leads to less quantity demanded. This holds true for a restaurant meal, a real-estate deal, a college education or just about anything else you can think of. When the price of an item rises, you buy less of it (which is not to say, of course, that you want less of it).

But what about sex? Sex, that most irrational of human pursuits, couldn't possibly respond to rational price theory, could it? Outside of a few obvious situations, we generally don't think about sex in terms of prices. Prostitution is one such situation; courtship is another: certain men seem to consider an expensive dinner a prudent investment in pursuit of a sexual dividend. But how might price changes affect sexual behavior? And might those changes have something to tell us about the nature of sex itself?

Here is a stark example: A man who is sent to prison finds that the price of sex with a woman has spiked - talk about a supply shortage - and he becomes much more likely to start having sex with men. The reported prevalence of oral sex among affluent American teenagers would also seem to illustrate price theory: because of the possibility of disease or pregnancy, intercourse is expensive - and it has come to be seen by some teenagers as an unwanted and costly pledge of commitment. In this light, oral sex may be viewed as a cheaper alternative.

In recent decades, we have witnessed the most exorbitant new price associated with sex: the H.I.V. virus. Because AIDS is potentially deadly and because it can be spread relatively easily by sex between two men, the onset of AIDS in the early 1980's caused a significant increase in the price of gay sex. Andrew Francis, a graduate student in economics at the University of Chicago, has tried to affix a dollar figure to this change. Setting the value of an American life at $2 million, Francis calculated that in terms of AIDS-related mortality, it cost $1,923.75 in 1992 (the peak of the AIDS crisis) for a man to have unprotected sex once with a random gay American man versus less than $1 with a random woman. While the use of a condom greatly reduces the risk of contracting AIDS, a condom is, of course, yet another cost associated with sex. In a study of Mexican prostitution, the Berkeley economist Paul Gertler and two co-authors showed that when a client requested sex without a condom, a prostitute was typically paid a 24 percent premium over her standard fee.
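The arithmetic behind a figure like $1,923.75 can be made explicit. A minimal sketch in Python - assuming, purely for illustration and not as Francis's published method, that the per-act cost is the statistical value of a life multiplied by the per-act probability of a fatal infection:

    # Reverse-engineering the risk implied by Francis's 1992 figures.
    # The $2,000,000 value of a life and the two per-act costs come from
    # the article; reading cost as
    #     cost = value_of_life * P(fatal infection per act)
    # is an illustrative assumption made here, not Francis's own model.
    VALUE_OF_LIFE = 2_000_000.0
    COST_GAY_1992 = 1_923.75  # unprotected sex with a random gay American man
    COST_HET_1992 = 1.0       # upper bound quoted for sex with a random woman

    implied_risk = COST_GAY_1992 / VALUE_OF_LIFE
    print(f"Implied per-act mortality risk: {implied_risk:.6f}")  # 0.000962, about 1 in 1,040
    print(f"Relative price of the two acts: {COST_GAY_1992 / COST_HET_1992:,.0f} to 1")

On this reading, the dollar figure is just a per-act mortality probability scaled by the value placed on a life, which is why it moves when either treatability or partner-pool prevalence moves.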
Francis, in a draft paper titled "The Economics of Sexuality," tries to go well beyond dollar figures. He puts forth an empirical argument that may fundamentally challenge how people think about sex. As with any number of behaviors that social scientists try to measure, sex is a tricky subject. But Francis discovered a data set that offered some intriguing possibilities. The National Health and Social Life Survey, sponsored by the U.S. government and a handful of foundations, asked almost 3,500 people a rather astonishing variety of questions about sex: the different sexual acts received and performed and with whom and when; questions about sexual preference and identity; whether they knew anyone with AIDS. As with any self-reported data, there was the chance that the survey wasn't reliable, but it had been designed to ensure anonymity and generate honest replies. The survey was conducted in 1992, when the disease was much less treatable than it is today.

Francis first looked to see if there was a positive correlation between having a friend with AIDS and expressing a preference for homosexual sex. As he expected, there was. "After all, people pick their friends," he says, "and homosexuals are more likely to have other homosexuals as friends." But you don't get to pick your family. So Francis next looked for a correlation between having a relative with AIDS and expressing a homosexual preference. This time, for men, the correlation was negative. This didn't seem to make sense. Many scientists believe that a person's sexual orientation is determined before birth, a function of genetic fate. If anything, people in the same family should be more likely to share the same orientation. "Then I realized, Oh, my God, they were scared of AIDS," Francis says.

Francis zeroed in on this subset of about 150 survey respondents who had a relative with AIDS. Because the survey compiled these respondents' sexual histories as well as their current answers about sex, it allowed Francis to measure, albeit crudely, how their lives may have changed as a result of having seen up close the costly horrors of AIDS. Here's what he found: Not a single man in the survey who had a relative with AIDS said he had had sex with a man in the previous five years; not a single man in that group declared himself to be attracted to men or to consider himself homosexual. Women in that group also shunned sex with men. For them, rates of recent sex with women and of declaring homosexual identity and attraction were more than twice as high as among those who did not have a relative with AIDS.

Because the sample size was so small - simple chance suggests that no more than a handful of men in a group that size would be attracted to men - it is hard to reach definitive conclusions from the survey data. (Obviously, not every single man changes his sexual behavior or identity when a relative contracts AIDS.) But taken as a whole, the numbers in Francis's study suggest that there may be a causal effect here - that having a relative with AIDS may change not just sexual behavior but also self-reported identity and desire. In other words, sexual preference, while perhaps largely predetermined, may also be subject to the forces more typically associated with economics than biology.

If this turns out to be true, it would change the way that everyone - scientists, politicians, theologians - thinks about sexuality. But it probably won't much change the way economists think. To them, it has always been clear: whether we like it or not, everything has its price.

Stephen J. Dubner and Steven D. Levitt are the authors of "Freakonomics: A Rogue Economist Explores the Hidden Side of Everything." More information on the academic research behind this column is at www.freakonomics.com.

From checker at panix.com Sun Dec 11 03:06:02 2005
From: checker at panix.com (Premise Checker)
Date: Sat, 10 Dec 2005 22:06:02 -0500 (EST)
Subject: [Paleopsych] NYT Mag: Laptop That Will Save the World, The
Message-ID:

The Laptop That Will Save the World
http://select.nytimes.com/preview/2005/12/11/magazine/1124989448443.html

[How far does anyone predict that the educational achievement gap will be closed internationally?]

By MICHAEL CROWLEY

Here in America, high-speed wireless Internet has become a commonplace home amenity, and teenagers with Sidekicks can browse the Web on a beach. For many people in developing nations, however, the mere thought of owning a computer remains pure fantasy. But maybe not for long.
This year, Nicholas Negroponte, chairman of the Massachusetts Institute of Technology's Media Lab, unveiled a prototype of a $100 laptop. With millions of dollars in financing from the likes of [3]Rupert Murdoch's News Corporation and Google, Negroponte and his colleagues have designed an extremely durable, compact, no-frills laptop, which they'd like to see in the hands of millions of children worldwide by 2008.

So how can any worthwhile computer cost less than a pair of good headphones? Through a series of cost-cutting tricks. The laptops will run on free "open source" software, use cheaper "flash" memory instead of a hard disk and most likely employ new LCD technology to drop the monitor's cost to just $35. Each laptop will also come with a hand crank, making it usable even in electricity-free rural areas.

Of course, the real computing mother lode is the Internet, to which few developing-world users have access. But the M.I.T. laptops will offer wireless peer-to-peer connections that create a local network. As long as there's an Internet signal somewhere in the network area - and making sure that's the case, even in rural areas, poses a mighty challenge - everyone can get online and use a built-in Web browser. Theoretically, even children in a small African village could have "access to more or less all libraries of the world," Negroponte says. (That's probably not very useful to children who can't read or understand foreign languages.)

His team is already in talks with several foreign governments, including those of Egypt, Brazil and Thailand, about bulk orders. Gov. Mitt Romney of Massachusetts has also proposed a bill to buy 500,000 of the computers for his state's children.

References
3. http://topics.nytimes.com/top/reference/timestopics/people/m/rupert_murdoch/index.html?inline=nyt-per
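The peer-to-peer scheme described above amounts to a mesh network: any laptop that can reach a machine with a real uplink, directly or through neighbors, gets online. A minimal sketch of the routing idea - the node names, topology and single-gateway setup are invented for illustration, and real mesh protocols are considerably more involved:

    from collections import deque

    # Hypothetical village mesh: each laptop lists the peers in radio range.
    # Only "gateway" has an actual Internet uplink.
    mesh = {
        "laptop_a": ["laptop_b", "laptop_c"],
        "laptop_b": ["laptop_a", "gateway"],
        "laptop_c": ["laptop_a", "laptop_b"],
        "gateway":  ["laptop_b"],
    }

    def route_to_uplink(mesh, start, uplink="gateway"):
        """Breadth-first search for the shortest forwarding path to the uplink."""
        queue, seen = deque([[start]]), {start}
        while queue:
            path = queue.popleft()
            if path[-1] == uplink:
                return path
            for peer in mesh[path[-1]]:
                if peer not in seen:
                    seen.add(peer)
                    queue.append(path + [peer])
        return None  # no laptop in range can reach the Internet

    print(route_to_uplink(mesh, "laptop_c"))  # ['laptop_c', 'laptop_b', 'gateway']

The design consequence is the one the article notes: one good uplink per village can serve every machine within multi-hop radio range, but no uplink anywhere in the mesh means no Internet for anyone.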
From checker at panix.com Sun Dec 11 03:15:57 2005
From: checker at panix.com (Premise Checker)
Date: Sat, 10 Dec 2005 22:15:57 -0500 (EST)
Subject: [Paleopsych] Jerry Goodenough: Critical Thinking about Conspiracy Theories
Message-ID:

Jerry Goodenough: Critical Thinking about Conspiracy Theories
http://www.uea.ac.uk/~j097/CONSP01.htm

[This is a very good analysis, esp. when it comes to noting that many, many conspiracy theories posit too many conspirators. As far as the specific analysis of the Kennedy assassination goes, the author makes a very good point about the Mafia being incompetent. I'll send along in a moment excerpts from a new book, "Ultimate Sacrifice," that makes a new case that the Mafia did in fact orchestrate the assassination. According to the book, the Mafia got wind of a CIA plot to murder Castro and threatened to reveal it, a revelation that would have caused an international crisis. The Warren Commission accordingly covered things up, a cover-up which continues.

[Still, the charge of incompetence remains. I reinsert my own theory that the assassination was an assisted suicide. JFK knew he had not long to live but did not want to go down in history like Millard Fillmore, whose only achievement was to not install a bathtub in the White House.
Just being assassinated would not be enough, so he got the conspirators to leave enough bogus and inconsistent evidence that researchers would never stop spinning theories, all of them imperfect for failure to reconcile the evidence.

[The Enlightenment died in six seconds in Dealey Plaza.]

Jerry Goodenough is Professor of Philosophy at the University of East Anglia, Norwich, UK

1. Introduction

Conspiracy theories play a major part in popular thinking about the way the world, especially the political world, operates. And yet they have received curiously little attention from philosophers and others with a professional interest in reasoning.[1] Though this situation is now starting to change, it is the purpose of this paper to approach this topic from the viewpoint of critical thinking, to ask if there are particular absences or deformities of critical thinking skills which are symptomatic of conspiracy theorising, and whether better teaching of reasoning may guard against them.

That conspiracy thinking is widespread can be seen from any cursory examination of a bookshop or magazine stand. There are not only large amounts of blatant conspiracy work, often dealing with American political assassinations and other events or with the alleged presence of extraterrestrial spacecraft, but also large amounts of writing where a certain degree of conspiracy thinking is more or less implicit. Thus many `alternative' works of medicine, history, archaeology, technology, etc. often depend upon claims, explicit or otherwise, that an establishment or orthodoxy conspires to suppress alternative views. Orthodox medicine in cahoots with the multinational drug companies conspires to suppress the claims of homeopathy, orthodox archaeologists through malice or blindness conspire to suppress the truth about the construction of the Pyramids, and so on. It certainly seems to the jaundiced observer that there is more of this stuff about than ever before.

However, conspiracy theorising is now coming to the attention of philosophers. That it has taken this long may be because, as Brian Keeley says in a recent paper, `most academics simply find the conspiracy theories of popular culture to be silly and without merit.' (1999: 109n) But I agree with Keeley's further remark that `it is incumbent upon philosophers to provide analysis of the errors involved with common delusions, if that is indeed what they are.' If a kind of academic snobbishness underlies our previous refusal to get involved here, there may be another reason. Conspiracy theorising, in political philosophy at least, has been identified with irrationality of the worst sort--here the locus classicus may be some dismissive remarks made by Karl Popper in The Open Society and its Enemies (Popper 1996, Vol.2: 94-9). Pigden (1993) shows convincingly that Popper's remarks cannot be taken to support a rational presumption against conspiracy theories in history and politics. But certainly such a presumption exists, particularly amongst political commentators. It tends to manifest itself in a noisy preference for what is termed the `cock-up' theory of history--an unfortunate term that tends to assume that history is composed entirely of errors, accidents and unforeseen consequences. If such a dismal state of affairs were indeed to be the case, then there would seem to be no point in anybody trying to do anything. The cock-up theory, then, is agreeable to all forms of quietism.
But we have no reason to believe that there is such a coherent theory, and even less reason to believe that every event must fall neatly into one or other category here; indeed, this insistence on black and white reasoning is, as we shall see, one of the features of conspiracy theorising itself! And what makes the self-satisfied `cock-up' stance even less acceptable is that it ignores the fact that conspiracies are a very real part of our world. No serious historian denies that a somewhat amateurish conspiracy lay behind the assassination of Abraham Lincoln, or that a more professional but sadly less successful conspiracy attempted to assassinate Adolf Hitler in the summer of 1944. Yet such is the presumption behind the cock-up stance that the existence or frequency of genuine conspiracies is often significantly downplayed. (How many people, taking at face value the cock-up theorists' claim that conspiracies are a real rarity in the modern history of democracies, do not know that a mere 13 years before President Kennedy's assassination a serious terrorist conspiracy to murder Harry S. Truman led to a fatal gunfight on the streets of Washington?[2] The cock-up presumption seems to generate a kind of amnesia here.)

We require, then, some view of events that allows for the accidental and the planned, the deliberate and the contingent: history as a tapestry of conspiracies and cock-ups and much intentional action that is neither. Pigden (op. cit.) satisfactorily demonstrates the unlikelihood of there being any adequate a priori exclusion principle here, in the face of the reality of at least some real conspiracies. Keeley's paper attempts a more rigorous definition of the phenomenon, hoping to separate what he terms Unwarranted Conspiracy Theories (UCTs) from rational or warranted conspiratorial explanations:

  It is thought that this class of explanation [UCTs] can be distinguished analytically from those theories which deserve our assent. The idea is that we can do with conspiracy theories what David Hume (1748) did with miracles: show that there is a class of explanations to which we should not assent, by definition. (Keeley: 111)

and it is part of his conclusion that `this task is not as simple as we might have heretofore imagined.' (ibid.) Keeley concludes that `much of the intuitive "problem" with conspiracy theories is a problem with the theorists themselves, and not a feature of the theories they produce' (Ibid: 126) and it is this point I want to take up in this paper. What sort of thinking goes on in arriving at UCTs and what sort of things go wrong? If we say that conspiracy theorists are irrational, do we mean only that they are illogical in their reasoning? Or are there particular critical thinking skills missing or being misused?

2. Definitions

Keeley's use of the term Unwarranted Conspiracy Theory should not mislead us into thinking that all conspiracy theories fall into one or other category here. Warrant is a matter of degree, and so is conspiracy. There are cases where a conspiratorial explanation is plainly rational; take, for instance, the aforementioned July Bomb Plot to kill Hitler, where there is an abundance of historical evidence about the conspirators and their aims. There are cases where such an explanation is clearly irrational: I shall argue later in the paper that this is most probably the case for the assassination of President Kennedy. And there are cases where some conspiratorial explanation may be warranted but it is hard to know how far the warrant should extend.
Take, for instance, the murder of the Archduke Franz Ferdinand in Sarajevo in 1914. There was plainly a conspiracy to bring this about: some minutes before Gavrilo Princip shot the archduke, a co-conspirator was arrested after throwing a bomb (which failed to explode) at the archduke's car. Princip and his fellow students were Serbian nationalists, acting together to demonstrate against the presence of Habsburg influence in the Balkans. But there remains the possibility that they had been infiltrated and manipulated by Serbian intelligence elements seeking to provoke a crisis against Austria-Hungary. And there are more extreme claims that the ultimate manipulators here were agents of a world-wide conspiracy, of international Jewry or freemasonry seeking to bring about war. We are fully warranted in adopting the first conspiratorial explanation, but perhaps only partially warranted in thinking there is anything in the second claim[3], while the extreme claims seem to me to be as unwarranted as anything could be.

What we require, then, is some definition which will mark off the kind of features which ought to lead us to suspect the warrant of any particular conspiratorial explanation. Keeley lays out a series of these, which I shall list and comment upon. But first he offers his definition of conspiracy theories in general:

  A conspiracy theory is a proposed explanation of some historical event (or events) in terms of the significant causal agency of a relatively small group of persons--the conspirators--acting in secret... a conspiracy theory deserves the appellation "theory" because it proffers an explanation of the event in question. It proposes reasons why the event occurred... [it] need not propose that the conspirators are all powerful, only that they have played some pivotal role in bringing about the event... indeed, it is because the conspirators are not omnipotent that they must act in secret, for if they acted in public, others would move to obstruct them... [and] the group of conspirators must be small, although the upper bounds are necessarily vague. (116)

Keeley's definition here differs significantly from the kind of conspiracy at which Popper was aiming in The Open Society, crude Marxist explanations of events in terms of capitalist manipulation. For one can assume that in capitalist societies capitalists are very nearly all-powerful and not generally hindered by the necessity for secrecy. A greater problem for Keeley's definition, though, is that it seems to include much of the work of central government. Indeed, it seems to define exactly the operations of cabinet government--more so in countries like Britain with no great tradition of governmental openness than in many other democracies. What is clearly lacking here is some additional feature, that the conspirators be acting against the law or against the public interest, or both. This doesn't entirely free government from accusations of conspiracy--does a secret cabinet decision to upgrade a country's nuclear armaments which appears prima facie within the bounds of the law of that country but may breach international laws and agreements count? Is it lawful? In the public interest? A further difficulty with some kind of illegality constraint is that it might tend to rule out what we might otherwise clearly recognise as conspiracy theories.
Take, for instance, the widely held belief amongst ufologists that the US government (and others) has acted to conceal the existence on earth of extra-terrestrial creatures, crashed flying saucers at Roswell, and so on. It doesn't seem obvious that governments would be acting illegally in this case--national security legislation is often open to very wide interpretation--and it could be argued that they are acting in the public interest, to avoid panic and so on. (Unless, of course, as some ufologists seem to believe, the government is conspiring with the aliens in order to organise the slavery of the human race!) So we have here what would appear to be a conspiracy theory, and one which has some of the features of Keeley's UCTs, but which is excluded by the illegality constraint. Perhaps the best we can do here is to assert that conspiracy theories are necessarily somewhat vague in this regard; I'll return to this point later.

If this gives us a rough idea of what counts as a conspiracy theory, we can then build upon it, and Keeley goes on to list five features which he regards as characteristic of Unwarranted Conspiracy Theories:

(1) `A UCT is an explanation that runs counter to some received, official, or "obvious" account.' (116-7)

This is nothing like a sufficient condition, for the history of even democratic governments is full of post facto surprises that cause us to revise previous official explanations. For instance, for many years the official explanation for Britain's military success in the Second World War was made in terms of superior generalship, better troops, occasional good luck, and so on. The revelation in the 1970s of the successful Ultra programme to break the German Enigma service codes necessitated wholesale revision of military histories of this period. This was an entirely beneficial outcome, but others were more dubious. The growth of nuclear power in Britain in the 1950s was officially explained in terms of the benefit of cheaper and less polluting sources of electricity. It was only much later that it became clear that these claims were exaggerated and that the true motivation for the construction of these reactors was to provide fissile material for Britain's independent nuclear weapons. Whether such behaviour was either legal or in the public interest is an interesting thought.

(1A) `Central to any UCT is an official story that the conspiracy theory must undermine and cast doubt upon. Furthermore, the presence of a "cover story" is often seen as the most damning piece of evidence for any given conspiracy.'

This is an interesting epistemological point to which I shall return.

(2) `The true intentions behind the conspiracy are invariably nefarious.'

I agree with this as a general feature, particularly of non-governmental conspiracies, though as pointed out above it is possible for governmental conspiracies to be motivated or justified in terms of preventing public alarm, which may be seen as an essentially beneficial aim.

(3) `UCTs typically seek to tie together seemingly unrelated events.'

This is certainly true of the more extreme conspiracy theory, one which seeks a grand unified explanation of everything. We have here a progression from the individual CT, seeking to explain one event, to the more general.
Carl Oglesby (1976), for instance, seeks to reinterpret many of the key events in post-war American history in terms of a more or less secret war between opposing factions within American capital, an explanation which sees Watergate and the removal of Richard Nixon from office as one side's revenge for the assassination of John Kennedy. At the extreme we have those theories which seek to explain all the key events of western history in terms of a single secret motivating force, something like international freemasonry or the great Jewish conspiracy.[4] It may be taken as a useful rule of thumb here that the greater the explanatory range of the CT, the more likely it is to be untrue. (A point to which Popper himself would be sympathetic!)

Finally, one might want to query here Keeley's point about seemingly unrelated events. Many CTs seem to have their origin in a desire to relate events that one might feel ought to go together. Thus many Americans, on hearing of the assassination of Robert Kennedy (itself coming very shortly after that of Martin Luther King), thought these events obviously related in some way, and sought to generate theories linking them in terms of some malevolent force bent on eliminating apparently liberal influences in American politics. They seem prima facie more likely to be related than, say, the deaths of the Kennedy brothers and those of John Lennon or Elvis Presley: any CT linking these does indeed fulfil Keeley's (3).

(4) `...the truths behind events explained by conspiracy theories are typically well-guarded secrets, even if the ultimate perpetrators are sometimes well-known public figures.'

This is certainly the original belief of proponents of UCTs but it does lead to a somewhat paradoxical situation whereby the alleged secret can become something of an orthodoxy. Thus opinion polls seem to indicate that something in excess of 80% of Americans believe that a conspiracy led to the death of President Kennedy, though it seems wildly unlikely that they all believe in the same conspiracy. It becomes increasingly hard to believe in a well-guarded secret that has been so thoroughly aired in 35 years of books, magazine articles and even Hollywood movies. Pretty much the same percentage of Americans seem to believe in the presence on earth of extra-terrestrials, though whether this tells us more about Americans or about opinion polls is hard to say. But these facts, if facts they be, would tend to undercut the `benevolent government' UCTs. For there is really no point in `them' keeping the truth from us to avoid panic if most of us already believe this `truth'. The revelation of cast-iron evidence of a conspiracy to kill Kennedy or of the reality of alien visits to Earth would be unlikely to generate more than a ripple of public interest, these events having been so thoroughly rehearsed.

(5) `The chief tool of the conspiracy theorist is what I shall call errant data', by which Keeley means data which is unaccounted for by official explanations, or data which, if true, would tend to contradict official explanations.

These are the marks of the UCT. As Keeley goes on to say (118), `there is no criterion or set of criteria that provide a priori grounds for distinguishing warranted conspiracy theories from UCTs.' One might perhaps like to insist here that UCTs ought to be false, and this is why we are not warranted in believing them, but it is in the nature of many CTs that they cannot be falsified. The best we may do is show why the warrant for believing them is so poor.
And one way of approaching this is by way of examining where the thinking that leads to UCTs goes awry.

3. Where CT thinking goes wrong

It is my belief that one reason why we should not accept UCTs is because they are irrational. But by this I do not necessarily mean that they are illogical in the sense that they commit logical fallacies or use invalid argument forms--though this does indeed sometimes happen--but rather that they misuse or fail to use a range of critical thinking skills and principles of reasoning. In this section I want to provide a list of what I regard as the key weaknesses of CT thinking, and then in the next section I will examine a case study of (what I regard to be) a UCT and show how these weaknesses operate. My list of points is not necessarily in order of importance.

(A) An inability to weigh evidence properly.

Different sorts of evidence are generally worthy of different amounts of weight. Of crucial importance here is eye-witness testimony. Considerable psychological research has been done into the strengths and weaknesses of such testimony, and this has been distilled into one of the key critical thinking texts, Norris & King's (1983) Test on Appraising Observations, whose Manual provides a detailed set of principles for judging the believability of observation statements. I suspect that no single factor contributes more, especially to assassination and UFO UCTs, than a failure to absorb and apply these principles.

(B) An inability to assess evidence corruption and contamination.

This is a particular problem with eyewitness testimony about an event that is subsequently the subject of considerable media coverage. And it is not helped by conventions or media events which bring such witnesses together to discuss their experiences--it is not for nothing that most court systems insist that witnesses do not discuss their testimony with each other or other people until after it has been given in court. There is a particular problem with American UCTs since the mass media there are not governed by sub judice constraints, and so conspiratorial theories can be widely aired in advance of any court proceedings. Again Norris & King's principles (particularly IV. 10 & 12) should warn against this.[5] But we do not need considerable delay for such corruption to occur: it may happen as part of the original act of perception. For instance, in reading accounts where a group of witnesses claim to have identified some phenomenon in the sky as a spaceship or other unknown form of craft, I often wonder if this judgement occurred to all of them simultaneously, or if a claim by one witness that this was a spaceship could not act to corrupt the judgmental powers of other witnesses, so that they started to see this phenomenon `as' a spacecraft in preference to some more mundane explanation.

(C) Misuse or outright reversal of a principle of charity: wherever the evidence is insufficient to decide between a mundane explanation and a suspicious one, UCTs tend to pick the latter.

The critical thinker should never be prejudiced against occupying a position of principled neutrality when the evidence is more or less equally balanced between two competing hypotheses. And I would argue that there is much to be said for operating some principle of charity here, of always picking the less suspicious hypothesis of two equally supported by the evidence.
My suspicion is that in the long run this would lead to a generally more economical belief structure, that reversing the principle of charity ultimately tends to blunt Occam's Razor, but I cannot hope to prove this.

(D) The demonisation of persons and organisations.

This may be regarded as either following from or being a special case of (C). Broadly, this amounts to moving from the accepted fact that X once lied to the belief that nothing X says is trustworthy, or taking the fact that X once performed some misdeed as particular evidence of guilt on other occasions. In the former case, adopting (D) would demonise us all, since we have lied on some occasion or other. This is especially problematic for UCTs involving government organisations or personnel, since all governments reserve the right to lie or mislead if they feel it is in the national interest to do so. But proof that any agency lied about one event ought not to be taken as significant proof that they lied on some other occasion. It goes against the character of the witness, as lawyers are wont to say, but then no sensible person should believe that governments are perfectly truthful.

The second case is more difficult. It is a standard feature of Anglo-Saxon jurisprudence that the fact that X has a previous conviction should not be given in evidence against them, nor revealed to the jury until after a verdict is arrived at. The reasoning here is that generally evidence of X's previous guilt is not specific evidence for his guilt on the present occasion; it is possible for it to be the case that X was guilty then and is innocent now, and so the court should not be prejudiced against him. But there is an exception to this, at least in English law, where there are significant individual features shared between X's previous proven modus operandi and that of the present offence under consideration; evidence of a consistent pattern may be introduced into court. But, the rigid standards of courtroom proof aside, it is not unreasonable for the police to suspect X on the basis of his earlier conviction. This may not be fair to X (if he is trying to go straight) but it is epistemologically reasonable. The trouble for UCTs, as we shall see, is that most governments have a long record of previous convictions, and the true UC theorist may regard this not just as grounds for a reasonable suspicion but as itself evidence of present guilt.

(E) The canonisation of persons or (more rarely) organisations.

This may be regarded as the mirror-image of (D). Here those who are regarded as the victims of some set of events being explained conspiratorially tend to be presented, for the purpose of justifying the explanation, as being without sin, or being more heroic or more threatening to some alleged set of private interests than the evidence might reasonably support.

(F) An inability to make rational or proportional means-end judgements.

This is perhaps the greatest affront to Occam's Razor that one finds in UCTs. Such theories are often propounded with the explanation that some group of conspirators have been acting in furtherance of some aim or in order to prevent some action taking place. But one ought to ask whether such a group of conspirators were in a position to further their aim in some easier or less expensive or less risky fashion. Our assumption here is not the principle of charity mentioned in (C) above, that our alleged conspirators are too nice or moral to resort to nefarious activities.
We should assume only that our conspirators are rational people capable of working out the best means to a particular end. This is a defeasible assumption--stupidity is not totally unknown in the political world--but it is nevertheless an assumption that ought to guide us unless we have evidence to the contrary. A difficulty that should be mentioned here is that of establishing the end at which the conspiracy is aimed, made more difficult for conspiracies that never subsequently announce these things. For the state of affairs brought about by the conspirators may, despite their best efforts, not be that at which they aimed. If this is what happens then making a rational means-end judgement to the actual result of the conspiracy may be a very different matter from doing the same thing to the intended results.

(G) Evidence against a UCT is always evidence for.

This is perhaps the point that would most have irritated Karl Popper, with his insistence that valid theories must always be capable of falsification. But it is an essential feature of UCTs; they do not just argue that on the evidence available a different conclusion should be drawn from that officially sanctioned or popular. Rather, the claim is that the evidence supporting the official verdict is suspect, fraudulent, faked or coerced. And this belief is used to support the nature of the conspiracy, which must be one powerful or competent enough to fake all this evidence. What we have here is a difference between critically assessing evidence--something I support under (A) above--and the universal acid of hypercritical doubt. For if we start with the position that any piece of evidence may be false then it is open to us to support any hypothesis whatsoever. Holocaust revisionists would have us believe that vast amounts of evidence supporting the hypothesis of a German plot to exterminate Europe's Jews are fake. As Robert Anton Wilson (1989: 172) says, `a conspiracy that can deceive us about 6,000,000 deaths can deceive us about anything, and that it takes a great leap of faith for Holocaust Revisionists to believe that World War II happened at all.' Quite so.

What is needed here is what I might term meta-evidence, evidence about the evidence. My claim would be that the only way to keep Occam's Razor shiny here is to insist on two different levels of critical analysis of evidence. Evidence may be rejected if it doesn't fit a plausible hypothesis--this is what everyone must do in cases where there is apparently contradictory evidence, and there can be no prima facie guidelines for rejection here apart from overall epistemological economy. But evidence may only be impeached--accused of being deliberately faked, forged, coerced, etc.--if we have further evidence of this forgery: that a piece of evidence does not fit our present hypothesis is not by itself any warrant for believing that the evidence is fake.

(H) We should put no trust in what I here term the fallacy of the spider's web.

That A knows B and that B knows C is no evidence at all that A has even heard of C. But all too often UCTs proceed in this fashion, weaving together a web of conspirators on the basis of who knows who. But personal acquaintance is not necessarily a transitive relation. The falsity of this belief in the epistemological importance of webs of relationships can be demonstrated with reference to the show-business party game known sometimes as `Six Degrees of Kevin Bacon'.
The object of the game is to select the name of an actor or actress and then link them to the film-actor Kevin Bacon through no more than six shared appearances. (E.g. A appeared with B in film X, B appeared with C in film Y, C appeared with D in film Z, and D appears in Kevin Bacon's latest movie: thus we link A to Bacon in four moves.) The plain fact is that most of us know many people, and important people in public office tend to have dealings with a huge number of people, so just about anybody in the world can be linked to somebody else in a reasonably small number of such links. I can demonstrate the truth of this proposition with reference to my own case, that of a dull and unworldly person who doesn't get out much. For I am separated by only two degrees from Her Majesty The Queen (for I once very briefly met the then Poet Laureate, who must himself have met the Queen if only at his inauguration), which means I am separated by only three degrees from all the many important political figures that the Queen herself has met, including names like Churchill and De Gaulle. Which further means that only four degrees separate me from Josef Stalin (met by Churchill at Yalta) and just five degrees from Adolf Hitler (who never met Churchill but did meet prewar Conservative politicians like Chamberlain and Halifax, who were known to Churchill). Given the increasing amounts of travel and communication that have taken place in this century, it should be possible to connect me with just about anybody in the world in the requisite six stages. But so what? Connections like these offer the possibility of communication and influence, but offer no evidence for its actuality.
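The arithmetic here is easily checked computationally. The following sketch--a minimal illustration in Python, with a wholly invented acquaintance graph that merely echoes my own example above--uses a breadth-first search to count degrees of separation, and recovers exactly the chains just described:

    from collections import deque

    # A tiny, wholly invented acquaintance graph echoing the example above.
    acquaintances = {
        "author": ["poet laureate"],
        "poet laureate": ["author", "the Queen"],
        "the Queen": ["poet laureate", "Churchill"],
        "Churchill": ["the Queen", "Chamberlain", "Stalin"],
        "Chamberlain": ["Churchill", "Hitler"],
        "Stalin": ["Churchill"],
        "Hitler": ["Chamberlain"],
    }

    def degrees_of_separation(graph, start, goal):
        """Length of the shortest acquaintance chain from start to goal, or None."""
        queue, seen = deque([(start, 0)]), {start}
        while queue:
            person, distance = queue.popleft()
            if person == goal:
                return distance
            for contact in graph.get(person, []):
                if contact not in seen:
                    seen.add(contact)
                    queue.append((contact, distance + 1))
        return None

    print(degrees_of_separation(acquaintances, "author", "Stalin"))  # prints 4
    print(degrees_of_separation(acquaintances, "author", "Hitler"))  # prints 5

A graph built from the real acquaintanceships of public figures would be vastly denser than this toy, and the paths correspondingly shorter--which is precisely why the existence of such paths proves nothing.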
(I) The classic logical fallacy of post hoc ergo propter hoc. This is the most common strictly logical fallacy to be found in political conspiracy theories, especially those dealing with assassinations and suspicious deaths. And broadly it takes the shape of claiming that since event X happened after the death of A, A's death was brought about in order to cause or facilitate the occurrence of X. The First World War happened after the death of the Archduke Franz Ferdinand, and there is clearly a sense in which it happened because of his death: there is a causal chain leading from the death to Austrian outrage, to a series of Austrian demands upon Serbia, culminating in Austria's declaration of war against Serbia, Russia's declaration against Austria, and, via a series of interlinked treaty obligations, most of the nations of Europe ending up at war with one another. Though these effects of the assassination may now appear obvious, one problem for the CT proponent is that hindsight clarifies these matters enormously: such a progression may not have been at all obvious to the people involved in these events at the time. And it is even harder to believe that bringing about such an outcome was in any of their interests. (Austria plainly had an interest in shoring up its authority in the Balkans but not, given its many structural weaknesses, in engaging in a long and destructive war. The outcome, which anyone might have predicted as likely, was the economic ruin and subsequent political dissolution of the entire Austro-Hungarian empire.) Attempting to judge the rationality of a proposed CT here as an explanation for some such set of events runs into two problems. Firstly, though an outcome may now seem obvious to us, it may not have appeared so obvious to people at the time, either in its nature or in its expensiveness. Thus there may well have been people who thought that assassinating Franz Ferdinand in order to trigger a crisis in relations between Austria and Serbia was a sensible policy move, precisely because they did not anticipate a general world war occurring as a result, and may have thought a less expensive conflict, a limited war of independence between Serbia and Austria, worth the possible outcome of freeing more of the Balkans from Austrian domination. And secondly, if we cannot attribute hindsight to the actors in such events, neither can we ascribe to them a perfect level of rationality: it is always possible for people engaged in such actions to possess a poor standard of means-end judgement. But, bearing these caveats in mind, one might still wish to propound two broad principles here for distinguishing whether an event is a genuine possible motive for an earlier conspiracy or just an instance of post hoc ergo propter hoc. Firstly, could any possible conspirators, with the knowledge they possessed at the time, have reasonably foreseen such an outcome? And secondly, granted that such an outcome could have been desired, are the proposed conspiratorial events a rational method of bringing about such an outcome? That a proposed CT passes these tests is, of course, no guarantee that we are dealing here with a genuine conspiracy; but a failure to pass them is a significant indicator of an unwarranted CT.

4. A case-study of CT thinking--the assassination of President Kennedy

With these diagnostic indicators of poor critical thinking in place, I would now like to apply them to a typical instance of CT (and, to my mind, unwarranted CT) thinking.[6] On 22 November 1963 President John F. Kennedy was assassinated in Dallas, Texas. Two days later, the man accused of his murder, Lee Harvey Oswald, was himself murdered in the basement of the Dallas Police Headquarters. These two events (and perhaps particularly the second, coming as it did so rapidly after the first) led to a number of accusations that Kennedy's death had been the result of a conspiracy of which Oswald may or may not have been a part. Books propounding such theories emerged even before the Warren Commission issued its report on the assassination in September 1964. Writing at this time in his essay `The Paranoid Style in American Politics', the political scientist Richard Hofstadter could say: "Conspiratorial explanations of Kennedy's assassination have a far wider currency in Europe than they do in the United States." (Hofstadter 1964: 9) Hofstadter's view of the American paranoid style was one of small cults of a right-wing or racist or anti-Catholic or anti-Freemason bent, whose descendants are still to be found in the Ku Klux Klan, the John Birch Society, the Michigan Militia, etc. But within a couple of years of the emergence of the Warren Report and, more importantly, its 26 volumes of evidence, a new style of conspiratorial thinking emerged. While some right-wing conspiratorial theories remained[7], the bulk of the conspiracy theories propounded to explain the assassination adopted a position from the left of centre, accusing or assuming that some conspiracy of right-wing elements and/or some part of the US Government itself had been responsible for the assassination. A complete classification of such CTs is not necessary here[8], but I ought perhaps to point to a philosophically interesting development in the case.
As a result of public pressure resulting from the first wave of CT literature, a congressional committee was established in 1977 to investigate Kennedy's assassination; it instituted a thorough examination of the available evidence and was on the verge of producing a report endorsing the Warren Commission's conclusions when it discovered what was alleged to be a sound recording of the actual assassination. Almost solely on the basis of this evidence--which was subsequently discredited by a scientific panel put together by the Department of Justice--the congressional committee decided that there had probably been a conspiracy, asserting on the basis of very little evidence that the Mafia was the most probable source of this conspiracy. What was significant about this congressional investigation was the effect of its thorough examination of the forensic and photographic evidence in the case. Many of the alleged discrepancies in this evidence, which had formed the basis for the many calls to establish such an investigation, were shown to be erroneous. This did not lead to the refutation of CTs but rather to a new development: the balance of CT claims now shifted from arguing that there existed evidence supporting a conspiratorial explanation to arguing that all or most of the evidence supporting the lone-assassin hypothesis had been faked--a new level of epistemological complexity. A representative CT of this type was propounded in Oliver Stone's hit 1992 Hollywood film JFK.[9] It asserts that a coalition of interests within the US governmental structure, including senior members of the armed forces, FBI, CIA, Secret Service and various Texas law-enforcement agencies, with the assistance of members of organised crime, conspired to arrange the assassination of President Kennedy and the subsequent framing of an unwitting or entirely innocent Oswald for the crime. Motives for the assassination vary, but most such CTs now agree on such motives as (a) preventing Kennedy after his supposed re-election from reversing US involvement in Vietnam, (b) protecting right-wing industrial interests, especially Texan oil interests, from what were regarded as possible depredations by the Kennedy administration, (c) instigating another and more successful US invasion of Cuba, and (d) halting the judicial assault waged by the Kennedy administration under Attorney General Robert Kennedy against the interests of organised crime. Such a CT scores highly on Keeley's five characteristic features of Unwarranted Conspiracy Theories outlined above. It runs counter to the official explanation of the assassination, though it has now itself become something of a popular orthodoxy, one apparently subscribed to by a majority of the American population. The alleged intentions behind the conspiracy are indeed nefarious, using the murder of a democratically-elected leader to further the interests of a private cabal. And it does seem to seek to tie together seemingly unrelated events. The most obvious of these is in terms of the assassination's alleged motive: it seeks to link the assassination with the subsequent history of America's involvement in Vietnam. But a number of other connections are made at other levels of explanation. For instance, the deaths of various people connected in one way or another with the assassination are linked together as being in some way related to the continuing cover-up by the conspirators.
Keeley's fourth claim, that the truth behind an event explained by a UCT is typically a well-guarded secret, is, as I pointed out above, much harder to justify now in a climate where most people apparently believe in the existence of such a conspiracy. But Keeley's fifth claim, that the chief tool here is errant data, remains true. The vast body of published evidence on the assassination has been picked over with remarkable care for signs of discrepancy and contradiction, signs which are regarded as providing the strongest evidence for such a conspiracy. What now seems to me to be an interesting development in these more paranoid UCTs, as I mention above, is the extent to which unerrant data--evidence which fits the official account--is now explained away as the deliberate manufacture of the conspiracy itself. But how do these Kennedy assassination CTs rate against my own list of what I regard as critical thinking weaknesses? (A) An inability to weigh evidence properly. Here they score highly. Of particular importance is the inability to judge the reliability or lack thereof of eyewitness testimony, and an unwillingness or inability to discard evidence which does not fit. On the first point, most Kennedy CTs place high reliance on the small number of people who claimed at the time (and the somewhat larger number who claim now--see point (B) below) that they heard more than three shots fired in Dealey Plaza or that they heard shots fired from some other location than the Book Depository, both claims that if true would rule out the possibility of Oswald's acting alone. Since the overwhelming majority of witnesses whose opinions have been registered did not hear more than three shots, and tended to locate the origin of these shots in the general direction of the Depository (which, in an acoustically misleading arena like Dealey Plaza, is perhaps the best that could be hoped for), the economical explanation is to assume, unless further evidence arises, that the minority here are mistaken. Since the assassination was an unexpected, rapid and emotionally laden event--all key features for weakening the reliability of observation, according to the Principles of Appraising Observations in Norris & King (1983)--it is only to be expected that there would be a significant portion of inconsistent testimony. The wonder here is that there is such a high degree of agreement over the basic facts. We find a similar misuse of observational principles in conspiratorial interpretations of the subsequent murder of Police Officer Tippit, where the majority of witnesses who clearly identified Oswald as the killer are downplayed in favour of the minority of witnesses--some at a considerable distance and all considerably surprised by the events unfolding in front of them--who gave descriptions of the assailant which did not match Oswald. Experienced police officers are used to eye-witness testimony of sudden and dramatic events varying considerably and, like all researchers faced with a large body of evidence containing discrepancies, must discard some evidence as worthless. Since Oswald was tracked almost continuously from the scene of Tippit's shooting to the site of his own arrest, and since forensic evidence linked the revolver found on Oswald to the shooting, the most economical explanation again is that the majority of witnesses were right in their identification of Oswald and the minority were mistaken.
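The force of this point can be given rough quantitative form. Suppose--a pure assumption for illustration, not an estimate drawn from the assassination literature--that each witness to a sudden and stressful event independently reports the direction of the shots correctly only seven times out of ten. Even with witnesses this unreliable, the chance of a large majority converging on the wrong answer is minute, which is why the economical course is to discard the minority reports rather than the majority ones:

    from math import comb

    def prob_majority_wrong(n_witnesses, p_correct):
        """P(more than half of n independent witnesses report incorrectly)."""
        p_wrong = 1 - p_correct
        return sum(comb(n_witnesses, k) * p_wrong**k * p_correct**(n_witnesses - k)
                   for k in range(n_witnesses // 2 + 1, n_witnesses + 1))

    # With 100 witnesses who are each only 70% reliable, a wrong majority
    # is roughly a one-in-100,000 event.
    print(prob_majority_wrong(100, 0.7))

The independence assumption is of course too strong--witnesses influence one another, as point (B) below makes clear--but the direction of the arithmetic is what matters.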
This problem of being unable to discard errant data is central to the creation of CTs since, as Keeley says: "The role of errant data in UCTs is critical. The typical logic of a UCT goes something like this: begin with errant facts.... The official story all but ignores this data. What can explain the intransigence of the official story tellers in the face of this and other contravening evidence? Could they be so stupid and blind? Of course not; they must be intentionally ignoring it. The best explanation is some kind of conspiracy, an intentional attempt to hide the truth of the matter from the public eye." (Keeley 1999: 199) Such a view in the Kennedy case ignores the fact that the overwhelming amount of errant data on which CTs have been constructed, far from being hidden, was openly published in the 26 volumes of Warren Commission evidence. This has led to accusations that it was `hidden in plain view', but one can't help feeling that a more efficient conspiracy would have suppressed such inconvenient data in the first place. The standard position--that errant data is likely to be false, that eye-witness testimony and memory is sometimes unreliable, that persisting pieces of physical evidence are preferable, etc., in short that Occam's Razor will insist on cutting and throwing away some of the data--is constantly rejected in Kennedy CT literature. Perhaps the most extravagant example of this, amounting almost to a Hegelian synthesis of assassination conspiracy theories, is Lifton (1980). Seeking to reconcile the major body of testimony that Kennedy was shot from behind with a small body of errant data that he possessed a wound in the front of his body, the author dedicates over 600 pages to the construction of the most baroque conspiracy theory imaginable. In Lifton's thesis, Kennedy was shot solely from the front, and then the conspirators gained access to his body during its journey back to Washington and were able to doctor it so that at the subsequent post mortem examination it showed signs of being shot only from the rear. Thus the official medical finding that Kennedy was only shot from the rear can be reconciled with the general CT belief that he was shot from the front (too) in a theory that seems to show that everybody is right. Apart from the massive complication of such a plan--clearly going against my point (F)--and its medical implausibility, such a thesis actually reverses Occam's Razor by creating more errant data than there was to start with. For if Kennedy was shot only from the front, we now need some explanation for why the great majority of over 400 witnesses at the scene believed that the shots were coming from behind him! And this challenge is one that is ducked by the great majority of CTs: if minority errant data is to be preferred as reliable, then we require some explanation for the presence of the majority data now being rejected. But Lifton at least got one thing right. In accounting for the title of his book he writes: "The `best evidence' concept, impressed on all law students, is that when you seek to determine a fact from conflicting data, you must arrange the data according to a hierarchy of reliability. All data are not equal. Some evidence (e.g. physical evidence, or a scientific report) is more inherently error-free, and hence more reliable, than other evidence (e.g. an eye-witness account).
The `best' evidence rules the conclusion, whatever volume of contrary evidence there may be in the lower categories."[10] Unfortunately Lifton takes this to mean that conspirators who were able to decide the nature of the autopsy evidence would thereby lay down a standard for judging or rejecting as incompatible the accompanying eye-witness testimony. But given the high degree of unanimity among eye-witnesses on this occasion, and given the existence of corroborating physical evidence (a rifle and cartridges forensically linked to the assassination were found in the Depository behind Kennedy, the registered owner of the rifle was a Depository employee, etc.), all that the alleged body-tampering could hope to achieve is to make the overall body of evidence more suspicious because more contradictory. Only if the body of reliable evidence was more or less balanced between a conspiratorial and a non-conspiratorial explanation could this difficulty be avoided. But it is surely over-estimating the powers, predictive and practical, of such a conspiracy to suppose that it could hope to guarantee this situation beforehand. (B) An inability to assess evidence corruption and contamination. Though, as I note above, such contamination of eye-witness testimony may occur contemporaneously, it is a particular problem for the more long-standing CTs. In the Kennedy case, many witnesses of the assassination who at the time gave accounts broadly consistent with the official explanation have subsequently amended or extended their accounts to include material that isn't so consistent. Witnesses, for instance, who at the time located all the shots as coming from the Book Depository subsequently gave accounts in which they located shots from other directions, most notably the notorious `grassy knoll', or later told of activity on the knoll which they never mentioned in their original statements. (Posner (1993) charts a number of these changes in testimony.) What is interesting about many of these accounts is that mundane explanations for these changes--I later remembered that..., I forgot to mention that...--tend to be eschewed in favour of more conspiratorial explanations. Such witnesses may deny that the signed statements made at the time accurately reflect what they told the authorities, or may say that the person interviewing them wasn't interested in writing down anything that didn't cohere with the official explanation of the assassination, and so on. Such explanations face serious difficulties. For one thing, since many of these statements were taken on the day of the assassination or very shortly afterwards, it would have to be assumed that putative conspirators already knew which facts would cohere with an official explanation and which wouldn't, which may imply an implausible degree of foreknowledge. A more serious problem is that these statements were taken by low-level members of the various investigatory bodies--police, FBI, Secret Service, etc.; to assert that such statements were manipulated by these people entails that they were members of the conspiracy. And this runs up against a practical problem for mounting conspiracies: the more people who are in a conspiracy, the harder it is going to be to enforce security.
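This practical problem can itself be given a crude numerical form. If each person privy to a conspiracy carries even a small independent annual risk of confessing, defecting or blundering--the one per cent figure below is an invented assumption, not a measured rate--then the probability of long-term silence collapses rapidly as the conspiracy grows:

    def prob_silence(n_members, years, p_leak_per_person_year=0.01):
        """Probability that no member leaks over the whole period,
        assuming independent per-person-year leak risks."""
        return (1 - p_leak_per_person_year) ** (n_members * years)

    for n_members in (5, 50, 500):
        print(n_members, prob_silence(n_members, years=35))
    # 5 members:   ~0.17   -- decades of secrecy at least conceivable
    # 50 members:  ~2e-08  -- already wildly implausible
    # 500 members: ~4e-77  -- effectively impossible

On any such model, a conspiracy of the size required to manipulate statements across several investigatory bodies, and to stay silent about it for over three decades, is astronomically unlikely.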
A more plausible explanation for these changes in testimony might be that witnesses who provided testimony broadly supportive of the official non-conspiratorial explanation subsequently came into contact with some of the enormous quantity of media coverage suggesting less orthodox explanations and, consciously or unconsciously, have adjusted their recollections accordingly. The likelihood of such things happening after a sufficiently thorough exposure to alternative explanations may underlie Norris & King's principle II.1: "An observation statement tends to be believable to the extent that the observer was not exposed, after the event, to further information relevant to describing it. (If the observer was exposed to such information, the statement is believable to the extent that the exposure took place close to the time of the event described.)"[11] Their parenthesised time principle clearly renders a good deal of more recent Kennedy eye-witness testimony dubious after three and a half decades of exposure to vast amounts of further information in the mass media, not helped by `assassination conferences' where eye-witnesses have met and spoken with each other. One outcome of these two points is that, in the unlikely event of some living person being seriously suspected of involvement in the assassination, a criminal trial would be rendered difficult if not impossible. Such are the published discrepancies now within and between witnesses' testimonies that there would be enormous difficulties in attempting to construct a plausibly consistent defence or prosecution narrative on their basis. (C) Misuse or outright reversal of a principle of charity. Where an event may have either a suspicious or an innocent explanation, and there is no significant evidence to decide between them, CTs invariably opt for the suspicious explanation. In part this is due to a feature deriving from Keeley's point (3) above, about CTs seeking to tie together seemingly unrelated events, but perhaps taken to a new level. Major CTs seek a maximally explanatory hypothesis, one which accounts for all of the events within its domain, and so they leave no room for the out-of-the-ordinary event, the unlikely, the accident, which has no connection whatsoever with the conspiratorial events being hypothesised. The various Kennedy conspiracy narratives contain a large number of these events, dragooned into action on the assumption that no odd event could have an innocent explanation. There is no better example of this than the Umbrella Man, a character whose forcible inclusion in conspiratorial explanations demonstrates well how a determined attempt to maintain this reversed principle of charity may lead to the most remarkable deformities of rational explanation. When pictorial coverage of the assassination entered the public domain, in newspaper photographs within the next few days, and more prominently in stills from the Zapruder movie film of the events subsequently published in LIFE magazine, it became clear that one of the closest bystanders to the presidential limousine was a man holding a raised umbrella, and this at a time when it was clearly not raining. This odd figure rapidly became the focus of a number of conspiratorial hypotheses. Perhaps the most extreme of these originates with Robert Cutler (1975).
According to Cutler, the Umbrella Man had a weapon concealed within the umbrella enabling him to fire a dart or flechette, perhaps drugged, into the president's neck, possibly for the purpose of immobilising him while the other assassins did their work. The only actual evidence to support this hypothesis is that the front of Kennedy's neck did indeed possess a small punctate wound, described by the medical team treating him as probably a wound of entrance but clearly explainable in the light of the full body of forensic evidence as a wound of exit for a bullet fired from above and behind the presidential motorcade. Consistent, in other words, with being the work of Oswald. There is no other supportive evidence for Cutler's hypothesis. (Cutler, of course, explains this in terms of the conspirators being able to control the subsequent autopsy and so conceal any awkward evidence; he thus complies with my principle (G) below.) More importantly, it seems inherently unlikely on other grounds. Since the Umbrella Man was standing on the public sidewalk, right next to a number of ordinary members of the public and in plain view of hundreds of witnesses, many of whom would have been looking at him precisely because he was so close to the president, it seems unlikely that a conspiracy could guarantee that he could get away with his lethal behaviour without being noticed by someone. And the proposed explanation for all this rigmarole, the stunning of the target, is entirely unnecessary: most firearms experts agree that the president was a pretty easy target unstunned. If Cutler's explanation hasn't found general favour with the conspiracy community, another has, but this too has equally strange effects upon clear reasoning. The first version of this theory has the Umbrella Man signalling the presence of the target--movie-film of the assassination clearly shows that the raised umbrella is being waved or shaken. This hypothesis seems to indicate that the conspiracy had hired assassins who couldn't be relied upon to recognise the President of the United States when they saw him seated in his presidential limousine--the one with the president's flag on--next to the most recognisable first lady in American history. An apparently more plausible hypothesis is that it is the Umbrella Man who gives the signal for the team of assassins to open fire. (A version of this hypothesis can still be seen as late as 1992 in the movie JFK.) What I find remarkable here is that nobody seems to have thought this theory through at all. Firstly, the Umbrella Man is clearly on the sidewalk a few feet from the president while our alleged assassins are located high up in the Book Depository, in neighbouring buildings, or on top of the grassy knoll way to the front of the president. How, then, can he know what they can see from their different positions? How can he tell from his location that they now have clear shots at the target? (Dealey Plaza is full of trees, road signs and other obstructions, not to mention large numbers of police officers and members of the public who might be expected to get in the way of a clear view here.) And secondly, such an explanation actually weakens the efficiency of the alleged assassination conspiracy. (Here my limited boyhood experience of firing an air-rifle with telescopic sights finally comes in handy!) In order to make sense of the Umbrella Man as signaller, something like the following sequence of events must occur.
Each rifleman focuses upon the presidential target through his telescopic sight, tracking the target as it moves at some ten to twelve miles per hour. Given the very narrow focus of such sights, he cannot see the Umbrella Man. To witness the signal, he must keep taking his eye away from the telescopic sight, refocussing it until he can see the distant figure on the sidewalk, and when the signal is given, put his eye back to the sight, re-focus again, re-adjust the position of the rifle since the target has continued to move while he was not looking at it, and then fire. This is not an efficient recipe for accurate target-shooting. Oliver Stone eliminates some of these problems in the version he depicts in the movie JFK. Here each of his three snipers is accompanied by a spotter, equipped with walkie-talkie and binoculars. While the sniper focuses on the target, the spotter looks out for the signal from the Umbrella Man and then orally communicates the order to open fire. But now, given what I have already said about the problem with the Umbrella Man's location, it is hard to see what purpose he serves that could not be better served by the spotters. He drops out of the equation. He is, as Wittgenstein says somewhere, a wheel that spins freely because it is not connected to the rest of the machinery. Occam's Razor would cut him from the picture, but Occam is no firm favourite of UCT proponents. In 1978, when the House Select Committee on Assassinations held public hearings on the Kennedy case, a Mr. Louis de Witt came forward to confess to being the Umbrella Man. He claimed that he came to Dealey Plaza in order to barrack the president as he went past, and that he was carrying a raised umbrella because he had heard that, perhaps for some obscure reason connected with the president's father's stay in London as US Ambassador during the war, the Kennedy family had a thing about umbrellas. De Witt hadn't come forward in the 15 years since the assassination because he had had no idea about the proposed role of the Umbrella Man in the case. This part of his explanation seems to me to be eminently plausible: those of us with an obsessive interest in current affairs find it hard to grasp just how many people never read the papers or watch TV news. There is something almost endearing about de Witt, an odd character whose moment of public eccentricity seems to have enmired him in decades of conspiratorial hypotheses without his realising it. Needless to say, conspiracy theorists did not accept de Witt's testimony at face value. Some argued that he was a stooge put forward by the authorities to head off investigation into the real Umbrella Man, others that de Witt himself must be lying to conceal a more sinister role in these events, though I know of no evidence to support either of these conclusions. What this story makes clear is that an unwillingness to dismiss discrepant events as unrelated--an unwillingness to abandon this reversed principle of charity whereby all such events are conspiratorial unless clearly proven otherwise--rapidly leads to remarkable mental gymnastics, to hypotheses that are excessively complex and even internally inconsistent. (The Umbrella Man as signaller makes the assassination harder to perform.) But, such are the ways of human psychology, once such an event has been firmly embedded within a sufficiently complex hypothesis, no amount of contradictory evidence would seem to be able to shift it.
The Umbrella Man has by now been invested with such importance as to become one of the great myths of the assassination, against which mere evidentiary matters can have no effect. (D) The demonisation of persons and organisations. This weakness takes a number of forms in the Kennedy case, which I shall treat separately. (i) Guilt by reputation. The move from the fact that some body--the FBI, the CIA, the mafia, the KGB--has a proven record of wrong-doing in the past to the claim that they were capable of wrong-doing in the present case doesn't seem unreasonable. But the stronger claim that past wrong-doing is in some sense evidence for present guilt is much more problematic, particularly when differences between the situations are overlooked. This is especially true of the role of the CIA in Kennedy CTs. Senator Church's 1976 congressional investigation into the activities of US intelligence agencies provided clear evidence that in the period 1960-63 elements of the CIA, probably under the instructions of or at least with the knowledge of the White House, had conspired with Cuban exiles and members of organised crime to attempt the assassination of Cuban leader Fidel Castro. Evidence also emerged of CIA involvement in the deaths of other foreign leaders--Trujillo in the Dominican Republic, Lumumba in the Congo, etc. These findings were incorporated in Kennedy CTs as evidence to support the probability that the CIA, or at least certain members of it, were also responsible for the death of Kennedy. Once an assassin, always an assassin? Such an argument neglects the fact that the CIA could reasonably believe that they were acting in US interests, possibly lawfully since they were acting under the guidance or instruction of the White House. This belief is not open to them in the case of killing their own president, a manifestly unlawful act and one hard to square with forwarding US interests. (Evidence that Soldier X willingly shoots at the soldiers of other countries when ordered to do so is no evidence that he would shoot at soldiers of his own country, with or without orders. The situations are plainly different.) At best the Church Committee evidence indicated that the CIA had the capacity to organise assassinations, not that it had either the willingness or the reason to assassinate its own leader. (ii) Guilt by association. This takes the form of impeaching the credibility of any member of a guilty organisation. Since both the FBI and the CIA (not to mention, of course, the KGB or the mafia) had proven track records of serious misbehaviour in this period, it is assumed that all members of these organisations, and all their activities, are equally guilty. Thus the testimony of an FBI agent can be impeached solely on the grounds that he is an FBI agent; any activity of the CIA can be characterised as nefarious solely because it is being carried out by the CIA. Such a position ignores the fact that such organisations have many thousands of employees and carry out a wide range of mundane duties. It is perfectly possible for a member of such an organisation to be an honest and patriotic citizen whose testimony is as believable as anyone else's. Indeed, given my previous point that for security reasons the smaller the conspiratorial team the more likely it is to be successful, it would seem likely that the great majority of members of such organisations would be innocent of any involvement in such a plot.
(I would hazard a guess that the same holds true of the KGB and the mafia, both organisations with a strong interest in security.) (iii) Exaggerating the power and nature of organisations. Repeatedly in such CTs we find the assumption that organisations like the CIA or the mafia are all-powerful, all-pervasive, capable of extraordinary foreknowledge and planning.[12] This assumption has difficulty in explaining the many recorded instances of inefficiency or lack of knowledge that these organisations constantly demonstrate. (There is a remarkable belief in conspiratorial circles, combining political and paranormal conspiracies, that the CIA has or had access to a circle of so-called `remote viewers', people with extra-sensory powers who were able through paranormal means to provide them with information about the activities of America's enemies that couldn't be discovered in any other way. Such a belief has trouble in easily accommodating the fact that the CIA was woefully unprepared for the sudden break-up of the Soviet Union and Warsaw Pact, or the fact that America's intelligence organisations first learned of the start of the Gulf War when Kuwaiti embassy employees looked out of the window and saw Iraqi tanks coming down the road! Sadly, it appears to be true that people calling themselves remote viewers took very substantial fees from the CIA, though whether this tells us more about the gullibility of people in paranoid institutions or their carefree attitude towards spending public money I should not care to say.) The more extreme conspiracy theories may argue that such organisations are only pretending to be inefficient, in order to fool the public about the true level of their efficiency. Such a position is, as Popper would no doubt have pointed out, not open to refutation. (iv) Demonising individuals. As with organisations, so with people. Once plausible candidates for roles in an assassination conspiracy are identified, they are granted remarkable powers and properties, their wickedness clearly magnified. In Kennedy CTs there is no better example of this than Meyer Lansky, the mafia's `financial wizard'. Lansky was a close associate of America's premier gangster of the 1940s, Charles `Lucky' Luciano. Not actually a gangster himself (and, technically, not actually a member of the mafia either, since Lansky--as a Jew--could not join an exclusively Sicilian brotherhood), Lansky acted as a financial adviser. He organised gambling activities for Luciano and probably played a significant role in the mafia involvement in the development of Las Vegas, and in subsequent investments of the Luciano family's money, including those in pre-revolutionary Cuba, after Luciano's deportation to Sicily. So much is agreed. But Lansky in CT writing looms ever larger, as a man of remarkable power and influence, ever ready to use it for malign purposes, a vast and evil spider at the centre of an enormous international web, maintaining his influence with the aid of the huge sums of money which organised crime was reaping from its empire.[13] Thus there is no nefarious deed concerning the assassination or its cover-up with which Lansky cannot be linked. This picture wasn't dented in the least by Robert Lacey's detailed biography of Lansky published in 1991.
Lacey, drawing upon a considerable body of publicly available evidence--not least the substantial body generated by Lansky's lawsuit to enable him, as a Jew, to emigrate to Israel--was able to show that Lansky, far from being the mob's eminence grise, was little more than a superannuated book-keeper. The arch manipulator, supposedly empowered by the mafia's millions, led a seedy retirement in poverty and was on record as being unable to afford healthcare for his relatives. The effect of reading Lacey's substantially documented biography is rather like that scene in `The Wizard of Oz' when the curtain is drawn back and the all-powerful wizard is revealed to be a very ordinary little man. The 1990s saw the publication of a remarkable amount of material about the workings of American organised crime, much of it gleaned from FBI and police surveillance during the successful campaign to imprison most of its leaders. This material reveals that mafia bosses tend to be characterised by a very limited vocabulary, a remarkable propensity for brutality and a considerable professional cunning often mixed with truly breath-taking stupidity. That they could organise a large-scale assassination conspiracy, and keep quiet about it for more than thirty-five years, seems even less likely. As I point out below, they would almost certainly not have wanted to. (E) The canonisation of persons or (more rarely) organisations. In the Kennedy case, this has taken the form of idealising the President himself. In order to make a conspiratorial hypothesis look more plausible under (F) below, it is necessary to make the victim look as much as possible like a significant threat to the interests of the putative conspirators. In this case, Kennedy is depicted as a liberal politician, one who was a threat to established economic interests, one who took a lead in the contemporary campaign to end institutionalised discrimination against black people, and, perhaps most importantly, one who was or became something of a foreign policy dove, supporting less confrontational policies in the Cold War to the extent of being prepared to terminate US involvement in South Vietnam. This canonisation initially derives from the period immediately after the assassination, a period marked by the emergence of a number of works about the Kennedy administration from White House insiders like Theodore Sorensen, Pierre Salinger and the Camelot house historian, Arthur Schlesinger, works which tended to confirm the idealisation of the recently dead president, particularly when implicitly compared with the difficulties faced by the increasingly unpopular Lyndon Johnson. From the mid-1970s Kennedy's personal character came under considerable criticism, partly resulting from the publication of biographies covering his marriage and sexual life, and the personal lives of the Kennedy family. More important, for our purposes, was the stream of revelations which emerged from the congressional investigations of this time indicating the depth of feeling in the Kennedy White House about Cuba; most important here were the Church Committee's revelations that the CIA had conspired with members of organised crime to bring about the assassination of Fidel Castro. These, coming hard on the heels of the revelations of various criminal conspiracies within the Nixon White House, stoked up the production of CTs.
(And provided a new motivation for the Kennedy assassination: that Castro or his sympathisers had found out about these attempts and had Kennedy killed in revenge.) But they also indicated that the Kennedy brothers were much harder cold war warriors than had perhaps previously been thought. The changing climate of the 1980s brought a new range of biographies and memoirs--Reeves, Parmet, Wofford, etc.--which situated Kennedy more firmly in the political mainstream. It became clear that he was not by any means an economic or social liberal--on the question of racial segregation he had to be pushed a good deal, since he tended to regard the activities of Martin Luther King and others as obstructing his more important social policies. And Kennedy adopted a much more orthodox stance on the cold war than many had allowed: this was, after all, the candidate who got himself elected in 1960 by managing in the famous `missile gap' affair to appear tougher on communism than Richard Nixon, no mean feat. Famously, Kennedy adopted a more moderate policy during the Cuban missile crisis than some of those recommended by his military advisers, but this can be explained more in terms of Kennedy having a better grasp of the pragmatics of the situation than in terms of his being a foreign policy liberal of some sort. This changing characterisation of Kennedy, this firm re-situating of his administration within the central mainstream of American politics--a mainstream which appears considerably to the right in European terms--has been broadly rejected by proponents of Kennedy assassination CTs (some of whom also reject the critical characterisation of his personal life). The reason for this is that it plainly undercuts any motivation for some part of the American political establishment to have Kennedy removed. It is unlikely that any of Kennedy's reforming policies, economic or social, could seriously have been considered such a threat to establishment interests. It is even more unlikely when one considers that much of Kennedy's legislative programme was seriously bogged down in Congress and was unlikely to be passed in anything but a heavily watered-down form during his term. Much of this legislation was forced through after the assassination by Kennedy's successor, Lyndon Johnson, a much more astute and experienced parliamentarian. The price for this social reform, though, was Johnson's continued adherence to the verities of cold war foreign policy over Vietnam. I leave consideration of Kennedy's Vietnam policy to the next section. (F) An inability to make rational or proportional means-end judgements. The major problem here for any Kennedy assassination CT is to come up with a motive. Such a motive must not only be of major importance to putative conspirators, it must also rationally justify a risky, expensive--and often astonishingly complicated--illegal conspiracy. Which is to say that such conspirators must see the assassination as the only or best way of bringing about their aim. The alleged motives can be broadly divided into two categories. Firstly, revenge. Kennedy was assassinated in revenge for the humiliation he inflicted upon Premier Khrushchev over the Cuban missile crisis, or for plotting the assassination of Fidel Castro, or for double-crossing organised crime over alleged agreements made during his election campaign. The problem with each of these explanations is that the penalties likely to be suffered if one is detected far outweigh any rational benefits.
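The form of the required judgement can be sketched schematically. On even the crudest decision-theoretic reckoning--the figures below are arbitrary placeholders on a notional utility scale, chosen only to exhibit the shape of the calculation, not estimates of anything--a plan whose failure carries catastrophic penalties must promise enormous benefits before it becomes preferable to cheap and legal alternatives:

    def expected_net_benefit(benefit, operating_cost, penalty_if_caught, p_caught):
        """Expected value of a plan: gain, minus certain costs, minus the
        risk-weighted penalty of detection."""
        return benefit - operating_cost - p_caught * penalty_if_caught

    # An assassination conspiracy: large outlay, catastrophic penalty,
    # non-trivial chance of detection (all figures invented).
    print(expected_net_benefit(benefit=500, operating_cost=100,
                               penalty_if_caught=10_000, p_caught=0.3))   # -2600

    # Legal pressure (press campaigns, lobbying congress): modest gain,
    # trivial cost, and 'failure' incurs almost no penalty.
    print(expected_net_benefit(benefit=200, operating_cost=20,
                               penalty_if_caught=10, p_caught=0.5))       # 175

Only by assuming a vanishingly small chance of detection, or a benefit out of all proportion to anything the revenge theories supply, can the first option be made to come out ahead.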
Had Castro's hand been detected behind the assassination--something which Johnson apparently thought all too likely--this would inevitably have swung American public opinion behind a US military invasion of Cuba and the overthrow of Castro's rule. If Khrushchev had been identified as the ultimate source of the assassination, the international crisis would have been even worse, and could well have edged the world considerably closer to nuclear war than it came during the Cuban missile crisis. One can only make sense of such explanations on the basis of an assumption that the key conspirators are seriously irrational in this respect, and this is an assumption that we should not make without some clear evidence to support it. The second category of explanations for the assassination is instrumental: Kennedy was assassinated in order to further some specific policy or to prevent him from furthering some policy which the conspirators found anathema. Here candidates include: to protect Texas oil-barons' economic interests, to frustrate the Kennedy administration's judicial assault upon organised crime, to bring about a more anti-Castro presidency, and--the one that plays the strongest role in contemporary Kennedy CTs such as Oliver Stone's--to prevent an American withdrawal from Vietnam. A proper response to the suggestion of any of these as a rational motive for the assassination should be to embark upon a brief cost-benefit analysis of the kind sketched schematically above. We have to factor in not only the actual costs of organising such a conspiracy (and, in the case of the more extreme Kennedy CTs, of maintaining it for several decades afterwards to engage in what has been by any standards a pretty inefficient cover-up) but also the potential costs to be faced if the conspiracy is discovered, the assassination fails, etc. Criminals by and large tend to be rather poor at estimating their chances of being caught; murder and armed robbery have very high clear-up rates compared to, say, burglary of unoccupied premises. The continued existence of professional armed robbers would seem to indicate that they underestimate their chances of being caught or don't fully appreciate the comparative benefits of other lines of criminal activity. But though assassination conspirators are by definition criminals, we are to assume here that they are figures in the establishment, professional men in the intelligence, military and political communities, and so likely to be more rational in their outlook than ordinary street criminals. (Though this is a defeasible assumption, since the post-war history of western intelligence agencies has indicated a degree of internal paranoia sometimes bordering on the insane. A substantial part of British intelligence, for instance, spent almost two decades trying to prove that the then head of MI5 was a Soviet agent, a claim that appears to have no credibility at all.) If we assume that the Mafia played such a role in an assassination conspiracy, it is still plausible to believe that they would consider the risks of failure. In fact, we have some evidence to support this belief since, though organised crime is by and large a very brutal institution, in the US--as opposed to the very different conditions prevailing in Italy--it maintains a policy of not attacking dangerous judges or politicians.
When in 1935 Dutch Schultz proposed murdering Thomas Dewey, then a highly effective anti-crime prosecutor in New York and subsequently the Republican presidential candidate in 1948, the response of the mob's leadership was to have Schultz himself murdered rather than risk the troubles that Dewey's assassination would have brought down upon the heads of organised crime. An even more effective prosecutor, Rudolph Giuliani, remained unscathed throughout his career. Against the risks of being caught, we have to balance the costs of trying to achieve one's goal by some other less dramatic and probably more legal path. The plain fact is that there are a large number of legal and effective ways of changing a president's mind or moderating his behaviour. One can organise public campaigns, plant stories in the press, stimulate critical debate in congress, assess or manipulate public opinion through polls, etc. When the health care industry in the US wanted to defeat the Clinton administration's reform proposals, for instance, it didn't opt for assassination but went instead for a highly successful campaign to turn congress and substantial parts of public opinion against the proposals, which soon became dead in the water. On the specific case of American withdrawal from Vietnam, all of the above applies. In the first place, following on from (E) above, it can plausibly be argued that Kennedy had no such intention. He certainly on occasion floated the idea, sounding out people around him, but this is something that politicians do all the time as part of the process of weighing policy options, and it shouldn't be taken as evidence of a settled intention. But to see Kennedy as seriously considering such an option is to see him as a figure considerably out of the Democratic mainstream. He would certainly have been aware of the effects that an Asian policy can have upon domestic matters; as a young congressman he would have been intimately aware of the effect that the fall of China to communism in 1949 had upon the last Democratic administration, severely weakening Harry Truman's effectiveness. For years afterwards the Democrats were regarded as the people who "lost China", despite the fact that there was nothing they could have done--short of an all-out war, like that occurring in Korea shortly afterwards, which couldn't possibly be won without the use of nuclear weapons and all that entails. Kennedy's administration had a much stronger presence in South Vietnam, and it can reasonably be asked whether he would have wanted to run the risk of becoming the president who "lost Vietnam". He would also have been aware of the problem that ultimately faced Lyndon Johnson, that one could only maintain a forceful policy of domestic reform by mollifying congress over matters of foreign policy. The price for Johnson's Great Society reforms was a continued adherence to a policy of involvement in Vietnam, long after Johnson himself--fully aware of this bind--doubted the wisdom of this policy. Kennedy's domestic reforms were already in legislative difficulties; to believe that he was prepared to withdraw from Vietnam, then, is to believe that he was effectively abandoning his domestic programmes. (That Kennedy was alleged to be considering such an action in his second term, if re-elected, doesn't affect this point. He would still have been a lame-duck president, and would also have weakened the chances of any possible Democratic successor, something that would certainly have been of interest to other members of his party.)
It thus appears unlikely that Kennedy would have seriously considered withdrawing completely from Vietnam. But if he had, a number of options were available to opponents of such a policy. Firstly, as noted above, they could have encouraged opposition to such a policy in congress and other important institutions, and among the American public. There was certainly a strongly sympathetic Republican and conservative Democrat presence in congress to form the foundations of such an opposition, as well as among newspaper publishers and other media outlets. If Kennedy had underestimated the domestic problems that withdrawal would cause him, such a campaign would concentrate his mind upon them. And secondly, opponents could work to change Kennedy's mind. They could do this by controlling the information available to Kennedy and his advisers. In particular, military sources could manipulate the information flowing from Vietnam itself. (That Kennedy thought something like this was happening may be indicated by his insistence on sending civilian advisers to Vietnam to report back to him personally.) This policy worked well in Johnson's time--the information about the trivial events in the Gulf of Tonkin in 1964 was manipulated to indicate a serious crisis, which forced Johnson into inserting a heavy military presence into South Vietnam in response. There is no reason to believe that such a policy would not have worked if Kennedy had still been in office. At the very least, it would be rational to adopt such a policy first, to try cheap, legal and probably efficient methods of bringing about one's goal before even contemplating such a dramatic, illegal and high-risk activity as assassination. (I omit here any consideration of the point that members of the American establishment might feel a moral revulsion at the idea of taking such action against their own president. Such a claim may well be true, but the argument from rationality does not require it.) At bottom what we face here is what we might term Goodenough's Paradox of Conspiracies: the larger or more powerful an alleged conspiracy, the less need it has for conspiring. A sufficiently large collection of members of the American political, intelligence and military establishment--the kind of conspiracy being alleged by Oliver Stone et al.--wouldn't need to engage in such nefarious activity, since they would have the kind of organisation, influence and access to information that could enable them to achieve their goal efficiently and legally. The inability noted in (F) to make adequate means-end decisions means that UCT proponents fail to grasp the force of this paradox. (G) Evidence against a UCT is always evidence for. The tendency of modern CTs has been to move from conspiracies which try to keep their nefarious activities secret to more pro-active conspiracies which go to a good deal of trouble to manufacture evidence either that there was a different conspiracy or that there was no conspiracy at all. This is especially true of Kennedy assassination CTs. The epistemological attitude of Kennedy CTs has changed notably over the years. In the period 1964-76 the central claim of such theories was that the evidence collected by the Warren Commission and made public, when fairly assessed, did not support the official lone assassin hypothesis but indicated the presence of two or more assassins and therefore a conspiracy. Public pressure in the aftermath of Watergate brought about a congressional investigation of the case.
In its 1979 report the House Select Committee eventually decided, almost solely on the basis of subsequently discredited acoustic evidence, that there had indeed been a conspiracy. But more importantly, the committee's independent panels of experts re-examined the key evidence, photographic, forensic and ballistic, and decided that it supported the Warren Commission's conclusion. This led to a sea-change in CTs from 1980 onwards. Given the preponderance of independently verified `best evidence' supporting the lone assassin hypothesis, CT proponents began to argue that some or all of this evidence had been faked. This inevitably entailed a much larger conspiracy than had previously been hypothesised, one that not only assassinated the president but was also able to gain access to the evidence of the case afterwards in order to change it, suppress it or manufacture false evidence. They thus fell foul of (F) above. Since the reason for such CTs was often to produce a hypothesis supported by much weaker evidence, eye-witness testimony and so on, they would tend to fall foul of (A), (B) and (C) as well. One problem with such CTs was that they tended to disagree with one another over which evidence had been faked. Thus many theorists argued that the photographic and X-ray record of the presidential post mortem had been tampered with to conceal evidence of conspiracy, while Lifton (1980), as we saw, argued that the record was genuine but the body itself had been tampered with. Other theorists, e.g. Fetzer & co., argue that the X-rays indicate a conspiracy while the photographs do not, implying that the photographs have been tampered with. This latter, widespread belief introduces a new contradiction into the case, since it posits a conspiracy of tremendous power and organisation, able to gain access to the most important evidence of the case, yet one which is careless or stupid enough not to make sure that the evidence it leaves behind is fully consistent. (And, of course, it goes against the verdict of the House Committee's independent panel of distinguished forensic scientists and radiographers that the record of the autopsy was genuine and consistent, both internally and with the hypothesis that Oswald alone was the assassin.) Of particular interest here is the Zapruder movie film of the assassination. Stills from this film were originally published, in the Warren Report and in the press, to support the official lone assassin hypothesis. When a bootleg copy of this film surfaced in the mid-1970s it was taken as significant evidence against the official version, and most CTs since then have relied upon one interpretation or another of this film for support. But now that it is clear, especially since better copies of the film have become available, that the wounds Kennedy suffers in the film do not match those hypothesised by those CT proponents arguing for the falsity of the autopsy evidence, some of these proponents now claim to detect signs that the Zapruder film itself has been faked, and there has been much discussion about the chain of possession of this film in the days immediately after the assassination, to see if there is any possibility of its being in the hands of someone who could have tampered with it. What is happening here is that epistemologically these CTs are devouring their own tails. If the evidence that was originally regarded as foundational for proving the existence of a conspiracy is now itself impeached, then this ought to undermine the original conspiracy case.
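The underlying defect can be put in simple Bayesian terms. A hypothesis that is permitted to dismiss any awkward evidence as forged thereby assigns that evidence roughly the same likelihood as its rival does, and so can never be disconfirmed--but, by the same token, it can never be confirmed either. A minimal sketch, with arbitrary illustrative numbers:

    def posterior(prior, p_evidence_given_h, p_evidence_given_not_h):
        """Bayes' theorem for a binary hypothesis H."""
        joint_h = prior * p_evidence_given_h
        joint_not_h = (1 - prior) * p_evidence_given_not_h
        return joint_h / (joint_h + joint_not_h)

    prior = 0.1  # an assumed prior for the conspiracy hypothesis

    # A falsifiable theory: evidence fitting the official account is
    # unlikely if the conspiracy is real, so belief in the theory falls.
    print(posterior(prior, p_evidence_given_h=0.2, p_evidence_given_not_h=0.9))  # ~0.02

    # The 'all awkward evidence is faked' move: the theory now accommodates
    # that evidence as easily as its rival does, and the posterior simply
    # equals the prior. The theory is immune to evidence, for and against.
    print(posterior(prior, p_evidence_given_h=0.9, p_evidence_given_not_h=0.9))  # 0.1

This is the formal counterpart of Popper's complaint: a theory that can absorb any observation gains no support from any observation.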
If no single piece of evidence in the case can be relied upon then we have no reason for believing anything at all, and the abyss of total scepticism yawns. Interestingly, there seems to be a complete lack of what I termed above `meta-evidence', that is, actual evidence that any of this evidence has been faked. Reasons for believing in this forgery hypothesis tend to fall into one of three groups. (i) It is claimed that some sign of forgery can be detected in the evidence itself. Since much of this evidence consists of poor quality film and photographs taken at the assassination scene, these have turned into blurred Rorschach tests where just about anything can be seen if one squints long and hard enough. In the case of the autopsy X-rays, claims of apparent fakery tend to be made by people untrained in radiography and the specialised medical skill of reading such X-rays. (ii) Forgery is hypothesised to explain some alleged discrepancy between two pieces of evidence. Thus when differences are alleged to exist between the autopsy photographs and the X-rays, it is claimed that one or the other (or both) have been tampered with. (iii) Forgery is hypothesised in order to explain away evidence that is clearly inconsistent with the proposed conspiracy hypothesis.

An interesting case of the latter involves the so-called `backyard photos', photographs supposedly depicting Oswald standing in the yard of his house and posing with his rifle, pistol and various pieces of left-wing literature. For Oswald himself was confronted with these by police officers after his arrest and claimed then that they had been faked--he had had some employment experience in the photographic trade and claimed to know how easily such pictures could be faked. And ever since then CT proponents have made the same claims. But one problem with such claims is that evidence seldom exists in a vacuum; it is interconnected with other evidence. Thus we have the sworn testimony of Oswald's wife that she took the photographs, the evidence of independent photographic experts that the pictures were taken with Oswald's camera, documentary evidence in his own handwriting that Oswald ordered the rifle in the photos and was the sole hirer of the PO box to which it was delivered, eyewitness evidence that Oswald possessed such a rifle and that one of these photos had been seen prior to the assassination, and so on. To achieve any kind of consistency with the forgery hypothesis, all of this evidence must itself have been faked or perjured. Thus the forgery hypothesis inevitably ends up impeaching the credibility of such a range of evidence that a conspiracy of enormous proportions and efficiency is entailed, a conspiracy which runs into the problems raised in (F) above. These problems are so severe that the forgery hypothesis must be untenable without the existence of some credible meta-evidence, some proof that acts of forgery took place. Without such meta-evidence, all we have is an unjustifiable attempt to convert evidence against a conspiracy into evidence for, merely on the grounds that the evidence doesn't fit the proposed CT--which is an example of (A) too.

(H) The fallacy of the spider's web. This form of reasoning has been central to many of the conspiratorial works about the JFK assassination: indeed, Duffy (1988) is entitled The Web! Scott (1977) was perhaps the first full-length work in this tradition.
It concentrates on drawing links between Oswald and the people he came into contact with, and the murky worlds of US intelligence, anti-Castro Cuban groups and organised crime, eventually linking in this fashion the world of Dealey Plaza with that of the Watergate building and the various secret activities of the Nixon administration. Such a project is indeed an interesting one, one which enlightens us considerably about the world of what Scott terms `parapolitics'. It is made especially easy by the fact that Oswald in his short life had at least tangential connections with a whole range of suspicious organisations, including the CIA, the KGB, pro- and anti-Castro Cuban groups, the US Communist Party and other leftist organisations, organised crime figures in New Orleans and Texas, and so on. And considerable webs can be drawn outwards, from Oswald's contacts to their contacts, and so on. As I say, such research is intrinsically interesting, but the fallacy occurs when it is used in support of a conspiracy theory. For all that it generates is suspicion, not evidence. That Oswald knew X or Y is evidence only that he might have had an opportunity to conspire with them, and doesn't support the proposition that he did. The claim is even weaker for people whom Oswald knew only at second or third or fourth hand. And some of these connections are much less impressive than authors claim: that Oswald knew people who ultimately knew Meyer Lansky becomes much less interesting when, as I noted in (D) above, Lansky is seen as a much more minor figure than the almost omnipotent organised crime kingpin he is often depicted as.

Ultimately this fallacy depends upon a kind of confusion between quantity and quality, the assumption that a sufficient quantity of suspicion inevitably metamorphoses into something like evidence. There is, as the old saying has it, no smoke without fire, and surely such an inordinate quantity of smoke could only have been produced by a fire of some magnitude. But thirty years of research haven't found much in the way of fire, only more smoke. Some of the more outrageous CTs here have been discredited--inasmuch as such CTs can ever be discredited--and the opening of KGB archives in recent years and access to living KGB personnel has shown that Oswald's contacts with that organisation were almost certainly innocent. Not only is there no evidence that Oswald ever worked for the KGB, but those KGB officers who monitored Oswald closely during his two-year stay in the USSR were almost unanimously of the opinion that he was too unbalanced to be an employee of any intelligence organisation. But a problem with suspicion is that it cannot be easily dispelled. Since web-reasoning never makes clear exactly what the nature of Oswald's relationship with his various contacts was, it is that much harder to establish the claim that they were innocent. Ultimately, this can only be done negatively, by demonstrating the sheer unlikeliness of Oswald being able to conspire with anyone. The ample evidence of the sheer contingency of Oswald's presence in the Book Depository on the day of the assassination argues strongly against his being part of a conspiracy to kill the president. Whether in fact he was a part of some other conspiracy, as some authors have argued, is an interesting question but one not directly relevant to assassination CTs.

(I) The classic logical fallacy of post hoc ergo propter hoc.
This applies to all those assassination CTs which seek to establish some motive for Kennedy's death from some alleged events occurring afterwards. The most dramatic of these, as featured in Oliver Stone's film, is the argument from America's disastrous military campaign in Vietnam. US military involvement escalated after Kennedy's death; therefore it happened because of Kennedy's death; therefore Kennedy's death was brought about in order to cause an increased American presence in Vietnam. The frailty of this reasoning is obvious. As I pointed out in (F) above, such a view attributes to the proposed conspirators a significant inability to match ends and means rationally. In addition there is no end to the possible effects that can be proposed here. Ultimately everything that is regarded as immoral about modern America can be traced back to the assassination. As I pointed out in a recent lecture, what motivates this view is:

a desire for a justification of a view of America as essentially a benign and divinely inspired force in the world, a desire held in the face of American sin in Vietnam and elsewhere. There are plausible explanations for Vietnam and Watergate in terms of the domination of post-war foreign policy by cold-war simplicities, and the growth of executive power at the expense of legislative controls, and so on. They are, for those not interested in political science, dull explanations. Above all, they do not provide the emotional justification of conspiratorial explanations. To view Vietnam as the natural outcome of foreign policy objectives of the cold-war establishment, of a set of attitudes shared by both Republican and Democrat, above all to view it as the express wish of the American people--opinion polls registered majority support for the war until after the Tet disaster in 1968--is ultimately to view Vietnam as the legitimate and rational outcome of the American system at work. A quasi-religious view of America as `the city on the hill', the place where God will work out his purpose for men, cannot afford to entertain these flaws. Hence the appeal of an evil conspiracy on which these sins can be heaped.[14]

Underlying this reasoning, then, is an emotional attachment to a view of America as fundamentally decent combined with a remarkable ignorance about the real nature of politics. All of the features of America's history after 1963 that can be used as a possible motive for the assassination can be equally or better explained in terms of the ordinary workings of US politics. Indeed many of them, including the commitment to Vietnam and the aggressively murderous attitude towards Castro's Cuba, can be traced to Kennedy's White House and earlier. Though CT proponents often proclaim their commitment to realism and a hard-headed attitude towards matters, it seems clear that their reliance upon this kind of reasoning is motivated more by emotion than by facts.

5. Conclusions

The accusation is often made that conspiracy theorists, particularly of the more extreme sort, are crazy, or immature, or ignorant. This response to UCTs may be at least partly true but does not make clear how CT thinking is going astray. What I have tried to show is how various weaknesses in arguing, assessing evidence, etc. interact to produce not just CTs but unwarranted CTs. A conspiratorial explanation can be the most reasonable explanation of a set of facts, but where we can identify the kinds of critical thinking problems I have outlined here, a CT becomes increasingly unwarranted.
Apart from these matters logical and epistemological, it seems to me that there is also an interesting psychological component to the generation of UCTs. Human beings possess an innate pattern-seeking mechanism, imposing order and explanation upon the data presented to us. But this mechanism can be too sensitive, and we start to see patterns where there are none, leading to a refusal to recognise the sheer amount of contingency and randomness in the world. Perhaps, as Keeley says, "the problem is a psychological one of not recognizing when to stop searching for hidden causes".[15] Seeing meaning where there is none leads to seeing evidence where there is none: a combination of evidential faults reinforces the view that our original story, our originally perceived pattern, is correct--a pernicious feedback loop which reinforces the belief of the UCT proponent in their own theory. And here criticism cannot help, for the criticism--and indeed the critic--become part of the pattern, part of the problem, part, indeed, of the conspiracy.[16]

Conspiracy theories are valuable, like any other type of theory, for there are indeed conspiracies. We want to find a way to preserve all that is useful in the CT as a way of explaining the world while avoiding the UCT which at worst slides into paranoid nonsense. I agree with Keeley that there can be no exact dotted line along which Occam's Razor can be drawn here. Instead, we require a greater knowledge of the thinking processes which underlie CTs and the way in which they can offend against good standards of critical thinking. There is no way to defeat UCTs; the more entrenched they are, the more resistant to disproof they become. Like some malign virus of thinking, they possess the ability to turn their enemies' powers against them, making any supposedly neutral criticism of the CT itself part of the conspiracy. It is this sheer irrefutability that no doubt irritated Popper so much. If we cannot defeat UCTs through refutation then perhaps the best we can do is inoculate against them by a better development of critical thinking skills. These ought not to be developed in isolation--it is a worrying feature of this field that many otherwise critical thinkers become prone to conspiracy theorising when they move outside of their own speciality--but developed as an essential prerequisite for doing well in any field of intellectual endeavour. Keeley concludes that

there is nothing straightforwardly analytic that allows us to distinguish between good and bad conspiracy theories... The best we can do is track the evaluation of given theories over time and come to some consensus as to when belief in the theory entails more scepticism than we can stomach.[17]

Discovering whether or to what extent a particular CT adheres to reasonable standards of critical thinking practice gives us a better measure of its likely acceptability than mere gastric response, while offering the possibility of being able to educate at least some people against their appeal, as potential consumers or creators of unwarranted conspiracy theories.

BIBLIOGRAPHY

Blakey, G. Robert & Billings, Richard (1981) Fatal Hour: The Plot to Kill the President, N.Y.: Berkeley Publishing
Cutler, Robert (1975) The Umbrella Man, Manchester, Mass.: Cutler Designs
Donovan, Robert J. (1964) The Assassins, N.Y.: Harper Books
Duffy, James R. (1988) The Web, Gloucester: Ian Walton Publishing
Eddowes, Michael (1977) The Oswald File, N.Y.: Ace Books
Fetzer, James (ed.) (1997) Assassination Science, Chicago, IL: Open Court Publishing
Fisher, Alec & Scriven, Michael (1997) Critical Thinking: Its Definition and Assessment, Norwich: Centre for Critical Thinking, U.E.A.
Hofstadter, Richard P. (1964) The Paranoid Style in American Politics, London: Jonathan Cape
Hume, David (1748) Enquiry Concerning Human Understanding, ed. by P.H. Nidditch 1975, Oxford: Oxford University Press.
Keeley, Brian L. (1999) `Of Conspiracy Theories', Journal of Philosophy 96, 109-26.
Lacey, Robert (1991) Little Man, London: Little Brown
Lifton, David (1980) Best Evidence, London: Macmillan. 2nd ed. 1988, N.Y.: Carroll & Graf
Norris, S.P. & King, R. (1983) Test on Appraising Observations, St John's, Newfoundland: Memorial University of Newfoundland.
Norris, S.P. & King, R. (1984) `Observational ability: Determining and extending its presence', Informal Logic 6, 3-9.
Oglesby, Carl (1976) The Yankee-Cowboy War, 2nd ed. 1977, N.Y.: Berkley Publishing
Pigden, Charles (1993) `Popper revisited, or What Is Wrong With Conspiracy Theories?', Philosophy of the Social Sciences 25, 3-34.
Popkin, Richard H. (1966) The Second Oswald, London: Sphere Books
Popper, Karl (1945) The Open Society and Its Enemies, 5th ed. 1966, London: Routledge.
Posner, Gerald (1993) Case Closed, N.Y.: Random House
Scheim, David E. (1983) Contract on America, Silver Spring, Maryland: Argyle Press
Scott, Peter Dale (1977) Crime and Cover-Up, Berkeley, Cal.: Westworks
Stone, Jim (1991) Conspiracy of One, Fort Worth, TX: Summit Group
Stone, Oliver & Sklar, Zachary (1992) JFK: The Movie, New York: Applause Books.
Thompson, Josiah (1967) Six Seconds in Dallas, 2nd ed. 1976, N.Y.: Berkeley Publishing
Wilson, Robert Anton (1989) `Beyond True and False', in Schultz, T. (ed.) The Fringes of Reason, New York: Harmony.

______________________

[1] And this even though professional philosophers may themselves engage in conspiracy theorising! See, for instance, Popkin (1966), Thompson (1967) or Fetzer (1997) for examples of philosophers writing in support of conspiracy theories concerning the JFK assassination.
[2] See Donovan 1964 for more on this.
[3] Historians, it seems, still disagree about whether or to what extent Princip's group was being manipulated.
[4] And the most extreme UCT I know manages to combine this with both ufology and satanism CTs, in David Icke's ultimate paranoid fantasy which explains every significant event of the last two millennia in terms of the sinister activities of historical figures who share the blood-line of reptilian aliens who manipulate us for their purposes, using Jews, freemasons, etc. as their fronts. Those interested in Mr. Icke's more specific allegations (which I omit here at least partly out of a healthy regard for Britain's libel laws) are directed to his website, http://www.davidicke.com/.
[5] See Norris & King 1983 & 1984 for full details of and support for these principles.
[6] I don't propose to argue for my position here. Interested readers are pointed in the direction of Posner (1993), a thorough if somewhat contentious anti-conspiratorial work whose fame has perhaps eclipsed the less dogmatic but equally anti-conspiratorial Stone (1991).
[7] One of the first of these, from the charmingly palindromic Revilo P. Oliver, is cited by Hofstadter.
Oliver, a member of the John Birch Society, which had excoriated Kennedy as a tool of the Communists throughout his presidency, asserted that it was international Communism which had murdered Kennedy in order to make way for a more efficient tool! Right-wing theories blaming either Fidel Castro or Nikita Khrushchev continued at least into the 1980s: see, for instance, Eddowes (1977).
[8] And probably not possible! The sheer complexity of the assassination CT community and the number of different permutations of alleged assassins have grown enormously, especially over the last twenty years. In particular, the number of avowedly political CTs is hard to determine since they fade into other areas of CT, in particular those dealing with the influence of organised crime and those dealing with an alleged UFO cover-up, not to mention those even more extreme CTs which link the assassination to broader conspiracies of international freemasonry etc.
[9] See not only the movie but also Stone & Sklar (1992), a heavily annotated version of the film's script which also includes a good deal of the published debate about the film, for and against.
[10] Lifton 1980: 132
[11] Norris & King (1983), quoted in Fisher & Scriven (1997).
[12] For a remarkable instance of the exaggeration of the power of organised crime in the US and its alleged role in Kennedy's death see Scheim (1983) or, perhaps more worryingly, Blakey & Billings (1981). I say `more worryingly' because Blakey was Chief Counsel for the congressional investigation into Kennedy's death which reported in 1979 and so presumably is heavily responsible for the direction that investigation took.
[13] This view of Lansky is widespread throughout the Kennedy literature. See, for instance, Peter Dale Scott's short book (1977), which goes into Lansky's alleged connections in great detail.
[14] From "(Dis)Solving the Kennedy Assassination", presented to the Conspiracy Culture Conference at King Alfred's College, Winchester, in July 1998.
[15] Keeley 1999: 126
[16] Anyone who doubts this should try to argue for Oswald as lone assassin on an Internet discussion group! It is not just that one is regarded as wrong or naive or ignorant. One soon becomes accused of sinister motives, of being a witting or unwitting agent of the on-going disinformation exercise to conceal the truth. (I understand that much the same is true of discussions in ufology fora.)
[17] Keeley 1999: 126

From checker at panix.com Sun Dec 11 03:16:01 2005
From: checker at panix.com (Premise Checker)
Date: Sat, 10 Dec 2005 22:16:01 -0500 (EST)
Subject: [Paleopsych] Lamar Waldron, with Thom Hartmann: Ultimate Sacrifice
Message-ID:

Lamar Waldron, with Thom Hartmann: Ultimate Sacrifice: John and Robert Kennedy, the Plan for a Coup in Cuba, and the Murder of JFK (excerpts)

A BUZZFLASH GUEST CONTRIBUTION by Thom Hartmann and Lamar Waldron

Thom Hartmann, a regular BuzzFlash contributor, is coauthor of the newly released book Ultimate Sacrifice, which explores the theory that Kennedy's death leads back to the mob. Carroll & Graf, the publishers of Ultimate Sacrifice, have granted BuzzFlash permission to post the authors' introduction to the book, including end notes, as an aid to those wishing to further explore specific details. BuzzFlash is not in the business of solving the Kennedy mystery, and in fact we doubt that it will ever be irrefutably solved, even once all related documents become declassified.
This text, however, makes a serious contribution, as our review (http://www.buzzflash.com/reviews/05/11/rev05124.html) has indicated. Ultimate Sacrifice: John and Robert Kennedy, the Plan for a Coup in Cuba, and the Murder of JFK is available as a BuzzFlash premium (http://www.buzzflash.com/premiums/05/11/pre05167.html).

* * *

FOR MORE THAN FOUR DECADES since his death in 1963, John F. Kennedy has captured the imagination of the American people. Myth and conjecture have swirled around JFK, his political legacy, his family, and its multiple tragedies. Admirers and critics have examined every detail of his life and work, gradually lifting one veil after another to shed new light on his presidency, from his maneuvering behind the scenes during the Cuban Missile Crisis to his personal weaknesses. Nonetheless, the secret with the most profound and catastrophic effect on America has remained hidden. Ultimate Sacrifice reveals this secret for the first time, transforming the history of the Kennedy years and providing the missing piece to one of the great puzzles of post-war America: the true circumstances behind JFK's assassination on November 22, 1963.

Seventeen years ago, Thom Hartmann and I began writing a book about the battles of President Kennedy and his brother, Attorney General Robert F. Kennedy, against the Mafia and Fidel Castro. Drawing on new information and exclusive interviews with those who worked with the Kennedys, in addition to thousands of recently declassified files, we discovered that John and Robert Kennedy had devised and were executing a secret plan to overthrow Fidel Castro on December 1, 1963. "The Plan for a Coup in Cuba" (as it was titled in a memo for the Joint Chiefs of Staff) would include a "palace coup" to eliminate Castro, allowing a new Cuban "Provisional Government" to step into the power vacuum, and would be supported by a "full-scale invasion" of Cuba by the US military, if necessary.[1]

The "Plan for a Coup in Cuba" was fully authorized by JFK and personally run by Robert Kennedy. Only about a dozen people in the US government knew the full scope of the plan, all of whom either worked for the military or the CIA or reported directly to Robert. The Kennedys' plan was prepared primarily by the US military, with the CIA playing a major supporting role. Input was also obtained from key officials in a few other agencies, but most of those who worked on the plan knew only about carefully compartmentalized aspects, believing it to be a theoretical exercise in case a Cuban official volunteered to depose Fidel.

Unique and different from any previously disclosed operation, the Kennedys' "Plan for a Coup in Cuba" is revealed in this book for the first time. The CIA's code name for their part of the coup plan has never surfaced in any book, article, or government investigation. Officially declassified in 1999, "AMWORLD" is the cryptonym the CIA used for the plan in its classified internal documents. Since the overall coup plan was under the personal control of Attorney General Kennedy, who did not use a code-name for it, we call it "C-Day" in this book, a name entirely of our own invention. Its evocation of D-Day is intentional, since the Kennedys' plan included the possibility of a US military invasion. C-Day was undoubtedly one of the most secret covert operations in United States history. In its secrecy, however, lay tragedy.
Even though the Kennedys' coup plan never came to fruition, three powerful Mafia dons--Carlos Marcello, Santo Trafficante, and Johnny Rosselli--learned of the plan and realized that the government would go to any lengths to avoid revealing it to the public. With that knowledge, the three mob bosses were able to assassinate JFK in a way that forced the truth to be buried for over forty years. Marcello, Trafficante, and Rosselli undertook this extraordinary act of vengeance in order to halt the Kennedy administration's unrelenting prosecution of them and their allies. The Kennedy Justice Department had vigorously pursued Marcello, even subjecting him to a brief, nightmarish deportation. Once he returned, Marcello hated the Kennedy brothers with a deep and vengeful passion. The two other Mafia bosses suffered similar pursuit, and eventually Marcello, Trafficante, and Rosselli decided that their only way to avoid prison or deportation was to kill JFK.

Our investigation has produced clear evidence that the crime bosses arranged the assassination so that any thorough investigation would expose the Kennedys' C-Day coup plan. They were confident that any such exposure could push America to the brink of war with Cuba and the Soviet Union, meaning that they could assassinate JFK with relative impunity. They did not carry out the act themselves, but used trusted associates and unwitting proxies. The most widely known are Jack Ruby and Lee Harvey Oswald, who were both in contact with associates of Marcello, Trafficante, and Rosselli in the months before the assassination. Reports in government files show that Oswald and Ruby knew about parts of the Kennedys' plan and even discussed it with others.

Robert Kennedy told several close associates that Carlos Marcello was behind JFK's death, but he couldn't reveal what he knew to the public or to the Warren Commission without C-Day being uncovered. As this book shows, RFK and other key government officials worried that exposure of the plan could trigger another nuclear confrontation with the Soviets, just a year after the Cuban Missile Crisis. None of the seven governmental committees that investigated aspects of the assassination, including the Warren Commission, were officially told about the Kennedys' C-Day plan.[2] However, over the decades, each successive committee came increasingly close to discovering both the plan and the associates of Marcello who assassinated JFK. We were able to piece together the underlying story by building on the work of those committees, former government investigators, and revelations in four million documents that were declassified in the 1990s. Key to our efforts were new and often exclusive interviews with many Kennedy insiders who worked on the coup plan or dealt with its consequences, some of whom revealed aspects of JFK's assassination and the coup plan for the first time. They include Secretary of State Dean Rusk, Press Secretary Pierre Salinger, and the Kennedys' top Cuban exile aide, Enrique "Harry" Ruiz-Williams. Their inside information allows us to tell the story, even though a 1998 report about the JFK Assassination Records Review Board confirms that "well over a million CIA records" related to JFK's murder have not yet been released.[3] NBC News' Tom Brokaw confirmed on his September 29, 1998 broadcast that "millions" of pages remain secret and won't be released until the year 2017.[4]

By necessity, Ultimate Sacrifice examines this complex story from several angles.
Part One documents every aspect of the Kennedys' C-Day plan and how it developed, beginning with the Cuban Missile Crisis. Though it is widely believed that JFK agreed not to invade Cuba in order to end the Cuban Missile Crisis in the fall of 1962, Secretary of State Rusk told us that the "no-invasion" pledge was conditional upon Castro's agreement to on-site UN inspections for nuclear weapons of mass destruction (a term that JFK first used). Historians at the National Security Archive confirmed that because Castro refused such inspections, the pledge against invasion never went into effect.[5] Consequently, in the spring of 1963, John and Robert Kennedy started laying the groundwork for a coup against Fidel Castro that would eventually be set for December 1, 1963. Robert Kennedy put the invasion under the control of the Defense Department because of the CIA's handling of 1961's Bay of Pigs disaster.

The "Plan for a Coup in Cuba," as written by JFK's Secretary of the Army Cyrus Vance with the help of the State Department and the CIA, called for the coup leader to "neutralize" Cuban leader "Fidel Castro and . . . [his brother] Raul" in a "palace coup." Then, the coup leader would "declare martial law" and "proclaim a Provisional Government" that would include previously "selected Cuban exile leaders" who would enter from their bases in Latin America.[6] Then, at the invitation of the new government, after "publicly announcing US intent to support the Provisional Government, the US would initiate overt logistical and air support to the insurgents," including destroying "those air defenses which might endanger the air movement of US troops into the area." After the "initial air attacks" would come "the rapid, incremental introduction of balanced forces, to include full-scale invasion" if necessary. The first US military forces into Cuba would be a multiracial group of "US military-trained free Cubans," all veterans of the Bay of Pigs.[7] Upon presidential authorization, the US would "recognize [the] Provisional Government . . . warn [the] Soviets not to intervene" and "assist the Provisional Government in preparing for . . . free elections."[8]

This "palace coup" would be led by one of Castro's inner circle, himself a well-known revolutionary hero.[9] This man, the coup leader, would cause Castro's death, but without taking the credit or blame for doing so. The coup leader would be part of the new Provisional Government in Cuba, along with a select group of Cuban exiles--approved by the Kennedys--who ranged from conservative to progressive.[10] The identity of the coup leader is known to the authors, and has been confirmed by Kennedy associates and declassified documents. However, US national security laws may prevent the direct disclosure of past US intelligence assets even long after their deaths, so we will not directly name the coup leader in this book. Since we have no desire to violate national security laws or endanger US intelligence assets, we will only disclose official information that has been declassified or is available in the historical record. We have uncovered historical accounts of Cuban leaders that have been long overlooked by the public or are in newly released government files.
For example, a formerly secret cable sent to the CIA director on December 10, 1963--just nine days after the original date for the C-Day coup--reports "Che Guevara was alleged to be under house arrest for plotting to overthrow Castro," according to "a Western diplomat."[11] Newly declassified documents and other research cast Che's growing disenchantment with Fidel Castro in a new light. These revelations include Che's secret meetings with three people close to the Kennedys, followed by yet another house arrest after a C-Day exile leader was captured in Cuba.

The Kennedys did not see C-Day as an assassination operation, but rather as an effort to help Cubans overthrow a Cuban dictator. A June 1963 CIA memo from one of Robert Kennedy's Cuban subcommittees of the National Security Council explains the Kennedy policy as "Cubans inside Cuba and outside Cuba, working" together to free their own country.[12] Nor was C-Day an attempt to install another US-backed dictator in Cuba, like the corrupt Batista regime that had been overthrown by Castro and many others on January 1, 1959. The Kennedys' goal in 1963 was simply a free and democratic Cuba.

As several Kennedy associates told us, the only man who knew everything about C-Day was Robert Kennedy, the plan's guiding force.[13] Secretary of the Army Cyrus Vance was one of the few military leaders who knew the full scope of C-Day while the plan was active. The others were generals the Kennedys especially trusted, including Chairman of the Joint Chiefs of Staff Maxwell Taylor and General Joseph Carroll, head of the Defense Intelligence Agency (DIA). High CIA officials involved in C-Day included CIA Director John McCone, Deputy Director for Plans Richard Helms, Desmond FitzGerald, and key field operatives like David Morales and David Atlee Phillips.

Most high US officials didn't know about C-Day prior to JFK's assassination. There is no evidence that Lyndon Johnson was told anything about C-Day prior to JFK's death. Likewise, no evidence exists showing that Secretary of Defense Robert McNamara knew about C-Day before JFK's assassination. Dean Rusk told us he did not learn about the actual C-Day plan until soon after JFK's death.[14] There is no evidence that Edward Kennedy was told about the plan. Documents and sources indicate that FBI Director J. Edgar Hoover had no active role in C-Day, although he may have learned a great deal about it from field reports. The Secret Service was even less informed about C-Day, which no doubt hindered their actions when serious threats seemingly related to Cuba surfaced against JFK in the weeks before C-Day. However, officials ranging from Dean Rusk to hawkish Air Force Chief of Staff General Curtis LeMay were needed for the planning of C-Day, so the Kennedys used a shrewd technique that let those officials participate in planning for C-Day while keeping them in the dark about the plan itself. Rusk, LeMay, and others were simply told that all the planning was needed "just in case" a coup happened in Cuba. Officials like Rusk and LeMay were generally aware of other CIA efforts against Castro in the fall of 1963, such as the CIA's AMTRUNK operation, which looked for disaffected Cuban military officers. Some US officials also knew about a CIA asset named Rolando Cubela, a disgruntled mid-level Cuban official whom the CIA code-named AMLASH. However, unlike AMWORLD--the CIA's portion of C-Day--neither of those operations reached high in the Cuban government or was close to producing results in the fall of 1963.
The Kennedys' "just in case" technique allowed extensive planning to be done for all facets of the military invasion and the post-coup Provisional Government without revealing C-Day or the coup leader's identity to most of those doing the planning. If the C-Day coup had actually occurred, Rusk and the other officials not privy to the full plan would nonetheless have been fully prepared for its aftermath, with plans they had already approved and helped create.[15] While such tightly compartmentalized secrecy kept C-Day from becoming widely known within the government and protected C-Day from public exposure, it also contributed to JFK's death.

In 1963, the public would have been shocked to learn that two months before JFK was shot in Dallas, US officials under the direction of Robert Kennedy began making contingency plans to deal with the "assassination of American officials."[16] In the event of an assassination (expected to happen only outside the US), these contingency plans would have mandated certain security measures, and, as this book documents, those same principles would come to govern much of the secrecy surrounding the JFK assassination. Robert Kennedy and the others making the contingency plans were concerned only about possible retaliation by Castro for C-Day. They failed to consider the threat from others the Attorney General had targeted, especially Mafia bosses Carlos Marcello, Santo Trafficante, and Johnny Rosselli. The Kennedys and key aides had gone to great lengths to keep the Mafia out of C-Day. The CIA's earlier efforts with the Mafia to assassinate Castro--which began in 1959 under Vice President Richard Nixon--had complicated the Kennedys' intense prosecution of the Mafia. Without telling the Kennedys, the CIA was continuing to work with the Mafia on plots against Castro in the fall of 1963, which helped to allow associates of Marcello, Trafficante, and Rosselli to infiltrate the plans for C-Day.

In Part II, we will show how--and why--mob bosses Carlos Marcello, Santo Trafficante, and Johnny Rosselli worked together to penetrate the Kennedys' C-Day plan and assassinate JFK. In 1963, Carlos Marcello was America's most ruthless and secretive Mafia boss, completely free of FBI wiretaps. From his New Orleans headquarters, he ruled a territory that included Louisiana, Mississippi, and parts of Texas and Alabama.[17] Marcello's Mafia family was the oldest in North America, able to stage major "hits" without needing the approval of the national Mafia organization, and his associates had a long history of targeting government officials who got in their way.[18] The Kennedys had pursued Marcello since 1959, even before JFK was elected president. Recently declassified FBI documents confirm that just a few years before his own death, Carlos Marcello confessed on three occasions to informants that he had had JFK killed.[19]

Tampa godfather Santo Trafficante was Marcello's closest Mafia ally. Trafficante's territory included much of Florida, as well as parts of Alabama, and his organization provided a major conduit for the French Connection heroin trade, whose primary routes included New York City, Texas, New Orleans, Georgia's Fort Benning, Montreal, Chicago, and Mexico City.
The Internet magazine Salon noted that Trafficante "had been driven out of the lucrative Havana casino business by Castro and" that he "had been recruited in the CIA" plots with the Mafia to kill Castro months before JFK became president.[20] Like Marcello, Trafficante later confessed his involvement in JFK's assassination.[21] Johnny Rosselli, according to his biographers, also claimed to know what had really happened in Dallas, and he sometimes worked with both Trafficante and Marcello. Rosselli was the Chicago Mafia's point man in Hollywood and Las Vegas, and his close friends included Frank Sinatra and Marilyn Monroe. Internal CIA reports admit that they recruited Rosselli and Trafficante for their own plots to assassinate Castro prior to JFK's election in 1960. Unknown to the Kennedys, Rosselli was continuing in that role in the fall of 1963.[22]

Jack Ruby met with Rosselli just weeks before JFK's assassination, had met much earlier with Santo Trafficante, and had numerous ties to Carlos Marcello, according to government investigators.[23] Ultimate Sacrifice reveals new information from Pierre Salinger--a member of the Kennedys' first organized crime investigation team--that just weeks before Jack Ruby shot Oswald, Ruby received a large payoff in Chicago from someone working for a close ally of Marcello and Trafficante.[24] Ruby also made surprising comments that wound up in the Warren Commission's files but not in their report. Just weeks after Ruby's arrest for shooting Oswald in 1963, an FBI document quotes Ruby as talking about "an invasion of Cuba" that "was being sponsored by the United States Government."[25]

Ultimate Sacrifice shows how Carlos Marcello, Santo Trafficante, and Johnny Rosselli were able to keep their roles in JFK's death from being exposed because they had infiltrated C-Day. Long-secret government files confirm that ten men who worked for the mob bosses had learned about C-Day. Five of those ten actually worked on C-Day, giving the Mafia chieftains a pipeline directly into C-Day and the plans for keeping it secret. Less than a dozen trusted associates of the mob bosses were knowingly involved in the hit on JFK. Though Mafia hits against officials are rare, the Mafia families of Carlos Marcello, Santo Trafficante, and Johnny Rosselli had killed officials who threatened their survival. Nine years earlier, Santo Trafficante's organization had helped to assassinate the newly elected Attorney General of Alabama because he was preparing to shut down Trafficante's operations in notoriously corrupt Phenix City.[26] In 1957, associates of Marcello and Rosselli assassinated the president of Guatemala, a murder that was quickly blamed on a seemingly lone Communist patsy who, like Lee Harvey Oswald, was then killed before he could stand trial. Just nine months before JFK's murder in November 1963, Rosselli's Chicago Mafia family had successfully assassinated a Chicago city official, using an associate of Jack Ruby.[27]

The House Select Committee on Assassinations (HSCA) found in 1979 that Marcello and Trafficante had had the means and the motive to assassinate JFK. Before the HSCA could question Rosselli, he "was kidnapped, murdered, dismembered, and sunk" in the ocean in an oil drum that later surfaced.[28] But the CIA didn't tell the HSCA about AMWORLD or other aspects of C-Day, so the HSCA couldn't uncover exactly how Marcello and Trafficante did it or Rosselli's role in working with them.
Newly declassified files, many unavailable to the HSCA, show that Marcello, Trafficante, and Rosselli penetrated C-Day and used parts of it as cover to assassinate JFK. By using the secrecy surrounding C-Day, the mob bosses could target JFK not only in Dallas, but also in two earlier attempts, one of which is revealed in this book for the first time. They first attempted to kill JFK in Chicago on November 1, 1963, and then in Tampa on November 18, before succeeding in Dallas on November 22. Since Chicago was home to Rosselli's Mafia family, Tampa was Trafficante's headquarters, and Dallas was in Marcello's territory, the risk was shared between the three bosses.

While the Chicago attempt--thwarted when JFK canceled his motorcade at the last minute--was briefly noted by Congressional investigators in the 1970s, the attempt to assassinate JFK during his long Tampa motorcade has never been disclosed in any book or government report. It was withheld from the Warren Commission and all later investigations, even though the Tampa plot was uncovered by authorities and revealed to JFK before he began his motorcade--which he continued, despite the danger. With C-Day set to begin the following week, JFK planned to give a speech in Miami just hours after his trip to Tampa, a speech that included a message written to the C-Day coup leader in Cuba that promised him JFK's personal support.[29] Canceling the Tampa motorcade simply wasn't an option for JFK or Bobby, even though the motorcade would reportedly be the longest of JFK's presidency, slowly making its way past teeming crowds and many unsecured buildings.

Our interviews with officials from Florida law enforcement and the Secret Service, supported by newspaper files and declassified CIA and FBI documents, reveal that the Tampa attempt to kill JFK shares a dozen striking parallels with what happened in Dallas four days later. They include a young male suspect who was a former defector with links to both the Fair Play for Cuba Committee and Russia, just like Lee Harvey Oswald. As in Dallas, JFK's Tampa motorcade also included a hard left turn in front of a tall red-brick building with many unguarded windows--a key site that officials feared might be used by snipers to target JFK.

John and Robert Kennedy kept the Tampa assassination attempt secret at the time, and Robert Kennedy kept it secret until his death in 1968. The Secret Service, FBI, CIA, and other agencies have similarly maintained silence about it, as well as keeping secret other information about the assassination that might have exposed the Kennedys' C-Day coup plan. In November 1994, the authors first informed the JFK Assassination Records Review Board about the Tampa assassination attempt. The Review Board had been created by Congress in 1992 and appointed by President Clinton soon after, to release all the JFK records. But just weeks after we told the Board about the Tampa attempt, the Secret Service destroyed their records for that time period. That does not implicate the Secret Service or the FBI or the CIA (as an organization) in JFK's assassination. As the book shows, officials were forced into such cover-ups because the Mafia bosses had tied the potentially destabilizing C-Day plan to their attempts to assassinate JFK in Chicago, Tampa, and finally Dallas.

Within hours of JFK's assassination, Robert Kennedy suspected that someone linked to Marcello and Trafficante, and to C-Day, was involved in his brother's death.
The afternoon of JFK's death, Robert Kennedy revealed his suspicion to Pulitzer Prize-winning reporter Haynes Johnson, who was meeting with C-Day exile leader Enrique "Harry" Ruiz-Williams.[30] Evan Thomas, author of a biography of Robert Kennedy and a Newsweek editor, said "Robert Kennedy had a fear that he had somehow gotten his own brother killed" and that his "attempts to prosecute the mob and to kill Castro had backfired in some terrible way."[31] It has been publicly known only since 1992 that Robert Kennedy told a few close advisers that New Orleans mob boss Marcello was behind JFK's assassination, as we confirmed with Kennedy aide Richard Goodwin. Salon received additional confirmation of Mafia involvement from Robert Kennedy's former press secretary, Frank Mankiewicz, who conducted a secret investigation of JFK's death for Robert.[32] Goodwin and Mankiewicz are just two of over a dozen associates of Robert Kennedy who either heard his belief in a conspiracy in his brother's death or who believe in a conspiracy themselves. Among them are Justice Department prosecutors Ronald Goldfarb, Robert Blakey, and Walter Sheridan, as well as Robert's first biographer, Jack Newfield. Others include JFK's CIA Director John McCone, the President's personal physician at his autopsy, Admiral George Burkley, and JFK aides Dave Powers, Kenneth O'Donnell, and Arthur Schlesinger, Jr.[33] This book adds to that list Pierre Salinger and Robert's top Cuban exile aide "Harry" Ruiz-Williams, plus another Kennedy aide who worked on C-Day. Most of those associates of Robert Kennedy point to a conspiracy involving Carlos Marcello or his close allies. History has proven the Mafia bosses correct in their calculation that C-Day could be used as such a powerful weapon.

JFK's death threw the whole US government into turmoil, but the intelligence agencies were especially frantic: Their numerous and extensive anti-Castro plots were so secret that they needed to be kept not only from Congress and the public, but also from the Warren Commission. Although many Warren Commission findings were discredited by later government investigators, Evan Thomas recently told ABC News that the commission achieved its real purpose. He said that after JFK's assassination, "the most important thing the United States government wanted to do was reassure the public that there was not some plot, not some Russian attack, not some Cuban attack." As a result, Thomas concluded, "the number one goal throughout the upper levels of the government was to calm that fear, and bring a sense of reassurance that this really was the work of a lone gunman."[34]

President Lyndon Johnson and the Warren Commission were also under tremendous time pressure: With Johnson facing an election in less than a year, the Commission had to assemble a staff, review and take testimony, and issue their final report just ten months after JFK's death. "There was a cover-up," Evan Thomas confirmed to ABC News, explaining that in the Warren Commission's "haste to reassure everybody, they created an environment that was sure to come around and bite them." He emphasized that Earl Warren, Lyndon B. Johnson, J. Edgar Hoover, and others were not covering up a plot to kill JFK, as some have speculated.
Instead, they covered up "for their own internal bureaucratic reasons--because Hoover wanted to keep his job, and because Bobby Kennedy didn't want to be embarrassed, or the CIA didn't want to have the public know they were trying to kill somebody," like Fidel Castro.[35]

It was not until 2004 that Joseph Califano, assistant to Secretary of the Army Cyrus Vance in 1963, briefly hinted at the sensitive operation that Robert Kennedy had managed and had withheld from the Warren Commission. Califano wrote: "No one on the Warren Commission . . . talked to me or (so far as I know) anyone else involved in the covert attacks on Castro. . . . The Commission was not informed of any of the efforts of Desmond FitzGerald, the CIA and Robert Kennedy to eliminate Castro and stage a coup" in the fall of 1963.[36] Since Robert Kennedy knew more about C-Day than anyone else, his death in 1968 helped to ensure that C-Day stayed secret from all later government investigations into the assassination.

The anti-Castro operations of the 1960s that were hidden from the Warren Commission only started to be uncovered by the investigations spawned by Watergate in the 1970s: the Senate Watergate Committee (which took secret testimony from Johnny Rosselli), the Rockefeller Commission, the Pike Committee, and the Church Committee.[37] More details about those CIA plots were uncovered by the House Select Committee on Assassinations in the late 1970s, though many of their discoveries weren't declassified until the late 1990s by the Assassination Records Review Board (ARRB). C-Day, far more sensitive and secret than any of those anti-Castro plots, was never officially disclosed to any of those seven government committees.

The military nature of C-Day also helps to explain why it has escaped the efforts of historians and Congressional investigators for forty years. The C-Day coup plan approved by Joint Chiefs Chairman General Maxwell Taylor was understandably classified TOP SECRET when it was created in 1963. But twenty-six years later, the Joint Chiefs reviewed the coup plan documents and decided that they should still remain TOP SECRET.[38] The documents might have remained officially secret for additional decades, or forever, if not for the JFK Assassination Records Review Board, created by Congress in the wake of the furor surrounding the film JFK. After efforts by the authors and others, the Review Board finally located and declassified some of the C-Day files just a few years ago. However, someone who worked with the Review Board confirmed to a highly respected Congressional watchdog group, OMB Watch, that "well over one million CIA records" related to JFK's assassination have not yet been released.[39] The C-Day documents that have been released show just the tip of the iceberg, often filled with the names of CIA assets and operations whose files have never been released, even to Congressional investigators.

Part Three of Ultimate Sacrifice shows how C-Day affected history and continues to impact American lives. It provides a new perspective on LBJ's operations against Cuba, and how they impacted the war in Vietnam. Ultimate Sacrifice casts Watergate in a whole new light, since it involved a dozen people linked to various aspects of C-Day.
On a more personal level, Ultimate Sacrifice also solves the tragedy of Abraham Bolden, the first black Presidential Secret Service agent, who was framed by the Mafia and sent to prison when he tried to tell the Warren Commission about the Chicago and Tampa assassination attempts against JFK. His career and life ruined, Bolden has spent the last forty years seeking a pardon.[40] Now, new information from the CIA and other sources shows that the man behind Bolden's framing was an associate of Rosselli and Trafficante, someone linked to JFK's assassination who had penetrated C-Day while working for the Mafia.

JFK made the ultimate sacrifice in his quest to bring democracy to Cuba using C-Day. Instead of staying safely in the White House, he put his own life on the line, first in Tampa and finally in Dallas. It has long been known that JFK talked about his own assassination the morning before he was shot. He commented to an aide about how easy it would be for someone to shoot him from a building with a high-powered rifle. Just hours earlier, JFK had demonstrated to his wife Jackie how easily someone could have shot him with a pistol.[41] We now know the reason for JFK's comments, since he knew that assailants from Chicago and Tampa were still at large, and that he himself was getting ready to stage a coup against Castro the following week. John Kennedy once said: "A man does what he must--in spite of personal consequences, in spite of obstacles and dangers." He didn't just mouth the slogan that Americans should be willing to "pay any price" and "bear any burden"--he paid the highest price, making the ultimate sacrifice a leader can make for his country. JFK had always been obsessed with courage, from PT-109 to Profiles in Courage to his steely resolve during the Cuban Missile Crisis.[42] So it's not surprising that he died as he had lived, demonstrating the courage that had obsessed him all his life, and making the ultimate sacrifice for his country.

Until 1988, we had no more interest in the JFK assassination than the average person--but the twenty-fifth anniversary of the JFK assassination spawned numerous books and articles, many of which focused on evidence that a conspiracy was involved in JFK's death. The only question seemed to be: Which conspiracy? Proposed conspirators included anti-Castro forces, elements of the CIA, and the Mafia. We started to look more closely at what had already been published about the assassination. We felt that a book focused solely on Bobby Kennedy's battles against the Mafia and against Castro in 1963 might also yield some interesting perspectives on the JFK assassination. We expected the research to require reading a dozen books, looking at a few hundred documents, and trying to interview some Kennedy associates--something that might take a year at most. That was seventeen years, dozens of sources, hundreds of books, and hundreds of thousands of documents ago.

We started by looking at the work of the six government commissions (the Review Board had not yet been created) and focused on areas that previous writers hadn't been able to fully explore. When we compiled all that data into a massive database, we realized that their findings weren't mutually exclusive at all--in fact, when their data was grouped together, it filled in gaps and told a coherent story. Putting all their data together didn't make the conspiracy bigger, as one might have expected.
It actually made it smaller, since it became clear--for example--that one conspirator could be a Cuban exile, a CIA asset, and also work for the Mafia. However, we were stymied because much key information was still classified, much of it involving anti-Castro operations and associates of godfathers such as Carlos Marcello. We needed to find someone who knew the information and would talk, or some type of document the government couldn't classify top secret--a newspaper, for instance.

Our first break came the day we discovered an article in the Washington Post dated October 17, 1989, about the tragic death of Pepe San Roman, the Cuban exile who had led the Kennedys' ill-fated Bay of Pigs invasion. One sentence caught our attention: It said that in 1963, Pepe's brother had been "sent by Robert Kennedy to Central American countries to seek aid for a second invasion" of Cuba.[43] We were puzzled. A "second invasion" of Cuba in 1963? Surely it must be wrong. None of the history books or government committees had ever mentioned a US invasion of Cuba planned for 1963. But a check of newspaper files from the summer and fall of 1963 uncovered a few articles confirming that there had been activity by Kennedy-backed Cuban exiles in Central America at that time.

In January 1990, we arranged to interview JFK's Secretary of State, Dean Rusk. When we asked him about the "second invasion" of Cuba in 1963, he confirmed that indeed there were such plans. They weren't the same as the CIA-Mafia plots, which he only learned about later. Nor were they the CIA's assassination plot with a mid-level Cuban official named Rolando Cubela. Rusk described the "second invasion" as a "coup" and said that it wasn't going to be just some Cuban exiles in boats like the Bay of Pigs, but would involve the US military. Rusk indicated that the "second invasion" plans were active at the time JFK died in November 1963 and that the plan was personally controlled by Bobby Kennedy, but that he, Rusk, hadn't learned about it until just after JFK's death.

We theorized that there might be some connection between JFK's assassination and the second invasion of Cuba. We asked ourselves why Bobby would cover up crucial information about his own brother's murder--especially if he thought Marcello was behind it. What could be more important than exposing his brother's killers? Well, during the Cold War, one thing that would be more important than the death of a president would be the deaths of millions of Americans in a nuclear exchange with the Soviets. Revealing such a plan after JFK's death, just a year after the tense nuclear standoff of the Cuban Missile Crisis, could have easily sparked a serious and dangerous confrontation with the Soviets. That fear could explain why so much about JFK's assassination had been covered up for so long. At the time, this was a very novel hypothesis, but we agreed that it made sense in light of what we had uncovered so far.

Slowly, over the next few years, we found scattered pieces of evidence. For example, at the National Security Archive in Washington, we found a partially censored memo from one of Bobby Kennedy's secretive subcommittees of the National Security Council that discussed "Contingency Plans" in case Fidel Castro retaliated against the US by attempting the "assassination of American officials."
The memo was written just ten days before JFK's assassination, and talked about "the likelihood of a step-up in Castro-incited subversion and violence" in response to some US action.[44] The document had been declassified a year after the HSCA had finished its work, and had never been seen by any of the government commissions that had investigated the assassination. We were shocked when Dave Powers, head of the John F. Kennedy Presidential Library in Boston and a close aide to JFK, vividly described seeing the shots from the "grassy knoll." Powers said he and fellow JFK aide Kenneth O'Donnell clearly saw the shots, since they were in the limo right behind JFK. Powers said they felt they were "riding into an ambush"--explaining for the first time why the driver of JFK's limo slowed after the first shot. Powers also described how he was pressured to change his story for the Warren Commission.[45] We quickly found confirmation of Powers's account of the shots in the autobiography of former House Speaker Tip O'Neill (and later, from the testimony of two Secret Service agents in the motorcade with Powers and O'Donnell).[46] Months after talking with Powers, we made another startling discovery: a planned attempt to kill JFK during his Tampa motorcade on November 18, 1963. It was mentioned in only two small Florida newspaper articles, each in just one edition of the newspaper and then only after JFK was killed in Dallas. Nothing appeared at the time of the threat, even though authorities had uncovered the plot prior to JFK's motorcade. It was clear that someone had suppressed the story. We decided to pursue Cuban exile and Bay of Pigs veteran Enrique Ruiz-Williams, who had been interviewed by former FBI agent William Turner in 1973. Williams had told Turner that he had been working on the plan with high CIA officials in Washington--something rare for Cuban exiles--on November 22, 1963. The timing was right, since Rusk had told us that the coup/invasion plan was active when JFK died. A former Kennedy aide confirmed Williams's connection to Bobby and the CIA to William Turner. We eventually found Harry Williams, and in a most unlikely place: the snowy mountains of Colorado, about as far from the tropical climate of his native Cuba as one could imagine. Thoughtful and highly intelligent, he quickly grasped that we had done our homework and already knew many of the pieces of the puzzle--just not how they all fit together. Then in the twilight of his life, he wanted to see the truth come out, as long as the spotlight was kept away from him. By the end of our second interview on that first trip, Harry had given us a detailed overview of the Kennedys' secret plan to overthrow Castro on December 1, 1963 and how it was connected to JFK's assassination. We finally understood how associates of Marcello, Trafficante, and Rosselli had learned of the plan and used parts of it against JFK--forcing Bobby Kennedy and key government officials into a much larger cover-up, to protect national security. After getting the overview of C-Day from Harry--and more details from the Kennedy associates he led us to--we were able to make sense of previously released documents that had baffled investigators for decades. In 1993 we gave a short presentation of our discoveries at a historical conference in Dallas that included top historians, journalists, and former government investigators.
Some of those experts were able not only to get additional documents released by the Review Board, but also to provide us with additional information that they had previously uncovered. In 1994, a brief summary of our findings was featured on the History Channel and in Vanity Fair. In November 1994, we gave the Review Board written testimony about our discovery of the Tampa assassination attempt and the Kennedys' C-Day "Plan for a Coup in Cuba in the Fall of 1963" (the quote is from our actual submission). Three years later, in 1997, the Review Board located and released a trove of documents confirming what Harry had told us about C-Day, including the first declassified documents from fall 1963 entitled "Plan for a Coup in Cuba." It was only in 1998, after the Review Board had finished its work and submitted its final report to the president and Congress, that we learned that the Secret Service had destroyed records covering the Tampa attempt just weeks after we first revealed it to the Review Board. It took us fifteen years to uncover the full story, bringing together all these files and obscure articles in one place--and that was only because we were able to build on decades of work by dedicated historians, journalists, and government investigators. We also had the help of almost two dozen people who had worked with John or Robert Kennedy, who told us what files to look for and gave us the framework for C-Day, especially Harry Williams. Now we can tell the full story in much more detail, quoting directly from hundreds of government documents from the National Archives. These files, many quoted for the first time, verify everything Kennedy insiders had told us, long before most of those files were released. The files support what we said publicly over ten years ago, to the Review Board, to the History Channel, and in Vanity Fair. Some of the very records that prove C-Day's existence also show connections between C-Day and JFK's assassination, and how C-Day was penetrated by the associates of Mafia bosses Carlos Marcello, Santo Trafficante, and Johnny Rosselli. The secrecy surrounding the Kennedys' fall 1963 coup plan--and the Mafia's penetration of it--created most of the continuing controversies about the JFK assassination. Was Lee Harvey Oswald an innocent patsy, an active participant in the conspiracy to kill JFK, or a participant in a US intelligence operation that went awry? As we lay out the evidence about C-Day, and how the Mafia used it to kill JFK, we will answer that question and others that have long baffled historians, investigators, and the public. All the secrecy that shrouded C-Day in 1963, and in the decades since, has had a tremendous impact on American life and politics. While much of the ensuing cover-up of C-Day and its links to JFK's assassination had a legitimate basis in national security, we also document which agencies covered up critical intelligence failures that allowed JFK's assassination to happen. Since C-Day was never exposed, and its lessons never learned, its legacy has continued to harm US relations and intelligence. Ultimate Sacrifice shows how the ongoing secrecy surrounding C-Day and the JFK assassination has continued to cost American lives. * * * NOTES About sources, quotes, and interviews: All government documents cited in these endnotes have been declassified and are available at the National Archives facility in College Park, Md., near Washington, D.C.
Information about many of them, and full copies of a few, are available at the National Archives and Records Administration Web site. Regarding interviews conducted by the authors for this book, for brevity we have used "we" to refer to interviews conducted by the authors, even if only one of us was present for a particular interview. Within quotes in the book, we have sometimes standardized names (such as "Harry Williams") for clarity. 1. Army copy of Department of State document, 1963, Record Number 198-10004-10072, Califano Papers, declassified 7-24-97. CIA memo, AMWORLD 11-22-63, #84804, declassified 1993. 2. The last government committee, the Assassination Records Review Board, was finally unofficially informed of the Coup Plan by one of the authors, via written testimony sent on 11-9-94 for the Review Board's 11-18-94 public hearing in Dallas, as noted in the Board's FY 1995 Report. The earlier committees were the Warren Commission, the Watergate Committee, the Rockefeller Commission, the Pike Committee (and its predecessor, the Nedzi Committee), the Church Committee, and the House Select Committee on Assassinations. 3. "A Presumption of Disclosure: Lessons from the John F. Kennedy Assassination Records Review Board," by OMB Watch, available at ombwatch.com. 4. NBC Nightly News with Tom Brokaw 9-29-98. 5. John F. Kennedy address at Rice University, 9-12-62, from Public Papers of the Presidents of the United States, v. 1, 1962, pp. 669-670. 6. Army document, Summary of plan dated 9-26-63, Califano Papers, Record Number 198-10004-10001, declassified 10-7-97. 7. Army copy of Department of State document, 1963, Record Number 198-10004-10072, Califano Papers, declassified 7-24-97. 8. Army document, Summary of plan dated 9-26-63, Califano Papers, Record Number 198-10004-10001, declassified 10-7-97. 9. Interview with Harry Williams 7-24-93; interview with confidential C-Day Defense Dept. source 7-6-92; classified message to Director from JMWAVE, CIA/DCD Document ID withheld to protect US intelligence asset but declassified 3-94. 10. The following is just one of many: Joint Chiefs of Staff document, dated 12-4-63 with 11-30-63 report from Cyrus Vance, Record Number 202-10002-101116, declassified 10-7-97. 11. CIA cable to Director, 12-10-63, CIA 104-10076-10252, declassified 8-95; David Corn, Blond Ghost: Ted Shackley and the CIA's Crusades (New York: Simon & Schuster, 1994), p. 110. 12. House Select Committee on Assassinations vol. X, p. 77. 13. Interview with Harry Williams 2-24-92; interview with confidential Kennedy C-Day aide source 3-17-92; interview with confidential C-Day Defense Dept. source 7-6-92. 14. Interview with Dean Rusk 1-8-90. 15. Foreign Relations of the United States, Volume XI, Department of State, #370, 10-8-63; 12-6-63 CIA Document, from JMWAVE to Director, released during the 1993 CIA Historical Review Program. 16. From the John F. Kennedy Presidential Library, NLK 78-473, declassified 5-6-80. 17. John H. Davis, Mafia Kingfish: Carlos Marcello and the Assassination of John F. Kennedy (New York: McGraw-Hill, 1989), pp. 49, 64, many others. 18. Ibid. 19. FBI DL 183A-1f035-Sub L 3.6.86 and FBI Dallas 175-109 3.3.89, cited by A. J. Weberman; CR 137A-5467-69, 6-9-88, cited by Brad O'Leary and L. E. Seymour, Triangle of Death (Nashville: WND Books, 2003). 20. David Talbot, "The man who solved the Kennedy assassination," Salon.com, 11-22-03. 21. Jack Newfield, "I want Kennedy killed," Penthouse 5-92; Frank Ragano and Selwyn Raab, Mob Lawyer (New York: Scribners, 1994), pp.
346-54, 361; "Truth or Fiction?" St. Petersburg Times, 4-18-94. Charles Rappleye and Ed Becker, All American Mafioso: The Johnny Rosselli Story (New York: Barricade, 1995). 22. William Scott Malone, "The Secret Life of Jack Ruby," New Times 1-23-78; Bradley Ayers, The War that Never Was: An Insider's Account of CIA Covert Operations Against Cuba (Canoga Park, Calif.: Major Books, 1979), pp. 59, 129; The CIA's Inspector General's Report on the CIA-Mafia plots. 23. Malone, op. cit.; HSCA Final Report and volumes, many passages. 24. Phone interviews with Pierre Salinger 4-3-98, 4-10-98; interview with confidential source 4-14-98. 25. Warren Commission Exhibit #2818. (In mid-December 1963, after JFK's death, when LBJ put C-Day on hold, Ruby placed the date for the invasion in May 1964.) 26. Atlanta Journal-Constitution 5-19-02, pp. C-1, C-6; John Sugg, "Time to Pull the Sharks' Teeth," Creative Loafing weekly newspaper, Atlanta edition, 12-11-03, p. 27. 27. G. Robert Blakey and Richard N. Billings, The Plot to Kill the President (New York: Times Books, 1981), p. 288. 28. Charles Rappleye and Ed Becker, All American Mafioso (New York: Barricade, 1995), p. 315. 29. Church Committee Report, Vol. V, officially The Investigation of the Assassination of President John F. Kennedy: Performance of the Intelligence Agencies, pp. 19-21; 8-30-77 CIA document, "Breckinridge Task Force" report, commenting on Church Committee Report, Vol. V, document ID 1993.07.27.18:36:29:430590, declassified 1993; Thomas G. Paterson, Contesting Castro: The United States and the Triumph of the Cuban Revolution (New York: Oxford University Press, 1994), p. 261, citing Bundy memo, "Meeting with the President," Dec. 19, 1963; Arthur Schlesinger, Jr., Robert Kennedy and His Times (New York: Ballantine, 1979), p. 598; Gus Russo, Live by the Sword: The Secret War against Castro and the Death of JFK (Baltimore: Bancroft Press, 1998), p. 278. 30. Haynes Johnson, "One Day's Events Shattered America's Hopes and Certainties," Washington Post 11-20-83. 31. ABCNEWS.com, 11-20-03, "A Brother's Pain," interview with Evan Thomas. 32. Phone interviews with Pierre Salinger 4-3-98, 4-10-98; interview with confidential source 4-14-98. 33. Re Schlesinger: Arthur M. Schlesinger, Jr., Parade magazine 6-7-98 citing Jack Newfield; re McCone: Schlesinger, op. cit., p. 664; Blakey, op. cit., many passages; re Sheridan: John H. Davis, The Kennedy Contract (New York: Harper Paperbacks, 1993), p. 154, and Evan Thomas, Robert Kennedy, p. 338; re O'Donnell: William Novak, Man of the House: The Life and Political Memoirs of Speaker Tip O'Neill (New York: Random House, 1987), p. 178; Jack Newfield, "I want Kennedy killed," Penthouse 5-92; re Burkley: Gus Russo, Live by the Sword (Baltimore: Bancroft Press, 1998), p. 49; Ronald Goldfarb, Perfect Villains, Imperfect Heroes: Robert F. Kennedy's War against Organized Crime (New York: Random House), pp. 258-299. 34. ABCNEWS.com, op. cit. 35. Ibid. 36. Joseph A. Califano, Jr., Inside: A Public and Private Life (New York: Public Affairs, 2004), p. 125. 37. In addition, the predecessor of the Pike Committee--the Nedzi Committee--got close to aspects of JFK's assassination and C-Day when it investigated CIA activities during Watergate. 38. The following document was "systematically reviewed by JCS on 19 Oct 1989 Classification continued"--Joint Chiefs of Staff document, dated 12-4-63 with 11-30-63 report from Cyrus Vance, 80 total pages, Record Number 202-10002-101116, declassified 10-7-97. 39.
"A Presumption of Disclosure: Lessons from the John F. Kennedy Assassination Records Review Board," by OMB Watch, available at ombwatch.com. 40. Interview with ex-Secret Service Agent Abraham Bolden 4-15-98; House Select Committee on Assassinations Report 231, 232, 636, New York Times 12-6-67; Abraham Bolden file at the Assassination Archives and Research Center. 41. Michael R. Beschloss, The Crisis Years: Kennedy and Khrushchev, 1960-1963 (New York: Edward Burlingame Books, 1991) pp. 670, 671. 42. John Mitchell?the commander of JFK's PT boat unit?would become attorney general under Nixon, before his conviction due to a scandal related to C-Day. 43. Myra MacPherson, "The Last Casualty of the Bay of Pigs," Washington Post 10-17-89. 44. From the John F. Kennedy Presidential Library, NLK 78-473, declassified 5-6-80; article by Tad Szulc in the Boston Globe 5-28- 76 and a slightly different version of the same article in The New Republic 6-5-76 45. Interview with Dave Powers 6-5-91 at the John F. Kennedy Presidential Library. 46. Novak, op. cit., p. 178. * * * Thom Hartmann and Lamar Waldron A BUZZFLASH GUEST CONTRIBUTION Thom Hartmann is a progressive talk radio host and writes, among other things, the http://www.buzzflash.com/hartmann/default.htm "Independent Thinker Book of the Month" reviews for BuzzFlash. Lamar Waldron is an Atlanta-based writer and historical researcher. From shovland at mindspring.com Sun Dec 11 15:27:55 2005 From: shovland at mindspring.com (Steve Hovland) Date: Sun, 11 Dec 2005 07:27:55 -0800 Subject: [Paleopsych] Jerry Goodenough: Critical Thinking about ConspiracyTheories In-Reply-To: Message-ID: The reason people see conspiracies is because they are actually there. This gentlemen is one of a cadre of intellectual prostitutes who willingly takes money to help conceal the truth. -----Original Message----- From: paleopsych-bounces at paleopsych.org [mailto:paleopsych-bounces at paleopsych.org]On Behalf Of Premise Checker Sent: Saturday, December 10, 2005 7:16 PM To: paleopsych at paleopsych.org Subject: [Paleopsych] Jerry Goodenough: Critical Thinking about ConspiracyTheories Jerry Goodenough: Critical Thinking about Conspiracy Theories http://www.uea.ac.uk/~j097/CONSP01.htm [This is a very good analysis, esp. when it comes to noting that many, many conspiracies posit too many conspirators. As far as the specific analysis of the Kennedy assassination goes, the author makes a very good point about the Mafia being incompetent. I'll send along in a moment excerpts from a new book, "Ultimate Sacrifice," that makes a new case that the Mafia did in fact orchestrate the assassination. According to the book, the Mafia got wind of a CIA plot to murder Castro and threatened to reveal it, thereby causing an international crisis. The Warren Commission, accordingly covered things up, a cover-up which continues. [Still, the charge of incompetence remains. I reinsert my own theory that the assassination was an assisted suicide. JFK knew he had not long to live but did not want to go down in history like Millard Fillmore, whose only achievement was to not install a bath tub in the White House. Just being assassinated would not be enough, so he got the conspirators to leave enough bogus and inconsistent evidence that researchers would never stop spinning theories, all of them imperfect for failure to reconcile the evidence. [The Enlightenment died in six seconds on the Dealey Plaza.] Jerry Goodenough is Professor of Philosophy at the University of East Anglia, Norwich, UK 1. 
Introduction Conspiracy theories play a major part in popular thinking about the way the world, especially the political world, operates. And yet they have received curiously little attention from philosophers and others with a professional interest in reasoning.[1] Though this situation is now starting to change, it is the purpose of this paper to approach this topic from the viewpoint of critical thinking, to ask if there are particular absences or deformities of critical thinking skills which are symptomatic of conspiracy theorising, and whether better teaching of reasoning may guard against them. That conspiracy thinking is widespread can be seen from any cursory examination of a bookshop or magazine stand. There are not only large amounts of blatant conspiracy work, often dealing with American political assassinations and other events or with the alleged presence of extraterrestrial spacecraft, but also large amounts of writing where a certain degree of conspiracy thinking is more or less implicit. Thus many `alternative' works of medicine, history, archaeology, technology, etc. often depend upon claims, explicit or otherwise, that an establishment or orthodoxy conspires to suppress alternative views. Orthodox medicine in cahoots with the multinational drug companies conspires to suppress the claims of homeopathy, orthodox archaeologists through malice or blindness conspire to suppress the truth about the construction of the Pyramids, and so on. It certainly seems to the jaundiced observer that there is more of this stuff about than ever before. However, conspiracy theorising is now coming to the attention of philosophers. That it has taken this long may be because, as Brian Keeley says in a recent paper, `most academics simply find the conspiracy theories of popular culture to be silly and without merit.' (1999: 109n) But I agree with Keeley's further remark that `it is incumbent upon philosophers to provide analysis of the errors involved with common delusions, if that is indeed what they are.' If a kind of academic snobbishness underlies our previous refusal to get involved here, there may be another reason. Conspiracy theorising, in political philosophy at least, has been identified with irrationality of the worst sort--here the locus classicus may be some dismissive remarks made by Karl Popper in The Open Society and its Enemies (Popper 1996, Vol.2: 94-9). Pigden (1993) shows convincingly that Popper's remarks cannot be taken to support a rational presumption against conspiracy theories in history and politics. But certainly such a presumption exists, particularly amongst political commentators. It tends to manifest itself in a noisy preference for what is termed the `cock-up' theory of history--an unfortunate term that tends to assume that history is composed entirely of errors, accidents and unforeseen consequences. If such a dismal state of affairs were indeed to be the case, then there would seem to be no point in anybody trying to do anything. The cock-up theory, then, is agreeable to all forms of quietism. But we have no reason to believe that there is such a coherent theory, and even less reason to believe that every event must fall neatly into one or other category here; indeed, this insistence on black and white reasoning is, as we shall see, one of the features of conspiracy theorising itself! And what makes the self-satisfied `cock-up' stance even less acceptable is that it ignores the fact that conspiracies are a very real part of our world.
No serious historian denies that a somewhat amateurish conspiracy lay behind the assassination of Abraham Lincoln, or that a more professional but sadly less successful conspiracy attempted to assassinate Adolf Hitler in the summer of 1944. Yet such is the presumption behind the cock-up stance that the existence or frequency of genuine conspiracies is often significantly downplayed. (How many people, taking at face value the cock-up theorists' claim that conspiracies are a real rarity in the modern history of democracies, do not know that a mere 13 years before President Kennedy's assassination, a serious terrorist conspiracy to murder Harry S. Truman led to a fatal gunfight on the streets of Washington?[2] The cock-up presumption seems to generate a kind of amnesia here.) We require, then, some view of events that allows for the accidental and the planned, the deliberate and the contingent: history as a tapestry of conspiracies and cock-ups and much intentional action that is neither. Pigden (op.cit) satisfactorily demonstrates the unlikelihood of there being any adequate a priori exclusion principle here, in the face of the reality of at least some real conspiracies. Keeley's paper attempts a more rigorous definition of the phenomenon, hoping to separate what he terms Unwarranted Conspiracy Theories (UCTs) from rational or warranted conspiratorial explanations: It is thought that this class of explanation [UCTs] can be distinguished analytically from those theories which deserve our assent. The idea is that we can do with conspiracy theories what David Hume (1748) did with miracles: show that there is a class of explanations to which we should not assent, by definition. (Keeley: 111) and it is part of his conclusion that `this task is not as simple as we might have heretofore imagined.' (ibid.) Keeley concludes that `much of the intuitive "problem" with conspiracy theories is a problem with the theorists themselves, and not a feature of the theories they produce' (Ibid: 126) and it is this point I want to take up in this paper. What sort of thinking goes on in arriving at UCTs and what sort of things go wrong? If we say that conspiracy theorists are irrational, do we mean only that they are illogical in their reasoning? Or are there particular critical thinking skills missing or being misused? 2. Definitions Keeley's use of the term Unwarranted Conspiracy Theory should not mislead us into thinking that all conspiracy theories fall into one or other category here. Warrant is a matter of degree, and so is conspiracy. There are cases where a conspiratorial explanation is plainly rational; take, for instance, the aforementioned July Bomb Plot to kill Hitler, where there is an abundance of historical evidence about the conspirators and their aims. There are cases where such an explanation is clearly irrational: I shall argue later in the paper that this is most probably the case for the assassination of President Kennedy. And there are cases where some conspiratorial explanation may be warranted but it is hard to know how far the warrant should extend. Take, for instance, the murder of the Archduke Franz Ferdinand in Sarajevo in 1914. There was plainly a conspiracy to bring this about: some minutes before Gavrilo Princip shot the archduke, a co-conspirator was arrested after throwing a bomb (which failed to explode) at the archduke's car. Princip and his fellow students were Serbian nationalists, acting together to demonstrate against the presence of Habsburg influence in the Balkans.
But there remains the possibility that they had been infiltrated and manipulated by Yugoslav intelligence elements seeking to provoke a crisis with Austria-Hungary. And there are more extreme claims that the ultimate manipulators here were agents of a world-wide conspiracy, of international Jewry or freemasonry seeking to bring about war. We are fully warranted in adopting the first conspiratorial explanation, but perhaps only partially warranted in thinking there is anything in the second claim[3], while the extreme claims seem to me to be as unwarranted as anything could be. What we require, then, is some definition which will mark off the kind of features which ought to lead us to suspect the warrant of any particular conspiratorial explanation. Keeley lays out a series of these, which I shall list and comment upon. But first he offers his definition of conspiracy theories in general: A conspiracy theory is a proposed explanation of some historical event (or events) in terms of the significant causal agency of a relatively small group of persons--the conspirators--acting in secret... a conspiracy theory deserves the appellation "theory" because it proffers an explanation of the event in question. It proposes reasons why the event occurred... [it] need not propose that the conspirators are all powerful, only that they have played some pivotal role in bringing about the event... indeed, it is because the conspirators are not omnipotent that they must act in secret, for if they acted in public, others would move to obstruct them... [and] the group of conspirators must be small, although the upper bounds are necessarily vague.(116) Keeley's definition here differs significantly from the kind of conspiracy at which Popper was aiming in The Open Society, crude Marxist explanations of events in terms of capitalist manipulation. For one can assume that in capitalist societies capitalists are very nearly all-powerful and not generally hindered by the necessity for secrecy. A greater problem for Keeley's definition, though, is that it seems to include much of the work of central government. Indeed, it seems to define exactly the operations of cabinet government--more so in countries like Britain with no great tradition of governmental openness than in many other democracies. What is clearly lacking here is some additional feature, that the conspirators be acting against the law or against the public interest, or both. This doesn't entirely free government from accusations of conspiracy--does a secret cabinet decision to upgrade a country's nuclear armaments, which appears prima facie within the bounds of the law of that country but may breach international laws and agreements, count? Is it lawful? In the public interest? A further difficulty with some kind of illegality constraint is that it might tend to rule out what we might otherwise clearly recognise as conspiracy theories. Take, for instance, the widely held belief amongst ufologists that the US government (and others) has acted to conceal the existence on earth of extra-terrestrial creatures, crashed flying saucers at Roswell, and so on. It doesn't seem obvious that governments would be acting illegally in this case--national security legislation is often open to very wide interpretation--and it could be argued that they are acting in the public interest, to avoid panic and so on. (Unless, of course, as some ufologists seem to believe, the government is conspiring with the aliens in order to organise the slavery of the human race!)
So we have here what would appear to be a conspiracy theory, and one which has some of the features of Keeley's UCTs, but which is excluded by the illegality constraint. Perhaps the best we can do here is to assert that conspiracy theories are necessarily somewhat vague in this regard; I'll return to this point later. If this gives us a rough idea of what counts as a conspiracy theory, we can then build upon it and Keeley goes on to list five features which he regards as characteristic of Unwarranted Conspiracy Theories: (1) `A UCT is an explanation that runs counter to some received, official, or "obvious" account.' (116-7) This is nothing like a sufficient condition, for the history of even democratic governments is full of post facto surprises that cause us to revise previous official explanations. For instance, for many years the official explanation for Britain's military success in the Second World War was made in terms of superior generalship, better troops, occasional good luck, and so on. The revelation in the 1970s of the successful Enigma programme to break German service codes necessitated wholesale revision of military histories of this period. This was an entirely beneficial outcome, but others were more dubious. The growth of nuclear power in Britain in the 1950s was officially explained in terms of the benefit of cheaper and less polluting sources of electricity. It was only much later that it became clear that these claims were exaggerated and that the true motivation for the construction of these reactors was to provide fissile material for Britain's independent nuclear weapons. Whether such behaviour was either legal or in the public interest is an interesting thought. (1A) `Central to any UCT is an official story that the conspiracy theory must undermine and cast doubt upon. Furthermore, the presence of a "cover story" is often seen as the most damning piece of evidence for any given conspiracy.' This is an interesting epistemological point to which I shall return. (2) `The true intentions behind the conspiracy are invariably nefarious'. I agree with this as a general feature, particularly of non-governmental conspiracies, though as pointed out above it is possible for governmental conspiracies to be motivated or justified in terms of preventing public alarm, which may be seen as an essentially beneficial aim. (3) `UCTs typically seek to tie together seemingly unrelated events.' This is certainly true of the more extreme conspiracy theory, one which seeks a grand unified explanation of everything. We have here a progression from the individual CT, seeking to explain one event, to the more general. Carl Oglesby (1976), for instance, seeks to reinterpret many of the key events in post-war American history in terms of a more or less secret war between opposing factions within American capital, an explanation which sees Watergate and the removal of Richard Nixon from office as one side's revenge for the assassination of John Kennedy. At the extreme we have those theories which seek to explain all the key events of western history in terms of a single secret motivating force, something like international freemasonry or the great Jewish conspiracy.[4] It may be taken as a useful rule of thumb here that the greater the explanatory range of the CT, the more likely it is to be untrue. (A point to which Popper himself would be sympathetic!) Finally, one might want to query here Keeley's point about seemingly unrelated events.
Many CTs seem to have their origin in a desire to relate events that one might feel ought to go together. Thus many Americans, on hearing of the assassination of Robert Kennedy (itself coming very shortly after that of Martin Luther King), thought these events obviously related in some way, and sought to generate theories linking them in terms of some malevolent force bent on eliminating apparently liberal influences in American politics. They seem prima facie more likely to be related than, say, the deaths of the Kennedy brothers and those of John Lennon or Elvis Presley: any CT linking these does indeed fulfil Keeley's (3). (4) `...the truths behind events explained by conspiracy theories are typically well-guarded secrets, even if the ultimate perpetrators are sometimes well-known public figures.' This is certainly the original belief of proponents of UCTs but it does lead to a somewhat paradoxical situation whereby the alleged secret can become something of an orthodoxy. Thus opinion polls seem to indicate that something in excess of 80% of Americans believe that a conspiracy led to the death of President Kennedy, though it seems wildly unlikely that they all believe in the same conspiracy. It becomes increasingly hard to believe in a well-guarded secret that has been so thoroughly aired in 35 years of books, magazine articles and even Hollywood movies. Pretty much the same percentage of Americans seem to believe in the presence on earth of extra-terrestrials, though whether this tells us more about Americans or about opinion-polls is hard to say. But these facts, if facts they be, would tend to undercut the `benevolent government' UCTs. For there is really no point in `them' keeping the truth from us to avoid panic if most of us already believe this `truth'. The revelation of cast-iron evidence of a conspiracy to kill Kennedy or of the reality of alien visits to Earth would be unlikely to generate more than a ripple of public interest, these events having been so thoroughly rehearsed. (5) `The chief tool of the conspiracy theorist is what I shall call errant data', by which Keeley means data which is unaccounted for by official explanations, or data which if true would tend to contradict official explanations. These are the marks of the UCT. As Keeley goes on to say (118) `there is no criterion or set of criteria that provide a priori grounds for distinguishing warranted conspiracy theories from UCTs.' One might perhaps like to insist here that UCTs ought to be false, and this is why we are not warranted in believing them, but it is in the nature of many CTs that they cannot be falsified. The best we may do is show why the warrant for believing them is so poor. And one way of approaching this is by way of examining where the thinking that leads to UCTs goes awry. 3. Where CT thinking goes wrong It is my belief that one reason why we should not accept UCTs is that they are irrational. But by this I do not necessarily mean that they are illogical in the sense that they commit logical fallacies or use invalid argument forms--though this does indeed sometimes happen--but rather that they misuse or fail to use a range of critical thinking skills and principles of reasoning. In this section I want to provide a list of what I regard as the key weaknesses of CT thinking, and then in the next section I will examine a case study of (what I regard to be) a UCT and show how these weaknesses operate. My list of points is not necessarily in order of importance.
(A) An inability to weigh evidence properly. Different sorts of evidence are generally worthy of different amounts of weight. Of crucial importance here is eye-witness testimony. Considerable psychological research has been done into the strengths and weaknesses of such testimony, and this has been distilled into one of the key critical thinking texts, Norris & King's (1983) Test on Appraising Observations, whose Manual provides a detailed set of principles for judging the believability of observation statements. I suspect that no single factor contributes more, especially to assassination and UFO UCTs, than a failure to absorb and apply these principles. (B) An inability to assess evidence corruption and contamination. This is a particular problem with eyewitness testimony about an event that is subsequently the subject of considerable media coverage. And it is not helped by conventions or media events which bring such witnesses together to discuss their experiences--it is not for nothing that most court systems insist that witnesses do not discuss their testimony with each other or other people until after it has been given in court. There is a particular problem with American UCTs since the mass media there are not governed by sub judice constraints, and so conspiratorial theories can be widely aired in advance of any court proceedings. Again Norris & King's principles (particularly IV.10 & 12) should warn against this.[5] But we do not need considerable delay for such corruption to occur: it may happen as part of the original act of perception. For instance, in reading accounts where a group of witnesses claim to have identified some phenomenon in the sky as a spaceship or other unknown form of craft, I often wonder if this judgement occurred to all of them simultaneously, or if a claim by one witness that this was a spaceship could not act to corrupt the judgmental powers of other witnesses, so that they started to see this phenomenon `as' a spacecraft in preference to some more mundane explanation. (C) Misuse or outright reversal of a principle of charity: wherever the evidence is insufficient to decide between a mundane explanation and a suspicious one, UCTs tend to pick the latter. The critical thinker should never be prejudiced against occupying a position of principled neutrality when the evidence is more or less equally balanced between two competing hypotheses. And I would argue that there is much to be said for operating some principle of charity here, of always picking the less suspicious hypothesis of two equally supported by the evidence. My suspicion is that in the long run this would lead to a generally more economical belief structure, that reversing the principle of charity ultimately tends to blunt Occam's Razor, but I cannot hope to prove this. (D) The demonisation of persons and organisations. This may be regarded as either following from or being a special case of (C). Broadly, this amounts to moving from the accepted fact that X once lied to the belief that nothing X says is trustworthy, or taking the fact that X once performed some misdeed as particular evidence of guilt on other occasions. In the former case, adopting (D) would demonise us all, since we have lied on some occasion or other. This is especially problematic for UCTs involving government organisations or personnel, since all governments reserve the right to lie or mislead if they feel it is in the national interest to do so.
But proof that any agency lied about one event ought not to be taken as significant proof that they lied on some other occasion. It goes to the character of the witness, as lawyers are wont to say, but then no sensible person should believe that governments are perfectly truthful. The second case is more difficult. It is a standard feature of Anglo-Saxon jurisprudence that the fact that X has a previous conviction should not be given in evidence against them, nor revealed to the jury until after a verdict is arrived at. The reasoning here is that generally evidence of X's previous guilt is not specific evidence for his guilt on the present occasion; it is possible for it to be the case that X was guilty then and is innocent now, and so the court should not be prejudiced against him. But there is an exception to this, at least in English law, where there are significant individual features shared between X's previous proven modus operandi and that of the present offence under consideration; evidence of a consistent pattern may be introduced into court. But, the rigid standards of courtroom proof aside, it is not unreasonable for the police to suspect X on the basis of his earlier conviction. This may not be fair to X (if he is trying to go straight) but it is epistemologically reasonable. The trouble for UCTs, as we shall see, is that most governments have a long record of previous convictions, and the true UC theorist may regard this not just as grounds for a reasonable suspicion but as itself evidence of present guilt. (E) The canonisation of persons or (more rarely) organisations. This may be regarded as the mirror-image of (D). Here those who are regarded as the victims of some set of events being explained conspiratorially tend to be presented, for the purpose of justifying the explanation, as being without sin, or being more heroic or more threatening to some alleged set of private interests than the evidence might reasonably support. (F) An inability to make rational or proportional means-end judgements. This is perhaps the greatest affront to Occam's Razor that one finds in UCTs. Such theories are often propounded with the explanation that some group of conspirators have been acting in furtherance of some aim or in order to prevent some action taking place. But one ought to ask whether such a group of conspirators were in a position to further their aim in some easier or less expensive or less risky fashion. Our assumption here is not the principle of charity mentioned in (C) above, that our alleged conspirators are too nice or moral to resort to nefarious activities. We should assume only that our conspirators are rational people capable of working out the best means to a particular end. This is a defeasible assumption--stupidity is not totally unknown in the political world--but it is nevertheless an assumption that ought to guide us unless we have evidence to the contrary. A difficulty that should be mentioned here is that of establishing the end at which the conspiracy is aimed, made more difficult for conspiracies that never subsequently announce these things. For the state of affairs brought about by the conspirators may, despite their best efforts, not be that at which they aimed. If this is what happens then making a rational means-end judgement to the actual result of the conspiracy may be a very different matter from doing the same thing to the intended results. (G) Evidence against a UCT is always evidence for.
This is perhaps the point that would most have irritated Karl Popper with his insistence that valid theories must always be capable of falsification. But it is an essential feature of UCTs; they do not just argue that on the evidence available a different conclusion should be drawn from that officially sanctioned or popular. Rather, the claim is that the evidence supporting the official verdict is suspect, fraudulent, faked or coerced. And this belief is used to support the nature of the conspiracy, which must be one powerful or competent enough to fake all this evidence. What we have here is a difference between critically assessing evidence--something I support under (A) above--and the universal acid of hypercritical doubt. For if we start with the position that any piece of evidence may be false then it is open to us to support any hypothesis whatsoever. Holocaust revisionists would have us believe that vast amounts of evidence supporting the hypothesis of a German plot to exterminate Europe's Jews are fake. As Robert Anton Wilson (1989: 172) says, `a conspiracy that can deceive us about 6,000,000 deaths can deceive us about anything, and that it takes a great leap of faith for Holocaust Revisionists to believe that World War II happened at all.' Quite so. What is needed here is what I might term meta-evidence, evidence about the evidence. My claim would be that the only way to keep Occam's Razor shiny here is to insist on two different levels of critical analysis of evidence. Evidence may be rejected if it doesn't fit a plausible hypothesis--this is what everyone must do in cases where there is apparently contradictory evidence, and there can be no prima facie guidelines for rejection here apart from overall epistemological economy. But evidence may only be impeached--accused of being deliberately faked, forged, coerced, etc.--if we have further evidence of this forgery: that a piece of evidence does not fit our present hypothesis is not by itself any warrant for believing that the evidence is fake. (H) We should put no trust in what I here term the fallacy of the spider's web. That A knows B and that B knows C is no evidence at all that A has even heard of C. But all too often UCTs proceed in this fashion, weaving together a web of conspirators on the basis of who knows who. But personal acquaintance is not necessarily a transitive relation. The falsity of this belief in the epistemological importance of webs of relationships can be demonstrated with reference to the show-business party game known sometimes as `Six Degrees of Kevin Bacon'. The object of the game is to select the name of an actor or actress and then link them to the film-actor Kevin Bacon through no more than six shared appearances. (E.g. A appeared with B in film X, B appeared with C in film Y, C appeared with D in film Z, and D appears in Kevin Bacon's latest movie: thus we link A to Bacon in four moves.) The plain fact is that most of us know many people, and important people in public office tend to have dealings with a huge number of people, so just about anybody in the world can be linked to somebody else in a reasonably small number of such links. I can demonstrate the truth of this proposition with reference to my own case, that of a dull and unworldly person who doesn't get out much.
For I am separated by only two degrees from Her Majesty The Queen (for I once very briefly met the then Poet Laureate, who must himself have met the Queen if only at his inauguration), which means I am separated by only three degrees from all the many important political figures that the Queen herself has met, including names like Churchill and De Gaulle. Which further means that only four degrees separate me from Josef Stalin (met by Churchill at Yalta) and just five degrees from Adolf Hitler (who never met Churchill but did meet prewar Conservative politicians like Chamberlain and Halifax who were known to Churchill). Given the increasing amounts of travel and communication that have taken place in this century, it should be possible to connect me with just about anybody in the world in the requisite six stages. But so what? Connections like these offer the possibility of communication and influence, but offer no evidence for its actuality. (A short computational sketch below, following point (I), makes concrete just how cheap such chains are.) (I) The classic logical fallacy of post hoc ergo propter hoc. This is the most common strictly logical fallacy to be found in political conspiracy theories, especially those dealing with assassinations and suspicious deaths. And broadly it takes the shape of claiming that since event X happened after the death of A, A's death was brought about in order to cause or facilitate the occurrence of X. The First World War happened after the death of the Archduke Franz Ferdinand, and there is clearly a sense in which it happened because of his death: there is a causal chain leading from the death to Austrian outrage, to a series of Austrian demands upon Serbia, culminating in Austria's declaration of war against Serbia, Russia's declaration against Austria, and, via a series of interlinked treaty obligations, most of the nations of Europe ending up at war with one another. Though these effects of the assassination may now appear obvious, one problem for the CT proponent is that hindsight clarifies these matters enormously: such a progression may not have been at all obvious to the people involved in these events at the time. And it is even harder to believe that bringing about such an outcome was in any of their interests. (Austria plainly had an interest in shoring up its authority in the Balkans but not, given its many structural weaknesses, in engaging in a long and destructive war. The outcome, which anyone might have predicted as likely, was the economic ruin and subsequent political dissolution of the entire Austro-Hungarian empire.) Attempting to judge the rationality of a proposed CT here as an explanation for some such set of events runs into two problems. Firstly, though an outcome may now seem obvious to us, it may not have appeared so obvious to people at the time, either in its nature or in its expensiveness. Thus there may well have been people who thought that assassinating Franz Ferdinand in order to trigger a crisis in relations between Austria and Serbia was a sensible policy move, precisely because they did not anticipate a general world war occurring as a result and may have thought a less expensive conflict, a limited war of independence between Serbia and Austria, worth the possible outcome of freeing more of the Balkans from Austrian domination. And secondly, if we cannot attribute hindsight to the actors in such events, neither can we ascribe to them a perfect level of rationality: it is always possible for people engaged in such actions to possess a poor standard of means-end judgement.
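To make the cheapness of such chains concrete, here is a minimal sketch, in Python, of the game just described. The network below is a toy reconstruction of the acquaintance links mentioned above, flattened into named nodes purely for illustration (the links are assumptions of the sketch, not documented facts); degrees of separation are computed by breadth-first search.

    from collections import deque

    def degrees_of_separation(graph, start, goal):
        # Shortest chain of acquaintance between two people, found by
        # breadth-first search over an undirected acquaintance graph.
        if start == goal:
            return 0
        seen = {start}
        queue = deque([(start, 0)])
        while queue:
            person, dist = queue.popleft()
            for acquaintance in graph.get(person, ()):
                if acquaintance == goal:
                    return dist + 1
                if acquaintance not in seen:
                    seen.add(acquaintance)
                    queue.append((acquaintance, dist + 1))
        return None  # no chain of any length exists

    # Toy acquaintance network from the example above
    # (each link listed in both directions).
    graph = {
        "author": ["poet laureate"],
        "poet laureate": ["author", "Queen"],
        "Queen": ["poet laureate", "Churchill"],
        "Churchill": ["Queen", "Stalin", "Chamberlain"],
        "Stalin": ["Churchill"],
        "Chamberlain": ["Churchill", "Hitler"],
        "Hitler": ["Chamberlain"],
    }

    print(degrees_of_separation(graph, "author", "Stalin"))  # prints 4
    print(degrees_of_separation(graph, "author", "Hitler"))  # prints 5

Even this tiny, sparse graph reproduces the four-link chain to Stalin and the five-link chain to Hitler described above; in a realistic graph where each person knows hundreds of others, the set of people reachable within six links grows explosively, which is exactly why the mere existence of a short chain carries no evidential weight.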
But, bearing in mind these caveats about hindsight and rationality, one might still wish to propound two broad principles here for distinguishing whether an event is a genuine possible motive for an earlier conspiracy or just an instance of post hoc ergo propter hoc. Firstly, could any possible conspirators, with the knowledge they possessed at the time, have reasonably foreseen such an outcome? And secondly, granted that such an outcome could have been desired, are the proposed conspiratorial events a rational method of bringing about such an outcome? That a proposed CT passes these tests is, of course, no guarantee that we are dealing here with a genuine conspiracy; but a failure to pass them is a significant indicator of an unwarranted CT. 4. A case-study of CT thinking--the assassination of President Kennedy With these diagnostic indicators of poor critical thinking in place, I would now like to apply them to a typical instance of CT (and, to my mind, unwarranted CT) thinking.[6] On 22 November 1963, President John F. Kennedy was assassinated in Dallas, Texas. Two days later, the man accused of his murder, Lee Harvey Oswald, was himself murdered in the basement of the Dallas Police Headquarters. These two events (and perhaps particularly the second, coming as it did so rapidly after the first) led to a number of accusations that Kennedy's death had been the result of a conspiracy of which Oswald may or may not have been a part. Books propounding such theories emerged even before the Warren Commission issued its report on the assassination in August 1964. Writing at this time in his essay `The Paranoid Style in American Politics', the historian Richard Hofstadter could say: "Conspiratorial explanations of Kennedy's assassination have a far wider currency in Europe than they do in the United States." (Hofstadter 1964: 9) Hofstadter's view of the American paranoid style was one of small cults of a right-wing or racist or anti-Catholic or anti-Freemason bent whose descendants are still to be found in the Ku Klux Klan, the John Birch Society, the Michigan Militia, etc. But within a couple of years of the emergence of the Warren Report and, more importantly, its 26 volumes of evidence, a new style of conspiratorial thinking emerged. While some right-wing conspiratorial theories remained[7], the bulk of the conspiracy theories propounded to explain the assassination adopted a position from the left of centre, accusing or assuming that some conspiracy of right-wing elements and/or some part of the US Government itself had been responsible for the assassination. A complete classification of such CTs is not necessary here[8], but I ought perhaps to point to a philosophically interesting development in the case. As a result of public pressure resulting from the first wave of CT literature, a congressional committee was established in 1976 to investigate Kennedy's assassination; it instituted a thorough examination of the available evidence and was on the verge of producing a report endorsing the Warren Commission's conclusions when it discovered what was alleged to be a sound recording of the actual assassination. Almost solely on the basis of this evidence--which was subsequently discredited by a scientific panel put together by the Department of Justice--the Congressional committee decided that there had probably been a conspiracy, asserting on the basis of very little evidence that the Mafia was the most probable source of this conspiracy.
What was significant about this congressional investigation was the effect its thorough investigation of the forensic and photographic evidence in the case had. Many of the alleged discrepancies in this evidence, which had formed the basis for the many calls to establish such an investigation, were shown to be erroneous. This did not lead to the refutation of CTs but rather to a new development: the balance of CT claims now went from arguing that there existed evidence supporting a conspiratorial explanation to arguing that all or most of the evidence supporting the lone-assassin hypothesis had been faked, a new level of epistemological complexity. A representative CT of this type was propounded in Oliver Stone's hit 1991 Hollywood film JFK.[9] It asserts that a coalition of interests within the US governmental structure, including senior members of the armed forces, FBI, CIA, Secret Service and various Texas law-enforcement agencies, together with the assistance of members of organised crime, conspired to arrange the assassination of President Kennedy and the subsequent framing of an unwitting or entirely innocent Oswald for the crime. Motives for the assassination vary but most such CTs now agree on such motives as (a) preventing Kennedy, after his supposed re-election, from reversing US involvement in Vietnam, (b) protecting right-wing industrial interests, especially Texan oil interests, from what were regarded as possible depredations by the Kennedy administration, (c) instigating another and more successful US invasion of Cuba, and (d) halting the judicial assault waged by the Kennedy administration under Attorney General Robert Kennedy against the interests of organised crime. Such a CT scores highly on Keeley's five characteristic features of Unwarranted Conspiracy Theories outlined above. It runs counter to the official explanation of the assassination, though it has now itself become something of a popular orthodoxy, one apparently subscribed to by a majority of the American population. The alleged intentions behind the conspiracy are indeed nefarious, using the murder of a democratically-elected leader to further the interests of a private cabal. And it does seem to seek to tie together seemingly unrelated events. The most obvious of these is in terms of the assassination's alleged motive: it seeks to link the assassination with the subsequent history of America's involvement in Vietnam. But a number of other connections are made at other levels of explanation. For instance, the deaths of various people connected in one way or another with the assassination are linked together as being in some way related to the continuing cover-up by the conspirators. Keeley's fourth claim, that the truth behind an event being explained by a UCT be a typically well-guarded secret, is, as I pointed out above, much harder to justify now in a climate where most people apparently believe in the existence of such a conspiracy. But Keeley's fifth claim, that the chief tool here is errant data, remains true. The vast body of published evidence on the assassination has been picked over with remarkable care for signs of discrepancy and contradiction, signs which are regarded as providing the strongest evidence for such a conspiracy. What now seems to me to be an interesting development in these more paranoid UCTs, as I mention above, is the extent to which unerrant data--the evidence consistent with the official account--is now impeached as faked, rather than merely outweighed, by such conspiracy theories.
But how do these Kennedy assassination CTs rate against my own list of what I regard as critical thinking weaknesses? (A) An inability to weigh evidence properly. Here they score highly. Of particular importance is the inability to judge the reliability or lack thereof of eyewitness testimony, and an unwillingness or inability to discard evidence which does not fit. On the first point, most Kennedy CTs place high reliance on the small number of people who claimed at the time (and the somewhat larger number who claim now--see point (B) below) that they heard more than three shots fired in Dealey Plaza or that they heard shots fired from some other location than the Book Depository, both claims that if true would rule out the possibility of Oswald's acting alone. Since the overwhelming number of witnesses whose opinions have been registered did not hear more than three shots, and tended to locate the origin of these shots in the general direction of the Depository (which, in an acoustically misleading arena like Dealey Plaza, is perhaps the best that could be hoped for), the economical explanation is to assume, unless further evidence arises, that the minority here are mistaken. (A toy calculation below suggests how decisively the arithmetic favours the majority.) Since the assassination was an unexpected, rapid and emotionally laden event--all key features for weakening the reliability of observation, according to the Principles of Appraising Observations in Norris & King (1983)--it is only to be expected that there would be a significant portion of inconsistent testimony. The wonder here is that there is such a high degree of agreement over the basic facts. We find a similar misuse of observational principles in conspiratorial interpretations of the subsequent murder of Police Officer Tippit, where the majority of witnesses who clearly identified Oswald as the killer are downplayed in favour of the minority of witnesses--some at a considerable distance and all considerably surprised by the events unfolding in front of them--who gave descriptions of the assailant which did not match Oswald. Experienced police officers are used to eye-witness testimony of sudden and dramatic events varying considerably and, like all researchers faced with a large body of evidence containing discrepancies, must discard some evidence as worthless. Since Oswald was tracked almost continuously from the scene of Tippit's shooting to the site of his own arrest, and since forensic evidence linked the revolver found on Oswald to the shooting, the most economical explanation again is that the majority of witnesses were right in their identification of Oswald and the minority were mistaken. This problem of being unable to discard errant data is central to the creation of CTs since, as Keeley says: The role of errant data in UCTs is critical. The typical logic of a UCT goes something like this: begin with errant facts.... The official story all but ignores this data. What can explain the intransigence of the official story tellers in the face of this and other contravening evidence? Could they be so stupid and blind? Of course not; they must be intentionally ignoring it. The best explanation is some kind of conspiracy, an intentional attempt to hide the truth of the matter from the public eye. (Keeley 1999: 119) Such a view in the Kennedy case ignores the fact that the overwhelming amount of errant data on which CTs have been constructed, far from being hidden, was openly published in the 26 volumes of Warren Commission evidence.
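Before going on, the arithmetic behind that `economical explanation' is worth making explicit. The following sketch is a toy Bayesian calculation only: the witness counts are round figures in the spirit of the Dealey Plaza testimony, and the 0.8 reliability figure and the independence assumption are invented for illustration (real witnesses are neither independent nor equally reliable).

    def posterior_odds(prior_odds, n_majority, n_minority, p_correct=0.8):
        # Each report favouring a hypothesis multiplies the odds on it
        # by p/(1-p); each dissenting report divides them by the same
        # factor. ASSUMES independent, equally reliable witnesses.
        likelihood_ratio = (p_correct / (1 - p_correct)) ** (n_majority - n_minority)
        return prior_odds * likelihood_ratio

    # Starting from even odds, 40 reports locating the shots at the
    # Depository against 10 locating them elsewhere:
    odds = posterior_odds(prior_odds=1.0, n_majority=40, n_minority=10)
    print(f"posterior odds roughly {odds:.2e} to 1")  # about 1.15e+18 to 1

On any such model a modest individual reliability, aggregated over a large majority, swamps a dissenting minority many times over; treating the minority reports as the expected noise of sudden, emotionally laden observation is not complacency but the straightforwardly economical reading of the evidence.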
The open publication of this errant data has led to accusations that it was `hidden in plain view', but one can't help feeling that a more efficient conspiracy would have suppressed such inconvenient data in the first place. The standard position that errant data is likely to be false, that eye-witness testimony and memory is sometimes unreliable, that persisting pieces of physical evidence are preferable, etc., in short that Occam's Razor will insist on cutting and throwing away some of the data, is constantly rejected in Kennedy CT literature. Perhaps the most extravagant example of this, amounting almost to a Hegelian synthesis of assassination conspiracy theories, is Lifton (1980). Seeking to reconcile the major body of testimony that Kennedy was shot from behind with a small body of errant data that he possessed a wound in the front of his body, the author dedicates over 600 pages to the construction of the most baroque conspiracy theory imaginable. In Lifton's thesis, Kennedy was shot solely from the front, and then the conspirators gained access to his body during its journey back to Washington and were able to doctor it so that at the subsequent post mortem examination it showed signs of being shot only from the rear. Thus the official medical finding that Kennedy was only shot from the rear can be reconciled with the general CT belief that he was shot from the front (too) in a theory that seems to show that everybody is right. Apart from the massive complication of such a plan--clearly going against my point (F)--and its medical implausibility, such a thesis actually reverses Occam's Razor by creating more errant data than there was to start with. For if Kennedy was shot only from the front, we now need some explanation for why the great majority of over 400 witnesses at the scene believed that the shots were coming from behind him! And this challenge is one that is ducked by the great majority of CTs: if minority errant data is to be preferred as reliable, then we require some explanation for the presence of the majority data now being rejected. But Lifton at least got one thing right. In accounting for the title of his book he writes: The "best evidence" concept, impressed on all law students, is that when you seek to determine a fact from conflicting data, you must arrange the data according to a hierarchy of reliability. All data are not equal. Some evidence (e.g. physical evidence, or a scientific report) is more inherently error-free, and hence more reliable, than other evidence (e.g. an eye-witness account). The "best" evidence rules the conclusion, whatever volume of contrary evidence there may be in the lower categories.[10] Unfortunately Lifton takes this to mean that conspirators who were able to decide the nature of the autopsy evidence would thereby lay down a standard for judging or rejecting as incompatible the accompanying eye-witness testimony. But given the high degree of unanimity among eye-witnesses on this occasion, and given the existence of corroborating physical evidence (a rifle and cartridges forensically linked to the assassination were found in the Depository behind Kennedy, the registered owner of the rifle was a Depository employee, etc.), all that the alleged body-tampering could hope to achieve is to make the overall body of evidence more suspicious because more contradictory. Only if the body of reliable evidence was more or less balanced between a conspiratorial and non-conspiratorial explanation could this difficulty be avoided.
But it is surely over-estimating the powers, predictive and practical, of such a conspiracy to suppose that they could hope to guarantee this situation beforehand. (B) An inability to assess evidence corruption and contamination. Though, as I note above, such contamination of eye-witness testimony may occur contemporaneously, it is a particular problem for the more long-standing CTs. In the Kennedy case, many witnesses of the assassination who at the time gave accounts broadly consistent with the official explanation have subsequently amended or extended their accounts to include material that isn't so consistent. Witnesses, for instance, who at the time located all the shots as coming from the Book Depository subsequently gave accounts in which they located shots from other directions, most notably the notorious `grassy knoll', or later told of activity on the knoll which they never mentioned in their original statements. (Posner (1993) charts a number of these changes in testimony.) What is interesting about many of these accounts is that mundane explanations for these changes--I later remembered that..., I forgot to mention that...--tend to be eschewed in favour of more conspiratorial explanations. Such witnesses may deny that the signed statements made at the time accurately reflect what they told the authorities, or may say that the person interviewing them wasn't interested in writing down anything that didn't cohere with the official explanation of the assassination, and so on. Such explanations face serious difficulties. For one thing, since many of these statements were taken on the day of the assassination or very shortly afterwards, it would have to be assumed that putative conspirators already knew which facts would cohere with an official explanation and which wouldn't, which may imply an implausible degree of foreknowledge. A more serious problem is that these statements were taken by low-level members of the various investigatory bodies, police, FBI, Secret Service, etc.; to assert that such statements were manipulated by these people entails that they were members of the conspiracy. And this runs up against a practical problem for mounting conspiracies: the more people who are in a conspiracy, the harder it is going to be to enforce security (a point given toy numerical form below). A more plausible explanation for these changes in testimony might be that witnesses who provided testimony broadly supportive of the official non-conspiratorial explanation subsequently came into contact with some of the enormous quantity of media coverage suggesting less orthodox explanations and, consciously or unconsciously, have adjusted their recollections accordingly. The likelihood of such things happening after a sufficiently thorough exposure to alternative explanations may underlie Norris & King's principle II.1: An observation statement tends to be believable to the extent that the observer was not exposed, after the event, to further information relevant to describing it. (If the observer was exposed to such information, the statement is believable to the extent that the exposure took place close to the time of the event described.)[11] Their parenthesised time principle clearly renders a good deal of more recent Kennedy eye-witness testimony dubious after three and a half decades of exposure to vast amounts of further information in the mass media, not helped by `assassination conferences' where eye-witnesses have met and spoken with each other.
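The security point lends itself to a toy calculation. In the sketch below every number is an invented assumption (a uniform, independent chance per member per year of a leak, confession or blunder); the point is only the shape of the curve, which falls off exponentially in both the number of conspirators and the time the secret must be kept.

    # Toy model of conspiracy security. Assumes each of n members
    # independently has a small annual probability p_leak of exposing
    # the secret. All figures are illustrative assumptions, not
    # estimates about any actual case.

    def p_secret_survives(n_members, years, p_leak=0.01):
        """Probability that nobody leaks over the whole period."""
        return (1.0 - p_leak) ** (n_members * years)

    for n in (5, 50, 500):
        print(n, p_secret_survives(n, years=35))
    # 5   -> ~0.17    (a small cell has a fighting chance)
    # 50  -> ~2e-8    (a medium-sized plot is almost certain to leak)
    # 500 -> ~4e-77   (a coalition on the scale alleged cannot hold)

On any such assumptions, a conspiracy of the size alleged in the grander Kennedy CTs, silent for thirty-five years, is a practical impossibility; only very small teams could rationally expect their secret to hold.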
One outcome of points (A) and (B) is that, in the unlikely event of some living person being seriously suspected of involvement in the assassination, a criminal trial would be rendered difficult if not impossible. Such are the published discrepancies now within and between witnesses' testimonies that there would be enormous difficulties in attempting to render a plausibly consistent defence or prosecution narrative on their basis. (C) Misuse or outright reversal of a principle of charity. Where an event may have either a suspicious or an innocent explanation, and there is no significant evidence to decide between them, CTs invariably opt for the suspicious explanation. In part this is due to a feature deriving from Keeley's point (3) above, about CTs seeking to tie together seemingly unrelated events, but perhaps taken to a new level. Major CTs seek a maximally explanatory hypothesis, one which accounts for all of the events within its domain, and so they leave no room for the out of the ordinary event, the unlikely, the accident, which has no connection whatsoever with the conspiratorial events being hypothesised. The various Kennedy conspiracy narratives contain a large number of these events, dragooned into action on the assumption that no odd event could have an innocent explanation. There is no better example of this than the Umbrella Man, a character whose forcible inclusion in conspiratorial explanations demonstrates well how a determined attempt to maintain this reversed principle of charity may lead to the most remarkable deformities of rational explanation. When pictorial coverage of the assassination entered the public domain, in newspaper photographs within the next few days, and more prominently in stills from the Zapruder movie film of the events subsequently published in LIFE magazine, it became clear that one of the closest bystanders to the presidential limousine was a man holding a raised umbrella, and this at a time when it was clearly not raining. This odd figure rapidly became the focus of a number of conspiratorial hypotheses. Perhaps the most extreme of these originates with Robert Cutler (1975). According to Cutler, the Umbrella Man had a weapon concealed within the umbrella enabling him to fire a dart or flechette, perhaps drugged, into the president's neck, possibly for the purpose of immobilising him while the other assassins did their work. The only actual evidence to support this hypothesis is that the front of Kennedy's neck did indeed possess a small punctate wound, described by the medical team treating him as probably a wound of entrance but clearly explainable in the light of the full body of forensic evidence as a wound of exit for a bullet fired from above and behind the presidential motorcade. Consistent, in other words, with being the work of Oswald. There is no other supportive evidence for Cutler's hypothesis. (Cutler, of course, explains this in terms of the conspirators being able to control the subsequent autopsy and so conceal any awkward evidence; he thus complies with my principle (G) below.) More importantly, it seems inherently unlikely on other grounds. Since the Umbrella Man was standing on the public sidewalk, right next to a number of ordinary members of the public and in plain view of hundreds of witnesses, many of whom would have been looking at him precisely because he was so close to the president, it seems unlikely that a conspiracy could guarantee that he could get away with his lethal behaviour without being noticed by someone.
And the proposed explanation for all this rigmarole, the stunning of the target, is entirely unnecessary: most firearms experts agree that the president was a pretty easy target unstunned. If Cutler's explanation hasn't found general favour with the conspiracy community, another has, but this too has equally strange effects upon clear reasoning. The first version of this theory has the Umbrella Man signalling the presence of the target--movie-film of the assassination clearly shows that the raised umbrella is being waved or shaken. This hypothesis seems to indicate that the conspiracy had hired assassins who couldn't be relied upon to recognise the President of the United States when they saw him seated in his presidential limousine--the one with the president's flag on--next to the most recognisable first lady in American history. An apparently more plausible hypothesis is that it is the Umbrella Man who gives the signal for the team of assassins to open fire. (A version of this hypothesis can still be seen as late as 1992 in the movie JFK.) What I find remarkable here is that nobody seems to have thought this theory through at all. Firstly, the Umbrella Man is clearly on the sidewalk a few feet from the president while our alleged assassins are located high up in the Book Depository, in neighbouring buildings, or on top of the grassy knoll way to the front of the president. How, then, can he know what they can see from their different positions? How can he tell from his location that they now have clear shots at the target? (Dealey Plaza is full of trees, road signs and other obstructions, not to mention large numbers of police officers and members of the public who might be expected to get in the way of a clear view here.) And secondly, such an explanation actually weakens the efficiency of the alleged assassination conspiracy. (Here my limited boyhood experience of firing an air-rifle with telescopic sights finally comes in handy!) In order to make sense of the Umbrella Man as signaller, something like the following sequence of events must occur. Each rifleman focuses upon the presidential target through his telescopic sight, tracking the target as it moves at some ten to twelve miles per hour. Given the very narrow field of view of such sights, he cannot see the Umbrella Man. To witness the signal, he must keep taking his eye away from the telescopic sight, refocussing it until he can see the distant figure on the sidewalk, and when the signal is given, put his eye back to the sight, re-focus again, re-adjust the position of the rifle since the target has continued to move while he was not looking at it, and then fire. This is not an efficient recipe for accurate target-shooting. Oliver Stone eliminates some of these problems in the version he depicts in the movie JFK. Here each of his three snipers is accompanied by a spotter, equipped with walkie-talkie and binoculars. While the sniper focuses on the target, the spotter looks out for the signal from the Umbrella Man and then orally communicates the order to open fire. But now, given what I have already said about the problem with the Umbrella Man's location, it is hard to see what purpose he serves that could not be better served by the spotters. He drops out of the equation. He is, as Wittgenstein says somewhere, a wheel that spins freely because it is not connected to the rest of the machinery. Occam's Razor would cut him from the picture, but Occam is no firm favourite of UCT proponents.
In 1978, when the House Select Committee on Assassinations held public hearings on the Kennedy case, a Mr. Louie Steven Witt came forward to confess to being the Umbrella Man. He claimed that he had come to Dealey Plaza in order to barrack the president as he went past, and that he was carrying a raised umbrella because he had heard that, perhaps for some obscure reason connected with the president's father's stay in London as US Ambassador during the war, the Kennedy family had a thing about umbrellas. Witt hadn't come forward in the 15 years since the assassination because he had had no idea about the proposed role of the Umbrella Man in the case. This part of his explanation seems to me to be eminently plausible: those of us with an obsessive interest in current affairs find it hard to grasp just how many people never read the papers or watch TV news. There is something almost endearing about Witt, an odd character whose moment of public eccentricity seems to have mired him in decades of conspiratorial hypotheses without his realising it. Needless to say, conspiracy theorists did not accept Witt's testimony at face value. Some argued that he was a stooge put forward by the authorities to head off investigation into the real Umbrella Man, others that Witt himself must be lying to conceal a more sinister role in these events, though I know of no evidence to support either of these conclusions. What this story makes clear is that an unwillingness to dismiss discrepant events as unrelated, an unwillingness to abandon this reversed principle of charity whereby all such events are conspiratorial unless clearly proven otherwise, rapidly leads to remarkable mental gymnastics, to hypotheses that are excessively complex and even internally inconsistent. (The Umbrella Man as signaller makes the assassination harder to perform.) But, such are the ways of human psychology, once such an event has been firmly embedded within a sufficiently complex hypothesis, no amount of contradictory evidence would seem to be able to shift it. The Umbrella Man has by now been invested with such importance as to become one of the great myths of the assassination, against which mere evidentiary matters can have no effect. (D) The demonisation of persons and organisations. This weakness takes a number of forms in the Kennedy case, which I shall treat separately. (i) Guilt by reputation. The move from the fact that some body--the FBI, the CIA, the mafia, the KGB--has a proven record of wrong-doing in the past to the claim that they were capable of wrong-doing in the present case doesn't seem unreasonable. But the stronger claim that past wrong-doing is in some sense evidence for present guilt is much more problematic, particularly when differences between the situations are overlooked. This is especially true of the role of the CIA in Kennedy CTs. Senator Church's 1976 congressional investigation into the activities of US intelligence agencies provided clear evidence that in the period 1960-63 elements of the CIA, probably under the instructions of or at least with the knowledge of the White House, had conspired with Cuban exiles and members of organised crime to attempt the assassination of Cuban leader Fidel Castro. Evidence also emerged of CIA involvement in the deaths of other foreign leaders--Trujillo in the Dominican Republic, Lumumba in the Congo, etc.
These findings were incorporated in Kennedy CTs as evidence to support the probability that the CIA, or at least certain members of it, were also responsible for the death of Kennedy. Once an assassin, always an assassin? Such an argument neglects the fact that the CIA could reasonably believe that they were acting in US interests, possibly lawfully since they were acting under the guidance or instruction of the White House. This belief is not open to them in the case of killing their own president, a manifestly unlawful act and one hard to square with forwarding US interests. (Evidence that Soldier X willingly shoots at the soldiers of other countries when ordered to do so is no evidence that he would shoot at soldiers of his own country, with or without orders. The situations are plainly different.) At best the Church Committee evidence indicated that the CIA had the capacity to organise assassinations, not that it had either the willingness or the reason to assassinate its own leader. (ii) Guilt by association. This takes the form of impeaching the credibility of any member of a guilty organisation. Since both the FBI and the CIA (not to mention, of course, the KGB or the mafia) had proven track records of serious misbehaviour in this period, it is assumed that all members of these organisations, and all their activities, are equally guilty. Thus the testimony of an FBI agent can be impeached solely on the grounds that he is an FBI agent, and any activity of the CIA can be characterised as nefarious solely because it is being carried out by the CIA. Such a position ignores the fact that such organisations have many thousands of employees and carry out a wide range of mundane duties. It is perfectly possible for a member of such an organisation to be an honest and patriotic citizen whose testimony is as believable as anyone else's. Indeed, given my previous point that for security reasons the smaller the conspiratorial team the more likely it is to be successful, it would seem likely that the great majority of members of such organisations would be innocent of any involvement in such a plot. (I would hazard a guess that the same holds true of the KGB and the mafia, both organisations with a strong interest in security.) (iii) Exaggerating the power and nature of organisations. Repeatedly in such CTs we find the assumption that organisations like the CIA or the mafia are all-powerful, all-pervasive, capable of extraordinary foreknowledge and planning.[12] This assumption has difficulty in explaining the many recorded instances of inefficiency or lack of knowledge that these organisations constantly demonstrate. (There is a remarkable belief in conspiratorial circles, combining political and paranormal conspiracies, that the CIA has or had access to a circle of so-called `remote viewers', people with extra-sensory powers who were able through paranormal means to provide them with information about the activities of America's enemies that couldn't be discovered in any other way. Such a belief has trouble accommodating the fact that the CIA was woefully unprepared for the sudden break-up of the Soviet Union and Warsaw Pact, or the fact that America's intelligence organisations first learned of Iraq's invasion of Kuwait when Kuwaiti embassy employees looked out of the window and saw Iraqi tanks coming down the road!
Sadly, it appears to be true that people calling themselves remote viewers took very substantial fees from the CIA, though whether this tells us more about the gullibility of people in paranoid institutions or their carefree attitude towards spending public money I should not care to say.) The more extreme conspiracy theories may argue that such organisations are only pretending to be inefficient, in order to fool the public about the true level of their efficiency. Such a position is, as Popper would no doubt have pointed out, not open to refutation. (iv) Demonising individuals. As with organisations, so with people. Once plausible candidates for roles in an assassination conspiracy are identified, they are granted remarkable powers and properties, their wickedness clearly magnified. In Kennedy CTs there is no better example of this than Meyer Lansky, the mafia's `financial wizard'. Lansky was a close associate of America's premier gangster of the 1940s, Charles `Lucky' Luciano. Not actually a gangster himself (and, technically, not a member of the mafia either, since Lansky--as a Jew--could not join an exclusively Sicilian brotherhood), Lansky acted as a financial adviser. He organised gambling activities for Luciano and probably played a significant role in the mafia involvement in the development of Las Vegas, and in subsequent investments of the Luciano family's money, including those in pre-revolutionary Cuba, after Luciano's deportation to Sicily. So much is agreed. But Lansky in CT writing looms ever larger, as a man of remarkable power and influence, ever ready to use it for malign purposes, a vast and evil spider at the centre of an enormous international web, maintaining his influence with the aid of the huge sums of money which organised crime was reaping from its empire.[13] Thus there is no nefarious deed concerning the assassination or its cover-up with which Lansky cannot be linked. This picture wasn't dented in the least by Robert Lacey's detailed biography of Lansky published in 1991. Lacey, drawing upon a considerable body of publicly available evidence--not least the substantial body generated by Lansky's lawsuit to enable him, as a Jew, to emigrate to Israel--was able to show that Lansky, far from being the mob's eminence grise, was little more than a superannuated book-keeper. The arch manipulator, supposedly empowered by the mafia's millions, led a seedy retirement in poverty and was on record as being unable to afford healthcare for his relatives. The effect of reading Lacey's substantially documented biography is rather like that scene in `The Wizard of Oz' when the curtain is drawn back and the all-powerful wizard is revealed to be a very ordinary little man. The 1990s saw the publication of a remarkable amount of material about the workings of American organised crime, much of it gleaned from FBI and police surveillance during the successful campaign to imprison most of its leaders. This material reveals that mafia bosses tend to be characterised by a very limited vocabulary, a remarkable propensity for brutality and a considerable professional cunning often mixed with truly breath-taking stupidity. That they could organise a large-scale assassination conspiracy, and keep quiet about it for more than thirty-five years, seems even less likely. As I point out below, they would almost certainly not have wanted to. (E) The canonisation of persons or (more rarely) organisations. In the Kennedy case, this has taken the form of idealising the President himself.
In order to make a conspiratorial hypothesis look more plausible under (F) below, it is necessary to make the victim look as much as possible like a significant threat to the interests of the putative conspirators. In this case, Kennedy is depicted as a liberal politician, one who was a threat to established economic interests, one who took a lead in the contemporary campaign to end institutionalised discrimination against black people, and, perhaps most importantly, one who was or became something of a foreign policy dove, supporting less confrontational policies in the Cold War to the extent of being prepared to terminate US involvement in South Vietnam. This canonisation initially derives from the period immediately after the assassination, a period marked by the emergence of a number of works about the Kennedy administration from White House insiders like Theodore Sorensen, Pierre Salinger and the Camelot house historian, Arthur Schlesinger, works which tended to confirm the idealisation of the recently dead president, particularly when implicitly compared with the difficulties faced by the increasingly unpopular Lyndon Johnson. From the mid-1970s Kennedy's personal character came under considerable criticism, partly resulting from the publication of biographies covering his marriage and sexual life, and the personal lives of the Kennedy family. More important, for our purposes, was the stream of revelations which emerged from the congressional investigations of this time, which indicated the depth of feeling in the Kennedy White House about Cuba; most important here were the Church Committee's revelations that the CIA had conspired with members of organised crime to bring about the assassination of Fidel Castro. These, coming hard on the heels of the revelations of various criminal conspiracies within the Nixon White House, stoked up the production of CTs. (And provided a new motivation for the Kennedy assassination: that Castro or his sympathisers had found out about these attempts and had Kennedy killed in revenge.) But they also indicated that the Kennedy brothers were much harder cold war warriors than had perhaps previously been thought. The changing climate of the 1980s brought a new range of biographies and memoirs--Reeves, Parmet, Wofford, etc.--which situated Kennedy more firmly in the political mainstream. It became clear that he was not by any means an economic or social liberal--on the question of racial segregation he had to be pushed a lot, since he tended to regard the activities of Martin Luther King and others as obstructing his more important social policies. And Kennedy adopted a much more orthodox stance on the cold war than many had allowed: this was, after all, the candidate who got himself elected in 1960 by managing in the famous `missile gap' affair to appear tougher on communism than Richard Nixon, no mean feat. Famously, Kennedy adopted a more moderate policy during the Cuban missile crisis than some of those recommended by his military advisers, but this can be explained more in terms of Kennedy having a better grasp of the pragmatics of the situation than in terms of his being a foreign policy liberal of some sort.
This changing characterisation of Kennedy, this firm re-situating of his administration within the central mainstream of American politics--a mainstream which appears considerably to the right in European terms--has been broadly rejected by proponents of Kennedy assassination CTs (some of whom also reject the critical characterisation of his personal life). The reason for this is that it plainly undercuts any motivation for some part of the American political establishment to have Kennedy removed. It is unlikely that any of Kennedy's reforming policies, economic or social, could seriously have been considered such a threat to establishment interests. It is even more unlikely when one considers that much of Kennedy's legislative programme was seriously bogged down in Congress and was unlikely to be passed in anything but a heavily watered-down form during his term. Much of this legislation was forced through after the assassination by Kennedy's successor, Lyndon Johnson, a much more astute and experienced parliamentarian. The price for this social reform, though, was Johnson's continued adherence to the verities of cold war foreign policy over Vietnam. I leave consideration of Kennedy's Vietnam policy to the next section. (F) An inability to make rational or proportional means-end judgements. The major problem here for any Kennedy assassination CT is to come up with a motive. Such a motive must not only be of major importance to putative conspirators, it must also rationally justify a risky, expensive--and often astonishingly complicated--illegal conspiracy. That is to say, such conspirators must see the assassination as the only or best way of bringing about their aim. The alleged motives can be broadly divided into two categories. Firstly, revenge. Kennedy was assassinated in revenge for the humiliation he inflicted upon Premier Khrushchev over the Cuban missile crisis, or for plotting the assassination of Fidel Castro, or for double-crossing organised crime over alleged agreements made during his election campaign. The problem with each of these explanations is that the penalties likely to be suffered if one is detected far outweigh any rational benefits. Had Castro's hand been detected behind the assassination--something which Johnson apparently thought all too likely--this would inevitably have swung American public opinion behind a US military invasion of Cuba and the overthrow of Castro's rule. If Khrushchev had been identified as the ultimate source of the assassination, the international crisis would have been even worse, and could well have edged the world considerably closer to nuclear war than the Cuban missile crisis did. One can only make sense of such explanations on the basis of an assumption that the key conspirators are seriously irrational in this respect, and this is an assumption that we should not make without some clear evidence to support it. The second category of explanations for the assassination is instrumental: Kennedy was assassinated in order to further some specific policy or to prevent him from furthering some policy which the conspirators found anathema. Here candidates include: to protect Texas oil-barons' economic interests, to frustrate the Kennedy administration's judicial assault upon organised crime, to bring about a more anti-Castro presidency, and--the one that plays the strongest role in contemporary Kennedy CTs such as Oliver Stone's--to prevent an American withdrawal from Vietnam.
A proper response to the suggestion of any of these as a rational motive for the assassination should be to embark upon a brief cost-benefit analysis (a toy version is sketched after this paragraph). We have to factor in not only the actual costs of organising such a conspiracy (and, in the case of the more extreme Kennedy CTs, of maintaining it for several decades afterwards to engage in what has been by any standards a pretty inefficient cover-up) but also the potential costs to be faced if the conspiracy is discovered, the assassination fails, etc. Criminals by and large tend to be rather poor at estimating their chances of being caught; murder and armed robbery have very high clear-up rates compared to, say, burglary of unoccupied premises. The continued existence of professional armed robbers would seem to indicate that they underestimate their chances of being caught or don't fully appreciate the comparative benefits of other lines of criminal activity. But though assassination conspirators are by definition criminals, we are to assume here that they are figures in the establishment, professional men in the intelligence, military and political communities, and so likely to be more rational in their outlook than ordinary street criminals. (Though this is a defeasible assumption, since the post-war history of western intelligence agencies has indicated a degree of internal paranoia sometimes bordering on the insane. A substantial part of British intelligence, for instance, spent almost two decades trying to prove that the then head of MI5 was a Soviet agent, a claim that appears to have no credibility at all.) If we assume that the Mafia played such a role in an assassination conspiracy, it is still plausible to believe that they would consider the risks of failure. In fact, we have some evidence to support this belief since, though organised crime is by and large a very brutal institution, in the US--as opposed to the very different conditions prevailing in Italy--it maintains a policy of not attacking dangerous judges or politicians. When in 1935 the gangster Dutch Schultz proposed murdering Thomas Dewey, then a highly effective anti-crime prosecutor in New York and subsequently the Republican presidential candidate in 1948, the response of his colleagues was to have Schultz murdered rather than risk the troubles that Dewey's assassination would have brought down upon the heads of organised crime. An even more effective prosecutor, Rudolph Giuliani, remained unscathed throughout his career. Against the risks of being caught, we have to balance the costs of trying to achieve one's goal by some other less dramatic and probably more legal path. The plain fact is that there are a large number of legal and effective ways of changing a president's mind or moderating his behaviour. One can organise public campaigns, plant stories in the press, stimulate critical debate in congress, assess or manipulate public opinion through polls, etc. When the health care industry in the US wanted to defeat the Clinton administration's reform proposals, for instance, they didn't opt for assassination but went instead for a highly successful campaign to turn congress and substantial parts of public opinion against the proposals, which soon became dead in the water. On the specific case of American withdrawal from Vietnam, all of the above applies. In the first place, following on from (E) above, it can be plausibly argued that Kennedy had no such intention.
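The promised toy version of the cost-benefit analysis follows. Every probability and payoff below is an invented assumption, chosen only to display the structure of the comparison; the asymmetry that matters is that the illegal route carries a catastrophic penalty on detection while the legal route does not.

    # Illustrative expected-value comparison of two ways to change a
    # policy. Every probability and payoff here is an invented
    # assumption; the point is the structure, not the numbers.

    def expected_value(p_success, benefit, p_detected, penalty, cost):
        return p_success * benefit - p_detected * penalty - cost

    # Assassination plot: even if it succeeds, detection risks a
    # catastrophic penalty (prosecution, war, political ruin).
    plot = expected_value(p_success=0.5, benefit=100,
                          p_detected=0.3, penalty=10_000, cost=50)

    # Legal route: lobbying, press campaigns, congressional opposition.
    # Lower ceiling, but no catastrophic downside.
    lobby = expected_value(p_success=0.4, benefit=100,
                           p_detected=0.0, penalty=0, cost=10)

    print(plot, lobby)  # -3000.0 vs 30.0 under these assumptions

Under almost any assignment of numbers in which detection is possible and its penalty severe, the legal route dominates, which is the force of the observation that rational establishment conspirators had cheaper and safer options.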
Kennedy certainly on occasion floated the idea, sounding out people around him, but this is something that politicians do all the time as part of the process of weighing policy options, and it shouldn't be taken as evidence that he had settled on such an option. But to see Kennedy as seriously considering such an option is to see him as a figure considerably out of the Democratic mainstream. He would certainly have been aware of the effects that an Asian policy can have upon domestic matters; as a young congressman he would have been intimately aware of the effect that the fall of China to communism in 1949 had upon the last Democratic administration, severely weakening Harry Truman's effectiveness. For years afterwards the Democrats were regarded as the people who "lost China" despite the fact that there was nothing they could have done--short of an all-out war, like that occurring in Korea shortly afterwards, which couldn't possibly be won without the use of nuclear weapons and all that entails. Kennedy's administration had a much stronger presence in South Vietnam and it can reasonably be asked whether he would have wanted to run the risk of becoming the president who "lost Vietnam". He would also have been aware of the problem that ultimately faced Lyndon Johnson, that one could only maintain a forceful policy of domestic reform by mollifying congress over matters of foreign policy. The price for Johnson's Great Society reforms was a continued adherence to a policy of involvement in Vietnam, long after Johnson himself--fully aware of this bind--doubted the wisdom of this policy. Kennedy's domestic reforms were already in legislative difficulties; to believe that he was prepared to withdraw from Vietnam, then, is to believe that he was effectively abandoning his domestic programmes. (That Kennedy was alleged to be considering such an action in his second term, if re-elected, doesn't affect this point. He would still have been a lame-duck president, and would also have weakened the chances of any possible Democratic successor, something that would certainly have been of interest to other members of his party.) It thus appears unlikely that Kennedy would have seriously considered withdrawing completely from Vietnam. But if he had, a number of options were available to opponents of such a policy. Firstly, as noted above, they could have encouraged opposition to such a policy in congress and other important institutions, and among the American public. There was certainly a strongly sympathetic Republican and conservative Democrat presence in congress to form the foundations of such an opposition, as well as among newspaper publishers and other media outlets. If Kennedy had underestimated the domestic problems that withdrawal would cause him, such a campaign would have concentrated his mind upon them. And secondly, opponents could work to change Kennedy's mind. They could do this by controlling the information available to Kennedy and his advisers. In particular, military sources could manipulate the information flowing from Vietnam itself. (That Kennedy thought something like this was happening may be indicated by his insistence on sending civilian advisers to Vietnam to report back to him personally.) This policy worked well in Johnson's time--reports of the trivial events in the Gulf of Tonkin in 1964 were manipulated to indicate a serious crisis, which forced Johnson into inserting a heavy military presence into South Vietnam in response.
There is no reason to believe that such a policy would not have worked if Kennedy had still been in office. At the very least, it would be rational to adopt such a policy first, to try cheap, legal and probably efficient methods of bringing about one's goal before even contemplating such a dramatic, illegal and high-risk activity as assassination. (I omit here any consideration of the point that members of the American establishment might feel a moral revulsion at the idea of taking such action against their own president. Such a claim may well be true, but the argument from rationality does not require it.) At bottom what we face here is what we might term Goodenough's Paradox of Conspiracies: the larger or more powerful an alleged conspiracy, the less need it has for conspiring. A sufficiently large collection of members of the American political, intelligence and military establishment--the kind of conspiracy being alleged by Oliver Stone et al.--wouldn't need to engage in such nefarious activity, since they would have the kind of organisation, influence, access to information, etc. that could enable them to achieve their goal efficiently and legally. The inability noted in (F) to make adequate means-end decisions means that UCT proponents fail to grasp the force of this paradox. (G) Evidence against a UCT is always evidence for. The tendency of modern CTs has been to move from conspiracies which try to keep their nefarious activities secret to more pro-active conspiracies which go to a good deal of trouble to manufacture evidence either that there was a different conspiracy or that there was no conspiracy at all. This is especially true of Kennedy assassination CTs. The epistemological attitude of Kennedy CTs has changed notably over the years. In the period 1964-76 the central claim of such theories was that the evidence collected by the Warren Commission and made public, when fairly assessed, did not support the official lone assassin hypothesis but indicated the presence of two or more assassins and therefore a conspiracy. Public pressure in the aftermath of Watergate brought about a congressional investigation of the case. In its 1979 report the House Select Committee eventually decided, almost solely on the basis of subsequently discredited acoustic evidence, that there had indeed been a conspiracy. But more importantly, the committee's independent panels of experts re-examined the key evidence, photographic, forensic and ballistic, and decided that it supported the Warren Commission's conclusion. This led to a sea-change in CTs from 1980 onwards. Given the preponderance of independently verified `best evidence' supporting the lone assassin hypothesis, CT proponents began to argue that some or all of this evidence had been faked. This inevitably entailed a much larger conspiracy than had previously been hypothesised, one that not only assassinated the president but also was able to gain access to the evidence of the case afterwards in order to change it, suppress it or manufacture false evidence. They thus fell foul of (F) above. And since such CTs must now rest on much weaker evidence--eye-witness testimony and so on--they tend to fall foul of (A), (B) and (C) as well. One problem with such CTs was that they tended to disagree with one another over which evidence had been faked.
Thus many theorists argued that the photographic and X-ray record of the presidential post mortem had been tampered with to conceal evidence of conspiracy, while Lifton (1980), as we saw, argued that the record was genuine but the body itself had been tampered with. Other theorists, e.g. Fetzer and his collaborators, argue that the X-rays indicate a conspiracy while the photographs do not, implying that the photographs have been tampered with. This latter, widespread belief introduces a new contradiction into the case, since it posits a conspiracy of tremendous power and organisation, able to gain access to the most important evidence of the case, yet one which is careless or stupid enough not to make sure that the evidence it leaves behind is fully consistent. (And, of course, it goes against the verdict of the House Committee's independent panel of distinguished forensic scientists and radiographers that the record of the autopsy was genuine, and consistent, both internally and with the hypothesis that Oswald alone was the assassin.) Of particular interest here is the Zapruder movie film of the assassination. Stills from this film were originally published, in the Warren Report and in the press, to support the official lone assassin hypothesis. When a bootleg copy of this film surfaced in the mid-1970s it was taken as significant evidence against the official version, and most CTs since then have relied upon one interpretation or another of this film for support. But now that it is clear, especially since better copies of the film have become available, that the wounds Kennedy suffers in the film do not match those hypothesised by those CT proponents arguing for the falsity of the autopsy evidence, some of these proponents now claim to detect signs that the Zapruder film itself has been faked, and there has been much discussion about the chain of possession of this film in the days immediately after the assassination to see if there is any possibility of its being in the hands of someone who could have tampered with it. What is happening here is that epistemologically these CTs are devouring their own tails. If the evidence that was originally regarded as foundational for proving the existence of a conspiracy is now itself impeached, then this ought to undermine the original conspiracy case. If no single piece of evidence in the case can be relied upon then we have no reason for believing anything at all, and the abyss of total scepticism yawns. Interestingly, there seems to be a complete lack of what I termed above `meta-evidence', that is, actual evidence that any of this evidence has been faked. Reasons for believing in this forgery hypothesis tend to fall into one of three groups. (i) It is claimed that some sign of forgery can be detected in the evidence itself. Since much of this evidence consists of poor quality film and photographs taken at the assassination scene, these have turned into blurred Rorschach tests where just about anything can be seen if one squints long and hard enough. In the case of the autopsy X-rays, claims of apparent fakery tend to be made by people untrained in radiography and the specialised medical skill of reading such X-rays. (ii) Forgery is hypothesised to explain some alleged discrepancy between two pieces of evidence. Thus when differences are claimed to exist between the autopsy photographs and the X-rays, it is alleged that one or the other (or both) have been tampered with.
(iii) Forgery is hypothesised in order to explain away evidence that is clearly inconsistent with the proposed conspiracy hypothesis. An interesting case of the latter involves the so-called `backyard photos', photographs supposedly depicting Oswald standing in the yard of his house and posing with his rifle, pistol and various pieces of left-wing literature. For Oswald himself was confronted with these by police officers after his arrest and claimed then that they had been faked--he had had some employment experience in the photographic trade and claimed to know how easily such pictures could be faked. And ever since then CT proponents have made the same claims. But one problem with such claims is that evidence seldom exists in a vacuum, but is interconnected with other evidence. Thus we have the sworn testimony of Oswald's wife that she took the photographs, the evidence of independent photographic experts that the pictures were taken with Oswald's camera, documentary evidence in his own handwriting that Oswald ordered the rifle in the photos and was the sole hirer of the PO box to which it was delivered, eyewitness evidence that Oswald possessed such a rifle and that one of these photos had been seen prior to the assassination, and so on. To achieve any kind of consistency with the forgery hypothesis, all of this evidence must itself be faked or perjured. Thus the forgery hypothesis inevitably ends up impeaching the credibility of such a range of evidence that a conspiracy of enormous proportions and efficiency is entailed, a conspiracy which runs into the problems raised in (F) above. These problems are so severe that the forgery hypothesis must be untenable without the existence of some credible meta-evidence, some proof that acts of forgery took place. Without such meta-evidence, all we have is an unjustifiable attempt to convert evidence against a conspiracy into evidence for it, merely on the grounds that the evidence doesn't fit the proposed CT--which is an example of (A) too. (H) The fallacy of the spider's web. This form of reasoning has been central to many of the conspiratorial works about the JFK assassination: indeed, Duffy (1988) is entitled The Web! Scott (1977) was perhaps the first full-length work in this tradition. It concentrates on drawing links between Oswald and the people he came into contact with, and the murky worlds of US intelligence, anti-Castro Cuban groups and organised crime, eventually linking in this fashion the world of Dealey Plaza with that of the Watergate building and the various secret activities of the Nixon administration. Such a project is indeed an interesting one, one which enlightens us considerably about the world of what Scott terms `parapolitics'. It is made especially easy by the fact that Oswald in his short life had at least tangential connections with a whole range of suspicious organisations, including the CIA, the KGB, pro- and anti-Castro Cuban groups, the US Communist Party and other leftist organisations, organised crime figures in New Orleans and Texas, and so on. And considerable webs can be drawn outwards, from Oswald's contacts to their contacts, and so on. As I say, such research is intrinsically interesting, but the fallacy occurs when it is used in support of a conspiracy theory. For all that it generates is suspicion, not evidence. That Oswald knew X or Y is evidence only that he might have had an opportunity to conspire with them, and doesn't support the proposition that he did.
The claim is even weaker for people that Oswald only knew at second or third or fourth hand. And some of these connections are much less impressive than authors claim: that Oswald knew people who ultimately knew Meyer Lansky becomes much less interesting when, as I noted in (D) above, Lansky is seen as a much more minor figure than the almost omnipotent organised crime kingpin he is often depicted as. Ultimately this fallacy depends upon a kind of confusion between quantity and quality, one that seems to believe that a sufficient quantity of suspicion inevitably metamorphoses into something like evidence. There is, as the old saying has it, no smoke without fire, and surely such an inordinate quantity of smoke could only have been produced by a fire of some magnitude. But thirty years of research haven't found much in the way of fire, only more smoke. Some of the more outrageous CTs here have been discredited--inasmuch as such CTs can ever be discredited--and the opening of KGB archives in recent years, together with access to living KGB personnel, has shown that Oswald's contacts with that organisation were almost certainly innocent. Not only is there no evidence that Oswald ever worked for the KGB, but those KGB officers who monitored Oswald closely during his two-year stay in the USSR were almost unanimously of the opinion that he was too unbalanced to be an employee of any intelligence organisation. But a problem with suspicion is that it cannot be easily dispelled. Since web-reasoning never makes clear exactly what the nature of Oswald's relationship with his various contacts was, it is that much harder to establish the claim that they were innocent. Ultimately, this can only be done negatively, by demonstrating the sheer unlikeliness of Oswald being able to conspire with anyone. The ample evidence of the sheer contingency of Oswald's presence in the Book Depository on the day of the assassination argues strongly against his being part of a conspiracy to kill the president. Whether in fact he was a part of some other conspiracy, as some authors have argued, is an interesting question but one not directly relevant to assassination CTs. (I) The classic logical fallacy of post hoc ergo propter hoc. This applies to all those assassination CTs which seek to establish some motive for Kennedy's death from some alleged events occurring afterwards. The most dramatic of these, as featured in Oliver Stone's film, is the argument from America's disastrous military campaign in Vietnam: US military involvement escalated after Kennedy's death, therefore it happened because of Kennedy's death, therefore Kennedy's death was brought about in order to cause an increased American presence in Vietnam. The frailty of this reasoning is obvious. As I pointed out in (F) above, such a view attributes to the proposed conspirators a significant inability to match ends and means rationally. In addition there is no end to the possible effects that can be proposed here. Ultimately everything that is regarded as immoral about modern America can be traced back to the assassination. As I pointed out in a recent lecture, what motivates this view is: a desire for a justification of a view of America as essentially a benign and divinely inspired force in the world, a desire held in the face of American sin in Vietnam and elsewhere.
There are plausible explanations for Vietnam and Watergate in terms of the domination of post-war foreign policy by cold-war simplicities, the growth of executive power at the expense of legislative controls, and so on. They are, for those not interested in political science, dull explanations. Above all, they do not provide the emotional justification of conspiratorial explanations. To view Vietnam as the natural outcome of foreign policy objectives of the cold-war establishment, of a set of attitudes shared by both Republican and Democrat, above all to view it as the express wish of the American people--opinion polls registered majority support for the war until after the Tet disaster in 1968--is ultimately to view Vietnam as the legitimate and rational outcome of the American system at work. A quasi-religious view of America as `the city on the hill', the place where God will work out his purpose for men, cannot afford to entertain these flaws. Hence the appeal of an evil conspiracy on which these sins can be heaped.[14] Underlying this reasoning, then, is an emotional attachment to a view of America as fundamentally decent combined with a remarkable ignorance about the real nature of politics. All of the features of America's history after 1963 that can be used as a possible motive for the assassination can be equally or better explained in terms of the ordinary workings of US politics. Indeed many of them, including the commitment to Vietnam and the aggressively murderous attitude towards Castro's Cuba, can be traced to Kennedy's White House and earlier. Though CT proponents often proclaim their commitment to realism and a hard-headed attitude towards matters, it seems clear that their reliance upon this kind of reasoning is motivated more by emotion than by facts.

5. Conclusions

The accusation is often made that conspiracy theorists, particularly of the more extreme sort, are crazy, or immature, or ignorant. This response to UCTs may be at least partly true but does not make clear how CT thinking is going astray. What I have tried to show is how various weaknesses in arguing, assessing evidence, etc. interact to produce not just CTs but unwarranted CTs. A conspiratorial explanation can be the most reasonable explanation of a set of facts, but where we can identify the kinds of critical thinking problems I have outlined here, a CT becomes increasingly unwarranted. Apart from these matters logical and epistemological, it seems to me that there is also an interesting psychological component to the generation of UCTs. Human beings possess an innate pattern-seeking mechanism, imposing order and explanation upon the data presented to us. But this mechanism can be too sensitive, and we start to see patterns where there are none, leading to a refusal to recognise the sheer amount of contingency and randomness in the world. Perhaps, as Keeley says, "the problem is a psychological one of not recognizing when to stop searching for hidden causes".[15] Seeing meaning where there is none leads to seeing evidence where there is none: a combination of evidential faults reinforces the view that our original story, our originally perceived pattern, is correct--a pernicious feedback loop which reinforces the belief of the UCT proponent in their own theory.
And here criticism cannot help, for the criticism--and indeed the critic--become part of the pattern, part of the problem, part, indeed, of the conspiracy.[16] Conspiracy theories are valuable, like any other type of theory, for there are indeed conspiracies. We want to find a way to preserve all that is useful in the CT as a way of explaining the world while avoiding the UCT which at worst slides into paranoid nonsense. I agree with Keeley that there can be no exact dotted line along which Occam's Razor can be drawn here. Instead, we require a greater knowledge of the thinking processes which underlie CTs and the way in which they can offend against good standards of critical thinking. There is no way to defeat UCTs; the more entrenched they are, the more resistance to disproof they become. Like some malign virus of thinking, they possess the ability to turn their enemies' powers against them, making any supposedly neutral criticism of the CT itself part of the conspiracy. It is this sheer irrefutability that no doubted irritated Popper so much. If we cannot defeat UCTs through refutation then perhaps the best we can do is inoculate against them by a better development of critical thinking skills. These ought not to be developed in isolation--it is a worrying feature of this field that many otherwise critical thinkers become prone to conspiracy theorising when they move outside of their own speciality--but developed as an essential prerequisite for doing well in any field of intellectual endeavour. Keeley concludes that there is nothing straightforwardly analytic that allows us to distinguish between good and bad conspiracy theories... The best we can do is track the evaluation of given theories over time and come to some consensus as to when belief in the theory entails more scepticism than we can stomach.[17] Discovering whether or to what extent a particular CT adheres to reasonable standards of critical thinking practice gives us a better measure of its likely acceptability than mere gastric response, while offering the possibility of being able to educate at least some people against their appeal, as potential consumers or creators of unwarranted conspiracy theories. BIBLIOGRAPHY Blakey, G. Robert & Billings, Richard (1981) Fatal Hour -The Plot to Kill the President, N.Y.:Berkeley Publishing Cutler, Robert (1975) The Umbrella Man, Manchester, Mass.: Cutler Designs Donovan, Robert J.(1964) The Assassins, N.Y.: Harper Books Duffy, James. R. (1988) The Web, Gloucester: Ian Walton Publishing Eddowes, Michael (1977) The Oswald File, N.Y.: Ace Books Fetzer, James (ed.) (1997) Assassination Science, Chicago, IL: Open Court Publishing Fisher, Alec & Scriven, Michael (1997) Critical Thinking - Its Definition and Assessment, Norwich: Centre for Critical Thinking, U.E.A. Hofstadter, Richard P. (1964) The Paranoid Style in American Politics, London: Jonathan Cape Hume, David (1748) Enquiry Concerning Human Understanding, ed. by P.H. Nidditch 1975, Oxford: Oxford University Press. Keeley, Brian L. (1999) `Of Conspiracy Theories', Journal of Philosophy 96, 109-26. Lacey, Robert (19901) Little Man, London: Little Brown Lifton, David (1980) Best Evidence, London: Macmillan. 2nd ed. 1988 N.Y.: Carroll & Graf Norris, S.P. & King, R. (1983) Test on Appraising Observations, St Johns Newfoundland: Memorial University of Newfoundland. Norris, S.P. & King, R. (1984) `Observational ability: Determining and extending its presence', Informal Logic 6, 3-9. Oglesby, Carl (1976) The Yankee-Cowboy War , 2nd ed. 
Pigden, Charles (1995) `Popper Revisited, or What Is Wrong With Conspiracy Theories?', Philosophy of the Social Sciences 25, 3-34
Popkin, Richard H. (1966) The Second Oswald, London: Sphere Books
Popper, Karl (1945) The Open Society and its Enemies, 5th ed. 1966, London: Routledge
Posner, Gerald (1993) Case Closed, N.Y.: Random House
Scheim, David E. (1983) Contract On America, Silver Spring, Maryland: Argyle Press
Scott, Peter Dale (1977) Crime and Cover-Up, Berkeley, Cal.: Westworks
Stone, Jim (1991) Conspiracy of One, Fort Worth, TX: Summit Group
Stone, Oliver & Sklar, Zachary (1992) JFK - The Movie, New York: Applause Books
Thompson, Josiah (1967) Six Seconds in Dallas, 2nd ed. 1976, N.Y.: Berkley Publishing
Wilson, Robert Anton (1989) `Beyond True and False', in Schultz, T. (ed.) The Fringes of Reason, New York: Harmony

______________________

[1] And this even though professional philosophers may themselves engage in conspiracy theorising! See, for instance, Popkin (1966), Thompson (1967) or Fetzer (1998) for examples of philosophers writing in support of conspiracy theories concerning the JFK assassination.

[2] See Donovan (1964) for more on this.

[3] Historians, it seems, still disagree about whether or to what extent Princip's group was being manipulated.

[4] And the most extreme UCT I know manages to combine this with both ufology and satanism CTs, in David Icke's ultimate paranoid fantasy, which explains every significant event of the last two millennia in terms of the sinister activities of historical figures who share the blood-line of reptilian aliens who manipulate us for their purposes, using Jews, freemasons, etc. as their fronts. Those interested in Mr. Icke's more specific allegations (which I omit here at least partly out of a healthy regard for Britain's libel laws) are directed to his website, http://www.davidicke.com/.

[5] See Norris & King (1983, 1984) for full details of and support for these principles.

[6] I don't propose to argue for my position here. Interested readers are pointed in the direction of Posner (1993), a thorough if somewhat contentious anti-conspiratorial work whose fame has perhaps eclipsed the less dogmatic but equally anti-conspiratorial Stone (1991).

[7] One of the first of which, from the charmingly palindromic Revilo P. Oliver, is cited by Hofstadter. Oliver, a member of the John Birch Society, which had excoriated Kennedy as a tool of the Communists throughout his presidency, asserted that it was international Communism which had murdered Kennedy in order to make way for a more efficient tool! Right-wing theories blaming either Fidel Castro or Nikita Khrushchev continued at least into the 1980s: see, for instance, Eddowes (1977).

[8] And probably not possible! The sheer complexity of the assassination CT community and the number of different permutations of alleged assassins have grown enormously, especially over the last twenty years. In particular, the number of avowedly political CTs is hard to determine, since they fade into other areas of CT, in particular those dealing with the influence of organised crime and those dealing with an alleged UFO cover-up, not to mention those even more extreme CTs which link the assassination to broader conspiracies of international freemasonry etc.

[9] See not only the movie but also Stone & Sklar (1992), a heavily annotated version of the film's script which also includes a good deal of the published debate about the film, for and against.
[10] Lifton 1980: 132

[11] Norris & King (1983), quoted in Fisher & Scriven (1997).

[12] For a remarkable instance of the exaggeration of the power of organised crime in the US and its alleged role in Kennedy's death, see Scheim (1983) or, perhaps more worryingly, Blakey & Billings (1981). I say `more worryingly' because Blakey was Chief Counsel for the congressional investigation into Kennedy's death which reported in 1979, and so presumably is heavily responsible for the direction that investigation took.

[13] This view of Lansky is widespread throughout the Kennedy literature. See, for instance, Peter Dale Scott's short book (1977), which goes into Lansky's alleged connections in great detail.

[14] From "(Dis)Solving the Kennedy Assassination", presented to the Conspiracy Culture Conference at King Alfred's College, Winchester, in July 1998.

[15] Keeley 1999: 126

[16] Anyone who doubts this should try to argue for Oswald as lone assassin on an internet discussion group! It is not just that one is regarded as wrong or naive or ignorant. One soon becomes accused of sinister motives, of being a witting or unwitting agent of the ongoing disinformation exercise to conceal the truth. (I understand that much the same is true of discussions in ufology fora.)

[17] Keeley 1999: 126

_______________________________________________
paleopsych mailing list
paleopsych at paleopsych.org
http://lists.paleopsych.org/mailman/listinfo/paleopsych

From ljohnson at solution-consulting.com Sun Dec 11 18:09:46 2005
From: ljohnson at solution-consulting.com (Lynn D. Johnson, Ph.D.)
Date: Sun, 11 Dec 2005 11:09:46 -0700
Subject: [Paleopsych] NYT Mag: Laptop That Will Save the World, The
In-Reply-To: 
References: 
Message-ID: <439C6B6A.2090703@solution-consulting.com>

This is truly visionary, a technical breakthrough that can change the world. Thanks for shari