From shovland at mindspring.com Sun May 1 15:42:36 2005 From: shovland at mindspring.com (Steve Hovland) Date: Sun, 1 May 2005 08:42:36 -0700 Subject: [Paleopsych] Finding may unlock secret to nerve growth factor Message-ID: <01C54E29.BA756A90.shovland@mindspring.com> http://news-service.stanford.edu/news/medical/2004/may26/nerve.html Researchers' discovery could pave way for development of drugs that alter nerve growth By MITZI BAKER Cells communicate through an intricate system of locks and keys -- receptors on cell surfaces and ligand molecules -- that allow the transmission of very specific information across their membranes. Researchers at the School of Medicine have just discovered an unexpected new type of lock-and-key mechanism that provides a critical step in reproducing nerve growth factor, crucial to all aspects of nerve formation and function. The information revealed can now be directly applied to design a drug to treat neurodegenerative conditions such as Alzheimer's disease or spinal cord injuries. Nerve growth factor, or NGF, is one of the most important molecules in the nervous system, said Chris Garcia, PhD, assistant professor of microbiology and immunology and of structural biology. NGF and its family members called neurotrophins not only control the development of the nervous system in the embryo but also the maintenance of nervous tissue and neural transmission in the adult. Researchers Xiao-lin He (left) and Chris Garcia sit in front of a computer screen that shows an electron density map created by X-ray imaging that helped derive the structure of nerve growth factor. Their research "unlocks" an important step in reproducing the growth factor, which plays a critical role in nerve formation and function. Photo: Mitzi Baker NGF plays a role in many nervous system problems such as neural degeneration in aging, Alzheimer's disease and neural regeneration in spinal cord injuries and other damage to neural tissue. It also may factor into mood and other psychological disorders. NGF's fundamental importance in the nervous system, Garcia said, made it a compelling puzzle to try to solve in his lab, which focuses broadly on how information is communicated across membranes using receptors and ligands, the locks and keys of molecular biology. Interestingly, he added, one of the receptors for NGF is also used by the rabies virus to gain entry into cells, stimulating interest in their lab which has a focus on molecules involved in infection and immunity. "A lot of companies have tried for many years to make a drug out of NGF and it just hasn't worked very well because basically no one has really known what the mechanisms are for receptor activation," he said. "I think the significance of our result is that now we have an atomic model of this system that begins to clarify a lot of the confusing functional data." Garcia and a postdoctoral scholar in his lab, Xiao-lin He, PhD, published their findings of the three-dimensional structure of NGF bound to its receptor earlier this month in Science. The main question that hadn't been answered until now is how a molecule with two symmetrical parts like NGF could simultaneously activate two different receptors on its surface -- called p75 and Trk -- required for its signal. The question that had been a conundrum for researchers in neurobiology for 15 years was "how does NGF specifically select one of each type of receptor instead of two of the same?" 
"No matter what we found, we knew that it was going to be new and unprecedented," said Garcia. In a mechanism that could be right out of the world of "Harry Potter," the key inserted into one of the locks morphs such that the shape of the combined parts then fits with another type of lock. Garcia and He discerned this unusual feature of the interaction by using X-ray imaging techniques confirmed by biochemical methods. "The result was a complete surprise," said He, who has been studying the NGF signaling system for about a year. He explained that since NGF is composed of two identical chains of protein, it would be logical that it binds the two identical chains of the p75 receptor. But it only attaches to one chain. The researchers found that after NGF connects with one of the p75 protein chain, it changes shape such that a second receptor of the same kind cannot fit. What that does, said Garcia, is allow the other NGF receptor, Trk, to bind on the other side and form a three-way signaling complex. Garcia said neurobiology researchers are also surprised by the finding, which has caused controversy about its meaning. Garcia and He's detailed structural data can now be used by others in the field as a template for further experiments. "Our data is going to stimulate a lot of science to figure out what its significance is," Garcia said. In terms of the straightforward goal of creating a drug that simulates or blocks the actions of NGF binding to its receptors, Garcia said, "It's all there. We've got it. What a drug company needs is in that structure right now and they don't need to know anything else." This research is supported by a fellowship from the Paralyzed Veterans of America, Spinal Cord Research Foundation; the American Heart Association; the Christopher Reeve Paralysis Foundation; the Keck Foundation; and the National Institutes of Health. From shovland at mindspring.com Sun May 1 16:37:08 2005 From: shovland at mindspring.com (Steve Hovland) Date: Sun, 1 May 2005 09:37:08 -0700 Subject: [Paleopsych] Hormones of the Hypothalamus Message-ID: <01C54E31.5899B6C0.shovland@mindspring.com> http://users.rcn.com/jkimball.ma.ultranet/BiologyPages/H/Hypothalamus.html The hypothalamus is a region of the brain. It secretes a number of hormones. Thyrotropin-releasing hormone (TRH) Gonadotropin-releasing hormone (GnRH) Growth hormone-releasing hormone (GHRH) Corticotropin-releasing hormone (CRH) Somatostatin Dopamine All of these are released into the blood, travel immediately to the anterior lobe of the pituitary <../P/Pituitary.html>, where they exert their effects. All of them are released in periodic spurts. In fact, replacement hormone therapy with these hormones does not work unless the replacements are also given in spurts. Two other hypothalamic hormones: Antidiuretic hormone (ADH) and Oxytocin travel in neurons <../N/N.html> to the posterior lobe of the pituitary where they are released into the circulation. Link to diagram of the endocrine glands <../E/Endocrines.gif> (92K) Thyrotropin-releasing hormone (TRH) TRH is a tripeptide (GluHisPro). When it reaches the anterior lobe of the pituitary it stimulates the release there of thyroid-stimulating hormone <../P/Pituitary.html> (TSH) prolactin <../P/Pituitary.html> (PRL) Gonadotropin-releasing hormone (GnRH) GnRH is a peptide of 10 amino acids. Its secretion at the onset of puberty triggers sexual development. 
Effects of GnRH:
Primary effects: FSH and LH up.
Secondary effects: estrogen and progesterone up (in females); testosterone up (in males).

After puberty, a hyposecretion of GnRH may result from intense physical training or from anorexia nervosa. Synthetic agonists of GnRH are used to treat inherited or acquired deficiencies of GnRH secretion, and also prostate cancer. In the latter case, high levels of the GnRH agonist reduce the number of GnRH receptors in the pituitary, which reduces its secretion of FSH and LH, which reduces the secretion of testosterone, which reduces the stimulation of the cells of the prostate.

Growth hormone-releasing hormone (GHRH)
GHRH is a mixture of two peptides, one containing 40 amino acids, the other 44. As its name indicates, GHRH stimulates cells in the anterior lobe of the pituitary to secrete growth hormone (GH).

Corticotropin-releasing hormone (CRH)
CRH is a peptide of 41 amino acids. As its name indicates, it acts on cells in the anterior lobe of the pituitary to release adrenocorticotropic hormone (ACTH). CRH is also synthesized by the placenta and seems to determine the duration of pregnancy. It may also play a role in keeping the T cells of the mother from mounting an immune attack against the fetus.

Somatostatin
Somatostatin is a mixture of two peptides, one of 14 amino acids, the other of 28. Somatostatin acts on the anterior lobe of the pituitary to inhibit the release of growth hormone (GH) and to inhibit the release of thyroid-stimulating hormone (TSH). Somatostatin is also secreted by cells in the pancreas and in the intestine, where it inhibits the secretion of a variety of other hormones.

Dopamine
Dopamine is a derivative of the amino acid tyrosine. Its principal function in the hypothalamus is to inhibit the release of prolactin (PRL) from the anterior lobe of the pituitary.

Antidiuretic hormone (ADH) and Oxytocin
These peptides are released from the posterior lobe of the pituitary and are described in the page devoted to the pituitary.

From shovland at mindspring.com Sun May 1 16:41:06 2005 From: shovland at mindspring.com (Steve Hovland) Date: Sun, 1 May 2005 09:41:06 -0700 Subject: [Paleopsych] peptide hormones Message-ID: <01C54E31.E6883F60.shovland@mindspring.com>

Although it might be possible to control all the functions of the brain using only a handful of hormones and neurotransmitters, the body has developed instead a hierarchy of systems of considerably greater complexity. This is an artifact rather than a necessity, i.e., "it is what it is." Peptides are long chains or polymers of amino acids, which are just small molecules with one positive end and one negative end. When they link, this electrostatic energy is converted into a chemical bond. Peptides undergo secondary structure transformations in their free state, twisting around themselves to minimize surface tension, an important term in the total surface free energy. This flexibility enables a much larger variety of forms than could be derived from nucleic acid polymers, whose double helix significantly limits their conformational variety. In addition, since they are composed of amino acids, peptides can contain and position highly polar or reactive residues.
These reactive portions are normally hydrophilic and as such are contained on the outer portion of the coiled peptide, where they can act most effectively on other entities. These two facts make proteins ideal as structures for enzymes. The body can fine-tune the structure, and therefore the chemical activity, of enzymes by changing the genetic coding which produces them; and proteins can alter genetic function by regulating transcription, turning genes on and off, and by enzymatically inhibiting or promoting synthesis of other peptides coded by the DNA. Other peptides - the immunoglobulins - are responsible for recognizing non-self material (antigens) such as invading microbes by protuberances on their outer surfaces, and as such are key to the function of the immune system. Finally, and most obviously, proteins comprise a large proportion of the physical structure of the body, as collagen, clathrin, etc. Some diseases are directly related to mutated forms of proteins, resulting from mutated genes; still others, such as mad cow disease and Alzheimer's, result from improper folding of peptides to form non-water-soluble protein deposits - amyloid - whose residues have their hydrophobic regions directed outward.

http://dwb.unl.edu/Teacher/NSF/C10/C10Links/www.pharmcentral.com/peptides.htm

From shovland at mindspring.com Mon May 2 00:23:29 2005 From: shovland at mindspring.com (Steve Hovland) Date: Sun, 1 May 2005 17:23:29 -0700 Subject: [Paleopsych] A rich resource on cellular receptors Message-ID: <01C54E72.7EBCD530.shovland@mindspring.com>

http://www.answers.com/main/ntquery;jsessionid=3ss6ltts54wlo?method=4&dsid=2222&dekey=Receptor+%28biochemistry%29&gwp=8&curtab=2222_1&sbid=lc03b

receptor (biochemistry)
In biochemistry, a receptor is a protein on the cell membrane or within the cytoplasm that binds to a specific factor (a ligand), such as a neurotransmitter, hormone, or other substance, and initiates the cellular response to the ligand. As all receptors are proteins, their structure is encoded in the DNA. Most receptor genes contain a short sequence that signals to the cell whether the receptor needs to be transported to the cell membrane or is to remain in the cytoplasm.

Overview
Many genetic disorders involve hereditary defects in receptor genes. Often, it is hard to determine whether the receptor is nonfunctional or the hormone is produced at a decreased level; this gives rise to the "pseudo-hypo-" group of endocrine disorders, where there appears to be a decreased hormonal level while in fact it is the receptor that is not responding sufficiently to the hormone.

From checker at panix.com Mon May 2 16:20:37 2005 From: checker at panix.com (Premise Checker) Date: Mon, 2 May 2005 12:20:37 -0400 (EDT) Subject: [Paleopsych] BH: Happiness Is Good Biological Functioning Message-ID:

Happiness Is Good Biological Functioning
http://www.betterhumans.com/Print/index.aspx?ArticleID=2005-04-18-3
Betterhumans Staff 4/18/2005 4:04 PM

[Photo: smiling girl. Credit: Wolfgang Lienbacher. Caption: Healthy smile: Happier people are healthier people, suggests a new study]

Happiness, apparently, is good biological functioning. So suggests a study by researchers from [8]University College London, who found that happier people have several markers of a healthy body, such as those relating to the cardiovascular system and those controlling hormone levels. "It has been suggested that positive affective states are protective, but the pathways through which such effects might be mediated are poorly understood," the researchers say.
"Here we show that positive affect in middle-aged men and women is associated with reduced neuroendocrine, inflammatory, and cardiovascular activity." Key systems In 200 middle-aged Londoners, [9]Andrew Steptoe and colleagues found that participants who reported more everyday happiness had better biological function in a few key systems. Happier people, for example, had lower levels of the stress hormone [10]cortisol, which has been linked to such conditions as type 2 diabetes and hypertension. They also had lower responses to stress in plasma [11]fibrinogen levels. In high concentrations, fibrinogen can indicate a risk of coronary heart disease. Happy men also had lower heart rates over the day and evening, suggesting that they had good cardiovascular health. Steptoe and colleagues found that their results were independent of psychological distress, implying that good well-being is directly linked to health-related biological processes. The research is reported in the [12]Proceedings of the National Academy of Sciences ([13]read abstract). References 8. http://www.ucl.ac.uk/ 9. http://www.ucl.ac.uk/epidemiology/staff/steptoe.html 10. http://www.wikipedia.org/wiki/cortisol 11. http://www.wikipedia.org/wiki/fibrinogen 12. http://www.pnas.org/ 13. http://www.pnas.org/cgi/doi/10.1073/pnas.0409174102 From checker at panix.com Mon May 2 16:20:52 2005 From: checker at panix.com (Premise Checker) Date: Mon, 2 May 2005 12:20:52 -0400 (EDT) Subject: [Paleopsych] WSJ: Divorce increasing at 20%/year in China Message-ID: Divorce increasing at 20%/year in China http://online.wsj.com/article/0,,SB111333549406604929,00.html?mod=todays_us_page_one April 13, 2005 By KATHY CHEN Staff Reporter of THE WALL STREET JOURNAL April 13, 2005; Page A1 NANCHANG, China -- China's success is tearing the Fan family apart. Qun, a successful 39-year-old entrepreneur in Beijing, bought his parents a new apartment and takes them sightseeing in other Chinese cities. But he feels he has little in common with them any more and less to say to them. His younger brother Jun, 37, is reeling over a divorce after his wife left him to pursue opportunities in southern China. He is unemployed after a failed business venture and has been living with his parents for more than a year. At a loss over how to deal with his family's situation, patriarch Fan Hanlin often retreats to his bedroom, usurped in his role of respected elder. His older son's social standing outstrips his, and his younger son ignores his advice. Mr. Fan's wife escapes by playing mah-jongg each afternoon with friends. "Everyone is unhappy," says the elder Mr. Fan, 70. For thousands of years, Chinese have made the family paramount, with generations often living together, and younger members deferring to their elders. Fathers were the head of the household. But opportunities born of China's move to a market-based economy over the past two dozen years are creating new wealth, new hierarchies and new strains. The scramble to keep up with neighbors, or one's own relatives, is testing family ties, contributing to a rise in social problems. Some 1.6 million couples divorced in China last year, a 21% jump over the year before, according to China's Ministry of Civil Affairs. In Beijing, there were 800 reported cases of domestic violence in 2004, double the number the previous year, according to the city's Bureau of Justice. Younger Chinese are opting for privacy over extended-family living and buying parents their own apartments. 
Others are putting their aging parents in nursing homes, as convenience trumps filial piety, an unheard-of violation of Confucian ethics. Over the past decade, the number of nursing-home residents has increased 40% to more than one million. In their room at the Beijing Fifth Social Welfare Institution, a state-run nursing home, one elderly couple explained why they live there. The Wangs, who declined to be identified by their full names, say they moved to Beijing after retiring to be closer to their daughter. But they found it too lonely living in the high-rise apartment she bought them. "We didn't want to live with her because...in a market economy, competition is very fierce and children have no extra time or energy to take care of their parents," says Ms. Wang, 75. The Wangs say they like it at the institution, because there is more socializing. They note their daughter, a securities-company executive, has cut her once weekly visits to holidays and phone calls. "She has no time," says Ms. Wang. Parents of young children are leaving their offspring in the care of relatives for years, as they seek better jobs far from home. Millions of peasants have left their rural homes for work in cities, while some professionals are going abroad. The trend is spawning China's own generation of latchkey children, numbering in the tens of millions. Last September, a 16-year-old, whose mother had left their village to work in the city, and two friends robbed several dozen students at knifepoint in the city of Daye, say local authorities. The teen was among several dozen youngsters from a nearby village with at least one parent working elsewhere. Many had dropped out of school. "These kids are like land mines who could explode any moment," says village chief Hu Yunyan. One of last year's most-viewed television series in China was "Chinese-Style Divorce." It focuses on a doctor whose marriage unravels after he goes to work at a higher-paying hospital, backed by foreigners, to satisfy his wife's demands for a better life. The show has spawned a best-selling book and pages of commentary and columns in newspapers. The show's producer, Zhu Zhibing, says he came up with the idea for the series after observing how the race to get ahead in China is eroding family relationships. "Everyone is focused on making money," he says. "It destabilizes society." In the Fan household, life followed traditional guidelines when the children were growing up. Mr. Fan, head of the household, taught physics at a high school and his wife, Luo Shuzheng, was an engineer in a state-run factory. They had three children -- two boys and a girl -- who excelled at school, and tested into prestigious universities. Like other Chinese children, the Fans were expected to obey their father without question. "We required [the children] to sit still and didn't let them fool around," Mr. Fan says. Usually, just raising his voice was enough, but Mr. Fan says sometimes he hit the boys. He still recollects with pride how, after he hit his younger son in an effort to improve his study habits, the boy scored so well on college-entrance exams that he ranked among the top in Nanchang County. The Fans were a tight-knit clan. Qun, the oldest, looked after his two younger siblings while their parents were at work. Many nights, their mother stayed up mending clothes and making cloth shoes for the children. Sundays were a rush of shopping, cooking and housework. Neither Mr. 
Fan's nor his wife's parents lived with them, but the couple set aside part of their small income each month to give to their parents. Like generations of Chinese, Mr. Fan and his wife, Ms. Luo, envisioned a life driven by filial duties for their own children: study hard, find a stable job, get married, produce offspring (preferably male) and support their parents in old age. But in the late 1980s and early 1990s, when the Fan children were graduating from college, China's economic reforms were opening up all sorts of new opportunities, in job choices, lifestyles and ways to get rich. Qun jumped at the chance to do something different. Bucking the trend among college graduates at the time to take a state-sector job, he opted for a marketing position in a joint-venture company of SmithKline Beecham, now GlaxoSmithKline PLC. He learned about the pharmaceutical business and Western marketing techniques, befriended American colleagues and helped the company successfully launch its Contac brand of cold medicine in China. In 1996, he started his own consulting company, advising drug companies on doing business in China. Today, he says, his company employs a dozen people, including his wife, Li Chunhui. The business generates annual revenue of more than $1.2 million, he says, and aftertax profit of 15% to 20% of revenue. As Qun prospered, the distance widened between him and his family in Nanchang, a city of 4.5 million nearly 800 miles from his new home in Beijing. He shares few details of his life in the city with his parents. They don't understand his business, he says, and wouldn't necessarily approve of his lifestyle. He and his wife each drive their own car, dine out frequently and retain two housekeepers, including one just to look after their pet Pekinese. This year, they moved to a two-story house in a wealthy suburb. His parents don't have a car or a housekeeper, rarely eat out and take pride in saving. "If my parents saw me spending this kind of money, I'd be embarrassed," Qun says, as he and his wife grab a coffee at a Starbucks shop on the way home from the office. He notes the few hundred dollars they spend each month on such luxuries as fresh beef, imported cookies, dog sweaters and a sitter for their dog is more than his parents' monthly income. In the past, Chinese families revolved around fathers and sons. But like other younger-generation Chinese, Qun views his first allegiance as with his "xiao jiating," or small family unit centered around his marriage. The couple have resisted suggestions by Qun's parents for them to have a baby. Instead, Qun's wife insists their Pekinese dog, "Wrong Wrong," should be recognized as a "grandson," proclaiming the dog's surname to be "Fan." Qun says this offended his mother, and she once complained that he should find a wife "who listens more." In traditional China, a son would have quietly accepted such criticism. But Qun says he told his mother that if she had a problem with his wife, she should tell her directly. "My main family is with Linda," he says, using his wife's English name. Like many of today's middle-class Chinese, the couple also use Western names, which is seen in certain circles as a sign of being modern and sophisticated. Qun also goes by the name of James. Qun's mother says she doesn't care how they raise their dog, but "you still need to have a child. How will you get by when you are old? Dogs can't take care of you." These days, Qun sees his parents only a few times a year. 
Conversations tend to be about Nanchang friends or family, since his parents have different views on other subjects. "They always complain about how the government is unfair and society is unjust," says Qun. "I try to influence them...but I think they'll never catch up to my way of thinking." Min, the youngest Fan sibling, also resisted her parents' traditional expectations. Ms. Luo says she hoped her daughter, after graduation from college, would return to Nanchang. Instead, Min settled in the southern city of Guangzhou, where she is married, has a child and works at a major Chinese insurance company. Jun, the middle sibling, is still figuring out his place in the new China. Growing up, he says the constant message from his parents and school was: "If you just listen, you'll be successful and society will take care of you." After college, he accepted a job at the local subsidiary of the state-run China National Machinery & Equipment Import & Export Corp. Working as a trader, he exported engines to other Asian countries and the U.S. and earned more than $10,000 in bonuses, he says. But after more than a decade on the job, his salary changed little, totaling about $200 a month. In 1993, Jun married a fellow office worker, Zhu Yifang. Their employer provided them with a small apartment and they had a son in 1995. After the baby was born, Jun's mother spent long stretches of time living with the couple, a traditional Chinese practice. But Ms. Zhu says she resented her mother-in-law's presence, which she regarded as interference. "It would have been better if the older generation didn't live with us," Ms. Zhu says. "But I couldn't refuse," she says. Unlike wives in pre-reform China, Ms. Zhu could walk away, with more freedom and job opportunities. In 1999, she went to southern China to work as the sales agent for a construction-material company, leaving her husband to care for their son, then 4 years old. In 2000, the couple divorced. Today, Ms. Zhu lives in Nanchang with a boyfriend and earns more than $500 a month, she says, teaching English at a university and running her own English class. After the divorce, in 2001, Qun offered his younger brother a job at his company in Beijing to market a vitamin supplement and oversee a handful of employees. Jun quit his state-sector job and accepted. But he felt uncomfortable leaving his son in his parents' care, he says. He couldn't get used to Beijing or his new job. He had a hard time persuading retailers to buy the vitamin supplement, and after a year, the venture had lost more than $60,000. He says the company didn't spend enough to promote the product. Qun says his younger brother "approached the job like he was still at a state-run company...He got up in the morning, drank a cup of tea, and then did only what I told him to." He says his brother "often complains and finds excuses....We live in different worlds." Jun says that by his older brother's standards, "I haven't succeeded...but the goals he chooses are different from mine." He thinks his brother "doesn't necessarily like what he does, but he wants to earn money." His own goal in life, Jun says, is to first be a good father. Jun returned home and last year moved in with his parents. That is a reversal of Chinese tradition, in which grown offspring typically provide for their parents. Sitting on the apartment patio on a recent day, Jun sipped tea from a beer mug and pondered his future. 
"I haven't thought through a lot of things, like how to raise my kid, how to be a model parent and how to live with my parents," Jun says. "I'm just considering the question, 'How successful should I be?' People drive a [Mercedes] Benz; I don't have a car." Mr. Fan and Jun often squabble over how to raise Jun's son, now 9 years old. "I tell [Jun] his son should go to sleep at 9 p.m. or he'll be tired at school. But I talk and no one listens," says Mr. Fan. Jun says that his father "has lived this long, but doesn't know what family is. You need to show love to your kid, but [the elder Mr. Fan] doesn't express his emotions." After initially rejecting his brother's suggestion that he look for a job outside of Nanchang, Jun recently had a change of heart. He says he plans to visit Shanghai to explore an opportunity to work for a trading company there. Economic changes have given people in China more money, but are also causing "more pressure" Jun says. "Some contradictions always existed in our family," he says, "but when life was simple, we just lived with them." ---- Cui Rong and Qiu Haixu contributed to this article. Write to Kathy Chen at kathy.chen at wsj.com1 http://online.wsj.com/article/0,,SB111333549406604929,00.html From checker at panix.com Mon May 2 16:21:13 2005 From: checker at panix.com (Premise Checker) Date: Mon, 2 May 2005 12:21:13 -0400 (EDT) Subject: [Paleopsych] NYT: 'The World Is Flat': Global Playing Field: More Level, but It Still Has Bumps Message-ID: 'The World Is Flat': Global Playing Field: More Level, but It Still Has Bumps http://www.nytimes.com/2005/04/30/books/30stig.html [Sunday Book Review and first chapter appended. I already posted an NYT Magazine article by Friedman.] April 30, 2005 BOOKS OF THE TIMES | 'THE WORLD IS FLAT' Global Playing Field: More Level, but It Still Has Bumps By JOSEPH E. STIGLITZ THE WORLD IS FLAT: A Brief History of the 21st Century By Thomas L. Friedman. 488 pp. Farrar, Straus & Giroux. $27.50. The world is flat, or at least becoming flatter very quickly, Thomas L. Friedman says in his exciting and very readable account of globalization. In this flat new world, there is a level (or at least more level) playing field in which countries like India and China, long marginalized in the global economy, are able to compete. And while Mr. Friedman, a Pulitzer Prize-winning columnist for The New York Times, celebrates the new vistas opening up for these countries, he describes forcefully the challenges globalization presents for the older industrialized nations - especially the United States. America is still the global leader in science and technology, but its dominance is eroding. As Mr. Friedman points out in "The World Is Flat," Asian countries now produce eight times as many bachelor's degrees in engineering as the United States; the proportion of foreign-born Ph.D.'s in the American science and engineering labor force has risen to 38 percent; and federal financing for research in physical and mathematical sciences and engineering as a share of gross domestic product declined by 37 percent from 1970 to 2004. About a third of "The World Is Flat" is devoted to describing the forces of leveling - from the fall of the Berlin Wall, which eliminated the ideological divide separating much of the world, to the rise of the Internet and technological changes that have led to new models of production and collaboration, including outsourcing and offshore manufacturing. 
The rest of the book is devoted to exploring the implications of this flattening, both for the advanced industrial countries and the developing world. In truth, Mr. Friedman's major points would come across more strongly if his 488 pages were edited more tightly. But he provides a compelling case that something big is going on. I was in Bangalore, India, in January 2004 - just a month before Mr. Friedman - visiting Infosys, one of India's new leading high-technology companies. I, too, was bowled over by what I saw: "campuses" more modern than anything I had seen on the West Coast, and business leaders as dynamic and thoughtful as anywhere in the world. It may be true that fears of outsourcing have been exaggerated: there are only a limited number of radiologists, software programmers and back-office people whose jobs can be performed at a distance. But I side with Mr. Friedman: the integration of some three billion people into the global economy is a big deal. Even if only a limited number of American jobs are lost, the new competition will have striking effects, particularly on the wages of unskilled workers. While free trade may ultimately make every country better off, not every individual will be better off. There are winners and there are losers; and while, in principle, the winners could compensate the losers, that typically does not happen. Among other things, a flatter world means a less flat America - more inequality. The playing field may be getting more level, but not everyone is equipped to play on it. On that same trip to India, I spent more than half my time in the countryside surrounding Bangalore, where traveling 10 miles was like traveling back 2,000 years. Peasants were farming as their ancestors must have. What has enabled Bangalore to become a high-tech success story is that companies like Infosys have removed themselves from what is going on nearby. They communicate directly by satellite with the United States, and in a place where local newspapers list the number of brownouts the previous day, these companies can have their own sources of power. And while new technologies may close the gap between parts of India and China and the advanced industrial countries, they will also increase the gap between those countries and Africa. Mr. Friedman is right that there are forces flattening the world, but there are other forces making it less flat. At issue is the balance between them. So is the world really much flatter than before? For instance, the new technologies that Mr. Friedman praises as levelers have also given rise to new opportunities for monopolization. Mr. Friedman praises Netscape's leveling role: its browser has really helped to put a world of knowledge and information at each person's doorstep (or computer). But Microsoft was able to use its own market power through control of computer operating systems to effectively replace Netscape with its own browser, Internet Explorer. While Microsoft speaks eloquently of the need to reward innovation, the real rewards are often not reaped by the innovators. In addition, the underlying research for major developments like the Internet and Web browsers is expensive. Large, rich countries can pay for it; poor, small ones cannot. Mr. Friedman notes, but does not emphasize as much as he might, the important role played by government in financing such research before allowing private entrepreneurs to bring the actual products to market - and make the profits. 
American companies have a distinct advantage in benefiting from government-financed research, even though there are crumbs (some quite large) that those around the world can pick up. Meanwhile, the new "rules of the game" that were part of the last round of global trade negotiations - notably intellectual property regulations requiring all countries to adopt American-style patent and copyright laws - are almost surely making the playing field less level. They will make it easier for those who are ahead of the game to maintain their lead. One mark of a great book is that it makes you see things in a new way, and Mr. Friedman certainly succeeds in that goal. The world may not yet be flat, but there is no doubt that there are important forces - some leveling, some the opposite - that are changing its shape in critical ways. And in his provocative account, Mr. Friedman suggests what this brave new world will mean to all of us, in both the developed and the developing worlds. Joseph E. Stiglitz, university professor at Columbia University, won the Nobel Prize in Economics in 2001. ---------------- Sunday Book Review > 'The World Is Flat': The Wealth of Yet More Nations http://www.nytimes.com/2005/05/01/books/review/01ZAKARIA.html May 1, 2005 By FAREED ZAKARIA THE WORLD IS FLAT: A Brief History of the Twenty-First Century. By Thomas L. Friedman. 488 pp. Farrar, Straus & Giroux. $27.50. OVER the past few years, the United States has been obsessed with the Middle East. The administration, the news media and the American people have all been focused almost exclusively on the region, and it has seemed that dealing with its problems would define the early decades of the 21st century. ''The war on terror is a struggle that will last for generations,'' Donald Rumsfeld is reported to have said to his associates after 9/11. But could it be that we're focused on the wrong problem? The challenge of Islamic terrorism is real enough, but could it prove to be less durable than it once appeared? There are some signs to suggest this. The combined power of most governments of the world is proving to be a match for any terror group. In addition, several of the governments in the Middle East are inching toward modernizing and opening up their societies. This will be a long process but it is already draining some of the rage that undergirded Islamic extremism. This doesn't mean that the Middle East will disappear off the map. Far from it. Terrorism remains a threat, and we will all continue to be fascinated by upheavals in Lebanon, events in Iran and reforms in Egypt. But ultimately these trends are unlikely to shape the world's future. The countries of the Middle East have been losers in the age of globalization, out of step in an age of free markets, free trade and democratic politics. The world's future -- the big picture -- is more likely to be shaped by the winners of this era. And if the United States thought it was difficult to deal with the losers, the winners present an even thornier set of challenges. This is the implication of the New York Times columnist Thomas L. Friedman's excellent new book, ''The World Is Flat: A Brief History of the Twenty-First Century.'' The metaphor of a flat world, used by Friedman to describe the next phase of globalization, is ingenious. It came to him after hearing an Indian software executive explain how the world's economic playing field was being leveled. 
For a variety of reasons, what economists call ''barriers to entry'' are being destroyed; today an individual or company anywhere can collaborate or compete globally. Bill Gates explains the meaning of this transformation best. Thirty years ago, he tells Friedman, if you had to choose between being born a genius in Mumbai or Shanghai and an average person in Poughkeepsie, you would have chosen Poughkeepsie because your chances of living a prosperous and fulfilled life were much greater there. ''Now,'' Gates says, ''I would rather be a genius born in China than an average guy born in Poughkeepsie.'' The book is done in Friedman's trademark style. You travel with him, meet his wife and kids, learn about his friends and sit in on his interviews. Some find this irritating. I think it works in making complicated ideas accessible. Another Indian entrepreneur, Jerry Rao, explained to Friedman why his accounting firm in Bangalore was able to prepare tax returns for Americans. (In 2005, an estimated 400,000 American I.R.S. returns were prepared in India.) ''Any activity where we can digitize and decompose the value chain, and move the work around, will get moved around. Some people will say, 'Yes, but you can't serve me a steak.' True, but I can take the reservation for your table sitting anywhere in the world,'' Rao says. He ended the interview by describing his next plan, which is to link up with an Israeli company that can transmit CAT scans via the Internet so that Americans can get a second opinion from an Indian or Israeli doctor, quickly and cheaply. What created the flat world? Friedman stresses technological forces. Paradoxically, the dot-com bubble played a crucial role. Telecommunications companies like Global Crossing had hundreds of millions of dollars of cash -- given to them by gullible investors -- and they used it to pursue incredibly ambitious plans to ''wire the world,'' laying fiber-optic cable across the ocean floors, connecting Bangalore, Bangkok and Beijing to the advanced industrial countries. This excess supply of connectivity meant that the costs of phone calls, Internet connections and data transmission declined dramatically -- so dramatically that many of the companies that laid these cables went bankrupt. But the deed was done, the world was wired. Today it costs about as much to connect to Guangdong as it does New Jersey. The next blow in this one-two punch was the dot-com bust. The stock market crash made companies everywhere cut spending. That meant they needed to look for ways to do what they were doing for less money. The solution: outsourcing. General Electric had led the way a decade earlier and by the late 1990's many large American companies were recognizing that Indian engineers could handle most technical jobs they needed done, at a tenth the cost. The preparations for Y2K, the millennium bug, gave a huge impetus to this shift since most Western companies needed armies of cheap software workers to recode their computers. Welcome to Bangalore. A good bit of the book is taken up with a discussion of these technological forces and the way in which business has reacted and adapted to them. Friedman explains the importance of the development of ''work flow platforms,'' software that made it possible for all kinds of computer applications to connect and work together, which is what allowed seamless cooperation by people working anywhere. 
''It is the creation of this platform, with these unique attributes, that is the truly important sustainable breakthrough that has made what you call the flattening of the world possible,'' Microsoft's chief technology officer, Craig J. Mundie, told Friedman. Friedman has a flair for business reporting and finds amusing stories about Wal-Mart, UPS, Dell and JetBlue, among others, that relate to his basic theme. Did you know that when you order a burger at the drive-through McDonald's on Interstate 55 near Cape Girardeau, Mo., the person taking your order is at a call center 900 miles away in Colorado Springs? (He or she then zaps it back to that McDonald's and the order is ready a few minutes later as you drive around to the pickup window.) Or that when you call JetBlue for a reservation, you're talking to a housewife in Utah, who does the job part time? Or that when you ship your Toshiba laptop for repairs via UPS, it's actually UPS's guys in the ''funny brown shorts'' who do the fixing? China and India loom large in Friedman's story because they are the two big countries benefiting most from the flat world. To take just one example, Wal-Mart alone last year imported $18 billion worth of goods from its 5,000 Chinese suppliers. (Friedman doesn't do the math, but this would mean that of Wal-Mart's 6,000 suppliers, 80 percent are in one country -- China.) The Indian case is less staggering and still mostly in services, though the trend is dramatically upward. But Friedman understands that China and India represent not just threats to the developed world, but also great opportunities. After all, the changes he is describing have the net effect of adding hundreds of millions of people -- consumers -- to the world economy. That is an unparalleled opportunity for every company and individual in the world. Friedman quotes a Morgan Stanley study estimating that since the mid-1990's cheap imports from China have saved American consumers over $600 billion and probably saved American companies even more than that since they use Chinese-sourced parts in their production. And this is not all about cheap labor. Between 1995 and 2002, China's private sector has increased productivity at 17 percent annually -- a truly breathtaking pace. Friedman describes his honest reaction to this new world while he's at one of India's great outsourcing companies, Infosys. He was standing, he says, ''at the gate observing this river of educated young people flowing in and out. . . . They all looked as if they had scored 1600 on their SAT's. . . . My mind just kept telling me, 'Ricardo is right, Ricardo is right.' . . . These Indian techies were doing what was their comparative advantage and then turning around and using their income to buy all the products from America that are our comparative advantage. . . . Both our countries would benefit. . . . But my eye kept . . . telling me something else: 'Oh, my God, there are just so many of them, and they all look so serious, so eager for work. And they just keep coming, wave after wave. How in the world can it possibly be good for my daughters and millions of other young Americans that these Indians can do the same jobs as they can for a fraction of the wages?' '' He ends up, wisely, understanding that there's no way to stop the wave. You cannot switch off these forces except at great cost to your own economic well-being. 
Over the last century, those countries that tried to preserve their systems, jobs, culture or traditions by keeping the rest of the world out all stagnated. Those that opened themselves up to the world prospered. But that doesn't mean you can't do anything to prepare for this new competition and new world. Friedman spends a good chunk of the book outlining ways that America and Americans can place themselves in a position to do better. People in advanced countries have to find ways to move up the value chain, to have special skills that create superior products for which they can charge extra. The UPS story is a classic example of this. Delivering goods doesn't have high margins, but repairing computers (and in effect managing a supply chain) does. In one of Friedman's classic anecdote-as-explanation shticks, he recounts that one of his best friends is an illustrator. The friend saw his business beginning to dry up as computers made routine illustrations easy to do, and he moved on to something new. He became an illustration consultant, helping clients conceive of what they want rather than simply executing a drawing. Friedman explains this in Friedman metaphors: the friend's work began as a chocolate sauce, was turned into a vanilla commodity, through upgraded skills became a special chocolate sauce again, and then had a cherry put on top. All clear? Of course it won't be as easy as that, as Friedman knows. He points to the dramatic erosion of America's science and technology base, which has been masked in recent decades by another aspect of globalization. America now imports foreigners to do the scientific work that its citizens no longer want to do or even know how to do. Nearly one in five scientists and engineers in the United States is an immigrant, and 51 percent of doctorates in engineering go to foreigners. America's soaring health care costs are increasingly a burden in a global race, particularly since American industry is especially disadvantaged on this issue. An American carmaker pays about $6,000 per worker for health care. If it moves its factory up to Canada, where the government runs and pays for medical coverage, the company pays only $800. Most of Friedman's solutions to these kinds of problems are intelligent, neoliberal ways of using government in a market-friendly way to further the country's ability to compete in a flat world. There are difficulties with the book. Once Friedman gets through explicating his main point, he throws in too many extras -- perhaps trying to make that chocolate sundae -- making the book seem slightly padded. The process of flattening that he is describing is in its infancy. India is still a poor third-world country, but if you read this book you would assume it is on the verge of becoming a global superstar. (Though as an Indian-American, I read Friedman and whisper the old Jewish saying, ''From your lips to God's ears.'') And while this book is not as powerful as Friedman's earlier ones -- it is, as the publisher notes, an ''update'' of [1]''The Lexus and the Olive Tree'' -- its fundamental insight is true and deeply important. In explaining this insight and this new world, Friedman can sometimes sound like a technological determinist. And while he does acknowledge political factors, they get little space in the book, which gives it a lopsided feel. I would argue that one of the primary forces driving the flat world is actually the shifting attitudes and policies of governments around the world. 
From Brazil to South Africa to India, governments are becoming more market-friendly, accepting that the best way to cure poverty is to aim for high-growth policies. This change, more than any other, has unleashed the energy of the private sector. After all, India had hundreds of thousands of trained engineers in the 1970's, but they didn't produce growth. In the United States and Europe, deregulation policies spurred the competition that led to radical innovation. There is a chicken-and-egg problem, to be sure. Did government policies create the technological boom or vice versa? At least one can say that each furthered the other. The largest political factor is, of course, the structure of global politics. The flat economic world has been created by an extremely unflat political world. The United States dominates the globe like no country since ancient Rome. It has been at the forefront, pushing for open markets, open trade and open politics. But the consequence of these policies will be to create a more nearly equal world, economically and politically. If China grows economically, at some point it will also gain political ambitions. If Brazil continues to surge, it will want to have a larger voice on the international stage. If India gains economic muscle, history suggests that it will also want the security of a stronger military. Friedman tells us that the economic relations between states will be a powerful deterrent to war, which is true if nations act sensibly. But as we have seen over the last three years, pride, honor and rage play a large part in global politics. The ultimate challenge for America -- and for Americans -- is whether we are prepared for this flat world, economic and political. While hierarchies are being eroded and playing fields leveled as other countries and people rise in importance and ambition, are we conducting ourselves in a way that will succeed in this new atmosphere? Or will it turn out that, having globalized the world, the United States had forgotten to globalize itself? Fareed Zakaria, the editor of Newsweek International and author of ''The Future of Freedom,'' is the host of a new current affairs program on public television, Foreign Exchange. References 1. http://www.nytimes.com/books/99/04/25/reviews/990425.25joffet.html -------------- First Chapter: 'The World Is Flat' http://www.nytimes.com/2005/05/01/books/chapters/0501-1st-friedman.html By THOMAS L. FRIEDMAN No one ever gave me directions like this on a golf course before: "Aim at either Microsoft or IBM." I was standing on the first tee at the KGA Golf Club in downtown Bangalore, in southern India, when my playing partner pointed at two shiny glass-and-steel buildings off in the distance, just behind the first green. The Goldman Sachs building wasn't done yet; otherwise he could have pointed that out as well and made it a threesome. HP and Texas Instruments had their offices on the back nine, along the tenth hole. That wasn't all. The tee markers were from Epson, the printer company, and one of our caddies was wearing a hat from 3M. Outside, some of the traffic signs were also sponsored by Texas Instruments, and the Pizza Hut billboard on the way over showed a steaming pizza, under the headline "Gigabites of Taste!" No, this definitely wasn't Kansas. It didn't even seem like India. Was this the New World, the Old World, or the Next World? I had come to Bangalore, India's Silicon Valley, on my own Columbus-like journey of exploration. 
Columbus sailed with the Niña, the Pinta, and the Santa María in an effort to discover a shorter, more direct route to India by heading west, across the Atlantic, on what he presumed to be an open sea route to the East Indies-rather than going south and east around Africa, as Portuguese explorers of his day were trying to do. India and the magical Spice Islands of the East were famed at the time for their gold, pearls, gems, and silk-a source of untold riches. Finding this shortcut by sea to India, at a time when the Muslim powers of the day had blocked the overland routes from Europe, was a way for both Columbus and the Spanish monarchy to become wealthy and powerful. When Columbus set sail, he apparently assumed the Earth was round, which was why he was convinced that he could get to India by going west. He miscalculated the distance, though. He thought the Earth was a smaller sphere than it is. He also did not anticipate running into a landmass before he reached the East Indies. Nevertheless, he called the aboriginal peoples he encountered in the new world "Indians." Returning home, though, Columbus was able to tell his patrons, King Ferdinand and Queen Isabella, that although he never did find India, he could confirm that the world was indeed round. I set out for India by going due east, via Frankfurt. I had Lufthansa business class. I knew exactly which direction I was going thanks to the GPS map displayed on the screen that popped out of the armrest of my airline seat. I landed safely and on schedule. I too encountered people called Indians. I too was searching for the source of India's riches. Columbus was searching for hardware-precious metals, silk, and spices-the source of wealth in his day. I was searching for software, brainpower, complex algorithms, knowledge workers, call centers, transmission protocols, breakthroughs in optical engineering-the sources of wealth in our day. Columbus was happy to make the Indians he met his slaves, a pool of free manual labor. I just wanted to understand why the Indians I met were taking our work, why they had become such an important pool for the outsourcing of service and information technology work from America and other industrialized countries. Columbus had more than one hundred men on his three ships; I had a small crew from the Discovery Times channel that fit comfortably into two banged-up vans, with Indian drivers who drove barefoot. When I set sail, so to speak, I too assumed that the world was round, but what I encountered in the real India profoundly shook my faith in that notion. Columbus accidentally ran into America but thought he had discovered part of India. I actually found India and thought many of the people I met there were Americans. Some had actually taken American names, and others were doing great imitations of American accents at call centers and American business techniques at software labs. Columbus reported to his king and queen that the world was round, and he went down in history as the man who first made this discovery. I returned home and shared my discovery only with my wife, and only in a whisper. "Honey," I confided, "I think the world is flat." . . . 
From checker at panix.com Mon May 2 16:21:38 2005 From: checker at panix.com (Premise Checker) Date: Mon, 2 May 2005 12:21:38 -0400 (EDT) Subject: [Paleopsych] NYTBR: 'One Nation Under Therapy': They Don't Feel Your Pain Message-ID: 'One Nation Under Therapy': They Don't Feel Your Pain New York Times Book Review, 5.5.1 http://www.nytimes.com/2005/05/01/books/review/01QUARTL.html [First chapter appended.] By ALISSA QUART ONE NATION UNDER THERAPY How the Helping Culture Is Eroding Self-Reliance. By Christina Hoff Sommers and Sally Satel. 310 pp. St. Martin's Press. $23.95. THE alarmist nonfiction book is a staple of publishing. In fact, it is such a staple that it has its own backlash genre, the anti-alarmist alarmist book. Anti-alarmist alarmist books argue the counterintuitive points: that the kids are all right, that everything is getting better not worse, and that we have nothing to fear but therapy itself. Christina Hoff Sommers and Sally Satel's ''One Nation Under Therapy: How the Helping Culture Is Eroding Self-Reliance'' is one of the latest examples, joining a canon that includes ''The Myth of Self-Esteem,'' ''The Progress Paradox'' and ''The Culture of Fear.'' According to Sommers and Satel, our self-pity and self-concern are doing something far worse than simply annoying our friends. Self-absorption, they claim, is destroying America. ''The American Creed that has sustained the nation is now under powerful assault by the apostles of therapism,'' they write. ''The fateful question is: Will Americans actively defend the traditional creed of stoicism and the ideology of achievement or will they continue to allow the nation to slide into therapeutic self-absorption and moral debility?'' In their fight against this moral debility, Sommers, the author of ''The War Against Boys'' and ''Who Stole Feminism?,'' and Satel, a psychiatrist and the author of ''P.C., M.D.: How Political Correctness Is Corrupting Medicine'' (both are resident scholars at the American Enterprise Institute in Washington), coin a word to describe their enemy: ''therapism,'' defined as the tendency to valorize ''openness, emotional self-absorption and the sharing of feelings.'' Therapism has many tentacles, exemplified by a huge collection of counselors so caricatured here that they resemble Al Franken's touchy-feely ''Saturday Night Live'' creation Stuart Smalley. Therapism is the force behind the ''brain disease'' explanations of drug addiction that the authors say let addicts off the hook. It is also implicit in ''the perils of overthinking.'' (I was concerned to discover my tendency to overthink could be hazardous to my health, although I consoled myself with the knowledge that most of the Upper West Side was overthinking as well.) Chapter titles, like ''The Myth of the Fragile Child'' and ''September 11, 2001: The Mental Health Crisis That Wasn't,'' convey a Grinch-like tone. And the book just gets frostier. Sommers and Satel are most nontherapeutic (read: coldest) when they go after educators for coddling children, in particular with the ''emotionally correct'' treatment of students after 9/11. 
In fact, the authors excoriate the National Education Association for seeing the attack ''mainly in terms of the threat it posed to children's mental health.'' And they reserve a Dickensian harshness for educators' attempts to build their students' self-esteem: ''Those who encourage children to 'feel good about themselves' may be cheating them, unwittingly, out of becoming the kind of conscientious, humane and enlightened people Mill had in mind'' -- referring to the rigorously educated John Stuart Mill, a scholar at 8. Sentences like these make one glad that neither author teaches kindergarten. After their toughen-up-the-children crusade, Sommers and Satel move on to people with cancer. They praise only sufferers who refrain from exploring their emotions, like the columnist Molly Ivins, extolled for her ''spirited refusal to open up'' after she learned she had breast cancer. Then it's survivors of war crimes; only those who refuse to see themselves as traumatized earn the authors' approval. They also present accounts of Ugandans and Cambodians who have suffered atrocities but nevertheless ''functioned well'' despite their post-traumatic stress disorder symptoms. There's a lot of information in here, typically newspaper anecdotes dolled up with quotations from Mill. But information doesn't add up to cogent analysis. It also doesn't add up to a solution for the problem of national self-absorption. The main remedy Sommers and Satel put forward is that people should take responsibility for themselves. The book imagines self-reliance to be an antidote to self-obsession, a bit of a problem since self-absorption and self-reliance are both forms of selfishness. Let's say, for instance, you quit therapy and at the same time stop ruminating and writing memoirs (self-absorption) to become a careerist professional or perhaps a world traveler. You may achieve equanimity but lose contact with others (self-reliance). But is one of two self-oriented life strategies really superior to the other? To be sure, not all of the book's arguments are so gratuitously grouchy. And to their credit, Sommers and Satel summon copious examples of the excesses of therapy and its related industries, from the tale of a research professor of psychology who had grief counseling foisted upon him by a funeral home to a school exercise that encourages children to share their fears about playing tag. They are also particularly astute when drawing the line between the idea of the self held by moral philosophers who ''attribute unacceptable conduct to flawed character, weakness of will, failure of conscience or bad faith'' and the therapeutic idea of self, where personal shortcomings are maladies, syndromes and disorders. This distinction is useful. We tend to forget that psychology is just another system of knowledge, like moral philosophy (or reality television). Like any other discipline or genre, it has limits, and it is worth remembering that there are other ways for us to think of what we are and what a self is. Indeed, therapy is so popular, in part because of its intimacy, that the therapeutic way of thinking can easily be exploited and degraded.
Occasionally, debunking therapeutic culture is both good and necessary, which is why ''One Nation Under Therapy'' seems refreshing at first, perhaps even up to the task of shrinking therapy-induced panics and diagnostic trends down to size, in the tradition of books like Elaine Showalter's ''Hystories'' and Joan Acocella's ''Creating Hysteria.'' But soon it becomes all too clear that Sommers and Satel are interested not in exploring how psychoanalysis has degenerated into vulgar self-help but in starting an emotional temperance movement. They are also waging a dated war against an imagined army of censorious liberals, attacking ''sensitivity and bias committees'' in publishing houses, state governments, test-writing companies and in groups like the American Psychological Association. According to the authors, this Axis of Snivel is responsible for a therapized ''powerful censorship regime.'' This is a notion that seems to spring from the Bush I era, as does one of the book's section headings: ''How Therapism and Multiculturalism Circumvent Morality.'' ''One Nation Under Therapy'' is not just anti-alarmist alarmist nonfiction. It is culture-wars kitsch. Quaint cultural conservatism aside, what would happen if we took the advice of ''One Nation Under Therapy'' to heart? We might get more work done, although we'd think less. We'd show our children tough love and, presumably, foster a new generation of tough-love advocates. But we might also create a society of diminished passions and sensitivities. Even for Sommers and Satel, this might not be a welcome development. After all, once we were all self-reliant and free of anxiety, we would no longer need the reassurance of books like theirs. Alissa Quart is the author of ''Branded: The Buying and Selling of Teenagers.'' ---------------- First Chapter: 'One Nation Under Therapy' http://www.nytimes.com/2005/05/01/books/chapters/0501-1st-sommers.html By CHRISTINA HOFF SOMMERS and SALLY SATEL The Myth of the Fragile Child In 2001, the Girl Scouts of America introduced a "Stress Less Badge" for girls aged eight to eleven. It featured an embroidered hammock suspended from two green trees. According to the Junior Girl Scout Badge Book, girls earn the award by practicing "focused breathing," creating a personal "stress less kit," or keeping a "feelings diary." Burning ocean-scented candles, listening to "Sounds of the Rain Forest," even exchanging foot massages are also ways to garner points. Explaining the need for the Stress Less Badge to the New York Times, a psychologist from the Girl Scout Research Institute said that studies show "how stressed girls are today." Earning an antistress badge, however, can itself be stressful. The Times reported that tension increased in Brownie Troop 459 in Sunnyvale, California, when the girls attempted to make "anti-anxiety squeeze balls out of balloons and Play-Doh." According to Lindsay, one of the Brownies, "The Play-Doh was too oily and disintegrated the balloon. It was very stressful." The psychologist who worried about Lindsay and her fellow Girl Scouts is not alone. Anxiety over the mental equanimity of American children is at an all-time high. In May of 2002, the principal of Franklin Elementary School in Santa Monica, California, sent a newsletter to parents informing them that children could no longer play tag during the lunch recess. As she explained, "The running part of this activity is healthy and encouraged; however, in this game, there is a 'victim' or 'It,' which creates a self-esteem issue."
School districts in Texas, Maryland, New York, and Virginia "have banned, limited, or discouraged" dodgeball. "Anytime you throw an object at somebody," said an elementary school coach in Cambridge, Massachusetts, "it creates an environment of retaliation and resentment." Coaches who permit children to play dodgeball "should be fired immediately," according to the physical education chairman at Central High School in Naperville, Illinois. In response to this attack on dodgeball, Rick Reilly, the Sports Illustrated columnist, chided parents who want "their Ambers and their Alexanders to grow up in a cozy womb of non-competition." Reilly responds to educators like the Naperville chairman of physical education by saying, "You mean there's weak in the world? There's strong? Of course there is, and dodgeball is one of the first opportunities in life to figure out which one you are and how you're going to deal with it." Reilly's words may resonate comfortably with many of his readers, and with most children as well; but progressive educators tend to dismiss his reaction as just another expression of a benighted opposition to the changes needed if education is to become truly caring and sensitive. This movement against stressful games gained momentum after the publication of an article by Neil Williams, professor of physical education at Eastern Connecticut State College, in a journal sponsored by the National Association for Sports and Physical Education, which represents nearly eighteen thousand gym teachers and physical education professors. In the article, Williams consigned games such as Red Rover, relay races, and musical chairs to "the Hall of Shame." Why? Because the games are based on removing the weakest links. Presumably, this undercuts children's emotional development and erodes their self-esteem. In a follow-up article, Williams also pointed to a sinister aspect of Simon Says. "The major problem," he wrote, "is that the teacher is doing his or her best to deceive and entrap students." He added that psychologically this game is the equivalent of teachers demonstrating the perils of electricity to students "by jolting them with an electric current if they touch the wrong button." The new therapeutic sensibility rejects almost all forms of competition in favor of a gentle and nurturing climate of cooperation. Which games, then, are safe and affirming? Some professionals in physical education advocate activities in which children compete only with themselves, such as juggling, unicycling, pogo sticking, and even "learning to ... manipulate wheelchairs with ease." In a game like juggling there is no threat of elimination. But experts warn teachers to be judicious in their choice of juggling objects. A former member of The President's Council on Youth Fitness and Sports suggests using silken scarves rather than, say, uncooperative tennis balls that lead to frustration and anxiety. "Scarves," he told the Los Angeles Times, "are soft, nonthreatening, and float down slowly." As the head of a middle school physical education program in Van Nuys, California, points out, juggling scarves "lessens performance anxiety and boosts self-esteem." Writer John Leo, like Reilly, satirized the gentle-juggling culture by proposing a stress-free version of musical chairs: Why not make sure each child has a guaranteed seat for musical chairs? With proper seating, the source of tension is removed. Children can just relax, enjoy the music and talk about the positive feelings that come from being included.
Leo was kidding. But the authors of a popular 1998 government-financed antibullying curriculum guide called Quit It! were not. One exercise intended for kindergarten through third grade instructs teachers on how to introduce children to a new way to play tag: Before going outside to play, talk about how students feel when playing a game of tag. Do they like to be chased? Do they like to do the chasing? How does it feel to be tagged out? Get their ideas about other ways the game might be played. After students share their fears and apprehensions about tag, teachers may introduce them to a nonthreatening alternative called "Circle of Friends" where "nobody is ever 'out.'" If students become overexcited or angry while playing Circle of Friends, the guide recommends using stress-reducing exercises to "help the transition from active play to focused work." Reading through Quit It!, you have to remind yourself that it is not satire, nor is it intended for emotionally disturbed children. It is intended for normal five- to seven-year-olds in our nation's schools. Our Sensitive and Vulnerable Youth But is overprotectiveness really such a bad thing? Sooner or later children will face stressful situations, disappointments, and threats to their self-esteem. Why not shield them from the inevitable as long as possible? The answer is that overprotected kids do not flourish. To treat them as combustible bundles of frayed nerves does them no favors. Instead it deprives them of what they need. Children must have independent, competitive rough-and-tumble play. Not only do they enjoy it, it is part of their normal development. Anthony Pellegrini, a professor of early childhood education at the University of Minnesota, defines rough-and-tumble play as behavior that includes "laughing, running, smiling, jumping ... wrestling, play fighting, chasing, and fleeing." Such play, he says, brings children together, it makes them happy and it promotes healthy socialization. Children who are adept at rough play also "tend to be liked and to be good social problem solvers." Commenting on the recent moves to ban competitive zero-sum playground games like tag, Pellegrini told us, "It is ridiculous ... even squirrels play chase." The zealous protectiveness is not confined to the playground. In her eye-opening book The Language Police, Diane Ravitch shows how a once-commendable program aimed at making classroom materials less sexist and racist has morphed into a powerful censorship regime. "Sensitivity and bias" committees, residing in publishing houses, state governments, test-writing companies, and in groups like the American Psychological Association, now police textbooks and other classroom materials, scouring them for any reference or assertion that could possibly make some young reader feel upset, insecure, or shortchanged in life. In 1997, President Bill Clinton appointed Ravitch to an honorary education committee charged with developing national achievement tests. The Department of Education had awarded a multimillion-dollar contract to Riverside Publishing, a major testing company and a subsidiary of Houghton Mifflin, to compose the exam. Ravitch and her committee were there to provide oversight. As part of the process, the Riverside test developers sent Ravitch and her fellow committee members, mostly veteran teachers, several sample reading selections. The committee reviewed them carefully and selected the ones they considered the most lucid, engaging, and appropriate for fourth-grade test takers.
Congress eventually abandoned the idea of national tests. However, Ravitch learned that several of the passages she and her colleagues had selected had not survived the scrutiny of the Riverside censors. For example, two of the selections that got high marks from Ravitch and her colleagues were about peanuts. Readers learned that they were a healthy snack and had first been cultivated by South American Indians and then, after the Spanish conquest, were imported into Europe. The passage explained how peanuts became important in the United States, where they were planted and cultivated by African slaves. It told of George Washington Carver, the black inventor and scientist, who found many new uses for peanuts. The Riverside sensitivity monitors had a field day. First of all, they said, peanuts are not a healthy snack for all children. Some are allergic. According to Ravitch, "The reviewers apparently assumed that a fourth-grade student who was allergic to peanuts might get distracted if he or she encountered a test question that did not acknowledge the dangers of peanuts." The panel was also unhappy that the reading spoke of the Spaniards having "defeated" the South American tribes. Its members did not question the accuracy of the claim, but Ravitch surmises, "They must have concluded that these facts would hurt someone's feelings." Perhaps they thought that some child of South American Indian descent who came upon this information would feel slighted, and so suffer a disadvantage in taking the test. Ravitch's group had especially liked a story about a decaying tree stump on the forest floor and how it becomes home to an immense variety of plants, insects, birds, and animals. The passage compared the stump to a bustling apartment complex. Ravitch and the other committee members enjoyed its charm and verve. It also taught children about a fascinating ecology. But the twenty sensitivity panelists at Riverside voted unanimously against it: "Youngsters who have grown up in a housing project may be distracted by similarities to their own living conditions. An emotional response may be triggered." Ravitch presents clear evidence that our schools are in the grip of powerful sensitivity censors who appear to be completely lacking in good judgment and are accountable to no one but themselves. She could find no evidence that sensitivity censorship of school materials helps children. On the contrary, the abridged texts are enervating. "How boring," she says, "for students to be restricted only to stories that flatter their self-esteem or that purge complexity and unpleasant reality from history and current events." The idea that kids can cope with only the blandest of stories is preposterous. Staples like "Little Red Riding Hood," "Jack and the Beanstalk," and "Hansel and Gretel" delight children despite (or because of) their ghoulish aspects. Kids love to hear ghost stories on Halloween and to ride roller coasters, screaming as they hurtle down the inclines. Therapeutic protectiveness is like putting blinders on children before taking them for a walk through a vibrant countryside. Excessive concern over imagined harms can hinder children's natural development. Moreover, in seeking to solve nonexistent problems, it distracts teachers from focusing on their true mission-to educate children and to prepare them to be effective adults. 
Commenting on Ravitch's findings, Jonathan Yardley, columnist and book critic at the Washington Post, wrote, "A child with a rare disease may have to be put in a bubble, but putting the entire American system of elementary and secondary education into one borders on insanity." Many American teachers seem to believe children must be spared even the mildest criticism. Kevin Miller, a professor of psychology at the University of Illinois at Urbana-Champaign, has studied differences between Chinese and American pedagogy. In one of his videotapes, a group of children in a math class in China are learning about place values. The teacher asks a boy to make the number 14 using one bundle of ten sticks along with some single sticks, and the child uses only the bundle. The teacher then asks the class, in a calm, noncensorious tone, "Who can tell me what is wrong with this?" When Miller shows the video to American teachers, they are taken aback. They find it surprising to see an instructor being so openly critical of a student's performance. "Most of the teachers in training we've shown this to express the worry that this could be damaging to children's self-esteem," Miller reports. Even the minority of American student teachers who don't disapprove of the practice agree that the practice of giving students explicit feedback in public "contravenes what we do in the U.S." Rossella Santagata, a research psychologist at LessonLab in Santa Monica, California, has studied how American and Italian teachers differ in their reactions to students' mistakes. Italian teachers are very direct: they have no qualms about telling students their answer is wrong. In so doing, they violate all of the sensitivity standards that prevail in the United States. Santagata has a videotape of a typical exchange between an American math teacher and a student. An eighth grader named Steve is supposed to give the prime factors of the number 34; instead he lists all the factors. It is not easy for the teacher to be affirmative about Steve's answer. But she finds a way, "Okay. Now Steve you're exactly right that those are all factors. Prime factorization means that you only list the numbers that are prime. So can you modify your answer to make it all only prime numbers?" (Emphasis in original.) Santagata told us that when she shows this exchange to audiences of Italian researchers, they find the teacher's strained response ("exactly right") hysterically funny. By contrast, American researchers see nothing unusual or amusing-because, as Santagata says, "Such reactions are normal." Even college students are not exempt from this new solicitude. . . . From checker at panix.com Mon May 2 16:21:55 2005 From: checker at panix.com (Premise Checker) Date: Mon, 2 May 2005 12:21:55 -0400 (EDT) Subject: [Paleopsych] NYTBR: 'Incompleteness': Waiting for Gödel Message-ID: NYTBR: 'Incompleteness': Waiting for Gödel New York Times Book Review, 5.5.1 http://www.nytimes.com/2005/05/01/books/review/01SCHULMA.html By POLLY SHULMAN INCOMPLETENESS The Proof and Paradox of Kurt Gödel. By Rebecca Goldstein. Illustrated. 296 pp. Atlas Books/ W. W. Norton & Company. $22.95. REBECCA GOLDSTEIN, as anyone knows who has read her novels -- particularly ''The Mind-Body Problem'' -- understands that people are thinking beings, and the mind's loves matter at least as much as the heart's. After all, she's not just a novelist, but a philosophy professor.
She casts ''Incompleteness,'' her brief life of the logician Kurt Gödel (1906-78), as a touching intellectual love story. Though Gödel was married, his wife barely appears here; as Goldstein tells it, his romance was with mathematical Platonism, the idea that the glories of mathematics exist eternally beyond our grasp. Gödel's Platonism inspired him to deeds as daring as any knight's: he proved his famous incompleteness theorem for its sake. His Platonism also set him apart from his intellectual contemporaries. Only Einstein shared it, and could solace Gödel's loneliness, Goldstein argues. A biography with two focuses -- a man and an idea -- ''Incompleteness'' unfolds its surprisingly accessible story with dignity, tenderness and awe. News of Gödel's Platonism, or Einstein's, might surprise readers familiar with popular interpretations of their work. For centuries, science seemed to be tidying the mess of the real world into an eternal order beautiful and pure -- a heavenly file cabinet labeled mathematics. Then, in the early 20th century, Einstein published his relativity theory, Werner Heisenberg his uncertainty principle and Gödel his incompleteness theorem. Many thinkers -- from the logical positivists with whom Gödel drank coffee in the Viennese cafes of the 1920's to existentialists, postmodernists and annoying people at cocktail parties -- have taken those three results as proof that reality is subjective and we can't see beyond our noses. You can hardly blame them. As Goldstein points out, the very names of the theories seem to mock the notion of objective truth. But she makes a persuasive case that Gödel and Einstein understood their work to prove the opposite: there is something greater than our little minds; reality exists, whether or not we can ever touch it. It's appropriate, though sad for Gödel, that his work has been interpreted to have simultaneously opposite meanings. The proofs of his famous theorems rely on just that sort of twisty thinking: statements like the famous Liar's Paradox, ''This statement is false,'' which flip their meanings back and forth. In the case of the Liar's Paradox, if the statement is true, then it's false -- but if it's false, then it's true. Like that paradox, an assertion that talks about itself, Gödel's theorems are meta-statements, which speak about themselves. Because Gödel made so much of self-reference and paradox, previous books about his work -- like Douglas Hofstadter's ''Gödel, Escher, Bach'' -- tend to emphasize the playfulness of his ideas. Not Goldstein's. She tells his story in a minor key, following Gödel into the paranoia that overtook him after Einstein's death, growing out of his loneliness and unrelenting rationality. After all, paranoia, like math, makes people dig deeper and deeper to find meaning. Gödel's work addresses the core of mathematics: finding proofs. Proofs are mathematicians' road to truth. To find them, mathematicians from the ancient Greeks on have set up systems consisting of three basic elements: axioms, true statements so intuitively obvious they are self-evident; rules of inference, logical principles indicating how to use axioms to prove new, less obviously true statements; and those new true statements, called theorems. (Many Americans met axioms and proofs for the first and last time in 10th-grade geometry.) A century ago, mathematicians began taking these systems to an extreme.
Since mathematical intuition can be as unreliable as other kinds of intuitions -- often things that seem obvious turn out to be just plain wrong -- they tried to eliminate it from their axioms. They built new systems of arbitrary symbols and formal rules for manipulating them. Of course, they chose those particular symbols and rules because of their resemblance to mathematical systems we care about (such as arithmetic). But, by choosing rules and symbols that work whether or not there's any meaning behind them, the mathematicians kept the potential corruption of intuition at bay. The dream of these formalists was that their systems contained a proof for every true statement. Then all mathematics would unfurl from the arbitrary symbols, without any need to appeal to an external mathematical truth accessible only to our often faulty intuition. Gödel proved exactly the opposite, however. He showed that in any formal system complicated enough to describe the numbers and operations of arithmetic, as long as the axioms don't lead to contradictions there will always be some statement that is not provable -- and the contradiction of it will not be provable either. He also showed that there's no way to prove from within the system that the system itself won't give rise to contradictions. So, any formal system worth bothering with will either sprout contradictions -- which is bad news, since once you have a contradiction, you can prove anything at all, including 2 + 2 = 5 -- or there will be perfectly ordinary statements that may well be true but can never be proved. You can see why this result rocked mathematics. You can also see why positivists, existentialists and postmodernists had a field day with it, particularly since, once you find one of those unprovable statements, you're free to add it to your system as an axiom, or else to add its complete opposite. Either way, you'll get a new system that works fine. That makes math sound pretty subjective, doesn't it? Well, Gödel didn't think so, and his reason grows beautifully from his spectacular proof itself, which Goldstein describes with lucid discipline. Though the proof relies on a meticulous, fiddly mechanism that took an entire semester to build up when I studied logic as a math major in college, its essence fits magically into a few pages of a book for laypeople. It can even, arguably, fit in a single paragraph of a book review -- though that may be stretching. To put it roughly, Gödel proved his theorem by taking the Liar's Paradox, that steed of mystery and contradiction, and harnessing it to his argument. He expressed his theorem and proof in mathematical formulas, of course, but the idea behind it is relatively simple. He built a representative system, and within it he constructed a proposition that essentially said, ''This statement is not provable within this system.'' If he could prove that that was true, he figured, he would have found a statement that was true but not provable within the system, thus proving his theorem. His trick was to consider the statement's exact opposite, which says, ''That first statement -- the one that boasted about not being provable within the system -- is lying; it really is provable.'' Well, is that true? Here's where the Liar's Paradox shows its paces. If the second statement is true, then the first one is provable -- and anything provable must be true. But remember what that statement said in the first place: that it can't be proved. It's true, and it's also false -- impossible!
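To see the shape of that argument at a glance, it can be restated in a few lines of symbols (an informal summary set in LaTeX, not Gödel's own arithmetised construction; here Prov(G) simply abbreviates "G is provable within the formal system" and G names the self-referential statement):

% Informal restatement of the self-referential argument sketched above.
% Prov(G) abbreviates "G is provable within the formal system";
% G is the sentence asserting its own unprovability.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\[
  G \;\longleftrightarrow\; \neg\,\mathrm{Prov}(G)
  \qquad \text{($G$ says: ``I am not provable in this system.'')}
\]
\begin{align*}
  &\text{Assume } \mathrm{Prov}(G).
   \text{ Whatever is provable is true, so } G \text{ is true.}\\
  &\text{But } G \text{ asserts } \neg\,\mathrm{Prov}(G),
   \text{ contradicting the assumption.}\\
  &\text{Hence } \neg\,\mathrm{Prov}(G) \text{ holds; and since } G
   \text{ asserts exactly that, } G \text{ is true but unprovable.}
\end{align*}
\end{document}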
That's a contradiction, which means Gödel's initial assumption -- that the proposition was provable -- is wrong. Therefore, he found a true statement that can't be proved within the formal system. Thus Gödel showed not only that any consistent formal system complicated enough to describe the rules of grade-school arithmetic would have an unprovable statement, but that it would have an unprovable statement that was nonetheless true. Truth, he concluded, exists ''out yonder'' (as Einstein liked to put it), even if we can never put a finger on it. John von Neumann, the father of game theory, took up Gödel's cause in America; in England, Alan Turing provided an alternative proof of Gödel's theorem while inventing theoretical computer science. Whatever Gödel's work had to say about reality, it changed the course of mathematics forever. Polly Shulman is a contributing editor for Science magazine and has written about mathematics for many other publications. From checker at panix.com Mon May 2 16:22:12 2005 From: checker at panix.com (Premise Checker) Date: Mon, 2 May 2005 12:22:12 -0400 (EDT) Subject: [Paleopsych] NS: Teenagers special Message-ID: Teenagers special: The original rebels http://www.newscientist.com/article.ns?id=mg18524891.100&print=true * 05 March 2005 * Lynn Dicks * Lynn Dicks is a writer based near Cambridge, UK EAST Africa, one-and-a-half million years ago: a group of women sit with their young children. They are heavy-browed with small skulls - not quite human, but almost. Some are checking their children for ticks, others teaching them how to dig tubers out of the ground. Not far off, a gaggle of teenage girls lounge under a tree, sniggering and pointing at some young men who are staging fights nearby. The older women beckon: "Come and help us dig out this root - it will make a great meal," they seem to say. But the girls reply with grunts and slouch off, sulkily. Could this really have happened? Our immediate ancestors, Homo erectus, may not have had large brains, high culture or even language, but could they have boasted the original teenage rebels? That question has been hotly contested in the past few years, with some anthropologists claiming to have found evidence of an adolescent phase in fossil hominids, and others seeing signs of a more ape-like pattern of development, with no adolescent growth spurt at all. This is not merely an academic debate. Humans today are the only animals on Earth to have a teenage phase, yet we have very little idea why. Establishing exactly when adolescence first evolved and finding out what sorts of changes in our bodies and lifestyles it was associated with could help us understand its purpose. We humans take twice as long to grow up as our nearest relatives, the great apes. Instead of developing gradually from birth to adulthood, our growth rate slows dramatically over the first three years of life, and we grow just a few centimetres a year for the next eight years or so. Then suddenly, at puberty, growth accelerates again to as much as 12 centimetres a year. Over the following three years adolescents grow an astonishing 15 per cent in both height and width. Though the teenage years are most commonly defined by raging hormones, the development of secondary sexual characteristics and attitude problems, what is unique in humans is this sudden and rapid increase in body size following a long period of very slow growth. No other primate has a skeletal growth spurt like this so late in life. Why do we?
Until recently, the dominant explanation was that physical growth is delayed by our need to grow large brains and to learn all the complex behaviour patterns associated with humanity - speaking, social interaction and so on. While such behaviour is still developing, humans cannot easily fend for themselves, so it is best to stay small and look youthful. That way you do not eat too much, and your parents and other members of the social group are motivated to continue looking after you. What's more, studies of mammals show a strong relationship between brain size and the rate of development, with larger-brained animals taking longer to reach adulthood. Humans are at the far end of this spectrum. If this theory is correct, the earliest hominids, Australopithecus, with their ape-sized brains, should have grown up quickly, with no adolescent phase. So should H. erectus, whose brain, though twice the size of that of Australopithecus at around 850 cubic centimetres, was still relatively small. The great leap in brain capacity comes only with the evolution of our own species and Neanderthals, starting almost 200,000 years ago. Brains expanded to around 1350 cm3 in our direct ancestors and 1600 cm3 in Neanderthals. So if the development of large brains accounts for the teenage growth spurt, the origin of adolescence should be here. The trouble is, some of the fossil evidence seems to tell a different story. The human fossil record is extremely sparse, and the number of fossilised children minuscule. Nevertheless in the past few years anthropologists have begun to look at what can be learned of the lives of our ancestors from these youngsters. One of the most studied is the famous Turkana boy, an almost complete skeleton of H. erectus from 1.6 million years ago found in Kenya in 1984. The surprise discovery is that there are some indications that he was a young teenager when he died. Accurately assessing how old someone is from their skeleton is a tricky business. Even with a modern human, you can only make a rough estimate based on the developmental stage of teeth and bones and the skeleton's general size. For example, most people gain their first permanent set of molars at age 6 and the second at 12, but the variation is huge. Certain other features of the skull also develop chronologically, although the changes that occur in humans are not necessarily found in other hominids. In the middle teenage years, after the adolescent growth spurt, the long bones of the limbs cease to grow because the areas of cartilage at their ends, where growth has been taking place, turn into rigid bone. This change can easily be seen on an X-ray. You need as many of these developmental markers as possible to get an estimate of age. The Turkana boy did not have his adult canines, which normally erupt before the second set of molars, so his teeth make him 10 or 11 years old. The features of his skeleton put him at 13, but he was as tall as a modern 15-year-old. "By human standards, he was very tall for his dental age," says anthropologist Holly Smith from the University of Michigan at Ann Arbor. But you get a much more consistent picture if you look at Turkana boy in the context of chimpanzee patterns of growth and development. Then, his dental age, bone age and height all agree he was 7 or 8 years old. To Smith, this implies that the growth of H. erectus was primitive and the adolescent growth spurt had not yet evolved. Susan Anton of New York University disagrees. 
She points to research by Margaret Clegg, now at the University of Southampton in the UK, showing that even in modern humans the various age markers often do not match up. Clegg studied a collection of 18th and 19th-century skeletons of known ages from a churchyard in east London. When she tried to age the skeletons blind, she found the disparity between skeletal and dental age was often as great as that of the Turkana boy. One 10-year-old boy, for example, had a dental age of 9, the skeleton of a 6-year-old but was tall enough to be 11. "The Turkana kid still has a rounded skull, and needs a lot of growth to reach the adult shape," Anton adds. Like apes, the face and skull of H. erectus changed shape significantly between youth and adulthood. Anton thinks that H. erectus had already developed modern human patterns of growth, with a late, if not quite so extreme, adolescent spurt. She believes Turkana boy was just about to enter it. If she's right, and small-brained H. erectus went through a teenage phase, that scuppers the orthodox idea linking late growth with development of a large brain. Anthropologist Steven Leigh from the University of Illinois at Urbana-Champaign is among those who are happy to move on. He believes the idea of adolescence as catch-up growth is naive; it does not explain why the growth rate increases so dramatically. He points out that many primates have growth spurts in particular body regions that are associated with reaching maturity, and this makes sense because by timing the short but crucial spells of maturation to coincide with the seasons when food is plentiful, they minimise the risk of being without adequate food supplies while growing. What makes humans unique is that the whole skeleton is involved. For Leigh, this is the key. Coordinated widespread growth, he says, is about reaching the right proportions to walk long distances efficiently. "It's an adaptation for bipedalism," he says. According to Leigh's theory, adolescence evolved as an integral part of efficient upright locomotion, as well as to accommodate more complex brains. Fossil evidence suggests that our ancestors took their first steps on two legs as long as six million years ago. If proficient walking was important for survival, perhaps the teenage growth spurt has very ancient origins. Leigh will not be drawn, arguing that there are too few remains of young hominids to draw definite conclusions. While many anthropologists will consider Leigh's theory a step too far, he is not the only one with new ideas about the evolution of teenagers. A very different theory has been put forward by Barry Bogin from the University of Michigan-Dearborn. He believes adolescence in our species is precisely timed to improve the success of the first reproductive effort. In girls, notes Bogin, full adult shape and features are achieved several years before they reach full fertility at around the age of 18. "The time between looking fertile and being fertile allows women to practise social, sexual and cultural activities associated with adulthood, with a low risk of having their own children," says Bogin. When they finally do have children, they are better prepared to look after them. "As a result, firstborns of human mothers die much less often than firstborns of any other species." In boys, you see the opposite. They start producing viable sperm at 13 or 14 years of age, when they still look like boys. The final increase in muscle size that turns them into men does not happen until 17 or 18. 
In the interim boys, who feel like men, can practise male rivalries without being a threat to adult men or an attractive option to adult women. When boys do become sexually active, they have practised and are more likely to be successful without getting hurt. Bogin's theory makes totally different predictions to Leigh's. If the timing of adolescence is related to uniquely human cultural practices, our species should be the first and only one to have a teenage phase. "H. erectus definitely did not have an adolescence," he asserts. Such strong and opposing views make it all the more necessary to scour the fossil record for clues. One approach, which has produced a surprising result, relies on the minute analysis of tooth growth. Every nine days or so the growing teeth of both apes and humans acquire ridges on their enamel surface. These perikymata are like rings in a tree trunk: the number of them tells you how long the crown of a tooth took to form. Across mammals, the speed of tooth development is closely related to how fast the brain grows, the age you mature and the age you die. Teeth are good indicators of life history because their growth is less related to the environment and nutrition than is the growth of the skeleton. Slower tooth growth is an indication that the whole of life history was slowing down, including age at maturity. Back in the 1980s Christopher Dean, an anatomist at University College London, was the first to measure tooth growth in fossils using perikymata. He found that australopithecines dating from between 3 and 4 million years ago had tooth crowns that formed quickly. Like apes, their first molars erupted at 4 years old and the full set of teeth were in place by 12. Over the years, Dean's team has collected enough teeth to show that H. erectus also had faster tooth growth than modern man, but not so fast as earlier hominids. "Things had moved on a bit," he says. "They had their full set of teeth by about 15." Modern humans reach this stage by about age 20. The change in H. erectus seems to imply that the growth pattern of modern humans was beginning to develop, with an extended childhood and possibly an adolescent growth spurt. Dean cautions, though, that the link between dental and skeletal development in ancestral hominids remains uncertain. These findings could equally support Leigh's or Bogin's theories. A more decisive piece of evidence came last year, when researchers in France and Spain published their findings from an analysis of Neanderthal teeth. A previous study of a remarkably well-preserved skeleton of a Neanderthal youth, known as Le Moustier 1, from south-west France had suggested that, with a dental age of 15 and the frame of an 11-year-old, the kid was about to undergo an adolescent growth spurt. But the analysis of his perikymata reveals quite a different picture. Rather than continuing the trend towards slower development seen in H. erectus, Neanderthals had returned to much faster tooth growth (Nature, vol 428, p 936) and hence, possibly, a shorter childhood. Does this mean they didn't have an adolescence? Lead researcher Fernando Ramirez-Rozzi, of the French National Centre for Scientific Research (CNRS) in Paris thinks Neanderthals died young - about 25 years old - primarily because of the cold, harsh conditions they had to endure in glacial Europe. Under pressure from this high mortality, they evolved to grow up quicker than their immediate ancestors. "They probably reached maturity at about 15," he says, "but it could have been even younger."
They would have matured too fast to accommodate an adolescent burst of growth. He points to research showing that populations of Atlantic cod have genetically changed to mature more quickly under the intense fishing pressure of the 1980s. Others contest Ramirez-Rozzi's position. "You can't assume, just because Neanderthals' teeth grew faster, that their entire body developed faster," says Jennifer Thompson of the University of Nevada, Las Vegas, one of the researchers involved in the Le Moustier 1 study. Controversy rages, but these latest findings at least highlight one aspect of adolescence that most scientists can agree on. Whatever the immediate purpose of the late growth spurt, it was made possible by an increase in life expectancy. And that being so, one way to work out when the first teenagers originated is to look at the lifespan of a species. This is exactly what Rachel Caspari of the University of Michigan at Ann Arbor has been doing. Her most recent study, published in July 2004, shows an astonishing increase in longevity that separates modern Homo sapiens from all other hominids, including Neanderthals (Proceedings of the National Academy of Sciences, vol 101, p 10895). She categorised adult fossils as old or young by assessing whether they had as much wear on their last molar, or wisdom tooth, as on other molars. "In modern humans we see a massive increase in the number of people surviving to be grandparents," she says. The watershed comes as recently as 30,000 years ago. On this evidence, Neanderthals and H. erectus probably had to reach adulthood quickly, without delaying for an adolescent growth spurt. So it looks as though Bogin is correct: we are the original teenagers. Whether he is right about the purpose of adolescence is another matter. He admits we will never know for sure. "Fossils will never give us growth curves," he says, "and we should not expect our ancestors to grow like we do." ----------- http://www.newscientist.com/popuparticle.ns?id=in61 Instant Expert: Teenagers The teenager is a [1]uniquely human phenomenon. Adolescents are known to be moody, insecure, argumentative, [2]angst-ridden, impulsive, [3]impressionable, reckless and rebellious. Teenagers are also characterised by [4]odd sleeping patterns, awkward [5]growth spurts, [6]bullying, [7]acne and [8]slobbish behaviour. So what could be the possible benefit of the teenage phase? Most other animals - apes and human ancestors included - skip that stage altogether, developing rapidly from infancy to full adulthood. Humans, in contrast, have a very puzzling four-year gap between sexual maturity and prime reproductive age. Anthropologists disagree on when the [9]teenage phase first evolved, but pinpointing that date could help define its purpose. There are a variety of current explanations for the existence of teenagers. Some believe that we need longer for our [10]large brains to develop. Other explanations suggest that a teenage phase allows kids to learn about [11]complex social behaviour and [12]other difficult skills, or that it is even required to develop coordinated bipedal bodies adapted to [13]travelling long distances. Raging hormones Scientists once thought that the brain's internal structure was fixed at the end of childhood, and teenage behaviour was blamed on raging hormones and a lack of experience. Then researchers discovered that the brain [14]undergoes significant changes during adolescence.
According to many recent studies, teen brains really are unique (see interactive graphic). Though many brain areas mature during childhood, [15]others mature later - such as the frontal and parietal lobes, responsible for planning and self-control. Other studies have shown that teens [16]fail to see the consequences of their actions, and that [17]sudden increases in nerve connectivity in teen brains may make it difficult for teenagers to read social situations and other people's emotions. Risky behaviour One study in 2004 showed that teens have less brain activity in areas responsible for [18]motivation and risk assessment, perhaps explaining why they are more likely to take part in [19]risky activities such as [20]abusing drugs and alcohol, develop a [21]hard-to-kick smoking habit or indulge in [22]under-age sex. Teenage pregnancies and rising rates of sexually transmitted diseases among teens are big problems - especially because today's teen generation is the [23]biggest the world has seen: a 2003 UN report revealed that 1 in 5 people were between 10 and 19, a total of 1.2 billion people. But not everyone agrees on the best way to tackle the problem. Some believe that comprehensive [24]sex education is the key, while others argue for [25]abstinence only education courses. John Pickrell, 3 March 2004 References 1. http://www.newscientist.com/channel/being-human/teenagers/dn1654 2. http://www.newscientist.com/channel/being-human/teenagers/dn2925 3. http://www.newscientist.com/channel/being-human/teenagers/dn3812 4. http://www.newscientist.com/channel/being-human/teenagers/mg18524811.700 5. http://www.newscientist.com/channel/being-human/teenagers/mg13818715.600 6. http://www.newscientist.com/channel/being-human/teenagers/mg18524891.100 7. http://www.newscientist.com/channel/being-human/teenagers/mg17623742.500 8. http://www.newscientist.com/channel/being-human/teenagers/mg14219223.600 9. http://www.newscientist.com/channel/being-human/teenagers/mg18524891.100 10. http://www.newscientist.com/channel/being-human/teenagers/mg13618505.000 11. http://www.newscientist.com/channel/being-human/teenagers/mg13718634.500 12. http://www.newscientist.com/channel/being-human/teenagers/mg17623725.300 13. http://www.newscientist.com/channel/being-human/teenagers/dn6681 14. http://www.newscientist.com/channel/being-human/teenagers/mg17623650.200 15. http://www.newscientist.com/channel/being-human/teenagers/mg16522224.200 16. http://www.newscientist.com/channel/being-human/teenagers/dn6738 17. http://www.newscientist.com/channel/being-human/teenagers/dn2925 18. http://www.newscientist.com/channel/being-human/teenagers/dn4718 19. http://www.newscientist.com/channel/being-human/teenagers/dn4460 20. http://www.newscientist.com/channel/being-human/teenagers/mg13718594.000 21. http://www.newscientist.com/channel/being-human/teenagers/dn4163 22. http://www.newscientist.com/channel/being-human/teenagers/dn6957 23. http://www.newscientist.com/channel/being-human/teenagers/dn4253 24. http://www.newscientist.com/channel/being-human/teenagers/mg18324580.800 25. http://www.newscientist.com/channel/being-human/teenagers/dn6957 -------------- Adolescence unique to modern humans http://www.newscientist.com/article.ns?id=dn1654&print=true * 12:25 06 December 2001 * Claire Ainsworth The uniquely human habit of taking 18 years or so to mature is a recent development in our evolutionary history. 
Growth patterns of fossil teeth have shown that a prolonged growing-up period evolved long after our ancestors started walking upright and making tools. Our great ape relatives, the chimpanzees and gorillas, take about 11 years to reach adulthood. Scientists speculate that delaying this process allows children to absorb our complex languages, culture and family relationships. What's more, we need extra time for our large brains to grow - they are half as big again as those of the earliest humans, Homo erectus, who appeared some 2 million years ago. Christopher Dean of University College London and his team studied teeth from H. erectus, our Australopithecus human-like ancestors such as the famous "Lucy", and Proconsul nyanzae, an ape ancestor. The rate of tooth development is tightly linked to how long it takes to become fully grown. Teeth rings Teeth grow by adding on enamel in small increments, leaving striations rather like shell ridges or tree rings. By studying these rings, the team could work out how fast the teeth grew. They found that H. erectus's teeth grew at almost the same rate as those of both modern and fossil apes and Australopithecus - suggesting a shorter growing-up period. This was surprising, as H. erectus walked upright, was about the same size as us and made simple tools - all traits associated with being human, says Dean. But it fits with the fact that H. erectus's brain was much smaller. By comparing the growth rate of the back and front teeth, the team estimated that H. erectus children produced their first permanent molars at around 4.5 years, and their second at 7.5 years. This compares with 6 and 12 years in modern humans and 3 and 5 years for modern apes, indicating that H. erectus was starting down the road of modern dental development. Journal reference: Nature (vol 414, p 628) Related Articles * [13]Old bones may be earliest human ancestor * 11 July 2001 * [14]A 3.5 million year-old skull unearthed in Kenya may force a re-examination of the evolution of modern humans * 21 March 2001 * [15]The most ancient human-like remains are unearthed in Kenya * 5 December 2000 Weblinks * [16]Human Origins * [17]Evolutionary Anatomy Unit, UCL * [18]Nature References 13. http://www.newscientist.com/article.ns?id=dn995 14. http://www.newscientist.com/article.ns?id=dn542 15. http://www.newscientist.com/article.ns?id=dn240 16. http://www.mnh.si.edu/anthro/humanorigins/ 17. http://evolution.anat.ucl.ac.uk/ 18. http://www.nature.com/ ------------------ Teen angst rooted in busy brain http://www.newscientist.com/article.ns?id=dn2925&print=true * 19:00 16 October 2002 * Duncan Graham-Rowe Scientists believe they have found a cause of adolescent angst. Nerve activity in the teenaged brain is so intense that they find it hard to process basic information, researchers say, rendering the teenagers emotionally and socially inept. Robert McGivern and his team of neuroscientists at San Diego State University, US, found that as children enter puberty, their ability to quickly recognise other people's emotions plummets. What is more, this ability does not return to normal until they are around 18 years old. McGivern reckons this goes some way towards explaining why teenagers tend to find life so unfair, because they cannot read social situations as efficiently as others. Previous studies have shown that puberty is marked by sudden increases in the connectivity of nerves in parts of the brain. In particular, there is a lot of nerve activity in the prefrontal cortex. 
"This plays an important role in the assessment of social relationships, as well as planning and control of our social behaviour," says McGivern. Western turmoil He and his team devised a study specifically to see whether the prefrontal cortex's ability to function altered with age. Nearly 300 people aged between 10 and 22 were shown images containing faces or words, or a combination of the two. The researchers asked them to describe the emotion expressed, such as angry, happy, sad or neutral. The team found the speed at which people could identify emotions dropped by up to 20 per cent at the age of 11. Reaction time gradually improved for each subsequent year, but only returned to normal at 18. During adolescence, social interactions become the dominant influence on our behaviour, says McGivern. But at just the time teenagers are being exposed to a greater variety of social situations, their brains are going through a temporary "remodelling", he says. As a result, they can find emotional situations more confusing, leading to the petulant, huffy behaviour for which adolescents are notorious. But this may only be true for Western cultures. Adolescents often play a less significant role in these societies, and many have priorities very different from their parents', leading to antagonism between them. This creates more opportunity for confusion. "One would expect to observe a great deal more emotional turmoil in such kids," he says. Journal reference: Brain and Cognition (vol 50, p 173) Related Articles * [12]Brain expression response linked to personality * 20 June 2002 * [13]Angry outbursts linked to brain dysfunction * 27 May 2002 * [14]Physical changes may be responsible for 'feeling' emotions * 19 September 2000 Weblinks * [15]Psychology, San Diego State University * [16]Early Experience and Brain Development research * [17]Brain and Cognition References 12. http://www.newscientist.com/article.ns?id=dn2439 13. http://www.newscientist.com/article.ns?id=dn2331 14. http://www.newscientist.com/article.ns?id=dn3 15. http://www.psychology.sdsu.edu/ 16. http://www.macbrain.org/ 17. http://www.academicpress.com/b&c -------------- Movie smoking encourages kids to light up http://www.newscientist.com/article.ns?id=dn3812&print=true * 13:14 10 June 2003 * Shaoni Bhattacharya Watching movie stars light up on screen is the biggest single factor in influencing teenagers to smoke, suggests a new US study. Adolescents who had never smoked were almost three times more likely to then take up the habit if they had watched films packed with smoking scenes, compared to their peers who had seen films with the least amount of on-screen smoking. "There was a tremendous impact," says research leader Madeline Dalton, at Dartmouth Medical School in Hanover, New Hampshire. "Movies were the strongest predictor of who would go on to smoke - stronger than peers smoking, family smoking, or the personality of the child." "We know from past studies it's very rare for smoking to be portrayed in a negative light. Smokers [in movies] tend to be tough guys or sexy, rebellious women - which appeal to adolescents," she told New Scientist. Dalton's colleague Michael Beach adds: "Our data indicate that 52 per cent of smoking initiation among adolescents in this study can be attributed to movie smoking exposure." 
"The effect is stronger than the effect of traditional cigarette advertising and promotion, which accounts for 'only' 34 per cent of new experimentation," notes Stanton Glantz, at the Center for Tobacco Control Research and Education, in an editorial accompanying the study published online in The Lancet. Smoke screen The study began by recruiting over 2600 US schoolchildren aged 10 to 14 who had never smoked. Each child was then asked if they had watched any of 50 movies randomly selected from 601 box office hits released between 1988 and 1999. The number of occurrences of smoking in each film was recorded by trained coders. When followed up one to two years later, 10 per cent of the children had tried smoking. Those in the top quarter of exposure to movie smoking were 2.7 times more likely to have tried a cigarette than those in the lowest quarter of exposure. This effect was independent of other factors that might influence the child's smoking behaviour, such as friends or family smoking. "It's more evidence that movies have a strong impact on adolescents," says Dalton. "Previous studies have suggested that smoking in movies influences adolescent smoking behaviour, but this is the first study to show that viewing smoking in movies predicts who will start smoking in the future." Dalton, an expert in cancer risk behaviour in children, says a previous study by the team showed that children were more likely to smoke if their favourite actor smoked. Movies which depict smoking should be given an adult rating or "R rating" in the US, suggests Glantz, which would mean that children under 17 could not see the film without a parent. "An R rating for smoking in movies would prevent about 330 adolescents [in the US] from starting to smoke and ultimately extend 170 lives every day," he writes. Journal reference: The Lancet (vol 361, no 9373, early online publication) Related Articles * [12]Controversy over passive smoking danger * 16 May 2003 * [13]Violent song lyrics increase aggression * 4 May 2003 * [14]Public smoking ban slashes heart attacks * 1 April 2003 Weblinks * [15]Dartmouth Medical School * [16]Center for Tobacco Control Research and Education * [17]Action on Smoking and Health, UK * [18]The Lancet References 12. http://www.newscientist.com/article.ns?id=dn3737 13. http://www.newscientist.com/article.ns?id=dn3695 14. http://www.newscientist.com/article.ns?id=dn3557 15. http://www.dartmouth.edu/dms/index.shtml 16. http://repositories.cdlib.org/ctcre/ 17. http://www.ash.org.uk/ 18. http://www.thelancet.com/home --------------- Bedtimes could pinpoint the end of adolescence http://www.newscientist.com/article.ns?id=mg18524811.700&print=true * 08 January 2005 * Andy Coghlan AT WHAT point does adolescence end? Perhaps at the point when we start to go to bed progressively earlier rather than later and later. The end of puberty, or sexual maturation, is well defined. It is the point when bones stop growing, at around age 16 for girls and 17.5 for boys. But for adolescence, the transition from childhood to adulthood, there is no clear endpoint. "I don't know of any markers for it," says Till Roenneberg of the Centre for Chronobiology at the University of Munich in Germany. "Everyone talks about it but no one knows when adolescence ends. It is seen as a mixed bag of physical, psychological and sociological factors." His suggestion is based on a study of the sleep habits of 25,000 individuals of all ages in Switzerland and Germany. 
The study looked at when people go to sleep during vacations, when they are free to sleep any time. It reveals a distinct peak of night-owlishness at around age 20. Women reach this peak at 19.5 years old on average, and men at 20.9 years. After that, individuals gradually return to earlier and earlier sleeping patterns, until things go haywire in old age. Roenneberg, whose findings appear in Current Biology (vol 14, p R1038), thinks that the peak in lateness is the first plausible biological marker for the end of adolescence. The study confirms that 20-year-olds sleep any time except in the evening, says Malcolm von Schantz of the Surrey Sleep Research Centre at the University of Surrey, UK. "Don't I know it - they're in my lectures!" The suggestion that this peak in sleep habits marks the end of adolescence is intriguing, he says, but more research will be needed to prove these behavioural changes are a result of physiological changes, rather than lifestyle. "A lot of things happen to you around this age," von Schantz points out. If it is a physiological effect, forcing teenagers to get to school for, say, 8 am, could be a mistake, Roenneberg says. They probably take nothing in for the first two lessons because they are still in biological "sleep time", and end up with a horrendous sleep deficit by the weekend. ------------- Letters: Growth spurts http://www.newscientist.com/article.ns?id=mg13818715.600&print=true * 01 May 1993 * TIM BROMAGE Barry Bogin has written a marvellous explanation about why we go through the adolescent growth spurt ('Why must I be a teenager at all?', 6 March). We can equally marvel at another spurt not often mentioned by biologists and anthropologists, namely the mid-childhood spurt occurring between ages six and eight. It varies in magnitude and shows itself as a mere blip on Bogin's growth rate curve, but it has a very interesting history. First, it is an opportunity for body growth to get its due after so many years of devoting unequal resources to growing a large brain. Every parent will attest to the struggle of dressing children and the seeming incoherence of a clothing industry that makes pullovers to fit little bodies that have to pull them over a brain which is nearly 95 per cent of its adult size at 6 years of age. Second, this spurt is related to the adolescent spurt in developed nations. Puberty is not only the time of accelerated growth, it marks the beginning of the end of growth too. In order for adolescents to reach the normal height for their population they must be so far along by the time puberty hits them. Thus for children with relatively short pre-pubertal growth periods, an extra push is needed early on (the mid-childhood spurt) to get them to their right preadolescent height before the pubertal growth spurt. Children of populations with puberty closer to 16 years of age, such as those of some underdeveloped nations, do not experience the mid-childhood spurt because they have more prepubertal time to grow up. Tim Bromage City University of New York -------------- Spotty genes - News http://www.newscientist.com/article.ns?id=mg17623742.500&print=true * 21 December 2002 HERE'S something else grumpy teenagers can blame their parents for - their zits. In a study of identical and non-identical twins, Veronique Bataille of St Thomas' Hospital in London and her team have shown that acne is 80 per cent genetic. Environmental factors, such as eating the wrong foods or wearing greasy make-up, are relatively unimportant. 
They will report their results in The Journal of Investigative Dermatology. Finding the genes involved could clarify what triggers acne, and possibly lead to cheaper, more effective treatments. Worldwide, prescription drugs for acne cost about $4 billion each year. ------------ Fine young slobs?: Kids who spend hours hunched in front of television or computer screens may look as healthy as their active brothers and sisters. But are they storing up trouble? Helen Saul reports http://www.newscientist.com/article.ns?id=mg14219223.600&print=true * 23 April 1994 * HELEN SAUL A teenage computer games addict sits engrossed in front of a screen, hardly moving for hours on end. A parent, worried about strangers and excessive traffic on the roads, insists on driving the children to school. Increasing pressure on children to perform well in academic subjects relegates physical exercise to the bottom of the priority list at school. It all seems like a recipe for weak bodies and illness. But are today's children and adolescents really as sedentary and unfit as popular images would have us believe? And if they are, what should be done to persuade them to be more active? On the face of it, the facts are hardly encouraging. The Broadcasting Audience Research Board's figures for 1993 show that in Britain children between the ages of four and 15 watch an average of between two and three hours of TV each day. The National Curriculum for schools allocates an average of only one hour a week for physical education, and in practice less than 10 per cent of this time is spent exercising. Moreover, as many as one in three children do less than the equivalent of a 10-minute walk each day, according to Neil Armstrong and his research team at the University of Exeter. Yet just as politicians in Britain want to see a return to competitive, team-oriented sport in schools, experts are stressing the need for exercise regimes geared for individuals which require a minimum of formal training. Researchers, meanwhile, are busy questioning the underlying assumptions that today's young couch potatoes are physically weaker than their supposedly more active forbears. For children, it may be a mistake to equate physical inactivity with low physical fitness, says Armstrong. He and his colleagues have spent the past nine years monitoring the links between exercise and fitness in some 700 children, aged between 9 and 16. Part of this endeavour has involved measuring how the hearts and lungs of some of these children perform during strenuous exercise. The result is surprising: the hearts and lungs of inactive children perform just as well as those of habitually active children - and just as well as those of children of previous generations. 'There's no scientific evidence to show that children are less fit than they used to be,' says Armstrong So childhood laziness is no bad thing? Not quite. Physical fitness is clearly influenced by a myriad factors other than exercise, including genetics and diet. It also means different things to different people: a long-distance runner and a weightlifter are both physically fit, but in different ways and because of different exercise regimes. The way hearts and lungs respond to aerobic activity is certainly one measure of physical fitness. But the strength and stamina of skeletal muscles may be just as important. Exactly how important is unclear, for few long-term studies of the effects of exercise on children have attempted to examine all these factors. 
What's more, even if childhood laziness does not erode physical fitness immediately, children who fail to form the 'exercise habit' are likely to regret it later in life. Studies of adults show that a sedentary lifestyle is as likely to cause heart disease as high blood pressure, smoking or high cholesterol levels. People who fail to take physical exercise are thought to be twice as likely to contract coronary heart disease. They also run higher-than-average risks of developing breast cancer, diabetes and osteoporosis. In the US alone, physical inactivity is estimated to cause 250 000 deaths a year.
Treadmill test
Adults who don't exercise also perform badly on tests of heart and lung fitness. Put them on a treadmill or cycle ergometer and measure their aerobic fitness by monitoring oxygen uptake, heart rate and carbon dioxide exhaled and you will find that their lungs can't take up as much oxygen as their active counterparts. Armstrong and his colleagues wanted to find out if the same was true for children. So they tested 420 children, measuring their oxygen uptake as they exercised on treadmills or cycle ergometers. The tests supported the results of similar tests carried out some fifty years ago in Chicago. The study also shows that, in contrast to adults, active children perform no better in tests of aerobic fitness than inactive children. Studies in other parts of the world show a similar trend, says Armstrong. Steven Blair, director of epidemiology at the Cooper Institute for Aerobics Research in Dallas, believes that one in five children in the US is physically unfit. But he agrees with Armstrong that the limited data available suggest there has been no major change in physical fitness among young Americans over the past few decades. 'It's very popular in the States for people to dash about saying "It's terrible. It's getting worse. More and more children are getting more and more unfit." But how do they know that?' The links between exercise and fitness in children have always been uncertain. It is doubtful that studies based on any one measure of fitness can resolve the main questions. This is certainly true of tests based on oxygen uptake. For one thing, oxygen uptake varies from child to child because of genetic differences that influence muscle growth, strength of the heart and so on. It could be that peak oxygen uptake is too crude a measure to pick up a slight deterioration in lung and heart fitness. Moreover, even if the hearts and lungs of children do not weaken with lack of exercise, that does not necessarily mean that inactive children are physically fit in a broader sense. Yet there are other signs that exercise may not be as important to the fitness of children as it is to adults. In adults, for instance, there is a clear link between physical exercise and blood levels of high density lipoprotein, or 'good' cholesterol. This substance acts to prevent the clogging up of arteries that is caused by high levels of 'bad' cholesterol, or low-density lipoprotein. The more exercise people take, the higher their HDL levels and the more favourable the ratio between HDL and LDL, which remains constant. But there is no evidence that the same is true in children. This, however, may be no reason to celebrate. If children can't feel the physical benefits of exercise, won't it prove harder to persuade them of its value? 
'There's no way you can convince 15-year-olds that by being more active today, they'll be less likely to get coronary heart disease when they're 50,' says Oded Bar-Or, professor of exercise sciences at McMaster University in Hamilton, Ontario. The problem is compounded by the fact that inactivity often goes hand in hand with eating too many fatty or sugary foods. More than two in five children in Britain have total cholesterol levels (combined LDL and HDL levels) above the American Health Foundation's safety limit. This is calculated from the cholesterol level in adults that is known to increase the risk of heart disease, taking into account the fact that cholesterol levels increase gradually with age. All the signs are that most children are not active enough to form the kind of exercise habit that could protect them from ill health later in life. As part of the Exeter study, for instance, researchers used portable heart monitors to capture the heart rates of some 266 children from 9.00 am until 9.00 pm. The portable devices used a small transmitter on the chest to send the heart rate data to a receiver on a wrist band. Previously, the researchers had established that, on average, the heart rate of children walking on a treadmill at 6 kilometres per hour is 140 beats per minute. From this, it was possible to calculate that a third of the boys and half of the girls do not even do the equivalent of a brisk 10-minute walk a day. But not everyone is pessimistic. Drawing on data reported in 43 different epidemiological studies, Blair concludes that children and adults alike can reduce the risk of heart disease later in life by burning off just three kilocalories per kilogram of body weight per day. For a child weighing 40 kilograms, this amounts to 120 kcal a day - equivalent to the energy contained in an average biscuit. In a study of 1800 boys and girls aged between 10 and 18 years, Blair found that most met this standard.
Exercise habit
That said, most researchers see no harm in encouraging all children to be more active. 'It's all pluses,' says Armstrong. The biggest plus of all is that having developed an exercise habit, a child may be more likely to retain it throughout adulthood. But do active children necessarily become active adults? What little evidence there is suggests the answer is yes. In 1990, researchers in Britain questioned more than 4000 adults on their behaviour, attitudes and beliefs about activity and fitness as part of the Allied Dunbar National Fitness Survey. A quarter of those who said that they were active as teenagers also said that they were active as adults. Only two per cent of those who said that they were inactive at the younger age said that they were active as adults. Ken Fox, who lectures in physical exercise at the University of Exeter, believes that what motivates people to do exercise shifts over time. Overemphasising sports and team games is one reason why adolescents drop out, he says. This is particularly true of girls, whose participation in exercise during adolescence falls off more dramatically than that of boys. Groups of girls may not value formalised activities such as competitive sport, he says. 'But aerobics and dance may be more socially acceptable to some groups of girls. And if they value these activities, they're more likely to make the decision to take part.' 
Bill Kohl, director of the division of childhood and adolescent medicine at the Cooper Institute for Aerobics Research, says: 'Fitness and activity is for all kids, not just for those who are athletically gifted. It's not necessary to run a marathon or be Sebastian Coe to get health benefits from physical activity.' Helen Saul is a freelance writer specialising in health and medicine.
-------------
Teenagers special: Brain storm http://www.newscientist.com/article.ns?id=mg18524891.200&print=true * 05 March 2005
Prefrontal cortex
The prefrontal cortex is the home of "executive" functioning, high-level cognitive processes that, among other things, allow us to develop detailed plans, execute them, and block irrelevant actions. This area undergoes a bulking up between the ages of 10 and 12, followed by a dramatic decline in size that continues into the early 20s. This is probably due to a burst of neuronal growth followed by a "pruning" stage in which pathways that are not needed are lost. If the adolescent's brain is still bedding down its executive functions, this might help explain why teenagers can sometimes seem so disorganised and irrational.
Right ventral striatum
This area of the brain is thought to be involved in motivating reward-seeking behaviour. A study last year showed that teenagers had less activity than adults in this part of the brain during a reward-based gambling game. The researchers speculate that teens may be driven to risky but potentially high-reward behaviours such as shoplifting and drug-taking because this area is underactive.
Pineal gland
The pineal gland produces the hormone melatonin, levels of which rise in the evening, signalling to the body that it is time to sleep. During adolescence melatonin peaks later in the day than in children or adults. This could be why teenagers tend to be so fond of late nights and morning lie-ins.
Corpus callosum
This is a bundle of nerve fibres linking the left and right sides of the brain. The parts thought to be involved in language learning undergo high growth rates before and during puberty, but this growth then slows. This might help explain why the ability to learn new languages declines rapidly after the age of 12.
Cerebellum
This part of the brain continues to grow until late adolescence. It governs posture and movement, helping to maintain balance and ensure that movements are smooth and directed. It influences other regions of the brain responsible for motor activity and may also be involved in language and other cognitive functions.
------------
Teenagers special: Going all the way http://www.newscientist.com/article.ns?id=mg18524891.300&print=true * 05 March 2005 * Alison George LYNSEY TULLIN was 15 when she became pregnant. The only contraception she and her boyfriend had used was wishful thinking: "I didn't think it would happen to me," she says. Tullin, who lives in Oldham in northern England, decided to keep the baby, now aged 3, although as a consequence her father has disowned her. Tullin is not alone. In the UK nearly 3 per cent of females aged 15 to 19 became mothers in 2002, many of them unintentionally. And unplanned pregnancies are not the only consequence of teenage sex - rates of sexually transmitted diseases (STDs) are also rocketing in British adolescents, both male and female. The numerous and complex societal trends behind these statistics have been endlessly debated without any easy solutions emerging. Policy makers tend to focus on the direct approach, targeting young adolescents in the classroom. 
In many western schools teenagers get sex education classes giving explicit information about sex and contraception. But recently there has been a resurgence of some old-fashioned advice: just say no. The so-called abstinence movement urges teens to take virginity pledges and cites condoms only to stress their failure rate. It is sweeping the US, and is now being exported to countries such as the UK and Australia. Confusingly, both sides claim their strategy is the one that leads to fewest pregnancies and STD cases. But a close look at the research evidence should give both sides pause for thought. It is a morally charged debate in which each camp holds entrenched views, and opinions seem to be based less on facts than on ideology. "It's a field fraught with subjective views," says Douglas Kirby, a sex education researcher for the public-health consultancy ETR Associates in Scotts Valley, California. For most of history, pregnancy in adolescence has been regarded not as a problem but as something that is normal, so long as it happens within marriage. Today some may still feel there is nothing unnatural about older adolescents in particular becoming parents. But in industrialised countries where extended education and careers for women are becoming the norm, parenthood can be a distinct disadvantage. Teenage mums are more likely to drop out of education, to be unemployed and to have depression. Their children run a bigger risk of being neglected or abused, growing up without a father, failing at school and abusing drugs. The US has by far the highest number of teenage pregnancies and births in the west; 4.3 per cent of females aged between 15 and 19 gave birth there in 2002. This is significantly higher than the rate in the UK (2.8 per cent), which itself has the highest rate in western Europe (see Chart). Another alarming statistic is the number of teenagers catching STDs. In the UK the incidences of chlamydia, syphilis and gonorrhoea in under-20s have all more than doubled since 1995. The biggest rise has been in chlamydia infections in females under 20; cases have more than tripled, up to 18,674 in 2003. Chlamydia often causes no symptoms for many years but it can lead to infertility in women and painful inflammation of the testicles in men. No surprise, then, that teenage sex and pregnancy has become a political issue. The UK government has set a target to halve the country's teen pregnancy rate by 2010, and the US government has set similar goals. But achieving these targets will not be easy. In an age when adolescence has never been so sexualised, in most western countries people often begin to have sex in their mid to late teens; by the age of 17, between 50 and 60 per cent are no longer virgins. Since the 1960s, UK schools have increasingly accepted that many teenagers will end up having sex and have focused efforts on trying to minimise any ensuing harm. Sex education typically involves describing the mechanics of sex and explaining how various contraceptives work, with particular emphasis on condoms because of the protection they provide from many STDs. The sex education strategy gained further support in the early 1990s when policy makers looked to the Netherlands. There, teenage birth rates have plummeted since the 1970s and are now among the lowest in Europe, with about 0.8 per cent of females aged between 15 and 19 giving birth in 2002. No one knows why for sure, as Dutch culture differs from that of the UK and America in several ways. 
But it is generally attributed to frank sex education in schools and open attitudes to sex. Dutch teenagers, says Roger Ingham, director of the Centre for Sexual Health Research at the University of Southampton,"have less casual sex and are older when they first have sex compared with the UK". But a new sexual revolution is under way. Spearheaded by the religious right, the so-called abstinence movement is based on the premise that sex outside marriage is morally wrong. "We're trying to say there's another approach to your sexuality," says Jimmy Hester, co-founder of one of the oldest pro-abstinence campaigns, True Love Waits, based in Nashville, Tennessee. Abstinence-based education got US government backing in 1981, when Congress passed a law to fund sex education that promoted self-restraint. More money was allocated through welfare laws passed in 1996, which provided $50 million a year. A key plank of the abstinence approach is to avoid giving advice on contraception. The logic is that such information would give the message that it's OK to have sex. "The moment we do that, we water down the commitment," says Hester. If contraception is mentioned at all, it is to highlight its failings - often using inaccurate or distorted data. A report for the US House of Representatives published last December found that 11 out of the 13 federally funded abstinence programmes studied contained false or misleading information. Examples of inaccurate statements included: "Pregnancy occurs one out of every seven times that couples use condoms," and: "Condoms fail to prevent HIV 31 per cent of the time." They also use some questionable logic regarding the success rate of abstinence (see "Heads I win, tails you lose"). While some states advocate "abstinence-plus" programmes, providing a level of advice on contraception alongside heavy promotion of chastity, the hard-line "abstinence only" approach is in the ascendant in the US. Around a third of US secondary schools have abstinence-only programmes, and nearly 3 million young people have publicly pledged to remain virgins until they marry. And it is spreading. Last June an American group came to the UK to promote the Silver Ring Thing, a Christian movement that encourages teens to publicly pledge to remain virgins until marriage and to keep their promise with the aid of a $12 ring. And True Love Waits has held virginity rallies in Australia. This trend comes amid claims that the UK's more liberal approach not only does not work, but has the opposite effect. "Free pills and condoms boost promiscuity" screamed the headline on the front page of UK newspaper The Times last year (5 April 2004). It was prompted by research by David Paton, an economist at the University of Nottingham, UK, which found that in some areas that had increased access to family planning services, teen pregnancy rates had remained the same and STD rates had actually risen. There are now increasing calls from conservative and religious groups for schools in the UK to consider the abstinence option. A programme called Love for Life is now operating in 60 per cent of schools in Northern Ireland. It could be described as abstinence-plus that is heavy on the abstinence. Its founder, Richard Barr, a GP from Craigavon, County Armagh, says that focusing on contraception ignores the bigger picture of human sexuality. "There's a massive need for a more holistic approach, not just a damage-limitation approach." 
And the UK mainland is home to a small but growing number of groups, most of them with Christian roots, promoting abstinence-centred education. The word abstinence is less in vogue than across the Atlantic, however, and such groups are more likely to talk in terms of delaying sex until young people are in a committed relationship. But does the abstinence approach work? Do teenagers - a group not renowned for their propensity to do what they are told - take any notice when adults tell them not to have sex? Proponents of abstinence claim research supports their strategy. But the vast majority of studies that have been done in this area have been small, short-term evaluations without control groups. "There have only been three well-designed trials where an 'intervention' group is compared with a control group and participants are tracked over time," says Kirby. One of these, published in 1997, looked at a five-session abstinence-only initiative in California. The trial tracked 10,600 teenagers for 17 months (Family Planning Perspectives, vol 29, p 100). The researchers found it had no impact on the sexual behaviour or pregnancy rates of teenagers. The other two studies had similar results. "None of them show that any abstinence-only programmes had any impact on behaviour," says Kirby. Although not a controlled trial, one of the largest studies of the effect of abstinence pledges tracked the sex lives of 12,000 US teenagers aged between 12 and 18 (American Journal of Sociology, vol 106, p 859). A group led by Peter Bearman, a sociologist at Columbia University in New York, investigated whether taking a virginity pledge affected the age when people first had sex. It did, with an average delay of 18 months. The pledgers also got married earlier and had fewer partners overall. But when Bearman went back six years later and looked at the STD rates in the same people, now aged between 18 and 24, he was in for a surprise. In research presented at the National STD conference in Philadelphia last year, he found that though pledgers had had fewer sexual partners than non-pledgers, they were just as likely to have had an STD. And the reason? "Pledgers use condoms less," says Bearman. "It's difficult to simultaneously imagine not intending to have sex and being contraceptively prepared." Here lies the problem that many have with the idea of abstinence-only education. While it may work for those kids who live up to the ideal, those who don't are left without the knowledge to protect themselves when they do have sex. "It's not rocket science," says Bearman. But here's where proponents of the liberal approach can stop feeling smug. Because despite many people's unquestioning assumption that comprehensive sex education is the best way to reduce teenage pregnancy, there is actually little good-quality evidence backing this view. One of the problems in carrying out randomised controlled trials in this area is the question of who should be used as the control group. Most schools now have some form of sex education in place, however rudimentary, and it would be unethical to take this away from some children to create the control group. Instead researchers have tended to compare standard sex education with new initiatives specially designed to reduce pregnancy rates. But the results have been unimpressive. A systematic review in 2002 of 26 such studies showed that not one of them improved the use of birth control or reduced the teenage pregnancy rate (British Medical Journal, vol 324, p 1426). 
But in the past few years, a handful of randomised controlled trials have been published showing that some carefully designed sex education programmes do appear to work. One of the most effective is the Carrera Adolescent Pregnancy Prevention Program, aimed at 13 to 15-year-olds in a poor area of New York (Perspectives on Sexual and Reproductive Health, vol 34, p 244). Abstinence is mentioned during the programme, but most of the emphasis is on contraception. A three-year study showed that the pregnancy rate of teenage girls who took the programme was less than half the rate of those who didn't. Analysis showed this was due to both greater condom use and delayed onset of sex. Why should these programmes be any different? As well as lasting longer, they were, says Kirby, "interactive and personalised, not just abstract facts". The Carrera programme, for example, not only covered sexual behaviour, it tackled the social disadvantages that lead to teenage pregnancy. Along with information on and free access to contraceptives, it involved intensive youth work such as sports, job clubs and homework help. Most UK sex education programmes seem half-hearted in comparison, providing the bare biological facts, perhaps alongside a demonstration of how to put a condom on a cucumber. "It's something I feel quite angry about," says Michael Adler, a former STD physician at University College London Hospital. In his job he saw many casualties of unsafe sex. "We're failing young people right at the beginning," he says. Unfortunately policy makers have recently lost a good source of information about what works and what doesn't. The US Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia, commissioned a panel of external experts to carry out a rigorous review of various sex education programmes. The panel identified five strategies that were successful in reducing the rate of teenage pregnancy, all based on comprehensive sex education, and the details were posted on the organisation's website. But in 2002 that information disappeared and the CDC will no longer release it. According to the CDC press office, the review programme is being "re-evaluated". But sceptics fear it has been dumped because its conclusions don't fit with the Bush administration's views. "They were inconsistent with the ideology to which this administration adheres," says Bill Smith of the Sexuality Information and Education Council of the United States, a liberal sex education advocacy group based in New York. What of the study that made the newspaper headlines in the UK last year, showing that contraception provision is linked with higher STD rates? Perhaps it should not really be taken as a damning indictment of the liberal approach. The study looked at National Health Service family planning clinics, not school-based comprehensive sex education. Simply doling out condoms without tackling the wider issues is unlikely to have much impact. Anyway, should the correlation between sex clinics and STD levels really be so surprising? "Has it occurred to [David Paton] that they put more services in areas with high rates?" asks Roger Ingham. In fact, amid all the scare stories, the average age when a person first has sex now appears to be levelling out at around 17 in the US and 16 in the UK. And although rates of STDs are on the increase in the UK, teenage pregnancy and birth rates are on a downward trend, as they have been in most developed countries for several years. 
A report from the Alan Guttmacher Institute, a reproductive health research group in New York, concludes this is due to factors such as the rise of careers for women, and the increasing importance of education and training (Family Planning Perspectives, vol 32, p 14). Perhaps it is unsurprising, then, that it is among society's lowest income groups that teen pregnancy rates are highest. In the face of such complex societal forces, those who try to influence teenagers' behaviour on a day-to-day basis undoubtedly have a tough job on their hands. There may be no single solution. More research is needed to produce detailed information on which kind of sex education programmes work best, and in which contexts. One approach is to involve older teenagers, on the premise that 14-year-olds may be more likely to listen to 18-year-olds than people of their parents' generation. Since having her son, Lynsey Tullin has started working for Brook, a young people's sexual health charity, to ensure that today's teenagers are more savvy about sex. "We talk the same language," she says. A tactic that she finds hits home is to describe new parenthood in all its gory details - the nappies, the lack of sleep, a social life in tatters. "We run workshops about being parents, telling them what we went through," she says. "It's a shock."
Different approaches to teenage sexuality
* Comprehensive sex education: Provides explicit information about contraception, sexuality and sexual health
* Abstinence-only approach: Teaches that the only place for sex is within marriage, and the only certain way to avoid pregnancy and STDs is abstinence. Does not teach about contraception
* Abstinence-plus: Promotes abstinence as the best choice, but provides varying degrees of information on contraception in case teens do become sexually active
Heads I win, tails you lose
LOOK at any abstinence-only literature, and you'll read that this is the only certain way to prevent pregnancy and avoid catching a sexually transmitted disease (STD). "Abstinence. Failure rate 0 per cent," is the claim on one pro-abstinence website. But does this make sense? The most important measure of any method of preventing pregnancy and STDs is not its ideal effectiveness, but its "use effectiveness" - how successful it is in the real, sometimes messy, world of sex. Condoms, for instance, have a 97 per cent success rate at preventing pregnancy if used correctly, but have an estimated use-effectiveness of 86 per cent, due to problems such as tearing or slipping. If people who intend to use condoms but never get as far as opening the pack are included, some studies suggest the use-effectiveness of condoms could be as low as 30 per cent - the sort of figure abstinence fans shout from the rooftops. What about applying the same real-world rules to abstinence? Unfortunately there are no studies detailing the use-effectiveness of abstinence in preventing pregnancy, but it is highly unlikely to be 100 per cent, as commonly claimed by its proponents. Their reasoning goes like this: individuals who set out to remain abstinent but succumb to temptation and have sex are no longer seen as abstinence "users". And those who become pregnant may even be marked up as a failure for the contraception strategy if, say, they attempted to use a condom but bungled it. Abstinence campaigners are very vocal about the failings of contraception. But is it perhaps time to own up about the failure rate of abstinence? 
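The gap between the perfect-use and use-effectiveness figures quoted above can be made concrete with a toy calculation. The sketch below is not from the article: it simply assumes that occasions on which the method is actually used get the quoted 97 per cent perfect-use protection, while "lapsed" occasions (the pack never opened) get none, and shows which hypothetical lapse fractions would reproduce the 86 and 30 per cent figures.

# Illustrative sketch only; the lapse fractions are assumptions chosen to
# match the percentages quoted in the article, not measured data.
def blended_effectiveness(perfect_use_eff: float, lapse_fraction: float) -> float:
    """Average protection over used and lapsed occasions (lapses give none)."""
    return (1.0 - lapse_fraction) * perfect_use_eff

PERFECT_USE = 0.97  # condom perfect-use figure quoted in the article

for lapse in (0.0, 0.11, 0.69):  # hypothetical fractions of lapsed occasions
    print(f"lapse fraction {lapse:.0%} -> effectiveness ~{blended_effectiveness(PERFECT_USE, lapse):.0%}")

Run as written this prints roughly 97, 86 and 30 per cent, which is why the same bookkeeping question - who still counts as a "user" after a lapse - matters just as much when assessing the claimed 0 per cent failure rate of abstinence.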
------------- Teenagers special: Bully boys http://www.newscientist.com/article.ns?id=mg18524891.400&print=true * 05 March 2005 * Clare Wilson LAST year the UK pop music station BBC Radio 1 mounted a "Beat Bullying" campaign, and over six weeks it was flooded with more than 1 million requests for its free "Beat Bullying" wristbands. As it struggled to meet demand, a thriving market opened up on eBay for these blue plastic bracelets. Bullying, it seems, struck a nerve - which is hardly surprising, given that an estimated 1 in 5 secondary schoolchildren in the UK has been bullied. Most efforts to tackle the problem involve working with the perpetrators as well as their victims. Teachers may be urged to help bullies recognise and modify their behaviour. Bullies, it is commonly believed, often come from unaffectionate or violent families, and may have poor social skills and low self-esteem. "Particularly in America, the traditional view is that [bullies] are malfunctioning," says Peter Smith, a psychologist at Goldsmiths College, University of London, who has advised the UK government on how best to tackle bullying. But could this view be wrong? Far from being the result of a damaged psyche, could bullying be a successful social strategy - albeit one that is very unpleasant for people on the receiving end? Over the past few years, some psychologists, including Smith, have started to think so. They believe that at least some kinds of bullying boost the status of the bully among his or her peers. Several studies by Anthony Pellegrini, an evolutionary developmental psychologist at the University of Minnesota, support this theory. In one study published in 2003, he asked a group of 138 schoolchildren aged between 12 and 14 to say how aggressive their classmates were, both physically and psychologically. On a separate scale, they had to say which members of the opposite sex they would ask to a party (Journal of Experimental Child Psychology, vol 85, p 257). Those most likely to get an invite were boys who were physically aggressive and girls who were psychologically so. "Boys have high status with their male peers if they're bullies, and girls like them," says Pellegrini. If it turns out to be true that bullying raises the perpetrator's social status, trying to change bullies' behaviour by boosting social skills and self-esteem may not work. "Some bullies, at least, are socially skilled," says Smith. "These skills have a function, which is to enhance your status in a competitive peer group."
-------------- Teenagers special: Live now, pay later http://www.newscientist.com/article.ns?id=mg18524891.500&print=true * 05 March 2005 In the west, some of the biggest threats to teenagers' long-term health stem from bad habits such as eating unhealthily and smoking. Policy makers are also paying growing attention to adolescents' mental health.
* Fewer than 20 per cent of 13-to-15-year-olds in England eat the recommended five portions of fruit and vegetables a day
* American teenagers spend an average of 3 to 4 hours a day watching TV
* In Australia, 20 to 25 per cent of under-17-year-olds are overweight or obese
* Almost a quarter of 15 and 16-year-olds in the UK smoke regularly
* Some estimates suggest that up to 1 in 5 adolescents have some form of psychological problem, ranging from eating disorders to depression or self-harming
* In England 11 per cent of 11-to-15-year-olds have used drugs in the last month. 
From checker at panix.com Mon May 2 16:22:41 2005 From: checker at panix.com (Premise Checker) Date: Mon, 2 May 2005 12:22:41 -0400 (EDT) Subject: [Paleopsych] Wiki: Moral relativism Message-ID: Moral relativism http://en.wikipedia.org/wiki/Moral_relativism >From Wikipedia, the free encyclopedia. Moral relativism is the position that [2]moral propositions do not reflect [3]absolute or [4]universal truths. It not only holds that ethical judgments emerge from social [5]customs and personal preferences, but also that there is no single standard by which to assess an ethical proposition's truth. Many relativists see moral [6]values as applicable only within certain cultural boundaries. Some would even suggest that one person's ethical judgments or acts cannot be judged by another, though most relativists propound a more limited version of the theory. Some moral relativists -- for example, [7]Jean-Paul Sartre (1905-1980) -- hold that a personal and [8]subjective [9]moral core lies at the foundation of our moral acts. They believe that public [10]morality is a reflection of social convention, and that only personal, subjective morality is truly authentic. Moral relativism is not the same as moral [11]pluralism, which acknowledges the co-existence of opposing ideas and practices, but does not require that they be equally valid. Moral relativism, in contrast, contends that opposing moral positions have no truth value, and that there is no preferred standard of reference by which to judge them. Contents [12]1 History [13]2 Some philosophical considerations [14]3 Critics of relativism [15]4 See also [16]5 References and sources [17]6 External links [[18]edit] History Moral relativism is not new. [19]Protagoras' (circa 481-420 BC) assertion that "man is the measure of all things" is an early [20]philosophical precursor to modern relativism. The [21]Greek historian [22]Herodotus (circa 484-420 BC) observed that each society thinks its own belief system and way of doing things are best. Various ancient [23]philosophers also questioned the idea of an absolute standard of morality. The 18th century [24]Enlightenment philosopher, [25]David Hume (1711-1776), is in several important respects the father of both modern [26]emotivism and moral relativism, though Hume himself was not a relativist. He distinguished between matters of fact and matters of value, and suggested that moral judgments consist of the latter, for they do not deal with verifiable facts that obtain in the world, but only with our sentiments and passions, though he argued that some of our sentiments are universal. He is famous for denying any objective standard for morality, and suggested that the universe is indifferent to our preferences and our troubles. In the modern era, [27]anthropologists such as [28]Ruth Benedict (1887-1948), cautioned observers not to use their own cultural standards to evaluate those they were studying, which is known as [29]ethnocentricism. Benedict said there are no morals, only customs, and in comparing customs, the anthropologist, "insofar as he remains an anthropologist ... is bound to avoid any weighting of one in favor of the other." To some extent, the increasing body of knowledge of great differences in belief among societies caused both social scientists and philosophers to question whether there can be any objective, absolute standards pertaining to values. This caused some to posit that differing systems have equal validity, with no standard for adjudicating among conflicting beliefs. 
The Finnish philosopher-anthropologist, [30]Edward Westermarck (1862-1939) was among the first to formulate a detailed theory of moral relativism. He contended that all moral ideas are subjective judgments that reflect one's upbringing. He rejected [31]G.E. Moore's (1873-1958) intuitionism -- in vogue during the early part of the 20th century, and which identified moral propositions as true or false, and known to us through a special faculty of [32]intuition -- due to the obvious differences in beliefs among societies, which he said was evidence that there is no innate, intuitive power. [[33]edit] Some philosophical considerations So-called descriptive or normative relativists (for example, [34]Ralph Barton Perry), accept that there are fundamental disagreements about the right course of action even when the same facts obtain and the same consequences are likely to arise. However, the descriptive relativist does not necessarily deny that there is one correct moral appraisal, given the same set of circumstances. Other descriptivists believe that opposing moral beliefs can both be true, though critics point out that this leads to obvious logical problems. The latter descriptivists, for example, several leading [35]Existentialists, believe that morality is entirely subjective and personal, and beyond the judgment of others. In this view, moral judgments are more akin to aesthetic considerations and are not amenable to rational analysis. In contrast, the metaethical relativist maintains that all moral judgments are based on either societal or individual standards, and that there is no single, objective standard by which one can assess the truth of a moral proposition. While he preferred to deal with more practical, real-life ethical matters, the British philosopher [36]Bernard Williams (1929-2003) reluctantly came to this conclusion when he put on his metaethicist's hat. Metaethical relativists, in general, believe that the descriptive properties of terms such as good, bad, right, and wrong are not subject to [37]universal [38]truth conditions, but only to societal convention and personal preference. Given the same set of verifiable facts, some societies or individuals will have a fundamental disagreement about what ought to be done based on societal or individiual norms, and these cannot be adjudicated using some independent standard of evaluation, for the latter standard will always be societal or personal and not universal, unlike, for example, the scientific standards for assessing temperature or for determining mathematical truths. Moral relativism stands in marked contrast to [39]moral absolutism, [40]moral realism, and [41]moral naturalism, which all maintain that there are moral facts, facts that can be both known and judged, whether through some process of verification or through intuition. These philosophies see morality as something that obtains in the world. Examples include the philosophy of [42]Jean-Jacques Rousseau (1712-1778), who saw man's nature as inherently good, or of [43]Ayn Rand, who believed morality is derived from man's exercising his unobstructed rationality. Others believe moral knowledge is something that can be derived by external sources such as a deity or revealed doctrines, as would be maintained by various [44]religions. Some hold that moral facts inhere in nature or reality, either as particular instances of perfect ideas in an eternal realm, as adumbrated by [45]Plato (429-347 BC); or as a simple, unanalyzable property, as advocated by Moore. 
In each case, however, moral facts are invariant, though the circumstances to which they apply might be different. Moreover, in each case, moral facts are objective and can be determined. Some philosophers maintain that moral relativism devolves into [46]emotivism, the movement inspired by [47]logical positivists in the early part of the 20th century. Leading exponents of logical positivism include [48]Rudolf Carnap (1891-1970) and [49]A. J. Ayer (1910-1989). Going beyond Hume, the positivists contended that a proposition is meaningful only if it can be verified by [50]logical or scientific inquiry. Thus, [51]metaphysical propositions, which cannot be verified in this manner, are not simply incorrect, they are meaningless, nonsensical. Moral judgments are primarily expressions of emotional preferences or states, devoid of cognitive content; consequently, they are not subject to verification. As such, moral propositions are essentially meaningless utterances or, at best, express personal attitudes (see, for example, [52]Charles L. Stevenson (1908-1979)). Not all relativists would hold that moral propositions are meaningless; indeed, many make any number of assertions about morality, assertions that they undoubtedly believe to be meaningful. However, other philosophers have argued that, since we have no means of analysing a moral proposition, it is essentially meaningless, and, in their view, relativism is therefore tantamount to emotivism. The political theorist, [53]Leo Strauss (1899-1973), subscribed to a species of relativism, for he believed there are no objective criteria for assessing ethical principles, and that a rational morality is only possible in the limited sense that one must accept its ultimate subjectivity. This view is very similar to the one advocated by the existentialist philosophers, [54]Martin Heidegger (1889-1976) and Sartre. The latter famously maintained that ethical principles only arise from our personal feelings at the time we act, and not from any antecedent principles. [55]Karl Marx (1818-1883) was a moral relativist, for he thought each moral system was simply a product of the dominant class, and that the movement of history will settle moral questions, not a fixed, universal standard.
Critics of relativism
Those who believe in moral absolutes are often highly critical of moral relativism; some have been known to equate it with outright immorality or amorality. [57]The Holocaust, [58]Stalinism, [59]apartheid, [60]genocide, [61]unjust wars, [62]genital mutilation, [63]slavery, [64]terrorism, [65]Nazism, etc., present difficult problems for relativists. An observer in a particular time and place, depending on his outlook (e.g., culture, religion, background), might call something good that another observer in a particular time and place would call evil. Slavery, for example, was thought by many to be acceptable, even good, in other times and places, while it is viewed by many (though certainly not all), today, as a great evil. Many critics of relativism would say that any number of evils can be justified based on subjective or cultural preferences, and that morality requires some universal standard against which to measure ethical judgments. Some relativists will state that this is an unfair criticism of relativism, for it is really a metaethical theory, and not a normative one, and that the relativist may have strong moral beliefs, notwithstanding his foundational position. 
Critics of this view, however, argue the complaint is disingenuous, and that the relativist is not making a mere metaethical assertion; that is, one that deals with the logical or linguistic structure of ethical propositions. These critics contend that stating there is no preferred standard of truth, or that standards are equally true, addresses the ultimate validity and truth of the ethical judgments themselves, which, they contend, is a normative judgment. In other words, the separation between metaethics and normative ethics is arguably a distinction without a difference. Some philosophers, for example, [66]Michael E. Berumen (1952-) and [67]R. M. Hare (1919-2002), argue that moral propositions are subject to logical rules, notwithstanding the absence of any factual content, including those subject to cultural or religious standards or norms. Thus, for example, they contend that one cannot hold contradictory ethical judgments. This allows for moral discourse with shared standards, notwithstanding the descriptive properties or truth conditions of moral terms. They do not affirm or deny there are moral facts, only that logic applies to our moral assertions; consequently, they contend, there is an objective and preferred standard of moral justification, albeit in a very limited sense. These philosophers also point out that, aside from logical constraints, all systems treat certain moral terms alike in an evaluative sense. This is similar to our treatment of other terms such as less or more, the meaning of which is universally understood and not dependent upon independent standards (measurements, for example, can be converted). It applies to good and bad when used in their non-moral sense, too: for example, when we say, "this is a good wrench" or "this is a bad wheel." This evaluative property of certain terms also allows people of different beliefs to have meaningful discussions on moral questions, even though they disagree about certain facts. Berumen, among others, has said that if relativism were wholly true, there would be no reason to prefer it over any other theory, given its fundamental contention that there is no preferred standard of truth. He says that it is not simply a metaethical theory, but a normative one, and that its truth, by its own definition, cannot in the final analysis be assessed or weighed against other theories.
See also
* [69]Analytical philosophy * [70]Anthropology * [71]Business ethics * [72]Deontology * [73]Emotivism * [74]Ethics * [75]Logic * [76]Metaethics * [77]Moral codes * [78]Moral purchasing * [79]Morality * [80]Objectivism * [81]Philosophy * [82]Situational ethics * [83]Subjectivism
References and sources
Curt Baier, "Difficulties in the Emotive-Imperative Theory" in Moral Judgement: Readings in Contemporary Meta-Ethics
Ruth Benedict, Patterns of Culture (Mentor)
Michael E. Berumen, Do No Evil: Ethics with Applications to Economic Theory and Business (iUniverse)
R.M. Hare, Sorting out Ethics (Oxford University Press)
David Hume, An Enquiry Concerning the Principles of Morals, Edited by Tom L. Beauchamp (Oxford University Press)
G.E. Moore, Principia Ethica (Cambridge University Press)
Jean-Paul Sartre, "Existentialism is a Humanism" in Existentialism From Dostoevsky to Sartre, Edited by Walter Kaufmann (World Publishing Company)
Leo Strauss, The Rebirth of Classical Political Rationalism, Edited by Thomas L. 
Pangle (University of Chicago Press) Edward Westermarck, The Origin and Development of the Moral Ideas (Macmillan) Bernard Williams, Ethics and the Limits of Philosophy (Harvard University Press) [[85]edit] External links * [86]Objectivism and Relativism (http://www.utm.edu/research/iep/e/ethics.htm#Metaphysi cal%20Issues:%20Objectivism%20and%20Relativism) * [87]Moral Relativism (http://www.AllAboutPhilosophy.org/Moral-Relativism.htm ) A Christian Perspective. [89]Categories: [90]Ethics | [91]Social philosophy References 2. http://en.wikipedia.org/wiki/Moral 3. http://en.wikipedia.org/wiki/Moral_absolutism 4. http://en.wikipedia.org/wiki/Moral_universalism 5. http://en.wikipedia.org/wiki/Customs 6. http://en.wikipedia.org/wiki/Values 7. http://en.wikipedia.org/wiki/Jean-Paul_Sartre 8. http://en.wikipedia.org/wiki/Subjective 9. http://en.wikipedia.org/wiki/Moral_core 10. http://en.wikipedia.org/wiki/Morality 11. http://en.wikipedia.org/wiki/Pluralism 12. http://en.wikipedia.org/wiki/Moral_relativism#History 13. http://en.wikipedia.org/wiki/Moral_relativism#Some_philosophical_considerations 14. http://en.wikipedia.org/wiki/Moral_relativism#Critics_of_relativism 15. http://en.wikipedia.org/wiki/Moral_relativism#See_also 16. http://en.wikipedia.org/wiki/Moral_relativism#References_and_sources 17. http://en.wikipedia.org/wiki/Moral_relativism#External_links 18. http://en.wikipedia.org/w/index.php?title=Moral_relativism&action=edit§ion=1 19. http://en.wikipedia.org/wiki/Protagoras 20. http://en.wikipedia.org/wiki/Philosophical 21. http://en.wikipedia.org/wiki/Greek 22. http://en.wikipedia.org/wiki/Herodotus 23. http://en.wikipedia.org/wiki/Philosophers 24. http://en.wikipedia.org/wiki/Enlightenment 25. http://en.wikipedia.org/wiki/David_Hume 26. http://en.wikipedia.org/wiki/Emotivism 27. http://en.wikipedia.org/wiki/Anthropologists 28. http://en.wikipedia.org/wiki/Ruth_Benedict 29. http://en.wikipedia.org/wiki/Ethnocentricism 30. http://en.wikipedia.org/wiki/Edward_Westermarck 31. http://en.wikipedia.org/wiki/G.E._Moore 32. http://en.wikipedia.org/wiki/Intuition 33. http://en.wikipedia.org/w/index.php?title=Moral_relativism&action=edit§ion=2 34. http://en.wikipedia.org/w/index.php?title=Ralph_Barton_Perry&action=edit 35. http://en.wikipedia.org/wiki/Existentialists 36. http://en.wikipedia.org/wiki/Bernard_Williams 37. http://en.wikipedia.org/wiki/Universal 38. http://en.wikipedia.org/wiki/Truth 39. http://en.wikipedia.org/wiki/Moral_absolutism 40. http://en.wikipedia.org/wiki/Moral_realism 41. http://en.wikipedia.org/w/index.php?title=Moral_naturalism&action=edit 42. http://en.wikipedia.org/wiki/Jean-Jacques_Rousseau 43. http://en.wikipedia.org/wiki/Ayn_Rand 44. http://en.wikipedia.org/wiki/Religion 45. http://en.wikipedia.org/wiki/Plato 46. http://en.wikipedia.org/wiki/Emotivism 47. http://en.wikipedia.org/wiki/Logical_positivists 48. http://en.wikipedia.org/wiki/Rudolph_Carnap 49. http://en.wikipedia.org/wiki/A._J._Ayer 50. http://en.wikipedia.org/wiki/Logic 51. http://en.wikipedia.org/wiki/Metaphysical 52. http://en.wikipedia.org/wiki/Charles_L._Stevenson 53. http://en.wikipedia.org/wiki/Leo_Strauss 54. http://en.wikipedia.org/wiki/Martin_Heidegger 55. http://en.wikipedia.org/wiki/Karl_Marx 56. http://en.wikipedia.org/w/index.php?title=Moral_relativism&action=edit§ion=3 57. http://en.wikipedia.org/wiki/The_Holocaust 58. http://en.wikipedia.org/wiki/Stalinism 59. http://en.wikipedia.org/wiki/Apartheid 60. http://en.wikipedia.org/wiki/Genocide 61. 
http://en.wikipedia.org/w/index.php?title=Unjust_war&action=edit 62. http://en.wikipedia.org/wiki/Genital_mutilation 63. http://en.wikipedia.org/wiki/Slavery 64. http://en.wikipedia.org/wiki/Terrorism 65. http://en.wikipedia.org/wiki/Nazism 66. http://en.wikipedia.org/wiki/Michael_E._Berumen 67. http://en.wikipedia.org/wiki/R._M._Hare 68. http://en.wikipedia.org/w/index.php?title=Moral_relativism&action=edit§ion=4 69. http://en.wikipedia.org/wiki/Analytical_philosophy 70. http://en.wikipedia.org/wiki/Anthropology 71. http://en.wikipedia.org/wiki/Business_ethics 72. http://en.wikipedia.org/wiki/Deontology 73. http://en.wikipedia.org/wiki/Emotivism 74. http://en.wikipedia.org/wiki/Ethics 75. http://en.wikipedia.org/wiki/Logic 76. http://en.wikipedia.org/wiki/Metaethics 77. http://en.wikipedia.org/wiki/Moral_codes 78. http://en.wikipedia.org/wiki/Moral_purchasing 79. http://en.wikipedia.org/wiki/Morality 80. http://en.wikipedia.org/wiki/Objectivism 81. http://en.wikipedia.org/wiki/Philosophy 82. http://en.wikipedia.org/wiki/Situational_ethics 83. http://en.wikipedia.org/wiki/Subjectivism 84. http://en.wikipedia.org/w/index.php?title=Moral_relativism&action=edit§ion=5 85. http://en.wikipedia.org/w/index.php?title=Moral_relativism&action=edit§ion=6 86. http://www.utm.edu/research/iep/e/ethics.htm#Metaphysical%20Issues:%20Objectivism%20and%20Relativism 87. http://www.AllAboutPhilosophy.org/Moral-Relativism.htm 88. http://en.wikipedia.org/wiki/Moral_relativism 89. http://en.wikipedia.org/w/index.php?title=Special:Categories&article=Moral_relativism 90. http://en.wikipedia.org/wiki/Category:Ethics 91. http://en.wikipedia.org/wiki/Category:Social_philosophy From checker at panix.com Mon May 2 16:22:53 2005 From: checker at panix.com (Premise Checker) Date: Mon, 2 May 2005 12:22:53 -0400 (EDT) Subject: [Paleopsych] Internet Encyclopedia of Philosophy: Ethics Message-ID: Ethics [Internet Encyclopedia of Philosophy] http://www.utm.edu/research/iep/e/ethics.htm#Metaphysical%20Issues:%20Objectivism%20and%20Relativism 2003 The field of ethics, also called moral philosophy, involves systematizing, defending, and recommending concepts of right and wrong behavior. Philosophers today usually divide ethical theories into three general subject areas: metaethics, normative ethics, and applied ethics. Metaethics investigates where our ethical principles come from, and what they mean. Are they merely social inventions? Do they involve more than expressions of our individual emotions? Metaethical answers to these questions focus on the issues of universal truths, the will of God, the role of reason in ethical judgments, and the meaning of ethical terms themselves. Normative ethics takes on a more practical task, which is to arrive at moral standards that regulate right and wrong conduct. This may involve articulating the good habits that we should acquire, the duties that we should follow, or the consequences of our behavior on others. Finally, applied ethics involves examining specific controversial issues, such as abortion, infanticide, animal rights, environmental concerns, homosexuality, capital punishment, or nuclear war. By using the conceptual tools of metaethics and normative ethics, discussions in applied ethics try to resolve these controversial issues. The lines of distinction between metaethics, normative ethics, and applied ethics are often blurry. For example, the issue of abortion is an applied ethical topic since it involves a specific type of controversial behavior. 
But it also depends on more general normative principles, such as the right of self-rule and the right to life, which are litmus tests for determining the morality of that procedure. The issue also rests on metaethical issues such as "where do rights come from?" and "what kind of beings have rights?" _________________________________________________________________ Table of Contents (Clicking on the links below will take you to that part of this article) * [2]Metaethics * [3]Metaphysical Issues: Objectivism and Relativism * [4]Psychological Issues in Metaethics * [5]Egoism and Altruism * [6]Emotion and Reason * [7]Male and Female Morality [8]Normative Ethics * [9]Virtue Theories * [10]Duty Theories * [11]Consequentialist Theories * [12]Types of Utilitarianism * [13]Ethical Egoism and Social Contract Theory [14]Applied Ethics * [15]Normative Principles in Applied Ethics * [16]Issues in Applied Ethics [17]References and Further Reading _________________________________________________________________ Metaethics The term "meta" means after or beyond, and, consequently, the notion of metaethics involves a removed, or bird's-eye, view of the entire project of ethics. We may define metaethics as the study of the origin and meaning of ethical concepts. When compared to normative ethics and applied ethics, the field of metaethics is the least precisely defined area of moral philosophy. Two issues, though, are prominent: (1) metaphysical issues concerning whether morality exists independently of humans, and (2) psychological issues concerning the underlying mental basis of our moral judgments and conduct. [18]Back to Table of Contents Metaphysical Issues: Objectivism and Relativism "Metaphysics" is the study of the kinds of things that exist in the universe. Some things in the universe are made of physical stuff, such as rocks; and perhaps other things are nonphysical in nature, such as thoughts, spirits, and gods. The metaphysical component of metaethics involves discovering specifically whether moral values are eternal truths that exist in a spirit-like realm, or simply human conventions. There are two general directions that discussions of this topic take, one other-worldly and one this-worldly. Proponents of the other-worldly view typically hold that moral values are objective in the sense that they exist in a spirit-like realm beyond subjective human conventions. They also hold that they are absolute, or eternal, in that they never change, and also that they are universal insofar as they apply to all rational creatures around the world and throughout time. The most dramatic example of this view is Plato, who was inspired by the field of mathematics. When we look at numbers and mathematical relations, such as 1+1=2, they seem to be timeless concepts that never change, and apply everywhere in the universe. Humans do not invent numbers, and humans cannot alter them. Plato explained the eternal character of mathematics by stating that they are abstract entities that exist in a spirit-like realm. He noted that moral values also are absolute truths and thus are also abstract, spirit-like entities. In this sense, for Plato, moral values are spiritual objects. Medieval philosophers commonly grouped all moral principles together under the heading of "eternal law," which were also frequently seen as spirit-like objects. 17th century British philosopher Samuel Clarke described them as spirit-like relationships rather than spirit-like objects. In either case, though, they exist in a spirit-like realm.
A different other-worldly approach to the metaphysical status of morality is divine commands issuing from God's will. Sometimes called voluntarism, this view was inspired by the notion of an all-powerful God who is in control of everything. God simply wills things, and they become reality. He wills the physical world into existence, he wills human life into existence and, similarly, he wills all moral values into existence. Proponents of this view, such as medieval philosopher William of Ockham, believe that God wills moral principles, such as "murder is wrong," and these exist in God's mind as commands. God informs humans of these commands by implanting us with moral intuitions or revealing these commands in scripture. The second and more this-worldly approach to the metaphysical status of morality follows in the skeptical philosophical tradition, such as that articulated by Greek philosopher Sextus Empiricus, and denies the objective status of moral values. Technically skeptics did not reject moral values themselves, but only denied that values exist as spirit-like objects, or as divine commands in the mind of God. Moral values, they argued, are strictly human inventions, a position that has since been called moral relativism. There are two distinct forms of moral relativism. The first is individual relativism, which holds that individual people create their own moral standards. Friedrich Nietzsche, for example, argued that the superhuman creates his or her morality distinct from and in reaction to the slave-like value system of the masses. The second is cultural relativism, which maintains that morality is grounded in the approval of one's society and not simply in the preferences of individual people. This view was advocated by Sextus, and in more recent centuries by Michel Montaigne and William Graham Sumner. In addition to espousing skepticism and relativism, this-worldly approaches to the metaphysical status of morality deny the absolute and universal nature of morality and hold instead that moral values in fact change from society to society throughout time and throughout the world. They frequently attempt to defend their position by citing examples of values that differ dramatically from one culture to another, such as attitudes about polygamy, homosexuality and human sacrifice. [19]Back to Table of Contents Psychological Issues in Metaethics A second area of metaethics involves the psychological basis of our moral judgments and conduct, particularly understanding what motivates us to be moral. We might explore this subject by asking the simple question, "Why be moral?" Even if I am aware of basic moral standards, such as "don't kill" and "don't steal," this does not necessarily mean that I will be psychologically compelled to act on them. Some answers to the question "Why be moral?" are to avoid punishment, to gain praise, to attain happiness, to be dignified, or to fit in with society. [20]Back to Table of Contents Egoism and Altruism One important area of moral psychology concerns the inherent selfishness of humans. 17th century British philosopher Thomas Hobbes held that many, if not all, of our actions are prompted by selfish desires. Even if an action seems selfless, such as donating to charity, there are still selfish causes for this, such as experiencing power over other people. This view is called psychological egoism and maintains that self-oriented interests ultimately motivate all human actions.
Closely related to psychological egoism is a view called psychological hedonism, which is the view that pleasure is the specific driving force behind all of our actions. 18th century British philosopher Joseph Butler agreed that instinctive selfishness and pleasure prompt much of our conduct. However, Butler argued that we also have an inherent psychological capacity to show benevolence to others. This view is called psychological altruism and maintains that at least some of our actions are motivated by instinctive benevolence. [21]Back to Table of Contents Emotion and Reason A second area of moral psychology involves a dispute concerning the role of reason in motivating moral actions. If, for example, I make the statement "abortion is morally wrong," am I making a rational assessment or only expressing my feelings? On the one side of the dispute, 18th century British philosopher David Hume argued that moral assessments involve our emotions, and not our reason. We can amass all the reasons we want, but that alone will not constitute a moral assessment. We need a distinctly emotional reaction in order to make a moral pronouncement. Reason might be of service in giving us the relevant data, but, in Hume's words, "reason is, and ought to be, the slave of the passions." Inspired by Hume's anti-rationalist views, some 20th century philosophers, most notably A.J. Ayer, similarly denied that moral assessments are factual descriptions. For example, although the statement "it is good to donate to charity" may on the surface look as though it is a factual description about charity, it is not. Instead, a moral utterance like this involves two things. First, I (the speaker) am expressing my personal feelings of approval about charitable donations and I am in essence saying "Hooray for charity!" This is called the emotive element insofar as I am expressing my emotions about some specific behavior. Second, I (the speaker) am trying to get you to donate to charity and am essentially giving the command, "Donate to charity!" This is called the prescriptive element in the sense that I am prescribing some specific behavior. From Hume's day forward, more rationally-minded philosophers have opposed these emotive theories of ethics and instead argued that moral assessments are indeed acts of reason. 18th century German philosopher Immanuel Kant is a case in point. Although emotional factors often do influence our conduct, he argued, we should nevertheless resist that kind of sway. Instead, true moral action is motivated only by reason when it is free from emotions and desires. A recent rationalist approach, offered by Kurt Baier, was proposed in direct opposition to the emotivist and prescriptivist theories of Ayer and others. Baier focuses more broadly on the reasoning and argumentation process that takes place when making moral choices. All of our moral choices are, or at least can be, backed by some reason or justification. If I claim that it is wrong to steal someone's car, then I should be able to justify my claim with some kind of argument. For example, I could argue that stealing Smith's car is wrong since this would upset her, violate her ownership rights, or put the thief at risk of getting caught. According to Baier, then, proper moral decision making involves giving the best reasons in support of one course of action versus another.
[22]Back to Table of Contents Male and Female Morality A third area of moral psychology focuses on whether there is a distinctly female approach to ethics that is grounded in the psychological differences between men and women. Discussions of this issue focus on two claims: (1) traditional morality is male-centered, and (2) there is a unique female perspective of the world which can be shaped into a value theory. According to many feminist philosophers, traditional morality is male-centered since it is modeled after practices that have been traditionally male-dominated, such as acquiring property, engaging in business contracts, and governing societies. The rigid systems of rules required for trade and government were then taken as models for the creation of equally rigid systems of moral rules, such as lists of rights and duties. Women, by contrast, have traditionally had a nurturing role by raising children and overseeing domestic life. These tasks require less rule following, and more spontaneous and creative action. Using the woman's experience as a model for moral theory, then, the basis of morality would be spontaneously caring for others as would be appropriate in each unique circumstance. On this model, the agent becomes part of the situation and acts caringly within that context. This stands in contrast with male-modeled morality where the agent is a mechanical actor who performs his required duty, but can remain distanced from and unaffected by the situation. A care-based approach to morality, as it is sometimes called, is offered by feminist ethicists as either a replacement for or a supplement to traditional male-modeled moral systems. [23]Back to Table of Contents Normative Ethics Normative ethics involves arriving at moral standards that regulate right and wrong conduct. In a sense, it is a search for an ideal litmus test of proper behavior. The Golden Rule is a classic example of a normative principle: We should do to others what we would want others to do to us. Since I do not want my neighbor to steal my car, then it is wrong for me to steal her car. Since I would want people to feed me if I was starving, then I should help feed starving people. Using this same reasoning, I can theoretically determine whether any possible action is right or wrong. So, based on the Golden Rule, it would also be wrong for me to lie to, harass, victimize, assault, or kill others. The Golden Rule is an example of a normative theory that establishes a single principle against which we judge all actions. Other normative theories focus on a set of foundational principles, or a set of good character traits. The key assumption in normative ethics is that there is only one ultimate criterion of moral conduct, whether it is a single rule or a set of principles. Three strategies will be noted here: (1) virtue theories, (2) duty theories, and (3) consequentialist theories. [24]Back to Table of Contents Virtue Theories Many philosophers believe that morality consists of following precisely defined rules of conduct, such as "don't kill," or "don't steal." Presumably, I must learn these rules, and then make sure each of my actions live up to the rules. Virtue theorists, however, place less emphasis on learning rules, and instead stress the importance of developing good habits of character, such as benevolence. Once I've acquired benevolence, for example, I will then habitually act in a benevolent manner. 
Historically, virtue theory is one of the oldest normative traditions in Western philosophy, having its roots in ancient Greek civilization. Plato emphasized four virtues in particular, which were later called cardinal virtues: wisdom, courage, temperance and justice. Other important virtues are fortitude, generosity, self-respect, good temper, and sincerity. In addition to advocating good habits of character, virtue theorists hold that we should avoid acquiring bad character traits, or vices, such as cowardice, insensibility, injustice, and vanity. Virtue theory emphasizes moral education since virtuous character traits are developed in one's youth. Adults, therefore, are responsible for instilling virtues in the young. Aristotle argued that virtues are good habits that we acquire, which regulate our emotions. For example, in response to my natural feelings of fear, I should develop the virtue of courage which allows me to be firm when facing danger. Analyzing 11 specific virtues, Aristotle argued that most virtues fall at a mean between more extreme character traits. With courage, for example, if I do not have enough courage, I develop the disposition of cowardice, which is a vice. If I have too much courage I develop the disposition of rashness which is also a vice. According to Aristotle, it is not an easy task to find the perfect mean between extreme character traits. In fact, we need assistance from our reason to do this. After Aristotle, medieval theologians supplemented Greek lists of virtues with three Christian ones, or theological virtues: faith, hope, and charity. Interest in virtue theory continued through the Middle Ages and declined in the 19th century with the rise of alternative moral theories below. In the mid-20th century virtue theory received special attention from philosophers who believed that more recent approaches to ethical theory were misguided for focusing too heavily on rules and actions, rather than on virtuous character traits. Alasdair MacIntyre defended the central role of virtues in moral theory and argued that virtues are grounded in and emerge from within social traditions. [25]Back to Table of Contents Duty Theories Many of us feel that there are clear obligations we have as human beings, such as to care for our children, and to not commit murder. Duty theories base morality on specific, foundational principles of obligation. These theories are sometimes called deontological, from the Greek word deon, or duty, in view of the foundational nature of our duty or obligation. They are also sometimes called nonconsequentialist since these principles are obligatory, irrespective of the consequences that might follow from our actions. For example, it is wrong to not care for our children even if it results in some great benefit, such as financial savings. There are four central duty theories. The first is that championed by 17th century German philosopher Samuel Pufendorf, who classified dozens of duties under three headings: duties to God, duties to oneself, and duties to others. Concerning our duties towards God, he argued that there are two kinds: (1) a theoretical duty to know the existence and nature of God, and (2) a practical duty to both inwardly and outwardly worship God. Concerning our duties towards oneself, these are also of two sorts: (1) duties of the soul, which involve developing one's skills and talents, and (2) duties of the body, which involve not harming our bodies, as we might through gluttony or drunkenness, and not killing oneself.
Concerning our duties towards others, Pufendorf divides these between absolute duties, which are universally binding on people, and conditional duties, which are the result of contracts between people. Absolute duties are of three sorts: (1) avoid wronging others; (2) treat people as equals, and (3) promote the good of others. Conditional duties involve various types of agreements, the principal one of which is the duty to keep one's promises. A second duty-based approach to ethics is rights theory. Most generally, a right is a justified claim against another person's behavior, such as my right to not be harmed by you. Rights and duties are related in such a way that the rights of one person imply the duties of another person. For example, if I have a right to payment of $10 by Smith, then Smith has a duty to pay me $10. This is called the correlativity of rights and duties. The most influential early account of rights theory is that of 17th century British philosopher John Locke, who argued that the laws of nature mandate that we should not harm anyone's life, health, liberty or possessions. For Locke, these are our natural rights, given to us by God. Following Locke, the United States Declaration of Independence authored by Thomas Jefferson recognizes three foundational rights: life, liberty, and the pursuit of happiness. Jefferson and other rights theorists maintained that we deduce other more specific rights from these, including the rights of property, movement, speech, and religious expression. There are four features traditionally associated with moral rights. First, rights are natural insofar as they are not invented or created by governments. Second, they are universal insofar as they do not change from country to country. Third, they are equal in the sense that rights are the same for all people, irrespective of gender, race, or handicap. Fourth, they are inalienable, which means that I cannot hand over my rights to another person, such as by selling myself into slavery. A third duty-based theory is that by Kant, which emphasizes a single principle of duty. Influenced by Pufendorf, Kant agreed that we have moral duties to oneself and others, such as developing one's talents, and keeping our promises to others. However, Kant argued that there is a more foundational principle of duty that encompasses our particular duties. It is a single, self-evident principle of reason that he calls the "categorical imperative." A categorical imperative, he argued, is fundamentally different from hypothetical imperatives that hinge on some personal desire that we have, for example, "If you want to get a good job, then you ought to go to college." By contrast, a categorical imperative simply mandates an action, irrespective of one's personal desires, such as "You ought to do X." Kant gives at least four versions of the categorical imperative, but one is especially direct: Treat people as an end, and never as a means to an end. That is, we should always treat people with dignity, and never use them as mere instruments. For Kant, we treat people as an end whenever our actions toward someone reflect the inherent value of that person. Donating to charity, for example, is morally correct since this acknowledges the inherent value of the recipient. By contrast, we treat someone as a means to an end whenever we treat that person as a tool to achieve something else. It is wrong, for example, to steal my neighbor's car since I would be treating her as a means to my own happiness.
The categorical imperative also regulates the morality of actions that affect us individually. Suicide, for example, would be wrong since I would be treating my life as a means to the alleviation of my misery. Kant believes that the morality of all actions can be determined by appealing to this single principle of duty. A fourth and more recent duty-based theory is that by British philosopher W.D. Ross, which emphasizes prima facie duties. Like his 17th and 18th century counterparts, Ross argues that our duties are "part of the fundamental nature of the universe." However, Ross's list of duties is much shorter, which he believes reflects our actual moral convictions: * Fidelity: the duty to keep promises * Reparation: the duty to compensate others when we harm them * Gratitude: the duty to thank those who help us * Justice: the duty to recognize merit * Beneficence: the duty to improve the conditions of others * Self-improvement: the duty to improve our virtue and intelligence * Nonmaleficence: the duty to not injure others Ross recognizes that situations will arise when we must choose between two conflicting duties. In a classic example, suppose I borrow my neighbor's gun and promise to return it when he asks for it. One day, in a fit of rage, my neighbor pounds on my door and asks for the gun so that he can take vengeance on someone. On the one hand, the duty of fidelity obligates me to return the gun; on the other hand, the duty of nonmaleficence obligates me to avoid injuring others and thus not return the gun. According to Ross, I will intuitively know which of these duties is my actual duty, and which is my apparent or prima facie duty. In this case, my duty of nonmaleficence emerges as my actual duty and I should not return the gun. [26]Back to Table of Contents Consequentialist Theories It is common for us to determine our moral responsibility by weighing the consequences of our actions. According to consequentialist normative theories, correct moral conduct is determined solely by a cost-benefit analysis of an action's consequences: Consequentialism: An action is morally right if the consequences of that action are more favorable than unfavorable. Consequentialist normative principles require that we first tally both the good and bad consequences of an action. Second, we then determine whether the total good consequences outweigh the total bad consequences. If the good consequences are greater, then the action is morally proper. If the bad consequences are greater, then the action is morally improper. Consequentialist theories are sometimes called teleological theories, from the Greek word telos, or end, since the end result of the action is the sole determining factor of its morality. Consequentialist theories became popular in the 18^th century by philosophers who wanted a quick way to morally assess an action by appealing to experience, rather than by appealing to gut intuitions or long lists of questionable duties. In fact, the most attractive feature of consequentialism is that it appeals to publicly observable consequences of actions. Most versions of consequentialism are more precisely formulated than the general principle above. In particular, competing consequentialist theories specify which consequences for affected groups of people are relevant. Three subdivisions of consequentialism emerge: Ethical Egoism:an action is morally right if the consequences of that action are more favorable than unfavorable only to the agent performing the action. 
Ethical Altruism: an action is morally right if the consequences of that action are more favorable than unfavorable to everyone except the agent. Utilitarianism: an action is morally right if the consequences of that action are more favorable than unfavorable to everyone. All three of these theories focus on the consequences of actions for different groups of people. But, like all normative theories, the above three theories are rivals of each other. They also yield different conclusions. Consider the following example. A woman was traveling through a developing country when she witnessed a car in front of her run off the road and roll over several times. She asked the hired driver to pull over to assist, but, to her surprise, the driver accelerated nervously past the scene. A few miles down the road the driver explained that in his country if someone assists an accident victim, then the police often hold the assisting person responsible for the accident itself. If the victim dies, then the assisting person could be held responsible for the death. The driver continued explaining that road accident victims are therefore usually left unattended and often die from exposure to the country's harsh desert conditions. On the principle of ethical egoism, the woman in this illustration would only be concerned with the consequences of her attempted assistance as she would be affected. Clearly, the decision to drive on would be the morally proper choice. On the principle of ethical altruism, she would be concerned only with the consequences of her action as others are affected, particularly the accident victim. Tallying only those consequences reveals that assisting the victim would be the morally correct choice, irrespective of the negative consequences that result for her. On the principle of utilitarianism, she must consider the consequences for both herself and the victim. The outcome here is less clear, and the woman would need to precisely calculate the overall benefit versus disbenefit of her action. [27]Back to Table of Contents Types of Utilitarianism Jeremy Bentham presented one of the earliest fully developed systems of utilitarianism. Two features of his theory are noteworthy. First, Bentham proposed that we tally the consequences of each action we perform and thereby determine on a case-by-case basis whether an action is morally right or wrong. This aspect of Bentham's theory is known as act-utilitarianism. Second, Bentham also proposed that we tally the pleasure and pain which result from our actions. For Bentham, pleasure and pain are the only consequences that matter in determining whether our conduct is moral. This aspect of Bentham's theory is known as hedonistic utilitarianism. Critics point out limitations in both of these aspects. First, according to act-utilitarianism, it would be morally wrong to waste time on leisure activities such as watching television, since our time could be spent in ways that produced a greater social benefit, such as charity work. But prohibiting leisure activities doesn't seem reasonable. More significantly, according to act-utilitarianism, specific acts of torture or slavery would be morally permissible if the social benefit of these actions outweighed the disbenefit. A revised version of utilitarianism called rule-utilitarianism addresses these problems. According to rule-utilitarianism, a behavioral code or rule is morally right if the consequences of adopting that rule are more favorable than unfavorable to everyone.
Unlike act-utilitarianism, which weighs the consequences of each particular action, rule-utilitarianism offers a litmus test only for the morality of moral rules, such as "stealing is wrong." Adopting a rule against theft clearly has more favorable consequences than unfavorable consequences for everyone. The same is true for moral rules against lying or murdering. Rule-utilitarianism, then, offers a three-tiered method for judging conduct. A particular action, such as stealing my neighbor's car, is judged wrong since it violates a moral rule against theft. In turn, the rule against theft is morally binding because adopting this rule produces favorable consequences for everyone. John Stuart Mill's version of utilitarianism is rule-oriented. Second, according to hedonistic utilitarianism, pleasurable consequences are the only factors that matter, morally speaking. This, though, seems too restrictive since it ignores other morally significant consequences that are not necessarily pleasing or painful. For example, acts which foster loyalty and friendship are valued, yet they are not always pleasing. In response to this problem, G.E. Moore proposed ideal utilitarianism, which involves tallying any consequence that we intuitively recognize as good or bad (and not simply as pleasurable or painful). Also, R.M. Hare proposed preference utilitarianism, which involves tallying any consequence that fulfills our preferences. [28]Back to Table of Contents Ethical Egoism and Social Contract Theory We have seen that Thomas Hobbes was an advocate of the metaethical theory of psychological egoism, the view that all of our actions are selfishly motivated. Upon that foundation, Hobbes developed a normative theory known as social contract theory, which is a type of rule-ethical-egoism. According to Hobbes, for purely selfish reasons, the agent is better off living in a world with moral rules than one without moral rules. For without moral rules, we are subject to the whims of other people's selfish interests. Our property, our families, and even our lives are at continual risk. Selfishness alone will therefore motivate each agent to adopt a basic set of rules which will allow for a civilized community. Not surprisingly, these rules would include prohibitions against lying, stealing and killing. However, these rules will ensure safety for each agent only if the rules are enforced. As selfish creatures, each of us would plunder our neighbors' property once their guards were down. Each agent would then be at risk from his neighbor. Therefore, for selfish reasons alone, we devise a means of enforcing these rules: we create a policing agency which punishes us if we violate these rules.
By contrast, the issue of gun control would be an applied ethical issue since there are significant groups of people both for and against gun control. The second requirement for an issue to be an applied ethical issue is that it must be a distinctly moral issue. On any given day, the media presents us with an array of sensitive issues such as affirmative action policies, gays in the military, involuntary commitment of the mentally impaired, capitalistic vs. socialistic business practices, public vs. private health care systems, or energy conservation. Although all of these issues are controversial and have an important impact on society, they are not all moral issues. Some are only issues of social policy. The aim of social policy is to help make a given society run efficiently by devising conventions, such as traffic laws, tax laws, and zoning codes. Moral issues, by contrast, concern more universally obligatory practices, such as our duty to avoid lying, and are not confined to individual societies. Frequently, issues of social policy and morality overlap, as with murder, which is both socially prohibited and immoral. However, the two groups of issues are often distinct. For example, many people would argue that sexual promiscuity is immoral, but may not feel that there should be social policies regulating sexual conduct, or laws punishing us for promiscuity. Similarly, some social policies forbid residents in certain neighborhoods from having yard sales. But, so long as the neighbors are not offended, there is nothing immoral in itself about a resident having a yard sale in one of these neighborhoods. Thus, to qualify as an applied ethical issue, the issue must be more than one of mere social policy: it must be morally relevant as well. In theory, resolving particular applied ethical issues should be easy. With the issue of abortion, for example, we would simply determine its morality by consulting our normative principle of choice, such as act-utilitarianism. If a given abortion produces greater benefit than disbenefit, then, according to act-utilitarianism, it would be morally acceptable to have the abortion. Unfortunately, there are perhaps hundreds of rival normative principles from which to choose, many of which yield opposite conclusions. Thus, the stalemate in normative ethics between conflicting theories prevents us from using a single decisive procedure for determining the morality of a specific issue. The usual solution today to this stalemate is to consult several representative normative principles on a given issue and see where the weight of the evidence lies. [30]Back to Table of Contents Normative Principles in Applied Ethics Arriving at a short list of representative normative principles is itself a challenging task. The principles selected must not be too narrowly focused, such as a version of act-egoism that might focus only on an action's short-term benefit. The principles must also be seen as having merit by people on both sides of an applied ethical issue. For this reason, principles that appeal to duty to God are not usually cited since this would have no impact on a nonbeliever engaged in the debate. The following principles are the ones most commonly appealed to in applied ethical discussions: * Personal benefit: acknowledge the extent to which an action produces beneficial consequences for the individual in question. * Social benefit: acknowledge the extent to which an action produces beneficial consequences for society.
* Principle of benevolence: help those in need. * Principle of paternalism: assist others in pursuing their best interests when they cannot do so themselves. * Principle of harm: do not harm others. * Principle of honesty: do not deceive others. * Principle of lawfulness: do not violate the law. * Principle of autonomy: acknowledge a person's freedom over his/her actions or physical body. * Principle of justice: acknowledge a person's right to due process, fair compensation for harm done, and fair distribution of benefits. * Rights: acknowledge a person's rights to life, information, privacy, free expression, and safety. The above principles represent a spectrum of traditional normative principles and are derived from both consequentialist and duty-based approaches. The first two principles, personal benefit and social benefit, are consequentialist since they appeal to the consequences of an action as it affects the individual or society. The remaining principles are duty-based. The principles of benevolence, paternalism, harm, honesty, and lawfulness are based on duties we have toward others. The principles of autonomy, justice, and the various rights are based on moral rights. An example will help illustrate the function of these principles in an applied ethical discussion. In 1982 a couple from Bloomington, Indiana gave birth to a severely retarded baby. The infant, known as Baby Doe, also had its stomach disconnected from its throat and was thus unable to receive nourishment. Although this stomach deformity was correctable through surgery, the couple did not want to raise a severely retarded child and therefore chose to deny surgery, food, and water for the infant. Local courts supported the parents' decision, and six days later Baby Doe died. Should corrective surgery have been performed for Baby Doe? Arguments in favor of corrective surgery derive from the infant's right to life and the principle of paternalism which stipulates that we should pursue the best interests of others when they are incapable of doing so themselves. Arguments against corrective surgery derive from the personal and social disbenefit which would result from such surgery. If Baby Doe survived, its quality of life would have been poor and in any case it probably would have died at an early age. Also, from the parent's perspective, Baby Doe's survival would have been a significant emotional and financial burden. When examining both sides of the issue, the parents and the courts concluded that the arguments against surgery were stronger than the arguments for surgery. First, foregoing surgery appeared to be in the best interests of the infant, given the poor quality of life it would endure. Second, the status of Baby Doe's right to life was not clear given the severity of the infant's mental impairment. For, to possess moral rights, it takes more than merely having a human body: certain cognitive functions must also be present. The issue here involves what is often referred to as moral personhood, and is central to many applied ethical discussions. [31]Back to Table of Contents Issues in Applied Ethics As noted, there are many controversial issues discussed by ethicists today, some of which will be briefly mentioned here. Biomedical ethics focuses on a range of issues which arise in clinical settings. Health care workers are in an unusual position of continually dealing with life and death situations. 
It is not surprising, then, that medical ethics issues are more extreme and diverse than other areas of applied ethics. Prenatal issues arise about the morality of surrogate mothering, genetic manipulation of fetuses, the status of unused frozen embryos, and abortion. Other issues arise about patient rights and physician's responsibilities, such as the confidentiality of the patient's records and the physician's responsibility to tell the truth to dying patients. The AIDS crisis has raised the specific issues of the mandatory screening of all patients for AIDS, and whether physicians can refuse to treat AIDS patients. Additional issues concern medical experimentation on humans, the morality of involuntary commitment, and the rights of the mentally retarded. Finally, end of life issues arise about the morality of suicide, the justifiability of suicide intervention, physician assisted suicide, and euthanasia. The field of business ethics examines moral controversies relating to the social responsibilities of capitalist business practices, the moral status of corporate entities, deceptive advertising, insider trading, basic employee rights, job discrimination, affirmative action, drug testing, and whistle blowing. Issues in environmental ethics often overlaps with business and medical issues. These include the rights of animals, the morality of animal experimentation, preserving endangered species, pollution control, management of environmental resources, whether eco-systems are entitled to direct moral consideration, and our obligation to future generations. Controversial issues of sexual morality include monogamy vs. polygamy, sexual relations without love, homosexual relations, and extramarital affairs. Finally, there are issues of social morality which examine capital punishment, nuclear war, gun control, the recreational use of drugs, welfare rights, and racism. [32]Back to Table of Contents References and Further Reading Anscombe,Elizabeth Modern Moral Philosophy (1958), Philosophy, 1958, Vol. 33, reprinted in her Ethics, Religion and Politics (Oxford: Blackwell, 1981). Aristotle, Nichomachean Ethics, in Barnes, Jonathan, ed., The Complete Works of Aristotle (Princeton, N.J.: Princeton University Press, 1984). Ayer, A. J., Language, Truth and Logic (New York: Dover Publications, 1946). Bentham, Jeremy, Introduction to the Principles of Morals and Legislation (1789), in The Works of Jeremy Bentham, edited by John Bowring (London: 1838-1843). Hare, R.M., Moral Thinking, (Oxford: Clarendon Press, 1981). Hare, R.M., The Language of Morals (Oxford: Oxford University Press, 1952). Hobbes, Thomas, Leviathan, ed., E. Curley, (Chicago, IL: Hackett Publishing Company, 1994). Hume, David, A Treatise of Human Nature (1739-1740), eds. David Fate Norton, Mary J. Norton (Oxford; New York: Oxford University Press, 2000). Kant, Immanuel, Grounding for the Metaphysics of Morals, tr, James W. Ellington (Indianapolis: Hackett Publishing Company, 1985). Locke, John, Two Treatises, ed., Peter Laslett (Cambridge: Cambridge University Press, 1963). MacIntyre, Alasdair, After Virtue, second edition, (Notre Dame: Notre Dame University Press, 1984). Mackie, John L., Ethics: Inventing Right and Wrong, (New York: Penguin Books, 1977). Mill, John Stuart, Utilitarianism, in Collected Works of John Stuart Mill, ed., J.M. Robson (London: Routledge and Toronto, Ont.: University of Toronto Press, 1991). Moore, G.E., Principia Ethica, (Cambridge: Cambridge University Press, 1903). 
Noddings, Nel, Ethics from the Stand Point Of Women, in Deborah L. Rhode, ed., Theoretical Perspectives on Sexual Difference (New Haven, CT: Yale University Press, 1990). Ockham, William of, Fourth Book of the Sentences, tr. Lucan Freppert, The Basis of Morality According to William Ockham (Chicago: Franciscan Herald Press, 1988). Plato, Republic, 6:510-511, in Cooper, John M., ed., Plato: Complete Works (Indianapolis: Hackett Publishing Company, 1997). Samuel Pufendorf, De Jure Naturae et Gentium (1762), tr. Of the Law of Nature and Nations. Samuel Pufendorf, De officio hominis et civis juxta legem naturalem (1673), tr., The Whole Duty of Man according to the Law of Nature (London, 1691). Sextus Empiricus, Outlines of Pyrrhonism, trs. J. Annas and J. Barnes, Outlines of Scepticism (Cambridge: Cambridge University Press, 1994). Stevenson, Charles L., The Ethics of Language, (New Haven: Yale University Press, 1944). Sumner, William Graham, Folkways (Boston: Guinn, 1906). [33]Back to Table of Contents _________________________________________________________________ Author Information: James Fieser Email: [34]jfieser at utm.edu HomePage: [35]http://www.utm.edu/~jfieser/ References 1. http://www.iep.utm.edu/ 2. http://www.utm.edu/research/iep/e/ethics.htm#Metaethics 3. http://www.utm.edu/research/iep/e/ethics.htm#Metaphysical Issues: Objectivism and Relativism 4. http://www.utm.edu/research/iep/e/ethics.htm#Psychological Issues in Metaethics 5. http://www.utm.edu/research/iep/e/ethics.htm#Egoism and Altruism 6. http://www.utm.edu/research/iep/e/ethics.htm#Emotion and Reason 7. http://www.utm.edu/research/iep/e/ethics.htm#Male and Female Morality 8. http://www.utm.edu/research/iep/e/ethics.htm#Normative Ethics 9. http://www.utm.edu/research/iep/e/ethics.htm#Virtue Theories 10. http://www.utm.edu/research/iep/e/ethics.htm#Duty Theories 11. http://www.utm.edu/research/iep/e/ethics.htm#Consequentialist Theories 12. http://www.utm.edu/research/iep/e/ethics.htm#Types of Utilitarianism 13. http://www.utm.edu/research/iep/e/ethics.htm#Ethical Egoism and Social Contract Theory 14. http://www.utm.edu/research/iep/e/ethics.htm#Applied Ethics 15. http://www.utm.edu/research/iep/e/ethics.htm#Normative Principles in Applied Ethics 16. http://www.utm.edu/research/iep/e/ethics.htm#Issues in Applied Ethics 17. http://www.utm.edu/research/iep/e/ethics.htm#References and Further Reading 18. http://www.utm.edu/research/iep/e/ethics.htm#top 19. http://www.utm.edu/research/iep/e/ethics.htm#top 20. http://www.utm.edu/research/iep/e/ethics.htm#top 21. http://www.utm.edu/research/iep/e/ethics.htm#top 22. http://www.utm.edu/research/iep/e/ethics.htm#top 23. http://www.utm.edu/research/iep/e/ethics.htm#top 24. http://www.utm.edu/research/iep/e/ethics.htm#top 25. http://www.utm.edu/research/iep/e/ethics.htm#top 26. http://www.utm.edu/research/iep/e/ethics.htm#top 27. http://www.utm.edu/research/iep/e/ethics.htm#top 28. http://www.utm.edu/research/iep/e/ethics.htm#top 29. http://www.utm.edu/research/iep/e/ethics.htm#top 30. http://www.utm.edu/research/iep/e/ethics.htm#top 31. http://www.utm.edu/research/iep/e/ethics.htm#top 32. http://www.utm.edu/research/iep/e/ethics.htm#top 33. http://www.utm.edu/research/iep/e/ethics.htm#top 34. mailto:jfieser at utm.edu?subject=Loved%20Your%20Ethics%20Article! 35. 
http://www.utm.edu/~jfieser/ From checker at panix.com Mon May 2 16:23:52 2005 From: checker at panix.com (Premise Checker) Date: Mon, 2 May 2005 12:23:52 -0400 (EDT) Subject: [Paleopsych] Sign and Sight: The human flaw Message-ID: The human flaw http://print.signandsight.com/features/74.html 5.3.23 A festival in Berlin's Haus der Kulturen der Welt examines beauty with exhibitions, discussions and dance. By Arnd Wesemann Can't we simply find something beautiful for a change? Does everything have to be immediately relegated to the level of the ridiculous and the kitsch? Why do we desire a thing of beauty and yet regard it with suspicion? What methods of seduction are in play when the beautiful woman in the advertisement appears more beautiful than the beautiful woman next to you? And can one regard heroic masculine poses as an expression of biological superiority without making fascist idols of them? Before you know it the beauty has faded. (Photo: Wang Gongxin & Lin Tianmiao: "Here? Or There?" (detail). 2002. Video installation. Photographer: The artists) Beauty is booming in German universities. After a decade of intensive gender research and practice in equality - sexual, religious and racial - a roll-back is under way. Beauty lost its power because it defected to the side of advertising, computer animation and plastic surgery. And because beauty contradicts the principle of egalitarianism. "Beauty entices", says Winfried Menninghaus, [1]professor of comparative literature at Berlin's Free University, who is currently touring and talking on the subject. According to Menninghaus, Darwinian theory, which like biologism is undergoing a renaissance, states that beauty solely serves biological selection. This is why so many cultures have undermined the power of beauty. Islam covers up its women to prevent inequality from determining the choice of partner. And uniforms are there to lower the pressure of competition. The level of competition in the globalised world has spawned the new adoration of the beautiful and strong. In fact, Menninghaus tells us, clothing and fashion signalled the end of Darwinian selection. Nakedness necessitated clothing and thus culture. Since then the naked body has been taboo. As a way of concealing the painful memories of the now surmounted natural state, nakedness has always simultaneously stood for obscenity and the ideal of beauty. Art history was the first to idealise the body; later the health and fitness industry and all the other preening and pruning practices built up around nudity adopted the strict dictates of the beauty ideal. 65 percent of US Americans are overweight. The conclusion: the body is bad, it belongs to the forces of evil. The idea of beauty is therefore also bound up with the rediscovery of shame. The real body stands ashamed before the propagated ideal. Everybody knows the body can never be as flawless as it has to be: pure and sinless, healthy and efficient. And yet one searches for it, at least in art. And then one denounces art for this reason. (Photo: Susanne Linke: "Im Bade wannen". Photographer: Klaus Rabien) The whole of Paris was outraged when [2]choreographer Jan Fabre put a naked, oil-covered dancer on stage and called the piece "Beauty warrior". The oil, the impure element, ran counter to beauty. At the JFK airport in New York two dozen mouth-wateringly gorgeous black models recently [3]posed naked in shackles.
America wanted to protest, but by alluding to the legacy of slavery and inflamed desire for beautiful others, Vanessa Beecroft silenced her critics. Now the celebrated French [4]sinologist, Francois Jullien, currently touring with his new book "Le nu impossible", has suggested looking at beauty through the eyes of other cultures. Berlin's Haus der Kulturen der Welt (House of World Cultures) has taken up the challenge and on March 18 opened its festival [5]"About Beauty", comprising exhibition, dance programme and a series of podium discussions. Jullien views beauty from the Chinese perspective. In his book, he maintains that to the Chinese eye a person cannot be beautiful as such. According to ancient Tao wisdom, it is in movement that a person attains beauty, in Tai-Chi for example. The Chinese syllable "mei" (literally: fat sheep) means beauty. It is used to describe good food, a sense of well-being, a pleasant bodily feeling. And, ironically enough, also the United States (literally: beautiful land). So it is possible to have beauty without burdening it with ideals of physical self-improvement and abstinence. Why not just enjoy life? But Europeans abide by Jacques Lacan, who stated that pleasure is also a dictate. (Photo: Zhuang Hui: "Chashan County - June 25". Sculpture.) The Berlin choreographers Jutta Hell and Dieter Baumann rehearsed a [6]dance piece in Shanghai titled "Eidos_Tao" with Chinese dancers. Tao, which is generally translated as "the Way", means movement in China, the flowing, unstoppable movement of dance as opposed to our classical ideal of fixed "eidos". Precisely here, says Jullien, lies the difference. Chinese see beauty in flux, while we try to force it to stand still. Good food and letting the daughters dance are still the measure of beauty in remote areas of southern China. Traditional generosity is beautiful too. (Photo: "EIDOS_TAO". Performance. Photographer: Dirk Bleicker) One might suspect that Europe simply does not want to find the beautiful beautiful. Bertolt Brecht coined the phrase: "Beauty comes from overcoming difficulties". The peak is only beautiful when it has been scaled. Pleasure is beautiful when it has to be paid for in sweat. Perhaps this is why beauty hardly qualifies as an aesthetic category any more. Schiller's sentence "Beauty is freedom in the appearance" has only been dug up again for his bicentennial. He spoke of dignity as a category of beauty. The dignity of the healthy, of the beautiful body? What Schiller really meant - and what the Chinese believe today - has largely been forgotten: superior intellect, wise politics, expert craftsmanship, human prowess. For the Chinese, only what is true and good is also beautiful, says Jullien. [7]Essayist Dave Hickey goes a step further. In his book "The Invisible Dragon", he describes how this "classical" stance is about to be driven out of the Chinese. (Photo: "Shanghai Beauty". Performance. Photographer: Dirk Bleicker) They too are subject to the influence of academies, museums and universities. As in Europe, these institutions search for beauty in constructs and systems. But the Chinese no more believe in concepts than they do in making sacrifices to achieve an end. Their traditional view of beauty is a celebration of change, eternal circulation and transformation. And according to Hickey, this is precisely the opposite of everything rigid and statutory embodied by institutions.
But this culture of the transformative is in retreat, and it is disappearing faster than people are aware of. As Chinese [8]choreographer Jin Xing puts it: "Chinese bodies look weak in comparison with beautiful African bodies. And the Chinese don't have the overriding sense of envy and justice that makes bodies hard and people rich in the West. But the concept of spending money in a fitness studio is still utterly alien in China. The Chinese work hard because true beauty for us is wealth." "Über Schönheit - About Beauty". 18.3.05 - 15.5.05. [9]Haus der Kulturen der Welt, Berlin * Arnd Wesemann is editor of [10]Ballet-Tanz magazine. The article was originally published in German in the [11]Süddeutsche Zeitung, on 17 March, 2005. Translation: [12]lp. sign and sight funded by Bundeskulturstiftung References 1. http://www.complit.fu-berlin.de/institut/lehrpersonal/menninghaus.html 2. http://csw.art.pl/new/99/fabre_e.html 3. http://newsgrist.typepad.com/underbelly/2004/10/terminal_5_exhi.html 4. http://www.upsy.net/spip/article.php3?id_article=30 5. http://www.hkw.de/en/programm/programm2005/AboutBeauty-Ausstellungsprogramm/c_index.html 6. http://www.hkw.de/en/programm/tagesprogramm/Eidos_Tao/c_index.html 7. http://www.archibot.com/stories/st_davehickey.html 8. http://www.hkw.de/en/programm/tagesprogramm/shanghaibeauty/c_index.html 9. http://www.hkw.de/index_en.html 10. http://www.ballet-tanz.de/ 11. http://www.sueddeutsche.de/ 12. http://www.signandsight.com/service/37.html From checker at panix.com Mon May 2 16:23:14 2005 From: checker at panix.com (Premise Checker) Date: Mon, 2 May 2005 12:23:14 -0400 (EDT) Subject: [Paleopsych] Monterey Herald: Genetic mingling mixes human, animal cells Message-ID: Genetic mingling mixes human, animal cells http://www.montereyherald.com/mld/montereyherald/business/11525171.htm On a farm about six miles outside this gambling town, Jason Chamberlain looks over a flock of about 50 smelly sheep, many of them possessing partially human livers, hearts, brains and other organs. The University of Nevada-Reno researcher talks matter-of-factly about his plans to euthanize one of the pregnant sheep in a nearby lab. He can't wait to examine the effects of the human cells he had injected into the fetus' brain about two months ago. "It's mice on a large scale," Chamberlain says with a shrug. As strange as his work may sound, it falls firmly within the new ethics guidelines the influential National Academies issued this past week for stem cell research. In fact, the Academies' report endorses research that co-mingles human and animal tissue as vital to ensuring that experimental drugs and new tissue replacement therapies are safe for people. Doctors have transplanted pig valves into human hearts for years, and scientists have injected human cells into lab animals for even longer. But the biological co-mingling of animal and human is now evolving into even more exotic and unsettling mixes of species, evoking the Greek myth of the monstrous chimera, which was part lion, part goat and part serpent. In the past two years, scientists have created pigs with human blood, fused rabbit eggs with human DNA and injected human stem cells to make paralyzed mice walk. Particularly worrisome to some scientists are the nightmare scenarios that could arise from the mixing of brain cells: What if a human mind somehow got trapped inside a sheep's head?
The "idea that human neuronal cells might participate in 'higher order' brain functions in a nonhuman animal, however unlikely that may be, raises concerns that need to be considered," the academies report warned. In January, an informal ethics committee at Stanford University endorsed a proposal to create mice with brains nearly completely made of human brain cells. Stem cell scientist Irving Weissman said his experiment could provide unparalleled insight into how the human brain develops and how degenerative brain diseases like Parkinson's progress. Stanford law professor Hank Greely, who chaired the ethics committee, said the board was satisfied that the size and shape of the mouse brain would prevent the human cells from creating any traits of humanity. Just in case, Greely said, the committee recommended closely monitoring the mice's behavior and immediately killing any that display human-like behavior. The Academies' report recommends that each institution involved in stem cell research create a formal, standing committee to specifically oversee the work, including experiments that mix human and animal cells. Weissman, who has already created mice with 1 percent human brain cells, said he has no immediate plans to make mostly human mouse brains, but wanted to get ethical clearance in any case. A formal Stanford committee that oversees research at the university would also need to authorize the experiment. Few human-animal hybrids are as advanced as the sheep created by another stem cell scientist, Esmail Zanjani, and his team at the University of Nevada-Reno. They want to one day turn sheep into living factories for human organs and tissues and along the way create cutting-edge lab animals to more effectively test experimental drugs. Zanjani is most optimistic about the sheep that grow partially human livers after human stem cells are injected into them while they are still in the womb. Most of the adult sheep in his experiment contain about 10 percent human liver cells, though a few have as much as 40 percent, Zanjani said. Because the human liver regenerates, the research raises the possibility of transplanting partial organs into people whose livers are failing. Zanjani must first ensure no animal diseases would be passed on to patients. He also must find an efficient way to completely separate the human and sheep cells, a tough task because the human cells aren't clumped together but are rather spread throughout the sheep's liver. Zanjani and other stem cell scientists defend their research and insist they aren't creating monsters - or anything remotely human. "We haven't seen them act as anything but sheep," Zanjani said. Zanjani's goals are many years from being realized. He's also had trouble raising funds, and the U.S. Department of Agriculture is investigating the university over allegations made by another researcher that the school mishandled its research sheep. Zanjani declined to comment on that matter, and university officials have stood by their practices. Allegations about the proper treatment of lab animals may take on strange new meanings as scientists work their way up the evolutionary chart. First, human stem cells were injected into bacteria, then mice and now sheep. Such research blurs biological divisions between species that couldn't until now be breached. Drawing ethical boundaries that no research appears to have crossed yet, the Academies recommend a prohibition on mixing human stem cells with embryos from monkeys and other primates. 
But even that policy recommendation isn't tough enough for some researchers. "The boundary is going to push further into larger animals," New York Medical College professor Stuart Newman said. "That's just asking for trouble." Newman and anti-biotechnology activist Jeremy Rifkin have been tracking this issue for the last decade and were behind a rather creative assault on both interspecies mixing and the government's policy of patenting individual human genes and other living matter. Years ago, the two applied for a patent for what they called a "humanzee," a hypothetical - but very possible - creation that was half human and half chimp. The U.S. Patent and Trademark Office finally denied their application this year, ruling that the proposed invention was too human: Constitutional prohibitions against slavery prevent the patenting of people. Newman and Rifkin were delighted, since they never intended to create the creature and instead wanted to use their application to protest what they see as science and commerce turning people into commodities. And that's a point, Newman warns, that stem cell scientists are edging closer to every day: "Once you are on the slope, you tend to move down it." From christian.rauh at uconn.edu Mon May 2 22:49:41 2005 From: christian.rauh at uconn.edu (Christian Rauh) Date: Mon, 02 May 2005 18:49:41 -0400 Subject: [Paleopsych] What would *you* take to a desert island? Message-ID: <4276AE85.70105@uconn.edu> -------------- next part -------------- A non-text attachment was scrubbed... Name: chart.gif Type: image/gif Size: 10164 bytes Desc: not available URL: From unstasis at gmail.com Tue May 3 00:30:42 2005 From: unstasis at gmail.com (Stephen Lee) Date: Mon, 2 May 2005 20:30:42 -0400 Subject: [Paleopsych] test Message-ID: <951ad0705050217301564af3@mail.gmail.com> Hey, just heard from Greg Bear that nothing seemed to be going on in the group. So I figured I may as well put a message in to see if it does. Might just be that no one has posted since April 29th? Does this seem accurate? And as a fun little filler question: what would you see as the main differences and difficulties in bringing into being a fully applicable science of paleopsychology, as contrasted with Asimov's concept of psychohistory? Or ignore this, as most tests should be ignored. -- -- If Nothing Is Then Nothing Was But something is everywhere Just because -- http://www.freewebs.com/rewander http://hopeisus.fateback.com/story.html From anonymous_animus at yahoo.com Tue May 3 18:07:52 2005 From: anonymous_animus at yahoo.com (Michael Christopher) Date: Tue, 3 May 2005 11:07:52 -0700 (PDT) Subject: [Paleopsych] useless people, reality shows etc In-Reply-To: <200505031711.j43HBrR07037@tick.javien.com> Message-ID: <20050503180752.97622.qmail@web30802.mail.mud.yahoo.com> Gerry says: >>I wouldn't call these jobs examples of Japan losing its humanity but rather an indication that government is providing work for those who are unemployed but wish (and need) something to do.<< --Oh, I agree. People tend to do better when they have something to do to contribute to society. I was referring to the label "useless people" which was probably meant tongue in cheek, but which I've seen used more and more in a serious tone. There are a lot of people who see human beings in terms of their economic worth, and if someone doesn't adapt to the economic system, they're labeled "parasites" or something similar. Which reminds me of the fascist view of human beings, that they are cogs in a machine, worth only their material output.
>>The leader who has a knack for bullying the rest of the group is usually the one who makes it to the end and finishes a winner. Cunning and dishonesty are values promoted by these survivor programs and the one who wins is he/she who is most deceptive. These values are ones NOT taught to children by caring parents.<< --Very true. I'm wondering how many people in our culture view "the game" as one of cutting other people's throats, and what effect that has on the health of the overall system. Ideally, our economic system would reward talent and hard work. But what happens when the rewards go to those who are better at manipulating others? Many young people seem to have incorporated those values in the sexual arena, with girls rewarding the most manipulative boys with sex, and boys rewarding girls who use their sexuality to get ahead of other girls. Where did they learn it? Do we blame 60's-style "free love" or the more competitive 80's yuppie ethic? >>Howard's notion of Capitalism with Soul is a humanistic thrust into an otherwise corrupt world.<< --Another way of looking at it is that it's a more ecological view of capitalism, putting it in context rather than divorcing it from other values. The goal of capitalism is not necessarily to make as much money as you can by manipulating others and feeding an emptiness in people maintained by endless striving to get ahead of others. It is to find the hidden yearnings in an audience, the unarticulated dreams, and make them real. If you watch TV ads, you'll see a lot of spiritual or deep emotional themes, attached to products which you wouldn't normally think of as "spiritual". A car is not who you are, but marketers learned in recent decades to market cars as if they were extensions of the self, especially the sexual self. Imagine if all the psychological knowledge and creative genius going into marketing products went into marketing capitalism with soul, marketing curiosity about science and understanding of ecology (natural and human). Imagine if video games taught math and physics, without losing their entertainment value. Why are we relying on an educational system based on textbooks and lectures, when the real money and talent is going into entertainment and advertising? >>Yet not all souls are alike. Some are more generous and congenial than others.<< --I think whatever the game rewards, you get more of. Reward kids for being curious about science and you'll get more kids interested in science. Reward kids for being aggressive or manipulative, and you'll get more of that. Our current system is inconsistent in its rewards, so we get inconsistent results. >>People who promote less science and more religion are those who are fed up not with Darwinism, but with young bullies who believe in a cut-throat bottom-line rather than producing a caring and thoughtful human being.<< --I think religious people have a variety of motives. Some just resent that producers market movies and music to their kids that they feel teaches bad values (I can actually relate to that... while I'm not offended by profanity or sex in films, it gets annoying when it's used as a habitual selling point). Others are very darwinian in the economic sense, but socialistic in the sexual arena. Your money is your own, but your sexuality belongs to the community or to God. Some are more interested in tax cuts, with religion being used to justify it. If the GOP raised taxes, it wouldn't matter how often they mention God, a certain percentage of voters would abandon the party. 
If Democrats lowered taxes but supported abortion rights, I have no idea how lines would split. Many just felt marginalized in college, reacting against arrogance in liberal professors, or to feeling rejected by liberal kids. That is perhaps one reason why many evangelicals see liberalism in terms of 60's stereotypes rather than the state of current liberal thinking. That kind of thing goes in cycles. When conservatism is dominant, kids who don't fit in feel the same marginalization and rejection that evangelical kids felt in the 70's. And many Evangelicals associate promiscuity and drug use with liberalism or secularism, which is a bit like associating Enron with conservatism. Just because a kid is promiscuous does NOT mean s/he was raised by environmentalists or antiwar activists. Conservatism does not guarantee good parenting, but stereotypes die hard. The young bullies are not all atheists. Many are taught violence in the home, often in the name of religion, and then take it out on other kids. The assumption is that kids are "running wild" because of "permissive parenting" but often if you look into it, the kids have been severely punished, then neglected when corporal punishment backfired. But there's always a national myth, which takes precedence over reality. The "save marriage" movement includes a lot of conservative evangelicals who have a higher divorce rate than atheists. The myth says, "We are protecting marriage" but the reality says "we can't keep our own marriages together, so let's go after gay marriage instead." It takes time for these things to sort themselves out and for the myth to fall back in line with reality. Michael __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com From waluk at earthlink.net Tue May 3 18:08:38 2005 From: waluk at earthlink.net (G. Reinhart-Waller) Date: Tue, 03 May 2005 11:08:38 -0700 Subject: [Paleopsych] test In-Reply-To: <951ad0705050217301564af3@mail.gmail.com> References: <951ad0705050217301564af3@mail.gmail.com> Message-ID: <4277BE26.6060804@earthlink.net> I just checked the Paleopsych archive files, and the month of May is listed, including Christian Rauh's post about taking along a computer with internet access to a desert island. Stephen Lee wrote: >Hey, just heard from Greg Bear that nothing seemed to be going on in the >group. So I figured I may as well put a message in to see if it does. > >Might just be that no one has posted since April 29th? Does this seem accurate? > >And as a fun little filler question: what would you see as the main >differences and difficulties in bringing into being a fully applicable >science of paleopsychology, as contrasted with Asimov's concept of >psychohistory? > >Or ignore this, as most tests should be ignored. > > > From anonymous_animus at yahoo.com Tue May 3 18:12:59 2005 From: anonymous_animus at yahoo.com (Michael Christopher) Date: Tue, 3 May 2005 11:12:59 -0700 (PDT) Subject: [Paleopsych] IQ and race In-Reply-To: <200505031711.j43HBrR07037@tick.javien.com> Message-ID: <20050503181259.98945.qmail@web30802.mail.mud.yahoo.com> Greg says: >>Not only does this scam claim to prove your nagging suspicions that blacks are inferior to whites, but it's COMPLETELY GUILT-FREE, because it's RATIONAL, based on PROVABLE MATHEMATICS! And better than that, it's supported by the nagging suspicions of PEOPLE JUST LIKE YOU!
People who grew up in a different time.<< --There was an interesting article in the Atlantic magazine called Thin Ice, about what they called Stereotype Threat. They did some experiments, one in which they told a group of white students, "Asians are expected to do better on these tests". The scores of the white students dropped. Apparently, perceptions and expectations have a significant effect on test scores, and when Bush talks about "the soft tyranny of low expectations" he may be right. Groups that feel especially pressured to counteract stereotypes about their performance do poorly on tests, even if they do very well in less pressured settings. A website on the subject: http://www.personal.psu.edu/users/t/r/trc139/ Michael __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com From anonymous_animus at yahoo.com Tue May 3 19:05:56 2005 From: anonymous_animus at yahoo.com (Michael Christopher) Date: Tue, 3 May 2005 12:05:56 -0700 (PDT) Subject: [Paleopsych] child's play In-Reply-To: <200505031800.j43I0CR24822@tick.javien.com> Message-ID: <20050503190556.13160.qmail@web30802.mail.mud.yahoo.com> >>Children must have independent, competitive rough-and-tumble play. Not only do they enjoy it, it is part of their normal development. Anthony Pellegrini, a professor of early childhood education at the University of Minnesota, defines rough-and-tumble play as behavior that includes "laughing, running, smiling, jumping ... wrestling, play fighting, chasing, and fleeing." Such play, he says, brings children together, it makes them happy and it promotes healthy socialization.<< --That's an odd list... smiling is lumped in with wrestling? And with no distinction among levels of roughness in play fighting? I doubt there's any "politically correct" movement to ban smiling on the playground. Michael __________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com From waluk at earthlink.net Tue May 3 20:04:52 2005 From: waluk at earthlink.net (G. Reinhart-Waller) Date: Tue, 03 May 2005 13:04:52 -0700 Subject: [Paleopsych] useless people, reality shows etc In-Reply-To: <20050503180752.97622.qmail@web30802.mail.mud.yahoo.com> References: <20050503180752.97622.qmail@web30802.mail.mud.yahoo.com> Message-ID: <4277D964.3070807@earthlink.net> Michael Christopher wrote: >>--Oh, I agree. People tend to do better when they have something to do to contribute to society. I was referring to the label "useless people" which was probably meant tongue in cheek, but which I've seen used more and more in a serious tone. There are a lot of people who see human beings in terms of their economic worth, and if someone doesn't adapt to the economic system, they're labeled "parasites" or something similar. Which reminds me of the fascist view of human beings, that they are cogs in a machine, worth only their material output.>> The label "useless people" depends on who is offering a definition. To a confirmed Marxist, this expression references anyone who isn't gainfully employed in a well defined occupation or profession. So often when I'm asked what I do and reply "I'm an Independent Scholar", I usually get a reply of "No, I mean what do you do for employment"? I think economists in general view those not engaged in a clearly defined work ethic as being outside the mainstream and not going with the flow. 
That's a shame in our rampant capitalistic world. >>--Very true. I'm wondering how many people in our culture view "the game" as one of cutting other people's throats, and what effect that has on the health of the overall system. Ideally, our economic system would reward talent and hard work. But what happens when the rewards go to those who are better at manipulating others? Many young people seem to have incorporated those values in the sexual arena, with girls rewarding the most manipulative boys with sex, and boys rewarding girls who use their sexuality to get ahead of other girls. Where did they learn it? Do we blame 60's-style "free love" or the more competitive 80's yuppie ethic? >> You were the person who mentioned the "survivor shows" which are presently the rage on American television. These are perfect examples of someone being voted out of the group if he/she doesn't offer a distinct advantage to the voter. This has nothing whatsoever to do with leadership but rather with cunning and deception to establish a "king of the hill". I have always placed "blame" (if we can use that term) on a strict Darwinian interpretation of the "survival of the fittest" mantra that scholars have come to modify as "survival of the wealthiest". Many of the major academic institutions applaud economists, business school graduates and those who find themselves enamoured with Donald Trump's boardroom and offer them plum rewards for their deceptions. Rather than placing blame, we should instead focus on our ideals and goals that contribute to a group rather than only to personal satisfaction. Incidentally, this seems to be the thrust of the t.v. show "Extreme Makeover, Home Edition" where a group of volunteers designs, builds, and decorates a home for worthy clients. It's not much, but it's a beginning. You may also be interested in http://www.habitat.org. >>--Another way of looking at it is that it's a more ecological view of capitalism, putting it in context rather than divorcing it from other values. The goal of capitalism is not necessarily to make as much money as you can by manipulating others and feeding an emptiness in people maintained by endless striving to get ahead of others. It is to find the hidden yearnings in an audience, the unarticulated dreams, and make them real. If you watch TV ads, you'll see a lot of spiritual or deep emotional themes, attached to products which you wouldn't normally think of as "spiritual". A car is not who you are, but marketers learned in recent decades to market cars as if they were extensions of the self, especially the sexual self. Imagine if all the psychological knowledge and creative genius going into marketing products went into marketing capitalism with soul, marketing curiosity about science and understanding of ecology (natural and human). Imagine if video games taught math and physics, without losing their entertainment value. Why are we relying on an educational system based on textbooks and lectures, when the real money and talent is going into entertainment and advertising? >> A strong ecological view certainly addresses the BIG picture. Most advertising appeals to sexual issues that create macho man and working housewife. Who would select a hybrid auto when one can drive a sexy Porsche! Cars, clothing fashions, home decorations, even food preparation zero in on the latest and sexiest trends that designers have adapted so that the public will have fun while eating, sleeping, or working.
According to Bill Gates, our education system is outdated when compared with Japan's or those of other up-and-coming Asian countries, and he is absolutely correct. American capitalists chase the image of becoming a billionaire, and the quickest and easiest way to attain this goal is in the sports and entertainment industries. Yes, I can imagine....and possibly contribute my two cents, but that won't make a dent in marketing soul along with capitalism. Maybe I should think in terms of moving to Canada :-) . Best regards, Gerry Reinhart-Waller From eshel at physics.ucsd.edu Tue May 3 21:52:57 2005 From: eshel at physics.ucsd.edu (Eshel Ben-Jacob) Date: Tue, 3 May 2005 23:52:57 +0200 Subject: [Paleopsych] test References: <951ad0705050217301564af3@mail.gmail.com> Message-ID: <002101c5502a$78059f10$911bef84@IBMF68D4578947> I got the test, Eshel Eshel Ben-Jacob. Professor of Physics The Maguy-Glass Professor in Physics of Complex Systems eshel at tamar.tau.ac.il ebenjacob at ucsd.edu Home Page: http://star.tau.ac.il/~eshel/ Visit http://physicaplus.org.il - PhysicaPlus the online magazine of the Israel Physical Society School of Physics and Astronomy 10/2004 -10/2005 Tel Aviv University, 69978 Tel Aviv, Israel Center for Theoretical Biological Physics Tel 972-3-640 7845/7604 (Fax) -6425787 University of California San Diego La Jolla, CA 92093-0354 USA Tel (office) 1-858-534 0524 (Fax) -534 7697 ----- Original Message ----- From: "Stephen Lee" To: "The new improved paleopsych list" Sent: Tuesday, May 03, 2005 2:30 AM Subject: [Paleopsych] test Hey, just heard from Greg Bear that nothing seemed to be going on in the group. So I figured I may as well put a message in to see if it does. Might just be that no one has posted since April 29th? Does this seem accurate? And as a fun little filler question: what would you see as the main differences and difficulties in bringing into being a fully applicable science of paleopsychology, as contrasted with Asimov's concept of psychohistory? Or ignore this, as most tests should be ignored. -- -- If Nothing Is Then Nothing Was But something is everywhere Just because -- http://www.freewebs.com/rewander http://hopeisus.fateback.com/story.html _______________________________________________ paleopsych mailing list paleopsych at paleopsych.org http://lists.paleopsych.org/mailman/listinfo/paleopsych +++++++++++++++++++++++++++++++++++++++++++ This Mail Was Scanned By Mail-seCure System at the Tel-Aviv University CC. From checker at panix.com Tue May 3 22:14:00 2005 From: checker at panix.com (Premise Checker) Date: Tue, 3 May 2005 18:14:00 -0400 (EDT) Subject: [Paleopsych] Scientific American: His Brain, Her Brain Message-ID: His Brain, Her Brain http://www.sciam.com/print_version.cfm?articleID=000363E3-1806-1264-980683414B7F0000 April 25, 2005 It turns out that male and female brains differ quite a bit in architecture and activity. Research into these variations could lead to sex-specific treatments for disorders such as depression and schizophrenia By Larry Cahill On a gray day in mid-January, Lawrence Summers, the president of Harvard University, suggested that innate differences in the build of the male and female brain might be one factor underlying the relative scarcity of women in science. His remarks reignited a debate that has been smoldering for a century, ever since some scientists sizing up the brains of both sexes began using their main finding--that female brains tend to be smaller--to bolster the view that women are intellectually inferior to men.
To date, no one has uncovered any evidence that anatomical disparities might render women incapable of achieving academic distinction in math, physics or engineering. And the brains of men and women have been shown to be quite clearly similar in many ways. Nevertheless, over the past decade investigators have documented an astonishing array of structural, chemical and functional variations in the brains of males and females. These inequities are not just interesting idiosyncrasies that might explain why more men than women enjoy the Three Stooges. They raise the possibility that we might need to develop sex-specific treatments for a host of conditions, including depression, addiction, schizophrenia and post-traumatic stress disorder (PTSD). Furthermore, the differences imply that researchers exploring the structure and function of the brain must take into account the sex of their subjects when analyzing their data--and include both women and men in future studies or risk obtaining misleading results. Sculpting the Brain Not so long ago neuroscientists believed that sex differences in the brain were limited mainly to those regions responsible for mating behavior. In a 1966 Scientific American article entitled "Sex Differences in the Brain," Seymour Levine of Stanford University described how sex hormones help to direct divergent reproductive behaviors in rats--with males engaging in mounting and females arching their backs and raising their rumps to attract suitors. Levine mentioned only one brain region in his review: the hypothalamus, a small structure at the base of the brain that is involved in regulating hormone production and controlling basic behaviors such as eating, drinking and sex. A generation of neuroscientists came to maturity believing that "sex differences in the brain" referred primarily to mating behaviors, sex hormones and the hypothalamus. That view, however, has now been knocked aside by a surge of findings that highlight the influence of sex on many areas of cognition and behavior, including memory, emotion, vision, hearing, the processing of faces and the brain's response to stress hormones. This progress has been accelerated in the past five to 10 years by the growing use of sophisticated noninvasive imaging techniques such as positron-emission tomography (PET) and functional magnetic resonance imaging (fMRI), which can peer into the brains of living subjects. These imaging experiments reveal that anatomical variations occur in an assortment of regions throughout the brain. Jill M. Goldstein of Harvard Medical School and her colleagues, for example, used MRI to measure the sizes of many cortical and subcortical areas. Among other things, these investigators found that parts of the frontal cortex, the seat of many higher cognitive functions, are bulkier in women than in men, as are parts of the limbic cortex, which is involved in emotional responses. In men, on the other hand, parts of the parietal cortex, which is involved in space perception, are bigger than in women, as is the amygdala, an almond-shaped structure that responds to emotionally arousing information--to anything that gets the heart pumping and the adrenaline flowing.
These size differences, as well as others mentioned throughout the article, are relative: they refer to the overall volume of the structure relative to the overall volume of the brain. Differences in the size of brain structures are generally thought to reflect their relative importance to the animal. For example, primates rely more on vision than olfaction; for rats, the opposite is true. As a result, primate brains maintain proportionately larger regions devoted to vision, and rats devote more space to olfaction. So the existence of widespread anatomical disparities between men and women suggests that sex does influence the way the brain works. Other investigations are finding anatomical sex differences at the cellular level. For example, Sandra Witelson and her colleagues at McMaster University discovered that women possess a greater density of neurons in parts of the temporal lobe cortex associated with language processing and comprehension. On counting the neurons in postmortem samples, the researchers found that of the six layers present in the cortex, two show more neurons per unit volume in females than in males. Similar findings were subsequently reported for the frontal lobe. With such information in hand, neuroscientists can now explore whether sex differences in neuron number correlate with differences in cognitive abilities--examining, for example, whether the boost in density in the female auditory cortex relates to women's enhanced performance on tests of verbal fluency. Such anatomical diversity may be caused in large part by the activity of the sex hormones that bathe the fetal brain. These steroids help to direct the organization and wiring of the brain during development and influence the structure and neuronal density of various regions. Interestingly, the brain areas that Goldstein found to differ between men and women are ones that in animals contain the highest number of sex hormone receptors during development. This correlation between brain region size in adults and sex steroid action in utero suggests that at least some sex differences in cognitive function do not result from cultural influences or the hormonal changes associated with puberty--they are there from birth. Inborn Inclinations Several intriguing behavioral studies add to the evidence that some sex differences in the brain arise before a baby draws its first breath. Through the years, many researchers have demonstrated that when selecting toys, young boys and girls part ways. Boys tend to gravitate toward balls or toy cars, whereas girls more typically reach for a doll. But no one could really say whether those preferences are dictated by culture or by innate brain biology. To address this question, Melissa Hines of City University London and Gerianne M. Alexander of Texas A&M University turned to monkeys, one of our closest animal cousins. The researchers presented a group of vervet monkeys with a selection of toys, including rag dolls, trucks and some gender-neutral items such as picture books. They found that male monkeys spent more time playing with the "masculine" toys than their female counterparts did, and female monkeys spent more time interacting with the playthings typically preferred by girls. Both sexes spent equal time monkeying with the picture books and other gender-neutral toys. Because vervet monkeys are unlikely to be swayed by the social pressures of human culture, the results imply that toy preferences in children result at least in part from innate biological differences. 
This divergence, and indeed all the anatomical sex differences in the brain, presumably arose as a result of selective pressures during evolution. In the case of the toy study, males--both human and primate--prefer toys that can be propelled through space and that promote rough-and-tumble play. These qualities, it seems reasonable to speculate, might relate to the behaviors useful for hunting and for securing a mate. Similarly, one might also hypothesize that females, on the other hand, select toys that allow them to hone the skills they will one day need to nurture their young. Simon Baron-Cohen and his associates at the University of Cambridge took a different but equally creative approach to addressing the influence of nature versus nurture regarding sex differences. Many researchers have described disparities in how "people-centered" male and female infants are. For example, Baron-Cohen and his student Svetlana Lutchmaya found that one-year-old girls spend more time looking at their mothers than boys of the same age do. And when these babies are presented with a choice of films to watch, the girls look longer at a film of a face, whereas boys lean toward a film featuring cars. Of course, these preferences might be attributable to differences in the way adults handle or play with boys and girls. To eliminate this possibility, Baron-Cohen and his students went a step further. They took their video camera to a maternity ward to examine the preferences of babies that were only one day old. The infants saw either the friendly face of a live female student or a mobile that matched the color, size and shape of the student's face and included a scrambled mix of her facial features. To avoid any bias, the experimenters were unaware of each baby's sex during testing. When they watched the tapes, they found that the girls spent more time looking at the student, whereas the boys spent more time looking at the mechanical object. This difference in social interest was evident on day one of life--implying again that we come out of the womb with some cognitive sex differences built in. Under Stress In many cases, sex differences in the brain's chemistry and construction influence how males and females respond to the environment or react to, and remember, stressful events. Take, for example, the amygdala. Goldstein and others have reported that the amygdala is larger in men than in women. And in rats, the neurons in this region make more numerous interconnections in males than in females. These anatomical variations would be expected to produce differences in the way that males and females react to stress. To assess whether male and female amygdalae in fact respond differently to stress, Katharina Braun and her co-workers at Otto von Guericke University in Magdeburg, Germany, briefly removed a litter of Degu pups from their mother. For these social South American rodents, which live in large colonies like prairie dogs do, even temporary separation can be quite upsetting. The researchers then measured the concentration of serotonin receptors in various brain regions. Serotonin is a neurotransmitter, or signal-carrying molecule, that is key for mediating emotional behavior. (Prozac, for example, acts by increasing serotonin function.) The workers allowed the pups to hear their mother's call during the period of separation and found that this auditory input increased the serotonin receptor concentration in the males' amygdala, yet decreased the concentration of these same receptors in females. 
Although it is difficult to extrapolate from this study to human behavior, the results hint that if something similar occurs in children, separation anxiety might differentially affect the emotional well-being of male and female infants. Experiments such as these are necessary if we are to understand why, for instance, anxiety disorders are far more prevalent in girls than in boys. Another brain region now known to diverge in the sexes anatomically and in its response to stress is the hippocampus, a structure crucial for memory storage and for spatial mapping of the physical environment. Imaging consistently demonstrates that the hippocampus is larger in women than in men. These anatomical differences might well relate somehow to differences in the way males and females navigate. Many studies suggest that men are more likely to navigate by estimating distance in space and orientation ("dead reckoning"), whereas women are more likely to navigate by monitoring landmarks. Interestingly, a similar sex difference exists in rats. Male rats are more likely to navigate mazes using directional and positional information, whereas female rats are more likely to navigate the same mazes using available landmarks. (Investigators have yet to demonstrate, however, that male rats are less likely to ask for directions.) Even the neurons in the hippocampus behave differently in males and females, at least in how they react to learning experiences. For example, Janice M. Juraska and her associates at the University of Illinois have shown that placing rats in an "enriched environment"--cages filled with toys and with fellow rodents to promote social interactions--produced dissimilar effects on the structure of hippocampal neurons in male and female rats. In females, the experience enhanced the "bushiness" of the branches in the cells' dendritic trees--the many-armed structures that receive signals from other nerve cells. This change presumably reflects an increase in neuronal connections, which in turn is thought to be involved with the laying down of memories. In males, however, the complex environment either had no effect on the dendritic trees or pruned them slightly. But male rats sometimes learn better in the face of stress. Tracey J. Shors of Rutgers University and her collaborators have found that a brief exposure to a series of one-second tail shocks enhanced performance of a learned task and increased the density of dendritic connections to other neurons in male rats yet impaired performance and decreased connection density in female rats. Findings such as these have interesting social implications. The more we discover about how brain mechanisms of learning differ between the sexes, the more we may need to consider how optimal learning environments potentially differ for boys and girls. Although the hippocampus of the female rat can show a decrement in response to acute stress, it appears to be more resilient than its male counterpart in the face of chronic stress. Cheryl D. Conrad and her co-workers at Arizona State University restrained rats in a mesh cage for six hours--a situation that the rodents find disturbing. The researchers then assessed how vulnerable their hippocampal neurons were to killing by a neurotoxin--a standard measure of the effect of stress on these cells. They noted that chronic restraint rendered the males' hippocampal cells more susceptible to the toxin but had no effect on the females' vulnerability. 
These findings, and others like them, suggest that in terms of brain damage, females may be better equipped to tolerate chronic stress than males are. Still unclear is what protects female hippocampal cells from the damaging effects of chronic stress, but sex hormones very likely play a role. The Big Picture Extending the work on how the brain handles and remembers stressful events, my colleagues and I have found contrasts in the way men and women lay down memories of emotionally arousing incidents--a process known from animal research to involve activation of the amygdala. In one of our first experiments with human subjects, we showed volunteers a series of graphically violent films while we measured their brain activity using PET. A few weeks later we gave them a quiz to see what they remembered. We discovered that the number of disturbing films they could recall correlated with how active their amygdala had been during the viewing. Subsequent work from our laboratory and others confirmed this general finding. But then I noticed something strange. The amygdala activation in some studies involved only the right hemisphere, and in others it involved only the left hemisphere. It was then I realized that the experiments in which the right amygdala lit up involved only men; those in which the left amygdala was fired up involved women. Since then, three subsequent studies--two from our group and one from John Gabrieli and Turhan Canli and their collaborators at Stanford--have confirmed this difference in how the brains of men and women handle emotional memories. The realization that male and female brains were processing the same emotionally arousing material into memory differently led us to wonder what this disparity might mean. To address this question, we turned to a century-old theory stating that the right hemisphere is biased toward processing the central aspects of a situation, whereas the left hemisphere tends to process the finer details. If that conception is true, we reasoned, a drug that dampens the activity of the amygdala should impair a man's ability to recall the gist of an emotional story (by hampering the right amygdala) but should hinder a woman's ability to come up with the precise details (by hampering the left amygdala). Propranolol is such a drug. This so-called beta blocker quiets the activity of adrenaline and its cousin noradrenaline and, in so doing, dampens the activation of the amygdala and weakens recall of emotionally arousing memories. We gave this drug to men and women before they viewed a short slide show about a young boy caught in a terrible accident while walking with his mother. One week later we tested their memory. The results showed that propranolol made it harder for men to remember the more holistic aspects, or gist, of the story--that the boy had been run over by a car, for example. In women, propranolol did the converse, impairing their memory for peripheral details--that the boy had been carrying a soccer ball. In more recent investigations, we found that we can detect a hemispheric difference between the sexes in response to emotional material almost immediately. Volunteers shown emotionally unpleasant photographs react within 300 milliseconds--a response that shows up as a spike on a recording of the brain's electrical activity. 
With Antonella Gasbarri and others at the University of L'Aquila in Italy, we have found that in men, this quick spike, termed a P300 response, is more exaggerated when recorded over the right hemisphere; in women, it is larger when recorded over the left. Hence, sex-related hemispheric disparities in how the brain processes emotional images begin within 300 milliseconds--long before people have had much, if any, chance to consciously interpret what they have seen. These discoveries might have ramifications for the treatment of PTSD. Previous research by Gustav Schelling and his associates at Ludwig Maximilian University in Germany had established that drugs such as propranolol diminish memory for traumatic situations when administered as part of the usual therapies in an intensive care unit. Prompted by our findings, they found that, at least in such units, beta blockers reduce memory for traumatic events in women but not in men. Even in intensive care, then, physicians may need to consider the sex of their patients when meting out their medications. Sex and Mental Disorders PTSD is not the only psychological disturbance that appears to play out differently in women and men. A PET study by Mirko Diksic and his colleagues at McGill University showed that serotonin production was a remarkable 52 percent higher on average in men than in women, which might help clarify why women are more prone to depression--a disorder commonly treated with drugs that boost the concentration of serotonin. A similar situation might prevail in addiction. In this case, the neurotransmitter in question is dopamine--a chemical involved in the feelings of pleasure associated with drugs of abuse. Studying rats, Jill B. Becker and her fellow investigators at the University of Michigan at Ann Arbor discovered that in females, estrogen boosted the release of dopamine in brain regions important for regulating drug-seeking behavior. Furthermore, the hormone had long-lasting effects, making the female rats more likely to pursue cocaine weeks after last receiving the drug. Such differences in susceptibility--particularly to stimulants such as cocaine and amphetamine--could explain why women might be more vulnerable to the effects of these drugs and why they tend to progress more rapidly from initial use to dependence than men do. Certain brain abnormalities underlying schizophrenia appear to differ in men and women as well. Ruben Gur, Raquel Gur and their colleagues at the University of Pennsylvania have spent years investigating sex-related differences in brain anatomy and function. In one project, they measured the size of the orbitofrontal cortex, a region involved in regulating emotions, and compared it with the size of the amygdala, implicated more in producing emotional reactions. The investigators found that women possess a significantly larger orbitofrontal-to-amygdala ratio (OAR) than men do. One can speculate from these findings that women might on average prove more capable of controlling their emotional reactions. In additional experiments, the researchers discovered that this balance appears to be altered in schizophrenia, though not identically for men and women. Women with schizophrenia have a decreased OAR relative to their healthy peers, as might be expected. But men, oddly, have an increased OAR relative to healthy men.
These findings remain puzzling, but, at the least, they imply that schizophrenia is a somewhat different disease in men and women and that treatment of the disorder might need to be tailored to the sex of the patient. Sex Matters In a comprehensive 2001 report on sex differences in human health, the prestigious National Academy of Sciences asserted that "sex matters. Sex, that is, being male or female, is an important basic human variable that should be considered when designing and analyzing studies in all areas and at all levels of biomedical and health-related research." Neuroscientists are still far from putting all the pieces together--identifying all the sex-related variations in the brain and pinpointing their influences on cognition and propensity for brain-related disorders. Nevertheless, the research conducted to date certainly demonstrates that differences extend far beyond the hypothalamus and mating behavior. Researchers and clinicians are not always clear on the best way to go forward in deciphering the full influences of sex on the brain, behavior and responses to medications. But growing numbers now agree that going back to assuming we can evaluate one sex and learn equally about both is no longer an option. From checker at panix.com Tue May 3 22:14:22 2005 From: checker at panix.com (Premise Checker) Date: Tue, 3 May 2005 18:14:22 -0400 (EDT) Subject: [Paleopsych] WkStd: Civilization and Its Malcontents Message-ID: Civilization and Its Malcontents http://www.weeklystandard.com/Utilities/printer_preview.asp?idArticle=5546&R=C4FE2FB13 Civilization and Its Malcontents Or, why are academics so unhappy? by Joseph Epstein 05/09/2005, Volume 010, Issue 32 Faculty Towers The Academic Novel and Its Discontents by Elaine Showalter University of Pennsylvania Press, 143 pp., $24.95 I HAD A FRIEND, now long dead, named Walter B. Scott, a professor at Northwestern University whose specialty was theatrical literature, who never referred to university teaching as other than a--or sometimes the--"racket." What Walter, a notably unambitious man, meant was that it was an unconscionably easy way to make a living, a soft touch, as they used to say. Working under conditions of complete freedom, having to show up in the classroom an impressively small number of hours each week, with the remainder of one's time chiefly left to cultivate one's own intellectual garden, at a job from which one could never be fired and which (if one adds up the capacious vacation time) amounted to fewer than six months' work a year for pay that is very far from miserable--yes, I'd say "a racket" just about gets it. And yet, as someone who came late to university teaching, I used to wonder why so many people in the racket were so obviously disappointed, depressed, and generally demoralized. Granted, until one achieves that Valhalla for scholars known as tenure--which really means lifetime security, obtainable on no other job that I know--an element of tension is entailed, but then so is it in every other job. As a young instructor, one is often assigned dogsbody work, teaching what is thought to be dull fare: surveys, composition courses, and the rest. But the unhappier academics, in my experience, are not those still struggling to gain a seat at the table, but those who have already grown dour from having been there for a long while. So far as I know, no one has ever done a study of the unhappiness of academics. Who might be assigned to the job?
Business-school professors specializing in industrial psychology and employer/employee relations would botch it. Disaffected sociologists would blame it all on society and knock off for the rest of the semester. My own preference would be anthropologists, using methods long ago devised for investigating a culture from the outside in. The closest thing we have to these ideal anthropologists has been novelists writing academic novels, and their lucubrations, while not as precise as one would like on the reasons for the unhappiness of academics, do show a strong and continuing propensity on the part of academics intrepidly to make the worst of what ought to be a perfectly delightful situation. Faculty Towers is a report on the findings of those novelists who have worked the genre long known as the academic novel. The book is written by an insider, for Professor Elaine Showalter, now in her middle sixties, is, as they used to say on the carnival grounds, "with the show." At various places in her slight book, she inserts her own experience as a graduate student and professor, though not to very interesting effect. An early entry in the feminist sweepstakes, she is currently the Avalon Foundation Professor of the Humanities at Princeton, a past president of the Modern Language Association, a founder of "gynocriticism" (or the study of women writers)--in other words, guilty until proven innocent. She has also been described--readers retaining a strong sense of decorum are advised to skip the remainder of this paragraph--as "Camille Paglia with balls," a description meant approbatively, or so at least Princeton must feel, for they print it on princetoninfo.com, a stark indication of the tone currently reigning in American universities. Professor Showalter's book is chiefly a chronological account of Anglophone academic novels for the past sixty or so years, beginning with C.P. Snow's The Masters (1951) and running through examples of the genre produced in the 21st century. Faculty Towers is, for the most part, given over to plot summaries of these novels, usually accompanied by judgments about their quality, with extra bits of feminism (mild scorn is applied where the plight of women in academic life is ignored) thrown in at no extra charge. The book's title, playing off the John Cleese comedy Fawlty Towers, suggests the book's larger theme: that the university, as reflected in the academic novels Showalter examines, has increasingly become rather like a badly run hotel, with plenty of nuttiness to go round. The difficulty here is that Showalter believes that things are not all that nutty. Mirabile dictu: She finds them looking up. "The university," she writes, "is no longer a sanctuary or a refuge; it is fully caught up in the churning community and the changing society; but it is a fragile institution rather than a fortress." The feminism in Faculty Towers is generally no more than a tic, which the book's author by now probably cannot really control, and after a while one gets used to it, without missing it when it fails to show up. The only place Showalter's feminism seriously gets in the way, in my view, is in her judgments of Mary McCarthy's The Groves of Academe (a forgettable--and now quite properly forgotten--novel that she rates too highly) and Randall Jarrell's wickedly amusing Pictures from an Institution (which she attempts, intemperately, to squash).
The two misjudgments happen to be nicely connected: the most menacing character in Jarrell's novel, Gertrude Johnson, is based on Mary McCarthy, who may well be one of Showalter's personal heroines, of whom Jarrell has one of his characters remark: "She may be a mediocre novelist but you've got to admit that she's a wonderful liar." Sounds right to me. Being with the show has doubtless clouded Showalter's judgment of Pictures from an Institution, which contains, among several withering criticisms of university life, a marvelously prophetic description of the kind of perfectly characterless man who will eventually--that is to say, now, in our day--rise to the presidencies of universities all over the country. Cozening, smarmy, confidently boring, an appeaser of all and offender of none, "idiot savants of success" (Jarrell's perfect phrase), not really quite human but, like President Dwight Robbins of the novel's Benton College, men (and some women) with a gift for "seeming human"--in short, the kind of person the faculty of Harvard is currently hoping to turn the detoxed Lawrence Summers into if they can't succeed in firing him straightaway for his basic mistake in thinking that they actually believe in free speech. C.P. Snow's The Masters is a novel about the intramural political alignments involved in finding the right man to replace the dying master of a Cambridge college. In this novel, the worthiness of the university and the significance of the scholars and scientists contending for the job are not questioned; the conflict is between contending but serious points of view: scientific and humanistic, the school of cool progress versus that of warm tradition. In 1951, the university still seemed an altogether admirable place, professors serious and significant. Or so it seemed in the 1950s to those of us for whom going to college was not yet an automatic but still felt to be a privileged choice. One might think that the late 1960s blew such notions completely out of the water. It did, but not before Kingsley Amis, in Lucky Jim (1954), which Showalter rightly calls "the funniest academic satire of the century," first loosed the torpedoes. In Lucky Jim, the setting is a provincial English university and the dominant spirit is one of pomposity, nicely reinforced by cheap-shot one-upmanship and intellectual fraudulence. Jim Dixon, the novel's eponymous hero, striving to become a regular member of the history faculty, is at work on an article titled "The Economic Influence of Developments in Shipbuilding Techniques, 1450 to 1485," a perfect example of fake scholarship in which, as he recognizes, "pseudo light" is cast upon "false problems." Amis puts Dixon through every hell of social embarrassment and comic awkwardness, but the reason Jim is lucky, one might tend to forget in all the laughter, is that in the end he escapes the university and thus a life of intellectual fraudulence and spiritual aridity.
Jakobson is usually mocked for having made that remark, but he was probably correct: better to study writers than hire them. To hire a novelist for a university teaching job is turning the fox loose in the hen house. The result--no surprise here--has been feathers everywhere. Showalter makes only brief mention of one of my favorite academic novels, The Mind-Body Problem by Rebecca Goldstein. Ms. Goldstein is quoted on the interesting point that at Princeton Jews become gentilized while at Columbia Gentiles become judenized, which is not only amusing but true. Goldstein's novel is also brilliant on the snobbery of university life. She makes the nice point that the poorest dressers in academic life (there are no good ones) are the mathematicians, followed hard upon by the physicists. The reason they care so little about clothes--also about wine and the accoutrements of culture--is that, Goldstein rightly notes, they feel that in their work they are dealing with the higher truths, and need not be bothered with such kakapitze as cooking young vegetables, decanting wine correctly, and knowing where to stay in Paris. Where the accoutrements of culture count for most are in the humanities departments, where truth, as the physical scientists understand it, simply isn't part of the deal. "What do you guys in the English Department do," a scientist at Northwestern once asked me, quite in earnest, "just keep reading Shakespeare over and over, like Talmud?" "Nothing that grand," I found myself replying. Professor Showalter does not go in much for discussing the sex that is at the center of so many academic novels. Which reminds me that the first time I met Edward Shils, he asked me what I was reading. When I said The War Between the Tates by Alison Lurie, he replied, "Academic screwing, I presume." He presumed rightly. How could it be otherwise with academic novels? Apart from the rather pathetic power struggles over department chairmanships, or professorial appointments, love affairs, usually adulterous or officially outlawed ones, provide the only thing resembling drama on offer on the contemporary university campus. Early academic novels confined love affairs to adults on both sides. But by the 1970s, after the "student unrest" (still my favorite of all political euphemisms) of the late 1960s, students--first graduate students, then undergraduates--became the lovers of (often married) professors. If men were writing these novels, the experience was supposed to result in spiritual refreshment; if women wrote them, the male professors were merely damned fools. The women novelists, of course, were correct. The drama of love needs an element of impossibility: think Romeo and Juliet, think Anna Karenina, think Lolita. But in the academic novel, this element seems to have disappeared, especially in regard to the professor-student love affair, where the (usually female) student could no longer be considered very (if at all) innocent. The drama needed to derive elsewhere. That elsewhere hasn't yet been found, unless one counts sexual harassment suits, which are not yet the subject of an academic novel but have been that of Oleanna, a play by David Mamet, who is not an academic but grasped the dramatic element in such dreary proceedings. Sexual harassment, of course, touches on political correctness, which is itself the product of affirmative action, usually traveling under the code name of diversity. 
Many people outside universities may think that diversity has been imposed on universities from without by ignorant administrators. But professors themselves rather like it; it makes them feel they are doing the right thing and, hence, allows them, however briefly, to feel good about themselves. Nor is diversity the special preserve of prestige-laden or large state-run universities. In the 1970s, I was invited to give a talk at Denison University in Granville, Ohio. I arrived to find all the pieces in place: On the English faculty was a black woman (very nice, by the way), an appropriately snarky feminist, a gay (not teaching the thing called Queer Theory, which hadn't yet been devised), a Jew, and a woman named Ruthie, who drove about in an aged and messy Volkswagen bug, whose place in this otherwise unpuzzling puzzle I couldn't quite figure out. When I asked, I was told, "Oh, Ruthie's from the sixties." From "the sixties," I thought then and still think, sounds like a country, and perhaps it is, but assuredly, to steal a bit of Yeats, no country for old men. By the time I began teaching in the early 1970s, everyone already seemed to be in business for himself, looking for the best deal, which meant the least teaching for the most money at the most snobbishly well-regarded schools. The spirit of capitalism, for all that might be said on its behalf, wreaks havoc when applied to culture and education. The English novelist David Lodge neatly caught this spirit at work when he created, in two of his academic novels, the character Morris Zapp. A scholar-operator, Zapp, as described by Lodge, "is well-primed to enter a profession as steeped in free enterprise as Wall Street, in which each scholar-teacher makes an individual contract with his employer, and is free to sell his services to the highest bidder." Said to be based on the Milton-man Stanley Fish, an identification that Fish apparently has never disavowed but instead glories in, Morris Zapp is the freebooter to a high power turned loose in academic settings: always attempting to strengthen his own position, usually delighted to be of disservice to the old ideal of academic dignity and integrity. Fish himself ended his days with a deanship at the University of Illinois in Chicago for a salary said to be $250,000, much less than a utility infielder in the major leagues makes but, for an academic, a big number. By the time that the 1990s rolled around, all that was really left to the academic novel was to mock the mission of the university. With the onset of so-called theory in English and foreign-language departments, this became easier and easier to do. Professor Showalter does not approve of these goings-on: "The tone of ['90s academic novels]," she writes, "is much more vituperative, vengeful, and cruel than in earlier decades." The crueler the blows are required, I should say, the better to capture the general atmosphere of goofiness, which has become pervasive. Theory and the hodgepodge of feminism, Marxism, and queer theory that resides comfortably alongside it, has now been in the saddle for roughly a quarter-century in American English and Romance-language departments, while also making incursions into history, philosophy, and other once-humanistic subjects. There has been very little to show for it--no great books, no splendid articles or essays, no towering figures who signify outside the academy itself--except declining enrollments in English and other department courses featuring such fare. 
All that is left to such university teachers is the notion that they are, in a much-strained academic sense, avant-garde, which means that they continue to dig deeper and deeper for lower and lower forms of popular culture--graffiti on Elizabethan chamber pots--and human oddity. The best standard in the old days would have university scholars in literature and history departments publish books that could also be read with enjoyment and intellectual profit by nonscholars. Nothing of this kind is being produced today. In an academic thriller (a subdivision of the academic novel) cited by Showalter called Murder at the MLA, the head of the Wellesley English Department is found "dead as her prose." But almost all prose written in English departments these days is quite as dead as that English teacher. For Professor Showalter, the old days were almost exclusively the bad old days. A good radical matron, she recounts manning the phones for the support group protesting, at the 1968 Modern Language Association meeting, "the organization's conservatism and old-boy governance." Now of course it almost seems as if the annual MLA meetings chiefly exist for journalists to write comic pieces featuring the zany subjects of the papers given at each year's conference. At these meetings, in and out the room the women come and go, speaking of fellatio, which, deep readers that they are, they can doubtless find in Jane Austen. Such has been the politicization of the MLA that a counter-organization has been formed, called the Association of Literary Scholars and Critics, whose raison d'être is to get English studies back on track. I am myself a dues-paying ($35 annually) member of that organization. I do not go to its meetings, but I am sent the organization's newsletter and magazine, and they are a useful reminder of how dull English studies have traditionally been. But it is good to recall that dull is not ridiculous, dull is not always irrelevant, dull is not intellectual manure cast into the void. The bad old days in English departments were mainly the dull old days, with more than enough pedants and dryasdusts to go round. But they did also produce a number of university teachers whose work reached beyond university walls and helped elevate the general culture: Jacques Barzun, Lionel Trilling, Ellen Moers, Walter Jackson Bate, Aileen Ward, Robert Penn Warren. The names from the bad new days seem to end with the entirely political Edward Said and Cornel West. What we have today in universities is an extreme reaction to the dullness of that time, and also to the sheer exhaustion of subject matter for English department scholarship. No further articles and books about Byron, Shelley, Keats, or Kafka, Joyce, and the two Eliots seemed possible (which didn't of course stop them from coming). The pendulum has swung, but with a thrust so violent as to have gone through the cabinet in which the clock is stored. From an academic novel I've not read called The Death of a Constant Lover (1999) by Lev Raphael, Professor Showalter quotes a passage that ends the novel on the following threnodic note: Whenever I'm chatting at conferences with faculty members from other universities, the truth comes out after a drink or two: Hardly any academics are happy where they are, no matter how apt the students, how generous the salary or perks, how beautiful the setting, how light the teaching load, how lavish the research budget. 
I don't know if it's academia itself that attracts misfits and malcontents, or if the overwhelming hypocrisy of that world would have turned even the von Trapp family sullen. My best guess is that it's a good bit of both. Universities attract people who are good at school. Being good at school takes a real enough but very small talent. As the philosopher Robert Nozick once pointed out, all those A's earned through their young lives encourage such people to persist in school: to stick around, get more A's and more degrees, sign on for teaching jobs. When young, the life ahead seems glorious. They imagine themselves inspiring the young, writing important books, living out their days in cultivated leisure. But something, inevitably, goes awry, something disagreeable turns up in the punch bowl. Usually by the time they turn 40, they discover the students aren't sufficiently appreciative; the books don't get written; the teaching begins to feel repetitive; the collegiality is seldom anywhere near what one hoped for it; there isn't any good use for the leisure. Meanwhile, people who got lots of B's in school seem to be driving around in Mercedes, buying million-dollar apartments, enjoying freedom and prosperity in a manner that strikes the former good students, now professors, as not only unseemly but of a kind a just society surely would never permit. Now that politics has trumped literature in English departments, the situation is even worse. Beset by political correctness, self-imposed diversity, without leadership from above, university teachers, at least on the humanities and social-science sides, knowing the work they produce couldn't be of the least possible interest to anyone but the hacks of the MLA and similar academic organizations, have more reason than ever to be unhappy. And so let us leave them, overpaid and underworked, surly with alienation and unable to find any way out of the sweet racket into which they once so ardently longed to get. Joseph Epstein is a contributing editor to The Weekly Standard. From checker at panix.com Tue May 3 22:14:37 2005 From: checker at panix.com (Premise Checker) Date: Tue, 3 May 2005 18:14:37 -0400 (EDT) Subject: [Paleopsych] Nation: (Einstein) The Other 1905 Revolution Message-ID: The Other 1905 Revolution http://www.thenation.com/docprint.mhtml?i=20050516&s=foer by JOSHUA FOER Einstein 1905: The Standard of Greatness by John S. Rigden The Born-Einstein Letters, 1916-1955: Friendship, Politics and Physics in Uncertain Times by Irene Born, trans.; with Introduction by Werner Heisenberg and Foreword by Bertrand Russell [from the May 16, 2005 issue] In his 1902 book Science and Hypothesis, the French mathematician and physicist Henri Poincaré surveyed the landscape of modern physics and found three fundamental conundrums bedeviling his field: the chaotic zigzagging of small particles suspended in liquid, known as Brownian motion; the curious fact that metals emit electrons when exposed to ultraviolet light, known as the photoelectric effect; and science's failure to detect the ether, the invisible medium through which light waves were thought to propagate. In 1904 a 25-year-old Bern patent clerk named Albert Einstein read Poincaré's book. Nothing the young physicist had done with his life until that point foreshadowed the cerebral explosion he was about to unleash. A year later, he had solved all three of Poincaré's problems. "A storm broke loose in my mind," Einstein would later say of 1905, the annus mirabilis, which John S. 
Rigden calls "the most productive six months any scientist ever enjoyed." Between March and September, he published five seminal papers, each of which transformed physics. Three were Nobel Prize material; another, his thesis dissertation, remains one of the most cited scientific papers ever; and the fifth, a three-page afterthought, derived the only mathematical equation you're likely to find on a pair of boxer shorts, E = mc². Rigden's short book Einstein 1905 is a tour through each of those landmark papers, beginning with the only one that Einstein was willing to call "revolutionary." That first paper, which would earn him the Nobel Prize sixteen years later, was titled "On a Heuristic Point of View About the Creation and Conversion of Light." It could just as easily have been called "Why Everything You Think You Know About Light Is Wrong." In 1905 most scientists were certain that light traveled in waves, just like sound. Though it troubled few others, Einstein was deeply perturbed by the notion that energy could flow in continuous waves whereas matter was made up of discrete particles. To paraphrase Bertrand Russell, why should one aspect of the universe be molasses when the other part is sand? When Einstein tried to imagine a universe in which everything, including light, was made up of particles, he realized the simple conceptual shift could explain a lot, including the mysterious photoelectric effect. This was typical of how Einstein thought, argues Rigden. He saw fundamental contradictions in the generalizations that others had made before him and then followed the trail of logic to unexpected conclusions. In some cases it took years before his ideas could be experimentally verified. His theory of light wasn't widely accepted for two decades. The second paper of the year, completed in April, is the least well remembered, even though its many practical applications have made it one of Einstein's most cited works. In that paper, Einstein suggested a way of calculating the size of molecules in a liquid based on measurements of how the liquid behaves. The paper relied on more mathematical brute force and was less graceful than the other four of the year, but it was important nonetheless. Because it showed how to measure the size of otherwise unobservable atoms, it helped nail the coffin shut on the few lingering skeptics, like Ernst Mach, who still did not buy into the atomic theory of matter. Even more damning for those atomic skeptics was Einstein's May paper on Brownian motion, which explained the unpredictable dance of pollen grains in water. The reason for the pollen's erratic behavior, Einstein demonstrated, is that it is being constantly bombarded by water molecules. Most of the time, that bombardment occurs equally from all angles, so the net effect on a grain of pollen is zero. But sometimes, statistical fluctuations conspire so that more molecules are pushing in one direction than another, causing a grain to zip through the water. Even though atoms are invisible, Einstein had figured out a way to see them at work. "A few scientific papers, not many, seem like magic," Rigden writes. "Einstein's May paper is magic." Having dispatched two of Poincaré's conundrums, Einstein next turned his attention to the undetected ether; his June paper ended up being the most earth-shattering of the bunch. It demolished two pillars of Newtonian physics, the notions of absolute space and absolute time. 
In their place, Einstein constructed the special theory of relativity, which held that time appears to stretch and space appears to shrink at velocities approaching the speed of light. The paper had no citations, as if Einstein owed a debt to no one. In fact, that wasn't the case. "Much of his source material was 'in the air' among scientists in 1905," notes Rigden, "and some of these ideas had been published." Physics was on the verge of something big at the turn of the century. It took an Einstein to pull it all together, to ask the big question in the right way. The final paper, published in September, might as well have been an addendum to the June paper. The profoundly simple equation he derived in three pages, E = mc², was a logical consequence of the special theory of relativity. Equating energy and mass, it explained why the sun shines and why Hiroshima was leveled. More than anything else Einstein produced, it has come to symbolize his genius. A half century after his miracle year, in the final sentence of his final letter to his friend and intellectual sparring partner, the physicist Max Born, a dying Albert Einstein wrote, "In the present circumstances, the only profession I would choose would be one where earning a living had nothing to do with the search for knowledge." And so the man whose thought experiments revolutionized science concluded his life posing a thought experiment about himself: Where would we be if Einstein had become a "plumber or peddler," jobs he once rhetorically suggested he'd prefer, instead of a physicist? One place to look to start answering that question is the science itself, which is where Rigden's book begins. Another is the man himself, whose personality is abundantly on display in the letters he exchanged with Born between 1916 and 1955. Those letters, which first appeared in German in 1969 and in English two years later, have now been republished along with Born's commentary, Werner Heisenberg's original introduction and a useful new preface by Diana Buchwald and Kip Thorne. The Einstein that comes through in the letters is self-aware, philosophical, politically conscious (if sometimes naïve), modest, generous, an aesthete and--in his exchanges with Born's wife, Hedi--an occasional flirt. From these epistolary glimpses of Einstein the person it's possible to see how his science, which "seems to be so far removed from all things human," is nonetheless, as Heisenberg writes in his introduction, "fundamentally determined by philosophical and human attitudes." By the time Einstein began corresponding with Born in 1916, his best work was behind him, and he was already an international celebrity. Their letters document the final chapter of Einstein's career, the forty years during which he was an outsider to the quantum physics revolution and alone in his pursuit of a single unified theory capable of explaining all of physics. Ironically, it was at the height of his fame that Einstein was furthest from the scientific mainstream. The aging revolutionary never ceased to be a radical. Like Einstein, Born was an assimilated German Jew who fled the country's rising anti-Semitism in the early 1930s. Many of their letters from that period concern the deteriorating political situation in Europe and attempts to arrange teaching posts for exiled German scientists. But unlike Einstein, who perceived an inveterate savagery at the heart of German culture and never again set foot on German soil, Born was more forgiving. 
After sojourning in Edinburgh during World War II, he returned to Göttingen in 1953. They also differed on their shared Jewish heritage. While Einstein was a moderate Zionist, Born saw no difference between Jewish nationalism and all other embodiments of nationalism that he despised. Their political differences, though, were nowhere near as deep as their scientific disagreements. Einstein considered Born and himself "Antipodean in our scientific expectations." Born was a leading proponent of quantum theory and was awarded the 1954 Nobel Prize for his work establishing the theory's mathematical basis. Einstein was quantum theory's foremost critic. Even though his 1905 paper on the photoelectric effect helped create the field of quantum mechanics, Einstein could never reconcile himself to its nondeterministic implications. He was adamant that the theory provided only a superficial explanation of the universe, and that a deeper theory would someday be found. This conviction was based almost entirely in aesthetic instincts about what the laws of physics ought to look like. "Quantum mechanics is certainly imposing," he famously told Born. "But an inner voice tells me that it is not yet the real thing. The theory says a lot, but does not really bring us any closer to the secret of the 'old one.' I, at any rate, am convinced that He is not playing at dice." Einstein believed that there had to be an "objective reality" at the heart of the universe. If quantum mechanics proved correct, he wrote, again teasing with one of his occupational counterfactuals, "I would rather be a cobbler, or even an employee in a gaming-house, than a physicist." Their quarrel over quantum theory dragged out for more than three decades, but the content of their arguments changed little from the first letters they exchanged on the subject in 1919 right up until Einstein's death. In a 1953 letter Born declares, "I hope to be able to convince you at last that quantum mechanics is complete and as realistic as the facts permit." His attempt to persuade his friend after all those years seems almost comic. He goes on to call Einstein's stubbornness on the subject "quite unbearable." Einstein's letters tend to be half as long as Born's and twice as pithy, and are almost always prefaced with an apology for having not written back sooner. Though Born and Einstein only met in person once, they grew to address each other in the tone of lifelong friends. There's no shortage of tough honesty in the letters. There's even the occasional spat. Several correspondences are consumed by discussion over whether Einstein should grant a journalist permission to publish a book called Conversations With Einstein. Born and his wife were concerned that the author would depict Einstein unflatteringly. "Your own jokes will be smilingly thrown back at you," Hedi Born warns. "This book will constitute your moral death sentence for all but four or five of your friends." Her husband pleads with Einstein, "You do not understand this, in these matters you are a little child." Einstein replied, "The whole affair is a matter of indifference to me, as is all the commotion, and the opinion of each and every human being." Nonetheless, Einstein tried and failed to stop the publication of the book, which even Born later admitted wasn't nearly as bad as he had feared. Einstein's detachment is a persistent theme throughout the letters. 
He tells Born, "I hibernate like a bear in its cave," and in the same letter he off-handedly informs Born of his wife's death, which he describes as just one more thing accentuating his bearish feeling. Einstein's seeming indifference to worldly things leads Born to comment that "for all his kindness, sociability and love of humanity, he was nevertheless totally detached from his environment and the human beings included in it." Ironically, the vague constellation of traits that, according to Rigden, stimulated Einstein's early discoveries may also help explain why he spent the second half of his career as an outsider to the quantum revolution. The same aesthetic instincts that led him to recognize the inelegance of the old theories about light and space may have blinded him to the decidedly unbeautiful reality of quantum mechanics. The same "stubbornness of a mule" that kept him on the trail of the general theory of relativity for a decade may also have kept him on less fruitful paths later in his career. And the same self-confidence that gave the 26-year-old patent clerk the audacity to challenge the central precepts of classical physics may have prevented him from recognizing his own failure of imagination with regard to quantum mechanics. Heisenberg writes in his introduction, "In the course of scientific progress it can happen that a new range of empirical data can be completely understood only when the enormous effort is made to...change the very structure of the thought processes. In the case of quantum mechanics, Einstein was apparently no longer willing to take this step, or perhaps no longer able to do so." But another explanation is possible. Einstein always held that posterity would value his ideas more than his peers did. He was right. Again and again, work that was at first deemed loopy has been vindicated. The quest for a unified theory, once an emblem of Einstein's isolation, has become contemporary physics' Holy Grail. It's possible that Einstein's greatest intellectual gamble, his repudiation of quantum theory, may yet prove as prescient. Indeed, though they are a minority, many highly regarded scientists still harbor the deep discomfort that Einstein felt about quantum theory. In a 1944 letter to Born on the subject, Einstein wrote, "No doubt the day will come when we will see whose instinctive attitude was the correct one." That day may yet be some time off. From checker at panix.com Tue May 3 22:14:52 2005 From: checker at panix.com (Premise Checker) Date: Tue, 3 May 2005 18:14:52 -0400 (EDT) Subject: [Paleopsych] Commentary: Book Review: Jared Diamond: Scorched Earth Collapse Message-ID: Book Review: Jared Diamond: Scorched Earth Collapse: How Societies Choose to Fail or Succeed http://www.commentarymagazine.com/article.asp?aid=11904087_1 Viking. 575 pp. $29.95 Reviewed by Kevin Shapiro When the ancient Greeks happened upon ruins whose origins they could not fathom, they called them "Hebrews' castles"--a nod to the Hebrew Bible as the oldest available source of recorded history. In reality, the sites belonged not to the Hebrews but to earlier Aegean societies like the Myceneans and the Minoans. Regional powers in their day, those societies had disappeared, leaving the Greeks to wonder about their fate. Were they conquered or enslaved, stricken by plague or by famine, by earthquake or by flood? Even today, the desolate places of the world are littered with "Hebrews' castles." 
We gaze in wonder at, among others, the Anasazi pueblos of the American Southwest (Anasazi being the Navajo word for "the ancients"), the monumental statues of Easter Island, and the grand cities of the Maya entombed in the Yucatán jungle. Aided by the tools of modern archaeology, from the analysis of midden heaps and pollen grains to radiocarbon dating and even more sophisticated physical methods, we often are able to know a good deal about the people responsible for these artifacts. In some cases, like that of the long-deserted Viking settlement in Greenland, detailed written records exist alongside the stone shells of churches, barns, and great houses. But none of the records, written or material, speaks directly of the final moments of their authors. We are left, like the Greeks, to puzzle over the reasons these castles were abandoned, what became of their erstwhile inhabitants--and whether a similar fate might one day befall us. This puzzle is Jared Diamond's subject in Collapse: How Societies Choose to Fail or Succeed. In his best-known previous book, Guns, Germs, and Steel (1997), Diamond--a physiologist by training and a professor of geography at UCLA--sought to explain why the peoples of Europe succeeded in outpacing all others in technology and exploration, leaving their mark on the entire modern world. In Collapse, he turns his attention to the opposite extreme: societies that appear to have experienced spectacular crashes. His thesis is that collapse is a consequence of "ecocide"--environmental damage caused by deforestation, intensive agriculture, and the destruction of local flora and fauna. Diamond begins by considering the land and people of Montana, often regarded as one of the few remaining unspoiled corners of the United States. As he tells it, however, Montana is in fact a microcosm of collapse, or at least of major social change, driven by environmental problems. Logging and mining, the traditional pillars of the economy, have declined as the state has become increasingly deforested and polluted. Soil that once supported apple orchards is nearly gone, and so are the glaciers for which Montana is famous. At the same time, a burgeoning population in the Bitterroot Valley has put a strain on the state's water supply and job market. Although environmental damage is a nearly ubiquitous corollary of human activity, what makes certain societies vulnerable to ecocide, Diamond argues, is the combination of particularly fragile ecosystems with particularly destructive land-use practices. Like Montana, societies that have collapsed in the past have been situated in areas marginal for agriculture, with climates unfavorable to farming and tree growth. On both Easter Island and Greenland, for example, trees grow slowly and topsoil is relatively poor; their former inhabitants cut down the available trees without realizing, apparently, that more would not soon grow to replace them. Similarly, the Anasazi of Chaco Canyon (in present-day New Mexico) prospered in wet years by employing innovative methods of irrigation, but they grew so numerous that they could not sustain their population in years of drought. What might such societies have done differently, and how have societies in similar straits managed to survive? The key, Diamond finds in each case, is successful adaptation to the fragility of the local environment. The Inuit of Greenland subsist on fish, whales, and seals, at least some of which are present even in periods of cold. 
The Japanese, who came perilously close to deforestation in the 17th century, instituted a strict system of tree management under the early Tokugawa shoguns, regulating the use of literally every tree on the main islands of Honshu, Kyushu, and Shikoku. Most radical of all were the measures taken by the inhabitants of Tikopia Island in the Pacific, who planted every inch of their land with edible trees and roots, eliminated their pigs, and adopted stringent population controls, including abortion and infanticide. All of these successful societies recognized the dangers they were about to face, Diamond argues, and changed their behavior accordingly. Collapse provides a series of such vignettes, rendered in meticulous detail. Although Diamond admits to having set out with the notion that all collapses are brought about by ecocide alone, he recognized early on that this was never the whole story. The Easter Islanders, for example, may have denuded their homeland of palm trees and depleted its fisheries, but their problems were compounded by the rivalries of competing tribal chieftains, who sought to outdo each other by erecting more and bigger statues, thus consuming enormous quantities of wood and food. The Tikopians, by contrast, have a history of weak chiefs and little internecine competition. Still, despite Diamond's repeated bows to the complex interaction of such other factors as history, political economy, and social structure, it is clear that, to his mind, the overriding cause of social collapse remains ecocide, which he also considers the major threat to the survival of civilization on earth today. Indeed, contemporary tales of social change brought about by ecological damage bracket his discussion of the past and constitute the larger portion of the book. Diamond portrays the genocide in Rwanda in 1994, for example, as a classic Malthusian crisis: too many people, too little food and land. Tensions between Hutus and Tutsis were real, to be sure, and were exploited by Rwandan politicians for their own purposes. But Diamond believes that the scale of the bloodbath can only be explained by the inability of Rwandans to support themselves on small farms. Haiti offers a similar cautionary story, having long been the basket case of the Western hemisphere because of almost total deforestation and agricultural insufficiency. Diamond sees the early stages of ecological collapse in two larger nations as well--where he also finds signs of hope. China faces problems of soil erosion, desertification, urbanization, and rapid industrialization, all of which contribute to rapid ecological destruction and the overuse of resources. But the Chinese have taken steps to preserve their remaining forests and to limit their population. In Australia, the government is rethinking its historic support for industries like sheep-herding and wheat cultivation--both poorly suited to Australia's ecosystem--and is embarking on projects to restore the continent's native flora and manage its scarce water supplies. Such reforms suggest to Diamond that some of us may yet be able to save ourselves from the fate of the Easter Islanders. Collapse is not light reading. Each of Diamond's vignettes is laden with facts and statistics--in one paragraph, for example, he lists the common and scientific names of fourteen different plants harvested by the Tikopians, followed by a description of the agronomy of Tikopian swamps. Such dry fare is not made easier to digest by the prose style, which tends to be ponderous and repetitive. 
All the same, Collapse is an impressively researched and keenly argued book. The fatal weakness of Collapse is Diamond's constant overreaching. In trying to apply the lessons of the past to the present, he wanders down several paths with no obvious connection to the main point of the book. His opening section on Montana, for instance, is interesting in its own right, but the problems faced by Montanans are similar only superficially to the problems that were faced by Easter Islanders or Mayans. The fact that Montana's traditional industries have turned out to be unprofitable and impractical is perhaps unfortunate for long-time residents, but it hardly seems to qualify as a historic catastrophe. Other applications of Diamond's thesis to the modern world are even more far-fetched. He repeatedly alludes to the dangers of globalization, suggesting that it makes the entire planet more vulnerable to the collapse of a single nation. This might be a reasonable conjecture in the short term, but in the long term it seems more likely that globalization would act to insulate the world from such collapse, since resources that formerly would have had to be provided by a single country could now eventually be supplied by another. The unstated premise of Collapse seems to be that the entire planet is headed for a Malthusian crisis, which can be staved off only by extreme measures like China's one-child policy. But is this view defensible? Diamond takes no note of the extraordinary increases in food production achieved in recent decades; nor does he consider the likelihood that crises in places like Rwanda owe more to poor land management than to a shortage of farmland as such. As for the population controls Diamond seems to endorse, he says nothing about their unhappy practical consequences--including the sort of intensive urban development he decries--let alone their questionability on moral grounds. Indeed, as Collapse progresses, Diamond's arguments grow increasingly one-sided. The entire final chapter is a discursive screed on the need for urgent environmental action, accompanied by a series of rather unconvincing potshots at those who are skeptical of such measures. To his credit, Diamond takes pains to avoid what he rightly calls "environmental determinism." But although he recognizes the role played by social and cultural factors, he does not seem to appreciate how such recognition serves again and again to undermine his emphasis on ecocide, making his thesis seem arbitrary, if not ideologically motivated. For many of the societies Diamond discusses, it is not even clear that environmental damage was a major determinant of collapse. Thus, the Vikings certainly were not helped in the long run by the rapid deforestation of Greenland, or by their concerted effort to maintain cattle in the face of unfavorable local conditions. But they were probably hurt even more by their rigid customs, which apparently included the avoidance of fish (as Diamond reports, fish bones are almost never found in their middens). Diamond himself notes that the inhabitants of Iceland, who faced similar problems, managed to survive by switching from an agrarian economy to one based on the production and export of salted cod. Their brethren on Greenland might have survived by similar means, if only they had eaten fish. A prominent theme of Collapse, but one which Diamond almost completely ignores, is that societies tend to do best if their decision-making is open and democratic. 
Many societies that failed--like Easter Island and the Mayan empire--were ruled by elites more concerned with self-aggrandizement than with the stewardship of natural resources for the common good. The Vikings maintained economically ruinous subsidies for cattle farms to serve the needs of rich landlords and foreign-born bishops. Societies that succeeded, by contrast, were often governed by some form of representative democracy. To this day Iceland has the world's oldest legislative body, and the Tikopian "government" (if one can call it that) resembles a condo association. Diamond cites these as examples of "bottom-up" management, but he also praises "top-down" successes, like Tokugawa Japan. Centralized rule, however, has been responsible for many of the worst ecological disasters of modern times, as in the industrial wastelands of the former Soviet Union and Eastern bloc. Judging from Diamond's examples of successful bottom-up societies, and of corporations that have found it in their financial interest to adopt ecologically friendly policies, our best course of action is the one exemplified by the state of Montana: governance at the local level based on democratic values and economic realities. It is, in other words, the course we are already following. Kevin Shapiro is a research fellow in neuroscience and a student at Harvard Medical School. From checker at panix.com Tue May 3 22:15:06 2005 From: checker at panix.com (Premise Checker) Date: Tue, 3 May 2005 18:15:06 -0400 (EDT) Subject: [Paleopsych] TLS: (Tom Friedman) Confusing Columbus Message-ID: Confusing Columbus http://www.economist.com/books/displayStory.cfm?story_id=3809512 5.3.31 The World Is Flat By Thomas L. Friedman. Farrar, Straus and Giroux; 488 pages; $27.50. Penguin/Allen Lane; GBP20 THE term "populariser" is often used to sneer at writers who manage to reach a wide audience by those who don't. But not all popularisers are guilty of sensationalising or over-simplifying serious topics. There is a sense in which everyone in modern societies, even the most earnest or intellectually gifted, relies on the popularisation of ideas or information, if that term is understood to mean the making of complex issues comprehensible to the non-specialist. Achieving this is admirable. In the field of international affairs one of America's most prominent popularisers is Thomas Friedman, the leading columnist on the subject for the NEW YORK TIMES. Mr Friedman constantly travels the world, interviewing just about everyone who matters. He has won three Pulitzer prizes. If anyone should be able to explain the many complicated political, economic and social issues connected to the phenomenon of globalisation, it should be him. What a surprise, then, that his latest book is such a dreary failure. Mr Friedman's book is subtitled "A Brief History of the Twenty-First Century", but it is not brief, it is not any recognisable form of history--except perhaps of Mr Friedman's own wanderings around the world--and the reference to our new, baby century is just gratuitous. Even according to Mr Friedman's own account, the world has been globalising since 1492. This kind of imprecision--less kind readers might even use the word "sloppiness"--permeates Mr Friedman's book. It begins with an account of Christopher Columbus, who sets out to find India only to run into the Americas. Mr Friedman claims that this proved Columbus's thesis that the world is round. It did nothing of the kind. 
Proof that the world is round came only in 1522, when the sole surviving ship from Ferdinand Magellan's little fleet returned to Spain. Undaunted by this fact, Mr Friedman portrays himself as a modern-day Columbus. Like the Italian sailor, he also makes a startling discovery--this time on a trip to India--though it turns out to be just the opposite of Columbus's. An entrepreneur in Bangalore tells him that "the playing field is being levelled" between competitors there and in America by communications technology. The phrase haunts Mr Friedman. He chews it over, and over, and over. And then it comes to him: "My God, he's telling me the world is flat!" Of course, the entrepreneur, even by Mr Friedman's own account, said nothing of the kind. But Mr Friedman has discovered his metaphor for globalisation, and now nothing will stop him. He shows his readers no mercy, proceeding to flog this inaccurate and empty image to death over hundreds of pages. In his effort to prove that the world is flat (he means "smaller"), Mr Friedman talks to many people and he quotes at length lots of articles by other writers, as well as e-mails, official reports, advertising jingles, speeches and statistics. His book contains a mass of information. Some of it is relevant to globalisation. Like many journalists, he is an inveterate name-dropper, but he does also manage to interview some interesting and knowledgeable people. Mr Friedman's problem is not a lack of detail. It is that he has so little to say. Over and over again he makes the same few familiar points: the world is getting smaller, this process seems inexorable, many things are changing, and we should not fear this. Rarely has so much information been collected to so little effect. A number of truly enlightening books have been published recently which not only support globalisation, but answer its critics and explain its complexities to the general reader--most notably Jagdish Bhagwati's "In Defence of Globalisation" and Martin Wolf's "Why Globalisation Works". Because of Mr Friedman's fame as a columnist, his book will probably far outsell both of these. That is a shame. Anyone tempted to buy "The World is Flat" should hold back, and purchase instead Mr Bhagwati's book or Mr Wolf's. From checker at panix.com Tue May 3 22:15:24 2005 From: checker at panix.com (Premise Checker) Date: Tue, 3 May 2005 18:15:24 -0400 (EDT) Subject: [Paleopsych] Salon: Don't kill your television Message-ID: Don't kill your television http://www.salon.com/books/review/2005/05/01/johnson/print.html 5.5.1 Far from making us stupid, violent and lazy, TV and video games are as good for us as spinach, says an engaging new book by Steven Johnson. By Farhad Manjoo Pop culture, like fast food, gets a bad rap. It's perfectly understandable: Because we consume so much of the stuff -- we watch so much TV, pack away so many fries -- and because the consumption is so intimate, it's natural to look to our indulgence as the cause of all that ails us. Let's face it, we Americans are fat and lazy and simple-minded; we yell a lot and we've got short attention spans and we're violent and promiscuous and godless; and when we're not putting horndogs into office we're electing dumb guys who start too many wars and can't balance the budget and ... you know what I mean? You are what you eat. The output follows from the input. 
When you look around and all you see is Ronald McDonald and Ryan Seacrest, it seems natural to conclude that junk food and junk culture are responsible for a large chunk of the mess we're in. The other day, though, in an unbelievably delicious turn of events, the government reported that people who are overweight face a lower risk of death than folks who are thin. While the news didn't exactly exonerate junk food, it was a fitting prelude to the publication of Steven Johnson's new polemic "Everything Bad Is Good for You," which argues that what we think of as junk food for the mind -- video games, TV shows, movies and much of what one finds online -- is not actually junk at all. In this intriguing volume, Johnson marshals the findings of brain scientists and psychologists to examine the culture in which we swim, and he shows that contrary to what many of us assume, mass media is becoming more sophisticated all the time. The media, he says, shouldn't be fingered as the source of all our problems. Ryan Seacrest is no villain. Instead, TV, DVDs, video games and computers are making us smarter every day. "For decades, we've worked under the assumption that mass culture follows a steadily declining path towards lowest-common-denominator standards," Johnson writes. "But in fact, the exact opposite is happening: the culture is getting more intellectually demanding, not less." Johnson labels the trend "the Sleeper Curve," after the 1973 Woody Allen film that jokes that in the future, more advanced societies will come to understand the nutritional benefits of deep fat, cream pies and hot fudge. Indeed, at first, Johnson's argument does sound as shocking as if your doctor had advised you to eat more donuts and, for God's sake, to try and stay away from spinach. But Johnson is a forceful writer, and he makes a good case; his book is an elegant work of argumentation, the kind in which the author anticipates your silent challenges to his ideas and hospitably tucks you in, quickly bringing you around to his side. In making his case for pop culture, Johnson, who was a co-founder of the pioneering (and now-defunct) Web journal Feed, draws on research from his last book, "Mind Wide Open," which probed the mysteries of how our brains function. Johnson's primary method of analyzing media involves a concept he calls "cognitive labor." Instead of judging the value of a certain book, video game or movie by looking at its content -- at the snappy dialogue, or the cool graphics, or the objectives of the game -- Johnson says that we should instead examine "the kind of thinking you have to do to make sense of a cultural experience." Probed this way, the virtues of today's video games and TV shows become readily apparent, and the fact that people aren't reading long-form literature as much as they used to looks less than dire. "By almost all standards we use to measure reading's cognitive benefits -- attention, memory, following threads, and so on -- the non-literary popular culture has been steadily growing more challenging over the past thirty years," Johnson says. Moreover, non-literary media like video games, TV and the movies are also "honing different mental skills that are just as important as the ones exercised by reading books." Johnson adds that he's not offering a mere hypothesis for how video games and TV shows may affect our brains -- there's proof, he says, that society is getting smarter due to the media it consumes. 
In most developed countries, including the United States, IQs have been rising over the past half-century, a statistic that of course stands in stark contrast to the caricature of modern American idiocy. Johnson attributes intelligence gains to the increasing sophistication of our media, and writes that, in particular, mass media is helping us -- especially children -- learn how to deal with complex technical systems. Kids today, he points out, often master electronic devices in ways that their parents can't comprehend. They do this because their brains have been trained to understand complexity through video games and through TV; mass media, he says, prepares children for the increased difficulty that tomorrow's world will surely offer, and it does so in a way that reading a book simply cannot do. Still, at times Johnson protests too much, setting up what look like straw men defenders of old media so that he can expound on the greatness of the new. It's true that many oldsters continue to say a lot of silly things about the current media environment. Johnson quotes Steve Allen, George Will, the "Dr. Spock" child-care books and the Parents Television Council, all of whom think of modern media in the way former FCC chairman Newton Minow famously described the television landscape of the early 1960s -- as a "vast wasteland." (For good measure, Johnson could also have taken a stab at opportunistic politicians like Jennifer Granholm, the Democratic governor of Michigan, who's trying to pass a state ban on the sale of violent video games to minors, or misguided liberals like Kalle Lasn, who wants vigilantes to shut off your TV.) Yet, I suspect that most of Johnson's audience probably already gets it. I was tickled by much of what Johnson illustrates about how video games and TV affect your brain, and some of it surprised me, but I wasn't really skeptical in the first place. Most people my age -- kids who grew up at the altar of Nintendo and "Seinfeld" -- probably feel the same way. And this is to Johnson's credit: To young people, his take on media feels intuitively right. It's clear what he means when he says TV makes you think, and that video games require your brain. Indeed, if you've ever played a video game, Johnson pretty much has you at hello. That reading books is good for children is the most treasured notion in society's cabinet of received child-rearing wisdom, Johnson notes. Yet it's a pretty well established fact that kids today don't read as much as kids of yesterday -- at least, they're not reading books. (Few studies, Johnson points out, have taken note of the explosion of reading prompted by electronic media like the Web.) What are these children doing? They're playing video games. And other than praising games for building a kid's "hand-eye coordination," video games are, say child experts like Dr. Spock, a "colossal waste of time," leading us down the path to hell. What's best about Johnson's section arguing that video games are just as good for you as books are is his tone: He's breezy and funny, and for a while you forget that he's proposing the kind of idea that in earlier times may have ended with a sip of hemlock. As I say, I think most people will be with him from the start: Video games are better than we think? Sure, I'll buy that. But one still feels itchy under the collar when he starts comparing something as sacred as the bound book to the sacrilege that is "Grand Theft Auto." 
And when, in a short, satirical passage, he points out all the shortcomings of books in the same unfair way most people describe the shortcomings of video games, I'm sure he drives more than a few readers to go out in search of some hemlock. A sample: "Perhaps the most dangerous property of ... books is that they follow a fixed linear path. You can't control their narratives in any fashion -- you simply sit back and have the story dictated to you ... This risks instilling a general passivity in our children, making them feel as if they're powerless to change their circumstances. Reading is not an active, participatory process; it's a submissive one." Of course, Johnson makes clear, he loves books (they provide, for starters, his livelihood). Still, his criticism of books' lack of interactivity -- even if it's offered as a purposefully specious point -- is valid. Books may promote a wide range of mental exercises, and a certain book may send your mind skittering in a dozen euphoric directions, but there are things that a book simply will not, cannot, do. Books don't let you explore beyond the narrative. Their scenery is set, and what's there is all that's there. You may have liked to have visited some of Gatsby's neighbors, but you can't. Books also don't ask you to make decisions, and in a larger sense don't require you to participate. You sit back and watch a book unfold before you. The book's possibilities are limited; what will happen is what's written on the next page. Read it a thousand times, still Rabbit always runs. So this should be plain: Because they're interactive, video games promote certain mental functions that books do not. Specifically, video games exercise your brain's capacity to understand complex situations. That's because in most video games, the rules, and sometimes the objectives, aren't explicit. You fall into the sleazy urban landscape of "Grand Theft Auto" with no real idea of what you're supposed to do. Indeed, Johnson points out, much of the action in playing any video game is finding out how to play the game -- determining how your character moves, seeing which weapons do what, testing the physics of the place. If you fall from a building, does your character get hurt? What happens if you open this door? What kind of strategy can you plan to beat the monster on Level 3? The kind of probing gamers employ to determine what's going on in such simulated worlds, Johnson says, is very similar to the kind of probing scientists use to understand the natural world. Kids playing video games, in other words, are "learning the basic procedure of the scientific method." Because TV is more fun, Johnson's section on television is more engaging than his examination of video games, but its revelations also feel a bit more obvious. His main point -- you can see an extended version of it in this New York Times Magazine excerpt -- is that most modern TV shows exercise your brain in ways that old TV shows never dared. Today's shows, whether dramas or comedies, are multithreaded -- several subplots occur at the same time, and in the best shows (like "The Sopranos" or "The West Wing") the subplots often run into each other (there is one popular exception: "Law & Order."). Modern shows -- including, of course, reality shows -- also feature many more characters; only a handful of regulars graced "Dallas" every week, but there are dozens of people in "24." 
Today's TV shows are also far more willing to keep the viewer in the dark about what's going on in a certain scene, or to include allusions to other art forms, or previous years' episodes. Medical jargon has been written into just about every scene on "ER" specifically to keep you on your toes about what's happening. "Nearly every extended sequence in 'Seinfeld' or 'The Simpsons' ... will contain a joke that only makes sense if the viewer fills in supplementary information -- information that is deliberately withheld from the viewer," Johnson writes. "If you've never seen the 'Mulva' episode, or the name 'Art Vandelay' means nothing to you, then the subsequent references -- many of them occurring years after their original appearance -- will pass on by unappreciated." What all this amounts to, Johnson says, is work for your brain. Watching TV is not a passive exercise. When you're watching one of today's popular shows, even something as nominally silly as "Desperate Housewives," you're exercising your brain -- you're learning how to make sense of a complex narrative, you're learning how to navigate social networks, you're learning (through reality TV) about the intricacies of social intelligence, and a great deal more. What I wonder, though, is, Doesn't everyone know that today's TV is better than yesterday's TV? It's here that I think Johnson's too focused on straw men. Like most Americans, I've spent enough time watching television to have earned several advanced degrees in the subject. Yes, TV today is clogged with more sex and violence than TV of yesterday, but for all that, is there anyone in America who doesn't believe that on average, what we've seen on TV in the last decade has been more intricate, more complex and just plain smarter than the shows of the 1980s or the 1970s? Of course, there are exceptions; everyone can think of a great show from the 1970s that beats a middling show of today. ("The Jeffersons" kicks "According to Jim's" ass.) But I'm talking about apples-to-apples comparisons: Is there anyone who prefers "Hill Street Blues," which as Johnson points out was one of the best dramas of the 1980s, to "The West Wing" or "ER" or "The Sopranos"? I imagine only the very nostalgic would say they do. In the same way, I don't know how anyone couldn't see that "Seinfeld" is smarter than "Cheers," or that "Survivor" is more arresting than "Family Feud," or that "American Idol" clobbers "Star Search." When I say that the new shows are better, I mean in the same ways that Johnson argues -- not based on content, but on brain work. Today's shows tease your brain in ways that the old shows do not, and you are aware of the difference. We may not have plotted out the shows' mechanism as well as Johnson has -- we can't say precisely why "ER" is completely different from "St. Elsewhere" -- but to me, at least, the difference is clear enough that Johnson's Sleeper Curve is unsurprising. As I see it, then, the most interesting question about Johnson's theory is not whether it's accurate. It's why it's happening -- why is media getting smarter, and why are we flocking to media that actually makes us smarter? Johnson examines the question at some length, and he fingers two usual suspects: technology (the VCR, TiVo, DVDs, ever more powerful game systems) and economics (the increasing importance of the syndication market). But I like the third part of his answer best -- our media's getting smarter, he says, because the brain craves intelligent programming. 
The dynamic is that of a feedback loop: Today's media is smarter because yesterday's media made us smart to begin with. "Dragnet" prepares you for "Starsky and Hutch," which prepares you for "Hill Street Blues," which begets "ER," "The West Wing" and "The Sopranos." If we'd seen "The West Wing" in the 1980s, we wouldn't have known what to do with it. Indeed, many people didn't know what to do with "Hill Street Blues" when it debuted, in the same way that all path-breaking media confound viewers at first. Few people understood the early years of "Seinfeld," and, today, only a small crew can appreciate the genius of "Arrested Development." The amazing thing -- and the most hopeful thing in Johnson's book, and about culture in general -- is that the mind challenges itself to understand what's just out of its reach. After three years of watching "Seinfeld" the nation more or less collectively began to understand the thing. In no time, then, the show lodged itself into the cultural landscape. No longer, after that, could you remark on someone's sexuality without adding, "Not that there's anything wrong with that." And, whatever else you may have heard, this tells us, once and for all, that we are not stupid. Farhad Manjoo is a Salon staff writer. From checker at panix.com Tue May 3 22:15:41 2005 From: checker at panix.com (Premise Checker) Date: Tue, 3 May 2005 18:15:41 -0400 (EDT) Subject: [Paleopsych] Wilson Q.: Review of Robert Fogel, The Escape from Hunger and Premature Death, 1700-2100 Message-ID: Review of The Escape from Hunger and Premature Death, 1700-2100 The Wilson Quarterly, Wntr 2005 v29 i1 p119(2) The Escape from Hunger and Premature Death, 1700-2100: Europe, America, and the Third World. (Book Review) Robert J. Samuelson. THE ESCAPE FROM HUNGER AND PREMATURE DEATH, 1700-2100: Europe, America, and the Third World. By Robert William Fogel. Cambridge Univ. Press. 191 pp. $70 (hardcover), $23.99 (paper) From our present perch of affluence, we forget the abject misery, malnutrition, and starvation that most people endured for most of recorded history. In a fact-filled book geared toward scholars, Nobel Prize-winning economist Robert Fogel of the University of Chicago reminds us of the huge strides in conquering widespread hunger and of the immense economic and social consequences of that achievement. It may shock modern readers to learn how poorly fed and sickly most people were until 100 or 150 years ago, even in advanced countries. In 1750, life expectancy at birth was 37 years in Britain and 26 in France. Even by 1900, life expectancy was only 48 in Britain and 46 in France. With more fertile land, the United States fared slightly better, with a life expectancy that was greater than Britain's in 1750 (51) but identical to it in 1900 (48). Urbanization and industrialization in the 19th century actually led to setbacks. As Americans moved from place to place, they spread "cholera, typhoid, typhus ... and other major killer diseases," Fogel writes. Urban slums abetted sickness and poor nutrition. Fogel questions whether rising real wages in much of the 19th century signaled genuine advances in well-being. "Is it plausible," he asks, "that the overall standard of living of workers was improving if their nutritional status and life expectancy were declining?" By contrast, life expectancy in advanced countries is now in the high 70s (77 in the United States). 
Compared with those of the early 1700s, diets are 50 percent higher in calories in Britain and more than 100 percent higher in France. Summarizing his and others' research, Fogel calls this transformation "technophysio evolution." It has had enormous side effects. First, we've gotten taller. A typical American man in his 30s now stands 5 feet 10 inches, almost five inches taller than his English counterpart in 1750. (Societies offset food scarcities in part by producing shorter people, who need less food.) Second, we've gotten healthier. Although Fogel concedes that advances in public health (better water and sewage systems, for instance) and medicine (vaccines, antibiotics) have paid huge dividends, he argues that much of the gain in life expectancy stems from better nutrition. With better diets, people become more resistant to disease--their immune systems work better and their body tissue is stronger--and they have healthier babies. Finally, better diets have made economic growth possible. An overlooked cause of the meager growth before 1800, Fogel argues, is that many people were too weak to work. In the late 1700s, a fifth of the populations of England and France were "effectively excluded from the labor force." As people ate better and lived longer, they worked harder. Fogel attributes 30 percent of Britain's economic growth since 1790 to better diets. This conclusion seems glib. After all, better diets came from technology that enabled more productive agriculture--better cultivation techniques, better seeds, more specialization. What, specifically, were these advances? Fogel doesn't say. His overwhelming focus on scholarly research on diets also makes his comments on the Third World an elaboration of the obvious (in effect: lots of people are still hungry), with little in the way of recommendations for what could be done. Fogel is always illuminating and, in his omissions, often frustrating. From checker at panix.com Tue May 3 22:15:53 2005 From: checker at panix.com (Premise Checker) Date: Tue, 3 May 2005 18:15:53 -0400 (EDT) Subject: [Paleopsych] More-Than-Humanism: Ramez Naam In Conversation with R.U. Sirius Message-ID: More-Than-Humanism: Ramez Naam In Conversation with R.U. Sirius http://www.life-enhancement.com/neofiles/neofile_print.asp?id=61 5.4.25 Ramez Naam's recently published book, More Than Human: Embracing the Promise of Biological Enhancement, provides a well-researched, detail-oriented argument in favor of embracing technological advances that will likely increase our lifespans, our intelligence, allow us greater control over our mind states, and allow us to communicate brain-to-brain. The book is so cogent, even arch-technophobe Bill McKibben offered these words of praise: "Ramez Naam provides a reliable and informed cook's tour of the world we might choose if we decide that we should fast-forward evolution. I disagree with virtually all his enthusiasms, but I think he has made his case cogently and well." I chatted with Ramez by email and, in a (literally) fevered state, ribbed him a bit about his seemingly unqualified optimism. NEOFILES: Describe your personal evolution. How did you come to realize that you were in favor of more-than-humanism? Also, do you call yourself a transhumanist and what are some of your thoughts on that movement or memeplex? RAMEZ NAAM: I'm a geek. I've always been a geek. Growing up I loved science fiction. 
In particular I loved stories that showed characters that were more than human in some way: super powers, super brains, immortality, etc. ... I also love science. I subscribe to scientific journals for the fun of it. I'll happily while away an evening reading Science or Nature and just soaking up all this incredible research that's going on. One day in 1999 I saw a paper that looked more like science fiction than science: a group led by [3]John Chapin and Miguel Nicolelis had put electrodes into the brain of a live rat, and through that, had given the rat control over a robot arm. I was floored. I had thought that sort of integration of brains and computers was the realm of science fiction, not science fact. That was my first conscious realization that science was starting to sound a lot like science fiction. I suppose I've always been in favor of humans enhancing themselves; I just didn't consider it a real possibility until that time. Does that make me a transhumanist? Yes, it does. But I think the term itself is a bit empty. In my mind, most of the citizens of the western world are transhumanists. Every woman who uses a birth control pill is altering her biology in fundamental ways to get the result she wants. Every person wearing glasses or contact lenses, everyone who puts a cell phone to their ear, everyone who pops a multi-vitamin, or drinks a cup of coffee to wake up in the morning or stay awake on a long drive: they're all transhumanists. We are, as a rule, interested in products and technologies that expand our capabilities, that give us control over our world and minds and bodies. The only thing that separates self-identified transhumanists from the rest of consumer society is their enthusiasm about technologies that are still speculative. The self-identified folks are the enthusiasts and early adopters. The rest of society will follow when the benefits are more concrete, the safety's there, and the price is right. NF: There have been a number of books written around the themes of life extension and transhumanism over the last few years. You seem to have organized a lot of details on activity and research in a lot of different areas. Would you describe the three developments or projects that you find the most exciting? RN: Only three? There are so many! But if I have to ... First, I'd say, is our growing power to alter our minds chemically and genetically. Back in 1999, Joe Tsien from Princeton made the cover of Time magazine with his [4]Doogie mice. These were mice he'd genetically engineered to have a slightly different structure of a particular chemical receptor in the brain. The gene he tweaked is called NR2B. Basically he gave them an extra copy of NR2B, which meant that their neurons were more sensitive to certain signals involved in learning, and would learn things more quickly. They were testing this as a potential technique against Alzheimer's disease or other age-related memory loss, thinking that doing this in older people could keep their memories from decaying. But what they found is that these mice didn't just stay mentally sharp to a greater age. They actually ended up smarter, or at least able to learn more quickly, than normal mice. And it was by a large margin. In some tests, like navigating a maze, the Doogie mice would learn the task in half the time it took the normal mice. Since '99, a lot of other teams have produced similar results, including one led by [5]Eric Kandel, who won the Nobel Prize in Medicine in 2000 for his work on memory.
At least half a dozen companies, [6]including Kandel's, are trying to bring this technology to market now, in the form of a pill that you can swallow. They'll target use in people with memory problems, but you can pretty well assume that there will be off-label use among people just looking to improve their memories. Second, I'd say, is the tremendous progress being made in extending lifespans. Until about 1990, if you asked geneticists about altering aging, they'd tell you that you'd have to make thousands of genetic changes to see any extension of a creature's lifespan. Then in '90, a professor at the University of Colorado, [7]Tom Johnson, published a paper where he showed that by tweaking one gene, he could double the life span of a species of worm. Well, the scientific community did not take well to that. Johnson was called a charlatan and people whispered that he was faking his data. Then a few years later, a more famous researcher named Cynthia Kenyon discovered a second gene that had a similar effect in the same species. Since then, researchers have found dozens of these genes that can slow aging, in everything from yeast to worms to fruit flies to mice. The amazing thing is that pretty much the same genes have the same effect in all these species. You and I are much more genetically similar to mice than mice are to yeasts. And since the same genes can extend life in both mice and yeast, there's a good chance they'll work for humans as well. Third, I'll say, is the whole field of brain-computer interfaces. Today there are human trials going on of brain implants that allow paralyzed people to control computers or robot arms just by thinking about it. There are blind patients receiving retinal implants and visual cortex implants that take signals from a video camera and feed them into the brain, and the patients can actually see out of these things. [8]DARPA has invested tens of millions of dollars in this field. They're after technologies that can let fighter pilots control their planes by thinking about it, that can let commanders on the battlefield beam 3D maps into the minds of their soldiers: real cyberpunk sort of stuff. To me that's incredibly exciting because these are actually communication scenarios. They're ways to get information in and out of our brains or from one person's brain to another more quickly or efficiently or with greater clarity. What if, instead of using a drawing program, I could just hold an image in my mind and beam it to you? What if we could hold that in a shared mental space, on a computer perhaps, and work on it together? What if you could record the feeling of writing a book, or seeing a fantastic band, or having an incredible erotic experience, and let people play it back for themselves? Some of the most revolutionary technologies have been communication technologies: the printing press, radio, the internet. Brain-computer interfaces just might be the ultimate communication tech. NF: You focus quite a bit on population and the tendency of technology to "trickle down." I thought your analysis was pretty on target. Can you give our readers a brief synopsis of your view of why post-humanity will be more distributed and less likely to create population problems than many people suspect? RN: Sure. I think the socioeconomic issues are quite important, which is why I spend two whole chapters on them in the book. There are really two specific questions that come up frequently: "Who will be able to afford these technologies?"
and "Won't the population explode if we lengthen human life?" On the population question, it turns out that the major driver of population growth is really fertility rather than the death rate. If you look around the world, the countries with the longest life expectancies Japan, Sweden are actually shrinking in population. As these countries have gotten rich, people particularly women have decided that they want fewer children. On the other hand, the countries that are rapidly grown Indonesia, Nigeria, Pakistan have relatively low life expectancies. People die early there, but those who survive have big families. On the other hand, over the next 50 years, the UN projects that 3.7 billion people are going to die on this planet, while another 6.6 billion will be born. That'll take global population to about 9 billion people. Of the 3.7 billion who are projected to die in in the next 50 years, less than 2 billion of them will die of age-related causes. So even if we cured aging completely tomorrow, and magically delivered the cure to the entire world, the largest possible impact would be about 2 billion lives over 50 years. That would increase global population in 2050 from about 9 billion to about 11 billion a big change, but not as radical as the more than doubling that happened between 1950 and 2000. In any case, aging isn't going to be cured tomorrow. I walk through some calculations that if you could raise global life expectancy to 120 years by 2050 almost twice what it is today you would raise the 2050 population from the current projection of 8.9 billion people to 9.4 billion people. That's a good sized increase, but as a percentage of population, it's actually smaller than the change that occurred between 1970 and 1973. The takeaway, for me, is that life extension isn't going to have any radical effect on population in the next few decades. The question of economic access is a little more complex. People do worry that when these enhancement technologies come out, only the rich will have access to them. And they're right at the very beginning, only the rich will be able to afford some of these techniques. What it helps to realize, though, is that most of these enhancement techniques are really information goods. They cost a huge amount to develop, but almost nothing to manufacture. The same thing is true in general of pharmaceuticals today. Viagra costs about $15 per pill, but only a few cents of that is production cost. Mostly it's Pfizer bringing in profit or paying off the $1 billion price tag of developing a new drug. Pfizer can charge that much because the drug is patented. By law, no one else can manufacture it without Pfizer's consent. But in 2012, the patent expires. At that point, any generic manufacturer can make the drug. The more suppliers you have, the more price competition sets in. The more consumers you have, the more incentive there is for suppliers to enter the market. The net effect is that, the more desired any information good is, the cheaper it will be to acquire. You can see this when you look at drugs that are commonly used today. Penicillin was absolutely priceless when first introduced to the market. But now it costs less than one cent per dose. The same inverted supply and demand even applies to non-drug techniques. LASIK cost $5,000 per eye when it first came out now you can get it for $299. As more and more people wanted LASIK, more doctors started offering it. 
And the more doctors there are offering it, the more they have to compete with each other on price. The absolute worst thing you can do if you want these technologies equally available to poor and rich is to ban them. Prohibition would create a black market with worse safety, higher prices, and no scientific tracking of what's going on. Viagra and cocaine cost roughly the same per gram at the moment. In a decade, Viagra will be much cheaper but cocaine will be the same price it is now. I think we'd rather our enhancements follow prescription drug economics than illegal drug economics. And even if governments could implement perfect bans, that wouldn't stop people from using these technologies. Asia is much more receptive to biotech than the US and Europe. If a rich couple can't get the genetic treatments they want here, they can absolutely fly to Singapore or Thailand and have it done there. The poor or middle-class couple doesn't have the same options. If anything, where I'd like to see government intervene is in the opposite direction: investing in those who can't afford these technologies themselves. We already spend a large amount of money enhancing our children. We have free grade schools and high schools, free vaccinations for poor children, guaranteed student loans. And those things pay dividends. Every 1% decrease in health care costs saves the country $10 billion a year. Every 1% increase in productivity makes the country $100 billion richer in a year, or $1 trillion richer over a decade. That money comes from innovation: architects designing better buildings, engineers making better cars, coders putting out better software, scientists inventing entirely new things we haven't conceived of. And that's why we invest in things like education, because we know they pay dividends later on. Biotech enhancements have the same potential. Maybe someday we'll have government personal enhancement loans and scholarships. I can dream. NF: You mention that Asian countries have less of a prejudice against genetic manipulation than Americans do. On the downside, could that be because they have less of a sense of individual autonomy? In other words, a human born with her germ-line engineered to produce certain qualities has no choice in the matter. Do you see a line between biological manipulation and personal autonomy? RN: Of all the ethical issues I talk about in the book (equality, safety, "playing god," and so on), the issues of parents and children are the hardest for me. Most parents really want what's best for their kids, and parents are also generally pretty cautious; they look for safety above all else. So I think for the most part, parents will make OK choices. And those choices are in many ways similar to choices they have to make today: how to raise their child, what school to send her to, whether to let her watch TV, and what and how much. Parents make a huge number of choices today, and by and large we trust them to do so. Even so, it makes me uncomfortable if I think about, for example, highly religious parents genetically engineering their own children to be more religious. There's an aspect of parents wanting to control the behavior of their kids that is very tough to deal with. The good news is that it's likely to be extremely hard to do that. Most personality traits are between one half and one third correlated with a person's genes. And the remainder is typically what geneticists call "non-shared" environmental effects.
The "non-shared" part means that it's not shared by two children growing up in the same household. It's something very unique about what happens to them when growing up, rather than something that parents can control environmentally. So even if you attempt to make your child more religious, for instance, an awful lot of how she turns out is going to depend on chance. You can make it more likely that she'll behave a certain way, but you will also runs the risk of overshooting. If you try to produce a kid who's more assertive, you might end up with an aggressive monster. If you try to alter your child to be more polite, you might get a doormat. Put these points together, and it's going to be hard for parents to really control the personality of their children. Individuality is going to be around for a long time. And a lot of these kids, by the time they're 20, are going to find that there are effective personality alterations they can make to themselves that are much more effective than those that were available to their parents before they were born. So whatever starting personality you may try to instill in a child, they're going to grow up to find a world full of options for altering themselves. NF: You focus on the future brought about by biological enhancement. Do you think that evolutions in [9]nanotechnology might alter this picture? If not, why not? If so, how? RN: Nanotech is such a big word it means a lot of things to a lot of people. The kind people most frequently bring up in this context is the model of tiny nano-robots that can precisely re-organize matter on a molecular scale. Given that technology, we would be able to augment human abilities in amazing ways making far bigger changes than the ones I write about in the book. But I suspect general purpose nanotech of that sort is still a long ways off. There are lots of questions no one has answered to my satisfaction about how you build these devices, let alone how to program and control them. I don't mean to be downbeat. There are great thinking going on in the field, but I suspect that what we'll really see in the next few decades are more narrow applications of nanotech in areas like chip design, sensors, materials, and relevant to human enhancement areas of biotech like drug delivery, genetic engineering, and gene sequencing. Biology already is nanotechnology, after all. Every cell in your body is an incredibly complex nanomachine. The small-molecule drugs we use to treat disease plug into interfaces that already exist on these nanomachines. The viruses we use to deliver new genes are their own kind of nanomachine, evolved to this specific purpose. So while there are a lot of ideas on how to build new classes of machines from scratch, I suspect things will progress much faster in areas where people take these existing designs our genome and cell biology and neural architecture and make incremental changes to them with the best tools at hand. NF: Referring back to your first answer, lots of transgeeks reference superheroes in comics and SF. But one of the aspects of comic superheroes that appeal to young imaginations is the fact that the superheroes have powers that everybody else doesn't have! What about the possibility that individuals and societies might try to sabotage the enhancements of the enemy? In other words, what about criminal social competition and war? In 10 years, instead of worrying about the North Koreans getting the nuke, we might be worrying about them getting the latest upgrade of supersoldier. 
RN: To the extent that this technology is employed by the military, you're sure to see one side try to one-up the other. The US military wants to have the most capable soldiers possible, which means having the best training, best gear, and the best enhancements. They're going to try to keep that edge through a combination of outspending the competition, keeping some technologies secret or restricted, and trying to deny competitors their capabilities on the battlefield. At the same time, we're never going to be as worried about souped-up soldiers as we are about nukes or infectious bio-weapons. The scale of the damage you can do is different by orders of magnitude. All told though, I think the majority of investment into these technologies is going to be consumer-driven: first in the medical realm and then in the self-improvement realm. If you think about it, the US military spends a huge amount on computers, physical fitness, and skills training for its soldiers. But that's still a drop in the bucket compared to the overall size of the computer industry, the amount consumers spend on gym memberships and exercise videos, or the total education spending in this country. NF: I guess what I'm trying to do is get underneath your apparently unfailingly bright view of the human species (just 'cause I'm in a mood). I'm generally trans-positive, but in the light of human history, nightmare scenarios seem at least as plausible. So what makes you so damn upbeat, Ramez? RN: You know, these technologies will definitely cause problems. There's just no way around that. Every technology that's really mattered has had some sort of unexpected consequence on society. Cars lead to highway fatalities and smog. Antibiotics contributed to the population explosion of the last century. The internet makes it easier to transmit child porn. And every really powerful technology is employed for violent military uses. Trains, automobiles, and planes (civilian transport technologies) make armies more powerful and more deadly. Radio allows the coordination of larger groups of violent men. Even agriculture, one of the most basic technologies we have, helped usher in organized warfare by increasing population densities and allowing the creation of a soldier class. So I'm not blind to the fact that there will be problems. But when I follow the course of human history, even with the periodic atrocities and downturns, the world seems to be steadily becoming a better place. At the turn of the 20th century, average life expectancy was less than 40 years. Today it's 66. It's 66 years in India, one of the poorest nations on earth. That's twice the life expectancy that the Romans enjoyed at the height of their empire. And the developing world is actually catching up with the rich world in life expectancy. The gap is closing every year. You can see the same thing in the amount of violence in society. We have this romantic notion of peaceful life among the hunter-gatherers, but anthropologists have documented that in hunter-gatherer tribes, warfare routinely accounted for twenty, thirty, forty percent of all male deaths. In the 20th century, by contrast, less than one percent of male deaths have come from warfare, even when you include the world wars. And then there's personal freedom. Even as recently as a couple decades ago, people's choices were narrower.
We live in a world where more and more people are being educated, more are able to choose how they spend their lives, more people have access to a wider variety of information and goods than ever before. People around the world on average have a higher standard of living than ever before. I don't think those trends are just coincidental. I think they're an emergent property of human societies, particularly free human societies. When you put millions or billions of people together and give them freedom to choose how they'll spend their time, energy, and money, a higher-level intelligent behavior emerges from the whole. I'm not saying this in any sort of mystical way, just a pragmatic one. Many people working together seem to make far better choices than any individual, even the smartest individual, could possibly make. And one of the fundamental trends I see in the world is this movement towards increasing the ability of individuals to interact, share information, and communicate with one another. Or maybe everything I just said is a rationalization, and I'm actually upbeat because of some random variation in my serotonin receptor genes. References 1. http://www.morethanhuman.org/ 2. http://www.life-enhancement.com/le/neofiles/default.asp?ID=13 3. http://www.downstate.edu/pharmacology/chapin.htm 4. http://news.bbc.co.uk/1/hi/sci/tech/435816.stm 5. http://en.wikipedia.org/wiki/Eric_R._Kandel 6. http://www.memorypharma.com/a_advisoryboard.html 7. http://www.cbsnews.com/stories/2003/08/04/health/main566593.shtml 8. http://www.darpa.mil/ 9. http://www.life-enhancement.com/le/neofiles/default.asp?ID=20 From checker at panix.com Tue May 3 22:16:06 2005 From: checker at panix.com (Premise Checker) Date: Tue, 3 May 2005 18:16:06 -0400 (EDT) Subject: [Paleopsych] GovtExec: DHS chief floats idea of collecting private citizens' information Message-ID: DHS chief floats idea of collecting private citizens' information http://www.govexec.com/story_page.cfm?articleid=31124&printerfriendlyVers=1& DAILY BRIEFING April 29, 2005 DHS chief floats idea of collecting private citizens' information By Siobhan Gorman, [2]National Journal Call it Total Information Awareness, homeland-style. Homeland Security Secretary Michael Chertoff this week floated an idea to start a nonprofit group that would collect information on private citizens, flag suspicious activity, and send names of suspicious people to his department. The idea, which Chertoff tossed out at an April 27 meeting with security-industry officials, is reminiscent of the Defense Department's now-dead Total Information Awareness program that sought to sift through heaps of foreign intelligence information to root out potential terrorist activity. According to one techie who attended the April 27 meeting, Chertoff told the group, "Maybe we can create a nonprofit and track people's activities, and an algorithm could red-flag individuals. Then, the nonprofit could give us the names." Chertoff also suggested that private industry form a group to collect proprietary information about cyber- and other infrastructure-security breaches from companies; scrub it of identifying information; aggregate it; and pass it along to the department. The financial services industry already has such a group. "The secretary was responding to a hypothetical question with a hypothetical answer," said Homeland Security Department press secretary Brian Roehrkasse. "He did not offer specific programmatic content or discuss any specific proposed approach.
Rather, he was discussing, in general terms, the importance of this issue of balancing security and privacy." Harris Miller, president of the Information Technology Association of America, organized the gathering of about 50 security-industry executives from companies such as Microsoft, Oracle, and Verizon. Reached by phone at the meeting, he characterized the event as "an organizational meeting to discuss how the [information-technology] industry can work more effectively with each other" and with the Homeland Security Department. Because the meeting was closed to the press, Miller would not discuss Chertoff's comments.One meeting participant said that Chertoff told the group that having a nonprofit collect names rather than the government "would alleviate some of the concerns people have." Not so for this participant: "This is what made me sort of shift in my seat. It sounds like investigating every person for no reason." He was particularly concerned that an unknown formula created by this new group would determine the red flags. From checker at panix.com Tue May 3 22:16:18 2005 From: checker at panix.com (Premise Checker) Date: Tue, 3 May 2005 18:16:18 -0400 (EDT) Subject: [Paleopsych] NYT: Chimeras on the Horizon, but Don't Expect Centaurs Message-ID: ---------- Forwarded message ---------- Date: Tue, 3 May 2005 10:49:46 -0400 (EDT) From: Premise Checker To: Transhuman Tech Subject: NYT: Chimeras on the Horizon, but Don't Expect Centaurs Chimeras on the Horizon, but Don't Expect Centaurs New York Times, 5.5.3 http://www.nytimes.com/2005/05/03/science/03chim.html By [1]NICHOLAS WADE Common ground for ethical research on human embryonic stem cells may have been laid by the National Academy of Sciences in the well-received guidelines it proposed last week. But if research on human embryonic stem cells ever gets going, people will be hearing a lot more about chimeras, creatures composed of more than one kind of cell. The world of chimeras holds weirdnesses that may require some getting used to. The original chimera, a tripartite medley of lion, goat and snake, was a mere monster, but mythology is populated with half-human chimeras - centaurs, sphinxes, werewolves, minotaurs and mermaids, and the gorgon Medusa. These creatures hold generally sinister powers, as if to advertise the pre-Darwinian notion that species are fixed and penalties are severe for transgressing the boundaries between them. Biologists have been generating chimeras for years, though until now of a generally bland variety. If you mix the embryonic cells of a black mouse and a white mouse, you get a patchwork mouse, in which the cells from the two donors contribute to the coat and to tissues throughout the body. Cells can also be added at a later stage to specific organs; people who carry pig heart valves are, at least technically, chimeric. The promise of embryonic stem cells is that since all the tissues of the body are derived from them, they are a kind of universal clay. If biologists succeed in learning how to shape the clay into specific organs, like pancreas glands, heart muscle or kidneys, physicians may be able to provide replacement parts on demand. Developing these new organs, and testing them to the standards required by the Food and Drug Administration, will require growing human organs in animals. Such creations - of pigs with human hearts, monkeys with human larynxes - are likely to be unsettling to many. "I think people would be horrified," said Dr. 
William Hansen, an expert in mythology at Indiana University. Chimeras grip the imagination because people are both fascinated and repulsed by the defiance of natural order. "They promote a sense of wonder and awe and for many of us that is an enjoyable feeling; they are a safe form of danger as in watching a scary movie," Dr. Hansen said. From the biologists' point of view, animals made to grow human tissues do not really raise novel issues because they can be categorized as animals with added human parts. Biologists are more concerned about animals in which human cells have become seeded throughout the system. "The mixing of species is something people do worry about and their fears need to be addressed," said Dr. Richard O. Hynes of the Massachusetts Institute of Technology, the co-chairman of the National Academy of Sciences committee that issued the research guidelines. Foreseeing the need for chimeras if stem cell research gets near to therapy, Dr. Hynes's committee delved into the ethics of chimera manufacture, defining the two cases in which human-animal chimeras could raise awkward issues. One involves incorporating human cells into the germ line; the other involves using a human brain, creating a human or half-human mind imprisoned in an animal body. In the case of human cells' invading the germ line, the chimeric animals might then carry human eggs and sperm, and in mating could therefore generate a fertilized human egg. Hardly anyone would desire to be conceived by a pair of mice. To forestall such discomforting possibilities, the committee ruled that chimeric animals should not be allowed to mate. Still, there may in the future be good reason to generate mice that produce human oocytes, as the unfertilized egg is called. Tissues made from embryonic stem cells are likely to be perceived as foreign by the patient's immune system. One way around this problem is to create the embryonic stem cells from a patient's own tissues, by transferring a nucleus from the patient's skin cell into a human oocyte whose own nucleus has been removed. These nuclear transfers, which are also the way that cloned animals are made, are at present highly inefficient and require some 200 oocytes for each successful cloning. Acquiring oocytes from volunteers is not a trivial procedure, and the academy's recommendation that women who volunteer should not be paid is unlikely to increase supply. Chimeric mice that make human oocytes could be the answer. There are also sound scientific reasons for creating mice with human brain cells, an experiment that has long been contemplated by Dr. Irving Weissman of Stanford. Many serious human diseases arise through the loss of certain types of brain cell. To test if these can be replaced with human neural stem cells, Dr. Weissman injected human brain cells into a mouse embryo and showed that they followed the rules for mouse neural stem cells, migrating to the olfactory bulb to create a regular stream of new odor-detecting neurons. The mice may have been perplexed by their deficient sense of smell but probably not greatly so because human cells constituted less than 1 percent of their brain. Dr. Weissman decided it would be useful to have a mouse with a much larger percentage of human brain cells, but he sought ethical guidance before trying the experiment. He plans to let such mice develop as fetuses and to curtail the experiment before birth, to see if their human brain cells have arranged themselves in the architecture of a mouse brain or human brain.
Given the nine months it takes for a human brain to be constructed, it seems unlikely that the developmental program of the human neurons would have time to unfold very far in the 20-day gestation of a mouse. Contrary to the plot of every good horror movie, the biologists' chimera cookbook contains only recipes of medical interest. But if there were no limits, could they in fact turn chimeras of myth into reality? That depends on the creature. If embryonic cells from human and horse were mixed together, the cells of each species would try to contribute to each part of the body, as in the patchwork mouse, but in this case with goals so incompatible it is hard to see any viable creature being formed. Centaurs, in any case, have six limbs, and that would be fine for an insect but violates the standard mammalian body plan. A much greater chance of creating a viable chimeric creature would come from injecting human embryonic stem cells into a monkey or ape. For this reason the academy committee has firmly ruled out such experiments as unethical. But to continue a little on the path of fantasy, humans are still very similar to chimpanzees, their closest surviving cousins, and an embryo constructed of cells from each may be viable enough to be born. This chimerical creature would probably not be as enjoyable as the chimeras of mythology but more of a problem human - a Caliban-like personage with bad manners and difficult habits. "If something were half human and half animal, what would our moral responsibilities be?" says Richard Doerflinger of the United States Conference of Catholic Bishops. "It might be immoral to kill such a creature. It's wrong to create creatures whose moral stature we are perplexed about." Evidently the first rule of chimeric chemistry is not to make creatures whose behavior straddles the perceived division between the human and animal worlds. References 1. http://query.nytimes.com/search/query?ppds=bylL&v1=NICHOLAS%20WADE&fdq=19960101&td=sysdate&sort=newest&ac=NICHOLAS%20WADE&inline=nyt-per From waluk at earthlink.net Wed May 4 00:32:27 2005 From: waluk at earthlink.net (G. Reinhart-Waller) Date: Tue, 03 May 2005 17:32:27 -0700 Subject: [Paleopsych] child's play In-Reply-To: <20050503190556.13160.qmail@web30802.mail.mud.yahoo.com> References: <20050503190556.13160.qmail@web30802.mail.mud.yahoo.com> Message-ID: <4278181B.1060400@earthlink.net> Michael Christopher wrote: >>--That's an odd list... smiling is lumped in with wrestling? And with no distinction among levels of roughness in play fighting? I doubt there's any "politically correct" movement to ban smiling on the playground. >> Maybe not. Could be that Anthony Pellegrini as a professor in early childhood development thinks all children need a broad range of activity to experience the entire range of activity from fighting to fleeing. Only by knowing these activities will children grow into adults who can handle both emotional extremes, depending on the circumstances. Isn't this similar to gambling at poker and knowing when to hold 'em and when to show 'em? As the song continues....know when to walk away...know when to run. 
Regards, Gerry Reinhart-Waller From shovland at mindspring.com Wed May 4 04:41:07 2005 From: shovland at mindspring.com (Steve Hovland) Date: Tue, 3 May 2005 21:41:07 -0700 Subject: [Paleopsych] Future shape of media- presentation from ACRL 2005 Message-ID: <01C55028.D1061D80.shovland@mindspring.com> http://www.robinsloan.com/epic/ Click on: Go to a random mirror (you may have to keep trying if there's an error page) This is pretty neat flash work Steve Hovland www.stevehovland.net From HowlBloom at aol.com Wed May 4 06:02:23 2005 From: HowlBloom at aol.com (HowlBloom at aol.com) Date: Wed, 4 May 2005 02:02:23 EDT Subject: [Paleopsych] instant evolution in societies of genes Message-ID: Note the following quote in the article below: "These genes ... are changing more swiftly than would be expected through random mutation alone." The genes in question are genes that code for learning, genes that code for adaptive intelligence. These genes outpace the others in humans and chimps. The research outlined below indicates that these genes are first in the race to reorganize and upgrade themselves; they outspeed other genes in evolution. What do these fast-track genes have in common? They are the genes of the immune system and the genes of apoptosis, the genes of pre-programmed cell suicide. Pre-programmed cell suicide determines which cells we need and which we don't. It resculpts the body to fit the exigencies of the moment. More important, the genes of pre-programmed cell suicide determine which 50% of the cerebral neurons we're born with will live and which will die. In this harsh process of judgement, apoptosis shapes the brain to live in the society we're a part of and to deal with the problems that society demands we help solve. Pre-programmed cell death, I suspect, also shapes our body to fit the demands of our physical environment. It expands the size of our lungs if we grow up in the Andes Mountains, where the air is thin. It makes sure that we don't waste energy and materiel on oversized lungs if we're born and raised near sea level (which 60% of us humans are). Then there's the immune system, a learning mesh, a creative web, a neural-net-like community of nodes, of modules. The immune system is, in its own way, nearly as smart as the brain. The brain's advantages: a brain brings multiple intelligences to work on a problem, seven of them if you go by Howard Gardner. I suspect the brain has more than that mere seven if you count the many forms of conscious reason, the many forms of intuition, the many forms of muscular metaphor, the many systems that keep us walking while we're thinking or talking, our sensory systems, and the autonomous systems that take care of functions we seldom have to be aware of: heartbeat, digestion, and shunting blood to the place where it's most needed at the moment. The genes of the immune system and of apoptosis. These are the genes of what Global Brain: The Evolution of Mass Mind From The Big Bang to the 21st Century calls "inner-judges" and of what The Lucifer Principle: A Scientific Expedition Into the Forces of History calls "self-destruct mechanisms". According to these two books, the genes of the immune system and of apoptosis are the genes that turn us into modules of a larger collective learning machine, a neural net that wires our subcultures, our nations, and our global societies into a massive, creative computational engine, a thinking, dreaming, reperceiving, and invention machine.
The genes of the immune system and of apoptosis are the non-stop sharpeners of learning's cutting edge. And the genes of the immune system and of apoptosis don't lazily await random mutation to adapt. They take adaptation into their own hands, into their own c's, a's, g's, and t's, into their own thinking mesh. I suspect they also pull off what Jeff Hawkins talks about in his On Intelligence: they feed their output back into their input. They experiment with adjustments in our phenotype, in our bodies and our minds. They test their experiments in our social and physical environment. They incorporate what works and toss out what doesn't, even if that means tossing out you and me. Which means that like Eshel Ben-Jacob's creative webs of bacteria, the genes of the immune system and of apoptosis, the genes of instant evolution, may be able to spot problems, generate potential solutions, then respond to the success or failure of these hypotheses. The bottom line is this: Communities of genes--the community of 35,000 in a human genome, the community of 3.5 quadrillion (3,500,000,000,000,000) in a single human being, or the community of 3.5 septillion (3,500,000,000,000,000,000,000,000) in a society the size of China--are much more nimble than we think. Howard Retrieved May 3, 2005, from the World Wide Web http://www.newscientist.com/article.ns?id=dn7335 Fastest-evolving genes in humans and chimps revealed 18:37 03 May 2005 NewScientist.com news service Jennifer Viegas The most comprehensive study to date exploring the genetic divergence of humans and chimpanzees has revealed that the genes most favoured by natural selection are those associated with immunity, tumour suppression [hb: the immune system, like the brain, is one of our swiftest learning machines], and programmed cell death [hb: programmed cell death shapes our morphology to fit the shifts in our environment, especially the shifts in human culture. In other words, apoptosis is also a learning mechanism, part of what makes the connectionist machine work.]. These genes show signs of positive natural selection in both branches of the evolutionary tree and are changing more swiftly than would be expected through random mutation alone. Lead scientist Rasmus Nielsen and colleagues at the University of Copenhagen, Denmark, examined the 13,731 chimp genes that have equivalent genes with known functions in humans. Research in 2003 revealed that genes involved with smell, hearing, digestion, long bone growth, and hairiness are undergoing positive natural selection in chimps and humans. The new study has found that the strongest evidence for selection is related to disease defence and apoptosis - or programmed cell death - which is linked to sperm production. Plague and HIV Nielsen, a professor of bioinformatics, believes immune and defence genes are involved in "an evolutionary arms race with pathogens". "Viruses and other pathogens evolve very fast, and the human immune system is constantly being challenged by the emergence of new pathogenic threats," he told New Scientist. "The amount of selection imposed on the human population by pathogens - such as the bubonic plague or HIV - is enormous. It is no wonder that the genes involved in defence against such pathogens are evolving very fast."
Harmit Singh Malik, a researcher at the Fred Hutchinson Cancer Research Center in Seattle, Washington, US, agrees. Both Malik and Nielsen, however, expressed surprise over the findings concerning tumour suppression, which is linked to apoptosis - or programmed cell death - which can reduce the production of healthy, mature sperm. Selfish mutation The discovery by Nielsen that genes involved in apoptosis show strong evidence for positive natural selection may be due, in part, to the evolutionary drive for sperm cells to compete. Cells carrying genes that hinder apoptosis have a greater chance of producing mature sperm cells, so Nielsen believes these genes can become widespread in populations over time. But because primates also use apoptosis to eliminate cancerous cells, positive selection in this case may not be favourable for the mature animal: "The selfish mutations that cause apoptosis avoidance may then also reduce the organism's ability to fight cancer," Nielsen explains. Journal reference: Public Library of Science Biology (vol 3, issue 6) Related Articles Life's top 10 greatest inventions http://www.newscientist.com/article.ns?id=mg18624941.700 09 April 2005 Sleeping around boosts evolution http://www.newscientist.com/article.ns?id=mg18424731.500 13 November 2004 Genetically-modified virus explodes cancer cells http://www.newscientist.com/article.ns?id=dn5056 01 June 2004 Weblinks Rasmus Nielsen, University of Copenhagen http://www.binf.ku.dk/users/rasmus/webpage/ras.html Harmit Singh Malik's lab, Fred Hutchinson Cancer Research Center http://www.fhcrc.org/labs/malik/ Public Library of Science Biology http://biology.plosjournals.org/perlserv/?request=index-html&issn=1545-7885 ---------- Howard Bloom Author of The Lucifer Principle: A Scientific Expedition Into the Forces of History and Global Brain: The Evolution of Mass Mind From The Big Bang to the 21st Century Visiting Scholar-Graduate Psychology Department, New York University; Core Faculty Member, The Graduate Institute www.howardbloom.net www.bigbangtango.net Founder: International Paleopsychology Project; founding board member: Epic of Evolution Society; founding board member, The Darwin Project; founder: The Big Bang Tango Media Lab; member: New York Academy of Sciences, American Association for the Advancement of Science, American Psychological Society, Academy of Political Science, Human Behavior and Evolution Society, International Society for Human Ethology; advisory board member: Youthactivism.org; executive editor -- New Paradigm book series. For information on The International Paleopsychology Project, see: www.paleopsych.org for two chapters from The Lucifer Principle: A Scientific Expedition Into the Forces of History, see www.howardbloom.net/lucifer For information on Global Brain: The Evolution of Mass Mind from the Big Bang to the 21st Century, see www.howardbloom.net -------------- next part -------------- An HTML attachment was scrubbed... URL: From waluk at earthlink.net Wed May 4 23:04:17 2005 From: waluk at earthlink.net (G. Reinhart-Waller) Date: Wed, 04 May 2005 16:04:17 -0700 Subject: [Paleopsych] Re: child's play In-Reply-To: <20050504185955.77455.qmail@web30804.mail.mud.yahoo.com> References: <20050504185955.77455.qmail@web30804.mail.mud.yahoo.com> Message-ID: <427954F1.6040108@earthlink.net> Firstly, please let me change the repetitive phrase "broad range of activity" to the single word "play" so that the sentence in fact reads "....
early childhood development thinks all children need a broad range of activity to experience playing roles as diverse as fighting and fleeing." Whether one is acting the role of trespasser or one who is being trespassed against, all roles in effect turn out to be play through the eyes of William Shakespeare's "the world is a stage.....". Whether or not wrestling, backyard or in the ring, is considered dangerous depends on what it is being compared to. In my opinion wrestling is less dangerous than boxing, soccer, or football but that's my two cent's worth. Could be that you have a different idea in mind. This is why I continue saying that "truth lies in the eyes of the beholder". Regards, Gerry Reinhart-Waller Michael Christopher wrote: >>>Could be that Anthony Pellegrini as a professor in >>> >>> >early childhood development thinks all children need a >broad range of activity to experience the entire range >of activity from fighting to fleeing.<< > >--I think most people would agree, the question is, >"when is fighting more than play". His list didn't >make that distinction, "fighting to fleeing" could >include anything from backyard wrestling (quite >dangerous) to "running and jumping" (totally normal). > >Michael > > >__________________________________________________ >Do You Yahoo!? >Tired of spam? Yahoo! Mail has the best spam protection around >http://mail.yahoo.com > > > From anonymous_animus at yahoo.com Wed May 4 23:09:29 2005 From: anonymous_animus at yahoo.com (Michael Christopher) Date: Wed, 4 May 2005 16:09:29 -0700 (PDT) Subject: [Paleopsych] reality shows In-Reply-To: <200505041802.j44I2lR03171@tick.javien.com> Message-ID: <20050504230929.26536.qmail@web30812.mail.mud.yahoo.com> Gerry says: >>Rather than placing blame, we should instead focus on our ideals and goals that contribute to a group rather than only to personal satisfaction. Incidently this seems to be the thrust of the t.v. show "Extreme Makeover, Home ! Edition" where a group of volunteers designs, builds, and decorates a home for worthy clients. It's not much, but it's a beginning.<< --It's interesting to contrast those shows with the cutthroat ones. I wonder which way our culture will head as a whole? Dog-eat-dog, or community? Michael Discover Yahoo! Find restaurants, movies, travel and more fun for the weekend. Check it out! http://discover.yahoo.com/weekend.html From shovland at mindspring.com Wed May 4 23:12:11 2005 From: shovland at mindspring.com (Steve Hovland) Date: Wed, 4 May 2005 16:12:11 -0700 Subject: [Paleopsych] The Society of Organelles Message-ID: <01C550C4.07DE2BE0.shovland@mindspring.com> http://www.winterwren.com/apbio/cellorganelles/cells.html Nonmembrane Bound Organelles Ribosomes Centrioles Microtubules Membrane Bound Organelles Organelles made-up of single membranes Vacuoles Lysosomes Vesicles Endoplasmic reticulum Golgi Apparatus Peroxisomes Endomembrane System Organelles made-up of double membranes Mitochondria Chloroplasts From shovland at mindspring.com Wed May 4 23:23:18 2005 From: shovland at mindspring.com (Steve Hovland) Date: Wed, 4 May 2005 16:23:18 -0700 Subject: [Paleopsych] The Language of Enzymes Message-ID: <01C550C5.95C237C0.shovland@mindspring.com> http://users.rcn.com/jkimball.ma.ultranet/BiologyPages/E/Enzymes.html Enzymes are catalysts. Most are proteins <../P/Proteins.html>. (A few ribonucleoprotein <../R/R.html> enzymes have been discovered and, for some of these, the catalytic activity is in the RNA part rather than the protein part. 
Link to discussion of these ribozymes <../R/Ribozymes.html>.) From waluk at earthlink.net Wed May 4 23:39:02 2005 From: waluk at earthlink.net (G. Reinhart-Waller) Date: Wed, 04 May 2005 16:39:02 -0700 Subject: [Paleopsych] reality shows In-Reply-To: <20050504230929.26536.qmail@web30812.mail.mud.yahoo.com> References: <20050504230929.26536.qmail@web30812.mail.mud.yahoo.com> Message-ID: <42795D16.202@earthlink.net> Could be that group dynamics are beginning to take center stage and the "selfish" generation will shortly find itself in the ebb. This could signal a change in paradigm but for how long is uncertain....group flow will only last as long as it takes before the tide begins to ebb and the "me" generation remakes itself into something more powerful than before. Gerry Michael Christopher wrote: >Gerry says: > > >>>Rather than placing blame, we should instead focus >>> >>> >on our ideals and goals that contribute to a group >rather than only to personal satisfaction. Incidently >this seems to be the thrust of the t.v. show "Extreme >Makeover, Home ! Edition" where a group of volunteers >designs, builds, and decorates a home for worthy >clients. It's not much, but it's a beginning.<< > >--It's interesting to contrast those shows with the >cutthroat ones. I wonder which way our culture will >head as a whole? Dog-eat-dog, or community? > >Michael > > > >Discover Yahoo! >Find restaurants, movies, travel and more fun for the weekend. Check it out! >http://discover.yahoo.com/weekend.html > >_______________________________________________ >paleopsych mailing list >paleopsych at paleopsych.org >http://lists.paleopsych.org/mailman/listinfo/paleopsych > > > From checker at panix.com Thu May 5 14:55:40 2005 From: checker at panix.com (Premise Checker) Date: Thu, 5 May 2005 10:55:40 -0400 (EDT) Subject: [Paleopsych] MSNBC: Human evolution at the crossroads Message-ID: Human evolution at the crossroads Genetics, cybernetics complicate forecast for species http://www.msnbc.msn.com/id/7103668/ 5.5.2 By Alan Boyle Science editor Scientists are fond of running the evolutionary clock backward, using DNA analysis and the fossil record to figure out when our ancestors stood erect and split off from the rest of the primate evolutionary tree. But the clock is running forward as well. So where are humans headed? Evolutionary biologist Richard Dawkins says it's the question he's most often asked, and "a question that any prudent evolutionist will evade." But the question is being raised even more frequently as researchers study our past and contemplate our future. Paleontologists say that anatomically modern humans may have at one time shared the Earth with as many as three other closely related types - Neanderthals, Homo erectus and the dwarf hominids whose remains were discovered last year in Indonesia. Does evolutionary theory allow for circumstances in which "spin-off" human species could develop again? Some think the rapid rise of genetic modification could be just such a circumstance. Others believe we could blend ourselves with machines in unprecedented ways - turning natural-born humans into an endangered species. Present-day fact, not science fiction Such ideas may sound like little more than science-fiction plot lines. But trend-watchers point out that we're already wrestling with real-world aspects of future human development, ranging from stem-cell research to the implantation of biocompatible computer chips. 
The debates are likely to become increasingly divisive once all the scientific implications sink in. "These issues touch upon religion, upon politics, upon values," said Gregory Stock, director of the Program on Medicine, Technology and Society at the University of California at Los Angeles. "This is about our vision of the future, essentially, and we'll never completely agree about those things." The problem is, scientists can't predict with precision how our species will adapt to changes over the next millennium, let alone the next million years. That's why Dawkins believes it's imprudent to make a prediction in the first place. Others see it differently: In the book "Future Evolution," University of Washington paleontologist Peter Ward argues that we are making ourselves virtually extinction-proof by bending Earth's flora and fauna to our will. And assuming that the human species will be hanging around for at least another 500 million years, Ward and others believe there are a few most likely scenarios for the future, based on a reading of past evolutionary episodes and current trends. Where are humans headed? Here's an imprudent assessment of five possible paths, ranging from homogenized humans to alien-looking hybrids bred for interstellar travel. Unihumans: Will we all be assimilated? Biologists say that different populations of a species have to be isolated from each other in order for those populations to diverge into separate species. That's the process that gave rise to 13 different species of "Darwin's Finches" in the Galapagos Islands. But what if the human species is so widespread there's no longer any opening for divergence? Evolution is still at work. But instead of diverging, our gene pool has been converging for tens of thousands of years - and Stuart Pimm, an expert on biodiversity at Duke University, says that trend may well be accelerating. "The big thing that people overlook when speculating about human evolution is that the raw matter for evolution is variation," he said. "We are going to lose that variability very quickly, and the reason is not quite a genetic argument, but it's close. At the moment we humans speak something on the order of 6,500 languages. If we look at the number of languages we will likely pass on to our children, that number is 600." Cultural diversity, as measured by linguistic diversity, is fading as human society becomes more interconnected globally, Pimm argued. "I do think that we are going to become much more homogeneous," he said. Ken Miller, an evolutionary biologist at Brown University, agreed: "We have become a kind of animal monoculture." Is that such a bad thing? A global culture of Unihumans could seem heavenly if we figure out how to achieve long-term political and economic stability and curb population growth. That may require the development of a more "domesticated" society - one in which our rough genetic edges are smoothed out. But like other monocultures, our species could be more susceptible to quick-spreading diseases, as last year's bird flu epidemic illustrated. "The genetic variability that we have protects us against suffering from massive harm when some bug comes along," Pimm said. "This idea of breeding the super-race, like breeding the super-race of corn or rice or whatever - the long-term consequences of that could be quite scary." Environmental pressures wouldn't stop Even a Unihuman culture would have to cope with evolutionary pressures from the environment, the University of Washington's Peter Ward said. 
Some environmentalists say toxins that work like estrogens are already having an effect: Such agents, found in pesticides and industrial PCBs, have been linked to earlier puberty for women, increased incidence of breast cancer and lower sperm counts for men. "One of the great frontiers is going to be trying to keep humans alive in a much more toxic world," he observed from his Seattle office. "The whales of Puget Sound are the most toxic whales on Earth. Puget Sound is just a huge cesspool. Well, imagine if that goes global." Global epidemics or dramatic environmental changes represent just two of the scenarios that could cause a Unihuman society to crack, putting natural selection - or perhaps not-so-natural selection - back into the evolutionary game. Then what? Survivalistians: Coping with doomsday Surviving doomsday is a story as old as Noah's Ark, and as new as the post-bioapocalypse movie "28 Days After." Catastrophes ranging from super-floods to plagues to nuclear war to asteroid strikes erase civilization as we know it, leaving remnants of humanity who go their own evolutionary ways. The classic Darwinian version of the story may well be H.G. Wells' "The Time Machine," in which humanity splits off into two species: the ruthless, underground Morlock and the effete, surface-dwelling Eloi. At least for modern-day humans, the forces that lead to species spin-offs have been largely held in abeyance: Populations are increasingly in contact with each other, leading to greater gene-mixing. Humans are no longer threatened by predators their own size, and medicine cancels out inherited infirmities ranging from hemophilia to nearsightedness. "We are helping genes that would have dropped out of the gene pool," paleontologist Peter Ward observed. But in Wells' tale and other science-fiction stories, a civilization-shattering catastrophe serves to divide humanity into separate populations, vulnerable once again to selection pressures. For example, people who had more genetic resistance to viral disease would be more likely to pass on that advantage to their descendants. If different populations develop in isolation over many thousands of generations, it's conceivable that separate species would emerge. For example, that virus-resistant strain of post-humans might eventually thrive in the wake of a global bioterror crisis, while less hardy humans would find themselves quarantined in the world's safe havens. Patterns in the spread of the virus that causes AIDS may hint at earlier, less catastrophic episodes of natural selection, said Stuart Pimm, a conservation biologist at Duke University: "There are pockets of people who don't seem to become HIV-positive, even though they have a lot of exposure to the virus - and that may be because their ancestors survived the plague 500 years ago." Evolution, or devolution? If the catastrophe ever came, could humanity recover? In science fiction, that's an intriguingly open question. For example, Stephen Baxter's novel "Evolution" foresees an environmental-military meltdown so severe that, over the course of 30 million years, humans devolve into separate species of eyeless mole-men, neo-apes and elephant-people herded by their super-rodent masters. Even Ward gives himself a little speculative leeway in his book "Future Evolution," where a time-traveling human meets his doom 10 million years from now at the hands - or in this case, the talons - of a flock of intelligent killer crows. 
But Ward finds it hard to believe that even a global catastrophe would keep human populations isolated long enough for our species to split apart. "Unless we totally forget how to build a boat, we can quickly come back," Ward said. Even in the event of a post-human split-off, evolutionary theory dictates that one species would eventually subjugate, assimilate or eliminate their competitors for the top job in the global ecosystem. Just ask the Neanderthals. "If you have two species competing over the same ecological niche, it ends badly for one of them, historically," said Joel Garreau, the author of the forthcoming book "Radical Evolution." The only reason chimpanzees still exist today is that they "had the brains to stay up in the trees and not come down into the open grasslands," he noted. "You have this optimistic view that you're not going to see speciation (among humans), and I desperately hope that's right," Garreau said. "But that's not the only scenario." Numans: Rise of the superhumans We've already seen the future of enhanced humans, and his name is Barry Bonds. The controversy surrounding the San Francisco Giants slugger, and whether steroids played a role in the bulked-up look that he and other baseball players have taken on, is only a foretaste of what's coming as scientists find new genetic and pharmacological ways to improve performance. Developments in the field are coming so quickly that social commentator Joel Garreau argues that they represent a new form of evolution. This radical kind of evolution moves much more quickly than biological evolution, which can take millions of years, or even cultural evolution, which works on a scale of hundreds or thousands of years. How long before this new wave of evolution spawns a new kind of human? "Try 20 years," Garreau told MSNBC.com. In his latest book, "Radical Evolution," Garreau reels off a litany of high-tech enhancements, ranging from steroid Supermen, to camera-equipped flying drones, to pills that keep soldiers going without sleep or food for days. "If you look at the superheroes of the '30s and the '40s, just about all of the technologies they had exist today," he said. Three kinds of humans Such enhancements are appearing first on the athletic field and the battlefield, Garreau said, but eventually they'll make their way to the collegiate scene, the office scene and even the dating scene. "You're talking about three different kinds of humans: the enhanced, the naturals and the rest," Garreau said. "The enhanced are defined as those who have the money and enthusiasm to make themselves live longer, be smarter, look sexier. That's what you're competing against." In Garreau's view of the world, the naturals will be those who eschew enhancements for higher reasons, just as vegetarians forgo meat and fundamentalists forgo what they see as illicit pleasures. Then there's all the rest of us, who don't get enhanced only because they can't. "They loathe and despise the people who do, and they also envy them," Garreau said. Scientists acknowledge that some of the medical enhancements on the horizon could engender a "have vs. have not" attitude. "But I could be a smart ass and ask how that's different from what we have now," said Brown University's Ken Miller. Medical advances as equalizers Miller went on to point out that in the past, "advances in medical science have actually been great levelers of social equality." 
For example, age-old scourges such as smallpox and polio have been eradicated, thanks to public health efforts in poorer as well as richer countries. That trend is likely to continue as scientists learn more about the genetic roots of disease, he said. "In terms of making genetic modifications to ourselves, it's much more likely we'll start to tinker with genes for disease susceptibility. ... Maybe there would be a long-term health project to breed HIV-resistant people," he said. When it comes to discussing ways to enhance humans, rather than simply make up for disabilities, the traits targeted most often are longevity and memory. Scientists have already found ways to enhance those traits in mice. Imagine improvements that could keep you in peak working condition past the age of 100. Those are the sorts of enhancements you might want to pass on to your descendants - and that could set the stage for reproductive isolation and an eventual species split-off. "In that scenario, why would you want your kid to marry somebody who would not pass on the genes that allowed your grandchildren to have longevity, too?" the University of Washington's Peter Ward asked. But that would require crossing yet another technological and ethical frontier. Instant superhumans - or monsters? To date, genetic medicine has focused on therapies that work on only one person at a time. The effects of those therapies aren't carried on to future generations. For example, if you take muscle-enhancing drugs, or even undergo gene therapy for bigger muscles, that doesn't mean your children will have similarly big muscles. In order to make an enhancement inheritable, you'd have to have new code spliced into your germline stem cells - creating an ethical controversy of transcendent proportions. Tinkering with the germline could conceivably produce a superhuman species in a single generation - but could also conceivably create a race of monsters. "It is totally unpredictable," Ward said. "It's a lot easier to understand evolutionary happenstance." Even then, there are genetic traits that are far more difficult to produce than big muscles or even super-longevity - for instance, the very trait that defines us as humans. "It's very, very clear that intelligence is a pretty subtle thing, and it's clear that we don't have a single gene that turns it on or off," Miller said. When it comes to intelligence, some scientists say, the most likely route to our future enhancement - and perhaps our future competition as well - just might come from our own machines. Cyborgs: Merging with the machines Will intelligent machines be assimilated, or will humans be eliminated? Until a few years ago, that question was addressed only in science-fiction plot lines, but today the rapid pace of cybernetic change has led some experts to worry that artificial intelligence may outpace Homo sapiens' natural smarts. The pace of change is often stated in terms of Moore's Law, which says that the number of transistors packed into a square inch should double every 18 months. "Moore's Law is now on its 30th doubling. We have never seen that sort of exponential increase before in human history," said Joel Garreau, author of the book "Radical Evolution." In some fields, artificial intelligence has already bested humans - with Deep Blue's 1997 victory over world chess champion Garry Kasparov providing a vivid example. 
Three years later, computer scientist Bill Joy argued in an influential Wired magazine essay that we would soon face challenges from intelligent machines as well as from other technologies ranging from weapons of mass destruction to self-replicating nanoscale "gray goo." Joy speculated that a truly intelligent robot may arise by the year 2030. "And once an intelligent robot exists, it is only a small step to a robot species - to an intelligent robot that can make evolved copies of itself," he wrote. Assimilating the robots To others, it seems more likely that we could become part-robot ourselves: We're already making machines that can be assimilated - including prosthetic limbs, mechanical hearts, cochlear implants and artificial retinas. Why couldn't brain augmentation be added to the list? "The usual suggestions are that we'll design improvements to ourselves," said Seth Shostak, senior astronomer at the SETI Institute. "We'll put additional chips in our head, and we won't get lost, and we'll be able to do all those math problems that used to befuddle us." Shostak, who writes about the possibilities for cybernetic intelligence in his book "Sharing the Universe," thinks that's likely to be a transitional step at best. "My usual response is that, well, you can improve horses by putting four-cylinder engines in them. But eventually you can do without the horse part," he said. "These hybrids just don't strike me as having a tremendous advantage. It just means the machines aren't good enough." Back to biology University of Washington paleontologist Peter Ward also believes human-machine hybrids aren't a long-term option, but for different reasons. "When you talk to people in the know, they think cybernetics will become biology," he said. "So you're right back to biology, and the easiest way to make changes is by manipulating genomes." It's hard to imagine that robots would ever be given enough free rein to challenge human dominance, but even if they did break free, Shostak has no fear of a "Terminator"-style battle for the planet. "I've got a couple of goldfish, and I don't wake up in the morning and say, 'I'm gonna kill these guys.' ... I just leave 'em alone," Shostak said. "I suspect the machines would very quickly get to a level where we were kind of irrelevant, so I don't fear them. But it does mean that we're no longer No. 1 on the planet, and we've never had that happen before." Astrans: Turning into an alien race If humans survive long enough, there's one sure way to grow new branches on our evolutionary family tree: by spreading out to other planets. Habitable worlds beyond Earth could be a 23rd century analog to the Galapagos Islands, Charles Darwin's evolutionary laboratory: just barely close enough for travelers to get to, but far enough away that there'd be little gene-mixing with the parent species. "If we get off to the stars, then yes, we will have speciation," said University of Washington paleontologist Peter Ward. "But can we ever get off the Earth?" Currently, the closest star system thought to have a planet is Epsilon Eridani, 10.5 light-years away. Even if spaceships could travel at 1 percent the speed of light - an incredible 6.7 million mph - it would take more than a millennium to get there. Even Mars might be far enough: If humans established a permanent settlement there, the radically different living conditions would change the evolutionary equation. 
For example, those who are born and raised in one-third of Earth's gravity could never feel at home on the old "home planet." It wouldn't take long for the new Martians to become a breed apart. As for distant stars, the SETI Institute's Seth Shostak has already been thinking through the possibilities: # Build a big ark: Build a spaceship big enough to carry an entire civilization to the destination star system. The problem is, that environment might be just too unnatural for natural humans. "If you talk to the sociologists, they'll say that it will not work. ... You'll be lucky if anybody's still alive after the third generation," Shostak said. # Go to warp speed: Somehow we discover a wormhole or find a way to travel at relativistic speeds. "That sounds OK, except for the fact that nobody knows how to do it," Shostak said. # Enter the Astrans: Humans are genetically engineered to tolerate ultra long-term hibernation aboard robotic ships. Once the ship reaches its destination, these "Astrans" are awakened to start the work of settling a new world. "That's one possibility," Shostak said. The ultimate approach would be to send the instructions for making humans rather than the humans themselves, Shostak said. "We're not going to put anything in a rocket, we're just going to beam ourselves to the stars," he explained. "The only trouble is, if there's nobody on the other end to put you back together, there's no point." So are we back to square one? Not necessarily, Shostak said. Setting up the receivers on other stars is no job for a human, "but the machines could make it work." In fact, if any other society is significantly further along than ours, such a network might be up and running by now. "The machines really could develop large tracts of galactic real estate, whereas it's really hard for biology to travel," Shostak said. It all seems inconceivable, but if humans really are extinction-proof - if they manage to survive global catastrophes, genetic upheavals and cybernetic challenges - who's to say what will be inconceivable millions of years from now? Two intelligent species, human and machine, just might work together to spread life through the universe. "If you were sufficiently motivated," Shostak said, "you could in fact keep it going forever." From checker at panix.com Thu May 5 14:56:01 2005 From: checker at panix.com (Premise Checker) Date: Thu, 5 May 2005 10:56:01 -0400 (EDT) Subject: [Paleopsych] NYT: Chimeras on the Horizon, but Don't Expect Centaurs Message-ID: Chimeras on the Horizon, but Don't Expect Centaurs New York Times, 5.5.3 http://www.nytimes.com/2005/05/03/science/03chim.html By [1]NICHOLAS WADE Common ground for ethical research on human embryonic stem cells may have been laid by the National Academy of Sciences in the well-received guidelines it proposed last week. But if research on human embryonic stem cells ever gets going, people will be hearing a lot more about chimeras, creatures composed of more than one kind of cell. The world of chimeras holds weirdnesses that may require some getting used to. The original chimera, a tripartite medley of lion, goat and snake, was a mere monster, but mythology is populated with half-human chimeras - centaurs, sphinxes, werewolves, minotaurs and mermaids, and the gorgon Medusa. These creatures hold generally sinister powers, as if to advertise the pre-Darwinian notion that species are fixed and penalties are severe for transgressing the boundaries between them. 
Biologists have been generating chimeras for years, though until now of a generally bland variety. If you mix the embryonic cells of a black mouse and a white mouse, you get a patchwork mouse, in which the cells from the two donors contribute to the coat and to tissues throughout the body. Cells can also be added at a later stage to specific organs; people who carry pig heart valves are, at least technically, chimeric. The promise of embryonic stem cells is that since all the tissues of the body are derived from them, they are a kind of universal clay. If biologists succeed in learning how to shape the clay into specific organs, like pancreas glands, heart muscle or kidneys, physicians may be able to provide replacement parts on demand. Developing these new organs, and testing them to the standards required by the Food and Drug Administration, will require growing human organs in animals. Such creations - of pigs with human hearts, monkeys with human larynxes - are likely to be unsettling to many. "I think people would be horrified," said Dr. William Hansen, an expert in mythology at Indiana University. Chimeras grip the imagination because people are both fascinated and repulsed by the defiance of natural order. "They promote a sense of wonder and awe and for many of us that is an enjoyable feeling; they are a safe form of danger as in watching a scary movie," Dr. Hansen said. From the biologists' point of view, animals made to grow human tissues do not really raise novel issues because they can be categorized as animals with added human parts. Biologists are more concerned about animals in which human cells have become seeded throughout the system. "The mixing of species is something people do worry about and their fears need to be addressed," said Dr. Richard O. Hynes of the Massachusetts Institute of Technology, the co-chairman of the National Academy of Sciences committee that issued the research guidelines. Foreseeing the need for chimeras if stem cell research gets near to therapy, Dr. Hynes's committee delved into the ethics of chimera manufacture, defining the two cases in which human-animal chimeras could raise awkward issues. One involves incorporating human cells into the germ line; the other involves using a human brain, creating a human or half-human mind imprisoned in an animal body. In the case of human cells' invading the germ line, the chimeric animals might then carry human eggs and sperm, and in mating could therefore generate a fertilized human egg. Hardly anyone would desire to be conceived by a pair of mice. To forestall such discomforting possibilities, the committee ruled that chimeric animals should not be allowed to mate. Still, there may in the future be good reason to generate mice that produce human oocytes, as the unfertilized egg is called. Tissues made from embryonic stem cells are likely to be perceived as foreign by the patient's immune system. One way around this problem is to create the embryonic stem cells from a patient's own tissues, by transferring a nucleus from the patient's skin cell into a human oocyte whose own nucleus has been removed. These nuclear transfers, which are also the way that cloned animals are made, are at present highly inefficient and require some 200 oocytes for each successful cloning. Acquiring oocytes from volunteers is not a trivial procedure, and the academy's recommendation that women who volunteer should not be paid is unlikely to increase supply. Chimeric mice that make human oocytes could be the answer. 
There are also sound scientific reasons for creating mice with human brain cells, an experiment that has long been contemplated by Dr. Irving Weissman of Stanford. Many serious human diseases arise through the loss of certain types of brain cell. To test if these can be replaced with human neural stem cells, Dr. Weissman injected human brain cells into a mouse embryo and showed that they followed the rules for mouse neural stem cells, migrating to the olfactory bulb to create a regular stream of new odor-detecting neurons. The mice may have been perplexed by their deficient sense of smell but probably not greatly so because human cells constituted less than 1 percent of their brain. Dr. Weissman decided it would be useful to have a mouse with a much larger percentage of human brain cells, but he sought ethical guidance before trying the experiment. He plans to let such mice develop as fetuses and to curtail the experiment before birth, to see if their human brain cells have arranged themselves in the architecture of a mouse brain or human brain. Given the nine months it takes for a human brain to be constructed, it seems unlikely that the developmental program of the human neurons would have time to unfold very far in the 20-day gestation of a mouse. Contrary to the plot of every good horror movie, the biologists' chimera cookbook contains only recipes of medical interest. But if there were no limits, could they in fact turn chimeras of myth into reality? That depends on the creature. If embryonic cells from human and horse were mixed together, the cells of each species would try to contribute to each part of the body, as in the patchwork mouse, but in this case with goals so incompatible it is hard to see any viable creature being formed. Centaurs, in any case, have six limbs, and that would be fine for an insect but violates the standard mammalian body plan. A much greater chance of creating a viable chimeric creature would come from injecting human embryonic stem cells into a monkey or ape. For this reason the academy committee has firmly ruled out such experiments as unethical. But to continue a little on the path of fantasy, humans are still very similar to chimpanzees, their closest surviving cousins, and an embryo constructed of cells from each may be viable enough to be born. This chimerical creature would probably not be as enjoyable as the chimeras of mythology but more of a problem human - a Caliban-like personage with bad manners and difficult habits. "If something were half human and half animal, what would our moral responsibilities be?" says Richard Doerflinger of the United States Conference of Catholic Bishops. "It might be immoral to kill such a creature. It's wrong to create creatures whose moral stature we are perplexed about." Evidently the first rule of chimeric chemistry is not to make creatures whose behavior straddles the perceived division between the human and animal worlds. References 1. http://query.nytimes.com/search/query?ppds=bylL&v1=NICHOLAS%20WADE&fdq=19960101&td=sysdate&sort=newest&ac=NICHOLAS%20WADE&inline=nyt-per From checker at panix.com Thu May 5 16:25:53 2005 From: checker at panix.com (Premise Checker) Date: Thu, 5 May 2005 12:25:53 -0400 (EDT) Subject: [Paleopsych] NYT: Ugly Children May Get Parental Short Shrift Message-ID: Ugly Children May Get Parental Short Shrift New York Times, 5.5.3 http://www.nytimes.com/2005/05/03/health/03ugly.html [This is the most e-mailed article at the NYT today. Is this really surprising news?] 
By NICHOLAS BAKALAR Parents would certainly deny it, but Canadian researchers have made a startling assertion: parents take better care of pretty children than they do ugly ones. Researchers at the University of Alberta carefully observed how parents treated their children during trips to the supermarket. They found that physical attractiveness made a big difference. The researchers noted if the parents belted their youngsters into the grocery cart seat, how often the parents' attention lapsed and the number of times the children were allowed to engage in potentially dangerous activities like standing up in the shopping cart. They also rated each child's physical attractiveness on a 10-point scale. The findings, not yet published, were presented at the Warren E. Kalbach Population Conference in Edmonton, Alberta. When it came to buckling up, pretty and ugly children were treated in starkly different ways, with seat belt use increasing in direct proportion to attractiveness. When a woman was in charge, 4 percent of the homeliest children were strapped in compared with 13.3 percent of the most attractive children. The difference was even more acute when fathers led the shopping expedition - in those cases, none of the least attractive children were secured with seat belts, while 12.5 percent of the prettiest children were. Homely children were also more often out of sight of their parents, and they were more often allowed to wander more than 10 feet away. Age - of parent and child - also played a role. Younger adults were more likely to buckle their children into the seat, and younger children were more often buckled in. Older adults, in contrast, were inclined to let children wander out of sight and more likely to allow them to engage in physically dangerous activities. Although the researchers were unsure why, good-looking boys were usually kept in closer proximity to the adults taking care of them than were pretty girls. The researchers speculated that girls might be considered more competent and better able to act independently than boys of the same age. The researchers made more than 400 observations of child-parent interactions in 14 supermarkets. Dr. W. Andrew Harrell, executive director of the Population Research Laboratory at the University of Alberta and the leader of the research team, sees an evolutionary reason for the findings: pretty children, he says, represent the best genetic legacy, and therefore they get more care. Not all experts agree. Dr. Frans de Waal, a professor of psychology at Emory University, said he was skeptical. "The question," he said, "is whether ugly people have fewer offspring than handsome people. I doubt it very much. If the number of offspring are the same for these two categories, there's absolutely no evolutionary reason for parents to invest less in ugly kids." Dr. Robert Sternberg, professor of psychology and education at Yale, said he saw problems in Dr. Harrell's method and conclusions, for example, not considering socioeconomic status. "Wealthier parents can feed, clothe and take care of their children better due to greater resources," Dr. Sternberg said, possibly making them more attractive. "The link to evolutionary theory is speculative." But Dr. Harrell said the importance of physical attractiveness "cuts across social class, income and education." "Like lots of animals, we tend to parcel out our resources on the basis of value," he said. "Maybe we can't always articulate that, but in fact we do it. 
There are a lot of things that make a person more valuable, and physical attractiveness may be one of them." From checker at panix.com Thu May 5 16:26:22 2005 From: checker at panix.com (Premise Checker) Date: Thu, 5 May 2005 12:26:22 -0400 (EDT) Subject: [Paleopsych] Wired News: Augmenting the Animal Kingdom Message-ID: Augmenting the Animal Kingdom http://wired.com/news/print/0,1294,67349,00.html By [21]Lakshmi Sandhana 02:00 AM May. 03, 2005 PT Natural evolution has produced the eye, butterfly wings and other wonders that would put any inventor to shame. But who's to say evolution couldn't be improved with the help of a little technology? So argues James Auger in his controversial and sometimes unsettling book, Augmented Animals. A designer and former research associate with MIT Media Lab Europe, [23]Auger envisions animals, birds, reptiles and even fish becoming appreciative techno-geeks, using specially engineered gadgets to help them overcome their evolutionary shortcomings, promote their chances of survival or just simply lead easier and more comfortable lives. On tap for the future: Rodents zooming around with night-vision survival goggles, squirrels hoarding nuts using GPS locators and fish armed with metal detectors to avoid the angler's hook. Auger's current ambitions are relatively modest. He's developing an LED light that aims to translate tail wagging into plain English. The device fits on a dog's tail, and flashes text messages when the tail waves through the air. He plans to have a working product on display at [26]Harrods in London by September. "I'm serious about the ideas behind the products," says Auger. "I think that the fact that some of them could be realized means that as concepts they tread the scary line between fact and fiction and therefore are taken a little more seriously. If one person in a hundred is inspired to think about the philosophical issues behind the ideas and the other 99 read it like Calvin and Hobbes, I'd consider that a success." Auger admits that his ideas are mostly conceptual in regard to animals living in the wild. But for tame and domesticated companions, some may not be so far-fetched. For example, a bird cage could be built using existing aerodynamic testing technology that might give captive birds the illusion of long-distance flight. And odor respirators could filter out undesirable smells for dogs and other animals with highly developed olfactory senses. Technology augmentations have already been tried in agribusiness, where an animal's happiness can lead directly to bigger profits. A few years ago, farm researchers tried fitting hens with [29]red plastic contact lenses to reduce aggression caused by tight caging and overcrowding. The idea was quickly [30]dropped when it was found to cause more problems than it solved. Future technologies, though, could yield fruit. For example, some theorists have floated a Matrix-like scenario that would use direct stimulation of the brain to fool livestock about the reality of their living conditions. "To offset the cruelty of factory-farming, routine implants of smart microchips in the pleasure centers may be feasible," says [31]David Pearce, associate editor of the [32]Journal of Evolution and Technology. "Since there is no physiological tolerance to pure pleasure, factory-farmed animals could lead a lifetime of pure bliss instead of misery. Unnatural? Yes, but so is factory farming. Immoral? No, certainly not compared to the terrible suffering we inflict on factory-farmed animals today." 
Not everyone agrees that fitting animals with invasive and experimental gadgetry is desirable, or even ethical. Jeffrey R. Harrow, author of [33]The Harrow Technology Report, doesn't think the idea of augmenting animals is a good one. "Any time we mess with nature's evolutionary process we run the very real risk of changing things for the worse since we have very limited scope in determining the longer term results," Harrow says. "With the possible exception of endangered species and probably not even those because our modifications would by definition change the species, we must be exceedingly careful or we might change our biosphere in ways later generations might abhor." If the debate over animal augmentation is still in its infancy, it will likely only grow along with advances in technology. Ultimately, some theorists argue, humans may have to decide whether they have a moral duty to help animals cross the divide that separates the species by giving them the ability to acquire higher mental functions -- a theme explored in apocalyptic films such as Planet of the Apes and The Day of the Dolphin. "With children, the insane and the demented we are obliged, when we can, to help these 'disabled citizens' to achieve or regain their full self-determination," says [34]Dr. James J. Hughes, executive director of the [35]Institute for Ethics and Emerging Technologies and author of Citizen Cyborg. "We have the same responsibility to enhance the intelligence and communication abilities of great apes, and possibly also of dolphins and elephants, when we have the means to do so. Once they are sufficiently enhanced, they can make decisions for themselves, including removing their augmentation." References 21. http://wired.com/news/feedback/mail/1,2330,0-603-67349,00.html 22. http://www.wired.com/news/technology/0,1282,67349,00.html 23. http://www.auger-loizeau.com/ 26. http://www.harrods.com/msib21 29. http://www.upc-online.org/RedLens.html 30. http://www.upc-online.org/s96redlens.html 31. http://www.hedweb.com/ 32. http://jetpress.org/ 33. http://www.TheHarrowGroup.com/ 34. http://www.changesurfer.com/Hughes.html 35. http://ieet.org/ From checker at panix.com Thu May 5 16:26:36 2005 From: checker at panix.com (Premise Checker) Date: Thu, 5 May 2005 12:26:36 -0400 (EDT) Subject: [Paleopsych] NY Press: Matt Taibbi: Flathead: The peculiar genius of Thomas L. Friedman. Message-ID: New York's Premier Alternative Newspaper. Arts, Music, Food, Movies and Opinion http://www.nypress.com/18/16/news&columns/taibbi.cfm Vol 18 - Issue 18 - May 4-10, 2005 I think it was about five months ago that Press editor Alex Zaitchik whispered to me in the office hallway that Thomas Friedman had a new book coming out. All he knew about it was the title, but that was enough; he approached me with the chilled demeanor of a British spy who has just discovered that Hitler was secretly buying up the world's manganese supply. Who knew what it meant - but one had to assume the worst. "It's going to be called The Flattening," he whispered. Then he stood there, eyebrows raised, staring at me, waiting to see the effect of the news when it landed. I said nothing. 
It turned out Alex had bad information; the book that ultimately came out would be called The World Is Flat. It didn't matter. Either version suggested the same horrifying possibility. Thomas Friedman in possession of 500 pages of ruminations on the metaphorical theme of flatness would be a very dangerous thing indeed. It would be like letting a chimpanzee loose in the NORAD control room; even the best-case scenario is an image that could keep you awake well into your 50s. So I tried not to think about it. But when I heard the book was actually coming out, I started to worry. Among other things, I knew I would be asked to write the review. The usual ratio of Friedman criticism is 2:1, i.e., two human words to make sense of each single word of Friedmanese. Friedman is such a genius of literary incompetence that even his most innocent passages invite feature-length essays. I'll give you an example, drawn at random from The World Is Flat. On page 174, Friedman is describing a flight he took on Southwest Airlines from Baltimore to Hartford, Connecticut. (Friedman never forgets to name the company or the brand name; if he had written The Metamorphosis, Gregor Samsa would have awoken from uneasy dreams in a Sealy Posturepedic.) Here's what he says: I stomped off, went through security, bought a Cinnabon, and glumly sat at the back of the B line, waiting to be herded on board so that I could hunt for space in the overhead bins. Forget the Cinnabon. Name me a herd animal that hunts. Name me one. This would be a small thing were it not for the overall pattern. Thomas Friedman does not get these things right even by accident. It's not that he occasionally screws up and fails to make his metaphors and images agree. It's that he always screws it up. He has an anti-ear, and it's absolutely infallible; he is a Joyce or a Flaubert in reverse, incapable of rendering even the smallest details without genius. The difference between Friedman and an ordinary bad writer is that an ordinary bad writer will, say, call some businessman a shark and have him say some tired, uninspired piece of dialogue: Friedman will have him spout it. And that's guaranteed, every single time. He never misses. On an ideological level, Friedman's new book is the worst, most boring kind of middlebrow horseshit. If its literary peculiarities could somehow be removed from the equation, The World Is Flat would appear as no more than an unusually long pamphlet replete with the kind of plug-filled, free-trader leg-humping that passes for thought in this country. It is a tale of a man who walks 10 feet in front of his house armed with a late-model Blackberry and comes back home five minutes later to gush to his wife that hospitals now use the internet to outsource the reading of CAT scans. Man flies on planes, observes the wonders of capitalism, says we're not in Kansas anymore. (He actually says we're not in Kansas anymore.) That's the whole plot right there. If the underlying message is all that interests you, read no further, because that's all there is. It's impossible to divorce The World Is Flat from its rhetorical approach. It's not for nothing that Thomas Friedman is called "the most important columnist in America today." That it's Friedman's own colleague at the New York Times (Walter Russell Mead) calling him this, on the back of Friedman's own book, is immaterial. Friedman is an important American. He is the perfect symbol of our culture of emboldened stupidity. Like George Bush, he's in the reality-making business. 
In the new flat world, argument is no longer a two-way street for people like the president and the country's most important columnist. You no longer have to worry about actually convincing anyone; the process ends when you make the case. Things are true because you say they are. The only thing that matters is how sure you sound when you say it. In politics, this allows America to invade a castrated Iraq in self-defense. In the intellectual world, Friedman is now probing the outer limits of this trick's potential, and it's absolutely perfect, a stroke of genius, that he's choosing to argue that the world is flat. The only thing that would have been better would be if he had chosen to argue that the moon was made of cheese. And that's basically what he's doing here. The internet is speeding up business communications, and global labor markets are more fluid than ever. Therefore, the moon is made of cheese. That is the rhetorical gist of The World Is Flat. It's brilliant. Only an America-hater could fail to appreciate it. Start with the title. The book's genesis is a conversation Friedman has with Nandan Nilekani, the CEO of Infosys. Nilekani casually mutters to Friedman: "Tom, the playing field is being leveled." To you and me, an innocent throwaway phrase - the level playing field being, after all, one of the most oft-repeated stock ideas in the history of human interaction. Not to Friedman. Ten minutes after his talk with Nilekani, he is pitching a tent in his company van on the road back from the Infosys campus in Bangalore: As I left the Infosys campus that evening along the road back to Bangalore, I kept chewing on that phrase: "The playing field is being leveled." What Nandan is saying, I thought, is that the playing field is being flattened... Flattened? Flattened? My God, he's telling me the world is flat! This is like three pages into the book, and already the premise is totally fucked. Nilekani said level, not flat. The two concepts are completely different. Level is a qualitative idea that implies equality and competitive balance; flat is a physical, geographic concept that Friedman, remember, is openly contrasting - ironically, as it were - with Columbus's discovery that the world is round. Except for one thing. The significance of Columbus's discovery was that on a round earth, humanity is more interconnected than on a flat one. On a round earth, the two most distant points are closer together than they are on a flat earth. But Friedman is going to spend the next 470 pages turning the "flat world" into a metaphor for global interconnectedness. Furthermore, he is specifically going to use the word round to describe the old, geographically isolated, unconnected world. "Let me... share with you some of the encounters that led me to conclude that the world is no longer round," he says. He will literally travel backward in time, against the current of human knowledge. To recap: Friedman, imagining himself Columbus, journeys toward India. Columbus, he notes, traveled in three ships; Friedman "had Lufthansa business class." When he reaches India - Bangalore to be specific - he immediately plays golf. His caddy, he notes with interest, wears a cap with the 3M logo. Surrounding the golf course are billboards for Texas Instruments and Pizza Hut. The Pizza Hut billboard reads: "Gigabites of Taste." Because he sees a Pizza Hut ad on the way to a golf course, something that could never happen in America, Friedman concludes: "No, this definitely wasn't Kansas." 
After golf, he meets Nilekani, who casually mentions that the playing field is level. A nothing phrase, but Friedman has traveled all the way around the world to hear it. Man travels to India, plays golf, sees Pizza Hut billboard, listens to Indian CEO mutter small talk, writes 470-page book reversing the course of 2000 years of human thought. That he misattributes his thesis to Nilekani is perfect: Friedman is a person who not only speaks in malapropisms, he also hears malapropisms. Told level; heard flat. This is the intellectual version of Far Out Space Nuts, when NASA repairman Bob Denver sets a whole sitcom in motion by pressing "launch" instead of "lunch" in a space capsule. And once he hits that button, the rocket takes off. And boy, does it take off. Predictably, Friedman spends the rest of his huge book piling one insane image on top of the other, so that by the end - and I'm not joking here - we are meant to understand that the flat world is a giant ice-cream sundae that is more beef than sizzle, in which everyone can fit his hose into his fire hydrant, and in which most but not all of us are covered with a mostly good special sauce. Moreover, Friedman's book is the first I have encountered, anywhere, in which the reader needs a calculator to figure the value of the author's metaphors. God strike me dead if I'm joking about this. Judge for yourself. After the initial passages of the book, after Nilekani has forgotten Friedman and gone back to interacting with the sane, Friedman begins constructing a monstrous mathematical model of flatness. The baseline argument begins with a lengthy description of the "ten great flatteners," which is basically a highlight reel of globalization tomahawk dunks from the past two decades: the collapse of the Berlin Wall, the Netscape IPO, the pre-Y2K outsourcing craze, and so on. Everything that would give an IBM human resources director a boner, that's a flattener. The catch here is that Flattener #10 is new communications technology: "Digital, Mobile, Personal, and Virtual." These technologies Friedman calls "steroids," because they are "amplifying and turbocharging all the other flatteners." According to the mathematics of the book, if you add an IPac to your offshoring, you go from running to sprinting with gazelles and from eating with lions to devouring with them. Although these 10 flatteners existed already by the time Friedman wrote The Lexus and the Olive Tree - a period of time referred to in the book as Globalization 2.0, with Globalization 1.0 beginning with Columbus - they did not come together to bring about Globalization 3.0, the flat world, until the 10 flatteners had, with the help of the steroids, gone through their "Triple Convergence." The first convergence is the merging of software and hardware to the degree that makes, say, the Konica Minolta Bizhub (the product featured in Friedman's favorite television commercial) possible. The second convergence came when new technologies combined with new ways of doing business. The third convergence came when the people of certain low-wage industrial countries - India, Russia, China, among others - walked onto the playing field. Thanks to steroids, incidentally, they occasionally are "not just walking" but "jogging and even sprinting" onto the playing field. Now let's say that the steroids speed things up by a factor of two. It could be any number, but let's be conservative and say two. 
The whole point of the book is to describe the journey from Globalization 2.0 (Friedman's first bestselling book) to Globalization 3.0 (his current bestselling book). To get from 2.0 to 3.0, you take 10 flatteners, and you have them converge - let's say this means squaring them, because that seems to be the idea - three times. By now, the flattening factor is about a thousand. Add a few steroids in there, and we're dealing with a flattening factor somewhere in the several thousands at any given page of the book. We're talking about a metaphor that mathematically adds up to a four-digit number. If you're like me, you're already lost by the time Friedman starts adding to this numerical jumble his very special qualitative descriptive imagery. For instance: And now the icing on the cake, the ubersteroid that makes it all mobile: wireless. Wireless is what allows you to take everything that has been digitized, made virtual and personal, and do it from anywhere. Ladies and gentlemen, I bring you a Thomas Friedman metaphor, a set of upside-down antlers with four thousand points: the icing on your uber-steroid-flattener-cake! Let's speak Friedmanese for a moment and examine just a few of the notches on these antlers (Friedman, incidentally, measures the flattening of the world in notches, i.e. "The flattening process had to go another notch"; I'm not sure where the notches go in the flat plane, but there they are.) Flattener #1 is actually two flatteners, the collapse of the Berlin Wall and the spread of the Windows operating system. In a Friedman book, the reader naturally seizes up in dread the instant a suggestive word like "Windows" is introduced; you wince, knowing what's coming, the same way you do when Leslie Nielsen orders a Black Russian. And Friedman doesn't disappoint. His description of the early 90s: The walls had fallen down and the Windows had opened, making the world much flatter than it had ever been - but the age of seamless global communication had not yet dawned. How the fuck do you open a window in a fallen wall? More to the point, why would you open a window in a fallen wall? Or did the walls somehow fall in such a way that they left the windows floating in place to be opened? Four hundred and 73 pages of this, folks. Is there no God? 
Friedman argues that in the last few years, while we were distracted by Osama Bin Laden's transformation of the political landscape, a whole new phase of globalization was taking shape. Fueled by Internet-friendly software and cheap fiber optics, it features the fine-grained and far-flung division of data-related labor, often with little need for hierarchical, centralized control; and it subjects yesterday's powerhouses to competition from upstarts. "Globalization 3.0 is shrinking the world from a size small to a size tiny and flattening the playing field at the same time," bringing a "newfound power for individuals to collaborate and compete globally." This theme will get the book read in business class, but the reason leftists back in coach should read it has more to do with Osama Bin Laden's transformation of the political landscape. Islamist terrorism has been a godsend to the American right, especially in foreign policy. President Bush has sold a Manichaean master narrative that fuses neoconservativism with paleoconservative hawkism, the unifying upshot being the importance of invading countries and of disregarding, if not subverting, multilateral institutions. If the left is to develop a rival narrative, it will have to honestly address the realities of both globalization and terrorism. Friedman's book portrays both acutely--but that's not the only reason it's essential reading for the people it will most aggravate. It also contains the ingredients of a powerful liberal narrative, one that harnesses the logic of globalization to counter Bush's rhetoric in foreign and, for that matter, domestic policy. Part of this narrative Friedman develops, and part of it he leaves undeveloped and might even reject as too far left. But so what? In a flat world, Pulitzer Prize-winning New York Times columnists don't hand down stone tablets from mountaintops. They just start conversations that ripple through webzines and into the decentralized, newly influential blogosphere. It's kind of like open-source software, one of Friedman's examples of how easily divisions of digital labor can arise: Friedman writes some Friedman code, and left-of-Friedman liberals write some left-of-Friedman code, and eventually an open-source liberal narrative may coalesce. Feel empowered? Let's get cracking! These days hardly anyone accepts the label "anti-globalization." Most leftists now grant that you can't stop the globalization juggernaut; the best you can do is guide it. Friedman's less grim view suggests that, if you look at things from the standpoint of humanity as a whole--a standpoint many leftists purport to hold--globalization may actually be a good thing. He shows us some of globalization's beneficiaries--such as Indians who take "accent neutralization" classes and who, so far as I can tell, are as decent and worthy as the American airline reservation clerks and tech-support workers whose jobs they're taking (and who seem to prefer "exploitation" to nonexploitation). What's more, even as some Americans are losing, other Americans are winning, via cheaper airline tickets, more tech support, whatever. So, with net gains outweighing net losses, it's a non-zero-sum game, with a positive-sum outcome--a good thing on balance, at least from a global moral standpoint. (I've [27]argued that this is the basic story of history: Technological evolution allows the playing of more complex, more far-flung non-zero-sum games, and political structures adapt to this impetus.) 
Even globalization's downsides--such as displaced American workers--can have an upside for liberals in political terms. A churning workforce strengthens the case for the kind of safety net that Democrats champion and Republicans resist. (Globalization-induced jitters may help explain why President Bush's plan to make Social Security less secure hasn't captured the nation's imagination.) Friedman outlines an agenda of "compassionate flatism" that includes portable, subsidized health care, wage insurance, and subsidies for college and vocational school. You can argue about the details, and you can push them to the left. (He notes that corporations like to put offices and factories in countries with universal health care.) But this is clearly a Democratic agenda, and, as more and more white-collar jobs move abroad, its appeal to traditionally Republican voters should grow. Globalization's domestic disruptions can also be softened by global institutions. As the sociologist Douglas Massey argues in his just-published liberal manifesto Return of the L Word, the World Trade Organization, though reviled on the far left as a capitalist tool, could, with American leadership, use its clout to enforce labor standards abroad that are already embraced by the U.N.'s toothless International Labor Organization. For example: the right of workers everywhere to bargain collectively. (Workers of the world unite.) Friedman doesn't emphasize this sort of leftish global governance. Apparently he thinks Globalization 3.0 will enervate international institutions as much as national ones. The WTO will "become less important" because globalization will "be increasingly driven by the individuals who understand the flat world." Time will tell. My own view is that a flat world can help American liberals network with like-minded people in other countries to shape nascent international bodies. (Massey shows that the WTO, in response to left-wing feedback, has grown more receptive to environmentalist constraints on trade.) But the main leftward amendment to Friedman's source code I'd make is in a different realm of foreign policy. As Microsoft said of Sun's Java, I'd like to "embrace and extend" his belief that globalization is conducive to peace and freedom. Friedman persuasively updates his Lexus-and-the-Olive-Tree argument that economic interdependence makes war costlier for nations and hence less likely. He's heard the counterargument--"That's what they said before World War I!"--and he concedes that a big war could happen. But he shows that the pre-World War I era didn't have this kind of interdependence--the fine-grained and far-flung division of labor orchestrated by Toyota, Wal-Mart, et al. This is "supply chaining"--"collaborating horizontally--among suppliers, retailers, and customers--to create value." For example: The hardware in a Dell Inspiron 600m laptop comes from factories in the Philippines, Costa Rica, Malaysia, China, South Korea, Taiwan, Germany, Japan, Mexico, Thailand, Singapore, Indonesia, India, and Israel; the software is designed in America and elsewhere. The corporations that own or operate these factories are based in the United States, China, Taiwan, Germany, South Korea, Japan, Ireland, Thailand, Israel, and Great Britain. And Michael Dell personally knows their CEOs--a kind of relationship that, multiplied across the global web of supply chains, couldn't hurt when tensions rise between, say, China and the United States. 
Friedman argues plausibly that global capitalism dampened the India-Pakistan crisis of 2002, when a nuclear exchange was so thinkable that the United States urged Americans to leave India. Among the corporate feedback the Indian government got in midcrisis was a message from United Technologies saying that it had started looking for more stable countries in which to house mission-critical operations. The government toned down its rhetoric. Also plausibly, Friedman argues that Globalization 3.0 rewards inter-ethnic tolerance and punishes tribalism. "If you want to have a modern complex division of labor, you have to be able to put more trust in strangers." Certainly nations famous for fundamentalist intolerance--e.g., Saudi Arabia--tend not to be organically integrated into the global economy. Peace and universal brotherhood--it almost makes globalization sound like a leftist's dream come true. But enough embracing--it's time to extend! Time to use the logic of globalization to attack Bush's foreign policy. Like Friedman, I accept Bush's premise that spreading political freedom is both morally good and good for America's long-term national security. But is Bush's instinctive means to that end--invading countries that aren't yet free--really the best approach? Friedman's book fortified my belief that the answer is no. Friedman, unlike many liberals, has long appreciated that, more than ever, economic liberty encourages political liberty. As statist economies have liberalized, this linkage has worked faster in some cases (South Korea, Taiwan) than in others (China), but it works at some speed just about everywhere. And consider the counterexamples, the increasingly few nations that have escaped fine-grained penetration by market forces. They not only tend to be authoritarian; they often flout international norms, partly because their lack of economic engagement makes their relationship to the world relatively zero-sum, leaving them little incentive to play nicely. Friedman writes, "Since Iraq, Syria, south Lebanon, North Korea, Pakistan, Afghanistan, and Iran are not part of any major global supply chains, all of them remain hot spots that could explode at any time." That list includes the last country Bush invaded and the two countries atop his prospective invasions list. It makes you wonder: With all due respect for carnage, mightn't it be easier to draw these nations into the globalized world and let capitalism work its magic (while supplementing that magic by using nonmilitary policy levers to encourage democratic reform)? This is one paradox of "neoconservative" foreign policy: It lacks the conservative's faith in the politically redeeming power of markets. Indeed, Bush, far from trying to lure authoritarians into the insidiously antiauthoritarian logic of capitalism, has tried to exclude them from it. Economically, he's all stick and no carrot. (Of Iran he said, "We've sanctioned ourselves out of influence," oblivious to the fact that removing sanctions can be an incentive.) Of course, if you took this approach--used trade, aid, and other forms of what Joseph Nye calls "soft power" to globalize authoritarian nations and push them toward freedom--hyper-tyrannies like Saddam Hussein's Iraq would be the last dominoes to fall. More promising dominoes would include Egypt, even Saudi Arabia. But according to neocon reverse-domino theory, it only takes one domino. And it's true that in a "flattened" world, dominoes can fall fast once they get started. 
Internet and satellite TV let people anywhere see what people everywhere are doing without relying on their government's version of events. ("Peer-to-peer," you might call it.) Much of the inspiration for Lebanon's "cedar revolution" came from watching Georgia's Rose Revolution and then Ukraine's Orange Revolution (on Al Jazeera). And Palestinian aspirations to democracy were nourished by Israel's televised parliament--one reason the ground for democracy was fertile when Yasser Arafat died. So, was the Iraq invasion really an essential domino-feller, given the increasing contagion of liberty and the various nonmilitary levers with which we can encourage it? It would be one thing if Bush had tried those levers and failed--systematically deployed trade and aid and other tools against authoritarianism. But for him soft power was a convenient afterthought. He didn't renounce America's longstanding attraction to authoritarian stability and start nudging Egypt et al., toward democracy (as many liberals had long favored) until he needed a cosmic vision of global democracy to justify an unexpectedly messy war. Friedman, of course, supported the war. And that's one reason some leftists will resist using this book as food for thought. But he supported the war reluctantly, and he supported it for the best reason, the reason Bush settled on retrospectively after most of his other reasons had collapsed: to create a market democracy in the Arab world. Friedman has long seen, and highlights in this book, that the same microelectronic forces that empower Indian software writers and lubricate global supply chains also empower terrorists and strengthen their networks; and therefore that, 10 or 20 years down the road, we can't afford to have whole nations full of potential terrorists--young people with no legitimate outlet for their economic and political energies. Many liberals who opposed the Iraq war don't appreciate this fact. In the long run that's probably a deeper misjudgment than the one liberal Iraq hawks are accused of having made. (And I say that as one of their accusers.) Anyway, liberals who supported the Iraq war look less crazy today than they did three months ago. The key question now is which ones appreciate how technology is rendering such adventures less necessary (and more counterproductive--but don't get me started on that sermon). Friedman, during his recent Charlie Rose whistle-stop, noted the importance of Ukraine's example for Lebanon, a welcome corrective to the common Iraq-hawk line that good things in the Middle East flow exclusively from Iraq's elections. For this and other reasons I'm tentatively counting him in, hoping he'll sign onto this new source code: In a flat world, soft power is more powerful than ever. In any event, selling this lefty, peacenik message to Friedman isn't as improbable as selling it to some lefty peaceniks, because buying the message means coming fully to terms with globalization--not just granting its inevitability but appreciating its [28]potential. The Naderite left reviled The Lexus and the Olive Tree for what they took to be its Panglossian depiction of globalization as a force of nature. (In fact, the book spends lots of time on globalization's dark side, as does The World Is Flat). But, seven years later, Friedman's early depiction of globalization's power--good and bad--looks prescient. And with this book he's shown how and why globalization has now shifted into warp drive. 
Meanwhile, the main achievement of Naderite nationalists has been to put George Bush in the White House. If forced to choose between the two--and, in a sense, liberals are--where would you look for inspiration? Related in Slate _________________________________________________________________ In 2002, David Plotz [29]assessed Friedman, the columnist and presumptive diplomat-by-newsprint. Jacob Weisberg [30]reviewed Friedman's The Lexus and the Olive Tree when it arrived in 1999. In 2001, Robert Wright gave Slate readers [31]dispatches from Davos. In January of this year, Samuel Loewenberg delivered [32]dispatches from "the Anti-Davos." Robert Wright, a visiting fellow at Princeton University's Center for Human Values and a senior fellow at the New America Foundation, runs the Web site [33]meaningoflife.tv and is the author of [34]The Moral Animal and [35]Nonzero: The Logic of Human Destiny. Robert Wright References 23. http://slate.msn.com/id/2116914/ 25. http://slate.msn.com/id/2116899/#ContinueArticle 26. http://ad.doubleclick.net/jump/slate.homepage/slate;kw=slate;sz=300x250;ord=5647? 27. http://www.nonzero.org/index.htm 28. http://slate.msn.com/id/2116899/sidebar/2116900/ 29. http://slate.msn.com/id/2062905/ 30. http://slate.msn.com/id/25365/ 31. http://slate.msn.com/id/97787/entry/97788/ 32. http://slate.msn.com/id/2112679/entry/2112681/ 33. http://www.meaningoflife.tv/ 34. http://bn.bfast.com/booklink/click?sourceid=412995&ISBN=0679763996 35. http://www.nonzero.org/ From checker at panix.com Thu May 5 16:27:24 2005 From: checker at panix.com (Premise Checker) Date: Thu, 5 May 2005 12:27:24 -0400 (EDT) Subject: [Paleopsych] Wired: (Flynn Effect): Dome Improvement Message-ID: Dome Improvement http://www.wired.com/wired/archive/13.05/flynn_pr.html First some remarks from From: Hal Finney Date: Tue, 3 May 2005 11:03:41 -0700 (PDT) To: extropy-chat at lists.extropy.org except that the article is now available: Wired magazine's new issue has an article on the Flynn Effect, which we have discussed here occasionally. This is probably my favorite Effect, so completely extropian and contradictory to the conventional wisdom. Curmudgeons throughout the ages have complained about the decay of society and how the younger generation is inferior in morals and intelligence to their elders. Likewise modern communications technology is derided: TV is a vast wasteland, video games and movies promote sex and violence. Yet Flynn discovered the astonishing and still little-known fact that intelligence scores have steadily increased for at least the past 100 years. And it's a substantial gain; people who would have been considered geniuses 100 years ago would be merely average today. Perhaps even more surprisingly, the gains cannot be directly attributed to improved education, as the greatest improvements are found in the parts of the test that directly measure abstract reasoning via visual puzzles, not concrete knowledge based on language or mathematical skills. The Wired article (which should be online in a few days) does not have much that is new, but one fact which popped out is that the Effect has not only continued in the last couple of generations, but is increasing. Average IQ gains were 0.31 per year in the 1950s and 60s, but by the 1990s had grown to 0.36 per year. Explanations for the Effect seem to be as numerous as people who have studied it. Flynn himself does not seem to believe that it is real, in the sense that it actually points to increased intelligence. 
I was amused by economist David Friedman's suggestion that it is due to the increased use of Caesarian deliveries allowing for larger head sizes! The Wired article focuses on increased visual stimulation as the catalyst, which seems plausible as part of the story. The article then predicts that the next generation, exposed since babyhood to video games with demanding puzzle solving, mapping and coordination skills, will see an even greater improvement in IQ scores. Sometimes I wonder if the social changes we saw during the 20th century may have been caused or at least promoted by greater human intelligence. It's a difficult thesis to make because you first have to overcome the conventional wisdom that says that the 1900s were a century of human depravity and violence. But if you look deeper and recognize the tremendous growth of morality and ethical sensitivity in this period (which is what makes us judge ourselves so harshly), you have to ask, maybe it is because people woke up, began to think for themselves, and weren't willing to let themselves be manipulated and influenced as in the past? If so, then this bodes well for the future. --------------now the article: Pop quiz: Why are IQ test scores rising around the globe? (Hint: Stop reading the great authors and start playing Grand Theft Auto.) By Steven Johnson Twenty-three years ago, an American philosophy professor named James Flynn discovered a remarkable trend: Average IQ scores in every industrialized country on the planet had been increasing steadily for decades. Despite concerns about the dumbing-down of society - the failing schools, the garbage on TV, the decline of reading - the overall population was getting smarter. And the climb has continued, with more recent studies showing that the rate of IQ increase is accelerating. Next to global warming and Moore's law, the so-called Flynn effect may be the most revealing line on the increasingly crowded chart of modern life - and it's an especially hopeful one. We still have plenty of problems to solve, but at least there's one consolation: Our brains are getting better at problem-solving. Unless you happen to think the very notion of IQ is bunk. Anyone who has read Stephen Jay Gould's The Mismeasure of Man or Howard Gardner's work on multiple intelligences or any critique of The Bell Curve is liable to dismiss IQ as merely phrenology updated, a pseudoscience fronting for a host of racist and elitist ideologies that dare not speak their names. These critics attack IQ itself - or, more precisely, what intelligence scholar Arthur Jensen called g, a measure of underlying "general" intelligence. Psychometricians measure g by performing a factor analysis of multiple intelligence tests and extracting a pattern of correlation between the measurements. (IQ is just one yardstick.) Someone with greater general intelligence than average should perform better on a range of different tests. Unlike some skeptics, James Flynn didn't just dismiss g as statistical tap dancing. He accepted that something real was being measured, but he came to believe that it should be viewed along another axis: time. You can't just take a snapshot of g at one moment and make sense of it, Flynn says. You have to track its evolution. He did just that. Suddenly, g became much more than a measure of mental ability. It revealed the rising trend line in intelligence test scores. And that, in turn, suggested that something in the environment - some social or cultural force - was driving the trend. 
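To make the factor-analysis step concrete: the following is a minimal sketch in Python, not anything from Flynn, Jensen, or the Wired piece, and every number in it is invented. It simulates five correlated subtests and pulls out a g-like general factor as the first principal component of their correlation matrix.

import numpy as np

rng = np.random.default_rng(0)
n_people = 1000

# One underlying ability plus test-specific noise yields correlated subtests.
g_true = rng.normal(0.0, 1.0, n_people)
loadings = np.array([0.8, 0.7, 0.6, 0.75, 0.65])   # how strongly each subtest reflects g
noise = rng.normal(0.0, 1.0, (n_people, loadings.size))
scores = g_true[:, None] * loadings + noise * np.sqrt(1.0 - loadings**2)

# Miniature factor analysis: the first principal component of the
# correlation matrix stands in for g.
corr = np.corrcoef(scores, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)            # eigenvalues in ascending order
g_estimate = scores @ eigvecs[:, -1]               # component with the largest eigenvalue

print("share of variance on the first factor:", eigvals[-1] / eigvals.sum())
print("correlation of estimated g with true ability:",
      abs(np.corrcoef(g_estimate, g_true)[0, 1]))  # abs(): an eigenvector's sign is arbitrary

With loadings in the 0.6-0.8 range the first factor soaks up roughly three-fifths of the variance, and the recovered factor tracks the simulated ability closely -- the kind of pattern psychometricians read as evidence for an underlying g.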
Significant intellectual breakthroughs - to paraphrase the John Lennon song - are what happen when you're busy making other plans. So it was with Flynn and his effect. He left the US in the early 1960s to teach moral philosophy at the University of Otago in New Zealand. In the late '70s, he began exploring the intellectual underpinnings of racist ideologies. "And I thought: Oh, I can do a bit about the IQ controversies," he says. "And then I saw that Arthur Jensen, a scholar of high repute, actually thought that blacks on average were genetically inferior - which was quite a shock. I should say that Jensen was beyond reproach - he's certainly not a racist. And so I thought I'd better look into this." This inquiry led to a 1980 book, Race, IQ, and Jensen, that posited an environmental - not genetic - explanation for the black-white IQ gap. After finishing the book, Flynn decided that he would look for evidence that blacks were gaining on whites as their access to education increased, and so he began studying US military records, since every incoming member of the armed forces takes an IQ test. Sure enough, he found that blacks were making modest gains on whites in intelligence tests, confirming his environmental explanation. But something else in the data caught his eye. Every decade or so, the testing companies would generate new tests and re-normalize them so that the average score was 100. To make sure that the new exams were in sync with previous ones, they'd have a batch of students take both tests. They were simply trying to confirm that someone who tested above average on the new version would perform above average on the old, and in fact the results confirmed that correlation. But the data also brought to light another pattern, one that the testing companies ignored. "Every time kids took the new and the old tests, they did better on the old ones," Flynn says. "I thought: That's weird." The testing companies had published the comparative data almost as an afterthought. "It didn't seem to strike them as interesting that the kids were always doing better on the earlier test," he says. "But I was new to the area." He sent his data to the Harvard Educational Review, which dismissed the paper for its small sample size. And so Flynn dug up every study that had ever been done in the US where the same subjects took a new and an old version of an IQ test. "And lo and behold, when you examined that huge collection of data, it revealed a 14-point gain between 1932 and 1978." According to Flynn's numbers, if someone testing in the top 18 percent the year FDR was elected were to time-travel to the middle of the Carter administration, he would score at the 50th percentile. When Flynn finally published his work in 1984, Jensen objected that Flynn's numbers were drawing on tests that reflected educational background. He predicted that the Flynn effect would disappear if one were to look at tests - like the Raven Progressive Matrices - that give a closer approximation of g, by measuring abstract reasoning and pattern recognition and eliminating language altogether. And so Flynn dutifully collected IQ data from all over the world. All of it showed dramatic increases. "The biggest of all were on Ravens," Flynn reports with a hint of glee still in his voice. The trend Flynn discovered in the mid-'80s has been investigated extensively, and there's little doubt he's right. In fact, the Flynn effect is accelerating. US test takers gained 17 IQ points between 1947 and 2001.
The annual gain from 1947 through 1972 was 0.31 IQ point, but by the '90s it had crept up to 0.36. Though the Flynn effect is now widely accepted, its existence has in turn raised new questions. The most fundamental: Why are measures of intelligence going up? The phenomenon would seem to make no sense in light of the evidence that g is largely an inherited trait. We're certainly not evolving that quickly. The classic heritability research paradigm is the twin adoption study: Look at IQ scores for thousands of individuals with various forms of shared genes and environments, and hunt for correlations. This is the sort of chart you get, with 100 being a perfect match and 0 pure randomness:

The same person tested twice: 87
Identical twins raised together: 86
Identical twins raised apart: 76
Fraternal twins raised together: 55
Biological siblings: 47
Parents and children living together: 40
Parents and children living apart: 31
Adopted children living together: 0
Unrelated people living apart: 0

After analyzing these shifting ratios of shared genes and the environment for several decades, the consensus grew, in the '90s, that heritability for IQ was around 0.6 - or about 60 percent. The two most powerful indications of this are at the top and bottom of the chart: Identical twins raised in different environments have IQs almost as similar to each other as the same person tested twice, while adopted children living together - shared environment, but no shared genes - show no correlation. When you look at a chart like that, the evidence for significant heritability looks undeniable. Four years ago, Flynn and William Dickens, a Brookings Institution economist, proposed another explanation, one made apparent to them by the Flynn effect. Imagine "somebody who starts out with a tiny little physiological advantage: He's just a bit taller than his friends," Dickens says. "That person is going to be just a bit better at basketball." Thanks to this minor height advantage, he tends to enjoy pickup basketball games. He goes on to play in high school, where he gets excellent coaching and accumulates more experience and skill. "And that sets up a cycle that could, say, take him all the way to the NBA," Dickens says. Now imagine this person has an identical twin raised separately. He, too, will share the height advantage, and so be more likely to find his way into the same cycle. And when some imagined basketball geneticist surveys the data at the end of that cycle, he'll report that two identical twins raised apart share an off-the-charts ability at basketball. "If you did a genetic analysis, you'd say: Well, this guy had a gene that made him a better basketball player," Dickens says. "But the fact is, that gene is making him 1 percent better, and the other 99 percent is that because he's slightly taller, he got all this environmental support." And what goes for basketball goes for intelligence: Small genetic differences get picked up and magnified in the environment, resulting in dramatically enhanced skills. "The heritability studies weren't wrong," Flynn says. "We just misinterpreted them." Dickens and Flynn showed that the environment could affect heritable traits like IQ, but one mystery remained: What part of our allegedly dumbed-down environment is making us smarter? It's not schools, since the tests that measure education-driven skills haven't shown the same steady gains.
It's not nutrition - general improvement in diet leveled off in most industrialized countries shortly after World War II, just as the Flynn effect was accelerating. Most cognitive scholars remain genuinely perplexed. "I find it a puzzle and don't have a compelling explanation," wrote Harvard's Steven Pinker in an email exchange. "I suspect that it's either practice at taking tests or perhaps a large number of disparate factors that add up to the linear trend." Flynn has his theories, though they're still speculative. "For a long time it bothered me that g was going up without an across-the-board increase in other tests," he says. If g measured general intelligence, then a long-term increase should trickle over into other subtests. "And then I realized that society has priorities. Let's say we're too cheap to hire good high school math teachers. So while we may want to improve arithmetical reasoning skills, we just don't. On the other hand, with smaller families, more leisure, and more energy to use leisure for cognitively demanding pursuits, we may improve - without realizing it - on-the-spot problem-solving, like you see with Ravens." When you take the Ravens test, you're confronted with a series of visual grids, each containing a mix of shapes that seem vaguely related to one another. Each grid contains a missing shape; to answer the implicit question posed by the test, you need to pick the correct missing shape from a selection of eight possibilities. To "solve" these puzzles, in other words, you have to scrutinize a changing set of icons, looking for unusual patterns and correlations among them. This is not the kind of thinking that happens when you read a book or have a conversation with someone or take a history exam. But it is precisely the kind of mental work you do when you, say, struggle to program a VCR or master the interface on your new cell phone. Over the last 50 years, we've had to cope with an explosion of media, technologies, and interfaces, from the TV clicker to the World Wide Web. And every new form of visual media - interactive visual media in particular - poses an implicit challenge to our brains: We have to work through the logic of the new interface, follow clues, sense relationships. Perhaps unsurprisingly, these are the very skills that the Ravens tests measure - you survey a field of visual icons and look for unusual patterns. The best example of brain-boosting media may be videogames. Mastering visual puzzles is the whole point of the exercise - whether it's the spatial geometry of Tetris, the engineering riddles of Myst, or the urban mapping of Grand Theft Auto. The ultimate test of the "cognitively demanding leisure" hypothesis may come in the next few years, as the generation raised on hypertext and massively complex game worlds starts taking adult IQ tests. This is a generation of kids who, in many cases, learned to puzzle through the visual patterns of graphic interfaces before they learned to read. Their fundamental intellectual powers weren't shaped only by coping with words on a page. They acquired an intuitive understanding of shapes and environments, all of them laced with patterns that can be detected if you think hard enough. Their parents may have enhanced their fluid intelligence by playing Tetris or learning the visual grammar of TV advertising. But that's child's play compared with Pokémon.
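The Dickens-Flynn multiplier described above -- a small inborn edge recruiting a better environment, which then does most of the work -- is easy to turn into a toy simulation. The sketch below is an illustration only, not their published model; all of its coefficients are invented, and the point is simply that a tiny direct genetic effect plus environmental feedback can reproduce the high correlations seen in twins reared apart.

import numpy as np

rng = np.random.default_rng(1)
n_pairs = 5000

# Each pair of "identical twins reared apart" shares one small inborn edge.
genetic_edge = rng.normal(0.0, 1.0, n_pairs)

def grow(edge):
    skill = 0.1 * edge                       # tiny direct genetic effect
    for _ in range(20):                      # years of feedback
        # a slightly better player draws slightly better coaching, games, practice
        environment = 0.5 * skill + rng.normal(0.0, 0.1, edge.size)
        skill = skill + 0.3 * environment    # the environment builds most of the skill
    return skill

twin_a = grow(genetic_edge)
twin_b = grow(genetic_edge)                  # same genes, independent environmental luck

print("correlation between twins reared apart:", np.corrcoef(twin_a, twin_b)[0, 1])

Run as written, the simulated twins come out correlated in the high .70s -- close to the 0.76 the chart above gives for identical twins raised apart -- even though the direct genetic contribution at any single step is small. That is the sense in which the heritability studies "weren't wrong" but were misinterpreted.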
Contributing editor Steven Johnson (stevenberlinjohnson at earthlink.net) is the author of Everything Bad Is Good for You: How Today's Popular Culture Is Actually Making Us Smarter. From checker at panix.com Thu May 5 16:27:39 2005 From: checker at panix.com (Premise Checker) Date: Thu, 5 May 2005 12:27:39 -0400 (EDT) Subject: [Paleopsych] NYT: Programs That Start When XP Does Message-ID: Programs That Start When XP Does New York Times, 5.5.5 http://www.nytimes.com/2005/05/05/technology/circuits/05askk.html [This is very valuable to anyone who has a computer that has slowed to a crawl. I'll be working on eliminating programs that hog RAM and do nothing for me.] By J.D. BIERSDORFER Programs That Start When XP Does Q. My Windows XP machine takes forever to start up. How can I tell what programs are loading during the start-up process? A. One way to sneak a peek at what programs start up when you turn on your computer is to use the System Configuration Utility that comes with Windows XP. To get to it, go to the Start menu and select Run. In the Run box, type "msconfig" (without the quotation marks) to start the utility. Click the Startup tab in the System Configuration box to see the list of programs that open when Windows starts up, along with check boxes to turn each program off or on during the next start-up. If you aren't sure what some of the listed programs actually do for your PC, you can look them up at the Windows Startup Online Search page ([1]www.windowsstartup.com/wso/search.php) before deciding if you want a program to start up automatically. [2]Microsoft has a page of information about using the System Configuration Utility to troubleshoot start-up problems with your PC at [3]support.microsoft.com/kb/310560/EN-US. Other Web pages offering help for using the utility to sort out your start-up woes include [4]www.netsquirrel.com/msconfig and [5]vlaurie.com/computers2/Articles/startup.htm. A File by Any Name May Not Copy Q. My wife gave me a small U.S.B. flash drive a while back to transport files between my Mac at home and my office computer. The U.S.B. drive seems to be pretty finicky, though, and won't let me use file names with certain characters like slashes. Why is this? A. To make them compatible with both Windows computers and Macintosh systems, just about every U.S.B. flash drive sold these days is formatted with the FAT32 file system. FAT32 is one of the file systems Windows uses to keep track of data stored on a drive, but Macs can also understand it and will display the stored folders and documents when you plug in the drive. [6]Apple's [7]iPod Shuffle music player, which can also function as a U.S.B. drive for toting large files, also comes formatted right out of the box in the FAT32 system. Using certain typographical characters in file names, however, is one thing you can do on a Mac but not on Windows. You can't name files with slashes, brackets, colons, semicolons, asterisks, periods, commas and a few other characters on the FAT32 system. You may get error messages if you try to copy files with such characters in the names from your Mac to the U.S.B. drive - or if you do successfully copy them, the file names may be changed on the U.S.B. drive. If you're using the U.S.B. drive to transfer files between Macs, you can get around finicky FAT32 by rounding up the files you want to transfer from the Mac and creating an archive file with them. In Mac OS X 10.3, click on each file to select it, then go to the File menu and select Create Archive. 
Give the archived file a simple name like Files and let the Mac create it. Then copy the Files.zip archive to the U.S.B. drive. You can also use utility programs like Stuffit to create archive files on older Mac systems. Useful Add-Ons For Firefox Q. Where can I find extension programs for the Firefox Web browser that do things like display the current weather in the browser window? A. Firefox, a free Web browser that comes in versions for Windows, Macintosh and Linux systems, can be easily customized with small extension programs that do things like add dictionary search tools to the browser window or provide controls for your computer's digital-audio software so you can control your music while you surf. The Forecastfox extension, which displays the current weather in the corner of the Firefox window, can be found at [8]forecastfox.mozdev.org. The latest version of Firefox itself can be downloaded at [9]www.mozilla.org/products/firefox, where there's also a link to more Firefox extensions. Circuits invites questions about computer-based technology by e-mail to QandA at nytimes.com. This column will answer questions of general interest, but letters cannot be answered individually. References 1. http://www.windowsstartup.com/wso/search.php 2. http://www.nytimes.com/redirect/marketwatch/redirect.ctx?MW=http://custom.marketwatch.com/custom/nyt-com/html-companyprofile.asp&symb=MSFT 3. http://support.microsoft.com/kb/310560/EN-US 4. http://www.netsquirrel.com/msconfig 5. http://vlaurie.com/computers2/Articles/startup.htm 6. http://www.nytimes.com/redirect/marketwatch/redirect.ctx?MW=http://custom.marketwatch.com/custom/nyt-com/html-companyprofile.asp&symb=AAPL 7. http://tech2.nytimes.com/gst/technology/techsearch.html?st=p&cat=&query=ipod&inline=nyt-classifier 8. http://forecastfox.mozdev.org/ 9. http://www.mozilla.org/products/firefox From checker at panix.com Thu May 5 16:28:00 2005 From: checker at panix.com (Premise Checker) Date: Thu, 5 May 2005 12:28:00 -0400 (EDT) Subject: [Paleopsych] Global Die-Off: The profound misanthropy of the deep ecologists Message-ID: ---------- Forwarded message ---------- Date: Thu, 5 May 2005 08:43:02 -0400 From: "Hughes, James J." Reply-To: World Transhumanist Association Discussion List This piece makes very clear the horrific worldview of the deep ecologists, which accepts the inevitability and desirability of a "global die-off". This is the direct parallel to the Christian Right's view that accepts that everything has to go to shite and then be destroyed in a fiery conflagration before we get back to the Kingdom. Even if "global die-off" were a serious risk, these folks aren't really interested in mobilizing prophylactic answers, because that would just be putting off our inescapable punishment for hubris/sin. - J. ------------------------------------- From - http://www.dissidentvoice.org/Apr05/Bageant0429.htm Background reading - http://dieoff.org/page125.htm Back to the Ancient Future: Chewing Raw Grubs with the "Nutcracker Man" by Joe Bageant www.dissidentvoice.org April 29, 2005 I spent the middle weekend in April with a group of artists and thinkers called the April Fools Group. Put together by Brad Blanton, psychotherapist and creator of "radical honesty" politics and therapy, the three-day meeting was set on a farm down the Shenandoah Valley amid the battlefields and rolling countryside of Newmarket, Virginia.
Brad, a world famous redneck headshrinker, had put together old hippies, theoreticians, musicians, young anarchists, beautiful brilliant women and aging writers to yap, drink and plot against the Bush administration. So when I pulled into Brad's driveway to find him and a fellow named Hank parked in lawn chairs up on the roof with a bottle of bourbon I knew this thing was off to a good start. The gathering was an organizational meeting for Brad Blanton's independent run for the Virginia Seventh District U.S. House of Representatives. Blanton's working slogan is "America needs a good psychiatrist." And we got a lot accomplished in that direction, despite my intellectual flatulence and Brad's orneriness. Any psychotherapist who actually gets people to pay for advice such as "Fuck'em if they can't take a joke" must be called ornery at the very least. And any politician who thinks he can get elected on the basis of extreme honesty, well... Anyway, I came away from the meeting deeply struck by one thing. Every person there seemed to understand and acknowledge the coming global human "die-off." The one that has already begun in places like Africa and will grow into a global event sometime within our lifetimes and/or those of our children. The one that will kill millions of white people. That's right, clean pink little Western World white people like you and me. Nobody in the U.S. seems to be able to deal with or even think about this near certainty, and the few who do are written off as nutcases by the media and the public. Mostly though, it goes unacknowledged. All of which drives me nuts because the now nearly visible end of civilization strikes me as worthy of at least modest discussion. You'd think so. But the mention of it causes my wife to go into, "Oh Joe, can't we talk about something more pleasant?" And talk about causing weird stares and dropped jaws at the office water cooler. Here's the short course: Global die-off of mankind will occur when we run out of energy to support the complex technological grid sustaining modern industrial human civilization. In other words, when the electricity goes out, we are back in the Dark Age, with the Stone Age grunting at us from just around the corner. This will likely happen in 100 years or less, assuming the ecosystem does not collapse first. And you are thinking, "Well ho ho ho! Any other good news Bageant? And how the fock do you know this anyway?" For those willing to contemplate the subject, there is a scientifically supported model of the timeline of our return to Stone Age tribal units. A roadmap to the day when we will be cutting up dog meat with a sharpened cd rom disc in some toxic future canyon. It is called the Olduvai Theory. The Olduvai theory* was first introduced in a scientific paper by petroleum geologist/engineer/anthropologist Richard C. Duncan titled The Peak Of World Oil Production And The Road To The Olduvai Gorge. Duncan ("Dunk") chose the name Olduvai because, among other reasons, "...it is a good metaphor for the Stone Age way of life." It also sounded cool, he confesses. The Olduvai Gorge is a deep cleft in the Serengeti steppe of Tanzania, where Louis and Mary Leakey found the remains of prehistoric hominids, some up to two million years old, and along with the first stone tools, other things such as the skulls of sheep big as draft horses and pigs the size of hippos. 
Also the skull of "Nutcracker Man (Australopithecus boisei), so named because of a set of powerful choppers, teeth so strong they could bust the lug nuts off a truck tire, were he around today to work at the Goodyear Tire Center. As to Nutcracker's "lifestyle" (and we are using the term most generously for a style that had more than adequate pork resources but had not developed a decent pinot grigio to serve with it, or even barbecue sauce for that matter) Dunk says "the Olduvai way of life was and still is a sustainable one -- local, tribal, and solar -- and, for better or worse, our ancestors practiced it for millions of years." Dunk's Olduvai theory provides a modern database support structure for the Malthusian argument. The Olduvai theory uses only a single metric, as defined by "White's Law," and deals with electricity as the most vital expression of other forms of energy such as crude oil or coal. The theory is an inductive one based on world energy and population data, so elegantly simple that any 12th grader can do it, assuming he or she can do multiplication (a risky assumption now that no child has been left behind by our great ownership society). In the Olduvai schema permanent blackouts will occur worldwide around 2030. Industrial Civilization ends when energy availability falls to its 1930 level. Measured as energy use -- energy expended or consumed -- our industrial civilization can be described as a one-time phenomenon, a single pulse waveform of limited duration which flashed out from the caves to outer space, then back to the caves over approximately 100 years. So when was the highpoint of the flash? On the average, world per capita energy use crested around 1977. That was the same year John Travolta made "Saturday Night Fever," which few of us consider much of a highpoint. To make a long story short, there are three intervals of decline in the Olduvai schema: slope, slide and cliff -- each steeper than the previous. Right now we are in the slide headed for the cliff (see http://dieoff.org/page125.htm). After more than a decade no scientist has been able to refute it and even given the flexibility and bias inherent in what passes for common sense in this country, it's still pretty damned hard to argue with. When we do go off the cliff, the Big Die-off will play no favorites, and will happen everywhere more or less simultaneously. But there are some particularly lousy places to be when permanent worldwide electrical blackouts happen. In or near a big city is the worst. You can imagine the, uh, "discomfort" of billions when the electrical grids die and power goes out across the densely packed high-rise buildings surrounded by a million acres of asphalt. People with no work, no heat, no air conditioning, no food, no water. Put on yer Adidas, it is migration time. Wherein bankers, skinheads, little old ladies and taxi drivers swarm like insects toward whatever passes for the countryside by then. Looks like all those survivalists up in North Idaho and Oregon may be right. Personally, I wouldn't want to be in New York or Bombay, or even Toledo when the deal goes down, and in fact want to be as distant from a city as one can get without having to be too far into the woods (of which there will de damned few) to eat my daily requirement of tree bark. Americans busily expanding their lard content to fit the contours of their air-conditioned SUVs are among the chief accelerators of the Big Die-off. 
However, people worldwide assume that the average American is a blind dickhead who wouldn't acknowledge the ecological price of his/her lifestyle if it were branded on their forehead. That assumption is correct. Americans for the most part don't give a twit what kind of world their own children inherit, much less about dolphins, Hottentots, Frenchmen and the approaching desertification of distant places like Kansas. Still, it is reasonable to believe that many powerful people and organizations with all the research capability in the world at their fingertips must understand the future before us. In fact, I am sure some in industry do because even 10 years ago when I used to deal with chemical executives at Monsanto, Zeneca, Dow and other corporations, it was discussed and acknowledged a couple of times over cocktails, and even discussed how to profit from it through genetically engineered non-reproducing seeds that eliminate all the native crops around them. One might also guess that the U.S. president and his cabinet know, and that their solution is to fight for more oil and higher profits, given its increasing scarcity. Even the superficial whoring media is broaching the topic of "peak oil," though mostly for shock entertainment value. I heard an "expert" say the other day that science will solve the peak oil problem, probably through nuclear energy, as if that did not have its own awful implications. Sure buddy. Just like the "Green Revolution" solved the world's agricultural foods problem by poisoning the earth with pesticides and burning two gallons of oil to produce a pint of milk. There is the myth left hanging out there from the old scientific paradigm that science and technology are somehow going to snatch us from the edge of species die-off just in time. Yes, we will be saved by the very science and technology that evolved from, and is completely dependent upon, an energy source that will no longer exist. I think the pundit probably understands that, but like all media and political people, safely assumes the public has the critical thinking capability of a jar of fruit flies. Shut up and watch Survivor! Well hell then. What does keep the American people from looking around them and seeing the obvious? That the earth is a finite thing being used up at exponential rates? Answer: The Spectacle. American capitalism's "media hologram." We no longer have a country, but the artificial spectacle of one. We have a global corporation masquerading electronically, digitally, financially, and legally and every other way as a nation called the "United States of America." The corporation now animates most of us from within through management of the need hierarchy of goods and information. We no longer have citizens. We have consumers, "purchase decision makers" whose most influential act in life consists of choosing a mortgage banker and an NFL team. And a car. The majority of modernized technical humans, Digitus Cathodus Americanus, cannot perceive the hologram because their self-identities were generated by it. It's "reality" to them -- the only one they will know until the hologram collapses with their electrical industrial civilization. By design or not, the hologram's primary effect has been to induce the illusion of a national "value system" through hypnotic repetition of images. Thus profit-seeking enterprises are legitimized as the animating spirit of our identities as individuals and as a nation.
The end result of course is the mass replication of millions of uniform "market segmented consumer identities." Individuality is circumscribed by brand identification. The overall aggregate of brand identification groups is interpreted to be an inherently superior race or nation (worth fighting for to expand the resource base and markets). We no longer have lives, just lifestyles that are defined and expressed through ever expanding (and more profitable) consumption. Net result: The legions of humanity toil to generate the trucks and tofu, munitions, missiles, newspapers, petrochemicals and pizza and millions of tons of ground up cattle sold to fire the furnace of an economic engine that has taken on a life of its own. One that must grow exponentially, devouring everything just to survive. Just to keep from collapsing. And people are taught that it is called "human progress." This mass hallucination generated by this totalist capitalist system, the state as engine of profit, is one thing. Life on a real planet made of dirt and water and flesh both warm and cold-blooded is quite another. Viewed from outside the web of Western illusions -- by, say, an Iraqi citizen or a Filipino Moro -- one finds the economic engine to be driven by unseen death and war and the pillaging of the weak by the powerful. All this is set against the backdrop of explosive human disease, growing starvation, the impending failure of the environment and petroleum based civilization, resulting in the greatest mass extinction event in the history of this planet. The Big Die-off. And in your very lifetime too. Admission is not only free, it is compulsory. One of the hologram's great illusions is that Industrial Civilization is evolutionary -- that it advances forever. Industrial civilization does not evolve. In the overall history of man it is extremely short and completely unsustainable. It is a one-time biological drama that rapidly consumes the necessary physical prerequisites for its own existence, the ecology and resources of the planetary gravity well in which it is trapped. Any good news, for Chrissake? Sort of. We may not become completely extinct. It looks like the earth's immune system is beginning to shake off its infection by the human virus through what appears -- to us viruses at least -- as environmental collapse. But for the sake of discussion, let's assume that extinction through nuclear war and ecological collapse is somehow avoided (nowadays, we're allowed to assume anything we want, regardless of the evidence around us. Just ask any U.S. capitalist free market economist). If what is left after the Big Die-off can still be called a human society, it will be bottomed out at the subsistence level of energy-use. Now that is one ugly booger of a notion to contemplate. What is subsistence level energy-use? In all likelihood it has to do with shitting in the winter darkness at a sustainable 45 indoor degrees. Meanwhile, a cockroach watches, thinking to himself, "What a shame, because at the height of their culture these guys made a damned good peanut butter sandwich." Your attention please. This is your pilot. We have crested in our evolutionary journey and are beginning our descent. Please lock your folding trays, put your seats in an upright position and enjoy the landing. (Captain! Why are there no lights down there at the airport?) It was a helluva crest, that spurt of technological jism by the industrial state toward outer space and impregnation of the moon.
What with Neil Armstrong bouncing around in the lunar dust in his high tech Pillsbury doughboy outfit and all, it added up to about one week of attention by the masses and a lot of dough for government contractors. But as one who within my lifetime witnessed the entire evolution of the space program, and its accompanying nationalistic hoopla about beating the Russians at being the first to fart in the vacuum of space, I am somehow unconvinced it was worth it. I dunno. Maybe my wife is right. Maybe I'm just a goddam crab. Maybe I'm a little resentful because, thanks to the big American suckdown of the planet, I will never have grandchildren. My kids are among that portion of their generation who understand what their lifetimes hold and are not remotely interested in adding to the problem. We weren't always like that. Right after World War II and the advent of the atomic bomb a majority of Americans (67% of those surveyed by Gallup) wanted a cooperative one-world government with all nuclear weapons put under the control of the United Nations. Now you cannot get an American to turn off a light switch to save human civilization. As a friend from Cape Verde once remarked, "Just watching Americans consume things gives me a headache." As for the weak ploy I used to slip into this screed -- Brad Blanton's April Fools Group meeting -- that too left me with a headache. After nearly a fifth of Maker's Mark bourbon on the third night I was hugging everybody in sight and had entered into an agreement with a jazz piano player and an inventor from Wisconsin to start a free love commune together right there in that beautiful valley. (Honest to god, I am not joking. I wish I were.) When I woke up next morning and looked into the mirror at eyes like two bloody pissholes in a snowbank ... and wondering who let that dog crap in my mouth ... well ... let's just say I wasn't experiencing the same sense of brotherly love as the night before. Rather than go into the wretchedness of the next day's grisly recovery, or contemplate what we might possibly find to drink while living in shipping containers during the next Olduvai period, let me share my favorite hangover remedy as a way out of this little box I've written myself into. OK? Bye!

Uncle Joe's Big Die-off Hangover Cure:
* Empty two cans of sardines (skinless packed in water) into a bowl.
* Add two medium-size habanero peppers.
* One squirt of mustard.
* One dash of Tabasco.
* Blend coarsely in a blender. (Cover the blender with six bath towels to keep the noise from cracking your brain and teeth.)
* Spread on toast or crackers and eat.

Your lungs may or may not collapse briefly and there may be temporary blindness. Not to worry. After your eyes quit watering enough to see, either endorphins associated with hot peppers will kick in, or subsequent fiery bites of the cure will be enough to distract you from the headache until they do.

Joe Bageant is a writer and magazine editor living in Winchester, Virginia. He may be contacted at bageantjb at netscape.net. Free downloadable pdf files of his works are archived at www.coldtype.net.

FOOTNOTE * The Olduvai theory postulates that electricity is the essence of Industrial Civilization. World energy production per capita increased strongly from 1945 to its all-time peak in 1979. Then, for the first time in history, it decreased from 1979 to 1999 at a rate of 0.33 %/year (called the Olduvai "slope").
Next, from 2000 to 2011, according to the Olduvai schema, world energy production per capita will decrease by about 0.70 %/year (the Olduvai "slide"). Then around year 2012 there will be a rash of permanent electrical blackouts -- worldwide. These blackouts, along with other factors, will cause energy production per capita by 2030 to fall to 3.32 b/year, the same value it had in 1930. The rate of decline from 2012 to 2030 is 5.44 %/year (the Olduvai "cliff"). Thus, by definition, the duration of Industrial Civilization is less than or equal to 100 years. (Richard Duncan at: http://dieoff.org/page125.htm) From checker at panix.com Thu May 5 16:28:14 2005 From: checker at panix.com (Premise Checker) Date: Thu, 5 May 2005 12:28:14 -0400 (EDT) Subject: [Paleopsych] LRC: George Crispin: Are We Running Out of Oil? Message-ID: George Crispin: Are We Running Out of Oil? http://www.lewrockwell.com/crispin/crispin12.html 5.5.3 It is a complicated subject, but now I am familiar with the two theories of the origin of petroleum: the conventional one, which assumes that oil is biogenic, originating as plant and animal matter, and the other, that it is abiogenic, it or its raw material having been formed along with the earth itself 4.5 billion years ago. My learning curve began several years ago with a short article that described the oil field beneath the Eugene 330 oil platform in the Gulf of Mexico. I lost the clipping but did not forget the story of a dried-up oil well that was refilling itself. This spring when I found a new discussion of Eugene 330, essentially describing the same conditions, my original conclusions of a mantle filled with or manufacturing petroleum were reinforced, leading to the conclusion that the world's supply of oil was essentially limitless. But by now the Peak Oil people were out in force and desperate to prove that we are due to run out of oil soon, and must prepare ourselves for war and/or starvation. This led me to:
* C. Maurice and C. Smithson, Doomsday Mythology: "Every ten or fifteen years since the late 1800s (when we began using petroleum) experts have predicted that oil reserves would last only ten more years. These experts have predicted nine of the last zero oil-reserve exhaustions."
* Sheik Yamani, one-time oil minister to Saudi Arabia, who stated in a speech to Europeans, "The stone age ended, but not because of any lack of stones. Undoubtedly the oil age will end the same way."
* Jean Whelan, a geochemist and senior researcher with the Woods Hole Oceanographic Institution assigned to study the Eugene field. Becoming familiar with the phenomenon, she said, "... I believe there is a huge system of oil just migrating deep underground."
* Dr. Thomas Gold's book [9]The Deep Hot Biosphere, in which he theorizes that our oil, or the methane from which it could evolve, was formed 4.5 billion years ago when the earth began, and that it is not a fossil fuel but picks up traces of fossils as it works its way upwards. This theory leaves the earth with a huge supply of oil, unlike the fossil theory, which assumes oil to be the result of a one-time dying off of animals and plants.
* The Russian-Ukrainian Deep Abiotic Theory (except for Dr Gold, virtually unknown in the West) has long gone beyond theory, the Russians having brought in several fields producing abiotic oil using super deep drilling technology. By 1946 their production had dropped off. Now they, along with Saudi Arabia and ourselves, are among the three largest producers in the world.
This, plus our oil shale, and Canada's tar sands, plus as yet unimagined technologies, plus the fact that oil is only a part of the total energy picture, makes it seem highly unlikely the world will ever run out of oil. One clincher in the debate is that Peak Oil writings are terribly muddled. The other is the suspicion that its adherents seem to fall into Professor Kuhn's description of people who cannot accept anything outside their conventional world, people who cannot be scientifically critical, whose every belief must fit their current paradigm. George Crispin [[10]send him mail] is a retired businessman who heads a Catholic homeschooling cooperative in Auburn, Alabama. [11]George Crispin Archives References 9. http://www.amazon.com/exec/obidos/ASIN/0387985468/lewrockwell/ 10. mailto:crispin73 at charter.net 11. http://www.lewrockwell.com/crispin/crispin-arch.html From ross.buck at uconn.edu Thu May 5 16:59:48 2005 From: ross.buck at uconn.edu (Buck, Ross) Date: Thu, 5 May 2005 12:59:48 -0400 Subject: [Paleopsych] Wired: (Flynn Effect): Dome Improvement Message-ID: Again the notion of heritability is being presented as a meaningful measure of genetic-versus-environmental influence. Most monozygotic twins are monochorionic, sharing the same chorion and therefore the same blood supply in the womb. A minority are dichorionic, with identical genes but a different intrauterine blood supply. Davis, Phelps and Bracha (Schizophrenia Bulletin, 1995, 21, 357-366) investigated concordance of schizophrenia in monochorionic and dichorionic monozygotic twins, and found that while the concordance rate for MC MZ twins was 60% (i.e., if one twin is schizophrenic there is a 60% chance the other will be as well), the concordance rate of the DC MZ twins (with identical genes) was 10.7%. Environmental influences are overwhelming, and they begin at conception: the genes do nothing without environmental influences turning them on and off. The Flynn effect suggests that the vast media wasteland may actually function as a vast brain playground. Cheers, Ross Ross Buck, Ph. D. Professor of Communication Sciences and Psychology Communication Sciences U-1085 University of Connecticut Storrs, CT 06269-1085 860-486-4494 fax 860-486-5422 Ross.buck at uconn.edu http://www.coms.uconn.edu/docs/people/faculty/rbuck/index.htm
And it's a substantial gain; people who would have been considered geniuses 100 years ago would be merely average today. Perhaps even more surprisingly, the gains cannot be directly attributed to improved education, as the greatest improvements are found in the parts of the test that directly measure abstract reasoning via visual puzzles, not concrete knowledge based on language or mathematical skills. The Wired article (which should be online in a few days) does not have much that is new, but one fact which popped out is that the Effect has not only continued in the last couple of generations, but is increasing. Average IQ gains were 0.31 per year in the 1950s and 60s, but by the 1990s had grown to 0.36 per year. Explanations for the Effect seem to be as numerous as people who have studied it. Flynn himself does not seem to believe that it is real, in the sense that it actually points to increased intelligence. I was amused by economist David Friedman's suggestion that it is due to the increased use of Caesarian deliveries allowing for larger head sizes! The Wired article focuses on increased visual stimulation as the catalyst, which seems plausible as part of the story. The article then predicts that the next generation, exposed since babyhood to video games with demanding puzzle solving, mapping and coordination skills, will see an even greater improvement in IQ scores. Sometimes I wonder if the social changes we saw during the 20th century may have been caused or at least promoted by greater human intelligence. It's a difficult thesis to make because you first have to overcome the conventional wisdom that says that the 1900s were a century of human depravity and violence. But if you look deeper and recognize the tremendous growth of morality and ethical sensitivity in this period (which is what makes us judge ourselves so harshly), you have to ask, maybe it is because people woke up, began to think for themselves, and weren't willing to let themselves be manipulated and influenced as in the past? If so, then this bodes well for the future. --------------now the article: Pop quiz: Why are IQ test scores rising around the globe? (Hint: Stop reading the great authors and start playing Grand Theft Auto.) By Steven Johnson Twenty-three years ago, an American philosophy professor named James Flynn discovered a remarkable trend: Average IQ scores in every industrialized country on the planet had been increasing steadily for decades. Despite concerns about the dumbing-down of society - the failing schools, the garbage on TV, the decline of reading - the overall population was getting smarter. And the climb has continued, with more recent studies showing that the rate of IQ increase is accelerating. Next to global warming and Moore's law, the so-called Flynn effect may be the most revealing line on the increasingly crowded chart of modern life - and it's an especially hopeful one. We still have plenty of problems to solve, but at least there's one consolation: Our brains are getting better at problem-solving. Unless you happen to think the very notion of IQ is bunk. Anyone who has read Stephen Jay Gould's The Mismeasure of Man or Howard Gardner's work on multiple intelligences or any critique of The Bell Curve is liable to dismiss IQ as merely phrenology updated, a pseudoscience fronting for a host of racist and elitist ideologies that dare not speak their names. 
These critics attack IQ itself - or, more precisely, what intelligence scholar Arthur Jensen called g, a measure of underlying "general" intelligence. Psychometricians measure g by performing a factor analysis of multiple intelligence tests and extracting a pattern of correlation between the measurements. (IQ is just one yardstick.) Someone with greater general intelligence than average should perform better on a range of different tests. Unlike some skeptics, James Flynn didn't just dismiss g as statistical tap dancing. He accepted that something real was being measured, but he came to believe that it should be viewed along another axis: time. You can't just take a snapshot of g at one moment and make sense of it, Flynn says. You have to track its evolution. He did just that. Suddenly, g became much more than a measure of mental ability. It revealed the rising trend line in intelligence test scores. And that, in turn, suggested that something in the environment - some social or cultural force - was driving the trend. Significant intellectual breakthroughs - to paraphrase the John Lennon song - are what happen when you're busy making other plans. So it was with Flynn and his effect. He left the US in the early 1960s to teach moral philosophy at the University of Otaga in New Zealand. In the late '70s, he began exploring the intellectual underpinnings of racist ideologies. "And I thought: Oh, I can do a bit about the IQ controversies," he says. "And then I saw that Arthur Jensen, a scholar of high repute, actually thought that blacks on average were genetically inferior - which was quite a shock. I should say that Jensen was beyond reproach - he's certainly not a racist. And so I thought I'd better look into this." This inquiry led to a 1980 book, Race, IQ, and Jensen, that posited an environmental - not genetic - explanation for the black-white IQ gap. After finishing the book, Flynn decided that he would look for evidence that blacks were gaining on whites as their access to education increased, and so he began studying US military records, since every incoming member of the armed forces takes an IQ test. Sure enough, he found that blacks were making modest gains on whites in intelligence tests, confirming his environmental explanation. But something else in the data caught his eye. Every decade or so, the testing companies would generate new tests and re-normalize them so that the average score was 100. To make sure that the new exams were in sync with previous ones, they'd have a batch of students take both tests. They were simply trying to confirm that someone who tested above average on the new version would perform above average on the old, and in fact the results confirmed that correlation. But the data also brought to light another pattern, one that the testing companies ignored. "Every time kids took the new and the old tests, they did better on the old ones," Flynn says. "I thought: That's weird." The testing companies had published the comparative data almost as an afterthought. "It didn't seem to strike them as interesting that the kids were always doing better on the earlier test," he says. "But I was new to the area." He sent his data to the Harvard Educational Review, which dismissed the paper for its small sample size. And so Flynn dug up every study that had ever been done in the US where the same subjects took a new and an old version of an IQ test. "And lo and behold, when you examined that huge collection of data, it revealed a 14-point gain between 1932 and 1978." 
According to Flynn's numbers, if someone testing in the top 18 percent the year FDR was elected were to time-travel to the middle of the Carter administration, he would score at the 50th percentile. When Flynn finally published his work in 1984, Jensen objected that Flynn's numbers were drawing on tests that reflected educational background. He predicted that the Flynn effect would disappear if one were to look at tests - like the Raven Progressive Matrices - that give a closer approximation of g, by measuring abstract reasoning and pattern recognition and eliminating language altogether. And so Flynn dutifully collected IQ data from all over the world. All of it showed dramatic increases. "The biggest of all were on Ravens," Flynn reports with a hint of glee still in his voice. The trend Flynn discovered in the mid-'80s has been investigated extensively, and there's little doubt he's right. In fact, the Flynn effect is accelerating. US test takers gained 17 IQ points between 1947 and 2001. The annual gain from 1947 through 1972 was 0.31 IQ point, but by the '90s it had crept up to 0.36. Though the Flynn effect is now widely accepted, its existence has in turn raised new questions. The most fundamental: Why are measures of intelligence going up? The phenomenon would seem to make no sense in light of the evidence that g is largely an inherited trait. We're certainly not evolving that quickly. The classic heritability research paradigm is the twin adoption study: Look at IQ scores for thousands of individuals with various forms of shared genes and environments, and hunt for correlations. This is the sort of chart you get, with 100 being a perfect match and 0 pure randomness: The same person tested twice: 87 Identical twins raised together: 86 Identical twins raised apart: 76 Fraternal twins raised together: 55 Biological siblings: 47 Parents and children living together: 40 Parents and children living apart: 31 Adopted children living together: 0 Unrelated people living apart: 0 After analyzing these shifting ratios of shared genes and the environment for several decades, the consensus grew, in the '90s, that heritability for IQ was around 0.6 - or about 60 percent. The two most powerful indications of this are at the top and bottom of the chart: Identical twins raised in different environments have IQs almost as similar to each other as the same person tested twice, while adopted children living together - shared environment, but no shared genes - show no correlation. When you look at a chart like that, the evidence for significant heritability looks undeniable. Four years ago, Flynn and William Dickens, a Brookings Institution economist, proposed another explanation, one made apparent to them by the Flynn effect. Imagine "somebody who starts out with a tiny little physiological advantage: He's just a bit taller than his friends," Dickens says. "That person is going to be just a bit better at basketball." Thanks to this minor height advantage, he tends to enjoy pickup basketball games. He goes on to play in high school, where he gets excellent coaching and accumulates more experience and skill. "And that sets up a cycle that could, say, take him all the way to the NBA," Dickens says. Now imagine this person has an identical twin raised separately. He, too, will share the height advantage, and so be more likely to find his way into the same cycle. 
Four years ago, Flynn and William Dickens, a Brookings Institution economist, proposed another explanation, one made apparent to them by the Flynn effect. Imagine "somebody who starts out with a tiny little physiological advantage: He's just a bit taller than his friends," Dickens says. "That person is going to be just a bit better at basketball." Thanks to this minor height advantage, he tends to enjoy pickup basketball games. He goes on to play in high school, where he gets excellent coaching and accumulates more experience and skill. "And that sets up a cycle that could, say, take him all the way to the NBA," Dickens says.

Now imagine this person has an identical twin raised separately. He, too, will share the height advantage, and so be more likely to find his way into the same cycle. And when some imagined basketball geneticist surveys the data at the end of that cycle, he'll report that two identical twins raised apart share an off-the-charts ability at basketball. "If you did a genetic analysis, you'd say: Well, this guy had a gene that made him a better basketball player," Dickens says. "But the fact is, that gene is making him 1 percent better, and the other 99 percent is that because he's slightly taller, he got all this environmental support." And what goes for basketball goes for intelligence: Small genetic differences get picked up and magnified in the environment, resulting in dramatically enhanced skills. "The heritability studies weren't wrong," Flynn says. "We just misinterpreted them."
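A toy feedback loop makes the multiplier idea concrete: a small innate edge keeps attracting slightly better environments, the environment feeds back into skill, and the gap a twin study measures at the end of the cycle is mostly recruited environment. The coefficients and the loop below are illustrative inventions, not Dickens and Flynn's published model:

    def develop(edge, rounds=60, feedback=0.9):
        """Each round, skill is the innate edge plus the environmental support
        that current skill attracts (better games, coaching, practice)."""
        skill = edge
        for _ in range(rounds):
            environment = feedback * skill
            skill = edge + environment
        return skill

    taller, shorter = develop(1.01), develop(1.00)
    gap = taller - shorter      # measured difference at the end of the cycle
    print(gap)                  # ~0.10: ten times the innate 0.01 edge
    print(0.01 / gap)           # ~0.10: only about a tenth of the gap is the edge itself

In this toy run a genetic analysis would credit the gene with the whole 0.10 gap, even though its direct contribution is about a tenth of that; the rest is the environment the edge recruited.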
Dickens and Flynn showed that the environment could affect heritable traits like IQ, but one mystery remained: What part of our allegedly dumbed-down environment is making us smarter? It's not schools, since the tests that measure education-driven skills haven't shown the same steady gains. It's not nutrition - general improvement in diet leveled off in most industrialized countries shortly after World War II, just as the Flynn effect was accelerating.

Most cognitive scholars remain genuinely perplexed. "I find it a puzzle and don't have a compelling explanation," wrote Harvard's Steven Pinker in an email exchange. "I suspect that it's either practice at taking tests or perhaps a large number of disparate factors that add up to the linear trend."

Flynn has his theories, though they're still speculative. "For a long time it bothered me that g was going up without an across-the-board increase in other tests," he says. If g measured general intelligence, then a long-term increase should trickle over into other subtests. "And then I realized that society has priorities. Let's say we're too cheap to hire good high school math teachers. So while we may want to improve arithmetical reasoning skills, we just don't. On the other hand, with smaller families, more leisure, and more energy to use leisure for cognitively demanding pursuits, we may improve - without realizing it - on-the-spot problem-solving, like you see with Ravens."

When you take the Ravens test, you're confronted with a series of visual grids, each containing a mix of shapes that seem vaguely related to one another. Each grid contains a missing shape; to answer the implicit question posed by the test, you need to pick the correct missing shape from a selection of eight possibilities. To "solve" these puzzles, in other words, you have to scrutinize a changing set of icons, looking for unusual patterns and correlations among them.

This is not the kind of thinking that happens when you read a book or have a conversation with someone or take a history exam. But it is precisely the kind of mental work you do when you, say, struggle to program a VCR or master the interface on your new cell phone.

Over the last 50 years, we've had to cope with an explosion of media, technologies, and interfaces, from the TV clicker to the World Wide Web. And every new form of visual media - interactive visual media in particular - poses an implicit challenge to our brains: We have to work through the logic of the new interface, follow clues, sense relationships. Perhaps unsurprisingly, these are the very skills that the Ravens tests measure - you survey a field of visual icons and look for unusual patterns.

The best example of brain-boosting media may be videogames. Mastering visual puzzles is the whole point of the exercise - whether it's the spatial geometry of Tetris, the engineering riddles of Myst, or the urban mapping of Grand Theft Auto.

The ultimate test of the "cognitively demanding leisure" hypothesis may come in the next few years, as the generation raised on hypertext and massively complex game worlds starts taking adult IQ tests. This is a generation of kids who, in many cases, learned to puzzle through the visual patterns of graphic interfaces before they learned to read. Their fundamental intellectual powers weren't shaped only by coping with words on a page. They acquired an intuitive understanding of shapes and environments, all of them laced with patterns that can be detected if you think hard enough. Their parents may have enhanced their fluid intelligence by playing Tetris or learning the visual grammar of TV advertising. But that's child's play compared with Pokémon.

Contributing editor Steven Johnson (stevenberlinjohnson at earthlink.net) is the author of Everything Bad Is Good for You: How Today's Popular Culture Is Actually Making Us Smarter.

From shovland at mindspring.com Fri May 6 01:12:47 2005
From: shovland at mindspring.com (Steve Hovland)
Date: Thu, 5 May 2005 18:12:47 -0700
Subject: [Paleopsych] Wired: (Flynn Effect): Dome Improvement
Message-ID: <01C5519E.0B1C8BF0.shovland@mindspring.com>

What kind of environmental signals optimize the expression of human DNA?

Steve Hovland
www.stevehovland.net

-----Original Message-----
From: Buck, Ross [SMTP:ross.buck at uconn.edu]
Sent: Thursday, May 05, 2005 10:00 AM
To: The new improved paleopsych list
Subject: RE: [Paleopsych] Wired: (Flynn Effect): Dome Improvement

Again the notion of heritability is being presented as a meaningful measure of genetic-versus-environmental influence. Most monozygotic twins are monochorionic, sharing the same choroid plexus and therefore the same blood supply in the womb. A minority are dichorionic, with identical genes but a different intrauterine blood supply. Davis, Phelps and Bracha (Schizophrenia Bulletin, 1995, 21, 357-366) investigated concordance of schizophrenia in monochorionic and dichorionic monozygotic twins, and found that while the concordance rate for MC MZ twins was 60% (i.e., if one twin is schizophrenic there is a 60% chance the other will be as well), the concordance rate of the DC MZ twins (with identical genes) was 10.7%. Environmental influences are overwhelming, and they begin at conception: the genes do nothing without environmental influences turning them on and off.

The Flynn effect suggests that the vast media wasteland may actually function as a vast brain playground.

Cheers, Ross

Ross Buck, Ph. D.
Professor of Communication Sciences and Psychology
Communication Sciences U-1085
University of Connecticut
Storrs, CT 06269-1085
860-486-4494 fax 860-486-5422
Ross.buck at uconn.edu
http://www.coms.uconn.edu/docs/people/faculty/rbuck/index.htm

-----Original Message-----
From: paleopsych-bounces at paleopsych.org [mailto:paleopsych-bounces at paleopsych.org] On Behalf Of Premise Checker
Sent: Thursday, May 05, 2005 12:27 PM
To: paleopsych at paleopsych.org
Subject: [Paleopsych] Wired: (Flynn Effect): Dome Improvement

Dome Improvement
http://www.wired.com/wired/archive/13.05/flynn_pr.html

First some remarks from Hal Finney (Date: Tue, 3 May 2005 11:03:41 -0700 (PDT), To: extropy-chat at lists.extropy.org), except that the article is now available:

Wired magazine's new issue has an article on the Flynn Effect, which we have discussed here occasionally. This is probably my favorite Effect, so completely extropian and contradictory to the conventional wisdom. Curmudgeons throughout the ages have complained about the decay of society and how the younger generation is inferior in morals and intelligence to their elders. Likewise modern communications technology is derided: TV is a vast wasteland, video games and movies promote sex and violence. Yet Flynn discovered the astonishing and still little-known fact that intelligence scores have steadily increased for at least the past 100 years. And it's a substantial gain; people who would have been considered geniuses 100 years ago would be merely average today. Perhaps even more surprisingly, the gains cannot be directly attributed to improved education, as the greatest improvements are found in the parts of the test that directly measure abstract reasoning via visual puzzles, not concrete knowledge based on language or mathematical skills.

The Wired article (which should be online in a few days) does not have much that is new, but one fact which popped out is that the Effect has not only continued in the last couple of generations, but is increasing. Average IQ gains were 0.31 per year in the 1950s and 60s, but by the 1990s had grown to 0.36 per year.

Explanations for the Effect seem to be as numerous as people who have studied it. Flynn himself does not seem to believe that it is real, in the sense that it actually points to increased intelligence. I was amused by economist David Friedman's suggestion that it is due to the increased use of Caesarian deliveries allowing for larger head sizes! The Wired article focuses on increased visual stimulation as the catalyst, which seems plausible as part of the story. The article then predicts that the next generation, exposed since babyhood to video games with demanding puzzle solving, mapping and coordination skills, will see an even greater improvement in IQ scores.

Sometimes I wonder if the social changes we saw during the 20th century may have been caused or at least promoted by greater human intelligence. It's a difficult thesis to make because you first have to overcome the conventional wisdom that says that the 1900s were a century of human depravity and violence. But if you look deeper and recognize the tremendous growth of morality and ethical sensitivity in this period (which is what makes us judge ourselves so harshly), you have to ask, maybe it is because people woke up, began to think for themselves, and weren't willing to let themselves be manipulated and influenced as in the past? If so, then this bodes well for the future.
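The per-year figures quoted in the article and repeated above square reasonably well with the 17-point total cited for 1947-2001. The quick check below assumes the post-1972 rate averaged roughly 0.335 points per year on its way up to 0.36, which is an interpolation of mine, not a number from either source:

    # Consistency check of the quoted rates against the 17-point total for 1947-2001
    early = 0.31 * (1972 - 1947)    # ~7.8 points at 0.31 IQ points per year
    late = 0.335 * (2001 - 1972)    # ~9.7 points, assuming an average of ~0.335 per year after 1972
    print(round(early + late, 1))   # ~17.5, close to the article's 17-point figure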
From he at psychology.su.se Fri May 6 16:07:38 2005
From: he at psychology.su.se (Hannes Eisler)
Date: Fri, 6 May 2005 18:07:38 +0200
Subject: [Paleopsych] Wired: (Flynn Effect): Dome Improvement

Flynn's explanation sounds plausible; it reminds of "idiots savants."
I think you can only interpret the g factor as an often misleading indication of an individual's attainable ceiling of intelligent performance, which may be hereditary. Since only a few environments empower the individual to reach the maximal possible performance, a better measure of the hereditary part might be the time, or the number of repetitions or rehearsals, necessary to achieve a certain goal, e.g., to solve a problem. This would make all tests speed tests, of course.
--
-------------------------------------
Prof. Hannes Eisler
Department of Psychology
Stockholm University
S-106 91 Stockholm
Sweden
e-mail: he at psychology.su.se
fax: +46-8-15 93 42
phone: +46-8-163967 (university) +46-8-6409982 (home)
internet: http://www.psychology.su.se/staff/he

From ross.buck at uconn.edu Fri May 6 16:13:36 2005
From: ross.buck at uconn.edu (Buck, Ross)
Date: Fri, 6 May 2005 12:13:36 -0400
Subject: [Paleopsych] Wired: (Flynn Effect): Dome Improvement

All sorts of signals, including signals experienced in the womb. In infants simple sensory stimulation is critical to turn on genetic potential. Later, signals from other human beings become critical. We bioregulate one another via emotional communication--indeed this is true of all creatures--working particularly via peptide neurohormones. Human social organization emerges from interactions involving emotional communication, literally priming the DNA to respond appropriately (or not). Environmental signals can screw up the DNA as well, of course. Deprivation of physical/social stimuli due to abuse/neglect--particularly during sensitive periods--can undermine genetic potential for a lifetime, as can signals that can be associated with poverty, discrimination, bad education, lack of opportunity, etc.

From the Flynn effect, it appears possible that modern media optimize the expression of whatever human DNA is associated with performance on IQ-type tests. What they do for SOCIAL competence is another question. Television has been likened to Harlow's cloth-covered surrogate mothers: warm and fuzzy but basically unresponsive to the user. Newer technology is responsive to the user as well, of course, and it is noteworthy how kids eat it up (Piaget's "aliments" at work).

Cheers, Ross

-----Original Message-----
From: paleopsych-bounces at paleopsych.org [mailto:paleopsych-bounces at paleopsych.org] On Behalf Of Steve Hovland
Sent: Thursday, May 05, 2005 9:13 PM
To: 'The new improved paleopsych list'
Subject: RE: [Paleopsych] Wired: (Flynn Effect): Dome Improvement

What kind of environmental signals optimize the expression of human DNA?
Steve Hovland www.stevehovland.net -----Original Message----- From: Buck, Ross [SMTP:ross.buck at uconn.edu] Sent: Thursday, May 05, 2005 10:00 AM To: The new improved paleopsych list Subject: RE: [Paleopsych] Wired: (Flynn Effect): Dome Improvement Again the notion of heritability is being presented as a meaningful measure of genetic-versus-environmental influence. Most monozygotic twins are monochorionic, sharing the same choroid plexus and therefore the same blood supply in the womb. A minority are dichorionic, with identical genes but a different intrauterine blood supply. Davis, Phelps and Bracha (Schizophrenia Bulletin, 1995, 21, 357-366) investigated concordance of schizophrenia in monochorionic and dichorionic monozygotic twins, and found that while the concordance rate for MC MZ twins was 60% (i.e., if one twin is schizophrenic there is a 60% chance the other will be as well), the concordance rate of the DC MZ twins (with identical genes) was 10.7%. Environmental influences are overwhelming, and they begin at conception: the genes do nothing without environmental influences turning them on and off. The Flynn effect suggests that the vast media wasteland may actually function as a vast brain playground. Cheers, Ross Ross Buck, Ph. D. Professor of Communication Sciences and Psychology Communication Sciences U-1085 University of Connecticut Storrs, CT 06269-1085 860-486-4494 fax 860-486-5422 Ross.buck at uconn.edu http://www.coms.uconn.edu/docs/people/faculty/rbuck/index.htm -----Original Message----- From: paleopsych-bounces at paleopsych.org [mailto:paleopsych-bounces at paleopsych.org] On Behalf Of Premise Checker Sent: Thursday, May 05, 2005 12:27 PM To: paleopsych at paleopsych.org Subject: [Paleopsych] Wired: (Flynn Effect): Dome Improvement Dome Improvement http://www.wired.com/wired/archive/13.05/flynn_pr.html First some remarks from From: Hal Finney Date: Tue, 3 May 2005 11:03:41 -0700 (PDT) To: extropy-chat at lists.extropy.org except that the article is now available: Wired magazine's new issue has an article on the Flynn Effect, which we have discussed here occasionally. This is probably my favorite Effect, so completely extropian and contradictory to the conventional wisdom. Curmudgeons throughout the ages have complained about the decay of society and how the younger generation is inferior in morals and intelligence to their elders. Likewise modern communications technology is derided: TV is a vast wasteland, video games and movies promote sex and violence. Yet Flynn discovered the astonishing and still little-known fact that intelligence scores have steadily increased for at least the past 100 years. And it's a substantial gain; people who would have been considered geniuses 100 years ago would be merely average today. Perhaps even more surprisingly, the gains cannot be directly attributed to improved education, as the greatest improvements are found in the parts of the test that directly measure abstract reasoning via visual puzzles, not concrete knowledge based on language or mathematical skills. The Wired article (which should be online in a few days) does not have much that is new, but one fact which popped out is that the Effect has not only continued in the last couple of generations, but is increasing. Average IQ gains were 0.31 per year in the 1950s and 60s, but by the 1990s had grown to 0.36 per year. Explanations for the Effect seem to be as numerous as people who have studied it. 
Flynn himself does not seem to believe that it is real, in the sense that it actually points to increased intelligence. I was amused by economist David Friedman's suggestion that it is due to the increased use of Caesarian deliveries allowing for larger head sizes! The Wired article focuses on increased visual stimulation as the catalyst, which seems plausible as part of the story. The article then predicts that the next generation, exposed since babyhood to video games with demanding puzzle solving, mapping and coordination skills, will see an even greater improvement in IQ scores. Sometimes I wonder if the social changes we saw during the 20th century may have been caused or at least promoted by greater human intelligence. It's a difficult thesis to make because you first have to overcome the conventional wisdom that says that the 1900s were a century of human depravity and violence. But if you look deeper and recognize the tremendous growth of morality and ethical sensitivity in this period (which is what makes us judge ourselves so harshly), you have to ask, maybe it is because people woke up, began to think for themselves, and weren't willing to let themselves be manipulated and influenced as in the past? If so, then this bodes well for the future. --------------now the article: Pop quiz: Why are IQ test scores rising around the globe? (Hint: Stop re ading the great authors and start playing Grand Theft Auto.) By Steven Johnson Twenty-three years ago, an American philosophy professor named James Flynn discovered a remarkable trend: Average IQ scores in every industrialized country on the planet had been increasing steadily for decades. Despite concerns about the dumbing-down of society - the failing schools, the garbage on TV, the decline of reading - the overall population was getting smarter. And the climb has continued, with more recent studies showing that the rate of IQ increase is accelerating. Next to global warming and Moore's law, the so-called Flynn effect may be the most revealing line on the increasingly crowded chart of modern life - and it's an especially hopeful one. We still have plenty of problems to solve, but at least there's one consolation: Our brains are getting better at problem-solving. Unless you happen to think the very notion of IQ is bunk. Anyone who has read Stephen Jay Gould's The Mismeasure of Man or Howard Gardner's work on multiple intelligences or any critique of The Bell Curve is liable to dismiss IQ as merely phrenology updated, a pseudoscience fronting for a host of racist and elitist ideologies that dare not speak their names. These critics attack IQ itself - or, more precisely, what intelligence scholar Arthur Jensen called g, a measure of underlying "general" intelligence. Psychometricians measure g by performing a factor analysis of multiple intelligence tests and extracting a pattern of correlation between the measurements. (IQ is just one yardstick.) Someone with greater general intelligence than average should perform better on a range of different tests. Unlike some skeptics, James Flynn didn't just dismiss g as statistical tap dancing. He accepted that something real was being measured, but he came to believe that it should be viewed along another axis: time. You can't just take a snapshot of g at one moment and make sense of it, Flynn says. You have to track its evolution. He did just that. Suddenly, g became much more than a measure of mental ability. It revealed the rising trend line in intelligence test scores. 
And that, in turn, suggested that something in the environment - some social or cultural force - was driving the trend. Significant intellectual breakthroughs - to paraphrase the John Lennon song - are what happen when you're busy making other plans. So it was with Flynn and his effect. He left the US in the early 1960s to teach moral philosophy at the University of Otaga in New Zealand. In the late '70s, he began exploring the intellectual underpinnings of racist ideologies. "And I thought: Oh, I can do a bit about the IQ controversies," he says. "And then I saw that Arthur Jensen, a scholar of high repute, actually thought that blacks on average were genetically inferior - which was quite a shock. I should say that Jensen was beyond reproach - he's certainly not a racist. And so I thought I'd better look into this." This inquiry led to a 1980 book, Race, IQ, and Jensen, that posited an environmental - not genetic - explanation for the black-white IQ gap. After finishing the book, Flynn decided that he would look for evidence that blacks were gaining on whites as their access to education increased, and so he began studying US military records, since every incoming member of the armed forces takes an IQ test. Sure enough, he found that blacks were making modest gains on whites in intelligence tests, confirming his environmental explanation. But something else in the data caught his eye. Every decade or so, the testing companies would generate new tests and re-normalize them so that the average score was 100. To make sure that the new exams were in sync with previous ones, they'd have a batch of students take both tests. They were simply trying to confirm that someone who tested above average on the new version would perform above average on the old, and in fact the results confirmed that correlation. But the data also brought to light another pattern, one that the testing companies ignored. "Every time kids took the new and the old tests, they did better on the old ones," Flynn says. "I thought: That's weird." The testing companies had published the comparative data almost as an afterthought. "It didn't seem to strike them as interesting that the kids were always doing better on the earlier test," he says. "But I was new to the area." He sent his data to the Harvard Educational Review, which dismissed the paper for its small sample size. And so Flynn dug up every study that had ever been done in the US where the same subjects took a new and an old version of an IQ test. "And lo and behold, when you examined that huge collection of data, it revealed a 14-point gain between 1932 and 1978." According to Flynn's numbers, if someone testing in the top 18 percent the year FDR was elected were to time-travel to the middle of the Carter administration, he would score at the 50th percentile. When Flynn finally published his work in 1984, Jensen objected that Flynn's numbers were drawing on tests that reflected educational background. He predicted that the Flynn effect would disappear if one were to look at tests - like the Raven Progressive Matrices - that give a closer approximation of g, by measuring abstract reasoning and pattern recognition and eliminating language altogether. And so Flynn dutifully collected IQ data from all over the world. All of it showed dramatic increases. "The biggest of all were on Ravens," Flynn reports with a hint of glee still in his voice. The trend Flynn discovered in the mid-'80s has been investigated extensively, and there's little doubt he's right. 
In fact, the Flynn effect is accelerating. US test takers gained 17 IQ points between 1947 and 2001. The annual gain from 1947 through 1972 was 0.31 IQ point, but by the '90s it had crept up to 0.36. Though the Flynn effect is now widely accepted, its existence has in turn raised new questions. The most fundamental: Why are measures of intelligence going up? The phenomenon would seem to make no sense in light of the evidence that g is largely an inherited trait. We're certainly not evolving that quickly. The classic heritability research paradigm is the twin adoption study: Look at IQ scores for thousands of individuals with various forms of shared genes and environments, and hunt for correlations. This is the sort of chart you get, with 100 being a perfect match and 0 pure randomness: The same person tested twice: 87 Identical twins raised together: 86 Identical twins raised apart: 76 Fraternal twins raised together: 55 Biological siblings: 47 Parents and children living together: 40 Parents and children living apart: 31 Adopted children living together: 0 Unrelated people living apart: 0 After analyzing these shifting ratios of shared genes and the environment for several decades, the consensus grew, in the '90s, that heritability for IQ was around 0.6 - or about 60 percent. The two most powerful indications of this are at the top and bottom of the chart: Identical twins raised in different environments have IQs almost as similar to each other as the same person tested twice, while adopted children living together - shared environment, but no shared genes - show no correlation. When you look at a chart like that, the evidence for significant heritability looks undeniable. Four years ago, Flynn and William Dickens, a Brookings Institution economist, proposed another explanation, one made apparent to them by the Flynn effect. Imagine "somebody who starts out with a tiny little physiological advantage: He's just a bit taller than his friends," Dickens says. "That person is going to be just a bit better at basketball." Thanks to this minor height advantage, he tends to enjoy pickup basketball games. He goes on to play in high school, where he gets excellent coaching and accumulates more experience and skill. "And that sets up a cycle that could, say, take him all the way to the NBA," Dickens says. Now imagine this person has an identical twin raised separately. He, too, will share the height advantage, and so be more likely to find his way into the same cycle. And when some imagined basketball geneticist surveys the data at the end of that cycle, he'll report that two identical twins raised apart share an off-the-charts ability at basketball. "If you did a genetic analysis, you'd say: Well, this guy had a gene that made him a better basketball player," Dickens says. "But the fact is, that gene is making him 1 percent better, and the other 99 percent is that because he's slightly taller, he got all this environmental support." And what goes for basketball goes for intelligence: Small genetic differences get picked up and magnified in the environment, resulting in dramatically enhanced skills. "The heritability studies weren't wrong," Flynn says. "We just misinterpreted them." Dickens and Flynn showed that the environment could affect heritable traits like IQ, but one mystery remained: What part of our allegedly dumbed-down environment is making us smarter? It's not schools, since the tests that measure education-driven skills haven't shown the same steady gains. 
It's not nutrition - general improvement in diet leveled off in most industrialized countries shortly after World War II, just as the Flynn effect was accelerating. Most cognitive scholars remain genuinely perplexed. "I find it a puzzle and don't have a compelling explanation," wrote Harvard's Steven Pinker in an email exchange. "I suspect that it's either practice at taking tests or perhaps a large number of disparate factors that add up to the linear trend." Flynn has his theories, though they're still speculative. "For a long time it bothered me that g was going up without an across-the-board increase in other tests," he says. If g measured general intelligence, then a long-term increase should trickle over into other subtests. "And then I realized that society has priorities. Let's say we're too cheap to hire good high school math teachers. So while we may want to improve arithmetical reasoning skills, we just don't. On the other hand, with smaller families, more leisure, and more energy to use leisure for cognitively demanding pursuits, we may improve - without realizing it - on-the-spot problem-solving, like you see with Ravens." When you take the Ravens test, you're confronted with a series of visual grids, each containing a mix of shapes that seem vaguely related to one another. Each grid contains a missing shape; to answer the implicit question posed by the test, you need to pick the correct missing shape from a selection of eight possibilities. To "solve" these puzzles, in other words, you have to scrutinize a changing set of icons, looking for unusual patterns and correlations among them. This is not the kind of thinking that happens when you read a book or have a conversation with someone or take a history exam. But it is precisely the kind of mental work you do when you, say, struggle to program a VCR or master the interface on your new cell phone. Over the last 50 years, we've had to cope with an explosion of media, technologies, and interfaces, from the TV clicker to the World Wide Web. And every new form of visual media - interactive visual media in particular - poses an implicit challenge to our brains: We have to work through the logic of the new interface, follow clues, sense relationships. Perhaps unsurprisingly, these are the very skills that the Ravens tests measure - you survey a field of visual icons and look for unusual patterns. The best example of brain-boosting media may be videogames. Mastering visual puzzles is the whole point of the exercise - whether it's the spatial geometry of Tetris, the engineering riddles of Myst, or the urban mapping of Grand Theft Auto. The ultimate test of the "cognitively demanding leisure" hypothesis may come in the next few years, as the generation raised on hypertext and massively complex game worlds starts taking adult IQ tests. This is a generation of kids who, in many cases, learned to puzzle through the visual patterns of graphic interfaces before they learned to read. Their fundamental intellectual powers weren't shaped only by coping with words on a page. They acquired an intuitive understanding of shapes and environments, all of them laced with patterns that can be detected if you think hard enough. Their parents may have enhanced their fluid intelligence by playing Tetris or learning the visual grammar of TV advertising. But that's child's play compared with Pokemon. 
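[A toy illustration of the Dickens-Flynn feedback loop described above: a small innate edge attracts better environments, and the better environment multiplies the edge. The numbers here (a 0.3 matching coefficient, 12 rounds) are invented for the sketch and are not taken from their actual model:]

def simulate(initial_advantage, rounds=12, matching=0.3):
    # Toy gene-environment multiplier: each round, extra "environmental support"
    # (coaching, practice, harder games) is granted in proportion to the current gap,
    # so the gap grows by a factor of (1 + matching) per round.
    skill = 50.0 + initial_advantage
    baseline = 50.0
    for _ in range(rounds):
        support = matching * (skill - baseline)
        skill += 1.0 + support   # the slightly advantaged player improves faster
        baseline += 1.0          # the average player improves at the base rate only
    return skill - baseline

print(simulate(0.5))   # a 0.5-point starting edge ends up as roughly a 12-point gap

[On this account the twin correlations and the large environmental gains need not conflict: the genes end up getting statistical credit for leverage the environment supplied.]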
Contributing editor Steven Johnson (stevenberlinjohnson at earthlink.net) is the author of Everything Bad Is Good for You: How Today's Popular Culture Is Actually Making Us Smarter. _______________________________________________ paleopsych mailing list paleopsych at paleopsych.org http://lists.paleopsych.org/mailman/listinfo/paleopsych From checker at panix.com Fri May 6 21:39:36 2005 From: checker at panix.com (Premise Checker) Date: Fri, 6 May 2005 17:39:36 -0400 (EDT) Subject: [Paleopsych] SF Chronicle: Cloned pet ban rejected: Law would have been nation's first Message-ID: Cloned pet ban rejected: Law would have been nation's first http://sfgate.com/cgi-bin/article.cgi?file=/c/a/2005/05/04/CLONING.TMP - John M. Hubbell, Chronicle Sacramento Bureau Wednesday, May 4, 2005 Sacramento -- State lawmakers Tuesday turned away a bill that could have brought a first-in-the-nation ban on pet cloning, moved less by a host of scientific and ethical arguments than by photos of wide-eyed, copy-cat kittens. The 4-2 vote against the bill with four abstentions by members of Assembly Business and Professions Committee on AB1428 by Assemblyman Lloyd Levine, D-Van Nuys, came after a brief discussion that touched on everything from free enterprise to mad science -- all triggered largely by a pioneering Bay Area firm's willingness to replicate pet owners' favorite cat or dog. That firm, Genetic Savings & Clone, has created replicas of six cats, representatives said Tuesday, and hopes to start work on dogs by December. Pictures of two dark-haired, cloned felines were shown during testimony by Lou Hawthorne, the firm's chief executive, prompting committee Chairwoman Gloria Negrete McLeod, D-Chino, to inquire of him: "So you even do tabbies?" "We do everything except calicoes," Hawthorne said, citing their genetic complexity. It was not the type of inquiry hoped for by Levine, who framed pet cloning as a needless scientific incursion in a world where millions of needy animals are euthanized each year. With the practice lacking federal or state regulation, he said, cloning could not only lead to deformities in the laboratory, but to unintended consequences in society. "What happens when people decide they want to cross their boa constrictor with their rattlesnake to get a really big poisonous snake?" he asked. "Life is more than a commodity," Levine said, "and this is where we draw the line. Just because we can doesn't mean we should." Crystal Miller-Spiegel, policy analyst with the American Anti-Vivisection Society, said pet owners should realize that "animals can't be replaced like a printer." She called Levine's legislation "not anti-science, not an animal-rights bill, and not based on emotion. It's simply common sense." Assemblyman Paul Koretz, D-West Hollywood, queried Hawthorne on claims on a recent Genetic Savings & Clone mailer touting it can clone an owner's "perfect" pet. "I'm wondering whether consumers are being pulled into this," Koretz said. But Hawthorne said he was "perfectly comfortable" with the advertisement. "Contractually, we guarantee only physical resemblance," he said. Hawthorne, who said his firm charged about $23,000 per cat, also touted the promise of animal cloning one day addressing the repopulation of endangered species.
Christine Dillon, lobbyist for the California Veterinary Medical Association, said generations of selective breeding meant that, in all practicality, "vets have been working on genetically modified animals for years." Democratic Assemblyman Joe Nation, whose district includes Sausalito, where Hawthorne's firm is based, noted that a California ban on pet cloning would fail to prevent the practice in neighboring states. Jokingly, he pondered the scenario of a familiar state inspector intercepting cars inbound from Nevada to ask, "Do you have any fresh fruits, vegetables or cloned kittens with you?" Levine agreed cloning issues should be decided at the federal level, but likened continued inaction in California to "trying to close the barn door after the horses are already out." But the fears seemed unwarranted to Ken Press of Sacramento, who has stored the DNA of his recently deceased cat, a 12-year-old Siamese mix named Kitamus he called "an exceptional pet," with Genetic Savings & Clone. "I've considered his genetic lineage worthy of continuing," Press told the committee, adding that neutering the pet proved a mistake. "Sometimes you make a decision and later regret it." From checker at panix.com Fri May 6 21:41:52 2005 From: checker at panix.com (Premise Checker) Date: Fri, 6 May 2005 17:41:52 -0400 (EDT) Subject: [Paleopsych] NYT: Perils of Pain Relief Often Hide in Tiny Type Message-ID: Personal Health: Perils of Pain Relief Often Hide in Tiny Type New York Times, 5.5.3 http://www.nytimes.com/2005/05/03/health/03brod.html By [1]JANE E. BRODY If ever there was a classic case of "no free lunch," popular pain control medications are it. There's not one without a potentially serious risk. Yet, far too many people use them carelessly, without adequate attention to dosage and warnings about possible risks. For over a century, aspirin was the pain drug of choice, until data emerged on the rather large number of bleeding-related deaths this time-honored medicine caused each year. In fact, many pharmaceutical experts say that if aspirin had to go through the Food and Drug Administration's approval process today, it would never make it to market. Along came some dandy substitutes, now also sold over the counter under brand names and as generics: ibuprofen (Advil, Motrin IB) and naproxen (Aleve). Ibuprofen and naproxen, known as nonsteroidal anti-inflammatory drugs, or Nsaids, can equal or outdo aspirin's action against painful inflammation but at less risk of bleeding. But they, too, can have serious side effects: they can irritate the gastrointestinal tract and possibly cause ulcers. People who use Nsaids chronically are often told to take an anti-acid drug to protect their stomachs. This problem opened up a market for a new kind of drug called a cox-2 inhibitor, sold as Celebrex, Vioxx, Bextra and Mobic. These drugs are as good or better than ibuprofen for pain, although as patented prescription medications they greatly multiplied the cost of pain relief. The cox-2 inhibitors were considered safer because they reduced the risks of bleeding and gastrointestinal damage. And as major moneymakers, they were heavily promoted, especially to the millions who need relief for chronic problems. Alas, these, too, have come under serious fire as their use mushroomed and evidence emerged linking them to heart attacks and strokes among users already at risk for these problems. With many multimillion-dollar lawsuits looming, Vioxx was the first to be withdrawn from the market, recently followed by Bextra. 
Both drugs may come back, accompanied by more stringent warnings. Or their cox-2 cousins, Celebrex and Mobic, may join the ranks as drugs gone by. Problems also accompany other prescription painkillers, like the opioids, to be discussed in greater detail in a future column. This brings us to an entirely different drug, acetaminophen, long used to counter fever and occasional aches and pains like tension headaches. But now acetaminophen is being hailed as an excellent first choice for the relief of chronic pain. Can Tylenol Take Over? Acetaminophen, often referred to by its most popular brand name, Tylenol, has no anti-inflammatory action. Nor does it cause bleeding or gastrointestinal distress. Many pain specialists say it should be considered first for relief for the persistent pain of osteoarthritis and prolonged pain of muscle or joint injuries. All in all, acetaminophen is a safe drug for children and adults. Despite the many millions of doses taken by Americans each year, few reports of serious side effects emerge when acetaminophen is used in the dosages recommended by manufacturers. For example, in a study published a decade ago evaluating the experience of 28,130 children who had taken acetaminophen, there was no increased risk of gastrointestinal bleeding, kidney failure, life-threatening allergic reactions or Reye's syndrome, a potentially fatal side effect of aspirin when given to children with viral infections. Acetaminophen is also considered safe for women who are pregnant or breast-feeding, although they are wisely advised to check first with their doctors. And acetaminophen is the pain reliever of choice for those with serious allergies who may be at risk of severe allergic reactions from aspirin and Nsaids. Perhaps as a testament to its safety, acetaminophen is found, not only on its own in a variety of dosages, but also in combination with other medications, over the counter and prescription. If consumers are unaware of its presence in different medications, or if they fail to adhere to cautionary statements about dosages, it is possible to take too much acetaminophen inadvertently. As with any other medicine, with acetaminophen it is critically important to keep in mind this irrefutable adage: The dose makes the poison. For example, no one questions the safety of following recommended doses. If you can read the fine print on the label, it will tell you that for adults and for children 12 and older, two 500-milligram tablets or capsules can be taken every 4 to 6 hours, as long as no more than 8 tablets (a total of 4,000 milligrams) are taken in a 24-hour period - unless a physician says otherwise. Taking more than 4,000 milligrams a day of acetaminophen on a chronic basis can damage the liver of an adult. The danger dose would be far smaller for young children. It is easier than you may think to take more than 4,000 milligrams a day. With the higher-dose tablets (650 milligrams each) now sold to treat arthritis, you can easily exceed the safety limit if you do not follow the instructions to take 2 tablets every 8 hours, for a maximum daily dose of 6 tablets in 24 hours, adding up to 3,900 milligrams a day. Even if you follow these directions, you can exceed the recommended daily dose if you also take another medication - say, an over-the-counter cold or flu remedy - that contains acetaminophen. The label on my Tylenol Arthritis Pain has a clearly stated warning: "Do not use with any other product containing acetaminophen."
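[The label arithmetic above is worth making explicit, since the danger comes from adding a second acetaminophen-containing product; a small worked example, in which the 500 mg-per-dose cold remedy is a hypothetical figure rather than a product named in the column:]

# Daily acetaminophen totals implied by the label directions quoted above.
REGULAR_TABLET_MG = 500
ARTHRITIS_TABLET_MG = 650
ADULT_DAILY_LIMIT_MG = 4000

regular_day = 8 * REGULAR_TABLET_MG       # 8 tablets in 24 hours = 4000 mg, right at the limit
arthritis_day = 6 * ARTHRITIS_TABLET_MG   # 2 tablets every 8 hours = 3900 mg

# Hypothetical second source: a cold/flu remedy taken four times a day,
# assumed to contain 500 mg of acetaminophen per dose.
cold_remedy_day = 4 * 500

print(regular_day, arthritis_day)                              # 4000 3900
print(arthritis_day + cold_remedy_day)                         # 5900
print(arthritis_day + cold_remedy_day > ADULT_DAILY_LIMIT_MG)  # True: well over the daily limit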
But until writing this column, I admit I never read that warning, and I'd guess that more than 90 percent of other users haven't read it either. Without a magnifying glass, many elderly people who are the most likely users of an arthritis drug would have trouble reading the labels on this and many other medicines like it. A second warning on acetaminophen says: "If you drink three or more alcoholic drinks every day, ask your doctor whether you should take acetaminophen or other pain relievers/fever reducers. Acetaminophen may cause liver damage." A Liver Under Siege So, if your liver is already under attack from alcohol, acetaminophen can be that last straw, resulting in liver failure. This year, the journal Emergency Medicine warned physicians about the hazards of overdoses of acetaminophen. Dr. Shirley Kung and Dr. Kennon Heard wrote that acetaminophen poisoning could often be much worse than it seemed at first. Nausea and vomiting can progress to complete liver failure in as little as 24 hours unless the problem is promptly recognized and the proper antidote given within 24 hours of a toxic dose. To fully prevent liver injury, the antidote should be given within eight hours. Each year, more than 100,000 calls related to acetaminophen are made to poison control centers in the United States, and about 150 acetaminophen-related deaths are reported. Some cases result from deliberate overdoses by people trying to commit suicide. But many others are accidental, like the one described in the journal: an 18-month-old child with a fever and cough for three days who had been given acetaminophen every two to four hours. Other cases result when people whose livers are damaged by other disease take acetaminophen for respiratory infections or pain. From checker at panix.com Fri May 6 21:42:02 2005 From: checker at panix.com (Premise Checker) Date: Fri, 6 May 2005 17:42:02 -0400 (EDT) Subject: [Paleopsych] NYT: One Family's Story: Apples to Applejack Message-ID: One Family's Story: Apples to Applejack New York Times, 5.5.4 http://www.nytimes.com/2005/05/04/dining/04lair.html [My 7-great grandfather, Samuel Forman (1662 or 1663-1740) was High Sheriff of Monmouth County when the first Lairds came over but it was only later that the Lairds got into the liquor business. Samuel was the 5-great grandfather of both my grandparents on the Forman side, thus making them fifth cousins. I'm now a little unclear on this, but a George Forman was an accountant and went into the liquor business with a Mr. Brown, forming the Brown-Forman Company, which is one of the last independent distillers today, in the late 19th century. Previously, I had thought that Forman fled to Kentucky during the Whiskey Rebellion of 1794, which took place in what is now Pittsburgh. He then met a Mr. Brown who said, "Forman, you make mighty good moonshine; let's go into business and make it legal." Though the date is wrong by a century, there was indeed a Kentucky Forman, namely George, and it is true that my father's mother's family did hail from Kentucky before they moved to Kansas. So my father's father may have garbled the report from his wife, or his wife did the garbling herself, or maybe it was me who added to the garbling. Anyhow, the article below is a good one.] -------------- By FRANK J. PRIAL SCOBEYVILLE, N.J. LAIRD Emilie Dunn is only 7 years old, but one day history will catch up with her.
Since 1698, some 12 generations of the Laird family have lived in or around this tiny Monmouth County village, making history and, yes, applejack. The family's business, Laird & Company, is the oldest commercial distillery in the United States and one of the country's oldest family businesses. The first Laird to come to these shores, William, was a Scotsman who, his family likes to think, made Scotch whiskey back in County Fyfe and switched to apple brandy when he reached Monmouth County. Almost a century later, in 1780, a grandson of William Laird, Robert Laird, started Laird & Company; the Lairds still have his account book from that year to prove it. Nine generations of Lairds have run the company since then. Laird Emilie, the daughter of Lisa Laird Dunn, of the ninth generation since the company was started, is the only one of the 10th generation bearing the Laird name who might conceivably go into the business. The story of applejack and the history of the Lairds are intertwined. George Washington, who owned large apple orchards, wrote to the Lairds around 1760 asking for their applejack recipe. In his diary he noted on Aug. 3, 1763, that he "began selling cider." During the Revolutionary War, Washington dined with Moses Laird, an uncle of Robert, on the eve of the Battle of Monmouth. Abraham Lincoln ran a tavern in Springfield, Ill., for a time; the Lairds have a copy of his bill of fare from 1833 offering applejack at 12 cents a half pint. That's not cheap: dinner was 25 cents. Presumably Lincoln's applejack was the straight stuff. Today, the names for apple spirits are more specific. By law, applejack can refer only to a blend. "The trend has been to lighter drinks," Lisa Laird Dunn said. "Until the 1970's, our applejack was pure apple juice, fermented then distilled. Today, at 80 proof, it's a blend of about 35 percent apple brandy and 65 percent neutral grain spirits." Federal regulations also require that applejack be aged four years in used bourbon barrels. The unblended style has not been abandoned. There is Laird's 100 proof Straight Apple Brandy; Laird's 80-proof Old Apple Brandy, aged a minimum of seven and a half years, and the family's pride, Laird's 88-proof 12-Year-Old Apple Brandy, aged in charred bourbon barrels. Like a 20-year-old Calvados from the Pays d'Auge in Normandy, Laird's 12-year-old can take its place alongside most fine Cognacs. Seventeenth-century settlers in the Northeast turned to apples for their strong spirits because the weather and the soil were not hospitable to rye, barley and corn. Until whiskey began to flow through the Cumberland Gap in the 18th century, and rum, or molasses to make rum, arrived from the Caribbean as part of the slave trade, applejack was America's favorite spirit. By the 1670's, according to the Laird archives, almost every prosperous farm had an apple orchard whose yield went almost entirely into the making of cider. Hard cider - simple fermented apple juice - was the most abundant drink in the colonies. Much of it was made by leaving apple cider outside in winter until its water content froze and was discarded. About 20 years later, farmers began to distill the hard cider into 120-proof "cyder spirits," which soon became known as applejack. The first Laird distillery was a small affair behind the Colt's Neck Inn, a stagecoach stop between Freehold and Perth Amboy. While the inn is still there and still open, the distillery was moved to its current site, five miles away, after a fire in 1849. 
Originally the small plant was surrounded by apple orchards. Now most of the area is given over to horse farms and a slowly encroaching line of megamansions. "We haven't purchased an apple around here for years," Lisa Laird Dunn said. "All our apples come from the Shenandoah Valley, and they are processed in our distillery in North Garden, Va." Scobeyville is the site of the company's headquarters and its warehouses. The best apples for making applejack are small, late-ripening Winesaps, Larrie Laird said, "because they yield more alcohol." Sixteen pounds of apples produce about 25 ounces of applejack. Laird & Company is the nation's top producer of apple brandies and its only producer of applejack, but the company's production is relatively small, about 40,000 cases a year in all. To increase its sales, Laird imports wines and spirits from France, Italy and elsewhere and acts as a contract bottler for a variety of spirits producers. It buys spirits in bulk - bourbon, Scotch, tequila, Canadian whiskey, gin, vodka and others - and bottles them. Applejack and the apple brandies make up only about 5 percent of Laird's catalog. While Laird is the only producer of applejack, there are several other apple brandy makers, one of the most prominent being the Clear Creek distillery in Portland, Ore. Clear Creek calls its version Eau de Vie de Pomme, makes it from Golden Delicious apples and ages it eight years in French oak barrels. Here in Scobeyville, a representative of the eighth generation, Larrie, 65, currently president and chief executive, will eventually give way to a representative of the ninth, his daughter, Lisa Laird Dunn, 43, vice president of sales and marketing, and her cousin, John E. Laird III, 57, executive vice president and chief financial officer. After that, it all depends on Laird Emilie. Of course, she has a few years to think about it. From checker at panix.com Fri May 6 21:39:43 2005 From: checker at panix.com (Premise Checker) Date: Fri, 6 May 2005 17:39:43 -0400 (EDT) Subject: [Paleopsych] CHE: A glance at the spring issue of The Wilson Quarterly: The future of big cities Message-ID: A glance at the spring issue of The Wilson Quarterly: The future of big cities The Chronicle of Higher Education: Magazine & journal reader http://chronicle.com/prm/daily/2005/05/2005050301j.htm 5.5.3 Just at the time large cities seem to be most dominant, huge urban centers may be losing ground, says Joel Kotkin, a fellow at the Steven L. Newman Real Estate Institute at Bernard M. Baruch College of the City University of New York, and a visiting lecturer in history, theory, and humanities at the Southern California Institute of Architecture. The 21st century is the first in which a majority of people live in cities, he says, but recent technological and demographic changes threaten to weaken cities. Advances in telecommunications now allow individuals and corporations to conduct business from places outside major cities. Small towns and suburban areas around cities are drawing more and more professionals and businesses out of the city centers, he says. Immigrants and young people have traditionally bolstered city populations, but many immigrants are choosing to live in outlying areas, he says, and young people who start their careers in cities now tend to move out when they start families or businesses of their own. And, in many developed countries, the younger population is dwindling because of low birth rates.
Many cities have focused on tourism and entertainment to compensate for their losses, but "a busy city must be more than a construct of diversions for essentially nomadic populations," he writes. "It requires an engaged and committed citizenry with a long-term financial and familial stake in the metropolis." The needs of cities have not changed much over the millennia they have existed, he says, and "to be successful today, urban areas must resonate with the ancient fundamentals -- they must be sacred, safe, and busy." While cities no longer need to be built around temples or identified with particular gods, their citizens should share a sense of common purpose and identity, he says. Citizens also need to perceive their cities as safe, a special challenge in the face of terrorism. "In the sprawling cities of the developing world, the lack of a healthy economy and the absence of a stable political order loom as the most pressing problems," he says. But cities in developed countries "seem to lack a shared sense of sacred place, civic identity, or moral order," he writes. "And the study of urban history suggests that affluent cities without moral cohesion or a sense of civic identity are doomed to decadence and decline." The article, "Will Great Cities Survive?," is adapted from Mr. Kotkin's recent book, The City: A Global History (Modern Library, 2005) and is not online. Information about the journal is available at [53]http://wwics.si.edu/index.cfm?fuseaction=wq.welcome --Kellie Bartlett From checker at panix.com Sat May 7 00:00:46 2005 From: checker at panix.com (Premise Checker) Date: Fri, 6 May 2005 20:00:46 -0400 (EDT) Subject: [Paleopsych] Meme 040: Everything I Learned in Graduate Economics Was Wrong Message-ID: Meme 040: Everything I Learned in Graduate Economics Was Wrong sent 5.5.6 I mean that the assumption that investors were both knowledgeable and rational, that the less knowledgeable and rational investors got driven out, is wrong. I'd have thought, along with Gary North (below), that investors would peer behind accounting rules and see the real figures. And I'd have thought that they would have seen that health care has been rising faster than GDP for quite a long time now and would have factored this in when estimating the true wealth of General Motors. In other words, like conspiracy theorists, I want to believe that somewhere, somehow there are competent people. There are, of course, but their competence is more sharply limited than I had hitherto thought. And now I have an explanation why businesses clamor for cheap labor through immigration, while they surely ought to know that cheap labor for one business means cheap labor for all competing businesses, with the result that there is no increase in profits in any economy with a reasonably healthy amount of competition, which is the case for the United States. The explanation is that they are bad economists, as North shows in the case of investors. And so are politicians and liberals who clamor for an increase in the minimum wage. ---------------------- Gary North's REALITY CHECK Issue 444, 5.5.6 GENERAL MOTORS RUNS OVER THE EXPERTS DETROIT (AP) -- Standard & Poor's Ratings Services cut its corporate credit ratings to junk status for both General Motors Corp. and Ford Motor Co., a significant blow that will increase borrowing costs and limit fund-raising options for the nation's two biggest automakers. Shares of both companies fell 5 percent or more after Thursday's downgrades, and the news sent the overall market lower. 
"New York Times" (May 5, 2005) All of a sudden, without warning, the investment world is talking about the looming crisis at General Motors. Its pension fund obligations and health care obligations now appear to threaten the future of the company. The decline of its stock price from $55 in January, 2004, to today's $30 range has revealed a loss of confidence in the company by investors. To see this decline in action, click here: http://shurl.org/gm05 I have no objection to the experts' pessimism regarding the future of General Motors. I happen to share it, and have for years, precisely because of the pension issue. What astounds me is that investors and financial columnists have only just begun to regard the company's pension obligations as a significant factor in the future profitability of the firm. Why now? Why not in 2003 or ten years ago? The United Auto Workers' officers and GM's senior managers decided decades ago to agree to high pension and health benefits in exchange for reduced increases in wages. Health care benefits are tax-free income for workers. Even retired workers are covered. It seemed like a low-risk deal for GM. Nobody thought about the price effects on health care of Medicare. The health care market, like all markets, is a giant auction. If bidders get their hands on more money, they will bid up prices. All over America, workers are bidding health care prices. So are retirees. A DISASTER CALLED OPEB Alan Sloan, a financial columnist for "Newsweek," has painted a stark picture. He begins with a description of how GM got into this pickle. Lower salaries meant that GM reported higher profits, which translated into higher stock prices -- and higher bonuses for executives. Commitments for pensions and "other post-employment benefits" -- known as OPEB in the accounting biz -- had little initial impact on GM's profit statement and didn't count as obligations on its balance sheet. So why not keep employees happy with generous benefits? It was a free lunch. Besides, GM's only major competitors at the time, Ford and Chrysler, were making similar deals. This is the free lunch mentality: something for nothing. As with all free lunches, people eat more than they normally would. The price is right! Now, as we all can see, pension and health care obligations are eating GM alive. The bill for the "free" lunch has come in -- and GM is having trouble paying the tab. In the past two years, GM has put almost $30 billion into its pension funds and a trust to cover its OPEB obligations. Yet these accounts are still a combined $54 billion underwater. Note the phrase, "as we all can see." But nobody saw it until about February, 2004. Sloan says the problem by then had been building for over half a century. GM began its slide down the slippery slope in 1950, when it began picking up costs for medical insurance, pensions and retiree benefits. There was huge risk to GM in taking on these obligations -- but that didn't show up as a cost or balance-sheet liability. By 1973, the UAW says, GM was paying the entire health insurance bill for its employees, survivors and retirees, and had agreed to "30 and out" early retirement that granted workers full pensions after 30 years on the job, regardless of age. These problems began to surface about 15 years ago because regulators changed the accounting rules. In 1992, GM says, it took a $20 billion non-cash charge to recognize pension obligations. Evolving rules then put OPEB on the balance sheet. 
Now, these obligations -- call it a combined $170 billion for U.S. operations -- are fully visible. And out-of-pocket costs for health care are eating GM alive. I report this because of the delay factor. This was all built in, Sloan says. He is correct. It is why I counselled small businessmen in the late 1970s not to set up health plans and pension plans for their employees. The legal liability was too great, I warned them. But I was almost alone in this view. Not now. "DON'T ARGUE WITH THE MARKET!" We are told that the stock market discounts the future rationally. This means that the best and the brightest investors use their best estimates to buy and sell. Today's prices therefore include all of the relevant information, as judged by experts who bought or sold. Any unexpected price changes must come from new information or new perceptions that had not operated before. With respect to GM, it's "new information, no; new perception, yes." The information was there for many years. All of a sudden, investors' perception changed. Down went GM shares. Yet the basics had not changed. By tying stock pricing theory to information, and by relegating changed perceptions to the footnotes, economic commentators can then tell us that good times are coming, that bad news will be more than offset by good news. After all, isn't the stock market rising? Anyway, it's not falling. "Don't argue against the stock market!" Here is the reality of stock market pricing: seriously bad news is not discounted until it threatens the survival of the company. Optimism usually prevails among investors. Only toward the end of a bear market does investor perception change. With respect to pensions and health care, optimism is government policy. The government has assured us, year after year, that "pay as you go" works just fine for Social Security and Medicare, smart people believed the spiel. They carried the same attitude with them when they looked at GM's pension/health obligations. They refused to factor in the estimated numbers. At the end of last year, GM says, its U.S. pension funds showed a $3 billion surplus. GM's pension accounting, which assumes that the funds will earn an average of 9 percent a year on their assets, is highly optimistic. But things are under control -- as long as GM stays solvent. By contrast, OPEB is out of control. At year-end, OPEB was $57 billion in the hole, even though GM threw $9 billion into an OPEB trust in 2004. http://shurl.org/gmsloan Consider these numbers in relation to GM's market capitalization of about $17 billion. The company is deeply in debt: around $300 billion. (http://shurl.org/gmdebt) It had to sell $17.6 billion in bonds in 2003 to meet its pension obligations. Yet in January, 2004, its share value peaked. Optimism still reigned supreme. The best and the brightest missed what should have been obvious. It could happen again. Next time, it could happen to a lot more companies. The worse the news out of Medicare, the less optimistic the outlook of investors. A MINI-WELFARE STATE? Political columnist George Will has described the plight of GM as the common plight of the welfare state, in an article, "The Latest welfare state? It's General Motors." Who knew? Speculation about which welfare state will be the first to buckle under the strain of the pension and medical costs of aging populations usually focuses on European nations with declining birth rates and aging populations. Who knew the first to buckle would be General Motors, with Ford not far behind? 
GM is a car and truck company -- for the 74th consecutive year, the world's largest -- and has revenues greater than Arizona's gross state product. But GM's stock price is down 45 percent since a year ago; its market capitalization is smaller than Harley Davidson's. This is partly because GM is a welfare state. Will's angle is a nice touch. A journalist looks for a hook to snag readers, and the current discussions about the demographic train crash of the Western world's retirement and medical programs serve as a convenient hook. Statistically, it's the same problem: the bills are coming due, and there is no money set aside to pay them. But GM is not a state. It is run by profit-seeking managers on behalf of profit-seeking investors by means of serving consumers who have a choice to buy or not to buy. Why should GM's managers and investors make the same mistake as politicians? For politicians, it never was a mistake. It was a way to get each era's voters to hand more money over to the politicians, whose careers would end long before the demographic day of reckoning arrived. It involved hoodwinking the voters by promising them future goodies. The voters who saw through the sham could not sell their shares. There are no shares to sell. The system is compulsory. GM's shareholders can sell, and have. The problem is, the managers at GM seem to have acted in the same short-sighted, self-interested way. So did a generation of investors in GM stock. Yet we free market advocates like to believe that things are different in free markets than in political affairs. Are we wrong? No. But we have to understand how the system works. The problem has been building for a long time. The tax code has treated the funding of future benefits as deductible expenses to a company, but not taxable events for the employees. Labor union saw the advantage. They could claim victories in their negotiations with management. This is true across the board, in company after company. What has been in it for senior management? Stock option profits. It is legal for managers of American companies to reward themselves by investing workers' retirement money in corporate shares. This raises the value of managers' stock options. This is what Enron's senior managers did. It is a widespread practice. Profit-seeking people respond to incentives. The tax code has created incentives for pension fund payments. The tax code has also provided incentives for stock options: long-term capital gains, taxed at a lower rate than salaries. Government-authorized accounting practices have added to the illusion of future wealth: assumptions regarding estimated future investment returns based on the post-1982 stock market boom-era. GM expects to earn 9% per annum in its pension fund. How? The federal government has created business in its own image with respect to pension funds. The bills are now coming due. COST PER CAR The cost of health care plans for GM workers is now over $5 billion a year. This is now affecting GM's ability to compete. Writes Will: GM says health expenditures -- $1,525 per car produced; there is more health care than steel in a GM vehicle's price tag -- are one of the main reasons it lost $1.1 billion in the first quarter of 2005. But it's not just GM. Ford's profits fell 38 percent, and although Ford had forecast 2005 profits of $1.4 billion to $1.7 billion, it now probably will have a year's loss of $100 million to $200 million. 
All this while Toyota's sales are up 23 percent this year, and Americans are buying cars and light trucks at a rate that would produce 2005 sales almost equal to the record of 17.4 million in 2000. Foreign auto companies are steadily eating into GM's profits. GM's market share keeps dropping. So is the market share of the other members of the Big Three. In 1962 half the cars sold in America were made by GM. Now its market share is roughly 25 percent. In 1999 the Big Three -- GM, Ford, Chrysler -- had 71 percent market share. Their share is now 58 percent and falling. Twenty-three percent of those working for auto companies in North America now work for companies other than the Big Three, up from 14.6 percent just five years ago. The number of Big Three employed workers has fallen by 134,000 since 2000. Then these is the issue of who should pay for these benefits. The free market's answer is clear: consumers. Their money determines what should be produced. If consumers say, "No; your price is too high," this leaves GM's management with bills to pay and no income to pay them. When the bills come due, those receiving them start looking for other people to share the burden. The bills are coming due for GM. GM says its health care burdens, negotiated with the United Auto Workers, put it at a $5 billion disadvantage against Toyota in the United States because Japan's government, not Japanese employers, provides almost all health care in Japan. This reasoning could produce a push by much of corporate America for the federal government to assume more health care costs. This would be done in the name of "leveling the playing field" to produce competitive "fairness." In short, because taxpayers in Japan are required to pay for health costs of Japanese auto workers, American firms want you and me to dig a little deeper into our wallets and our futures, in the interest of fairness. It doesn't sound fair to me. I didn't sign those long-term contracts with GM's workers. I didn't lower my costs of production by making promises instead of paying higher wages. Then there are GM's retirees: "Health care for retirees and their families -- there are 2.6 of them for every active worker -- is 69 percent of GM's health costs." http://shurl.org/gmwill Up, up, up go medical costs. Down, down, down go GM's profits. We think of GM as an auto company. But its auto division is small potatoes. About 80% of GM's profits come from GMAC, its in-house loan company: consumer credit and mortgages. It profited greatly during the mortgage boom. But this source of profits has begun to taper off. Now what? CONCLUSION This report is about GM, insofar as GM is representative of a mindset. Managers have treated GM as a career investment vehicle. Workers have treated it as a rich uncle who will always be there with money. Investors have treated GM as if the company were not subject to the reality of long-term increases in medical care costs. In retrospect, the experts say all of this was visible years ago. But the share price of GM indicates that nobody paid any attention until it was too late. This is why I am not impressed by economists who assure the public that Social Security/Medicare are not out of control, that there is time to maneuver. Nobody in charge ever seems to maneuver until the investment vehicle goes into a skid on an icy road in the mountains. Bad news is dismissed as irrelevant. Statistical reality is deferred by investors until they finally start unloading shares. 
Then there is not much that the people in charge can do to solve the problem. If highly sophisticated investors are this naive about where their money is being invested, why should we expect politicians to tell us the truth about the looming insolvency of Social Security/Medicare? [I am sending forth these memes, not because I agree wholeheartedly with all of them, but to impregnate females of both sexes. Ponder them and spread them.] From checker at panix.com Sat May 7 00:01:04 2005 From: checker at panix.com (Premise Checker) Date: Fri, 6 May 2005 20:01:04 -0400 (EDT) Subject: [Paleopsych] Human Genetics News 5.5.6 Message-ID: Human Genetics News 5.5.6 This is a news clipping service from Human Genetics Alert. (www.hgalert.org) The articles selected do not represent HGA's policies but are provided for information purposes. For subscription details, please refer to the end of this mail. *********************************************************** Contents 1. Pioneering stem-cell surgery restores sight 2. Genetic Screening for Iron Disease Feasible 3. Giving its DNA code away 4. Vampire fears over DNA data 5. N.C. House members filed eugenics compensation bill 6. D.C. scientist breaks new lead on gay gene 7. Lawmakers kill proposed pet cloning ban 8. Newborn Screening Education Materials Lacking: Study 9. Spanish government accepts the assisted reproduction *********************************************************** 1. Pioneering stem-cell surgery restores sight 29 April 2005 www.timesonline.co.uk/article/0,,2-1589642,00.html The Times By Sam Lister, Health Correspondent A PIONEERING form of surgery has been developed that can restore the sight of patients by using stem cells to encourage damaged eyes to repair themselves. A team of British specialists has successfully treated more than a dozen patients with impaired corneas by transplanting human stem cells grown in a laboratory on to their eyes. Recent operations on ten patients showed that the technique restored sight in seven cases of people who had been blinded after getting acid, alkali and boiling metal in their eyes, or because of congenital disorders. Many of the patients treated at the Centre for Sight, Queen Victoria Hospital, in East Grinstead, West Sussex, had been told that they had no hope of getting their sight back, or had already undergone failed corneal transplants. The process involves taking stem cells, which occur naturally in the eye, and developing them into sheets of cells in the laboratory. These are transplanted on to the surface of the eye where they are held in place by an amniotic membrane, which dissolves away as the sheet fuses to the eye. Sheraz Daya, an ophthalmic surgeon leading the Sussex team, which has spent five years perfecting the technique, said that doctors had been astonished at how the cells appeared to trigger the eye's natural regeneration of its damaged surface. Tests on the patients after a year revealed no trace of the DNA of the stem-cell donor, meaning that the repair was carried out by the eye's own cells - a permanent healing process that does not require long-term use of powerful drugs to suppress the patient's immune system. Mr Daya said: "The technique not only works, but there was no donor tissue there. That is what really blew our minds. The cells appeared to have been shed from the eye and replaced by the patient's own, much more hardy, cells." 
The team, including scientists at the hospital's McIndoe Surgical Centre, now hopes to identify the processes at work, which might then be used to trigger the repair of other damaged tissue around the body. Details of the trial were revealed this month at an international conference of eye specialists in America. All the patients in the trial had corneas that had become damaged because they no longer had limbal stem cells, which are normally under the eyelid and help to keep the surface of the cornea clear, protecting it. Edward Bailey, who lost his sight after caustic acid landed in his left eye while he was cleaning pipes at a yoghurt factory, said that the operation had transformed his life. "It was the most emotional moment," Mr Bailey, 65, said. "I couldn't believe it. For ten years all I had seen was shades of black and grey, then after I had the operation the nurse came by and I saw a flash of blue from her uniform. I went home and when I took the patch off my eye, I had my vision back. It is only when you lose something like sight that you realise how precious it is." Nadey Hakim, a consultant surgeon at St Mary's Hospital, London, said that it was likely that such action could be mimicked in other organs, thus reducing the need for organ transplants. Professor Hakim said: "The hope is that stem cells will one day be used to generate large quantities of cells and tissues and possibly entire organs damaged by disease and injury. It is a dream." *********************************************************** 2. Genetic Screening for Iron Disease Feasible 26 April 2005 http://www.newscientist.com/article.ns?id=dn7306 The Lancet NEW YORK (Reuters Health) - Although genetic screening for hemochromatosis, a type of iron disease, is considered controversial, new research indicates that such screening can be successfully applied in a workplace setting with high satisfaction rates. Hemochromatosis, which is often associated with a mutation in a gene called HFE, occurs when the body absorbs more iron than is needed from the diet. Since the body lacks a method to rid itself of iron, it accumulates in various organs, resulting in a range of symptoms as well as potentially serious complications. Patients with the disease are often required to give blood every few months to keep their iron levels down. The controversy regarding screening stems from the fact that not everyone with an HFE mutation will go on to develop hemochromatosis. This can lead to anxiety among those who test positive and may lead to discrimination by insurers and employers. However, identifying the condition early is important to reduce iron build-up before permanent damage occurs. As reported in the medial journal The Lancet, Dr. Katie J. Allen, from the Murdoch Children's Research Institute in Melbourne, Australia, and colleagues looked for a key HFE mutation in cheek swabs that were obtained from 11,197 adults at their workplaces. A total of 1325 subjects were heterozygous for the mutation, meaning that one of their two HFE genes was normal, while the other had the mutation. Fifty-one subjects were homozygous for the mutation, having both HFE genes with the mutation. The remaining subjects had two normal genes. Subjects homozygous for the mutation are immediately diagnosed as having hemochromatosis, whereas those who have just one abnormal gene may or may not develop the disease. One month after receiving the test results, subjects homozygous for the mutation did not report increased anxiety compared with other subjects. 
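[The genotype counts in the study just described are close to what a Hardy-Weinberg calculation would predict; the equilibrium assumption is an editorial addition for illustration, not part of the study:]

# Observed genotype counts from the workplace screening reported above.
N_TOTAL = 11197
N_HET = 1325   # one copy of the HFE mutation
N_HOM = 51     # two copies

# Frequency of the mutant allele, counting two alleles per person.
q = (N_HET + 2 * N_HOM) / (2.0 * N_TOTAL)

# Hardy-Weinberg expectations at that allele frequency.
expected_hom = q * q * N_TOTAL
expected_het = 2 * q * (1 - q) * N_TOTAL

print("mutant allele frequency: %.3f" % q)                                    # about 0.064
print("expected homozygotes: %.0f (observed %d)" % (expected_hom, N_HOM))     # about 45 vs 51
print("expected heterozygotes: %.0f (observed %d)" % (expected_het, N_HET))   # about 1336 vs 1325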
Most importantly, nearly all of the homozygous subjects took measures to prevent or treat iron build-up. Because the authors were able to reach an agreement with the Australian insurance industry, all of the subjects who were homozygous for the mutation had their policies underwritten at standard rates. At present, an economic analysis is underway to determine if this screening approach is cost-effective, the investigators note. In a related editorial, Dr. Paul C. Adams, from the London Health Sciences Center in Ontario, Canada, comments that the current study is a "strong endorsement for the feasibility and acceptability of genetic testing for hemochromatosis in the workplace." However, he adds that "it is likely that optimum screening strategies, including no screening, will vary in different countries depending on various medical, ethical, legal, and social issues." *********************************************************** 3. Giving its DNA code away 27 April 2005 http://www.baltimoresun.com/news/health/bal-bz.celera27apr27,1,4537039.s tory?page=2&cset=true&ctrack=1&coll=bal-health-headlines Baltimore Sun By Tricia Bishop Public domain: The for-profit rival in the race to map the human genome will give its DNA sequences to a national biotechnology center. Five years ago on a summer day in the East Room of the White House, then-President Bill Clinton and Tony Blair - the British prime minister weighing in by satellite - hailed the mapping of the human genome as "the first great technological triumph of the 21st century." It was an achievement that many said would one day lead to eradication of disease and the creation of made-to-order, individualized drugs. On each side of the president were the beaming victors, ready to reap the spoils: a brash, but brilliant scientist named J. Craig Venter, then president of Celera Genomics Group of Rockville, and the accomplished Francis S. Collins, head of the Human Genome Project, an international consortium of academic laboratories led by the National Institutes of Health. The two factions - the first for profit, the second not - had been bitter rivals in the race to sequence human genes, egging each other forward and ultimately, diplomatically, agreeing to share worldwide credit for identifying the human recipe. Neither, however, seemed willing to give on one point of contention: whether the data belonged in the public or private domain - until yesterday. During a routine conference call to discuss quarterly earnings yesterday morning, Celera Genomics announced that after July 1 it would contribute much of its hard-earned DNA sequence data to public domain through the National Center for Biotechnology Information, a division of the National Institutes of Health. "This data just wants to be public," said a pleased Collins, who is also director of the National Human Genome Research Institute. "It's the kind of fundamental information that has no direct connection to a product, it's information that everybody wants, and it will find its way into the public." Celera Genomics, a unit of Connecticut-based Applera Corp., was unable to make a commercial success trading in the genetic information. It has spent the past three years slowly dismantling its foundation as a supplier of genetic data to instead concentrate on drug development, a transformation that will become official this summer. "This has been a very long kind of planned exit strategy from that business," Peter Dworkin, Applera's vice president for investor relations, said in an interview. 
"We're coming to an end of that period." Also coming to an official end is a contest that has raged for years, begun when Celera increased efforts to map the human genome by declaring it, too, would tackle the project, despite an eight-year head start by public laboratories. A competition The story began in 1998, when Applera created Celera Genomics to leverage technology developed by another of its holdings, Applied Biosystems. Applied Biosystems had created the means to sequence genes being used by scientists within the Human Genome Project, under way since 1990. Celera's presence turned the project into a competition, both frustrating and fruitful for the consortium scientists, who were suddenly forced to speed up their efforts and consider other possibilities. Access to the resulting information was a battleground from the start, with some opposing Celera's efforts because they feared the company would try to patent the genes and lay claim to the human gene code. Shortly before the historic joint announcement in June 2000 that the first full-length record of human DNA had been catalogued, both Clinton and Blair had argued for "unencumbered access" to the data. And Celera obliged, with a caveat: cost. Many believed there was money to be made on the data itself, selling access to it or developing drugs based on it. But it was much easier said than done, and a venture that some say is still best suited to the world of grant-funded research, which can focus on discoveries with less worry about the bottom line. Celera's get-rich plan was to sell subscriptions to the genetic information, and get "income from customers using our data to make discoveries," Venter, the company's former president, said in 2000. What he and his colleagues didn't quite seem to grasp was that their counterparts in academia had similar information, and they weren't going to charge for access to it. Others ran into similar situations, discovering that academics were publishing their research on the Internet, accessible to anyone with a computer and a connection. Incyte Pharmaceuticals of Delaware, for example, began life as a company that sells genomic research databases, but today - like Celera - is becoming a "leading drug discovery and development company by building a proprietary product pipeline of novel small molecule drugs," according to its Web site. Stocks soared "People don't want to pay for it if it's going to become free," said Constance Hsai, a biotechnology analyst who follows Celera Genomics for SG Cowen Securities Corp. in New York. Hsai owns five shares of the company's stock, bought years ago when she was a graduate student and stocks for companies working on the human genome map were soaring, peaking at $247 per share in March 2000. They've since fallen back to earth: Celera Genomics' stock fell 30 cents yesterday to close at $10.07 on the New York Stock Exchange. "This was all uncharted territory, we were trailblazers and pioneers in this area. ... We really helped kind of create this era of genomic science," Dworkin said. "You can't know everything when you start out." Venter resigned from the business in 2002, shortly before the company announced it would shift gears and stop marketing its genome databases. Those resources would instead go toward developing products. Applied Biosystems still makes technology others can use in interpreting the information, whether they've paid for access to it or found it free on the Web. 
"It's a natural evolution of genome science," said Dennis Gilbert, chief scientific officer of Applied Biosystems. "The payoff from the human genome is discoveries people will make, and that's the phase we're entering now." Those affiliated with the Human Genome Project say Celera's information had become outdated as well because they stopped at mapping a draft of the human genome, while the public consortium worked until 2003 to complete its data. "In many ways, the product that Celera was holding onto decreased in value," said Aristedes Patrinos, who represented the U.S. Department of Energy in the Human Genome Project. He also lent the use of his basement to the two sides in May 2000, when, over jalapeno pizza, Venter and Collins agreed to share credit. Patrinos said he believes Venter, who could not be reached yesterday, would want the information public. Venter is busy with other enterprises these days, though. He's started his own non-profit organization - the J. Craig Venter Institute, based in Rockville - "dedicated to the advancement of the science of genomics" and understanding its societal implications. Currently, his institute is working on projects to catalog the genomic spectrum found in air, as well the various microbes in marine and terrestrial environments. Venter has sailed around the globe collecting data. Seeing variations Collins said the information released by Celera to the National Center for Biotechnology Information - certain human, mouse and rat DNA sequences - will likely not do much to further assembly of genomes, though it will be useful in demonstrating how data differs in different subjects. "I give a lot of credit to [Applied Biosystems] and Celera," Collins said. "It does make sort of the battle days of what appeared to be an unpleasant race a distant memory." *********************************************************** 4. Vampire fears over DNA data 3 May 2005 http://australianit.news.com.au/articles/0,7204,15155214%5E15321%5E%5Enb v%5E15306,00.html The Australian Karen Dearne A PRIVATE DNA database project that aims to collect blood samples from 100,000 indigenous people - including Australian Aborigines - as a means of tracing ancient migration routes has reignited fears of "vampire research" and claims of biopiracy. The $US40 million ($51 million) Genographic Project, led by US population geneticist Dr Spencer Wells, will rely on massive computing power to investigate the genetic roots of modern humans. National Geographic is co-ordinating an international team of scientists to collect DNA samples and oral histories from indigenous people. IBM is contributing its Blue Gene computational biology machines and data analysis tools. The five-year project, funded by the Waitt Family Foundation, headed by Gateway computer billionaire Ted Waitt, was immediately denounced by the US-based Indigenous Peoples Council on Biocolonialism. The group declared the Genographic Project "a clone" of the Human Genome Diversity Project that it defeated in the early 1990s. Dubbed the "vampire project", the HGDP was considered to be an "unconscionable attempt" by genetic researchers "to pirate our DNA for their own purposes". The council has called for an international boycott of IBM, National Geographic and Gateway until the project is dropped. It's understood the matter will be discussed by the Australian Institute of Aboriginal and Torres Strait Islander Studies at its council meeting this month. 
Institute research director Luke Taylor said the body had only been made aware of the project through a media kit that arrived a couple of days before its Australian launch. "The kit has been referred to institute chairman Mick Dodson and we'll be evaluating it," Dr Taylor said. "At this stage, there has been no consultation with us, and we've been given no details of what appears to be a complex and problematic study." The media kit invites participation in the public part of the project. Anyone can take part by logging on to the National Geographic website, paying a $US100 fee for a swab kit to return a saliva sample, and providing some non-identifying family information for inclusion in the database. GeneEthics Network executive director Bob Phelps said the project was being "sweetened" by public participation and access, but "it's still biopiracy". "Here we've got indigenous people who are being overrun by dominant populations, but these researchers are not advocates for them," Mr Phelps said. "It's like animals in the zoo: taking the last remnants of disappearing peoples and grabbing the material that may be of scientific or commercial use in the future. "They're making sure that their DNA doesn't disappear, instead of saying these people are of value and are entitled to survive in their own right, aside from their genetic material." The head of IBM's Computational Biology Centre, Ajay Royyuru, said the Genographic Project would not involve collection of sensitive information on individuals' medical or health status. "The information we're gathering is only about geographic location and the language they speak," Dr Royyuru said. "We aim to create a database that holds information about markers that speak of deep ancestry and the migratory routes that our ancestors have travelled. "We are deliberately not looking for health information. We will not be gathering data that is medically relevant." All research outcomes would be published and the entire database would become a public resource at the project's completion, he said. "We recognise that the data we're gathering is perhaps the most personal information. It is you, your genome, which is unique to you," he said. "We've found that if we are up-front about what we will do and what we will not do, if we tell people what the project's about, they're actually delighted to participate. If the project succeeds, it will be because enough people on the planet understand that when we share this data, we'll be able to interpret it. "We can only discover the story of our migratory history when we put all the details together and look at the correlations." Australian Law Reform Commission acting president Brian Opeskin said there were considerable medical and cultural sensitivities about the creation of genetic databases, and particularly commercial use of the data. An ALRC inquiry on the protection of human genetic information in 2003 recommended strengthening existing measures. "No one would really doubt the value of many of these databases, particularly for medical research, but questions arise when there are conflicts of interest and profit-taking by those involved in collecting the data," Mr Opeskin said. "Every few months there's some new use for genetic information. "What happens when researchers collect DNA data for one purpose, and then want to make different uses of it later? 
"If people are well-informed they are often quite happy to be altruistic in contributing their samples, but discovering later that the material is being used for commercial purposes often causes a lot of grief." *********************************************************** 5. N.C. House members filed eugenics compensation bill 5 May 2005 www.tuscaloosanews.com/apps/pbcs.dll/article?AID=/20050505/APN/505051138 &cachetime=3&template=dateline Tuscaloosa News, Alabama (Associated Press) North Carolina would give each victim of eugenics sterilization in the state $20,000 in compensation if a measure filed Thursday in the state House became law. The bill, filed by several Democrats, would place $69.1 million from a special fund to cover claims filed before mid-2009. About 7,600 people were sterilized under North Carolina's program, which ordered the operations from 1929 through 1974. Many of them were sterilized against their will, and the program was the third-largest in the nation, after California and Virginia. Some researchers say about 3,400 victims are still alive in North Carolina. Most of the victims were poor women who were often talked into sterilization by social workers. Inaccurate labels of "feeble-mindedness" were often used as justification based on eugenics, the movement to solve social problems by preventing the "unfit" from having children. Gov. Mike Easley apologized for the program in 2002 after the Winston-Salem Journal ran a series of articles exposing the abuses that took place. Legislators also repealed the state's old sterilization law, but the state has not offered any tangible form of compensation. A commission that Easley appointed in 2003 recommended that the state at least provide health and education benefits for sterilization victims. Those benefits haven't been approved. The bill would order the Department of Health and Human Services to determine whether each claim was valid. Compensation for a person who files a claim but dies before receiving the money would be forwarded to a descendant's estate. House Speaker Jim Black, D-Mecklenburg, said recently he wants legal issues thoroughly researched on the compensation idea. Senate leader Marc Basnight, D-Dare, hasn't taken a position yet on the idea. Some fear that the requested reparations would set a precedent for other types of victims. *********************************************************** 6. D.C. scientist breaks new lead on gay gene 6 May 2005 www.washblade.com/2005/5-6/news/localnews/dcdcience.cfm Washington Blade By Eartha Melzer NIH study builds on genetic theory of sexual orientation A recent study by researchers at the National Institutes of Health has added to the body of knowledge on the relationships between genes and sexual orientation, according to a recent issue of Human Genetics. Although the research was concluded two years ago, the small number of people working on the issue resulted in a delay of two years before the research was published, lead investigator and D.C. resident Dr. Dean Hamer said this week. The investigation builds on studies that have suggested that there tend to be clusters of gays within a family. In 1993, a group of researchers under the direction of Hamer, who was also a researcher on the recent study, examined DNA from gay men and their family members and found that gay men within a family share a segment of DNA on the X chromosome, which men inherit only from their mothers. 
"This told us that genes play a role," said Brian Mustanski, one of the researchers on the genomescan. "But it doesn't tell us where the genes are or what they do." To develop a more precise picture of what genes might be involved in sexual orientation, researchers examined the genes of 456 individuals from 146 unrelated families - 137 families with two gay brothers and 9 families with three gay brothers. Researchers reasoned that brothers are expected to share an average of 50 percent of their genes but that genes that influence sexual orientation would be shared more than 50 percent of the time by gay brothers. Mustanski compared the process of scanning gay brothers for sexual orientation-related genes to looking for doctors in a town of 40,000 people, a number that corresponds to the number of human genes. "You could take a guess that [a doctor] probably lives in a six bedroom brick house - and only go to a few houses that meet this criteria," Mustanski said. "Alternatively, you could go to every street in the town and knock on one door in the neighborhood and ask them if a doctor lives on their street. We used this second approach and narrowed it down to a few streets that are likely to have a doctor on them. When we say 'chromosomal regions,' it is akin to the street. The next step is to discover which specific gene within these newly discovered chromosomal regions, is related to sexual orientation," he said. The researchers placed 403 markers across the genome. This strategy revealed three chromosomal areas that are shared by the gay brothers around 60 percent of the time. This frequency of shared markers is not a "significant link," according to Mustanski but it does rise to the level of "suggestive link." Mustanski said that the idea that these chromosomal regions are related to sexual orientation is very compelling because the areas identified through the scan are known to contain genes involved in sexual orientation. "I think it's important because it reinforces the theory that sexual orientation is at least partially genetic and that there are many different genes, not just one or two," Hamer said. "I think it is important knowledge because homophobes often argue that sexual orientation is a choice, which simply isn't true. It is important to have concrete data showing that it is not simply a choice." Research into genetic aspects of homosexuality is controversial. Hamer said that the effect of politics on science can be seen in the fact that there have only been five papers on the subject in 10 years. "In 1994 our lab discovered a gene involved in anxiety, and there have been 850 papers on that." Hamer said. The Council for Responsible Genetics, a 21-year-old Cambridge, Mass.-based group founded by scientists to educate the public on genetics issues, has issued a position paper on the hunt for the genetic basis of sexual orientation. *********************************************************** 7. Lawmakers kill proposed pet cloning ban 4 May 2005 www.reuters.com/newsArticle.jhtml?type=oddlyEnoughNews&storyID=8388222 Reuters SACRAMENTO, Calif. (Reuters) - California lawmakers rejected a proposal on Tuesday that would have banned sales of cloned pets, a measure aimed at a San Francisco-area company's bid to replicate beloved family animals for profit. Proponents of the measure argued for the pet-clone ban because the technology was unregulated and animal shelters were already filled to capacity with potential pets. 
The proposed ban came after the first sale of a cloned pet last year by Sausalito, California-based Genetic Savings & Clone Inc. The company revealed in December it had cloned a cat -- named Little Nicky after its progenitor, Nicky -- for a client in Texas for $50,000. The privately held company financed by billionaire John Sperling has said it has other cat clones in various stages of production and is developing a dog-cloning service. A California State Assembly committee rejected the bill after lawmakers raised concerns that a ban on cloned pets was premature because of uncertainties surrounding the future of the technology. "I believe that the bill is a good candidate for a serious study by experts," said Democrat Gloria Negrete-McLeod. Defeat of the measure came as California moves to set up its own $3 billion publicly financed stem cell research program, the largest pool of public funding in the United States. The pet cloning ban would not have extended to stem cell research, according to its sponsor. *********************************************************** 8. Newborn Screening Education Materials Lacking: Study 4 May 2005 www.drkoop.com/newsdetail/93/525442.html DrKoop.com (from HealthDay Reporter) By Serena Gordon, HealthDay Reporter Parents aren't getting enough or the right kind of information, researchers say Parents aren't getting enough information about the genetic screening tests performed on their newborns. That's the conclusion of a study appearing in the May issue of Pediatrics that found the materials explaining newborn screening tests varied significantly from state to state, and none of the materials contained all of the information recommended by the American Academy of Pediatrics (AAP). ******************************************************** 9. Spanish government accepts assisted reproduction bill 6 May 2005 www.eitb24.com/noticia_en.php?id=58549 EITB24 - Basque News and Information Channel The regulation, which provides for genetic selection for therapeutic purposes, keeps surrogate motherhood forbidden but sets no age limit for artificial insemination. The council of ministers is expected to approve the draft bill on human assisted reproduction techniques. The law bans human cloning for reproductive purposes. Another novelty would be the creation of a National Register Office for Donors, and a register of the activities of assisted reproduction centres. Control of assisted reproduction techniques The law is expected to come into force in 2006. The objective is to regulate the application of assisted reproduction techniques and to help couples with fertility problems have children. These techniques could treat and prevent diseases. The new law will prohibit human cloning for reproductive purposes, as the European Constitution does. Regarding cloning for therapeutic purposes, the ministry is preparing the Investigation Law in Biomedicine. If a couple wants a child with a strong immune system that could cure a sibling, the Investigation Law could cover it. It's expected that families won't need to travel abroad, as has occurred until now. 
*********************************************************** To Subscribe / Unsubscribe to HGA's News clipping service, contact us via e-mail on info at hgalert.org with 'Subscribe' in the subject field or 'Unsubscribe' to be taken off our mailing list. Human Genetics Alert 22-24 Highbury Grove 127 Aberdeen House London N5 2EA TEL: +44 20 77046100 FAX: +44 20 73598423 From HowlBloom at aol.com Sat May 7 00:01:19 2005 From: HowlBloom at aol.com (HowlBloom at aol.com) Date: Fri, 6 May 2005 20:01:19 EDT Subject: [Paleopsych] why I need you and you need me Message-ID: <12a.5cd6ac8d.2fad5f4f@aol.com> Thanks to your input and to the energy you give me I've been mapping out a theory of the extracranial extensions of the self. One part of that theory says that when I get upset about a fight with my wife, I need to run to you and blurt out my tale. Why? On the surface, in order to calm myself down. But there's another reason. Groups with the nimblest collective intelligence outcompete groups with lame collective brains. When I have trouble in my trek through tough emotional terrain--like the terrain of a relationship--I bring my report on that problem to a friend, to you. You calm me down. In the process you follow Alice in Wonderland's rule: "How do I know what I'm thinking until I hear what I have to say?" You think out solutions that are useful to you and are useful to me. In fact, you wonder, when you've finished delivering your wisdom, why you could do this miraculous problem solving for me, but you couldn't do it for yourself. My problem and your solution, if we're all very lucky, can do something remarkable. It can become a metaphor that helps us understand other relations that ride on shifting sands--from understanding how particles behave or how America has to deal with our Chinese trade deficit to understanding what a business needs to do next or to puzzling out the patterns of signals we get from a probe on the moon of a distant planet. If the tale of what I've been through makes for a really good story, and if your solution to my problem is a triumph, too, you get excited. What happens to us humans when we're excited? We need to share the excitement with someone else. We need to blurt, to vent, and to brag. So you, having helped me, call your spouse or a friend and send the tale of my dilemma and your solution out on the seas of the grapevine, out on the seas of gossip, out on the sea of collective information processing, collective intelligence, and collective memory. In the process you and I help the groups and subgroups we belong to get smart. If we lived in a culture that forbade this sort of confession, this constant conversation about intimacy, we'd be a lot dumber. Which may explain why I no longer want to write What the Nuclear Knights of Islam Want From You: The Osama Code. Reading books on the history of Islam's founding fathers, the Companions of the Prophet, has worn me out. How? I'm still trying to define it, but these books dry out my brain. They stop me from thinking. There's no introspective depth. It is very, very hard to kill my curiosity, but the aridness of these Islamic source books has managed to do it. Is that because the culture within which these books have been written is deprived of the cross-talk that takes place when we Westernizers run into problems--especially problems that whack us with the whips and paddles of confusion and insecurity? 
Meanwhile, my limbic system--and probably yours--needs to resolve its problems with my cortical consciousness not by sending a signal a mere four inches or so through the brain, but by going the thousands of miles it takes to get to you. Then you explain me to my self--you complete a loop from the turmoil of my emotional brain, my limbic system, to the somewhat semi-calm of my talking, thinking, and writing brain, my left frontal and pre-frontal cortex. If I read Jeff Hawkins right, he says that this sort of loop creates a memory that allows us to see the patterns of the immediate past and use those patterns to predict the future. And memory of this sort is a vital part of collective intelligence. Here's Hawkins' quote (once again). See if you think it applies: auto-associative memories in neural nets. "Instead of only passing information forward...auto-associative memories fed the output of each neuron back into the input.... When a pattern of activity was imposed on the artificial neurons, they formed a memory of this pattern. ...To retrieve a pattern stored in such a memory, you must provide the pattern you want to retrieve. ....The most important property is that you don't have to have the entire pattern you want to retrieve in order to retrieve it. You might have only part of the pattern, or you might have a somewhat messed-up pattern. The auto-associative memory can retrieve the correct pattern, as it was originally stored, even though you start with a messy version of it. It would be like going to the grocer with half-eaten brown bananas and getting whole green bananas in return. ...Second, unlike most neural networks, an auto-associative memory can be designed to store sequences of patterns, or temporal patterns. This feature is accomplished by adding time delay to the feedback. ...I might feed in the first few notes of 'Twinkle, Twinkle Little Star' and the memory returns the whole song. When presented with part of the sequence, the memory can recall the rest." Jeff Hawkins, Sandra Blakeslee. On Intelligence. New York: Times Books, 2004: pp 46-47. ---------- Howard Bloom Author of The Lucifer Principle: A Scientific Expedition Into the Forces of History and Global Brain: The Evolution of Mass Mind From The Big Bang to the 21st Century Visiting Scholar-Graduate Psychology Department, New York University; Core Faculty Member, The Graduate Institute www.howardbloom.net www.bigbangtango.net Founder: International Paleopsychology Project; founding board member: Epic of Evolution Society; founding board member, The Darwin Project; founder: The Big Bang Tango Media Lab; member: New York Academy of Sciences, American Association for the Advancement of Science, American Psychological Society, Academy of Political Science, Human Behavior and Evolution Society, International Society for Human Ethology; advisory board member: Youthactivism.org; executive editor -- New Paradigm book series. For information on The International Paleopsychology Project, see: www.paleopsych.org for two chapters from The Lucifer Principle: A Scientific Expedition Into the Forces of History, see www.howardbloom.net/lucifer For information on Global Brain: The Evolution of Mass Mind from the Big Bang to the 21st Century, see www.howardbloom.net 
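For anyone who wants to see the idea in running code, here is a minimal, illustrative sketch (in Python) of the kind of auto-associative memory Hawkins describes: a toy Hopfield-style network that stores patterns with a simple Hebbian rule and then recalls a stored pattern from a corrupted cue by feeding its own output back into its input. This is not code from the book; every detail below is just one simple way to realize the idea.

# Toy Hopfield-style auto-associative memory (illustrative sketch only).
# Patterns are stored with a Hebbian outer-product rule; recall starts
# from a corrupted cue and repeatedly feeds the output back into the
# input, as in the Hawkins passage above.
import numpy as np

def train(patterns):
    # patterns: array of shape (num_patterns, n) with entries +1/-1
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)  # no self-connections
    return w / len(patterns)

def recall(w, cue, steps=10):
    state = cue.copy()
    for _ in range(steps):
        state = np.where(w @ state >= 0, 1, -1)  # output fed back as input
    return state

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stored = rng.choice([-1, 1], size=(2, 64))        # two random 64-bit patterns
    w = train(stored)
    noisy = stored[0].copy()
    flipped = rng.choice(64, size=12, replace=False)  # corrupt ~20% of the cue
    noisy[flipped] *= -1
    out = recall(w, noisy)
    print("bits wrong before recall:", int((noisy != stored[0]).sum()))
    print("bits wrong after recall: ", int((out != stored[0]).sum()))

Run it and the corrupted cue typically settles back onto the stored pattern within a few feedback passes--brown bananas in, green bananas out.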
From checker at panix.com Sat May 7 00:02:38 2005 From: checker at panix.com (Premise Checker) Date: Fri, 6 May 2005 20:02:38 -0400 (EDT) Subject: [Paleopsych] Science: A Heavyweight Battle over CDC's Obesity Forecasts Message-ID: A Heavyweight Battle over CDC's Obesity Forecasts Science, Vol 308, Issue 5723, 770-771, 6 May 2005 Jennifer Couzin How many people does obesity kill? That question has turned into a headache for the Centers for Disease Control and Prevention (CDC) in Atlanta, Georgia: In the past year, its scientists have published dueling papers with conflicting estimates on obesity-associated deaths--the first three times greater than the second. The disagreement, some fear, is undermining the agency's health warnings. The bidding on obesity's annual death toll started at a staggering 400,000--the number cited in a CDC paper co-authored by CDC chief Julie Gerberding in 2004. But dissent prompted an internal inquiry, and CDC decided this year to lower the number to 365,000. That was still too high for some CDC analysts, who together with colleagues at the National Cancer Institute (NCI) in Bethesda, Maryland, published a new figure on 20 April--112,000 deaths. The low estimate is spawning other problems, though. A food-industry interest group is touting it as evidence that obesity is not so risky. Even researchers who favor the low number worry that it will lead to complacency. After trumpeting the highest estimate a year ago and warning that obesity deaths were poised to overtake those caused by tobacco, CDC officials now say that numbers are unimportant. The real message should be that "obesity can be deadly," says George Mensah, acting director of CDC's National Center for Chronic Disease Prevention and Health Promotion. "We really add to the confusion by sticking to one number." But some of CDC's own scientists disagree. "It's hard to argue that death is not an important public health statistic," says David Williamson, an epidemiologist in CDC's diabetes division and an author on the paper with the 112,000 deaths estimate. Calculating whether obesity leads directly to an individual's demise is a messy proposition. To do so, researchers normally determine by how much obesity increases the death rate and what proportion of the population is obese. Then they apply that to the number of deaths in a given time, revealing excess deaths due to obesity. Both studies use that approach, but methodological differences produced big disparities between the two papers--one by epidemiologist Ali Mokdad, Gerberding, and their CDC colleagues, published in the Journal of the American Medical Association (JAMA) on 10 March 2004, and the new estimate by CDC epidemiologist Katherine Flegal and colleagues at CDC and NCI, published in JAMA on 20 April. Both relied on data about individuals' weight and other measures from the National Health and Nutrition Examination Survey (NHANES), which has monitored the U.S. population since the 1970s. The Mokdad group used the oldest, NHANES I. Flegal's group also used two more recent NHANES data sets from the 1980s and 1990s. Her method found fewer obesity-associated deaths--suggesting that although obesity is rising, some factor, such as improved health care, is reducing deaths. Other variations in methodology proved crucial. For example, the two groups differed in their choice of what constitutes normal weight, which forms the baseline for comparisons. 
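To make that excess-deaths arithmetic concrete, here is a rough back-of-the-envelope sketch in Python. The relative risks, prevalence and death total below are made-up placeholders rather than figures from either paper, and the real analyses (including Flegal's) stratify by age, which this toy version ignores.

# Back-of-the-envelope excess-deaths arithmetic (placeholder numbers only).
# attributable deaths = PAF * total deaths, using Levin's population
# attributable fraction: PAF = p(RR - 1) / (1 + p(RR - 1)).

def attributable_deaths(relative_risk, prevalence, total_deaths):
    paf = prevalence * (relative_risk - 1.0) / (1.0 + prevalence * (relative_risk - 1.0))
    return paf * total_deaths

if __name__ == "__main__":
    total = 2_400_000            # roughly the annual U.S. death count around 2000
    for rr in (1.2, 1.5, 2.0):   # hypothetical mortality risk ratios for obesity
        print(rr, round(attributable_deaths(rr, prevalence=0.30, total_deaths=total)))

Even in this crude form, nudging the assumed risk ratio--which depends directly on what counts as the "normal weight" baseline--swings the estimate by hundreds of thousands of deaths.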
Flegal's team adopted the definition favored by the National Institutes of Health and the World Health Organization, a body mass index (BMI) between 18.5 and less than 25. The Mokdad team chose a BMI of 23 to less than 25; this changed the baseline risk of death, and with it, deaths linked to obesity. In their paper, the Mokdad authors said they selected that narrower, heavier range because they were trying to update a landmark 1999 JAMA paper on obesity led by biostatistician David Allison of the University of Alabama, Birmingham, and chose to follow Allison's methodology. (CDC spokesperson John Mader said that Mokdad and his co-authors were not available to be interviewed.) "There's no right answer" to which BMI range should be the "normal" category, says Allison. He felt his choice was more "realistic," and that expecting Americans to strive for even lower BMIs might be asking too much. But that relatively small difference in BMI had a big effect on the estimates: Had Flegal's team gone with the 23-to-25 range, she reported, the 112,000 deaths estimate would have jumped to 165,000. The scientists also diverged sharply in how they tackled age. It's known that older individuals are less at risk and may even benefit from being heavier: A cushion of fat can keep weight from falling too low during illness. And young obese people tend to develop more severe health problems, says David Ludwig, director of the obesity program at Children's Hospital in Boston. Flegal's group took all this into account by assigning risks from obesity to different age groups. Stratifying by age meant that when Flegal turned to actual death data--all deaths from the year 2000--she was less likely to count deaths in older age groups as obesity-related. Allison concedes that in retrospect, his decision not to stratify by age was a mistake. And it had a big impact on the estimates. "Very minor differences in assumption lead to huge differences in the number of obesity-induced deaths," says S. Jay Olshansky, a biodemographer at the University of Illinois, Chicago. Olshansky, Allison, and Ludwig published their own provocative obesity paper in The New England Journal of Medicine in March. It argued that U.S. life expectancy could begin decreasing as today's obese children grow up and develop obesity-induced diseases, such as diabetes and heart disease (Science, 18 March, p. 1716). But Olshansky now says that in light of Flegal's recent paper on obesity deaths and a companion paper that she, Williamson, and other CDC scientists authored in the same issue of JAMA, his life expectancy forecasts might be inaccurate. The companion paper, led by CDC's Edward Gregg, examined how much cardiovascular disease was being driven by obesity. The findings were drawn from five surveys, most of them NHANES, beginning in 1960 and ending in 2000, and they dovetailed with the conclusions in Flegal's 112,000 deaths paper. All heart disease risk factors except diabetes were less likely to show up in heavy individuals in recent surveys than in older ones. That suggests, says Allison, that "we've developed all these great ways to treat heart disease" such as by controlling cholesterol. This could also explain, he and others say, why NHANES I led to much higher estimates of obesity-associated deaths than did NHANES I, II, and III combined. Although obesity rates are rising, obesity-associated deaths are dropping. Ludwig disagrees that this trend will necessarily continue or that Gregg's paper disproves the one he co-authored with Olshansky. 
Type 2 diabetes, which is becoming more common in youngsters, "starts the clock ticking towards life-threatening complications," he notes. Olshansky is uncomfortable with the kind of attention Flegal's 112,000 estimate is getting. "It's being portrayed," he says, as if "it's OK to be obese because we can treat it better." In fact, one of Flegal's conclusions that sparked much interest--that being overweight, with a BMI of 25 to 30, slightly reduced mortality risk--had been suggested in the past. Certainly, food-industry groups are thrilled by Flegal's work. "The singular focus on weight has been misguided," says Dan Mindus, a senior analyst with the Center for Consumer Freedom, a Washington, D.C.-based nonprofit supported by food companies and restaurants. Since Flegal's paper appeared, the center has spent $600,000 on newspaper and other ads declaring obesity to be "hype"; it plans to blanket the Washington, D.C., subway system with its ad campaign. Some say that CDC needs to choose one number of deaths and stand behind it. "You don't just put random numbers into the literature," says antitobacco activist and heart disease expert Stanton Glantz of the University of California, San Francisco, who disputed the Mokdad findings. Scientists agree that Flegal's study is superior, but it may also be distracting, suggests Beverly Rockhill, an epidemiologist at the University of North Carolina, Chapel Hill. Even if obese individuals' risk of death has been overplayed in the past, she says, we ought to ask: "Are they living a sicker life?" From checker at panix.com Sat May 7 00:02:57 2005 From: checker at panix.com (Premise Checker) Date: Fri, 6 May 2005 20:02:57 -0400 (EDT) Subject: [Paleopsych] Joel Garreau on "Radical Evolution," May 19th in NYC Message-ID: Joel Garreau on "Radical Evolution," May 19th in NYC GBN Presents... Joel Garreau speaking on Radical Evolution May 19th, 5:30 pm to 7:30 pm CUNY Graduate Center, Skylight Room (9th Floor) 365 Fifth Avenue New York, NY 10016 Please join GBN and Network member Joel Garreau for a compelling look at the dramatic acceleration of change that is literally transforming human nature. In his new book, Radical Evolution, Joel shows that we are at an inflection point in history. Through advances in genetic, robotic, information, and nano-technologies, we are engineering the next stage of human evolution: altering our minds, memories, metabolisms, personalities, progeny, and perhaps our very souls. After spending two years behind the scenes with today's foremost researchers and pioneers, Joel's amazing tales reveal that the superpowers of our comic-book heroes already exist, or are in development in hospitals, labs, and research facilities around the country--from the revved up reflexes and speed of Spider-Man and Superman to the enhanced mental acuity and memory capabilities of an advanced species. Over the next 15 years, these enhancements will become part of our everyday lives. But where will they lead us? One scenario is "Heaven," where technologies promise to make us smarter, vanquish illness, and extend our lives. But there are other scenarios, including "Hell," where unrestrained technology brings about the ultimate destruction of our entire species. To help us understand the possibilities, Joel taps the insights of many gifted thinkers and scientists who are making what has previously been thought of as science fiction a reality. 
Among these fellow travelers are Bill Joy, Ray Kurzweil, and Jaron Lanier, each of whom offers radically different views of the developments that will, in our lifetime, affect everything from the way we date to the way we work, from how we think and act to how we fall in love. As Joel cautions, it is only by anticipating the future that we can hope to shape it. Joel is the best-selling author of The Nine Nations of North America and Edge Cities and a reporter and editor for the Washington Post. For GBNers this is a special treat, having enjoyed regular interviews with Joel throughout his research as he struggled to make sense of what he was experiencing on the remarkable journey that became Radical Evolution. RSVP to Jeanne Scheppach at Jeanne_Scheppach at gbn.com. http://www.gbn.com/EventInformationDisplayServlet.srv?eid=26807 From checker at panix.com Sat May 7 00:04:13 2005 From: checker at panix.com (Premise Checker) Date: Fri, 6 May 2005 20:04:13 -0400 (EDT) Subject: [Paleopsych] In the Mideast, ask the right question Message-ID: In the Mideast, ask the right question The International Herald Tribune, 5.5.5. http://www.iht.com/bin/print_ipub.php?file=/articles/2005/05/04/opinion/edsiegman.php Henry Siegman International Herald Tribune PARIS The window of opportunity widely believed to have been opened by Prime Minister Ariel Sharon's decision to withdraw Israeli settlements from Gaza, and by the election of Mahmoud Abbas as head of the Palestinian Authority, has prompted a debate in U.S. policy circles. The question is whether President George W. Bush is moving quickly enough to prevent the spreading Israeli settlement enterprise in the West Bank from foreclosing the possibility of the emergence of a Palestinian state. Politically speaking, whether a viable Palestinian state is still possible is the wrong question, if only because by now it should be clear that Bush will not take the political risks entailed in ensuring the creation of such a state in the face of Sharon's determination to prevent it. The right question - the answer to which perhaps may yet invest the peace process with the energy and direction it now lacks - is whether there is still hope for the survival of Israel as a Jewish state. For it is the Jewish state, far more so than a state for the Palestinian people, that is now threatened and in doubt. Whatever uncertainties exist about a Palestinian state, what is certain, even after Israel's disengagement from Gaza, is that it is only a matter of time before Arabs will constitute a majority of the population between the Mediterranean Sea and the Jordan River. When this happens, Israel will cease to be a Jewish state, both formally and in fact - unless it herds the majority Arab population into enclosed bantustans, and turns into an apartheid state. It is a supreme irony that only a Palestinian state can assure the survival of Israel as a Jewish state. However, as Sharon's settlement project continues and intensifies in the West Bank - not despite but because of the Gaza disengagement - and relentlessly diminishes and fragments the West Bank, Palestinians will sooner or later abandon a two-state solution and pursue the political logic of their own demography instead. Palestinians will not settle for less than a state that is fully within the pre-1967 borders. 
Having already yielded to Israel half the territory acknowledged by the United Nations in its partition resolution of 1947 as their legitimate patrimony, Palestinians will not consent to additional Israeli annexations of the remaining 22 percent of Palestine, except in swaps for comparable territory on Israel's side of the border. The capital of this Palestinian state, moreover, will have to be located in East Jerusalem. The chances of a Palestinian leader signing a peace accord that shuts Palestinians out of any part of Jerusalem are about as great as an Israeli leader signing a peace agreement that grants Palestinian refugees a "right of return" to Israel. Indeed, Palestinian agreement to a formula that redirects the refugees' right of return from Israel to a Palestinian state is entirely dependent on compromises in Israel's present position on territory and Jerusalem. These difficult concessions by an Israeli government are conceivable only if it finally tells its citizens the truth - that only if a viable and successful Palestinian state comes into being alongside Israel can its Jews avoid being turned into a minority in their own state. Those in Israel who believe that the world - including Israel's great friend and ally, the United States - will abide a Jewish apartheid regime that permanently disenfranchises and dominates by force of arms an Arab majority, or allow Israel to ethnically cleanse much of the West Bank through repressive economic and "security" measures, are deluding themselves. Unfortunately, some political parties in Israel call for such thinly disguised ethnic cleansing, and yet are seen by most Israelis as acceptable partners in their governments. Indeed, Natan Sharansky, a former minister in Sharon's government, has been agitating to declare the private property of Arabs who live just outside the municipal borders of Jerusalem, but whose adjoining properties are located within those borders, as "abandoned." Such a designation would allow the government to confiscate these Arab properties without compensation or right of appeal. This from the man who has convinced Bush that Palestinians must be kept under Israeli occupation until Israel is ready to certify they have been transformed into democrats! Inexorable demographic "facts on the ground" will be far more determining of Israel's future than the settlements and the so-called security fence that Israel is building largely on stolen Palestinian land. When this realization begins to break through the illusions that beset the peace process, Israel's supporters may finally understand that the question is not whether the window of opportunity is closing on a Palestinian state, but whether it is closing on a Jewish state. Unfortunately, given the all too clever manipulations of Sharon and his advisor, Dov Weissglas, who believe (as Weissglas boasted recently in a Haaretz interview) that they have persuaded the United States to let the road map and the peace process remain in "formaldehyde," this realization is likely to come about only after that proverbial window will have slammed shut. (Henry Siegman is a senior fellow on the Middle East at the Council on Foreign Relations and a former executive head of the American Jewish Congress. These views are his own.) 
From checker at panix.com Sat May 7 00:07:18 2005 From: checker at panix.com (Premise Checker) Date: Fri, 6 May 2005 20:07:18 -0400 (EDT) Subject: [Paleopsych] NYT: The Making of a Vegetarian: A Dinosaur Is Caught in the Act Message-ID: The Making of a Vegetarian: A Dinosaur Is Caught in the Act New York Times, 5.5.5 http://www.nytimes.com/2005/05/05/science/05dino.html [I will be glad to send along any Creationist response that is specific to this discovery, as opposed to general critiques of evolution, of which I have read and sent many.] By JOHN NOBLE WILFORD Without government nutrition guidelines, a doctor's advice or some primeval diet fad, entire species of dinosaurs sometimes forsook their predatory, meat-eating lifestyle and evolved into grazing vegetarians. Scientists now think they have found rare evidence of a species undergoing just such a dietary transition 125 million years ago. Paleontologists in Utah announced yesterday that they had discovered a new species of dinosaurs in an intermediate stage between carnivore and herbivore, on the way to becoming a committed vegetarian. They could only speculate on the reasons for the change, but noted that it occurred in a time of global warming and the arrival of flowering plants in profusion, a tempting new food source. Dr. James I. Kirkland, a paleontologist with the Utah Geological Survey, said the new species, named Falcarius utahensis, was uncovered two years ago at a remote dig site near the town of Green River. The animal, about 13 feet long and 4.5 feet tall, was a primitive member of the therizinosaur group of feathered dinosaurs. Under closer examination, Dr. Kirkland said, the Falcarius fossils showed "the beginnings of features we associate with plant-eating dinosaurs." The teeth were not the sharp, bladelike serrated teeth of the typical predator, but smaller and adapted for shredding leaves. "I doubt that this animal could have cut a steak," he said. Other characteristics of an animal in transition to herbivory included an expansion of the gut to digest the mass of fermenting plants, stouter legs for supporting a bulkier body instead of the slender legs of a fast-running predator, and a lengthening of the neck, perhaps to reach for leaves higher in the trees. Dr. Scott D. Sampson, chief curator of the Museum of Natural History at the University of Utah, said the new fossils were "amazing documentation of a major dietary shift" and promised to "tell us how this shift happened." The scientists described and interpreted the findings in interviews and a teleconference from Salt Lake City. A detailed report is being published today in the journal Nature. Dr. Mark A. Norell, a dinosaur specialist at the American Museum of Natural History who was not involved in the research, said the fossils were well preserved and the teeth appeared to be similar to those of plant-eating dinosaurs. But he questioned how much scientists would be able to learn from the specimen about the change from meat eating to plant eating. Dr. Sampson said, "Falcarius represents evolution caught in the act, a primitive form that shares much in common with its carnivorous kin, while possessing a variety of features demonstrating that it had embarked on the path toward more advanced plant-eating forms." Dr. Norell agreed that the new species "is a very important and interesting animal," primarily because it is a rare early example of the therizinosaur group in North America. 
Falcarius is anatomically more primitive than the better-known therizinosaurs that were prevalent in China about 90 million years ago and had already evolved as plant eaters. Lindsay E. Zanno, a doctoral student in paleontology at Utah, said Falcarius was "the most primitive known therizinosaur, demonstrating unequivocally that this large-bodied group of bizarre herbivorous dinosaurs" came from predatory carnivores like the swift, fierce Velociraptor. Falcarius and Velociraptor had a common ancestor. Scientists say all the vegetarian dinosaurs evolved from ancestors that were carnivores. Some 230 million years ago, the first dinosaur was presumably a small-bodied, fleet-footed predator. Then two major groups of dinosaurs, the gigantic species and the smaller duck-billed grazers, evolved as plant eaters. As for Falcarius, scientists are not sure what it ate, meat or plants or both, and they suspect that the transition extended over several million years. But with Falcarius, Dr. Sampson said, "we have actual fossil evidence of a major dietary shift, certainly the best example documented among dinosaurs." From HowlBloom at aol.com Sat May 7 00:14:09 2005 From: HowlBloom at aol.com (HowlBloom at aol.com) Date: Fri, 6 May 2005 20:14:09 EDT Subject: [Paleopsych] fads and atoms Message-ID: <1ee.3b3eba41.2fad6251@aol.com> The following article hits the motherlode when it comes to our past discussions of Ur patterns, iteration, and fractality. Ur patterns are those that show up on multiple levels of emergence, patterns that make anthropomorphism a reasonable way of doing science, patterns that explain why a metaphor can capture in its word-picture the underlying structure of a whirlwind, a brain-spin, or a culture-shift. Here's how a pattern in the molecules of magnets repeats itself in the mass moodswings of human beings. Howard Retrieved May 6, 2005, from the World Wide Web http://www.newscientist.com/article.ns?id=mg18624984.200 One law rules dedicated followers of fashion 06 May 2005 Exclusive from New Scientist Print Edition Mark Buchanan FADS, fashions and dramatic shifts in public opinion all appear to follow a physical law: one of the laws of magnetism. Quentin Michard of the School of Industrial Physics and Chemistry in Paris and Jean-Philippe Bouchaud of the Atomic Energy Commission in Saclay, France, were trying to explain three social trends: plummeting European birth rates in the late 20th century, the rapid adoption of cellphones in Europe in the 1990s and the way people clapping at a concert suddenly stop doing so. In each case, they theorised, individuals not only have their own preferences, but also tend to imitate others. "Imitation is deeply rooted in biology as a survival strategy," says Bouchaud. In particular, people frequently copy others who they think know something they don't. To model the consequences of imitation, the researchers turned to the physics of magnets. An applied magnetic field will coerce the spins of atoms in a magnetic material to point in a certain direction. And often an atom's spin direction pushes the spins of neighbouring atoms to point in a similar direction. And even if an applied field changes direction slowly, the spins sometimes flip all together and quite abruptly. The physicists modified the model such that the atoms represented people and the direction of the spin indicated a person's behaviour, and used it to predict shifts in public opinion. 
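To see how that mapping works in practice, here is a toy simulation--not the researchers' actual model or code, just a minimal random-field Ising-style sketch in Python in which each "spin" is a person choosing between two behaviours, pulled by a fixed private preference, a slowly improving external incentive, and the average behaviour of everyone else (the imitation term). The coupling strengths, noise scale and field sweep below are arbitrary choices.

# Toy imitation-as-magnetism sketch (illustrative only, not the
# Michard-Bouchaud model). Each agent's "spin" is +1 (new behaviour)
# or -1 (old behaviour); it follows the sign of a private preference,
# a slowly rising external incentive, and an imitation term J * m,
# where m is the current average behaviour of the population.
import numpy as np

def sweep(J, n=10_000, seed=1):
    rng = np.random.default_rng(seed)
    h = rng.normal(0.0, 1.0, n)   # fixed personal preferences
    s = -np.ones(n)               # everyone starts with the old behaviour
    curve = []
    for field in np.linspace(-2.0, 2.0, 200):  # slowly improving incentive
        for _ in range(5):                     # let opinions settle
            m = s.mean()                       # average behaviour others copy
            s = np.where(J * m + h + field > 0, 1.0, -1.0)
        curve.append((s > 0).mean())
    return curve

if __name__ == "__main__":
    for J in (0.3, 2.0):  # weak versus strong imitation
        curve = sweep(J)
        biggest_jump = max(b - a for a, b in zip(curve, curve[1:]))
        print(f"J={J}: largest one-step jump in adoption = {biggest_jump:.2f}")

With weak imitation the adoption curve rises smoothly as conditions improve; crank up the coupling and the same slow change produces an abrupt, nearly overnight flip--the kind of discontinuous jump discussed below.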
In the case of cellphones, for example, it is clear that as more people realised how useful they were, and as their price dropped, more people would buy them. But how quickly the trend took off depended on how strongly people influenced each other. The magnetic model predicts that when people have a strong tendency to imitate others, shifts in behaviour will be faster, and there may even be discontinuous jumps, with many people adopting cellphones virtually overnight. More specifically, the model suggests that the rate of opinion change accelerates in a mathematically predictable way, with ever greater numbers of people changing their minds as the population nears the point of maximum change. Michard and Bouchaud checked this prediction against real-world data and found that the trends in birth rates and cellphone usage in European nations conformed quite accurately to this pattern. The same was true of the rate at which clapping died away in concerts. ---------- Howard Bloom Author of The Lucifer Principle: A Scientific Expedition Into the Forces of History and Global Brain: The Evolution of Mass Mind From The Big Bang to the 21st Century Visiting Scholar-Graduate Psychology Department, New York University; Core Faculty Member, The Graduate Institute www.howardbloom.net www.bigbangtango.net Founder: International Paleopsychology Project; founding board member: Epic of Evolution Society; founding board member, The Darwin Project; founder: The Big Bang Tango Media Lab; member: New York Academy of Sciences, American Association for the Advancement of Science, American Psychological Society, Academy of Political Science, Human Behavior and Evolution Society, International Society for Human Ethology; advisory board member: Youthactivism.org; executive editor -- New Paradigm book series. For information on The International Paleopsychology Project, see: www.paleopsych.org for two chapters from The Lucifer Principle: A Scientific Expedition Into the Forces of History, see www.howardbloom.net/lucifer For information on Global Brain: The Evolution of Mass Mind from the Big Bang to the 21st Century, see www.howardbloom.net From shovland at mindspring.com Sat May 7 00:59:45 2005 From: shovland at mindspring.com (Steve Hovland) Date: Fri, 6 May 2005 17:59:45 -0700 Subject: [Paleopsych] why I need you and you need me Message-ID: <01C55265.63CB2780.shovland@mindspring.com> What do the various extensions have in common? What is unique about each? Steve Hovland www.stevehovland.net -----Original Message----- From: HowlBloom at aol.com [SMTP:HowlBloom at aol.com] Sent: Friday, May 06, 2005 5:01 PM To: paleopsych at paleopsych.org Subject: [Paleopsych] why I need you and you need me << File: ATT00026.txt; charset = UTF-8 >> << File: ATT00027.html; charset = UTF-8 >> << File: ATT00028.txt >> From checker at panix.com Sat May 7 10:44:55 2005 From: checker at panix.com (Premise Checker) Date: Sat, 7 May 2005 06:44:55 -0400 (EDT) Subject: [Paleopsych] WSJ: Chimeras exist, what if some turn out too human? Message-ID: Chimeras exist, what if some turn out too human? http://www.post-gazette.com/pg/05126/500265.stm 5.5.6. I presume this was picked up from the WSJ. By Sharon Begley, The Wall Street Journal If you had just created a mouse with human brain cells, one thing you wouldn't want to hear the little guy say is, "Hi there, I'm Mickey." 
Even worse, of course, would be something like, "Get me out of this &%£!! body!" It's been several millennia since Greek mythology dreamed up the chimera, a creature with the head of a lion, the body of a goat and the tail of a serpent. Research on the chimera front was pretty quiet for 2,500 years. But then in 1984 scientists announced that they had merged embryonic goat cells with embryonic sheep cells, producing a "geep." (It's part wooly, part hairy, with a face only a nanny goat could love.) A human-mouse chimera made its debut in 1988: "SCID-hu" is created when human fetal tissue -- spleen, liver, thymus, lymph node -- is transplanted into a mouse. These guys are clearly mice, but other chimeras are harder to peg. In the 1980s, scientists took brain-to-be tissue from quail embryos and transplanted it into chicken embryos. Once hatched, the chicks made sounds like baby quails. More part-human chimeras are now in the works or already in lab cages. StemCells Inc., of Palo Alto, Calif., has given hundreds of mice human-brain stem cells, for instance. And before human stem cells are ever used to treat human patients, notes biologist Janet Rowley of the University of Chicago, they (or the cells they develop into) will be implanted into mice and other lab animals. "The centaur has left the barn more than people realize," says Stanford University law professor and bioethicist Henry Greely. Part-human creatures raise enough ethical concerns that a National Academy of Sciences committee on stem cells veered off into chimeras. It recommended last week that some research be barred, to prevent some of the more monstrous possibilities -- such as a human-sperm-bearing mouse mating with a human-egg-bearing mouse and gestating a human baby. "We're not very concerned about a mouse with a human spleen," says Prof. Greely. "But we get really concerned about our brain and our gonads." That's why his Stanford colleague, Irving Weissman, asked Prof. Greely to examine the ethical implications of a mouse-human chimera. StemCells, co-founded by Prof. Weissman, has already transplanted human-brain stem cells into the brains of mice that had no immune system (and hence couldn't attack the foreign cells). The stem cells develop into human neurons, migrate through the mouse brain and mingle with mouse cells. The human cells make up less than 1 percent of the mouse brain, and are being used by the company to study neurodegenerative diseases. But Prof. Weissman had in mind a new sort of chimera. He would start with ill-fated mice whose neurons all die just before or soon after birth. He planned to transplant human-brain stem cells into their brains just before their own neurons died off. Would that lead the human cells to turn into neurons and replace the dead-or-dying mouse neurons, producing a mostly human brain in a mouse? Such a chimera could bring important scientific benefits. The SCID-hu mouse, though it hasn't yielded a cure for AIDS, has been "a very valuable animal model," says Ramesh Akkina of Colorado State University, Fort Collins, who directs a lab that uses this part-human mouse. "It has human T cells circulating, which will allow us to test gene therapy for AIDS" in a way that will be more relevant to patients than all-animal models. 
The co-creator of SCID-hu, Michael McCune of the Gladstone Institute of Virology and Immunology, San Francisco, notes that because the human organs last for months in the mice (they would die in days in a lab dish), "it is possible to study the effects of HIV" in many kinds of human cells in a living system. Similarly, studying living human neurons in a living mouse brain would likely yield more insights than studying human neurons in a lab dish or mouse neurons in a mouse brain. "You could see how pathogens damage human neurons, how experimental drugs act, what happens when you infect human neurons with prions (which cause mad-cow disease) or amyloid (associated with Alzheimer's)," says Prof. Greely. "The big concern is, could you give the mouse some sort of human consciousness or intelligence?" "All of us are aware of the concern that we're going to have a human brain in a mouse with a person saying, 'Let me out,'" Prof. Rowley told the President's Council on Bioethics when it discussed chimeras in March. To take no chances, scientists could kill the mice before birth to see if the brain is developing mouse-y structures such as "whisker barrels," which receive signals from the whiskers. If so, it's a mouse. If it is developing a large and complex visual cortex, it's too human. "If you saw something weird, you'd stop," says Prof. Greely. "If not, let the next ones be born, and examine them at different ages to be sure they're still fully mouse." To reduce the chance that today's chimeras will be as monstrous as the Greeks' were, the U.S. patent office last year rejected an application to patent a human-chimp chimera, or "humanzee." But that, of course, just keeps someone from patenting one -- not making one. From checker at panix.com Sat May 7 10:44:47 2005 From: checker at panix.com (Premise Checker) Date: Sat, 7 May 2005 06:44:47 -0400 (EDT) Subject: [Paleopsych] NYT: Time Travelers to Meet in Not Too Distant Future Message-ID: Time Travelers to Meet in Not Too Distant Future New York Times, 5.5.6 http://www.nytimes.com/2005/05/06/national/06time.html By [2]PAM BELLUCK CAMBRIDGE, Mass., May 5 - Suppose it is the future - maybe a thousand years from now. There is no static cling, diapers change themselves, and everyone who is anyone summers on Mars. What's more, it is possible to travel back in time, to any place, any era. Where would people go? Would they zoom to a 2005 Saturday night for chips and burgers in a college courtyard, eager to schmooze with computer science majors possessing way too many brain cells? Why not, say some students at the Massachusetts Institute of Technology, who have organized what they call the first convention for time travelers. Actually, they contend that theirs is the only time traveler convention the world needs, because people from the future can travel to it anytime they want. "I would hope they would come with the idea of showing us that time travel is possible," said Amal Dorai, 22, the graduate student who thought up the convention, which is to be this Saturday on the M.I.T. campus. "Maybe they could leave something with us. It is possible they might look slightly different, the shape of the head, the body proportions." The event is potluck and alcohol-free - present-day humans are bringing things like brownies. But Mr. 
Dorai's Web site asks that future-folk bring something to prove they are really ahead of our time: "Things like a cure for AIDS or cancer, a solution for global poverty or a cold fusion reactor would be particularly convincing as well as greatly appreciated." He would also welcome people from only a few days in the future, far enough to, say, give him a few stock market tips. Mr. Dorai and fellow organizers are the kind of people who transplant a snowblower engine into a sleeper sofa and drive the couch around Cambridge. (If the upholstery were bright red, it could be a midlife crisis convertible for couch potatoes.) They built a human-size hamster wheel - eight feet in diameter. And they concocted the "pizza button," a plexiglass pizza slice mounted in their hallway; when pressed, it calls up a Web site and arranges for pizza delivery 30 minutes later. (For anyone wanting to try this at home, the contraption uses a Huffman binary code. It takes fewer keystrokes to order the most popular toppings, like pepperoni, more keystrokes for less popular extras, like onions.) At the convention, they plan to introduce a robot with an "infrared pyro-electric detector," designed to follow anything that emits heat, including humans. "It's supposed to be our pet," said Adam Kraft, 22, a senior. "It needs fur," added David Nelson, 23, a graduate student. While Mr. Dorai has precisely calculated that "the odds of a time traveler showing up are between one in a million and one in a trillion," organizers have tried to make things inviting. In case their august university does not exist forever, they have posted the latitude and longitude of the East Campus Courtyard (42:21:36.025 degrees north, 71:05:16.332 degrees west). A roped-off area, including part of an improvised volleyball court, will create a landing pad so materializing time-travel machines will not crash into trees or dormitories. To set the mood, organizers plan to display a DeLorean - the sleek but short-lived 1980's car that was the time-traveling vehicle in the "Back to the Future" movies. At first, Mr. Dorai urged people to publicize the event with methods likely to last. "Write the details down on a piece of acid-free paper," he directed, "and slip them into obscure books in academic libraries!" But Mr. Dorai said the response was so overwhelming that the police, concerned about security, had asked that anyone who had not replied by Wednesday not be allowed to attend. No future-guests are confirmed as of yet, although one responder purports to be from 2026. But among the 100 likely attendees, there are those from another time zone - Chicago - and from New York, which at least likes to think of itself as light-years ahead. "I'm keeping my fingers crossed," said Erik D. Demaine, an M.I.T. mathematician who will be one of the professors speaking. There will also be two bands, the Hong Kong Regulars and Off-White Noise, performing new, time-travel-apropos tunes. "If you subscribe to alternative-world theory, then time travel makes sense at some level," said Professor Demaine, who would like future-guests to bring answers to mathematical mysteries. "The universe is inherently uncertain, and at various times it's essentially flipping coins to make a decision. At any point, there's the heads version of the world and the tails version of the world. We think that we actually live in one of them, and you could imagine that there's actually many versions of the universe, including one where suddenly you appear from 10 years in the future." 
If you can not imagine that, consider Erin Rhode's view of time travel. "I kind of think if it's going to happen, it'll be the wormhole theory," said Ms. Rhode, 23, a recent graduate, adding, "If you create a stable wormhole," a hole in space, "people can go back to visit it." William McGehee, 19, a freshman who helped build a "Saturday Night Fever"-like dance floor in his dorm, said, "It's pretty obvious if time travel does occur, then it doesn't cause the universe to explode." And Sam McVeety, 18, a freshman, wondered if wearing a tinfoil hat would be comforting or insulting to future-people. Mr. Dorai has had quirky brainstorms before: proposing the imprisonment of Bill Watterson, the retired cartoonist, to force him to continue his "Calvin and Hobbes" comic strip; and donning the costume of M.I.T.'s mascot, the beaver, while climbing the statue of John Harvard, namesake of that other Cambridge college. That incident went awry when some Harvard men swiped a paw. But Mr. Dorai's time travel idea seems to have legs. "If you can just give up a Saturday night, there's a very small chance at it being the biggest event in human history," he said. And if it is a flop, futuristically speaking? Well, Mr. Dorai reasoned, "Certainly, if no one from the future shows up, that won't prove that it's impossible." From checker at panix.com Sat May 7 10:45:46 2005 From: checker at panix.com (Premise Checker) Date: Sat, 7 May 2005 06:45:46 -0400 (EDT) Subject: [Paleopsych] NS: A concise guide to mind-altering drugs Message-ID: A concise guide to mind-altering drugs http://www.newscientist.com/article.ns?id=mg18424735.900&print=true * 13 November 2004

Alcohol

What is it? Ethanol produced by the action of yeast on sugars.

What does it do? Ethanol is a biphasic drug: low doses have a different effect to high doses. Small amounts of alcohol (one or two drinks) act as a stimulant, reducing inhibition and producing feelings of mild euphoria. Higher doses depress the central nervous system, initially producing relaxation but then leading to drunkenness - characterised by poor coordination, memory loss, cognitive impairment and blurred vision. Very high doses cause vomiting, coma and death through respiratory failure. The fatal dose varies but is somewhere around 500 milligrams of ethanol per 100 millilitres of blood.

How does it work? At low doses (5 milligrams per 100 millilitres of blood), alcohol sensitises NMDA receptors in the brain, making them more responsive to the excitatory neurotransmitter glutamate, so boosting brain activity. These effects are most pronounced in areas associated with thinking, memory and pleasure. At higher doses it desensitises the same receptors and also activates the inhibitory GABA system.

Amphetamine-type stimulants

What are they? A class of synthetic drugs invented as (and still used as) appetite suppressors. Includes amphetamine itself and derivatives including methamphetamine and dextroamphetamine.

What do they do? Amphetamines are powerful stimulants of the central nervous system, producing feelings of euphoria, alertness, mental clarity and increased energy lasting for 2 to 12 hours depending on the dose. The downsides are increased heart rate and blood pressure, nausea, irritability and jitteriness, plus fatigue once the effects have worn off. Overdosing can lead to convulsions, heart failure, coma and death. The fatal dose varies from person to person, with some reports of acute reactions to as little as 2 milligrams and others of non-fatal 500-milligram doses.
Most deaths from overdose have been among injecting users. How do they work? Their principal effect is to block dopamine transporters, which leads to higher-than-normal levels of the pleasure chemical dopamine in the brain. Caffeine What is it? An alkaloid found in coffee, cocoa beans, tea, kola nuts and guarana. Also added to many fizzy drinks, energy drinks, pep pills and cold and flu remedies. What does it do? A stimulant of the central nervous system. Pure caffeine is a moderately powerful drug and is sometimes passed off as amphetamine. In small doses, such as the 150 milligrams in a typical cup of filter coffee, it increases alertness and promotes wakefulness. Caffeine also raises heart and respiration rate and promotes urine production. Higher doses induce jitteriness and anxiety. The fatal dose is about 10 grams. How does it work? Caffeine blocks receptors for the neurotransmitter adenosine, which is generally inhibitory and associated with the onset of sleep. Also raises dopamine levels, and stimulates the release of the fight-or-flight hormone adrenalin. Cannabis What is it? Leaves, buds, flowers and resin from the cannabis plant, Cannabis sativa, a native of central Asia. The plant contains numerous psychoactive compounds called cannabinoids, the most potent of which is delta-9-tetrahydrocannabinol (THC). Cannabis is usually smoked in the form of dried leaves and buds, or as a dried resin (hashish). What does it do? Smoked in moderate quantities, cannabis can produces feelings of fuzzy mellowness and general well-being. It can interfere with memory and increase appetite ("the munchies"). Some users experience nausea, anxiety and paranoia. If eaten, the resin can be powerfully hallucinogenic. No fatal dose has ever been recorded in humans How does it work? THC latches onto specific receptors in the brain that are known to be involved in reward, appetite regulation and pain perception, though their precise role has yet to be worked out. Cocaine What is it? An alkaloid extracted from the leaves of the coca plant (Erythroxylon coca), a native of the eastern slopes of the Andes. It is commonly consumed in the form of the hydrochloride salt, a white crystalline powder which is usually snorted into the nostrils. Crack cocaine is pure cocaine liberated from the hydrochloride (hence known as "free base"), which makes it smokeable. What does it do? Cocaine is a potent stimulator of the central nervous system; a typical dose (about 50 to 100 milligrams) rapidly induces feelings of self-confidence, exhilaration and energy which last for 15 to 45 minutes before giving way to fatigue and melancholy. Crack cocaine condenses these effects into a shorter and more intense high. The drug also increases heart rate and blood pressure, sometimes fatally. Very high doses depress brain stem function, potentially leading to cardiac arrest and respiratory failure. The fatal dose can be as low as 1 gram. How does it work? Its principal effect is to block the re-uptake of dopamine, serotonin and noradrenalin into neurons, leading to higher-than-normal levels of these neurotransmitters in the brain. Dissociatives What are they? A class of hallucinogenic drugs that produce feelings of depersonalisation and detachment from reality. The most commonly used are ketamine and its relatives DXM (dextromethorphan hydrobromide) and PCP (phencyclidine, angel dust). What do they do? In small doses (up to about 75 milligrams) ketamine produces a psychedelic stimulant effect. 
The effect of higher doses has been described as an "out-of-body" experience. Users lose all sense of self and feel a detachment of mind and body, leading to a trance-like state in which they can experience a "superior reality" full of dazzling insights and visions. Some people find it wonderful, others terrifying. Effects last about an hour and wear off rapidly, leaving the user feeling groggy, and sometimes traumatised. Accidental overdoses are unknown: the drug has a wide safety margin.

How do they work? Ketamine is an inhibitor of NMDA receptors, which normally respond to the excitatory neurotransmitter glutamate. This has the effect of severely depressing activity in many parts of the brain while leaving some functions intact.

Ecstasy

What is it? The amphetamine derivative MDMA (3,4-methylenedioxy-N-methylamphetamine). What is sold as ecstasy on the street, however, often contains no MDMA.

What does it do? Technically known as a hallucinogenic amphetamine and also as an "empathogen", MDMA produces feelings of energy, euphoria, empathy, openness and a desire for physical contact (users are often described as "loved up"), plus mild visual and auditory hallucinations. Effects last for several hours and are followed by an equally lengthy period of lethargy and mild depression. MDMA is not toxic per se but can cause death due to overheating and dehydration. It also inhibits the production of urine and so can lead to a fatal build-up of fluid in the tissues.

How does it work? The drug causes the brain to dump large amounts of the mood-modulating neurotransmitter serotonin into the synapses, and also raises dopamine levels.

Hallucinogens/psychedelics

What are they? A broad class of natural and synthetic compounds that profoundly alter perception and consciousness. The most widely used are the LSD group, including LSD (lysergic acid diethylamide), LSA (d-lysergic acid amide), DMT (dimethyltryptamine, found in ayahuasca) and psilocybin (the main active ingredient of magic mushrooms).

What do they do? LSD produces experiences far removed from normal reality, including visual and auditory hallucinations, synaesthesia, time distortion, altered sense of self and feelings of detachment. Surfaces undulate and shimmer, colours are more intense and everyday objects can take on a surreal and fascinating appearance. The experience can be extremely frightening. After effects include fatigue and a vague sense of detachment. LSD is one of the most potent psychoactive substances known. Only 25 micrograms are required to produce an effect; 100 micrograms will induce 12 hours or more of profound psychedelia.

How do they work? No one really knows. LSD stimulates three subtypes of serotonin receptor, 5-HT2A, 5-HT2C and 5-HT1A, though it is not clear that this alone can account for its effects.

Opiates

What are they? Any compound that stimulates opioid receptors found in the brain, spinal cord and gut. The word "opioid" derives from opium, the narcotic resin extracted from unripe seed pods of the opium poppy (Papaver somniferum). The opiates include naturally occurring alkaloids such as morphine (the main active ingredient of opium), derivatives of these such as heroin, and entirely synthetic compounds such as methadone.

What do they do? Heroin, the most commonly used opiate, can induce euphoria, dreamy drowsiness and a general sense of well-being. The effects of injecting the drug have been described as a "whole-body orgasm", though some users experience no pleasurable effects at all.
It also causes nausea, constipation, sweating, itchiness, depressed breathing and heart rate. Higher doses lead to respiratory failure and death. The fatal dose depends on tolerance and how the drug is taken but a naive user would probably die after injecting 200 milligrams.

How do they work? By activating any of the three subtypes of opioid receptors. These normally respond to the body's natural painkilling chemicals including endorphins, which are released in highly stressful situations where pain would be disadvantageous.

Tobacco

What is it? Dried leaves of the tobacco plant Nicotiana tabacum, a native of South America. Usually smoked but can also be snorted as snuff or chewed. The main active ingredient is the alkaloid nicotine.

What does it do? Nicotine is a mild stimulant which increases alertness, energy levels and memory function. Paradoxically, users also report a relaxant effect. It also increases blood pressure and respiration rate and suppresses appetite. Larger doses cause hallucinations, nausea, vomiting and death. The lethal dose is about 60 milligrams; a typical cigarette delivers about 2 milligrams of nicotine into the bloodstream.

How does it work? Nicotine's principal effect is to stimulate nicotinic acetylcholine receptors in the brain, which leads to increased levels of the fight-or-flight hormone adrenalin. Also increases levels of dopamine.

From checker at panix.com Sat May 7 10:45:53 2005 From: checker at panix.com (Premise Checker) Date: Sat, 7 May 2005 06:45:53 -0400 (EDT) Subject: [Paleopsych] NS: Decaff coffee gives a buzz too Message-ID: Decaff coffee gives a buzz too http://www.newscientist.com/article.ns?id=dn3075&print=true * 10:31 19 November 2002 * James Randerson The buzz from your morning cup of coffee may not be caused by caffeine after all. According to new research, decaffeinated coffee may be just as good at raising the blood pressure, at least for drinkers not used to the black stuff. Numerous studies have shown that too much caffeine interferes with sleep patterns, but the long term health effects of the drug are more controversial. Some scientists claim that daily caffeine stimulation increases our risk of high blood pressure and heart disease later in life. But overall, the evidence is equivocal, says Alice Lichtenstein of the American Heart Association's Nutrition Committee. Now, the small Swiss-led study suggests that to focus on caffeine alone may be wrong. The researchers gave triple espressos to six regular coffee drinkers and nine volunteers who never consumed food or drinks containing caffeine. On a separate occasion, the caffeine-abstainers also drank a triple decaffeinated espresso, but they were not told which was which. To the researchers' surprise, both drinks had the same effect on the non-coffee drinkers, stimulating the sympathetic nervous system and raising blood pressure.
Regular coffee drinkers did not experience higher blood pressure after normal coffee, but their nervous system was stimulated. Corti does not think this is due simply to their bodies having got used to the effects of caffeine, because an intravenous caffeine injection did raise their blood pressure. He believes that other chemicals in coffee might block the caffeine stimulation. But Lichtenstein says the result could be due to differences in the method of delivery. Absorption via the gut can be slow and depends on what the volunteer had in their last meal. "It's very different from mainlining caffeine," she says. "The study has raised questions," says Lichtenstein. But she thinks it is too early to draw broad conclusions on the effects of caffeine and coffee. Journal reference: Circulation (vol 106, p 2935) Related Articles * [12]Coffee drinkers have lower diabetes risk * [13]http://www.newscientist.com/article.ns?id=dn3032 * 8 November 2002 * [14]Caffeine 'lotion' protects against skin cancer * [15]http://www.newscientist.com/article.ns?id=dn2714 * 26 August 2002 * [16]Caffeine key to curing a headache * [17]http://www.newscientist.com/article.ns?id=dn1491 * 29 October 2001 Weblinks * [18]University Hospital, Zurich * [19]http://www.usz.ch/e/index.html * [20]American Heart Association * [21]http://www.americanheart.org/ * [22]Caffeine FAQ * [23]http://coffeefaq.com/caffaq.html * [24]Circulation * [25]http://circ.ahajournals.org/ References 12. http://www.newscientist.com/article.ns?id=dn3032 13. http://www.newscientist.com/article.ns?id=dn3032 14. http://www.newscientist.com/article.ns?id=dn2714 15. http://www.newscientist.com/article.ns?id=dn2714 16. http://www.newscientist.com/article.ns?id=dn1491 17. http://www.newscientist.com/article.ns?id=dn1491 18. http://www.usz.ch/e/index.html 19. http://www.usz.ch/e/index.html 20. http://www.americanheart.org/ 21. http://www.americanheart.org/ 22. http://coffeefaq.com/caffaq.html 23. http://coffeefaq.com/caffaq.html 24. http://circ.ahajournals.org/ 25. http://circ.ahajournals.org/ From checker at panix.com Sat May 7 10:46:11 2005 From: checker at panix.com (Premise Checker) Date: Sat, 7 May 2005 06:46:11 -0400 (EDT) Subject: [Paleopsych] NYTBR: Freud and His Discontents Message-ID: Subject: NYTBR: Freud and His Discontents Freud and His Discontents New York Times Book Review, 5.5.8 http://www.nytimes.com/2005/05/08/books/review/08SIEGELL.html By LEE SIEGEL "CIVILIZATION AND ITS DISCONTENTS'' first appeared in 1930, and on the occasion of its 75th anniversary has been reissued by Norton ($19.95). A new edition of a classic text of Western culture is a happy occasion, not least because it offers the opportunity to debate the book's effect on the way we see the world -- or whether it has any effect at all. ''Classic'' can mean that an intellectual work is indisputably definitive in its realm, or it can mean that its prestige has outlived its authority and influence. Being leatherbound is sometimes synonymous with being timebound. Freud's essay rests on three arguments that are impossible to prove: the development of civilization recapitulates the development of the individual; civilization's central purpose of repressing the aggressive instinct exacts unbearable suffering; the individual is torn between the desire to live (Eros) and the wish to die (Thanatos). It is impossible to refute Freud's theses, too. 
All three arguments have died in the minds of many people, under the pressure of intellectual opposition, only to remain alive and well in the minds of many others. To clarify the status of Freud's influence today is to get a better sense of a central rift running through the culture we live in. In one important sense, Freud's ideas have had an undeniable impact. They've spelled the death of psychology in art. Freud's abstract, impersonal concepts have worn away the specificity of fictional character. By the 1950's, here and in Western Europe, it was making less and less sense to fashion the idiosyncratic, original inner and outer lives of a character in a novel. His or her behavior was already accounted for by the universal realities of id, ego, superego, not to mention the forces of repression, displacement and neurosis. Thus the postwar rise of the nouveau roman, with its absence of character, and of the postmodern and experimental novels, with their many strategies -- self-annulling irony, deliberate cartoonishness, montage-like ''cutting'' -- for releasing fiction from its dependence on character. For all the rich work published after the war, there's barely a fictional figure that has the memorableness of a Gatsby, a Nick Adams, a Baron Charlus, a Leopold Bloom, a Settembrini. And that's leaving aside the magnificent 19th century, when authors plumbed the depths of the human mind with something on the order of clairvoyance. Of course, before that, there was Shakespeare. And Cervantes. And Dante. And . . . It seems that the further back you go in time, away from Freud, the deeper the psychological portraits you encounter in literary art. Nowadays, often even the most accomplished novels offer characters that are little more than flat, ghostly reflections of characters. The author's voice, or self-consciousness about voice, substitutes mere eccentricity for an imaginative surrender to another life. But if we have Freud to blame for the long-drawn-out extinction of literary character, we also have Freud to thank for the prestige of film. The depiction of fictional people's inner lives is not the strength of the silver screen. Character gets revealed to us by plot turns, camera angles, musical scores -- by abstract, impersonal forces, much like Freud's concepts. In a novel, character is shaped from the inside out; in a film, it's molded from the outside and stays outside. How many movie characters can you think of -- with the exception, perhaps, of Citizen Kane -- whose names have the archetypal particularity of Isabel Archer or Sister Carrie? For better or for worse, film's independence from character is the reason it has replaced the novel as the dominant art form in our culture. Yet Freud himself drew his conception of the human mind from the type of imaginative literature his ideas were about to start making obsolete. His work is full of references to poets, playwrights and novelists from his own and earlier periods. In the latter half of his career, he applied himself more and more to using literature to prove his theories, commenting, most famously, on Shakespeare and Dostoyevsky. ''Civilization and Its Discontents'' brims with quotations from Goethe, Heine, Romain Rolland, Mark Twain, John Galsworthy and others. If Freud had had only his own writings to refer to, he would never have become Freud. Having accomplished his intellectual aims, he unwittingly destroyed the assumptions behind the culture that had nourished his work. 
Freud's universal paradigm for the human personality didn't mean only the decline of character in fiction. Its authoritative reduction of the human personality to developmental flaws undermined authority. The priest, the rabbi, the minister, the politician, the general may refer to objective facts and invoke objective truths and even ideals. They may be decent, reasonable people who have a strong sense of the reality principle, and of the reality of other people. But in Freud's eyes, they are, like everyone else, products of their own narrow, half-perceived conditions, which they project upon the world around them and sometimes mistake for reality. Nothing they say about the world goes unqualified by their conditions. ''Civilization and Its Discontents'' itself is the product of a profoundly agitated, even disturbed, mind. By the summer of 1929, when Freud began the book, anti-Semitism -- long a staple of Austrian politics -- had become at least as virulent in Austria as in neighboring Germany. Hatred of Jews played a central role in Austria's Christian Socialist and German Nationalist parties, which were about to win a majority in parliament, and there was widespread enthusiasm for Germany's rapidly growing National Socialists. It's not hard to imagine that Freud, slowly dying from the cancer of the mouth that had been diagnosed in 1923, and in great pain, felt more and more anxious about his life, and about the fate of his work. Perhaps it's this despairing frame of mind that leads Freud into sharp contradictions and intellectual lapses in ''Civilization and Its Discontents.'' He writes at one point that ''the low estimation put upon earthly life by the Christian doctrine'' was the first great expression of hostility to civilized society in the West; yet elsewhere, he cites the Christian commandment to love one's neighbor as oneself as ''one of the ideal demands, as we have called them, of civilized society.'' Later, in the space of two sentences, he gets himself tangled up when he tries to identify that commandment with civilization itself. He describes the sacred injunction as being ''undoubtedly older than Christianity,'' and then catches himself, as if realizing that the idea of universal love was unique to Christianity, and adds, ''yet it is certainly not very old; even in historical times it was still strange to mankind.'' Throughout the essay, Freud's hostility to Christianity is so intense that he seems determined to define civilization in Christian terms. The book should have been called ''Christian Society and Its Discontents.'' That is what it really is. And then there is the aggressive instinct, a universal impulse that Freud claims presents the sole impediment to Christian love and civilized society, but which he cannot quite bring in line with his earlier theories. It's as if he were, understandably, sublimating into theory his own feelings about the Christian civilization that, even before Hitler's formal ascension to power in 1933, seemed about to devour him and his family. Certainly, Freud's rage against the dark forces gathering against him has something to do with his repeated references, throughout the book, to great men in history who go to their deaths vilified and ignored. In one weird, remarkable moment, Freud introduces the idea of ''the superego of an epoch of civilization,'' thus supplanting even Jesus Christ with a Freudian concept -- thus supplanting Christ with Freud. 
But the most enigmatic, or maybe just incoherent, element of ''Civilization and Its Discontents'' is Freud's contention -- fancifully laid in 1920, in ''Beyond the Pleasure Principle'' -- that every individual wishes, on some level, to die. In ''Civilization and Its Discontents,'' he does not account for this outrageously counterintuitive idea, explain his application of it to history or even elaborate on it. The notion appears toward the end of the book and then does not occur again. Nine years later, in exile in England, weak and ill, Freud committed physician-assisted suicide, asking his doctor to give him a lethal dose of morphine. For all Freud's stern kindness toward humanity, for all his efforts to lessen the burden of human suffering, Thanatos seems to be the embittered way in which he universalized his parlous inner state. It hampers the understanding to read ''Civilization and Its Discontents'' without taking into consideration all these circumstances. If Freud has taught us anything, it's that any evaluation of authority has to examine the condition of those who stand behind it. As for repairing to ''Civilization and Its Discontents'' to gain essential elucidation of our own condition, the work seems as severely circumscribed by its time as by its author's situation. Today, Freud's stress on the formative effect of the family romance seems less and less relevant amid endless deconstructions and permutations of the traditional family. His argument that society's repressions create unbearable suffering seems implausible in a society where permissiveness is creating new forms of suffering. His fearless candor about sex appears quaint in a culture that won't stop talking about sex. And a great many people with faith in the inherent goodness of humankind believe that they are living according to ideal sentiments, universal principles or sacred commandments, unhampered by Freudian skepticism. Yet there are, unquestionably, people for whom Freud's immensely powerful ideas are a permanent condition of their lives. Behind the declaration of ideal sentiments, universal principles and sacred commandments, they see a craven sham concealing self-interest, greed and the wish to do harm. Neither of these two groups will ever talk the other out of its worldview. In this sense the conflict is not between the Islamic world and the ''liberal'' West; it is between religious people everywhere and people who, like Freud, see faith as an illusion, a set of self-deceiving notions about life. To put it another way, Freudianism is not a science; you either grasp the reality of Freud's dynamic notion of the subconscious intuitively -- the way, in fact, you do or do not grasp the truthfulness of Ecclesiastes -- or you cannot accept that it exists. For that reason, the most intractable division in the world now is between those who believe that the subconscious plays a fundamental role in human life, and those who don't. That's the real culture war, and maybe even the real clash of civilizations. Lee Siegel is the book critic for The Nation, the television critic for The New Republic and the art critic for Slate. From checker at panix.com Sat May 7 10:46:53 2005 From: checker at panix.com (Premise Checker) Date: Sat, 7 May 2005 06:46:53 -0400 (EDT) Subject: [Paleopsych] Dowd: What Rough Beasts? Message-ID: What Rough Beasts? Liberties column by Maureen Dowd, The New York Times, 5.5.7 http://www.nytimes.com/2005/05/07/opinion/07dowd.html WASHINGTON I love chimeras. 
I've seen just about every werewolf, Dracula and mermaid movie ever made, I have a Medusa magnet on my refrigerator, and the Sphinx of Greek mythology is a role model for her lethal brand of mystery. So when chimeras reared up in science news, I grabbed my disintegrating copy of Edith Hamilton's "Mythology" to refresh my memory on the Chimera, the she-monster with a lion's head, a goat's body and a serpent's tail: "A fearful creature, great and swift of foot and strong/Whose breath was flame unquenchable." Bellerophon, "a bold and beautiful young man" on flying Pegasus, shot arrows down at the flaming monster and killed her. Chimeras with "generally sinister powers," as Nicholas Wade [3]wrote in The Times, seemed to be a lesson in "the pre-Darwinian notion that species are fixed and penalties are severe" for crossing boundaries. Chimeras got attention again in the mid-80's, Sharon Begley of The Wall Street Journal noted, when embryonic goat cells were merged with embryonic sheep cells to produce a "geep," when a human-mouse chimera was born and when "scientists took brain-to-be tissue from quail embryos and transplanted it into chicken embryos. Once hatched, the chicks made sounds like baby quails." The U.S. Patent Office balked at an attempt last year to patent a "humanzee," a human-chimp chimera. But as the Stanford University bioethicist Henry Greely told Ms. Begley: "The centaur has left the barn." Knowing that mixing up species in a Circean blender conjures up nightmarish images, the National Academy of Sciences addressed the matter last month - stepping into the stem-cell vacuum left by the government and issuing research guidelines. While research on chimeras may be valuable, the guidelines, in a fit of "Island of Dr. Moreau" queasiness, suggested bans on inserting human embryonic stem cells into an early human embryo, apes or monkeys. The idea is to avoid animals with human sex cells or brain cells, Mr. Wade wrote. "There is a remote possibility that an animal with eggs made of human cells could mate with an animal bearing human sperm. To avoid human conception in such circumstances, the academy says chimeric animals should not be allowed to mate," he explained. Human cells in an animal brain could also be a problem. As Janet Rowley, a University of Chicago biologist, told a White House ethics panel: "All of us are aware of the concern that we're going to have a human brain in a mouse with a person saying, 'Let me out.' " Mary Shelley was right. Playing Creator is tricky - even if you chase down your accidents with torches. President Bush's experiments in Afghanistan and Iraq created his own chimeras, by injecting feudal and tribal societies with the cells of democracy, and blending warring factions and sects. Some of the forces unleashed are promising; others are frightening. In a chilling classified report to Congress last week, Gen. Richard Myers, chairman of the Joint Chiefs, conceded that Iraq and Afghanistan operations had restricted the Pentagon's ability to handle other conflicts. That's an ominous admission in light of North Korea's rush toward nukes, which was spurred on by the Iraq invasion and North Korea's conviction that, in bargaining with Mr. Bush, real weapons trump imaginary - or chimerical - ones. The U.S. invasion also spawned a torture scandal, and its own chimeric (alas, not chimerical) blend of former enemies - the Baathists and foreign jihadists - with access to Iraqi weapons caches. 
The Republican Party is now a chimera, too, a mutant of old guard Republicans, who want government kept out of our lives, and evangelical Christians, who want government to legislate religion into our lives. But exploiting God for political ends has set off powerful, scary forces in America: a retreat on teaching evolution, most recently in Kansas; fights over sex education, even in the blue states and blue suburbs of Maryland; a demonizing of gays; and a fear of stem cell research, which could lead to more of a "culture of life" than keeping one vegetative woman hooked up to a feeding tube. Even as scientists issue rules on chimeras in labs, a spine-tingling he-monster with the power to drag us back into the pre-Darwinian dark ages is slouching around Washington. It's a fire-breathing creature with the head of W., the body of Bill Frist and the serpent tail of Tom DeLay. E-mail: [4]liberties at nytimes.com References 1. http://www.nytimes.com/services/xml/rss/nyt/Opinion.xml 2. http://www.nytimes.com/top/opinion/editorialsandoped/oped/columnists/maureendowd/index.html?inline=nyt-per 3. http://www.nytimes.com/2005/05/03/science/03chim.html 4. mailto:liberties at nytimes.com From checker at panix.com Sat May 7 10:47:52 2005 From: checker at panix.com (Premise Checker) Date: Sat, 7 May 2005 06:47:52 -0400 (EDT) Subject: [Paleopsych] NYTBR: 'Perfectly Reasonable Deviations From the Beaten Track: The Letters of Richard P. Feynman' Message-ID: 'Perfectly Reasonable Deviations From the Beaten Track: The Letters of Richard P. Feynman' New York Times Book Review, 5.5.8 http://www.nytimes.com/2005/05/08/books/review/08ZERNIKE.html By KATE ZERNIKE PERFECTLY REASONABLE DEVIATIONS FROM THE BEATEN TRACK The Letters of Richard P. Feynman. Edited by Michelle Feynman. Foreword by Timothy Ferris. Illustrated. 486 pp. Basic Books. $26. In 1975, a woman from Seattle wrote the theoretical physicist and Nobel laureate Richard P. Feynman to declare that she had fallen in love after seeing him on ''Nova.'' ''Are there lots of physicists with fans?'' she wrote. ''You have one!'' Feynman wrote back flattered -- ''I need no longer be jealous of movie stars'' -- and signed off, ''Your fan-nee (or whatever you call it -- the whole business is new to me).'' It wasn't, of course. There were the high school students from Springfield, Mo., who sent him a hand-lettered birthday card to thank him for writing their textbook. The German man who wrote to share the poem he had created from a Feynman lecture. A man from Massachusetts wrote of a move afoot to draft Feynman for governor. A dentist wrote to ask his views on nuclear energy; an office equipment salesman, to propose an idea for a particle accelerator. A California correspondent inquired whether Feynman believed it possible to record dreams on tape, the way you do television programs. As the new collection of Feynman letters, ''Perfectly Reasonable Deviations From the Beaten Track,'' shows, Feynman inspired fan worship far beyond colleagues and students of science. Teenagers wrote to ask how they could be like him, and parents, how their children might be. Never mind whether a physicist might actually know something about child rearing or dreams or running a state. ''Why do I write you this letter?'' wrote the German who had turned a Feynman lecture into a poem to comfort himself after his father's death. 
''Partly to extend my thanks to you, to tell you that with these, to you maybe unimportant lines, you have filled another human being's need.'' A whole industry of Feynman books has shown us Feynman the genius (he won the Nobel Prize in 1965), Feynman the iconoclast (at a hearing in Washington, he dropped a piece of rubber into ice water to demonstrate with brilliant simplicity why the space shuttle Challenger had exploded) and Feynman the nutty professor (he played bongo drums). What this latest addition shows most remarkably is Feynman's place in the popular imagination -- and how striking it is that any physicist would occupy it. It has become common to complain that we have no public intellectuals, but think how much rarer is the public scientist; it is a safe bet more people can identify Paris Hilton than Harold Varmus. Ordinary people came to regard Feynman, the boy from Far Rockaway, as theirs -- ''Thanks for the talk,'' one British man wrote after seeing him on television in 1981. He sparked excitement not just about science but also about the power of creativity, passion, curiosity. ''Work hard to find something that fascinates you,'' he wrote to one of the many students who asked him for advice. ''When you find it you will know your lifework. A man may be digging a ditch for someone else, or because he is forced to, or is stupid -- such a man is 'toolish' -- but another working even harder may not be recognized as different by the bystanders -- but he may be digging for treasure. So dig for treasure and when you find it you will know what to do.'' This selection of letters, edited by Feynman's daughter, Michelle, is billed as the closest thing possible to his autobiography; several books written before his death in 1988 were collections of lectures, or spoken memoirs recorded by his frequent collaborator Ralph Leighton. And as you would expect in an autobiography, letters here touch on science and Feynman's place in its history. A letter to his mother toward the end of his work on the Manhattan Project recounts the detonation of the first atomic bomb. A letter from 1967 counsels James Watson not to pay attention to criticism of the manuscript that would become ''The Double Helix.'' Feynman acknowledges in a letter to a man who wrote after reading a newspaper interview that he had hoped ''to quietly demur'' the Nobel Prize, but did not want to create a public stink once the honor had been reported in newspapers. The chapter of congratulatory telegrams and letters sent after the prize was announced bubbles with giddy excitement. But the freshest and most interesting letters here are the ones written to regular folk -- teenagers or teachers or parents who wrote to him from all over the world in moments of academic crisis or emotional doubt. To a student in India who complained that he was teased because of his stuttering, Feynman sent a book of physics problems and a letter encouraging him to ''study calmly and quietly those things which interest you most.'' He assured a high school student from Connecticut, who worried that difficulties in math would make it hard to pursue physics, not to be afraid: ''If you have any talent, or any occupation that delights you, do it, and do it to the hilt. 
Don't ask why, or what difficulties you may get into.'' A father from Alaska asked for help in directing his 16-year-old stepson -- ''a bit overweight, a little shy'' and ''no genius you understand, but a lot smarter than I am in math and such.'' Feynman told the man to have patience -- ''Let him go, let him get all distorted studying what interests him the most as much as he wants'' -- and to take father-and-son walks in the evening ''and talk (without purpose or routes) about this and that.'' He had no good way, he wrote, to make the boy figure out what he wanted in life. ''But to fall in love with a wonderful woman and to talk to her quietly in the night will do wonders.'' WHILE his spoken memoirs burnished the popular impression of Feynman as the merry prankster, the letters here imply he grew tired of that image. To a Swedish letter writer who had apparently suggested that playing the bongo drums made a physicist ''human,'' he replied: ''Theoretical physics is a human endeavor, one of the higher developments of human beings -- and this perpetual desire to prove that people who do it are human by showing that they do other things that a few other humans do (like playing bongo drums) is insulting to me. I am human enough to tell you to go to hell.'' Some of the earliest letters, written to his mother from college, are less illuminating, recording details like how many hours he slept. But others sketch his relationship with his first wife, who died from tuberculosis at a sanitarium in Albuquerque, where she had moved to live near him while he worked on the Manhattan Project in Los Alamos. The tenderness and the agony expressed in the letters written around the time of her death make you wonder if Feynman cultivated the jokester image to mask the pain of such a tremendous loss at such an early age. ''I find it hard to understand in my mind what it means to love you after you are dead,'' he wrote in 1946, nearly a year and a half after she had died. ''But I still want to comfort and take care of you -- and I want you to love me and care for me.'' Perhaps they could still make plans together, he ventured -- but no, he had lost his ''idea-woman,'' the ''general instigator of all our wild adventures.'' ''You can give me nothing now yet I love you so that you stand in my way of loving anyone else,'' he wrote. ''But I want to stand there. You, dead, are so much better than anyone else alive.'' Feynman's daughter writes that the letter is considerably more worn than the others, suggesting that he went back to reread it again and again. His fans -- new ones, too -- will find themselves doing the same. Kate Zernike is a national correspondent for The Times. From checker at panix.com Sat May 7 10:48:18 2005 From: checker at panix.com (Premise Checker) Date: Sat, 7 May 2005 06:48:18 -0400 (EDT) Subject: [Paleopsych] NYTBR: 'Irresistible Empire,' by Victoria de Grazia Message-ID: 'Irresistible Empire,' by Victoria de Grazia New York Times Book Review, 5.5.8 http://www.nytimes.com/2005/05/08/books/review/08HARIL.html 'Irresistible Empire': McEurope By JOHANN HARI IRRESISTIBLE EMPIRE America's Advance Through Twentieth-Century Europe. By Victoria de Grazia. Illustrated. 586 pp. The Belknap Press/Harvard University Press. $29.95. Victoria De Grazia slaps Europeans in the face with her title: face it, mes amis, you are part of the American empire. Zut alors! As a proud European, I confess to letting out a small splutter. Didn't we just snub the emperor-president, George W. 
Bush, by rejecting great swaths of his foreign policy? Don't we have a social democratic model that avoids the vertiginous inequalities of the United States? Aren't we free and self-determining democracies, building up our own superpower based in Brussels? Well, maybe -- but if Charlemagne or Napoleon could see their continent today, they would be with de Grazia. One glance at Europe's great capitals, and they would assume Europe had been conquered, occupied and settled by Americans. The men who dreamed of l'Europe profonde would curse the ubiquity of Eminem as they sat in the greasy KFC on the Falls Road in Belfast munching their Chicken Popcorn. They would stagger their way around Italy's most beautiful city, guided by a McDonald's map of McVenice. ''Irresistible Empire'' is the story of how this happened, of how an imperium came to Europe in the form of an emporium. Unlike the Middle East and Latin America, Europe has seen only the peaceful face of America's empire. De Grazia, a professor of history at Columbia University, shows how -- in just one century -- the Old Continent was subject to slow conquest by a million consumer goods. She talks us through the rise of a string of outwardly banal institutions: Rotary Clubs, supermarkets, the Hollywood star system, corporate advertising. With the careful skill of an expert defusing an explosive, she teases out the dense clusters of political ideology embodied in these seemingly everyday social institutions. The old European capitalism was epitomized by the small neighborhood store. At its core was ''a commercial ethic that still sought trust in the longevity of contacts and the solidarity of face-to-face contacts.'' Slowly, this model was eclipsed by an American capitalism epitomized by the out-of-town supermarket, big, anonymous and neon. The local, the diverse and the class-segmented were all ironed out in favor of a mass standard of consumption. Instead of the Great Man theory of history, de Grazia gives us the Great Bargain theory. It's startling just how rapidly Europe has been changed by this new model. Who knew, for example, that in the late 1950's there was not a single supermarket in Milan, and this was by no means untypical for a European city? How many of us realize that as recently as 1954, only 9 percent of French people had a refrigerator? Yet in a work crammed with exhaustive research, the reader pines for analysis. All of contemporary European politics spins on the question: Were the developments de Grazia details good for Europe? The answer cannot be squeezed into the simple-minded good-and-evil story craved on both sides of the Atlantic. De Grazia concentrates primarily on the areas where Americanization has had a broadly positive effect. But even the Marxist theorist Antonio Gramsci wrote lovingly in his ''Prison Notebooks'' of how Americanism and Fordism were hollowing out the old feudal snobberies. Perhaps unwittingly, de Grazia steers away from the areas where corporate Americanization has been pernicious for Europe. I would like to see her turn her remarkable skills to charting the effect of American corporate buyouts on Europe's news and entertainment media. Nor does she mention the environmental impact of this economic system. And is any analysis of European supermarkets complete without mentioning the biggest public health scare on the continent for 50 years? 
The supermarkets, relentless pressure for cheap food and industrial agriculture all pushed farmers to feed processed meat to cattle -- thus leading to Mad Cow disease. The other gap in de Grazia's narrative concerns a strange historical irony: Americanization gave birth to the idea of Europe. ''Henry Ford was as much a father of the European idea as anyone from Europe, given his company's pioneering effort to treat the European region as a single sales territory,'' she notes dryly. She's right, but the point is left hanging. Ford set in motion the conflict that will define the European Union over the coming decades. It can be summarized in a simple question: Does Europe exist to achieve Ford's goal -- one vast market for American goods -- or to resist this possibility and create a distinctively social democratic alternative? De Grazia focuses on one relatively minor European-resistance cause -- the slow food movement -- and at first it seems an eccentric choice. Founded in the late 1980's, it has a simple purpose: to give everybody the time and space to eat good, unprocessed food slowly and carefully. When America waves a Big Mac, the Slow Foodies want Europe to wave a bowl of freshly cooked pasta al dente. But if they are few in number, their approach symbolizes the wider European reaction to American neoliberalism: slow down. Europeans want long vacations, generous welfare states and flexible work hours. They -- we -- are trying to articulate a different model of consumerism that values leisure and family as much as work, work, work. Can this work? Is there a way to combine America's dazzling consumer economy with social justice and environmental sanity? Like many Europeans, I have a dystopian vision of the death of Europe as an alternative to America. Far from resolving tensions across the Atlantic, the irresistible empire's blanding out would simply produce a long European grudge, a quiet rage that we had sold our identity for a bag of Doritos. I see it now: Napoleon and Charlemagne sit grumbling in a doughnut shop after a long day in the new Euro-America. Napoleon swallows hard on a Krispy Kreme, turns to a weeping Charlemagne, and whispers, ''Dude, I so hate America.'' Johann Hari is a columnist for The Independent in London and the author of ''God Save the Queen? Monarchy and the Truth About the Windsors.'' From checker at panix.com Sun May 8 14:56:17 2005 From: checker at panix.com (Premise Checker) Date: Sun, 8 May 2005 10:56:17 -0400 (EDT) Subject: [Paleopsych] More on Chimera from Today's Wall Street Journal Message-ID: ---- Original Message ----- From: L. Stephen Coles, M.D., Ph.D. To: Gerontology Research Group Cc: irv at stanford.edu ; sciencejournal at wsj.com Sent: Friday, May 06, 2005 6:36 PM Subject: [GRG] More on Chimera from Today's Wall Street Journal SCIENCE JOURNAL: "Now That Chimeras Exist, What if Some Turn Out Too Human?" by Sharon Begley, Science Writer May 6, 2005; New York, NY (WSJ; p. B1) -- If you had just created a mouse with human brain cells, one thing you wouldn't want to hear the little guy say is, "Hi there, I'm Mickey." Even worse, of course, would be something like, "Get me out of this &%#ing body!" COMMENT: I believe the approved thing to start out with, is a mantra: "Not to go on all-Fours; that is the Law. Are we not Men?" After that come sad considerations on the relative shortness of animal life span. Less Stuart Little than Brian the Dog. 
Seriously, though, does anybody really believe that a mouse with human brain cells has any chance of achieving even Algernon-hood, let alone human-grade thought? It's less a matter of what species of brain cells you have, than it is of how many and how connected and how specialized. A mouse skull is only so big. I suspect you won't get anywhere making a smarter mouse unless you give it *bird* neurons, not human ones. Bird brains have amazing processing power for the size and weight. So forget human brain cells, and forget the scariness of human chimeras, unless they are animals of approximately human size. For ordinary lab animals you don't get scary intelligence till you scale a *bird brain* up to medium mammal size, and that gives you something with as many neurons as a human brain, yet not human. I have in mind a cartoon I saw once of a cooperative lab rabbit, which informs the two white-coated scientists: "Hi, I'm Linda. I'll be your bunny for the experiment today." SBH From checker at panix.com Sun May 8 14:56:35 2005 From: checker at panix.com (Premise Checker) Date: Sun, 8 May 2005 10:56:35 -0400 (EDT) Subject: [Paleopsych] McKinsey: The demographic deficit: How aging will reduce global wealth Message-ID: The demographic deficit: How aging will reduce global wealth http://www.mckinseyquarterly.com/article_page.aspx?ar=1588&L2=7&L3=10 To fill the coming gap in global savings and financial wealth, households and governments will need to increase their savings rates and earn higher returns on the assets they already have. Diana Farrell, Sacha Ghai, and Tim Shavers The McKinsey Quarterly, Web exclusive, March 2005 The world's population is aging, and as it gets even grayer, bank balances will stop growing and living standards, which have improved steadily since the industrial revolution, could stagnate. The reason is that the populations of Japan, the United States, and Western Europe, where the vast majority of the world's wealth is created and held, are aging rapidly (Exhibit 1). During the next two decades, the median age in Italy will rise to 51, from 42, and in Japan to 50, from 43. Since people save less after they retire and younger generations in their prime earning years are less frugal than their elders were, savings rates are set to fall dramatically. In just 20 years, household financial wealth in the world's major economies will be roughly $31 trillion less than it would have been if historical trends had persisted, according to new research by the McKinsey Global Institute (Exhibit 2). If left unchecked, the slowdown in global-savings rates will reduce the amount of capital available for investment and impede economic growth. No country will be immune. For the United States -- with its relatively young population, higher birthrates, and steady influx of immigrants -- the aging trend will be relatively less severe. Still, its savings rate is already dismally low, even before the baby boomers have started to retire. To finance its massive current-account deficit, the United States relies on capital flows from Europe and Japan, but they too face rapidly aging populations. Even fast-growing developing countries such as China will not be able to generate enough savings to make up the difference. Finding solutions won't be easy. Raising the retirement age, easing restrictions on immigration, or encouraging families to have more children will have little impact. Boosting economic growth alone is not a solution, nor is the next productivity revolution or technological breakthrough.
To fill the coming gap in global savings and financial wealth, households and governments will need to increase their savings rates and to earn higher returns on the assets they already have. These changes involve hard choices but can offer a brighter future.

Growing older, saving less

In just two decades, the proportion of people aged 80 and above will be more than 2.5 times higher than it is today, because women are having fewer children and people are living longer. In about a third of the world's countries, and in the vast majority of developed nations, the fertility rate is at, or below, the level needed to maintain the population. Women in Italy now average just 1.2 children. In the United Kingdom, the figure is 1.6; in Germany, 1.4; and in Japan, 1.3. Meanwhile, thanks to improvements in health care and living conditions, average life expectancy has increased from 46 years in 1950 to 66 years today. As the elderly come to make up a larger share of the population, the total amount of savings available for investment and wealth accumulation will dwindle. The prime earning years for the average worker are roughly from age 30 to 50; thereafter, the savings rate falls. With the onset of retirement, households save even less and, in some cases, begin to spend accumulated assets. The result is a decline in the prime savers ratio -- the number of households in their prime saving years divided by the number of elderly households. This ratio has been falling in Japan and Italy for many years. In Japan, it dropped below one in the mid-1980s, meaning that elderly households now outnumber those in their highest earning and saving years. Japan is often thought to be a frugal nation of supersavers, but its savings rate actually has already fallen from nearly 25 percent in 1975 to less than 5 percent today. That figure is projected to hit 0.2 percent in 2024. In 2000, the prime savers ratios of Germany, the United Kingdom, and the United States either joined the declining trend or stabilized at very low levels. This unprecedented confluence of demographic patterns will have significant ramifications for global savings and wealth accumulation. How the decline in prime savers will affect total savings depends on how these people's savings behavior changes over the course of a household's life. Germany, Japan, and the United States have traditional hump-shaped life cycle savings patterns (Exhibit 3). In these countries, aging populations will cause a dramatic slowdown in household savings and wealth. In contrast, Italy has a flatter savings curve, resulting in part from historical borrowing constraints that forced households led by people in their 20s and 30s to save more. Thus an increase in the share of elderly households will have less impact on the country's financial wealth. In some countries, the relatively lower savings rates of younger generations in their peak earning years will exacerbate the slowdown in savings and wealth. In the United States and Japan, where we analyzed generation-specific savings data, several factors contribute to this pattern: a tendency to rely more on inheritance than past generations did, the good fortune to avoid the economic hardships that prompted earlier generations to be more frugal, and the availability of consumer credit and mortgages (which, in the case of Japan, have become more socially acceptable).
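The prime savers ratio and the hump-shaped life cycle curve described above lend themselves to a quick worked example. The sketch below is purely illustrative -- the cohort savings rates, incomes, and household counts are assumed numbers, not McKinsey Global Institute data -- but it shows how shifting households out of the prime 30-to-50 cohort and into retirement pulls down the aggregate savings rate even when each cohort's own behavior is unchanged.

```python
# Toy life-cycle savings model (illustrative numbers, not McKinsey Global Institute data).
savings_rate = {"under_30": 0.05, "30_to_50": 0.15, "50_to_65": 0.10, "over_65": -0.02}
income = {"under_30": 35_000, "30_to_50": 60_000, "50_to_65": 55_000, "over_65": 30_000}

def aggregate_savings_rate(households):
    """Aggregate saving divided by aggregate income for a given age mix (households in millions)."""
    total_income = sum(households[c] * income[c] for c in households)
    total_saving = sum(households[c] * income[c] * savings_rate[c] for c in households)
    return total_saving / total_income

def prime_savers_ratio(households):
    """Households in their prime saving years (30 to 50) per elderly household."""
    return households["30_to_50"] / households["over_65"]

today = {"under_30": 20, "30_to_50": 40, "50_to_65": 25, "over_65": 20}
in_20_years = {"under_30": 18, "30_to_50": 32, "50_to_65": 28, "over_65": 32}

for label, mix in [("today", today), ("in 20 years", in_20_years)]:
    print(f"{label}: prime savers ratio = {prime_savers_ratio(mix):.2f}, "
          f"aggregate savings rate = {aggregate_savings_rate(mix):.1%}")
```

With these assumed numbers the prime savers ratio falls from 2.0 to 1.0 and the aggregate savings rate drifts down even though no cohort changes its behavior; the article's larger projected effect also reflects lower-saving younger generations and outright dissaving in retirement.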
The coming shortfall in household wealth

Most of the public discussion on aging populations has focused on the rapidly escalating cost of pensions and health care. Little attention has been paid to the potentially far more damaging effect that this demographic phenomenon will have on savings, wealth, and economic well being. As more households retire, the decline in savings will slow the growth in household financial wealth in the five countries we studied by more than two-thirds -- to 1.3 percent, from the historical level of 4.5 percent. By 2024, total household financial wealth will be 36 percent lower -- a drop of $31 trillion -- than it would have been if the higher historical growth rates had persisted. Of course, changes in savings behavior by households and governments or increases in the average rate of return earned on those savings could alter this outcome. Without such changes, however, our analysis indicates that the aging populations in the world's richest nations will exert severe downward pressure on global savings and financial wealth during the next two decades. The United States will experience the largest shortfall in household financial wealth in absolute terms -- $19 trillion by 2024 -- because of the size of its economy. The growth rate of the country's household financial wealth will decline to 1.6 percent, from 3.8 percent. Since the aging trend is less severe in the United States, reduced savings rates among younger generations are responsible for a large part of the decline. In Japan, the situation is much more serious. Household financial wealth will actually start declining during the next 20 years: by 2024 it will be $9 trillion -- 47 percent lower than it would have been if historical growth rates had persisted. Japan's demographic trends are severe: the median age will increase to 50, from 43 (for the US population, it will rise to 38, from 37), and the savings of elderly households fall off at a faster rate in Japan than in the United States. Even more important, household financial wealth in Japan is almost exclusively the result of new savings from income rather than of asset appreciation; therefore, the falloff in savings causes a bigger decline in wealth. The outlook for Europe varies by country. Italy will experience a large decline in the growth rate of its financial wealth -- to just 0.9 percent, from 3.4 percent -- because of the rapid aging of its population. Its relatively flat life cycle savings curve will mitigate the impact, however, resulting in an absolute shortfall of about $1 trillion, or 39 percent. The projected decline in the growth rate of financial wealth in other countries will be less dramatic: to 2.4 percent, from 3.8 percent, in Germany (because of its higher savings rates) and to a still-healthy 3.2 percent, from 5.1 percent, in the United Kingdom (because of its stronger demographics).

Global ripple effects

This slowdown in household savings will have major implications for all countries. In recent years the United States has absorbed more than half of the world's capital flows while running a current-account deficit approaching 6 percent of GDP. Japan has historically enjoyed a huge current-account surplus, which has allowed it to be a major exporter of capital to other countries, notably the United States. The expected drop in Japan's household savings will make this arrangement increasingly untenable. In all likelihood, the United States also won't be able to rely on European nations, with their aging populations, to increase capital flows.
Nor can it expect rapidly industrializing nations, such as China, to fill the gap. Even if China's economy continued to grow at its current breakneck pace, it would need approximately 15 years to reach Japan's current GDP. In any case, if China is to sustain this growth, the United States must continue consuming at its current level -- something it cannot do if capital flows from abroad decrease. Even if China did have savings to export, it would have to confront the obstacles posed by its current exchange rate and capital controls regime.

Although an increase in global interest rates and the cost of capital may seem inevitable, it is not. As global savings fall, markets can adjust through changes in both asset prices and the demand for capital, and it is unclear which adjustment will predominate. On the one hand, some economists forecast less demand for capital: fewer households will be taking out mortgages and borrowing for college, governments will invest less in infrastructure to keep pace with population growth, and businesses won't have to add as much capital equipment to accommodate a labor force that will no longer be growing. On the other hand, the demand for capital is likely to remain strong if emerging markets and rich countries seek to boost their GDP and productivity growth by increasing the amount of capital per worker. Likewise, while a drop in global savings could drive up asset prices, opposing forces will also be at work, as retirees begin selling their financial assets.4 One thing is certain: as household savings rates decline and the pool of available capital dwindles, persistent government budget deficits will likely push interest rates higher and crowd out private investment. The rising cost of caring for an aging population in the years to come will force national governments to exercise better fiscal discipline.

No easy solutions

Many policy changes suggested today, such as increasing immigration, raising the retirement age, encouraging households to have more children, and boosting economic growth, will do little to mitigate the coming shortfall in global financial wealth (Exhibit 4). Our analysis shows that an aggressive effort to increase immigration won't solve the problem, simply because new arrivals represent only a tiny proportion of any country's population. In Germany, for instance, a 50 percent increase in net immigration (to 100,000 people a year) would raise total financial assets just 0.7 percent by 2024. In Japan, doubling official projections of net immigration would have almost no impact on the number of households or on the country's aggregate savings. The same is true even in the United States, which had the highest historical levels of immigration in our sample.

Since households don't reach their prime saving years until middle age, promoting higher birthrates through policies such as child tax credits, generous maternity-leave policies, and child care subsidies will also have only a negligible effect by 2024. This approach could actually make the situation worse by adding child dependents to a workforce already supporting a larger number of elderly. Similarly, raising the retirement age won't be particularly effective in most countries. In Japan, efforts to expand the peak earning and saving period by five years (a proxy for raising the retirement age) would close 25 percent of the projected wealth shortfall in that country. In Italy, however, this approach would have little impact because households do not greatly reduce their savings in retirement.
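Why immigration moves the needle so little is a stock-versus-flow point: annual arrivals are tiny next to the existing population of households. A rough sketch (Python; the population, household size, and horizon are invented for illustration -- only the extra 50,000 arrivals per year echoes the German example above) makes the orders of magnitude visible.

    # Illustrative orders of magnitude, not the study's model.
    existing_households = 39_000_000         # hypothetical country
    extra_immigrants_per_year = 50_000       # the increment discussed above
    years = 20
    persons_per_household = 2.2

    extra_households = extra_immigrants_per_year * years / persons_per_household
    share = extra_households / existing_households
    print(f"extra households after {years} years: {extra_households:,.0f}")
    print(f"as a share of existing households: {share:.1%}")
    # Even before allowing for the fact that new arrivals start with little
    # accumulated wealth, the addition is on the order of one percent.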
After the IT revolution and the jump in US productivity growth during the late 1990s, it may be tempting to think that countries might grow themselves out of the problem. Without changes in the relationship between income and spending, however, an increase in economic growth won't generate enough new savings to close the gap. The simple reason is that as incomes and standards of living rise, so does consumption. For instance, raising average income growth in the United States by one percentage point -- a huge increase -- would narrow the projected wealth shortfall by only 10 percent.

Navigating the demographic transition

The only meaningful way to counteract the impending demographic pressure on global financial wealth is for governments and households to increase their savings rates and for economies to allocate capital more efficiently, thereby boosting returns.

Boosting asset appreciation

The underlying performance of domestic capital markets varies widely across countries, resulting in significantly different rates of return.5 Since 1975, the average rate of financial-asset appreciation in the United Kingdom and the United States has been nearly 1 percent a year, after adjusting for inflation. In contrast, financial assets in Japan have depreciated by a real 1.8 percent annually over the same period (although the ten-year moving average is now near zero). Real rates of asset appreciation have been negative in Germany and Italy as well. UK and US households compensate for their low savings rates by building wealth through high rates of asset appreciation. Their counterparts in Continental Europe and Japan save at much higher rates but ultimately accumulate less wealth, since these savings generate low or negative returns. From 1975 to 2003, unrealized capital gains increased the value of the financial assets of US households by almost 30 percent. But in Japan the value of such assets declined. European countries fell somewhere in between.

Raising the rates of return on the $56 trillion of household savings in the five countries we studied could avert much of the impending wealth shortfall. In Germany, increasing the appreciation of financial assets to 0 percent, from the historical average of -1.1 percent, would completely eliminate the projected wealth shortfall. The opportunity is also large for Italy, since its real rate of asset appreciation has averaged -1.6 percent since 1992; raising returns to the levels in the United Kingdom and the United States would fully close the gap. For the latter two countries, the challenge could be more difficult because their rates of asset appreciation are already high.

Achieving the required rates of return will call for improved financial intermediation so that savings are funneled to the most productive investments. To achieve this goal, policy makers must increase competition and encourage innovation in the financial sector and in the economy as a whole,6 enhance legal protections for investors and creditors, and end preferential lending by banks to companies with political ties or shareholder relationships. For some countries, such as Japan, where households keep more than half of their financial assets in cash equivalents, diversifying the range of assets that individuals hold is an important means of increasing the efficiency of capital allocation.7 To promote a better allocation of assets, policy makers should remove investment restrictions for households, improve investor education, and create tax incentives for well-diversified portfolios.
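The contrast drawn above between high-saving, low-return households and low-saving, high-return ones comes down to how appreciation compounds on the same stream of savings. The sketch below (Python; the annual saving amount and the 28-year horizon are illustrative, while the +1.0 percent and -1.8 percent real appreciation rates echo the UK/US and Japan figures cited above) accumulates an identical savings stream under the two rates.

    def accumulate(annual_saving, real_return, years):
        """Wealth after adding a fixed annual saving and letting the stock
        appreciate (or depreciate) at a constant real rate."""
        wealth = 0.0
        for _ in range(years):
            wealth = wealth * (1 + real_return) + annual_saving
        return wealth

    # The same stream of savings under the two appreciation rates cited above
    # (illustrative saving amount; 28-year horizon loosely echoing 1975-2003).
    saving = 5_000
    at_uk_us_rate = accumulate(saving, real_return=0.01,   years=28)
    at_japan_rate = accumulate(saving, real_return=-0.018, years=28)

    print(f"appreciating at +1.0%: {at_uk_us_rate:,.0f}")
    print(f"depreciating at -1.8%: {at_japan_rate:,.0f}")
    print(f"gap due to appreciation alone: "
          f"{1 - at_japan_rate / at_uk_us_rate:.0%}")

With these assumptions the appreciation difference alone erodes roughly a third of the accumulated stock, before any difference in savings rates is considered.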
New research in behavioral economics has shown that offering a balanced, prudent allocation as the default option for investors can improve returns because they overwhelmingly stick with this option.8

Increasing savings rates

In many countries, today's younger generations earn more and save less than their elders do. This discrepancy is an important driver of the wealth shortfall in the United States and, more surprisingly, in Japan. If younger generations saved as much as their parents did while continuing to earn higher incomes, one-quarter of Japan's wealth shortfall and nearly a third of the US gap would be closed by 2024. Persuading young people to save more is difficult, however, and tax incentives aimed at increasing household savings have yielded mixed results.9 Contrary to conventional wisdom, too much borrowing is not the culprit in most countries. Although household liabilities have grown significantly faster than assets have across our sample since 1982, keeping consumer borrowing in line with asset growth would close $2.3 trillion, or just 7.5 percent, of the projected wealth shortfall.

The key to boosting household savings is overcoming inertia. When companies automatically enroll their employees in voluntary savings plans (letting them opt out if they choose) rather than requiring people to sign up actively, participation rates rise dramatically.10 A study at one US Fortune 500 company that instituted such a program found that enrollment in its 401(k) retirement plan jumped to 80 percent, from 36 percent; the increase among low-income workers was even greater.11 In addition, a substantial fraction of the participants in the automatic-enrollment program accepted the default for both the contribution rate and the investment allocation -- a combination chosen by few employees outside the program.

Of course, governments can also increase the savings rates of their countries through the one mechanism directly under their control -- reducing fiscal budget deficits. Maintaining fiscal discipline now is vital if governments are to cope with the escalating pension and health care costs that aging populations will incur.

If policy makers take no action, the coming slowdown in global savings and the projected decline in financial wealth could depress investment, economic growth, and living standards in the world's largest and wealthiest countries. The future development of poor nations could also be in jeopardy. A concerted, sustained effort to increase the efficiency of capital allocation, boost savings rates, and close government budget deficits can avert this outcome.

About the Authors

Diana Farrell is director of the McKinsey Global Institute, where Tim Shavers is a consultant; Sacha Ghai is a consultant in McKinsey's Toronto office. The authors wish to thank Ezra Greenberg, Piotr Kulczakowicz, Susan Lund, Carlos Ocampo, and Yoav Zeif for their contributions to this article.

Notes

1. All figures given in this article are valued in 2000 US dollars, and all growth rates indicate real terms.
2. This study examined the impact of demographic trends on household savings and wealth in Germany, Italy, Japan, the United Kingdom, and the United States. The full report, The Coming Demographic Deficit: How Aging Populations Will Reduce Global Saving, is available for free online.
3. The State of World Population, 1999 and 2004, United Nations Population Fund.
4. Empirical analyses on the impact of demographic changes on financial-asset prices and returns are inconclusive. See Barry P. Bosworth, Ralph C.
Bryant, and Gary Burtless, The Impact of Aging on Financial Markets and the Economy: A Survey, Brookings Institution, July 2004; and James Poterba, "The impact of population aging on financial markets," National Bureau of Economic Research working paper W10851, October 2004.
5. In this article, the terms "financial-asset appreciation" and "returns" refer to the unrealized capital gains on financial assets, not to interest and dividends paid. By convention, interest and dividends are treated as household income, a portion of which may be saved.
6. For a good synthesis of MGI's research, see William W. Lewis, The Power of Productivity, Chicago: University of Chicago Press, 2004.
7. Moving households closer to the efficient frontier of risk and returns serves to make asset pricing more precise and forces companies to practice greater capital market discipline.
8. Brigitte C. Madrian and Dennis F. Shea, "The power of suggestion: Inertia in 401(k) participation and savings behavior," Quarterly Journal of Economics, November 2001, Volume 116, Number 4, pp. 1149-87.
9. B. Douglas Bernheim, "Taxation and saving," Handbook of Public Economics, Volume 3, Alan J. Auerbach and Martin Feldstein (eds.), New York: Elsevier North-Holland, 2002.
10. James J. Choi, David Laibson, Brigitte C. Madrian, and Andrew Metrick, "Defined contribution pensions: Plan rules, participant decisions, and the path of least resistance," National Bureau of Economic Research working paper W8655, December 2001.
11. Brigitte C. Madrian and Dennis F. Shea, "The power of suggestion: Inertia in 401(k) participation and savings behavior," Quarterly Journal of Economics, November 2001, Volume 116, Number 4, pp. 1149-87.

From checker at panix.com Sun May 8 14:57:33 2005
From: checker at panix.com (Premise Checker)
Date: Sun, 8 May 2005 10:57:33 -0400 (EDT)
Subject: [Paleopsych] Kahn and Wiener on Computers (1967)
Message-ID:

Kahn missed the Internet, though. Also the collapse of communism and the fitness revolution.

----- Forwarded message from James Fehlinger -----
From: James Fehlinger
Date: Sat, 7 May 2005 11:10:48 -0400
To: eugen at leitl.org
Subject: One of the most important discoveries of the twentieth century

From _The Year 2000: A Framework for Speculation on the Next Thirty-Three Years_ by Herman Kahn and Anthony J. Wiener, The Hudson Institute, Inc., 1967

pp. 89 - 91:

"If computer capacities were to continue to increase by a factor of ten every two or three years until the end of the century (a factor between a hundred billion and ten quadrillion), then all current concepts about computer limitations will have to be reconsidered. Even if the trend continues for only the next decade or two, the improvements over current computers would be factors of thousands to millions. If we add the likely enormous improvements in input-output devices, programming and problem formulation, and better understanding of the basic phenomena being studied, manipulated, or simulated, these estimates of improvement may be wildly conservative. And even if the rate of change slows down by several factors, there would still be room in the next thirty-three years for an overall improvement of some five to ten orders of magnitude. Therefore, it is necessary to be skeptical of any sweeping but often meaningless or nonrigorous statement such as 'a computer is limited by the designer -- it cannot create anything he does not put in,' or that 'a computer cannot be truly creative or original.'
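The parenthetical "factor between a hundred billion and ten quadrillion" is simply a tenfold improvement every two or three years compounded over the thirty-three years from 1967 to 2000; Kahn and Wiener presumably rounded. A one-line check (Python) reproduces the range.

    # Tenfold improvement every 3 years vs. every 2 years, over 33 years.
    years = 33
    slow = 10 ** (years / 3)    # one factor of 10 every 3 years
    fast = 10 ** (years / 2)    # one factor of 10 every 2 years
    print(f"every 3 years: {slow:.1e}")    # ~1e11, a hundred billion
    print(f"every 2 years: {fast:.1e}")    # ~3e16, tens of quadrillions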
By the year 2000, computers are likely to match, simulate, or surpass some of man's most 'human-like' intellectual abilities, including perhaps some of his aesthetic and creative capacities, in addition to having some new kinds of capabilities that human beings do not have. These computer capacities are not certain; however, it is an open question what inherent limitations computers have. If it turns out that they cannot duplicate or exceed certain characteristically human capabilities, that will be one of the most important discoveries of the twentieth [!] century. . . This idea of computer 'intelligence' is a sensitive point with many people. The claim is not that computers will resemble the structure of the human brain, but that their functional output will equal or exceed that of the human brain in many functions that we have been used to thinking of as aspects of intelligence, and even as uniquely human. Still, a computer will presumably not become 'humanoid' and probably will not use similar processes, but it may have properties which are analogous to or operationally indistinguishable from self-generated purposes, ideas, and emotional responses to new inputs or its own productions. In particular, as computers become more self-programming they will increasingly tend to perform activities that amount to 'learning' from experience and training. Thus they will eventually evolve subtle methods and processes that may defy the understanding of the human designer. In addition to this possibility of independent intelligent activities, computers are being used increasingly as a helpful flexible tool for more or less individual needs -- at times in such close cooperation that one can speak in terms of a man- machine symbiosis. Eventually there will probably be computer consoles in every home, perhaps linked to public utility computers and permitting each user his private file space in a central computer, for uses such as consulting the Library of Congress, keeping individual records, preparing income tax returns from these records, obtaining consumer information, and so on. Computers will also presumably be used as teaching aids, with one computer giving simultaneous individual instruction to hundreds of students, each at his own console and topic, at any level from elementary to graduate school; eventually the system will probably be designed to maximize the individuality of the learning process. Presumably there will also be such things as: 1. A single national information file containing all tax, legal, security, credit, educational, medical employment, and other information about each citizen. (One problem here is the creation of acceptable rules concerning access to such a file, and then. . . the later problem of how to prevent erosion of these rules after one or two decades of increased operation have made the concept generally acceptable. . .) 2. Time-sharing of large computers by research centers in every field, providing national and international pools of knowledge and skill. 3. Use of computers to test trial configurations in scientific work, allowing the experimenter to concentrate on his creativity, judgment, and intuition, while the computer carries out the detailed computation and 'horse work.' A similar symbiotic relationship will prevail in engineering and other technological design. 
Using the synergism of newer 'problem-oriented' computer languages, time-sharing, and new input-output techniques, engineer-designers linked to a large computer complex will use computers as experienced pattern-makers, mathematical analysts of optimum design, sources of catalogs on engineering standards and parts data, and often substitutes for mechanical drawings. 4. Use of real-time large computers for an enormous range of business information and control activity, including most trading and financial transactions; the flow of inventories within companies and between suppliers and users; immediate analysis and display of company information about availability of products, prices, sales statistics, cash flow, credit, bank accounts and interests on funds, market analysis and consumer tastes, advanced projections, and so on. 5. Vast use of computers to reduce and punish crime, including the capacity of police to check immediately the identification and record of any person stopped for questioning. 6. Computerized processes for instantaneous exchange of money, using central-computer/bank-and-store-computer networks for debiting and crediting accounts. In addition, there will be uses of computers for worldwide communications, medical diagnostics, traffic and transportation control, automatic chemical analyses, weather prediction and control, and so on. The sum of all these uses suggests that the computer utility industry will become as fundamental as the power industry, and that the computer can be viewed as the most basic tool of the last third of the twentieth century. Individual computers (or at least consoles or other remote input devices) will become essential equipment for home, school, business, and profession, and the ability to use a computer skillfully and flexibly may become more widespread than the ability to play bridge or drive a car (and presumably much easier)." From checker at panix.com Sun May 8 14:57:49 2005 From: checker at panix.com (Premise Checker) Date: Sun, 8 May 2005 10:57:49 -0400 (EDT) Subject: [Paleopsych] Atlantic: Vannevar Bush, "As We May Think" (1945) Message-ID: Vannevar Bush, "As We May Think" The Atlantic Monthly, 1945.7 http://ccat.sas.upenn.edu/~jod/texts/vannevar.bush.html As Director of the Office of Scientific Research and Development, Dr. Vannevar Bush has coordinated the activities of some six thousand leading American scientists in the application of science to warfare. In this significant article he holds up an incentive for scientists when the fighting has ceased. He urges that men of science should then turn to the massive task of making more accessible our bewildering store of knowledge. For many years inventions have extended man's physical powers rather than the powers of his mind. Trip hammers that multiply the fists, microscopes that sharpen the eye, and engines of destruction and detection are new results, but the end results, of modern science. Now, says Dr. Bush, instruments are at hand which, if properly developed, will give man access to and command over the inherited knowledge of the ages. The perfection of these pacific instruments should be the first objective of our scientists as they emerge from their war work. Like Emerson's famous address of 1837 on ``The American Scholar,'' this paper by Dr. Bush calls for a new relationship between thinking man and the sum of our knowledge. 
- The Editor _________________________________________________________________ This has not been a scientist's war; it has been a war in which all have had a part. The scientists, burying their old professional competition in the demand of a common cause, have shared greatly and learned much. It has been exhilarating to work in effective partnership. Now, for many, this appears to be approaching an end. What are the scientists to do next? For the biologists, and particularly for the medical scientists, there can be little indecision, for their war work has hardly required them to leave the old paths. Many indeed have been able to carry on their war research in their familiar peacetime laboratories. Their objectives remain much the same. It is the physicists who have been thrown most violently off stride, who have left academic pursuits for the making of strange destructive gadgets, who have had to devise new methods for their unanticipated assignments. They have done their part on the devices that made it possible to turn back the enemy. They have worked in combined effort with the physicists of our allies. They have felt within themselves the stir of achievement. They have been part of a great team. Now, as peace approaches, one asks where they will find objectives worthy of their best. I Of what lasting benefit has been man's use of science and of the new instruments which his research brought into existence? First, they have increased his control of his material environment. They have improved his food, his clothing, his shelter; they have increased his security and released him partly from the bondage of bare existence. They have given him increased knowledge of his own biological processes so that he has had a progressive freedom from disease and an increased span of life. They are illuminating the interactions of his physiological and psychological functions, giving the promise of an improved mental health. Science has provided the swiftest communication between individuals; it has provided a record of ideas and has enabled man to manipulate and to make extracts from that record so that knowledge evolves and endures throughout the life of a race rather than that of an individual. There is a growing mountain of research. But there is increased evidence that we are being bogged down today as specialization extends. The investigator is staggered by the findings and conclusions of thousands of other workers - conclusions which he cannot find time to grasp, much less to remember, as they appear. Yet specialization becomes increasingly necessary for progress, and the effort to bridge between disciplines is correspondingly superficial. Professionally our methods of transmitting and reviewing the results of research are generations old and by now are totally inadequate for their purpose. If the aggregate time spent in writing scholarly works and in reading them could be evaluated, the ratio between these amounts of time might well be startling. Those who conscientiously attempt to keep abreast of current thought, even in restricted fields, by close and continuous reading might well shy away from an examination calculated to show how much of the previous month's efforts could be produced on call. 
Mendel's concept of the laws of genetics was lost to the world for a generation because his publication did not reach the few who were capable of grasping and extending it; and this sort of catastrophe is undoubtedly being repeated all about us, as truly significant attainments become lost in the mass of the inconsequential. The difficulty seems to be, not so much that we publish unduly in view of the extent and variety of present-day interests, but rather that publication has been extended far beyond our present ability to make real use of the record. The summation of human experience is being expanded at a prodigious rate, and the means we use for threading through the consequent maze to the momentarily important item is the same as was used in the days of square-rigged ships. But there are signs of a change as new and powerful instrumentalities come into use. Photocells capable of seeing things in a physical sense, advanced photography which can record what is seen or even what is not, thermionic tubes capable of controlling potent forces under the guidance of less power than a mosquito uses to vibrate his wings, cathode ray tubes rendering visible an occurrence so brief that by comparison a microsecond is a long time, relay combinations which will carry out involved sequences of movements more reliably than any human operator and thousands of times as fast - there are plenty of mechanical aids with which to effect a transformation in scientific records. Two centuries ago Leibnitz invented a calculating machine which embodied most of the essential features of recent keyboard devices, but it could not then come into use. The economics of the situation were against it: the labor involved in constructing it, before the days of mass production, exceeded the labor to be saved by its use, since all it could accomplish could be duplicated by sufficient use of pencil and paper. Moreover, it would have been subject to frequent breakdown, so that it could not have been depended upon; for at that time and long after, complexity and unreliability were synonymous. Babbage, even with remarkably generous support for his time, could not produce his great arithmetical machine. His idea was sound enough, but construction and maintenance costs were then too heavy. Had a Pharaoh been given detailed and explicit designs of an automobile, and had he understood them completely, it would have taxed the resources of his kingdom to have fashioned the thousands of parts for a single car, and that car would have broken down on the first trip to Giza. Machines with interchangeable parts can now be constructed with great economy of effort. In spite of much complexity, they perform reliably. Witness the humble typewriter, or the movie camera, or the automobile. Electrical contacts have ceased to stick when thoroughly understood. Note the automatic telephone exchange, which has hundreds of thousands of such contacts, and yet is reliable. A spider web of metal, sealed in a thin glass container, a wire heated to brilliant glow, in short, the thermionic tube of radio sets, is made by the hundred million, tossed about in packages, plugged into sockets - and it works! Its gossamer parts, the precise location and alignment involved in its construction, would have occupied a master craftsman of the guild for months; now it is built for thirty cents. The world has arrived at an age of cheap complex devices of great reliability; and something is bound to come of it.
II A record, if it is to be useful to science, must be continuously extended, it must be stored, and above all it must be consulted. Today we make the record conventionally by writing and photography, followed by printing; but we also record on film, on wax disks, and on magnetic wires. Even if utterly new recording procedures do not appear, these present ones are certainly in the process of modification and extension. Certainly progress in photography is not going to stop. Faster material and lenses, more automatic cameras, finer-grained sensitive compounds to allow an extension of the minicamera idea, are all imminent. Let us project this trend ahead to a logical, if not inevitable, outcome. The camera hound of the future wears on his forehead a lump a little larger than a walnut. It takes pictures 3 millimeters square, later to be projected or enlarged, which after all involves only a factor of 10 beyond present practice. The lens is of universal focus, down to any distance accommodated by the unaided eye, simply because it is of short focal length. There is a built-in photocell on the walnut such as we now have on at least one camera, which automatically adjusts exposure for a wide range of illumination. There is film in the walnut for a hundred exposures, and the spring for operating its shutter and shifting its film is wound once for all when the film clip is inserted. It produces its result in full color. It may well be stereoscopic, and record with spaced glass eyes, for striking improvements in stereoscopic technique are just around the corner. The cord which trips its shutter may reach down a man's sleeve within easy reach of his fingers. A quick squeeze, and the picture is taken. On a pair of ordinary glasses is a square of fine lines near the top of one lens, where it is out of the way of ordinary vision. When an object appears in that square, it is lined up for its picture. As the scientist of the future moves about the laboratory or the field, every time he looks at something worthy of the record, he trips the shutter and in it goes, without even an audible click. Is this all fantastic? The only fantastic thing about it is the idea of making as many pictures as would result from its use. Will there be dry photography? It is already here in two forms. When Brady made his Civil War pictures, the plate had to be wet at the time of exposure. Now it has to be wet during development instead. In the future perhaps it need not be wetted at all. There have long been films impregnated with diazo dyes which form a picture without development, so that it is already there as soon as the camera has been operated. An exposure to ammonia gas destroys the unexposed dye, and the picture can then be taken out into the light and examined. The process is now slow, but someone may speed it up, and it has no grain difficulties such as now keep photographic researchers busy. Often it would be advantageous to be able to snap the camera and to look at the picture immediately. Another process now in use is also slow, and more or less clumsy. For fifty years impregnated papers have been used which turn dark at every point where an electrical contact touches them, by reason of the chemical change thus produced in an iodine compound included in the paper. They have been used to make records, for a pointer moving across them can leave a trail behind. If the electrical potential on the pointer is varied as it moves, the line becomes light or dark in accordance with the potential.
This scheme is now used in facsimile transmission. The pointer draws a set of closely spaced lines across the paper one after another. As it moves, its potential is varied in accordance with a varying current received over wires from a distant station, where these variations are produced by a photocell which is similarly scanning a picture. At every instant the darkness of the line being drawn is made equal to the darkness of the point on the picture being observed by the photocell. Thus, when the whole picture has been covered, a replica appears at the receiving end. A scene itself can be just as well looked over line by line by the photocell in this way as can a photograph of the scene. This whole apparatus constitutes a camera, with the added feature, which can be dispensed with if desired, of making its picture at a distance. It is slow, and the picture is poor in detail. Still, it does give another process of dry photography, in which the picture is finished as soon as it is taken. It would be a brave man who could predict that such a process will always remain clumsy, slow, and faulty in detail. Television equipment today transmits sixteen reasonably good images a second, and it involves only two essential differences from the process described above. For one, the record is made by a moving beam of electrons rather than a moving pointer, for the reason that an electron beam can sweep across the picture very rapidly indeed. The other difference involves merely the use of a screen which glows momentarily when the electrons hit, rather than a chemically treated paper or film which is permanently altered. This speed is necessary in television, for motion pictures rather than stills are the object. Use chemically treated film in place of the glowing screen, allow the apparatus to transmit one picture rather than a succession, and a rapid camera for dry photography results. The treated film needs to be far faster in action than present examples, but it probably could be. More serious is the objection that this scheme would involve putting the film inside a vacuum chamber, for electron beams behave normally only in such a rarefied environment. This difficulty could be avoided by allowing the electron beam to play on one side of a partition, and by pressing the film against the other side, if this partition were such as to allow the electrons to go through perpendicular to its surface, and to prevent them from spreading out sideways. Such partitions, in crude form, could certainly be constructed, and they will hardly hold up the general development. Like dry photography, microphotography still has a long way to go. The basic scheme of reducing the size of the record, and examining it by projection rather than directly, has possibilities too great to be ignored. The combination of optical projection and photographic reduction is already producing some results in microfilm for scholarly purposes, and the potentialities are highly suggestive. Today, with microfilm, reductions by a linear factor of 20 can be employed and still produce full clarity when the material is re-enlarged for examination. The limits are set by the graininess of the film, the excellence of the optical system, and the efficiency of the light sources employed. All of these are rapidly improving. Assume a linear ratio of 100 for future use. Consider film of the same thickness as paper, although thinner film will certainly be usable. 
Even under these conditions there would be a total factor of 10,000 between the bulk of the ordinary record on books, and its microfilm replica. The Encyclopaedia Britannica could be reduced to the volume of a matchbox. A library of a million volumes could be compressed into one end of a desk. If the human race has produced since the invention of movable type a total record, in the form of magazines, newspapers, books, tracts, advertising blurbs, correspondence, having a volume corresponding to a billion books, the whole affair, assembled and compressed, could be lugged off in a moving van. Mere compression, of course, is not enough; one needs not only to make and store a record but also to be able to consult it, and this aspect of the matter comes later. Even the modern great library is not generally consulted; it is nibbled by a few. Compression is important, however, when it comes to costs. The material for the microfilm Britannica would cost a nickel, and it could be mailed anywhere for a cent. What would it cost to print a million copies? To print a sheet of newspaper, in a large edition, costs a small fraction of a cent. The entire material of the Britannica in reduced microfilm form would go on a sheet eight and one-half by eleven inches. Once it is available, with the photographic reproduction methods of the future, duplicates in large quantities could probably be turned out for a cent apiece beyond the cost of materials. The preparation of the original copy? That introduces the next aspect of the subject. III To make the record, we now push a pencil or tap a typewriter. Then comes the process of digestion and correction, followed by an intricate process of typesetting, printing, and distribution. To consider the first stage of the procedure, will the author of the future cease writing by hand or typewriter and talk directly to the record? He does so indirectly, by talking to a stenographer or a wax cylinder; but the elements are all present if he wishes to have his talk directly produce a typed record. All he needs to do is to take advantage of existing mechanisms and to alter his language. At a recent World Fair a machine called a Voder was shown. A girl stroked its keys and it emitted recognizable speech. No human vocal cords entered in the procedure at any point; the keys simply combined some electrically produced vibrations and passed these on to a loud-speaker. In the Bell Laboratories there is the converse of this machine, called a Vocoder. The loudspeaker is replaced by a microphone, which picks up sound. Speak to it, and the corresponding keys move. This may be one element of the postulated system. The other element is found in the stenotype, that somewhat disconcerting device encountered usually at public meetings. A girl strokes its keys languidly and looks about the room and sometimes at the speaker with a disquieting gaze. From it emerges a typed strip which records in a phonetically simplified language a record of what the speaker is supposed to have said. Later this strip is retyped into ordinary language, for in its nascent form it is intelligible only to the initiated. Combine these two elements, let the Vocoder run the stenotype, and the result is a machine which types when talked to. Our present languages are not especially adapted to this sort of mechanization, it is true. It is strange that the inventors of universal languages have not seized upon the idea of producing one which better fitted the technique for transmitting and recording speech.
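Bush's microfilm compression figures, a few paragraphs back, follow from squaring the linear reduction factor: shrinking a page by 100 in each dimension shrinks its area, and hence the bulk of film of page-like thickness, by 100 x 100. The sketch below (Python; the linear factor and the "billion books" total come from the text, while the assumed per-book volume is an invented round number) checks the arithmetic in spirit.

    # Linear reduction factor -> bulk factor, for film of page-like thickness.
    linear = 100
    bulk_factor = linear ** 2           # area shrinks as the square
    print(f"bulk reduction: {bulk_factor:,}x")   # 10,000x, as in the text

    # Rough volume of 'a billion books' and of its microfilm replica.
    # The assumed per-book volume (~1 liter) is illustrative, not from the text.
    book_volume_m3 = 0.001
    record_m3 = 1_000_000_000 * book_volume_m3
    replica_m3 = record_m3 / bulk_factor
    print(f"original record: {record_m3:,.0f} cubic meters")
    print(f"microfilm copy:  {replica_m3:,.0f} cubic meters")
    # Roughly one large moving van under these rough assumptions.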
Mechanization may yet force the issue, especially in the scientific field; whereupon scientific jargon would become still less intelligible to the layman. One can now picture a future investigator in his laboratory. His hands are free, and he is not anchored. As he moves about and observes, he photographs and comments. Time is automatically recorded to tie the two records together. If he goes into the field, he may be connected by radio to his recorder. As he ponders over his notes in the evening, he again talks his comments into the record. His typed record, as well as his photographs, may both be in miniature, so that he projects them for examination. Much needs to occur, however, between the collection of data and observations, the extraction of parallel material from the existing record, and the final insertion of new material into the general body of the common record. For mature thought there is no mechanical substitute. But creative thought and essentially repetitive thought are very different things. For the latter there are, and may be, powerful mechanical aids. Adding a column of figures is a repetitive thought process, and it was long ago properly relegated to the machine. True, the machine is sometimes controlled by the keyboard, and thought of a sort enters in reading the figures and poking the corresponding keys, but even this is avoidable. Machines have been made which will read typed figures by photocells and then depress the corresponding keys; these are combinations of photocells for scanning the type, electric circuits for sorting the consequent variations, and relay circuits for interpreting the result into the action of solenoids to pull the keys down. All this complication is needed because of the clumsy way in which we have learned to write figures. If we recorded them positionally, simply by the configuration of a set of dots on a card, the automatic reading mechanism would become comparatively simple. In fact, if the dots are holes, we have the punched-card machine long ago produced by Hollorith for the purposes of the census, and now used throughout business. Some types of complex businesses could hardly operate without these machines. Adding is only one operation. To perform arithmetical computation involves also subtraction, multiplication, and division, and in addition some method for temporary storage of results, removal from storage for further manipulation, and recording of final results by printing. Machines for these purposes are now of two types: keyboard machines for accounting and the like, manually controlled for the insertion of data, and usually automatically controlled as far as the sequence of operations is concerned; and punched-card machines in which separate operations are usually delegated to a series of machines, and the cards then transferred bodily from one to another. Both forms are very useful; but as far as complex computations are concerned, both are still embryo. Rapid electrical counting appeared soon after the physicists found it desirable to count cosmic rays. For their own purposes the physicists promptly constructed thermionic-tube equipment capable of counting electrical impulses at the rate of 100,000 a second. The advanced arithmetical machines of the future will be electrical in nature, and they will perform at 100 times present speeds, or more. Moreover, they will be far more versatile than present commercial machines, so that they may readily be adapted for a wide variety of operations. 
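The positional recording of figures described above -- a digit stored as the position of a single hole in a column, as on a Hollerith-style census card -- is easy to see in miniature. The sketch below (Python; a toy ten-row card, not any real card format) encodes a number as hole positions and reads it back, with no recognition of printed type needed.

    # Toy positional encoding: each column holds one decimal digit as the row
    # index of a single punched hole (row 0 = digit 0, ..., row 9 = digit 9).
    def punch(number):
        """Return a list of columns; each column is the hole's row index."""
        return [int(d) for d in str(number)]

    def read(card):
        """Recover the number by noting where the hole sits in each column."""
        return int("".join(str(row) for row in card))

    card = punch(90210)
    print(card)          # [9, 0, 2, 1, 0] -- the hole positions
    print(read(card))    # 90210

    # Adding two cards needs no "reading of type" at all, only positions:
    print(read(punch(1234)) + read(punch(766)))   # 2000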
They will be controlled by a control card or film, they will select their own data and manipulate it in accordance with the instructions thus inserted, they will perform complex arithmetical computations at exceedingly high speeds, and they will record results in such form as to be readily available for distribution or for later further manipulation. Such machines will have enormous appetites. One of them will take instructions and data from a roomful of girls armed with simple keyboard punches, and will deliver sheets of computed results every few minutes. There will always be plenty of things to compute in the detailed affairs of millions of people doing complicated things. IV The repetitive processes of thought are not confined, however, to matters of arithmetic and statistics. In fact, every time one combines and records facts in accordance with established logical processes, the creative aspect of thinking is concerned only with the selection of the data and the process to be employed, and the manipulation thereafter is repetitive in nature and hence a fit matter to be relegated to the machines. Not so much has been done along these lines, beyond the bounds of arithmetic, as might be done, primarily because of the economics of the situation. The needs of business, and the extensive market obviously waiting, assured the advent of mass-produced arithmetical machines just as soon as production methods were sufficiently advanced. With machines for advanced analysis no such situation existed; for there was and is no extensive market; the users of advanced methods of manipulating data are a very small part of the population. There are, however, machines for solving differential equations - and functional and integral equations, for that matter. There are many special machines, such as the harmonic synthesizer which predicts the tides. There will be many more, appearing certainly first in the hands of the scientist and in small numbers. If scientific reasoning were limited to the logical processes of arithmetic, we should not get far in our understanding of the physical world. One might as well attempt to grasp the game of poker entirely by the use of the mathematics of probability. The abacus, with its beads string on parallel wires, led the Arabs to positional numeration and the concept of zero many centuries before the rest of the world; and it was a useful tool - so useful that it still exists. It is a far cry from the abacus to the modern keyboard accounting machine. It will be an equal step to the arithmetical machine of the future. But even this new machine will not take the scientist where he needs to go. Relief must be secured from laborious detailed manipulation of higher mathematics as well, if the users of it are to free their brains for something more than repetitive detailed transformations in accordance with established rules. A mathematician is not a man who can readily manipulate figures; often he cannot. He is not even a man who can readily perform the transformation of equations by the use of calculus. He is primarily an individual who is skilled in the use of symbolic logic on a high plane, and especially he is a man of intuitive judgment in the choice of the manipulative processes he employs. All else he should be able to turn over to his mechanism, just as confidently as he turns over the propelling of his car to the intricate mechanism under the hood. 
Only then will mathematics be practically effective in bringing the growing knowledge of atomistics to the useful solution of the advanced problems of chemistry, metallurgy, and biology. For this reason there will come more machines to handle advanced mathematics for the scientist. Some of them will be sufficiently bizarre to suit the most fastidious connoisseur of the present artifacts of civilization. V The scientist, however, is not the only person who manipulates data and examines the world about him by the use of logical processes, although he sometimes preserves this appearance by adopting into the fold anyone who becomes logical, much in the manner in which a British labor leader is elevated to knighthood. Whenever logical processes of thought are employed - that is, whenever thought for a time runs along an accepted groove - there is an opportunity for the machine. Formal logic used to be a keen instrument in the hands of the teacher in his trying of students' souls. It is readily possible to construct a machine which will manipulate premises in accordance with formal logic, simply by the clever use of relay circuits. Put a set of premises into such a device and turn the crank, and it will readily pass out conclusion after conclusion, all in accordance with logical law, and with no more slips than would be expected of a keyboard adding machine. Logic can become enormously difficult, and it would undoubtedly be well to produce more assurance in its use. The machines for higher analysis have usually been equation solvers. Ideas are beginning to appear for equation transformers, which will rearrange the relationship expressed by an equation in accordance with strict and rather advanced logic. Progress is inhibited by the exceedingly crude way in which mathematicians express their relationships. They employ a symbolism which grew like Topsy and has little consistency; a strange fact in that most logical field. A new symbolism, probably positional, must apparently precede the reduction of mathematical transformations to machine processes. Then, on beyond the strict logic of the mathematician, lies the application of logic in everyday affairs. We may some day click off arguments on a machine with the same assurance that we now enter sales on a cash register. But the machine of logic will not look like a cash register, even a streamlined model. So much for the manipulation of ideas and their insertion into the record. Thus far we seem to be worse off than before - for we can enormously extend the record; yet even in its present bulk we can hardly consult it. This is a much larger matter than merely the extraction of data for the purposes of scientific research; it involves the entire process by which man profits by his inheritance of acquired knowledge. The prime action of use is selection, and here we are halting indeed. There may be millions of fine thoughts, and the account of the experience on which they are based, all encased within stone walls of acceptable architectural form; but if the scholar can get at only one a week by diligent search, his syntheses are not likely to keep up with the current scene. Selection, in this broad sense, is a stone adze in the hands of a cabinetmaker. Yet, in a narrow sense and in other areas, something has already been done mechanically on selection. 
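The "machine which will manipulate premises in accordance with formal logic" described above can be sketched today as a tiny forward-chaining engine: given premises as if-then rules plus known facts, it turns the crank and emits every conclusion that follows. The code below (Python; the premises are invented examples) is such a sketch, not a reconstruction of any relay circuit Bush had in mind.

    # Premises as (antecedents, consequent) pairs: "if all of these, then that".
    rules = [
        ({"man"}, "mortal"),
        ({"mortal", "greek"}, "remembered"),
        ({"remembered"}, "quoted"),
    ]
    facts = {"man", "greek"}

    # Turn the crank: keep applying rules until nothing new can be concluded.
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)
                print("conclusion:", consequent)
                changed = True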
The personnel officer of a factory drops a stack of a few thousand employee cards into a selecting machine, sets a code in accordance with an established convention, and produces in a short time a list of all employees who live in Trenton and know Spanish. Even such devices are much too slow when it comes, for example, to matching a set of fingerprints with one of five millions on file. Selection devices of this sort will soon be speeded up from their present rate of reviewing data at a few hundred a minute. By the use of photocells and microfilm they will survey items at the rate of thousands a second, and will print out duplicates of those selected. This process, however, is simple selection: it proceeds by examining in turn every one of a large set of items, and by picking out those which have certain specified characteristics. There is another form of selection best illustrated by the automatic telephone exchange. You dial a number and the machine selects and connects just one of a million possible stations. It does not run over them all. It pays attention only to a class given by a first digit, and so on; and thus proceeds rapidly and almost unerringly to the selected station. It requires a few seconds to make the selection, although the process could be speeded up if increased speed were economically warranted. If necessary, it could be made extremely fast by substituting thermionic-tube switching for mechanical switching, so that the full selection could be made in one-hundredth of a second. No one would wish to spend the money necessary to make this change in the telephone system, but the general idea is applicable elsewhere. Take the prosaic problem of the great department store. Every time a charge sale is made, there are a number of things to be done. The inventory needs to be revised, the salesman needs to be given credit for the sale, the general accounts need an entry, and, most important, the customer needs to be charged. A central records device has been developed in which much of this work is done conveniently. The salesman places on a stand the customer's identification card, his own card, and the card taken from the article sold - all punched cards. When he pulls a lever, contacts are made through the holes, machinery at a central point makes the necessary computations and entries, and the proper receipt is printed for the salesman to pass to the customer. But there may be ten thousand charge customers doing business with the store, and before the full operation can be completed someone has to select the right card and insert it at the central office. Now rapid selection can slide just the proper card into position in an instant or two, and return it afterward. Another difficulty occurs, however. Someone must read a total on the card, so that the machine can add its computed item to it. Conceivably the cards might be of the dry photography type I have described. Existing totals could then be read by photocell, and the new total entered by an electron beam. The cards may be in miniature, so that they occupy little space. They must move quickly. They need not be transferred far, but merely into position so that the photocell and recorder can operate on them. Positional dots can enter the data. At the end of the month a machine can readily be made to read these and to print an ordinary bill. 
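The two forms of selection just described -- examining every card in turn for specified characteristics, and the telephone exchange's digit-by-digit narrowing -- correspond to a linear scan and a nested lookup. A compact sketch (Python; the employee records and the dialing plan are invented for illustration):

    # Simple selection: examine every item and keep those with specified traits.
    employees = [
        {"name": "Ada",  "city": "Trenton", "languages": {"Spanish", "English"}},
        {"name": "Bert", "city": "Camden",  "languages": {"English"}},
        {"name": "Cora", "city": "Trenton", "languages": {"Spanish"}},
    ]
    matches = [e["name"] for e in employees
               if e["city"] == "Trenton" and "Spanish" in e["languages"]]
    print(matches)                   # ['Ada', 'Cora'] -- every card was examined

    # Exchange-style selection: each dialed digit narrows the search to one
    # class, so only as many steps as there are digits, never a full scan.
    exchange = {"5": {"5": {"1": "station 551", "2": "station 552"},
                      "6": {"1": "station 561"}}}
    def connect(number):
        node = exchange
        for digit in number:         # attend only to one digit at a time
            node = node[digit]
        return node
    print(connect("552"))            # station 552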
With tube selection, in which no mechanical parts are involved in the switches, little time need be occupied in bringing the correct card into use - a second should suffice for the entire operation. The whole record on the card may be made by magnetic dots on a steel sheet if desired, instead of dots to be observed optically, following the scheme by which Poulsen long ago put speech on a magnetic wire. This method has the advantage of simplicity and ease of erasure. By using photography, however, one can arrange to project the record in enlarged form, and at a distance by using the process common in television equipment. One can consider rapid selection of this form, and distant projection for other purposes. To be able to key one sheet of a million before an operator in a second or two, with the possibility of then adding notes thereto, is suggestive in many ways. It might even be of use in libraries, but that is another story. At any rate, there are now some interesting combinations possible. One might, for example, speak to a microphone, in the manner described in connection with the speech-controlled typewriter, and thus make his selections. It would certainly beat the usual file clerk. VI The real heart of the matter of selection, however, goes deeper than a lag in the adoption of mechanisms by libraries, or a lack of development of devices for their use. Our ineptitude in getting at the record is largely caused by the artificiality of systems of indexing. When data of any sort are placed in storage, they are filed alphabetically or numerically, and information is found (when it is) by tracing it down from subclass to subclass. It can be in only one place, unless duplicates are used; one has to have rules as to which path will locate it, and the rules are cumbersome. Having found one item, moreover, one has to emerge from the system and re-enter on a new path. The human mind does not work that way. It operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain. It has other characteristics, of course; trails that are not frequently followed are prone to fade, items are not fully permanent, memory is transitory. Yet the speed of action, the intricacy of trails, the detail of mental pictures, is awe-inspiring beyond all else in nature. Man cannot hope fully to duplicate this mental process artificially, but he certainly ought to be able to learn from it. In minor ways he may even improve, for his record have relative permanency. The first idea, however, to be drawn from the analogy concerns selection. Selection by association, rather than by indexing, may yet be mechanized. One cannot hope thus to equal the speed and flexibility with which the mind follows an associative trail, but it should be possible to beat the mind decisively in regard to the permanence and clarity of the items resurrected from storage. Consider a future device for individual use, which is a sort of mechanized private file and library. It needs a name, and to coin one at random, ``memex'' will do. A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory. 
It consists of a desk, and while it can presumably be operated from a distance, it is primarily the piece of furniture at which he works. On the top are slanting translucent screens, on which material can be projected for convenient reading. There is a keyboard, and sets of buttons and levers. Otherwise it looks like an ordinary desk. In one end is the stored material. The matter of bulk is well taken care of by improved microfilm. Only a small part of the interior of the memex is devoted to storage, the rest to mechanism. Yet if the user inserted 5000 pages of material a day it would take him hundreds of years to fill the repository, so he can be profligate and enter material freely. Most of the memex contents are purchased on microfilm ready for insertion. Books of all sorts, pictures, current periodicals, newspapers, are thus obtained and dropped into place. Business correspondence takes the same path. And there is provision for direct entry. On the top of the memex is a transparent platen. On this are placed longhand notes, photographs, memoranda, all sorts of things. When one is in place, the depression of a lever causes it to be photographed onto the next blank space in a section of the memex film, dry photography being employed. There is, of course, provision for consultation of the record by the usual scheme of indexing. If the user wishes to consult a certain book, he taps its code on the keyboard, and the title page of the book promptly appears before him, projected onto one of his viewing positions. Frequently-used codes are mnemonic, so that he seldom consults his code book; but when he does, a single tap of a key projects it for his use. Moreover, he has supplemental levers. On deflecting one of these levers to the right he runs through the book before him, each page in turn being projected at a speed which just allows a recognizing glance at each. If he deflects it further to the right, he steps through the book 10 pages at a time; still further at 100 pages at a time. Deflection to the left gives him the same control backwards. A special button transfers him immediately to the first page of the index. Any given book of his library can thus be called up and consulted with far greater facility than if it were taken from a shelf. As he has several projection positions, he can leave one item in position while he calls up another. He can add marginal notes and comments, taking advantage of one possible type of dry photography, and it could even be arranged so that he can do this by a stylus scheme, such as is now employed in the telautograph seen in railroad waiting rooms, just as though he had the physical page before him. VII All this is conventional, except for the projection forward of present-day mechanisms and gadgetry. It affords an immediate step, however, to associative indexing, the basic idea of which is a provision whereby any item may be caused at will to select immediately and automatically another. This is the essential feature of the memex. The process of tying two items together is the important thing. When the user is building a trail, he names it, inserts the name in his code book, and taps it out on his keyboard. Before him are the two items to be joined, projected onto adjacent viewing positions. At the bottom of each there are a number of blank code spaces, and a pointer is set to indicate one of these on each item. The user taps a single key, and the items are permanently joined. In each code space appears the code word.
Out of view, but also in the code space, is inserted a set of dots for photocell viewing; and on each item these dots by their positions designate the index number of the other item. Thereafter, at any time, when one of these items is in view, the other can be instantly recalled merely by tapping a button below the corresponding code space. Moreover, when numerous items have been thus joined together to form a trail, they can be reviewed in turn, rapidly or slowly, by deflecting a lever like that used for turning the pages of a book. It is exactly as though the physical items had been gathered together to form a new book. It is more than this, for any item can be joined into numerous trails. The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English long bow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item. When it becomes evident that the elastic properties of available materials had a great deal to do with the bow, he branches off on a side trail which takes him through textbooks on elasticity and tables of physical constants. He inserts a page of longhand analysis of his own. Thus he builds a trail of his interest through the maze of materials available to him. And his trails do not fade. Several years later, his talk with a friend turns to the queer ways in which a people resist innovations, even of vital interest. He has an example, in the fact that the outranged Europeans still failed to adopt the Turkish bow. In fact he has a trail on it. A touch brings up the code book. Tapping a few keys projects the head of the trail. A lever runs through it at will, stopping at interesting items, going off on side excursions. It is an interesting trail, pertinent to the discussion. So he sets a reproducer in action, photographs the whole trail out, and passes it to his friend for insertion in his own memex, there to be linked into the more general trail. VIII Wholly new forms of encyclopedias will appear, ready-made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified. The lawyer has at his touch the associated opinions and decisions of his whole experience, and of the experience of friends and authorities. The patent attorney has on call the millions of issued patents, with familiar trails to every point of his client's interest. The physician, puzzled by a patient's reactions, strikes the trail established in studying an earlier similar case, and runs rapidly through analogous case histories, with side references to the classics for the pertinent anatomy and histology. The chemist, struggling with the synthesis of an organic compound, has all the chemical literature before him in his laboratory, with trails following the analogies of compounds, and side trails to their physical and chemical behavior.
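[Another present-day gloss, this time on the trail mechanism just described: read literally, joining two items writes each item's index number into the other's code space under the trail's name, and the trail itself is an ordered sequence that can be stepped through or photographed out whole. The Python sketch below is one possible reading only; the class, methods, and example items are all invented for illustration.]

# Minimal sketch of associative trails as described above. Joining two items
# records each one's index number against the other under a trail name, and
# a trail can be recalled item by item, reviewed in order, or reproduced for
# insertion into another memex. All identifiers here are illustrative.

class Memex:
    def __init__(self):
        self.items = {}    # index number -> content
        self.links = {}    # (trail name, index) -> index of the joined item
        self.trails = {}   # trail name -> ordered list of index numbers

    def insert(self, index, content):
        self.items[index] = content

    def join(self, trail, a, b):
        """Tap the key with items a and b projected side by side."""
        self.links[(trail, a)] = b
        self.links[(trail, b)] = a
        seq = self.trails.setdefault(trail, [])
        for idx in (a, b):
            if idx not in seq:
                seq.append(idx)

    def recall(self, trail, current):
        """Press the button below the code space: the joined item appears."""
        return self.items[self.links[(trail, current)]]

    def review(self, trail):
        """Run the lever through the whole trail, in order."""
        return [self.items[idx] for idx in self.trails[trail]]

    def reproduce(self, trail):
        """Photograph the trail out for insertion into another memex."""
        return {"trail": trail,
                "items": {idx: self.items[idx] for idx in self.trails[trail]},
                "links": {k: v for k, v in self.links.items() if k[0] == trail}}

m = Memex()
m.insert(1, "encyclopedia article on the bow")
m.insert(2, "history of the Crusades")
m.join("turkish-bow", 1, 2)
print(m.recall("turkish-bow", 1))   # history of the Crusades
print(m.review("turkish-bow"))

[A real system would have to decide what happens when a code space is reused; this sketch simply lets the newest join win, which is one plausible reading of the essay's single pointer per code space.]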
The historian, with a vast chronological account of a people, parallels it with a skip trail which stops only at the salient items, and can follow at any time contemporary trails which lead him all over civilization at a particular epoch. There is a new profession of trail blazers, those who find delight in the task of establishing useful trails through the enormous mass of the common record. The inheritance from the master becomes, not only his additions to the world's record, but for his disciples the entire scaffolding by which they were erected. Thus science may implement the ways in which man produces, stores, and consults the record of the race. It might be striking to outline the instrumentalities of the future more spectacularly, rather than to stick closely to the methods and elements now known and undergoing rapid development, as has been done here. Technical difficulties of all sorts have been ignored, certainly, but also ignored are means as yet unknown which may come any day to accelerate technical progress as violently as did the advent of the thermionic tube. In order that the picture may not be too commonplace, by reason of sticking to present-day patterns, it may be well to mention one such possibility, not to prophesy but merely to suggest, for prophecy based on extension of the known has substance, while prophecy founded on the unknown is only a doubly involved guess. All our steps in creating or absorbing material of the record proceed through one of the senses - the tactile when we touch keys, the oral when we speak or listen, the visual when we read. Is it not possible that some day the path may be established more directly? We know that when the eye sees, all the consequent information is transmitted to the brain by means of electrical vibrations in the channel of the optic nerve. This is an exact analogy with the electrical vibrations which occur in the cable of a television set: they convey the picture from the photocells which see it to the radio transmitter from which it is broadcast. We know further that if we can approach that cable with the proper instruments, we do not need to touch it; we can pick up those vibrations by electrical induction and thus discover and reproduce the scene which is being transmitted, just as a telephone wire may be tapped for its message. The impulses which flow in the arm nerves of a typist convey to her fingers the translated information which reaches her eye or ear, in order that the fingers may be caused to strike the proper keys. Might not these currents be intercepted, either in the original form in which information is conveyed to the brain, or in the marvelously metamorphosed form in which they then proceed to the hand? By bone conduction we already introduce sounds into the nerve channels of the deaf in order that they may hear. Is it not possible that we may learn to introduce them without the present cumbersomeness of first transforming electrical vibrations to mechanical ones, which the human mechanism promptly transforms back to the electrical form? With a couple of electrodes on the skull the encephalograph now produces pen-and-ink traces which bear some relation to the electrical phenomena going on in the brain itself. True, the record is unintelligible, except as it points out certain gross misfunctioning of the cerebral mechanism; but who would now place bounds on where such a thing may lead?
In the outside world, all forms of intelligence, whether of sound or sight, have been reduced to the form of varying currents in an electric circuit in order that they may be transmitted. Inside the human frame exactly the same sort of process occurs. Must we always transform to mechanical movements in order to proceed from one electrical phenomenon to another? It is a suggestive thought, but it hardly warrants prediction without losing touch with reality and immediateness. Presumably man's spirit should be elevated if he can better review his shady past and analyze more completely and objectively his present problems. He has built a civilization so complex that he needs to mechanize his record more fully if he is to push his experiment to its logical conclusion and not merely become bogged down part way there by overtaxing his limited memory. His excursion may be more enjoyable if he can reacquire the privilege of forgetting the manifold things he does not need to have immediately at hand, with some assurance that he can find them again if they prove important. The applications of science have built man a well-supplied house, and are teaching him to live healthily therein. They have enabled him to throw masses of people against one another with cruel weapons. They may yet allow him truly to encompass the great record and to grow in the wisdom of race experience. He may perish in conflict before he learns to wield that record for his true good. Yet, in the application of science to the needs and desires of man, it would seem to be a singularly unfortunate stage at which to terminate the process, or to lose hope as to the outcome. [Thanks to John for this.] From checker at panix.com Sun May 8 14:58:06 2005 From: checker at panix.com (Premise Checker) Date: Sun, 8 May 2005 10:58:06 -0400 (EDT) Subject: [Paleopsych] NYT: (The Omni-incompetent State, Part 114): U.S. to Spend Billions More to Alter Security Systems Message-ID: U.S. to Spend Billions More to Alter Security Systems New York Times, 5.5.8 http://www.nytimes.com/2005/05/08/national/08screen.html By ERIC LIPTON WASHINGTON, May 7 - After spending more than $4.5 billion on screening devices to monitor the nation's ports, borders, airports, mail and air, the federal government is moving to replace or alter much of the antiterrorism equipment, concluding that it is ineffective, unreliable or too expensive to operate. Many of the monitoring tools - intended to detect guns, explosives, and nuclear and biological weapons - were bought during the blitz in security spending after the attacks of Sept. 11, 2001. In its effort to create a virtual shield around America, the Department of Homeland Security now plans to spend billions of dollars more. Although some changes are being made because of technology that has emerged in the last couple of years, many of them are planned because devices currently in use have done little to improve the nation's security, according to a review of agency documents and interviews with federal officials and outside experts. "Everyone was standing in line with their silver bullets to make us more secure after Sept. 11," said Randall J. Larsen, a retired Air Force colonel and former government adviser on scientific issues. "We bought a lot of stuff off the shelf that wasn't effective." Among the problems:
- Radiation monitors at ports and borders that cannot differentiate between radiation emitted by a nuclear bomb and naturally occurring radiation from everyday material like cat litter or ceramic tile.
- Air-monitoring equipment in major cities that is only marginally effective because not enough detectors were deployed and were sometimes not properly calibrated or installed. They also do not produce results for up to 36 hours - long after a biological attack would potentially infect thousands of people.
- Passenger-screening equipment at airports that auditors have found is no more likely than before federal screeners took over to detect whether someone is trying to carry a weapon or a bomb aboard a plane.
- Postal Service machines that test only a small percentage of mail and look for anthrax but no other biological agents.
Federal officials say they bought the best available equipment. They acknowledge that it might not have been cutting-edge technology but said that to speed installation they bought only devices that were readily available instead of trying to buy promising technology that was not yet in production. The department says it has created a layered defense that would not be compromised by the failure of a single device. Even if the monitoring is less than ideal, officials say, it is still a deterrent. "The nation is more secure in the deployment and use of these technologies versus having no technologies in place at all," said Brian Roehrkasse, a spokesman for the Department of Homeland Security. Every piece of equipment provides some level of additional security, said Christopher Y. Milowic, a customs official whose office oversees screening at ports and borders. "It is not the ultimate capacity," he said. "But it reduces risk." Some critics say that even though federal agencies were pressed to move quickly by Congress and the administration, they made some poor choices. In some cases, agencies did not seek competitive bids or consider cheaper, better alternatives. And not all the devices were tested to see how well they worked in the environments where they would be used. "After 9/11, we had to show how committed we were by spending hugely greater amounts of money than ever before, as rapidly as possible," said Representative Christopher Cox, a California Republican who is the chairman of the Homeland Security Committee. "That brought us what we might expect, which is some expensive mistakes. This has been the difficult learning curve of the new discipline known as homeland security." Radiation at Seaports One after another, trucks stuffed with cargo like olives from Spain, birdseed from Ethiopia, olive oil from France and carpets from India line up at the Port Newark Container Terminal, approaching what looks like an E-ZPass toll gate. In minutes, they will fan out across the nation. But first, they pass through the gate, called a radiation portal monitor, which sounds an alarm if it detects a nuclear weapon or radioactive material that could be used to make a "dirty bomb," a crude nuclear device that causes damage by widely spreading low levels of radiation. Heralded as "highly sophisticated" when they were introduced, the devices have proven to be hardly that. The portal-monitor technology has been used for decades by the scrap metal industry. Customs officials at Newark have nicknamed the devices "dumb sensors," because they cannot discern the source of the radiation. That means benign items that naturally emit radioactivity - including cat litter, ceramic tile, granite, porcelain toilets, even bananas - can set off the monitors. Alarms occurred so frequently when the monitors were first installed that customs officials turned down their sensitivity.
But that increased the risk that a real threat, like the highly enriched uranium used in nuclear bombs, could go undetected because it emits only a small amount of radiation or perhaps none if it is intentionally shielded. "It was certainly a compromise in terms of absolute capacity to detect threats," said Mr. Milowic, the customs official. The port's follow-up system, handheld devices that are supposed to determine what set off an alarm, is also seriously flawed. Tests conducted in 2003 by Los Alamos National Laboratory found that the handheld machines, designed to be used in labs, produced a false positive or a false negative more than half the time. The machines were the least reliable in identifying the most dangerous materials, the tests showed. The weaknesses of the devices were apparent in Newark one recent morning. A truck, whose records said it was carrying brakes from Germany, triggered the portal alarm, but the backup device could not identify the radiation source. Without being inspected, the truck was sent on its way to Ohio. "We agree it is not perfect," said Rich O'Brien, a customs supervisor in Newark. But he said his agency needed to move urgently to improve security after the 2001 attacks. "The politics stare you in the face, and you got to put something out there." At airports, similar shortcomings in technology have caused problems. The Transportation Security Administration bought 1,344 machines costing more than $1 million each to search for explosives in checked bags by examining the density of objects inside. But innocuous items as varied as Yorkshire pudding and shampoo bottles, which happen to have a density similar to certain explosives, can set off the machines, causing false alarms for 15 percent to 30 percent of all luggage, an agency official said. The frequent alarms require airports across the country to have extra screeners to examine these bags. Quick Action After 9/11 Because the machines were installed under tight timetables imposed by Congress, they were squeezed into airport lobbies instead of integrated into baggage conveyor systems. That slowed the screening process - the machines could handle far fewer bags per hour - and pushed up labor costs by hundreds of millions of dollars a year. At busy times, bags are sometimes loaded onto planes without being properly examined, according to several current and former screeners. "It is very discouraging," said a screener who worked at Portland International Airport until last year, but who asked not to be named because he still is a federal employee. "People are just taking your bags and putting them on the airplane." Equipment to screen passengers and carry-on baggage - including nearly 5,000 new metal detectors, X-ray machines and devices that can detect traces of explosives - can be unreliable. A handgun might slip through because screeners rely on two-dimensional X-ray machines, rather than newer, three-dimensional models, for example. The National Academy of Sciences recently described the trace detection devices as having "limited effectiveness and significant vulnerabilities." As a result, the likelihood of detecting a hidden weapon or bomb has not significantly changed since the government took over airport screening operations in 2002, according to the inspector general at the Department of Homeland Security. Transportation security officials acknowledge that they cannot improve performance without new technology, but they dispute suggestions that no progress has been made. 
"We have created a much more formidable deterrent," said Mark O. Hatfield Jr., a spokesman for the Transportation Security Administration. "Do we have an absolute barrier? No." Counting machinery and personnel, aviation screening has cost more than $15 billion since 2001, a price that Representative John L. Mica, Republican of Florida, says has hardly been worthwhile. "Congress is the one that mandated this," Mr. Mica said. "But we should have done more research and development on the technology and put this in gradually." Concerns Despite Reliability Some screening equipment has performed reliably. Machines that test mail at the United States Postal Service's major processing centers have not had a single false alarm after more than a year, officials said. But the monitors detect only anthrax, which sickened postal workers in 2001. And only about 20 percent of mail is tested - mostly letters dropped into blue post boxes, because they are considered the most likely route for a biological attack. In about 30 major cities, equipment used to test air is also very precise: there have been more than 1.5 million tests without a single false positive. But only about 10 monitors were placed in most cities, and they were often miles apart, according to the inspector general of the Environmental Protection Agency. Detecting a biological attack, particularly one aimed at a specific building or area, would require perhaps thousands of monitors in a big city. In addition, as contractors hurried to install the devices before the start of the war with Iraq - the Bush administration feared that Saddam Hussein might use biological weapons on American cities - they were often placed too low or too high to collect satisfactory samples, the inspector general noted. The monitors use filters that must be collected manually every day before they can be analyzed hours later at a lab. "It was an expedient attempt to solve a problem," said Philip J. Wyatt, a physicist and expert on biological weapons monitoring equipment. "What they got is ineffective, wasteful and expensive to maintain." Homeland security officials say that they have already moved to address some of the initial problems, and that they are convinced that the monitoring is valuable because it could allow them to recognize an attack about a day sooner than if they learned about it through victims' falling ill. At the Nevada Test Site, an outdoor laboratory that is larger than Rhode Island, the next generation of monitoring devices is being tested. In preparing to spend billions of dollars more on equipment, the Department of Homeland Security is moving carefully. In Nevada, contractors are being paid to build prototypes of radiation detection devices that are more sensitive and selective. Only those getting passing grades will move on to a second competition in the New York port. Similar competitions are under way elsewhere to evaluate new air-monitoring equipment and airport screening devices. That approach contrasts with how the federal government typically went about trying to shore up the nation's defenses after the 2001 attacks. Government agencies often turned to their most familiar contractors, including Northrop Grumman, Boeing and SAIC, a technology giant based in San Diego. The agencies bought devices from those companies, at times without competitive bidding or comprehensive testing. 
Documents prepared by customs officials in an effort to purchase container inspection equipment show that they were so intent on buying an SAIC product, even though a competitor had introduced a virtually identical version that was less expensive, that they placed the manufacturer's brand name in the requests. The agency has bought more than 100 of the machines at $1 million each. But the machines often cannot identify the contents of ship containers, because many everyday items, including frozen foods, are too dense for the gamma ray technology to penetrate. 'Continually Upgrading' The federal government will likely need to spend as much as $7 billion more on screening equipment in coming years, according to government estimates. "One department charged with coordinating efforts and setting standards will result in far better and more efficient technologies to secure the homeland," said Mr. Roehrkasse, the Department of Homeland Security spokesman. Some experts believe that this high-priced push for improvements is necessary, saying the war against terrorism may require the same sort of spending on new weapons and defenses as the cold war did. "You are in a game where you are continually upgrading and you will be forever," said Thomas S. Hartwick, a physicist who evaluates aviation-screening equipment. But given the inevitable imperfection of technology and the vast expanse the government is trying to secure, some warn of putting too much confidence in machines. "Technology does not substitute for strategy," said James Jay Carafano, senior fellow for homeland security at the Heritage Foundation, a conservative think tank. "It's always easier for terrorists to change tactics than it is for us to throw up defenses to counter them. The best strategy to deal with terrorists is to find them and get them." Matthew L. Wald contributed reporting for this article. From checker at panix.com Sun May 8 14:58:16 2005 From: checker at panix.com (Premise Checker) Date: Sun, 8 May 2005 10:58:16 -0400 (EDT) Subject: [Paleopsych] NYT: This Is Your Brain on Motherhood Message-ID: This Is Your Brain on Motherhood New York Times, 5.5.8 http://www.nytimes.com/2005/05/08/opinion/08ellison.html By KATHERINE ELLISON San Francisco ANYONE shopping for a Mother's Day card today might reasonably linger in the Sympathy section. We can't seem to stop mourning the state of modern motherhood. "Madness" is our new metaphor. "Desperate Housewives" are our new cultural icons. And a mother's brain, as commonly envisioned, is impaired by a supposed full-scale assault on sanity and smarts. So strong is this last stereotype that when a satirical Web site posted a "study" saying that parents lose an average of 20 I.Q. points on the birth of their first child, MSNBC broadcast it as if it were true. The danger of this perception is clearest for working mothers, who besides bearing children spend more time with them, or doing things for them, than fathers, according to a recent Department of Labor survey. In addition, the more visibly "encumbered" we are, the more bias we attract: When volunteer groups were shown images of a woman doing various types of work, but in some cases wearing a pillow to make her look pregnant, most judged the "pregnant" woman less competent. 
Even in liberal San Francisco, a hearing last month to consider a pregnant woman's bid to be named acting director of the Department of Building Inspection featured four speakers commenting on her condition, with one asking if the city truly meant to hire a "pregnancy brain." But what if just the opposite is true? What if parenting really isn't a zero-sum, children-take-all game? What if raising children is actually mentally enriching for mothers - and fathers? This is, in fact, what some leading brain scientists, like Michael Merzenich at the University of California, San Francisco, now believe. Becoming a parent, they say, can power up the mind with uniquely motivated learning. Having a baby is "a revolution for the brain," Dr. Merzenich says. The human brain, we now know, creates cells throughout life, cells more likely to survive if they're used. Emotional, challenging and novel experiences provide particularly helpful use of these new neurons, and what adjectives better describe raising a child? Children constantly drag their parents into challenging, novel situations, be it talking a 4-year-old out of a backseat meltdown on the Interstate or figuring out a third-grade homework assignment to make a model of a black hole in space. Often, we'd rather be doing almost anything else. Aging makes us cling ever more fiercely to our mental ruts. But for most of us, our unique bond with our children yanks us out of them. And there are other ways that being a dedicated parent strengthens our minds. Research shows that learning and memory skills can be improved by bearing and nurturing offspring. A team of neuroscientists in Virginia found that mother lab rats, just like working mothers, demonstrably excel at time-management and efficiency, racing around mazes to find rewards and get back to the pups in record time. Other research is showing how hormones elevated in parenting can help buffer mothers from anxiety and stress - a timely gift from a sometimes compassionate Mother Nature. Oxytocin, produced by mammals in labor and breast-feeding, has been linked to the ability to learn in lab animals. Rethinking the mental state of motherhood is reasonable after recent years of evolution of our notion of just what it means to be smart. With our economy newly weighted with people-to-people jobs, and with many professions, including the sciences, becoming more multidisciplinary and collaborative, the people skills we've come to think of as "emotional intelligence" are increasingly prized by many wise employers. An ability to tailor your message to your audience, for instance - a skill that engaged parents practice constantly - can mean the difference between failure and success, at home and at work, as Harvard's president, Lawrence Summers, may now realize. To be sure, sleep deprivation, overwork and too much "Teletubbies" can sap any parent's synapses. And to be sure, our society needs to do much more - starting with more affordable, high-quality child care and paid parental leaves - to catch up with other industrialized nations and support mothers and fathers in using their newly acquired smarts to best advantage. That's why some of the recent "mommy lit" complaints are justified, and probably needed to rouse society to action - if only because nobody will be able to stand our whining for much longer. Still, it's worth considering that the torrent of negativity about motherhood comes as part of an era in which intimacy of all sorts is on the decline in this country. 
Geographically close extended families have long been passé. The marriage rate has declined. And a record percentage of women of child-bearing age today are childless, many by choice. It's common these days to hear people say they don't have time to maintain friendships. Real relationships take a lot of time and work - it's much more convenient to keep in touch by e-mail. But children insist on face time. They fail to thrive unless we anticipate their needs, work our empathy muscles, adjust our schedules and endure their relentless testing. In the process, if we're lucky, we may realize that just this kind of grueling work - with our children, or even with others who could simply use some help - is precisely what makes us grow, acquire wisdom and become more fully human. Perhaps then we can start to re-imagine a mother's brain as less a handicap than a keen asset in the lifelong task of getting smart. Katherine Ellison is the author of "The Mommy Brain: How Motherhood Makes Us Smarter." From checker at panix.com Sun May 8 14:58:29 2005 From: checker at panix.com (Premise Checker) Date: Sun, 8 May 2005 10:58:29 -0400 (EDT) Subject: [Paleopsych] NYT Magazine: Jim Holt: Of Two Minds Message-ID: Jim Holt: Of Two Minds New York Times Magazine, 5.5.8 http://www.nytimes.com/2005/05/08/magazine/08WWLN.html The human brain is mysterious -- and, in a way, that is a good thing. The less that is known about how the brain works, the more secure the zone of privacy that surrounds the self. But that zone seems to be shrinking. A couple of weeks ago, two scientists revealed that they had found a way to peer directly into your brain and tell what you are looking at, even when you yourself are not yet aware of what you have seen. So much for the comforting notion that each of us has privileged access to his own mind. Opportunities for observing the human mental circuitry in action have, until recent times, been almost nonexistent, mainly because of a lack of live volunteers willing to sacrifice their brains to science. To get clues on how the brain works, scientists had to wait for people to suffer sometimes gruesome accidents and then see how the ensuing brain damage affected their abilities and behavior. The results could be puzzling. Damage to the right frontal lobe, for example, sometimes led to a heightened interest in high cuisine, a condition dubbed gourmand syndrome. (One European political journalist, upon recovering from a stroke affecting this part of the brain, profited from the misfortune by becoming a food columnist.) Today scientists are able to get some idea of what's going on in the mind by using brain scanners. Brain-scanning is cruder than it sounds. A technology called functional magnetic resonance imaging can reveal which part of your brain is most active when you're solving a mathematical puzzle, say, or memorizing a list of words. The scanner doesn't actually pick up the pattern of electrical activity in the brain; it just shows where the blood is flowing. (Active neurons demand more oxygen and hence more blood.) In the current issue of Nature Neuroscience, however, Frank Tong, a cognitive neuroscientist at Vanderbilt University, and Yukiyasu Kamitani, a researcher in Japan, announced that they had discovered a way of tweaking the brain-scanning technique to get a richer picture of the brain's activity. Now it is possible to infer what tiny groups of neurons are up to, not just larger areas of the brain. The implications are a little astonishing.
Using the scanner, Tong could tell which of two visual patterns his subjects were focusing on -- in effect, reading their minds. In an experiment carried out by another research team, the scanner detected visual information in the brains of subjects even though, owing to a trick of the experiment, they themselves were not aware of what they had seen. How will our image of ourselves change as the wrinkled lump of gray meat in our skull becomes increasingly transparent to such exploratory methods? One recent discovery to confront is that the human brain can readily change its structure -- a phenomenon scientists call neuroplasticity. A few years ago, brain scans of London cabbies showed that the detailed mental maps they had built up in the course of navigating their city's complicated streets were apparent in their brains. Not only was the posterior hippocampus -- one area of the brain where spatial representations are stored -- larger in the drivers; the increase in size was proportional to the number of years they had been on the job. It may not come as a great surprise that interaction with the environment can alter our mental architecture. But there is also accumulating evidence that the brain can change autonomously, in response to its own internal signals. Last year, Tibetan Buddhist monks, with the encouragement of the Dalai Lama, submitted to functional magnetic resonance imaging as they practiced ''compassion meditation,'' which is aimed at achieving a mental state of pure loving kindness toward all beings. The brain scans showed only a slight effect in novice meditators. But for monks who had spent more than 10,000 hours in meditation, the differences in brain function were striking. Activity in the left prefrontal cortex, the locus of joy, overwhelmed activity in the right prefrontal cortex, the locus of anxiety. Activity was also heightened in the areas of the brain that direct planned motion, ''as if the monks' brains were itching to go to the aid of those in distress,'' Sharon Begley reported in The Wall Street Journal. All of which suggests, say the scientists who carried out the scans, that ''the resting state of the brain may be altered by long-term meditative practice.'' But there could be revelations in store that will force us to revise our self-understanding in far more radical ways. We have already had a hint of this in the so-called split-brain phenomenon. The human brain has two hemispheres, right and left. Each hemisphere has its own perceptual, memory and control systems. For the most part, the left hemisphere is associated with the right side of the body, and vice versa. The left hemisphere usually controls speech. Connecting the hemispheres is a cable of nerve fibers called the corpus callosum. Patients with severe epilepsy sometimes used to undergo an operation in which the corpus callosum was severed. (The idea was to keep a seizure from spreading from one side of the brain to the other.) After the operation, the two hemispheres of the brain could no longer directly communicate. Such patients typically resumed their normal lives without seeming to be any different. But under careful observation, they exhibited some very peculiar behavior. When, for example, the word ''hat'' was flashed to the left half of the visual field -- and hence to the right (speechless) side of the brain -- the left hand would pick out a hat from a group of concealed objects, even as the patient insisted that he had seen no word. 
If a picture of a naked woman was flashed to the left visual field of a male patient, he would smile, or maybe blush, without being able to say what he was reacting to -- although he might make a comment like, ''That's some machine you've got there.'' In another case, a female patient's right hemisphere was flashed a scene of one person throwing another into a fire. ''I don't know why, but I feel kind of scared,'' she told the researcher. ''I don't like this room, or maybe it's you getting me nervous.'' The left side of her brain, noticing the negative emotional reaction issuing from the right side, was making a guess about its cause, much the way one person might make a guess about the emotions of another. Each side of the brain seemed to have its own awareness, as if there were two selves occupying the same head. (One patient's left hand seemed somewhat hostile to the patient's wife, suggesting that the right hemisphere was not fond of her.) Ordinarily, the two selves got along admirably, falling asleep and waking up at the same time and successfully performing activities that required bilateral coordination, like swimming and playing the piano. Nevertheless, as laboratory tests showed, they lived in ever so slightly different sensory worlds. And even though both understood language, one monopolized speech, while the other was mute. That's why the patient seemed normal to family and friends. Pondering such split-brain cases, some scientists and philosophers have raised a disquieting possibility: perhaps each of us really consists of two minds running in harness. In an intact brain, of course, the corpus callosum acts as a constant two-way internal-communications channel between the two hemispheres. So our everyday behavior does not betray the existence of two independent streams of consciousness flowing along within our skulls. It may be, the philosopher Thomas Nagel has written, that ''the ordinary, simple idea of a single person will come to seem quaint some day, when the complexities of the human control system become clearer and we become less certain that there is anything very important that we are one of.'' It is sobering to reflect how ignorant humans have been about the workings of their own brains for most of our history. Aristotle, after all, thought the point of the brain was to cool the blood. The more that breakthroughs like the recent one in brain-scanning open up the mind to scientific scrutiny, the more we may be pressed to give up comforting metaphysical ideas like interiority, subjectivity and the soul. Let's enjoy them while we can. Jim Holt is a frequent contributor to the magazine. From checker at panix.com Sun May 8 14:58:40 2005 From: checker at panix.com (Premise Checker) Date: Sun, 8 May 2005 10:58:40 -0400 (EDT) Subject: [Paleopsych] Book World: Once Upon a Time Message-ID: Once Upon a Time Washington Post Book World, 5.5.8 http://www.washingtonpost.com/wp-dyn/content/article/2005/05/05/AR2005050501385_pf.html Reviewed by Denis Dutton Sunday, May 8, 2005; BW08 THE SEVEN BASIC PLOTS Why We Tell Stories By Christopher Booker. Continuum. 728 pp. $34.95 In the summer of 1975, moviegoers flocked to see the story of a predatory shark terrorizing a little Long Island resort. 
The film told of how three brave men go to sea in a small boat and, after a bloody climax in which they kill the monster, return peace and security to their town -- not unlike, Christopher Booker observes, a tale enjoyed by Saxons dressed in animal skins, huddled around a fire some 1,200 years earlier. Beowulf also features a town terrorized by a monster, Grendel, who lives in a nearby lake and tears his victims to pieces. Again, the hero Beowulf returns peace to his town after a bloody climax in which the monster is slain. Such echoes have impelled Booker to chart what he regards as the seven plots on which all literature is built. Beowulf and "Jaws" follow the first and most basic of his plots, "Overcoming the Monster." It is found in countless stories from The Epic of Gilgamesh and "Little Red Riding Hood" to James Bond films such as "Dr. No." This tale of conflict typically recounts the hero's ordeals and an escape from death, ending with a community or the world itself saved from evil. Booker's second plot is "Rags to Riches." He places in this category "Cinderella," "The Ugly Duckling," David Copperfield and other stories that tell of modest, downtrodden characters whose special talents or beauty are at last revealed to the world for a happy ending. Next in Booker's taxonomy is "the Quest," which features a hero, normally joined by sidekicks, traveling the world and fighting to overcome evil and secure a priceless treasure (or in the case of Odysseus, wife and hearth). The hero not only gains the treasure he seeks, but also the girl, and they end as king and queen. Related to this is Booker's fourth category, "Voyage and Return," exemplified by Robinson Crusoe , Alice in Wonderland and The Time Machine . The protagonist leaves normal experience to enter an alien world, returning after what often amounts to a thrilling escape. In "Comedy," Booker suggests, confusion reigns until at last the hero and heroine are united in love. "Tragedy" portrays human overreaching and its terrible consequences. The last of the plots of his initial list is "Rebirth," which centers on characters such as Dickens's Scrooge, Snow White and Dostoyevsky's Raskolnikov. To this useful system he unexpectedly adds two more plots: "Rebellion" to cover the likes of 1984 and "Mystery" for the recent invention of the detective novel. Booker, a British columnist who was founding editor of Private Eye, possesses a remarkable ability to retell stories. His prose is a model of clarity, and his lively enthusiasm for fictions of every description is infectious. He covers Greek and Roman literature, fairy tales, European novels and plays, Arabic and Japanese tales, Native American folk tales, and movies from the silent era on. He is an especially adept guide through the twists and characters of Wagner's operas. His artfully entertaining summaries jogged many warm memories of half-forgotten novels and films. I wish that an equal amount of pleasure could be derived from the psychology on which he bases his hypothesis. Booker has been working on this project for 34 years, and his quaint psychological starting point sadly shows its age. He believes that Carl Jung's theory of archetypes and self-realization can explain story patterns. Alas, Jung serves him poorly. Malevolent characters, for example, are constantly described by Booker as selfish "Dark Figures" who symbolize overweening egotism. 
(Booker is from a generation of critics who used to think that simply identifying a symbol in literature can explain anything you please.) In Jungian terms, the dark power of the ego is the source of all evil, along with another of Booker's favorite Jungian ideas, the denial of the villain's "inner feminine." Granted, egotism may explain the wickedness of someone like Edmund in "King Lear." But Grendel? The shark in "Jaws"? Oedipus is arguably a more egotistical character than Iago, who in his devious cruelty is still far more evil. The malevolence of dinosaurs in Jurassic Park or the Cyclops in The Odyssey lies not in their egotism. These creatures just have a perfectly natural taste for mammalian flesh. They are frightening, dramatic threats, to be sure, but not symbols of anything human. Sometimes in fiction, as Freud might have said, a monster is just a monster. In Booker's account, denying your "inner feminine" is bad news, and all evildoers, including Lady Macbeth, are guilty of it. Not only do such Jungian clichés wear thin, they get in the way of adequate interpretation. Having seduced so many women and killed the father of one, Don Giovanni will "never develop his inner feminine" and act with the strength of a mature man, according to Booker. This ignores a most piquant feature of Lorenzo Da Ponte's libretto: The Don stubbornly stands up to the Commendatore's ghost at the opera's end and is pulled down to hell on account of it. Booker's discussion of what he calls "the Rule of Three" reveals his obsessive, self-confirming method. From the three questions of Goldilocks and Red Riding Hood to Lear's three daughters, sets of three are ubiquitous in literature, Booker claims. "Once we become aware of the archetypal significance of three in storytelling," he explains, "we can see it everywhere, expressed in all sorts of different ways, large and small." Sure, and anyone who studies the personality types of astrology will see Virgos and Scorpios everywhere too. Relations among three, four or five characters in a narrative enable more dramatic possibilities than relations between two. This is a matter of ordinary logic, not literary criticism. The "archetype of three," as he calls it, is no archetype at all, though he contrives to find it where it is plainly absent. Scylla and Charybdis may look like two dangers to you and me, but the middle way between them actually makes, as Booker explains, three possibilities for Odysseus, thus saving his Rule of Three. That Jane Eyre spends three days running across the moors "conveys to us, by a kind of symbolic shorthand, just how tortuous and difficult" her escape is. But why three? If Jane had spent five days on the moors, or 40 days, she'd have been even more tuckered out. And while there are three bears, three chairs and three bowls of porridge in "Goldilocks and the Three Bears," there are actually four characters. The story would better support Booker's theory were it "Goldilocks and the Two Bears." But, like astrologers, he is not keen to consider negative evidence. The first thinker to tackle Booker's topic was Aristotle. Write a story about a character, Aristotle showed, and you face only so many logical alternatives. In tragedy, for instance, either bad things will happen to a good person (unjust and repugnant) or bad things happen to a bad person (just, but boring). Or good things happen to a bad person (unjust again).
Tragedy needs bad things to happen to a basically good but flawed person: Though he may not have deserved his awful fate, Oedipus was asking for it. In the same rational spirit, Aristotle works out dramatic relations: A conflict between strangers or natural enemies is of little concern to us. What arouses interest is a hate-filled struggle between people who ought to love each other -- the mother who murders her children to punish her husband, or two brothers who fight to the death. Aristotle knew this for the drama of his age as much as soap-opera writers know it today. Booker has not discovered archetypes, hard-wired blueprints, for story plots, though he has identified the deep themes that fascinate us in fictions. Here's an analogy: Survey the architectural layout of most people's homes and you will find persistent patterns in the variety. Bedrooms are separated from kitchens. Kitchens are close to dining rooms. Front doors do not open onto children's bedrooms or bathrooms. Are these patterns Jungian room-plan archetypes? Hardly. Life calls for logical separations of rooms where families can sleep, cook, store shoes, bathe and watch TV. Room patterns follow not from mental imprints, but from the functions of the rooms themselves, which in turn follow from our ordinary living habits. So it is with stories. The basic situations of fiction are a product of fundamental, hard-wired interests human beings have in love, death, adventure, family, justice and adversity. These values counted as much in the Pleistocene era as today, which is why evolutionary psychologists study them intensively. Our fictions are populated with character-types relevant to these themes: beautiful young women, handsome strong men, courageous leaders, children needing protection, wise old people. Add to this threats and obstacles to the fulfillment of love and fortune, including both bad luck and villains, and you have the makings of literature. Story plots are not unconscious archetypes, but follow, as Aristotle realized, from human interests and the logic of what is possible. Booker ends his 700-page treatise with a diatribe against literature of the past two centuries. Modern fiction has "lost the plot," he argues. Moby-Dick initially may look like a heroic Overcoming the Monster tale, but in the end we do not know who is more evil, Captain Ahab or the whale who kills him. While the ambiguities of modernism trouble Booker, some of his readers will be even more disturbed to find "E.T." and Peter Jackson's "Lord of the Rings" movies extravagantly lauded in a book that disparages the complex moral pessimism of Chekhov's "Uncle Vanya" and the achievement of Marcel Proust's Remembrance of Things Past, which he dismisses as "the greatest monument to human egotism in the history of story-telling." Fail though it might in its ambition to offer a single key to literature, The Seven Basic Plots is nevertheless one of the most diverting works on storytelling I've ever encountered. Pity about the Jung, but there's no denying the charm of Booker's twice-told tales. Denis Dutton edits the journal Philosophy and Literature and the Web site Arts & Letters Daily. From checker at panix.com Sun May 8 17:50:14 2005 From: checker at panix.com (Premise Checker) Date: Sun, 8 May 2005 13:50:14 -0400 (EDT) Subject: [Paleopsych] Black-White-East Asian IQ differences In-Reply-To: References: Message-ID:
Besides, Black are indeed mentioned in the initial e-mail, along with Whites and East Asians. And you haven't addressed the question of how much money it might cost to do studies that meet your criteria of scientific adequacy to resolve these issues. On 2005-04-28, Greg Bear opined [message unchanged below]: > Date: Thu, 28 Apr 2005 17:11:41 -0700 > From: Greg Bear > Reply-To: The new improved paleopsych list > To: 'The new improved paleopsych list' > Subject: RE: [Paleopsych] Black-White-East Asian IQ differences > > Frank, what ARE you talking about? The initial discussion mentioned nothing > about blacks, but seemed to be about comparisons between the IQs of Asians > and so-called Whites--what brings out all this stuff about Uplift and > Blacks? And quoting the Constitution! My. > > The shoe seems to fit so well, it pinches. > > Are you a white male, middle-aged or older, mathematically adept, and proud > of your exceptional IQ? Then DO I HAVE A SCIENTIFIC SCAM FOR YOU! > > Not only does this scam claim to prove your nagging suspicions that blacks > are inferior to whites, but it's COMPLETELY GUILT-FREE, because it's > RATIONAL, based on PROVABLE MATHEMATICS! And better than that, it's > supported by the nagging suspicions of PEOPLE JUST LIKE YOU! People who grew > up in a different time. > > You don't answer any of my scientific objections. For a so-called scientific > forum, that's rather sad. > > I do enjoy Mr. Mencken, but we were talking about the biology of racial > differences, not TAXPAYER DOLLARS BEING WASTED ON SPONGING LOW-LIFES WHO > HAVEN'T A HOPE IN HELL OF EVER UNDERSTANDING WHY THEY'RE SO INFERIOR TO > ANGRY WHITE MALES. So let's not CHANGE THE SUBJECT. > > Sorry about the caps. In this sort of talk-radio atmosphere, they just > seemed appropriate. > > Greg > > -----Original Message----- > From: paleopsych-bounces at paleopsych.org > [mailto:paleopsych-bounces at paleopsych.org] On Behalf Of Premise Checker > Sent: Thursday, April 28, 2005 10:31 AM > To: The new improved paleopsych list > Subject: RE: [Paleopsych] Black-White-East Asian IQ differences > > Hold on a moment, Greg. What is the issue being addressed? It's why the > trillions of dollars taken from the taxpayers and spent on uplifting > Blacks has not been very successful and whether innate racial differences > constitute a large part of the explanation. Let me ask you, how much do > you think it would cost to get a good answer? (I have worked in the > program evaluation section of the U.S. Department of Education, and this > is one question we dare not address.) And if you don't think a good answer > can be had at a reasonable cost, do you think that no more money should be > spent on this uplift and either returned to the taxpayers or spent on > something that can be reliably evaluated? I am not sure what Mr. Mencken > called the Uplift is an appropriate function of the government. Certainly > not for the Federal government, since it is not among the 18 powers > granted to Congress under the Constitution, Art. 1, Sec. 
8.: > > The United States Constitution: Article I, Section 8: > > Clause 1: The Congress shall have Power To lay and collect Taxes, > Duties, Imposts and Excises, to pay the Debts and provide for the common > Defence and general Welfare of the United States; but all Duties, > Imposts and Excises shall be uniform throughout the United States; > > Clause 2: To borrow Money on the credit of the United States; > > Clause 3: To regulate Commerce with foreign Nations, and among the > several States, and with the Indian Tribes; > > Clause 4: To establish an uniform Rule of Naturalization, and uniform > Laws on the subject of Bankruptcies throughout the United States; > > Clause 5: To coin Money, regulate the Value thereof, and of foreign > Coin, and fix the Standard of Weights and Measures; > > Clause 6: To provide for the Punishment of counterfeiting the Securities > and current Coin of the United States; > > Clause 7: To establish Post Offices and post Roads; > > Clause 8: To promote the Progress of Science and useful Arts, by > securing for limited Times to Authors and Inventors the exclusive Right > to their respective Writings and Discoveries; > > Clause 9: To constitute Tribunals inferior to the supreme Court; > > Clause 10: To define and punish Piracies and Felonies committed on the > high Seas, and Offences against the Law of Nations; > > Clause 11: To declare War, grant Letters of Marque and Reprisal, and > make Rules concerning Captures on Land and Water; > > Clause 12: To raise and support Armies, but no Appropriation of Money to > that Use shall be for a longer Term than two Years; > > Clause 13: To provide and maintain a Navy; > > Clause 14: To make Rules for the Government and Regulation of the land > and naval Forces; > > Clause 15: To provide for calling forth the Militia to execute the Laws > of the Union, suppress Insurrections and repel Invasions; > > Clause 16: To provide for organizing, arming, and disciplining, the > Militia, and for governing such Part of them as may be employed in the > Service of the United States, reserving to the States respectively, the > Appointment of the Officers, and the Authority of training the Militia > according to the discipline prescribed by Congress; > > Clause 17: To exercise exclusive Legislation in all Cases whatsoever, > over such District (not exceeding ten Miles square) as may, byCession of > particular States, and the Acceptance of Congress, become the Seat of > the Government of the United States, and to exercise like Authority over > all Places purchased by the Consent of the Legislature of the State in > which the Same shall be, for the Erection of Forts, Magazines, Arsenals, > dock-Yards, and other needful Buildings;--And > > Clause 18: To make all Laws which shall be necessary and proper for > carrying into Execution the foregoing Powers, and all other Powers > vested by this Constitution in the Government of the United States, or > in any Department or Officer thereof. > > Perhaps you think the Constitution should be amended or ignored. The > amazing thing is that http://www.ed.gov contains a statement that its > activities are unauthorized! > > Frank > > On 2005-04-27, Greg Bear opined [message unchanged below]: > >> Date: Wed, 27 Apr 2005 14:27:28 -0700 >> From: Greg Bear >> Reply-To: The new improved paleopsych list >> To: 'The new improved paleopsych list' >> Subject: RE: [Paleopsych] Black-White-East Asian IQ differences >> >> Must intrude here. This sort of nonsense is so unscientific as to be >> laughable. 
>> >> Major undefined terms: IQ (what is it really measuring?) >> Intelligence: In what environment does your IQ give you an advantage? >> Nature, society, Mad Max country? >> >> "Genetic"--where is the gene for intelligence, or the set of genes? Do > these >> genes differ between the races? We do not know. >> >> This general belief system, expressed with the utmost arrogance in THE > BELL >> CURVE, is generally held by a class of mathematically adept middle-aged > and >> older white males with pretensions to an understanding of some of the > major >> biological issues of our time. Their ignorance of genetics is profound. > Most >> of their language and conceptual structure refers to outmoded genetics of >> forty to fifty years ago--and they've never heard of epigenetics, the > study >> of how genes are switched on and off, activated and deactivated. > Indeed--not >> only do we now know that "defective" genes can be corrected in > non-Mendelian >> ways, having little to do with one's parentage, but even "perfect" genes > can >> be switched off in certain environments, after development and birth. >> >> Population groups put under pressure--through war, prejudicial treatment, >> incarceration, or outright persecution--are likely to witness major >> differences in the EXPRESSION of certain (possibly) genetic traits, which >> could statistically help them adapt to a dangerous and stressful >> environment. These adaptations may result in a skewing of IQ results, even >> should such tests be culturally neutral--which they are not--by focusing >> their nervous reaction to stimuli and deemphasizing their ability to focus >> on tasks of less immediate importance--that is, a written test. Fight or >> write, so to speak. >> >> We cannot test the Irish in 1840's Ireland or New York, or the Hungarians >> pressed over centuries by various contending hordes, or poor white trash > in >> the American South before the Civil War. Supposedly honorable white men in >> their day referred to these populations as "low" and cretinous, and > believed >> that intermarriage would not be advantageous. >> >> IQ matters most among equal populations in a fair and civilized society, > all >> things being equal socially; any other circumstance is skewed to those >> raised with various spoons of precious metals and family tradition firmly >> thrust into their mouths. 'Twas ever thus. >> >> So my questions of these researchers would be: "Which fifty percent of the >> genome would you blame? The right half, or the left? The one in the > middle? >> Which genes are you pointing to? Are you speaking of generalities, or >> specifics? If the latter, what specifically are you trying to say? >> >> "And why does this sound so much like the same sort of ignorant, > prejudicial >> crap promulgated throughout the ages by people of influence, to maintain >> their status through any means possible, fair or unfair?" >> >> White mathematically educated males of tested high IQ, trying to prove > that >> IQ is inbred and important... hm. Sounds like a class in search of >> justification to me. >> >> I'd challenge these folks to a duel on the public commons any day of the >> week. Easy money. Facts and native charm versus their almighty IQs. 
>> >> Greg Bear >> >> -----Original Message----- >> From: paleopsych-bounces at paleopsych.org >> [mailto:paleopsych-bounces at paleopsych.org] On Behalf Of Michael Christopher >> Sent: Wednesday, April 27, 2005 1:48 PM >> To: paleopsych at paleopsych.org >> Subject: [Paleopsych] Black-White-East Asian IQ differences >> >>>> Black-White-East Asian IQ differences at least >> 50% genetic, major law review journal concludes<< >> >> --If this is true, how should society change to deal >> with it? Also, what is the IQ difference for someone >> with a male or female parent of a different race, or >> for various blends? >> >> Michael > _______________________________________________ > paleopsych mailing list > paleopsych at paleopsych.org > http://lists.paleopsych.org/mailman/listinfo/paleopsych > > > _______________________________________________ > paleopsych mailing list > paleopsych at paleopsych.org > http://lists.paleopsych.org/mailman/listinfo/paleopsych > From checker at panix.com Sun May 8 19:16:22 2005 From: checker at panix.com (Premise Checker) Date: Sun, 8 May 2005 15:16:22 -0400 (EDT) Subject: [Paleopsych] WP: Where Lawlessness May Roam Message-ID: Where Lawlessness May Roam The Washington Post, Sunday Outlook section, 5.5.8 Unconventional Wisdom http://www.washingtonpost.com/wp-dyn/content/article/2005/05/06/AR2005050601818_pf.html By Richard Morin Where Lawlessness May Roam We're certainly not encouraging it, but if you're thinking about going on a crime spree and you're scouting for locations, you might want to check out a 50-square-mile sliver of western Idaho. In this remote corner of Yellowstone National Park, a quirky confluence of constitutional technicalities and a goof by Congress more than a century ago may have produced a lawless oasis smack in the heart of God's Country, claims Brian C. Kalt, an associate professor of law at Michigan State University. Kalt insists that his reading of the law is correct -- at least in theory. "The courts may or may not agree that my loophole exists," he acknowledged in his essay "The Perfect Crime" in the forthcoming issue of the Georgetown Law Journal. Kalt says he's not interested in trying to help crooks, but rather in forcing Congress to tidy up the law books. "Crime is bad, after all. But so is violating the Constitution. If the loophole . . . does exist it should be closed, not ignored," he writes in an article that mixes serious scholarship with humor. At the heart of the problem is an obscure bit of legalese buried in the Sixth Amendment known as the "vicinage" requirement. (For non-lawyers, vicinage refers to the neighborhood where the crime took place, while venue refers to the location of the trial itself.) The amendment requires that jurors be "of the State and district wherein the crime shall have been committed, which district shall have been previously ascertained by law." From a legal perspective, the problem with Yellowstone Park is that it does not quite fit in Wyoming: Nine percent of the park spills into Montana (about 260 square miles' worth) and Idaho (about 50 square miles). The park was established in 1872, well before the three states were added to the union, and Congress put the entire park in the judicial district of Wyoming -- the only federal court district that includes land in more than one state. At the same time, Kalt said, legislators unwittingly created a potential "Zone of Death." 
Here's how it might work: "Say that you are in the Idaho portion of Yellowstone, and you decide to spice up your vacation by going on a crime spree. You make some moonshine, you poach some wildlife, you strangle some people and steal their picnic baskets. You are arrested, arraigned in the park and bound over for trial in Cheyenne, Wyo., before a jury drawn from the Cheyenne area. "But Article III, Section 2 [of the Constitution] plainly requires that the trial be held in Idaho, the state in which the crime was committed. Perhaps if you fuss convincingly enough about it, the case would be sent to Idaho. But the Sixth Amendment then requires that the jury be from the state (Idaho) and the district (Wyoming) in which the crime was committed. In other words, the jury would have to be drawn from the Idaho portion of Yellowstone National Park, which, according to the 2000 Census, has a population of precisely zero. . . . Assuming that you do not feel like consenting to trial in Cheyenne, you should go free." In short, Congress goofed back in 1890 when they made Wyoming the 44th state. "It should either have shrunk the park or made Wyoming bigger to include all of the park," Kalt said. Ah, legal hindsight is always 20/20. Kalt said the vagaries of venue and vicinage requirements have let people get away with murder before. He quotes an English legal scholar who complained in 1548 that it "often happene[d]" that a murderer would strike his victim in one county, and "by Craft and Cautele [caution]" escape punishment by making sure that the victim died in the next county. "An English jury could only take cognizance of the facts that occurred in its own county, so no jury would be able to find that the killer had committed all of the elements of murder," Kalt wrote. (Rest easy. England closed that loophole centuries ago.) But, professor, that was then. Could you really get away with murder today in your Zone of Death? Perhaps not -- at least not completely. Kalt notes that it "would be hard to limit your criminality to that small space," so you could be charged with conspiracy for things you did elsewhere to further your rampage. Prosecutors also could charge you with lesser crimes punishable by less than six months in jail, which do not require a jury trial. Or the victims' families could sue the pants off you. But the biggest deterrent may be the loophole that allowed your crime binge in the first place. If friends and families of your victims got wind of your plans, they might turn the tables before you left the crime scene, giving you -- in Kalt's words -- "a dose of your own medicine, administering vigilante justice with similar impunity." The Unusual Name Game (Cont.) We all know it's tough being a boy named Sue. Now it turns out it's also a problem to be a classmate of a boy named Sue, according to University of Florida economist David N. Figlio. Figlio found that boys with first names typically given to girls were more likely to misbehave in junior high school than students with less distinctive monikers. He also discovered that boys in classes with boys with feminine-sounding names were more likely to have discipline problems and lower standardized test scores. He reports his findings in a new working paper published by the National Bureau of Economic Research. Figlio made news in this column two months ago with his finding that children with unusual names don't fare as well in class. 
In his latest analysis, he used detailed data collected on more than 76,000 students in the late 1990s from a large school district in Florida. In exchange for access to student records, including names and disciplinary histories, Figlio promised not to reveal the school district or otherwise identify individual students. Overall, Figlio found that nearly 2 percent of all boys in his sample had names that were overwhelmingly given to girls. That means the typical Florida middle-schooler will share about one out of every three classes with a boy named Sue . . . or Ashley, Courtney or Shannon. But as the father of three boys, the Wiz just has to ask: How about girls with guy-sounding names? Any effects on the other students from going to school with a girl named Tyler or Sidney? "I did not look as carefully at the girl-named-Brad angle, in part because this is much more common," Figlio wrote in an e-mail. "Indeed, Ashley, Courtney and Shannon were all once boys' names!" The Four-Hour Workday? Men and women may disagree on a lot of things, but in one area of life they're in near-perfect agreement. If they had the choice, they would work fewer hours than they do now, according to a Washington Post-ABC News national poll conducted last month. Equal majorities of working men (55 percent) and women (56 percent) said they'd spend less time on the job if they could continue to have the same standard of living. Fewer than three in 10 said they wouldn't reduce the number of hours they spend on the job. But few men and women want to join the leisure class entirely: Slightly fewer than one in five women and men said they would quit working if they could afford to. For the survey, 650 working women and men were interviewed April 21-24. Margin of sampling error for the overall results is plus or minus 4 percentage points. morinr at washpost.com From checker at panix.com Sun May 8 19:16:38 2005 From: checker at panix.com (Premise Checker) Date: Sun, 8 May 2005 15:16:38 -0400 (EDT) Subject: [Paleopsych] Book World: (Ogden Nash) A Gleeful Splash of Ogden Nash Message-ID: This is just for fun. A Gleeful Splash of Ogden Nash Washington Post Book World, 5.5.8 http://www.washingtonpost.com/wp-dyn/content/article/2005/05/05/AR2005050501359_pf.html By Jonathan Yardley OGDEN NASH The Life and Work of America's Laureate of Light Verse By Douglas M. Parker. Ivan R. Dee. 316 pp. $27.50 At the end of the 1920s Ogden Nash was in his late twenties, living in New York City, working as a copywriter in the advertising department of Doubleday, the prominent book publisher, and trying his hand at poetry. It didn't take long, Douglas M. Parker writes, for him to reach "the important conclusion that he simply lacked the talent to become a serious poet: 'There was a ludicrous aspect to what I was trying to do; my emotional and naked beauty stuff just didn't turn out as I had intended.' " Instead he ventured into light verse, which enjoyed a more significant readership then than it does today. This was one of his earliest efforts: The turtle lives twixt plated decks That practically conceal its sex. I think it clever of the turtle In such a fix to be so fertile. The poem "made a remarkable impression on the humorist Corey Ford" and others as well. Soon Nash came up with this: The hunter crouches in his blind Mid camouflage of every kind. He conjures up a quaking noise To lend allure to his decoys. This grownup man, with pluck and luck Is hoping to outwit a duck. 
For my money, poetry doesn't get much better than that, whether "light" or "serious," and Nash did just that for four more decades, until his death in Baltimore on May 19, 1971. It was often said during his lifetime that he and Robert Frost were the only American poets who were able to support themselves and their families on the income from their work as poets, a claim that almost certainly cannot be made for a single American poet today, with the possible exception of Billy Collins. In just about all other respects Nash and Frost could not have been more different, but we can look back on them now as the last vestiges of an age when poetry still mattered in the United States, not just to academics and other poets but to the great mass of ordinary readers. To say that Nash mattered in my own family is gross understatement. My parents -- like Nash, members of the educated but far from wealthy middle class -- awaited each new issue of the New Yorker with the eager expectation that a new Nash poem would be found therein. For a couple of summers my family vacationed on New Hampshire's tiny coastline, where my father chatted up the great man on the beach. I caught the infection as a teenager and in high-school senior English wrote my class paper on Nash. My teacher, whom I revered, declared that "your comments are delicate and restrained," that "you express your admiration for Mr. Nash tastefully and with tact," and handed me an A-, a truly rare event in my sorry academic history. That same teacher also noted, tactfully, that Nash's "poetic credo is perhaps stated in 'Very Like a Whale' and perhaps will interest you." This poem is indeed a key to Nash. It begins, "One thing that literature would be greatly the better for/ Would be a more restrained employment by authors of simile and metaphor," takes note of Byron's "the Assyrian came down like a wolf on the fold" and then takes exception to it -- "No, no, Lord Byron, before I'll believe that this Assyrian was actually like a wolf I must have some kind of proof;/ Did he run on all fours and did he have a hairy tail and a big red mouth and big white teeth and did he say Woof woof?" -- and closes with a flourish: That's the kind of thing that's being done all the time by poets, from Homer to Tennyson; They're always comparing ladies to lilies and veal to venison. And they always say things like that the snow is a white blanket after a winter storm. Oh it is, is it, all right then, you sleep under a six-inch blanket of snow and I'll sleep under a half-inch blanket of unpoetical blanket material and we'll see which one keeps warm, And after that maybe you'll begin to comprehend dimly What I mean by too much metaphor and simile. It's all right there: Nash's irreverence, his delayed and often improbable rhymes ("dimly" and "simile"), his long, death-defying lines of verse, his delight in tweaking the pompous and pretentious. He liked to say that since he never could be anything more than a bad good poet he would settle for being a good bad poet, but there was nothing bad about his verse, as was commonly recognized by other poets, writers and critics. W.H. Auden thought he was "one of the best poets in America," Clifton Fadiman praised his "dazzling assortment of puns, syntactical distortions and word coinages," and when Scott Fitzgerald's daughter sent her father a bad imitation of Nash, he replied: "Ogden Nash's poems are not careless, they all have an extraordinary inner rhythm. 
They could not possibly be written by someone who in his mind had not calculated the feet and meters to the last iambus or trochee. His method is simply to glide a certain number of feet and come up smack against his rhyming line. Read over a poem of his and you will see what I mean." Indeed. That astute judgment is borne out in just about everything Nash wrote, as well as in the utter failure of all those -- their numbers were (and are) uncountable -- who tried to imitate him. To say that he was the best American light poet of his or any other day is true beyond argument, but it is scarcely the whole story. He was one of the best American poets of his or any other day, period, and it is a great injustice that critics customarily pigeonhole (and dismiss) him as a mere entertainer because he committed the unpardonable sin of being funny. Nash emerges, in Parker's capable if conventional biography, as a decent man whose inner life probably was a lot more complicated than his verse suggests. He was born into comfortable circumstances in suburban New York, but those circumstances changed dramatically with his father's business failure. Nash put in only a year at college before going to New York City and the real world, but he was exceptionally well read and universally esteemed among his many friends for the brilliance of his mind. He tended to drink a bit too much and was prone to depression, especially in later life, but people loved to be with him. "We hung onto him," one friend said. "He was a great lifesaver for everybody. . . . He was lovely and amusing and fun." The great love of his life was Frances Leonard, a belle of Baltimore whom he met there in 1928, courted assiduously (sometimes desperately) and at last married three years later. She was charming, beautiful and, when the occasion called for it, difficult. He learned how to deal with her moods, and his "devotion to Frances never wavered." They had two daughters, whom he adored and about whom he wrote many poems, some of them agreeably sentimental, some of them funny, all of them astute: I have a funny daddy Who goes in and out with me, And everything that baby does My daddy's sure to see And everything that baby says, My daddy's sure to tell You must have read my daddy's verse I hope he fries in hell. Though Nash earned a decent income off the poems he sold to the New Yorker, the Saturday Evening Post and other magazines, he and Frances had expensive tastes, and he had ambitions beyond poetry. Like many other writers of his day, he wanted to succeed in the Broadway theater. Unlike most others, he actually did, with "One Touch of Venus," a musical by Kurt Weill for which he wrote the lyrics and collaborated with S.J. Perelman on the book. The show opened in October 1943 and ran for an impressive 567 performances. One of the songs, "Speak Low," remains a classic of cabaret and jazz and has been recorded by many of the country's best singers. Strictly for money, Nash went onto the lecture circuit in 1945. His "tours would occupy Nash for several weeks a year for nearly twenty years and have a significant impact on his life and health." The tours were exhausting, but audiences invariably were large and welcoming; Nash was gratified by this direct contact with his readers and kept on the circuit long after its effect on his health had become deleterious. Never robust, by the time he hit his sixties he suffered from numerous ailments, many of them intestinal and some of them debilitating. 
Toward the end of his life Nash agreed to deliver the commencement address at his daughter Linell's boarding school. Perhaps subconsciously aware of the approaching end, he made it "his own valedictory." He spoke up for humor: "It is not brash, it is not cheap, it is not heartless. Among other things I think humor is a shield, a weapon, a survival kit. . . . So here we are several billion of us, crowded into our global concentration camp for the duration. How are we to survive? Solemnity is not the answer, any more than witless and irresponsible frivolity is. I think our best chance lies in humor, which in this case means a wry acceptance of our predicament. We don't have to like it but we can at least recognize its ridiculous aspects, one of which is ourselves." Today, when we need to laugh perhaps more than ever before, we can only thank God for Ogden Nash. Jonathan Yardley's e-mail address is yardleyj at washpost.com. From he at psychology.su.se Mon May 9 13:36:26 2005 From: he at psychology.su.se (Hannes Eisler) Date: Mon, 9 May 2005 15:36:26 +0200 Subject: [Paleopsych] fads and atoms In-Reply-To: <1ee.3b3eba41.2fad6251@aol.com> References: <1ee.3b3eba41.2fad6251@aol.com> Message-ID: What about incorrectly folded prions? >The following article hits the motherlode when it comes to our past >discussions of Ur patterns, iteration, and fracticality. Ur >patterns are those that show up on multiple levels of emergence, >patterns that make anthropomorphism a reasonable way of doing >science, patterns that explain why a metaphor can capture in its >word-picture the underlying structure of a whirlwind, a brain-spin, >or a culture-shift. > > > >Here's how a pattern in the molecules of magnets repeats itself in >the mass moodswings of human beings. Howard > >Retrieved May 6, 2005, from the World Wide Web >http://www.newscientist.com/article.ns?id=mg18624984.200 One law rules dedicated followers of fashion 06 >May 2005 Exclusive from New Scientist Print Edition Mark Buchanan >FADS, fashions and dramatic shifts in public opinion all appear to >follow a physical law: one of the laws of magnetism. Quentin >Michard of the School of Industrial Physics and Chemistry in Paris >and Jean-Philippe Bouchaud of the Atomic Energy Commission in >Saclay, France, were trying to explain three social trends: >plummeting European birth rates in the late 20th century, the rapid >adoption of cellphones in Europe in the 1990s and the way people >clapping at a concert suddenly stop doing so. In each case, they >theorised, individuals not only have their own preferences, but also >tend to imitate others. "Imitation is deeply rooted in biology as a >survival strategy," says Bouchaud. In particular, people frequently >copy others who they think know something they don't. To model the >consequences of imitation, the researchers turned to the physics of >magnets. An applied magnetic field will coerce the spins of atoms in >a magnetic material to point in a certain direction. And often an >atom's spin direction pushes the spins of neighbouring atoms to >point in a similar direction. And even if an applied field changes >direction slowly, the spins sometimes flip all together and quite >abruptly. 
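The quoted piece describes the mechanics only in words, so here is a minimal Python sketch of that kind of threshold-and-imitation model. It is not Michard and Bouchaud's actual code; the agent count, the Gaussian threshold distribution, and the coupling strengths below are illustrative assumptions. Each agent adopts once an external incentive plus a coupling to the current adoption level crosses the agent's private threshold; with weak imitation adoption rises smoothly, while with strong imitation it jumps almost discontinuously.

import random

def simulate(n_agents=5000, coupling=0.0, steps=60, seed=1):
    """Fraction of adopters as an external incentive is slowly ramped from 0 to 1.

    Illustrative sketch only (not the researchers' model or parameters):
    agent i adopts once incentive + coupling * adoption_fraction > threshold_i.
    """
    rng = random.Random(seed)
    thresholds = [rng.gauss(0.5, 0.15) for _ in range(n_agents)]  # private reluctance of each agent
    adopted = [False] * n_agents
    history = []
    for step in range(steps):
        incentive = step / (steps - 1)  # the slowly increasing "applied field"
        for _ in range(20):  # settle to a self-consistent adoption level at this incentive
            frac = sum(adopted) / n_agents
            adopted = [a or (incentive + coupling * frac > h)
                       for a, h in zip(adopted, thresholds)]
        history.append(sum(adopted) / n_agents)
    return history

def biggest_jump(history):
    """Largest one-step change in the adoption fraction."""
    return max(later - earlier for earlier, later in zip(history, history[1:]))

if __name__ == "__main__":
    print(f"largest one-step jump, weak imitation (coupling 0.1): {biggest_jump(simulate(coupling=0.1)):.3f}")
    print(f"largest one-step jump, strong imitation (coupling 0.6): {biggest_jump(simulate(coupling=0.6)):.3f}")

Under these assumptions the weak-imitation run changes only a few percent per step, while the strong-imitation run flips most of the population in a single step, the analogue of the spins flipping "all together and quite abruptly."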
The physicists modified the model such that the atoms >represented people and the direction of the spin indicated a >person's behaviour, and used it to predict shifts in public >opinion. In the case of cellphones, for example, it is clear that >as more people realised how useful they were, and as their price >dropped, more people would buy them. But how quickly the trend took >off depended on how strongly people influenced each other. The >magnetic model predicts that when people have a strong tendency to >imitate others, shifts in behaviour will be faster, and there may >even be discontinuous jumps, with many people adopting cellphones >virtually overnight. More specifically, the model suggests that the >rate of opinion change accelerates in a mathematically predictable >way, with ever greater numbers of people changing their minds as the >population nears the point of maximum change. Michard and Bouchaud >checked this prediction against their model and found that the >trends in birth rates and cellphone usage in European nations >conformed quite accurately to this pattern. The same was true of the >rate at which clapping died away in concerts. > > >---------- >Howard Bloom >Author of The Lucifer Principle: A Scientific Expedition Into the >Forces of History and Global Brain: The Evolution of Mass Mind From >The Big Bang to the 21st Century >Visiting Scholar-Graduate Psychology Department, New York >University; Core Faculty Member, The Graduate Institute >www.howardbloom.net >www.bigbangtango.net >Founder: International Paleopsychology Project; founding board >member: Epic of Evolution Society; founding board member, The Darwin >Project; founder: The Big Bang Tango Media Lab; member: New York >Academy of Sciences, American Association for the Advancement of >Science, American Psychological Society, Academy of Political >Science, Human Behavior and Evolution Society, International Society >for Human Ethology; advisory board member: Youthactivism.org; >executive editor -- New Paradigm book series. >For information on The International Paleopsychology Project, see: >www.paleopsych.org >for two chapters from >The Lucifer Principle: A Scientific Expedition Into the Forces of >History, see www.howardbloom.net/lucifer >For information on Global Brain: The Evolution of Mass Mind from the >Big Bang to the 21st Century, see www.howardbloom.net > > >_______________________________________________ >paleopsych mailing list >paleopsych at paleopsych.org >http://lists.paleopsych.org/mailman/listinfo/paleopsych -- ------------------------------------- Prof. Hannes Eisler Department of Psychology Stockholm University S-106 91 Stockholm Sweden e-mail: he at psychology.su.se fax : +46-8-15 93 42 phone : +46-8-163967 (university) +46-8-6409982 (home) internet: http://www.psychology.su.se/staff/he -------------- next part -------------- An HTML attachment was scrubbed... URL: From checker at panix.com Mon May 9 21:04:35 2005 From: checker at panix.com (Premise Checker) Date: Mon, 9 May 2005 17:04:35 -0400 (EDT) Subject: [Paleopsych] NYTDBR: That 'Prozac' Man Defends the Gravity of a Disease Message-ID: That 'Prozac' Man Defends the Gravity of a Disease New York Times Daily Book Review, 5.5.9 http://www.nytimes.com/2005/05/09/books/09masl.html By JANET MASLIN In his new book, Peter D. 
Kramer tells a story about traveling to promote the best-known of his earlier books, "Listening to Prozac," and regularly encountering the same kind of wiseguy in lecture audiences. Wherever he went, somebody would ask him whether the world would be shorter on Impressionist masterpieces if Prozac had been prescribed for Vincent van Gogh. Sunflowers and starry nights aside, this anecdote is revealing. It conveys both the facts that "Listening to Prozac" made a mental health celebrity out of Dr. Kramer (who is a clinical professor of psychiatry at Brown University) and that the book's success left him uneasy. He became a target, not only of New Yorker cartoons (one of which featured a Prozac-enhanced Edgar Allan Poe being nice to a raven) but of condescension from his professional peers. He found out that there was no intellectual advantage to be gained from pointing the way to sunnier moods. "Against Depression" is a defensive maneuver against such vulnerability. With both a title and an argument that summon Susan Sontag (in "Against Interpretation" and "Illness as Metaphor"), the author argues against the idea that depression connotes romance or creativity. While fully acknowledging depression's seductiveness (Marlene Dietrich is one of his prototypes of glamorous apathy), and grasping how readily the connection between gloom and spiritual depth has been made, Dr. Kramer argues for a change in priorities. He maintains that depression's physiology and pathology matter more than its cachet. Dr. Kramer makes this same point over and over in "Against Depression." It may be self-evident, but it's not an idea that easily sinks in. As this book points out, the tacit glorification of depression inspires entire art forms: "romantic poetry, religious memoir, inspirational tracts, the novel of youthful self-development, grand opera, the blues." There isn't much comparable magnetism in the realms of resilience, happiness and hope. What's more, he says, our cultural embrace of despair has a respected pedigree. Depression is the new tuberculosis: "an illness that signifies refinement," as opposed to one that signifies unpleasantness and pain. In a book that mixes medical theory, case histories and the occasional flash of autobiography, Dr. Kramer speaks of having been immersed in depression - "not my own" - when inundated with memoirs about the depressed and their pharmacological adventures. He finds there is a lot more confessional writing of this sort than there is about suffering from, say, kidney disease. But depression, in his view, is as dangerous and deserving of treatment as any other long-term affliction. When regarded in purely medical terms, evaluated as a quantifiable form of degeneration, depression loses its stylishness in a hurry. Here, matters grow touchy: the author is careful to avoid any remedial thoughts that might appear to promote the interests of drug companies. So there are no miracle cures here; there is just the hope that an embrace of strength and regeneration can supplant the temptation to equate despair with depth. "Against Depression" returns repeatedly to this central, overriding premise. Perhaps Dr. Kramer's talk-show-ready scare tactics are essential to his objectives. "The time to interrupt the illness is yesterday," he writes, building the case for why even seemingly brief interludes of depression can signal a relentless pattern of deterioration in a patient's future. 
For anyone who has spent even two straight weeks feeling, for instance, sad, lethargic, guilty, alienated and obsessed with trifles, "Against Depression" has unhappy news. The author does not stop short of declaring that "depression is the most devastating disease known to humankind." But this claim, like much of the medical data discussed here, is open to interpretation and heavily dependent on the ways in which individual factors are defined. How far do the incapacitating properties of depression extend? Do they lead only to sadness and paralysis, or also to self-destructive behavior, addictions, failures, job losses and patterns passed down to subsequent generations? Whatever the case, Dr. Kramer is clearly well armed for the debate he will incite. While its medical information, particularly about depression-related damage to the brain, is comparatively clear-cut, it is in the realm of culture that "Against Depression" makes its strongest case. In these matters, Dr. Kramer is angry and defensive: he finds it outrageous that William Styron's "Darkness Visible" endows depression with such vague witchcraft ("a toxic and unnamable tide," "this curious alteration of consciousness") or that Cynthia Ozick can complain that John Updike's "fictive world is poor in the sorrows of history." He himself finds Updike's world rich in life-affirming attributes that tend to be underrated. He wonders how much of the uniformly acknowledged greatness of Picasso's blue period has to do with its connection with the suicide of one of Picasso's friends. By the same token, he is amazed by a museum curator's emphasis on the bleakest work of Bonnard, though this painter strikes Dr. Kramer as "a man for whom fruit is always ripe." Similar material, with the potential to illustrate the high status of low moods, is endless. There is a whole chapter on Sylvia Plath that the author didn't even bother to write. There is more breadth of evidence than innovative thinking in "Against Depression." Nonetheless, this book successfully advances the cartography of a (quite literally) gray area between physical and mental illness. And in the process it settles a few scores for the author, whose last book was a novel about a radical blowing up trophy houses on Cape Cod. Here is his chance to assert that he wrote his senior thesis on death in Dickens's writing; he listened to a lot of Mozart and Schubert in college; that he, too, has succumbed to the erotic power of bored, affectless, emotionally unavailable women in candlelit rooms. But he wrote this book in a state of reasonable contentment. He finds life well worth living. He's tired - in ways that have potent ramifications for all of us - of being treated as a lightweight for that. From checker at panix.com Mon May 9 21:04:56 2005 From: checker at panix.com (Premise Checker) Date: Mon, 9 May 2005 17:04:56 -0400 (EDT) Subject: [Paleopsych] CHE: Understanding 'The Sociopath Next Door' Message-ID: Understanding 'The Sociopath Next Door' The Chronicle of Higher Education, 5.5.13 http://chronicle.com/weekly/v51/i36/36a01202.htm VERBATIM By PETER MONAGHAN Martha Stout, a former clinical instructor in psychiatry at Harvard University. Recent studies suggest that one in 25 Americans is a sociopath, without conscience and ready to prey on others. But not all of them are the cunning killers of television crime dramas, says Ms. Stout, who dissects the phenomenon in The Sociopath Next Door: The Ruthless Versus the Rest of Us (Broadway Books). Q. 
Aren't many sociopaths likely to be in prison, and not among us? A. It turns out very few sociopaths apparently are in jail, and, as a matter of fact, people who are in jail are not all sociopaths. And most sociopaths are not violent. Q. But 4 percent seems high, no? A. Statistical studies are difficult to interpret... but my colleagues tend to tell me that they think it's an even larger number than that. ... When you realize that the absence of conscience can motivate lesser behaviors than going out and being a serial killer, the statistic starts to make more sense. ... We're talking about the boss who ridicules people just to make them jump, or the spouse who abuses the other spouse just to make him or her jump. Q. And that's where sociopathy enters everyday life? A. Exactly. Most sociopaths are just like everybody else. They're average people with average intelligence or sometimes even less-than-average intelligence. And the games they play are much lesser, more personal, and private games. Q. Does culture drive sociopathy? A. It appears to be about 50 percent inheritable; as for the other 50 percent that seems to be caused by the environment, nobody has really explained that. Sociopaths are not abused more as children than other groups. So probably the cultural explanation is a good one. In certain Far Eastern countries, notably Japan and Taiwan, the observed rates of sociopathy are far less -- and in those cultures there's more an emphasis on contributing to the group, and on respect for life, while our culture has its capitalistic emphasis on winning at all costs. Q. If not violent, sociopaths are generally getting their own way, right? A. Exactly. I hear comments such as, "This was the most charming person I ever met, the sexiest, the most intense. ..." I call this a predatory charisma that is difficult to explain but definitely exists. They're also very good at faking the emotions that the rest of us actually feel, such that they look normal. Q. Is it treatable or curable? A. Unfortunately not. We don't know how to instill conscience where there is none. ... Sociopaths seldom come into treatment unless they've been court-referred, and they don't seem to be in any kind of psychological pain. From checker at panix.com Mon May 9 21:05:06 2005 From: checker at panix.com (Premise Checker) Date: Mon, 9 May 2005 17:05:06 -0400 (EDT) Subject: [Paleopsych] CHE: 2 Books Explore the Sins of Anthropologists Past and Present Message-ID: 2 Books Explore the Sins of Anthropologists Past and Present The Chronicle of Higher Education, 5.5.13 http://chronicle.com/weekly/v51/i36/36a01701.htm HOT TYPE By DAVID GLENN INHUMAN ANTHROPOLOGY: One day in 1997, Gretchen E. Schafft, an applied anthropologist in residence at George Washington University, paid a visit to the Smithsonian Institution's National Anthropological Archives. Her goal that day was relatively prosaic. She wanted to read some World War II-era correspondence among American anthropologists. She wondered how much they had known at that time about the crimes committed by some of their German counterparts who had lent their services to the Nazi regime. What Ms. Schafft found instead were 75 boxes full of material produced in Poland by the Nazi anthropologists themselves. The material had been seized by U.S. soldiers in 1945 and given to the Smithsonian by the Pentagon two years later. No Smithsonian staff member had ever cataloged the boxes, which had apparently gone unnoticed for 50 years. The collection was difficult to stomach. 
It included human hair samples, fingerprints, photographs, drawings of head circumferences, and other artifacts of the Nazi regime's mania for categorizing human bodies. The Nazis were obsessed with salvaging, as they saw it, the German and other allegedly Nordic elements of the Polish population. If, in 1940, a Polish child's hair was sufficiently blond, and the shape of the head sufficiently "Aryan," he or she was likely to be forcibly sent west for "Germanization." Young people deemed purely Polish by the Nazis were shipped to work camps. Jews and Roma, of course, faced worse. In her new book, From Racism to Genocide: Anthropology in the Third Reich (University of Illinois Press), Ms. Schafft explores how the principles of early-20th-century physical anthropology, both scientific and pseudoscientific, were put to work by the Nazis. Several months after the invasion of Poland, Hitler's aides established the Institute for German Work in the East, which employed scholarly anthropologists to complete such tasks as "racial-biological investigation of groups whose value cannot immediately be determined" and "racial-biological investigation of Polish resistance members." Why were these anthropologists -- many of whom had received serious training at Germany's best universities -- willing to enlist in such projects? "I think, first of all, that they really were ideologically in tune with the government," Ms. Schafft says. "And, secondly, I think they believed that measurement data was somehow sacrosanct. To some extent, I think we still believe that. I think that's a very dangerous belief. Measurement data without context has to be viewed very suspiciously." A few years after her discovery at the Smithsonian, Ms. Schafft was contacted by a physical anthropologist who wanted to use the Nazis' data to shed light on "patterns of migration and population settlement." She resisted, arguing that the information had been collected through cruel means and for evil purposes, and is in any case highly suspect. The Nazi anthropologists often seem to have been absurdly insensitive to context. For example, they drew sweeping conclusions about alleged Russian physical and social traits on the basis of studies of half-starved Soviet soldiers in prisoner-of-war camps. "The data in and of themselves were useless," she says. "We shouldn't give the Nazis a second opportunity by rehashing these old data." The data will, however, be preserved for other purposes. The Nazi materials will soon be returned to Jagiellonian University, in Poland. (The Smithsonian will retain a digitized copy.) "What will be of most use to the people of Poland," Ms. Schafft says, "are, first, the records of Jews and others interviewed at the Tarnów ghetto. Those will give some families the last indication of where their relatives were. And, second, the amazing photographs of people in villages throughout Poland. There are portraits of hundreds, if not more than a thousand, identifiable individuals." In a small way, she hopes, maintaining the collection in Poland will preserve the memory of a few of the victims of science -- and politics -- gone mad. *** Some related moral dilemmas are chewed over in Biological Anthropology and Ethics: From Repatriation to Genetic Identity (State University of New York Press), a collection edited by Trudy R. Turner, a professor of anthropology at the University of Wisconsin at Milwaukee. 
The book's 20 essays span a range of topics from the treatment of primates in the field (when is it acceptable to use anesthesia and radio collars?) to the sharing of data with colleagues (how quickly should scholars give their results to the major international DNA databases?). Some of the most contentious debates, however, concern the ground rules for working with human remains. Ever since 1990, when Congress passed the Native American Graves Protection and Repatriation Act, or Nagpra, anthropologists have argued about how the law's provisions should be understood and enforced. Frederika A. Kaestle, an assistant professor of anthropology at Indiana University at Bloomington and a contributor to the book, says the most difficult debates concern the study of human remains that are more than 7,000 years old. In such cases, it is often impossible to determine any direct ancestry or cultural affiliation with modern American Indian groups. In her own scholarship, Ms. Kaestle interprets the law's provisions very strictly, she says. "I won't work with remains that were found on private land. Because that material isn't covered under Nagpra it's a little too iffy for me ethically." She is optimistic that even if the law is tightened, as some American Indian advocates have proposed, it will still be feasible for scholars to do DNA studies of ancient remains. "The climate is changing a bit," she says. "Some Native American groups are not only accepting but promoting this work as something that they're interested in." From checker at panix.com Mon May 9 21:04:46 2005 From: checker at panix.com (Premise Checker) Date: Mon, 9 May 2005 17:04:46 -0400 (EDT) Subject: [Paleopsych] CHE: Novel Perspectives on Bioethics Message-ID: Novel Perspectives on Bioethics The Chronicle of Higher Education, 5.5.13 http://chronicle.com/weekly/v51/i36/36b00601.htm By MARTHA MONTELLO On March 16, the Kansas Legislature heatedly debated a bill that would criminalize all stem-cell research in the state. Evangelical-Christian politicians and conservative lawmakers argued with molecular biologists and physicians from the University of Kansas' medical school about the morality of therapeutic cloning. Up against a substantial audience of vocal religious conservatives, William B. Neaves, CEO and president of the Stowers Institute for Medical Research, a large, privately financed biomedical-research facility in Kansas City, began his impassioned defense of the new research by giving his credentials as "a born-again Christian for 30 years." Barbara Atkinson, executive vice chancellor of the University of Kansas Medical Center, tried to articulate the difference between "a clump of cells in a petri dish" and what several hostile representatives repeatedly interrupted to insist is "early human life." Clearly, in this forum, language mattered. Each word carried wagonloads of moral resonance. I am a literature professor. I was at the hearing because I am also chairwoman of the pediatric-ethics committee at the University of Kansas Medical Center. I listened to the debates get more and more heated as the positions got thinner and more polarized, and I kept thinking that these scientists and lawmakers needed to read more fiction and poetry. Leon R. Kass, chairman of the President's Council on Bioethics, apparently feels the same way. He opened the council's first session by asking members to read Hawthorne's story "The Birthmark," and he has since published an anthology of literature and poetry about bioethics issues. 
The fight in Kansas (the bill was not put to a vote) is in some ways a microcosm of what has been happening around the country. From Kevorkian to Schiavo, cloning to antidepressants, issues of bioethics increasingly underlie controversies that dominate public and political discussion. Decisions about stem-cell research, end-of-life choices, organ transplantation, and mind- and body-enhancing drugs, among others, have become flash points for front-page news day after day. At the same time, some good literary narratives have emerged over the past few years that reveal our common yet deeply individual struggles to find an ethics commensurate with rapid advances in the new science and technologies. Kazuo Ishiguro's elegiac, disturbing new novel, Never Let Me Go, re-imagines our world in a strange, haunting tale of mystery, horror, love, and loss. Set in "England, 1990s," the story is pseudohistorical fiction with a hazy aura of scientific experimentation. A typical Ishiguro narrator, Kathy H. looks back on her first three decades, trying to puzzle out their meaning and discern the vague menace of what lies ahead. In intricate detail she sifts through her years at Hailsham, an apparently idyllic, if isolated, British boarding school, "in a smooth hollow with fields rising on all sides." Kathy and the other students were nurtured by watchful teachers and "guardians," who gave them weekly medical checks, warned them about the dangers of smoking, and monitored their athletics triumphs and adolescent struggles. Sheltered and protected, she and her friends Ruth and Tommy always knew that they were somehow special, that their well-being was important to the society somewhere outside, although they understood that they would never belong there. From the opening pages, a disturbing abnormality permeates their enclosed world. While the events at Hailsham are almost absurdly trivial -- Tommy is taunted on the soccer field, Laura gets caught running through the rhubarb garden, Kathy loses a favorite music tape -- whispered secrets pass among guardians and teachers, and the atmosphere is ominous -- as Kathy puts it, "troubling and strange." The children have no families, no surnames, no possessions but castoffs -- other people's junk. Told with a cool dispassion through a mist of hints, intuitions, and guesses, Kathy's memories gradually lift the veil on a horrifying reality: These children were cloned, created solely to become organ donors. Once they leave Hailsham (with its Dickensian reverberations of Havisham, that ghostly abuser of children) they will become "caregivers," then "donors," and if they live to make their "fourth donation," will "complete." The coded language that Kathy has learned to describe her fate flattens the unthinkable and renders it almost ordinary, simply what is, so bloodlessly that it heightens our sense of astonishment. What makes these doomed clones so odd is that they never try to escape their fate. Almost passive, they move in a fog of self-reinforced ignorance, resigned to the deadly destiny for which they have been created. However, in a dramatic scene near the end of the novel, Kathy and Tommy do try to discover, from one of the high-minded ladies who designed Hailsham, if a temporary "deferral" is possible. It is too late for any of them now, the woman finally divulges. Once the clones were created, years ago during a time of rapid scientific breakthroughs, their donations became the necessary means of curing previously incurable conditions. 
Society has become dependent on them. Now there is no turning back. The only way people can accept the program is to believe that these children are not fully human. Although "there were arguments" when the program began, she tells them, people's primary concern now is that their own family members not die from cancer, heart disease, diabetes, or motor-neuron diseases. People outside prefer to believe that the transplanted organs come from nowhere, or at least from beings less than human. Readers of Ishiguro's fiction will recognize his mastery in creating characters psychologically maimed by an eerie atrocity. From his debut novel, A Pale View of Hills (Putnam, 1982), Ishiguro's approach to horror has been oblique, restrained, and enigmatic. The war-ravaged widow from Nagasaki in that work presages the repressed English butler of The Remains of the Day (Random House, 1990) and Kathy herself, all long-suffering victims with wasted lives whose sense of obligation robs them of happiness. Their emotions reined in, their sight obscured, they are subject to wistful landscapes, long journeys, and a feeling of being far from the possibility of home and belonging. Never Let Me Go, however, ventures onto new terrain for Ishiguro by situating itself within current controversies about scientific research. Taking on some of the moral arguments about genetic engineering, the novel inevitably calls into question whether such fiction adds to the debates or clouds them -- and whether serious fiction about bioethics is enriched by the currency of its topic or hampered by it. Here Ishiguro's novel joins company with others that are centered in contemporary bioethics issues and might be considered a genre of their own. A decade ago, Doris Betts penetrated the intricate emotions around living donors' organ transplantation in her exquisitely rendered Souls Raised From the Dead. The novel offered a human dimension and nuanced depth to this area of medical-ethics deliberations, which were making headline news. In Betts's story, a dying young daughter needs as close a match as possible for a new kidney. Her parents face complexities and contradictions behind informed consent and true autonomy that are far more subtle, wrenching, and real than any medical document or philosophy-journal article can render. Betts does justice to the medical and moral questions surrounding decisions that physicians, patients, and families must make regarding potential organ donations. What makes the book so compelling, though, is its focus on the various and often divergent emotional strategies that parents and children use to cope with fear, sacrifice, and impending loss. The 13-year-old Mary Grace, her parents, and grandparents reveal themselves as fully rounded, noninterchangeable human beings who come to their decisions and moral understandings over time, within their own unique personal histories and relationships with each other. As the therapeutic possibilities of transplant surgery were breaking new ground in hospitals across the country, surgeons, families, and hospital ethics committees grappled with dilemmas about how to make good choices between the medical dictum to "do no harm" and the ethical responsibility to honor patients' sovereignty over their own bodies. Betts's novel captured the difficulty of doing the right thing for families enduring often inexpressible suffering: How much sacrifice can we expect of one family member to save another? 
The ethical complexities regarding organ donations, and particularly the dilemmas associated with decisions to conceive children as donors, are escalating. Four years ago The New York Times reported on two families who each conceived a child to save the life of another one. Fanconi anemia causes bone-marrow failure and eventually leukemia and other kinds of cancer. Children born with the disease rarely live past early childhood. Their best chance of survival comes from a bone-marrow transplant from a perfectly matched sibling. Many Fanconi parents have conceived another child in the hope that luck would give them an ideal genetic match. These two couples, however, became the first to use new reproductive technologies to select from embryos resulting from in vitro fertilization, so they could be certain that this second baby would be a perfect match. When the article appeared in the Times, many people wondered if it is wrong to create a child for "spare parts." News reports conjured up fears of "Frankenstein medicine." State and federal legislatures threatened laws to ban research using embryos. A fictional version of this dilemma appears in Jodi Picoult's novel My Sister's Keeper. Picoult, a novelist drawn to such charged topics as teen suicide and statutory rape, takes up this bioethics narrative of parents desperate to save a sick child through the promise of genetic engineering. Conceived in that way, Anna Fitzgerald has served since her birth as the perfectly matched donor for her sister, Kate, who has leukemia, supplying stem cells, bone marrow, and blood whenever needed. Now, though, as her sister's organs begin to fail, the feisty Anna balks when she is expected to donate a kidney. Through alternating points of view, Picoult exposes the family's moral, emotional, and legal dilemmas, asking if it can be right to use -- and perhaps sacrifice -- one child to save the life of another. The story draws the reader in with its interesting premise -- one sister's vital needs pitted against the other's -- but ultimately disintegrates within a melodramatic plot that strands its underdeveloped characters. Why is the girls' mother so blind and deaf to Anna's misgivings about her role as donor? How can we possibly believe the contrived ending, which circumvents the parents' need to make a difficult moral choice? Ultimately the novel trivializes what deserves to be portrayed as a profoundly painful Sophie's choice, using the contentious bioethics issue as grist for a kind of formulaic writing. While authors like Betts and Picoult have examined ethical dilemmas of the new science in a style that might be called realistic family drama, others lean toward science fiction, imagining dystopian futures that are chillingly based on the present. Often prescient, they reflect our unarticulated fears, mirroring our rising anxiety about where we are going and who we are becoming. In addressing concerns about cloning, artificial reproduction, and organ donation, these novels join an even broader, older genre, the dystopian novels of the biological revolution. In 1987 Walker Percy published The Thanatos Syndrome, a scathing fictional exploration of what the then-new psychotropic drugs might mean to our understanding of being human. In this last and darkest novel by the physician-writer, the psychiatrist Tom More stumbles on a scheme to improve human behavior by adding heavy sodium to the water supply. 
After all, the schemers argue, what fluoride has done for oral hygiene, we might do for crime, disease, depression, and poor memory! More is intrigued but ultimately aghast at the consequences: humans reduced to lusty apes with no discernible soul or even self-consciousness. Percy cleverly captures many of our qualms about such enhancement therapies in a fast-paced plot that reads like a thriller. Many readers, however, feel that this sixth and final novel is the least compelling of Percy's oeuvre, emphasizing his moral outrage over the excesses of science at the expense of a protagonist's spiritual and emotional journey that had previously been the hallmark of his highly acclaimed fiction. With less dark humor but equal verve, Margaret Atwood's Oryx and Crake chronicles the creation of a would-be paradise shaped and then obliterated by genetic manipulation. Echoes of her earlier best seller The Handmaid's Tale (Houghton Mifflin, 1986) reverberate through this postapocalyptic world set in an indeterminate future, where Snowman, the proverbial last man alive, describes how the primal landscape came to be after the evisceration of bioengineering gone awry. A modern-day Robinson Crusoe, Snowman is marooned on a parched beach, stranded between the polluted water and a chemical wasteland that has been stripped of humankind by a virulent plague. Once he melts away, even the vague memories of what was will have disappeared. As in other works of science fiction, while its plot complications drive the narrative, its powerful conceptual framework dominates the stage. For all it lacks in character complexity and realistic psychological motivations, this 17th book of Atwood's fiction has a captivating Swiftian moral energy, announced in the opening quotation, from Gulliver's Travels: "My principal design was to inform you, and not to amuse you." Readers, however, might wish that Atwood had made a stronger effort to amuse us. Her ability to sustain our interest is challenged by the story's unremitting bleakness and the lack of real moral depth to its few characters. Even with its weaknesses, Atwood's is a powerful cautionary tale, similar in some ways to Caryl Churchill's inventive play A Number (2002). That drama is constructed of a series of dialogues in which a son confronts his father (named Salter) with the news that he is one of "a number" of clones (all named Bernard). Years ago, grieving the death of his wife, Salter was left to raise a difficult son, lost to him in some deep way, whom he finally put into "care." Sometime later, wanting a replacement for the lost son, he had the boy cloned. Without his knowledge, 19 others were created, too. Now Salter hears not only the emotional pain and anger of his original troubled son, but also the harrowing psychological struggles of several of the cloned Bernards. Salter responds with a mix of anguish and resignation as he faces the consequences of decisions he once made without much thought. 
When Salter says to one of the cloned Bernards, "What they've done they've damaged your uniqueness, weakened your identity," it is difficult to believe that they were ever capable of possessing either. Although Churchill's nightmare may seem especially odd, her tale of violence, deception, and loss resonates with those of Betts, Ishiguro, and Picoult. What if you might lose your child? If the means were available, would you take any chance, do anything, to save her? Or, if lost to you, to bring him back? All of these stories have in common their underlying questions about where bioengineering is leading us, what kinds of choices it asks us to make, and where the true costs and benefits lie. What makes the stories different from other forms of ethical inquiry is their narrative form, their way of knowing as literature. John Gardner reminds us that novels are a form of moral laboratory. In the pages of well-written fiction, we explore the way a unique human being in a certain set of circumstance makes moral decisions and lives out their consequences. Some of the novels being written now offer valuable cautionary tales about what is at stake in our current forays into new science and technology, asking us, as Ishiguro does in Never Let Me Go, What is immutable? What endures? What is essential about being human? Where does the essential core of identity lie? Does it derive from nature or nurture, from our environment or genetics? But the best go further. As Ishiguro's does, they take the bioethics issue as a fundamental moral challenge. Instead of using an aspect of bioethics as an engine to drive the plot, some authors succeed in using it as a prism that shines new light onto timeless questions about what it means to be fully human. At its heart, Ishiguro's tale has very little to do with the specific current controversies over cloning or genetic engineering or organ transplantation, any more than The Remains of the Day has to do with butlering or A Pale View of Hills has to do with surviving the atomic bomb. By the end of the novel, we discover that Never Let Me Go is, if cautionary, also subtler and more subversive than we suspected. Tommy and Ruth are already gone, and Kathy herself is ready to begin the "donations" that will lead to her own "completion." During one of her long road trips, she stops the car for "the only indulgent thing" she's ever done in a life defined by duty and "what we're supposed to be doing." Looking out over an empty plowed field, just this once she allows herself to feel an inkling of what she's lost and all she will never have. At this moment, we realize ourselves in Kathy, and we see her foreshortened and stunted life as not so very different from our own. The biological revolution's greatest surprise of all may be that its dilemmas are not really new. Instead, it may simply deepen the ones we've always faced about how to find meaning in our own lives and the lives of others. Martha Montello is an associate professor in the department of history and philosophy of medicine and director of the Writing Resource Center in the School of Medicine at the University of Kansas. She also lectures on literature and ethics at the Harvard-MIT Division of Health Sciences & Technology, and co-edited Stories Matter: The Role of Narrative in Medical Ethics (Routledge, 2002). 
WORKS DISCUSSED IN THIS ESSAY My Sister's Keeper, by Jodi Picoult (Atria, 2004) Never Let Me Go, by Kazuo Ishiguro (Knopf, 2005) A Number, by Caryl Churchill (a 2002 play published by Theatre Communications Group in 2003) Oryx and Crake, by Margaret Atwood (Nan A. Talese, 2003) Souls Raised From the Dead, by Doris Betts (Knopf, 1994) The Thanatos Syndrome, by Walker Percy (Farrar, Straus and Giroux, 1987) From checker at panix.com Mon May 9 21:05:22 2005 From: checker at panix.com (Premise Checker) Date: Mon, 9 May 2005 17:05:22 -0400 (EDT) Subject: [Paleopsych] CHE: 'The Internet and the Madonna: Religious Visionary Experience on the Web' Message-ID: 'The Internet and the Madonna: Religious Visionary Experience on the Web' The Chronicle of Higher Education, 5.5.13 http://chronicle.com/weekly/v51/i36/36a01501.htm By NINA C. AYOUB Minutes after the death of John Paul II, church officials sent a mass e-mail message to the news media alerting them. The high-tech missive was fitting for a pope who had embraced the Internet. But while the Vatican is now virtual, the Catholic devout are even more "wired," especially in regard to Marian apparitions. As Paolo Apolito notes in The Internet and the Madonna: Religious Visionary Experience on the Web (University of Chicago Press), there are just a handful of church-approved sightings of the Virgin Mary. However, beyond Lourdes, Fatima, and other recognized visionary sites, there have been a string of claimed appearances that have gained renown. Perhaps the most famous is Medjugorje, a village in Bosnia, where six children said they saw Mary on a hillside. While pre-World Wide Web, the Medjugorje sighting in 1981 was the first to take rapid advantage of a globalized media, says the scholar, an Italian anthropologist at the University of Salerno and the University of Rome III. Within a decade, some 10 million pilgrims had descended on the site. Medjugorje became the reference point for something of an apparition boom, Mr. Apolito says, creating also a "mechanism of reciprocal confirmation," kind of an "I'll consider your apparition if you'll consider mine." Today, he writes, the world of visionaries blends a neo-Baroque belief in miracles and wonders with elaborate use of the latest technologies. Yet the technical is moving the epiphanic person to the periphery: "Technology, by allowing and legitimizing every form of the extraordinary, winds up imposing the wonder of itself. ... " To demonstrate some of that usurping, Mr. Apolito turns first to photography and the urge to document. For example, a visionary communing with Mary on a hillside may also have to compete with the whirr of hundreds of cameras, each trying to capture an apparition or at least a dancing sun. There are also perils for the worshipful Web surfer, says Mr. Apolito. Even on Web rings of sites devoted to the Virgin, it can be a short trip from the devout to the debauched. Four clicks, he found, as he followed links via generic site banners. There are also parody sites such as the Miraculous Winking Jesus as well as sites that use apparition-talk to express violent anti-Catholicism. But beyond porn, parody, and bigotry, the horizontal landscape of the Web risks something else: disconnect. The scholar, who did fieldwork at Oliveto Citra, the Italian site of a claimed apparition, explores what gets lost when a visionary experience loses the context of local religious culture and is fragmented in the ether. References 3. http://www.amazon.com/exec/obidos/ASIN/0226021505/thechroniclofhig 4. 
http://bn.bfast.com/booklink/click?sourceid=18526&ISBN=0226021505 5. http://www.powells.com/cgi-bin/biblio?isbn=0226021505 From checker at panix.com Mon May 9 21:05:51 2005 From: checker at panix.com (Premise Checker) Date: Mon, 9 May 2005 17:05:51 -0400 (EDT) Subject: [Paleopsych] Telegraph: What's 'national' about national arts organisations? Message-ID: What's 'national' about national arts organisations? http://www.telegraph.co.uk/arts/main.jhtml?xml=/arts/2005/05/07/baoh07.xml&sSheet=/arts/2005/05/07/ixartleft.html (Filed: 07/05/2005) Andrew O'Hagan investigates What is the purpose of a national theatre, a national opera or ballet company, a national orchestra, or a national gallery? What is the meaning of the word "national" in those famous organisations? Is it simply a matter of pride and funding, an indication that those particular institutions have the backing of an entire nation, its hopes and dreams of excellence? Or is it more complicated than that: do we expect these arts organisations, above all others, to embody in their work something essential about the nation? Should the Welsh National Opera, for instance, seek to capture a vision of international musical quality, or a vision of what it really means to be Welsh - or both? In 1899, WB Yeats, Augusta Gregory and Edward Martyn, the founders of the Irish National Theatre, declared that the job of the new theatre was "to bring upon the stage the deeper thoughts and emotions of Ireland". This was almost 20 years before Ireland's war of independence, but the Abbey, the theatre that grew out of their declaration, would provide the platform and the occasion for many of the great debates about freedom, responsibility, religion and modernity, debates that shaped the new nation and are still shaping it today. In Ireland, a taxpayer-funded conversation is seen to exist between art and the state, a conversation whose difficulties are part of its richness. This was true in the Czech Republic (which ended up with a playwright for a president); it was true in Spain after the death of Franco, which invested in the arts as a way of opening up freedom of expression; it was true in parts of Australia, where national museums began to blush at the idea of excluding aboriginal art; and it is true in post-war Germany and post-glasnost Russia, where national cultural institutions have allowed not just a conversation but a means of national cleansing about the repressions and horrors of the past. In each of those places, national art institutions played a part in making life new. What about Britain? Do we have reason to believe that cultural institutions bearing the word "national" or "royal" or "British" or "English" or "Scottish" or "Welsh" are engaging us in questions about who we are or who we are becoming? To some people's minds, such an effort would be spurious in the extreme. To them, the purpose of the Royal Opera House is to furnish a version of, say, Das Rheingold which fulfils the virtues of the work and stands up well to international standards. These are the things one can rely on a national opera company to do. The task bears some comparison with football in its modern form. A club such as Celtic has many international players, it is super-funded and super-commercial, super-branded, it can stand up to international competition on the field, yet, one might ask, what has any of this got to do with Glasgow? Does the team have anything to do with Glasgow? 
What relationship does the corporate image bear to the traditions that made the team and the community that supports it? Like the few crown jewels or the odd Stone of Scone, national arts companies are often, I feel, adornments that nations want to have in order to seem more like nations, but which can't bear the self-questioning that should come with a truly alive national company. What is achieved, for example, by the excellent Scottish Ballet being called Scottish Ballet instead of the Ashley Page Dance Company, which is more descriptive of who they are? The company is based in Glasgow, as it has been since 1969 when Peter Darrell took his Western Ballet Theatre there from Bristol. The company is not Scottish in its bones, so why should it matter that the company hangs on to the national title? It seems to matter, though. People want to believe that their national arts organisations speak volumes about the civilised nature of the country they live in or come from, the country whose name the company bears. It is like a highest form of cultural branding: your country is a logo, the ultimate stamp of quality. You see how this has been taken to extremes in America, where the use of that word, "America", immediately seems to confer on what follows an almost unutterable level of power and prestige. Patriotism has, in other words, taken over the meaning of the word "national": we use it to denote a settled imperial excellence, not the situating of a higher conversation between the arts and the state. That conversation exists, of course, in the streets, in the newspapers, but is not advertised by national companies as part of what they do. Perhaps the concentration on building "partnerships" and sponsorships has represented a form of privatisation of British culture by stealth; none of the cultural boffins I spoke to this week could quite define what it was that made the British Museum "British" or the National Gallery "National": they spoke of British values, but couldn't really say how these affected the life of the institutions. (It's interesting that the same people have no hesitation when asked the same question about the BBC.) My own guess is that the National Gallery of Art in Washington, for all its Van Goghs and Matisses, tells a different story from the National Gallery in London, for all its Van Goghs and Matisses. They each tell a particular story, but we might ask for much more of the particularities. We might say, in the spirit of the Irish, what does this material and the manner of its housing have to do with us? These days, we may mean less than we think when we speak of "national" this and that. I mean, would it be problematic if English National Ballet, who are near bankruptcy but are otherwise a good and fully-functioning company with no very well-defined home, were to become, as has been suggested, the National Ballet of Wales? It's rather like the situation at the founding of Scottish Ballet, except that, this time, many people are hitting the roof at the idea that the Welsh National Ballet Company might suffer from being, well, a bit un-Welsh. But why shouldn't a newly-designated ENB be good at getting into the intellectual scrum of Wales's modern make-up? Peter Darrell, who founded Scottish Ballet, was born in Surrey, and that didn't stop him dealing with the wealth of Scotland's folk heritage in his ballets. 
In fact, as in most areas of national life, a bit of outsiderism can sharpen the instincts, so long as they don't assume, like most modern footballers, that one piece of ground is the same as another. Each piece of ground is different, and the arts should be an ongoing investigation of that difference, an attempt to beautify and enrich the native gardens without becoming too conscious of the fences that surround them. From checker at panix.com Mon May 9 21:05:42 2005 From: checker at panix.com (Premise Checker) Date: Mon, 9 May 2005 17:05:42 -0400 (EDT) Subject: [Paleopsych] Science Daily: Moderate Alcohol Consumption Enhances The Formation Of New Nerve Cells Message-ID: Moderate Alcohol Consumption Enhances The Formation Of New Nerve Cells http://www.sciencedaily.com/releases/2005/05/050508211456.htm Moderate alcohol consumption over a relatively long period of time can enhance the formation of new nerve cells in the adult brain. The new cells could prove important in the development of alcohol dependency and other long-term effects of alcohol on the brain. The findings are published by Karolinska Institutet. The study, which was carried out on mice, examined alcohol consumption corresponding to that found in normal social situations. The results show that moderate drinking enhances the formation of new cells in the adult brain. The cells survive and develop into nerve cells in the normal manner. No increase in neuronal atrophy, however, could be demonstrated. It is generally accepted these days that new nerve cells are continually being formed in the adult brain. One suggestion is that these new neurons could be important for memory and learning. The number of new cells formed is governed by a number of factors such as stress, depression, physical activity and antidepressants. "We believe that the increased production of new nerve cells during moderate alcohol consumption can be important for the development of alcohol addiction and other long-term effects of alcohol on the brain," says associate professor Stefan Brené. "It is also possible that it is the ataractic effect of moderate alcohol consumption that leads to the formation of new brain cells, much in the same way as with antidepressive drugs." The researchers are now following up these exciting findings to understand the role that the new nerve cells thus formed play in cerebral activity. Publication: Moderate ethanol consumption increases hippocampal cell proliferation and neurogenesis in the adult mouse Aberg E, Hofstetter C, Olson L, Brené S. Int J Neuropsychopharm Online May 21, 2005, see http://journals.cambridge.org From checker at panix.com Mon May 9 21:06:13 2005 From: checker at panix.com (Premise Checker) Date: Mon, 9 May 2005 17:06:13 -0400 (EDT) Subject: [Paleopsych] BBC: Lords back 'designer baby' choice Message-ID: Lords back 'designer baby' choice http://newsvote.bbc.co.uk/mpapps/pagetools/print/news.bbc.co.uk/1/hi/health/4492345.stm 5.4.28 The creation of "designer babies" to treat sick siblings is lawful, the Law Lords have ruled, upholding an earlier court decision. The case centred on six-year-old Zain Hashmi, whose parents wanted a baby with a specific tissue type to help treat his debilitating blood disorder. His parents had begun treatment to create a baby, but have so far failed. Campaigners had asked the Lords to overturn the appeal court's 2003 ruling that allowed the couple to proceed.
The group Comment on Reproductive Ethics (Core) asked the House of Lords to examine the Human Fertilisation and Embryology Act 1990 and to decide whether tissue-typing of the sort used by the Hashmis was legal. On Thursday, five Law Lords ruled unanimously that the practice of such tissue typing could be authorised by the Human Fertilisation and Embryology Authority (HFEA). The ruling, saying that HFEA was acting lawfully and appropriately in considering and granting a licence for pre-implantation tissue-typing, was welcomed by the authority. "We are pleased with the clarity that this ruling brings for patients," the HFEA said. Mrs Hashmi said the family appreciated the support they had received throughout the legal process. "It's nice to know that society has now embraced the technology to cure the sick and take away the pain. "We feel this ruling marks a new era and we are happy to move forwards now. We hope and pray that we get what we need for Zain." To stay alive Zain, who suffers from beta thalassaemia major, currently has to have blood transfusions every month, and drugs fed by a drip for 12 hours a day. Technology now allows doctors to select embryos with perfect tissue for a transplant operation. The High Court had imposed a ban on the treatment in December 2002 but this was overturned in the Court of Appeal. That decision allowed parents Raj and Shahana to go ahead with treatment to produce a sibling with the same tissue type as their son. In theory, this would have allowed them to take stem cells from the new baby's umbilical cord and transplant them into Zain. Tragically, however, Mrs Hashmi has had a series of miscarriages. The ruling "saddened" anti-abortion campaigners Life. "Today's decision from the House of Lords takes us further down the slippery slope in creating human beings to provide spare parts for another. "The best of ends, namely to cure a sick child, does not justify the means." From shovland at mindspring.com Wed May 11 00:12:54 2005 From: shovland at mindspring.com (Steve Hovland) Date: Tue, 10 May 2005 17:12:54 -0700 Subject: [Paleopsych] New research raises questions about buckyballs and the environment Message-ID: <01C55583.81CE12D0.shovland@mindspring.com> In a challenge to conventional wisdom, scientists have found that buckyballs dissolve in water and could have a negative impact on soil bacteria. The findings raise new questions about how the nanoparticles might behave in the environment and how they should be regulated, according to a report scheduled to appear in the June 1 print issue of the American Chemical Society's peer-reviewed journal Environmental Science & Technology. ACS is the world's largest scientific society. A buckyball is a soccer ball-shaped molecule made up of 60 carbon atoms. Also known as fullerenes, buckyballs have recently been touted for their potential applications in everything from drug delivery to energy transmission. Yet even as industrial-scale production of buckyballs approaches reality, little is known about how these nano-scale particles will impact the natural environment.
Recent studies have shown that buckyballs in low concentrations can affect biological systems such as human skin cells, but the new study is among the earliest to assess how buckyballs might behave when they come in contact with water in nature. Scientists have generally assumed that buckyballs will not dissolve in water, and therefore pose no imminent threat to most natural systems. "We haven't really thought of water as a vector for the movement of these types of materials," says Joseph Hughes, Ph.D., an environmental engineer at Georgia Tech and lead author of the study. But Hughes and his collaborators at Rice University in Texas have found that buckyballs combine into unusual nano-sized clumps - which they refer to as "nano-C60" - that are about 10 orders of magnitude more soluble in water than the individual carbon molecules. In this new experiment, they exposed nano-C60 to two types of common soil bacteria and found that the particles inhibited both the growth and respiration of the bacteria at very low concentrations - as little as 0.5 parts per million. "What we have found is that these C60 aggregates are pretty good antibacterial materials," Hughes says. "It may be possible to harness that for tremendously good applications, but it could also have impacts on ecosystem health." Scientists simply don't know enough to accurately predict what impact buckyballs will have on the environment or in living systems, which is exactly why research of this type needs to be done in the early stages of development, Hughes says. He suggests that his findings clearly illustrate the limitations of current guidelines for the handling and disposal of buckyballs, which are still based on the properties of bulk carbon black. "No one thinks that graphite and diamond are the same thing," Hughes says. They're both bulk carbon, but they are handled in completely different ways. The same should be true for buckyballs, according to Hughes. These particles are designed to have unique surface chemistries, and they exhibit unusual properties because they are at the nanometer scale - one billionth of a meter, the range where molecular interactions and quantum effects take place. It is precisely these characteristics that make them both so potentially useful and hazardous to biological systems. "I think we should expect them to behave differently than our current materials, which have been studied based on natural bulk forms," Hughes says. "Learning that C60 behaves differently than graphite should be no surprise." Overall, the toxicological studies that have been reported in recent years are a signal that the biological response to these materials needs to be considered. "That doesn't mean that we put a halt on nanotechnology," Hughes says. "Quite the opposite." "As information becomes available, we have to be ready to modify these regulations and best practices for safety," he continues. "If we're doing complementary studies that help to support this line of new materials and integrate those into human safety regulations, then the industry is going to be better off and the environment is going to be better off." The American Chemical Society is a nonprofit organization, chartered by the U.S. Congress, with an interdisciplinary membership of more than 158,000 chemists and chemical engineers. It publishes numerous scientific journals and databases, convenes major research conferences and provides educational, science policy and career programs in chemistry. 
Its main offices are in Washington, D.C., and Columbus, Ohio. From shovland at mindspring.com Wed May 11 13:51:53 2005 From: shovland at mindspring.com (Steve Hovland) Date: Wed, 11 May 2005 06:51:53 -0700 Subject: [Paleopsych] A Vision of Terror Message-ID: <01C555F5.EAC11CC0.shovland@mindspring.com> By John Gartner May 10, 2005 A new generation of software called Starlight 3.0, developed for the Department of Homeland Security by the Pacific Northwest National Laboratory (PNNL), can unravel the complex web of relationships between people, places, and events. And other new software can even provide answers to unasked questions. Anticipating terrorist activity requires continually decoding the meaning behind countless emails, Web pages, financial transactions, and other documents, according to Jim Thomas, director of the National Visualization and Analytics Center (NVAC) in Richland, Washington. Federal agencies participating in terrorism prevention monitor computer networks, wiretap phones, and scour public records and private financial transactions into massive data repositories. "We need technologies to deal with complex, conflicting, and sometimes deceptive information," says Thomas at NVAC, which was founded last year to detect and reduce the threats of terrorist attacks. In September 2005, NVAC, a division of the PNNL, will release its Starlight 3.0 visual analytics software, which graphically displays the relationships and interactions between documents containing text, images, audio, and video. The previous generation of software was not fully visual and contained separate modules for different functions. It has been redesigned with an enhanced graphical interface that allows intelligence personnel to analyze larger datasets interactively, discard unrelated content, and add new streams of data as they are received, according to John Risch, a chief scientist at Pacific Northwest National Laboratory. Starlight quadruples the number of documents that can be analyzed at one time -- from the previous 10,000 to 40,000 -- depending on the type of files. It also permits multiple visualizations to be opened simultaneously, which allows officers for the first time to analyze geospatial data within the program. According to Risch, a user will be able to see not only when but where and in what proximity to each other activities occurred. "For tracking terrorist networks, you can simultaneously bring in telephone intercepts, financial transactions, and other documents -- all into one place, which wasn't possible before," Risch says. The Windows-based program describes and stores data in the XML (extensible markup language) format and automatically converts data from other formats, such as databases and audio transcriptions. Risch says that as the volume of data being collected increases, the software has to be more efficient in visually representing the complex relationships between documents. "Starlight can show all the links found on a Web page, summarize the topics discussed on those pages and how they are connected [to the original page]." From shovland at mindspring.com Wed May 11 16:38:10 2005 From: shovland at mindspring.com (Steve Hovland) Date: Wed, 11 May 2005 09:38:10 -0700 Subject: [Paleopsych] Monoatomic Elements Message-ID: <01C5560D.25BF1310.shovland@mindspring.com> Monoatomic elements are nothing more than elements which are chemically isolated, i.e.
instead of, say, 60 atoms of Carbon or 34 atoms of Silicon being bound together in something called a Buckminsterfullerene or a knobbier version of the same. The significance lies in the fact that when a single element metal progresses from a normal metallic state to a monoatomic state, it passes through a series of chemically different states. These include:
- An alloy of numerous atoms of the same element, which exhibit all the characteristics normally associated with the metal: electrical conductivity, color, specific gravity, density, and so forth. The atom's intrinsic temperature might be room temperature.
- A combination of significantly fewer atoms of the same element, which no longer exhibit all of the characteristics normally associated with the metal. For example, the electrical conductivity or color might change. The atom's intrinsic temperature drops, for example, to 50 to 100 °K (or about two hundred degrees below zero °C).
- A Microcluster of far fewer atoms -- typically on the order of less than one hundred atoms, and as few as a dozen or so atoms. The metal characteristics begin to fall off one by one until the so-called metal is hardly recognized. The intrinsic temperature has now fallen to the range of 10 to 20 °K, only slightly above Absolute Zero.
- A Monoatomic form of the element -- in which each single atom is chemically inert and no longer possesses normal metallic characteristics; and in fact, may exhibit extraordinary properties. The atom's intrinsic temperature is now about 1 °K, or close enough to Absolute Zero that Superconductivity is a virtually automatic condition.
A case in point is Gold. Normally a yellow metal with a precise electrical conductivity and other metallic characteristics, the metallic nature of gold begins to change as the individual gold atoms form chemical combinations of increasingly small numbers. At a microcluster stage, there might be 13 atoms of gold in a single combination. Then, dramatically, at the monoatomic state, gold becomes a forest green color, with a distinctly different chemistry. Its electrical conductivity goes to zero even as its potential for Superconductivity becomes maximized. Monoatomic gold can exhibit substantial variations in weight, as if it were no longer fully extant in space-time. Other elements which have many of these same properties are the Precious Metals, which include Ruthenium, Rhodium, Palladium, Silver, Osmium, Iridium, Platinum, and Gold. All of these elements have, to a greater or lesser degree, the same progression as gold does in continuously reducing the number of atoms chemically connected. Many of these precious elements are found in the same ore deposits, and in their monoatomic form are often referred to as the White Powder of Gold. Monoatomic elements apparently exist in nature in abundance. Precious Metal ores are, however, not always assayed so as to identify them as such. Gold miners, for example, have found what they termed "ghost gold" -- "stuff" that has the same chemistries as gold, but which was not yellow, did not exhibit normal electrical conductivity, and was not identifiable with ordinary emission spectroscopy. Thus they were more trouble than they were worth, and generally discounted. However, in a technique called "fractional vaporization", the monoatomic elements can be found and clearly identified via a more advanced emission spectroscopy.
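[Aside, not from the article: the identification principle invoked here -- an ionized sample emits light at frequencies characteristic of each element, which are then matched against catalogued emission lines -- can be sketched in a few lines of Python. This is a minimal, generic illustration only; the wavelengths are standard textbook values for hydrogen and sodium, used purely because they are widely known, and the tolerance and the "measured" peaks are hypothetical.]

# Minimal sketch of identifying elements from emission-line wavelengths.
# Catalogue holds textbook lines for hydrogen (Balmer series) and sodium (D doublet);
# a real instrument would use a far larger line database plus calibration.
KNOWN_LINES_NM = {
    "hydrogen": [656.3, 486.1, 434.0],   # H-alpha, H-beta, H-gamma
    "sodium": [589.0, 589.6],            # Na D lines
}

def identify(measured_peaks_nm, tolerance_nm=0.5):
    """Return which catalogued elements match each measured peak within a tolerance."""
    hits = {}
    for peak in measured_peaks_nm:
        for element, lines in KNOWN_LINES_NM.items():
            if any(abs(peak - line) <= tolerance_nm for line in lines):
                hits.setdefault(element, []).append(peak)
    return hits

# Hypothetical readout: two peaks near 589 nm and one near 656 nm.
print(identify([589.1, 589.5, 656.2]))
# -> {'sodium': [589.1, 589.5], 'hydrogen': [656.2]}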
This fact was first discussed by David Radius Hudson, who was attempting to separate gold and silver from raw ore -- but was hindered by the ghost gold which had no apparent intrinsic value. The process involved placing a sample on a standard carbon electrode, running a second carbon electrode down to a position just above the first, and then striking a Direct Current arc across the electrodes. The electrical intensity of the arc would ionize the elements in the sample such that each of the elements would give off specific, identifying frequencies of light. By measuring the specific frequencies of light (the spectrum of the element or elements), one could then identify which elements were in the sample. Typically, such spectroscopic analysis involves striking the arc for 10 to 15 seconds, at the end of which the carbon electrodes are effectively burned away. According to the majority of American spectroscopists, any sample can be ionized and read within those 15 seconds. In the advanced technique, the carbon electrodes are sheathed with an inert gas (such as Argon). This allows the emission spectroscopy process to be continued far beyond the typical 15 seconds, in order to fully identify all of the elements in their various forms. When this was done, in the first seconds, the ghost gold might be identified as iron, silicon, and aluminum. But as the process continued for as long as 300 seconds, palladium began to be read at about 90 seconds, platinum at 110 seconds, ruthenium at 130 seconds, rhodium at 145 seconds, iridium at 190 seconds, and osmium at 220 seconds. These latter readings were the monoatomic elements. Commercially available grades of these metals were found to include only about 15% of the emission spectroscopic readings. The mining activity of what is considered the best deposit in the world for six of these elements (Pd, Pt, Os, Ru, Ir, and Rh) yields one-third of one ounce of all these precious metals per ton of ore. But this is based on the standard spectroscopic analysis. When the burn is continued for up to 300 seconds, the same ores might easily yield emission lines suggesting: 6 to 8 ounces of palladium, 12 to 13 ounces of platinum, 150 ounces of osmium, 250 ounces of ruthenium, 600 ounces of iridium, and 1200 ounces of rhodium! Over 2200 ounces per ton, instead of 1/3 of 1 ounce per ton! [Keep in mind that rhodium typically sells for $3,000/ounce, while gold sells for $300/ounce!] The distinguishing characteristic between the first and second readings of the emission spectroscopy for the precious metals is that all of them come in two basic forms. The first is the traditional form of metals: yellow Gold, for example.
If such a law constituted reality, then a necessary condition for monoatomic elements to even exist would require them to be superconductive, just in order to link them through all distance and time to other superconducting monoatomic elements. This would be necessary in order to prevent separation. The question is whether separation is but the Ultimate Illusion? From waluk at earthlink.net Wed May 11 19:13:56 2005 From: waluk at earthlink.net (G. Reinhart-Waller) Date: Wed, 11 May 2005 12:13:56 -0700 Subject: [Paleopsych] Monoatomic Elements In-Reply-To: <01C5560D.25BF1310.shovland@mindspring.com> References: <01C5560D.25BF1310.shovland@mindspring.com> Message-ID: <42825974.1010401@earthlink.net> Steve Hovland wrote: >>A key to understanding monoatomic elements is to recognize that the monoatomic state results in a rearrangement of the electronic and nuclear orbits within the atom itself. This is the derivation of the term: Orbitally-Rearranged Monoatomic Element (ORME ). A monoatomic state implies a situation where an atom is "free from the influence of other atoms." Is this, perhaps, a violation of some very basic, absolutely fundamental law of the universe -- which says that nothing is separate? If such a law constituted reality, then a necessary condition for monoatomic elements to even exist would require them to be superconductive, just in order to link them through all distance and time to other superconducting monoatomic elements. This would be necessary in order to prevent separation. The question is whether separation is but the Ultimate Illusion?>> Hi Steve, this is absolutely fascinating. Could you possibly elaborate on your last sentence....IOW, are you saying that separation doesn't actually occur and is only virtual? Regards, Gerry Reinhart-Waller -------------- next part -------------- An HTML attachment was scrubbed... URL: From checker at panix.com Thu May 12 00:38:26 2005 From: checker at panix.com (Premise Checker) Date: Wed, 11 May 2005 20:38:26 -0400 (EDT) Subject: [Paleopsych] NYT Op-Ed: The Tipping Point Message-ID: The Tipping Point http://www.nytimes.com/2005/05/11/opinion/11board.html By BELINDA BOARD London JOHN BOLTON, President Bush's nominee to be ambassador to the United Nations, has been described as dogmatic, abusive to his subordinates and a bully. Yet Mr. Bush has said that John Bolton is the right man at the right time. Can these seemingly contradictory statements both be accurate? Yes. The reality is that sometimes the characteristics that make someone successful in business or government can render them unpleasant personally. What's more astonishing is that those characteristics when exaggerated are the same ones often found in criminals. There has been anecdotal and case-study evidence suggesting that successful business executives share personality characteristics with psychopaths. The question is, are the characteristics that make up personality disorders fundamentally different from the characteristics of extreme personalities we see in everyday life, or do they differ only in degree? In 2001, I compared the personality traits of 39 high-ranking business executives in Britain with psychiatric patients and criminals with a history of mental health problems. The business managers completed a standard clinical personality-disorder diagnostic questionnaire and then were interviewed. The information on personality disorders among criminals and psychiatric patients had been gathered by local clinics. 
Our sample was small, but the results were definitive. If personality and its pathology are distinct from each other, we should have found different levels of personality disorders in these diverse populations. We didn't. The character disorders of the business managers blended together with those of the criminals and mental patients. In fact, the business population was as likely as the prison and psychiatric populations to demonstrate the traits associated with narcissistic personality disorder: grandiosity, lack of empathy, exploitativeness and independence. They were also as likely to have traits associated with compulsive personality disorder: stubbornness, dictatorial tendencies, perfectionism and an excessive devotion to work. But there were some significant differences. The executives were significantly more likely to demonstrate characteristics associated with histrionic personality disorder, like superficial charm, insincerity, egocentricity and manipulativeness. They were also significantly less likely to demonstrate physical aggression, irresponsibility with work and finances, lack of remorse and impulsiveness. What does this tell us? It tells us that if reports of Mr. Bolton's behavior are accurate then both his supporters and critics could be right. It also tells us that characteristics of personality disorders can be found throughout society and are not just concentrated in psychiatric or prison hospitals. Each characteristic by itself isn't necessarily a bad thing. Take a basic characteristic like influence and it's an asset in business. Add to that a smattering of egocentricity, a soupçon of grandiosity, a smidgen of manipulativeness and lack of empathy, and you have someone who can climb the corporate ladder and stay on the right side of the law, but still be a horror to work with. Add a bit more of those characteristics plus lack of remorse and physical aggression, and you have someone who ends up behind bars. As we all know, public figures can exhibit extreme characteristics. Often it is these characteristics that have propelled them to prominence, yet these same behaviors can cause untold human wreckage. What's important is the degree to which a person has each ingredient or characteristic and in what configuration. Congress will try to decide whether Mr. Bolton has the right combination. Belinda Board is a clinical psychologist based at the University of Surrey and a consultant on organizational psychology. From checker at panix.com Thu May 12 00:38:37 2005 From: checker at panix.com (Premise Checker) Date: Wed, 11 May 2005 20:38:37 -0400 (EDT) Subject: [Paleopsych] NYT: AIDS Now Compels Africa to Challenge Widows' 'Cleansing' Message-ID: AIDS Now Compels Africa to Challenge Widows' 'Cleansing' http://www.nytimes.com/2005/05/11/international/africa/11malawi.html By SHARON LaFRANIERE MCHINJI, Malawi - In the hours after James Mbewe was laid to rest three years ago, in an unmarked grave not far from here, his 23-year-old wife, Fanny, neither mourned him nor accepted visits from sympathizers. Instead, she hid in his sister's hut, hoping that the rest of her in-laws would not find her.
I was so worried I would contract AIDS and die and leave my children to suffer." Here and in a number of nearby nations including Zambia and Kenya, a husband's funeral has long concluded with a final ritual: sex between the widow and one of her husband's relatives, to break the bond with his spirit and, it is said, save her and the rest of the village from insanity or disease. Widows have long tolerated it, and traditional leaders have endorsed it, as an unchallenged tradition of rural African life. Now AIDS is changing that. Political and tribal leaders are starting to speak out publicly against so-called sexual cleansing, condemning it as one reason H.I.V. has spread to 25 million sub-Saharan Africans, killing 2.3 million last year alone. They are being prodded by leaders of the region's fledging women's rights movement, who contend that lack of control over their sex lives is a major reason 6 in 10 of those infected in sub-Saharan Africa are women. But change is coming slowly, village by village, hut by hut. In a region where belief in witchcraft is widespread and many women are taught from childhood not to challenge tribal leaders or the prerogatives of men, the fear of flouting tradition often outweighs even the fear of AIDS. "It is very difficult to end something that was done for so long," said Monica Nsofu, a nurse and AIDS organizer in the Monze district in southern Zambia, about 200 miles south of the capital, Lusaka. "We learned this when we were born. People ask, Why should we change?" In Zambia, where one out of five adults is now infected with the virus, the National AIDS Council reported in 2000 that this practice was very common. Since then, President Levy Mwanawasa has declared that forcing new widows into sex or marriage with their husband's relatives should be discouraged, and the nation's tribal chiefs have decided not to enforce either tradition, their spokesman said. Still, a recent survey by Women and Law in Southern Africa found that in at least one-third of the country's provinces, sexual "cleansing" of widows persists, said Joyce MacMillan, who heads the organization's Zambian chapter. In some areas, the practice extends to men. Some Defy the Risk Even some Zambian volunteers who work to curb the spread of AIDS are reluctant to disavow the tradition. Paulina Bubala, a leader of a group of H.I.V.-positive residents near Monze, counsels schoolchildren on the dangers of AIDS. But in an interview, she said she was ambivalent about whether new widows should purify themselves by having sex with male relatives. Her husband died of what appeared to be AIDS-related symptoms in 1996. Soon after the funeral, both Ms. Bubala and her husband's second wife covered themselves in mud for three days. Then they each bathed, stripped naked with their dead husband's nephew and rubbed their bodies against his. Weeks later, she said, the village headman told them this cleansing ritual would not suffice. Even the stools they sat on would be considered unclean, he warned, unless they had sex with the nephew. "We felt humiliated," Ms. Bubala said, "but there was nothing we could do to resist, because we wanted to be clean in the land of the headman." The nephew died last year. Ms. Bubala said the cause was hunger, not AIDS. Her husband's second wife now suffers symptoms of AIDS and rarely leaves her hut. Ms. Bubala herself discovered she was infected in 2000. But even the risk of disease does not dent Ms. Bubala's belief in the need for the ritual's protective powers. 
"There is no way we are going to stop this practice," she said, "because we have seen a lot of men and women who have gone mad" after spouses died. Ms. Nsofu, the nurse and AIDS organizer, argues that it is less important to convince women like Ms. Bubala than the headmen and tribal leaders who are the custodians of tradition and gatekeepers to change. "We are telling them, 'If you continue this practice, you won't have any people left in your village,' " she said. She cites people, like herself, who have refused to be cleansed and yet seem perfectly sane. Sixteen years after her husband died, she argues, "I am still me." Ms. Nsofu said she suggested to tribal leaders that sexual cleansing most likely sprang not from fears about the vengeance of spirits, but from the lust of men who coveted their relatives' wives. She proposes substituting other rituals to protect against dead spirits, like chanting and jumping back and forth over the grave or over a cow. Headman Is a Firm Believer Like their counterparts in Zambia, Malawi's health authorities have spoken out against forcing widows into sex or marriage. But in the village of Ndanga, about 90 minutes from the nation's largest city, Blantyre, many remain unconvinced. Evance Joseph Fundi, Ndanga's 40-year-old headman, is courteous, quiet-spoken and a firm believer in upholding the tradition. While some widows sleep with male relatives, he said, others ask him to summon one of the several appointed village cleansers. In the native language of Chewa, those men are known as fisis or hyenas because they are supposed to operate in stealth and at night. Mr. Fundi said one of them died recently, probably of AIDS. Still, he said with a charming smile, "We can not abandon this because it has been for generations." Since 1953, Amos Machika Schisoni has served as the principal village cleanser. He is uncertain of his age and it is not easily guessed at. His hair is grizzled but his arms are sinewy and his legs muscled. His hut of mud bricks, set about 50 yards from a graveyard, is even more isolated than most in a village of far-flung huts separated by towering weeds and linked by dirt paths. What Tradition Dictates He and the headman like to joke about the sexual demands placed upon a cleanser like Mr. Schisoni, who already has three wives. He said tradition dictates that he sleep with the widow, then with each of his own wives, and then again with the widow, all in one night. Mr. Schisoni said that the previous headman chose him for his sexual prowess after he had impregnated three wives in quick succession. Now, Mr. Schisoni, said he continues his role out of duty more than pleasure. Uncleansed widows suffer swollen limbs and are not free to remarry, he said. "If we don't do it, the widow will develop the swelling syndrome, get diarrhea and die and her children will get sick and die," he said, sitting under an awning of drying tobacco leaves. "The women who do this do not die." His wives support his work, he said, because they like the income: a chicken for each cleansing session. He insisted that he cannot wear a condom because "this will provoke some other unknown spirit." He is equally adamant in refusing an H.I.V. test. "I have never done it and I don't intend to do it," he said. To protect himself, he said, he avoids widows who are clearly quite sick . Told that even widows who look perfectly healthy can transmit the virus, Mr. Schisoni shook his head. "I don't believe this," he said. 
At the traditional family council after James Mbewe was killed in a truck accident in August 2002, Fanny Mbewe's mother and brothers objected to a cleanser, saying the risk of AIDS was too great. But Ms. Mbewe's in-laws insisted, she said. If a villager so much as dreamed of her husband, they told her, the family would be blamed for allowing his spirit to haunt their community on the Malawi-Zambia border. Her husband's cousin, to whom she refers only as Loimbani, showed up at her hut at 9 o'clock at night after the burial. "I was hiding my private parts," she said in an interview in the office of Women's Voice, a Malawian human rights group. "You want to have a liking for a man to have sex, not to have someone force you. But I had no choice, knowing the whole village was against me." Loimbani, she said, was blasé. "He said: 'Why are you running away? You know this is our culture. If I want, I could even make you my second wife.'" He did not. He left her only with the fear that she will die of the virus and that her children, now 8 and 10, will become orphans. She said she is too fearful to take an H.I.V. test. "I wish such things would change," she said. From checker at panix.com Thu May 12 00:38:50 2005 From: checker at panix.com (Premise Checker) Date: Wed, 11 May 2005 20:38:50 -0400 (EDT) Subject: [Paleopsych] CHE: NIH Continues to Place a Low Priority on Research on Gender Differences Message-ID: NIH Continues to Place a Low Priority on Research on Gender Differences, Health-Advocacy Group Says News bulletin from the Chronicle of Higher Education, 5.5.11 http://chronicle.com/prm/daily/2005/05/2005051102n.htm [53]By SILLA BRUSH Washington Research on the biological and health differences between men and women remains a low priority at the National Institutes of Health, according to a report released on Tuesday by the Society for Women's Health Research, despite what the society says is increasing evidence of the importance of such research. The society, a Washington-based advocacy organization, says research on sexual differences is necessary in all types of biological studies. But from 2000 to 2003, only about 3 percent of all grants awarded by the NIH went to projects on the differences between men and women, according to the report, although there was nearly a 20-percent increase in the total number of NIH grants. "Given the growing body of literature on sex differences, external reports about NIH practices, and the NIH's internal efforts to promote this research, we had hoped to see higher and increasing levels of funding for this important area of research," Sherry A. Marts, the society's vice president for scientific affairs and an author of the report, said in a written statement. Donald M. Ralbovsky, a spokesman for the NIH, said the agency was reviewing the report. He declined to comment further. The National Institute on Alcohol Abuse and Alcoholism awarded 8 percent of its grants to such research, the highest level of any NIH center, according to the report. The centers that have the most money and support the most grants each year, such as the National Cancer Institute and the National Heart, Lung, and Blood Institute, however, ranked low in their support for studies on sexual and gender differences. From 2000 to 2003, the centers that financed the highest percentage of studies on sexual differences also cut back on their support.
To increase the level of support, the society recommends updating NIH guidelines to promote that type of research and issuing an NIH-wide public announcement inviting applications for the research. The full text of the report, "National Institutes of Health: Intramural and Extramural Support for Research on Sex Differences, 2000-2003," is available on the center's [71]Web site.
_________________________________________________________________
Background articles from The Chronicle:
* [73]Study Challenges View That Clinical Trials Have Focused on Men (5/11/2001)
* [74]More Research Needed on Women, Study Finds (5/19/2000)
* [75]Studies of Women's Health Produce a Wealth of Knowledge on the Biology of Gender Differences (6/25/1999)
References
53. mailto:silla.brush at chronicle.com
71. http://www.womenshealthresearch.org/press/CRISPreport.pdf
73. http://chronicle.com/weekly/v47/i35/35a01801.htm
74. http://chronicle.com/weekly/v46/i37/37a04403.htm
75. http://chronicle.com/weekly/v45/i42/42a01901.htm
E-mail me if you have problems getting the referenced articles. From shovland at mindspring.com Wed May 11 00:12:54 2005 From checker at panix.com Thu May 12 00:39:11 2005 From: checker at panix.com (Premise Checker) Date: Wed, 11 May 2005 20:39:11 -0400 (EDT) Subject: [Paleopsych] Joel Kotkin: Cities: Places Sacred, Safe, and Busy Message-ID: Joel Kotkin: Cities: Places Sacred, Safe, and Busy http://www.americancity.org/article.php?id_article=119 Humankind's greatest creation has always been its cities. They represent the ultimate handiwork of our imagination as a species, compressing and unleashing the creative urges of humanity. From the earliest beginnings, when only a tiny fraction of humans lived in cities, they have been the places that generated most of mankind's art, religion, culture, commerce, and technology. Although many often mistakenly see cities as largely a Western phenomenon, with one set of roots, urbanism has worn many different guises. Over the past five to seven millennia, cities have been built in virtually every part of the world from the highlands of Peru to the tip of southern Africa and the coasts of Australia. Some cities started as little more than overgrown villages that, over time, developed momentum and mass. Others have reflected the conscious vision of a high priest, ruler, or business elite, following a general plan to fulfill some greater divine, political, or economic purpose. The oldest permanent urban footprints are believed to be in Mesopotamia, the land between the Tigris and Euphrates River. From those roots sprang a plethora of metropolises that represent the founding experiences of the Western urban heritage, including Ur, Agade, Babylon, Nineveh, Memphis, Knossos, and Tyre. But many other cities sprang up largely independent of these early Mesopotamian and Mediterranean settlements. Some of these, such as Mohenjo-daro and Harappa in India and Chang'an in China, achieved a scale and complexity equal to any of their Western contemporaries. All of these cities, numerous and various, are however reflective of some greater universal human aspiration. The key to understanding that universal aspiration lies in the words of the Greek historian Herodotus. While traveling in the 5th century B.C. to places both thriving and struggling, he wrote, "For most of those which were great once are small today; and those that used to be small were great in my own time." Cities throughout history have risen and fallen.
The critical questions of Herodotus' time still remain: what makes cities great, and what leads to their gradual demise? I argue that three critical factors have determined the overall health of cities: the sacredness of place, the ability to provide security and project power, and the animating role of commerce. Where these factors are present, urban culture flourishes. When these elements weaken, cities dissipate and eventually recede out of history.
The Sacredness of Place
Religious structures -- temples, cathedrals, mosques, and pyramids -- have long dominated the landscape and imagination of great cities. These buildings suggested that the city was also a sacred place, connected directly to divine forces controlling the world. This was true not only in Mesopotamia, but also in the great capital cities of China, in Athens, in Rome, in the city-states of the Italian Renaissance, and among the far-flung urban centers of the classical Islamic world. In our own, much more secularly oriented time, the role of religion and of sacred place is often downgraded and even ignored -- likely at our own great peril, as evidenced by the downfall of overly secular cities from ancient Greece to the centers of Soviet society. Yet even the most secular of cities still seek to recreate the sense of sacred place through towering commercial buildings and evocative cultural structures. Such sights inspire a sense of civic patriotism or awe, albeit without the comforting suggestion of divine guidance. A striking landscape, historian Kevin Lynch once suggested, is the skeleton in which city dwellers construct their socially important myths.
The Need for Security
Cities must first and foremost be safe. Many contemporary urban areas, notably in western Europe, North America, and East Asia, have taken this precept for granted, but the threat posed by general disorder in many Third World cities and by Islamic terror around the globe may once again focus urbanites on the fundamental issue of security. An increased focus on safety would be in keeping with historic norms. Many cities, observed historian Henry Pirenne, first arose as places of refuge from marauding nomads, or from general lawlessness. When a city's