[Paleopsych] WSJ: (Kurzweil) Here It Comes

Premise Checker checker at panix.com
Fri Oct 7 00:56:08 UTC 2005

Here It Comes: Technology's progress will soon accelerate--exponentially. You 
have no idea how much. Ray Kurzweil does.

October 1, 2005; Page P8

The Singularity Is Near
By Ray Kurzweil
Viking, 652 pages, $29.95

The bearded fellow with a sign reading "The End Is Nigh" is a staple of 
editorial cartoons. The title-phrase of "The Singularity Is Near" is obviously 
meant to evoke that image, and for any reader who is slow to catch on, Ray
Kurzweil makes the allusion clear in Chapter 7 ("Ich bin ein Singularitarian") 
with an amusing photo of himself holding up a sign announcing the imminence of 
the Singularity. But Mr. Kurzweil's book is about beginnings, not endings.

The Singularity is a term coined by futurists to describe that point in time 
when technological progress has so transformed society that predictions made in 
the present day, already a hit-and-miss affair, are likely to be very, very 
wide of the mark. Much of Mr. Kurzweil's book consists of a closely argued 
analysis suggesting that the Singularity is, well, near: poised to appear in a 
mere three or four decades.

People's thoughts of the future tend to follow a linear extrapolation -- 
steadily more of the same, only better -- while most technological progress is 
exponential, happening by giant leaps and thus moving farther and faster than 
the mind can easily grasp. Mr. Kurzweil himself, thinking exponentially, 
imagines a plausible future, not so far away, with extended life-spans (living 
to 300 will not be unusual), vastly more powerful computers (imagine more 
computing power in a head-sized device than exists in all the human brains 
alive today), other miraculous machines (nanotechnology assemblers that can 
make most anything out of sunlight and dirt) and, thanks to these technologies, 
enormous increases in wealth (the average person will be capable of feats, like 
traveling in space, only available to nation-states today).
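The linear-versus-exponential contrast is easy to make concrete. Here is a minimal sketch in Python; the starting value, one-unit-per-year linear increment, and two-year doubling period are illustrative assumptions, not figures from the book or the review.

```python
# Compare a linear "steadily more of the same" projection with an
# exponential one over a 40-year horizon (roughly Kurzweil's "three
# or four decades"). All numbers are assumed for illustration.

def linear_projection(start, annual_increment, years):
    """Linear extrapolation: add a fixed amount each year."""
    return start + annual_increment * years

def exponential_projection(start, doubling_period_years, years):
    """Exponential extrapolation: double every fixed period."""
    return start * 2 ** (years / doubling_period_years)

start = 1.0   # arbitrary unit of technological capability
years = 40

linear = linear_projection(start, annual_increment=1.0, years=years)
exponential = exponential_projection(start, doubling_period_years=2, years=years)

print(f"linear after {years} years:      {linear:.0f}x")       # 41x
print(f"exponential after {years} years: {exponential:.0f}x")  # 1048576x
```

With these assumed parameters, forty years of linear growth yields a 41-fold gain, while twenty doublings yield a gain of over a million-fold, which is the gap between intuition and Kurzweil's projections.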

Naturally, Mr. Kurzweil has little time for techno-skeptics like the Nobel 
Prize-winning chemist Richard Smalley, who in September 2001 published a 
notorious piece in Scientific American debunking the claims of 
nanotechnologists, in particular the possibility of nano-robots (nanobots) 
capable of assembling molecules and substances to order. Mr. Kurzweil's 
arguments countering Dr. Smalley and his allies are a pleasure to read -- Mr. 
Kurzweil clearly thinks that nanobots are possible -- but in truth he is 
fighting a battle that is already won. These days skeptics worry that advanced 
technologies, far from failing to deliver on their promises, will deliver on 
them only too well -- ushering in a dystopia of, say, destructive 
self-replication in which the world is covered by nanobots that convert 
everything into copies of themselves (known in the trade as the "gray goo" 
problem). Mr. Kurzweil's sense of things isn't nearly so bleak as that -- he is 
an optimist, after all, an enthusiast for the techno-future -- but he does 
sound a surprisingly somber note.

Indeed, "The Singularity Is Near" is partly a cautionary tale. Having 
established that we're going to face a very different world in the second half 
of the 21st century -- and face it healthier, wealthier and more artificially 
intelligent if not precisely wiser -- Mr. Kurzweil concedes that so-called GNR 
technologies (genetics, nanotech and robotics) may present problems. We may 
find ourselves battling genetically enhanced super pathogens, deadly military 
nanobots and powerful "unfriendly" artificial intelligences scheming against 
those of us with mere natural intelligence. Though Mr. Kurzweil regards these 
threats as manageable, he does not minimize them and offers chilling scenarios 
of what could go wrong. These scenarios are all the more credible because they 
come from Mr. Kurzweil and not from one of the usual gang of scaremongering
Luddites.

Unlike the Luddites, Mr. Kurzweil argues that the best way of curbing 
technology's potential harm is ... more technology. He notes that to pull back 
on forward-looking research, or to abandon various machine-marvels, will only 
make things worse by driving research underground and into irresponsible hands. 
Instead we should start thinking now about how to safeguard society from 
technology gone wrong.

Mr. Kurzweil advocates prophylactic measures like the Asilomar guidelines for 
recombinant DNA research, which require special precautions for dangerous 
pathogens and restrict the most serious meddling. He also calls for much more 
research into antiviral drugs, rapid vaccines, defensive nanotech and 
artificial intelligence designed to remain friendly.

It is a persuasive plea, but will anyone listen in time? The political system 
tends to lag behind technological change, which is often a good thing. I 
remember attending a House subcommittee hearing in the 1980s on whether the 
U.S. should create a phone-computer system modeled on the state-funded French 
Minitel, a text-only network being promoted as the wave of the future. 
Fortunately, the Internet exploded -- making Minitel obsolete -- before 
Congress could fund such a project.

But when it comes to the dangers that Mr. Kurzweil worries about, the slow 
approach is a problem. The government is so often behind the curve -- think of 
how sluggishly it adapts to changes in employment patterns, shifts in 
international trade or outbreaks of new diseases like avian flu. What happens
when the curve is an exponential one?

Perhaps it won't matter. As Mr. Kurzweil notes, private entrepreneurs seem to 
have pushed back the threat of computer viruses, for instance, moving more 
rapidly and more effectively than any government agency ever could. And 
certainly free-marketeers reading Mr. Kurzweil's book will see opportunities 
for profit: Imagine the billions to be made from rapid vaccine-production 
technologies in a world where genetic engineering is common. But I would feel 
more comfortable if more people started following Mr. Kurzweil's advice in the 
near future, before the Singularity gets here.
