[ExI] Wall Street Journal on the Singularity Summit

Tyler Emerson tyleremerson at gmail.com
Sat Sep 22 19:14:25 UTC 2007


Feedback appreciated on this.

-Tyler

---------- Forwarded message ----------
From: The Singularity Institute Blog <tyleremerson at gmail.com>
Date: Sep 22, 2007 7:41 AM
Subject: The Singularity Institute Blog
To: tyleremerson at gmail.com

    The Singularity Institute Blog <http://www.singinst.org/blog>

 Summit Coverage in The Wall Street Journal Raises Questions
<http://www.singinst.org/blog/2007/09/21/summit-coverage-in-the-wall-street-journal-raises-questions/>

Posted: 21 Sep 2007 11:55 PM CDT

Earlier this week SIAI and the Singularity Summit got some major coverage in
The Wall Street Journal. Lee Gomes, the Portals columnist for The Journal,
attended the Summit and has some challenging thoughts about our movement
and its perceived relevance to the business community and the public at
large.

In his article, Gomes likens Singularitarians at times to 12-year-old sci-fi
addicts, alien worshipers, and even gynephobics (don't tell my 3 daughters).
While it is always fun to play "knock the nerds" in the popular press, I
think Gomes raises key issues that point out why we sometimes struggle for
credibility outside of our safety net in The Valley.

As we start to organize our thoughts about next year's Singularity Summit,
it is apparent that we need to focus more on bridging the knowledge and
perception gaps between the scientific community, the business and
investment community, and the public at large. Our success in crossing this
chasm over the next couple of years will dictate how successfully the
mission of the Singularity Institute will be embraced by broader segments of
humanity.

I'd like to open this discussion up to our community at large to get your
ideas and feedback. How do we stay true to the vision of Singularity
Institute, and at the same time create a partnership with the business
community that creates an exciting and positive perspective on what we can
accomplish? And how do we shake off some of the more adverse associations
with the lunatic fringe?

I look forward to your thoughts. I've posted Lee's article below. Leave a
comment to this post or contact me directly at lamis at singinst.org.

*Reprinted from The Wall Street Journal*

—————————————————————————

The Singular Question Of Human vs. Machine Has a Spiritual Side

The Wall Street Journal
PORTALS
By LEE GOMES

September 19, 2007; Page B1

You can tell a lot about people from what they worry about. A few Saturdays
ago, I spent the day in an auditorium full of fellow citizens concerned with
"singularity." The word refers to the day when the intelligence of computers
will exceed our own.

The auditorium was filled with people who listed many things that might
occur with singularity, such as a human-machine synthesis into a new,
superintelligent life-form. The date has been projected as anytime from nine
to 40 years hence.

Singularity-believers say humanity urgently needs to begin preparing for
this moment, if only to make sure that humans don't become kabobs at the
first post-singularity tailgate party held by supersmart computers. There is
even a Singularity Institute, bankrolled by Silicon Valley wealthoids.

The weekend session featured speeches, panel discussions and informal
chatting. About 800 people were on hand, more, frankly, than I would have
expected. Who but 12-year-old sci-fi addicts still fret over malevolent,
superintelligent machines? Most of us, living every day with computers,
appreciate that even the world's most powerful one is not only incapable of
an autonomous thought, it can't even distinguish spam from real email.

To get to the singularity that we are supposed to be preparing for, we are
going to need AGI, or Artificial General Intelligence, a topic the
singularists go on about endlessly.

A computer with AGI thinks and reasons the same way a human being does, only
much more quickly. But don't singularity people know that AI researchers
have been trying to make such machines since the 1950s, without much
success?

It turns out, there is a schism between the AGI and the AI worlds. The AGI
faction thinks AI researchers have sold out, abandoning their early dreams
of "general" intelligence to concentrate on more attainable (and more
lucrative) projects.

They're right. The machines today that recognize speech or play chess are
one-trick wonders. Of course, AI researchers defend that approach by saying
their early dreams of general intelligence were naïve.

The singularists, though, don't seem bothered by those earlier AI failures;
new approaches will bear fruit, they insist. They thus didn't think it a
waste of either time or carbon offsets to be gathering at a conference to
ask such questions as, "If you made a superintelligent robot, then forced it
to work only for you, would that be slavery?"

Robots are just computers with moving parts, of course, but the public is
still confused about them, just as it once was about computers themselves.
themselves. The Great Metallic Hope of the robotics industry, for example,
is currently a small, round vacuum cleaner that ambles across the floor by
itself.

A high-tech wonder? Actually, Consumer Reports said that even cheap vacuum
cleaners did better than the first model. A little more of this, and no one
will ever again worry about enslaving robots.

There is another way of thinking about the obsession with robots. John
Huntington, professor of English, University of Illinois, has studied the
genre and says sci-fi authors, especially the early ones who wrote about
robots or aliens, were working out their own unacknowledged anxieties about
closer-to-home topics.

Most commonly, he said, these anxieties involved women, who were seen as
becoming threatening as they gained social power. Racial and class tensions
also were involved, he added.

I have a supplemental theory: that the discussion of singularity involves a
sublimated spiritual yearning for some form of eternal life and an
all-powerful being, but one articulated by way of technical, secular
discourse.

As it happens, there is considerable overlap between the singularity and the
"life extension" communities. Ray Kurzweil, the best-known singularity
writer, also co-wrote a lengthy guide to life extension. He once told me he
expects literally to live forever — first by prolonging his life via a daily
regimen that includes hundreds of pills and the nonstop consumption of green
tea, then, once super-powerful computers arrive, by uploading his
consciousness into one.

Singularists also have an affinity for the Search for Extraterrestrial
Intelligence, or SETI, program, which scans the skies looking for other
civilizations. Isn't that a longing by some for an intergalactic messiah?

Then, consider a poem read at the singularity conference that described an
Aquarian Age scene in which humans and other mammals frolicked in a
"cybernetic meadow … all watched over by machines of loving grace." Those
computer protectors sound a lot like the guardian angels my grade-school
nuns told us about.

Years ago, a friend and I spent an evening with Arthur C. Clarke, the
creator in "2001" of HAL, the malevolent computer of every singularist's
nightmare. He brought along slides, showing himself with some astronauts and
with the authors of the musical "Hair."

We talked about science and had our picture taken, which I still have. It
proves that while I may have reached a different conclusion, at least I
studied with the master.


