[extropy-chat] Singularitarian versus singularity

The Avantguardian avantguardian2020 at yahoo.com
Wed Dec 21 18:23:09 UTC 2005



--- Samantha Atkins <sjatkins at mac.com> wrote:
> I am of the opinion that human level intelligence is
> increasingly  
> insufficient to address the problems we face,
> problems whose solution  
> is critical to our continuing viability.   Thus I
> see >human  
> intelligence as essential for the survival of
> humanity (or whatever  
> some or all of us choose to become when we have such
> choice).   Yes  
> the arrival of >human intelligence poses dangers. 
> But I think it is  
> also the only real chance we have.

How can you be so certain that super-human
intelligence (artificial or otherwise) is going to be
the "savior of mankind"? Especially when the people
who are super intelligent seem incapable of effecting
the necessary changes to "save mankind". Look at
Marilyn vos Savant for example. She has the highest intelligence
ever quantified and what does she do? She writes a
column for Parade magazine. If she is happy doing that
then by all means I won't criticize her choices in
life but it does kind of make me skeptical that some
computer is going to wake up in somebody's basement one
day, solve some massive equation, and change the world
for the better. 

What evidence do I have for this skepticism? I need
look no farther than this list. We are a bunch of really
intelligent people but are we doing anything to solve
the problems humanity faces? No, we argue about
whether Bush is a hero or a fraud and whether the
color red is the same for everybody or not. Every time
somebody publishes a press release that they have
invented a better mouse trap or something, we twitter
excitedly for a few days, then go back to bickering
over minutiae.

Don't get me wrong, I like this list. I find it
entertaining and informative, and I have grown quite
fond of some of the posters. Sure, some of the individuals
on this list are more accomplished than others, but I
don't think the list is collectively living up to its
full potential. If high IQ people can be used as an
indicator of the nature of super-intelligence, then we
might very well be screwed. If we, being as human as
we are, can be so apathetic about the survival of our
own species, why would anyone believe that some mighty
non-human intelligence would give a rat's ass about
us? How do we know that the most likely scenario for
the Singularity isn't that the AI boots up, takes a
look around, decides we are not worth the effort to
save, and settles in to write angst-ridden haiku and
solve crossword puzzles all day? Or maybe even get a
job writing movie reviews for Newsweek.

The Avantguardian 
is 
Stuart LaForge
alt email: stuart"AT"ucla.edu

"The most beautiful thing we can experience is the mysterious. It is the source of all true art and science. He to whom this emotion is a stranger, who can no longer wonder and stand rapt in awe, is as good as dead: his eyes are closed. . ."

- Albert Einstein, "What I Believe" (1930)



