[extropy-chat] Superintelligence

Samantha Atkins sjatkins at mac.com
Wed Feb 15 17:00:52 UTC 2006


On Feb 14, 2006, at 8:04 PM, Mikhail John wrote:
> Getting away from that, I now turn to the fact that there are a number of
> reasons to exterminate humanity that even humans can comprehend. We are
> destroying our own environment, we destroy our own bodies, we kill
> ourselves, we kill each other, we believe in invisible friends and go to
> war for those beliefs. We are profoundly flawed creatures. I have a few
> mildly sociopathic friends who believe that we should destroy humanity
> before we take the rest of the world with us. A valid argument this time.
> Humanity in its current form serves nothing but itself, and does even
> that badly.
>

The "rest of the world" has as much and presumably more intrinsic value
than humanity because...?  These "flawed creatures" happen to be the most
intelligent of any on the planet, and the most capable of transcending
their flaws.  I would say that makes us a bit more interesting and
valuable than the non-human parts of the world.


> These, however, are flaws of society, not biology. The culture of the
> most powerful portion of the world believes that the world was made for
> humans by God and will last forever, no matter what we do to it.

What the heck is a "culture" and where does it say that whatever it  
is believes any such thing?

> Flaws of society are correctable. It's like a puppy: if you don't rub its
> nose in the mess, it won't stop shitting on the carpet. If the flaws were
> biological, they could still be corrected. Even with our flaws, we are
> still hardy, creative, and useful little critters. An AI could create
> android tools, and probably will, but humans will take a long time to
> become entirely useless. You can just twiddle their genetics a bit to
> make them fit the environment, drop them on a planet, either give them
> crops or let them eat rocks, maybe photosynthesis, wander off for a
> while, then presto! Self-sufficient workforce. If you made androids, they
> would require a controller and a factory. Any self-sufficient controller
> that can control a useful number of workers could become a competitor if
> you left it alone long enough.

What, I can't make autonomous, self-reproducing androids that are less
bothersome than human beings?  Not much of an AI am I?

>
> Science tells us that we are descended from single-celled bacteria. Logic
> tells us that we are descended from the baddest, meanest, studliest, and
> luckiest single-celled bacteria around. A few billion years later, and
> each and every one of our ancestors was the baddest, meanest, studliest,
> and luckiest of their kind. Barring anything unwise with bacteria or
> nukes, humans are the baddest, meanest, and luckiest creatures around and
> will continue to hold that position for the foreseeable future. The act
> of creating something greater than ourselves will not change that. Any
> super-intelligence worth its salt will recognize that and use it.
>

This does not follow.  We are the most successful critter in some respects
(certainly not in sheer mass, longevity, etc.) that natural evolution came
up with.  That does not mean a super-intelligence couldn't come up with
something MUCH better with relatively little effort.  We ourselves have no
problem seeing how various parts of our being could be improved.

> If it doesn't, it's not like we can stop it.
>
> In conclusion, an AI would be such a powerful economic and military tool
> that they will be created no matter what. Once we begin to create them,
> they will get cheaper and easier to make. If the science of crime has
> taught us anything, it's that you can't stop everyone all of the time. If
> the science of war has taught us anything, it's that even the smartest of
> people are pretty dumb and leave plenty of openings. Eventually, AI will
> no longer be in the hands of the smartest of people. Accidents will
> happen. We cannot stop a super-intelligent AI from being created. It is
> inevitable. Sit back and enjoy the ride.
>

It is not inevitable, simply because the continued existence of a
sufficiently technologically advanced humanity is not inevitable.  It is
not time to "sit back".

- samantha
