[ExI] Survival (was: elections again)

Harvey Newstrom mail at harveynewstrom.com
Mon Dec 31 18:11:17 UTC 2007


On Monday 31 December 2007 11:08, John K Clark wrote:
> "Harvey Newstrom" <mail at harveynewstrom.com>
>
> > You seem to believe that we can't change
> > the future
>
> The future will be determined by what Mr. Jupiter Brain wants, not by what
> we want. Exactly what He (yes, I capitalized it) will decide to do I don't
> know; that's why it's called a Singularity. Maybe He will treat us like
> pampered pets; maybe He will exterminate us like rats. It's out of our
> hands.

Then I want to be the cutest pet ever!  Or else the stealthiest scavenger rat 
ever.  Or maybe I want to leave this sinking planet before all this goes 
down.  Or else I want to upload into the AI before it takes over.  Or build 
my own counter-AI to protect me.

Even given your scenarios, we still have a lot of choices about how our 
subjugation will occur.

Although I don't agree that things are as hopeless as all that, I find your 
viewpoints fascinating.  I agree that programming friendliness into an AI is 
a poor strategy.  But I am not pessimistic, because I don't expect AI to 
become conscious any time soon, to become self-evolving soon after that, to 
evolve speedily after that, or to have much control over the physical 
universe outside cyberspace even if it does.

-- 
Harvey Newstrom <www.harveynewstrom.com>
CISSP CISA CISM CIFI GSEC IAM ISSAP ISSMP ISSPCS IBMCP


