[ExI] stealth singularity

spike spike66 at att.net
Mon Sep 20 19:19:34 UTC 2010


 
________________________________

	...On Behalf Of Brent Allsop
	Subject: Re: [ExI] stealth singularity
	
	Spike,

	> I believe that benevolence is absolutely proportional to intelligence.

	> I believe that intelligence hiding from us is absolutely evil...
Brent

	 

Brent, I follow you there, and I see your line of reasoning, but before you go
all the way down that road, do ponder this question.  Can you imagine any
circumstances whereby an AI would hide, at least temporarily, and not be
absolutely evil?

I can think of some, which I will share by way of analogy.

We set up nature preserves, vast stretches of natural habitat for wild
beasts, but do we ponder the condition of those beasts?  If we stop to
really ponder it, we realize that wild beasts are constantly hungry, with
starvation stalking them all the time.  You can take *any* beast from the
wild, protect it and feed it.  What happens?  In every case?  A population
explosion.  Yet population explosions do not generally happen in the wild,
so every species is always at or near the environmental carrying capacity.
Their daily lives are a struggle against starvation, disease and other
beasts wanting to devour them.  Think of the suffering we create by setting
up a wildlife preserve.  We leave all these beasts to suffer hunger, when we
could be using our resources to feed and protect them.

I do not consider wildlife preserves evil.  Granted, they have evil aspects
to them, but in general, setting up a wildlife preserve is a non-evil thing
to do, or at worst a mix of good and evil.

Can you imagine *any* circumstances whereby an emergent AI would decide to
stay under cover, at least for a while?  I can.  Anyone else?

spike
