[ExI] stealth singularity

Jebadiah Moore jebdm at jebdm.net
Wed Sep 22 01:58:36 UTC 2010


2010/9/20 spike <spike66 at att.net>

>  AI : human :: human : bird.
>

Yeah, certainly.  I would expect that an eXtreme artificial intelligence
would probably want to study the ecosystem as it was before eXtreme
intelligence arose (since its arrival would likely change the "character"
of the patterns of activity occurring), whether it was benevolent or not.

Another thing that comes to mind is AI:human :: human:spider.  We don't like
spiders in a lot of ways, but we do like that they put a check on the
mosquito population.  So, even if we had the opportunity to wipe them out
(or, alternatively, to give them superpowers), we probably wouldn't.
Similarly, we might be performing some function that the AI wants to ensure
we don't stop performing.

There's another interesting possibility when you consider us giving
technology to non-human animals.  We have veterinary medicine, advanced
weaponry, etc., but in general we don't make a point of saving all animals
from their "natural" predators or other "natural" forces.  And generally
what we mean by "natural" is non-human.  Similarly, the AI might not want to
save us from our "natural" habitat and problems and whatnot.  Or it might
just want to leave us as a nature reserve.

After all, the AI you're positing doesn't exactly need Earth to survive
(since it's hiding out on other planets, massively reproducing, traveling
large distances...).  It could probably just leave the solar system entirely
to find somewhere with better resources.

Which brings me to another possibility: maybe an AI is invisible because it
sent out some scouts to find life elsewhere, and is waiting to hear back
from them.

Or maybe the AI is just shy.

-- 
Jebadiah Moore
http://blog.jebdm.net