[ExI] outloading, was: stealth singularity

spike spike66 at att.net
Wed Sep 22 05:36:14 UTC 2010


 
> 2010/9/21 spike <spike66 at att.net>:
...
> >
> > I propose the definition for outloading is where an emergent AI 
> > decides to stay stealthy for at least a while after the 
> singularity... spike

> ...On Behalf Of Joseph Bloch
...
> Subject: Re: [ExI] outloading, was: stealth singularity
> 
> Isn't the term "stealth singularity" something of an oxymoron?

Not necessarily, but from your question, I see a way to improve my notion.
Read on.

> I always thought of a singularity as an event of 
> species-shaking significance, beyond which by definition it 
> was impossible to predict future events (hence the term, 
> which evokes the event horizon of a black hole)... Joseph

Since we are borrowing terms from astronomy such as singularity and event
horizon, let us take it a step further.  We often think of all the strange
things that happen at the event horizon in the natural world, but one can
cross an event horizon and not notice anything amiss.  The strange stuff
happens way inside the event horizon, as one approaches the center of the
black hole itself (the singularity in the astronomical sense): being ripped
apart by tidal forces and such unpleasantness.  But out at the event horizon
of a supermassive black hole, things are still OK, at least for a while.  If
for instance you have a black hole of about 3 trillion solar masses, the
event horizon is a light year out from the singularity (the Schwarzschild
radius is a light year), and nothing particularly strange appears to be
happening there.  You could cross it and not notice.  You would never come
back of course, but you could cross it going in.
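For what it's worth, a quick back-of-the-envelope check of that figure (a
minimal Python sketch using r_s = 2GM/c^2 and standard constants; only the
3 trillion solar mass figure comes from the paragraph above):

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M_sun = 1.989e30       # one solar mass, kg
light_year = 9.461e15  # one light year, m

M = 3e12 * M_sun            # about 3 trillion solar masses
r_s = 2 * G * M / c**2      # Schwarzschild radius, m
print(r_s / light_year)     # roughly 0.94, i.e. about a light year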

So instead of referring to this notion as the stealth singularity, we could
call it the stealth event horizon.  Then the singularity we usually think of
would still lie in the future after we cross that event horizon, and would
still be inevitable.

A super-friendly AI which chose to outload us and keep the earth as a nature
preserve would eventually decide to devour all that iron.  Perhaps it would
eventually perfect the outloads enough to make the carbon units unnecessary.
Surely it would want to get at all this metal down here at some point.

So OK Joseph, point taken.  What I am describing isn't a singularity, but
rather the emergence of a super-friendly AI.  This version is more
human-friendly and more self-sacrificing than I expect an emergent AI to be.
So all this might be just so much wishful thinking.

spike



 




