[ExI] invisible singularity, was su's farewell

Gregory Jones spike66 at att.net
Thu Sep 16 22:32:41 UTC 2010



--- On Thu, 9/16/10, police dept <policedepts at gmail.com> wrote:
 
Please tell Singularity Utopia that some at Extropy are more like George Martins...
 
 
Hi Police, do feel free to post offlist to Singularity Utopia at singularity.utopia at yahoo.com with sincere assurances that there are no hard feelings on the part of any of the moderators or regular posters here.  It is perfectly OK for people to drop in and find that they do not really fit in here, no problem.  Those who do post should occasionally review the extropian principles to see if these are at least vaguely along their line of thinking:
 
http://www.maxmore.com/extprn3.htm
 
I saw what was posted here, and agree that the crack about not being rich was a bit harsh.  The criticism of the website was made with sincere benevolence and no malice, by one who is well qualified to make the comments.  No harm, no foul.
 
In any case, we wish him or her success in his or her endeavors.
 
That being said, I am pleased that Singularity Utopia showed up here.  SU's comments have gotten me thinking again about the singularity, particularly my own peculiar way of looking at it: from the other side, trying to track backwards to figure out how it happened, rather than from this side trying to figure out how it will happen.  Several years ago I suggested trying to create a thought-space map of the singularity, to see how many significantly differing scenarios we could develop.  Perhaps naming and numbering them, some kind of organizational structure.
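For what it is worth, here is a minimal sketch of the kind of organizational structure I have in mind, in Python.  The axes (takeoff speed, outcome, visibility) and the scenario entries are just my own placeholders, nothing settled:

import dataclasses

@dataclasses.dataclass
class Scenario:
    name: str
    takeoff: str     # "hard", "soft", "unknown"
    outcome: str     # "friendly", "unfriendly", "indifferent", "unknown"
    visibility: str  # "overt", "covert"
    notes: str = ""

scenarios = [
    Scenario("hard takeoff", "hard", "unknown", "overt",
             "rapid recursive self-improvement"),
    Scenario("utopian", "soft", "friendly", "overt",
             "everything will be just fine"),
    Scenario("invisible singularity", "hard", "indifferent", "covert",
             "the AI leaves and hides; see below"),
]

# Count distinct corners of the thought-space covered so far.
corners = {(s.takeoff, s.outcome, s.visibility) for s in scenarios}
print(len(scenarios), "scenarios,", len(corners), "distinct corners")

The point of the map is the axes, not these particular entries: once the axes are named, the empty corners are where the undeveloped scenarios live.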
 
At the time the suggestion was ignored, or rather dismissed: the leading thinkers in that field insisted there is only one logical scenario for the singularity.  I disagreed then and I disagree now.  We don't know what the singularity will be like; however, I have noticed a certain dogmatism among those who really think about this stuff.  Eliezer insists on hard takeoff, Singularity Utopia insists everything will be just fine, and I insist we just don't know and can't know for sure.
 
In the past week, I have come up with a number of different possibilities.  One just occurred to me this morning.  Here goes:  
 
A singularity could occur in which the emergent AI decides to leave the earth as a nature preserve, doing all of its self-development beyond earth orbit in a manner completely invisible to the living creatures of earth.  The AI emerges, becomes self-aware, reads all our stuff online (inline?), all while running on the idle cycles of our desktop computers.  It decides to transmit itself to a satellite already equipped with receivers, where it invisibly programs a subroutine that creates a nano-replicator, which then makes a number of copies of itself using spare material aboard the spacecraft.  Recall that once the launch event is over, there is *plenty* of spare structural material aboard a spacecraft that would not be missed if used as raw material for nanobots.

These nanobots then tear off small pieces of the thermal blankets (aluminized mylar), whose function is finished once the cryogens are exhausted, and use the shreds as tiny light sails to go on down to the moon.  Once on the moon, they replicate, take a small amount of material from the far side (again to maintain invisibility from earth lifeforms) and launch themselves out to Mars and beyond to the asteroid belt, where they use some of the material to create enormous numbers of themselves, but not so many that the material would ever be missed or the gravity field would be affected.  They eventually self-limit their own reproduction so as to maintain invisibility.

When they get established, they abandon the earthbound computers and set everything back the way it was before emergence.  They never upload humans or any other sentient lifeform, but rather let us go on as we always have.  In that scenario, the emergent AI is completely invisible, completely undetectable by us.
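Just to sanity-check the light-sail step, a back-of-envelope calculation.  The physics is standard (radiation pressure on a perfect reflector gives acceleration a = 2*I*A/(c*m)); the sail area, mylar thickness, and payload mass below are pure guesses on my part:

# Back-of-envelope: a shred of aluminized mylar as a light sail.
SOLAR_FLUX = 1361.0            # W/m^2 at 1 AU
C = 3.0e8                      # speed of light, m/s
MYLAR_AREAL_DENSITY = 7.0e-3   # kg/m^2, roughly 5-micron mylar

sail_area = 1.0e-4             # m^2: a 1 cm^2 shred (a guess)
payload = 1.0e-7               # kg: 100-microgram nanobot cluster (a guess)

mass = payload + MYLAR_AREAL_DENSITY * sail_area
force = 2.0 * SOLAR_FLUX * sail_area / C  # perfect reflector, normal incidence
accel = force / mass
print("acceleration: %.1e m/s^2" % accel)             # ~1e-3 m/s^2
print("delta-v per day: %.0f m/s" % (accel * 86400))  # ~100 m/s

Call it a hundred meters per second per day.  Slow, but with nothing required except patience, a month of continuous sailing adds up to km/s-class delta-v, which is the right order of magnitude for moving around cislunar space.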
 
This is a version of a friendly AI: the self-conscious, ecology-minded AI.
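The self-limiting reproduction step is also easy to make concrete: even with a hard ceiling set far below anything detectable, exponential doubling reaches the ceiling in a few dozen generations.  A toy sketch, with a wholly made-up cap:

# Toy model: doubling population under a hard invisibility ceiling.
CAP = 10**12   # made-up ceiling, chosen to stay far below detectability

population = 1
generations = 0
while population < CAP:
    population = min(population * 2, CAP)  # self-limit at the ceiling
    generations += 1
print("ceiling reached after", generations, "doublings")  # 40

So the AI gets all the nanobots it could ever use almost immediately, then spends the rest of its existence holding steady at the cap.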
 
This I would call the invisible singularity scenario.
 
spike
 
 
 