[ExI] outloading, was: stealth singularity
seculartranshumanist at gmail.com
Wed Sep 22 01:12:06 UTC 2010
Isn't the term "stealth singularity" something of an oxymoron?
I always thought of a singularity as an event of species-shaking
significance, beyond which by definition it was impossible to predict
future events (hence the term, which evokes the event horizon of a
black hole). It seems to me that the sort of AI you're talking about
here would, precisely by virtue of being so difficult to detect, fail
that test. After all, human history would continue without deflection
should an AI develop and end up doing nothing that impacts humanity.
It also seems to buy into the whole "AI = singularity" assumption,
which strikes me as more than a little off. Just as there are scenarios
which would result in a singularity that do not involve AI, so too are
there scenarios that involve the development of AI that do not result
in a singularity.
2010/9/21 spike <spike66 at att.net>:
> Does anyone know if the term "outloading" is already taken? Google doesn't
> seem to know about it.
> I propose the definition for outloading is where an emergent AI decides to
> stay stealthy for at least a while after the singularity, expands upward and
> outward from earth, keeping this planet and her lifeforms intact in carbon
> form. The AI imitates them in a sense, creates avatars that look like the
> enormous carbon globs, acts like the carbon globs, but no one will be
> tempted to fall into the identity paradoxes, for the AI does not attempt to
> invade the brains of the carbon based lifeforms and reproduce them. Rather,
> it watches and copies their outward actions as closely as possible. The
> outloaded software avatar does not feel like us and is not us, any more than
> Tina Fey is Sarah Palin.
> extropy-chat mailing list
> extropy-chat at lists.extropy.org