[ExI] Hard Takeoff

Michael Anissimov michaelanissimov at gmail.com
Sun Nov 14 20:28:42 UTC 2010


Hi Jef,

On Sun, Nov 14, 2010 at 11:26 AM, Aware <aware at awareresearch.com> wrote:

>
> The much more significant and accelerating risk is not that of a
> "recursively self-improving" seed AI going rogue and tiling the galaxy
> with paper clips or copies of itself, but of relatively small groups
> of people, exploiting technology (AI and otherwise) disproportionate
> to their context of values.
>

I disagree about the relative risk, but I'm worried about this too.


> The need is not for a singleton nanny-AI but for development of a
> fractally organized synergistic framework for increasing awareness of
> our present but evolving values, and our increasingly effective means
> for their promotion, beyond the capabilities of any individual
> biological or machine intelligence.
>

Go ahead and build one, I'm not stopping you.


> It might be instructive to consider that a machine intelligence
> certainly can and will outperform the biological kludge, but
> MEANINGFUL intelligence improvement entails adaptation to a relatively
> more complex environment.  This implies that an AI (much more likely a
> human-AI symbiont) poses a considerable threat in present terms, with
> acquisition of knowledge up to, and integration between, existing
> silos of knowledge; but lacking relevant selection pressure it is
> unlikely to produce meaningful growth and will expend nearly all its
> computation exploring irrelevant volumes of possibility space.
>

I'm having trouble parsing this.  Isn't it our job to provide that
"selection pressure"?  (The term is usually used in Darwinian population
genetics, so I find it slightly odd to see it used in this context.)


> Singularitarians would do well to consider more ecological models in
> this Red Queen's race.


On a more sophisticated level I do see it as such.  Instead of organisms
being the relevant unit of analysis, I see mindstuff-environment
interactions as the relevant level.  AI will undergo a hard takeoff
not by cooperating with the existing ecological context, but by
mass-producing its own mindstuff until the agent itself constitutes an
entire ecology.  The end result is more closely analogous to an alien
planet's ecology colliding with our own than to a new species arising
within the current ecology.

-- 
michael.anissimov at singinst.org
Singularity Institute
Media Director