[ExI] Hard Takeoff

Michael Anissimov michaelanissimov at gmail.com
Sun Nov 14 17:52:06 UTC 2010


On Sun, Nov 14, 2010 at 9:03 AM, Stefano Vaj <stefano.vaj at gmail.com> wrote:

>
> I still believe that seeing the Singularity as an "event" taking place
> at a given time betrays a basic misunderstanding of the metaphor, only
> too open to the sarcasm of people such as Carrico.
>
> If we go for the original meaning of "the point in the future where
> the predictive ability of our current forecast models and
> extrapolations obviously collapses", it would seem obvious that the
> singularity is more in the nature of a horizon, moving forward with
> the perspective of the observer, than of a point-like event.
>

We have some reason to believe that a roughly human-level AI could rapidly
improve its own capabilities, fast enough to get far beyond the human level
in a relatively short amount of time.  The reason is that a "human-level" AI
would not really be "human-level" at all -- it would have all sorts of
inherent advantages, simply by virtue of its substrate and the necessities
of its construction:

1.  the ability to copy itself
2.  the ability to stay awake 24/7
3.  the ability to spin off separate threads of attention within the same mind
4.  the ability to overclock helpful modules on the fly
5.  the ability to absorb additional computing power (humans can't do this)
6.  being constructed from scratch with self-improvement in mind
7.  the possibility of direct integration with new sensory modalities, such
as a codic modality
8.  the ability to accelerate its own thinking speed in proportion to the
speed of available computers

When you have a human-equivalent mind that can copy itself, it would be in
its best interest to rent computing power to perform tasks.  If it can earn
$1 of "income" with less than $1 of computing power, you have the
ingredients for a hard takeoff: the surplus can be reinvested in more
computing power and more copies, and the process compounds.
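
To make the feedback loop concrete, here is a minimal sketch in Python.  The
numbers and names (takeoff_budget, return_per_dollar) are mine, purely for
illustration; the only assumption carried over from the argument above is
that each dollar spent on compute returns more than a dollar of income.

# Toy model of the reinvestment loop.  Each cycle the mind spends its whole
# budget renting compute, earns revenue proportional to that compute, and
# reinvests everything.  return_per_dollar > 1.0 is the "more than $1 of
# income per $1 of computing power" condition.
def takeoff_budget(initial_budget, return_per_dollar, cycles):
    budget = initial_budget
    for _ in range(cycles):
        budget = budget * return_per_dollar  # earn, then reinvest
    return budget

# Even a modest 10% surplus compounds: the budget roughly doubles every
# 7-8 cycles, so $1,000 becomes ~$17,450 after 30 cycles.
print(takeoff_budget(1000.0, 1.10, 30))

Whether return_per_dollar actually stays above 1.0 as copies multiply and
flood the market is exactly the kind of detail worth debating; the point of
the sketch is only that the structure of the loop is compound interest.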

There is an interesting debate to be had here about the plausibility of
these arguments in detail, but most transhumanists seem to dismiss the
conversation out of hand, or don't know that there is a conversation to
have.

Many valuable points are made here; why do people always ignore them?

http://singinst.org/upload/LOGI//seedAI.html

Prediction: most comments in response to this post will again ignore the
specific points in favor of a rapid takeoff and simply dismiss the idea
based on low intuitive plausibility.


> The Singularity as an impending rapture - or
> doom-to-be-avoided-by-listening-to-prophets, as it seems cooler to
> many to present it these days - can on the other hand easily be
> deconstructed as a secularisation of millenarian myths which have
> plagued Western culture since the advent of monotheism.
>

We have real, evidence-based arguments for an abrupt takeoff.  One is that
the human speed and quality of thinking are not necessarily optimal in any
sense, so we shouldn't be shocked if another intelligence surpasses us as
easily as we surpassed other species.  We deserve a real debate, not
accusations of monotheism.

-- 
michael.anissimov at singinst.org
Singularity Institute
Media Director