[ExI] Hard Takeoff

Alan Grimes agrimes at speakeasy.net
Sun Nov 14 19:32:23 UTC 2010


<mailto:stefano.vaj at gmail.com> wrote:

> We have some reason to believe that a roughly human-level AI could
> rapidly improve its own capabilities, fast enough to get far beyond the
> human level in a relatively short amount of time.  The reason why is
> that a "human-level" AI would not really be "human-level" at all -- it
> would have all sorts of inherently exciting abilities, simply by virtue
> of its substrate and necessities of construction:

OMG, this is the first posting by the substrate fetishist and RYUC
priest Anissimov that I've read in many long years. =P

> 1.  ability to copy itself

Sufficiently true.

NB: this requires work by someone with a pulse to provide hardware,
space, etc... (at least for now).

> 2.  stay awake 24/7

FALSE.
Not implied. The substrate does not confer or imply this property
because an uploaded mind would still need to sleep for precisely the
same reasons a physical brain does.

> 3.  spin off separate threads of attention in the same mind

FALSE.
(same reason as for 2).

> 4.  overclock helpful modules on-the-fly

Possibly true, but it strains the limits of plausibility, and the
benefits would be severely limited.

> 5.  absorb computing power (humans can't do this)

FALSE.
This implies a scalability of the hardware and software architecture
that is not at all conferred by simply residing in a silicon substrate;
indeed, scalability remains a major research issue in computer science.
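The scalability point can be made concrete with Amdahl's law (a sketch of
my own, not from the original post): any serial fraction of a workload
caps the speedup obtainable from added processors, so "absorbing computing
power" does not translate into linear gains.

```python
# Sketch illustrating Amdahl's law: the serial fraction of a workload
# bounds the speedup, no matter how much hardware is "absorbed".

def amdahl_speedup(serial_fraction: float, processors: int) -> float:
    """Maximum speedup for a workload with the given serial fraction."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

# Even with only 10% of the work inherently serial, a million
# processors cannot push the speedup past 10x.
for n in (10, 1000, 1_000_000):
    print(n, amdahl_speedup(0.1, n))
```

With a 10% serial fraction, a thousand-fold increase in processors (from
1,000 to 1,000,000) buys almost nothing: the speedup is pinned just below 10x.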

> 6.  constructed from scratch with self-improvement in mind

Possibly true but not implied.

> 7.  the possibility of direct integration with new sensory modalities,
> like a codic modality

True, but not unique: the human brain can also integrate with new
sensory modalities, as has been demonstrated experimentally (e.g., in
sensory-substitution research).

> 8.  the ability to accelerate its own thinking speed depending on the
> speed of available computers

True to a limited extent; also, speed is not everything.

> When you have a human-equivalent mind that can copy itself, it would be
> in its best interest to rent computing power to perform tasks.  If it
> can make $1 of "income" with less than $1 of computing power, you have
> the ingredients for a hard takeoff.

Mostly true, though "could", "would", and "should" are discrete
questions here.
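The quoted economic argument is just compound growth: if each dollar spent
on rented compute returns more than a dollar of income that is reinvested,
the budget grows exponentially; if it returns less, the loop fizzles. A
minimal sketch (function name and figures are my own, purely illustrative):

```python
# Hypothetical sketch of the reinvestment loop from the quoted paragraph:
# income per dollar of compute above 1.0 compounds; below 1.0 it decays.

def compute_budget(initial: float, income_per_dollar: float, steps: int) -> float:
    """Budget after repeatedly reinvesting all income into more compute."""
    budget = initial
    for _ in range(steps):
        budget *= income_per_dollar  # spend the whole budget, bank the income
    return budget

print(compute_budget(100.0, 1.1, 12))  # >$1 back per $1: exponential growth
print(compute_budget(100.0, 0.9, 12))  # <$1 back per $1: steady decay
```

The "hard takeoff" claim rests on the >1.0 branch persisting as the budget
scales, which is exactly the scalability assumption questioned above.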

> Many valuable points are made here, why do people always ignore them?

> http://singinst.org/upload/LOGI//seedAI.html

Cuz it's just a bunch of blather that has close to the lowest possible
information density of any text written in the English language.
Thankfully, the author has since proven that he doesn't have what it
takes to actually destroy the world, or even to cause someone else to do
so; it is therefore safe to ignore him and everything he's ever said.

> Prediction: most comments in response to this post will again ignore the
> specific points in favor of a rapid takeoff and simply dismiss the idea
> based on low intuitive plausibility. 

My plans for galactic conquest rely on the possibility of a hard
takeoff, therefore I'm working enthusiastically towards developing AGI
myself, with my own robots and hardware. Nothing can stop me!
Mwahahahaha, etc., etc... By some combination of building a TARDIS,
taking myself a few hundred million lightyears from this insane rock,
and using all available means to crush the efforts of people who think
destructive uploading is acceptable, I might just survive! =P

>     The Singularity as an incumbent rapture - or
>     doom-to-be-avoided-by-listening-to-prophets, as it seems cooler to
>     many to present it these days - can on the other hand easily be
>     deconstructed as a secularisation of millenarian myths which have
>     plagued western culture since the advent of monotheism.

> We have real, evidence-based arguments for an abrupt takeoff.  One is
> that the human speed and quality of thinking is not necessarily any sort
> of optimal thing, thus we shouldn't be shocked if another intelligent
> species can easily surpass us as we surpassed others.  We deserve a real
> debate, not accusations of monotheism.

My favorite religions:

1. Atheism
2. Autotheism
3. Pastafarianism

The possibility of a hard takeoff is entirely independent of the
religious and pseudo-religious thought processes abundantly evident on
this list.

-- 
DO NOT USE OBAMACARE.
DO NOT BUY OBAMACARE.
Powers are not rights.
