[extropy-chat] A useful remark

Rafal Smigrodzki rafal.smigrodzki at gmail.com
Sun Dec 17 18:37:34 UTC 2006

Over at transhumantech Eugen made this remark:

"A machine god pantheon by default kills things by habitat destruction"

(This is in response to James, who talks about his usual stuff: basic
income, free healthcare, and "democracy".)

### This brings to mind Eliezer's analysis of the applicability of
evolutionary theory to SAIs. According to Eli, and I agree with him
here, evolution would not apply to a singleton AI, given the absence
of mutation and selection, which are the sine qua non of evolution.

But Eugen points to a situation where even in the absence of mutation
(that is, randomly generated change) there could be evolution, with
its associated tendencies toward exponential proliferation and the
filling of all accessible ecological niches. All you need is one AI
without very strong built-in limitations on the destruction of humans;
even in the presence of friendly AIs of equal intelligence, the
outcome could be dire: a UFAI could physically expand heedless of its
impact on humans, and it could self-modify without concern for its
long-term stability. Its lack of physical and mental limitations could
give the UFAI an edge over the FAIs, forcing them to expand and
self-modify in turn, perhaps leading to a loss of Friendliness.
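The selection-without-mutation point can be illustrated with a toy
simulation (my own sketch, not anything from the thread; the growth
rates are arbitrary assumptions): two fixed strategies compete for
resources, neither ever changes, yet the less constrained one comes to
dominate purely through differential replication.

```python
# Toy sketch of selection without mutation: two fixed strategies,
# one unconstrained (faster growth) and one constrained (pays an
# overhead for its limitations). Neither strategy ever mutates, but
# the unconstrained one's share of total resources still climbs
# toward 1.0 -- an evolution-like outcome from pure differential
# replication. Growth rates here are illustrative assumptions.

def shares(steps=50, r_unconstrained=1.10, r_constrained=1.05):
    """Return the unconstrained strategy's resource share over time."""
    u, c = 1.0, 1.0  # equal starting endowments
    history = []
    for _ in range(steps):
        u *= r_unconstrained  # expands heedless of side constraints
        c *= r_constrained    # constrained growth
        history.append(u / (u + c))
    return history

# The share rises monotonically from just above 0.5 toward 1.0,
# without any strategy ever changing.
```

The point of the sketch is only that a constant per-step growth
advantage compounds: the constrained strategy is never eliminated by
any mutation or redesign, merely rendered marginal.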

I agree with Eugen that unmodified humans are likely to survive only
in a world with one FAI ("The One"), or a group of closely cooperating
FAIs ("Them" :). An ecology of self-enhancing entities essentially
ensures the obliteration of HAWKI (Humanity As We Know It).

Given that it is most likely technically difficult to prevent the
emergence of such an ecology using the good old methods (committees,
congressional acts, pen-pushers spouting regulations, jackbooted
enforcers, and other fruits of commie imagination), considerations of
basic income and other such stuff are about as relevant to our future
as droit de seigneur.

Although a singleton globe-spanning FAI appears to be our best bet for
survival (a good reason to support SIAI), I am wondering if there are
other methods. I remember that Eugen used to advance the notion of a
massive program of uploading which would occur before building true
SAI. Do you still think this is a good idea, Eugen? I wish it were,
but I think that SAI (although not necessarily FAI) is a bit easier
than uploading, so it's likely that SAI will happen first, for better
or for worse.

This topic has been raised here many times, but I would still like to
know: does anybody have any new realistic ideas about saving humanity
from SAI, other than FAI? (Pen-pusher ideas are not realistic, so
don't even mention them.)
