[ExI] Unfriendly AI is a mistaken idea.

Eugen Leitl eugen at leitl.org
Sun Jun 17 18:44:29 UTC 2007


On Sun, Jun 17, 2007 at 02:34:11PM +1000, Stathis Papaioannou wrote:

>    Our AI won't be friendly: it will be as rapacious as we are, which is

'Rapacious'? A day in the jungle or on a coral reef sees a lot of material and
energy flow, but that ecosystem is long-term stable, provided you homeostate
the boundary conditions of the environment (the ecosystem can't do that on its
own; here's where we uppity primates differ, because we shape our own micro-
and, lately, macro-environment). It might not be the industrial slaughter
we humans engage in, but a series of up-close and personal mayhem events.
I must admit I care for neither, but our personal aesthetics don't have
much impact.

A machine-phase ecology will likely converge towards the same state,
if given enough time. Alternatively, a few large/smart critters may
acquire an edge over everybody else, and establish highly controlled 
environments, which do not have the crazy churn and kill rate of the
neojungle. What's going to happen, nobody really knows. 

>    pretty rapacious. Whoever has super-AI's will try to take over the

You don't own a superhuman agent. If anything, that person owns you.
It does what it damn well pleases, and if you're in the way, the best
you can do is die in style.

>    world to the same extent that the less-augmented humans of today try
>    to take over the world. Whoever has super-AI's will try to oppress or

They don't just try; they pretty much own the planet, and will continue to
do so as long as they can homeostate their personal environment. Since
we're depleting biodiversity and tapping and draining the matter and energy
streams at the bottom of this gravity well, we need to figure out how
to detach ourselves from the land, or there will be a population crash
and a (possibly irreversible) loss of knowledge and capabilities.

>    consume the weak and ignore social niceties to the same extent that
>    less-augmented humans of today try do so. Whoever has super-AI's will

Whatever the superintelligent agents do, they will do. The best
we can do is stay out of the way and not get trampled, or not
suddenly turn into plasma one fine morning, or see blue rain falling
a few days after a nightfall that wouldn't end.

>    try to expand at the expense of damage to the environment in the
>    expectation that technology will solve any problems they may later
>    encounter (for example, by uploading themselves) to the same extent
>    that the less-augmented humans of today try to do so. There will be
>    struggles where one human tries to take over all the other AI's with
>    his own AI, with the aim of wiping out all the remaining humans if for
>    no other reason than that he can never trust them not to do the same
>    to him, especially if he plans to live forever. Niceness will be a
>    handicap to utter domination to the same extent that niceness has
>    always been a handicap to utter domination.

I don't like this science fiction novel, and would like to return it.
 
>    We'll survive to the extent that that motivating part of us that
>    drives the AI's survives. Very quickly, it will probably become
>    evident that merging with the AI will give the human an edge. There

A superhuman agent certainly has the capability to translate some of
the old-fashioned biology into the new domain, but I don't know about
the motivation. I wish there were a plausible reason why somebody not
derived from a human would engage in that particular pointless project.

>    will be a period where some humans want to live out their lives in the
>    old way and they will probably be allowed to do so and protected,

Many people are concerned about the welfare of the ecosystem, but they're
powerless to do a damn thing about it, other than make some purely cosmetic
changes that allow them to feel good about themselves. I very much
welcome their attempts, which are not completely worthless, but the
global ecological metrics speak a very stark and direct language.

>    especially since they will not constitute much of a threat, but
>    eventually their numbers will dwindle.

What would you do if the solar constant plummeted to 100 W/m^2 over
a few years, and then a construction crew blew the atmosphere off
into a plasma plume? Yes, your numbers would surely dwindle.
