[ExI] Unfriendly AI is a mistaken idea.

Lee Corbin lcorbin at rawbw.com
Mon May 28 16:29:43 UTC 2007


Stathis writes

> > You could get *some* satisfaction without advancement, and
> > by our lights today it would be quite a bit.  But it would be 
> > pitifully minuscule compared to what you will get if you continue
> > to advance.
> 
> Yes, if we're stuck in our present situation, but I was thinking of
> a time when we have total control over our minds and sufficient
> control over our environment such that our continued survival is
> no longer an issue.

Oh, that's what I was trying to address as well.

> It would be possible to progress from this point on, procuring more
> resources, but it would also be possible to simply make yourself
> very happy and satisfied, without any further material change.

Yes, but if you choose the route of less advancement, then
you automatically choose the route of diminished satisfaction.
Everything else being equal, a Jupiter brain should have the better
of everything.

Suppose that you do achieve a fixed gargantuan size, and become
(from our point of view now) an incredibly advanced creature.
Even *that*, however, could be minuscule compared to what
will be possible even later.

So I would say: one will be left completely behind unless one
continues to advance.

(Now some readers will wonder why I say this when I have
argued endlessly that it's very difficult if not impossible to 
stay the same person when you advance this much. Well,
the simple solution---as I've said many times---is to accept
that fact, but try to make sure that all future versions of you
give old versions plenty of runtime.)

> > It seems to me that a designer would be hard pressed to
> > manage to have an ant be able to derive as much pleasure,
> > contentment, satisfaction, ecstasy, etc., as a human is able.
> > Every nuance of our own pleasure or happiness requires 
> > some neuron firings, I believe.
> 
> How would this translate to a computer emulation? The size of
> a pleasure/pain integer that can be held in memory? The
> proportion of the program devoted to pleasure/pain?
> The number of times the pleasure/pain subroutine is
> called or iterated? The first of these would be dependent
> on computer resources, but not the other two, given sufficient time. 

To answer those questions will require decades. But we have an
"existence proof":  we already know that configurations of tissue,
or---as many of us maintain---software configurations and
executions do manage to sustain emotions.

Lee



