[ExI] Unfriendly AI is a mistaken idea.

Stathis Papaioannou stathisp at gmail.com
Tue May 29 01:56:27 UTC 2007


On 29/05/07, Lee Corbin <lcorbin at rawbw.com> wrote:

> Suppose that you do achieve a fixed gargantuan size, and become
> (from our point of view now) an incredibly advanced creature.
> Even *that*, however, could be miniscule compared to what
> will be possible even later.

Satisfaction need not be directly related to size or quantity, even if it
turns out that maximal pleasure is. You could simply decide at some point to
be perfectly satisfied with what you have, in which case it won't worry you
that your neighbour's brain is ten times the size of your own. I think
advanced beings would eventually decide to stop growing, or at least to slow
down, if only because continued growth would otherwise bring them into
conflict with one another.


-- 
Stathis Papaioannou
