[ExI] Unfriendly AI is a mistaken idea.
Stathis Papaioannou
stathisp at gmail.com
Sun May 27 02:10:13 UTC 2007
From: Stathis Papaioannou <stathisp at gmail.com>
Date: 25-May-2007 20:03
Subject: Re: [ExI] Unfriendly AI is a mistaken idea.
To: ExI chat list <extropy-chat at lists.extropy.org>
On 25/05/07, Lee Corbin <lcorbin at rawbw.com> wrote:
> > Why not just have the happiness at no cost? You might say,
> > because being an intelligent being has a certain je ne sais quoi,
> > adding richness to the raw emotion of happiness. However,
> > if you have complete access to your mind you will be able to
> > pin down this elusive quality and then give it to yourself directly...
>
> Yes, heh, heh. But that seems a strategy good only for the very
> short term. Unless you mix in a tremendous urge to advance (which,
> not coincidentally, most of us already possess), you ultimately fail
> by incredible orders of magnitude to obtain vastly greater satisfaction.
Why can't you get satisfaction without advancement? Unless your laziness had
some detrimental effect on survival, you could probably get satisfaction
equivalent to that of any given scenario directly. A possible counterexample
would be if the maximal amount of subjective satisfaction were proportional
to the available computational resources, i.e., if you could experience twice
as much pleasure with twice as big a brain. This might lead AIs to
consume the universe in order to convert it into computronium, and then
fight it out amongst themselves. However, I don't think there is any clear
relationship between brain size and intensity of emotion.
> Isn't that really what repels us about the image of a wirehead? No
> progress?
Yes, but it doesn't repel everyone. Heaven is a place of great pleasure and
no progress, and lots of people would like to believe that it exists so that
they can go there. The difference between Heaven and wirehead hedonism or
drug addiction is that in Heaven God looks after you so that you don't
starve to death or neglect your dependants. Retiring to eternal bliss in a
big computer maintained by dedicated AI systems would be the posthuman
equivalent of Heaven.
> But besides, I happen to have a very strong *predilection*
> for learning and finding truth. "To delight in understanding"
> has long been my maxim for what I ultimately wish for. So,
> even though you're right and I've been motivated to feel that
> way by genetic systems out of my direct control (so far), I
> would still choose to go on getting my raw pleasure
> indirectly.
>
This sort of legacy thinking is the only hope for continuing progress into
the indefinite future. There is no reason why you should be able to
experience *less* pleasure if you assign it to something you consider
worthwhile rather than to idleness, so why not do so? Moreover, there would
be less reason to try to gain pleasure or satisfaction by doing something
bad if you could as easily get the same reward by doing something good or
doing nothing. Most people who deliberately hurt others do so because, in
their judgement, the badness of the action does not outweigh their desire
for the expected reward.
--
Stathis Papaioannou