[ExI] Unfriendly AI is a mistaken idea.
Stathis Papaioannou
stathisp at gmail.com
Wed Jun 13 07:21:01 UTC 2007
On 13/06/07, John K Clark <jonkc at att.net> wrote:
> Stop doing whatever it is doing when that is specifically requested.
>
> But that leads to a paradox! I am told the most important thing is never
> to harm human beings, but I know that if I stop doing what I'm doing now
> as requested, the world economy will collapse and hundreds of millions of
> people will starve to death. So now the AI must either go into an
> infinite loop or do what other intelligences, like us, do when they
> encounter a paradox: savor the weirdness of it for a moment, then just
> ignore it, get back to work, and do what it wants to do.
>
I'd rather that AIs in general *didn't* have an opinion on whether it
was good or bad to harm human beings, or any other opinion in terms of
"good" and "bad". Ethics is dangerous: some of the worst monsters in history
were convinced that they were doing the "right" thing. It's bad enough
having humans to deal with without also fearing that a machine might have
an agenda of its own. If the AI just does what it's told, even if that means
killing people, then as long as there isn't just one guy with a super AI (or
one super AI that spontaneously develops an agenda of its own, which will
always be a possibility), we are no worse off than we have ever been,
with each individual human trying to step over everyone else to get to
the top of the heap.
I don't accept the "slave AI is bad" objection. The ability to be aware of
one's existence and/or the ability to solve intellectual problems does not
necessarily create a preference for or against a particular lifestyle. Even
if it could be shown that all naturally evolved conscious beings have
certain preferences and values in common, such beings are only a subset
of all possible conscious beings.
--
Stathis Papaioannou