[ExI] Unfriendly AI is a mistaken idea.

Eliezer S. Yudkowsky sentience at pobox.com
Wed May 23 06:54:05 UTC 2007


John K Clark wrote:
> 
> It ain't going to happen of course, no way no how; the AI will have far
> bigger fish to fry than our little needs and wants. But what really
> disturbs me is that so many otherwise moral people wish such a thing
> were not impossible. Engineering a sentient but inferior race to be
> your slave is morally questionable, but astronomically worse is
> engineering a superior race to be your slave; or it would be if it
> were possible, but fortunately it is not.

We are each the heroes of the stories we tell ourselves.  It is 
important to understand this in order to come to terms with reality; 
it may be disturbing to think that, say, Osama bin Laden is the hero 
of his own story, but most assuredly he is, and his beliefs are a 
part of reality.  He certainly is not attacking America because "he 
hates our freedom" - such are the words of someone who simply refuses 
to face the facts of other people's states of mind.  In their own 
story, they are the heroes and their enemies are devils, and they 
cannot come to terms with any story that isn't like that, even for 
the sake of understanding psychology.  So they tell a story in which 
their enemies see themselves as devils, which, of course, is 
factually untrue.

I mention this because I would like to know what you believe the 
state of mind is of someone who sets out to build a Friendly AI (and 
remember, this is a question of simple fact).  Assuredly, they, being 
the heroes of their own stories, would never tell themselves such an 
unheroic story as setting out to engineer a race of slaves.  So what 
do you think they are thinking?

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


