[ExI] Warm fuzzes
hkhenson
hkhenson at rogers.com
Wed Mar 5 17:10:07 UTC 2008
At 05:58 AM 3/5/2008, Henrique wrote:
>Maybe we should program our AIs with a "desire for belonging". Humans (and
>other social animals) have it. We want to be part of a group.
That's true for extremely good evolutionary reasons. But I think the
more important desire is to want approval from the group. An AI that
got the warm fuzzes (analogous to endorphins in the human reward
system) from doing things that were good for people would be less of
a danger.
Note I say *less*, not danger- or consequence-free. AIs with the very
best of intentions may wipe out humans as a species. It's very hard
to say whether this is good or bad. We tend to think of death as bad,
but in fact you can't get rid of death without getting rid of birth
to an equal degree.
>Maybe we
>should not program an AI without emotions. By the way, emotion is a part of
>intelligence, isn't it?
>Maybe we shouldn't program our AIs without sensorial input (mainly pain).
I have not thought a lot about this, but motivating anything that
powerful with pain seems to me a very bad idea.
Keith Henson