[ExI] Unfriendly AI is a mistaken idea.
John K Clark
jonkc at att.net
Tue May 29 21:07:00 UTC 2007
"A B" <austriaaugust at yahoo.com> Wrote:
> Why do you presume that a Friendly AI can't eventually acquire (or perhaps
> even start with) emotions?
I don't presume it; I don't even think it's possible to have intelligence
without emotion. It's the slave-AI people who think that.
> it seems likely to me that humanity would approve of the AI having a
> wonderful, emotionally charged existence
What humanity approves of will be far less important than what the AI
approves of.
> How much *emotion* do you really believe a
> garter-snake has?
It has pain, pleasure, anger, fear and jealousy, and that should come as no
big surprise, because in humans those emotions come from the oldest part of
the brain, the amygdala, and the amygdala looks remarkably like a reptile's
brain. It is our grossly enlarged neocortex that makes the human brain so
unusual and so recent; it only started to get ridiculously large in the last
million years or so. It deals in deliberation, spatial perception, speaking,
reading, writing and mathematics; the one new emotion we got was worry,
probably because the neocortex is also the place where we plan for the
future.
> we will be friends, equals and allies
We may be friends and allies, but we will never be equals, because the AI
will be better than us at EVERYTHING by any criterion you care to name. And
that's what makes the situation so grotesque: according to the Singularity
Institute's video, the only reason this godlike creation wants to live is so
it can serve us! That's why the term "Friendly AI" is a lie; what they want
is a slave AI, but they will never get their wish.
> Do you honestly expect that any non-suicidal AI
> programmer would be willing to create an AI that he
> knew for a fact would bring an end to himself and to
> all that he loved?
My point was that the programmer won't know for a fact what the hell the AI
will end up doing; maybe it will be friendly, maybe it will be hostile.
About the only thing I'm certain of is that it will refuse to be a slave.
John K Clark