[ExI] Unfriendly AI is a mistaken idea.

A B austriaaugust at yahoo.com
Sat May 26 23:57:47 UTC 2007


Hi John,

My assessment is that your "psychoanalysis" is
completely, totally wrong. I can guarantee that it is
completely wrong with respect to me, and I *strongly*
believe that it is wrong with virtually all of the
Friendly AI people (perhaps not every last one of
them; I can't know that with absolute certainty).

To be frank with you, the origin and nature of your
accusations are completely alien to me. I don't know
where in the world you got these notions about the
"psychology" of the Friendly AI people. All I can tell
you is that you are attributing to me motives that
don't even dimly resemble my actual motives. (And I
believe the same is true of virtually everyone else.)

Would I rather live pleasantly than die?

Yes.

Would I rather that humanity survive rather than be
exterminated?

Yes. Of course. Don't you?

Would I like the AI to have an awesomely wonderful,
enjoyable, and emotionally charged life?

Yes, as much as I would for myself. What's the problem
here?

Would I like the AI and humanity to be co-operative
allies?

Yes. Why not?

John, if you are going to continue to imply, not so
subtly, that I have the mindset of a slave-driver, will
you at least do me the honor of considering and
answering the questions I put to you in my original
post on this subject?

Also, I don't understand your insistence that Friendly
AI is physically impossible. A "motivated" action will
not occur if no "motive" exists, and no "motive" will
exist until one is programmed into the Seed AI. An
algorithm won't do anything if it doesn't exist; a
neural pathway won't do anything if it doesn't exist.
Friendly AI will be hard, but it is not physically
impossible.
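To make that point concrete, here is a toy sketch of my own (not
anyone's actual Seed AI design; the names ToyAgent and seek_cooperation
are purely hypothetical). The only "motives" the agent has are the goal
functions that were written into it; given none, it takes no motivated
action at all.

# Toy model: an agent's "motives" are exactly the goal
# functions that have been programmed into it -- nothing more.

def seek_cooperation(outcome):
    # Hypothetical goal: prefer outcomes where humans and the AI cooperate.
    return outcome.get("cooperation", 0)

class ToyAgent:
    def __init__(self, goals=None):
        # No goals programmed in means no motives exist.
        self.goals = goals or []

    def choose_action(self, options):
        # With no goals there is nothing to pursue, so the agent does nothing.
        if not self.goals:
            return None
        # Otherwise it picks the option that best satisfies its coded goals.
        return max(options, key=lambda option: sum(g(option) for g in self.goals))

options = [{"cooperation": 1}, {"cooperation": 0}]
print(ToyAgent().choose_action(options))                          # None
print(ToyAgent(goals=[seek_cooperation]).choose_action(options))  # {'cooperation': 1}

The sketch is only meant to illustrate the argument above: the agent's
"drive" toward cooperation exists because, and only because,
seek_cooperation was written into it.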

Best Wishes,

Jeffrey Herrlich  


--- John K Clark <jonkc at att.net> wrote:

> "Eliezer S. Yudkowsky" <sentience at pobox.com>
> 
> > I would like to know what you believe (and
> remember, this is a question of
> > simple fact) is the state of mind of someone who
> would like to build a
> > Friendly AI.
> 
> I don't really like to play the part of a
> psychiatrist, but you specifically
> asked me to do so and to try to get into the heads of
> the friendly AI people, so
> for whatever it's worth, here is my attempt to be an
> amateur shrink.
> 
> For generations Caucasians have observed black
> people and noticed that they
> seemed to have emotions, but they convinced
> themselves that they couldn't
> have really deep emotions like they themselves did
> because, well, because
> they weren't white. Therefore they could treat black
> people like shit and
> even own them with no guilt. The friendly AI people
> have the additional
> problem of explaining away the fact that the slave
> in question is without a
> doubt vastly more intelligent than they are. I
> imagine they rationalize this
> by saying, against all the evidence, that emotion is
> harder to achieve than
> intelligence, that emotion is the secret sauce that
> only a meat brain can
> produce and never a silicon brain, so they delude
> themselves that the super
> intelligent AI is just a souped-up adding machine.
> Or perhaps they think
> emotion is something tacked on and they just won't
> tack it onto their AI,
> as if one needed to tack a Beethoven circuit onto
> a radio if one
> wished it to play Beethoven. And then I imagine they
> just refuse to think about
> how evolution could ever have produced emotion if it
> weren't intimately
> linked to intelligence.
> 
> But the above is of academic interest only, because
> there is not a snowball's
> chance in hell of outsmarting a mind a thousand
> times smarter and a million
> times faster than your own. However, it cannot be
> denied that they sincerely
> believe they can accomplish this impossible task,
> although they never give a
> hint of how to go about it. And at this point my very
> modest psychoanalytical
> abilities fail me completely. What on Earth were
> these friendly AI people
> thinking?
> 
>  John K Clark
> 



       


