[ExI] Unfriendly AI is a mistaken idea.

John K Clark jonkc at att.net
Fri Jun 8 17:30:37 UTC 2007


"A B" <austriaaugust at yahoo.com>

> your intuitions about emotions
> and motivations are just totally *wrong*.

Apparently Evolution was also wrong to invent emotion first and to come up
with intelligence only after 500 million years.

> In how many different ways must that be demonstrated?

42.

> An AI *is* a specific computer.

Then you *are* a specific computer too.

> If my desktop doesn't need an emotion to run a program or respond within
> it, why "must" an AI have emotions?

If you are willing to embrace the fantasy that your desktop is intelligent,
then I see no reason why you would not also believe in the much more modest
and realistic fantasy that it is emotional. Emotions are easy; intelligence
is hard.

> I don't understand it John, before you were claiming fairly ardently that
> "Free Will" doesn't exist.

I made no such claim; I claimed it does not even have the virtue of
non-existence. As expressed by most people, the noise "free will" is no more
meaningful than a burp.

> Why are you now claiming in effect that an AI will
> automatically execute a script of code that doesn't
> exist - because it was never written (either by the
> programmers or by the AI)?

I don't know why I'm claiming that either, because I don't know what the hell
you're talking about. Any AI worthy of the name will write programs to run on
itself, and nobody, including the AI, knows what the outcome of those programs
will be. Even the AI doesn't know what it will do next; it will just have to
run the programs and wait to see what it decides to do. That is the only
meaning I can attach to the noise "free will" that is not complete gibberish.
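To make that concrete, here is a minimal sketch (my own illustration, not
anything from the original post) of a program whose outcome nobody knows how
to predict short of running it. The Collatz iteration below always seems to
reach 1, but no proof is known, so even the program's author can only learn
the step count by executing it and watching:

    def collatz_steps(n: int) -> int:
        """Count iterations of the 3n+1 rule until n reaches 1."""
        steps = 0
        while n != 1:
            n = 3 * n + 1 if n % 2 else n // 2
            steps += 1
        return steps

    if __name__ == "__main__":
        for n in (27, 97, 871):
            # No known shortcut: run it and observe what it does.
            print(n, "->", collatz_steps(n), "steps")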

> The problem is, not all functioning minds must be even *remotely* similar
> to the higher functions of a *human* mind.

The problem is that a mind not even *remotely* similar to *any* of the
*higher* functions of the human mind, that is to say a mind with absolutely
no point of similarity between us and it, is not functioning very well. It is
true that a mind that didn't understand mathematics or engineering or science
or philosophy or economics would be no threat to us, but it would also be of
no use; and since we would have absolutely nothing in common with it there
would be no way to communicate with it, and thus no reason to build it.

> AI will not have any emotions to begin with

Our ancestors had emotions long ago when they began their evolutionary
journey, but that's different because, because, well, because meat has a
soul but semiconductors never can. I know you don't like that four-letter
word, but face it, that is exactly what you're saying.

John K Clark