[ExI] What might be enough for a friendly AI?
Stefano Vaj
stefano.vaj at gmail.com
Wed Nov 17 22:48:01 UTC 2010
2010/11/17 spike <spike66 at att.net>:
> Well friendly to me of course. Silly question. And friendly to you too, so
> long as you are friendly to me and my friends, but not to my enemies or
> their friends.
Sure, rain may be friendly to the farmer and unfriendly to the truck
driver, even though it is hardly "intelligent".
So why is it so difficult to accept that "friendliness" is simply a
projection of a supposed internal state onto whatever happens to serve
one's purposes, so that neither "rapture" nor "doom" is really a vision
of any help in discussing AGI?
But if we come down to a really literal and personal meaning of
"friendliness", yes, I am willing to bet that neither any increase in
raw computing power, nor the choice to use some of it to emulate
"human, all too human" behaviours, is likely to kill me any sooner
than old age, disease, or accidents will.
And, all in all, if I am really going to be killed by a computer, I
suspect a stupid or primitive one would have no more qualms or trouble
doing so than a "generally intelligent" one.
--
Stefano Vaj