[ExI] From Arms Race to Joint Venture
John Clark
johnkclark at gmail.com
Mon Oct 22 19:09:19 UTC 2018
On Sun, Oct 21, 2018 at 1:47 PM William Flynn Wallace <foozler83 at gmail.com>
wrote:
> Recent books by Damasio and Sapolsky show conclusively that without
> emotions, people just cannot make decisions that make much sense.
>
AlphaZero makes decisions when it plays Go or Chess or Shogi, and those
decisions make sense, more sense than the decisions the humans make when
they play against it. So I guess AlphaZero has emotions.
> Are AIs somehow to be equipped with superhuman emotions?
>
They don't need to be super; regular old emotions will do. From an
operational viewpoint an emotion is just a predisposition to do X rather
than Y, and I see no reason that would be especially hard to program. For
example, pain could be a subroutine such that the closer the number in the
X register comes to the integer P, the more computational resources are
devoted to changing that number; and if it ever actually equals P, the
program should stop doing everything else and do nothing but try to move
that number far enough away from P that the matter is no longer urgent and
the program can again do things that have nothing to do with P.
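The pain subroutine described above can be sketched in a few lines of Python. This is a minimal illustration, not code from any actual system: the names (X, P, the distance thresholds, the resource model) are all hypothetical, and "resources" are modeled simply as the fraction of a scheduling step spent on avoidance.

```python
# Hypothetical illustration of the "pain subroutine" idea:
# the closer x gets to P, the more effort goes to moving it away.

URGENT_DISTANCE = 10   # within this distance of P, pain dominates
SAFE_DISTANCE = 50     # far enough from P to resume normal work

def pain_priority(x: int, p: int) -> float:
    """Fraction of resources devoted to avoidance; grows as x nears p."""
    distance = abs(x - p)
    if distance == 0:
        return 1.0                       # x == p: drop everything else
    return min(1.0, URGENT_DISTANCE / distance)

def step(x: int, p: int) -> int:
    """One scheduling step: balance pain-avoidance against other work."""
    priority = pain_priority(x, p)
    if priority >= 1.0:
        # Full "pain" mode: do nothing but push x away from p
        # until the matter is no longer urgent.
        while abs(x - p) < SAFE_DISTANCE:
            x += 1 if x >= p else -1
    elif priority > 0.5:
        # Mixed mode: nudge x away, leaving the rest of the
        # step free for work that has nothing to do with p.
        x += 1 if x >= p else -1
    return x
```

When x is far from P, `pain_priority` is small and the program barely notices; when x equals P, `step` refuses to do anything else until x is a safe distance away, which is the behavioral signature Clark is calling pain.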
> Historically, of course, emotional decisions were regarded as greatly
> flawed and to be avoided, and now we find out that we simply cannot avoid
> them.
I agree: there is no logical reason to prefer doing something rather than
nothing, or to prefer life over death, but I like something better than
nothing and life better than death anyway, because that's the way my brain
is wired.
John K Clark