[ExI] From Arms Race to Joint Venture

William Flynn Wallace foozler83 at gmail.com
Mon Oct 22 20:21:01 UTC 2018

 From an operational viewpoint an emotion is just a predisposition to do X
rather than Y.   John Clark

So, I think I was right: the computer leans (has an attitude toward
something it considers positive) and bases its decision on that leaning.
That means the probabilities it calculates always influence its decision;
it always goes with the highest probability of success, I am assuming.

Probabilities are then the analog of emotions, and are to be preferred to
"Oh, I don't exactly know why, I just kinda like that one."
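The "goes with the highest probability of success" rule above can be sketched in a few lines. This is only an illustration of the idea; the move names, probabilities, and function name are invented here, not anything AlphaZero actually uses.

```python
# A minimal sketch, assuming the decision-maker has already assigned an
# estimated probability of success to each candidate action. The
# "leaning" is then just a preference for the highest estimate.

def choose_move(move_probs: dict[str, float]) -> str:
    """Pick the action the evaluator leans toward: the one with the
    highest estimated probability of success."""
    return max(move_probs, key=move_probs.get)

# Hypothetical leanings for three candidate moves.
leanings = {"advance": 0.61, "defend": 0.27, "sacrifice": 0.12}
print(choose_move(leanings))
```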

bill w

On Mon, Oct 22, 2018 at 2:14 PM John Clark <johnkclark at gmail.com> wrote:

> On Sun, Oct 21, 2018 at 1:47 PM William Flynn Wallace <foozler83 at gmail.com>
> wrote:
> > Recent books by Damasio and Sapolsky show conclusively that without
>> emotions, people just cannot make decisions that make much sense.
> AlphaZero makes decisions when it plays GO or Chess or Shogi and those
> decisions make sense, more sense than the decisions the humans make when
> they play against it. So I guess AlphaZero has emotions.
> > Are AIs somehow to be equipped with superhuman emotions?
> They don't need to be super, regular old emotions will do. From an
> operational viewpoint an emotion is just a predisposition to do X rather
> than Y, and I see no reason that would be especially hard to program. For
> example, pain could be a subroutine such that the closer the number in the
> X register comes to the integer P the more computational resources will be
> devoted to changing that number, and if it ever actually equals P then the
> program should stop doing everything else and do nothing but try to change
> that number to something far enough away from P until it's no longer an
> urgent matter and the program can again do things that have nothing to do
> with P.
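The pain subroutine John describes can be sketched as a scheduling rule. A minimal illustration, assuming a single integer state x, a "pain" value P, and a safe distance beyond which the matter is no longer urgent; all names and numbers here are invented for the sketch.

```python
# A sketch of the "pain subroutine" described above: the closer x gets
# to P, the larger the share of resources devoted to changing x; at
# x == P everything else stops until x is far enough from P again.

P = 100              # the integer the program "hurts" to approach (hypothetical)
SAFE_DISTANCE = 10   # beyond this distance the matter is no longer urgent (assumption)

def pain_priority(x: int) -> float:
    """Share of computational resources (0.0 to 1.0) spent changing x.
    Grows as x nears P; reaches 1.0 (total preemption) at x == P."""
    distance = abs(x - P)
    if distance >= SAFE_DISTANCE:
        return 0.0                       # no pain: normal operation
    return 1.0 - distance / SAFE_DISTANCE

def escape(x: int) -> int:
    """If x equals P, suspend all other work and move x away from P
    until the distance is no longer urgent."""
    if x != P:
        return x
    while abs(x - P) < SAFE_DISTANCE:
        x += 1                           # push x away; direction is arbitrary here
    return x
```

Under this sketch, pain is not a mysterious inner quality but a priority signal: a number that reallocates effort toward one goal at the expense of all others.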
> > Historically, of course, emotional decisions were regarded as greatly
>> flawed and to be avoided, and now we find out that we simply cannot avoid
>> them.
> I agree, there is no logical reason to prefer doing something rather than
> nothing or to prefer life more than death, but I like something better than
> nothing and life better than death anyway because that's the way my brain
> is wired.
> John K Clark
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat