-----Original Message-----<br>
From: Tom Nowell <A href="mailto:nebathenemi@yahoo.co.uk">nebathenemi@yahoo.co.uk</A><br>
<br>
<div id=AOLMsgPart_0_a5889a47-b0d0-4386-86d3-a457b48b40c9 style="FONT-SIZE: 12px; MARGIN: 0px; COLOR: #000; FONT-FAMILY: Tahoma, Verdana, Arial, Sans-Serif; BACKGROUND-COLOR: #fff"><PRE style="FONT-SIZE: 9pt"><TT>To add my voice to the debate about programming emotion into AIs -
emotional responses make a handy shortcut in decision-making. A couple
of recent popular psychology books (e.g. Blink; I can't remember the
name of the other one I read) have as their central point the sheer
volume of decisions you undertake subconsciously and instantly. An AI
without emotions would have to work through everything, carefully
decide what criteria to judge things on, then use those criteria to
weigh up its options - this could eat up a whopping amount of
processing time.
People often quote the example of the donkey equally distant from two
water sources which can't decide between them and dies of thirst (the
technical name is Buridan's ass, after the philosopher Jean Buridan).
Sometimes irrational factors or random choice can make a decision
where logic struggles. The ability to go "I favour X" without thinking
too much about it saves a lot of time and streamlines decision-making.
Now, given a lot of processing power you don't need these shortcuts,
but for near-term AI they would really help.
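To make the contrast concrete, here is a minimal Python sketch (all names are made up for illustration; this isn't from any real AI system) of full deliberation - which needs a random tie-break to escape the equidistant-donkey deadlock - versus a cached emotional preference that skips the weighing entirely:

```python
import random

def deliberate(options, criteria):
    """Full deliberation: score every option against every criterion."""
    def score(attrs):
        return sum(weight * attrs.get(feature, 0)
                   for feature, weight in criteria.items())
    scores = {name: score(attrs) for name, attrs in options.items()}
    best = max(scores.values())
    # Several options may tie exactly; break the tie at random
    # rather than deadlocking like the donkey.
    tied = [name for name, s in scores.items() if s == best]
    return random.choice(tied)

def emotional_shortcut(options, preferences):
    """Shortcut: if a cached preference applies, skip deliberation."""
    for name in options:
        if name in preferences:
            return name   # "I favour X" without weighing any criteria
    return None           # no preference cached; fall back to deliberating

options = {"left_water": {"distance": 5}, "right_water": {"distance": 5}}
print(deliberate(options, {"distance": -1.0}))      # random tie-break
print(emotional_shortcut(options, {"left_water"}))  # left_water, instantly
```

The shortcut does no scoring at all, which is the processing-time saving Tom is pointing at; the cost is that the cached preference has to come from somewhere.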
We don't have to make AIs follow our evolutionary psychology - their
emotions could be made similar to ours to make it easier for the two
types of intelligence to communicate, or we could deliberately tailor
theirs to be better attuned to what they are for (territoriality and
defending the group would be fantastic emotions for an AI helping
design computer firewalls and anti-virus software, but useless for a
deep space probe).
To summarise, I think people trying to make the "being of pure logic"
type of AI are giving themselves an uphill struggle, only to create
an intelligence many humans would have difficulty communicating with.
Tom
__________________________________________________________
<br>
Tom, I can see your point, but just because we react or make choices based upon our emotions doesn't mean that <br>
the processing is absent. My view is that when we are born we have pretty much just two emotions, happy and sad.<br>
Anyone with children will know that, while awake, if they are not laughing they are screaming. Over time we develop<br>
or streamline our emotional set through years of data input, calculation, and trial and error. I would like to think <br>
the result is that specific circuits develop to recognise the complex sensory patterns which evoke specific emotions. <br>
As you say, acting on emotion reduces processing and reaction time considerably. However, the processing is still being <br>
done at some level even if we are not directly aware of it.<br>
The end result of this is that we still need to process this data in our AI regardless of whether the AI<br>
is designed to actually 'feel' it or not. We cannot just create an emotional response unless we generate completely random<br>
responses or the triggers are very simple. <br>
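To illustrate the "very simple triggers" case: about the only emotional response you get for free is a fixed lookup from an already-recognised stimulus to a label (a Python sketch with made-up trigger names, not anything from a real system). Anything richer needs the upstream pattern-recognition processing first, which is exactly the point above.

```python
# Fixed trigger table: the only "emotion" possible without real processing.
SIMPLE_TRIGGERS = {
    "loud_noise": "fear",
    "food": "happy",
    "restraint": "anger",
}

def emotional_response(stimulus):
    # Nothing beyond a table lookup; unrecognised stimuli evoke nothing,
    # because recognising them would itself require computation.
    return SIMPLE_TRIGGERS.get(stimulus, "none")

print(emotional_response("loud_noise"))  # fear
print(emotional_response("sarcasm"))     # none
```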
<br>
Alex<br>
</TT></PRE></div>