[ExI] The point of emotions

ablainey at aol.com ablainey at aol.com
Wed Apr 23 00:18:30 UTC 2008

-----Original Message-----
From: Tom Nowell nebathenemi at yahoo.co.uk

To add my voice to the debate about programming emotion into AIs -
emotional responses make a handy shortcut in decision-making. A couple
of recent popular psychology books (e.g. Blink, and I can't remember
the name of the other one I read) have as their central point the sheer
volume of decisions you make subconsciously and instantly. An AI
without emotions would have to process everything explicitly and carefully
decide what criteria to judge things on, then use those criteria to
carefully weigh up the options - this could eat up a whopping amount of
processing time.
 People often quote the example of the donkey equally distant from two
water sources who can't decide, and dies of thirst (the paradox of
Buridan's ass, named after the philosopher Jean Buridan).
Sometimes irrational factors or random choice can make a decision
where logic struggles. The ability to go "I favour X" without thinking
too much about it saves a lot of time and streamlines decision-making.
Now, given a lot of processing power you don't need these shortcuts,
but for near-term AI these shortcuts would really help. 
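The shortcut-versus-deliberation contrast, and the random tie-break that rescues the donkey, can be sketched in a few lines of Python. This is my own illustration rather than anything proposed in the thread; the function names and the flat scoring scheme are invented for the example.

```python
import random

# Toy sketch of the two decision styles discussed above:
# full deliberation over criteria vs. a cheap "I favour X" shortcut,
# with a random tie-break to avoid the Buridan's-ass deadlock.

def deliberate(options, criteria):
    """Slow path: score every option against every criterion."""
    scores = {opt: sum(c(opt) for c in criteria) for opt in options}
    best = max(scores.values())
    tied = [opt for opt, s in scores.items() if s == best]
    # An irrational/random factor breaks ties instead of deadlocking.
    return random.choice(tied)

def shortcut(options, preference):
    """Fast path: act on a precomputed preference, no re-deriving criteria."""
    return max(options, key=lambda o: preference.get(o, 0))

options = ["left spring", "right spring"]
criteria = [lambda o: 1.0]            # both springs equally attractive
print(deliberate(options, criteria))  # picks one at random; never starves
print(shortcut(options, {"left spring": 0.6}))
```

The point of the sketch is only that the shortcut path does constant work per decision, while the deliberation path scales with the number of criteria - which is the processing cost the post is talking about.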
 We don't have to make AIs follow our evolutionary psychology - their
emotions could be made similar to ours to make it easier for the two
types of intelligence to communicate, or we could deliberately tailor
theirs to be better attuned to what they are for (territoriality and
defending the group would be fantastic emotions for an AI helping
design computer firewalls and anti-virus software, but useless for a
deep space probe).
 To summarise, I think people trying to make the "being of pure logic"
type of AI are creating an uphill struggle for themselves, only to end up
with an intelligence many humans would have difficulty communicating with.

Tom, I can see your point, but just because we react or make choices based upon our emotions doesn't mean that
the processing is absent. My view of it is that when we are born we have pretty much just two emotions, happy and sad.
Anyone with children will know that while conscious, if they are not laughing they are screaming. Over time we develop
or streamline our emotional set through years of data input, calculation and trial and error. I would like to think 
the result is that specific circuits are developed to recognise the complex sensory patterns which evoke specific emotions. 
As you say, acting on emotion reduces the processing and reaction time considerably. However, the processing is still being
done at some level even if we are not directly aware of it.
The end result of this is that we still need to process this data in our AI regardless of whether the AI
is designed to actually 'feel' it or not. We cannot just create an emotional response unless we generate completely random
responses or the triggers are very simple.
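One way to picture this "the processing still happens, it is just front-loaded" idea is a trivially small learned trigger: years of trial and error compress into weights, and the fast emotional reaction is then just a cheap pattern match. This is my own toy sketch, not anything from the post; the feature names and the perceptron-style learner are assumptions made for illustration.

```python
# Toy illustration: the processing behind an emotional response is not
# absent, it is front-loaded into training. Trial and error adjusts
# weights; the fast "emotional" reaction is then a cheap dot product.

def train_emotion_circuit(samples, lr=0.1, epochs=200):
    """Learn weights mapping sensory features to a happy/sad response."""
    n = len(samples[0][0])
    w = [0.0] * n
    for _ in range(epochs):
        for features, label in samples:  # label: +1 happy, -1 sad
            pred = 1 if sum(wi * f for wi, f in zip(w, features)) > 0 else -1
            if pred != label:  # trial and error: adjust only on mistakes
                w = [wi + lr * label * f for wi, f in zip(w, features)]
    return w

def react(w, features):
    """The fast 'emotional' path: no deliberation, just the cached circuit."""
    return "happy" if sum(wi * f for wi, f in zip(w, features)) > 0 else "sad"

# Hypothetical sensory features: [warmth, loud noise]
samples = [([1.0, 0.0], 1), ([0.0, 1.0], -1),
           ([1.0, 0.2], 1), ([0.2, 1.0], -1)]
w = train_emotion_circuit(samples)
print(react(w, [0.9, 0.1]))  # the learned shortcut fires without deliberation
```

The triggers here are deliberately simple, which is exactly the caveat in the paragraph above: anything richer than this would need correspondingly richer processing somewhere, whether the AI "feels" the result or not.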

