[ExI] The point of emotions
nebathenemi at yahoo.co.uk
Mon Apr 21 10:18:00 UTC 2008
To add my voice to the debate about programming emotion into AIs:
emotional responses make a handy shortcut in decision-making. A couple
of recent popular psychology books (e.g. Blink, and I can't remember
the name of the other one I read) have as their central point the sheer
volume of decisions we make subconsciously and instantly. An AI
without emotions would have to reason through everything explicitly:
carefully decide what criteria to judge things on, then use those
criteria to carefully weigh up the options. This could consume a
whopping amount of processing time.
People often cite the example of the donkey standing equally distant
from two water sources who cannot decide between them and dies of
thirst (Buridan's ass, named after the philosopher Jean Buridan).
Sometimes irrational factors or random choice can settle a decision
where logic struggles. The ability to go "I favour X" without thinking
too hard about it saves a lot of time and streamlines decision-making.
Now, given a lot of processing power you don't need these shortcuts,
but for near-term AI they would really help.
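As a minimal sketch of the idea (all names here are my own illustration,
not any real AI system): an "emotional" bias can act as a cheap prior
that nudges choices, and random selection among near-tied options avoids
the Buridan's-ass deadlock without re-deriving criteria from scratch.

```python
import random

def choose(options, scores, bias=None, epsilon=0.01):
    """Pick an option; break near-ties with a bias or a random choice.

    options: list of labels; scores: matching utility estimates.
    bias: optional "emotional" preference map {option: bonus} -- a cheap
    shortcut that resolves deadlocks without lengthy deliberation.
    """
    if bias:
        scores = [s + bias.get(o, 0.0) for o, s in zip(options, scores)]
    best = max(scores)
    # Buridan's-ass guard: among effectively tied options, pick at random
    # rather than deliberating forever over a negligible difference.
    tied = [o for o, s in zip(options, scores) if best - s <= epsilon]
    return random.choice(tied)
```

With a bias the tie is broken instantly ("I favour X"); without one, a
random pick still ensures the donkey drinks.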
We don't have to make AIs follow our evolutionary psychology - their
emotions could be made similar to ours to make it easier for the two
types of intelligence to communicate, or we could deliberately tailor
theirs to be better attuned to what they are for (territoriality and
defending the group would be fantastic emotions for an AI helping
design computer firewalls and anti-virus software, but useless for a
deep space probe).
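The tailoring idea could be sketched as a per-role weighting over a
shared set of drives (a hypothetical configuration of my own invention;
the drive names and numbers are purely illustrative):

```python
# Hypothetical per-role "emotion profiles": the same drive set, weighted
# differently depending on what the AI is for.
EMOTION_PROFILES = {
    "firewall_assistant": {"territoriality": 0.9, "group_defence": 0.8,
                           "curiosity": 0.2},
    "deep_space_probe":   {"territoriality": 0.0, "group_defence": 0.0,
                           "curiosity": 0.9},
}

def drive_weight(role, drive):
    # Unlisted roles or drives default to zero influence on decisions.
    return EMOTION_PROFILES.get(role, {}).get(drive, 0.0)
```

So the firewall assistant's territoriality strongly shapes its choices,
while the probe's decisions are driven by curiosity instead.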
To summarise: I think people trying to build the "being of pure logic"
type of AI are setting themselves an uphill struggle, only to create
an intelligence many humans would have difficulty communicating with.