[ExI] Warm fuzzes

Henrique Moraes Machado cetico.iconoclasta at gmail.com
Thu Mar 6 12:15:43 UTC 2008


Keith Henson> I have not thought a lot about this, but motivating anything
> that powerful with pain seems to me to be a very bad idea.

Pain is a very important element in our development. Nothing gives us a clearer
signal of danger (too much light hurts my eyes, too much heat can burn me). That
said, I'm not advocating that we torture our AIs :-), but rather that we give
them the same sensory inputs as all other naturally evolved beings. We live in a
painful environment, and if our AIs are to understand this environment as we
do, they must have similar experiences. And there are also painful emotions,
such as not being able to fit into a group we want to join (or not being able to
buy that Ferrari...). We get all sorts of positive and negative feedback
from our environment.
And to all those advocating a purely logical, non-emotional AGI: I don't think it's
even plausible. Since the G stands for general, and generally intelligent beings
(we, dolphins, dogs, etc.) have emotions.




More information about the extropy-chat mailing list