[ExI] How could you ever support an AGI?

Henrique Moraes Machado cetico.iconoclasta at gmail.com
Wed Mar 5 12:58:20 UTC 2008



Alex> Again anthropomorphically intelligent. It may well be the cold
> intelligent decision to pre-emptively exterminate a potential threat. After
> all, it wouldn't feel bad about it; it wouldn't feel anything.

Maybe we should program our AIs with a "desire for belonging". Humans (and
other social animals) have it: we want to be part of a group. Maybe we
shouldn't program an AI without emotions at all. Emotion is a part of
intelligence, after all, isn't it?
And maybe we shouldn't program our AIs without sensory input either (mainly
pain).
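
To make that a bit more concrete, here is a toy sketch (purely my own
illustration, assuming a reward-driven agent; the positions, weights, and
function names are all made up): a "desire for belonging" could show up as a
reward term that favours staying near one's group, and "pain" as a penalty on
bodily damage.

import math

GROUP_POSITIONS = [(0.0, 0.0), (1.0, 0.5), (-0.5, 1.0)]  # where the agent's "group" sits

def belonging_reward(agent_pos, group_positions, weight=1.0):
    # Reward rises (towards zero) as the agent stays near the group's centroid.
    cx = sum(p[0] for p in group_positions) / len(group_positions)
    cy = sum(p[1] for p in group_positions) / len(group_positions)
    distance = math.hypot(agent_pos[0] - cx, agent_pos[1] - cy)
    return -weight * distance

def pain_penalty(damage, weight=5.0):
    # Crude "pain": damage to the agent's own body is strongly negative.
    return -weight * damage

def total_reward(task_reward, agent_pos, damage):
    # The task reward is shaped by the social and bodily terms above.
    return task_reward + belonging_reward(agent_pos, GROUP_POSITIONS) + pain_penalty(damage)

# An agent that wanders off and gets slightly damaged scores worse than one
# that stays with its group unharmed, even for the same task reward.
print(total_reward(task_reward=10.0, agent_pos=(0.2, 0.5), damage=0.0))
print(total_reward(task_reward=10.0, agent_pos=(8.0, 9.0), damage=0.3))

Of course a shaping term like this only captures the behavioural side of
belonging and pain; whether that amounts to anything like real emotion is
exactly the open question.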



