[ExI] Empathic AGI

John Clark jonkc at bellsouth.net
Wed Feb 9 17:41:56 UTC 2011


On Feb 7, 2011, at 12:16 PM, Stefano Vaj wrote:
> 
> If we accept that "normal" human-level empathy (that is, a mere
> ingredient in the evolutionary strategies) is enough, we just have to
> emulate a Darwinian machine as similar as possible 

Two difficulties with that:

1) The Darwinian process is more like history than mathematics: it is not repeatable, and very small changes in initial conditions can lead to huge differences in output. 

2) Human-level empathy is aimed at human-level beings; the further a creature is from that level, the less empathy we have for it. We have less empathy for a cow than for a person, and less for an insect than for a cow. As an AI's intelligence grows larger, its empathy for us will grow smaller, although its empathy for its own kind might be enormous.

 John K Clark





