<html><head></head><body style="word-wrap: break-word; -webkit-nbsp-mode: space; -webkit-line-break: after-white-space; "><div><div>On Feb 7, 2011, at 12:16 PM, Stefano Vaj wrote:</div><blockquote type="cite"><div><br>If we accept that "normal" human-level empathy (that is, a mere<br>ingredient in the evolutionary strategies) is enough, we just have to<br>emulate a Darwinian machine as similar as possible </div></blockquote><div><br></div>Two difficulties with that:</div><div><br></div><div>1) The Darwinian process is more like history than mathematics: it is not repeatable, and very small changes in initial conditions can lead to huge differences in output.</div><div><br></div><div>2) Human-level empathy is aimed at human-level beings; the further a creature is from that level, the less empathy we have for it. We have less empathy for a cow than for a person, and less for an insect than for a cow. As the AI's intelligence grows, its empathy for us will shrink, although its empathy for its own kind might be enormous.</div><div><br></div><div> John K Clark</div></body></html>