[ExI] Empathic AGI [WAS Safety of human-like motivation systems]
Stefano Vaj
stefano.vaj at gmail.com
Wed Feb 9 12:04:44 UTC 2011
On 9 February 2011 04:06, Samantha Atkins <sjatkins at mac.com> wrote:
> If by altruism you mean sacrificing your values, just because they are
> yours, to the values of others, just because they are not yours, then it is
> a very bizarre thing to glorify, practice or hope that our AGIs practice.
> It is on the face of it hopelessly irrational and counter-productive toward
> achieving what we actually value. If an AGI practices that just on the
> grounds someone said they "should", then it is in need of serious debugging.
I fully agree.
--
Stefano Vaj