[ExI] Safety of human-like motivation systems [WAS Re: Oxford scientists...]

Samantha Atkins sjatkins at mac.com
Thu Feb 3 20:54:01 UTC 2011

On 02/03/2011 10:19 AM, Stefano Vaj wrote:
> On 2 February 2011 17:40, Richard Loosemore<rpwl at lightlink.com>  wrote:
>> The problem with humans is that they have several modules in the motivation
>> system, some of them altruistic and empathic and some of them selfish or
>> aggressive.   The nastier ones were built by evolution because she needed to
>> develop a species that would fight its way to the top of the heap.  But an
>> AGI would not need those nastier motivation mechanisms.
> Am I the only one finding all that a terribly naive projection?

Yes, in part because calling selfishness, that is to say seeking what 
you value over what you don't, "nasty" is very simplistic.  Assuming 
that everything we call empathy or altruism is good is equally simplistic.

- s

More information about the extropy-chat mailing list