[ExI] Safety of human-like motivation systems [WAS Re: Oxford scientists...]

Richard Loosemore rpwl at lightlink.com
Thu Feb 3 21:47:07 UTC 2011


Samantha Atkins wrote:
> On 02/03/2011 10:19 AM, Stefano Vaj wrote:
>> On 2 February 2011 17:40, Richard Loosemore<rpwl at lightlink.com>  wrote:
>>> The problem with humans is that they have several modules in the
>>> motivation system, some of them altruistic and empathic and some of
>>> them selfish or aggressive.  The nastier ones were built by evolution
>>> because she needed to develop a species that would fight its way to
>>> the top of the heap.  But an AGI would not need those nastier
>>> motivation mechanisms.
>> Am I the only one finding all that a terribly naive projection?
> 
> Yes, in part because calling "selfish" (that is to say, seeking what
> you value more than what you don't) "nasty" is very simplistic.
> Assuming that all we call empathic or altruistic is good is also
> simplistic.

I did not, in fact, make the "simplistic" claim that you describe.

Which is to say, I did not equate "selfish" with "nasty".  I merely said 
that there are many modules in the human system, some altruistic and 
empathic, and (on the other hand) some selfish or aggressive.

There are many such modules, and the ones that could be labeled 
"selfish" include such mild and inoffensive motives as "seeking what you
value more than what you don't".  No problem there -- nothing nasty 
about that.  But under the heading of "selfish" there are also 
motivations in some people to "seek self-advancement at all costs, 
regardless of the pain and suffering inflicted on others".

In game-theoretic terms, this latter motivation represents an extreme 
form of defection (as opposed to cooperation), and it is damaging to 
society as a whole: in a Prisoner's Dilemma, mutual defection leaves 
both players worse off than mutual cooperation would have.  It would be 
fair to label this a "nastier" motivation.
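For anyone who wants to see the numbers, here is a minimal Python 
sketch.  The payoff values are the standard textbook Prisoner's Dilemma 
convention (T=5 > R=3 > P=1 > S=0), chosen purely for illustration; 
they are not taken from anything in this thread:

    # Minimal Prisoner's Dilemma sketch.  Payoffs are the usual
    # illustrative values (T=5 > R=3 > P=1 > S=0), nothing more.
    PAYOFF = {
        ("cooperate", "cooperate"): (3, 3),  # mutual cooperation: both get R
        ("cooperate", "defect"):    (0, 5),  # sucker's payoff S vs. temptation T
        ("defect",    "cooperate"): (5, 0),
        ("defect",    "defect"):    (1, 1),  # mutual defection: both get P
    }

    def total_welfare(move_a, move_b):
        """Sum of both players' payoffs: a crude proxy for 'society as a whole'."""
        payoff_a, payoff_b = PAYOFF[(move_a, move_b)]
        return payoff_a + payoff_b

    # Defecting always pays more for the individual (5 > 3 and 1 > 0), yet:
    print(total_welfare("cooperate", "cooperate"))  # 6 -- best for the group
    print(total_welfare("defect", "defect"))        # 2 -- worst for the group

The point of the sketch is just that defection dominates for each 
individual while mutual defection gives the worst collective total, 
which is exactly the sense in which an "always defect" motivation 
damages society as a whole.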

I merely pointed out that some motivational modules can be described as 
"nastier" than others, in that sense.

I did not come anywhere near the simplistic claim that "selfish" == "nasty".

And BTW, I think you meant to start your comment with the word "No", 
because you seem to be agreeing with Stefano.


Richard Loosemore


