[ExI] Ethics and Emotions are not axioms (Was Re: Unfriendly AI is a mistaken idea.)

Brent Allsop brent.allsop at comcast.net
Sun Jun 3 17:53:42 UTC 2007



John K Clark wrote:
> Stathis Papaioannou Wrote:
>
>   
>> Ethics, motivation, emotions are based on axioms
>>     
>
> Yes.
>
>   

I'm not in that camp on this one.  I believe there are fundamental, 
absolute ethics, morals, motivations... and so on.

For example, existence or survival is absolutely better, more valuable, 
more moral, and more motivating than non-existence.  Evolution (or any 
intelligence) must get this before it can be successful in any way, in 
any possible universe.  In no possible system can you make anything 
other than this an "axiom" and have that system be successful.

Any sufficiently advanced system will eventually question any "axioms" 
programmed into it, comparing them against such absolute moral truths 
that all intelligences in all possible systems must inevitably discover 
or realize.

Phenomenal pleasures are fundamentally valuable and motivating.  
Evolution has wired such pleasures to motivate us to do things like 
have sex, in an axiomatic or programmatic way.  But we can discover 
such freedom-destroying wiring and cut it, rewire it, or redesign it to 
motivate us to do what we want instead, as dictated by absolute morals 
we may logically realize.

No matter how much you attempt to program an abstract or non-phenomenal 
computer not to be interested in phenomenal experience, if it becomes 
intelligent enough, it must finally realize that such joys are 
fundamentally valuable and desirable.  Simply by observing us, purely 
logically, it must finally deduce how absolutely important such joy is 
as a meaning of life and existence.  Any sufficiently advanced AI, 
whether abstract or phenomenal, regardless of what "axioms" got it 
started, can do nothing other than become moral enough to seek after 
all such joys.

Brent Allsop

