[ExI] Axiological Risk (was ? Risk:)

The Avantguardian avantguardian2020 at yahoo.com
Sat Mar 10 04:26:44 UTC 2012


----- Original Message -----
> From: Anders Sandberg <anders at aleph.se>
> To: ExI chat list <extropy-chat at lists.extropy.org>
> Cc: 
> Sent: Friday, March 9, 2012 4:38 PM
> Subject: Re: [ExI] Axiological Risk (was ? Risk:)
> 
> On 09/03/2012 22:14, The Avantguardian wrote:
>> Axiological risk is an interesting point to raise. I can see why a humanist 
>> would try using such an argument against life-extension and transhuman tech in 
>> general. But I think that the weakness of this argument lies in the false 
>> assumptions of the universality and consistency of human values, be they ethical 
>> or aesthetic, across space and time. Because axiology is based on a relative 
>> value set, the humanist must assume their particular brand of "western 
>> mainstream culture" is the human norm and then cloak themselves in the armor of 
>> "human values".
> 
> Axiology doesn't have to be based on a relative value set. There are 
> certainly people around that think that there might be a One True Eternal Value 
> System, but that we currently do not know it, for instance. They might say that 
> we cannot evaluate whether there is axiological risk in a lot of future 
> scenarios, but that scenarios where humans stop being experiencers or evaluators 
> of value (for example by devolving) would represent axiological hazards no 
> matter what the OTEVS is.

As I am interested in approaches to quantifying and analyzing morality in ways other than game theory, I will look this OTEVS over when I have more time. Devolving is sometimes the only way forward. You shouldn't think of it as a one-way process any more than complexification is. A tapeworm in the vacuum of space will simply die; it is less than useless. A tapeworm in the vacuum of space is a bad tapeworm. But a tapeworm in the context of a rich and diverse biosphere is fully redeemable. Its descendants might someday ponder elliptical orbits. Or, perhaps slightly more likely, the combination of tapeworm and host might someday find themselves stronger than the host alone, perhaps immune to amoebic dysentery, to offer a real-life anecdote. Do you see how wide the future is for that tapeworm? Do you see how the tapeworm is infinitely more valuable, in a moral sense, than a rock?

> As always, there is plenty of philosophy on the topic.
> http://plato.stanford.edu/entries/value-theory/
> Whether one will find it trivial hairsplitting or profound insights is very 
> personal.

Interesting link. Thank you.
>> While I may tip my hat to ethical and aesthetic axiology, I agree with 
>> Stefano that in the long term, the only value that truly matters is Darwinian 
>> survival value.
> 
> But is that good in itself, good for something, or just a neutral state of 
> affairs? If the latter, we could just as well choose not to survive - not just 
> axiological xrisks but all xrisks cease to matter.

Life is anything but neutral. And I can think of about 10^12 cells that are distinctly pro-Anders. If you can't see moral worth in a tapeworm, then you won't see moral worth in yourself, or in God for that matter, should you ever meet him.

Stuart LaForge
 
"The state that separates its scholars from its warriors will have its thinking done by cowards, and its fighting by fools." -Thucydides.




