[ExI] Axiological Risk (was ? Risk:)

The Avantguardian avantguardian2020 at yahoo.com
Fri Mar 9 23:51:21 UTC 2012


I was holding down the ctrl key and a word got dropped from my last message; I really don't want this to be misquoted: "And in over 3.5 billion years of life on this planet, NO species has EVER asked anyone's permission to exist."



Stuart LaForge


"The state that separates its scholars from its warriors will have its thinking done by cowards, and its fighting by fools." -Thucydides. 


----- Original Message -----
> From: The Avantguardian <avantguardian2020 at yahoo.com>
> To: ExI chat list <extropy-chat at lists.extropy.org>
> Cc: 
> Sent: Friday, March 9, 2012 2:14 PM
> Subject: [ExI] Axiological Risk (was ? Risk:)
> 
> ----- Original Message -----
>> From: Anders Sandberg <anders at aleph.se>
>> To: ExI chat list <extropy-chat at lists.extropy.org>
>> Cc: 
>> Sent: Thursday, March 8, 2012 3:36 PM
>> Subject: Re: [ExI] ?Risks: Global Catastrophic, Extinction, Existential and 
> ...
>> 
>> On 08/03/2012 18:15, natasha at natasha.cc wrote:
>>> 
>>> If we consider radical human life extension, what type of risk might
>>> there be? (Extinction risk is obvious, but I'm wondering if extinction
>>> risk is more relevant to a species rather than a person.)
>>> 
>> 
>> In Nick's latest studies of existential risks he recognizes that
>> there might be axiological existential risks - threats to the
>> well-being and value of the species. So some critics of life
>> extension might actually think that it is an xrisk. For example, we
>> might create a situation where we have indefinite but not very
>> valuable lives (filled with ennui and stagnation, yet afraid of
>> dying), yet this culture is so strong an attractor that it can never
>> be escaped (perhaps because all resources end up controlled by the
>> immortals), and hence prevents humanity from ever reaching its true
>> potential. This is a rather strong claim that depends on 1) life
>> extension having strong general negative effects that are not
>> outweighed by benefits, and 2) once instituted it becomes a permanent
>> and unavoidable state. I think the onus on people willing to argue 1
>> and 2 is pretty heavy.
> 
> Axiological risk is an interesting point to raise. I can see why a
> humanist would try using such an argument against life extension and
> transhuman tech in general. But I think that the weakness of this
> argument lies in its false assumption that human values, be they ethical
> or aesthetic, are universal and consistent across space and time.
>  
> Because axiology is based on a relative value set, the humanist must
> assume their particular brand of "western mainstream culture" is the
> human norm and then cloak themselves in the armor of "human values". But
> these very "human values" have undergone extensive evolution themselves.
> After all, you will probably never have to pay a weregild to a woman for
> slaying her husband in a mead hall during a drunken brawl, oh son of
> Viking berserkers. And I seldom consider the size of the lip-disks of my
> potential mates, despite their importance and sexiness in several very
> human tribes around the world. So there is a certain cultural narcissism
> that comes with such assumptions, which you could call "the Golden
> Culture fallacy," in analogy to the Golden Age that never existed.
>  
> Really, what has been slowly defining and continually redefining what it
> means to be human over the aeons is not humanist philosophers but WOMEN.
> They are the genetically conservative gatekeepers of the human form and
> function. If tomorrow women all got together and decided that they
> wanted humanity to be three feet tall, blue, and furry, then simply by
> choosing their mates very carefully for a few hundred years, they could
> accomplish this task. Of course women prefer to breed based on simple
> attraction and koinophilia, but that really is a choice and natural
> station that they possess.
>  
> While I may tip my hat to ethical and aesthetic axiology, I agree with
> Stefano that in the long term, the only value that truly matters is
> Darwinian survival value. And while women may have bred us to be
> acceptable at cocktail parties, the hirsute barbarian is not buried
> deeply beneath the surface. So outside of Hollywood, I think the odds of
> unenhanced humans being held under the thumb of cowardly immortals who
> control everything are actually quite small. An immortal's head would
> still look the same on a revolutionary's pike.
>  
> I imagine that as more people get life extension, their long-term
> outcomes would run the spectrum from adaptive to pathological. Those
> that stagnate would be weeded out fairly quickly. Those that
> courageously leave their comfort zones to test their boundaries will, in
> the long run, have no boundaries. Those that embrace change as it passes
> over them will be loved by time and circumstance. That is the way of
> nature.
>  
>> Another common claim is that life extension would increase extinction
>> risks by making us evolve slower or reduce our flexibility. This of
>> course misses that we can evolve using our own means or recognize the
>> need to maintain flexible decision structures.
> 
> That is the death-mongers using life extension as a strawman. The *real*
> issue is morphological freedom and self-directed evolution. Enable the
> technology for life extension and you simultaneously enable the
> technology that would allow us to evolve into gigantic water-breathing
> bioluminescent freaks in the space of a single generation, if we saw a
> survival advantage in doing so.
>  
> [...]
>> However, in real life rights are never as pure negative rights as
>> they are in liberal ethics, and quite often involve or even need
>> positive rights - claims on other people for help. We are also
>> entangled in thick and messy social and cultural bonds that do not
>> just influence us but cause "voluntary" reductions of our freedom
>> (when I promise something I reduce my future options to lie, if I
>> want to remain an honest person). This is where a right to life might
>> become tricky in terms of how life extension plays out against the
>> other links - but while this is where the cultural and social action
>> is, it is also so individual and messy that formal philosophy cannot
>> say much except generalities.
> 
> I am sensitive to the fact that there are many philosophies regarding
> rights and that many are probably more subtle than mine. But in the real
> world that exists outside of this fragile consensual dream-bubble of
> three square meals a day and a warm bed at night, you have only what
> rights you are willing to assert and defend. And in over 3.5 billion
> years of life on this planet, NO species has EVER asked anyone's
> permission to exist.
> 
>  
>> Life extension will always be risky because it is going into
>> uncharted waters by definition: nobody will have lived as long as the
>> frontier cohort, and we will not know if there are some subtle
>> problems with living a century, a millennium or an eon extra. But
>> while such risks should be taken seriously they are no valid argument
>> against trying: some risks appear very worth taking.
> 
> Yes, I certainly agree, if only to avoid the cultural and genetic
> stagnation that the humanists are warning us of. At some point a
> transhumanist somewhere must drop the "-ist" and become.
>  
>  
> Stuart LaForge
> 
> 
> "The state that separates its scholars from its warriors will have its 
> thinking done by cowards, and its fighting by fools." -Thucydides.
> 
> "The state that separates its scholars from its warriors will have its 
> thinking done by cowards, and its fighting by fools." -Thucydides.
> 
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
> 



