[ExI] Axiological Risk (was ? Risk:)
avantguardian2020 at yahoo.com
Mon Mar 12 06:46:17 UTC 2012
----- Original Message -----
> From: Anders Sandberg <anders at aleph.se>
> To: ExI chat list <extropy-chat at lists.extropy.org>
> Sent: Saturday, March 10, 2012 8:10 AM
> Subject: Re: [ExI] Axiological Risk (was ? Risk:)
> Good point made by a colleague:
> Might *not* ever developing life extension be an axiological risk?
> If the benefits in value of life extension are vast, then not fulfilling this
> possibility (assuming, uncontroversially here at least, that it is possible)
> would mean mankind never reaches its full potential. Hence an axiological risk
> exists in the form of failure of getting life extension.
Yes, your friend does have a good point. I'll up the ante on him: Might *not* developing the means for self-directed evolution be an existential risk?
Allow me to phrase it another way: Might not having the ability to keep, into adulthood, the gills we all develop and lose in the womb pose an existential risk in a world of melting ice packs and rising sea levels? I can think of other scenarios if your friend doubts it.
> It is of course an individual risk too for all of us.
Yes, the fatality rate of death is alarming. ;-)
"The state that separates its scholars from its warriors will have its thinking done by cowards, and its fighting by fools." -Thucydides