[ExI] Axiological Risk (was ? Risk:)
Anders Sandberg
anders at aleph.se
Sat Mar 10 16:10:47 UTC 2012
Good point made by a colleague:
Might *not* ever developing life extension be an axiological risk?
If the value of the benefits of life extension is vast, then failing to
realize this possibility (assuming, uncontroversially here at least, that it
is possible) would mean mankind never reaches its full potential. Hence there
exists an axiological risk in the form of failing to achieve life extension.
It is of course an individual risk too, for all of us.
--
Anders Sandberg,
Future of Humanity Institute
Philosophy Faculty of Oxford University