[ExI] Risks: Global Catastrophic, Extinction, Existential and ...
natasha at natasha.cc
Thu Mar 8 18:15:01 UTC 2012
The idea of humans dealing with risks of all sorts, most often
catastrophic risks such as an asteroid hitting the earth, a nuclear war,
or a world-threatening virus, has had theorists thinking for eons
about the future. More recently, we are concerned with global warming,
bio-warfare, runaway nanoassemblers, and super AI/AGI entities. Most of
these risks lie outside the personal issue of radical life extension.
If we consider radical human life extension, what type of risk
might there be? (Extinction risk is obvious, but I'm wondering if
extinction risk is more relevant to a species rather than a person.)
So, I started thinking about the elements of a person that keep
him/her alive: foresight, insight, intelligence, creativity,
willingness to change, etc. I also thought about what might keep a
person from continuing to exist: depression/sadness. Then I
thought about what someone else might do to keep me from
existing: inflicting his/her values/beliefs onto my sphere of
existence in a way that would endanger my right to live. I arrived
back at morphological freedom, as understood by More on one hand and
Sandberg on the other, which pertains to a negative right -- a right
to exist and a right not to be coerced to exist. But again,
morphological freedom is a freedom, and it does not answer the
question of what a risk to a person's choice/right to live/exist
might be.
It may simply be a matter of discrimination against a right not to die.
Does anyone have thoughts on this?
Thanks,
Natasha