[ExI] Risks: Global Catastrophic, Extinction, Existential and ...
gsantostasi at gmail.com
Thu Mar 8 21:55:55 UTC 2012
Happy Women's Day, Natasha.
On Thu, Mar 8, 2012 at 12:15 PM, <natasha at natasha.cc> wrote:
> The idea of humans dealing with risks of all sorts, most often
> catastrophic risks such as asteroids hitting the earth, a nuclear war, or a
> world-threatening virus, has had theorists thinking for eons about the
> future. More recently, we are concerned with global warming, bio warfare,
> runaway nanoassemblers and super AI/AGI entities. Most of these risks lie
> outside the personal issue of radical life extension.
> If we consider radical human life extension, what type of risk might
> there be? (Extinction risk is obvious, but I'm wondering if extinction
> risk is more relevant to a species rather than a person.) So, I started
> thinking about the elements of a person that keep him/her alive: foresight,
> insight, intelligence, creativity, willingness to change, etc. I also
> thought about what might keep a person from continuing to exist:
> depression/sadness. Then I thought about what someone else might do to
> keep me from existing: inflicting his/her values/beliefs onto my sphere of
> existence that would endanger my right to live. I arrived
> back at morphological freedom, as understood by More on one hand and
> Sandberg on the other, which pertains to a negative right -- a right to
> exist and a right not to be coerced to exist. But again, morphological
> freedom is itself a freedom, and it does not answer the question of what
> a risk might be that reflects a person's choice/right to live/exist.
> It may be simply a matter of discrimination about a right not to die.
> Does anyone have thoughts on this?
> extropy-chat mailing list
> extropy-chat at lists.extropy.org