[extropy-chat] Can Suffering and Death be Quantified? (was Are ancestor simulations immoral?)

A B austriaaugust at yahoo.com
Fri May 26 20:33:47 UTC 2006


Hi Lee,
   
  I completely understand why this viewpoint seems "nuts" to you. 
   
  I'll just restate it this way:
   
  I would consider it a greater *objective* tragedy if a million people died than if a single person died. I place a very high value on a single life as well as on the collection of all lives, such that if a single person is murdered or tortured or made to suffer, I find that completely unacceptable - probably just as unacceptable as if the same thing happened to a million people. Does that make any more sense?
   
  You offer up some challenging questions, Lee  :-)  I'll do my best to answer.
   
  1) I'm very sorry to you and the world, but unless that randomly selected 
           human willfully agrees to such a thing, I cannot endorse it. If I were that 
           randomly selected person, I would think very hard about it, but in the end I 
           would probably volunteer, as long as I was allowed to perish and didn't have to 
           suffer indefinitely.
   
  2) As long as the clinical trial patients agree to take that risk, then sure, go for it. 
           The current system of human clinical trials isn't too different from this - though 
           of course the patients volunteer.
   
  Best Wishes,
   
  Jeffrey Herrlich 

Lee Corbin <lcorbin at tsoft.com> wrote:
  In his post of May 25, 1:02 PM, Jeffrey makes quite a number
of points that I agree with. I'm skipping those :-)
and going straight to the most interesting (i.e., the most
controversial and provocative!)

> Lee:
> > "As an analogy, suppose that all laws
> > against the mistreatment of animals were repealed tomorrow;
> > would millions of people in Western nations immediately rush
> > to the kennels and animal shelters to procure victims for
> > torture?"

Jeffrey responds

> No, I don't think that millions would. But I'm confident that
> some would.

And I infer from what comes next that this is just as big a crime
to you as if *many* did.

> I'm going to propose what is perhaps a strange philosophical 
> viewpoint that I happen to hold (and it's difficult for me to 
> convey). Consider that today, the entire universe and all the 
> good and bad things that it includes can only be separately 
> represented in each of our separate minds.

Of course, this means with a *great* loss of detail. For
example, I hold Russia and its millions in my mind, and I
can even rattle off a great number of cities in that far-
away land. But naturally, there is *some* loss of fidelity :-)
For example, it would be *vastly* preferable for my mental
model of Russia to come to some harm (i.e., for me to imagine
Russia being totally destroyed by a large meteorite) than
for the actual high-fidelity real version to undergo
catastrophe.

> Each of us has only one mind and one reality to experience.
> In other words, the value I place on the whole of humanity
> (which is high) is restricted to my mind and my mind alone.

You're right. This is a strange philosophical viewpoint!

> So when viewed in this way, a *single* human life (real or
> simulated) is equally valuable as the sum of *all* human
> lives put together.

Isn't that rather, ahem, nuts? Sorry, but how in the world
can you *not* regret the loss of millions about a million times
more than you regret the loss of one? I don't believe it.
I don't really think that you suppose Mao's "Great Leap Forward"
to have been only as harmful as the death of a single pedestrian
in Canton last year.

> This is partially why I find it completely repulsive to
> allow the torture or murder of even *one* conscious being,
> regardless of whether they are "real" or "simulated".

We agree that "real" vs. "simulated" makes no difference. But
just to get to the bottom of this, let me ask you a couple of
questions:

(1) An alien shows up who has technology vastly, vastly
beyond ours. He promises us that he will stop all poverty,
and war, and traffic accidents, and cancer suffering and
death, and all other medical suffering and death, for a year,
provided that at the end of the year we offer up to him a
human sacrifice. The poor human (whom we select at random
from our 6 billion) will be tortured by him to an extent
very similar to the pain a cancer patient undergoes before
death.

Do you think that we should take him up on his offer?

(2) What if a new drug can be developed that will save many
thousands of lives, but, because radioactivity is involved,
it is estimated that a few hundred random people around the
world will die?

Think we should allow the development of this drug?

Best wishes,
Lee

_______________________________________________
extropy-chat mailing list
extropy-chat at lists.extropy.org
http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat


		