[extropy-chat] Can Suffering and Death be Quantified? (was Are ancestor simulations immoral?)

Lee Corbin lcorbin at tsoft.com
Fri May 26 00:33:25 UTC 2006


In his post of May 25, 1:02 PM, Jeffrey makes quite a number
of points that I agree with. I'm skipping those  :-)
and going straight to the most interesting (i.e.,
controversial and provocative!) ones.

> Lee:
> > "As an analogy, suppose that all laws
> > against the mistreatment of animals were repealed tomorrow;
> > would millions of people in Western nations immediately rush
> > to the kennels and animal shelters to procure victims for
> > torture?"

Jeffrey responds

> No, I don't think that millions would. But I'm confident that
> some would.

And I infer from what comes next that this is just as big a crime
to you as if *many* did.

> I'm going to propose what is perhaps a strange philosophical 
> viewpoint that I happen to hold (and it's difficult for me to 
> convey). Consider that today, the entire universe and all the 
> good and bad things that it includes can only be separately 
> represented in each of our separate minds.

Of course, such a representation comes with a *great* loss of
detail. For example, I hold Russia and its millions in my mind,
and I can even rattle off a great number of cities in that
far-away land. But naturally, there is *some* loss of fidelity  :-)
And it would be *vastly* preferable for my mental model of
Russia to come to some harm, e.g., my imagining Russia being
totally destroyed by a large meteorite, than for the actual
high-fidelity real version to undergo catastrophe.

> Each of us has only one mind and one reality to experience.
> In other words, the  value I place on the whole of humanity
> (which is high) is restricted to my mind and my mind alone.

You're right. This is a strange philosophical viewpoint!

> So when viewed in this way, a *single* human life (real or
> simulated) is equally valuable as the sum of *all* human
> lives put together.

Isn't that rather, ahem, nuts?  Sorry, but how in the world
can you *not* regret the loss of millions about a million times
more than you regret the loss of one?  I don't believe it.
I don't really think that you suppose Mao's "Great Leap Forward"
to have been only as harmful as the death of a single pedestrian
in Canton last year.

> This is partially why I find it completely repulsive to
> allow the torture or murder of even *one* conscious being,
> regardless of whether they are "real" or "simulated".

We agree that "real" vs. "simulated" makes no difference. But
just to get to the bottom of this, let me ask you a couple of
questions:

(1) An alien shows up who has technology vastly, vastly 
beyond ours. He promises us that he will stop all poverty,
and war, and traffic accidents, and cancer suffering and
death, and all other medical suffering and death, for a year,
provided that at the end of that year we offer up to him a human
sacrifice. The poor human (whom we select at random from our
6 billion) will be tortured by him to roughly the same extent
that a cancer patient suffers before death.

Do you think that we should take him up on his offer?

(2) What if a new drug could be developed that would save many
thousands of lives, but, because radioactivity is involved,
it is estimated that a few hundred random people around the
world would die?

Think we should allow the development of this drug?

Best wishes,
Lee
