[extropy-chat] Prime Directive
Jef Allbright
jef at jefallbright.net
Sat Oct 28 01:17:59 UTC 2006
Slawomir wrote:
J:
> "the absurdity of choosing death to avoid all suffering."
S:
> I still fail to see the absurdity here. If *all* humans were
> to suffer unspeakable and endless torture, what would be wrong
> with avoiding that suffering?
I thought it was already clear that the scenario was about a single
individual making choices relative to his personal pleasure versus
suffering. Would you, Slawomir, choose to die with the intention that
it would effectively end all your suffering? Wouldn't that be absurd?
Consider that there is always a time difference between the decision
(now) and the reward (future). So you must be making your decision
based on how much you *value* the expected outcome (experience or
otherwise).
As occasionally happens, a grenade falls among a squad of soldiers and one
of them uses his body to shield the others. Do you really want us to
believe that he chose that action in order to receive the benefit of
feeling good about his heroism? Or might it explain more to say that he
acted based on his values?
J:
> >> More importantly, the above implies that promotion of values
> >> is somehow more important than ability to experience them. It
I'm saying that rational choice is about promoting one's values into the
future. How the expected outcomes will be experienced, and by whom, is
related, but not primary.
S:
> >> suggests that values could exist in absence of experience,
> >> that is, they could still exist even with all the humans in
> >> the world wiped out.
Sometimes I think you make statements such as the above just for the
immediate pleasure you experience in making them, but with little regard
for their potential value. ;)
- Jef