[extropy-chat] TMS
Samantha Atkins
sjatkins at mac.com
Wed Jan 19 09:00:58 UTC 2005
On Jan 18, 2005, at 1:26 PM, Adrian Tymes wrote:
>
>> Question: If you had a friend about to commit
>> suicide and you have
>> exhausted all means of persuasion, are you justified
>> in stopping them,
>> against their will, from taking their life? Are
>> you justified if
>> you know that later they will sincerely thank you
>> if you successfully
>> intervene?
>>
>> Not an easy question to answer, is it? Or is it?
>
> In any real such situation, the answer would seem to
> depend on the situation. I.e., in various cases that
> comply with what has been described, either "yes" or
> "no" could be justified as an answer depending on the
> rest of the case.
>
Yes, exactly.
>> Now suppose that it wasn't a friend about to commit
>> suicide but
>> humanity itself willfully headed for almost certain
>> destruction? If
>> you thought you could do something, even if against
>> what all the world
>> said it wanted, even against your own principles of
>> the boundaries
>> ruled by respect for the free will of others, would
>> you?
>
> If I ever came across such a situation, I would
> seriously re-examine my data. In all such situations
> that I have heard about (real situations, anyway,
> where someone perceived this to be the case; ignoring
> fiction like The Matrix), the reality turned out to be
> other than what it appeared to be. Ergo, if I
> perceived this, I would have strong historical
> evidence that I was misperceiving things.
>
And if your data holds? On our present course, especially if various
anti-technology forces prevail, humanity will most likely perish. I
think a lot of us see this. The question then is what level of means
is justified if humanity itself is at stake. If we continually resist
concluding that those are the stakes, we won't make so many errors or
go off tilting at windmills. On the other hand, if the danger is
indeed that deep, we had best at least admit to it even if we haven't
much of a clue as to how to make survival much more likely.
> Example: I'm currently working on a nonpolluting power
> source that, in the unlikely event that things work
> out as best as they possibly can, could make oil-based
> power obsolete overnight. Some people could convince
> themselves that humanity wants to destroy itself
> through environmental disaster brought on by excessive
> use of fossil fuels. That perception would create the
> problem you state above.
Cool work! It is not that humanity "wants to" destroy itself so much
as that it is simply locked into patterns that are very likely to lead
to its destruction. Whether it "wants to" or not isn't really
relevant. The way we largely killed off nuclear power, haven't
developed other alternatives sufficiently, and are quite wasteful in
how we use fossil fuels and energy, combined with the eventual arrival
of Peak Oil and the decline of oil production, says that we are
augering in on energy; the only real disagreement is over how fast we
are doing so. By itself, the failure to fully build out some
alternative to fossil fuels could bring our civilizations to ruin. At
the least, the wars and deprivations and their offshoots would be
likely to. Of course our energy habits are just one of many patterns
that could be pointed out as quite detrimental.
>
> However, it is not actually the case that people want
> to destroy the world. They want the power, but if an
> alternative can be developed that grants that power
> without damaging the Earth, and all other factors are
> similar enough, it would likely be quickly accepted
> precisely because it does not damage the Earth.
>
Again, it does not matter what they want. If they do not have
sufficient understanding of the consequences, actual costs, and
alternatives, and the means (including sufficient freedom from
lockstep with other factors), then the danger is heightened. It will
be a very bumpy ride even in the best scenarios. We not only want
energy; we must have it to continue, much less move forward. A great
deal of the problem lies in all those other factors.
> There are certain moral questions where the answer is
> neither "yes" nor "no" nor even "depends", but rather
> something like "error: situation does not exist; if
> apparently encountered, attempting to resolve the
> situation would not be the correct action to take".
Error: Situation does exist, in that we are headed for destruction.
If we do not resolve the situation, then humanity is doomed. A meme
that says that if we ever come to this conclusion we must prove our
sanity by disowning the conclusion as mistaken will certainly get us
killed.
- samantha