[extropy-chat] Article: "Laughter at people's fears"

Hal Finney hal at finney.org
Fri Dec 24 20:00:35 UTC 2004


Harvey Newstrom writes:
> Sadly, I've seen such attitude in transhumanists frequently.  Many of 
> us get so caught up in ideas that we lose touch with reality.
>
> Kill everybody and scan their dead brains into a computer.  Destroy the 
> earth for spare parts.  Disassemble the sun because we don't need it 
> anymore.  Give robot big brothers power of humanity.  Release untested 
> nanobots and viruses into the environment because it is too costly to 
> wait.  Pollute all we want because we'll fix it later.  Don't send a 
> probe to Pluto because it will be disassembled before we get there.  We 
> are living in a computer simulation.  We can commit suicide and appear 
> in a parallel universe.  We can kill people as long as we copy them 
> first.  Mortals are only temporary and aren't as important as us 
> immortals.   Plant false warning labels on natural foods.  Plant false 
> verses in the Koran to sabotage some religions.  Release genetically 
> modified fish into the environment as a publicity stunt.  Upload people 
> against their will for their own good.  Don't allow accurate labeling 
> of genetically modified foods because the general public is too stupid 
> to make an informed decision.  Let's engineer memes to fool the 
> "proles" into proper modes of thinking.

Your list conflates several different kinds of issues and produces some
misleading results.  Imagine a list of great evils of the world: war,
child molestation, homosexuality, and terrorism.  Obviously one item
has been slipped in there and doesn't belong.  The other three do not
involve meaningful consent, and that distinguishes them.

In the same way, I think we should distinguish items on your list which
are immoral because they involve manipulating people or doing things to
them without their consent, from items which don't have this property but
which you personally don't like.  You may have unintentionally slipped
some of your personal dislikes in among a list of immoral actions.  Here
is how I would categorize them:


Immoral actions because of lack of consent:

Kill everybody and scan their dead brains into a computer.
We can kill people as long as we copy them first.
Mortals are only temporary and aren't as important as us immortals. 
Upload people against their will for their own good.
Release untested nanobots and viruses into the environment because it
is too costly to wait.
Pollute all we want because we'll fix it later.
Release genetically modified fish into the environment as a publicity stunt.


Immoral actions because of manipulation:

Plant false warning labels on natural foods.
Plant false verses in the Koran to sabotage some religions.
Let's engineer memes to fool the "proles" into proper modes of thinking.
Don't allow accurate labeling of genetically modified foods because the
general public is too stupid to make an informed decision.


Actions of questionable morality, depending on whether consent was
obtained:

Destroy the earth for spare parts.
Disassemble the sun because we don't need it anymore.
Give robot big brothers power of humanity.


Beliefs without issues of consent:

Don't send a probe to Pluto because it will be disassembled before we get there.
We are living in a computer simulation.
We can commit suicide and appear in a parallel universe.


A few comments on certain issues.  "Pollute all we want because we'll
fix it later."  I evaluated this in the context of someone polluting
and causing immediate harm to others without their consent.  If we
were talking about society deciding that it made more economic sense to
pollute more today and fix it using the greater wealth of the future,
I'd say that is acceptable reasoning for society to use.  "Don't allow
accurate labeling of genetically modified foods because the general public
is too stupid to make an informed decision."  I don't agree with the
reasoning in this statement; it is an immoral attempt at manipulation.
I also view it as immoral to forbid people from labeling their foods.
However, I would say it is moral not to force people to label their foods
(which is different from forbidding them to label them).

"Give robot big brothers power of humanity."  I think what you mean is
creating very powerful robots and turning power over humanity over to
them.  I would agree that this is immoral without some social process
to elicit consensus.  The same goes for enormous solar-system engineering
projects that would have a major impact on the people living here.

As for the last three, they are beliefs about the nature of reality and
are not moral questions.

Hal


