[extropy-chat] Serenity: "...make people better."
pgptag at gmail.com
Sat Oct 15 07:57:04 UTC 2005
I think Russell is correct in saying that some people hate
transhumanism because they fear that "humans will be exterminated by
posthumans". But there is a very deep misunderstanding here. Example:
I think in the developed world there must still be a few percent of
completely illiterate people, but suppose one day the NYT publishes an
article titled "There are no illiterate Americans left - everybody in
the US can read and write". Of course, every sensible person would
think this is good news.
Now take the alternative formulation "Illiterate Americans have been
exterminated by those who can read and write". Sounds sort of wrong,
and indeed it is semantically wrong because you cannot use the word
"exterminate" in this context.
To make things even clearer think of describing the impact of a cure
for HIV with the words "HIV sufferers will be exterminated in 5
years". This would be a very unfortunate choice of words indeed, even
if the author simply means that everybody with HIV will have been
cured in 5 years. The wording suggests mass execution instead of a cure.
Now let's examine "humans will be exterminated by posthumans". The
sentence is semantically meaningful if we take "humans" and
"posthumans" as two separate species defined by different biologies,
but not if posthumans means voluntarily enhanced humans with
non-inheritable or genetically inheritable mods. And I don't think
anyone, at least on this list, would seriously consider any forced
enhancement concept. More likely, competitive pressure would incentivize
people to consider enhancement, and market dynamics together with public
aid programs would make enhancement affordable.
So, as usually happens, the problem loops back to the words used to
state it. My proposal, which I have made many times on this and other
lists, is to stop using the words "transhuman" and "posthuman". The
name of our species is "humans". Humans is what we were thousands of
years ago, humans is what we are now, humans is what we will be as
members of a galactic civilization radically enhanced with respect to
our current
biology, and humans is what we will be as AI-enhanced uploads running
on a computronium substrate.
On 10/13/05, Russell Wallace <russell.wallace at gmail.com> wrote:
> On 10/13/05, Adrian Tymes <wingcat at pacbell.net> wrote:
> > I have noticed, in conversations with luddish people, that the true
> > concern is not actually over "better", but over "make".
> If transhumanists came across as saying "we want technology to improve
> ourselves, if other people want to use it too that's fine with us, it's a
> matter of individual choice", most people's reaction wouldn't be worse than
> "eh, those guys are weird".
> The reason people like Fukuyama regard transhumanism as something to be
> feared and hated is that they (not without reason) perceive transhumanists
> as saying "we want technology to improve ourselves, those who join us will
> be saved, all others will be forcibly converted or exterminated like the
> inferior creatures you are!"
> - Russell