[extropy-chat] The most "dangerous" idea

Emlyn emlynoregan at gmail.com
Thu Sep 2 06:43:29 UTC 2004


He seems to posit that our liberal democracies are based on notions of
equality, which in turn are based on physical world equality. He
thinks that if we lose our physical equality, we will lose everything
else dear to us. He calls this equality "human nature".

Our counterargument, I guess, is to push the position that sentient
rights (he equates human nature and human rights) are not based on the
physical equality of fleshy individuals, but on something more useful,
for instance Peter Singer's concept of granting sentient rights to
anything that can suffer (I'm not suggesting we adopt that measure,
but we need something).

Really, the idea that we need to be the same to be treated with equal
consideration, and to treat each other with equal consideration, is a
very poor basis for a system of ethics. For a start, it's never been
true; we are all different, with different innate abilities, strengths
& weaknesses.

We can frame transhumanism as a celebration and elevation of humans as
a set of diverse individuals, with a common point that we all need
equal consideration of our interests. In this light, it's an ethical
advance. I actually like Singer's metric of the ability to suffer as
the determinant of membership in the set of Sentients, but it needs to
be tempered by allowing for differing types of rights/responsibilities
depending on the ability to think. How to do that would require
careful analysis, but clearly, for instance, you can't give a dog
equal consideration with a human in choosing elected representatives,
any more than you can require that a dog find a job and support itself
in a capitalist society. You could give such rights and
responsibilities to an uplifted dog, of course.

I sense in Fukuyama's writing an unwillingness to imagine how a
transhuman/posthuman culture could work, and so instead he chooses to
write it off as obviously bad (e.g., he says that Brave New World is
basically what we are doomed to, yes really). I wonder whether it is
because he feels he has a great intellectual investment in knowledge
of the history of humanity, and that this would be rendered useless in
a setting that cast aside the great axioms of history, such as the
idea that humans are all essentially the same, regardless of colour,
creed, and so on.



On Wed, 1 Sep 2004 23:24:44 -0700 (PDT), Adrian Tymes
<wingcat at pacbell.net> wrote:
> If I understand correctly, this guy was saying that
> transhumanism would, if embraced, be the most
> dangerous
> idea because it would necessarily lead to the most
> extreme "us vs. them" humanity has seen to date?
> 
> I would say the most dangerous idea, if embraced, is
> that there needs to be an "us vs. them".  Look what
> has happened when it has been embraced.  Granted, it
> is sometimes inevitable, such as when you have
> different religious sects whose holiest beliefs
> require conversion or genocide of all non-believers.
> But looking to violence or other antagonism as
> always the first resort causes a lot of unneeded
> grief.


-- 
Emlyn

http://emlynoregan.com   * blogs * music * software *
