The politics of transhumanism (was: Re: [extropy-chat] Re: Transhumanist short story)

Adrian Tymes wingcat at pacbell.net
Thu Aug 18 18:51:42 UTC 2005


--- Technotranscendence <neptune at superlink.net> wrote:
> How about this.  Talk about general principles that have an
> Extropian or transhumanist import.  In this vein, since some have
> expressed a desire to separate Extropianism from libertarianism,
> what non-libertarian (or anti-libertarian) policies would be
> compatible with Extropianism?

The essence of libertarianism is personal choice.  The only major
critique of extropianism I've heard that I have not been able to
completely dismiss (unlike most luddite arguments, such as "tampering
with nature" or "current technology's defects mean the technology can
never become safe") comes from considering what happens if we
succeed...and the anti-libertarian consequences that follow.

Let's say we successfully convince a significant portion of the
public (in most of the industrialized world) that human enhancement
is worth pursuing - a portion large enough that the enhancements are
developed and put into widespread use.  While they are first being
deployed, there is always a choice as to whether to try them; they
are never forced (with possible exceptions for life-saving
technologies, and then only if the person wants their life to be
saved - see current debates over that very situation, so this is not
a new issue we would be introducing).  But if the enhancements do
prove safe and effective, more and more people will use them...and
those who don't will be placed at a disadvantage.

Consider literacy.  No human is born knowing how to read, and indeed,
for the majority of Homo sapiens' existence most people got by just
fine without reading.  These days, illiteracy in otherwise-functional
adults is a condition in need of correction; an illiterate adult is
so nonfunctional that society makes little allowance for the
willfully illiterate.  That is: even if someone does not wish to
learn how to read, learning to do so is still forced upon the
individual.  This is anti-libertarian by definition, but entirely
compatible with an extropian future.

Now, consider if the classic cyberpunk concept of a "neural jack" - a
mind-machine interface socket implanted into a person, allowing direct
mental connection with any computer plugged into the socket - becomes
widespread.  Given the ease of use compared to keyboard/mouse/terminal
interfaces, much of the public infrastructure that uses computers (an
ever-growing fraction) would likely be converted to neural jacks
quickly (consider how fast the Internet spread once the Web was
invented) - and some fraction of the conversions would likely be
neural-jack-only connections (especially for hard-to-access machines,
or for highly cost-conscious installations where the extra cost of a
traditional interface would be significant).  Anyone without a neural
jack would be utterly excluded from this latter category.  If neural
jacks are adopted widely enough, some businesses and government
agencies might stop accommodating the remainder, so those without
the implant would be forced to get one to participate fully in
society.

The neural jack makes a good concrete example, but similar scenarios
can be spun for many other proposed enhancements.  The crux is this:
widespread adoption raises most people's expectations of the human
norm, so anyone clinging to the previous, natural norm is forced to
upgrade in order to hold a job, be accepted by most people, et
cetera.  Perhaps this is not literally physical force, but it is
undoubtedly coercive.

My answer to this, so far: for that situation to arise, almost
everyone would have to agree that the enhancement really is an
improvement.  Given such near-universal agreement - and especially
given that the agreement would have to be reached before this
compulsive effect could take hold (there is no really effective way
of sneaking around it: any attempt to manufacture the compulsion
without the agreement would provoke a backlash that would render the
artificial compulsion ineffective) - it seems like this might be
something it is morally okay to force on the remainder.  That's not
the most satisfactory answer, but it has held up so far.


