[extropy-chat] The most "dangerous" idea

Adrian Tymes wingcat at pacbell.net
Fri Sep 3 21:08:26 UTC 2004


--- Samantha Atkins <samantha at objectent.com> wrote:
> On Sep 2, 2004, at 2:08 PM, Adrian Tymes wrote:
> > --- Samantha Atkins <samantha at objectent.com>
> wrote:
> >> Us vs Them develops when there are such differences
> >> between the groups that there is little mutual
> >> understanding and when there are conflicting needs
> >> or agendas.  One such conflict is when one or both
> >> groups sees the other as a threat that must be
> >> eliminated.
> >
> > *nods*  Not saying this won't happen.  Just that it
> > need not always happen - like, say, by trying to
> > develop mutual understanding or find ways to make
> > the needs/agendas not conflict, before resorting to
> > destroying the other group.
> 
> What if one group develops and willingly uses such
> enhancements that the group without them is no more
> capable of understanding them than an ant is capable
> of understanding a human?

Depends on the exact level of understanding you mean:

The group without can, at least, trace the history of
how the other group came to be, and begin to
understand them in that way.

- or -

It is already the case that many humans do not begin
to truly understand others they interact with.  For
instance, what Luddite could be said to truly
understand any of us today?  This has been the case
for quite a long time now, and while there has been
harm inflicted (most of which was eventually
repaired), a World War III between the technological
haves and the technological rejectors has not come to
pass on any large scale yet (the Gulf Wars being a far
cry from "large scale" in this context).

> >> In transhumanist terms those without various
> >> levels of augmentation/life extension/intelligence
> >> enhancement and so on will consider those who do
> >> have it as deadly, vastly more competitive threats
> >> UNLESS the technology is either available to all or
> >> seen as seriously improving everyone's lives and
> >> well-being at a very concrete level.
> >
> > And aren't the enhancements we're trying to get
> > developed aimed at improving the lives of those who
> > have them?  So, make sure that everyone can have
> > them, and that their benefits are easily seen by
> > everyone (which shouldn't be too hard, once there
> > are real examples to point to), and there we go.
> 
> Yes, but not everyone will choose the enhancements,
> nor is it clear that all enhancements can be produced
> cheaply and in quantity sufficient for everyone who
> does want them to have them.  People are not
> completely rational (big surprise).  If we can
> quickly get to a more abundant world where
> comparative super-persons do not threaten one's very
> survival directly, and if the "haves" consider the
> "have-nots" as potential "haves" who just need a bit
> of time and space, or as simply an example of
> worthwhile and valued diversity, then it should have
> a relatively happy outcome.

And that is what we - the transhumanists who are
reviled as "dangerous" - are in fact working towards,
no?
