[extropy-chat] Singularity Blues

Adrian Tymes wingcat at pacbell.net
Wed Apr 6 19:18:52 UTC 2005

--- Edmund Schaefer <edmund.schaefer at gmail.com> wrote:
> Giving everyone advanced nanotechnology is about as safe as giving
> everyone nuclear strike capabilities. Less safe, really. Why is equal
> distribution of advanced technology desirable and how do you plan to
> make it safe?

Concentrating power in a few hands means those hands are unlikely to
come up with ways for anyone to defend against their power - so when,
inevitably, the power leaks into hands who use it for ill, stopping the
ill is difficult.

Spreading power into many hands means many people have an interest in
checking that power, at least when it would be used against them.  The
defenses they build usually work just as well against open malcontents
as against those who ostensibly work for trusted agents but in fact
abuse their power.

There are many, many examples of this throughout history; look them up
if you want.  Nuclear weapons are a good example of the former - and of
the latter, once the nuclear powers started seriously believing that
rogue states might soon have nuclear weapons of their own (even if the
actual defense efforts have been inept, though it may be no coincidence
that reports of rogue nuclear states have since been cast into doubt).
SIs are a form of power, and this seems to be a natural dynamic that
even SIs - at least early-stage SIs, early enough to affect the
development and formation of later-stage SIs - would not themselves
circumvent.

> Why is domination by multiple superintelligences any better? What
> specific problems would this fix?

It allows for the approach of making SIs out of many people who exist
today, thus removing the threat factor: it is one thing if someone else
will take over; it is another if you yourself can share in the power.
This tactic cuts down on opposition to developing these technologies.

It also alleviates concerns about how the SIs will treat humanity: SIs
who were once human are more likely to look kindly on those who are
still human than are SIs who never were human.

Others can likely list more benefits to this approach.

I am far from the first person to cite either of these.  I doubt I will
be the last.  But, honestly, this has been hashed and rehashed enough
that I'm a bit surprised the question still has to be asked, at least
on this list.  Please reread past archives of this debate before
continuing this thread, and only continue this thread if you have
something new to add.  (I don't just mean Edmund, but everyone.)
