[ExI] What surveillance solution is best - Orwellian, David Brin's, or ...?

Jef Allbright jef at jefallbright.net
Tue Jun 26 18:12:44 UTC 2007


On 6/25/07, TheMan <mabranu at yahoo.com> wrote:

> My use of the word nightmare referred to the threat
> from the super-hi-tech weapons that may soon be
> developed and with which one or more of the thousands
> of ordinary, angry and crazy human beings that exist
> today may choose to terminate all life on this planet.

Sorry if I appear unnecessarily polemical on this point, but I
think it is important to distinguish between such an outcome, which
would indeed be nightmarish, and such a threat, which is good in that
it drives us to develop higher-level solutions.  Regardless of its
specifics and form, the (Red Queen's) race goes on.

We could use more people and organizations thinking in terms of an
increasingly intelligent global immune system.


> Of course, technology is great in many ways, but I've
> got the impression that most extropians tend to focus
> too much on the boons and underestimate the perils.
> For example, only a tiny part of Kurzweil's
> "Singularity is near" is about the perils of the
> coming technologies, the rest is about the great stuff
> these technologies can bring us.

While much of Kurzweil's activity is promotional, he thinks seriously
about risks.
For example: <http://gop.science.house.gov/hearings/full03/apr09/kurzweil.pdf>

Based on my experience over more than a decade on the extropy list, my
impression is that we tend to *overestimate* the magnitude of both
the rewards and the risks, while underestimating their subtleties.



> If you can be motivated to better or equally good
> actions by feeling only excitement and no fear, that's
> great.

For me, emotions just are; they are neither good nor bad in
themselves.  I watch them come and go, occasionally rocking my
boat, but they have no intrinsic value whatsoever.  Functionally they
serve an important role in the motivational feedback loops of evolved
biological organisms.  An artificial intelligence could operate
without "emotions", but it would still need motivational feedback
loops and it would still describe states as being relatively pleasant
or unpleasant.  Confusion arises in naïve discussions of this topic
when people conflate "negative emotions" with their negative (in the
sense of dysfunctional) side effects.

I have no hope of effectively conveying this understanding over this
limited channel, so please excuse me from defending this here.



> I'm just not sure that one will be sufficiently
> aware of the risks and take sufficient action to
> diminish them if one doesn't acknowledge the nightmare
> aspect of the global arms race that seems to be going
> to get out of control soon.

My emphasis is entirely on increasing awareness -- note the repeated
metaphor of improving our map -- but alarm and over-reaction can be
even more harmful.


> The race is a fact of nature, I agree, but it would
> proceed even if restricted, just at a slower speed.

Restrictions ultimately work only for the other guy.  Game theory
makes this easy to see: unilateral restraint is a dominated strategy
(see the sketch after the riddle below).  Very fundamentally, we are
defined by our values and can't choose to lose.

Two brothers wanted to race a course,
To see which had the slowest horse.
Since neither wanted to spur his mare,
What must they do to make it fair?
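
To make the game-theoretic point concrete, here is a minimal sketch
of the restriction race as a one-shot two-player game.  The payoff
numbers are purely illustrative assumptions of mine, not anything
from this thread; the structure is the standard prisoner's dilemma,
in which racing dominates restraint no matter what the other side
does.

# Each side chooses to RESTRICT (slow down) or RACE (develop freely).
# Payoffs are (our payoff, their payoff); the numbers are illustrative.
PAYOFFS = {
    ("RESTRICT", "RESTRICT"): (3, 3),  # mutual caution: safe but slow
    ("RESTRICT", "RACE"):     (0, 5),  # the restrained side falls behind
    ("RACE",     "RESTRICT"): (5, 0),  # the defector gains the advantage
    ("RACE",     "RACE"):     (1, 1),  # everyone races: risky for all
}

def best_response(their_move):
    """Return the move that maximizes our payoff against a given move."""
    return max(("RESTRICT", "RACE"),
               key=lambda our_move: PAYOFFS[(our_move, their_move)][0])

# RACE is the best response to either move, so a restriction only
# binds whoever is foolish enough to honor it.
for their_move in ("RESTRICT", "RACE"):
    print(their_move, "->", best_response(their_move))

With these payoffs RACE strictly dominates, which is the one-shot
version of the point; the repeated game is more hopeful, as sketched
further below.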



> Just as you point out, the acceleration of change may
> soon make it impossible for any government, or any
> other groups or individuals for that matter, to
> restrict the use of too dangerous technologies (in an
> ordinary, democratic manner, that is) before it's too
> late. So if the speed of development could be lowered,
> it would be safer, and mankind would still reach
> singularity sooner or later.

My point is that we can't achieve an inclusive agreement to slow down;
someone will defect.  We can, however, realistically strive for
increasing cooperation in this race.
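
That cooperation isn't wishful thinking, either: when the same game
is repeated, conditional strategies can sustain mutual restraint.  A
minimal sketch, again with my own illustrative payoffs, where
tit-for-tat stands in for any credible "cooperate, but punish
defection" policy:

PAYOFFS = {
    ("RESTRICT", "RESTRICT"): (3, 3),
    ("RESTRICT", "RACE"):     (0, 5),
    ("RACE",     "RESTRICT"): (5, 0),
    ("RACE",     "RACE"):     (1, 1),
}

def tit_for_tat(their_history):
    """Restrain first, then mirror the opponent's previous move."""
    return their_history[-1] if their_history else "RESTRICT"

def always_defect(their_history):
    return "RACE"

def play(strategy_a, strategy_b, rounds=100):
    """Cumulative payoffs when two strategies play repeatedly."""
    moves_a, moves_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strategy_a(moves_b)   # each side sees the other's past moves
        b = strategy_b(moves_a)
        pay_a, pay_b = PAYOFFS[(a, b)]
        score_a += pay_a
        score_b += pay_b
        moves_a.append(a)
        moves_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))      # (300, 300): sustained restraint
print(play(always_defect, always_defect))  # (100, 100): everyone loses
print(play(tit_for_tat, always_defect))    # (99, 104): defection pays once

The gap between 300 and 100 is the payoff to increasing cooperation,
and the mechanism is reciprocity rather than a central restrictor.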


> An Orwellian world
> despot, with the power to prevent everyone else from
> experimenting with new technologies, would,
> statistically, be more careful with what experiments
> he allows, than would the least careful of countless
> free and eager engineers, cults and terrorists in the
> world. The kind of society David Brin suggests might
> have a similar dampening effect on the perils of tech
> development. But in a free society that follows the
> proactionary principle without a ubiquitous
> surveillance system for watching out for dangerous use
> of new technologies, it seems to me that less careful
> (and less morally sensible) engineers will get to
> perform experiments than in the former two cases.
>
> How easy will it be for the good people in the world
> to always come up with sufficient defenses against
> nanoweapons, and other supertech, in time before a
> nanoweapon, or other supertech, terminates us all?
> Wouldn't it be easier for the mankind-loving people to
> secure an advantage over the mankind-threatening
> people if the technological development would be
> slowed down? An Orwellian system might slow it down
> and thus provide some time, so it might be the best
> alternative for mankind, even if the leaders of it
> would not protect mankind for mankind's sake but
> merely for personal profit.

Orwellian scenarios are based on somewhat obsolete early 20th-century
thinking about power and corruption, lacking our more modern (and
improving) knowledge of game theory, systems theory, technological
network effects, and much more.  Accelerating technological change
both threatens us and liberates us, with the odds of our survival
biased just slightly in our favor by our capacity for intelligent
choice.  The way to increase our odds is by increasing the
intelligence of our choices -- promoting an increasing context of
increasingly coherent values over increasing scope of consequences --
not by futilely attempting to slow down our horse in the race.  [Did
you solve the riddle?]

Snipped a lot more, but must get to work.  My boss is a real
slave-driver, and he watches literally everything I do.

- Jef






