[extropy-chat] Avoid Too Much Change.

Eugen Leitl eugen at leitl.org
Wed Apr 11 19:34:38 UTC 2007

On Wed, Apr 11, 2007 at 04:50:07PM +0100, Russell Wallace wrote:

>    Selection? Look at the statistics: selection favors those who eschew
>    this geek stuff completely. We're programmed to believe personal power

Tools are irrelevant?! Why are we exterminating the gorillas, then, and
not the gorillas us?

>    confers selective advantage, because it was true in the conditions in
>    which we evolved - but even though we still believe it because we're
>    programmed to, it's no longer true.

It doesn't matter what we believe; the great fitness function evaluates
us all regardless.
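[Editor's illustration, not part of the original post.] The point about the fitness function can be sketched with a toy model: selection acts on reproduction rates alone, with no regard for what the replicators "believe". Two hypothetical lineages start equal; the one with a slightly higher per-generation growth rate comes to dominate the population share. The rates and generation count below are arbitrary assumptions chosen for illustration.

```python
# Toy selection model: beliefs don't enter the dynamics, only growth rates do.

def population_shares(rate_a, rate_b, generations):
    """Return the final population share of lineage A after
    discrete exponential growth of both lineages."""
    a, b = 1.0, 1.0
    for _ in range(generations):
        a *= rate_a
        b *= rate_b
    return a / (a + b)

# A mere 5% per-generation advantage compounds into near-total dominance.
share_a = population_shares(rate_a=1.05, rate_b=1.00, generations=200)
print(f"Lineage A share after 200 generations: {share_a:.4f}")
```

With equal rates the shares stay at 50/50 forever; any persistent differential, however small, is eventually decisive.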

>    As for why it's not worth considering: it's a story. We make up
>    stories for ourselves for our own reasons. Sometimes we set them in
>    "the future", but when the actual future comes around, it practically
>    never resembles our stories; once you go beyond such predictions as
>    "computers will be more powerful in ten years than they are today",
>    futurology has a lower track record of success than you'd expect from
>    random chance. As soon as someone says "the future will be like X",
>    it's a reasonably safe bet that whatever the future actually ends up
>    like, it won't be X.

"Evolution will still apply in future". That's completely reasonable,
and a powerful source of constraints.

>    In this case it's not even a particularly plausible story: if you get
>    "IQ 12000" (scare quotes because the phrase doesn't actually mean
>    anything, IQ isn't defined much past 200 or so), are you going to go
>    berserk and start massacring everyone? (That, after all, is what the

A diverse population of postbiological beings could very well be terminal
to conventional ecosystems. Pretending it never could be is poor risk
evaluation, given the magnitude of the outcome.

>    elimination of other viewpoints in a timescale as short as a century
>    implies.) Are you even going to tolerate such behavior in others? Even

Kiloyears of subjective time can pass in an overnight of wall-clock time.

>    if you are, nobody else is. Nobody with any political power wants the
>    existence of a handful of people a zillion times smarter than anyone

Not necessarily smarter, DIVERSE and FIT.

>    else. The world isn't going to tolerate the creation or existence of
>    superintelligent entities unless they behave like respectable
>    citizens.

The world isn't a homogeneous entity.

>      "If we have matter duplicators, will each of us be a sovereign
>        and possess a hydrogen bomb?" -- Jerry Pournelle

Of course. But just having a bunch of nukes doesn't make you a sovereign
in the posthuman world.

>    Leaving aside the lack of evidence that matter duplicators are
>    possible, stop and think about this for a moment: conventional

Do you have a problem with machine-phase nanotechnology? I'm all ears. Tell
me why it wouldn't work.

>    manufacturing technology is perfectly adequate to build hydrogen
>    bombs, has been for decades. Why are we not each a sovereign

Not in your cellar.

>    possessing a hydrogen bomb today? Once you look at that question, it
>    becomes clear that the "matter duplicators" are a smokescreen,
>    something to aid suspension of disbelief by distracting the mind from
>    the real-life reasons why this scenario doesn't happen.

You could build quite a few megatons in your cellar with machine-phase.

>    For Pournelle is after all a storyteller: he has earned a living

He is a writer; we're not. Selling plausible claptrap is incompatible
with my training as a scientist.

>    making up stories, which are selected in the marketplace based on the
>    same fitness criterion: that people enjoy reading them. This is fine
>    provided we understand that it is not at all related to the
>    hypothetical fitness criterion of correspondence to what will actually
>    happen in real life.

Real life defines fitness.

Eugen* Leitl leitl http://leitl.org
ICBM: 48.07100, 11.36820 http://www.ativel.com http://postbiota.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE
