[extropy-chat] Avoid Too Much Change.

Russell Wallace russell.wallace at gmail.com
Wed Apr 11 15:50:07 UTC 2007


On 4/10/07, Randall Randall <randall at randallsquared.com> wrote:
>
> Really?  I'd love to hear your reasons for thinking
> that this is unlikely or not worth considering.  While
> I disagree with John about personal identity, I do
> agree that selection will favor those who agree with
> him that process survival is unimportant.  This puts
> me in an awkward position, as I'm sure you understand.


Selection? Look at the statistics: selection favors those who eschew this
geek stuff completely. We're programmed to believe that personal power
confers selective advantage, because it did in the conditions under which we
evolved; the programming keeps us believing it, but it's no longer true.

As for why it's not worth considering: it's a story. We make up stories for
ourselves, for our own reasons. Sometimes we set them in "the future", but
when the actual future comes around, it practically never resembles our
stories; once you go beyond such predictions as "computers will be more
powerful in ten years than they are today", futurology's track record is
worse than you'd expect from random chance. As soon as someone says "the
future will be like X", it's a reasonably safe bet that whatever the future
actually ends up like, it won't be X.

In this case it's not even a particularly plausible story: if you get "IQ
12000" (scare quotes because the phrase doesn't actually mean anything; IQ
isn't meaningfully defined much past 200 or so), are you going to go berserk
and start massacring everyone? (That, after all, is what the elimination of
other viewpoints on a timescale as short as a century implies.) Are you even
going to tolerate such behavior in others? Even if you are, nobody else is.
Nobody with any political power wants a handful of people a zillion times
smarter than anyone else to exist. The world isn't going to tolerate the
creation or existence of superintelligent entities unless they behave like
respectable citizens.
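
A quick back-of-the-envelope calculation makes that parenthetical concrete.
Here is a rough sketch (assuming the conventional deviation-IQ norming of
mean 100 and standard deviation 15, and a normal model of scores; the
numbers are illustrative, not authoritative):

    # How rare would a given deviation IQ be, assuming the usual
    # norming (mean 100, SD 15) and normally distributed scores?
    from scipy.stats import norm

    MEAN, SD = 100.0, 15.0
    for iq in (160, 200, 250):
        z = (iq - MEAN) / SD   # standard deviations above the mean
        tail = norm.sf(z)      # P(score > iq) under the normal model
        print(f"IQ {iq}: z = {z:+.2f}, about 1 person in {1 / tail:.3g}")

An IQ of 200 is already nearly seven standard deviations out, rarer than
one person in tens of billions; there is no population against which such a
score could be normed, never mind "IQ 12000".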

"If we have matter duplicators, will each of us be a sovereign
>   and possess a hydrogen bomb?" -- Jerry Pournelle
>

Leaving aside the lack of evidence that matter duplicators are possible,
stop and think about this for a moment: conventional manufacturing
technology is perfectly adequate to build hydrogen bombs, and has been for
decades. Why are we not each a sovereign possessing a hydrogen bomb today?
Once you look at that question, it becomes clear that the "matter
duplicators" are a smokescreen, something to aid suspension of disbelief by
distracting the mind from the real-life reasons why this scenario doesn't
happen.

For Pournelle is, after all, a storyteller: he has earned a living making up
stories, and stories are selected in the marketplace on a single fitness
criterion, namely that people enjoy reading them. This is fine, provided we
understand that this criterion bears no relation to the hypothetical fitness
criterion of correspondence to what will actually happen in real life.