[extropy-chat] Doomsday argument

Russell Wallace russell.wallace at gmail.com
Fri Oct 13 16:04:11 UTC 2006


On 10/13/06, Robert Bradbury <robert.bradbury at gmail.com> wrote:

> I believe that the "probability shift" is due to the fact that if there
> are many extraterrestrial civilizations then the cubicles are all "full".
>

I still don't get it - if there are many civilizations (which there are, per
Tegmark et al.), then there are many roomfuls of cubicles; that says nothing
about the percentage of rooms with many versus few cubicles occupied.
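
To make that concrete, here's a quick Monte Carlo sketch in Python. The
thread doesn't spell out the cubicle setup, so I'm assuming the standard
incubator version (a fair coin per room: heads fills 1 cubicle, tails fills
100); the numbers are illustrative, not from the thread.

import random

ROOMS = 100000  # many rooms = many civilizations

# Assumed setup: per room, heads -> 1 occupied cubicle, tails -> 100.
occupied = [1 if random.random() < 0.5 else 100 for _ in range(ROOMS)]

# Fraction of rooms that came up "full" (tails): ~0.5, and adding more
# rooms doesn't move it.
full_rooms = sum(1 for n in occupied if n == 100) / ROOMS

# Fraction of observers who wake up in a full room: ~100/101 ~= 0.99,
# simply because full rooms contain 100x more observers.
observers_in_full = sum(n for n in occupied if n == 100) / sum(occupied)

print(f"fraction of rooms that are full:    {full_rooms:.3f}")
print(f"fraction of observers in full rooms: {observers_in_full:.3f}")

Multiplying the number of rooms changes neither ratio; any "probability
shift" comes from which ratio you condition on, not from how many rooms
exist.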

> The argument has lots of problems, not least: (1) since the probability of
> a transition from humankind to posthumankind (with a variety of AIs, IAs,
> and normo-humans present simultaneously) is significantly greater than
> zero, the definition of "doomsday" is extremely soft; (2) doomsday could
> occur for all the cubicles if all of the protons decay (but that is in the
> very far future); (3) a significant fraction of the possible doomsdays may
> already be behind us (given the relative abundance of solar systems which
> probably could not support life, and the number of mass extinctions already
> behind us on this planet, there may not be many coin tosses left [1]).
>

Indeed so. The primary flaw in the doomsday argument itself, in my opinion,
is that it fudges the reference class; essentially it assigns a prior
probability to the proposition "I am me" - but the probability of that is
necessarily 1. ("I" refers to the mind doing the contemplating, said mind
being a product of history in a particular place and time. It makes no sense
to say things like "I could have been Julius Caesar", but that's the sort of
alternative the doomsday argument requires us to entertain.)
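
For reference, here is the Bayesian shift being objected to, sketched in
Python. The birth rank (~60 billion humans born so far) and the two
candidate total populations are my illustrative assumptions, not figures
from the thread; the uniform-rank likelihood is precisely the "I could have
been someone else" step.

# Sketch of the standard Carter-Leslie doomsday calculation.
rank = 60e9                      # your birth rank among all humans ever

hypotheses = {
    "doom soon": 200e9,          # total humans ever, if we die out early
    "doom late": 200e12,         # total humans ever, if we spread out
}
prior = {"doom soon": 0.5, "doom late": 0.5}

# SSA likelihood: treat your rank as a uniform draw from 1..N. This is
# the step that presupposes "I" could have been someone else.
likelihood = {h: (1.0 / N if rank <= N else 0.0)
              for h, N in hypotheses.items()}

evidence = sum(prior[h] * likelihood[h] for h in hypotheses)
posterior = {h: prior[h] * likelihood[h] / evidence for h in hypotheses}

for h, p in posterior.items():
    print(f"{h}: {p:.4f}")       # doom soon ~0.999, doom late ~0.001

A thousandfold likelihood ratio swamps any reasonable prior, which is why
the reference-class step carries all the weight.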

> 1. A recent glance at Robin's pages suggests that he may be working on a
> paper discussing this fact.  This derives from the variety of reasons
> suggesting that it would be really difficult to eliminate humanity or its
> knowledge base at this point.  (How would one take out *all* of the
> libraries, all of the search engine server farms, etc.?)
>

Unfortunately I can think of a rather easy way. The problem is that the
K-selecting environment in which we evolved was the opposite of today's
situation, so our brains are programmed to believe the opposite of the truth
in such matters. When people take the possibility of doomsday seriously,
they start talking about averting "nanowar", "bioterrorism" etc., whereas
real-life death and extinction have completely different causes, and
nanotech and biotech are the only things that have a prayer of protecting us
from them. The upshot is that the more people believe in the possibility of
doomsday and take actions they believe will avert it, the greater the
obstacles to continued technological progress, and the greater the
probability that humanity will actually die out. Server farms do no good if
the information on them isn't used.

In my more pessimistic moments I sometimes imagine that every race smart
enough to develop nanotechnology must be smart enough to first discover the
doomsday argument and thereby snuff itself.