[extropy-chat] The Simulation Argument (was: Atheists launch inquisition...)

Hal Finney hal at finney.org
Thu Dec 2 17:09:10 UTC 2004


Giu1i0 writes:
> While I like the Simulation Argument a lot, I don't think we can make
> any probability statement.

Nick does make a probability statement: "This paper argues that at least
one of the following propositions is true: (1) the human species is
very likely to go extinct before reaching a 'posthuman' stage; (2) any
posthuman civilization is extremely unlikely to run a significant number
of simulations of their evolutionary history (or variations thereof);
(3) we are almost certainly living in a computer simulation."

There we see probabilistic phrasing: "very likely", "extremely unlikely",
"almost certainly".

> Once you demonstrate that "most minds will be in simulations rather
> than reality",  you have demonstrated that each mind is probably in a
> simulation, but I don't think we can demonstrate that.
> Perhaps posthumans will choose to use their precious computing cycles
> on other things (Nick's second possibility).

I think what you mean is that from the simulation argument we can't deduce
a probability that we are living in a simulation.  That's because the SA
is a conjunction of 3 terms and to estimate the probability of one of them
we have to estimate the probability of the others.
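Nick's paper does make the underlying ratio quantitative.  A minimal sketch in Python, assuming (as the paper does for simplicity) that each ancestor-simulation contains about as many observers as one real history; the input numbers are illustrative, not estimates:

```python
def fraction_simulated(f_p: float, n_sim: float) -> float:
    """Fraction of all human-like observers who are simulated,
    given f_p, the fraction of civilizations that reach a posthuman
    stage, and n_sim, the average number of ancestor-simulations
    each posthuman civilization runs:
        f_p * n_sim / (f_p * n_sim + 1)
    """
    return (f_p * n_sim) / (f_p * n_sim + 1)

# Even if only 1% of civilizations become posthuman and each runs
# 1000 ancestor-simulations, most observers are simulated:
print(fraction_simulated(0.01, 1000))  # 10/11, about 0.91
```

The point of the ratio is that clause 3 follows almost automatically unless clause 1 or clause 2 drives the numerator toward zero.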

The point of the argument is that in certain circles, like ours, people
have been pretty free about estimating the probability of clauses 1 and 2.
We often talk here about a future where we become posthuman.  Most of
us probably think it's pretty likely.  We also talk about running
simulations, and the implications.  Prior to the SA, not many people
here objected to the notion that a future posthuman civilization would
run simulations, in addition to its many other activities.

The SA then gains its strength by showing that we are inconsistent if we
are happy with these two assumptions but don't accept that we are probably
living in a simulation.  Now, if we find that conclusion unpalatable,
we may go back and revisit the other two options, start to nit-pick,
and come up with reasons why they may not be true.
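The inconsistency is just arithmetic.  Since at least one of the three clauses must be true, the union bound gives P(1) + P(2) + P(3) >= 1, so low estimates for clauses 1 and 2 force a high estimate for clause 3.  A sketch (the 10% figures are hypothetical, chosen only to illustrate the bound):

```python
def min_prob_simulation(p_extinct: float, p_no_sims: float) -> float:
    """Lower bound on P(clause 3): since at least one of the three
    clauses holds, P(1) + P(2) + P(3) >= 1 by the union bound,
    so P(3) >= 1 - P(1) - P(2), floored at zero."""
    return max(0.0, 1.0 - p_extinct - p_no_sims)

# Someone who puts only 10% on extinction before posthumanity and
# 10% on "posthumans won't run simulations" is committed to at
# least 80% on living in a simulation:
print(min_prob_simulation(0.1, 0.1))  # about 0.8
```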

But I don't think that is really intellectually honest.  We have no
a priori knowledge about whether we are in a simulation or not (at least,
those of us who accept the theoretical possibility that simulated minds
could exist).  An argument which starts from previously-accepted notions
(that we will probably become posthuman and some will run simulations)
and produces a conclusion about which we have no a priori knowledge
should not cause us to look for reasons to reject its premises.

It would be different if we reached a conclusion which was apparently
in disagreement with the facts.  Then we would be justified in trying
to figure out where we went wrong, whether there was a problem in the
argument or in some of the assumptions.

But mere emotional dislike of a conclusion should not cause us to
re-evaluate our assumptions.  That would mean putting emotion over
reason.  It is a non-Bayesian way of reasoning.  If we assigned posthuman
simulations a certain probability before, we shouldn't lower that
probability merely because the SA convinces us that it implies we are
living in a simulation, and that possibility feels spooky.

> No, I think the SA is one of those things that you just don't know
> about. But its value is to show how one can build a religion perfectly
> compatible with our scientific knowledge of the universe.

I don't see the SA as having anything to do with religion.  It is a
question of philosophy, of ontology, of metaphysics.  But not religion.

Hal


