[extropy-chat] The world as a Sim ? Irrelevant

Eliezer S. Yudkowsky sentience at pobox.com
Thu Nov 6 05:33:24 UTC 2003


Brett Paatsch wrote:

> Eliezer S. Yudkowsky wrote:
> 
>>Brett Paatsch wrote:
>>
>>>Eliezer S. Yudkowsky writes:
>>>
>>>>This world could easily be a computer simulation, but if so
>>>>it is a simulation of a world without ESP.
>>>
>>>I reckon (without believing) that it is NOT a sim and could 
>>>NOT easily be a sim. 
>>
>>I used to have a very strong intuition that the world was not a
>>sim. Eventually it went away, and in retrospect I think it was 
>>based on nothing but wishful thinking.
> 
> Even in retrospect I wonder why such a notion would just go 
> away? Could it be that the notion was not important to you?

No, (a) I realized it was wishful thinking, (b) I lost some of the 
premises with which I had been rationalizing the wishful thinking, (c) I 
understood how to handle the possibilities I was thinking about, rather 
than flinching away from them.

>>As Lee Corbin pointed out with respect to the Tegmark
>>bubble duplication, there is no *particular* you, only the
>>*set* of yous.  It seems to me likely, on my current model,
>>that at least some of you are living in a computer simulation.
> 
> [Sorry, some ambiguity in that - hard to process]
> 
> You necessarily see the world from your own standpoint, Eliezer,
> and from that standpoint the words "me" and "you" tag different
> referents than the same words do in other persons' standpoints.
> I am not sure if you mean "you" as a self-reference also (as in
> "one") or as other-reference only. 

Eh?  I mean that if there's another Eliezer 10^10^29 meters away from me, 
then there's no way to say that I'm over here and he's over there; 
"Eliezer" is the measure of Eliezers wherever they are.  At least that's 
my current guess.

>> The question is whether almost all of you are 
>>simulated, or almost all of you are real.
> 
> I know I'm real.  Do you know you are real? 

Gah, sorry.  Talk about the relative measure in simulated versus 
permanently nontamperable processes, then.

>>>I think that to entertain the notion that it is, is to run the risk
>>>of repeating (now dead) Pascal's wager and betting the same
>>>wrong way: of betting that there is a supernatural being that
>>>is going to come to the rescue like Santa Claus. I prefer to
>>>work with friends and the resources that I have, rather than
>>>fret over the resources I don't have. I am not malicious and 
>>>I will deal with maliciousness, if I must, with resolution and
>>>ferocity of my own.  
>>
>>That some people may be inclined to abuse the simulation
>>hypothesis in predictable ways does not bear on the simulation
>>hypothesis's *actual truth or falsity*.
> 
> True. But really who cares?

If there's anything I need to do about the simulation possibility, in my 
professional capacity, then I care.

>>>I can see no advantage, or change in my habits that I would
>>>make, if I thought the world was a sim. I would probably have
>>>more allies and friends if fewer people entertained the sim 
>>>hypothesis so much, but so be it.   
>>
>>Okay, so most actions are the same on the simulation hypothesis.
>>Again this does not bear on the simulation hypothesis's actual truth
>>or falsity.
> 
> Agreed.
> 
>>>I think that if no-one can come up with a reliable test (group
>>>test: science, or solitary test: pure reason) to see if the world
>>>is a sim, then it would be better to step around what looks like 
>>>an iteration of the same old mindfuk, to put away childish 
>>>things, and to concentrate minds and energies on the tasks 
>>>at hand. 
>>
>>If you can't come up with any experimental observation that
>>differs on the world being a sim, then you don't need to know
>>how much of your measure lies in sims, because it won't make
>>any difference to subjective probabilities. 
> 
> Agreed.
> 
>>The problem is in dealing with questions like "Would my measure
>>sharply decrease after a Singularity, and if so, what would that 
>>look like?", 
> 
> [Sorry, perhaps I am missing something, but I don't find that to be
> an important question. Perhaps it is because I don't think of the
> Singularity in pronoun terms. I really don't know. Perhaps I could
> see your point a bit better if you could unpack 'baby' a bit more.]
> 
> Can you describe the Singularity as you see it with more words
> than the one (Singularity) but no pronouns? I think that would help.

If I build a superintelligence, does our world suddenly require enormously 
more computing power to simulate to useful accuracy, thus greatly reducing 
the measure of any branches of probability that lie in a simulation?  Does 
our world become enormously less interesting to simulating SIs, with the 
same result?  And if so, what does it feel like?  Is it subjectively the 
same as dying?  Or does it have no effect at all?  While I still don't 
know how to answer questions like those above, I see plausible strategies 
for handling most of the plausible answers.  The quantities involved can 
be managed intelligently.

>>where the triggering event lies in the future, is
>>important, and there is no obvious way to test different 
>>hypotheses in advance.
> 
> You may have a point, but I can't parse it, for the reasons given
> above.

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence



