<div dir="ltr"><div class="gmail_extra"><div class="gmail_quote">On Thu, Apr 10, 2014 at 6:41 PM, Tomaz Kristan <span dir="ltr"><<a href="mailto:protokol2020@gmail.com" target="_blank">protokol2020@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div class=""><div style="font-family:arial,sans-serif;font-size:12.5714px">> So this predicts that a random observer should predict he is in a simulation of an early interval. </div>
<div><br></div></div><div>Then, he must also predict, that his simulator is also simulated! And so on, through all the turtles/simulators?</div><br></div></blockquote></div><br></div><div class="gmail_extra">If the universe is a simulation, is there a base hardware for computing the sims?<br>
<br></div><div class="gmail_extra">I like the "turtles all the way down" to describe the recursion, but it doesn't satisfy the base condition (that's the point, right?)<br><br></div><div class="gmail_extra">
I wonder if this concept will actually survive serialization to words, though I'd be happy if anyone confirmed it does:<br><br></div><div class="gmail_extra">Programmers/CS folks talk about arrays as 2-dimensional, but the memory backing a 2d array is really 1-dimensional. The math for the mapping function that yields the 1d address of a 2d element [j,k] is simple: j * kmax + k (assuming 0-based indexes). This function generalizes from higher dimensions down to 1d memory addresses. No doubt actual implementations of large (and sparsely populated) arrays use internal representations that make much more efficient use of space, but let's agree that space is cheaper than cleverness.<br>
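As a minimal sketch of that generalization (the function name flat_index is mine, just for illustration — this is the standard row-major layout, not any particular language's internals):

```python
def flat_index(indices, dims):
    """Map an n-dimensional index to a 1d (row-major) memory offset."""
    addr = 0
    for i, d in zip(indices, dims):
        addr = addr * d + i  # fold each dimension into the running offset
    return addr

# 2d case: element [j, k] in a jmax x kmax array -> j * kmax + k
assert flat_index((2, 3), (4, 5)) == 2 * 5 + 3   # 13

# and it generalizes: 3d element [i, j, k] -> (i * jmax + j) * kmax + k
assert flat_index((1, 2, 3), (4, 5, 6)) == (1 * 5 + 2) * 6 + 3   # 45
```

Note the first dimension's size never appears in the arithmetic — it only bounds how many elements exist, which is why the same formula keeps working as you add dimensions.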
<br></div><div class="gmail_extra">I'm going to jump over the 2d, 3d mappings and go right to holographic principle. I'm also going to assume everyone here already knows what that's about (or can look it up).<br>
<br></div><div class="gmail_extra">I'd like to propose that information density is a feature of any given volume of space. Is expansion is a result of increased information content or is entropy a result of expansion. While information is computed, new information is generated (ex: metadata,intermediate results, etc) I think it's obvious this becomes unwieldy in much the same way a base1 number system is unwieldy. So Intelligence (capitalization denoting requisite handwaving of definitions) applies some externalization of meaning into a computation protocol.<br>
<br></div><div class="gmail_extra">A network router doesn't need to "understand" the entire payload of a packet of data; only the relevant headers. I wonder if the Intelligence(*) computing the sim(s) can defer meaning of various information densities in a layer-independent and application agnostic way.<br>
<br></div><div class="gmail_extra">We may be looking at the information in our local region of spacetime and pondering the Fermi paradox simply because we're unaware of the correct protocol to understand the communication that is literally all around us. I imagine looking at any individual packet from among the trillions flowing over the Internet at any given moment would appear to be unintelligible noise without knowing the TCP/IP protocol. Even with that bit of information, encrypted content (SSL, etc.) is intentionally meaningless without prior knowledge of externalized context.<br>
<br></div><div class="gmail_extra">I think the same information protocol problem exists in understanding the genome. Portions that were once referred to as "junk DNA" has been found to be functional/important. <br>
<br></div><div class="gmail_extra">META: my experience with computer science and networking frames my thinking (about thinking/AI/etc) in these terms. Max Tegmark arrived at his Level IV universe through a cosmological experience/background. I suspect many other disciplines might lead to similar conception of these platonic forms. (including Greek philosophy from two millennia ago)<br>
</div><div class="gmail_extra"><br><br></div></div>