On 13/09/2012 14:59, Stefano Vaj wrote:
> On 13 September 2012 05:32, spike <spike66@att.net> wrote:
>> I have no way of knowing if such a thing would ever simulate
>> intelligence, but I do have a way of knowing the alternative: a
>> dead rock does not simulate anything.
>
> If you believe in the Principle of Computational Equivalence, almost
> everything - that is, everything which is beyond a very low
> complexity threshold much below that of any single PC - can emulate
> anything else, the only issue being that of the relative performance
> in the execution of a given programme.
 
The relative performance is not negligible. If you try to implement
Windows 7 on the thermal interactions in a rock, you will find that
the mapping between computer state and rock state is exceedingly
complex. The same would be true for mapping it onto genetic switches.
(The first case is, truth be told, not really Wolfram's principle but
Searle's criticism of functionalism.)
 
Universal Turing machines can emulate each other with a constant
overhead, but even the move over to Wolfram's dear Rule 30 adds at
least a linear slowdown. And Rule 30 is still nearly a Turing
machine: it is a computational universe that is close to optimal for
doing Turing-machine-style computations. As you move further away
from systems designed to be computers, the slowdown likely becomes
exponential.
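 
To make the slowdown concrete, here is a toy Rule 30 stepper in
Python (my own sketch, nothing from Wolfram beyond the standard
update rule, with periodic boundaries for simplicity):

# Rule 30: new cell = left XOR (centre OR right)
def rule30_step(cells):
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

row = [0] * 31
row[15] = 1  # single live seed cell
for _ in range(15):
    print(''.join('#' if c else '.' for c in row))
    row = rule30_step(row)

Advancing anything you embed in this universe by one step means
updating the whole row, which is where the linear slowdown already
comes from - and this is a system almost maximally friendly to
computation.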
 
Taking a pre-existing object and programming it to do Windows
requires you to find a subset of the causal interactions in the
object that implements a Turing machine. It has to be a very simple
machine: the probability of getting something equivalent to a Pentium
is vanishingly small, while Wolfram's 2-state, 3-symbol Turing
machine contains about 18 bits of information, so expect to find one
in every 2^18 random systems. Even this is somewhat tricky, since
there is often no selection for such subsets in nature; they just
show up. In particular, there is no selection for subsets that are
free from causal interactions with the rest of the system: they get
swamped with noise all the time. While in principle a noise-free
Turing-complete subsystem could run Windows, a noisy one is incapable
of doing it. So you need to find a subsystem that can house a few
billion bits, yet does not interact strongly with the rest of the
system or the outside world. That is even tougher.
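 
Back-of-envelope, taking the 18-bit estimate at face value (my own
arithmetic, nothing rigorous):

# ~18 bits specify Wolfram's 2-state, 3-symbol machine, so a
# uniformly random system should match it about once per 2**18 tries
print(2 ** 18)        # 262144 candidate systems to sift through

# but the lucky subsystem must then also house Windows-scale state
print(4 * 10 ** 9)    # a few billion bits, all kept noise-free

Sifting through a quarter of a million systems is survivable; finding
billions of noise-isolated bits in the same subsystem is the part
that kills the project.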
 
Wolfram's principle is cool for thinking about Tegmark level 4
universes and the problems of defining life and computation, but it
does not give us much practical help. The fact that small parts of
nearly any complex system (when placed in peculiar states) can
emulate small parts of nearly any other complex system is
interesting, but not useful for much. We need to engineer systems to
become good at computation (large parts of them can emulate large
parts of other systems) if we want actual results.
 
--
Anders Sandberg,
Future of Humanity Institute
Philosophy Faculty of Oxford University