[ExI] Principle of Computational Equivalence

Stathis Papaioannou stathisp at gmail.com
Fri Feb 5 09:59:50 UTC 2010


On 5 February 2010 06:09, Gordon Swobe <gts_2000 at yahoo.com> wrote:
> --- On Thu, 2/4/10, Stefano Vaj <stefano.vaj at gmail.com> wrote:
>
>> Not really to hear him reiterate innumerable times that for
>> whatever reason he thinks that (organic? human?) brains, while
>> obviously sharing universal computation abilities with cellular
>> automata and PCs, would on the other hand somehow escape the Principle
>> of Computational Equivalence.
>
> I see no reason to consider the so-called Principle of Computational Equivalence to be of philosophical interest with respect to natural objects like brains.
>
> Given a natural entity or process x and a computation of it c(x), it does not follow that c(x) = x. It does not matter whether x = an organic apple or an organic brain.
>
> c(x) = x iff x = a true digital artifact. It seems to me that we have no reason to suppose, except as a matter of religious faith, that any x in the natural world actually exists as a digital artifact.
>
> For example, we might in principle create perfect computations of hurricanes. It would not follow that hurricanes do computations.

Gordon, that is all true, but sometimes even a bad copy of an object
can perform the same function as the object. For example, a ball may
fly through the air like an apple even though it isn't an apple and
lacks many of the other properties of an apple. The claim is not that
a computer will be *identical* with the brain, but that it will
reproduce the intelligence of the brain and, as a corollary, the
consciousness of the brain, which, it turns out (from a logical
argument that you can't or won't follow or even attempt to rebut), is
impossible to disentangle from the intelligence.


-- 
Stathis Papaioannou


