[ExI] The digital nature of brains (was: digital simulations)

Stathis Papaioannou stathisp at gmail.com
Sun Jan 31 00:54:25 UTC 2010


On 31 January 2010 10:51, Eric Messick <eric at m056832107.syzygy.com> wrote:

> The "he becomes the system" thing is stretching the analogy way past
> its breaking point.  If we're talking about an ordinary human (which
> Searle apparently is), then there is no way that human could contain
> enough information or process it quickly enough to pass the Turing
> Test before dying of old age (or even before the heat death of the
> universe).
>
> If the system is a neural level simulation, then the human must
> maintain state information on every neuron in a human brain.  There
> isn't anywhere to put that information, as the human's neurons are
> already full keeping their own state.

I don't see any problem in principle with the human being the whole
system yet not understanding what he is doing. A human can follow a
simple algorithm without grasping its larger purpose, so why demand
that he understand a very complex one? The intelligence of the man in
the CR does not actually participate in the process except insofar as
it is needed to manipulate the symbols, so all he understands is the
symbol manipulation. The same holds for a brain or a computer: the
components only need to understand their own basic job. If all the
neurons in a brain were linked into a single mind that knew when to
make each of them fire, that mind would not necessarily have any
knowledge of the human mind it was generating, and vice versa.
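To make the point concrete, here is a toy sketch in Python (the rule
table and the strings in it are placeholders of my own, not anything
from Searle's setup): the loop that applies the rules needs no
knowledge of what the symbols mean, and whatever "understanding" the
system has would have to reside in the rule book as a whole.

# Toy illustration only: a "rule book" pairing input symbol strings
# with replies. The entries are hypothetical placeholders.
RULES = {
    "你好": "你好，请问有什么可以帮您？",
    "谢谢": "不客气。",
}

def operator(symbols: str) -> str:
    """Apply the rule book mechanically: match the incoming symbols,
    copy out the listed reply. No step here requires knowing Chinese."""
    return RULES.get(symbols, "对不起，我不明白。")

print(operator("你好"))  # a sensible Chinese reply, produced without understanding

The operator's job is the same whether the table has two entries or
enough entries to pass the Turing test; only the practical size of the
task changes, not what he needs to understand.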


-- 
Stathis Papaioannou
