[ExI] Is the brain a digital computer?

Stathis Papaioannou stathisp at gmail.com
Fri Feb 26 12:12:02 UTC 2010


On 26 February 2010 01:45, Gordon Swobe <gts_2000 at yahoo.com> wrote:
> --- On Thu, 2/25/10, Stathis Papaioannou <stathisp at gmail.com> wrote:
>
>>> You contradicted yourself when you wrote as you did
>>> yesterday of "functionally identical but unconscious" brain
>>> components. This is why I wrote that making such a thing
>>> would be like trying to draw a square triangle. You took my
>>> comment wrongly to mean that I deny the possibility of weak
>>> AI.
>>
>> I have been at pains to say that the brain component is
>> functionally identical from the point of view of its external behaviour.
>
> And I have been at pains to explain that you cannot get identical external behavior from an unconscious component. Because of the feedback loop between conscious experience and behavior, the components you have in mind cannot exist. They're oxymoronic.
>
> You open a can of worms when you replace any component of the NCC with one of your unconscious dummy components. But you can close that can of worms with enough work on other parts of the patient. You can continue to work on the patient, patching the software so to speak and rewiring the brain in other areas, until finally you create weak AI.
>
> The final product of your efforts will pass the Turing test, but its brain structure will differ somewhat from that of the original patient.

You have no problem with the idea that an AI could behave like a
human, but you don't think it could behave like a neuron. Does this mean you
also think it would be harder to make a convincing zombie mouse than a
zombie human, and a zombie flatworm harder still?

The task is to replace all the components of a neuron with artificial
components so that the neuron behaves just the same. We assume that
this is done by aliens with extremely advanced technology. They could,
for example, replace every atom in the neuron with exactly the same
atom, and the resulting neuron will behave normally and have normal
consciousness, even though the aliens only set out to reproduce the
behaviour and neither know nor care about human consciousness. As an
exercise, the aliens decide to replace the neuronal components with
equivalently functioning analogues. For example, the neuron will have
a system that is responsible for the timing and amount of
neurotransmitter release, and the aliens install in its place a
nanoprocessor which does the same job. Suppose the original system was
part of the NCC. Are you saying that however hard the aliens try, they
won't be able to get the modified neuron to control neurotransmitter
release in the same way as the original neuron? If you are, then you
are saying that the NCC utilises physics which is NOT COMPUTABLE. It
does infinite precision arithmetic, or solves the halting problem, or
something. If this is not the case, then the aliens would be able to
make computerised neurons which are drop-in replacements for
biological neurons; no further adjustment to biological tissue would
be needed for the whole brain to behave normally.
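To make the "NOT COMPUTABLE" claim concrete: the halting problem is the
standard example of something no digital computer can do. A minimal
sketch of the classic diagonalization argument, in Python (all names
here are illustrative, not anything from the thread):

```python
# Sketch of the halting-problem diagonalization. Suppose halts(f, x)
# were a total, always-correct predictor of whether f(x) terminates.

def make_contrary(halts):
    # Build a program that does the opposite of whatever `halts`
    # predicts it will do when run on its own source.
    def contrary(prog):
        if halts(prog, prog):
            while True:       # predicted to halt -> loop forever
                pass
        return "halted"       # predicted to loop -> halt immediately
    return contrary

# Any candidate predictor is refuted by its own contrary program.
# Here is a deliberately wrong predictor, to show the refutation:
def naive_halts(f, x):
    return False  # predicts that nothing ever halts

contrary = make_contrary(naive_halts)
result = contrary(contrary)   # halts and returns "halted",
                              # contradicting naive_halts's prediction
```

Since `contrary(contrary)` halts if and only if the predictor says it
doesn't, no such predictor can exist; a brain that genuinely solved the
halting problem would therefore be doing something no Turing machine
can replicate.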

Note that functionalism can still be true even if the NCC is not
computable. Functionalism says that if you replace the NCC with a
similarly functioning device, you will preserve the mind as well as
the behaviour. If a digital computer can't do the job, another device
that functions as a hypercomputer might. For example, a little man
pulling levers could be up to the task, as his own brain would not
suffer from the limitations of a Turing machine. The neuron containing
the little man would then behave exactly the same as a biological
neuron, and would also have the consciousness of a biological neuron,
otherwise we are once more up against the problem of the partial
zombie.


-- 
Stathis Papaioannou
