[ExI] The symbol grounding problem in strong AI

Stathis Papaioannou stathisp at gmail.com
Thu Jan 7 14:03:59 UTC 2010


2010/1/7 Gordon Swobe <gts_2000 at yahoo.com>:
> --- On Thu, 1/7/10, Stathis Papaioannou <stathisp at gmail.com> wrote:
>
>> It makes sense. You are saying that the NCC affects
>> neuronal behaviour, and the NCC is that part of neuronal
>> behaviour that cannot be simulated by computer
>
> Not quite. I said *experience* affects behavior, and I did not say we could not simulate the NCC on a computer.

"Experience" can only affect behaviour by moving stuff. How does the
stuff get moved? What would have to happen is something like this: the
NCC molecule attaches to certain ion channels, changing their
conformation and thereby allowing an influx of sodium ions,
depolarising the cell membrane; and this event constitutes a little
piece of experience. So while you claim "experience" cannot be
simulated, you allow that the physical events associated with the
experience can be simulated, which means every aspect of the neuron's
behaviour can be simulated.
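
Just to make "can be simulated" concrete, here is a toy Python
sketch of exactly that causal chain: a hypothetical ligand binds at
a fixed time, the opened channels pass a sodium current, and the
membrane depolarises. Every name and number below is an illustrative
assumption, not anyone's proposed model of the NCC.

# A minimal sketch of the chain of events described above: a ligand
# binds to ion channels at t_bind, the open channels pass a sodium
# current, and the membrane depolarises. All values are assumed.

def simulate_binding_event(t_bind=5.0, t_end=20.0, dt=0.01):
    """Euler-integrate a single-compartment membrane that depolarises
    when a hypothetical ligand opens Na+ channels at t_bind (ms)."""
    C_m = 1.0        # membrane capacitance, uF/cm^2 (assumed)
    g_leak = 0.3     # leak conductance, mS/cm^2 (assumed)
    E_leak = -70.0   # leak reversal potential, mV (assumed)
    g_na_open = 2.0  # Na+ conductance once channels open (assumed)
    E_na = 55.0      # sodium reversal potential, mV

    v = E_leak       # start at the resting potential
    trace = []
    t = 0.0
    while t < t_end:
        # channels stay closed until the binding event at t_bind
        g_na = g_na_open if t >= t_bind else 0.0
        i_na = g_na * (E_na - v)        # inward Na+ current depolarises
        i_leak = g_leak * (E_leak - v)  # leak pulls v back toward rest
        v += dt * (i_na + i_leak) / C_m
        trace.append((t, v))
        t += dt
    return trace

trace = simulate_binding_event()
print(f"before binding: {trace[0][1]:.1f} mV")
print(f"after binding:  {trace[-1][1]:.1f} mV")

Run it and the membrane sits at rest until the binding event, then
depolarises and settles at a new potential; nothing in the physical
story you told is beyond this kind of numerical integration.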

> Where the NCC (neural correlates of consciousness) exist in real brains, experience exists, and the NCC correlate with it. (That's the reason for the second "C" in NCC.)
>
> Think of it this way: consciousness exists in real brains in the presence of the NCC as the solidity of real water exists in the presence of temperatures at or below 32 degrees Fahrenheit.
>
> You can simulate ice cubes on your computer but those simulated ice cubes won't help keep your processor from overheating. Likewise, you can simulate brains on your computer but that simulated brain won't have any real experience. In both examples, you have merely computed simulations of real things.

If you want the computer to interact with the world, you have to
attach it to I/O devices which are not themselves computers. For
example, the computer could be attached to a Peltier device in order
to reproduce the cooling effect that an ice cube would have on the
processor.
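
To sketch what that attachment might look like in Python (with a
stand-in for the hardware; the Peltier interface, the sensor read
and the constants are all assumptions, since the real thing would be
a PWM/GPIO driver):

# The computation alone moves no heat; only the attached transducer
# does. The simulation computes the heat an ice cube would draw, and
# the (hypothetical) Peltier element draws it for real.

import time

def simulated_ice_cube_heat_flux(cpu_temp_c: float) -> float:
    """Toy model: heat (W) an ice cube would draw from a surface at
    cpu_temp_c, proportional to the difference with 0 C ice."""
    k = 0.5  # effective thermal conductance, W per degree C (assumed)
    return max(0.0, k * (cpu_temp_c - 0.0))

def read_cpu_temp_c() -> float:
    """Placeholder for a real sensor read, e.g. /sys/class/thermal."""
    return 65.0

class FakePeltier:
    """Stand-in for a real Peltier driver; the method is hypothetical."""
    def set_cooling_watts(self, watts: float) -> None:
        print(f"peltier drawing {watts:.1f} W from the processor")

def control_loop(peltier: FakePeltier, cycles: int = 3) -> None:
    for _ in range(cycles):
        watts = simulated_ice_cube_heat_flux(read_cpu_temp_c())
        peltier.set_cooling_watts(watts)
        time.sleep(0.1)

control_loop(FakePeltier())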

>> Therefore, [you think I mean to say that] neurons must contain
>> uncomputable physics in the NCC.
>
> But I don't mean that. Look again at my ice cube analogy!

The question of whether it is possible to put a computer in a neuron
suit, so that its behaviour is indistinguishable to other neurons
from that of a natural neuron, is equivalent to the question of
whether a robot can impersonate a human well enough that other
humans can't tell it's a robot. I know you believe the robot human
would lack intentionality, but you have (I think) agreed that
despite this handicap it would be able to pass the Turing test,
pretend to have emotions, and so on, as it would have to do in order
to qualify as a philosophical zombie. So are you now saying that
while a zombie robot human presents no theoretical problem, a zombie
robot neuron, which after all only needs to reproduce much simpler
behaviour and only needs to fool other neurons, would be impossible?
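
The neuron suit claim is really a claim about interfaces, which a
small sketch can make explicit: to its neighbours a neuron is
nothing but its input-output spike behaviour, so two implementations
with different internals but the same mapping are indistinguishable.
The interface and both toy implementations here are illustrative
assumptions, not a model of real neurons.

from typing import Protocol

class Neuron(Protocol):
    def receive(self, weighted_input: float) -> bool:
        """Accept synaptic input; return True if the neuron fires."""

class NaturalNeuron:
    """Integrates input on a membrane potential (toy update rule)."""
    def __init__(self) -> None:
        self.v = -70.0
    def receive(self, weighted_input: float) -> bool:
        self.v += weighted_input
        if self.v >= -55.0:   # threshold crossed: fire and reset
            self.v = -70.0
            return True
        return False

class NeuronSuitComputer:
    """Different internals (a log of events and a sum), same mapping."""
    def __init__(self) -> None:
        self.events: list[float] = []
    def receive(self, weighted_input: float) -> bool:
        self.events.append(weighted_input)
        if -70.0 + sum(self.events) >= -55.0:
            self.events.clear()
            return True
        return False

def drive(neuron: Neuron, inputs: list[float]) -> list[bool]:
    """All a neighbouring neuron ever sees: the spike train."""
    return [neuron.receive(x) for x in inputs]

inputs = [5.0, 5.0, 7.0, 1.0, 9.0, 20.0]
assert drive(NaturalNeuron(), inputs) == drive(NeuronSuitComputer(), inputs)
print("identical spike trains: the neighbours cannot tell them apart")

Whatever you think the natural neuron has going on inside, the
downstream neurons receive the identical spike train.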


-- 
Stathis Papaioannou


