[ExI] The symbol grounding problem in strong AI

Stathis Papaioannou stathisp at gmail.com
Tue Dec 15 16:47:41 UTC 2009


2009/12/16 Gordon Swobe <gts_2000 at yahoo.com>:

>>.. are installed in place of part of your brain, the visual cortex being
>> good for illustration purposes. You are then asked if you notice
>> anything different. What will you say? Before answering, consider
>> carefully the implications of the fact that the essential feature of the
>> artificial neurons is that they behave just like biological neurons in
>> their interactions with their neighbours.
>
> What I will say will depend on what I experience, and until the experiment happens I will have no idea what that experience might be like. However, I do take issue with your assumption that your artificial neurons will (by "logical necessity", as you put it in another message) produce exactly the same experience as real neurons, especially in the realm of consciousness, merely by virtue of having the same "interactions with their neighbours". We simply don't know if that's true.

As John Clark pointed out, the neighbouring neurons *must* respond in
the same way with the artificial neurons in place as with the original
neurons. Therefore, your motor neurons *must* make you behave in the
same way: you declare that everything looks normal, and you correctly
tell me how many fingers I am holding up. It is impossible for
anything else to happen. So the point is this: if you reproduce the
behaviour of the neurons, you reproduce the behaviour of the brain and
the whole person. The further question then is, does this also
reproduce the consciousness? If it does not, then that would mean
either that you go blind but don't notice, or that you go blind and
notice, but feel yourself smiling and declaring that everything is fine
despite your frantic efforts to call out and end the nightmarish experiment. The
former possibility makes a mockery of the concept of perception (how
do you know you are perceiving anything now if you can be mistaken in
this way?) while the latter implies that you are doing your thinking
independently of your brain. These possibilities both seem absurd. The
simple explanation is that if your brain behaves the same way, you
must have the same consciousness.


-- 
Stathis Papaioannou


