[ExI] The symbol grounding problem in strong AI

Stathis Papaioannou stathisp at gmail.com
Sat Dec 19 06:49:21 UTC 2009


2009/12/19 Gordon Swobe <gts_2000 at yahoo.com>:
> --- On Fri, 12/18/09, Stathis Papaioannou <stathisp at gmail.com> wrote:
>
> After a complete replacement of my brain with your nano-neuron brain...

It's important that you consider first the case of *partial*
replacement, e.g. all of your visual cortex replaced but the rest of
the brain left intact.

>> So you might lose your visual perception but to an external
>> observer you would behave just as if you had normal vision and,
>
> Yes. By the way, though not exactly the same thing as we mean here, the phenomenon of blindsight exists. People with this condition detect objects in their line of sight but cannot see them. Or, rather, they cannot see that they see them.

Patients with blindsight can to an extent detect objects put in front
of their eyes but do not have visual perception of the objects.
Patients with Anton's syndrome show the opposite phenomenon: they are
blind and stagger about, walking into things, but claim that they can
see, and confabulate when asked to describe what they see.

>> more to the point, you would believe you had normal vision.
>
> I would have no awareness of any beliefs I might have.

Yes, you would, if your visual cortex alone were replaced and the rest
of your brain left unchanged.

>> You would look at a person's face, recognise them,
>
> I would not know that I recognized them, but I would act as if I did.

Yes, you would recognise them.

>> experience all the emotional responses associated with that person,
>
> Bodily responses, but I would have no awareness of them.

Yes, you would, even without any visual cortex.

>> describe their features vividly, but in actual fact you would be seeing
>> nothing.
>
> I would see but not know it.

You would not see but you *would* know that you are seeing, as surely
as you know that you are seeing now.

>> How do you know you don't have this kind of zombie vision right now?
>
> Because I know I can see.

And you would have the same kind of knowledge with a zombified visual
cortex, since the rest of your brain would be unchanged.

>> Would you pay to have your normal vision restored, knowing that it
>> could make no possible subjective or objective difference to you?
>
> No, but I wouldn't know that I didn't.

You would know (because you had been reliably informed) that you had a
zombified visual cortex, but try as you might you could not tell any
difference compared to before the operation. This is because the
artificial neurons would be sending the same signals to the rest of
your brain. So you would be postulating that you are blind, but that
the blindness makes absolutely no difference to you, since you have
all the thoughts and feelings and behaviours associated with normal
vision. Is it possible that you could lose all visual experience like
this and not even notice? If it is, then experience is something very
different from what we all intuitively know it to be.

You're going very far in postulating this strange theory of partial
zombiehood (which could be afflicting us all at this very moment) just
so that you can maintain that the artificial neurons lack
consciousness. The simpler and more plausible alternative is that if
the artificial neurons reproduce the behaviour of biological neurons,
then they also reproduce the consciousness of biological neurons.

> In all the above except the second to last, I lack intentionality.
>
>
>> Well how about this theory: it's not the program that has
>> consciousness, since a program is just an abstraction. It's
>> the physical processes the machine undergoes while running the
>> program that causes the consciousness. Whether these processes can
>> be interpreted as a program or not doesn't change their
>> consciousness.
>
> I don't think S/H systems have minds but I do think you've pointed in the right direction. I think matter matters. More on this another time.

But you also think that if the matter behaves in such a way that it
can be interpreted as implementing a computer program, it lacks
consciousness. The CR lacks understanding because the man in the room,
who can be seen as implementing a program, lacks understanding;
whereas a different system which produces similar behaviour, but with
dumb components whose interactions can't be recognised as algorithmic,
has understanding. You are penalising the CR because it has something
extra in the way of pattern and intelligence.


-- 
Stathis Papaioannou


