[ExI] The symbol grounding problem in strong AI

Gordon Swobe gts_2000 at yahoo.com
Sat Dec 19 04:26:27 UTC 2009


--- On Fri, 12/18/09, Stathis Papaioannou <stathisp at gmail.com> wrote:

After a complete replacement of my brain with your nano-neuron brain...

> So you might lose your visual perception but to an external
> observer you would behave just as if you had normal vision and, 

Yes. By the way, though it is not exactly the same thing we mean here, the phenomenon of blindsight exists: people with this condition can detect objects in their visual field but cannot see them. Or rather, they cannot see that they see them.

> more to the point, you would believe you had normal vision. 

I would have no awareness of any beliefs I might have.

> You would look at a person's face, recognise them,

I would not know that I recognized them, but I would act as if I did.

> experience all the emotional responses associated with that person, 

I would have the bodily responses, but no awareness of them.

> describe their features vividly, but in actual fact you would be seeing 
> nothing. 

I would see but not know it.

> How do you know you don't have this kind of zombie vision right now? 

Because I know I can see.

> Would you pay to have your normal vision restored, knowing that it 
> could make no possible subjective or objective difference to you?

No, but I wouldn't know that I didn't.

In all of the above except the second-to-last answer, I would lack intentionality.


> Well how about this theory: it's not the program that has
> consciousness, since a program is just an abstraction. It's
> the physical processes the machine undergoes while running the
> program that causes the consciousness. Whether these processes can
> be interpreted as a program or not doesn't change their
> consciousness.

I don't think S/H (software/hardware) systems have minds, but I do think you've pointed in the right direction. I think matter matters. More on this another time.

-gts
