[ExI] The symbol grounding problem in strong AI

Gordon Swobe gts_2000 at yahoo.com
Sat Jan 2 15:29:02 UTC 2010


--- On Sat, 1/2/10, John Clark <jonkc at bellsouth.net> wrote:

>> In my opinion you fall off the rails there and wander into the land of 
>> metaphysical dualism.
> 
> It may be dualism to say that what a thing is
> and what a thing does are not the same, but it's not
> metaphysical, it's just logical.

I see a metaphysical problem only when people assert that the mind exists as some sort of abstract entity (programmatic, algorithmic, whatever) distinct from the brain that actually does the work that we describe with those abstractions.

If we want to say that mind exists in such abstract, idealistic ways, that's fine, but then we must contend with all the problems associated with metaphysical dualism. Where does that mind exist? In the Platonic realm? In the mind of god? Where? And how can idealistic entities affect the material world? And so on.

I would rather not go down that road, nor would Searle, and I assume nobody here wants to go there either.

> Intentional means calculable, and calculable sounds to me to be something
> programs should be rather good at. 

Good at simulating intentionality, yes.

>> If you expect to find consciousness in or stemming
>> from a computer simulation of a brain then I would suppose
>> you might also expect to eat a photo of a ham sandwich off a
>> lunch menu and find that it tastes like the ham sandwich it
>> simulates.

> I haven't actually tried to do it but I don't
> believe that would work very well. It's just a hunch.

Good hunch.

> Because a ham sandwich is a noun and a photo of one is a very different 
> noun and consciousness is not even a noun at all.

My point is that simulations only, ahem, simulate the things they simulate. 

The system in which we implement a simulation will not equal or contain the thing it simulates. It does not matter what we want to simulate, nor does it matter whether we implement the simulation in software running on hardware or in photos of ham sandwiches printed on lunch menus. No matter what we do, simulations of real things will never equal the real things they simulate.

I don't see this as an especially difficult concept to fathom, and it has nothing to do with Darwin!

-gts