[ExI] The symbol grounding problem in strong AI

Gordon Swobe gts_2000 at yahoo.com
Sat Jan 9 19:54:47 UTC 2010


Stathis,

You have mentioned on a couple of occasions that you think I must believe the brain does something that does not lend itself to computation. I made a mental note to figure out why you say this. I had planned to go through your messages again, but instead I'll try to address what I think you meant.

Assume we know everything we can possibly know about the brain and that we use that knowledge to perfectly simulate a conscious brain on a computer. 

Even though I believe everything about the brain lends itself to computation, and even though I believe our hypothetical simulation in fact computes everything computable about a real conscious brain, I still say that our simulation will have no subjective experience.

Perhaps you want to know how I can say this without assigning some kind of strange non-computable aspect to natural brains, or without asserting mind/matter duality or some other mystical explanation of subjective experience. Understandable questions.

The answer is that I say it because I don't believe the brain is actually a computer.

Some people seem to think that if we can compute X on a computer, then a computer simulation of X must equal X. But that's just a blatant non sequitur: a simulation of a thing is not the thing itself. A perfect simulation of a rainstorm leaves nothing wet.

-gts

More information about the extropy-chat mailing list