[ExI] The symbol grounding problem in strong AI

Gordon Swobe gts_2000 at yahoo.com
Thu Jan 7 12:51:21 UTC 2010


--- On Thu, 1/7/10, Stathis Papaioannou <stathisp at gmail.com> wrote:

> It makes sense. You are saying that the NCC affects
> neuronal behaviour, and the NCC is that part of neuronal 
> behaviour that cannot be simulated by computer

Not quite. I said *experience* affects behavior, and I did not say we could not simulate the NCC on a computer. 

Where the NCC (neural correlates of consciousness) exist in real brains, experience also exists, and the NCC correlate with it. (Hence the name.)

Think of it this way: consciousness exists in real brains in the presence of the NCC, just as the solidity of real water exists in the presence of temperatures at or below 32 degrees Fahrenheit.

You can simulate ice cubes on your computer, but those simulated ice cubes won't keep your processor from overheating. Likewise, you can simulate a brain on your computer, but that simulated brain won't have any real experience. In both cases you have merely computed simulations of real things.
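
To make the analogy concrete, here is a toy sketch (hypothetical Python, invented purely for illustration; the names and the trivial cooling model are my own, not from any real library):

FREEZING_POINT_F = 32.0

def simulate_cooling(temp_f: float, step: float = 1.0):
    """Lower the simulated temperature until the 'water' freezes."""
    # These are just numbers and strings in the computer's memory;
    # nothing here exerts any cooling effect on the hardware.
    while temp_f > FREEZING_POINT_F:
        temp_f -= step
    phase = "solid" if temp_f <= FREEZING_POINT_F else "liquid"
    return temp_f, phase

temp, phase = simulate_cooling(50.0)
print(temp, phase)  # prints: 32.0 solid, yet the CPU is no cooler

The program models freezing well enough for its purposes, but the "ice" it produces is a pattern of bits, not something cold. That is all I mean by the distinction between a simulation of a thing and the thing itself.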

> Therefore, [you think I mean to say that] neurons must contain 
> uncomputable physics in the NCC.

But I don't mean that. Look again at my ice cube analogy! 

-gts