[ExI] The symbol grounding problem in strong AI
gts_2000 at yahoo.com
Thu Jan 7 13:30:18 UTC 2010
Stathis, I wrote:
> Where the NCC (neural correlates of consciousness) exist in
> real brains, experience exists, and the NCC correlate.
> (That's why the second "C" in NCC.)
I meant the first "C", of course. The NCC *correlate*.
If we knew exactly what physical conditions must exist in the brain for consciousness to exist, i.e., if we knew everything about the NCC, then we could perfectly simulate those physical conditions on a computer. And someday we will do this. But that computer simulation will have only weak AI for the same reason that simulated ice cubes won't cool your computer's processor.
I understand why you want to say that I must therefore think consciousness exists outside the material world, or that I think we cannot compute the brain. But that's not what I mean at all. I see consciousness as just a state that the brain can be in. We can simulate that brain-state on a computer just as we can simulate the solid state of water.