[ExI] The symbol grounding problem in strong AI.

John Clark jonkc at bellsouth.net
Thu Jan 7 16:04:21 UTC 2010


On Jan 7, 2010, Gordon Swobe wrote:
> 
> If we knew exactly what physical conditions must exist in the brain for consciousness to exist, i.e., if we knew everything about the NCC,

This NCC of yours is gibberish. You state very specifically that it is not the signals between neurons that produce consciousness, so how could some sort of magical awareness inside the neuron correlate with anything? You must have 100 billion independent conscious entities inside your head.

> then we could perfectly simulate those physical conditions on a computer. And someday we will do this.

Glad to hear it.

> But that computer simulation will have only weak AI

So even physical perfection is not enough for consciousness; something must still be missing. Let's see if we can deduce some of the properties of that something. Well, first of all, obviously it's non-physical; also, it can't be detected by the Scientific Method, it can't be produced by Darwin's Theory of Evolution, and it starts with the letter "S".

 John K Clark

> for the same reason that simulated ice cubes won't cool your computer's processor.
> 
> I understand why you want to say that I must therefore think consciousness exists outside the material world, or that I think we cannot compute the brain. But that's not what I mean at all. I see consciousness as just a state that the brain can be in. We can simulate that brain-state on a computer just as we can simulate the solid state of water. 
> 
> -gts
