[ExI] The symbol grounding problem in strong AI

John Clark jonkc at bellsouth.net
Sat Jan 2 21:11:02 UTC 2010


On Jan 2, 2010, at 11:46 AM, Gordon Swobe wrote:

> My dictionary calls it [consciousness] a noun. 

Yes, and dictionaries also call "I" a pronoun, and we know how much confusion that colossal error has caused the world. Lexicographers make very poor philosophers. 

> Stathis argues, not without reason, that if we can compute the brain, then computer simulations of brains should have intentionality.

Punch card readers from the 1950s had intentionality, at least according to your lexicographers: such machines could do things that were calculable and could be directed toward a goal. And I remind you that it was you, not me, who insisted on using the word "intentionality" rather than "consciousness"; I suppose you thought it sounded cooler.  

> I argue that even if we find a way to compute the brain, it does not follow that a simulation of it would have intentionality

You haven't argued anything. An argument isn't just contradiction; an argument is a connected series of statements intended to establish a proposition. You may object to this meaning, but I really must insist that argument is an intellectual process. Contradiction is just the automatic gainsaying of any statement.
Look, if I argue with you, I must take up a contrary position. 
Yes, but that's not just saying 'No it isn't.'
Yes it is!
No it isn't!
Yes it is!
I'm sorry, but your time is up and I'm not allowed to argue anymore.


I want to thank Professor Python for the invaluable help he gave me in writing this post.

John K Clark

> any more than it follows that a computer simulation of a ham sandwich would taste like a ham sandwich, or that a computer simulation of a waterfall would make a computer wet. Computer simulations of things do not equal the things they simulate.
> 
> I recall learning of a tribe of people in the Amazon forest or some such place that had never seen cameras. After seeing their photos for the first time, they came to fear them on the grounds that these amazing simulations of themselves captured their spirits. Not only did these naive people believe in spirits, they must also have believed that simulations of things somehow equal the things they simulate.
> 
> -gts