[ExI] The symbol grounding problem in strong AI.

John Clark jonkc at bellsouth.net
Tue Dec 29 16:28:29 UTC 2009


On Dec 29, 2009,  Gordon Swobe wrote:

> To borrow a phrase popularized by a philosopher by the name of Thomas Nagel, who famously wrote an essay titled _What is it like to be a bat?_,

I've read that essay and I think Nagel was totally confused, and his confusion has nothing to do with the mysteries of consciousness; it has to do with logic. He makes it clear that he doesn't want to know what it would be like for Thomas Nagel to be a bat; he wants to know what it is like for a bat to be a bat. The only way to do that is to turn the man into a bat, but then Thomas Nagel still wouldn't know, because he'd no longer be Thomas Nagel, he'd be a bat. Only a bat can know what it's like to be a bat because, like it or not, consciousness is a private experience.

> there exists something "it is like" to mentally solve or understand a mathematical equation. Computers do math well, but you can't show me how they could possibly know what it's like. 

Do you know what it's like? When you multiply 243 by 613, do you "KNOW" the answer is 147959? Does your intuition insist that it can be no other number, do you feel it in your bones, or did you just follow a purely mechanical procedure that you learned in the third grade to come up with that figure? There is no need to translate computer arithmetic into real arithmetic; it's already real.
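That "purely mechanical procedure" from the third grade can be written down explicitly. A minimal sketch (the function name and structure are mine, not anything from the thread): multiply by each digit of one factor, shift by its place value, and sum the partial products, with no step that requires "knowing" anything.

```python
def long_multiply(a: int, b: int) -> int:
    """Grade-school long multiplication: digit-by-digit partial
    products, shifted by place value and summed."""
    total = 0
    for place, digit_char in enumerate(reversed(str(b))):
        partial = a * int(digit_char)   # single-digit partial product
        total += partial * (10 ** place)  # shift left by place value
    return total

print(long_multiply(243, 613))  # -> 148959
```

The point stands either way: a person working the paper algorithm and a machine executing this loop are following the same mechanical steps.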

> Digital music? Same story.

You said computer simulations are not real, but I must insist that digital music is real music.

 John K Clark

PS: If you do feel it in your bones then you need your bones checked because the true answer is 148959.




