[ExI] The symbol grounding problem in strong AI

John Clark jonkc at bellsouth.net
Mon Dec 14 15:46:47 UTC 2009


On Dec 14, 2009,  Gordon Swobe wrote:

> that we're missing some important ingredient to explain consciousness

No, we're missing some important ingredient to explain intelligence. Consciousness is easy to explain, and that's the problem: absolutely any theory will do, because there is no data it needs to explain. One consciousness theory is as good as another. Theories of intelligence are an entirely different matter: they are devilishly hard to come up with, and there is a universe of data they need to explain.

 John K Clark
