[ExI] The symbol grounding problem in strong AI.

John Clark jonkc at bellsouth.net
Tue Dec 29 05:13:57 UTC 2009


On Dec 28, 2009,  Gordon Swobe wrote:

> computer simulations of things do not equal the things they simulate. 

Sometimes simulations are exactly equal to the things they simulate, and the more abstract something is, the more likely that is to happen. Computer arithmetic is real arithmetic, and digital music is real music.
And when a computer performs an action, it is a real action; there is nothing simulated about it. A computer can duplicate many adjectives and verbs, and even a few nouns, so there is one question you need to ask yourself: is consciousness more like a symphony or more like a brick?

There are two other points I'd like to make:

1) Even if you're correct, there is absolutely no way you will ever know you're correct.
2) Even if you're correct, that fact will never have any effect on the Human Race; it would be the AI's problem, not ours.

 John K Clark




