[ExI] The symbol grounding problem in strong AI

John Clark jonkc at bellsouth.net
Sun Dec 27 16:53:42 UTC 2009


On Dec 26, 2009, at 7:27 PM, Gordon Swobe wrote:
> 
> You skipped over my points about the formal nature of machine level programming and the machine states represented by the 0's and 1's that have no semantic content, real OR imagined.

So you're surprised that Stathis didn't comment on your observation that if you take something very big, even as big as mind, and keep dividing it, eventually you will get to something that is not big. I suspect he didn't reply to your little homily because he already knew that.

> That's what we're talking about here at the most basic hardware level to which you want now to appeal: "On" vs "Off"; "Open" vs "Closed". They mean nothing even to you and me, except that they differ in form one from the other.


One thing differing from another is the atom of meaning, and from that small bit a universe can be made. You seem to find it so astounding as to be unbelievable that a very small part of a system can have properties that are different from those of the entire very large system. I don't.
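
As a minimal illustration of that point (my own hypothetical sketch, assuming nothing beyond standard Python, not anything from Gordon's or John's posts): eight on/off states, each meaningless in isolation and differing only in form, compose into the ASCII symbol 'A', whose meaning exists only at the level of the whole.

    # Each entry is just "on" vs "off"; no single bit means anything by itself.
    bits = [0, 1, 0, 0, 0, 0, 0, 1]

    value = 0
    for b in bits:
        value = (value << 1) | b   # accumulate the bare distinctions

    print(value, chr(value))       # -> 65 A : a conventional symbol emerges from the whole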

 John K Clark


