[ExI] The symbol grounding problem in strong AI

John Clark jonkc at bellsouth.net
Tue Dec 22 17:12:03 UTC 2009


On Dec 22, 2009, Gordon Swobe wrote:

> Searle assumes these three propositions as premises (he calls them axioms; I prefer to call them premises because premises seem more open to criticism):
> 
> P1) Programs are formal (syntactic).
> P2) Minds have mental contents (semantics).
> P3) Syntax is neither constitutive of nor sufficient for semantics.

In P3 Searle assumes that syntax and semantics have absolutely nothing to do with each other, and that is obviously false. And in his infamous Chinese Room, Searle makes great use of an axiom not stated above; let's call it P4:

P4) Semantics is a very simple thing with only two values, understanding and non-understanding; you either have semantics or you don't.

P4 is even sillier than P3, and that's pretty silly. Understanding is the most important part of mind, and mind is the most complex thing in the known universe, yet Searle thinks semantics has only two values.
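To make the objection to P3 concrete, here is a minimal sketch (an illustration of my own, not anything from Searle or Swobe): a rewriter for unary addition whose rules refer only to symbol shapes, never to meanings, yet whose output is always semantically correct arithmetic. Purely formal symbol manipulation can track semantics.

# A minimal sketch, assuming nothing beyond the standard library:
# a purely syntactic rewriter for unary addition. The rules mention
# only symbol shapes ('1' and '+'), never numbers or meanings.

def add_unary(expr: str) -> str:
    """Rewrite 'x+y', where x and y are strings of '1's, using
    only string manipulation -- no notion of number anywhere."""
    left, right = expr.split("+")
    while left:
        left = left[:-1]   # erase a '1' on the left of the '+'...
        right += "1"       # ...and append a '1' on the right
    return right

# '111+11' means 3 + 2 under the intended interpretation, but the
# function never knows that; it only shuffles symbols, and the
# semantically correct answer comes out anyway.
assert add_unary("111+11") == "11111"   # 3 + 2 = 5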

 John K Clark   
