[ExI] The symbol grounding problem in strong AI.

John Clark jonkc at bellsouth.net
Thu Dec 17 17:22:08 UTC 2009


On Dec 16, 2009, Gordon Swobe wrote:

>  you will need first to show me a flaw in Searle's formal argument. 

What formal argument? All Searle did was invent a very silly thought experiment. 
> 
> Programs are formal (syntactic) 

Ok
> 
> and because Minds have mental contents (semantics)

So just like Searle, you assert that minds have mental contents and then claim that assertion proves that minds have mental contents. Provided, of course, that the brain in question is made of meat and not silicon. Pretty silly, don't you think?

> Syntax is neither constitutive of 

Good God almighty, you think syntax is not even *constitutive* of mind!!

> nor sufficient for semantics

So just like Searle, you assert that syntax is neither constitutive of nor sufficient for semantics, and then claim that assertion proves that syntax is neither constitutive of nor sufficient for semantics. If A=B then A=B. Pretty silly, don't you think?

In Goedel's famous proof he found a way for a formal system to make statements about itself. That tells me the syntax/semantics divide, which some philosophers who have never taken a high school biology course think is so fundamental, is in reality an entirely man-made distinction with no clear boundary between the two.
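To make that self-reference trick concrete, here is a minimal toy sketch in Python (my own illustration, not Goedel's actual prime-factorization construction; the mini-alphabet and the names godel_number/decode are invented for the example). It gives every formula of a tiny formal language a unique integer, so a system whose statements are about integers can, in principle, make statements about its own formulas:

    # Toy Goedel numbering: encode each formula of an invented
    # mini-language as a unique integer, and decode it back.
    SYMBOLS = "0S=+()x~"     # hypothetical alphabet: zero, successor, etc.
    BASE = len(SYMBOLS) + 1  # digit 0 is never used, so decoding is unambiguous

    def godel_number(formula: str) -> int:
        """Encode a formula as an integer (base-BASE digits 1..8)."""
        n = 0
        for ch in formula:
            n = n * BASE + (SYMBOLS.index(ch) + 1)
        return n

    def decode(n: int) -> str:
        """Invert godel_number, recovering the original formula."""
        chars = []
        while n:
            n, d = divmod(n, BASE)
            chars.append(SYMBOLS[d - 1])
        return "".join(reversed(chars))

    f = "S(0)=S(0)"            # "1 = 1" in the mini-language
    g = godel_number(f)
    assert decode(g) == f      # the encoding is lossless
    print(f, "->", g)          # the formula is now a number that
                               # arithmetic itself can talk about

The point of the sketch: once formulas are numbers, the line between "statements about syntax" and "statements about numbers" is a matter of bookkeeping, not a metaphysical boundary.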

> Programs are neither constitutive of nor sufficient for minds.

I can't emphasize enough that the above statement places you squarely in the anti-Darwin camp, because if programs, that is, rule-following physical processes, were neither constitutive of nor sufficient for minds, there would be no way evolution, which builds nothing but such processes, could have produced consciousness.

 John K Clark


