[ExI] The symbol grounding problem in strong AI

Gordon Swobe gts_2000 at yahoo.com
Fri Dec 18 00:27:07 UTC 2009


--- On Thu, 12/17/09, Stathis Papaioannou <stathisp at gmail.com> wrote:

>>> If programs are syntactic and programs running on
>>> computers can have semantics, then syntax is sufficient for
>>> semantics.
>>
>> That's a valid argument but not necessarily a true
>> one. You've simply put the conclusion you want to see (that
>> programs can glean semantics from syntax) into the
>> premises.
> 
> And you and Searle have assumed the opposite, when it is
> the thing under dispute.

No, Searle only assumes exactly what he states he assumes:

P1) Programs are formal (syntactic) [which is NOT to say they have no semantics or that they cannot cause or have minds]
P2) Minds have mental contents (semantics)
P3) Syntax by itself is neither constitutive of nor sufficient for semantics.

That's all he assumes, Stathis. Nothing more, nothing less.

To prove him wrong we need either to show that one of his premises is false or to show that his conclusion (that programs don't cause minds) doesn't follow.

>> In other words your argument is not about Searle
>> begging the question. If programs are syntactic and can also
>> glean semantics from syntax then Searle's premise 3 is
>> simply false. You just need to show how P3 is false for programs
>> or for people.
> 
> It is false for people, since people are manifestly
> conscious. 

P3 is about syntax and semantics, whether in a program, in a conscious person, or in a book; it doesn't matter which. And on Searle's view, not even a conscious person can get semantics from syntax.

If you're serious about this subject, then it's very important that you look closely at his words and not read anything into them.

-gts
