[ExI] The symbol grounding problem in strong AI

Gordon Swobe gts_2000 at yahoo.com
Thu Dec 17 14:10:22 UTC 2009


--- On Thu, 12/17/09, Stathis Papaioannou <stathisp at gmail.com> wrote:

> > Because
> >
> > 1) Programs are formal (syntactic)
> >
> > and because
> >
> > 2) Minds have mental contents (semantics)
> >
> > and because
> >
> > 3) Syntax is neither constitutive of nor sufficient for semantics
> >
> > It follows that
> >
> > 4) Programs are neither constitutive of nor sufficient for minds.
> >
> > If you think you see a logical problem then show it to me.
> 
> The formal problem with the argument is that 4) is assumed
> in 3). 

Premise 3 (P3) says nothing whatsoever about programs or minds. 
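
To see this concretely, here is a minimal sketch of the syllogism in Lean 4 (my own rendering, not Searle's; it reads "neither constitutive of nor sufficient for" in the strong sense that nothing purely syntactic thereby has semantics, and the predicate names are illustrative). Notice that the hypothesis p3 mentions neither Program nor Mind, yet the conclusion about programs and minds still follows:

    -- A minimal sketch: predicates over some domain of things.
    variable {Thing : Type}
    variable (Program Syntactic HasSemantics Mind : Thing → Prop)

    -- p1: programs are formal (syntactic)
    -- p2: minds have mental contents (semantics)
    -- p3: nothing purely syntactic thereby has semantics (strong reading)
    theorem programs_are_not_minds
        (p1 : ∀ x, Program x → Syntactic x)
        (p2 : ∀ x, Mind x → HasSemantics x)
        (p3 : ∀ x, Syntactic x → ¬ HasSemantics x) :
        ∀ x, Program x → ¬ Mind x := by
      intro x hprog hmind
      exact p3 x (p1 x hprog) (p2 x hmind)

The derivation is valid as stated; whether it is sound is exactly the question of whether p3, so read, is true.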

You argue:

> If programs are syntactic and programs running on computers
> can have semantics, then syntax is sufficient for semantics.

That's a valid argument but not necessarily a sound one. You've simply put the conclusion you want to see (that programs can glean semantics from syntax) into the premises.

In other words, your objection is not really that Searle begs the question. If programs are syntactic and can also glean semantics from syntax, then Searle's premise 3 is simply false. You just need to show how P3 is false, for programs or for people.

The thought experiment illustrates how P3 is true. The man in the room follows the rules of Chinese syntax, yet he has no idea what his words mean. 
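
As a toy sketch of that (my own, with a hypothetical two-entry rulebook standing in for Searle's), the room fits in a few lines of Python. Every step is string matching on symbol shapes; no step consults meaning:

    # A toy sketch of the man in the room: rule-following, no understanding.
    # The rulebook entries are hypothetical stand-ins for Searle's rules.
    rulebook = {
        "你好吗？": "我很好，谢谢。",        # "How are you?" -> "Fine, thanks."
        "你叫什么名字？": "我没有名字。",    # "What is your name?" -> "I have no name."
    }

    def room(symbols: str) -> str:
        # Match the incoming squiggles against the rules and emit the
        # prescribed squiggles. Nothing here represents meaning.
        return rulebook.get(symbols, "请再说一遍。")  # "Please say that again."

    print(room("你好吗？"))  # fluent Chinese out, zero understanding inside

The function produces fluent-looking Chinese for inputs it recognizes, but there is nothing in it for the symbols to mean anything to.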

-gts