[ExI] The symbol grounding problem in strong AI
Gordon Swobe
gts_2000 at yahoo.com
Wed Dec 23 02:57:56 UTC 2009
--- On Tue, 12/22/09, Stathis Papaioannou <stathisp at gmail.com> wrote:
>> The argument again:
>>
>> P1) Programs are formal (syntactic).
>> P2) Minds have mental contents (semantics).
>> P3) Syntax is neither constitutive of nor sufficient
>> for semantics.
>> C1) Programs are neither constitutive of nor
>> sufficient for minds.
> It is also possible that programs are *only* formal and yet can
> have minds, because P3 is false: syntax actually is constitutive
> of and sufficient for semantics.
Sure, we just need to show P3 false.
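To make the shape of the argument explicit, here is a minimal Lean sketch (my own rendering, not anything from Searle or this thread; System, Program, Mind, etc. are placeholder names). Note that it needs a deliberately strong reading of P3, that no purely syntactic system thereby has semantics, for the conclusion to go through, which is exactly why P3 is where the disagreement lives:

    variable (System : Type)
    variable (Program Mind Semantic Syntactic : System → Prop)

    -- P1: programs are formal (syntactic); P2: minds have semantics;
    -- P3 (strong reading): a merely syntactic system has no semantics.
    -- C1 then follows: no program is a mind.
    example
        (P1 : ∀ s, Program s → Syntactic s)
        (P2 : ∀ s, Mind s → Semantic s)
        (P3 : ∀ s, Syntactic s → ¬ Semantic s) :
        ∀ s, Program s → ¬ Mind s :=
      fun s hp hm => P3 s (P1 s hp) (P2 s hm)

On this reading the argument is formally valid, so anyone who accepts P1 and P2 and rejects the conclusion has to deny P3.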
> I base this on the fact that all my brain does is manipulate
> information, and yet I feel that I understand
> things. Searle of course disagrees because he takes it as
> axiomatic that symbol-manipulation can't give rise to understanding;
> but it also used to be taken as axiomatic that matter could not give
> rise to understanding.
Well, P3 is certainly open to debate. Can you show how syntax gives rise to semantics? Can you show how the man in the room, who does nothing more than shuffle Chinese symbols according to syntactic rules, can come to know the meanings of those symbols? If so, then you've cooked Searle's goose.
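For concreteness, here is a toy Python sketch of the room (my own illustration; the rulebook entries are invented). The point is that the program consults only the shapes of the symbols, never their meanings:

    # A purely syntactic rule-follower: input shapes map to output
    # shapes. Nothing here represents or requires understanding.
    RULEBOOK = {
        "\u4f60\u597d\u5417": "\u6211\u5f88\u597d",  # "how are you" -> "I am fine"
        "\u4f60\u662f\u8c01": "\u6211\u662f\u623f\u95f4",  # "who are you" -> "I am the room"
    }

    def chinese_room(input_symbols: str) -> str:
        """Answer by pure pattern-matching on uninterpreted tokens."""
        return RULEBOOK.get(input_symbols, "\u4e0d\u61c2")  # default: "don't understand"

    print(chinese_room("\u4f60\u597d\u5417"))

The lookup never touches meaning at any step, which is the intuition behind P3; the open question is whether scaling this up ever crosses into semantics.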
-gts