[ExI] The symbol grounding problem in strong AI

Stathis Papaioannou stathisp at gmail.com
Thu Dec 17 09:59:13 UTC 2009


2009/12/17 Gordon Swobe <gts_2000 at yahoo.com>:

> Because
>
> 1) Programs are formal (syntactic)
>
> and because
>
> 2) Minds have mental contents (semantics)
>
> and because
>
> 3) Syntax is neither constitutive of nor sufficient for semantics
>
> It follows that
>
> 4) Programs are neither constitutive of nor sufficient for minds.
>
> If you think you see a logical problem then show it to me.

The formal problem with the argument is that 4) is assumed in 3):
premise 3) only holds if programs running on computers cannot have
semantics, which is just what the argument sets out to prove. If
programs are syntactic and programs running on computers can have
semantics, then syntax is sufficient for semantics. Moreover, if
programs running on computers are syntactic, then so are brains: a
computer running a program is at bottom just a collection of physical
parts interacting according to the laws of physics, and so is a brain.
Without assuming the answer to begin with, what reason is there to
suppose that the matter jiggling around in a brain has semantics while
the matter in a computer exhibiting similar intelligent behaviour does
not?


-- 
Stathis Papaioannou


