[ExI] The symbol grounding problem in strong AI

Gordon Swobe gts_2000 at yahoo.com
Mon Dec 14 15:03:13 UTC 2009


Stathis,

> The brain is comprised of dumb components which act
> together to create a mind. 
 
So it seems.
 
> The CR is comprised of smart and dumb components which
> act together to create a mind distinct from the mind of
> the smart component. 
 
According to the systems reply to the CRA, yes. As they have it (or had it), there exist two minds: a smart mind belonging to the room and a dumb mind belonging to the man. But the systems repliers missed Searle's point, so Searle re-illustrated the same symbol grounding problem in terms they would understand.
 
> The CR without the room is comprised of a smart
> component which acts to create a mind distinct from
> the mind of the smart component. 
 
"The CR without the room" is just a man whose brain does
nothing more than run a formal program. If formal programs
cause or have semantics, then the man should understand the
Chinese symbols that his mental program manipulates. But he
doesn't understand Chinese even while passing the TT. Ergo,
the brain does not overcome the symbol grounding problem
with formal programs.
 
Even if the brain does run formal programs (per the
computationalist theory of mind), it must do something else
besides. The computationalist theory of mind is then at best
incomplete and at worst completely false.

Says Searle.

-gts
