[ExI] Wernicke's aphasia and the CRA.

Gordon Swobe gts_2000 at yahoo.com
Sun Dec 13 14:21:55 UTC 2009


Hi Lee, good to see you again.

> Using the "systems reply" terminology, does the Chinese
> Room laugh at jokes?  Doth it have feelings?  Hath the
> CR not...

Laugh at jokes, yes, as we should require that much for a passing score on the Turing test. But have [subjective] feelings? These sorts of internal mental states are exactly what is at issue.

Lest we mangle Searle's metaphor, the "man" who internalized the rule book and stepped outside the room (in Searle's reply to his systems critics) would have only those feelings that relate to his puzzlement about the meanings of Chinese words, and we should consider those feelings irrelevant.

Remember, the man exists only as a sort of literary device to help people understand the symbol grounding problem. Searle wants us to see that formal programs do not understand the symbols they manipulate any more than a shopping list understands the words "bread" and "milk".
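If it helps, here is a toy sketch of my own (nothing Searle ever wrote) of what pure rule-book symbol manipulation amounts to: a lookup table pairing input strings with output strings, with no step that requires attaching meaning to either. The phrases in it are just placeholders I picked for illustration.

    # Illustrative sketch only: a "rule book" reduced to a lookup table.
    # The program maps Chinese input to Chinese output without any
    # grounding of either in the world.

    RULE_BOOK = {
        "你好吗？": "我很好，谢谢。",          # "How are you?" -> "I'm fine, thanks."
        "今天天气怎么样？": "今天天气很好。",  # "How's the weather?" -> "Nice today."
    }

    def chinese_room(symbols: str) -> str:
        # Pure symbol shuffling: match the shape of the input, emit the
        # shape the book pairs with it. No step here involves meaning.
        return RULE_BOOK.get(symbols, "请再说一遍。")  # "Please say that again."

    if __name__ == "__main__":
        print(chinese_room("你好吗？"))

The point of the sketch is only that every operation is defined over the form of the symbols, never their content; a much larger rule book changes the scale, not the character, of what is going on.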


> Anyway. Suppose that the CR is asked the question, "How may
> I here in L.A. this week make an atomic bomb, and revenge my
> poor Middle Eastern people against the Imperialists?"
> 
> And the CR responds---all in Chinese characters---by
> providing concise directions for building a backyard bomb!

Then I think we should have it arrested to protect the public, and let the philosophers continue their debate about a Chinese room that now sits in a jail cell. (Not sure of your point.)
 
>> [Moreover, say] the man internalizes the rule book
>> and steps outside the room. Different picture, same
>> symbol grounding problem.
> 
> By hypothesis, then the original "man" doesn't know a bit
> about what is being said, only the new "internalization" you
> speak of?

There exists no "original" man in the second thought experiment. Neither the man in the room in the original experiment nor the man alone in the second (and different) thought experiment understands Chinese symbols. Both thought experiments illustrate the symbol grounding problem.

> For me, this exposes to criticism the notion that the room
> isn't "a man".

Right. Behind Searle's figure of speech you will find a Universal Turing Machine that passes the Turing test without overcoming the symbol grounding problem. The UTM only appears to understand the symbols it manipulates, giving false positives on the TT.

Because brains overcome the symbol grounding problem and software/hardware systems do not, it appears we cannot describe the brain as a software/hardware system. Whatever the brain does, it does something besides run formal programs.

-gts