[ExI] Semiotics and Computability

Gordon Swobe gts_2000 at yahoo.com
Fri Feb 19 13:19:58 UTC 2010

--- On Thu, 2/18/10, Mike Dougherty <msd001 at gmail.com> wrote:

> Again I ask if the man is a metaphor or an actual
> man.  

You should consider him an actual man, even a brilliant man, but one who has no tools available to him other than those specified in the experiment.

> Suppose further that there is some amount of processing
> ability beyond those required of the mashed-potato driven
> machine. With that pattern recognition the man is able to notice the
> pattern "squiggle->squoogle" so that he no longer needs to
> compare the shape to the lookup/transformation rules because he's
> internalized that knowledge (commit to memory, established onboard
> knowledgebase, whatever)  

Sure, we've discussed that already. Go ahead and imagine the man as having committed all the rules to memory and optimized his performance. He becomes the entire system.
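The rule-following being described is purely shape-to-shape lookup. A minimal sketch of that idea, in Python (the symbol names "squiggle" and "squoogle" follow the example above; they stand in for Chinese characters the man does not understand, and the table itself is of course invented for illustration):

```python
# Hypothetical rulebook: maps input shapes to output shapes.
# Nothing in the table records what any symbol means -- only
# which shape follows which. This is syntax without semantics.
RULEBOOK = {
    "squiggle": "squoogle",
    "squoogle": "squiggle",
}

def apply_rules(symbols):
    """Transform each input shape by table lookup alone.

    The man (or program) consults only shapes, never meanings,
    whether he reads the table or has memorized it.
    """
    return [RULEBOOK[s] for s in symbols]

print(apply_rules(["squiggle", "squoogle"]))
```

Memorizing the table, as suggested above, changes where the lookup happens but not its character: it is still a syntactic transformation.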

> Are you going to argue that every action performed by the
> man in the box is part of the transformation process?  Are you
> taking away his apparent optimization intelligence by scripting his
> actions?  If you remove any responsibility from the man in the box, you 
> move the responsibility to the author of the script.  That's
> programming.  I thought you said programming isn't intelligence...  If
> I haven't stated this position clearly, please explain.

The man has responsibility for his own actions, but neither he nor the programmer may use English semantics to give him access to translations of the Chinese. In other words, we cannot simply hand the man a Chinese/English dictionary. We cannot do this because the point of the experiment is to see whether the man, or the program he represents, can glean semantics from syntactic rules alone. I hope that answers your question.

I wanted to encourage you to consider the man as literally a man for this reason: the experiment then tells us something about how real people think. The man has a normal human brain, yet he cannot get semantics from syntax.


