[ExI] Semiotics and Computability
msd001 at gmail.com
Fri Feb 19 14:32:24 UTC 2010
On Fri, Feb 19, 2010 at 8:19 AM, Gordon Swobe <gts_2000 at yahoo.com> wrote:
> The man has responsibility for his own actions, but neither the man nor the programmer can use English semantics to translate the Chinese to English in such a way that the man has access to those translations. In other words we cannot simply hand the man a Chinese/English database. We cannot do this because the point of the experiment is to see if the man or the program he represents can glean semantics from syntactical rules only. I hope that answers your question.
> I wanted to encourage you to consider the man as literally a man for this reason: the experiment tells us something about how real people think. The man has a normal human brain but he cannot get semantics from syntax.
Maybe semantics can't be gleaned directly from a syntactical rule, but
over the course of the man's time in the box he will observe patterns
and hypothesize meanings, which repeated observation can reinforce.
Call it symbol grounding in experience. If you also constrain the man
with a memory that lasts only for the duration of a single lookup / IO
transform, then you are back to a simple state machine, which isn't
particularly interesting or worthy of this much discussion.
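The hypothesize-and-reinforce process above can be sketched as a toy program (names and the weather example are my own illustration, not anything from the thought experiment itself): the man tallies which observed outcome follows each symbol, and his working hypothesis for a symbol is simply the outcome reinforced most often. Wiping the tallies after every lookup would reduce this to the memoryless state machine mentioned above.

```python
from collections import defaultdict, Counter

class Grounder:
    """Toy model of grounding symbols in repeated observation."""

    def __init__(self):
        # per-symbol tally of observed outcomes
        self.counts = defaultdict(Counter)

    def observe(self, symbol, outcome):
        # each co-occurrence reinforces the symbol/outcome pairing
        self.counts[symbol][outcome] += 1

    def hypothesis(self, symbol):
        # current best guess: the most-reinforced outcome, or None if unseen
        tally = self.counts[symbol]
        return tally.most_common(1)[0][0] if tally else None

g = Grounder()
for outcome in ["rain", "rain", "sun", "rain"]:
    g.observe("yu", outcome)   # "yu" stands in for a Chinese character
print(g.hypothesis("yu"))      # prints "rain": the reinforced hypothesis
```

Nothing here claims the man thereby *understands* Chinese; the sketch only shows that memory across lookups makes the hypothesis-forming behavior possible at all, which is exactly what the single-lookup memory constraint removes.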