[ExI] Semiotics and Computability
lacertilian at gmail.com
Thu Feb 4 21:41:48 UTC 2010
Gordon Swobe <gts_2000 at yahoo.com>:
> Stathis wrote:
>> Searle would say that there
>> needs to be an extra step whereby the symbol so grounded gains
>> "meaning", but this extra step is not only completely mysterious, it
>> is also completely superfluous, since every observable fact about
>> the world would be the same without it.
> No, he would remind you of the obvious truth there exist facts in the world that have subjective first-person ontologies. We can know those facts only in the first-person but they have no less reality than those objective third-person facts that as you say "would be the same without it".
You're both wrong! Only I am right! Me!
From my limited research, it appears Searle has never said anything
about some unknown extra step necessary to produce meaning. If you
think his arguments imply any such thing, that's your extrapolation,
not his. The Chinese room argument isn't chiefly about meaning: it's
about understanding. They're extremely different things. We take
meaning as input and output, or at least feel like we do, but we
simply HAVE understanding.
And no, it isn't a substance. It's a measurable phenomenon. Not easily
measurable, but measurable nonetheless.
Secondly, "facts with subjective first-person ontologies" is a
nightmarishly convoluted phrase. Does the universe even have facts in
it, technically speaking? I suppose what I'm meant to do is pick a
component of my subjective experience, say, my headache, and call it a
fact. Then I say the fact of my headache has a subjective first-person
ontology. But that's redundant: all subjective things are first-person
things, and vice-versa. And "ontology" actually means "the study of
existence". I don't think the fact of my headache has any kind of
study, let alone such an esoteric one. Gordon must have meant
"existence", not "ontology".
Searle uses that same terminology. It makes things terribly difficult.
So to say something has "subjective first-person ontology" really
means it "exists only for the subject". There are facts (my headache)
which exist only for the subject (me). Ah! Now it makes sense. I even
have a word for facts like that: "delusions".
It's a low blow, I know. It shouldn't be, but it is.
Really, it just means we're too hard on especially delusional people.
We need delusions in order to function. They aren't inherently bad.
Who was it that wrote the paper describing how a delusion of self is
unavoidable when implementing a general-purpose consciousness such as
myself? I liked that paper. It appealed to my nihilistic side, which
is also the rest of me.
Ugh, this is going to drive me crazy. I have to remember some keywords
to search for. He used a very specific term to refer to that delusion.
"Distributed agent" was used in the paper, I think, but not the
message that linked to the paper...