[ExI] The symbol grounding problem in strong AI

Aware aware at awareresearch.com
Mon Dec 21 01:48:25 UTC 2009


On Sun, Dec 20, 2009 at 4:44 PM, Gordon Swobe <gts_2000 at yahoo.com> wrote:
> --- On Sun, 12/20/09, Aware <aware at awareresearch.com> wrote:
>
> I think I've seen this sort of psychedelic word salad before, but under a less ambiguous moniker. Hello again jef. Long time.

"Psychedelic word salad" could be taken as an insult, but I accept
that my writing is abstruse, although far from haphazard.  You might
try parsing it section by section, and ask for clarification or
expansion where necessary.

>
>> "Symbol grounding" is a non-issue when you understand, as I
>> tried to indicate earlier, that meaning (semantics) is not "in the
>> mind" but in the *observed effect* due to a particular stimulus.
>
> I won't argue that it does not appear as an observed effect due to a stimulus, but if the word "mind" has any meaning, then when I understand the meaning of anything else, I understand it there in my mind.

I scare-quoted "in the mind" for a reason, and followed up with
clarification intended to distinguish the "mind" which acts, from the
"mind" of the observer who reports meaning EVEN IF THEY ARE PARTS OF
THE SAME BRAIN.  It seems significant at a meta-level that you've
already contested my use of the word "mind" isolated from its context.

>> There is no "true, grounded meaning" of the stimulus
>
> That's fine. Meaning != truth.

Non sequitur. I didn't equate the two; I used "true" as a
qualifying adjective.  Again you argue with disregard for context.


>> nor is there any local need for interpretation or an interpreter.
>
> Somebody sits here in my chair. He wants to interpret the meanings of your funny words. He sits here locally. Really.

I used the word "local" to emphasize *local to the mind that acts* as
opposed to the logically separate observer.


>> Our evolved nature is frugal; there is stimulus and the system's
>> response, and any "meaning" is that reported by an observer, whether
>> that observer is another person, or even the same person associated with
>> that mind.
>
> Good that you at least allow the existence of minds. That's a start. Now then, when that observer reports the meaning of a word to or in his own mind, whence comes the understanding of the meaning?

That's the point: there is no meaning in the system except in terms of
an observer (and that observer could very well be a function of the
same brain implementing that which is observed).


> More importantly, how do we get that in software? How can a program get semantics?

You don't program in semantics.  You *observe* semantics in the
perception of "meaningful" behavior, and meaningful behavior is
achieved by methods that adapt the agent system to its environment.
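The claim can be made concrete with a toy sketch (entirely hypothetical; the stimuli, rewards, and names below are my own illustration, not anything from the discussion). No symbol is ever assigned a meaning in the code: the agent only adjusts response weights against a reward signal from its environment, and "meaning" appears only in an observer function that reports on the adapted behavior afterward.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

STIMULI = ["A", "B"]
RESPONSES = ["approach", "avoid"]

# Environment: "A" rewards approach, "B" rewards avoidance.
# The agent never sees this table; it only receives the reward value.
def reward(stimulus, response):
    good = {("A", "approach"), ("B", "avoid")}
    return 1.0 if (stimulus, response) in good else 0.0

# Agent: a bare table of weights, with no built-in semantics.
weights = {(s, r): 0.0 for s in STIMULI for r in RESPONSES}

def act(stimulus, explore=0.1):
    # Occasionally try a random response; otherwise act greedily.
    if random.random() < explore:
        return random.choice(RESPONSES)
    return max(RESPONSES, key=lambda r: weights[(stimulus, r)])

# Adaptation loop: nudge each tried weight toward the reward received.
for _ in range(500):
    s = random.choice(STIMULI)
    r = act(s)
    weights[(s, r)] += 0.1 * (reward(s, r) - weights[(s, r)])

# Observer: attributes "meaning" only by watching adapted behavior.
def observer_report(stimulus):
    if act(stimulus, explore=0.0) == "approach":
        return "stimulus means 'safe'"
    return "stimulus means 'threat'"

print(observer_report("A"))
print(observer_report("B"))
```

The point of the sketch is that the "semantics" reported at the end (`'safe'`, `'threat'`) exists nowhere in the agent's table of numbers; it is the observer's description of behavior that adaptation produced.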


>> We act according to our nature within context.
>
> I've seen that "within context" qualifier many times before, also from the same jef I remember. :)

Yes, I often emphasize it because it's a frequent and characteristic
blind spot of many of my highly analytical but reductionism-prone
friends.

- Jef
