[ExI] e: GPT-4 on its inability to solve the symbol grounding problem
gsantostasi at gmail.com
Mon Apr 17 21:29:36 UTC 2023
I think the grounding that happens in real life is valuable mostly because
it helps with survival. But it is not unique as a way of finding meaning
via relationships. When we recognize an apple, it is really a complex
process. It is not a one-to-one association; it is an abstraction built
from the experience of seeing many apples, comparing apples to non-apples,
and deciding what is or is not essential about an apple.
You are correct in asserting that grounding in real life is valuable for
survival, as it allows us to associate our experiences with objects and
concepts that are crucial to our well-being. However, as you've pointed
out, the process of recognizing an object like an apple is not a simple
one-to-one association. It involves a complex network of abstractions and
relationships derived from our experiences.
The process of recognizing an apple, for example, involves understanding
various characteristics, such as shape, color, texture, taste, and more. We
form abstractions by comparing apples to other objects and identifying
their unique features. Furthermore, we also establish relationships with
other related concepts, such as food, nutrition, or the apple tree from
which it originates.
This intricate web of relationships and abstractions contributes to the
richness and flexibility of human language and cognition. By recognizing
the importance of both grounding and relationships, we can better
appreciate the complexity of meaning in language and the role that both of
these elements play in our understanding of the world.
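The abstraction process described above can be sketched computationally: a category like "apple" emerges as a prototype averaged over many observed examples, and recognition becomes a comparison against prototypes rather than a one-to-one lookup. A minimal sketch in Python (the feature names and values are invented for illustration):

```python
import math

# Invented feature vectors: (roundness, redness, sweetness), all in [0, 1].
# Each category is learned from several examples, not a 1-to-1 association.
apples = [(0.9, 0.8, 0.7), (0.85, 0.9, 0.6), (0.95, 0.7, 0.75)]
bananas = [(0.2, 0.1, 0.8), (0.25, 0.05, 0.85), (0.15, 0.1, 0.9)]

def prototype(examples):
    """Abstract a category as the mean of its example feature vectors."""
    n = len(examples)
    return tuple(sum(v[i] for v in examples) / n for i in range(3))

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(item, prototypes):
    """Recognize an item by comparing it to every category prototype."""
    return min(prototypes, key=lambda name: distance(item, prototypes[name]))

prototypes = {"apple": prototype(apples), "banana": prototype(bananas)}
print(classify((0.88, 0.75, 0.7), prototypes))  # prints "apple"
```

The point of the sketch is that "apple" exists here only as a statistical abstraction over comparisons, which is the claim the paragraph above makes about human recognition.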
What I'm trying to say is that grounding is a sort of illusion, because if
you look deeper into grounding, it is all based on establishing
relationships between things. The fact that sensory experiences are
involved doesn't make this type of grounding so different from something a
machine could do in a virtual world, for example. They are essentially the
same process.
I understand your point better now. You're suggesting that grounding,
whether it is based on sensory experiences or abstract relationships, is
fundamentally about establishing connections between objects or concepts.
This means that the process of grounding through sensory experiences is not
inherently different from the process of grounding that a machine might
perform in a virtual environment, as both involve establishing
relationships between things.
From this perspective, the distinction between "grounding" and
"relationships" becomes less clear. Instead, it could be seen that
grounding is actually an emergent property of the relationships and
connections that we form between objects and concepts, whether they are
based on sensory experiences or more abstract associations.
This viewpoint highlights the importance of relationships in forming
meaning, even in the absence of direct sensory experiences. It also
suggests that machines, like humans, can potentially create meaningful
symbols and language by establishing relationships between objects and
concepts within their own virtual environments.
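The idea that meaning can come from relationships alone, with no sensory referent anywhere, can be sketched as a toy relation graph: concepts are defined only by their links to other concepts, and similarity falls out of the overlap of those links. A minimal sketch (the concepts and relations are invented for illustration):

```python
# A toy relational "world": each concept is defined only by its links to
# other concepts -- no sensory data anywhere. (Relations are invented.)
relations = {
    "apple":  {"fruit", "tree", "food", "sweet"},
    "orange": {"fruit", "tree", "food", "sweet"},
    "hammer": {"tool", "metal", "nail"},
    "nail":   {"tool", "metal", "hammer"},
}

def similarity(a, b):
    """Jaccard overlap of neighborhoods: meaning from shared relationships."""
    na, nb = relations[a], relations[b]
    return len(na & nb) / len(na | nb)

# "apple" ends up close to "orange" and far from "hammer" purely from
# the structure of the relation graph, with no referent involved.
print(similarity("apple", "orange"))  # prints 1.0
print(similarity("apple", "hammer"))  # prints 0.0
```

This is the same move distributional semantics makes at scale: the "position" of a symbol among other symbols does the work that direct grounding is usually assumed to do.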
On Mon, Apr 17, 2023 at 2:22 PM Giovanni Santostasi <gsantostasi at gmail.com>
wrote:
> *Start with three main components:
> a. Sensorimotor experience (perception and action)
> b. Symbolic representation (language, symbols)
> c. Grounding (the process that connects symbols to experience)*
> Gordon,
> I have already said that asking things directly to GPT-4 is not always the
> best way to test its knowledge of itself. But you are using it as a tool to
> help you with your imagination and to express your ideas, which is ironic.
> The problem with the above is the "process that connects symbols to
> experience". But how do you get symbols in the first place?
> Do you realize that making symbols is the exact opposite of "grounding"?
> It is the exact opposite process because you go from a "real" apple in
> front of you to a generalization of apple that is based on seeing many
> apples, excluding other objects from the apple category, and so on. You
> need to compare apples to apples (see what I did there?) and apples to
> oranges to understand what an apple is. It is about the relationships, the
> connections. The grounding you talk about is one possible way to do it
> (one that is relevant for evolutionary purposes), but it is not the ONLY
> way to do it. Any relationship would do.
> I don't have time right now, but later I will do an experiment. I will
> come up with invented objects that have certain common characteristics,
> like edges (basically geometrical shapes). Then I will train an AI to
> create categories based on whatever the AI thinks these objects have in
> common. I bet you that it will be able to create categories just from
> comparing these objects with each other, even if these objects are
> abstract (I said shapes, but they could be described by a set of numbers
> or even a rule to make them). Once you have the categories, you can assign
> words to them so we know what we are talking about, and then have a convo.
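The proposed experiment is essentially unsupervised clustering: objects exist only as vectors of numbers, categories emerge from comparing them, and words are attached afterwards. A minimal sketch using a hand-rolled k-means (all object values and category names are invented; a real experiment would use richer descriptors):

```python
import random

random.seed(0)

# Invented abstract "objects": each is just a pair of numbers (think shape
# descriptors or generating rules), with no real-world referent at all.
group_one = [(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(20)]
group_two = [(random.gauss(5, 0.3), random.gauss(5, 0.3)) for _ in range(20)]
objects = group_one + group_two

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def mean(points):
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def kmeans(points, centers, iters=20):
    """Minimal k-means: categories emerge purely from comparing objects."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)), key=lambda c: dist2(p, centers[c]))
            groups[i].append(p)
        centers = [mean(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# Deterministic initialization for this sketch: first and last object.
centers, groups = kmeans(objects, [objects[0], objects[-1]])

# Only after the categories exist do we attach words to them.
for name, g in zip(["category_A", "category_B"], groups):
    print(name, len(g))
```

The categories are discovered before any word is assigned, which is the "opposite of grounding" order the email argues for.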
> Language can be built from relationships without referents to the real
> world; or, better said, the only way to build meaning in language is to
> rely on relationships. The grounding process you have in mind, Gordon, is
> not what really happens in reality. Maybe the grounding you imagine gives
> you solace, but it is not really what happens in real life.
> On Mon, Apr 17, 2023 at 1:58 PM Gordon Swobe via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>> On Mon, Apr 17, 2023 at 1:58 PM Ben Zaiboc via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>> I suppose that's what I want, a graphical representation
>>> of what you mean by 'grounding', incorporating these links.
>> Not sure how to do it incorporating your links. I started scratching my
>> head trying to think of the best way to diagram it, then it occurred to me
>> to ask GPT-4. It certainly "understands" the symbol grounding problem and
>> why it cannot solve it for itself. Here is its solution.
>> How would you draw a diagram of symbol grounding to make it easy to
>> understand?
>> To create an easily understandable diagram of symbol grounding, follow
>> these steps:
>> Start with three main components:
>> a. Sensorimotor experience (perception and action)
>> b. Symbolic representation (language, symbols)
>> c. Grounding (the process that connects symbols to experience)
>> Draw three circles, one for each component, and label them accordingly:
>> a. A circle for "Sensorimotor Experience" at the bottom left
>> b. A circle for "Symbolic Representation" at the bottom right
>> c. A circle for "Grounding" at the top center
>> Connect the circles with arrows to represent the flow of information:
>> a. Draw an arrow from "Sensorimotor Experience" to "Grounding"
>> b. Draw an arrow from "Symbolic Representation" to "Grounding"
>> c. Draw an arrow from "Grounding" back to both "Sensorimotor Experience"
>> and "Symbolic Representation"
>> Add brief descriptions to each arrow to clarify the processes:
>> a. Arrow from "Sensorimotor Experience" to "Grounding": "Extraction of
>> features and patterns"
>> b. Arrow from "Symbolic Representation" to "Grounding": "Association of
>> symbols with features and patterns"
>> c. Arrow from "Grounding" back to "Sensorimotor Experience": "Guided
>> perception and action"
>> d. Arrow from "Grounding" back to "Symbolic Representation": "Generation
>> of meaningful symbols"
>> Add examples to make the diagram more relatable:
>> a. Next to "Sensorimotor Experience", you can add an image of a cat and a
>> person observing the cat
>> b. Next to "Symbolic Representation", you can write the word "cat" or
>> draw a simple icon representing a cat
>> c. Add an image of a brain or a thinking cloud in the "Grounding" circle
>> With these steps, you will create a diagram that visually represents the
>> concept of symbol grounding, making it easier to understand. The examples
>> and descriptions help clarify the connections between sensorimotor
>> experiences, symbolic representations, and the grounding process.
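GPT-4's five steps map directly onto a graph description. A small sketch that emits Graphviz DOT source for the three-circle layout (the node keys are invented; rendering with the external `dot` tool is optional, the script only builds the text):

```python
# Build Graphviz DOT source for the three-circle grounding diagram.
nodes = {
    "experience": "Sensorimotor Experience",
    "symbols": "Symbolic Representation",
    "grounding": "Grounding",
}
edges = [
    ("experience", "grounding", "extraction of features and patterns"),
    ("symbols", "grounding", "association of symbols with features and patterns"),
    ("grounding", "experience", "guided perception and action"),
    ("grounding", "symbols", "generation of meaningful symbols"),
]

lines = ["digraph symbol_grounding {"]
for key, label in nodes.items():
    lines.append(f'  {key} [shape=circle, label="{label}"];')
for src, dst, label in edges:
    lines.append(f'  {src} -> {dst} [label="{label}"];')
lines.append("}")
dot_source = "\n".join(lines)
print(dot_source)
```

Piping the output through `dot -Tpng` would produce the diagram GPT-4 describes, with the labeled arrows carrying the four processes from steps 3 and 4.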