[ExI] GPT-4 on its inability to solve the symbol grounding problem

Brent Allsop brent.allsop at gmail.com
Thu Apr 13 01:36:02 UTC 2023


Hi Jason,

On Wed, Apr 12, 2023 at 8:07 AM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> Thus the simulation, like the isomorphic graph, by preserving all the same
> relationships recovers all the same properties. If the glutamate molecule
> possesses redness, then the perfect simulation of glutamate will possess
> redness too.
>
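(For concreteness, the isomorphic-graph point can be sketched in a few
lines of Python. The graphs, the relabeling map, and the
degree-sequence property below are hypothetical illustrations, not
anything specified in the thread.)

    # Graph G1 as an adjacency map, and an isomorphic copy G2 obtained
    # by relabeling its nodes through the mapping f.
    G1 = {"a": {"b", "c"}, "b": {"a"}, "c": {"a"}}
    f = {"a": "x", "b": "y", "c": "z"}
    G2 = {f[u]: {f[v] for v in nbrs} for u, nbrs in G1.items()}

    def degree_sequence(g):
        # A purely relational property: the sorted list of node degrees.
        return sorted(len(nbrs) for nbrs in g.values())

    # Any property defined purely by the relationships survives the
    # relabeling; this is the sense in which an isomorphic copy
    # "recovers" the same properties.
    assert degree_sequence(G1) == degree_sequence(G2)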

ALL of our objective observations of physics can be fully described
with abstract text.
Everything you could simulate can also be described with abstract text.
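(As a minimal sketch of that claim, assume a toy "simulation" whose
entire state is just a Python dict; the fields are made up for
illustration. Serializing the state to JSON shows the whole simulation
is capturable as abstract text.)

    import json

    # A toy simulation state; the fields are hypothetical.
    state = {"molecule": "glutamate",
             "positions": [[0.0, 0.1, 0.2], [0.3, 0.4, 0.5]],
             "bonds": [[0, 1]]}

    text = json.dumps(state)           # the entire state, as a text string
    assert json.loads(text) == state   # nothing is lost in the text form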

But there is no way you can communicate to someone what redness is
like with text alone.
You MUST have pictures to produce the subjective experience before
someone can know what redness is like.

There must be certain stuff in the brain which can be computationally
bound, and which produces something beyond what can be described via
abstract text.
You can abstractly describe all of it, you can objectively observe all
of it with your senses, and you can abstractly simulate all of it.
But until it is physically computationally bound with the rest of your
consciousness, you can't know the true quality you are only abstractly
describing and simulating.

In other words, just as abstract text can't communicate the nature of
qualities, an abstract simulation can't produce anything more than
abstract text can describe.
At least, that is what I predict.


More information about the extropy-chat mailing list