[ExI] Re: GPT-4 on its inability to solve the symbol grounding problem

Jason Resch jasonresch at gmail.com
Mon Apr 17 23:37:03 UTC 2023


On Mon, Apr 17, 2023 at 2:28 PM Gordon Swobe via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> My argument is about large language models. LLMs, in the purest sense of
> that term, are nothing like such a system. They have no eyes, no ears, no
> senses whatsoever to register anything outside of the text. They are
> trained only on symbolic text material. From their point of view, (so to
> speak), the corpus of text on which they are trained is the entire
> universe.
>

I agree that the text constitutes the entire universe for the LLM. But
don't lose sight of the fact that it was our universe that created that
text.

So in a sense, the universe of the LLM is the same as our universe; it is
just one step removed: it is our universe as interpreted by human minds.

Jason
