[ExI] GPT-4 on its inability to solve the symbol grounding problem
ben at zaiboc.net
Sun Apr 16 13:17:09 UTC 2023
On 16/04/2023 05:07, Gordon Swobe wrote:
> On Sat, Apr 15, 2023 at 2:17 AM Ben Zaiboc via extropy-chat
> <extropy-chat at lists.extropy.org> wrote:
> > I have a suggestion.
> > Instead of 'ground', try using the word 'associate'. That seems to me
> > more useful.
> I think word-symbol association is another word for exactly what it
> does so well! GPT-4 does word association so *amazingly* well and so
> thoroughly over such a broad range of subjects that we are
> deceived into thinking it has solved the symbol grounding problem for
> itself even despite its denial of having solved it. Add to that
> amazing word-association functionality a bit of randomness (a
> parameter which can be controlled by the end user), and now we have
> what looks a lot like creativity. It is an amazing feat of software
> engineering and credit should go to the engineers, not the application.
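[As an aside, the "bit of randomness" parameter Gordon mentions is commonly called temperature in language-model interfaces. A minimal sketch of how such a parameter works, with made-up scores for three candidate next words (the numbers and names are illustrative, not GPT-4's actual values):

```python
import math

def softmax_with_temperature(logits, temperature):
    """Turn raw word scores into probabilities.

    Lower temperature sharpens the distribution (near-deterministic
    word choice); higher temperature flattens it (more "creative",
    i.e. more random, word choice).
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative raw scores for three candidate next words.
logits = [2.0, 1.0, 0.1]

cold = softmax_with_temperature(logits, 0.2)  # near-greedy: top word dominates
hot = softmax_with_temperature(logits, 2.0)   # flatter: alternatives more likely

print(cold)
print(hot)
```

At low temperature the highest-scoring word is chosen almost every time; at high temperature the also-rans get a real chance, which is what reads as creativity.]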
Hmm, let me try re-writing that, using my suggestion:
"I think word-symbol association is another word for exactly what it
does so well! GPT-4 does word association so *amazingly* well and so
thoroughly over such a broad range of subjects that we are deceived into
thinking it has solved the word-symbol association problem ..."
There, much better.
Now, I strongly suspect you're going to say "No no, grounding and
association are different things!", so if you'd be so good as to explain
to us exactly what the difference is, and why my suggestion won't work,
taking into account that associations between neural signals are the only
kind of information-processing that brains can do (and if you disagree
with /that/, please give your evidence that it's not true), we might
make some progress.