[ExI] Ben Goertzel on Large Language Models

Gordon Swobe gordon.swobe at gmail.com
Fri Apr 28 21:35:58 UTC 2023


On Fri, Apr 28, 2023 at 3:10 PM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> Yes. Already we see the "LLM" learns much more than just language. It can
> do math, chess, computer programming, compose musical melodies, and draw
> pictures. It learned all these skills and incorporated them all into the
> same model.
>

I think that from the perspective of language models, all these things
count as language.

When a musician reads sheet music, she is reading the language of music.
The music itself is the meaning of the musical language. When we refer to
how we appreciate a piece of music, we are referring to the referents of
this musical language.

And here again we have the same question I have posed about language models
and the English language. While the language model might be capable of
extraordinary things in the synthesis of musical language based on its deep
learning of the language of human music, perhaps composing symphonies
magnificent to the human ear, it has no access to the referents. It seems
to me that just as a pure language model cannot know the meanings of words,
it cannot appreciate the music.

-gts
