[ExI] Bender's Octopus (re: LLMs like ChatGPT)

Giovanni Santostasi gsantostasi at gmail.com
Fri Mar 24 05:30:37 UTC 2023


Gordon,
 I will read the paper and write a rebuttal to push back on her arguments.
She is wrong.
Giovanni

On Thu, Mar 23, 2023 at 6:40 PM Gordon Swobe <gordon.swobe at gmail.com> wrote:

>
>
> On Thu, Mar 23, 2023 at 7:16 PM Giovanni Santostasi <gsantostasi at gmail.com>
> wrote:
>
>> Gordon,
>> Basically, what Bender is saying is "if the training of an NLM is limited,
>> then the NLM would not know what certain words mean".
>>
>
> No, that is not what she is saying, though given how people are
> misunderstanding her thought experiment, I agree it is not as clear as it
> could be. She is saying, or rather reminding us, that there is a clear
> distinction to be made between form and meaning, and that these language
> models are trained only on form. Here is the abstract of the academic
> paper in which she and her colleague mention the thought experiment.
>
> --
> Abstract: The success of the large neural language models on many NLP
> tasks is exciting. However, we find that these successes sometimes lead to
> hype in which these models are being described as “understanding” language
> or capturing “meaning”. In this position paper, we argue that a system
> trained only on form has a priori no way to learn meaning. In keeping with
> the ACL 2020 theme of “Taking Stock of Where We’ve Been and Where We’re
> Going”, we argue that a clear understanding of the distinction between form
> and meaning will help guide the field towards better science around natural
> language understanding.
> --
> Bender is a computational linguist at the University of Washington. I
> think I read that she is actually the head of the department.
>
> the paper:
>
> https://docslib.org/doc/6282568/climbing-towards-nlu-on-meaning-form-and-understanding-in-the-age-of-data-gts
>