<div dir="ltr">Gordon, <br> I will read the paper and write one to push back on her arguments. She is wrong. <br>Giovanni </div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Mar 23, 2023 at 6:40 PM Gordon Swobe <<a href="mailto:gordon.swobe@gmail.com">gordon.swobe@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr"><div dir="ltr"><br></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Mar 23, 2023 at 7:16 PM Giovanni Santostasi <<a href="mailto:gsantostasi@gmail.com" target="_blank">gsantostasi@gmail.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">Gordon,<br>Basically what Bender is saying is "if the training of an NLM is limited, then the NLM would not know what certain words mean".</div></blockquote><div><br>No, that is not what she is saying, though seeing as how people are misunderstanding her thought experiment, I must agree the experiment is not as clear as it could be. She is saying, or rather reminding us, that there is a clear distinction to be made between form and meaning, and that these language models are trained only on form. Here is the abstract of her academic paper in which she and her colleague mention the thought experiment.<br><br>--<br>Abstract: The success of the large neural language models on many NLP tasks is exciting. However, we find that these successes sometimes lead to hype in which these models are being described as “understanding” language or capturing “meaning”. In this position paper, we argue that a system trained only on form has a priori no way to learn meaning. 
In keeping with the ACL 2020 theme of “Taking Stock of Where We’ve Been and Where We’re Going”, we argue that a clear understanding of the distinction between form and meaning will help guide the field towards better science around natural language understanding.<br>--<br>Bender is a computational linguist at the University of Washington. I think I read that she is actually the head of the department.<br><br>the paper:<br><a href="https://docslib.org/doc/6282568/climbing-towards-nlu-on-meaning-form-and-understanding-in-the-age-of-data-gts" target="_blank">https://docslib.org/doc/6282568/climbing-towards-nlu-on-meaning-form-and-understanding-in-the-age-of-data-gts</a></div></div></div>
</blockquote></div>