[ExI] Bender's Octopus (re: LLMs like ChatGPT)

William Flynn Wallace foozler83 at gmail.com
Fri Mar 24 18:58:55 UTC 2023


Would anyone dare to give a definition of 'understanding'?
Please cite what epistemologies you are using.  bill w

On Fri, Mar 24, 2023 at 1:40 PM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
>
> On Fri, Mar 24, 2023 at 1:21 PM Gordon Swobe via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>>
>> On Fri, Mar 24, 2023 at 2:12 AM Stuart LaForge via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>
>>> But really the meanings of words are quite arbitrary and determined by
>>> the people who use them. Thus the referential meanings of words evolve
>>> and change over time and come to refer to different things.
>>
>>
>> I agree this is a reason for many human miscommunications, but the
>> speaker understands his words to mean *something* and the hearer
>> understands those words to mean *something*.
>>
>> As a computational linguist, Bender is on our side.  She is obviously
>> very excited about the progress these language models represent, but she is
>> reminding us that the models do not actually understand words to mean anything
>> whatsoever.
>>
>>
>
> What's her evidence of that?
>
> Jason
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>