[ExI] Bender's Octopus (re: LLMs like ChatGPT)

Jason Resch jasonresch at gmail.com
Fri Mar 24 03:35:19 UTC 2023


On Thu, Mar 23, 2023, 11:17 PM Gordon Swobe via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Thu, Mar 23, 2023 at 8:39 PM Will Steinberg via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> I don't have a lot of faith in a person who has a hypothesis and designs
>> a thought experiment that is essentially irrelevant to the
>> hypothesis.
>>
>
> As I wrote, I agree the thought experiment does not illustrate her point
> clearly, at least outside of the context of her academic paper. As I've
> mentioned, the octopus is supposed to represent the state an LLM is in:
> completely disconnected from the meanings of words (referents), which
> exist only outside of language, in the real world represented by the
> islands. But it is a sloppy thought experiment if you don't already know
> what she is trying to say.
>
> It is about form vs. meaning. LLMs are trained only on, and know (so to
> speak) only, the forms and patterns of language. They are like very
> talented parrots, rambling on and on in seemingly intelligent ways,
> mimicking human speech, but never having any idea what they are talking
> about.
>

There's no way to read this paper, "Sparks of Artificial General
Intelligence" (https://arxiv.org/pdf/2303.12712.pdf), and come away with
the impression that GPT-4 has no idea what it is talking about.
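
To be fair to the "form only" point, the training objective really is
just next-token prediction over raw text. Here is a toy sketch of that
objective, purely for illustration (a count-based bigram model of my own
invention, nothing like GPT-4's actual architecture):

# A toy illustration (mine, not from the thread, and nothing like how
# GPT-4 actually works): a count-based bigram "language model" trained
# purely on which token follows which. It sees only form; it has no
# access whatsoever to referents.
import random
from collections import Counter, defaultdict

def train(corpus):
    """Count, for each token, the tokens that follow it."""
    counts = defaultdict(Counter)
    tokens = corpus.split()
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def generate(counts, start, length=10):
    """Sample a continuation token by token from the learned counts."""
    out = [start]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:
            break
        words, weights = zip(*followers.items())
        out.append(random.choices(words, weights=weights)[0])
    return " ".join(out)

model = train("the octopus taps the cable and the cable hums and the sea listens")
print(generate(model, "the"))  # plausible-looking form, zero grounding

The real question is whether scaling that bare objective up by many
orders of magnitude yields something more than parroting, and that is
exactly what the paper above speaks to.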

Jason



> -gts