[ExI] Bender's Octopus (re: LLMs like ChatGPT)

Jason Resch jasonresch at gmail.com
Fri Mar 24 09:31:54 UTC 2023


Yes this is the paper I referred to. I am not sure why the link didn't work
for you. It downloads a rather large (~7 MB) PDF in the background, perhaps
that's why the link appears to do nothing when clicked?

Jason

On Fri, Mar 24, 2023, 1:42 AM Giovanni Santostasi <gsantostasi at gmail.com>
wrote:

> Yes, as always, Jason's understanding of these topics shines through. LET'S
> DO EXPERIMENTS! Not garbage philosophical navel-gazing.
> I think this is the paper Jason linked (but PDF doesn't work on the
> archive for some reason).
>
>
> https://www.lesswrong.com/posts/mmxPbFz7wvthvHCxq/sparks-of-artificial-general-intelligence-early-experiments
>
>
> On Thu, Mar 23, 2023 at 8:36 PM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>>
>> On Thu, Mar 23, 2023, 11:17 PM Gordon Swobe via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>> On Thu, Mar 23, 2023 at 8:39 PM Will Steinberg via extropy-chat <
>>> extropy-chat at lists.extropy.org> wrote:
>>>
>>>> I don't have a lot of faith in a person who has a hypothesis and
>>>> designs a thought experiment that is essentially completely irrelevant to
>>>> the hypothesis.
>>>>
>>>
>>> As I wrote, I agree the thought experiment does not illustrate her point
>>> clearly, at least outside of the context of her academic paper. As I've
>>> mentioned, the octopus is supposed to represent the state an LLM is in --
>>> completely disconnected from the meanings of words (referents)
>>> that exist only outside of language in the real world represented by the
>>> islands. But it is a sloppy thought experiment if you don't know what she
>>> is trying to say.
>>>
>>> It is about form vs. meaning. LLMs are trained only on, and only know (so
>>> to speak), the forms and patterns of language. They are like very talented
>>> parrots, rambling on and on in seemingly intelligent ways, mimicking human
>>> speech, but never having any idea what they are talking about.
>>>
>>
>> There's no way to read this paper: https://arxiv.org/pdf/2303.12712.pdf
>> and come away with the impression that GPT-4 has no idea what it is talking
>> about.
>>
>> Jason
>>
>>
>>
>>> -gts
>>> _______________________________________________
>>> extropy-chat mailing list
>>> extropy-chat at lists.extropy.org
>>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>>
>>
>

