[ExI] Bender's Octopus (re: LLMs like ChatGPT)

Jason Resch jasonresch at gmail.com
Thu Mar 23 22:25:17 UTC 2023


On Thu, Mar 23, 2023, 6:21 PM spike jones via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> From: extropy-chat <extropy-chat-bounces at lists.extropy.org> On Behalf Of Adrian Tymes via extropy-chat
> Subject: Re: [ExI] Bender's Octopus (re: LLMs like ChatGPT)
>
> On Thu, Mar 23, 2023, 12:56 PM Stuart LaForge via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
> I posed this exact question to ChatGPT
>
> >…ChatGPT has references for what bears and sticks are…
>
> Ja, there was something kinda cool about the exchange.  ChatGPT was told
> “…I am being attacked by an angry bear…”
>
> It somehow understood that the interlocutor was not at that moment in the
> process of being devoured while pecking away on his computer for advice on
> a survival strategy (the subject of my silly riff).  It understood it was
> being asked about a theoretical situation rather than what it was literally
> told.
>
> That kinda implies a form of understanding, or specifically: a very
> effective use of language models to create the illusion of understanding.
>
> I really don’t think it thinks, but it makes us think it thinks.  ChatGPT
> is wicked cool.
>

Is there such a thing as "simulated multiplication," or would we say
simulated multiplication is the same thing as real multiplication?
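
To make the multiplication analogy concrete, here is a minimal sketch in
Python (my own illustration; the name simulated_multiply is invented for
this example): a routine that merely "simulates" multiplication by
repeated addition still computes exactly the same products as the
built-in operator, which is the sense in which a simulation of a
computation arguably just is that computation.

    # "Simulated" multiplication via repeated addition (assumes b >= 0).
    def simulated_multiply(a: int, b: int) -> int:
        total = 0
        for _ in range(b):
            total += a
        return total

    # Extensionally, the simulation and the real thing agree everywhere:
    assert simulated_multiply(6, 7) == 6 * 7 == 42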

Is there such a thing as "simulated thinking"?

Jason


