[ExI] Bender's Octopus (re: LLMs like ChatGPT)

Adrian Tymes atymes at gmail.com
Thu Mar 23 21:59:02 UTC 2023


On Thu, Mar 23, 2023, 12:56 PM Stuart LaForge via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> I posed this exact question to ChatGPT


ChatGPT has references for what bears and sticks are.

This may make Bender's octopus an irrelevant example, as any serious real-world
AI will have at least as many common references as most people.  When it lacks
a necessary reference, it can ask, just like any person could.  "What's a
bear?" sounds silly because bears are a common reference.  "What's a blood
bear?" sounds more plausible to ask, if someone said they were being attacked
by blood bears.
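
As a minimal sketch of that behavior (assuming the pre-1.0 openai Python
client as it existed in early 2023; the system prompt and the "blood bear"
scenario are purely illustrative, not anything ChatGPT does by default):

    import openai  # assumes OPENAI_API_KEY is set in the environment

    # Instruct the model to ask rather than guess when it hits an
    # unfamiliar term, then hand it the "blood bear" scenario.
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "If the user mentions a term you have no reference "
                        "for, ask a clarifying question instead of guessing."},
            {"role": "user",
             "content": "Help! I'm being attacked by blood bears!"},
        ],
    )
    print(response.choices[0].message.content)
    # Hoped-for (not guaranteed) output: something like "What's a blood bear?"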

It can also try to make something up.  Some might do so as well as an
average person could, but this is not recommended even for humans in most
cases.  That the AI might be bad at it is just one more reason not to do so.

