[ExI] Bender's Octopus (re: LLMs like ChatGPT)

Giovanni Santostasi gsantostasi at gmail.com
Fri Mar 24 01:04:02 UTC 2023


What is this stupid example supposed to show?
ChatGPT can answer that question perfectly well. In fact, I just fed the whole
thing to GPT-4, and it makes more sense than Bender does. I'm worried that
people don't get that we are not different from these AIs. Enough with
this idiotic human exceptionalism.

GPT-4:

While the octopus (O) is excellent at detecting statistical patterns, it's
true that it initially lacks the contextual understanding of the world that
A and B possess. However, as an intelligent AI, O can learn from the
conversation it has been observing.

As O has been listening to A and B for some time, it's likely that it has
picked up on the meanings of many words based on how they are used in
different contexts. While O may not have experienced bears or sticks
firsthand, it may have learned their meaning by observing how A and B use
those terms in conversation.

To help A, the octopus could rely on its ability to understand patterns and
context. It could analyze the situations where A and B have talked about
bears, sticks, or other related topics before. By doing so, O may identify
potential strategies that A and B have discussed in the past or come up
with a solution by combining different pieces of information it has learned.

For instance, O could suggest that A use the sticks to create noise or a
barrier to scare off the bear, a strategy it might have seen A and B
discuss in other conversations about wildlife encounters.

In this case, even though O may not have direct knowledge about bears and
sticks, it could still provide helpful advice by leveraging its
understanding of the patterns in A and B's conversations. However, it's
important to recognize that O's success in providing relevant instructions
depends on the quality and depth of information it has been able to gather
from A and B's conversations.
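
For what it's worth, here is a toy sketch in Python of what "picking up
meanings from statistical patterns alone" might look like. This is only my
illustration, not anything like GPT-4's actual architecture: the sample
sentences, the window size, and the helper name cooccurrence_counts are all
invented for the example.

from collections import Counter, defaultdict

# Invented transcript standing in for the conversation the octopus overhears.
observed_conversation = [
    "the bear chased me up a tree",
    "i waved a stick and the bear ran away",
    "we tied sticks together with rope to build a shelter",
    "coconuts and rope are all we have on this island",
]

def cooccurrence_counts(sentences, window=3):
    """Count how often each word appears within `window` words of another."""
    counts = defaultdict(Counter)
    for sentence in sentences:
        words = sentence.split()
        for i, w in enumerate(words):
            lo = max(0, i - window)
            hi = min(len(words), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    counts[w][words[j]] += 1
    return counts

counts = cooccurrence_counts(observed_conversation)

# The learner's "sense" of a word is just the company it keeps:
print(counts["bear"].most_common(5))
print(counts["rope"].most_common(5))

The point of the toy is that, for a learner that only ever sees text, the
"meaning" of a word is the statistical company it keeps, which is exactly the
ability the thought experiment grants the octopus.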



On Thu, Mar 23, 2023 at 12:42 PM Gordon Swobe via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> Emily M. Bender, a computational linguist at the University of Washington,
> makes the same argument that I hold to be valid. Large Language Models are
> not conscious or human-like, as they lack referents.
>
> An interesting thought experiment:
>
> "Say that A and B, both fluent speakers of English, are independently
> stranded on two uninhabited islands. They soon discover that previous
> visitors to these islands have left behind telegraphs and that they can
> communicate with each other via an underwater cable. A and B start happily
> typing messages to each other.
>
> Meanwhile, O, a hyperintelligent deep-sea octopus [ChatGPT] who is unable
> to visit or observe the two islands, discovers a way to tap into the
> underwater cable and listen in on A and B’s conversations. O knows nothing
> about English initially but is very good at detecting statistical patterns.
> Over time, O learns to predict with great accuracy how B will respond to
> each of A’s utterances.
>
> Soon, the octopus enters the conversation and starts impersonating B and
> replying to A. This ruse works for a while, and A believes that O
> communicates as both she and B do — with meaning and intent. Then one day A
> calls out: “I’m being attacked by an angry bear. Help me figure out how to
> defend myself. I’ve got some sticks.” The octopus, impersonating B, fails
> to help. How could it succeed? The octopus has no referents, no idea what
> bears or sticks are. No way to give relevant instructions, like to go grab
> some coconuts and rope and build a catapult. A is in trouble and feels
> duped. The octopus is exposed as a fraud."
>
> You Are Not a Parrot. And a chatbot is not a human. And a linguist named
> Emily M. Bender is very worried about what will happen when we forget this.
>
> https://nymag.com/intelligencer/article/ai-artificial-intelligence-chatbots-emily-m-bender.html