[ExI] GPT-4 on its inability to solve the symbol grounding problem
Gordon Swobe
gordon.swobe at gmail.com
Mon Apr 10 02:39:20 UTC 2023
On Sun, Apr 9, 2023 at 11:07 AM Stuart LaForge via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> You and Bender call LLMs "stochastic parrots". Since
> African gray parrots are approximately as intelligent as 3.5-year-old
> human children, that would imply that ChatGPT is likewise at least as
> conscious as a 3.5-year-old human child if not more so.
The term "Stochastic Parrots" reflects that LLM outputs rely on predictions
derived from the statistical appearance of words in the text used for
training. LLMs lack referents for the words, so they can only echo them
without knowing their meaning, similar to parrots. The term has nothing
whatsoever to do with the intelligence or consciousness of actual parrots.
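To make the point concrete, here is a minimal sketch (plain Python, purely
illustrative, not how any real LLM is built) of a "stochastic parrot" in this
narrow sense: a toy bigram model that predicts the next word only from how
often word pairs appear in its training text. It has no referents for any
word; it only echoes statistical patterns.

import random
from collections import defaultdict

# Toy training text; the model sees only word sequences, never meanings.
training_text = "the parrot repeats the words the parrot hears"

# Count how often each word follows each other word.
bigram_counts = defaultdict(lambda: defaultdict(int))
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    bigram_counts[prev][nxt] += 1

def predict_next(word):
    # Sample the next word in proportion to how often it followed `word`.
    followers = bigram_counts[word]
    choices, weights = zip(*followers.items())
    return random.choices(choices, weights=weights)[0]

# Generate text by repeatedly predicting the next word from statistics alone.
current = "the"
output = [current]
for _ in range(6):
    if not bigram_counts[current]:
        break  # dead end: this word never had a follower in the training text
    current = predict_next(current)
    output.append(current)
print(" ".join(output))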
> That is unless
> you can specify the difference between intelligence and consciousness,
> in such a way that humans have consciousness and birds do not.
Again, the term has nothing to do with actual birds. It is just a bit of
sarcasm on the part of Bender and her associates.
> When it comes to the survival of the human race, silliness is
> preferable to factual inaccuracy. Thus far, I have caught your
> supposed thought leader Bender in two cringy factual inaccuracies. The
> first regarding parrots as being models of unconscious stupidity and
> the second being that octopi don't understand the uses of coconuts
> which is clearly refuted by this video.
>
> https://www.youtube.com/watch?v=Y2EboVOcikI
She was not referring to an actual octopus, either.
> I don't think that your hero Bender understands parrots, octopi,
> bears, or tropical islands as well as she thinks she does.
I think you must be joking with me.
-gts