[ExI] GPT-4 on its inability to solve the symbol grounding problem
dsunley at gmail.com
Fri Apr 7 20:32:47 UTC 2023
The precise mechanisms of early childhood language acquisition are still
areas of major open study, but as far as I know, the consensus is that
there are genetically coded structures that correspond to basic elements of
language and speech. Verbs, nouns, adjectives, vowels, and consonants seem
to be human universals, as is the extreme plasticity of the linguistic
cortex in response to the spoken utterances of surrounding people during
early childhood, especially the primary caregivers.
Kant would call that "a priori" knowledge. Humans are not born as blank
slates, and especially not where language is concerned.
The important thing to remember in all of this is that in humans,
linguistics is a weird hack on top of a frontal cortex that is coded to
model an environment, which is itself a weird hack on top of an agentic
hindbrain that is coded to run around in and manipulate an environment.
An LLM is basically just the top layer, but with the frontal cortex stuff
hacked into the grammar grokkage, and no hindbrain at all. It's different,
and weirdly different, while still looking normalish. That's what all the
memes about a mountain-sized mass of tentacles wearing a tiny smiley-face
mask are trying to convey.
We may or may not be seeing the first distant rumblings of the Singularity,
but what we are seeing for certain is humanity's first mass experience
interacting with a genuinely alien neural architecture, complex enough to
be worthy of the name.
On Fri, Apr 7, 2023 at 1:50 PM William Flynn Wallace via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> But you never said how knowledge got to our brains other than through the
> senses. bill w
> On Fri, Apr 7, 2023 at 2:14 PM Gordon Swobe via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>> On Fri, Apr 7, 2023 at 12:09 AM Jason Resch via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>> On Thu, Apr 6, 2023 at 11:45 PM Gordon Swobe <gordon.swobe at gmail.com> wrote:
>>>> I think you are begging the question, asking me to assume your
>>>> conclusion. I have no idea if brains derive understanding from the mere
>>>> statistical correlations of their inputs.
>>> Hi Gordon, thank you for responding to that question.
>>> Let's break it apart to see where there is either disagreement or
>>> 1. We agree that human brains have understanding, correct?
>>> 2. We agree that the only information a human brain receives from or
>>> about the outside world enters it as nerve impulses from the senses,
>> I hope you understand that you opened up a giant can of worms, especially
>> with the second question. If you expect simple yes/no answers to simple
>> questions then I might disappoint you.
>> 1. I agree that conscious human brains, aka minds, have understanding.
>> That question is fairly straightforward.
>> 2. This question is more problematic, and depends to some extent on what
>> we mean by "outside world." We already had a miscommunication about that
>> question with respect to referents. I am not a strict empiricist, which is
>> to say that I do not believe that all knowledge is derived from the senses
>> where senses is taken to mean sight, hearing, etc. You've already seen me
>> write about how I believe along with mathematical platonists that we
>> discover mathematical truths and do not invent them. The square root of 9
>> is 3 and this was and is eternally true. It was true before anyone had a
>> conscious mind to contemplate and learn it. Does it exist in the
>> "outside world"? Is the platonic realm in the outside world?
>> Empirical science has been a great boon to humanity, but as Shakespeare
>> might say, "There are more things in heaven and earth, Horatio, than are
>> dreamt of in the philosophy of empiricism." :)
>> extropy-chat mailing list
>> extropy-chat at lists.extropy.org