[ExI] LLMs cannot be conscious

Rafal Smigrodzki rafal.smigrodzki at gmail.com
Sat Mar 25 03:13:07 UTC 2023


On Thu, Mar 23, 2023 at 11:00 PM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> I agree, you could say mathematical truth exists outside language. But one
> thing this does highlight is that there are many things you know about
> despite never having the object in hand to point at and look at. You've
> never seen the abstract object '2'. You've never seen the core of the
> earth, or an electron. We lack sensory access to these things, and so
> everything we know about them we know only through language. How do we
> come to understand things like '2' or electrons?
>

### Just as importantly, how do you know you have an "object" in hand,
or, for that matter, that you have a hand?

Our brain is just a big mess of neural networks doing hybrid
analog-digital computing, and a small part of it, usually in the left
hemisphere, is in charge of creating language output from the
non-verbal activity going on all around it. The direct referents for
language are the non-verbal data patterns that code for, e.g., apples,
which through many layers of neural nets can eventually be traced to
actual apples. The same holds in an LLM, which has layers that output
grammar and syntax, connected to layers that represent objects,
connected to layers that represent relationships and concepts, and, in
the case of multimodal LLMs like GPT-4, connected to layers that parse
photo or video data.
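
To make that layered picture concrete, here is a minimal toy sketch in
PyTorch. It is purely illustrative: the class, the fused-sequence
design, and all layer names and sizes are my own assumptions, not
GPT-4's actual (unpublished) architecture. It shows the shape of the
stack: modality-specific entry layers (token embeddings for text, a
projection for precomputed image features) feeding a shared tower of
representation layers that ends in a language-output head.

import torch
import torch.nn as nn

class ToyMultimodalLM(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64, n_layers=4,
                 image_dim=512):
        super().__init__()
        # text entry point: token embeddings ("grammar and syntax" side)
        self.token_embed = nn.Embedding(vocab_size, d_model)
        # multimodal entry point: project image features (e.g., from a
        # separate vision encoder) into the same space as the tokens
        self.image_proj = nn.Linear(image_dim, d_model)
        # shared stack that builds up object / relationship / concept
        # representations over the mixed image+text sequence
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True)
        self.layers = nn.TransformerEncoder(encoder_layer,
                                            num_layers=n_layers)
        # language output head
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, token_ids, image_feats):
        # token_ids: (batch, seq); image_feats: (batch, patches, image_dim)
        text = self.token_embed(token_ids)
        vision = self.image_proj(image_feats)
        fused = torch.cat([vision, text], dim=1)  # one mixed sequence
        hidden = self.layers(fused)               # shared concept layers
        # predict token logits over the text positions only
        return self.lm_head(hidden[:, vision.size(1):, :])

model = ToyMultimodalLM()
logits = model(torch.randint(0, 1000, (2, 8)), torch.randn(2, 16, 512))
print(logits.shape)  # torch.Size([2, 8, 1000])

Nothing here depends on the details; the point is only that verbal
output sits at the top of a stack whose lower layers represent
non-verbal things, in the model as in the brain.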

The cognitive part of the human mind is recapitulated and expanded on in
LLMs. The emotional and goal-seeking part of the human mind has a
different structure from the cognitive part and, for now, probably has
no counterpart in LLMs.

Rafal