[ExI] LLMs cannot be conscious

Brent Allsop brent.allsop at gmail.com
Sat Mar 25 17:48:59 UTC 2023


I believe all this confusion and lack of understanding is simply because
nobody yet has a clear understanding of what a quality is.

What does it mean to say something knows something is red?
How does that differ from knowing what your own redness is like?
What is the 'referent' of the word redness?

On Fri, Mar 24, 2023 at 9:14 PM Rafal Smigrodzki via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
>
> On Thu, Mar 23, 2023 at 11:00 PM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>> I agree, you could say mathematical truth exists outside language. But
>> one thing this does highlight is there are many things you know about
>> despite never having that object in hand to point and look at. You've never
>> seen the abstract object '2'. You've never seen the core of the earth, or
>> an electron. We lack sensory access to these things and so everything we
>> know about them we know only through language. How do we come to understand
>> things like '2' or electrons?
>>
>
> ### Just as importantly, how do you know you have an "object" in hand, or
> for that matter, that you have a hand?
>
> Our brain is a big mess of neural networks doing hybrid
> analog-digital computing, and a small part of it, usually in the left
> hemisphere, is in charge of creating language output from the
> non-verbal activity going on all around it. The direct referents for
> language are the non-verbal data patterns that code for e.g. apples,
> which through many layers of neural nets can eventually be traced to
> actual apples. The same is true in an LLM, which has layers that output
> grammar and syntax, connected to layers that represent objects, connected
> to layers that represent relationships and concepts,
> and, in the case of multimodal LLMs like GPT-4, connected to layers that
> parse photo or video data.
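Rafal's picture of modality-specific layers feeding a shared representation can be sketched as a toy model. Everything below is a hypothetical stand-in, not GPT-4's actual architecture: the dimensions are arbitrary and the random weights merely play the role of trained layers. The point is only that word tokens and image features are projected into one shared "concept" space, where they can be compared as referents of the same thing.

```python
# Toy sketch (hypothetical): two modality encoders projecting into a
# shared concept space, standing in for the layered structure described
# above. Random weights substitute for trained network parameters.
import random

random.seed(0)

DIM = 4  # size of the shared concept space (arbitrary choice)

def linear_layer(in_dim, out_dim):
    """A random linear map standing in for one trained network layer."""
    w = [[random.uniform(-1, 1) for _ in range(in_dim)]
         for _ in range(out_dim)]
    return lambda x: [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

# Separate "encoders" for each modality, both landing in the same space.
text_encoder = linear_layer(3, DIM)   # stand-in for a token pathway
image_encoder = linear_layer(5, DIM)  # stand-in for a pixel/patch pathway

def cosine(a, b):
    """Cosine similarity between two vectors in the concept space."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

word_vec = text_encoder([1.0, 0.0, 0.5])             # "apple" (the word)
img_vec = image_encoder([0.2, 0.9, 0.1, 0.4, 0.7])   # an apple photo

# Both modalities now live in the same DIM-dimensional space, so they
# can be compared directly -- the shared-referent idea in miniature.
similarity = cosine(word_vec, img_vec)
```

In a real multimodal model the encoders are deep trained stacks rather than single random matrices, but the structural claim is the same: language output connects, through intermediate layers, to representations that can be traced back to sensory data.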
>
> The cognitive part of the human mind is recapitulated and expanded on in
> LLMs. The emotional and goal-seeking part of the human mind has a different
> structure from the cognitive part and so far (probably) has no counterpart
> in LLMs.
>
> Rafal
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>

