[ExI] LLMs cannot be conscious

Jason Resch jasonresch at gmail.com
Sat Mar 25 18:45:12 UTC 2023


On Sat, Mar 25, 2023, 1:51 PM Brent Allsop via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
>
> I believe all this confusion and lack of understanding is simply because
> nobody yet has a clear understanding of what a quality is.
>

Qualities are those aspects of awareness that are not shareable. If they
were shareable, then they would be either physical (first-person shareable)
or mathematical (third-person shareable) properties.

That they are not shareable is one reason why there is so much confusion
surrounding them. They are private to each person.



> What does it mean to say something knows something is red?
>

What does it mean to say something knows there is an itch on the back of
their hand?

> How is it different to know what your redness is like?
>

It's not possible to know what another's red is like.

> What is a 'referent' of the word redness?
>

The mind state that is perceiving red.

Jason


>
> On Fri, Mar 24, 2023 at 9:14 PM Rafal Smigrodzki via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>>
>>
>> On Thu, Mar 23, 2023 at 11:00 PM Jason Resch via extropy-chat <
>> extropy-chat at lists.extropy.org> wrote:
>>
>>>
>>> I agree, you could say mathematical truth exists outside language. But
>>> one thing this does highlight is that there are many things you know
>>> about despite never having the object in hand to point at and look at.
>>> You've never seen the abstract object '2'. You've never seen the core
>>> of the earth, or an electron. We lack sensory access to these things,
>>> and so everything we know about them we know only through language. How
>>> do we come to understand things like '2' or electrons?
>>>
>>
>> ### Just as importantly, how do you know you have an "object" in hand, or
>> for that matter, that you have a hand?
>>
>> Our brain is just a big mess of neural networks doing hybrid
>> analog-digital computing, and a small part of it, usually in the left
>> hemisphere, is in charge of creating language output from the
>> non-verbal activity going on all around it. The direct referents for
>> language are the non-verbal data patterns that code for, e.g., apples,
>> which through many layers of neural nets can eventually be traced to
>> actual apples... same as in an LLM, which has layers that output
>> grammar and syntax, connected to layers that represent objects,
>> connected to layers that represent relationships and concepts, and, in
>> the case of multimodal LLMs like GPT-4, connected to layers that parse
>> photo or video data.
>>
>> The cognitive part of the human mind is recapitulated and expanded on
>> in LLMs. The emotional and goal-seeking part of the human mind has a
>> different structure from the cognitive part and so far (probably) has
>> no counterpart in LLMs.
>>
>> Rafal
>> _______________________________________________
>> extropy-chat mailing list
>> extropy-chat at lists.extropy.org
>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>>
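Rafal's layered picture above, in which a word at the grammar/syntax end of the stack is traced through successive non-verbal layers to the pattern that serves as its referent, can be sketched as a toy model. Everything in this sketch (the three-word vocabulary, the layer sizes, the nearest-neighbor lookup in "concept space") is illustrative only and is not any real LLM's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Token layer: a tiny vocabulary, each word given a dense embedding.
VOCAB = ["apple", "stone", "red"]
EMBED = {w: rng.normal(size=8) for w in VOCAB}

# Two fixed random nonlinear layers, standing in for the many layers
# between the verbal side and the representational side.
W1 = rng.normal(size=(8, 16))
W2 = rng.normal(size=(16, 8))

def to_concept(vec):
    """Map a word vector through the hidden layers into 'concept space'."""
    return np.tanh(np.tanh(vec @ W1) @ W2)

# The "referents": the concept-space pattern each word settles into,
# built here by running each embedding, plus a little noise standing in
# for perceptual variation, through the same layers.
REFERENTS = {w: to_concept(EMBED[w] + rng.normal(scale=0.01, size=8))
             for w in VOCAB}

def referent_of(word):
    """Trace a word to the nearest stored concept pattern."""
    target = to_concept(EMBED[word])
    return min(REFERENTS, key=lambda w: np.linalg.norm(REFERENTS[w] - target))

for w in VOCAB:
    print(w, "->", referent_of(w))
```

The only point the sketch makes is Rafal's structural one: the word sits at one end of a stack of layers, and what it refers to is a non-verbal pattern several layers in, recoverable only by following the stack down.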

