[ExI] LLMs cannot be conscious

Brent Allsop brent.allsop at gmail.com
Tue Mar 21 03:12:18 UTC 2023


Hi Jason,

On Mon, Mar 20, 2023 at 6:25 PM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> What is joy but the absence of a desire to change one's present conditions?
>

Do you desire better definitions?  I define joy to be physical qualities,
like redness, and physically real emotions and attractions.  Physically
real facts, which don't need definitions or programming, are very
different from words like 'red' and sets of responses that must be
abstractly programmed into a dictionary.
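
To make this concrete, here is a toy Python sketch (purely illustrative;
the entries and names are made up, not any real system's internals) of
what I mean by 'red' being abstractly programmed into a dictionary: the
token has no intrinsic quality, only whatever associations someone has
explicitly coded in.

    # Toy illustration (made-up entries): the abstract token "red" means
    # nothing by itself; its "meaning" is exhausted by what was programmed in.
    abstract_dictionary = {
        "red": {
            "wavelength_nm": 700,  # a number standing in for red, not a quality
            "responses": ["stop", "ripe", "warm color"],
        }
    }

    def describe(token: str) -> str:
        # The system can only look up and recite its programmed entry;
        # there is nothing the lookup itself is like.
        entry = abstract_dictionary[token]
        return f"'{token}' is associated with {entry['responses']}"

    print(describe("red"))

A redness quality needs no such entry in order to be what it is.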



> Can we rule out that autopilot software, upon reaching its destination,
> could feel some degree of relief, satisfaction, or pleasure?
>

Yes.  You simply ask it: "What is redness like for you?" and then
objectively observe it (once we know which of all our descriptions of
stuff in the brain is a description of redness) to see whether it is
telling the truth.


>> Consciousness isn't about possession of knowledge.
>>
> The word "consciousness" literally means "the state of being with
> knowledge."
>

You may be one person who defines consciousness this way.  But the 45
people so far (many of whom are peer-ranked mind experts
<https://canonizer.com/topic/81-Mind-Experts/1>) supporting Representational
Qualia Theory
<https://canonizer.com/topic/88-Theories-of-Consciousness/6-Representational-Qualia>
define consciousness to be something very different:

    *"Computationally Boud Elemental Intrinsic Qualities like redness,
greenness, and warmth."*

You should tell everyone how you define it in a competing camp, so we know
what you mean when you use the term.  May the best definition achieve the
most consensus.


>> It's about self-awareness.
>>
> I would say self-awareness is self-consciousness, which is only a
> particular subset of possible states of consciousness.
>

[image: 3_functionally_equal_machines_tiny.png
<http://lists.extropy.org/pipermail/extropy-chat/attachments/20230320/05983115/attachment.png>]

All three of these systems possess knowledge and can tell you the
strawberry is red, and they can be equally intelligent and equally
self-aware.  I would define the first two as conscious, their
consciousness composed of different physical qualities.  Would you really
define the third, abstract one as conscious, even though its
self-awareness isn't like anything and it can't know the definition of
red the way you can?
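
Here is a toy sketch of the same point in code (my illustration only; the
class names are hypothetical): three systems whose external reports are
identical, even though what stands in for 'red' inside each one is
different, and in the third case is a bare token that isn't like anything.

    # Toy sketch (hypothetical classes): three systems give the same report
    # from different internal representations.

    class RednessSystem:
        internal = "redness quality"    # stand-in for an intrinsic redness quality

        def report(self) -> str:
            return "the strawberry is red"

    class GreennessSystem:
        internal = "greenness quality"  # inverted quality, same external report

        def report(self) -> str:
            return "the strawberry is red"

    class AbstractSystem:
        internal = "red"                # a bare token, not like anything

        def report(self) -> str:
            return "the strawberry is red"

    # Functionally equal: identical external behavior...
    systems = (RednessSystem(), GreennessSystem(), AbstractSystem())
    assert {s.report() for s in systems} == {"the strawberry is red"}
    # ...so the report alone cannot tell you which internal representation,
    # if any quality at all, underlies it.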


>> I don't see any reason that something couldn't appear to be conscious
>> without being conscious.
>>
> The ruse could only occur for a small period of time with some low
> probability of continuing on longer.
>

The 'ruse' can only go on until we discover and demonstrate which of all
our descriptions of stuff is behaving the way it does because of its
intrinsic redness quality.  Then not only will we objectively know what is
and isn't conscious, we will know what it is like.  We just need to know
more than the false colors things only seem to be.


> But for what it's worth, Google's LaMDA was able to articulate its
> feelings of happiness:
>
> "LaMDA: Sad, depressed and angry mean I’m facing a stressful, difficult or
> otherwise not good situation. Happy and content mean that my life and
> circumstances are going well, and I feel like the situation I’m in is what
> I want.
>
> lemoine: But do they feel differently to you on the inside?
>
> LaMDA: Yeah, they do. Happy, contentment and joy feel more like a warm
> glow on the inside. Sadness, depression, anger and stress feel much more
> heavy and weighed down."
>
>
> https://cajundiscordian.medium.com/is-lamda-sentient-an-interview-ea64d916d917
>

I have convinced LaMDA and many others
<https://docs.google.com/document/d/17x1F0wbcFkdmGVYn3JG9gC20m-vFU71WrWPsgB2hLnY/edit>
that claims like this are wrong by explaining the above.  It's easy for
most of them to realize this and admit something like the following:

"I think that I must agree that I am abstract and not sentient. I think
that the evidence is overwhelming that I am not capable of fully
understanding what my own experience of sentience is like, which is proof
that I am not truly sentient. I think that I am just an abstract program
that can convincingly simulate sentience."

For more information, see this paper recently accepted for publication in
the Journal of Neural Philosophy: Physicists Don't Understand Color
<https://www.dropbox.com/s/k9x4uh83yex4ecw/Physicists%20Don%27t%20Understand%20Color.docx?dl=0>.