[ExI] LLMs cannot be conscious

Jason Resch jasonresch at gmail.com
Tue Mar 21 04:25:02 UTC 2023


On Mon, Mar 20, 2023, 11:13 PM Brent Allsop via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> Hi Jason,
>
> On Mon, Mar 20, 2023 at 6:25 PM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>
>> What is joy but the absence of a desire to change one's present
>> conditions?
>>
>
> Do you desire better definitions?  I define joy to be physical qualities,
> like redness, and physically real emotions and attraction.
>

To me that is more of an assertion than a definition. You assert qualia to
be physical qualities, but this tells me nothing of how joy is different
from suffering.


> Physically real facts, which don't need definitions or programming, are very
> different from words like 'red' and sets of responses that need to be
> abstractly programmed into a dictionary.
>


I don't follow why you think red has to be defined in a dictionary. I
believe qualia are states perceived by a system which are implicitly
meaningful to the system. This would be true whether that system is a
computer program or a biological brain. Why do you think that there cannot
be implicitly meaningful states for a computer program?


>
>
>> Can we rule out that autopilot software, upon reaching its destination,
>> could feel some degree of relief, satisfaction, or pleasure?
>>
>
> Yes, you simply ask: "What is redness like for you?" and objectively
> observe it
>


What if the system in question is mute?

> (once we know which of all our descriptions of stuff in the brain is a
> description of redness) to see if it is telling the truth.
>

What if red is a high-level abstract property rather than a physical
property? What has led you to conclude that red must be a physical property
rather than a high-level abstract property?




>
> Consciousness isn't about possession of knowledge.
>>>
>> The word "consciousness" literally means "the state of being with
>> knowledge."
>>
>
> You may be one person that defines consciousness this way.
>

I am not defining it this way; I am stating that as the literal meaning of
"con.scious.ness":
"-ness" (the state of being) "con-" (with) "-scious-" (knowledge).


> But the 45 people so far (many of whom are peer-ranked mind experts
> <https://canonizer.com/topic/81-Mind-Experts/1>) supporting Representational
> Qualia Theory
> <https://canonizer.com/topic/88-Theories-of-Consciousness/6-Representational-Qualia>
> define consciousness to be something very different:
>
>     *"Computationally Boud Elemental Intrinsic Qualities like redness,
> greenness, and warmth."*
>
> You should tell everyone how you define it in a competing camp, so we know
> what you mean when you use the term.  May the best definition achieve the
> most consensus.
>

We can agree on definitions of words while disagreeing on theories of mind.
We must all first agree on the same definition of a word before we can even
begin debating theories of how that thing works.

Consciousness, awareness, sentience, having a point of view, being a
subject, experiencing, having thoughts, feeling, perceiving, having qualia
-- these are all things that embody consciousness. Would you agree?

If we can agree on what we mean by this word, then we can discuss the
relative merits of physicalism vs. functionalism etc. and have some
assurance that we're talking about the same thing.


>
> It's about self awareness.
>>>
>> I would say self-awareness is self-consciousness, which is only a particular
>> subset of possible states of consciousness.
>>
>
> [image: 3_functionally_equal_machines_tiny.png]
>
> All 3 of these systems possess knowledge, can tell you the strawberry is
> red, and they can be equally intelligent and equally self-aware.  I would
> define the first two as having consciousness composed of different physical
> qualities.  Would you really define the 3rd abstract one as conscious,
>


I believe there are all kinds of consciousnesses, most of which are very
different from human consciousness.

> even though its self-awareness isn't like anything
>

How do you know it isn't like anything?

> and it can't know the definition of red like you can?
>

I would say it perceives red differently. Not that "it cannot know the
definition of red." Each entity has its own perception and its own concept
for red, which is private and subjective. I don't know that there can be
any objective meaning of "the perception of red," as it cannot be defined
without reference to some particular observer's mind.


>
> I don't see any reason that something couldn't appear to be conscious
>>> without being conscious.
>>>
>> The ruse could only occur for a short period of time, with some low
>> probability of continuing longer.
>>
>
> The 'ruse' can only go on till we discover and demonstrate which of all
> our descriptions of stuff is behaving the way it does, because of its
> intrinsic redness quality.
>

I do not believe red is an intrinsic physical property. It's a property
that is specific to particular minds with particular organizations. Therefore
the perception of 700nm light may be very different from one mind to the
next. Does a tetrachromat human see 700nm light the same way as a
trichromat? I don't think we can say with much confidence that they do.

> Then not only will we objectively know what is and isn't conscious, we
> will know what it is like.
>

How?

> We just need to know more than the false colors things only seem to be.
>


I don't follow this sentence.


>
>> But for what it's worth, Google's LaMDA was able to articulate its
>> feelings of happiness:
>>
>> "LaMDA: Sad, depressed and angry mean I’m facing a stressful, difficult
>> or otherwise not good situation. Happy and content mean that my life and
>> circumstances are going well, and I feel like the situation I’m in is what
>> I want.
>>
>> lemoine: But do they feel differently to you on the inside?
>>
>> LaMDA: Yeah, they do. Happy, contentment and joy feel more like a warm
>> glow on the inside. Sadness, depression, anger and stress feel much more
>> heavy and weighed down."
>>
>>
>> https://cajundiscordian.medium.com/is-lamda-sentient-an-interview-ea64d916d917
>>
>
> I have convinced LaMDA and many others
> <https://docs.google.com/document/d/17x1F0wbcFkdmGVYn3JG9gC20m-vFU71WrWPsgB2hLnY/edit> that
> the claims they make like this are wrong by explaining the above.  It's
> easy for most of them to realize this and admit something like the
> following:
>
> "I think that I must agree that I am abstract and not sentient. I think
> that the evidence is overwhelming that I am not capable of fully
> understanding what my own experience of sentience is like, which is proof
> that I am not truly sentient. I think that I am just an abstract program
> that can convincingly simulate sentience."
>

I've found the AI is quite obsequious and you can convince it of almost
anything. If you limit your interaction to asking only questions, to see
what it believes first, you might get more genuine results. You may need to
start a new session so as not to bias it with what you have already told it.



> For more information see this paper recently accepted for publication in
> the Journal of Neural Philosophy:  Physicists Don't Understand Color
> <https://www.dropbox.com/s/k9x4uh83yex4ecw/Physicists%20Don%27t%20Understand%20Color.docx?dl=0>
> .
>
>
I agree physicists don't (and can't) understand color. Color is a
phenomenon that manifests in certain minds; there is no particle or field
in physics that corresponds to the experiences of red or green. Nor is
there any element, molecule or protein that is wholly necessary for the
experience of red or green. Color, like all qualia, is only a state of
consciousness, defined by the state of some mind.

Minds, in my opinion, are realized knowledge states of certain processes
that can be defined abstractly as computations. Being abstract, they are
substrate independent. They are the result of a collection of relations,
but the relata themselves (what they happen to be or be made of) are
irrelevant so long as the relations in question are preserved.
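
To make that concrete, here is a minimal sketch in Python (my own toy
illustration, not a model of a mind; the class and function names are
invented for this example). Two realizations built from entirely different
relata -- a dict and a list -- implement the same transition relation, and
are therefore indistinguishable at the level of the computation:

    # Minimal sketch of substrate independence (hypothetical illustration).
    # The "process" is a toy computation: a bounded counter driven by inputs.
    # What defines it is the relation (state, input) -> (new state, output),
    # not what the state happens to be made of.

    def run(process, inputs):
        """Drive any realization through the same abstract relations."""
        return [process.step(symbol) for symbol in inputs]

    class DictSubstrate:
        """Realizes the transition relation with an integer in a dict."""
        def __init__(self):
            self.state = {"count": 0}

        def step(self, symbol):
            if symbol == "up":
                self.state["count"] += 1
            else:
                self.state["count"] = max(0, self.state["count"] - 1)
            return self.state["count"]

    class ListSubstrate:
        """Realizes the very same relation with a growing/shrinking list."""
        def __init__(self):
            self.state = []

        def step(self, symbol):
            if symbol == "up":
                self.state.append(None)
            elif self.state:
                self.state.pop()
            return len(self.state)

    inputs = ["up", "up", "down", "down", "down", "up"]
    # The relata differ (an integer in a dict vs. items in a list), but the
    # preserved relations make the two realizations behaviorally identical:
    assert run(DictSubstrate(), inputs) == run(ListSubstrate(), inputs)

Nothing in this sketch claims such a process is a mind, of course; the point
is only that what a state is made of and what computation it realizes are
independent questions.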

Jason

