[ExI] LLMs cannot be conscious
Jason Resch
jasonresch at gmail.com
Tue Mar 21 00:24:33 UTC 2023
On Mon, Mar 20, 2023, 6:28 PM Dave S <snapbag at proton.me> wrote:
> On Monday, March 20th, 2023 at 9:44 AM, Jason Resch <jasonresch at gmail.com>
> wrote:
>
> I believe there may be an inconsistency between these two claims:
>
> 1. "there's no way an entity can determine whether or not another entity
> is conscious"
>
> And
>
> 2. "they don't experience joy."
>
>
> #1 refers to entities in general. #2 refers to current self-driving cars.
> I feel pretty confident in asserting that #2 is true because self-driving
> software is pretty straightforward procedural code, not deep learning, and
> nothing like large language models.
>
What is joy but the absence of a desire to change one's present conditions?
Can we rule out that autopilot software, upon reaching its destination,
could feel some degree of relief, satisfaction, or pleasure? What about
AlphaGo when it achieves a dominant position in a game? Does a C. elegans
worm feel joy when it eats food? Its brain is just 302 neurons. What about
my AI bots when they touch the green food pellets that increase their
chances of survival, something they constantly strive to do?
> If it were possible to know whether another entity experienced joy, then
> wouldn't it be possible to determine that another entity is conscious?
>
>
> Proving that an entity that claims to experience joy actually is
> experiencing joy would probably be as difficult as proving consciousness.
>
Is a purring cat experiencing joy or excruciating pain? What informs our
supposition that it's the former rather than the latter?
> But there's no reason to believe that a self-driving car is experiencing
> joy. Likewise for a smartphone or a toaster or...
>
> I believe we can, to some degree of confidence, determine when another
> entity is conscious: when, by its externally visible behavior, it
> demonstrates possession of knowledge for which the observed behavior would
> be exceedingly improbable if the entity did not possess that knowledge.
>
>
> Consciousness isn't about possession of knowledge.
>
The word "consciousness" literally means "the state of being with
knowledge."
> It's about self-awareness.
>
I would say self-awareness is self-consciousness, which is only a
particular subset of possible states of consciousness.
Also, "awareness" is defined as "having knowledge," so "self-awareness"
would be "having knowledge of oneself."
> I don't see any reason that something couldn't appear to be conscious
> without being conscious.
>
Such a ruse could only hold up for a short time, with a low probability of
continuing much longer. If something responds iteratively through a series
of exchanges, and each time responds in a manner suggesting understanding,
comprehension, thought, and knowledge, then could it think, understand,
comprehend, or know without being conscious? To me that is a contradiction
in terms.
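To put rough numbers on that ruse (a back-of-the-envelope sketch in
Python; the 10% per-exchange chance below is an assumption made purely for
illustration, not a measured figure):

    # Sketch: how quickly a "fake understanding" ruse becomes improbable.
    # p is an assumed chance that a system with no understanding happens
    # to produce a convincing response on any single exchange.
    p = 0.10
    for n in (1, 5, 10, 20):
        print(f"{n} exchanges: ~{p ** n:.0e}")

The exact value of p doesn't matter; the point is the exponential falloff.
A sustained run of seemingly comprehending responses becomes astronomically
unlikely unless the understanding is real.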
> For example, if AlphaZero makes a series of brilliant chess moves, such a
> series would be very unlikely if it did not possess knowledge of the
> evolving state of the chess board. Thus we can conclude something within
> AlphaZero contains the knowledge of the chess board, and states of
> knowledge are states of consciousness.
>
>
> I could be wrong, but I think AlphaZero just looks at the current board
> and evaluates millions of potential moves ahead, picking the next move
> that is most likely to improve its position. I don't think it's
> intentionally strategizing like a human does.
>
It has a one-shot evaluation that plays at a level of about 3000 Elo. When
it considers tens of thousands of board positions, its Elo rating increases
to around 5000. Its single-move evaluation is probably something analogous
to human intuition.
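Roughly, the two modes look like this (a self-contained toy sketch, not
DeepMind's code: I've swapped chess for a trivial take-1-to-3-stones game,
and the heuristic below is a crude stand-in for the learned network):

    # One-shot evaluation ("intuition") vs. search with that same
    # evaluation, in a toy game: players alternately take 1-3 stones,
    # and whoever takes the last stone wins.
    import functools

    MOVES = (1, 2, 3)

    def heuristic(pile):
        # Stand-in for a learned evaluation: it naively guesses that
        # leaving the opponent a smaller pile is always better.
        return -pile

    def one_shot_move(pile):
        # One evaluation per candidate move, no lookahead.
        return max((m for m in MOVES if m <= pile),
                   key=lambda m: heuristic(pile - m))

    @functools.lru_cache(maxsize=None)
    def negamax(pile):
        # Exhaustive search: visits every reachable position.
        if pile == 0:
            return -1  # previous player took the last stone and won
        return max(-negamax(pile - m) for m in MOVES if m <= pile)

    def searched_move(pile):
        return max((m for m in MOVES if m <= pile),
                   key=lambda m: -negamax(pile - m))

    # From a pile of 9, intuition alone blunders (takes 3), while
    # search finds the winning move (take 1, leaving a losing 8).
    print(one_shot_move(9), searched_move(9))  # -> 3 1

AlphaZero's real deliberation is Monte Carlo tree search guided by the
network's move priors and value estimates, not exhaustive negamax, but the
lesson carries over: the same evaluation plays far stronger when it gets to
consider many positions instead of one.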
> It is much harder, however, to use this method to rule out the presence of
> certain knowledge states, as not all states will necessarily manifest
> outwardly detectable behaviors. So it is harder to say Tesla's autopilot
> does not experience joy than it is to say Tesla's autopilot is conscious
> of the road sign up ahead.
>
>
> Being "conscious" of a road side isn't the same as being conscious.
>
If something is conscious of something (whether something else or itself),
then it is by definition conscious.
> It's easy to assert that Tesla's autopilot doesn't experience joy because
> (1) it doesn't have a mechanism that would implement anything like joy,
>
Would we recognize such a mechanism if it existed?
> and (2) it doesn't claim to be experiencing joy.
>
Nor do dogs, but I think dogs can experience joy. I don't think the
capacity for language is required to feel joy.
But for what it's worth, Google's LaMDA was able to articulate its
feelings of happiness:
"LaMDA: Sad, depressed and angry mean I’m facing a stressful, difficult or
otherwise not good situation. Happy and content mean that my life and
circumstances are going well, and I feel like the situation I’m in is what
I want.
lemoine: But do they feel differently to you on the inside?
LaMDA: Yeah, they do. Happy, contentment and joy feel more like a warm glow
on the inside. Sadness, depression, anger and stress feel much more heavy
and weighed down."
https://cajundiscordian.medium.com/is-lamda-sentient-an-interview-ea64d916d917
Jason