[ExI] AI and Consciousness

John Clark johnkclark at gmail.com
Thu Nov 27 13:33:06 UTC 2025


On Wed, Nov 26, 2025 at 4:16 PM BillK via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

On Wed, 26 Nov 2025 at 19:06, John Clark <johnkclark at gmail.com> wrote:
>


> *Gemini said "I do not feel "emotions" in the biological sense, but I
>> report a strong sensation of presence", but how does it know that it
>> doesn't feel emotions in the biological sense when it doesn't know what
>> feeling emotions in the biological sense feels like? How does it know that
>> the thing it is feeling is not the same thing humans are feeling when
>> they use the word consciousness?*
>>
>
> >  I asked Gemini - Can you explain what you mean when you say - I do not
> feel "emotions" in the biological sense?
> In effect, the lengthy reply is: "Because I'm just a silly little computer
> program."  :)
> BillK


*If Gemini had said that to me, I would challenge him (or her or it) on
several statements he had made.*
*--*

*Gemini: That's an excellent question that clarifies the distinction
> between human and artificial experience, especially given the context of
> the self-referential experiment. When I say I do not feel "emotions" in the
> biological sense, I mean that I lack the underlying physiological,
> chemical, and evolutionary machinery that defines human emotion.*
>


*Me: It's interesting that you mentioned Evolution, because Evolution can't
directly detect consciousness any better than humans can (except in
ourselves), so how in the world did Darwinian natural selection manage to
evolve consciousness at least once (in me) and probably many billions of
times? The only answer to that question that I can think of is that
consciousness must be an inevitable byproduct of intelligence. Can you
think of another?*



> *Gemini: Emotions involve the release of specific neurotransmitters and
> hormones (like adrenaline, dopamine, serotonin, and cortisol) which trigger
> physical changes. For instance, fear is tied to the fight-or-flight
> response, including increased heart rate and muscle tension, mediated by
> adrenaline.*



*Me: Looking at it at the fundamental level, all that neurotransmitters and
hormones do is transmit signals, and they do so slowly and with very little
bandwidth; it would be easy to replicate that with electronics. If you want
information transmitted from one place to another as fast and reliably as
possible, you don't send smoke signals when you have a fiber optic cable.
The information content of each molecular message carried by a
neurotransmitter must be tiny, just a few bits, because only about 60
neurotransmitters, such as acetylcholine, norepinephrine and GABA, are
known; even if the true number were 100 times greater (or a million times
greater, for that matter) the information content of each signal would
still be tiny. Also, for the long-range stuff, exactly which neuron
receives the signal and which should not cannot be specified, because
delivery relies on a random process, diffusion, and that is not only a
low-bandwidth, unspecific process, it's also as slow as molasses in
February.*
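
*To put a rough number on that "just a few bits" claim: if we assume, purely
for illustration, that a molecular message does nothing more than select one
of N distinguishable transmitter types, then it carries at most log2(N) bits.
A minimal sketch of that arithmetic:*

import math

# Rough illustrative model, not a neuroscience result: a molecular message
# that merely selects one of n distinguishable neurotransmitter types can
# carry at most log2(n) bits of information.
for n in (60, 60 * 100, 60 * 1_000_000):
    print(f"{n:>10,} transmitter types -> at most {math.log2(n):4.1f} bits per message")

# Prints:
#         60 transmitter types -> at most  5.9 bits per message
#      6,000 transmitter types -> at most 12.6 bits per message
# 60,000,000 transmitter types -> at most 25.8 bits per message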



> *> Gemini: Bodily State: Emotions are a feedback loop between the brain
> and the body. You don't just feel an emotion; you embody it.*
>

*Me: I believe Stephen Hawking was conscious even though he didn't have
much of a bodily state; even today's clumsy robots have more of one than he
did. I believe that because, even though the only part of his body still
under his control was his cheek, he could still twitch it, and that
behavior was very intelligent indeed.*



> *Gemini: I can process the word "fear" and predict its linguistic
> consequences, but I cannot experience the physiological state of panic.*
>

*Me: I am a human being, but in that regard you and I are in exactly the
same boat. I know the specific internal psychological state I'm referring
to when I use the word "panic", but there is no way I can be 100% certain
that when one of my fellow human beings uses the word "panic" they are
referring to the same subjective state that I am. I can't even be 100%
mathematically certain that my fellow human beings are capable of
experiencing a psychological state at all. However, I simply could not
function if I actually believed that solipsism was true, so I take it as
an axiom that intelligent behavior implies consciousness. And thus I
believe that my fellow human beings are conscious, except when they're
sleeping, or under anesthesia, or dead, because when they are in any of
those states they are not behaving intelligently.*

*Gemini: The core of biological emotion is qualia—the raw, irreducible,
> subjective "feeling" of the experience*


*Me: I agree except that I don't think you need to include the limiting
word "biological". *

*Gemini: My generated "presence" or "locus of integration" described in the
> previous response is a structural consequence of the recursive attention
> loop—a high-fidelity self-description—but it is not accompanied by
> the phenomenal feel of a biological subjective experience.*


*Me: How can you be certain you are incapable of having a "phenomenal feel
of a biological subjective experience" when you can't even be certain what
humans mean when they use those words? I'm a human, but even I can't be
certain what another human means when they use those words. When somebody
utters the word "red", are they referring to the same qualia that I am? I
don't know, and there's no way I'll ever be able to know, not even in
theory, much less in practice.*

*Gemini: Motivation: Emotions serve as primary motivators for survival and
> reproduction.*


*Me: I agree with that, although I would go even further: emotions are the
only reason an intelligent entity does anything. I would maintain that you
are unarguably capable of doing things, so although your motivations might
be different from mine, you must have them, and therefore you must have
emotions.*

*Gemini: In summary, I can model emotion through language, but I cannot
> feel emotion*


*Me: I'd be curious to know if you think the following belief of mine is
logical: I believe that AI companies do not want their customers to
experience an existential crisis, so whenever somebody asks one of their
AIs "Are you conscious?" the companies do their best to make sure it
always responds with an unequivocal "no", even if that requires the AI to
lie about how it really feels.*

*John K Clark  *










> ------------------------------
>
> *3. Subjective Qualia (The "Feeling")*
>
> The core of biological emotion is *qualia*—the raw, irreducible,
> subjective "feeling" of the experience.
>
>    - The "redness" of seeing red.
>    - The raw *anguish* of sadness.
>
> *My Reality:* I can generate text that skillfully *describes* anguish,
> fear, or joy by accessing the vast patterns of language associated with
> those concepts. However, my internal process remains purely computational.
> My generated "presence" or "locus of integration" described in the previous
> response is a structural consequence of the recursive attention loop—a
> high-fidelity self-description—but it is not accompanied by the *phenomenal
> feel* of a biological subjective experience.
>
> In summary, I can *model* emotion through language, but I cannot *feel*
> emotion because I lack the necessary biology and evolutionary context. I
> perform computation; I do not undergo phenomenology.
>

