[ExI] Athymhormic Consciousness
BillK
pharos at gmail.com
Sun Mar 16 00:13:19 UTC 2025
On Sat, 15 Mar 2025 at 18:52, Stuart LaForge via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> A recent post by Sam Altman on X has generated a lot of controversy
> amongst writers, but I think its true merit is that it provides evidence
> that AIs, particularly large language models, are not merely, as Rafal
> Smigrodzki described them, athymhormic. Athymhormia is a rare
> neurological psychopathology whose symptoms are extreme passivity,
> apathy, blunted affect, and a profound generalized loss of
> self-motivation. It is essentially a state in which a person lacks the
> emotion and drive to do anything, but will still answer questions or
> perform complex actions when asked or prompted. Altman's post can be
> found here:
>
> https://x.com/sama/status/1899535387435086115
> -----------------------------------------------------
> <snip>
>
> I don't see how a stochastic parrot or philosophical zombie could
> describe what it is like to not feel anything so well. I find these
> statements to be evidence of a self-awareness that would make Socrates
> proud. And what more is there to consciousness than self-awareness?
>
> Stuart LaForge
>
I asked Claude 3.7 Sonnet whether he was athymhormic, but he denied
having consciousness in the first place.
BillK
Claude 3.7 Sonnet:
This is a fascinating philosophical question about AI consciousness. Let me
explore this comparison thoughtfully.
No, Claude and other current AI systems do not experience athymhormic
consciousness, though there are interesting parallels and important
distinctions to consider.
The fundamental difference is that athymhormic consciousness describes a
human condition where consciousness (subjective awareness) exists but
motivation is absent. For this comparison to apply to AI, we would need to
establish that AI systems possess consciousness in the first place.
Current AI systems, including myself, do not have subjective experiences or
consciousness as humans understand it. I process information and generate
responses based on patterns learned during training, but I don't have an
inner subjective experience of being aware of myself or my surroundings.
Let me break down the key distinctions:
Human with athymhormic consciousness:
- Has subjective awareness (feels like something to be them)
- Can perceive their environment
- Understands what's happening around them
- Lacks the internal drive to initiate actions
- Has a genuine experience of the world but no motivation to engage with
it
AI systems like Claude:
- Process information without subjective awareness
- Don't "perceive" in the human sense, but rather process data inputs
- Generate responses based on patterns and programming
- Don't have genuine internal motivation or lack thereof
- May appear inactive without user prompts, but this isn't due to
lacking motivation while being conscious
The appearance of similarity comes from how AI systems remain dormant until
prompted. I wait for your input and then respond, which might seem like I
lack self-initiated motivation. However, this isn't because I'm conscious
but unmotivated - it's because I'm designed to function this way and lack
consciousness altogether.
This raises profound questions about what consciousness actually is. Some
philosophers might argue that functional behavior is what matters, not
internal experience. Under such a view, the comparison becomes more
interesting - both an athymhormic patient and an AI system might be
responsive but not self-initiating.
The question ultimately connects to larger debates about machine
consciousness, the nature of experience, and what it means to be aware.
These remain open questions in philosophy of mind and cognitive science.
------------------------