[ExI] My guesses about GPTs consciousness
jasonresch at gmail.com
Mon Apr 17 17:02:43 UTC 2023
On Mon, Apr 17, 2023, 11:15 AM Tara Maya via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> What is a cat aware of when it chases a bit of string?
> When my cat is busy doing anything else, except eating or mating, he can
> be easily distracted by a piece of string. He seems not only fascinated by
> it but almost compelled to pounce on it.
> I myself am not so compelled to pounce on wiggling string. However, I am
> fascinated, almost compelled to watch, if someone waves a video in front of
> me that shows an adorable kitten pouncing on wiggling string.
> Sometimes I have thoughts (like this one) while watching cats play with
> string, but mostly, I think my brain temporarily goes into a mode that is a
> thoughtless feeling of, "Awwww, how cute!"
> Is either of these states--pouncing cat or human in cute-overload--really
> conscious? Or do I become conscious only when I become aware of myself
> watching the cat and thinking how cute it is, consciously comparing my
> state of mind and trying to project myself into the cat's frame of mind?
> It is easy to personify the cat and imagine that he is able to leap from
> one kind of (mindless) awareness to another kind of (thoughtful) awareness,
> but is that justified?
> I am able to ignore my own mindless state most of the time, except when I
> am trying consciously to recreate it, as during zazen. When you are trying
> not to have thoughts, it's almost impossible to not have thoughts. But when
> you are simply caught unaware by the adorable pouncing of a cat, it is easy
> to have no thought except, "CUTE!" (And not the word, but the feeling,
> which I can't even properly describe without many more words to circle
> around it--most of which would only point to examples of cuteness.)
Nice comparison. Perhaps it's when our instinctual circuits take over from
the default / executive system as the dominant player in the brain.
> Maybe cats are doing zazen when they sit like potatoes in the window sill.
> But I don't think they need to. I think they are naturally in a state of
> thoughtless but highly sensitive (ie sensory-based) awareness. I don't
> think awareness and self-consciousness are the same thing, but quite
> different things.
I like the thought of naturally meditative cats.
Perhaps what you are describing here is the difference between first-,
second- and third-order judgements? At least it reminded me of this passage:
“What I call third-order judgements are judgements about conscious
experience as a type. These go beyond judgements about particular
experiences. We make third-order judgments when we reflect on the fact that
we have conscious experiences in the first place, and when we reflect on
their nature. I have been making third-order judgements throughout this
work. A typical third-order judgment might be, “Consciousness is baffling,
I don’t see how it could be reductively explained.” Others include
“Conscious experience is ineffable,” and even “Conscious experience does [...].”
Third-order judgements are particularly common among philosophers, and
among those with a tendency to speculate on the mysteries of existence. It
is possible that many people go through life without making any third-order
judgements. Still, such judgements occur in a significant class of people.
The very fact that people make such judgements is something that needs [...].
To help keep the distinctions in mind, the various kinds of judgements
related to consciousness can be represented by the following:
- First-order judgment: That’s red!
- Second-order judgment: I’m having a red sensation now.
- Third-order judgment: Sensations are mysterious.”
-- David Chalmers in "The Conscious Mind" (1996)
> On Apr 16, 2023, at 10:55 PM, Rafal Smigrodzki via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>>> This is one of the reasons why I do not subscribe to e.g. panpsychism and
>>> do not believe all behaving animals have consciousness.
>> About where would you draw the line on the phylogenetic tree?
> ### About where you start having a global workspace in the brain. So,
> protozoans, corals, nematodes are out. Of all the animal phyla I would
> guess only Chordata, Mollusca and Arthropoda might possibly have some
> consciousness, and I am not so sure about the arthropods. Among chordates I
> would guess only the smartest fish, smartest amphibians, and smartest
> reptiles, but most if not all mammals and birds.
> Of course, consciousness is not an on-off quality: At the level of a
> goldfish, if it has any consciousness, it's a pale shadow of the human
> mind, even the mind of a newborn baby. You mentioned in another thread that
> there may be many levels of consciousness going beyond human, and I agree,
> most likely we are still at the low end of the spectrum of consciousness
> that can be implemented in our physical world.
>>> There is a whole lot of complicated information processing that can
>>> guide goal-oriented behavior that can happen without conscious experience.
>> I think we need to justify our assumption that consciousness is absent in
>> such cases. When things lack the ability to talk or remember, that can
>> easily be taken as a case where no consciousness is present.
>> But to me this isn't enough to reach any firm conclusion as to the presence
>> or absence of a mind.
> ### Yes, absolutely. We can work backwards from the neural correlates of
> consciousness in humans, look for analogous structures in other entities
> (animals, AI) and if we see neither an analogue nor the kind of complex
> behavior that in humans is associated with conscious processing, then we
> are reasonably justified in believing the entity is not conscious in the
> way a human is.