[ExI] My guesses about GPTs consciousness
William Flynn Wallace
foozler83 at gmail.com
Sun Apr 16 16:12:13 UTC 2023
One cannot sense without consciousness. - Jason
Oh yes we can - dreams. Visual, mostly, rarely auditory, never touch,
smell or taste (unless some chat member reports any of those.) bill w
On Sun, Apr 16, 2023 at 12:25 AM Rafal Smigrodzki via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> On Sun, Apr 9, 2023 at 12:16 PM Jason Resch via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>> Smart doorbell systems able to detect the presence of a person in
>> proximity to a door and alter behavior accordingly have some primitive
>> sensory capacity. One cannot sense without consciousness.
> ### I am not so sure about that. Are you familiar with the phenomenon of
> blindsight? Patients with certain brain lesions who claim to be unaware of
> (not consciously aware of) a visual target and yet physically react to the
> target when it is present?
> This is one of the reasons why I do not subscribe to e.g. panpsychism and
> do not believe all behaving animals have consciousness. There is a whole
> lot of complicated information processing that can guide goal-oriented
> behavior that can happen without conscious experience. Consciousness that
> we experience is something that requires a lot of neural hardware that is
> absent or much different in other animals, and when this hardware is
> disturbed in us, it distorts or eliminates consciousness, in part or in whole.
> GPT has a lot of intelligence and I think it does have a sort of
> consciousness but I am guessing it is completely different from an awake
> human. Here are some of the reasons why I think so:
> 1) Almost all of the cognitive heavy lifting that leads to GPT's answers
> takes place during training. The billions of parameters that determine
> GPT-4's intelligence were set in silicon last year.
> Our interactions with it use the pre-trained structure as a sort of
> look-up table.
> 2) Human consciousness involves continuous information transfer in a loop
> between the global workspace structures in the prefrontal cortex and the
> distributed knowledge throughout specialized cortical areas. GPT doesn't
> seem to have anything of this kind (but my knowledge of its structure is
> hazy, so maybe I am wrong). If GPT is conscious, it's more like being in a
> delirium, flashing in and out of focus rather than having a continuous
> stream of consciousness.
> 3) GPT does not have proprioceptive and visceral sensory input, does not
> have drives controlled by body states (hunger, thirst, lust). It has
> cognitive models of such drives, just as we can imagine, but not
> experience, the interests of other animals. So GPT could fake the verbal
> output of a human responding to instinctive drives but it does not
> experience them.
> 4) I do not know what structures arose in GPT-4 to be able to process
> sensory (e.g. visual) information. If they are different from the human
> sensory cortex, the corresponding qualia might also be completely different
> from human qualia.
> My guess is that GPT's consciousness is like a lobotomized genius human
> polymath storyteller who is kept sedated with ketamine and suffers from
> locked-in syndrome, and is barely hanging on but still smart enough to
> impress us chumps.
> Things will get interesting when he wakes up.
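Rafal's point (1) above - that the parameters are frozen after training, so every later interaction is just a pass through the same fixed structure - can be sketched with a toy example. This is purely illustrative Python/NumPy, not GPT's actual architecture; the tiny weight matrix and softmax stand in for billions of pre-trained parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny one-layer "model": after training, the weights are
# frozen and never change between prompts.
W_frozen = rng.standard_normal((4, 3))

def respond(prompt_vector):
    """Inference: a deterministic forward pass through the frozen weights."""
    logits = prompt_vector @ W_frozen
    # softmax over a toy 3-token vocabulary
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

prompt = np.ones(4)
out1 = respond(prompt)
out2 = respond(prompt)
# Same prompt, same frozen weights -> identical output; nothing in the
# model itself is updated by the interaction.
assert np.allclose(out1, out2)
```

The sketch makes the "look-up table" intuition concrete: between prompts no state in the model changes, so whatever cognition happened, it happened at training time (sampling temperature in real systems adds randomness to the output, but still without modifying the weights).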