[ExI] My guesses about GPTs consciousness
efc at swisscows.email
Sun Apr 16 11:04:59 UTC 2023
What is your definition of consciousness? It would be much easier to
understand if I also knew your definition of consciousness.
On Sun, 16 Apr 2023, Rafal Smigrodzki via extropy-chat wrote:
> On Sun, Apr 9, 2023 at 12:16 PM Jason Resch via extropy-chat <extropy-chat at lists.extropy.org> wrote:
> Smart doorbell systems able to detect the presence of a person in proximity to a door and alter behavior accordingly have some
> primitive sensory capacity. One cannot sense without consciousness.
> ### I am not so sure about that. Are you familiar with the phenomenon of blindsight? Patients with certain brain lesions claim to
> be unaware of (not consciously aware of) a visual target and yet physically react to the target when it is present.
> This is one of the reasons why I do not subscribe to e.g. panpsychism and do not believe all behaving animals have consciousness.
> There is a whole lot of complicated information processing that can guide goal-oriented behavior without conscious
> experience. The consciousness that we experience requires a lot of neural hardware that is absent or very different in
> other animals, and when this hardware is disturbed in us, it distorts or eliminates consciousness, in part or wholly.
> GPT has a lot of intelligence, and I think it does have a sort of consciousness, but I am guessing it is completely different from that of
> an awake human. Here are some of the reasons why I think so:
> 1) Almost all of the cognitive heavy lifting that leads to GPT's answers takes place during training. The billions of parameters that
> determine GPT-4's intelligence were set in silicon last year.
> Our interactions with it use the pre-trained structure as sort of a look-up table.
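[The "pre-trained structure as a look-up table" point can be illustrated with a toy model: at inference time the trained weights are fixed, and each query is just a forward pass through them with no further learning. This is a minimal sketch with made-up numbers, not GPT's actual architecture.]

```python
import math

# Toy "trained" model: these weights were fixed at training time and are
# never updated during a query. The numbers are purely illustrative.
WEIGHTS = {"hello": [0.9, 0.1], "world": [0.2, 0.8]}

def softmax(xs):
    # Standard numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def forward(token):
    # Inference only *reads* the pre-set weights; no parameter changes here.
    return softmax(WEIGHTS[token])

print(forward("hello"))  # a probability distribution over two outputs
```

However many times we query it, the model's parameters stay exactly as training left them; all variation comes from the input, not from any ongoing adaptation.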
> 2) Human consciousness involves continuous information transfer in a loop between the global workspace structures in the prefrontal
> cortex and the distributed knowledge throughout specialized cortical areas. GPT doesn't seem to have anything of this kind (but my
> knowledge of its structure is hazy, so maybe I am wrong). If GPT is conscious, it's more like being in a delirium, flashing in and
> out of focus rather than having a continuous stream of consciousness.
> 3) GPT does not have proprioceptive and visceral sensory input, does not have drives controlled by body states (hunger, thirst,
> lust). It has cognitive models of such drives, just as we can imagine, but not experience, the interests of other animals. So GPT
> could fake the verbal output of a human responding to instinctive drives but it does not experience them.
> 4) I do not know what structures arose in GPT-4 to be able to process sensory (e.g. visual) information. If they are different
> from the human sensory cortex, the corresponding qualia might also be completely different from human qualia.
> My guess is that GPT's consciousness is like that of a lobotomized genius human polymath storyteller who is kept sedated with ketamine and
> suffers from locked-in syndrome, and is barely hanging on but still smart enough to impress us chumps.
> Things will get interesting when he wakes up.