[ExI] GPT-4 on its inability to solve the symbol grounding problem
gordon.swobe at gmail.com
Sun Apr 9 01:15:09 UTC 2023
On Sat, Apr 8, 2023 at 4:18 PM Jason Resch via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
>> Here's the rub: if my smart doorbell has awareness and if awareness is
>> what defines consciousness, then how about my automobile?
> Does it have adaptive cruise control?
Your answer doesn't strike you as absurd? Should the car manufacturer be
advertising it as "conscious adaptive cruise control"? I might be willing
to pay more for that feature. :-)
> In the case of the boxer, what he loses is the ability to form new
> memories which will be accessible to the part(s) of his brain that can
> talk when he wakes up. Not all parts of his brain will necessarily be
> unconscious when he is knocked out.
When he is knocked out, he is unconscious: lacking consciousness, unaware
of anything, with no sensory experience, similar to being in a coma or
asleep and not dreaming. We all know what the word means. Yes, that does
not mean his entire brain is dead, but he is unconscious.
> For example, if smelling salts can still awaken him, then the part of his
When he awakens, he is no longer unconscious.
> If you define consciousness in terms of human consciousness, then only
> humans are conscious, by definition.
That is the only kind of consciousness with which we have any familiarity.
I think it is reasonable to infer something similar in other people and in
other higher mammals, as their anatomies, nervous systems, lives, and
behaviors are so similar to ours, but things start to get sketchy as we go
down the food chain. In the effort to justify the belief that even
software can be conscious, people find themselves saying all sorts of
silly things, for example that doorbells and cars are conscious. Their
arguments lose by reductio ad absurdum except on ExI, where anything goes.
:-)