[ExI] [Extropolis] Woo Hoo, I convinced GPT-3 it Isn't Conscious.
brent.allsop at gmail.com
Sun Aug 29 21:44:54 UTC 2021
We've gone over this many times, but your model seems to be missing
representations of redness and greenness, as distinct from red and green.
So it appears that everything I say gets mapped into your model, leaving it
absent of what I'm trying to say. Here you are talking about only the 3rd:
Strongest form of effing the effable, where you directly computationally
bind another's phenomenal qualities into your own consciousness.
Both the 3rd, strongest, and the 2nd, stronger, forms, where you
computationally bind something you have never experienced before into your
consciousness, require brain hacking.
The 1st, weakest form of effing the ineffable, I was using with Emerson, is
different. It does not require brain hacking. All it requires is
objective observation and communication in a way that distinguishes between
red and redness, and that can model differences in specific intrinsic
qualities. If one uses only the one abstract word "red" for all things
representing red knowledge, you can't model differences in the various
intrinsic qualities which may be representing red. For the weakest form of
effing the ineffable, all you need is a phenomenal definition for
subjective terms like "redness", enabling you to communicate things with
well-defined terms, as in this example effing statement: "My redness is like
your greenness, both of which we call red."
Also, thanks to all your endless help, I think I have a better
understanding of our differences. I would like to get these differences
between your "Functional Property Dualist" camp and the "Qualia are
Material Qualities" camp canonized. Let me see if you agree that this is a
good way to concisely describe our differences.
Functionalists, like James Carroll and yourself, use the
neuro-substitution argument and assume that a neuron functions similarly to
the discrete logic gates in an abstract CPU.
You also assume that ALL computation operates this way, which is why you
think you can claim that the neuro-substitution argument applies to all
possible computational cases, justifying your belief that your
neuro-substitution argument is a "proof" that qualia must be functional in
all possible computational instances.
Whereas Materialists, like Steven Lehar and me, think this way of thinking
about consciousness, or making this assumption, is WRONG.
We believe that within any such purely functional system of abstract
discrete logic, there can be nothing that is the intrinsic qualities which
represent information, like redness or greenness, and there is no way to
perform the necessary "computational binding" of such intrinsic qualities.
As you so aptly point out, discrete logic gates can't do this kind of
computational binding. Both the intrinsic qualities and their computational
binding are required so one can be aware of 2 or more intrinsic qualities
at the same time, which is the very definition of consciousness.
Even if there were some "function" from which redness emerged, you could
use the same neuro-substitution argument to "prove" that redness can't be
Since you completely leave intrinsic qualities like redness out of your way
of thinking, you don't seem able to model this all-important difference,
which is so critical for me.
On Sat, Aug 28, 2021 at 3:28 PM Stathis Papaioannou <stathisp at gmail.com>
> On Sat, 28 Aug 2021 at 14:05, Brent Allsop <brent.allsop at gmail.com> wrote:
>> See the transcript I had with Emerson
>> <https://gpt3demo.com/apps/quickchat-emerson>, today, here: "I Convinced
>> GPT-3 it Isn’t Conscious".
> How could you comment on Emerson’s consciousness without connecting
> yourself to Emerson’s circuits?
> Stathis Papaioannou