[ExI] Fwd: New article: EM Field Theory of Consciousness

Brent Allsop brent.allsop at gmail.com
Fri Jun 17 20:49:34 UTC 2022


Hi Colin,
Thanks for all this....  but...

One of these days, I'll figure out how to describe this simple concept, and
everyone will finally get it.  Clearly, I haven't succeeded in describing
this to anyone here, yet.


On Fri, Jun 17, 2022 at 1:11 AM Colin Hales via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> 2) bringing explanation of the 1st person perspective requires an
> epistemic upgrade to the standard model of particle physics.
>

No.  An abstract textual description of any possible new physics still
will not bridge the explanatory gap.
It remains a fact that you cannot communicate to anyone what redness (or
any other quale, or any new physics) is like with mere abstract textual
descriptions (like the text in this email), which is all we get from our
abstract senses.


It is a fact that there could be something we are already describing (e.g.
we can describe how glutamate reacts in a synapse) which is, in fact, the
description of the behavior of redness.
It's just that only when we directly apprehend those intrinsic qualities,
as computationally bound conscious knowledge, can we know the intrinsic
qualities of what we are abstractly observing and describing.


On Fri, Jun 17, 2022 at 1:22 PM Stathis Papaioannou via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> Qualia are epiphenomenal if the physical world is causally closed. So when
> Jackson writes his paper, the movement of his hand is entirely explained by
> the observable physical forces on the hand. If he has qualia, they cannot
> have any separate causal efficacy of their own, because if they did, to an
> observer it would look like the hand was moving contrary to the laws of
> physics, due to some magical force.
>

No, this is not the case.  It is simply a fact that abstract observation
and description of what we can directly apprehend as computationally bound
conscious knowledge tells us nothing about what it is like.
You can say that 650 nm light results in redness, and 500 nm light results
in greenness, but until you experience the redness itself, you can't know
the quality of what is being abstractly described.

The text in this e-mail can't tell you what redness is like (nor can
anything else that comes through abstracting senses).  You simply need a
picture like this, so you can point to it and say "THAT is red," giving you
the required dictionary:
[image: 3_functionally_equal_machines_tiny.png]
Pointing like this results in your direct apprehension of your knowledge of
something we can already fully describe (we just don't know which of all
our descriptions is the description of redness).  That gives us the
dictionary: the objective behavior of glutamate (or whatever it turns out
to be) = the behavior of subjective redness.

All we need to bridge the explanatory gap are dictionaries between our
objective descriptions and what we directly apprehend that same stuff to
be, as computationally bound conscious knowledge.

We can surely already observe both redness and greenness causally
interacting in the brain.  We just don't yet know which of all our
descriptions of stuff in the brain is the description of redness.

All we need is the dictionary that connects our abstract descriptions with
what we directly apprehend as computationally bound conscious knowledge.

We already know everything we need; the only thing that remains is to
connect our objective descriptions with what we can subjectively, directly
apprehend as conscious knowledge.

It's not an impossibly hard problem; it's just a color dictionary problem.
Easy shmeezy.
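
If it helps, here is a toy sketch (in Python) of the kind of dictionary I
mean.  The entries are made-up placeholders, of course; discovering which
objective descriptions actually go with which subjective qualities is the
whole remaining task:

    # Toy sketch only, with made-up placeholder entries: a mapping from our
    # objective descriptions of stuff in the brain to the subjective quality
    # we directly apprehend that same stuff as.
    qualia_dictionary = {
        "behavior of glutamate in a synapse": "redness",    # placeholder guess
        "behavior of glycine in a synapse": "greenness",    # placeholder guess
    }

    def what_is_it_like(objective_description: str) -> str:
        # Look up which subjective quality an objective description picks out.
        return qualia_dictionary.get(objective_description, "not yet in the dictionary")

    print(what_is_it_like("behavior of glutamate in a synapse"))  # -> redness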
-------------- next part --------------
A non-text attachment was scrubbed...
Name: 3_functionally_equal_machines_tiny.png
Type: image/png
Size: 26214 bytes
Desc: not available
URL: <http://lists.extropy.org/pipermail/extropy-chat/attachments/20220617/975a7e07/attachment-0001.png>

