[ExI] are qualia communicable? Was Why stop at glutamate?

Jason Resch jasonresch at gmail.com
Sat Apr 15 02:38:16 UTC 2023

On Fri, Apr 14, 2023, 9:44 PM Brent Allsop via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> My prediction is Jason is making things WAY too complicated, and talking
> about everything BUT what is important.

That wasn't my intention. Do you have a simpler way to show why qualia
can't be communicated?

> And even all the stuff he is talking about will be possible, just a lot
> more complex to achieve.

Can you explain how it's possible, in principle?

> An elemental quality of everything Jason is describing is a standalone
> pixel of a redness quality.

I don't believe in such things. A pixel of redness only exists by virtue of
its relations to all the rest of a vastly complex brain. You can't pluck
it out of the brain and treat it as an independent elemental entity.

> This single pixel could change to a greenness quality.

If there's some change to the state of the brain, this is possible.

> Sure, there are a lot of different memories and feelings that one pixel
> would invoke differently, in different people.  But all that other stuff
> doesn't matter,

I would say it is only those things that matter and serve to make red what
it is and how it feels to perceive it.

> only the elemental qualities do.
> This pixel of elemental redness, and the resulting change from redness to
> greenness, must be an identity match with some objective description of the
> same.
> It is simply a matter of discovering what this identity is, and figuring
> out how elemental redness can be computationally bound with all the other
> stuff that would be different in different brains.
> My prediction is that we will discover which of all our descriptions of
> stuff in the brain is a description of redness. We will finally know which
> camp is THE ONE, we will finally know the true color properties of things,
> hard problem solved, and we can eff the ineffable, since our terms for and
> properties of our subjective experiences would then be objectively grounded.

I don't see how you can identify the common element between two
individuals' red experience when there's no way (that I see) to determine
whether or when two individuals even have the same red experience. Can you
explain this process to me?


> On Fri, Apr 14, 2023 at 8:47 AM efc--- via extropy-chat <
> extropy-chat at lists.extropy.org> wrote:
>> On Fri, 14 Apr 2023, Jason Resch via extropy-chat wrote:
>> > Even if Alice outputs her whole brain state, A(I), at best her friends
>> only interpret it and reach states:
>> >
>> > B(A(I)), C(A(I)), D(A(I)), E(A(I))
>> >
>> > Do you see a way around this? Can Alice output something that anyone,
>> upon seeing it, will have the same experience as Alice has?
>> >
>> Not without a serious dose of science fiction and a weakening or
>> redefinition of the term "same experience".
>> If by same experience we want same time, location, hardware and
>> software state, B would have to be "turned into" A, but B could not be
>> both A and B, so if B is turned back from A to B, I cannot see
>> how it could be done. It almost feels more like a logic problem
>> than a philosophy problem. ;)
>> Best regards,
>> Daniel
>> _______________________________________________
>> extropy-chat mailing list
>> extropy-chat at lists.extropy.org
>> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
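Daniel's point above can be rendered as a toy sketch. This is purely illustrative (the function names and tuple "states" are assumptions, not a claim about how brains work): each receiver applies its own interpretation function to Alice's output, so no receiver ends up in Alice's state A(I).

```python
# Toy model: each mind is an interpretation function mapping an input to
# an internal state. Even a complete dump of Alice's state, A(I), gets
# re-interpreted by each receiving mind, producing B(A(I)), C(A(I)), etc.

def A(x):  # Alice's interpretation function
    return ("A", x)

def B(x):  # Bob's interpretation function
    return ("B", x)

def C(x):  # Carol's interpretation function
    return ("C", x)

alice_state = A("input")      # A(I): Alice's experience of the input
bob_state = B(alice_state)    # B(A(I)): Bob's experience of Alice's dump
carol_state = C(alice_state)  # C(A(I)): Carol's experience of the dump

# No receiver ends up in Alice's state; each result is wrapped in the
# receiver's own interpretation, and the receivers differ from each other.
assert bob_state != alice_state
assert carol_state != alice_state
assert bob_state != carol_state
```

The only way `bob_state` could equal `alice_state` is if B were the identity function on A's outputs, which is the "B would have to be turned into A" case Daniel describes.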