[ExI] Qualia are incommensurate
brent.allsop at gmail.com
Fri Jul 12 04:37:40 UTC 2019
You seem to be talking about computationally bound composite qualia in a
way that seems almost blind to elemental qualia. Elemental qualia, like
redness and greenness, can be computationally bound into all the composite
qualia you are talking about. I'm talking about the elemental physical
quality that can be physically isolated from the stuff you are talking
about. Redness is something physical, out of which composite conscious
experience like you describe can be built. It's just the redness.
On Tue, Jul 9, 2019 at 11:52 PM Rafal Smigrodzki <rafal.smigrodzki at gmail.com> wrote:
> While perusing the other qualia thread, it occurred to me that qualia
> experienced by different neural networks are incommensurate, even when the
> networks process the same inputs and produce the same outputs.
> There is, for all practical purposes, an infinity of ways to wire up a
> visual system reliably capable of parsing an image and assigning measures
> of reflectance (colors) to various parts of the image. Each of these
> networks uses synaptic connections that are, in detail, completely different
> from those of the other networks, and yet all the networks would agree on
> the colors of any specific image. There is objective agreement between
> networks, but trying to simply mash together any two networks into a single
> one would completely break them, since their precise synaptic circuits are
> completely different.
> If the networks have a subjective experience, it is thus created by
> completely different circuits. If qualia are a function of such circuits'
> performing some physical actions, then the differences between the
> circuits' structures should produce different qualia in each network, even
> when looking at the same image and naming the same colors. I am not talking
> about a network using another network's "red" qualia to code for its own
> green outputs, but rather I think their qualia are completely
> incommensurate, like speaking a completely different internal language.
> I don't know if this notion was explored in the other qualia thread
> (TLDR). I hope I am not retreading what's been said before.
> I guess that a sophisticated understanding of neural network functioning
> at the synapse level, and synapse-by-synapse manipulation of a network that
> reports on its qualia might allow us to make some inroads into the qualia
> question, over and above the arm-waving arguments we have now.
> Once I am uploaded and take a bunch of AI programming courses I'll do some
> experiments on my neural structure, and report here on how it feels.
> I expect it will be quite groovy.
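The "different wiring, same behavior" point above can be sketched concretely. One well-known source of such redundancy is permutation symmetry: relabeling the hidden units of a network changes every weight matrix while leaving the input-output mapping untouched. A minimal numpy illustration (the network shape and sizes here are arbitrary assumptions, not anything from the post):

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny two-layer network: x -> relu(x @ W1 + b1) @ W2
W1 = rng.normal(size=(3, 8))
b1 = rng.normal(size=8)
W2 = rng.normal(size=(8, 2))

def net(x, W1, b1, W2):
    return np.maximum(x @ W1 + b1, 0) @ W2

# Permute the hidden units: shuffle W1's columns, b1's entries,
# and W2's rows with the same permutation. Every weight changes,
# but the composed function is identical.
perm = np.roll(np.arange(8), 1)
W1p, b1p, W2p = W1[:, perm], b1[perm], W2[perm, :]

x = rng.normal(size=(5, 3))
assert not np.allclose(W1, W1p)            # different "wiring"
assert np.allclose(net(x, W1, b1, W2),
                   net(x, W1p, b1p, W2p))  # same outputs
```

Permutation is only the simplest case; networks trained independently from different initializations diverge far more radically while still agreeing on their outputs, which is the stronger version of the claim in the post.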
> extropy-chat mailing list
> extropy-chat at lists.extropy.org