[ExI] What camps get right and wrong (was: (Definition of Consciousness (Was Re: My guesses about GPTs consciousness))

Jason Resch jasonresch at gmail.com
Thu Apr 20 16:31:17 UTC 2023


On Thu, Apr 20, 2023 at 8:59 AM Brent Allsop via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> OK, here it is, the Red Herring camp
> <https://canonizer.com/topic/88-Theories-of-Consciousness/81-Qualia-are-Red-Herrings>
> .
>
> You'll need to create an identity on Canonizer.  It is against the terms
> of service <https://canonizer.com/terms-and-services>, to create more
> than one identity on the system, as that is cheating (ie. no sock puppets).
>


In another thread, I said that in my opinion, every "camp" gets something
right, but also every camp contains some error. I thought I would attempt
to flesh out what I see as the correct and incorrect ideas present in each
camp. I welcome any critiques or requests for elaboration.

*Interactionist Dualism:*

*What this camp gets right:* There is something profoundly distinct about
consciousness, separating it from the physical. Even modern physics
recognizes a divide between the subjective and objective, the observer and
the observed. Some say the universe and observer exist as a mutually
dependent pair, as neither can exist without the other. This does imply a
dual nature to reality. We can also now understand consciousness as an
abstract pattern, making it in a sense immaterial, as Descartes and others
before him supposed, even if particular incarnations of that pattern are
not.

*What this camp gets wrong:* It assumes that nothing so complex as the
human mind could be implemented as a machine that follows mechanical rules.
It assumes that the human mind intercedes in and violates the laws of
physics. In effect, the idea is self-defeating: if the soul both affects
and is affected by physics, would the soul not be a physical object, just
as everything else in physics affects and is affected by other physical
objects?

*Take away:* There is something special about consciousness; our universe
has at least two aspects at some deep level.

*Psychophysical Parallelism:*

*What this camp gets right:* Leibniz lived in a time that recognized the
conservation of momentum (not just energy, which Descartes knew of), thus
making the idea of a soul that modifies the movement of particles
impossible. This led Leibniz to accept the causal closure of physics:
physical effects follow strictly from physical causes.

*What this camp gets wrong:* It assumes no interaction between
consciousness and the material universe. They are entirely disconnected and
unrelated to each other (except by an edict of God that makes them run in
parallel). This makes the physical universe redundant and unnecessary,
supporting idealism (why bother having the material universe if it does
nothing to generate consciousness?). It also supports the idea of a zombie
world (if one instead discards the realm of thoughts and consciousness but
keeps the physical universe).

*Take away:* Physics is causally closed; consciousness must be framed as
compatible with the causal closure of physics.

*Epiphenomenal Dualism:*

*What this camp gets right:* Once one accepts the causal closure of
physics, the idea that consciousness must be completely ineffectual and
unnecessary to the ordered flow of physical events immediately presents
itself. It is true that one could look at only the atomic or molecular
interactions in a brain and not consider the presence of consciousness when
viewing those interactions. There would be no unexplainable events in terms
of the atomic/molecular interactions.

*What this camp gets wrong:* It assumes consciousness has no effects. But
then what causes one to talk about consciousness? Or to discuss one's
conscious states, or even to propose the theory of epiphenomenalism in the
first place? What this theory misses is that there are a myriad of
different levels of causality operating within the brain, and while one
does not need to consider them when considering the low-level phenomena of
molecules bouncing around, one does need to consider them when explaining
how one comes to describe a theory of consciousness such as
epiphenomenalism.

*Take away:* While physics is causally closed, there are multiple levels of
causality that must be considered, and are essential to describing and
understanding certain high level behaviors. It is similar to describing the
operations of a computer program in terms of its code as opposed to
attempting to explain it in terms of the electrical fields in a chunk of
silicon.
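
To make that point concrete, here is a small toy sketch (my own, purely
illustrative, in Python, and not anything claimed about brains): the same
process can be fully and correctly described at the level of individual NAND
gates, with nothing left unexplained at that level, yet the causal
description that matters for why the circuit exists and how it is used lives
at the higher level of "it computes XOR."

# Toy illustration of two levels of description of one process.

def nand(a: bool, b: bool) -> bool:
    """Low-level primitive: a single NAND gate."""
    return not (a and b)

def xor(a: bool, b: bool) -> bool:
    """High-level description: exclusive-or, built entirely from NAND gates."""
    c = nand(a, b)
    return nand(nand(a, c), nand(b, c))

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            print(f"xor({a}, {b}) = {xor(a, b)}")
    # The gate-level story is causally closed and complete, yet
    # "this circuit computes XOR" is the higher-level description
    # needed to explain the circuit's role and behavior.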

*Idealism:*

*What this camp gets right:* In a very real sense, everything we have
access to is merely a thought or idea in someone's consciousness. We can
never get outside this perspective. We can never access the real world, or
the objects within it, only our perceptions. We have no evidence the
physical world is real, and not a dream, simulation, Boltzmann brain, etc.

*What this camp gets wrong:* It ignores that there is an order and
structure to our conscious experience. When one perceives a ball going up,
it is usually followed by the perception of a ball coming back down. When
one takes a drug or gets hit on the head, one's conscious perception can
alter or cease. Whatever consciousness is, there is some explanatory law
underlying it and defining probable sequences of experiences.

*Take away:* Even if physics is not primary, there must be some explanation
for the existence of our conscious states and why experiences seem to
follow some kind of laws.

*Materialism/Physicalism:*

*What this camp gets right:* Our experiences appear bound to models of a
material world that follows physical laws. Physical or material
perturbations to the brain can alter states of consciousness.

*What this camp gets wrong:* It often rejects or denies the relevance, or
sometimes the existence, of consciousness, calling it an illusion,
non-existent, or inessential, when in actuality consciousness is the thing
we are most certain of. It cannot be an illusion, as there would still have
to be a perspective to fall for the illusion. It must exist, as without it
we wouldn't even know of a physical/material world. It cannot be
inessential, as we have to include it to explain our talking about
consciousness, an effect which has measurable and detectable consequences
in the material/physical world.

*Take away:* Our consciousness is bound up with the rules and laws of what
is, at minimum, an apparent physical world.

*Mind-Brain Identity Theory:*

*What this camp gets right:* There cannot be a change in consciousness
without there being some change in the brain. States of consciousness are
related to and bound to brain states.

*What this camp gets wrong:* It assumes that there is a one-to-one mapping
of brain states to states of consciousness. But the physical states of our
brains are in constant flux; consider, for example, the billions of
neutrinos coursing through them at every moment. It seems there are some
things that can change
in a brain without changing one's state of consciousness. Therefore, there
must be a many-to-one relationship between brain states and states of
consciousness. We find remarkable differences in neurophysiology across
species, but despite different brains, many of those species are able to
experience pain. This suggests there are multiple ways of realizing the
same conscious state with different material organizations, and thus there
cannot be a one-to-one identity.

*Take away:* States of consciousness can be multiply realized. That is,
there are multiple ways one could build a brain and achieve the same state
of consciousness.
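
A toy analogy from programming (my own, purely illustrative, and not a
claim about how brains work): two procedures with entirely different
internal states realize one and the same function, so the mapping from
realization to function is many-to-one.

# Toy illustration of multiple realizability.

def factorial_recursive(n: int) -> int:
    """One realization: recursion."""
    return 1 if n <= 1 else n * factorial_recursive(n - 1)

def factorial_iterative(n: int) -> int:
    """A second realization: iteration over a running product."""
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

if __name__ == "__main__":
    # The internal states of the two realizations differ at every step,
    # yet at the functional level they are the same mapping.
    assert all(factorial_recursive(n) == factorial_iterative(n) for n in range(10))
    print("Same function, differently realized.")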

*Biological Naturalism:*

*What this camp gets right:* That the brain is ultimately a machine, and
that appropriately programmed machines can replicate human behavior. Also,
it is true that one can simulate a Chinese speaker's mind without
personally gaining access to the understanding of that Chinese speaker's
mind. One can simulate a rainstorm without flooding the server room. Causal
properties are essential to understanding the capacities of minds.

*What this camp gets wrong:* That the person doing the simulation of the
Chinese speaker's brain would gain the perspective of that speaker's mind.
That philosophical zombies are possible. That there would be no perspective
invoked which does understand Chinese. It ignores the concept of levels
within simulations and emulations. Consider a dreaming person: their dream
of being soaked in the midst of a rainstorm will not soak their room. But
within their view inside the dreaming brain, they perceive
being soaked just as they would in a real rainstorm. The conscious
perception seen by the person having a dream is quite different from the
perception of someone outside the dream watching them stir in bed.
Likewise, computing a Chinese speaker's brain, even if you are aware of
every step of the vast computation, does not transform your brain to
perceive the perspective of that Chinese speaker. We have to acknowledge
the existence of different vantage points, an inside and outside view, when
it comes to states of consciousness.

*Take away:* There is a difference between an inside and an outside view:
the view of a brain or computer doing the processing, and the perspective
of the person or program realized by that brain or computer. One can have
complete objective knowledge of the outside view, but this by itself does
not alter or expand one's understanding of, or ability to directly
perceive, what it is like to have and experience the inside view.

*Functionalism:*

*What this camp gets right:* Multiple realizability implies that the
functions and behaviors, what a brain "does", are ultimately what is
important for defining the behavioral capacities of a system. What a brain
is made of is unimportant. This is further confirmed by the realization of
computer scientists that computers can be made from any number of things;
so long as they can implement a Turing machine, they are capable of
computing anything that any other digital computer can. What the brain does
appears to be within the domain of what digital computers can do, and thus
an appropriately programmed computer, regardless of what it's made of, can
replicate all the behaviors of a human brain, and, rejecting zombies, this
perfect emulation must also be conscious in the same way.
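
As a purely illustrative sketch of the universality point (my own toy code,
not a claim about any particular brain or machine): any substrate able to
carry out this simple read-write-move loop, whatever it is physically made
of, can run the same computations. Here a minimal Turing-machine loop
increments a binary number written on its tape.

# Toy Turing machine: rules map (state, symbol) -> (new state, write, move).
from collections import defaultdict

def run_turing_machine(rules, tape, state="start", halt="halt", blank=" ", steps=1000):
    cells = defaultdict(lambda: blank, enumerate(tape))
    head = 0
    for _ in range(steps):
        if state == halt:
            break
        state, cells[head], move = rules[(state, cells[head])]
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip()

# Binary increment: scan right to the end, then carry 1s back to the left.
rules = {
    ("start", "0"): ("start", "0", +1),
    ("start", "1"): ("start", "1", +1),
    ("start", " "): ("carry", " ", -1),  # passed the rightmost digit
    ("carry", "1"): ("carry", "0", -1),  # propagate the carry leftward
    ("carry", "0"): ("halt",  "1", -1),  # absorb the carry
    ("carry", " "): ("halt",  "1", -1),  # carry past the leftmost digit
}

print(run_turing_machine(rules, "1011"))  # prints "1100", i.e. 11 + 1 = 12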

*What this camp gets wrong:* As Putnam pointed out, multiple realizability
can be applied to undermine functionalism itself, as many functions can be
multiply realized. Which functions are required, then? What is the point at
which substitution is possible without altering the mind state implemented?
It is often assumed by functionalists that they can determine what level of
fidelity/accuracy is required to instantiate a particular mind. But there
is the notion of the functional substitution level, which may not be
determinable even in principle. Moreover, many assume that functionalism,
or its digital form (computationalism), implies physics is fundamentally
digital. But there are subtle arguments that suggest the opposite: that if
the mind is digital, then physics must not be digital. If infinite possible
computations underlie one's experience, this further complicates the
possibility of discovering which particular computations instantiate one's
mind.

*Take away:* What the brain does is, as far as we know, computable. An
appropriately programmed computer could, under certain assumptions,
instantiate a conscious state. But we may not have any objective or
subjective way of linking the objective functional description with a
particular subjective experience.


Jason

