[ExI] The Problem of Mental Causation
Jason Resch
jasonresch at gmail.com
Mon May 5 13:59:09 UTC 2025
On Sun, May 4, 2025, 6:39 PM Stuart LaForge via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> On 2025-05-04 06:38, Jason Resch via extropy-chat wrote:
> > Stuart,
> >
> > I was hoping you would have something deep and insightful to add, you
> > don't disappoint!
>
> Thanks, Jason. Really though it is a result of your own insightful
> investigation into consciousness overlapping with and often being
> tangential to my own investigation into the general phenomenon of
> emergence and emergent properties.
Thank you for saying that, I appreciate it.
> Unfortunately most of my
> investigation currently consists of hand-written notes and mathematics
> that I cannot easily share at the moment.
>
When you do publish your results I'll be very interested to see them.
Please let me know when they're ready to share. :-)
> > On Sat, May 3, 2025, 3:09 PM Stuart LaForge via extropy-chat
> > <extropy-chat at lists.extropy.org> wrote:
> >
> >> On 2025-04-30 10:17, Jason Resch via extropy-chat wrote:
> >>> One of the great puzzles when it comes to understanding
> >>> consciousness and its role in the universe is the question of how
> >>> conscious thoughts could have any causal power in a universe fully
> >>> governed and determined by particles blindly following forces of
> >>> nature.
> >>
> >> Thinking and information processing, conscious or otherwise, have
> >> causal power through their information content. This is a direct
> >> application of Landauer's principle. Mental causation is exactly how
> >> Maxwell's Demon works. It uses its knowledge of the positions and
> >> momenta of all the individual particles of gas to create a
> >> temperature gradient. Maxwell's Demon seems to violate the 2nd law of
> >> thermodynamics by decreasing the entropy of the gas. But this is not
> >> the case, because in the process of memorizing the positions and
> >> momenta of every particle in the gas, enabling it to increase the
> >> system's potential energy, the Demon increased the entropy or
> >> information content of its own brain or data storage. This could only
> >> have been done by erasing whatever information was there before and
> >> incurring some minimal energy cost given by the Landauer principle,
> >> E >= k * T * ln(2), with k being the Boltzmann constant and T the
> >> temperature in kelvin.
> >
> > To be clear, are you equating the causal potency of information with
> > its necessary generation/storage always incurring a cost of increasing
> > entropy elsewhere? Or is this just one example of how information (or
> > its processing) can have physical effects?
>
> To be clear, the Landauer principle or limit is the LOWER bound of the
> physical effect a "thought" can have on the world, because it is the
> physical cost of allocating and overwriting memory to have that thought,
> which is itself a physical action that takes energy and increases the
> entropy of the universe. So information, at a minimum, incurs the cost
> to keep track of it, and that is its basal causal power. However,
> information itself has a latent potential energy, as exemplified by
> Maxwell's demon.
Hmm, I had never recognized the connection between negative
entropy/information and potential energy, but it does seem to fit, and as
more than just an analogy.
> As the early scientist and late sorcerer Francis Bacon,
> a.k.a. Dr. Mirabilis, once wrote, "Knowledge is power." This is the
> simple observation that information can act as a catalyst for extracting
> "hidden" potential energy from the environment to the direct benefit of
> the system capable of tracking that information. This is how enzymes
> operate by using information to lower energy barriers for chemical
> reactions. This is how mitochondria work by using electron tunneling to
> create a proton gradient across a membrane. Being able to organize
> far-flung elements from the environment into nuclear weapons is another
> example of the causal power of information. So I use Maxwell's demon as
> an example of a general class of systems that exhibit similar behavior
> all up and down the emergence scale with conscious brains being one of
> these systems fairly high up on the scale.
>
This is a fascinating picture of the world.
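It also suggests a toy model one could actually run. Here is a minimal
sketch of my own (not drawn from your notes) of a demon sorting fast and
slow molecules between two chambers, using nothing but its knowledge of
each molecule's energy:

import random

random.seed(0)
# Toy model: molecular "energies" drawn from the same distribution on
# both sides, so both chambers start at the same temperature.
left = [random.gauss(0, 1) ** 2 for _ in range(1000)]
right = [random.gauss(0, 1) ** 2 for _ in range(1000)]
threshold = 1.0  # the demon's (arbitrary) decision criterion

for _ in range(20000):
    # The demon inspects a molecule approaching the door from each side
    # and opens the door only for the "right" kind of molecule.
    if left:
        m = random.choice(left)
        if m > threshold:
            left.remove(m)
            right.append(m)   # fast molecule passes left -> right
    if right:
        m = random.choice(right)
        if m < threshold:
            right.remove(m)
            left.append(m)    # slow molecule passes right -> left

print("mean energy, left chamber :", sum(left) / len(left))
print("mean energy, right chamber:", sum(right) / len(right))
# A temperature gradient appears, paid for by the bits the demon must
# record (and eventually erase) for every decision it makes.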
> >
> > The process that analyzes an approaching gas molecule, judging its
> > temperature and trajectory, and ultimately deciding whether to open or
> > close the door could itself be viewed as a kind of primitively aware
> > (conscious) thing. Its discriminated high-level information state then
> > occupies a spot in the causal chain; without it, the door could not
> > respond intelligently to its environment. And I would say the
> > discriminated high-level information state is its conscious state.
>
> I would tend to agree although I am reluctant to directly address
> consciousness with my theory because it is a fraught word in scientific
> circles. It has no clear or rigorous definition.
Very true, I think introducing the word is more apt to confuse than explain.
> A rock might be
> conscious or a dolphin or self-driving car not conscious depending on
> whose definition you use. Ultimately though Maxwell's demon is like a
> Turing machine, a simplified abstract mathematical model used to
> understand actual physically real systems.
>
> >> Basically the causal power of wanting ice cream is the energy cost it
> >> takes to forget you want ice cream, either by distracting yourself or
> >> by getting yourself the ice cream.
> >
> > Would this mean a conscious mind running on a reversible computer
> > (which escapes Landauer's principle) could have (or allay) no desires?
>
> Depending on your definition of consciousness, I am not sure it could
> exist as a reversible computation. So much of the mechanism of
> consciousness is tied up in environmental awareness and the survival
> benefits of its causal potency on that environment.
As far as entropy and reversible computing go, I believe initializing the
reversible computer still has an entropy cost, and reading the result of
the reversible computer carries an entropy cost, but while it runs along,
it need not increase entropy outside the system.
Quantum computers are reversible computers. When they are running, they
must be sealed off from the environment.
But a meditative mind, in deep thought, or in a self-contained immersive
reality simulation, seems like a potential candidate for a reversible
computation. But then, when you want to exit that meditation or
simulation, when you want to bring your newfound wisdom into the outer
world (the environment), it will bring an entropy increase.
Also, within reversible computers there is a phenomenon much like entropy:
depending on the computation being performed, "garbage bits" can be
produced that continue to build up as the program proceeds.
Another interesting consideration is that because all physical operations
are reversible, a simulation of our universe could run on a reversible
computer in a way that requires no energy expenditure. But inside this
simulation we have a buildup of entropy, just as a reversible computer
faces the buildup of garbage bits.
> Any desires such a consciousness might have would be brief, ephemeral
> things which vanish as spontaneously as they occur and cannot have any
> causal effect on the outside world. So, assuming it can be conscious in
> the first place, a reversible computer I suppose could imagine desires
> and imagine fulfilling them without an entropy cost, but we are in
> "angels dancing on pinheads" territory here. :)
>
It reminds me a bit of this passage:
"A simulated world hosting a simulated person can be a closed
self-contained entity. It might exist as a program on a computer processing
data quietly in some dark corner, giving no external hint of the joys and
pains, successes and frustrations of the person inside."
-- Hans Moravec
> >
> >> For example,
> >> subatomic particles give rise to atoms in a standard upward causation,
> >> but atoms also give rise to subatomic particles through radioactive
> >> decay, which is downward causation. Another example would be the
> >> surface tension of a water droplet ordering the water molecules into a
> >> perfect sphere. Strange loops are not magic; they are physics
> >> incorporating information theory.
> >
> > Sperry gave the example of a tire rolling down a hill. The tire is
> > made of molecules, but the rolling of the tire largely guides the
> > motions of all the atoms in the tire.
>
> Yes, the ability to roll down a hill is an emergent property of the tire
> with respect to the vulcanized rubber polymers that make it up. There is
> both a downward causation, as the tire-shape causes the molecules to
> roll down the hill in a cycloidal trajectory, and an upward causation,
> as the rubber molecules cause the tire to bounce, both contributing to
> the final chaotic trajectory of the tire down the hill.
A far more interesting picture than the pure reductionist would give for
the situation. I like it.
> That being said,
> there is not enough self-referential causal closure to consider a tire
> rolling down a hill to be a strange loop process.
>
I think I agree the tire isn't conscious if that is what you mean here, but
I am not sure what the tire rolling is missing to not be a kind of strange
loop. Is it the organized, or goal-oriented, processing of information?
For clarification, "strange loop" is not my term, but one invented by
Douglas Hofstadter. So I am not an expert in its meaning.
> > I guess the question then becomes what kinds of information processing
> > activities are conscious ones.
>
> It is tricky to discuss consciousness in a precise way without a good
> definition of it. So are you talking minimally conscious or fully
> self-aware?
>
I mean minimally conscious. My own opinion is that recognition and
intelligent response to environmental information is a tell-tale sign of
consciousness, but might there be other ways a system could be conscious?
Perhaps of its own thoughts, or within a simulation.
In computer software, any program that takes input from the outside can be
rewritten as an equivalent program that takes no input but has the value
hard-coded in its static memory. Would one be conscious and the other not?
That wouldn't be my first guess.
> Maxwell's demon as a model of a minimal consciousness suggests that
> information processors that are bidirectionally coupled to the
> environment through some sensor and some corresponding actuator are
> conscious enough to have causal potency upon base reality. So the demon
> and his tiny door are like a thermostat and a heating element, which is
> also a feedback loop with causal power in base reality. The more such
> environmental feedback loops that the information processing system
> contains, the more complex it becomes and the more memory it will need,
> and the more energy it will consume, and presumably, the more conscious
> it will become, and the more causal potency it will have.
>
That all makes a lot of sense to me.
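The thermostat version of that feedback loop is easy to write down
explicitly. A toy sketch, with made-up numbers, just to show the
sensor/actuator coupling you describe:

import random

set_point = 20.0      # desired temperature (arbitrary units)
temperature = 15.0    # state of the "environment"

for step in range(100):
    # Sensor: the controller discriminates a (noisy) environmental state...
    reading = temperature + random.uniform(-0.2, 0.2)
    # ...actuator: and acts back on that same environment.
    heater_on = reading < set_point
    temperature += 0.5 if heater_on else -0.3

print("temperature after regulation:", round(temperature, 2))
# The loop settles near the set point: a minimal information processor
# with causal potency on its environment.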
> > You make the comparison to erasing or overwriting information, but is
> > any process of recording information conscious?
>
> No, mud is not conscious just because you can step in it and leave a
> footprint.
>
Great example!
> > And what of processing
> > information without overwriting or erasing? Are such processes not
> > conscious? I think the dividing line for consciousness may be
> > something other than entropy increasing operations.
>
> Every causal process either directly or indirectly increases the entropy
> of the universe. Reversible computing cannot be causal to anything
> external to the reversible computer. So yes, if entropy increase was the
> dividing line between conscious systems and unconscious systems, then
> almost all physical processes would be conscious and clearly most
> physical processes are not conscious.
>
I agree that a reversible computer would not be able to interact with its
environment while it was operating, but I am less sure it could not be
conscious while remaining in that state. It is a bit like Schrödinger's
cat: the box that contains it is isolated from the outer environment, but
I think the cat would still be conscious during that period.
Likewise, if we uploaded a human brain to a quantum computer, it could run
while being isolated from the environment. Perhaps a movie is included in
the computation's initial state and we play the movie for the uploaded
mind. We then read the final mind state after the two hours and find the
brain now contains a memory of having watched that movie. Was it conscious
during that time, or did its consciousness only become real
instantaneously when we read the result?
> > I agree that
> > information processing, consciousness, and entropy are all closely
> > related, but are they equal?
>
> No, information processing and entropy are not equivalent to one
> another, let alone consciousness. In fact in set theoretical terms, one
> could say that entropy is a subset of information processing, which is
> in turn a subset of consciousness, which is in turn a subset of emergent
> properties.
>
By my own understanding/definitions, I might reverse some of those.
There are information processing operations (like flip a bit, add, set to
0, etc.) but only a subset of those operations are entropy increasing
ones. So I might say entropy increasing operations are a subset of all
forms of information processing.
And I would say conscious processes are also a subset of all forms of
information processing.
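A trivial way to see the distinction (my own toy illustration): an
operation is entropy increasing only if it maps distinct inputs to the
same output, so that a bit of history is lost.

# Flipping a bit is a bijection on {0, 1}: no history is lost, so in
# principle it carries no Landauer cost.
flip = {b: 1 - b for b in (0, 1)}
print("flip:", flip)            # {0: 1, 1: 0}

# "Set to 0" sends both inputs to the same output: one bit of history is
# erased, and that is the entropy increasing case.
set_zero = {b: 0 for b in (0, 1)}
print("set to 0:", set_zero)    # {0: 0, 1: 0}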
Ontologically speaking, information processing seems very near the bottom
(most fundamental thing), as nearly everything we know can be conceived in
its terms, so it makes sense that it forms the largest set.
It is a leading candidate for a (neutral) monism, in my opinion.
Jason