[ExI] The subjectivity of entropy, the role of the observer...==> Rational metaethics

Lee Corbin lcorbin at rawbw.com
Fri Feb 29 17:05:50 UTC 2008


Eliezer writes

> On Thu, Feb 28, 2008 at 8:54 PM, Lee Corbin <lcorbin at rawbw.com> wrote:
>>
>>  Almost all the time, I stick with this idea:  Temperature of a
>>  gas is the mean kinetic energy of its molecules.
> 
> Aren't there vibrational degrees of freedom that also contribute to
> kinetic energy, and isn't that why different materials have different
> specific heats?

I believe that the translational kinetic energy of particles
in a gas accounts for three of the (classically) seven possible
degrees of freedom of a diatomic molecule, each degree
understood to possess, on average, (1/2)kT. So when physicists
write "the mean kinetic energy" of the particles, they're not
talking about the internal degrees of freedom: rotation,
vibration, etc. A diatomic molecule classically thus has 3
degrees of translational motion (in the X, Y, and Z directions),
2 in the vibration as the two atoms bounce farther apart and
then closer together (one kinetic, one potential), and 2 as
they "decide" which way (as a barbell) to rotate.

> I.e., what matters is kinetic energy per degree of freedom, not
> kinetic energy per molecule? So you actually do have to think
> about a molecule (not just measure its kinetic energy per se) to
> determine what its temperature is (which direction heat will flow in,
> compared to another material),

There is a difference between the total amount of heat a body
contains, as indicated by its heat capacity, and its
temperature. That's why metal on a hot day can nearly burn your
hands---it can transfer a lot of heat to you quickly. But its
temperature is the same as that of the surrounding air.

> even if you know the total amount of heat - putting the same
> amount of heat into a kilo of water or a kilo of iron will yield
> different "temperatures".

Yes.
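
To put numbers on that (my own sketch, using textbook specific
heats of roughly 4186 J/(kg K) for water and 450 J/(kg K) for
iron): the temperature rise from a given quantity of heat is
dT = Q/(m c), so the same heat warms the iron about nine times
as much.

    # Same heat into 1 kg of water vs. 1 kg of iron (illustrative sketch).
    c_water = 4186.0  # specific heat of water, J/(kg*K), approximate
    c_iron = 450.0    # specific heat of iron, J/(kg*K), approximate

    def delta_T(heat_joules, mass_kg, specific_heat):
        """Temperature rise from Q = m * c * dT."""
        return heat_joules / (mass_kg * specific_heat)

    Q = 10000.0  # joules
    print(delta_T(Q, 1.0, c_water))  # ~2.4 K
    print(delta_T(Q, 1.0, c_iron))   # ~22 K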

> But the more important point: Suppose you've got an iron flywheel
> that's spinning very rapidly. That's definitely kinetic energy, so the
> average kinetic energy per molecule is high. Is it heat?

If you had a flywheel spinning *fast* enough---so that its outermost
part was traveling at the same speed as the mean molecular velocity
in a gas---then maybe you could consider that heat. But a temperature
gauge on the flywheel would, of course, not report it as such. We
would lump that extra energy into the rotational kinetic energy.
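
For scale (a back-of-the-envelope sketch of mine, taking air to
be mostly nitrogen at room temperature): the rms speed of an N2
molecule at 300 K works out to about 517 m/s, so the rim would
have to move at roughly that speed before the bulk kinetic
energy per molecule became comparable to the thermal kind.

    # Rough comparison: rms molecular speed vs. a flywheel rim speed (sketch).
    import math

    k_B = 1.380649e-23       # Boltzmann constant, J/K
    m_N2 = 28 * 1.66054e-27  # mass of an N2 molecule, kg

    def v_rms(T, m):
        """Root-mean-square speed from (1/2) m <v^2> = (3/2) k_B T."""
        return math.sqrt(3.0 * k_B * T / m)

    print(v_rms(300.0, m_N2))  # ~517 m/s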

> That particular kinetic energy, of a spinning flywheel, doesn't look to you
> like heat, because you know how to extract most of it as useful work,
> and leave behind something colder (that is, with less mean kinetic
> energy per degree of freedom).

Yes, that "heat" is organized, and that does seem to me to fit into
the idea you're explicating.

> If you know the positions and speeds of all the elements in a system,
> their motion stops looking like heat, and starts looking like a
> spinning flywheel - usable kinetic energy that can be extracted right
> out.

Right, their "random chaotic movement" isn't then anymore either random
or chaotic. So then a Maxwell Demon, as you wrote, would indeed
be able to extract the energy.  On the usual usage of terms, at that
point the material would become "colder" as you got the energy out.
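
One way to make the organized-versus-random distinction
concrete (again a sketch of my own, not anything from your
post): split each particle's velocity into the bulk
center-of-mass motion plus a fluctuation about it. The bulk
part is flywheel-like and extractable as work; the residue is
what a thermometer would still call heat.

    # Decompose kinetic energy into bulk (extractable) and thermal parts (sketch).
    # One-dimensional velocities and equal masses, for simplicity.

    def split_kinetic_energy(velocities, mass=1.0):
        n = len(velocities)
        v_bulk = sum(velocities) / n                # center-of-mass velocity
        ke_total = 0.5 * mass * sum(v * v for v in velocities)
        ke_bulk = 0.5 * (n * mass) * v_bulk ** 2    # organized, flywheel-like
        ke_thermal = ke_total - ke_bulk             # random residue ("heat")
        return ke_bulk, ke_thermal

    # A group of particles drifting together but jittering individually:
    print(split_kinetic_energy([99.0, 101.0, 100.0, 98.0, 102.0]))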

The general points you make about entropy seem entirely
unobjectionable, although, I confess, that may be because years
ago my friends and I, try as we might, could not ascribe an
objective meaning to "entropy", in the sense of saying that
particles in a box, for example, had to have a definite value
of entropy. Thermodynamic entropy, dS = dQ/T, as you know, is
about heat transfers, and seems clear and useful enough.  There
is a good story about Shannon asking Von Neumann what name to
use for his measure of information, and Von Neumann suggesting
"entropy", since the two quantities were mathematically so
analogous. I don't remember the details, but one possibility is
that this name choice engendered a lot of confusion.  Still...
the parallels are remarkable and deep.
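
The analogy Von Neumann supposedly had in mind can be written
down directly (once more just a sketch of mine): Shannon's
H = -sum_i p_i log2 p_i and the Gibbs entropy
S = -k_B sum_i p_i ln p_i differ only by the base of the
logarithm and a physical constant.

    # Shannon entropy vs. Gibbs entropy over the same distribution (sketch).
    import math

    k_B = 1.380649e-23  # J/K

    def shannon_entropy(probs):
        """H = -sum p * log2(p), in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def gibbs_entropy(probs):
        """S = -k_B * sum p * ln(p), in J/K."""
        return -k_B * sum(p * math.log(p) for p in probs if p > 0)

    p = [0.5, 0.25, 0.25]
    print(shannon_entropy(p))  # 1.5 bits
    print(gibbs_entropy(p))    # the same shape, scaled by k_B * ln 2 per bit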

But energy at least is a clear and useful concept leading to wonderfully
accurate predictions and uses in all walks of life, while "entropy",
"information", "algorithmic information content", and the like, are
very tricky.

Lee



