[ExI] Interstellar FedEx

Anders Sandberg asa at nada.kth.se
Fri Oct 2 22:15:36 UTC 2009


Post Futurist wrote:
> Is information entropy the same as signal entropy?

Kind of. Signal entropy is the uncertainty (or amount of information -
remember that in classical information theory a 50% noise source is
maximally "informative"): H = -sum_i P_i log(P_i). Information entropy
has the same formula (now called the Gibbs formula rather than the
Shannon formula), but instead of denoting the probability of a particular
message symbol, P_i now means the probability of finding the system in
microstate i. So if one makes the identification microstate = a particular
symbol, they are the same. But when one speaks of the entropy of a stream
of symbols we are talking about a series of states, while the Gibbs
formula is about a single macrostate - it does not say anything about how
the microstates shift around.
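
As a toy illustration (just a sketch; the function name and the example
probabilities are my own, not anybody's canonical code), the same little
function computes both quantities - only the interpretation of P_i
changes:

import math

def entropy(probs, base=2):
    # H = -sum_i P_i log(P_i); skip zero-probability terms,
    # since p log p -> 0 as p -> 0.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Read as symbol probabilities: entropy of a biased binary source.
print(entropy([0.9, 0.1]))   # ~0.47 bits per symbol

# A 50/50 source ("pure noise") maximizes it at 1 bit per symbol.
print(entropy([0.5, 0.5]))

# Read the same numbers as microstate probabilities instead and you get
# the Gibbs entropy of a two-microstate system (in bits rather than nats).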

Note that this kind of information entropy is defined for any macrostate
(any probability distribution over the microstates), although usually we
just care about macrostates with a well-defined temperature or energy.
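
For instance (again only a sketch, with made-up energy levels), a
macrostate with a well-defined temperature is the Boltzmann distribution
P_i proportional to exp(-E_i/kT), and its Gibbs entropy comes straight
out of the same formula:

import math

def gibbs_entropy_boltzmann(energies, kT):
    # P_i = exp(-E_i/kT) / Z, then H = -sum_i P_i log(P_i) (in nats).
    weights = [math.exp(-e / kT) for e in energies]
    Z = sum(weights)
    probs = [w / Z for w in weights]
    return -sum(p * math.log(p) for p in probs if p > 0)

# Two-level system with an energy gap of 1 (arbitrary units):
print(gibbs_entropy_boltzmann([0.0, 1.0], kT=0.1))   # ~0: frozen in the ground state
print(gibbs_entropy_boltzmann([0.0, 1.0], kT=100.0)) # ~log(2): both states nearly equally likely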

-- 
Anders Sandberg,
Future of Humanity Institute
Philosophy Faculty of Oxford University
