[ExI] wanting to

BillK pharos at gmail.com
Mon Nov 6 23:12:44 UTC 2023


On Sun, 5 Nov 2023 at 22:59, Jason Resch via extropy-chat
<extropy-chat at lists.extropy.org> wrote:
> There is information in uncertain probabilities, and this information can be less than 1 bit.
> The formula is: -log2(p) where p is the probability of an event.
> If wanting or not wanting changes the probability of the outcome, then it provides a positive amount of information (unless p == 1).
>
> Jason


So I asked an AI for a more detailed explanation of your comment........  :)
BillK

Answer
The statement "There is information in uncertain probabilities, and
this information can be less than 1 bit" refers to the concept of
information entropy in information theory. Information entropy is a
measure of the amount of uncertainty or randomness in a system. In the
context of probability, the entropy of a probability distribution is a
measure of the amount of uncertainty in the outcome of an event. The
formula for information entropy is -log2(p), where p is the
probability of an event. This formula gives the amount of information
(in bits) that is gained when the outcome of an event with probability
p is observed. The more uncertain or random the event is, the higher
the entropy and the more information is gained when the outcome is
observed. Conversely, if the probability of an event is 1 (i.e., the
event is certain), then the entropy is 0 and no information is gained
when the outcome is observed.
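
For example, here is a minimal Python sketch (my own illustration,
not part of the quoted post) that evaluates -log2(p) for a few
probabilities and shows the value dropping below 1 bit as p rises
above 1/2:

import math

def surprisal(p):
    # Self-information of an outcome with probability p, in bits.
    return math.log2(1.0 / p)

for p in (0.5, 0.75, 0.9, 0.99, 1.0):
    print(f"p = {p:<5} -> {surprisal(p):.3f} bits")

# p = 0.5   -> 1.000 bits
# p = 0.75  -> 0.415 bits
# p = 0.9   -> 0.152 bits
# p = 0.99  -> 0.014 bits
# p = 1.0   -> 0.000 bits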

The statement "If wanting or not wanting changes the probability of
the outcome, then it provides a positive amount of information (unless
p == 1)" refers to the fact that if the probability of an event
changes based on whether or not someone wants it to happen, then there
is some uncertainty or randomness in the outcome. This uncertainty can
be quantified by the entropy of the probability distribution. If the
entropy is greater than 0, then observing the outcome provides some
information. If the entropy is 0 (i.e., the event is certain), then no
information is gained.
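
To make that concrete (again a small Python sketch; the probabilities
0.5 and 0.8 are made-up numbers used only for illustration): suppose
wanting an outcome nudges its probability from 0.5 to 0.8. The event
is still uncertain, so its entropy stays positive and observing it
still conveys information.

import math

def binary_entropy(p):
    # Entropy (in bits) of a yes/no event that occurs with probability p.
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(binary_entropy(0.5))  # 1.0 bit of uncertainty without wanting
print(binary_entropy(0.8))  # ~0.72 bits once wanting raises p to 0.8

Only when p reaches 1 does the entropy fall to 0 and the outcome stop
carrying any information.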

In summary, the statement draws on information theory: the
self-information -log2(p) measures how many bits are gained when an
outcome of probability p is observed, and entropy measures the
uncertainty of the distribution as a whole. If wanting or not wanting
an outcome changes its probability without making it certain, the
outcome remains uncertain, so observing it still provides a positive,
possibly fractional, amount of information.
-------------------

