[ExI] wanting to

William Flynn Wallace foozler83 at gmail.com
Mon Nov 6 23:56:45 UTC 2023


But the AI is talking about the probability of an event (I think).  What
we want is an understanding of wanting and not wanting, and of the
consequences:  do it or don't do it.    bill w

On Mon, Nov 6, 2023 at 5:15 PM BillK via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On Sun, 5 Nov 2023 at 22:59, Jason Resch via extropy-chat
> <extropy-chat at lists.extropy.org> wrote:
> > There is information in uncertain probabilities, and this information
> can be less than 1 bit.
> > The formula is: -log2(p) where p is the probability of an event.
> > If wanting or not wanting changes the probability of the outcome, then
> it provides a positive amount of information (unless p == 1).
> >
> > Jason
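>
> As a quick numeric check of that formula (a minimal Python sketch,
> assuming only the standard math module, not anything from the
> original message):
>
> import math
>
> def surprisal(p):
>     # Information gained by observing an event of probability p,
>     # in bits; log2(1/p) is equivalent to -log2(p).
>     return math.log2(1 / p)
>
> print(surprisal(0.5))   # 1.0 bit
> print(surprisal(0.9))   # ~0.152 bits -- less than 1 bit
> print(surprisal(1.0))   # 0.0 bits -- a certain event tells us nothing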
>
>
> So I asked an AI for a more detailed explanation of your comment........
> :)
> BillK
>
> Answer
> The statement "There is information in uncertain probabilities, and
> this information can be less than 1 bit" refers to the concept of
> self-information (or surprisal) in information theory. The related
> notion of information entropy is a measure of the amount of
> uncertainty or randomness in a system: the entropy of a probability
> distribution is the average self-information of its outcomes. The
> self-information of a single event is -log2(p), where p is the
> probability of that event. This formula gives the amount of
> information (in bits) that is gained when the outcome of an event
> with probability p is observed. The less probable the event, the
> more information is gained when it is observed; for a likely event
> (p > 0.5), the information gained is less than 1 bit. Conversely, if
> the probability of an event is 1 (i.e., the event is certain), then
> -log2(1) = 0 and no information is gained when the outcome is
> observed.
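>
> To keep the two quantities apart, here is a minimal Python sketch
> (again assuming only the standard math module): surprisal scores a
> single outcome, while entropy averages surprisal over a whole
> distribution.
>
> import math
>
> def surprisal(p):
>     # Bits gained by observing one outcome of probability p.
>     return math.log2(1 / p)
>
> def entropy(dist):
>     # Expected surprisal over all outcomes of a distribution.
>     return sum(p * surprisal(p) for p in dist if p > 0)
>
> print(surprisal(0.9))        # ~0.152 bits for one likely outcome
> print(entropy([0.9, 0.1]))   # ~0.469 bits for the whole distribution
> print(entropy([1.0]))        # 0.0 bits -- no uncertainty at all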
>
> The statement "If wanting or not wanting changes the probability of
> the outcome, then it provides a positive amount of information (unless
> p == 1)" refers to the fact that if the probability of an event
> changes based on whether or not someone wants it to happen, then the
> wanting itself carries information about the outcome. As long as the
> probability p of the outcome stays below 1, -log2(p) is greater than
> 0, so observing the outcome still provides a positive amount of
> information. Only if p == 1 (i.e., the event is certain) is the
> self-information 0, and no information is gained.
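>
> As a concrete illustration with made-up numbers (a sketch, not
> anything from the original exchange): suppose an outcome has
> probability 0.5 by default, and wanting it raises that to 0.8.
>
> import math
>
> p_default, p_wanted = 0.5, 0.8   # hypothetical probabilities
>
> print(-math.log2(p_default))   # 1.0 bit when the outcome is a coin flip
> print(-math.log2(p_wanted))    # ~0.322 bits once wanting has raised p
> # Unless p == 1, the observed outcome always carries a positive
> # number of bits; wanting only changes how many.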
>
> In summary, the statement refers to the concept of self-information
> in information theory: -log2(p) measures, in bits, how much is
> learned when an outcome of probability p is observed, and the
> entropy of a probability distribution is the average of that
> quantity over all of its outcomes. If the probability of an event
> changes based on whether or not someone wants it to happen, then
> wanting changes how much information the observed outcome carries,
> and unless the event is certain, observing it always provides some
> information.
> -------------------

