<div dir="ltr"><div class="gmail_default" style="font-family:comic sans ms,sans-serif;font-size:large;color:#000000">But the AI is talking about probability of an event (I think). What we want is understanding wanting and not wanting and the consequences: do it or don't do it. bill w</div></div><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Mon, Nov 6, 2023 at 5:15 PM BillK via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">On Sun, 5 Nov 2023 at 22:59, Jason Resch via extropy-chat<br>
<<a href="mailto:extropy-chat@lists.extropy.org" target="_blank">extropy-chat@lists.extropy.org</a>> wrote:<br>
> There is information in uncertain probabilities, and this information can be less than 1 bit.
> The formula is: -log2(p) where p is the probability of an event.
> If wanting or not wanting changes the probability of the outcome, then it provides a positive amount of information (unless p == 1).
>
> Jason

So I asked an AI for a more detailed explanation of your comment........ :)
BillK

Answer
The statement "There is information in uncertain probabilities, and<br>
this information can be less than 1 bit" refers to the concept of<br>
information entropy in information theory. Information entropy is a<br>
measure of the amount of uncertainty or randomness in a system. In the<br>
context of probability, the entropy of a probability distribution is a<br>
measure of the amount of uncertainty in the outcome of an event. The<br>
formula for information entropy is -log2(p), where p is the<br>
probability of an event. This formula gives the amount of information<br>
(in bits) that is gained when the outcome of an event with probability<br>
p is observed. The more uncertain or random the event is, the higher<br>
the entropy and the more information is gained when the outcome is<br>
observed. Conversely, if the probability of an event is 1 (i.e., the<br>
event is certain), then the entropy is 0 and no information is gained<br>
when the outcome is observed.<br>
<br>
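A minimal Python sketch of that formula (illustrative only; the
function name is an invention for this example, not something from
the thread):

from math import log2

def surprisal_bits(p: float) -> float:
    """Self-information, -log2(p): bits gained when an outcome
    of probability p is actually observed."""
    return -log2(p)

print(surprisal_bits(0.5))    # 1.0 bit (a fair coin flip)
print(surprisal_bits(0.75))   # ~0.415 bits, i.e. less than 1 bit
print(surprisal_bits(1.0))    # -0.0, i.e. 0 bits: certainty tells us nothing
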
The statement "If wanting or not wanting changes the probability of<br>
the outcome, then it provides a positive amount of information (unless<br>
p == 1)" refers to the fact that if the probability of an event<br>
changes based on whether or not someone wants it to happen, then there<br>
is some uncertainty or randomness in the outcome. This uncertainty can<br>
be quantified by the entropy of the probability distribution. If the<br>
entropy is greater than 0, then observing the outcome provides some<br>
information. If the entropy is 0 (i.e., the event is certain), then no<br>
information is gained.<br>
<br>
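One way to put a number on that shift (just a sketch; the
KL-divergence framing and the example probabilities are illustrative
assumptions, not something stated in the thread):

from math import log2

def kl_bits(p_new: float, p_old: float) -> float:
    """Expected information (in bits) conveyed when the probability
    of a yes/no outcome shifts from p_old to p_new (KL divergence)."""
    def term(a: float, b: float) -> float:
        return 0.0 if a == 0.0 else a * log2(a / b)
    return term(p_new, p_old) + term(1.0 - p_new, 1.0 - p_old)

# Suppose wanting the outcome raises its probability from 0.5 to 0.9.
print(kl_bits(0.9, 0.5))   # ~0.531 bits: positive, but less than 1 bit
print(kl_bits(0.5, 0.5))   # 0.0 bits: no shift, so the want tells us nothing
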
In summary, the statement is about information content in information
theory: -log2(p) measures the information (in bits) gained when an
outcome of probability p is observed, and entropy is the average of
that quantity over all possible outcomes. If the probability of an
event changes based on whether or not someone wants it to happen,
then the outcome is not fully determined, the want conveys information
about it, and observing the outcome provides some information.
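
And a sketch of the entropy side, the average surprisal of a yes/no
outcome (again illustrative; the helper is not from the thread):

from math import log2

def binary_entropy_bits(p: float) -> float:
    """Entropy of a yes/no outcome with probability p: the average
    surprisal, -p*log2(p) - (1-p)*log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome has no uncertainty
    return -p * log2(p) - (1.0 - p) * log2(1.0 - p)

print(binary_entropy_bits(0.5))   # 1.0 bit: maximum uncertainty
print(binary_entropy_bits(0.9))   # ~0.469 bits: less than 1 bit
print(binary_entropy_bits(1.0))   # 0.0 bits: certainty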
-------------------