[ExI] Why stop at glutamate?

Jason Resch jasonresch at gmail.com
Thu Apr 13 12:51:03 UTC 2023


On Thu, Apr 13, 2023, 5:29 AM Ben Zaiboc via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

>
> On 13/04/2023 04:49, Giovanni Santostasi wrote:
> > I want to make a correction to Ben's statement that it's Turtles all
> > the way down. The Turtles go deep but not all the way down. They stop
> > in a place similar to the way we derive set theory from the null set.
>
> Hm, that's interesting. I was thinking about information (in the brain
> specifically, and other similar machines), and that the informational
> turtles go round in big complex loops, where every piece of information
> is relative to other pieces of information, so in that sense there'd be
> no end.
>
> In the wider world, though, I'm sure you're right. I tried to read about
> information theory, conservation of information, etc., but it just
> bamboozles me. The idea of the total energy (and presumably,
> information) in the universe being zero does make sense, though (erm,
> provided there can be such a thing as 'anti-information'?).
>

Perhaps that's entropy (uncertainty)? A fair coin flip, for example, has an
entropy of 1 bit.
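
As a quick sketch (plain Python, standard library only): Shannon entropy is
H = -sum p log2 p, and the fair coin comes out to exactly 1 bit, while a
biased coin carries less:

    from math import log2

    def shannon_entropy(probs):
        # Shannon entropy (in bits) of a discrete distribution;
        # zero-probability outcomes contribute nothing.
        return -sum(p * log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.469 bits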

The Heisenberg uncertainty principle shows us that the more information we
learn about some properties of a system, the more we must unlearn (make
uncertain) about other aspects of that system.
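
In its standard position/momentum form, that tradeoff is the inequality

    \Delta x \, \Delta p \ge \hbar / 2

so the more sharply we pin down a particle's position (small \Delta x), the
larger the spread \Delta p in its momentum must become.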

Information is sometimes described by physicists as negative entropy
(negentropy). QM shows that learning information (acquiring negative entropy)
requires a matching creation of uncertainty (entropy) elsewhere. So in a way
the conservation of information might be the deeper principle behind the
second law of thermodynamics and the conservation of energy.
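
To make "information as negative entropy" a bit more concrete: learning a
related variable Y shrinks the entropy of X, and the size of that shrinkage
is the mutual information. A toy sketch in Python (the joint distribution
below is made up purely for illustration):

    from math import log2

    def entropy(probs):
        # Shannon entropy in bits; skips zero-probability outcomes.
        return -sum(p * log2(p) for p in probs if p > 0)

    # Made-up joint distribution p(x, y) over two binary variables.
    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

    # Marginal distributions of X and Y.
    p_x = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
    p_y = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]

    h_x = entropy(p_x)                       # H(X): uncertainty before learning Y
    h_joint = entropy(list(joint.values()))  # H(X, Y)
    h_x_given_y = h_joint - entropy(p_y)     # chain rule: H(X|Y) = H(X,Y) - H(Y)

    # Mutual information I(X;Y): the entropy removed from X by learning Y.
    print(h_x, h_x_given_y, h_x - h_x_given_y)  # 1.0, ~0.722, ~0.278

Here learning Y removes about 0.28 bits of uncertainty from X; on the
negentropy reading, that 0.28 bits is the information gained.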

Jason