<div dir="auto"><div><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Thu, Jan 2, 2020, 6:59 PM Stuart LaForge via extropy-chat <<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><br>
Here is where my new physics that I call Synergic Systems Theory comes <br>
in: As the number of components of a system increases, more of the <br>
information about the system is embodied by the relationships between <br>
components than by the components themselves. Those relationships <br>
readily change in complex systems thereby providing a plethora of <br>
microstates that can be, and are, used for computation.<br><br></blockquote></div></div><div dir="auto"><br></div><div dir="auto">Have you published anything on this yet?</div><div dir="auto"><br></div><div dir="auto">I was thinking about this today. Specifically, where do thoughts come from? The idea that surplus precision or unobserved interactions could act as generators of information that influences larger parts of the network...</div><div dir="auto"><br></div><div dir="auto">You have probably collected various examples already if you've been thinking about this for a while?</div><div dir="auto"></div></div>
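<div dir="auto"><br></div><div dir="auto">A quick back-of-the-envelope sketch of the counting argument, purely my own toy illustration (assuming each pairwise relationship holds just one bit of state, which is my assumption, not something you claimed):</div><div dir="auto"><pre>
# Toy illustration (assumption: one bit of state per pairwise relationship).
# It only counts how fast relationships outgrow the components themselves:
# n components give 2^n component states, but n*(n-1)/2 relationships give
# 2^(n*(n-1)/2) relational microstates.
from math import comb

for n in (5, 10, 50, 100):
    pairs = comb(n, 2)  # n*(n-1)/2 pairwise relationships
    print(f"n={n:>3}  component states=2^{n:<3}  relational states=2^{pairs}")
</pre></div>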