[ExI] The physical limits of computation

Jason Resch jasonresch at gmail.com
Mon Jan 22 02:59:16 UTC 2024


On Sun, Jan 21, 2024 at 2:55 PM Stuart LaForge via extropy-chat <
extropy-chat at lists.extropy.org> wrote:

> On 2024-01-19 12:34, Jason Resch via extropy-chat wrote:
> > I thought I would share this work with the extropy list, as it covers
> > topics of relevance to the future of technology and future
> > civilizations.
> >
> > In it, I describe the physical bounds on information storage,
> > computation, computronium, black hole computers, and the limits of the
> > universe as a whole:
> >
> >
> https://drive.google.com/file/d/124q3ni51E3sf9kMC_sNKgP3ikcl8ou1t/view?usp=sharing
> >
> > Any comments, or corrections would be most welcome.
>
> Since you asked for it... ;)
>

I very much appreciate this, Stuart!


>
> On p.18 you wrote:
> "The DNA molecules that compromise your genome encode
> 750 MB of data — and they fit in a cell nucleus just five
> millionths of a meter across."
>
> Nitpicking here, but the information content of a cell nucleus is twice
> that figure, or 1.5 GB. This is because you have two copies (really two
> versions) of every chromosome in each of your cells, one from your mother
> and one from your father.


I completely forgot to consider that there are two of each chromosome.
Good catch. I will have to consider how to incorporate this fact, however,
since I refer to the 750 MB genome in several places and want to avoid
confusion by introducing a new, unfamiliar number.


> Each of your cell nuclei contains approximately 2
> meters of DNA. As an adult male you have approximately 36 trillion
> nucleated cells, so you have 72 trillion meters of DNA inside of you.
> For perspective, if you made a circle out of all the DNA inside you, the
> ring of DNA would encircle the sun at a distance well beyond the orbits
> of Neptune and Pluto. So if all the DNA in your body were considered,
> your total DNA's information capacity, at 750 MB per meter, would be
> about 54 zettabytes.
>

That's an incredible fact. You reminded me of a similar factoid regarding
the maximum data transmission rates of the urethra. ;-)
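
Out of curiosity, I checked the arithmetic with a quick script (a rough
sketch; the cell count, DNA length per nucleus, and 750 MB per meter are
taken from your figures above, not independently verified):

# Back-of-the-envelope check of the DNA figures quoted above.
import math

nucleated_cells = 36e12   # approximate nucleated cells in an adult (from above)
dna_per_cell_m  = 2.0     # meters of DNA per nucleus (diploid genome)
bytes_per_meter = 750e6   # ~1.5 GB per 2 m of DNA

total_dna_m    = nucleated_cells * dna_per_cell_m        # ~7.2e13 m
ring_radius_au = total_dna_m / (2 * math.pi) / 1.496e11  # radius of the ring, in AU
total_bytes    = total_dna_m * bytes_per_meter           # total information capacity

print(f"Total DNA length: {total_dna_m:.1e} m")
print(f"Ring radius:      {ring_radius_au:.0f} AU")      # ~77 AU, beyond Pluto's orbit
print(f"Capacity:         {total_bytes / 1e21:.0f} ZB")  # ~54 zettabytes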


>
> So the Matrix got it wrong. If intelligent machines were to exploit us,
> it wouldn't be as an energy source but as data storage. ;)
>
> On p.49 you wrote:
> "Using the holographic principle to model the observable
> universe as a black hole with a radius of 46.5 billion light
> years, we find it contains:
> 2.33 * 10^123 bits
> Which is the total information content of the universe."
>
> The observable universe cannot be modelled as a black hole because it is
> defined by the particle horizon, which, while it is one of the three
> cosmic horizons (Hubble horizon, event horizon, and particle horizon),
> is NOT the event horizon in a technical sense, and therefore using the
> holographic principle on it is not really physically valid. The particle
> horizon defining the observable universe is the boundary of everything
> we can currently see in our past light cone since the big bang. As you
> say, it is 46.5 billion light years away from us.
>

The issue of the various horizons, and which is the most sensible one to
use in this situation, confused me. This section is on the observable
universe, and ideally I would like to provide both computational and
information content estimates for the observable universe as a whole, if
that is possible, the observable universe being the largest thing we can
see (and presumably the thing with the greatest information content).

Our inability to interact with the distant objects we can see (those beyond
our event horizon) is, I think, perhaps not so important to the question
of how much information "is in" the observable universe, as all those far
away photons still "made it here," into the sphere of the observable. So
then, when it comes to determining how much information exists within the
totality of what is observable (despite the fact that we can no longer
signal back to the most distant places), information about those places has
still made it here, and computations which took place in those far away
galaxies are still perceivable by us. Hence they ought to be included in the
total amount of computation that has occurred in the history of the
observable universe, should they not?

I also found it interesting that the mass of the universe (computed by
taking the critical density, which comes from the Friedmann equations, and
multiplying it by the volume of the observable universe) comes out
considerably greater than the mass of a black hole with the radius of the
observable universe. I take this to be a result of not all the energy in
the universe being matter energy; there is also radiation and dark energy,
which I presume affect the curvature of space differently than ordinary
mass does. Is that right?

For reference, here is how I did the calculation:
https://docs.google.com/spreadsheets/d/1lR5mj4jFQft7Q-hKC4bYAoQt-BlukVy7MbEQV0eeu8s/edit?usp=sharing
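
For concreteness, the same comparison in a few lines of Python (a sketch
using my own assumed values, H0 of about 67.7 km/s/Mpc and a comoving
radius of 46.5 billion light years, so the numbers may differ slightly
from the spreadsheet):

# Mass of the observable universe at critical density vs. the mass of a
# black hole whose Schwarzschild radius is 46.5 billion light years.
import math

G  = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c  = 2.998e8            # speed of light, m/s
H0 = 67.7e3 / 3.086e22  # ~67.7 km/s/Mpc converted to 1/s
ly = 9.461e15           # meters per light year
R  = 46.5e9 * ly        # comoving radius of the observable universe, m

rho_crit = 3 * H0**2 / (8 * math.pi * G)      # critical density, kg/m^3
M_crit   = rho_crit * (4/3) * math.pi * R**3  # mass at critical density
M_bh     = R * c**2 / (2 * G)                 # black hole mass for radius R

print(f"Critical density:           {rho_crit:.2e} kg/m^3")  # ~8.6e-27
print(f"Mass at critical density:   {M_crit:.2e} kg")        # ~3.1e54
print(f"Black hole mass (46.5 Gly): {M_bh:.2e} kg")          # ~3.0e53
print(f"Ratio:                      {M_crit / M_bh:.1f}")    # roughly 10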


>
> The cosmological event horizon, on the other hand, is the boundary
> encompassing everything that we will ever be able to affect in our
> future light cone. The cosmological event horizon is at approximately 16
> billion light-years from us. You could consider it the largest
> distance an object could be from us which, if we could wait forever, we
> could bounce a radar signal off of and get a signal back.
>
> The Hubble horizon, however, is defined on our plane of simultaneity in
> the present as the boundary at which space is expanding away from us at
> the speed of light at this precise moment in time. As such, the Hubble
> radius is a proper distance defined in the present moment at c/H = 14.4
> billion light years from us.
>
> My own work on causal cell theory demonstrates that the Hubble horizon
> located at the Hubble radius is the most valid event horizon to set the
> Schwarzschild radius at for modeling our causal cell as a black hole.
> This is because it is the proper distance to the causal boundary of the
> space-time that we are able to causally affect by an action we take right
> now. It is also easier to work with because it depends only on the speed
> of light (c) and the Hubble constant/parameter (H). If H were constant,
> the cosmological event horizon would correspond exactly with the Hubble
> horizon. Because H is getting smaller, however, the
> Hubble radius is getting larger and the Hubble horizon is increasing in
> area. In any case, as can be seen in the upper right corner of this
> graph that BillK sent to the list earlier, the Hubble radius corresponds
> to the Schwarzschild radius of a black hole with a mass the same as that
> of our causal cell. It lies exactly on the line that the graph uses to
> depict black holes. The universe modelled in general relativity is
> boundless and could well be infinite, so it cannot be a black hole. But
> there ARE boundaries which define our causal cell, which is finite and
> therefore COULD be a black hole.
>
>
> https://www.universetoday.com/163927/everything-in-the-universe-fits-in-this-one-graph-even-the-impossible-stuff/#google_vignette
>
>
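
That identity checks out exactly, as far as I can tell: a sphere of radius
c/H filled at the critical density 3H^2/(8*pi*G) has a Schwarzschild radius
of exactly c/H. A quick numerical sketch (using an assumed H0 of about 67.7
km/s/Mpc; the exact value does not matter for the identity itself):

# Check: the Schwarzschild radius of the mass enclosed within the Hubble
# radius (at critical density) equals the Hubble radius itself.
import math

G  = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c  = 2.998e8            # speed of light, m/s
H0 = 67.7e3 / 3.086e22  # ~67.7 km/s/Mpc converted to 1/s (assumed value)

R_hubble  = c / H0                                    # ~1.37e26 m (~14.4 Gly)
rho_crit  = 3 * H0**2 / (8 * math.pi * G)             # critical density
M_cell    = rho_crit * (4/3) * math.pi * R_hubble**3  # mass of the causal cell
R_schwarz = 2 * G * M_cell / c**2                     # its Schwarzschild radius

ly = 9.461e15  # meters per light year
print(f"Hubble radius:        {R_hubble / ly / 1e9:.2f} Gly")   # ~14.44
print(f"Schwarzschild radius: {R_schwarz / ly / 1e9:.2f} Gly")  # the same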
Okay, that might simplify things. Though I did find it quite an interesting
result that taking the much larger and more complicated observable universe
radius, which is a function of the comoving distance, applying the
holographic principle to measure its area in Planck areas divided by 4, and
converting that to natural units of information, gave a result that was
within 1% of the total number of computations that could have occurred in
the history of the universe (from calculating its current mass as the
critical density filling the entire volume of the observable universe) --
which, note, is a mass greater than that of a black hole of that radius. It
might just be a strange coincidence, but if not, perhaps it suggests
something deeper.
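
To make the holographic side of that concrete, here is the calculation as a
short sketch (assuming the 46.5 billion light year comoving radius and the
standard value of the Planck length):

# Holographic bound for a sphere with the comoving radius of the
# observable universe: surface area in Planck areas, divided by 4.
import math

ly  = 9.461e15      # meters per light year
l_p = 1.616e-35     # Planck length, m
R   = 46.5e9 * ly   # comoving radius of the observable universe, m

area   = 4 * math.pi * R**2    # surface area, m^2
S_nats = area / (4 * l_p**2)   # holographic entropy, natural units of information

print(f"Holographic bound: {S_nats:.2e}")  # ~2.33e123, the figure quoted above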


>
> It should be noted that all three horizons change over time, and in the
> far future our causal cell will reach its maximum possible extent, where
> the Hubble constant will stop changing and the Hubble horizon will stop
> moving. This future state will be a black hole composed entirely of dark
> energy, the largest black hole possible in a causal cell composed
> entirely of dark energy or the cosmological constant. In other words, a
> black hole composed entirely of the vacuum energy of empty space, where
> the cosmological event horizon coincides exactly with the black hole
> event horizon. This is also called a Nariai black hole and is a feature
> of the de Sitter-Schwarzschild metric. If current estimates of the
> cosmological constant are correct, then this ultimate black hole/causal
> cell will have a radius of about 16 billion light years, which coincides
> with the current cosmological event horizon.
>

Interesting, I had never heard of Nariai black holes before.


>
> I hope that I am not being overly pedantic with regards to what you were
> trying to show about limits of computation.


I appreciate your review. I strive for factual correctness, so the more
pedantic the better. :-)


> I sort of went down this
> same track myself a few years ago and came up with answers within an
> order of magnitude or so of yours, so you are on the right track.
>

I saw various estimates of 10^122 - 10^123, and in particular the "Black
Hole Computers" paper I cite said that the number of computations (at
the Margolus-Levitin bound) is approximately equal to the holographic
entropy bound. I wanted to do as exact a calculation as possible to
see how close these numbers were. The only way I could seem to recover the
10^123 result was by using the whole 46.5 billion light year radius of the
observable universe. Do you recall what numbers you obtained (if you still
have those calculations)?
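
For reference, here is one way to set up the Margolus-Levitin side (a
sketch; the particular choices of total energy and elapsed time below are
assumptions, and the choice of time in particular moves the answer around
within the 10^122 - 10^123 range):

# Margolus-Levitin / Lloyd-style estimate of the total number of elementary
# operations over the history of the observable universe:
#   ops ~ 2 * E * t / (pi * hbar)
import math

G    = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8            # speed of light, m/s
hbar = 1.055e-34          # reduced Planck constant, J s
H0   = 67.7e3 / 3.086e22  # ~67.7 km/s/Mpc converted to 1/s (assumed)
ly   = 9.461e15           # meters per light year
yr   = 3.156e7            # seconds per year

R        = 46.5e9 * ly                        # comoving radius, m
rho_crit = 3 * H0**2 / (8 * math.pi * G)      # critical density
M        = rho_crit * (4/3) * math.pi * R**3  # mass at critical density
E        = M * c**2                           # total energy, J
t        = 13.8e9 * yr                        # age of the universe, s

ops = 2 * E * t / (math.pi * hbar)
print(f"Total operations: {ops:.1e}")  # on the order of 10^122 to 10^123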

Jason