[ExI] The physical limits of computation
Stuart LaForge
avant at sollegro.com
Sun Jan 28 20:50:00 UTC 2024
On 2024-01-21 18:59, Jason Resch via extropy-chat wrote:
> On Sun, Jan 21, 2024 at 2:55 PM Stuart LaForge via extropy-chat
[snip]
>> On p.49 you wrote:
>> "Using the holographic principle to model the observable
>> universe as a black hole with a radius of 46.5 billion light
>> years, we find it contains:
>> 2.33 * 10^123 bits
>> Which is the total information content of the universe."
>>
>> The observable universe cannot be modelled as a black hole because it
>> is defined by the particle horizon which, while it is one of the three
>> cosmic horizons (Hubble horizon, event horizon, and particle horizon),
>> is NOT the event horizon in a technical sense, and therefore using the
>> holographic principle on it is not really physically valid. The
>> particle horizon defining the observable universe is the boundary of
>> everything we can currently see in our past light cone since the big
>> bang. As you say, it is 46.5 billion light years away from us.
>
> The issues of the various horizons, and which was the most sensible
> one to use in this situation confused me.
Don't be confused. Listen to what the data is telling you. You can see
clearly that the observable universe is not a black hole. The redshift
is a good indicator of the large time dilation affecting the images of
the farthest galaxies we can see. In a far-away observer's frame of
reference, it looks like the farthest galaxies are frozen in time,
painted on the cosmological event horizon at 16 Gly from us. Any
computation you could observe there, i.e. its literal clock speed, has
therefore been slowed down tremendously.
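For what it's worth, that 16 Gly figure drops out of a short numerical
integration. Here is a quick sketch in Python, assuming flat Lambda-CDM
with H0 = 67 km/s/Mpc, Omega_m = 0.31, and Omega_Lambda = 0.69 (the
exact number shifts a bit with the parameters you pick):

# Proper distance today to the cosmological event horizon:
# d_EH = c * integral_1^inf da / (a^2 * H(a)),  H(a) = H0*sqrt(Om/a^3 + OL)
from scipy.integrate import quad
import numpy as np

c = 2.998e5            # km/s
H0 = 67.0              # km/s/Mpc  (assumed value)
Om, OL = 0.31, 0.69    # assumed density parameters, flat universe

H = lambda a: H0 * np.sqrt(Om / a**3 + OL)
integral, _ = quad(lambda a: 1.0 / (a**2 * H(a)), 1.0, np.inf)
d_mpc = c * integral               # Mpc
d_gly = d_mpc * 3.262e6 / 1e9      # Mpc -> light years -> Gly
print(f"event horizon distance ~ {d_gly:.1f} Gly")   # roughly 16-17 Gly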
> This section is on the
> observable universe, and ideally, I would like to provide both
> computational, and information content estimates for the observable
> universe as a whole, if that is possible, the observable universe
> being the largest thing we can see (and presumably, the thing with the
> greatest information content).
I understand your motivation, but you are seeing the galaxies of the
observable universe as they were, frozen in time when they crossed the
cosmological event horizon billions of years ago. You cannot, even in
principle, see events that occurred to those galaxies after they
crossed the event horizon, unless they happened somewhere between the
current horizon and the future horizon on which all the horizons
eventually converge. The particle horizon, and therefore the
"observable universe", is a construct built upon induction from the
cosmological principle beyond what is physically observable. For all we
know, the flying spaghetti monster could have eaten them after they
crossed the event horizon and we might never know.
> Our inability to interact with distant objects we can see (beyond our
> event horizon) I consider is (perhaps?) not necessarily important to
> the question of how much information "is in" the observable universe,
> as all those far away photons, still "made it here", into the sphere
> of the observable. So then, when it comes to determining how much
> information exists within the totality of what is observable (despite
> the fact that we can no longer signal back to the most distant
> places), information about those places has still made it here, and
> computations which took place in those far away galaxies are still
> perceivable by us, and hence ought to be included in the total amount
> of computations that have occurred in the history of the observable
> universe, should they not?
I get that. The redshift and time dilation of the farthest and
therefore the fastest "moving" galaxies is so great that you would have
to watch for years to see a single second of "computation". Once a
galaxy crosses the cosmic event horizon, for the rest of YOUR time you
are looking at a perpetually slowing film of it getting closer and
closer to, but never quite reaching, the event horizon. Yes, you are
seeing galaxies that are by the FLRW metric currently 46 billion light
years away, but you are only seeing a frozen snapshot of them as they
were when they crossed the cosmic event horizon between 13.8 billion
and 16 billion years ago... unless something happened to them.
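To put a rough number on the slow-down: observed durations stretch by
the same (1 + z) factor as the wavelengths, so the apparent clock rate
of a distant galaxy falls off as 1/(1 + z) and goes to zero as it
approaches the event horizon. A quick illustration in Python, with
made-up redshifts:

# Cosmological time dilation: a process lasting dt_emit in the galaxy's
# frame is observed here to last dt_obs = (1 + z) * dt_emit.
for z in (1, 11, 1000, 1e6):      # illustrative values; z -> infinity
                                  # at the event horizon
    dt_obs = (1 + z) * 1.0        # seconds observed per second emitted
    print(f"z = {z:>9}: 1 s of their 'computation' takes {dt_obs:.0f} s here")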
>
> I also found it interesting, that the mass of the universe (when
> computed using the critical density, which comes from Friedmann
> equations and multiplying this by the volume of the observable
> universe) led to a mass which is considerably greater than the mass of
> a black hole with the radius of the observable universe. I take this
> to be a result of not all the energy in the universe being matter
> energy, but also radiation and dark energy, which I presume to affect
> the curvature of space differently than ordinary mass. Is that right?
Those things do change the stress-energy tensor, but for the most part
cosmologists simply define a statistic called omega, which is the sum
of the partial densities of the various constituents divided by
Friedmann's critical density, which is the total density of the whole
shebang. The caveat is that the critical density is partially based on
the Hubble parameter, which seems to change with time and perhaps even
space.
https://ned.ipac.caltech.edu/level5/March06/Overduin/Overduin4.html
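Concretely, the critical density is rho_crit = 3*H^2/(8*pi*G), and
omega is just the measured partial densities stacked up against it. A
quick sketch in Python, assuming H0 = 67 km/s/Mpc and round density
fractions:

import math

G = 6.674e-11                        # m^3 kg^-1 s^-2
H0 = 67.0 * 1000 / 3.086e22          # km/s/Mpc -> 1/s (assumed value)
rho_crit = 3 * H0**2 / (8 * math.pi * G)
print(f"rho_crit ~ {rho_crit:.2e} kg/m^3")   # ~8.4e-27 kg/m^3

# omega = sum of partial densities over the critical density
# (assumed round numbers: matter ~31%, dark energy ~69%, radiation ~0.01%)
omegas = {"matter": 0.31, "dark energy": 0.69, "radiation": 1e-4}
print(f"omega_total ~ {sum(omegas.values()):.4f}")   # ~1 means flat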
The reason the mass you calculated for the observable universe based on
the critical density is more than that of a black hole of the same
radius as the observable universe is that, volume-wise, most of the
so-called observable universe lies outside of our causal cell. A black
hole whose Schwarzschild radius is equal to the Hubble radius is the
only size a black hole made of mass-energy at the Friedmann critical
density can take. At least, that is according to a corollary of the
Schwarzschild metric that I discovered: the surface area of an event
horizon multiplied by the average density of the space contained by the
event horizon is always equal to a constant, such that D * A = K with
K = 3*(speed of light)^2 / (2*(Newton's G)).
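For what it's worth, here is a quick numerical check of that corollary
in Python, using nothing beyond the textbook Schwarzschild radius
r_s = 2*G*M/c^2:

import math

G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
K = 3 * c**2 / (2 * G)              # the claimed constant, ~2.0e27 kg/m

def density_times_area(M):
    """Average enclosed density times horizon area for a Schwarzschild hole."""
    r_s = 2 * G * M / c**2
    density = M / ((4.0 / 3.0) * math.pi * r_s**3)
    area = 4 * math.pi * r_s**2
    return density * area           # algebraically 3*M/r_s = 3*c^2/(2*G)

for M in (2e30, 4e36, 1e53):        # ~1 solar mass, Sgr A*-ish, universe-ish
    print(f"M = {M:.0e} kg: D*A = {density_times_area(M):.3e}  (K = {K:.3e})")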
As you can see from examining the equation, the larger the black hole,
the lower the average density of the space that the event horizon
surrounds. If the observable universe were a black hole, it would be a
causal cell containing space less dense than the Friedmann critical
density of our causal cell. Remember, all that is needed for that to
happen is a smaller Hubble parameter than ours.
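As a rough illustration in Python, hypothetically treating the 46.5 Gly
radius as a Schwarzschild/Hubble radius (the exact numbers depend on
the H0 you assume):

import math

G = 6.674e-11                        # m^3 kg^-1 s^-2
c = 2.998e8                          # m/s
ly = 9.461e15                        # metres per light year

r = 46.5e9 * ly                      # 46.5 Gly as a Schwarzschild radius, m
density = 3 * c**2 / (8 * math.pi * G * r**2)    # from D*A = 3c^2/(2G)
print(f"average density ~ {density:.2e} kg/m^3")  # ~8e-28, below ~8.4e-27

# A flat causal cell with that critical density needs a smaller Hubble
# parameter: rho_crit = 3 H^2 / (8 pi G)  =>  H = c / r
H = c / r                                         # 1/s
print(f"implied H ~ {H * 3.086e22 / 1000:.0f} km/s/Mpc")   # ~21, vs ~67 for us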
Here is a thread on the Extropolis list where I use causal cell theory
and wave harmonics to explain the cosmological constant problem /
vacuum catastrophe:
https://groups.google.com/g/extropolis/c/QA-kRIBt6vM/m/4d9cqudeAQAJ
[snip]
> For reference, here is how I did the calculation:
> https://docs.google.com/spreadsheets/d/1lR5mj4jFQft7Q-hKC4bYAoQt-BlukVy7MbEQV0eeu8s/edit?usp=sharing
>
> Okay, that might simplify things. Though I did find it quite an
> interesting result, that using the much larger and more complex
> observable universe radius, which is a function of the co-moving
> distance factor, and applying the holographic principle to measure its
> area in Planck lengths / 4, and converting that to natural units of
> information, it gave a result that was within 1% of the total number
> of computations that could have occurred in the history of the
> universe (from calculating its current mass as the critical density
> filling the entire volume of the observable universe) -- which note is
> a mass greater than that of a black hole. It might just be a strange
> coincidence, but if not perhaps it suggests something deeper.
You used the right techniques; you just misapplied them and came up
with an answer contradictory to general relativity. The deeper thing
that this suggests is that the most distant regions of observable space
are expanding at a different rate than ours because their Hubble
parameter has a different value than ours does. When you correct for
general relativity, you discover that in order to remain flat, the
observable universe has to have a different critical density than our
causal cell. Congratulations, you figured out that the homogeneity and
isotropy ASSUMED by the Cosmological Principle on which the FLRW metric
is based is bullshit, which is what I have heretically maintained for
years based on causal cells. Nowadays, with the Hubble tension
discovered by the HST, the too-early galaxies discovered by the JWST,
and gigantic trans-galactic superstructures discovered by the Sloan
Digital Sky Survey, the Lambda-CDM standard big bang cosmological model
is being torpedoed left and right.
https://www.space.com/big-ring-galactic-superstructure-celestial-anomaly
>
>> It should be noted all three horizons change over time, and in the
>> far future our causal cell will reach its maximum extent possible,
>> where the Hubble constant will stop changing and the Hubble horizon
>> will stop moving. This future state will be a black hole that is
>> composed entirely of dark energy and would be the largest black hole
>> possible in a causal cell composed entirely of dark energy or the
>> cosmological constant. In other words, a black hole composed entirely
>> of the vacuum energy of empty space, where the cosmological event
>> horizon coincides exactly with the black hole event horizon. This is
>> also called a Nariai black hole and is a feature of the de
>> Sitter-Schwarzschild metric. If current estimates of the cosmological
>> constant are correct, then this ultimate black hole / causal cell
>> will have a radius of about 16 billion light years, which coincides
>> with the current cosmological event horizon.
>
> Interesting, I had never heard of Nariai black holes before.
They are interesting, but they depend on the notion that the
cosmological constant/vacuum energy of all the space-time in the
universe is constant. Since in a mathematical sense almost all of
space-time is hidden from us behind causal event horizons, it takes a
degree of faith to believe this.
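Under that constant-lambda assumption, the far-future horizon described
in the quoted passage is just the de Sitter radius set by the dark
energy density. A quick sketch in Python, assuming H0 = 67 km/s/Mpc and
Omega_Lambda = 0.69; it lands in the same 16-18 Gly ballpark as the
figure quoted above:

import math

c = 2.998e5                          # km/s
H0 = 67.0                            # km/s/Mpc (assumed value)
OL = 0.69                            # assumed dark-energy fraction

# Once only dark energy is left, H settles to H_inf = H0*sqrt(Omega_Lambda)
# and the event horizon freezes at the de Sitter radius c / H_inf.
H_inf = H0 * math.sqrt(OL)
r_mpc = c / H_inf
r_gly = r_mpc * 3.262e6 / 1e9        # Mpc -> light years -> Gly
print(f"asymptotic horizon radius ~ {r_gly:.1f} Gly")   # ~17.5 Gly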
>
>> I hope that I am not being overly pedantic with regards to what you
>> were trying to show about limits of computation.
>
> I appreciate your review. I strive for factual correctness, so the
> more pedantic the better. :-)
>
>> I sort of went down this same track myself a few years ago and came
>> up with answers within an order of magnitude or so of yours, so you
>> are on the right track.
>
> I saw various estimates of 10^122 - 10^123, and in particular, the
> "Black Hole Computers" paper I cite, said that the number of
> computations (at the Margolus–Levitin bound) is approximately equal
> to the Holographic entropy bound. I wanted to try to do an as exact
> calculation as possible to see how close these numbers were. The only
> way I could seem to recover the 10^123 result was using the whole 46.5
> billion light year radius of the observable universe. Do you recall
> what numbers you obtained (if you still have the calculations)?
I got a figure of 3.27 * 10^122 bits for the classical information
content of our causal cell bounded by the Hubble radius. I don't think
it makes sense to talk about "computation" involving space-like
separated regions of space time, at least with regard to classical
computing. But hey, what does anybody really know? The universe is a
strange and wonderful place and I just live here. ;)
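For what it's worth, a holographic-bound count at the Hubble radius
lands on essentially the same number. A quick sketch in Python,
assuming H0 = 67.4 km/s/Mpc and counting A/(4*l_p^2*ln 2) bits:

import math

c = 2.998e8                          # m/s
H0 = 67.4 * 1000 / 3.086e22          # km/s/Mpc -> 1/s (assumed value)
l_p = 1.616e-35                      # Planck length, m

R_H = c / H0                         # Hubble radius, ~1.4e26 m
A = 4 * math.pi * R_H**2             # horizon area, m^2
bits = A / (4 * l_p**2 * math.log(2))   # holographic bound in bits
print(f"holographic bound at Hubble radius ~ {bits:.2e} bits")   # ~3.3e122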
Stuart LaForge