# [ExI] Tabby's star

Stuart LaForge avant at sollegro.com
Sun Dec 11 21:25:12 UTC 2016

Spike wrote:
<Instead of thinking of it as using all available energy (as in the
Kardashev's levels) imagine that the goal is to convert energy optimally
from the point of view of maximizing computing (creation of low entropy
data) given a fixed amount of metal in interplanetary space.  Refocus the
problem on what would be or could be done with sufficiently large available
computing power.  We don't really have this kind of application, but imagine
for some reason the idea is to find as many Mersenne Primes as possible, or
solve chess.  Then we need all the available metal and to convert energy
optimally for that purpose.  I keep getting something that looks like a
Dyson swarm or an M-brain, but it isn't tightly packed, because it cannot
be: it would overheat.>

FWIW, computational physics corroborates this. The Bekenstein bound is the
maximum amount of information you can pack into a given mass-energy
embedded in a given volume of space. It is given by the following formula:

I <= K*M*R

where I is the information in bits, M is the mass in kg, R is the radius in
meters, and K is a constant, K = (2*pi*c)/(hbar*ln2), with c the speed of
light and hbar the reduced Planck constant. So the maximum information
content is directly proportional to the product of the entity's mass and
linear dimension.
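As a minimal numerical sketch of the bound as stated above (constants are standard SI values; the 1 kg / 1 m case is just an illustration):

```python
import math

# Physical constants in SI units
c = 2.998e8        # speed of light (m/s)
hbar = 1.0546e-34  # reduced Planck constant (J*s)

# K = 2*pi*c / (hbar * ln 2), in bits per (kg*m)
K = 2 * math.pi * c / (hbar * math.log(2))

def bekenstein_bits(mass_kg, radius_m):
    """Upper bound on information (in bits) for a system of the
    given mass confined within a sphere of the given radius."""
    return K * mass_kg * radius_m

print(f"K ~ {K:.3e} bits/(kg*m)")
print(f"1 kg in a 1 m sphere: <= {bekenstein_bits(1.0, 1.0):.3e} bits")
```

The constant works out to roughly 2.6e43 bits per kilogram-meter, which is why even modest masses have astronomically large information ceilings.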

By simply dividing both sides of the inequality by the mass you get:

I/M <= K*R

so the maximum bits per unit mass is directly proportional to the size.
Now any computer that saturated the Bekenstein bound would become a black
hole, because that is how the limit was derived. But if we call
computronium the most information-dense matter that does not have an event
horizon, then for a given mass of it, the bigger it is, the more bits it
can store.
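To put a rough number on how much dispersal buys you, here is a sketch comparing the per-kilogram bound for a solar mass at its Schwarzschild radius (where the bound saturates) against the same mass dispersed over 1 AU. The constants are standard; the 1 AU scale is just an illustrative choice:

```python
import math

c = 2.998e8        # speed of light (m/s)
hbar = 1.0546e-34  # reduced Planck constant (J*s)
G = 6.674e-11      # gravitational constant (m^3/(kg*s^2))
K = 2 * math.pi * c / (hbar * math.log(2))  # bits/(kg*m)

M_sun = 1.989e30   # solar mass (kg)
AU = 1.496e11      # astronomical unit (m)

# Schwarzschild radius: the smallest radius this mass can occupy
# without being a black hole, i.e. where the bound saturates.
r_s = 2 * G * M_sun / c**2

bits_per_kg_bh = K * r_s     # compact (black-hole) limit
bits_per_kg_cloud = K * AU   # same mass dispersed over 1 AU

print(f"Schwarzschild radius: {r_s:.3e} m")
print(f"Gain from dispersal:  {bits_per_kg_cloud / bits_per_kg_bh:.1e}x")
```

Since I/M is linear in R, stretching a solar mass from ~3 km out to 1 AU raises the per-kilogram information ceiling by a factor of tens of millions.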

Then a diffuse computronium "dust" or "foam" would have the highest
information storage capacity. I imagine the latency caused by
speed-of-light delay in communication would set an upper size limit on the
computronium "cloud". So the trade-off would be memory vs. processing
speed, constrained by space and time respectively.

This line of reasoning seems to be independent of heat dissipation, at
least on the face of it.

Stuart LaForge
