[ExI] immortal
Jason Resch
jasonresch at gmail.com
Mon Aug 4 02:10:44 UTC 2025
On Sun, Aug 3, 2025, 9:10 PM Adrian Tymes via extropy-chat <
extropy-chat at lists.extropy.org> wrote:
> On Sun, Aug 3, 2025 at 8:32 PM Jason Resch via extropy-chat
> <extropy-chat at lists.extropy.org> wrote:
> > These constraints apply to all classical computing architectures
> physically realizable given the laws of this universe.
>
> Classical, perhaps. Once upon a time, that would have included vacuum
> tubes and that's it. Likewise, they now include every architecture
> that you can imagine - but you have no evidence that that is every
> possible computing architecture, only the "classical" ones.
>
(Note that by classical, I mean classical computing (in the Turing machine
sense), i.e. not quantum computers. Quantum computers are still limited to
the clock speed implied by Bremermann's limit, but certain quantum
operations can exploit computations occurring across the multiverse, and so
classical computers have no corresponding constant-time operation. The
cases where quantum computers offer any advantage are quite limited, so
quantum computers shouldn't be considered any kind of general-purpose
replacement for classical computers. I think it's probably better to put
them aside for the purposes of this discussion.)
Now as to why these limits apply to all conceivable architectures, consider
the following:
How fast can a bit detectably change? Would you agree that a CPU can't have
a clock time shorter than the time it takes for a bit to change? If so,
consider that the shortest possible time that can be measured is the
fastest physically possible thing traversing the smallest physically
detectable distance. Since time is distance divided by speed, to get the
shortest measurable time we need the smallest distance and the fastest speed.
In physics that would be the Planck length and the speed of light. The
shortest time is then the Planck time. But to measure something as short as
the Planck time you need a Planck mass of energy. If you use less energy,
you'll have a photon with a longer wavelength and have to watch it traverse
a longer distance before you can reliably determine it to be in a different
location. All of this is proportional: the more mass-energy at your
disposal, the shorter the time interval that can be measured. The math
works out to about 10^50 Hz per kilogram of mass-energy (Bremermann's
limit). So for a given amount of mass, there is a maximum physical
frequency (or shortest clock time), governed by the shortest possible time
needed for the physical system to detectably change its state. And
detecting state changes is a requirement of any bit operation (i.e. a
computation). So the best possible speed of a computer, in terms of
operations per second, depends on the mass of the computer.
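To make the arithmetic explicit, here is a minimal Python sketch of that
figure, assuming the limit takes the form nu_max = m * c^2 / h (the
constants and the form are my own illustration, not taken from the links
above):

    # Rough check of the ~10^50 Hz per kilogram figure (Bremermann's limit),
    # assuming a maximum serial bit-operation rate of nu_max = m * c^2 / h.
    c = 2.998e8      # speed of light, m/s
    h = 6.626e-34    # Planck constant, J*s

    def max_ops_per_second(mass_kg):
        """Maximum bit-operation rate for a system with the given mass-energy."""
        return mass_kg * c**2 / h

    print(max_ops_per_second(1.0))   # ~1.36e50 operations per second for 1 kg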
This isn't about vacuum tubes or transistors; it's about the basic
physical limits.
I think you would agree we can't exceed the density of a black hole. Black
holes represent another computing extreme: the highest information density
possible for a given volume. Black holes also have the fastest clock speed
of any serial computation (for a given mass). Of all physically realizable
computers of a given memory capacity, they have the shortest time to update
any bit in that memory (proportional to the event horizon diameter divided
by the speed of light).
Thus black holes can be seen as a form of computronium.
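To put a number on that parenthetical, here is a minimal sketch (same
caveat: my own illustration) of the event-horizon light-crossing time,
using the Schwarzschild radius r = 2*G*M/c^2:

    # Light-crossing time of a black hole's event horizon for a given mass:
    # the "diameter / speed of light" figure mentioned above, i.e. a lower
    # bound on the time to signal any bit in a black-hole-sized memory.
    G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8      # speed of light, m/s

    def horizon_crossing_time(mass_kg):
        """Event-horizon diameter divided by the speed of light, in seconds."""
        schwarzschild_radius = 2 * G * mass_kg / c**2
        return 2 * schwarzschild_radius / c

    print(horizon_crossing_time(1.0))      # ~1e-35 s for a 1 kg black hole
    print(horizon_crossing_time(2.0e30))   # ~2e-5 s for a solar-mass black hole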
> > Did you review any of the links I sent?
>
> They say nothing about the objection I stated. They are non sequiturs.
>
Would you at least agree that, absent any surprising new discoveries in
physics, we can use our current understanding of physics to calculate some
of the constraints that the best possible computers will be subject to, and
that these constraints apply regardless of how they're engineered?
> >> > So it will take another 112 doublings of current computer speed to
> get there. Over the past century the trend has been fairly consistent of
> computing technology doubling roughly every 18 - 24 months
> >>
> >> Even if we were to constrain ourselves to traditional electronics,
> >> Moore's Law has been pointed out as not necessarily holding steady -
> >> and it's been an economic law, not a physical one.
> >
> > I think it is more generally a property of recursively improving systems.
>
> You based your claim on observations of, specifically, electronic
> computers under Moore's Law. Escaping to generality in this manner
> voids your ability to place specific numbers on it.
>
Moore's law is only the latest (and so far the fifth) paradigm to fit the
law of accelerating returns.
If you go back to before integrated circuits, you will find the exponential
trend has persisted across numerous paradigms, including electromechanical,
relay, vacuum tube, and transistor-based computing, for over a century.
See the chart here:
https://alwaysasking.com/when-will-ai-take-over/#Computing_Trends
Our brains are about 500,000 times more energy efficient, and 300,000 times
more mass efficient, than our current computers. Our DNA has a data density
many orders of magnitude greater than our best data storage technology.
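For a rough sense of scale, and assuming the 18-24 month doubling time
discussed above simply continues, closing a 500,000-fold efficiency gap is
only about 19 doublings:

    import math

    # How many doublings close a 500,000x efficiency gap, and roughly how
    # long that takes at the 18-24 month doubling rate discussed above. The
    # factors are the rough figures quoted in this post, not precise data.
    energy_gap = 500_000
    doublings = math.log2(energy_gap)    # ~18.9

    for months_per_doubling in (18, 24):
        years = doublings * months_per_doubling / 12
        print(f"~{doublings:.0f} doublings at {months_per_doubling} months each: ~{years:.0f} years")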
Our technology still has a long way to go to catch up with biology, and
biology is by no means anywhere near the limits of physical possibility. I
have confidence that computing technology will continue to advance, and
will likely see many more paradigms come and go beyond integrated circuits.
We've now reached the stage where AI is participating in chip design.
Clearly this is a self-reinforcing virtuous cycle of the kind that leads to
exponential growth.
Jason