[extropy-chat] Clock rate or rather communication delays

Keith Henson hkhenson at rogers.com
Mon May 8 03:23:00 UTC 2006


At 11:20 PM 5/7/2006 +0200, you wrote:
>On Sun, May 07, 2006 at 02:48:05PM -0400, Keith Henson wrote:
>
> > As a bet you are not an engineer.  Getting rid of waste heat is the
> > bane of engineers.
>
>One of my hats is a chemist. The other, molecular biologist. Third,
>computational chemist. Yet another is someone who takes an interest in
>cluster supercomputing, and operates enough hardware concentration in the
>rack that power dissipation density is an issue. (I also have other
>headgear that is less used, but that's enough for the moment.)
>
>Outside of an engineer's domain, a bee's brain takes a microwatt, and a human
>brain ~20 W. Also outside of an engineer's domain there are supercold
>condensed phase, reversible computation, and nonclassical (quantum)
>computation. Biology does many things differently than classical engineering;
>one of them is dealing with power issues creatively. Biology has other
>limits, however, so there is no reason why we can't still beat biology by
>many orders of magnitude in computation efficiency. Spintronics
>in a buckytronics context makes any synapse turn GFP-green with envy.

All this indicates to me that you think computation per joule is going to
be a consideration up there with the speed-of-light limit.  Even if you have
lots of energy, it's no good if your brain catches on fire.

>If you look into Nanosystems,

My wife is listed as one of the editors.  I read it first in draft.

>the limits on manageable power dissipation
>density are quite wide, so things remain quite interesting even for
>classical systems in the few-100 K range. I must admit I never bought into
>Jupiter Brains much, because of a power issue. Assemblies of computational
>nodes revolving around their own gravitational center have the problem of
>being powered from the outside (pumped by a larger assembly of photovoltaic
>modules in a circumstellar orbit) while having to dissipate simultaneously.
>These are more suitable if there's a power source at the center (a
>microsingularity, or similar), or if each individual node is powered by a
>fusion power source.

>I must admit I don't see why one just doesn't surround the star with an
>optically (semi-)opaque cloud of modules, use the star's output directly,
>and dump into the 4 K cosmic background. You need a lot of orbiting stuff
>to blot out a star, so locally the concentration is at least Jupiter Brain
>grade, but it wraps a thick cloud shell around the star.

If you can keep the average thickness down to the equivalent of a few
nanometers of aluminum, you can float on the light and surround the star
without being in orbit.  If you leave the cover off one side, the star
becomes a fusion/photon drive (for those not in a hurry).

If you are going to orbit, a computation node becomes mostly power plant
and radiator.  In 1979 Drexler and I wrote a paper for a conference at
Princeton on space radiators that used ground-up rock as the heat transfer
medium.  I scanned in a copy of it a few days ago if you would like to see
it.  One of the discoveries we made is that radiators have an inherent
square-root diseconomy of scale.
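The square-root result is not derived in this message, but a plausible reconstruction (my sketch, not the Henson/Drexler paper's actual derivation) goes like this: a radiator of area A rejects heat in proportion to A, but the heat-transfer medium must traverse a path of length proportional to sqrt(A), so medium-in-transit mass scales as A * sqrt(A), and mass per watt rejected grows as sqrt(A). The coefficients below are arbitrary illustrative numbers:

```python
# Hypothetical scaling sketch of a square-root diseconomy of scale for
# radiators (assumed model, not the original paper's):
#   heat rejected      Q ~ A
#   surface mass         ~ A
#   medium-in-transit    ~ A * sqrt(A)   (transit path length ~ sqrt(A))
# so specific mass (mass per unit heat) grows as sqrt(A) for large A.

def specific_mass(area, k_struct=1.0, k_fluid=0.1):
    """Relative radiator mass per unit of heat rejected (arbitrary units)."""
    structure = k_struct * area            # radiating surface itself
    medium = k_fluid * area * area ** 0.5  # medium in transit, path ~ sqrt(A)
    heat = area                            # heat rejected ~ area
    return (structure + medium) / heat

for area in [1.0, 100.0, 10000.0]:
    print(f"A = {area:8.0f}  ->  mass/heat = {specific_mass(area):.2f}")
```

Scaling the radiator up by a factor of 100 makes each watt of rejected heat cost several times as much mass, which is the diseconomy in question: many small radiators beat one big one.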

> > That was just to put a number on it, but in fact, some speed short of c,
> > perhaps way short, may be as fast as it is practical to go.  Depends on
> > how much dust you run into.
>
>Because you see where you're going in advance, mapping dust is not difficult.

I would be really interested in how you would do this.  If you are going to 
probe the path to the target with a laser before launch, you might as well 
launch at 1/3 c.

>Because impact damage is localized, and relativistic launches will be done
>using redundant probe clusters, individual destructive encounters are
>manageable (of course, if you hit a big dark body in transit you're just
>emulating a few MT of nuclear fireworks -- very pretty, end of the journey).
>Because resilient, self-rebuilding probes are a must, if only because of
>the neutral hydrogen background (which is equivalent to a pretty luminous
>proton beam if you're travelling really fast), localized circuitry nuking is
>not a problem by design. You can't travel unless you have a metabolism
>and very active background rebuilding machinery. D. radiodurans would
>never have a chance.
>
> > Any reasonable computer can emulate another.  Of course the performance
>
>One reasonable computer can emulate another -- provided it has more
>memory (compression accounted for) than the system emulated. If it doesn't,
>it can't. I can't emulate a 48 k Apple ][ running UCSD p-System on a 4 k
>Sinclair Z-80. I can't emulate even Alfred E. Neuman with a current
>Blue Gene.
>
> > might really suck.
>
>If I need 10^13 real years to simulate 1 ms of what happens within
>a biological system (assuming, I have enough storage to represent
>said system) effectively I can't run this simulation. In practical
>terms, currently, any simulation taking more than 2-3 years is
>impractical.

10^13 years might try the patience of even the immortals.

> > um I am not sure of.  If ps is pico second, I really don't understand.
>
>It appears reasonable that you can emulate what 1 ms scale biological
>processes do in solid-state classical computation at the 1 ns to 1 ps range
>(1 ns is certain; 1 ps might be pushing it, depending on issues like power
>dissipation density and computation reversibility -- if your ratios of
>ones and zeros roughly balance each other locally, there is no thermodynamic
>need to erase bits). That's a speedup of 10^6..10^9 relative to the
>wall clock.

That's about what I get.  Spiffy, but the stars recede out of reach.
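The "stars recede" point can be made concrete with simple arithmetic (my illustration; the 10^6 speedup and 1/3 c figures come from earlier in this thread, and relativistic time dilation is ignored since it is only ~6% at 1/3 c):

```python
# For a mind running 10^6..10^9 times faster than biology, wall-clock
# travel time is multiplied by the speedup to get subjectively
# experienced time.  Interstellar distances balloon accordingly.

def subjective_years(distance_ly, speed_frac_c, speedup):
    """Subjective years experienced during a trip of distance_ly
    light-years at speed_frac_c of lightspeed, for a mind sped up
    by the given factor over biological wall-clock time."""
    wall_clock_years = distance_ly / speed_frac_c
    return wall_clock_years * speedup

# Alpha Centauri (~4.37 ly) at 1/3 c, for a mind running 10^6x faster:
print(f"{subjective_years(4.37, 1/3, 1e6):.2e} subjective years")
```

Roughly 1.3 * 10^7 subjective years to the nearest star system: from the inside, the galaxy looks effectively frozen and the stars are subjectively much farther away than they are for us.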

> > Oh I am not disappointed.  Wasn't interested in becoming a "Jupiter
> > brain," just thought the notion was silly.
> >
> > And if you agree that there are any limits at all, you are in my camp
> > because that argues for more than one AI.
>
>Absolutely. I'm in the postradiation/postspeciation high-diversity
>population of postbiological beings scenario. Some of them smart, most
>of them (by weight) dumb, just like a rain forest/tropical reef, only in
>deep space, and lots faster (most of the processing involving moving bits,
>much less atoms). A lot of the activity has to occur at the physical
>layer, though, given that whoever controls the physical layer controls
>everything. You can't control nanopests gnawing away at cyberleviathans
>unless you have a physical-layer immune system operating. The best perimeter
>security gives you naught if someone sneaks up and eats your crunchy
>computronium chunk brains with a little sunlight.

It could happen.  It could also be very different.

Keith Henson



