[extropy-chat] Clock rate or rather communication delays

Eugen Leitl eugen at leitl.org
Sun May 7 21:20:40 UTC 2006


On Sun, May 07, 2006 at 02:48:05PM -0400, Keith Henson wrote:

> As a bet you are not an engineer.  Getting rid of waste heat is the bane of 
> engineers.

One of my hats is that of a chemist; another, a molecular biologist; a third,
a computational chemist. Yet another belongs to someone who takes an interest
in cluster supercomputing and operates enough hardware concentration in the rack
that power dissipation density is an issue. (I have other headgear that sees
less use, but that's enough for the moment.)

Outside of an engineer's domain, a bee's brain takes a microwatt, and a human
brain ~20 W. Also outside of an engineer's domain are supercold
condensed phase, reversible computation, and nonclassical (quantum) computation.
Biology does many things differently from classical engineering; one
of them is dealing with power issues creatively. Biology has other limits,
however, so there is no reason we can't still beat it by many orders
of magnitude in computational efficiency. Spintronics
in a buckytronics context makes any synapse turn GFP-green with envy.
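For a sense of the headroom, a back-of-envelope sketch: compare the brain's
energy per synaptic event with the Landauer limit for erasing one bit at body
temperature. The synapse count and mean event rate below are common textbook
estimates, not figures from this thread.

import math

# Assumed figures: ~1e14 synapses at ~1 Hz mean event rate (rough estimates)
P_BRAIN = 20.0                            # W, whole-brain power draw
SYNAPSES = 1e14
RATE_HZ = 1.0

e_event = P_BRAIN / (SYNAPSES * RATE_HZ)  # ~2e-13 J per synaptic event
K_B = 1.380649e-23                        # J/K, Boltzmann constant
e_landauer = K_B * 310 * math.log(2)      # ~3e-21 J to erase one bit at 310 K

print(e_event / e_landauer)               # ~7e7: many orders of magnitude to spare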

If you look into Nanosystems, the limits on manageable power dissipation
density are quite wide, so things remain interesting even for
classical systems in the few-100 K range. I must admit I never bought into
Jupiter Brains much, because of the power issue. Assemblies of computational
nodes revolving around their common gravitational center have the problem of
being powered from the outside (pumped by a larger assembly of photovoltaic
modules in circumstellar orbit) while having to dissipate waste heat simultaneously.
They are more suitable if there's a power source at the center (a microsingularity,
or similar), or if each individual node carries its own fusion power source.
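The dissipation side is easy to quantify with the Stefan-Boltzmann law (the
radiator temperatures below are my illustrative picks): a black body sheds
sigma*T^4 watts per square meter, so a cluster running in the few-100 K range
gets only a few hundred W/m^2 of cooling through whatever surface it can point
at cold sky -- and a gravitationally bound ball of nodes has lots of volume but
comparatively little such surface.

SIGMA = 5.670374419e-8                     # W m^-2 K^-4, Stefan-Boltzmann constant

# Radiative cooling budget per unit area: P/A = sigma * T^4
for t_kelvin in (100, 300, 500):
    print(t_kelvin, SIGMA * t_kelvin**4)   # ~5.7, ~459, ~3544 W/m^2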

I must admit I don't see why one doesn't just surround the star with an optically
(semi-)opaque cloud of modules, use the star's output directly, and dump the waste
heat into the 4 K cosmic background. You need a lot of orbiting stuff to blot out
a star, so locally the concentration is at least Jupiter Brain grade, but wrapped
as a thick cloud shell around the star.
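For scale, a sketch of the shell's equilibrium temperature, assuming it absorbs
the full stellar output and reradiates as a black body from its outer surface
(solar luminosity and the radii below are my example values):

import math

SIGMA = 5.670374419e-8   # W m^-2 K^-4, Stefan-Boltzmann constant
L_SUN = 3.828e26         # W, solar luminosity
AU = 1.496e11            # m, astronomical unit

def shell_temp_k(r_m, luminosity_w=L_SUN):
    # L = 4*pi*r^2 * sigma * T^4  ->  T = (L / (4*pi*sigma*r^2))**0.25
    return (luminosity_w / (4 * math.pi * SIGMA * r_m**2)) ** 0.25

for r_au in (1, 5, 30):
    print(r_au, round(shell_temp_k(r_au * AU)))   # ~394 K, ~176 K, ~72 K

Push the cloud radius out, and the same hardware runs cooler against the 4 K
background.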
 
> That was just to put a number on it, but in fact, some speed short of c, 
> perhaps way short, may be as fast as it is practical to go.  Depends on how 
> much dust you run into.

Because you see where you're going well in advance, mapping dust is not difficult.
Because impact damage is localized, and relativistic launches will be done
using redundant probe clusters, individual destructive encounters are
manageable (of course, if you hit a big dark body in transit you're just emulating
a few MT of nuclear firework equivalents -- very pretty, and the end of the journey).
Because resilient, self-rebuilding probes are a must, if only because of
the neutral hydrogen background (which is equivalent to a pretty luminous
proton beam if you're travelling really fast), localized circuitry nuking is
not a problem by design. You can't travel unless you have a metabolism
and very active background rebuilding machinery. D. radiodurans would
never stand a chance.
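The firework figure checks out with nothing more than relativistic kinetic
energy, E = (gamma - 1) m c^2; the 0.9c cruise speed and the masses below are
my illustrative picks, not numbers from this thread.

import math

C = 2.998e8              # m/s, speed of light
MT_TNT = 4.184e15        # J per megaton of TNT equivalent

def impact_mt(mass_kg, beta):
    # Relativistic kinetic energy of the closing mass, in MT of TNT
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return (gamma - 1.0) * mass_kg * C**2 / MT_TNT

print(impact_mt(1e-9, 0.9))   # 1 microgram dust grain: ~3e-8 MT, localized damage
print(impact_mt(0.1, 0.9))    # 100 g dark body: ~2.8 MT, end of the journey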

> Any reasonable computer can emulate another.  Of course the performance 

One reasonable computer can emulate another -- provided it has more
memory (compression accounted for) than the system emulated. If it doesn't,
it can't. I can't emulate a 48 K Apple ][ running UCSD p-System on a 4 K
Sinclair Z80 machine. I can't even emulate Alfred E. Neuman with a current
Blue Gene.
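Stated as a condition (illustrative framing only; the compression factor is a
placeholder, not a real measurement):

def can_emulate(host_mem_bytes, guest_state_bytes, compression=1.0):
    # A host can emulate a guest only if the guest's (compressed) state fits.
    return host_mem_bytes >= guest_state_bytes / compression

print(can_emulate(4 * 1024, 48 * 1024))   # 4 K host vs. 48 K Apple ][: False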

> might really suck.

If I need 10^13 real years to simulate 1 ms of what happens within
a biological system (assuming I have enough storage to represent
said system), I effectively can't run the simulation. In practical
terms, currently, any simulation taking more than 2-3 years is
impractical.
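The slowdown factor is easy to make explicit:

YEAR_S = 3.156e7                      # seconds per year
slowdown = (1e13 * YEAR_S) / 1e-3     # 10^13 years of wall clock per 1 ms simulated
print(slowdown)                       # ~3e23 times slower than real time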
 
> um I am not sure of.  If ps is pico second, I really don't understand.

It appears reasonable that you can emulate what 1 ms scale biological
processes do in solid-state classical computation in the 1 ns to 1 ps range
(1 ns is certain; 1 ps might be pushing it, depending on issues like power
dissipation density and computation reversibility -- if your ones and zeros
roughly balance each other locally, there's no need to erase bits
thermodynamically). That's a speedup of 10^6..10^9 relative to the
wall clock.
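Checking the factor:

BIO_EVENT_S = 1e-3                    # 1 ms of biological process
for t_solid_s in (1e-9, 1e-12):       # reproduced in 1 ns .. 1 ps of solid state
    print(BIO_EVENT_S / t_solid_s)    # 1e6 and 1e9 x wall-clock speedup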
 
> Oh I am not disappointed.  Wasn't interested in becoming a "Jupiter brain," 
> just thought the notion was silly.
> 
> And if you agree that there are any limits at all, you are in my camp 
> because that argues for more than one AI.

Absolutely. I'm in the postradiation/postspeciation camp: a high-diversity
population of postbiological beings, some of them smart, most
of them (by weight) dumb, just like a rain forest or a tropical reef, only in
deep space, and a lot faster (most of the processing involving moving bits,
much less atoms). A lot of the activity has to occur at the physical
layer, though, given that whoever controls the physical layer controls
everything. You can't control nanopests gnawing away at cyberleviathans
unless you have a physical-layer immune system operating. The best perimeter
security gains you naught if someone sneaks up and eats your crunchy
computronium chunk brains with a little sunlight.

-- 
Eugen* Leitl leitl http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820            http://www.ativel.com
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE

