[ExI] for the fermi paradox fans

Robin D Hanson rhanson at gmu.edu
Sun Jun 15 21:37:04 UTC 2014


I finally had a chance to look over these papers, and I'm not convinced they are modeling the interesting case. With reversible computers there is of course a lot of thermal noise, but the system is designed to robustly accommodate noise in the flexible state of the system (e.g. local charge), though not of course in the structure (e.g. conductor vs. insulator) that channels those flexible states. In real computers the main noise other than thermal noise in the flexible state is due to cosmic rays. But that noise is very highly correlated spatially, which makes it much easier to deal with: just redundantly do big computation chunks in spatially separated places and vote on the answer.
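A minimal Python sketch of that voting scheme (the three-way split and the function names are my own illustration, not from the papers):

from collections import Counter

def vote(results):
    # Majority vote over replica outputs; masks corruption of any
    # single replica as long as the others agree.
    winner, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority -- multiple replicas corrupted")
    return winner

def redundant_compute(compute, inputs, replicas=3):
    # Run the same big chunk on notionally spatially separated units;
    # a single cosmic-ray track then corrupts at most one replica.
    results = [compute(inputs) for _ in range(replicas)]
    return vote(results)

# Example: all replicas agree, then one struck by a "cosmic ray".
print(redundant_compute(lambda x: x * x, 12))   # -> 144
print(vote([144, 144, 9999]))                   # corrupted replica outvoted -> 144

The point of the spatial separation is that a correlated burst hits one replica's neighborhood, not all three, so simple majority logic suffices.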

I wonder if there is a way to make something like a computer gate that can switch to an energy-collector mode when a cosmic ray comes through. Then the noise might more than pay for itself.

On Jun 13, 2014, at 2:13 PM, Anders Sandberg <anders at aleph.se> wrote:
Continuing my thinking about extreme future computation:

http://arxiv.org/abs/quant-ph/9611028 "Limitations of noisy reversible computation" shows that noise is pretty bad for reversible computations: the total size of the circuit needs to grow exponentially with its depth in order to produce reliable computation (in normal circuits the growth is just polynomial). This is also true for quantum computations. Both kinds of circuits can simulate general classical or quantum circuits, but the cost is exponential. Of course, the glass-is-half-full view is that one can build reliable systems out of noisy components (a la http://arxiv.org/abs/quant-ph/9906129 ) if the noise is below a certain level: you just need to pay a price for it. Very much like error-correcting codes in classical channels. But this shows that intricate messages require exponentially long codes, so to speak.
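To make the contrast concrete, a toy Python comparison (the polynomial exponent and the exponential base below are placeholders I picked, not constants from the paper):

def normal_overhead(depth, k=2):
    # Reliable ordinary (irreversible) circuits: size polynomial in depth.
    return depth ** k

def noisy_reversible_overhead(depth, base=1.1):
    # Noisy reversible circuits: size exponential in depth.
    return base ** depth

for d in (10, 100, 500):
    print(d, normal_overhead(d), round(noisy_reversible_overhead(d), 1))

Whatever the actual constants, the exponential curve overtakes any polynomial one at some depth, which is the whole problem.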

So if extreme future life runs on largely reversible computers (classical or quantum), it needs to perform a Bennett-style undoing of intermediate results (http://www.math.ucsd.edu/~sbuss/CourseWeb/Math268_2013W/Bennett_Reversibiity.pdf) relatively often, not just because error correction is expensive (if N bits are involved in the error syndrome, then they all need to be reset at a cost) but because the retrace corresponds to a long circuit depth, and the total circuit size hence goes up exponentially. So the amount of hardware grows with retrace length R as exp(R), and presumably the number of bit errors that need to be fixed grows in proportion to it; eventually it is better to just wipe the memory state completely rather than try to erase syndromes and retrace.
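A toy Python cost model of that crossover (every constant here is a placeholder of mine, not derived from the papers): retrace hardware, and hence the errors to fix, grow as exp(R), while a full wipe costs a fixed Landauer-scale amount per bit, so past some R the wipe wins.

def retrace_cost(R, base=1.05, errors_per_gate=1.0):
    # Hardware grows ~ exp(R) with retrace length R; assume the number
    # of bit errors to fix grows in proportion to the hardware.
    hardware = base ** R
    return hardware * (1.0 + errors_per_gate)

def wipe_cost(n_bits):
    # Full wipe: one irreversible (Landauer-scale) erasure per bit,
    # in the same arbitrary cost units as above.
    return float(n_bits)

# Find the retrace length past which wiping a million-bit memory
# is cheaper than retracing and fixing syndromes.
n = 1_000_000
R = 1
while retrace_cost(R) < wipe_cost(n):
    R += 1
print("full wipe wins beyond retrace length R =", R)

With these made-up numbers the crossover sits at a few hundred steps; the qualitative point is just that any per-bit wipe cost is eventually undercut by an exponentially growing retrace cost.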

Robin Hanson  http://hanson.gmu.edu
Res. Assoc., Future of Humanity Inst., Oxford Univ.
Assoc. Professor, George Mason University
Chief Scientist, Consensus Point
MSN 1D3, Carow Hall, Fairfax VA 22030
703-993-2326 FAX: 703-993-2323


