[ExI] for the fermi paradox fans
anders at aleph.se
Fri Jun 13 18:13:52 UTC 2014
Continuing my thinking about extreme future computation:
http://arxiv.org/abs/quant-ph/9611028 "Limitations of noisy reversible computation" shows that noise is pretty bad for reversible computation: the total size of a circuit needs to grow exponentially with its depth in order to produce reliable computation (in normal, irreversible circuits the growth is just polynomial). This is also true for quantum computation. Both kinds of circuits can simulate general classical or quantum circuits, but at exponential cost. Of course, the glass-is-half-full view is that one can build reliable systems out of noisy components (a la http://arxiv.org/abs/quant-ph/9906129 ) if the noise is below a certain threshold: you just need to pay a price for it. Very much like error-correcting codes on classical channels. But this shows that intricate messages require exponentially long codes, so to speak.
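To get a feel for the asymmetry, here is a toy comparison (my own illustrative model, not the paper's actual bounds): ordinary noisy circuits can be made reliable with only polylogarithmic redundancy per gate (von Neumann-style fault tolerance), while the result above forces noisy reversible circuits to blow up exponentially in depth. The polylog exponent and the noise-dependent base are assumed stand-ins.

```python
import math

def reliable_size_standard(size, eps=1e-9):
    """Toy model: irreversible noisy circuits need only polylogarithmic
    redundancy to reach overall failure probability eps.
    The squared-log form is an assumed stand-in, not the exact bound."""
    return size * math.log(size / eps) ** 2

def reliable_size_noisy_reversible(size, depth, noise=0.01):
    """Toy model: per the quant-ph/9611028 result, reliable noisy
    reversible circuits grow exponentially with depth. The base
    (1 + noise) is an assumed stand-in for the noise-dependent constant."""
    return size * (1.0 + noise) ** depth

# Deep circuits are where the reversible overhead explodes:
for depth in (10, 100, 1000, 10000):
    std = reliable_size_standard(1000)
    rev = reliable_size_noisy_reversible(1000, depth)
    print(f"depth {depth:>6}: standard ~{std:.3g}, noisy reversible ~{rev:.3g}")
```

For shallow circuits the reversible overhead is negligible, but past a depth of a few hundred (in this toy parameterization) it dwarfs the polylog cost of ordinary fault tolerance.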
So if extreme future life runs on largely reversible computers (classical or quantum), it needs to perform a Bennett-style undoing of intermediate results (http://www.math.ucsd.edu/~sbuss/CourseWeb/Math268_2013W/Bennett_Reversibiity.pdf) relatively often: not just because error correction is expensive (if N bits are involved in the error syndrome, they all need to be reset at a cost), but because the retrace corresponds to a long circuit depth, and the total circuit size hence grows exponentially. So the amount of hardware grows with retrace length R as exp(R), and presumably the number of bit errors that need to be fixed grows in proportion to it; eventually it is better to just wipe the memory state completely rather than try to erase syndromes and retrace.
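The crossover in that last sentence can be sketched numerically. Assume (my parameterization, not from the post) that retracing R steps costs hardware growing as exp(alpha*R), while simply wiping an N-bit memory costs N irreversible Landauer erasures; then there is a retrace length beyond which wiping always wins.

```python
import math

def retrace_cost(R, alpha=0.1):
    """Assumed exponential hardware cost of undoing R intermediate
    steps, following the exp(R) growth argued above. alpha is an
    illustrative noise-dependent constant."""
    return math.exp(alpha * R)

def wipe_cost(N):
    """Cost of erasing all N memory bits outright, in units of one
    bit-erasure (Landauer cost per bit factored out)."""
    return N

def crossover(N, alpha=0.1):
    """Smallest retrace length R at which wiping the whole memory
    becomes cheaper than retracing: exp(alpha*R) >= N."""
    return math.ceil(math.log(N) / alpha)

N = 10**12  # a terabit memory
R_star = crossover(N)
print(f"for N = {N:.0e} bits, wipe rather than retrace beyond R = {R_star}")
```

The striking thing is how short the crossover is: because retrace cost is exponential, even an astronomically large memory only buys a retrace budget of order log(N)/alpha steps before a full wipe is the cheaper option.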
Anders Sandberg, Future of Humanity Institute, Philosophy Faculty, Oxford University