[ExI] for the fermi paradox fans

Anders Sandberg anders at aleph.se
Tue Jun 17 21:08:11 UTC 2014


Robin D Hanson <rhanson at gmu.edu>, 17/6/2014 5:26 PM:
  On Jun 16, 2014, at 4:56 PM, Anders Sandberg <anders at aleph.se> wrote:
    Redundant computing with votes still has to be reversible. A voting gate selecting the majority of A, B and C will be dissipative unless it outputs the majority vote plus two bits of information that allow reconstructing ABC. And while the non-faulty chunks can then be rewound and re-used, there has to be error correction for all the bits in the noise-affected one. So while the error rate has gone down from p to p^2, if a single module error occurs there will still be the same number of bits to clean as in the single-module case. So the overall number of bits that need to be cleaned per unit of time goes from Np to 3p((1-p)^2 + p(1-p) + p^2)N = 3p(1-p+p^2)N - the far lower error rate comes at the price of extra error correction (if I calculated things right).
  Yes, when there are errors you must pay the entropy cost to erase those error bits. But that is still a lot better than the exponential circuit-size scaling you talked about before, if all the errors are uncorrelated.
Is it? The original paper had a system of N gates of depth D expand to N' = N exp(D) in order to get reliable computation. In this case I get 3N+1 gates (plus some gates for doing the error correction) and need to re-run the system on average 1/(1-p^2) times: the effective depth has increased. In fact, I am somewhat concerned about the error-correcting gates: a failure there would cause an even more massive failure. The original schemes were better, I think, at keeping errors contained.
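As a quick sanity check on that 1/(1-p^2) figure, a minimal Monte Carlo sketch in Python (the model is the simplest one consistent with the above - each run independently fails with probability p^2 and is retried until it succeeds; the function name and parameter values are my own illustrative choices):

import random

def expected_reruns(p_fail, trials=100_000, rng=random.Random(1)):
    # Mean number of runs until one succeeds, when each run
    # independently fails with probability p_fail.
    total = 0
    for _ in range(trials):
        runs = 1
        while rng.random() < p_fail:
            runs += 1
        total += runs
    return total / trials

p = 0.05
print(expected_reruns(p**2))   # ~1.0025 by simulation
print(1 / (1 - p**2))          # geometric-distribution mean, 1/(1-p^2)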
Incidentally, von Neumann was of course there first in 1956: http://www.archtypic.com/wiki/images/a/af/Von_Neumann_Probabilistic_Logics_and_the_Synthesis_of_Reliable_Organisms_from_Unreliable_Components.pdf
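To make the voting-gate construction quoted above concrete, here is a small Python sketch. The particular choice of the two reconstruction bits (A XOR B and B XOR C) is mine, not something from the thread or the paper; the point is just that majority-plus-two-bits is a bijection on three bits, so the vote can be taken without erasing anything:

from itertools import product

def vote_forward(a, b, c):
    # Majority vote plus two reconstruction bits - no information erased.
    maj = (a & b) | (b & c) | (a & c)
    return maj, a ^ b, b ^ c

def vote_backward(maj, x, y):
    # Invert the gate: x = A^B and y = B^C locate the (at most one) dissenter.
    if x == 0 and y == 0:       # unanimous
        return maj, maj, maj
    if x == 1 and y == 0:       # A dissents, B = C = maj
        return maj ^ 1, maj, maj
    if x == 0 and y == 1:       # C dissents, A = B = maj
        return maj, maj, maj ^ 1
    return maj, maj ^ 1, maj    # x = y = 1: B dissents, A = C = maj

# Exhaustive check that the gate is a bijection on three bits, i.e. reversible.
for bits in product((0, 1), repeat=3):
    assert vote_backward(*vote_forward(*bits)) == bits
print("voting gate is reversible on all 8 inputs")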

 Yes, the bit flips caused by a cosmic ray would not be useful. I was imagining a system that usually switched between two low-energy states during ordinary reversible computation, but that was pushed up to high-energy states when a cosmic ray came through. The existence of many local high-energy states would trigger the use of a different set of circuits to extract useful negentropy from those states. In this way the device might actually gain negentropy when cosmic rays came through.

A bit like laser design: you want a set of high-energy states that a ray excites, and then a decay channel that releases useful work and tells the system the bit needs to be reset.
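For a sense of scale (a back-of-the-envelope illustration, not a figure from the thread): the reset at the end is an erasure, so it costs at least kT ln 2 per bit, which at room temperature is tiny next to the energy a single excitation could release. A minimal Python check, with an assumed 1 eV decay channel:

import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
eV = 1.602176634e-19    # joules per electronvolt
T = 300.0               # room temperature, K

landauer = k_B * T * math.log(2)   # minimum cost to erase one bit
decay = 1.0 * eV                   # assumed decay-channel energy

print(f"Landauer limit at {T:.0f} K: {landauer:.3e} J ({landauer/eV:.4f} eV)")
print(f"a 1 eV decay pays for ~{decay/landauer:.0f} bit erasures")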


Anders Sandberg, Future of Humanity Institute, Philosophy Faculty of Oxford University