Another thing I would love to find out is the mass/depth trade-off for memory storage. Suppose you have a lot of mass and want to turn it into as much good computer memory as possible. What configuration is best?

The tunnelling probability across a potential barrier scales as exp(-L sqrt(m E)), where L is the width of the barrier, m the mass of the particle marking the bit, and E the potential depth (constants and hbar absorbed into the scaling). The energy or negentropy losses due to tunnelling will be proportional to this. You could spend the mass on really deep potential wells, on making them physically wide, or even on heavy objects to represent your bits. Which is the best approach? What depth scaling do you get from large amounts of mass?

If you have N bits, each of mass m1 (initially m1 = M/N), they will need correction at a rate proportional to N exp(-L sqrt(m E)) per second; eventually you will run out of stored negentropy and have to burn mass to radiate to the background radiation. Each correction costs kT ln(2) of energy, so each second you lose about N exp(-L sqrt(m E)) kT ln(2)/(m1 c^2) bits: N' = -lambda N, where lambda = kT ln(2) exp(-L sqrt(m E))/(m1 c^2). So the half-life of computer memory in this phase, ln(2)/lambda, is inversely proportional to temperature, exponential in bit size (the barrier width L), exp-of-sqrt in marker mass and potential depth, and proportional to the mass per bit. So it looks like making bits *really large* is a good idea.

One figure of merit might of course be the total number of bit-seconds. That scales as integral_0^infty N dt = N_0 [-exp(-lambda t)/lambda]_0^infty = N_0/lambda, i.e. proportional to the half-life times the initial number of bits. However, the initial number N_0 = M/m1 scales as 1/m1, and so does lambda, so the m1 factor cancels: it is not the mass per bit that matters, just temperature, barrier size, marker mass and potential depth.

So, giant positronium bits, anyone?

Anybody know how to estimate the max size of gravitationally bound aggregates in current cosmological models?
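As a rough sanity check on the scaling above, here is a minimal Python sketch. All attempt-rate prefactors are dropped, hbar is restored only to keep the exponent dimensionless, and the specific values chosen for L, m, E, T and M are placeholders rather than a design proposal.

import math

# Constants (SI units)
k_B  = 1.380649e-23    # Boltzmann constant, J/K
c    = 2.998e8         # speed of light, m/s
hbar = 1.0546e-34      # reduced Planck constant, J*s

def decay_rate(L, m, E, m1, T):
    """lambda = k T ln(2) exp(-L sqrt(m E)/hbar) / (m1 c^2).
    The exp(-L sqrt(m E)) in the text implicitly sets hbar = 1; it is
    restored here only to keep the exponent dimensionless."""
    return k_B * T * math.log(2) * math.exp(-L * math.sqrt(m * E) / hbar) / (m1 * c**2)

def half_life(L, m, E, m1, T):
    """Half-life of the memory, ln(2)/lambda: proportional to the mass per bit m1."""
    return math.log(2) / decay_rate(L, m, E, m1, T)

def bit_seconds(M, L, m, E, m1, T):
    """Total bit-seconds, N0/lambda with N0 = M/m1: the m1 dependence cancels."""
    return (M / m1) / decay_rate(L, m, E, m1, T)

if __name__ == "__main__":
    # Placeholder parameters: 10 nm barrier, electron-mass marker, ~1 eV well,
    # ~3 K background, one solar mass of total memory material.
    L, m, E, T, M = 1e-8, 9.11e-31, 1.6e-19, 3.0, 2e30
    for m1 in (1e-3, 1.0, 1e3):   # vary the mass allocated per bit
        print(m1, half_life(L, m, E, m1, T), bit_seconds(M, L, m, E, m1, T))

The half-life column grows linearly with m1 while the bit-seconds column does not move, which is just the cancellation argued above.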
Anders Sandberg
Future of Humanity Institute
Philosophy Faculty of Oxford University


Robin D Hanson <rhanson@gmu.edu>, 10/6/2014 8:56 PM:
> On Jun 10, 2014, at 5:04 AM, Anders Sandberg <anders@aleph.se> wrote:
>
>>> So there is less obviously a reason to wait to spend entropy. The max
>>> entropy usually comes via huge black holes, and those can take time to
>>> construct and then to milk. That seems to me to place the strongest
>>> limits on when we expect negentropy to get spent.
>>
>> I don't think time is the resource that is most costly if you try to
>> maximize the overall future computations of your lightcone. Capturing
>> dark matter with black holes seems worthwhile, but I wonder about the
>> thermodynamic cost of doing it.
>
> There is another reason to go slow: in reversible computers, as in other
> reversible systems, the entropy cost is proportional to the rate. That is,
> the entropy cost per gate operation is inverse in the time that operation
> takes. In the limit of going very slowly, the entropy cost per operation
> approaches zero.
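A minimal sketch of the rate-proportional cost described above, under the usual toy assumption that the dissipation per reversible gate operation is roughly xi/t_op for some technology-dependent constant xi; the value of xi below is a placeholder.

def energy_per_op(t_op, xi=1e-25):
    """Energy dissipated per reversible gate operation lasting t_op seconds,
    under the toy model E_diss ~ xi / t_op (xi is a placeholder constant, J*s)."""
    return xi / t_op

def total_dissipation(n_ops, run_time, xi=1e-25):
    """Spread n_ops operations evenly over run_time seconds.
    Total dissipation = n_ops * xi / (run_time / n_ops) = xi * n_ops**2 / run_time,
    so doubling the run time halves the total entropy cost."""
    return n_ops * energy_per_op(run_time / n_ops, xi)

# Slower runs dissipate less: 10x the time, 1/10 the total cost for the same work.
for run_time in (1.0, 10.0, 100.0):
    print(run_time, total_dissipation(1e9, run_time))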
>
> Robin Hanson  http://hanson.gmu.edu
> Res. Assoc., Future of Humanity Inst., Oxford Univ.
> Assoc. Professor, George Mason University
> Chief Scientist, Consensus Point
> MSN 1D3, Carow Hall, Fairfax VA 22030
> 703-993-2326  FAX: 703-993-2323