[ExI] for the fermi paradox fans

Anders Sandberg anders at aleph.se
Sat Jun 14 13:12:48 UTC 2014


There is a general question here that I think is cool and interesting: *what is the limiting amount of resources needed to resist parasites and predators*? The human body uses around 20% of metabolic energy to run the immune system (about the same amount as the brain). Governments spend a few percent of GDP on the military, and a few more on police and intelligence. PCs spend perhaps one out of four cores on anti-virus software and firewalls. Some rapidly reproducing organisms have lousy immune systems, since they can reproduce faster than they are killed (but even bacteria need caspases; 80% of bacteria in the ocean end their individual lives because they lyse due to bacteriophages!)

Are these numbers generic, or are there other ways of estimating things?
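The rough fractions above can be put side by side in a quick sketch (Python; all the numbers are the back-of-envelope figures quoted in the text, not measurements):

```python
# Defense overhead as a fraction of total resources, per the figures above.
# "A few percent" is arbitrarily rendered as 3% -- an assumption, not data.
defense_fraction = {
    "human immune system (share of metabolic energy)": 0.20,
    "government military (share of GDP)": 0.03,
    "police and intelligence (share of GDP)": 0.03,
    "PC anti-virus/firewall (share of cores)": 0.25,
}

# Print from smallest to largest overhead.
for system, frac in sorted(defense_fraction.items(), key=lambda kv: kv[1]):
    print(f"{system}: {frac:.0%}")
```

Interestingly, the fractions all sit within about an order of magnitude of each other, which is what makes the "generic number" question tempting.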
Dennis May <dennislmay at yahoo.com> , 14/6/2014 7:06 AM:
It would seem that the need for exponentially growing overhead to correct for noise introduces an even larger exponentially growing overhead to watch for potential virus-like information [coded in the form of white noise?] whose purpose might be nothing more than to introduce factors generating more noise in the system.
Hmm, what assumptions are you adding here? The earlier analysis was just in terms of thermodynamics rather than having an adversary around. Adversaries change the situation rather deeply. 
If an adversary can introduce errors into your computation to push it above the error correction threshold, then it can clearly prevent any useful computation. But the price is that it needs to induce errors in a constant fraction of all your gates each second.
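The threshold behaviour can be illustrated with a minimal Monte Carlo sketch of a 3-bit repetition code (Python; the code, the repetition length, and the error model are my own illustrative choices, not anything from the thread): if the adversary's per-bit flip rate stays below the threshold, correction helps; above it, correction makes things worse.

```python
import random

def majority_decode(bits):
    """Majority vote over a repetition-coded bit."""
    return int(sum(bits) > len(bits) / 2)

def logical_error_rate(p, n=3, trials=100_000, seed=0):
    """Monte Carlo estimate of the logical error rate of an n-bit
    repetition code when an adversary flips each physical bit
    independently with probability p. The encoded bit is 0."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        received = [1 if rng.random() < p else 0 for _ in range(n)]
        if majority_decode(received) != 0:
            errors += 1
    return errors / trials

print(logical_error_rate(0.1))  # below threshold: logical rate < physical rate
print(logical_error_rate(0.6))  # above threshold: correction amplifies errors
```

For the 3-bit code the threshold sits at p = 1/2; analytically the logical rate is 3p^2(1-p) + p^3, which crosses p exactly there.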
There is no direct thermodynamic cost, as far as I can tell, for doing it (just connect a bit to the background heat bath). In fact, a "predator" may swap a clean bit for a dirty bit, giving itself a thermodynamic advantage. However, predation requires knowing which bits are in which states - the generic internal bit is dirty to the predator despite being in a pure state to the victim (or am I missing some clever quantum scheme to grab an unknown bit in a pure state and use it?). The real cost is likely that it needs to cause a connection operation between the victim's system and something else. I wonder if one can show how much entropic cost is needed for this? Doing time reversal seems to show that bit swaps are free.
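For scale, the Landauer bound gives the minimum thermodynamic cost of erasing a bit, while a swap - being a reversible permutation - carries no such cost; a quick calculation (Python; room temperature is an assumed value chosen only for scale):

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in SI since 2019)
T = 300.0           # assumed temperature in kelvin

# Landauer bound: minimum energy dissipated to erase one bit at temperature T.
landauer = k_B * T * log(2)
print(f"Erasing one bit at {T:.0f} K costs at least {landauer:.3e} J")

# A swap of two bits is a reversible permutation of states, so its Landauer
# cost is zero: the predator trading its dirty bit for the victim's clean bit
# pays nothing in principle, consistent with the time-reversal argument above.
```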
Reversible computing is extremely fragile: it might be that game theory leads to quirky solutions. If it is easy to scorch the earth, then stealing resources can be made inefficient. Yes, you can try stealing X bits from me, but if you do I will randomize the bits: I lose the same amount, but you do not gain an advantage. If your utility function wants to maximize A and mine wants to maximize B, there may well be an equilibrium maximizing a combination, and we ought to merge our utility functions and resources. If A is proportional to the computational resources used for it and B is similarly proportional, then maximizing a combination makes it irrational to remove computational resources. And so on - there are a lot of fun things to work out here. For example, while the base computational infrastructure might be sacrosanct to all agents, they might still struggle on the next level for allocation - including hacking utility functions.
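The merged-utility point can be made concrete with a toy model (Python; the weights and proportionality constants are hypothetical): if A and B are each proportional to the resources devoted to them, any destruction of resources lowers the merged objective.

```python
# Toy model: A and B are proportional to the computational resources
# devoted to them; the merged utility is a weighted combination.
# All constants are hypothetical, chosen only to illustrate the argument.

def merged_utility(r_a, r_b, w_a=0.5, w_b=0.5):
    A = 2.0 * r_a   # A proportional to resources spent on it
    B = 3.0 * r_b   # B proportional to resources spent on it
    return w_a * A + w_b * B

peaceful = merged_utility(40.0, 60.0)        # 100 units split between goals
after_conflict = merged_utility(30.0, 50.0)  # 20 units destroyed in conflict

# Destroying resources strictly lowers the merged objective, so removal of
# computational resources is irrational for agents maximizing the combination.
assert after_conflict < peaceful
print(peaceful, after_conflict)
```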


Anders Sandberg, Future of Humanity Institute, Faculty of Philosophy, Oxford University

