[ExI] for the fermi paradox fans

Dennis May dennislmay at yahoo.com
Sat Jun 14 18:56:21 UTC 2014


Dennis May <dennislmay at yahoo.com>, 14/6/2014 7:06 AM:


It would seem that the need for exponentially growing 
overhead to correct for noise introduces an even larger
exponentially growing overhead to watch for potential
virus-like information [coded in the form of white noise?] 
whose purpose might be nothing more than to introduce
factors generating more noise in the system.
Anders Sandberg <anders at aleph.se>, 14/6/2014 8:12 AM:
There is a general question here that I think is cool and interesting: *what is the limiting amount of resources needed to resist parasites and predators*? The human body uses around 20% of metabolic energy to run the immune system (about the same amount as the brain). Governments spend a few percent of GDP on military, and a few more on police and intelligence. PCs spend perhaps one out of four cores on anti-virus and firewalls. Some rapidly reproducing organisms have lousy immune systems since it is possible to reproduce faster than they are getting killed (but even bacteria need caspases; 80% of bacteria in the ocean end their individual lives because they lyse due to bacteriophages!)

Are these numbers generic, or are there other ways of estimating things?
*****
The overhead required to survive all forms of parasitism, predation, and internal competition from this moment till the final state you wish to consider is the question. If you take the biological analogy, individuals do not survive, species last a little longer, and life from a common origin lasts longer still. Undirected biology expends a great deal of energy fending off natural parasitism and predation from unsophisticated sources. Assuming the existence of vast intelligences requires strategies able to deal with competition in all forms from other vast intelligences. The first given assumption is that you or other groups will be left alone long enough to develop.
*****
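
Anders's question about whether these overhead numbers are generic can at least be lined up side by side. A minimal sketch in Python, using only the figures quoted above — the exact values behind "a few percent" are placeholder assumptions, not data:

```python
# Defense overhead as a fraction of total resources, using only the
# figures quoted in the thread. The "few percent" entries are assumed
# placeholder values for illustration.
overheads = {
    "human immune system (share of metabolic energy)": 0.20,
    "government military (share of GDP, 'a few percent')": 0.03,  # assumed
    "police + intelligence (share of GDP, 'a few more')": 0.03,   # assumed
    "PC anti-virus/firewall (one of four cores)": 1 / 4,
}

# Print the systems in order of increasing defense overhead.
for system, fraction in sorted(overheads.items(), key=lambda kv: kv[1]):
    print(f"{system}: {fraction:.0%}")
```

On these (rough) numbers the biological and computational examples cluster around 20-25%, while state-level defense spending sits well below that — which at least suggests the fractions are not obviously universal.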

Anders Sandberg <anders at aleph.se>, 14/6/2014 8:12 AM:
Hmm, what assumptions are you adding here? The earlier analysis was just in terms of thermodynamics rather than having an adversary around. Adversaries change the situation rather deeply.
*****
Questions concerning the Fermi Paradox should include the variable of adversaries at every juncture, since all of biology is known to deal with the issue continually from the earliest systems forward.
Anders Sandberg <anders at aleph.se>, 14/6/2014 8:12 AM:
“Reversible computing is extremely fragile: it might be that game theory leads to quirky solutions.”

There are presently dozens of competing physical implementations of quantum computing, each with slightly differing degrees of fragility. All physical qubit implementations to date are too fragile to produce useful computations. Opinion on how much overhead for error correction will be required varies widely. Game theory on how to produce useful results in such fragile systems could be an immense computational project in and of itself.
The issue of castles versus cannonballs was important in early human evolution, once the ability to work and communicate as a group, run for long periods of time, and carry a fifteen-foot spear enabled a medium-sized animal to hunt all land animals regardless of size. Millions of years of evolution favoring the strategy of large size were undone — not because of food restriction due to a reduced niche, an asteroid impact, or volcanoes, but because of a new strategy opened up by increased computational capabilities. Once super-intelligences are in competition, I would expect things to get very complicated concerning the continued advantage of “size” versus many other variables becoming enabled.
Dennis May

