[ExI] taxonomy for fermi paradox fans:

Anders Sandberg anders at aleph.se
Sun Feb 1 14:00:49 UTC 2015


Keith Henson <hkeithhenson at gmail.com>, 30/1/2015 7:58 AM:
On Wed, Jan 28, 2015 at 4:00 AM, John Clark <johnkclark at gmail.com> wrote: 

snip 
 
> 2) Some catastrophe hits a civilization when it gets a little past our 
> level; my best guess would be the electronic equivalent of drug abuse. 
 
Possible.  But it seems an unlikely filter to get all possible 
variations on a nervous system if ET's with the capacity to affect the 
visible state of the universe are common.  I suspect you need 
something fundamental that keeps every single one of them from 
spreading out. 


Exactly. This is something that needs to be reiterated again and again in discussions like this: just because something gets 99% of the population of 99% of species doesn't mean it works as a Fermi answer. The remaining 1% of the 99% affected civilizations, and the 1% of unaffected civilizations, will still make a lot of noise. At best it gives you a reduction of an already low number of civilization appearances. This is why cultural convergence towards some kind of addiction, or towards tiny dense fast objects, is not a good enough answer. 
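
To put rough numbers on that leakage (the 99% figures are purely illustrative, as above), a quick Python sanity check:

    p_hit_species = 0.99   # fraction of species hit by the convergent filter
    p_succumb     = 0.99   # fraction of each affected civilization that succumbs
    leftover = p_hit_species * (1 - p_succumb) + (1 - p_hit_species)
    print(leftover)        # ~0.02: the filter thins the loud civilizations only ~50-fold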


Cultural convergence that gets *everybody*, whether humans, oddly programmed AGIs, silicon-based zorgons, the plant-women of Canopus III, or the sentient neutronium vortices of Geminga, would have to be something really *weird*. Even among humans we can typically find exceptions to human "cultural universals", and that is within a single, fairly homogeneous species, not intelligence in general. 




This would leave the universe full of 
isolated civilizations that stay small for speed of light limitations. 
Sped up, how long would a civilization last?  If the ratio was a 
million to one, a century of clock time would be 100 million years 
subjective. 


But again, this is a soft constraint. It might be beneficial for 99% of all civilizations and 99% of their populations, but the leftovers will be noticeable. Plus, human individuals and civilizations undertake projects that stretch far beyond their lifetimes (whether building cathedrals or launching space probes): it is not inconceivable that during those 100 million subjective years entire cultures may arise that feel a philosophical, religious, artistic or pranksterish need to launch colonization efforts, reshaping parts of the universe into what they think it should be. And if such offshoots behave the same way, we will get a spread of dense fast clusters at a rate that looks pretty brisk from the outside, even though from the inside it is a rare and epic undertaking. 
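
For scale, using Keith's million-to-one ratio and an entirely made-up guess at how rarely such expansion-minded cultures arise:

    speedup = 1e6                     # subjective years per clock year (million-to-one ratio)
    clock_years = 100                 # a century of outside time
    subjective_years = speedup * clock_years              # 1e8 subjective years
    waves_per_subjective_year = 1e-6  # guess: one expansion-minded culture per million subjective years
    expected_waves = subjective_years * waves_per_subjective_year
    print(subjective_years, expected_waves)               # 1e8 years, ~100 expected expansion waves

Even a vanishingly rare urge to expand, compounded over that much subjective time, looks like a steady outward push from the outside.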



PS  Busy lately, but have a reply to Anders re brain size limits on my 
list to do. 



Looking forward to it. Am writing a post about energy use in technological singularities right now.


Eric Chaisson's writings on how energy rate density increases with complexity are interesting at the intersection of these two topics. 

Anders Sandberg, Future of Humanity Institute, Philosophy Faculty of Oxford University

