[ExI] Drake Equation Musings

Anders Sandberg anders at aleph.se
Wed May 18 12:05:58 UTC 2016


On 2016-05-18 02:41, Keith Henson wrote:
> On Tue, May 17, 2016 at 10:09 AM,  Anders Sandberg <anders at aleph.se> wrote:
>> If intelligence often turns into black boxes, then p is small. But note
>> that you need many orders of magnitude to weaken the update a lot: since
>> x can be arbitrarily large, even if you think black box civilizations
>> are super-likely, the lack of observed civilizations in the vicinity
>> should move your views about the possible upper range of densities a
>> fair bit. Arguing p=0 is a very radical knowledge claim, and equivalent
>> to positing the most audacious law of sociology ever (true for every
>> individual, society and species!)
> We have intelligence and physics interacting here.  I suspect that
> there is a universal characteristic of intelligence: if you are
> smart enough to impact the look of the universe, then you have the
> desire to be smarter.  One of the ways to get smarter is to think
> faster.  If this is the case, then we run into the physics limits,
> which I suspect keep the aliens home just due to the insane
> effective expansion of space you get with a moderate (million to
> one) speedup.
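
To put rough numbers on the quoted speedup point, here is a quick
back-of-the-envelope in Python. The 10^6 factor is from the post; the
distances are just standard round figures.

SPEEDUP = 1_000_000  # subjective seconds per objective second

destinations_ly = {
    "Proxima Centauri": 4.25,      # distance in light-years
    "galactic center": 26_000,
    "Andromeda galaxy": 2_500_000,
}

for name, d in destinations_ly.items():
    # A light-speed signal takes d objective years one way, so a mind
    # sped up a million-fold experiences d * 10^6 subjective years of
    # waiting for it.
    print(f"{name}: {d:,} objective yr -> {int(d * SPEEDUP):,} subjective yr")

Even the nearest star sits millions of subjective years away for such
a mind, which is the sense in which space has "expanded".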

Sure. But convergent instrumental goals do not imply that one option 
strongly dominates. Having offspring survive is clearly good for 
evolutionary fitness, but there are plenty of gay or otherwise 
non-reproducing individual animals. Sexual reproduction seems to be 
very advantageous for multicellular life, yet there are lineages that 
have lost it. If most civilizations go black box, that just 
corresponds to a smaller fraction of communicating civs (or of their 
members).
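
To make the quoted update argument concrete, here is a minimal
numerical sketch. All the modeling choices are mine and purely
illustrative: a log-uniform prior over the density of civilizations,
a surveyed volume of roughly a 60-parsec-radius ball, and a Poisson
count of visible civilizations, of which only a fraction p have not
gone black box.

import numpy as np

V = 1e6                                # surveyed volume in pc^3
densities = np.logspace(-15, 3, 1800)  # civs per pc^3, 18 decades
prior = np.ones_like(densities)        # flat in log-density
prior /= prior.sum()

for p in (1.0, 1e-3, 1e-7):
    # Probability of seeing zero visible civilizations at this density:
    # visible civs are Poisson with mean p * density * V.
    likelihood = np.exp(-p * densities * V)
    posterior = prior * likelihood
    posterior /= posterior.sum()
    cdf = np.cumsum(posterior)
    upper = densities[np.searchsorted(cdf, 0.99)]
    print(f"p = {p:g}: 99% of posterior mass below {upper:.1e} civs/pc^3")

Each factor of ten off p only raises the inferred upper bound on the
density by a factor of ten; the update never vanishes unless p is
exactly zero.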

Keith, are you seriously arguing that 100%, not 99.99999%, of 
civilizations (and 100%, not 99.99999%, of all their members) will go 
black box?




