[ExI] 2^57885161-1

Eugen Leitl eugen at leitl.org
Thu Feb 14 09:57:32 UTC 2013

On Thu, Feb 14, 2013 at 12:29:47AM +0100, Tomasz Rola wrote:

> > It is never safe.
> Not only is it unsafe to accept god's gifts, it is also unsafe to reject 
> them. Both reactions could have been modeled and counted upon.

Yes, but if you're not executing the recipe, you'll get the
default behaviour. If you refuse to enter the stage, the
play never happens.

Of course, collectively such recipes would always be followed
by somebody, as perfect control is impossible. In fact,
some will execute even completely obscure plans just
because they can.

> > Because reaching your target attractor is deterministic, if you
> > know how. 
> Ok. So if this happens, we are cooked, I agree. It is hard to defend 
> against a bunch of butterflies deliberately placed over a period of eons, 
> collectively contributing to the birth of Maxwell and Marconi.
> Assuming this is what you mean.

Nothing quite so far-fetched, but constructive interference
of side effects: by using multiple, perfectly innocuous (and
extremely useful) gifts, you'll get an emergence of fertile
ground.

> > > by definition does not have the memories, personality, or
> > > identity of a specific alien, nor any chain of identity linking it
> > > back to the would-be invaders.
> > 
> > The recipes never stopped coming.
> How so?

If you fire messages into the ether, you can of course
continue to provide bootup instructions to anything that is
capable of listening.

> > > Sure, perhaps you can nudge it to have certain sympathies
> > > and modes of thought that might lead it toward wanting to ally
> > > with similar-thinking aliens.  But that's not an "invasion" so
> > > much as "making the humans come to the aliens"...
> > 
> > Actually, nobody will bother sending messages, as every self-rep
> > system has an amplification factor in excess of 10^3 at each hop.
> > 
> > Hence, you will never receive blueprints.
> Unclear. BTW, where does 10^3 come from?

If you do the math, a stellar output is sufficient to
push a large (though not very large) number of relativistic
seeds simultaneously. If your amplification factor is
several orders of magnitude at each step, then saturating the
universe comes at an infinitesimal cost to the originator.
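The exponential growth behind that claim is easy to check. A minimal sketch (my own illustration, not from the post): take the stated amplification factor of 10^3 per hop, and assume a rough target of 10^22 stars (an order-of-magnitude guess at the observable universe's star count, not a figure from the thread).

```python
# Assumptions (mine, for illustration): one seed starts the process,
# each hop multiplies the number of active seeds by 10^3, and
# "saturation" means reaching ~10^22 stars.
AMPLIFICATION = 10**3
TARGET_STARS = 10**22

seeds, hops = 1, 0
while seeds < TARGET_STARS:
    seeds *= AMPLIFICATION  # each generation of seeds replicates
    hops += 1

print(hops)   # 8 -- only eight hops, since 1000**8 = 10**24 >= 10**22
print(seeds)  # 1000000000000000000000000
```

Only the originator pays for the first hop; the remaining 10^24-fold expansion is financed locally by the colonized systems, which is why the cost to the originator is effectively infinitesimal.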

The cost of sending a critical number of messages vs. a
single seed makes seeding a much more cost-effective
proposition. And almost every system with debris would be
fertile; there is no need to wait for the lucky time window
(in which the recipient can listen and act upon a message,
but is not yet expansive itself), which is like hunting
unicorns.
