rtomek at ceti.pl
Wed Feb 13 23:29:47 UTC 2013
On Wed, 13 Feb 2013, Eugen Leitl wrote:
> > laxer than standards for security for hardware, because misbehaving
> Nothing that a god gives you is safe. You've already lost by
> believing that expressing the instructions in a subset safely
> sandboxes it.
> It is never safe.
Not only is it unsafe to accept a god's gifts, it is also unsafe to reject
them. Both reactions could have been modeled and counted upon.
> > >> Advanced cultures can't engineer emergence.
> > Emergence isn't the same thing as invasion. If an AI emerges,
> When cooking with recipes made by gods, it is.
> > how do you guarantee its loyalties? Further, an emergent AI
> Because reaching your target attractor is deterministic, if you
> know how.
Ok. So if this happens, we are cooked, I agree. It is hard to defend
against a bunch of butterflies deliberately placed over a period of eons,
collectively contributing to the birth of Maxwell and Marconi.
Assuming this is what you mean.
> > by definition does not have the memories, personality, or
> > identity of a specific alien, nor any chain of identity linking it
> > back to the would-be invaders.
> The recipes never stopped coming.
> > Sure, perhaps you can nudge it to have certain sympathies
> > and modes of thought that might lead it toward wanting to ally
> > with similar-thinking aliens. But that's not an "invasion" so
> > much as "making the humans come to the aliens"...
> Actually, nobody will bother sending messages, as every self-rep
> system has an amplification factor in excess of 10^3 at each hop.
> Hence, you will never receive blueprints.
Unclear. BTW, where does 10^3 come from?
** A C programmer asked whether computer had Buddha's nature. **
** As the answer, master did "rm -rif" on the programmer's home **
** directory. And then the C programmer became enlightened... **
** Tomasz Rola mailto:tomasz_rola at bigfoot.com **