[ExI] 2^57885161-1
Eugen Leitl
eugen at leitl.org
Wed Feb 13 13:30:59 UTC 2013
On Tue, Feb 12, 2013 at 10:34:42AM -0800, Adrian Tymes wrote:
> On Tue, Feb 12, 2013 at 3:53 AM, BillK <pharos at gmail.com> wrote:
> > On Tue, Feb 12, 2013 at 11:31 AM, Eugen Leitl wrote:
> >> Right, just as there is no malware on the Internet.
>
> Analogy failure.

Comprehension failure.

> Standards for security for software are inherently laxer than
> standards for security for hardware, because misbehaving software
> is far less obvious.

Nothing that a god gives you is safe. You've already lost by believing
that expressing the instructions in a subset safely sandboxes them.
It is never safe.

> If someone points a gun at you, there is no mistaking that that's
> a bad thing. If someone sends you a link to malware, that's not
> immediately obviously so bad.

Somebody keeps sending you increasingly useful things. Useful things
are, by definition, outside the sandbox. Emergence is about engineering
nonobvious side effects constructively, somewhere right behind your
back. The world is made of Swiss cheese. Most people are not aware of
this, so containment is impossible.

> Further, I said "at that level". Copying and pasting someone else's
> malware script is far simpler than engineering nanotech.

They did that a few gigayears ago.

> A better analogy would be breaking into a foreign nation's nuclear
> launch chain of command and firing their nukes at one's enemies.
> Notice that that has not happened yet, nor is it listed as a serious
> concern by military cybersecurity types - even when they're hyping up
> the dangers to justify their budget: it'd be too far beyond the
> actual threat.

Analogy failure.

>
> >> Advanced cultures can't engineer emergence.
>
> Emergence isn't the same thing as invasion.

When cooking with recipes made by gods, it is.

> If an AI emerges, how do you guarantee its loyalties?

Because reaching your target attractor is deterministic, if you
know how.

> Further, an emergent AI by definition does not have the memories,
> personality, or identity of a specific alien, nor any chain of
> identity linking it back to the would-be invaders.

The recipes never stopped coming.

> Sure, perhaps you can nudge it to have certain sympathies
> and modes of thought that might lead it toward wanting to ally
> with similar-thinking aliens. But that's not an "invasion" so
> much as "making the humans come to the aliens"...

Actually, nobody will bother sending messages, as every self-replicating
system has an amplification factor in excess of 10^3 at each hop. Hence,
you will never receive blueprints.
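
For scale, a minimal back-of-the-envelope sketch (Python; the per-hop
amplification factor and the star count are illustrative assumptions,
not figures from this thread):

AMPLIFICATION_PER_HOP = 10**3   # assumed copies seeded from each colonized site
STARS_IN_GALAXY = 4 * 10**11    # rough order of magnitude for the Milky Way

def sites_reached(hops: int, amplification: int = AMPLIFICATION_PER_HOP) -> int:
    """Total sites seeded after a given number of hops (ignoring overlap)."""
    return amplification ** hops

for hops in range(1, 6):
    n = sites_reached(hops)
    marker = "  <- more than stars in the galaxy" if n > STARS_IN_GALAXY else ""
    print(f"after {hops} hop(s): {n:.0e} sites{marker}")

With amplification around 10^3 per hop, four hops already exceed the
number of stars in the galaxy, which is the arithmetic behind "nobody
will bother sending messages": physical replicators would saturate any
plausible set of targets before transmitted blueprints could matter.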