[ExI] Singletons

Eugen Leitl eugen at leitl.org
Mon Jan 3 18:10:26 UTC 2011


On Mon, Jan 03, 2011 at 06:30:06PM +0100, Anders Sandberg wrote:

> Which could be acceptable if the rules are acceptable. Imagine that  

The question is: who makes the rules? Imagine a lowest common
denominator rule enforcer, using a quorum of all people on this
planet. A very scary thought.

> there is a particular kind of physics experiment that causes cosmic  
> vacuum decay. The system monitors all activity, and stomps on attempts  
> at making the experiment. Everybody knows about the limitation and can  

Sure, world-ending stuff (I don't think this universe allows
it, or else we wouldn't be able to read this).

> see the logic of it. It might be possible to circumvent the system, but  
> it would take noticeable resources that fellow inhabitants would  
> recognize and likely object too.
>
> Now, is this really unacceptable and/or untenable?

I think it's unacceptable, because I don't believe such a thing
could be done without creating terrible side effects.

> The rigidity of rules the singleton enforces can be all over the place  
> from deterministic stimulus-responses to the singleton being some kind  

Stimulus-response would be a) not terribly efficacious, since easily
circumvented, and b) fraught with friendly fire.

> of AI or collective mind. The legitimacy can similarly be all over the  

Ah, so it's our usual kind of despot.

> place, from a basement accidental hard takeoff to democratic one-time  
> decisions to something that is autonomous but designed to take public  

Aargh. So the singleton can do whatever it wants by tweaking the
physical layer.

> opinion into account. There is a big space of possible singleton designs.

That's the precise problem with this. It's basically the Blight in
sheep's clothing.

> If existential risks are limited to local systems, then at most there is  
> a need for a local singleton (and maybe none, if you like living free  
> and dangerously).

Sometimes I get a cold. I can live with that.

> However, there might be threats that require wider coordination or at  
> least preparation. Imagine interstellar "grey goo" (replicators that  
> weaponize solar systems and try to use existing resources to spread),  

That's basically the "getting a cold" scenario. Grey goo that can
cross interstellar distances is indistinguishable from pioneers. Nothing
to worry about, unless you haven't been born yet, or your immune
system is still undeveloped. The probability that the wave of common
cold catches you just as you're being born is pretty much zero.

> and a situation of warfare where the square of number of units gives the  
> effective strength (as per the Lanchester law; whether this is actually  
> true in real life will depend on a lot of things). In that case allowing  
> the problem to grow in a few systems far enough would allow it to become  
> overwhelming. In this case it might be enough to coordinate defensive  

I don't see how blowing up thy neighbour is going to help you with
taking over thy neighbour's resources (including chocolate and women).

> buildup within a broad ring around the goo, but it would still require  
> coordination - especially if there were the usual kind of public goods  
> problems in doing it.

Dunno, doesn't sound terribly convincing. Luckily, we're dealing in 
theoreticals here.
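For reference, the square-of-numbers claim is the aimed-fire Lanchester
model, dA/dt = -beta*B, dB/dt = -alpha*A, and it's easy to sanity-check
numerically. Here's a minimal Python sketch (the unit counts and
effectiveness coefficients are made-up illustration, not anything from
Anders' scenario):

    def lanchester_square(a, b, alpha=1.0, beta=1.0, dt=0.01):
        # Integrate dA/dt = -beta*B, dB/dt = -alpha*A until one side is gone.
        while a > 0 and b > 0:
            a, b = a - beta * b * dt, b - alpha * a * dt
        return max(a, 0.0), max(b, 0.0)

    # Equal per-unit effectiveness, 2:1 numbers: the bigger side ends with
    # roughly sqrt(2000^2 - 1000^2) ~ 1732 units left, i.e. its advantage
    # scales with the square of force size, hence the case for containing
    # the goo before it grows.
    print(lanchester_square(2000.0, 1000.0))

Under that model the argument for early coordination is at least
internally consistent; whether the square law actually holds for this
kind of warfare is, as Anders himself notes, another question.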

-- 
Eugen* Leitl leitl http://leitl.org
______________________________________________________________
ICBM: 48.07100, 11.36820 http://www.ativel.com http://postbiota.org
8B29F6BE: 099D 78BA 2FD3 B014 B08A  7779 75B0 2443 8B29 F6BE


