[ExI] Singletons

Anders Sandberg anders at aleph.se
Sat Jan 1 11:23:59 UTC 2011


On 2010-12-31 15:52, Eugen Leitl wrote:
> On Fri, Dec 31, 2010 at 03:30:51PM +0100, Anders Sandberg wrote:
>
>> It is all a matter of whether singletons are the way, and whether
>
> Singletons don't work in a relativistic universe. In fact, you
> can't even synchronize oscillators in a rotating spacetime.

A singleton doesn't necessarily have to be synchronized. Imagine a set 
of local rules that gets replicated, keeping activity constrained 
wherever the rules spread.
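To make the idea concrete, here is a minimal sketch (all names are illustrative, not from the post): each agent carries an immutable rule set and hands an identical copy to its offspring, so the constraint propagates everywhere without any central authority or synchronization.

```python
# Hypothetical sketch: a "singleton" enforced by replicated local rules,
# with no global synchronization. Each colonizer copy carries the same
# immutable constraint set and passes it on verbatim when it replicates.

FORBIDDEN = frozenset({"burn_cosmic_commons", "unbounded_replication"})

class Colonizer:
    def __init__(self, rules=FORBIDDEN):
        self.rules = rules  # constraints travel with the agent

    def allowed(self, action):
        return action not in self.rules

    def replicate(self):
        # Offspring inherit the identical rule set; no central
        # authority is ever consulted, so light-lag is no obstacle.
        return Colonizer(self.rules)

probe = Colonizer()
child = probe.replicate().replicate()
assert not child.allowed("burn_cosmic_commons")
assert child.allowed("build_habitat")
```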


> And I have a very deep aversion against cosmic cops of
> any color. I don't think they're needed, and they're a
> form of the most terrible despotism there can ever be.

I have the same aversion. However, I am open to the possibility that a 
civilization without global coordination that really can put its foot 
down and say "no!" to some activities will, with high probability, be 
wiped out by some xrisk or misevolution. I am still not convinced that 
this possibility is the truth, but it seems about as likely as the 
opposite case.

I would love to be able to find some good arguments that settle things 
one way or another. The problem is that the xrisk category is pretty big 
and messy, with unknown unknowns.


> How would you implement one? The response would be obviously
> deterministic. It cannot be static, or else it wouldn't be able
> to track the underlying culture patch. How large a fraction
> of the physical layer is allocated to the cop?

Military budgets are a few percent of GDP for heavily armed countries, 
and policing budgets may be of a similar size. In our bodies the immune 
system accounts for roughly 20% of metabolism, if I remember right.

Singletons don't have to be sinister Master Control Programs; they 
could be some form of resilient oversight body implementing an 
unchanging constitution. The von Neumann probe infrastructure mentioned 
in the other thread could implement a singleton as an interface between 
the colonizer/infrastructure construction layer and the "users", 
essentially providing them with a DRMed galactic infrastructure. How 
perfect they need to be depends on how dangerous failures would be: the 
more scary and brittle the situation, the more they would need to 
prevent certain things from ever happening. But it could just be that 
they act to bias the evolution of a civilization away from certain bad 
attractor states, like burning the cosmic commons.
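A toy sketch of that "DRMed infrastructure" interface (the risk scores and request names are invented for illustration): users never touch the construction layer directly; every request passes through a fixed oversight gate that only blocks requests above a risk threshold, biasing outcomes away from bad attractors rather than micromanaging everything.

```python
# Hypothetical sketch: a singleton as a gatekeeping interface between
# "users" and the construction layer. Risk estimates are made-up numbers
# standing in for whatever assessment machinery the singleton would use.

RISK = {"habitat": 0.0, "compute_cluster": 0.1, "stellar_disassembly": 0.9}

def construction_layer(request):
    # The underlying colonizer/infrastructure layer, reachable only
    # through the interface below.
    return f"built:{request}"

def singleton_interface(request, risk_threshold=0.5):
    # Deny only requests whose estimated risk crosses the threshold;
    # everything else is passed through to the builders unchanged.
    # Unknown requests default to maximum risk.
    if RISK.get(request, 1.0) > risk_threshold:
        return "denied"
    return construction_layer(request)

assert singleton_interface("habitat") == "built:habitat"
assert singleton_interface("stellar_disassembly") == "denied"
```

Lowering `risk_threshold` moves the design toward the "prevent certain things from ever happening" end of the spectrum; raising it makes the singleton a gentle bias rather than a hard veto.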


> I'm sure such a thing would be a dictator's wet dream.

Yup. A bad singleton is an xrisk on its own.


>> (hmm, now I have a total urge to listen to "The Terrible Secret of
>> Space"... incidentally a great song about Friendly AI)
>
> That is incorrect. Do not listen to the Anders robot.
> He is malfunctioning. Uploading will protect you.
> Uploading will protect you from the terrible silence in the skies.
>

That is incorrect. Whole brain emulation will protect you. Whole brain 
emulation will protect you from the terrible silence in the skies. Do 
not trust the Eugene robot. Whole brain emulation is the answer.
We Are Here To Protect You.


-- 
Anders Sandberg
Future of Humanity Institute
Oxford University
