[ExI] Uploading cautions, "Speed Up" (Anders Sandberg)

Anders Sandberg anders at aleph.se
Sun Dec 25 00:05:25 UTC 2011


On 2011-12-24 15:39, Keith Henson wrote:
> On Fri, Dec 23, 2011 at 10:21 PM, Anders Sandberg <anders at aleph.se> wrote:
>
>> On 2011-12-22 20:01, Keith Henson wrote:
>>> On Thu, Dec 22, 2011 at 5:00 AM, Anders Sandberg <anders at aleph.se> wrote:
>>>>> Unless making copies is illegal and strongly enforced, for example, by
>>>>> AIs.
>>>>
>>>> But that requires a singleton (to use Nick's term), an agency that can
>>>> enforce global coordination.
>>>
>>> Or really widespread agreement that something is a bad idea.
>>
>> It is enough to have one defector to ruin the agreement. The only way of
>> making the low-forking strategy evolutionarily stable is to coordinate
>> so that deviations are punished enough *everywhere*.
>
> Or simply don't happen.

It is hard to prevent people from trying things. A very strong singleton 
could monitor things so closely that it could detect intentions at an 
early stage (which might be possible if you bug every upload mind - you 
can do some amazing surveillance in a software civilization!) and then 
put a stop to them. Or everybody could be modified or selected not to 
want the bad things. But this is far beyond widespread agreement that 
something is bad: if all societies agree to this level of 
monitoring/adjustment, we already have a singleton of some kind.


>> And that likely
>> requires a global singleton, not just an agreement among all nice
>> governments or companies.
>
> If the mechanisms for forking humans are in the control of machines
> and not humans, and the machines are smart enough to understand the
> consequences of forking (grinding poverty), then it won't happen.

This only holds in the scenario where AI arrives before emulation, the 
AI is non-disastrous and fairly nice, and it forms a singleton of some 
kind. Drop any of those three assumptions and mass forking looks quite 
possible (or we are all dead anyway).


> It's hard to be sure, but there is no obvious reason for a human to
> have an evolved instinct to replicate as in forking.  Even with
> children, the drive to mate is decoupled from the instinct to take
> care of offspring.  But consider birth control.

Or birth rates going down as wealth increases, often far below 
replacement levels. In fact, demographers have told me that the UN 
scenarios assuming convergence to 2.1 kids per woman are just that: 
assumptions without evidence. It is entirely possible that societies 
could settle into stable sociocultural states with far lower birth 
rates - the evolved drives are still around (there is an interesting 
link between threatening or uncertain environments and higher teenage 
pregnancy, which makes strategic sense), but they can be overridden or 
redirected by cultural memes.

Forking is not necessarily motivated by a drive to reproduce. It might 
be economic, it might be intellectual, it might even be religious - as 
soon as there is somebody with pro-forking views, a lot of forking is 
likely unless the cost of computing space is very high.


> Forking in the uploaded state should be much faster.  I think the
> fastest doubling time for a worm was 8.5 seconds.  It infected all
> 50,000 vulnerable computers among the 4 billion internet addresses
> in single-digit hours and jammed the net.
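As a sanity check on that doubling math (a back-of-the-envelope sketch 
in Python; the 8.5 s doubling time and 50,000 hosts are the figures 
quoted above, and pure doubling ignores how randomly scanning 4 billion 
addresses wastes most probes):

    import math

    DOUBLING_TIME_S = 8.5      # quoted fastest worm doubling time
    VULNERABLE_HOSTS = 50_000  # quoted vulnerable population

    # Doublings needed to grow from one infected host to all of them.
    doublings = math.log2(VULNERABLE_HOSTS)
    print(f"{doublings:.1f} doublings, "
          f"~{doublings * DOUBLING_TIME_S:.0f} s to saturate")
    # ~15.6 doublings, ~133 s under ideal growth; the real spread took
    # hours because most random probes miss a vulnerable host.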

The forking speed of humans or AIs will likely depend on the bandwidth 
available for transmitting the full state to new memory. Assuming a 
human amounts to 30 terabytes, replication within a single computer on 
current buses could happen in 500 seconds (at 60 gigabytes/s). But that 
can surely be improved, if only by running buses in parallel.
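To make the arithmetic explicit (a minimal sketch; the 30 TB state size 
and 60 GB/s bus bandwidth are the assumptions above, and the eight-bus 
figure is purely illustrative):

    STATE_BYTES = 30e12      # assumed size of one human upload: 30 TB
    BUS_BYTES_PER_S = 60e9   # assumed current bus bandwidth: 60 GB/s

    def copy_time_s(state_bytes: float, bandwidth: float,
                    parallel_buses: int = 1) -> float:
        """Seconds to stream the full state to new memory."""
        return state_bytes / (bandwidth * parallel_buses)

    print(copy_time_s(STATE_BYTES, BUS_BYTES_PER_S))      # 500.0 s
    print(copy_time_s(STATE_BYTES, BUS_BYTES_PER_S, 8))   # 62.5 s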


-- 
Anders Sandberg
Future of Humanity Institute
Oxford University


