[ExI] Uploading cautions, "Speed Up" . .

Keith Henson hkeithhenson at gmail.com
Wed Dec 21 15:43:30 UTC 2011


On Wed, Dec 21, 2011 at 5:00 AM,  Anders Sandberg <anders at aleph.se> wrote:

> Here is a simple game: what probability do you assign to us surviving
> the transition to an AGI world? Call it P1. Once in this world, where we
> have (by assumption) non-malign very smart AGI, what is the probability
> we will survive the invention of brain emulation? Call it P2.
>
> Now consider a world where brain emulation comes first. What is the
> chance of surviving that transition? Call it P3. OK, we survived the
> upload transition. Now we invent AGI. What is the chance of surviving it
> in this world? Call it P4.
>
> Which is largest, P1*P2 or P3*P4? The first is the chance of a happy
> ending for the AGI first world, the second is the chance of a happy
> ending for the uploading first world.
>
> Now, over at FHI most of us tended to assume the existence of nice
> superintelligence would make P2 pretty big - it would help us avoid
> making a mess of the upload transition. But uploads don't seem to help
> much with fixing P4, since they are not superintelligent per se (there
> is just a lot more brain power in that world).
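
To make Anders' comparison concrete, here is a toy calculation (the
numbers below are placeholders for illustration, not estimates anyone
has actually defended):

    # Toy version of the game above, with purely illustrative numbers.
    # P1: survive the AGI transition;    P2: survive uploading given nice AGI.
    # P3: survive the upload transition; P4: survive AGI given uploads exist.
    P1, P2 = 0.5, 0.9   # assumed: friendly AGI makes the upload step safer
    P3, P4 = 0.7, 0.5   # assumed: uploads do little to make the AGI step safer
    print("AGI first:    ", P1 * P2)   # 0.45
    print("Uploads first:", P3 * P4)   # 0.35

With those invented numbers the AGI-first ordering comes out ahead, but
the ranking flips depending on how much each technology is assumed to
help with the transition that follows it.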

I have argued that uploads could be an unmitigated disaster.  I am
assuming there is some reason to upload, such as an enhanced ability
to control real-world machines.  The problem is that humans have
psychological mechanisms (evolved, I assume, into the hardware) for
detecting looming privation and taking steps as a group to reduce it,
i.e., killing neighbors.  Blind emulation would incorporate these
mechanisms, and a clearer view of the future might turn them on hard.
A tribe of highly enhanced uploads dedicated to thinning out the
population is an unnerving thought.

snip

> You shouldn't try to upload your brain before we have full-brain
> emulation since the methods are likely going to be 1) destructive,

I have argued that, for marketing reasons alone, destructive uploads
are going to be a hard sell, especially since the technology to make
uploading fully reversible, with no memory loss (or even loss of
consciousness), is no harder.  (See "The Clinic Seed".)

> 2)
> have to throw away information during processing due to storage
> constraints until at least mid-century,

I don't see why.  The information in your brain fits in your skull.
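
For a rough sense of scale, here is a back-of-envelope estimate (the
neuron and synapse counts are commonly cited ballpark figures; the
per-element byte counts are pure assumptions):

    # Back-of-envelope storage estimate for a synapse-level emulation.
    # All figures below are rough assumptions, not measurements.
    neurons  = 8.6e10            # ~86 billion neurons (ballpark)
    synapses = 1.5e14            # ~150 trillion synapses (ballpark)
    bytes_per_neuron  = 100      # assumed per-neuron state
    bytes_per_synapse = 8        # assumed: target, weight, a little state
    total_bytes = neurons * bytes_per_neuron + synapses * bytes_per_synapse
    print(total_bytes / 1e15, "petabytes")   # roughly 1.2 PB

On numbers like those the end result is on the order of a petabyte,
which is large but not an exotic amount of storage.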

> 3) we will not have evidence it
> works before it actually works. Of course, some of us might have no
> choice because we are frozen in liquid nitrogen...

The technology needed for any of this is so close to what would be
needed for revival that we should be able to revive the cryonics
patients and let them decide whether they want to upload.

Iain Banks had a good deal of this in "Surface Detail."

Keith


