[extropy-chat] How to be copied into the future?
asa at nada.kth.se
Mon Mar 19 10:28:20 UTC 2007
Eugen Leitl wrote:
> On Mon, Mar 19, 2007 at 05:34:20AM +0100, Giu1i0 Pri5c0 wrote:
>> Clarke-Baxter technique: seems workable if the underlying assumption of
>> a high density distribution of micro wormholes in vacuum is correct.
> One hell of an assumption. Until we know the opposite, the information
> constituting our being leaks out of us at the speed of light, and
> is lost irreversibly.
Hmm, assuming we take the micro wormholes literally, could it be done?
A wormhole likely has an information capacity < kc^4R^2/2G. A nanometer
wormhole has a capacity of about 10^69 bits/s, so if we put one inside
each synapse we could both scan it and remove the information. The
scanning part is somewhat iffy, since it is not clear to me how to scan an
entire synapse from a nanometer wormhole end (gamma ray radar?), but let's
leave that handwaving aside. Another problem is the wormhole exotic matter:
we would need about 10^18 kg of exotic matter to maintain each of the 10^15
wormholes. Seen anybody's head distort spacetime like a supergiant star?
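The exotic matter figure can be sanity-checked with a quick back-of-envelope calculation, assuming the usual Morris-Thorne-style order-of-magnitude scaling M ~ c^2 R / G for the mass-equivalent needed to hold open a throat of radius R (the scaling and the per-synapse count are rough assumptions, not exact results):

```python
import math

# Rough check of the exotic matter requirement, assuming the
# order-of-magnitude scaling M ~ c^2 R / G for a wormhole throat
# of radius R. Numbers are illustrative, not a derivation.
c = 3.0e8        # speed of light, m/s
G = 6.67e-11     # gravitational constant, m^3 kg^-1 s^-2

R = 1e-9                          # nanometer throat radius, m
M_per_wormhole = c**2 * R / G     # ~1.3e18 kg, matching the 10^18 figure

n_wormholes = 1e15                # one wormhole per synapse
M_total = M_per_wormhole * n_wormholes   # ~1.3e33 kg in total

print(f"per wormhole: {M_per_wormhole:.1e} kg")
print(f"total: {M_total:.1e} kg (~{M_total / 2e30:.0f} solar masses)")
```

The per-wormhole figure comes out around 1.3 x 10^18 kg, consistent with the estimate above; the total for 10^15 of them is hundreds of solar masses, hence the spacetime-distortion joke.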
But the showstopper is causality. There seem to be pretty good reasons to
think that one cannot rearrange matter and energy to get a closed timelike
curve (CTC); see http://aardvark.ucsd.edu/grad_conference/wuthrich.pdf.
And scanning through wormholes into the past would definitely involve CTCs.
BTW, if we have CTCs we can build NP-solving quantum computers.
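That NP-solving result (Bacon's) rests on Deutsch's self-consistency model of CTCs: the state sent into the past must equal the state that comes out after interacting with the present. A minimal numerical sketch of that consistency condition (not the NP algorithm itself; the gate and the chronology-respecting input state are arbitrary illustrative choices):

```python
import numpy as np

# Deutsch's consistency condition for a qubit on a CTC interacting,
# via a unitary U, with an ordinary chronology-respecting (CR) qubit:
#     rho_ctc = Tr_CR[ U (rho_cr (x) rho_ctc) U^dagger ]
# Solved here by fixed-point iteration.

# CNOT with the CR qubit as control and the CTC qubit as target
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def trace_out_first(rho4):
    """Partial trace over the first qubit of a two-qubit density matrix."""
    return np.einsum('ijik->jk', rho4.reshape(2, 2, 2, 2))

def deutsch_fixed_point(U, rho_cr, iters=100):
    """Iterate the Deutsch map until the CTC state is self-consistent."""
    rho_ctc = np.array([[1, 0], [0, 0]], dtype=complex)  # arbitrary start
    for _ in range(iters):
        joint = U @ np.kron(rho_cr, rho_ctc) @ U.conj().T
        rho_ctc = trace_out_first(joint)
    return rho_ctc

rho_cr = np.diag([0.75, 0.25]).astype(complex)  # CR qubit, mostly |0>
rho = deutsch_fixed_point(CNOT, rho_cr)
print(np.round(rho.real, 3))  # converges to the maximally mixed state
```

Deutsch showed a consistent fixed point always exists; the computational power comes from the fact that this map is nonlinear in the input state, which is what the CTC-based NP (and, per later work, PSPACE) constructions exploit.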
Back to the drawing board. I think it is easier to do the massive
simulation approach. We know it should be doable to convert solar systems
into computronium, and running at least a classical physics simulation of
humanity backwards seems to be within the ability of such a pile of
computers. I'd like to think more on how to actually implement the
simulation, but that is for another day.
Oxford Uehiro Centre for Practical Ethics
Philosophy Faculty of Oxford University