[ExI] Digital identity
anders at aleph.se
Fri Apr 26 10:02:47 UTC 2013
On 26/04/2013 10:18, BillK wrote:
> So, is there a limit to how many Anders instances might exist?
The upper boundary is set by the number of reachable galaxies (a couple
of billion by my last count), how much of the matter can be converted
into computronium, and how fast it runs. If we assume 10^11 stars per
galaxy, 10^42 to 10^48 computations per second in a M-brain, that I can
run on 10^17 computations per second, we get a capacity for 10^46 to
10^52 active Anderses. Using Seth Lloyd's "ultimate laptop" bound of 10^51
operations per second, if we convert the matter into 2*10^51 "laptops"
we get a capacity for running 2*10^85 Anderses. However, each laptop has
just 10^31 bits of storage, so it can only house around 10^16
Anderses, bringing the simultaneous number down to a mere 2*10^67.
OK, that was a bit of shameless ego-stroking. But I promise to be
merciful when I take over the universe.
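The back-of-envelope arithmetic above can be checked mechanically. A sketch, using only the rough figures from the post (the per-upload storage of ~10^15 bits is an assumption inferred from the 15 terabyte figure mentioned below):

```python
# Order-of-magnitude capacity estimates from the post.
GALAXIES = 2e9            # reachable galaxies ("a couple of billion")
STARS_PER_GALAXY = 1e11
MBRAIN_OPS_LOW = 1e42     # computations/s per M-brain, low estimate
MBRAIN_OPS_HIGH = 1e48    # computations/s per M-brain, high estimate
ANDERS_OPS = 1e17         # computations/s needed to run one Anders

mbrains = GALAXIES * STARS_PER_GALAXY  # one M-brain per star
low = mbrains * MBRAIN_OPS_LOW / ANDERS_OPS
high = mbrains * MBRAIN_OPS_HIGH / ANDERS_OPS
print(f"M-brain route: {low:.0e} to {high:.0e} active Anderses")

# Lloyd "ultimate laptop": 1e51 ops/s and 1e31 bits of storage each.
LAPTOPS = 2e51
LAPTOP_OPS = 1e51
LAPTOP_BITS = 1e31
ANDERS_BITS = 1e15        # assumed storage per upload (~15 TB order)

compute_bound = LAPTOPS * LAPTOP_OPS / ANDERS_OPS   # ops-limited count
storage_bound = LAPTOPS * (LAPTOP_BITS / ANDERS_BITS)  # memory-limited count
print(f"laptop route: compute bound {compute_bound:.0e}, "
      f"storage bound {storage_bound:.0e}")
```

The storage bound is the smaller of the two, which is why it sets the final 2*10^67 figure.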
> Where do the supporting resources come from? Are they unlimited?
In real life, the market. Running computations costs money, and it will
be paid for by the uploads or by the people/organisations sponsoring them.
Uploading becomes economically feasible when the cost per upload drops
to the order of a million dollars. Most models I have seen suggest
that the huge economic incentives for getting more uploads will drive
down costs a lot, and of course lead to the manufacturing of more hardware.
On Earth I think the real limitations will be communications lags and
heat dissipation; Robin Hanson estimates a market for a handful of
super-dense "cities" in the early days. But the speed at which the
infrastructure gets built is likely to be slow relative to the upload
timeframe, so uploads will indeed find expansion expensive and annoying.
> Will the first upload immediately branch and replicate furiously,
> swamping the available resources and restricting later uploads?
Depends on whether the computing required for uploads is widespread, and
whether copying over the net is easy. Anybody who has tried to copy a
15 terabyte file to a remote server will see the problem. I suspect
winner-take-all is only a problem in high hardware overhang scenarios,
yet another reason to try to get the neuroscience sorted out first,
before Moore's law gets too far.
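To put that bandwidth problem in numbers, a sketch (the link speeds are illustrative assumptions, not figures from the post):

```python
# Time to copy a 15 TB upload over various link speeds.
FILE_BITS = 15e12 * 8  # 15 terabytes expressed in bits

link_speeds = {"100 Mbit/s": 1e8, "1 Gbit/s": 1e9, "10 Gbit/s": 1e10}
transfer_hours = {name: FILE_BITS / bps / 3600
                  for name, bps in link_speeds.items()}

for name, hours in transfer_hours.items():
    print(f"{name:>10}: {hours:8.1f} hours")
```

Even on a gigabit link the transfer takes over a day, so casual copying across the net is far from free.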
> Remember we are talking big numbers here of persons to be uploaded.
> How will the resources be shared between billions of uploaded persons,
> all branching and trying out 'improvements'? Really, I don't think an
> uncontrolled environment is feasible. First takes all seems more likely.
Think property rights. You want to run a hundred copies? Fine, pay for
the servers. Forks will likely have equal legal shares in the original
copy's resources, so if you make a hundred copies of yourself each copy
will now have a hundredth of your wealth. Probably a smart idea to form
a copy-clan that pools resources. See Carl Shulman's work for some
analysis of the economics of copy-clans.
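The equal-shares rule is just division, but a minimal illustration makes the incentive clear (the starting wealth is an arbitrary assumed figure):

```python
def wealth_after_forking(wealth: float, copies: int) -> float:
    """Equal shares: forking into n copies leaves each with wealth/n."""
    return wealth / copies

# $1M split among 100 forks leaves each copy with $10,000.
each = wealth_after_forking(1_000_000.0, 100)
print(each)  # 10000.0
```

Under this rule, replicating furiously dilutes each copy's resources rather than multiplying the clan's wealth, which is what keeps the winner-take-all scenario in check.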
Future of Humanity Institute
Oxford Martin School
Faculty of Philosophy