[ExI] Forking
Keith Henson
hkeithhenson at gmail.com
Fri Dec 30 20:00:37 UTC 2011
On Fri, Dec 30, 2011 at 5:00 AM, Anders Sandberg <anders at aleph.se> wrote:
snip
> Which makes me wonder where the *new* ideas are. I suspect they are
> right under our noses, but we do not recognize that they are really new.
>
> (Incidentally, have you read Alastair Reynolds' novel "The House of
> Suns"? It involves clades of people cruising the galaxy and holding Far
> Edge Parties at certain intervals.)
I will have to look it up. It's weird to find old Extropy list
discussions (like the one between me and Hans Moravec) becoming plot
elements in SF stories. Far Edge Party is one of those memes that
took off (in a small way).
>>> If I fork myself, there will now be twice as much me-experience, twice
>>> the amount of human capital and twice as many entities sharing my goals.
>>
>> You will also have half the resources per capita. Widespread,
>> uncontrolled forking is isomorphic to gray goo.
>
> It is the overpopulation debate again. My FHI colleague Toby Ord gave a
> good talk on the question a few weeks back:
> http://www.oxfordmartin.ox.ac.uk/videos/view/128
> He suggested that it might be beneficial to have a larger population for a
> number of reasons. For example, in an information economy more minds
> produce goods that are useful for all others. We also need larger groups
> to produce very complex goods, and we should not be convinced that we
> have somehow reached the limit here.
I have no problem with a population larger, even vastly larger, than
the current one, provided the economy stays ahead of the population
growth. If it fails to do so, falling income per capita is what turns
on the psychological mechanisms leading to wars. (Google
"evolutionary psychology memes and the origin of war".)
Unless forking (and regular biological reproduction) is limited to a
rate below economic growth, I suspect the practice would cause a
nearly immediate war, because a bleak economic future would rapidly
become obvious to all (driving meme: Kill the Forkers!). At some
point even the entire material resources of the solar system will be
in use, as in Accelerando. Speed-of-light considerations (which we
are already seeing) may limit us to the Earth, in which case material
limits will hit much sooner.
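To make the per-capita arithmetic concrete, here is a minimal sketch
(Python, with assumed illustrative growth rates rather than anything
from Toby's talk) of why income per head falls whenever forking
outpaces economic growth:

    # Illustrative only: assumed 3% annual economic growth vs. 10%
    # annual population (forking) growth -- both numbers are made up.
    economy = 1.0      # total economic output, arbitrary units
    population = 1.0   # number of persons/forks, arbitrary units
    for year in range(1, 21):
        economy *= 1.03
        population *= 1.10
        # per-capita income shrinks every year growth lags forking
        print(year, round(economy / population, 3))

In that toy case per-capita income falls by roughly 6 percent a year,
which is exactly the kind of trajectory I would expect to trip the war
mechanisms above.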
I also have no problem with "If you want to fork, leave the solar system."
> Then of course there are the
> ethical reasons to want people to exist, although these ones are more
> complex to argue from.
Hmm. Does that apply even more strongly to superintelligent AIs?
>> I suppose forking should be added to excessive population growth rate
>> as a reason we don't see the works of aliens.
>
> Well, if forking is a road to grinding poverty and forks can get a
> temporary benefit by colonizing,
Forking into grinding poverty would probably preclude having the
resources to colonize, and those resources would have to be
substantial. We could colonize space (O'Neill colonies) but have not,
and one cited reason is that we are too poor.
> then it would lead to a version of
> Robin Hanson's "burning the cosmic commons" scenario of very rapidly
> expanding technospheres using all material to grow. They would be fairly
> visible if they cannot spread at a significant fraction of light-speed.
We don't see them, so either we are the first in our light cone, or
something keeps them all from making a visible impact on the universe.
This logic dates back to the late 1970s, when Eric Drexler went
looking. The unresolved situation, as I understand it, makes me very
uneasy, not that there is much I can do about it.
I think the negative consequences of forking need to be considered
every bit as seriously as those of AI.
Keith