[ExI] Forking
Anders Sandberg
anders at aleph.se
Sat Dec 31 09:14:29 UTC 2011
On 2011-12-30 21:00, Keith Henson wrote:
> On Fri, Dec 30, 2011 at 5:00 AM, Anders Sandberg<anders at aleph.se> wrote:
>> Then of course there are the
>> ethical reasons to want people to exist, although these are more
>> complex to argue from.
>
> Hmm. Does that apply even stronger to super intelligent AIs?
Could be. I am working a bit on a paper with a colleague (who isn't a
transhumanist) about ethical arguments against making superintelligent
AI. One of the more intriguing possibilities is that such AIs might
embody so much value (by being super-conscious, having super-emotions,
or being super-moral) that it would be either 1) impermissible for
humans to make them, since once they exist more or less the only
relevant moral actions we could take are those serving or protecting
them (even if they don't need or care), or 2) too dangerous in the
moral sense to try to develop them, because we might accidentally
produce super-disvalue (imagine an entity that suffers so much that
everything positive humanity has ever done is insignificant in
comparison). I don't think these cases are good arguments for
refraining from AI, but they certainly suggest that there might be
problems with succeeding too well, even if the AI itself is friendly.
>>> I suppose forking should be added to excessive population growth rate
>>> as a reason we don't see the works of aliens.
>>
>> Well, if forking is a road to grinding poverty and forks can get a
>> temporary benefit by colonizing,
>
> Forking to grinding poverty would probably preclude having the
> resources to colonize--which have to be serious. We could colonize
> space, (O'Neill colonies) but have not and one cited reason is that we
> are too poor.
Uploads are pretty ideal for space colonization and would probably be
the cheap way of getting space resources.
A poverty trap limits your range of actions because you cannot afford
the actions that would bring you out of it. But if certain actions at
least temporarily reduce your poverty, then they are likely to be
taken. So unless you presuppose that all of transhumanity is in a very
tight poverty trap, where no coalition can improve its situation by
pooling resources to acquire more resources, it seems likely that some
part of it will colonize for more resources. These resources might get
over-shared again, producing a "mobile" poverty trap, but that just
means you get an expanding poor civilization, not a non-expanding one.
> I think the negative consequences of forking need to be considered
> every bit as seriously as those of AI.
Completely agree. Will work on it when we are finished with AI.
--
Anders Sandberg
Future of Humanity Institute
Oxford University