[extropy-chat] Morality Among Posthumans (was In the Long Run...Intelligence Dominate Space?)

Lee Corbin lcorbin at tsoft.com
Mon Jul 10 17:52:56 UTC 2006


Samantha writes

> On Jul 6, 2006, at 8:25 PM, Lee Corbin wrote:
> 
> > There are two reasons I can think of that an AI may wish to keep
> > its area clean: one is selfish, one is moral.
> >
> > The selfish reason is that it probably will see no reason to allow
> > compute resources to be squandered on vastly inferior processes.
> > It has its own reasons to calculate, its own curiosity, its own
> > redesign of itself. Why permit resources to be wasted on anything
> > else?
> 
> No appreciation of variety or of the benefits of diversity eh?   If  
> it has enough power to grab all "compute resources" then it certainly  
> should do so, eh?   The strong take all and the weak go wanting and  
> yet..

You could take your argument up with rabbits in Australia, or kudzu
throughout the U.S.  Or with Mother Nature, or with Mr. Darwin. I'm
describing what *is* here, not what I want.

> > The second reason is moral: we today *should* not permit natural
> > processes, had we only the power to stop them, such as big fish
> > eating small fish ad infinitum, especially when the cruelty
> > inflicted on sentient prey, such as is inflicted on gazelles by
> > lions, is avoidable. Only our romanticized fancies prevent us
> > from properly perceiving and appreciating the horrors. Evidently
> > people would need to live a few days as a rabbit or field mouse
> > to properly understand.
> 
> What?  Aren't you one of the folks that beat the no objective  
> morality drum?

That morality isn't objective HAS NO NECESSARY BEARING
WHATSOEVER ON WHAT I APPROVE OF. I hate vacuum that could
otherwise be hosting sentient creatures enjoying themselves, for example.

> If so isn't it a tad inconsistent to talk of natural  
> predator-prey relationships as "cruel" and something we should strive  
> to prevent?   I imagine we will have a lot better things to do.

Yes: (1) each of us has his or her own agenda, which ultimately (if
all goes well) includes embarking on a path of maximum personal
evolution, the goal being to try to keep up with the kind AIs, but
also (2) to embrace, as fully as we can, policies that minimize
suffering throughout the cosmos.

Lee



