[ExI] How could you ever support an AGI?

Lee Corbin lcorbin at rawbw.com
Tue Mar 4 05:15:35 UTC 2008


John Clark writes

>  Robert Bradbury Wrote:

> > I believe the production of an AGI spells the extinction of humanity.  
>  
> Me too.
 
But you do not think it *certain*, of course. This talk of utter doom
concerning an unknown future is misplaced. Yes, things are grim, but
they were grim back in the '60s, '70s, and '80s too, and it would have
been wrong then to speak of unavoidable doom.  There was a chance
that we'd make it through, you see.
 
> If you don't want to develop an AI somebody else certainly will,

A very important point!

> And if you're the first to make an AI you would have more control
> (very small but larger than zero) over future events than the person
> who came in second. It may also give these developers some
> comfort to know that even if they or their children do not survive
> their mind children will.
 
Well, since we are mostly interested in survival, such consolation
hardly registers.

> Still, things aren't completely hopeless, just almost hopeless.
> If you or your biological children have any wish to survive
> they must shed the silly superstitions regarding identity and
> consciousness that is epidemic in society and even infects
> most members of this list.

You mean, like retaining one's memories?  The cryonicists, like
everyone else I know, suppose that total and permanent loss of
memory is death.

I thought that you agreed with the statement "Anything that 
remembers being me is me (to some larger or smaller degree)."
No?

> > And so, we must present transhumanism as an "Extinction Level Event"
>
> Yes.

No, we should not. The future is too uncertain for any such claim.
Besides, in my reply I mentioned the "weighted sum". Would you
take the following gamble:

   1. Suppose that you are capable (which you are almost surely
      not, nor is any of us) of imagining that you could begin living
      a life 100 times more valuable to you than is the one you are
      now leading.
   2. You can press button A, and there will be a .9 chance of
      immediate death and a .1 chance of obtaining said life.

(If it makes the chooser feel any more comfortable, consider that our
best theory of physics strongly suggests universe branching, and that
regardless, you'll still live, but in only one-tenth as many universes.)
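
To make the "weighted sum" concrete, here is a minimal Python sketch
of the arithmetic, assuming purely for illustration that the life you
now lead is assigned a baseline value of 1:

    # Sketch of the weighted sum behind the button-A gamble.
    # Assumption for illustration only: your present life is worth 1 unit.
    p_death = 0.9         # chance the button kills you immediately
    p_better = 0.1        # chance you obtain the far better life
    value_now = 1.0       # baseline value of carrying on as you are
    value_better = 100.0  # the imagined life, 100 times more valuable

    ev_press = p_death * 0.0 + p_better * value_better   # = 10.0
    ev_decline = value_now                                # =  1.0
    print(ev_press, ev_decline)

    # On this accounting, pressing comes out ahead by a factor of ten,
    # even though it is 90% likely to kill you on the spot.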

But even the people on a ship that has just struck an iceberg, one
certain to sink the ship and kill most of those aboard, do not say
"this iceberg is an extinction level event for us". They keep their
eye on the main chance.

Lee



