[extropy-chat] Bluff and the Darwin award

Samantha Atkins sjatkins at mac.com
Tue May 16 17:35:33 UTC 2006


On May 16, 2006, at 4:02 AM, Russell Wallace wrote:

>
> > What is fantasy and what is forward vision?
>
> Given that we can't (and never have been able to) make actual  
> predictions past the near future, forward vision more than a couple  
> of decades forward _is_ necessarily fantasy.
>

The point of forward vision is not accuracy but projecting likely  
scenarios and preparing for them as best we can.  Doing this is not  
fairly described as engaging in fantasy.  Many are also dedicated to  
bringing the best of these future scenarios closer by actively  
pursuing them.  Some of those scenarios will not happen in two decades  
or more if they are dismissed as mere fantasy now.

> > There are many very good reasons not to let emotional irrational
> > monkeys seriously restrict that which the brightest of them only
> > begin to have a glimmer of.  It is not necessary to claim that it is
> > all make-believe and thus unregulatable.
>
> Do you think we today can foresee the Singularity or whatever well  
> enough to have a reasonable chance of making good, detailed policy  
> decisions and passing appropriate, well-tuned regulations on it? I  
> would guess from your words above that you agree with me that we  
> can't.
>

No.  I am against such regulation today.  But we don't need to claim  
these things are fantasy in order to argue against regulating them.

> That is the substance of my claim. I'm not calling make-believe the  
> idea that the Singularity may someday happen.

Unless we seriously screw up or get hit with something nasty, the  
Singularity in the Vinge sense is a certainty.  That is not the full  
apocalyptic vision, though.

> What I'm calling make-believe is any specific prediction regarding  
> it. Because we are not close enough to do as well as random chance  
> in making specific predictions, any attempt to discuss Singularity  
> policy right now is worse than useless folly - it encourages the  
> passing of regulations that reduce our chances of ever making it at  
> all. If it's a topic for policy discussion, then it's a political  
> problem, and that means laws and regulations.

I got all that, but there is no need to toss out the baby with the  
bath water.

>
> What I advocate is acknowledging that relevant policy discussion is  
> not yet possible, and that it should be considered not a political  
> problem but a purely technical one, until the technology actually  
> delivers something of substance, thereby providing information that  
> can feed a _meaningful_ policy discussion. If regulations are to be  
> proposed, then let them be proposed at that later date when there's  
> some prayer of them being based at least partly on fact rather than  
> fantasy.
>
> > I very much disagree.   It would take very serious tech including
> > nanotechnology to get a substantial amount of humanity far enough off
> > planet to provide much safety of the kind you seem to be advocating.
>
> Of course it will take such tech - that's exactly my point! What do  
> you disagree with?
>

You seemed to focus more on getting humanity off planet than on the  
"fantasy" of some of the technology that would be needed to do so.    
Sorry if I misread.

> Hmm, from your choice of words above perhaps you're thinking I want  
> Diaspora in the sense of "the world's about to blow up, I want a  
> lifeboat so I can get out of blast radius"? My vision is a  
> different one. I want it so life can stop being about squabbling  
> over who gets to control which corner of this one little ball of  
> rock. So we can have room to breathe.
>

What is enough "room"?  Any space migration is more of a fantasy in  
the next two decades than >human AI.  Such migration will in its  
beginnings be very resource-hungry and will likely involve much more  
squabbling for control than down here in the gravity well, where we  
are inundated with resources.  Supporting biological humans in great  
numbers indefinitely in space is a vaster undertaking than AGI or MNT  
(molecular nanotechnology).


> > Only the nearest stars are reachable to biological creatures such as
> > ourselves in a timeframe that will have us arrive before the post-
> > biological offspring pass us and get there first.
>
> *shrug* Maybe. Honestly, I think the actual future, if we make it,  
> will end up being stranger and more wonderful than either of us  
> could have predicted. But that's something for our descendants to  
> figure out; maybe ourselves in person if progress in life extension  
> is fast enough; but at any rate, people in the future. What we need  
> to worry about right now is getting that far in the first place.

It is not up to our descendants.  It is our watch.  If we drop the  
ball, there likely will not be any space-faring descendants and  
perhaps no AGI in this corner of space-time.

>
> No, I make no claims regarding ultimate impossibility of anything  
> that doesn't involve logical self-contradiction. But neither should  
> we pretend any of it is close enough for meaningful policy discussion.
>

I think we are in violent agreement on that much.  :-)

> > I really do not think this is called for.   As a "Singularity Summit"
> > this was very tame and not very scary at all.
>
> It provided a platform for Bill McKibben to speak. Bill McKibben,  
> who advocates an end to progress, the snuffing out of the entire  
> future of sentient life - _and is respected and taken seriously_.  
> If that doesn't scare you, what would?

McKibben has many more opportunities to speak than the more positive  
side does.  Should we shut up just because the McKibbens will take  
advantage of the opportunity to provide "balance"?

>
> > > Granted that everyone needs something to believe, if you find
> > > "Singularity in my lifetime" is what you personally need, then
> > > believe in life extension or cryonics that'll let you stick around
> > > to see it, and let go of the illusion that it's just around the
> > > corner.
>
> > It is either within this century or likely not at all for this
> > species imho.
>
> You may well be right. That's why I think it's so important to keep  
> working flat out on the technology, and not get snared in politics.
>

Again, we are in agreement about avoiding politics where we can.   
There are places where we cannot, and where some political activity  
is essential to move forward.  Putting the worst fears to rest, or  
showing they are manageable, is for instance part of what allowed  
nanotech funding to increase.

- samantha



