[extropy-chat] Bluff and the Darwin award

Samantha Atkins sjatkins at mac.com
Tue May 16 08:28:13 UTC 2006


On May 15, 2006, at 6:15 PM, Russell Wallace wrote:

> Some years ago there was an incident in which a gang of robbers  
> held up a bank, using fake guns carefully made to look like the  
> real thing. It worked. It worked so well that while trying to  
> escape, the robbers were shot dead by armed police officers.
>
> The moral of this story is that, puffer fish notwithstanding,  
> making yourself appear more dangerous than you are is not always a  
> wise strategy.
>
> The Singularity is a lovely idea. (Bad word for it, mind you -  
> misuse of the mathematical terminology - but unfortunately we seem  
> to be stuck with it now.) In the works of E.E. Smith and Olaf  
> Stapledon, Asimov and Clarke, it provided inspiring visions of  
> possible futures; and while any particular vision is unrealistic,  
> the general concept that our remote descendants may be greater than  
> we are, is a good and reasonable one.

Personally I was greatly inspired by Olaf Stapledon's "Star Maker" at  
a young age.  I don't think his scenario is unrealistic, at least not  
in the long run.

>
> Somewhere along the way it mutated into the meme of _imminent_  
> Singularity. This version is pure fantasy, but like astrology and  
> spiritual healing, it has memetic survival advantage because it  
> resonates with strong predispositions in the human brain. In this  
> case, the predisposition is to believe in apocalypse or nirvana in our
> lifetimes; no matter how many times this is falsified, each new  
> generation's faith is diminished not one iota.
>

I have no idea what you are talking about.  The Singularity as defined by  
Vinge, stripped of a lot of the other baggage, is eminently likely within  
this century.  All the other overlays are something else again.  But  
the likelihood of the advent of greater-than-human intelligence on this  
planet in that time frame is substantial.  I suggest we not ignore it.

> Of course there's nothing wrong with make-believe if it's kept  
> under control, like children playing with realistic-looking fake  
> guns in their own back garden. But it's another thing when it  
> spills out of the pages of science fiction books and unnoticed geek  
> mailing lists, and into the mainstream media and conferences hosted  
> by major universities.

What precisely are you calling make-believe?  You yourself are working  
on the creation of strong AI.  It is a bit strange to devote your life  
to something you consider make-believe.

>
> When calls are made to base real life public policy on fantasy -  
> made and listened to.
>

What is fantasy and what is forward vision?

> I'm not a big fan of government regulation at the best of times - I  
> think it's a blunt instrument that often does a lot more harm than  
> good - but if molecular manufacturing, human-level AI, neurohacking  
> or any of the usual list of buzzwords actually existed, it would at  
> least make sense to call for a public debate on whether they should  
> be regulated. In reality, they're nowhere near being on the  
> horizon, and if they ever are invented they are unlikely to  
> resemble our present visions any more than real life space  
> exploration involves rescuing Martian princesses from bug-eyed  
> monsters; in our current state of ignorance as to what they might  
> eventually look like, any regulations we might invent now would  
> ultimately prove about as useful as Roger Bacon trying to draw up  
> restrictions on the manufacture of nerve gas.

There are many very good reasons not to let emotional, irrational  
monkeys seriously restrict that which the brightest of them only  
begin to have a glimmer of.  But it is not necessary to claim that it is  
all make-believe and thus unregulatable.  That strategy becomes more  
of a lie with every major advance.  It may leave governments feeling  
blindsided, and acting even more reactionary, when the "make-believe"  
starts to look all too real.


>
> That is not to say, unfortunately, that regulation would have no  
> effect. Substantial advance in technology is going to require  
> generations of hard work - basic research that's hard to get  
> funding for at the best of times. If you have to spend $10 on  
> lawyers to get permission for $1 of lab work, it's not going to  
> happen.

I agree with this of course.

> Nor do we have an infinitely long window of opportunity; the  
> conditions that support free inquiry and rapid technological  
> progress are, on the scale of history, a rare and short-lived  
> aberration. There is a threshold we need to reach; it is not the  
> badly-named "Singularity", but Diaspora - the technology to live  
> sustainably off Earth.

I very much disagree.  It would take very serious technology, including  
nanotechnology, to get a substantial portion of humanity far enough off  
planet to provide much safety of the kind you seem to be advocating.

> With a quarter trillion stars in our galaxy alone, there'll be room  
> to find a way forward come what may; but we need to attain that  
> level of technology first, and the truth, as many a driver with  
> children in the back seat has had to point out, is that we are not  
> nearly there yet.
>

Only the nearest stars are reachable by biological creatures such as  
ourselves in a timeframe that would have us arrive before our post-  
biological offspring pass us and get there first.

> The Earth isn't going to be demolished to make room for a  
> hyperspace bypass, or eaten by grey goo, or blown up by Skynet, but  
> we - humanity - may die nonetheless, looking up at the unattainable  
> stars as our vision fades and goes out, not a mark on us from any  
> outside force, merely strangled by our own illusions.

If we do not develop substantially greater effective intelligence and  
rationality then the likeliest scenario is that we do ourselves in.

>
> Lest this be taken as another libertarian "government = evil" rant,  
> I'll emphasize that if we fail for the above reason it won't be the  
> politicians' fault. They have their jobs to do; are they wrong to  
> trust us to do ours? If we scientists and technologists come along  
> babbling about people wireheading themselves into vegetables or  
> turning themselves into monster cyborg killing machines or eating  
> the planet, _how are politicians and the public supposed to know we  
> were just deluding ourselves with paranoid fantasy_? If we must  
> ultimately drink a lethal draught, it will be because we ourselves  
> poisoned the well.
>

Wireheading is not that far away.  Neither are various kinds of  
cyborg, by some definitions, though they certainly need not be killers  
and will not be all-powerful.  It is possible, although unlikely, that  
something could be cooked up to eat the biosphere.  At some point  
breaking up planets will become possible.  Should we pretend this is  
not possible?  Why?  Is your advice to not scare those who cannot see  
for themselves?  I am all for not scaring people unnecessarily, but  
not for silencing ourselves just because there are scary implications.


> So I am proposing that at last we leave childhood behind and accept  
> the difference between fantasy and real life, and if we choose to  
> entertain ourselves by gathering to tell each other stories, title  
> the gathering "Science fiction convention" not "Singularity summit".

I really do not think this is called for.  As a "Singularity Summit,"  
this one was very tame and not very scary at all.


> Granted that everyone needs something to believe, if you find  
> "Singularity in my lifetime" is what you personally need, then  
> believe in life extension or cryonics that'll let you stick around  
> to see it, and let go of the illusion that it's just around the  
> corner.

It is either within this century or likely not at all for this  
species, imho.

> And the correct response to "Gray goo is going to eat the planet"  
> isn't "Let's draw up a list of safeguards" but "You need to lay off  
> the bad science fiction".

It turns out that is a relatively unlikely scenario.  That doesn't  
mean there is nothing at all that can go wrong.  Nor does it mean  
in the least that the Singularity is impossible in relatively short  
order.  If you believe it is impossible, then argue that carefully.  
Don't just announce that it is all make-believe.

> Let us cease poisoning the well, grow up and face reality.

Ceasing to consider what is possible, and to speak about it and its  
implications as more than utter fantasy, is not facing reality.  It  
is putting our collective heads in the sand until something we chose  
to ignore chews our butts off.

- samantha
