[ExI] Hard Takeoff

Darren Greer darren.greer3 at gmail.com
Mon Nov 29 15:01:33 UTC 2010


>>I was thinking prudence should allay our fears.  Then I imagined the
counterpoint would be to investigate if humanity collectively possesses
enough prudence in the first place.<<

I should say there's ample evidence of human prudence in the history of
technological development. The problem is that our vision is often
short-sighted and misguided. We developed nuclear technology for the purpose
of fission bombs in a time of great stress and chaos, without much
forethought about the risks and dangers it would bring to the world. And
here in my own country, where nuclear power is being widely developed and
implemented as a cheaper energy alternative, we have grass-roots
organizations whipping up public anxiety about the possibility of an
accident at the Pickering plant that would incinerate all of Toronto.
Meanwhile, we are still burning coal in Ontario and have a heavily
integrated traditional power grid that makes blackouts and system-wide
failures likely and expensive to fix. When you ask your local anti-nuclear
spokeswoman, who has a degree in sociology and a bee in her bonnet about
local property values, what a viable alternative would be, she has no
answer. Yet I imagine she does appreciate having her lights come on when she
flicks a switch.



>And I think that those really deluded are the "responsible" group.
> Both because progress is far from granted *and* because even if it
> were the idea of steering it would be presumptuous and short-sighted,
> not to mention fundamentally reactionary.<


And what are the alternatives? Suppose you grant what the anti-nuclear
spokespeople won't: that once a technology is invented it can't be
un-invented. Suppose you also grant what Buckminster Fuller suggested: that
technological progress is largely beyond human control in most
circumstances, because it is difficult to predict where the next leap
forward will come from, given the near-limitless number of potential
combinations of technology that exist in theory. Then you're left with only
two alternatives, it seems to me. 1. Open the door, sit back, and hope for
the best. Or 2. At the very least, strive for whatever minimum of control
you can through engineering, even if that turns out to be very little. Even
if it turns out to be absolutely none, you are at least more informed about
the possible scenarios when the whole situation goes south, and your
response may in fact be a little less reactionary and a little more
calculated.

A small and perhaps intellectually weak example, but the only one I can
think of at the moment. Imagine raising a child whose combined IQ levels are
off the charts. He or she is vastly more intelligent than you in pattern
recognition, creative impulse, logic, emotive response, etc. You know that
at some point in their development they are going to be thinking, reacting,
and responding to the world around them in ways that you are simply not
capable of and perhaps can't even imagine. This could be a good thing or a
bad thing. They could read Nietzsche at the age of two, decide that "truth
was only a moral semblance," and turn into a Leopold and Loeb in one body.
They could read Plato or Leo Strauss, decide that the social contract
inhibited effective governance, and set out to change that. They might
logically decide that religious fanaticism was going to destroy the planet
and that the best way to forestall it was to become a jihadist in the name
of science and persecute religious people of all stripes. Or they could be
altruistic and compassionate and come up with creative ways to solve some of
the world's problems -- unlimited energy sources, new ways of producing
food. Whatever.

So what do you do? Just say the hell with it? Let it ride? Or do you at
least try to instill in them some commonly held human values and goals, in
the hope that they will reach a positive outcome in the end when they do set
sail? Most people would agree with the last scenario, I think. If we are
going to create it and foster it, don't we have at least some responsibility
to do our utmost to steer it towards a positive outcome, even if we fail? It
may be hubris to think we can do so, but it would be sheer negligence not to
even try.



On Mon, Nov 29, 2010 at 1:02 AM, Samantha Atkins <sjatkins at mac.com> wrote:

>
> On Nov 28, 2010, at 12:32 PM, Stefano Vaj wrote:
>
> > 2010/11/27 Mike Dougherty <msd001 at gmail.com>:
> >> We are certainly
> >> concerned that genetic engineering (et al.) have the potential to
> produce a
> >> plague that also wipes out humanity but it would be unwise to abandon
> this
> >> medical technology regardless of its potential for curative medicine.
> >>
> >> I was thinking prudence should allay our fears.  Then I imagined the
> >> counterpoint would be to investigate if humanity collectively possesses
> >> enough prudence in the first place.
> >
> > There are people thinking that according to the "coolest", most
> > fashionable thinking,  the alternative would be between those  who are
> > blind to the danger of technological progress, and/or delude
> > themselves that it may be possible to stop it, vs. the enlightened
> > few who are responsibly preoccupied with its "steering".
> >
> > Personally, along traditional transhumanist lines, I think the actual
> > alternative is still between those who are against technological
> > progress vs. those who are in favour.
> >
> > And I think that those really deluded are the "responsible" group.
> > Both because progress is far from granted *and* because even if it
> > were the idea of steering it would be presumptuous and short-sighted,
> > not to mention fundamentally reactionary.
>
> What?  You don't think attempting to maximize the outcomes that ensue is
> worth thinking about at all?   You think it is presumptuous to even bother
> to attempt to predict alternatives and do what we can (which admittedly may
> not be a lot) to make more desirable outcomes more likely?   If you do think
> this are you in the do-nothing camp re technology and how it is deployed in
> the future?  I don't think so judging from your activities but perhaps I am
> mistaken.
>
> - s
>
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>



-- 
"In the end that's all we have: our memories - electrochemical impulses
stored in eight pounds of tissue the consistency of cold porridge." -
Remembrance of the Daleks
