>>I was thinking prudence should allay our fears. Then I imagined the<br>counterpoint would be to investigate if humanity collectively possesses <div>enough prudence in the first place.<<</div><div><br></div><div>
I should say there's ample evidence of human prudence in the history of technological development. The problem is that our vision is often short-sighted and misguided. We developed nuclear technology for the purpose of fission bombs in a time of great stress and chaos, without much forethought about the risks and dangers it would bring to the world. And here in my own country, where nuclear power is being widely developed and implemented as a cheaper energy alternative, we have grassroots organizations whipping up public anxiety about the possibility of an accident at the Pickering plant that would incinerate all of Toronto. Meanwhile we are still burning coal in Ontario, and we have a heavily integrated traditional power grid that makes blackouts and system-wide failures likely and expensive to fix. When you ask your local anti-nuclear spokeswoman, who has a degree in sociology and a bee in her bonnet about local property values, what would be a viable alternative, she has no answer. Yet I imagine she does appreciate having her lights come on when she flicks a switch.</div>
<div><br></div><div><br></div><div><br></div><div>>And I think that those really deluded are the "responsible" group.<br>> Both because progress is far from granted *and* because even if it<br>> were the idea of steering it would be presumptuous and short-sighted,<br>
> not to mention fundamentally reactionary.<</div><div><br></div><div><br></div><div>And what are the alternatives? If you grant what the anti-nuclear spokespeople won't, that once a technology is invented it can't be un-invented, and what Buckminster Fuller suggested, that technological progress is actually beyond human control in most circumstances because it is difficult to predict where the next leap forward will come from, given the near-limitless number of potential combinations of technologies that exist in theory, then it seems to me you're left with only two options. 1. Open the door, sit back, and hope for the best. Or 2. At the very least strive for whatever minimum of control you can achieve through engineering, even if that turns out to be very little. Even if it turns out to be absolutely none, you are at least better informed about the possible scenarios when the whole situation goes south, and your response may in fact be a little less reactionary and a little more calculated. </div>
<div><br></div><div>A small and perhaps intellectually weak example, but the only one I can think of at the moment: imagine raising a child whose IQ is off the charts. He or she is vastly more intelligent than you in pattern recognition, creative impulse, logic, emotive response, etc. You know that at some point in their development they are going to be thinking, reacting, and responding to the world around them in ways that you are simply not capable of and perhaps can't even imagine. This could be a good thing or a bad thing. They could read Nietzsche at the age of two, decide that "truth was only a moral semblance," and turn into a Leopold and Loeb in one body. They could read Plato or Leo Strauss and decide that the social contract inhibits effective governance, and set out to change that. They might logically decide that religious fanaticism was going to destroy the planet, and that the best way to forestall that was to become a jihadist in the name of science and persecute religious people of all stripes. Or they could be altruistic and compassionate and come up with creative ways to solve some of the world's problems -- unlimited energy sources, new ways of producing food. Whatever. </div>
<div><br></div><div>So what do you do? Just say the hell with it? Let it ride? Or do you at least try to instill in them some commonly held human values and goals, in the hope of a positive outcome in the end when they do set sail? I think most people would agree with the last scenario. If we are going to create it and foster it, don't we have at least some responsibility to do our utmost to steer it towards a positive outcome, even if we fail? It may be hubris to think we can do so, but it would be sheer negligence not even to try. </div>
<div><br></div>
<div><br></div><div><br><br>
<div class="gmail_quote">On Mon, Nov 29, 2010 at 1:02 AM, Samantha Atkins <span dir="ltr"><<a href="mailto:sjatkins@mac.com">sjatkins@mac.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
<div class="im"><br>
On Nov 28, 2010, at 12:32 PM, Stefano Vaj wrote:<br>
<br>
> 2010/11/27 Mike Dougherty <<a href="mailto:msd001@gmail.com">msd001@gmail.com</a>>:<br>
>> We are certainly<br>
>> concerned that genetic engineering (et al.) have the potential to produce a<br>
>> plague that also wipes out humanity but it would be unwise to abandon this<br>
>> medical technology regardless of its potential for curative medicine.<br>
>><br>
>> I was thinking prudence should allay our fears. Then I imagined the<br>
>> counterpoint would be to investigate if humanity collectively possesses<br>
>> enough prudence in the first place.<br>
><br>
> There are people thinking that according to the "coolest", most<br>
> fashionable thinking, the alternative would be between those who are<br>
> blind to the danger of technological progress, and/or delude<br>
> themselves that it may be be possible to stop it, vs. the enlightened<br>
> few who are responsibly preoccupied with its "steering".<br>
><br>
> Personally, along traditional transhumanist lines, I think the actual<br>
> alternative is still between those who are against technological<br>
> progress vs. those who are in favour.<br>
><br>
> And I think that those really deluded are the "responsible" group.<br>
> Both because progress is far from granted *and* because even if it<br>
> were the idea of steering it would be presumptuous and short-sighted,<br>
> not to mention fundamentally reactionary.<br>
<br>
</div>What? You don't think attempting to maximize the outcomes that ensue is worth thinking about at all? You think it is presumptuous to even bother to attempt to predict alternatives and do what we can (which admittedly may not be a lot) to make more desirable outcomes more likely? If you do think this are you in the do-nothing camp re technology and how it is deployed in the future? I don't think so judging from your activities but perhaps I am mistaken.<br>
<font color="#888888"><br>
- s<br>
</font><div><div></div><div class="h5"><br>
_______________________________________________<br>
extropy-chat mailing list<br>
<a href="mailto:extropy-chat@lists.extropy.org">extropy-chat@lists.extropy.org</a><br>
<a href="http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat" target="_blank">http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat</a><br>
</div></div></blockquote></div><br><br clear="all"><br>-- <br><font face="Georgia, Utopia, 'Palatino Linotype', Palatino, serif"><span style="line-height:26px"><span style="font-family:'Lucida Grande', 'Trebuchet MS', Helvetica, Arial, sans-serif;line-height:normal;color:rgb(255, 255, 255)"><span style="margin-top:0px;margin-right:0px;margin-bottom:0px;margin-left:0px;padding-top:0px;padding-right:0px;padding-bottom:0px;padding-left:0px;font-style:italic"><span style="background-color:rgb(0, 0, 0)">"In the end that's all we have: our memories - electrochemical impulses stored in eight pounds of tissue the consistency of cold porridge." </span></span><span style="background-color:rgb(0, 0, 0)">- Remembrance of the Daleks</span></span></span></font><br>
</div>