<div class="gmail_quote">On 18 December 2011 09:54, Anders Sandberg <span dir="ltr"><<a href="mailto:anders@aleph.se">anders@aleph.se</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
The problem with collapses is that they are likely triggers of existential risk, and that low-tech states might be very persistent. We spent hundreds of thousands of years as hunter-gatherers, and for the many thousands of years we were agriculturalists, technological progress was fairly spotty. During low-tech states the species is much more vulnerable to exogenous existential risks like climate, supervolcanoes and disease.<br>
</blockquote><div><br>I agree. OTOH, I am not sure about the persistence of low-tech states. NeoLuddites themselves are "pessimistic" about the fact that any state barely compatible with survival allows for a rapid bounce-back, if not in terms of wealth, at least in terms of access to information and know-how. Even those who, e.g., adhere to a strictly cyclical vision of history recognise that the memory of past cycles does not really get lost, and that it influences subsequent cycles.<br>
<br></div><blockquote class="gmail_quote" style="margin:0pt 0pt 0pt 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">
The fundamental paradox is that the kind of technology that would help us reduce existential risk a lot - molecular manufacturing, AI, brain emulation - also poses existential risks. Powerful tools are risky. So depending on where you think the balance lies, you will want to make some of these happen before the other ones.<span class="HOEnZb"><font color="#888888"></font></span><br clear="all">
</blockquote></div><br>The question nevertheless remains: dangerous for what?<br><br>-- <br>Stefano Vaj<br>