[ExI] is a FTL drive a dream without any physics to back it up?

Anders Sandberg anders at aleph.se
Sun Dec 18 08:54:07 UTC 2011


On 2011-12-18 02:54, Kelly Anderson wrote:
> On Fri, Dec 16, 2011 at 4:30 AM, Eugen Leitl <eugen at leitl.org> wrote:
>> So in order to get out into space we must prevent collapse.
>> This should be our first and foremost priority as a species.
>
> I don't think this is the only choice Eugen. We can collapse in such a
> way that most of our civilization's scientific knowledge survives, and
> is pushed forward by the (hopefully) wiser survivors of said collapse.

The problem with collapses is that they are likely triggers of 
existential risk, and that low-tech states might be very persistent. We 
spent hundreds of thousands of years as hunter-gatherers, and for the 
many thousands of years we were agriculturalists, technological 
progress was fairly spotty. In low-tech states the species is much more 
vulnerable to exogenous existential risks like climate, supervolcanoes 
and disease.

But it is the collapse itself that is really risky. During a collapse 
dangerous technology is still available while people and groups are 
desperate. That means we might see the release of nuclear, biological 
or other weapons with existential threat potential. Combined with 
whatever is causing the collapse, this might wipe out stored knowledge 
and give an extra push beyond the brink.


> I hope this is the case, as collapse seems rather difficult to avoid
> on the current track. Perhaps this could be a side effect of having
> watched both 'Contagion' and 'Too Big to Fail' within the last week...
> LOL... I'm usually a little more optimistic.

That is just the availability heuristic talking. :-)

The fundamental paradox is that the kinds of technology that would help 
us reduce existential risk a lot - molecular manufacturing, AI, brain 
emulation - also pose existential risks. Powerful tools are risky. So 
depending on where you think the balance lies, you will want to make 
some of these happen before the others.


-- 
Anders Sandberg
Future of Humanity Institute
Oxford University


