[ExI] Hard Takeoff

John Grigg possiblepaths2050 at gmail.com
Sun Dec 5 13:04:35 UTC 2010


Stefano wrote:
Hostility, indifference, short-termism, insufficient or non-existent
investments in fundamental research, inertia, educational and cultural
decline, slowing down of technological acceleration, inability to
overcome dominant paradigms, de-industrialisation of the west,
globalisation, lack of vision, epidemic neoluddism...

Thus, a static Brave New World scenario would seem at least possible,
or even probable.
>>>

But the American military-industrial complex is a juggernaut of
research programs for those subjects it considers within its
defined interests.  And so I see life extension/biotech generally
being neglected (except for regeneration and various means of creating
superior warriors), while AGI, in the form of robotics, cyberwarfare,
etc., is treated as a vital interest that will be heavily funded.

I have read that nearly 40% of the U.S. military budget is classified,
with only a very few people knowing what the money is actually spent
on.  I have a feeling some very "transhumanist" research programs are
going on, with budgets in the many millions or even billions, that we
may not hear about for some time...

John


On 12/5/10, Stefano Vaj <stefano.vaj at gmail.com> wrote:
> On 29 November 2010 05:59, Samantha Atkins <sjatkins at mac.com> wrote:
>> Please make your informed argument why no takeoff is most likely if you
>> believe this is the case.
>
> Hostility, indifference, short-termism, insufficient or non-existent
> investments in fundamental research, inertia, educational and cultural
> decline, slowing down of technological acceleration, inability to
> overcome dominant paradigms, de-industrialisation of the west,
> globalisation, lack of vision, epidemic neoluddism...
>
> Thus, a static Brave New World scenario would seem at least possible,
> or even probable.
>
> In any event, if you want to achieve something, it is always better to
> be pessimistic about the difficulties you will have to face. If you are
> lucky and things end up being better or easier than imagined, the
> desired outcome would only be that much more assured, wouldn't it?
>
> --
> Stefano Vaj
> _______________________________________________
> extropy-chat mailing list
> extropy-chat at lists.extropy.org
> http://lists.extropy.org/mailman/listinfo.cgi/extropy-chat
>
