[ExI] Isn't Bostrom seriously bordering on the reactionary?

Giulio Prisco giulio at gmail.com
Thu Jun 16 08:01:18 UTC 2011


Anders, I think in this club everyone can discuss what they consider
important, and I agree that "trying to reduce the fairly substantial
risks and uncertainties associated with technologies such as biotech,
nanotech, cognotech, AI and global surveillance" is important.

But the problem is that some repentant ex-transhumanists seem _only_
interested in this, and they discuss it in a way that frankly makes
them hardly distinguishable from Luddites. After reading Nick's paper
I can easily imagine him supporting a strict worldwide ban on emerging
tech in pure precautionary-principle zealot, nanny-state bureaucrat
style.

And such a ban is, in my opinion, the biggest and most plausible
existential risk. I believe our species is doomed if we don't
fast-forward to our next evolutionary phase.

On Wed, Jun 15, 2011 at 11:26 PM, Anders Sandberg <anders at aleph.se> wrote:
> Stefano Vaj wrote:
>>
>> I think that as *citizens* it is only plausible and reasonable to have a
>> complex set of priorities, safety being amongst them (even though, when
>> naive reference is made to "mankind", some deconstruction of the concept
>> is IMHO in order).
>>
>> OTOH, as a *transhumanist*, I think that it is not the role of a club for
>> the promotion of chess playing to discuss the benefits of outdoor sports,
>> or of the counsel for the defence to present evidence against their
>> clients, or of a lobbyist to keep in mind some higher good, or of a trade
>> union to find the ideal balance of workers' and employers' interests.
>
> You know, I'd rather discuss what I consider important than keep silent in
> order to maintain my club membership. If trying to reduce the fairly
> substantial risks and uncertainties associated with technologies such as
> biotech, nanotech, cognotech, AI and global surveillance is incompatible
> with being transhumanist, then I think transhumanism has a serious bias and
> credibility problem.
>
> The FHI informal office guess is a ~12% chance of extinction* before 2100.
> That makes it a bigger personal risk of death than stroke for most of us.
>
>
> * I.e. no continuation of current human civilization or personal identity.
> Weird posthumans do not count as extinction; a universe converted into
> paperclips does.
>
> --
> Anders Sandberg,
> Future of Humanity Institute, James Martin 21st Century School,
> Philosophy Faculty, Oxford University
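
As a quick sanity check on the stroke comparison above, here is a minimal
back-of-envelope sketch. It assumes (these figures are not from the thread)
that stroke accounts for roughly 11% of deaths worldwide, in line with
approximate WHO estimates, and that a pre-2100 extinction event would kill
essentially everyone alive today, so the ~12% maps almost directly onto a
personal cause-of-death probability:

    # Back-of-envelope comparison of two personal death risks.
    # The stroke share is an assumed approximation (~11% of worldwide
    # deaths, rough WHO figure), not a number from the original post.
    p_extinction_by_2100 = 0.12  # FHI informal office guess, from the post
    p_death_by_stroke = 0.11     # assumed lifetime P(death by stroke)

    # Simplification: if extinction happens before 2100, everyone then
    # alive dies of it, and most people alive today would not outlive
    # 2100 anyway, so P(die in extinction event) ~ P(extinction by 2100).
    p_death_by_extinction = p_extinction_by_2100

    print(f"P(die in extinction event) ~ {p_death_by_extinction:.0%}")
    print(f"P(die of stroke)           ~ {p_death_by_stroke:.0%}")
    print("Extinction edges out stroke:",
          p_death_by_extinction > p_death_by_stroke)

Under those assumptions the extinction figure comes out just ahead of the
stroke figure, which is all the quoted claim requires.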