[ExI] Isn't Bostrom seriously bordering on the reactionary?
Richard Loosemore
rpwl at lightlink.com
Thu Jun 16 17:35:15 UTC 2011
Anders Sandberg wrote:
> Personally I do think that technological stagnation and attempts to
> control many technologies are major threats to our survival and
> wellbeing. But that cannot be defended by merely saying it - it needs to
> be investigated, analysed and tested. Furthermore, there does not appear
> to exist any a priori reason to think that all technologies are alike.
> It might be very rational to slow some down (if it is possible) while
> trying to speed others up. For example, at present most of our
> conclusions suggest that an uploading-driven singularity is more
> survivable than an AI-driven singularity.
Most of whose conclusions? Based on what reasoning?
Personally, I have come to exactly the opposite conclusion, based on the
potential controllability of AGI motivation and the complete
uncontrollability of the motivation of an unmodified uploaded human mind.
I think this is a very good example of the failure of those charged
with working on x-risk issues: a strong conclusion, based (in some
cases) on tissue-thin or personally biased reasoning.
Richard Loosemore