[ExI] Isn't Bostrom seriously bordering on the reactionary?
hkeithhenson at gmail.com
Tue Jun 14 23:16:55 UTC 2011
2011/6/14 Stefano Vaj <stefano.vaj at gmail.com>:
> "Perfection Is Not A Useful Concept"
> by Nick Bostrom — 13.06.2011
> Nick Bostrom directs the Future of Humanity Institute at Oxford
> University. He talked with Martin Eiermann about existential risks,
> genetic enhancements and the importance of ethical discourses about
> technological progress.
> The European: Where might those risks arise from?
> Bostrom: They could be risks that arise from nature–like asteroids or
> volcanic eruptions–or risks that arise from human activity. All the
> important risks fall into the latter category, they are anthropogenic.
> More specifically, the biggest ones will arise from future
> technological breakthroughs, such as advanced artificial intelligence
> or advanced forms of nanotechnology that could lead to new weapons
> systems. There also might be threats from biotechnology or from new
> forms of surveillance technology and mind control that might enable a
> system of global totalitarian rule. And there will also be risks that
> we haven’t yet thought of.
As usual, Bostrom is years behind the leading edge, taking credit
for things the SF authors thought up decades ago. His complaints
brought to mind this cartoon caption:
"As your attorney I advise you to slow down. I can't keep up man."
I have been in the thick of this stuff since the mid-1970s. It's
my opinion that the human species, _as we know it today_, will be
extinct not much later than mid-century. But unlike any other
extinction, a lot of us, perhaps most, may exist in a conscious form
right through the extinction.
As to speeding it up or slowing it down, I doubt there is much that
Nick or anyone else can do to affect the course of technology.
I only hope it is fast enough to prevent a massive die-off.