[ExI] Existential Risks might be underestimated
pharos at gmail.com
Wed May 27 22:06:24 UTC 2015
We May be Systematically Underestimating the Probability of Annihilation
By Phil Torres Posted: May 27, 2015
Phil Torres has an article suggesting that risks might be larger than we expect.
Basically he is saying that unknown unknowns should be given more weight.
If history has taught us anything about purposive human behaviour,
it’s that intended causes proliferate unintended effects. This leads
to an absolutely crucial point: as advanced technologies become more
and more powerful, we should expect the unintended consequences they
spawn to become increasingly devastating in proportion. In other
words, the future will almost certainly be populated by a growing
number of big picture hazards that were not intended by the “original
plan,” as it were, and which are significant enough to threaten
humanity with large-scale disasters, or even extinction.
He lists several categories of overlooked risk:
- Phenomena from nature that we are currently ignorant of, and which could potentially bring about a catastrophe.
- Currently unimagined risks posed by future, not-yet-conceived-of technologies.
- Known risks combined in various ways to produce complex scenarios, e.g. a 'domino effect' where one event leads to another, or two disasters occurring at the same time and together pushing past a critical threshold.
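The point about risks combining is worth making concrete. A minimal sketch, using entirely hypothetical numbers: even if each individual risk has a small probability, the chance that at least one of them materialises grows quickly with the number of independent risks.

```python
# Illustrative only -- the per-risk probability and risk counts below are
# assumptions chosen for the example, not estimates from the article.

def p_any(p_each: float, n: int) -> float:
    """Probability that at least one of n independent risks occurs,
    each with individual probability p_each."""
    return 1 - (1 - p_each) ** n

# With a hypothetical 0.5% chance per risk, watch the aggregate climb:
for n in (1, 10, 50, 100):
    print(f"{n:3d} risks at 0.5% each -> P(at least one) = {p_any(0.005, n):.1%}")
```

This ignores the 'domino effect' entirely: if risks are positively correlated (one disaster makes another more likely), the independent-risk figure above is an underestimate, which is consistent with the article's thesis.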
I'm not sure what we are expected to do about 'unknown unknowns', except keep looking over our shoulder to see if something is sneaking up on us.