[ExI] Planetary defense
Anders Sandberg
anders at aleph.se
Sat May 7 15:14:15 UTC 2011
Tom Nowell wrote:
> Anders, may I ask your professionally biased opinion on something? I was looking online for a copy of "Global catastrophic risks" by Bostrom and Cirkovic, and amazon indicated the paperback edition comes out next month. Am I better off waiting for the new edition and whatever new information it contains, or will a copy of the first edition be just as good?
>
In my fairly unbiased opinion (since I don't get royalties and actually
don't know much about the new edition), there is no real difference
between them, so I would go for the first edition.
> Well, bioweapons aren't something your independent think tank or study group can do anything more than issue a policy report about -
> actually getting hold of samples of the most deadly viruses and bacteria and then studying them with a view to working on vaccines or medicines is difficult legally, as governments want to control them. Suitable high level biosecurity facilities are expensive, and getting ethics committee approval to do the work may not be easy.
>
Sure, but the problem can be attacked even outside the lab. Focusing on
existing pathogens is a bit like spending a lot of effort tracking a few
known worrisome NEOs - important, but the real threat likely comes from
the unknowns. Setting up smart monitoring systems is doable on the
political-legal-academic-administrative levels. Understanding the threat
on a systemic scale can be very useful: how can we detect, contain and
respond to a new biothreat? What stakeholder groups can be roped in? Can
we invent new lab procedures to fix these problems (and convince funding
bodies to support them)?
> AGI & nanotech are largely unknowns, as they aren't deterministic and we have little data. It's very hard to build public awareness until you can give people a concrete example of the tech and what it can do.
>
And then it might be too late. Public awareness might even be one of the
last priorities for many xrisks. Not because the Masses are stupid and
we Enlightened Technocrats can run things, but simply because 1) before
we have good enough information to make rational decisions, the public
will not have much to contribute anyway (sometimes a big pool of broad
imagination is powerful and makes bugs shallow, sometimes not); 2)
public awareness tends to drive political willingness to make decisions
- if there are no rational decisions to make yet, or worse, if there are
obvious but wrong ones, that is not useful; and 3) public awareness is
very useful if the public can help with managing the xrisk - looking for
weird activity, reporting invasive species, twittering warnings and
coordination - but some risks do not lend themselves to it. Maybe in a
few years we will have a Neighborhood Turing Watch checking local
processing for malicious infestations of code, but we need to understand
how to set it up.
> Wars - people are aware, but the level of compassion fatigue is huge. Given that you can't uninvent modern war (given the tens of millions of AK-47 derivatives in Africa, and that the Lord's Resistance Army wages war by kidnapping children, getting them drunk and giving them machetes), that interventions cost money and the lives of intervening soldiers, and that most of them occur far away from the developed world, it is very hard to get people to care. There are communities working on reducing warfare, but it's a hard task.
>
I doubt managing any xrisk is going to be a simple task. Wars are likely
among the hardest, since they are motivated by deep-seated issues - not
just human emotions but economics, memetics and coordination problems.
But they are also coordinated activities (that is why they are so
dangerous), and that means we can perhaps influence their coordination
too. And since wars tend to make other GCRs and xrisks worse, or
possibly trigger them, they are a crucial part of the anti-xrisk agenda.
Pinker's observation on the reduction of violence is actually quite
cheering. We are making things safer in one sense (the average amplitude
of violence is going down), although the scale limits (once wars were
limited by geography) have disappeared.
> Climate change - the problem here is that the public is aware of conflicting points of view. There are many communities working on it, and stacks of data for people to analyse, but global climate is only semi-deterministic. Most policy changes to intervene in climate change have strong economic implications, the economy is a similarly chaotic system, and people can string together data sets to support whatever course of action they prefer. This confuses the hell out of the public.
>
It confuses the hell out of the (honest) experts too. Being in touch
with the climate community around Oxford has made me realize that 1) the
physical part of the science is pretty solid, 2) the data is messy
anyway and prediction is largely a fool's game, and 3) the all-important
dynamics of the geosystem-human system interaction are not well
understood at all. So most claims about climate and climate policy are
based on faulty inputs. The best approach is likely to push *hard* on
tech development here - we want to get off fossil fuels anyway, solar
power and other forms of local power production are good for resilience,
low-energy computing is needed for uploading, and so on.
Saving the world is no picnic. But it feels pretty rewarding.
--
Anders Sandberg,
Future of Humanity Institute
James Martin 21st Century School
Philosophy Faculty
Oxford University