[ExI] Consequentialist world improvement
Anders Sandberg
anders at aleph.se
Sat Oct 6 21:35:06 UTC 2012
On 06/10/2012 16:17, Tomaz Kristan wrote:
> > If you want to reduce death tolls, focus on self-driving cars.
>
> Instead of answering terror attacks, just mend you cars?
Sounds eminently sensible, except maybe to the people trapped in the
riot thread (very suitable name, I think).
More seriously, Charlie makes a good point: if we want to make the world
better, it might be worth prioritizing fixing the stuff that makes it
worse according to the damage it actually does. Toby Ord and I have
been chatting quite a bit about this (I'll see if he has a writeup of
his thoughtful analysis; this is my version based on what I remember).
==Death==
In terms of death (~57 million people per year), the big causes are
cardiovascular disease (29%), infectious and parasitic diseases (23%)
and cancer (12%). At least the first and last are to a sizeable degree
caused or worsened by ageing, which is a massive hidden problem. It has
been argued that malnutrition is similarly indirectly involved in 15-60%
of the total number of deaths: often not the direct cause, but weakening
people so they become vulnerable to other risks. Anything that makes a
dent in these saves lives on a scale that is simply staggering; any
threat to our ability to treat them (like resistance to antibiotics or
anthelmintics) is correspondingly bad.
Unintentional injuries are responsible for 6% of deaths, just behind
respiratory diseases at 6.5%. Road traffic alone accounts for 2% of
all deaths: making cars even 1% safer would save 11,400 lives per year. If
everybody reached Swedish safety (2.9 deaths per 100,000 people per
year) it would save around 460,000 lives per year - one Antwerp per year.
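The arithmetic behind the 11,400 figure is simple enough to check (a quick sketch; the 57 million annual deaths and 2% road-traffic share are the figures given above):

```python
# Figures from the post: ~57 million deaths/year, 2% from road traffic.
TOTAL_DEATHS_PER_YEAR = 57_000_000
ROAD_SHARE = 0.02

road_deaths = TOTAL_DEATHS_PER_YEAR * ROAD_SHARE  # ~1.14 million/year

# Making cars 1% safer saves 1% of those deaths.
saved_by_1pct = road_deaths * 0.01
print(int(saved_by_1pct))  # 11400
```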
Now, intentional injuries are responsible for 2.8% of all deaths. Of
these, suicide accounts for 1.53% of the total death rate, violence
0.98%, and war 0.3%. Yes, all wars killed about the same number of people
as were killed by meningitis, and slightly more than the people who died
of syphilis. So in terms of absolute numbers we might be much better off
improving antibiotic treatments and suicide hotlines than trying to stop
the wars. And terrorism is so small that it doesn't really show up: even
the highest estimates put the median fatalities per year in the low
thousands.
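Turning those shares into absolute numbers makes the comparison concrete (a sketch using the percentages quoted above):

```python
# Annual death shares as quoted in the post.
TOTAL_DEATHS = 57_000_000
shares = {"suicide": 0.0153, "violence": 0.0098, "war": 0.003}

for cause, share in shares.items():
    print(f"{cause}: ~{int(TOTAL_DEATHS * share):,} deaths/year")
# War comes out around 171,000/year -- roughly the meningitis toll.
```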
So in terms of deaths, fixing (or even denting) ageing, malnutrition,
infectious diseases and lifestyle causes is a far more important
activity than winning wars or stopping terrorists. Hypertension,
tobacco, STDs, alcohol, indoor air pollution and sanitation are all far,
far more pressing in terms of saving lives. If we had a choice between
*ending all wars in the world* and fixing indoor air pollution the
rational choice would be to fix those smoky stoves: they kill nine times
more people.
==Existential risk==
There is of course more to improving the world than just saving lives.
First there is the issue of outbreak distributions: most wars are local
and small affairs, but some become global. Same thing for pandemic
respiratory disease. We actually do need to worry about them more than
their median sizes suggest (and again, influenza totally dominates
all wars). Incidentally, the exponent for the power law distribution of
terrorism is safely negative at -2.5, so it is less of a problem than
ordinary wars, whose exponent is -1.41.
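Why the exponent matters: treating the quoted numbers as tail (survival-function) exponents, P(X > x) ~ x^exponent, the war exponent of -1.41 gives a far fatter tail than terrorism's -2.5, so extreme events are orders of magnitude more likely. A sketch (the functional form is the standard continuous power-law tail; the exponents are the ones quoted above):

```python
def tail_prob(x, exponent, xmin=1.0):
    """Survival function of a pure power-law tail:
    P(X > x) = (x / xmin) ** exponent, for x >= xmin (exponent < 0)."""
    return (x / xmin) ** exponent

# Exponents as quoted in the post, read as tail exponents.
WARS, TERRORISM = -1.41, -2.5

# Probability of an event 1000x the minimum size under each tail:
ratio = tail_prob(1000, WARS) / tail_prob(1000, TERRORISM)
print(f"wars ~{ratio:.0f}x more likely to reach 1000x the minimum size")
```

The fatter war tail is exactly why median event sizes understate the risk: most of the expected damage sits in the rare huge events.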
There are reasons to think that existential risk should be weighed
extremely strongly: even a tiny risk that we lose our entire future is
much worse than many standard risks (since the future could be
inconceivably grand and involve very large numbers of people, cf.
http://www.nickbostrom.com/astronomical/waste.html ). This has convinced
me that fixing the safety of governments (democides were bigger
killers than wars in the 20th century and seem to carry most of the tail
risk, especially once you start thinking about nukes) needs to be boosted a
lot. It is likely a far more pressing problem than climate change, and
quite possibly (depending on how you analyse xrisk weighting) beats
disease.
How to analyse xrisk, especially future risks, in this kind of framework
is a big part of our ongoing research at FHI.
==Happiness==
If instead of lives lost we look at the impact on human stress and
happiness, wars (and violence in general) look worse: they traumatize
people, and terrorism by its nature is all about causing terror. But
again, they happen to a small set of people. So in terms of happiness it
might be more important to make the bulk of people happier. Life
satisfaction correlates to 0.7 with health and 0.6 with wealth and basic
education. Boost those a bit, and it outweighs the horrors of war.
In fact, when looking at the value of better lives, it looks like an
enhancement in life quality might be worth much more than fixing a lot
of the deaths discussed above: make everybody's life 1% better, and it
corresponds to more quality-adjusted life years than are lost to death
every year! So improving our wellbeing might actually matter far, far
more than many diseases. Maybe we ought to spend more resources on
applied hedonism research than trying to cure Alzheimer's.
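That rough comparison can be spelled out (a sketch; the 7-billion world population is my assumption for 2012, and it follows the post in weighing QALYs gained per year directly against the annual death count):

```python
POPULATION = 7_000_000_000   # assumed 2012 world population
TOTAL_DEATHS = 57_000_000    # annual deaths, per the post

# Everybody's life 1% better for one year = 0.01 QALY per person.
qalys_gained = POPULATION * 0.01  # 70 million QALYs/year

# The post's comparison: that exceeds the annual number of deaths.
print(qalys_gained > TOTAL_DEATHS)  # True
```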
==Morality==
The real reason people focus so much on terrorism is of course the
moral outrage. Somebody is *responsible*, people are angry and want
revenge. Same thing for wars. And the horror tends to strike certain
people: my kind of global calculations might make sense on the global
scale, but most of us think that the people suffering the worst have a
higher priority. While it might make more utilitarian sense to make
everybody 1% happier rather than stop the carnage in Syria, I suspect
most people would say morality is on the other side (exactly why is a
matter of some interesting ethical debate, of course). Deontologists
might think we have moral duties we must implement no matter what the
cost. I disagree: burning villages in order to save them doesn't make
sense. It makes sense to risk lives in order to save lives, both
directly and indirectly (by reducing future conflicts).
But this requires proportionality: going to war in order to avenge X
deaths by causing 10X deaths is not going to be sustainable or moral.
The total moral weight of one unjust death might be high, but it is
finite. Given the typical civilian casualty ratio of 10:1, any war will
also almost certainly produce far more collateral unjust deaths than the
justified deaths of enemy soldiers: avenging X deaths by killing exactly
X enemies will still lead to around 10X unjust deaths. So achieving
proportionality is very, very hard (and the Just War Doctrine is broken
anyway, according to the war ethicists I talk to). This means that if
you want to leave the straightforward utilitarian approach and add some
moral/outrage weighting, you risk making the problem far worse by your
own account. In many cases it might indeed be the moral thing to turn
the other cheek... ideally armoured and barbed with suitable sanctions.
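The proportionality arithmetic above can be sketched (the 10:1 civilian casualty ratio is the typical figure cited in the paragraph; the function name is mine):

```python
def collateral_deaths(enemies_killed, civilian_ratio=10):
    """Collateral civilian deaths incurred while killing a given
    number of enemy soldiers, at a typical 10:1 casualty ratio."""
    return enemies_killed * civilian_ratio

# Avenging 100 deaths by killing exactly 100 enemy soldiers:
print(collateral_deaths(100))  # 1000 -- ten times the original wrong
```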
==Conclusion==
To sum up, this approach of just looking at consequences and ignoring
who is who is of course a bit too cold for most people. Most people have
Tetlockian sacred values and get very riled up if somebody thinks about
cost-effectiveness in terrorism fighting (typical US bugaboo) or
development (typical warmhearted donor bugaboo) or healthcare (typical
European bugaboo). But if we did, we would make the world a far better
place.
Bring on the robot cars and happiness pills!
--
Anders Sandberg,
Future of Humanity Institute
Oxford Martin School
Faculty of Philosophy
Oxford University