[ExI] [Risk] Uncertainty rules

x at extropica.org x at extropica.org
Thu Nov 29 15:24:59 UTC 2007


Uncertainty rules - 24 November 2007 - Print Article - New Scientist
<http://www.newscientist.com/article.ns?id=mg19626315.400&print=true>

[Highlighting the wisdom of the Proactionary Principle without using the
term.]

  FINDING a way to live on our planet without destroying or using up its
limited resources is essential to humanity's survival. It is one of our
toughest challenges, as it requires a combination of science with economics,
law and policy. This weekend, international leaders and policy-makers will
meet at a once-a-decade conference in Ahmedabad,
India <http://www.tbilisiplus30.org/index.htm>,
to take stock of progress in sustainable development and set new goals for
the next 10 years.

Before moving forward, however, we need policies that stand on firm
foundations. One of the key pillars of sustainable policy is the
precautionary principle, a concept adopted by the Rio Earth Summit in
1992 <http://www.un.org/geninfo/bp/enviro.html>.
In essence, it states that if you want to do something that might harm the
environment, and the science relating to the risk is uncertain, then you
must take care.

A historic conference on the precautionary
principle <http://www.gdrc.org/u-gov/precaution-3.html> at Wingspread,
Wisconsin, in 1998, introduced the idea of a "causal link"
between activities and harm. The idea was adopted by the European Union in
2000 and has since become pivotal in policy discussions relating to the
principle. In May this year, the World Conservation Union (IUCN) proposed
guidelines <http://www.iucn.org/themes/law/pdfdocuments/LN250507_PPGuidelines.pdf>
that differentiate between situations where the science relating to a
potential threat is uncertain and those where it is relatively certain. They define
"relatively certain" as meaning a causal link can be scientifically
established.

This might all sound reasonable, but there is a philosophical problem here.
In the 1700s, the philosopher David
Hume <http://www.en.wikipedia.org/wiki/David_Hume> showed that
presenting science this way is fundamentally flawed. A simple
example: how do you know that the sun will rise tomorrow? Well, because you
have seen it rise on hundreds of mornings; because it has always done so;
because day always follows night. But Hume showed that such a conclusion is
derived from habit, not logic. No matter how many events we observe, there
can always be an exception - and we can never say that because night ends,
day must follow.

No one loses sleep worrying whether the sun will rise tomorrow; we strongly
expect it will, and we plan our activities accordingly. But such habits are
dangerous in science, and it is precisely through questioning them that our
knowledge progresses. Newton's theory of gravity seemed to have perfect
predictive power until Einstein showed it to be flawed. Since then,
Einstein's theory has been found wanting, and so it goes on. As Karl
Popper <http://www.en.wikipedia.org/wiki/Karl_Popper> said: "There is no
truth, only progress."

The scientific method was designed to take this into account. Instead of
trying to establish facts that are definitely true, scientists look for
results that disprove their hypotheses. Seen in this light, science can
never deliver what the new interpretation of the precautionary principle
promises.

This may seem an obscure point, but it has profound practical implications.
How can we logically apply the principle to determine whether there is a
causal link between human activities and damage to natural resources when by
definition the science is uncertain? When policy-makers ask for such a link,
scientists cannot give an honest answer. Management under these terms is
bound to lead to muddied, incoherent policies, with some activities not
treated with the caution they deserve, and vice versa.

Further, by overstating what science can achieve and ignoring its underlying
principles, we scientists become complacent. We dull science's cutting edge
because we allow our current ideas more permanence than they merit.

The precautionary principle has value: we are now more cautious about
undertaking activities that may harm ourselves or our environment. But we
need philosophy too. Science is not flawless or limitless, not least because
we humans, with our limited perceptions, are its masters. With issues such
as global warming we stand on the brink of an abyss. We must decide as
wisely as we can how to spend our time and money to ensure a sustainable
future. If we don't undertake research with a clear sense of our limitations
and possibilities, and if policy makes promises it can't keep, both
scientific understanding and global management will suffer.

The best way forward is to remove the notion of cause and effect from
policy. Both Hume and Popper advocated the use of probability theory to
ascribe degrees of belief, instead of searching for scientific certainty.
Rather than trying to establish causation, policy-makers could introduce a
scale of increasingly strict preventive measures that depend on the strength
of the evidence that some harmful effect will occur. Establishing such
probabilities is within the scope of science, and the resulting policy would
be clearer and more achievable.
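
[Not part of the article, just a sketch of the probability-scale idea it
describes: the Python below uses Laplace's rule of succession to turn
repeated observations into a degree of belief, then maps an estimated
probability of harm onto a hypothetical scale of increasingly strict
preventive tiers. The thresholds and tier names are invented for
illustration.]

def rule_of_succession(occurrences: int, trials: int) -> float:
    """Laplace's estimate of the probability that the event occurs next time."""
    return (occurrences + 1) / (trials + 2)

# Hume's sunrise: thousands of observed sunrises yield a very high degree
# of belief, but never certainty.
p_sunrise = rule_of_succession(10_000, 10_000)  # ~0.9999

# Hypothetical policy scale: stricter measures as evidence of harm grows.
PREVENTIVE_TIERS = [
    (0.10, "monitor only"),
    (0.30, "require further study before expansion"),
    (0.60, "restrict the activity and mandate mitigation"),
    (1.00, "suspend the activity pending review"),
]

def preventive_measure(p_harm: float) -> str:
    """Return the first tier whose upper bound covers the estimated probability."""
    for upper_bound, measure in PREVENTIVE_TIERS:
        if p_harm <= upper_bound:
            return measure
    return PREVENTIVE_TIERS[-1][1]

print(f"Degree of belief that the sun rises tomorrow: {p_sunrise:.4f}")
for p in (0.05, 0.25, 0.50, 0.90):
    print(f"Estimated probability of harm {p:.2f}: {preventive_measure(p)}")

[Run as-is, it prints a near-certain but never certain belief for tomorrow's
sunrise and a graded preventive measure for each evidence level.]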

The problem arises from a deeper issue within science itself, so the change
must begin there. Scientists must become more aware of the philosophy
underpinning our attempts to understand the world, and make clear to
policy-makers and the public what science can and can't do. Our future
depends on it.