[extropy-chat] The Proactionary Principle: comments encouraged on almost-final version

Adrian Tymes wingcat at pacbell.net
Fri Nov 11 08:04:35 UTC 2005


Having delved into the minds of certain types of folk perhaps a bit
too much, let me see how I could twist and warp this against its intent
- with the disclaimer that I'm an amateur compared to the types who
will try just that in the real world...

--- Max More <max at maxmore.com> wrote:
> 1.      Guard the Freedom to Innovate: Our freedom to innovate 
> technologically is valuable to humanity. The burden of proof 
> therefore belongs to those who propose restrictive measures. All 
> proposed measures should be closely scrutinized.

So...measures to _restrict_ my freedom to sue technology developers on
made-up whims should be harder to pass too?

(Therefore suggest: "...those who propose measures to restrict new
technologies.")

> 2.      Use Objective Methods: Use a decision process that is 
> objective, structured, and explicit. Evaluate risks and generate 
> forecasts according to available science, not emotionally shaped 
> perceptions; use explicit forecasting processes; fully disclose the 
> forecasting procedure; ensure that the information and decision 
> procedures are objective; rigorously structure the inputs to the 
> forecasting procedure; reduce biases by selecting disinterested 
> experts, by using the devil's advocate procedure with judgmental 
> methods, and by using auditing procedures such as review panels.

"We believe that history will repeat, and that humans will still be
fated to die.  We predict that the elimination of death will, given the
provably finite resources of Earth (if we forget to account for greater
resource collection/production that more hands can result in), lead to
great suffering."

You might want to put in something about paying attention to *all* of
the data, rather than selecting the data that the forecasts use based
on convenience or to shape the intended outcome.  (Number 3, "Be
Comprehensive", speaks to considering the full set of reasonable
alternative actions, but not to considering all the data.)
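
If it helps, here's a toy Python sketch of the selection effect; every
number in it is invented, purely to illustrate the point:

    # Toy illustration: a "forecast" built on convenient data points
    # versus one built on the full record.  All numbers are invented.
    readings = [2.1, 2.0, 5.8, 2.2, 6.1, 1.9, 5.9, 2.0]
    convenient = [r for r in readings if r < 3]   # keep only what fits

    def forecast(data):
        return sum(data) / len(data)   # naive mean as the "forecast"

    print(forecast(convenient))   # 2.04: "nothing to worry about"
    print(forecast(readings))     # 3.50: a rather different picture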

> 3.      Be Comprehensive: Consider all reasonable alternative 
> actions, including no action. Estimate the opportunities lost by 
> abandoning a technology, and take into account the costs and risks of
> substituting other credible options. When making these estimates, use
> systems thinking to carefully consider not only concentrated and 
> immediate effects, but also widely distributed and follow-on effects,
> as well as the interaction of the factor under consideration with 
> other factors.

"But have you done a full and complete analysis of what happens if we
pay the clean-up workers $9.99 an hour rather than $10?  What about
$10.01?  How do you know it'll be an insignificant effect if you
haven't analyzed it?  No, I'm afraid we cannot consider your proposal
without a full and detailed study of each wage level from $5 to $15,
and include the half-cent-per-hour variations too!"

You might want something about being comprehensive without descending
into analysis paralysis.
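
The arithmetic behind my caricature shows how quickly this gets absurd.
A quick Python sketch (the effort-per-level figure is a wild guess,
purely for scale):

    # Wage levels demanded by the parody above: every half-cent step
    # from $5.00 to $15.00.  The effort-per-level figure is invented.
    low_cents, high_cents, step_cents = 500, 1500, 0.5
    levels = int((high_cents - low_cents) / step_cents) + 1
    print(levels)         # 2001 separate wage levels to "analyze"

    weeks_per_level = 1   # wild guess, purely for scale
    print(levels * weeks_per_level / 52.0)   # ~38 analyst-years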

> 4.      Be Open: Take into account the interests of all potentially 
> affected parties, and keep the process open to input from those
> parties.

"But...but...what about the fuzzy creatures?  Who'll speak for the
fuzzy creatures who we know in our hearts hate all development, just
like we do?"

-or-

"How dare you suggest that my ethnic cleansing campaign is immoral?!?
The people of my country will not stand your imperialism.  Even if they
are currently attempting yet another revolution to overthrow my regime
and bring in your foreign ways."

Suggest something like: "...open to direct input from those parties.
If they are unable to speak for themselves, make sure that anyone who
claims to be their representative actually represents them."  There's
probably a better way to say it.

> 5.      Simplify: Use methods that are no more complex than necessary

You simplified out the period at the end of this one. ;)

On a more substantive note, "nothing" is often considered the simplest
possible method, including neoluddite know-nothing do-nothing
change-nothing.  Perhaps instead: "Use methods that are no more complex
than necessary, while still following the other principles."

> 6.      Prioritize and Triage: When choosing among measures to 
> ameliorate unwanted side effects, prioritize decision criteria as 
> follows: (a) Give priority to risks to human and other intelligent 
> life over risks to other species; (b) give non-lethal threats to 
> human health priority over threats limited to the environment (within
> reasonable limits); (c) give priority to immediate threats over 
> distant threats; (d) give priority to ameliorating known and proven 
> threats to human health and environmental quality over hypothetical 
> risks; (e) prefer the measure with the highest expectation value by 
> giving priority to more certain over less certain threats, and to 
> irreversible or persistent impacts over transient impacts.

"Nuclear war is a potentially immediate, proven risk to all human life
on the planet.  It won't be a threat after we do it, so let's get it
over with."

-or-

"Nuclear war is a potentially immediate, proven risk to all human life
on the planet.  So it is our duty to go destroy all nuclear weapons
right now, rather than to clean up this oil spill we're standing right
next to."

-or-

"I can guarantee the immortality of every member of this funding
committee and everyone you care about, at least those who live long
enough for me to complete my work.  You're all expected to die within 5
years, right?  No, I was just wondering.  I'll need 10 years of
guaranteed funding.  Yes, I know people call my work 'crackpot' and say
it would never work, but who else is promising what I promise?  Oh,
this session is sealed, so you're the only ones who'll know I promised
a schedule, right?"

Suggest: "(a) Give priority to reducing or eliminating risks...", and
similar for b and c.  Also suggest: "(e) prefer the measure with the
highest expectation value by giving priority to more certain over less
certain threats, to irreversible or persistent impacts over transient
impacts, and to proposals that are more likely to actually be
accomplished with the requested resources."
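
For what it's worth, the expectation-value test in (e), with the
feasibility factor I'm suggesting folded in, is just arithmetic.  A toy
Python sketch, every figure below invented for illustration:

    # Toy triage by expectation value: expected harm averted, weighted
    # by how certain the threat is and how likely the measure is to be
    # accomplished at all.  All figures are invented.
    measures = [
        # (name, threat probability, severity 0-10, certainty 0-1,
        #  feasibility 0-1)
        ("clean up the oil spill",    0.9,  6, 0.95, 0.90),
        ("fund crackpot immortality", 0.5, 10, 0.10, 0.02),
    ]

    def expectation(prob, severity, certainty, feasibility):
        return prob * severity * certainty * feasibility

    for name, *params in sorted(measures,
                                key=lambda m: expectation(*m[1:]),
                                reverse=True):
        print(name, round(expectation(*params), 3))
    # clean up the oil spill 4.617
    # fund crackpot immortality 0.01

The crackpot loses on feasibility, which is exactly what the extra
clause is meant to capture.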

> 7.      Apply Measures Proportionally: Consider restrictive measures 
> only if the potential impact of an activity has both significant 
> probability and severity. In such cases, if the activity also 
> generates benefits, discount the impacts according to the feasibility
> of adapting to the adverse effects. If measures to limit 
> technological advance do appear justified, ensure that the extent of 
> those measures is proportionate to the extent of the probable
> effects.

"OMGZ SOME NANOSTUFF IS TOXIC SO WE'VE GOT TO BAN EVERY KIND OF
NANOTECH!!!111!!111oneoneone" - rough paraphrase of certain calls that
were made approximately two years ago, in response to a certain study.

Suggest: "If measures to limit technological advance do appear
justified, ensure that the extent of those measures is proportionate to
the extent of the probable effects, and limited to the specific
technologies which justified the measure, rather than affecting
related technologies which do not share the negative impact in
question."  Also suggest: "...only if the potential negative impact..."
since, in general, "protecting" against positive impact is rarely
justified if it truly is a positive impact (even though certain types
would prefer most people's lot not to improve).
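
As a sanity check, the "both significant probability and severity" test
plus the adaptation discount can be written out as an explicit gate.  A
toy Python sketch; the thresholds and discount formula are my
inventions, not anything in the Principle:

    # Toy proportionality gate: restrict only if BOTH probability and
    # severity clear a threshold, after discounting severity by how
    # feasibly we can adapt.  Thresholds are invented.
    PROB_THRESHOLD = 0.2
    SEVERITY_THRESHOLD = 5.0

    def restriction_justified(prob, severity, adaptability):
        # adaptability in [0, 1]: 1 means the harm is fully absorbable
        discounted = severity * (1.0 - adaptability)
        return prob > PROB_THRESHOLD and discounted > SEVERITY_THRESHOLD

    # One toxic nanomaterial does not indict nanotech as a whole:
    print(restriction_justified(0.6, 8.0, 0.1))    # True: restrict this one
    print(restriction_justified(0.01, 8.0, 0.1))   # False: leave the rest be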

> 8.      Respect Tradeoffs: Recognize and respect the diversity of 
> values among people, as well as the different weights they place on 
> shared values. Whenever feasible, enable people to make tradeoffs.

"It's good for you.  Honest.  But we'll let you make a tradeoff: you
can trust us and take it, or be labelled a
non-team-player/dissenter/non-patriot."

-or-

"Well...it doesn't cost me anything to give them this info I've been
collecting about them without their knowledge, but I'm supposed to make
them trade something off, so I should make them pay for access to their
own data."

Suggest: "...enable people to make reasonable, informed tradeoffs and
other decisions, with enough information to weight that which is being
traded off according to their own values."  Also suggest titling it
"Respect Other Peoples' Values", to reduce the emphasis on tradeoffs.

> 9.      Treat Symmetrically: Treat technological risks on the same 
> basis as natural risks; avoid underweighting natural risks and 
> overweighting human-technological risks. Fully account for the 
> benefits of technological advances.

Suggest: "Fully account for the probable benefits of technological
advances."

> 10.     Renew (Revisit) and Refresh: Create a trigger to prompt 
> decision makers to revisit the decision, far enough in the future 
> that conditions may have changed significantly.

"We just implemented something we thought up last night, in reaction to
the public panic of the week.  Let's revisit it in a thousand years."

Suggest something like:  "...changed significantly, but soon enough to
be able to react to at least the initial impact."  Again, there's
probably a better way to phrase it.
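
One toy way to pin the timing down, in Python (both horizons are
made-up placeholders):

    # Toy revisit trigger: review at whichever comes first, "conditions
    # have likely changed" or "first impacts should be measurable".
    def revisit_year(decided, change_horizon, first_impact_lag):
        return decided + min(change_horizon, first_impact_lag)

    print(revisit_year(2005, change_horizon=20, first_impact_lag=3))
    # 2008, not 3005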

I hope these comments are of use.  Again, there will probably be more
profound misinterpretations (deliberate or otherwise) of this nature
once this is published, but probing for possible openings and patching
the holes thus found should diminish those.


