[ExI] Unilateralist was uploads

Anders Sandberg anders at aleph.se
Mon Dec 24 10:27:06 UTC 2012


On 2012-12-24 02:10, Keith Henson wrote:
> On Sun, Dec 23, 2012 at 2:14 PM,  Anders Sandberg <anders at aleph.se> wrote:
>
>> However, there is another problem (we call it the unilateralist
>> curse): in some situations it is enough that one agent decides to act
>> and the full effects of the action will affect everyone.
>
> If the Russians still have 20 tons of smallpox left over from their
> USSR days, they could unilaterally reduce the world population by a
> few billion.
>
> A possibly less drastic sea change will hit if and when one of the big
> players decides to build power satellites.  I can see no way to have
> them without some being equipped with propulsion lasers because the
> economics are compelling.

The same goes for a lot of situations: releasing a scientific paper with 
potentially dangerous or culturally destabilizing consequences, 
releasing a new species into the wild, telling somebody about a surprise 
birthday party (or a conspiracy), doing geoengineering, and so on. The 
unilateralist curse shows up in a surprising number of places.

We have looked at ways of solving the problem from a practical ethics 
standpoint. The best solution would be to have all people involved get 
together and pool their knowledge, making a joint decision: 
unfortunately this is rarely possible and people often disagree 
irrationally. Sharing smaller amounts of information (like just voting 
yes or no) is also surprisingly effective, but again there are big 
limitations. One can calculate the ideal Bayesian behavior, which allows 
people to make fairly decent decisions even without talking to each 
other (very useful when you do not even know who the others are, or if 
they even exist). There is a quick-and-dirty solution: precommitting 
to defer to the choice of a particular agent in the group singled out 
by some random property (we call it "tallest decides"). And then there 
are ways of setting up institutions in general to handle this kind of 
case and enforce non-unilateral behavior. All in all, it looks like we 
have a moral obligation, in cases where the conditions for the curse 
apply, to defer to the group rather than to our own judgement, even if 
we happen to think we are right and rational.
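
To make the mechanism concrete, here is a toy Monte Carlo sketch (my 
own illustration, not taken from the paper; the Gaussian value and 
noise model, the five agents, the parameter choices and the way the 
three decision rules are coded are all just convenient assumptions):

import random

def simulate(n_agents=5, noise=1.0, trials=200_000, seed=1):
    # Each trial: an action has an unknown true value (N(0,1), so it is
    # worth doing only about half the time); every agent sees only a
    # noisy private estimate of that value.
    rng = random.Random(seed)
    rules = ["unilateral", "tallest decides", "majority vote"]
    value = dict.fromkeys(rules, 0.0)
    acted = dict.fromkeys(rules, 0)
    for _ in range(trials):
        true_value = rng.gauss(0.0, 1.0)
        estimates = [true_value + rng.gauss(0.0, noise)
                     for _ in range(n_agents)]
        in_favour = sum(e > 0 for e in estimates)
        decisions = {
            "unilateral": in_favour >= 1,         # one optimist is enough
            "tallest decides": estimates[0] > 0,  # defer to one pre-chosen agent
            "majority vote": in_favour > n_agents / 2,  # defer to the group
        }
        for rule, act in decisions.items():
            if act:
                value[rule] += true_value
                acted[rule] += 1
    for rule in rules:
        print(f"{rule:16s} acts in {acted[rule] / trials:5.1%} of trials, "
              f"mean realised value {value[rule] / trials:+.3f}")

simulate()

Under these toy assumptions the "anyone may act" rule goes ahead in 
roughly five trials out of six even though the action is only worth 
doing about half the time, and it realises noticeably less value per 
trial than deferring to a single pre-chosen agent, which in turn does 
worse than a simple majority vote. The exact numbers depend entirely 
on the assumed noise and group size, but the ordering is the point.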

If any of you are interested, I can send you a draft for comments. We 
plan to submit the paper early next year.





-- 
Anders Sandberg
Future of Humanity Institute
Oxford University


