[ExI] far future

Anders Sandberg anders at aleph.se
Wed Jan 1 23:24:05 UTC 2014

On 2013-12-31 20:44, William Flynn Wallace wrote:
> Some good thoughts in your missive.
> Keep in mind that all humans have the same genes except for outward 
> appearance.  All 'evil' types are long gone.  There is a movement 
> towards reducing the population to two people, like Eden because of a 
> massive guilt complex.  It is felt that humans have spoiled the planet 
> (in fact cleanup is still underway with billions of robots in the 
> oceans etc. cleaning up chemicals).  So they want to redesign man so 
> that this will never happen again.  They are so fervent that it is 
> almost like a religion.

OK, it is your scenario. But why are you even asking us if you have 
already decided on all the salient points? Cultural values like these 
could in principle motivate any behaviour.

If the cultural value were instead terraforming planets (as repentance, 
say, to keep the guilt-complex theme), the population would be optimized 
for that. But they could just as well have done away with all guilt (turning 
into a super-rational libertarian sociopath society) or turned into a 
functional soup with no individuals (but goal-threads producing 
meaningful projects aimed at achieving goals). The space of possible 
cultures and goals is vast, even when you have a society with only one 
central goal, rather than the more complex multi-goal societies we have 
in reality.

> So, I am presenting the idea of perfection and asking if it is indeed 
> perfect (as are they).

But this scenario is contrived in a literary sense. You are 
essentially setting up a fantastic Aesop problem ( 
http://tvtropes.org/pmwiki/pmwiki.php/Main/FantasticAesop ). The future 
people are simultaneously super-able to do and be a lot of things, yet 
they have chosen to do and be certain ultra-specific things - thanks to 
their ability to be anything. Asking whether this is perfection is like 
asking whether a millionaire ruining himself by building a monument to 
his dead wife is acting rightly: the answer depends on how he is written, 
and the scenario doesn't really tell you much about either what *real* 
millionaires ought to be doing or what values normal people ought to 
hold in similar (but cheaper) situations.

Imagine a society with people of different views instead. How would they 
approach the guilt? How would they approach their possibilities? How 
would they approach their different views?

Now, I think there is a good issue here, and that is whether it is a 
good idea to use enhancements to create a unified mindset across society. 
But then the back story should focus on exploring that, rather than 
laying it on thick how utopian the society is.

Dr Anders Sandberg
Future of Humanity Institute
Oxford Martin School
Oxford University
