[ExI] Social justice and transhumanism

Anders Sandberg anders at aleph.se
Mon Feb 20 11:05:29 UTC 2012

On 20/02/2012 02:31, Giovanni Santostasi wrote:
> Too many transhumanists embrace capitalist ideals. I think we should 
> be associated with social justice and equity. I think these would 
> bring the singularity faster than unchecked capitalism and liberalism.

First, it is not obvious that we *should* want to bring the singularity 
faster. As some of us have argued, it might be so risky that the 
responsible thing is to slow down. From an individual standpoint an early 
singularity might be good for *us*, since we are all fairly affluent 
neophiles who (if anybody survives) will likely benefit. But that is not 
a great ethical argument for going ahead. And if you worry about 
inequality, radical economic growth might actually be a very bad thing, 
since it might amplify inequalities enormously.

Second, social justice and equity do not obviously produce rapid 
technological growth. Plots of the Gini coefficient vs. economic growth 
show no strong link in developed countries and only a fairly mild 
correlation in developing countries.
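For readers unfamiliar with the measure, the Gini coefficient summarizes how unequal an income distribution is (0 = perfect equality, approaching 1 = one person holds everything). A minimal sketch of the standard computation, using made-up illustrative incomes rather than real country data:

```python
def gini(incomes):
    """Gini coefficient of a list of non-negative incomes.

    Uses the sorted-data identity:
        G = (2 * sum_i (i+1) * x_i) / (n * total) - (n + 1) / n
    where x_i are incomes sorted in ascending order.
    """
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0  # everyone has zero: treat as perfect equality
    weighted_cumsum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted_cumsum) / (n * total) - (n + 1) / n

# Perfect equality gives 0; extreme concentration pushes toward 1.
print(gini([1, 1, 1, 1]))      # 0.0
print(gini([0, 0, 0, 100]))    # 0.75
```

The point in the text is then an empirical one: regressing growth rates against this index yields, at best, weak correlations in developed economies.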

It might be more important to urbanize people, since urbanization 
produces big economies of scale in technology and income. Or simply to 
raise GDP, since it has a strong effect on patents per capita. 
There are plenty of feedbacks here, of course: one reason a country gets 
rich is lots of patents and technology, and the wealth pays for more 
education and research. This is also an argument *against* equalizing 
things, since these clustering effects can produce much more total tech 
growth than a more equal distribution of where R&D happens.

Third: if you want to argue for social justice, do it because there is 
something good about the justice itself. Arguing that it is merely an 
instrumental tool for getting other things means that if we find a 
better tool, we will simply abandon justice. Arguing that we have a moral 
duty, that equality makes a nicer world, or that it somehow follows from 
transhumanist principles - that is a much stronger line of argument if 
you want to change transhumanism.

Anders Sandberg,
Future of Humanity Institute
Philosophy Faculty of Oxford University
