[ExI] Warren Buffett is worried too and thinks Republicans are "asinine"

Anders Sandberg anders at aleph.se
Mon Oct 21 22:17:23 UTC 2013


On 21/10/2013 10:18, Omar Rahman wrote:
> I put to you list members that: the crazed billionaires backing the 
> Brethren of the Koolaid are in fact far more extropian than us here on 
> this list. Sitting on top of their mountains of money, they can see 
> further, just as those who stand on the shoulders of giants can see. 
> They can see the wave of robotisation that will drive many jobs out of 
> the hands of humans. They are the primary beneficiaries of this. It 
> isn't an academic discussion for them; it's a business plan. Anders and 
> others recently posted information about jobs that will/could be soon 
> computerised or robotised; egotistical crazed billionaire was not on 
> any list that I saw. They are in practical terms (far?) closer to the 
> singularity than us.

In a sense they are already there: they can pay, and conglomerates of 
minds will try to solve their problems for them - conglomerates that 
are beyond individual human intelligence.

Being rich in a capitalist economy is a useful state, since it means 
that you can earn a living just by existing and having certain 
possessions. In fact, it might be the *only* stable state in 
sufficiently AI-enriched economies. A socialist would of course try to 
bring everybody into this state through joint ownership of the means of 
production. Anarchists hope that having a non-money economy will fix 
things (which is an interesting claim - I am not entirely convinced 
mutualist societies are stable in the face of AI).

> Elsewhere I've said on this list that corporations and countries are 
> like huge, mostly analog AIs. A billionaire or dictator who controls 
> one of these corporations or countries, respectively, is the 
> closest facsimile to a post singularity entity that we can see. Of 
> course to them taxation, national governments, and international 
> agreements are usually just impediments to their free action. Even the 
> 'good' egotistical crazed billionaires, think Elon Musk (to be fair 
> Elon doesn't come off as egotistical even when he makes some sweeping 
> statement that some past approach or program is doomed to fail), have 
> a perspective that might not always line up with the 'little guy'.

There is a difference between going for the usual power/wealth/status 
complex and planning for the radical long run. If you think something 
like an AI/brain emulation singularity is likely, you should make sure to 
own part of it (and sponsor research to make it safe for you) - even if 
that means fellow billionaires think you are crazy (a surprisingly large 
number of them are pretty conventional people, it turns out). Same thing 
for all other "weird" extremes we discuss here, whether positive or 
negative.

I like Musk. He was very good at quickly getting to the core of 
arguments through first-principles physics/engineering thinking, and he 
delivered some relevant xrisk warnings to 10 Downing St.


-- 
Anders Sandberg,
Future of Humanity Institute
Oxford Martin School
Faculty of Philosophy
Oxford University

