[ExI] A small step towards brain emulation

Anders Sandberg anders at aleph.se
Tue Nov 20 20:17:59 UTC 2012


On 20/11/2012 16:51, John Grigg wrote:
> Anders, now for the billion dollar question...  How many 
> years/decades until we have a sim with roughly the computational 
> power/make-up of the human brain?  I get the impression it might be 
> relatively soon.

There is a complication: Modha et al. are running a fairly abstract 
neural model, while most of us uploaders would like to run something 
like the much more biologically realistic Hodgkin-Huxley equations on 
models of branched neurons. That would increase memory requirements by 
at least three orders of magnitude, and computing power by 4-5 orders 
of magnitude. Now, we do not know what resolution is necessary to get 
anything to work: it could be that Modha's level is OK, or it could be 
that we need a lot of protein chemistry requiring even more computation.

If Moore's law continues at current rates, we will get an order of 
magnitude every 4-6 years (it depends a bit on what kind of computer 
you look at). So if Modha's abstract level is enough, we are nearly 
there for a human brain - in the WBE roadmap I predicted it for 2019, 
and we seem to be well on track to beat that. But if we need the 
electrophysiology, then we need another ~25 years (with a distribution 
spanning 16-36 years) for the first slow uploads. Add another 15 years, 
and they will be realtime (and then faster). So my own guess is that 
the probabilities get serious for merely working uploads around 2037, 
and for decent ones around 2050. It is possible to shave off a decade 
or so by using special-purpose hardware or paying a lot more, but I 
would be surprised if we saw any human uploads at this resolution 
before 2030.
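
The point estimates behind those dates fall out of back-of-envelope 
arithmetic (the 16-36 year spread and the 2037/2050 guesses come from 
a fuller uncertainty model; this sketch only reproduces the ballpark):

# Back-of-envelope Moore's-law extrapolation; all inputs are the rough
# figures from the text, not measured data.
abstract_ready = 2019          # WBE-roadmap estimate for the abstract level
extra_orders   = (4, 5)        # added compute for HH-level electrophysiology
yrs_per_order  = (4, 6)        # Moore's law: one order of magnitude per 4-6 yr

low  = abstract_ready + extra_orders[0] * yrs_per_order[0]   # 2019 + 16 = 2035
high = abstract_ready + extra_orders[1] * yrs_per_order[1]   # 2019 + 30 = 2049
print("first slow HH-level uploads: roughly %d-%d" % (low, high))
print("realtime (add ~15 years):    roughly %d-%d" % (low + 15, high + 15))

The central value lands in the early 2040s, in the same region as the 
~25-year figure above.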

I don't think the problem is going to be computer power. Slowdown is 
also a matter of choice: smaller simulations can run faster. 
Researchers choose simulation sizes to be tractable on institutional 
timescales (run lengths timed to coffee breaks, weekends, and 
mainframe access): there is no benefit for most projects in running 
them in realtime, especially when the name of the game is boasting 
about having the biggest.
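
The size/speed trade-off is easy to see if you assume a fixed machine 
and a fixed per-neuron cost (every number below is invented for 
illustration, not a benchmark): speed relative to biological time 
scales inversely with how much brain you simulate.

# Toy size/speed trade-off; all figures are illustrative assumptions.
FLOPS_AVAILABLE  = 1e16   # assumed machine: ~10 petaFLOPS
FLOPS_PER_NEURON = 1e6    # assumed cost per neuron per biological second

def realtime_factor(n_neurons):
    """Seconds of biology simulated per wall-clock second."""
    return FLOPS_AVAILABLE / (n_neurons * FLOPS_PER_NEURON)

for n in (1e8, 1e10, 1e11):   # mouse-ish, primate-ish, human-ish counts
    print("%g neurons -> %gx realtime" % (n, realtime_factor(n)))
# 1e8 neurons run 100x realtime here; the human-scale 1e11 runs at
# 0.1x, i.e. a tenfold slowdown, on the very same machine.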


> And how exactly does this tie in with developing true artificial 
> intelligence?

Mostly as a race. If there is no AGI at the time we get brain 
emulations, they will fill that niche. However, it is not implausible 
that the things we learn on the way there might give some useful ideas 
to AGI, like how the cortex *actually* works. In fact, if Modha's type 
of simulation is of the right resolution, we might get human-based (but 
not human) neuromorphic AGI in the 2020s. But I am not holding my 
breath: predicting things that depend on 1) software insights yet to be 
made and 2) scientific insights yet to be made has a huge variance.


-- 
Anders Sandberg,
Future of Humanity Institute
Oxford Martin School
Faculty of Philosophy
Oxford University



