[ExI] Lethal future was Watson on NOVA

Anders Sandberg anders at aleph.se
Tue Feb 22 14:20:10 UTC 2011


Kelly Anderson wrote:
> On Fri, Feb 18, 2011 at 5:51 AM, Eugen Leitl <eugen at leitl.org> wrote:
>   
>> Not even for human equivalent, nevermind at 10^6 to 10^9 speedup.
>> I don't think you can go below 1-10 W for a human realtime equivalent.
>>     
> ...
>
> According to several sites on the Internet the human brain uses 20-40
> Watts. Some of that undoubtedly goes for biological purposes that are
> not directly supportive of computation.
>
> It seems very pessimistic to say that we could only improve by 2-40
> times over nature. Granted nanowatts may be overly optimistic, and is
> based on no currently known technology. Nevertheless, I see no reason
> to believe that the bottom is 1 Watt.
>   

The basal metabolic rate for humans is about 70-80 Watts, so assuming an 
average weight of 70-80 kg, we get a basic dissipation of about 1 
Watt/kg. At roughly 1.4 kg the brain's fair share would be only a watt 
or so; it dissipates the rest of its ~20 W budget because it runs lots 
of ion pumps to restore membrane potentials, as well as some possibly 
costly synaptic remodeling. It is a horrendously inefficient Rube 
Goldberg scheme, yet surprisingly tough to beat.
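
A quick sanity check of those numbers, in case anyone wants to redo it 
(the brain mass and wattage below are the usual textbook ballpark 
figures, nothing exact):

body_power_w = 75.0      # basal metabolic rate, roughly 70-80 W
body_mass_kg = 75.0      # roughly 70-80 kg
brain_mass_kg = 1.4      # typical adult brain
brain_power_w = 20.0     # low end of the 20-40 W figures quoted above

print(body_power_w / body_mass_kg)            # ~1 W/kg baseline for the body
print(brain_power_w / brain_mass_kg)          # ~14 W/kg for the brain
print(brain_power_w - brain_mass_kg * 1.0)    # ~18-19 W of extra, pump-driven dissipation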

The real issue is how much computation you need to replace brains and 
how much this has to dissipate. I have made some estimates that the 
likely range for brain emulation is 10^22 to 10^25 flops. Right now the 
Roadrunner does 376 Mflops/W, so we are *far* away. But the Darpa 
exascale study suggests we can do 10^12 flops per watt using 
extrapolated but not blue-sky technology - a lot of current computation 
is very wasteful, and it is only recently that heat dissipation has 
become a towering problem. Quantum dot cellular automata could give 
10^19 flops per watt, putting the energy needs at 200-2000 watts per 
brain.
http://netalive.startlogic.com/debenedictis.org/erik/Publications-2005/Reversible-logic-for-supercomputing-p391-debenedictis.pdf
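
If anyone wants to play with the watts-per-brain arithmetic themselves, 
here is a minimal sketch; the efficiency figures are just the ones 
quoted above, and the answers span many orders of magnitude depending 
on which emulation estimate you pair them with:

def watts_per_brain(flops_needed, flops_per_watt):
    return flops_needed / flops_per_watt

emulation_low, emulation_high = 1e22, 1e25    # my estimated flops range
for name, flops_per_watt in [("Roadrunner", 376e6),
                             ("DARPA exascale target", 1e12),
                             ("quantum dot cellular automata", 1e19)]:
    print(name,
          watts_per_brain(emulation_low, flops_per_watt),
          watts_per_brain(emulation_high, flops_per_watt))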

As I noted in my essay on this,
http://www.aleph.se/andart/archives/2009/03/a_really_green_and_sustainable_humanity.html
while this energy demand is higher than that of the biological brain, it 
can be supplied more efficiently than by growing organisms, harvesting 
them, possibly passing them through other animals, and then digesting 
them. Even this kind of non-Drexlerian nanotech computing would be very 
green.

Estimating the ultimate limits is hard, since we do not know how many 
dissipative calculations we need. Assuming one irreversible operation 
every millisecond at every synapse gives about 10^17 dissipating 
operations per second, and at the Landauer limit of kT ln 2 per erased 
bit that means an energy dissipation on the order of 3*10^-6 watts per 
kelvin of operating temperature (colder computers are more efficient). 
So even here nanowatts are going to be tough (cooling below a few 
Kelvin is expensive), but less than a milliwatt per brain seems 
entirely feasible using liquid nitrogen cooling, if we have reversible 
computers with little need for error correction.
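
For those who want to redo the Landauer arithmetic, a minimal sketch 
(assuming ~10^14 synapses each doing one irreversible operation per 
millisecond, and no overhead beyond the bare erasure cost):

import math

K_B = 1.380649e-23                      # Boltzmann constant, J/K
LANDAUER_J_PER_K = K_B * math.log(2)    # minimum energy per erased bit, per kelvin

def landauer_power(ops_per_second, temperature_k):
    # Minimum dissipation, in watts, for a given rate of irreversible operations.
    return ops_per_second * LANDAUER_J_PER_K * temperature_k

ops = 1e17                              # 1e14 synapses * 1000 ops/s each
print(landauer_power(ops, 300))         # ~3e-4 W at room temperature
print(landauer_power(ops, 77))          # ~7e-5 W at liquid nitrogen temperature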


>> Reversible logic is slow, and it's not perfectly reversible.
>>     

Not necessarily, just a lot of the current proof-of-concept designs. I 
expect that once we actually start working on it seriously we are going 
to optimize it quite a lot, including how to get the error correction 
(which dissipates) done in a clean fashion. It wouldn't surprise me if 
there was a practical tradeoff between speed and dissipation, though 
(all those quantum limits to computation involve energy, and fast 
changes do involve high wattages that are hard to keep dissipationless).
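
To make the speed/dissipation point concrete, here is the textbook 
approximation for adiabatic (quasi-reversible) charging of a logic 
node: the energy lost per switching event scales as 
(RC/ramp time)*C*V^2, so slower ramps dissipate less. The component 
values below are purely hypothetical:

def adiabatic_dissipation_j(C, R, V, ramp_time):
    # Energy dissipated per charging event when ramping slowly compared to RC.
    return (R * C / ramp_time) * C * V**2

C, R, V = 1e-15, 1e4, 1.0               # hypothetical 1 fF node, 10 kohm, 1 V
print(0.5 * C * V**2)                   # ~5e-16 J for conventional abrupt switching
for ramp in (1e-9, 1e-8, 1e-7):
    print(ramp, adiabatic_dissipation_j(C, R, V, ramp))   # drops as the ramp slows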


> An interesting question to be answered is what is the most limiting
> factor? Is it matter out of which to build intelligence? Is it energy
> to power it? Time to run it? Or space to house it? Or is there some
> other limiting factor? I think it will take a while for the
> exponential growth to stop, but it must eventually stop. I'm just not
> sure which of the above is the most limiting factor. Only time and
> technology will tell. I'm not sure we can even guess at this point
> what the most limiting factor will be.
>   
In the really long run you cannot get more mass than around 10^52 kg, 
due to the accelerated expansion of the universe. And there are time 
limits due to proton decay and quantum noise. But long before that, 
lightspeed lags will make it hard to maintain cohesive thinking systems 
when the communications delays become much longer than the local 
processing cycles.
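
A toy illustration of the lag problem (the sizes and clock rates are of 
course purely hypothetical):

C_LIGHT = 3e8                           # m/s

def lag_in_cycles(separation_m, clock_hz):
    # Local clock cycles that pass while a signal crosses the system once.
    return (separation_m / C_LIGHT) * clock_hz

print(lag_in_cycles(0.1, 1e9))          # 10 cm at 1 GHz: well under one cycle
print(lag_in_cycles(1.5e11, 1e9))       # 1 AU at 1 GHz: ~5*10^11 cycles of waiting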

A lot of the limits depend on what you *want* minds to do. Experiencing 
pleasure doesn't require long-range communications or even much storage 
space, while having the smartest possible mind requires a lot of 
communications and resources.


-- 
Anders Sandberg,
Future of Humanity Institute 
James Martin 21st Century School 
Philosophy Faculty 
Oxford University 



