[ExI] The second step towards immortality
kellycoinguy at gmail.com
Tue Jan 7 18:54:19 UTC 2014
On Tue, Jan 7, 2014 at 5:11 AM, Anders Sandberg <anders at aleph.se> wrote:
> So how many super computers can this future energy grid support?
> Depends on the other law in town, Koomey's law
> Up until recently energy was not much of a design criterion, so I suspect
> we might even see an acceleration as we start caring more about energy than
My research indicates this is why Intel stopped raising clock speeds circa
2003: not because they COULDN'T make faster chips, but because the energy
consumption of such chips was rising faster than their comfort level.
That is, their customers were asking for more computations per dollar
rather than more computations per CPU per minute.
> The limits are, as always, tricky to judge:
> IBM's current Roadrunner does 376 million calculations per watt. If we
> take my mid-range estimates of computing needs, 10^22 to 10^25 FLOPS, then
> a single emulation would need 10^13 to 10^16 watts. The total insolation of
> Earth is about 10^17 watts, so this won't do - there would be space for
> just a few minds on the entire planet. But current research on zettaflops
> computing suggests we can do much better. A DARPA exascale study suggests we
> can do 10^12 flops per watt, which means "just" a dozen Hoover dams per
> mind. Quantum dot cellular automata could give 10^19 flops per watt<http://netalive.startlogic.com/debenedictis.org/erik/Publications-2005/Reversible-logic-for-supercomputing-p391-debenedictis.pdf>,
> putting the energy needs at 200-2000 watts.
> And reversible computations are way better, of course.
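The watts-per-mind figures in the quote can be rechecked directly from the stated flops-per-watt efficiencies. The emulation estimates and efficiencies below come straight from the quoted text; the division itself is the only step added here:

```python
def watts_per_mind(flops_needed, flops_per_watt):
    """Power draw for one emulation at a given hardware efficiency."""
    return flops_needed / flops_per_watt

NEEDS = (1e22, 1e25)   # mid-range emulation estimates from the quote
ROADRUNNER = 3.76e8    # flops per watt
EXASCALE = 1e12        # DARPA exascale study figure
QDCA = 1e19            # quantum dot cellular automata figure

for name, eff in (("Roadrunner", ROADRUNNER), ("exascale", EXASCALE), ("QDCA", QDCA)):
    lo, hi = (watts_per_mind(n, eff) for n in NEEDS)
    print(f"{name}: {lo:.1e} to {hi:.1e} W per mind")
```

This reproduces the quote's 10^13 to 10^16 watts at Roadrunner efficiency, and ~10^10 watts (a handful of ~2 GW Hoover Dams, my own rough capacity figure) at the exascale efficiency. At 10^19 flops/W the low-end need of 10^22 FLOPS gives ~10^3 watts, matching the quoted range; the 10^25 high end would still need ~10^6 watts.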
This doesn't sound reasonable to me. The human brain does what it does on
about 100 watts. Why couldn't one achieve similar efficiency in another
substrate? I won't buy the above without further data.