[extropy-chat] Desirability of Singularity (was Are ancestor simulations immoral?)

Damien Sullivan phoenix at ugcs.caltech.edu
Sun Jun 4 22:57:07 UTC 2006

On Sun, Jun 04, 2006 at 03:38:25PM -0700, Eliezer S. Yudkowsky wrote:
> Harry Harrison wrote:
> > The effects of intelligence are otherwise limited to optimisations on
> > energy inputs and we're not so far from the thermodynamic limit
> > already.

The Singularity of the evolution of Homo sapiens probably had more to do
with structural changes and developing language than with simply using
more energy.  Then again, I'm skeptical that any phase changes like that
exist in the future.  OTOH, digital intelligence, with benefits of long
life and copying and cognitive engineering might well give something one
could call a Singularity.

> And the brain is around six orders of magnitude below thermodynamic 
> efficiency limits for 300 Kelvin...

Six?  I get 3-5.  The Landauer limit works out to roughly 1e22
bit-erasures/s for 20 W:
  kT ln2 * (bits/s) = power
  1.38e-23 * 300 * 0.693 * (bits/s) = 20 W
  bits/s ~= 7e21

Brain: 1e14 synapses at 1e3 Hz = 1e17 ops/second, with an "op" probably
carrying more than one bit, so up to 1e19 bits/s.
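The back-of-the-envelope numbers above can be checked with a short sketch
(assuming a 20 W brain power budget, the Landauer limit at 300 K, and the
1e14-synapse / 1e3 Hz estimate; the 1-to-100 bits-per-op range is a rough
guess, not a measured figure):

```python
import math

# Landauer limit: minimum energy to erase one bit at temperature T
k = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                 # kelvin
E_bit = k * T * math.log(2)   # ~2.87e-21 J per bit erased

power = 20.0                   # watts, rough brain power budget
landauer_bits = power / E_bit  # ~7e21 bit-erasures/s at the limit

# Brain estimate: 1e14 synapses firing at ~1e3 Hz
ops = 1e14 * 1e3               # 1e17 "ops"/s
brain_low = ops * 1.0          # if an op is ~1 bit
brain_high = ops * 100.0       # if an op is ~100 bits -> 1e19 bits/s

# Gap between the brain and the thermodynamic limit, in orders of magnitude
gap_small = math.log10(landauer_bits / brain_high)  # ~3 orders
gap_large = math.log10(landauer_bits / brain_low)   # ~5 orders

print(f"Landauer limit at 20 W: {landauer_bits:.1e} bits/s")
print(f"Gap: {gap_small:.1f} to {gap_large:.1f} orders of magnitude")
```

which reproduces the 3-5 orders of magnitude figure, depending on how many
bits one charges per synaptic "op".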

Even 3 orders of magnitude offers a lot of room for growth, of course;
OTOH I'd wonder whether that margin can actually be safely tapped, whether
doing so increases the error rate, or whether the brain is doing more than
anticipated (hi, glial cells).

-xx- Damien X-) 
