[extropy-chat] Desirability of Singularity (was Are ancestor simulations immoral?)

Eliezer S. Yudkowsky sentience at pobox.com
Sun Jun 4 22:38:25 UTC 2006


Harry Harrison wrote:
> The effects of intelligence are otherwise limited to optimisations on
> energy inputs and we're not so far from the thermodynamic limit
> already.

*Blink blink*.

Er, there's a star nearby radiating more energy each second than we use in a 
year...
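
A rough back-of-envelope, using round figures I'm assuming rather than 
looking up (solar luminosity ~3.8e26 W, world primary energy use ~5e20 J 
per year):

    # Sketch with assumed round-number estimates, not exact figures.
    solar_luminosity_w = 3.8e26        # Sun's radiated power: ~3.8e26 J each second
    world_energy_per_year_j = 5e20     # rough world primary energy use per year
    ratio = solar_luminosity_w / world_energy_per_year_j
    print(f"One second of sunlight ~ {ratio:.0e}x a year of human energy use")
    # -> on the order of 1e6: one second of solar output dwarfs a year of consumption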

And the brain runs around six orders of magnitude above the thermodynamic 
(Landauer) limit for computation at 300 Kelvin...
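
Likewise for the brain: taking the Landauer bound kT ln 2 at 300 K, and 
assuming (my round numbers, not measured values) ~20 W of brain power spread 
over ~1e15 synaptic events per second, each treated as roughly one bit 
erasure:

    import math
    # Sketch under assumed round numbers; the 1e15 ops/s figure is an assumption.
    k_B = 1.380649e-23                            # Boltzmann constant, J/K
    T = 300.0                                     # temperature, K
    landauer_j_per_bit = k_B * T * math.log(2)    # ~2.9e-21 J minimum per bit erased
    brain_power_w = 20.0                          # rough metabolic power of the brain
    ops_per_second = 1e15                         # assumed synaptic events per second
    j_per_op = brain_power_w / ops_per_second     # ~2e-14 J per event
    print(f"Brain is ~{j_per_op / landauer_j_per_bit:.0e}x above the Landauer limit")
    # -> on the order of 1e6-1e7, i.e. roughly six orders of magnitude of headroom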

Did you mean something nonobvious by your statement?

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence
