[extropy-chat] SIAI seeking seed AI programmer candidates

Eliezer Yudkowsky sentience at pobox.com
Wed Jun 2 12:21:29 UTC 2004


Eugen Leitl wrote:

> On Wed, Jun 02, 2004 at 06:50:24AM -0400, Eliezer Yudkowsky wrote:
> 
>> If we cannot do it on off-the-shelf hardware, we should not be in the 
>> business.  This is not about computing power, never has been.
> 
> You're being unrealistic. I guess we should be thankful for that.

Want to show me the math, Eugen?  Real math, calculations describing 
human-level intelligence and the lower bound for hardware, not silly 
analogies to biology?  I could be wrong about my own guess.  I'm just 
wondering why you think you can give hardware estimates for intelligence 
when you claim not to know how it works.  I used to do that too: convert 
synaptic spikes to floating-point ops and so on.  Later I looked back on my 
calculations of human-equivalent hardware and saw complete gibberish, 
blatantly invalid analogies such as Greek philosophers might have used for 
lack of any grasp whatsoever on the domain.  People throw hardware at AI 
because they have absolutely no clue how to solve it, like Egyptian 
pharaohs using mummification for the cryonics problem.
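
For concreteness, here is the sort of back-of-envelope conversion I mean, 
sketched in Python.  Every number in it is one of the usual ballpark 
assumptions (neuron count, synapses per neuron, average firing rate, ops per 
spike), not anything derived from an understanding of intelligence:

    # The kind of "human-equivalent hardware" estimate I'm criticizing.
    # Every figure below is a conventional ballpark assumption, not a
    # measurement of anything.
    neurons = 1e11               # commonly cited human neuron count
    synapses_per_neuron = 1e4    # commonly cited average connectivity
    firing_rate_hz = 10.0        # assumed average spike rate
    flops_per_spike_event = 1.0  # assume one floating-point op per synaptic event

    events_per_second = neurons * synapses_per_neuron * firing_rate_hz
    brain_flops = events_per_second * flops_per_spike_event
    print(f"naive estimate: {brain_flops:.0e} FLOPS")  # ~1e+16 FLOPS

The arithmetic checks out; what doesn't check out is the premise that a 
synaptic event is the brain's unit of useful computation.  That unexamined 
premise is the Greek-philosopher step.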

-- 
Eliezer S. Yudkowsky                          http://singinst.org/
Research Fellow, Singularity Institute for Artificial Intelligence


