[ExI] Computer power needed for AGI [WAS Re: Hard Takeoff-money]

Richard Loosemore rpwl at lightlink.com
Wed Nov 17 14:51:08 UTC 2010


Samantha Atkins wrote:
> On Nov 16, 2010, at 9:41 AM, Richard Loosemore wrote:
> 
>> Samantha Atkins wrote:
>>>>> But wait.  The first AGIs will likely be ridiculously
>>>>> expensive.
>>> Keith Henson wrote:
>>>> Why?  The programming might be, until someone has a conceptual
>>>> breakthrough.  But the most powerful supercomputers in the
>>>> world are _less_ powerful than large numbers of distributed
>>>> PCs.  See http://en.wikipedia.org/wiki/FLOPS
>>> Because: a) it is not known, or much expected, that AGI will run on
>>> conventional computers; b) a back-of-envelope calculation of
>>> processing power equivalent to the human brain puts that much
>>> capacity, at great cost, a decade out, and two decades or more out
>>> before it is easily affordable at human-competitive rates; c) we
>>> have not much idea of the software needed even given the
>>> computational capacity.
>> Not THIS argument again!  :-)
>> 
>> If, as you say, "we do not have much idea of the software needed"
>> for an AGI, how is it that you can say "the first AGIs will likely
>> be ridiculously expensive"....?!
> 
> Because of (b) of course.  The brute-force approach (brain emulation,
> or at least amassing that much processing power as a first step) is
> very expensive and will be for some time to come.

There are a whole host of assumptions built into that statement, most of 
them built on thin air.

Just because whole brain emulation seems feasible to you (... looks nice 
and easy, doesn't it?  Heck, all you have to do is make a copy of an 
existing human brain!  How hard can that be?) ... does not mean that any 
of the assumptions you are making about it are even vaguely realistic.

You assume feasibility, usability, cost....   You also assume that in 
the course of trying to do WBE we will REMAIN so ignorant of the thing 
we are copying that we will not be able to find a way to implement it 
more effectively in more modest hardware....

But from out of that huge pile of shaky assumptions you are somehow able 
to conclude that this WILL be the most likely first AGI, and that it WILL 
stay just as expensive as it now seems to be.
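For reference, the kind of back-of-envelope calculation being disputed here is usually built from a handful of assumed figures: a neuron count, an average synapse count, a maximum firing rate, and one FLOP-equivalent per synaptic event. A minimal sketch, with all of those figures treated as illustrative assumptions rather than settled estimates:

```python
# Illustrative brain-compute estimate (every figure here is an assumption,
# and exactly the sort of number the argument above calls into question).
NEURONS = 8.6e10            # assumed neuron count for a human brain
SYNAPSES_PER_NEURON = 1e4   # assumed average synapses per neuron
FIRING_RATE_HZ = 100        # assumed upper-bound firing rate
# Treat one synaptic event per firing as roughly one FLOP-equivalent.
brain_flops = NEURONS * SYNAPSES_PER_NEURON * FIRING_RATE_HZ

# Tianhe-1A, the top-ranked supercomputer in November 2010,
# benchmarked at roughly 2.57 petaFLOPS.
tianhe_1a_flops = 2.57e15

print(f"Brain estimate: {brain_flops:.2e} FLOPS")
print(f"Ratio to 2010 top supercomputer: {brain_flops / tianhe_1a_flops:.1f}x")
```

On these assumptions the brute-force requirement comes out around 8.6e16 FLOPS, some tens of times the fastest machine of the day, which is where the "a decade out, at great cost" framing comes from. Change any of the assumed figures by an order of magnitude and the conclusion moves with it.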



>> After saying that, you do a back of the envelope calculation that
>> assumes we need the same parallel computing capacity as the human
>> brain..... a pointless calculation, since you claim not to know how
>> you would go about building an AGI, no?
>> 
> 
> Not entirely, as human beings are one existence proof of general
> intelligence.  So looking at their apparent processing power as a
> possible precondition is not unreasonable.  This has been proposed by
> many, including many active AGI researchers.  So why are you arguing
> with it?

I am arguing with it because unlike some people, I don't cite arguments 
from authority ("Lots of other people believe this thing, so .....").

Instead, I use my head and do some thinking.

I also use a broad based knowledge of software engineering, AI, 
psychology and neuroscience.  Some of those people who make assertions 
about the feasibility of WBE (and who exactly were you thinking of, 
anyway.... any references?) do not have that kind of comprehensive 
knowledge.

>> Those of us actually working on the problem -- actually trying to
>> build functioning, safe AGI systems -- who have developed some
>> reasonably detailed architectures on which calculations can be
>> made, might deliver a completely different estimate.  In my case, I
>> have done such estimates in the past, and the required HARDWARE
>> capacity comes out at roughly the hardware capacity of a late
>> 1980s-era supercomputer...
> 
> Great.  When can I get an early alpha to fire up on my laptop?
> 
> This is a pretty extravagant claim you are making, so it requires some
> evidence to be taken seriously.  But if you do have that, and your
> estimates are reasonably robust, then your fame is assured.

This is the kind of childish, ad hominem sarcasm used by people who 
prefer personal abuse to debating the ideas.

A tactic that you resort to at the beginning, middle and end of every 
discussion you have with me, I have noticed.



Richard Loosemore