[ExI] Hard Takeoff-money

Keith Henson hkeithhenson at gmail.com
Tue Nov 16 16:39:42 UTC 2010


On Tue, Nov 16, 2010 at 5:00 AM,  Samantha Atkins <sjatkins at mac.com> wrote:

> On Nov 15, 2010, at 7:31 AM, Keith Henson wrote:

snip

>> What does an AI mainly need?  Processing power and storage.  If there
>> are vast amounts of both that can be exploited, then all you need is a
>> storage estimate for the AI and the average bandwidth between storage
>> locations to determine the replication rate.
>
> But wait.  The first AGIs will likely be ridiculously expensive.

Why?  The programming might be, until someone has a conceptual
breakthrough.  But the most powerful supercomputers in the world are
_less_ powerful than large numbers of distributed PCs.  See
http://en.wikipedia.org/wiki/FLOPS

> So what if they can copy themselves?  If you can only afford one, and they are originally only as competent as a human expert, then you will go with entire campuses of human experts until the cost comes down sufficiently - say in a decade or two after the first AGI.

The cost per GFLOP fell by a factor of 1,000 to 10,000 over the last decade.
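For scale, a decade-long factor of 1,000 to 10,000 works out to roughly a doubling (or a bit more) every year.  A quick sketch, taking the ten-year factors above as given and deriving the implied annual rate:

```python
# Implied annual improvement in cost per GFLOP, given a total
# improvement factor over a ten-year span (factors from the post).
for total_factor in (1_000, 10_000):
    yearly = total_factor ** (1 / 10)   # compound annual factor
    print(f"{total_factor:>6}x over 10 years -> {yearly:.2f}x per year")
```

A 1,000x decade is almost exactly 2x per year; 10,000x is about 2.5x per year.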

> Until then it will not matter much that they are in principle copyable.  Of course if someone cracks the algorithms to have human level AGI on much more modest hardware then we get lots of AGI proliferation much more quickly.

Any computer can run the programs of any other computer--given enough
memory and time.  A human-brain equivalent can certainly be run on
distributed processing units, since massively parallel processing is
the obvious way the brain itself works now.
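To put rough numbers on the replication-rate point quoted above (storage size divided by bandwidth between storage locations), here is a back-of-envelope sketch.  Both input figures are invented for illustration: 100 TB of AI state and an average 1 Gbit/s link.

```python
# Hypothetical figures -- chosen only to illustrate the arithmetic.
storage_bytes = 100e12          # assume 100 TB to hold the AI's state
link_bytes_per_s = 1e9 / 8      # assume 1 Gbit/s average bandwidth

copy_time_s = storage_bytes / link_bytes_per_s
print(f"one copy takes {copy_time_s / 86400:.1f} days")

# If each finished copy immediately starts copying too, the population
# doubles every copy interval: 2**n copies after n intervals.
intervals = int(30 * 86400 // copy_time_s)
print(f"copies after 30 days: {2 ** intervals}")
```

With those assumed numbers a single copy takes on the order of nine days, so the doubling time, not the raw hardware cost, dominates how fast the population grows.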

Human thought might actually have something in common with computer viruses.

Keith



