[ExI] Hard Takeoff-money
Keith Henson
hkeithhenson at gmail.com
Mon Nov 15 15:31:44 UTC 2010
On Mon, Nov 15, 2010 at 5:00 AM, John Grigg
<possiblepaths2050 at gmail.com> wrote:
>
> Brent Allsop wrote:
> I would agree that a copy-able human level AI would launch a take-off,
> leaving what we have today, to the degree that it is unchanged, in the
> dust. But I don't think achieving this is going to be anything like
> spontaneous, as you seem to assume is possible. The rate of progress
> of intelligence is so painfully slow. So slow, in fact, that many
> have accused great old AI folks like Minsky of being completely
> mistaken.
>
> Michael Annisimov replied:
> There's a huge difference between the rate of progress between today
> and human-level AGI and the time between human-level AGI and
> superintelligent AGI. They're completely different questions. As for
> a fast rate, would you still be skeptical if the AGI in question had
> access to advanced molecular manufacturing?
>
> I agree that self-improving AGI with access to advanced manufacturing
> and research facilities would probably be able to bootstrap itself at
> an exponential rate, rather than the speed at which humans created it
> in the first place. But the "classic scenario" where this happens
> within minutes, hours or even days and months seems very doubtful in
> my view.
>
> Am I missing something here?
What does an AI mainly need? Processing power and storage. If there
are vast amounts of both that can be exploited, then all you need is a
storage estimate for the AI and the average bandwidth between storage
locations to determine the replication rate.
Human memory is thought to hold a few hundred megabytes. How
long does it take to copy a gigabyte over the net nowadays?
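Henson's back-of-envelope argument can be made concrete: if replication rate is just storage size over available bandwidth, the copy time falls out directly. A minimal sketch, where the 1 GB snapshot size and 100 Mbit/s link speed are illustrative assumptions, not figures from the thread:

```python
# Back-of-envelope replication time for a copy-able AI.
# Assumed (hypothetical) figures: ~1 GB of state, a 100 Mbit/s link.

def replication_time_seconds(storage_bytes: float, bandwidth_bits_per_s: float) -> float:
    """Time to copy one AI instance: storage size divided by link bandwidth."""
    return storage_bytes * 8 / bandwidth_bits_per_s

GB = 10**9
time_s = replication_time_seconds(1 * GB, 100e6)  # 1 GB over 100 Mbit/s
print(f"{time_s:.0f} seconds per copy")  # 80 seconds
```

On those assumptions a copy takes about a minute and a half, which is why the population of such AIs is limited by available storage and bandwidth rather than by anything like a human gestation time.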
BillK <pharos at gmail.com> wrote:
>
> On Mon, Nov 15, 2010 at 6:40 AM, Keith Henson wrote:
snip
>> So if you later wonder how the AIs cornered the world's capital, I
>> mentioned it first. :-)
>
> Won't happen until the Singularity, when all bets are off anyway.
>
> The point of these trading programs is to assist in making a few
> people very, very rich and the great majority poor and unemployed.
> Working great so far.
I can easily see a disgruntled programmer writing this as retaliation
against a hated boss.
> (But surely the burning torches and pitchforks can't be far away, can they?).
That is _so_ 17th century. Surely you can think of something better.
Keith