[ExI] AI motivations
hkeithhenson at gmail.com
Sat Dec 29 16:02:59 UTC 2012
On Sat, Dec 29, 2012 at 4:00 AM, John Clark <johnkclark at gmail.com> wrote:
> On Tue, Dec 25, 2012 Keith Henson <hkeithhenson at gmail.com> wrote:
>> > I also suspect that sheer physical limits are going to limit the size of
>> an AI due to "the bigger they are, the slower they think."
> The fastest signals in the human brain move at a couple of hundred meters a
> second, and many are far slower, while light moves at 300 million meters per
> second. So if you insist that the 2 most distant parts of a brain communicate
> as fast as they do in a human brain (and it is not immediately obvious why
> you should insist on such a thing), then parts of an AI could be at least one
> million times as distant.
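The ratio behind that "one million times as distant" claim can be checked with quick arithmetic. A minimal sketch, using only the figures given in the quoted message (~200 m/s for the fastest neural signals, 3e8 m/s for light); the 15 cm brain-span figure is my rough assumption, not from the thread:

```python
# Back-of-envelope check of the signal-speed ratio in the quoted argument.
neural_signal_speed = 200.0      # m/s, fastest signals in the human brain
speed_of_light = 3.0e8           # m/s

ratio = speed_of_light / neural_signal_speed
print(f"light is ~{ratio:,.0f}x faster")   # ~1.5 million

# At equal worst-case latency, the two most distant parts of an AI could
# therefore be ~1.5 million times farther apart than in a human brain.
human_brain_span_m = 0.15        # ~15 cm, an assumed rough brain diameter
ai_span_km = human_brain_span_m * ratio / 1000
print(f"max span at human-brain latency: ~{ai_span_km:,.0f} km")
```

So at human-brain latency the hardware could span a couple of hundred kilometers, consistent with the "at least one million times" figure above.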
That's true as long as you want a perception-to-action cycle no faster
than a human's. I suspect that an "arms race" will shorten this cycle
to whatever its physical minimum is. If that minimum is a million times
faster, you are back to something no bigger than a human brain (within
a couple of orders of magnitude).
I am uneasy about a world where the typical posthuman experiences 50
million subjective years before the end of this calendar century. If
you have some thoughts about why this will not or cannot happen, I
would be most interested.
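The "50 million subjective years" figure is consistent with the speedup discussed above. A minimal sketch of that consistency check, assuming (my assumption, not stated in the post) that subjective time scales linearly with the perception-to-action speedup:

```python
# Rough consistency check of the "50 million subjective years" figure.
calendar_years_left = 2100 - 2013      # ~87 calendar years from the post's date
subjective_years = 50e6                # figure given in the message

implied_speedup = subjective_years / calendar_years_left
print(f"implied subjective speedup: ~{implied_speedup:,.0f}x")

# This lands within an order of magnitude of the ~1.5e6 speed advantage
# of light-speed signaling over neural conduction cited upthread.
```

The implied speedup comes out to a few hundred thousand, the same order of magnitude as the million-fold cycle speedup Keith posits.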