[ExI] AI motivations

John Clark johnkclark at gmail.com
Sat Dec 29 18:53:27 UTC 2012


On Sat, Dec 29, 2012 at 11:02 AM, Keith Henson <hkeithhenson at gmail.com> wrote:

>> The fastest signals in the human brain move at a couple of hundred
>> meters a second, and many are far slower; light moves at 300 million
>> meters per second. So if you insist that the two most distant parts
>> of a brain communicate as fast as they do in a human brain (and it
>> is not immediately obvious why you should insist on such a thing),
>> then the parts of an AI could be at least one million times as
>> distant.
>>
>
> That's true as long as you want a perception-to-action cycle no
> faster than a human's.  I suspect that an "arms race" will shorten
> this cycle to whatever its minimum is.  If that is a million times
> faster, you are back to something no bigger than a human brain
> (within a couple of orders of magnitude).
>
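
As a quick back-of-the-envelope check of the quoted figures (a sketch,
not a claim; the 200 m/s and 300 million m/s numbers are from the quote
above, while the 0.15 m brain width is my own rough assumption):

    # Rough check: how much farther apart can an AI's parts be if its
    # internal signals travel at light speed rather than nerve speed?
    nerve_speed = 200.0    # m/s, fast myelinated axons (quoted figure)
    light_speed = 3.0e8    # m/s (quoted figure)
    ratio = light_speed / nerve_speed
    print(f"distance scaling: {ratio:,.0f}x")  # 1,500,000x

    # A human brain is roughly 0.15 m across (my assumption); at the
    # same signal latency an AI could therefore span about:
    brain_width = 0.15     # m
    print(f"equal-latency span: {ratio * brain_width / 1000:,.0f} km")
    # ~225,000 km, more than half the Earth-Moon distance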

Sometimes making the correct decision is easy but speed is of the
essence. If you put your hand on a hot stove you don't need to engage
your consciousness and use all the IQ points at your disposal to puzzle
out whether moving your hand would be a good idea; in fact the pain
signal doesn't even need to travel all the way to the brain. The spinal
cord may not be the smartest kid on the block, but even it knows that
moving that hand off the stove would be a very wise thing to do. I
imagine that a Jupiter Brain would use reflexes just as we do in cases
where being quick is more important than being deep.
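
To make that concrete, here is a minimal sketch of a two-tier
controller in the spirit of a spinal reflex arc (entirely my own
illustration; every name in it is invented):

    # Illustrative only: a fast, dumb reflex table answers immediately;
    # the slow, clever deliberative path runs only when no reflex fires.
    REFLEXES = {
        "hand_on_hot_stove": "withdraw_hand",
        "object_approaching_eye": "blink",
    }

    def deliberate(stimulus):
        # Stand-in for the slow path: planning, consciousness, IQ points.
        return f"considered response to {stimulus}"

    def respond(stimulus):
        # Fast path first: no reasoning, just a hardwired mapping.
        if stimulus in REFLEXES:
            return REFLEXES[stimulus]
        # Slow path when being deep matters more than being quick.
        return deliberate(stimulus)

    print(respond("hand_on_hot_stove"))  # withdraw_hand
    print(respond("strange_noise"))      # considered response ...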


> I am uneasy about a world where the typical posthuman experiences 50
> million subjective years before the end of this calendar century.  If
> you have some thoughts about why this will not or cannot happen, I
> would be most interested.
>

I can't think of any reason it won't happen. At a million-fold speedup,
one second of real time is about a million subjective seconds, so trying
to predict what the AI will do in the next second would be like trying
to predict what you will do over the next 12 days. Thus predicting what
the post-singularity world will be like is a hopeless enterprise, as is
trying to make an AI that always remains "friendly".
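
Spelling out the arithmetic (a sketch; the million-fold speedup is the
figure from earlier in the thread, and Keith's 50-million-year figure
implies a factor in the same ballpark):

    # Subjective time experienced per unit of real time at a given speedup.
    SECONDS_PER_DAY = 86_400

    speedup = 1_000_000  # the million-fold figure from the thread
    subjective_days = 1 * speedup / SECONDS_PER_DAY
    print(f"1 real second = {subjective_days:.1f} subjective days")  # ~11.6

    # Keith's figure: 50 million subjective years over the ~88 calendar
    # years left in the century implies a comparable factor.
    implied_speedup = 50e6 / 88  # subjective years per calendar year
    print(f"implied speedup: {implied_speedup:,.0f}x")  # ~568,000x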

  John K Clark