[ExI] how would Transhumanists cope if the Singularity did not happen in their lifetime?
Russell Wallace
russell.wallace at gmail.com
Tue May 15 20:47:19 UTC 2007
On 5/15/07, Eugen Leitl <eugen at leitl.org> wrote:
>
> A little bird told me that AI doesn't fit into predictable memory
> access, and is memory-bottlenecked, given today's architectures.
Oh, well, for AI purposes it depends on which school of thought you follow.
Connectionism is mostly straight number crunching and regular memory access;
SPECfp is a decent benchmark for this sort of workload.
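To make the point concrete, here's a minimal sketch (not anything from the thread, just an illustration) of the kind of kernel a connectionist workload reduces to: a dense weight matrix streamed through a matrix-vector product plus a nonlinearity. The sizes are arbitrary; the shape of the computation is what matters.

```python
import numpy as np

# One connectionist "layer" update: activations = f(weights @ inputs).
# The whole weight matrix is streamed from memory on every update, so
# throughput is governed by memory bandwidth and floating-point rate --
# the same regime SPECfp-style kernels exercise. Sizes are illustrative.
rng = np.random.default_rng(0)
weights = rng.standard_normal((1024, 1024))
inputs = rng.standard_normal(1024)

activations = np.tanh(weights @ inputs)  # dense, predictable number crunching
print(activations.shape)
```

The access pattern here is perfectly sequential and predictable, which is exactly why this school of AI maps well onto conventional floating-point benchmarks.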
In the domain of AI through software engineering, it's true that memory
access is irregular, but it often caches reasonably well. It exercises a lot
of things; database benchmarks are probably reasonably representative here.
If I had to pick the single biggest bottleneck, I'd pick RAM _capacity_ -
which of course is a direct function of integration density.
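For contrast, a tiny sketch (my illustration, not from the thread) of the irregular-access pattern typical of symbolic/software-engineered AI structures: pointer chasing through a random permutation, where each load depends on the value just read.

```python
import random

# Pointer chasing through a random permutation: each load's address is the
# value returned by the previous load, so the hardware prefetcher gets no
# help -- the access pattern of graphs and hash tables rather than dense
# arrays. At this small N the table is cache-resident and the chase is
# cheap; grow N past the cache size and every hop becomes a RAM round trip,
# which is why capacity (and locality) dominate for this kind of workload.
N = 1 << 16  # illustrative size; fits comfortably in cache
perm = list(range(N))
random.Random(42).shuffle(perm)

idx, hops = 0, 0
for _ in range(N):
    idx = perm[idx]  # data-dependent load
    hops += 1
print(hops)
```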
But you're of the AI through brute force evolution school of thought! What
are you worrying about? That's an embarrassingly parallel problem! It can
easily use all the cores and cache you can throw at it with almost linear
speedup ^.^
(Okay, granted current architectures leave a lot of performance on the table
for this kind of workload, in terms of making efficient use of all those
transistors.)
> When we make predictions, we need to make sure they're not based on a
> cherry-picked best case. Or else our future model is faulty, and it
> will come and bite us in the butt. Hard.
True!