[ExI] sciam blog article

Rafal Smigrodzki rafal.smigrodzki at gmail.com
Tue Apr 5 03:47:45 UTC 2016


On Fri, Apr 1, 2016 at 9:03 AM, Robin D Hanson <rhanson at gmu.edu> wrote:

Okay, but I’m much less interested in “general reasoning abilities” than in
> full functionality to substitute for humans on almost all jobs.
>

### I would be very impressed by a machine capable of substituting for
human general reasoning abilities. It could be set to work on improving its
own reasoning abilities, and then, of course, substituting for humans in all
jobs would likely not be far behind.

-----------------

> I’d say you really don’t know how many other modules are needed, or how
> hard they will be to create. But you do here admit that there is at least
> one further module needed that we don’t have or know how to make.
>

### Of course, it would be hubris to claim that I know how far we are from
a human-equivalent intelligence, but I can make some reasonable guesses.
Based on the complexity of the neural structures involved in volition,
building this volitional module would be more difficult than building a
walking robot, but not dramatically so. The limbic system works very closely
with the cortex, but aside from that it is not orders of magnitude more
complex than the brainstem. By weight it is only a few times larger than the
brainstem, even if you include the relevant cortical areas, and its
neurophysiology does not differ from that of the rest of the brain, so
duplicating its function in software shouldn't be a 50-year stumbling block
on the way to AGI.

Rafał

