[ExI] Immeasurable hubris

Anders Sandberg anders at aleph.se
Sun Sep 7 10:21:27 UTC 2014

Tomaz Kristan <protokol2020 at gmail.com>, 7/9/2014 10:30 AM:
What one has to do to conquer at least the nearby Universe is to write down and run some computer code, perhaps not very long code at all. And then just wait and watch all of the above happen automatically.

Light-speed and slower probes permuting everything, from your body to rocks.

What code? I don't know; just as the majority is clueless about any complex code, I am pretty clueless about this one.

Except that it must be possible and that it is likely in our grasp.

Yes. Except that finding the code might be very non-trivial.
In his "Constructor theory" paper (http://arxiv.org/abs/1210.7439) David Deutsch talks about "pharaonic tasks" where you build the tools or resources needed to build the tools to do something (potentially in many layers). 
Something we at FHI have been trying to figure out is how far we can bound the efficiency of pharaonic tasks. In a sense, making a superintelligent AI is just a matter of randomly generating code and running it, since in the long run you will hit the jackpot - but it is not efficient at all. Similarly, simple mammals can colonize space by evolving into intelligent creatures with culture and technology - but that is not efficient either (each trial takes millions of years and requires an entire planet). Intelligence means you are better at zooming in on more efficient approaches, but this only works in domains where there is information your intelligence can use to optimize. So what do we really know about the domain of writing smart code?
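The brute-force point can be made concrete with a toy sketch of my own (the two-op instruction set, the target function, and all the names are made up purely for illustration): exhaustively trying every program built from a tiny instruction set until one computes a chosen target function.

```python
import itertools

# Toy illustration: "randomly generating code and running it" recast as an
# exhaustive search over programs built from a tiny, made-up instruction
# set. The "jackpot" is any program computing the target f(x) = 2*x + 3.

OPS = {
    "inc": lambda x: x + 1,  # add one
    "dbl": lambda x: x * 2,  # double
}

def run(program, x):
    """Execute a program (a tuple of op names) on input x."""
    for op in program:
        x = OPS[op](x)
    return x

def matches_target(program, samples=range(5)):
    """Does the program agree with the target on a few test inputs?"""
    return all(run(program, x) == 2 * x + 3 for x in samples)

def brute_force_search(max_len=10):
    """Try every program up to max_len instructions, shortest first.
    Returns (program, candidates_tried). The candidate count grows as
    2**length, which is the inefficiency being pointed at."""
    tried = 0
    for length in range(1, max_len + 1):
        for program in itertools.product(OPS, repeat=length):
            tried += 1
            if matches_target(program):
                return program, tried
    return None, tried
```

Even in this two-instruction world the search space doubles with every added instruction of program length; with a realistic instruction set and a target like "superintelligence" rather than 2x+3, the same scheme is astronomically slow - hence the interest in bounding how much better than brute force a constructor can do.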
Some pieces we have learned: many everyday tasks are far harder than abstract tasks; big-data machine learning approaches do fairly well in messy domains; the structure of problem-space is complex (check out Moore and Mertens' book!); and self-improving systems are not naturally exponential in the domains we have tried (genetic programming/alife, Eurisko)...

Anders Sandberg, Future of Humanity Institute, Philosophy Faculty of Oxford University
