[ExI] Hard Takeoff

Michael Anissimov michaelanissimov at gmail.com
Fri Nov 26 02:25:41 UTC 2010


On Wed, Nov 17, 2010 at 9:11 PM, The Avantguardian <avantguardian2020 at yahoo.com> wrote:

>
> I have some questions, perhaps naive, regarding the feasibility of the hard
> takeoff scenario: Is self-improvement really possible for a computer
> program?
>

Yes. For instance, the Gödel machine <http://www.idsia.ch/~juergen/goedelmachine.html>, or reinforcement learning.
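
To make "self-improvement" concrete, here is a minimal sketch in Python of
the generate-evaluate-accept loop both approaches share. It is plain
stochastic hill climbing over a program's own parameters, not the Gödel
machine itself (which requires a *proof* of improvement before rewriting
its code), and the benchmark and numbers are invented purely for
illustration:

import random

def benchmark(params):
    # Stand-in fitness function; a real system would measure task
    # performance. Here: negative squared distance to an arbitrary target.
    target = [3.0, -1.5, 0.5]
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def propose(params, step=0.1):
    # A candidate "self-modification": a small random tweak.
    return [p + random.gauss(0, step) for p in params]

params = [0.0, 0.0, 0.0]
score = benchmark(params)
for _ in range(10000):
    candidate = propose(params)
    candidate_score = benchmark(candidate)
    if candidate_score > score:  # keep only verified improvements
        params, score = candidate, candidate_score

print(params, score)

The loop itself is trivial; the hard part is what plays the role of
benchmark() for general intelligence.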


> And if the initial "intelligence function" is flawed, then all recursive
> iterations of the function will have the same flaw. So it would not really
> be qualitatively improving, it would simply be quantitatively increasing.
> For example, if I had two or even four identical brains, none of them might
> be able to answer this question, although I might be able to perform four
> of the mental tasks I am capable of at once.
>

But I can identify thousands of avenues of improvement for myself right
now. A human-similar intelligence would likely see a comparable number of
avenues for its own improvement, and pursue them.


> On the other hand, if the seed AI is able to actually rewrite the code of
> its intelligence function to non-recursively improve itself, how would it
> avoid falling victim to the halting problem?


By that reasoning, every self-improving software program would inevitably
fall into infinite recursion or be paralyzed by the halting problem.
Please say "yes" if you actually believe this.

> If there is no way, even in principle, to algorithmically determine
> beforehand whether a given program with a given input will halt or not,
> would an AI risk getting stuck in an infinite loop by messing with its own
> programming?


Yes, just as a human could.
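
In practice the dodge is engineering, not mathematics: you cannot decide in
general whether a candidate rewrite halts, but you can test-run it in a
separate process under a watchdog timer and roll back on timeout. A minimal
sketch in Python, with all names invented for illustration (this is not
anyone's actual architecture):

import multiprocessing

def candidate_rewrite(conn):
    # Stand-in for a proposed self-modification being test-run.
    # Replace the body with the candidate code under evaluation.
    conn.send(sum(range(10**6)))  # this candidate happens to halt

def run_with_watchdog(target, timeout_seconds=1.0):
    parent_conn, child_conn = multiprocessing.Pipe()
    proc = multiprocessing.Process(target=target, args=(child_conn,))
    proc.start()
    proc.join(timeout_seconds)
    if proc.is_alive():  # did not halt in time: kill it and roll back
        proc.terminate()
        proc.join()
        return None
    return parent_conn.recv() if parent_conn.poll() else None

if __name__ == "__main__":
    result = run_with_watchdog(candidate_rewrite)
    print("kept rewrite" if result is not None else "rolled back")

Note that this sidesteps rather than solves the halting problem: a rewrite
that would have halted just after the deadline gets discarded along with
the genuinely non-terminating ones.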


> The halting problem is only defined for Turing machines, so a quantum
> computer might overcome it, but I am curious whether any SIAI people have
> considered it in their analysis of hard versus soft takeoff.
>

Not really, no. (Note, though, that a quantum computer can be simulated by
a classical Turing machine, so it computes the same set of functions and
cannot solve the halting problem either.)

-- 
michael.anissimov at singinst.org
Singularity Institute
Media Director