On Fri, Dec 28, 2012 Rafal Smigrodzki <rafal.smigrodzki@gmail.com> wrote:
> mentioning "limitations", "constraints", or "rules" makes it so easy to fall into the anthropomorphizing trap when trying to imagine how an AI works.<br></blockquote><div><br>I don't know what "the anthropomorphizing trap" means, but I do know that anthropomorphizing can be a very useful tool.<br>
> An AI that had the ego-syntonic goal of paperclip-maximizing (i.e. had no neural subsystems capable of generating a competing goal) would not spontaneously discard this goal.

But such a thing would not be an AI, it would be an APMM, an Artificial Paperclip-Making Machine, and it's hard to see why humans would even bother to make such a thing; we already have enough machines that make paperclips.
> I hold David Deutsch in great regard but I doubt that such goal-limited intelligence would be uncreative

I too hold David Deutsch in great regard, but on the subject of AI he believes some very strange things:
http://www.aeonmagazine.com/being-human/david-deutsch-artificial-intelligence/

Some of the things he says seem to be flat-out factually untrue, such as:
"Yet today in 2012 no one is any better at programming an AGI than Turing himself would have been."
<br>By "AGI" I think he means AI not Adjusted Gross Income, if so then the statement is ridiculous. <br><br>John K Clark<br><br><br><br><br></div><div><br> <br></div></div>