On 2/25/07, Eliezer S. Yudkowsky <sentience@pobox.com> wrote:
> http://www.ozyandmillie.org/d/20070220.html

Ok, fine. Neurons change, people are different. Conclusions derived from those neurons are likely to be different over time.
Do you have a point for us? (The normal trolls are in the universe...) I can grok the identity fallacy -- it is laid out in B&W. Do you have a case that would argue that I should be rushing to sell my soul to the overlord AI? (Other than some wishful thinking that it is going to think at a higher level than our best minds already do.) Note that I have no argument that this case may one day eventually be made; my argument is that you cannot make it today.
Robert