<div dir="ltr">"<span style="font-family:arial,sans-serif;font-size:13.333333969116211px">The best solution would be to have all people involved get together and pool their knowledge, making a joint decision"</span><div>
<span style="font-family:arial,sans-serif;font-size:13.333333969116211px"><br></span></div><div style><span style="font-family:arial,sans-serif;font-size:13.333333969116211px">How certain of this are you?</span></div></div>
<div class="gmail_extra"><br><br><div class="gmail_quote">On Mon, Dec 24, 2012 at 12:30 PM, John Clark <span dir="ltr"><<a href="mailto:johnkclark@gmail.com" target="_blank">johnkclark@gmail.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div class="im">On Mon, Dec 24, 2012 at 5:27 AM, Anders Sandberg <span dir="ltr"><<a href="mailto:anders@aleph.se" target="_blank">anders@aleph.se</a>></span> wrote:<br>
</div><div class="gmail_quote"><div class="im"><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
>> The best solution would be to have all people involved get together and
>> pool their knowledge, making a joint decision
>
> But even if you managed to do that, it would have no effect on the real
> engine of change: an AI that may have very different values than you.
> There is no way the stupid commanding the brilliant can become a stable
> long-term situation, because there is just no way to outsmart something a
> thousand times smarter and a million times faster than you.
>
> John K Clark