On Mon, Apr 6, 2015 Anders Sandberg <anders@aleph.se> wrote:

> You need 100% addiction, not 95% (and in humans, most things that are
> addictive are so for only about 5%).

Yes, but none of those humans had complete control of their emotional control panel. If they did, the result might be a positive feedback loop, and that would mean the end of human advancement. Gluing the happiness knob at 10 would obviously lead to stagnation, and if you rigged it to give you a short blast of 10 only when you discovered something as important as General Relativity, you'd be happy so rarely that you'd eventually change the settings. But maybe you could program it to give you almost as much pleasure in seeking new knowledge as in actually finding it; then, although a little short of 10, you'd be very happy all the time and maximally happy some of the time. Maybe then you'd be able to resist temptation and stop fiddling with that control panel.

 John K Clark
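
P.S. In case a toy model helps make the point, here is a minimal Python sketch of the three knob settings above. It is purely illustrative: the event names, the 99-to-1 ratio of seeking to discovery, and the 9-vs-10 numbers are all made up, not anyone's actual proposal.

    # Toy sketch (hypothetical): compare three settings of the
    # "emotional control panel" by the average happiness they yield
    # over a typical stretch of work.

    def reward(event, schedule):
        """Return happiness (0-10) for an event under a given schedule."""
        if schedule == "glued_at_10":
            # Knob glued to 10: maximal happiness no matter what you do.
            return 10
        if schedule == "discovery_only":
            # A blast of 10 only for a General-Relativity-class discovery.
            return 10 if event == "major_discovery" else 0
        if schedule == "seek_and_find":
            # Almost as much pleasure in seeking as in finding.
            return {"seeking": 9, "major_discovery": 10}.get(event, 0)
        raise ValueError(schedule)

    # A typical stretch: lots of seeking, a major discovery almost never.
    events = ["seeking"] * 99 + ["major_discovery"]

    for schedule in ("glued_at_10", "discovery_only", "seek_and_find"):
        avg = sum(reward(e, schedule) for e in events) / len(events)
        print(f"{schedule:15s} average happiness ~ {avg:.2f}")

The glued knob averages 10 but rewards nothing in particular, the discovery-only setting averages near 0, and the seek-and-find setting averages just above 9 while still peaking at 10, which is the stable middle option argued for above.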