<div class="moz-cite-prefix">On 15/03/2014 09:32, Bill Hibbard
wrote:<br>
</div>
<br>
<blockquote
cite="mid:Pine.SOC.4.64.1403150730350.26411@sarah.ssec.wisc.edu"
type="cite">
My recent papers about technical AI risk conclude with:
<br>
<br>
This paper addresses unintended AI behaviors. However,
<br>
I believe that the greater danger comes from the fact
<br>
that above-human-level AI is likely to be a tool in
<br>
military and economic competition among humans and thus
<br>
have motives that are competitive toward some humans.
<br>
</blockquote>

Military and economic competition between groups seems far more likely
to me, too, to extinguish specific individuals. It would therefore make
considerable sense for individuals to focus on these kinds of problems.

The rationale given for focusing on other risk scenarios seems to be
that military and economic competition between groups is *relatively*
unlikely to destroy everything - whereas things like "grey goo" or
civilization-scale wireheading could potentially result in everyone's
entire future being destroyed.

Any evolved predispositions humans have are likely to direct them
to focus on the first type of risk. I figure that these more personal
risks will receive plenty of attention in due course.
-- 
__________
|im |yler  http://timtyler.org/  tim@tt1lock.org  Remove lock to reply.