<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
On 22/03/2025 14:25, BillK wrote:<br>
<blockquote type="cite"
cite="mid:mailman.51.1742653545.12796.extropy-chat@lists.extropy.org">
<pre>The article references another essay -
Read next: Keep The Future Human proposes four essential, practical
measures to prevent uncontrolled AGI and superintelligence from being
built, all politically feasible and possible with today’s technology –
but only if we act decisively today.
<a class="moz-txt-link-rfc2396E" href="https://keepthefuturehuman.ai/"><https://keepthefuturehuman.ai/></a>
---------------------
Very sensible suggestions, but with no hope of being implemented.
The overriding driving force is that the West must get AGI before
China and Russia.</pre>
</blockquote>
<br>
This is just like global warming. It's happening, and there's
nothing we can do about it, or rather, nothing we will do.<br>
<br>
And, like global warming, I reckon that anyone with an ounce of
sense should not be concentrating on how to avoid it (because that's
pointless), but on how to survive it (because that's slightly less
pointless).<br>
<br>
'Keep the future human' is sad, distasteful and just wrong. The
future will not be 'human'; that's pretty much certain (and I'd be
disappointed if it were, because it would mean, essentially, that
we'd failed).<br>
<br>
What's more important, I think, is that we do what we can to make
our mind-children the best they can be, regardless of what happens
to the human race.<br>
<pre class="moz-signature" cols="72">--
Ben</pre>
</body>
</html>