[ExI] Homo radiodurans (was Maximum Jailbreak)

John Clark johnkclark at gmail.com
Fri Oct 26 22:12:10 UTC 2018


On Fri, Oct 26, 2018 at 5:57 PM William Flynn Wallace <foozler83 at gmail.com>
wrote:

*> I think that the first time an AI tries to take over something and push
> humans around, there will be a world-wide alarm and similarly programmed
> AIs will be re-programmed or unplugged*


Even today, when computers still aren't as intelligent as we are, we couldn't
unplug them all without slitting our own throats; we've become far too
dependent on them for that. And it won't be any easier once they become
smarter than us, because you just can't outsmart something that's smarter
than you.

 John K Clark
