[ExI] My review of Eliezer Yudkowsky's new book

Ben Zaiboc ben at zaiboc.net
Sat Oct 4 10:31:40 UTC 2025


On 04/10/2025 04:00, John K Clark wrote:
> The authors' proposed remedy to avoid the calamity they foresee is an immediate and total worldwide ban on AI research; even the publication of abstract mathematical research articles on the subject would be illegal, and all data centres, defined as any building that contains more computational ability than 8 state-of-the-art (as of 2025) GPUs, would also be illegal. If any rogue nation attempts to build a data centre more powerful than that, then the rest of the world should use any means necessary, up to and including nuclear weapons, to prevent that nation from finishing construction of that data centre.

That, even if it were possible, would be a recipe for disaster. Mainly 
because it's not possible, and would lead to the development, in secret, 
of AI that has a good chance of being aligned with the interests of the 
sort of regimes that would do that sort of thing: IOW, the ones least 
concerned with general human well-being.

Assuming for a moment that it were possible, though, it would just 
result in WW3.

If it were possible, and didn't lead to WW3, it would result in a 
coalition of the most oppressive police states the world has ever known.

So basically, they are proposing a choice between a Terrible Disaster, a 
Certain Disaster and a Dystopian Disaster, in order to prevent a 
Badly-Conceived Possible Disaster. I know which of those four I would 
choose.

-- 
Ben


PS, As spike says, welcome back, John. You have been missed.
